‘Mother of ERA’ set to pull plug on research assessment exercise

‘Mother of ERA’ now charting its demise, as the ‘law of diminishing returns’ prompts a rethink

February 23, 2023

The architect of Australia’s research evaluation scheme is now shaping up as its executioner, with a Canberra conference hearing that the 2018 round of Excellence in Research for Australia (ERA) is set to be its last.

Queensland University of Technology vice-chancellor Margaret Sheil said the “huge amount of effort and resources” invested in ERA were yielding “diminishing returns” and would be better directed elsewhere.

“[We] need to retain a strong evaluation capability…but it may take a different form to the incredibly rigorous and well-designed ERA exercise,” she told the Universities Australia conference.

As chief executive of the Australian Research Council (ARC), Professor Sheil earned the nickname “mother of ERA” when she presided over its 2010 introduction. Now, as government-appointed chair of an ARC review, she must decide what to do with her progeny.

She suggested that ERA’s prospects of survival would have been stronger if it had been used to allocate money, as originally intended. The checks and balances needed to justify funding decisions, which had been built into the framework’s design, were simply too expensive for a system that served as a quality assurance, reputational and institutional benchmarking tool.

“Part of the burden of ERA – the rigour around ensuring that outputs were allocated to the right institutions, and so on – was [that] we did design it in a way that could have been tied to funding,” she told the conference. “But the politics and numerous other things didn’t allow that to happen.”

A 2022 discussion paper strongly hinted that the review intended to recommend ERA’s scrapping. Professor Sheil acknowledged that this may have influenced people’s views.

She told Times Higher Education that the overwhelming majority of submissions to her review had called for ERA to be retired, saying it had “done its job” and the payback no longer justified the expense.

She said now was the “perfect time” to replace ERA because recent changes to “field of research” classifications had interrupted the time series, diminishing its value as a longitudinal record of universities’ performance in particular disciplines.

Professor Sheil said an evaluation process would still be needed for informing national science priorities, and for assessing universities’ research quality for registration purposes. But the ARC was well equipped for the job. “It’s got huge knowledge about the research sector and a great evaluation capability,” she told the conference.

She said many submissions had also highlighted the importance of “trust”, following multiple interventions in research grants. “Over the last 20 or so years, that trust has been broken…because of political interference. Decisions on the best direction of research should be made by those who have the expertise,” she said.

Professor Sheil also said that a well-intentioned desire to improve gender equity in research careers, by acknowledging the impacts of things like maternity leave on publication track records, had inadvertently added to researchers’ paperwork burden when they applied for grants.

As ARC chief executive she had introduced a “research opportunity and performance evidence” statement (Rope) so that people could document career breaks that had interrupted their progress. “But over time, people wanted to add things to it.

“Now if you go on to the ARC website and look at what you can include in your Rope, it’s a huge array of activities. It’s one of the things that contributes to the length and complexity of the applications.”





Reader's comments (1)

The sector in Australia has always opposed the ERA, or its equivalent. Yes, it may be onerous, but I suspect that the real reason is that the sector just doesn't like being evaluated from outside like this. And of course it can be simplified if the issue is its expense and complexity, but I suspect that this would not satisfy the critics.

From my perception of the impact of PBRF in New Zealand tertiary institutions, its introduction had a galvanising effect. Everybody just got that much more serious and professional about their research work. One key aspect of the NZ system is that it required academics to submit their best four research outputs, so the emphasis was certainly on quality not quantity. I would be interested in the stats - is Australia going to consider them? - but my impression is that both the quality and the quantity of research outputs have improved measurably.