New Zealand academics have called for their research assessment exercise to be scrapped, saying that it has led to the marginalisation of locally focused scholarship.
Michael Gilchrist, president of the Tertiary Education Union, said that while the evaluation which determines the distribution of the Performance-Based Research Fund had prompted initial change, the pay-offs no longer justified the effort “after four gruelling rounds”.
“All that remains are the negative aspects: high compliance costs and administrative overheads; a six-yearly treadmill for staff; intrusive processes; gaming of the system; and misuse of results,” he said.
Interim outcomes of the latest quality evaluation, which dictates the annual allocation of NZ$315 million (£160 million) in research funds, suggest steady progress in research output since the previous analysis in 2012. The number of entries deemed fundable rose by 17 per cent while the proportion judged as “high quality” or “world-class standard” rose three percentage points to 56 per cent.
But Mr Gilchrist said that this year’s planned review of the scheme should recommend its abolition. “The imperative to publish in a select few international journals has caused a marked reduction in specialised and locally based research. We urgently need that research to meet the unique challenges our country faces,” he said.
Tertiary education analyst Dave Guerin said that there were “diminishing returns” from exercises such as the PBRF. “In the first round or two they drive a greater focus on research, but 16 years into this approach, people are already doing what they were incentivised to do.
“In the early 2000s we had a fair few academics who weren’t doing much research because it wasn’t prioritised or measured. Those issues have now been resolved.”
The New Zealand review follows the publication of the results of the Excellence in Research for Australia exercise.
While “world-class standard” is the highest rating available in New Zealand, Australia’s version also features “above” and “well above” world standard, with 57 per cent of entries ranked in those two categories – up from 46 per cent in the previous 2015 assessment.
Former University of Melbourne deputy vice-chancellor Frank Larkins has questioned whether these results reflect grade inflation. “No information is provided to the general research community as to the quantitative or qualitative world standard benchmarks, and how they have changed with time,” he points out in a new paper.
Professor Larkins says that in seven of 11 science, technology, engineering and mathematics fields, the number of universities ranked above or well above world standard has more than doubled since 2012. “The Australian Research Council has a responsibility to release more metric data so that independent assessment can be undertaken.”
The ARC said that it calculated ERA benchmarks “using data supplied from citation providers under contract”, and contractual conditions meant that the benchmarks could only be shared with participating universities for internal use.
It said that the benchmarks formed only part of the overall citation profiles used by panels of subject experts to evaluate entries. “There is not a direct relationship between the benchmarks and the rating received,” it said.