New Zealand cancels research quality evaluation

New government’s decision spells the end of ‘back-breaking’ exercise, commentators say

April 9, 2024

New Zealand has cancelled the upcoming round of its national research assessment, in what is widely regarded as a death blow for the decades-old exercise.

Tertiary education minister Penny Simmonds has decided not to proceed with data collection for the Performance-Based Research Fund (PBRF), which guides the allocation of NZ$315 million (£150 million) of block grants each year.

The decision, revealed by the Tertiary Education Commission (TEC), coincided with the announcement of a higher education review whose terms of reference include a “particular focus” on the PBRF.

Originally scheduled for this year, the quality evaluation had already been delayed until 2025 and then 2026 by the former Labour government.

Data collection for the first PBRF occurred in 2003, with subsequent rounds in 2006, 2012 and 2018. Asked whether the PBRF had “now served its purpose”, TEC chief executive Tim Fowler said it was an “open question”.

“We’re getting to the point where…very marginal gains are being made in comparison to what we saw for the first decade,” Mr Fowler told the Education and Workforce Select Committee in February.

“It has…provided us with really good, robust evidence of the quality of the research that has been delivered across our institutions. On the downside, it’s extremely compliance-heavy for us to run it. It’s a back-breaking six-year gestation period every round. The institutions themselves have to put in a lot of administrative effort to make it work.”

Universities New Zealand chief executive Chris Whelan said the cancellation was “sensible” in the context of the higher education review and a concurrent science appraisal.

“Both…will consider the role of the PBRF and, between them, are likely to suggest different ways of measuring and assessing effectiveness and impact,” Mr Whelan said.

The advisory groups conducting the two reviews are both headed by former prime ministerial science advisor Peter Gluckman, who led a 2021 evaluation of the UK’s Research Excellence Framework (REF). His advice helped shape a massive shake-up of the REF, with individual academics no longer obliged to participate in the exercise.

In Australia, a review by Queensland University of Technology vice-chancellor Margaret Sheil led to the scrapping of that country’s national research exercise, Excellence in Research for Australia, last August, a year after the government had postponed it.

Higher education policy analyst Dave Guerin said he expected the same fate for the PBRF. “I suspect that the evaluation of academics’ portfolios will go,” said Mr Guerin, editor of the Tertiary Insight newsletter.

“It’s a huge compliance exercise and it provides little value these days. The early evaluations provided some value in changing behaviour, but the behaviour has changed, and we’re adding a huge amount of compliance costs on academics and research managers for little impact.”

Mr Guerin said the PBRF had been among several UK-influenced exercises introduced decades ago when the absence of research performance measures had aroused public scepticism. “A lot of people said: ‘What are all these academics doing?’

“But those problems have been addressed. The management systems of institutions have changed. There isn’t the mythical academic who’s just twiddling their thumbs. That stereotype has gone – instead you’ve got the mythical academic filling in lots of forms.”

John Egan, associate dean of learning and teaching at the University of Auckland, said that the workload “burden” generated by the PBRF impeded rather than enhanced research productivity.

john.ross@timeshighereducation.com

Readers’ comments (4)

The same comments can be applied to other countries that blindly followed the UK in adopting the RAE. Will other countries sit up and take notice [that includes the UK]?
After another PBRF 'review' and the SRG 'consultation' process, we ended up with the same old PBRF with some minor tweaks. These people seemingly ignored all the glaring issues with the quality evaluation exercise, prescribing that 2026 would be yet another evaluation based on individual portfolios. In what industry does some third party assess individuals (with little regard for their day-to-day employment expectations), especially where the output (research) requires a multi-faceted team effort? It's akin to claiming that Jim from marketing is more important than Sally in engineering; both are integral members of the company, and neither is more crucial to its success. In the same way, the PBRF attacks younger researchers and anybody with heavier teaching loads while heaping all the glory on old professors (who coincidentally are the same people writing the PBRF rules). Sadly, even when something exceeds its use-by date, a handful of people will always be determined to see it continue unchanged. Someone had to step in and do something, and I applaud the new government for taking bold action.
PBRF was opposed by academics from the get-go. In the early years it proved its worth, showing up academics holding senior positions who had woeful research track records (largely now cleaned out). Yes, there is a law of diminishing returns, but if you want to keep track of your achievements for the purposes of making grant applications, applying for promotion, or going for another job, wouldn't it be useful to have the (research) details available and on hand - and then update them when the time comes? The authorities could have made the exercise less onerous, and they did. For example, although people have to have a complete portfolio, they present just their best four publications for full evaluation. Academics are on the taxpayer's tab and there should be some accountability for quality and performance, but maybe it does not have to be this onerous. Also, with New Zealand trying to make its luck internationally, it needs to be sure it has top talent in key research positions, and exercises like PBRF are part of making sure that is the case (in my view anyway).
Another PBRF round might have worked. However, the quality evaluation required an appropriately revised approach. Having worked at two institutions in New Zealand in the last 15 years, I witnessed the toxic side effects of the PBRF being progressively amplified with each round. These include bullying and exploitation of junior researchers, counterproductive elitism and faculty politics, and the devaluing of teaching. The bureaucracy associated with the PBRF seemed to me the least of its flaws. Sure, there may have been some gains in early PBRF rounds. Yet the exercise has never adapted from a system for identifying senior academics with "woeful research track records" to a system that effectively nurtures and supports a healthy, holistic academic and research culture. The consultation group never seemed genuinely interested in changing anything, repeatedly ignoring sensible suggestions for modifications to improve the exercise.
