RAE review: Tired, hard-up RAE seeks saviour with light touch

December 6, 2002

The research assessment exercise review panel wants to see radical change. The THES asks three experts to do some out-of-the-box thinking

Walk into any academic department in the UK, ask what the main bugbears are and the words "research assessment exercise" are sure to rank high, although possibly a little lower than "Quality Assurance Agency". The RAE, originally named the research selectivity exercise, was set up in the 1980s by the University Grants Committee in response to demands for greater accountability. The aim was to boost UK research nationally and internationally and to improve efficiency.

According to Evidence Ltd, which has carried out analyses of the RAE for the Higher Education Funding Council for England, the problem is that it has been too successful. It has "driven the quality of university research so successfully that UK research funding has run into some anticipated problems," says managing director Jonathan Adams in a recent article in Science. He adds: "The RAE may now have reached its limits... With some 55 per cent of UK academics in the top-rated category, the funding differentials are being flattened, and the resource pot hasn't grown as fast as the improvements."

Besides funding, the RAE has been criticised for the way assessment is carried out and for encouraging universities to game-play by putting forward only their most "successful" performers. Criticism has also come from departments that have lost out, including those that have made large strides but have not gained top status. This, they say, creates a system that maintains the status quo and discourages improvement.

Sir Gareth Roberts, who is chairing the Joint Funding Bodies' review of the RAE, says the exercise is "looking tired". But he adds that few people contributing to the consultation on its future have come up with radical ways of changing it. "It is as if we are all too mesmerised by the RAE to see a way past it."

Adams admits that Evidence Ltd's surveys have found that academics fall into two main camps - those resolutely opposed to external assessment and those who broadly accept the system and are wary of any alternatives. "They are basically very conservative," he says.

But the review wants something more radical than tinkering, such as merely changing the way panel members are appointed. It focuses on creating a "less burdensome assessment method", rather than on funding. A favoured option is said to be light-touch assessment. This would involve core money being allocated according to criteria such as previous performance, with additional money going to centres of excellence identified by the university, which would then be assessed.

One of the more radical proponents of change is Southampton University's Stevan Harnad. He would like to see UK research made "accessible and assessable continuously online" rather than through a four-yearly process, as happens now. He argues that research in every discipline can continue to be refereed, but can be made freely accessible to all academics if they archive their own refereed research online, bypassing the access tolls charged by research publications. He says software can also measure research impact - that is, how much other researchers cite a piece of research and build on it - as opposed to research productivity. Impact, he says, matters more to academics than the importance of the publication in which their work appears.

Harnad adds that the software to do this is available. All it needs is for the RAE to encourage the process "by mandating that all UK universities self-archive all their annual refereed research in their own e-print archives". The benefits, he says, include "a far more effective and sensitive measure of research productivity and impact at far less cost" to both universities and the RAE, and a strengthening of the uptake and impact of UK research by increasing its accessibility. "The UK is uniquely placed to move ahead with this and lead the world," Harnad says, "because the RAE is already in place."

Other proponents of change include the Royal Society's John Enderby. He suggests "a simple solution, so obvious that it must have been thought of and I may have missed it". He says that instead of research being "shoehorned" into international, national or sub-national levels in seven broad grades, the RAE should "simply produce a research quality profile and fund accordingly". He argues that this would reward excellence instead of penalising departments for making a "mistake" in their choice of research-active staff, would avoid game-playing and, combined with "proper cross-referral, would ease the problem of interdisciplinary activities".

But for many, the problem remains firmly tied to funding. "It's a bit tricky to devise a new RAE without an explicit policy on research that addresses how much research we are prepared to fund, in what areas and for what purposes," says Robert Harris, former pro vice-chancellor of Hull University and an academic auditor with the QAA. He proposes that the RAE create several assessment routes to reflect universities' different strengths. "The variegated assessment system is a logical outcome of variegated sectoral reality," he says. "Opening up alternative competitions that more accurately reflected an institution's distinctiveness would be wholly to the good."

The public consultation on the future of the RAE closed last week. The review panel is assessing the contributions and will report shortly.
