The established hierarchy of university research excellence is set for a massive shake-up under the system designed to replace the research assessment exercise, according to a new study.
Some academic departments with the highest ratings under the current peer-review RAE system - including 5* chemistry and engineering departments at Imperial College London - could plunge down the pecking order under the proposed Research Excellence Framework.
Lower-rated departments could rise substantially under the REF, which replaces peer-review judgments in science subjects with a system of metrics, including a count of the number of times researchers' published work is cited by their peers.
These are the findings of an 18-month study funded by two research councils and undertaken by Cranfield University, released exclusively to Times Higher Education this week.
The study took all research submitted to the 2001 RAE and, where possible, determined the citation count for every submitted journal article - 112,201 in total. The researchers measured citations using the Web of Science database, the system the Higher Education Funding Council for England plans to use in the REF.
It then compared RAE 2001 scores with scores that would have been produced had citation counts been used instead of peer review.
The study found a good correlation between RAE results and citation counts in six of the 28 subjects examined, but 13 showed a weak correlation and nine no significant correlation at all.
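The kind of agreement being tested here can be illustrated with a rank correlation between a peer-review ordering and a citation-based ordering of departments. The sketch below is not the Cranfield team's actual method; the grades and citation figures are invented for illustration, and Spearman's rho is used as one plausible measure of agreement.

```python
def ranks(values):
    """Return 1-based ranks for a list of distinct values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

def spearman(xs, ys):
    """Spearman's rho via the rank-difference formula (tie-free case)."""
    n = len(xs)
    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical data: four departments' RAE-style peer-review grades
# and their average citations per paper.
rae_grades = [5.0, 4.0, 3.5, 3.0]
cites_per_paper = [12.1, 14.3, 6.2, 2.4]

print(spearman(rae_grades, cites_per_paper))  # 0.8: broadly similar orderings
```

A rho near 1 would mean citations largely reproduce the peer-review ranking; values near zero, as the study reportedly found in nine subjects, would mean the two systems order departments very differently.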
Individual universities' performance was examined in two science subjects - chemistry and a branch of engineering - and large deviations between the systems were found.
In chemistry, where the Cranfield study found a good general correlation, the University of Surrey and Swansea University dropped dramatically. Imperial College London, Queen's University Belfast and the universities of Durham, Leeds, Edinburgh, Reading, Newcastle and Strathclyde also fell in the rankings.
Showing the greatest improvement was Northumbria University, but others including the universities of Warwick, York, St Andrews, Glasgow, Bath, East Anglia and Huddersfield also rose, indicating that the quality of research at those institutions had been underestimated by peer review compared with citations.
In the case of engineering, many more universities changed position, reflecting the poor correlation between RAE results and citation counts in the subject. Citations enhanced the University of Derby's position the most, while University College London and the universities of Strathclyde and Glasgow fell sharply. Bath, Queen's, Southampton, Imperial College and Liverpool, all rated 5* for engineering in the RAE, also showed declines.
"If the WoS was bought in, it would be a complete upheaval of current hierarchies," said Andy Neely, deputy director of Cranfield's Advanced Institute of Management Research, which carried out the study.
Bahram Bekhradnia, the director of the Higher Education Policy Institute, described the study as a "good and interesting" piece of work.
"There must be real doubts about whether that pilot [which will compare RAE 2008 results with citations] will provide the reassurance that is needed confidently to put our faith in metrics alone."
Hefce, however, dismissed the study as "not a good test" of its REF proposals.
"It takes into account only four publications per researcher; it makes no allowance for variation in citation behaviour between sub-disciplines; and it presents the outcomes as summary grades rather than quality profiles," a Hefce spokesman said.
Hefce said that the Cranfield study also ignored the research income and postgraduate student numbers that would be factored into institutions' scores in the REF.