The funding council is tipped to back 'per paper' citations rather than journal impact. Zoe Corbyn reports.

The quality of academics' research will be judged according to the number of times their published papers are cited by their peers, under a key part of the system that will replace the research assessment exercise.
The Times Higher understands that, after next year's RAE, funding chiefs will measure the number of citations for each published paper in large science subjects as part of the new system to determine the allocation of more than £1 billion a year in research funding.
A report published by Universities UK this week endorses such a "citations per paper" system as the only sensible option among a number of so-called bibliometric quality measurements. It concludes that measuring citations can accurately indicate research quality.
But the report, by consultants Evidence Ltd, also highlights a series of potential threats to the credibility of a citations system, including likely changes in researchers' behaviour to maximise their performance.
"It is facile to pretend that all behavioural effects can be anticipated and modelled," the report says. "The metrics system will be assaulted, from the day it is promulgated, by 50,000 intelligent and motivated individuals deeply suspicious of its outcomes. There will be consequences."
The Times Higher had previously understood that the Higher Education Funding Council for England was planning to place academics' research papers into bands according to the impact factor of the journals in which they are published. But a well-informed source confirmed that, under current plans to be published for consultation this month, profiles will be drawn up for each university in broad, science-based subject areas, based on citations for individual papers.
The citation measurements will be "normalised" to account for different citation levels between disciplines.
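The article does not spell out how such normalisation would work, but a common approach is to rebase each paper's citation count against the average for its discipline. The sketch below is a minimal illustration under that assumption; the field names and baseline figures are hypothetical, not taken from the HEFCE plans.

```python
# Minimal sketch of field-normalised "citations per paper".
# Assumption: each paper's citations are divided by the average citation
# count for its discipline, so a score of 1.0 means "cited at the field
# average". The baselines below are invented for illustration only.

from statistics import mean

# Hypothetical average citations per paper in each discipline.
FIELD_BASELINE = {
    "cell biology": 25.0,
    "mathematics": 4.0,
}

def normalised_impact(papers):
    """papers: list of (field, citation_count) tuples.

    Returns the mean of per-paper citation counts, each rebased to
    its field's average, so disciplines can be compared directly."""
    return mean(citations / FIELD_BASELINE[field]
                for field, citations in papers)

# A maths group cited 8 times per paper outperforms a biology group
# cited 20 times per paper once disciplinary differences are removed.
maths_group = [("mathematics", 8), ("mathematics", 8)]
bio_group = [("cell biology", 20), ("cell biology", 20)]
print(normalised_impact(maths_group))  # 2.0
print(normalised_impact(bio_group))    # 0.8
```

This is why normalisation matters: raw counts would rank the biology group higher, while the rebased figures show the maths group is cited at twice its field's average.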
The Evidence report, The Use of Bibliometrics to Measure Research Quality, concludes that citations per paper provide "the most likely route to developing comprehensive and acceptable metrics" for science, engineering and technology subjects.
But it warns that the way the raw data are adjusted will be "central and critical" to how well the system functions, particularly in accounting for the higher citation levels of older papers and for large disciplinary differences.
The report also highlights possible distorting effects on behaviour. In the Netherlands, where such bibliometrics are in use, there has been an "exceptional rise" in the number of citations for Dutch academics and the country's share of citations. This has raised concerns about the rise of "spurious publications" and manipulation of the system.
The Evidence report says the system should still include citations where academics referred to their own work. "Those who are ahead of their field will always cite themselves. To penalise these self-citations would be antithetical to research," said Jonathan Adams, Evidence's director.