Research Excellence Framework panels may be prohibited from judging academics solely on the basis of metrics, but suspicions that some universities do just that will be heightened by the acrimony over one university’s restructuring programme.
As reported in Times Higher Education last week, Queen Mary, University of London has designated 29 academics from its School of Medicine and Dentistry and 11 from its School of Biological and Chemical Sciences to be “at risk” of redundancy or what is widely perceived as demotion to teaching-only positions.
The School of Medicine and Dentistry needs to save £3 million a year, but its published rationale for the redundancies also makes clear its anxiety over maintaining the high rank it received in the 2008 research assessment exercise.
Meanwhile, the School of Biological and Chemical Sciences’ restructuring is explicitly motivated by its relatively low RAE score, which is deemed incompatible with the desire of the institution - which in August becomes a member of the Russell Group - to enter the top 10 per cent of UK universities.
The school has been restructured to focus on areas in which it is already strong or which are prioritised by research funders. By replacing academics who do not fit in with the new structure, it hopes to come close to doubling its research income and to be ranked among the top 20 per cent by the 2019 REF.
The school’s original restructuring plans envisaged recruiting 15 academics in informatics, noting that every UK research council had “informatics or computational initiatives” among its strategic aims.
However, David Bignell, emeritus professor of zoology at Queen Mary, pointed out that the Engineering and Physical Sciences Research Council had subsequently indicated it would cut funding for biological informatics. He said this illustrated the relative volatility of funders’ priorities and the folly of trying to chase them.
“A robust department which covers most of the corners of its subject will always be in a position to benefit somewhere, wherever the current fashions lie,” he said.
A spokeswoman for Queen Mary said the informatics recruits would now be distributed across its Faculty of Science and Engineering.
Academics within both schools have been quick to criticise the use of research metrics to designate those “at risk”. According to the proposal document, scholars in the School of Medicine and Dentistry were primarily assessed on the basis of the impact factor “and/or reputation” of the journals in which they had recently published, with extra credit for first or last authorship.
Their average research income was taken into account, too. This was also one of four measures used by the School of Biological and Chemical Sciences to judge “at-risk” staff: academics were required to reach a minimum threshold in three.
But John Allen, professor of biochemistry at Queen Mary, said use of such a measure would encourage academics to “burn money”, rather than conduct high-quality, cost-effective research. He was also critical of the use of PhD completions as a metric, arguing that this depended primarily on the “capricious and slightly warped” internal allocation of studentships.
Academics in the biological and chemical sciences have also been assessed according to the number and quality of their recent research outputs, with thresholds linked to seniority. But all academics must have published at least one article in a “top 5 per cent journal”: one definition of such is a journal with an impact factor (ie, the average number of citations received in a given year by papers published in the two preceding years) of at least seven.
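The two-year impact-factor calculation described above can be sketched in a few lines. The figures below are invented for illustration and do not refer to any journal mentioned in this article:

```python
def impact_factor(citations_this_year: int, papers_prev_two_years: int) -> float:
    """Two-year journal impact factor: citations received this year
    to items published in the two preceding years, divided by the
    number of citable items published in those two years."""
    if papers_prev_two_years == 0:
        raise ValueError("no citable items in the two-year window")
    return citations_this_year / papers_prev_two_years

# Hypothetical journal: 1,400 citations in 2012 to papers published
# in 2010-11, during which it published 200 citable items.
print(impact_factor(1400, 200))  # 7.0 -- would clear the "at least seven" bar
```

As Professor Bignell’s objection implies, the same formula yields very different numbers across fields, since both citation habits and publication volumes vary by discipline.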
However, Professor Bignell pointed out that citation counts vary widely between fields. To publish in a journal with an impact factor of five would be considered a big achievement in his field of zoology, he said, adding that it would be fairer to count the number of citations garnered by individual papers.
However, Matthew Evans, head of the School of Biological and Chemical Sciences, pointed out that more recent papers garnered fewer citations. “Impact factor reflects the number of times an average paper is cited, [so] is a good indication of how many citations a particular paper is likely to achieve,” he said.
He said that all “at-risk” academics in his school had been offered face-to-face meetings to “raise any important publications which they think have not been fully considered during the process”, but added: “The research criteria have been rigorously constructed in consultation with all staff and relevant unions, and cannot be altered for an individual member of staff.”
An even more controversial alternative definition of a top journal, which has been adopted by the School of Biological and Chemical Sciences, is a publication that was assigned an A* ranking by the Australian Research Council for the inaugural 2010 run of its version of the REF - Excellence in Research for Australia. The rankings attracted huge controversy and have been dropped for the second ERA iteration this year.
“How can the needs of British science be adjudicated by reference to a system discredited in Australia? It is absurd,” Professor Bignell said.
‘Crass, bureaucratic simulacra’
But while journal rankings might be history Down Under, the ERA will remain largely metrics-driven. That might be one reason why the University of Sydney used metrics to shortlist 100 academics for redundancy and another 64 for teaching-only roles late last year.
It explained that a serious miscalculation of future income meant that it would otherwise be unable to afford necessary investments in infrastructure and ICT.
“Safe” academics were required to have published at least four research outputs during the past three years, with no account taken of seniority, discipline or type of output.
Those threatened were given an opportunity to make a case for extenuating circumstances. But Nick Riemer, senior lecturer in Sydney’s department of English, said this did not lessen the “scandal” of their having to defend their jobs on the basis of retrospectively adopted criteria that considered only research, even though Sydney’s academic contracts stipulate that research accounts for only 40 per cent of scholars’ duties.
Dr Riemer and Professor Allen questioned the transparency with which the criteria at Sydney and Queen Mary had been applied, particularly given that the metrics drew on publicly available information.
“I can see no honest motive for that,” Professor Allen said. “I am shocked that people who describe themselves as scientists can think…evidence is something that is not evident.”
Dr Riemer also claimed that none of the scholars singled out by Sydney “could justifiably be accused of underperforming overall”.
However, he believed that the ERA’s focus on metrics meant that Sydney would not be the last Australian university to conduct retrenchment exercises driven by “crass, bureaucratic, quantifiable simulacra of genuine research”.
Professor Bignell said there was “widespread agreement” among UK academics that “mechanistic assessment methods are inaccurate and very damaging to the academy”.
But Professor Evans described metrics as a “vital tool” in assessing academics’ contributions to research and “the only empirical way of measuring success in science”.
“REF panels might allow a broader view of the impact of a particular piece of research, but the resources involved in setting up such a panel would be prohibitive,” he added.