The REF: why did it cost so much?

The sums spent on the exercise were exorbitant and the money could be better spent, argues Derek Sayer

July 30, 2015
Illustration: Nate Kitch

In 2007, David Eastwood, then chief executive of the Higher Education Funding Council for England, said that in devising the research excellence framework the funding bodies were “as committed to lightening the burden as we are to rigour in assessing quality”. Yet earlier this month we learned that the 2014 REF cost an estimated four times as much as the 2008 research assessment exercise (“REF 2014 cost almost £250 million”, 13 July).

According to the REF Accountability Review, the REF cost the funding bodies £14 million and universities £232 million. About £19 million of the latter amount (8 per cent) was for REF panellists’ time, leaving £212 million (92 per cent) as the cost of submission.

Of that, preparing impact case studies cost £55 million, but the review chooses to discount this figure when comparing the costs of the REF and the RAE. Believing the £47 million figure for the RAE to be “a conservative estimate”, the review revises it up to £66 million. Comparing this with the £157 million spent on submitting outputs to the REF, it concludes that “the cost of submitting to the last RAE was roughly 43 per cent of the cost of submitting to the REF”. Or to put it another way, the REF’s submission costs were about 238 per cent of the RAE’s.

But the fact remains that higher education institutions did incur an additional £55 million in real costs preparing impact submissions, which were a mandatory element of the exercise. If impact is included in the calculation, the REF’s submission costs rise to about 321 per cent of the RAE’s.
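The percentage comparisons above follow directly from the review's figures; a quick back-of-envelope check (using the rounded £ millions quoted here, so results are approximate):

```python
# Cost figures (in £ millions) as quoted from the REF Accountability Review.
rae_submission = 66    # RAE 2008 submission cost, revised upward by the review
ref_outputs = 157      # REF 2014 cost of submitting outputs (excluding impact)
ref_impact = 55        # additional cost of preparing impact case studies

# The review's comparison: RAE cost as a share of REF output-submission cost.
print(round(100 * rae_submission / ref_outputs))  # ~42, "roughly 43 per cent"

# Equivalently, REF output-submission costs as a percentage of RAE costs...
print(round(100 * ref_outputs / rae_submission))  # ~238

# ...and including the mandatory impact element.
print(round(100 * (ref_outputs + ref_impact) / rae_submission))  # ~321
```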

The figures also reveal that, contrary to the perceptions of the higher education institutions surveyed for the report, the increased cost of the REF was not “mainly” because of the new impact element.

“The strengthening of equality and diversity measures, in relation to individual staff circumstances” is also singled out. This is the only place in the report where an item is identified as “a disproportionately costly element of the whole process” (my emphasis). But the document also notes that dealing with special circumstances “took an average 11 per cent of the total central management time devoted to REF” and “consumed around 1 per cent of the effort” at departmental level. This amounts to £6 million (or 4 per cent) of institutions’ £157 million non-impact submission costs – a drop in the ocean.

We are left, then, with an increase in submission costs of about £85 million that is not attributable to changes in the formal submission requirements. According to the review, “the REF element on research outputs, which included time spent reviewing and negotiating the selection of staff and publications” was “the main cost driver” for institutions, with some running “two or three formal mock REFs, with the final [one] leading directly into the REF submission”.

But it nowhere explains why this element should have consumed so much more staff time in 2014 than it did in 2008. The most likely explanation lies in a factor the review does not even mention: Hefce’s changes to the quality-related funding formula in 2010-11, which defunded 2* outputs. At that point, in the words of Adam Tickell, who was then pro vice-chancellor for research and knowledge transfer at the University of Birmingham, universities “had no rational reason to submit people who haven’t got at least one 3* piece of work”. More importantly, they had an incentive to eliminate every 2* (or lower) output from their submissions, because these would lower their ranking without any compensatory gain in income. Mock REF “iterative processes” were designed for this purpose. This competition is only likely to intensify given Hefce’s further “tweaking” of the QR funding formula in February, which changed the weighting of 4* relative to 3* outputs from 3:1 to 4:1.
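The incentive Tickell describes can be illustrated with a toy calculation. The weights are as described above (2* and below earn nothing); the output mix is invented for illustration:

```python
# QR funding weights per output by star rating. 2* outputs were defunded in
# 2010-11; the February change moved the 4*:3* weighting from 3:1 to 4:1.
weights_old = {"4*": 3, "3*": 1, "2*": 0, "1*": 0}  # post-2010-11 weighting
weights_new = {"4*": 4, "3*": 1, "2*": 0, "1*": 0}  # after the February "tweak"

def qr_score(outputs, weights):
    """Weighted volume that drives QR income; 2* and below earn nothing."""
    return sum(weights[star] for star in outputs)

# A hypothetical submission: adding a 2* output earns no extra funding but can
# drag down average quality in the league tables - so it gets weeded out.
submission = ["4*", "3*", "3*"]
print(qr_score(submission, weights_old))           # 5
print(qr_score(submission + ["2*"], weights_old))  # still 5
print(qr_score(submission, weights_new))           # 6: the 4* premium grows
```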

Many have argued that the human costs of this competition are inordinately high, and the review confesses that it “does not include an estimate of non-time related burdens on staff, such as the stress on staff arising from whether they would be selected for the REF”. What is clear is that there is an exorbitant financial cost as well. The review argues that this cost is “less than 1 per cent” of total public expenditure on research and “roughly 2.4 per cent of the £10.2 billion in research funds expected to be distributed by the UK’s funding bodies” over the next six years. But this latter figure is misleading since some elements of the funding bodies’ research budget are distributed without reference to the REF. Once these are excluded the figure rises to about 3.3 per cent. By comparison, “the funding bodies estimated the costs of the 2008 RAE in England to be around 0.5 per cent of the value of public research funding that was subsequently allocated with reference to its results”. If this is supposed to be a measure of cost-efficiency, REF 2014 scores very much worse.

When considering the cost-effectiveness of the exercise, we would also do well to remember that the considerable sums of money currently devoted to paying academics to sit on committees to decide which of their colleagues should be excluded, in the interest of securing their university a marginal (and in many cases misleading) advantage in the league tables, could be spent in the classroom, the library and the lab. The QR funding formula has set up a classic prisoner’s dilemma, in which what may appear “rational” behaviour for ambitious research-intensive institutions has toxic consequences for the system as a whole.

Derek Sayer is professor of history at Lancaster University.


Print headline: Do not resuscitate: the REF is a drain on precious resources

