New Year’s resolutions are notoriously short-lived. For the many academics who resolved never again to utter the three letters RAE, their hope will be in vain. The excitement, disappointment or complete confusion over the pre-Christmas announcement of the results may be diminishing, but the analyses go on.
The number of interpretations and permutations of the data is such that almost every institution can find something to celebrate, even if it is only those four staff in an unheard-of department who were ranked 3* or 4*.
Local press headlines such as “University that you have never heard of beats Oxford” abound. It is only by the fourth paragraph that you realise that the success is in that curious Unit of Assessment (UoA) that you also hadn’t heard of. Institutions will inevitably be assessing their research power, grade-point average (GPA) or the GPA multiplied by the number of staff, or Olympic-style “gold medals” (the number of staff ranked at 4* or, for some, 3* and 4*). The few that are claiming success based on the number of 2*, 3* and 4* may be struggling a bit.
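For readers unfamiliar with these rival measures, the arithmetic behind them is simple. The sketch below is illustrative only: the quality profile and staff numbers are invented, and institutions vary in exactly how they weight the figures.

```python
# Hypothetical quality profile for one Unit of Assessment: the percentage
# of submitted activity judged at each RAE grade (4* down to unclassified).
# These numbers are invented for illustration.
profile = {"4*": 25, "3*": 40, "2*": 25, "1*": 10, "u": 0}
staff_fte = 30.0  # hypothetical number of staff returned (full-time equivalent)

# Each grade's numeric weight for the grade-point average.
weights = {"4*": 4, "3*": 3, "2*": 2, "1*": 1, "u": 0}

# Grade-point average: the profile-weighted mean quality score (0 to 4).
gpa = sum(weights[g] * pct for g, pct in profile.items()) / 100

# "Research power": the GPA multiplied by the volume of staff returned,
# which rewards breadth of submission as well as average quality.
research_power = gpa * staff_fte

# Olympic-style "gold medal" count: the share of activity at the top grades,
# scaled by staff volume.
gold = (profile["4*"] / 100) * staff_fte

print(f"GPA = {gpa:.2f}, research power = {research_power:.1f}, 4* volume = {gold:.1f}")
```

The same profile can therefore support quite different rankings: a small, highly rated unit tops a GPA table, while a large, merely good one tops a research-power table.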
One of the hottest debates of RAE 2008 has been about the proportion of staff returned in each higher education institution, or, more precisely, the absence of data on this proportion. The Higher Education Funding Council for England chose not to request data on the number of eligible staff, probably because of the complexities of assessing eligibility.
Many institutions will have returned independent research fellows, who are not formally recorded as academic staff, hence their return rate could be greater than 100 per cent (as shown in some of the Times Higher Education estimates based on last year’s Higher Education Statistics Agency data). But the absence of such data has attracted remarkable attention, given that information about return rates was largely ignored in RAE 2001.
On the first day back for many, 5 January, we will see the grade profiles for each UoA in our own institution. These will reveal how much of the profile of each UoA submitted was due to outputs, environment and esteem. The contribution of each varies between disciplines, making comparisons even trickier than for overall profiles. When the profiles for all institutions are released, there will be an opportunity for yet more league tables.
Then we will also be able to scrutinise the thousands of pages of submissions in an attempt to understand why that place down the road pipped us in what we thought was our top department. But of course the real outcome won’t come until March, when we hear about the funding, which will lead to yet more league tables. There is already much second-guessing of how the funding model will operate. Many expect a selective weighting favouring the upper grades by at least fourfold and possibly even more.
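To see why that weighting matters so much, consider a toy version of the model. The weights below (4* worth seven times 1*, say, and 2* worth almost nothing) are pure speculation on my part, one hypothetical reading of a "selective" scheme rather than Hefce's actual formula.

```python
# One speculative selective-funding scheme: upper grades weighted heavily,
# lower grades near zero. These weights are invented for illustration.
WEIGHTS = {"4*": 7, "3*": 3, "2*": 1, "1*": 0}

def quality_weighted_volume(profile, staff_fte):
    """Quality-weighted volume under the hypothetical weights above:
    the quantity that would drive a UoA's share of the funding pot."""
    quality = sum(WEIGHTS[g] * profile.get(g, 0) for g in WEIGHTS) / 100
    return quality * staff_fte

# Two invented submissions of identical size but different profiles.
strong = {"4*": 30, "3*": 40, "2*": 25, "1*": 5}
modest = {"4*": 10, "3*": 30, "2*": 40, "1*": 20}

print(quality_weighted_volume(strong, 20))  # top-heavy profile
print(quality_weighted_volume(modest, 20))  # middling profile
```

Under these assumed weights the stronger profile attracts well over half as much again as the modest one, despite both returning the same number of staff, which is why so much hangs on the weighting Hefce eventually announces.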
The RAE has always been the subject of controversy, this time perhaps more than most, since a few vice-chancellors have suggested that it is outdated, expensive, time-consuming and perhaps even unfair. The Hefce costs are a tiny fraction of the costs of other funding schemes such as research council grants. Ongoing analysis of the costs of the RAE to higher education institutions will reveal these to be much greater, but isn’t regular assessment of our research quality a normal part of any university? In which case, at least some of the RAE activity would be normal practice.
As the person with overall responsibility for all of our university’s RAE submission (a modest 53 units), as well as serving as a panel member and participating in analysis of the results, I can say that yes, of course it was hard work, but it was short-lived. By comparison, membership of a major funding panel (or chairing such a body) is much more onerous and lasts for many years.
My experience of panel membership was of rigour, fairness and collective agreement. I was surprised that the average GPA in our discipline (pre-clinical studies), which is ranked internationally as first or second by various measures, appeared somewhat lower than in other subjects, but parity across the whole range of disciplines is probably impossible. Hefce has certainly succeeded in avoiding the “cliff-edge” effect of grade bands.
But the profile system is complicated and open to numerous interpretations, most of which seem to be a mystery to the British press. It was fascinating to read on 18 December that four different institutions were claimed as RAE leaders by different publications.
The focus on outputs rather than individuals made assessments much more difficult than in 2001, when a combination of factors was used to assess each member of staff. Hefce is understandably concerned about legal challenges, but most of us assess individuals on a regular basis as part of our day jobs.
So what next? The RAE’s replacement, the research excellence framework, was initially proposed as an assessment based solely on metrics, at least for the sciences. But it has already suffered much criticism. The divide between sciences and arts is at best fuzzy and many disciplines span both. How do we deal with geography, social sciences, linguistics and even computer sciences (where my colleagues insist that one citation a year is very good)? Metrics can be powerful and objective, but they also have problems ranging from sub-discipline effects and inaccuracies in public databases to the inherently retrospective nature of bibliometrics.
The first REF would have assessed many of the same outputs as RAE 2008. Many panels in the recent RAE used bibliometrics, but with caution and peer assessment. The REF should do the same. Then there is the fact that, if funding bodies are worried about legal challenge, “peer judgment” by experts is the very best cloak to hide behind, since it will rarely be challenged by courts. In contrast, the use of potentially dodgy numbers is an open door to the lawyers. There are many ways to streamline the RAE process, but academics will always find ways of making it more complicated.
Assessment does improve performance, and as long as it isn’t too onerous it will always have a key role in the UK higher education sector. One message, though, from RAE 2008 is that the outcomes are just a bit too complicated for the British press to fully grasp, so could Hefce please give a bit more guidance next time?
Nancy Rothwell is MRC research professor, deputy president and deputy vice-chancellor, University of Manchester.