REF 2014 rerun: who are the 'game players'?

How do rankings shift when institutions are compared on research intensity, which takes into account percentage of staff submitted, rather than on standard GPA alone?

January 1, 2015


Talk off the record to any university head of research about the research excellence framework and it will not be long before you get on to the topic of the biggest “game players”.

These are institutions that were supposedly highly selective in terms of which researchers they submitted in the last round of research assessment so that they could maximise their position in quality rankings, such as those published by Times Higher Education on 18 December. To their critics, such institutions are in essence cheating because in reality their quality score reflects the work produced by only a small proportion of their staff.

One such critic is Bob Allison, vice-chancellor of Loughborough University. “How can it be fair to say ‘Well done – your league table position is excellent’ [when it] in part reflects the fact that you left out a large group of staff who would have dragged down your GPA?” he asks. “To get a fair and comparable picture [of research quality] across the sector, you have to consider the full academic research base.”

For this reason, he prefers research power as a basis for rankings: this measure multiplies GPA by the number of staff submitted. However, research power arguably gives an unfair advantage to large departments and institutions, which can still achieve a high research power despite being selective.

In light of this, and in response to demand from the sector, THE this week takes a different approach to identifying true research intensity, presenting the results of the 2014 REF on the basis of GPA multiplied by the proportion of eligible staff each institution and department submitted (for detailed methodology, see below).

This method is not perfect. The major flaw is that it in effect gives a zero score to the research of anyone not submitted to the REF – which, in many cases, will clearly not reflect reality. On the other hand, critics of alleged game-playing will argue that if institutions value someone’s research, they should have submitted it. For Allison, a ranking based on intensity-weighted GPA is “legitimate and important”.

Our data on the number of staff eligible for submission come from the Higher Education Statistics Agency, but the figures were not available in time to be used in THE’s first set of rankings on the REF results (“Check your coordinates”, 18/25 December). The Hesa data come with a number of “health warnings” (see methodology), and THE is also aware of at least two universities that dispute their Hesa figures. However, the overall rankings and subject tables that follow still provide an interesting alternative take on the REF results.

When ranked by intensity-weighted GPA, the Institute of Cancer Research retains the top spot it also holds on standard GPA, having submitted 95 per cent of its eligible staff. The University of Cambridge rises from fifth to second; Imperial College London slips from second to third; University College London rises from joint eighth to fourth; and the University of Bristol rises from joint 11th to joint fifth with the University of Oxford (which slips one spot from its position in the GPA table).

Loughborough is among a glut of small research-intensive universities that do much better on intensity-weighted GPA than they do on standard GPA; it shoots up from joint 49th to 14th on the back of submitting 88 per cent of its eligible staff. (Other similar cases include the universities of Kent, Reading, Essex and Leicester, as well as University of London colleges Soas and Birkbeck.)

According to Allison, Loughborough’s approach to the REF reflects its values, which include “be inclusive”, “recognise…our staff for their contribution and commitment” and “work together…with professionalism and integrity”.

“We have done our best to live by [those] values…I wonder how many other universities took that as their starting point?” he asks. “We wanted to demonstrate to both current and prospective students that we value research by all the staff who teach them.”

Giles Carden, director of strategic planning and analytics at the University of Warwick, also believes that the research intensity ranking “makes a useful comparison with the standard GPA ranking”, even though Warwick performs slightly better on the latter measure (in joint eighth position, as opposed to 11th position on intensity-weighted GPA). “Clearly, higher education institutions that are more selective can concentrate their research quality and rise up the GPA rankings. The intensity model adjusts for this and has quite a dramatic effect on some institutions’ rank position,” he notes.

Not surprisingly, research-intensive universities, which generally submit substantial proportions of eligible staff, tend to do better on intensity-weighted GPA than do post-92 institutions. Excluding anomalies (see methodology and notes), the highest proportions of staff – 95 per cent – were submitted by Queen’s University Belfast, the Institute of Cancer Research and Cambridge (although Hesa warns that the colleges of the universities of Cambridge and Oxford and the University of the Highlands and Islands have staff eligible for REF submission who are not recorded in its statistics, potentially making those universities’ submission percentages artificially high). Imperial College submitted 92 per cent of eligible staff, and University College London and the University of Bristol 91 per cent.

Among research-intensives, the lowest proportions of eligible staff were submitted by Cranfield University (37 per cent) and Aston University (43 per cent). As a consequence, the institutions slip, respectively, from joint 31st on standard GPA to 64th and from joint 35th to 60th. Neither responded to a request for comment.

The highest post-92 university in the intensity table is the University of Roehampton, which submitted 67 per cent of its eligible staff. It sits in 52nd place, compared with its 60th position based on standard GPA. Cardiff Metropolitan University, which placed higher (41st) in the standard table, plummets to joint 123rd place in the intensity table, having submitted just 9 per cent of eligible staff.

Among institutions that submitted to more than one unit of assessment, Southampton Solent University had the lowest percentage of staff entered in the REF – 7 per cent. It is among 19 post-92s that submitted less than 20 per cent of their staff.

Alongside Loughborough, Brunel University London is the research-intensive that rises the most places on intensity-weighted GPA – from joint 75th to 40th – after submitting 85 per cent of its eligible staff.

Geoff Rodgers, deputy vice-chancellor for research at Brunel, says the institution put in “basically everybody who is doing research” because it wants to encourage all staff to improve their research and to have that improvement assessed.

“The REF is a measurement point, not an end point,” he continues. “We are committed to trying to say we are a broad-based research community all working together to improve it. It is a nice, straightforward message about what we are trying to achieve and where we are going. Hiring people to do research and then not putting them in [to the REF] sends a very mixed message, which is very difficult to manage.”

It is little surprise, perhaps, that he thinks intensity-weighted GPA is “a truer and more accurate measure of institution-wide research quality and a more rigorous basis on which to compare institutions” than standard GPA.

A similar view is held by Patrick Johnston, vice-chancellor of Queen’s University Belfast, which rises from joint 42nd on basic GPA to joint eighth on intensity-weighted GPA.

“Each institution will have different priorities, but, as the lead university in Northern Ireland and a Russell Group member, we have a responsibility to be as broad as we can [be while remaining] competitive [on GPA],” he says.

Apart from Aston and Cranfield, other research-intensives to slip significantly on intensity-weighted GPA compared with standard GPA include the universities of Swansea (joint 42nd compared with joint 26th), York (32nd, down from joint 14th), Sheffield (33rd, down from joint 14th), Bath (joint 34th, down from joint 14th) and Queen Mary University of London (joint 34th, down from joint 11th), all of which submitted just under three-quarters of their eligible staff.

But arguably the most notable faller is Cardiff University. On standard GPA, it ranks fifth among multi-departmental universities. In a press release put out on the day of the results, Colin Riordan, Cardiff’s vice-chancellor, said: “This extraordinary achievement marks us out as a world leader”, adding that it was “a testimony to the excellence and hard work of all our staff”. However, the university submitted only 62 per cent of its eligible staff, causing it to slip to 50th on intensity-weighted GPA. Cardiff did not respond to a request for comment.

Among post-92s, big fallers include Sheffield Hallam University, down from joint 63rd to joint 109th after submitting only 16 per cent of staff, and Coventry University, down from joint 75th to joint 120th after submitting just 13 per cent. The University of Brighton, which is the second highest post-92 institution in the standard GPA table, in joint 58th, falls to joint 95th with a 21 per cent submission rate.

When assessed by intensity-weighted GPA for outputs alone, the Institute of Cancer Research retains the top slot it shares on standard GPA with the London School of Economics (which falls to seventh place). Cambridge rises from joint fourth to second, and Imperial College rises from joint fourth to third. Significant risers include Bristol, up from joint 15th to fifth, and University College London, up from joint 18th to sixth. Loughborough climbs from 56th to 12th, while Queen’s rises from joint 50th to joint eighth. Cardiff falls from joint seventh to 50th.

Intensity-weighted GPA also shakes things up at subject level. Although Oxford is the big winner on standard GPA, coming top in 10 of the REF’s 36 units of assessment, it tops only half that number on intensity-weighted GPA. By contrast, Cambridge doubles its number of first places, to eight. However, some caution is required in interpreting this about-turn given the vagaries, discussed above, of Hesa’s Oxbridge data.

Imperial College also improves its haul of first places on intensity-weighted GPA, from none on standard GPA to four. The LSE also has four, and Queen’s, Lancaster University, the Institute of Cancer Research and the universities of Bristol and Glasgow head two units each. Only 13 of the 36 subpanels are led by the same institution on both standard and intensity-weighted GPA.

At the level of individual units of assessment, some institutions that rank first on standard GPA fall a long way when ranked on intensity-weighted GPA. For example, in allied health professions, dentistry, nursing and pharmacy, the University of Sheffield’s biomedical science submission drops to 32nd (of 94 institutions) with a 56 per cent submission proportion; in law, King’s College London falls to 26th (of 67), submitting 59 per cent; in civil and construction engineering, Cardiff falls to 12th (of 14) with 55 per cent; and in physics the University of Strathclyde falls to 24th (of 41) with 77 per cent. By contrast, in psychology, psychiatry and neuroscience, Glasgow rises from joint 22nd to joint first.

Looking at units of assessment as a whole, education appears to be the least research-intensive discipline, with institutions submitting less than 40 per cent of their eligible staff. Allied health professions, dentistry, nursing and pharmacy; business and management studies; art and design; and sport and exercise sciences, leisure and tourism also submitted less than 40 per cent. The highest proportions of staff were submitted in classics (94 per cent), philosophy (89 per cent), physics (87 per cent) and history (84 per cent).

This means that classics has the highest overall intensity-weighted GPA, followed by physics, philosophy, mathematical sciences and history. The joint highest-ranking subjects by standard GPA, chemistry and clinical medicine, slip to sixth and seventh respectively, and the third-placed subject, biological sciences, falls to joint 15th. The highest-placed social science, anthropology and development studies, is ninth, with an 80 per cent submission rate.

Given the imperfection, mentioned above, of all methods of devising rankings based on the REF results, it is arguable that a definitive picture of overall departmental quality could be achieved only if universities were required to submit all their eligible staff. But as many observers have pointed out, that might only prompt a different kind of game-playing around the redefinition of staff contracts.

In light of that, THE invites universities and academics to examine the results that follow in conjunction with those based on standard GPA and research power, and to make of them all as much or as little as they see fit.

Methodology

In the “ranking of institutions on intensity”, “ranking of single-subject institutions on intensity” and “subject ranking on intensity” tables, institutions are ranked according to the grade point average (GPA) of their overall quality profiles, weighted according to the proportion of eligible staff they submitted. This is termed an intensity-weighted GPA.

The overall quality profiles are calculated from the REF data published on 18 December. These present the percentage of each institution’s submission, in each unit of assessment, that falls into each of five quality categories. These are, for outputs, 4* (world-leading), 3* (internationally excellent), 2* (internationally recognised), 1* (nationally recognised) and unclassified. For impact, 4* indicates “outstanding”, 3* “very considerable”, 2* “considerable” and 1* “recognised but modest”.

Times Higher Education then aggregates these profiles into a single institutional quality profile based on the number of full-time equivalent staff submitted to each unit of assessment. This reflects the view that larger departments should count for more in calculating an institution’s overall quality.
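The FTE-weighted aggregation described above can be sketched in a few lines of Python. This is an illustrative sketch only: the unit profiles and staff numbers below are invented, not real REF data.

```python
# Illustrative sketch of the FTE-weighted aggregation of unit
# quality profiles into a single institutional profile.
# All figures below are invented.

def aggregate_profiles(units):
    """Combine unit-of-assessment quality profiles into one
    institutional profile, weighting each unit by the FTE
    staff submitted to it. Each profile maps a star rating
    to a percentage; percentages in a profile sum to 100."""
    total_fte = sum(fte for fte, _ in units)
    combined = {}
    for fte, profile in units:
        for star, pct in profile.items():
            combined[star] = combined.get(star, 0.0) + pct * fte / total_fte
    return combined

units = [
    (40.0, {"4*": 30, "3*": 50, "2*": 15, "1*": 5}),   # larger unit
    (10.0, {"4*": 10, "3*": 40, "2*": 40, "1*": 10}),  # smaller unit
]
overall = aggregate_profiles(units)
# The larger unit dominates: the combined 4* share is
# (30 * 40 + 10 * 10) / 50 = 26 per cent.
```

Weighting by FTE means a strong large department lifts the institutional profile more than an equally strong small one, which is the rationale stated above.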

Each institution’s overall quality profile is then converted into a GPA by multiplying its percentage of 4* research by 4, its percentage of 3* research by 3, its percentage of 2* research by 2 and its percentage of 1* research by 1. The products are then added together and divided by 100 to give a number between 0 and 4. Note that because of the rounding of decimals, there may appear to be a small discrepancy between the overall quality profiles and the stated GPA. Where universities have the same GPA, they are ranked alphabetically.
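The conversion from quality profile to GPA can be expressed directly; the profile used here is invented for illustration.

```python
# Illustrative GPA conversion: weight each starred percentage
# by its star value, sum the products, and divide by 100.

def gpa(profile):
    """Convert a quality profile (star rating -> percentage)
    into a grade point average between 0 and 4. Unclassified
    work contributes nothing."""
    weights = {"4*": 4, "3*": 3, "2*": 2, "1*": 1}
    return sum(profile.get(star, 0) * w for star, w in weights.items()) / 100

# An invented profile of 30% 4*, 50% 3*, 15% 2* and 5% 1*
# gives (120 + 150 + 30 + 5) / 100 = 3.05.
example = gpa({"4*": 30, "3*": 50, "2*": 15, "1*": 5})
```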

The intensity-weighted GPA is then calculated by multiplying the standard GPA by the total number of FTE staff the institution submitted to the REF and then dividing that figure by the total number of FTE staff who were eligible to be submitted.
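That weighting amounts to a one-line scaling of the standard GPA; the staff figures below are hypothetical.

```python
# Illustrative intensity weighting: the standard GPA is scaled
# by the proportion of eligible FTE staff actually submitted.

def intensity_weighted_gpa(standard_gpa, fte_submitted, fte_eligible):
    return standard_gpa * fte_submitted / fte_eligible

# A GPA of 3.2 built on 60 of 100 eligible FTEs is scaled
# down to 3.2 * 0.6 = 1.92; the same GPA with every eligible
# member of staff submitted keeps its full value.
selective = intensity_weighted_gpa(3.2, 60, 100)
inclusive = intensity_weighted_gpa(3.2, 100, 100)
```

This is why a selective submission that flatters the standard GPA can fall sharply in the intensity table.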

The stated figure for number of UoAs entered counts multiple submissions separately.

In the subject ranking on intensity tables, each institutional submission to each of the 36 units of assessment is ranked according to the intensity-weighted GPA of each institution’s overall quality profiles in that unit. These are calculated by producing a standard GPA for the university’s submission, by the method described above, then multiplying it by the total number of FTE staff the institution submitted to that unit of assessment and then dividing it by the total number of FTE staff who were eligible to be submitted to that unit.

Where the same institution made multiple submissions to the same unit of assessment, the various submissions are marked with different letters (eg, University of Poppleton A). Where universities have made joint submissions, they are listed alphabetically on separate lines.

The figures on the number of staff eligible to be submitted were published by the Higher Education Statistics Agency on 18 December. They have been produced to match the REF’s definition of FTEs as closely as possible, but Hesa warns that there is still not a “perfect match”.

Hesa also notes that eligibility for REF submission does not necessarily imply that an academic is “research active”.

Hesa also warns that its staff count will be artificially high where extra staff are appointed to cover leaves of absence, and may be artificially low where research assistants have been returned to the REF, since it does not count research assistants. Hesa has also counted staff even where it is unclear which unit of assessment they were eligible for. Hence, some institutions may have had more eligible staff for particular units of assessment than Hesa is aware of.

In addition, the colleges of the universities of Oxford and Cambridge and the University of the Highlands and Islands have staff who are eligible for submission to the REF but who are not included in the Hesa figures. All this means that, in some cases, the number of staff eligible appears to exceed the number submitted. Where this statistical anomaly arises, THE has given the institution in question a submission percentage of 100 per cent and put an asterisk next to the figure. Hesa has provided no figures for the Institute of Zoology or the University of London Institute in Paris, so they are excluded from the table. UHI is also excluded from a number of subject tables where no data are given for submission numbers.

Note that the staff numbers supplied by Hesa are rounded, whereas those supplied by Hefce are not. This means that the figure calculated for the proportion of eligible staff may be slightly inaccurate.


Data analysis was undertaken by TES Global Information. Special thanks to Almut Sprigade and Ilan Lazarus. The official REF data are at: www.ref.ac.uk.


Reader's comments (7)

Thank you Paul for the calculations, the reporting and the hard work. Happy New Year to you and all your colleagues at and readers of the Times Higher Education. While it is already 2015 in London, I am writing from 2014 in Mexico with an invitation to a reading of 'pythonesque success', a REFugee's response in poetry: https://fanismissirlis.wordpress.com/2014/12/28/pythonesque-success/
1. Re research intensity, your 'eligible FTE per department' figures bear no relation to the actual numbers. In my own department, you exaggerate its size by 33%. I wish we did have the staff you claim we have.
2. The 'intensity' per subject measurement takes no account of staff who may have been entered into another UoA because their research is a better fit. I think it's called multidisciplinary work. You call it 'gameplaying'. I don't deny the latter exists, but what checks have you done to see whether the 'missing' members from a UoA have actually been entered in a related discipline? None, is my guess.
3. You are obsessed with GPA. What really matters in terms of prestige, or ought to, is the percentage of world-class, ie 4*, material. What will certainly matter in funding terms is 4*, not GPA, but you continue to use GPA as the basis of your main league table. Happy New Year nonetheless!
I don't think Steve's criticism of the THE's methodology is quite fair. Every University in their staff return to HESA had to attach individuals to units of assessment (not departments). So all the THE are doing is using the data returned by universities themselves as to who was eligible to be submitted to each UoA in order to create their tables. There are some minor problems with the data (eg REF operates on a census date, HESA staff data usually counts staff over a year (although I understand HESA tried to correct for this)). What this approach illustrates are the HEIs who leave out half their staff and then claim to be "top" for research!
Silly argument. Were this metric added there would be a bit of shuffling amongst the Russell Group but the post-1992s would be badly hit. Of course the post-1992s will return a lower percentage of staff. They're traditionally teaching-intensive and that legacy is difficult to displace in a national environment loaded against them. The dice are systematically loaded by the Russell Group who want to maintain their privileged position. Including this daft metric in the overall exercise would further prevent centres of excellence in post-1992s from developing in the future. In addition, it would allow the Russell Group to be rewarded, with the aid of a little bit of judicious grade inflation where it counts, for volume of mediocrity, which is the forte of those outside the top four or five. Please bury this.
So, should any researcher or university be happy with research which is of 'Quality that is recognised internationally in terms of originality, significance and rigour'? Well, certainly not the THE, or anyone it seems, because this is the description of work that is 'only' 2*. Therefore, by the lights of this article, it should be discounted. One could say, go figure, and leave it at that. But one might add, recognising @INCUBUS's comment, that a teaching-intensive university with a significant amount of 2* research is doing really well by the standards of civilised society. Of course, there are perverse incentives not to submit such work, in any university. Is that what is intended? Should work 'recognised internationally' be discounted?
And, at the heart of all this is the ridiculousness of the GPA. GPA calculations assume that a 3* paper is better than a 2* paper by 20%. Moreover, is a 3* paper in any subject 20% better than a 2* paper in any other subject? And is being one of several authors of a 3* paper 20% better than being the single author of a 2* paper in another subject? When, really, the multiple-authored 3* paper may have just scraped in, whereas the single-authored 2* paper just failed to reach the 3* bar, there being in each case no 'real' reason for the categorisation. Or, to put it another way, this is all getting a bit out of hand, isn't it?
The anomaly in the figures for Oxford and Cambridge effectively gives them licence to play with impunity the game that the article discusses. They can fail to submit as many people as they want and still get a 100% submission rate. In my field, philosophy, for example, Oxford takes top place thanks to the THE's hunch that since according to the HESA figures there were 65 eligible philosophers in Oxford, and that can’t be right, the right figure must be 72. By my calculation, even if Oxford failed to submit only three people they would lose their top place.
