REF 2014 rerun: who are the 'game players'?

How do rankings shift when institutions are compared on research intensity, which takes into account the percentage of staff submitted, rather than on standard GPA alone?

January 1, 2015

Source: Elly Walton

Talk off the record to any university head of research about the research excellence framework and it will not be long before you get on to the topic of the biggest “game players”.

These are institutions that were supposedly highly selective in terms of which researchers they submitted in the last round of research assessment so that they could maximise their position in quality rankings, such as those published by Times Higher Education on 18 December. To their critics, such institutions are in essence cheating because in reality their quality score reflects the work produced by only a small proportion of their staff.

One such critic is Bob Allison, vice-chancellor of Loughborough University. “How can it be fair to say ‘Well done – your league table position is excellent’ [when it] in part reflects the fact that you left out a large group of staff who would have dragged down your GPA?” he asks. “To get a fair and comparable picture [of research quality] across the sector, you have to consider the full academic research base.”

For this reason, he prefers research power as a basis for rankings: this measure multiplies GPA by the number of staff submitted. However, research power arguably gives an unfair advantage to large departments and institutions, which can still achieve a high research power despite being selective.

In light of this, and in response to demand from the sector, THE this week takes a different approach to identifying true research intensity, presenting the results of the 2014 REF on the basis of GPA multiplied by the proportion of eligible staff each institution and department submitted (for detailed methodology, see below).
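To take a hypothetical example: an institution with a GPA of 3.0 that submitted 60 per cent of its eligible staff would score 3.0 × 0.6 = 1.8, and would therefore rank below an institution with a GPA of 2.5 that submitted 90 per cent of its staff (2.5 × 0.9 = 2.25).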

This method is not perfect. The major flaw is that it in effect gives a zero score to the research of anyone not submitted to the REF – which, in many cases, will clearly not reflect reality. On the other hand, critics of alleged game-playing will argue that if institutions value someone’s research, they should have submitted it. For Allison, a ranking based on intensity-weighted GPA is “legitimate and important”.

Our data on the number of staff eligible for submission come from the Higher Education Statistics Agency, but the figures were not available in time to be used in THE’s first set of rankings on the REF results (“Check your coordinates”, 18/25 December). The Hesa data come with a number of “health warnings” (see methodology), and THE is also aware of at least two universities that dispute their Hesa figures. However, the overall rankings and subject tables that follow still provide an interesting alternative take on the REF results.

When ranked by intensity-weighted GPA, the Institute of Cancer Research retains the top spot it also holds on standard GPA, having submitted 95 per cent of its eligible staff. The University of Cambridge rises from fifth to second; Imperial College London slips from second to third; University College London rises from joint eighth to fourth; and the University of Bristol rises from joint 11th to joint fifth with the University of Oxford (which slips one spot from its position in the GPA table).

Loughborough is among a glut of small research-intensive universities that do much better on intensity-weighted GPA than they do on standard GPA; it shoots up from joint 49th to 14th on the back of submitting 88 per cent of its eligible staff. (Other similar cases include the universities of Kent, Reading, Essex and Leicester, as well as University of London colleges Soas and Birkbeck.)

According to Allison, Loughborough’s approach to the REF reflects its values, which include “be inclusive”, “recognise…our staff for their contribution and commitment” and “work together…with professionalism and integrity”.

“We have done our best to live by [those] values…I wonder how many other universities took that as their starting point?” he asks. “We wanted to demonstrate to both current and prospective students that we value research by all the staff who teach them.”

Giles Carden, director of strategic planning and analytics at the University of Warwick, also believes that the research intensity ranking “makes a useful comparison with the standard GPA ranking”, even though Warwick performs slightly better on the latter measure (in joint eighth position, as opposed to 11th position on intensity-weighted GPA). “Clearly, higher education institutions that are more selective can concentrate their research quality and rise up the GPA rankings. The intensity model adjusts for this and has quite a dramatic effect on some institutions’ rank position,” he notes.

Not surprisingly, research-intensive universities, which generally submit substantial proportions of eligible staff, tend to do better on intensity-weighted GPA than do post-92 institutions. Excluding anomalies (see methodology and notes), the highest proportions of staff – 95 per cent – were submitted by Queen’s University Belfast, the Institute of Cancer Research and Cambridge (although Hesa warns that the colleges of the universities of Cambridge and Oxford and the University of the Highlands and Islands have staff eligible for REF submission who are not recorded in its statistics, potentially making those universities’ submission percentages artificially high). Imperial College submitted 92 per cent of eligible staff, and University College London and the University of Bristol 91 per cent.

Among research-intensives, the lowest proportions of eligible staff were submitted by Cranfield University (37 per cent) and Aston University (43 per cent). As a consequence, Cranfield slips from joint 31st on standard GPA to 64th, and Aston from joint 35th to 60th. Neither responded to a request for comment.

The highest post-92 university in the intensity table is the University of Roehampton, which submitted 67 per cent of its eligible staff. It sits in 52nd place, compared with its 60th position based on standard GPA. Cardiff Metropolitan University, which placed higher (41st) in the standard table, plummets to joint 123rd place in the intensity table, having submitted just 9 per cent of eligible staff.

Among institutions that submitted to more than one unit of assessment, Southampton Solent University had the lowest percentage of staff entered in the REF – 7 per cent. It is among 19 post-92s that submitted less than 20 per cent of their staff.

Alongside Loughborough, Brunel University London is the research-intensive university that rises the most places on intensity-weighted GPA – from joint 75th to 40th – after submitting 85 per cent of its eligible staff.

Geoff Rodgers, deputy vice-chancellor for research at Brunel, says the institution put in “basically everybody who is doing research” because it wants to encourage all staff to improve their research and to have that improvement assessed.

“The REF is a measurement point, not an end point,” he continues. “We are committed to trying to say we are a broad-based research community all working together to improve it. It is a nice, straightforward message about what we are trying to achieve and where we are going. Hiring people to do research and then not putting them in [to the REF] sends a very mixed message, which is very difficult to manage.”

It is little surprise, perhaps, that he thinks intensity-weighted GPA is “a truer and more accurate measure of institution-wide research quality and a more rigorous basis on which to compare institutions” than standard GPA.

A similar view is held by Patrick Johnston, vice-chancellor of Queen’s University Belfast, which rises from joint 42nd on basic GPA to joint eighth on intensity-weighted GPA.

“Each institution will have different priorities, but, as the lead university in Northern Ireland and a Russell Group member, we have a responsibility to be as broad as we can [be while remaining] competitive [on GPA],” he says.

Apart from Aston and Cranfield, other research-intensives to slip significantly on intensity-weighted GPA compared with standard GPA include the universities of Swansea (joint 42nd compared with joint 26th), York (32nd, down from joint 14th), Sheffield (33rd, down from joint 14th), Bath (joint 34th, down from joint 14th) and Queen Mary University of London (joint 34th, down from joint 11th), all of which submitted just under three-quarters of their eligible staff.

But arguably the most notable faller is Cardiff University. On standard GPA, it ranks fifth among multi-departmental universities. In a press release put out on the day of the results, Colin Riordan, Cardiff’s vice-chancellor, said: “This extraordinary achievement marks us out as a world leader” and was “a testimony to the excellence and hard work of all our staff”. However, the university submitted only 62 per cent of its eligible staff, causing it to slip to 50th on intensity-weighted GPA. Cardiff did not respond to a request for comment.

Among post-92s, big fallers include Sheffield Hallam University, down from joint 63rd to joint 109th after submitting only 16 per cent of staff, and Coventry University, down from joint 75th to joint 120th after submitting just 13 per cent. The University of Brighton, which is the second highest post-92 institution in the standard GPA table, in joint 58th, falls to joint 95th with a 21 per cent submission rate.

When assessed by intensity-weighted GPA for outputs alone, the Institute of Cancer Research retains the top slot it shares on standard GPA with the London School of Economics (which falls to seventh place). Cambridge rises from joint fourth to second, and Imperial College rises from joint fourth to third. Significant risers include Bristol, up from joint 15th to fifth, and University College London, up from joint 18th to sixth. Loughborough climbs from 56th to 12th, while Queen’s rises from joint 50th to joint eighth. Cardiff falls from joint seventh to 50th.

Intensity-weighted GPA also shakes things up at subject level. Although Oxford is the big winner on standard GPA, coming top in 10 of the REF’s 36 units of assessment, it tops only half that number on intensity-weighted GPA. By contrast, Cambridge doubles its number of first places, to eight. However, some caution is required in interpreting this about-turn given the vagaries, discussed above, of Hesa’s Oxbridge data.

Imperial College also improves its haul of first places on intensity-weighted GPA, from none on standard GPA to four. The LSE also has four, and Queen’s, Lancaster University, the Institute of Cancer Research and the universities of Bristol and Glasgow head two units each. Only 13 of the 36 subpanels are led by the same institution on both standard and intensity-weighted GPA.

At the level of individual units of assessment, some institutions that rank first on standard GPA fall a long way when ranked on intensity-weighted GPA. For example, in allied health professions, dentistry, nursing and pharmacy, the University of Sheffield’s biomedical science submission drops to 32nd (of 94 institutions) with a 56 per cent submission proportion; in law, King’s College London falls to 26th (of 67), submitting 59 per cent; in civil and construction engineering, Cardiff falls to 12th (of 14) with 55 per cent; and in physics the University of Strathclyde falls to 24th (of 41) with 77 per cent. By contrast, in psychology, psychiatry and neuroscience, Glasgow rises from joint 22nd to joint first.

Looking at units of assessment as a whole, education appears to be the least research-intensive discipline, with institutions overall submitting less than 40 per cent of their eligible staff, the lowest proportion of any subject. Allied health professions, dentistry, nursing and pharmacy; business and management studies; art and design; and sport and exercise sciences, leisure and tourism also submitted less than 40 per cent. The highest proportions of staff were submitted by classics (94 per cent), philosophy (89 per cent), physics (87 per cent) and history (84 per cent).

This means that classics has the highest overall intensity-weighted GPA, followed by physics, philosophy, mathematical sciences and history. The joint highest-ranking subjects by standard GPA, chemistry and clinical medicine, slip to sixth and seventh respectively, and the third, biological sciences, to joint 15th. The highest-placed social science, anthropology and development studies, is ninth, with an 80 per cent submission rate.

Given the imperfection, mentioned above, of all methods of devising rankings based on the REF results, it is arguable that a definitive picture of overall departmental quality could be achieved only if universities were required to submit all their eligible staff. But as many observers have pointed out, that might only prompt a different kind of game-playing around the redefinition of staff contracts.

In light of that, THE invites universities and academics to examine the results that follow in conjunction with those based on standard GPA and research power, and to make of them all as much or as little as they see fit.

Methodology

In the “ranking of institutions on intensity”, “ranking of single-subject institutions on intensity” and “subject ranking on intensity” tables, institutions are ranked according to the grade point average (GPA) of their overall quality profiles, weighted according to the proportion of eligible staff they submitted. This is termed an intensity-weighted GPA.

The overall quality profiles are calculated from the REF data published on 18 December. These present the percentage of each institution’s submission, in each unit of assessment, that falls into each of five quality categories. These are, for outputs, 4* (world-leading), 3* (internationally excellent), 2* (internationally recognised), 1* (nationally recognised) and unclassified. For impact, 4* indicates “outstanding”, 3* “very considerable”, 2* “considerable” and 1* “recognised but modest”.

Times Higher Education then aggregates these profiles into a single institutional quality profile based on the number of full-time equivalent staff submitted to each unit of assessment. This reflects the view that larger departments should count for more in calculating an institution’s overall quality.
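For illustration, using invented figures: an institution that submitted 100 FTE staff to a unit of assessment in which 50 per cent of its work was rated 4*, and 50 FTE to a unit in which 20 per cent was rated 4*, would have an overall 4* share of (100 × 50 + 50 × 20) / 150 = 40 per cent.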

Each institution’s overall quality profile is then converted into a GPA by multiplying its percentage of 4* research by 4, its percentage of 3* research by 3, its percentage of 2* research by 2 and its percentage of 1* research by 1. The totals are then added together and divided by 100 to give a number between 0 and 4. Note that because of the rounding of decimals, there may appear to be a small discrepancy between the overall quality profiles and the stated GPA. Where universities have the same GPA, they are ranked alphabetically.
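For example, a hypothetical overall profile of 30 per cent 4*, 40 per cent 3*, 25 per cent 2* and 5 per cent 1* gives a GPA of (30 × 4 + 40 × 3 + 25 × 2 + 5 × 1) / 100 = 2.95.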

The intensity-weighted GPA is then calculated by multiplying the standard GPA by the total number of FTE staff the institution submitted to the REF and then dividing that figure by the total number of FTE staff who were eligible to be submitted.
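The whole calculation can be summarised in a few lines of code. The following is a minimal sketch in Python, using invented figures purely for illustration: the function names and data layout are ours, not THE’s or Hesa’s, and real inputs would come from the published REF quality profiles and the Hesa staff counts.

    def gpa(profile):
        # profile maps a star rating (4, 3, 2, 1, 0) to the percentage of
        # the submission judged to be of that quality.
        return sum(star * pct for star, pct in profile.items()) / 100

    def overall_profile(units):
        # Aggregate per-unit profiles into a single institutional profile,
        # weighting each unit by the FTE staff submitted to it.
        total_fte = sum(u["fte"] for u in units)
        return {
            star: sum(u["profile"].get(star, 0) * u["fte"] for u in units) / total_fte
            for star in (4, 3, 2, 1, 0)
        }

    def intensity_weighted_gpa(units, eligible_fte):
        # The standard GPA scaled by the proportion of eligible staff submitted.
        submitted_fte = sum(u["fte"] for u in units)
        return gpa(overall_profile(units)) * submitted_fte / eligible_fte

    # Hypothetical institution: two units of assessment, 150 FTE submitted
    # out of 200 eligible staff.
    units = [
        {"fte": 100, "profile": {4: 30, 3: 40, 2: 25, 1: 5}},
        {"fte": 50, "profile": {4: 20, 3: 50, 2: 25, 1: 5}},
    ]
    print(round(intensity_weighted_gpa(units, eligible_fte=200), 2))  # ~2.19

Note that multiplying by submitted FTE and dividing by eligible FTE is equivalent to multiplying the standard GPA by the proportion of eligible staff submitted.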

The stated figure for number of UoAs entered counts multiple submissions separately.

In the “subject ranking on intensity” tables, each institutional submission to each of the 36 units of assessment is ranked according to its intensity-weighted GPA in that unit. This is calculated by producing a standard GPA for the submission, by the method described above, then multiplying it by the total number of FTE staff the institution submitted to that unit of assessment and dividing by the total number of FTE staff who were eligible to be submitted to it.

Where the same institution made multiple submissions to the same unit of assessment, the various submissions are marked with different letters (eg, University of Poppleton A). Where universities have made joint submissions, they are listed alphabetically on separate lines.

The figures on the number of staff eligible to be submitted were published by the Higher Education Statistics Agency on 18 December. They have been produced to match the REF’s definition of FTEs as closely as possible, but Hesa warns that there is still not a “perfect match”.

Hesa also notes that eligibility for REF submission does not necessarily imply that an academic is “research active”.

Hesa also warns that its number count will be artificially high where extra staff are appointed to cover leaves of absence, and may be artificially low where research assistants have been returned to the REF, since it does not count research assistants. Hesa has also counted staff even where it is unclear which unit of assessment they were eligible for. Hence, some institutions may have had more eligible staff for particular units of assessment than Hesa is aware of.

In addition, the colleges of the universities of Oxford and Cambridge and the University of the Highlands and Islands have staff who are eligible for submission to the REF but who are not included in the Hesa figures. All this means that, in some cases, the number of staff eligible appears to exceed the number submitted. Where this statistical anomaly arises, THE has given the institution in question a submission percentage of 100 per cent and put an asterisk next to the figure. Hesa has provided no figures for the Institute of Zoology or the University of London Institute in Paris, so they are excluded from the table. UHI is also excluded from a number of subject tables where no data are given for submission numbers.

Note that the staff numbers supplied by Hesa are rounded, whereas those supplied by Hefce are not. This means that the figure calculated for the proportion of eligible staff may be slightly inaccurate.


Data analysis was undertaken by TES Global Information. Special thanks to Almut Sprigade and Ilan Lazarus. The official REF data are at: www.ref.ac.uk.


Readers' comments (4)

Thank you Paul for the calculations, the reporting and the hard work. Happy New Year to you and all your colleagues at and readers of the Times Higher Education. While it is already 2015 in London, I am writing from 2014 in Mexico with an invitation to a reading of 'pythonesque success', a REFugee's response in poetry: https://fanismissirlis.wordpress.com/2014/12/28/pythonesque-success/
Silly argument. Were this metric added there would be a bit of shuffling amongst the Russell Group but the post-1992s would be badly hit. Of course the post-1992s will return a lower percentage of staff. They're traditionally teaching-intensive and that legacy is difficult to displace in a national environment loaded against them. The dice are systematically loaded by the Russell Group who want to maintain their privileged position. Including this daft metric in the overall exercise would further prevent centres of excellence in post-1992s from developing in the future. In addition, it would allow the Russell Group to be rewarded, with the aid of a little bit of judicious grade inflation where it counts, for volume of mediocrity, which is the forte of those outside the top four or five. Please bury this.
So, should any researcher or university be happy with research which is of 'Quality that is recognised internationally in terms of originality, significance and rigour'? Well, certainly not the THE, or anyone it seems, because this is the description of work which is 'only' 2*. Therefore, by the lights of this article it should be discounted. One could say, go figure, and leave it at that. But what one might say, in addition, recognizing @INCUBUS's comment, is that a teaching-intensive university with a significant amount of 2* research is doing really well by the standards of civilized society. Of course, there are perverse incentives not to submit such work, in any university. Is that what is intended? Should work 'recognized internationally' be discounted?
And, at the heart of all this is the ridiculousness of the GPA. GPA calculations assume that a 3* paper is better than a 2* paper by 20%. Moreover, that a 3* paper in any subject is 20% better than a 2* paper in any other subject. Moreover, that being one of several authors of a 3* paper in any subject is 20% better than being the single author of a 2* paper in another subject. When, really, the multiple-authored 3* paper may have just scraped in, whereas the single-authored 2* paper just failed to reach the 3* bar, there being no 'real' reason for the categorization in either case. Or, to put it another way, this is all getting a bit out of hand, isn't it?