REF 2014 impact case studies: government policy cited most

Informing government policy was the most common kind of impact submitted to the 2014 Research Excellence Framework (REF), a study has found

March 26, 2015

Need to know: informing government policy was most frequent in the sciences

The analysis of the 6,679 unredacted impact case studies submitted was conducted by a team from King’s College London and Digital Science, a scientific technology firm. It was set to be presented at the Higher Education Funding Council for England’s REFlections conference on 25 March.

Text mining revealed that the most common kinds of impact related to “informing government policy”, “parliamentary scrutiny”, “technology commercialisation” and “print, media and publishing”.

Parliamentary scrutiny is more frequent in the social sciences, while informing government policy is more prominent in the sciences. Impact on the media is most common in the humanities.

The prominence of media impact could be seen as surprising, given that the REF rules made clear that media mentions did not in themselves count as impact.

Jonathan Grant, director of the Policy Institute at King’s College London and leader of the analysis, said that he planned to examine further whether panels had given credit for media mentions despite the rules.

Another area where rules may have been bent relates to impacts within the academy – such as the writing of textbooks – which were officially banned. Despite this, the analysis identifies students as the second most frequent beneficiaries – after companies and ahead of children. But Professor Grant said he “worried” about the accuracy of that finding, given the difficulty of identifying beneficiaries using text mining.

He was more confident about broader conclusions, such as that the research underpinning impact was typically multidisciplinary (in 80 per cent of cases), and that impact was multifaceted (with 60 separate “impact topics” being identified).

Impact was also widespread. The analysis found that every country in the world was mentioned in case studies; the US, Australia, Canada, Germany and France were the most cited non-UK beneficiaries.

Although the research underpinning impact could date back to 1993, the majority had been published since 2008. Professor Grant suspected that this was because universities had been more confident about the quality of recent underlying research – which was required to be rated at least 2*.

Such risk aversion had also been apparent, he said, in a RAND Europe analysis of universities’ experiences of preparing impact submissions, also published on 25 March. But he added that there was no consensus among the universities on how to improve the process.
