A report has revealed which universities are awarding more first and upper second class degrees than would be expected based on their students’ backgrounds, raising fresh questions about the comparability of exam standards across the sector.
Of the 123 institutions listed in the Higher Education Funding Council for England report, 21 awarded significantly more firsts and upper seconds than would be expected – based on students’ school qualifications, social background, region and school type, gender, ethnicity and subject – while 18 awarded fewer than expected.
Hefce did not calculate exactly how the results translate into actual degree classes awarded, and the data, contained in an appendix of Differences in Degree Outcomes: Key Findings, released in March, are not explicitly analysed in the report.
But it is thought that the data could indicate that some universities deviate by up to 10 percentage points from the expected proportion of students who get “good” degrees.
Just because a university awards more firsts and upper seconds than expected does not necessarily mean it has low exam standards: the difference could reflect better teaching, or an ability to identify which students are likely to succeed irrespective of their school grades.
But Alan Smithers, director of the Centre for Education and Employment Research at the University of Buckingham, said that the results added “statistical clout to the widely held view that some universities award more firsts and upper seconds than is justified by their intakes”.
The university with the largest indicator that it was awarding more firsts and upper seconds was Liverpool Hope University. A spokesman attributed this to high retention rates and a “tutorial system” where students were taught in small groups.
The University of Exeter also awarded a higher rate of “good” degrees than expected. In 2010-11, the year when most of the 2007-08 cohort examined by Hefce would have graduated, Exeter awarded the sixth highest proportion of “good” degrees (82.7 per cent) in the UK, following Oxford, Cambridge, Edinburgh, Imperial College London and Bristol.
An Exeter spokeswoman said that the university’s “focus on excellent teaching is borne out in our degree results”.
At the other end of the scale, University Campus Suffolk – whose degrees are validated by the universities of East Anglia and Essex – had the biggest indicator that fewer students were getting “good” degrees than would be expected.
A spokeswoman said that this was because the campus was keen to “maintain integrity with its degree classifications” in the face of grade inflation and a cohort with relatively low entry qualifications when it opened in 2007-08.
Alison Wolf, Sir Roy Griffiths professor of public sector management at King’s College London, said that overall, there was “no obvious explanation” for the variations in degree classes exposed by the Hefce report.
“Employers are supposedly interested in the skills that students acquire at university,” she said. “I have always found it hard to reconcile this with their almost complete lack of interest in comparability of standards across time or across institutions.” It was “surely time” to investigate the issue further, she said.
Universities do use external examiners from other institutions to check how exams are marked. But they are generally only supposed to ensure a consistent standard for a pass mark, rather than a first or upper second.
All three of the major domestic university league tables – produced by The Sunday Times, The Guardian and the Complete University Guide – factor in degree classes awarded by an institution, which creates a “perverse incentive to inflate [grades]” and should end, said Professor Smithers.
Degree classifications are arguably “not a very objective measure of quality” because they are controlled by universities themselves, the Complete University Guide admits in its methodology, but it points out that they do have an impact on graduates’ employment prospects.
On the issue of degree comparability, Nick Hillman, director of the Higher Education Policy Institute, said that the sector should “never be afraid” of new data on the subject. “The most important goal should be to ensure UK institutions stay within the controlled reputational range for which they are famous,” he said.
Highs and lows: how the institutions’ grades stack up
Universities with more 2:1s and firsts than expected: Brunel; Coventry; Exeter; Hertfordshire; Kingston; Lancaster; Liverpool; Liverpool Hope; Liverpool John Moores; Manchester Metropolitan; Middlesex; Newcastle; Northampton; Northumbria; Oxford; Oxford Brookes; Sheffield Hallam; Sunderland; Warwick; Westminster; Wolverhampton

Universities with fewer 2:1s and firsts than expected: Bath; Creative Arts; Goldsmiths; Imperial College London; King’s College London; Leeds Metropolitan; Leeds Trinity; London Metropolitan; Loughborough; Nottingham Trent; Portsmouth; Queen Mary; Royal Holloway; St Mark and St John; Southampton Solent; Surrey; University Campus Suffolk; Winchester

Note: Universities are listed in alphabetical order because it is not statistically possible to rank them except for those at the top and bottom of the scale.