Has the proportion of good degrees awarded by universities changed? Nicola Bright, Andrew Hindmarsh and Bernard Kingston examine the claim that institutions are guilty of grade inflation.
Almost all broadsheet newspapers publish university league tables using perceived measures of quality. Most, including The Times, which published its first ranking almost ten years ago, include degree classes as one of their measures.
Using good honours degrees as a measure of quality has met with criticism, not least because, it is argued, universities themselves are judge and jury in deciding their pass lists.
Although it is moderated by the external examiner system, degree class probably remains the softest measure used in league tables. But it is the primary measure of individual success at undergraduate level and deserves a place in any ranking of quality.
In compiling the Times league tables, which are expanded in The THES, we have assembled five years' worth of good-quality data (1995-99) on UK universities and are able with some confidence to look at longitudinal trends.
Changes in the degree classification measure over that five-year period shed light on the debate about "more means worse", dumbing-down in education, and grade inflation in school and university qualifications.
According to some commentators, league tables create perverse incentives, with universities under pressure to give more firsts.
Some may consciously monitor, through their formal structures, their position relative to comparator universities. Others, at departmental level, may be more subconsciously mindful of the proportion of good degrees they award.
So how has the proportion of good degrees awarded by universities changed? In this analysis we have included all 97 universities that appear in The Times Good University Guide 2002.
Much of the source material comes from the universities themselves via their annual returns to the Higher Education Statistics Agency.
We use firsts plus upper seconds as a proportion of all honours degrees awarded and, since 1997-98, have included unclassified "enhanced" first degrees.
Because the 1994-95 data for the Scottish universities failed to take full account of the ordinary degree, that year's results for Scotland have not been used. Thereafter, the Scottish ordinary degree numbers were excluded from the data alongside medical and related unclassified degrees.
Data were unavailable for Lincolnshire and Humberside, Luton and the School of Oriental and African Studies for 1994-95 and for Soas for 1995-96. The data are unadjusted for subject mix, although we are well aware of a significant spread within the 40-70 per cent range.
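The measure described above can be sketched in code. The following is a minimal illustration only, with invented award counts; the real figures come from the universities' Hesa returns, and exclusions (Scottish ordinary degrees, unclassified medical and related degrees) are assumed to have been applied before counting:

```python
def good_degree_percentage(firsts, upper_seconds, other_honours,
                           enhanced_unclassified=0):
    """Firsts plus upper seconds (and, from 1997-98, unclassified
    'enhanced' first degrees) as a percentage of all honours degrees
    awarded. Ordinary and medical unclassified degrees are assumed
    to have been excluded from the counts beforehand."""
    good = firsts + upper_seconds + enhanced_unclassified
    total = firsts + upper_seconds + other_honours + enhanced_unclassified
    return 100.0 * good / total

# Illustrative (invented) figures for one institution:
pct = good_degree_percentage(firsts=120, upper_seconds=430,
                             other_honours=450)
print(round(pct, 1))  # 55.0
```

The result falls within the 40-70 per cent range noted above; in practice the figure varies considerably with subject mix.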
Those universities awarding the highest and lowest proportion of good honours were looked at in more detail.
Finally, changes within individual universities over time have been examined. The values of the summary location statistics (maximum, median, minimum and mean) show an increasing trend over the years (see graph). Variation in awards has been fairly consistent over the period.
Cambridge has awarded the highest proportion of good degrees and this percentage has gradually increased year on year from 84.4 to 88.9 per cent.
But it has been pointed out to us that Cambridge reports its degree classifications to Hesa in a slightly different way from other universities by providing the better of the Part I or Part II Tripos and not the final degree result.
This will inevitably inflate its proportion of good first degrees. Bristol, Nottingham and Oxford universities have been consistently in the top ten, with Edinburgh and St Andrews universities joining them from 1995-96 onwards. All the institutions identified in this category are old universities.
Greenwich, South Bank, London Guildhall and Thames Valley are the most consistently low-scoring universities. With the exception of Lampeter in 1997-98, all the institutions are new universities.
Universities displaying most and least variation in their percentage of good degrees come from across the sector.
Oxford, which is in the group with the widest variation, indicating significant change over the five years, started from a high base and has risen even higher.
There has been a slight increase overall in the proportion of good honours graduates leaving UK universities over the years from 1994-95 to 1998-99 - but not to the extent that many observers might have anticipated.
Within that modest overall upward drift, however, there have been some major movements for a group of individual universities, many of 10 percentage points or more.
Thus, overall, there is no dramatic evidence of widespread change in standards as expressed by degree results. If anything, a greater increase in good degrees might have been expected, given that A-level performance has risen year on year.
But there has been considerable variation at institutional level, although the reasons for this have to be left to speculation. This individual variation looks, in some cases, to be much greater than the variation in entry standards over the same period.
Can these changes be put down to student effort, teaching standards, the examination system or all three? Probably all three, but there are fewer checks and balances in place to ensure comparability of standards from year to year or from institution to institution than there are, for example, in the A-level system.
Debates about standards are not likely to go away. Degree classification itself has been criticised, and many, including the Quality Assurance Agency, have called for its replacement with some form of personal profile.
Nicola Bright runs a consultancy, Bright Statistics. Andrew Hindmarsh and Bernard Kingston are partners in Mayfield University Consultants. Comments are welcome at: firstname.lastname@example.org