Andrew Hindmarsh and Bernard Kingston cast a critical eye over the league tables published today
When it comes to league tables, people rarely sit on the fence. In university press releases they are lauded as revealing true quality; in common rooms they are often derided; and in the statistics community they have a professional eye cast over them. The compilers of The Times league table, published in that newspaper today, have consulted extensively with the university community and, this year, have responded specifically to some of the things the statisticians have been saying.
The result is a significant methodological advance. A standard statistical technique - standardisation, which converts raw scores into z-scores - was applied to each of the nine quality measures in The Times league table; the raw data for these measures also appear on pages T2-T3 of The THES. Previously, the score for each university was expressed as a proportion of the highest value. Now the score measures how far the university's raw score deviates from the mean raw score of all universities, expressed in standard deviations.
The reason for transforming the scores in this way is that it gives more equal weight to all nine measures in the table. Previously, the effective weight varied considerably according to the distribution of the data and so some measures contributed more to the total score than others. The usual weightings of 2.5 for teaching assessment and 1.5 for research assessment are still applied in The Times league table to reflect the importance of those measures.
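The transformation and weighting described above can be sketched in a few lines of Python. The measure names and figures below are invented for illustration only; they are not taken from the published table.

```python
from statistics import mean, stdev

def standardise(raw_scores):
    """Convert raw scores to z-scores: each university's deviation from the
    mean of all universities, expressed in standard deviations."""
    m = mean(raw_scores)
    s = stdev(raw_scores)
    return [(x - m) / s for x in raw_scores]

# Illustrative data: three universities across three of the nine measures
# (values are hypothetical, chosen only to show the mechanics).
measures = {
    "teaching": [22.0, 20.0, 18.0],
    "research": [5.5, 4.0, 3.5],
    "library_spend": [600.0, 450.0, 500.0],
}

# Weights from the article: 2.5 for teaching assessment, 1.5 for research
# assessment, and 1.0 for the remaining measures.
weights = {"teaching": 2.5, "research": 1.5, "library_spend": 1.0}

# Total score per university: weighted sum of its z-scores.
totals = [0.0] * 3
for name, raw in measures.items():
    for i, z in enumerate(standardise(raw)):
        totals[i] += weights[name] * z
```

Because every measure is rescaled to mean zero and unit standard deviation before weighting, no single measure dominates the total merely because its raw values happen to be spread more widely than the others.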
The impact of this development was surprisingly small. The largest change in ranking due solely to the new methodology was ten places, with just 23 universities (out of 97) moving by more than three places. Only one university in the top and bottom ten moved by more than one place. This suggests that the basic approach to creating a league table - adding up scores for a series of quality measures to create a total score - has considerable robustness. Even fairly significant changes to the methodology do not create large changes in the overall ranking. This may be because, as some have argued, all the measures are proxies of one form or another for income, and a table of income per student would give similar results. There may be an element of truth in that suggestion, in which case league tables have the potential to support an argument that additional resources really do help to increase quality.
These latest refinements to the methodology of The Times league table will not be the last. The developments taking place at the Quality Assurance Agency will need to be looked at to see how teaching quality can be reflected in the table. And the new Universities and Colleges Admissions Service tariff, assuming that the Higher Education Statistics Agency collects the data appropriately, will open the possibility of widening the measure of entry qualifications beyond A levels in a few years' time.
The Times table has been running long enough to start looking at trends in the data over time. One area of interest, in the light of discussions over quality and standards, is the question of inflation in degree classifications. Are universities awarding more good degrees than they used to?
An analysis of The Times data indicates that, for the 97 universities in the table, the mean percentage of good degrees awarded (ie firsts and upper seconds) increased from 55.1 per cent in 1995-96 to 56.0 per cent in 1998-99, a rise of just under 1 percentage point. The highest and lowest universities changed rather more, the highest going up from 84.4 per cent to 88.9 per cent and the lowest from .8 per cent to 33.7 per cent. Cambridge awarded the highest proportion of good degrees throughout this period, with Bristol, Nottingham, Oxford and St Andrews consistently in the top ten. Thus there is evidence of an increasing proportion of good degrees, but the change is not very great. There is certainly no evidence of a widespread change in standards as expressed by degree results.
An innovation for The THES this year is the inclusion of the university colleges in the tables of raw data. The most significant conclusion from this is that the colleges are by no means clustered at the bottom of the tables. Like the universities, they have their individual strengths and weaknesses, but there is no evidence that in general they perform worse than the universities on the quality measures used here.
League tables are here to stay. We have tried to take account of the views of both the higher education sector and the professional statisticians, but there is more work to be done.
Andrew Hindmarsh and Bernard Kingston are partners in Mayfield University Consultants. Nicola Bright of Bright Statistics provided valuable statistical support. We welcome your comments to firstname.lastname@example.org