Radical changes to this year's league tables give some surprising results, says John O'Leary
Ten years after the publication of the first British university league tables, the rankings still have their critics, but their annual appearance has become an important date in the higher education calendar.
Like them or loathe them, few universities can afford to ignore them.
Surveys have shown that prospective students and their mentors - whether at home or at school - take increasing notice of the results. Indeed, universities have been known to skew their activities in ways the compilers never intended in order to move up the rankings.
The tables in this supplement provide the raw data behind the overall university ranking that appeared in The Times yesterday, as well as the full results from 62 subject tables published this week. In addition, The THES has commissioned three tables covering research income, the proportion of permanent academic staff and the percentage whose main function is both teaching and research.
In The Times table, Oxford University was top, while last week the Financial Times had Cambridge University in the lead - an illustration of the influence wielded by those who set the criteria, since both newspapers use several of the same datasets. The figures on pages iv and v allow readers to look beyond the aggregation to see how universities match up on individual indicators.
Two changes have been made this year, on teaching assessments and graduate destinations - both vital measures for prospective students. The early teaching grades at English universities (when departments were rated excellent, satisfactory or unsatisfactory) have been dropped from the calculations as too dated to reflect current practice. The change hits universities such as Sheffield, which had nine top grades in those initial assessments, but enables York University to overtake Cambridge as top for teaching quality. Several new universities benefit - Kingston University leaps 25 places to 12th on this indicator.
The change on destinations is more radical. For the first time distinctions are made between types of employment. Graduate destinations have always been among the most unsatisfactory statistics associated with higher education. The figures are compiled just six months after students graduate, and few published surveys have made use of the wealth of detail that is collected. These tables follow a classification of jobs devised by Abigail McKnight of the London School of Economics during her time at Warwick University's Institute for Employment Research. The system defines "positive destinations" as postgraduate study or training, traditional graduate employment or another set of occupations, labelled "graduate track", that have become a normal route into more senior posts. These include technicians and skilled clerical jobs. Columns on page v illustrate the difference made by this change: while only 5 per cent of Cambridge's graduates went into "non-graduate" jobs, at others the proportion was almost 30 per cent. Aberdeen University, which would have topped the table if all occupations were treated equally, slips to 15th.
Cambridge, which heads the revised destinations ranking, also has the highest entry standards and the best research grades, as well as 22 top places in the 62 subject tables. University College London is the only other institution to top more than one of the rankings, boasting the best staffing levels and the highest research income over a three-year period.
Unlike The Times, this pullout covers university colleges, which perform well on a number of indicators. The Bolton Institute of Higher Education, for example, is in the top five for facilities spending and for the proportion of permanent staff on full-time contracts.
The tables are compiled by Bright Statistics for Mayfield University Consultants (firstname.lastname@example.org).