All universities are equal, but some are more equal than others, as George Orwell might well have said. When The Times first imported the idea of league tables designed to inform the public on the relative status of Britain's universities, this was the attitude of the university establishment, crystallising into a reaction which ranged from an eagle-eyed readiness to seize on any error in order to undermine the enterprise to a decision by some institutions to boycott the exercise by declining to verify data sent to them for checking.
Guides for university applicants were probably unnecessary, although they did exist, when the United Kingdom had only 30 or so universities, mostly uniform and broadly comparable. An informal hierarchy existed, reinforced by word of mouth among the relatively small pool of would-be undergraduates. The increase in the number of institutions - almost 100 are now covered by The Times's guide - and the arrival of the former polytechnics with unfamiliar, often confusing, names meant that something more systematic and objective was required.
Other countries with mass, diverse systems of higher education had been there before: US News & World Report in the United States and Maclean's in Canada had turned guides for university applicants into big circulation winners, inspiring The Times to target the new market of young people and their parents who were considering for the first time the possibility of breaking into higher education.
But The Times angered the establishment by attempting to rank universities in a single hierarchy, using whatever statistical information it could to shed light on their excellence - or otherwise. A number of wholly or predominantly postgraduate institutions, such as Cranfield and the London Business School, were excluded, as was the Open University.
The data were patchy. The exercise involved amassing material from a variety of published sources, with the inevitable danger that not all of it was strictly comparable. The idea was that if a sufficient number and range of indicators were used, any irregularities would be insignificant, and that data would be weighted for their relative value and reliability. Every care was taken to ensure that universities saw the data on which they were to be assessed and had the opportunity to correct any errors.
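The weighting-and-aggregation approach described above can be sketched in outline. What follows is an illustrative example only, not The Times's actual method: the indicator names, weights and figures are invented, and each indicator is standardised as a z-score so that measures on different scales can be combined into one composite.

```python
# Illustrative sketch of combining several performance indicators into a
# single weighted score. Indicator names and weights here are hypothetical,
# not those used by The Times.
from statistics import mean, stdev

def composite_scores(universities, weights):
    """universities: {name: {indicator: value}}; weights: {indicator: weight}.
    Returns {name: composite score}."""
    names = list(universities)
    scores = {n: 0.0 for n in names}
    for indicator, weight in weights.items():
        values = [universities[n][indicator] for n in names]
        mu, sigma = mean(values), stdev(values)
        # Standardise so indicators on different scales are comparable,
        # then apply the chosen weight.
        for n in names:
            scores[n] += weight * (universities[n][indicator] - mu) / sigma
    return scores

# Hypothetical institutions and indicators.
data = {
    "A": {"tqa": 60, "rae": 5.0},
    "B": {"tqa": 55, "rae": 4.0},
    "C": {"tqa": 50, "rae": 3.0},
}
ranked = composite_scores(data, {"tqa": 0.5, "rae": 0.5})
```

The choice of weights is exactly the subjective step the critics quoted below object to: a different weighting produces a different ranking.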
The broad outcome was predictable. Some universities - Cambridge, Oxford, Imperial, University College - would top the table, while others, mainly the smaller, more diverse former polytechnics and colleges of higher education, only recently designated universities, would appear to do "worse".
Those at the top could afford to be relaxed, criticising the methodology while basking in the endorsement which would do them no harm when funding followed students. Universities with little to shout about had more legitimate concerns about the impact on their very viability of an exercise which judged entire institutions rather than assessing the strengths of individual subjects or departments.
The concept of a hierarchy of institutions worked against the university system's growing diversity. It helped to reinforce an established pecking order rather than recognise the specialised qualities of many of the newer and smaller institutions. The Times recognised this, adjusting weightings to counteract historic advantages and experimenting with the concept of added value.
The establishment has come to accept the inevitability of exercises such as The Times's tables. But that acceptance is grudging. The Committee of Vice-Chancellors and Principals, which resisted the idea at the outset, insists it has not changed its view on league tables.
"We think it is useful to have as much information available to help people choose courses and find out about different aspects of universities. But we are against a super league table. Once you start putting together different factors and making subjective decisions about which factors to include and what weighting you give to them, you end up with a measure which is inevitably skewed, which is subjective and only makes sense to one kind of student."
The Times accepted many of the limitations but persevered, issuing a health warning about its own league: "The table . . . is intended as a signpost for students considering a university application. It gives a broad indication of a university's standing, but can only be the starting point for more detailed inquiries." The table forms part of a broader publishing exercise by The Times including a detailed guide to universities published in book form and a week-long sequence of articles on the issues of the day.
With the arrival of the Higher Education Statistics Agency, the task of collecting the data has been simplified. Much of the information used is collected by HESA from institutions on a uniform basis across the sector. The growing body of information on teaching quality drawn from funding councils' assessments is a recognised and accepted indicator of one aspect of performance, as is the recent research assessment exercise. Indicators which were either unreliable or of questionable relevance have been dropped.
* As in past years, the league tables published by The THES today complement The Times's superleague table, which also appears today. Our league tables comprise the raw, unweighted data used to compile that table, and are intended to enhance understanding of The Times's exercise. In addition, The THES today lists the highest-ranking departments for teaching in 47 subject areas, along with each department's research rating. See page iv.
These tables show the raw data from published sources used to produce The Times League Table of UK universities. The information was gathered from a number of sources including the Higher Education Statistics Agency, the Universities and Colleges Admissions Service, the university funding councils and the Standing Conference of National and University Libraries. Every effort has been made to ensure its accuracy, but no responsibility can be taken for errors or omissions. Equally ranked institutions are listed alphabetically except where there are decimal point differences between them. Some institutions do not appear in every table and in a few cases the data were not available in a compatible form. All universities were provided with summaries of their HESA data in advance of publication and where anomalous figures were identified in the 1994/95 data the relevant universities were contacted to provide a check. The data providers do not necessarily concur with data aggregations or manipulations and are not responsible for any inferences or conclusions thereby derived. Minor variations within the tables are not significant. Direct comparison with earlier years is not possible.
Entry standards
These are for students starting degree courses in 1997 and are based on the average A-level points score required across the university.
Accommodation
Percentage of full-time and sandwich first-degree students in attendance at the university who were in accommodation provided by the institution in 1994/95.
Student/staff ratio
Average student/staff ratio across the university based on student full-time equivalent numbers known to be on non-franchised courses and total teaching and teaching/research staff in 1994/95.
Library spending per FTE
Spending on books, periodicals and staff (but excluding buildings) per full-time equivalent student in 1994/95. Data were generally provided by SCONUL.
Mean TQA points
Mean teaching quality assessment score across departments based on the Teaching Quality Assessments.
Research assessment score
Average research assessment score per member of staff, based on the 1996 Research Assessment Exercise. All staff, not just those included in the RAE return, were counted, and scores were calculated on a seven-point scale.
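The per-staff calculation described above can be sketched as follows. This is a minimal illustration, not the published methodology: the mapping of the 1996 RAE grades (1, 2, 3b, 3a, 4, 5, 5*) onto seven points is an assumption, as are the staff numbers.

```python
# Illustrative sketch of a staff-weighted RAE average. The grade-to-points
# mapping is an assumption: the seven 1996 RAE grades spread over 1-7.
RAE_POINTS = {"1": 1, "2": 2, "3b": 3, "3a": 4, "4": 5, "5": 6, "5*": 7}

def rae_score_per_staff(units, total_staff):
    """units: list of (grade, staff_returned) per unit of assessment.
    total_staff: all academic staff, including those not returned to the RAE,
    which dilutes the score of institutions that returned only some staff."""
    weighted = sum(RAE_POINTS[grade] * staff for grade, staff in units)
    return weighted / total_staff

# Hypothetical university: two units returned, 50 staff not returned at all.
score = rae_score_per_staff([("5", 120), ("3a", 80)], total_staff=250)
# score is 4.16: (6*120 + 4*80) / 250
```

Counting all staff in the denominator, rather than only those returned, is what distinguishes this measure from the RAE grades themselves.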
Firsts and upper seconds
Percentage of all first-degree qualifiers gaining firsts and upper seconds in 1994/95. Upper seconds are included for the first time this year. Students following four-year honours degree programmes at Scottish universities are entitled to receive an ordinary degree after three years of study, so many first degrees awarded in Scotland are unclassified. Where ordinary degrees have been identified, these have been removed from the grand total before calculating the proportion of qualifiers with firsts and upper seconds.
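The adjustment for Scottish ordinary degrees amounts to removing them from the denominator before taking the percentage. A minimal sketch, with invented counts:

```python
# Illustrative sketch of the firsts-and-upper-seconds calculation described
# above. All figures are hypothetical.
def good_degree_rate(firsts, upper_seconds, all_qualifiers, ordinary=0):
    """Percentage of qualifiers gaining firsts or upper seconds, with
    identified ordinary degrees (common at Scottish universities) removed
    from the grand total first."""
    classified_total = all_qualifiers - ordinary
    return 100.0 * (firsts + upper_seconds) / classified_total

# 1,100 qualifiers of whom 100 took ordinary degrees:
rate = good_degree_rate(150, 450, 1100, ordinary=100)
# rate is 60.0: (150 + 450) / 1000 * 100
```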
Graduate destinations
Proportion of UK-domiciled graduates taking up employment or further study/training in 1994/95. HESA was unable to release levels of unknowns by institution in 1994/95.
HESA issued the following statement on the quality of the data it supplied: "The data sets from which the above data were drawn represent the first full year of data collection for HESA (1994/95). Having analysed the data in detail, it is clear that a small number of anomalies exist. A structured programme of data quality improvement measures is under way in collaboration with the higher education institutions involved, and therefore we expect data relating to subsequent years to make considerable improvements in quality and accuracy."