The Times Higher's analysis of the world's top universities shows that quality is not the preserve of any single country. Martin Ince explains how the positions were worked out.
The first lesson of the rankings on these pages is that although the US - the world's biggest economy - houses the top universities, no country has a monopoly on excellence in higher education. Instead, applying a single set of measures consistently across the world reveals that the top 20 universities are spread across seven countries, and the top 200 are in 29 nations.
The measures used to develop this analysis will be altered and improved in future years. They are designed to be as objective as possible and as free as possible from international and cultural bias.
The scores in the final table have been normalised against a score of 1,000 for Harvard University, the top-ranked institution by some distance.
The first element in the score for each institution is based on peer review, the most trusted method for university comparison. It was produced by QS, a London-based company best known for its worldwide activities in MBA and graduate recruitment.
QS surveyed 1,300 academics in 88 countries. Each was asked to nominate both the academic subjects and the geographical areas on which they felt able to comment, and QS sought other respondents to balance nominations in academic discipline and location. The academics were each asked to name the top institutions in the areas and subjects on which they felt able to make an informed judgement. The survey took place during August and September.
This unique and groundbreaking material is weighted at half of the total score.
A further 20 per cent of the score is accounted for by a ranking of research impact, which is calculated by measuring citations per faculty member. These data are derived from the Essential Science Indicators database produced by Thomson Scientific (formerly the Institute for Scientific Information, www.isinet.com) in Philadelphia, US, and analysed for The Times Higher by Evidence Ltd in Leeds, England, under licence from Thomson Scientific.
A comparison between the institutions that do well in citations and those that perform well in peer review shows that this criterion tends to favour institutions in the US and, to a lesser extent, those in other English-speaking countries. Researchers in countries such as France, Germany, Switzerland, Italy and Spain, and in Latin America and India, were either absent from the citations measure or scored poorly on it. Citations also serve some subjects less well than others. Researchers in social science fields such as law and education, which are rooted in national systems, tend to publish in national journals, often not in English; such publications are less likely to be covered by Thomson Scientific's database than work in the natural sciences.
In the course of this exercise, QS collected a wide range of other data on university performance. Rated at a further 20 per cent of the total is a measure of faculty-to-student ratio. While institutional practices and international variations in employment law make staff numbers less than completely comparable across the world, this indicator is a simple and robust one that captures a university's commitment to teaching.
The other two measures weighted here, each at 5 per cent of the total, are designed to encapsulate a university's international orientation. More than 2 million undergraduates now study outside their own country worldwide, and this number is growing at about 20 per cent a year. A university's ability to attract them is one measure of its ambition and is captured by a measure of its percentage of overseas students. Equally important is its ability to bring in the best academics from around the world, measured here via its percentage of international faculty. A university that relies on an influx of ambitious but underqualified immigrants to deliver its lectures could do well on this count. But it is unlikely that such an institution would do well enough on our other criteria to make it into our world top 200.
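The five measures described above can be combined arithmetically. As an illustration only, the sketch below applies the stated weights (50 per cent peer review, 20 per cent citations per faculty member, 20 per cent faculty-to-student ratio, and 5 per cent each for the two international measures) and then normalises so that the top-scoring institution receives 1,000, as the article describes for Harvard. The institution names and indicator scores are invented for the example; the article does not publish the underlying per-indicator figures or the exact scaling of each indicator.

```python
# Illustrative sketch, not the published method: combine five indicator
# scores (each assumed here to lie on a 0-100 scale) using the weights
# described in the article, then normalise so the best composite = 1,000.

WEIGHTS = {
    "peer_review": 0.50,
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,
    "international_students": 0.05,
    "international_faculty": 0.05,
}

def composite(scores):
    """Weighted sum of the five per-indicator scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

def normalise(raw):
    """Rescale so the highest composite score equals 1,000."""
    top = max(raw.values())
    return {uni: round(1000 * s / top) for uni, s in raw.items()}

# Hypothetical institutions with invented indicator scores.
universities = {
    "Alpha": {"peer_review": 100, "citations_per_faculty": 90,
              "faculty_student_ratio": 95, "international_students": 80,
              "international_faculty": 85},
    "Beta":  {"peer_review": 70, "citations_per_faculty": 95,
              "faculty_student_ratio": 60, "international_students": 90,
              "international_faculty": 75},
}

raw = {uni: composite(s) for uni, s in universities.items()}
print(normalise(raw))  # top institution scores 1,000; the rest scale below it
```

Note that because each institution's final figure depends on the leader's composite, the published numbers are relative positions, not absolute measures of quality.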
QS collected these data on the top 300 universities as identified by the peer review, after eliminating a small number of single-subject institutions. It performed the research in several ways. For Germany, the UK and the US, there are national bodies that gather education or higher education statistics. In Japan, student number data are also available from a central national source. The remaining data were gathered from university websites, through direct email and telephone contact with the institutions in question, or from internationally accepted reference sources.
A close look at the table reveals that in a very few cases it was simply impossible to collect some data despite QS's extensive research with national and institutional sources. These gaps were filled with a weighted estimate based on other aspects of the relevant institution's performance in the context of its location and its apparent profile.
In addition to the main table that precedes this article, this supplement to The Times Higher contains detailed analyses of our findings about the top institutions in Europe, North America and the rest of the world, and the institutions that do especially well in terms of peer review, citations and staffing.
In future months, The Times Higher will publish further analyses of these and other data, extending the rankings into specific discipline areas including science, technology, biomedicine, social science and the arts and humanities.
We would welcome reader reaction to this publication.
The World University Rankings were coordinated by Martin Ince (email@example.com), contributing editor of The Times Higher. He wishes to thank Nunzio Quacquarelli of QS (www.qsnetwork.com), Jonathan Adams of Evidence Ltd (www.evidence.co.uk) and their colleagues for their participation in this project.
World university rankings 2004