With its improved accuracy and the inclusion of even more information, the second Times Higher World Rankings is the best guide to the world's top universities, says Martin Ince
Today The Times Higher publishes the World University Rankings for the second year running. The aim is the same as it was in 2004: to offer a consistent and systematic look at the world's top universities in the context of the globalisation of higher education. But we think that this version is more robust and reliable than the first.
We have gathered new data on employers' opinions of universities around the world. This has allowed us to widen the pool of information we present, but we have gone further and deepened the pool as well. This year's tables are virtually free of gaps in data. And because we have collected a wealth of data on institutions outside the top 200, we are confident that no institution that should be in these tables has been overlooked. These efforts have resulted in what we believe is the world's best guide to the standing of top universities.
The core of our analysis is peer review, which has long been accepted in academic life and across social research as the most reliable means of gauging institutional quality. The sample used to compile the peer-review column of this table comprises 2,375 research-active academics. They were chosen by QS Quacquarelli Symonds, consultants to The Times Higher and experts in international rankings of MBA courses. The selection was weighted so that just under a third of the academics came from each of the world's major economic regions - Asia, Europe and North America - with a smaller number from Africa and Latin America. It also had to yield roughly equal numbers from the main spheres of academic life: science, technology, biomedicine, social sciences and the arts. The selected academics were asked to name the top universities in the subject areas and the geographical regions in which they have expertise.
Data collected in 2005 were supplemented by opinions from our 2004 survey, in which the same question was asked; no individual's opinion was counted twice. We believe that this two-year rolling average provides improved statistical reliability.
The information derived from the responses was used to generate the faculty-level data on the top institutions for specific subject areas published in The Times Higher this month and was aggregated to produce the peer-review column of the main table in this supplement. We are confident that the sample is large enough and sufficiently well chosen for its aggregate opinion to be statistically valid.
The point has been made that peer reviewers might be more likely to cite large old universities, especially those with the name of a major city in their titles, than smaller, less familiar ones. But the peers are all experts in their fields; and in their responses they rated as excellent more than 500 universities, some of which were unknown even to staff of The Times Higher.
The peer-review data account for 40 per cent of the available score in the World University Rankings. This is 10 percentage points lower than in 2004 because of the addition of data on the opinion of major international employers of graduates. As with the other columns we show, and in an improvement on the 2004 presentation, we have normalised these data so that the top institution scores 100.
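The normalisation described above can be sketched in a few lines. This is an illustrative assumption of simple linear scaling, with hypothetical raw figures; the published methodology may differ in detail.

```python
# Sketch of the normalisation described above: each column of raw scores
# is rescaled so the top-scoring institution shows 100.
# (Assumption: simple linear scaling; institution names and raw counts
# below are hypothetical.)

def normalise(scores):
    """Scale a column of raw scores so the highest value becomes 100."""
    top = max(scores.values())
    return {name: round(100 * value / top, 1) for name, value in scores.items()}

raw = {"University A": 840, "University B": 630, "University C": 420}
print(normalise(raw))  # University A scores 100, B 75, C 50
```

The same scaling is applied column by column, which is what allows columns measured in very different units (survey mentions, citation counts, ratios) to be combined into one overall score.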
Two other columns of data in this table account for 20 per cent each of the final score for each university listed. One is the number of citations of academic papers per staff member. This has been compiled from staff numbers collected by QS and citations data supplied by Evidence Ltd on the basis of data from Thomson Scientific. The citations data, which come from Thomson's Essential Science Indicators, cover the period between 1995 and 2005. A lower cut-off of 5,000 papers has been applied to eliminate small specialist institutions. This measure provides a clear gauge of universities' research prowess, but it has some systematic biases: it disadvantages some institutions, especially those in Asia, that publish few papers in the high-impact journals surveyed.
Teaching is, of course, central to the university mission. To gauge it, we consider a classic measure of commitment to teaching, the staff-to-student ratio, which is worth up to 20 percentage points. Like citations per staff member, this measure depends on accurate staff numbers. We believe we have improved the accuracy of the figures we collect. Nevertheless, any inconsistency is to some extent self-correcting because exaggerating staff numbers would increase a university's staff-to-student ratio but reduce its citations per staff member.
The principal motivation for the World University Rankings is our realisation that although scholarship has always been international, the world of higher education is becoming one of the most global sectors of the world economy. The final two columns of data we show, each accounting for 5 per cent of the total, attempt to quantify universities' international orientation. The first reflects their percentage of international staff and the second their percentage of international students.
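Taken together, the weights set out in this article (peer review 40 per cent, employer review 10, citations per staff 20, staff-to-student ratio 20, international staff 5, international students 5) define a simple weighted sum. The sketch below assumes each column has already been normalised so the top institution scores 100; the column names and example figures are hypothetical.

```python
# A weighted composite using the percentages stated in the article.
# (Assumption: each column is pre-normalised to a 0-100 scale; the
# column names and the example institution's figures are hypothetical.)

WEIGHTS = {
    "peer_review": 0.40,
    "employer_review": 0.10,
    "citations_per_staff": 0.20,
    "staff_student_ratio": 0.20,
    "intl_staff": 0.05,
    "intl_students": 0.05,
}

def composite(columns):
    """Weighted sum of normalised column scores (each 0-100)."""
    return sum(WEIGHTS[name] * score for name, score in columns.items())

example = {
    "peer_review": 100.0,
    "employer_review": 80.0,
    "citations_per_staff": 60.0,
    "staff_student_ratio": 70.0,
    "intl_staff": 90.0,
    "intl_students": 84.0,
}
print(round(composite(example), 1))
```

Note how heavily the 40 per cent peer-review weight dominates: in the example, it contributes nearly half of the final score on its own.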
Our aim in these tables is to rank large general universities. We have not counted institutions that do not teach undergraduates. This removes from the listing a number of high-prestige institutions, especially in medicine and business. We have, however, included universities that teach a broad but not a full complement of subjects. These range from the London School of Economics to a large number of technology universities.
A frequent query about the 2004 rankings concerned the level of detail they provided. In general, we have tried to tease apart large federal universities such as California or London that consist of many essentially free-standing colleges. But we have not been able to disaggregate the many US state universities that boast more than one campus. Doing so would have complicated the task too much.
We have managed to remove some ambiguities that were present last year by distinguishing between the Flemish-speaking and Francophone institutions of Belgium and by providing clearer labelling of the many universities of Paris and other French cities.
As research on composite tables such as these has shown, it is important to read them with care. Although the overall score gives the fullest picture, a specific column may be of more interest to a student or researcher contemplating his or her next move. It would be wrong to attribute too much weight to the small differences in overall scores between universities lower down the rankings.
We welcome your responses to the World University Rankings and to the faculty-level analyses that The Times Higher has already published. In particular, we are interested in suggestions of other measures of university quality that could be gathered consistently from institutions around the world.
This year's World University Rankings feature an extra column of data designed to add another vital dimension by revealing which universities are taken most seriously by the world's top employers of internationally mobile graduates.
The sample of employers was generated by QS from its own extensive knowledge of graduate recruiters and from universities, which provided names of companies that are frequent recruiters of their graduates. All the companies involved recruit either around the world or on a national scale in large countries. They were asked to identify up to 20 universities whose graduates they most prefer to employ.
The respondents were guaranteed anonymity. They include banks and financial organisations, airlines, manufacturers in areas such as pharmaceuticals and the automotive industry, consumer goods companies, and firms involved in international communications and distribution. There were 333 respondents.
The World University Rankings were coordinated by Martin Ince (email@example.com), contributing editor of The Times Higher.