The European Commission seems to be labouring under a pretty basic misunderstanding about world university rankings.
In its 2010 “Communication on Youth on the Move”, the commission recognised the importance of global comparisons of universities and the role that performance indicators can play.
It said: “Maintaining a high level of quality is crucial for the attractiveness of higher education. Moreover, in a more global and mobile world, transparency regarding performance of higher education institutions can stimulate competition and cooperation.”
But it said that the existing ranking systems were not delivering everything that was needed. They “can give an incomplete picture of the performance of universities, over-emphasising research, while excluding other key factors that make universities successful, such as teaching quality, innovation, regional involvement and internationalisation”.
This missive was drafted before the publication of the new Times Higher Education World University Rankings 2010-11 – indeed, it was formally adopted the day before the rankings were published to massive global attention in September 2010. But eight months later, it would appear that misconceptions remain.
Speaking at an EC-supported seminar on the Youth on the Move initiative, organised by the European Journalism Centre and held in Florence on 8 May, Androulla Vassiliou, the European Commissioner for Education, Culture, Multilingualism and Youth, said European universities needed to “open up to the globalised world”.
She argued that they must operate in a new market “so that the young people and their parents know exactly what is on offer by the various universities and have better guidance”.
But she repeated the inaccurate information in the 2010 EC communication: “For the time being the international rankings of universities are based exclusively on research.”
I was delighted to be able to set the record straight at the Florence seminar, and demonstrate that the Times Higher Education World University Rankings, which were substantially revised during early 2010, do indeed deliver clear performance indicators on teaching (or at least the teaching environment), innovation and internationalisation.
It is true that our tables are dominated by research, and I make no apology for that. We look at a university’s ability to place research papers in top journals, its record in publishing highly cited research, its reputation for research excellence among academics around the world and its ability to attract research funding. These things are essential to any nation serious about pushing forward the boundaries of our understanding and being part of the knowledge economy.
But we do look at much more than research. On teaching, we have five separate indicators, collectively worth about a third of the overall ranking score. Times Higher Education’s rankings are the only world rankings that take a serious look at teaching.
On innovation, we look at the income a university makes from its work with industry, a clear signal of its strength in knowledge transfer. On internationalisation, we look at the proportion of international staff and students attracted to an institution. For 2011, we may add an additional indicator looking at the proportion of research papers co-authored with international partners.
No ranking will ever be able to capture everything that a university does. Some of the most important things simply cannot be measured. But Times Higher Education has worked very hard to produce the most comprehensive and wide-ranging rankings possible.