16th September 2010
Jonathan Adams wants academic feedback on the rankings revolution
Thomson Reuters' data and analyses for Times Higher Education's World University Rankings were drawn from our global project to profile activity and support management in more than 1,000 research-based institutions. Our five-year plan involves teams in the US, Asia-Pacific and Europe gathering premium information.
In addition to our own expertise, experience and resources, we also drew on external and academic consultants for technical, logistical and statistical support. Vitally, everything we have done is validated by the global academy — through advisory groups and surveys — to ensure that, even if we don't get it right first time, we can work towards an outcome that makes sense for people who know the system and need these data.
That said, 25 years of practical experience in higher education has given me plenty of evidence that universities are too complex to be readily summarised in a few numbers. But not everyone who cares about the academy has time for detailed analysis. Many need a "best possible" summary to focus on the things that matter to them, create a shortlist of questions and institutions, and target their efforts.
There is no perfect solution, nor could we directly measure some important characteristics — such as the relative standing of degree classes. Some data variables were easier to get than others, but we didn't confuse easy data with the right data.
We talked to vice-chancellors, senior academics, junior researchers and leading administrators to get a rounded view of what people in the system think makes a good university. We have a better, more diverse and detailed "basket" of indicators than has ever been used before.
Our reputational surveys are the most comprehensive and structured seen to date. Reputation is a fickle quality, but increasing the subject and regional spread — and making sure we know who we are surveying — has improved the relevance of this component.
Financial data are also included. We wanted information on total income and how that broke down into teaching and research. Because the value of money varies globally, we adjusted the data using purchasing-power parity indices. This is going to be controversial. Are the adjustments right for the sector? Did we get comparable estimates of income? Is more money good or is less money an index of effectiveness?
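The purchasing-power-parity adjustment mentioned above can be sketched in a few lines. This is a minimal illustration of the general technique, not the rankings' actual method: the institution names, income figures and conversion factors below are invented for the example.

```python
# Illustrative sketch of a purchasing-power-parity (PPP) adjustment.
# All names and figures are hypothetical, not the data or conversion
# factors used in the actual rankings.

# Reported institutional income in local currency units (invented).
income_local = {
    "University A": 500_000_000,  # e.g. US dollars
    "University B": 400_000_000,  # e.g. pounds sterling
}

# Hypothetical PPP conversion factors: local currency units per
# "international dollar". A factor of 1.0 means parity with the US dollar.
ppp_factor = {
    "University A": 1.0,
    "University B": 0.7,
}

def ppp_adjust(local_amount, factor):
    """Convert a local-currency amount into PPP-adjusted international dollars."""
    return local_amount / factor

# Adjusted incomes become directly comparable across countries.
income_ppp = {
    name: ppp_adjust(amount, ppp_factor[name])
    for name, amount in income_local.items()
}

for name, value in income_ppp.items():
    print(f"{name}: {value:,.0f} international dollars")
```

Even this toy version makes the article's questions concrete: the ranking outcome depends directly on which conversion factors are chosen and whether income was reported on a comparable basis in the first place.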
If the restructuring of the data and analyses is right, then the new rankings are likely to do two things. They should look broadly sensible at the national and international level, and treat different institutions equitably. But they should also produce a few surprises that change the perceptions drawn from weaker data and methodology.
For example, we were told that English-language institutions were over-represented in the old rankings. The 2010-11 table has fewer UK but more US universities at the top. Has one correction created another problem, or does this point to the much weaker funding base of UK universities?
Meanwhile, among specialist institutions, the technology-based ones seem to do well, but schools of social science and economics with acknowledged excellence have struggled to move up. Is this because they have lower funding per capita? Should we modify the indicators for those subjects? How can we get the data balance and weighting right?
The rank order of institutions within countries accords with the advice we received from local experts. Some regional coverage, however, still seems unbalanced. We need better information about some Asian countries; Eastern Europe looks under-represented; France is in transition and will look very different by 2014. Do we need regional adjustments within the US to offset what looks like a "California effect"?
The data definitions require more work. We translated our requests into seven languages, but we must add others to be more effective. We also need a long-term data series to see where universities came from and where they are now. And what is a student? Many found it difficult to tie down the numbers or equate the tally of registered students to a full-time norm.
Into 2011, we will review and refine what we started. We want to hear from the global academy so that the changes started here can be pushed through to create even more comprehensive guides to university activity. Does this work for you?
Jonathan Adams is director of research evaluation, Thomson Reuters.