I’m in Hong Kong this week for the world’s largest international higher education conference – the British Council’s Going Global event – and rankings are high on the agenda. Here is an edited version of my speech for the plenary session, “International World Rankings: where do you stand?”
OK, I admit it. University rankings are really rather crude. They simply cannot capture many of the things that matter most in higher education: how a great lecturer can transform the lives of her students, for example, or how free enquiry enhances our society.
They can never be objective, because their indicators and methodologies are based on the subjective judgement of the compilers.
And let’s be brutally honest: at their worst, university rankings can impose uniformity on a sector that thrives on diversity. They can pervert university missions and distract policymakers. When they are done badly, they can be manipulated for unfair gain. They can mislead the public.
Rankings can be guilty of all these charges. And yet I am happy to stand here today and declare: “I am a ranker and I am proud!”
Why? Because I believe passionately that as long as rankers are responsible and transparent; as long as they invest properly in serious research and sound data; as long as they are frank about the limitations of the proxies they employ; and as long as they help to educate their users and engage in grown-up debates like this one, rankings can be a positive force in higher education.
They can play a crucial role in helping us to understand the dramatic changes the sector is facing, and there is no doubt that dramatic change is upon us. That is why we are all here in Hong Kong for this conference.
Indeed, the last time I was lucky enough to be here, just under a year ago, the British Council’s director in Hong Kong, Peter Upton, summed up the situation.
Speaking at the University of Hong Kong, he said: “Higher education has been called the last unregulated global business – valued at more than US$14 billion a year – and we are living through one of those tipping points where in five years, colleagues will comment that this was the period when the landscape changed forever, when the speed of reputational growth and decline suddenly accelerated.”
We all know the figures: 3.3 million students studying outside their home countries; 162 satellite campuses. Almost half of all UK research papers are now written with co-authors from overseas. We live in a world of global education hubs, of joint degrees, of faculty- and student-mobility schemes, of franchised programmes, global research networks and bi-national universities.
We are entering a world of mass higher education and the traditional world order is shifting.
But there is an information gap. Of course, important research is being done on the globalisation of higher education, much of it discussed here at Going Global. But there is a need for clear – and yes, easily accessible – comparative information.
National governments need information when they are seeking to invest billions in universities to drive the knowledge economy.
Industry needs help in choosing where to invest research and development money and where to find top talent.
University leaders need to map the shifting global sands and improve strategy and performance.
Newly emerging universities, often based in developing countries, need help in clearly demonstrating their excellence to the world against better-known and more established brands.
University faculty, seeking to foster new research partnerships and consider their career options, need help in identifying new opportunities.
Students looking to make the right choice of degree course, wherever in the world it might be delivered, need help, too. This is crucial as the world gets smaller, global demand for higher education gets bigger, and choices become more bewildering.
Rankings have a positive role to play, as long as those who compile them are responsible and transparent.
And make no mistake. Rankings are here to stay.
Ellen Hazelkorn, director of research and enterprise at the Dublin Institute of Technology, has catalogued the extraordinary growth and influence of rankings in her new book, Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence. She demonstrates clearly how much they are changing university and even government behaviour.
So surely the best way forward is for the rankers to work closely with the university community and engage openly with their critics, to ensure they offer tables that are meaningful and carry all the necessary health warnings?
And who better to do that than Times Higher Education magazine? THE has been serving the higher education world for four decades – it is our 40th anniversary this year. We live or die by our reputation among university staff and policymakers as a trusted source of news, analysis and data, week in, week out.
Our rankings are part of that. They need to stand up to the close scrutiny of our highly intelligent and demanding readership.
So in 2009, we did something quite extraordinary. Some say it was crazy. Others think it was responsible and transparent.
We abandoned an established world university ranking methodology that had become influential during the six years we had published it, from 2004 to 2009, and started again from scratch. Why? Because we wanted to do a better job.
So what did we do?
First of all, we brought in one of the world’s most trusted and respected information specialists, Thomson Reuters, to collect and analyse all the data to be used for a brand-new rankings system and to help us develop an entirely new methodology.
We convened a special meeting of our editorial board of world-leading higher education specialists to discuss the concerns about rankings and we reported the outcome in our magazine.
We hired a polling company to help us carry out a worldwide survey of university staff, asking them what they wanted and needed from rankers. We published the results – we were responsible and transparent.
We used the results of the survey to help develop our plans, and we submitted detailed proposals to a specially convened group of more than 50 leading figures from 15 countries.
We listened to the feedback we received and modified our initial plans to take as much account as possible of their expert engagement.
We ran a weekly column in the magazine for several months explaining step by step how our thinking was developing; we wrote in other newspapers around the world; we opened up uncensored discussion threads on our website; we attended dozens of conferences to engage in open public debate about rankings.
We were responsible and transparent.
So where did all this take us? I firmly believe that what we came up with is the most comprehensive and thorough global ranking system available.
The new Times Higher Education World University Rankings, published in September 2010, used 13 indicators to cover the university’s three core missions: research, knowledge transfer and teaching.
We made major improvements to our reputation survey by using the invited views of more than 13,000 targeted, identifiable and experienced academics, questioned on their narrow fields of expertise.
We employed a bibliometric indicator that drew on more than 25 million citations from five million journal articles over five years. And we fully normalised the citations data to take account of major variations in citations behaviour between subjects.
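Field normalisation of this kind can be sketched in a few lines. This is purely an illustration, not THE's or Thomson Reuters' actual method or data: the idea is that each paper's citation count is divided by a world-average baseline for papers of the same subject and year, so that high-citing fields such as biomedicine do not swamp low-citing ones such as mathematics. All subjects, years and numbers below are invented.

```python
# Illustrative sketch of field-normalised citation impact (invented data).
# Each paper's citations are divided by the world-average citations for
# papers in the same subject and year, then the ratios are averaged.
world_baselines = {  # hypothetical world-average citations per (subject, year)
    ("biology", 2008): 12.0,
    ("maths", 2008): 3.0,
}

papers = [  # hypothetical papers from one institution
    {"subject": "biology", "year": 2008, "citations": 18},
    {"subject": "maths", "year": 2008, "citations": 6},
]

def normalised_impact(papers, baselines):
    """Mean of (citations / field-and-year baseline) over all papers."""
    ratios = [p["citations"] / baselines[(p["subject"], p["year"])]
              for p in papers]
    return sum(ratios) / len(ratios)

print(normalised_impact(papers, world_baselines))  # 1.75
```

Without the baseline division, the biology paper (18 citations) would dominate the maths paper (6); after normalisation the maths paper actually scores higher (2.0 versus 1.5), reflecting its performance relative to its own field.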
We decided to officially rank only the world’s top 200 universities – less than 1 per cent of the world’s institutions – to help ensure we looked at universities with a shared global outlook.
We made the first serious attempt to capture the teaching and learning environment through five separate indicators – an essential element of any university, but one missed by the other world-ranking systems.
And when we unveiled the results last autumn, we published the tables with reams of methodological information, making it clear to users where compromises had been made, where proxies were employed and what the tables – any tables – simply cannot capture.
We published them warts and all – including some statistical outliers, plus a detailed discussion of some of the continuing issues with the methodology.
We were responsible and transparent.
We published the rankings on an interactive website displaying live Twitter feeds, allowing a real-time – and uncensored – public discussion of our results.
And we admit here today that, despite the huge efforts we have gone to, our new and dramatically improved rankings could be better still. We are now working again in open consultation with the global community and our expert advisers to discuss what we can improve this year.
We are being responsible and transparent.
And just yesterday, THE published more of the data behind the tables – a new world reputation ranking that reveals the results of our reputation survey in isolation from the overall results.
One of the things I am most proud of is that we have handed much of the data over to the user. We have created a rankings application for the iPhone and iPad, which I believe represents a major step forward in the field.
Of course, we choose our indicators and weightings very carefully and only after lengthy consultation. But with the app, the weightings can be changed by the user to suit their individual needs. If you don’t agree with our weightings, you can set your own. That is responsible and transparent.
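The mechanics of letting a user re-weight the indicators are simple enough to sketch. The following is a hypothetical illustration, not the app's actual code: each institution has a 0–100 score per indicator, and the overall score is a weighted average using whatever weightings the user chooses, provided they sum to one. All indicator names and scores are invented.

```python
# Illustrative sketch of user-adjustable weightings (invented data).
# The overall score is a weighted average of per-indicator scores;
# the user can substitute their own weightings for the defaults.
def overall_score(indicator_scores, weightings):
    """Weighted average of 0-100 indicator scores; weightings must sum to 1."""
    assert abs(sum(weightings.values()) - 1.0) < 1e-9
    return sum(indicator_scores[name] * w for name, w in weightings.items())

scores = {"teaching": 80.0, "research": 90.0, "citations": 70.0}

# A default weighting versus a user who cares most about teaching.
default = {"teaching": 0.25, "research": 0.25, "citations": 0.5}
teaching_first = {"teaching": 0.5, "research": 0.25, "citations": 0.25}

print(overall_score(scores, default))         # 77.5
print(overall_score(scores, teaching_first))  # 80.0
```

The same institution's position can rise or fall depending on the weightings chosen, which is exactly the point: the published table reflects our judgement, while the app lets each user encode their own.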
We publish the Times Higher Education World University Rankings to provide meaningful and useful information, in the right context, for everyone in the global academy. I am very proud of what we have achieved, but we will keep working and, most importantly, keep talking.
This is the responsible and transparent thing to do.