Change in rankings as expert group sets principles

June 9, 2006

The Times Higher has joined its data on trends with those used by The Times for its league tables. David Jobbins reports on the evolution of rankings

University rankings - now a familiar feature of the UK higher education landscape - are still highly controversial in many other parts of the world.

How accurate a picture do they give would-be students and others of the health of a country's higher education system? This week, for the first time, The Times Higher combines its trends data, an unparalleled timeline of statistics for the university sector, with the data used by The Times to produce its national league tables, which are published this week.

The Times data are aimed at would-be students. The Times Higher breakdown of the ranking aims to explain that ranking to those working in higher education. But some in the wider community still ask whether rankings should be "allowed", arguing that students need no such guidance in selecting a place to study. Others question the wisdom of a numerical hierarchy, suggesting that such an approach leaves no room for the nuanced presentation of essential information.

A three-year review of league tables' fitness for purpose culminated last month in Berlin with the publication of a report from an expert group convened by the Unesco-European Centre for Higher Education and the Institute for Higher Education Policy in Washington DC.

The "Berlin Principles on Ranking of Higher Education Institutions" are the outcome of a three-year process that has included most of the leading ranking organisations - including The Times Higher, US News & World Report and Shanghai Jiao Tong University - academic leaders and others with expertise in the field of ranking. The 16 Berlin Principles focus on good practice that will be useful for the improvement and evaluation of ranking systems over time. They emphasise the purposes and goals of rankings, the design and weighting of indicators, the collection and processing of data and the presentation of ranking results.

Among its recommendations, the group said rankings should:

  • Use transparent methodology
  • Measure outcomes, not inputs
  • Use audited and verifiable data whenever possible
  • Take into account the different missions and goals of institutions.

The expert group estimates that there are some 20 ranking exercises around the world, most of which look at national systems but increasingly adopt an international focus. Currently, only The Times Higher and Shanghai Jiao Tong publish international rankings. But the group predicts that more will appear and says: "It is important that those producing rankings and league tables hold themselves accountable for quality in their data collection, methodology and dissemination."

What could have been a restrictive set of guidelines was headed off by the ranking organisations, which argued that unrealistic principles would simply be ignored. Unlike US News & World Report, which uses self-reporting for much of its data, The Times has drawn on universities' statistical returns to bodies such as the Higher Education Statistics Agency. But, in one key respect, this year's Times league tables have changed significantly, meaning year-on-year comparisons should be approached with caution. Teaching quality has been dropped as an indicator because of the age of the assessments. Instead, the National Student Survey is used as a measure of the learning experience.

Three indicators were commissioned independently by The Times Higher. Those on employment show significant change, with seven institutions (compared with one last year) reporting all staff on teaching and research contracts. Seven institutions also exceed last year's top score for the proportion of permanent staff.

Interest in the trends data centres on the Government's measure for participation and the failure to close in on its 50 per cent target.

In April, the Department for Education and Skills released the latest figures for the higher education initial participation rate (HEIPR), its measure for 17 to 30-year-old English-domiciled first-time entrants to higher education courses at UK higher education institutions and English further education colleges. The headline rate for 2004-05 was 42 per cent, no change from the previous year. But 2003-04 saw a fall from 43 per cent in 2002-03. For men, the small and continuing decline from 38 per cent in 2003-04 to 37 per cent was no surprise - and officials pointed out that, before rounding, the fall amounted to only 0.2 percentage points.

The interesting, and worrying, trend is that the advance made by women, which for several years has largely fuelled the rise in overall participation, has apparently slipped off the edge of the plateau. The female HEIPR, which rose to 47 per cent in 2002-03, peaked at 47.4 per cent in 2003-04 before falling to 46.8 per cent in 2004-05.

A rate of more than 40 per cent, when coupled with the UK's internationally high completion rates, is still to be envied. But the figures show that the target set in the Public Service Agreement in the early years of the Labour Government to increase participation "towards" 50 per cent of the 18 to 30-year-old age group remains frustratingly elusive.

david.jobbins@thes.co.uk
