Love them or loathe them, league tables ranking universities against each other have grown inexorably in recent years.
Now, a vice-chancellor has set out plans for a new way of assessing institutions based on comprehensive "quality profiles" rather than linear rankings.
Chris Brink, vice-chancellor of Newcastle University, said assessments should ask if the university is "good at what it does", rather than if it is "better than the others".
His recommendations, made in a recent speech in Australia, follow moves by the Government to develop its own performance measures to gauge the "value added" by universities to the country.
The intention is to supersede the lists compiled by newspapers with a tool that allows more detailed comparison of institutions' strengths and weaknesses and better reflects the sector's diversity.
Last autumn, the Higher Education Funding Council for England suggested that web-based "spidergrams" could be used to illustrate university performance across a range of areas. Times Higher Education understands the proposal was accepted by the former Department for Innovation, Universities and Skills, and appeared in the draft Higher Education Framework drawn up by John Denham as Universities Secretary.
The framework has been delayed in the wake of the department's demise, and is now under review by the First Secretary, Lord Mandelson.
Professor Brink, whose plans are separate from the Government's, said that "quality is a more subtle and multi-dimensional concept than can be captured in a linear ranking".
Institutions should rank themselves on teaching, research and civic engagement, he suggested.
"If we could find a way of quality profiling that allows for all three core functions as well as sector diversity, we would be doing ourselves and the general public a favour."
Professor Brink set out his idea in a keynote address to the Australian Universities Quality Forum in Alice Springs last month.
He is not the only vice-chancellor to have expressed concern over the rise of league tables, although privately many admit to "using or abusing" the rankings according to how their own institution has fared.
Professor Brink said a solution was needed to the problems universities face in offering prospective students and others a degree of comparability across the sector.
"Imagine, for example, that each university can respond to the question 'Do you offer good-quality education?' by exhibiting a profile of its educational programmes, its curricula, its teaching methodologies and technologies, its student cohorts, their entry- and exit-level performance, the contact hours offered, the teachers who will teach and the assessment methods used," he said.
"Quality profiling of this kind would give us a fresh way of dealing with the issue of comparability - particularly if profiles could be compiled on the basis of some sector-wide guidelines and categories."
Such a system would allow a physics degree from one university to be compared with a physics degree from another "not, in the first place, as similar degrees, but as degrees on a similar topic at dissimilar universities," he explained.
"The profile of each university would give a good indication of the kind of physics degree you may legitimately expect at each."
Although it would still be possible to ask whether one university was better than the other, profiles would allow the answer to differ depending on the needs of the person asking the question, Professor Brink said.
This would address, to some extent, concerns raised in the recent inquiry into students and universities by the Innovation, Universities, Science and Skills Committee, which was critical of vice-chancellors' inability to answer its questions on comparability.
An analysis of newspaper rankings commissioned by Hefce last year raised a number of concerns, but acknowledged that institutions were strongly influenced by league tables.
The "spidergram" approach, being considered by the Government, is based on performance indicators in research, knowledge transfer, teaching, workforce skills and widening participation.
Graduate employment could be used, Hefce suggested, as part of a "basket of measures that could collectively be used as a basis for incentive funding mechanisms".
But spidergrams also have their critics. Rosemary Deem, dean of history and social sciences at Royal Holloway, University of London, said: "The essentially one-dimensional nature of the indicators does not always do justice either to the range of institutional missions ... or to the resource and other performance constraints of different universities."
She added: "Spider diagrams present a snapshot, so unless frequently updated they will only provide an accurate picture for a short time in relation to most indicators."
For non-expert audiences, such as parents and students, spidergrams would require detailed explanatory notes, she said. Even where these were available, the diagrams would be vulnerable to "over-simplistic interpretation by those who either don't know or don't care" about the complexities underpinning the data.
The 1994 Group of research-led universities welcomed Professor Brink's proposals, saying that the more accurate information there is available to students, the better.
But a spokesman added that although quality profiles are worth examining in more detail, "whichever type of ranking or classification system you use ... the devil will be in the detail. None is yet perfect by any means."