Hotline for not-so-blind data

September 25, 1998

HESA is in the statistics business but that does not mean league tables, Brian Ramsden argues

Long ago, one evening when I was the only person left in the office, I had the dubious pleasure of answering one of the first data enquiries ever received by the Higher Education Statistics Agency.

"This is HESA: can I help you?" "Yes, good: I'm from Coopers Waterhouse Deloitte. I need some information."

"That's what we're here for," I replied, with my fingers crossed, thinking to myself that we had only recently collected our very first tranche of data from higher education institutions, and had yet to publish anything from it.

"Well," he went on, "I need to know what are the top ten universities in the UK." My heart sank. Here was an opportunity to alienate 90 per cent of the universities and all of the colleges at a stroke.

"Right," I said. "When you say 'university' are you specifically referring only to institutions that have university status, or do you include other higher education institutions, such as the colleges and music conservatoires, many of which have an international reputation?" "Oh yes, whatever," he replied impatiently. "I just need to know quickly, because I've got to get a report drafted for a client tonight."

"Well," I said. "When you say 'top', have you any definition in mind?" By now he was, understandably, becoming impatient: I had kept him on the line for more than 50 seconds without giving him an answer. "You know," he said "the ten top universities - TOP!" I thought for a moment: "Well, the Open University is very large, and Manchester Met is pretty big: they come top of our list of institutions by size. Or, in terms of antiquity, Aberdeen had two universities 500 years ago. Would those be interesting?" "No, no, NO!. You know what I mean: TOP! Like Oxford and Cambridge ..."

Sadly, this was one commercial client HESA was unable to trap: a shame, since he was probably charging as much for an hour of his time as we would normally charge for a couple of days. Even more sadly, although that phone call was about three years ago, expectations have not really changed significantly.

This is not to say that Oxford and Cambridge are not splendid institutions - they are - and there are many other UK higher education institutions that perform superbly in the context in which they are set, whether that is a specialist area or a particular teaching mode or a national or regional or local context.

In fact, if the question had been the reverse - "which are the bottom ten institutions?" - I would have been hard pressed to identify any, even if I had seriously consulted my own prejudices.

In the higher education sector, we seem to be unable to come to terms with our own general level of excellence, and so, throughout HESA's short existence, it has continually been expected to look at ways in which its data can differentiate in terms of institutional performance.

An outsider might find this odd: in many walks of life the measure of a successful institution can be found in its annual accounts.

The analogy is apt - higher education institutions draw substantial and increasing proportions of their funds from outside the public sector, and so financial measures of performance may grow in significance for universities and colleges, and also for their staff and students.

Maybe HESA continues to disappoint those who look for clear performance indicators and league tables; but I would argue that those who are looking for specific and well-defined information for a specific and reasonable purpose can, on the whole, find it - and as the quality of the data institutions provide improves year by year, there will be scope to extend this information. (HESA processes 500 million data items each year and publishes 28 million. Within this, we must have what you want.) If, for example, you are an institution that wants to compare itself with its competitors, look at the institutional management statistics HESA publishes on behalf of the HEMS group - a group set up by the sector for the sector.

They are not the same as previous performance indicators published for the old universities on behalf of the Universities Funding Council and the Committee of Vice-Chancellors and Principals. But the sector decided, in my view quite rightly, that some indicators which were appropriate for a largely homogeneous sector were inappropriate for the much more diverse sector post-1992 and the world of lifelong learning.

Or, if those indicators are not appropriate for your institution, then take the data in our CD-Roms, which contain an enormous amount of information about all institutions, and use the data to construct your own comparative tables.

And watch our website as it generates more comparative information. Or, if the data is not there, ring us and ask us to extract it for you.

Or if you are an intending applicant to higher education, look at some of the commercial products, many of which in one way or another draw on and interpret HESA data.

These attract much interest and occasionally odium: but there can be no doubt that they are now founded on more robust data than ever before. We see our role in these as providing the data and, within limits, advising on its interpretation; but not involving ourselves in their construction of indicators and league tables.

There is a good reason for this. During the last year, we have been involved in a detailed research and analysis project, looking at the factors which influence student performance.

The findings will be reported before long, but it is fair to say that all of the factors that you might guess at are indeed relevant to student performance; and the inter-relationship between them is incredibly complex.

Do not, please, ask yourself or ask HESA whether it is better to study history or journalism. Or whether you are more likely to succeed if you are male or female. And, especially, do not ask whether it is better to study at the University of Poppleton or the University of Uttoxeter.

Simply ask us for the data, and draw your own conclusions; but do so carefully and responsibly, and recognise that it is a very complex picture.

This is not a cop-out. We take seriously our role in helping the world to draw its conclusions: we are working with colleagues in institutions, in government and in the funding councils in developing information streams, including performance indicators, to meet all the needs of all the constituencies.

Brian Ramsden is chief executive of the Higher Education Statistics Agency.
