Higher education is soon to get its first official league tables. Alison Goddard looks at what performance indicators are under consideration.
The first official league tables for universities are being developed and are due to be published later this year. The data will show how well each university performs against a range of criteria.
"Universities and colleges have been very resistant to performance indicators but the world is changing fast around them," said Bahram Bekhradnia, director of policy at the Higher Education Funding Council for England. "There has been a very strong government push for performance indicators and, although the higher education sector is apprehensive, I hope that they will regard this as better than any alternative that would otherwise be forced upon them. This is far more intellectually rigorous than what has gone before and if we don't make progress on this then others will."
The performance indicators have been drawn up by a steering group of representatives from the Department for Education and Employment, the Treasury, the Higher Education Statistics Agency, the Committee of Vice-Chancellors and Principals, the Standing Conference of Principals, HEFCE and the Higher Education Funding Council for Wales.
In the next six to eight weeks, HEFCE and HEFCW plan to send each institution details of how it is performing in each of six areas:
* broadening participation of under-represented groups
* student progression
* learning outcomes
* learning and teaching efficiency
* student employment
* research output (see box below).

They will also produce statistics for the whole sector in each of these categories, plus sector-wide information on the links between universities and industry. Institutions will have several weeks to comment on the performance indicators before the first data are published later this year, depending on the comments received. Under present plans the performance indicators will apply only to higher education institutions; information will not be collected on higher education students in further education colleges. The data will cover undergraduates and sub-degree students but not postgraduates.
"The results will not be for government alone," said a spokeswoman for the DFEE. "The main purpose in developing performance indicators is to assist institutions in monitoring, managing and improving their own performance."
"In principle, the CVCP is in favour of accurate information being publicly available," said David Young, who represents the CVCP on the steering group. "The government makes available large amounts of money and it wants to know that the money is being spent well. On the face of it, we have got what looks like sensible performance indicators."
The performance indicators steering group was set up following Lord Dearing's review of higher education. He recommended that, "to assist governing bodies in carrying out their systematic reviews", the "funding bodies and representative bodies develop appropriate performance indicators and benchmarks for families of institutions with similar characteristics and aspirations". However, the steering group rejected the formation of families of universities as too restrictive. "That is just too crude," said Mr Bekhradnia, who is chairing the group. "You could have any number of families: effectively one for each variable."
Instead, each institution will be able to compare its standing against context statistics. These figures will take account of the intake of students to the institution, their educational backgrounds and the subject mix of that institution. The results for any institution can then be compared with the average for similar institutions.
"For example, you might think that you are doing terribly well because few students are dropping out," said Mr Bekhradnia. "But in actual fact, you might not be doing very well at all because you take students with a good A-level points score and those students are less likely to drop out. The context statistics or 'adjusted sector outcome' will be able to tell you." Likewise, the social class of an institution's students should be similar to that of the population from which they are drawn.
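The kind of comparison Mr Bekhradnia describes can be sketched in a few lines of code. This is a hypothetical illustration, not the funding councils' actual method: the A-level bands, rates and weighting scheme are all invented. It computes an "adjusted sector outcome" for dropout by weighting sector-average dropout rates by an institution's own intake mix, then compares the institution's observed rate against that benchmark.

```python
# Hypothetical sketch of a benchmark-adjusted ("adjusted sector outcome")
# comparison. All figures and the banding scheme are invented for
# illustration; the real indicators' methodology is not reproduced here.

# Sector-wide dropout rates by A-level points band (invented figures).
sector_dropout = {"high": 0.05, "medium": 0.12, "low": 0.22}

def adjusted_sector_outcome(intake_shares):
    """Expected dropout rate given an institution's intake mix."""
    return sum(share * sector_dropout[band]
               for band, share in intake_shares.items())

# An institution recruiting mostly students with good A-level points.
intake = {"high": 0.9, "medium": 0.1}
benchmark = adjusted_sector_outcome(intake)  # expected rate for this mix
observed = 0.06                              # the institution's actual rate

# A low raw dropout rate can still be worse than the benchmark:
# here 6 per cent observed exceeds the 5.7 per cent expected.
verdict = "above" if observed > benchmark else "below"
print(f"benchmark {benchmark:.3f}, observed {observed:.3f}, "
      f"{verdict} expectation")
```

The point of the sketch is Mr Bekhradnia's: an institution with a seemingly low dropout rate can still sit above its context-adjusted benchmark once its favourable intake is taken into account.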
At present the performance indicators are intended not for prospective students but for government departments, funding councils and other interested parties. The steering group decided to wait for reports from the DFEE's student information needs group and the Institute for Employment Studies' investigation of the advice and information that students need. Both are due to be published shortly. Nevertheless, the first report of the steering group states that "in practice, the needs of prospective students, and of the general public, have formed a continuing background to the group's work".
"It would be nice if they consulted students as well," said a spokesman from the National Union of Students. "Performance indicators are useless unless you plan to do something positive with the information," he added. "There has to be enough information to give a proper rounded picture and there has to be something positive rather than the negative stigmatisation of 'failing' universities."
Difficulties remain with the statistics. For example, the government has been particularly interested in what students do after leaving higher education, but this is difficult to record. "There is a lot of disappointment that we can't do more with the employment statistics but the data are just not there," said Mr Bekhradnia.
There are other problems with the measurement. For example, all the graduates from one institution could be in work, but in non-graduate jobs, while 80 per cent of graduates from another institution could have found graduate jobs. How to judge the relative merits of these two institutions has yet to be decided.
Likewise, measuring the efficiency of learning and teaching has its problems. "There are some people who say: 'It is easy to judge efficiency - just look at the staff:student ratios'," Mr Bekhradnia said. "But a university with a lot of research money will spend that money on staff and the large staff:student ratio will make it appear inefficient."
Despite the difficulties, HEFCE intends to publish the performance of all universities in these categories later this year. "Performance indicators are bound to be controversial," said Mr Bekhradnia. "What you measure is value-laden and a particular issue is whether you are comparing like with like. There is considerable diversity in the higher education sector, from the Rose Bruford College to universities like Oxford and Cambridge or Imperial College, London."