A bold success, a brave successor: David Eastwood sees quality assessment going from strength to strength. On any impartial assessment, the research assessment exercise has been a success. Although it was born in 1986 in the white heat of cuts, it matured in a more benign climate for higher education funding.
Its impact has been profoundly positive. It initially secured research funding by justifying it to a then-sceptical Government. Since then, it has been the bedrock of the case for investment in high-quality research and a driver of our globally competitive research base. We would not be where we are today, with some £4 billion annually invested in the research base, without the evidence provided by RAEs. Nor would research-intensive institutions have the benefit of quality research income (QR) to invest against their priorities.
The RAE has also been the key instrument for performance management in institutions, and much of the obloquy that has been heaped on it has arisen from university managements doing what they should do but sheltering behind the pretext of the RAE. To this extent, the RAE has done more than drive research quality; it has been crucial to modernisation.
So in justifying investment, driving quality and underpinning a research base whose quality and impact are second only to those of the US, the RAE has done more than could ever have been expected. The RAE of 2008 will be perhaps the most robust and nuanced so far.
Nevertheless, there is a consensus that the RAE, in its present form, has served its purpose and by 2008 will have run its course. The Higher Education Funding Council for England had come to this conclusion back in 2005 and decided to develop proposals for a new approach to research assessment before 2008. The public debate on the future of research assessment, initiated by the Government in 2006, elicited a consensus around the broad architecture of a new system. Hefce has been working with other stakeholders on the design of what we will come to know as the research excellence framework (REF).
Reflecting the priorities identified in the 2006 consultation, we have sought to elaborate an approach that will embody a robust measure of quality, based on bibliometric indicators combined with research income and postgraduate student data in the science-based disciplines. The new system will offer a lighter touch, reduced burden and robust disciplinary and institutional benchmarking while retaining the focus on identifying and funding quality.
As there is now broad consensus that the RAE in the form we have known it has reached the limits of its effectiveness, the debate is rightly moving to how the system should evolve after 2008. In the new system, we are as committed to lightening the burden as we are to rigour in assessing quality, so identifying excellence will remain both a mantra and a reality. We believe that bibliometrics, used with sensitivity and sophistication, can offer international benchmarking of quality, which is one thing that the RAE has not been able to do. We also believe that the new system will capture quality in different types of research.
Nevertheless, the new model will need further refinement. There are important decisions to be made regarding the way in which bibliometrics will identify quality in particular disciplines. The REF must capture research quality without distorting research priorities or creating perverse incentives. It must capture interdisciplinary and multidisciplinary work, appropriately reflect applied and practice-based research, and robustly protect equal opportunities.
The quality framework on which we are consulting achieves much of this. When embedded in a model that weights research and business income, research volume and quality, it should produce funding outcomes that are fit for purpose and incentives that drive quality, impact and relevance.
There are, of course, questions to be resolved: frontiers between disciplines must be reflected; there is work to be done on the way in which non-science-based disciplines will be evaluated; and we will need to elaborate the role of expert panels and institutions' discretion in shaping returns.
After February, we will evaluate consultation outcomes and revise our model before piloting the new system later in the year. On the basis of consultation and evaluation, we will be in a position to make final decisions. Working with the sector, the other funding councils, the Government and those with a key interest in the funding and use of research, we are confident we can develop a research assessment and funding system that will be appropriate to the next decades.
As the new system takes root, no doubt some will blame the REF. But working together, I believe we can build a system in which action replays will generally show the REF got it right.
David Eastwood is chief executive of the Higher Education Funding Council for England. Responses to the consultation document Research Excellence Framework: Consultation on the Assessment and Funding of Higher Education Research Post-2008 are due by February 14, 2008.