The results of the sixth and final research assessment exercise (RAE) have finally been published. For the sector, this is a hugely important time: the RAE is not just a tool that determines funding allocations but the most prominent branding exercise there is for research prestige.
After years of strategic decision-making, the world sees in glorious detail whether the judgment calls made by institutions - weighing the quality of research against the number of researchers entered, to strike the right balance between prestige and funding - have paid off.
And the results give cause for celebration. More researchers were entered than ever before and nearly 90 per cent of the research submitted has been judged to be of international quality, falling into the top three grades of "world-leading", "internationally excellent" or "internationally recognised". Within that, over 50 per cent is in the top two grades. Make no mistake, UK universities are dominant on the international stage.
The fact that 150 of the 159 institutions taking part in the RAE 2008 have at least 5 per cent of the research activity in one of their submissions classed as "world-leading" is remarkable and demonstrates the power of the new quality-profile approach, used for the first time in this RAE. It has certainly done what it was claimed it would: uncover excellence wherever it is found.
For institutions, the new profiles tell them more than ever before. This is no longer a single rating that leaves an institution with no idea where its departments stand among those given the same grade. Rather, it exposes a wealth of detail and gives universities a more in-depth and rounded view of themselves and each other.
It will pose challenges for some but it will also be a great leveller. Research-intensive institutions might not like it, but the fact is that not all of their work is internationally recognised. Similarly, the post-1992 universities will no longer be quite as easy to write off: "world-leading" research is taking place in them, too, something that may have been apparent in 2001 but is now more visible than ever. How this research will be funded remains to be seen and the Higher Education Funding Council for England has some tough decisions ahead.
We have also had some tough decisions to make in presenting the results. The approach we have chosen is one that spotlights excellence wherever it is, rather than trying to second-guess the final RAE funding formula. Above all else, we have gone for transparency and fairness in our tables. Institutions are ranked for the excellence they have achieved, regardless of size, and contextual information is included so it is easy to see who has the research power.
Finally, it cannot pass without comment that the results of the RAE 2008 have been marred by infighting over the value of data on the proportion of eligible staff submitted. The large research-intensive universities may think it an irrelevant measure (as it will not feature in the funding formula), while the smaller ones have been desperate to have it recognised (they submit a higher proportion than their larger counterparts), but the truth is that it needs to be measured for another reason: individuals within institutions have a right to know to what extent they were "selected out".
We provide the best estimate we can and accept that it is far from satisfactory, but no more accurate method is likely to emerge. If selectivity of staff continues under the system that replaces the RAE, the sector must decide how much this measure is worth.
Another story will emerge in March when the results translate into funding but, for now, we should raise a glass to brand research UK.