This book opens up for public scrutiny the weaknesses of the Research Assessment Exercise (RAE) established by the UK Government to quantify the performance of scientists, and then uses these insights to suggest positive, achievable methods for allocating research funding.
Donald Gillies, a philosopher of science who studied for his PhD in Sir Karl Popper's group at the London School of Economics, writes with huge respect for the scientific community. He brings well-manicured logic to this controversial subject, writing from a historical perspective that challenges those who value British science not to repeat past mistakes.
He defines two categories of error in the RAE. Type 1 occurs when research that would have obtained excellent results receives a low rating and the researcher is given more teaching duties; Type 2 when research that obtains no good results is rated highly. The RAE concentrates on eliminating Type 2 errors, but the history of science shows that Type 1 errors are the serious ones.
In a less formal analogy, Gillies invokes a system for the quality control of diamonds that removes blemished stones, leaving clear ones. Then someone notices that the system removes pink diamonds, which are more valuable than their clear counterparts.
What are the consequences? The RAE encourages scientists to "downsize" by choosing problems that are solvable within its time frame to achieve a high rating. It relies on two layers of peer review and is guaranteed to obscure paradigm-shifting work. It prevents the funding of poor research, but inhibits groundbreaking advances.
Giants emerge from the twilight of history to stand behind Gillies as he presents his case. Copernicus would have been refused research funding under an early 16th-century RAE and been forced to teach the Aristotelian-Ptolemaic system. Gottlob Frege, the founder of modern mathematical logic, was demolished by his peers, and his work was not fully recognised for another 40 years. In 1848, Ignaz Semmelweis suggested that surgeons should disinfect their hands, long before Joseph Lister's 1865 work - but peers judged his insight to have no value. Many people died unnecessarily in the intervening years as a result.
It might seem that Gillies is molesting a cadaver, because the RAE protocol is destined for landfill, but his criticisms apply equally to its proposed replacement, the Research Excellence Framework. Indeed, he warns that the REF will make the same errors and then some, as a system of "Byzantine complexity" emerges.
"What would you put in its place?" one may ask. It takes some lateral thinking to realise that to solve the problems of research assessment, we must attend to ... teaching.
The argument runs like this. Everyone wants to be a researcher because that's how you get promoted. However, many people would much rather become excellent teachers, if only that work were valued.
So, Gillies suggests, let scholars decide which calling they want to follow and reward excellence in both. It is a creative solution that eliminates tension and provides internal regulation within a community of devoted and well-meaning people.
He then trains his sights on the new practice of researcher banning orders and debarments. He shows how the threat of the Arts and Humanities Research Council's two-year debarment deters researchers from trying new ideas.
Gillies draws on his expansive knowledge of the history of science to marshal, with pristine logic, arguments that most scientists know in their bones but are reluctant to express in public. He has spoken wisely and with perfect timing.
How Should Research be Organised?
By Donald Gillies
Published 15 December 2008