The need for a modernised, more anonymous process of application for research funds was recently advocated on these pages. Here, Richard Brook defends the present system but calls for key refinements
WE ARE all agreed on the objective: to pick projects for support that are creative, imaginative and original. We are, for the most part, agreed on the mechanism: to ask those who are themselves expert in the field and who themselves have experience of the research endeavour to advise on the selection. Despite this agreement, worries persist. They are generally of two kinds. There is the concern, most recently advanced by your anonymous contributor John Smith (THES, November 14), that the system may be unfair. The peers may be unscrupulous or fall victim to special pleading within the smoke-filled rooms where decisions are made. A second concern is that even where all involved behave with unblemished integrity, there may nonetheless be a less than ideal selection of projects as a consequence of using a bureaucratic process to select the creative, imaginative and original.
One response to the two concerns is to concede the weakness but to claim that the process is the best of the available methods - the parallel with democracy in the political arena is often quoted.
My suspicion is that this reaction is too passive. In accepting peer review as the best of the available methods, we should in no sense abdicate responsibility for ensuring that it operates in the best possible way. Current developments within the Engineering and Physical Sciences Research Council make it important that we take this task very seriously indeed. The tendency in recent years has been to give greater weight to the so-called responsive mode, in which applicants to the council for the support of research have the freedom to select the theme and to describe the way in which they wish to advance it. The council then has the task of using peer review to decide whether such proposals can be accepted or not. Placing this weight on the responsive mode rather than on a managed mode, where researchers are asked to work within prescribed themes, places a vital responsibility upon the peer reviewer.
Recognising this central role of peer review, the EPSRC has tried to clarify the processes that it uses. It has consulted all of those who have put proposals to the council in recent years, successfully or not, and asked them to nominate three peer reviewers whom they believe to be professionally suited to the review of proposals. The EPSRC has then established colleges of reviewers: in a subject area such as physics, for example, there is a named list of some 120 experts who enjoy the confidence of the research community in acting as peer reviewers. On receipt of a proposal, the EPSRC takes the advice of at least two college members, together with the advice of a peer selected from three whose names have been put forward by the applicant. Proposals that receive favourable written reviews are then considered at a panel meeting, where a small number of members drawn from the college construct a rank order. The available funding then determines how far down the rank order projects can be supported.
After three years of operation in this mode, a number of small refinements have been made. It is now policy to supply all of the referees' written comments back to the proposer; this gives better consistency between the decision that has been made and the peer advice that has been provided. In a second change, introduced on a trial basis, the reviewers' written comments are provided to the applicant for comment before the panel meeting, and that comment is then available to inform the ranking decision.
There is, however, a growing indication that a larger refinement is possible. If we compare, for example, the average age of principal investigators (47) with the average age of college members (48), the drift of the argument can be seen. Looking at age distribution, at gender balance, at professional affiliation and at the extent of multi-disciplinarity, the suggestion is that when we ask the research community for the names of suitable peer reviewers, they may choose people who are remarkably like themselves. When we then ask peer reviewers to select project leaders, they in turn choose people remarkably like themselves. The risk is that we have a closed loop that will work to preserve the research community in the state in which it first finds itself.
How can a research council intervene constructively in this closed loop? If, as I am sure we all agree, it is important to select proposals through peer review rather than through research council diktat, then it must be the college members who identify the responsive mode proposals for support. The point at which it is fair to engage in the cycle is therefore in the design of the peer review colleges themselves.
There are two general rules for procedure. We must first ensure that we operate a peer review system that is as open as possible and that enjoys the full confidence of the research community. We can, however, adjust the make-up of the peer review colleges so that their membership takes on the characteristics of the research community that we would wish to see at work in the coming years.
The EPSRC tries to meet the first objective by using listed college reviewers nominated by the academic community, by stating the names of ranking panel members following meetings, by providing full feedback to the applicants and by allowing the applicant to make comments on peer review reports that are to be considered by the ranking panels.
More tentatively, we are considering methods for modifying the constituency of the colleges. A first step in this direction has been to add the names of our 125 advanced fellows to the college groups. These are younger workers who have been awarded fellowships on the basis of their excellent research promise. A further weighting of the colleges toward the younger members of the research community is also believed to be justified.
In addition to these two steps, the council believes that it has a clear responsibility to advise reviewers and peer colleagues fully of the processes that are in operation and of the context within which their advice is being sought. The adage for university examinations that "the quality of answer depends upon the quality of question" can be extended to the process of peer review. A peer reviewer will be that much more confident in their ability to select work of originality, imagination and creativity if they are fully aware of the basis on which their advice is being provided.
So John Smith should not be so depressed. Peer review has great merits. It remains a clear responsibility for research councils to ensure that these merits are fully recognised and developed so that a portfolio of work can be set in place that brings credit and advantage - in Pascal's phrase - to the individual researcher, to the nation and indeed to us all.
Richard Brook is chief executive of the Engineering and Physical Sciences Research Council.