Key evidence that has helped persuade policy-makers the world over to fund basic biomedical research cannot be relied on, a new study has found.
Analysis of work by US researchers Julius Comroe and Robert Dripps, which is often cited by funders, has shown their methodology to be ambiguous.
The original 1970s study purported to demonstrate that 62 per cent of key research articles that were judged to be essential for later clinical advances were the result of basic, as opposed to applied, research.
This was widely interpreted as justification for funding laboratory-bench projects, since many would lead to benefits at the hospital bedside.
But work by Jonathan Grant, associate programme director at Rand Europe, and colleagues at Rand and Brunel University's health economics research group has shown fatal ambiguities in the study.
Among the problems was a lack of clarity over whose opinions had been surveyed, how clinical advances were assessed and how a "key article" was defined.
"It is an insufficient evidence base for increased expenditure on basic biomedical research," Dr Grant said.
The Comroe and Dripps study was published in the journal Science in 1978, but it has become influential among research policy-makers in recent years.
The proportion of UK research-council spending on basic research has increased over the past decade, from 42 per cent of civilian research-and-development spending in 1991-92 to 61 per cent in 1998-99.
Similar patterns are found in the US and other G7 nations.
The British team did not question the essential thrust of Comroe and Dripps' argument. Their own revised approach suggested that between 2 and 21 per cent of the research that underpinned clinical advances could be defined as basic, albeit after an average time lag of 17 years from publication.
But Dr Grant said new evidence was needed to get to the root of the problem.
A pilot study to explore how this could be done used bibliometric analysis of 29 papers on diabetes, published in 1981 by George Alberti, professor of medicine at Newcastle University; their subsequent citations in 799 other papers; and, in turn, those papers' citations in a further 12,891.
Experts then assessed the importance of the papers and those that cited them according to set criteria.
Finally, the papers' significance was evaluated through detailed interviews with four leading scientists.
Each technique produced different results, but Dr Grant said that together they showed how basic research could lead to clinical advances, though often the contributions were small and incremental.
The new report has been studied by the Office of Science and Technology and will be discussed at a meeting with the Medical Research Council in December.