Research intelligence - Rip it up and start again

Journal articles are an outdated way of sharing scientific research, says open-access advocate Cameron Neylon. Paul Jump reports

December 16, 2010

"Once you start looking at how the scholarly communication system works with any degree of outside perspective, it looks utterly insane."

This is the view of biophysicist Cameron Neylon, a senior scientist at the Science and Technology Facilities Council's Rutherford Appleton Laboratory at Harwell, Oxfordshire, and author of the blog Science in the Open.

He said the current system of communicating the results of scientific research via journal articles is a 17th-century solution to a 17th-century problem. "Printing was adopted because researchers got tired of sending letters to each other," he told Times Higher Education.

"Publishing was essentially letter aggregation. When there became too many letters, peer review was introduced. You can argue that the biggest innovation since then has been the removal of 'Dear Sir' from the beginning of articles."

Dr Neylon believes that if scholarly communication were redesigned from scratch for the digital age, it would look radically different. Most significantly, the monopoly of the journal article would be smashed. He conceded that articles would still have their place, but added that they fail to maximise funders' return on their investment because they almost never contain enough information to allow other researchers to replicate experiments.

Those who wish to do so are obliged to request the relevant information from the authors, but "compliance is around 20 per cent".

Worse still, a lot of potentially valuable research outputs that do not "fit into a story" currently either get crowbarred into a paper or, more likely, never see the light of day.

Dr Neylon thinks it would be far better for all the artefacts of the research process, such as videos, samples, data and images, to be made freely available in an open-access format - hosted either by journal websites or alternatives such as university repositories, individual researchers' websites or large commercial providers such as Amazon.

Nor is Dr Neylon worried by the potential for information overload to which this proliferation of information could give rise.

"The idea that we need to protect ourselves from the flow of information is getting the web completely backwards. Rather than filter failure, we have a discovery deficit," he said.

The trick is to develop specific search algorithms that allow scientists to find the information they are looking for.

End of the peer show

Dr Neylon said that researchers are "obsessed with their legacy" and value the article system for its ability to identify particular people with particular ideas at particular times, ensuring that "the guy who publishes three days later loses".

But he thinks the technology exists to authenticate both the date and the veracity of online content, ensuring that "notches on bedposts" can still be recorded.

He admitted it may be more difficult to replicate journals' role, via peer review, of separating the scientific wheat from the chaff.

"If someone tells me a piece of data is reliable, I'll pay it more attention if they are experts. We have to be able to make similar judgements in some form at web scale. There are examples of it working at some level but nothing in the research space that demonstrably could replace peer review and (replicate) the confidence people have in it," Dr Neylon said.

On the other hand, he thinks that confidence is misplaced. He agreed that peer review is the "core of science" but noted there is "more peer review of the efficacy of homeopathy than there is of the efficacy of (traditional forms of) peer review".

He argued that the traditional three referees' opinions are too small a sample to justify "binary decisions" about whether to publish a paper. Far better, he thinks, to publish everything and leave it to readers to make judgements about quality, either through commenting or via metrics such as how many times an article is read or cited, or how many times its data are reused.

Dr Neylon is an academic editor of the online journal PLoS ONE, which is pioneering such an approach, selecting papers on the basis of their rigour rather than their perceived importance. But he admitted that, in general, article-level metrics and commenting facilities currently remain rather crude.

Getting there from here

Dr Neylon admitted his brave new world would require a "big cultural shift" currently being resisted by "entrenched financial interests" in the academy and in publishing.

But he sensed a "building momentum in certain areas". The publisher Elsevier, for instance, is involved in a number of projects to make datasets more available.

And he is confident a tipping point will be reached within the next decade - particularly given the pressure from funders for the impact of research to be maximised.

"Public research funding is not a sheltered housing scheme for people with PhDs," he said. "It is something that is expected to deliver and communications is part of that."
