Journal transparency index will be ‘alternative’ to impact scores

Center for Open Science project will pass judgment on journals’ commitment to research transparency

January 29, 2020
Crystal clear: it is hoped the new system will encourage open science (Source: Getty)

A new ranking system for academic journals measuring their commitment to research transparency will be launched next month – providing what many believe will be a useful alternative to journal impact scores.

Under a new initiative from the Center for Open Science, based in Charlottesville, Virginia, more than 300 scholarly titles in psychology, education and biomedical science will be assessed on 10 measures related to transparency, with their overall result for each category published in a publicly available league table.

The centre aims to provide scores for about 1,000 journals within six to eight months of the site’s launch in early February.

Among the measures of assessment are whether publications ask authors to share their raw data and whether they set standards for research design disclosure.

Other categories cover whether journals encourage the replication of studies and whether authors are required to preregister their experiments before data collection.

On data sharing, for example, journals would be given a score of one out of three for publishing a data availability statement, two for requiring authors to share data (subject to exceptions) and a full three for requiring that enough data be provided to enable full replication.

Journals will also receive credit if they offer the option of peer review before any research is undertaken – a publishing format known as “registered reports”, used by more than 200 journals, in which assessors focus on research design rather than end results.

The new ranking system comes amid concerns over the difficulty of reproducing some of science’s most high-impact papers owing to large amounts of missing or withheld methodological data. Last week Brian Nosek, the centre’s director, told Times Higher Education that a forthcoming paper on cancer research replication would be a “wake-up call” for science because the weak methodology sections of most papers made reproduction impossible.

David Mellor, the centre’s director of policy initiatives, said he believed the new table would “promote those who are doing the most to encourage open science”.

“We want to recognise those who are taking difficult steps to address bad practices,” said Dr Mellor, who hoped journals would “require or reward open science practices that are not as common as they should be”.

Dr Mellor added that he hoped the new scorecard system – in which scores are decided by the centre’s assessment teams based on publicly available information – would provide an alternative to the current journal ranking system, which is based on citations per paper and known as the journal impact factor.

“There is a clear pecking order based on impact, but this is really about a paper’s novelty or prestige, rather than [adhering to] core scientific values,” said Dr Mellor.

jack.grove@timeshighereducation.com
