Call for global overhaul of research evaluation gaining traction

‘Hong Kong Principles’ aim to tackle ‘perverse incentives’ in metrics-based policies and university promotion criteria

August 1, 2019
[Image: Graphene development from graphite. Source: Getty]

Integrity campaigners are finessing a set of principles to guide the assessment of research as concerns mount over the reliability and relevance of global scholarship.

A revised version of the “Hong Kong Principles” is due to be circulated by early August after a preliminary draft was appraised at the sixth World Congress on Research Integrity (WCRI) in Hong Kong in early June.

The document follows statements about research integrity and collaborations developed through the second WCRI in Singapore in 2010 and the third conference in Montreal in 2013. The new principles have evolved from the “Hong Kong Manifesto” developed earlier this year.

They offer broad guidelines to overcome “detrimental practices” encouraged by metrics-based policies that reward hype and overlook valuable contributions to global knowledge – such as replication studies, longitudinal datasets and research with null results.

“Current university promotion and tenure schemes may have been useful decades ago…[but] they are out of step today,” a May draft of the manifesto proclaims. “There is growing awareness that current reward criteria are of limited value, do not foster research integrity and might function as perverse incentives.”

Co-author Paul Glasziou said that the statement was part of a “multi-step process” destined to take decades. “We need sustained attention by the funders in particular but also research organisations to fix this problem,” said Professor Glasziou, director of the Institute for Evidence-Based Healthcare at Queensland’s Bond University.

The document mandates “responsible practices” in research, such as pre-registration of study protocols, and says that promotion and tenure committees should demand such practices. It says that all research should be reported, regardless of the results, with data made available as preprints or in repositories.

The principles also advocate open science and support “a broad range of research activities” such as replications, synthesis, meta-research and innovative study approaches. “Blue-sky research building on accidental findings, or curiosity-driven research based on out-of-the-box thinking, should be encouraged in an academic reward system,” the May draft says.

“For example, the discovery of graphene at the University of Manchester was the result of Friday afternoon discussions outside normal research activities.”

Research assessment practices should also acknowledge “other contributions” such as peer review and mentoring, the draft says.

Professor Glasziou said that the principles aimed to build on transparency measures pioneered about 35 years ago by University of Sydney oncologist John Simes. “He called for universal registration of clinical trials and has been working for years to get that to happen.”

Registration of clinical trials is now a condition of grants from the US National Institutes of Health, Australia’s National Health and Medical Research Council and other funding bodies, Professor Glasziou said.

“It’s also mandatory to post the results within a year of the trial completing,” he added. “That has led to more results being available, even if they’re not published. It’s been a big switch of the dial.”

He said that while similar practices were entering other realms of research, particularly preclinical research involving animals, they languished “30 years behind where we are with clinical trials.

“Because of the vanguard of the clinical trials movement, the animal study work is catching up, but it’s a slow process.”

Professor Glasziou said that he expected the Hong Kong Principles to be submitted for publication in a few months following further revision. “Getting a public discussion going is equally important,” he said.

john.ross@timeshighereducation.com

POSTSCRIPT:

Print headline: Movement for global overhaul of research evaluation gains traction


