Psychology reproducibility ‘crisis’ overstated, new report claims

But authors of original paper hit back, saying latest assessment is ‘very optimistic’

March 4, 2016
[Image: brain scan. Source: iStock]

Academics have continued to trade blows over the state of psychology research following the release of a paper questioning the results of a major project that cast doubt on reproducibility in the field.

In August 2015, an attempt to reproduce 100 prominent papers by the Center for Open Science found that only 36 per cent produced statistically significant results, stoking concerns about scientific reliability that have also engulfed biomedicine.

But today a group of researchers from Harvard University and the University of Virginia responded, claiming that the study contained several statistical errors and had failed to repeat the experiments properly.

Their comment lists a number of alleged discrepancies between the original studies and the attempts at replication.

“An original study that measured Americans’ attitudes toward African-Americans was replicated with Italians, who do not share the same stereotypes; an original study that asked college students to imagine being called on by a professor was replicated with participants who had never been to college,” says the comment, published in Science.

It also criticises the project for attempting each replication only once, which it says led to a much lower apparent rate of replication.

The study “seriously underestimated the reproducibility of psychological science”, it concludes.

But the authors of last year’s replication study have hit back themselves in Science, calling today’s critical paper a “very optimistic assessment” that is “limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data”.

The corresponding author from last year’s study, Brian Nosek, a psychology professor at Virginia, also took aim at some of the reporting on today’s paper attacking his findings.

One journalist “talks to me at 9.20p saying she only read the press release – not original article, comment or response. Files story @ 9:30p,” he wrote on Twitter.

Other commentators have questioned whether today’s rebuttal means that all is well in psychology.

“The Reproducibility Project is far from the only line of evidence for psychology’s problems. There’s the growing list of failures to replicate textbook phenomena,” wrote Ed Yong in The Atlantic.

“There’s publication bias – the tendency to only publish studies with positive results, while dismissing those with negative ones. There’s evidence of questionable research practices that are widespread and condoned,” he argued.

david.matthews@tesglobal.com
