Four ways to make research more open and robust

Metadata can be used to improve research integrity, but first, major changes to research design and practice need to be made, argues Neil Jacobs

October 26, 2019

Scientific research is finding new ways of diffusing knowledge. Digital technologies and collaborative tools are reshaping the whole research cycle, and everyone in research should consider how research 4.0 technologies, such as automated experiment selection and natural language processing, can help improve research.

We need to move away from opaque research, conducted on PCs, using closed software and reported in PDF documents, towards more open research created in a digital environment designed for that purpose. Technologies such as open source scripting languages, and practices such as the open sharing of data and code, support this shift.

The research sector is in a transition marked by a renewed interest in metadata used in research. The use of metadata is one of the key pillars of the recently launched Research on Research Institute, which will analyse research systems and experiment with decision and evaluation data, tools and frameworks.

Metadata are data that describe other data, which can make finding and working with particular data sets easier. For example, metadata tools can gather information on authors, date created, date modified and file size.
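To make the idea concrete, a metadata record for a data set can be as simple as a handful of descriptive fields. This is a minimal, hypothetical sketch; the field names are illustrative rather than drawn from any formal metadata standard.

```python
from datetime import date

# A hypothetical metadata record describing a shared data set.
# The field names are illustrative, not a formal standard.
record = {
    "title": "Survey responses, wave 2",
    "authors": ["A. Researcher", "B. Collaborator"],
    "date_created": date(2019, 3, 14).isoformat(),
    "date_modified": date(2019, 6, 2).isoformat(),
    "file_size_bytes": 1_048_576,
}

# Metadata lets tools search and filter data sets without
# opening the underlying data itself.
print(record["date_created"])  # → 2019-03-14
```

Even a record this small is enough for a search tool to answer questions such as "which data sets were modified after a given date?" without touching the data.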

Metadata tools give the research community the opportunity to gather insights into previously unexplored territory. Here are four examples of research projects that focus on metadata, showing the potential of technology to improve research integrity.

Pre- and post-outcome comparison

A review of clinical trials reported in 2019 in the JAMA Network found that a third of trials had a primary outcome that differed from the preregistered one, and that these trials were also more likely to report a higher (by 16 per cent) intervention effect. This significant finding leads me to wonder whether a tool comparing preregistered with published outcomes might provide useful feedback, helping to ensure that intervention effects are not skewed by selective reporting, researcher bias or any other factor that could compromise research integrity.
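The core of such a tool could be a simple comparison between what a registry records and what a paper reports. The sketch below is a hypothetical illustration, not an existing tool: the field names and the crude text normalisation are my assumptions.

```python
# Hypothetical sketch of a pre- vs post-registration outcome check.
# Field names and normalisation rules are illustrative assumptions.

def normalise(outcome: str) -> str:
    """Lower-case and collapse whitespace so trivial edits do not trigger a flag."""
    return " ".join(outcome.lower().split())

def outcome_changed(registered: str, published: str) -> bool:
    """True when the published primary outcome differs from the registered one."""
    return normalise(registered) != normalise(published)

trial = {
    "registered": "Systolic blood pressure at 12 weeks",
    "published": "Quality of life at 12 weeks",
}

if outcome_changed(trial["registered"], trial["published"]):
    print("Flag for review: primary outcome appears to have changed")
```

A real tool would need far more sophisticated matching, since outcomes are often reworded legitimately, but even a crude flag could prompt a human reviewer to look more closely.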

Reporting compliance dashboard

Similarly, a 2019 study of preclinical animal trials showed a widespread lack of basic reporting metadata. Information about reproducibility and ethical practices, such as blinding, sample size calculation, control group allocation and compliance with guidelines such as ARRIVE (animal research: reporting in vivo experiments), is often missing. It could be useful for institutions, researchers, funders and publishers to access a dashboard that shows these gaps in reporting.
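Behind such a dashboard would sit a per-study compliance check. The following is a minimal sketch under my own assumptions: the checklist echoes the ARRIVE-style practices named above, and the record format is invented for illustration.

```python
# Hypothetical per-study compliance check. The checklist items echo the
# ARRIVE-style practices discussed above; the record format is an assumption.

REQUIRED_ITEMS = {
    "blinding",
    "sample_size_calculation",
    "control_group_allocation",
    "guideline_compliance",
}

def reporting_gaps(reported: set) -> set:
    """Return the checklist items a study failed to report."""
    return REQUIRED_ITEMS - reported

# A study that reported only two of the four required items:
study = {"blinding", "guideline_compliance"}
print(sorted(reporting_gaps(study)))
# → ['control_group_allocation', 'sample_size_calculation']
```

A dashboard would aggregate these gaps across many studies, letting an institution or funder see at a glance where reporting falls short.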

Data availability tracking tool

Another example of where technology might offer greater transparency is a 2018-19 study of genome-wide association studies in which Jisc was involved. It showed that only a minority of the studies included useful data availability statements. We are now exploring whether we can develop a tool to identify where these data availability statements are and whether they really do point to data that can be accessed and reused.
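Such a tool might start by locating a statement in a paper's text and then testing whether it points to a resolvable identifier rather than, say, "available on request". This is a hypothetical sketch: the regular expressions and the sample text are illustrative assumptions, not the tool we are building.

```python
import re

# Hypothetical sketch of a data availability check.
# The patterns and sample text below are illustrative assumptions.

def find_availability_statement(text: str):
    """Return the text following a 'Data availability' heading, or None."""
    match = re.search(r"data availability[:.]?\s*(.+)", text, re.IGNORECASE)
    return match.group(1) if match else None

def points_to_repository(statement: str) -> bool:
    """Crude test for a resolvable identifier rather than 'on request'."""
    return bool(re.search(r"doi\.org|accession", statement, re.IGNORECASE))

paper = "Data availability: all data are deposited at https://doi.org/10.1234/example."
statement = find_availability_statement(paper)
print(statement is not None and points_to_repository(statement))  # → True
```

The harder problem, as the study suggests, is verifying that the linked data really can be accessed and reused, which requires following the identifier, not just finding it.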

Reliability and confidence through AI

The US Defense Advanced Research Projects Agency (Darpa) is working on the Systematising Confidence in Open Research and Evidence project, using AI and machine learning to estimate the reliability of social and behavioural studies and assign each a confidence score. The aim is for these tools to assign confidence scores with a reliability equal to, or better than, the best current human expert methods. The scores will inform the way that the US military uses social and behavioural science research in its investments and its models of human social behaviour to safeguard national security.

Digital collaborative environments

Tools and applications that improve existing research communication practices are paving the way towards a more robust and open science culture. But more fundamentally, we need changes further upstream: digital collaborative environments that embed academic norms and practices, such as pre-registration and open code, into research design and practice.

We need to make it easy for researchers to do the right thing during the research process as well as when reporting it afterwards.

For example, we’re in conversations with researchers at the universities of Bristol and Bath, who conduct research into the built environment, the physical environment and the ways that human beings interact within them. This intrinsically interdisciplinary work involves civil engineers, scientists, psychologists and others.

Those conversations concern what an appropriate digital environment would look like: one in which all these disciplines can bring together data from the internet of things, sensor networks and mobile networks, and in which hypotheses can be pre-registered and embedded into software agents that interrogate those data in a responsible and reproducible way.

There are already providers of such technology, such as the Open Science Framework, a free, open platform supporting open research and collaboration.

Another helpful initiative is the Force11 Scholarly Commons, which provides a set of principles, concrete guidance for practice, and action towards including diverse perspectives from around the globe.

We also need changes in research assessment to reward these best practices, including the publication of interactive models and more imaginative ways of reporting science that are truer to the research process. And we need changes in research study design and funding, for example to recognise longer study set-up times, and in research teams and skills, so that coding becomes as mainstream as authoring papers and bids.

Neil Jacobs is head of open science and research lifecycle at Jisc.
