Researchers’ creativity could be undermined by a proliferation of new metrics-based approaches to research evaluation, the director of the Wellcome Trust has warned.
Jeremy Farrar told a conference in London last week that so-called altmetrics - research metrics that go beyond traditional citation counts - offered the possibility of “driving change” by better connecting researchers with each other and with the wider community. But he said that the current burden on the research community was “massive”.
“We are in danger of overburdening it with ever more approaches, and it is on the edge of not being able to cope…such that we will destroy [its] creativity and innovation,” he told the 1:AM conference on altmetrics on 25 September.
Altmetrics often involve counting the number of mentions or “likes” of papers on social media. Some commentators have suggested that this could provide an early indication of how many citations a paper is likely to ultimately accrue. But Euan Adie, founder of altmetrics provider Altmetric, admitted that the correlations were “very weak”.
Altmetrics were better suited to assessing the broader social impact of papers, he said, adding that while analysing social media was useful, the movement had to go further.
He said that his firm was also assessing mentions on blogs and review websites such as Faculty of 1000, as well as in policy documents - although he said it would take human intervention to ascertain whether a policy citing the research had actually been implemented.
Andrea Michalek, co-founder and president of Plum Analytics, said that her company was also counting comments and reviews, Wikipedia mentions and bookmarks in reference managers.
Adam Dinsmore, evaluation officer at the Wellcome Trust, said that the trust used altmetrics to flag up potentially interesting “narratives” about the influence of research it funded. However, he said, a high altmetric score did not necessarily imply that the research had made a crucial scientific impact, and it was important to consider whose attention papers had caught. For example, the trust searched for mentions of its papers in academic syllabuses and on websites with .ac and .gov domain names.
Liz Philpots, head of research at the Association of Medical Research Charities, said that she used altmetrics to monitor public discussion of funded research, but warned that assessing researchers on the basis of altmetrics would risk judging them “on the basis of their communication skills, not their research skills”.
James Wilsdon, professor of science and democracy at the University of Sussex, is chairing an independent review of the use of metrics in research assessment for the Higher Education Funding Council for England. It has received 152 written responses, which will soon be posted online. Preliminary conclusions will appear in March, followed by a final report in June.
He said that the “incredible dynamism and innovation” in altmetrics was “genuinely exciting for all of us who want to see a creative, outward-facing, socially engaged and impactful scientific enterprise”. But he echoed Professor Farrar’s concern that it also had the potential to overburden researchers with “yet more complex systems and tools for them to worry about”.