Altmetrics: The Impacts on Impact
Posted on November 11, 2014 by Sally Hawkins
Digital innovation has sparked a massive culture shift in the way researchers choose to communicate their research.
Social media tools and research-specific networks like Mendeley enable scientists to promote their own discoveries, making their research output more accessible to both colleagues and society as a whole. The growth of digital standards, such as the digital object identifier (DOI) attached to papers and the ORCID identifier for researchers, means that it has become easier to track output and any derivative work that emerges in social media, blogs or the news.
These innovations are further fuelling the debate around the accuracy and value of the traditional measure of impact, the journal impact factor. In 2010, the publication of the altmetrics manifesto proposed an alternative measure of impact. Since then, altmetrics have been hotly debated and have grown in popularity. But what are altmetrics, who uses them, and are they really the alternative to traditional measures of impact that everyone is looking for?
Altmetrics indicate a wide range of activity around individual papers in digital media, such as comments on social media, mentions in the news or in policy documents, and sometimes download and usage data. A number of providers offer altmetrics, such as Altmetric and Plum Analytics, and some publishers develop their own tools; the resulting data are typically displayed alongside the article itself.
To give an example, the Society for General Microbiology’s current highest-scoring paper, “The 2014 Ebola virus disease outbreak in West Africa” in the Journal of General Virology, has an Altmetric score of 94. For those interested in exploring the information that Altmetric can provide, download the free Altmetric Bookmarklet.
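For those who would rather pull this information programmatically, Altmetric also offers a public API. Below is a minimal Python sketch of fetching a paper’s attention data by DOI; the endpoint and response field names follow Altmetric’s publicly documented v1 API, and the DOI shown is a hypothetical placeholder rather than a real paper.

```python
import requests

ALTMETRIC_API = "https://api.altmetric.com/v1/doi/"

def fetch_altmetric_summary(doi):
    """Return a small summary of a paper's Altmetric data,
    or None if Altmetric has no record for the DOI (HTTP 404)."""
    response = requests.get(ALTMETRIC_API + doi)
    if response.status_code == 404:
        return None
    response.raise_for_status()
    data = response.json()
    # Field names follow Altmetric's v1 API; .get() guards against
    # counts that are absent when a paper has no mentions of that type.
    return {
        "title": data.get("title"),
        "score": data.get("score", 0),
        "tweets": data.get("cited_by_tweeters_count", 0),
        "news_stories": data.get("cited_by_msm_count", 0),
    }

# Hypothetical DOI, for illustration only
summary = fetch_altmetric_summary("10.1099/vir.0.000000")
if summary:
    print("{title}: score {score}".format(**summary))
```

This is the same kind of data the Bookmarklet surfaces in the browser, simply in a form that can be scripted.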
As publishers we are interested in altmetrics because they allow us to deepen our understanding of the impact of our publications, with new usage data becoming available much sooner after publication than with traditional citation tracking. They also allow us to track emerging trends in subject areas and respond to them rapidly, by collecting and highlighting papers that are proving popular according to altmetrics. For researchers the information can be invaluable: not only for authors, who can more easily track the response to their papers outside formal publication channels, but also for readers, who use it to help manage the immense amount of literature available on the web.

Funding institutions also gain from the uptake of altmetrics. At the recent 1:AM conference in London, Adam Dinsmore, Evaluation Officer at the Wellcome Trust, described how altmetrics allow the Trust not only to track the impact of its funding more easily, but also to delve into the detail of the data to gain a deeper understanding of the story behind it. For example, Dinsmore spoke of a particular case where the article title may have influenced the amount of attention it received on social media (11,849 Facebook mentions and 853 tweets to date).
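To make the trend-spotting use concrete: given the API sketch above, a publisher could rank a set of papers by their current Altmetric scores and surface the most popular ones. The snippet below is a simple illustration under that assumption, not a description of any publisher’s actual workflow; the DOIs are again hypothetical placeholders.

```python
def trending_papers(dois, top_n=5):
    """Rank papers by current Altmetric score, highest first;
    papers with no Altmetric record are skipped."""
    summaries = []
    for doi in dois:
        summary = fetch_altmetric_summary(doi)  # from the sketch above
        if summary is not None:
            summaries.append(summary)
    summaries.sort(key=lambda s: s["score"], reverse=True)
    return summaries[:top_n]

# Hypothetical DOIs, for illustration only
for paper in trending_papers(["10.1099/vir.0.000000", "10.1099/mic.0.000000"]):
    print("{score:>7.1f}  {title}".format(**paper))
```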
That altmetrics add value is clear; further evidence comes from HEFCE, which is considering how it might use altmetrics and has commissioned an independent review of the role of metrics in research assessment, due to report in 2015. But it is also becoming apparent that altmetrics should not be a substitute for citation counts, and that they do not provide a neat solution to the problems of impact factors. Altmetric scores share many of the impact factor’s shortcomings: data are inconsistent between the numerous metric providers, there are as yet no standards to ensure the accuracy of the data, and high altmetric scores do not necessarily reflect positive reactions to articles. There is also a worry that researchers who are more active in social media and public engagement than others will be unfairly advantaged in altmetric scores.
Altmetrics may not provide a complete alternative to the impact factor, but used in tandem with traditional measures they allow us to build a much fuller picture of the impact of research, deepen our understanding of the uptake of science in the wider community and encourage us to explore and experiment with the information we produce in new ways. It is important for publishers, researchers and funders to continue the discussion so that we can begin to establish best practice and industry standards, and so that we can fully appreciate the potential of these new metrics. In all these discussions it is also important to consider the limitations of impact measurement; in the words of a maxim often attributed to Albert Einstein, “Not everything that can be counted counts, and not everything that counts can be counted.”
Videos and blog posts from the recent 1:AM altmetrics conference in London are recommended for anyone interested in learning more.