
Seeing The Bigger Research Picture: The benefits of altmetrics data for authors

Jun 1, 2018

Editor’s Note: This article is part of an ATG Special Issue on author support


By: Amy Rees, Customer Support Manager at Altmetric

What and why: does article performance data matter?

With the number of journals being published continuing to grow, journals can use article performance data to offer researchers added value, encouraging them to publish with their organisation and rewarding that choice. This added value can take different forms, from raw numbers such as downloads to more nuanced data such as discussion of an article in policy documents.

Collating different types of article performance data allows authors to see rich data associated with their publications. Authors can be rewarded for their community outreach and engagement, and see the fruits of their labour pay off.

Article performance data also provides a wider picture of how the research is being received and presented online. This allows authors to consider questions such as: How are mainstream sources presenting their research? How are laypersons reacting to it? How are governmental and non-governmental organisations using it in the "real world"?

One facet of article performance data is the collection and presentation of altmetrics. Altmetrics, or alternative metrics, describe non-traditional attention to scholarly outputs. Altmetrics as an idea covers a wide range of types of attention: news stories, shares and mentions on social media, references from government policy documents or patents, and much more.

Designed to be complementary to traditional bibliometrics (citations between articles, for example), altmetrics can provide a much more immediate, richer picture of who is engaging with a piece of research, and how it was received.

Recognising author engagement efforts

Providing altmetrics to authors, an approach which has become increasingly commonplace amongst academic publishers, can not only encourage them to disseminate the results of their research, but also help them see the positive effects of doing so. Outreach and engagement by an author is a key factor in increasing both the type and the quantity of attention paid to a publication. As authors take the time to blog, engage with readers on Twitter, give interviews to news programs, and address comments on public peer review forums, they are creating a conversation about their work, and that takes considerable effort.

Beyond its role in journal performance, this type of dissemination also allows audiences outside of academia to develop their understanding of key issues that impact society. The effort authors put into making their research available and easy to understand also benefits the academic community, whose funders often rely on public donations and whose institutions may seek to raise their profile in a specific field.

Undertaking this kind of broader engagement, and tracking its outcomes, is also increasingly used by individual researchers looking to demonstrate the influence of their work to potential funders, hiring committees, or as part of national research performance reviews.

Staying on top of the story

News coverage, now more than ever and whether true or false, dominates the public understanding of research. Popular science is discussed in major newspapers and dissected in opinion pieces. One journal article can be discussed in multiple news outlets within a short period of time, often with conflicting stories or perspectives. Research published in Science, "The spread of true and false news online," indicates that false news stories are shared at a much higher rate than those based in truth. This means that "getting ahead of the story" is critical for authors and the communications teams that support them, ensuring their research is properly positioned.

If an author doesn’t have access to this news data they might not see a misinterpretation of their research and miss the opportunity to respond or clarify. Further, they might also miss the opportunity to engage with an interested community.

The aggregation of news stories, one type of altmetrics, lets authors track how their research is being positioned, so they can work with a publisher's media or marketing team to correct any issues or highlight particular coverage.

Likewise, public peer review platforms such as Publons, the source used by Altmetric, allow researchers to see the peer reviews of their publication in an open format. This open data gives researchers the chance to respond to relevant criticism within their own field. Concerns about results, data collection, and other aspects of research can be addressed via public discussion. This encourages inter-group conversations about research and allows more direct feedback on the publication.

Expanding the data available to authors

Collating altmetrics data can be challenging and time consuming for authors. While a simple search online could surface some of the news stories about a publication, this masks the effort necessary to build a complete picture. Some stories may be found, but a user is constrained by their chosen search engine and what it considers "important results".

Taking the time to truly understand the attention and engagement associated with a publication can lead authors to arduous searching of multiple platforms and sources. By providing altmetrics, journals are pulling together a snapshot of the online attention available and bringing it into a single place, saving authors time and hassle.
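As a rough illustration of what this collation involves, the sketch below queries Altmetric's public v1 lookup endpoint for a single DOI. The endpoint itself is real, but the DOI is a placeholder and the response fields printed here are assumptions about the v1 payload; the free tier is rate-limited and some sources require a commercial key.

```python
# Minimal sketch: pull a publication's online-attention summary into one
# place via Altmetric's public v1 API. DOI and printed fields are
# illustrative assumptions, not a complete integration.
import json
import urllib.request

API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi: str) -> str:
    """Build the v1 lookup URL for a given DOI."""
    return API_BASE + doi

def fetch_attention(doi: str) -> dict:
    """Fetch and parse the attention summary for a DOI."""
    with urllib.request.urlopen(altmetric_url(doi)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Placeholder DOI -- substitute one of your own publications.
    data = fetch_attention("10.1038/nature12373")
    print(data.get("title"), data.get("score"))
```

A journal-side service would run lookups like this across its whole article list and merge the results with downloads, views, and citation counts, which is exactly the aggregation work an individual author would otherwise do by hand.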

Altmetrics are also a great way to highlight sources that might not otherwise be available to authors. While some sources, such as Twitter and news, may be accessible to authors directly, others are harder to find or simply unavailable.

Finding mentions of a publication in policy documents is often a particular challenge for authors. Not all policy sources or organisations make their publications available in an easy-to-read format, such as PDF or Word documents. They can be buried in website archives in older formats or simply hard to locate.

Further, there is a practical issue with extracting policy references: where does an author even start? Most governments publish thousands of policy documents per year, making any manual search daunting.

As with policy documents, finding references to journal articles in syllabi is nearly impossible for an individual academic. While authors may be aware of where their articles are used in their own institution, or perhaps in parts of their sector, many other institutions may be using their research for teaching. This use could be invisible to researchers, who may not realise how widely their work is being taught.

Highlighting readership and academic engagement

While engagement with laypersons is a valuable measure of dissemination, authors are also likely interested in which other authors and academics are reading and considering their publications. Readership data, such as that provided by Mendeley, shows who has saved a paper in their academic library to read or use in a future publication. It also provides geographic and discipline data for the readers that have saved the paper. Authors can then see where researchers are saving their publications, and from which disciplines. Additionally, Mendeley readership has been correlated, in some fields, with long-term citations.

Services such as F1000 – Faculty of 1000 – allow users to see which academics have recommended their paper. This data shows that academics have not just saved the paper but have read it and deemed it of value. Though saving a paper does denote at least interest and likely future engagement, these recommendations show direct engagement with the paper and a public endorsement of its content.

Complementary data

Article performance data should be viewed as interlocking and complementary data, with altmetrics working together with more traditional sources such as downloads, views, and citations. While traditional citations take longer to accrue they represent an important part of the story for understanding the performance of an article.

As with any other article performance data, a high volume of citations does not necessarily indicate agreement or quality. For example, the since-retracted paper by Andrew Wakefield et al. linking the MMR vaccine and autism has more than a thousand citations.

Similarly, download counts and views provide another type of article performance data, giving authors a sense of the immediate response to a paper. While a high number of downloads and views doesn't necessarily denote agreement, it does signal engagement with and attention to the publication.


Providing altmetrics to authors is more than saying "You have X news stories and Y Facebook posts", though that in itself can be valuable attention. Altmetrics data allows journals to provide a more complete picture of the attention that has been paid to an author's publication. From laypersons to science communicators to other academics and everyone in between, article performance data is a key source of valuable information for journals to provide to authors.

Article performance data is not only about addressing potential issues and positioning the research in the media; it is also about allowing authors to see the whole story. Providing a variety of data allows authors to see the areas, both disciplinary and geographic, that have shown interest in their publication, and to build connections with others who might be interested. In adding article performance data to the author package, a journal is not only giving an author data, it is showing the value of a publication beyond its appearance in that journal.

