For many decades, research assessment has been based predominantly on the Journal Impact Factor and article citation counts.
In May 2013, a group of journal publishers and editors released the San Francisco Declaration on Research Assessment (DORA) to express their concerns over journal-based metrics. In DORA, they put forth recommendations that would change the traditional way of evaluating research and explore new indicators of significance and impact.
Nowadays, researchers and scholars have found other ways to communicate, share, and comment on research, leveraging online channels such as Twitter, blogs, social networks, open data portals and apps, and digital repositories. These online activities present a different way, or perhaps an additional way, to assess research impact. Often referred to as Alt-Metrics, these online scores provide an opportunity for real-time feedback.
I recently attended a webinar where one of the speakers shared a new look and feel for Alt-Metrics scores implemented within the Scopus database. Many of us who use this resource are familiar with the Alt-Metrics donut, which shows the reader the online presence and activities associated with a paper. This donut score can still be seen within Synapse (MSK Publication Database). For example, if you look at this November 2014 publication (entitled “Cancer: Antitumour immunity gets a boost”) in Synapse, you will see a variety of metrics, including tracking data that lets you know the paper was picked up by 3 news outlets, tweeted 90 times, and downloaded by 157 readers to their Mendeley libraries, resulting in a score of 106 (as of 11/20). By clicking on details, the reader can explore the numbers further, as well as understand the score in context.
If you were to view this paper in Scopus, the redesign presents the paper’s online impact differently, replacing the donut with a bar graph that lets the reader see at a glance traditional metrics (citations) alongside Scholarly Activity (downloads and posts in research tools such as Mendeley and CiteULike), Scholarly Commentary (reviews, articles, and blogs by experts and scholars), Mass Media (coverage of research output by news outlets), and Social Activity (mentions on platforms such as Twitter, Facebook, and Google+). In addition to bibliographic abstracting and indexing databases, Alt-Metrics data can be found in other resources, such as journal websites and other publishers’ platforms.
So what does this all mean? We can say that Alt-Metrics is no longer emerging but rather evolving. Capturing mentions of research papers in tweets, Facebook posts, blogs, Mendeley downloads, mainstream news media, and other non-traditional online sources helps paint a different picture and provides additional real-time impact data from the digitized world of scholarly communication. Most importantly, if these metrics are to have an impact, they need to be embraced not only by those who generate the numbers, but also by those who determine the “metric formula” for evaluating research impact at their institution.
Donna Gibson
Director of Library Services