I gave a brief introduction to altmetrics, or alternative metrics, in an earlier posting and shared some literature discussing altmetrics as a way of measuring research impact via social media mentions. But altmetrics are more than just social media mentions; they encompass any metric that doesn't fall within the traditional citation metrics. They measure impact not in terms of the quality of a research output but in terms of the level of engagement it creates with the research community and even with wider society. And that's what I heard at the 1AM Conference in London, which I attended in late September.

Attending the conference was definitely eye-opening for me. It allowed me to see altmetrics as more than a measure or indication of social media activity, extending their scope to other metrics like the number of downloads, Mendeley readership, number of views, and so on. What amazed me was the sheer number of tools out there tracking these altmetrics.

Academia.edu

Academia.edu is one of many social networks for academics, alongside SSRN and ResearchGate (RG). Like SSRN and RG, it is more than a platform for sharing research or networking with like-minded researchers. It also provides valuable analytics on your profile and uploaded document views, including the countries, search engines and other sites the traffic came from over the past 30 days. These analytics give immediate feedback on the interest generated around your research.

AE1

Clicking on the total views shows a breakdown of the statistics like the one below.

Altmetric

You might have noticed Altmetric‘s colorful donut-like icon accompanying articles published by Springer, Elsevier, Sage and Wiley, which embed these article-level metrics on their platforms. Altmetric is a service that tracks and measures the broader impact of scholarly articles and datasets across traditional and social media (including non-English-language sources like Sina Weibo), online reference managers, post-publication peer-review sites, and public policy documents. You can install their bookmarklet or sign up for their Altmetric Explorer to start monitoring the metrics around an article. So how did Altmetric come up with the scoring and the sources? The score is not a total count of the mentions tied to a paper but rather a weighted count of the different sources, worked out with groups of researchers who ranked the sources in order of how important they were to an article’s broader impact. The score is further “adjusted” to take into account reach (how many people will the mention reach?), promiscuity (how many other articles have been mentioned by the same source in a short time period?) and bias (are all the articles from a single journal or publisher?). My personal opinion is not to be too fixated on that score, but it can be an easy reference point when you compare metrics across articles. I do like the fact that all data curated via altmetric.com is auditable.
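The exact weighting is Altmetric's own and is not reproduced here, but the description above suggests the rough shape of the calculation. Below is a minimal, purely illustrative Python sketch of a weighted, adjusted mention count; the source names, weights and adjustment factors are all hypothetical, not Altmetric's actual values.

```python
# Hypothetical weights per mention source; Altmetric's real weights differ.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy": 3.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

def mention_score(mentions):
    """mentions: list of dicts like
    {"source": "twitter", "reach": 1200, "promiscuity": 0.9, "bias": 1.0}."""
    score = 0.0
    for m in mentions:
        base = SOURCE_WEIGHTS.get(m["source"], 1.0)
        # Scale the base weight up for larger audiences, and down when the same
        # account mentions many papers or all mentions come from one outlet.
        reach_factor = min(m.get("reach", 1) / 1000, 2.0)
        score += base * max(reach_factor, 0.1) * m.get("promiscuity", 1.0) * m.get("bias", 1.0)
    return round(score, 1)

print(mention_score([
    {"source": "news", "reach": 50000, "promiscuity": 1.0, "bias": 1.0},
    {"source": "twitter", "reach": 300, "promiscuity": 0.8, "bias": 1.0},
]))
```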

ImpactStory

ImpactStory brands itself as the new scientific CV, showcasing not only a scientist’s research impact from traditional outputs like journal articles, but also web-native scholarship like blog posts, datasets, slides, videos and software. It is free. It allows you to load your Google Scholar profile publications using BibTeX or to sync with your ORCID account. Here’s a sample profile and the types of metrics available on ImpactStory.

Kudos

KUDOS is the new kid on the block. It launched in April this year and works like ResearchGate, with a component that allows online collaboration and sharing of documents among researchers. KUDOS’s social mentions data is linked to data from altmetric.com. Researchers can create an account for free. Authors can also easily “claim” publications from KUDOS’s indexed list of publications, so it is pretty easy to use. Here’s a sample of what a public profile looks like. The public profile only shows the publication list. You can only see the metrics for your own profile when you log in to your dashboard; if an institution subscribes to KUDOS, its administrators can view the metrics of the researchers tied to that institution.

Below is what the individual dashboard looks like. Clicking the graph shows you the metrics from KUDOS and also from Altmetric.

kudos1

Below is an example of a researcher’s metrics from KUDOS.

Below KUDOS‘s metrics, you will see the publication’s Altmetric display.

KUDOS2

Under the author’s dashboard, you will notice a little orange button, “Improve my Results”. KUDOS claims that following these steps will make your publications more discoverable through their range of discovery services.

Mendeley

Mendeley is a free reference manager and academic social network: make your own fully searchable library in seconds, cite as you write, and read and annotate your PDFs on any device. In recent years, there has been some focus on using Mendeley readership statistics as a metric for the impact of one’s work, since reading and citing are seen as two different scientific activities. One might read an article and use it but not cite it, as in cases of applied use by people outside academia, such as medical doctors, surgeons or business managers, or even a K-12 student doing her class project.[1] One study found statistically significant, moderate positive correlations between Mendeley readership counts and citations for all the studied disciplines, implying that a paper that is well read in Mendeley has a moderate chance of future citation.[2] Mendeley activity has also recently been endorsed by the Snowball Metrics initiative as part of their global standards for institutional benchmarking.
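To make the readership-citation relationship concrete, here is a tiny sketch of the kind of correlation analysis used in the studies cited below, run on made-up numbers rather than real Mendeley or citation data (it assumes SciPy is installed).

```python
# Illustrative only: hypothetical readership and citation counts for 8 papers.
from scipy.stats import spearmanr

mendeley_readers = [12, 45, 3, 88, 20, 7, 150, 33]
citations        = [ 4, 10, 0, 35,  2, 6,  22, 15]

rho, p_value = spearmanr(mendeley_readers, citations)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
# A positive rho of this kind is consistent with "well-read papers tend to be
# cited later"; it does not by itself demonstrate causation.
```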

You can read more about altmetrics with Mendeley readership on their blog. See below for the type of readership stats available via Mendeley.


[1] Ehsan Mohammadi, Mike Thelwall, Stefanie Haustein, and Vincent Larivière, “Who Reads Research Articles? An Altmetrics Analysis of Mendeley User Categories,” preprint, submitted 2014.

[2] Ehsan Mohammadi and Mike Thelwall, “Mendeley Readership Altmetrics for the Social Sciences and Humanities: Research Evaluation and Knowledge Flows,” Journal of the Association for Information Science & Technology 65, no. 8 (2014): 1627–1638, doi:10.1002/asi.23071.

PLOS Impact Explorer

Article-Level Metrics (ALMs) are available, upon publication, for every article published by PLOS. Researchers can stay up to date with their published work and share information about the impact of their publications with collaborators, funders, institutions, and the research community at large. These metrics track usage, citations, social bookmarking and dissemination, media and blog coverage, discussion activity and ratings.

PLOS has proactively enhanced their metrics and in 2013 released ALM Reports, which allows you to view article-level metrics for any set of PLOS articles as well as summarize and visualize the results. To find the ALMs that interest you, search using author information, institution, funder, subject area, date range or journal title, or provide the DOI (Digital Object Identifier) or PMID (PubMed ID). The resulting article list can be saved for future reference, instantly updated, and shared by social media or email. ALM Reports also allows you to browse the “Best in Class” articles from PLOS, such as top cited and top viewed, among others.

Just last month, PLOS also announced their new rich citations overlay for all PLOS articles. PLOS felt there’s a huge difference between the article you cite once in the introduction alongside 15 others, and the data set that you cite eight times in the methods and results sections, and once more in the conclusions for good measure. Yet both appear once in the list of references and count as one in citation databases for the article. To address this issue, PLOS decided to enrich citations so they carry detailed information about the citing paper, the cited object and the relationship between the two. See here for a sample article with rich citations.
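As a rough illustration of the extra detail a rich citation could carry compared with a flat reference-list entry, here is a small sketch. The field names and DOIs are my own placeholders, not PLOS's actual rich citations data model.

```python
# Hypothetical structure: one citation, broken down by where it appears.
rich_citation = {
    "citing_doi": "10.1371/journal.pone.0000000",   # placeholder DOI
    "cited_doi":  "10.5061/dryad.example",          # placeholder dataset DOI
    "cited_object_type": "dataset",
    "mentions": [
        {"section": "Methods",     "count": 5},
        {"section": "Results",     "count": 3},
        {"section": "Conclusions", "count": 1},
    ],
}

total_mentions = sum(m["count"] for m in rich_citation["mentions"])
print(f"Cited {total_mentions} times across "
      f"{len(rich_citation['mentions'])} sections")   # vs. "1" in a flat count
```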

Plum Analytics

More and more research output is happening outside the traditional journal or book. Plum Analytics’ PlumX tracks more than books and journals; it covers more than 20 different types of artifacts, including videos, presentations, conference proceedings, datasets, source code, cases and more. It also captures a wide range of metrics and categorizes them into five types: Usage, Captures, Mentions, Social Media, and Citations (a small grouping sketch follows the list below). Examples of each type are:

  • Usage – clicks, downloads, views, library holdings, video plays
  • Captures – bookmarks, code forks, favorites, readers, watchers
  • Mentions – blog posts, comments, reviews, Wikipedia links
  • Social media – +1s, likes, shares, Tweets
  • Citations – PubMed Central, Scopus, USPTO
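Here is a toy aggregation in the spirit of those five categories. The mapping of raw metric names to categories is my own simplification for illustration, not Plum's actual schema.

```python
# Group hypothetical raw metric events into PlumX-style categories and count them.
from collections import Counter

CATEGORY_OF = {
    "download": "Usage", "view": "Usage", "library_holding": "Usage",
    "bookmark": "Captures", "reader": "Captures", "code_fork": "Captures",
    "blog_post": "Mentions", "wikipedia_link": "Mentions",
    "tweet": "Social Media", "like": "Social Media",
    "scopus_citation": "Citations", "patent_citation": "Citations",
}

events = ["download", "download", "tweet", "reader", "scopus_citation", "view"]
totals = Counter(CATEGORY_OF[e] for e in events)
print(totals)   # e.g. Counter({'Usage': 3, 'Captures': 1, ...})
```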

PlumX is a subscription-based product, but it is also very comprehensive: it allows you to create an author profile to track your own research, look at article-level metrics, and group authors in the same institution for ease of management.

Below is an example of an article-level metrics page that I got from blog.impactstory.org. PlumX reports metrics like downloads and pageviews from other publishers and platforms like PubMed, PLOS and EBSCO, and also from your institutional repository. For books, it shows metrics from WorldCat holdings, Amazon and Goodreads too.

PlumX Artifact Screen Shot

The “flower-like” icon is known as a Plum Print, and if you hover over it you can see the metrics behind it. Each “petal” represents one of the five categories of metrics tracked in PlumX: Usage, Captures, Mentions, Social Media and Citations. The size of each circle corresponds to the number of metrics in that category, allowing users to quickly visualize the relative numbers or see whether a particular category has any metrics at all.

Below is an introductory video on what PlumX is.

ResearchGate

Founded in 2008, ResearchGate (RG) aims to connect researchers and make it easy for them to share and access scientific output, knowledge, and expertise. RG tracks metrics such as the total number of views, downloads and requests for your publications, as well as views of your profile. Below is an example of a researcher profile and the type of metrics captured.

RG1

Click on “view stats” to see more details pertaining to the stats.

RG2

A metric unique to RG is the RG Score, which is linked to the profile of every researcher. The RG algorithm looks at both the researcher’s published research and their contributions to RG, in terms of interactions with their peers. A contribution is anything you share on RG or add to your profile: whether it’s your questions and answers, a published paper you add to your profile, or the negative results and raw data you upload, anything you contribute can count towards your RG Score. For interactions, the algorithm looks at how your peers receive and evaluate your contributions, and also at who those peers are. This means that the higher the scores of those who interact with your research, the more your own score will increase. Your published research is then factored in to reflect your current standing within the scientific community.
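ResearchGate does not publish its algorithm, so the following is a deliberately simplified, hypothetical reputation score in the spirit of the description above; the contribution types, weights and formula are all invented for illustration.

```python
# Hypothetical RG-style score: contributions weighted by type, boosted when
# higher-scored peers interact with them.
def rg_style_score(contributions):
    """contributions: list of (kind, peer_score) tuples, where peer_score is
    the score of the researcher who interacted with the contribution."""
    KIND_WEIGHT = {"publication": 1.0, "answer": 0.5, "question": 0.25, "dataset": 0.75}
    score = 0.0
    for kind, peer_score in contributions:
        # Interactions from higher-scored peers raise your score more.
        score += KIND_WEIGHT.get(kind, 0.1) * (1 + peer_score / 100)
    return round(score, 2)

print(rg_style_score([("publication", 40.0), ("answer", 12.5), ("dataset", 25.0)]))
```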

Scholarometer

Scholarometer is a browser extension that provides a smart interface for Google Scholar, with features that make citation analysis easier and less error prone and that help evaluate the impact of an author’s publications. The tool enables authors to extract their own bibliographic data from Google Scholar, curate it, annotate it, and export or share it with other tools. More importantly, Scholarometer computes widely adopted metrics for analyzing author impact, such as the h-index and several of its extensions.

scholarometer

An author profile page with impact analysis.

scholarometer2

Other than Google Scholar citations, Scholarometer has come up with its own Scholarometer percentile (%ile). In the Scholarometer system, scholars can be tagged with multiple disciplines. Each of these annotations is like a vote, and the number of votes is used to estimate which discipline tags are reliable. The Scholarometer %ile is a global percentile computed from a weighted average of the h_s values for all the reliable disciplines with which an author is tagged.
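The sketch below follows that description: a weighted average of per-discipline h_s values, then a percentile against a reference set. The discipline tags, vote counts and reference distribution are made up for illustration and are not Scholarometer's actual data or formula.

```python
# Percentile from a weighted average of per-discipline h_s values (hypothetical data).
from bisect import bisect_left

def weighted_hs(h_by_discipline, votes_by_discipline):
    total_votes = sum(votes_by_discipline.values())
    return sum(h * votes_by_discipline[d] for d, h in h_by_discipline.items()) / total_votes

def percentile(value, reference_values):
    ranked = sorted(reference_values)
    return 100.0 * bisect_left(ranked, value) / len(ranked)

h_s   = {"information science": 18, "computer science": 22}   # per-discipline h_s
votes = {"information science": 7,  "computer science": 3}    # tag "votes"
peers = [5, 9, 12, 14, 17, 21, 26, 30, 41]                    # hypothetical reference set

hw = weighted_hs(h_s, votes)
print(f"weighted h_s = {hw:.1f}, %ile = {percentile(hw, peers):.0f}")
```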

Snowball Metrics

Snowball Metrics are best practices, built by the sector sharing its knowledge and experience. They are a manageable set of metrics that aim eventually to inform all areas of research activity. Agreeing on methodologies that can be consistently applied to research management information creates consistency and facilitates benchmarking between peer institutions. This helps to establish a reliable foundation for institutional strategic decision making to complement existing approaches. Click here to see a list of participating institutions.

Snowball Metrics is about working on and sharing a common language so that institutions are confident that they can use all of their data to compare their performance with each other in an apples-to-apples way. – Jennifer Johnson, Head of Performance, Governance & Operations, Research & Innovation, University of Leeds, United Kingdom

A Snowball Metric is indicated in the Recipe Book by a symbol placed after the name of the metric. Clear definitions of each of the 24 identified Snowball Metrics, and of how they are counted, are set out as “recipes”. These 24 metrics are grouped under three main categories: (1) Input Metrics, (2) Process Metrics and (3) Output Metrics.

Input Metrics look at research grant applications and awards volume, academic-industry leverage and business consultancy activities. Process Metrics look at research grant income, market value and contract research volume. Lastly, Output Metrics look at traditional citation measurements, altmetrics, collaboration impact, intellectual property (IP) volume and sustainable spin-offs.

Some interesting metrics are:

  • Field-Weighted Citation Impact – Instead of counting all the citations an article receives from publication up to the date the data is extracted, this uses a citation window of the year of publication plus the following three years. For example, for an article published in October 2007, citations received from publication until the end of December 2010 are counted (see the short sketch after this list).
  • Applications Volume – Applications Volume counts the number, and the price or amount applied for, of research grant applications submitted to external funding bodies. The date used is the date on which the application is submitted to the funding body.
  • Contract Research Volume – Contract research income is income received from an industrial or private external body commissioning a particular piece of research with specific terms. The information and IP arising from contract research will contractually be at least partly owned by the third party paying for the work.
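Here is a short sketch of the citation window described in the Field-Weighted Citation Impact bullet above, using publication year only; the recipe book defines the exact counting rules, and the citation years below are hypothetical.

```python
# Count only citations that fall in the publication year or the following three years.
def in_citation_window(pub_year, citation_year):
    return pub_year <= citation_year <= pub_year + 3

citation_years = [2007, 2008, 2010, 2011, 2013]     # hypothetical citing years
counted = sum(in_citation_window(2007, y) for y in citation_years)
print(counted)   # 3 -> the 2011 and 2013 citations fall outside the window
```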

More details in the Snowball Metrics Recipe Book.

NISO is also undertaking a two-phase initiative to explore, identify, and advance standards and/or best practices related to altmetrics. This initiative was a direct outgrowth of a breakout discussion group during the altmetrics12 meeting in Chicago, IL. The project is an important step in the development and adoption of new assessment metrics, which include usage-based metrics, social media references, and network behavioral analysis. In addition, the project will explore potential assessment criteria for non-traditional research outputs, such as data sets, visualizations, software, and other applications. Written notes, summaries of the breakout discussions, and video recordings of each meeting are available from the project webpage. The ideas generated from the first three meetings are reported in a white paper published by NISO in the middle of this year.

Social Science Research Network (SSRN)

SSRN is a repository of new and forthcoming scholarship in a number of disciplines, particularly the humanities and social sciences. It contains both accepted and working papers. Each paper in SSRN comes with paper statistics like abstract views, downloads and download rank. It also shows the total number of citations a paper has received from other papers within SSRN.

SSRN currently provides paper rankings for the following measures.

  • Total New Downloads: Total SSRN downloads of a paper during the last 12 months. This provides a measure of the current interest in a paper.
  • Total # of Downloads: Total lifetime SSRN downloads for a paper.
  • Total # of Citations: The total number of times that a paper has been cited by other papers in SSRN's eLibrary.
  • # of Authors: The number of authors named on the paper.
  • Total Downloads Per Author: The number of lifetime downloads per author for a paper.
  • New Downloads Per Author: The number of downloads in the last 12 months per author for a paper.
  • Total Citations Per Author: The total number of times that a paper has been cited by other papers in SSRN's eLibrary, divided by the total number of authors.

Wikipedia

Below is a poster by Brian Kelly and Martin Poulter on behalf of Wikimedia UK. Brian’s blog post on “Wikimedia and Metrics” best summarises how one can obtain metrics like usage statistics for articles and media, inbound and outbound links, contributions by editors and the evolution of articles via the “View History” tab.

Of course, there are many more tools and metrics that can be used to gauge the engagement with and impact of your research. I like Greg’s analogy that most people pound a nail with a hammer, but one can also pound a nail with a screwdriver; it doesn’t mean that the hammer is better just because it is the “right” tool. It is not about finding the best tool. What matters most is recognising the value of the different tools and knowing how to combine them properly for different purposes, to create a better product and help you be successful.

Email me if you are keen to find out more about how to track your altmetrics.
