Altmetric and Altmetric Explorer for Institutions

This guide provides an introduction and overview of the Altmetric and Altmetric Explorer for Institutions tools licensed by the U-M Library.

What is Altmetric?

Altmetric.com is a tool created by the company Digital Science that searches the web for "mentions" of research outputs, such as journal articles or book chapters, to show how readers are engaging with scholarly publications online. Mentions can appear in social media, scholarly blogs, news outlets, Wikipedia, citation managers like Mendeley, etc. You may have seen Altmetric donuts, badges, and scores on your own published research or on other research outputs that you have encountered.

The Altmetric "donut" shows the Altmetric score of attention surrounded by color-coded bands. Each band represents a different type of engagement: for example, light blue indicates Twitter, red indicates news, and yellow indicates blogs. If you hover over the donut, you'll see an abbreviated summary of engagement with the work.


You can click on the donut to view the details page for that item and dig deeper into each instance of engagement. For some types of engagement, such as Twitter, there are also visualizations to show where in the world people are talking about this work.


What is the Altmetric "score of attention"?

The Altmetric score of attention is a proprietary number that Altmetric generates by counting mentions and weighting them by type. For example, Altmetric's formula gives a mention in the news eight times the weight of a tweet. While the most valuable information Altmetric provides is the qualitative detail about each interaction with a research output, the score of attention attempts to communicate at a glance the overall level of engagement with the work. Altmetric describes the attention score as an "indicator of engagement." It is important to note that the score of attention says nothing about the quality of the work: an article that receives a great deal of attention online because it has been widely discredited may have a very high score of attention. To learn more about the score of attention, see Altmetric's support page, "How is the Altmetric Attention Score Calculated?"
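To make the weighting idea concrete, here is a minimal, hypothetical sketch of a weighted mention count. The only ratio taken from this guide is that a news mention counts roughly eight times a tweet; every other weight, and the function itself, is an illustrative assumption rather than Altmetric's actual (proprietary) formula.

```python
# Illustrative sketch only: the real Altmetric Attention Score is proprietary
# and more nuanced. The news-vs-tweet ratio comes from the guide above; the
# other weights are placeholder assumptions for demonstration.

ILLUSTRATIVE_WEIGHTS = {
    "news": 8.0,      # a news story is weighted ~8x a tweet (per the guide)
    "blog": 5.0,      # placeholder assumption
    "twitter": 1.0,   # baseline
    "facebook": 0.25, # placeholder assumption
}

def toy_attention_score(mention_counts: dict[str, int]) -> float:
    """Sum mention counts weighted by source type (toy approximation)."""
    return sum(
        ILLUSTRATIVE_WEIGHTS.get(source, 1.0) * count
        for source, count in mention_counts.items()
    )

# Example: 2 news stories plus 10 tweets outweigh 20 tweets alone.
print(toy_attention_score({"news": 2, "twitter": 10}))  # 26.0
print(toy_attention_score({"twitter": 20}))             # 20.0
```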

Where does Altmetric look for mentions? How can I be sure that sources I care about are being tracked?

Altmetric has a large and ever-growing set of sources where they look for mentions. These sources fall into a few clear categories:

  • More than 2,000 different news and media outlets (e.g., The New York Times, Wall Street Journal, Vox, Salon, Huffington Post). An up-to-date, complete list is available.
  • Publicly visible posts on social media (e.g., Facebook, Twitter, Google+, Reddit)
  • Academic social platforms (Mendeley, F1000)
  • Selected scholarly/academic blogs (e.g., Scholarly Kitchen)
  • Collections of book reviews (e.g., Bryn Mawr Classical Review)
  • Wikipedia
  • Policy documents (e.g., Gov.UK, IMF, WHO)
  • Syllabi in the Open Syllabus Project

Altmetric does not track web usage metrics such as page views, download counts, or other direct interactions between a user and the website where an article or book is hosted. Altmetric does track citation counts from Scopus and Web of Science and makes them visible in the Altmetric Explorer tool; however, citation counts are not incorporated into the Altmetric score.

Read more about the sources tracked by Altmetric.

If you're aware of media channels, platforms, or websites where your work is likely to be mentioned but that Altmetric might not be tracking, Altmetric welcomes suggestions. You can submit a new source for consideration by filling out this form.

What should I do if I'm aware of a "mention" that Altmetric has missed?

Missed mentions can be reported via an Altmetric web form.