Research Impact Challenge

This guide contains 10 activities for researchers to better understand and manage their online scholarly presence, as well as the impact and reach of their research.

Day 9: Alternative Metrics

Welcome to Day 9 of the U-M Library Research Impact Challenge!

Yesterday we looked at the h-index, a calculation of author productivity and impact. Today we’ll look beyond citation-based metrics, considering the ways that alternative metrics (or “altmetrics”) can add to the picture of what we know about the impact of scholarship.

Let’s get started!

Background

The term “altmetrics,” coined in 2010 in “altmetrics: a manifesto,” refers to “the creation and study of new metrics based on the social web for analyzing, and informing scholarship” (Priem, Taraborelli, Groth, and Neylon, 2010).

It’s easy to assume that altmetrics are all about social media (people tend to think of Twitter in particular), but that is only part of what they offer. By tracking links from all kinds of websites back to scholarly research, altmetrics can reveal references to and engagement with scholarship in the news, in policy documents, in syllabi, on scholarly blogs, and beyond.

Today’s challenge introduces you to a tool called Altmetric Explorer for Institutions. It’s important to note that Altmetric Explorer—a proprietary tool from a company called Digital Science—is by no means the only source for altmetric data. However, folks affiliated with U-M have access to the Altmetric Explorer, and it has some interesting features that make it easy for you to track and share information about how the web (and the world) is interacting with the research that is important to you.

About Altmetric (with a capital 'A')

First, some background about what Altmetric does and how it works:

Altmetric searches the web for "mentions" of research outputs, such as journal articles or book chapters, to show how readers are engaging with scholarly publications online. Mentions can appear in social media, scholarly blogs, news outlets, Wikipedia, citation managers like Mendeley, and more; Altmetric maintains a full list of the sources it tracks.
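If you'd like to pull these mention counts programmatically, Altmetric also offers a free, rate-limited Details Page API keyed by identifiers such as a DOI. Here is a minimal sketch in Python; the endpoint comes from Altmetric's public API documentation, but the DOI below is a placeholder and the exact response fields can vary by record, so treat the field handling as an assumption:

```python
import requests  # third-party HTTP library: pip install requests

def altmetric_mentions(doi: str) -> dict:
    """Fetch Altmetric's public record for a DOI; returns {} if Altmetric has no mentions."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:  # Altmetric has not tracked any mentions of this DOI
        return {}
    resp.raise_for_status()
    return resp.json()

record = altmetric_mentions("10.1234/example-doi")  # placeholder DOI: substitute your own
if record:
    print(record.get("title"), "| attention score:", record.get("score"))
    # Per-source mention counts are reported in fields named like "cited_by_*_count".
    for key in sorted(record):
        if key.startswith("cited_by_") and key.endswith("_count"):
            print(f"  {key}: {record[key]}")
```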

You may have seen Altmetric donuts, badges, and scores on journal websites, perhaps even attached to your own research. Each stripe of color on the donut represents a different type of engagement. For example, light blue indicates Twitter, red indicates news, and yellow indicates blogs. If you hover over the donut, you'll see an abbreviated summary of engagement with the work.

You can click on the donut to view the Altmetric details page for that item and learn more about every single mention.

The Altmetric Attention Score—the number inside the donut—is a proprietary figure generated by both counting and weighting the different types of mentions. Altmetric describes the Attention Score as an "indicator of engagement"; it does not communicate anything about the quality of the work. To learn more, see Altmetric's support page, "How is the Altmetric Attention Score calculated?"
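Because the actual weights and adjustments are proprietary, any re-implementation can only be illustrative, but the general "count and weight" shape of such a score is easy to see in a toy sketch. The weights below are invented for demonstration and are not Altmetric's:

```python
# Invented weights for illustration only; Altmetric's real weights,
# normalizations, and adjustments are proprietary.
ILLUSTRATIVE_WEIGHTS = {"news": 8.0, "blog": 5.0, "twitter": 1.0, "facebook": 0.25}

def toy_attention_score(mentions: dict) -> float:
    """Weighted sum of per-source mention counts: the general shape of an
    attention-style score, not Altmetric's actual algorithm."""
    return sum(ILLUSTRATIVE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# 2 news stories, 1 blog post, and 30 tweets -> 2*8 + 1*5 + 30*1 = 51.0
print(toy_attention_score({"news": 2, "blog": 1, "twitter": 30}))
```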
Altmetric Explorer for Institutions is a dashboard that allows users to browse, search, and query all of the data that Altmetric has gathered. Let’s explore it together: 

1. Follow this link to the Altmetric Explorer. This will direct you through the U-M authorization system, so if you’re not already logged in, you’ll be prompted to do so.


2. If this is your first time accessing the Explorer dashboard, you may be prompted to create a personal account on the site. An account isn't necessary for today's challenge, but you will need one if you ever want to save searches, generate reports, or set up alerts. Feel free to skip this for now; if you want to create an account later, simply click the "plus" icon in the lower left-hand corner of the screen.


3. Notice that, by default, the dashboard only shows information about research outputs that Altmetric knows to be associated with the University of Michigan. We'll leave this setting in place for now. As of this writing, Altmetric is aware of 245,768 research outputs associated with a U-M author. Of those, 83,313—just over ⅓—have been mentioned on the web at some point since 2012, when Altmetric started tracking mentions.


4. Your default view will be "Research Outputs," which simply lists the relevant research outputs; you can use the dropdown menu to sort them in various ways. Try navigating along the tabs at the top of the screen ("Research Outputs," "Timeline," "Demographics," "Mentions," and "Journals") to see different representations of these same research outputs. Along the way, you can click on just about any data point to "zoom in" for more detail.


5. More than 83,000 research outputs across all of U-M is too vast an ocean of data to sift through usefully. Let's narrow our search to find something more meaningful. At the top of the page, click "Edit Search" to open the advanced search window.


6. First, check the "full Altmetric database" box (this will deselect the "My Institution Only" box) to make sure we don't miss anything.


7. Next, select a publication date range of 1/1/2018–12/31/2018. (Leave the "Altmetric mentions during" date range set to the default of "any time.")


8. Click "Run search." This query still returns a huge set of results: more than 1.6 million research outputs, with more than 1.2 million mentions. Now let's sort them to find an interesting subset: use the dropdown menu to sort results by syllabus mentions.


9. We've now got a list of works published in 2018, sorted by the number of syllabi on which they appear. Unsurprisingly, the vast majority of 2018 publications have not appeared on any syllabus, so the number of results with at least one syllabus mention is actually quite manageable. Click on any one of these publications to learn more about it, including where it's being taught.


10. Now, take some time to explore on your own, searching and filtering for the information of greatest interest to you. One important recommendation: if you want to search for an individual author (such as yourself!), set your search to include the full Altmetric database and enter the name in the keyword field (or try searching on the author's ORCID iD). The U-M verified author search has some limitations, so a very broad approach works better.


Bonus challenge: create an Altmetric Explorer account for yourself. From anywhere in the Explorer, you should then see an option in the upper right-hand corner to save your search—try it! Once saved, your search will be added to a panel on the left-hand side of your screen. Click on the bulleted-list icon to open your saved searches. From there, you can set up email alerts to periodically receive updated reports that match your search parameters. You can also create a visually appealing report that can be bookmarked and shared with others.

What next?

Congratulations! You’ve completed Day 9 of the U-M Library Research Impact Challenge, and we’re almost to the finish line—just one more day to go! Tomorrow, we’ll aim to prepare you to take your new knowledge and skills back out into the world by introducing frameworks for the responsible and ethical application of research impact metrics.