Vol. 36 No. 10 — December 2016
FEATURE

Altmetrics: An Overhyped Fad or an Important Tool for Evaluating Scholarly Output?
by Marc Vinyard

Traditionally, scholarly output has been measured with citation-based metrics: the impact factor, which ranks journals by how often their articles are cited; the raw number of times a researcher's articles have been cited in other scholarly sources; and the h-index, the largest number h such that h of an author's works have each been cited at least h times. Altmetrics untether the evaluation of scholarly impact from citation counts in academic journals. Some researchers initially think of altmetrics as a resource that measures how many times scholarly works are mentioned on social media such as Facebook and Twitter, but this limited definition does altmetrics a disservice. In addition to social media mentions, altmetrics draw on the number of times works have been downloaded or viewed on publisher websites, in commercial databases, and in institutional repositories; library holdings of books; Goodreads reviews; readership in citation managers such as Mendeley; and activity on academic peer networks such as ResearchGate. Even this list is not exhaustive; the range of potential altmetrics sources is bound only by the creativity of researchers.
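To make the h-index concrete, here is a minimal sketch of the computation in Python, using made-up citation counts for a hypothetical author:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h of the works
    have been cited at least h times each."""
    h = 0
    # Rank works from most- to least-cited; h grows as long as the
    # work at rank r has at least r citations.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author with five papers cited 10, 8, 5, 4, and 3 times:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```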

Potential Advantages of Altmetrics as a Measure of Scholarly Productivity

Altmetrics’ reliance on the social web to evaluate scholarly publications allows researchers to receive much faster feedback on the impact of their academic work. Because of the lengthy peer-review and publication process for academic journals, citing articles themselves take years to appear in print, so it typically takes at least 2 years for an article to accumulate an appreciable number of citations. While traditional bibliometric measures are tied to the print journal model, altmetrics were created specifically for the web. Heather Piwowar, the co-founder of Impactstory, describes the advantage of altmetrics over traditional bibliometrics as “evidence of impact in days instead of years.” If a researcher is looking for influential articles on a topic, searching the Web of Science or Scopus and ranking articles by citations will reveal very few published within the last 2 years. However, the scholarly peer network Social Science Research Network (SSRN) ranks academic papers by the number of downloads. By consulting SSRN, researchers can identify the papers that are most popular among scholars long before those papers are cited in scholarly journals.

Altmetrics offer promise to researchers who are less likely to publish journal articles. Because traditional bibliometric assessments are tied to citation counts in academic journals, they are less effective for evaluating the impact of monographs or book chapters. Altmetrics provide a more diverse set of evaluations, and researchers can find metrics that are a good match for their research activities. For example, the author of a book can look at reader reviews on Goodreads, along with the number of readers in Mendeley, to gauge the book’s influence. Another option for collecting metrics is to share research on Academia.edu, a social networking site with strong representation from humanities professors. Some altmetrics resources go beyond evaluating books and articles to measure the impact of less traditional works such as datasets, interviews, and blogs. While altmetrics offer promise for evaluating scholarship beyond the journal article, the existing tools still do a better job with journal articles than with other scholarly works.

Altmetrics democratize the measurement of the impact of scholarly works because they measure how undergraduates and other lay readers view academic works. By drawing on feedback from a wide variety of sources and a diverse group of readers, altmetrics are a good example of crowdsourcing. This is in stark contrast to citations in scholarly journals, which reflect how scholars and other experts are using academic works to the exclusion of lay researchers. For instance, any reader can mention an academic study on Facebook or post a review of a book on Goodreads. Similarly, statistics on the number of times articles have been downloaded or viewed reflect the reading and research preferences of anyone who accessed the articles, rather than just professional researchers.

Another example of how altmetrics are gathering metrics from a broader readership is the effort to measure mentions of scholarly works in popular news websites. Non-academics are more likely to visit news websites than to read scholarly publications. By tracking how many times their works are mentioned in news websites, researchers can receive feedback on the societal impact of their work. It’s worth noting that many scholarly articles are still hidden behind paywalls, and it’s hard to say how many journalists—let alone laypersons—actually read the entire study. The popular press is notorious for simplifying or misinterpreting academic research. However, without news sites mentioning research studies, even fewer members of the general public would be aware of their existence.

Why is it important to measure the broader impact of scholarly work? Funding organizations would like metrics to measure the impact of the research that they fund. This is part of a larger trend of funding agencies wanting research to be more widely available. For example, some agencies are requiring that recipients publish their findings in open access (OA) sources.

Challenges Facing Altmetrics

The social web favors popular, trendy topics over obscure, complicated subjects. Works on Ebola or the refugee crisis in Europe will attract more attention on social media than a publication on hydrogen absorption by zirconium alloys at high temperatures. This bias toward popular topics is evident in Altmetric.com’s annual ranking of the 100 articles with the highest altmetrics scores. Some of the highest-ranked articles from 2015 include “Accelerated Modern Human-Induced Species Losses: Entering the Sixth Mass Extinction” and “Shaping the Oral Microbiota Through Intimate Kissing.” Nearly all of the articles on the list are of interest to a broad readership. Complicated articles such as “Exploration of Highly Active Bidentate Ligands for Iron(III)-Catalyzed ATRP” interest only experts in a particular field. Researchers may wonder whether highly cited articles also have strong altmetrics scores; preliminary research suggests that the articles with the most citations in Web of Science didn’t always score high on altmetrics.

Most researchers are more familiar with traditional bibliometrics than with altmetrics. This is understandable, considering the term “altmetrics” was only coined in 2010. Scholars might find altmetrics indicators interesting, but they also want to know how well these indicators correlate with the bibliometric measures they already know. Despite the shortcomings of bibliometric measures, many researchers will be more comfortable with scholarly citations than with altmetrics. Multiple studies examining the relationship between scholarly citations and altmetrics indicators show a positive, but weak to moderate, correlation between the two.
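Those correlation studies typically report rank correlations such as Spearman’s rho. A minimal illustration of the calculation, using entirely made-up numbers for ten hypothetical articles (requires SciPy):

```python
from scipy.stats import spearmanr

# Hypothetical data: citation counts and altmetrics scores for ten articles.
citations = [120, 85, 60, 44, 30, 22, 15, 9, 4, 1]
altmetrics_scores = [40, 95, 12, 30, 5, 48, 2, 7, 16, 1]

# Spearman's rho compares the rank orderings of the two lists.
rho, p_value = spearmanr(citations, altmetrics_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# A rho well below 1.0 illustrates the weak-to-moderate relationship
# reported in the literature.
```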

Gaining acceptance from faculty tenure committees will be crucial if altmetrics are to move from curiosity to serious tool for evaluating scholarly output. Librarians can make faculty members aware of altmetrics, but, ultimately, professors will decide their importance for promotion decisions. Even though altmetrics offer hope for evaluating works in the fine arts and humanities, scholars in these fields might be more resistant to quantitative measures than social science and STEM professors.

It’s important to realize that altmetrics are a diverse collection of tools. A professor might be contemptuous of mentions on Twitter but receptive to statistics on article downloads. Faculty members will have to decide which altmetrics statistics they want to use for evaluating scholarly impact. How much weight should be given to readers in Mendeley versus article downloads? Should Twitter and Facebook statistics be considered? Another obstacle to faculty acceptance of altmetrics is the concern that researchers could game the indicators to inflate the apparent influence of their work. Fears about scholars or publishers falsifying scholarly impact metrics are not limited to altmetrics. It’s also possible to manipulate bibliometric indicators; in one experiment, researchers successfully planted fake papers in Google Scholar to fabricate citations. In addition, publishers have manipulated citations to increase the impact factor of their journals.

The Outlook for Altmetrics

As it stands, the following observations can be made:

  • Altmetrics provide flexibility in evaluating researchers—Some scholarly works will be difficult to evaluate based on how many times they have been cited in scholarly publications. It’s valuable to have additional options when h-indexes, impact factors, and scholarly citations fail to properly evaluate a researcher’s works.
  • Standardizing altmetrics has pros and cons—More agreement about which specific measures should be used for evaluating academic works will help improve the authority of altmetrics and make it easier to compare researchers. An important advantage of altmetrics is granting researchers the flexibility to use metrics that are a good fit for them. Attempts to centralize altmetrics should provide loose definitions that account for disciplinary differences.
  • Altmetrics are connected to recent scholarly publishing trends—OA journals from publishers such as PLOS have aggressively adopted altmetric indicators. Altmetrics harvesters are using ORCID IDs, which are unique author identifiers, to collect authors’ works (see the sketch after this list).
  • Altmetrics are better at measuring a work’s attention than its quality—Critics correctly point out that article views and social media mentions are not indicators of a work’s quality. However, bibliometrics are not an indicator of quality either. An article published in a journal with a high impact factor is not necessarily a quality article.
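Regarding the ORCID point above: a minimal sketch of how a harvester might list an author’s works through the public ORCID API at pub.orcid.org. The iD below is the example from ORCID’s own documentation, and the JSON paths follow the v3.0 public API; both are assumptions worth checking against the current ORCID documentation.

```python
import requests

orcid_id = "0000-0002-1825-0097"  # example iD from ORCID's documentation

# The public ORCID API returns an author's works grouped by external ID.
resp = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/works",
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

# Print the title of the first work summary in each group.
for group in resp.json().get("group", []):
    summary = group["work-summary"][0]
    print(summary["title"]["title"]["value"])
```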

Verdict

Altmetrics are a valuable addition to traditional bibliometrics as a resource for evaluating scholarly impact because they provide much faster feedback. Altmetrics have the potential to be especially helpful in evaluating the impact of works such as books, performances, and visual works that traditional bibliometrics have neglected. My prediction is that altmetrics will complement rather than replace the impact factor, h-indexes, and scholarly citations. It will be an uphill battle for altmetrics indicators to be given the same weight as citations in the academic world. But if academia resists altmetrics, it will be swimming against the tide of important trends such as the social web, OA publishing, and the democratization of information.

ALTMETRICS RESOURCES TO EXPLORE

Information professionals have many free altmetrics resources to work with. Librarians may already be using some of these resources without realizing they fall under the altmetrics umbrella.
  • Altmetric’s bookmarklet (altmetrics.com): Provides statistics for social media, Mendeley, and CiteULike. Librarians can register for a free account with Altmetric Explorer, which offers more search capabilities.
  • ResearchGate (researchgate.net): Peer scholarly network with a heavy STEM presence; provides metrics on readers.
  • Academia.edu (academia.edu): Peer scholarly network for the humanities.
  • Publisher websites (sites vary): Many publishers track how many times articles have been downloaded and have partnered with Altmetric.
  • Institutional repositories (sites vary): Readership reports on document downloads.
  • Mendeley (mendeley.com): Reference manager that provides statistics on the number of readers.
  • CiteULike (citeulike.org): Statistics on how many works have been bookmarked.
  • Social Science Research Network (SSRN) (ssrn.com): Rapid dissemination of academic papers, with rankings by document views.
  • Impactstory (impactstory.org): Collects social media metrics.
  • Open Syllabus Project (opensyllabusproject.org): Search more than 1 million course syllabi to discover books on reading lists.
Libraries that have a strong interest in promoting altmetrics could subscribe to the commercial PlumX database, one of the most comprehensive altmetrics resources. PlumX is a harvester that pulls altmetrics data from a variety of sources.
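To see the kind of data such harvesters collect, here is a minimal sketch against Altmetric’s free public API (api.altmetric.com) rather than PlumX itself. The DOI belongs to the sixth mass extinction paper mentioned earlier, and the field names are taken from the v1 API’s JSON responses; treat both as assumptions to verify against Altmetric’s documentation.

```python
import requests

# DOI of "Accelerated Modern Human-Induced Species Losses" (Science Advances, 2015)
doi = "10.1126/sciadv.1400253"

# Altmetric's free v1 API returns attention data for a single DOI.
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
if resp.status_code == 200:
    data = resp.json()
    print("Altmetric score:", data.get("score"))
    print("Tweeters:", data.get("cited_by_tweeters_count"))
    print("News outlets:", data.get("cited_by_msm_count"))
else:
    print("No altmetric record found for this DOI")
```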

Selected References

Altmetric (2015). The Altmetrics Top 100: What Academic Research Caught the Public Imagination in 2015? Retrieved from altmetric.com/top100/2015.

Boon, C.Y., and Foon, J.J. (2014). “Altmetrics Is an Indication of Quality Research or Just HOT Topics.” IATUL Annual Conference Proceedings, (35), 1–8.

Bornmann, L. (2014). “Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics.” Journal of Informetrics, 8(4), 895–903. DOI:10.1016/j.joi.2014.09.005.

Carpenter, T.A. (2012, Nov. 14). Altmetrics—Replacing the Impact Factor Is Not the Only Point. Retrieved from scholarlykitchen.sspnet.org/2012/11/14/altmetrics-replacing-the-impact-factor-is-not-the-only-point.

Colquhoun, D. (2014, Jan.16). Why You Should Ignore Altmetrics and Other Bibliometric Nightmares. Retrieved from dcscience.net/2014/01/16/why-you-should-ignore-altmetrics-and-other-bibliometric-nightmares.

Costas, R., Zahedi, Z., and Wouters, P. (2015). “Do ‘Altmetrics’ Correlate With Citations? Extensive Comparison of Altmetric Indicators With Citations From a Multidisciplinary Perspective.” Journal of the Association for Information Science and Technology, 66(10), 2003–2019. DOI:10.1002/asi.23309.

Delgado López-Cózar, E., Robinson-García, N., and Torres-Salinas, D. (2014). “The Google Scholar Experiment: How to Index False Papers and Manipulate Bibliometric Indicators.” Journal of the Association for Information Science and Technology, 65(3), 446–454. DOI:10.1002/asi.23056.

“Issues, Controversies, and Opportunities for Altmetrics” (2015). Library Technology Reports, 51(5), 20–30.

Piwowar, H. (2013). “Altmetrics: What, Why and Where?” Bulletin of the American Society for Information Science and Technology (ASIS&T). Retrieved from asis.org/Bulletin/Apr-13/AprMay13_Piwowar.html.

Roemer, R.C., and Borchardt, R. (2015). Meaningful Metrics: A 21st Century Librarian’s Guide to Bibliometrics, Altmetrics, and Research Impact. Chicago: ACRL.


Marc Vinyard (marc.vinyard@pepperdine.edu) is the reference and instruction librarian at Pepperdine University Libraries. His research interests include altmetrics, bibliometrics, library instruction, assessment of reference services, and business information. He has written articles for The Charleston Advisor, Searcher, and the Journal of Library Administration.