This is a guest post contributed by Hui Zhang, Assistant Professor and Digital Application Librarian at Oregon State University Libraries. Hui discusses their experience of adding altmetrics into their Institutional Repository system.
Altmetrics at Oregon State
The concept of altmetrics is already well known in libraries, and as more institutional repositories and publishers offer altmetrics for their scholarly content, more researchers are becoming familiar with them too. However, the controversy over how altmetrics should be used has intensified rather than diminished: one camp holds that the suite of impact-related indicators known as altmetrics should be included alongside bibliometrics in the package of scholarly impact measures, while the other argues that altmetrics are not as reliable or accurate as more traditional bibliometrics, such as journal citations.
This blog post is a case study from Oregon State University (OSU) Libraries, where Altmetric.com badges have been added to journal articles deposited in ScholarsArchive@OSU since 2014. It highlights feedback about altmetrics collected from faculty members, and our thoughts on offering altmetrics as a library service.
The motivation for adding altmetrics to OSU's IR is twofold: offering them as an incentive for depositing publications in the IR, and promoting the use of new metrics in scholarly communication. These objectives led us to decide that our implementation should demonstrate the “positive” impact of altmetrics in order to widen their adoption among faculty. In line with this, we chose to display the badges on OSU article pages only when there is altmetrics data to show (i.e., the Altmetric score is greater than zero). If there is no altmetrics activity, the badge does not appear on the article page. OSU Libraries also ran a pilot with the institutional edition of the Altmetric Explorer, which we used to find the articles attracting the most attention online at the time and to notify their authors.
In a recent survey of all OSU faculty members about their perceptions of using web-based metrics (we chose not to use the term altmetrics, to avoid any possible confusion about its definition) as measures of scholarly impact, we received mixed opinions that gave us a better understanding of the issues around altmetrics and how to proceed. Some faculty members are enthusiastic about the new metrics, with comments such as “I think it is an important factor that should be considered” and “It’s a good addition to more ‘traditional’ metrics”. However, we have to admit that the majority of the faculty indicated they are reluctant to adopt altmetrics as they stand today, and raised the following concerns:
- Definition: what do web-based metrics include, and how is the information harvested?
- Accuracy: usage statistics are not consistent, and it is not transparent how they are collected; for example, whether web bots are excluded from download counts (side note: usage stats are separate from any data provided via the Altmetric badges).
- Bias: books, book chapters, and some disciplines can be underrepresented on the web.
- Correlation: capturing (e.g., bookmarking) or mentioning an article (e.g., on social media) does not mean the researcher will cite that article later. Although numerous studies demonstrate that altmetrics and bibliometrics are correlated in general, the strength of the correlation varies significantly by the type of altmetric (e.g., captures in Mendeley vs. mentions on Twitter).
- Gaming for attention: a popular or controversial research topic will probably attract much higher attention on social media, but that attention should not be mistaken for quality or impact.
These concerns reveal a trust gap between scholars and altmetrics, which is no surprise to many practitioners, and they leave a question for librarians: how can we improve the situation?
The white paper released by NISO (the National Information Standards Organization) includes a list of 25 potential action items for the next phase of NISO’s altmetrics standards project, which should address the faculty’s concerns if what is proposed is implemented appropriately. However, even after the development of the standards is completed in November 2015, it will be a long time before they are adopted and implemented by the various altmetrics service providers. So there are huge opportunities for academic librarians to be involved in the development of altmetrics, by building support for them and by working with colleagues to define the types and methodologies of altmetrics.
For instance, librarians can help identify the types of scholarly output best suited to altmetrics, work with IT departments and altmetrics providers to improve the methods of data harvesting and metric calculation, and promote the use of altmetrics through instruction. Here at Oregon State University Libraries, the next step will be studying the possibility of making altmetrics available for theses and dissertations deposited in the IR. Furthermore, OSU Libraries also offers a variety of outreach activities to introduce altmetrics, including a new online research guide and an altmetrics workshop targeting faculty and graduate students.
I believe the scholarly community will eventually accept the idea of using web-based information to indicate research impact, despite the doubt and uncertainty surrounding it today. Making that vision a reality, however, requires collaboration and understanding among all the major stakeholders, including academic librarians, standards organizations (e.g., NISO), and metrics harvesters such as Altmetric.com.