The body of peer-reviewed literature on alt-metrics published in Open Access journals is gradually growing. Some of the research published during 2012 profiled key issues and recognisable patterns.
A prominent Open Access platform for peer-reviewed articles on alt-metrics is the Public Library of Science (PLOS), whose “Altmetrics Collection” currently includes eight PLOS ONE articles on alt-metrics. Of these, four research articles and one overview were published during 2012, and in this Fieldwork post we take a quick look at them.
As Priem et al. explain in their overview of “The Altmetrics Collection” (published 1 November 2012), gathering a body of research on alt-metrics is “greatly needed as important questions regarding altmetrics’ prevalence, validity, distribution, and reliability remain incompletely answered.”
During 2012 the PLOS ONE Altmetrics Collection covered a range of subjects, including statistical analysis of alt-metrics data sources; metric validation and identification of biases in measurements; validation of models of scientific discovery and recommendation based on alt-metrics; qualitative research describing the scholarly use of online tools and environments; empirically supported theory guiding the use of alt-metrics; and other research relating to scholarly impact in online tools and environments.
Research Blogging and Online Discussion
In “Research Blogs and the Discussion of Scholarly Information” (published 11 May 2012), Shema et al. contribute to the study of academic blogging by analysing a sample of blog posts that cite published peer-reviewed articles and are aggregated by ResearchBlogging.org. The authors examined the bloggers, blog posts and referenced journals of bloggers who had posted at least 20 items. The studied bloggers showed a preference for papers from high-impact journals and blogged mostly about research in the life and behavioral sciences; the most frequently cited journals in the sample were Science, Nature, PNAS and PLOS ONE.
An important discovery of the study is that most of the sampled bloggers had active Twitter accounts connected with their blogs, and “at least 90% of these accounts connect to at least one other ResearchBlogging-related Twitter account.” The study also provided demographic information about the bloggers, including gender and occupation. Clearly detailing its methodology and limitations, this article provides a serious, reliable example of how the fields of bibliometrics, webometrics and alt-metrics interconnect, and it contributes to a better understanding of the role blogs, and Twitter, play in a segment of the scholarly communication landscape.
Alt-metrics, sentiment analysis and academic events
In “Tweeting the Meeting: An In-Depth Analysis of Twitter Activity at Kidney Week 2011” (published 5 July 2012), Desai et al. looked at the Twitter backchannel of an academic event, Kidney Week 2011 (#kidneywk11). The authors suggest that any tweet accomplishing educational dissemination would need three key features: 1) informative content, 2) internal citations, and 3) a positive sentiment score. They found informative content in 29% of messages, a higher proportion than at a similarly sized medical conference (the 2011 ADA Conference, 16%), but the study showed that few conference attendees composed tweets (1.4%). The authors also found a correlation between informative tweets and tweets expressing negative sentiment, an important discovery in itself.
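The paper's three-feature rubric can be illustrated with a toy scoring function. This is only a sketch of the idea: the keyword lists, the `assess_tweet` helper and the crude lexicon-based sentiment score are all my hypothetical stand-ins, since the study itself relied on human coding rather than any published code.

```python
# Illustrative sketch of the three features Desai et al. describe:
# informative content, internal citations (mentions of other accounts),
# and sentiment score. All term lists below are assumed, not the study's.

INFORMATIVE_TERMS = {"trial", "dose", "outcome", "biopsy", "egfr"}  # assumed
POSITIVE_TERMS = {"great", "excellent", "promising", "important"}   # assumed
NEGATIVE_TERMS = {"poor", "disappointing", "weak", "flawed"}        # assumed

def assess_tweet(text: str) -> dict:
    """Score one tweet against the three features (toy heuristic)."""
    words = {w.strip("#@.,!?").lower() for w in text.split()}
    sentiment = len(words & POSITIVE_TERMS) - len(words & NEGATIVE_TERMS)
    return {
        "informative": bool(words & INFORMATIVE_TERMS),
        "has_internal_citation": any(w.startswith("@") for w in text.split()),
        "positive_sentiment": sentiment > 0,
    }

result = assess_tweet(
    "Promising trial outcome data presented by @nephrologist #kidneywk11"
)
```

A real analysis would replace the keyword heuristics with trained raters or a proper sentiment classifier, but the structure — per-tweet feature flags that can then be cross-tabulated — is the same.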
Unfortunately, the article doesn’t really explain the tools and techniques employed in the collection, analysis and visualisation of the data (perhaps taking these methods for granted), and it rests on the potentially unfounded assumption that tweeters using #kidneywk11 actually took part in the conference. Notwithstanding this, the article points towards future studies of how medical conferences engage in public dissemination via Twitter, and proposes an interesting combination of statistical and sentiment analysis.
New Tools and Impact Across Disciplines
In “Scholarometer: A Social Framework for Analyzing Impact Across Disciplines” (published 12 September 2012), Kaur et al. offer an example of the current interest in developing new tools and approaches to assess the reception of scholarly outputs. The authors point out that although citation and download counts are widely available from digital libraries, current annotation systems rely on proprietary labels, refer to journals rather than articles or authors, and are manually curated. They propose an ambitious social framework called “Scholarometer”, based on crowdsourced annotations made by researchers, which computes citation-based impact measures. The idea is that disciplinary annotations provided by authors can in turn be used to compute disciplinary metrics.
The article provides a detailed description of the system architecture and research methodology, reports on the data sharing and interactive visualizations produced by the resource, and argues that crowdsourcing metadata can offer a coherent emergent classification of scholarly output.
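To make “citation-based impact measures” concrete, here is a minimal sketch of one classic example, the h-index (the largest h such that an author has h papers with at least h citations each). This is my own illustration of the kind of metric such a framework can compute, not Scholarometer's actual code, and the citation counts below are invented.

```python
def h_index(citation_counts: list[int]) -> int:
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Hypothetical citation counts for one author's papers.
example = h_index([10, 8, 5, 4, 3])  # 4 papers with >= 4 citations each
```

A framework like the one the authors describe could compute such a metric per discipline once crowdsourced annotations assign each author's papers to disciplines.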
Correlation Between Mentions and Downloads
In “How the Scientific Community Reacts to Newly Submitted Preprints: Article Downloads, Twitter Mentions, and Citations” (published 1 November 2012), Shuai et al. address another scholarly output of interest to alt-metrics: preprints. The authors analysed 4,606 scientific articles submitted to the preprint database arXiv.org between October 2010 and May 2011, and studied three forms of response to these preprints: downloads, mentions on Twitter, and early citations in the scholarly record. The article shows that Twitter mentions and arXiv downloads of scholarly articles follow two distinct temporal patterns of activity (Twitter mentions have shorter delays and narrower time spans than downloads). Importantly, the study shows that the volume of mentions on Twitter “is statistically correlated with downloads and early citations just months after the publication of a preprint, with a possible bias that favors highly mentioned articles.” The detailed methodology and materials described in the article can inform future studies looking at relationships between mentions on Twitter and article downloads, as well as possible qualitative assessments of scholarly online behaviour and the effects of Twitter on scholarly communications and impact.
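The kind of correlation the authors report can be sketched with a small Pearson correlation over per-article counts. The numbers below are invented for illustration; the paper's own analysis uses the full arXiv/Twitter datasets and more careful statistics (including rank-based tests), not this toy.

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-article counts: Twitter mentions vs. arXiv downloads.
mentions = [0, 1, 2, 5, 8, 20]
downloads = [40, 55, 60, 120, 150, 400]

r = pearson(mentions, downloads)  # strongly positive for this toy sample
```

In practice one would use `scipy.stats` and, given the skewed distributions typical of mention counts, prefer a rank correlation; the sketch only shows the shape of the comparison.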
As we have argued in previous posts, in spite of their growing popularity the relevance, validity and reliability of alt-metrics are still open to debate. Open access to scholarly literature collections like these allows alt-metrics researchers and the wider community to contribute to a field in its early stages of development and to put hypotheses to the test.
(Literature review continues in part II).