At Altmetric we have been tracking 48 journals and repositories from different disciplines that have published articles about the humanities, and we have been attempting to evaluate what the metrics obtained through the Altmetric Explorer can tell us about the way these articles are received, discussed and promoted online.
We are sharing a dataset which is based on an Altmetric Explorer Report updated on Wednesday 9 January 2013. The report gathered 81 articles with the keyword “humanities” in the title (which were mentioned at least once in the last year on the social media platforms the Explorer is able to track) from a sample of 48 different journals and repositories. The spreadsheet, including a graph of the 30 articles with the highest Altmetric scores and the types of mentions they received, can be viewed and downloaded from Figshare, here.
In this Fieldwork post we provide some context to the gathering of the data, and list the ten articles that had the highest Altmetric scores in our sample at the time of collecting the data (by the time the reader sees this, the list and its scores may already have changed).
As Priem, Costello and Dzuba showed in their poster “Prevalence and use of Twitter among scholars” (2011; download it from Figshare), “scholars are using Twitter as a scholarly medium, making announcements, linking to articles, even engaging in discussions about methods and literature” (see this post from the LSE).
At Altmetric we have seen extensive and growing use of Twitter amongst academics; what was once a fringe activity is quickly becoming a standard method of scholarly communication. We have detected that the number of scholarly article links shared on Twitter grows by between 10 and 15% each month; in November 2012 we tracked approximately 200,000 tweets containing a link to scholarly material.
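To give a sense of what these figures imply, here is a back-of-the-envelope compound-growth projection. Only the two figures above (roughly 200,000 tweets in November 2012, growing 10–15% a month) come from our data; the calculation itself is purely illustrative.

```python
# Illustrative compound-growth arithmetic, not Altmetric data beyond the
# ~200,000 tweets/month baseline and the 10-15% monthly growth rate.

def projected_volume(base: int, monthly_rate: float, months: int) -> float:
    """Compound monthly growth: base * (1 + rate) ** months."""
    return base * (1 + monthly_rate) ** months

for rate in (0.10, 0.15):
    volume = projected_volume(200_000, rate, 12)
    print(f"{rate:.0%} monthly growth -> ~{volume:,.0f} tweets/month after a year")
```

At those rates, monthly volume would roughly triple to quintuple within a year, which is why even small month-on-month percentages matter here.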
Twitter is a space of “context collapse” (Marwick and boyd, 2011) not only in the sense that the personal and the professional merge, but also in the sense that scholars from different academic disciplines and members of the public tweet and interact with one another across disciplinary lines. Even within this shared space, scholars who self-identify with a particular discipline may behave very differently from scholars in other disciplines. We suggest that being able to track the articles they link to can provide one route of access to studying these differences.
This “context collapse” means that it should be possible to compare the online scholarly behaviour of scholars from academic fields that otherwise would be considered as different as apples and oranges. We still need more research about how the increasing number of scholars from clearly distinct disciplines are using Twitter in different ways (for two very different examples, see Ross, Terras, Warwick and Welsh 2011, who looked at Twitter use at digital humanities conferences, and Desai, Shariff et al. 2011, who looked at Twitter use at a nephrology conference).
The Altmetric Explorer is a tool for measuring the attention that scholarly articles receive online, and its intuitive user interface works as a search engine that allows users to browse the journals and repositories Altmetric tracks and obtain detailed reports. On a weekly basis Altmetric captures hundreds of thousands of tweets, blog posts, news stories, Facebook walls and other content that mentions scholarly articles on the Web.
The Explorer can browse, search and filter this data. By tracking the conversations to which it has access, it measures levels of attention over time or compared to other journals, repositories or papers.
As we have discussed previously, the Altmetric score is a quantitative measure of the quality and quantity of attention that a scholarly article has received, so it is not merely a measurement of mentions. The score for an article rises as more people mention it, and the Explorer only counts one mention from each person per source, so if someone tweets about the same paper more than once the algorithm will ignore everything but the first. For Altmetric there are different categories of mention, and each one contributes a different base amount to the final score. For example, a newspaper article contributes more than a blog post which contributes more than a tweet.
This is why articles with fewer mentions will often have a higher Altmetric score: a scholar sharing a link with other scholars counts for more than a journal account pushing the same link out automatically. Any metrics or ranking needs to take into account the technical context that makes online mentions of articles trackable in the first place. In order to properly assess why a given article gets a particular score, we need to take into account several factors, such as how long an article has been available online, the general discipline(s) and specific themes addressed by an article’s title, when Altmetric started tracking a specific feed (for example, a scientific blog), etc.
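The two rules described above (one counted mention per person per source, and different base amounts per source type) can be sketched in a few lines. This is a minimal illustration only: the weights below are invented for the example and are not Altmetric's actual values.

```python
# Hypothetical weights per mention type; illustrative only, not Altmetric's
# real base amounts. They preserve the ordering described in the post:
# news > blog > tweet.
MENTION_WEIGHTS = {"news": 8.0, "blog": 5.0, "tweet": 1.0}

def score(mentions):
    """Score a list of (source_type, author_id) mention tuples.

    Only the first mention from each (source, author) pair is counted,
    so repeat tweets about the same paper by one person add nothing.
    """
    seen = set()
    total = 0.0
    for source_type, author_id in mentions:
        key = (source_type, author_id)
        if key in seen:
            continue  # duplicate mention by the same person on the same source
        seen.add(key)
        total += MENTION_WEIGHTS.get(source_type, 0.0)
    return total

mentions = [
    ("tweet", "@alice"),
    ("tweet", "@alice"),      # second tweet by the same user: ignored
    ("tweet", "@bob"),
    ("blog", "scienceblog"),
    ("news", "daily-paper"),
]
print(score(mentions))  # 15.0
```

Note how the duplicate tweet contributes nothing, while the single news story contributes more than all the tweets combined, matching the behaviour described above.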
Until now, those interested in studying and developing methods to obtain metrics of the social media mentions of scholarly articles have focused primarily on activity around STEM papers (the Public Library of Science being an important driving force behind the implementation and recognition of Article Level Metrics). Nevertheless, non-STEM and multidisciplinary papers are also being mentioned online, and the imminent creation of a PLOS-style model for the humanities and social sciences is an indicator of a new scholarly culture exploring the commonalities between fields.
Though the number of scholars in the arts, humanities and social sciences adopting social media for scholarly communication is quickly increasing, the way they have generally used different online platforms until now may exclude that activity from being tracked (and therefore counted) as mentions: there may well be online conversations about academic papers, and collections of them in databases, but these often fail to link to the papers themselves. An online post only counts as a “mention” if it includes a hyperlink, and the article linked to needs to have been assigned a Digital Object Identifier or another stable reference such as an arXiv or PubMed ID.
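In other words, a tracker has to recover a stable identifier from the link before anything can be counted. A toy sketch of that step might look as follows; the URL patterns are deliberately simplified and hypothetical, since real resolvers handle many more link shapes than this.

```python
import re

# Simplified, illustrative patterns for three identifier schemes the post
# names (DOI, arXiv ID, PubMed ID). Real link resolvers are far more thorough.
IDENTIFIER_PATTERNS = [
    ("doi", re.compile(r"doi\.org/(10\.\d{4,9}/\S+)")),
    ("arxiv", re.compile(r"arxiv\.org/abs/(\d{4}\.\d{4,5})")),
    ("pmid", re.compile(r"pubmed/(\d+)")),
]

def extract_identifier(text):
    """Return (scheme, identifier) for the first recognisable article link,
    or None -- in which case the post would not count as a mention."""
    for scheme, pattern in IDENTIFIER_PATTERNS:
        match = pattern.search(text)
        if match:
            return scheme, match.group(1)
    return None

print(extract_identifier("Great paper! https://doi.org/10.1371/journal.pone.0012345"))
# ('doi', '10.1371/journal.pone.0012345')
print(extract_identifier("Everyone should read that humanities piece"))
# None
```

The second example is exactly the kind of activity described above: a genuine scholarly conversation that leaves no trackable trace because no identifier-bearing link is shared.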
We need to point out that pre-2012 papers will not have correct scores because the Explorer will have missed all the traffic about them before then. Therefore when we look at the Altmetric scores of pre-2012 papers we are talking about the minimum bound rather than the actual amount of attention they got.
Now, Who’s Talking? The Ten Articles About the Humanities with Highest Scores
Taking the above into consideration, from our larger dataset we can list the ten papers that at the time of collecting the data appeared as having the highest Altmetric scores in our report:
- Brief of Digital Humanities and Law Scholars as Amici Curiae in Authors Guild. [Deposited 3 August 2012. Tweeted by 61; Blogged by 1; mentions from 9 different countries. Altmetric Article details here].
- The state of the digital humanities: A report and a critique. [Online first version of record 1 December 2011. Mentioned by 21 tweeters and 1 Facebook user; 17 readers on Mendeley; mentions from 6 different countries. Altmetric Article details here].
- Use of social media in graduate-level medical humanities education: Two pilot studies from Penn State College of Medicine. [Published 2011. Mentioned by 18 tweeters and 1 Google+ user; 25 readers on Mendeley and 3 on CiteULike. Mentions from 6 different countries. Altmetric Article details here].
- eScience and the humanities. [October 2007. Mentioned by 2 tweeters and 1 science blog; with 17 readers on Mendeley and 8 on CiteULike. Mentions from 2 countries. Altmetric Article details here.]
- Opening the Wrong Gate? The Academic Spring and Scholarly Publishing in the Humanities and Social Sciences Anthropology and the Humanities. [September 2012. Mentioned by 13 tweeters and 1 Google+ user; with 1 reader on Mendeley; mentions from 4 countries. Altmetric Article details here].
- Anthropology and the Humanities [PDF] [Digitised manuscript published online 28 October 2009; original from October 1948. Mentioned by 1 tweeter and 1 science blog; 13 readers on Mendeley. Altmetric Article details here].
- Challenging traditional premedical requirements as predictors of success in medical school: the Mount Sinai School of Medicine Humanities and Medicine Program. [August 2010. Mentioned by 1 science blog and 1 Reddit thread; 9 readers on Mendeley. Altmetric Article details here].
- Philosophy, Ethics, and Humanities in Medicine: Expanding the open-access conversation on health care. [17 March 2006. Mentioned by 1 science blog and 6 readers on Mendeley. Altmetric Article details here].
- The Evolution of Human Culture: Some Notes Prepared for the National Humanities Center, Version 2. [Deposited 15 August 2010. Mentioned by 1 science blog. Altmetric Article details here].
- Discipline Matters: Technology use in the humanities. [6 December 2011. Mentioned by 7 tweeters; 4 readers on Mendeley and 1 on CiteULike. Mentions from 3 different countries. Altmetric Article details here].
This list is not a “Humanities Top Ten”, a hit parade or an indicator of academic popularity. The list represents a moment in time of the online mentions of those papers which mention the keyword “humanities” in their titles, and as such the list might be telling us more about the levels of social media adoption amongst those interested in the humanities than about the papers themselves.
The reader can click on each article’s Altmetric details in the list above and explore the types of mentions, the mentions themselves and the demographics of Twitter accounts mentioning an article (geolocation and occupation data is based on the information Twitter users provide on their profiles, so it is not always accurate). Having access to this data enables the discovery of new and often unexpected connections, allowing authors, publishers and readers to find out who is linking to articles on social media, writing about them on scholarly blogs or collecting them on online reference managers like Mendeley.
Rather than a measurement or evaluation of the articles themselves or of the academic disciplines the tracked articles might represent, what we are observing is that alt-metrics can provide a measurement of scholarly activity on Twitter and other social media. The prevalence of themes around the medical and digital humanities, technology and social media suggests that these topics are themselves popular on social media, with a considerable number of researchers in those disciplines engaged in online discussions.
Apples and Oranges?
As seen in the list above, the article with the keyword “humanities” with the highest score in the dataset is “Brief of Digital Humanities and Law Scholars as Amici Curiae in Authors Guild” (published 3 August 2012), with an Altmetric score in the time-frame of 65.
The context is important: this paper is amongst the highest ever scored in the Social Sciences Research Network (ranked #27 of 7,154). This score places the article in the top 5% of all articles ranked by attention this year.
Looking into the details we can see that the three authors of the paper, Matthew L. Jockers, Matthew Sag and Jason Schultz, tweeted their own paper, and in turn it was retweeted by other scholars whose profiles identify them as researchers in the specific field the paper belongs to. (See our Fieldwork post on Library Science journals for more on the role of authors and publishers tweeting their own papers).
In contrast, a query looking for articles with the keyword “medicine” in the title which were mentioned at least once in the last year showed “The Burden of Disease and the Changing Task of Medicine” (published 21 June 2012) to be the paper with the highest score in the timeframe: 792. This paper’s Altmetric details show an impressive number of mentions across almost all the platforms tracked. Mentions came from 9 different countries, the same number of countries as for the Digital Humanities and Law paper, with the majority of mentions of both papers coming from the US, Great Britain and Canada. In the case of the Medicine paper, the Explorer did not show that any of the authors had linked to it on any of the tracked platforms.
Though it might indeed seem like we are comparing apples and oranges, it can be argued that social media is already challenging traditional conceptions of scholarly boundaries: 4 out of the 10 papers in our list are about science, medicine and the humanities. Comparing levels of online attention of scholarly papers from or about different disciplines can perhaps work as a mechanism to encourage specific scholarly behaviour online: this initial stage of alt-metrics development and adoption can provide insight into the adoption of social media by scholars.
At this moment in time, alt-metrics can be considered a powerful instrument to create awareness about the uses of social media to discover, discuss and promote research published online and the online audiences which are interested in this research.
All articles and metrics mentioned in this post were obtained using the Altmetric Explorer, and the information was correct at the time of collecting the data. Social media moves quickly, and live Altmetric article details and the data contained in the original dataset may differ from those transcribed in the body of this post.