To date, Altmetric has collected attention data for nearly 4 million research outputs. However, we sometimes receive feedback that we have far more attention data for the sciences than for the humanities and social sciences. We’re also aware of a demand for more altmetrics for datasets, images, monographs and other forms of academic output – not just journal articles.
We’ve already taken steps in this direction – for example we track attention to any datasets or images with a scholarly identifier (such as those that get assigned a DOI via figshare), and have worked with organisations like the Biodiversity Heritage Library and Michigan Publishing to provide metrics for their non-article outputs.
The purpose of this blog post is to provide some insight into how we’re trying to improve and build on these initial steps, and to offer some general commentary on the opportunities and challenges that we and other altmetrics providers face when it comes to data curation.
Tracking other disciplines – challenges and opportunities
In a summary of the recently published HEFCE report on the role of metrics in research assessment, the researchers mentioned “highly variable coverage of metrics across subject areas” as a concern. It’s true that Altmetric do have more data for the sciences than for the humanities and social sciences – and that’s partly a reflection of how often, and in what ways, people discuss these types of research online. One of the challenges of improving data coverage is that we’re dependent on the behaviours of others: tweets, Facebook posts, news stories and blog posts need to exist before we can pick them up. We also pick up less mainstream news coverage for articles published in humanities journals, as an article from a theory-based humanities journal is unlikely to be communicated to the general public in the same way as (for example) an important medical discovery.
However, we have picked up extensive attention data for some humanities journals; the screenshot below (pulled from search results in the Altmetric Explorer) demonstrates that articles from the Journal of American History are consistently referred to in blogs (yellow), on Twitter (light blue), in mainstream news sources (red) and on Wikipedia pages (dark grey).
Recently, we’ve been working to add some more humanities and social sciences-based journals, so that we can start accumulating attention data for their articles. In the last month we’ve added a number of theology journals published by Equinox to the Altmetric database, as well as some law and economics journals published by Edward Elgar publishing. We’re also continually reviewing our list of sources and looking for ways of tracking sources that highlight and discuss research from all academic fields.
Tracking other types of research output
In some ways, it was logical for the altmetrics movement to focus initially on academic articles published in journals. By tracking articles, including those hosted on sites like arXiv and SSRN (which Altmetric have been doing for the last few years), we can provide data that can be used alongside traditional bibliometrics such as the Impact Factor. This gives journal editors and authors a way of benchmarking the attention their content attracts against other journals, and of monitoring the online conversations around published research in real time.
However, it’s important to remember that one of the other advantages of altmetrics is that they can be used to track online attention and influence for lots of different output types.
Software startups such as figshare are very aware of these opportunities and have developed systems to help researchers get credit for all their research activities. When (for example) researchers attach figshare DOIs to their datasets, conference slides, data visualisations and software packages, we accumulate attention data for those outputs as well as for their articles.
The same applies for other platforms such as Dryad, and in fact we now have the technology to track anything in an institutional repository based on a scholarly identifier OR a unique URI embedded in the metadata – such as a URL for a particular piece of content.
Adding URI tracking support means that we can track online activity for any institutional or publisher-generated content, such as press releases or scripts, or anything else hosted on the same domain (speak to one of our team if you’re interested in hearing more about how you can make use of this!).
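To make the identifier-based tracking described above a little more concrete, here is a minimal sketch of how a tracker might recover a scholarly identifier from a page that mentions a research output – either from metadata tags in the page or from a bare DOI in the text. This is purely illustrative: it is not Altmetric’s actual pipeline, and the specific metadata key names used here (such as `citation_doi`) are assumptions for the sake of the example.

```python
import re
from html.parser import HTMLParser

# A DOI starts with a "10." prefix followed by a registrant code and a suffix.
DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[^\s"<>]+')

class MetaIdentifierParser(HTMLParser):
    """Collects identifier-bearing <meta> tags from a page's <head>."""
    def __init__(self):
        super().__init__()
        self.identifiers = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        # Illustrative metadata keys; real pages vary in which they use.
        if attr_map.get("name") in ("citation_doi", "dc.identifier"):
            content = attr_map.get("content")
            if content:
                self.identifiers.append(content)

def extract_identifiers(html_text):
    """Return scholarly identifiers found in meta tags or body text."""
    parser = MetaIdentifierParser()
    parser.feed(html_text)
    found = list(parser.identifiers)
    found.extend(DOI_PATTERN.findall(html_text))
    return found
```

In practice a tracker would also need to resolve redirects, deduplicate identifier variants, and fall back to matching embedded URIs when no DOI is present – the point here is only that a unique identifier in the page metadata is what makes the mention attributable to a specific output.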
Earlier this year, we successfully completed the Bookmetrix project with Springer. For this we collected online attention data, download counts and review snippets for each of the books and chapters hosted on the SpringerLink platform – combining traditional and non-traditional metrics to give the authors, readers and editors of those titles a record (updated in real time) of the online activity taking place around each title.
We’ll continue to experiment with and expand our approach to tracking non-article outputs, whilst taking care to ensure that each source of attention is relevant to the type of output (book reviews for books, for example).
Why does this matter?
Offering feedback and providing insight to authors of non-journal research outputs is a key objective of ours. Altmetrics hold the promise of enabling everybody, no matter their discipline or career stage, to get credit and be recognised for the work that they’re doing. Traditional bibliometrics offer little support here: citations and the Impact Factor focus purely on journal-to-journal referencing.
Gathering evidence of the influence and reach of your work can be particularly difficult for early-career academics, who are keen to progress but have not yet had the time to accrue citations or establish themselves fully in their field. Altmetrics can provide them with evidence to uncover and demonstrate the broader impacts of their work to funding or hiring committees, and even just amongst their peers.
Everyone should get credit for the work that they do, and we think they should also be able to demonstrate the impacts of that work via the easiest route possible. Altmetrics are not the whole picture or the final answer, but they are a useful indicator with the potential to help faculty across all disciplines understand and evidence how their work is being disseminated and interpreted.
Want to get involved?
We’re always on the lookout for new sources and research outputs to track. If you’re a publisher with a list of humanities journals, and you attach unique identifiers to your articles, email firstname.lastname@example.org to find out if we can start tracking your journals. If you’re a researcher with an interest in the humanities and/or social sciences, let us know about popular blogs and news sources in your field, so we can add them to our source lists. All we need to be able to start tracking a blog or news source is a working RSS feed.
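As a rough illustration of what a “working RSS feed” means in practice, the sketch below checks that a document parses as RSS and contains a channel with at least one item carrying a link. This is not Altmetric’s intake code – just an assumed minimal definition of “working”, using only the Python standard library.

```python
import xml.etree.ElementTree as ET

def looks_like_working_rss(feed_xml):
    """Return True if feed_xml parses as RSS with at least one linked item."""
    try:
        root = ET.fromstring(feed_xml)
    except ET.ParseError:
        # Not well-formed XML at all.
        return False
    if root.tag != "rss":
        # Well-formed XML, but not an RSS document.
        return False
    channel = root.find("channel")
    if channel is None:
        return False
    # A usable feed needs at least one item with a link we could follow.
    return any(item.findtext("link") for item in channel.findall("item"))
```

A real checker would also fetch the feed over HTTP, handle Atom as well as RSS, and tolerate namespaced elements, but a quick structural check like this catches the most common breakage.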
In the meantime, we’ve got some other exciting things in development and hope to be able to roll these out more widely soon. As always, feedback is welcome, and we’d love to hear your thoughts on what we’ve been discussing here! Feel free to drop us a line or leave a note in the comments section below.