2014 saw some major developments in altmetrics: new products were launched, working committees formed and more and more researchers, publishers, funders and institutions started to investigate how they might put these new metrics to use.
As we go into 2015 we want to continue this trend. Much of the discussion last year focussed on what the numbers meant and how they should be interpreted. For us, 2015 is about looking beyond the numbers. Although they help identify how much attention and engagement a research output has generated (which is why we have the Altmetric score), and where that attention came from, they do not offer many clues as to what people actually think about the work, and can often be misleading. A case of this was brought to light in a story published in Nature this week, which found that when awarding funding grants the Medical Research Council make a point of looking beyond the external reviewers' scores alone, and focus on their written comments instead.
You may also have seen Altmetric Founder Euan Adie’s blog post from last year, where he discussed how the term ‘metrics’ itself can seem to promise a false solution. As Stephen Curry first pointed out following a HEFCE metrics review day in the UK, the assumption of finite measurement associated with ‘metrics’ is something we should address and be wary of as part of the process of research evaluation. So rather than ‘metrics’, we now think of altmetrics as indicators of engagement and attention. A high Altmetric score, or a count of mentions of a research output from the sources we track, acts as an indicator that there is activity to be further examined, which may then go on to be taken into consideration in evaluation.
We plan to add a number of new sources this year – these are the places we track for mentions of academic articles, datasets, figures and other research outputs. Each time we add a new source we’ll be considering what value it would offer our users, and ensuring it adds context beyond what numbers alone can provide. Within our interfaces we’re careful to make each original mention accessible, meaning that users can actually see who is talking about a piece of work, and what they are saying. Without this, knowing that your article has been picked up by 20 news sites, tweeted by hundreds of people, or received a comment on a post-publication peer-review platform has little meaning.
With the launch of our institutional platform last year we began including references from policy documents. We’re still growing the list of what we track in this space, but have already uncovered and made available thousands of references to academic work. Feedback from research administrators and institutional partners tells us that the addition of this data has been incredibly beneficial for them – evidence of the application of research that would previously have taken weeks to collate is now easy to track and report on.
The debate about how and when altmetrics should or could be applied is ongoing. Steering groups continue their work in the US and UK, and much discussion will take place at industry conferences over the year.
At the same time, researchers are increasingly using our data to provide evidence of their societal impact to funding committees and management, identify potential collaborators and new communities to engage with, identify research worth reading, and to monitor early uptake of their work. Librarians and publishers are considering how they can best support their faculty and readers – many will be keeping a close eye on the ongoing reviews and standards projects taking place in the UK and US. We’ll be offering support and training to these communities throughout the year, and hope to provide some useful case studies that may help generate ideas for your own institutions and platforms.
To begin the year and set things off on the right foot, there’s one thing we’d like to be clear on: quality, importance and impact are not numbers. A number cannot tell you what people think about a piece of research, why they were talking about it or sharing it, or if it was any good. Numbers can identify where there is something to be examined in more depth, but they alone cannot be used to measure the value of research. With altmetrics we have the chance to explore and question, and to make the numbers work for us – not the other way around. This is central to everything we do here, and an approach we’ll be encouraging others in the scholarly community to adopt.