Altmetric Blog

Altmetrics lesson 1: quality, importance and impact are not numbers

Cat Williams, 8th January 2015

2014 saw some major developments in altmetrics: new products were launched, working committees formed and more and more researchers, publishers, funders and institutions started to investigate how they might put these new metrics to use.

As we go into 2015 we want to continue this trend. Much of the discussion last year focussed on what the numbers meant and how they should be interpreted. For us, 2015 is about looking beyond the numbers. Although they help identify how much attention and engagement a research output has generated (which is why we have the Altmetric score), and where that attention came from, they do not offer many clues as to what people actually think about the work, and can often be misleading. A case in point was highlighted in a story published in Nature this week, which found that when awarding funding grants the Medical Research Council makes a point of looking beyond the external reviewers' scores alone, focusing on their written comments instead.

You may also have seen Altmetric Founder Euan Adie's blog post from last year, where he discussed how the term 'metrics' itself can seem to promise a false solution. As Stephen Curry first pointed out following a HEFCE metrics review day in the UK, the assumption of finite measurement associated with 'metrics' is something we should address and be wary of as part of the process of research evaluation. So rather than 'metrics', we now think of altmetrics as indicators of engagement and attention. A high Altmetric score, or a count of mentions of a research output from the sources we track, acts as an indicator that there is activity to be examined further, which may then be taken into consideration in evaluation.

We plan to add a number of new sources this year – these are the places we track for mentions of academic articles, datasets, figures and other research outputs. Each time we add a new source we'll be considering what value it would offer our users, and ensuring it adds context beyond what numbers alone can provide. Within our interfaces we're careful to make each original mention accessible, meaning that users can actually see who is talking about a piece of work, and what they are saying. Without this, knowing that your article has been picked up by 20 news sites, tweeted by hundreds of people, or commented on in a post-publication peer-review platform has little relevance.

With the launch of our institutional platform last year we began including references from policy documents. We’re still growing the list of what we track in this space but have already uncovered and made available thousands of references to academic work. Feedback from research administrators and institutional partners tells us that the addition of this data has been incredibly beneficial for them – what would previously have taken weeks to collate in the search for evidence of the application of research is now easy to track and report on.

The debate about how and when altmetrics should or could be applied is ongoing. Steering groups are at work in the US and UK, and much discussion will take place at industry conferences over the year.

At the same time, researchers are increasingly using our data to provide evidence of their societal impact to funding committees and management, identify potential collaborators and new communities to engage with, identify research worth reading, and to monitor early uptake of their work. Librarians and publishers are considering how they can best support their faculty and readers – many will be keeping a close eye on the ongoing reviews and standards projects taking place in the UK and US. We’ll be offering support and training to these communities throughout the year, and hope to provide some useful case studies that may help generate ideas for your own institutions and platforms.

To begin the year and set things off on the right foot, there’s one thing we’d like to be clear on: quality, importance and impact are not numbers. A number cannot tell you what people think about a piece of research, why they were talking about it or sharing it, or if it was any good. Numbers can identify where there is something to be examined in more depth, but they alone cannot be used to measure the value of research. With altmetrics we have the chance to explore and question, and to make the numbers work for us – not the other way around. This is central to everything we do here, and an approach we’ll be encouraging others in the scholarly community to adopt.

18 Responses to “Altmetrics lesson 1: quality, importance and impact are not numbers”

InvestigaUNED (@InvestigaUNED)
January 8, 2015 at 12:00 am

Altmetrics lesson 1: quality, importance and impact are not numbers http://t.co/AYvKMWO3lq

Digital Science (@digitalsci)
January 8, 2015 at 12:00 am

Quality, importance and impact are not numbers - an important #altmetrics lesson http://t.co/JvP0Q4zeRe

Lou Woodley (@LouWoodley)
January 8, 2015 at 12:00 am

Altmetrics lesson 1: quality, importance and impact are not numbers: http://t.co/U5H0Rud4gS

Xavier Lasauca (@xavierlasauca)
January 8, 2015 at 12:00 am

#Altmetrics lesson 1: quality, importance and #impact are not numbers, by @altmetric http://t.co/ydI0c9aYwW #research

Impactstory (@Impactstory)
January 8, 2015 at 12:00 am

Well-said! MT @altmetric: #Altmetrics lesson 1: quality, importance and impact are not numbers http://t.co/Kz5GzothU1 #scicomm #libchat

Fabrice Leclerc (@leclercfl)
January 8, 2015 at 12:00 am

Top #openedu story: Altmetrics lesson 1: quality, importance and impact are not… http://t.co/SditzOFev6, see more http://t.co/BLfazjgzZZ

@physicsteo
January 8, 2015 at 12:00 am

.@Altmetric lesson #1: quality, importance & impact are not numbers: http://t.co/M1OMLcwKwi HT @LouWoodley <- Numbers are tools, not goals

@BetsyDonohue
January 8, 2015 at 12:00 am

Quality, importance and impact are not numbers ... http://t.co/UPdIhxt1em #altmetrics

@Biblioteca_UNED
January 9, 2015 at 12:00 am

@InvestigaUNED: Altmetrics lesson 1: quality, importance and impact are not numbers http://t.co/WTStIVQZCV; http://t.co/w2InPOdfax

@UCDSciEx
January 9, 2015 at 12:00 am

#scicomm #Altmetrics lesson 1: quality, importance and impact are not numbers, by @altmetric http://t.co/wab7jFaqgN #research @UCD_Research

Digital Science (@digitalsci)
January 9, 2015 at 12:00 am

Research quality, importance and impact are not numbers - an important #altmetrics lesson http://t.co/JvP0Q4zeRe

@Research2Action
January 9, 2015 at 12:00 am

#Altmetrics lesson: Quality, importance and impact are not numbers http://t.co/pQ8QWXvh9B via @altmetric #research #impact

@nishadoshi
January 10, 2015 at 12:00 am

Numbers alone cannot be used to measure the value of research http://t.co/IswP10MJn0

Chunli Liu
January 10, 2015 at 12:00 am

I agree with your idea. Most important, it would avoid the drawbacks of bibliometrics alone to the evaluation of research

Frank Nuijens (@FrankNu)
January 12, 2015 at 12:00 am

Altmetrics lesson 1: quality, importance and impact are not numbers http://t.co/oGOUAH7myQ

Science journalism (@sci_journalism)
January 12, 2015 at 12:00 am

Altmetrics lesson 1: quality, importance and impact are not numbers http://t.co/AiRYu87JyZ

Leibniz-Science2.0 (@lfvscience20)
January 12, 2015 at 12:00 am

Altmetrics lesson 1: quality, importance and impact are not numbers http://t.co/egFeTO9H4K

@LisaStirUni
February 3, 2015 at 12:00 am

Research quality, importance & impact are not numbers : http://t.co/adg0tG4GPP via @altmetric
