Elizabeth Gadd, Research Policy Manager at Loughborough University, shares her experience of academic reactions to the Altmetric Explorer:
Last week Loughborough University’s Altmetric Explorer trial subscription sprang into life and was launched to staff via two training sessions expertly run by Natalia Madjeravic. What follows here are some comments we had from academics during and immediately after the training sessions.
They are indicative, I think, of some of the uncertainty around altmetrics generally, and also some of the specific concerns we’re going to need to address over the coming months as we test the service across the university.
“What are we supposed to do with this?”
No, this comment wasn’t from someone who’d never heard of altmetrics before. This was from an altmetrics aficionado who called a meeting with me to show how his Altmetric scores rocketed after persuading his publisher to make his papers open access. He was used to donuts – he knew what they meant: they helped him analyse the reach of individual papers. It was just not immediately obvious to him why we would want to aggregate all of Loughborough’s data together. Could there be some sinister ulterior motive?
In actual fact the opposite is true: Loughborough University is keen to work with academics to better understand the meaning and application of altmetrics institutionally. Communicating this will be key to overcoming any suspicions that Altmetric Explorer will become another set of management statistics.
“Is this another rabbit hole for us to run down?”
I don’t know whether you’ve noticed, but academics aren’t short of things to do. And activities such as looking up their h-index on Google Scholar are seen as ‘rabbit hole’ activities for moments of writer’s block or delayed train journeys. You think it will just take five minutes, but one hour later you realise that that was 60 minutes of your life you’ll never get back. So, is Altmetric Explorer another rabbit hole full of fascinations but with no particular purpose? Or does it offer more than that? Helping academics understand the range of uses to which this data can be put, perhaps with some institutional case studies, will be useful in this regard.
“So what’s a good number?”
The beauty of Altmetric Explorer is the rich context it provides around the attention, engagement and potential impact of our research. But it does also provide numbers: donut numbers and scores in context. So the inevitable question arises: what’s a good number? And this isn’t a question Altmetric Explorer can confidently answer yet – certainly not at author, department or institution level. In a world of advanced citation indicators for anything that has papers associated with it, altmetric indicators are at an early stage of development. I think we will need to re-focus academic attention on the context rather than the numbers, at least in the short term.
“Right, we have to do better than Department X!”
As soon as you start putting numbers on things, the competition starts. The draw to the top of the imaginary league table can be very strong. In one way, I’m not too worried about individuals or groups wanting to increase their numbers because that, to me, probably means they’re taking a greater interest in increasing the visibility of their research. However, the competition element needs to be carefully handled. Just as it’s never good practice to compare raw citation counts of different sized groups in different disciplines, it should certainly be avoided in altmetric terms, where there is even less agreement around the meaning of what’s being counted.
“Are there any prizes?”
This comment is similar to the last, but highlights even more clearly that academics are naturally ambitious. The thought that there may be prizes – be they reputation, visibility, or internal recognition – could lead people to view the garnering of altmetric attention as a bit of a game that can be played to win. And gaming doesn’t serve anyone, least of all scholarship. But there is a growing school of thought that visibility is at least as important as quality in scientific communication. The challenge will be to encourage activity that promotes research visibility for its own sake rather than to boost the numbers.
So, two weeks into our Altmetric Explorer subscription we have already found the process very instructive. Academics clearly need clarity as to how the institution proposes to use this new tool and how they as individuals would be expected to engage with it. This is no mean feat for a newly subscribed institution that wants academics to feel free to explore and use the data in as many creative ways as they can. We also recognise the importance of focussing academic attention on the context this data provides, rather than the numbers per se.
However, if I were forced to answer the question, “so, what’s a good number?”, I would answer, “one that serves to encourage academics to think about the visibility of their work.” And Altmetric Explorer certainly seems to be doing that.