“Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge.”
On February 3rd, RWTH Aachen hosted an event on the subject of alternative metrics for research, entitled “Publish or Perish: Alternative Indicators for Research Evaluation”, inviting both me and Juan Gorraiz of the University of Vienna to come and address around 100 researchers.
Juan, a seasoned and enthusiastic bibliometrics expert, began the roughly three-hour session, warming up the audience with what is by far the most inventive title for a metrics-based presentation I have come across yet (though it renders rather strangely into English): “A view over the corner of the dinner-plate into the metrics Universe.” A nice image, which, for reasons I can’t surmise, reminds me of Douglas Adams’ The Restaurant at the End of the Universe.
Tracing the field from Eugene Garfield onwards, Juan managed to sum up the entire history of bibliometrics and scientometrics in just under an hour. A clear message in Juan’s talk, and one you will find our blog writers restating time and again, was that alternative indicators, or altmetrics, are useful for some applications and not recommended for others. Just as one would never use pure citation counts to judge whether an article was positively or negatively received, Juan outlined both the benefits of altmetrics and some of the pitfalls. High numbers are not necessarily good numbers. Low numbers aren’t necessarily bad. The topic of metrics gaming featured healthily (as it should – and, as I later pointed out in my succeeding talk, it should also be borne in mind that such charges of gaming are equally to be found in the arena of citation counts).
The audience Q&A for Juan’s talk raised some interesting questions. The very first question made the quite sensible point that some of the more esoteric or inaccessible research outputs, such as a new type of catalysis (we were, of course, dealing with one of the finest Chemistry departments in Germany), would not exactly be of interest to the casual user of the many social media platforms which Altmetric tracks. This is absolutely correct.
However, as I then went on to outline in my succeeding talk, what about when an author, operating within any discipline, complex or accessible, finds themselves praised or queried for an elaboration or explanation online in one of the many sites like Reddit or Twitter? Should these questions go unanswered simply because these platforms haven’t found a way of alerting every author of every article which comes under consideration?
Or, far worse, what if you end up being accused of fraud, plagiarism or any other charge, and find your work hauled up on PubPeer, Publons, or berated on Twitter by other researchers? Do you imagine you’ll hear about this in a timely fashion? A brief tour of PubPeer’s comments suggests otherwise; all too often authors are blithely (and quite naturally) unaware that in some remote corner of the internet, their work is coming under the challenge of post-publication peer review. It is not at all unusual for an author of a major and high-profile paper to find out up to 6 weeks after an initial complaint against their work that all is less than rosy. This can lead to rushed, ill-considered responses. Which author would not wish to know about these kinds of developments, to borrow again from Douglas Adams, at the speed at which Bad News is known to travel?
This, I explained, is part of the value proposition of Altmetric. All the mentions, from across disparate sources, collated helpfully and with the ability for alerting to be built right into the system; and – most importantly – without yet another profile to be filled out by the researcher.
Keeping on top of the myriad discussions about one’s work, or more generally one’s discipline, is a task that can’t be done by hand. Or if you do decide to be the author who manually finds each reference to their work, you can’t also meaningfully be a researcher at the same time. Both would be, and are, more than a full-time job on their own.
To switch gears, I used an example of some research published less than a month before the event: the Liu Institute for Climate Studies had recently released a stunning piece of research; over the last 50 years, humanity has lost 10% of its crops to violent climate events. The Liu Institute’s actual wording is much more sober than mine, but the reverberations across the news, blogosphere, social media and other sources of non-academic attention clearly attest that this story hit a major nerve across the globe. International news sites have run 78 news stories; over 200 Tweets and a dozen blog entries have been created in reaction. At some point, perhaps, we can expect policy developments to join the reaction media – perhaps by the IPCC, taking this research into account; who knows? Well, when it happens, we will.
So – where are the citations to tell this particular impact story? Of course, within a month, no such citations could ever exist. Science is hard, and its mighty wheels grind slowly but exceedingly fine for good reason. To earn just a single academic citation, a researcher has to read this paper; they, and perhaps several co-authors, have to respond, not in the form of a hastily scribbled email, but a finely formatted, researched piece of literature – possibly including experiments or data analysis. Then we have those pesky human biases to clear out of the process, and the small issue of getting a manuscript accepted and published to get out of the way, of course. And that’s before we even wait for indexing services to keep track of all this. A month? Of course that’s not a realistic or a desirable goal. We are talking years. And that’s fine.
But – when it comes to social impact, we are dealing with a flat-out different, but equally worthy, timeline. The Liu Institute had reached 11 news sites in the first 5 days, and 70 news stories after 19 days; stories ran in Danish, Italian and English, to audiences of millions. Institutions should be empowered to tell these short-term stories.
All in all, a fascinating event, and we were very pleased to be able to attend. Now, I’ll be getting back to the fascinating developments about gravitational waves which were revealed just yesterday (Altmetric tracked 50 stories in 24 hours around this news – see our collection of coverage here).