It’s been a busy week here in London – not least because we hosted our first fully-fledged institutional event here on the Macmillan Campus. Aptly titled “Altmetricon”, the day brought together a great group of enthusiastic attendees, including librarians, research officers, academics, policy, funder and think-tank representatives, and a few publisher folk thrown in for good measure.
Up first on the program was Altmetric Founder Euan, who gave a brief overview of activity in the field of altmetrics in the last year. Euan highlighted that it is key to remember the difference between “attention”, “impact”, and “quality” – terms that are often intermingled but clearly do not carry the same meaning.
A presentation from Jane Tinkler, Senior Adviser to the Parliamentary Office of Science and Technology, came next. Jane gave a useful overview of the work that had been done around the HEFCE metrics review – both in the run up to and after the report had been published. Jane too emphasized that metrics cannot be used to measure the quality of an academic output, and discussed the need for promoting ‘responsible metrics’. To do so, she noted, requires input and transparency from a range of stakeholders in the scholarly community: institutions’ hiring committees, publishers, funders, and researchers themselves.
Offering some insight from the funder’s perspective, Adam Dinsmore from the Wellcome Trust shared their experience of using altmetrics as part of their evaluation process. For the Wellcome, he noted, broader impacts come in many forms and reach a variety of audiences – spanning everyone from school children to policy makers to visitors to their exhibitions gallery. The application and influence of their research both act as key indicators that they take into account as part of their evaluations – and attention data such as that provided by altmetrics can be a useful indicator for this. Adam noted that altmetrics have given them additional insight and a better understanding of scholarly communication, and that it was vital to have a robust, connected scholarly infrastructure for metrics to reach their full potential.
Following a short break there was the opportunity to hear from two researchers: Melodee Beals, a Lecturer in History at Loughborough University, and Jon Tennant, a PhD student at Imperial College London. Melodee spoke of her experience of integrating altmetrics into her workflows and trying to encourage her fellow researchers to do the same. Metrics surrounding humanities content have always been harder to gather than those for scientific outputs, with many outputs often being under-represented. Melodee spoke of the need for good impact to be ‘purposeful’ – meaning that an academic should know what impact they want their work to have, and why, and should embark on the most effective activities to achieve that impact. She too championed consistent identifiers and a solid infrastructure – highlighting the use of an ORCID iD to help researchers be easily identified and rightly credited for their work.
Jon kicked off his talk by sharing his love for the Kardashian Index – a popularity ranking that attracted much attention when it was released online earlier this year. In his discussion of altmetrics he referred to a ‘basket of metrics’ – noting that they could be useful for helping to better understand the re-use of research and have become an integral part of the open science toolkit. He too noted the need for the responsible use of metrics, and the role of publishers in de-emphasizing reliance on the Impact Factor as a measure of success. Altmetrics don’t solve our problem of how to measure the impact of research, said Jon, but they do help us to think more about its societal reach.
After lunch the more product-related sessions began. Liam Cleere from University College Dublin gave an overview of their experience of exploring Altmetric data to date. As Ireland’s largest university and the national leader in research funding, UCD place a focus on their international research excellence. Unlike the UK, Ireland does not have a REF-like program for the evaluation of research. Historically, most of the research evaluation that has been undertaken has been based on traditional bibliometric analysis of academic citations. A recent project by the UCD University Research Strategy Board (URSB), “Beyond Publications”, was set up to “investigate the definitions, evidence and systems for capturing outputs beyond publications, and the impacts and benefits of that research from the perspective of the university.” The result was the “Furthering the research impact of University College Dublin” report, published in May 2014. Beyond this the University has moved to trying to capture and understand a much broader picture of the impacts of their research – looking beyond standard scholarly measures to try and gauge the real influence of their work on the bigger social agenda. A big part of this is educating their academics on how to get their research out there and noticed – activity that they are able to analyse and report on through the use of Altmetric for Institutions. A multi-phase project, Liam and his team will be producing materials and running workshops to help strengthen the institution’s new approach to defining and monitoring impact.
Last up from our guest speakers was William Nixon from Glasgow University. William and his team recently implemented Altmetric for Institutions to enable them to gather better data on where their research was being shared and discussed – data that they hope will help their researchers become more engaged with actively promoting their work. Along with site license access to Altmetric for Institutions, Glasgow have also embedded the Altmetric badges into their institutional repository – encouraging researchers to deposit their publications so that they can see the altmetrics data for their work. In rolling out the new platform amongst their faculty, William and others from the library have been attending department meetings, running workshops and speaking to the various departmental committees. A focus of their discussions has been helping researchers understand how they can use altmetrics – tracking the attention around their work, identifying interesting coverage to include in grant applications, and discovering previously unknown citations for their work in public policy documents. Some of William’s team have even signed up to the Altmetric Ambassadors program – and plan to use the slide decks and other materials made available to them to run further workshops and introductory sessions.
Post-sugar rush from some huge Altmetricon donuts, the remainder of the day was spent discussing all things Altmetric. Our Product Development Manager Jean gave an update on the latest Altmetric developments, and provided a sneak peek of what’s to come in the next 6 months.
Attendees were then split into two groups for some interactive workshops: one focussed on sources for non-journal content (what would be most relevant to track, and why?), and the other on rolling out altmetrics within an institution (what’s worked in the past with other platforms? what materials would you make available, and how would you spread the word?).
And at last, it was time for a drink! The Altmetric team thoroughly enjoyed the day and found it really useful – and we hope all of our attendees did too. Thanks again to all of our brilliant guest speakers, and we hope to host another event like this in future!