How can alternative digital methods of scholarly assessment maximise the public impact of academic research? This is a question that all of us interested in alt-metrics have been asking ourselves, not least because “impact” itself is a term that can be understood in many different ways.
With this in mind we attended “The Future of Academic Impact” conference organised by the London School of Economics Public Policy Group on Wednesday the 5th of December 2012 at Senate House, London, which sought “to look forward to how impact research and measurement might develop over the next ten-year period looking beyond REF2014.” (REF stands for the Research Excellence Framework.)
The event marked the end of the three-year Impact of the Social Sciences project based at the London School of Economics and funded by HEFCE. This project looked at “the nature and measurement of impact of academic research in the social sciences on government and policy-making, business and industry, and civil society.”
The conference took place in three different areas, with the principal panel discussion sessions hosted in the main hall and more practical “breakout” sessions happening simultaneously in other smaller rooms. We were particularly interested in the four main sessions, which were titled “The Economic Impact of Academic Research”, “Impact and the New Digital Paradigm”, “Next Steps in Assessing Impact”, and “Impact as a Driver for Open Access”.
These discussions brought together senior figures overseeing the current methods of assessment and funding in UK Higher Education and key figures engaged in developing and promoting alternative digital methods for maximising the impact of academic research.
The Economic Impact of Academic Research
The first session was introduced by Professor Patrick Dunleavy from the LSE. Nicola Dandridge (Chief Executive, Universities UK) spoke of the different types of impact and made a call “not to shy away from the important question of the economic contribution of research”. She emphasised the need to build “a coherent narrative” about the economic impact of research in order to present the case more effectively to policymakers.
Professor Sir Adrian Smith (Vice Chancellor, University of London) followed Dandridge, addressing the need to understand how decisions are taken in government. He asserted that the argument about the need to fund academic research could not be won by “simply presenting statistics on citations”. Nevertheless, the Vice Chancellor of the University of London spoke of the role of big data as a catalyst for maximising impact, saying that “big data sets are easily accessible and usable. Data is a driver of business growth”.
Sir Adrian Smith also referred to the role of the Technology Strategy Board as the “interface” or “broker” between universities, the government and businesses. He also explained that at Whitehall it did not matter which discipline research belonged to. To this, a member of the audience commented from the floor that this might be because there is a strong dominance of STEM over other disciplines like the social sciences. One of the conclusions of the panel was that universities need to communicate more effectively with the public and private sectors. As a member of the audience put it, “disappointment with the failure of academic communication is not enough; we need to resource academic dissemination as a function.”
Impact and the New Digital Paradigm
The second and third sessions offered passionate, engaging discussions presenting two different paradigms of scholarly communications. Dr Victor Henning (Co-Founder & CEO, Mendeley Ltd) presented online reference managers such as Mendeley as the future of open, collaborative research, where peer review is transformed into transparent, collective assessment through “the wisdom of the crowd”. Ziyad Marar (Global Publishing Director, SAGE) advocated for peer review and the role of publishers in ensuring research excellence and authority. He described academic publishing as “multifaceted” and argued that not all journals can be equally compared since they all have different measures of output. Marar argued that “scholarly reputation goes beyond popularity” and also called on journals to “enable young scholars to establish their authoritative voices”.
Jason Priem (Co-Founder, ImpactStory) talked of alt-metrics and revolutions. His keywords were conversation, stories, analysis and data. Using a freer definition of “impact” (that is, not constrained to the definitions currently being discussed in the UK in the setting of the REF), he emphasised impact’s multi-dimensionality: there can be impact on different audiences and there can be different types of engagement. He addressed the resistance to using quantitative methods to assess research outputs by saying that “ideas, though ethereal, do have consequences in the physical realm, and we can track that.”
Priem’s talk presented alt-metrics as an alternative to traditional citations, describing bibliometrics as “a network of ideas”. “Bibliometrics measure citations”, said Priem, but “alt-metrics measure impact”. He also clarified that alt-metrics do not exclude qualitative assessments. The panel seemed to display a clash between different worlds: one represented by publishing models that build reputation on elite knowledge, and another where elite knowledge is openly shared and the emphasis is on the conversation.
The Q&A after this session featured a lively debate between Henning, Marar and Priem. The discussion revealed key disagreements between one camp (represented by Henning and Priem, speaking for open online collaborative publishing and alt-metrics) and the other (represented by Marar, speaking for peer-reviewed journals). Marar finished the session by suggesting to Henning and Priem that advocating for alternative methods of research assessment would be dangerous, “turning social scientists into journalists.”
Next Steps in Assessing Impact
In the third session, Cameron Neylon (Senior Scientist, Science and Technology Facilities Council/Advocacy Director, PLoS) provided a compelling demonstration of why article-level metrics can inform qualitative assessments of impact, and said that “reaching the right people is not reaching a lot of people but reaching those critical people”. Neylon echoed some of the ideas expressed by Henning in the previous session, concluding that “it’s not about where you publish, but who you reach.” He called for universities to take leadership and embrace innovation. He articulated and extended these ideas in an article published on the LSE Impact Blog the following day.
One of the highlights from the conference was seeing David Sweeney (Director, Research, Innovation and Skills, Higher Education Funding Council for England) present his views on impact assessment (REF) in the context of this conference. He explained that HEFCE is “interested in capturing what everybody in the academy does” and said the REF wants to look at how some institutions (not individuals) have been more successful at doing research than others. “We are asking for case studies providing evidence of research impact on society”, he said, but “you will not persuade government by drilling down”.
Sweeney clarified that the REF is concerned not with citations and impact on UK researchers and institutions, but with impact on society at large. Everyone interested in exploring how alt-metrics might inform case studies submitted to the REF was glad to hear him reply to a question from the floor that “what Cameron Neylon described [alt-metrics] might be more useful [as evidence of impact] than traditional citations”.
Impact as a Driver for Open Access
This last session focused on the benefits of Open Access and its potential to maximise research impact. Robert Kiley (Head of Digital Services, Wellcome Trust) explained that the Wellcome Trust’s Open Access policy had been triggered by an “access denied” message, saying that the Trust’s main reason to invest in Open Access research was “to maximise return on our investment”. (Kiley was a member of the Working Group on Expanding Access to Published Research Findings chaired by Dame Janet Finch CBE.)
Kiley also spoke of Creative Commons licenses and alt-metrics as drivers for impact, and showed an example of a Nature article featuring article-level metrics as provided by the Altmetric Explorer. Kiley’s key take-home message was that to maximise the impact of their research, researchers need to make it Open Access. Mark Thorley from the RCUK Research Outputs Network explained the Research Councils’ Open Access Policy agenda and its preference for the Gold model (research funders, not readers, pay for publication) over the Green one (authors self-archive in institutional repositories).
The conference Twitter backchannel was very lively, hosting discussion and the sharing of links and resources presented during the event; it also provided a space for critique of the “Impact Agenda” and for remote participation. At the time of writing we have archived 1506 tweets tagged with #LSEImpact, of which 607 were retweets; the tweets contained 275 links.
You can see a dataset of the 1501 tweets hashtagged with #LSEImpact that we collected up to Thursday 6 December here. You can also explore an interactive archive of the backchannel we put together using Martin Hawksey’s TAGSExplorer here (it is updated every 30 minutes).
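For readers curious how such backchannel statistics can be derived, the sketch below shows one way to count tweets, retweets and shared links from a TAGS-style spreadsheet export. This is a minimal illustration under stated assumptions: the file name `lseimpact_archive.csv` and the `text` column are hypothetical, and retweets are detected by the conventional “RT @” prefix, which is only an approximation.

```python
# Minimal sketch: compute backchannel statistics from an archived
# hashtag dataset. The CSV file name and the "text" column are
# assumptions, not the actual dataset described in the post.
import csv
import re

def backchannel_stats(texts):
    """Return (total tweets, retweets, links) for a list of tweet texts."""
    total = len(texts)
    # Heuristic: old-style retweets begin with "RT @username".
    retweets = sum(1 for t in texts if t.lstrip().startswith("RT @"))
    # Count every http(s) URL embedded in the tweet texts.
    links = sum(len(re.findall(r"https?://\S+", t)) for t in texts)
    return total, retweets, links

if __name__ == "__main__":
    with open("lseimpact_archive.csv", newline="", encoding="utf-8") as f:
        texts = [row["text"] for row in csv.DictReader(f)]
    print(backchannel_stats(texts))
```

Run against a real archive, this would yield figures like the totals reported above; the retweet heuristic will miss “quoted” retweets, so treat the counts as indicative.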
A couple of tweets tagged with #LSEImpact sum up two relevant conclusions from the event:
Key message from #LSEimpact social scientists need to share their research more effectively with policy makers & engage in public debates
— Shirley Ayres (@shirleyayres) December 5, 2012
— Ellen Harries (@el_bells) December 5, 2012
The conference addressed many of the recurring anxieties about alt-metrics, measurement instruments, assessment, funding, quality and authority that we saw emerge during this year’s SpotOn London conference. The discussions at both events revealed how differently “impact” is understood, which goes hand in hand with the need to communicate more effectively that “alt-metrics” means different things to different people and that these metrics are a means to an end.
“The Future of Academic Impact” was an excellent opportunity to come face to face with the key influencers in different camps of academic research assessment. It became apparent that senior figures actively involved in negotiating with policymakers agree that neither the economic impact of academic research nor its public impact beyond academia can be demonstrated successfully through citations and bibliometrics alone.
The conference demonstrated very clearly, though, that a new paradigm of scholarly communications is emerging, challenging the prevalence of more traditional methods. Open data and data-driven research were recognised as key drivers of business in the UK. Importantly, the conference showcased the potential of alt-metrics to provide evidence for REF case studies, as HEFCE openly acknowledged during the event.
The recognition that online collaborative methods, open access publishing and alt-metrics can play an important role in maximising the public impact of academic research outside academia still has some way to go, but the first steps have already been taken.