Impact Assessment

How do researchers and the public engage with academic research online, how does this engagement vary across fields, and why does it matter? Can evaluating this engagement provide reliable evidence of research impact outside academia? In the “Fieldwork” series of blog posts, we will explore, through journal- and article-level stories, whether alt-metrics can offer a holistic picture of impact on diverse audiences.

Alt-metrics, the quantitative and qualitative “study and use of scholarly impact measures based on activity in online tools and environments” (Priem et al 2012), has until now focused primarily on the science, technology, engineering and mathematics (STEM) fields. Understandably, there has been some resistance from scholars in the humanities and social sciences to what might be called a renewed conflict of the faculties: the perceived threat of quantitative approaches becoming the paradigmatic model for the evaluation of all other academic disciplines and outputs. Indeed, some humanities scholars and commentators have objected to the quantification of what, it has been argued, cannot be measured.

Nevertheless, major academic publishers such as Oxford University Press, Cambridge University Press, Palgrave and Taylor and Francis, to name a few, publish journals in both STEM and the humanities and social sciences (as well as scientific journals that also publish articles relevant to fields beyond STEM). The data on online engagement with research that we can obtain from them is discipline-agnostic; it is the online behaviour of researchers, and their interactions with the outputs of different disciplines, that can differ significantly. Needless to say, STEM scientists are not the only researchers using the web to mediate their interaction with scholarly activity; humanities and social sciences scholars also use online tools and environments (social media, blogs, online reference managers, etc.) to engage with scholarly literature and with events such as lectures, conferences and symposia.

Perhaps unsurprisingly, the Internet Society’s 2012 Global Internet User Survey showed that ninety-eight percent of users agreed the Internet is essential for their access to knowledge and education. A recent study surveying 2,000 researchers (Rowlands et al 2011) found that the majority of the arts and humanities (79.2%) and social sciences (84.0%) researchers surveyed were including social media tools in their workflows. These results indicate that online access to and interaction with published research is no longer exclusive to STEM fields, but is becoming standard practice across a wide spectrum of disciplines. Online attention to published research and other types of academic activity can therefore be one way of assessing their outreach or impact.

This is where the Research Excellence Framework (REF) comes in. In the United Kingdom, the REF is the system for assessing the quality of research in higher education institutions. Funding bodies intend to use the outcomes to inform the allocation of research funding to higher education institutions, and the exercise aims to provide accountability for public investment in research and evidence of its benefits. An institution’s research reputation will be backed up (if not defined) by the outcome of its REF assessment.

The REF works as a process of expert review, carried out across all research disciplines, which are divided into 36 subject areas. Universities can submit all types of research, funded from any source, and an expert panel will assess and grade their submissions, taking into consideration three weighted elements (a simple illustration of the weighting follows the list):

  • The quality of research outputs (65%)
  • The impact of research beyond academia (20%)
  • The research environment (15%)
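
For readers unfamiliar with how weighted elements combine, the sketch below shows a plain weighted average of three invented sub-profiles under those percentages; it is an illustration of the weighting only, not the REF’s official calculation.

```python
# Illustration only: combining hypothetical REF sub-profiles into an overall
# quality profile using the weightings above (outputs 65%, impact 20%,
# environment 15%). All figures below are invented.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

# Each sub-profile: the percentage of the submission judged 4*, 3*, 2*, 1*
# or unclassified (each row sums to 100).
sub_profiles = {
    "outputs":     {"4*": 30, "3*": 40, "2*": 20, "1*": 10, "u/c": 0},
    "impact":      {"4*": 40, "3*": 40, "2*": 20, "1*": 0,  "u/c": 0},
    "environment": {"4*": 50, "3*": 25, "2*": 25, "1*": 0,  "u/c": 0},
}

overall = {
    star: round(sum(w * sub_profiles[element][star] for element, w in WEIGHTS.items()), 1)
    for star in ("4*", "3*", "2*", "1*", "u/c")
}
print(overall)  # roughly {'4*': 35.0, '3*': 37.8, '2*': 20.8, '1*': 6.5, 'u/c': 0.0}
```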

Impact will be assessed jointly by academics and research users on the REF expert panels. The term ‘impact’ is used to mean

“any effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia” (REF 02.2011; 26)

This definition explicitly focuses on impacts outside UK Higher Education institutions. For the REF, universities are responsible for gathering and submitting evidence of the public impact of their research, which should include case studies of impacts that took place between 2008 and 2013, as well as an impact strategy detailing how the university engages with users and facilitates impact from its research. The claims institutions make about their impact must be verifiable and backed up by evidence.

The REF seeks “to assess the impact of excellent research”, but it needs to be clarified that its focus is on the impact of the submitted unit’s research, not the impact of individuals or of individual research outputs. Importantly, though, the latter may contribute to the evidence of the submitted unit’s impact. (As a related example, the Becker Medical Library Model for Assessment of Research Impact is a framework for tracking the diffusion of research outputs and activities in order to locate indicators that might demonstrate evidence of biomedical research impact.)

For example, the REF’s Main Panel D Criteria (covering Area Studies; Modern Languages and Linguistics; English Language and Literature; History; Classics; Philosophy; Theology and Religious Studies; Art and Design: History, Practice and Theory; Music, Drama, Dance and Performing Arts; Communication, Cultural and Media Studies; and Library and Information Management) provides a range of examples of what research units in these fields can submit as evidence of impact, including:

  • Quantitative indicators, for example broadcast and other media data, download figures, or database and website hits over a sustained period; evidence of use of educational materials arising from the research; audience figures and visitor numbers at exhibitions, events and performances.
  • Critiques or citations in reviews outside the academic literature and in users’ media, including online documents, blogs, postings and other forms of recorded feedback; evidence of inclusion in teaching materials or teaching bibliographies; evidence of uptake of research in documents produced by public or commercial bodies; citations in policy documents and reviews, or other published reports on policy debates.
  • Public engagement: information about the number and profile of people engaged and the types of audience; follow-up activities or media coverage; evidence of downloads of linked resources or access to web content; descriptions of the social, cultural or other significance of the research insights with which the public have engaged; evaluation data; user feedback or testimony; critical external reviews of the engagement activity.

This is where, regardless of academic field, alt-metrics data could potentially contribute to strengthening the evidence informing impact case studies, or eventually help generate a culture that does. Metrics and international demographic data on online activity, reflecting the attention paid to research by other researchers, the media and the general public, can be gathered, filtered and interpreted qualitatively to become one of several indicators contributing to the development of the narrative case studies used as evidence of impact beyond academia.
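
As a very rough illustration of that “gather, filter, interpret” step, the sketch below filters a handful of invented mention records down to non-academic audiences and tallies them by audience type and country. The record structure and audience labels are hypothetical, and real case-study work would add qualitative reading of each individual mention.

```python
# A deliberately simple, hypothetical sketch of the "gather, filter, interpret"
# step: the mention records and audience labels below are invented.
from collections import Counter

mentions = [
    {"source": "twitter", "audience": "general public", "country": "GB"},
    {"source": "twitter", "audience": "researcher",      "country": "MX"},
    {"source": "news",    "audience": "media",           "country": "US"},
    {"source": "policy",  "audience": "policymaker",     "country": "GB"},
]

# Keep only attention from outside academia, in line with the REF's
# definition of impact.
non_academic = [m for m in mentions if m["audience"] != "researcher"]

by_audience = Counter(m["audience"] for m in non_academic)
by_country = Counter(m["country"] for m in non_academic)

print(by_audience)  # Counter({'general public': 1, 'media': 1, 'policymaker': 1})
print(by_country)   # Counter({'GB': 2, 'US': 1})
```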

To evaluate the extent of their public impact, the arts and humanities, and to some degree other disciplines within the social sciences, rely less than, say, the life sciences on traditional citation metrics, and indeed the Main Panel D Criteria does not necessarily require [peer-reviewed] citation evidence (Section D2; 74:87). This suggests that any assessment of engagement with these fields’ research outputs should be as holistic as possible, going well beyond bibliometrics and taking into consideration each field’s unique disciplinary characteristics through a combination of qualitative and quantitative methods.

Curating and mining the databases, and quantifying and displaying the data in a human-readable form, is only one step in the process of studying online attention to research. One way of engaging in qualitative appraisals of alt-metrics is to interpret the results by telling stories about the data in the form of case studies. At Altmetric, we have been tracking at least 120 peer-reviewed literature journals, and at least 31 journals of different disciplines that have published articles on humanities themes, evaluating the way articles published in these journals are received, discussed and promoted online.
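
For readers curious about what quantifying and displaying the data in a human-readable form can look like in practice, here is a rough sketch that pulls article-level attention counts for a single DOI. It assumes the public Altmetric details endpoint and the handful of response field names shown below, which should be checked against the current API documentation rather than taken from this post.

```python
# A rough sketch, not a recipe: it assumes the public Altmetric details
# endpoint (https://api.altmetric.com/v1/doi/<doi>) and the response field
# names used below; check the current API documentation before relying on them.
import requests

def attention_summary(doi: str) -> dict:
    """Return a small human-readable summary of online attention for a DOI."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    resp.raise_for_status()  # a 404 simply means no attention has been tracked
    data = resp.json()
    return {
        "title": data.get("title"),
        "score": data.get("score"),
        "tweets": data.get("cited_by_tweeters_count", 0),
        "blog posts": data.get("cited_by_feeds_count", 0),
        "news stories": data.get("cited_by_msm_count", 0),
    }

if __name__ == "__main__":
    print(attention_summary("10.1000/example-doi"))  # replace with a real DOI
```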

Many questions arise from looking at the data. How can we better understand and use the data we obtain with the Altmetric Explorer? Can we classify articles and journals by the kind of attention they get? Are there common patterns across disciplines and themes? How do they compare to those found in STEM research? Can online attention metrics encourage specific types of online behaviour across disciplinary boundaries? What does it mean when somebody tweets a paper, and what is the tweeter trying to do? How can journals maximise online engagement with the research they publish?

These questions will guide our fieldwork.