Altmetric Blog

Reflections on REFlections

Fran Davies, 10th April 2015

The Research Excellence Framework (REF) is a process for assessing research quality at UK universities, funded and organised by the Higher Education Funding Council for England (HEFCE). At Altmetric, we work with a lot of UK universities and are always really interested in learning more about the processes of research assessment. To this end, I attended the one-day REFlections event at the Royal Society on 25th March, to gain an insight into what people thought of REF 2014, and in particular to hear what people had to say about the controversial decision to introduce “impact” as one of the assessment areas.

The day started with some very positive statistics. According to the “key facts” sheet in the conference pack, research from 154 UK universities was assessed, and 1,911 submissions were made. The results showed that 30% of submissions were awarded a four-star rating and judged to be “world leading” (up from 14% in 2008), while 46% were awarded a three-star rating and classified as “internationally excellent” (up from 37% in 2008). To give a bit of context, David Sweeney (Director of Research and Knowledge Exchange at HEFCE) informed the audience that roughly the same number of staff made submissions in both years, suggesting a genuine increase in top-quality research.

Listening to each speaker present their thoughts, I felt the theme of the morning could definitely be summed up in one word: “multidisciplinarity”. David Sweeney suggested that the results of REF 2014 defy previous criticisms that the exercise, by categorising submissions under their respective academic disciplines, approaches institutional research in a narrow or insular way.

M’hamed El Aisati gave a very impressive presentation about a project undertaken by Elsevier, which was executed with the guiding principle that “some of the most interesting research questions are found at the interface between disciplines”. The project involved looking at how often journals across a large range of subjects were citing each other, and translating this into an infographic, or “map”. However, various members of the audience thought El Aisati was promoting the idea that multidisciplinary research is inherently good. They raised concerns that people would attempt to bias the REF responses to their submissions by including more accounts of multidisciplinary research, if they assumed the REF would favour accounts of academic endeavours combining more than one subject. The question of how to balance an appreciation of multidisciplinary research with continued recognition of the findings of more niche academic subjects was therefore an interesting one.
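To make the kind of analysis behind such a map a little more concrete, here is a minimal sketch in Python. The journal names and citation pairs are invented for illustration, and this is only a toy approximation of the approach as I understood it, not Elsevier's actual methodology: it simply tallies how often pairs of journals cite one another, which is the raw material for the edges of a citation “map”.

# A minimal sketch (not Elsevier's actual pipeline) of how a journal
# "citation map" might be built: count how often journals cite each
# other, then report the strongest cross-journal links.
from collections import Counter

# Toy data: each tuple is (citing_journal, cited_journal) for one citation.
citations = [
    ("Physics A", "Maths B"),
    ("Physics A", "Maths B"),
    ("Maths B", "Physics A"),
    ("Biology C", "Chemistry D"),
    ("Chemistry D", "Biology C"),
    ("Chemistry D", "Biology C"),
    ("Physics A", "Chemistry D"),
]

# Count directed citation links between journals.
link_counts = Counter(citations)

# Symmetrise: for a map we care about the strength of the relationship
# between two journals, regardless of which one did the citing.
pair_strength = Counter()
for (citing, cited), n in link_counts.items():
    if citing != cited:  # ignore a journal citing itself
        pair_strength[frozenset((citing, cited))] += n

# The strongest links become the edges of the map; heavy links between
# journals from different subjects hint at multidisciplinary activity.
for pair, weight in pair_strength.most_common():
    a, b = sorted(pair)
    print(f"{a} <-> {b}: {weight} citations")

Run over a real citation corpus, the weighted pairs produced at the end would be laid out as a network diagram, with subjects clustering together and the cross-subject edges highlighting work at the interface between disciplines.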

The late morning and afternoon sessions moved on from the question of quality and started to focus on the “impact” section of the assessment, which accounted for 20% of the overall assessment for each submission. Jonathan Adams closed the morning’s proceedings by introducing the REF impact case study database, which was put together in partnership with Digital Science. Each case study includes an introduction to the research, as well as citation data and a “details of the impact” section. For example, part of the impact of a Durham University submission entitled “An X-ray tool for predicting catastrophic failure in semiconductor manufacture” was that “Jordan Valley Semiconductors UK made the strategic decision to invest in the design and manufacture” of safer X-ray imaging tools.

After lunch, two analysts from the RAND Corporation reported on their evaluation of the impact assessment process, in which they conducted face-to-face and telephone interviews with those who had been involved in making submissions, and with the assessment panellists. They summed up their findings as follows:

“The introduction of an impact element in REF 2014 might have been expected to generate concerns because of the relative novelty of the approach and because of the obvious difficulties in measurement, but in general it has succeeded.”

So, what can metrics providers make of all this? Taking all the speakers into account, it seems as though “impact” is increasing in importance as a way of assessing research quality, but the “obvious difficulties in measurement” described by RAND suggest a lack of tools with which to measure and quantify such a broad and slippery term, and translate it into relevant numbers. This, then, is the gap in the market that metrics providers, whether of bibliometrics or altmetrics, are seeking to fill.

However, this idea was somewhat contradicted by James Wilsdon, when he spoke about the Independent Review of the Role of Metrics in Research Assessment, the full results of which are to be published in July 2015. James concluded that “it is not currently feasible to assess the quality of research outputs based on quantitative indicators alone”. He elaborated that “no set of numbers can create a nuanced judgement of research” and that the “collection of metrics for research is not cost-free”. In response to this, perhaps it’s worth pointing out that Altmetric is not simply “a set of numbers”: we try to provide qualitative as well as quantitative data. We aim to give our users a level of granularity by allowing them to click through to the full text of all mentions, and to the profiles of those who have shared research on social media.

In summary, REFlections provided much food for thought as to the role of “impact” in assessing research quality in UK HEIs, and the role of metrics in determining impact. At Altmetric we’re continuously preoccupied with questions of data coverage. How can we go beyond the article, and provide data for other research outputs? How can we increase our coverage beyond the sciences, and provide data for other academic disciplines?

It occurred to me as I left the Royal Society that even if multidisciplinary research isn’t inherently good, and even if high-impact research doesn’t automatically mean good research, a multi-faceted approach to assessing impact itself might be the best way forward.

