Evaluating Research Impact: The Development of a Research for Impact Tool

Overview of attention for article published in Frontiers in Public Health, August 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (89th percentile)
  • Good Attention Score compared to outputs of the same age and source (79th percentile)

Mentioned by

  • 1 blog
  • 8 X users

Citations

  • 37 (Dimensions)

Readers on

  • 90 (Mendeley)
Title: Evaluating Research Impact: The Development of a Research for Impact Tool
Published in: Frontiers in Public Health, August 2016
DOI: 10.3389/fpubh.2016.00160
Pubmed ID:
Authors: Komla Tsey, Kenny Lawson, Irina Kinchin, Roxanne Bainbridge, Janya McCalman, Felecia Watkin, Yvonne Cadet-James, Allison Rossetto

Abstract

This paper examines the process of developing a Research for Impact Tool in the context of general fiscal constraint, increased competition for funding, perennial concerns about the over-researching of Aboriginal and Torres Strait Islander issues without demonstrable benefits, and the conceptual and methodological difficulties of evaluating research impact. The aim is to highlight the challenges and opportunities involved in evaluating research impact, to serve as a resource for potential users of the Research for Impact Tool and for others interested in assessing the impact of research. The methods comprised a combination of literature reviews, workshops with researchers, and reflections by project team members and partners using participatory snowball techniques. Assessing research impact is perceived to be difficult, akin to the so-called "wicked problem," but not impossible. A heuristic, collaborative approach to research that takes into account the expectations of research users, research participants, and the funders of research offers a pragmatic solution to evaluating research impact. The logic of the proposed Research for Impact Tool is based on the understanding that the value of research lies in creating evidence and/or products that support smarter decisions so as to improve the human condition. Research is, therefore, of limited value unless the evidence created is used to make smarter decisions for the betterment of society. A practical way of approaching research impact is, therefore, to start with the decisions confronting decision makers, whether they are government policymakers, industry, professional practitioners, or households; the extent to which the research supports them to make smarter policy and practice decisions; and the knock-on consequences of doing so. Embedded at each step in the impact planning and tracking process is the need for an appropriate mix of expertise, capacity enhancement, and collaborative participatory learning-by-doing approaches.
The tool was developed in the context of Aboriginal and Torres Strait Islander research, but the basic idea that research impact is best assessed by starting upfront with the information needs of decision makers is equally applicable to research in other settings, both applied (horizontal) and basic (vertical). The tool will be further tested and evaluated with researchers over the next 2 years (2016/17). The decision by the Australian Government to include 'industry engagement' and 'impact' as additions to the Excellence in Research for Australia (ERA) quality measures from 2018 makes the Research for Impact Tool a timely development. The wider challenge is to engage with major Australian research funding agencies to ensure consistent alignment and approaches across research users, communities, and funders in evaluating impact.

X Demographics

The data shown below were collected from the profiles of the 8 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 90 Mendeley readers of this research output.

Geographical breakdown

Country        Count   As %
Philippines        1     1%
Unknown           89    99%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                          17    19%
Student > Ph.D. Student             10    11%
Student > Master                     8     9%
Student > Bachelor                   7     8%
Student > Doctoral Student           5     6%
Other                               20    22%
Unknown                             23    26%
Readers by discipline                 Count   As %
Medicine and Dentistry                   11    12%
Nursing and Health Professions            9    10%
Social Sciences                           9    10%
Business, Management and Accounting       7     8%
Psychology                                7     8%
Other                                    18    20%
Unknown                                  29    32%
Attention Score in Context

This research output has an Altmetric Attention Score of 16. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 19 February 2018.
All research outputs: #1,969,950 of 22,883,326 outputs
Outputs from Frontiers in Public Health: #725 of 10,009 outputs
Outputs of similar age: #36,908 of 340,306 outputs
Outputs of similar age from Frontiers in Public Health: #15 of 73 outputs
Altmetric has tracked 22,883,326 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 91st percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 10,009 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.0. This one has done particularly well, scoring higher than 92% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 340,306 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 89% of its contemporaries.
We're also able to compare this research output to 73 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 79% of its contemporaries.