
Assessing the applicability of public health intervention evaluations from one setting to another: a methodological study of the usability and usefulness of assessment tools and frameworks

Overview of attention for article published in Health Research Policy and Systems, September 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (91st percentile)
  • Good Attention Score compared to outputs of the same age and source (76th percentile)

Mentioned by

  • 1 blog
  • 1 policy source
  • 35 X users

Citations

  • 36 Dimensions

Readers on

  • 71 Mendeley
Title
Assessing the applicability of public health intervention evaluations from one setting to another: a methodological study of the usability and usefulness of assessment tools and frameworks
Published in
Health Research Policy and Systems, September 2018
DOI 10.1186/s12961-018-0364-3
Authors

Helen Elizabeth Denise Burchett, Laurence Blanchard, Dylan Kneale, James Thomas

Abstract

Public health interventions can be complicated, complex and context dependent, making the assessment of applicability challenging. Nevertheless, for them to be of use beyond the original study setting, they need to be generalisable to other settings and, crucially, research users need to be able to identify the contexts to which they may be applicable. There are many tools with set criteria for assessing generalisability/applicability, yet few seem to be widely used and there is no consensus on which should be used, or when. This methodological study aimed to test these tools to assess how easy they were to use and how useful they appeared to be. We identified tools from an existing review and an update of its search. References were screened against pre-specified criteria. Included tools were tested by using them to assess the applicability of a Swedish weight management intervention to the English context. Researcher assessments and reflections on the usability and utility of the tools were gathered using a standard pro-forma. Eleven tools were included. They varied in length, content, style and the time required to complete them. No tool was considered ideal for assessing applicability. Their limitations included unrealistic criteria (requiring unavailable information), a focus on implementation to the neglect of transferability (i.e. little focus on potential effectiveness in the new setting), overly broad criteria (associated with low reliability), and a lack of an explicit focus on how interventions worked (i.e. their mechanisms of action). Tools that present ready-to-use criteria may not be the best method for applicability assessments. They are likely to be either too long or incomplete, too focused on differences, and to fail to address elements that matter for the specific topic of interest. It is time to progress from developing lists of set criteria that are not widely used in the literature to creating a new approach to applicability assessment. Focusing on mechanisms of action, rather than solely on characteristics, could be a useful approach, and one that remains underutilised in current tools. New approaches to assessing generalisability that evolve away from checklist-style assessments need to be developed, tested, reported and discussed.

X Demographics

The data shown below were collected from the profiles of 35 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 71 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown      71   100%

Demographic breakdown

Readers by professional status    Count   As %
Student > Master                     14    20%
Researcher                            8    11%
Student > Ph. D. Student              7    10%
Student > Bachelor                    7    10%
Other                                 4     6%
Other                                11    15%
Unknown                              20    28%

Readers by discipline             Count   As %
Medicine and Dentistry               14    20%
Nursing and Health Professions       12    17%
Social Sciences                       7    10%
Computer Science                      2     3%
Psychology                            2     3%
Other                                 9    13%
Unknown                              25    35%
Attention Score in Context

This research output has an Altmetric Attention Score of 28. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 06 March 2024.
All research outputs: #1,411,203 of 25,463,091 outputs
Outputs from Health Research Policy and Systems: #131 of 1,393 outputs
Outputs of similar age: #29,230 of 345,513 outputs
Outputs of similar age from Health Research Policy and Systems: #9 of 34 outputs
Altmetric has tracked 25,463,091 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 94th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,393 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.6. This one has done particularly well, scoring higher than 90% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 345,513 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 91% of its contemporaries.
We're also able to compare this research output to 34 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 76% of its contemporaries.
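The four percentiles quoted in this section follow directly from the rank-within-group figures listed above. As a minimal sketch (an assumed rounding convention, not Altmetric's published method), the quoted values are reproduced if "percentile" is read as the share of outputs in each comparison group that this output scores at least as well as, rounded down to a whole number:

```python
# Minimal sketch, not Altmetric's published method: assumes "percentile" means
# the share of outputs in the comparison group that this output scores at
# least as well as, rounded down to a whole number.

def percentile(rank: int, total: int) -> int:
    """Rank 1 is the top-scoring output among `total` outputs in the group."""
    return int(100 * (total - rank + 1) / total)

# Rank-within-group figures quoted on this page.
comparisons = [
    ("All research outputs", 1_411_203, 25_463_091),
    ("Outputs from Health Research Policy and Systems", 131, 1_393),
    ("Outputs of similar age", 29_230, 345_513),
    ("Outputs of similar age from the same source", 9, 34),
]

for label, rank, total in comparisons:
    print(f"{label}: #{rank:,} of {total:,} -> percentile {percentile(rank, total)}")
```

Run as written, this prints 94, 90, 91 and 76 for the four comparison groups, matching the percentiles quoted above.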