
Designing evaluation studies to optimally inform policy: what factors do policy-makers in China consider when making resource allocation decisions on healthcare worker training programmes?

Overview of attention for article published in Health Research Policy and Systems, February 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (84th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • 1 blog
  • 11 X users

Citations

  • 3 Dimensions

Readers on

  • 67 Mendeley
Title
Designing evaluation studies to optimally inform policy: what factors do policy-makers in China consider when making resource allocation decisions on healthcare worker training programmes?
Published in
Health Research Policy and Systems, February 2018
DOI 10.1186/s12961-018-0292-2
Pubmed ID
Authors

Shishi Wu, Helena Legido-Quigley, Julia Spencer, Richard James Coker, Mishal Sameer Khan

Abstract

In light of the gap in evidence to inform future resource allocation decisions about healthcare provider (HCP) training in low- and middle-income countries (LMICs), and the considerable donor investments being made towards training interventions, evaluation studies that are optimally designed to inform local policy-makers are needed. The aim of our study is to understand what features of HCP training evaluation studies are important for decision-making by policy-makers in LMICs. We investigate the extent to which evaluations based on the widely used Kirkpatrick model - focusing on direct outcomes of training, namely reaction of trainees, learning, behaviour change and improvements in programmatic health indicators - align with policy-makers' evidence needs for resource allocation decisions. We use China as a case study where resource allocation decisions about potential scale-up (using domestic funding) are being made about an externally funded pilot HCP training programme. Qualitative data were collected from high-level officials involved in resource allocation at the national and provincial level in China through ten face-to-face, in-depth interviews and two focus group discussions consisting of ten participants each. Data were analysed manually using an interpretive thematic analysis approach. Our study indicates that Chinese officials not only consider information about the direct outcomes of a training programme, as captured in the Kirkpatrick model, but also need information on the resources required to implement the training, the wider or indirect impacts of training, and the sustainability and scalability to other settings within the country. In addition to considering findings presented in evaluation studies, we found that Chinese policy-makers pay close attention to whether the evaluations were robust and to the composition of the evaluation team. Our qualitative study indicates that training programme evaluations that focus narrowly on direct training outcomes may not provide sufficient information for policy-makers to make decisions on future training programmes. Based on our findings, we have developed an evidence-based framework, which incorporates but expands beyond the Kirkpatrick model, to provide conceptual and practical guidance that aids in the design of training programme evaluations better suited to meet the information needs of policy-makers and to inform policy decisions.

X Demographics

The data shown below were collected from the profiles of 11 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 67 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown   67      100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                 9       13%
Researcher                       6       9%
Student > Ph.D. Student          5       7%
Student > Bachelor               4       6%
Lecturer                         3       4%
Other                            12      18%
Unknown                          28      42%
Readers by discipline            Count   As %
Social Sciences                  9       13%
Nursing and Health Professions   9       13%
Medicine and Dentistry           7       10%
Environmental Science            3       4%
Psychology                       3       4%
Other                            7       10%
Unknown                          29      43%
Attention Score in Context

This research output has an Altmetric Attention Score of 14. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 01 March 2018.
All research outputs: #2,375,655 of 23,509,253 outputs
Outputs from Health Research Policy and Systems: #346 of 1,237 outputs
Outputs of similar age: #52,357 of 331,435 outputs
Outputs of similar age from Health Research Policy and Systems: #21 of 30 outputs
Altmetric has tracked 23,509,253 research outputs across all sources so far. Compared to these, this one has done well and is in the 89th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,237 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.1. This one has gotten more attention than average, scoring higher than 72% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 331,435 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 84% of its contemporaries.
We're also able to compare this research output to 30 others from the same source and published within six weeks on either side of this one. This one is in the 33rd percentile – i.e., 33% of its contemporaries scored the same or lower than it.
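The percentile comparisons above follow a simple rule: an output's percentile among its contemporaries is the share of those outputs that scored the same or lower. Below is a minimal sketch of that calculation in Python; it is not Altmetric's own code, and the peer scores used are hypothetical.

    # Minimal sketch of the "scored the same or lower" percentile described above.
    # Not Altmetric's code; the peer scores below are hypothetical.
    def percentile_rank(score, peer_scores):
        """Percentage of peer outputs whose score is <= this output's score."""
        at_or_below = sum(1 for s in peer_scores if s <= score)
        return 100.0 * at_or_below / len(peer_scores)

    # Illustrative: 30 contemporaries with distinct scores 30..1. An output
    # scoring 10 has 10 of the 30 scores at or below it, i.e. roughly the
    # 33rd percentile, matching the "#21 of 30" example above.
    peers = list(range(30, 0, -1))
    print(round(percentile_rank(10, peers)))  # -> 33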