
Tailoring an educational program on the AHRQ Patient Safety Indicators to meet stakeholder needs: lessons learned in the VA

Overview of attention for article published in BMC Health Services Research, February 2018
Mentioned by: 1 X user
Citations: 2 (Dimensions)
Readers: 53 (Mendeley)
DOI: 10.1186/s12913-018-2904-5

Authors

Marlena H. Shin, Peter E. Rivard, Michael Shwartz, Ann Borzecki, Enzo Yaksic, Kelly Stolzmann, Lisa Zubkoff, Amy K. Rosen

Abstract

Given that patient safety measures are increasingly used for public reporting and pay-for-performance, it is important for stakeholders to understand how to use these measures for improvement. The Agency for Healthcare Research and Quality (AHRQ) Patient Safety Indicators (PSIs) are one particularly visible set of measures that are now used primarily for public reporting and pay-for-performance among both private-sector and Veterans Health Administration (VA) hospitals. This trend generates a strong need for stakeholders to understand how to interpret and use the PSIs for quality improvement (QI). The goal of this study was to develop an educational program and tailor it to stakeholders' needs. In this paper, we share what we learned from this program development process. Our study population included key VA stakeholders involved in reviewing performance reports and in prioritizing and initiating quality/safety initiatives. A pre-program formative evaluation, conducted through telephone interviews and web-based surveys, assessed stakeholders' educational needs and interests. Findings from the formative evaluation led to the development and implementation of a cyberseminar-based program tailored to stakeholders' needs and interests. A post-program survey evaluated participants' perceptions of the PSI educational program. Interview data confirmed that the concepts we had developed for the interviews could be used for the survey. Survey results indicated which program delivery modes and content topics were of high interest. Six cyberseminars were developed, three of which focused on the two content areas of greatest interest: learning how to use PSIs to monitor trends and understanding how to interpret PSIs. We also used snapshots of VA PSI reports so that participants could directly apply what they learned. Although initial interest in the program was high, actual attendance was low. However, post-program survey results indicated that perceptions of the program were positive. Conducting a formative evaluation was a highly important part of program development: the information collected through the interviews and surveys allowed us to tailor the program to stakeholders' needs and interests. Our experiences, particularly with the formative evaluation process, yielded valuable lessons that can guide others in developing and implementing similar educational programs.

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 53 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    53       100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Master                  9        17%
Student > Bachelor                5        9%
Other                             4        8%
Student > Doctoral Student        4        8%
Student > Ph. D. Student          4        8%
Other                             13       25%
Unknown                           14       26%
Readers by discipline                                  Count    As %
Nursing and Health Professions                         14       26%
Business, Management and Accounting                    4        8%
Medicine and Dentistry                                 3        6%
Engineering                                            3        6%
Pharmacology, Toxicology and Pharmaceutical Science    2        4%
Other                                                  13       25%
Unknown                                                14       26%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 15 February 2018.
All research outputs:                                        #20,465,050 of 23,023,224 outputs
Outputs from BMC Health Services Research:                   #7,174 of 7,707 outputs
Outputs of similar age:                                      #383,688 of 446,257 outputs
Outputs of similar age from BMC Health Services Research:    #175 of 187 outputs
Altmetric has tracked 23,023,224 research outputs across all sources so far. This one is in the 1st percentile – i.e., 1% of other outputs scored the same or lower than it.
So far Altmetric has tracked 7,707 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 7.8. This one is in the 1st percentile – i.e., 1% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 446,257 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 187 others from the same source and published within six weeks on either side of this one. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.