
Comparison of methods of alert acknowledgement by critical care clinicians in the ICU setting

Overview of attention for article published in PeerJ, March 2017

About this Attention Score

  • Average Attention Score compared to outputs of the same age

Mentioned by

X (Twitter): 2 X users
Facebook: 1 Facebook page

Citations

Dimensions: 8 citations

Readers on

Mendeley: 57 readers
Title
Comparison of methods of alert acknowledgement by critical care clinicians in the ICU setting
Published in
PeerJ, March 2017
DOI 10.7717/peerj.3083
Authors

Andrew M. Harrison, Charat Thongprayoon, Christopher A. Aakre, Jack Y. Jeng, Mikhail A. Dziadzko, Ognjen Gajic, Brian W. Pickering, Vitaly Herasevich

Abstract

Electronic Health Record (EHR)-based sepsis alert systems have failed to demonstrate improvements in clinically meaningful endpoints. However, the effect of implementation barriers on the success of new sepsis alert systems is rarely explored. The aim of this study was to test the hypothesis that time to severe sepsis alert acknowledgement by critical care clinicians in the ICU setting would be reduced using an EHR-based alert acknowledgement system compared to a text paging-based system. In one arm of this simulation study, real alerts for patients in the medical ICU were delivered to critical care clinicians through the EHR. In the other arm, simulated alerts were delivered through text paging. The primary outcome was time to alert acknowledgement. The secondary outcomes were a structured, mixed quantitative/qualitative survey and an informal group interview. The alert acknowledgement rate was 3% (N = 148) for the EHR-based severe sepsis alert system, compared to 51% (N = 156) for simulated severe sepsis alerts delivered through traditional text paging. Median time to alert acknowledgement was 274 min (N = 5) for the severe sepsis alert system and 2 min (N = 80) for text paging. The response rate from the EHR-based alert system was insufficient to compare the primary measures. However, the secondary measures revealed important barriers: alert fatigue, interruption, human error, and information overload all impede alert and simulation studies in the ICU setting.
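
The quantitative comparisons above reduce to two quantities per study arm: the alert acknowledgement rate (acknowledged alerts divided by delivered alerts) and the median time from delivery to acknowledgement. As a minimal sketch of that arithmetic, the Python snippet below uses entirely hypothetical alert records and names; it is illustrative only, is not the authors' analysis code, and does not reproduce the study data.

from statistics import median

# Hypothetical alert records: (delivery_minute, acknowledgement_minute or None
# if the alert was never acknowledged). Values are illustrative only.
ehr_alerts = [(0, 274), (10, None), (25, None), (40, 300), (55, None)]
pager_alerts = [(0, 2), (10, 11), (25, None), (40, 43), (55, 56)]

def summarize(alerts):
    """Return (acknowledgement rate, median minutes to acknowledgement)."""
    acked = [(sent, ack) for sent, ack in alerts if ack is not None]
    rate = len(acked) / len(alerts)
    delays = [ack - sent for sent, ack in acked]
    return rate, (median(delays) if delays else None)

for label, alerts in (("EHR-based alerts", ehr_alerts), ("Text paging", pager_alerts)):
    rate, med = summarize(alerts)
    print(f"{label}: acknowledgement rate {rate:.0%}, median delay {med} min")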

X Demographics

The data shown below were collected from the profiles of 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 57 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United States        1     2%
Unknown             56    98%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                          14    25%
Student > Master                     9    16%
Student > Ph. D. Student             6    11%
Student > Bachelor                   4     7%
Unspecified                          3     5%
Other                               11    19%
Unknown                             10    18%

Readers by discipline                  Count   As %
Medicine and Dentistry                    17    30%
Nursing and Health Professions             5     9%
Computer Science                           4     7%
Unspecified                                3     5%
Agricultural and Biological Sciences       3     5%
Other                                     11    19%
Unknown                                   14    25%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 14 March 2017.
All research outputs:                #14,927,127 of 22,959,818 outputs
Outputs from PeerJ:                  #8,737 of 13,370 outputs
Outputs of similar age:              #184,559 of 307,966 outputs
Outputs of similar age from PeerJ:   #238 of 313 outputs

Altmetric has tracked 22,959,818 research outputs across all sources so far. This one is in the 32nd percentile – i.e., 32% of other outputs scored the same or lower than it.
So far Altmetric has tracked 13,370 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 17.3. This one is in the 29th percentile – i.e., 29% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 307,966 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 37th percentile – i.e., 37% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 313 others from the same source and published within six weeks on either side of this one. This one is in the 21st percentile – i.e., 21% of its contemporaries scored the same or lower than it.
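
Each of the percentile statements above uses the same definition: the share of tracked outputs whose Attention Score is the same or lower. As a minimal sketch of that calculation, assuming a made-up list of scores (the function name and values below are hypothetical and will not reproduce Altmetric's figures, which are computed over its full tracked corpus):

def percentile_of(score, all_scores):
    """Percentage of outputs scoring the same or lower than `score`."""
    same_or_lower = sum(1 for s in all_scores if s <= score)
    return 100 * same_or_lower / len(all_scores)

# Hypothetical comparison set of Attention Scores (illustrative only).
comparison_scores = [0, 0, 1, 1, 1, 2, 2, 3, 5, 17]
print(f"Score 2 is in the {percentile_of(2, comparison_scores):.0f}th percentile of this toy set")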