
Effects of workload, work complexity, and repeated alerts on alert fatigue in a clinical decision support system

Overview of attention for an article published in BMC Medical Informatics and Decision Making, April 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#26 of 1,230)
  • High Attention Score compared to outputs of the same age (92nd percentile)

Mentioned by

  • News: 2 news outlets
  • Policy: 1 policy source
  • Twitter: 17 tweeters

Citations

  • Dimensions: 54

Readers on

  • Mendeley: 161
Title: Effects of workload, work complexity, and repeated alerts on alert fatigue in a clinical decision support system
Published in: BMC Medical Informatics and Decision Making, April 2017
DOI: 10.1186/s12911-017-0430-8
Authors: Jessica S. Ancker, Alison Edwards, Sarah Nosal, Diane Hauser, Elizabeth Mauer, Rainu Kaushal

Abstract

Background: Although alert fatigue is blamed for high override rates in contemporary clinical decision support systems, the concept of alert fatigue is poorly defined. We tested hypotheses arising from two possible alert fatigue mechanisms: (A) cognitive overload associated with the amount of work, the complexity of work, and the effort of distinguishing informative from uninformative alerts, and (B) desensitization from repeated exposure to the same alert over time.

Methods: Retrospective cohort study using electronic health record data (both drug alerts and clinical practice reminders) from January 2010 through June 2013 from 112 ambulatory primary care clinicians. The cognitive overload hypotheses were that alert acceptance would be lower with higher workload (number of encounters, number of patients), higher work complexity (patient comorbidity, alerts per encounter), and more alerts low in informational value (repeated alerts for the same patient in the same year). The desensitization hypothesis was that, for newly deployed alerts, acceptance rates would decline after an initial peak.

Results: On average, one-quarter of the drug alerts received by a primary care clinician, and one-third of the clinical reminders, were repeats for the same patient within the same year. Alert acceptance was associated with work complexity and repeated alerts, but not with the amount of work. The likelihood of reminder acceptance dropped by 30% for each additional reminder received per encounter, and by 10% for each five-percentage-point increase in the proportion of repeated reminders. Newly deployed reminders did not show a pattern of declining response rates over time, which would have been consistent with desensitization. Interestingly, nurse practitioners were 4 times as likely as physicians to accept drug alerts.

Conclusions: Clinicians became less likely to accept alerts as they received more of them, particularly more repeated alerts. There was no evidence of an effect of workload per se, or of desensitization over time for a newly deployed alert. Reducing within-patient repeats may be a promising target for reducing alert overrides and alert fatigue.
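
As a rough illustration only (not from the article): if the reported 30% drop per additional reminder per encounter and 10% drop per five-percentage-point increase in repeated reminders are read as multiplicative effects on the odds of acceptance, as is common for results of this kind, the effects compound. The sketch below is a minimal, hypothetical calculation under that assumption; the baseline rate, the function name, and the example numbers other than the 0.70 and 0.90 ratios are illustrative, and the article's actual statistical model is not specified here.

```python
# Illustrative sketch only: treats the abstract's reported declines as
# multiplicative odds ratios (an assumption, not the article's stated model)
# applied to a hypothetical baseline acceptance probability.

def adjusted_acceptance(baseline_prob, extra_reminders_per_encounter=0, extra_repeat_pp=0):
    """Return a hypothetical acceptance probability after scaling the odds.

    baseline_prob: assumed baseline probability of accepting a reminder.
    extra_reminders_per_encounter: additional reminders per encounter.
    extra_repeat_pp: increase in the share of repeated reminders, in
        percentage points (the abstract reports the effect per 5 points).
    """
    odds = baseline_prob / (1 - baseline_prob)
    odds *= 0.70 ** extra_reminders_per_encounter   # ~30% drop per extra reminder
    odds *= 0.90 ** (extra_repeat_pp / 5)            # ~10% drop per 5-point increase
    return odds / (1 + odds)

# Hypothetical example: a 40% baseline acceptance rate, two extra reminders
# per encounter, and a 10-point rise in repeated reminders.
print(round(adjusted_acceptance(0.40, extra_reminders_per_encounter=2, extra_repeat_pp=10), 3))
```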

Twitter Demographics

The data shown below were collected from the profiles of the 17 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for the 161 Mendeley readers of this research output.

Geographical breakdown

Country     Count   As %
Portugal    1       <1%
Unknown     160     99%

Demographic breakdown

Readers by professional status                        Count   As %
Researcher                                            29      18%
Student > Master                                      28      17%
Student > Doctoral Student                            18      11%
Other                                                 16      10%
Student > Ph. D. Student                              13      8%
Other                                                 34      21%
Unknown                                               23      14%

Readers by discipline                                 Count   As %
Medicine and Dentistry                                54      34%
Nursing and Health Professions                        21      13%
Social Sciences                                       11      7%
Computer Science                                      10      6%
Pharmacology, Toxicology and Pharmaceutical Science   10      6%
Other                                                 22      14%
Unknown                                               33      20%

Attention Score in Context

This research output has an Altmetric Attention Score of 30. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 25 July 2019.

  • All research outputs: #567,277 of 13,663,325 outputs
  • Outputs from BMC Medical Informatics and Decision Making: #26 of 1,230 outputs
  • Outputs of similar age: #20,347 of 262,758 outputs
  • Outputs of similar age from BMC Medical Informatics and Decision Making: #1 of 4 outputs
Altmetric has tracked 13,663,325 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 95th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,230 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.1. This one has done particularly well, scoring higher than 97% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 262,758 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 92% of its contemporaries.
We're also able to compare this research output to 4 others from the same source and published within six weeks on either side of this one. This one has scored higher than all of them.