
Defining the real-world reproducibility of visual grading of left ventricular function and visual estimation of left ventricular ejection fraction: impact of image quality, experience and accreditation

Overview of attention for article published in The International Journal of Cardiovascular Imaging, July 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • One of the highest-scoring outputs from this source (#10 of 2,012)
  • High Attention Score compared to outputs of the same age (94th percentile)
  • High Attention Score compared to outputs of the same age and source (97th percentile)

Mentioned by

  • 3 news outlets
  • 6 X users
  • 1 patent

Citations

  • 61 Dimensions

Readers on

  • 53 Mendeley
  • 1 CiteULike
Title
Defining the real-world reproducibility of visual grading of left ventricular function and visual estimation of left ventricular ejection fraction: impact of image quality, experience and accreditation
Published in
The International Journal of Cardiovascular Imaging, July 2015
DOI 10.1007/s10554-015-0659-1
Pubmed ID
Authors

Graham D. Cole, Niti M. Dhutia, Matthew J. Shun-Shin, Keith Willson, James Harrison, Claire E. Raphael, Massoud Zolgharni, Jamil Mayet, Darrel P. Francis

Abstract

Left ventricular function can be evaluated by qualitative grading and by eyeball estimation of ejection fraction (EF). We sought to define the reproducibility of these techniques and how they are affected by image quality, experience and accreditation. Twenty apical four-chamber echocardiographic cine loops (Online Resource 1-20) of varying image quality and left ventricular function were anonymized and presented to 35 operators. Operators were asked to provide (1) a one-phrase grading of global systolic function, (2) an "eyeball" EF estimate and (3) an image quality rating on a 0-100 visual analogue scale. Each observer unknowingly viewed every loop twice, a total of 1400 viewings. When grading LV function into five categories, an operator's chance of agreement with another operator was 50 % and with themselves on blinded re-presentation was 68 %. Blinded eyeball LVEF re-estimates by the same operator had a standard deviation (SD) of difference of 7.6 EF units, with the SD across operators averaging 8.3 EF units. Image quality, defined as the average of all operators' assessments, correlated with EF estimate variability (r = -0.616, p < 0.01) and visual grading agreement (r = 0.58, p < 0.01). However, operators' own single quality assessments were not a useful forewarning of their estimate being an outlier, partly because individual quality assessments had poor within-operator reproducibility (SD of difference 17.8). Reproducibility of visual grading of LV function and of LVEF estimation depends on image quality, but individuals cannot themselves identify when poor image quality is disrupting their LV function estimate. Clinicians should not assume that patients changing in grade or in visually estimated EF have had a genuine clinical change.
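The headline reproducibility statistics in the abstract (within-operator SD of the difference between two blinded viewings, and the spread of estimates across operators) can be illustrated with a minimal Python sketch on simulated data. This is not the authors' analysis code: the data layout, column names and noise model below are hypothetical, chosen only to show how such statistics might be computed.

```python
# Minimal, hypothetical sketch of the reproducibility statistics described in
# the abstract. Simulated data: 35 operators x 20 loops x 2 blinded viewings.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
true_ef = rng.uniform(20, 70, size=20)  # assumed "true" EF per loop (illustrative)

rows = []
for op in range(35):
    for lp in range(20):
        for viewing in (1, 2):
            rows.append({"operator": op, "loop": lp, "viewing": viewing,
                         "ef": true_ef[lp] + rng.normal(0, 5)})
df = pd.DataFrame(rows)  # 1400 rows, one per viewing

# Within-operator reproducibility: SD of the difference between the two
# blinded estimates of the same loop by the same operator.
wide = df.pivot_table(index=["operator", "loop"], columns="viewing", values="ef")
within_sd = (wide[1] - wide[2]).std()

# Between-operator variability: SD across operators for each loop (averaging
# each operator's two viewings first), then averaged over the 20 loops.
per_op = df.groupby(["loop", "operator"])["ef"].mean()
between_sd = per_op.groupby(level="loop").std().mean()

print(f"within-operator SD of difference: {within_sd:.1f} EF units")
print(f"mean between-operator SD: {between_sd:.1f} EF units")
```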

X Demographics

The data shown below were collected from the profiles of 6 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 53 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    53       100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Ph. D. Student          9        17%
Student > Bachelor                9        17%
Student > Master                  7        13%
Student > Postgraduate            5        9%
Other                             3        6%
Other                             11       21%
Unknown                           9        17%

Readers by discipline             Count    As %
Medicine and Dentistry            21       40%
Nursing and Health Professions    8        15%
Engineering                       3        6%
Computer Science                  3        6%
Chemistry                         2        4%
Other                             3        6%
Unknown                           13       25%
Attention Score in Context

This research output has an Altmetric Attention Score of 29. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 18 July 2023.
  • All research outputs: #1,325,696 of 25,394,764 outputs
  • Outputs from The International Journal of Cardiovascular Imaging: #10 of 2,012 outputs
  • Outputs of similar age: #16,056 of 276,776 outputs
  • Outputs of similar age from The International Journal of Cardiovascular Imaging: #1 of 44 outputs
Altmetric has tracked 25,394,764 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 94th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,012 research outputs from this source. They receive a mean Attention Score of 2.3. This one has done particularly well, scoring higher than 99% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 276,776 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 94% of its contemporaries.
We're also able to compare this research output to 44 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 97% of its contemporaries.
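The percentile figures quoted above are consistent with a simple rank-based calculation, the share of outputs in the comparison group that scored lower than this one. A minimal sketch using the numbers from this page (Altmetric's exact rounding may differ):

```python
# Percentile rank from position and cohort size; figures taken from the
# ranking list above. This mirrors, but is not, Altmetric's own calculation.
def percentile(rank: int, total: int) -> float:
    return (1 - rank / total) * 100

print(round(percentile(16_056, 276_776), 1))  # outputs of similar age: ~94.2
print(round(percentile(1, 44), 1))            # similar age, same source: ~97.7
print(round(percentile(10, 2_012), 1))        # same source overall: ~99.5
```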