
The ASNR-ACR-RSNA Common Data Elements Project: What Will It Do for the House of Neuroradiology?

Overview of attention for an article published in American Journal of Neuroradiology, September 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (86th percentile)
  • High Attention Score compared to outputs of the same age and source (95th percentile)

Mentioned by

1 news outlet
11 X users

Citations

10 Dimensions

Readers on

26 Mendeley

Title: The ASNR-ACR-RSNA Common Data Elements Project: What Will It Do for the House of Neuroradiology?
Published in: American Journal of Neuroradiology, September 2018
DOI: 10.3174/ajnr.a5780
Authors: A.E. Flanders, J.E. Jordan

Abstract

The American Society of Neuroradiology has teamed up with the American College of Radiology and the Radiological Society of North America to create a catalog of neuroradiology common data elements that addresses specific clinical use cases. Fundamentally, a common data element is a question, concept, measurement, or feature with a set of controlled responses. This could be a measurement, subjective assessment, or ordinal value. Common data elements can be both machine- and human-generated. Rather than redesigning neuroradiology reporting, the goal is to establish the minimum number of "essential" concepts that should be in a report to address a clinical question. As medicine shifts toward value-based service compensation methodologies, there will be an even greater need to benchmark quality care and allow peer-to-peer comparisons in all specialties. Many government programs are now focusing on these measures, the most recent being the Merit-Based Incentive Payment System and the Medicare Access and Children's Health Insurance Program Reauthorization Act of 2015. Standardized or structured reporting is advocated as one method of assessing radiology report quality, and common data elements are a means for expressing these concepts. Incorporating common data elements into clinical practice fosters a number of very useful downstream processes, including establishing benchmarks for quality-assurance programs, ensuring more accurate billing, improving communication to providers and patients, participating in public health initiatives, supporting comparative effectiveness research, and providing classifiers for machine learning. Generalized adoption of the recommended common data elements in clinical practice will provide the means to collect and compare imaging report data from multiple institutions locally, regionally, and even nationally, to establish quality benchmarks.
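
To make the controlled-response idea concrete, here is a minimal sketch, assuming a hypothetical Python representation: a common data element is modeled as a named concept whose responses must come from a fixed value set. The class names, fields, and the midline-shift example values are illustrative assumptions, not the project's published schema.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ValueOption:
        code: str    # machine-readable identifier for one allowed response
        label: str   # human-readable wording that appears in the report

    @dataclass
    class CommonDataElement:
        name: str            # the concept or question being captured
        definition: str      # what the element means clinically
        options: List[ValueOption] = field(default_factory=list)  # controlled responses

        def validate(self, response: str) -> bool:
            # Accept only responses drawn from the controlled vocabulary.
            return any(opt.code == response for opt in self.options)

    # Illustrative element with an ordinal, fixed response set (values are hypothetical).
    midline_shift = CommonDataElement(
        name="Midline shift",
        definition="Degree of midline shift on axial imaging",
        options=[
            ValueOption("none", "No midline shift"),
            ValueOption("mild", "Shift < 5 mm"),
            ValueOption("marked", "Shift >= 5 mm"),
        ],
    )

    assert midline_shift.validate("mild")          # controlled response accepted
    assert not midline_shift.validate("anything")  # free-text response rejected

Because every site completing such an element draws on the same controlled responses, reports become directly comparable across institutions, which is what enables the benchmarking, comparative effectiveness, and machine-learning uses described above.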

X Demographics

The data shown below were collected from the profiles of 11 X users who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 26 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 26 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph.D. Student | 6 | 23%
Student > Master | 3 | 12%
Professor > Associate Professor | 3 | 12%
Researcher | 3 | 12%
Lecturer | 1 | 4%
Other | 3 | 12%
Unknown | 7 | 27%

Readers by discipline | Count | As %
Medicine and Dentistry | 7 | 27%
Business, Management and Accounting | 3 | 12%
Computer Science | 2 | 8%
Social Sciences | 2 | 8%
Decision Sciences | 1 | 4%
Other | 2 | 8%
Unknown | 9 | 35%
Attention Score in Context

This research output has an Altmetric Attention Score of 16. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 28 March 2019.
All research outputs: #2,283,484 of 25,192,722 outputs
Outputs from American Journal of Neuroradiology: #391 of 5,218 outputs
Outputs of similar age: #46,048 of 348,221 outputs
Outputs of similar age from American Journal of Neuroradiology: #5 of 90 outputs
Altmetric has tracked 25,192,722 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 90th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 5,218 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.7. This one has done particularly well, scoring higher than 92% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 348,221 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 86% of its contemporaries.
We're also able to compare this research output to 90 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 95% of its contemporaries.