
The impact of standardizing the definition of visits on the consistency of multi-database observational health research

Overview of attention for article published in BMC Medical Research Methodology, March 2015

Mentioned by
Twitter: 1 tweeter

Citations
Dimensions: 10

Readers on
Mendeley: 23
DOI 10.1186/s12874-015-0001-6
Authors

Erica A Voss, Qianli Ma, Patrick B Ryan

Abstract

Use of administrative claims from multiple sources for research purposes is challenged by the lack of consistency in the structure of the underlying data and in the definitions of data across claims data providers. This paper evaluates the impact of applying a standardized revenue code-based logic for defining inpatient encounters across two different claims databases. We selected members who had complete enrollment in 2012 from the Truven MarketScan Commercial Claims and Encounters (CCAE) and the Optum Clinformatics (Optum) databases. The overall prevalence of inpatient conditions in the raw data was compared to that in the common data model (CDM) with the standardized visit definition applied. In CCAE, 87.18% of claims from 2012 that were classified as part of inpatient visits in the raw data were also classified as part of inpatient visits after the data were standardized to the CDM, and this overlap was consistent from 2006 to 2011. In contrast, Optum had 83.18% concordance in the classification of 2012 claims from inpatient encounters before and after standardization, but the consistency varied over time. The re-classification of inpatient encounters substantially impacted both the observed prevalence of medical conditions occurring in the inpatient setting and the consistency of prevalence estimates between the databases. On average, before standardization, each condition in Optum was 12% more prevalent than the same condition in CCAE; after standardization, the prevalence of conditions had a mean difference of only 1% between databases. Amongst 7,039 conditions reviewed, the difference in prevalence between the two databases was reduced for 67% of conditions after standardization. In an effort to improve the consistency of research results across databases, one should review sources of database heterogeneity, such as the way data holders process raw claims data.
Our study showed that applying the Observational Medical Outcomes Partnership (OMOP) CDM with a standardized approach for defining inpatient visits during the extract, transfer, and load process can decrease the heterogeneity observed in disease prevalence estimates across two different claims data sources.
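The concordance figures above (87.18% in CCAE, 83.18% in Optum) describe the share of claims classified as inpatient in the raw source data that are still classified as inpatient after standardization to the OMOP CDM visit definition. A minimal sketch of that comparison follows; the field names and toy data are illustrative assumptions, and the actual revenue code-based classification logic used in the paper's ETL is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: str
    inpatient_raw: bool   # classified as inpatient in the raw source data
    inpatient_cdm: bool   # classified as inpatient after CDM standardization

def inpatient_concordance(claims):
    """Percent of raw inpatient claims still classified inpatient in the CDM."""
    raw_inpatient = [c for c in claims if c.inpatient_raw]
    if not raw_inpatient:
        return 0.0
    agree = sum(1 for c in raw_inpatient if c.inpatient_cdm)
    return 100.0 * agree / len(raw_inpatient)

claims = [
    Claim("a", True, True),
    Claim("b", True, True),
    Claim("c", True, False),  # reclassified by the standardized visit logic
    Claim("d", False, False), # never inpatient; excluded from the denominator
]
print(f"{inpatient_concordance(claims):.1f}% concordance")  # prints "66.7% concordance"
```

In this toy example, two of the three raw inpatient claims remain inpatient after standardization, giving 66.7% concordance; the paper reports the analogous percentage over all 2012 claims in each database.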

Twitter Demographics

The data shown below were collected from the profile of 1 tweeter who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 23 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 23 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Researcher | 9 | 39%
Student > Master | 3 | 13%
Student > Doctoral Student | 3 | 13%
Student > Bachelor | 2 | 9%
Student > Ph. D. Student | 2 | 9%
Other | 2 | 9%
Unknown | 2 | 9%

Readers by discipline | Count | As %
Medicine and Dentistry | 6 | 26%
Social Sciences | 4 | 17%
Nursing and Health Professions | 2 | 9%
Mathematics | 2 | 9%
Computer Science | 2 | 9%
Other | 3 | 13%
Unknown | 4 | 17%

Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 27 September 2016.
All research outputs: #6,404,250 of 8,440,126 outputs
Outputs from BMC Medical Research Methodology: #684 of 857 outputs
Outputs of similar age: #176,749 of 252,904 outputs
Outputs of similar age from BMC Medical Research Methodology: #30 of 42 outputs
Altmetric has tracked 8,440,126 research outputs across all sources so far. This one is in the 13th percentile – i.e., 13% of other outputs scored the same or lower than it.
So far Altmetric has tracked 857 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.4. This one is in the 8th percentile – i.e., 8% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 252,904 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 18th percentile – i.e., 18% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 42 others from the same source and published within six weeks on either side of this one. This one is in the 21st percentile – i.e., 21% of its contemporaries scored the same or lower than it.
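The percentile figures quoted above follow a simple rule: an output's percentile is the share of comparator outputs whose Attention Score is the same as or lower than its own. A small illustrative sketch, using made-up scores rather than any real Altmetric data:

```python
def percentile_rank(score, all_scores):
    """Percent of outputs scoring the same as or lower than `score`."""
    same_or_lower = sum(1 for s in all_scores if s <= score)
    return 100.0 * same_or_lower / len(all_scores)

scores = [0, 0, 1, 1, 2, 3, 5, 8, 13, 21]
print(round(percentile_rank(1, scores)))  # prints 40: four of ten scores are <= 1
```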