
The impact of standardizing the definition of visits on the consistency of multi-database observational health research

Overview of attention for article published in BMC Medical Research Methodology, March 2015
Mentioned by: 1 X user
Citations: 22 (Dimensions)
Readers: 43 (Mendeley)
Title: The impact of standardizing the definition of visits on the consistency of multi-database observational health research
Published in: BMC Medical Research Methodology, March 2015
DOI: 10.1186/s12874-015-0001-6
Pubmed ID:
Authors: Erica A Voss, Qianli Ma, Patrick B Ryan

Abstract

Use of administrative claims from multiple sources for research purposes is challenged by the lack of consistency in the structure and definition of the underlying data across claims data providers. This paper evaluates the impact of applying a standardized, revenue code-based logic for defining inpatient encounters across two different claims databases.

We selected members who had complete enrollment in 2012 from the Truven MarketScan Commercial Claims and Encounters (CCAE) and the Optum Clinformatics (Optum) databases. The overall prevalence of inpatient conditions in the raw data was compared to that in the common data model (CDM) with the standardized visit definition applied.

In CCAE, 87.18% of claims from 2012 that were classified as part of inpatient visits in the raw data were also classified as part of inpatient visits after the data were standardized to the CDM, and this overlap was consistent from 2006 to 2011. In contrast, Optum had 83.18% concordance in the classification of 2012 claims from inpatient encounters before and after standardization, but the consistency varied over time. The re-classification of inpatient encounters substantially affected both the observed prevalence of medical conditions occurring in the inpatient setting and the consistency of prevalence estimates between the databases. On average, before standardization, each condition in Optum was 12% more prevalent than the same condition in CCAE; after standardization, the prevalence of conditions had a mean difference of only 1% between databases. Among the 7,039 conditions reviewed, the between-database difference in prevalence was reduced for 67% of conditions after standardization.

To improve consistency in research results across databases, one should review sources of database heterogeneity, such as the way data holders process raw claims data. Our study showed that applying the Observational Medical Outcomes Partnership (OMOP) CDM with a standardized approach for defining inpatient visits during the extract, transform, and load process can decrease the heterogeneity observed in disease prevalence estimates across two different claims data sources.
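
As a rough illustration of the approach the abstract describes, the sketch below classifies claim lines as inpatient using a revenue-code rule and compares condition prevalence before and after that standardization. This is a minimal sketch only: the revenue codes, field names, and data layout are assumptions for illustration and are not the paper's actual OMOP CDM ETL logic.

```python
from dataclasses import dataclass

# Hypothetical room-and-board revenue codes used to flag inpatient claim lines.
# The paper's actual revenue-code list is not reproduced here.
INPATIENT_REVENUE_CODES = {"0100", "0101", "0110", "0120", "0150", "0200"}


@dataclass
class Claim:
    person_id: int
    revenue_code: str
    condition: str       # e.g. a diagnosis code recorded on the claim line
    raw_setting: str     # setting assigned by the data holder: "IP" or "OP"


def standardized_setting(claim: Claim) -> str:
    """Classify a claim line as inpatient using the revenue-code-based rule."""
    return "IP" if claim.revenue_code in INPATIENT_REVENUE_CODES else "OP"


def inpatient_prevalence(claims: list[Claim], n_persons: int, use_standardized: bool) -> dict:
    """Prevalence of each condition in the inpatient setting:
    persons with the condition on an inpatient claim / all enrolled persons."""
    persons_by_condition: dict[str, set[int]] = {}
    for c in claims:
        setting = standardized_setting(c) if use_standardized else c.raw_setting
        if setting == "IP":
            persons_by_condition.setdefault(c.condition, set()).add(c.person_id)
    return {cond: len(people) / n_persons for cond, people in persons_by_condition.items()}


def concordance(claims: list[Claim]) -> float:
    """Share of claims flagged inpatient in the raw data that remain inpatient
    after standardization (the analogue of the 87.18% / 83.18% figures)."""
    raw_ip = [c for c in claims if c.raw_setting == "IP"]
    if not raw_ip:
        return float("nan")
    return sum(standardized_setting(c) == "IP" for c in raw_ip) / len(raw_ip)


if __name__ == "__main__":
    toy_claims = [
        Claim(1, "0100", "250.00", "IP"),  # stays inpatient after standardization
        Claim(2, "0450", "250.00", "IP"),  # emergency-room revenue code: reclassified
        Claim(3, "0110", "410.01", "IP"),
    ]
    print("concordance:", concordance(toy_claims))
    print("raw prevalence:", inpatient_prevalence(toy_claims, 100, use_standardized=False))
    print("standardized prevalence:", inpatient_prevalence(toy_claims, 100, use_standardized=True))
```

Running the toy example shows the mechanism the paper measures at scale: reclassifying even a single claim line changes which persons count toward inpatient prevalence, so differences in how data holders assign settings propagate into between-database prevalence differences.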

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 43 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    43       100%

Demographic breakdown

Readers by professional status          Count    As %
Researcher                              13       30%
Student > Bachelor                      7        16%
Student > Doctoral Student              4        9%
Student > Master                        3        7%
Student > Ph. D. Student                3        7%
Other                                   6        14%
Unknown                                 7        16%

Readers by discipline                   Count    As %
Medicine and Dentistry                  12       28%
Mathematics                             4        9%
Computer Science                        4        9%
Social Sciences                         4        9%
Agricultural and Biological Sciences    3        7%
Other                                   6        14%
Unknown                                 10       23%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 27 September 2016.
All research outputs: #18,472,072 of 22,889,074 outputs
Outputs from BMC Medical Research Methodology: #1,748 of 2,024 outputs
Outputs of similar age: #188,589 of 258,903 outputs
Outputs of similar age from BMC Medical Research Methodology: #21 of 26 outputs
Altmetric has tracked 22,889,074 research outputs across all sources so far. This one is in the 11th percentile – i.e., 11% of other outputs scored the same or lower than it.
So far Altmetric has tracked 2,024 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.1. This one is in the 6th percentile – i.e., 6% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 258,903 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 15th percentile – i.e., 15% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 26 others from the same source and published within six weeks on either side of this one. This one is in the 11th percentile – i.e., 11% of its contemporaries scored the same or lower than it.
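
For readers curious how a "same or lower" percentile like the ones above is derived, here is a minimal sketch using made-up peer scores; it is not Altmetric's actual comparison data, nor its exact handling of ties.

```python
def percentile_same_or_lower(score: float, peer_scores: list[float]) -> float:
    """Percentage of peers whose Attention Score is the same as or lower than `score`."""
    return 100.0 * sum(s <= score for s in peer_scores) / len(peer_scores)

# Made-up peer scores for illustration only.
peers = [0.25, 0.5, 1.0, 4.0, 9.0, 12.0, 30.0, 55.0, 80.0]
print(round(percentile_same_or_lower(1.0, peers)))  # -> 33, i.e. 33% of these toy peers scored the same or lower
```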