
Methods for enhancing the reproducibility of biomedical research findings using electronic health records

Overview of attention for article published in BioData Mining, September 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#32 of 324)
  • High Attention Score compared to outputs of the same age (87th percentile)
  • High Attention Score compared to outputs of the same age and source (83rd percentile)

Mentioned by

25 X users

Citations

21 Dimensions citations

Readers on

84 Mendeley readers
Title
Methods for enhancing the reproducibility of biomedical research findings using electronic health records
Published in
BioData Mining, September 2017
DOI 10.1186/s13040-017-0151-7
Pubmed ID
Authors

Spiros Denaxas, Kenan Direk, Arturo Gonzalez-Izquierdo, Maria Pikoula, Aylin Cakiroglu, Jason Moore, Harry Hemingway, Liam Smeeth

Abstract

The ability of external investigators to reproduce published scientific findings is critical for the evaluation and validation of biomedical research by the wider community. However, a substantial proportion of health research using electronic health records (EHR), data collected and generated during clinical care, is potentially not reproducible, mainly because the implementation details of most data preprocessing, cleaning, phenotyping and analysis approaches are not systematically made available or shared. With the complexity, volume and variety of electronic health record data sources made available for research steadily increasing, it is critical to ensure that scientific findings from EHR data are reproducible and replicable by researchers. Reporting guidelines, such as RECORD and STROBE, have set a solid foundation by recommending a series of items for researchers to include in their research outputs. Researchers, however, often lack the technical tools and methodological approaches to put such recommendations into practice in an efficient and sustainable manner. In this paper, we review and propose a series of methods and tools used in adjacent scientific disciplines that can enhance the reproducibility of research using electronic health records and enable researchers to report analytical approaches in a transparent manner. Specifically, we discuss the adoption of scientific software engineering principles and best practices such as test-driven development, source code revision control systems, literate programming, and the standardization and re-use of common data management and analytical approaches. The adoption of such approaches will enable scientists to systematically document and share EHR analytical workflows and increase the reproducibility of biomedical research using such complex data sources.
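
The software engineering practices named in the abstract can be made concrete with a brief, hypothetical sketch. The code list, phenotype rule and function names below are illustrative assumptions, not material from the paper; the sketch shows how pytest-style tests, written before the implementation, document the intended behaviour of an EHR phenotyping step so that other investigators can re-run and verify it.

```python
# Hypothetical sketch (not from the paper): test-driven development applied to
# a simple EHR phenotyping rule. The code list and function names are invented.

DIABETES_ICD10_PREFIXES = {"E10", "E11"}  # illustrative code list, not a validated phenotype


def has_diabetes_phenotype(diagnosis_codes):
    """Return True if any recorded ICD-10 code falls under the diabetes code list."""
    return any(code.split(".")[0] in DIABETES_ICD10_PREFIXES for code in diagnosis_codes)


# Tests written first (pytest style) pin down the intended behaviour of the
# phenotype definition, so it can be shared and re-run by other investigators.
def test_detects_coded_diabetes():
    assert has_diabetes_phenotype(["E11.9", "I10"])


def test_ignores_unrelated_codes():
    assert not has_diabetes_phenotype(["I10", "J45.0"])
```

Keeping both the tests and the code list under revision control (for example Git), and embedding them in a literate-programming notebook, extends the same idea to the other practices the abstract lists.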

X Demographics

The data shown below were collected from the profiles of 25 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 84 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
United Kingdom | 1 | 1%
Unknown | 83 | 99%

Demographic breakdown

Readers by professional status | Count | As %
Researcher | 15 | 18%
Student > Ph.D. Student | 11 | 13%
Student > Master | 11 | 13%
Student > Bachelor | 6 | 7%
Professor | 4 | 5%
Other | 16 | 19%
Unknown | 21 | 25%

Readers by discipline | Count | As %
Computer Science | 15 | 18%
Medicine and Dentistry | 12 | 14%
Agricultural and Biological Sciences | 6 | 7%
Biochemistry, Genetics and Molecular Biology | 6 | 7%
Nursing and Health Professions | 4 | 5%
Other | 17 | 20%
Unknown | 24 | 29%
Attention Score in Context

This research output has an Altmetric Attention Score of 16. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 06 May 2019.
All research outputs: #2,238,379 of 25,461,852 outputs
Outputs from BioData Mining: #32 of 324 outputs
Outputs of similar age: #41,516 of 323,456 outputs
Outputs of similar age from BioData Mining: #2 of 6 outputs
Altmetric has tracked 25,461,852 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 91st percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 324 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 7.4. This one has done particularly well, scoring higher than 90% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 323,456 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 87% of its contemporaries.
We're also able to compare this research output to 6 others from the same source and published within six weeks on either side of this one. This one has scored higher than 4 of them.
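
The percentile comparisons above follow from the rank and total figures shown. As a rough check, assuming the percentile is simply the share of outputs ranked below this one (1 - rank/total), which is an assumption about the calculation rather than Altmetric's documented formula, the ranks reproduce the stated figures for the three larger comparison groups:

```python
# Rough check (assumption, not Altmetric's documented method): treat the
# percentile as the share of outputs ranked below this one, 1 - rank/total.
comparisons = {
    "All research outputs": (2_238_379, 25_461_852),   # page reports 91st percentile
    "Outputs from BioData Mining": (32, 324),          # page reports higher than 90% of peers
    "Outputs of similar age": (41_516, 323_456),       # page reports higher than 87%
}

for label, (rank, total) in comparisons.items():
    share_below = (1 - rank / total) * 100
    print(f"{label}: #{rank:,} of {total:,} -> scores higher than {share_below:.1f}%")
```

The smallest group (#2 of 6 outputs of similar age from BioData Mining) does not fit this simple formula, which is presumably why the page reports it only as having scored higher than 4 of them; percentiles for very small comparison groups likely follow a different convention.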