
Model Performance Evaluation (Validation and Calibration) in Model-based Studies of Therapeutic Interventions for Cardiovascular Diseases

Overview of attention for article published in Applied Health Economics and Health Policy, February 2013

Mentioned by: 1 X user
Citations: 22 (Dimensions)
Readers: 43 (Mendeley)
DOI 10.1007/s40258-013-0012-6
Authors

Hossein Haji Ali Afzali, Jodi Gray, Jonathan Karnon

Abstract

Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55% of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6% of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.

X Demographics

The data shown below were collected from the profile of the 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 43 Mendeley readers of this research output.

Geographical breakdown

Country        Count   As %
Switzerland        1     2%
Unknown           42    98%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                           9    21%
Student > Master                     8    19%
Student > Ph.D. Student              7    16%
Student > Doctoral Student           4     9%
Lecturer                             2     5%
Other                                7    16%
Unknown                              6    14%
Readers by discipline                                  Count   As %
Medicine and Dentistry                                    11    26%
Agricultural and Biological Sciences                       4     9%
Pharmacology, Toxicology and Pharmaceutical Science        4     9%
Business, Management and Accounting                        3     7%
Mathematics                                                3     7%
Other                                                     12    28%
Unknown                                                    6    14%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 01 December 2014.
All research outputs: #15,311,799 of 22,772,779 outputs
Outputs from Applied Health Economics and Health Policy: #531 of 771 outputs
Outputs of similar age: #121,374 of 193,071 outputs
Outputs of similar age from Applied Health Economics and Health Policy: #9 of 12 outputs
Altmetric has tracked 22,772,779 research outputs across all sources so far. This one is in the 22nd percentile – i.e., 22% of other outputs scored the same or lower than it.
So far Altmetric has tracked 771 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 8.9. This one is in the 25th percentile – i.e., 25% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 193,071 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 27th percentile – i.e., 27% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 12 others from the same source and published within six weeks on either side of this one. This one is in the 25th percentile – i.e., 25% of its contemporaries scored the same or lower than it.