
A comparison of two methods for expert elicitation in health technology assessments

Overview of attention for article published in BMC Medical Research Methodology, July 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (74th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

12 X users

Citations

45 Dimensions

Readers on

120 Mendeley
Title
A comparison of two methods for expert elicitation in health technology assessments
Published in
BMC Medical Research Methodology, July 2016
DOI 10.1186/s12874-016-0186-3
Pubmed ID
Authors

Bogdan Grigore, Jaime Peters, Christopher Hyde, Ken Stein

Abstract

When data needed to inform parameters in decision models are lacking, formal elicitation of expert judgement can be used to characterise parameter uncertainty. Although numerous methods exist for eliciting expert opinion as probability distributions, there is little research to suggest whether one method is more useful than another. This study had three objectives: (i) to obtain subjective probability distributions characterising parameter uncertainty in the context of a health technology assessment; (ii) to compare two elicitation methods by eliciting the same parameters in different ways; and (iii) to collect the experts' subjective preferences for the different elicitation methods used.

Twenty-seven clinical experts were invited to participate in an elicitation exercise to inform a published model-based cost-effectiveness analysis of alternative treatments for prostate cancer. Participants were individually asked to express their judgements as probability distributions using two different methods - the histogram and hybrid elicitation methods - presented in a random order. Individual distributions were mathematically aggregated across experts, with and without weighting. The resulting combined distributions were used in the probabilistic analysis of the decision model; mean incremental cost-effectiveness ratios (ICERs) and expected values of perfect information (EVPI) were calculated for each method and compared with the original cost-effectiveness analysis. Scores on the ease of use of the two methods, and on the extent to which the probability distributions obtained from each method accurately reflected the expert's opinion, were also recorded.

Six experts completed the task. Mean ICERs from the probabilistic analysis ranged between £162,600 and £175,500 per quality-adjusted life year (QALY), depending on the elicitation and weighting methods used. Compared to having no information, use of expert opinion decreased decision uncertainty: the EVPI at the £30,000 per QALY threshold decreased by 74-86% from the original cost-effectiveness analysis. Experts indicated that the histogram method was easier to use, but perceived the hybrid method as more accurately reflecting their opinion.

Inclusion of expert elicitation can decrease decision uncertainty. Here, the choice of method did not affect the overall cost-effectiveness conclusions, but researchers intending to use expert elicitation need to be aware of the impact different methods could have.
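
The abstract mentions two quantitative steps without detail: mathematically aggregating the experts' individual probability distributions (with and without weighting) and computing EVPI from the probabilistic analysis. The sketch below illustrates one common way such steps can be done - a linear opinion pool over histogram-style elicitations, followed by Monte Carlo sampling and a simple per-decision EVPI calculation. It is a minimal sketch under assumed inputs: the bin edges, expert "chip" allocations, weights, and net-benefit functions are hypothetical placeholders, not data or methods taken from the study.

```python
# Illustrative sketch only: linear opinion pooling of histogram-elicited
# judgements, Monte Carlo sampling from the pooled distribution, and a toy
# EVPI calculation. None of the numbers come from the study.
import numpy as np

# Hypothetical elicitation grid for a single model parameter.
bin_edges = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])

# Each row: one expert's "chips per bin", normalised to probabilities.
expert_chips = np.array([
    [1, 4, 8, 5, 2],   # expert 1
    [0, 2, 9, 7, 2],   # expert 2
    [2, 5, 6, 5, 2],   # expert 3
], dtype=float)
expert_probs = expert_chips / expert_chips.sum(axis=1, keepdims=True)

def linear_pool(probs, weights=None):
    """Weighted average of expert probability vectors (linear opinion pool)."""
    if weights is None:                                   # equal weighting
        weights = np.full(probs.shape[0], 1.0 / probs.shape[0])
    weights = np.asarray(weights, dtype=float)
    return (weights / weights.sum()) @ probs

pooled_equal = linear_pool(expert_probs)
pooled_weighted = linear_pool(expert_probs, weights=[0.5, 0.3, 0.2])  # e.g. calibration-based

# Sample from the pooled histogram for use in a probabilistic decision model:
# pick a bin, then draw uniformly within it.
rng = np.random.default_rng(0)

def sample_histogram(probs, edges, n, rng):
    bins = rng.choice(len(probs), size=n, p=probs)
    return rng.uniform(edges[bins], edges[bins + 1])

theta = sample_histogram(pooled_equal, bin_edges, 100_000, rng)

# Toy net monetary benefit for two strategies as functions of the parameter,
# and per-decision EVPI = E[max_d NB(d, theta)] - max_d E[NB(d, theta)].
nb = np.column_stack([20_000 - 30_000 * theta,    # strategy A
                      18_000 - 10_000 * theta])   # strategy B
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()

print(pooled_equal.round(3), round(float(evpi), 1))
```

In the study, the experts' distributions may have been pooled with different weights or fitted to parametric forms before the probabilistic analysis; the sketch only shows the general mechanics of pooling and of EVPI as the expected value of resolving parameter uncertainty before choosing a strategy.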

X Demographics

The data shown below were collected from the profiles of 12 X users who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 120 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    120      100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Master                  24       20%
Researcher                        19       16%
Student > PhD Student             9        8%
Student > Doctoral Student        9        8%
Student > Bachelor                6        5%
Other                             11       9%
Unknown                           42       35%

Readers by discipline                                   Count    As %
Medicine and Dentistry                                  13       11%
Economics, Econometrics and Finance                     13       11%
Nursing and Health Professions                          8        7%
Engineering                                             6        5%
Pharmacology, Toxicology and Pharmaceutical Science     6        5%
Other                                                   27       23%
Unknown                                                 47       39%

Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 23 February 2017.
All research outputs: #5,470,439 of 22,881,154 outputs
Outputs from BMC Medical Research Methodology: #767 of 2,022 outputs
Outputs of similar age: #93,335 of 365,298 outputs
Outputs of similar age from BMC Medical Research Methodology: #20 of 40 outputs
Altmetric has tracked 22,881,154 research outputs across all sources so far. Compared to these, this one has done well and is in the 76th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,022 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.1. This one has received more attention than average, scoring higher than 62% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 365,298 tracked outputs that were published within six weeks on either side of this one, in any source. This one has received more attention than average, scoring higher than 74% of its contemporaries.
We're also able to compare this research output to 40 others from the same source that were published within six weeks on either side of this one. This one has received about average attention, scoring higher than 50% of its contemporaries.
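
For readers curious about what these same-age comparisons involve mechanically, the sketch below ranks a single score against outputs published within six weeks on either side of it. This is a toy illustration only, not Altmetric's implementation, and the peer scores and dates are made up.

```python
# Toy illustration (not Altmetric's implementation) of ranking one output's
# score against outputs published within six weeks on either side of it.
from datetime import date, timedelta

def percentile_among_contemporaries(score, pub_date, others, window_days=42):
    """others: iterable of (score, publication_date) pairs for tracked outputs."""
    lo, hi = pub_date - timedelta(days=window_days), pub_date + timedelta(days=window_days)
    peer_scores = [s for s, d in others if lo <= d <= hi]
    if not peer_scores:
        return None
    return 100.0 * sum(s < score for s in peer_scores) / len(peer_scores)

# Made-up peer data for demonstration only.
peers = [(1, date(2016, 7, 10)), (3, date(2016, 7, 20)), (12, date(2016, 8, 1)),
         (2, date(2016, 6, 15)), (5, date(2016, 7, 25)), (9, date(2016, 9, 30))]
print(percentile_among_contemporaries(6, date(2016, 7, 13), peers))
```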