One Method, Many Methodological Choices: A Structured Review of Discrete-Choice Experiments for Health State Valuation

Overview of attention for article published in PharmacoEconomics, September 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (85th percentile)
  • Good Attention Score compared to outputs of the same age and source (70th percentile)

Mentioned by

  • 1 blog
  • 12 X users
  • 1 Facebook page

Citations

  • 55 Dimensions

Readers on

  • 53 Mendeley
Title
One Method, Many Methodological Choices: A Structured Review of Discrete-Choice Experiments for Health State Valuation
Published in
PharmacoEconomics, September 2018
DOI 10.1007/s40273-018-0714-6
Pubmed ID
Authors

Brendan Mulhern, Richard Norman, Deborah J. Street, Rosalie Viney

Abstract

Discrete-choice experiments (DCEs) are used in the development of preference-based measure (PBM) value sets. There is considerable variation in the methodological approaches used to elicit preferences. Our objective was to carry out a structured review of DCE methods used for health state valuation. PubMed was searched until 31 May 2018 for published literature using DCEs for health state valuation. Search terms to describe DCEs, the process of valuation and preference-based instruments were developed. English language papers with any study population were included if they used DCEs to develop or directly inform the production of value sets for generic or condition-specific PBMs. Assessment of paper quality was guided by the recently developed Checklist for Reporting Valuation Studies. Data were extracted under six categories: general study information, choice task and study design, type of designed experiment, modelling and analysis methods, results and discussion. The literature search identified 1132 published papers, and 63 papers were included in the review. Paper quality was generally high. The study design and choice task formats varied considerably, and a wide range of modelling methods were employed to estimate value sets. This review of DCE methods used for developing value sets suggests some recurring limitations, areas of consensus and areas where further research is required. Methodological diversity means that the values should be seen as experimental, and users should understand the features of the value sets produced before applying them in decision making.

X Demographics

The data were collected from the profiles of the 12 X users who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for the 53 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown    53      100%

Demographic breakdown

Readers by professional status                        Count   As %
Student > Ph.D. Student                               12      23%
Researcher                                            9       17%
Student > Bachelor                                    4       8%
Student > Doctoral Student                            3       6%
Other                                                 3       6%
Other                                                 10      19%
Unknown                                               12      23%

Readers by discipline                                 Count   As %
Economics, Econometrics and Finance                   12      23%
Medicine and Dentistry                                8       15%
Social Sciences                                       5       9%
Pharmacology, Toxicology and Pharmaceutical Science   3       6%
Computer Science                                      3       6%
Other                                                 11      21%
Unknown                                               11      21%
Attention Score in Context

This research output has an Altmetric Attention Score of 14. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 October 2018.
All research outputs: #2,270,638 of 23,577,761 outputs
Outputs from PharmacoEconomics: #175 of 1,879 outputs
Outputs of similar age: #48,530 of 337,498 outputs
Outputs of similar age from PharmacoEconomics: #9 of 30 outputs
Altmetric has tracked 23,577,761 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 90th percentile: it is in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,879 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 7.0. This one has done particularly well, scoring higher than 90% of its peers.
Older research outputs will score higher simply because they have had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 337,498 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 85% of its contemporaries.
We are also able to compare this research output to 30 others from the same source that were published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 70% of its contemporaries.
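As an illustration of the age adjustment described above, the sketch below computes a percentile rank for a score against the set of contemporaries published within six weeks on either side of an output's publication date. This is a minimal sketch under stated assumptions: the function name, the (date, score) input format, and the "share of contemporaries with a strictly lower score" definition are illustrative choices, not Altmetric's actual implementation.

    from datetime import date, timedelta

    def age_adjusted_percentile(score, pub_date, tracked_outputs, window_weeks=6):
        """Percentile of `score` among outputs published within `window_weeks`
        on either side of `pub_date`.

        `tracked_outputs` is assumed to be an iterable of
        (publication_date, attention_score) pairs; this mirrors the comparison
        described on the page, not Altmetric's internal code.
        """
        window = timedelta(weeks=window_weeks)
        # Keep only contemporaries: outputs published within the +/- six-week window.
        contemporaries = [s for d, s in tracked_outputs if abs(d - pub_date) <= window]
        if not contemporaries:
            return None
        # Share of contemporaries that scored strictly lower, expressed as a percentage.
        lower = sum(1 for s in contemporaries if s < score)
        return 100.0 * lower / len(contemporaries)

    # Example: a score of 14 compared with a few hypothetical contemporaries.
    outputs = [
        (date(2018, 9, 1), 2),
        (date(2018, 9, 20), 7),
        (date(2018, 10, 1), 30),
        (date(2018, 7, 1), 50),  # falls outside the six-week window, so it is excluded
    ]
    print(age_adjusted_percentile(14, pub_date=date(2018, 9, 15), tracked_outputs=outputs))
    # -> 66.66..., i.e. the score is higher than 2 of the 3 contemporaries

In practice the same filtering-then-ranking idea applies to the source-restricted comparison as well, with the candidate set limited to outputs from the same journal before the window is applied.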