
Measuring Individual Differences in Decision Biases: Methodological Considerations

Overview of attention for an article published in Frontiers in Psychology, November 2015

About this Attention Score

  • Above-average Attention Score compared to outputs of the same age (55th percentile)
  • Above-average Attention Score compared to outputs of the same age and source (59th percentile)

Mentioned by

  • 4 X users
  • 1 Facebook page

Readers on

  • 93 Mendeley readers
Title
Measuring Individual Differences in Decision Biases: Methodological Considerations
Published in
Frontiers in Psychology, November 2015
DOI 10.3389/fpsyg.2015.01770
Authors

Balazs Aczel, Bence Bago, Aba Szollosi, Andrei Foldes, Bence Lukacs

Abstract

Individual differences in people's susceptibility to heuristics and biases (HB) are often measured by multiple-bias questionnaires consisting of one or a few items for each bias. This research approach relies on the assumptions that (1) different versions of a decision bias task are interchangeable, as they measure the same cognitive failure; and (2) some combination of these tasks measures the same underlying construct. Based on these assumptions, in Study 1 we developed two versions of a new decision bias survey for which we modified 13 HB tasks to increase their comparability, construct validity, and the participants' motivation. The analysis of the responses (N = 1279) showed weak internal consistency within the surveys and a high level of discrepancy between the extracted patterns of the underlying factors. To explore these inconsistencies, in Study 2 we used three original examples of HB tasks for each of seven biases. We created three decision bias surveys by allocating one version of each HB task to each survey. The participants' responses (N = 527) showed a pattern similar to that in Study 1, questioning the assumptions that the different examples of the HB tasks are interchangeable and that they measure the same underlying construct. These results emphasize the need to understand the domain-specificity of cognitive biases, as well as the effect of the wording of the cover story and the response mode on bias susceptibility, before employing such tasks in multiple-bias questionnaires.

X Demographics

The data shown below were collected from the profiles of the 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from the readership statistics for the 93 Mendeley readers of this research output.

Geographical breakdown

Country               Count   As %
United Kingdom            1     1%
United States             1     1%
Dominican Republic        1     1%
Germany                   1     1%
Unknown                  89    96%

Demographic breakdown

Readers by professional status          Count   As %
Student > Ph. D. Student                   22    24%
Student > Master                           15    16%
Researcher                                 12    13%
Student > Bachelor                         11    12%
Student > Doctoral Student                  7     8%
Other                                       9    10%
Unknown                                    17    18%

Readers by discipline                   Count   As %
Psychology                                 24    26%
Business, Management and Accounting        11    12%
Medicine and Dentistry                      7     8%
Economics, Econometrics and Finance         6     6%
Neuroscience                                6     6%
Other                                      17    18%
Unknown                                    22    24%
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 13 October 2021.
Comparison group                                        Rank         Out of
All research outputs                                    #7,872,091   23,864,146 outputs
Outputs from Frontiers in Psychology                    #11,463      31,827 outputs
Outputs of similar age                                  #122,125     391,763 outputs
Outputs of similar age from Frontiers in Psychology     #177         441 outputs
Altmetric has tracked 23,864,146 research outputs across all sources so far. This one is in the 44th percentile – i.e., 44% of other outputs scored the same as or lower than it.
So far Altmetric has tracked 31,827 research outputs from this source. They typically receive far more attention than average, with a mean Attention Score of 12.7. This one has received more attention than most of its peers, scoring higher than 63% of them.
Older research outputs will score higher simply because they have had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 391,763 tracked outputs, from any source, that were published within six weeks on either side of this one. This one has received more attention than most of its contemporaries, scoring higher than 55% of them.
We're also able to compare this research output to 441 others from the same source that were published within six weeks on either side of this one. This one has received more attention than most of those contemporaries, scoring higher than 59% of them.
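As an illustration of how an age-adjusted percentile like the ones above could be computed, the sketch below ranks an output's score against the scores of outputs published within a six-week window on either side of it. This is a minimal sketch under assumed conventions, not Altmetric's actual implementation; the window size handling, tie treatment, and field names (score, published) are assumptions made for illustration only.

```python
from datetime import date, timedelta

# Minimal sketch (not Altmetric's implementation): age-adjusted percentile of an
# attention score, compared against outputs published within +/- 6 weeks.

def contemporaries(outputs, target, window_weeks=6):
    """Return outputs published within `window_weeks` on either side of the target."""
    window = timedelta(weeks=window_weeks)
    return [o for o in outputs
            if o is not target
            and abs(o["published"] - target["published"]) <= window]

def percentile_in_context(outputs, target, window_weeks=6):
    """Share of contemporaries whose score is strictly lower than the target's."""
    peers = contemporaries(outputs, target, window_weeks)
    if not peers:
        return None
    lower = sum(1 for o in peers if o["score"] < target["score"])
    return 100.0 * lower / len(peers)

# Usage example with made-up data:
outputs = [
    {"score": 3,  "published": date(2015, 11, 6)},   # the target output
    {"score": 1,  "published": date(2015, 11, 1)},
    {"score": 0,  "published": date(2015, 10, 20)},
    {"score": 12, "published": date(2015, 12, 1)},
    {"score": 3,  "published": date(2016, 3, 1)},    # outside the window, ignored
]
target = outputs[0]
print(f"Higher than {percentile_in_context(outputs, target):.0f}% of contemporaries")
```

Run against the made-up data above, the target scores higher than two of its three in-window contemporaries (about 67%); how ties and same-day publications are counted is a design choice that would shift the reported percentile slightly.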