
Speed accuracy trade-off under response deadlines

Overview of attention for article published in Frontiers in Neuroscience, August 2014

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (82nd percentile)
  • High Attention Score compared to outputs of the same age and source (82nd percentile)

Mentioned by

  • 3 X users
  • 2 patents
  • 1 Facebook page

Citations

  • 27 Dimensions

Readers on

  • 71 Mendeley
  • 1 CiteULike
Title: Speed accuracy trade-off under response deadlines
Published in: Frontiers in Neuroscience, August 2014
DOI: 10.3389/fnins.2014.00248
Pubmed ID:
Authors: Hakan Karşılar, Patrick Simen, Samantha Papadakis, Fuat Balcı

Abstract

Perceptual decision making has been successfully modeled as a process of evidence accumulation up to a threshold. In order to maximize the rewards earned for correct responses in tasks with response deadlines, participants should collapse decision thresholds dynamically during each trial so that a decision is reached before the deadline. This strategy ensures on-time responding, though at the cost of reduced accuracy, since slower decisions are based on lower thresholds and less net evidence later in a trial (compared to a constant threshold). Frazier and Yu (2008) showed that the normative rate of threshold reduction depends on deadline delays and on participants' uncertainty about these delays. Participants should start collapsing decision thresholds earlier when making decisions under shorter deadlines (for a given level of timing uncertainty) or when timing uncertainty is higher (for a given deadline). We tested these predictions using human participants in a random dot motion discrimination task. Each participant was tested in free-response, short-deadline (800 ms), and long-deadline (1000 ms) conditions. Contrary to optimal-performance predictions, the resulting empirical function relating accuracy to response time (RT) in deadline conditions did not decline to chance level near the deadline; nor did the slight decline we typically observed relate to measures of endogenous timing uncertainty. Further, although this function did decline slightly with increasing RT, the decline was explainable by the best-fitting parameterization of Ratcliff's diffusion model (Ratcliff, 1978), whose parameters are constant within trials. Our findings suggest that at the very least, typical decision durations are too short for participants to adapt decision parameters within trials.
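
The abstract frames decisions as noisy evidence accumulation to a threshold, with a normative prescription to collapse that threshold as a deadline approaches. The sketch below is not the authors' model or code; it is a minimal illustration of the contrast they test, simulating a drift-diffusion process with either a constant bound or a hypothetical bound that collapses linearly to zero at an 800 ms deadline. All parameter values are assumed for illustration only.

```python
# Minimal drift-diffusion sketch: constant vs. linearly collapsing decision bounds.
# Illustrative assumptions only (drift, noise, bound height, 800 ms deadline);
# these are NOT parameter estimates from the paper.
import numpy as np

def simulate_trial(drift=1.0, noise=1.0, a=1.0, dt=0.001, deadline=0.8,
                   collapse=False, rng=None):
    """Return (choice, rt): choice is 1 (correct), 0 (error), or None if the deadline passes."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0.0
    while t < deadline:
        # Bound at time t: constant, or collapsing linearly to 0 at the deadline.
        bound = a * (1.0 - t / deadline) if collapse else a
        if x >= bound:
            return 1, t
        if x <= -bound:
            return 0, t
        # Euler step of the diffusion: drift plus Gaussian noise scaled by sqrt(dt).
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return None, deadline  # no threshold crossing before the deadline

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for collapse in (False, True):
        results = [simulate_trial(collapse=collapse, rng=rng) for _ in range(2000)]
        answered = [r for r in results if r[0] is not None]
        acc = np.mean([choice for choice, _ in answered])
        miss = 1 - len(answered) / len(results)
        print(f"collapse={collapse}: accuracy={acc:.2f}, deadline misses={miss:.1%}")
```

Under these assumed settings the collapsing bound trades accuracy for on-time responding (fewer missed deadlines, more errors), which is the qualitative pattern the normative account predicts and the paper evaluates against data.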

X Demographics

The data shown below were collected from the profiles of 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 71 Mendeley readers of this research output.

Geographical breakdown

Country         Count  As %
Spain               1    1%
United States       1    1%
Germany             1    1%
Unknown            68   96%

Demographic breakdown

Readers by professional status     Count  As %
Student > Ph. D. Student              14   20%
Student > Master                      12   17%
Researcher                            10   14%
Student > Bachelor                    10   14%
Professor > Associate Professor        5    7%
Other                                 14   20%
Unknown                                6    8%

Readers by discipline              Count  As %
Psychology                            28   39%
Neuroscience                          14   20%
Decision Sciences                      3    4%
Medicine and Dentistry                 3    4%
Computer Science                       2    3%
Other                                 14   20%
Unknown                                7   10%
Attention Score in Context

This research output has an Altmetric Attention Score of 8. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 30 January 2024.
  • All research outputs: #4,591,218 of 25,610,986 outputs
  • Outputs from Frontiers in Neuroscience: #3,568 of 11,636 outputs
  • Outputs of similar age: #42,166 of 243,630 outputs
  • Outputs of similar age from Frontiers in Neuroscience: #22 of 120 outputs
Altmetric has tracked 25,610,986 research outputs across all sources so far. Compared to these, this one has done well and is in the 82nd percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 11,636 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 11.0. This one has gotten more attention than average, scoring higher than 69% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 243,630 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 82% of its contemporaries.
We're also able to compare this research output to 120 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 82% of its contemporaries.
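
As a rough sanity check on the percentiles quoted above (my own back-of-the-envelope arithmetic, not Altmetric's published method), each rank and total implies a percentile of roughly 100 × (1 − rank / total):

```python
# Percentile implied by a rank among a total number of outputs (assumed formula;
# Altmetric's exact rounding and tie-handling may differ slightly).
comparisons = {
    "all research outputs": (4_591_218, 25_610_986),
    "Frontiers in Neuroscience": (3_568, 11_636),
    "outputs of similar age": (42_166, 243_630),
    "similar age, same source": (22, 120),
}
for label, (rank, total) in comparisons.items():
    print(f"{label}: ~{100 * (1 - rank / total):.1f}th percentile")
# Prints roughly 82.1, 69.3, 82.7, and 81.7 -- consistent, allowing for rounding,
# with the 82%, 69%, 82%, and 82% figures quoted in the text.
```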