
Timing in audiovisual speech perception: A mini review and new psychophysical data

Overview of attention for article published in Attention, Perception, & Psychophysics, December 2015

About this Attention Score

  • Good Attention Score compared to outputs of the same age (75th percentile)
  • High Attention Score compared to outputs of the same age and source (84th percentile)

Mentioned by

  • 7 X users
  • 1 peer review site

Citations

  • 21 Dimensions

Readers on

  • 117 Mendeley
Title
Timing in audiovisual speech perception: A mini review and new psychophysical data
Published in
Attention, Perception, & Psychophysics, December 2015
DOI 10.3758/s13414-015-1026-y
Pubmed ID
Authors

Jonathan H. Venezia, Steven M. Thurman, William Matchin, Sahara E. George, Gregory Hickok

Abstract

Recent influential models of audiovisual speech perception suggest that visual speech aids perception by generating predictions about the identity of upcoming speech sounds. These models place stock in the assumption that visual speech leads auditory speech in time. However, it is unclear whether and to what extent temporally-leading visual speech information contributes to perception. Previous studies exploring audiovisual-speech timing have relied upon psychophysical procedures that require artificial manipulation of cross-modal alignment or stimulus duration. We introduce a classification procedure that tracks perceptually relevant visual speech information in time without requiring such manipulations. Participants were shown videos of a McGurk syllable (auditory /apa/ + visual /aka/ = perceptual /ata/) and asked to perform phoneme identification (/apa/ yes-no). The mouth region of the visual stimulus was overlaid with a dynamic transparency mask that obscured visual speech in some frames but not others randomly across trials. Variability in participants' responses (~35% identification of /apa/ compared to ~5% in the absence of the masker) served as the basis for classification analysis. The outcome was a high-resolution spatiotemporal map of perceptually relevant visual features. We produced these maps for McGurk stimuli at different audiovisual temporal offsets (natural timing, 50-ms visual lead, and 100-ms visual lead). Briefly, temporally-leading (~130 ms) visual information did influence auditory perception. Moreover, several visual features influenced perception of a single speech sound, with the relative influence of each feature depending on both its temporal relation to the auditory signal and its informational content.
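
The abstract does not spell out the classification analysis, but the standard way to turn trial-by-trial masks and yes/no responses into a spatiotemporal map is a reverse-correlation ("classification image") contrast: average the masks on /apa/ trials, average the masks on non-/apa/ trials, and take the difference. The Python sketch below illustrates that idea only; the array shapes, variable names, and the simple mean-difference estimator are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np

# Minimal classification-image sketch (illustration only; shapes and names assumed):
#   masks:     (n_trials, n_frames, height, width) transparency values in [0, 1],
#              i.e. how visible the mouth region was at each frame/pixel
#   responses: (n_trials,) booleans, True where the participant reported /apa/
rng = np.random.default_rng(0)
n_trials, n_frames, h, w = 500, 30, 16, 16
masks = rng.random((n_trials, n_frames, h, w))
responses = rng.random(n_trials) < 0.35        # ~35% /apa/ identifications

# Mean-difference classification image: frames/pixels whose visibility pushed
# responses toward /apa/ get positive weights; those whose visibility drove the
# McGurk /ata/ fusion (suppressing /apa/ reports) get negative weights.
ci = masks[responses].mean(axis=0) - masks[~responses].mean(axis=0)

# Compare against a permutation null to flag spatiotemporal features that exceed
# chance (a crude stand-in for the statistics a real analysis would use).
null = np.empty((200,) + ci.shape)
for i in range(200):
    perm = rng.permutation(responses)
    null[i] = masks[perm].mean(axis=0) - masks[~perm].mean(axis=0)
z = (ci - null.mean(axis=0)) / null.std(axis=0)   # (n_frames, h, w) z-map
```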

X Demographics

The data shown below were collected from the profiles of 7 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 117 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Japan 1 <1%
United Kingdom 1 <1%
United States 1 <1%
Canada 1 <1%
Unknown 113 97%

Demographic breakdown

Readers by professional status Count As %
Student > Ph. D. Student 34 29%
Student > Master 17 15%
Researcher 15 13%
Student > Bachelor 10 9%
Student > Doctoral Student 9 8%
Other 16 14%
Unknown 16 14%
Readers by discipline Count As %
Psychology 44 38%
Neuroscience 18 15%
Linguistics 6 5%
Agricultural and Biological Sciences 4 3%
Medicine and Dentistry 4 3%
Other 19 16%
Unknown 22 19%
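
The "As %" columns in the tables above are shares of the 117 Mendeley readers, rounded to whole percentages, with counts that round below one percent shown as "<1%". A quick check of that arithmetic, using the reader counts from the geographical table (the helper function and its name are mine, not Altmetric's):

```python
# Recompute the "As %" column from the reader counts (total = 117 Mendeley readers).
def as_pct(count: int, total: int = 117) -> str:
    pct = 100 * count / total
    return "<1%" if pct < 1 else f"{round(pct)}%"

geography = {"Japan": 1, "United Kingdom": 1, "United States": 1,
             "Canada": 1, "Unknown": 113}
for country, n in geography.items():
    print(f"{country:15s} {n:3d} {as_pct(n)}")
# Japan/UK/US/Canada each print "<1%" (1/117 = 0.9%); Unknown prints "97%" (113/117 = 96.6%).
```
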
Attention Score in Context

This research output has an Altmetric Attention Score of 5. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 11 April 2019.
  • All research outputs: #6,531,293 of 24,003,070 outputs
  • Outputs from Attention, Perception, & Psychophysics: #287 of 1,773 outputs
  • Outputs of similar age: #98,671 of 397,480 outputs
  • Outputs of similar age from Attention, Perception, & Psychophysics: #7 of 46 outputs

Altmetric has tracked 24,003,070 research outputs across all sources so far. This one has received more attention than most of these and is in the 72nd percentile.
So far Altmetric has tracked 1,773 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.6. This one has done well, scoring higher than 83% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 397,480 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 75% of its contemporaries.
We're also able to compare this research output to 46 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 84% of its contemporaries.
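
The percentile claims in this section follow directly from the ranks and totals listed above: the percentile is the share of tracked outputs that this output outranks. The short check below reproduces the quoted figures; whether Altmetric floors or rounds the value is an assumption on my part, but flooring matches the numbers here.

```python
# Percentile from rank: fraction of outputs that score below this one,
# floored to a whole percentage (assumed rounding rule).
def percentile(rank: int, total: int) -> int:
    return int(100 * (1 - rank / total))

print(percentile(6_531_293, 24_003_070))  # 72 -> "72nd percentile" of all research outputs
print(percentile(287, 1_773))             # 83 -> "higher than 83% of its peers" in this journal
print(percentile(98_671, 397_480))        # 75 -> "75% of its contemporaries" of similar age
print(percentile(7, 46))                  # 84 -> "84% of its contemporaries" of similar age and source
```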