
Visual speech influences speech perception immediately but not automatically

Overview of attention for article published in Attention, Perception, & Psychophysics, November 2016

About this Attention Score

  • Good Attention Score compared to outputs of the same age (69th percentile)
  • High Attention Score compared to outputs of the same age and source (80th percentile)

Mentioned by

  • 6 X users

Citations

  • 7 Dimensions citations

Readers on Mendeley

  • 49 readers
Title
Visual speech influences speech perception immediately but not automatically
Published in
Attention, Perception, & Psychophysics, November 2016
DOI 10.3758/s13414-016-1249-6
Pubmed ID
Authors

Holger Mitterer, Eva Reinisch

Abstract

Two experiments examined the time course of the use of auditory and visual speech cues to spoken word recognition using an eye-tracking paradigm. Results of the first experiment showed that the use of visual speech cues from lipreading is reduced if concurrently presented pictures require a division of attentional resources. This reduction was evident even when listeners' eye gaze was on the speaker rather than the (static) pictures. Experiment 2 used a deictic hand gesture to foster attention to the speaker. At the same time, the visual processing load was reduced by keeping the visual display constant over a fixed number of successive trials. Under these conditions, the visual speech cues from lipreading were used. Moreover, the eye-tracking data indicated that visual information was used immediately and even earlier than auditory information. In combination, these data indicate that visual speech cues are not used automatically, but if they are used, they are used immediately.

X Demographics

The data shown below were collected from the profiles of 6 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 49 Mendeley readers of this research output.

Geographical breakdown

  • United States: 1 reader (2%)
  • Unknown: 48 readers (98%)

Demographic breakdown

Readers by professional status

  • Student > Ph. D. Student: 16 (33%)
  • Researcher: 7 (14%)
  • Student > Master: 6 (12%)
  • Student > Doctoral Student: 5 (10%)
  • Other: 1 (2%)
  • Other: 4 (8%)
  • Unknown: 10 (20%)

Readers by discipline

  • Psychology: 16 (33%)
  • Linguistics: 5 (10%)
  • Neuroscience: 4 (8%)
  • Agricultural and Biological Sciences: 2 (4%)
  • Engineering: 2 (4%)
  • Other: 6 (12%)
  • Unknown: 14 (29%)
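
The percentage next to each category above is simply that category's reader count as a share of the 49 Mendeley readers, rounded to a whole percent. A minimal sketch of that arithmetic (the variable names are mine; the counts are the discipline figures from the table above):

```python
# Each "As %" value is count / total readers, rounded to a whole percent.
TOTAL_READERS = 49

by_discipline = [
    ("Psychology", 16),
    ("Linguistics", 5),
    ("Neuroscience", 4),
    ("Agricultural and Biological Sciences", 2),
    ("Engineering", 2),
    ("Other", 6),
    ("Unknown", 14),
]

for discipline, count in by_discipline:
    share = round(100 * count / TOTAL_READERS)
    print(f"{discipline}: {count} ({share}%)")
# -> 33%, 10%, 8%, 4%, 4%, 12%, 29%, matching the table above.
```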
Attention Score in Context

This research output has an Altmetric Attention Score of 4. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 19 December 2016.
  • All research outputs: #7,331,201 of 24,003,070
  • Outputs from Attention, Perception, & Psychophysics: #361 of 1,773
  • Outputs of similar age: #128,333 of 422,725
  • Outputs of similar age from Attention, Perception, & Psychophysics: #8 of 40

Altmetric has tracked 24,003,070 research outputs across all sources so far. This one has received more attention than most of these and is in the 69th percentile.
So far Altmetric has tracked 1,773 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.6. This one has done well, scoring higher than 79% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 422,725 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 69% of its contemporaries.
We're also able to compare this research output to 40 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 80% of its contemporaries.
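
The percentile figures quoted above follow directly from each rank-of-total pair. A minimal sketch of that calculation (the function name is mine, and truncating rather than rounding the percentage is an assumption, though it reproduces every figure on this page):

```python
import math

def percentile_rank(rank: int, total: int) -> int:
    """Percent of tracked outputs this one scored higher than, given its
    rank (1 = highest score) among `total` outputs."""
    return math.floor((1 - rank / total) * 100)

# Ranks and totals quoted in the comparison list above.
contexts = [
    ("All research outputs", 7_331_201, 24_003_070),
    ("Outputs from Attention, Perception, & Psychophysics", 361, 1_773),
    ("Outputs of similar age", 128_333, 422_725),
    ("Outputs of similar age from the same source", 8, 40),
]

for label, rank, total in contexts:
    print(f"{label}: higher than {percentile_rank(rank, total)}% of outputs")
# -> 69%, 79%, 69%, 80%, matching the percentages reported above.
```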