
The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception

Overview of attention for article published in Frontiers in Psychology, May 2014

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • 4 X users

Citations

  • 23 citations (Dimensions)

Readers on

  • 35 readers (Mendeley)
Published in
Frontiers in Psychology, May 2014
DOI 10.3389/fpsyg.2014.00420
Authors

Avril Treille, Coriandre Vilain, Marc Sato

Abstract

Recent magneto-encephalographic and electro-encephalographic studies provide evidence for cross-modal integration during audio-visual and audio-haptic speech perception, with speech gestures viewed or felt from manual tactile contact with the speaker's face. Given the temporal precedence of the haptic and visual signals over the acoustic signal in these studies, the observed modulation of N1/P2 auditory evoked responses during bimodal compared to unimodal speech perception suggests that relevant and predictive visual and haptic cues may facilitate auditory speech processing. To further investigate this hypothesis, auditory evoked potentials were here compared during auditory-only, audio-visual and audio-haptic speech perception in live dyadic interactions between a listener and a speaker. In line with previous studies, auditory evoked potentials were attenuated and speeded up during both audio-haptic and audio-visual compared to auditory-only speech perception. Importantly, the observed latency and amplitude reduction did not significantly depend on the degree of visual and haptic recognition of the speech targets. Altogether, these results further demonstrate cross-modal interactions between the auditory, visual and haptic speech signals. Although they do not contradict the hypothesis that visual and haptic sensory inputs convey predictive information with respect to the incoming auditory speech input, these results suggest that, at least in live conversational interactions, systematic conclusions on sensory predictability in bimodal speech integration must be drawn with caution, with the extraction of predictive cues likely depending on the variability of the speech stimuli.

X Demographics

The data shown below were collected from the profiles of 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 35 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Finland       1     3%
Unknown      34    97%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                     7    20%
Student > Ph.D. Student              7    20%
Researcher                           6    17%
Student > Doctoral Student           3     9%
Other                                1     3%
Other                                4    11%
Unknown                              7    20%
Readers by discipline            Count   As %
Psychology                          12    34%
Neuroscience                         4    11%
Computer Science                     3     9%
Medicine and Dentistry               2     6%
Linguistics                          2     6%
Other                                4    11%
Unknown                              8    23%
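
The "As %" column in both breakdowns is simply each count divided by the 35 total readers, rounded to the nearest whole percent. A minimal Python sketch reproducing the figures above (the as_percent helper is hypothetical, not part of any Altmetric or Mendeley API):

    # Hypothetical helper: reproduces the "As %" column from raw reader counts.
    def as_percent(count: int, total: int = 35) -> int:
        # Round to the nearest whole percent, as the tables above do.
        return round(100 * count / total)

    # Spot-checks against the discipline breakdown:
    assert as_percent(12) == 34  # Psychology: 12 of 35 readers
    assert as_percent(8) == 23   # Unknown: 8 of 35 readers
    assert as_percent(2) == 6    # Linguistics: 2 of 35 readers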
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 31 May 2014.
All research outputs: #13,766,415 of 23,344,526 outputs
Outputs from Frontiers in Psychology: #13,741 of 31,066 outputs
Outputs of similar age: #113,212 of 228,330 outputs
Outputs of similar age from Frontiers in Psychology: #191 of 332 outputs
Altmetric has tracked 23,344,526 research outputs across all sources so far. This one is in the 39th percentile – i.e., 39% of other outputs scored the same or lower than it.
So far Altmetric has tracked 31,066 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.6. This one has gotten more attention than average, scoring higher than 54% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 228,330 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 48th percentile – i.e., 48% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 332 others from the same source and published within six weeks on either side of this one. This one is in the 41st percentile – i.e., 41% of its contemporaries scored the same or lower than it.
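
As a rough illustration of how these percentiles relate to the ranks above: a percentile here is the share of outputs scoring the same as or lower than this one, which is approximately (total − rank) / total. A minimal Python sketch (the percentile_rank helper is hypothetical; Altmetric's published figures also account for tied scores, so the exact numbers differ by a point or two):

    # Hypothetical helper: approximate percentile from a rank out of a total,
    # treating rank 1 as the highest-scoring output and ignoring tied scores.
    def percentile_rank(rank: int, total: int) -> float:
        return 100 * (total - rank) / total

    # Ranks reported on this page:
    print(percentile_rank(13_766_415, 23_344_526))  # ~41.0; page reports 39th (ties)
    print(percentile_rank(113_212, 228_330))        # ~50.4; page reports 48th (ties)
    print(percentile_rank(191, 332))                # ~42.5; page reports 41st (ties)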