
The Neural Basis of Speech Perception through Lipreading and Manual Cues: Evidence from Deaf Native Users of Cued Speech

Overview of attention for an article published in Frontiers in Psychology, March 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (76th percentile)
  • Good Attention Score compared to outputs of the same age and source (66th percentile)

Mentioned by

X (Twitter): 16 users

Readers on

Mendeley: 43 readers
DOI 10.3389/fpsyg.2017.00426
Authors

Mario Aparicio, Philippe Peigneux, Brigitte Charlier, Danielle Balériaux, Martin Kavec, Jacqueline Leybaert

Abstract

We present here the first neuroimaging data for perception of Cued Speech (CS) by deaf adults who are native users of CS. CS is a visual mode of communicating a spoken language through a set of manual cues which accompany lipreading and disambiguate it. With CS, sublexical units of the oral language are conveyed clearly and completely through the visual modality without requiring hearing. The comparison of neural processing of CS in deaf individuals with processing of audiovisual (AV) speech in normally hearing individuals represents a unique opportunity to explore the similarities and differences in neural processing of an oral language delivered in a visuo-manual vs. an AV modality. The study included deaf adults who were early CS users and hearing native speakers of French who process speech audiovisually. Words were presented in an event-related fMRI design. Three conditions were presented to each group of participants. The deaf participants saw CS words (manual + lipread), words presented as manual cues alone, and words presented to be lipread without manual cues. The hearing participants saw AV spoken words, audio-alone words, and lipread-alone words. Three findings are highlighted. First, the middle and superior temporal gyrus (excluding Heschl's gyrus) and the left inferior frontal gyrus pars triangularis constituted a common, amodal neural basis for AV and CS perception. Second, integration was inferred in posterior parts of the superior temporal sulcus for audio and lipread information in AV speech, but in the occipito-temporal junction, including MT/V5, for the manual cues and lipreading in CS. Third, the perception of manual cues showed a much greater overlap with the regions activated by CS (manual + lipreading) than lipreading alone did. This supports the notion that manual cues play a larger role than lipreading in CS processing. The present study contributes to a better understanding of the role of manual cues in supporting visual speech perception within the framework of the multimodal nature of human communication.

X Demographics

The data shown below were collected from the profiles of the 16 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 43 Mendeley readers of this research output.

Geographical breakdown

Country                          Count   As %
Unknown                             43   100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Bachelor                   8    19%
Student > Master                     6    14%
Researcher                           5    12%
Student > Ph. D. Student             5    12%
Student > Doctoral Student           4     9%
Other                                7    16%
Unknown                              8    19%
Readers by discipline            Count   As %
Psychology                           7    16%
Medicine and Dentistry               5    12%
Computer Science                     5    12%
Linguistics                          4     9%
Neuroscience                         4     9%
Other                               10    23%
Unknown                              8    19%
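
The "As %" column in these tables appears to be each count taken as a share of all 43 readers (with "Unknown" included in the denominator), rounded to the nearest whole percent. A minimal Python check of that assumed rounding rule:

```python
# Quick check of the "As %" column: each count as a rounded share of
# the 43 Mendeley readers. The rounding rule is our assumption, not
# documented by Altmetric.
readers_by_discipline = {
    "Psychology": 7, "Medicine and Dentistry": 5, "Computer Science": 5,
    "Linguistics": 4, "Neuroscience": 4, "Other": 10, "Unknown": 8,
}
total = sum(readers_by_discipline.values())  # 43
for label, count in readers_by_discipline.items():
    print(f"{label}: {round(100 * count / total)}%")
# Output matches the table above: 16%, 12%, 12%, 9%, 9%, 23%, 19%.
```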
Attention Score in Context

This research output has an Altmetric Attention Score of 8. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 10 October 2019.
All research outputs: #4,552,675 of 25,217,627 outputs
Outputs from Frontiers in Psychology: #7,697 of 34,069 outputs
Outputs of similar age: #73,502 of 314,771 outputs
Outputs of similar age from Frontiers in Psychology: #180 of 540 outputs
Altmetric has tracked 25,217,627 research outputs across all sources so far. Compared to these, this one has done well and is in the 81st percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 34,069 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.2. This one has done well, scoring higher than 77% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 314,771 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 76% of its contemporaries.
We're also able to compare this research output to 540 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 66% of its contemporaries.
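
The percentiles quoted in this section follow directly from the rankings listed above. Below is a minimal sketch of that arithmetic; the flooring behaviour is our assumption, not a documented Altmetric formula:

```python
# Convert a rank among tracked outputs into the percentile phrasing
# used above. Flooring via int() is assumed on our part; it happens
# to reproduce all four figures quoted on this page.
def rank_to_percentile(rank: int, total: int) -> int:
    """Percentage of outputs this one scored higher than, floored."""
    return int(100 * (1 - rank / total))

print(rank_to_percentile(4_552_675, 25_217_627))  # 81 -> "81st percentile"
print(rank_to_percentile(7_697, 34_069))          # 77 -> "higher than 77% of its peers"
print(rank_to_percentile(73_502, 314_771))        # 76 -> "76th percentile"
print(rank_to_percentile(180, 540))               # 66 -> "66th percentile"
```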