
Atypical Audiovisual Speech Integration in Infants at Risk for Autism

Overview of attention for article published in PLOS ONE, May 2012

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (84th percentile)
  • Good Attention Score compared to outputs of the same age and source (79th percentile)

Mentioned by

  • 11 X users
  • 1 Facebook page

Citations

  • 39 citations (Dimensions)

Readers

  • 196 readers (Mendeley)
Title
Atypical Audiovisual Speech Integration in Infants at Risk for Autism
Published in
PLOS ONE, May 2012
DOI 10.1371/journal.pone.0036428
Authors

Jeanne A. Guiraud, Przemyslaw Tomalski, Elena Kushnerenko, Helena Ribeiro, Kim Davies, Tony Charman, Mayada Elsabbagh, Mark H. Johnson

Abstract

The language difficulties often seen in individuals with autism might stem from an inability to integrate audiovisual information, a skill important for language development. We investigated whether 9-month-old siblings of older children with autism, who are at increased risk of developing autism, are able to integrate audiovisual speech cues. We used an eye-tracker to record where infants looked when shown a screen displaying two faces of the same model, one articulating /ba/ and the other /ga/, with one face congruent with the syllable sound being presented simultaneously and the other face incongruent. This method successfully showed that infants at low risk can integrate audiovisual speech: they looked for the same amount of time at the mouths in both the fusible visual /ga/ - audio /ba/ display and the congruent visual /ba/ - audio /ba/ display, indicating that the auditory and visual streams fuse into a McGurk-type syllabic percept in the incongruent condition. It also showed that low-risk infants could perceive a mismatch between auditory and visual cues: they looked longer at the mouth in the mismatched, non-fusible visual /ba/ - audio /ga/ display than in the congruent visual /ga/ - audio /ga/ display, demonstrating that they perceive an uncommon, and therefore interesting, speech-like percept when looking at the incongruent mouth (repeated-measures ANOVA, display x fusion/mismatch condition interaction: F(1,16) = 17.153, p = 0.001). The looking behaviour of high-risk infants did not differ by type of display, suggesting difficulties in matching auditory and visual information (repeated-measures ANOVA, display x condition interaction: F(1,25) = 0.09, p = 0.767), in contrast to low-risk infants (repeated-measures ANOVA, display x condition x low/high-risk group interaction: F(1,41) = 4.466, p = 0.041). In some cases this reduced ability might lead to the poor communication skills characteristic of autism.

X Demographics

The data shown below were collected from the profiles of 11 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 196 Mendeley readers of this research output.

Geographical breakdown

Country           Count   As %
United States         3     2%
Italy                 1    <1%
Malaysia              1    <1%
Spain                 1    <1%
United Kingdom        1    <1%
Unknown             189    96%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph.D. Student             38    19%
Researcher                          34    17%
Student > Bachelor                  19    10%
Professor                           16     8%
Student > Doctoral Student          16     8%
Other                               47    24%
Unknown                             26    13%

Readers by discipline            Count   As %
Psychology                          91    46%
Medicine and Dentistry              10     5%
Neuroscience                        10     5%
Social Sciences                      9     5%
Linguistics                          8     4%
Other                               34    17%
Unknown                             34    17%
Attention Score in Context

This research output has an Altmetric Attention Score of 9. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 12 August 2021.
  • All research outputs: #4,251,387 of 25,654,806 outputs
  • Outputs from PLOS ONE: #52,278 of 223,967 outputs
  • Outputs of similar age: #27,329 of 176,872 outputs
  • Outputs of similar age from PLOS ONE: #801 of 3,879 outputs
Altmetric has tracked 25,654,806 research outputs across all sources so far. Compared to these, this one has done well and is in the 83rd percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 223,967 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 15.8. This one has done well, scoring higher than 76% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 176,872 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 84% of its contemporaries.
We're also able to compare this research output to 3,879 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 79% of its contemporaries.
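The percentiles quoted above follow directly from each rank and total. A minimal sketch of that arithmetic, assuming a simple "fraction of outputs ranked below this one, truncated to a whole percentile" convention (the function name and the truncation are assumptions, not Altmetric's published method):

```python
import math

def percentile(rank: int, total: int) -> int:
    """Whole-number percentile for an output at `rank` out of `total`,
    where rank 1 is the highest-scoring output."""
    return math.floor((1 - rank / total) * 100)

# Figures taken from the rankings above:
print(percentile(4_251_387, 25_654_806))  # all research outputs -> 83
print(percentile(52_278, 223_967))        # outputs from PLOS ONE -> 76
print(percentile(27_329, 176_872))        # outputs of similar age -> 84
print(percentile(801, 3_879))             # similar age, PLOS ONE -> 79
```

Each result matches the percentile stated in the corresponding paragraph, which suggests the page's context figures are derived from the ranks in this straightforward way.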