
Emotion recognition of static and dynamic faces in autism spectrum disorder

Overview of attention for article published in Cognition and Emotion, December 2013

About this Attention Score

  • Good Attention Score compared to outputs of the same age (78th percentile)
  • Good Attention Score compared to outputs of the same age and source (75th percentile)

Mentioned by

3 X users
2 Wikipedia pages

Citations

48 Dimensions

Readers on

139 Mendeley
Title
Emotion recognition of static and dynamic faces in autism spectrum disorder
Published in
Cognition and Emotion, December 2013
DOI 10.1080/02699931.2013.867832
Authors

Peter G. Enticott, Hayley A. Kennedy, Patrick J. Johnston, Nicole J. Rinehart, Bruce J. Tonge, John R. Taffe, Paul B. Fitzgerald

Abstract

There is substantial evidence for facial emotion recognition (FER) deficits in autism spectrum disorder (ASD). The extent of this impairment, however, remains unclear, and there is some suggestion that clinical groups might benefit from the use of dynamic rather than static images. High-functioning individuals with ASD (n = 36) and typically developing controls (n = 36) completed a computerised FER task involving static and dynamic expressions of the six basic emotions. The ASD group showed poorer overall performance in identifying anger and disgust and were disadvantaged by dynamic (relative to static) stimuli when presented with sad expressions. Among both groups, however, dynamic stimuli appeared to improve recognition of anger. This research provides further evidence of specific impairment in the recognition of negative emotions in ASD, but argues against any broad advantages associated with the use of dynamic displays.

X Demographics

The data shown below were collected from the profiles of the 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 139 Mendeley readers of this research output.

Geographical breakdown

Country        Count   As %
Spain              1    <1%
Netherlands        1    <1%
Germany            1    <1%
Italy              1    <1%
Unknown          135    97%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                    26    19%
Student > Ph.D. Student             25    18%
Researcher                          16    12%
Student > Bachelor                  15    11%
Student > Doctoral Student           8     6%
Other                               20    14%
Unknown                             29    21%
Readers by discipline    Count   As %
Psychology                  63    45%
Social Sciences              9     6%
Medicine and Dentistry       9     6%
Neuroscience                 6     4%
Computer Science             5     4%
Other                       13     9%
Unknown                     34    24%
Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 14 March 2021.
All research outputs
#6,443,957
of 25,374,917 outputs
Outputs from Cognition and Emotion
#561
of 1,468 outputs
Outputs of similar age
#69,080
of 321,202 outputs
Outputs of similar age from Cognition and Emotion
#9
of 37 outputs
Altmetric has tracked 25,374,917 research outputs across all sources so far. This one has received more attention than most of these and is in the 74th percentile.
So far Altmetric has tracked 1,468 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 15.5. This one has received more attention than average, scoring higher than 61% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 321,202 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 78% of its contemporaries.
We're also able to compare this research output to 37 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 75% of its contemporaries.
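The percentile figures quoted above are consistent with a simple rank-based calculation: the output's rank within each comparison set (rank 1 being the highest-scoring) determines what share of the set it outscored. A minimal sketch, assuming this rank-to-percentile reading (the function name is illustrative, not part of any Altmetric API):

```python
def rank_percentile(rank: int, total: int) -> int:
    """Percent of outputs in a comparison set that this output outscores,
    given its rank (1 = highest score) and the set size, floored to a whole
    percent as Altmetric's prose figures appear to be."""
    return int((1 - rank / total) * 100)

# Figures reported on this page:
print(rank_percentile(6_443_957, 25_374_917))  # all research outputs -> 74
print(rank_percentile(69_080, 321_202))        # outputs of similar age -> 78
print(rank_percentile(9, 37))                  # similar age, same source -> 75
print(rank_percentile(561, 1_468))             # same source overall -> 61
```

Each result matches the percentile stated in the corresponding paragraph, which suggests the "scoring higher than N% of its contemporaries" figures are derived directly from the rank tables shown above.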