
Diagnostic Features of Emotional Expressions Are Processed Preferentially

Overview of attention for article published in PLOS ONE, July 2012

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (79th percentile)
  • Good Attention Score compared to outputs of the same age and source (74th percentile)

Mentioned by

  • 5 X users
  • 1 Wikipedia page

Citations

  • 85 Dimensions

Readers on

  • 142 Mendeley
  • 2 CiteULike
Title
Diagnostic Features of Emotional Expressions Are Processed Preferentially
Published in
PLOS ONE, July 2012
DOI 10.1371/journal.pone.0041792
Authors

Elisa Scheller, Christian Büchel, Matthias Gamer

Abstract

Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.

X Demographics

These data were compiled from the profiles of the 5 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 142 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
France | 2 | 1%
Germany | 1 | <1%
Brazil | 1 | <1%
Spain | 1 | <1%
Luxembourg | 1 | <1%
Unknown | 136 | 96%

Demographic breakdown

Readers by professional status | Count | As %
Student > Master | 26 | 18%
Student > Ph. D. Student | 24 | 17%
Student > Doctoral Student | 17 | 12%
Researcher | 14 | 10%
Student > Bachelor | 10 | 7%
Other | 25 | 18%
Unknown | 26 | 18%

Readers by discipline | Count | As %
Psychology | 65 | 46%
Agricultural and Biological Sciences | 7 | 5%
Neuroscience | 7 | 5%
Medicine and Dentistry | 6 | 4%
Engineering | 5 | 4%
Other | 17 | 12%
Unknown | 35 | 25%
Attention Score in Context

This research output has an Altmetric Attention Score of 7. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 23 September 2021.
All research outputs: #5,127,897 of 24,404,997 outputs
Outputs from PLOS ONE: #78,232 of 210,535 outputs
Outputs of similar age: #34,038 of 167,474 outputs
Outputs of similar age from PLOS ONE: #1,007 of 3,988 outputs
Altmetric has tracked 24,404,997 research outputs across all sources so far. Compared to these, this one has done well and is in the 78th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 210,535 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 15.6. This one has gotten more attention than average, scoring higher than 62% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 167,474 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 79% of its contemporaries.
We're also able to compare this research output to 3,988 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 74% of its contemporaries.
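As a rough sanity check on how these percentiles relate to the rankings above, here is a minimal Python sketch. It assumes the percentile is simply the share of the cohort ranked below this output, truncated to a whole number; this is an illustrative reconstruction rather than Altmetric's published formula, but it reproduces the four figures reported on this page.

```python
import math

def attention_percentile(rank: int, cohort_size: int) -> int:
    """Share of the cohort ranked below this output, truncated to a whole
    percentile. Illustrative approximation only; Altmetric does not publish
    this exact formula."""
    return math.floor((1 - rank / cohort_size) * 100)

# Rank / cohort-size pairs reported above:
cohorts = {
    "All research outputs":                 (5_127_897, 24_404_997),  # -> 78
    "Outputs from PLOS ONE":                (78_232, 210_535),        # -> 62
    "Outputs of similar age":               (34_038, 167_474),        # -> 79
    "Outputs of similar age from PLOS ONE": (1_007, 3_988),           # -> 74
}

for label, (rank, total) in cohorts.items():
    print(f"{label}: {attention_percentile(rank, total)}th percentile")
```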