
Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa

Overview of attention for article published in Frontiers in Psychology, May 2018
This output's Attention Score is average compared to outputs of the same age.

Mentioned by

3 X users

Citations

10 Dimensions

Readers on

44 Mendeley
DOI 10.3389/fpsyg.2018.00618
Authors

Michael Vesker, Daniela Bahn, Christina Kauschke, Monika Tschense, Franziska Degé, Gudrun Schwarzer

Abstract

To assess how the perception of audible speech and facial expressions influence one another in the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-year-olds) as well as college-aged adults. In Experiment 1, 74 children and 24 adults categorized photographs of emotional faces as positive or negative as quickly as possible after being primed with auditorily presented emotion words in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adults carried out a similar categorization task, but with faces as visual primes and emotion words as auditory targets. Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words than by positive words, and that 6-year-old children were particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Experiment 2, in contrast, showed no congruency effects for priming by facial expressions. Thus, audible emotion words appear to influence the emotional categorization of faces, while faces do not significantly influence the categorization of emotion words.

X Demographics

The data shown below were collected from the profiles of 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 44 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 44 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph.D. Student | 9 | 20%
Student > Master | 8 | 18%
Student > Bachelor | 4 | 9%
Researcher | 3 | 7%
Lecturer | 2 | 5%
Other | 7 | 16%
Unknown | 11 | 25%
Readers by discipline | Count | As %
Psychology | 18 | 41%
Neuroscience | 7 | 16%
Agricultural and Biological Sciences | 4 | 9%
Unspecified | 2 | 5%
Veterinary Science and Veterinary Medicine | 1 | 2%
Other | 2 | 5%
Unknown | 10 | 23%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 01 May 2018.
All research outputs: #14,387,654 of 23,043,346
Outputs from Frontiers in Psychology: #15,296 of 30,345
Outputs of similar age: #185,214 of 326,167
Outputs of similar age from Frontiers in Psychology: #420 of 627
Altmetric has tracked 23,043,346 research outputs across all sources so far. This one is in the 35th percentile – i.e., 35% of other outputs scored the same or lower than it.
So far Altmetric has tracked 30,345 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.5. This one is in the 46th percentile – i.e., 46% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 326,167 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 40th percentile – i.e., 40% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 627 others from the same source and published within six weeks on either side of this one. This one is in the 29th percentile – i.e., 29% of its contemporaries scored the same or lower than it.
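The percentile comparisons above all follow the same arithmetic: take the output's rank among some pool, and ask what fraction of the pool ranked the same or lower. A minimal sketch of that calculation, using the "all research outputs" figures from this page (the naive result, about 37.6, differs slightly from the reported 35th percentile, since this sketch ignores how tied scores are counted and how Altmetric rounds):

```python
def percentile_rank(position: int, total: int) -> float:
    """Percentage of outputs ranked at or below `position`
    (1 = highest-ranked) in a pool of `total` outputs.
    Ignores ties, which shift the exact figure."""
    return (total - position) / total * 100

# Rank #14,387,654 of 23,043,346 tracked outputs:
print(round(percentile_rank(14_387_654, 23_043_346), 1))  # 37.6
```

The same function applies to the age-matched and source-matched pools; only the rank and pool size change.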