
Motivated empathy: The mechanics of the empathic gaze

Overview of attention for article published in Cognition and Emotion, February 2014

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (91st percentile)
  • High Attention Score compared to outputs of the same age and source (93rd percentile)

Mentioned by

1 blog
13 X users

Citations

49 Dimensions

Readers on

130 Mendeley
Title: Motivated empathy: The mechanics of the empathic gaze
Published in: Cognition and Emotion, February 2014
DOI: 10.1080/02699931.2014.890563
Authors: David G. Cowan, Eric J. Vanman, Mark Nielsen

Abstract

Successful human social interactions frequently rely on appropriate interpersonal empathy and eye contact. Here, we report a previously unseen relationship between trait empathy and eye-gaze patterns to affective facial features in video-based stimuli. Fifty-nine healthy adult participants had their eyes tracked while watching a three-minute-long "sad" and "emotionally neutral" video. The video stimuli portrayed the head and shoulders of the same actor recounting a fictional personal event. Analyses revealed that the greater participants' trait emotional empathy, the more they fixated on the eye region of the actor, regardless of the emotional valence of the video stimuli. Our findings provide the first empirical evidence of a relationship between empathic capacity and eye-gaze pattern to the most affective facial region (the eyes).

X Demographics

The data shown below were collected from the profiles of the 13 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 130 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    130      100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Ph.D. Student           28       22%
Student > Master                  22       17%
Student > Bachelor                14       11%
Researcher                        10       8%
Student > Doctoral Student        10       8%
Other                             25       19%
Unknown                           21       16%

Readers by discipline             Count    As %
Psychology                        59       45%
Neuroscience                      5        4%
Engineering                       5        4%
Linguistics                       4        3%
Medicine and Dentistry            4        3%
Other                             21       16%
Unknown                           32       25%
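
For readers checking the arithmetic, the "As %" columns above are simply each count taken as a share of the 130 total Mendeley readers, rounded to the nearest whole percent. A minimal sketch of that calculation (the dictionary below just restates the professional-status table; none of these names come from an Altmetric API):

```python
# Reproduce the "As %" column of the professional-status table above:
# each count as a share of the 130 total readers, rounded to the
# nearest whole percent. Data restated from the table; illustrative only.

TOTAL_READERS = 130

status_counts = {
    "Student > Ph.D. Student": 28,
    "Student > Master": 22,
    "Student > Bachelor": 14,
    "Researcher": 10,
    "Student > Doctoral Student": 10,
    "Other": 25,
    "Unknown": 21,
}

for status, count in status_counts.items():
    share = round(100 * count / TOTAL_READERS)
    print(f"{status}: {count} ({share}%)")  # matches 22/17/11/8/8/19/16% above
```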
Attention Score in Context

This research output has an Altmetric Attention Score of 17. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 August 2021.
All research outputs: #2,115,681 of 25,373,627 outputs
Outputs from Cognition and Emotion: #263 of 1,468 outputs
Outputs of similar age: #20,911 of 234,818 outputs
Outputs of similar age from Cognition and Emotion: #2 of 30 outputs
Altmetric has tracked 25,373,627 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 91st percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,468 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 15.5. This one has done well, scoring higher than 82% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 234,818 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 91% of its contemporaries.
We're also able to compare this research output to 30 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 93% of its contemporaries.
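
All four percentile claims in this section follow directly from the ranks and totals quoted above: an output ranked r out of n has scored higher than (n − r)/n of its peers. A quick sketch of that arithmetic, written as a reader's check rather than taken from Altmetric's published methodology:

```python
# Check the percentile claims above from the quoted ranks and totals.
# Flooring 100 * (total - rank) / total reproduces every figure on this
# page; Altmetric's exact method is not given here, so this is a
# reconstruction, not their algorithm.
import math

def outranks_percent(rank: int, total: int) -> int:
    """Percent of tracked outputs that this output scored higher than."""
    return math.floor(100 * (total - rank) / total)

comparisons = [
    ("All research outputs",                    2_115_681, 25_373_627),
    ("Outputs from Cognition and Emotion",      263,       1_468),
    ("Outputs of similar age",                  20_911,    234_818),
    ("Similar age, from Cognition and Emotion", 2,         30),
]

for label, rank, total in comparisons:
    print(f"{label}: higher than {outranks_percent(rank, total)}%")
# -> 91%, 82%, 91%, 93%, matching the percentages quoted above.
```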