
View dependencies in the visual recognition of social interactions

Overview of attention for an article published in Frontiers in Psychology, January 2013

About this Attention Score

  • Good Attention Score compared to outputs of the same age (76th percentile)
  • Above-average Attention Score compared to outputs of the same age and source (58th percentile)

Mentioned by

  • 6 X users
  • 1 Google+ user

Readers on

  • 34 Mendeley readers
Title
View dependencies in the visual recognition of social interactions
Published in
Frontiers in Psychology, January 2013
DOI 10.3389/fpsyg.2013.00752
Authors

Stephan de la Rosa, Sarah Mieskes, Heinrich H. Bülthoff, Cristóbal Curio

Abstract

Recognizing social interactions, e.g., two people shaking hands, is important for obtaining information about other people and the surrounding social environment. Despite the visual complexity of social interactions, humans often have little difficulty recognizing them visually. What is the visual representation of social interactions, and which bodily visual cues support this remarkable human ability? Viewpoint-dependent representations are considered to be at the heart of the visual recognition of many visual stimuli, including objects (Bülthoff and Edelman, 1992) and biological motion patterns (Verfaillie, 1993). Here we addressed the question of whether complex social actions acted out between pairs of people, e.g., hugging, are represented in a similar manner. To this end, we created 3-D models from motion-captured actions acted out by two people, e.g., hugging. These 3-D models allowed us to present the same action from different viewpoints. Participants' task was to discriminate a target action from distractor actions in a one-interval forced-choice (1IFC) task. We measured participants' recognition performance in terms of reaction times (RT) and d-prime (d'). For each tested action we found one view that led to superior recognition performance compared to the other views. This finding demonstrates view-dependent effects in visual recognition, in line with the idea of a view-dependent representation of social interactions. Subsequently, we examined the degree to which joint velocities predict the recognition performance of social interactions, in order to identify candidate visual cues underlying their recognition. We found that the velocities of the arms, both feet, and the hips correlated with recognition performance.
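For readers unfamiliar with the sensitivity measure the abstract mentions, here is a minimal sketch (not taken from the article; the function name, trial counts, and the half-count correction for extreme rates are illustrative assumptions) of how d' is conventionally computed from hits and false alarms in a discrimination task:

```python
# d' = Z(hit rate) - Z(false-alarm rate), where Z is the inverse of the
# standard normal CDF. This is the standard signal-detection formula, not
# code from the study itself.
from statistics import NormalDist

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    # Clamp rates away from 0 and 1 so the z-scores stay finite (half-count rule).
    hit_rate = min(max(hits / n_signal, 0.5 / n_signal), 1 - 0.5 / n_signal)
    fa_rate = min(max(false_alarms / n_noise, 0.5 / n_noise), 1 - 0.5 / n_noise)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Illustrative example: 32 hits in 40 target trials, 8 false alarms in 40
# distractor trials.
print(round(d_prime(32, 8, 8, 32), 2))  # 1.68
```

Higher d' means better discrimination of the target action from the distractors, independent of any bias toward answering "target".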

X Demographics

The data shown below were collected from the profiles of the 6 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 34 Mendeley readers of this research output.

Geographical breakdown

Country        | Count | As %
United Kingdom | 1     | 3%
United States  | 1     | 3%
Germany        | 1     | 3%
Unknown        | 31    | 91%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph.D. Student        | 10    | 29%
Student > Master               | 6     | 18%
Researcher                     | 5     | 15%
Student > Doctoral Student     | 4     | 12%
Student > Bachelor             | 2     | 6%
Other                          | 4     | 12%
Unknown                        | 3     | 9%
Readers by discipline                | Count | As %
Psychology                           | 16    | 47%
Agricultural and Biological Sciences | 3     | 9%
Neuroscience                         | 3     | 9%
Physics and Astronomy                | 2     | 6%
Social Sciences                      | 2     | 6%
Other                                | 2     | 6%
Unknown                              | 6     | 18%
Attention Score in Context

This research output has an Altmetric Attention Score of 5. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 07 January 2014.
Comparison                                          | Rank       | Out of
All research outputs                                | #6,312,966 | 23,133,982
Outputs from Frontiers in Psychology                | #9,076     | 30,588
Outputs of similar age                              | #67,199    | 282,769
Outputs of similar age from Frontiers in Psychology | #400       | 969
Altmetric has tracked 23,133,982 research outputs across all sources so far. This one has received more attention than most of them and is in the 72nd percentile.
So far Altmetric has tracked 30,588 research outputs from this source. They typically receive much more attention than average, with a mean Attention Score of 12.5. This one has received more attention than average, scoring higher than 70% of its peers.
Older research outputs tend to score higher simply because they have had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 282,769 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 76% of its contemporaries.
We can also compare this research output to the 969 others from the same source that were published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 58% of its contemporaries.
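For concreteness, every percentile quoted above follows from the rank/total pairs in the table: an output ranked `rank` (1 = most attention) out of `total` comparators outscores the fraction (1 - rank/total) of them. The sketch below is an assumed reconstruction of that arithmetic, not Altmetric's actual code; in particular, flooring the result is an assumption, though it reproduces all four quoted figures:

```python
import math

def percentile_from_rank(rank: int, total: int) -> int:
    # Percentile implied by being ranked `rank` out of `total` comparators.
    return math.floor(100 * (1 - rank / total))

print(percentile_from_rank(6_312_966, 23_133_982))  # 72 -> "72nd percentile"
print(percentile_from_rank(9_076, 30_588))          # 70 -> "higher than 70% of its peers"
print(percentile_from_rank(67_199, 282_769))        # 76 -> "76% of its contemporaries"
print(percentile_from_rank(400, 969))               # 58 -> "58% of its contemporaries"
```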