
Sensing sociality in dogs: what may make an interactive robot social?

Overview of attention for article published in Animal Cognition, September 2013

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (99th percentile)
  • High Attention Score compared to outputs of the same age and source (95th percentile)

Mentioned by

  • 17 news outlets
  • 2 blogs
  • 16 X users
  • 1 Facebook page

Citations

  • 18 Dimensions

Readers on

  • 89 Mendeley
Title: Sensing sociality in dogs: what may make an interactive robot social?
Published in: Animal Cognition, September 2013
DOI: 10.1007/s10071-013-0670-7
Authors: Gabriella Lakatos, Mariusz Janiak, Lukasz Malek, Robert Muszynski, Veronika Konok, Krzysztof Tchon, Á. Miklósi

Abstract

This study investigated whether dogs would engage in social interactions with an unfamiliar robot and utilize the communicative signals it provides, and examined whether the level of sociality shown by the robot affects the dogs' performance. We hypothesized that dogs would react to the communicative signals of a robot more successfully if the robot showed interactive social behaviour in general (towards both humans and dogs) than if it behaved in a machine-like, asocial way. The experiment consisted of an interaction phase followed by a pointing session, both with a human and a robotic experimenter. In the interaction phase, dogs witnessed a 6-min interaction episode between the owner and a human experimenter and another 6-min interaction episode between the owner and the robot. Each interaction episode was followed by a pointing phase in which the human/robot experimenter indicated the location of hidden food using pointing gestures (two-way choice test). The results showed that in the interaction phase, the dogs' behaviour towards the robot was affected by the differential exposure. Dogs spent more time near the robot experimenter than near the human experimenter, and this difference was even more pronounced when the robot behaved socially. Similarly, dogs spent more time gazing at the head of the robot experimenter when the situation was social. Dogs achieved a significantly lower level of performance (finding the hidden food) with the pointing robot than with the pointing human; however, separate analysis of the robot sessions suggested that the gestures of the socially behaving robot were easier for the dogs to comprehend than those of the asocially behaving robot. Thus, the level of sociality shown by the robot was not enough to elicit the same set of social behaviours from the dogs as was possible with humans, although sociality had a positive effect on dog-robot interactions.

X Demographics

The data shown below were collected from the profiles of 16 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 89 Mendeley readers of this research output.

Geographical breakdown

Country     | Count | As %
Hungary     | 2     | 2%
Switzerland | 1     | 1%
Unknown     | 86    | 97%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph.D. Student        | 19    | 21%
Student > Master               | 16    | 18%
Student > Bachelor             | 11    | 12%
Researcher                     | 9     | 10%
Student > Doctoral Student     | 5     | 6%
Other                          | 15    | 17%
Unknown                        | 14    | 16%
Readers by discipline                | Count | As %
Agricultural and Biological Sciences | 21    | 24%
Psychology                           | 20    | 22%
Computer Science                     | 9     | 10%
Social Sciences                      | 4     | 4%
Medicine and Dentistry               | 4     | 4%
Other                                | 13    | 15%
Unknown                              | 18    | 20%
Attention Score in Context

This research output has an Altmetric Attention Score of 170. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 10 March 2020.
All research outputs: #223,296 of 24,364,603 outputs
Outputs from Animal Cognition: #72 of 1,534 outputs
Outputs of similar age: #1,614 of 203,417 outputs
Outputs of similar age from Animal Cognition: #2 of 22 outputs
Altmetric has tracked 24,364,603 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 99th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,534 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 35.3. This one has done particularly well, scoring higher than 95% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 203,417 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 99% of its contemporaries.
We're also able to compare this research output to 22 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 95% of its contemporaries.