Picture object recognition in an American black bear (Ursus americanus)

Overview of attention for article published in Animal Cognition, June 2016

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (98th percentile)
  • High Attention Score compared to outputs of the same age and source (89th percentile)

Mentioned by

  • 21 news outlets
  • 1 blog
  • 3 X users
  • 1 Facebook page

Citations

32 citations (Dimensions)

Readers

28 readers (Mendeley)
Title
Picture object recognition in an American black bear (Ursus americanus)
Published in
Animal Cognition, June 2016
DOI 10.1007/s10071-016-1011-4
Pubmed ID
Authors

Zoe Johnson-Ulrich, Jennifer Vonk, Mary Humbyrd, Marilyn Crowley, Ela Wojtkowski, Florence Yates, Stephanie Allard

Abstract

Many animals have been tested for conceptual discriminations using two-dimensional images as stimuli, and many of these species appear to transfer knowledge from 2D images to analogous real-life objects. We tested an American black bear for picture-object recognition using a two-alternative forced-choice task. She was presented with four unique sets of objects and corresponding pictures. The bear showed generalization both from objects to pictures and from pictures to objects; however, her transfer was superior from real objects to pictures, suggesting that bears can recognize visual features of real objects within photographic images during discriminations.

X Demographics

The data shown below were collected from the profiles of 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 28 Mendeley readers of this research output.

Geographical breakdown

Country         Count   As %
United States       1     4%
Unknown            27    96%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                           5    18%
Student > Master                     4    14%
Student > Bachelor                   3    11%
Other                                3    11%
Lecturer                             2     7%
Other                                6    21%
Unknown                              5    18%
Readers by discipline                        Count   As %
Agricultural and Biological Sciences             8    29%
Psychology                                       6    21%
Social Sciences                                  2     7%
Environmental Science                            1     4%
Veterinary Science and Veterinary Medicine       1     4%
Other                                            2     7%
Unknown                                          8    29%
Attention Score in Context

This research output has an Altmetric Attention Score of 172. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 April 2023.
All research outputs: #223,629 of 24,614,554 outputs
Outputs from Animal Cognition: #70 of 1,544 outputs
Outputs of similar age: #4,584 of 359,842 outputs
Outputs of similar age from Animal Cognition: #4 of 28 outputs
Altmetric has tracked 24,614,554 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 99th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,544 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 35.2. This one has done particularly well, scoring higher than 95% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 359,842 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 98% of its contemporaries.
We're also able to compare this research output to 28 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 89% of its contemporaries.