
Insect Brains Use Image Interpolation Mechanisms to Recognise Rotated Objects

Overview of attention for article published in PLOS ONE, December 2008

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (91st percentile)
  • Good Attention Score compared to outputs of the same age and source (77th percentile)

Mentioned by

  • News: 1 news outlet
  • X: 1 user

Citations

  • Dimensions: 23 citations

Readers on

  • Mendeley: 71 readers
  • CiteULike: 2 readers
Published in
PLOS ONE, December 2008
DOI 10.1371/journal.pone.0004086
Authors

Adrian G. Dyer, Quoc C. Vuong

Abstract

Recognising complex three-dimensional objects presents significant challenges to visual systems when these objects are rotated in depth. The image-processing requirements for reliable individual recognition under these circumstances are computationally intensive, since local features and their spatial relationships may change significantly as an object is rotated in the horizontal plane. Visual experience is known to be important when primate brains learn to recognise rotated objects, but it is currently unknown how animals with comparatively simple brains deal with the problem of reliably recognising objects seen from different viewpoints. We show that the miniature brains of honeybees initially demonstrate a low tolerance for novel views of complex shapes (e.g. human faces), but can learn to recognise novel views of stimuli by interpolating between, or 'averaging', views they have experienced. The finding that visual experience is also important for bees has implications for understanding how three-dimensional, biologically relevant objects like flowers are recognised in complex environments, and for how machine vision might be taught to solve related visual problems.
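The view-interpolation mechanism the abstract describes can be illustrated with a toy sketch: each experienced view is modelled as a feature vector, and a novel intermediate view is matched against the average ('interpolated') template rather than against either trained view alone. Everything here — the feature vectors, their dimensionality, and the noise level — is hypothetical and purely illustrative, not the authors' actual stimuli or analysis.

```python
import math
import random

random.seed(0)

DIM = 64  # size of the (hypothetical) feature vector for one view


def random_view():
    """A stand-in feature vector for one experienced view (illustrative only)."""
    return [random.gauss(0.0, 1.0) for _ in range(DIM)]


def blend(a, b, w=0.5):
    """Interpolate ('average') two view representations."""
    return [w * x + (1 - w) * y for x, y in zip(a, b)]


def similarity(a, b):
    """Cosine similarity between two view representations."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))


view_left = random_view()    # e.g. a face rotated 30° to the left
view_right = random_view()   # e.g. the same face rotated 30° to the right

# A novel frontal view, modelled as lying between the trained views plus noise.
novel = [x + random.gauss(0.0, 0.1) for x in blend(view_left, view_right)]

# The interpolated template matches the novel view better than either
# trained view alone -- the benefit the bees' behaviour suggests.
template = blend(view_left, view_right)
assert similarity(novel, template) > max(similarity(novel, view_left),
                                         similarity(novel, view_right))
```

The design point is that the recogniser never stores the novel view: tolerance to the unseen viewpoint falls out of combining the stored views.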

X Demographics


The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers


The data shown below were compiled from readership statistics for 71 Mendeley readers of this research output.

Geographical breakdown

Country           Count    %
Germany               3    4%
United States         3    4%
United Kingdom        2    3%
Japan                 1    1%
Mexico                1    1%
Unknown              61   86%

Demographic breakdown

Readers by professional status           Count    %
Researcher                                  19   27%
Student > Ph.D. Student                     17   24%
Other                                        5    7%
Professor                                    5    7%
Student > Master                             5    7%
Other                                       14   20%
Unknown                                      6    8%

Readers by discipline                    Count    %
Agricultural and Biological Sciences        31   44%
Psychology                                  13   18%
Medicine and Dentistry                       5    7%
Computer Science                             5    7%
Engineering                                  3    4%
Other                                        8   11%
Unknown                                      6    8%
Attention Score in Context


This research output has an Altmetric Attention Score of 11. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 03 November 2022.
  • All research outputs: #2,960,045 of 23,660,057 outputs
  • Outputs from PLOS ONE: #38,205 of 201,988 outputs
  • Outputs of similar age: #14,656 of 172,472 outputs
  • Outputs of similar age from PLOS ONE: #103 of 463 outputs
Altmetric has tracked 23,660,057 research outputs across all sources so far. Compared to these, this one has done well and is in the 87th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 201,988 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 15.4. This one has done well, scoring higher than 80% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 172,472 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 91% of its contemporaries.
We're also able to compare this research output to 463 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 77% of its contemporaries.
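The percentile figures quoted above follow, approximately, from the rank-of-total numbers in the rankings: percentile ≈ floor((1 − rank/total) × 100). A minimal sketch of that arithmetic (the exact rounding Altmetric applies is an assumption):

```python
import math


def percentile_from_rank(rank: int, total: int) -> int:
    """Share of tracked outputs ranked below this one, floored to a whole
    percent (an assumed approximation of Altmetric's rounding)."""
    return math.floor((1 - rank / total) * 100)


# All research outputs: rank #2,960,045 of 23,660,057
print(percentile_from_rank(2_960_045, 23_660_057))  # 87 -> "87th percentile"
# Outputs of similar age: rank #14,656 of 172,472
print(percentile_from_rank(14_656, 172_472))        # 91 -> "91st percentile"
# Similar age, same source: rank #103 of 463
print(percentile_from_rank(103, 463))               # 77 -> "77th percentile"
```

The same formula gives 81 for the PLOS ONE peer comparison (#38,205 of 201,988) where the page reports "higher than 80%", so Altmetric's own rounding or snapshot counts evidently differ slightly.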