
Size does not matter: size-invariant echo-acoustic object classification

Overview of attention for article published in Journal of Comparative Physiology A, November 2012

Mentioned by
1 X user

Citations
9 (Dimensions)

Readers
55 (Mendeley)
DOI 10.1007/s00359-012-0777-3
Authors

Daria Genzel, Lutz Wiegrebe

Abstract

Echolocating bats can not only extract spatial information from the auditory analysis of their ultrasonic emissions, they can also discriminate, classify and identify the three-dimensional shape of objects reflecting their emissions. Effective object recognition requires the segregation of size and shape information. Previous studies have shown that, as in visual object recognition, bats can transfer an echo-acoustic object discrimination task to objects of different size, and that they spontaneously classify scaled versions of virtual echo-acoustic objects according to trained virtual-object standards. The current study aims to bridge the gap between these previous findings using a different class of real objects and a classification instead of a discrimination paradigm. Echolocating bats (Phyllostomus discolor) were trained to classify an object as either a sphere or an hourglass-shaped object. The bats spontaneously generalised this classification to differently sized objects of the same shape. The generalisation cannot be explained by similarities of the power spectra or temporal structures of the echo-acoustic object images and thus requires dedicated neural mechanisms for size-invariant echo-acoustic object analysis. Control experiments with human listeners classifying the echo-acoustic images of the objects confirm the universal validity of auditory size invariance. The current data thus corroborate and extend previous psychophysical evidence for sonar auditory-object normalisation and suggest that the auditory mechanisms underlying this normalisation, following the initial neural extraction of the echo-acoustic images, may be very similar in bats and humans.

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 55 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United Kingdom 1 2%
United States 1 2%
Germany 1 2%
Brazil 1 2%
Unknown 51 93%

Demographic breakdown

Readers by professional status Count As %
Researcher 13 24%
Student > Ph.D. Student 12 22%
Student > Master 10 18%
Professor > Associate Professor 4 7%
Student > Bachelor 3 5%
Other 8 15%
Unknown 5 9%
Readers by discipline Count As %
Agricultural and Biological Sciences 24 44%
Engineering 9 16%
Neuroscience 7 13%
Psychology 4 7%
Environmental Science 2 4%
Other 4 7%
Unknown 5 9%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 01 December 2012.
All research outputs
#16,049,105
of 23,815,455 outputs
Outputs from Journal of Comparative Physiology A
#1,059
of 1,450 outputs
Outputs of similar age
#183,853
of 281,426 outputs
Outputs of similar age from Journal of Comparative Physiology A
#10
of 23 outputs
Altmetric has tracked 23,815,455 research outputs across all sources so far. This one is in the 22nd percentile – i.e., 22% of other outputs scored the same or lower than it.
So far Altmetric has tracked 1,450 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 7.9. This one is in the 17th percentile – i.e., 17% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 281,426 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 24th percentile – i.e., 24% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 23 others from the same source and published within six weeks on either side of this one. This one is in the 26th percentile – i.e., 26% of its contemporaries scored the same or lower than it.
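The percentile figures above all follow the same rule: a percentile is the share of outputs in the comparison group whose score is the same as or lower than this output's. As a rough illustration (not Altmetric's actual implementation, and with made-up scores), the calculation looks like this:

```python
def attention_percentile(score, peer_scores):
    """Percentage of peer outputs scoring the same as or lower than `score`.

    Illustrative only: peer_scores is a hypothetical list of Attention
    Scores for the comparison group (e.g. outputs of similar age).
    """
    same_or_lower = sum(1 for s in peer_scores if s <= score)
    return round(100 * same_or_lower / len(peer_scores))

# Hypothetical comparison group of ten outputs:
peers = [0, 0, 1, 1, 2, 5, 7, 12, 30, 100]

# An output with score 1 ties or beats four of the ten peers:
attention_percentile(1, peers)  # 40
```

Note that rankings and percentiles on this page were computed at the time of the last mention, so the rank positions and percentiles shown need not agree exactly with a fresh calculation from the current totals.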