
A neural representation of depth from motion parallax in macaque visual cortex

Overview of attention for an article published in Nature, March 2008

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (91st percentile)
  • Above-average Attention Score compared to outputs of the same age and source (61st percentile)

Mentioned by

1 blog
1 X user
1 Wikipedia page
1 research highlight platform (F1000)

Citations

99 Dimensions

Readers on

292 Mendeley
4 CiteULike
1 Connotea
Title
A neural representation of depth from motion parallax in macaque visual cortex
Published in
Nature, March 2008
DOI
10.1038/nature06814
Pubmed ID
Authors

Jacob W. Nadler, Dora E. Angelaki, Gregory C. DeAngelis

Abstract

Perception of depth is a fundamental challenge for the visual system, particularly for observers moving through their environment. The brain makes use of multiple visual cues to reconstruct the three-dimensional structure of a scene. One potent cue, motion parallax, frequently arises during translation of the observer because the images of objects at different distances move across the retina with different velocities. Human psychophysical studies have demonstrated that motion parallax can be a powerful depth cue, and motion parallax seems to be heavily exploited by animal species that lack highly developed binocular vision. However, little is known about the neural mechanisms that underlie this capacity. Here we show, by using a virtual-reality system to translate macaque monkeys (Macaca mulatta) while they viewed motion parallax displays that simulated objects at different depths, that many neurons in the middle temporal area (area MT) signal the sign of depth (near versus far) from motion parallax in the absence of other depth cues. To achieve this, neurons must combine visual motion with extra-retinal (non-visual) signals related to the animal's movement. Our findings suggest a new neural substrate for depth perception and demonstrate a robust interaction of visual and non-visual cues in area MT. Combined with previous studies that implicate area MT in depth perception based on binocular disparities, our results suggest that area MT contains a more general representation of three-dimensional space that makes use of multiple cues.
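
For readers unfamiliar with the cue, the following is a minimal geometric sketch, not taken from the paper, of the relationship the abstract describes: during lateral translation while fixation is maintained, an object's retinal image moves at an angular speed that depends on its distance relative to the fixation point, and the direction of that motion reverses between near and far objects. The retinal direction alone is ambiguous without knowing the direction of self-motion, which is why the study's neurons must combine visual motion with extra-retinal signals. The function name and the small-angle approximation here are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): approximate retinal image motion
# during lateral observer translation with the eye tracking a fixation point.
# Small-angle approximation; positive values mean image motion in the same
# direction as the translation. The function name is hypothetical.

def retinal_velocity(translation_speed, object_distance, fixation_distance):
    """Approximate angular velocity (rad/s) of a small object near the line
    of sight, for lateral translation at translation_speed (m/s) while the
    eye tracks a fixation point at fixation_distance (m)."""
    return translation_speed * (1.0 / fixation_distance - 1.0 / object_distance)

# Example: translating rightward at 0.1 m/s while fixating a point 0.5 m away.
# Near objects (d < 0.5 m) move opposite to the translation, far objects with
# it, so the sign of retinal motion carries the sign of depth -- but only once
# the direction of self-motion is known from extra-retinal signals.
for d in (0.3, 0.5, 1.0):
    print(f"distance {d:.1f} m -> retinal velocity {retinal_velocity(0.1, d, 0.5):+.3f} rad/s")
```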

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 292 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United States       16     5%
Japan                4     1%
United Kingdom       3     1%
Germany              3     1%
Netherlands          2    <1%
Switzerland          2    <1%
Belgium              1    <1%
Canada               1    <1%
Singapore            1    <1%
Other                1    <1%
Unknown            258    88%

Demographic breakdown

Readers by professional status     Count   As %
Researcher                            84    29%
Student > Ph.D. Student               52    18%
Professor > Associate Professor       29    10%
Professor                             26     9%
Student > Bachelor                    17     6%
Other                                 56    19%
Unknown                               28    10%
Readers by discipline                  Count   As %
Agricultural and Biological Sciences      73    25%
Psychology                                68    23%
Neuroscience                              56    19%
Engineering                               17     6%
Medicine and Dentistry                    17     6%
Other                                     26     9%
Unknown                                   35    12%
Attention Score in Context

This research output has an Altmetric Attention Score of 13. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 July 2022.
All research outputs: #2,445,407 of 22,867,327 outputs
Outputs from Nature: #44,277 of 91,042 outputs
Outputs of similar age: #6,806 of 80,490 outputs
Outputs of similar age from Nature: #210 of 552 outputs
Altmetric has tracked 22,867,327 research outputs across all sources so far. Compared to these, this one has done well and is in the 89th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 91,042 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 99.4. This one has scored higher than 51% of its peers, placing it just above the median for Nature, though well below that mean.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 80,490 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 91% of its contemporaries.
We're also able to compare this research output to 552 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 61% of its contemporaries.
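
As an aside on how the quoted percentile figures relate to the ranks listed above, the sketch below reproduces them from the rank/total pairs. It is only an illustration, not Altmetric's implementation; rounding down is an assumption chosen because it matches all four quoted percentages.

```python
import math

# Illustration only (not Altmetric's implementation): turning a rank within a
# comparison set into the "scored higher than X%" figures quoted above.
# Rounding down is an assumption made to match the quoted numbers.

def percent_outscored(rank, total):
    """Percentage of the comparison set that this output scored higher than,
    given a 1-based rank (1 = highest score) among total outputs."""
    return math.floor(100.0 * (total - rank) / total)

comparisons = {
    "All research outputs": (2_445_407, 22_867_327),
    "Outputs from Nature": (44_277, 91_042),
    "Outputs of similar age": (6_806, 80_490),
    "Outputs of similar age from Nature": (210, 552),
}

for label, (rank, total) in comparisons.items():
    print(f"{label}: higher than {percent_outscored(rank, total)}% of outputs")
# -> 89%, 51%, 91%, 61%, matching the figures in the text above.
```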