
Comparison of deep neural networks to spatio-temporal cortical dynamics of human visual object recognition reveals hierarchical correspondence

Overview of attention for article published in Scientific Reports, June 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (91st percentile)
  • High Attention Score compared to outputs of the same age and source (88th percentile)

Mentioned by

2 blogs
20 X users
1 Facebook page
1 research highlight platform (F1000)

Citations

595 Dimensions

Readers on

622 Mendeley
Title: Comparison of deep neural networks to spatio-temporal cortical dynamics of human visual object recognition reveals hierarchical correspondence
Published in: Scientific Reports, June 2016
DOI: 10.1038/srep27755
PubMed ID:
Authors: Radoslaw Martin Cichy, Aditya Khosla, Dimitrios Pantazis, Antonio Torralba, Aude Oliva

Abstract

The complex multi-stage architecture of cortical visual pathways provides the neural basis for efficient visual object recognition in humans. However, the stage-wise computations therein remain poorly understood. Here, we compared temporal (magnetoencephalography) and spatial (functional MRI) visual brain representations with representations in an artificial deep neural network (DNN) tuned to the statistics of real-world visual recognition. We showed that the DNN captured the stages of human visual processing in both time and space from early visual areas towards the dorsal and ventral streams. Further investigation of crucial DNN parameters revealed that while model architecture was important, training on real-world categorization was necessary to enforce spatio-temporal hierarchical relationships with the brain. Together our results provide an algorithmically informed view on the spatio-temporal dynamics of visual object recognition in the human visual brain.
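For readers who want a concrete sense of how a model-to-brain comparison like the one described in the abstract can be set up, here is a minimal, hypothetical sketch in the style of representational similarity analysis. The array shapes, variable names, and random placeholder data are all assumptions for illustration; this is a sketch of the general technique, not the authors' analysis code.

```python
# Illustrative sketch only: compare one DNN layer with one MEG time point
# via representational dissimilarity matrices (RDMs). Shapes and data are
# placeholders, not the study's actual stimuli or recordings.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_images = 92        # number of object images (hypothetical)
layer_units = 4096   # units in one DNN layer (hypothetical)
n_sensors = 306      # MEG sensors at one time point (hypothetical)

# One row per image: DNN layer activations and MEG sensor patterns.
dnn_layer = rng.standard_normal((n_images, layer_units))
meg_timepoint = rng.standard_normal((n_images, n_sensors))

# Build RDMs: pairwise correlation distance between image-evoked patterns.
rdm_dnn = pdist(dnn_layer, metric="correlation")
rdm_meg = pdist(meg_timepoint, metric="correlation")

# Rank-correlate the two RDMs; a higher value means the DNN layer and the
# brain measurement partition the image set in a similar way.
rho, p = spearmanr(rdm_dnn, rdm_meg)
print(f"DNN-layer vs. MEG-timepoint RDM correlation: rho={rho:.3f} (p={p:.3g})")
```

A full analysis would repeat a comparison of this kind across DNN layers, MEG time points, and fMRI regions to trace the hierarchical correspondence; the sketch covers only a single layer-to-timepoint pairing.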

X Demographics

The data shown below were collected from the profiles of the 20 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 622 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Netherlands | 1 | <1%
Chile | 1 | <1%
Sweden | 1 | <1%
United Kingdom | 1 | <1%
Belgium | 1 | <1%
China | 1 | <1%
United States | 1 | <1%
Unknown | 615 | 99%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph.D. Student | 139 | 22%
Student > Master | 112 | 18%
Researcher | 74 | 12%
Student > Bachelor | 59 | 9%
Student > Doctoral Student | 30 | 5%
Other | 76 | 12%
Unknown | 132 | 21%

Readers by discipline | Count | As %
Neuroscience | 143 | 23%
Psychology | 90 | 14%
Computer Science | 83 | 13%
Engineering | 60 | 10%
Medicine and Dentistry | 17 | 3%
Other | 63 | 10%
Unknown | 166 | 27%
Attention Score in Context

This research output has an Altmetric Attention Score of 24. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 03 July 2020.
All research outputs: #1,626,707 of 25,757,133 outputs
Outputs from Scientific Reports: #15,399 of 142,830 outputs
Outputs of similar age: #28,983 of 361,371 outputs
Outputs of similar age from Scientific Reports: #405 of 3,612 outputs
Altmetric has tracked 25,757,133 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 93rd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 142,830 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 18.8. This one has done well, scoring higher than 89% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 361,371 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 91% of its contemporaries.
We're also able to compare this research output to 3,612 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 88% of its contemporaries.
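As a rough illustration of the age-adjusted comparison described above, the hypothetical sketch below ranks one output's Attention Score against peers published within six weeks on either side of it. The function name, the toy data, and the exact ranking rule are assumptions for the example, not Altmetric's implementation.

```python
# Minimal sketch (assumed logic, not Altmetric's code): percentile of one
# output's Attention Score among outputs published within +/- six weeks.
from datetime import date

def percentile_in_window(target_score, target_date, outputs, window_days=42):
    """outputs: iterable of (score, published_date) tuples."""
    window = [s for s, d in outputs if abs((d - target_date).days) <= window_days]
    if not window:
        return None
    beaten = sum(1 for s in window if s < target_score)
    return 100.0 * beaten / len(window)

# Toy peer set; real comparisons span hundreds of thousands of tracked outputs.
peers = [
    (1, date(2016, 6, 1)),
    (5, date(2016, 6, 20)),
    (30, date(2016, 7, 1)),
    (12, date(2016, 5, 15)),
    (2, date(2017, 1, 1)),   # falls outside the six-week window, so ignored
]
print(percentile_in_window(24, date(2016, 6, 10), peers))  # -> 75.0
```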