
Time Course of Information Processing in Visual and Haptic Object Classification

Overview of attention for article published in Frontiers in Human Neuroscience, January 2012

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • 2 X users
  • 1 Facebook page
  • 1 Google+ user

Citations

18 Dimensions

Readers on

56 Mendeley
Title
Time Course of Information Processing in Visual and Haptic Object Classification
Published in
Frontiers in Human Neuroscience, January 2012
DOI 10.3389/fnhum.2012.00049
Pubmed ID
Authors

Jasna Martinovic, Rebecca Lawson, Matt Craddock

Abstract

Vision identifies objects rapidly and efficiently. In contrast, object recognition by touch is much slower. Furthermore, haptics usually serially accumulates information from different parts of objects, whereas vision typically processes object information in parallel. Is haptic object identification slower simply due to sequential information acquisition and the resulting memory load or due to more fundamental processing differences between the senses? To compare the time course of visual and haptic object recognition, we slowed visual processing using a novel, restricted viewing technique. In an electroencephalographic (EEG) experiment, participants discriminated familiar, nameable from unfamiliar, unnameable objects both visually and haptically. Analyses focused on the evoked and total fronto-central theta-band (5-7 Hz; a marker of working memory) and the occipital upper alpha-band (10-12 Hz; a marker of perceptual processing) locked to the onset of classification. Decreases in total upper alpha-band activity for haptic identification of objects indicate a likely processing role of multisensory extrastriate areas. Long-latency modulations of alpha-band activity differentiated between familiar and unfamiliar objects in haptics but not in vision. In contrast, theta-band activity showed a general increase over time for the slowed-down visual recognition task only. We conclude that haptic object recognition relies on common representations with vision but also that there are fundamental differences between the senses that do not merely arise from differences in their speed of processing.
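The abstract's theta-band (5-7 Hz) and alpha-band (10-12 Hz) measures are spectral band powers. As a rough illustration only (this is a generic FFT band-power sketch, not the authors' actual evoked/total power pipeline, and the signal below is synthetic):

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` (sampled at `fs` Hz) between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1/fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

# Synthetic 2-s "EEG" trace: a 6 Hz (theta) and a weaker 11 Hz (alpha)
# component plus a little noise.
fs = 250
t = np.arange(0, 2, 1/fs)
rng = np.random.default_rng(0)
eeg = np.sin(2*np.pi*6*t) + 0.5*np.sin(2*np.pi*11*t) + 0.1*rng.standard_normal(t.size)

theta = band_power(eeg, fs, 5, 7)    # 5-7 Hz, the band named in the abstract
alpha = band_power(eeg, fs, 10, 12)  # 10-12 Hz
print(theta > alpha)  # True: the theta component dominates this synthetic trace
```

Real EEG analyses of this kind additionally involve epoching, baseline correction, and averaging across trials and electrodes, none of which is shown here.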

X Demographics

The data shown below were collected from the profiles of the 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 56 Mendeley readers of this research output.

Geographical breakdown

Country        | Count | As %
United Kingdom | 2     | 4%
Japan          | 1     | 2%
Portugal       | 1     | 2%
Taiwan         | 1     | 2%
Unknown        | 51    | 91%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph.D. Student        | 14    | 25%
Researcher                     | 10    | 18%
Student > Master               | 7     | 13%
Student > Doctoral Student     | 6     | 11%
Student > Bachelor             | 4     | 7%
Other                          | 5     | 9%
Unknown                        | 10    | 18%
Readers by discipline  | Count | As %
Psychology             | 14    | 25%
Neuroscience           | 11    | 20%
Medicine and Dentistry | 8     | 14%
Engineering            | 4     | 7%
Computer Science       | 2     | 4%
Other                  | 3     | 5%
Unknown                | 14    | 25%
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 03 May 2015.
All research outputs
#13,134,992
of 22,675,759 outputs
Outputs from Frontiers in Human Neuroscience
#3,840
of 7,114 outputs
Outputs of similar age
#145,189
of 244,088 outputs
Outputs of similar age from Frontiers in Human Neuroscience
#163
of 294 outputs
Altmetric has tracked 22,675,759 research outputs across all sources so far. This one is in the 41st percentile – i.e., 41% of other outputs scored the same or lower than it.
So far Altmetric has tracked 7,114 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.5. This one is in the 44th percentile – i.e., 44% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 244,088 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 40th percentile – i.e., 40% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 294 others from the same source and published within six weeks on either side of this one. This one is in the 41st percentile – i.e., 41% of its contemporaries scored the same or lower than it.
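The percentile statements above all use a "same or lower" convention. A minimal sketch of that calculation (the scores below are made up for illustration, not Altmetric's actual distribution):

```python
def percentile_rank(score, all_scores):
    """Percentage of outputs that scored the same as or lower than `score`."""
    same_or_lower = sum(1 for s in all_scores if s <= score)
    return 100 * same_or_lower / len(all_scores)

# Hypothetical Attention Scores for ten outputs of similar age:
scores = [0, 1, 1, 2, 3, 3, 5, 8, 14, 25]
print(percentile_rank(3, scores))  # 60.0: a score of 3 beats or ties 6 of 10
```

Altmetric applies this comparison against four reference sets: all outputs, outputs from the same source, outputs of similar age, and outputs of similar age from the same source.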