
A bias to detail: how hand position modulates visual learning and visual memory

Overview of attention for an article published in Memory & Cognition, October 2011

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (93rd percentile)
  • High Attention Score compared to outputs of the same age and source (85th percentile)

Mentioned by

  • 2 blogs
  • 1 peer review site
  • 1 Google+ user

Citations

  • 50 citations on Dimensions

Readers on

  • 94 Mendeley readers

Title: A bias to detail: how hand position modulates visual learning and visual memory
Published in: Memory & Cognition, October 2011
DOI: 10.3758/s13421-011-0147-3
Pubmed ID:
Authors: Christopher C. Davoli, James R. Brockmole, Annabelle Goujon

Abstract

In this report, we examine whether and how altered aspects of perception and attention near the hands affect one's learning of to-be-remembered visual material. We employed the contextual cuing paradigm of visual learning in two experiments. Participants searched for a target embedded within images of fractals and other complex geometrical patterns while either holding their hands near to or far from the stimuli. When visual features and structural patterns remained constant across to-be-learned images (Exp. 1), no difference emerged between hand postures in the observed rates of learning. However, when to-be-learned scenes maintained structural pattern information but changed in color (Exp. 2), participants exhibited substantially slower rates of learning when holding their hands near the material. This finding shows that learning near the hands is impaired in situations in which common information must be abstracted from visually unique images, suggesting a bias toward detail-oriented processing near the hands.

Mendeley readers

The data shown below were compiled from readership statistics for 94 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United States 2 2%
Germany 1 1%
Netherlands 1 1%
Sweden 1 1%
Ecuador 1 1%
Spain 1 1%
Canada 1 1%
Unknown 86 91%

Demographic breakdown

Readers by professional status Count As %
Student > Ph.D. Student 30 32%
Researcher 13 14%
Student > Master 8 9%
Professor > Associate Professor 7 7%
Student > Doctoral Student 6 6%
Other 21 22%
Unknown 9 10%

Readers by discipline Count As %
Psychology 48 51%
Medicine and Dentistry 6 6%
Social Sciences 5 5%
Neuroscience 4 4%
Computer Science 3 3%
Other 14 15%
Unknown 14 15%

Attention Score in Context

This research output has an Altmetric Attention Score of 19. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 13 September 2016.

All research outputs: #1,661,253 of 22,656,971 outputs
Outputs from Memory & Cognition: #117 of 1,568 outputs
Outputs of similar age: #8,335 of 132,884 outputs
Outputs of similar age from Memory & Cognition: #3 of 20 outputs

Altmetric has tracked 22,656,971 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 92nd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far, Altmetric has tracked 1,568 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 8.5. This one has done particularly well, scoring higher than 92% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 132,884 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 93% of its contemporaries.
We're also able to compare this research output to 20 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 85% of its contemporaries.
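
Altmetric does not spell out the arithmetic behind these percentile comparisons here, but the figures quoted above are consistent with a simple rank-based calculation. The following is a minimal sketch, assuming the percentile is the share of tracked outputs this one outranks, truncated to a whole number; the rank_to_percentile helper and the hard-coded ranks are illustrative only, taken from the context table above.

```python
# Illustrative sketch only; this is not Altmetric's published method.
# Assumption: percentile = floor(100 * (1 - rank / total)), i.e. the share of
# tracked outputs that this output outranks, truncated to a whole percentile.

def rank_to_percentile(rank: int, total: int) -> int:
    """Percentile implied by being ranked `rank` out of `total` outputs."""
    return int(100 * (1 - rank / total))

# Ranks and totals quoted in the context table above.
comparisons = [
    ("All research outputs", 1_661_253, 22_656_971),
    ("Outputs from Memory & Cognition", 117, 1_568),
    ("Outputs of similar age", 8_335, 132_884),
    ("Outputs of similar age from Memory & Cognition", 3, 20),
]

for label, rank, total in comparisons:
    pct = rank_to_percentile(rank, total)
    print(f"{label}: percentile {pct}")
    # -> 92, 92, 93, 85, matching the percentages quoted in the text above
```

Truncating rather than rounding is what matches the quoted figures: the all-outputs comparison works out to roughly 92.7, which would otherwise round up to 93.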