
Lifting a familiar object: visual size analysis, not memory for object weight, scales lift force

Overview of attention for article published in Experimental Brain Research, April 2008

About this Attention Score

  • Average Attention Score compared to outputs of the same age and source

Mentioned by

X (Twitter): 1 user
F1000: 1 research highlight platform

Readers on

Mendeley: 64 readers
CiteULike: 1 reader
Title
Lifting a familiar object: visual size analysis, not memory for object weight, scales lift force
Published in
Experimental Brain Research, April 2008
DOI 10.1007/s00221-008-1392-y
Authors

Kelly J. Cole

Abstract

The brain can accurately predict the forces needed to efficiently manipulate familiar objects in relation to mechanical properties such as weight. These predictions involve memory or some type of central representation, but visual analysis of size also yields accurate predictions of the needed fingertip forces. This raises the issue of which process (weight memory or visual size analysis) is used during everyday life when handling familiar objects. Our aim was to determine whether subjects use a sensorimotor memory of weight, or a visual size analysis, to predictively set their vertical lift force when lifting a recently handled object. Two groups of subjects lifted an opaque brown bottle filled with water (470 g) during the first experimental session, and then rested for 15 min in a different room. Both groups were told that they would lift the same bottle in their next session. However, the experimental group returned to lift a slightly smaller bottle filled with water (360 g) that was otherwise identical in appearance to the first bottle. The control group returned to lift the same bottle from the first session, which was only partially filled with water so that it also weighed 360 g. At the end of the second session subjects were asked if they had observed any changes between sessions, but no subject indicated awareness of a specific change. An acceleration ratio was computed by dividing the peak vertical acceleration during the first lift of the second session by the average peak acceleration of the last five lifts during the first session. This ratio was >1 for the control subjects (mean 1.30, SEM 0.08), indicating that they scaled their lift force for the first lift of the second session based on a memory of the (heavier) bottle from the first session. In contrast, the acceleration ratio was 0.94 (SEM 0.10) for the experimental group (P < 0.011). We conclude that the experimental group processed visual cues concerning the size of the bottle.
These findings raise the possibility that even with familiar objects we predict fingertip forces using an on-line visual analysis of size (along with memory of density), rather than accessing memory related to object weight.
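The acceleration-ratio measure described in the abstract is simple arithmetic: the first peak vertical acceleration of session two divided by the mean of the last five peaks of session one. A minimal sketch follows; the function name and the numeric values are illustrative, not the study's data:

```python
def acceleration_ratio(session1_peaks, session2_first_peak):
    """Ratio of the first peak vertical acceleration in session 2 to the
    mean of the last five peak accelerations in session 1. A ratio > 1
    suggests the lift force was scaled to the heavier remembered object;
    a ratio near 1 suggests scaling matched the new, lighter weight."""
    last_five = session1_peaks[-5:]
    baseline = sum(last_five) / len(last_five)
    return session2_first_peak / baseline

# Illustrative peak accelerations (m/s^2), not the study's raw data:
peaks_session1 = [5.0, 5.1, 4.9, 5.0, 5.2, 5.0]
print(round(acceleration_ratio(peaks_session1, 6.3), 2))  # → 1.25
```

Under this toy input, the first lift of session two (6.3 m/s^2) exceeds the session-one baseline (5.04 m/s^2), giving a ratio above 1, the pattern the control group showed.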

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 64 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Spain 1 2%
United States 1 2%
Belgium 1 2%
Unknown 61 95%

Demographic breakdown

Readers by professional status Count As %
Student > Ph.D. Student 17 27%
Researcher 16 25%
Professor 7 11%
Student > Master 7 11%
Student > Bachelor 3 5%
Other 6 9%
Unknown 8 13%
Readers by discipline Count As %
Psychology 18 28%
Neuroscience 8 13%
Medicine and Dentistry 7 11%
Agricultural and Biological Sciences 6 9%
Engineering 6 9%
Other 7 11%
Unknown 12 19%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 26 May 2019.
All research outputs
#14,137,641
of 22,653,392 outputs
Outputs from Experimental Brain Research
#1,757
of 3,214 outputs
Outputs of similar age
#66,091
of 79,379 outputs
Outputs of similar age from Experimental Brain Research
#10
of 15 outputs
Altmetric has tracked 22,653,392 research outputs across all sources so far. This one is in the 35th percentile – i.e., 35% of other outputs scored the same or lower than it.
So far Altmetric has tracked 3,214 research outputs from this source. They receive a mean Attention Score of 5.0. This one is in the 41st percentile – i.e., 41% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 79,379 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 16th percentile – i.e., 16% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 15 others from the same source and published within six weeks on either side of this one. This one is in the 33rd percentile – i.e., 33% of its contemporaries scored the same or lower than it.
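The percentile figures quoted above all follow the same same-or-lower rule: an output's percentile is the percentage of comparison outputs whose score is less than or equal to its own. A minimal sketch of that rule (function name and toy scores are illustrative, not Altmetric's actual data or implementation):

```python
def percentile_rank(score, all_scores):
    """Percentage of outputs scoring the same as or lower than `score`."""
    same_or_lower = sum(1 for s in all_scores if s <= score)
    return 100 * same_or_lower / len(all_scores)

# Toy comparison set of made-up Attention Scores:
scores = [0, 1, 1, 2, 3, 5, 8, 10, 2, 0]
print(round(percentile_rank(2, scores)))  # → 60
```

With this toy set, a score of 2 matches or beats 6 of 10 outputs, placing it in the 60th percentile; the real figures above are computed over millions of tracked outputs.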