
Preexisting semantic representation improves working memory performance in the visuospatial domain

Overview of attention for article published in Memory & Cognition, January 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (86th percentile)
  • High Attention Score compared to outputs of the same age and source (86th percentile)

Mentioned by

13 X users
1 peer review site
2 Facebook pages

Citations

8 Dimensions

Readers on

62 Mendeley readers
Title
Preexisting semantic representation improves working memory performance in the visuospatial domain
Published in
Memory & Cognition, January 2016
DOI 10.3758/s13421-016-0585-z
Authors

Mary Rudner, Eleni Orfanidou, Velia Cardin, Cheryl M. Capek, Bencie Woll, Jerker Rönnberg

Abstract

Working memory (WM) for spoken language improves when the to-be-remembered items correspond to preexisting representations in long-term memory. We investigated whether this effect generalizes to the visuospatial domain by administering a visual n-back WM task to deaf signers and hearing signers, as well as to hearing nonsigners. Four different kinds of stimuli were presented: British Sign Language (BSL; familiar to the signers), Swedish Sign Language (SSL; unfamiliar), nonsigns, and nonlinguistic manual actions. The hearing signers performed better with BSL than with SSL, demonstrating a facilitatory effect of preexisting semantic representation. The deaf signers also performed better with BSL than with SSL, but only when WM load was high. No effect of preexisting phonological representation was detected. The deaf signers performed better than the hearing nonsigners with all sign-based materials, but this effect did not generalize to nonlinguistic manual actions. We argue that deaf signers, who are highly reliant on visual information for communication, develop expertise in processing sign-based items, even when those items do not have preexisting semantic or phonological representations. Preexisting semantic representation, however, enhances the quality of the gesture-based representations temporarily maintained in WM by this group, thereby releasing WM resources to deal with increased load. Hearing signers, on the other hand, may make strategic use of their speech-based representations for mnemonic purposes. The overall pattern of results is in line with flexible-resource models of WM.

X Demographics

The data shown below were collected from the profiles of the 13 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 62 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Sweden    1       2%
Unknown   61      98%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph.D. Student          15      24%
Researcher                       10      16%
Student > Master                 9       15%
Professor                        7       11%
Student > Doctoral Student       3       5%
Other                            10      16%
Unknown                          8       13%
Readers by discipline   Count   As %
Psychology              26      42%
Neuroscience            9       15%
Linguistics             6       10%
Computer Science        2       3%
Arts and Humanities     2       3%
Other                   4       6%
Unknown                 13      21%
Attention Score in Context

This research output has an Altmetric Attention Score of 11. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 23 January 2017.
All research outputs: #2,950,507 of 23,018,998 outputs
Outputs from Memory & Cognition: #211 of 1,569 outputs
Outputs of similar age: #53,989 of 396,293 outputs
Outputs of similar age from Memory & Cognition: #4 of 22 outputs
Altmetric has tracked 23,018,998 research outputs across all sources so far. Compared to these, this one has done well and is in the 87th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,569 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 8.6. This one has done well, scoring higher than 86% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 396,293 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 86% of its contemporaries.
We're also able to compare this research output to 22 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 86% of its contemporaries.
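The percentile figures above follow directly from the ranks and totals listed: the percentile is the share of outputs this one outscores. A minimal sketch, assuming percentiles are floored (which matches all three figures reported here) and that rank 1 is the highest-scoring output; the function name is illustrative:

```python
import math

def percentile_from_rank(rank: int, total: int) -> int:
    """Percentage of tracked outputs that this output outscores,
    where rank 1 is the highest-scoring output. Floored, which
    appears to match the rounding used on this page."""
    return math.floor(100 * (total - rank) / total)

# Using the counts reported above:
print(percentile_from_rank(2_950_507, 23_018_998))  # all outputs -> 87
print(percentile_from_rank(211, 1_569))             # Memory & Cognition -> 86
print(percentile_from_rank(53_989, 396_293))        # similar-age outputs -> 86
```

Note that the all-outputs comparison yields 87 while the source-level and age-level comparisons both floor to 86, which is why the page reports both "87th percentile" and "86th percentile" in different contexts.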