
Dispelling the Myth: Training in Education or Neuroscience Decreases but Does Not Eliminate Beliefs in Neuromyths

Overview of attention for article published in Frontiers in Psychology, August 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (99th percentile)
  • High Attention Score compared to outputs of the same age and source (99th percentile)

Mentioned by

news
19 news outlets
blogs
14 blogs
policy
1 policy source
twitter
514 X users
facebook
25 Facebook pages
wikipedia
1 Wikipedia page
googleplus
2 Google+ users

Readers on

mendeley
468 Mendeley
DOI 10.3389/fpsyg.2017.01314
Authors

Kelly Macdonald, Laura Germine, Alida Anderson, Joanna Christodoulou, Lauren M. McGrath

Abstract

Neuromyths are misconceptions about brain research and its application to education and learning. Previous research has shown that these myths may be quite pervasive among educators, but less is known about how these rates compare to the general public or to individuals who have more exposure to neuroscience. This study is the first to use a large sample from the United States to compare the prevalence and predictors of neuromyths among educators, the general public, and individuals with high neuroscience exposure. Neuromyth survey responses and demographics were gathered via an online survey hosted at TestMyBrain.org. We compared performance among the three groups of interest: educators (N = 598), high neuroscience exposure (N = 234), and the general public (N = 3,045), and analyzed predictors of individual differences in neuromyth performance. In an exploratory factor analysis, we found that a core group of 7 "classic" neuromyths factored together (items related to learning styles, dyslexia, the Mozart effect, the impact of sugar on attention, right-brain/left-brain learners, and using 10% of the brain). The general public endorsed the greatest number of neuromyths (M = 68%), with significantly fewer endorsed by educators (M = 56%), and still fewer endorsed by the high neuroscience exposure group (M = 46%). The two most commonly endorsed neuromyths across all groups were related to learning styles and dyslexia. More accurate performance on neuromyths was predicted by age (being younger), education (having a graduate degree), exposure to neuroscience courses, and exposure to peer-reviewed science. These findings suggest that training in education and neuroscience can help reduce but does not eliminate belief in neuromyths. We discuss the possible underlying roots of the most prevalent neuromyths and implications for classroom practice. These empirical results can be useful for developing comprehensive training modules for educators that target general misconceptions about the brain and learning.

X Demographics

The data shown below were collected from the profiles of 514 X users who shared this research output. Click here to find out more about how the information was compiled.
Mendeley readers

The data shown below were compiled from readership statistics for 468 Mendeley readers of this research output. Click here to see the associated Mendeley record.

Geographical breakdown

Country   Count   As %
Unknown   468     100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                  58     12%
Student > Bachelor                51     11%
Researcher                        50     11%
Student > Ph.D. Student           48     10%
Student > Doctoral Student        40      9%
Other                             98     21%
Unknown                          123     26%

Readers by discipline            Count   As %
Psychology                       119     25%
Social Sciences                   67     14%
Neuroscience                      40      9%
Arts and Humanities               14      3%
Nursing and Health Professions    12      3%
Other                             72     15%
Unknown                          144     31%
Attention Score in Context

This research output has an Altmetric Attention Score of 617. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 18 April 2024.
All research outputs
#37,151
of 25,809,907 outputs
Outputs from Frontiers in Psychology
#58
of 34,800 outputs
Outputs of similar age
#719
of 328,718 outputs
Outputs of similar age from Frontiers in Psychology
#1
of 581 outputs
Altmetric has tracked 25,809,907 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 99th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 34,800 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.5. This one has done particularly well, scoring higher than 99% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 328,718 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 99% of its contemporaries.
We're also able to compare this research output to 581 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 99% of its contemporaries.
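The percentile claims above follow directly from the rank-out-of-total figures listed on this page. As a rough sketch (not Altmetric's internal formula, which may handle ties and rounding differently), a rank can be converted to a percentile like so, using the four context figures quoted above:

```python
def percentile_rank(rank: int, total: int) -> float:
    """Fraction of outputs scoring at or below this one, as a percentage.

    Assumes rank 1 is the highest-scoring output; this is an
    illustrative approximation, not Altmetric's exact method.
    """
    return 100.0 * (1 - rank / total)

# Rank and total figures quoted on this page:
contexts = {
    "all research outputs":           (37_151, 25_809_907),
    "Frontiers in Psychology":        (58, 34_800),
    "similar age, any source":        (719, 328_718),
    "similar age, same journal":      (1, 581),
}

for name, (rank, total) in contexts.items():
    print(f"{name}: {percentile_rank(rank, total):.2f}th percentile")
```

Each context comes out above the 99th percentile, consistent with the "scoring higher than 99% of its peers/contemporaries" statements in the text.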