
Uncovering students’ misconceptions by assessment of their written questions

Overview of attention for article published in BMC Medical Education, August 2016

Mentioned by

1 tweeter

Citations

7 Dimensions

Readers on

72 Mendeley
Title
Uncovering students’ misconceptions by assessment of their written questions
Published in
BMC Medical Education, August 2016
DOI 10.1186/s12909-016-0739-5
Pubmed ID
Authors

Marleen Olde Bekkink, A. R. T. Rogier Donders, Jan G. Kooloos, Rob M. W. de Waal, Dirk J. Ruiter

Abstract

Misconceptions are ideas that are inconsistent with current scientific views. They are difficult to detect and refractory to change, and they can negatively influence how new concepts in science are learned, yet they are rarely measured in biomedical courses. Early identification of misconceptions is of critical relevance for effective teaching, but it is a difficult task for teachers, who tend to either over- or underestimate students' prior knowledge; a systematic appraisal of existing misconceptions is therefore desirable. This explorative study was performed to determine whether questions written by students can be used to uncover their misconceptions. During a small-group work (SGW) session on Tumour Pathology in a (bio)medical bachelor course on General Pathology, students were asked to write down a question about the topic: specifically, a probing question on disease mechanisms rather than mere factual knowledge. Three independent expert pathologists determined whether the content of each question was compatible with a misconception; consensus was reached in all cases. The study outcomes were whether misconceptions can be identified in students' written questions; if so, how frequently they occur; and whether their presence is negatively associated with students' scores on the formal course examination. A subgroup analysis was performed by gender and discipline. A total of 242 students participated in the SGW sessions, of whom 221 (91 %) formulated a question. Thirty-six questions did not meet the inclusion criteria. Of the 185 questions rated, 11 % (n = 20) were compatible with a misconception. Misconceptions were found only in medical students' questions, not in those of biomedical science students.
The formal examination score on Tumour Pathology was 5.0 (SD 2.0) in the group with misconceptions and 6.7 (SD 2.4) in the group without (p = 0.003). This study demonstrates that misconceptions can be uncovered in students' written questions. Their occurrence was negatively associated with the formal examination score, and identifying them creates an opportunity to repair them during the remaining course sessions, in advance of the formal examination.

Twitter Demographics

The data shown below were collected from the profile of 1 tweeter who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 72 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United States 1 1%
Unknown 71 99%

Demographic breakdown

Readers by professional status Count As %
Student > Master 16 22%
Researcher 12 17%
Student > Bachelor 7 10%
Lecturer 6 8%
Professor > Associate Professor 6 8%
Other 16 22%
Unknown 9 13%
Readers by discipline Count As %
Medicine and Dentistry 12 17%
Social Sciences 10 14%
Biochemistry, Genetics and Molecular Biology 6 8%
Chemistry 5 7%
Mathematics 4 6%
Other 24 33%
Unknown 11 15%

Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 25 August 2016.
All research outputs
#7,157,722
of 8,276,988 outputs
Outputs from BMC Medical Education
#1,131
of 1,248 outputs
Outputs of similar age
#213,535
of 253,535 outputs
Outputs of similar age from BMC Medical Education
#69
of 80 outputs
Altmetric has tracked 8,276,988 research outputs across all sources so far. This one is in the 1st percentile – i.e., 1% of other outputs scored the same or lower than it.
So far Altmetric has tracked 1,248 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.4. This one is in the 1st percentile – i.e., 1% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 253,535 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 80 others from the same source and published within six weeks on either side of this one. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
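The comparisons above all use the same percentile definition: the share of outputs in a comparison set that scored the same as or lower than this one. A minimal sketch of that definition, assuming nothing about Altmetric's actual algorithm or its handling of ties and rounding (the function name and the peer scores are illustrative, not from the source):

```python
def attention_percentile(score: float, peer_scores: list) -> float:
    """Percentage of peers whose score is the same as or lower than `score`.

    Illustrative only: Altmetric's real ranking may treat ties and
    rounding differently than this naive count.
    """
    if not peer_scores:
        raise ValueError("need at least one peer score")
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

# A score of 1 among ten hypothetical contemporaries:
print(attention_percentile(1, [0, 0, 1, 2, 5, 10, 3, 1, 0, 4]))  # 50.0
```

Note that because many tracked outputs receive little or no attention, low scores cluster together, which is why an Attention Score of 1 can still land in a very low percentile across millions of outputs.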