
Medical students create multiple-choice questions for learning in pathology education: a pilot study

Overview of attention for article published in BMC Medical Education, August 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (71st percentile)
  • Good Attention Score compared to outputs of the same age and source (66th percentile)

Mentioned by

10 X users

Citations

44 Dimensions

Readers on

171 Mendeley
Title
Medical students create multiple-choice questions for learning in pathology education: a pilot study
Published in
BMC Medical Education, August 2018
DOI 10.1186/s12909-018-1312-1
Pubmed ID
Authors

Rebecca Grainger, Wei Dai, Emma Osborne, Diane Kenwright

Abstract

Medical students facing high-stakes exams want study resources that relate directly to their assessments. At the same time, they need to develop the skills to think analytically about complex clinical problems. Multiple-choice questions (MCQs) are widely used in medical education, and answering them can promote surface learning strategies, but creating MCQs requires both in-depth content knowledge and sophisticated analytical thinking. We therefore piloted an MCQ-writing task in which students developed MCQs for their peers to answer. Students in a fourth-year anatomic pathology course (N = 106) were required to write MCQs using the PeerWise platform. Students created two MCQs for each of four topic areas, and the MCQs were answered, rated and commented on by their classmates. Questions were rated for cognitive complexity, and a paper-based survey was administered to investigate whether the activity was acceptable and feasible and whether it promoted desirable learning behaviours in students. Students were able to create cognitively challenging MCQs: 313 of the 421 MCQs we rated (74%) required the respondent to apply or analyse pathology knowledge. However, students who responded to the end-of-course questionnaire (N = 62) saw the task as having little educational value. Students found PeerWise easy to use, and indicated that they read widely to prepare questions and monitored the quality of their questions. They did not, however, engage in extensive peer feedback via PeerWise. Our study showed that the MCQ-writing task was feasible and engaged students in self-evaluation and in synthesising information from a range of sources, but it was not well accepted and did not strongly engage students in peer learning. Although students were able to create complex MCQs, they found some aspects of the writing process burdensome and tended not to trust the quality of each other's MCQs. Given the evidence that the task did promote deep learning, it is worth continuing this mode of teaching if it can be made more acceptable to students.

X Demographics

The data shown below were collected from the profiles of 10 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 171 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown     171   100%

Demographic breakdown

Readers by professional status     Count   As %
Student > Bachelor                    21    12%
Student > Master                      20    12%
Researcher                            11     6%
Lecturer                              11     6%
Professor > Associate Professor       10     6%
Other                                 42    25%
Unknown                               56    33%
Readers by discipline                  Count   As %
Medicine and Dentistry                    45    26%
Social Sciences                           13     8%
Nursing and Health Professions            10     6%
Psychology                                 6     4%
Agricultural and Biological Sciences       5     3%
Other                                     24    14%
Unknown                                   68    40%
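For context, the "As %" figures above are simply each count's share of the 171 Mendeley readers, rounded to the nearest whole percent. The short Python sketch below reproduces the professional-status column; the counts are transcribed from the table, and percent_breakdown is an illustrative helper of ours, not part of any Mendeley or Altmetric API.

```python
# Recompute the "As %" column from the raw Mendeley reader counts above.
# percent_breakdown is an illustrative helper, not a Mendeley/Altmetric API.

def percent_breakdown(counts: dict[str, int]) -> dict[str, int]:
    """Share of the total for each label, rounded to the nearest percent."""
    total = sum(counts.values())
    return {label: round(100 * count / total) for label, count in counts.items()}

# Counts transcribed from the "Readers by professional status" table.
status = {
    "Student > Bachelor": 21,
    "Student > Master": 20,
    "Researcher": 11,
    "Lecturer": 11,
    "Professor > Associate Professor": 10,
    "Other": 42,
    "Unknown": 56,
}

print(percent_breakdown(status))
# {'Student > Bachelor': 12, 'Student > Master': 12, 'Researcher': 6,
#  'Lecturer': 6, 'Professor > Associate Professor': 6, 'Other': 25,
#  'Unknown': 33}
```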
Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 14 August 2021.
Comparison group                                     Rank          Out of
All research outputs                                 #5,589,607    23,100,534 outputs
Outputs from BMC Medical Education                   #856          3,387 outputs
Outputs of similar age                               #95,997       334,082 outputs
Outputs of similar age from BMC Medical Education    #23           69 outputs
Altmetric has tracked 23,100,534 research outputs across all sources so far. Compared to these, this one has done well and is in the 75th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 3,387 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.4. This one has gotten more attention than average, scoring higher than 74% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 334,082 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 71% of its contemporaries.
We're also able to compare this research output to 69 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 66% of its contemporaries.
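The percentile figures above follow directly from each rank and cohort size: an output ranked r out of n scores higher than the n − r outputs below it, i.e. roughly 100 × (1 − r/n) percent of the cohort. Below is a minimal Python sketch reproducing the quoted numbers; the ranks and cohort sizes are taken from the table above, and the percentile helper is our own illustration, not Altmetric's published implementation.

```python
# Reproduce the quoted percentiles from each rank and cohort size.
# An output ranked r-th out of n scores higher than the n - r outputs
# below it, i.e. roughly 100 * (1 - r / n) percent of its cohort.
# Illustrative arithmetic only; not Altmetric's published implementation.

def percentile(rank: int, cohort_size: int) -> int:
    return int(100 * (1 - rank / cohort_size))

# Ranks and cohort sizes transcribed from the comparison table above.
cohorts = {
    "All research outputs": (5_589_607, 23_100_534),
    "Outputs from BMC Medical Education": (856, 3_387),
    "Outputs of similar age": (95_997, 334_082),
    "Outputs of similar age from the same source": (23, 69),
}

for name, (rank, size) in cohorts.items():
    print(f"{name}: scores higher than {percentile(rank, size)}% of the cohort")
# All research outputs: scores higher than 75% of the cohort
# Outputs from BMC Medical Education: scores higher than 74% of the cohort
# Outputs of similar age: scores higher than 71% of the cohort
# Outputs of similar age from the same source: scores higher than 66% of the cohort
```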