
Multiple choice questions can be designed or revised to challenge learners’ critical thinking

Overview of attention for article published in Advances in Health Sciences Education, January 2013

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#25 of 851)
  • High Attention Score compared to outputs of the same age (95th percentile)
  • High Attention Score compared to outputs of the same age and source (92nd percentile)

Mentioned by

  • 1 news outlet
  • 1 blog
  • 5 X users
  • 1 Facebook page

Citations

  • 24 citations (Dimensions)

Readers on Mendeley

  • 92 readers
Title
Multiple choice questions can be designed or revised to challenge learners’ critical thinking
Published in
Advances in Health Sciences Education, January 2013
DOI 10.1007/s10459-012-9434-4
Authors

Rochelle E. Tractenberg, Matthew M. Gushta, Susan E. Mulroney, Peggy A. Weissinger

Abstract

Multiple choice (MC) questions from a graduate physiology course were evaluated by cognitive-psychology (but not physiology) experts and analyzed statistically in order to test the independence of content expertise and cognitive complexity ratings of MC items. Integrating higher order thinking into MC exams is important, but widely known to be challenging, perhaps especially when content experts must think like novices. Expertise in the domain (content) may actually impede the creation of higher-complexity items. Three cognitive-psychology experts independently rated the cognitive complexity of 252 multiple-choice physiology items using a six-level cognitive complexity matrix synthesized from the literature. Rasch modeling estimated item difficulties. The complexity ratings and difficulty estimates were then analyzed together to determine the relative contributions (and independence) of complexity and difficulty to the likelihood of a correct answer on each item. Cognitive complexity was found to be statistically independent of difficulty estimates for 88% of items. Using the complexity matrix, modifications were identified that would increase some items' complexity by one level without affecting their difficulty. Cognitive complexity can effectively be rated by non-content experts. The six-level complexity matrix, if applied by faculty peer groups trained in cognitive complexity but without domain-specific expertise, could improve the complexity targeted in item writing and revision. Targeting higher order thinking with MC questions can be achieved without changing item difficulties or other test characteristics, but this may be less likely if content experts are left to assess items within their own domain of expertise.

X Demographics

The data shown below were collected from the profiles of 5 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 92 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United Kingdom       2     2%
Malaysia             1     1%
Portugal             1     1%
Brazil               1     1%
United States        1     1%
Unknown             86    93%

Demographic breakdown

Readers by professional status     Count   As %
Student > Master                      12    13%
Professor > Associate Professor       10    11%
Researcher                             8     9%
Student > Doctoral Student             7     8%
Lecturer > Senior Lecturer             6     7%
Other                                 30    33%
Unknown                               19    21%

Readers by discipline              Count   As %
Social Sciences                       18    20%
Medicine and Dentistry                15    16%
Nursing and Health Professions         6     7%
Computer Science                       6     7%
Psychology                             4     4%
Other                                 22    24%
Unknown                               21    23%
Attention Score in Context

This research output has an Altmetric Attention Score of 25. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 03 March 2015.
All research outputs: #1,302,631 of 22,711,645 outputs
Outputs from Advances in Health Sciences Education: #25 of 851 outputs
Outputs of similar age: #12,429 of 280,711 outputs
Outputs of similar age from Advances in Health Sciences Education: #1 of 14 outputs
Altmetric has tracked 22,711,645 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 94th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 851 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.7. This one has done particularly well, scoring higher than 97% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 280,711 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 95% of its contemporaries.
We're also able to compare this research output to 14 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 92% of its contemporaries.
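The "scoring higher than X%" figures above follow directly from the ranks quoted: the percentage is the floored share of tracked outputs ranked below this one. A minimal sketch, assuming floored integer percentages (the function name is ours, not part of any Altmetric API):

```python
# Derive the "scored higher than X%" figures from the ranks quoted above.
# pct_outperformed is a hypothetical helper; Altmetric appears to floor
# the percentage rather than round it.
def pct_outperformed(rank: int, total: int) -> int:
    """Percent of outputs this one outscored (1-based rank, floored)."""
    return (total - rank) * 100 // total

print(pct_outperformed(1_302_631, 22_711_645))  # all outputs       -> 94
print(pct_outperformed(25, 851))                # same source       -> 97
print(pct_outperformed(12_429, 280_711))        # similar age       -> 95
print(pct_outperformed(1, 14))                  # same age & source -> 92
```

All four values match the percentiles reported on this page, which suggests the floored-share reading is right.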