
Distractor Efficiency in an Item Pool for a Statistics Classroom Exam: Assessing Its Relation With Item Cognitive Level Classified According to Bloom’s Taxonomy

Overview of attention for an article published in Frontiers in Psychology, August 2018

Mentioned by

2 X users

Citations

19 Dimensions

Readers on

68 Mendeley
Published in
Frontiers in Psychology, August 2018
DOI 10.3389/fpsyg.2018.01585
Pubmed ID
Authors

Silvia Testa, Anna Toscano, Rosalba Rosato

Abstract

Multiple-choice items are one of the most commonly used tools for evaluating students' knowledge and skills. A key aspect of this type of assessment is the presence of functioning distractors, i.e., incorrect alternatives intended to be plausible for students with lower achievement. To our knowledge, no work has investigated the relationship between distractor performance and the complexity of the cognitive task required to give the correct answer. The aim of this study was to investigate this relation, employing the first three levels of Bloom's taxonomy (Knowledge, Comprehension, and Application). Specifically, it was hypothesized that items classified at a higher level of Bloom's taxonomy would show a greater number of functioning distractors. The study involved 174 items administered to a sample of 848 undergraduate psychology students during their statistics exam. Each student received 30 items randomly selected from the 174-item pool. The bivariate results mainly supported the authors' hypothesis: the highest percentage of functioning distractors was observed among the items classified into the Application category (η² = 0.024 and Phi = 0.25 for the dichotomized measure). When the analysis controlled for other item features, the relation lost statistical significance, partly because of the confounding effect of item difficulty.
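As a rough illustration of how distractor functioning is usually tallied in studies of this kind, the Python sketch below counts, for a single item, how many incorrect options were chosen by at least 5% of respondents. The 5% threshold, the option labels, the function name, and the toy response data are illustrative assumptions, not details taken from the article.

```python
# Illustrative sketch of a per-item distractor-functioning tally; not the
# authors' code. Assumption: a distractor "functions" when at least 5% of the
# examinees who saw the item chose it (a common rule of thumb).

from collections import Counter

FUNCTIONING_THRESHOLD = 0.05  # assumed cut-off, not taken from the article


def functioning_distractors(responses, key, options=("A", "B", "C", "D")):
    """Count incorrect options chosen by at least FUNCTIONING_THRESHOLD of respondents.

    responses : iterable of chosen options, one entry per examinee who saw the item
    key       : the keyed (correct) option for this item
    options   : all response options offered by the item
    """
    counts = Counter(responses)
    n = sum(counts.values())
    if n == 0:
        return 0
    return sum(
        1
        for opt in options
        if opt != key and counts.get(opt, 0) / n >= FUNCTIONING_THRESHOLD
    )


# Hypothetical item: "B" is keyed correct and 25 examinees answered it.
answers = ["B"] * 14 + ["A"] * 6 + ["C"] * 4 + ["D"]
print(functioning_distractors(answers, key="B"))  # -> 2 (A and C function, D does not)
```

Per-item counts produced this way could then be related to each item's Bloom level, which is the kind of comparison the abstract describes.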

X Demographics


The data shown below were collected from the profiles of the 2 X users who shared this research output.
Mendeley readers


The data shown below were compiled from readership statistics for the 68 Mendeley readers of this research output.

Geographical breakdown

Country     Count   As %
Unknown     68      100%

Demographic breakdown

Readers by professional status    Count   As %
Student > Master                  10      15%
Lecturer                           8      12%
Student > Bachelor                 6       9%
Researcher                         5       7%
Student > Ph. D. Student           5       7%
Other                             11      16%
Unknown                           23      34%

Readers by discipline             Count   As %
Social Sciences                    8      12%
Medicine and Dentistry             7      10%
Linguistics                        6       9%
Mathematics                        5       7%
Psychology                         4       6%
Other                             15      22%
Unknown                           23      34%
Attention Score in Context


This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 28 August 2018.
All research outputs: #18,646,262 of 23,099,576 outputs
Outputs from Frontiers in Psychology: #22,634 of 30,499 outputs
Outputs of similar age: #257,211 of 334,858 outputs
Outputs of similar age from Frontiers in Psychology: #644 of 748 outputs
Altmetric has tracked 23,099,576 research outputs across all sources so far. This one is in the 11th percentile – i.e., 11% of other outputs scored the same or lower than it.
So far Altmetric has tracked 30,499 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.5. This one is in the 19th percentile – i.e., 19% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 334,858 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 12th percentile – i.e., 12% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 748 others from the same source and published within six weeks on either side of this one. This one is in the 8th percentile – i.e., 8% of its contemporaries scored the same or lower than it.
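For readers unfamiliar with the percentile figures quoted above (the share of outputs that scored the same or lower), the minimal sketch below shows the generic calculation. Altmetric's exact ranking and tie handling are not described on this page, so the function and the sample scores are illustrative assumptions only.

```python
# Generic percentile-rank calculation ("% of outputs scoring the same or lower"),
# shown only to illustrate the figures above; Altmetric's exact procedure for
# ranking and tie handling may differ.


def percentile_rank(score, all_scores):
    """Return the percentage of values in all_scores that are <= score."""
    if not all_scores:
        raise ValueError("all_scores must be non-empty")
    same_or_lower = sum(1 for s in all_scores if s <= score)
    return 100.0 * same_or_lower / len(all_scores)


# Hypothetical scores for a small set of contemporaneous outputs.
scores = [0] * 7 + [1] * 4 + [3, 5, 12, 40]
print(round(percentile_rank(1, scores)))  # -> 73 in this toy sample
```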