Validation of a performance assessment instrument in problem-based learning tutorials using two cohorts of medical students

Overview of attention for article published in Advances in Health Sciences Education, August 2015

About this Attention Score

  • Above-average Attention Score compared to outputs of the same age (56th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • 4 X users
  • 1 Facebook page
  • 1 Google+ user

Citations

  • 11 Dimensions

Readers on

  • 123 Mendeley

Title
Validation of a performance assessment instrument in problem-based learning tutorials using two cohorts of medical students
Published in
Advances in Health Sciences Education, August 2015
DOI 10.1007/s10459-015-9632-y
Pubmed ID
Authors

Ming Lee, Paul F. Wimmers

Abstract

Although problem-based learning (PBL) has been widely used in medical schools, few studies have attended to the assessment of PBL processes using validated instruments. This study examined the reliability and validity of an instrument assessing PBL performance in four domains: Problem Solving, Use of Information, Group Process, and Professionalism. Two cohorts of medical students (N = 310) participated in the study, with two years of archived PBL evaluation data rated by a total of 158 faculty raters. Analyses based on generalizability theory were conducted to examine reliability. Validity was examined by following the Standards for Educational and Psychological Testing to evaluate content validity, response processes, construct validity, predictive validity, and the relationship to training. For construct validity, correlations of PBL scores with six other outcome measures were examined: the Medical College Admission Test, United States Medical Licensing Examination (USMLE) Step 1, National Board of Medical Examiners (NBME) Comprehensive Basic Science Examination, NBME Comprehensive Clinical Science Examination, Clinical Performance Examination, and USMLE Step 2 Clinical Knowledge. Predictive validity was examined by using PBL scores to predict five medical school outcomes. The highest percentage of PBL total score variance was associated with students (60%), indicating that students in the study differed in their PBL performance. The generalizability and dependability coefficients were moderately high (Eρ² = .68, Φ = .60), showing that the instrument is reliable both for ranking students and for identifying competent PBL performers. The patterns of correlations between PBL domain scores and the outcome measures partially supported construct validity. PBL performance ratings as a whole significantly (p < .01) predicted all of the major medical school achievements. Second-year PBL scores were significantly higher than first-year scores, indicating a training effect. These psychometric findings support the reliability and many aspects of the validity of PBL performance assessment using the instrument.
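
As a rough illustration of the generalizability-theory quantities cited in the abstract, the Python sketch below computes a generalizability coefficient (Eρ²) and a dependability coefficient (Φ) for a simple persons × raters (p × r) crossed design. The variance components and rater count are hypothetical placeholders for demonstration, not the estimates from this study.

# Minimal sketch of Eρ² and Φ for a persons x raters (p x r) crossed design.
# All numbers below are HYPOTHETICAL illustrations, not this study's estimates.

var_p  = 0.60   # universe-score (student) variance -- hypothetical
var_r  = 0.10   # rater main-effect (stringency) variance -- hypothetical
var_pr = 0.30   # person-by-rater interaction + residual -- hypothetical

n_r = 2         # raters per student in the decision (D) study -- hypothetical

rel_error = var_pr / n_r                  # relative error: affects rank ordering only
abs_error = var_r / n_r + var_pr / n_r    # absolute error: also includes rater stringency

e_rho2 = var_p / (var_p + rel_error)      # generalizability coefficient (norm-referenced)
phi    = var_p / (var_p + abs_error)      # dependability coefficient (criterion-referenced)

print(f"E rho^2 = {e_rho2:.2f}, Phi = {phi:.2f}")   # -> 0.80 and 0.75 with these inputs

Because absolute error includes the rater main effect that relative error excludes, Φ can never exceed Eρ², which is consistent with the .60 versus .68 pattern reported in the abstract.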

X Demographics

The data shown below were collected from the profiles of 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 123 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United States        3     2%
Argentina            1    <1%
Unknown            119    97%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                    19    15%
Student > Ph.D. Student             12    10%
Student > Bachelor                  11     9%
Researcher                           9     7%
Student > Doctoral Student           8     7%
Other                               31    25%
Unknown                             33    27%

Readers by discipline            Count   As %
Medicine and Dentistry              32    26%
Social Sciences                     18    15%
Nursing and Health Professions       9     7%
Psychology                           5     4%
Engineering                          4     3%
Other                               15    12%
Unknown                             40    33%
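
A minimal sketch of how the "As %" columns above appear to be derived: each count over the 123-reader total, rounded to the nearest whole percent, with sub-1% shares printed as "<1%". This rule reproduces the published figures, but it is inferred from the tables rather than taken from Mendeley's documentation.

# Reproduce the "As %" column from raw reader counts (rule inferred, not documented).

TOTAL_READERS = 123

def as_percent(count: int, total: int = TOTAL_READERS) -> str:
    share = count / total * 100
    return "<1%" if 0 < share < 1 else f"{round(share)}%"

# The geographical breakdown as an example:
for country, count in [("United States", 3), ("Argentina", 1), ("Unknown", 119)]:
    print(f"{country:14} {count:4} {as_percent(count):>5}")   # -> 2%, <1%, 97%
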
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 05 April 2016.
All research outputs: #12,740,022 of 22,826,360 outputs
Outputs from Advances in Health Sciences Education: #457 of 851 outputs
Outputs of similar age: #115,794 of 267,563 outputs
Outputs of similar age from Advances in Health Sciences Education: #14 of 22 outputs
Altmetric has tracked 22,826,360 research outputs across all sources so far. This one is in the 43rd percentile – i.e., 43% of other outputs scored the same or lower than it.
So far Altmetric has tracked 851 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.7. This one is in the 45th percentile – i.e., 45% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 267,563 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 56% of its contemporaries.
We're also able to compare this research output to 22 others from the same source and published within six weeks on either side of this one. This one is in the 36th percentile – i.e., 36% of its contemporaries scored the same or lower than it.
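
A short sketch of how these rank-based percentiles can be recomputed from the positions listed above. The rank-to-percentile formula here is an assumption, not Altmetric's documented method; the one-point differences from the published figures in some contexts presumably come from tie handling or rounding on Altmetric's side.

# Recompute approximate percentiles from the ranking positions listed above.
# Formula is an assumed convention: percent of outputs ranked at or below this one.

def percentile_from_rank(rank: int, total: int) -> float:
    """Percent of outputs ranked the same as or below this one (rank 1 = top)."""
    return (total - rank) / total * 100

contexts = {
    "All research outputs":     (12_740_022, 22_826_360),  # page says 43rd percentile
    "Same source":              (457, 851),                # page says 45th percentile
    "Similar age, any source":  (115_794, 267_563),        # page says higher than 56%
    "Similar age, same source": (14, 22),                  # page says 36th percentile
}

for label, (rank, total) in contexts.items():
    print(f"{label}: ~{percentile_from_rank(rank, total):.0f}th percentile")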