
Do coursework summative assessments predict clinical performance? A systematic review

Overview of attention for article published in BMC Medical Education, February 2017

About this Attention Score

  • Above-average Attention Score compared to outputs of the same age (61st percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by: 4 X users
Citations: 49 (Dimensions)
Readers: 222 (Mendeley)
Title: Do coursework summative assessments predict clinical performance? A systematic review
Published in: BMC Medical Education, February 2017
DOI: 10.1186/s12909-017-0878-3
Authors: Rebecca Terry, Wayne Hing, Robin Orr, Nikki Milne

Abstract

Two goals of summative assessment in health profession education programs are to ensure the robustness of high stakes decisions such as progression and licensing, and to predict future performance. This systematic and critical review aims to investigate the ability of specific modes of summative assessment to predict the clinical performance of health profession education students. PubMed, CINAHL, SPORTDiscus, ERIC and EMBASE databases were searched using key terms, and retrieved articles were screened against dedicated inclusion criteria. Rigorous exclusion criteria were applied to ensure a consistent interpretation of 'summative assessment' and 'clinical performance'. Data were extracted using a pre-determined format and papers were critically appraised by two independent reviewers using a modified Downs and Black checklist, with the level of agreement between reviewers determined through a Kappa analysis. Of the 4783 studies retrieved from the search strategy, 18 studies were included in the final review. Twelve were from the medical profession, and there was one from each of physiotherapy, pharmacy, dietetics, speech pathology, dentistry and dental hygiene. Objective Structured Clinical Examinations featured in 15 papers, written assessments in four, and problem-based learning evaluations, case-based learning evaluations and student portfolios each featured in one paper. Sixteen different measures of clinical performance were used. Two papers were identified as 'poor' quality and the remainder categorised as 'fair', with an almost perfect (k = 0.852) level of agreement between raters. Objective Structured Clinical Examination scores accounted for 1.4-39.7% of the variance in student performance; multiple choice/extended matching questions and short answer written examinations accounted for 3.2-29.2%; problem-based or case-based learning evaluations accounted for 4.4-16.6%; and student portfolios accounted for 12.1%.
Objective Structured Clinical Examinations and written examinations consisting of multiple choice/extended matching questions and short answer questions do have significant relationships with the clinical performance of health professional students. However, caution should be applied if using these assessments as predictive measures for clinical performance due to a small body of evidence and large variations in the predictive strength of the relationships identified. Based on the current evidence, the Objective Structured Clinical Examination may be the most appropriate summative assessment for educators to use to identify students that may be at risk of poor performance in a clinical workplace environment. Further research on this topic is needed to improve the strength of the predictive relationship.

X Demographics

The data shown below were collected from the profiles of the 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 222 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown     222   100%

Demographic breakdown

Readers by professional status         Count   As %
Student > Master                          23    10%
Student > Bachelor                        22    10%
Other                                     18     8%
Student > Ph.D. Student                   15     7%
Professor > Associate Professor           14     6%
Other                                     66    30%
Unknown                                   64    29%

Readers by discipline                  Count   As %
Medicine and Dentistry                    68    31%
Nursing and Health Professions            32    14%
Social Sciences                           15     7%
Agricultural and Biological Sciences       6     3%
Unspecified                                5     2%
Other                                     28    13%
Unknown                                   68    31%
Attention Score in Context

This research output has an Altmetric Attention Score of 4. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 25 January 2023.
All research outputs: #7,507,557 of 23,197,711 outputs
Outputs from BMC Medical Education: #1,340 of 3,410 outputs
Outputs of similar age: #118,808 of 307,365 outputs
Outputs of similar age from BMC Medical Education: #29 of 55 outputs
Altmetric has tracked 23,197,711 research outputs across all sources so far. This one has received more attention than most of these and is in the 67th percentile.
So far Altmetric has tracked 3,410 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.4. This one has gotten more attention than average, scoring higher than 60% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 307,365 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 61% of its contemporaries.
We're also able to compare this research output to 55 others from the same source and published within six weeks on either side of this one. This one is in the 47th percentile – i.e., 47% of its contemporaries scored the same or lower than it.
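The percentile figures above follow directly from the ranks listed in the tables: the percentile is the share of tracked outputs that ranked the same or lower, rounded down. A minimal sketch of that calculation is below; the floor rounding and the treatment of ties are assumptions for illustration, not Altmetric's documented method.

```python
import math

# Ranks and pool sizes taken from the tables above: (rank, total outputs).
RANKS = {
    "All research outputs": (7_507_557, 23_197_711),
    "Outputs from BMC Medical Education": (1_340, 3_410),
    "Outputs of similar age": (118_808, 307_365),
    "Outputs of similar age, same source": (29, 55),
}

def percentile_from_rank(rank: int, total: int) -> int:
    """Percentage of tracked outputs ranked at or below this one, floored.

    A rank of 1 is the most-mentioned output, so (total - rank) outputs
    scored the same or lower.
    """
    return math.floor(100 * (total - rank) / total)

for label, (rank, total) in RANKS.items():
    print(f"{label}: {percentile_from_rank(rank, total)}th percentile")
```

Running this reproduces the figures quoted in the text: 67th percentile overall, 60th within the source, 61st among contemporaries, and 47th among contemporaries from the same source.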