
Toward meaningful evaluation of medical trainees: the influence of participants’ perceptions of the process

Overview of attention for article published in Advances in Health Sciences Education, February 2010

Mentioned by

5 X users

Citations

107 Dimensions

Readers on

193 Mendeley
1 CiteULike
Title
Toward meaningful evaluation of medical trainees: the influence of participants’ perceptions of the process
Published in
Advances in Health Sciences Education, February 2010
DOI 10.1007/s10459-010-9223-x
Pubmed ID
Authors

Christopher J. Watling, Lorelei Lingard

Abstract

An essential goal of evaluation is to foster learning. Across the medical education spectrum, evaluation of clinical performance is dominated by subjective feedback to learners based on observation by expert supervisors. Research in non-medical settings has suggested that participants' perceptions of evaluation processes exert considerable influence over whether the feedback they receive actually facilitates learning, but similar research on perceptions of feedback in the medical setting has been limited. In this review, we examine the literature on recipient perceptions of feedback and how those perceptions influence the contribution that feedback makes to their learning. A focused exploration of relevant work on this subject in higher education and industrial psychology settings is followed by a detailed examination of available research on perceptions of evaluation processes in medical settings, encompassing both trainee and evaluator perspectives. We conclude that recipients' and evaluators' perceptions of an evaluation process profoundly affect the usefulness of the evaluation and the extent to which it achieves its goals. Attempts to improve evaluation processes cannot, therefore, be limited to assessment tool modification driven by reliability and validity concerns, but must also take account of the critical issue of feedback reception and the factors that influence it. Given the unique context of clinical performance evaluation in medicine, a research agenda is required that seeks to more fully understand the complexity of the processes of giving, receiving, interpreting, and using feedback as a basis for real progress toward meaningful evaluation.

X Demographics

The data shown below were collected from the profiles of 5 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 193 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
Canada               3     2%
United States        2     1%
Ireland              1    <1%
Australia            1    <1%
Unknown            186    96%

Demographic breakdown

Readers by professional status        Count   As %
Professor > Associate Professor          24    12%
Student > Master                         22    11%
Researcher                               18     9%
Other                                    17     9%
Student > Ph. D. Student                 17     9%
Other                                    67    35%
Unknown                                  28    15%

Readers by discipline                 Count   As %
Medicine and Dentistry                   88    46%
Social Sciences                          25    13%
Nursing and Health Professions           15     8%
Psychology                                8     4%
Business, Management and Accounting       5     3%
Other                                    11     6%
Unknown                                  41    21%
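The "As %" shares in the tables above appear to be each category's count divided by the 193 Mendeley readers, rounded to the nearest whole percent, with anything rounding below 1% shown as "<1%". A minimal sketch of that arithmetic, using the geographical counts above (the rounding rule is an assumption, not documented on this page):

```python
# Minimal sketch (assumption): each "As %" value is the category count divided by
# the 193 Mendeley readers, rounded to the nearest whole percent; shares that
# fall below 1% are displayed as "<1%".

TOTAL_READERS = 193

geography = {
    "Canada": 3,
    "United States": 2,
    "Ireland": 1,
    "Australia": 1,
    "Unknown": 186,
}

def share(count: int, total: int = TOTAL_READERS) -> str:
    pct = 100 * count / total
    return "<1%" if pct < 1 else f"{round(pct)}%"

for country, count in geography.items():
    print(f"{country:<15}{count:>5}  {share(count)}")
# Canada              3  2%
# United States       2  1%
# Ireland             1  <1%
# Australia           1  <1%
# Unknown           186  96%
```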
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 12 July 2022.
All research outputs: #13,825,092 of 24,149,630 outputs
Outputs from Advances in Health Sciences Education: #508 of 910 outputs
Outputs of similar age: #135,851 of 172,692 outputs
Outputs of similar age from Advances in Health Sciences Education: #8 of 11 outputs
Altmetric has tracked 24,149,630 research outputs across all sources so far. This one is in the 42nd percentile – i.e., 42% of other outputs scored the same or lower than it.
So far Altmetric has tracked 910 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.6. This one is in the 40th percentile – i.e., 40% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 172,692 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 20th percentile – i.e., 20% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 11 others from the same source and published within six weeks on either side of this one. This one is in the 27th percentile – i.e., 27% of its contemporaries scored the same or lower than it.
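The percentile comparisons above all follow the same rule described in the text: an output's percentile is the share of a comparison pool (all tracked outputs, outputs from the same journal, or outputs published within six weeks on either side) whose Attention Score is the same as or lower than this output's. A minimal sketch of that calculation, using a hypothetical pool of scores for illustration:

```python
# Minimal sketch of the percentile rule described above: the percentile is the share
# of the comparison pool whose Attention Score is the same as or lower than the
# output's own score. The pool of scores below is hypothetical, for illustration only.

def attention_percentile(score: float, pool_scores: list[float]) -> float:
    """Percent of the comparison pool scoring the same as or lower than `score`."""
    at_or_below = sum(1 for s in pool_scores if s <= score)
    return 100 * at_or_below / len(pool_scores)

# Hypothetical comparison pool, e.g. outputs published within six weeks of this one.
pool = [0, 0, 1, 1, 2, 3, 5, 8, 13, 25]

print(f"{attention_percentile(3, pool):.0f}th percentile")  # 60th percentile for this toy pool
```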