
From aggregation to interpretation: how assessors judge complex data in a competency-based portfolio

Overview of attention for article published in Advances in Health Sciences Education, October 2017

About this Attention Score

  • In the top 10% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#50 of 892)
  • High Attention Score compared to outputs of the same age (89th percentile)
  • High Attention Score compared to outputs of the same age and source (84th percentile)

Mentioned by

29 X users
1 Facebook page
1 Google+ user

Citations

47 Dimensions

Readers on

115 Mendeley
Title
From aggregation to interpretation: how assessors judge complex data in a competency-based portfolio
Published in
Advances in Health Sciences Education, October 2017
DOI 10.1007/s10459-017-9793-y
Authors

Andrea Oudkerk Pool, Marjan J. B. Govaerts, Debbie A. D. C. Jaarsma, Erik W. Driessen

Abstract

While portfolios are increasingly used to assess competence, the validity of such portfolio-based assessments has hitherto remained unconfirmed. The purpose of the present research is therefore to further our understanding of how assessors form judgments when interpreting the complex data included in a competency-based portfolio. Eighteen assessors appraised one of three competency-based mock portfolios while thinking aloud, before taking part in semi-structured interviews. A thematic analysis of the think-aloud protocols and interviews revealed that assessors reached judgments through a three-phase cyclical cognitive process of acquiring, organizing, and integrating evidence. Upon conclusion of the first cycle, assessors reviewed the remaining portfolio evidence to look for confirming or disconfirming evidence. Assessors were inclined to stick to their initial judgments even when confronted with seemingly disconfirming evidence. Although assessors reached similar final (pass-fail) judgments of students' professional competence, they differed in their information-processing approaches and the reasoning behind their judgments. Differences sprang from assessors' divergent assessment beliefs, performance theories, and inferences about the student. Assessment beliefs refer to assessors' opinions about what kind of evidence gives the most valuable and trustworthy information about the student's competence, whereas assessors' performance theories concern their conceptualizations of what constitutes professional competence and competent performance. Even when using the same pieces of information, assessors furthermore differed with respect to inferences about the student as a person as well as a (future) professional. Our findings support the notion that assessors' reasoning in judgment and decision-making varies and is guided by their mental models of performance assessment, potentially impacting feedback and the credibility of decisions. Our findings also lend further credence to the assertion that portfolios should be judged by multiple assessors who should, moreover, thoroughly substantiate their judgments. Finally, it is suggested that portfolios be designed in such a way that they facilitate the selection of and navigation through the portfolio evidence.

X Demographics

The data shown below were collected from the profiles of the 29 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 115 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown    115     100%

Demographic breakdown

Readers by professional status     Count   As %
Student > Master                   20      17%
Student > Ph.D. Student            12      10%
Researcher                         8       7%
Professor > Associate Professor    8       7%
Other                              7       6%
Other                              29      25%
Unknown                            31      27%

Readers by discipline              Count   As %
Medicine and Dentistry             31      27%
Social Sciences                    22      19%
Nursing and Health Professions     9       8%
Psychology                         4       3%
Unspecified                        2       2%
Other                              9       8%
Unknown                            38      33%
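
The "As %" column in both tables appears to be each count taken as a share of all 115 Mendeley readers (including Unknown), rounded to the nearest whole percent. That reading is an inference from the numbers above, not a documented Altmetric rule; a minimal sketch in Python:

    # Sketch: reproduce the "As %" column of the professional-status table
    # above, assuming each figure is count / 115 readers, rounded to the
    # nearest whole percent. The data are copied from the table; the
    # rounding rule is an inference, not documented Altmetric behaviour.
    rows = [
        ("Student > Master", 20),
        ("Student > Ph.D. Student", 12),
        ("Researcher", 8),
        ("Professor > Associate Professor", 8),
        ("Other", 7),
        ("Other", 29),
        ("Unknown", 31),
    ]
    TOTAL_READERS = 115

    for status, count in rows:
        pct = round(count / TOTAL_READERS * 100)
        print(f"{status:<33} {count:>5} {pct:>4}%")

Running the same rule over the discipline table reproduces its percentages as well (e.g. 4 / 115 gives 3%).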
Attention Score in Context

This research output has an Altmetric Attention Score of 20. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 06 November 2020.
Comparison group                                                      Rank
All research outputs                                                  #1,687,472 of 23,852,694 outputs
Outputs from Advances in Health Sciences Education                    #50 of 892 outputs
Outputs of similar age                                                #34,970 of 329,161 outputs
Outputs of similar age from Advances in Health Sciences Education     #3 of 13 outputs
Altmetric has tracked 23,852,694 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 92nd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 892 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.7. This one has done particularly well, scoring higher than 94% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 329,161 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 89% of its contemporaries.
We're also able to compare this research output to 13 others from the same source that were published within six weeks on either side of this one. This one has done well, scoring higher than 84% of its contemporaries.
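
The percentile figures quoted above follow directly from the ranks in the comparison table. A rule of the form floor((total - rank + 1) / total * 100) reproduces all four values on this page; that rule is inferred from these numbers, not taken from Altmetric's documentation. A minimal sketch in Python:

    # Sketch: derive the percentile figures above from rank and group size.
    # The formula (share of outputs ranked at or below this one, truncated
    # to a whole percent) matches all four values on this page; it is an
    # inference from those values, not documented Altmetric behaviour.
    import math

    def percentile(rank: int, total: int) -> int:
        return math.floor((total - rank + 1) / total * 100)

    print(percentile(1_687_472, 23_852_694))  # 92 -> "92nd percentile"
    print(percentile(50, 892))                # 94 -> "higher than 94% of its peers"
    print(percentile(34_970, 329_161))        # 89 -> "higher than 89% of its contemporaries"
    print(percentile(3, 13))                  # 84 -> "higher than 84% of its contemporaries"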