
Approximate measurement invariance in cross-classified rater-mediated assessments

Overview of attention for an article published in Frontiers in Psychology, December 2014

About this Attention Score

  • Average Attention Score compared to outputs of the same age

Mentioned by

  • 2 X users

Citations

  • 17 Dimensions

Readers on

  • 34 Mendeley

Published in
Frontiers in Psychology, December 2014
DOI 10.3389/fpsyg.2014.01469
Authors

Ben Kelcey, Dan McGinn, Heather Hill

Abstract

An important assumption underlying meaningful comparisons of scores in rater-mediated assessments is that measurement is commensurate across raters. When raters differentially apply the standards established by an instrument, scores from different raters are on fundamentally different scales and no longer preserve a common meaning and basis for comparison. In this study, we developed a method to accommodate measurement noninvariance across raters when measurements are cross-classified within two distinct hierarchical units. We conceptualized a cross-classified graded response model with random item effects, using random discrimination and threshold effects to test, calibrate, and account for measurement noninvariance among raters. By leveraging empirical estimates of rater-specific deviations in the discrimination and threshold parameters, the proposed method identifies noninvariant items and directly adjusts for this noninvariance within a cross-classified framework. Within the context of teaching evaluations, a case study suggested substantial noninvariance across raters and showed that establishing an approximately invariant scale through random item effects improves model fit and predictive validity.
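
To make the modeling idea concrete, the following minimal Python sketch simulates ordinal ratings from a graded response model whose discrimination and threshold parameters vary randomly by rater. The dimensions, distributions, and names (u, v, rater_eff) are illustrative assumptions, not the authors' specification; the paper's contribution is estimating such rater-specific deviations within a cross-classified (e.g., teacher-by-rater) framework, not simulating them.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy dimensions -- not taken from the paper.
    n_items, n_raters, n_teachers, n_cats = 4, 10, 30, 4

    # Fixed item parameters: discrimination a_i and ordered thresholds b_ik.
    a = rng.uniform(0.8, 1.6, size=n_items)
    b = np.sort(rng.normal(0.0, 1.0, size=(n_items, n_cats - 1)), axis=1)

    # Rater-specific random item effects: the source of noninvariance.
    u = rng.normal(0.0, 0.3, size=(n_items, n_raters))              # discrimination deviations u_ir
    v = rng.normal(0.0, 0.4, size=(n_items, n_raters, n_cats - 1))  # threshold deviations v_irk

    theta = rng.normal(0.0, 1.0, size=n_teachers)    # latent teacher quality
    rater_eff = rng.normal(0.0, 0.5, size=n_raters)  # cross-classified rater severity

    def category_probs(i, r, t):
        """P(Y = k), k = 0..n_cats-1, under a graded response model whose
        item parameters vary by rater: a_ir = a_i + u_ir, b_irk = b_ik + v_irk."""
        a_ir = a[i] + u[i, r]
        b_ir = np.sort(b[i] + v[i, r])  # re-sort so thresholds stay ordered
        p_ge = 1.0 / (1.0 + np.exp(-a_ir * (theta[t] + rater_eff[r] - b_ir)))
        cum = np.concatenate(([1.0], p_ge, [0.0]))  # P(Y >= k) for k = 0..n_cats
        return cum[:-1] - cum[1:]                   # adjacent differences give P(Y = k)

    # One simulated rating: item 0 scored by rater 3 for teacher 7.
    p = category_probs(0, 3, 7)
    print(p, rng.choice(n_cats, p=p))

When u and v are all zero, this collapses to an ordinary invariant graded response model, which is the comparison point for the model-fit claim in the abstract.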

X Demographics

The data shown below were collected from the profiles of 2 X users who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 34 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Unknown 34 100%

Demographic breakdown

Readers by professional status Count As %
Student > Ph.D. Student 11 32%
Student > Doctoral Student 5 15%
Researcher 4 12%
Student > Master 3 9%
Student > Bachelor 2 6%
Other 2 6%
Unknown 7 21%
Readers by discipline Count As %
Social Sciences 11 32%
Psychology 9 26%
Economics, Econometrics and Finance 3 9%
Computer Science 2 6%
Agricultural and Biological Sciences 1 3%
Other 3 9%
Unknown 5 15%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 August 2020.
All research outputs: #17,131,478 of 25,169,746 outputs
Outputs from Frontiers in Psychology: #20,960 of 33,998 outputs
Outputs of similar age: #222,281 of 365,353 outputs
Outputs of similar age from Frontiers in Psychology: #291 of 362 outputs
Altmetric has tracked 25,169,746 research outputs across all sources so far. This one is in the 21st percentile – i.e., 21% of other outputs scored the same or lower than it.
So far Altmetric has tracked 33,998 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.2. This one is in the 32nd percentile – i.e., 32% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 365,353 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 30th percentile – i.e., 30% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 362 others from the same source and published within six weeks on either side of this one. This one is in the 15th percentile – i.e., 15% of its contemporaries scored the same or lower than it.
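
Each percentile statement above applies the same rule: the percentage of comparison outputs whose Attention Score is the same as or lower than this output's. Below is a minimal sketch of that calculation, using made-up peer scores rather than Altmetric's data; heavy ties at low scores are why this figure can differ from a naive rank-based estimate.

    def attention_percentile(score, peer_scores):
        """Percent of peer outputs scoring the same as or lower than
        `score` -- the definition used in the statements above."""
        same_or_lower = sum(s <= score for s in peer_scores)
        return 100.0 * same_or_lower / len(peer_scores)

    # Toy illustration with hypothetical peer scores (not Altmetric data):
    peers = [0, 0, 0, 1, 1, 2, 5, 13, 40, 120]
    print(attention_percentile(1, peers))  # 5 of 10 peers scored <= 1 -> 50.0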