
Investigating the Impact of Item Parameter Drift for Item Response Theory Models with Mixture Distributions

Overview of attention for article published in Frontiers in Psychology, February 2016
Altmetric Badge

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

2 X users

Citations

13 Dimensions

Readers on

26 Mendeley
Title
Investigating the Impact of Item Parameter Drift for Item Response Theory Models with Mixture Distributions
Published in
Frontiers in Psychology, February 2016
DOI 10.3389/fpsyg.2016.00255
Pubmed ID
Authors

Yoon Soo Park, Young-Sun Lee, Kuan Xing

Abstract

This study investigates the impact of item parameter drift (IPD) on parameter and ability estimation when the underlying measurement model fits a mixture distribution, thereby violating the item invariance property of unidimensional item response theory (IRT) models. An empirical study was conducted to demonstrate the occurrence of both IPD and an underlying mixture distribution using real-world data. Twenty-one trended anchor items from the 1999, 2003, and 2007 administrations of the Trends in International Mathematics and Science Study (TIMSS) were analyzed using unidimensional and mixture IRT models. TIMSS treats trended anchor items as invariant across testing administrations and uses pre-calibrated item parameters based on unidimensional IRT. However, empirical results showed evidence of two latent subgroups with IPD. Results also showed changes in the distribution of examinee ability between latent classes over the three administrations. A simulation study was conducted to examine the impact of IPD on the estimation of ability and item parameters when data have underlying mixture distributions. Simulated data were generated from a mixture IRT model and estimated using unidimensional IRT. Results showed that data generated with IPD under the mixture IRT model also produced IPD under the unidimensional IRT model. Changes in the distribution of examinee ability likewise affected item parameter estimates. Moreover, drift in item discrimination and shifts in the distribution of examinee ability affected estimates of examinee ability. These findings demonstrate the need for caution and for evaluating IPD within a mixture IRT framework to understand its effects on item parameters and examinee ability.
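
To make the simulation design described in the abstract concrete, the following is a minimal sketch (not the authors' code) of generating dichotomous responses from a two-class mixture 2PL IRT model in which a few anchor items drift for one latent class only. The sample sizes, ability distributions, parameter ranges, drifted items, and drift magnitude are illustrative assumptions, not values taken from the study.

    # Minimal sketch: two-class mixture 2PL data with item parameter drift (IPD)
    # confined to one latent class. All numeric settings below are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    N_PER_CLASS = 1000        # examinees per latent class (assumed)
    N_ITEMS = 21              # matches the 21 trended TIMSS anchor items

    # Class-specific ability distributions (assumed means/SDs)
    theta_c1 = rng.normal(0.0, 1.0, N_PER_CLASS)
    theta_c2 = rng.normal(-0.5, 1.0, N_PER_CLASS)

    # Baseline item parameters shared by both classes (assumed ranges)
    a = rng.uniform(0.8, 2.0, N_ITEMS)    # discrimination
    b = rng.normal(0.0, 1.0, N_ITEMS)     # difficulty

    # Drift for class 2 only: shift the difficulty of a few anchor items
    drift_items = np.array([0, 1, 2])     # which anchors drift (assumed)
    b_c2 = b.copy()
    b_c2[drift_items] += 0.5              # drift magnitude (assumed)

    def simulate_2pl(theta, a, b):
        # Bernoulli responses under the 2PL: P(X=1) = 1 / (1 + exp(-a(theta - b)))
        p = 1.0 / (1.0 + np.exp(-a[None, :] * (theta[:, None] - b[None, :])))
        return (rng.uniform(size=p.shape) < p).astype(int)

    responses = np.vstack([
        simulate_2pl(theta_c1, a, b),      # class 1: no drift
        simulate_2pl(theta_c2, a, b_c2),   # class 2: drifted anchors
    ])
    print(responses.shape)                 # (2000, 21)

Calibrating responses like these with a single-group unidimensional IRT model, as the simulation study does, is how class-level drift and shifts in the ability distribution can propagate into the pooled item parameter and ability estimates.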

X Demographics

The data shown below were collected from the profiles of 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 26 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown       26    100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Ph. D. Student              5     19%
Student > Doctoral Student            4     15%
Student > Master                      4     15%
Researcher                            2      8%
Student > Bachelor                    1      4%
Other                                 3     12%
Unknown                               7     27%

Readers by discipline             Count    As %
Psychology                            7     27%
Social Sciences                       4     15%
Engineering                           2      8%
Linguistics                           1      4%
Computer Science                      1      4%
Other                                 3     12%
Unknown                               8     31%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 05 March 2016.
Comparison set                                         Rank           Outputs compared
All research outputs                                   #14,252,067    of 22,852,911 outputs
Outputs from Frontiers in Psychology                   #15,122        of 29,874 outputs
Outputs of similar age                                 #156,898       of 298,866 outputs
Outputs of similar age from Frontiers in Psychology    #299           of 478 outputs
Altmetric has tracked 22,852,911 research outputs across all sources so far. This one is in the 35th percentile – i.e., 35% of other outputs scored the same or lower than it.
So far Altmetric has tracked 29,874 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.5. This one is in the 46th percentile – i.e., 46% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 298,866 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 44th percentile – i.e., 44% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 478 others from the same source and published within six weeks on either side of this one. This one is in the 33rd percentile – i.e., 33% of its contemporaries scored the same or lower than it.
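
For readers unfamiliar with how the percentile statements above are computed, the following is a small sketch of the percentile-rank idea (the share of comparison outputs scoring the same as or lower than this one). The peer scores are invented for illustration; Altmetric's actual weighting of mentions is not reproduced here.

    # Hypothetical example: percentile rank of an Attention Score within a cohort.
    def percentile_rank(score, peer_scores):
        # Percent of peers whose score is the same as or lower than `score`.
        same_or_lower = sum(1 for s in peer_scores if s <= score)
        return 100.0 * same_or_lower / len(peer_scores)

    peers = [0, 0, 1, 1, 2, 3, 5, 12, 25, 40]   # made-up cohort of peer scores
    print(percentile_rank(2, peers))            # 50.0 for this toy cohort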