
Integrating Curriculum-Based Dynamic Assessment in Computerized Adaptive Testing: Development and Predictive Validity of the EDPL-BAI Battery on Reading Competence

Overview of attention for an article published in Frontiers in Psychology, August 2018

About this Attention Score

  • Good Attention Score compared to outputs of the same age (66th percentile)
  • Above-average Attention Score compared to outputs of the same age and source (58th percentile)

Mentioned by

8 X users

Readers on

42 Mendeley readers
Published in
Frontiers in Psychology, August 2018
DOI 10.3389/fpsyg.2018.01492
Authors

Juan-José Navarro, Catalina Mourgues-Codern, Eduardo Guzmán, Isabel R. Rodríguez-Ortiz, Ricardo Conejo, Claudia Sánchez-Gutiérrez, Jesús de la Fuente, Diana Martella, Mahia Saracostti

Abstract

In recent decades there have been significant changes in the conceptualization of reading, as well as in the perception of how this activity should be assessed. Interest in the analysis of reading processes has led to the emergence of new explanatory models based primarily on the contributions of cognitive psychology. In parallel, there have been notable advances in measurement procedures, especially in models based on Item Response Theory (IRT), as well as in the capacity and performance of specific software programs for managing and analyzing data. These changes have contributed significantly to the rise of testing procedures such as computerized adaptive tests (CATs), whose fundamental characteristic is that the sequence of items presented is adapted to the level of competence the subject manifests. Likewise, incorporating elements of dynamic assessment (DA), in which graduated prompts are offered as needed, yields information about the type and degree of support a subject requires to optimize performance. In this sense, the confluence of contributions from DA and CATs offers a new possibility for approaching the assessment of learning processes. In this article, we present longitudinal research, developed in two phases, through which a computerized dynamic adaptive assessment battery of reading processes (EDPL-BAI) was configured. The research involved 1,831 students (46% girls) from 13 public schools in three regions of Chile. The purpose of this study was to analyze the differential contribution to reading competence of dynamic scores obtained in a subsample of 324 students (47% girls) from third to sixth grade after the implementation of a set of adaptive dynamic tests of morpho-syntactic processes. The results of structural equation modeling indicate a good global fit. Individual relationships show a significant contribution to reading competence of the calibrated score, which reflects the estimated knowledge level, as well as of dynamic scores based on the assigned value of the graduated prompts students required. These results showed significant predictive value for reading competence and incremental validity relative to predictions made by static criterion tests.
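The core adaptive mechanism the abstract describes — presenting each successive item according to the competence level the examinee has manifested so far — can be sketched in a few lines. This is an illustrative toy, not the EDPL-BAI's actual engine: the Rasch (1PL) IRT model, maximum-information item selection, and simple staircase ability update below are assumptions chosen for clarity.

```python
import math

def rasch_prob(theta, b):
    """P(correct response) under the Rasch (1PL) IRT model,
    given ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of an item at ability theta; maximal
    when difficulty matches ability (p = 0.5)."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def next_item(theta, difficulties, administered):
    """Select the not-yet-administered item that is most
    informative at the current ability estimate."""
    candidates = [i for i in range(len(difficulties)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, difficulties[i]))

def update_theta(theta, correct, step=0.5):
    """Crude staircase update: raise the ability estimate after a
    correct response, lower it after an incorrect one."""
    return theta + step if correct else theta - step
```

With an item bank of difficulties `[-2, -1, 0, 1, 2]` and a current estimate of `theta = 0`, `next_item` picks the middle item, since its difficulty matches the examinee's estimated ability. A DA layer like the one the battery integrates would additionally record how many graduated prompts each item required and fold that into a dynamic score.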

X Demographics

The data shown below were collected from the profiles of 8 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 42 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown       42   100%

Demographic breakdown

Readers by professional status    Count   As %
Student > Master                      9    21%
Researcher                            4    10%
Student > Doctoral Student            4    10%
Student > Ph. D. Student              3     7%
Student > Bachelor                    2     5%
Other                                 7    17%
Unknown                              13    31%

Readers by discipline             Count   As %
Psychology                           12    29%
Social Sciences                       4    10%
Engineering                           3     7%
Linguistics                           2     5%
Nursing and Health Professions        1     2%
Other                                 8    19%
Unknown                              12    29%
Attention Score in Context

This research output has an Altmetric Attention Score of 5. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 28 July 2019.
  • All research outputs: #6,479,616 of 23,821,324 outputs
  • Outputs from Frontiers in Psychology: #9,254 of 31,778 outputs
  • Outputs of similar age: #110,497 of 336,189 outputs
  • Outputs of similar age from Frontiers in Psychology: #306 of 748 outputs
Altmetric has tracked 23,821,324 research outputs across all sources so far. This one has received more attention than most of these and is in the 72nd percentile.
So far Altmetric has tracked 31,778 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.7. This one has gotten more attention than average, scoring higher than 70% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 336,189 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 66% of its contemporaries.
We're also able to compare this research output to 748 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 58% of its contemporaries.
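The percentile figures above follow from the ranks and cohort sizes in a straightforward way; one plausible formulation is sketched below. The small discrepancies with some reported figures (e.g. 66th vs. a naive 67th for the similar-age cohort) suggest Altmetric handles ties or rounding differently, so treat this as an approximation rather than their exact method.

```python
def percentile_rank(rank, total):
    """Approximate percentile: share of outputs this one outscores,
    given its rank (1 = most attention) among `total` tracked outputs."""
    return 100.0 * (1.0 - rank / total)

# Example with the overall figures quoted on this page:
# rank #6,479,616 of 23,821,324 outputs -> roughly the 72nd percentile.
overall = percentile_rank(6_479_616, 23_821_324)
```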