
Applying Rasch analysis to evaluate measurement equivalence of different administration formats of the Activity Limitation scale of the Cambridge Pulmonary Hypertension Outcome Review (CAMPHOR)

Overview of attention for article published in Health and Quality of Life Outcomes, April 2016

About this Attention Score

  • Average Attention Score compared to outputs of the same age

Mentioned by

  • Twitter: 3 tweeters
  • Facebook: 1 Facebook page

Citations

  • Dimensions: 5 citations

Readers on

  • Mendeley: 30 readers
Published in
Health and Quality of Life Outcomes, April 2016
DOI 10.1186/s12955-016-0462-2
Pubmed ID
Authors

J. Twiss, S. P. McKenna, J. Graham, K. Swetz, J. Sloan, M. Gomberg-Maitland

Abstract

Electronic formats of patient-reported outcome (PRO) measures are now routinely used in clinical research studies. When changing from a validated pen-and-paper administration to an electronic one, it is necessary to establish the equivalence of the two formats. This study reports on the value of Rasch analysis in this process. Three groups of US pulmonary hypertension (PH) patients participated. The first completed an electronic version of the CAMPHOR Activity Limitation scale (e-sample), and this was compared with two pen-and-paper administered samples (pp1 and pp2). The three databases were combined and analysed for fit to the Rasch model. Equivalence was evaluated by differential item functioning (DIF) analyses. The three datasets were matched randomly in terms of sample size (n = 147). Mean age (years) and percentage of male respondents were as follows: e-sample (51.7, 16.0 %); pp1 (50.0, 14.0 %); pp2 (55.5, 40.4 %). The combined dataset achieved fit to the Rasch model. Two items showed evidence of borderline DIF. Further analyses showed that including these items had little impact on Rasch estimates, indicating that the DIF identified was unimportant. Differences between the performance of the electronic and pen-and-paper administrations of the CAMPHOR Activity Limitation scale were minor. The results demonstrate how the Rasch model can be used to determine the equivalence of alternative formats of PRO measures.
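The DIF-by-administration-format check described in the abstract can be illustrated with a minimal sketch. This is not the authors' actual analysis: it uses simulated data, hypothetical difficulty values, and a crude PROX-style item-difficulty approximation (logit-transformed proportions) rather than full Rasch conditional estimation. The idea is the same, however: estimate item difficulties separately for each administration group and inspect the per-item contrasts; an item whose contrast stands out shows DIF between formats.

```python
import numpy as np

def simulate_rasch(theta, b, rng):
    """Simulate dichotomous Rasch responses: P(X=1) = sigmoid(theta - b)."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    return (rng.random(p.shape) < p).astype(int)

def prox_difficulties(responses):
    """Crude PROX-style item difficulty estimates, centred to sum to zero."""
    p = responses.mean(axis=0)          # proportion endorsing each item
    b_hat = -np.log(p / (1 - p))        # harder items -> lower p -> higher b
    return b_hat - b_hat.mean()

rng = np.random.default_rng(42)
b_true = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # hypothetical difficulties

# Group 1 ("pen and paper"): responses under the common item difficulties.
theta1 = rng.normal(0.0, 1.0, 2000)
resp1 = simulate_rasch(theta1, b_true, rng)

# Group 2 ("electronic"): item 2 is made 0.8 logits harder to inject DIF.
b_dif = b_true.copy()
b_dif[2] += 0.8
theta2 = rng.normal(0.0, 1.0, 2000)
resp2 = simulate_rasch(theta2, b_dif, rng)

# Per-item DIF contrast: difference in centred difficulty estimates.
dif = prox_difficulties(resp2) - prox_difficulties(resp1)
for i, d in enumerate(dif):
    print(f"item {i}: DIF contrast = {d:+.2f}")
```

In this sketch the injected item produces the largest contrast while the others stay near zero, mirroring the paper's logic of flagging items with non-trivial DIF and then checking whether their inclusion materially changes the Rasch estimates.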

Twitter Demographics

The data shown below were collected from the profiles of 3 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 30 Mendeley readers of this research output.

Geographical breakdown

  • United Kingdom: 1 reader (3%)
  • Unknown: 29 readers (97%)

Demographic breakdown

Readers by professional status:

  • Student > Master: 6 (20%)
  • Student > Ph.D. Student: 5 (17%)
  • Student > Bachelor: 4 (13%)
  • Student > Doctoral Student: 4 (13%)
  • Researcher: 3 (10%)
  • Other: 8 (27%)
Readers by discipline:

  • Medicine and Dentistry: 7 (23%)
  • Psychology: 5 (17%)
  • Nursing and Health Professions: 5 (17%)
  • Computer Science: 3 (10%)
  • Social Sciences: 3 (10%)
  • Other: 6 (20%)
  • Unknown: 1 (3%)

Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 12 April 2016.
  • All research outputs: #9,450,713 of 15,442,255 outputs
  • Outputs from Health and Quality of Life Outcomes: #861 of 1,660 outputs
  • Outputs of similar age: #137,870 of 265,936 outputs
  • Outputs of similar age from Health and Quality of Life Outcomes: #1 of 1 output
Altmetric has tracked 15,442,255 research outputs across all sources so far. This one is in the 36th percentile – i.e., 36% of other outputs scored the same or lower than it.
So far Altmetric has tracked 1,660 research outputs from this source. They receive a mean Attention Score of 4.0. This one is in the 44th percentile – i.e., 44% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 265,936 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 44th percentile – i.e., 44% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 1 other from the same source published within six weeks on either side of this one. This one has scored higher than all of them.