
Patient understanding of two commonly used patient reported outcome measures for primary care: a cognitive interview study

Overview of attention for article published in BMC Primary Care, September 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (85th percentile)
  • High Attention Score compared to outputs of the same age and source (83rd percentile)

Mentioned by

24 X users

Citations

5 citations on Dimensions

Readers on

41 Mendeley readers
Title
Patient understanding of two commonly used patient reported outcome measures for primary care: a cognitive interview study
Published in
BMC Primary Care, September 2018
DOI 10.1186/s12875-018-0850-2
Pubmed ID
Authors

Mairead Murphy, Sandra Hollinghurst, Chris Salisbury

Abstract

Standardised generic patient-reported outcome measures (PROMs) which measure health status are often unresponsive to change in primary care. Alternative formats, which have been used to increase responsiveness, include individualised PROMs (in which respondents specify the outcomes of interest in their own words) and transitional PROMs (in which respondents directly rate change over a period).

The objective of this study was to test qualitatively, through cognitive interviews, two PROMs, one using each format. The individualised PROM selected was the Measure Yourself Medical Outcomes Profile (MYMOP); the transitional PROM was the Patient Enablement Instrument (PEI). Twenty patients who had recently attended the GP were interviewed while completing the questionnaires. Interview data were analysed using a modification of Tourangeau's model of cognitive processing: comprehension, response, recall and face validity.

Patients found the PEI simple to complete, but for some it lacked face validity. The transitional scale was sometimes confused with a status scale and was problematic in situations where the relevant GP appointment was part of a longer episode of care. Some patients reported a high enablement score despite verbally reporting low enablement but high regard for their GP, which suggested hypothesis-guessing. The interpretation of the PEI items was inconsistent between patients. MYMOP was more difficult for patients to complete, but had greater face validity than the PEI. The scale used was open to response shift: some patients suggested they would recalibrate their definition of the scale endpoints as their illness and expectations changed.

The study provides information for both users of PEI/MYMOP and developers of individualised and transitional questionnaires. Users should heed the recommendation that MYMOP should be interview-administered, and this is likely to apply to other individualised scales. The PEI is open to hypothesis-guessing and may lack face validity for a longer episode of care (e.g. in patients with chronic conditions). Developers should be cognisant that transitional scales can be inconsistently completed: some patients forget during completion that they are measuring change from baseline. Although generic questionnaires require the content to be more general than do disease-specific questionnaires, developers should avoid questions which allow broad and varied interpretations.

X Demographics

The data shown below were collected from the profiles of 24 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 41 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 41    | 100%

Demographic breakdown

Readers by professional status | Count | As %
Researcher                     | 5     | 12%
Student > Bachelor             | 5     | 12%
Student > Doctoral Student     | 4     | 10%
Student > Ph. D. Student       | 4     | 10%
Other                          | 3     | 7%
Other                          | 6     | 15%
Unknown                        | 14    | 34%
Readers by discipline               | Count | As %
Medicine and Dentistry              | 10    | 24%
Nursing and Health Professions      | 5     | 12%
Social Sciences                     | 3     | 7%
Psychology                          | 2     | 5%
Business, Management and Accounting | 1     | 2%
Other                               | 4     | 10%
Unknown                             | 16    | 39%
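As a quick sanity check, the "As %" values in the breakdowns above appear to be each reader count divided by the 41 total Mendeley readers, rounded to the nearest whole percent. Below is a minimal Python sketch that recomputes the professional-status column from the counts in the table; the category names and counts come from the table above, while the rounding rule is an assumption.

```python
# Minimal sketch: recompute the "As %" column from the reader counts above.
# Assumption: each share is round(100 * count / total_readers); Altmetric's
# exact rounding rule is not stated on the page.

professional_status = [
    ("Researcher", 5),
    ("Student > Bachelor", 5),
    ("Student > Doctoral Student", 4),
    ("Student > Ph. D. Student", 4),
    ("Other", 3),
    ("Other", 6),
    ("Unknown", 14),
]

total_readers = 41  # total Mendeley readers reported for this output

for category, count in professional_status:
    share = round(100 * count / total_readers)
    print(f"{category:<28} {count:>3} {share:>3}%")
```

Running this reproduces the 12%, 12%, 10%, 10%, 7%, 15% and 34% figures shown in the table.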
Attention Score in Context

This research output has an Altmetric Attention Score of 14. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 13 April 2019.
Comparison group                             | Rank       | Outputs in group
All research outputs                         | #2,550,228 | 25,385,509
Outputs from BMC Primary Care                | #299       | 2,359
Outputs of similar age                       | #51,647    | 351,592
Outputs of similar age from BMC Primary Care | #6         | 36
Altmetric has tracked 25,385,509 research outputs across all sources so far. Compared to these, this one has done well and is in the 89th percentile: it is in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,359 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 7.7. This one has done well, scoring higher than 87% of its peers.
Older research outputs will score higher simply because they have had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 351,592 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 85% of its contemporaries.
We are also able to compare this research output to 36 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 83% of its contemporaries.
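The percentile figures quoted above are consistent with converting each rank in the table into the share of outputs scoring lower, i.e. 1 − rank/group size, expressed as a whole percent. Below is a minimal Python sketch using the rankings from the table above; the flooring rule is an assumption (Altmetric's exact rounding is not documented here), but it reproduces the quoted 89th, 87th, 85th and 83rd percentiles.

```python
import math

# Minimal sketch: derive the quoted percentiles from rank within each comparison group.
# Assumption: percentile = floor(100 * (1 - rank / group_size)).

rankings = [
    ("All research outputs", 2_550_228, 25_385_509),
    ("Outputs from BMC Primary Care", 299, 2_359),
    ("Outputs of similar age", 51_647, 351_592),
    ("Outputs of similar age from BMC Primary Care", 6, 36),
]

for group, rank, total in rankings:
    percentile = math.floor(100 * (1 - rank / total))
    print(f"{group}: rank #{rank:,} of {total:,} -> {percentile}th percentile")
```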