
Content validation of an interprofessional learning video peer assessment tool

Overview of attention for article published in BMC Medical Education, December 2017

About this Attention Score

  • Above-average Attention Score compared to outputs of the same age (54th percentile)
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

X (Twitter): 6 users

Citations

Dimensions: 12

Readers on

Mendeley: 141
Title
Content validation of an interprofessional learning video peer assessment tool
Published in
BMC Medical Education, December 2017
DOI 10.1186/s12909-017-1099-5
Pubmed ID
Authors

Gillian Nisbet, Christine Jorm, Chris Roberts, Christopher J. Gordon, Timothy F. Chen

Abstract

Large-scale models of interprofessional learning (IPL) in which outcomes are assessed are rare within health professional curricula. To date, there is sparse research describing robust assessment strategies to support such activities. We describe the development of an IPL assessment task based on peer rating of a student-generated video evidencing collaborative interprofessional practice, and we provide content validation evidence for an assessment rubric in the context of large-scale IPL.

Two established approaches to scale development in an educational setting were combined. A literature review was undertaken to develop a conceptual model of the relevant domains and issues pertaining to assessment of student-generated videos within IPL. Starting with a prototype rubric developed from the literature, a series of staff and student workshops was undertaken to integrate expert opinion and user perspectives. Participants assessed five-minute videos produced in a prior pilot IPL activity. Outcomes from each workshop informed the next version of the rubric until agreement was reached on anchoring statements and criteria. At this point the rubric was declared fit to be used in the upcoming mandatory large-scale IPL activity.

The assessment rubric consisted of four domains: patient issues; interprofessional negotiation; interprofessional management plan in action; and effective use of the video medium to engage the audience. The first three domains reflected topic content relevant to the underlying construct of interprofessional collaborative practice. The fourth domain was consistent with the broader video assessment literature calling for greater emphasis on creativity in education.

We have provided evidence for the content validity of a video-based peer assessment task portraying interprofessional collaborative practice in the context of large-scale IPL activities for healthcare professional students. Further research is needed to establish the reliability of such a scale.

X Demographics

The data shown below were collected from the profiles of the 6 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 141 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 141 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Master | 14 | 10%
Student > Doctoral Student | 12 | 9%
Student > Bachelor | 11 | 8%
Student > Ph. D. Student | 8 | 6%
Researcher | 7 | 5%
Other | 22 | 16%
Unknown | 67 | 48%

Readers by discipline | Count | As %
Nursing and Health Professions | 16 | 11%
Medicine and Dentistry | 13 | 9%
Social Sciences | 9 | 6%
Psychology | 6 | 4%
Linguistics | 4 | 3%
Other | 19 | 13%
Unknown | 74 | 52%
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 20 December 2017.
All research outputs
#12,865,484
of 23,011,300 outputs
Outputs from BMC Medical Education
#1,481
of 3,366 outputs
Outputs of similar age
#199,633
of 440,140 outputs
Outputs of similar age from BMC Medical Education
#57
of 103 outputs
Altmetric has tracked 23,011,300 research outputs across all sources so far. This one is in the 43rd percentile – i.e., 43% of other outputs scored the same or lower than it.
So far Altmetric has tracked 3,366 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.3. This one has gotten more attention than average, scoring higher than 55% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 440,140 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 54% of its contemporaries.
We're also able to compare this research output to 103 others from the same source and published within six weeks on either side of this one. This one is in the 41st percentile – i.e., 41% of its contemporaries scored the same or lower than it.
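The percentile figures above all follow the same "same or lower" rule: an output's percentile rank is the share of comparison outputs whose Attention Score does not exceed its own. As an illustration only (this is not Altmetric's actual code, and the peer scores below are hypothetical), a minimal sketch:

```python
def percentile_rank(score: float, peer_scores: list[float]) -> float:
    """Percent of peers scoring the same or lower, matching the wording
    above: "43% of other outputs scored the same or lower than it"."""
    if not peer_scores:
        raise ValueError("peer_scores must be non-empty")
    same_or_lower = sum(1 for s in peer_scores if s <= score)
    return 100.0 * same_or_lower / len(peer_scores)

# Hypothetical peer Attention Scores, for illustration only.
peers = [0, 1, 1, 2, 3, 5, 8, 13]
print(percentile_rank(3, peers))  # 5 of 8 peers scored <= 3, so 62.5
```

Note that ties count in the output's favour under this definition, which is why an output with a modest score can still sit above the median when many contemporaries received no attention at all.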