
Using mixed methods evaluation to assess the feasibility of online clinical training in evidence based interventions: a case study of cognitive behavioural treatment for low back pain

Overview of attention for article published in BMC Medical Education, June 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (75th percentile)
  • Good Attention Score compared to outputs of the same age and source (73rd percentile)

Mentioned by

  • Twitter: 8 tweeters

Citations

  • Dimensions: 8 citations

Readers on

  • Mendeley: 102 readers
Title
Using mixed methods evaluation to assess the feasibility of online clinical training in evidence based interventions: a case study of cognitive behavioural treatment for low back pain
Published in
BMC Medical Education, June 2016
DOI 10.1186/s12909-016-0683-4
Authors

Helen Richmond, Amanda M. Hall, Zara Hansen, Esther Williamson, David Davies, Sarah E. Lamb

Abstract

Cognitive behavioural (CB) approaches are effective in the management of non-specific low back pain (LBP). We developed the CB Back Skills Training programme (BeST) and previously provided evidence of clinical and cost effectiveness in a large pragmatic trial. However, practice change is challenged by a lack of treatment guidance and training for clinicians. We aimed to explore the feasibility and acceptability of an online programme (iBeST) for providing training in a CB approach.

This mixed methods study comprised an individually randomised controlled trial of 35 physiotherapists and an interview study of 8 physiotherapists. Participants were recruited from 8 National Health Service departments in England and allocated by a computer generated randomisation list to receive iBeST (n = 16) or a face-to-face workshop (n = 19). Knowledge (of a CB approach), clinical skills (unblinded assessment of CB skills in practice), self-efficacy (reported confidence in using new skills), attitudes (towards LBP management), and satisfaction were assessed after training. Engagement with iBeST was assessed with user analytics. Interviews explored acceptability and experiences with iBeST. Data sets were analysed independently and jointly interpreted.

Fifteen (94 %) participants in the iBeST group and 16 (84 %) participants in the workshop group provided data immediately after training. We observed similar scores on knowledge (MD (95 % CI): 0.97 (-1.33, 3.26)) and self-efficacy to deliver the majority of the programme (MD (95 % CI): 0.25 (-1.7, 0.7)). However, the workshop group showed greater reduction in biomedical attitudes to LBP management (MD (95 % CI): -7.43 (-10.97, -3.89)). Clinical skills were assessed in 5 (33 %) iBeST participants and 7 (38 %) workshop participants within 6 months of training and were similar between groups (MD (95 % CI): 0.17 (-0.2, 0.54)). Interviews highlighted that while initially sceptical, participants found iBeST acceptable. A number of strategies were identified to enhance future versions of iBeST, such as including more skills practice.

Combined quantitative and qualitative data indicated that online training was an acceptable and promising method for providing training in an evidence based complex intervention. With future enhancement, the potential reach of this training method may facilitate evidence-based practice through large scale upskilling of the workforce. Current Controlled Trials ISRCTN82203145 (registered prospectively on 03.09.2012).
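The between-group results above are reported as mean differences with 95 % confidence intervals. A minimal sketch of how such an interval is typically computed for two independent samples, using a normal approximation — this is an illustration with hypothetical scores, not the trial's actual analysis code or data:

```python
import statistics
from math import sqrt

def mean_diff_ci(group_a, group_b, z=1.96):
    """Mean difference (A - B) with a normal-approximation 95% CI.

    Illustrative only: assumes independent samples and a z critical
    value; with arms of ~16 participants, analysts would more likely
    use the t distribution, giving a slightly wider interval.
    """
    ma, mb = statistics.mean(group_a), statistics.mean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    se = sqrt(va / len(group_a) + vb / len(group_b))
    md = ma - mb
    return md, (md - z * se, md + z * se)

# Hypothetical knowledge scores for two small training arms
ibest = [24, 26, 22, 25, 27]
workshop = [23, 25, 21, 24, 26]
md, (lo, hi) = mean_diff_ci(ibest, workshop)
```

An interval that straddles zero, as in the knowledge and self-efficacy comparisons quoted above, is why those outcomes are described as "similar" between groups.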

Twitter Demographics

The data shown below were collected from the profiles of 8 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 102 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
United Kingdom | 1 | <1%
Unknown | 101 | 99%

Demographic breakdown

Readers by professional status | Count | As %
Student > Master | 22 | 22%
Researcher | 18 | 18%
Student > Ph.D. Student | 14 | 14%
Unspecified | 12 | 12%
Student > Bachelor | 8 | 8%
Other | 28 | 27%

Readers by discipline | Count | As %
Medicine and Dentistry | 27 | 26%
Nursing and Health Professions | 21 | 21%
Unspecified | 20 | 20%
Psychology | 11 | 11%
Social Sciences | 10 | 10%
Other | 13 | 13%

Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 19 June 2017.
  • All research outputs: #2,475,273 of 11,383,682 outputs
  • Outputs from BMC Medical Education: #411 of 1,513 outputs
  • Outputs of similar age: #67,215 of 271,896 outputs
  • Outputs of similar age from BMC Medical Education: #15 of 60 outputs
Altmetric has tracked 11,383,682 research outputs across all sources so far. Compared to these, this one has done well and is in the 78th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,513 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.8. This one has gotten more attention than average, scoring higher than 72% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 271,896 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 75% of its contemporaries.
We're also able to compare this research output to 60 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 73% of its contemporaries.
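The percentile figures quoted above follow from the rank and cohort size in a straightforward way. A small sketch of that arithmetic — my reading of the "scored higher than X%" figures, not Altmetric's own code:

```python
def percentile_from_rank(rank, total):
    """Percentile of an output ranked `rank` (1 = best) among `total` outputs.

    Altmetric may handle ties or rounding differently, which could
    explain small discrepancies (e.g. 73rd vs the naive 75th percentile
    for the rank-15-of-60 same-source cohort above).
    """
    return 100.0 * (1 - rank / total)

# Figures quoted in the rankings above
all_outputs = percentile_from_rank(2_475_273, 11_383_682)  # ~78th percentile
similar_age = percentile_from_rank(67_215, 271_896)        # ~75th percentile
```

Both values agree with the percentiles stated in the surrounding text (78th overall, 75th among outputs of similar age).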