
Using mixed methods evaluation to assess the feasibility of online clinical training in evidence based interventions: a case study of cognitive behavioural treatment for low back pain

Overview of attention for article published in BMC Medical Education, June 2016

About this Attention Score

  • Good Attention Score compared to outputs of the same age (73rd percentile)
  • Good Attention Score compared to outputs of the same age and source (68th percentile)

Mentioned by: 8 X users
Citations: 20 (Dimensions)
Readers on Mendeley: 258

Title
Using mixed methods evaluation to assess the feasibility of online clinical training in evidence based interventions: a case study of cognitive behavioural treatment for low back pain
Published in
BMC Medical Education, June 2016
DOI 10.1186/s12909-016-0683-4
Authors

Helen Richmond, Amanda M. Hall, Zara Hansen, Esther Williamson, David Davies, Sarah E. Lamb

Abstract

Cognitive behavioural (CB) approaches are effective in the management of non-specific low back pain (LBP). We developed the CB Back Skills Training programme (BeST) and previously provided evidence of clinical and cost effectiveness in a large pragmatic trial. However, practice change is challenged by a lack of treatment guidance and training for clinicians. We aimed to explore the feasibility and acceptability of an online programme (iBeST) for providing training in a CB approach. This mixed methods study comprised an individually randomised controlled trial of 35 physiotherapists and an interview study of 8 physiotherapists. Participants were recruited from 8 National Health Service departments in England and allocated by a computer-generated randomisation list to receive iBeST (n = 16) or a face-to-face workshop (n = 19). Knowledge (of a CB approach), clinical skills (unblinded assessment of CB skills in practice), self-efficacy (reported confidence in using new skills), attitudes (towards LBP management), and satisfaction were assessed after training. Engagement with iBeST was assessed with user analytics. Interviews explored acceptability and experiences with iBeST. Data sets were analysed independently and jointly interpreted. Fifteen (94 %) participants in the iBeST group and 16 (84 %) participants in the workshop group provided data immediately after training. We observed similar scores on knowledge (MD (95 % CI): 0.97 (-1.33, 3.26)) and self-efficacy to deliver the majority of the programme (MD (95 % CI): 0.25 (-1.7, 0.7)). However, the workshop group showed a greater reduction in biomedical attitudes to LBP management (MD (95 % CI): -7.43 (-10.97, -3.89)). Clinical skills were assessed in 5 (33 %) iBeST participants and 7 (38 %) workshop participants within 6 months of training and were similar between groups (MD (95 % CI): 0.17 (-0.2, 0.54)). Interviews highlighted that, while initially sceptical, participants found iBeST acceptable. A number of strategies were identified to enhance future versions of iBeST, such as including more skills practice.

Combined quantitative and qualitative data indicated that online training was an acceptable and promising method for providing training in an evidence-based complex intervention. With future enhancement, the potential reach of this training method may facilitate evidence-based practice through large-scale upskilling of the workforce. Current Controlled Trials ISRCTN82203145 (registered prospectively on 03.09.2012).

X Demographics

The data shown below were collected from the profiles of 8 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 258 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United Kingdom       1    <1%
Unknown            257   100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Master                    42    16%
Student > Ph. D. Student            32    12%
Researcher                          29    11%
Student > Bachelor                  24     9%
Student > Doctoral Student          15     6%
Other                               43    17%
Unknown                             73    28%
Readers by discipline            Count   As %
Nursing and Health Professions      50    19%
Medicine and Dentistry              44    17%
Psychology                          23     9%
Social Sciences                     14     5%
Sports and Recreations               9     3%
Other                               29    11%
Unknown                             89    34%
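The "As %" figures in these breakdowns are simply each reader count divided by the 258 total Mendeley readers, rounded to the nearest whole percent. A quick sketch reproducing the professional-status column:

```python
# Reader counts by professional status, as reported on this page.
readers = {
    "Student > Master": 42,
    "Student > Ph. D. Student": 32,
    "Researcher": 29,
    "Student > Bachelor": 24,
    "Student > Doctoral Student": 15,
    "Other": 43,
    "Unknown": 73,
}
total = 258  # total Mendeley readers for this output

for status, count in readers.items():
    # Round each share to a whole percent, matching the table's "As %" column.
    print(f"{status}: {round(count / total * 100)}%")
```

Note that rounded shares need not sum to exactly 100%.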
Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 19 June 2017.
All research outputs: #5,738,787 of 22,877,793 outputs
Outputs from BMC Medical Education: #905 of 3,337 outputs
Outputs of similar age: #94,990 of 353,574 outputs
Outputs of similar age from BMC Medical Education: #17 of 60 outputs
Altmetric has tracked 22,877,793 research outputs across all sources so far. This one has received more attention than most of these and is in the 74th percentile.
So far Altmetric has tracked 3,337 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.3. This one has gotten more attention than average, scoring higher than 72% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 353,574 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 73% of its contemporaries.
We're also able to compare this research output to 60 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 68% of its contemporaries.
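The percentiles quoted above follow from simple rank arithmetic: the share of tracked outputs ranked below this one, truncated to a whole number. This is a rough sketch under that assumption (Altmetric's exact tie-handling may differ, so some quoted figures can deviate slightly):

```python
def percentile_from_rank(rank: int, total: int) -> int:
    """Percentage of outputs ranked below this one, truncated to an integer.

    Assumption: percentile = floor((1 - rank/total) * 100); this is an
    illustration of the arithmetic, not Altmetric's documented method.
    """
    return int((1 - rank / total) * 100)

# Ranks and pool sizes quoted on this page:
print(percentile_from_rank(5_738_787, 22_877_793))  # all research outputs -> 74
print(percentile_from_rank(94_990, 353_574))        # outputs of similar age -> 73
print(percentile_from_rank(905, 3_337))             # this journal -> 72
```

These values match the 74th, 73rd, and 72nd percentiles reported above for those three comparison pools.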