
A national training program for simulation educators and technicians: evaluation strategy and outcomes

Overview of attention for article published in BMC Medical Education, January 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (91st percentile)
  • High Attention Score compared to outputs of the same age and source (94th percentile)

Mentioned by

26 X users

Citations

45 Dimensions

Readers on

140 Mendeley
Title
A national training program for simulation educators and technicians: evaluation strategy and outcomes
Published in
BMC Medical Education, January 2016
DOI 10.1186/s12909-016-0548-x
Pubmed ID
Authors

Debra Nestel, Margaret Bearman, Peter Brooks, Dylan Campher, Kirsty Freeman, Jennene Greenhill, Brian Jolly, Leanne Rogers, Cobie Rudd, Cyle Sprick, Beverley Sutton, Jennifer Harlim, Marcus Watson

Abstract

Simulation-based education (SBE) has seen a dramatic uptake in health professions education over the last decade. SBE offers learning opportunities that are difficult to access by other methods, and competent faculty is seen as key to high quality SBE. In 2011, in response to a significant national healthcare issue, the need to enhance the quality and scale of SBE, a group of Australian universities was commissioned to develop a national training program: the Australian Simulation Educator and Technician Training (AusSETT) Program. This paper reports the evaluation of this large-scale initiative.

The AusSETT Program adopted a train-the-trainer model, which offered up to three days of workshops and between four and eight hours of e-learning. The Program was offered across all professions in all states and territories. Three hundred and three participants attended workshops, with 230 also completing e-learning modules. Topics included foundational learning theory, orientation to diverse simulation modalities, briefing and debriefing. A layered, objectives-oriented evaluation strategy was adopted, drawing on multiple stakeholders (participants, external experts), multiple methods of data collection (end-of-module evaluations, workshop observer reports and individual interviews) and multiple data points (immediately after the Program and two months later). Descriptive statistics were used to analyse numerical data, while textual data (written comments and transcripts of interviews) underwent content or thematic analysis.

For each module, between 45 and 254 participants completed evaluations. The content and educational methods were rated highly, with items exceeding the pre-established standard. In written evaluations, participants identified strengths of the Program (e.g. high quality facilitation, breadth and depth of content) and areas for development (e.g. the electronic portfolio and learning management system). Interviews with participants suggested the Program had positively impacted their educational practices. Observers reported a high quality educational experience for participants, with content and methods aligned to perceived participant needs.

The AusSETT Program is a significant and enduring learning resource. The development of a national training program to support a competent simulation workforce is feasible, and the Program objectives were largely met. Although there are limitations to the study design (e.g. self-report), there are strengths such as exploring the impact two months later. The evaluation of the Program informs the next phase of the national strategy for simulation educators and technicians with respect to content and processes, strengths and areas for development.

X Demographics

The data shown below were collected from the profiles of 26 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 140 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Spain 1 <1%
Ecuador 1 <1%
Ireland 1 <1%
Germany 1 <1%
Unknown 136 97%

Demographic breakdown

Readers by professional status Count As %
Student > Ph. D. Student 17 12%
Researcher 16 11%
Student > Bachelor 13 9%
Student > Master 12 9%
Student > Doctoral Student 11 8%
Other 41 29%
Unknown 30 21%
Readers by discipline Count As %
Medicine and Dentistry 36 26%
Nursing and Health Professions 33 24%
Social Sciences 14 10%
Business, Management and Accounting 5 4%
Psychology 5 4%
Other 13 9%
Unknown 34 24%

Attention Score in Context

This research output has an Altmetric Attention Score of 18. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 25 August 2016.
All research outputs: #1,871,845 of 23,881,329 outputs
Outputs from BMC Medical Education: #243 of 3,576 outputs
Outputs of similar age: #34,045 of 400,422 outputs
Outputs of similar age from BMC Medical Education: #5 of 79 outputs
Altmetric has tracked 23,881,329 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 92nd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 3,576 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.4. This one has done particularly well, scoring higher than 93% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 400,422 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 91% of its contemporaries.
We're also able to compare this research output to 79 others from the same source that were published within six weeks on either side of this one. This one has done particularly well, scoring higher than 94% of its contemporaries.
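The percentile figures above follow directly from the rank-and-total pairs listed in the context table. The short Python sketch below shows that arithmetic under the assumption that the percentile is simply (1 - rank/total) × 100, rounded to the nearest whole number; the function name, the rounding rule and the dictionary of comparisons are illustrative rather than Altmetric's documented method.

```python
# A minimal sketch of the percentile arithmetic, assuming
# percentile ~= (1 - rank / total) * 100, rounded to the nearest integer.
# Altmetric's exact rounding rules are not documented here.

def rank_to_percentile(rank: int, total: int) -> int:
    """Convert a rank (1 = most attention) among `total` outputs to a percentile."""
    return round((1 - rank / total) * 100)

# Rank-and-total pairs taken from the context table above.
comparisons = {
    "All research outputs": (1_871_845, 23_881_329),
    "Outputs from BMC Medical Education": (243, 3_576),
    "Outputs of similar age": (34_045, 400_422),
    "Outputs of similar age from BMC Medical Education": (5, 79),
}

for label, (rank, total) in comparisons.items():
    pct = rank_to_percentile(rank, total)
    print(f"{label}: #{rank:,} of {total:,} -> ~{pct}th percentile")
```

Run as written, this reproduces the percentiles quoted above (92, 93, 91 and 94 respectively), which is consistent with the simple rank-based formula assumed here.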