
“Thinking on your feet”—a qualitative study of debriefing practice

Overview of attention for article published in Advances in Simulation, April 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#47 of 280)
  • High Attention Score compared to outputs of the same age (92nd percentile)
  • High Attention Score compared to outputs of the same age and source (85th percentile)

Mentioned by

1 blog
34 X users

Citations

50 Dimensions

Readers on

151 Mendeley
Title
“Thinking on your feet”—a qualitative study of debriefing practice
Published in
Advances in Simulation, April 2016
DOI 10.1186/s41077-016-0011-4
Pubmed ID
Authors

Kristian Krogh, Margaret Bearman, Debra Nestel

Abstract

Debriefing is a significant component of simulation-based education (SBE). Regardless of how and where immersive simulation is used to support learning, debriefing plays a critical role in optimising learning outcomes. Although the literature describes different debriefing methods and approaches that constitute effective debriefing, there are discrepancies as to what is actually practised and how experts or experienced debriefers perceive and approach debriefing. This study sought to explore the self-reported practices of expert debriefers. We used a qualitative approach to explore experts' debriefing practices. Peer-nominated expert debriefers who use immersive manikin-based simulations were identified in the healthcare simulation community across Australia. Twenty-four expert debriefers were purposively sampled to participate in semi-structured telephone interviews lasting 45-90 min. Interviews were transcribed and independently analysed using inductive thematic analysis. Codes emerging through the data analysis clustered into four major categories: (1) Values: ideas and beliefs representing the fundamental principles that underpinned interviewees' debriefing practices. (2) Artistry: debriefing practices which are dynamic and creative. (3) Techniques: the specific methods used by interviewees to promote a productive and safe learning environment. (4) Development: changes in interviewees' debriefing practices over time. The "practice development triangle", inspired by the work of Handal and Lauvas, offers a framework for our themes. A feature of the triangle is that the values of expert debriefers provide a foundation for the associated artistry and techniques. This framework may provide a different emphasis for courses and programmes designed to support debriefing practices, where microskill development is often privileged, especially those microskills associated with techniques (plan of action, creating a safe environment, managing learning objectives, promoting learner reflection and co-debriefing). Across the levels in the practice development triangle, the importance of continuing professional development is acknowledged. Strengths and limitations of the study are noted.

X Demographics

The data shown below were collected from the profiles of 34 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 151 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Spain 1 <1%
United States 1 <1%
Ireland 1 <1%
Unknown 148 98%

Demographic breakdown

Readers by professional status Count As %
Student > Master 22 15%
Other 16 11%
Researcher 14 9%
Student > Postgraduate 13 9%
Student > Ph. D. Student 13 9%
Other 40 26%
Unknown 33 22%
Readers by discipline Count As %
Medicine and Dentistry 62 41%
Nursing and Health Professions 26 17%
Social Sciences 11 7%
Psychology 4 3%
Business, Management and Accounting 2 1%
Other 9 6%
Unknown 37 25%
Attention Score in Context

This research output has an Altmetric Attention Score of 30. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 03 August 2018.
All research outputs
#1,343,002
of 25,701,027 outputs
Outputs from Advances in Simulation
#47
of 280 outputs
Outputs of similar age
#22,544
of 315,821 outputs
Outputs of similar age from Advances in Simulation
#1
of 7 outputs
Altmetric has tracked 25,701,027 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 94th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 280 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 17.5. This one has done well, scoring higher than 83% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 315,821 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 92% of its contemporaries.
We're also able to compare this research output to 7 others from the same source and published within six weeks on either side of this one. This one has scored higher than all of them.
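Altmetric's exact scoring and ranking code is not shown on this page, but the in-context comparisons described above amount to a percentile rank computed within a publication window of six weeks on either side. The sketch below is a minimal illustration of that idea only, assuming a simple list of (score, publication date) pairs; the function name and the example data are hypothetical and are not Altmetric's implementation.

```python
from datetime import date, timedelta

def percentile_in_context(target_score, target_date, outputs, window_days=42):
    """Percentile of target_score among outputs published within
    window_days (six weeks) on either side of target_date.

    outputs: iterable of (attention_score, publication_date) pairs.
    Returns None if no contemporaries fall inside the window.
    """
    lo = target_date - timedelta(days=window_days)
    hi = target_date + timedelta(days=window_days)
    # Keep only outputs published inside the comparison window.
    peers = [score for score, pub_date in outputs if lo <= pub_date <= hi]
    if not peers:
        return None
    # Percentile = share of contemporaries with a strictly lower score.
    beaten = sum(1 for score in peers if score < target_score)
    return 100.0 * beaten / len(peers)

# Hypothetical example: a score of 30 compared against three contemporaries.
contemporaries = [
    (5, date(2016, 4, 10)),
    (12, date(2016, 4, 20)),
    (48, date(2016, 5, 1)),
]
print(percentile_in_context(30, date(2016, 4, 15), contemporaries))  # ~66.7
```

In this toy example the output beats two of its three contemporaries, so it lands at roughly the 67th percentile; the page above reports the same kind of figure (92nd percentile) against 315,821 real contemporaries.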