Health sciences educators' simulation debriefing practice needs: A mixed methods study
Journal of Education and Health Promotion, ISSN: 2319-6440, Vol: 12, Issue: 1, Page: 55
2023
- Citations: 1
- Captures: 23
Metrics Details
- Citations: 1
- Citation Indexes: 1
- Captures: 23
- Readers: 23
Article Description
BACKGROUND: Simulation debriefing influences learning from healthcare simulation activities. Health sciences educators must be competent in conducting simulation debriefing for healthcare students. A structured faculty development intervention for health sciences educators must be informed by educator needs to enhance its utility. This paper describes the needs of health sciences educators regarding simulation debriefing at a faculty of health sciences.

MATERIALS AND METHODS: A convergent parallel mixed methods study design was applied to a selected population of 30 health sciences educators at the University (x) who integrate immersive simulation into their undergraduate programs for first- to final-year students. The Objective Structured Assessment of Debriefing (OSAD) tool underpinned the observations that informed the quantitative strand of the study, while semi-structured interviews were conducted as part of the qualitative strand. Descriptive statistics and thematic analysis were used to analyze the data.

RESULTS: Health sciences educators struggled to establish the learning environment for simulation (median 1), facilitate learning (median 3), and evaluate their debriefing activities. They were, however, able to apply an appropriate approach toward simulation (median 4). They identified the need to be educated on the fundamentals of simulation-based education.

CONCLUSION: A continuing professional development program should be developed, aimed at transforming approaches toward facilitating learning, explaining the fundamentals of simulation-based education, modeling best practices related to debriefing, and applying appropriate strategies for evaluating debriefing activities.
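The quantitative strand reports median OSAD ratings per debriefing category. As a minimal illustration only (the category labels and scores below are assumed for the example, not the study's data), such per-category medians could be computed as follows:

# Minimal sketch, not the study's analysis: summarizing hypothetical
# OSAD observation ratings by category using medians.
from statistics import median

# Hypothetical Likert-type ratings per OSAD category; values are illustrative only.
osad_ratings = {
    "Establishing the learning environment": [1, 1, 2, 1],
    "Facilitating learning": [3, 3, 2, 4],
    "Approach toward simulation": [4, 4, 5, 4],
}

for category, scores in osad_ratings.items():
    print(f"{category}: median = {median(scores)}")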