Clinical Practice Supports

Simulation Learning

Evidence-informed information and resources for simulation-based education

Learner Evaluation

Simulation-based education (SBE) may be used as a form of evaluation. The purpose of the evaluation should be made explicit, with clearly defined objectives. SBE may be used for:

  • Formative evaluation
  • Summative evaluation
  • High-stakes testing

(INACSL Standards Committee et al., 2021) 

Formative evaluation focuses on progress toward a goal, such as improving performance and behaviours associated with the cognitive, affective, and psychomotor domains of learning. The purpose is to provide constructive feedback to an individual or group to improve performance. 

Summative evaluation is intended to measure outcomes or achievement of objectives, such as competency assessment. This usually occurs at a set point, such as at the end of a course or program. 

High-stakes evaluation is associated with substantial academic, educational or employment consequences, such as a simulation scenario with pass/fail implications or a decision regarding certification or licensure.  
(Lioce et al., 2020) 

The Healthcare Simulation Standards of Best Practice™ (HSSOBP™) (2021) state that the method of evaluation should be guided by the objectives, outcomes, and level of the learners. If the SBE is being used as a high-stakes evaluation, educators may choose to include video recording to reduce recall bias in debriefing and to support decision-making.

In addition to psychological safety, the validity and reliability of the summative or high-stakes evaluation tools should be prioritized (Diaz-Navarro et al., 2023).


The completed standard for evaluation of learning and performance is available from INACSL (INACSL Standards Committee et al., 2021).

Faculty and Program Evaluation

It is important to have methods of evaluating the SBE itself, both to improve its delivery and to justify continued investment in programming. This may involve collecting feedback from participants, tracking program statistics, or using evaluation tools to assess pertinent outcomes. Evaluation and research should be considered during the planning stage (Diaz-Navarro et al., 2023). Recommended practices include:

  • Completing program evaluations at regular intervals.
  • Seeking feedback from all those involved in the programming, including learners, simulationists, and other stakeholders.
  • Employing strategies to maximize evaluation completion rates, such as:
    • Building time into the SBE for evaluation completion.
    • Sending reminders to encourage completion.
    • Following up with communication that summarizes the feedback and outlines how it will be incorporated into future programming.

As highlighted in the Professional Development section, continued professional development is key to ensuring safe and effective SBE. The Debriefing Assessment for Simulation in Healthcare (DASH) is one tool that supports professional development of debriefing skills. Using DASH, facilitators (instructors) can self-reflect on their debriefing abilities and invite feedback from participants (learners).

Simulation-based Education and Clinical Outcomes

As previously stated, the overarching goal of SBE is to improve patient care and safety. Although additional research is required on the effect of SBE on clinical outcomes, a systematic review published by the Society for Simulation in Healthcare suggests that in situ simulation may improve patient morbidity and mortality outcomes (Calhoun et al., 2024). The concept has since evolved from defining in situ simulation by its location to the broader idea of translational simulation, which is specifically designed to identify and address processes and safety concerns (Brazil, 2017).

Translational simulation interventions can be used to assess health service performance in a diagnostic or interventional manner. For example: 

  • Diagnostic SBE could be designed to identify latent safety threats.
  • Interventional SBE is used to measure outcomes and test solutions to problems. 

(Nickson et al., 2021)
 

Fig. 1: Translational Simulation by Victoria Brazil is licensed under CC BY 4.0 

References

Brazil, V. (2017). Translational simulation: not ‘where?’ but ‘why?’ A functional view of in situ simulation. Advances in Simulation, 2(20). https://doi.org/10.1186/s41077-017-0052-3

Calhoun, A., Cook, D., Genova, G., Motamedi, S., Waseem, M., Carey, R., Hanson, A., Chan, J., Camacho, C., Harwayne-Gidansky, I., Walsh, B., White, M., Geis, G., Monachino, A., Maa, T., Posner, G., Li, D., & Lin, Y. (2024). Educational and patient care impacts of in situ simulation in healthcare: A systematic review. Simulation in Healthcare, 19(1), S23-S31. https://doi.org/10.1097/SIH.0000000000000773

Diaz-Navarro, C., Laws-Chapman, C., Moneypenny, M., & Purva, M. (2023, November). The ASPiH standards-2023: Guiding simulation-based practice in health and care. Association for Simulated Practice in Healthcare. https://aspih.org.uk/wp-content/uploads/2023/11/ASPiH-Standards-2023-CDN-Final.pdf

Ganley, B. J., & Linnard-Palmer, L. (2012). Academic safety during nursing simulation: Perceptions of nursing students and faculty. Clinical Simulation in Nursing, 8(2), e49-e57. http://dx.doi.org/10.1016/j.ecns.2010.06.004

INACSL Standards Committee, McMahon, E., Jimenez, F.A., Lawrence, K. & Victor, J. (2021, September). Healthcare Simulation Standards of Best Practice™ evaluation of learning and performance. Clinical Simulation in Nursing, 58, 54-56. https://doi.org/10.1016/j.ecns.2021.08.016

Lioce, L. (Ed.), Lopreiato, J. (Founding Ed.), Downing, D., Chang, T. P., Robertson, J. M., Anderson, M., Diaz, D. A., & Spain, A. E. (Assoc. Eds.), and the Terminology and Concepts Working Group. (2020). Healthcare simulation dictionary (2nd ed.). Agency for Healthcare Research and Quality. https://doi.org/10.23970/simulationv2

Nickson, C.P., Petrosoniak, A., Barwick, S., & Brazil, V. (2021). Translational simulation: From description to action. Advances in Simulation, 6(6). https://doi.org/10.1186/s41077-021-00160-6