The impact of simulation on the development of critical thinking and reflection among nursing and medical students: a systematic review

Article information

Korean J Med Educ. 2025;37(2):187-202
Publication date (electronic) : 2025 May 29
doi: https://doi.org/10.3946/kjme.2025.334
Research and Innovation Laboratory in Health Science, Faculty of Medicine and Pharmacy, Ibn Zohr University, Agadir, Morocco
Corresponding Author: Sana Loubbairi (https://orcid.org/0009-0002-7302-5009) Research and Innovation Laboratory in Health Science, Faculty of Medicine and Pharmacy, Ibn Zohr University, BP 7519, Quartier Tilila CP80060 Agadir, Morocco Tel: +212.528227170 Fax: +212.528217171 email: sana.loubbairi@edu.uiz.ac.ma
Received 2025 January 18; Revised 2025 March 4; Accepted 2025 March 26.

Abstract

Simulation is an educational approach that promotes the mastery of technical skills while advancing the development of non-technical competencies, both of which are widely acknowledged as essential in clinical practice. This review aimed to synthesize findings on the impact of simulation in enhancing critical thinking and reflection among nursing and medical students. Following the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), a systematic review was conducted by searching the following databases: PubMed, Science Direct, and Scopus. The quality of the included studies was assessed using Joanna Briggs Institute critical appraisal tools. The protocol was previously registered in the PROSPERO registry (CRD42022371971). From 1,323 studies identified in primary research, 16 were included in this review, involving a total of 1,283 students. Of the 16 studies, seven investigated the impact of simulation on critical thinking and reported a positive effect compared to traditional teaching methods. For student reflection, only one study addressed this theme and reported a positive effect on nursing students. This review demonstrated that simulation has a positive impact on critical thinking; however, its impact on reflection remains inconclusive. Further research is essential to explore its effects across diverse populations, including those in developing countries, to maximize its educational potential in health professions education.

Introduction

Simulation can be presented with different levels of realism or “fidelity”: high, medium, and low fidelity [1]. High-fidelity simulation (HFS) involves the use of computer-controlled mannequins, virtual reality systems, or advanced simulators [2]. These simulations provide an immersive experience that closely mimics real conditions and allow participants to experience situations similar to those they would encounter in practice [3]. Medium-fidelity simulators operate on comparatively simpler computer systems, capable of displaying certain physiological responses; however, they may lack specific details essential for an accurate simulated scenario [1]. In contrast, low-fidelity simulations rely on less advanced equipment and therefore provide a less detailed representation of reality [4,5]. Traditional teaching, on the other hand, is an educational strategy centered on methods such as lectures, case studies, and videotapes; the training also includes a clinical practicum, clinical placement, or involvement in a hospital setting [6]. Although these approaches can improve clinical performance [7], they limit the engagement and active participation of health science students in clinical education [8]. Therefore, combining these methods with simulation may be a promising way to improve skills [7]. Moreover, simulation is seen as an ideal substitute for traditional clinical practice [9].

Simulation is increasingly recognized as an educational method that not only enhances technical skills but also develops non-technical ones, which are now widely regarded as essential in clinical practice [7]. These non-technical skills, including cognitive, social, and personal skills, complement technical skills and contribute to the safe and effective execution of tasks [10]. Among them, critical thinking and reflection on practice are two essential competencies in the healthcare training curriculum [11]. Critical thinking guides caregiving actions using a solid foundation of theoretical knowledge [12]. Although critical thinking has several definitions, the most common one is to question whether the received information is factual, reliable, evidence-based, and unbiased. In simpler terms, it means thinking about what to believe or do [13]. The objective of all these definitions remains the same: making a reasoned judgment to make the best possible decision [14]. Critical thinking has demonstrated a highly positive association with problem-solving skills in nursing students along with improved clinical judgment, which in turn boosts students’ confidence in clinical settings [15]. Therefore, critical thinking can promote safe care, competence, and high-quality patient care. It can also improve the level of professionalism in healthcare [16].

Reflection, another crucial non-technical skill, promotes inquiry, autonomy, emotional acknowledgment, self-awareness, and understanding, thereby enhancing professional practice in nursing education [12]. Reflection is both a cognitive and metacognitive process aimed at deeply understanding situations [17]. It is defined as active learning by examining practice in the light of knowledge, action, and feedback [16]. There are three types of reflection: content, process, and premise, each leading to action and change. Content reflection focuses on learning from rethinking what was done (actions). Process reflection, on the other hand, looks at the origins of an action and the factors associated with it, enabling new meaning to be gained and understanding to be transformed. Finally, premise reflection involves learning through meaning transformation (action), changing the overall perspective of a situation [15]. Reflection is essential in medical education and professional practice [18]. It serves as a vital component of experiential learning activities, including simulation-based education [19]. Additionally, it can improve quality and cultivate a positive practice environment [20,21]. A significant positive relationship between critical thinking and reflection has been observed in several studies [15,16]. Reflection enables students to integrate learning into their personal context, fostering the development of deeper knowledge and contributing to the improvement of their critical thinking skills [22]. Indeed, students’ reflection on the actions undertaken and the analysis of situations play a crucial role in developing their critical thinking skills [15]. Both critical thinking and reflection are essential because understanding the reasons behind one’s actions through self-reflection and critique enhances practice, leading to deeper learning [23] and improving the quality of care [24].

In recent years, the literature has highlighted a growing number of reviews examining the influence of simulation on the enhancement of non-technical skills, such as reasoning, clinical judgment, and critical thinking [6,25,26]. Most reviews evaluating the impact of simulation on critical thinking have reported mixed results, emphasizing the need for further research using robust methodologies to assess its effectiveness [2,26]. In this context, the present systematic review aims to synthesize findings on the impact of simulation in fostering critical thinking and reflection among nursing and medical students, while comparing its effectiveness with traditional teaching methods.

Methods

1. Registration and protocol

This systematic review adhered to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The protocol was registered in the PROSPERO registry (ID: CRD42022371971).

2. Eligibility criteria

The inclusion and exclusion criteria for this review were carefully formulated based on the Population, Intervention, Comparison, Outcome, and Study Design (PICOS) framework (Table 1).

Inclusion and Exclusion Criteria according to the PICOS Framework

3. Information sources

The literature search was conducted in the following databases: PubMed, ScienceDirect, and Scopus. Only studies published in English from the inception of the databases to January 2024 were included. The final search check was completed on January 20, 2024.

4. Search strategy

The database searches were independently conducted by two authors (S.L. and L.L.) using a research strategy developed based on the inclusion criteria. This strategy incorporated relevant keywords and Medical Subject Headings (MeSH) terms, including: “simulation,” “simulation training” (as a MeSH term), “critical thinking,” “reflection,” “medical students,” and “nursing students.” The search terms are as follows:

1) Scopus

(TITLE-ABS (“simulation” OR “simulation training”)) AND (TITLE-ABS (“medical students” OR “nursing students”)) AND (TITLE-ABS (“critical thinking” OR “reflection”))

2) PubMed

(((simulation[Title/Abstract]) OR (“simulation training”[Title/Abstract])) AND ((“medical students”[Title/Abstract]) OR (“nursing students”[Title/Abstract]))) AND ((“critical thinking”[Title/Abstract]) OR (“reflection”[Title/Abstract]))

3) ScienceDirect

(“simulation” OR “simulation training”) AND (“medical students” OR “nursing students”) AND (“critical thinking” OR “reflection”)
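For illustration only, the three database-specific strings above can be generated from a single set of concept groups (intervention, population, and outcome terms). The short Python sketch below is a hypothetical helper, not part of the registered protocol; the function and variable names are assumptions introduced here, and its output only approximates the exact bracketing shown above.

CONCEPTS = [
    ["simulation", "simulation training"],        # intervention terms
    ["medical students", "nursing students"],     # population terms
    ["critical thinking", "reflection"],          # outcome terms
]

def join_group(group, suffix=""):
    # OR-join the quoted terms of one concept group, with an optional field tag
    return " OR ".join(f'"{term}"{suffix}' for term in group)

def scopus_query(concepts):
    # Scopus: each concept group restricted to title/abstract with TITLE-ABS
    return " AND ".join(f"(TITLE-ABS ({join_group(g)}))" for g in concepts)

def pubmed_query(concepts):
    # PubMed: each term tagged with the [Title/Abstract] field
    return " AND ".join(f"({join_group(g, '[Title/Abstract]')})" for g in concepts)

def sciencedirect_query(concepts):
    # ScienceDirect: plain quoted phrases joined with OR within groups, AND between groups
    return " AND ".join(f"({join_group(g)})" for g in concepts)

print(scopus_query(CONCEPTS))
print(pubmed_query(CONCEPTS))
print(sciencedirect_query(CONCEPTS))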

5. Selection and data collection process

Two authors (S.L. and L.L.) independently conducted the selection process according to the inclusion and exclusion criteria. Titles, abstracts, full texts, and keywords were reviewed, and duplicates were removed electronically using EndNote X9 (Clarivate, Philadelphia, USA). Screening and selection were then performed using Rayyan QCRI (Rayyan, Cambridge, USA) [27]. Articles considered potentially eligible were downloaded for detailed analysis, and any disagreements were resolved by a third researcher (H.N.).

6. Data items

Data were extracted and included details such as the author, year of publication, study location, study design, sample size, education level, subject area, measurement tools or scales, type and frequency of simulation sessions, duration of each session, methods employed in the intervention and control groups, and key findings.

7. Synthesis methods

The data were classified and analyzed to meet the objective of the review. Two authors (S.L. and L.L.) conducted data extraction and clustering, with each study undergoing two independent reviews to ensure accurate variable categorization. The extracted data were then analyzed using narrative synthesis, beginning with the development of a theoretical framework aligned with the objective of the review. This framework aimed to assess the impact of simulation on the development of critical thinking and reflection in nursing and medical students, and to compare its effectiveness with traditional teaching methods. The key findings of the studies were systematically organized in tabular form for further analysis.

8. Study risk of bias assessment

The assessment of the risk of bias was conducted independently by two authors (S.L., A.A.) using the Joanna Briggs Institute’s Checklist (2023) for randomized controlled trials (RCTs) and quasi-experimental studies [28,29]. The evaluation of RCTs was conducted using a 13-item grid, with each item scored as either “1” (yes) or “0” (no or unclear). The total score for each study was determined by summing the number of “1s.” Studies scoring 10 or higher were classified as Grade A, indicating high quality. Grade B included scores between 7 and 9, reflecting moderate quality, while Grade C encompassed scores of 6 or lower. For quasi-experimental studies, a nine-item evaluation grid was used, following the same scoring method. Studies with a total score of 7 or higher were classified as Grade A (high quality). Grade B corresponded to scores between 5 and 6, representing moderate quality, and Grade C included scores of 4 or lower.
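As a worked illustration of the scoring rule described above, the following Python sketch maps a list of item ratings to a total score and quality grade; the function name and structure are hypothetical and are introduced here only to make the grading thresholds explicit.

def jbi_grade(item_ratings, design):
    # item_ratings: list of 1 ("yes") or 0 ("no"/"unclear") per checklist item
    # design: "rct" (13-item checklist) or "quasi" (9-item checklist)
    expected_items = {"rct": 13, "quasi": 9}[design]
    if len(item_ratings) != expected_items:
        raise ValueError(f"{design} checklist expects {expected_items} items")
    total = sum(item_ratings)
    if design == "rct":
        grade = "A" if total >= 10 else ("B" if total >= 7 else "C")
    else:
        grade = "A" if total >= 7 else ("B" if total >= 5 else "C")
    return total, grade

# Example: the item ratings reported for D'Souza et al. (2020) in Table 3
print(jbi_grade([1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1], "rct"))  # -> (12, 'A')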

Results

1. Study selection

The search strategy identified 1,323 articles, of which 972 were duplicates and subsequently removed. Screening of the titles and abstracts of the remaining 351 articles led to the exclusion of 330 studies. As a result, 21 studies were retained for full-text analysis, and 16 were ultimately included in this review. The study selection process is illustrated in Fig. 1.

Fig. 1.

Study Selection Process according to the PRISMA Diagram

PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
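The selection counts reported above can be reproduced with simple arithmetic; the snippet below is only a worked check of the flow shown in Fig. 1 (the number of full-text exclusions is derived here rather than reported separately in the text).

identified = 1323                      # records retrieved from the three databases
after_duplicates = identified - 972    # 351 records screened on title/abstract
full_text = after_duplicates - 330     # 21 articles retained for full-text analysis
included = 16                          # studies included in the review
excluded_full_text = full_text - included   # 5 full-text articles excluded

assert (after_duplicates, full_text, excluded_full_text) == (351, 21, 5)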

2. Study characteristics

Sixteen studies were included in this review [30-45]. Four studies were conducted in South Korea [30-33], four in the United States [34-37], two in Turkey [38,39], and one each in Saudi Arabia [40], Jordan [41], Australia [42], Oman [43], China [44], and Taiwan [45]. The total number of participants was 1,283, with 670 in experimental groups and 613 in control groups. Most studies focused on nursing students across various grades and programs, while only one study included medical students. Nine studies were RCTs, and seven employed a quasi-experimental design. The key characteristics of these 16 studies are summarized in Table 2.

Summary of the Characteristics of the Included Studies

3. Risk of bias in studies

Among the RCTs, only one was evaluated as high quality [43], while the remaining eight were rated as moderate quality [34,35,37-41,45]. For the quasi-experimental studies, five were assessed as high quality [30-33,36], one as moderate quality [44], and one as low quality [42]. Tables 3 and 4 provide a detailed assessment of the quality of the eligible studies.

Critical Appraisal for Randomized Control Trials Studies

Critical Appraisal for Quasi-Experimental Studies

4. Results of syntheses

1) Critical thinking

The analysis showed that 15 of the 16 studies included in this review examined the impact of simulation on critical thinking. Several types of simulation were identified, with HFS being the most commonly used, appearing in eight studies [30,34,36-40,43]. Electronic simulation was used in four studies [31,35,41,42], while standardized patients were featured in three studies [33,38,44]. Additionally, one study employed a hybrid simulation that combined role-playing with a human body model [32]. The traditional teaching control groups employed various methods, with lectures and case studies being the most frequently used, appearing in eight studies [31,33-36,40-42]. Other traditional approaches included practical learning experiences such as practicums, knowledge-based demonstrations, and a clinical day in an acute medical unit, as noted in three studies [30,32,43]. Additionally, four studies mentioned the use of traditional methods without specifying their types [37-39,44].

The frequency of simulation sessions varied, ranging from one to 14 sessions. Most of the included studies featured a single simulation session, as noted in seven studies [30-32,36,38,40,42]. Only one study reported a high frequency of 14 simulation sessions [43], followed by one with six sessions [41] and another with four sessions [33]. One study included three simulation sessions [34], while the number of sessions was unspecified in several other studies [35,39,44].

The duration of simulation sessions ranged from 15±5 to 150 minutes. However, most studies did not specify the time allocated to each phase (pre-briefing, simulation training, and debriefing), except for three studies [32,34,43]. The longest recorded duration was 150 minutes, noted in two studies [31,43], while another study reported a session lasting 135±5 minutes [32]. One article indicated a duration of 120 minutes [33], another described a session of 80 minutes [40], and two studies reported a duration of 60 minutes [37,41]. Additionally, one study noted sessions lasting 35 to 50 minutes [34], and another reported a 45-minute simulation session [39]. Finally, the shortest duration was 15±5 minutes [30], which referred solely to the simulation phase. The duration of the simulation sessions was not specified in four studies [35,38,42,44].

Regarding the measurement tools, a variety of instruments, comprising nine tools and scales, were used to assess critical thinking. These included a multiple-choice questionnaire specially developed by the authors [40]. The Yoon Critical Thinking Disposition Scale, a standardized tool for nursing students in Korea consisting of 27 items scored on a Likert scale, was used in four studies [30-33]. The California Critical Thinking Disposition Inventory, a multiple-choice questionnaire consisting of 27 items scored on a Likert scale, was used in four studies [37-39,43]. Two of these four studies also included the California Critical Thinking Skill Test, which contains 34 items [37,43]. Moreover, two studies used the Health Sciences Reasoning Test, an assessment designed for health sciences students that consists of 33 multiple-choice items [34,35]. Other instruments were also employed, such as the Critical Thinking Self-Assessment Scale, which includes 115 multiple-choice questions [41], and the Critical Thinking Trait Scale, which includes 70 Likert-scale items [44]. Furthermore, critical thinking was assessed using Scheffer and Rubenfeld’s critical thinking criteria, which comprise 10 habits of the mind and seven skills; improvement in critical thinking was evaluated using three of these criteria: one habit of the mind and two cognitive skills [42]. Another study used the Assessment Technologies Institute test, consisting of 60 questions related to critical thinking, with a score range of 0 to 100 [36]. The instruments used in 14 studies were reported to be valid and reliable; however, one study did not provide any information regarding validity and reliability [42].

Most studies revealed a positive impact of simulation on students’ critical thinking. Nine studies showed that the experimental groups using simulation had higher scores than the control groups following traditional teaching methods [30-33,39,41-44]. In two studies, critical thinking improved in both the experimental and control groups [36,40]. Among the 16 studies included in this review, four reported non-significant results regarding the impact of simulation on critical thinking [34,35,38,39]. Only one study, conducted with medical students, demonstrated a positive impact on their critical thinking [44].

2) Student reflection

The analysis revealed that only one of the 16 studies examined the impact of simulation on reflection [45]. This study employed standardized patients with role-playing as the type of simulation. The control group received traditional teaching methods, including standard courses and case study discussions. The number of simulation sessions was not specified; however, the durations of the scenario and debriefing were reported separately: the scenario lasted 10 to 15 minutes, while the debriefing took 30 minutes. The total duration of the simulation session was 65 minutes. The Simulation-Based Learning Evaluation Scale (SBLES), consisting of 37 items across five subscales, was used to assess reflection. The SBLES demonstrated good reliability and validity [45]. The study found a positive effect of simulation on the reflection of the experimental group compared to the control group.

Discussion

This review aimed to consolidate findings on the impact of simulation in fostering critical thinking and reflection among nursing and medical students. Notably, nine studies highlighted that simulation, regardless of its type, significantly enhanced students’ critical thinking skills compared to traditional teaching methods. This finding is supported by a recent systematic review, which demonstrated that HFS can improve nursing students’ skills more effectively than traditional teaching methods [46]. This effect could be explained by the immersive environment provided by HFS, which realistically replicates clinical situations. It also allows for repeated scenarios, enabling participants to identify and correct errors, while the debriefing process fosters deep reflection [47]. However, another author reported contrasting findings, noting that variations in the fidelity of patient simulators had no impact on students’ learning outcomes [48].

The choice to focus on critical thinking and reflection in this review stems from their pivotal role in healthcare education and clinical practice. Critical thinking equips nursing and medical students with the ability to assess complex clinical scenarios, make well-informed decisions, and ensure patient safety. Meanwhile, reflection promotes deeper learning by encouraging students to evaluate their own reasoning, recognize areas for growth, and cultivate a more flexible and responsive approach to patient care. Although other non-technical skills like communication, leadership, and teamwork are essential, this review prioritizes critical thinking and reflection due to their direct influence on clinical reasoning and lifelong learning. Additionally, their prominence in the literature on simulation-based learning highlights their importance in bridging the gap between theoretical knowledge and real-world application.

Furthermore, two studies demonstrated an improvement in critical thinking within both the experimental and control groups simultaneously, suggesting that critical thinking could be enhanced either through traditional teaching methods or through simulation [36,40]. Similar findings were reported in another study which found no differences between traditional teaching methods and simulation in the development of critical thinking [47]. The results varied based on the frequency of exposure to simulation. Specifically, five studies reported positive effects after a single simulation session, while another study found no significant effect despite exposure to five simulation sessions [37]. This suggests that the frequency of simulation sessions did not consistently influence the positive impact on critical thinking. In contrast, two studies concluded that the development of students’ skills is associated with repeated simulation practice over an extended period [46,49].

The total duration of the simulation sessions in this review varied considerably, ranging from 15 to 150 minutes. This observation is consistent with another study, which also found variation in the duration of simulation sessions [26]. Differences in duration may greatly affect the effectiveness of simulation-based learning, with the ideal total length for skill improvement being at least 55 minutes: a pre-briefing of at least 10 minutes, 15 minutes of simulation training, and more than 30 minutes of debriefing [43]. Furthermore, among the four studies in this review that showed non-significant results regarding critical thinking, two did not specify the duration of the simulation session. The lack of attention to simulation duration may therefore partly explain the non-significant results observed in these studies. This variability in results has also been observed in several studies, underscoring the need for further research to determine more accurately the real impact of simulation session frequency and duration on non-technical skills [26,50].

The results of our review showed that, despite the use of various measurement tools and scales, most studies reported significant improvements in critical thinking. This finding is supported by a previous study, which demonstrated a notable enhancement in critical thinking skills among students who participated in HFS, despite the use of different measurement tools [47]. However, other systematic reviews suggested that the inconsistent results may have been influenced by the use of different critical thinking measurement instruments [6,26]. This finding emphasizes the need for additional studies that employ standardized measurement scales.

Regarding reflection, this review demonstrated a positive effect of simulation on nursing students’ reflection compared to traditional teaching methods. However, this result is based on a single study focused on the attitude of reflection [45], which limits the generalizability of these findings. This is further supported by a study that highlighted the need for more research measuring students’ reflection both before and after simulation [51].

This review also addressed issues common to the two themes, such as cultural and educational aspects, the long-term effects of simulation, the distribution of the populations studied, and the methodological quality of the studies. Regarding the cultural and educational aspects, the included studies were conducted in nine different countries, reflecting different cultural and educational contexts. This diversity can affect pedagogical methods, whether simulation-based or traditional, as well as their effect on critical thinking and reflection. A recent study supports this by showing that cultural differences and variations in teaching methodologies and assessment strategies may influence the effectiveness of educational methods [52]. Therefore, it is crucial for future research to explore in depth how different educational systems and cultural contexts affect the development of critical thinking and reflection.

Many studies have found that the short-term effects of simulation positively influence the development of critical thinking and reflection [11,33]. This aligns with the results of this review, in which all the included studies explored only the short-term effects of simulation and none examined the long-term effects. In contrast, a few studies have reported long-term benefits of simulation on critical thinking [53] and reflection [54]. Despite these results, research on the lasting effects of simulation remains scarce, highlighting the need for further studies to explore these long-term effects [55].

Concerning the distribution of the population studied, this systematic review revealed a striking disparity. Of the 16 included studies, 15 involved nursing students, while only one involved medical students [44]. This finding broadly supports the work of other systematic reviews in this area, showing a strong focus on nursing students [26,47,56]. Similarly, these results confirm the findings of a recent study, which highlighted significant gaps in the literature concerning studies involving medical students [57]. In another study, the main barriers to medical students’ involvement in research were insufficient funding and limited awareness of available opportunities [58]. Therefore, further research focusing specifically on these students is needed to contribute to the development of more effective educational strategies [57].

In this review, only one RCT was considered of high quality, while the other eight had several methodological weaknesses, such as a lack of true randomization, a lack of allocation concealment, and a lack of adequate blinding. For the quasi-experimental studies, the weaknesses concerned the similarity of participants across comparison groups, the treatment or care received outside the intervention in question, and the way outcomes were measured. Nevertheless, the quasi-experimental studies scored higher overall than the RCTs. These methodological limitations may affect the overall conclusions of this review. We recommend that future research focus on strengthening these aspects to improve the validity of results.

This review provides valuable insights but has several limitations. The use of varied instruments and heterogeneous control groups complicates comparisons and hinders a meta-analysis. Control conditions ranged from lectures to clinical practice, making it difficult to isolate the true impact of simulation-based education. Differences in instructional methods, exposure time, and learner engagement further limit generalizability.

Most studies focused on nursing students, highlighting the need to investigate medical students. Additionally, research on simulation’s role in fostering student reflection remains scarce. Methodological weaknesses were evident, with only one of nine RCTs rated as high quality. Issues such as inadequate randomization, poor allocation concealment, and lack of blinding may have introduced biases.

To strengthen the evidence base, future research should ensure rigorous study designs, adequate sample sizes, standardized measurement tools, and transparent reporting. Standardizing control conditions will also enhance comparability and clarify the benefits of simulation-based learning.

Conclusion

The findings of this review demonstrate that simulation, in its various forms such as high-fidelity and virtual simulation, positively influences the development of critical thinking skills among nursing students compared to traditional teaching methods. Additionally, the single included study involving medical students suggests that simulation can also support the development of their critical thinking. These results highlight the potential of simulation as an effective educational approach for fostering critical thinking skills in health sciences education. Future research could build on these promising outcomes to further explore its broader applications and long-term impact.

Notes

Acknowledgements

None.

Funding

No financial support was received for this study.

Conflicts of interest

No potential conflict of interest relevant to this article was reported.

Author contributions

Conceptualization: SL, LL, HN; data collection: SL, LL, AA; data analysis and interpretation: SL, LL; drafting the article: SL, LL, AA, HN; critical revision of the article: SL, HN; and final approval of the version to be published: all authors.

References

1. Al-Wassia H, Bamehriz M, Atta G, Saltah H, Arab A, Boker A. Effect of training using high-versus low-fidelity simulator mannequins on neonatal intubation skills of pediatric residents: a randomized controlled trial. BMC Med Educ 2022;22(1):497.
2. Wei ZH, Xu MM, Qi TI, Han YJ, Wang ZQ, Zhang W. The impact of simulation-based learning on nursing decision-making ability: a meta-analysis. Clin Simul Nurs 2024;93:101576.
3. Brady S, Bogossian F, Gibbons K. The effectiveness of varied levels of simulation fidelity on integrated performance of technical skills in midwifery students: a randomised intervention trial. Nurse Educ Today 2015;35(3):524–529.
4. Basak T, Unver V, Moss J, Watts P, Gaioso V. Beginning and advanced students’ perceptions of the use of low- and high-fidelity mannequins in nursing simulation. Nurse Educ Today 2016;36:37–43.
5. Hegland PA, Aarlie H, Strømme H, Jamtvedt G. Simulation-based training for nurses: systematic review and meta-analysis. Nurse Educ Today 2017;54:6–20.
6. Alshehri FD, Jones S, Harrison D. The effectiveness of high-fidelity simulation on undergraduate nursing students’ clinical reasoning-related skills: a systematic review. Nurse Educ Today 2023;121:105679.
7. Li YY, Au ML, Tong LK, Ng WI, Wang SC. High-fidelity simulation in undergraduate nursing education: a meta-analysis. Nurse Educ Today 2022;111:105291.
8. Zarifsanaiey N, Amini M, Saadat F. A comparison of educational strategies for the acquisition of nursing student’s performance and critical thinking: simulation-based training vs. integrated training (simulation and critical thinking strategies). BMC Med Educ 2016;16(1):294.
9. Gates MG, Parr MB, Hughen JE. Enhancing nursing knowledge using high-fidelity simulation. J Nurs Educ 2012;51(1):9–15.
10. Jorm C, Roberts C, Lim R, et al. A large-scale mass casualty simulation to develop the non-technical skills medical students require for collaborative teamwork. BMC Med Educ 2016;16:83.
11. Tutticci N, Lewis PA, Coyer F. Measuring third year undergraduate nursing students’ reflective thinking skills and critical reflection self-efficacy following high fidelity simulation: a pilot study. Nurse Educ Pract 2016;18:52–59.
12. Shiau SJ, Chen CH. Reflection and critical thinking of humanistic care in medical education. Kaohsiung J Med Sci 2008;24(7):367–372.
13. Persky AM, Medina MS, Castleberry AN. Developing critical thinking skills in pharmacy students. Am J Pharm Educ 2019;83(2):7033.
14. Châlon B, Lutaud R. Enhancing critical thinking in medical education: a narrative review of current practices, challenges, and future perspectives in context of infodemics. Presse Med Open 2024;5:100047.
15. Oh YJ, Kang HY, Song Y, Lindquist R. Effects of a transformative learning theory based debriefing in simulation: a randomized trial. Nurse Educ Pract 2021;50:102962.
16. Willers S, Jowsey T, Chen Y. How do nurses promote critical thinking in acute care?: a scoping literature review. Nurse Educ Pract 2021;53:103074.
17. Franco R, Ament Giuliani Franco C, de Carvalho Filho MA, Severo M, Amelia Ferreira M. Use of portfolios in teaching communication skills and professionalism for Portuguese-speaking medical students. Int J Med Educ 2020;11:37–46.
18. Feldman M, Edwards C, Wong A, et al. The role for simulation in professional identity formation in medical students. Simul Healthc 2022;17(1):e8–e13.
19. Mulli J, Nowell L, Lind C. Reflection-in-action during high-fidelity simulation: a concept analysis. Nurse Educ Today 2021;97:104709.
20. Nguyen B, Solanki J, Ong E. Exploring pharmacists’ perceptions of using a clinical supervision skills competency tool to reflect and develop their supervisory practices. Curr Pharm Teach Learn 2024;16(4):231–243.
21. MacKenna V, Díaz DA, Chase SK, Boden CJ, Loerzel V. Self-debriefing after virtual simulation: measuring depth of reflection. Clin Simul Nurs 2021;52:59–67.
22. Volkert DR. Building reflection with word clouds for online RN to BSN students. Nurs Educ Perspect 2018;39(1):53–54.
23. Walsh JA, Sethares KA, Viveiros JD, Asselin ME. Reflective journaling to promote critical reflective thinking post-simulation-based education. Clin Simul Nurs 2024;88:101511.
24. Robinson J. Reflection, return to practice and revalidation. Nurs Older People 2015;27(6):31–37.
25. Al Sabei SD, Lasater K. Simulation debriefing for clinical judgment development: a concept analysis. Nurse Educ Today 2016;45:42–47.
26. Adib-Hajbaghery M, Sharifi N. Effect of simulation training on the development of nurses and nursing students’ critical thinking: a systematic literature review. Nurse Educ Today 2017;50:17–24.
27. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev 2016;5(1):210.
28. Barker TH, Stone JC, Sears K, et al. The revised JBI critical appraisal tool for the assessment of risk of bias for randomized controlled trials. JBI Evid Synth 2023;21(3):494–506.
29. Barker TH, Habibi N, Aromataris E, et al. The revised JBI critical appraisal tool for the assessment of risk of bias for quasi-experimental studies. JBI Evid Synth 2024;22(3):378–388.
30. Son HK. Effects of S-PBL in maternity nursing clinical practicum on learning attitude, metacognition, and critical thinking in nursing students: a quasi-experimental design. Int J Environ Res Public Health 2020;17(21):7866.
31. Yang SY, Kang MK. Efficacy testing of a multi-access metaverse-based early onset schizophrenia nursing simulation program: a quasi-experimental study. Int J Environ Res Public Health 2022;20(1):449.
32. Lee J, Son HK. Comparison of learning transfer using simulation problem-based learning and demonstration: an application of Papanicolaou smear nursing education. Int J Environ Res Public Health 2021;18(4):1765.
33. Lee J. Situation, background, assessment, and recommendation stepwise education program: a quasi-experimental study. Nurse Educ Today 2021;100:104847.
34. Blakeslee JR. Effects of high-fidelity simulation on the critical thinking skills of baccalaureate nursing students: a causal-comparative research study. Nurse Educ Today 2020;92:104494.
35. Turrise SL, Thompson CE, Hepler M. Virtual simulation: comparing critical thinking and satisfaction in RN-BSN students. Clin Simul Nurs 2020;46:57–61.
36. Hudson S, Penkalski MR. High-fidelity simulation versus case study: which is best for practical nursing students? Nurs Educ Perspect 2022;43(1):49–50.
37. Ravert P. Patient simulator sessions and critical thinking. J Nurs Educ 2008;47(12):557–562.
38. Doğan P, Şendir M. Effect of different simulation methods in nursing education on critical thinking dispositions and self-efficacy levels of students. Think Skills Creat 2022;45:101112.
39. Akalin A, Sahin S. The impact of high-fidelity simulation on knowledge, critical thinking, and clinical decisionmaking for the management of pre-eclampsia. Int J Gynaecol Obstet 2020;150(3):354–360.
40. Alamrani MH, Alammar KA, Alqahtani SS, Salem OA. Comparing the effects of simulation-based and traditional teaching methods on the critical thinking abilities and self-confidence of nursing students. J Nurs Res 2018;26(3):152–157.
41. Rababa M, Masha’al D. Using branching path simulations in critical thinking of pain management among nursing students: Experimental study. Nurse Educ Today 2020;86:104323.
42. O’Flaherty J, Costabile M. Using a science simulation-based learning tool to develop students’ active learning, self-confidence and critical thinking in academic writing. Nurse Educ Pract 2020;47:102839.
43. D'Souza MS, Labrague LJ, Karkada SN, Parahoo K, Venkatesaperumal R. Testing a diabetes ketoacidosis simulation in critical care nursing: a randomized control trial. Clin Epidemiol Glob Health 2020;8(4):998–1005.
44. Liu S, Li Y, Wang X, Zhang X, Wang R. Research on the effect of big data flipped classroom combined with scenario simulation teaching: based on clinical practice of medical students. Wirel Commun Mobile Comput 2021;2021(1):7107447.
45. Lee BO, Liang HF, Chu TP, Hung CC. Effects of simulation-based learning on nursing student competences and clinical performance. Nurse Educ Pract 2019;41:102646.
46. Wang X, Yang L, Hu S. Teaching nursing students: as an umbrella review of the effectiveness of using high-fidelity simulation. Nurse Educ Pract 2024;77:103969.
47. Lei YY, Zhu L, Sa YT, Cui XS. Effects of high-fidelity simulation teaching on nursing students’ knowledge, professional skills and clinical ability: a meta-analysis and systematic review. Nurse Educ Pract 2022;60:103306.
48. Brown AM. Simulation in undergraduate mental health nursing education: a literature review. Clin Simul Nurs 2015;11(10):445–449.
49. Mok HT, So CF, Chung JW. Effectiveness of high-fidelity patient simulation in teaching clinical reasoning skills. Clin Simul Nurs 2016;12(10):453–467.
50. Tong LK, Li YY, Au ML, Wang SC, Ng WI. High-fidelity simulation duration and learning outcomes among undergraduate nursing students: a systematic review and meta-analysis. Nurse Educ Today 2022;116:105435.
51. Kang SJ, Kim Y. The impact of perinatal loss nursing simulation among undergraduate students. Int J Environ Res Public Health 2022;19(14):8569.
52. Vangone I, Arrigoni C, Magon A, et al. The efficacy of high-fidelity simulation on knowledge and performance in undergraduate nursing students: an umbrella review of systematic reviews and meta-analysis. Nurse Educ Today 2024;139:106231.
53. Montenery SM, Walker M, Sorensen E, et al. Millennial generation student nurses’ perceptions of the impact of multiple technologies on learning. Nurs Educ Perspect 2013;34(6):405–409.
54. Pai HC, Huang YL, Cheng HH, Yen WJ, Lu YC. Modeling the relationship between nursing competence and professional socialization of novice nursing students using a latent growth curve analysis. Nurse Educ Pract 2020;49:102916.
55. Klenke-Borgmann L, Mattson N, Peterman M, Stubenrauch C. The long-term transferability of clinical judgment via in-class simulations to nursing practice: a qualitative descriptive study. Clin Simul Nurs 2023;85:101468.
56. Lapkin S, Fernandez R, Levett-Jones T, Bellchambers H. The effectiveness of using human patient simulation manikins in the teaching of clinical reasoning skills to undergraduate nursing students: a systematic review. JBI Libr Syst Rev 2010;8(16):661–694.
57. Emekli E, Coşkun Ö, Budakoğlu Iİ. Medical recordkeeping educational interventions for medical students and residents: a systematic review. Health Inf Manag 2025;54(2):177–189.
58. Sanabria-de la Torre R, Quiñones-Vico MI, Ubago-Rodríguez A, Buendía-Eisman A, Montero-Vílchez T, Arias-Santiago S. Medical students’ interest in research: changing trends during university training. Front Med (Lausanne) 2023;10:1257574.


Table 1.

Inclusion and Exclusion Criteria according to the PICOS Framework

PICOS items: Inclusion and exclusion criteria
Population: All medical and nursing students, regardless of their degree or grades.
Intervention: Studies that examined simulation as a teaching method.
Comparison: A group of students (control group) exposed only to traditional teaching methods, without simulation, through typical courses, traditional lectures, case studies, clinical placements, or clinical practicum.
Outcome: To be eligible for inclusion, studies had to assess critical thinking and/or reflection among students.
Study design: We included cross-sectional studies, case-control studies, cohort studies, and randomized controlled trials. Qualitative studies, commentary articles, and conference abstracts were excluded.

Table 2.

Summary of the Characteristics of the Included Studies

Study Country Study design Sample size & education degree Theme Tool/scale Simulation type No. of simulation sessions Duration of simulation session Intervention (experimental group) Comparison (control group) Findings
Son [30] (2020) South Korea Quasi-experimental 78 Third-year nursing students CT YCTDS HFS 1 Simulation running: 15–20 min 47 Participants engaged in S-PBL for a week. 31 Participants engaged for a week in a traditional maternity clinical practicum. The experimental group showed a significant increase in critical thinking values (t=-2.78, p=0.008). The control group did not exhibit any significant difference.
Yang & Kang [31] (2022) South Korea Quasi-experimental 58 Third-year nursing students CT YCTDS Electronic simulation: virtual reality 1 Simulation running: 50 min; Debriefing: 100 min 29 Participants used the simulation program. 29 Participants received an online lecture. The experimental group had significantly increased critical thinking ability (F(2)=9.86, p=0.038), compared to that of the control group.
Alamrani et al. [40] (2018) Saudi Arabia RCT 30 Undergraduate nursing students CT Questionnaire HFS 1 Simulation training: 60 min; Debriefing: 20 min 15 Participants received simulation-based teaching program: 2 hours PowerPoint presentation on the core concepts of ECG and 1-hour immersive simulation on basic arrhythmia interpretation and debriefing session. 15 Participants received the traditional teaching program: 2 hours PowerPoint presentation on the core concepts of ECG and 1 hour lecture on basic arrhythmia interpretation followed by discussion with educator. The critical thinking scores of both groups improved significantly (p<0.05), and no significant differences in critical thinking were identified between the groups (t=1.250, p=0.222; Mann-Whitney U test: p=0.512, NS).
Lee & Son [32] (2021) South Korea Quasi-experimental 105 Third-year nursing students CT YCTDS Hybrid simulation: role play & human body model 1 Pre-briefing: 60 min; Running time: 15–20 min; Debriefing: 60 min 52 Participants engaged in S-PBL based on Pap smear knowledge. 53 Participants engaged in a Pap smear demonstration based on Pap smear knowledge. The experimental group had significantly improved critical thinking, compared to the control group (t=2.07, p=0.041).
Doğan & Şendir [38] (2022) Turkey RCT 71 Undergraduate nursing students CT CCTDI, TV HFS & standardized patient 1 Not specified 2 Experimental groups: 23 participants in high fidelity simulation and 23 participants in standardized patient simulation. 25 Participants had a traditional teaching program. No significant difference between the scores of both groups in the pre-test and post-test regarding critical thinking disposition (p>0.05).
Rababa & Masha’al [41] (2019) Jordan RCT 102 Undergraduate nursing students CT CTSAS Electronic simulation: branching path simulations 6 Training session: 60 min 51 Participants trained by BPS. 51 Participants had traditional lectures. The critical thinking scores in the experimental group significantly increased after the BPS intervention (p<0.05) compared to pre-training. Conversely, no significant change in critical thinking scores was observed in the control group (p>0.05).
O’Flaherty & Costabile [42] (2020) Australia Quasi-experimental 112 Undergraduate nursing students CT Scheffer and Rubenfeld CT criteria Electronic simulation 1 Not reported 60 Participants used electronic simulation. 52 Participants used lecture notes as a study tool. Critical thinking improved in the experimental group after using the simulation, in contrast to the control group, which relied on lecture notes as a study tool.
Blakeslee [34] (2020) USA RCT 69 Baccalaureate junior nursing students CT HSRT HFS 3 Briefing: 5–10 min; scenario: 15–20 min; Debriefing: 15–20 min 36 Participants used HFS. 33 Participants used a written case study. No statistically significant differences between the groups (simulation vs. written case studies) F(1, 67)=0.264, p=0.609 (η2=0.004).
Turrise et al. [35] (2020) USA RCT 27 RN-BSN students CT HSRT Electronic simulation: virtual reality (DCE) Not reported Not reported 13 Participants had the DCE included the following: (1) hypertension, dyslipidemia, and antihypertensive concept laboratory; (2) diabetes mellitus, diabetic foot infection, pain, and antidiabetic concept laboratory; and (3) pneumonia and anti-asthmatic concept laboratory. 14 Participants had case studies included the following: (1) hypertension and dyslipidemia. (2) The patient managing his hypertension and dyslipidemia, but also diabetes mellitus, diabetic peripheral neuropathy, and depression. (3) The patient continues to manage his diabetes mellitus, but with a foot infection and pain. No statistically significant difference in critical thinking scores between the control group (Wilks’ Lambda=0.991, F(1, 12)=0.108, p=0.748) and the intervention group (Wilks’ Lambda=0.997, F(1, 13)=0.034, p=0.856).
D’Souza et al. [43] (2020) Oman RCT 140 Undergraduate nursing students CT CCTDI, CCTST HFS 14 Pre-briefing: 60 min; Running simulation: 30 min; Debriefing: 60 min 70 Participants underwent high fidelity simulation teaching on diabetic ketoacidosis. 70 Participants exposed to patient assignments during a clinical day on an acute medical unit. The critical thinking scores significantly increased in the experimental group compared to the control group for both the CCTDI (mean=3.45±0.75 vs. 2.98±0.34) and the CCTST (mean=3.12±0.83 vs. 2.77±0.30)
Akalin & Sahin [39] (2020) Turkey RCT 107 Third-year nursing students CT CCTDI HFS Not reported Running simulation: 15 min; Debriefing: 30 min 53 Participants engaged in high fidelity simulation sessions. 54 Participants attended the classical training on “the management of pre-eclampsia.” The critical thinking scores in the experimental group significantly increased in the post-test compared to the control group, and the difference between the groups was statistically significant (p<0.001)
Liu et al. [44] (2021) China Quasi-experimental 119 Medical students CT CT Trait Scale Standardized patient (role-play) Not reported Not reported 61 Participants received instruction using the combined approach of big data flipped classroom with scenario simulation teaching. 58 Participants received a traditional teaching method. The critical thinking scores of both groups were above 280, but the experimental group had a significantly higher total score of critical thinking compared to the control group.
Lee et al. [45] (2019) Taiwan Prospective interventional study 100 Students in a second-year nursing baccalaureate program Reflection SBLES Standardized patient (role-play) Not reported 65 min in total; scenario: 10–15 min; Debriefing: 30 min 49 Participants received simulation sessions. 51 Participants engaged in typical courses and case study discussions lasting 1.5 hours each. The attitude of reflection in the experimental group significantly increased compared to the control group according to the SBLES responses.
Lee [33] (2021) South Korea Quasi-experimental 96 Fourth year nursing students CT YCTDS Role-play 4 Training session 120 min 48 Participants involved in SBAR role-play sessions conducted over a 2-week training during the clinical placement. 48 Participants had a regular session for a case study of clinical practicum once a week. The critical thinking scores in the experimental group significantly increased compared to the control group increasing by 0.76 points from the pre-test to the post-test. The difference between the groups was statistically significant (F=12.15, p=0.000).
Hudson & Penkalski [36] (2022) USA Quasi-experimental 29 Nursing students CT ATI HFS 1 Training session 50 min 15 Participants engaged in high fidelity simulation sessions. 14 Participants engaged in interactive case studies. The critical thinking scores of both groups improved significantly (p=0.001), however no significant difference was found between the high-fidelity simulation and interactive case study groups in posttest scores.
Ravert [37] (2008) USA RCT 40 Undergraduate baccalaureate nursing students CT CCTDI, CCTST HFS 5 Training session 60 min 2 Experimental groups: 13 participants received five enrichment sessions with no HPS and 12 participants received five enrichment sessions with HPS. 15 Participants received the regular education process with no enrichment sessions. No statistically significant differences were found between the two experimental groups and the control group.

CT: Critical thinking, YCTDS: Yoon’s Critical Thinking Disposition Scale, HFS: High-fidelity simulation, S-PBL: Simulation problem-based learning, RCT: Randomized controlled trial, ECG: Electrocardiogram, NS: not significant, CCTDI, TV: California Critical Thinking Disposition Inventory, Turkish version, CTSAS: Critical Thinking Self-Assessment Scale, BPS: Branching path simulation, HSRT: The Health Science Reasoning Test, RN-BSN: Registered Nurse-Bachelor of Science in Nursing, DCE: Digital Clinical Experiences, CCTDI: California Critical Thinking Disposition Inventory, CCTST: California Critical Thinking Skill Test, SBLES: Simulation-Based Learning Evaluation Scale, SBAR: Situation, Background, Assessment and Recommendation, ATI: Assessment Technologies Institute, HPS: Human patient simulator.

Table 3.

Critical Appraisal for Randomized Control Trials Studies

Study Q1 Q2 Q3 Q4 Q5 Q6 Q7 Q8 Q9 Q10 Q11 Q12 Q13 Total Grade
Alamrani et al. [40] (2018) 0 0 0 0 0 1 0 1 1 1 1 1 1 7 B
Doğan & Şendir [38] (2022) 0 0 1 0 0 1 0 1 1 1 1 1 1 8 B
Rababa & Masha’al [41] (2019) 0 0 1 0 0 1 0 1 1 1 1 1 1 8 B
Blakeslee [34] (2020) 0 0 1 0 0 1 1 1 1 1 1 1 0 8 B
Turrise et al. [35] (2020) 0 0 1 0 0 1 0 1 1 0 1 1 1 7 B
D’Souza et al. [43] (2020) 1 1 1 1 0 1 1 1 1 1 1 1 1 12 A
Akalin & Sahin [39] (2020) 1 0 1 0 0 0 0 1 1 1 1 1 1 8 B
Lee et al. [45] (2019) 0 0 1 0 0 1 0 1 1 1 1 1 1 8 B
Ravert [37] (2008) 0 0 1 0 0 0 0 1 1 1 1 1 1 7 B

1: Yes, 0: No or unclear.

Table 4.

Critical Appraisal for Quasi-Experimental Studies

Study Q1 Q2 Q3 Q4 Q5 Q6 Q7 Q8 Q9 Total Grade
Son [30] (2020) 1 1 0 1 1 1 1 1 1 8 A
Yang & Kang [31] (2022) 1 1 1 1 1 1 1 1 1 9 A
Lee & Son [32] (2021) 1 1 1 1 1 1 1 1 1 9 A
O’Flaherty & Costabile [42] (2020) 1 1 0 0 0 0 0 0 0 2 C
Liu et al. [44] (2021) 1 1 1 0 1 0 0 1 1 6 B
Lee [33] (2021) 1 1 1 1 1 1 1 1 1 9 A
Hudson & Penkalski [36] (2022) 1 1 1 1 1 1 1 1 1 9 A

1: Yes, 0: No or unclear.