The impact of simulation on the development of critical thinking and reflection among nursing and medical students: a systematic review
Abstract
Simulation is an educational approach that promotes the mastery of technical skills while advancing the development of non-technical competencies, both of which are widely acknowledged as essential in clinical practice. This review aimed to synthesize findings on the impact of simulation in enhancing critical thinking and reflection among nursing and medical students. Following the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), a systematic review was conducted by searching the following databases: PubMed, ScienceDirect, and Scopus. The quality of the included studies was assessed using Joanna Briggs Institute critical appraisal tools. The protocol was previously registered in the PROSPERO registry (CRD42022371971). Of the 1,323 studies identified in the primary search, 16 were included in this review, involving a total of 1,283 students. Of the 16 studies, seven investigated the impact of simulation on critical thinking and reported a positive effect compared to traditional teaching methods. For student reflection, only one study addressed this theme and reported a positive effect on nursing students. This review demonstrated that simulation has a positive impact on critical thinking; however, its impact on reflection remains inconclusive. Further research is essential to explore its effects across diverse populations, including those in developing countries, to maximize its educational potential in health professions education.
Introduction
Simulation can be presented with different levels of realism or “fidelity”: high, medium, and low fidelity [1]. High-fidelity simulation (HFS) involves the use of computer-controlled mannequins, virtual reality systems, or advanced simulators [2]. These simulations provide an immersive experience that closely mimics real conditions and allows participants to experience situations similar to those they would encounter in practice [3]. Medium-fidelity simulators operate on comparatively simpler computer systems, capable of displaying certain physiological responses; however, they may lack specific details essential for an accurate simulated scenario [1]. In contrast, low-fidelity simulations rely on less advanced equipment, offering a less detailed representation of reality [4,5]. Traditional teaching, on the other hand, is an educational strategy that focuses on methods such as lectures, case studies, and videotapes. The training includes a clinical practicum, clinical placement, or involvement in a hospital setting [6]. Although these approaches can improve clinical performance [7], they limit the engagement and active participation of health science students in clinical education [8]. Therefore, combining these methods with simulation may be a promising way to improve skills [7]. Moreover, simulation is seen as an ideal substitute for traditional clinical practice [9].
Simulation is increasingly recognized as an educational method that not only enhances technical skills but also develops non-technical ones, which are now widely regarded as essential in clinical practice [7]. These non-technical skills, including cognitive, social, and personal skills, complement technical skills and contribute to the safe and effective execution of tasks [10]. Among them, critical thinking and reflection on practice are two essential competencies in the healthcare training curriculum [11]. Critical thinking guides caregiving actions using a solid foundation of theoretical knowledge [12]. Although critical thinking has several definitions, the most common one is to question whether the received information is factual, reliable, evidence-based, and unbiased. In simpler terms, it means thinking about what to believe or do [13]. The objective of all these definitions remains the same: making a reasoned judgment to make the best possible decision [14]. Critical thinking has demonstrated a highly positive association with problem-solving skills in nursing students along with improved clinical judgment, which in turn boosts students’ confidence in clinical settings [15]. Therefore, critical thinking can promote safe care, competence, and high-quality patient care. It can also improve the level of professionalism in healthcare [16].
Reflection, another crucial non-technical skill, promotes inquiry, autonomy, emotional acknowledgment, self-awareness, and understanding, thereby enhancing professional practice in nursing education [12]. Reflection is both a cognitive and metacognitive process aimed at deeply understanding situations [17]. It is defined as active learning by examining practice in the light of knowledge, action, and feedback [16]. There are three types of reflection: content, process, and premise, each leading to action and change. Content reflection focuses on learning from rethinking what was done (actions). Process reflection, on the other hand, looks at the origins of an action and the factors associated with it, enabling new meaning to be gained and understanding to be transformed. Finally, premise reflection involves learning through meaning transformation (action), changing the overall perspective of a situation [15]. Reflection is essential in medical education and professional practice [18]. It serves as a vital component of experiential learning activities, including simulation-based education [19]. Additionally, it can improve quality and cultivate a positive practice environment [20,21]. A significant positive relationship between critical thinking and reflection has been observed in several studies [15,16]. Reflection enables students to integrate learning into their personal context, fostering the development of deeper knowledge and contributing to the improvement of their critical thinking skills [22]. Indeed, students’ reflection on the actions undertaken and the analysis of situations play a crucial role in developing their critical thinking skills [15]. Both critical thinking and reflection are essential because understanding the reasons behind one’s actions through self-reflection and critique enhances practice, leading to deeper learning [23] and improving the quality of care [24].
In recent years, the literature highlights a growing number of reviews examining the influence of simulation on the enhancement of non-technical skills, such as reasoning, clinical judgment, and critical thinking [6,25,26]. Most reviews evaluating the impact of simulation on critical thinking have reported mixed results, emphasizing the need for further research using robust methodologies to assess its effectiveness [2,26]. In this context, the present systematic review aims to synthesize findings on the impact of simulation in fostering critical thinking and reflection among nursing and medical students, while comparing its effectiveness to traditional teaching methods.
Methods
1. Registration and protocol
This systematic review adhered to the PRISMA guidelines (Preferred Reporting Items for Systematic Reviews and Meta-Analyses). The protocol was registered and published on PROSPERO (ID: CRD42022371971).
2. Eligibility criteria
The inclusion and exclusion criteria for this review were carefully formulated based on the Population, Intervention, Comparison, Outcome, and Study Design (PICOS) framework (Table 1).
3. Information sources
The literature search was conducted in the following databases: PubMed, ScienceDirect, and Scopus. Only studies published in English from the inception of the databases to January 2024 were included. The final search check was completed on January 20, 2024.
4. Search strategy
The database searches were independently conducted by two authors (S.L. and L.L.) using a research strategy developed based on the inclusion criteria. This strategy incorporated relevant keywords and Medical Subject Headings (MeSH) terms, including: “simulation,” “simulation training” (as a MeSH term), “critical thinking,” “reflection,” “medical students,” and “nursing students.” The search terms are as follows:
1) Scopus
(TITLE-ABS (“simulation” OR “simulation training”)) AND (TITLE-ABS (“medical students” OR “nursing students”)) AND (TITLE-ABS (“critical thinking” OR “reflection”))
2) PubMed
(((simulation[Title/Abstract]) OR (“simulation training”[Title/Abstract])) AND ((“medical students”[Title/Abstract]) OR (“nursing students”[Title/Abstract]))) AND ((“critical thinking”[Title/Abstract]) OR (“reflection”[Title/Abstract]))
3) ScienceDirect
(“simulation” OR “simulation training”) AND (“medical students” OR “nursing students”) AND (“critical thinking” OR “reflection”)
5. Selection and data collection process
Two authors (S.L. and L.L.) independently conducted the selection process according to the inclusion and exclusion criteria. Titles, abstracts, full texts, and keywords were reviewed, and duplicates were removed electronically using EndNote X9 (Clarivate, Philadelphia, USA). Screening and selection were then performed using Rayyan QCRI (Rayyan, Cambridge, USA) [27]. Articles considered potentially eligible were downloaded for detailed analysis, and any disagreements were resolved by a third researcher (H.N.).
6. Data items
Data were extracted and included details such as the author, year of publication, study location, study design, sample size, education level, subject area, measurement tools or scales, type and frequency of simulation sessions, duration of each session, methods employed in the intervention and control groups, and key findings.
7. Synthesis methods
The data were classified and analyzed to meet the objective of the review. Two authors (S.L. and L.L.) conducted data extraction and clustering, with each study undergoing two independent reviews to ensure accurate variable categorization. The extracted data were then analyzed using narrative synthesis, beginning with the development of a theoretical framework aligned with the objective of the review. This framework aimed to assess the impact of simulation on the development of critical thinking and reflection in nursing and medical students, and to compare its effectiveness with traditional teaching methods. The key findings of the studies were systematically organized in tabular form for further analysis.
8. Study risk of bias assessment
The assessment of the risk of bias was conducted independently by two authors (S.L., A.A.) using the Joanna Briggs Institute’s Checklist (2023) for randomized controlled trials (RCTs) and quasi-experimental studies [28,29]. The evaluation of RCTs was conducted using a 13-item grid, with each item scored as either “1” (yes) or “0” (no or unclear). The total score for each study was determined by summing the number of “1s.” Studies scoring 10 or higher were classified as Grade A, indicating high quality. Grade B included scores between 7 and 9, reflecting moderate quality, while Grade C encompassed scores of 6 or lower. For quasi-experimental studies, a nine-item evaluation grid was used, following the same scoring method. Studies with a total score of 7 or higher were classified as Grade A (high quality). Grade B corresponded to scores between 5 and 6, representing moderate quality, and Grade C included scores of 4 or lower.
Results
1. Study selection
The search strategy identified 1,323 articles, of which 972 were duplicates and subsequently removed. Screening of the titles and abstracts of the remaining 351 articles led to the exclusion of 330 studies. As a result, 21 studies were selected for further analysis, and 16 were ultimately included in this review. The study selection process included in this review is illustrated in Fig. 1.
2. Study characteristics
Sixteen studies were included in this review [30-45]. Four studies were conducted in South Korea [30-33], four in the United States [34-37], two in Turkey [38,39], and one each in Saudi Arabia [40], Jordan [41], Australia [42], Oman [43], China [44], and Taiwan [45]. A total of 1,283 students participated, with 670 in experimental groups and 613 in control groups. Most studies focused on nursing students across various grades and programs, while only one study included medical students. Nine studies were RCTs, and seven employed a quasi-experimental design. The key characteristics of these 16 studies are summarized in Table 2.
3. Risk of bias in studies
Among the experimental studies, only one was evaluated as high quality [43], while the remaining eight were rated as moderate quality [34,35,37-41,45]. For the quasi-experimental studies, five were assessed as high quality [30-33,36], one as moderate quality [44], and one as low quality [42]. Tables 3 and 4 provide a detailed assessment of the quality of the eligible studies.
4. Results of syntheses
1) Critical thinking
The analysis showed that 15 of the 16 studies included in this review examined the impact of simulation on critical thinking. Several types of simulation were identified, with HFS being the most commonly used, appearing in eight studies [30,34,36-40,43]. Electronic simulation was used in four studies [31,35,41,42], while standardized patients were featured in three studies [33,38,44]. Additionally, one study employed a hybrid simulation that combined role-playing with a human body model [32]. The traditional teaching control groups employed various methods, with lectures and case studies being the most frequently used, appearing in eight studies [31,33-36,40-42]. Other traditional approaches included practical learning experiences such as practicums, knowledge-based demonstrations, and a clinical day in an acute medical unit, as noted in three studies [30,32,43]. Additionally, four studies mentioned the use of traditional methods without specifying their types [37-39,44]. The frequency of simulation sessions ranged from one to 14. Most of the included studies featured a single simulation session, as noted in seven studies [30-32,36,38,40,42]. Only one study reported a high frequency of 14 simulation sessions [43], followed by one with six sessions [41] and another with four sessions [33]. One study included three simulation sessions [34], while the number of sessions was unspecified in several other studies [35,39,44]. The duration of simulation sessions ranged from 15±5 to 150 minutes. However, most studies did not specify the time allocated to each phase (pre-briefing, simulation training, and debriefing), except for three studies [32,34,43]. The longest recorded duration was 150 minutes, noted in two studies [31,43], while another study reported a session lasting 135±5 minutes [32]. One article indicated a duration of 120 minutes [33], another described a session of 80 minutes [40], and two studies reported a duration of 60 minutes [37,41].
Additionally, one study noted sessions lasting 35 to 50 minutes [34], and another study reported a 45-minute simulation session [39]. Finally, the shortest duration was 15±5 minutes [30], which referred solely to the simulation phase. The duration of the simulation sessions was not specified in four studies [35,38,42,44].
Regarding the measurement tools, nine instruments and scales were used to assess critical thinking. These included a multiple-choice questionnaire developed specifically by the authors [40]. The Yoon Critical Thinking Disposition Scale, used in four studies [30-33], is a standardized tool for nursing students in Korea consisting of 27 items scored on a Likert scale. The California Critical Thinking Disposition Inventory, a multiple-choice questionnaire consisting of 27 items scored on a Likert scale, was used in four studies [37-39,43]. Two of these four studies also included the California Critical Thinking Skill Test, which contains 34 items [37,43]. Moreover, two studies used the Health Sciences Reasoning Test, an assessment designed for health sciences students consisting of 33 multiple-choice items [34,35]. Some studies also employed other instruments, such as the Critical Thinking Self-Assessment Scale, which includes 115 multiple-choice questions [41], and the Critical Thinking Trait Scale, which includes 70 Likert-scale items [44]. Furthermore, one study assessed critical thinking using Scheffer and Rubenfeld’s critical thinking criteria, which comprise 10 habits of the mind and seven skills; improvement in critical thinking was evaluated using three of these criteria: one habit of the mind and two cognitive skills [42]. Another study used the Assessment Technologies Institute test, consisting of 60 questions related to critical thinking, with a score range of 0 to 100 [36]. In 14 studies, all instruments used to measure critical thinking were valid and reliable; however, one study did not provide any information regarding validity and reliability [42].
Most studies revealed a positive impact of simulation on students’ critical thinking. Nine studies showed that the experimental groups using simulation achieved higher scores than the control groups following traditional teaching methods [30-33,39,41-44]. In two studies, critical thinking improved in both the experimental and the control groups [36,40]. Among the 16 studies included in this review, four reported non-significant results regarding the impact of simulation on critical thinking [34,35,38,39]. Only one study, conducted with medical students, demonstrated a positive impact on their critical thinking [44].
2) Student reflection
The analysis revealed that only one of the 16 studies examined the impact of simulation on reflection [45]. This study employed standardized patients with role-playing as the type of simulation. The control group received traditional teaching methods, including standard courses and case study discussions. The number of simulation sessions was not specified. However, the durations of the scenario and debriefing were provided separately: the scenario lasted 10 to 15 minutes, while the debriefing took 30 minutes. The total duration of the simulation session was 65 minutes. The Simulation-Based Learning Evaluation Scale (SBLES), consisting of 37 items across five subscales, was used to assess reflection. The SBLES demonstrated good reliability and validity [45]. The study found a positive effect of simulation on the reflection of the experimental group compared to the control group.
Discussion
This review aimed to consolidate findings on the impact of simulation in fostering critical thinking and reflection among nursing and medical students. Notably, nine studies highlighted that simulation, regardless of its type, significantly enhanced students’ critical thinking skills compared to traditional teaching methods. This finding is supported by a recent systematic review, which demonstrates that HFS can improve nursing students’ skills more effectively than traditional teaching methods [46]. This effect could be explained by the immersive environment provided by HFS, which realistically replicates clinical situations. It also allows for repeated scenarios, enabling participants to identify and correct errors, while the debriefing process fosters deep reflection [47]. However, another author reported contrasting findings, noting that variations in the fidelity of patient simulators had no impact on students’ learning outcomes [48].
The choice to focus on critical thinking and reflection in this review stems from their pivotal role in healthcare education and clinical practice. Critical thinking equips nursing and medical students with the ability to assess complex clinical scenarios, make well-informed decisions, and ensure patient safety. Meanwhile, reflection promotes deeper learning by encouraging students to evaluate their own reasoning, recognize areas for growth, and cultivate a more flexible and responsive approach to patient care. Although other non-technical skills like communication, leadership, and teamwork are essential, this review prioritizes critical thinking and reflection due to their direct influence on clinical reasoning and lifelong learning. Additionally, their prominence in the literature on simulation-based learning highlights their importance in bridging the gap between theoretical knowledge and real-world application.
Furthermore, two studies demonstrated an improvement in critical thinking within both the experimental and control groups simultaneously, suggesting that critical thinking could be enhanced either through traditional teaching methods or through simulation [36,40]. Similar findings were reported in another study which found no differences between traditional teaching methods and simulation in the development of critical thinking [47]. The results varied based on the frequency of exposure to simulation. Specifically, five studies reported positive effects after a single simulation session, while another study found no significant effect despite exposure to five simulation sessions [37]. This suggests that the frequency of simulation sessions did not consistently influence the positive impact on critical thinking. In contrast, two studies concluded that the development of students’ skills is associated with repeated simulation practice over an extended period [46,49].
The total duration of the simulation sessions in this review varied considerably, ranging from 15 to 150 minutes. This observation is consistent with the findings of another author, who also reported variation in the duration of simulation sessions [26]. The difference in duration may greatly affect the effectiveness of simulation-based learning, with the ideal total length for skill improvement being at least 55 minutes. This includes a pre-briefing of at least 10 minutes, 15 minutes of simulation training, and over 30 minutes for debriefing [43]. Furthermore, in this review, among the four studies that showed non-significant results regarding critical thinking, two did not specify the duration of the simulation session. Therefore, the failure to report simulation duration may have contributed to the non-significant results observed in these studies. This variability in results is also observed in several studies, underscoring the need for further research to more accurately determine the real impact of simulation session frequency and duration on non-technical skills [26,50].
The results of our review showed that, despite the use of various measurement tools and scales, most studies reported significant improvements in critical thinking. This finding is supported by a previous study, which demonstrated a notable enhancement in critical thinking skills among students who participated in HFS, despite the use of different measurement tools [47]. However, other systematic reviews suggested that the inconsistent results may have been influenced by the use of different critical thinking measurement instruments [6,26]. This finding emphasizes the need for additional studies that employ standardized measurement scales.
Regarding reflection, this review demonstrated a positive effect of simulation on nursing students’ reflection compared to traditional teaching methods. However, this result is based on a single study focused on the attitude of reflection [45], which limits the generalization of these findings. This was further supported by a study that highlighted the need for more research measuring students’ reflection both before and after simulation [51].
This review broadly addressed the similarities between the two themes, such as cultural and educational aspects, long-term effects of simulation, population distribution, and methodological quality of the studies. Regarding the cultural and educational aspects, the studies under analysis were conducted in nine different countries, reflecting different cultural and educational contexts. This diversity can impact pedagogical methods, whether simulation-based or traditional approaches, as well as their effect on critical thinking and reflection. A recent study supports this by showing that cultural differences and variations in teaching methodologies and assessment strategies may influence the effectiveness of educational methods [52]. Therefore, it is crucial for future research to explore in depth how different educational systems and cultural contexts affect the development of critical thinking and reflection.
Many studies found that the short-term effects of simulation had a positive impact on the development of critical thinking and reflection [11,33]. This aligns with the results of this review, which demonstrated that all the included studies only explored the short-term effects of simulation, and none examined the long-term effects. In contrast, few studies revealed the long-term benefits of simulation on critical thinking [53], and reflection [54]. Despite these results, there was still a lack of research on the lasting effects of simulation, highlighting the need for further studies to explore these long-term effects [55].
Concerning the distribution of the population studied, this systematic review revealed a striking disparity. Of the 16 studies included, 15 involved nursing students, while only one involved medical students [44]. This finding broadly supports the work of other systematic reviews in this area, showing a significant focus on nursing students [26,47,56]. Similarly, these results confirm the findings of a recent study, which highlighted significant gaps in the literature concerning studies involving medical students [57]. In another study, the main barriers to medical students’ involvement in research were insufficient funding and limited awareness of available opportunities [58]. Therefore, it is necessary to emphasize the need for further research focusing specifically on these students to contribute to the development of more effective educational strategies [57].
In this review, only one experimental study was considered of high quality, while the other eight studies had several methodological weaknesses, such as lack of true randomization, lack of allocation concealment, and lack of adequate blinding. For the quasi-experimental studies, the weaknesses concerned the similarity of the participants in the comparisons, the treatment or care received outside the intervention in question, and the way the outcomes were measured. Nevertheless, the quasi-experimental studies scored higher than the experimental studies. These methodological limitations may affect the overall conclusions of this review. We recommend that future research should focus on strengthening these aspects to improve the validity of the results.
This review provides valuable insights but has several limitations. The use of varied instruments and heterogeneous control groups complicates comparisons and hinders a meta-analysis. Control conditions ranged from lectures to clinical practice, making it difficult to isolate the true impact of simulation-based education. Differences in instructional methods, exposure time, and learner engagement further limit generalizability.
Most studies focused on nursing students, highlighting the need to investigate medical students. Additionally, research on simulation’s role in fostering student reflection remains scarce. Methodological weaknesses were evident, with only one of nine RCTs rated as high quality. Issues such as inadequate randomization, poor allocation concealment, and lack of blinding may have introduced biases.
To strengthen the evidence base, future research should ensure rigorous study designs, adequate sample sizes, standardized measurement tools, and transparent reporting. Standardizing control conditions will also enhance comparability and clarify the benefits of simulation-based learning.
Conclusion
The findings of this review demonstrate that simulation, in its various forms such as high-fidelity and virtual simulation, positively influences the development of critical thinking skills among nursing students compared to traditional teaching methods. Additionally, evidence suggests that simulation can also support the development of critical thinking in medical students, as shown in the studies included. These results highlight the potential of simulation as an effective educational approach for fostering critical thinking skills in health sciences education. Future research could build on these promising outcomes to further explore its broader applications and long-term impact.
Notes
Acknowledgements
None.
Funding
No financial support was received for this study.
Conflicts of interest
No potential conflict of interest relevant to this article was reported.
Author contributions
Conceptualization: SL, LL, HN; data collection: SL, LL, AA; data analysis and interpretation: SL, LL; drafting the article: SL, LL, AA, HN; critical revision of the article: SL, HN; and final approval of the version to be published: all authors.