Chung, Issenberg, Phrampus, Miller, Je, Lim, and Kim: International Collaborative Faculty Development Program on Simulation-Based Healthcare Education: A Report on Its Successes and Challenges

Abstract

Purpose:

Countries that are less experienced with simulation-based healthcare education (SBHE) often import Western programs to initiate their efforts to deliver effective simulation training. Acknowledging cultural differences, we sought to determine whether a faculty development program on SBHE developed in the United States could be transported successfully to train faculty members in Korea.

Methods:

An international, collaborative, multi-professional program was adapted from a pre-existing Western model. The process focused on prioritization of curricular elements based on local needs, translation of course materials, and delivery of the program through small group facilitation exercises. Three types of evaluation data were collected: participants' simulation experience; participants' ratings of the course; and participants' self-assessment of the impact of the course on their knowledge, skills, and attitudes (KSA) toward simulation teaching.

Results:

Thirty faculty teachers participated in the course. Eighty percent of the participants answered that they spent less than 25% of their time as simulation instructors. Time spent on planning, scenario development, delivering training, research, and administrative work ranged from 10% to 30%. Twenty-eight of 30 participants agreed or strongly agreed that the course was excellent and relevant to their needs. The participants' assessment of the impact of the course on their KSA toward simulation teaching improved significantly.

Conclusion:

Although there were many challenges to overcome, a systematic approach to adapting a Western simulation faculty development course model was successfully implemented in Korea, and the program improved participants' self-confidence and learning.

INTRODUCTION

Over the past two decades, the use of simulation to improve the quality and safety of patient care has rapidly increased [1]. Simulation allows healthcare providers of all levels and disciplines to acquire and practice clinical skills and to safely perform critical procedures [2,3]. Simulation allows for training in crisis resource management and team training in complex clinical environments [4]. In addition, various types of simulation activities can be used in medical education to support the teaching of basic science [5].
Several factors have led to the growth and utilization of simulation-based healthcare education (SBHE) in Korea. Curricula were changed to aim for more immersive and experiential learning, and clinical skills evaluations were integrated into the national licensing examination. As a result, medical school accreditation came to include evaluation of a dedicated clinical skills laboratory.
The dissemination of innovation and change in medical education, including SBHE, has been influenced by a Western bias. Many theoretical and conceptual frameworks, rules and principles, and system developments are rooted in Western culture [6]. Recently, many institutions and organizations in the Western world have begun to offer formal training for simulation-based healthcare educators [7]. However, such faculty development programs are less prevalent in Korea. Therefore, countries that are less experienced with SBHE often import Western programs to initiate their efforts to deliver effective simulation training.
The goal of this study was to evaluate the impact of an international collaborative faculty development program in SBHE by assessing participants' reactions and learning improvement. We also describe our experiences and some of the challenges faced in adapting a cross-cultural faculty development program.
Contents of this manuscript were published in a Letter format in the journal Medical Teacher [8]. The manuscript was also presented as an abstract at the 12th Annual International Meeting on Simulation in Healthcare [9]. The course descriptions are presented again in this manuscript.

SUBJECTS AND METHODS

1. Course description

The authors adapted a multi-professional simulation instructor program with the aim of maximizing simulation-based instruction skill acquisition through a collaborative effort between the Michael S. Gordon Center for Research in Medical Education (GCRME) at the University of Miami (www.gcrme.miami.edu), the Peter M. Winter Institute for Simulation, Education and Research (WISER) at the University of Pittsburgh (www.wiser.pitt.edu), and the Korean Society for Simulation in Healthcare (KoSSH, www.kossh.or.kr). The program was based on a previously developed course from the two United States centers, GCRME and WISER. The course is a 3-day program designed to introduce the fundamental skills and abilities for delivering SBHE through a variety of techniques and technologies. The course follows an ‘interactive lecture-small group activities-debriefing’ format so that reflection and learning with respect to the learning objectives can take place. The course has been run in other English-speaking Asian countries, such as Hong Kong and Singapore, but this was the first time the course was offered in a non-English-speaking Asian country. The curriculum was prioritized to serve local needs in Korea and to shorten the program. Course materials and handouts were all translated into Korean for the participants' comfort and ease. Translations were done and cross-checked by two members of KoSSH who had previously attended the United States course. The training was delivered by three instructors from GCRME and WISER, and two instructors from KoSSH who had previously attended the original United States course and had also completed 1- to 2-year research scholar fellowships at GCRME or WISER.

2. Course evaluation

The evaluation was grounded in Kirkpatrick's four-level model [10]: Reaction (Level 1: measurement of satisfaction), Learning (Level 2: measurement of learning), Behavior (Level 3: measurement of behavior change), and Results (Level 4: measurement of results). This study focused on evaluating Levels 1 and 2 of Kirkpatrick's model.
We collected three types of evaluation data:
1) We developed a survey on the participants' pre-course baseline status of simulation experience, including barriers faced in their workplace. It consisted of 4 sections: general information, operation of simulation center, perceived barriers, and simulation-based learning as a specialty. A total of 48 items were to be completed by participants online before attending the course. The questionnaire included multiple-choice questions, open-ended questions, and statements rated on a 5-point scale (1=strongly disagree to 5=strongly agree). A questionnaire to measure barriers to implementation was developed based on the Consolidated Framework for Implementation Research [11].
2) Participants' ratings of the course were obtained using a post-course questionnaire. It consisted of 31 items on reactions to the learning objectives, rated on a 5-point scale (1=strongly disagree to 5=strongly agree), and 14 items on reactions to the overall course, rated on a 4-point scale (1=strongly disagree to 4=strongly agree). Open-ended questions asked participants about the three most satisfactory and three least satisfactory activities of the course.
3) Participants' assessments of the effects of the course on their knowledge, skills, and attitudes (KSA) toward simulation teaching were obtained with a post-course questionnaire. The items included quantitative ratings and open-ended responses. Modified retrospective pre- and post-course ratings of changes in KSA related to the learning objectives were used, in which the participants answered both the pre- and post-course questionnaires simultaneously on-site after the course. Participants were asked to rate their learning on a 5-point scale (1=strongly disagree to 5=strongly agree) for both the pre- and post-course stages. The questionnaire consisted of 31 items related to the KSA of a simulation instructor. The questions were developed based on the learning objectives defined by the course faculty. Additionally, confidence in the six teaching roles of healthcare educators described by Harden & Crosby [12] was assessed using the same modified retrospective pre- and post-course ratings. All questionnaires were developed based on Kirkpatrick's guide for effective training programs [10]. A total of 4 individuals with expertise in SBHE evaluated the questionnaires for content validity.
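To make the retrospective pre-/post-course rating design concrete, the sketch below shows one possible way to organize such responses and summarize each item as mean±SD, as reported in Table 3. The item wording is taken from the course objectives, but the ratings are hypothetical and the data structure is an illustrative assumption, not the authors' actual data handling.

```python
# Illustrative sketch only: hypothetical ratings, not the study data.
# After the course, each participant rates every KSA item twice on a
# 5-point scale (1=strongly disagree, 5=strongly agree): once looking
# back at the pre-course stage and once for the post-course stage.
import numpy as np

# Hypothetical responses for two of the 31 KSA items (one value per participant).
ksa_items = {
    "Recognize the art of debriefing": {
        "pre":  np.array([3, 2, 4, 3, 3, 2, 4, 3]),
        "post": np.array([4, 4, 5, 4, 4, 4, 5, 4]),
    },
    "Know the concept of teaching with simulation": {
        "pre":  np.array([4, 3, 3, 4, 3, 3, 4, 3]),
        "post": np.array([5, 4, 4, 4, 4, 4, 5, 4]),
    },
}

# Summarize each item as mean +/- SD, mirroring the layout of Table 3.
for item, ratings in ksa_items.items():
    pre, post = ratings["pre"], ratings["post"]
    print(f"{item}: "
          f"pre {pre.mean():.2f}±{pre.std(ddof=1):.2f}, "
          f"post {post.mean():.2f}±{post.std(ddof=1):.2f}")
```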
Quantitative data were analyzed using SPSS (SPSS Inc., Chicago, USA), and p-values of ≤0.05 were considered statistically significant. Pre- and post-course ratings were compared using the Wilcoxon signed rank test.
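As a minimal sketch of the paired comparison described above, the snippet below applies the Wilcoxon signed rank test to hypothetical pre- and post-course ratings for a single learning objective, using SciPy in place of SPSS; the 0.05 significance threshold follows the text.

```python
# Illustrative sketch only: hypothetical paired ratings for one learning
# objective (n=10 participants), not the study data. scipy.stats.wilcoxon
# stands in for the SPSS procedure used by the authors.
import numpy as np
from scipy.stats import wilcoxon

pre = np.array([3, 4, 3, 2, 4, 3, 3, 4, 2, 3])   # retrospective pre-course ratings (1-5)
post = np.array([4, 5, 4, 4, 5, 4, 5, 5, 3, 4])  # post-course ratings (1-5)

# Paired, non-parametric comparison; zero differences would be dropped
# under zero_method="wilcox".
stat, p = wilcoxon(pre, post, zero_method="wilcox")

print(f"Wilcoxon statistic = {stat:.1f}, p = {p:.4f}, "
      f"significant at 0.05: {p <= 0.05}")
```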
The study was exempted from review by the local Institutional Review Board. Participation was voluntary, and informed consent was obtained from each participant who returned a completed questionnaire. Respondent anonymity was maintained throughout data collection and analysis.

RESULTS

Thirty participants completed the course. Twenty-five participants responded to the online pre-course survey on their demographic backgrounds (Table 1). Most participants had less than 3 years of experience as simulation instructors. Educators from various disciplines participated in the course: 13 faculty members from medical schools, 6 from nursing schools, 2 from paramedic schools, 2 from clinical nursing, 1 coordinator, and 1 administrator. The participants held multiple roles as simulation educators at their institutions: instructor (72%), scenario developer (56%), course developer (44%), evaluator (36%), director (28%), researcher (16%), and coordinator (12%). Eighty percent of the participants answered that they spent less than 25% of their time as simulation instructors, but that they would prefer to spend 25% to 50% of their time in that role. The time spent on planning, scenario development, delivering training, research, and administrative work ranged from 10% to 30%.
We asked participants to describe any anticipated barriers to their implementation of individual and organizational goals. The barriers to SBHE implementation with which most participants agreed or strongly agreed were as follows: 1) poor perception and knowledge among stakeholders or leaders of the importance of SBHE; 2) insufficient resources dedicated to SBHE program development, implementation, and ongoing operations; 3) the lack of a standardized characterization of the phase an individual is in as he or she progresses toward skilled, enthusiastic, and sustained use of the intervention; and 4) the high cost of SBHE program development and associated implementation.

1. Evaluation of reaction

Twenty-eight participants agreed or strongly agreed that the course was excellent and that they would highly recommend the course to their colleagues (Table 2). All 30 participants agreed or strongly agreed that the instructors demonstrated thorough knowledge of the subject and presented information in a professional manner. Twenty-eight participants agreed that the course was relevant to their needs in SBHE. The items that drew the most disagreement concerned the instructors presenting information in a clear, understandable manner and the balance between presentation and group involvement.
The top three most satisfactory activities of the course were the small group discussions, the instructors' active involvement, and the lectures on debriefing and research. The three least satisfactory features of the course were insufficient time for the overall course, insufficient time for group discussions, and language barriers. The top three take-home messages perceived by the participants were: 1) the importance of debriefing; 2) developing learning objectives and outcomes; and 3) the necessity of research work.

2. Evaluation of learning

Immediately after the course, participants rated their learning compared with their retrospective pre-course ratings. Statistically significant pre- to post-course differences were found for all learning objectives (p<0.05) (Table 3). Confidence in all six roles of a simulation instructor also improved significantly (Fig. 1).

DISCUSSION

A systematic review by Issenberg et al. [13] describes 10 features and uses of high-fidelity medical simulations that lead to effective learning, including training resources and curricular institutionalization. However, the literature has not focused on the attributes of trained educators. A recent critical review by McGaghie et al. [14] identifies and discusses 12 features and best practices of SBHE that medical educators should know and use, one of which is instructor training. There is a paucity of research demonstrating the effectiveness of faculty development interventions in simulation [15], and no other research has examined the transportability of a SBHE faculty development program from one culture to another. This project measured the effect of a United States-based simulation faculty development model on Korean simulation instructors. The significant differences between the pre- and post-course results represented changes in a positive direction toward improved instructor learning. The course improved participants' confidence in all six teaching roles in simulation.
The current complexity and pressures of healthcare delivery lead to new approaches to teaching and learning. This requires faculty members to be competent in a broad range of strategies for diverse settings, including SBHE. However, untrained faculty could lead to suboptimal integration of simulation curricula, inappropriate simulation scenario design, counterproductive debriefing and feedback, and ineffective assessment and evaluation. A nationwide survey conducted by KoSSH, to which 55 of 97 centers responded, identified faculty development and certification of instructors as the two highest-priority issues in fostering SBHE in Korea (unpublished data). The top three competencies lacking among Korean instructors were designing simulation scenarios, debriefing techniques, and assessment/evaluation methods. Due to the reform of healthcare education mentioned previously, the hardware and logistics of healthcare simulation are bound to grow. Despite this potential growth, there appears to be a shortage of trained professionals who can run SBHE efficiently and effectively.
In the pre-course baseline survey, most instructors reported multi-tasking and spending the majority of their time planning, designing, and running scenarios. The course objectives were targeted to these circumstances. It is very important to review the objectives and the needs of the participants before transporting any foreign course to another culture. A needs assessment frames the problems or opportunities of interest and builds relationships among the people and groups who have a stake in the issue. It also provides the foundation for planning and action to improve learning, training, and performance [16].
Although the project was an adaptation of a well-operated model, it was still challenging to overcome the differences in culture, language, and educational systems. When a facilitator brings a course to another culture, there is a risk of not appreciating the differences in the new environment, which could diminish the impact for course participants. A widespread model for trauma care teaching is the Advanced Trauma Life Support Instructor Course (ATLS IC). A recent observational study demonstrated that the ATLS IC had substantial variations in delivery across continents despite strict regulations [17]. Thus, when facing a new culture, an exported course might undergo local alterations that may affect its impact. When the present course was exported, the facilitators tried to remain completely faithful to the original concept. We believed that no deliberate change to the content or teaching mode was necessary when transferring this program because of the common underlying concepts in SBHE. However, some of the methods had to be adapted to be appropriate for the Korean participants. The program was planned as a 2-day course, instead of 3 days, despite the volume of the material, because the working hours of Korean participants made it very difficult to devote a full 3 days to a course. Although the survey showed less satisfaction with the overall course time and the time for group discussions, a 2-day course is a more realistic program for Korean participants. The original course consists of three sections: computer-based simulation (e.g., full-body, computer-driven/programmable manikins), non-computer-based simulation (e.g., task trainers, standardized patients), and team-based simulation. Until recently, much Korean SBHE has been devoted to mannequin-based simulation. Therefore, the content of the shortened 2-day course focused mainly on mannequin-based simulation, with some of the didactic lectures covering other simulation modules. This gave the participants the opportunity to think about other modules they could implement in future teaching. Realistically, it would be more efficient to offer multiple, shorter 1- to 1½-day courses, each tailored to a more focused and specific area of the instructor's role.
Differences in teaching methods and in active participation and engagement were also key factors for a successful program. The original course is highly interactive, with participants engaged in all the lectures and small group activities. A more formal distinction between teachers and learners exists in Korea, in contrast to the more free-flowing communication that is common between teachers and learners in United States small group teaching [18]. There is also a discrete distinction between the physician and nursing groups, as well as a hierarchy between senior and junior faculty. The nature and culture of Korean society tend toward passivity and quietness in group activities [19]. This could lead to dissatisfaction and inadequate learning, and was one of our main concerns. Because the course is highly interactive, participants were assigned to pre-arranged groups rather than grouped randomly. Similar barriers and obstacles, as well as similar backgrounds and experiences, were considered important factors in grouping the participants.
Language was another challenge. Translation of the materials was completed and checked by two Korean members who had previously attended the United States course. However, at times it was difficult to translate terminology that does not exist in Korean. Much SBHE terminology has yet to be standardized [20]; therefore, additional explanation was needed to clarify many terms used during the course. One suggestion is to have KoSSH organize and standardize the terminology in order to allow more efficient adaptation of foreign educational materials. Language problems were also a challenge on-site. Although the materials were translated into Korean, and the didactic lecture slides were shown in both English and Korean on dual screens, the instruction and facilitation were conducted in English. One of the local faculty members, who spoke fluent English, tried to translate many of the comments on site, but this was not sufficient to cover the whole course. When importing a course to a non-English-speaking culture, it would help to designate time for simultaneous interpretation.
This study has several limitations. First, there was no control group and the subjects were not randomly selected. The participants were selected because of their important teaching positions, and healthcare providers interested in teaching were the most likely to apply for the course; it is therefore possible that they were more motivated than average healthcare providers to improve their teaching. Second, this study lacked a delayed post-course evaluation, so the long-term effects on participants' learning behavior were not assessed. Third, the study was conducted for only one course. Several courses with varied learner groups need to be assessed for a more valid evaluation of the program. Finally, the effectiveness data relied on self-reporting. Satisfaction with the course and self-reported improvement on the learning objectives do not establish whether the participants actually learned from the course, whether they apply what they have learned, or whether there is any impact on outcomes. The higher levels of evaluation in Kirkpatrick's hierarchy [10] should be addressed by evaluating the application and outcome levels through regular follow-up and monitoring of the participants.
With a systematic approach, a well-operated Western SBHE faculty development model can be transported and adapted for use in a non-English-speaking Asian culture. The program improved self-confidence and learning among participants of a simulation-based healthcare education instructor program.

Acknowledgments

The authors wish to acknowledge Professor Michael S. Gordon for his general support of this project. The authors would also like to thank Associate Professor Young Sook Roh for her participation in translating the learning materials into Korean. Finally, we thank all the faculty and staff of the Korean Society for Simulation in Healthcare who contributed to organizing and running the course.

Fig. 1.

Differences in Confidence for the Six Roles as a Simulation Instructor (n=30)

p-values between pre- and post-course ratings were significant (<0.001) by Wilcoxon signed rank test.
a) 5-point scale (1~5), 1=strongly disagree to 5=strongly agree. Mean±SD.
Table 1.
Demographics of Participants (n=25)
Demographic No. (%)
Gender
 Male 9 (36)
 Female 16 (64)
Age
 30~39 13 (52)
 40~49 10 (40)
 ≥50 2 (8)
Professions
 Physician 13 (52)
 Paramedics 10 (40)
 Nurses 2 (8)
Simulation instructor experience
 <1 yr 5 (20)
 1~3 yr 10 (40)
 4~6 yr 8 (32)
 7~9 yr 2 (8)
Self-assessment of their expertise in simulation
 Novice 7 (28)
 Experienced 13 (52)
 Competent 5 (20)
Table 2.
Mean Scores of Reaction to the Overall Course (n=30)
Questionnaire items Scoresa)
My impression of the course was “excellent” 2.30±0.60
The course objectives were clearly stated 2.50±0.70
This course met the defined objectives 2.20±0.55
The facility met all needs of the course 2.47±0.63
The equipment met all needs of the course 2.30±0.60
The food and beverages met all needs of the course 2.40±0.62
The course materials were useful 2.57±0.63
The instructors demonstrated thorough knowledge of the subject 2.83±0.38
The instructors presented information in a clear, understandable manner 2.53±0.68
The instructors presented information in a professional manner 2.77±0.43
The amount of time scheduled was exactly what was needed to meet the course objectives 2.20±0.55
There was a good balance between presentation and group involvement 2.33±0.66
This course relates directly to my job responsibilities 2.50±0.63
I would recommend this course to other teammates 2.57±0.63

a) 4-point scale (1~4), 1=strongly disagree to 4=strongly agree. Mean±SD.

Table 3.
Comparison of Perceived Learning Pre- and Post-Course (n=30)
Questionnaire items Pre-course scoresa) Post-course scoresa) p-value
Identify key components for successful simulation 3.55±0.83 4.24±0.51 <0.001
Outline design and development tools for scenario construction 3.34±0.81 4.21±0.62 <0.001
Know how to develop objectives and modeling for scenario design 3.59±0.63 4.31±0.60 <0.001
Describe the scenario equipment, environments, and fidelity selection 3.59±0.68 4.21±0.49 <0.001
Know how to utilize environments, equipment, people and props for scenario design 3.28±0.75 4.07±0.59 <0.001
Know the concept of teaching with simulation 3.45±0.83 4.17±0.60 <0.001
Know the concept of assessment and debriefing for scenario design 3.00±0.80 3.97±0.50 <0.001
Recognize the advancement of the international simulation research agenda 2.59±1.09 4.03±0.68 <0.001
Recognize the art of debriefing 3.10±0.98 4.21±0.62 <0.001
Know the concept of implementation and evaluation for scenario design 3.45±0.83 4.14±0.52 <0.001
Know the design and development of non-computer-based simulations 3.14±0.74 4.03±0.78 <0.001
Know the design and development of team training simulations 2.93±0.62 3.86±0.64 <0.001

p-values for pre- versus post-course comparisons were calculated by the Wilcoxon signed rank test.

a) 5-point scale (1~5), 1=strongly disagree to 5=strongly agree. Mean±SD.

REFERENCES

1. Aggarwal R, Mytton OT, Derbrew M, Hananel D, Heydenburg M, Issenberg B, MacAulay C, Mancini ME, Morimoto T, Soper N, Ziv A, Reznick R. Training and simulation for patient safety. Qual Saf Health Care 2010;19(Suppl 2):i34-i43.

2. Wayne DB, Barsuk JH, O'Leary KJ, Fudala MJ, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med 2008;3:48-54.

3. Barsuk JH, McGaghie WC, Cohen ER, O'Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med 2009;37:2697-2701.

4. Salas E, DiazGranados D, Klein C, Burke CS, Stagl KC, Goodwin GF, Halpin SM. Does team training improve team performance? A meta-analysis. Hum Factors 2008;50:903-933.

5. Rosen KR, McBride JM, Drake RL. The use of simulation in medical education to enhance students' understanding of basic sciences. Med Teach 2009;31:842-846.

6. Cooper JB, Taqueti VR. A brief history of the development of mannequin simulators for clinical education and training. Postgrad Med J 2008;84:563-570.

7. The Basic & Advanced EuSim Simulation Instructor Course [Internet]. Copenhagen, Denmark: EuSim Group; 2011 [cited 2011 December 1]. Available from: http://www.wfme.org/standards/pgme.

8. Chung HS, Issenberg SB, Phrampus P, Miller G, Je SM, Lim TH, Kim YM. The impact of an international faculty development program on simulation-based healthcare education. Med Teach 2012;34:510.

9. Chung HS, Issenberg SB, Phrampus P, Miller G, Je SM, Lim TH, Kim YM. The impact of an international faculty development program on simulation-based healthcare education. Paper presented at: the 12th Annual International Meeting on Simulation in Healthcare; 2012 January 27-February 1; San Diego, USA.

10. Kirkpatrick DL, Kirkpatrick JD. Implementing the four levels: a practical guide for effective evaluation of training programs. 1st ed. San Francisco, USA: Berrett-Koehler Publishers; 2007.

11. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50.

12. Harden RM, Crosby J. AMEE guide no. 20: the good teacher is more than a lecturer: the twelve roles of the teacher. Med Teach 2000;22:334-347.

13. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10-28.

14. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010;44:50-63.

15. Steinert Y, Mann K, Centeno A, Dolmans D, Spencer J, Gelula M, Prideaux D. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME guide no. 8. Med Teach 2006;28:497-526.

16. Gupta K, Sleezer CM, Russ-Eft DF. American Society for Training and Development. A practical guide to needs assessment. 2nd ed. San Francisco, USA: Pfeiffer; 2007.

17. Kilroy DA. Teaching the trauma teachers: an international review of the advanced trauma life support instructor course. Emerg Med J 2007;24:467-470.

18. Wallace LS. When West meets East: a short-term immersion experience in South Korea. Int J Nurs Educ Scholarsh 2007;4:Article10.

19. Hofstede G, Hofstede GJ, Minkov M. Cultures and organizations: software of the mind. 3rd ed. New York, USA: McGraw-Hill; 2010.

20. Glavin RJ. Skills, training, and education. Simul Healthc 2011;6:4-7.
