Enhancing medical students’ documentation skills: the impact of an assessment and feedback program
Abstract
Purpose
We developed a clinical practice program that provides assessment of and feedback on medical students’ medical records, and we evaluated the program’s effectiveness via medical students’ self-assessment of their competence in writing medical records before and after the program.
Methods
In 2022, 74 third-year medical students were divided into four groups and participated in a 2-week program. The students’ medical records were graded daily on a scale from 1 to 3, and the mean scores of the 2 weeks were compared. Students completed a self-assessment survey before and after the program.
Results
The mean scores increased from 1.30 in the first week to 2.14 in the second week. The mean self-assessment scores improved significantly, increasing from 2.43 to 4.00 for documenting medical records, 2.64 to 4.08 for writing the present illness, 2.08 to 3.89 for entering initial orders, 2.35 to 4.34 for signing, and 2.38 to 3.97 for obtaining consent (all p<0.001).
Conclusion
We found that providing students with real-time assessment and feedback on their medical records increased their skills and confidence in medical record writing.
Introduction
Documenting medical records is one of the essential competencies for communication between medical doctors and other health providers [1]; education on medical record documentation is therefore an important area of medical education. Both the Liaison Committee on Medical Education and the Association of American Medical Colleges have identified communication, including written communication, as a critical skill to be taught to medical students [2,3].
In Korea, clinical skills are a core graduation outcome of all medical schools, and documenting medical records is part of these skills. Therefore, medical students learn how to write medical records before entering clinical practice in hospitals. While medical students are exposed to medical records and electronic medical records (EMRs) during clinical rotations, they often have limited opportunities to write initial orders and consent forms, typically focusing on recording patient histories. Notably, EMR documentation by students is rife with copy-and-pasted text, a practice that decreases documentation quality and interferes with diagnostic reasoning [5,6].
In actual practice, a doctor’s notes include orders for treatments and tests for patients. In EMR-dominated clinical practice settings, students seldom understand what fluids, medications, and tests are ordered for patients. Accurate orders are an essential means of communication between doctors and nurses in patient care. Therefore, we developed a clinical practice empowerment program that includes assessment of medical students’ competency in documenting medical records, including the initial order and consent form, together with instant, direct feedback to the students. To evaluate the effectiveness of this program, we surveyed medical students’ self-assessed competence in documenting medical records before and after participation and compared the results to estimate the program’s impact.
Methods
1. Program development and implementation
Between March 2021 and September 2022, the Curriculum Improvement Committee of Gyeongsang National University College of Medicine spent 6 months developing an educational program aimed at enhancing medical competencies. Focusing on content judged to be lacking among students’ graduation competencies, professors developed a clinical practice program in which students document, and are evaluated on, medical records after interviewing patients, with an emphasis on initial orders in the emergency department (ED). Four meetings were held to determine common patient presentations encountered as an ED intern, and the following scenarios were developed: bloody stool, upper gastrointestinal foreign body, abdominal pain, convulsions, ocular trauma, drug intoxication, and ankle trauma. To standardize the assessment process, a sample medical record, initial order, and consent form were developed for each case. Subsequently, a comprehensive scoring system was established to evaluate students’ proficiency and accuracy in completing medical records. The evaluation criteria were organized collaboratively with the participating professors. Medical records were scored daily on a scale from 1 (poor) to 3 (excellent). Evaluation was undertaken immediately after the practicum, and feedback was given instantaneously through a feedback mechanism designed to provide specific, constructive, and immediate comments on students’ documentation skills. The overall grade was a “PASS” if the student’s score in Week 2 was the same as or higher than in Week 1.
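As an illustration of the scoring rule described above, the following minimal sketch shows how a student’s weekly mean scores and overall “PASS” grade could be computed; the data, variable names, and assumed 5 practicum days per week are hypothetical, and this is not part of the study’s actual tooling.

# Minimal sketch of the scoring logic: daily ratings (1=poor to 3=excellent)
# are averaged per week, and the overall grade is "PASS" if the Week 2 mean
# is the same as or higher than the Week 1 mean. Data are hypothetical.
from statistics import mean

week1_scores = [1, 1, 2, 1, 2]  # hypothetical daily scores, Week 1
week2_scores = [2, 2, 3, 2, 3]  # hypothetical daily scores, Week 2

week1_mean = mean(week1_scores)
week2_mean = mean(week2_scores)
overall_grade = "PASS" if week2_mean >= week1_mean else "FAIL"

print(f"Week 1 mean: {week1_mean:.2f}, Week 2 mean: {week2_mean:.2f}, grade: {overall_grade}")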
During the 40-week clinical practice for third-year medical students, the students were divided into four groups, and every 10 weeks one group participated in this 2-week program. Paper clinical practice guidelines were provided to students, who completed general medical records, initial orders, and consent forms through standardized patient (SP)–student interviews during practice. The Curriculum Improvement Committee developed and determined the scenarios, and each group was given one opportunity to interview an SP during the practicum. The SP practice schedule was arranged by staff and randomly assigned.
2. Self-assessment survey
Third-year medical students completed pre- and post-program surveys evaluating their competencies in medical record documentation. The survey items covered documenting medical records accurately, knowing how to write the present illness, entering initial orders, signing records, consulting with other departments, and obtaining informed consent.
3. Ethics statement
The Institutional Review Board of Gyeongsang National University Hospital (GNUHIRB23-08-20) granted ethical approval and waived the requirement for informed consent. The requirement for informed consent from individual students was waived because this was a retrospective study conducted after a regular educational program.
4. Statistical analysis
A paired t-test was used in the study’s primary analysis to compare mean scores between the pre- and post-program surveys. Reliability analysis showed a high level of internal consistency (0.86).
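As an illustration of this analysis, the following minimal sketch shows a paired t-test on pre- and post-program scores and an internal-consistency estimate, assuming the reliability coefficient reported is Cronbach’s alpha; the data, variable names, and item layout are hypothetical and do not reproduce the study’s dataset.

# Illustrative sketch only: paired t-test on pre- vs. post-program scores and a
# Cronbach's alpha estimate of internal consistency. All data are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical pre- and post-program self-assessment scores (1-5 scale)
# for the same students.
pre = np.array([2, 3, 2, 2, 3, 2, 3, 2])
post = np.array([4, 4, 3, 4, 5, 4, 4, 3])

t_stat, p_value = stats.ttest_rel(pre, post)  # paired t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Hypothetical survey matrix: rows = students, columns = survey items.
survey = np.array([
    [4, 4, 3, 4, 4],
    [3, 3, 3, 3, 4],
    [5, 4, 4, 5, 5],
    [4, 3, 4, 4, 3],
])
print(f"Internal consistency (Cronbach's alpha) = {cronbach_alpha(survey):.2f}")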
Results
All 74 students participated in both the pre- and post-program surveys, achieving a 100% response rate. A majority of students positively assessed their competency in documenting medical records after completing the program.
1. Scores of the clinical practice empowerment program
On the initial day, a professor assessed and provided feedback on students’ medical records, assigning a “poor” rating to all initial orders. The initial orders written by students predominantly focused on diagnostic tests, with some including items such as saline and 5% glucose. However, no students wrote the default initial orders, such as checking vital signs, bed rest, nil per os (NPO), monitoring, and fluids, and only two students signed their orders.
The students’ average weekly score increased from 1.30 out of 3 in the first week to 2.14 out of 3 in the second week (Fig. 1). Notably, all students successfully passed the program.
2. Self-assessment survey
Because this program is part of the regular curriculum in which all third-year medical students participate at different times, we could not form two comparison groups; instead, we surveyed the third-year medical students before and after the program. The mean self-assessment scores of competency showed significant improvements, increasing for documenting medical records accurately, writing the present illness accurately, entering initial orders for treatments, signing after documenting medical records, and obtaining consent with the patient’s or guardian’s understanding. The mean differences between the pre- and post-program surveys of the intensive feedback program were all statistically significant at the 0.001 level (Table 1).
After the intensive feedback program, 82.4% of students responded “well” or “very well” to “document medical records accurately.” Furthermore, 83.8% responded “well” or “very well” to “know how to write present illness accurately,” 74.3% to “enter initial orders for treatments,” 85.1% to “sign after documenting medical records,” and 78.4% to “achieve consent with patient or guardian’s understanding” (Table 1). Students rated their competency highest for signing after documenting medical records and lowest for entering initial orders for treatments.
After students completed the intensive feedback program, 89.2% responded that they had felt or learned “much” or “very much” about “I must list disease history in order.” Moreover, 90.5% responded “much” or “very much” about “medical records,” 90.5% about “write orders,” 91.9% about “sign after records,” 100.0% about “the consent with patient or guardian,” and 94.6% about “tests for symptoms” (Table 2). Students felt or learned most strongly about the importance of obtaining consent from patients.
Discussion
Through this program, we found that the students’ average weekly score increased from 1.30 in the first week to 2.14 out of 3 in the second week. Our results show that instant assessment of and direct feedback on medical students’ medical records are very effective in improving their documentation. On-the-job education with feedback on clinical documentation provides a learning opportunity for medical students and is essential to ensure accurate, safe, concise, and timely clinical notes [7]. In semi-structured interviews, fourth-year medical students reported that formal clinical documentation education through lectures and tutorials was minimal, that education varied between teams, and that they received limited feedback on their performance [7]. This is consistent with our finding that direct assessment and immediate feedback helped third-year students, who had been taught about medical records in lectures during their second year and had viewed EMRs during clinical rotations, improve their skills.
In a review of 95 medical students’ SOAP (subjective, objective, assessment, and plan) notes, the most significant problem with completeness was the omission of students’ signatures [8], similar to our results, in which only two students signed their orders. Interestingly, although nearly all medical students did not sign their records and could not enter initial orders, the mean scores of “I do sign” and “I can enter initial orders” were 2.35 and 2.08 out of 5, respectively, in the pre-program survey. This result suggests that medical students do not have an accurate picture of their medical record documentation skills.
Although medical schools train students to ask the most valuable questions during patient interviews when teaching history-taking skills, similar strategies for obtaining the most useful data from EMRs are not routinely taught [9]. Medical students can view patient data through EMRs but rarely write patient records or orders except in their paper clinical practice notes; in the United States, medical students no longer sign or enter orders, even with supervision [7]. A survey of clerkship directors by the Alliance for Clinical Education and the University of Michigan Medical School also showed the limited scope of EMR use by students: 32% of clerkship directors allowed students to view patient records only; 41% allowed students to view patient records and write notes; and 27% allowed students to view patient records, write notes, and enter orders to be cosigned [10]. The findings of this study confirmed that medical students lacked confidence not only in the real-time documentation of patient interviews but also in composing initial orders for patients, even after repeatedly viewing notes in the EMRs during clinical rotations. These findings underscore the need for training programs on medical record writing during clinical practice. Given the inherent methodological limitations of this study and the scarcity of previous research on this topic, further research should be conducted in the future.
In conclusion, we found that providing students with real-time assessment and feedback on their medical records increased their skills and confidence in medical record writing. Therefore, it is necessary to implement a program that incorporates assessment and feedback on medical record writing during clinical practice.
Acknowledgements
None.
Notes
Funding
No financial support was received for this article.
Conflicts of interest
No potential conflict of interest relevant to this article was reported.
Author contributions
Conceptualization: JH, JJ; methodology: JH, YA; validation: JJ; formal analysis: JH, YA; investigation: JH; resources: JJ; data curation: YA; writing–original draft: JH, YA; writing–review & editing: JJ, YA; project administration: JH; and final approval of the version to be published: all authors.