Enhancing medical students’ documentation skills: the impact of an assessment and feedback program

Article information

Korean J Med Educ. 2024;36(3):335-340
Publication date (electronic) : 2024 August 29
doi: https://doi.org/10.3946/kjme.2024.307
1Gyeongsang National University College of Medicine, Jinju, Korea
2Gyeongsang Institute of Health Sciences, Jinju, Korea
3Gyeongsang National University Hospital, Jinju, Korea
Corresponding Author: Ji-hyun Seo (https://orcid.org/0000-0002-0691-3957) Department of Pediatrics, Gyeongsang National University College of Medicine, 15 Jinju-daero 816beon-gil, Jinju 52727, Korea Tel: +82.55.750.8731 Fax: +82.55.750.8731 email: seozee@gnu.ac.kr
Received 2024 January 21; Revised 2024 April 23; Accepted 2024 June 3.

Abstract

Purpose

We developed a clinical practice program for the assessment of and feedback on medical students’ medical records, and we evaluated the effectiveness of this program through students’ self-assessment of their competence in writing medical records before and after the program.

Methods

In 2022, 74 third-year medical students were divided into four groups and participated in a 2-week program. The students’ medical records were graded daily on a scale from 1 to 3, and the mean scores of the 2 weeks were compared. A student self-assessment survey was conducted before and after the program.

Results

The mean scores increased from 1.30 in the first week to 2.14 in the second week. The mean self-assessment scores showed significant improvements, increasing from 2.43 to 4.00 for documenting medical records, from 2.64 to 4.08 for writing the present illness, from 2.08 to 3.89 for entering initial orders, from 2.35 to 4.34 for signing, and from 2.38 to 3.97 for obtaining consent (all p<0.001).

Conclusion

We found that providing students with real-time assessment and feedback on their medical records increased their skills and confidence in medical record writing.

Introduction

Documenting medical records is an essential competency for communication between medical doctors and other health providers [1], and education in medical record writing is therefore an important area of medical education. Both the Liaison Committee on Medical Education and the Association of American Medical Colleges have identified communication, including written communication, as a critical skill to be taught to medical students [2,3].

In Korea, clinical skills are among the primary outcomes of all medical schools, and documenting medical records is part of these skills. Therefore, medical students learn how to write medical records before entering clinical practice in hospitals. Although medical students are exposed to medical records and electronic medical records (EMRs) during clinical rotations, they often have limited opportunities to write initial orders and consent forms and typically focus on recording patient histories. Notably, EMR documentation by students is rife with copy-and-paste text, a practice that decreases documentation quality and interferes with diagnostic reasoning [5,6].

In actual practice, a doctor’s notes include orders for treatments and tests for patients. In EMR-dominated clinical practice settings, students seldom understand what fluids, medications, and tests are ordered for patients, even though accurate orders are an essential means of communication between doctors and nurses. Therefore, we developed a clinical practice empowerment program that includes assessment of medical students’ competency in documenting medical records, including the initial order and consent form, together with instant, direct feedback to the students. To evaluate the effectiveness of this program, we conducted a self-assessment survey of students’ competence in documenting medical records before and after participation and compared the results to examine the program’s impact.

Methods

1. Program development and implementation

Over a 6-month period between March 2021 and September 2022, the Curriculum Improvement Committee of Gyeongsang National University College of Medicine developed an educational program aimed at enhancing medical competencies. Focusing on content judged to be lacking among students’ graduation competencies, professors developed a clinical practice program in which students document, and are evaluated on, medical records after interviewing patients, with an emphasis on initial orders in the emergency department (ED). Four meetings were held to identify common patient presentations encountered by an ED intern, and the following scenarios were developed: bloody stool, upper gastrointestinal foreign body, abdominal pain, convulsions, ocular trauma, drug intoxication, and ankle trauma. To standardize the assessment process, a sample medical record, initial order, and consent form were developed for each case. Subsequently, a comprehensive scoring system was established to evaluate students’ proficiency and accuracy in completing medical records, with evaluation criteria organized collaboratively by the participating professors. Records were graded daily on a scale from 1 (poor) to 3 (excellent). The evaluation was performed immediately after each practicum, and feedback was given on the spot through a feedback mechanism designed to provide specific, constructive, and immediate comments on students’ documentation skills. The overall grade was a “PASS” if a student’s score remained the same or increased from week 1 to week 2.
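For illustration only, the following Python sketch shows how a student’s weekly mean scores and the pass decision described above could be computed. The daily ratings are hypothetical, not study data, and the “FAIL” label for the alternative outcome is an assumption (the study reports that all students passed).

```python
# Illustrative sketch (hypothetical data): daily 1-3 ratings are averaged per
# week, and the overall grade is "PASS" when the week-2 mean is equal to or
# higher than the week-1 mean.
from statistics import mean

# Hypothetical daily scores (1 = poor, 3 = excellent) for one student.
week1_scores = [1, 1, 2, 1, 2]
week2_scores = [2, 2, 3, 2, 2]

week1_mean = mean(week1_scores)
week2_mean = mean(week2_scores)

# "FAIL" is an assumed label for the alternative outcome; the source only
# defines the PASS condition.
grade = "PASS" if week2_mean >= week1_mean else "FAIL"
print(f"week 1: {week1_mean:.2f}, week 2: {week2_mean:.2f}, grade: {grade}")
```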

During the 40-week clinical practice period for third-year medical students, the students were divided into four groups, and every 10 weeks one group participated in this 2-week program. Paper clinical practice guidelines were provided, and students completed general medical records, initial orders, and consent forms based on standardized patient (SP) interviews during the practicum. The scenarios were developed and selected by the Curriculum Improvement Committee, and each group was given one opportunity to interview an SP during the practicum. The SP practice schedule was arranged by the staff and assigned randomly.

2. Self-assessment survey

Third-year medical students completed pre- and post-program surveys evaluating their competencies in medical record documentation. The survey items included documenting medical records accurately, knowing how to write the present illness, entering initial orders, signing records, consulting with other departments, and obtaining informed consent.

3. Ethics statement

The Institutional Review Board of Gyeongsang National University Hospital granted ethical approval (GNUHIRB23-08-20) and waived the requirement for informed consent from individual students because this was a retrospective study of a regular educational program.

4. Statistical analysis

A paired t-test was used in the study’s primary analysis to compare mean scores between the pre- and post-program surveys. Reliability analysis showed a high level of internal consistency (0.86).
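As an illustrative sketch only (not the study’s actual analysis code or data), the Python example below shows how a paired t-test and an internal-consistency estimate could be computed on pre/post Likert-scale items; Cronbach’s α is assumed here as the reliability statistic, and the column names and simulated scores are hypothetical.

```python
# Illustrative sketch: paired t-test on pre/post self-assessment scores and
# Cronbach's alpha for internal consistency, using hypothetical data.
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical wide-format data: one row per student, one column per item
# (5-point Likert scores), measured pre and post.
rng = np.random.default_rng(0)
items = [f"item{i}" for i in range(1, 6)]
pre = pd.DataFrame(rng.integers(1, 4, size=(74, 5)), columns=items)
post = pd.DataFrame(rng.integers(3, 6, size=(74, 5)), columns=items)

# Paired t-test for each item (pre vs. post).
for col in items:
    res = stats.ttest_rel(pre[col], post[col])
    print(f"{col}: t={res.statistic:.3f}, p={res.pvalue:.3g}")

def cronbach_alpha(scores: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

print("internal consistency (post):", round(cronbach_alpha(post), 2))
```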

Results

All 74 students participated in both the pre- and post-program surveys, a 100% response rate. After completing the program, a majority of students positively assessed their competency in documenting medical records.

1. Scores of the clinical practice empowerment program

On the initial day, a professor assessed and provided feedback on the students’ medical records, assigning a “poor” rating to all initial orders. The initial orders written by students predominantly focused on diagnostic tests, with some including items such as saline or 5% glucose. However, no student wrote the initial orders in the standard sequence (e.g., checking vital signs, bed rest, nil per os, monitoring, fluids), and only two students signed their orders.

The students’ average weekly scores increased from 1.30 out of 3 in the first week to 2.14 out of 3 in the second week (Fig. 1). Notably, all students passed the program.

Fig. 1.

The Average Scores of Students’ Medical Records in the First and Second Weeks (N=74)

The average weekly scores of students increased from an average of 1.30 out of 3 in the first week to an average of 2.14 out of 3 in the second week.

2. Self-assessment survey

Because this program is part of the regular curriculum, in which all third-year medical students participate at different times, we could not divide the students into two comparison groups; instead, we surveyed them before and after the program. The mean self-assessment scores of competency improved significantly for documenting medical records accurately, writing the present illness accurately, entering initial orders for treatments, signing after documenting medical records, and obtaining consent from the patient or guardian. All differences between the pre- and post-intensive feedback program scores were statistically significant at the 0.001 level (Table 1).


After the intensive feedback program, 82.4% of students responded “well” or “very well” to “document medical records accurately.” Furthermore, 83.8% responded “well” or “very well” to “know how to write present illness accurately,” 74.3% to “enter initial orders for treatments,” 85.1% to “sign after documenting medical records,” and 78.4% to “achieve consent with patient or guardian’s understanding” (Table 1). Students rated their competency highest for signing after documenting medical records and lowest for entering initial orders for treatments.

After completing the intensive feedback program, 89.2% of students reported having felt or learned “much” or “very much” about “I must list the disease history in order.” Moreover, 90.5% responded “much” or “very much” regarding documenting medical records, 90.5% regarding writing orders, 91.9% regarding signing after records, 100.0% regarding obtaining consent from the patient or guardian, and 94.6% regarding tests for symptoms (Table 2). Students felt or learned that obtaining consent from patients was the most important.


Discussion

Through this program, we found that the students’ average weekly scores increased from 1.30 out of 3 in the first week to 2.14 out of 3 in the second week. Our results show that instant assessment of and direct feedback on medical students’ medical records are very effective in improving their documentation. On-the-job education with feedback on clinical documentation provides a learning opportunity for medical students and is essential to ensure accurate, safe, concise, and timely clinical notes [7]. In semi-structured interviews, fourth-year medical students reported that formal clinical documentation education through lectures and tutorials was minimal, that education varied between teams, and that they received limited feedback on their performance [7]. This is consistent with our finding that direct assessment and immediate feedback helped third-year students, who had been taught about medical records in second-year lectures and had viewed EMRs during clinical rotations, improve their skills.

In a review of 95 medical students’ SOAP (subjective, objective, assessment, and plan) notes, the most significant problem with completeness was the omission of students’ signatures [8]; likewise, in our results, only two students signed their records. Interestingly, although nearly all students did not sign their records and could not enter the initial orders, the mean pre-program scores for “I sign after medical records” and “I can enter initial orders” were 2.35 and 2.08 out of 5, respectively. This result suggests that medical students do not have an accurate picture of their medical record documentation skills.

Although medical schools train students to ask the most valuable questions during patient interviews when teaching history-taking skills, similar strategies for obtaining the most useful data from EMRs are not routinely taught [9]. Medical students can view patient data through EMRs but rarely write patient records or orders beyond their paper clinical practice notes; in the United States, medical students no longer sign or enter orders, even under supervision [7]. A survey of clerkship directors by the Alliance for Clinical Education and the University of Michigan Medical School also showed the limited scope of EMR use by students: 32% of clerkship directors allowed students only to view patient records; 41% allowed students to view records and write notes; and 27% allowed students to view records, write notes, and enter orders to be cosigned [10]. The findings of this study confirmed that medical students lacked confidence not only in the real-time documentation of patient interviews but also in composing initial orders for patients, even after repeatedly viewing notes in the EMRs during clinical rotations. This underscores the need for and importance of training programs on medical record writing during clinical practice. This study has inherent methodological limitations, and previous research on this topic is scarce; therefore, further research should be expanded in the future.

In conclusion, we found that providing students with real-time assessment and feedback on their medical records increased their skills and confidence in medical record writing. Therefore, it is necessary to implement a program that incorporates assessment and feedback on medical record writing during clinical practice.

Acknowledgements

None.

Notes

Funding

No financial support was received for this article.

Conflicts of interest

No potential conflict of interest relevant to this article was reported.

Author contributions

Conceptualization: JH, JJ; methodology: JH, YA; validation: JJ; formal analysis: JH, YA; investigation: JH; resources: JJ; data curation: YA; writing–original draft: JH, YA; writing–review & editing: JJ, YA; project administration: JH; and final approval of the version to be published: all authors.

References

1. Gliatto P, Masters P, Karani R. Medical student documentation in the medical record: is it a liability? Mt Sinai J Med 2009;76(4):357–364.
2. Functions and Structure of a Medical School: Standards for Accreditation of Medical Education Programs Leading to the MD Degree. http://lcme.org/publications/. Published 2016. Accessed January 10, 2024.
3. Core Entrustable Professional Activities for Entering Residency. http://aamc.org. Published 2016. Accessed January 10, 2024.
4. Biagioli FE, Elliot DL, Palmer RT, et al. The electronic health record objective structured clinical examination: assessing student competency in patient interactions while using the electronic health record. Acad Med 2017;92(1):87–91.
5. Heiman HL, Rasminsky S, Bierman JA, et al. Medical students’ observations, practices, and attitudes regarding electronic health record documentation. Teach Learn Med 2014;26(1):49–55.
6. Hartzband P, Groopman J. Off the record: avoiding the pitfalls of going electronic. N Engl J Med 2008;358(16):1656–1658.
7. Rowlands S, Coverdale S, Callen J. Documentation of clinical care in hospital patients’ medical records: a qualitative study of medical students’ perspectives on clinical documentation education. Health Inf Manag 2016;45(3):99–106.
8. Seo JH, Kong HH, Im SJ, et al. A pilot study on the evaluation of medical student documentation: assessment of SOAP notes. Korean J Med Educ 2016;28(2):237–241.
9. Zavodnick J, Kouvatsos T. Electronic health record skills workshop for medical students. MedEdPORTAL 2019;15:10849.
10. Hammoud MM, Margo K, Christner JG, Fisher J, Fischer SH, Pangaro LN. Opportunities and challenges in integrating electronic health records into undergraduate medical education: a national survey of clerkship directors. Teach Learn Med 2012;24(3):219–224.


Table 1.

The Results of the Self-assessment Survey

Mean difference of scores between the pre- and post-intensive feedback program

I can ...  No.  Mean±SD  t-value  p-value
Document medical records accurately    -12.868  <0.001
 Pre  74  2.43±0.845
 Post  74  4.00±0.641
Know how to write present illness accurately    -10.309  <0.001
 Pre  74  2.64±1.028
 Post  74  4.08±0.678
Enter initial orders for treatments    -12.392  <0.001
 Pre  74  2.08±1.120
 Post  74  3.89±0.674
Sign after medical records    -12.757  <0.001
 Pre  74  2.35±1.152
 Post  74  4.34±0.727
Achieve consent with patient or guardian’s understanding    -12.955  <0.001
 Pre  74  2.38±1.043
 Post  74  3.97±0.640

Students’ confidence in their clinical skills after the intensive feedback program (N=74)

I can ...  Not at all  Not well  So and so  Well  Very well
Document medical records accurately  -  1 (1.4)  12 (16.2)  47 (63.5)  14 (18.9)
Know how to write present illness accurately  -  1 (1.4)  11 (14.9)  43 (58.1)  19 (25.7)
Enter initial orders for treatments  -  1 (1.4)  18 (24.3)  43 (58.1)  12 (16.2)
Sign after medical records  -  -  11 (14.9)  27 (36.5)  36 (48.6)
Achieve consent with patient or guardian’s understanding  -  -  16 (21.6)  44 (59.5)  14 (18.9)

Data are presented as number, mean±SD, or number (%).

SD: Standard deviation.

Table 2.

The Degree of Change in Students’ Confidence in Clinical Skills after the Intensive Feedback Program

I can do ...  Not at all  Not much  So and so  Much  Very much
History taking - - 8 (10.8) 49 (66.2) 17 (23.0)
Document medical records - - 7 (9.5) 35 (47.3) 32 (43.2)
Enter initial orders - - 7 (9.5) 37 (50.0) 30 (40.5)
Sign - - 6 (8.1) 29 (39.2) 39 (52.7)
Get consent - - - 37 (50.0) 37 (50.0)
Tests for diagnosis - - 4 (5.4) 48 (64.9) 22 (29.7)

Data are presented as number (%).