Simulation Improves Clinical, Communication, and Interpersonal Skills of Medical Students in the Real-World Clinical Setting

Holly Womack 1, Chee Paul Lin, MA 2, Amy W Amara, MD, PhD 3, John R Rinker, MD 3,4, Shilpa J Register, OD, PhD, MS 5, Dawn Taylor Peterson, PhD 5,6,7, Alissa A Chitlangia, MD 3, Brannon L Vines, MD 3, Margi Patel, MD 3, Adeel Memon, MD 3, Benjamin A McCullough, MD 3, Benjamin A Jones, MD 3, Bryan L Smelser, MD 3, Frank S Benesh, MD 3, Leslie E Perry, MD 3,8, Manmeet Kaur, MD 3, Wolfgang G Muhlhofer, MD 3,8

1 The University of Alabama at Birmingham (UAB) School of Medicine; Birmingham, AL; USA  2 Center for Clinical and Translational Science, UAB; Birmingham, AL; USA  3 Department of Neurology, UAB; Birmingham, AL; USA  4 Birmingham Veterans Affairs Medical Center; Birmingham, AL; USA  5 Office for Interprofessional Simulation for Innovative Clinical Practice, UAB; Birmingham, AL; USA  6 School of Health Professions, Department of Health Services Administration, UAB; Birmingham, AL; USA  7 School of Medicine, Department of Medical Education, UAB; Birmingham, AL; USA  8 Epilepsy Center, UAB; Birmingham, AL; USA

BACKGROUND

The use of simulation in health professions education has increased rapidly over the past two decades, including in the medical school setting. Multiple studies have shown the positive effect of simulation-based training on self-confidence and clinical performance, on teamwork and interpersonal communication skills, and on adherence to medical algorithms. Yet its direct impact on clinical and interpersonal communication skills in “real-life” patient encounters has never been evaluated in a systematic fashion. The goal of this study was to determine whether simulation can improve the clinical, communication, and interpersonal skills of medical students in the real-world clinical setting.

METHODS

A prospective cohort of 36 third-year medical students was randomly selected from a pool of volunteers to undergo a real-life encounter with a neurological patient before and after actively participating in a simulated encounter caring for a patient with a neurological complaint, as part of their regular clerkship experience (Figure 1). Each simulation was followed by a debriefing session in which the students’ approach to clinical care and their communication skills were discussed. The students’ performances during the real-life encounters were evaluated using the Mini Clinical Evaluation Exercise (Mini-CEX) tool, a linear 9-point rating scale that assesses clinical and communication skills by direct observation (Figure 2). Changes in Mini-CEX scores from the pre- to the post-simulation encounter were analyzed using a paired t-test. The associations of encounter and cohort characteristics with the students’ performances were assessed using a linear mixed model with a compound symmetry covariance structure. Multivariable linear mixed models were fitted to assess the changes in the students’ performances while adjusting for potential confounding variables. All analyses were conducted using SAS 9.3 (SAS Institute, Cary, NC). A p-value < 0.05 was considered statistically significant in two-tailed tests.
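The analyses above were run in SAS; purely as an illustration, the following is a minimal Python sketch of an equivalent workflow, assuming a hypothetical long-format dataset (one row per student per encounter) with made-up column names such as student_id, encounter, clinical_competence, case_complexity, and clinical_experience. A random intercept per student is used because it induces the compound symmetry covariance structure described above; this is a sketch under those assumptions, not the authors’ actual SAS code.

    # Illustrative re-analysis sketch (hypothetical column names; not the authors' SAS program).
    import pandas as pd
    from scipy import stats
    import statsmodels.formula.api as smf

    # Assumed long format: one row per student per encounter,
    # with "encounter" coded as "pre" or "post" and one column per Mini-CEX domain.
    df = pd.read_csv("mini_cex_scores.csv")

    # Paired t-test on one domain: reshape to wide so each student contributes a pre/post pair.
    wide = df.pivot(index="student_id", columns="encounter", values="clinical_competence")
    t_stat, p_value = stats.ttest_rel(wide["post"], wide["pre"])
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

    # Linear mixed model: a random intercept per student induces a compound symmetry
    # covariance structure; additional covariates adjust for potential confounders.
    model = smf.mixedlm(
        "clinical_competence ~ encounter + case_complexity + clinical_experience",
        data=df,
        groups=df["student_id"],
    )
    result = model.fit(reml=True)
    print(result.summary())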

RESULTS

In bivariate analysis, the medical students’ Mini-CEX performance in the post-simulation compared with the pre-simulation encounter showed a significant increase, ranging from 0.94 to 1.16 points, in all Mini-CEX domains except counseling. For the majority of students, this translated into a category jump from borderline satisfactory or satisfactory to a high satisfactory or even superior performance for clinical and communication skills. Greater clinical experience was associated with a higher score for efficiency and organization (p=0.019). Students whose encounters focused primarily on data gathering and diagnosis rather than on treatment scored higher on the History Taking Skills domain of the Mini-CEX (p=0.04). After adjusting for confounding variables (Tables 1 and 2), significant improvement was observed in all Mini-CEX domains except counseling (p<0.05; Figure 3). Students who participated in the study achieved a mean score on the Neurology NBME exam of 86.4±7.3, compared with 86.3±5.5 for the class as a whole. Participants were also more likely to earn a clerkship grade of Honors (42% vs. 37%) than the class as a whole. Four students (11.4%) were considering a career in clinical neuroscience (i.e., neurology or psychiatry) at the end of their third year, which is higher than the match rate into clinical neuroscience of 1.7 to 6.5% observed over the past three years at UAB (Table 2).

Table 1 - Encounter Details (Pre-Simulation vs. Post-Simulation Encounter)

Duration in min (mean±SD): 33.8±7.1 vs. 36.4±10.9
Patient Age in years (mean±SD): 51.6±16.7 vs. 54.8±17.1
Patient Sex (n; %): Male 13 (36.1) vs. 21 (58.3); Female 23 (63.9) vs. 15 (41.7)
Case Complexity (n; %): Low 2 (5.6) vs. 3 (8.3); Moderate 25 (69.4) vs. 26 (72.2); High 9 (25.0) vs. 7 (19.5)
Case Focus (n; %): Data Gathering Only 16 (44.4) vs. 13 (36.1); Data Gathering and Diagnosis 10 (27.8) vs. 17 (47.2); Data Gathering, Diagnosis and Therapy 3 (8.3) vs. 1 (2.8); All Aspects 7 (19.5) vs. 5 (13.9)
Case Setting (n; %): Inpatient 35 (97.2) vs. 35 (97.2); Outpatient 1 (2.8) vs. 1 (2.8)
Years of Postgraduate Training of Evaluator (mean±SD): 5.11±1.24 vs. 5.50±1.34

Table 2 - Cohort Features

Academic Performance obtained at End of Clerkship Block
  National Board of Medical Examiners (NBME) Score (mean±SD): 86.4±7.3

Clinical Performance obtained at End of Clerkship Block
  Clerkship Grade (n; %): Fail 1 (2.7); Pass 6 (16.7); High Pass 14 (38.9); Honors 15 (41.7)

Clinical Experience (n; %): Beginning of Academic Year 12 (33.3); Halfway through the Academic Year 10 (27.8); End of Academic Year 14 (38.9)

Match Preference at End of Third Year* (n; %): Surgical/Procedural-Based Specialty(a) 9 (25.7); Medical, Non-Neuroscience(b) 18 (51.5); Medical, Neuroscience(c) 4 (11.4); Pediatrics 4 (11.4)

Participation in Interprofessional (IP) Simulation(+) (n; %): 3 (8.3)

* Information only available for 35/36 students; + IP simulation with nursing students at a similar level of training
a Orthopedics, anesthesia, oral and maxillofacial surgery, general surgery, diagnostic radiology, and otolaryngology
b Dermatology, ophthalmology, OB/GYN, family medicine, internal medicine, physical medicine and rehabilitation, and emergency medicine
c Neurology and psychiatry

SETTING AND PARTICIPANTS

Third-year medical students at The University of Alabama at Birmingham School of Medicine were enrolled during their Neurology Clerkship across 12 sequential clerkship blocks (July 2017 to June 2018). A total of 36 students completed the study.

Generously funded by the UAB Center for Teaching and Learning Quality Improvement Program Teaching Innovation Award 2017/2018.

DISCUSSION

Several randomized controlled studies comparing the impact of simulation-based with lecture-based education on the performance of medical students and junior residents on standardized written and practical exams have shown that simulation leads not only to significant immediate but also to long-term improvement of academic performance and procedural skills.1-4 Unlike the present study, however, none of these studies assessed the potential impact of simulation on the clinical, interpersonal, and communication skills of medical trainees during “real-life” clinical encounters. Yet a standardized rating of performance in all of these domains during an encounter with an actual patient provides a more comprehensive assessment of the trainee’s overall understanding of the medical subject, critical thinking skills, and correct application of medical knowledge.5 Most of these aspects are hard to reproduce with a written or standardized practical exam (e.g., an OSCE), and the latter in particular carries the risk of merely assessing a “practice effect” rather than actual comprehension and critical application of the acquired medical knowledge.1-3 At the same time, simulation-based education provides a safe environment for medical trainees to work on all of these pertinent skills and creates an emotional response that, if tied to the learning experience, can facilitate long-lasting learning.2,3

CONCLUSION

This prospective cohort study directly links active participation in a simulation with a significant improvement in the clinical, communication, and interpersonal skills of health professionals in training, assessed in a real-world clinical setting. This finding holds true regardless of the setting and focus of the clinical encounter and of the learner’s clinical experience, academic interest, or final academic performance. It supports the use of simulation as a practical, hands-on educational tool for the well-rounded training of future health professionals. Further prospective studies assessing the impact of simulation on actual patient care, outcomes, and patient satisfaction could strengthen the case for simulation in health care.

REFERENCES

1. Alluri RK, Tsing P, Lee E, Napolitano J. A randomized controlled trial of high-fidelity simulation versus lecture-based education in preclinical medical students. Medical Teacher 2016;38:404-409.
2. Beal MD, Kinnear J, Anderson CR, Martin TD, Wamboldt R, Hooper L. The effectiveness of medical simulation in teaching medical students critical care medicine: a systematic review and meta-analysis. Simulation in Healthcare 2017;12(2):104-115.
3. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, Erwin PJ, Hamstra SJ. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306(9):978-988.
4. Rogers GD, McConnell HW, Jones de Rooy N, Ellem F, Lombard M. A randomised controlled trial of extended immersion in multi-method continuing simulation to prepare senior medical students for practice as junior doctors. BMC Medical Education 2014;14:90.
5. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA 2009;302:1316-1326.

Figure 3 – Mini-CEX scores (scale 1-9) for the pre-simulation (blue bars) and post-simulation encounters (red bars) across the seven domains: History Taking Skills, Physical Exam Skills, Humanism and Professionalism, Clinical Judgement, Counseling Skills, Efficiency and Organization, and Clinical Competence. An asterisk (*) marks a significant improvement in the Mini-CEX score with p<0.05.

Figure 1 – Study Protocol

Figure 2 – Mini-CEX Tool. A two-step approach is recommended for using the nine-point scale: 1. Determine whether the performance was satisfactory, unsatisfactory, or superior. 2. Determine which of the three possible ratings best reflects the observed trainee-patient encounter within the selected performance category. A rating of four (4), while classified as “satisfactory”, is defined as “marginal” and conveys the need to improve performance.
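As a purely illustrative aid, the first step of this rating logic can be expressed as a small helper, assuming the conventional Mini-CEX banding of 1-3 unsatisfactory, 4-6 satisfactory (with 4 flagged as marginal), and 7-9 superior; these band boundaries are an assumption and are not spelled out on the poster.

    def mini_cex_category(score: int) -> str:
        # Step 1: assign the performance band (assumed conventional Mini-CEX banding).
        if not 1 <= score <= 9:
            raise ValueError("Mini-CEX scores range from 1 to 9")
        if score <= 3:
            return "unsatisfactory"
        if score <= 6:
            # A score of 4 counts as satisfactory but is flagged as marginal.
            return "satisfactory (marginal)" if score == 4 else "satisfactory"
        return "superior"
        # Step 2 (choosing the exact rating within the band) is done by the rater
        # during direct observation and is not reproducible in code.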
