Background: Entrustable Professional Activities (EPAs) are fast becoming the framework for assessing medical student preparedness to deliver safe, high-quality care. The hospital ward environment leads to highly variable teaching and evaluation of EPAs. Accordingly, we felt that the controlled teaching environment of simulation (SIM), combined with a standardized checklist administered by trained faculty, would improve the reliability of EPA education and assessment.

Purpose: 1. We developed a checklist that can be used to assess student performance on EPAs 1 (history and physical examination), 2 (differential diagnosis), 3 (initiate appropriate workup and treatment) and 10 (recognize urgent and emergent situations) during simulation.
2. We used the checklist to identify gaps in student EPA performance and address them in post-SIM debriefing.
3. We sought to understand whether EPA performance during individual SIM differed from performance during group SIM – the standard in medical education.

Description: We developed 4 SIM cases: acute upper gastrointestinal bleed (UGIB), community-acquired pneumonia (CAP), diabetic ketoacidosis (DKA) and acute coronary syndrome (ACS). Twenty-four internal medicine (IM) sub-internship students were assessed in groups of 3 to 4; 8 students were assessed individually. An evidence-based checklist was developed to assess different domains within each of the 4 EPAs. The observing faculty scored each checklist item 1 or 0 based on student performance. An aggregate score for each EPA in each of the 4 cases was obtained to identify knowledge gaps. A score of >70% was defined as passing for a given EPA.

A group vs. individual analysis (Table 1) showed that groups and individuals performed similarly on EPAs 1, 2 and 10, with a trend toward better group performance on EPA 3 (p=0.17). Overall, students performed poorly on EPA 1 and well on EPA 2, whether assessed in groups or individually. When assessed in groups, >50% of students passed EPAs 2, 3 and 10 for the acute UGIB and CAP cases, while <50% of students passed EPA 3 for ACS.

Faculty data from the checklist were used for focused teaching during debriefing. Survey data comparing students' perceived comfort with EPA performance before and after simulation revealed an increased comfort level in performing all 4 EPAs (p < 0.05) (Table 2).

Conclusions: We developed a checklist that can be reliably used to evaluate student performance on different EPAs during SIM. Evaluation of EPA 1 may be limited due to electronic health record reliance outside of SIM. Individual and group EPA evaluation may differ. Using simulation to teach EPAs is well received by internal medicine sub-interns.
