Sara L. Swenson, MD
California Pacific Medical Center
Program Size: 31-100 residents
Academic Setting: Community-Based
Clinical Setting: All
We have developed milestone-based evaluation tools for all major inpatient and outpatient rotations, as well as for direct observation, patient and staff 360 evaluations, resident teaching activities (e.g., Residents' Report, journal club), and courses (e.g., Evidence-Based Medicine, Quality Improvement elective). We present examples of our end-of-rotation and 360 evaluations that highlight the process and outcomes of our milestone-based redesign. Our tools are based on a combination of ACGME sub-competencies and/or EPAs (Entrustable Professional Activities). They are rotation-specific, and the tool for each activity was developed by key stakeholder faculty, residents, and staff for that rotation. For our community-based training program, we prioritized feasibility, faculty and resident engagement and ownership, and tools that provided discrete data for performance summaries.

Since each evaluation includes only a subset of sub-competencies, we created a master grid to ensure that our evaluations in toto assess each component of all 22 sub-competencies. All evaluations map directly to ACGME sub-competencies in MedHub so that the summaries can easily be used for ACGME semiannual reporting. Each question maps to relevant portions of one or more ACGME sub-competencies for internal medicine. All questions contain descriptors for each milestone level, ranging from 1 ("Critical Deficiency") to 5 ("Role Model Attending"). Most evaluations use a 5-point scale that maps directly to the ACGME milestone levels. To address faculty preferences and rotation criteria, we designed some evaluations with 3- or 4-point scales; however, each question descriptor is worded to identify a specific milestone level, and it maps to that level in MedHub.
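The master-grid coverage check described above can be sketched programmatically. This is a minimal illustration only; the sub-competency labels and tool-to-sub-competency mappings below are invented placeholders, not the program's actual grid or MedHub data.

```python
# Hypothetical sketch of a master-grid coverage check: confirm that the
# evaluation tools, taken together, assess every required sub-competency.
# Labels here are placeholders, not the real ACGME grid.
REQUIRED = {f"PC{i}" for i in range(1, 6)} | {"MK1", "MK2"}

# Each tool lists the subset of sub-competencies its questions map to.
tools = {
    "ICU end-of-rotation": {"PC1", "PC2", "MK1"},
    "Palliative care end-of-rotation": {"PC3", "PC4", "MK2"},
    "Continuity clinic 360": {"PC5"},
}

covered = set().union(*tools.values())
missing = REQUIRED - covered
if missing:
    print("Uncovered sub-competencies:", sorted(missing))
else:
    print("All required sub-competencies are covered.")
```

A check like this makes gaps visible whenever a tool is revised, so no sub-competency silently drops out of the overall assessment map.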
We use these data for ACGME milestone reporting as well as to provide feedback to residents, frame discussions at Clinical Competency Committee (CCC) meetings, formulate individualized remediation plans, and link assessment to rotation learning objectives and curriculum development. We present the following examples of our redesigned tools: end-of-rotation evaluations by faculty for the (1) ICU, (2) palliative care, and (3) quality improvement rotations; evaluations (4) of residents by interns and (5) of interns by residents for medicine ward rotations; and (6) 360 evaluations by clinic staff for continuity clinic rotations. Each tab on the Excel spreadsheet shows a different evaluation tool. Numbers above the questions show the specific milestone level to which each descriptor corresponds.
Download the Tool
For each evaluation tool, we met with key resident and faculty stakeholders to develop and refine rotation learning objectives. From these, we developed individual evaluation questions that are "EPA-like": discrete, observable activities or behaviors. Faculty and/or residents used Q-sort techniques to prioritize items and discuss how many items best balanced comprehensiveness with feasibility. One faculty member mapped each selected item to specific ACGME sub-competencies and developed descriptors for each milestone level. Faculty and residents then reviewed the tools and provided feedback, and each tool was finalized and entered into our evaluation software (MedHub).
Implementing our evaluation redesign yielded several lessons learned. First, with our initial tools, some faculty members were reluctant to rate residents at the novice end of the milestone scale because the language seemed overly negative. We learned to modify the language of the 22 ACGME sub-competencies to make it simpler and less negative. We also learned to minimize the number of questions to ensure feasibility and reduce evaluator fatigue. Finally, we learned the importance of piloting draft evaluations with faculty and residents to ensure that the behaviors educators say they prioritize are ones they actually have the opportunity to observe.
Involving faculty, residents, and staff in developing evaluation tools constitutes a key element of training. Soliciting and incorporating stakeholder input fosters ownership of our assessment process. Faculty and residents also build skills in direct observation, assessment, evaluation, and feedback through brief didactic sessions during monthly CCC meetings, noon conferences, and faculty development workshops. During CCC meetings, members complete an assessment of a single ACGME sub-competency for each resident discussed. Some faculty also complete group-based rotation evaluations. These practices help us become familiar with the concept of outcomes-based assessment and practice using the language of milestones to discuss residents' current performance and give feedback.
All of our evaluation tools map discretely onto the 22 ACGME sub-competencies for internal medicine. Program leadership and CCC members can then use the milestone report functions in MedHub to produce milestone summaries for each learner. We combine these summaries with CCC assessments and qualitative performance data to determine each learner's milestone levels for NAS reporting. Summaries are also used to give residents formative feedback and to inform CCC decisions regarding resident advancement and remediation.
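The summary step above amounts to aggregating milestone-level ratings by sub-competency across completed evaluations. The sketch below shows one plausible way to do that; the rating tuples and sub-competency labels are hypothetical, and real data would come from the evaluation system rather than a hard-coded list.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical ratings: (sub_competency, milestone_level) pairs drawn from
# a learner's completed evaluations. Labels are placeholders.
ratings = [
    ("PC1", 3), ("PC1", 4), ("PC2", 3),
    ("PC2", 3), ("MK1", 2), ("MK1", 3),
]

# Group milestone levels by sub-competency, then average each group to
# produce a per-learner summary for committee review.
by_subcomp = defaultdict(list)
for subcomp, level in ratings:
    by_subcomp[subcomp].append(level)

summary = {s: round(mean(levels), 1) for s, levels in by_subcomp.items()}
print(summary)  # {'PC1': 3.5, 'PC2': 3.0, 'MK1': 2.5}
```

A committee would treat such averages as a starting point, not a verdict, combining them with qualitative data as the text describes.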
For more information, please contact email@example.com.