Curated Milestones Evaluation Exhibit

Methods for Evaluation Development

Stephanie Halvorson, MD
Oregon Health & Science University

Program Size: 31-100 residents
Academic Setting: University-Based
Clinical Setting: ICU, Inpatient Wards, Outpatient/Continuity Clinic, Patient Safety/QI

Overview

We have submitted our faculty evaluations of residents and interns rotating through the ICU, inpatient wards, and ambulatory clinic experiences, as well as a peer evaluation of a quality improvement project and a “360” evaluation of residents by their clinic medical assistants (MAs). The evaluations use “level of guidance” (supervision) as the rating scale, which we feel is the most intuitive and observable scale for faculty.

We use curricular milestones as the building blocks for our evaluations and then link each one to its respective reporting milestone. The codes shown on an evaluation refer to the curricular milestone addressed by that question. For background: in 2012, the Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Internal Medicine released 142 curricular milestones to define and map competency-based progress through internal medicine residency training, denoting the approximate time by which trainees should reach each milestone. These were a precursor to the “reporting milestones.”
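Conceptually, this linkage is a many-to-one mapping from evaluation questions (each tagged with a curricular milestone code) to reporting milestones. The Python sketch below illustrates the idea only; the codes and groupings are hypothetical examples, not the actual ACGME mapping.

    # Illustrative sketch: each evaluation question carries a curricular
    # milestone code, and several curricular milestones roll up into one
    # reporting milestone. All codes below are hypothetical.
    question_to_curricular = {
        "Q1": "PC-A1",
        "Q2": "PC-A2",
        "Q3": "MK-B1",
    }

    curricular_to_reporting = {
        "PC-A1": "PC1",  # several curricular codes map to reporting PC1
        "PC-A2": "PC1",
        "MK-B1": "MK1",
    }

    def reporting_milestone_for(question: str) -> str:
        """Return the reporting milestone a question feeds into."""
        return curricular_to_reporting[question_to_curricular[question]]

    print(reporting_milestone_for("Q1"))  # -> PC1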

Development

Our process for developing the evaluations involves a “Q-sort,” a psychometric method for classifying items according to the opinions of a group. The 142 curricular milestones were first sorted into skills achieved by 0-12 months and by 12-36 months (representing interns and senior residents, respectively), giving a group of approximately 70 milestones for each level. Each group was further limited to the milestones relevant to a particular rotation, and the 22 most relevant were included in the Q-sort process. For each rotation, we then convened a group of key faculty stakeholders. The 22 milestones were printed on individual sticky notes (see the “Q Sort Ambulatory” attachment for an example), which were then placed on a ‘game board’ (see attachment).

Working in small groups of 2-3, faculty ranked the milestones on two criteria: a) overall importance (from the faculty perspective) and b) the ability to assess the skill within the context of the rotation. This process was repeated for the intern and resident evaluations, and the top-ranked 10-12 items for each were included in the end-of-rotation evaluation. Though time-consuming, this process has the advantage of engaging faculty in deciding which skills to measure and of securing their implicit agreement that evaluating these items is both important and feasible within the context of the rotation. We also added a question about overall performance to each evaluation to serve as an early warning system for residents who may be falling behind.
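For programs that want to tabulate Q-sort results electronically, the Python sketch below shows one plausible aggregation; it is not a description of our exact tabulation. It assumes each small group assigns a rank (1 = highest) on each criterion, ranks are averaged across groups and criteria, and the top N milestones are kept. The milestone labels, group names, and ranks are hypothetical.

    # One plausible Q-sort aggregation (an assumption, not our exact method):
    # average each milestone's rank across groups and criteria, keep the top N.
    from statistics import mean

    # rankings[group][criterion][milestone] = rank (1 = best)
    rankings = {
        "group1": {"importance":    {"M1": 1, "M2": 3, "M3": 2},
                   "assessability": {"M1": 2, "M2": 1, "M3": 3}},
        "group2": {"importance":    {"M1": 2, "M2": 1, "M3": 3},
                   "assessability": {"M1": 1, "M2": 2, "M3": 3}},
    }

    def top_milestones(rankings, n):
        """Average each milestone's rank over all groups/criteria; keep top n."""
        milestones = {m for g in rankings.values() for c in g.values() for m in c}
        avg_rank = {m: mean(c[m] for g in rankings.values() for c in g.values())
                    for m in milestones}
        return sorted(avg_rank, key=avg_rank.get)[:n]

    print(top_milestones(rankings, 2))  # -> ['M1', 'M2']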

The peer evaluation was not created using the Q-sort methodology; instead, the residency program director and associate program director selected the relevant curricular milestones for inclusion.

Lessons Learned

  1. Not all curricular milestones map neatly to reporting milestones.
  2. We selected a "guidance" (supervision) scale for these evaluations because it is more intuitive than a novice-to-expert scale.
  3. The final question (well behind --> right on track) has been critical and has helped us identify struggling interns and residents early on.
  4. Some faculty continue to give "all 5s" despite the descriptive anchors on the form; further faculty education and feedback are needed.

How Used to Inform Decisions about a Learner's Milestone

Each question is automatically linked to its respective reporting milestone, which auto-populates a score in MedHub. Clinical Competency Committee (CCC) members can click a hyperlink to see which questions populated the score for a particular reporting milestone and may adjust the score based on the comments.
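The exact computation inside MedHub is not detailed here; as an illustration only, the sketch below assumes the reporting milestone score is simply the mean of the ratings on its linked questions, and shows the question-level breakdown a CCC member would see behind the hyperlink. All codes and ratings are hypothetical.

    # Illustrative roll-up only: assumes (not verified) that the reporting
    # milestone score is the mean of its linked questions' ratings.
    from statistics import mean

    question_to_reporting = {"Q1": "PC1", "Q2": "PC1", "Q3": "MK1"}

    def reporting_scores(ratings):
        """Roll question-level ratings up into reporting milestone scores."""
        by_milestone = {}
        for question, rating in ratings.items():
            milestone = question_to_reporting[question]
            by_milestone.setdefault(milestone, []).append((question, rating))
        # One (score, contributing questions) pair per reporting milestone.
        return {m: (round(mean(r for _, r in pairs), 1),
                    [q for q, _ in pairs])
                for m, pairs in by_milestone.items()}

    print(reporting_scores({"Q1": 4, "Q2": 5, "Q3": 3}))
    # -> {'PC1': (4.5, ['Q1', 'Q2']), 'MK1': (3, ['Q3'])}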

For more information, please contact halvorss@ohsu.edu.