“I know it when I see it" is famously associated with Justice Potter Stewart's concurring opinion in the Jacobellis v. Ohio case in 1964. It refers to his struggle to define "hard-core pornography" for legal purposes, ultimately concluding that while he could identify it, he couldn't provide a definitive description. In today’s PAAD and accompanying editorial, Sun et al.1 and Lockman and Schwartz2 respectively take a deep dive into answering the question: “Attaining Anesthesiology Clinical Competence during Graduate Medical Education: Can We Rely on What We See?”
In today’s PAAD we’ll review definitions of competency and the Accreditation Council for Graduate Medical Education (ACGME) competency-based evaluation system, called “Milestones.”3 How did we arrive at Milestones as a method to document competence? And do the Milestones provide a valid, structured framework for assessing the developmental progression of residents along key competencies and their respective subcompetencies?
As a lifelong educator (without an MSEd), I must admit that over my 40+ years in practice, the monthly resident and fellow assessment of competency always boiled down to one question for me: “Would I trust this resident/fellow to anesthetize my children or (now) grandchildren?” More than the formal evaluations and metrics, I relied on “I know it when I see it.” Not exactly science or an evidence-based assessment! Our regular education reviewers (who could not participate in today’s PAAD because they wrote the editorial) and the current leaders in graduate medical education would scoff and probably bust a gut at my definition, but to be honest this was the ultimate test for me. Myron Yaster MD
Editorial
Lockman JL, Schwartz AJ. Attaining Anesthesiology Clinical Competence during Graduate Medical Education: Can We Rely on What We See? Anesthesiology. 2025 May 1;142(5):790-792. doi: 10.1097/ALN.0000000000005408. Epub 2025 Apr 8. PMID: 40197450.
Original article
Sun T, Park YS, Chen F, Hogan SO, Tanaka P. Longitudinal Reliability of Milestones Learning Trajectories during Anesthesiology Residency. Anesthesiology. 2025 May 1;142(5):918-928. doi: 10.1097/ALN.0000000000005388. Epub 2025 Jan 27. PMID: 39869633; PMCID: PMC11974626.
“Competence is the summative (cumulative) goal for which teachers educate and learners strive. Competence for the anesthesiologist indicates possessing qualifications to practice independently providing effective, efficient, and safe care of patients who have bestowed their personal trust upon the clinician.”2 “Historically, when an anesthesiology trainee completed time-based training (a “prescribed time-in-residency” that was the same for everyone) and experienced structure-based training (listing completed cases), they were deemed competent. As many have no doubt witnessed, time- and structure-based education do not guarantee competence. Some gain competence earlier/with fewer exposures than others, while some require more/far longer.”2
“Recognizing this inherent inadequacy, the Accreditation Council for Graduate Medical Education (ACGME) developed a competency-based evaluation system, called “Milestones.” Its six core competencies—patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice—and their 25 specific subcompetencies, which are “graded” on a 5-point scale with anchors for behaviors at each level, provide a paradigm to articulate expectations for trainee development.”3 (Edgar L, McLean S, Hogan SO, Hamstra S, Holmboe ES: The Milestones Guidebook, version 2020. Accreditation Council for Graduate Medical Education. Available at: https://www.acgme.org/globalassets/milestonesguidebook.pdf) “The target for graduation is level 4 in each subcompetency. ACGME explicitly states that meeting level 4 is not a requirement for graduation and independent practice. Level 5 is considered aspirational.”3
“The goal of Sun et al.1 was to identify and validate the usefulness of Milestones 1.0 data. By identifying patterns in learning trajectories over time, they theorized that a developmental perspective on assessment data could facilitate an understanding of how trainee competence evolves. Applying these patterns, if valid, could inform the timing and modalities of educational interventions for both learners and training programs and might provide programs benchmarking goals identified through national data comparisons.”1,2
“The study included a total of 682,475 ratings for 4,976 learners in 140 programs. Growth curve model results indicated that the mean baseline Milestone rating across the 25 subcompetencies was 2.05 (95% CI, 1.96 to 2.14), with an average increase of 0.49 (95% CI, 0.48 to 0.51) units per reporting period. The growth rate reliability (mean ± SD, 0.58 ± 0.03) suggested a moderate capability of anesthesiology Milestones to detect individual differences in the growth of latent competency. Growth curve reliability estimates (mean ± SD, 0.71 ± 0.02) suggested acceptable overall reliability of Milestones across all six assessment points. Significant variability was observed at both the program and learner levels (P < 0.001). Latent class growth analyses identified 3 to 4 latent groups of learners with distinct learning trajectories across the 25 subcompetencies.” They concluded: “Based on the Accreditation Council for Graduate Medical Education reports between 2014 and 2020, this study indicates that the anesthesiology Milestones version 1.0 provide moderately reliable information for tracking an individual’s progress over time and may provide an opportunity for program directors to implement individualized learning plans to support resident development.”1
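For readers curious about the statistical machinery, growth curve models of this kind are typically linear mixed-effects models with a random intercept (each trainee’s baseline) and a random slope (each trainee’s growth rate). The Python sketch below, using simulated data and the statsmodels library, is our illustration of that general approach, not the authors’ actual code or data; the values 2.05 and 0.49 are borrowed from the reported results only to make the simulation concrete.

# Illustrative sketch only: simulates Milestone-style ratings and fits a
# linear growth curve (mixed-effects) model of the general type Sun et al.
# describe. Hypothetical data and parameters; not the authors' analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_residents, n_periods = 200, 6  # six semiannual reporting periods

rows = []
for rid in range(n_residents):
    intercept = rng.normal(2.05, 0.30)  # baseline near the reported mean of 2.05
    slope = rng.normal(0.49, 0.10)      # growth near the reported 0.49/period
    for t in range(n_periods):
        rating = intercept + slope * t + rng.normal(0, 0.25)
        rows.append({"resident": rid, "period": t,
                     "rating": min(max(rating, 1.0), 5.0)})  # clamp to the 1-5 scale
df = pd.DataFrame(rows)

# Random intercept and random slope per resident; the fixed effects recover
# the average baseline and the average per-period growth.
model = smf.mixedlm("rating ~ period", df, groups="resident", re_formula="~period")
fit = model.fit()
print(fit.summary())

The variance of the random slopes relative to their estimation error is what drives a “growth rate reliability” statistic: if trainees truly differ in how fast they progress, the model can detect it; if not, individual trajectories are mostly noise.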
This sounds like a lot of self-serving gobbledygook by “true believers” to me (MY). As Lockman and Schwartz point out: “Competence is ultimately a qualitative rather than a quantitative assessment. Professionalism and interpersonal and communication skills, as examples, are competencies that may carry huge subjective components embedded in a faculty evaluator’s judgment of trainee performance. Subjective assessment through one faculty member’s personal lens may be quite different when viewed through the lens of a colleague.”2 “Might the Milestone data be flawed? Is it even possible to assess competence quantitatively using a number, a Milestone, when the summative evaluation is ultimately a qualitative assessment?”2
Lockman and Schwartz conclude: “While milestones may provide a convenient paper trail for program directors and ACGME and may help us identify best practices, the real work lies in faculty development, working with clinical educators to enhance teaching methods, the delivery of meaningful verbal and written feedback to trainees in a timely fashion, and investing in individualized professional development of trainees to ensure they are actually competent at graduation. Doing so requires significant investment on the part of clinical faculty and in program leadership by hospitals and departments.”2 This work of faculty development is essential if trainee assessment for competence is to be meaningful.
So, when evaluating competency, I (MY) return to my basic question: “Would I trust this resident/fellow to anesthetize my children or my grandchildren?” Send your thoughts and comments to Myron, who will post them in a Friday reader response.
References
1. Sun T, Park YS, Chen F, Hogan SO, Tanaka P. Longitudinal Reliability of Milestones Learning Trajectories during Anesthesiology Residency. Anesthesiology 2025;142(5):918-928. DOI: 10.1097/aln.0000000000005388.
2. Lockman JL, Schwartz AJ. Attaining Anesthesiology Clinical Competence during Graduate Medical Education: Can We Rely on What We See? Anesthesiology 2025;142(5):790-792. DOI: 10.1097/aln.0000000000005408.
3. Schartel SA, Kuhn C, Culley DJ, Wood M, Cohen N. Development of the anesthesiology educational milestones. J Grad Med Educ 2014;6(1 Suppl 1):12-4. DOI: 10.4300/JGME-06-01s1-29.