Author information
- Douglas E. Drachman, MD∗
- ∗Reprint requests and correspondence: Dr. Douglas E. Drachman, Cardiology Division, Massachusetts General Hospital, Gray-Bigelow 800, 55 Fruit Street, Boston, Massachusetts 02114.
The assessment of competence is a critical element of graduate medical education. During the past decade, medical training programs have shifted from a historical focus on a fixed curriculum, in which the number of procedures performed, the scope of lectures attended, and the months of clinical rotations completed defined the criteria for successful completion, to a culture in which learner progress is assessed continuously, charting the incremental development and refinement of knowledge and skills until specific milestones are achieved. Curricula and the milestones for assessment are defined by the Accreditation Council for Graduate Medical Education (ACGME) in terms of 6 core competencies: medical knowledge, patient care, interpersonal and communication skills, systems-based practice, practice-based learning and improvement, and professionalism. It is the responsibility of the training program director to use all available assessment tools to critique, report, and guide trainees to the achievement of independent competence.
Educators use many tools to provide such assessment, but these tools often require direct observation, chart review, or other potentially subjective frameworks on which opinion is formulated. Although each evaluation method may provide a window of insight into learner performance, there has been a long-standing need for a standardized, robust, and validated mechanism to assess medical knowledge. To this end, the in-training examination (ITE) was developed; it has been used widely and effectively in internal medicine residency programs since 1988.
Although in-training assessment is a core requirement of programs seeking accreditation by the ACGME, no standardized ITE had been developed for cardiology fellowship programs until 2011. In this issue of the Journal, Kuvin et al. (1) describe the development of the American College of Cardiology (ACC) ITE for fellows in training (FIT) and report the outcomes of the examination over the past 3 years since its inception.
The ACC ITE, which resembles the cardiology board certification examination developed by the American Board of Internal Medicine (ABIM), comprises 150 single-best-answer multiple-choice questions, developed with the same content-area proportions as the ABIM examination. In the ACC ITE, however, consistent with its mission as an educational tool, detailed feedback is provided to the FIT and program director, including percent correct, comparison with others taking the test in all 3 fellowship years, a breakdown of score by content area, and a link to specific competency-based ACC Curricular Milestones for questions answered incorrectly. These curricular milestones detail the specific elements of the competencies that are expected of a clinical cardiologist. The curricular milestones comprise a major component of COCATS-4, the ACC document (2) that defines the educational expectations for cardiology fellowship training. Outcome evaluation tools, and in particular the role of the ITE, feature prominently in that document.
The ITE has enjoyed considerable success since it was first offered in 2011, extending its reach to 194 fellowship programs and 3,388 FITs during its first 3 years. The examination has performed with remarkable consistency and validity: fellows at each level of training have scored consistently from year to year, and scores have improved with each additional year of fellowship experience.
Adoption of the ITE offers many potential advantages to trainees, programs, and the greater cardiovascular community. The standardized examination provides an objective assessment of FIT medical knowledge. Feedback keyed to the ACC Curricular Milestones enables each FIT to identify specific knowledge gaps and develop a plan to address them; this process itself sharpens the individual's skills in practice-based learning and improvement. By identifying and closing learning gaps, FITs will become better prepared for the ABIM board certification examination.
Within fellowship programs, reviewing feedback from all examinees enables the program director to identify consistent areas of educational deficiency within the program where additional instruction and focus are required. In turn, this provides opportunities to improve the fellowship curriculum. In addition, the ITE findings may indicate that certain topic areas require greater experience or higher-level insight than others. Understanding the dynamics of this "hierarchic" or "stepwise" nature of learning may influence the scheduling of learning experiences, so that early experiences are optimized before more complex or advanced topics are addressed (3).
On a national level, the identification of topics associated with low performance scores may provide a “needs-based” assessment; cardiovascular societies may develop educational products tailored to address these specific needs.
Although the ACC ITE represents a powerful tool for evaluating the medical knowledge of cardiovascular FITs, future developments may expand the reach and scope of assessment. There is an opportunity for cardiovascular subspecialty programs, such as interventional cardiology, electrophysiology, and heart failure/transplantation, to develop subject-specific in-training examinations. The multiple-choice format of the ITE is effective for the assessment of medical knowledge and patient care. Other core competencies, including systems-based practice and practice-based learning and improvement, are less easily assessed through this format. Future assessment formats, including simulation or blending with other strategies such as direct observation, may complement the insights derived from the ITE in these specific competency areas.
The authors, ACC staff, and the National Board of Medical Examiners are to be commended for creating an effective new resource to assess medical knowledge. Use of the examination has increased dramatically each year, but its potential to guide educational objectives for fellows, program directors, and the greater cardiovascular community has only begun to be realized.
∗ Editorials published in the Journal of the American College of Cardiology reflect the views of the authors and do not necessarily represent the views of JACC or the American College of Cardiology.
Dr. Drachman has served as a member of the ACC Test Materials Development Committee since 2013.
- (1) Kuvin J.T., Soto A., Foster L., et al.
- (2) Williams E.S., Halperin J.L., Fuster V., et al.