Author Information
- Beau M. Hawkins, MD†,
- Lisa A. McCoy, MS‡,
- Megan L. Neely, PhD‡,
- Srinath Adusumalli, MD§,
- John C. Messenger, MD‖,
- Douglas E. Drachman, MD§,
- Theodore A. Bass, MD¶,
- Patrick T. O'Gara, MD#,
- Sunil V. Rao, MD‡ and
- Robert W. Yeh, MD, MSc§∗
- †Cardiovascular Section, Department of Internal Medicine, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma
- ‡Duke Clinical Research Institute, Durham, North Carolina
- §Division of Cardiology, Department of Internal Medicine, Massachusetts General Hospital, Boston, Massachusetts
- ‖Department of Medicine, University of Colorado Hospital at Denver and Health Sciences Center, Aurora, Colorado
- ¶University of Florida College of Medicine-Jacksonville, Jacksonville, Florida
- #Cardiovascular Division, Department of Medicine, Brigham and Women's Hospital, Boston, Massachusetts
- ∗Cardiology Division, Massachusetts General Hospital, 55 Fruit Street, GRB 800, Boston, Massachusetts 02114
To the Editor:
Each July, hospitals absorb an influx of medical trainees who are functioning as physicians for the first time, are assuming new roles with greater responsibilities, or are entering fields that require new cognitive and procedural skill sets. The inexperience of trainees during this period of transition might adversely impact patient care, a concept commonly referred to as the “July effect” (1). Studies across a range of specialties have linked the July effect to higher rates of medical errors, procedural complications, and patient mortality (2,3).
Percutaneous coronary intervention (PCI) is a commonly performed procedure with well-defined risks; it has technical components with established learning curves and is often performed in acutely ill patients who might be at imminent risk of death. In light of prior research documenting the existence of the July effect in other disciplines, we hypothesized that PCI outcomes would be worse early in the academic year when performed at hospitals with affiliated interventional cardiology fellowships.
A list of accredited interventional cardiology fellowship training programs was obtained from the Accreditation Council for Graduate Medical Education website (4). Affiliated hospitals for each of these accredited fellowships were identified and linked to the National Cardiovascular Data Registry CathPCI Registry. We divided the academic year into early (July through August) and late (September through June) time periods on the basis of when PCIs were performed between 2009 and 2012. The primary outcomes of interest were in-hospital major bleeding, access site bleeding, vascular complications, death, and the composite of myocardial infarction, stroke, or death.
We compared crude and adjusted rates of all outcomes for the early versus late periods at training hospitals. Variables used for adjustment were those previously included in validated mortality and bleeding models from the National Cardiovascular Data Registry (Table 1). Adjustment was performed with logistic regression using generalized estimating equations to account for clustering of patients within hospitals. A cohort of concurrent PCI procedures performed at non-training hospitals served as a control population, and we performed tests for interaction between academic year timing and training hospital status for all endpoints. We further surveyed interventional fellowship program directors to determine the percentage of PCIs with trainee participation and examined whether the association between academic year timing and outcomes differed by the institutional percentage of cases involving trainees. The study was conceived and designed by investigators at the Massachusetts General Hospital. Statistical analyses were performed at the Duke Clinical Research Institute with SAS (version 9.2, SAS Institute, Inc., Cary, North Carolina).
Of 183 hospitals affiliated with 137 Accreditation Council for Graduate Medical Education–accredited interventional cardiology fellowship programs, 136 (74.3%) participated in the CathPCI registry. These hospitals performed 309,866 PCIs during the study period, of which 61,503 and 248,363 procedures were performed in the early and late time periods, respectively. During this same period, 1,525,454 PCI procedures were performed at hospitals without training programs, of which 306,423 and 1,219,031 were performed in the early and late periods, respectively.
Among procedures done at training hospitals, access site, antiplatelet therapy, anticoagulants, fluoroscopy time, contrast use, PCI complexity, and length of stay were similar between the early and late periods. There were no significant differences in any outcomes between early and late periods (Table 1). The associations between academic year timing and outcomes were similar at hospitals with and without training programs (tests for interaction, bleeding p = 0.45, access site bleeding p = 0.95, vascular complications p = 0.79, death/myocardial infarction/stroke p = 0.83, mortality p = 0.70).
Survey data on the percentage of PCIs performed with trainee involvement were received from 81 training programs (59% of those surveyed). There were no differences in the effect of academic year timing on any of the assessed outcomes when stratified by the degree of trainee involvement in procedures (interaction p = NS for all comparisons).
Within a large registry of PCIs performed in the United States, we found that procedural timing within the academic year did not influence rates of bleeding, vascular complications, or mortality in hospitals with affiliated interventional cardiology fellowship programs. Our findings suggest that the involvement of inexperienced trainees in the performance of PCI does not compromise quality of patient care early in the academic year, as assessed by these outcomes.
Young et al. (2) examined the results of 39 studies assessing the effect of the end-of-year changeover on quality of care. Although there was significant heterogeneity among studies, there was a trend toward worse mortality and efficiency of care at the time of academic year-end changeovers. In our analysis, measures of efficiency including fluoroscopy time, contrast usage, and hospital length of stay were similar during the early and late periods. Moreover, we found no effect of academic year timing on patient outcomes despite ample statistical power to detect one.
Several factors might explain the lack of a detectable July effect in interventional cardiology. More rigorous attending supervision with graded trainee participation early in the year might compensate for the relative inexperience of trainees. Similarly, the participation in procedures in July by more experienced fellows might minimize complication risks during this perceived vulnerable period. Such supervision practices are recommended by training guidelines for cardiac catheterization and other endovascular procedures (5,6). Finally, successful completion of general cardiology fellowship requires a minimum of 4 months of procedural training in diagnostic coronary procedures (5), and the application process for interventional fellowship candidates might allow for the selection of individuals proficient in the technical aspects of this procedural specialty.
Our study has several important limitations. Only three-quarters of hospitals with affiliated interventional cardiology fellowships participated in the CathPCI registry during the study period. We were unable to account for clinical practice heterogeneity within training programs. Although we surveyed interventional fellowship programs to quantify the percentage of procedures involving interventional cardiology fellows to assess its influence on our study results, not all programs responded to our survey, and the precise roles of fellows in these procedures could not be assessed.
Among training hospitals participating in a national PCI registry, the performance of PCI early in the academic year was not associated with increased bleeding, vascular complications, or mortality. These findings might be reassuring to both patients and providers at institutions that are involved in training in interventional cardiology.
Please note: This project was supported by the American College of Cardiology Foundation National Cardiovascular Data Registry. CathPCI Registry is an initiative of the American College of Cardiology Foundation and The Society for Cardiovascular Angiography and Interventions. The authors gratefully acknowledge the contributions of Daniel M. Shivapour, MD, to this analysis. Dr. Messenger receives research grant support from Medtronic, Inc. Dr. Drachman receives research support from iDev Technologies, Inc. and Lutonix/Bard; serves on the Clinical Events Committee for PLC Medical Systems, Inc.; and is a member of the Prairie Education & Research Cooperative Data and Safety Monitoring Board. Dr. Bass is Chair of the American Board of Internal Medicine question writing committee for the certification examination in interventional cardiology; and is the President-Elect of the Society for Cardiac Angiography and Interventions and serves as Vice-Chair on the American College of Cardiology Foundation/American Heart Association/Society for Cardiovascular Angiography and Interventions 2012 Update of the Clinical Competence Statement on Cardiac Interventional Procedures. Dr. Rao serves as a consultant for Terumo Medical Corporation. Dr. Yeh is an investigator at the Harvard Clinical Research Institute; and has served as a consultant for the Kaiser Permanente Division of Research. All other authors have reported that they have no relationships relevant to the contents of this paper to disclose.
- Young JQ, Ranji SR, Wachter RM, Lee CM, Niehaus B, Auerbach AD. "July effect": impact of the academic year-end changeover on patient outcomes: a systematic review. Ann Intern Med 2011;155:309–15.
- Peterson ED, Roe MT, Chen AY, et al. The NCDR ACTION Registry-GWTG: transforming contemporary acute myocardial infarction clinical care. Heart 2010;96:1798–802.
- Jacobs AK, Babb JD, Hirshfeld JW Jr., Holmes DR Jr.