Author Information
- Received February 14, 2018
- Revision received March 1, 2018
- Accepted March 2, 2018
- Published online May 21, 2018.
- Joseph M. Bumgarner, MDa,
- Cameron T. Lambert, MDa,
- Ayman A. Hussein, MDa,
- Daniel J. Cantillon, MDa,
- Bryan Baranowski, MDa,
- Kathy Wolski, MPHb,
- Bruce D. Lindsay, MDa,
- Oussama M. Wazni, MD, MBAa and
- Khaldoun G. Tarakji, MD, MPHa,∗
- aDepartment of Cardiovascular Medicine, Cleveland Clinic, Cleveland, Ohio
- bCleveland Clinic Coordinating Center for Clinical Research (C5Research), Cleveland Clinic, Cleveland, Ohio
- ∗Address for correspondence:
Dr. Khaldoun G. Tarakji, Section of Cardiac Pacing and Electrophysiology, Heart and Vascular Institute, Cleveland Clinic, 9500 Euclid Avenue, J2-2, Cleveland, Ohio 44195.
Background The Kardia Band (KB) is a novel technology that enables patients to record a rhythm strip using an Apple Watch (Apple, Cupertino, California). The band is paired with an app providing automated detection of atrial fibrillation (AF).
Objectives The purpose of this study was to examine whether the KB could accurately differentiate sinus rhythm (SR) from AF compared with physician-interpreted 12-lead electrocardiograms (ECGs) and KB recordings.
Methods Consecutive patients with AF presenting for cardioversion (CV) were enrolled. Patients underwent pre-CV ECG along with a KB recording. If CV was performed, a post-CV ECG was obtained along with a KB recording. The KB interpretations were compared with physician-reviewed ECGs. The KB recordings were reviewed by blinded electrophysiologists and compared with the ECG interpretations. Sensitivity, specificity, and κ coefficients were measured.
Results A total of 100 patients were enrolled (age 68 ± 11 years). Eight patients did not undergo CV as they were found to be in SR. There were 169 simultaneous ECG and KB recordings. Fifty-seven were noninterpretable by the KB. Compared with ECG, the KB interpreted AF with 93% sensitivity, 84% specificity, and a κ coefficient of 0.77. Physician interpretation of KB recordings demonstrated 99% sensitivity, 83% specificity, and a κ coefficient of 0.83. Of the 57 noninterpretable KB recordings, interpreting electrophysiologists diagnosed AF with 100% sensitivity, 80% specificity, and a κ coefficient of 0.74. Among 113 cases where KB and physician readings of the same recording were interpretable, agreement was excellent (κ coefficient = 0.88).
Conclusions The KB algorithm for AF detection supported by physician review can accurately differentiate AF from SR. This technology can help screen patients prior to elective CV and avoid unnecessary procedures.
Atrial fibrillation (AF) is the most commonly encountered arrhythmia in clinical practice, and population-based studies forecast over 6 million individuals living with this diagnosis by 2050 (1,2). It is a chronic condition whose prevalence increases with age and that represents a growing economic burden for our health care system (3,4). Although the journey of AF begins with an initial diagnosis, its management is long term, nuanced, and often involves hospital-based interventions along the way, including electrical cardioversion (CV).
Recently, commercially available handheld cardiac rhythm recorders have been developed that can record a rhythm strip using smartphone technology (5). In November 2017, the Kardia Band (KB) (AliveCor, Mountain View, California) was introduced as the first U.S. Food and Drug Administration (FDA)–cleared Apple Watch accessory (Apple, Cupertino, California) that allows a patient to record a rhythm strip equivalent to lead I for 30 s. The KB is coupled with an application that provides an instantaneous and automatic rhythm adjudication algorithm for the diagnosis of AF. The application can inform the patient when AF is detected and transmit these results to the patient’s caring physician instantaneously.
The primary objective of our study was to examine whether the KB and AF detection algorithm could accurately and reliably differentiate sinus rhythm (SR) from AF when compared with physician-interpreted 12-lead ECGs and KB recordings in patients with known AF presenting to a high-volume hospital-based electrophysiology practice for scheduled electrical CV.
This was a prospective, nonrandomized, and adjudicator-blinded study completed at a tertiary care hospital-based electrical CV laboratory designed to evaluate the accuracy of the KB automated algorithm for the detection of AF. AliveCor provided the KB connected to an Apple Watch which was paired via Bluetooth to a smartphone device (Apple) for utilization in the study (Figure 1). The Cleveland Clinic’s Institutional Review Board approved the study.
Consecutive patients with a diagnosis of AF who presented for scheduled elective CV with or without a planned transesophageal echocardiogram were screened for enrollment. Inclusion criteria included all adult patients age 18 to 90 years who were able to provide informed consent and willing to wear the KB before and after CV. We excluded all patients with an implanted pacemaker or defibrillator.
Once enrolled, patients underwent a pre-CV ECG followed immediately by a KB recording. These paired recordings were considered simultaneous. If CV was performed, a post-CV ECG was then obtained along with another KB recording. The KB tracing was automatically analyzed by the KB algorithm, which measures rhythm irregularity and P-wave absence in real time to classify the rhythm strip as “possible AF.” If the criteria for AF are not met, the algorithm classifies regular rhythms with P waves as “normal” when the rate is between 50 and 100 beats/min, or as “unclassified” when the rate is <50 or >100 beats/min, the recording is noisy, or the recording is shorter than 30 s. The KB rhythm strips were automatically transferred to the secure AliveCor server, downloaded, and printed for review.
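The triage rules described above can be sketched as a simple decision function. This is a hypothetical re-implementation for illustration only; the actual AliveCor algorithm is proprietary, and only the thresholds stated in the text (30-s duration, 50 and 100 beats/min) are taken from the source.

```python
def classify_rhythm(duration_s: float, noisy: bool, irregular: bool,
                    p_waves_present: bool, rate_bpm: float) -> str:
    """Sketch of the KB classification logic described in the text.

    Hypothetical illustration; the real algorithm's feature
    extraction and decision rules are not published.
    """
    # Noisy or short recordings cannot be classified
    if noisy or duration_s < 30:
        return "unclassified"
    # Irregular rhythm with absent P waves suggests AF
    if irregular and not p_waves_present:
        return "possible AF"
    # Regular rhythm with P waves: "normal" only within 50-100 beats/min
    if p_waves_present and not irregular:
        if 50 <= rate_bpm <= 100:
            return "normal"
        return "unclassified"  # rate <50 or >100 beats/min
    # Anything else falls through as unclassified
    return "unclassified"
```

Note that under these rules a recording can be "unclassified" for several distinct reasons (noise, duration, rate), which matches the breakdown of unclassified tracings reported in the Results.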
All automated KB rhythm strips and ECGs were anonymized and distributed to 2 blinded electrophysiologists (BB and DC) who independently interpreted each tracing and assigned a diagnosis of SR, AF or atrial flutter, or unclassified. If the 2 electrophysiologists disagreed on the diagnosis, a third electrophysiologist (AH) reviewed the tracing and assigned a final diagnosis. To assess the accuracy of the KB algorithm at appropriately identifying AF, the automated KB interpretations were compared with both the physician-interpreted KB rhythm strips and the physician-reviewed simultaneous ECGs.
Sensitivity and specificity were calculated for KB automated interpretation compared with physician-interpreted 12-lead ECG, for physician-interpreted KB rhythm strip compared with physician-interpreted 12-lead ECG, and for KB automated interpretation compared with physician-interpreted KB recordings. Kappa (κ) coefficients for interobserver agreement were assessed. κ coefficients >0.80 were considered to represent excellent agreement. AF and atrial flutter were considered a single disease state for all interpretations.
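The accuracy metrics used throughout the Results can all be derived from a standard 2×2 contingency table. A minimal sketch of those calculations, using purely illustrative counts (the study does not report cell-level counts here):

```python
def diagnostic_stats(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity, and Cohen's kappa from a 2x2 table.

    tp/fp/fn/tn are counts of true/false positives and negatives
    relative to the reference standard (here, the 12-lead ECG).
    """
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    observed = (tp + tn) / n              # raw agreement
    # Agreement expected by chance, from the marginal totals
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

# Illustrative counts only, not the study's data:
sens, spec, kap = diagnostic_stats(tp=90, fp=20, fn=10, tn=80)
```

Cohen's κ corrects raw agreement for chance, which is why it can be substantially lower than sensitivity and specificity when one rhythm dominates the sample.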
A total of 100 patients were enrolled in the study from March 2017 through June 2017. Demographics and clinical characteristics are summarized in Table 1. CV was performed in 85% of study participants. Of the 15 patients who did not undergo CV, 8 were cancelled due to presentation in SR. There were 169 simultaneous 12-lead ECG and KB recordings obtained from study participants, and 57 KB recordings were labeled unclassified by the KB algorithm. Of the 57 unclassified KB tracings, 16 (28%) were due to baseline artifact and low amplitude of the recording, 12 (21%) to a recording duration of <30 s, 6 (10%) to a heart rate of <50 beats/min, and 5 (9%) to a heart rate of >100 beats/min; the remaining 18 (32%) were unclassified for unclear reasons. Electrophysiologist-interpreted 12-lead ECGs were all interpretable.
To test the ability of the KB algorithm to detect AF, automated KB rhythm interpretations and electrophysiologist-interpreted 12-lead ECGs were compared. Among the recordings where the KB provided a diagnosis, it correctly diagnosed AF with 93% sensitivity, 84% specificity, and a κ coefficient of 0.77 (95% confidence interval: 0.65 to 0.89) when compared with the electrophysiologist-interpreted 12-lead ECG (Table 2). Because our analysis used multiple observations from the same individual, we evaluated for possible intraindividual correlations by comparing only pre-CV KB recordings to electrophysiologist-interpreted 12-lead ECGs and found the performance of the KB algorithm to be unchanged (Online Table 1).
To determine whether the automated KB recordings labeled as “unclassified” by the algorithm were still clinically useful, these tracings were interpreted by our blinded electrophysiologists and compared with the electrophysiologist-interpreted 12-lead ECGs. Of the 57 automated unclassified KB recordings, the interpreting electrophysiologists were able to correctly diagnose AF with 100% sensitivity, 80% specificity, and a κ coefficient of 0.74 (Table 3).
To assess the fidelity and overall quality of the KB tracings produced by the smartwatch, electrophysiologist-interpreted KB recordings were compared with the corresponding 12-lead ECG tracings. A total of 22 recordings were determined to be noninterpretable by the reading electrophysiologist, predominantly because of baseline artifact. For the remaining 147 simultaneous recordings, physician interpretation of the KB tracings demonstrated 99% sensitivity, 83% specificity, and a κ coefficient of 0.83 when compared with the electrophysiologist-interpreted 12-lead ECGs (Table 4).
Additionally, to measure the quality of the KB recordings, we compared the KB automated algorithm interpretation to physician interpretation of the same recordings. Of the cases where both methods were interpretable, the KB automated algorithm was 93% sensitive and 97% specific in detecting AF, with a κ coefficient of 0.88 (Table 5).
Mobile health care technology has proliferated over the past decade. Consumers from the general public now have direct access to devices and applications that offer real-time measurements of cardiovascular physiology, and some technologies extrapolate these data to provide diagnostic information (6). It is estimated that by 2019, annual sales of such devices will reach $50 billion worldwide (7). However, the ability of some devices to accurately measure biometric endpoints has been questioned, and some mobile health technologies are available without verification through rigorous clinical studies (8).
Alongside the growth of mobile health care technology has been the desire of many physicians and patients to accurately monitor disease-related metrics of chronic conditions in the ambulatory setting. AF is a good example of a relapsing condition that requires frequent monitoring of clinical endpoints to assess the efficacy of treatment choices and plan future interventions. The KB is the first smartwatch accessory cleared by the FDA and available to the general public without a prescription that claims to instantaneously detect AF and transmit this information to a patient’s treating physician.
In this study, we aimed to assess whether the KB and AF detection algorithm could accurately and reliably differentiate SR from AF in patients with known AF presenting for scheduled electrical CV (Central Illustration). We compared automated KB interpretations to simultaneously recorded ECGs read by blinded electrophysiologists and found very good agreement between them. When able to provide an interpretation, the automated KB readings correctly identified AF with 93% sensitivity and 84% specificity (Figure 2). Of the 169 total KB recordings, 57 (33.7%) were interpreted as unclassified by the automated KB algorithm. Reasons that these recordings were deemed noninterpretable included short recordings <30 s, low-amplitude P waves, and baseline artifact. For those recordings where the automatic KB tracing was noninterpretable, direct physician interpretation could be used to correctly identify AF with 100% sensitivity and 80% specificity (Figure 3). In general, the KB recordings when interpreted by the physician had excellent agreement with simultaneous 12-lead ECG interpretation with 99% sensitivity and 83% specificity.
Prior to the development of the KB smartwatch algorithm, several algorithms used by implantable loop recorders (ILRs) were validated for the detection of AF. Currently available ILRs detect AF by sensing R waves and applying a variety of regularity algorithms to detect AF. The Confirm DM2101 (Abbott, Chicago, Illinois) detects RR interval regularity and measures suddenness of an irregular rhythm’s onset and offset to diagnose AF using 2 probabilistic scoring models. The BioMonitor (Biotronik, Berlin, Germany) also measures R-wave variability and allows the clinician to adjust the number of cycle lengths used and the confirmation time needed to detect AF. The most studied of the ILRs is the Reveal LINQ (Medtronic, Minneapolis, Minnesota) system whose algorithm for AF detection uses both R-wave irregularity and a programmable P-wave evidence discrimination tool that can be modified based on the individual needs of a given patient (9–11). The Reveal LINQ system was evaluated in the XPECT (Reveal XT Performance Trial). In this study, the sensitivity and specificity for identifying patients with any AF was 96.1% and 85.4%, respectively (12). In our study, the accuracy of the KB algorithm for the detection of AF was comparable to these results.
Wearable devices like the KB require a safe and durable platform on which recordings can be reviewed and stored. A secure cloud-based platform has been developed to view and download KB recordings; its applicability to the outpatient management of patients with AF needs to be evaluated in future trials. Our study also demonstrated that a subset of patients (8%) who presented for CV were found to be in SR. For each of these patients, the automated KB algorithm did not erroneously identify AF, and physician interpretation of the KB recording correctly confirmed SR in each case. Although this study was not powered to assess the financial consequences of cancelled CVs, it is reasonable to conclude that measurable resources were forfeited by both the patient and the health care system in anticipation of a procedure that was ultimately deemed unnecessary once SR was confirmed. Because data from the KB can be reviewed remotely, the resources used in preparation for these patients’ cancelled CVs could have been saved. The KB system has previously been shown to be cost-effective for AF screening (13), and our study suggests the potential of such products to enable more effective health care delivery.
This was a single-center study at a tertiary referral center with a small sample size. The population represented in this study had a known history of AF and a sufficient burden of AF to prompt electrical CV. The performance of the KB smartwatch algorithm may be more variable in a population with a lower AF burden. We did not evaluate socioeconomic status in our study, and only 17% of our enrolled patients were female. Additionally, none of the patients who participated in our study had previously used the KB. These facts may limit the generalizability of our findings in the general public, and future studies should consider measuring these variables. Patients with cardiac implantable electronic devices were excluded from this study, and further evaluation of the KB algorithm is needed in this patient population. Participants were instructed on how to use the KB wristband while seated in a hospital bed immediately prior to obtaining each recording. Their ability to record each tracing was directly observed. As a result, the performance of the KB algorithm and the clarity of the recorded tracings may be less accurate in an outpatient or ambulatory setting. For the same reason, some of the unclassified recordings could have been avoided with more patient practice on the proper use of the KB device. Additionally, the KB prototype used in our study did not display a real-time ECG tracing on the watch screen at the time of recording. Since FDA clearance, the KB app is now permitted to display this information. We anticipate the real-time display of the ECG recording will improve the quality of the recordings obtained by users of the device.
The KB smartwatch automated algorithm for AF detection, supported by physician review of these recordings, can reliably differentiate AF from SR. Avoiding scheduling unnecessary electrical CVs is 1 example of a clinical application of the KB system. Many other potential applications warrant further investigation and might transform our longitudinal care of AF patients.
COMPETENCY IN PATIENT CARE: Among patients with AF undergoing elective cardioversion, an automated smartwatch algorithm with physician oversight accurately differentiates between AF and SR.
TRANSLATIONAL OUTLOOK: As the prevalence of AF rises and access to mobile health care technology expands, randomized trials are needed to validate the sensitivity, specificity, and generalizability of personalized wearable technologies for arrhythmia detection and to define their clinical utility.
AliveCor provided the Kardia Band monitors that were connected to an Apple Watch and paired via Bluetooth to a smartphone device for utilization in the study. AliveCor was not involved in the design, implementation, data analysis, or manuscript preparation of the study. Dr. Hussein has served as a consultant for Abbott and Biosense Webster. Dr. Cantillon has served as a consultant for Abbott, Boston Scientific, Stryker Sustainability, and LifeWatch. Dr. Wazni has received a speaker honorarium from Spectranetics. Dr. Tarakji has served on the medical advisory board of Medtronic and AliveCor. All other authors have reported that they have no relationships relevant to the contents of this paper to disclose.
- Abbreviations and Acronyms
- AF = atrial fibrillation
- ILR = implantable loop recorder
- KB = Kardia Band
- SR = sinus rhythm
- © 2018 American College of Cardiology Foundation
- References
- January C.T., Wann L.S., Alpert J.S., et al.
- Becker C.
- Freedman B.
- Piwek L., Ellis D.A., Andrews S., et al.
- Gillinov S., Etiwy M., Wang R., et al.
- Lee R., Mittal S.
- Passman R.S., Rogers J.D., Sarkar S., et al.
- Mittal S., Rogers J., Sarkar S., et al.
- Hindricks G., Pokushalov E., Urban L., et al.