∗Address for correspondence: Dr. Antonio Raviele, ALFA–Alliance to Fight Atrial Fibrillation, Via Torino 151/c, 30171 Mestre-Venice, Italy.
Two different types of oral anticoagulants (OACs) are currently available to reduce the risk of stroke/systemic embolism (SE) in patients with atrial fibrillation (AF): vitamin K antagonists (VKAs) and direct OACs (DOACs) (1). The VKAs, for example, warfarin, are the traditional OACs and, until 2009, were the only class of OACs available. VKAs are very effective in preventing thromboembolic events in patients with AF. In a meta-analysis of 6 randomized controlled trials (RCTs) of these drugs, adjusted-dose warfarin reduced the risk of stroke by 64% and the risk of all-cause mortality by 26% in comparison with placebo or no treatment (2). However, VKA therapy has several limitations, such as unpredictable response, narrow therapeutic window, slow onset/offset of action, numerous food–drug and drug–drug interactions, and warfarin resistance, which make it difficult to implement in clinical practice. Moreover, it requires routine coagulation monitoring and frequent dose adjustments, resulting in substantial risk and inconvenience. This explains the low use and the high discontinuation rate of warfarin in the real world, as well as the inadequate level of anticoagulation reached in many patients.
Developed in recent years, DOACs are new compounds that directly inhibit thrombin or activated factor X (factor Xa). In 4 large RCTs, these drugs were compared with warfarin for stroke prevention in >71,000 patients. A meta-analysis of these trials showed that DOACs significantly reduced stroke/SE by 19% and all-cause mortality by 10% in comparison with warfarin (3). DOACs have several advantages over VKAs, such as fewer intracranial hemorrhages, more predictable response, rapid onset/offset of action, no major food–drug interactions, and fewer drug–drug interactions. These properties enable fixed doses to be administered without the need for routine coagulation monitoring, thereby simplifying treatment. However, DOACs also have some limitations, particularly their short half-life and the fact that no routine coagulation monitoring is required, which may potentially increase the risk of stroke or SE in case of poor drug compliance. Moreover, their high cost curbs their use in many patients, especially in countries where greater constraints are placed on health care spending.
Thus, despite the better benefit/risk ratio of DOACs than of VKAs, VKAs are still frequently used in routine clinical practice. Undoubtedly, DOACs are particularly attractive for AF patients starting anticoagulant therapy. However, it is not clear whether AF patients who are well anticoagulated on warfarin should switch to a DOAC. The quality of anticoagulation on VKAs can be measured in clinical practice by means of the time in therapeutic range (TTR). This is usually calculated by means of the method described by Rosendaal et al. (4), which uses linear interpolation of international normalized ratio (INR) values in each patient receiving VKAs to calculate the percentage of days when the INR is in the therapeutic range (2.0 to 3.0). It is likely that patients who are rarely in the therapeutic range receive little or no benefit from treatment with VKAs. Indeed, several reports have indicated an association between low TTR and increased rates of both stroke/SE and major hemorrhage in patients on VKAs. Interestingly, secondary analyses of the RCTs of DOACs indicate that warfarin at TTR ≥70% has comparable efficacy and safety to those of novel anticoagulants (5–7). Therefore, it has become a common belief that patients with good anticoagulation control on warfarin, as expressed by a TTR ≥70%, derive little or no benefit from switching to a DOAC, a concept that has been endorsed by current guidelines, which recommend maintaining patients on VKAs when TTR is ≥70% (1). This, however, assumes that TTR will remain stably high over time, but only limited data on this topic exist in the published reports (8,9).
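The Rosendaal calculation described above can be sketched in a few lines of code: interpolate the INR linearly between successive checks and count the fraction of days in the 2.0 to 3.0 range. The function below is a minimal illustration of the method, not a clinically validated implementation, and the dates and INR values in the example are hypothetical.

```python
from datetime import date

def rosendaal_ttr(measurements, low=2.0, high=3.0):
    """Percentage of days with linearly interpolated INR within [low, high].

    measurements: list of (date, inr) tuples, sorted by date.
    """
    in_range_days = 0
    total_days = 0
    for (d0, i0), (d1, i1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        if span == 0:
            continue
        total_days += span
        for day in range(span):
            # Linear interpolation between the two INR checks.
            inr = i0 + (i1 - i0) * day / span
            if low <= inr <= high:
                in_range_days += 1
    return 100.0 * in_range_days / total_days if total_days else 0.0

# Hypothetical series of INR checks for one patient:
checks = [(date(2018, 1, 1), 1.8),
          (date(2018, 1, 15), 2.4),
          (date(2018, 2, 12), 2.9),
          (date(2018, 3, 14), 3.4)]
print(round(rosendaal_ttr(checks), 1))  # → 61.1
```

Note that the result depends entirely on the interpolated trajectory between checks, a point that becomes important when checks are infrequent.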
In a paper published in this issue of the Journal, Bonde et al. (10) evaluated the stability of TTR in a large population of AF patients taking warfarin over a prolonged period of time and correlated the clinical outcomes of these patients during follow-up with the quality of anticoagulation control achieved during the first 6 months of treatment. A total of 4,772 patients from Danish registries were included in the analysis: 1,691 (35.4%) with good initial anticoagulation control (TTR ≥70%) and 3,081 (64.6%) without (TTR <70%). During the first year of follow-up, only 57% of patients with good initial anticoagulation control continued to have a TTR ≥70%. Moreover, the risk of stroke/SE or major bleeding in these patients was not significantly different from that of patients with prior suboptimal anticoagulation control. The outcome of patients with TTR ≥70% became significantly better than that of patients with TTR <70% only if changes in TTR during follow-up were taken into consideration, allowing patients to switch TTR group after the first 6 months. Factors predicting a low TTR value (<70%) during follow-up were: female sex, age <60 years, alcohol abuse, heart failure, and peripheral artery disease.
These results are interesting and have several important clinical implications. First, they show that the optimal management of VKAs for stroke prevention is still a challenge in daily clinical practice. Second, they confirm recently published reports (8,9) indicating that INR instability is very frequent during the follow-up of AF patients on VKAs with previously good anticoagulation control. Third, they demonstrate for the first time that a TTR ≥70% during the first 6 months of VKA therapy is not associated with a better prognosis in terms of risk of stroke/SE or major bleeding. Thus, the authors conclude that basing the decision not to switch from VKAs to a DOAC only on TTR values, as recommended by current guidelines (1), is questionable.
How, then, can we justify the apparent discrepancy between the results of the study by Bonde et al. (10) and the common belief concerning the usefulness of TTR? A possible explanation is implicit in the Rosendaal method used to calculate the TTR, which assumes that INR values change linearly between 2 successive checks (4). However, in real life, when INR is <2 or >3, it usually takes only a few days to return to the therapeutic range after adequate VKA dose adjustment. By contrast, according to the Rosendaal method, it would take many days or even weeks if the frequency of INR checks is too low and/or the time between 2 consecutive INR measurements is too long (11). Consequently, the calculated TTR will be falsely low, becoming an unreliable tool for monitoring the quality of anticoagulation. In the study by Bonde et al. (10), the number of INR measurements was sufficiently high (a median of 11), but the interval between 2 successive INR checks was sometimes as long as 50 to 90 days, and this could have influenced the results.
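The magnitude of this interpolation bias is easy to quantify with a hypothetical example: suppose a patient's INR is 1.5 at one check and 2.5 at the next, 60 days later. The linear-interpolation assumption of the Rosendaal method scores half of that interval as sub-therapeutic, even if, as the text notes, the real INR returned to range within days of a dose adjustment.

```python
# Hypothetical checks 60 days apart; therapeutic range is 2.0 to 3.0.
i0, i1, gap_days = 1.5, 2.5, 60

# Day on which the linearly interpolated INR crosses 2.0:
crossing = (2.0 - i0) / (i1 - i0) * gap_days

# All days before the crossing are scored as out of range.
print(crossing)  # → 30.0 of the 60 days counted sub-therapeutic
```

With checks 10 days apart instead, the same interpolation would attribute only 5 out-of-range days, which is why long intervals between measurements can make the calculated TTR falsely low.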
Despite the importance of good anticoagulation control for patients receiving warfarin, few clinical parameters are available to predict TTR in an individual patient. Recently, a score that incorporates some of these parameters, the SAMe-TT2R2 score, has been developed and implemented in clinical practice (12). This score has shown good discriminatory performance and has been recommended by current guidelines in order to select those patients who are most likely to achieve and maintain high TTR values (1). Although we do not know whether the use of this score would have changed the results reported by Bonde et al. (10), it is reasonable to advocate its implementation in the daily routine, with the aim of improving the anticoagulation control and clinical outcomes of patients taking VKAs.
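For readers unfamiliar with it, the SAMe-TT2R2 score is a simple additive score. The sketch below uses the components as commonly described in the literature (sex, age, medical history, treatment, tobacco use, race); the cutoffs and weights are reproduced from memory and should be checked against the original publication before any clinical use. The example patient is hypothetical.

```python
def same_tt2r2(female, age_under_60, two_or_more_comorbidities,
               interacting_drugs, tobacco_within_2_years, non_caucasian):
    """SAMe-TT2R2 score, components as commonly described:
    Sex (female) = 1; Age (<60 years) = 1;
    Medical history (>=2 comorbidities) = 1;
    Treatment (interacting drugs, e.g., amiodarone) = 1;
    Tobacco use (within 2 years) = 2; Race (non-Caucasian) = 2.
    """
    return (int(female) + int(age_under_60)
            + int(two_or_more_comorbidities) + int(interacting_drugs)
            + 2 * int(tobacco_within_2_years) + 2 * int(non_caucasian))

# A score of 0-1 is generally interpreted as predicting good anticoagulation
# control (high TTR) on a VKA; a score >=2 suggests a DOAC may be preferable.
print(same_tt2r2(female=True, age_under_60=True,
                 two_or_more_comorbidities=False, interacting_drugs=False,
                 tobacco_within_2_years=True, non_caucasian=False))  # → 4
```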
In conclusion, should we abandon the common practice of using a high TTR value to identify patients with AF who will do well on VKAs and do not need to switch to a DOAC, as recommended by current guidelines? The paper by Bonde et al. seems to suggest that we should. However, it is likely that better patient education, more frequent INR monitoring, prompt VKA dose adjustment, and the accurate selection of appropriate candidates for VKA therapy on the basis of TTR predictors would yield a better outcome than that found in the Danish study. I therefore think that such efforts should be made before a final decision is taken on this issue. It is indisputable that restricting the use of DOACs only to patients who are really poor candidates for warfarin would markedly improve the cost-effectiveness and the economic sustainability of these drugs.
∗ Editorials published in the Journal of the American College of Cardiology reflect the views of the authors and do not necessarily represent the views of JACC or the American College of Cardiology.
Dr. Raviele has reported that he has no relationships relevant to the contents of this paper to disclose.
- 2018 American College of Cardiology Foundation