National Guidelines Organization

Stroke risk | CCS [12] | ESC [11] | AHA/ACC/HRS [14] | ACCP [13] |
---|---|---|---|---|
High | Age > 65, or any CHADS2 risk factor: OAC | CHA2DS2-VASc ≥ 1: OAC | CHA2DS2-VASc ≥ 2: OAC | CHADS2 ≥ 1: OAC |
Low | Age < 65, no CHADS2 risk factor, but vascular disease: ASA | CHA2DS2-VASc ≥ 1: OAC | CHA2DS2-VASc = 1: OAC or ASA or no antithrombotic | CHADS2 ≥ 1: OAC |
Very low | Age < 65, no CHADS2 risk factor, no vascular disease: No antithrombotic | CHA2DS2-VASc = 0: No antithrombotic | CHA2DS2-VASc = 0: No antithrombotic | CHADS2 = 0: No antithrombotic |
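The two risk schemes referenced in the table can be sketched in code. This is a minimal illustration only (the class and field names are our own, not from any guideline); the point assignments are the standard published ones:

```python
from dataclasses import dataclass

@dataclass
class AFPatient:
    """Illustrative patient record; field names are our own."""
    chf: bool = False               # congestive heart failure / LV dysfunction
    hypertension: bool = False
    age: int = 0
    diabetes: bool = False
    stroke_or_tia: bool = False     # prior stroke, TIA, or thromboembolism
    vascular_disease: bool = False  # prior MI, PAD, or aortic plaque
    female: bool = False

def chads2(p: AFPatient) -> int:
    # 1 point each for CHF, hypertension, age >= 75, diabetes;
    # 2 points for prior stroke/TIA. Range 0-6.
    return (p.chf + p.hypertension + (p.age >= 75) + p.diabetes
            + 2 * p.stroke_or_tia)

def cha2ds2_vasc(p: AFPatient) -> int:
    # Adds vascular disease, age 65-74 (1 point; age >= 75 scores 2),
    # and female sex to the CHADS2 factors. Range 0-9.
    age_points = 2 if p.age >= 75 else (1 if p.age >= 65 else 0)
    return (p.chf + p.hypertension + age_points + p.diabetes
            + 2 * p.stroke_or_tia + p.vascular_disease + p.female)
```

For example, a 70-year-old woman with hypertension scores CHADS2 = 1 but CHA2DS2-VASc = 3, illustrating how the thresholds in the table can classify the same patient differently.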
The efficacy of antithrombotic therapy in preventing ischemic stroke must be balanced against the risk of major hemorrhage. Bleeding risk in a patient receiving anticoagulant therapy may be predicted using the HAS-BLED schema [15]. The score allows clinicians to estimate an individual patient's risk of major bleeding, ranging from about 1 % (score 0–1) to 12.5 % (score 5). Applying a bleeding-risk schema ensures that important risk factors are systematically considered and allows estimation of the relative risks of stroke vs. major bleeding with various antithrombotic therapies. As many as 70 % of strokes with AF are either fatal or leave severe residual deficits, whereas major bleeding is less often fatal, is less likely to leave significant residual effects in survivors, and tends to be rated by patients as less concerning than stroke. Many of the factors that determine stroke risk are also predictors of bleeding, but stroke risks usually exceed those of major bleeding. Patients at increased risk of major bleeding warrant extra caution and closer monitoring of antithrombotic therapy. Only when the stroke risk is low and the bleeding risk particularly high (e.g., a young patient with AF and few or no stroke-risk factors but a high risk of major hemorrhage from malignancy, prior major hemorrhage, or participation in contact sports) does the risk:benefit ratio favor no antithrombotic therapy. Patient preferences are of great importance in deciding on antithrombotic therapy in relation to its benefits and risks.
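As a sketch of how the HAS-BLED schema [15] is tallied (the function and parameter names are ours; this is illustrative, not clinical software):

```python
def has_bled(hypertension=False, abnormal_renal=False, abnormal_liver=False,
             stroke=False, bleeding_history=False, labile_inr=False,
             elderly=False, drugs=False, alcohol=False) -> int:
    """One point per factor present; renal and liver dysfunction, and
    drug and alcohol use, are each scored separately (maximum 9)."""
    return sum([hypertension, abnormal_renal, abnormal_liver, stroke,
                bleeding_history, labile_inr, elderly, drugs, alcohol])
```

As quoted above, the annual major-bleeding risk rises from roughly 1 % at scores 0–1 to about 12.5 % at a score of 5.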
For the VKAs, bleeding risk depends upon the INR, the quality of monitoring, the duration of therapy (risk is higher during the initial few weeks) and the stability of dietary and other factors that may alter VKA potency. Bleeding risk is likely higher in clinical practice than in the rigorous setting of a clinical trial or a dedicated, expert anticoagulation service.
Vitamin K Antagonist Pharmacology and Therapeutic Challenges
All VKAs exert their anticoagulant effects by interfering with the hepatic synthesis of the coagulation proteins factors II, VII, IX, and X [16]. Precursors of these proteins are synthesized in the liver and must undergo carboxylation to yield the active coagulation factors. The carboxylation is catalyzed by reduced vitamin K, which is converted to oxidized vitamin K in the process and then regenerated by enzymatic reduction. The VKAs interfere with the synthesis of coagulation factors by decreasing the regeneration of reduced vitamin K. The ultimate suppression of the coagulation factors by VKA administration depends upon this complex series of steps, and the effect of a given dose is highly variable from one patient to another and may vary widely within a given patient. Hence, achieving the potential efficacy of VKA therapy for prevention of stroke/systemic embolism with acceptable rates of major bleeding is challenging for both patients and their doctors [16]. Warfarin is the most widely used VKA in North America, but other available VKAs include acenocoumarol, phenprocoumon, and fluindione, each of which has its own intrinsic and extrinsically influenced pharmacodynamic and pharmacokinetic characteristics. Discussions of VKAs in this chapter will henceforth refer only to warfarin unless specifically stated otherwise.
Warfarin is absorbed relatively quickly and completely, but because its action depends upon blocking the synthesis of specific coagulation factors, the onset of the anticoagulant effect depends upon the individual half-lives of these coagulation proteins, and up to 5 days are required before a steady-state anticoagulant effect occurs. The return to normal coagulation on stopping warfarin depends on both the elimination half-life of warfarin (36–42 h) and the resumed synthesis and steady-state levels of the affected coagulation proteins, which requires about 5 days (Table 2.2). The degree of INR prolongation produced by a given dose of warfarin is unpredictable, and may vary widely even within a given patient, because of numerous factors affecting warfarin pharmacokinetics and pharmacodynamics [16]. Genetic variations in the enzymes responsible for warfarin metabolism and vitamin K cycling can cause several-fold increases or decreases in sensitivity to a given warfarin dose. The hepatic metabolism of warfarin may be slowed by several allelic variations in the CYP-450 enzyme system, reducing the warfarin requirement and possibly resulting in bleeding complications at relatively low doses. Mutations of the gene coding for the vitamin K epoxide reductase (VKOR) enzyme may result in widely varying sensitivity to the inhibitory effect of warfarin and may cause marked warfarin resistance. These genetic variations are unpredictable and, although genetic testing can reveal some of them, the use of such tests has generally not improved the efficacy, safety, or cost-effectiveness of warfarin therapy [17]. Many drugs can influence the absorption, metabolism, or clearance of warfarin and of vitamin K, resulting in increased or decreased sensitivity to a given dose [18].
A variety of foods and dietary supplements may influence warfarin effects, as may dietary vitamin K content and several disease states, including hepatic and renal failure. The INR affords an excellent measure of the likely efficacy and safety of warfarin. However, even in clinical trials, achieving therapeutic-range INRs more than 65 % of the time is infrequent, and in clinical practice the figure is commonly 50 % or less [19]. Time in the therapeutic range (TTR) of INR 2–3 is closely related to the risk of stroke among patients prescribed warfarin [20], and even after a therapeutic dose of warfarin has been established, patients require monthly determination of the INR. Not surprisingly, both patients and physicians find warfarin treatment challenging, and registries in Europe and the United States have generally documented rather low rates of initiation and adherence among patients with clear indications for warfarin [21].
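TTR is commonly computed by linear interpolation between successive INR measurements (the Rosendaal method). A minimal sketch of that calculation (our own illustration; the function name and interface are assumptions, not taken from the cited studies):

```python
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Percent of time in the therapeutic range, assuming the INR
    changes linearly between successive measurement days."""
    in_range = 0.0
    total = 0.0
    for d0, i0, d1, i1 in zip(days, inrs, days[1:], inrs[1:]):
        span = d1 - d0
        total += span
        if i0 == i1:                        # flat segment
            in_range += span if low <= i0 <= high else 0.0
            continue
        lo, hi = min(i0, i1), max(i0, i1)   # INR interval traversed
        # fraction of the linear segment lying inside [low, high]
        overlap = max(0.0, min(hi, high) - max(lo, low))
        in_range += span * overlap / (hi - lo)
    return 100.0 * in_range / total
```

For instance, INRs of 1.5 and 3.5 measured 10 days apart contribute a TTR of 50 % for that interval, since half of the interpolated INR trajectory lies between 2 and 3.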
Table 2.2
Clinical pharmacology of warfarin and the novel oral anticoagulants
Feature | Warfarin [16] | Dabigatran [23] | Rivaroxaban [24] | Apixaban [25] | Edoxaban [26] |
---|---|---|---|---|---|
Mechanism | Inhibits synthesis II, VII, IX, X | Direct IIa inhibitor | Direct Xa inhibitor | Direct Xa inhibitor | Direct Xa inhibitor |
Pro-drug | No | Yes | No | No | No |
Route | Oral | Oral | Oral | Oral | Oral |
Dose frequency | od | bid | od | bid | od |
AF dose (mg) | INR 2–3 | 150, 110, 75 | 20, 15 | 5, 2.5 | 60, 30 |
Food effect | Yes | Delays Tmax; no ↓ bioavail; take with or without food | ↑Bioavail; take with food | No; take with or without food | No; take with or without food |
Food interaction | Many | No | No | No | No |
Bioavailability (%) | 98 | 6.5 | 100 with food | 50 | 50 |
T max (hr) | 72–120 | 0.5–2 | 2–4 | 3–4 | 2–3 |
T 1/2 (hr) | 20–60 | 11–17 | 5–13 | 5–13 | 9–11 |
Substrate CYP | 2C9, 3A4 | No | 3A4, 2J2 | 3A4 | 3A4 |
Inhibitor | | | Ketoconazole, Ritonavir | Ketoconazole, Ritonavir | Ketoconazole?, Ritonavir? |
Inducer | | | Rifampin | Rifampin | Rifampin |
Substrate P-gp | No | Yes | Yes | Yes | Yes |
Inhibitor | | Ketoconazole, Dronedarone | Ketoconazole, Ritonavir, Dronedarone? | Ketoconazole, Ritonavir, Dronedarone? | Ketoconazole, Ritonavir, Dronedarone |
Inducer | | Carbamazepine, St J’s W | Carbamazepine, St J’s W | Carbamazepine, St J’s W | Carbamazepine, St J’s W |
Renal clearance (%) | No | 85 | 33 | 27 | 35 |
Protein bound (%) | 99 | 35 | 90–95 | 87–93 | 54 |
Monitoring | INR | No | No | No | No |
The Development and Clinical Evaluation of the New Oral Anticoagulants
The New Oral Anticoagulants (NOACs) were designed to overcome some of the limitations of warfarin. The crystal structure of thrombin was reported in 1989 and that of activated factor X in 1992 [22]. Intensive laboratory endeavors culminated in the development and clinical evaluation of the direct thrombin inhibitor dabigatran etexilate and the Xa inhibitors rivaroxaban, apixaban, edoxaban, and betrixaban. All but betrixaban proceeded to phase III studies, initially in venous thromboembolism and then in AF. These agents (Table 2.2) [23–26] are all rapidly absorbed following oral intake and reach steady-state anticoagulation quickly because they directly inhibit preformed factor IIa or Xa. After discontinuation, their anticoagulant effects diminish quickly because of short serum and receptor-inhibition half-lives. Their absorption is largely unaffected by food or other medications, and their pharmacokinetics are affected by few agents, although drugs that inhibit or induce selected CYP enzymes or P-gp can alter plasma concentrations of the NOACs (Table 2.2). Dabigatran is not metabolized by the hepatic cytochrome P450 system, whereas the Xa inhibitors are, and their anticoagulant effects will be enhanced by strong inhibitors and reduced by strong inducers of CYP 3A4. All NOACs are substrates for the P-gp system; accordingly, their anticoagulant effects will be enhanced by strong P-gp inhibitors and reduced by strong P-gp inducers. The active drugs are excreted renally to varying extents; severe renal dysfunction must be taken into account in dose selection, in conversion from a NOAC to warfarin, and in drug interruptions for invasive procedures. Most of the NOACs are extensively protein bound, although dabigatran is not and is dialyzable. Anticoagulation monitoring is not required and dose recommendations vary little among patients, although lower doses of most NOACs are indicated for patients with reduced renal function, advanced age, or low body weight.
The principal drawbacks to the clinical use of these agents are that there is no readily available assay for assessing anticoagulant effect and no specific antidotes are yet available. Intensive investigation is currently focused on addressing these concerns. Four large RCTs have been conducted, each comparing one of the NOACs to warfarin among patients with nonvalvular AF (Table 2.3).
Table 2.3
Selected outcomes from the four major RCTs of a NOAC vs. warfarin among patients with nonvalvular atrial fibrillation
Dabigatran [23] is approved in Canada, the United States, and Europe for the prevention of SSE in AF and AFL, for the prevention of venous thromboembolic events (VTE) (deep venous thrombosis [DVT] and pulmonary embolism [PE]) among patients undergoing hip or knee replacement and for the treatment of VTE and prevention of recurrent DVT and PE. The approvals for AF were based on the results of the RE-LY trial [27], which randomized 18,113 AF patients (mean CHADS2 2.1) to dabigatran (110 mg vs. 150 mg twice daily, double-blind) or open-label warfarin. The principal outcome of SSE occurred at annual rates of 1.69 % (warfarin), 1.53 % (dabigatran 110 mg) (RR vs. warfarin 0.91; 95 % confidence interval [CI], 0.74–1.11) and 1.11 % (dabigatran 150 mg) (RR vs. warfarin 0.66; CI 0.53–0.82; P < 0.001) (Table 2.3). The annual rates of major bleeding were 3.36 % (warfarin), 2.71 % (dabigatran 110 mg) (RR vs. warfarin 0.80, P = 0.003) and 3.11 % (dabigatran 150 mg) (RR vs. warfarin 0.93, P = 0.31). The rates of major bleeding on warfarin were substantially greater than the mean 1.3 %/year observed in the earlier RCTs of warfarin vs. control [9], perhaps in part because the mean age had risen from 69 [9] to >71 [27], and it is likely that bleeding was more assiduously documented in the more recent trials. The phase III trials of the other NOACs also observed higher rates of major bleeding in the warfarin arm than had been documented in the earlier trials of warfarin vs. control [28, 29]. In RE-LY, intracranial bleeding and hemorrhagic stroke were both significantly less frequent with dabigatran 110 mg (respective HRs vs. warfarin 0.31 and 0.31) and dabigatran 150 mg (respective HRs vs. warfarin 0.40 and 0.26) than with warfarin. The annual rates of the outcome of “net clinical benefit” (composite of SSE, pulmonary embolism, MI, death, or major bleeding) were 7.64 % (warfarin), 7.09 % (dabigatran 110 mg) (RR vs. warfarin 0.92; 0.84–1.02) and 6.91 % (dabigatran 150 mg) (RR vs. warfarin 0.91; 0.82–1.00).
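The headline RE-LY efficacy figures can be re-expressed as absolute effects. A small arithmetic sketch using only the annual SSE rates quoted above (variable names are ours):

```python
# Annual SSE rates (%/year) from RE-LY, as quoted in the text
warfarin = 1.69
dabigatran_150 = 1.11

relative_risk = dabigatran_150 / warfarin          # reported RR ~0.66
absolute_reduction = warfarin - dabigatran_150     # events prevented per 100 patient-years
patient_years_per_event = 100.0 / absolute_reduction
```

This reproduces the reported RR of 0.66 and implies roughly 0.58 fewer events per 100 patient-years, i.e., on the order of 170 patient-years of dabigatran 150 mg treatment per stroke/systemic embolus prevented, with the usual caveats about extrapolating trial event rates.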
Rivaroxaban [24] is approved in Canada, the United States, and Europe for the prevention of SSE in AF/AFL, for the prevention of VTE (DVT and PE) among patients undergoing hip or knee replacement and the treatment of VTE and prevention of recurrent DVT and PE. The AF approvals were based on the ROCKET-AF trial [28], which randomized 14,264 AF patients (mean CHADS2 3.5) to rivaroxaban 20 mg once daily (15 mg once daily when CrCl was 30–49 mL/min) or warfarin. The primary analysis was a per-protocol non-inferiority comparison of rivaroxaban and warfarin for the principal outcome of SSE, which occurred at annual rates of 1.7 % (rivaroxaban) vs. 2.2 % (warfarin) (RR 0.79; 0.66–0.96, P < 0.001 for non-inferiority) (Table 2.3). In a secondary, intention-to-treat analysis, the respective rates were 2.1 % vs. 2.4 % (RR 0.88; 0.75–1.03; P = 0.12 for superiority). Major bleeding occurred at annual rates of 3.6 % (rivaroxaban) vs. 3.4 % (warfarin) (RR 1.04). There was significantly less hemorrhagic stroke with rivaroxaban (HR vs. warfarin 0.67). No net clinical benefit data were reported.
Apixaban [25] is approved in Canada, the United States, and Europe for the prevention of SSE in AF/AFL, for the prevention of VTE (DVT and PE) among patients undergoing hip or knee replacement and the treatment of VTE and prevention of recurrent DVT and PE. The approvals for AF were based on the results of the ARISTOTLE trial [29], which randomized 18,201 AF patients (mean CHADS2 2.1), double-blind, to apixaban 5 mg twice daily (2.5 mg twice daily for patients with 2 or more of age ≥80 years, weight ≤60 kg, or serum creatinine ≥133 μmol/L) or to warfarin. The principal outcome of SSE occurred at annual rates of 1.27 % (apixaban) vs. 1.60 % (warfarin) (RR 0.79; 0.66–0.95; P < 0.01 for superiority) (Table 2.3). Major bleeding occurred at annual rates of 2.13 % (apixaban) vs. 3.09 % (warfarin) (RR 0.69, P < 0.001). There were statistically significant reductions in intracranial bleeding (HR vs. warfarin 0.42) and hemorrhagic stroke (HR 0.51). The outcome of net clinical benefit (composite of SSE, major bleeding and all-cause mortality) occurred at annual rates of 3.17 % (apixaban) vs. 4.11 % (warfarin) (RR 0.85; 0.78–0.92, P < 0.001).
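The ARISTOTLE dose-reduction rule described above is simple to express programmatically. This is a sketch of the trial criterion only, not prescribing guidance (the function name is ours):

```python
def aristotle_apixaban_dose(age_years: int, weight_kg: float,
                            creatinine_umol_per_l: float) -> float:
    """Return the apixaban dose (mg twice daily) per the ARISTOTLE
    criterion [29]: 2.5 mg bid when at least two of age >= 80 years,
    weight <= 60 kg, or serum creatinine >= 133 umol/L apply;
    otherwise 5 mg bid."""
    n_criteria = ((age_years >= 80) + (weight_kg <= 60)
                  + (creatinine_umol_per_l >= 133))
    return 2.5 if n_criteria >= 2 else 5.0
```

For example, an 82-year-old weighing 58 kg with normal creatinine meets two of the three criteria and would have received the reduced dose in the trial.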
Apixaban was also compared to aspirin in the Apixaban vs. Acetylsalicylic Acid to Prevent Strokes (AVERROES) trial [30], in which 5590 AF patients (mean CHADS2 = 2.0) considered unsuitable for warfarin therapy were randomized double-blind to apixaban 5 mg twice daily (2.5 mg twice daily in selected patients) or to aspirin (81–324 mg/day) and followed for a median of 1.1 years. The trial was stopped early because of marked outcome differences. The rates of the principal outcome (SSE) were 1.6 %/year with apixaban vs. 3.7 %/year with aspirin (RR vs. aspirin 0.45; 0.32–0.62; P < 0.001). The rates of major bleeding were 1.4 %/year with apixaban vs. 1.2 %/year with aspirin (RR 1.13, P = 0.57), with no significant differences in intracranial or GI bleeding.
Edoxaban is approved in Japan for the prevention of SSE in AF/AFL, for the prevention of VTE (DVT and PE) among patients undergoing hip or knee replacement and the treatment of VTE and prevention of recurrent DVT and PE. The US FDA has voted to approve edoxaban for the prevention of SSE in AF/AFL, and US marketing approval is awaited. Approval requests are under consideration in Europe and Canada. The AF data are available from the Effective Anticoagulation with Factor Xa Next Generation in Atrial Fibrillation-Thrombolysis in Myocardial Infarction 48 (ENGAGE AF-TIMI 48) trial [31], which randomized 21,105 patients (mean CHADS2 = 2.8) in a double-blind protocol to edoxaban 30 mg once daily, edoxaban 60 mg once daily, or warfarin. The principal outcome rates (SSE) were 1.61 %/year with edoxaban 30 mg vs. 1.50 % with warfarin (HR vs. warfarin 1.07, P = 0.005 for non-inferiority) and 1.18 % with edoxaban 60 mg (HR vs. warfarin 0.79, P < 0.001 for non-inferiority; HR 0.87, P = 0.08 for superiority) (Table 2.3). Annualized major bleeding rates were 3.43 % with warfarin, 1.61 % with low-dose edoxaban (HR vs. warfarin 0.47, P < 0.001) and 2.75 % with high-dose edoxaban (HR vs. warfarin 0.80, P < 0.001). Intracranial bleeding was significantly less with both low-dose (HR vs. warfarin 0.30) and high-dose edoxaban (HR vs. warfarin 0.47), as was hemorrhagic stroke (HRs vs. warfarin 0.33 and 0.54, respectively). All-cause mortality was significantly less with low-dose edoxaban (HR vs. warfarin 0.87, P = 0.006), with a trend to lower all-cause mortality with high-dose edoxaban (HR 0.92, P = 0.08). Annualized net clinical benefit rates (composite of SSE, major bleeding, or death from any cause) were 8.11 % with warfarin, 6.79 % with low-dose edoxaban (HR vs. warfarin 0.83, P < 0.001) and 7.26 % with high-dose edoxaban (HR vs. warfarin 0.89, P = 0.003).
Choosing Between a VKA and a NOAC
Patient and Physician Convenience
The NOACs were designed to overcome a number of the patient and physician challenges inherent in the use of warfarin. The starting dose of each of the NOACs is much less variable than that of warfarin. For all of the NOACs, the higher of the doses evaluated in the large RCTs is appropriate as a starting dose in most patients, whereas the lower dose may be selected for patients with advanced age, low body weight, or significant renal failure. Absorption and metabolism of the NOACs are generally not influenced by diet, and alterations of the absorption or metabolism of individual NOACs are caused by relatively few drugs, which are generally not in common use and are well described. Coagulation monitoring is not required. The rapid onset and offset of these agents simplifies drug initiation and discontinuation. The comparative simplicity and convenience of a NOAC relative to warfarin appear to result in improved compliance among de novo recipients [32, 33].
Efficacy and Safety (Table 2.3)
In view of the expected greater convenience associated with the NOACs, each of the phase III trials was designed to demonstrate non-inferiority of the new agent compared to warfarin for the efficacy outcome of SSE and the safety outcome of major bleeding. All of the NOACs were found to be non-inferior to warfarin for these outcomes. In addition, dabigatran 150 mg and apixaban were found to be superior to warfarin for the prevention of SSE, while dabigatran 110 mg, apixaban, and edoxaban (both 30 and 60 mg) were found to cause significantly less major bleeding than warfarin. All NOACs caused significantly less hemorrhagic stroke and intracranial hemorrhage than warfarin. The net clinical benefit outcome was significantly better with dabigatran 150 mg, apixaban, and both doses of edoxaban. Although dabigatran is a thrombin inhibitor and rivaroxaban, apixaban, and edoxaban are structurally distinct anti-Xa agents, the overall effects of the NOACs have been estimated in a meta-analysis of the four RCTs [34], with the following findings for the higher-dose regimens vs. warfarin: SSE (RR 0.81, 95 % CI 0.73–0.91, P < 0.0001), major bleeding (RR 0.86, 0.73–1.00, P = 0.06), intracranial hemorrhage (RR 0.48, 0.39–0.59, P < 0.0001), gastrointestinal bleeding (RR 1.25, 1.01–1.55, P = 0.04), and all-cause mortality (RR 0.90, 0.85–0.95, P = 0.0003). Comparison of the lower-dose regimens with warfarin showed similar rates of SSE, significantly less intracranial bleeding, and significantly lower mortality.