Fixed dose versus 3-day loading dose warfarin initiation in atrial fibrillation: effects on INR stabilization and time in the therapeutic range

Statement of key findings

After adjusting for clinical and demographic confounders, this retrospective multiethnic cohort study revealed no significant difference in long-term anticoagulation quality between fixed-dose and 3-day loading-dose warfarin initiation strategies. Neither the TTR at 12 months nor the time to stable INR differed significantly between the groups. Our observed TTR (45.0% to 65.7%) was comparable to that in Brazil (64.3%) [36] yet notably lower than that in China (74.0% to 78.9%) [30].

This finding reinforces that patient-specific factors, particularly genetic polymorphisms in CYP2C9, VKORC1, and CYP4F2, often influence anticoagulation control more than the initiation protocol choice [18]. Indeed, machine learning models have identified these genetic variants as key predictors of stable warfarin dose and anticoagulation status [37].

However, while genetic profiling can enhance individualized warfarin dosing, its routine application remains limited in many settings by cost and infrastructure constraints [37, 38]. Furthermore, meta-analyses have yet to demonstrate a consistent, significant benefit of genotype-guided dosing over well-managed clinical dosing for primary clinical outcomes. A practical and equitable way to achieve safe, effective anticoagulation is therefore to optimize initiation strategies based on clinical factors and early INR responses. Our results underline the need for tailored, context-appropriate approaches that respect the ideals of precision medicine without depending on genetic testing.

Comparison of findings with other studies

Genotype-guided dosing trials, such as the European Pharmacogenetics of Anticoagulant Therapy (EU-PACT) and the Clarification of Optimal Anticoagulation through Genetics (COAG) trials, have shown mixed improvements in anticoagulation control. In EU-PACT, the TTR was 60% to 67% within 12 weeks, whereas COAG reported ~ 45% within 4 weeks [39, 40]. Our cohort achieved a comparable TTR of 45.0% to 65.7% over 3 to 12 months. Our time to the first therapeutic INR (14–30 days) was likewise comparable to EU-PACT (21–29 days), but complete stabilization was delayed, requiring 94 to 104 days. These comparisons demonstrate that our non-genomic, real-world cohort achieved outcomes similar to those of landmark genomic trials.
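The TTR figures quoted here are conventionally derived with the Rosendaal linear-interpolation method, which assumes the INR changes linearly between consecutive measurements and credits each inter-visit interval with the fraction of time the interpolated INR spends in range. A minimal sketch (function and variable names are illustrative, not taken from the study):

```python
def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Estimate time in therapeutic range (%) by Rosendaal linear interpolation.

    days -- visit times in days, strictly increasing
    inrs -- INR reading at each visit
    """
    in_range_days = 0.0
    total_days = 0.0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        total_days += span
        if i0 == i1:
            # Flat segment: either entirely in range or entirely out.
            in_range_days += span if low <= i0 <= high else 0.0
            continue
        # Times (as fractions of the interval) at which the interpolated
        # INR crosses the lower and upper therapeutic limits.
        lo_t = (low - i0) / (i1 - i0)
        hi_t = (high - i0) / (i1 - i0)
        t0, t1 = sorted((lo_t, hi_t))
        # Clip the in-range window to the interval [0, 1].
        frac = max(0.0, min(1.0, t1) - max(0.0, t0))
        in_range_days += frac * span
    return 100.0 * in_range_days / total_days if total_days else 0.0
```

For example, a patient whose INR rises linearly from 1.5 to 3.5 over ten days spends half that interval between 2.0 and 3.0, giving a TTR of 50%.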

The generalizability of these trials is further limited, as they were conducted mainly in homogeneous populations with established genetic testing infrastructure, a context that does not reflect most LMICs. In our setting, routine genotyping is not feasible due to cost and logistical constraints. This, in turn, necessitates reliance on clinical prediction tools, and recent evidence suggests that clinical algorithms incorporating early INR response can achieve comparable outcomes [21]. Nonetheless, these tools present their own "one-size-fits-all" challenges; a prime example is the sex, age, medical history, treatment, tobacco use, and race (SAMe-TT2R2) score, which has reduced utility in diverse populations, as it designates "Asian" as a risk factor, making it less precise for multiethnic settings [41, 42].
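The "one-size-fits-all" concern with SAMe-TT2R2 is easier to see when the score is written out. The sketch below tallies the commonly published point assignments; the function name and input flags are illustrative, and the race component is the one that penalizes all non-Caucasian (including Asian) patients uniformly:

```python
def same_tt2r2(female, age, comorbidity_count, interacting_drugs,
               tobacco_within_2y, non_caucasian):
    """Tally the SAMe-TT2R2 score; scores > 2 predict poorer INR control."""
    score = 0
    score += 1 if female else 0                 # S: female sex
    score += 1 if age < 60 else 0               # A: age < 60 years
    score += 1 if comorbidity_count >= 2 else 0 # Me: >= 2 comorbidities
    score += 1 if interacting_drugs else 0      # T: interacting drugs (e.g., amiodarone)
    score += 2 if tobacco_within_2y else 0      # T2: tobacco use within 2 years
    score += 2 if non_caucasian else 0          # R2: non-Caucasian race
    return score
```

Because the R2 component alone contributes 2 points, a multiethnic Asian cohort starts near the conventional > 2 threshold regardless of the other, modifiable factors, which illustrates the reduced discrimination noted above.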

Separate from the challenges of prediction is the inherent difficulty of sustaining long-term anticoagulation control. Even after achieving an initial stable dose, many patients fail to maintain a high TTR over time, highlighting the dynamic nature of warfarin therapy [5, 7]. This finding reinforces the need for initiation strategies that are rapid and durable. Therefore, our study addresses these pragmatic challenges by assessing the time to initial INR stabilization and the 12-month TTR. This dual assessment evaluates the real-world effectiveness of these initiation strategies, a topic of clinical interest due to challenges in optimizing warfarin initiation [8].

Secondary key findings

Although our unadjusted analysis revealed that the loading-dose strategy achieved the first therapeutic INR faster [43, 44], this advantage did not translate into better long-term TTR or faster stabilization after adjustment, and the short-term gain carries a known risk of supratherapeutic INRs [25]. Consistent with the quartile analysis (Electronic Supplementary Table 2), our results further demonstrate that the time to stable anticoagulation is the strongest predictor of long-term success: the TTR was consistently high in the fastest-stabilizing quartile (~ 80.6–82.0%) but fell markedly in the slowest quartile (~ 27.2–47.3%). This marked gradient highlights that early INR stability, rather than the initial loading strategy, is the dominant factor in sustaining therapeutic control over time.

Baseline characteristics likely explain the stronger inverse correlation in the fixed-dose group. These patients were older and had more comorbidities, particularly CAD, making delayed stabilization a stronger predictor of poor long-term anticoagulation. Because fixed-dose regimens achieve therapeutic levels more gradually, prolonged stabilization exerts a greater adverse effect on TTR than in the 3-day loading group, which explains the more distinct correlation. In the overall study population, bridging was uncommon because most patients had NVAF, for which bridging is not routinely recommended. Indeed, achieving optimal anticoagulation control in such patients is a recognized challenge in local clinical practice [45].

Another consideration is the standardization of dose adjustment. While most of our cohort was initiated and followed under the WMTAC protocol, which ensures standardized follow-up, we acknowledge that a minority of patients were managed during the early phase by various attending physicians (e.g., during hospital admission). This reflects the true clinical journey of patients in a real-world setting, and long-term follow-up under WMTAC care likely minimizes any lasting impact on outcomes. This is supported by evidence that such a structured approach improves anticoagulation control over usual care, likely mitigating any initial management differences [29].

The prolonged INR stabilization in our cohort reflects real-world patient-management challenges rather than routine follow-up delays. Patients were managed under a national protocol mandating monitoring at least once a week during initiation. Unlike in a randomized controlled trial (RCT), real-world effectiveness is influenced by factors such as patient adherence and comorbidities [45]. Our delayed-stabilization findings therefore accurately reflect the clinical journey of patients initiating warfarin.

Clinical implications and future directions

Our findings underscore that achieving rapid INR stabilization matters more than the choice of initiation protocol. In multiethnic settings without routine genetic testing, resources are better directed toward frequent early monitoring and comprehensive medication education, particularly in pharmacist-led clinics where guidance on adherence, diet, and drug interactions can strengthen patient-provider collaboration [29, 30, 36, 45]. This approach is essential for improving real-world TTR and warrants immediate quality improvement efforts.

While supratherapeutic INR levels are clinically relevant due to their bleeding risk, our analysis focused on surrogate outcomes of anticoagulation quality (INR stabilization and TTR). Prospective studies are urgently required to develop and validate clinical and genetic-guided warfarin dosing algorithms tailored to the unique genetic diversity of Southeast Asian populations.

Strengths and limitations

The primary strength of this study lies in its novel comparison of warfarin initiation strategies in a large, real-world, multiethnic Southeast Asian cohort, thereby addressing a key regional evidence gap. The main limitation is the retrospective design, which is subject to selection bias; clinicians tended to prescribe fixed doses for higher-risk patients [23,24,25] and loading doses for post-surgical patients [46]. While we adjusted for confounders using GLM, this inherent bias cannot be eliminated. Other limitations include the potential for limited generalizability, as the data were derived from only two regional hospitals, and the possibility of residual confounding from unmeasured variables such as diet, adherence, and pharmacogenomics. In addition, although key interacting drugs, such as amiodarone, were adjusted for, we could not fully assess the impact of other potent CYP450 inducers and inhibitors during the 12-month follow-up due to their low prevalence (< 3%); their overall influence on TTR in this cohort is therefore likely to be minimal.
