Warfarin is effective in preventing thromboembolic events, but concerns exist regarding its use in patients with substance abuse.
Identify which patients with substance abuse receiving warfarin are at risk for poor outcomes.
Retrospective cohort study in which diagnostic codes, laboratory values, and other factors were examined to identify risk of adverse outcomes.
Veterans AffaiRs Study to Improve Anticoagulation (VARIA) database of 103,897 patients receiving warfarin across 100 sites.
Outcomes included percent time in therapeutic range (TTR), a measure of anticoagulation control, and major hemorrhagic events by ICD-9 codes.
Nonusers had a higher mean TTR (62 %) than those abusing alcohol (53 %), drugs (50 %), or both (44 %; p < 0.001). Among alcohol abusers, an increasing ratio of the serum hepatic transaminases aspartate aminotransferase to alanine aminotransferase (AST:ALT) correlated with inferior anticoagulation control: a normal ratio (AST:ALT ≤ 1.5) predicted a relatively modest decline in TTR (54 %, p < 0.001), while elevated ratios (AST:ALT 1.5–2.0 and > 2.0) predicted progressively poorer anticoagulation control (49 % and 44 %, p < 0.001 compared to nonusers). The age-adjusted hazard ratio for major hemorrhage was 1.93 for drug abuse and 1.37 for alcohol abuse (p < 0.001 compared to nonusers) and remained significant after also controlling for anticoagulation control and other bleeding risk factors (HR 1.69, p < 0.001 and HR 1.22, p = 0.003, respectively). Among alcohol abusers, an elevated ratio (AST:ALT > 2.0) corresponded to more than three times the rate of major hemorrhage (HR 3.02, p < 0.001 compared to nonusers), while a normal ratio (AST:ALT ≤ 1.5) predicted a rate similar to that of nonusers (HR 1.19, p < 0.05).
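The reported AST:ALT cut-offs amount to a simple three-tier stratification. A minimal sketch of that rule follows; the function name and tier labels are illustrative (not from the study), and only the cut-off values (≤ 1.5, 1.5–2.0, > 2.0) come from the results above:

```python
def ast_alt_tier(ast: float, alt: float) -> str:
    """Illustrative helper: classify an AST:ALT ratio into the
    three risk tiers reported in the study's results.

    Cut-offs (<= 1.5, 1.5-2.0, > 2.0) are taken from the abstract;
    the tier labels are hypothetical, chosen here for readability.
    """
    if alt <= 0:
        raise ValueError("ALT must be positive")
    ratio = ast / alt
    if ratio <= 1.5:
        # normal ratio: modest TTR decline, hemorrhage rate near nonusers
        return "normal"
    elif ratio <= 2.0:
        # moderately elevated ratio: poorer anticoagulation control
        return "elevated"
    else:
        # markedly elevated ratio: poorest control, highest hemorrhage risk
        return "high"
```

For example, `ast_alt_tier(70.0, 30.0)` (ratio ≈ 2.3) falls in the highest-risk tier, which the study associates with HR 3.02 for major hemorrhage.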
Anticoagulation control is particularly poor in patients with substance abuse. Major hemorrhages are more common among both alcohol and drug abusers. Among alcohol abusers, the AST:ALT ratio holds promise for identifying those at highest risk for adverse events.