Listen to your heart: Anticoagulant use with pharmacogenomics


Although warfarin is a commonly prescribed drug for reducing blood clots, predicting its individual effects can be a tricky business.

Illustration by Sarah Nagorcka


Gene Dosage is a monthly column by Janan Arslan that finds out what genome science is uncovering about each individual's unique response to drugs and pharmaceuticals. Janan is a graduate student and pharmacogenomics researcher with a keen interest in personalised medicine.

Warfarin is an anticoagulant: a drug designed to reduce the formation of blood clots, thereby preventing heart attacks and strokes. It is one of the most frequently used anticoagulants, having been around since the 1950s, and is commonly prescribed for patients with atrial fibrillation, an irregular heartbeat. Warfarin was originally used as a poison for rats and mice, but was later approved as a medication for humans (don’t be too horrified – many medications have humble beginnings before becoming superstars in places we least expected).

Nowadays, warfarin is dispensed 2.4 million times annually in Australia, and about 25 million prescriptions have been dispensed in the United States. Globally, the medication accounts for 0.5–1.5% of all prescriptions annually.

Although warfarin is considered a highly effective anticoagulant, prescribing it can be a tedious process. If prescribed at a higher than required dose, a patient may experience excessive bleeding. Conversely, insufficient doses may have little to no effect on a clot, leading to an increased risk of thromboembolism (where a blood clot blocks a blood vessel).

They hide from the warfarin. Smithsonian's National Zoo/Flickr (CC BY-NC-ND 2.0)

There are several factors that physicians need to take into consideration when prescribing warfarin: patient-specific information (such as age, height and weight), as well as drug-drug interactions, drug-food interactions, and warfarin's narrow therapeutic index (NTI).

A narrow therapeutic index means that the ideal “treatment window” – the range of doses that is both safe and effective – is small. Patients falling outside this window can experience serious side effects: in this case, either excessive bleeding or thromboembolism. Warfarin therapy is monitored regularly using the international normalised ratio (INR), which compares the time the patient's blood takes to clot against a normal reference. Generally, patients with a low INR are at increased risk of thromboembolism, while patients with a high INR are at risk of excessive bleeding. Many factors can explain inter-individual variation in warfarin response, including genetic predisposition.
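For readers who like to see the arithmetic, the relationship between clotting time and the INR, and how the INR maps onto warfarin's treatment window, can be sketched as follows. The target range of 2.0–3.0 is a typical one for atrial fibrillation; the prothrombin times and ISI value below are illustrative numbers, not clinical data.

```python
# Illustrative sketch (not clinical software): deriving the INR and
# mapping it onto warfarin's therapeutic window.
# INR = (patient PT / mean normal PT) ** ISI, where PT is the
# prothrombin (clotting) time and ISI is the International
# Sensitivity Index of the testing reagent.

def inr(patient_pt_seconds, mean_normal_pt_seconds, isi):
    return (patient_pt_seconds / mean_normal_pt_seconds) ** isi

def classify(inr_value, target_low=2.0, target_high=3.0):
    """Typical target range for atrial fibrillation is ~2.0-3.0."""
    if inr_value < target_low:
        return "below range: increased thromboembolism risk"
    if inr_value > target_high:
        return "above range: increased bleeding risk"
    return "within therapeutic range"

value = inr(30.0, 12.0, 1.0)
print(round(value, 2))   # 2.5
print(classify(value))   # within therapeutic range
```

A patient whose blood clots more quickly than this would fall below the window (clot risk); one whose blood clots much more slowly would fall above it (bleeding risk).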

The role of genetic heritability in warfarin metabolism was first reported in 1969, in a paper describing a patient who required 20 times the standard warfarin dose to receive its benefits. Genetic variability was suspected, but the actual gene involved was unknown. The genetic mechanism was not identified until 2001, when a study elucidated the roles of the genes CYP2C9 and vitamin K epoxide reductase (VKORC1) in warfarin metabolism. From 2004, clinicians and researchers began to look seriously at the profound effects of these genes on warfarin metabolism, and it became apparent that warfarin dosing strategies had to improve and incorporate genetic variability.

Warfarin tablets of varying dosages. Gonegonegone/Wikimedia Commons (CC BY-SA 3.0)

In 2007, the US Food and Drug Administration (FDA) amended the drug labelling for warfarin to include the possibility of genetic variability causing varied responses to the drug. In 2010, the FDA took a step further, offering a range of expected therapeutic warfarin doses based on the CYP2C9 and VKORC1 genes. These ranges were derived from many clinical studies, and made the translation of raw genetic data into understandable prescribing methods considerably easier.

In the absence of any known genetic variants, the FDA guide recommends making prescribing decisions based on the previously mentioned clinical factors, and starting elderly or debilitated patients on lower initiation doses. The FDA ranges have found a place in pharmacogenomics reporting: some pharmacogenomics companies, at least the ones I have worked for, refer to these ranges when making recommendations for physicians.
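In software terms, a genotype-guided recommendation of this kind boils down to a lookup table keyed on the two genotypes. The sketch below shows the structure only; the dose ranges are illustrative placeholders I have filled in for the example, not the FDA label, and must not be used for prescribing.

```python
# Sketch of how a genotype-guided warfarin dose table can be encoded.
# The (low, high) mg/day ranges below are ILLUSTRATIVE placeholders,
# not the FDA label table -- always consult the current drug label.

DOSE_TABLE_MG_PER_DAY = {
    # (VKORC1 -1639G>A genotype, CYP2C9 genotype): (low, high)
    ("GG", "*1/*1"): (5.0, 7.0),
    ("GG", "*1/*3"): (3.0, 4.0),
    ("GA", "*1/*1"): (5.0, 7.0),
    ("GA", "*1/*3"): (3.0, 4.0),
    ("AA", "*1/*1"): (3.0, 4.0),
    ("AA", "*1/*3"): (0.5, 2.0),
}

def expected_dose_range(vkorc1, cyp2c9):
    """Return a (low, high) daily dose in mg, or None if the
    genotype combination is not in the table."""
    return DOSE_TABLE_MG_PER_DAY.get((vkorc1, cyp2c9))

print(expected_dose_range("AA", "*1/*3"))  # (0.5, 2.0)
```

The appeal of this representation is exactly what the FDA ranges achieved on paper: raw genetic data in, an understandable prescribing range out.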

Although the CYP2C9 and VKORC1 genes form the cornerstone of warfarin dosing, other genes have also come into play. CYP4F2 has been confirmed as a gene involved in warfarin metabolism, with CYP1A2 and CYP3A4 also implicated. However, most pharmacogenomics companies currently test only for CYP2C9 and VKORC1, as clinical tests for the other genes are generally unavailable.

“When you say green tea, does that include my green tea Kit Kats?” jpellgen/Flickr (CC BY-NC-ND 2.0)

Genetics can also help explain warfarin's drug-drug and drug-food interactions, of which there are approximately 120. Some of the most notable drug interactions include amiodarone (an antiarrhythmic), ciprofloxacin (an antibacterial), diltiazem (an antihypertensive), citalopram (an antidepressant) and carbamazepine (an antiepileptic).

Warfarin may also be hindered by consumption of certain foods, including cranberries, mangoes, grapefruit, ginkgo and garlic. There are even suggestions that green tea affects the efficacy of warfarin. To illustrate how these drugs and foods interact with warfarin, let's use garlic as an example. Garlic inhibits CYP2C9, the enzyme behind one of the primary pathways by which warfarin is metabolised. With the enzyme unable to function at full capacity, warfarin is cleared from the body more slowly, its levels build up, and the risk of adverse events such as bleeding increases.
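A toy pharmacokinetic calculation makes the garlic example concrete. At steady state, a drug's average concentration equals the dosing rate divided by clearance (Css = R / CL), so anything that inhibits the clearing enzyme raises drug levels. The numbers below are invented purely for illustration.

```python
# Toy pharmacokinetic sketch of an enzyme-inhibition interaction
# (illustrative numbers only). At steady state, average drug
# concentration is the dosing rate divided by clearance:
#     Css = R / CL
# An inhibitor of the metabolising enzyme lowers CL, so Css rises.

def steady_state_conc(dose_rate_mg_per_h, clearance_l_per_h):
    return dose_rate_mg_per_h / clearance_l_per_h

baseline = steady_state_conc(5.0, 5.0)        # normal clearance
with_inhibitor = steady_state_conc(5.0, 2.5)  # clearance halved
print(baseline, with_inhibitor)  # 1.0 2.0
```

Halving clearance doubles the steady-state concentration at the same dose: the same mechanism, in miniature, by which an inhibited CYP2C9 pathway can push a patient from a therapeutic INR into bleeding territory.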

Outside of pharmacogenomics testing companies, several online resources are available to clinicians for guiding warfarin dosing. Among the sea of potential candidates, one of the most valued is this free online prediction tool, created in 2008 and endorsed by the Clinical Pharmacogenetics Implementation Consortium (CPIC). Several contenders have proposed improved models, which the CPIC is currently reviewing but has not yet endorsed.

The new kid on the block: dabigatran. ec-jpr/Flickr (CC BY-NC-ND 2.0)

Although warfarin still maintains its crown and has been the focus of this column, it would be naïve of me not to mention a few newer kids on the block. Dabigatran, rivaroxaban and apixaban are all offered as alternatives for patients who are unable to take warfarin. Their use is considerably lower than warfarin's: in Australia in 2014, dabigatran was dispensed 307,813 times, rivaroxaban 768,923 times and apixaban 214,609 times.

Earlier this year, a member of my family had a blood clot detected early. At the time, I thought they’d be immediately prescribed warfarin, but instead the physician went straight for dabigatran. According to Brieger and Curnow, “Many patients find the limitations of warfarin burdensome and attend their GP requesting a change to a new oral anti-coagulant.”

Although the statistics suggest that warfarin is still the more commonly used drug, perhaps this will change over time. In a 2011 article, Kitzmiller and colleagues suggested that discussions of warfarin dosing may rapidly become irrelevant as these new blood thinners enter clinical practice. So, who knows? Maybe I’ll be discussing the genes that metabolise dabigatran, rivaroxaban and apixaban sometime in the future. We’ll cross that bridge when we come to it.