Warfarin-induced skin necrosis following recommencement of warfarin after perioperative Prothrombinex-VF

Clinical record

A 62-year-old man with thrombophilia was receiving warfarin for recurrent venous and arterial thrombosis, and had a known 48 mm diameter infrarenal abdominal aortic aneurysm (AAA). He presented with collapse at home after 2 days of increasing pain in the left flank. A left-sided retroperitoneal haematoma was identified by computed tomography angiography. Increasing abdominal pain and a decline in haemoglobin levels from 125 g/L to 88 g/L made it necessary to transfer the patient urgently to theatre for exploration and open repair of a presumed ruptured AAA.

The patient had been taking 1.5 mg warfarin each day for 20 years without complication. He was known to be heterozygous for both the factor V Leiden and the prothrombin G20210A mutations. He was a current smoker with a 40-pack-year history who also had mild rheumatoid arthritis, insulin-dependent type 2 diabetes mellitus, stage 3A chronic kidney disease, moderate aortic stenosis and hypertension.

Before surgery, anticoagulation with therapeutic warfarin (international normalised ratio [INR] 2.5) was reversed according to our unit protocol with 5000 IU Prothrombinex-VF (CSL Behring Australia). A posterior rupture of the AAA was confirmed during the operation. Sodium heparin (5000 U) was administered before aortic cross-clamping, and its action was fully reversed at the end of surgery with 50 mg protamine sulphate.

Recovery was initially uneventful, and therapy with 1.5 mg warfarin was resumed on postoperative day 1, together with a renally adjusted dose of enoxaparin sodium (40 mg twice daily).

On postoperative day 5, the patient experienced increasing abdominal pain and was returned to theatre for an exploratory laparotomy; nothing significant was found. His INR was 3.1, and reversal of anticoagulation was not performed.

On postoperative day 8, he was transferred to the intensive care unit because of deteriorating gas exchange, hypotension and an evolving coagulopathy. Large and painful areas of skin necrosis had developed on the abdomen, flanks and thighs (Figure, A). His INR was 6.2, activated partial thromboplastin time (APTT) 64 seconds, fibrinogen levels 1.0 g/L, and platelet numbers had dropped from 265 × 10⁹/L to 123 × 10⁹/L.

Seven units of fresh frozen plasma and 10 units of cryoprecipitate were infused. Warfarin treatment was withdrawn, and anticoagulation therapy with intravenous heparin initiated (target APTT: 65–100 seconds) to treat the presumed warfarin-induced skin necrosis (WISN).

The results of an enzyme-linked immunosorbent assay (ELISA) test for heparin-induced thrombocytopenia were negative, as was screening for vasculitis-related antibodies. Anti-cardiolipin and anti-β2 glycoprotein I antibodies were not detected, nor was lupus anticoagulant. Protein C and S levels were low (0.45 U/mL and 0.53 U/mL, respectively).

The skin lesions continued to demarcate over the next 2 days, and were debrided on postoperative day 10 (Figure, B). Histopathological findings were consistent with WISN (Figure, C).

Anticoagulation treatment with intravenous heparin continued for 2 weeks, and was then changed to enoxaparin sodium (100 mg twice daily).

The skin lesions were regularly debrided and negative pressure dressings applied during the following months. Autologous split skin grafts were later performed with excellent results (Figure, D, E), and the patient was transferred to our rehabilitation facility on postoperative day 63. Treatment with oral rivaroxaban was initiated when he was discharged from hospital, and is to continue indefinitely at a dose of 20 mg daily.

Warfarin-induced skin necrosis (WISN) is a rare complication of a commonly used medication. The underlying mechanism is unclear, but it is thought that WISN is induced by a transient paradoxical hypercoagulable state.

Warfarin inhibits certain vitamin K-dependent factors more quickly than it does others, producing a transient imbalance in procoagulant and anticoagulant activity.1 The anticoagulant activity of protein C is rapidly reduced (within 24 hours) because of its short half-life (5–8 hours). The levels of other vitamin K-dependent coagulation factors (II, IX and X) decline at slower rates because they have longer half-lives (24–72 hours). The initial result, therefore, is a relative increase in thrombin generation and a transient hypercoagulable state that may lead to thrombotic occlusion of the microvasculature and thus tissue necrosis.
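The half-life argument above can be made concrete with a minimal numerical sketch, assuming simple first-order decay and that warfarin fully blocks new synthesis (a simplification); the half-lives used are illustrative values within the ranges quoted in the text.

```python
# Sketch of why protein C falls far faster than factor II once
# synthesis stops: first-order decay with different half-lives.
# Half-lives here are illustrative, within the quoted ranges.
def fraction_remaining(hours: float, half_life: float) -> float:
    """Fraction of a clotting factor remaining after `hours`."""
    return 0.5 ** (hours / half_life)

# 24 hours after starting warfarin:
protein_c = fraction_remaining(24, 6)    # half-life ~6 h
factor_ii = fraction_remaining(24, 60)   # half-life ~60 h
print(f"protein C: {protein_c:.0%} remaining")
print(f"factor II: {factor_ii:.0%} remaining")
```

With these figures only about 6% of protein C remains at 24 hours versus roughly three-quarters of factor II, which is the transient procoagulant imbalance the text describes.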

Our patient had several risk factors for WISN, including his age, obesity and a history of thrombophilia. Further, he was heterozygous for the factor V Leiden mutation (resulting in activated protein C resistance and a functional protein C deficiency) and for the prothrombin G20210A mutation (resulting in elevated prothrombin levels).2 Hypercoagulable conditions more commonly associated with WISN include deficiencies of protein C, protein S and antithrombin III. Other recognised predisposing factors include being female and being given higher loading doses of warfarin.3

The potential role of Prothrombinex-VF in the development of WISN in our patient warrants further consideration. In Australia, immediate warfarin reversal is achieved by using prothrombin complex concentrates or fresh frozen plasma. Prothrombinex-VF is the only prothrombin complex concentrate routinely used in Australia and New Zealand. It is a three-factor concentrate (factors II, IX and X) that also contains low levels of factor VII, but no proteins C or S.

Our unit protocol for immediate warfarin reversal at the time of this patient’s admission reflected the recommendations published in 2009 by Chiu and colleagues.4 Our patient received Prothrombinex-VF alone; neither vitamin K nor fresh frozen plasma was used during reversal. We hypothesise that protein C and S levels were low at the time of his operation, as Prothrombinex-VF does not reverse the reduction of protein C and S levels caused by warfarin. Perioperative blood loss would have reduced their levels further, and the resumption of warfarin treatment immediately after the operation would have depleted them even more. We therefore suggest that very low levels of proteins C and S, together with his pre-existing thrombophilia, are likely to have tipped the balance in favour of thrombosis.

A recent update of the consensus guidelines for warfarin reversal in Australia suggested that 5–10 mg vitamin K1 be given parenterally at the same time as Prothrombinex-VF.5 The half-lives of the infused clotting factors are similar to those of endogenous clotting factors, but the addition of vitamin K1 (as a cofactor in their synthesis) would sustain the reversal effect. It may also increase protein C and S levels, and thereby avoid a transient prothrombotic state when treatment with warfarin is resumed.

Our case highlights the importance of being aware of WISN as a rare complication of warfarin therapy. Consideration of individual patient factors, including a history of thrombosis, before initiating warfarin reversal is critical for ensuring that appropriate adjuvant therapy is provided and an optimal outcome achieved. Vitamin K1 should be administered with Prothrombinex-VF during warfarin reversal, as it sustains the reversal effect, may increase the levels of proteins C and S, and thereby avert thrombotic complications.

Lessons from practice

  • Individual patient factors, including a history of thrombosis, must be considered before warfarin reversal.
  • Updated consensus guidelines for warfarin reversal suggest giving vitamin K1 with Prothrombinex-VF.
  • By increasing protein C and S levels, vitamin K1 may prevent a transient hypercoagulable state after resuming warfarin therapy.

Warfarin-induced skin necrosis and results following autologous skin grafts


A: Bilateral flank and thigh skin necrosis.


B: Wound debridement.


C: Diffuse dermal ischaemic necrosis, haemorrhage and oedema. A small number of platelet thrombi are evident in the small veins of the dermis. There is no evidence of vasculitis. Haematoxylin-eosin stain; magnification × 200.


D: Left thigh at 6-month follow-up.


E: Abdomen at 6-month follow-up.

The trumpet’s blown pupil

On the advice of a neurologist friend (author C P), a 42-year-old man was rushed to hospital with a fixed dilated pupil (Figure, A). His condition was otherwise normal, as were the findings of urgent magnetic resonance imaging and angiography of the brain.

By Day 3, his mydriasis had resolved (as did the mystery) when the patient asked his “learned” friend how common eye injuries with whipper-snippers were. He mentioned, in passing, that on the morning of the incident he had cut back his beloved Angel’s trumpet (Figure, B) — a member of the Solanaceae family, all parts of which are laced with anticholinergic alkaloids.

A detailed history, botanical or otherwise, will always trump the next best test.

A prospective cohort study of trends in self-poisoning, Newcastle, Australia, 1987–2012: plus ça change, plus c’est la même chose

Intentional poisoning is a major public health problem and generally occurs in the context of deliberate self-harm and drug misuse. There are 60 International Statistical Classification of Diseases and Related Health Problems, 10th revision (ICD-10) codes for drug-related deaths. In 2009, these codes together accounted for 6.4% of male and 5.5% of female total years of potential life lost in Australia.1 Most deaths are in young people, and drug-related deaths account for a large proportion of lost years of life — causing about 25% of completed suicides.1 The estimated Australian rate of people hospitalised for self-poisoning of 119 per 100 000 population per year2 substantially underestimates total numbers, as many patients are not admitted or do not present.3 Most poisonings are in young adults and are impulsive or unplanned. Morbidity and mortality from poisoning have proved surprisingly responsive to targeted public health interventions to reduce the availability of means to poison oneself accidentally or deliberately.4,5 Identification of drugs causing disproportionate numbers of poisonings, morbidity or deaths is thus a key aspect of an effective toxicovigilance system.

We aimed to examine in-hospital morbidity and mortality associated with poisoning in the greater Newcastle region over 26 years and to broadly determine what factors have (and have not) changed over this time, during which there have been substantial changes in medication use. In particular, the use of psychotropic drugs has changed and there have been large increases in antidepressant prescribing in Australia over the past three decades.6,7 Favourable and unfavourable effects on population suicide rates have been postulated for antidepressants.8,9 Thus we assessed the effect of the increase in antidepressant prescribing on total and antidepressant self-poisoning, and examined changes in prescribing of and self-poisoning with several drugs that have previously been identified as having higher relative toxicity in overdose: short-acting barbiturates, dextropropoxyphene, chloral hydrate, dothiepin, thioridazine, pheniramine, temazepam, amisulpride, alprazolam, venlafaxine and citalopram.10–15

Methods

This is a cohort study of patients presenting consecutively after self-poisoning to the Hunter Area Toxicology Service (HATS) between January 1987 and December 2012. Since 1987, HATS has provided a comprehensive 24 hours/day toxicology treatment service for a population of about 500 000. From 1992, ambulances diverted all poisoning presentations in the lower Hunter to this service. HATS currently has direct clinical responsibility for all poisoned adult patients in all hospitals in the greater Newcastle region and provides a tertiary referral service to Maitland and the Hunter Valley. HATS routinely records data on patients who present to hospital (even if the poisoning is uncomplicated).16 Previous studies on poisoning in Newcastle17,18 have shown that no patients were treated exclusively in private hospitals or by their family doctor, indicating that most presentations to medical care facilities are recorded. This cohort does not comprehensively cover unintentional childhood (age < 14 years) poisonings. The local human research ethics committee has previously granted an exemption regarding use of the database and patient information for research.

A structured data collection form is used by HATS to prospectively capture information on patient demographics (age, sex, postcode), drugs ingested (including doses), co-ingested substances, regular medications and management and complications of poisoning.19 At discharge, further information is collected (eg, hospital length of stay [LOS], psychiatric and substance misuse diagnoses). Data are routinely entered into a fully relational Microsoft Access database separate to the hospital’s main medical record system. Data on all patients aged ≥ 14 years who presented following self-poisoning were analysed.

Analyses of population-referenced data (ie, rates) were restricted to postcodes that predominantly cover Newcastle, Lake Macquarie and Port Stephens. Changes in total self-poisoning rates in the four statistical subdivisions in this area were examined between 1991 and 2011. Changes in rates of self-poisoning using the main antidepressant drug classes (tricyclic antidepressants [TCAs], selective serotonin reuptake inhibitors [SSRIs], serotonin–noradrenaline reuptake inhibitors [SNRIs], monoamine oxidase inhibitors [MAOIs] and other) were also examined. Data on rates of antidepressant drug use in these drug classes (standardised by the defined daily dose [DDD]) in Australia from 1991 to 2011 were taken from Australian government publications. We have previously shown that these data agreed within two significant digits with Newcastle-specific data for a range of medications.11

Results

Over the study period, there were 17 266 admissions of patients who had self-poisoned and 11 049 individual patients; the median number of admissions per patient was one (range, 1–115). The number of admissions increased over the first 8 years, but since 1995 has been quite stable (HATS became well established in 1994) (Appendix 1).

Of the total admissions, 15 327 (88.8%) were attempts at self-harm and the remainder were a mixture of unintentional, iatrogenic and recreational self-poisonings. (Data are generally presented for admissions, and may thus include the same patient with different poisonings.) The median age of admitted patients was 32 years (range, 14–97 years) and the female : male ratio was 1.6 : 1 (10 514 female, 6711 male and 39 transgender patient admissions). The median LOS was 16 hours (interquartile range, 9.3–25.7 hours; total time spent in hospital, 15 688 hours). Of the total admissions, 2101 involved admission of the patient to an intensive care unit (ICU) (12.2%) and 1281 involved ventilation of the patient (7.4%). There were 78 inpatient deaths (0.45% of admissions).

We investigated the changes in morbidity and mortality over the study period by dividing admissions into those in the first 6 years, reported previously,20 and four subsequent 5-year periods (Box 1). Over this period, the rate of admission to ICU dropped from 19.2% (376/1955) to 6.9% (280/4060) and rate of mechanical ventilation from 13.7% (268/1955) to 4.8% (193/4060). The fatality rate dropped from 0.77% (15/1955) to 0.17% (7/4060). The median LOS decreased from 20.5 hours in the first 6 years to about 16 hours in all subsequent 5-year periods.

In the 17 266 admissions, a previous history of psychiatric illness (9692, 56.1%), previous admission for a psychiatric episode (6426, 37.2%), previous suicide attempt (9665, 56.0%) and history of alcohol or drug misuse (8466, 49.0%) were commonly recorded. Few admitted patients were in full-time paid work (2421, 14.0%); some were unemployed (3622, 21.0%), pensioners or retired (4163, 24.1%), students (1058, 6.1%), doing home duties (920, 5.3%) or other (703, 4.1%); data were missing for the remainder (4379, 25.4%). Only 23.8% (4104) were married; others were single (9567, 55.4%), separated (1325, 7.7%), divorced (1302, 7.5%), widowed (409, 2.4%), in de facto relationships (79, 0.5%), other (13, 0.1%); and data were missing for the remainder (467, 2.7%). These demographic, social and psychiatric factors remained stable, except for a slight rise in the proportion of patients reporting previous self-harm (Appendix 2).

Including co-ingested alcohol, 34 342 substances were involved (mean, 1.99 per patient; range, 1–18 per patient). The major groups of agents involved in self-poisonings and ICU admissions are shown in Appendix 3 (drugs usually available on prescription) and Appendix 4 (non-prescription drugs and other substances). The most commonly ingested substances were benzodiazepines (5470, 15.9%), alcohol (5461, 15.9%), paracetamol (4619, 13.5%), antidepressants (4477, 13.0%), antipsychotics (3180, 9.3%), anticonvulsants (1514, 4.4%), opioids (1232, 3.6%), non-steroidal anti-inflammatory drugs (1104, 3.2%) and antihistamines (743, 2.2%). Prescription items accounted for 18 950 agents (55.2%), of which 14 445 (76.2%) were known to have been prescribed for the patient.

There were major changes over time in the patterns of drugs ingested (Box 2, Box 3), especially for psychotropic drugs and sedatives. Psychotropic drugs consistently accounted for about 50% of all drugs ingested but newer antidepressants and atypical antipsychotics have largely replaced the older drugs (TCAs and conventional antipsychotics). Several of the drug classes for which frequency of ingestion declined (eg, barbiturates, theophylline, TCAs) have disproportionately high toxicity (Appendix 3). In some drug classes, there were larger declines in individual drugs identified as having greater toxicity in the mid 1990s10–13 (Appendix 5). This is only partly explained by falling prescriptions for these agents (Appendix 6).

There was a more than sixfold increase in antidepressant DDDs per 1000 people per day between 1991 and 2010 (from 12 to 77). However, the increase in the proportion of poisonings due to antidepressants was very modest (about 1.34-fold). There were no corresponding changes in the population rates of self-poisoning (Box 4, Appendix 6), which fluctuated around the long-term mean in each district. There was thus a large decrease in the rate of self-poisoning per DDD prescribed per 1000 population per day for antidepressants (Appendix 7). In Box 4, the increase in total antidepressant prescriptions is illustrated by the total shaded area and changes in individual classes are illustrated by the coloured shading.
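The arithmetic behind the fall in self-poisoning per DDD prescribed can be sketched briefly; the DDD figures (12 rising to 77 per 1000 people per day) and the 1.34-fold rise in the poisoning proportion are as reported above, and the derived factor is illustrative only.

```python
# Illustrative arithmetic for the antidepressant trend described above.
# Input figures are those reported in the text; the derived ratio is
# an approximation, not a study result.
ddd_1991, ddd_2010 = 12, 77
prescribing_increase = ddd_2010 / ddd_1991   # more than sixfold (~6.4)
poisoning_increase = 1.34                    # reported rise in poisonings

# Self-poisonings per DDD prescribed therefore fell by roughly:
fall_in_rate_per_ddd = prescribing_increase / poisoning_increase
print(f"prescribing up {prescribing_increase:.1f}-fold, "
      f"poisoning rate per DDD down ~{fall_in_rate_per_ddd:.1f}-fold")
```

In other words, if prescribing rose about 6.4-fold while poisonings rose only 1.34-fold, poisonings per unit of drug prescribed fell nearly fivefold, which is the "large decrease" the text refers to.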

Discussion

The rates of self-poisoning in Newcastle were stable over the past two decades, and the features of the population presenting with self-poisoning were constant. This suggests a long-term ongoing and reasonably predictable need for clinical toxicology treatment and ancillary psychiatric and drug and alcohol support services. Despite large increases in prescriptions for drugs used to treat psychiatric illness (and a range of other major mental health interventions), there appears to have been no reduction in episodes of self-harm.

Interestingly, there was a more than sixfold increase in the use of antidepressants, and while the agents taken in overdose changed substantially, there were only small changes in rates of antidepressant overdoses. Interpreting this surprising finding is not straightforward. It probably indicates that antidepressants are increasingly being prescribed for patients who have minimal risk of self-harm. Reassuringly, there is no evidence in our population to support concerns about pro-suicidal effects of new antidepressant prescriptions. The lack of any change in overall self-harm rates also suggests that increased antidepressant use for depression is not an effective public health strategy to reduce rates of self-harm. The only strategy to prevent fatal poisoning with consistent supporting evidence is restricting the availability of high-lethality methods.4,5

Identification of high toxicity in overdose is a problem that can only be studied after approval for marketing is granted. Postmarketing surveillance by pharmaceutical companies of toxicity in overdose is not a requirement for drug registration in Australia or any other country. There is little incentive for voluntary surveillance. Although most companies record case reports of overdoses of their drugs, this does not facilitate comparisons. Reporting biases mean that such cases may be atypical of the usual clinical picture.

The many new psychiatric medications coming onto the market should mandate coordinated collection of timely information on self-poisoning and suicide. Ideally, this should be done at three levels. First, a national coronial register of drug-related deaths is essential to enable an analysis of relative mortality (as done in the United Kingdom21). Second, data on poisonings reported to poison centres are essential, particularly for childhood poisonings that rarely require admission and for assessing the effect of primary prevention measures. Poison centre data have limitations due to referral bias, lack of uniformity of assessment and lack of clinical information.22 Third, for these reasons, systematic use of clinical databases to record hospital admissions for cases of poisoning is needed to measure relative clinical toxicity.

The HATS clinical database has identified disproportionate effects in overdoses with many drugs. Translation of HATS data into clinical risk assessment and guidelines has occurred, but with lengthy delays. For example, the risks of QT prolongation and torsades de pointes with thioridazine, citalopram and escitalopram10,14,23 were detected in the database 5–10 years after these drugs became available. However, to reduce the time to identify toxicological problems to 1–2 years, collaborating centres in Australia and overseas will be needed to accelerate collection of data on self-poisoning.

Most deaths due to poisoning occur outside hospital. Any significant decrease in mortality from self-poisoning will result from primary or secondary prevention. Efforts at decreasing morbidity and mortality from self-poisoning should continue to target drugs that are frequently taken or are lethal in overdose.

The falls in prescriptions and poisonings with several drugs with greater relative toxicity occurred several years after the problems relating to overdoses with these drugs were identified in the HATS database around 1994–1995 (Appendix 6, Appendix 8). For example, large reductions were due to withdrawal of drug subsidies (those for dextropropoxyphene and thioridazine were withdrawn in 2000) and removal of a formulation that was misused (temazepam gel-filled capsules were withdrawn in 2001). Publication of toxicity data alone had limited (if any) effect in terms of reducing prescriptions of the more toxic drugs. No drugs were banned in Australia on the basis of HATS data, although manufacturers did voluntarily withdraw some of the highlighted drugs (barbiturates and chloral hydrate were withdrawn in 1994–1995, thioridazine was withdrawn in 2009). Further, there has been no drop yet in prescriptions of or poisoning with drugs identified as having greater toxicity in the mid 2000s14,15 (data not shown).

Our data show large drops in rates of poisoning with some of the more lethal drugs, such as TCAs and barbiturates (Box 2, Box 4). The introduction of less toxic antidepressants and sedatives dramatically changed prescribing, which in turn changed the types of drugs taken in self-poisoning. This trend has presumably also been reflected in changes in drug-related deaths. However, the nature of coding in official death statistics means that there are no published Australian data to support this contention. For example, poisonings by “antiepileptic, sedative-hypnotic, antiparkinsonism and psychotropic drugs” are all lumped together under one code in Australia.1 Most fatal poisonings are classified as due to unspecified or multiple agents. Some improvement in coding of drug-related deaths should not be difficult. Much finer detail is provided in other ICD-10 codes. For example, the Australian Bureau of Statistics records deaths from crocodile and rat bites (three and zero, respectively, between 1999 and 2008) separately from those caused by other animal bites.1 The development of the National Coronial Information System, launched in 2000, may make fatal poisoning comparisons possible in future, but only if data are accurately and consistently coded. Assessing the impact that clinical toxicology has on direct patient management and on public health is hindered by the lack of reliable epidemiological and clinical data.

Our data have inherent limitations. There is likely to be selection bias against less severe poisonings (the types of cases where patients might not present for medical attention) and rapidly lethal poisonings (cases in which patients die outside hospital). In surveys of self-harm and anecdotal reports from patients, it has been estimated that a significant proportion (5%–15%) of people who self-poison do not present for medical care.3 Also, although there is a prospective data collection form, retrospective review of medical records is often required to complement prospectively collected data. Data on the ingested drugs were based on patient history, including corroborating history obtained from ambulance officers and accompanying people and from information on drug containers. Drug concentrations were not measured in most patients, although previous research on specific drugs has found the patient history of drugs ingested to generally be confirmed by an appropriate assay.24

The key strengths of our study are its long duration and the consistent core data fields. There are few similar attempts to gather data longitudinally on self-poisoning over prolonged periods. Many have retrospective identification of cases from hospital coding and thus rely entirely on the completeness of medical records.25 Others have not been collected continuously26 or have focused on psychiatric factors and treatment (rather than drugs ingested and toxicity)27 or were conducted in developing countries where agents ingested differ substantially from those used in Australia.28

However, the uniqueness of the HATS database also highlights a weakness of our study. As there are no comparable current datasets, it is difficult to determine the extent to which the Hunter experience represents that of the developed world. Expansion of the database could be facilitated by database systems integrated with electronic medical records.

We identified interesting and important patterns relating to drug prescriptions, epidemiology of overdose patients and importance of relative toxicity. A massive increase in antidepressant prescriptions has had little impact on rates of self-harm or antidepressant poisoning. Changes in antidepressant classes (generally from more to less toxic) have had significant effects on morbidity and mortality from antidepressant poisoning, and therefore from all poisonings. We were also able to generate information regarding relative toxicity and patient management. However, for many rare poisons, gaining sufficient numbers of patients to generate reliable information about management and prognosis will take decades from one centre. We believe that, in Australia and overseas, there is a need for a coordinated approach to address the toxicity of drugs in overdose. The public health benefits would greatly outweigh the modest costs of enhancing postmarketing surveillance through more widespread systematic collection of poisoning and overdose data.

1 Morbidity and mortality due to self-poisoning, 1987–2012, measured by need for intensive care and ventilation, length of stay and fatality*


* Data were divided into five periods with roughly equal patient numbers.

2 Use of sedatives and psychotropic drug classes in self-poisoning, 1987–2012

3 Use of alcohol, analgesics and selected other drug classes in self-poisoning, 1987–2012


4 Rates of antidepressant prescribing (1990–2011, shaded areas) and total rates of self-poisoning (1992–2011, solid lines and symbols)*


* Total numbers of admissions for self-poisoning before 1993 are slightly lower as the Hunter Area Toxicology Service was just being established. Dotted lines indicate 20-year mean for each district.

Polypharmacy among inpatients aged 70 years or older in Australia

Research has confirmed a significant association between polypharmacy — defined here as five or more regular prescription medications1 — and adverse outcomes among older people living in the community. The associations of polypharmacy in older people admitted to hospital have been less extensively explored. Older inpatients are a vulnerable group, at high risk of prolonged hospital stays, institutionalisation and death.2 Several studies have reported a high prevalence of potentially inappropriate medications among older inpatients (ranging from 20%3 to 60%4) and, while potentially inappropriate medications have been linked to adverse drug reactions,5 the most significant predictor of adverse drug reactions among inpatients is the number of medications prescribed.6

Medications can, of course, prolong life and prevent serious morbid events in older people. People aged 80 years and over are at highest risk of cardiovascular events, and the absolute benefits of primary and secondary prevention may be greatest in this group.7 Medications can also improve quality of life through symptom control and maintenance of function. Individualisation of therapy should underpin prescribing, weighing up the potential benefits and risks of medication with reference to the patient’s own goals of care.8,9 Hospitalisation presents an opportunity for physicians to undertake such a process and to rationalise prescribing for older people.

We aimed to investigate medication changes for older inpatients and explore patient characteristics associated with polypharmacy.

Methods

Participants and setting

This prospective cohort study included patients aged 70 years or older admitted to general medical units of 11 acute care hospitals in Queensland and Victoria — two small secondary care centres (120–160 beds), two rural hospitals (250–280 beds), four metropolitan teaching facilities (300–450 beds) and three major tertiary referral centres (> 650 beds).

Recruitment took place between July 2005 and May 2010 as part of three cohort studies, described in detail elsewhere.1012 Personal or proxy patient consent was obtained in writing before patients entered the study. Patients were excluded if they were admitted to coronary or intensive care units, received palliative care only, or were transferred out of general medical units within 24 hours of admission to inpatient wards. Recruitment took place on weekdays only.

Measurement tool and outcome measures

The interRAI assessment system for acute care (interRAI AC)13 was used for data collection. This instrument screens many domains, including cognition, communication, mood, behaviour, activities of daily living (ADL), instrumental activities of daily living (IADL), continence, nutrition, skin condition, falls and medical diagnosis.14 IADL items represent a higher order of functioning than the basic ADL items and dependence in IADL items is likely to occur before dependence in ADL. The interRAI AC was specifically developed for use in the acute care setting, to support comprehensive geriatric assessment of older inpatients.14

Trained nurse assessors gathered data about the patient’s physical, cognitive and psychosocial functioning in the premorbid period (before the current episode of illness, assessed retrospectively at admission) and at admission (based on observations of patients during their first 24 hours in the ward). All available sources of information, including the patients, carers and the medical, nursing and allied health staff, were utilised to complete the interRAI AC instrument, either directly as verbal reports or indirectly through written entries in hospital records.

There are scales embedded in the interRAI instruments that combine single items belonging to a domain (eg, ADL, IADL and cognition), which are used to describe the presence and extent of deficits in that domain.15 Here, the short ADL scale, the IADL performance scale and the cognitive performance scale were used as summary measures of functional and cognitive status, with higher scores indicating greater incapacity.15

For all patients, prescribed medications were recorded about 24 hours into their hospital stay and again at discharge from hospital. These lists were reviewed so that medications used for a finite period in hospital to manage acute medical conditions (eg, intravenous antibiotics, diuretics and subcutaneous anticoagulants) were not included in the number of regular prescribed medications. Complementary and as-required medications were also excluded. The Anatomical Therapeutic Chemical classification system codes, plus doses, routes of administration and dosing regimens, were recorded at admission and discharge. Data were entered by pharmacists or pharmacy students and verified by a second pharmacist or a geriatrician. Medications were classified as being for controlling symptoms, prevention or both, based on principles used in palliative care settings,16,17 as well as expert opinion and consensus among us (geriatricians, physicians and clinical pharmacologists).

Analysis

Frequency distributions were used to describe the characteristics of the study population. Polypharmacy status was categorised into three groups: non-polypharmacy (0–4 drugs), polypharmacy (5–9 drugs) and hyperpolypharmacy (≥ 10 drugs).18
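These cut-points form a simple threshold rule; a minimal sketch (the function name is mine, not from the study):

```python
def polypharmacy_category(n_drugs: int) -> str:
    """Classify a regular-medication count using the cut-points in the text:
    0-4 (non-polypharmacy), 5-9 (polypharmacy), >= 10 (hyperpolypharmacy).
    Illustrative only; not code from the study."""
    if n_drugs < 0:
        raise ValueError("medication count cannot be negative")
    if n_drugs <= 4:
        return "non-polypharmacy"
    if n_drugs <= 9:
        return "polypharmacy"
    return "hyperpolypharmacy"

# A patient on 7 regular medications falls in the middle category
category = polypharmacy_category(7)
```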

Depending on the distribution of the data, non-parametric (Kruskal–Wallis test) or parametric (analysis of variance) methods were used to compare continuous data across the polypharmacy categories. For categorical variables, χ2 tests were used. For paired data, the paired t-test (continuous data) or the McNemar test (nominal data) was used to examine changes between admission and discharge. A variation of the McNemar test (McNemar–Bowker test of symmetry) was used when there were more than two categories.
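For a paired binary outcome (eg, a drug class prescribed at admission v at discharge), the McNemar statistic depends only on the two discordant counts. A pure-Python sketch of the uncorrected test, with hypothetical counts — the study itself used SPSS:

```python
import math

def mcnemar(b: int, c: int) -> tuple[float, float]:
    """Uncorrected McNemar test for paired binary data.
    b, c: the two discordant cell counts (changed in one direction v the other).
    Returns (chi-square statistic, two-sided P). Sketch only, not the study's code."""
    if b + c == 0:
        raise ValueError("no discordant pairs")
    stat = (b - c) ** 2 / (b + c)
    # Survival function of chi-square with 1 df: P(X >= stat) = erfc(sqrt(stat / 2))
    p_value = math.erfc(math.sqrt(stat / 2))
    return stat, p_value

# Hypothetical: 30 patients started a drug class in hospital, 12 stopped it
stat, p_value = mcnemar(b=30, c=12)
```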

To examine whether the factors that were significant in univariate analysis were independently associated with the dependent variable (polypharmacy status measured at the ordinal level), an ordinal regression proportional odds procedure was used. Tests for multicollinearity of the independent variables did not show any violation of the assumptions of the regression model. The proportional odds assumption was met using a test of parallel lines provided by the ordinal regression output. The model was adjusted for age and sex, and the results are reported as odds ratios (ORs).
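Under the proportional-odds model, a single linear predictor shifts every cumulative logit by the same amount, which is exactly what the parallel-lines test checks. A small illustration with hypothetical thresholds and coefficient values (not estimates from the study):

```python
import math

def _logistic(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def category_probs(x_beta: float, thresholds: list[float]) -> list[float]:
    """Per-category probabilities under a proportional-odds model:
    P(Y <= j) = logistic(theta_j - x_beta), for increasing thresholds theta_j.
    Differencing the cumulative probabilities yields each category's probability."""
    cum = [_logistic(t - x_beta) for t in thresholds] + [1.0]
    return [hi - lo for lo, hi in zip([0.0] + cum, cum)]

# Two hypothetical thresholds give the three polypharmacy categories
probs = category_probs(x_beta=0.8, thresholds=[-1.0, 1.5])
```

Raising `x_beta` moves probability mass toward the higher categories, which is how the odds ratios reported in the Results should be read.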

All proportions were calculated as percentages of patients with available data. Significance was set at P < 0.05. Patients with missing data were excluded from relevant analyses. The findings are reported in accordance with the Strengthening the Reporting of Observational studies in Epidemiology (STROBE) statement for cohort studies.19 Analyses were performed using IBM SPSS Statistics, version 22 (IBM Corp).

Ethics

Ethics approval was obtained from the human research and ethics committee of each participating hospital and the University of Queensland Medical Research Ethics Committee.

Results

Of 1220 patients who were recruited for the study, medication records at admission were available for 1216. Mean age was 81.3 years (SD, 6.8 years), and median length of stay was 6 days (interquartile range [IQR], 4–11 days). Diagnosis-related groups, recorded for 1211 participants (99.6% of the sample), classified the most frequent primary diagnoses as diseases and disorders of the circulatory system (259 [21.4%]), respiratory system (253 [20.9%]), nervous system (125 [10.3%]) and kidney and urinary tract (96 [7.9%]).

There were significant differences across the polypharmacy categories for several variables (Box 1). The number of medications increased in association with a higher number of comorbidities and a higher prevalence of pain, dyspnoea, and dependence in terms of IADL. In contrast, the number of medications decreased in association with higher prevalence of severe cognitive impairment. There were no significant differences across polypharmacy categories based on dependence in terms of ADL, or presence of fatigue, urinary incontinence, behaviour symptoms, falls or weight loss.

A total of 1173/1216 patients (96.5%) had complete data and were included in the ordinal regression model that was used to identify independent predictors of polypharmacy status from the factors significant in the univariate analysis shown in Box 1. With each additional comorbidity, the odds of being in a higher polypharmacy category on admission increased by 27% (OR, 1.27; 95% CI, 1.20–1.34); patients with pain (OR, 1.31; 95% CI, 1.05–1.64) or dyspnoea (OR, 1.64; 95% CI, 1.30–2.07) or who were dependent in terms of IADL premorbidly (OR, 1.70; 95% CI, 1.20–2.41) were also significantly more likely to be in the higher polypharmacy categories (Box 2). In contrast, compared with those with severe cognitive impairment, those with intact cognition (OR, 3.17; 95% CI, 2.05–4.92) or mild to moderate cognitive impairment (OR, 1.95; 95% CI, 1.22–3.11) were significantly more likely to be in the higher polypharmacy categories.
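Because these are odds ratios from an ordinal model, the per-comorbidity estimate compounds multiplicatively; a one-function illustration using the reported point estimate of 1.27:

```python
# Reported point estimate: OR 1.27 per additional comorbidity (from the text)
OR_PER_COMORBIDITY = 1.27

def odds_multiplier(extra_comorbidities: int) -> float:
    """Multiplicative change in the odds of being in a higher polypharmacy
    category for a given number of additional comorbidities."""
    return OR_PER_COMORBIDITY ** extra_comorbidities

# Three extra comorbidities roughly double the odds (1.27 ** 3 is about 2.05)
multiplier = odds_multiplier(3)
```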

Patients with complete medication records on admission and discharge (1187/1216 patients [97.6%]) were prescribed a mean (SD) of 7.1 (3.7) regular medications per day on admission and 7.6 (3.8) on discharge (P < 0.001). There was, however, no significant change in the prevalence of medications such as statins (459 [38.7%] v 457 [38.5%] patients), opioid analgesics (155 [13.1%] v 166 [14.0%] patients), antipsychotics (59 [5.0%] v 65 [5.5%] patients) and benzodiazepines (122 [10.3%] v 135 [11.4%] patients); data on medications with a prevalence of at least 2% at admission are shown in Appendix 1.

Hyperpolypharmacy was observed in 290/1216 patients (23.8%) at admission and 336/1187 patients (28.3%) on discharge. Most patients (875 [73.7%]) did not change polypharmacy category from admission to discharge; however, 200 (16.8%) changed to a higher polypharmacy category and 112 (9.4%) changed to a lower polypharmacy category, both changes being significant (P < 0.001) (Appendix 2).

There was no significant association with diagnosis-related group category for the 109 patients (9.2%) who were taking fewer than 10 medications at admission but 10 or more at discharge. There was no clinically meaningful change in the classification of prescribed medications (symptom control, prevention or both) during the inpatient episode (Box 3). For example, the proportion of purely preventive medications in the hyperpolypharmacy category was 1209/3371 (35.9%) at admission and 1508/4117 (36.6%) at discharge. Variation in the number of prescribed medications with hospital size was limited and not statistically significant.

Discussion

Our study showed that three-quarters of older patients assessed were receiving five or more drugs, and more than one-fifth were receiving 10 or more, on admission to hospital. The mean number of prescribed medications per day remained high at discharge, with no clinically meaningful change in the classification (symptom control, prevention or both) of medications, nor in the prevalence of specific drug classes such as statins, opioid analgesics, antipsychotics and benzodiazepines. This may suggest that active attempts were not made to deprescribe when appropriate. Polypharmacy was significantly associated with comorbidity and with symptoms (notably pain and breathlessness) and with worse performance in IADL.

Our study has some limitations. We acknowledge the possibility of selection bias from several sources, including the requirement that patients have an expected hospital stay of at least 48 hours, the recruitment of participants on weekdays only, and the number of patients who declined to participate in the study. The appropriateness of prescribing at the level of individual patients based on clinical indications and contraindications was not considered, and medication doses were not explored. Medications at admission and discharge were documented from patients’ prescription charts alone, rather than by complete medication reconciliation using multiple sources of information (patient interview, letter from general practitioner and dispensing history from pharmacy20). Importantly, the design of our study denied us the opportunity to explore whether comorbidities, which may have triggered prescribing of multiple drugs, were the principal cause of the symptoms and functional impairment observed in association with polypharmacy, or whether the polypharmacy itself may have been responsible for some or most of this illness and disability burden, as has been suggested in other studies.21

The strengths of the study are its large cohort of patients recruited across secondary and tertiary care settings and the rigorous assessment of patients’ functional and cognitive status.

For all patients, the objectives of health care, including pharmacotherapy, can be summarised as ameliorating symptoms, improving function and delaying death. With increasing age, prolonging life often becomes a less feasible therapeutic goal. Older patients themselves prioritise improvements in mobility and functional status over longer survival.22 In a recent survey, only 3% of community-dwelling older people would take medications for primary prevention of cardiovascular disease if adverse effects were severe enough to affect functioning.23 Our study found that among patients taking 10 or more drugs, more than one in four of these medications were classified as purely preventive, according to mean numbers of drugs per patient, which may constitute an unnecessary treatment burden if clinical benefits are not realised in the short term. In view of our finding of increased prescribing for those with functional impairment, the values and preferences of this patient group should be explored in qualitative studies.

Where and when the process of detailed medication review should occur has not been established. Community medication reviews by pharmacists are an attractive option and have strong face validity. However, several studies of pharmacist-led medication review have shown no positive effect on clinical outcomes or quality of life.24 Perhaps only a medication review underpinned by careful consideration of the health status of the patient concerned, including estimation of life expectancy and exploration of individual goals of care, is likely to result in clinically meaningful outcomes. The acute care hospital ward, under the care of physicians, is a setting in which these complex decisions could be considered and the discontinuation of inappropriate medications initiated. While time constraints are a challenge, they are outweighed by ready access to the patient's full history and investigations, and by the ability to monitor for adverse effects of drug withdrawal in patients with longer hospital stays.

In conclusion, polypharmacy and hyperpolypharmacy are common among older people in general medical wards in Australia, with no significant changes in the number or classification of medications between admission and discharge. Patients who are dependent in terms of IADL are at particular risk of polypharmacy. Whether such prescribing in this patient group enhances quality of life and improves longevity or whether it imposes iatrogenic harm and lowers quality of life needs to be established by carefully designed randomised controlled trials of interventions designed to minimise inappropriate polypharmacy.

1 Categories of medication prescribing on admission in relation to patient characteristics*

| Category | All categories | Non-polypharmacy (0–4 drugs) | Polypharmacy (5–9 drugs) | Hyperpolypharmacy (≥ 10 drugs) | P |
| --- | --- | --- | --- | --- | --- |
| Total, n (%) | 1216 (100.0%) | 291 (23.9%) | 635 (52.2%) | 290 (23.8%) | |
| Demographics | | | | | |
| Age in years, mean (SD) | 81.3 (6.8) | 81.4 (6.9) | 81.7 (7.0) | 80.4 (6.3) | 0.03 |
| Women | 659/1216 (54.2%) | 151/291 (51.9%) | 358/635 (56.4%) | 150/290 (51.7%) | 0.28 |
| Admission source | | | | | 0.24 |
| Community | 1061/1215 (87.3%) | 261/290 (90.0%) | 552/635 (86.9%) | 248/290 (85.5%) | |
| Institution | 154/1215 (12.7%) | 29/290 (10.0%) | 83/635 (13.1%) | 42/290 (14.5%) | |
| Length of stay in days, median (IQR) | 6 (4–11) | 7 (4–13) | 6 (3–10) | 6 (4–10) | 0.31 |
| Discharge destination or outcome | | | | | 0.01 |
| Community | 792/1215 (65.2%) | 184/291 (63.2%) | 419/634 (66.1%) | 189/290 (65.2%) | |
| Other inpatient care | 182/1215 (15.0%) | 46/291 (15.8%) | 103/634 (16.2%) | 33/290 (11.4%) | |
| Residential aged care facility | 189/1215 (15.6%) | 45/291 (15.5%) | 97/634 (15.3%) | 47/290 (16.2%) | |
| Died | 52/1215 (4.3%) | 16/291 (5.5%) | 15/634 (2.4%) | 21/290 (7.2%) | |
| Comorbidities | | | | | |
| No. of comorbidities at admission, mean (SD) | 6.2 (2.3) | 5.2 (2.3) | 6.3 (2.1) | 7.0 (2.1) | < 0.001 |
| Geriatric syndromes | | | | | |
| Short ADL scale at admission, median (IQR)† | 2 (0–6) | 3 (0–8) | 2 (0–6) | 2 (0–7) | 0.26 |
| Premorbid IADL‡ | | | | | 0.001 |
| Independent | 164/1216 (13.5%) | 56/291 (19.2%) | 84/635 (13.2%) | 24/290 (8.3%) | |
| Dependent | 1052/1216 (86.5%) | 235/291 (80.8%) | 551/635 (86.8%) | 266/290 (91.7%) | |
| Cognitive status at admission | | | | | < 0.001 |
| Intact (0 or 1) | 837/1210 (69.2%) | 174/290 (60.0%) | 436/631 (69.1%) | 227/289 (78.5%) | |
| Mild/moderate impairment (2–4) | 272/1210 (22.5%) | 75/290 (25.9%) | 149/631 (23.6%) | 48/289 (16.6%) | |
| Severe impairment (5 or 6) | 101/1210 (8.3%) | 41/290 (14.1%) | 46/631 (7.3%) | 14/289 (4.8%) | |
| Urinary incontinence at admission | | | | | 0.65 |
| Not present | 879/1216 (72.3%) | 215/291 (73.9%) | 452/635 (71.2%) | 212/290 (73.1%) | |
| Present | 337/1216 (27.7%) | 76/291 (26.1%) | 183/635 (28.8%) | 78/290 (26.9%) | |
| Behaviour symptoms at admission§ | | | | | 0.12 |
| Not present | 1142/1214 (94.1%) | 271/291 (93.1%) | 592/634 (93.4%) | 279/289 (96.5%) | |
| Present | 72/1214 (5.9%) | 20/291 (6.9%) | 42/634 (6.6%) | 10/289 (3.5%) | |
| Falls in the 90 days before admission | | | | | 0.34 |
| None | 727/1215 (59.8%) | 173/291 (59.5%) | 370/634 (58.4%) | 184/290 (63.4%) | |
| At least one | 488/1215 (40.2%) | 118/291 (40.5%) | 264/634 (41.6%) | 106/290 (36.6%) | |
| Weight loss¶ | | | | | 0.56 |
| No | 911/1197 (76.1%) | 211/284 (74.3%) | 475/625 (76.0%) | 225/288 (78.1%) | |
| Yes | 286/1197 (23.9%) | 73/284 (25.7%) | 150/625 (24.0%) | 63/288 (21.9%) | |
| Symptoms | | | | | |
| Pain at admission | | | | | 0.007 |
| Not present | 557/1198 (46.5%) | 155/285 (54.4%) | 281/627 (44.8%) | 121/286 (42.3%) | |
| Present | 641/1198 (53.5%) | 130/285 (45.6%) | 346/627 (55.2%) | 165/286 (57.7%) | |
| Dyspnoea at admission | | | | | < 0.001 |
| Not present | 511/1197 (42.7%) | 159/289 (55.0%) | 270/624 (43.3%) | 82/284 (28.9%) | |
| Present | 686/1197 (57.3%) | 130/289 (45.0%) | 354/624 (56.7%) | 202/284 (71.1%) | |
| Fatigue at admission | | | | | 0.31 |
| Not present | 806/1195 (67.4%) | 199/290 (68.6%) | 426/621 (68.6%) | 181/284 (63.7%) | |
| Present | 389/1195 (32.6%) | 91/290 (31.4%) | 195/621 (31.4%) | 103/284 (36.3%) | |

ADL = activities of daily living. IADL = instrumental ADL. * Data are numerator/denominator (%) unless otherwise indicated. † The ADL short-form scale comprises four items (personal hygiene, walking, toilet use and eating); range is 0–16, with higher scores reflecting greater level of dependency. ‡ Premorbid IADL performance is assessed on seven items (meal preparation, ordinary housework, managing finances, managing medications, using the telephone, shopping and transportation); scores on each item range from 0 (independent) to 6 (total dependence), and a score of ≥ 2 on any item indicates IADL dependence. § Includes the presence of one or more of the following: verbal abuse, physical abuse, resisting care, and socially inappropriate or disruptive behaviour. ¶ Loss of ≥ 5% bodyweight in the 30 days before admission or ≥ 10% in the 180 days before admission.


2 Ordinal regression model of factors associated with increasing polypharmacy*

| Factor | Odds ratio (95% CI) | P |
| --- | --- | --- |
| Number of comorbidities | 1.27 (1.20–1.34) | < 0.001 |
| Pain at admission | 1.31 (1.05–1.64) | 0.02 |
| Dyspnoea at admission | 1.64 (1.30–2.07) | < 0.001 |
| Premorbid IADL — dependent | 1.70 (1.20–2.41) | 0.003 |
| Cognitive status at admission | | |
| Intact (0 or 1) | 3.17 (2.05–4.92) | < 0.001 |
| Mild/moderate impairment (2–4) | 1.95 (1.22–3.11) | 0.005 |
| Severe impairment (5 or 6) | Reference | |


IADL = instrumental activities of daily living. * Adjusted for age and sex.

3 Numbers of medications for symptom control, prevention or both at admission and discharge, by polypharmacy category

First use of creatine hydrochloride in premanifest Huntington disease

Huntington disease is a devastating autosomal dominant neurodegenerative disorder that typically manifests between ages 30 and 50 years. Promising high-dose creatine monophosphate trials have been limited by patient tolerance. This is the first report of use of creatine hydrochloride in two premanifest Huntington disease patients, with excellent tolerability over more than 2 years of use.

Clinical record

A 33-year-old patient in our general practice carried the autosomal dominant gene for Huntington disease (HD). The abnormal number of cytosine-adenine-guanine triplet repeats in the huntingtin gene she carried meant she would eventually become symptomatic for this dreadful disease.

The patient requested information regarding potential treatments, as she had become aware of clinical trials for HD and of compounds used by patients with HD. A neurologist had previously recommended a healthy diet, exercise, avoiding excessive toxins (such as alcohol), social enrichment and cognitive stimulation, which together may modestly slow clinical disease progression and improve quality of life.1 She had used preimplantation genetic diagnosis during her pregnancies but preferred otherwise not to focus on her condition. She understood that there were no proven therapies for this incurable condition and did not want to attend HD clinics. She was asymptomatic.

At her request, I searched the PubMed database for possible treatment options. There were some that were unproven in HD but had been used safely in humans for other indications, had a reasonable rationale regarding known HD pathophysiology, and had positive results in animal models of HD and/or early-phase human HD trials.2

In January 2012, I sought advice on using these options (eg, high-dose creatine, melatonin, coenzyme Q10, trehalose, ultra-low-dose lithium with valproate) from a specialist HD clinic but was advised against this approach. Instead, it was suggested that the patient might be able to sign up for clinical trials, including of high-dose creatine. The patient subsequently chose to participate in an observational trial (PREDICT-HD) which did not limit her options. However, she declined consideration for the Creatine Safety, Tolerability, and Efficacy in Huntington’s Disease (CREST-E) study,3 an international Phase III placebo-controlled trial of creatine monophosphate (CM) in early symptomatic HD. In any case, it is very unlikely she would have been accepted for this trial, as she was asymptomatic.

In February 2014, the Creatine Safety and Tolerability in Premanifest HD trial (PRECREST),4 a Phase II trial, showed significant slowing of brain atrophy in CM-treated premanifest HD patients. If convincingly replicated, this would be a major advance.

The main practical problem with high-dose CM (20–30 g daily) is tolerability. Adverse effects are common, especially nausea, diarrhoea and bloating. In people who have normal renal function before commencing creatine supplementation, creatine does not appear to adversely affect renal function.5

In PRECREST, about two-thirds of patients tolerated the maximum dose (30 g daily) and 13% of those on placebo were unable to tolerate CM when they switched to it. Moderate intolerance appears to be common. A high dropout rate affected the HD gene carriers in this study despite assumed high motivation.6 Recommended additional water intake for patients on CM therapy is 70–100 mL per gram of creatine per day, which is problematic at high doses of CM.
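The fluid burden this implies is easily made concrete; a back-of-envelope calculation using the 70–100 mL per gram figure quoted above (values from the text, code mine):

```python
def extra_water_litres(creatine_g_per_day: float) -> tuple[float, float]:
    """(Low, high) additional daily water intake in litres, at the quoted
    recommendation of 70-100 mL per gram of creatine per day."""
    return creatine_g_per_day * 70 / 1000, creatine_g_per_day * 100 / 1000

# At the 30 g/day maximum PRECREST dose: roughly 2.1-3.0 extra litres daily
low, high = extra_water_litres(30)
```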

The patient again requested assistance as she wanted to seek the best available potential treatment to face her condition with equanimity.7 I decided that, provided safety was paramount, I would assist her on an informed consent basis as part of my duty of care, respecting her informed autonomy.

A case presentation and treatment plan was prepared and an expert team of relevant medical specialists was assembled. Comprehensive informed written consent, including consent from the patient’s partner for additional medicolegal protection, was obtained. The New South Wales off-label prescribing protocol8 was followed, actions were consistent with article 37 of the Declaration of Helsinki,9 and medical defence coverage for the proposed treatment was specifically confirmed by my indemnity insurer.

After baseline assessment, including renal function and careful attention to hydration, the patient commenced oral CM therapy at 2 g/day. This was slowly increased to 12 g/day but she was unable to maintain this dosage due to gastrointestinal adverse effects.

Creatine hydrochloride (CHCl), a creatine salt that has greater oral absorption and bioavailability than CM, and requires less water and a lower dose, offered a possible solution.10 The reduced dose also reduces intake of contaminants, which is very important for extended use. Use of CHCl has been confined to the bodybuilding industry and, to the best of my knowledge after a careful search of PubMed, nothing has previously been published in the context of neurodegenerative disorders.

After review by a pharmacologist and consultation with the co-inventor of the available formulation of CHCl,10 a daily dose of 12 g (equivalent to about 19 g CM) with 100 mL water per 4 g of CHCl was proposed. The manufacturer (AtroCon Vireo Systems) provided 1 g capsules of pharmaceutical grade CHCl at reduced cost. The patient decided to commence CHCl therapy after ceasing CM therapy. The dose of CHCl was slowly increased to 4 g three times a day (12 g daily) with a minimum of 100 mL additional fluid per 4 g dose.

The patient has been taking this dosage since January 2013 without any significant adverse effects and is keen to continue. Her serum creatinine levels are stable. Her serum creatine levels before and after doses have also been measured, and this confirmed that the CHCl is being absorbed.

Shortly after this patient began CHCl therapy, a second related premanifest HD patient requested access to CHCl. After a similar informed consent process, the second patient commenced the same dose of CHCl and has also not developed significant adverse effects. Clinically, both patients remain well.

Discussion

This is the first report of CHCl use in HD, with excellent tolerability for more than 2 years by two patients. If replication of the PRECREST findings confirms high-dose creatine as the first potentially disease-modifying treatment for HD, CHCl may represent an important option for patients, warranting further studies.

In this context, it is disappointing that CREST-E was closed in late 2014 after interim analysis showed it was unlikely to show that creatine was effective in slowing loss of function in early symptomatic HD based on clinical rating assessment to date. There were no safety concerns.11

It will be interesting to see, when eventually analysed and published, whether the magnetic resonance imaging (MRI) data from CREST-E showed any benefit in any subgroup and whether the trial cohort as a whole were in fact all in early-stage disease, and to consider whether the clinical rating scales were sensitive enough in this specific trial context.

Although others disagree, I argue that it remains unclear based on PRECREST findings whether the lack of benefit of creatine for early symptomatic disease in CREST-E is strictly relevant to the much earlier presymptomatic stage of the disease, especially when patients are far from onset.

HD symptoms take 30–50 years to develop, and the disease generally progresses to early dementia and death. Progressive MRI abnormalities accumulate for 20 or more years before onset. It appears that by the time the disease becomes symptomatic after 30–50 years, a multiplicity of interacting pathogenic mechanisms have become active (eg, excitotoxicity, mitochondrial energy deficit, transcriptional dysregulation, loss of melatonin receptor type 1, protein misfolding, microglial activation, early loss of cannabinoid receptors, loss of medium spiny striatal neurones, oxidative stress), and early and late events have occurred. The authors of a study of postmortem HD brain tissue refer to these mechanisms as a “pathogenetic cascade”,12 while others refer to them as multiple interacting molecular-level disease processes.13 “Early” downregulation of type 1 cannabinoid receptors has been identified as a key pathogenic factor in HD.14 In a recent review on the pathophysiology of HD, the authors described “a complex series of alterations that are region-specific and time-dependent” and noted that “many changes are bidirectional depending on the degree of disease progression, i.e., early versus late”.15 These and other findings suggest that HD has a complex temporal and mechanistic evolution that has not been fully elucidated. For this reason, we should think carefully before abandoning an agent when it fails at the relatively late symptomatic stage of this devastating and incurable disease.

As creatine is thought to have a useful potential for action in relation to only one of the many relevant disease mechanisms — mitochondrial energy deficit — was it too much to expect creatine to have a significant impact on symptomatic-stage disease in CREST-E? It seems possible, based on the references cited above, that there are fewer (or less intense) pathogenic mechanisms operating at much earlier presymptomatic stages of the disease, when the brain is more intact and plastic. If so, treatment trials in presymptomatic patients assessed using MRI or other biomarkers might offer better prospects for benefit.

I believe that sophisticated replication of PRECREST (or at least clarification as to whether the slowed rate of atrophy on MRI in premanifest patients was genuine or artefactual) is an ethical obligation that we owe to the HD community who contributed so much to CREST-E.

There are significant ethical and sociomedical issues associated with HD research. In reviewing the literature, it was obvious that early-phase research contains multiple examples of existing, out-of-patent or non-patentable potential therapies that appear to warrant modern clinical trials and, I argue, at an appropriate early stage of the disease.2,16,17 Early-phase studies of combination therapies with existing agents appear frequently to receive little, if any, follow-up.2,18

Currently, any drug for which US Food and Drug Administration or European Medicines Agency approval is sought for presymptomatic HD must achieve a clinical end point first in symptomatic HD, then requalify in presymptomatic HD, meeting combined clinical and biomarker end points. Does this arbitrarily overprivilege the clinically observable stage of a disease, which is now understood (based on relatively recent MRI studies) to have a course of 20 or more years before symptoms begin?

Because of the enormous costs associated with drug development, and the uncertainty of such research, I believe that it is time for a renewed focus on small, targeted clinical trials, especially in premanifest HD, using existing and novel agents. Recent advances in MRI and additional biomarkers that are under development19 open the possibility of meaningful small trials that aim to slow HD progression until gene therapy arrives.

None of this, however, will achieve its full potential unless we address the barriers to genetic testing. The true incidence of many genetic conditions, including HD, in Australia is unknown. If a treatment becomes available, more people will want to be tested. The decision to have genetic testing is complex, controversial and uniquely personal. Respecting this, I believe that we need to urgently follow the lead of the United States, Germany, Sweden, France, Denmark and other countries in legislating to end genetic discrimination in health, insurance, employment and services.20 I urge policymakers to replicate and clarify PRECREST and, in full collaboration with the HD community, trial existing and available medications alongside novel agents.

New and emerging treatments for Parkinson disease

The main aim is to maintain quality of life throughout the illness

World Parkinson’s Day commemorates the birth of James Parkinson on 11 April 1755, and it will soon be the 200th anniversary of his description of the “shaking palsy”.1 In this article I highlight some of the advances in Parkinson disease (PD) therapy since the topic was most recently reviewed in the Journal.2

The first step: diagnosis

The diagnosis of PD is the first step in its management. Even years after PD is diagnosed, patients report that “satisfaction with the explanation of the condition at diagnosis” continues to have an impact on quality of life.3 Diagnosis is not always straightforward. In a UK study, only 44% of patients with PD were initially referred to a neurologist, the other patients being referred to general physicians, orthopaedic surgeons, urologists, psychiatrists and rheumatologists. Pain was the symptom that most frequently impaired the recognition of PD, while frozen shoulder, spondylosis, depression and anxiety were among the common misdiagnoses.4 The 1997 charter of the European Parkinson’s Disease Association recommends that all patients be referred to a doctor with a special interest in PD.5

Managing non-motor symptoms

Non-motor symptoms, an intrinsic part of PD, have a major impact on quality of life. Anxiety and depression will develop in about 60% of patients with PD; this is twice the rate seen in the general population. The severity of mood disturbance or apathy is the most important determinant of quality of life in patients receiving treatment, having a greater impact than motor impairment.3,6 It is important to determine whether mood fluctuates with the motor “on” and “off” states and might therefore be responsive to dopaminergic therapy, or is more pervasive and requires supplementary pharmacological or non-pharmacological treatment. Even after accounting for the effects of altered mood, “current feelings of optimism” still have an independent influence on quality of life.3

Impaired olfaction, chronic constipation, and rapid eye movement (REM) sleep behaviour disorder (yelling or thrashing about while dreaming) can predate the onset of motor symptoms by years and even decades. Studies are underway to determine whether these features might be used to enable diagnosis of PD in the premotor stage of the disorder.

Selecting the initial therapy

Monoamine oxidase B (MAO-B) inhibitors, dopamine agonists (DAs) and levodopa can be employed in the initial treatment of PD, with each approach having its advantages and disadvantages.7 As no treatment has been unequivocally shown to either slow or hasten disease progression, the primary goal of therapy should be to restore and maintain quality of life. There is no advantage in delaying therapy if this has a negative effect on quality of life. Initial therapy may influence short- to medium-term outcomes, and should be tailored to the needs of the individual. The MAO-B inhibitor rasagiline may achieve a marginal slowing of disease progression, requires minimal titration and is well tolerated,8 but its symptomatic effect may not be as great as that of a DA or levodopa.9 It should therefore be considered for patients suffering only minor disability, or for those for whom rapid amelioration of disability is not required.

Early treatment with the DA pramipexole has been shown to reduce the risk of motor complications by 55% over 2 years compared with levodopa monotherapy, equivalent to treating 4–5 patients with pramipexole instead of levodopa to prevent one additional complication.10 This benefit, however, needs to be balanced against its potentially serious side effects, especially impulse control disorders, which develop in around 17% of patients using DAs,11 and excessive daytime somnolence, which can culminate in sleep attacks (sudden, irresistible episodes of falling asleep, including while driving).
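The relationship between the reported 55% relative risk reduction and the "treat 4–5 patients" figure is the standard number-needed-to-treat (NNT) calculation. A minimal sketch follows; the baseline risk used here is an illustrative assumption chosen only for consistency with both reported figures, not a value taken from the trial:

```python
# NNT from a relative risk reduction (RRR).
# risk_levodopa is an assumed, illustrative 2-year complication rate;
# it is not the trial's exact figure.
risk_levodopa = 0.40
risk_pramipexole = risk_levodopa * (1 - 0.55)   # 55% relative risk reduction
arr = risk_levodopa - risk_pramipexole          # absolute risk reduction
nnt = 1 / arr                                   # patients treated per complication prevented

print(f"ARR = {arr:.2f}, NNT = {nnt:.1f}")
```

With any plausible baseline in this range, the NNT lands between 4 and 5, matching the figure quoted in the text.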

Levodopa is the most potent and well-tolerated symptomatic treatment, its drawback being an increased risk of dyskinesia.12,13 A recent study of more than 1500 patients randomly allocated to initial treatment with an MAO-B inhibitor (selegiline or rasagiline), DA or levodopa showed only very small differences in global outcome at 3–7 years. The best results were achieved with levodopa in terms of mobility and quality of life, and with the levodopa-sparing agents in terms of avoiding dyskinesias.13 The overall results reinforced the view that any of the three options is a reasonable choice as initial therapy.

Advanced therapies for patients with disabling symptoms

Patients with disabling symptoms who do not respond to adjustments of medication dose should be referred earlier rather than later for consideration of an advanced therapy. Only a minority of PD patients who might benefit from an advanced therapy are currently referred for assessment. There is a limit to the relief of severe, unpredictable motor fluctuations that can be achieved by adjusting the dose of standard oral medications, driven in large part by impaired gastric emptying in PD.14 To allow patients to make an informed decision about their preferred method of treatment, they should ideally be referred to a specialist with broad experience in advanced therapies.

Two advanced therapies, deep brain stimulation (DBS) and levodopa–carbidopa intestinal gel (LCIG, commonly referred to by its trade name, Duodopa [AbbVie]), have been shown to effectively reduce severe motor fluctuations (including dyskinesias) in randomised controlled studies, decreasing “off” time and increasing quality “on” time by an average of 4 to 5 hours per day. Three trials have found that DBS therapy improved quality of life and motor function better than adjustment of medical therapy.15–17 DBS was approved for treating PD in Australia in 2001, but inequality of access remains a problem. Public funding for the stimulator device is limited in many states, so that the therapy is primarily financed by private health insurance or self-funding. More recently, LCIG therapy for severe motor fluctuations has been shown to confer benefits similar to those of subthalamic nucleus DBS.18,19 LCIG is delivered as a continuous infusion into the jejunum, bypassing the problem of impaired gastric emptying. Patients need to have a permanent percutaneous endoscopic gastrostomy tube inserted, through which a finer jejunal tube is placed and connected to an external pump that houses the LCIG cassette. LCIG was approved by the Therapeutic Goods Administration in 2008 and has been funded by the Pharmaceutical Benefits Scheme since 2011. There is evidence that LCIG and DBS also improve non-motor symptoms.20,21 For patients who need an advanced therapy but prefer a less invasive option than DBS or LCIG, intermittent or continuous subcutaneous delivery of apomorphine (a dopamine agonist without opioid properties) can be an effective alternative,22,23 but its benefits are less well documented.

Conclusion

There have been significant advances in our understanding of the motor and non-motor symptoms of PD in the past decade. New therapeutic approaches and options are available, and general practitioners play a central role in educating patients about, and facilitating access to, optimal therapy, so that patients can make informed and positive choices.

Intravenous OxyContin-associated thrombotic microangiopathy treated successfully without plasma exchange

In April 2014, in response to intravenous misuse of oral extended-release oxycodone hydrochloride, a new tamper-resistant formulation was released in Australia. We report a case of thrombotic microangiopathy after intravenous misuse of the new tamper-resistant formulation that was successfully managed without plasma exchange.

Clinical record

A 56-year-old man of European ancestry with no clinically significant medical history presented with a 3-day history of periumbilical abdominal pain. He admitted to daily intravenous (IV) misuse of oral extended-release oxycodone hydrochloride (OxyContin; Mundipharma) over a period of months. For the 5 weeks before presentation, he had been injecting the new tamper-resistant formulation because he was unable to access the discontinued crushable form.

On presentation, the patient was afebrile, had a pulse of 95 beats/min, blood pressure of 154/85 mmHg, a respiratory rate of 14 breaths/min and oxygen saturation of 97%. On examination, he had mild periumbilical tenderness. Results of cardiovascular and respiratory examinations were unremarkable. No murmur was detected. There was no injection site infection or axillary lymphadenopathy.

Laboratory investigations showed a haemoglobin level of 87 g/L (reference interval [RI], 135–180 g/L), a total white cell count of 15.0 × 10⁹/L (RI, 4–11 × 10⁹/L), a neutrophil count of 10.84 × 10⁹/L (RI, 2–8 × 10⁹/L), a monocyte count of 1.47 × 10⁹/L (RI, 0.1–1.0 × 10⁹/L) and a platelet count of 53 × 10⁹/L (RI, 140–400 × 10⁹/L). Electrolyte levels were normal, and serum creatinine level was normal at 66 µmol/L. The patient’s unconjugated bilirubin level was 34 µmol/L (RI, < 20 µmol/L) and lactate dehydrogenase level was 769 U/L (RI, 150–280 U/L); other liver function test results were normal. His reticulocyte count was 168 × 10⁹/L (RI, 10–100 × 10⁹/L), haptoglobin level was 0.04 g/L (RI, 0.36–1.95 g/L) and Coombs test result was negative. Three per cent of his red blood cells were fragmented and polychromasia was present, consistent with microangiopathic haemolytic anaemia (Box 1). ADAMTS13 activity was 70% (RI, 40%–130%).

The patient’s serological test results for hepatitis B, hepatitis C and HIV were negative. Vitamin B12, folate, lupus anticoagulant, anticardiolipin, anti-β2 glycoprotein I, antinuclear antibody, extractable nuclear antigen and complement levels were normal. His ferritin level was elevated at 442 µg/L (RI, 30–300 µg/L), and transferrin saturation was normal at 18%. Activated partial thromboplastin time and prothrombin time were normal and his fibrinogen level was elevated (5.4 g/L [RI, 1.7–4.5 g/L]). A random urine test showed a proteinuria level of 340 mg/L (RI, < 100 mg/L) and a protein-to-creatinine ratio of 66 g/mol (RI, < 15 g/mol).

We elected to manage the patient’s condition conservatively without the use of plasmapheresis, steroids or antiplatelet agents. Spontaneous resolution of the microangiopathic haemolysis followed, and subsequent outpatient review showed that his parameters continued to normalise (Box 2).

To our knowledge, this is the first case of thrombotic microangiopathy (TMA) associated with IV-administered reformulated OxyContin.

Discussion

Misuse of prescription opioids is an increasing problem in Australia. Morphine, oxycodone and methadone liquid are the most commonly misused prescription opioids, with injection being the route of administration in 75% of misusers.1 In response to these concerns, the National Pharmaceutical Drug Misuse Framework for Action (2012–2015) (http://www.nationaldrugstrategy.gov.au/internet/drugstrategy/Publishing.nsf/content/drug-mu-frm-action) has been developed, outlining various strategies to reduce misuse of pharmaceutical medications.

In April 2014, in response to IV misuse of OxyContin, a new crush-resistant formulation intended to deter tampering and misuse of the drug was released in Australia, and supply of the old formulation was discontinued. The new tablets, which are embossed with “OP” (the original formulation was embossed with “OC”), are difficult to break, cut, crush or chew, and when added to water they form a viscous hydrogel that cannot readily pass through a needle.

However, in the United States, where tamper-resistant OxyContin was introduced in August 2010, 24% of misusers reportedly found ways to inject the new formulation.2 Similarly, since its introduction in Australia, misusers presenting to the Sydney Medically Supervised Injection Centre have successfully injected the tamper-resistant formulation.3

TMA is a rare but serious blood disorder characterised by thrombosis in arterioles and capillaries that manifests clinically as a microangiopathic haemolytic anaemia and thrombocytopenia. Types of TMA include thrombotic thrombocytopenic purpura (TTP) or TMA associated with ADAMTS13 deficiency; haemolytic–uraemic syndrome (HUS) or infection-induced TMA; atypical HUS or TMA associated with disorders of complement; drug-induced TMA (Box 3); and TMA associated with other conditions including transplantation, lupus, glomerulopathies and malignant hypertension.4 For patients with TMA, HIV and other active viral infections should also be excluded.

Laboratory findings in TMA are consistent with non-immune haemolysis, including anaemia with high reticulocyte counts, elevated bilirubin and lactate dehydrogenase levels, reduced serum haptoglobin levels, red cell fragmentation and negative Coombs test results.

Treatment of TMA varies depending on the cause. Patients with TTP require plasma exchange, whereas HUS treatment is largely supportive. Atypical HUS is treated with plasma exchange and eculizumab, a monoclonal antibody that prevents C5 activation. Of the drug-induced TMAs, only ticlopidine-induced TMA has been shown to respond to plasma exchange. While determining ADAMTS13 levels assists in making a diagnosis, the decision to treat with plasma exchange is usually based on clinical history and other laboratory parameters, owing to the delay in obtaining this result and the necessity to start plasma exchange promptly.

Here, we report the first case of reformulated OxyContin-associated TMA and demonstrate that conservative management is a valid approach. While drug rechallenge would be required to prove a causal association, the temporal association with IV misuse of the reformulated OxyContin, the lack of alternative aetiologies, and the disease resolution with drug cessation indicate that the TMA was induced by the reformulated OxyContin. In addition, reformulated OxyContin is not the first reformulated drug to be associated with TMA. In August 2013, the US Food and Drug Administration and Centers for Disease Control and Prevention issued warnings regarding reformulated oral extended-release oxymorphone hydrochloride (Opana ER; Endo Pharmaceuticals; not available in Australia) after a link between IV Opana ER misuse and TMA was identified.5,6 Initial cases of reformulated Opana ER-induced TMA were treated with plasma exchange;7,8 however, subsequent reports have demonstrated that plasma exchange is unnecessary.9,10

In contrast to our case of OxyContin-associated TMA, acute kidney injury requiring haemodialysis was a prominent feature among patients injecting reformulated Opana ER.8 Our patient had a somewhat atypical presentation, with normal serum creatinine levels, minor proteinuria and no definitive evidence of other end-organ damage.

It is unclear what components of tamper-resistant OxyContin or Opana ER might trigger TMA and whether different methods of preparation can increase or decrease this risk. The new formulation of OxyContin contains inactive ingredients not found in the original formulation, including polyethylene oxide (PEO), butylated hydroxytoluene and magnesium stearate. Opana ER also contains PEO, and notably, in one study, rats intravenously injected with PEO subsequently became thrombocytopenic.11 Interaction with other prescription or illicit drugs is also a possibility. Our patient was taking sertraline 200 mg once daily and diazepam 5 mg as needed, and admitted to chronic daily marijuana use but denied any other substance misuse or medications.

Given the recent switch to tamper-resistant OxyContin in Australia, further cases of OxyContin-associated TMA may occur. A high degree of clinical suspicion is required to make this diagnosis, and we recommend that all patients presenting with TMA be questioned about the use of IV drugs. If OxyContin misuse is acknowledged, clinicians should consider conservative management, which avoids the morbidity and costs associated with plasma exchange.

1 Peripheral blood smear showing microangiopathic haemolysis with red blood cell fragmentation


Wright stain; original magnification, × 40.

2 Improvement in parameters throughout conservative management of thrombotic microangiopathy

| Parameter | RI | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 9 | 3 months |
|---|---|---|---|---|---|---|---|---|---|
| Haemoglobin (g/L) | 135–180 | 87 | 86 | 87 | 88 | 87 | 88 | 92 | 164 |
| Platelet count (× 10⁹/L) | 140–400 | 53 | 34 | 42 | 55 | 80 | 123 | 281 | 358 |
| Unconjugated bilirubin (µmol/L) | < 20 | 34 | 22 | 30 | 24 | 18 | 8 | 10 | 10 |
| Lactate dehydrogenase (U/L) | 150–280 | 769 | 870 | 889 | 858 | 688 | 611 | 439 | 221 |

RI = reference interval.

3 Drugs that may induce thrombotic microangiopathy

  • Thienopyridines: eg, ticlopidine, clopidogrel, prasugrel
  • Calcineurin inhibitors: eg, cyclosporine, tacrolimus
  • Antineoplastic agents: eg, gemcitabine, mitomycin C
  • Anti-vascular endothelial growth factor therapies: eg, bevacizumab, sunitinib
  • Imatinib
  • Quinine
  • Interferon beta
  • Tamper-resistant extended-release oxymorphone hydrochloride

Stable post-TRUS biopsy sepsis rates and antibiotic resistance over 5 years in patients from Newcastle, New South Wales

To the Editor: Sepsis after transrectal ultrasound (TRUS)-guided prostate biopsy is a potentially serious complication. Antibiotic prophylaxis reduces the risk of sepsis; however, practices are highly variable among urologists.

Several reports show an increasing incidence of bacterial resistance in patients with sepsis after TRUS-guided biopsy.1–3 Preprocedure screening of high-risk patients using cultures from rectal swabs would allow targeted prophylaxis.4,5 We performed a retrospective audit of all men undergoing a TRUS-guided prostate biopsy by one of nine urologists in Newcastle, New South Wales, to assess the incidence of bacteraemia and changes in antimicrobial susceptibility.

Unique patient data were collected retrospectively from private and public hospital records, including data on positive blood cultures from Hunter Area, Douglass Hanly Moir, Laverty and Healthscope pathology services. Patient and pathology records were cross-referenced and data were analysed using Microsoft Excel. This clinical audit did not require ethics committee approval, in accord with the NSW Health policy on authorisation to commence human research in public health organisations.

From 2008 to 2012, 4218 men underwent a TRUS-guided prostate biopsy; their median age was 64 years. Of these men, 35 (0.8%) developed bacteraemia, with the annual incidence varying between 4/935 (0.4%) and 12/999 (1.2%) over the 5 years (differences not significant). There were no recorded deaths from sepsis. Most of the cultures from men who developed bacteraemia (29/35) grew Escherichia coli; none grew Enterococcus species. In 13 of the 35 cases, the cultured isolate was resistant to at least one of the prophylactic antibiotics given before the procedure.

Bacteraemia was uncommon, and the rates we found were comparable to previously reported ones.1,2 Antimicrobial resistance fluctuated (Box) without significant change.

Of note, we found quinolone resistance for cultures from 5/12 patients who developed bacteraemia in 2012. Worldwide emergence of multidrug-resistant E. coli sequence type 131 has coincided with a rising incidence of quinolone-resistant infection after TRUS-guided prostate biopsy in New Zealand.3

Only five isolates (out of 35) were resistant to both gentamicin and ciprofloxacin, while 12 were resistant to one of these, suggesting that a prophylactic combination remains superior to either agent alone. The low overall incidence of post-TRUS bacteraemia due to a pathogen with resistance to ciprofloxacin and gentamicin (5/4218; 0.1%) implies that there is no utility in screening with cultures from rectal swabs before TRUS biopsy in the population we studied. It remains essential to administer antibiotics prophylactically 1 hour before the procedure and at an appropriate dose to maximise sepsis prevention.
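As a check on the arithmetic, the crude rates quoted in the letter can be reproduced directly from the counts it reports. A minimal sketch, using only figures stated in the text above:

```python
# Reproducing the letter's crude incidence figures from its reported counts.
bacteraemia, biopsies = 35, 4218
dual_resistant = 5            # isolates resistant to both ciprofloxacin and gentamicin

overall_rate = 100 * bacteraemia / biopsies          # overall bacteraemia rate
dual_resistant_rate = 100 * dual_resistant / biopsies  # dual-resistant bacteraemia rate
lowest_year = 100 * 4 / 935                          # lowest annual incidence
highest_year = 100 * 12 / 999                        # highest annual incidence

print(f"overall {overall_rate:.1f}%, dual-resistant {dual_resistant_rate:.1f}%, "
      f"annual range {lowest_year:.1f}%-{highest_year:.1f}%")
```

Rounded to one decimal place, these match the 0.8%, 0.1%, 0.4% and 1.2% figures quoted in the text.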

Antibiotic resistance among patients undergoing transrectal ultrasound-guided prostate biopsy who developed bacteraemia (n = 35), 2008–2012, Newcastle, New South Wales

| Year | No. of patients with bacteraemia | Gentamicin | Ciprofloxacin | Ampicillin | Trimethoprim | Cephazolin | Ciprofloxacin + gentamicin |
|---|---|---|---|---|---|---|---|
| 2008 | 5 | 2 | 2 | 3 | 2 | 1 | 2 |
| 2009 | 8 | 0 | 2 | 6 | 2 | 0 | 0 |
| 2010 | 6 | 1 | 0 | 4 | 1 | 2 | 0 |
| 2011 | 4 | 1 | 0 | 3 | 1 | 2 | 0 |
| 2012 | 12 | 4 | 5 | 11 | 5 | 7 | 3 |

Antibiotic columns show the number of resistant isolates.

First do no harm: a real need to deprescribe in older patients

We enthusiastically welcome Scott and colleagues’ article highlighting the real need to deprescribe in older patients;1 however, we would like to emphasise the role of validated tools and medication management reviews in deprescribing.

Validated tools help clinicians identify potentially inappropriate medications. Scott et al state that medications whose benefits are outweighed by harm in most circumstances, such as potent opioids, anticholinergics and benzodiazepines, account for relatively few adverse drug events (ADEs) in Australian practice. This is potentially misleading, as it relies on practitioners or consumers recognising ADEs. Recognising ADEs is relatively simple when anticoagulants cause bleeding, but more subtle ADEs, such as functional impairment from anticholinergic and sedative medications, are often not recognised and are misattributed to ageing or multimorbidity. Therefore, validated tools for recognising high-risk medicines are an important component of medication management reviews.

We agree that input from appropriately trained (accredited) pharmacists and other health care professionals with expertise in the clinical use of medicines has been shown to be beneficial for deprescribing practice. We wish to highlight the value of medication management reviews, such as Home Medicines Review (HMR) and Residential Medication Management Review (RMMR), which involve pharmacists and general practitioners in the deprescribing process. This kind of multidisciplinary model has been shown to be effective in reducing polypharmacy.2

In Australia, recommendations provided by pharmacists as part of the HMR/RMMR service reduce exposure to anticholinergic and sedative medications.3,4 Accredited pharmacists have excellent medication knowledge and communication and education skills that are highly valued by patients.5

Antibiotic prescribing practice in residential aged care facilities — health care providers’ perspectives

In reply: We acknowledge the importance of the consumer in shared decision making regarding patient care. Our focus was to explore antibiotic prescribing practices from health care providers’ perspectives,1 including workflow-related factors, in residential aged care facilities (RACFs).

We aimed to identify the potential for point-of-care tools in RACFs, among other interventions. Not surprisingly, a perception of consumer demand for antibiotics was reported, and this has been evident in other publications.24 Nevertheless, the views of consumers were outside the scope of our study.

Importantly, we remain interested in consumer perspectives on antibiotic use. Members of this research team are directly involved with consumer-based activities related to antibiotic prescribing, such as developing clinical care standards.