
Undetected and underserved: the untold story of patients who had a minor stroke

Equity of access is particularly concerning for minor stroke

Medical advances, such as stroke units, improved primary and secondary stroke prevention, and hyperacute treatments, have revolutionised acute stroke management.1 The lessening of stroke severity as a result of such ground-breaking initiatives has, however, led to a larger proportion of individuals returning to community living following minor strokes2 (ie, strokes with minimal motor deficits or no obvious sensory abnormality). In this article, we review the current literature to identify the potential difficulties experienced following a minor stroke.

Individuals who survive a more severe stroke often undergo extensive multidisciplinary rehabilitation in an inpatient setting. By contrast, patients who have a minor stroke are likely to be discharged home early, often with limited referrals to services beyond their general practitioner.3 This is despite increasing evidence that survivors of minor stroke may have persisting stroke-related impairments that require rehabilitation.4 These “hidden” impairments may not become apparent until after discharge, when the patient attempts to resume their usual daily activities.2,4 Edwards and colleagues4 found that despite full independence with personal activities of daily living, 87% of patients who had a minor stroke reported residual difficulties with mobility, concentration, and participation in social activities and physically demanding leisure activities such as golf. These persisting subtle impairments may cause social and economic disruption for the individual and their family; however, because they are difficult to identify in the hospital setting, coordination between primary and secondary care may be poor, especially if the patient is deemed fully independent at discharge. When the impairments are detected at a later stage, rehabilitation or support services may not be accessible, potentially rendering the patient “lost” in the health care system.

Equity of access is particularly concerning for minor stroke. In regional Australia, there may be no hospital or community rehabilitation services available,5 with patients at home dependent on the Medicare rebate for access to private allied health services within the current Chronic Disease Management (CDM) program.6 Women, who are more likely to be discharged to residential care, face further access challenges.7 Compounding this is evidence suggesting that not all patients who may benefit from inpatient rehabilitation are appropriately identified,1 which is concerning given the “hidden” nature of many impairments resulting from minor stroke.

A systematic review by Tellier and Rochette2 revealed that patients who have had a minor stroke often have impairments that span the domains of physical status, emotional health, cognition and social participation. The combined effect of these impairments may be an inability to fully resume valued activities, leading to reduced quality of life.2 Studies have shown that between one-third and two-thirds of minor stroke survivors have compromised social participation outcomes.2,4 Edwards and colleagues4 found that 62% of patients who had a minor stroke had difficulty returning to employment or volunteer work, while 36% had reduced social activity 6 months after the stroke. Since about 30% of strokes occur in individuals under 65 years of age,8 these figures are particularly troubling. It is worth noting, however, that participants in the study by Edwards and colleagues4 had experienced a single ischaemic stroke and had a mean age of 64.74 years (range, 20–97 years). Therefore, as only about half of the participants4 fell into the young stroke category, it is unknown how accurately these figures reflect return-to-work outcomes specifically for younger patients who have had a minor stroke.

The 2014 National Stroke Foundation Rehabilitation audit9 found that less than 40% of patients who had a stroke received a psychological assessment before discharge. Formal neuropsychological assessment is expensive and unavailable in many areas, so inpatients rarely receive one, even when experiencing obvious impairments, such as aphasia or pronounced memory deficits. For people who have had a minor stroke, impairments are even less obvious and may manifest as a diverse range of milder cognitive problems, including attentional neglect or reduced processing speed. A neuropsychological assessment could identify these deficits and their impact on functioning, and inform recommendations for compensatory strategies or adjustments to reduce this impact.

Mental health problems, in particular depression, are prevalent regardless of stroke severity, with 25–29% of patients who have had a minor stroke reporting depression in the first year.10,11 Early and late onset post-stroke depression has been associated with disability and poor physical and mental health at 1 year,11 and with a reduced likelihood of driving a vehicle, participating in sports or recreational activities, and maintaining interpersonal relationships at 6 months after the stroke.12 It is encouraging that improvement of depression within the first year after the stroke has been associated with better functional outcomes and quality of life.10 This highlights the need to regularly monitor patients after a minor stroke so that depression is identified and treated as soon as possible. Despite apparently good recovery, depression remains a risk, and some patients require coordinated referral to services, medication and psychological support.

As with most patients who have had a stroke, patients who have had a minor stroke are usually unable to drive for a period of time, relying instead on public transport, family members or unapproved driving for transport to medical appointments and other destinations. Research has found that one in four young survivors of stroke (aged 18–65 years) return to driving within 1 month after the stroke, despite recommendations to the contrary.13 Drivers who have had a minor stroke perform significantly worse on complex driving tasks with greater cognitive load (eg, turning across oncoming traffic and following a bus), and make twice as many driving errors as control subjects.14 In addition to the detrimental influence of spatial, visual and cognitive impairments, the risk of seizure contributes to the moratorium on driving after a stroke. Premature return to driving may reflect poor compliance with advice that is perceived as inconvenient and perhaps not fully explained to patients. Education about driving restrictions and alternative transport options, along with ongoing monitoring of fitness to drive, should be part of primary health care for patients who have had a minor stroke.

Patients who have had a minor stroke are also at risk of hospital re-admissions due to other medical conditions. For example, patients who have had a minor stroke have a heightened risk of experiencing a subsequent cardiovascular event.15 They may also have an array of concomitant medical conditions, including diabetes mellitus, atrial fibrillation and congestive cardiac failure,15 and may benefit from a coordinated approach to manage these comorbidities and prevent hospital re-admission.

Six months after a minor stroke, patients do significantly less high intensity physical activity than before the stroke and, despite the benefits of physical activity for secondary stroke prevention, tend not to take up new high intensity activities.12 Indeed, Kono and colleagues16 found that higher levels of exercise, in the form of daily step counts, were associated with a reduced risk of new vascular events following minor stroke. Patients who have had a minor stroke and are living in the community may benefit from education about secondary stroke prevention. A GP-led, multifaceted and target-based approach to secondary stroke prevention may be ideal for this population, especially given that a combination of medications (eg, aspirin, a statin and an antihypertensive agent), exercise and dietary modifications has been found to produce a cumulative relative risk reduction for stroke of about 80%.17
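To illustrate how such a cumulative figure can arise, if the individual interventions are assumed to act independently, their relative risk reductions combine multiplicatively rather than additively. The sketch below uses purely illustrative per-intervention values (assumptions chosen to show the arithmetic, not estimates from the cited study).

  # Illustrative only: combining independent relative risk reductions (RRRs).
  # The per-intervention values below are assumptions, not figures from the
  # cited literature.
  interventions = {
      "antiplatelet agent": 0.25,
      "statin": 0.30,
      "antihypertensive": 0.30,
      "exercise and diet": 0.30,
  }

  residual_risk = 1.0
  for name, rrr in interventions.items():
      residual_risk *= 1.0 - rrr        # each intervention scales the remaining risk

  cumulative_rrr = 1.0 - residual_risk  # about 0.74 (74%) with these values
  print(f"Cumulative relative risk reduction: {cumulative_rrr:.0%}")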

Conclusion

In summary, minor stroke is a chronic health condition with long term impairment and disability.2 Residual impairments and comorbidities often require the involvement of multiple health care providers, the need for which may not always be evident at the time of stroke. Community-living patients who have had a minor stroke may currently be managed through initiatives such as the CDM program. However, access to CDM items can be problematic and, given the mild nature of minor stroke, these items are likely to be overlooked. The five sessions per calendar year under the CDM program — which cover a range of allied health services, such as speech pathology, occupational therapy, psychology and physiotherapy, with a Medicare rebate that may cover the total cost if the provider accepts the Medicare benefit as full payment for the service — are often inadequate for patients with more complex needs, but may be ideal for the population who have had a minor stroke and hence a good use of existing resources. We therefore need to audit existing strategies in primary care to uncover which processes are working well and which require attention. This is particularly pertinent given the creation of new government initiatives, including the National Disability Insurance Scheme (for which, however, patients who have had a minor stroke appear unlikely to be eligible) and Primary Health Networks within the Health Care Home framework.

A GP-led approach that coordinates a range of primary and allied health professionals close to the home of patients who have had a minor stroke may be the ideal way to meet the needs of this population and prevent costly re-admissions to hospital, while simultaneously maximising quality of life. To ensure that community-dwelling patients who have had a minor stroke and have unmet needs are not missed, we need a coordinated, integrated primary health care response that detects and manages impairments and activity restrictions as they arise, along with medical comorbidity management and self-management support. At a minimum, we need to ensure that all patients who have had a minor stroke, regardless of their geographic location, have improved access to formal neuropsychological assessment, falls prevention, exercise programs and more extensive Medicare-based allied health funding if required. The key to this is auditing existing programs and investigating the relevance of new government initiatives as they arise for these patients, while also improving the communication between hospitals and primary health care services. Further study of the unmet needs and mechanisms for ensuring access for all patients who have had a stroke is also vital.

Regenerative neurology: meeting the need of patients with disability after stroke

If regenerative neurology restores function, it will meet a huge unmet need and change dogma

Treatment of stroke in the acute phase has come a long way with the development of paramedic, emergency department and stroke team pathways for hyperacute assessment and management with intravenous thrombolysis, endovascular clot retrieval and hemicraniectomy. Acute stroke units reduce mortality and morbidity by as much as 20%.1 An estimated 80% of stroke patients survive for one year after stroke, with the large majority being left with chronic disability.2 In Australia and many other countries around the world, stroke is the leading cause of adult disability.3 It is estimated that up to 450 000 Australians have disability after stroke.4,5

The only intervention currently available to stroke survivors is rehabilitation. Increasing evidence suggests that rehabilitation complements the natural functional recovery process that can often continue for months or years after stroke.6 However, there are persisting gaps in our understanding of the basic biological pathways that drive post-stroke recovery, and these pose challenges in applying evidence-based rehabilitation strategies in the real world. This becomes especially critical as patients often need a combination of rehabilitation strategies that cater for their specific disability and complement their potential for long-term recovery. These are often required beyond the period for which rehabilitation services are currently made available due to resource constraints.7 So where does that leave us in 2017?

Regenerative neurology or stem cell therapy may provide an answer to this unmet need by potentially restoring neurological function in an individualised manner. Many stem cell researchers and clinicians hold the view that the field of regenerative medicine may have as large an impact on humanity as antibiotics.8

Basics of stem cells

Stem cells are unique in possessing two qualities — the capacity for self-renewal and the potential for multilineage differentiation. If a stem cell is pluripotent, it can give rise to cells derived from all three germ layers (ectoderm, mesoderm and endoderm) that differentiate into different tissues during embryonic development. On the other hand, a multipotent stem cell tends to generate limited cell types, often relevant to the organ from which the stem cell was derived — for example, haematopoietic stem cells (HSCs) tend to generate blood and immune cell types. Embryonic stem cells isolated from the very early embryo are pluripotent while adult somatic stem cells derived from adult organs, such as mesenchymal stem cells from bone marrow, are multipotent, similar to HSCs.

A significant clinical limitation to the therapeutic use of embryonic stem cells is their potential to form tumours, such as teratomas, which contain multiple cell types from the different embryonic lineages (hair, bone, teeth, heart muscle, etc).9 In contrast, to date, multipotent cells such as mesenchymal stem cells are considered safer, with animal studies reporting no increase in tumorigenicity.10

In 2006, Yamanaka (2012 Nobel Laureate in Physiology or Medicine) showed that somatic cells (skin fibroblasts) could be genetically engineered with four genes (known as the Yamanaka factors) to produce pluripotent cells similar to embryonic stem cells.11 This third type of stem cell is termed an induced pluripotent stem cell (iPSC). This discovery has radically transformed stem cell research and offers the prospect of personalised regenerative medicine. Early clinical trials have already started deriving iPSCs from an individual’s fibroblasts for autologous (self-)treatment or personalised medicine.12 The findings of preclinical studies in stroke models have provided encouraging evidence of the potential for neuroregeneration and useful insights into potential applicability in the future.13–15

Chronic stroke and local injection

Last year was an exciting one for stem cell therapy in stroke patients. Two high impact publications documented early phase clinical studies with two different multipotent stem cells, SB623 and CTX0E03. Both are genetically modified stem cell types, one isolated from fetal brain tissue16 and the other from adult bone marrow.17 Two independent research teams from reputable institutions in the United Kingdom and United States performed these studies with industry funding (ReNeuron and SanBio, respectively).

This research examined two key questions in relation to study design:

  • Is it potentially useful to treat stroke survivors in the chronic phase when their disability has plateaued, sometimes as long as 3 to 4 years after stroke?

  • Is intracerebral implantation of stem cells a feasible route of administration?

Published preclinical and preliminary clinical data indicate that the design of the studies was valid, although research opinion is often divided as to optimum timing and route of administration of cell transplantation.9

Why was stem cell therapy not administered in the acute phase after stroke in these studies? There may be a number of clinically pragmatic answers to this question — in the acute phase, patients may be too medically unstable to undergo neurosurgery. Moreover, patients are often still showing rapid improvement, so it would be problematic to measure any benefit above that of optimum acute stroke unit care, when disability has not yet plateaued.18

Why was a neurosurgical implantation chosen? “Functional neurosurgery” is a fast-developing specialty and these neurosurgeons routinely implant electrodes for deep brain stimulation to treat Parkinson disease. They therefore have the expertise to inject, via a narrow bore cannula, deposits of stem cells into multiple sites within the human brain. One benefit to the patient of intracerebral implantation is that the cells remain within the brain and can be imaged non-invasively.19 An alternative route of administration used in earlier clinical studies was intravenous injection.20 Initially, this approach was considered safer than intracerebral implantation, but it is now appreciated that there is a theoretical risk of distant tumorigenicity, in that stem cells injected intravenously may deposit widely throughout a number of organs within the body (eg, lung and liver) and may interact with presymptomatic tumours.20

Is it safe?

Early phase clinical trials characteristically involve small numbers of patients to minimise the number at risk if there is a serious treatment-related adverse event. In the two studies described above,16,17 27 patients were followed for 12 months after treatment, which is a generally accepted timeframe. The studies reported that no adverse event directly attributable to the stem cell therapy was found. However, the neurosurgical procedure of creating a burr hole and entering the brain to administer the cells did result in appreciable anticipated adverse events (eg, haematoma, headache and other symptoms related to the consequent reduction of intracranial pressure). It is noteworthy that both studies will continue surveillance of all patients after 12 months to detect any longer term adverse events.

We propose an alternate perspective with respect to the claims that no stem cell-related adverse events occurred. Stem cells implanted into the brain are known from preclinical data to differentiate into neural cells and probably integrate within the brain.9 In theory, this cellular behaviour has the potential to form an epileptogenic focus. A small number of patients in each of the two high impact studies16,17 were reported to have seizures. With this limited clinical dataset, it cannot be concluded whether these seizures arose from the neurosurgical procedure, as suggested in the publications,16,17 or were related to the stem cells. We propose that larger phase 2/3 studies should incorporate electroencephalography investigations to better understand the association of seizures with intracerebral stem cell implantation.

The clinical data from these two early phase studies support the feasibility and safety of intracerebral implantation of stem cells in patients with chronic disability after stroke. Both studies used an escalating dose of stem cell therapy. Cell doses of up to 10 million SB623 and 20 million CTX0E03 stem cells may be used in future larger phase 2 studies.

So: does it work?

This question will not be answered with any degree of certainty for a number of years as we await the results from large, multicentre, multinational, double-blind, randomised controlled clinical trials. While preclinical data from animal studies suggest an overall functional improvement of 40.6%, the extrapolation of these findings to human stroke pathophysiology is limited by: (i) species-specific differences; and (ii) the fact that controlled induction of cerebral ischaemic lesions in animals is not fully representative of the heterogeneous lesion load seen with human stroke.9

Early clinical studies enrolled a heterogeneous mix of patient groups. Most of these studies were open label and single arm and thus not designed to answer the question of efficacy. Therefore, at present, it is difficult to postulate any differential benefit for specific patient or stroke subgroups.18 From a mechanistic perspective, there are a number of theories from preclinical data on how stem cell therapy may decrease post-stroke disability (Box), with neuroplasticity considered to be an important factor.21

An aspect of immense practical relevance is that standardised rehabilitation was not provided to participants in these studies. There is an ongoing debate about the potential confounding effect of rehabilitation on functional and structural outcomes. However, rehabilitation is accepted as a standard of care to optimise natural recovery, and guidelines for stem cell research such as Stem Cell Therapy as an Emerging Paradigm for Stroke (STEPS)22 recommend its inclusion in trial design. Stroke clinicians will know from everyday experience that significant improvement in neurological function many years after an ischaemic stroke is rarely observed. The two studies described above16,17 are very important in the field of regenerative neurology in that both found an associated improvement in function in the chronic phase of stroke among patients with different areas of stroke-induced injury. In light of the emerging evidence for long-term potential to relearn that can be harnessed by rehabilitation, stem cell implantation along with targeted and protracted rehabilitation could have a synergistic and biologically plausible impact on post-stroke recovery.

It is of fundamental interest that both studies described changes on magnetic resonance imaging (MRI) of the human brain after treatment. It was suggested that these MRI findings may not be explained by the neurosurgical procedure alone.17 These preliminary findings may present an opportunity for reverse translational research, from the clinic back into the research laboratory, to gain a better understanding of how changes in the human brain may occur after stem cell therapy.

At this juncture of stem cell research in stroke, there are three important points to be considered:

  • The preclinical and early clinical data which suggest that stem cell therapy may be helpful are becoming encouragingly robust.23

  • The preponderance of failed attempts to translate preclinical findings into clinical therapeutics in stroke highlights that continued scientific rigour is critical.

  • Ongoing stem cell tourism across the world and in Australia to reach centres that operate for financial gain without regard to research integrity or patient safety poses a significant danger to the credibility of this field.24

The current regulatory framework in Australia for oversight of cellular therapies has significant gaps in scope as well as implementation. It is a matter of urgency that our politicians and regulatory authorities collaborate with their counterparts in the US, European Union, Japan and other regions where innovative approaches are being implemented to develop the field while creating adequate safeguards to protect patient interests.25,26

Exciting scientific research is that in which the questions raised outweigh the answers. We suggest the quest to fulfil the unmet need for treating disability after stroke has taken a step forward.

Box –
Putative mechanisms of action of stem cells in stroke*


* Adapted with permission from Nagpal et al.21

Clot retrieval and acute stroke care

Resource distribution in stroke care must be rational and evidence-based, not driven by media coverage

There has been a sudden upsurge of interest in the availability of endovascular clot retrieval (ECR) for stroke treatment (Box). Recent newspaper articles published in New South Wales1,2 highlight the potential benefits as well as the financial and logistical challenges of providing around the clock ECR services in a vast country like Australia.

Many well designed randomised trials have demonstrated the efficacy and safety of ECR,3 but the evidence of benefit, although dramatic in some cases, is confined to patients whose ECR procedure begins within 6 hours of symptom onset,4 with or without intravenous thrombolysis. However, before 24/7 ECR services can operate in Australia, individual health services need to examine the challenges of transporting eligible stroke patients from emergency departments wherever they may be — remote, regional or urban — to a comprehensive stroke centre within the required time window. This potentially means covering vast distances — several hundred kilometres in some cases — in a matter of hours. A detailed analysis then needs to demonstrate that the benefits of a rapid ECR service justify funding over many other competing health care needs. Victoria, the third smallest Australian health jurisdiction (after the Australian Capital Territory and Tasmania) with the second highest population,5 has set an example by starting a statewide 24/7 ECR service. However, it is unclear at this stage whether a similar service would be feasible in larger states with lower population densities, given the similar challenges for ECR service provision in the United States and Canada.6 Currently, there are no 24/7 statewide ECR pathways in New South Wales. In the Australian Capital Territory, the health service is establishing a 24/7 ECR service in 2017, although many hurdles remain.

However, even within stroke care, there are other, perhaps more immediate, needs. Currently, there are large gaps in our ability to deliver the basics of stroke care, particularly in regional Australia. For example, a 2015 national audit of acute stroke services coordinated by the Stroke Foundation showed that only 67% of patients were admitted to a stroke unit,7 even though patients admitted to stroke units with any type of stroke, ischaemic or haemorrhagic, are more likely to be alive, living at home and independent 1 year after their stroke.8

Another area that could be improved is the availability of advanced imaging. Currently, as a result of a lack of expertise and local department policy, too many stroke patients, even in urban centres, do not benefit from multimodal imaging such as computed tomography (CT) angiography or CT perfusion scans. These scans can demonstrate the presence and location of a clot within the cerebral vasculature, as well as the size of the penumbra (the area of the brain at risk of infarction without urgent revascularisation), to identify patients who may benefit from aggressive and more invasive treatments such as ECR.9

Further, intravenous thrombolysis — a proven10 therapy that is more easily accessible than ECR — is currently not delivered to many stroke patients who would be eligible. The national thrombolysis rate is languishing at 7%, unchanged from 2011.7 Some work has already been done to improve this by establishing 24/7 acute stroke teams and forming regional acute stroke networks in which a large hospital provides thrombolysis expertise and support for regional and rural hospitals. Setting up a statewide ECR service without widely available capacity to optimally assess, image, treat and transport stroke patients risks squandering precious resources without clear benefit for the majority who do not live in the immediate vicinity of an ECR centre.

Delivering a world class stroke service is complex; the constantly advancing evidence base means that the entire service is always playing catch-up. It is important to foster a provision and funding model for rational, holistic and flexible stroke services that puts patient outcomes first and covers all aspects of stroke care — not just the acute reperfusion therapies of intravenous thrombolysis and ECR, but also stroke unit care and specialist neurorehabilitation. There is much work to be done, given that only one in 87 stroke units qualifies as a comprehensive stroke service,7 only 40% of stroke units routinely use established guidelines, care plans and protocols,7 and one in three patients is discharged from hospital without any preventive medications.7 Focusing the discussion about stroke care only on the availability of ECR — a necessary but complex and costly intervention that will benefit only a small proportion of stroke patients — diverts resources from wider and more fundamental needs in stroke care and does not serve the best interests of our patients.

Box –
Endovascular clot retrieval


Cerebral angiograms showing (A) occlusion of the proximal right middle cerebral artery (arrow), and (B) recanalisation of the same artery after clot retrieval (arrow). Images courtesy of Dr Shivendra Lallo, Canberra Hospital, ACT.

Neurobionics and the brain–computer interface: current applications and future horizons

Neurobionics is the science of directly integrating electronics with the nervous system to repair or substitute impaired functions. The brain–computer interface (BCI) is the linkage of the brain to computers through scalp, subdural or intracortical electrodes (Box 1). Development of neurobionic technologies requires interdisciplinary collaboration between specialists in medicine, science, engineering and information technology, and large multidisciplinary teams are needed to translate the findings of high performance BCIs from animals to humans.1

Neurobionics evolved out of Brindley and Lewin’s work in the 1960s, in which electrodes were placed over the cerebral cortex of a blind woman.2–4 Wireless stimulation of the electrodes induced phosphenes — spots of light appearing in the visual fields. This was followed in the 1970s by the work of Dobelle and colleagues, who provided electrical input to electrodes placed on the visual cortex of blind individuals via a camera mounted on spectacle frames.2–4 The cochlear implant, also developed in the 1960s and 1970s, is now a commercially successful 22-channel prosthesis for restoring hearing in deaf people with intact auditory nerves.5 For those who have lost their auditory nerves, a direct brainstem (cochlear nucleus) multi-electrode prosthesis was subsequently developed.6

The field of neurobionics has advanced rapidly because of the need to provide bionic engineering solutions to the many disabled US veterans from the Iraq and Afghanistan wars who have lost limbs and, in some cases, vision. The United States Defense Advanced Research Projects Agency (DARPA) has focused on funding this research in the past decade.7

Through media reports about courageous individuals who have undergone this pioneering surgery, disabled people and their families are becoming more aware of the promise of neurobionics. In this review, we aim to inform medical professionals of the rapid progress in this field, along with the ethical challenges that have arisen. We searched PubMed using the terms “brain computer interface”, “brain machine interface”, “cochlear implants”, “vision prostheses” and “deep brain stimulators”, and also searched the reference lists of the articles retrieved. We limited articles to those published in the past 10 years, together with those describing the first instances of brain–machine interfaces.

Electrode design and placement

Neurobionics has been increasing in scope and complexity because of innovative electrode design, miniaturisation of electronic circuitry and manufacture, improvements in wireless technology and increasing computing power. Using computers and advanced signal processing, neuroscientists are learning to decipher the complex patterns of electrical activity in the human brain via these implanted electrodes. Multiple electrodes can be placed on or within different regions of the cerebral cortex, or deep within the subcortical nuclei. These electrodes transmit computer-generated electrical signals to the brain or, conversely, receive, record and interpret electrical signals from this region of the brain.

Microelectrodes that penetrate the cortical tissue offer the highest fidelity signals in terms of spatial and temporal resolution, but they are also the most invasive (Box 2, A).8 These electrodes can be positioned within tens of micrometres of neurons, allowing the recording of both action potential spikes (the output) of individual neurons and the summed synaptic input of neurons in the form of the local field potential.9 Spiking activity has the highest temporal and spatial resolution of all the neural signals, with action potentials occurring in the order of milliseconds. In contrast, the local field potential integrates information over about 100 μm, with a temporal resolution of tens to hundreds of milliseconds.

Electrocorticography (ECoG), using electrodes placed in the subdural space (on the cortical surface), and electroencephalography (EEG), using scalp electrodes, are also being used to detect cortical waveforms for signal processing by advanced computer algorithms (Box 2, C, D). Although these methods are less invasive than penetrating microelectrodes, they cannot record individual neuron action potentials, instead measuring an averaged voltage waveform over populations of thousands of neurons. In general, the further the electrodes are from the brain, the safer the implantation procedure, but with a resulting decrease in the signal-to-noise ratio and in the number of control signals that can be decoded (ie, there is a lot of background noise). ECoG recordings, being closer to the brain, therefore typically have higher spatial and temporal resolution than is achievable with EEG.8 As EEG electrodes are separated from the brain by the skull and scalp, the recordings have low fidelity and a low signal-to-noise ratio. For stimulation, subdural electrodes require higher voltages to activate neurons than intracortical electrodes and are less precise for both stimulation and recording. Transcranial magnetic stimulation can be used to stimulate populations of neurons, but it is a crude technique compared with invasive microelectrode techniques.10

Currently, implanted devices have an electrical plug connection through the skull and scalp, with attached cables. This is clearly not a viable solution for long term implantation. The challenge for engineers has been to develop the next generation of implantable wireless microelectronic devices with a large number of electrodes and a long duration of functionality. Wireless interfaces are beginning to emerge.3,11–13

Applications for brain–computer interfaces

Motor interfaces

The aim of the motor BCI has been to help paralysed patients and amputees gain motor control using, respectively, a robot and a prosthetic upper limb. Non-human primates with electrodes implanted in the motor cortex were able, with training, to control robotic arms through a closed loop brain–machine interface.14 Hochberg and colleagues were the first to place a 96-electrode array in the primary motor cortex of a tetraplegic patient and connect this to a computer cursor. The patient could then open emails, operate various devices (such as a television) and perform rudimentary movements with a robotic arm.15 For tetraplegic patients with a BCI, improved control of the position of a cursor on a computer screen was obtained by controlling its velocity and through advanced signal processing.16 These signal processing techniques find relationships between changes in the neural signals and the intended movements of the patient.17,18
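As a schematic of what such decoding can involve, the sketch below fits a simple ridge-regression map from binned spike counts to intended cursor velocity, one common baseline approach; practical systems typically use more sophisticated decoders (eg, Kalman filters). The data, tuning model and dimensions are simulated assumptions, not taken from the cited studies.

  import numpy as np

  # Illustrative sketch: linear decoding of intended 2-D cursor velocity from
  # binned spike counts. The simulated "tuning", noise level and bin counts
  # are assumptions for illustration only.
  rng = np.random.default_rng(0)
  n_bins, n_channels = 2000, 96                 # eg, a 96-electrode array, short time bins

  velocity = rng.standard_normal((n_bins, 2))   # intended (vx, vy) in each bin
  tuning = rng.standard_normal((2, n_channels)) # assumed linear tuning of each channel
  spikes = velocity @ tuning + 0.5 * rng.standard_normal((n_bins, n_channels))

  # Ridge regression: W = (X'X + lambda*I)^-1 X'Y, mapping spike counts to velocity
  lam = 1.0
  X, Y = spikes, velocity
  W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

  decoded = X @ W
  corr = [np.corrcoef(decoded[:, i], Y[:, i])[0, 1] for i in range(2)]
  print("Decoded-vs-intended velocity correlation:", np.round(corr, 2))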

Reach, grasp and more complex movements have been achieved with a neurally controlled robotic arm in tetraplegic patients.19,20 These tasks are significantly more difficult than simple movements as they require decoding of up to 15 independent signals to allow a person to perform everyday tasks, and up to 27 signals for a full range of movements.21,22 To date, the best BCI devices provide fewer than ten independent signals. The patient requires a period of training with the BCI to achieve optimal control over the robotic arm. More complex motor imagery, including imagined goals and trajectories and types of movement, has been recorded in the human posterior parietal cortex. Decoding this imagery could provide higher levels of control of neural prostheses.23 More recently, a quadriplegic patient was able to move his fingers to grasp, manipulate and release objects in real time, using a BCI connected to cutaneous electrodes on his forearms that activated the underlying muscles.24

The challenge with all these motor cortex electrode interfaces is to convert them to wireless devices. This has recently been achieved in a monkey with a brain–spinal cord interface, enabling restoration of movement in its paralysed leg,25 and in a paralysed patient with amyotrophic lateral sclerosis, enabling control of a computer typing program.11

These examples of BCIs have primarily used penetrating microelectrodes, which, despite offering the highest fidelity signal, suffer from signal loss over months to years due to peri-electrode gliosis.26 This scarring reduces electrical conduction and the resulting signal change can require daily or even hourly recalibration of the algorithms used to extract information.18 This makes BCIs difficult to use while unsupervised and hinders wider clinical application, including use outside a laboratory setting.

A recently developed, less invasive means of electrode interface with the motor cortex is the stent-electrode recording array (“stentrode”) (Box 2, B).27 This is a stent embedded with recording electrodes that is placed into the sagittal venous sinus (situated near the motor cortex) using interventional neuroradiology techniques. This avoids the need for a craniotomy to implant the electrodes, but there are many technical challenges to overcome before human trials of the stentrode can commence.

Lower-limb robotic exoskeleton devices that enable paraplegic patients to stand and walk have generated much excitement and anticipation. BCIs using scalp EEG electrodes are unlikely to provide control of movement beyond activating simple robotic walking algorithms in the exoskeleton, such as “walk forward” or “walk to the right”. Higher degrees of complex movement control of the exoskeleton with a BCI would require intracranial electrode placement.28 Robotic exoskeleton devices are currently cumbersome and expensive.

Sensory interfaces

Fine control of grasping and manipulation by the hand depends on tactile feedback. No commercial solution for providing artificial tactile feedback is yet available. Although early primate studies have produced artificial perceptions through electrical stimulation of the somatosensory cortex, stimulation can detrimentally interfere with the neural recordings.29 Optogenetics — the ability to make neurons light-sensitive — has been proposed to overcome this.30 Sensorised thimbles have been placed on the fingers of an upper limb myoelectric prosthesis to provide vibratory sensory feedback to a cuff on the arm, informing the individual when contact with an object is made and then broken. Five amputees have trialled this, with resulting enhancement of their fine control and manipulation of objects, particularly fragile objects.31 Sensory feedback relayed to the peripheral nerves, and ultimately to the sensory cortex, may provide more precise prosthetic control.32

Eight people with chronic paraplegia who used immersive virtual reality training over 12 months saw remarkable improvements in sensory and motor function. The training involved an EEG-based BCI that activated an exoskeleton for ambulation and visual–tactile feedback to the skin on the forearms. This is the first demonstration in animals or humans of long term BCI training improving neurological function, which is hypothesised to result from both spinal cord and cortical plasticity.33

The success of the cochlear prosthesis in restoring hearing to totally deaf individuals has also demonstrated how “plastic” the brain is in learning to interpret electrical signals from the sound-processing computer. The recipient learns to discern, identify and synthesise the various sounds.

The development of bionic vision devices has mainly focused on the retina, but electrical connectivity of these electrode arrays depends on the recipient having intact neural elements. Two retinal implants are commercially available.3 Retinitis pigmentosa has been the main indication. Early trials of retinal implants are commencing for patients with age-related macular degeneration. However, there are many blind people who will not be able to have retinal implants because they have lost the retinal neurons or optic pathways. Placing electrodes directly in the visual cortex bypasses all the afferent visual pathways.

It has been demonstrated that electrical stimulation of the human visual cortex produces discrete reproducible phosphenes. Several groups have been developing cortical microelectrode implants to be placed into the primary visual cortex. Since 2009, the Monash Vision Group has been developing a wireless cortical bionic vision device for people with acquired bilateral blindness (Box 3). Photographic images from a digital camera are processed by a pocket computer, which transforms the images into the relevant contours and shapes and into patterns of electrical stimulation that are transmitted wirelessly to the electrodes implanted in the visual cortex (Box 3, B). The aim is for the recipient to be able to navigate, identify objects and possibly read large print. Facial recognition is not offered because the number of electrodes will not deliver sufficient resolution.2 A first-in-human trial is planned for late 2017.2,34
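As a rough illustration of the kind of image processing described above, the sketch below reduces a grey-scale camera frame to a coarse on/off stimulation pattern for a small electrode grid. The grid size, threshold and simple gradient-based edge step are illustrative assumptions, not parameters of the Monash Vision Group device.

  import numpy as np

  # Illustrative sketch: turning a grey-scale camera frame into an on/off
  # stimulation pattern for a small electrode grid. Grid size, threshold and
  # the edge-detection step are assumptions, not device specifications.
  def frame_to_stimulation(frame: np.ndarray, grid=(10, 10), threshold=0.2):
      gy, gx = np.gradient(frame.astype(float))
      edges = np.hypot(gx, gy)                  # emphasise contours and shapes
      edges /= edges.max() + 1e-9
      h, w = edges.shape
      bh, bw = h // grid[0], w // grid[1]
      # Average edge strength within each electrode's block of pixels
      blocks = edges[:bh * grid[0], :bw * grid[1]].reshape(grid[0], bh, grid[1], bw)
      pattern = blocks.mean(axis=(1, 3)) > threshold
      return pattern                            # True = stimulate that electrode

  frame = np.random.rand(240, 320)              # stand-in for a camera frame
  print(frame_to_stimulation(frame).astype(int))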

The lateral geniculate nucleus of the thalamus is an alternative site for implantation of bionic vision devices. Further technical development of the design, manufacture and placement of multiple brain microelectrodes in this small deep brain structure is needed before this could be applied in humans.35

Memory restoration and enhancement

The same concepts and technologies used to record and stimulate the brain in motor and sensory prostheses can also be applied to deeper brain structures. For example, the fornix is an important brain structure for memory function. A human safety study of bilateral deep brain stimulation of the fornix has been conducted in 42 patients with mild, probable Alzheimer disease (ADvance trial), and this study will now proceed to a randomised controlled trial.36 This technique involves deep brain stimulation without direct feedback from neural recording.

A more definitive approach to memory augmentation would be to place a multi-electrode prosthesis directly into the hippocampus. Electrical mimicry of encoded patterns of memory about a task transmitted from trained donor rats to untrained recipient rats resulted in enhanced task performance in the recipients.37,38 This technology has been applied to the prefrontal cortex of non-human primates.39 Although human application is futuristic, this research is advancing rapidly. A start-up company was formed in 2016 to develop this prosthetic memory implant into a clinic-ready device for people with Alzheimer disease.40 The challenge in applying these therapies to Alzheimer disease and other forms of dementia will be to intervene before excessive neuronal loss has occurred.

Seizure detection and mitigation

Many patients with severe epilepsy do not achieve adequate control of seizures with medication. Deep brain electrical stimulation, using electrodes placed in the basal ganglia, is a treatment option for patients with medically refractory generalised epilepsy.41 Methods to detect the early onset of epileptic seizures using cortical recording and stimulation (to probe for excitability) are evolving rapidly.42 A hybrid neuroprosthesis, which combines electrical detection of seizures with an implanted anti-epileptic drug delivery system, is also being developed.43,44

Parkinson disease and other movement disorders

Deep brain stimulation in the basal ganglia is an effective treatment for Parkinson disease and other movement disorders.45 This type of BCI comprises a four-electrode system implanted in the basal ganglia, on one or both sides, connected to a pulse generator implanted in the chest wall; the device can be reprogrammed wirelessly. Novel electrodes with many more electrode contacts and a recording capacity are being developed. Feedback-controlled (closed loop) stimulation will require a fully implanted BCI, so that the deep brain stimulation is adaptive and can better modulate control of the movement disorder from minute to minute. More selective directional and steerable deep brain stimulation, in which the electrical current is delivered in one direction from the active electrodes rather than circumferentially, is also being developed. The aim is to provide more precise stimulation of the target neurons, with less unwanted stimulation of surrounding areas and therefore fewer side effects.46
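As a schematic of what closed loop (adaptive) stimulation involves, the sketch below titrates stimulation amplitude against a recorded biomarker, here a simulated signal standing in for beta-band power. The biomarker model, controller gain and amplitude limits are illustrative assumptions, not parameters of any clinical device.

  import numpy as np

  # Illustrative sketch of feedback-controlled (closed loop) deep brain
  # stimulation: amplitude is titrated against a recorded biomarker.
  # The biomarker model, gain and limits are assumptions for illustration.
  rng = np.random.default_rng(1)

  def read_biomarker(t):
      """Simulated biomarker (arbitrary units) drifting slowly over time."""
      return 1.0 + 0.5 * np.sin(t / 30.0) + 0.1 * rng.standard_normal()

  target = 1.0        # desired biomarker level
  gain = 0.3          # proportional controller gain
  amplitude = 1.5     # stimulation amplitude (mA), starting value

  for t in range(120):                                         # one update per time step
      error = read_biomarker(t) - target
      amplitude = np.clip(amplitude + gain * error, 0.0, 3.0)  # adapt within safe limits
      if t % 30 == 0:
          print(f"t={t:3d}  amplitude={amplitude:.2f} mA")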

Technical challenges and future directions

Biocompatibility of materials, electrode design to minimise peri-electrode gliosis and electrode corrosion, and loss of insulation integrity are key engineering challenges in developing BCIs.47 Electrode carriers must be hermetically sealed to prevent ingress of body fluids. Smaller, more compact electronic components and improved wireless interfaces will be required. Electronic interfaces with larger numbers of neurons will necessitate new electrode design, but also more powerful computers and advanced signal processing to allow significant use time without recalibration of algorithms.

Advances in nanoscience and wireless and battery technology will likely have an increasing impact on BCIs. Novel electrode designs using materials such as carbon nanotubes and other nanomaterials, electrodes with anti-inflammatory coatings or mechanically flexible electrodes to minimise micromotion may have greater longevity than standard, rigid, platinum–iridium brain electrodes.48 Electrodes that record from neural networks in three dimensions have been achieved experimentally using injectable mesh electronics with tissue-like mechanical properties.49 Optogenetic techniques activate selected neuronal populations by directing light onto neurons that have been genetically engineered with light-sensitive proteins. There are clearly many hurdles to overcome before this technology is available in humans, but microscale wireless optoelectronic devices are working in mice.50

Populating the brain with nanobots that create a wireless interface may eventually enable direct electronic interface with “the cloud”. Although this is currently science fiction, the early stages of development of this type of technology have been explored in mice, using intravenously administered 10 μg magnetoelectric particles that enter the brain and modify brain activity by coupling intrinsic neural activity with external magnetic fields.51

Also in development is the electrical connection of more than one brain region to a central control hub — using multiple electrodes with both stimulation and recording capabilities — for integration of data and neuromodulation. This may result in more nuanced treatments for psychiatric illness (such as depression, post-traumatic stress disorder and obsessive compulsive disorder), movement disorders, epilepsy and possibly dementia.

Ethical and practical considerations

Implantable BCI devices are in an early phase of development, with most first-in-human studies describing only a single patient. However, the performance of these devices is rapidly improving and, as they become wireless, the next step will be to implant BCIs in larger numbers of patients in multicentre trials.

The prime purpose of neurobionic devices is to help people with disabilities. However, there will be pressure in the future for bionic enhancement of normal cognitive, memory, sensory or motor function using BCIs. Memory augmentation, cognitive enhancement, infrared vision and exoskeletal enhancement of physical performance will all likely be achievable.

The introduction of this technology generates many ethical challenges, including:

  • appreciation of the risk–benefit ratio;

  • provision of adequate and balanced information for the recipient to give informed consent;

  • affordability in relation to the fair and equitable use of the scarce health dollar;

  • inequality of patient access to implants, particularly affecting those in poorer countries;

  • undue influence on physicians and scientists by commercial interests; and

  • the ability to achieve unfair physical or cognitive advantage with the technology, such as enhancing disabled athletes’ performance using exoskeleton devices, military application with the creation of an enhanced “super” soldier, or using a BCI as the ultimate lie detector.52

The introduction of these devices into clinical practice should therefore not proceed unchecked. As the technology transitions from clinical trial to the marketplace, training courses and mentoring will be needed for the surgeons who are implanting these devices. Any new human application of the BCI should be initially tested for safety and efficacy in experimental animal models. After receiving ethics committee approval for human application, the technology should be thoroughly evaluated in well conducted clinical trials with clear protocols and strict inclusion criteria.53

One question requiring consideration is whether sham surgery should be used to try to eliminate a placebo effect from the implantation of a new BCI device. Inclusion of a sham surgery control group in randomised controlled trials of surgical procedures has rarely been undertaken,54 and previous trials involving sham surgery have generated much controversy.55–57 Sham surgery trials for Parkinson disease have involved placing a stereotactic frame on the patient and drilling burr holes, but not implanting embryonic cells or delivering gene therapy.58–60 We do not believe sham surgery would be applicable for BCI surgery, for several reasons. First, each trial usually involves only one or a few participants; there are not sufficient numbers for a randomised controlled trial. Second, BCI patients can serve as their own controls because the devices can be inactivated. Finally, although sham controls may be justified if there is likely to be a significant placebo effect from the operation, this is not the case in BCI recipients, who have major neurological deficits such as blindness or paralysis.

Clinical application of a commercial BCI will require regulatory approval for an active implantable medical device, rather than approval as a therapy. It is also important for researchers to ask the potential recipients of this new technology how they feel about it and how it is likely to affect their lives if they volunteer to receive it.61 This can modify the plans of the researchers and the design of the technology. The need for craniotomy, with its attendant risks, may deter some potential users from accepting this technology.

As the current intracortical electrode interfaces may not function for more than a few years because of electrode or device failure, managing unrealistic patient and family expectations is essential. Trial participants will also require ongoing care and monitoring, which should be built into any trial budget. International BCI standards will need to be developed so that there is uniformity in the way this technology is introduced and evaluated.

Conclusions

BCI research and its application in humans is a rapidly advancing field of interdisciplinary research in medicine, neuroscience and engineering. The goal of these devices is to improve the level of function and quality of life for people with paralysis, spinal cord injury, amputation, acquired blindness, deafness, memory deficits and other neurological disorders. The capability to enhance normal motor, sensory or cognitive function is also emerging and will require careful regulation and control. Further technical development of BCIs, clinical trials and regulatory approval will be required before there is widespread introduction of these devices into clinical practice.

Box 1 –
Schematic overview of the major components of brain–computer interfaces


Common to all devices are electrodes that can interface at different scales with the neurons in the brain. For output-type interfaces (green arrows), the brain signals are amplified and control signals from them are decoded via a computer. These decoded signals are then used to control devices that can interact with the world, such as computer cursors or robotic limbs. For input-type interfaces (red arrows), such as vision or auditory prostheses, a sensor captures the relevant input, which a computer translates into stimulation parameters that are sent to the brain via an electrode interface. EEG = electroencephalography. LFP = local field potential.

Box 2 –
Electrodes of different scales that can be used to record neural activity for brain–computer interfaces


A: The most invasive method of recording neural activity, which produces the best signal quality, requires penetrating microelectrodes, such as this Utah array (Blackrock Microsystems), with 100 electrodes with a spacing of 400 μm. Wires connected to each electrode (bundled to the right of the image) need to be percutaneously connected to the outside world. B: Electrodes placed on an intravascular stent with (inset) a close-up image of a few electrodes (750 μm diameter). C: A 128-channel, non-invasive electroencephalography cap. After the cap is applied to the scalp, conductive gel is injected into each electrode to ensure electrical contact. D: An example of a planar array that can be placed in the subdural space to record electrocorticography signals. The platinum electrodes (350 μm diameter circles) are embedded in silicone.

Box 3 –
An example of a fully implantable brain–computer interface


A: The Monash Vision Group cortical vision prosthesis, which consists of an array of penetrating microelectrodes (metallic spikes) connected through a ceramic casing to electronics that are capable of delivering electrical stimulation and receiving wireless power and control signals. B: A close-up of a single electrode with a 150 μm diameter. The bright band is the conductive ring electrode, where electrical charge is delivered. Electrodes are spaced 1 mm apart.

[Case Report Comment] Morvan just a syndrome…!

Neurologists are often perceived as diagnosticians with little in the way of treatments to benefit their patients. However, this view is being gradually dispelled. In no area of neurology is this occurring more rapidly than in neuroimmunology, where recognition of constellations of clinical signs can lead to prompt and effective treatment, often preventing serious sequelae.

[Correspondence] Revising the ICD: stroke is a brain disease

Revision of the International Statistical Classification of Diseases and Related Health Problems, tenth revision (ICD-10), was long overdue. The ICD-10 was based on outdated medical knowledge and concepts from the 1980s; since then, science and practice have changed beyond recognition. The WHO neurology topic advisory group (TAG) for the revision of the ICD-10 was formed in 2009. In the ICD-10, cerebrovascular diseases were inconsistently and confusingly spread over several chapters. In March 2011, the Neurology and Circulatory TAGs, with the contribution of WHO classification representatives and relevant WHO departments, agreed that in the ICD-11, all types of stroke should form a single block, and that this block should be placed in the chapter on diseases of the nervous system.

Appropriate care for older people with cognitive impairment in hospital

More than half of the patients in adult hospitals are over 65 years of age, many of them frail with multiple comorbidities, frequently including some degree of cognitive impairment. The most common “side effect” experienced by older people in hospital is delirium, sometimes accompanying the illness that brought them to hospital (prevalent delirium), but too frequently developing in hospital (incident delirium) due to misadventure, illnesses contracted, and/or treatments and procedures administered.

Delirium is an acute syndrome characterised by altered levels of consciousness, attention and cognitive function. The overall occurrence of delirium, including prevalent and incident delirium, ranges from 29% to 64% on general or geriatric medicine wards, 11% to 68% on surgical wards (highest in orthopaedics), and 26% to 82% in intensive care units (ICUs). However, many studies exclude people with dementia, thereby likely underestimating the true rates.1 These rates are only found when delirium is actively and frequently screened for by trained researchers. In practice, delirium is missed at least 50% of the time.2,3 The reason that poor detection is of such concern is that delirium is a medical emergency that leads to a 50–500% increase in mortality, with greater disability, cognitive impairment and rates of institutionalisation among survivors.1

Delirium prevention

It is important to remember that delirium can be prevented. The Hospital Elder Life Program (HELP) study found that a multicomponent intervention aimed at minimising six risk factors for delirium (cognitive impairment, sleep deprivation, immobility, visual impairment, hearing impairment, dehydration) reduced the odds of developing delirium by 40% in medical patients, as well as reducing the days with delirium and the number of episodes.4 A similar intervention has proven effectiveness in Australia.5 A recent review of 11 non-pharmacological intervention studies found a 53% reduction in delirium and 62% reduction in falls.6
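For readers interpreting the HELP result, the quoted 40% reduction in the odds of delirium can be restated as an approximate odds ratio. The arithmetic below is an illustrative restatement of that single figure, not additional data from the study.4

\[
\mathrm{OR} \;=\; \frac{\text{odds of delirium with the intervention}}{\text{odds of delirium with usual care}} \;\approx\; 1 - 0.40 \;=\; 0.60
\]

Because this is a reduction in odds rather than in risk, it only approximates a 40% relative risk reduction when delirium is relatively uncommon in the population studied.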

A randomised controlled trial of delirium prevention in older orthopaedic patients found that geriatric consultation preoperatively or within 24 hours postoperatively, along with a geriatrician visiting daily in hospital making targeted recommendations based on a structured protocol, reduced delirium by 36%, with a particular reduction in severe delirium.7 Hospital in the Home treatment is another option that reduces the incidence of delirium and mortality.8,9

Screening for delirium

Because delirium can either be found on presentation to the emergency department (ED) or develop in hospital, screening should occur throughout the admission, particularly in high risk contexts such as the ED or ICU, on the orthopaedic or geriatric medicine wards, or after any surgical procedure. Delirium can affect people at any age, but the risk is highest and the consequences are greatest for older people. Many useful screening tools have been developed. These include the Abbreviated Mental Test Score, which screens for cognitive impairment;10 the 4AT, a rapid assessment test for delirium and cognitive impairment;11 and the 7-second RADAR (Recognizing acute delirium as part of your routine) screening tool, which is incorporated into the medication round.12 Delirium has different subtypes: hypoactive, hyperactive and mixed. The hyperactive or agitated subtype draws attention to itself, whereas the hypoactive subtype, sometimes described as “quietly confused”, is often mistaken for dementia or depression.

Screening should be done as part of a stepped approach to diagnosis and management to support frontline ward staff to care for their patients effectively. Patients who screen positive for delirium require a diagnostic test and then a workup. A number of diagnostic tools are available. One such tool is the Confusion Assessment Method, which also has a version for use in the ICU.13,14 Such diagnostic tests are aided by consultation with the family or carers, checking to see if the patient seems different to normal. Screening and diagnosis are an important part of the comprehensive geriatric assessment process.

Considering dementia

Many patients in hospital have cognitive impairment due to dementia, but without delirium. Dementia is a major risk factor for delirium, so it is very important to check for delirium in a person with dementia, but not everyone with dementia will develop delirium in hospital. People with dementia in hospital still require action to prevent them developing delirium. Half will experience a deterioration in their dementia as a result of that hospitalisation — with an increased risk of requiring admission to a residential aged care facility — so person-centred care designed to meet their particular needs is crucial. This care needs to include a multicomponent intervention to prevent delirium, while also aiming to prevent pressure sores, dehydration, functional decline and stress.15

Appropriate care in hospital for people with dementia needs to take account of the stage of their dementia. Early dementia in a person living at home with independence in most activities of daily living (ADLs) and a good quality of life should not exclude a wide range of therapeutic interventions.

However, end-stage dementia where someone is bed-bound, at times with contractures, very limited or no vocabulary and experiencing episodes of aspiration or dehydration due to poor oral intake, indicates that many therapeutic options are futile. In this situation, there is no evidence to suggest that feeding tubes will improve longevity, cause weight gain or prevent aspiration or pressure sores.16

Management — evaluation

Once a diagnosis of delirium is made, it is imperative to evaluate the patient for potentially treatable predisposing conditions (vulnerability) and precipitating causes or noxious insults, whether occurring before admission or in hospital. The more vulnerable or frail the patient, the smaller or more benign the putative insult that can trigger delirium.2 Many, if not most, patients with delirium will have difficulty giving a clear history, although extremely useful information can still be obtained, especially with a corroborative history from the family, carer or general practitioner. A comprehensive assessment and systematic workup is mandatory. Older patients have a greater need for diagnostic tests because of their atypical presentations, their difficulty accurately reporting symptoms while acutely confused, and their higher incidence of a wide range of conditions.

Patients with delirium often have multiple identifiable pathologies and medications that may be triggering the delirium. Any centrally acting drugs or drugs with anticholinergic properties are particularly suspect (see the case scenario in Box 1).17 Judicious multifaceted intervention is therefore required in an attempt to restore homoeostasis, including withdrawal of aggravating medications. For example, when unable to distinguish between pulmonary oedema and chest infection, clinicians can treat both conditions, even if this involves the seemingly contradictory notion of intravenous frusemide with gentle intravenous rehydration. Frail older people are relatively immunocompromised and may not display the full symptoms and signs of infection, or of other conditions, that healthy younger people do. Delayed treatment, waiting for an illness to fully declare itself, is generally the worst option. However, part of the evaluation of a patient with delirium should include discussion with the family about the goals of treatment. Burdensome treatment at the end of life should be avoided, and the broader use of advance care planning greatly assists with communicating which options are preferred.18

The nature of delirium is that it does not follow office hours, often presenting in the middle of the night (see the case scenario in Box 2). Therefore, all staff need to have some familiarity with delirium recognition and management and access to resources on the ward.19 Hospitals need multicomponent support to assist with more complex cases of delirium and improve outcomes;6,7 such support includes geriatric or psychogeriatric consultation, preferably with both nursing and medical experts, and ideally a delirium unit for complex behavioural issues.20 Physical and chemical restraints will frequently exacerbate delirium and lead to worse injuries — delirium is one of the most common causes of falls in hospital (see the case scenario in Box 3) — as well as causing a permanent deterioration in cognitive function, accelerating the rate of progression of cognitive impairment, and worsening ADL function.21 Therefore, prevention and early treatment of delirium has many flow-on benefits.

Management — treatment

Treatment of delirium is directed at the underlying medical and surgical causes of the syndrome. Addressing the six HELP risk factors is also useful as part of the treatment, and families can play a beneficial role in calming agitated patients and providing familiar objects from home. In some cases, an assistant-in-nursing or volunteer to sit with the patient and assist with the HELP risk factors can be of great benefit. A delirium unit or room can provide a calm, comfortable environment, which is helpful for agitated or wandering patients.

As antipsychotic medicines have only modest benefit and known serious adverse effects, including increased mortality, their use for behavioural symptoms in patients with dementia and delirium should be reserved for those with significant distress who are at risk to themselves or others and in whom non-drug strategies have been unsuccessful.22–24 The principles of low dosage, close monitoring, short duration and discussion with the patient, carers and family should be followed. If drugs are needed for severe agitation or psychosis, low dose antipsychotics are generally first line, except in delirium tremens or delirium in Lewy body dementia. We do not have any known disease-specific or disease-modifying treatments for delirium, because so little is known about its pathophysiology. Research is pointing to roles for inflammation, metabolic changes, neurotransmitter disturbance and reduced cerebral blood flow, which exacerbate many of the known features of dementia in the central nervous system.25–28 Given that delirium increases the risk of developing dementia and accelerates its progress, delirium may be best understood as the acute manifestation or exacerbation of the chronic disease, dementia; effective communication to the GP and other care providers is therefore vital.

This suggests that effective treatment of delirium will reduce its complications, including mortality and placement in residential aged care; it can also be disease-modifying for dementia. Therefore, action to improve delirium prevention, recognition and management should be a priority for all (adult) hospitals.

Australian Commission on Safety and Quality in Health Care resources — a better way to care

The Australian Commission on Safety and Quality in Health Care (the Commission) has released A better way to care: safe and high-quality care for patients with cognitive impairment (dementia and delirium) in hospital — a set of resources for clinicians, health service managers and consumers to improve the early recognition of, and response to, patients with cognitive impairment so that they receive safe and high quality care.29–31 These detailed resources complement and expand upon the general principles outlined in this article.

The Commission has also released the Delirium Clinical Care Standard,32 which consists of key quality statements that describe high priority areas for improvement in delirium prevention, recognition and treatment. The standard also includes suggested indicators to assist local health services to monitor implementation.

The Commission has also included the quality and safety risks for people with cognitive impairment in draft version 2 of the National Safety and Quality Health Service (NSQHS) Standards. Following their release in 2017, implementation of the revised NSQHS Standards is expected to begin from January 2019.

Box 1 –
Case scenario: can medicine to help you walk also help you fall?

Mr B is an 82-year-old man living alone in his home since the death of his wife 3 years earlier. He was brought into the emergency department by ambulance after having fallen over in his local shopping centre. The ambulance report noted that he appeared confused and there was concern that he may have hit his head. Mr B’s confusion worsened and he required sedation to allow a cerebral computed tomography (CT) scan to be performed.

It took several hours for his GP to be identified through prescriptions found in his wallet. It appeared from these that he had hypertension. His GP confirmed that he had been treating him for hypertension for some years, and that he had recently been started on treatment for Parkinson disease by a neurologist.

Mr B’s CT scan showed cerebral atrophy consistent with his age but no recent evidence of injury. Other laboratory investigations were within normal limits. Mr B’s pharmacist was contacted by staff and confirmed that, the previous week, he had dispensed Sinemet CR 200/50 (levodopa/carbidopa), to be taken three times a day.

Mr B remained agitated and a diagnosis of delirium was made. He was admitted and initially commenced on a low dose of haloperidol before the information about possible Parkinson disease was known. This resulted in significant sedation, and all medication was then withheld.

Mr B improved both cognitively and physically over the following 3 days with regular fluids and mobilisation. As it was felt that his delirium had not completely resolved, he was transferred to the rehabilitation ward for further mobilisation and assessment of his possible Parkinson disease. The cause of his delirium was felt most likely to be the relatively high dose of Sinemet that he had been taking for the several days before admission.

Box 2 –
Case scenario: can delirium occur before the age of 65?

Mrs S is a 62-year-old lady with younger onset Alzheimer disease. She is otherwise physically well and lives with her husband in their own home, with support from their daughter who lives nearby. Mrs S requires prompting or assistance with most activities of daily living but can still toilet herself independently and feed herself if food is provided for her.

One evening, she became very confused, constantly repeating herself and walking around the house looking for her husband who had gone out. She could not be settled and her daughter rang the local GP and then an ambulance. In hospital, Mrs S became very agitated, screaming for the police and becoming very aggressive to staff when they tried to examine her and take blood.

Mrs S’s daughter had accompanied her mother and explained to the emergency department staff that her mother had moderate dementia but was normally quite calm and cooperative and that this behaviour was a dramatic change. ED staff placed Mrs S in a quiet room with her daughter until the results of investigations indicated that Mrs S had a urinary tract infection with elevated white cell count and C-reactive protein level. She was treated with intravenous antibiotics and fluids, and her daughter remained with her overnight.

By the morning, her agitation had decreased following the arrival of her husband, and she was discharged home with follow-up from the Hospital in the Home team to administer antibiotics and ensure adequate hydration. Mrs S’s GP agreed to review her later.

Box 3 –
Case scenario: fractured hips and delirium

Mrs W is an 84-year-old lady who lives with her husband in their own home. She has some cognitive impairment and has suspected early dementia. However, she is still quite independent in activities of daily living and is physically active. She has a known predisposition to delirium, having had an episode with a chest infection some months earlier. Her medication included regular vitamin D, salbutamol and paracetamol as needed.

Mrs W fell while vacuuming the stairs in her house and sustained a left hip fracture. She was admitted to hospital and gave a clear history in the emergency department. There was no obvious cognitive impairment noted.

Following hip fracture surgery, she spent a prolonged period in recovery and was left on her own. She became quite confused and, thinking that her husband was being taken away on a trolley, she climbed out of bed, fell, and re-fractured her hip. She was found to have a postoperative delirium, attributed to the anaesthetic and analgesic drugs she had been receiving. This was treated supportively with fluids and pain relief, and resolved over 2 weeks.

She required 2 months of bed rest in hospital before revision surgery for her hip fracture could be performed. This was spent mainly in the rehabilitation ward, allowing her to maintain her upper limb strength and balance. Regular screening for delirium occurred during this time, and she had one brief episode due to a urinary tract infection.

Following revision of the hip replacement, she had 3 cm of shortening of the left leg, requiring a 3 cm shoe raise and a cane when mobilising. Mrs W was able to return home after a further period of rehabilitation but remained significantly disabled by the mobility limitation associated with her shortened limb.

Neurofibromatosis of the tongue

A 45-year-old woman presented with a painless mass in the tongue that had grown gradually over the past 20 years (Figure, arrowheads). She had café-au-lait spots and previous neurofibroma resections. Neurofibromatosis type 1 was also found in her father and two children. Recent speech problems made a resection necessary. Partial removal of the mass immediately improved communication. Pathological analysis showed plexiform neurofibroma without malignant transformation. Neurofibromatosis type 1 is an autosomal dominant disorder characterised by neurofibromas that can potentially affect any site of the body. Malignant transformation is rare, and resection is indicated when there is associated functional or aesthetic impairment.1

Figure

Central retinal venous pulsations

Diagnosing raised intracranial pressure through ophthalmoscopic examination

The ophthalmoscope is one of the most useful and underutilised tools and it rewards the practitioner with a wealth of clinical information. Through illumination and a number of lenses for magnification, the direct ophthalmoscope allows the physician to visualise the interior of the eye. Ophthalmoscopic examination is an essential component of the evaluation of patients with a range of medical conditions, including diabetes mellitus, systemic hypertension and conditions associated with raised intracranial pressure (ICP). The fundus has exceptional clinical significance because it is the only location where blood vessels can be directly observed as part of a physical examination.

Optic disc swelling and central retinal venous pulsations are useful signs in cases where raised ICP is suspected. Both signs can be obtained rapidly by clinicians who know how to recognise them. Although optic disc swelling supports the diagnosis of raised ICP, the presence of central retinal venous pulsations may indicate the contrary.

In the standard technique for direct ophthalmoscopy, the patient is positioned in a seated posture and asked to fix their gaze on a stationary point directly ahead. Pupillary dilation, removal of the patient’s spectacles and dim room illumination usually aid the examination. To start examining the patient, set the ophthalmoscope dioptres to zero — alternatively, a suitable setting would be the sum of the refractive errors of the patient and the examiner. Use the right eye to examine the patient’s right eye and vice versa. Using a slight temporal approach facilitates the identification of the optic disc, which also minimises awkward direct facial contact with the patient. Examine the red reflex at just under arm’s length. A pale or absent red reflex may suggest media opacity, such as a cataract. Next, on approaching the patient and obtaining a clear view of a retinal vessel, follow its course toward the optic disc. The presence or absence of venous pulsations should be appreciable (see the video at www.mja.com.au; pulsations of the central vein are clearly visible at the inferior margin of the optic disc). These pulsations, usually of the proximal portion of the central retinal vein, are most readily identified at the optic disc. The examination of the fundus should be concluded by visualisation of the four quadrants of the retina and examination of the macula.

Central retinal venous pulsations are traditionally attributed to fluctuations in intraocular pressure with systole, although this may be an incomplete explanation.1 Patients with central retinal venous pulsations generally have cerebrospinal fluid pressures below 190 mm H2O.2 Based on the results of Wong and White,3 the positive predictive value for retinal venous pulsations predicting normal ICP was 0.88 (0.87–0.9) and the negative predictive value was 0.17 (0.05–0.4).

This is important when considering lumbar puncture and when neuroimaging is not available. A limitation of this sign is that about 10% of the normal population do not have central retinal venous pulsations visible on direct ophthalmoscopy.4 The absence of central retinal venous pulsations does not, by itself, represent evidence of raised ICP; conversely, some patients with elevated ICP may still have visible retinal venous pulsations.
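To aid interpretation of the figures from Wong and White,3 the quoted values can be restated using the standard definitions of predictive values, taking the presence of visible central retinal venous pulsations as the “positive test” and normal ICP as the target condition; the 2 × 2 counts below (TP, FP, TN, FN) are generic symbols, not data from the study.

\[
\mathrm{PPV} \;=\; P(\text{normal ICP} \mid \text{pulsations seen}) \;=\; \frac{TP}{TP + FP} \;\approx\; 0.88
\]
\[
\mathrm{NPV} \;=\; P(\text{raised ICP} \mid \text{pulsations not seen}) \;=\; \frac{TN}{TN + FN} \;\approx\; 0.17
\]

Here TP counts patients with visible pulsations and normal ICP, FP those with visible pulsations but raised ICP, TN those without visible pulsations and with raised ICP, and FN those without visible pulsations despite normal ICP. Read this way, visible pulsations are reasonably reassuring that ICP is normal, whereas their absence, taken alone, is weak evidence either way, consistent with the limitations described above.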

Papilloedema (optic disc swelling caused by increased ICP) may develop after the loss of retinal venous pulsations. This change in the appearance of the optic disc and its surrounding structures may be due to the transfer of elevated intracranial pressure to the optic nerve sheath, which interferes with normal axonal function, causing oedema and leakage of fluid into the surrounding tissues. Progressive changes include the presence of splinter haemorrhages at the optic disc, elevation of the disc with loss of cupping, blurring of the disc margins, and haemorrhage. In later stages, there is progressive pallor of the disc due to axonal loss. A staging scale, such as that of Frisén,5 can be used to reliably identify the extent of papilloedema (Box).

Box –
Stage 4–5 papilloedema (5) showing disc and nerve fibre swelling, haemorrhage, loss of the optic cup and obscuration of the vessels at the disc margin


Source: Bruce AS, O’Day J, McKay D, Swann PG. Posterior eye disease and glaucoma A–Z. London: Elsevier Health Sciences, 2008.

Primary amoebic meningoencephalitis in North Queensland: the paediatric experience

Primary amoebic meningoencephalitis (PAM) is a rare but fulminant disease leading to diffuse haemorrhagic necrotising meningoencephalitis, and has a very poor prognosis.1 Naegleria fowleri is the causative agent. At Townsville Hospital, our first confirmed case of PAM was an 18-month-old girl from a rural location in North Queensland who presented with fever, seizures and an altered level of consciousness.2 Organisms resembling Naegleria spp. were seen on microscopy of cerebrospinal fluid (CSF). Despite aggressive therapy with multiple antimicrobial agents, the patient died within 72 hours of presentation. An older sibling of the patient had presented with a similar syndrome several years earlier and had died of an undifferentiated meningitic illness. The sibling was retrospectively suspected to also have had PAM.2

Our second confirmed patient presented in early 2015. A previously well 12-month-old boy from a nearby West Queensland cattle-farming area had had a 36-hour history of fevers, rhinorrhoea and frequent emesis, which progressed to lethargy and irritability. Before arrival at the local rural hospital, he had a tonic–clonic seizure lasting 3–5 minutes. On arrival he appeared drowsy, had mottled skin, a blanching maculopapular rash, which may not necessarily have been related to PAM, and a central capillary refill of 3–4 seconds. He was treated with intravenous antibiotics for presumed bacterial meningitis. Given the remote location and clinical suspicion of elevated intracranial pressure, lumbar puncture was not performed. On arrival at Townsville Hospital, his Glasgow Coma Scale score was 8/15, he was increasingly febrile, and had an evolving maculopapular rash. Broad spectrum antimicrobial therapy was subsequently started for presumed meningoencephalitis. Within 18 hours of leaving home, he had no spontaneous respiratory effort, reduced tone, up-going plantar reflexes and fixed pupils.

Neuroimaging showed diffuse cerebral oedema with progressive dilation of the ventricular system on sequential studies. An external ventricular drain was placed because of clinical instability, and CSF microscopy showed motile trophozoites on a wet preparation and Giemsa stain, consistent with N. fowleri. The patient was commenced on intrathecal amphotericin, with no improvement in his clinical state. The organism seen in the CSF was confirmed after the patient’s death by polymerase chain reaction (PCR) analysis as being N. fowleri. When reviewing the patient’s history, it was noted that, as in previous cases, he lived on a property that used untreated and unfiltered bore water domestically, to which he had multiple potential exposures, including via water play with hoses and bathing.

Literature review

We searched the PubMed database using the terms “Naegleri”, “fowleri” and “meningitis”. No time period was specified. The James Cook University eJournal database was searched for historical information.

We also searched the Queensland Health Communicable Diseases Branch and the Communicable Diseases Network Australia databases for Australian cases, but, as N. fowleri infection is not a notifiable disease, this returned a low yield.

History of Naegleria fowleri

In 1899, the Austrian scientist Franz Schardinger published the first description of an amoeba that transforms into a flagellate, with drawings of the amoeba, cysts and flagellates. In 1912, Alexeieff coined the name Naegleria, but physicians at the time thought that the genus did not cause disease in humans.3 It was not until the late 1960s that Naegleria was implicated as the cause of PAM by the work of Adelaide pathologists Malcolm Fowler and Rodney Carter, and of South Australian rural general practitioner Robert Cooter. In 1965, it was first proposed that the organism entered the CSF through the cribriform plate after Fowler isolated the organism in autopsy specimens. Following communication of his findings, Cooter and colleagues were able to directly observe the live amoeba in a CSF sample from a 10-year-old boy who presented with meningoencephalitis.4,5

Pathophysiology

N. fowleri lives and multiplies in warm freshwater areas, and acquisition is often associated with water-based recreational activities.6 Infection may occur when contaminated water is flushed into the nasal cavity. After penetrating the nasal mucosa and passing through the cribriform plate, trophozoites migrate along the olfactory nerve directly into brain tissue. Cases are almost universally fatal, although survival has been reported in the literature following early diagnosis and management.7,8

Epidemiology

The worldwide incidence of PAM is not accurately known,9 and the disease is likely to be under-diagnosed and under-reported. In the developing world, numerous factors affect accurate identification, including a lack of resources or expertise in microbiological diagnosis; prioritising management of other infections that are more common; and cultural beliefs that prevent autopsies.9 Higher water temperatures, inadequate sanitation, unsafe water sources, and religious ablution practices, such as the use of Neti pots for nasal cleansing, could potentially increase the risk for acquiring PAM.10,11 N. fowleri is a thermophilic organism and would therefore be expected to occur more frequently in tropical areas; however, the majority of cases are reported from subtropical or temperate regions.12 In a study in Karachi, Pakistan, N. fowleri was recovered from 8% of 52 domestic water taps that were sampled.13

An epidemiological review of PAM cases in the United States showed that N. fowleri infections are rare and primarily affect younger males exposed to warm recreational freshwater in the southern states.14–16 There are two case reports of patients who acquired N. fowleri from using treated municipal water for nasal irrigation,17 and another patient who contracted the disease from inadequately treated municipal water.18

In Australia, Dorsch and colleagues reported 20 cases of PAM, 13 of which occurred between 1955 and 1972 in South Australia. These cases were attributed to household water that was piped overland for long distances,19 allowing it to be heated to temperatures that promoted growth of the amoeba.5 After the introduction of continuous water chlorination in 1972, only one further case was reported in South Australia in 1981.19 In Queensland, only three previous patients have been described in the literature: one from Mount Morgan who survived, one from Charters Towers,19 and one referred from North West Queensland to Townsville Hospital.2

Clinical challenges

Patients with PAM present with the same symptoms as those with bacterial meningitis, and clinical differentiation between the two conditions is impossible. Patients often have a history of recent exposure to warm fresh water, although the definite exposure event is not always identified.9 The incubation period ranges from 2 to 15 days, and presenting symptoms may include meningism, fever, confusion and signs of elevated CSF pressure, such as seizures or coma.14

Diagnosis is made more difficult in North Queensland by the vast distances between remote towns in the western part of the state. Townsville Hospital services an area of nearly 150 000 km² and has the only dedicated paediatric intensive care unit north of Brisbane. Patients with PAM inevitably require intensive care unit management and tertiary level investigations. Obtaining CSF samples for formal microscopic diagnosis is often impossible in small clinics with limited medical imaging or local laboratory services, and where performing a lumbar puncture is contraindicated by symptoms of raised intracranial pressure. Because of the rarity of the infection, greater awareness of PAM among primary health care professionals is required to increase suspicion in a clinically compatible case. Most importantly, education about prevention is essential for the continued health of rural communities, of which local medical professionals are a vital part. To this end, recent guidelines for the management of encephalitis20 include assessing risk factors for this condition and performing appropriate testing, as described below.

Diagnostic challenges

Diagnosis requires identification of motile trophozoites in CSF or characteristic morphology in stained specimens by a trained microbiologist (Box 1), with confirmation using molecular methods (PCR) or culture (Escherichia coli lawn culture). The trophozoites are visible in a wet unstained preparation of CSF (magnification, × 400), exhibiting sinusoidal movement by means of lobopodia; however, specimens need to be examined very soon after collection, as the amoebae degenerate rapidly in vitro and can be easily mistaken for leucocytes.

CSF chemistry is not diagnostic and will usually reveal a similar pattern to that of bacterial meningitis (Box 2). PCR analysis is performed using in-house methods at reference laboratories, and confirmation is often posthumous due to the rapid decline experienced by most patients. The US Centers for Disease Control and Prevention has developed a multiplex real-time TaqMan PCR assay to simultaneously identify three free-living amoebae (N. fowleri, Acanthamoeba spp. and Balamuthia mandrillaris) in clinical specimens.21 In Queensland, the pathology laboratory which performs all N. fowleri molecular testing uses primers and probes in line with the method of Qvarnstrom and colleagues.21 Culture may take several weeks and is difficult to perform.

Treatment

Given the limited data available, there are no set guidelines for antimicrobial therapy; however, it can be extrapolated from cases of patients who have survived that combination therapy with multiple anti-parasitic agents is required.

In 1969, Carter was able to demonstrate the sensitivity of the organism to amphotericin B (AMB) and it has remained the mainstay for treatment of PAM to this day.22 AMB has been used in all patients who have survived the illness.23 N. fowleri is highly sensitive to AMB in vitro with a minimum amoebicidal concentration of 0.01 μg/mL,24 and no resistance has been reported. Conventional AMB is preferred to liposomal forms as it can be given intrathecally as well as intravenously. Despite this, only a few patients have survived.25

Other drugs, such as miltefosine and the azole antifungals, have also shown in vitro activity against N. fowleri.22–24 Miconazole has synergistic activity when combined with AMB, and fluconazole is used as first line in combination therapy.

Miltefosine is a protein kinase B inhibitor that was originally developed as an antineoplastic agent. It also has anti-parasitic activity and is used for the treatment of leishmaniasis. Schuster and colleagues26 reported that miltefosine showed in vitro activity against free-living amoebae, including N. fowleri, Acanthamoeba spp. and B. mandrillaris. Recently, miltefosine has been used in the treatment of Acanthamoeba granulomatous amoebic encephalitis and PAM. Linam and colleagues27 described the case of a child treated for PAM with combination therapy including amphotericin, miltefosine, fluconazole and rifampicin, who survived with no significant neurological sequelae.

Rifampicin is commonly used in the treatment of PAM; however, it has variable central nervous system penetration and poor efficacy in vitro.24 It may also reduce the efficacy of the azole drugs due to cytochrome P450 interactions. Although azithromycin has shown some in vitro and in vivo activity against N. fowleri, the other macrolides are less effective.9 Atypical agents such as the diamidines and chlorpromazine have been studied in animal models but have yet to be utilised clinically.24,28

Public health

As described, our patient was probably the third child to die with PAM in 14 years in a small area with a tiny population on remote Queensland cattle stations. As a response to the third death, a public health investigation found large numbers of N. fowleri at the patient’s homestead. In this district, water was sourced from deep artesian bores at about 60°C (Box 3) and cooled in open surface dams before being piped hundreds of metres on the surface to households, keeping water temperatures high. It was noted that the cases described in North Queensland were of children too young to be swimming in surface waters, the assumption being that they contracted the disease in the home environment. There had never been water treatment or filtration in the homesteads for generations; the clarity and taste of the bore water had often been a source of pride for owners. The difference in the present era of rural life was the advent of modern facilities, allowing the heated bore water to be pressurised via taps, hoses, toys and showerheads and delivered directly into the homestead.

The public health hypothesis was that:

  • Hot artesian bore water and long surface pipelines promote large concentrations of N. fowleri, which can be sucked into water pipes from sediments, particularly in drought years.

  • There had been no form of treatment for apparently clean water.

  • In recent years, among young families with modern water facilities, there were many more opportunities for water to be forced under pressure into a vulnerable (non-immune) child’s nose.

  • Simple filtration and disinfection of all water for washing and playing would prevent child deaths on these properties.

The public health dilemma was whether health promotion for a single, rare disease could be cost-effective or gain traction among rural people possibly reluctant to accept an expensive treatment of their water. Untreated surface water can also lead to a whole spectrum of gastrointestinal diseases, even if these were not familiar to the remote communities. It was decided that a health promotion campaign about domestic water filtration and treatment could protect not only from PAM but also from a range of other diseases.

The family of our second confirmed patient embarked on a rural education campaign of their own to prevent any further deaths from PAM or other waterborne diseases, culminating in an episode of the television series Australian Story in November 2015.29 To coincide with this story, public health physicians gave a series of talks to communities and health staff across a wide area of outback Queensland. To follow up the face-to-face campaign, Queensland Health released a safe water booklet with advice on cost-effective filtration and disinfection.30 As a result, many rural properties and some small towns are installing water treatment equipment for the first time. The South Australian and Western Australian governments have online education resources specifically targeting rural communities at risk of amoeba acquisition,31,32 with the primary focus on prevention. The aim of the Queensland public health booklet was to provide a more comprehensive education document for water treatment in rural communities.30

Conclusion

We hope an increased awareness of N. fowleri and its association with warm, non-chlorinated water provides an opportunity for counselling families about safe water use: avoiding diving or jumping into or squirting untreated water, and disinfecting or filtering water used for washing and playing, as well as for drinking. In particular, bore water at warm or hot temperatures and other warm water sources should be considered ideal reservoirs for this organism. In the clinical setting, difficulties with analysing CSF make it unlikely that an accurate diagnosis could be provided in a remote environment. The presentation of an acutely unwell child with a history of bore water exposure and signs of meningitis or encephalitis should, however, prompt consideration of PAM as a potentially life-threatening diagnosis. Our experience with this disease clearly demonstrates the crucial role of medical professionals working in rural and remote Australia in primary prevention of this almost universally fatal condition.

Box 1 –
Microscopy of cerebrospinal fluid of Patient 2, showing trophozoites (Giemsa stain, black arrows) and mononuclear leucocytes (white arrows)

Box 2 –
Analysis of cerebrospinal fluid (CSF) in patients with primary amoebic meningoencephalitis at Townsville Hospital

               Microscopy             White cell count (× 10⁶/L)   Polymorphonuclear leucocytes   Protein (mg/L)   CSF:blood glucose
Normal         No organisms           < 1                          0                              < 0.4            > 0.6
Patient 1      Motile trophozoites    7200                         91%                            3900             0.17
Patient 2      Motile trophozoites    240                          54%                            2700             0.12


Box 3 –
Great Artesian Basin


The Great Artesian Basin, from which bore water comes, covers a vast area of rural Australia. Western Queensland has a particularly wide coverage, and rural properties use bore water extensively.

Source: Australian Government Department of Sustainability, Environment, Water, Population and Communities, 2011. Available at http://www.agriculture.gov.au/water/national/great-artesian-basin (accessed Aug 2016).