
How and why the brain and the gut talk to each other

 

It’s widely recognised that emotions can directly affect stomach function. As early as 1915, the influential physiologist Walter Cannon noted that stomach function changes in frightened animals. The same is true for humans. People under a lot of stress often report diarrhoea or stomach pain.
We now know this is because the brain communicates with the gastrointestinal system. A whole ecosystem comprising 100 trillion bacteria living in our bowels is an active participant in this brain-gut chat.

Recent discoveries around this relationship have made us consider using talk therapy and antidepressants as possible treatments for symptoms of chronic gut problems. The aim is to interfere with the conversation between the two organs by telling the brain to repair the faulty bowel.

Our research found talk therapy can improve depression and the quality of life of patients with gastrointestinal conditions. Antidepressants may also have a beneficial effect on both the course of a bowel disease and accompanying anxiety and depression.

What are gastrointestinal conditions?

Gastrointestinal conditions are incredibly common. About 20% of adults and adolescents suffer from irritable bowel syndrome (IBS), a disorder where abdominal discomfort or pain go hand-in-hand with changes in bowel habits. These could involve chronic diarrhoea and constipation, or a mixture of the two.

IBS is a so-called functional disorder, because while its symptoms are debilitating, there are no visible pathological changes in the bowel. So it is diagnosed based on symptoms rather than specific diagnostic tests or procedures.

People with chronic gut conditions can experience severe pain that affects their quality of life.

This contrasts with inflammatory bowel disease (IBD), a condition where the immune system reacts in an exaggerated manner to normal gut bacteria. Inflammatory bowel disease is associated with bleeding, diarrhoea, weight loss and anaemia (a shortage of red blood cells, often due to iron deficiency), and can be a cause of death. It’s called an organic bowel disease because we can see clear pathological changes caused by inflammation of the bowel lining.

Subtypes of inflammatory bowel disease are Crohn’s disease and ulcerative colitis. Around five million people worldwide, and more than 75,000 in Australia, live with the condition.

People with bowel conditions may need to use the toilet 20 to 30 times a day. They also suffer pain that can affect their family and social lives, education, careers and ability to travel. Many experience anxiety and depression in response to the way the illness changes their life. But studies also suggest those with anxiety and depression are more likely to develop bowel disorders. This is important evidence of brain-gut interactions.

How the brain speaks with the gut

The brain and gut speak to each other constantly through a network of neural, hormonal and immunological messages. But this healthy communication can be disturbed when we are stressed or develop chronic inflammation in the gut.

Stress can influence the type of bacteria inhabiting the gut, making our bowel flora less diverse and possibly more attractive to harmful bacteria. It can also increase inflammation in the bowel, and vulnerability to infection.

Chronic intestinal inflammation may lower our sensitivity to positive emotions. When we become sick with conditions like inflammatory bowel disease, our brains become rewired through a process called neuroplasticity, which changes the connections between nerve cells.

Anxiety and depression are common in people suffering chronic bowel problems. Approximately 20% of those living with inflammatory bowel disease report feeling anxious or blue for extended periods of time. When their disease flares, this rate may exceed 60%.

Interestingly, in a recent large study where we observed 2,007 people living with inflammatory bowel disease over nine years, we found a strong association between symptoms of depression or anxiety and disease activity over time. So, anxiety and depression are likely to make the symptoms of inflammatory bowel disease worse long-term.

It makes sense then to offer psychological treatment to those with chronic gut problems. But would such a treatment also benefit their gut health?




Inflammatory bowel disease

Our recent study combined data from 14 trials and 1,196 participants to examine the effects of talk therapy for inflammatory bowel disease. We showed that talk therapy – particularly cognitive behavioural therapy (CBT), which is focused on teaching people to identify and modify unhelpful thinking styles and problematic behaviours – might have short-term beneficial effects on depression and quality of life in people with inflammatory bowel disease.

But we did not observe any improvements in the bowel disease activity. This could be for several reasons. Inflammatory bowel disease is hard to treat even with strong anti-inflammatory drugs such as steroids, so talk therapy may not be strong enough.

Talk therapy may only help when it’s offered to people experiencing a flare-up of their disease. Most of the studies included in our review involved people in remission, so we don’t know whether talk therapy could help those whose disease flares.

On the other hand, in our latest review of 15 studies, we showed antidepressants had a positive impact on inflammatory bowel disease as well as anxiety and depression. It’s important to note the studies in this review were few and largely observational, which means they showed associations between symptoms and antidepressant use rather than proving antidepressants caused a decrease in symptoms.

Studies show talk therapy improves the symptoms of irritable bowel syndrome.

Irritable bowel syndrome

When it comes to irritable bowel syndrome, the studies are more conclusive. According to a meta-analysis combining 32 trials, both talk therapy and antidepressants improve bowel symptoms in the disorder. A recent update to this meta-analysis, including 48 trials, further confirmed this result.

The studies showed symptoms such as diarrhoea and constipation improved in 56% of those who took antidepressants, compared to 35% in the group who received a placebo. Abdominal pain significantly improved in around 52% of those who took antidepressants, compared to 27% of those in the placebo group.

Symptoms also improved in around 48% of patients receiving psychological therapies, compared with nearly 24% in the control group, who received another intervention such as usual management. IBS symptoms improved in 59% of people who had cognitive behavioural therapy, compared to 36% in the control group.
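One way to make these response rates concrete is to convert the difference between treatment and control into a "number needed to treat" (NNT), a standard epidemiological measure: how many patients must be treated for one extra patient to benefit beyond placebo. A minimal sketch using the figures quoted above (the function name is ours, for illustration):

```python
# Number needed to treat (NNT): how many patients must receive the
# treatment for one additional patient to benefit over the control.
# NNT = 1 / absolute risk reduction (difference in response rates).

def number_needed_to_treat(treated_rate: float, control_rate: float) -> float:
    """Return the NNT given response proportions in each group."""
    arr = treated_rate - control_rate  # absolute risk reduction
    if arr <= 0:
        raise ValueError("treatment is no better than control")
    return 1.0 / arr

# Figures quoted above: bowel symptoms improved in 56% of people
# taking antidepressants vs 35% on placebo...
nnt_antidepressants = number_needed_to_treat(0.56, 0.35)

# ...and in 48% receiving psychological therapies vs 24% of controls.
nnt_talk_therapy = number_needed_to_treat(0.48, 0.24)

print(round(nnt_antidepressants, 1), round(nnt_talk_therapy, 1))
```

On these numbers, roughly five patients would need to be treated (with either approach) for one additional patient to improve beyond what placebo or usual management achieves.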

Stress management and relaxation were found to be ineffective. Interestingly, hypnotherapy improved bowel symptoms in 45% of participants, compared to 23% of those receiving a control therapy.

What now?

Better studies exploring the role of talk therapy and antidepressants for symptoms of inflammatory bowel disease need to be conducted. We should know in a few years which patients are likely to benefit.

In the meantime, there is enough evidence for doctors to consider referring patients with irritable bowel syndrome for talk therapy and antidepressants.

Antonina Mikocka-Walus, Senior Lecturer in Health Psychology, Deakin University

This article was originally published on The Conversation. Read the original article.

Dementia study debunks exercise theory

 

Look at any of the multitude of articles of the past few years on how to avoid dementia and you’ll almost certainly read that exercise delays onset. Not so, according to the most recent research, published this week in the BMJ.

The 28-year study followed over 10,000 middle-aged British civil servants, noting at seven-year intervals whether participants were doing the “recommended” amount of exercise, defined as moderate or vigorous physical activity for 2.5 or more hours per week.

Surprisingly, the researchers found no correlation between how much exercise a participant did and whether they experienced cognitive decline over the study period, identified through a battery of cognitive tests, along with dementia diagnoses from hospital and mental health services.

The finding runs counter to several recent meta-analyses of observational studies which concluded that physical activity is neuroprotective in cognitive decline and dementia risk.

What the researchers did find was that in participants who eventually developed dementia, a decline in physical activity started around nine years before diagnosis.

This finding could be key to why previous observational studies have found a correlation between exercise and dementia risk, say the French researchers from the Centre for Research in Epidemiology and Population Health in Paris.

It’s now well known that brain changes start happening many years before dementia symptoms become apparent, and a decrease in physical activity is probably part of the cascade of changes in this preclinical phase of dementia, the researchers say.

The upshot is that findings of a lower risk of dementia with exercise may be attributable to reverse causation – in other words, decline in physical activity is due to the dementia, and not the other way around.

The researchers say that two problems with some of the earlier observational studies were that their duration was too short and their participants were too old. This made them more liable to be confounded by participants with preclinical dementia, who for that reason had lower levels of physical exercise.

They also point out a difference between observational and randomised trials, with the latter less likely to find a protective effect with exercise.

The recommendation of exercise for the prevention of dementia has already become enshrined in a number of international guidelines, including in Australia.

You can access the study here.

Art and Medicine

By Dr Jim Chambliss

It is often said that a picture is worth a thousand words.

Contemporary medical technology provides incredibly intricate pictures of external and internal human anatomy.

However, technology does not communicate holistic representations of the social, behavioural and psychosocial impacts associated with illness and the healing process.

Studies have shown that increased reliance on reports from expensive laboratory tests, radiology and specialised diagnostic technology has resulted in weaker physical examination skills, a decline in patient empathy and less effective doctor–patient communication.

Continuing professional development workshops, which explore and promote the value of art expression in developing observation skills, human sensitivity and relevant healthcare insights, are being presented at the National Gallery of Victoria’s exhibition of original works by Vincent van Gogh. The workshops commenced in May this year and continue until July 8.

The program will incorporate empirical research to illustrate the way neuropsychological conditions can influence art and creativity. The objectives of the workshops are to:

 • advance understanding of the impact of medical, psychological and social issues on the health and wellbeing of all people;

 • promote deeper empathy and compassion among a wide variety of professionals;

 • enhance visual observation and communication skills; and

 • heighten creative thinking.

Over the last 15 years, the observation and discussion of visual art has emerged in medical education as a highly effective approach to improving visual observation skills, patient communication and empathy.

Pilot studies implementing visual art to teach visual diagnostic skills and communication were so effective that more than 48 of the top medical schools in the USA now integrate visual arts into their curricula, and professional development courses are conducted in many of the most prestigious art galleries and hospitals.

The work of Vincent van Gogh vividly illustrates what it means to be uniquely human, seen through neurological characteristics, behavioural changes and creative expression, from an educated, respectful and empathic perspective.

The exact cause of a possible brain injury, psychological illness and/or epilepsy of van Gogh is unknown.

Numerous prominent neurologists speculate that Vincent suffered a brain lesion at birth or in childhood, while others contend that absinthe consumption caused his seizures.

Two doctors – Felix Rey and Théophile Peyron – diagnosed van Gogh with epilepsy during his lifetime.

Paul-Ferdinand Gachet also treated van Gogh for epilepsy, depression and mania until the artist’s death in 1890 at the age of 37.

After the epilepsy diagnosis by Dr Rey, van Gogh stated in a letter to his brother Theo, dated 28 January 1889: “I well knew that one could break one’s arms and legs before, and that then afterwards that could get better but I didn’t know that one could break one’s brain and that afterwards that got better too.”

Vincent did not, by any account, demonstrate artistic genius in his youth. He started painting at the age of 28 in 1881.

In fact, his erratic line quality, compositional skills and sloppiness with paint were judged in his February 1886 examinations at the Royal Academy of Fine Arts, Antwerp, to be worthy of demotion to the beginners’ painting class. His early drawings and paintings were copies of others’ art, while his sketches in drawing class showed remarkably different characteristics.

Increased symptoms of epilepsy and exposure to seizure triggers (absinthe and sleep deprivation) ran parallel with van Gogh’s most innovative artistic techniques and inspirations during his time in Paris from 1886 to 1888.

These symptoms increased further, accompanied by breathtaking innovation, following his move to Arles, France, in 1888, even as his mental and physical health declined.

In Paris he was exposed to the works of many of the most famous impressionist and post-impressionist painters, yet much of his new technique and imagery was distinctly innovative in detail, with no traceable influences from others.

While in Paris his work transitioned from drab, sombre and realistic images to vibrant colours and bold lines.

His ebb-and-flow of creative activity and episodes of seizures, depression and mania were at their most intense in the last two years of his life when he produced the greatest number of paintings.

His works are among the most emotionally and monetarily valued of all time. Vincent’s painting of Dr Gachet (1890) in a melancholy pose with digitalis flowers – used in the treatment of epilepsy at that time – sold for $US82.5 million in May, 1990, which at the time set a new record price for a painting bought at auction.

Healthcare professionals and art historians have written from many perspectives about other medical and/or psychological conditions that affected van Gogh’s art and life, with theories involving bipolar disorder, migraines, Ménière’s disease, syphilis, schizophrenia, alcoholism, emotional trauma and the lay concept of ‘madness’.

What was missing, as a basis for resolving disputes over which mental or medical condition(s) significantly affected his life, was a comprehensive account of how epilepsy or mental illness can influence art and possibly enhance creativity, based on insights from a large group of contemporary artists.

Following a brain injury and acquired epilepsy I gained personal insight into what may have affected the brain, mind and creativity of van Gogh and others who experience neurological and/or psychological conditions.

The experience opened my eyes to the medical, cognitive, behavioural and social aspects of two of the most complex and widely misunderstood human conditions.

Despite having no prior experience or recognisable talent, I discovered that my brain injury/epilepsy had sparked a creative mindset that resulted in a passion for producing award-winning visual art.

I enrolled in art classes and began to recognise common topics, styles and characteristics in the art of contemporary and famous artists who are speculated or known to have had epilepsy, such as Vincent van Gogh, Lewis Carroll, Edward Lear and Giorgio de Chirico.

Curiosity for solving the complex puzzle of how epilepsy could influence art led me to pursue a Masters in Visual Art which included a full course exclusively about Vincent van Gogh.

I subsequently obtained the world’s first dual PhD combining Visual Arts, Medicine and Art Curation at the University of Melbourne.

The PhD Creative Sparks: Epilepsy and enhanced creativity in visual arts (2014) was based on the visual, written and verbal insights from more than 100 contemporary artists with epilepsy and provided:

 • objective and subjective proof that epilepsy can sometimes enhance creativity – supported by brain imaging illustrating how that can occur;

 • a comprehensive inventory of the signature traits of neurological and psychological conditions that have significant interpretive value in healthcare practice and consideration in art history;

 • the largest collection of images of the visual narratives from people with epilepsy;

 • comparative data to distinguish epilepsy from other medical and mental conditions; and

 • the Creative Sparks Art Collection and Website – artandepilepsy.com.

Interest in these research discoveries and art exhibitions provided opportunities for me to deliver presentations at national and international universities, hospitals and conferences. Melbourne University Medical School sponsored an innovative series of workshops through which to teach neurology and empathy by an intriguing new approach.

 Jim Chambliss has a dual PhD in Creative Arts and Medicine and has explored the ways epilepsy and other health conditions can influence art and enhance creativity.

Information about his Art and Medicine Workshops involving Vincent van Gogh can be obtained by visiting artforinsight.com or artandepilepsy.com

 

[Perspectives] Brain Diaries: two hemispheres of interest

“Understanding the brain and its diseases is one of the key challenges of the 21st century”, said Professor of Clinical Neurology Christopher Kennard at the launch of Oxford University Museum of Natural History’s Brain Diaries. “I’ve said that is like climbing Everest, but I don’t even think we’ve got to base camp”, Kennard explained, citing the growing “problem of dementia: the longer we live, the more likely we are to develop Alzheimer’s”. Incorporating research from more than 50 neuroscientists, Brain Diaries explores the passage of a healthy brain from conception to old age.

Neurobionics and the brain–computer interface: current applications and future horizons

Neurobionics is the science of directly integrating electronics with the nervous system to repair or substitute impaired functions. The brain–computer interface (BCI) is the linkage of the brain to computers through scalp, subdural or intracortical electrodes (Box 1). Development of neurobionic technologies requires interdisciplinary collaboration between specialists in medicine, science, engineering and information technology, and large multidisciplinary teams are needed to translate the findings of high performance BCIs from animals to humans.1

Neurobionics evolved out of Brindley and Lewin’s work in the 1960s, in which electrodes were placed over the cerebral cortex of a blind woman.2,4 Wireless stimulation of the electrodes induced phosphenes — spots of light appearing in the visual fields. This was followed in the 1970s by the work of Dobelle and colleagues, who provided electrical input to electrodes placed on the visual cortex of blind individuals via a camera mounted on spectacle frames.2,4 The cochlear implant, also developed in the 1960s and 1970s, is now a commercially successful 22-channel prosthesis for restoring hearing in deaf people with intact auditory nerves.5 To aid those who have lost their auditory nerves, successful development of the direct brainstem cochlear nucleus multi-electrode prosthesis followed.6

The field of neurobionics has advanced rapidly because of the need to provide bionic engineering solutions to the many disabled US veterans from the Iraq and Afghanistan wars who have lost limbs and, in some cases, vision. The United States Defense Advanced Research Projects Agency (DARPA) has focused on funding this research in the past decade.7

Through media reports about courageous individuals who have undergone this pioneering surgery, disabled people and their families are becoming more aware of the promise of neurobionics. In this review, we aim to inform medical professionals of the rapid progress in this field, along with ethical challenges that have arisen. We performed a search on PubMed using the terms “brain computer interface”, “brain machine interface”, “cochlear implants”, “vision prostheses” and “deep brain stimulators”. In addition, we conducted a further search based on reference lists in these initial articles. We tried to limit articles to those published in the past 10 years, as well as those that describe the first instances of brain–machine interfaces.

Electrode design and placement

Neurobionics has been increasing in scope and complexity because of innovative electrode design, miniaturisation of electronic circuitry and manufacture, improvements in wireless technology and increasing computing power. Using computers and advanced signal processing, neuroscientists are learning to decipher the complex patterns of electrical activity in the human brain via these implanted electrodes. Multiple electrodes can be placed on or within different regions of the cerebral cortex, or deep within the subcortical nuclei. These electrodes transmit computer-generated electrical signals to the brain or, conversely, receive, record and interpret electrical signals from this region of the brain.

Microelectrodes that penetrate the cortical tissue offer the highest fidelity signals in terms of spatial and temporal resolution, but they are also the most invasive (Box 2, A).8 These electrodes can be positioned within tens of micrometres of neurons, allowing the recording of both action potential spikes (the output) of individual neurons and the summed synaptic input of neurons in the form of the local field potential.9 Spiking activity has the highest temporal and spatial resolution of all the neural signals, with action potentials occurring in the order of milliseconds. In contrast, the local field potential integrates information over about 100 μm, with a temporal resolution of tens to hundreds of milliseconds.
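The distinction between spikes and the local field potential is, in practice, a filtering problem: the LFP is the slow component of the recorded voltage, while spikes live in the fast residual. A toy sketch of that separation (a simple moving average stands in for the proper band-pass filters used in real recording systems; the signal and function names are illustrative only):

```python
import math

# Toy separation of a recorded voltage trace into a slow "LFP-like"
# component and a fast "spike-band" residual. Real systems use proper
# band-pass filters (roughly, LFP below a few hundred Hz, spikes in
# the kHz range); a moving average stands in for the low-pass here.

def moving_average(signal, window):
    """Centred moving average; edges use whatever samples are available."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def split_lfp_and_spikes(signal, window=11):
    lfp = moving_average(signal, window)            # slow component
    spikes = [s - l for s, l in zip(signal, lfp)]   # fast residual
    return lfp, spikes

# A slow oscillation with brief spike-like deflections superimposed:
trace = [math.sin(2 * math.pi * t / 50) for t in range(200)]
for t in (40, 90, 150):
    trace[t] += 3.0

lfp, spikes = split_lfp_and_spikes(trace)
# The residual peaks near the inserted spike times:
print(max(range(len(spikes)), key=lambda i: spikes[i]))
```

The smoothed trace tracks the slow oscillation, while the residual is near zero everywhere except at the inserted spike times, mirroring how the two signal types are pulled apart from one recording.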

Electrocorticography (ECoG), using electrodes placed in the subdural space (on the cortical surface), and electroencephalography (EEG), using scalp electrodes, are also being used to detect cortical waveforms for signal processing by advanced computer algorithms (Box 2, C, D). Although these methods are less invasive than using penetrating microelectrodes, they cannot record individual neuron action potentials, instead measuring an averaged voltage waveform over populations of thousands of neurons. In general, the further away the electrodes are from the brain, the safer the implantation procedure is, but with a resulting decrease in the signal-to-noise ratio and the number of control signals that can be decoded (ie, there is a lot of background noise). Therefore, ECoG recordings, which are closer to the brain, typically have higher spatial and temporal signal resolution than that achievable by EEG.8 As EEG electrodes are placed on the opposite side of the skull from the brain, the recordings have a low fidelity and a low signal-to-noise ratio. For stimulation, subdural electrodes require higher voltages to activate neurons than intracortical electrodes and are less precise for stimulation and recording. Transcranial magnetic stimulation can be used to stimulate populations of neurons, but this is a crude technique compared with the invasive microelectrode techniques.10

Currently, implanted devices have an electrical plug connection through the skull and scalp, with attached cables. This is clearly not a viable solution for long term implantation. The challenge for engineers has been to develop the next generation of implantable wireless microelectronic devices with a large number of electrodes that have a long duration of functionality. Wireless interfaces are beginning to emerge.3,11-13

Applications for brain–computer interfaces

Motor interfaces

The aim of the motor BCI has been to help paralysed patients and amputees gain motor control using, respectively, a robot and a prosthetic upper limb. Non-human primates with electrodes implanted in the motor cortex were able, with training, to control robotic arms through a closed loop brain–machine interface.14 Hochberg and colleagues were the first to place a 96-electrode array in the primary motor cortex of a tetraplegic patient and connect this to a computer cursor. The patient could then open emails, operate various devices (such as a television) and perform rudimentary movements with a robotic arm.15 For tetraplegic patients with a BCI, improved control of the position of a cursor on a computer screen was obtained by controlling its velocity and through advanced signal processing.16 These signal processing techniques find relationships between changes in the neural signals and the intended movements of the patient.17,18

Reach, grasp and more complex movements have been achieved with a neurally controlled robotic arm in tetraplegic patients.19,20 These tasks are significantly more difficult than simple movements as they require decoding of up to 15 independent signals to allow a person to perform everyday tasks, and up to 27 signals for a full range of movements.21,22 To date, the best BCI devices provide fewer than ten independent signals. The patient requires a period of training with the BCI to achieve optimal control over the robotic arm. More complex motor imagery, including imagined goals and trajectories and types of movement, has been recorded in the human posterior parietal cortex. Decoding this imagery could provide higher levels of control of neural prostheses.23 More recently, a quadriplegic patient was able to move his fingers to grasp, manipulate and release objects in real time, using a BCI connected to cutaneous electrodes on his forearms that activated the underlying muscles.24
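At their core, the signal processing techniques described above are regression problems: given recorded firing rates, estimate the intended movement. A minimal sketch of a linear decoder fitted by ordinary least squares, solved here for just two input neurons so the normal equations reduce to a 2×2 system (real decoders use Kalman filters, far more channels and richer models; all names and numbers below are illustrative):

```python
# Minimal linear decoder: map neural firing rates to a cursor velocity
# with ordinary least squares. This is only the core idea of "finding
# relationships between changes in the neural signals and the intended
# movements"; real BCIs are far more sophisticated.

def fit_linear_decoder(rates, velocities):
    """rates: list of (r1, r2) firing-rate samples; velocities: targets.
    Returns weights (w1, w2) minimising squared error (normal equations)."""
    a11 = sum(r1 * r1 for r1, _ in rates)
    a12 = sum(r1 * r2 for r1, r2 in rates)
    a22 = sum(r2 * r2 for _, r2 in rates)
    b1 = sum(r1 * v for (r1, _), v in zip(rates, velocities))
    b2 = sum(r2 * v for (_, r2), v in zip(rates, velocities))
    det = a11 * a22 - a12 * a12   # invert the 2x2 system X^T X w = X^T y
    w1 = (a22 * b1 - a12 * b2) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return w1, w2

# Synthetic calibration data: velocity = 0.5*r1 - 0.2*r2 (a made-up
# "tuning" rule standing in for a patient's real neural tuning).
rates = [(1.0, 0.0), (0.0, 1.0), (2.0, 1.0), (1.0, 3.0)]
velocities = [0.5 * r1 - 0.2 * r2 for r1, r2 in rates]

w1, w2 = fit_linear_decoder(rates, velocities)

# Decode a new firing pattern into a velocity command:
decoded = w1 * 3.0 + w2 * 1.0
print(round(w1, 3), round(w2, 3), round(decoded, 3))
```

The calibration step here corresponds to the training period mentioned above, in which the patient attempts movements while the decoder's weights are fitted; the gliosis-driven signal drift discussed below is precisely what forces this fitting to be repeated.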

The challenge with all these motor cortex electrode interfaces is to convert them to wireless devices. This has recently been achieved in a monkey with a brain–spinal cord interface, enabling restoration of movement in its paralysed leg,25 and in a paralysed patient with amyotrophic lateral sclerosis, enabling control of a computer typing program.11

These examples of BCIs have primarily used penetrating microelectrodes, which, despite offering the highest fidelity signal, suffer from signal loss over months to years due to peri-electrode gliosis.26 This scarring reduces electrical conduction and the resulting signal change can require daily or even hourly recalibration of the algorithms used to extract information.18 This makes BCIs difficult to use while unsupervised and hinders wider clinical application, including use outside a laboratory setting.

A recently developed, less invasive means of electrode interface with the motor cortex is the stent-electrode recording array (“stentrode”) (Box 2, B).27 This is a stent embedded with recording electrodes that is placed into the sagittal venous sinus (situated near the motor cortex) using interventional neuroradiology techniques. This avoids the need for a craniotomy to implant the electrodes, but there are many technical challenges to overcome before human trials of the stentrode can commence.

Lower-limb robotic exoskeleton devices that enable paraplegic patients to stand and walk have generated much excitement and anticipation. BCIs using scalp EEG electrodes are unlikely to provide control of movement beyond activating simple robotic walking algorithms in the exoskeleton, such as “walk forward” or “walk to the right”. Higher degrees of complex movement control of the exoskeleton with a BCI would require intracranial electrode placement.28 Robotic exoskeleton devices are currently cumbersome and expensive.

Sensory interfaces

Fine control of grasping and manipulation of the hand depends on tactile feedback. No commercial solution for providing artificial tactile feedback is available. Although early primate studies have produced artificial perceptions through electrical stimulation of the somatosensory cortex, stimulation can detrimentally interfere with the neural recordings.29 Optogenetics — the ability to make neurons light-sensitive — has been proposed to overcome this.30 Sensorised thimbles have been placed on the fingers of the upper limb myoelectric prosthesis to provide vibratory sensory feedback to a cuff on the arm, to inform the individual when contact with an object is made and then broken. Five amputees have trialled this, with resulting enhancement of their fine control and manipulation of objects, particularly for fragile objects.31 Sensory feedback relayed to the peripheral nerves and ultimately to the sensory cortex may provide more precise prosthetic control.32

Eight people with chronic paraplegia who used immersive virtual reality training over 12 months saw remarkable improvements in sensory and motor function. The training involved an EEG-based BCI that activated an exoskeleton for ambulation and visual–tactile feedback to the skin on the forearms. This is the first demonstration in animals or humans of long term BCI training improving neurological function, which is hypothesised to result from both spinal cord and cortical plasticity.33

The success of the cochlear prosthesis in restoring hearing to totally deaf individuals has also demonstrated how “plastic” the brain is in learning to interpret electrical signals from the sound-processing computer. The recipient learns to discern, identify and synthesise the various sounds.

The development of bionic vision devices has mainly focused on the retina, but electrical connectivity of these electrode arrays depends on the recipient having intact neural elements. Two retinal implants are commercially available.3 Retinitis pigmentosa has been the main indication. Early trials of retinal implants are commencing for patients with age-related macular degeneration. However, there are many blind people who will not be able to have retinal implants because they have lost the retinal neurons or optic pathways. Placing electrodes directly in the visual cortex bypasses all the afferent visual pathways.

It has been demonstrated that electrical stimulation of the human visual cortex produces discrete reproducible phosphenes. Several groups have been developing cortical microelectrode implants to be placed into the primary visual cortex. Since 2009, the Monash Vision Group has been developing a wireless cortical bionic vision device for people with acquired bilateral blindness (Box 3). Photographic images from a digital camera are processed by a pocket computer, which transforms the images into the relevant contours and shapes and into patterns of electrical stimulation that are transmitted wirelessly to the electrodes implanted in the visual cortex (Box 3, B). The aim is for the recipient to be able to navigate, identify objects and possibly read large print. Facial recognition is not offered because the number of electrodes will not deliver sufficient resolution.2 A first-in-human trial is planned for late 2017.2,34
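The image-processing step described here can be pictured as aggressive downsampling: a camera image is reduced to a coarse grid of stimulation decisions, one per electrode. A toy sketch of that reduction (block averaging plus a threshold; the grid size and threshold are arbitrary illustrative choices, not the Monash Vision Group's actual parameters, and real devices extract contours and salient structure first):

```python
# Toy reduction of a grayscale image (values 0-255) to a coarse binary
# "phosphene map": one on/off stimulation decision per electrode.
# Each electrode's decision is the thresholded mean brightness of the
# block of pixels it covers.

def image_to_phosphene_map(image, grid_rows, grid_cols, threshold=128):
    rows, cols = len(image), len(image[0])
    block_r, block_c = rows // grid_rows, cols // grid_cols
    phosphenes = []
    for gr in range(grid_rows):
        row = []
        for gc in range(grid_cols):
            # Average brightness over this electrode's block of pixels.
            total = count = 0
            for r in range(gr * block_r, (gr + 1) * block_r):
                for c in range(gc * block_c, (gc + 1) * block_c):
                    total += image[r][c]
                    count += 1
            row.append(1 if total / count >= threshold else 0)
        phosphenes.append(row)
    return phosphenes

# A 4x4 image: bright left half, dark right half, mapped to 2x2 electrodes.
image = [[255, 255, 0, 0] for _ in range(4)]
print(image_to_phosphene_map(image, 2, 2))  # [[1, 0], [1, 0]]
```

The severity of this reduction is why, as noted above, tasks like navigation and reading large print are realistic goals while facial recognition is not: the information surviving the downsampling is limited by the electrode count.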

The lateral geniculate nucleus of the thalamus is an alternative site for implantation of bionic vision devices. Further technical development of the design, manufacture and placement of multiple brain microelectrodes in this small deep brain structure is needed before this could be applied in humans.35

Memory restoration and enhancement

The same concepts and technologies used to record and stimulate the brain in motor and sensory prostheses can also be applied to deeper brain structures. For example, the fornix is an important brain structure for memory function. A human safety study of bilateral deep brain stimulation of the fornix has been conducted in 42 patients with mild, probable Alzheimer disease (ADvance trial), and this study will now proceed to a randomised controlled trial.36 This technique involves deep brain stimulation without direct feedback from neural recording.

A more definitive approach to memory augmentation would be to place a multi-electrode prosthesis directly into the hippocampus. Electrical mimicry of encoded patterns of memory about a task transmitted from trained donor rats to untrained recipient rats resulted in enhanced task performance in the recipients.37,38 This technology has been applied to the prefrontal cortex of non-human primates.39 Although human application is futuristic, this research is advancing rapidly. A start-up company was formed in 2016 to develop this prosthetic memory implant into a clinic-ready device for people with Alzheimer disease.40 The challenge in applying these therapies to Alzheimer disease and other forms of dementia will be to intervene before excessive neuronal loss has occurred.

Seizure detection and mitigation

Many patients with severe epilepsy do not achieve adequate control of seizures with medication. Deep brain electrical stimulation, using electrodes placed in the basal ganglia, is a treatment option for patients with medically refractory generalised epilepsy.41 Methods to detect the early onset of epileptic seizures using cortical recording and stimulation (to probe for excitability) are evolving rapidly.42 A hybrid neuroprosthesis, which combines electrical detection of seizures with an implanted anti-epileptic drug delivery system, is also being developed.43,44

Parkinson disease and other movement disorders

Deep brain stimulation in the basal ganglia is an effective treatment for Parkinson disease and other movement disorders.45 This type of BCI includes a four-electrode system implanted in the basal ganglia, on one or both sides, which is connected to a pulse generator implanted in the chest wall. This device can be reprogrammed wirelessly. Novel electrodes with many more electrode contacts and a recording capacity are being developed. This feedback-controlled or closed-loop stimulation will require a fully implanted BCI, so that the deep brain stimulation is adaptive and will better modulate the level of control of the movement disorder from minute to minute. More selective directional and steerable deep brain stimulation, with the electrical current being delivered in one direction from the active electrodes, rather than circumferentially, is being developed. The aim is to provide more precise stimulation of the target neurons, with less unwanted stimulation of surrounding areas and therefore fewer side effects.46

Technical challenges and future directions

Biocompatibility of materials, electrode design to minimise peri-electrode gliosis and electrode corrosion, and loss of insulation integrity are key engineering challenges in developing BCIs.47 Electrode carriers must be hermetically sealed to prevent ingress of body fluids. Smaller, more compact electronic components and improved wireless interfaces will be required. Electronic interfaces with larger numbers of neurons will necessitate new electrode design, but also more powerful computers and advanced signal processing to allow significant use time without recalibration of algorithms.

Advances in nanoscience and wireless and battery technology will likely have an increasing impact on BCIs. Novel electrode designs using materials such as carbon nanotubes and other nanomaterials, electrodes with anti-inflammatory coatings or mechanically flexible electrodes to minimise micromotion may have greater longevity than standard, rigid, platinum–iridium brain electrodes.48 Electrodes that record from neural networks in three dimensions have been achieved experimentally using injectable mesh electronics with tissue-like mechanical properties.49 Optogenetic techniques activate selected neuronal populations by directing light onto neurons that have been genetically engineered with light-sensitive proteins. There are clearly many hurdles to overcome before this technology is available in humans, but microscale wireless optoelectronic devices are working in mice.50

Populating the brain with nanobots that create a wireless interface may eventually enable direct electronic interface with “the cloud”. Although this is currently science fiction, the early stages of development of this type of technology have been explored in mice, using intravenously administered magnetoelectric particles (10 μg) that enter the brain and modify brain activity by coupling intrinsic neural activity with external magnetic fields.51

Also in development is the electrical connection of more than one brain region to a central control hub — using multiple electrodes with both stimulation and recording capabilities — for integration of data and neuromodulation. This may result in more nuanced treatments for psychiatric illness (such as depression, post-traumatic stress disorder and obsessive compulsive disorder), movement disorders, epilepsy and possibly dementia.

Ethical and practical considerations

Implantable BCI devices are in an early phase of development, with most first-in-human studies describing only a single patient. However, the performance of these devices is rapidly improving and, as they become wireless, the next step will be to implant BCIs in larger numbers of patients in multicentre trials.

The prime purpose of neurobionic devices is to help people with disabilities. However, there will be pressure in the future for bionic enhancement of normal cognitive, memory, sensory or motor function using BCIs. Memory augmentation, cognitive enhancement, infrared vision and exoskeletal enhancement of physical performance will all likely be achievable.

The introduction of this technology generates many ethical challenges, including:

  • appreciation of the risk–benefit ratio;

  • provision of adequate and balanced information for the recipient to give informed consent;

  • affordability in relation to the fair and equitable use of the scarce health dollar;

  • inequality of patient access to implants, particularly affecting those in poorer countries;

  • undue influence on physicians and scientists by commercial interests; and

  • the ability to achieve unfair physical or cognitive advantage with the technology, such as enhancing disabled athletes’ performance using exoskeleton devices, military application with the creation of an enhanced “super” soldier, or using a BCI as the ultimate lie detector.52

The introduction of these devices into clinical practice should therefore not proceed unchecked. As the technology transitions from clinical trial to the marketplace, training courses and mentoring will be needed for the surgeons who are implanting these devices. Any new human application of the BCI should be initially tested for safety and efficacy in experimental animal models. After receiving ethics committee approval for human application, the technology should be thoroughly evaluated in well conducted clinical trials with clear protocols and strict inclusion criteria.53

One question requiring consideration is whether sham surgery should be used to try to eliminate a placebo effect from the implantation of a new BCI device. Inclusion of a sham surgery control group in randomised controlled trials of surgical procedures has rarely been undertaken,54 and previous trials involving sham surgery have generated much controversy.55–57 Sham surgery trials undertaken for Parkinson disease have involved placing a stereotactic frame on the patient and drilling of burr holes but not implanting embryonic cells or gene therapy.58–60 We do not believe sham surgery would be applicable for BCI surgery, for several reasons. First, each trial usually involves only one or a few participants; there are not sufficient numbers for a randomised controlled trial. Second, the BCI patients can serve as their own controls because the devices can be inactivated. Finally, although sham controls may be justified if there is likely to be a significant placebo effect from the operation, this is not the case in BCI recipients, who have major neurological deficits such as blindness or paralysis.

Clinical application of a commercial BCI will require regulatory approval for an active implantable medical device, rather than approval as a therapy. It is also important for researchers to ask the potential recipients of this new technology how they feel about it and how it is likely to affect their lives if they volunteer to receive it.61 This can modify the plans of the researchers and the design of the technology. The need for craniotomy, with its attendant risks, may deter some potential users from accepting this technology.

As the current intracortical electrode interfaces may not function for more than a few years because of electrode or device failure, managing unrealistic patient and family expectations is essential. Trial participants will also require ongoing care and monitoring, which should be built into any trial budget. International BCI standards will need to be developed so that there is uniformity in the way this technology is introduced and evaluated.

Conclusions

BCI research and its application in humans is a rapidly advancing field of interdisciplinary research in medicine, neuroscience and engineering. The goal of these devices is to improve the level of function and quality of life for people with paralysis, spinal cord injury, amputation, acquired blindness, deafness, memory deficits and other neurological disorders. The capability to enhance normal motor, sensory or cognitive function is also emerging and will require careful regulation and control. Further technical development of BCIs, clinical trials and regulatory approval will be required before there is widespread introduction of these devices into clinical practice.

Box 1 –
Schematic overview of the major components of brain–computer interfaces


Common to all devices are electrodes that can interface at different scales with the neurons in the brain. For output-type interfaces (green arrows), the brain signals are amplified and control signals from them are decoded via a computer. These decoded signals are then used to control devices that can interact with the world, such as computer cursors or robotic limbs. For input-type interfaces (red arrows), such as vision or auditory prostheses, a sensor captures the relevant input, which a computer translates into stimulation parameters that are sent to the brain via an electrode interface. EEG = electroencephalography. LFP = local field potential.

Box 2 –
Electrodes of different scales that can be used to record neural activity for brain–computer interfaces


A: The most invasive method of recording neural activity, which produces the best signal quality, requires penetrating microelectrodes, such as this Utah array (Blackrock Microsystems), with 100 electrodes with a spacing of 400 μm. Wires connected to each electrode (bundled to the right of the image) need to be percutaneously connected to the outside world. B: Electrodes placed on an intravascular stent with (inset) a close-up image of a few electrodes (750 μm diameter). C: A 128-channel, non-invasive electroencephalography cap. After the cap is applied to the scalp, conductive gel is injected into each electrode to ensure electrical contact. D: An example of a planar array that can be placed in the subdural space to record electrocorticography signals. The platinum electrodes (350 μm diameter circles) are embedded in silicone.

Box 3 –
An example of a fully implantable brain–computer interface


A: The Monash Vision Group cortical vision prosthesis, which consists of an array of penetrating microelectrodes (metallic spikes) connected through a ceramic casing to electronics that are capable of delivering electrical stimulation and receiving wireless power and control signals. B: A close-up of a single electrode with a 150 μm diameter. The bright band is the conductive ring electrode, where electrical charge is delivered. Electrodes are spaced 1 mm apart.

Motor neurone disease: progress and challenges

Motor neurone disease (MND) is a progressive, neurodegenerative disorder that mainly attacks the human motor system, leading to significant disability and ultimately death, usually within 3 years.1 The incidence of MND in Western populations, including Australia, is about 2–3 per 100 000, with a national prevalence of about 8 per 100 000.2 Currently, 1500 Australian patients suffer from the disease.3 As there remains no specific diagnostic test for MND, diagnosis is based on clinical findings, supported by investigations such as neurophysiological testing and structural imaging to exclude mimic disorders.4 Given that MND is clinically and pathologically heterogeneous, therapeutic and neuroprotective targets have been difficult to identify. However, progress in recent years has stimulated innovative research into this devastating disorder. In this review, we discuss areas of progress in the field of MND, including improved understanding of the various clinical phenotypes, the development of standards of care, the continuum with frontotemporal dementia (FTD), the role of genetics, and the global clinical trials pipeline. We also highlight the importance of translating research into clinical practice through various networks.
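As a rough back-of-envelope check, the prevalence and patient-count figures above are of the same order of magnitude (the population figure below is an assumption for illustration, not taken from the article):

```python
# Illustrative sanity check: a prevalence of ~8 per 100 000 applied to
# Australia's population (~24 million, an assumed round figure) should
# land in the same order of magnitude as the ~1500 patients cited.
prevalence_per_100k = 8
population = 24_000_000  # assumption: approximate Australian population

expected_patients = prevalence_per_100k / 100_000 * population
print(round(expected_patients))  # prints 1920, the same order as the cited ~1500
```

The gap between the implied ~1900 and the cited 1500 sits comfortably within the uncertainty of the “about” figures quoted.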

To formulate an evidence-based review of MND in clinical practice, we searched PubMed for original and review articles published between 1990 and 2016, focusing on publications within the past 5 years. We also sourced guidelines and other articles from MND Australia and the Cochrane Database of Systematic Reviews, and drew on collective specialist experience across Australia’s multidisciplinary clinics.

New clinical perspectives

Patients with MND are heterogeneous with varied presentations, depending on the site of disease onset. The definitive clinical characteristics remain the presence of upper and lower motor neurone signs coexisting in the same symptomatic area (Box 1).3,5 The median time to diagnosis is about 14 months, which is often a distressing period that also inevitably delays appropriate disease-modifying therapies and acceptance of prognosis.4,6

The clinical phenotype and site of disease onset appears to be of important prognostic significance in MND.7 Four main clinical phenotypes are described based on the relative degree of upper and lower motor neurone predominance and the site of onset:

  • Amyotrophic lateral sclerosis (ALS) represents 70% of cases.3,6 ALS classically begins in a limb and exhibits a combination of upper and lower motor neurone signs.3,6 In about 20% of patients, the weakness starts in the bulbar region.3 With a median survival of about 3 years, ALS is the most malignant of the MND phenotypes.7

  • Isolated bulbar palsy represents about 4–8% of cases.8 Patients have localised, progressive speech and swallowing difficulties for a prolonged period (> 6 months), despite relative preservation of limb strength. Although almost all isolated bulbar palsy patients eventually progress to definite ALS, they have a better prognosis than bulbar-onset ALS, with disease duration extended by at least 12 months.8

  • Progressive muscular atrophy presents with pure lower motor neurone signs and represents 5% of the MND phenotype.9 A subgroup develops a flail limb variant, in which symptoms are limited either to the upper (flail arm syndrome) or the lower limbs (flail leg syndrome) for at least 12 months. The prognosis for these variants is typically more prolonged than for classic ALS, but patients with more generalised weakness (> 50% of limb regions affected) follow a similar disease course to ALS.9

  • Primary lateral sclerosis (1–3% of cases)10 presents with pure upper motor neurone signs and a predilection for lower limb disease onset. Patients experience a much slower disease progression and better prognosis, with some known to have normal life expectancy.10

Advances in treatment approach: new standards of care

An increased understanding of this complex disorder has enabled identification of several important factors that improve survival and reduce patient symptoms, generating new evidence-based treatment interventions.11

Multidisciplinary care

Multidisciplinary care has been reported to improve both quality of life and survival, potentially by up to 7–24 months.12,13 For patients with bulbar disease, the survival benefit is possibly even longer.12 The care team involves medical, nursing and allied health professionals, and focuses on proactive intervention for early holistic management of the patient (Box 2). The network also includes input from MND Australia and state-based associations, which assist with care coordination, family support and community education. This dynamic framework addresses the complex medical issues that arise over time, allows for continual assessment of functional disabilities and psychosocial burden, and provides a network of support for the treating clinicians.13 Management within a multidisciplinary care clinic is therefore recommended for all patients, with such clinics operating in most capital cities around Australia.2 Telemedicine services are also available in some areas for patients who have difficulty attending (for example, due to disability or location).

Respiratory management with non-invasive ventilation

The benefit of using non-invasive positive pressure ventilation for respiratory involvement has been a crucial discovery in MND care, not only because it greatly improves symptoms and quality of life, but because it extends patient survival by up to 13 months.14,15 Only a small proportion of patients with MND have respiratory problems at the initial onset of disease, but almost all will develop symptoms during the course of the disease and most will die from respiratory complications.16,17 Respiratory symptoms should therefore be assessed at each visit, and non-invasive positive pressure ventilation should be instituted early, if tolerated.

Nutritional support

Weight loss and malnutrition during the course of disease exert a negative effect on survival and are associated with more rapid disease progression.18 They can result from development of a hypermetabolic state, swallowing problems, neuropsychiatric issues, and feeding difficulties due to loss of limb function.19 Monitoring patients for weight loss is thus essential, and managing this with the help of an experienced speech pathologist and dietitian is critical in routine MND care. Enteral feeding interventions (such as percutaneous endoscopic gastrostomy tubes) are helpful in circumventing dysphagia (especially in patients with bulbar-onset MND) and for maintaining nutrition in patients using long term ventilatory support.20,21

Other symptoms

Cumulative experience has guided current clinical practice for the treatment of other symptoms often experienced by patients with MND related to their progressive motor and non-motor dysfunction (Box 3).

End-of-life challenges

For all patients with MND, the issue of advance care planning needs to be raised within an appropriate timeframe by the multidisciplinary care team. Although the end-of-life phase remains poorly defined and the timing of palliative care input is not consistent globally,22 advance care planning assists patients and their families with important decision making, imparting control and composure in a situation in which these may otherwise be absent. Carers have identified disease-specific advance directives (such as patient letters of future care) as useful tools to stimulate such discussion while maintaining respect for patient autonomy.23 Appropriate timing for these discussions is often individually based, but they can be sensitively approached when introducing intervention options such as non-invasive positive pressure ventilation or gastrostomy.

The MND–FTD spectrum

Although originally thought to be a purely motor disorder, MND has been increasingly recognised for its extra-motor manifestations. Cognitive impairment is common and may develop in up to 50% of patients, manifesting as language abnormalities and mild to moderate frontal dysfunction.24 The presence of cognitive impairment is associated with a negative impact on survival, more rapid disease progression, decreased functional ability, higher rates of non-compliance with therapy, and increased psychosocial distress and burden for carers.24

About 15% of patients with MND who have cognitive impairment meet the criteria for FTD, and are considered to have MND–FTD.25 MND and FTD are known to share distinct clinical, neuropathological and genetic features, and it is now recognised that they are part of a continuum in which pure MND (with no cognitive involvement) and pure FTD (with no motor involvement) form the ends of a spectrum.25 In 2011, the discovery of a mutation in the C9orf72 gene unambiguously linked these two conditions (Box 4).26,27 The C9orf72 mutation is a hexanucleotide repeat expansion, now known to be responsible for about 40% of familial MND, 25% of familial FTD, and up to 80% of MND–FTD cases.27

Genetics: new insights

The recent discovery of novel genes associated with MND has changed the genetic landscape and raised the possibility of future gene therapy. Increasingly, this is modifying the clinical approach to MND, and clinicians commonly face questions regarding genetic causes, testing and family risk. Although mainly driven by the neurologist and clinical geneticist, handling such queries requires an understanding of appropriate genetic testing options and counselling for patients and their relatives.

Familial and sporadic MND are clinically similar, and the genetic and biological distinction between them is becoming increasingly blurred. About 90% of MND cases appear to be sporadic, with 5–10% of patients having a family history of MND.28 Family history is established by the presence of MND or FTD in a first- or second-degree relative. Familial MND is most commonly inherited in an autosomal dominant manner but may be autosomal recessive or X-linked.29 Empirical data suggest that the lifetime risk of MND in first-degree relatives of sporadic patients is 1–3%, with a lower risk for second-degree relatives and no apparent increase in risk for more distant relations.30 However, many factors can complicate the presence and pattern of inheritance, including incomplete family information, misattributed paternity, early death and non-penetrance.

To date, more than 25 MND genes have been discovered, explaining 10% of sporadic and 65% of familial disease.31 The C9orf72 repeat expansion is the most common genetic cause of MND. This is reported to account for about 40% of familial and about 7% of sporadic disease,31,32 but the exact frequency of MND-related genes varies between different populations and specific data on prevalence in Australia are still limited. Preliminary national studies identified the C9orf72 mutation in 38.5% of familial cases and 3.5% of sporadic cases.33 There are also some shared clinical traits in these patients, who are typically of northern European descent and who clinically often have neuropsychiatric symptoms, including frank FTD.32,33 There is also a suggestion of higher frequency of bulbar-onset disease, earlier age of symptom onset, and more rapid disease progression.26,32

Who should be offered testing?

Currently, as there is no proven lifestyle modification or medication to reduce the risk of disease, the reason for genetic testing is mainly to provide diagnostic support or pre-natal counselling.34 The option for testing is usually offered by a neurologist to symptomatic patients who have a first- or second-degree relative with MND, FTD or MND–FTD. It can also be discussed with all other symptomatic patients, but with emphasis on the uncertainties of testing. Guidelines do not recommend that asymptomatic at-risk people be routinely offered testing, particularly as positive test results do not reliably correlate with disease development, severity, progression or age of disease onset.34

Genetic testing options

Until recently, testing was limited to a single gene — SOD1. Multigene next-generation sequencing panels and whole-exome sequencing in particular have significantly increased the identification of new genes and commercial testing options. Most tests take 3–24 weeks to return results. Although some genetic services may provide subsidised testing for several MND-related genes, there is no Medicare rebate for testing. The test can therefore cost between $250 and $9000, depending on the test and the laboratory.25,33

Limitations of testing

Despite the increase in genetic understanding of disease, establishing genetic testing guidelines has been complicated because of areas of persisting uncertainty. Establishing whether a mutation is pathogenic can be difficult, even for widely studied genes.34 There are also high rates of variants of uncertain significance in multigene panels, and technical challenges from differing laboratory techniques also generate problems.35 These limitations in testing should be emphasised to the patient, including the fact that positive results do not predict disease course, negative results do not exclude genetic basis of disease, and results may not provide interpretable information if a genetic variant of uncertain significance is identified.

Genetic counselling

Given these complications, interpreting results correctly and counselling patients with accurate risk assessments can be difficult. Genetic counselling should be managed by the multidisciplinary team, mainly via a clinical geneticist, but also involving a neurologist, general practitioner and psychologist. Genetic counselling clinics operate across Australia (information is available at http://www.genetics.edu.au). Patients and family members should have a pre-test consultation with a genetic counsellor before undergoing testing. Post-test counselling is usually offered for patients with a positive test result, and implications for family members (including offspring) can be discussed. DNA banking permits future testing and is an option for families who do not feel ready to undergo genetic testing.32,34

Neuroprotection: current and future

Riluzole currently remains the only neuroprotective medication available for patients with MND, and early commencement is recommended for all patients.36,37 Riluzole modulates sodium channels and inhibits glutamate release, providing a survival benefit of about 3–6 months, potentially greater for patients with bulbar-onset disease.38 It has a modest side effect profile and is generally well tolerated by patients. Liver function tests and a full blood count should be carried out monthly for 3 months, and 3-monthly thereafter, with treatment cessation if liver function test results (alanine transaminase and aspartate transaminase levels) exceed five times the upper limit of normal and/or neutropenia develops.37,39

Current trials

Edaravone (3-methyl-1-phenyl-2-pyrazolin-5-one) is a free radical scavenger that has gained attention as a potential agent to slow disease progression in MND. Originally approved in Japan in 2002 to treat ischaemic stroke, edaravone has been shown to have multiple effects on the neural and vascular ischaemic cascade.40 Although an initial phase 3 randomised controlled trial in MND found no clinically significant effect, a subgroup of mildly symptomatic MND patients who had a forced vital capacity ≥ 80% and who were not more than 2 years from symptom onset showed slowing of disease progression.41 Intravenous edaravone treatment for such patients was approved in Japan in 2015. An oral equivalent is currently being tested in Europe in a phase 2 trial (http://www.treeway.nl/news-treeway-announces-positive-data-phase-1-trial-tw001).

There have been several other phase 3 randomised controlled trials that have shown promising results. A 2015 trial of ultra-high dose methylcobalamin showed a dose-dependent prolongation of survival when used early in disease.42 Positive results have also been seen with masitinib, an oral tyrosine kinase inhibitor that targets macrophages and mast cells,43 and a current trial has shown improvement in patients’ Revised ALS Functional Rating Scale scores and forced vital capacity (https://www.clinicaltrialsregister.eu/ctr-search/trial/2010-024423-24/IE#E). Although not available for use in Australia, masitinib has recently been granted orphan drug status by the European Medicines Agency and by the United States Food and Drug Administration (https://alsnewstoday.com/news-posts/2016/08/11/ab-science-potential-als-treatment-masitinib-named-orphan-drug-by-ema).

In Australia, three human clinical trials are currently commencing in Sydney and Melbourne. The first will analyse central nervous system copper treatment in patients with MND, after delivery of a copper compound (CuATSM) into the central nervous system of SOD1(G93A) mouse models prolonged survival by 18 months.44 The second trial is a phase 2 study assessing the effects of the antiretroviral agent Triumeq (ViiV Healthcare; combination tablet containing 600 mg abacavir, 300 mg lamivudine and 50 mg dolutegravir) in treating MND. The basis for this trial relates to links between human endogenous retrovirus K and the development of MND in mouse models and in humans.45 The third trial is a randomised crossover study evaluating the efficacy of oral FLX-787 (Flex Pharma) — a constituent of ginger — for patients with MND suffering from muscle cramps and spasticity. This study hypothesises that FLX-787 activates transient receptor potential ion channels involved in pain, and indirectly decreases motor neurone hyperexcitability.46 Recruitment for these trials began at involved centres towards the end of 2016.

Gene therapy

Gene technology aims to protect motor neurones by modulating mutated gene expression and reducing toxic RNA and proteins. Encouragingly, studies in SOD1 mouse models have shown significant benefits,47 with C9orf72 studies ongoing. However, this area is still in its infancy, mainly due to the complexity of the MND genetic environment, the difficulty in correlating mutations with clinical pathogenicity, the technical difficulty of the treatment itself, and the economic challenge it presents.47

Stem cell therapy

The benefit of stem cell therapy is frequently questioned by patients and families. The efficacy of this treatment remains an open question in the literature, with small phase 2 trials from around the world reporting on the safety and efficacy of various modes of stem cell delivery.48 Human induced pluripotent stem cell technology is a new and evolving field in which cells are created from patients with MND and differentiated into relevant cell subtypes, such as motor neurones and astrocytes. This has been used in vitro and in mouse models, but engraftment into patients with MND for therapeutic benefit remains a challenge.49 Although some countries offer therapeutic trials of stem cell treatment, it is not offered in Australia owing to its unclear efficacy. More understanding of underlying mechanistic processes and long term safety data are needed before clinical translation will be possible.

Fundraising, awareness and research

While the search for improved therapies continues, strong patient advocacy and home-based assistance has been provided by state-based MND associations and MND Australia. The Australian Motor Neurone Disease Registry also collects information from MND centres across Australia to increase clinical understanding and improve quality of care.2

Support for trials in Australia is also needed to drive national research and provide Australian patients with access to potential international treatments. Recently, the Australian Clinical Trials and Translational Research Advisory (https://curemnd.org.au/meet-the-team) was established to enable collaboration and grant patients this much needed opportunity. Increased community awareness (through campaigns such as the “ice bucket challenge”) and the generation of funding in Australia by organisations such as the Cure for MND Foundation (https://www.curemnd.org.au) and the MND Research Institute of Australia (MNDRIA) (http://www.mndaust.asn.au) have supported national MND research in an unprecedented way, enabling clinicians to translate research successfully into the MND community.

Conclusion

Recent evolution in the clinical, pathological and genetic understanding of MND is progressively unmasking the multifactorial nature of this complex condition. Up-to-date knowledge of the current climate is thus essential for optimal patient care. Closing the research–practice gap through growth of community awareness and support from MND organisations has also been critical for this process. Ongoing engagement of professionals and the community is an invaluable asset to patients, encouraging novel therapeutic strategies and powering the drive to find effective treatments.

Box 1 –
Clinical signs and symptoms of upper and lower motor neurone involvement in motor neurone disease


UMN = upper motor neurone. LMN = lower motor neurone.

Box 2 –
Motor neurone disease management: multidisciplinary care model


The multidisciplinary care model centres on the patient with motor neurone disease. It involves dynamic integration of medical, nursing and allied health professionals for optimal patient management. Care is often coordinated by the clinical nurse, with the neurologist and general practitioner overseeing all aspects of care.

Box 3 –
Symptomatic treatments for motor neurone disease

Symptom | Pharmacological management (first-line medications) | Non-pharmacological management

Secondary to motor dysfunction
Cramps/fasciculations | Magnesium; carbamazepine | Physiotherapy (stretches); massage; hydrotherapy
Spasticity | Baclofen; clonazepam | Physiotherapy; hydrotherapy
Dyspnoea | Morphine (oral)*; lorazepam | Ventilatory support (respiratory review); chest physiotherapy; manually assisted coughing
Thickened saliva | Normal saline nebulisers; nebulised mucolytics (eg, N-acetylcysteine) | Natural mucolytics (papaya, pineapple, dark grape juice); hydration; portable suction device
Excess (watery) saliva | Amitriptyline; atropine (sublingual)*; glycopyrrolate | Portable suction device; diligent mouth care
Laryngospasm/paroxysmal choking | Lorazepam (sublingual)*; morphine (oral)* | Careful positioning; suctioning; ± ventilatory support

Secondary to non-motor dysfunction
Pain† | Musculoskeletal: paracetamol; ibuprofen. Neuropathic: gabapentin, pregabalin | Physiotherapy; hydrotherapy; pressure area care (repositioning, pressure cushion/mattress)
Cognitive dysfunction | Memantine‡; antidepressants | Education of caregivers/family
Emotional lability; depression | Amitriptyline; citalopram; mirtazapine | Psychological support; cognitive behavioural therapy
Sleep disturbance | Amitriptyline; benzodiazepines | Address underlying problem; may need respiratory review ± non-invasive ventilation
Constipation | Aperients; suppositories | Dietary changes (increased fibre and fluid); review drug adverse effects

* The specific formulation indicated is preferred for treatment of this symptom. † Pain is often multifactorial, and treatment must therefore be tailored to the individual cause(s). ‡ Used off label, but not supported by a recent negative phase 3 trial in FTD (Boxer AL, Knopman DS, Kaufer DI, et al. Memantine in patients with frontotemporal lobar degeneration: a multicentre, randomised, double-blind, placebo-controlled trial. Lancet Neurol 2013; 12: 149-156).

Box 4 –
The motor neurone disease (MND)–frontotemporal dementia (FTD) spectrum, showing some major known genetic causes (circles) and approximate year of gene discovery

Risk-adjusted hospital mortality rates for stroke: evidence from the Australian Stroke Clinical Registry (AuSCR)

The known: Variance in patient outcomes between hospitals treating acute stroke needs to be reliably assessed. Methodology for standardising risk adjustment is evolving and requires field testing. The data in hospital admission databases are limited with regard to risk adjustment.

The new: Since 2009, the Australian Stroke Clinical Registry has captured data on stroke severity and other variables. The data have been used to improve risk adjustment when comparing hospital mortality rates; they can also be reliably linked to death registrations to compare methods for assessing risk-adjusted hospital mortality.

The implications: Including appropriate risk adjustment variables will ensure that comparisons of hospital performance regarding important patient outcomes for stroke are reliable.

Stroke imposes a major health care burden, but the adoption of effective interventions varies widely.1 Efforts to improve the quality of stroke management rely on rigorous outcomes data2 to avoid misleading comparisons between hospitals. To identify potentially modifiable factors, analyses must account for casemix differences and random error.3 In particular, analyses must take stroke severity into consideration, as it is one of the strongest predictors of stroke mortality.2,4,5

Although the methodology is still evolving, standardised risk adjustment2 is highly relevant to health care consumers and policy makers. In a recent report of routinely collected hospital admissions data, significant variation in 30-day stroke mortality was found after adjusting for age, sex and comorbidities (including hypertension and diabetes), but there was no adjustment for stroke severity.6 The National Health Performance Authority (NHPA) has identified stroke as a condition for which inter-hospital differences in models of care (eg, patterns of patient transfers) and inconsistent recording of clinical information and procedures (eg, palliative care coding) may distort comparisons of mortality.7 Because hospital data must be complete, accurate and consistent, the NHPA is currently unable to support public reporting of inter-hospital disease mortality rates, as such comparisons could be unreliable.7 In contrast, the ability to reliably compare hospital performance with respect to patient outcomes has rapidly accelerated improvements in health care overseas.7

Our aim was to describe variance in 30-day stroke mortality between hospitals using risk-adjusted mortality rates (RAMRs), trialling a recently recommended statistical method that includes stroke severity as a covariate.2

Methods

Study design

The Australian Stroke Clinical Registry (AuSCR) is a voluntary, prospective, clinical quality registry that captures standardised data for nationally agreed variables for all patients admitted to participating hospitals with acute stroke or transient ischaemic attack (TIA).8 AuSCR includes personal information (eg, name, address), clinical characteristics (eg, type of stroke), quality of care indicators (eg, stroke unit treatment), and outcomes measured at discharge and at 90–180 days (eg, survival and quality of life).8 Stroke severity is captured using a simple, validated prognostic measure, the “ability to walk unaided at the time of hospital admission”.9 In the original statistical modelling by Counsell and colleagues,9 this criterion was associated with a relative risk for 30-day survival of 1.63 (95% confidence interval [CI], 1.15–2.31). In our earlier work, the strongest predictor of independence at time of hospital discharge was the ability to walk on admission (odds ratio [OR], 2.84; 95% CI, 2.18–3.71).10

Data from participating hospitals were obtained for the period from 15 June 2009 (six participating hospitals) until 31 December 2014 (40 participating hospitals). We included all stroke types (ischaemic, intracerebral haemorrhage, and undetermined) in our analyses, as well as demographic variables, as stroke mortality is higher at all ages for Indigenous than for non-Indigenous Australians,11 varies according to country of birth,12 and is greater for people of lower socio-economic status.13 Socio-economic status was assessed by matching patients’ addresses with the corresponding Index of Relative Socio-economic Advantage and Disadvantage (IRSAD) score,14 collated as quintiles. Whether the stroke was the first or subsequent stroke experienced by the patient was included as a covariate, as the risk of death is greater for recurrent events.15 Age was included as a continuous measure in all models. All episodes occurring within 30 days of admission were included. Harrell’s concordance statistic (C-statistic) was used to determine how well the variable “ability to walk on admission” predicted 30-day mortality in our models.

We excluded patients who experienced a stroke while in hospital for another condition or when transferred from another hospital, as the different patterns of care may distort mortality ratios.7,16 Data for stroke care in a paediatric hospital were also excluded because of the small sample size (fewer than 50 care episodes).

Mortality data

Survival status at 30 days was obtained by probabilistic matching of AuSCR registrant identifiers with the National Death Index (NDI) by the Australian Institute of Health and Welfare. AuSCR staff undertook the review of non-exact matches; for discordant dates, we used NDI data as the reference. Based on this linkage method, in-hospital death reporting in the AuSCR had 98.8% sensitivity and 99.6% specificity, compared with an in-hospital death determined with the NDI date of death.
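The sensitivity and specificity figures quoted for in-hospital death reporting follow the standard confusion-matrix definitions, with the NDI-derived date of death as the reference. A minimal sketch of that calculation (the function name and list-of-flags inputs are illustrative assumptions, not the AuSCR linkage code):

```python
def linkage_accuracy(registry_flags, ndi_flags):
    """Sensitivity and specificity of registry-recorded in-hospital death,
    treating the NDI-derived flag as the reference standard.
    Each input is a list of booleans, one per registrant."""
    pairs = list(zip(registry_flags, ndi_flags))
    tp = sum(1 for r, n in pairs if r and n)          # deaths found by both
    tn = sum(1 for r, n in pairs if not r and not n)  # survivors in both
    fn = sum(1 for r, n in pairs if not r and n)      # deaths missed by registry
    fp = sum(1 for r, n in pairs if r and not n)      # registry-only "deaths"
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```

On this definition, the reported 98.8% sensitivity means the registry recorded nearly all in-hospital deaths confirmed by the NDI, and the 99.6% specificity means it rarely recorded a death the NDI did not confirm.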

Outcome measures and analyses

The primary outcome was the risk-adjusted mortality rate (RAMR) at 30 days after admission, using the method recommended by the American Heart Association and American Stroke Association (AHA/ASA).2 To maximise the reliability of our estimates,17 analyses were conducted for individual hospitals that provided data on at least 200 episodes of care for stroke between 2009 and 2014. The stages in deriving each model were:

  • entering the observed values;

  • generating estimates from generalised linear latent and mixed models (GLLAMM) by maximum likelihood;

  • generating expected probabilities;

  • generating predicted probabilities;

  • generating predicted-to-expected ratios; and

  • generating the RAMR.

The RAMR for each hospital was calculated by dividing its predicted mortality by its expected mortality, and then multiplying by the overall crude (unadjusted) proportion of deaths in the whole sample.
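The final step above reduces to a one-line calculation once the model outputs are in hand. A minimal sketch of the predicted-to-expected ratio approach described in the text (the function name and example numbers are illustrative assumptions; the actual estimates came from the GLLAMM models, not from this code):

```python
def ramr(predicted_deaths, expected_deaths, overall_crude_rate):
    """Risk-adjusted mortality rate for one hospital:
    (predicted / expected) x overall crude proportion of deaths."""
    return predicted_deaths / expected_deaths * overall_crude_rate

# Hypothetical hospital: the model predicts 30 deaths where 25 would be
# expected from its casemix, against an overall crude rate of 12%.
rate = ramr(30, 25, 0.12)  # ~0.144, ie, a 14.4% risk-adjusted rate
```

A ratio above 1 (predicted exceeding expected) pushes the hospital's RAMR above the overall crude rate; a ratio below 1 pulls it below.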

The results were compared in models with covariates corresponding to those available in hospital admissions data (the hospital admissions model) and after also including covariates corresponding to those available only in the AuSCR (the Registry model). A model adjusted only for age and sex was also estimated. The hospital admissions model was adjusted for age, sex, year of data, stroke subtype, IRSAD quintile, Indigenous status, and place of birth (Australia v elsewhere). The Registry model was adjusted for the same variables, as well as for stroke history and severity. Differences between the models in the ranking of individual hospitals were explored.

Data are provided on the calibration and discrimination of the models,2 using the likelihood ratio test, the Akaike information criterion (AIC), the Bayesian information criterion (BIC), and the C-statistic. A smaller AIC or BIC indicates a better fitting model; a C-statistic of 1 indicates perfect discrimination, while a C-statistic of 0.5 indicates discrimination no better than chance. Multilevel models were used, with one level defined as the hospital unit, to account for correlations between patients who were managed in the same hospital, and the other representing patients as individual units.
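The C-statistic used here is the usual concordance probability: the chance that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor. A brute-force sketch under that definition (illustrative only; the paper's values were computed within the fitted models, not with this function):

```python
from itertools import product

def c_statistic(y_true, y_score):
    """Concordance (C-statistic) for a binary outcome.
    y_true: 1 = died, 0 = survived; y_score: predicted risk of death.
    Ties in predicted risk count as half-concordant."""
    deaths = [s for y, s in zip(y_true, y_score) if y == 1]
    survivors = [s for y, s in zip(y_true, y_score) if y == 0]
    concordant = sum(
        1.0 if d > s else 0.5 if d == s else 0.0
        for d, s in product(deaths, survivors)
    )
    return concordant / (len(deaths) * len(survivors))
```

A value of 0.80, as reported for the Registry model, means the model ranks a death above a survivor in 80% of such pairs.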

P < 0.001 (two-sided) was deemed statistically significant because of the large sample size. Analyses were performed in Stata 12.1 (StataCorp).

Identifying significant mortality variation and differences in mortality outcomes

According to standard practice, hospitals within two standard deviations (SDs) of the overall average RAMR were deemed to lie within normal variation, and those outside three SDs were deemed to vary significantly from the other hospitals in the sample.18 Funnel plots were used to investigate deviations from the average hospital mortality rate.19 The direction of the change was also explored by graphing the difference between the age- and sex-adjusted hospital admissions and Registry RAMR estimates.
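The 2 SD and 3 SD funnel-plot bands narrow as a hospital's caseload grows, which is what lets small hospitals show wide variation without being flagged. A minimal sketch of the control limits under a normal approximation to the binomial (an assumption of this illustration; the published funnel plot may use a different limit formula):

```python
import math

def funnel_limits(p_overall, n_cases, z):
    """Approximate funnel-plot control limits around the overall mortality
    rate for a hospital contributing n_cases episodes.
    z = 2 or 3 gives the 2 SD / 3 SD bands."""
    sd = math.sqrt(p_overall * (1 - p_overall) / n_cases)
    return p_overall - z * sd, p_overall + z * sd

# Hypothetical example: overall rate 15%, hospital with 400 episodes.
lo, hi = funnel_limits(0.15, 400, 2)  # roughly 11.4% to 18.6%
```

Only hospitals whose RAMR falls outside the 3 SD band for their caseload would be treated as significant outliers, as in Box 4.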

Ethics approval

Appropriate ethics and governance approvals were obtained for all participating hospitals in AuSCR, and from the Human Research Ethics Committee of Monash University (reference, CF11/3537–2011001884). Ethics approval was obtained from the Australian Institute of Health and Welfare to conduct data linkage to the National Death Index (reference, EO 2013/2/16).

Results

Between 2009 and 2014, 26 302 episodes of care for 24 806 individual patients from 45 hospitals were recorded; 3151 patients (12%) died within 30 days of admission (excluding TIAs, 14%). The concordance of ability to walk on admission as an indicator of stroke severity with 30-day mortality was excellent (C-statistic, 0.97). Patients with intracerebral haemorrhage (29%) were more likely to die within 30 days than those with other stroke types (ischaemic, 12%; undetermined stroke, 14%; TIAs, < 1%) (online Appendix 1).

Data from hospitals reporting at least 200 episodes of stroke care

Eighteen hospitals located in metropolitan areas and ten in rural and regional areas, each with a stroke unit, provided data for at least 200 episodes of care. Hospitals in the eastern states contributed most data (Victoria, 40% of episodes; New South Wales, 17%; Queensland, 34%; Tasmania, 4%); in Western Australia (5% of episodes), only two hospitals participated. We excluded from our analysis 7509 patients who had a TIA or in-hospital stroke, or who were transferred from another hospital (online Appendix 1).

In total, 16 218 episodes of care were provided to 15 951 individual patients (median age, 77 years; women, 46%; ischaemic stroke, 79%). Compared with patients who were alive 30 days after admission, the proportion of women among those who died was greater; they were also older, fewer were able to walk on admission, and more had a history of stroke or TIA (Box 1). The characteristics of patients with stroke were similar across the 28 hospitals with respect to age, sex, and ability to walk on admission (online Appendix 2). The proportions of patients with severe strokes were similar for hospitals with more or fewer episodes of care (data not shown; P = 0.59).

Comparison of hospital 30-day mortality outcomes

The unadjusted (crude) mortality rates for the 28 hospitals with at least 200 episodes of care ranged between 7% and 23%. Excluding the 7509 patients who had a TIA or in-hospital stroke, the unadjusted mortality rates for hospitals ranged between 5% and 20%, and the age- and sex-adjusted mortality rates ranged between 8% and 20%. The RAMRs estimated by the hospital admissions model ranged between 9% and 20%, and those by the Registry model between 9% and 21% (Box 2). The overall RAMRs adjusted for different combinations of Indigenous status, country of birth and history of stroke are reported in online Appendix 3. According to the model fit statistics (BIC, AIC, likelihood ratio test, C-statistic), the Registry model had the best fit (Box 3, online Appendix 3). Correlations between the number of episodes contributed by a hospital and the differences between age- and sex-adjusted RAMRs and the Registry RAMR estimates (R2 = 0.021) or hospital admissions RAMRs (R2 = 0.001) were low. When the results of the hospital admissions and Registry models were compared, the variance ranged between 0% and 3%.

Although the ranges of estimates by the adjusted models were similar, the rank order of hospitals changed according to the initial crude estimate and simple age- and sex-adjusted models (Box 2; online Appendix 4). The change in ranks in the hospital admissions and full Registry models illustrates the possibility of a hospital attaining very different results. The models with the best fit were those that included stroke severity as a covariate (Box 3). Based on the funnel plot distribution, the estimated mortality for only two hospitals was more than three SDs from the mean, one with low mortality, and the other with borderline excess mortality relative to other hospitals (Box 4).

Quality of care and correlations with mortality rates

Adherence to processes of care was similar for all hospitals (online Appendix 2). Stroke unit admissions ranged from 99% for the hospital with lowest RAMR to 80% for the hospital with the highest RAMR (weak positive correlation between increased stroke unit access and lower RAMR: R2 = 0.138). A negligible positive correlation was noted between increased prescription of antihypertensive drugs at discharge and lower RAMR (R2 = 0.021).

Discussion

Assessing the quality of health care delivered by different health care providers is complicated by the variable quality of routinely collected hospital data.7 For burdensome conditions such as stroke, this problem is exacerbated by the inability to account for differences in stroke severity and by inaccuracies in the coding of diagnosis or cause of death.20 Clinical quality registries have emerged as important tools for resolving these problems, but support from government agencies is not as consistent in Australia as in comparable countries.

We have provided an important illustration of the value of a national clinical quality registry for stroke, using a new method for calculating mortality statistics. The models with the best fit for standardising mortality were those that included adjustment for stroke severity, a covariate routinely available only in AuSCR. The change in rank position according to different RAMRs was clearest for hospital 13, which was ranked number 6 in the full Registry model, but number 21 in the hospital admissions model. Rankings that are not based on models adequately adjusted for relevant risks can lead to interpretations that suggest that some hospitals provide substandard care, and thereby impugn their reputations and those of their clinicians. The funnel plot approach provides an alternative method for assessing performance, but the control limits associated with the assumption of a normal distribution of the data make caution advisable, particularly if the data are skewed, as in our sample of only 28 hospitals. For the hospital with the lowest mortality in each of the models (hospital 1), selection bias may have arisen because 99% of its patients were treated in a stroke unit (online Appendix 2), and other unmeasured factors may also have contributed to its better outcome.

Our findings differ from a previous investigation of hospital stroke mortality rates in NSW that applied more conventional modelling methods, without adjusting for stroke severity.6 Standardised 30-day mortality rates varied significantly, from 15% to 30%, and several hospitals were categorised as “poor performers”.6 Cases were sampled across different timeframes and with varying sample sizes, but there was a greater diversity of hospitals than in our study; for example, hospitals without stroke units were included.

Registry data that include disease severity risk-adjustment variables that supplement hospital data can be used to ensure that performance comparisons are more reliable. Given the growth in public reporting of hospital performance and the recognition of its potentially driving improvement of quality of care,21 it is essential that appropriate methods are employed. We estimated RAMRs using a new approach recommended by the AHA/ASA,2 replacing the observed number of deaths with a prediction of numbers of deaths estimated from the average number of deaths for hospitals in a risk-adjusted model. This reduces the influence of chance on the variation in RAMRs (predicted v expected). Our study is the first report on the application of this new approach, and our models predicted numbers of deaths within 0–9% of the actual number.

Our investigation has broader implications for Australia, in that it advances methods for hospital-level comparisons of risk-adjusted mortality, particularly on the basis of routinely collected registry data. We acknowledge, as a limitation of our study, that not all hospitals contribute data to AuSCR, and our findings may consequently not be generalisable to all hospitals. Further, our sample was restricted to hospitals reporting at least 200 episodes of care (10 084 episodes from 17 hospitals were therefore excluded from our analyses). Including all hospitals may have led to greater variance in our results, but our sample was broadly representative of the entire cohort (online Appendix 1). The overall crude 30-day death rate for eligible hospitals (with at least 200 episodes of care) was 15% (range, 7–23%), similar to reports from other countries (13–15%).4,17

Critical predictors of stroke mortality include age, sex, stroke severity, and comorbidities;18 a further limitation of our study was therefore the inability to adjust for comorbidities, although inconsistent International Classification of Diseases (ICD-10) coding of comorbidities in routine data is a recognised problem.22 Future linking of AuSCR data with hospital admissions data will enable a greater range of variables to be explored. Several international stroke registries incorporate National Institutes of Health Stroke Scale (NIHSS) data that can be used for adjusting for stroke severity, but training is required to administer the NIHSS.23 The ability to collect NIHSS data was introduced in AuSCR in 2015, but the level of missing data currently undermines its usefulness, whereas “ability to walk on admission” information was available for 90% of episodes. In recent validation work,9 a model based on simple variables (including ability to walk) performed as well as one employing NIHSS and age data; the choice of measure should therefore be based on practical considerations.24 Because there were very few episodes of intracerebral haemorrhage, we included stroke type as a covariate rather than stratifying the dataset, as has previously been recommended by other authors.25

In conclusion, we highlight the importance of using appropriate risk adjustment variables and methods for comparing hospital outcomes for stroke, with particular emphasis on the need to account for stroke severity. Moreover, we have shown the value of clinical quality disease registry data for refining outcome performance measurement in health care. As this is an evolving field, further research into risk adjustment variables and comparison of mortality rates is encouraged.

Box 1 –
Demographic and clinical characteristics for patients admitted to 28 hospitals with at least 200 episodes of care in the Australian Stroke Clinical Registry (AuSCR), 2009–2014

Characteristic | Died at 30 days (n = 2372) | Living at 30 days (n = 13 846) | P

Sex (men) | 1043 (44%) | 7604 (55%) | < 0.001
Age (years) | | | < 0.001
  < 65 | 209 (9%) | 3485 (25%) |
  65–74 | 321 (14%) | 3292 (24%) |
  75–84 | 794 (34%) | 4291 (31%) |
  ≥ 85 | 1048 (44%) | 2672 (20%) |
  Median (IQR) | 84 (76–89) | 75 (65–83) | < 0.001
Country of birth | | | < 0.001
  Australia | 1452 (67%) | 8622 (67%) |
  United Kingdom | 163 (8%) | 1005 (8%) |
  Italy | 133 (6%) | 615 (5%) |
  Other European countries | 240 (11%) | 1315 (10%) |
  Asia | 64 (3%) | 580 (5%) |
  Other countries | 108 (5%) | 782 (6%) |
Identifies as Aboriginal and/or Torres Strait Islander | 15 (1%) | 158 (1%) | 0.024
Previous stroke/transient ischaemic attack | 550 (26%) | 2966 (23%) | 0.001
Index of Relative Socio-economic Advantage and Disadvantage (IRSAD) | | | 0.016
  Quintile 1 (most disadvantaged) | 376 (16%) | 2078 (15%) |
  Quintile 2 | 444 (19%) | 2729 (20%) |
  Quintile 3 | 242 (10%) | 1712 (12%) |
  Quintile 4 | 527 (22%) | 3010 (22%) |
  Quintile 5 (least disadvantaged) | 783 (33%) | 4317 (31%) |
Type of stroke | | | < 0.001
  Intracerebral haemorrhage | 765 (32%) | 1629 (12%) |
  Ischaemic | 1468 (62%) | 11 286 (82%) |
  Undetermined stroke type | 132 (6%) | 897 (7%) |
Cause of stroke known§ | 1054 (47%) | 6511 (49%) | 0.027
Stroke severity: able to walk on admission | 85 (4%) | 4857 (39%) | < 0.001
Patients with multiple episodes of stroke care recorded in AuSCR | 141 (6%) | 694 (5%) | 0.058

∗ Missing data: < 2%. † Missing data: 2–5%. ‡ Missing data: 6–10%. § Based on evidence of a structural, radiological, haematological, genetic or drug-related cause. 7509 patients with a transient ischaemic attack or in-hospital stroke, or who were transferred from another hospital, were excluded from analysis.

Box 2 –
Comparison of ranking of hospitals according to 30-day mortality for stroke by the hospital admission and full registry models*


* The full data for this figure are included in online Appendix 4.

Box 3 –
Summary statistics for goodness of fit of the three models of 30-day mortality rates for 28 hospitals providing at least 200 episodes of care in the Australian Stroke Clinical Registry, 2009–2014

Risk adjustment model | Adjusted for age and sex | Hospital admissions model* | Registry model†

Association between risk-adjusted mortality rate and number of episodes | P = 0.19 | P = 0.29 | P = 0.24
Akaike information criterion (AIC) | 12 451 | 11 694 | 9322
Bayesian information criterion (BIC) | 12 482 | 11 764 | 9405
C-statistic (95% CI) | 0.69 (0.68–0.71) | 0.74 (0.73–0.75) | 0.80 (0.79–0.81)
Likelihood ratio test | P < 0.001 | P < 0.001 | Reference

* Adjusted for age, sex, year of admission, stroke type, Index of Relative Socio-economic Advantage and Disadvantage, Indigenous status, and place of birth (Australia v elsewhere). † Adjusted for history of stroke, stroke severity, and the other variables of the hospital admissions model. The likelihood ratio test compares the different models.

Box 4 –
Funnel plot of risk-adjusted mortality rates for hospitals (Registry model)*


SD = standard deviation. * The numbers for the hospitals indicate their rank according to crude mortality rates (lowest to highest). Registry model was adjusted for age, sex, stroke type, index of relative socio-economic advantage and disadvantage, Indigenous status, country of birth, year of admission, history of previous stroke, and stroke severity.

Assessing the outcome of stroke in Australia

Appropriate risk adjustment of stroke outcome data is needed for assessing and ensuring quality of care

Australia prides itself on providing high quality health care. But how is it measured? A common benchmark in hospitals is the outcome for patients as measured by routinely collected mortality data, with hospitals ranked according to their performance on this measure. However, “league tables” that rank hospitals by crude (unadjusted) mortality rates may not accurately reflect their processes and quality of care if the rates are not adjusted for other factors that can influence outcomes, such as casemix (Box).13

In this issue of the Journal, Cadilhac and colleagues report for the first time Australian mortality rates for stroke (30 days after hospitalisation) that are adjusted for important prognostic factors (covariates) not routinely recorded in hospital admission databases.4 The Australian Stroke Clinical Registry (AuSCR) prospectively collected clinical data on 15 951 patients who were admitted with acute stroke to 28 participating Australian hospitals (18 metropolitan and 10 rural or regional), each of which provided at least 200 episodes of stroke care between 2009 and 2014.4,5 Baseline AuSCR data included information routinely collected by hospitals (age, sex, country of birth, Indigenous status, socio-economic status of postcode, year of stroke, stroke type) as well as additional prognostic covariates (a history of previous stroke; ability to walk on admission). The AuSCR data were linked to 2372 national death registrations, realising an overall crude 30-day mortality rate of 14.6%. The crude 30-day mortality rate ranged between 5.2% and 19.6%, despite similar adherence to evidence-based processes of care in the 28 hospitals (such as treating patients in stroke units).4 Patients who died as the result of their stroke within 30 days of hospitalisation were, on average, older than 30-day survivors, and were more frequently women, unable to walk on admission, and hospitalised for a haemorrhagic or recurrent stroke. After adjusting for prognostic covariates recorded in hospital admission data, the 30-day risk-adjusted mortality rate (RAMR) ranged from 8% to 20% across the 28 hospitals. After adjusting for prognostic covariates recorded in the clinical registry, the 30-day RAMR ranged between 9% and 21%. The ranking of the 28 hospitals according to their risk-adjusted 30-day stroke mortality rates varied according to which covariates were included in analyses, particularly for hospitals with high crude mortality rates. 
The models with the best fit were those that included stroke severity, as indicated by ability to walk on admission, as a covariate; data for this factor are currently recorded only in the AuSCR.4

The authors of the AuSCR study are to be congratulated for overcoming challenges to obtaining data held by the states (hospital data) and by the National Death Index for linkage to a non-governmental national clinical registry, the AuSCR. Their study illustrates the importance of adjusting analyses for key baseline variables (such as stroke severity) when comparing mortality rates for patients hospitalised for stroke. It also highlights the capacity of registries of clinical quality data to inform and complement hospital and national outcome data in the quest to measure, monitor and benchmark patient outcomes. Further, the AuSCR study provides an insight into the potential of clinical registries that systematically collect standardised data about processes of care to identify variations in clinical practice and to assess the appropriateness of care in the context of evidence-based standards and guidelines.6 These data may facilitate the evaluation of the effects of compliance with standards and of variations in care on patient outcomes, and assist in the design of interventions to reduce variation and to improve outcomes.7

In recognition of the potential for clinical quality registries to fill information gaps in the measurement and monitoring of the appropriateness and effectiveness of health care, a national framework has been developed that describes a mechanism for the secure disclosure, collection, analysis, and reporting of individual patient record-level data for high burden clinical conditions, such as stroke.8 Remaining challenges for clinical stroke registries such as AuSCR include the evolving definition9 and coding10 of stroke, ascertaining a high proportion of the eligible patient population, accurate and complete recording of prognostic data by clinical units, measuring outcomes that are important to patients (such as disability and return to usual activities), and providing clinicians with timely feedback that encourages adherence to evidence-based care.

Box –
Prognostic factors that influence outcomes for patients with acute stroke

Systematic

  • Age
  • Sex
  • Ethnic background
  • Socio-economic and employment status
  • Residence (rural and remote v urban)
  • Pre-stroke functional ability
  • How the stroke is defined, diagnosed and coded
  • Pathological type of the qualifying stroke (ischaemic v haemorrhagic)
  • Severity of the qualifying stroke
  • Prevalence of concurrent comorbidities (eg, atrial fibrillation, heart failure, diabetes, prior stroke, smoking)
  • Treatments and quality of care
  • How outcome (eg, disability, handicap, recovery) is defined, diagnosed and coded
  • When outcome is measured

Random

  • Chance factors

Transforming the management of stroke

Effective strategies for improving outcomes require efficient triage and interdisciplinary cooperation

When I commenced work as a junior neurologist, one of the first patients admitted under my care was a woman with a history of atrial fibrillation who presented acutely with a major stroke in her dominant hemisphere, causing aphasia and a dense right-sided hemiparesis. After examining the woman, I explained to her son — one of my colleagues — that the prognosis was poor; should his mother survive, she would probably be left with significant long term disability. He responded that he had just read a report in Nature about removing clots from blood vessels to reperfuse the brain during an acute stroke. The approach seemed rational, although it appeared to rest more in the realm of science fiction than something that would be of practical clinical use anytime soon.

Twenty years later, the time for intervention and clot retrieval in stroke has arrived (Box). Following a series of ground-breaking trials that have established the benefit of mechanical thrombectomy, the management of acute stroke has dramatically changed for the better. Specifically, the natural history of potentially life-threatening stroke has been completely transformed, and patients treated within the optimal time frame are now walking out of hospital with minimal or no deficit.

Although it has long been known that managing stroke patients in a dedicated acute stroke unit achieves better outcomes, the implementation of appropriate models of care is inconsistent, as discussed in this issue of the MJA.1,2 However, the landscape for the standard of care for acute ischaemic stroke has been significantly altered by a recent series of remarkable trials. The Multicenter Randomized Clinical Trial of Endovascular Treatment for Acute Ischemic Stroke in the Netherlands (MR CLEAN) found that treating ischaemic stroke patients with endovascular thrombectomy in addition to providing standard care reduced their level of disability.3 These findings were complemented by other trials, most notably the Extending the Time for Thrombolysis in Emergency Neurological Deficits — Intra-Arterial (EXTEND-IA) trial, driven by colleagues at the Melbourne Brain Centre working together with stroke centres around Australia.4 The EXTEND-IA study investigated stroke patients with evidence of salvageable brain tissue provided by perfusion imaging within 4.5 hours of the onset of stroke. Early endovascular thrombectomy with flow restoration after intravenous alteplase (a tissue plasminogen activator) achieved better outcomes than treatment with alteplase alone. Interestingly, the release of results from the MR CLEAN trial led to the early review of EXTEND-IA data that identified the greater benefit for patients of removing clots from the proximal anterior circulation of the brain. Specifically, reperfusion of brain tissue after vessel occlusion reduced brain infarct growth, and this was associated with a greater clinical benefit for patients both in terms of brain recovery and functional outcomes.

Where do we go from here? These ground-breaking revascularisation trials have further highlighted the critical importance of time as a key determinant of stroke outcome. Delays in the lead-up to patient presentation in the acute setting, including patient transport and the effective triaging of patients between centres that do or do not offer interventional services, can seriously impede efficient treatment. Similarly, neuroimaging is crucial to any subsequent endovascular treatment, as clinical examination cannot distinguish between ischaemic and haemorrhagic stroke. Examination by a trained neurologist working as part of a multidisciplinary acute stroke team, using scales such as the National Institutes of Health Stroke Scale (NIHSS), better correlates the clinical presentation with the vascular territory involved, and therefore the site of blockage, enabling the most effective therapy, avoiding unnecessary intervention, and ultimately improving functional outcomes.5,6 Clinicians must also remain vigilant in their ongoing management of stroke risk factors; clear progress is best represented by the worldwide reduction in stroke mortality, which is more prominent in high income countries with better access to medical services and treatment.7

Determining the most effective implementation of any new treatment approach takes time, but doing so significantly enhances patient outcomes, particularly in the dawning era of precision medicine.8 The real challenge now is to provide the health care systems and personnel required for working through the complexities of establishing centres of neuro-interventional expertise, with appropriate patient triage that makes clot retrieval services available to everyone. The Australian and New Zealand Association of Neurologists has initiated training programs to help registrars acquire interventional skills for managing stroke in addition to their usual neurology training. These programs will require continued dialogue with other specialties, particularly radiology and neurosurgery. As part of this process, a conjoint committee for training in interventional neuroradiology has been established to ensure that these disciplines continue to work together productively in stroke management,9 and that the standard of care continues its upward trajectory, so that all Australians receive the benefits of first class stroke care.

Box –
Computed tomography brain perfusion scan of a patient with stroke, having presented with acute onset of left-sided weakness*


The scan indicates reduced mean transit time and a large area of decreased flow on the right side of the brain, and delineates the tissue at risk, representing potentially salvageable ischaemic brain tissue. These changes are consistent with an acute occlusion of the right internal carotid artery, for which early endovascular thrombectomy is appropriate.

Reducing the burden of neurological disease and mental illness

The key to finding solutions for brain disorders is cooperation and collaboration, from the laboratory to the clinic

Australia is challenged by the rising economic and social costs of neurological disease and mental illness, which together account for one-third of the total disease burden in Australia.1 The financial cost of these disorders — about $45.5 billion annually14 — does not take into account the emotional impact and social isolation they cause. Many are chronic conditions with limited options for even ameliorative treatment, so research into new approaches to their management is urgently needed. Translation of research into improved clinical practice, however, requires a continuum of processes, including basic research, application of research findings, clinical trials, and implementation. Involving both basic researchers and clinicians in this process is crucial to its success. The Australasian Neuroscience Society (ANS; www.ans.org.au) recognises this need both by representing neuroscientists and clinicians in Australia and New Zealand active in neuroscience and mental health research, and by acting as a conduit for clinicians to interact more closely with researchers to achieve their shared goals.

This issue of the MJA highlights examples of current progress in the neuroscience of neurological disease and mental health conditions. As discussed by Koblar and colleagues,5 restoring brain function in people who have had a stroke or incurred other damage to the central nervous system remains an area of unmet need. Australian researchers play significant roles in international efforts to develop regenerative neurology; for example, the 2017 Australian of the Year, Professor Alan Mackay-Sim, was recognised for his work in developing stem cell therapies for people with spinal cord injuries. Australians have long played an important role in developing devices for restoring central nervous system function. For instance, the cochlear implant, invented by Professor Graeme Clark and colleagues at the University of Melbourne in 1978, has restored hearing to nearly 350 000 individuals across the world with sensorineural hearing dysfunction. Australians continue to operate at the cutting edge of the development of devices at the brain–computer interface, such as those described in this issue by Rosenfeld and colleagues.6

The burden of neurodegenerative disorders is rising as the Australian population ages. Dharmadasa and her co-authors7 review advances in the treatment of motor neurone disease, including three ongoing Australian clinical trials of potentially neuroprotective therapies; that is, of interventions that aim to slow the progress of the disease, not just provide symptomatic relief.

2017 promises to be an exciting year for accelerating progress in understanding the human brain. Major research projects seeking to deepen our understanding of its function and to translate this understanding into practical therapies are underway in the United States, Europe, Japan, and China, and the number of participating countries is rapidly expanding.8 Australia itself has a national brain project; developed by the Australian Brain Alliance and coordinated by the Australian Academy of Science, it is a collaboration of 28 organisations (including ANS) involved in brain research.9 The Australian Brain Project aims to understand how the brain encodes, stores and retrieves information, and its goals will be the focus of a proposal to be presented to the federal government in 2018. The Australian Brain Alliance also participated in an historic meeting at Rockefeller University (New York) in September 2016 with the goal of promoting collaboration and cooperation between large scale brain research projects around the world.10

The fundamental brain functions investigated by the members of ANS and the Australian Brain Project are intrinsic to our humanity, and they are often compromised by neurological disease and mental illness. Comprehensive understanding of these processes, and of precisely how and why they are disrupted in disease states, will provide us with new opportunities for improving diagnostics and developing more effective therapies that enhance the lives of the many Australians burdened by these disorders.