
Consistently high incidence of diabetic ketoacidosis in children with newly diagnosed type 1 diabetes

To the Editor: Data from the tertiary paediatric hospitals in Brisbane (Royal Children’s and Mater Children’s Hospitals) support Claessen and colleagues’ letter.1

A total of 1091 children aged < 18 years were initially admitted from 1 January 2001 to 31 December 2011 with a new diagnosis of type 1 diabetes (T1D) (Box). Diabetic ketoacidosis (DKA) was defined as venous pH < 7.3 or serum bicarbonate level < 15 mmol/L in association with hyperglycaemia and ketosis. Severity was defined as mild (pH 7.2 to < 7.3, or serum bicarbonate level 10 mmol/L to < 15 mmol/L), moderate (pH 7.1 to < 7.2, or serum bicarbonate 5 mmol/L to < 10 mmol/L) or severe (pH < 7.1, or serum bicarbonate < 5 mmol/L). Overall, 348 of 1091 children (31.9%; 95% CI, 29.1%–34.7%) presented with DKA over the 11 years studied. Initial analysis suggested that the proportion of children presenting with DKA was increasing over the period (χ2 test for trend, P = 0.005). However, when the 119 children whose DKA status was not recorded were excluded, this trend was no longer significant (P = 0.296), suggesting that the observed trend was a result of case ascertainment bias. To assess this bias further, we analysed the period from 1 January 2006 to 31 December 2011 (during which few patients had DKA status unrecorded) and found no significant trend in DKA presentations (P = 0.272).
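
For illustration, the severity criteria above can be expressed as a simple classification rule. The Python sketch below encodes the quoted pH and bicarbonate thresholds; the function name, and the convention that the worse of the two markers determines the grade, are our own assumptions, and the sketch ignores the accompanying hyperglycaemia and ketosis requirements.

```python
def dka_severity(ph: float, bicarb: float) -> str:
    """Grade DKA severity from venous pH and serum bicarbonate (mmol/L).

    Thresholds follow the definitions quoted in the letter; treating the
    worse of the two markers as decisive is an illustrative assumption,
    and the biochemical DKA definition also requires hyperglycaemia and ketosis.
    """
    if ph >= 7.3 and bicarb >= 15:
        return "no DKA"           # does not meet the acid-base definition
    if ph < 7.1 or bicarb < 5:
        return "severe"
    if ph < 7.2 or bicarb < 10:
        return "moderate"
    return "mild"                 # pH 7.2 to < 7.3, or bicarbonate 10 to < 15

print(dka_severity(7.15, 9.0))    # moderate
```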

A recent Australian study aimed at increasing awareness of T1D resulted in a decreased proportion of children presenting with DKA in the intervention region (15/40 to 4/29; P < 0.03), while in the control region there was no significant reduction in the rate of DKA over the same period (46/123 to 49/127).2

We confirm the high rates of DKA among children first admitted to tertiary paediatric hospitals in Brisbane with a diagnosis of T1D, rates that have not decreased over the past decade. Given that DKA is associated with significant morbidity and mortality, and given recent evidence that improved awareness of T1D can reduce DKA rates at presentation, it seems appropriate to initiate public awareness campaigns on a larger scale.3

Diabetic ketoacidosis (DKA) status and severity in children first admitted to hospital with a diagnosis of type 1 diabetes

Modifying the gluten-free threshold for foods: first do no harm

To the Editor: The gluten-free (GF) diet for people with coeliac disease (CD) is complex and costly, and compliance with it is variable. Coeliac Australia, together with the Australian Food and Grocery Council, is lobbying to increase the mandated gluten threshold for GF foods.1 Since 1995, the situation in Australia has been that there must be “no detectable gluten” in foods labelled “gluten free”. The proposed new standard is “< 20 parts per million (ppm)”. The change has been proposed because food testing has become increasingly sensitive over the years, resulting in fewer foods being considered gluten free. The current detection limit of food testing is about 3 ppm. Unfortunately, the proposed new GF standard may not be safe for patients with CD.

There are few high-quality studies determining a safe gluten intake for patients with CD, although it is known that tolerable amounts vary between patients.2 In one study, 42 patients with CD who were eating a GF diet received 0, 10 or 50 mg of gluten daily for 3 months (10 mg in 500 g of food represents 20 ppm; 10 mg of gluten is ingested in 1/250th of a slice of bread containing 2.5 g of gluten). Patients’ duodenal mucosae were examined histologically before and after the gluten challenge. The study concluded that, for patients with CD, the daily dietary intake of gluten should be < 50 mg.3 This study has been interpreted as suggesting that 10 mg of gluten daily is safe.1 Regrettably, the patients in the study were a selected group, possibly less sensitive to gluten, and, of those receiving 10 mg of gluten daily, one had a symptomatic relapse and several showed worsening CD on histological examination.3 It is therefore surprising that this study has been particularly influential in recommending a GF standard of < 20 ppm.1
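
The parts-per-million arithmetic underlying these figures is easy to check; the short sketch below simply restates the numbers quoted above (the 10 mg daily dose, an assumed 500 g daily food intake, and 2.5 g of gluten per slice of bread).

```python
# Verify the gluten arithmetic quoted above (all figures from the text).
daily_gluten_mg = 10           # daily challenge dose in the study
food_intake_g = 500            # daily food intake assumed for the ppm figure
bread_slice_gluten_mg = 2500   # 2.5 g of gluten per slice of bread

ppm = daily_gluten_mg / (food_intake_g * 1000) * 1_000_000
slice_fraction = bread_slice_gluten_mg // daily_gluten_mg

print(f"{ppm:.0f} ppm")                 # 20 ppm
print(f"1/{slice_fraction} of a slice") # 1/250 of a slice
```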

In 2011, a comprehensive United States Food and Drug Administration (FDA) safety report concluded that gluten levels in food of < 1 ppm are required to protect the greatest number of patients with CD.4 Despite this, a long-awaited FDA ruling, released on 2 August 2013, sets the GF standard at < 20 ppm.5 This formalises tighter standards than previously existed in the US. Establishing a standard is complex, requiring consideration of issues such as industry and consumer concerns, industry regulation, economics, international precedent and safety.

The concept of doing no harm in health care is paramount. In Australia, where concerns about the availability of GF foods have been raised,1 it may be prudent to allow GF foods an increase in “measurable” gluten (eg, from undetectable to < 1–3 ppm). By contrast, increasing the “permissible” level of gluten (from undetectable to < 20 ppm) will increase overall gluten ingestion in a GF diet. For an undetermined proportion of patients with CD, this will lead to adverse health outcomes and generate additional health care costs.

Developing a global agenda for action on cardiovascular diseases

Australian health policy can and should address, as a core aim, cardiovascular health in less economically advanced nations

Cardiovascular diseases have snatched the mantle of top-priority global health problem from infectious diseases including tuberculosis, malaria and HIV/AIDS. This is because of the deaths attributable to cardiovascular diseases, the years of life lost, and the longer-term disability from heart failure and stroke.1 While deaths due to cardiovascular diseases among people younger than 65 years have fallen dramatically in Australia over the past 50 years, in less economically advanced communities one-third of cardiovascular deaths occur among people younger than 65 years.2

Cardiovascular diseases are potent widow- and orphan-makers. Particularly in developing communities, they can precipitate poverty. The cost of care in communities lacking affordable health insurance and effective primary care can be catastrophic.

The effect on a nation’s productivity and growth is no less disastrous. Every 10% rise in chronic non-communicable diseases is estimated to bring a 0.5% decrease in economic growth.3 It has also been estimated that deaths in developing countries attributable to chronic disease will grow from 46% of all deaths in 2002 to 59% of all deaths in 2030, or to more than 37 million lives lost per year.3

Why the delayed recognition? These circumstances have been many decades in the making. Three principal reasons for global inaction over those years stand out.

First, in many countries maternal and infant mortality rates are high, visible, tragic and immediate, and a natural priority for scarce health care resources. Such countries that now also face the cardiovascular crisis are war-weary from fighting infant and maternal mortality, tuberculosis, malaria and HIV/AIDS. But great gains have been made in these conditions, and it is now imperative that we encourage and support those nations to address chronic diseases.

Second, perception of cardiovascular diseases, in relation to human behaviour, differs radically from that of infectious diseases. As with type 2 diabetes, obesity and chronic lung disease, cardiovascular diseases occur principally among older people, in social conditions of fast economic development and generally favourable, poverty-reducing urban development. They depend on human behaviour — smoking tobacco, overeating fats and sugars, abandoning traditional (usually healthier) nutrition, and underexercising. Potential donors who wish to improve international health consider cardiovascular diseases off limits for funding, since these diseases are “the sufferers’ fault” or diseases of old age. It is hard to convince major donors that such adverse individual health behaviour is largely determined by domestic, community, work and economic environments and that older people matter.

Preventive strategies for chronic disease that respond to the individual and the social environment behind these disorders appear soft and diffuse. They are complex compared with, say, an immunisation program with its clean start, jab and finish. Interest groups that profit from an environment that promotes chronic diseases, especially cardiovascular diseases, resist efforts that encourage change.4

But these detached, judgemental and indolent attitudes are changing, stimulated by a 2011 United Nations meeting on the global chronic diseases crisis.5 The UN meeting resulted from years of advocacy by a few governments, including Australia’s, and non-government agencies concerned about cardiovascular diseases, diabetes, cancer and chronic respiratory disease — the NCD Alliance. The Lancet has shown admirable academic leadership in non-communicable diseases research by creating an action group, publishing special issues and providing support for international meetings.

Often the recognition of a crisis jolts us to take the matter seriously, and so it is with cardiovascular diseases. A political declaration from the UN meeting articulated goals and strategies for preventing and controlling non-communicable diseases over the following 5 years. This has pushed international agencies such as the World Health Organization to act. The WHO is responding with global strategies: enhancing tobacco control, addressing dietary salt reduction, nominating essential medicines (including antihypertensives), and advocating for fuller and more stable primary care services everywhere.

In addition, chronic diseases are being reconceptualised, and are now frequently perceived as an impediment to social development, thus adding them to the agenda for discussion concerning the next steps to be taken after the Millennium Development Goals conclude in 2015.6

The third factor behind our relative inaction, despite indisputable progress, has been those massive holes, only now beginning to close, in knowledge about what to do, and how to implement the knowledge we have.

Although we have had the major risk factors for cardiovascular diseases nailed for the past 50 years, and can use them to explain most of the variance in cardiovascular disease frequency, more basic and clinical research is required alongside health services research to translate these insights into effective policy, population interventions and individual behaviour change.

Fruitful fields of inquiry include events in early life capable of setting the later epigenetic, physiological and behavioural trajectories for chronic disease.7

Australia has generally done well with cardiovascular disease control, although onset and mortality occur a decade earlier in our Indigenous communities than in the rest of the population.8 Overall, rates of deaths due to coronary heart disease in Australia fell by 83% between 1968 and 2000, as newer medical and surgical interventions have exerted a spectacular positive influence on individuals, and lifestyle changes have contributed positively at the individual and population levels.9

Tobacco smoking is now less common in Australia than in most other economically advanced nations. Our efforts in cardiovascular disease prevention and management in urban and rural Indigenous communities, although incomplete, might apply to other communities. In a spirit of mutual learning, we should share our experience with these efforts.

We know well the battles over entrenched behaviour, practices and social structures that nourish risk factors. The tobacco war is by no means over, and the food and alcohol wars are just beginning here and elsewhere. In the United Kingdom, the government has recently suspended the push for tobacco plain packaging legislation,10 and the same is likely to happen to a minimum alcohol pricing policy.11

In Africa, rapid modernisation will, by the middle of this century, potentially lead not only to food self-sufficiency but also to surplus food for export.12 Although this will alleviate starvation, it will spell disaster for rapidly urbanising populations where, if the previous experience of developed societies is any model, cardiovascular disease rates will increase quickly.

Translating knowledge and science into resource-poor (or even just less-developed) settings is culturally, politically and logistically difficult. But as a good and progressive global citizen, Australia can still advocate for access to essential medications, meaningful aid and public health support. Such strategies have worked to combat infectious diseases globally, but now they must address non-communicable diseases.

Australia has much expertise and experience to share in international efforts to prevent and control cardiovascular and other chronic diseases. If this challenge were embraced by both major parties in the upcoming federal election, it would be pleasing indeed.

The impact of trans fat regulation on social inequalities in coronary heart disease in Australia

To the Editor: The evidence that industrially produced trans fatty acids (TFAs) increase the risk of coronary heart disease is compelling, and it is widely agreed that their use in food products should be minimised.1–3 Dietary TFAs are generally found in higher quantities in “unhealthy” food products,4 consumption of which is also found to follow predictable socio-demographic patterns.5 Thus, although the average TFA intake for Australians is relatively low, socioeconomically disadvantaged people are likely to disproportionately represent those with above average intakes.

Mandatory labelling of TFA content on all packaged foods in Australia has recently been advocated,1 so that individuals can make informed decisions about purchasing products with excessive levels of TFA. However, while such an intervention may reduce TFA intake at the population level, it is likely to increase social inequalities in TFA consumption and, therefore, inequalities in deaths from coronary heart disease. The reasons for this are as follows. First, research has shown that people who have healthier diets and who are from higher socioeconomic backgrounds are more likely to seek out and use food labels to make healthier choices,6 while those from more disadvantaged backgrounds who do not understand or act on nutrient labelling are much less likely to benefit. Second, mandatory TFA labelling may prompt food manufacturers to brand their products as “TFA-free”, which may bestow an undeserved “health halo” on energy-dense nutrient-poor foods.3 This “health halo” effect is likely to disproportionately influence the purchasing decisions of lower socioeconomic groups, among whom nutrition knowledge tends to be lower than among higher socioeconomic groups.4,5

The ability to replace industrially produced TFAs with healthier alternatives at minimal expense to consumers has prompted jurisdictions such as Denmark and New York City to introduce mandatory limits on the total amount of TFA permitted in all food products. Recent evaluation of the New York City policy showed a significant reduction of TFA in restaurant products, without a corresponding increase in saturated fat, and this effect was similar across high-income and low-income neighbourhoods.7

It is time that Australia introduced strong regulation to reduce TFA intake for all Australians.

Online screening for alcohol and other drug problems: an acceptable method for accessing help

To the Editor: Alcohol and other drug (AOD)-related harms are considerable,1 yet individuals with AOD disorders are reluctant to seek treatment.2,3 Barriers to help-seeking include issues of stigma and a lack of understanding of what treatment involves. One possible approach to improving access is the use of online tools that provide advice about available help options. While online screening for alcohol has shown value as a complement to AOD-related treatment in a Swedish sample, limited work has been conducted examining such utility for AOD-related problems within an Australian context.4

From 4 December 2012 to 10 January 2013, an integrated online screening tool to identify problematic AOD use, psychological distress and subjective wellbeing (comprising the Alcohol Use Disorders Identification Test, the Drug Use Disorders Identification Test, the Kessler Psychological Distress Scale [K10] and a three-point Likert scale measuring subjective quality of life) was posted on the Turning Point website (http://www.turningpoint.org.au) and promoted through Counselling Online (http://www.counsellingonline.org.au), a national 24/7 online counselling service for AOD issues. No identifying information was collected, and the research was approved by the Eastern Health Research and Ethics Committee. Participants completing the screen received individualised feedback about the severity of their reported AOD use and suggestions for seeking further support (eg, relevant self-help materials and online, telephone and face-to-face counselling).
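
As an illustration only, the sketch below shows how an integrated screen of this kind might map scores to individualised feedback. The cut-offs used (AUDIT ≥ 20 for likely dependence, AUDIT ≥ 8 for risky drinking, K10 ≥ 22 for high distress) are commonly published values rather than figures reported for the Turning Point tool, and the feedback wording is hypothetical.

```python
# Illustrative mapping from screening scores to feedback. The cut-offs
# (AUDIT >= 20 likely dependence, AUDIT >= 8 risky drinking, K10 >= 22 high
# distress) are commonly published values and are assumptions here, not
# details taken from the Turning Point tool itself.
def screening_feedback(audit_score: int, k10_score: int) -> list[str]:
    advice = []
    if audit_score >= 20:
        advice.append("Likely alcohol dependence: consider telephone or face-to-face counselling.")
    elif audit_score >= 8:
        advice.append("Risky drinking: self-help materials or online counselling may help.")
    if k10_score >= 22:
        advice.append("High psychological distress: professional support is recommended.")
    return advice or ["No immediate concerns identified on these measures."]

print(screening_feedback(audit_score=23, k10_score=28))
```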

After 5 weeks of the pilot program, 288 screens were completed (from 900 website visits), of which 35.9% met criteria for likely alcohol dependence and 22.6% for likely drug dependence. Although overall physical health and quality of life scores were moderate, K10 scores indicated that 56.1% of respondents were experiencing high levels of psychological distress.5 Respondent demographics are shown in the Box.

All participants reported that the screen was helpful and acceptable in length, and provided a positive experience in clarifying further help options. Almost half (47.8%) of the participants reported they would seek further professional support after completing the screen, as directed by the help options provided.

The results show that an integrated online screening package may increase access to professional support for at-risk populations. It provides an acceptable and accessible mechanism for self-screening, acting as a bridge to a diverse range of treatment and support options for nearly half of all those screened.

Basic demographics of Turning Point website users accessing online screening

  • most site users (68.3%) were 20–44 years old;

  • 47.4% were men, 51.9% were women and 0.7% reported “other” as their sex;

  • 2.7% identified as Aboriginal and/or Torres Strait Islander;

  • 71.8% reported their cultural background as Australian, followed by 8.7% European, 4.5% American, 3.8% British and 2.8% New Zealander;

  • the tool had both national and international exposure: 93.4% of respondents were located in Australia (of those, 61.3% were in Victoria, 14.3% in New South Wales, 8.0% in Queensland, 4.2% in South Australia, 4.2% in Western Australia and 0.7% in Tasmania; 7.3% did not specify their state), 4.2% were in the United States, 0.7% were in the United Kingdom and 0.3% each in Canada, New Zealand and Kenya;

  • the respondents’ education level was high: 56.8% completed university, 20.5% had a secondary education, 18.2% had attained a diploma and 4.5% had a trade certificate;

  • 75% were employed; of those, 63% were in full-time employment, 24.2% part-time and 12.2% casual; and

  • 38.6% had children.

Public transport – just the ticket

After writing this column for the past 10 years I can report that the email feedback that I receive is polarised.

Firstly, I receive many requests to review high-performance eco-unfriendly vehicles.

German cars, V8s – anything that goes fast – and who cares how much fuel it uses to get there?

I equally receive requests from green doctors to review relatively environmentally-friendly cars.

Hybrids, electric cars, anything that goes a long way without producing too much CO2.

In deference to the second group I thought that this month I’d take a different look at how to get from A to B.

That is, no car review this time, but a peek at the public transport system.

To set the benchmark as high as possible, I road-tested the public transport system in Hong Kong to see what might lie ahead for all of us in a world without cars.

My journey started at the airport, where I bought an Octopus Card for 300 Hong Kong dollars (about $A42).

This included an each-way trip on the Airport Express to the city and unlimited MTR (train and bus travel) for three days.

Oh, and as an added bonus, the Airport Express part of the journey also includes free transfers to and from your hotel on a local bus.

Once purchased, the Octopus Card can be topped up for travel on other transport, such as trams and ferries.

And the same card can be also used to pay for small value items at hundreds of locations, and can even be used in some taxis.

The Octopus Card is so smart that it only needs to be in the vicinity of the reader, and doesn’t even have to leave your wallet or purse to be read.

Public transport in Hong Kong is unbelievably cheap, with a scenic bus trip across the island to Stanley only costing $A1.10, and travelling the whole 13 kilometres of historic tramways costs only 32 cents.

The Star Ferry to Kowloon across the harbour only costs 35 cents each way, and the views of the skyline and the laser light show are free.

When using public transport in Hong Kong there are some idiosyncrasies to master, such as on buses you pay on the way in and on trams you pay on the way out.

While all the signs are bilingual, there is still room for confusion as there are two stations on the train network with what seems like the same name (Wan Chai and Chai Wan).

In providing what is arguably the best public transport system in the world, it does help that Hong Kong is among the most densely populated cities on earth.

There are twice as many skyscrapers in Hong Kong as there are in New York.

Hong Kong also boasts more Rolls Royces per capita than anywhere else but, apart from the trip to Stanley, there isn’t really anywhere to drive to.

For those who like walking, Hong Kong also boasts the world’s longest outdoor covered escalator system.

At 800 metres long, it takes locals downhill from Soho in the morning, and at 10.15 am it reverses direction to take them home.

Hong Kong is a great city, and like all great cities you don’t need a car to get around.

Safe motoring,

Doctor Clive Fraser

Email: doctorclivefraser@hotmail.com

Fewer mixed signals, more green salad

The communication of public health expertise for the benefit of individuals and the community is necessarily complex in its content and delivery. Isolated public health pronouncements tend to be lost among all the information people need to assimilate in daily life and their disinclination to be told what to do. Public health messages need commitment to sustained, inclusive and responsive strategies that engage the whole community.

As with campaigns against tobacco use, product industries and suppliers are important elements to engage as part of a public health strategy. In an article in our series leading up to the federal election, Magnusson and Reeve (doi: 10.5694/mja13.10843) say that much more has to be done to encourage people to make healthier food choices to prevent chronic disease. They argue that food industry self-regulation has (perhaps inevitably) failed to decrease the consumption of unhealthy foods or people’s exposure (particularly children’s) to their marketing. The food industry, with commercial interests that run up against what governments want, sends mixed signals about its willingness to support healthy eating. Magnusson and Reeve do not advocate direct legislative controls on marketing and availability. Instead, they call for legislation to “scaffold” the industry self-regulation already in place, so that no incentive is given to companies’ efforts to present their unhealthy products. In this way, self-regulation becomes transparent and accountable, building community trust.

Encouraging desired behaviour in health does not always involve an iron fist in a velvet glove. Positive incentives can also get important players in health to act in the community interest. Cheng and Nation (doi: 10.5694/mja13.10657) highlight the importance of regulatory “loosening” to encourage more pharmaceutical company investment in antimicrobial research and development. Resistance to antimicrobials means that new agents (of which there are few in development) are needed, and old drugs that are now finding a renewed role need to be redeveloped. The authors cite the United States Food and Drug Administration’s moves to license agents based only on small, less expensive trials, but to limit their use to carefully selected patient subgroups, backed by robust postmarketing surveillance. It is an interesting, but in the end complementary, message in the context of wholesale rationing and close supervision of antimicrobial usage.

Nothing is a more potent signal for action than success. Tasmania can now confidently declare that it is free of human hydatid disease, after a successful eradication program largely concluded in 1996. O’Hern and Cooley (doi: 10.5694/mja12.11745) show that new notifications of human hydatid disease between 1996 and 2012 cannot be attributed to exposure in this period. Does this demonstrate that eradicating human hydatid disease from the rest of Australia is achievable?

New movements towards better care also indicate what good clinical care and good clinical research need — each other. In the last article to emanate from the MJA Clinical Trials Research Summit, Winship and colleagues (doi: 10.5694/mja13.10381) argue that investigator-led trials geared towards determining the most effective care deserve direct funding from the health system, and that such trials should be embedded in the course of clinical care. This is not simply a grab for more research money. It acknowledges that day-to-day high-quality care depends on high-quality trial evidence to iteratively inform practice. This is an important message for the profession and the community about how good care and good research should be operationally inseparable.

Medicine and health depend on delivery of signals to the community that are informed by the medical profession. The time of division and antagonism between government and commercial interests has now passed, if it ever existed. Instead, to give the community clear signals about taking their health destiny into their own hands, with the support of the medical profession, we need collaboration and partnership to transform opposing agendas into mutual interests, and to highlight successes.

Should we screen for lung cancer in Australia?

Systematic screening reduces mortality, but is it the best way to go?

Lung cancer is the leading cause of cancer death in Australia. Late diagnosis of advanced disease contributes to the poor 13% 5-year survival rate associated with lung cancer. However, the recently updated United States National Lung Screening Trial (NLST) showed a 20% reduction in lung cancer mortality with early detection by low-dose computed tomography (CT) screening.1,2 In light of these results, should people in Australia at high risk of lung cancer now undergo screening?

The NLST randomly allocated 53 000 participants to three rounds of annual screening with either chest CT or chest x-ray, with follow-up for a further 3 years.1,2 Unlike population-based screening programs for other common cancers, the NLST only enrolled high-risk participants (ie, current smokers or former smokers who had quit within the past 15 years, aged between 54 and 74 years, with > 30 pack-years). Adherence to screening was over 90%. At the initial screen, three times more stage 1A tumours (< 3 cm, no metastases) were detected on CT than on chest x-ray; most of these were resected.2 However, the absolute reduction in the risk of death from lung cancer was small (0.33%), with 320 participants being screened annually for 3 years to avoid one lung cancer death over 6 years.1

Harm did occur.3 The cumulative positive scan rate approached 40% over 3 years.1 While it is not directly comparable because of different screening intervals, this rate is several times higher than that in other screening programs, such as the Australian National Bowel Cancer Screening program, where only 7.8% of people screened biennially had a positive result on the screening test (faecal occult blood test).4

In the NLST, over 95% of positive scans were shown to be false positives — that is, cancer was not present. However, most false positives required only repeat scanning. Invasive investigations were needed in a minority of instances, and the risk of fatal complications was small (0.03% within the CT arm of the study), reflecting the expert care provided to participants at trial centres. Radiation-induced cancer death could not be measured in the 6-year follow-up of the NLST, but is estimated at 0.04% 10–15 years after screening at the relevant levels of radiation exposure.3

Lung cancer screening therefore appears efficacious under optimal conditions and in expert hands. The absolute benefit, however, is modest and may be rapidly eroded by small decrements in effectiveness, or minor increments in harm.3 For example, a doubling of radiation risk (with the use of older scanners) and fatal complications (at inexperienced centres) coupled with a halving of screening effectiveness (by a lack of expert treatment pathways) could completely negate all benefit. Therefore, despite ready access to CT scanners, screening should not be performed sporadically in the absence of a systematic screening infrastructure.

Before Australia can embark on systematic screening, issues of local feasibility must be addressed. Many such questions will be answered by the Queensland Lung Cancer Screening Study, which is now midway through recruitment.5 This study adapted the NLST protocol to an Australian population, and will inform the implementation of future screening.

Cost presents a more formidable challenge. Of the approximately two million individuals in Australia aged between 54 and 74, roughly 400 000 were smokers 15 years ago, and many continue to smoke. Not including infrastructure, the cost of a screening program comprising three annual CT scans (with downstream tests and treatments) was calculated at $16.5 million per 10 000 individuals screened (2002 prices).6 Screening 400 000 individuals over 3 years would therefore cost $660 million. Assuming a screening uptake rate of 75%, annual costs would amount to $165 million.

Is such expenditure cost-effective? Dividing $660 million by an estimated 1250 lives saved gives a cost of $530 487 per death averted. Assuming a sustained benefit from screening, each patient whose death from lung cancer is averted gains 13 life-years.7 Thus the cost per life-year gained is in the region of $40 000, which approaches the cost-effectiveness of biennial bowel screening8 or cervical screening.9
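
The arithmetic behind these estimates is straightforward to reproduce; the sketch below uses the figures quoted above (the quoted $530 487 presumably derives from unrounded inputs, so the rounded calculation gives about $528 000).

```python
# Reproduce the screening cost arithmetic quoted in the text.
cost_per_10000 = 16.5e6        # $ per 10 000 people screened over 3 years (2002 prices)
eligible = 400_000             # high-risk individuals, as estimated above

program_cost = cost_per_10000 * eligible / 10_000       # about $660 million over 3 years
annual_cost = program_cost / 3 * 0.75                    # 75% uptake: about $165 million per year

deaths_averted = 1250
life_years_per_death_averted = 13
cost_per_death_averted = program_cost / deaths_averted   # about $528 000 (text: $530 487)
cost_per_life_year = cost_per_death_averted / life_years_per_death_averted  # about $40 000

print(round(program_cost / 1e6), round(annual_cost / 1e6),
      round(cost_per_death_averted), round(cost_per_life_year))
```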

However, there is already a more powerful method to reduce mortality in this high-risk population. Primary prevention is far more effective than screening, by at least an order of magnitude. The direct costs of smoking cessation interventions in 2003 Australian prices were between $1000 and $4000 per successful quitter, depending on the combination of cessation techniques.10 The cost of smoking cessation interventions per life-year gained therefore ranges between $250 and $1000, because smoking cessation adds 4 years of life to each quitter in their early sixties.11 Younger quitters derive even greater benefit.11
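
The corresponding calculation for smoking cessation, using the figures just quoted, makes the order-of-magnitude comparison explicit.

```python
# Cost per life-year gained from smoking cessation, using the figures above.
cost_per_quitter = (1000, 4000)   # direct cost range per successful quitter (2003 prices)
life_years_gained = 4             # life-years gained per quitter in their early sixties

for cost in cost_per_quitter:
    print(f"${cost} per quitter -> ${cost / life_years_gained:.0f} per life-year gained")
# $1000 -> $250 per life-year; $4000 -> $1000 per life-year,
# roughly 40 to 160 times cheaper than the ~$40 000 per life-year for screening.
```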

Formal up-to-date costings for screening should be undertaken given that available Australian models are a decade old. However, national lung screening is likely to strain health care expenditure, already stretched by expansions to breast and bowel screening programs. While systematic screening would likely save some lives, its running costs may be equivalent to the annual expenditure on all lung cancer care.12

In the United States, alternatives to full government support are being suggested, such as partially or fully self-funded screening, or tobacco taxation. In Australia, the question for debate is whether screening should be implemented at all, until effectiveness and cost-effectiveness are substantially enhanced. Currently, smoking cessation is far ahead on both counts. Thus, greater emphasis (and more funding) should be directed towards intensified tobacco control and sustained quitting.

For now, in the absence of a coordinated nationwide program, we caution that sporadic lung screening has the potential for harm rather than benefit. We propose instead that smokers should be vigorously directed towards quitting.

Health outcomes of a subsidised fruit and vegetable program for Aboriginal children in northern New South Wales

In high-income countries, lower socioeconomic status is associated with both higher prevalence of non-communicable diseases and less-healthy dietary intake.1 In this context, promoting healthier nutrition, particularly increasing the intake of fruits and vegetables, has become an important public health priority.2 For those on low incomes, it has been argued that the cost of healthier foods is an important barrier to improving nutrition.3 Though not widely implemented in Australia, food subsidy programs are one strategy with the potential to reduce socioeconomic inequalities in dietary intake.

In 2005, a rural Aboriginal community-controlled health service initiated a program for providing subsidised fruits and vegetables to improve nutrition among disadvantaged Aboriginal families. This program aimed to engage families in preventive health care in partnership with the health service while also addressing the barrier of the cost of healthier food choices.

Our previously published evaluation of this program demonstrated improvements in biomarkers of fruit and vegetable intake among children.4 We were also interested in whether there were short-term health benefits of this program, which may have been indicative of enhanced functioning of the immune system due to improved nutritional status.5

Here, we report on whether participation in this fruit and vegetable subsidy program in northern New South Wales was associated with short-term improvements in the health of children in participating families using a number of markers, including any changes in episodes of illness, episodes of common clinical conditions, prescription of antibiotics and the prevalence of anaemia and iron deficiency.

Methods

The fruit and vegetable subsidy program

In 2005, the Bulgarr Ngaru Medical Aboriginal Corporation established a fruit and vegetable subsidy program for low-income Aboriginal families in the Clarence Valley, NSW. The program combined annual health assessments, including dental and hearing check-ups, with receipt of a weekly box of subsidised fruits and vegetables. Participating families collected boxes of seasonal fruits and vegetables (worth $40 for families with 1–4 children, or $60 for families with ≥ 5 children) from local greengrocers, making a copayment of $5. Complementary seasonal recipes and practical cooking and nutrition education sessions facilitated by dietitians were provided. This is an ongoing program in the Clarence Valley; however, our evaluation involved new families receiving weekly boxes of fruits and vegetables over 12 months, with children having health assessments at baseline and after 12 months. Recruitment and baseline assessments were undertaken between December 2008 and September 2009, and follow-up assessments were completed between December 2009 and September 2010.

Additional funding enabled the Galambila Aboriginal Health Service in Coffs Harbour and the Giingan Darrunday Marlaanggu Aboriginal Health Clinic at Bowraville in the Nambucca Valley to institute similar fruit and vegetable subsidy programs. These health services also participated in this evaluation study. The availability of and arrangements with greengrocers varied between the communities. In Coffs Harbour, families received vouchers from the health service, which they redeemed at the greengrocer by selecting their own fruits and vegetables. In the Nambucca Valley, the greengrocer was in a different town to the health service, so the health service staff collected and delivered the boxes of fruits and vegetables to families at their homes and collected the $5 contribution from them.

Participants

The participants were low-income (ie, unemployed or receiving pensions) Aboriginal families with one or more children ≤ 17 years of age who were regular patients at the respective health services. Many of the children had an identified nutrition risk (eg, underweight or overweight, chronic or recurrent infections) or presented frequently with episodes of illness to the health service. Parents or carers provided written informed consent and agreed to their children having annual health assessments, including research evaluation assessments. Potential participants were identified by staff using the criteria described above and were invited to join the program. At Bulgarr Ngaru, there was a waiting list of eligible families who wanted to participate, but numbers were limited by available funding.

Data collection and analysis

Retrospective health records audits were used to compare the 12 months before participation in the program with the initial 12 months during participation. These audits were only completed if records for the entire 24 months were available. Health records were reviewed from Aboriginal health services, local hospitals and any other nominated general practice. The number of visits to any health service for illness or preventive health activities, the number of episodes of common clinical conditions, the number of visits to hospital emergency departments and the number of antibiotic prescriptions were compared during each 12-month period.

In addition, each participant had a health assessment, based on the Medicare Benefits Schedule Indigenous Child Health Check, before participation and 12 months after joining the program. For all participants, height and weight were measured at each health assessment, and non-fasting venous blood samples were obtained to assess haemoglobin and iron status. Height was measured without shoes or thick socks using a Seca 214 portable stadiometer or S&M Instrument Co wall-mounted stadiometer. The participant stood with the heels together and the heels, buttocks and upper part of the back touching the upright of the stadiometer. Children under 3 years who were unable to stand unaided were measured supine using a Seca 210 baby measuring mat on a firm surface. Weight and body fat were measured using a Tanita UM030 Body Fat Monitor, with participants wearing light clothing only, with pockets emptied and shoes and socks removed. Body fat was measured only for children in the Clarence Valley aged ≥ 7 years, as per the Tanita recommendations. Children < 2 years who were unable to stand unaided were weighed on a Soehnle Professional Babyscale 7725. Body mass index (BMI) in kg/m² was calculated for children aged 2–17 years. Blood samples collected from participants in the Clarence Valley were analysed at the Grafton Base Hospital pathology laboratory. Haemoglobin was analysed on a Roche Diagnostics Sysmex XT-2000i haematology analyser. Serum iron and serum ferritin were analysed on a Roche Diagnostics Cobas Integra 800 chemistry analyser. Blood samples collected in Coffs Harbour and the Nambucca Valley were analysed at Symbion Laverty Pathology, Coffs Harbour. Full blood counts were analysed on a Sysmex XT-2000i haematology analyser. Serum ferritin assays were performed on the Siemens ADVIA Centaur XP automated immunoassay system. Serum iron was measured on the Siemens ADVIA 2400 chemistry system.

Statistical analysis

The mean and 95% confidence interval of changes in the number of health service visits, common clinical conditions and antibiotic use, anthropometric measurements, and levels of haemoglobin, iron and ferritin were evaluated in IBM SPSS Statistics, version 19, using a paired-sample t test and a general linear model to adjust for sex, age and community. The mean changes in these outcomes were assessed overall and by community, owing to differences in program implementation in each community. The analysis was based on complete data, with no imputation for missing values. Based on an international classification of BMI centiles for age,6 the proportions of children who were underweight, normal weight, overweight and obese before participation were compared with the proportions after participation using the Stuart–Maxwell test of marginal homogeneity. The proportions of children with low haemoglobin, ferritin and iron before and after participation were compared using the McNemar test.
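
For readers who wish to reproduce this style of analysis, a minimal sketch of the paired and categorical tests described above is given below using scipy and statsmodels; the toy data, table layouts and variable names are illustrative assumptions, not study data.

```python
# Minimal sketch of the paired and categorical tests described above,
# using scipy and statsmodels; the arrays below are illustrative only.
import numpy as np
from scipy import stats
from statsmodels.stats.contingency_tables import mcnemar, SquareTable

# Paired t test on a continuous outcome measured before and after the program
before = np.array([125.0, 118.0, 130.0, 122.0, 127.0])   # eg, haemoglobin, g/L
after = np.array([128.0, 121.0, 129.0, 126.0, 130.0])
t_stat, p_paired = stats.ttest_rel(after, before)

# McNemar test for a paired binary outcome (eg, anaemia before vs after)
# Rows: before (no/yes); columns: after (no/yes)
anaemia_table = np.array([[110, 5],
                          [10, 4]])
p_mcnemar = mcnemar(anaemia_table, exact=True).pvalue

# Stuart–Maxwell test of marginal homogeneity for weight categories
# (underweight/normal/overweight/obese) before vs after
bmi_table = np.array([[4, 2, 0, 0],
                      [1, 78, 4, 0],
                      [0, 3, 15, 2],
                      [0, 0, 2, 14]])
p_stuart_maxwell = SquareTable(bmi_table).homogeneity().pvalue

print(p_paired, p_mcnemar, p_stuart_maxwell)
```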

Ethics

Ethics approval was obtained from the University of Melbourne Human Research Ethics Committee, the University of South Australia Human Research Ethics Committee, the Aboriginal Health and Medical Research Council of NSW and the North Coast Area Health Service Human Research Ethics Committee. Community consent was obtained from the boards of the three participating health services. Each child’s pathology results were discussed with parents or carers, and overall summary results were discussed in community focus groups in the Clarence Valley.

Results

The demographic characteristics of 174 children who participated in the fruit and vegetable program are presented in Box 1. Of these, 167 children had an initial health assessment including anthropometry completed at baseline.

Retrospective clinical audits were completed for 167 children whose families received at least one box of subsidised fruits and vegetables. Seven children did not have clinical audits: three whose families moved from the area, and four whose families were withdrawn from the program for non-compliance with initial assessments.

After 12 months, 143 children had follow-up health assessments. Of those who did not complete follow-up assessments, nine were from families who moved from the area, nine failed to attend appointments and 13 were from families who dropped out of the program. The median period between baseline and follow-up health assessments was 370 days (interquartile range, 354–407 days). In the Clarence and Nambucca Valleys combined, 30 of 43 families collected 75% or more of the fruit and vegetable boxes available to them over the 12 months. These data were not available for Coffs Harbour.

Anthropometric changes

At the initial assessment of 134 children aged 2–17 years, 4.5% (6) were underweight, 67.2% (90) were normal weight, 14.9% (20) were overweight and 13.4% (18) were obese. Of 125 children aged 2–17 years who were reassessed after 12 months, 4.0% (5) were underweight, 66.4% (83) were normal weight, 16.8% (21) were overweight and 12.8% (16) were obese. There were no significant differences in the proportion of children in each weight category after the fruit and vegetable program compared with baseline (χ2[3,125] = 1.33; P = 0.721). There was also no significant change in the mean percentage body fat after 12 months on the program compared with baseline (22.5% versus 22.1%) among the subgroup of 22 children aged ≥ 7 years who had this assessed.

Health outcomes

The unadjusted data from clinical audits for the overall sample showed that during program participation the mean annual numbers of visits to any health service for illness, hospital emergency department attendances and oral antibiotic prescriptions were significantly lower (P = 0.037, P = 0.017 and P = 0.001, respectively) (Box 2). There was also a non-significant reduction in episodes of pyoderma during program participation (P = 0.093). After adjustment for sex, age and community, only the reductions in illness-related health service or hospital visits and in prescribing of oral antibiotics remained statistically significant (Box 3). An additional adjustment of change scores for the baseline values in the covariate-adjusted models yielded no differences in the conclusions drawn, other than a loss of statistical significance for the observed reduction in illness-related visits (−0.5; 95% CI, −1.0 to 0.03).

Changes in haemoglobin and iron status

A small, non-significant increase of 1.5 g/L (P = 0.076) in the mean haemoglobin level was shown; this effect increased in magnitude to 3.1 g/L and was statistically significant after adjustment for community, sex and age (Box 4). An additional analysis adjusting for baseline haemoglobin level did not change this conclusion. Comparing the individual communities, a large, statistically significant increase in mean haemoglobin level was shown at Bowraville (7.8 g/L) but not in Coffs Harbour or the Clarence Valley (P < 0.001 for difference between communities). The proportion of participants with anaemia decreased by 3% compared with baseline (Box 4). Iron deficiency, based on serum ferritin, was common at baseline (41%). There were small decreases in the proportions of fruit and vegetable program participants with low ferritin and iron levels; however, there were no significant differences in mean serum ferritin and serum iron levels after the fruit and vegetable program compared with baseline, with or without adjustment for community, sex and age (Box 4). Additional adjustment for baseline iron and ferritin levels did not change these findings.

Discussion

Aboriginal children from the NSW north coast who participated in this fruit and vegetable subsidy program had significantly fewer oral antibiotic prescriptions over 12 months compared with the preceding year. The proportion of overweight or obese children after participation in this program did not change. Although height, weight and BMI had all increased significantly at the 12-month follow-up as expected in children, there was no change in the percentage body fat among a subgroup who had this assessed. The prevalence of iron deficiency at baseline was 41%, with anaemia in 8%. There was a small but statistically significant increase in the mean haemoglobin level and a reduction in the proportion of children with anaemia, but only a non-significant 4% decrease in iron deficiency.

Our study demonstrates the potential to undertake evaluation studies in an Aboriginal community-controlled health service, despite the inherent limitations in a busy community-oriented service organisation. It is also an example of an Aboriginal community-directed program; such programs are far more common than intervention research, although few are documented in the academic literature.

The nutritional challenges in this group of disadvantaged Aboriginal children are consistent with those reported in a study of other towns in northern NSW.7 Low intakes of fruits and vegetables and high intakes of energy-dense, nutrient-poor foods were reported among both Aboriginal and Torres Strait Islander and non-Indigenous children aged 9–13 years, with a particularly high intake of sodium, calories, fat, sugary drinks and white bread by Indigenous boys.7 Although the nature of the intervention in our study differed from other nutrition interventions in remote Aboriginal communities, such as the Looma Healthy Lifestyle Program8 in Western Australia and the Minjilang Health and Nutrition Project9 in the Northern Territory, a common feature of these successful programs was strong community engagement. This, together with ongoing relationships, underpins other current Aboriginal community research programs.10,11

Community support for our healthy food program was fostered by the 88% subsidy for fruits and vegetables (a $5 copayment on a box worth at least $40). Lower subsidies of 10%–20% have been used in other recent healthy food research and modelling studies.12–14 The higher subsidy used in this program reflects the substantial challenges and barriers to healthy nutrition faced by disadvantaged Aboriginal and Torres Strait Islander families. However, it is consistent with the WIC program (Special Supplemental Food Program for Women, Infants, and Children) in the United States and the Healthy Start program in the United Kingdom, which provide free healthy foods to low-income pregnant women and young children. The WIC program, in particular, has been shown to improve the nutritional status of participating women and children and pregnancy outcomes.15–19 There are still questions about the cost-effectiveness of these healthy food subsidy programs and whether the impacts on nutritional status are sustained.15,20,21 Food subsidies remain topical in Australia, given increasing concerns about food insecurity22 and as a policy alternative to compulsory income management and cash entitlements for low-income families.

The before-and-after uncontrolled study design limits the strength of our data. Regression to the mean in the paired data and the normal reduction in rates of childhood illness as children grow older may also have contributed to the findings.23 Regression to the mean was accounted for through the use of covariate-adjusted models that included age, sex and community, in addition to the baseline value for each outcome analysed. It is also possible that other, unrelated environmental factors contributed to the improvements in nutrition and health outcomes, such as local early childhood and school nutrition programs.24,25 In addition, the health record audits may be subject to incomplete ascertainment, because patients could potentially access more than one primary health care service and hospital records are not linked across area health services. It is not possible to predict the impact of this on the findings; however, it is likely to have been similar before and after participation.

We showed an association between subsidised fruits and vegetables and short-term health improvements in this study. We have previously reported increased plasma biomarkers of fruit and vegetable intake among participants,4 which supports the hypothesis that improvements in dietary intake contributed to improved health outcomes. A controlled study is needed for further confirmation of these findings and to allow investigation of the cost-effectiveness of such a program. Our findings are consistent with prospective studies demonstrating an association between healthy nutrition and improved long-term health outcomes.26,27

A larger trial is warranted to investigate the sustainability and feasibility of healthy food subsidy programs in Australia. The program could be adapted to target low-income families more generally. The design of future healthy food subsidy studies needs to allow the relative contributions of the fruit and vegetables and of comprehensive primary health care to the improved outcomes to be distinguished. This program aimed to engage families in preventive health activities more fully than previously, which may also have contributed to the observed health outcomes. This is relevant, given the cost of food subsidies and the need to target effective interventions. Food subsidy programs in the US operate independently of health services, although the WIC program assists participants to access health and social services.28

This fruit and vegetable subsidy program was associated with improvements in some indicators of short-term health status among disadvantaged Aboriginal children. These health outcomes and the associated improvements in biomarkers of fruit and vegetable intake4 have the potential to reduce health disparities in the population.

1 Baseline demographic characteristics of participating children, in total and by community

                                                    All communities   Clarence    Coffs Harbour   Nambucca
No. of families                                     55                30          12              13
No. of children                                     174               90          36              48
No. of boys                                         82                46          18              18
Age in years, mean (SD)                             7.6 (4.2)         7.5 (3.8)   11.0 (3.3)      5.8 (4.3)
Children with at least one smoker in household*     107/164           62/90       18/36           27/38
Families receiving unemployment benefits or pensions, no./total   51/55   28/30   10/12   13/13

* Proportion of participants with a valid response to the number of smokers in the household.

2 Retrospective clinical audit data for health outcomes among participants for the 12 months before and 12 months after starting the subsidised fruit and vegetable program (n = 167)*


* Error bars show 95% CI. † Illness-related visits to health services. ‡ Preventive health-related visits to health services. § Number of prescriptions.

3 Change in health outcomes among Aboriginal children participating in the subsidised fruit and vegetable program (n = 167)

Outcome                   Unadjusted mean Δ-score§ (95% CI)    Adjusted mean Δ-score¶ (95% CI)
Sick visits*              −0.6 (−1.1 to −0.04)**               −0.6 (−1.2 to −0.001)**
Well visits†              −0.1 (−0.3 to 0.03)                  −0.2 (−0.3 to 0.01)
Otitis media episodes     −0.1 (−0.2 to 0.06)                  −0.1 (−0.2 to 0.06)
Pyoderma episodes         −0.2 (−0.4 to 0.03)                  −0.2 (−0.4 to 0.05)
Hospital attendances      −0.3 (−0.5 to −0.05)**               −0.2 (−0.4 to 0.1)
Oral antibiotics‡         −0.5 (−0.8 to −0.2)**                −0.5 (−0.8 to −0.2)**
Topical antibiotics‡      −0.06 (−0.2 to 0.1)                  −0.1 (−0.2 to 0.1)

* Illness-related visits to health services. † Preventive health-related visits to health services. ‡ Number of prescriptions. § (Number of episodes per year during 12 months’ participation) − (number of episodes in the year before program participation). ¶ Adjusted for sex, age and community. ** Significantly different to zero (P < 0.05).

4 Changes in haemoglobin and iron status among fruit and vegetable program participants (n = 129)†

                     Mean level (SD)                 Δ-score (95% CI)                           Proportion classified as low
                     Before          After           Unadjusted mean      Adjusted mean*        Before (no. [%])   After (no. [%])   P
Haemoglobin (g/L)‡   126.8 (12.3)    128.2 (10.5)    1.5 (−0.2 to 3.1)    3.1 (1.4 to 4.8)**    12/150 (8%)        7/137 (5%)        0.453
Ferritin (μg/L)§     33.3 (24.2)     35.2 (22.5)     3.2 (−0.5 to 6.2)    1.7 (−2.5 to 6.0)     63/152 (41%)       51/139 (37%)      0.440
Iron (μmol/L)¶       12.7 (6.0)      13.2 (5.3)      0.5 (−0.6 to 1.6)    0.8 (−0.5 to 2.0)     43/152 (28%)       32/139 (23%)      0.405

* Adjusted for sex, age and community. † 129 participants had valid haemoglobin, ferritin and iron results at baseline and follow-up; additional participants had valid pathology at either baseline or follow-up, as shown. ‡ Reference interval (RI): ≥ 5 years, 115–140 g/L; < 5 years, 105–140 g/L. § RI: boys, 20–200 μg/L; girls, 29–200 μg/L. ¶ RI, 11–28 μmol/L. ** Significantly different to zero (P < 0.05).

Impact of swimming on chronic suppurative otitis media in Aboriginal children: a randomised controlled trial

Rates of chronic suppurative otitis media (CSOM) among Aboriginal children living in remote areas in Australia are the highest in the world.1,2 A survey of 29 Aboriginal communities in the Northern Territory found that 40% of children had a tympanic membrane perforation (TMP) by 18 months of age.3 About 50%–80% of Aboriginal children with CSOM suffer from moderate to severe hearing loss.4,5 This occurs while language and speech are developing and may persist throughout primary school.

There is evidence suggesting that the recommended treatment for ear discharge (twice-daily cleaning and topical ciprofloxacin) can produce cure rates of 70%–90%.6–8 However, a study of Aboriginal children with CSOM in the NT found that less than 30% of children had resolution of ear discharge after 8 weeks of similar treatment.9 This study suggested that ongoing treatment for long periods was difficult for many Aboriginal families living in underresourced and stressful conditions. When children in high-risk communities do not receive appropriate medical treatment for ear disease, using swimming pools to limit levels of ear discharge and possibly reduce bacterial transmission becomes an attractive option.

Traditionally, children with perforated eardrums have been restricted from swimming because of fears of infection. However, it is hypothesised that swimming helps cleanse discharge from the middle ear, nasopharynx and hands and that this benefit may outweigh the risk of introducing infection. Several observational studies have examined the relationship between swimming and levels of skin and ear disease among Aboriginal children.10–14 In a cross-sectional survey, close proximity to a swimming area was associated with reductions of up to 40% in otitis media.10 Two systematic reviews have found that swimming without ear protection does not affect rates of recurrent ear discharge in children with tympanostomy tubes (grommets).15,16 Despite these findings, surveys indicate uncertainty among clinicians regarding water precautions for children with grommets.17–19

Our aim was to conduct a randomised controlled trial (RCT) to better understand the impact of swimming on children with CSOM, and to address a lack of data on ear discharge in older Aboriginal children (aged 5–12 years) with CSOM. We also aimed to obtain microbiological profiles of the nasopharynx and middle ear to help elucidate the cleansing hypothesis.

Methods

Study design

Between August and December 2009, we conducted an RCT examining the impact of 4 weeks of daily swimming in a chlorinated pool on TMPs in Aboriginal children. The Human Research Ethics Committee of the Northern Territory Department of Health and Families and the Menzies School of Health Research approved the study.

Participants and setting

Participants were from two remote Aboriginal communities in the NT. Resident Aboriginal children aged 5–12 years who were found at baseline ear examination to have a TMP were eligible for the trial. Children with a medical condition that prohibited them from swimming were excluded.

Randomisation and blinding

A random sequence stratified by community and age (< 8 years or ≥ 8 years) was generated using Stata version 8 (StataCorp). The allocation sequence was concealed from all investigators. The clinical assessment was performed without knowledge of the group allocation, and laboratory staff were also blinded to group allocation and clinical data.
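
The trial reports only that the sequence was generated in Stata and stratified by community and age; as a minimal sketch of how such a stratified sequence might be produced (the permuted blocks of four, the seed and the stratum labels below are assumptions, not details reported by the authors), consider:

```python
# Illustrative only: a stratified allocation sequence in the spirit of the one
# described above. Block size, seed and stratum names are assumptions.
import random

random.seed(2009)  # hypothetical seed, for reproducibility of the sketch

STRATA = [(community, age_group)
          for community in ("community A", "community B")
          for age_group in ("< 8 years", ">= 8 years")]

def allocation_sequence(n_per_stratum=30, block_size=4):
    """Return a randomly permuted swimming/no-swimming sequence for each stratum."""
    sequences = {}
    for stratum in STRATA:
        seq = []
        while len(seq) < n_per_stratum:
            block = ["swimming", "no swimming"] * (block_size // 2)
            random.shuffle(block)        # permute treatment order within each block
            seq.extend(block)
        sequences[stratum] = seq[:n_per_stratum]
    return sequences
```

In practice the generated sequence would be held by someone independent of recruitment, so that allocation remained concealed as described above.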

Intervention

Children in the intervention group swam in a chlorinated pool for 45 minutes, 5 days a week, for 4 weeks. Swimmers did not wear head protection (cap or earplugs) and went underwater frequently. Children in the control group were restricted from swimming for 4 weeks.

Clinical assessments

Participants’ ears were examined in the week before and the week after the intervention using tympanometry, pneumatic otoscopy and digital video otoscopy. Criteria for diagnosis were:

  • Otitis media with effusion: intact and retracted non-bulging tympanic membrane and type B tympanogram

  • Acute otitis media without perforation: any bulging of the tympanic membrane and type B tympanogram

  • Acute otitis media with perforation: middle ear discharge, and perforation present for less than 6 weeks or covering less than 2% of the pars tensa of the tympanic membrane

  • Dry perforation: perforation without any discharge

  • CSOM: perforation (covering > 2% of the pars tensa) and middle ear discharge.

Children with a perforation were examined a second time with a video otoscope. The degree of discharge was graded as nil, scant (discharge visible with otoscope, but limited to middle ear space), moderate (discharge visible with otoscope and present in ear canal), or profuse (discharge visible without otoscope). Drawings of the eardrum and perforations were made, with estimates of the position and size of the perforation as a percentage of the pars tensa. Examiners reviewed the videos in Darwin to confirm the original diagnoses of perforations.
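
As a rough illustration, the diagnostic criteria above can be expressed as a simple decision rule. The field names below are assumptions, and the sketch omits findings (such as retraction on pneumatic otoscopy) that informed the clinical assessment:

```python
# A minimal sketch of the diagnostic criteria listed above; inputs and field
# names are assumptions, and tympanic membrane retraction (part of the otitis
# media with effusion criterion) is not modelled here.
from dataclasses import dataclass

@dataclass
class EarExam:
    perforated: bool
    discharge: bool                      # middle ear discharge seen on otoscopy
    bulging: bool                        # any bulging of the tympanic membrane
    type_b_tympanogram: bool
    perforation_pct_pars_tensa: float = 0.0
    perforation_weeks: float = 0.0

def diagnose(exam: EarExam) -> str:
    if not exam.perforated:
        if exam.bulging and exam.type_b_tympanogram:
            return "acute otitis media without perforation"
        if exam.type_b_tympanogram:
            return "otitis media with effusion"
        return "no middle ear diagnosis"
    if not exam.discharge:
        return "dry perforation"
    if exam.perforation_weeks < 6 or exam.perforation_pct_pars_tensa < 2:
        return "acute otitis media with perforation"
    return "CSOM"  # perforation covering > 2% of the pars tensa, with discharge
```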

Swab collection and microbiology

Swabs were taken from the nasopharynx and middle ear at both the baseline and final ear examinations. All swabs were cultured on selective media for respiratory bacteria. The bacteria specifically targeted were Streptococcus pneumoniae, non-typeable Haemophilus influenzae, Moraxella catarrhalis and Staphylococcus aureus. Ear discharge swabs were also cultured for Streptococcus pyogenes (Group A Streptococcus), Pseudomonas aeruginosa and Proteus spp.

Swabs stored in skim-milk tryptone glucose glycerol broth20 were thawed and mixed, and 10 μL aliquots were cultured on the following plates: full chocolate agar, 5% horse blood agar containing colistin and nalidixic acid, and chocolate agar with bacitracin, vancomycin, and clindamycin (Oxoid Australia). Ear discharge swabs were also cultured on MacConkey agar plates. Blood plates were incubated at 37°C in 5% CO2, and MacConkey plates at 35°C in air. Bacterial isolates were identified according to standard laboratory procedures.

The density of each of the bacteria on each plate was categorised as: 1) < 20; 2) 20–49; 3) 50–100; 4) > 100 or confluent in the primary inoculum; 5) as for 4, but colonies also in second quadrant of the plate; 6) as for 5, but colonies also in third quadrant; 7) as for 6, but colonies also in fourth quadrant. Dichotomous measures for bacterial load were categorised as low density (< 100 colonies) or high density (≥ 100 colonies).
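
The seven-level density scale and the low/high dichotomy described above amount to a simple mapping from plate readings; a sketch (with assumed parameter names) is:

```python
# Illustrative mapping of plate readings to the semi-quantitative density scale
# described above. Parameter names are assumptions for the purposes of the sketch.
def density_category(primary_count: int, confluent: bool, last_quadrant_with_growth: int) -> int:
    """Return the 1-7 density category.

    last_quadrant_with_growth: 1 = growth confined to the primary inoculum,
    4 = growth extends into the fourth quadrant of the plate.
    """
    if not confluent and primary_count <= 100:
        if primary_count < 20:
            return 1
        if primary_count < 50:
            return 2
        return 3                            # 50-100 colonies in the primary inoculum
    return 3 + last_quadrant_with_growth    # categories 4-7: > 100 colonies or confluent

def is_high_density(primary_count: int) -> bool:
    """Dichotomous load measure as defined above: high density if >= 100 colonies."""
    return primary_count >= 100
```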

Outcome measures

Clinical measures

The primary outcome measure was the proportion of children with otoscopic signs of ear discharge in the canal or middle ear space after 4 weeks. Final ear examinations took place 12 hours to 2.5 days after the participants’ last scheduled swim. Prespecified subgroup comparisons were: younger (5–7 years) versus older (8–12 years) children; children who had been prescribed topical antibiotics versus those who had not; degrees of discharge; and smaller (< 25%) versus larger (≥ 25%) perforations.

Microbiological measures

For the nasopharynx, we determined the proportions of children with S. pneumoniae, H. influenzae, M. catarrhalis, any respiratory pathogen (S. pneumoniae, H. influenzae, M. catarrhalis) and S. aureus. For the middle ear, we determined the proportions of children with S. pneumoniae, H. influenzae, M. catarrhalis, S. aureus, Group A Streptococcus, P. aeruginosa and Proteus spp.

Statistical methods and analyses

All participants allocated to a group contributed a clinical outcome for analysis, including children lost to follow-up, whose diagnoses were assumed not to have changed from baseline. Children lost to follow-up were excluded from assessments of microbiological outcomes. Risk differences (RDs) between the study groups were calculated with 95% confidence intervals. The Mann–Whitney U test was used to compare median perforation sizes of the study groups.
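
The article does not state which interval method was used for the risk differences; as a minimal sketch, an unadjusted risk difference with a Wald-type 95% CI can be computed as follows (for the primary outcome the result comes out close to the figures reported in Box 2):

```python
# Illustrative only: unadjusted risk difference with a Wald-type 95% CI.
# The authors' exact interval method is not reported; this is one common choice.
from math import sqrt

def risk_difference(events1: int, n1: int, events2: int, n2: int, z: float = 1.96):
    """Return (RD, lower, upper) for group 1 minus group 2."""
    p1, p2 = events1 / n1, events2 / n2
    rd = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# Primary outcome at 4 weeks: 24/41 swimmers v 32/48 non-swimmers with discharge.
rd, lo, hi = risk_difference(24, 41, 32, 48)
print(f"RD {rd:.0%} (95% CI, {lo:.0%} to {hi:.0%})")  # about -8% (-28% to 12%)
```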

Sample size

We hypothesised that 90% of children not swimming would have ear discharge at 28 days and that swimming could reduce this proportion. We specified that a 25% difference between the two groups would be clinically important. We therefore aimed to recruit 100 children, giving the study 80% power to detect a difference of this size between the two groups.
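
As a rough check of this calculation (assuming a two-sided alpha of 0.05 and a simple normal approximation, neither of which is stated in the paper), comparing 90% with 65% at 50 children per group gives power of roughly 0.86, consistent with the stated 80%:

```python
# Approximate power for comparing two proportions (normal approximation).
# The alpha level and this particular approximation are assumptions.
from math import sqrt
from statistics import NormalDist

def power_two_proportions(p1: float, p2: float, n_per_group: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided z-test for the difference of two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    pbar = (p1 + p2) / 2
    se_null = sqrt(2 * pbar * (1 - pbar) / n_per_group)
    se_alt = sqrt(p1 * (1 - p1) / n_per_group + p2 * (1 - p2) / n_per_group)
    return NormalDist().cdf((abs(p1 - p2) - z_alpha * se_null) / se_alt)

print(round(power_two_proportions(0.90, 0.65, 50), 2))  # about 0.86 with 50 per group
```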

Results

Parental consent was obtained for 89 eligible children: 41 children in the swimming group and 48 children in the non-swimming group (Box 1). At 4-week follow-up, final ear examinations were conducted on 82 children (36 swimmers and 46 non-swimmers).

At baseline, the study groups were similar in age, sex, perforation size, the presence and degree of ear discharge, and the prevalences of ear diagnoses (Box 2). Although there were no statistically significant differences in the baseline prevalence of bacteria in the nasopharynx or middle ear, swimmers had lower rates of H. influenzae in the nasopharynx and higher rates of S. aureus in both the nasopharynx and middle ear. Of the 89 children, 58 (26 swimmers and 32 non-swimmers) had ear discharge at baseline.

At 4-week follow-up, 56 children had ear discharge: 24 of 41 swimmers compared with 32 of 48 non-swimmers (RD, −8%; 95% CI, −28% to 12%). Excluding children lost to follow-up, 21 of 36 swimmers had ear discharge compared with 31 of 46 non-swimmers (RD, −9%; 95% CI, −30% to 12%).

Between baseline and 4-week follow-up, there was no statistically significant change in the prevalence of bacteria in the nasopharynx (Box 2). P. aeruginosa infection in the middle ear increased in swimmers, compared with no change in non-swimmers. Non-typeable H. influenzae isolated from ear discharge increased in both groups. Overall, the dominant organisms were S. pneumoniae and H. influenzae in the nasopharynx, and H. influenzae, S. aureus and P. aeruginosa in the middle ear.

Per-protocol analysis of swimmers attending > 75% of swimming classes and non-swimmers adhering to swimming restrictions > 75% of the time indicated that 16 of 24 swimmers had ear discharge at 4-week follow-up, compared with 29 of 44 non-swimmers (RD, 1%; 95% CI, −23% to 23%).

Rates of discharge were significantly lower in children who were prescribed ciprofloxacin and in children with smaller perforations (Box 3).

Of the 89 children, 65 had no change from their original diagnosis (by child’s worst ear) at 4-week follow-up. Ear discharge failed to resolve in 31 of the 35 participants with moderate to profuse ear discharge at baseline (Box 3). Seven of the 89 children had a perforation that healed (Box 4).

Discussion

We found that regular swimming in a chlorinated pool for 4 weeks did not aid resolution of ear discharge in Aboriginal children with CSOM. At the end of the trial, rates of ear discharge were similar between swimmers and non-swimmers. Our microbiological data also suggest that swimming is unlikely to be effective in removing discharge from the middle ear and nasopharynx, with rates and densities of organisms generally comparable between swimmers and non-swimmers, with little change during the study. Among swimmers, there was an increase in P. aeruginosa middle ear infection, but this was not correlated with new episodes of ear discharge.

Our study is the first RCT to examine the effects of swimming on Aboriginal children with CSOM and also addresses the need for more RCTs examining the impact of swimming on children with grommets. Further, the microbiological data enabled an assessment of the effect of regular swimming on infection in the nasopharynx and middle ear. Other strengths include the blinding of examiners, prespecified subgroup analysis and a follow-up rate of more than 90%.

Our study also has some limitations. We planned to randomly assign 100 children and anticipated that 90% of participants would have ear discharge at follow-up, but we had only 89 participants and 63% with discharge at follow-up, meaning the study was underpowered. Some difficulties were encountered in recruiting children who did not attend school in one community. The possibility of contamination among non-swimmers was also a concern. Parents and school and pool staff assisted in ensuring that non-swimmers did not swim at the pool or at any other water sites, and alternative activities were provided for non-swimmers after school, as this was a popular swimming time. Attendance at swimming and activity classes was monitored, and two portable media players were offered as incentives to children with the highest attendance.

The lack of objective measures for the degree of discharge, perforation size and bacterial density may have contributed to measurement error. It is unlikely that these limitations would prevent a large clinical effect being identified. However, our small sample size means that modest benefits or harms associated with daily swimming may still be possible.

Our results are not consistent with research from two remote communities in Western Australia, which found that rates of TMPs among Aboriginal children halved from about 30% to 15% after swimming pools were installed.11 It is possible that longer exposure to swimming could improve on our results. However, the WA study did not follow individual children, and after 5 years the reductions were sustained in only one community.14 Further, our microbiological data do not support the likelihood of significant clinical improvements over a longer period. A recent South Australian study also found that the installation of swimming pools in six communities did not affect rates of TMPs among children.12

While swimming may remove some ear and nasal discharge, there is evidence to suggest that cleansing practices alone will not cure CSOM. A Cochrane review of studies conducted in developing countries found that wet irrigation or dry mopping was no more effective than no treatment in resolving ear discharge in children with CSOM (odds ratio, 0.63; 95% CI, 0.36–1.12).21 The review recommended that aural cleansing should be conducted in conjunction with topical antibiotic therapy.21 Future studies could look at the effectiveness of swimming in combination with the application of topical antibiotic therapy.

Over the 4 weeks of our intervention, rates of H. influenzae middle ear infection substantially increased in both swimmers (from 35% to 70%) and non-swimmers (from 50% to 65%). Previous topical antibiotic trials of Aboriginal children (aged 1–16 years) have reported lower baseline rates of H. influenzae in the middle ear, ranging from 5% to 25%.6,9 In contrast, a vaccination trial of Aboriginal infants aged < 24 months found H. influenzae in 85% of new perforations.22 The high levels of H. influenzae ear and nasopharyngeal infection may mean that there is a role for the use of oral antibiotics in combination with topical antibiotics to treat Aboriginal children with CSOM. There may also be benefits from vaccines against H. influenzae in Aboriginal children at high risk of progressing to CSOM.

Simultaneous hand contamination and nasal carriage of S. pneumoniae and H. influenzae is a reliable indicator of TMP in Aboriginal children under 4 years of age.23 Future research could examine rates of hand contamination in relation to swimming, particularly targeting younger children (aged 2–5 years), who are most likely to transmit otitis media bacteria to infants.

In conclusion, it seems unlikely that regular swimming in pools will resolve ear discharge and heal TMPs in the short term. We also found no clear indication that swimming reduces rates of respiratory and opportunistic bacteria in the nasopharynx or middle ear. However, we did not find swimming to be associated with an increased risk of ear discharge. We would not support the practice of restricting children with a TMP from swimming unless it was documented that ear discharge developed directly after swimming (for that particular child). More RCTs are needed to assess more modest (or longer-term) effects of swimming on middle ear disease in Aboriginal children. The combination of swimming and ciprofloxacin treatment may also produce better clinical outcomes and should be investigated.

1 Flowchart of participants through the trial


TMP = tympanic membrane perforation.

2 Participant characteristics at baseline and 4-week follow-up

Characteristic | Baseline, swimmers (n = 41) | Baseline, non-swimmers (n = 48) | Follow-up, swimmers (n = 41) | Follow-up, non-swimmers (n = 48) | Risk difference (95% CI)*
Mean age in years (SD) | 8.9 (2.4) | 8.6 (1.9) | | |
Male | 27 (66%) | 31 (65%) | | |
Ear diagnosis† | n = 41 | n = 48 | n = 41 | n = 48 |
Bilateral closed tympanic membranes | | | 1/41 (2%) | 6/48 (13%) | −10% (−23% to 2%)
Unilateral dry TMP | 11/41 (27%) | 11/48 (23%) | 11/41 (27%) | 5/48 (10%) | 16% (0 to 33%)
Bilateral dry TMPs | 4/41 (10%) | 5/48 (10%) | 5/41 (12%) | 5/48 (10%) | 2% (−12% to 17%)
Unilateral wet TMP | 12/41 (29%) | 13/48 (27%) | 10/41 (24%) | 12/48 (25%) | −1% (−18% to 18%)
Wet TMP and dry TMP | 2/41 (5%) | 2/48 (4%) | 5/41 (12%) | 5/48 (10%) | 2% (−12% to 17%)
Bilateral wet TMPs | 12/41 (29%) | 17/48 (35%) | 9/41 (22%) | 15/48 (31%) | −9% (−27% to 10%)
Median size of TMP as percentage of pars tensa (IQR) | 20% (8%–38%) | 18% (6%–40%) | 15% (4%–32%) | 20% (5%–49%) | P = 0.39
Any ear discharge (primary outcome) | 26/41 (63%) | 32/48 (67%) | 24/41 (59%) | 32/48 (67%) | −8% (−28% to 12%)
Moderate or profuse discharge | 16/41 (39%) | 19/48 (40%) | 20/41 (49%) | 25/48 (52%) | −3% (−24% to 17%)
Nasopharyngeal bacteria‡ | n = 41 | n = 46 | n = 35 | n = 41 |
Streptococcus pneumoniae | 28/41 (68%) | 33/46 (72%) | 19/35 (54%) | 27/41 (66%) | −12% (−33% to 1%)
Non-typeable Haemophilus influenzae | 17/41 (41%) | 28/45 (62%) | 21/35 (60%) | 30/41 (73%) | −13% (−34% to 8%)
Moraxella catarrhalis§ | 17/40 (43%) | 17/46 (37%) | 6/35 (17%) | 14/41 (34%) | −17% (−36% to 3%)
Any respiratory pathogen | 28/41 (68%) | 41/46 (89%) | 24/35 (69%) | 37/41 (90%) | −22% (−40% to −4%)
Staphylococcus aureus | 8/41 (20%) | 5/46 (11%) | 9/35 (26%) | 4/41 (10%) | 16% (−1% to 34%)
At least one high-density respiratory pathogen§ | 17/35 (49%) | 23/43 (53%) | 16/35 (46%) | 16/41 (39%) | 7% (−15% to 28%)
Middle ear bacteria‡ | n = 24 | n = 30 | n = 23 | n = 32 |
Streptococcus pneumoniae§ | 1/24 (4%) | 4/30 (13%) | 0/23 | 2/32 (6%) | −6% (−20% to 9%)
Non-typeable Haemophilus influenzae | 8/23 (35%) | 14/28 (50%) | 16/23 (70%) | 20/31 (65%) | 5% (−21% to 29%)
Moraxella catarrhalis§ | 0/22 | 0/29 | 1/21 (5%) | 0/31 | 5% (−4% to 14%)
Staphylococcus aureus | 8/24 (33%) | 5/30 (17%) | 8/23 (35%) | 4/32 (13%) | 22% (0 to 45%)
Group A Streptococcus | 3/24 (13%) | 1/30 (3%) | 5/23 (22%) | 2/32 (6%) | 15% (−3% to 37%)
Pseudomonas aeruginosa | 3/24 (13%) | 10/30 (33%) | 10/23 (43%) | 10/32 (31%) | 12% (−13% to 37%)
Proteus spp. | 3/24 (13%) | 2/30 (7%) | 2/23 (9%) | 2/32 (6%) | 2% (−13% to 22%)

TMP = tympanic membrane perforation. IQR = interquartile range. * Unless otherwise indicated. † Includes children lost to follow-up, whose diagnoses were assumed not to have changed from baseline. ‡ Denominators are reduced due to children lost to follow-up, children refusing to have a swab taken, or the swab being damaged in transportation. § Some plates were contaminated by Proteus spp.

3 Children with ear discharge at final ear examination, by subgroup at baseline

Subgroup at baseline | Overall | Swimmers | Non-swimmers | Risk difference (95% CI)
All children with ear discharge at final ear examination | 56/89 (63%) | 24/41 (59%) | 32/48 (67%) | −8% (−28% to 12%)
Aged 5–7 years | 14/24 (58%) | 6/11 (55%) | 8/13 (62%) | −7% (−44% to 31%)
Aged 8–12 years | 42/65 (65%) | 18/30 (60%) | 24/35 (69%) | −9% (−31% to 15%)
Not prescribed topical ciprofloxacin | 46/67 (69%) | 20/30 (67%) | 26/37 (70%) | −4% (−26% to 18%)
Prescribed topical ciprofloxacin | 10/22 (45%) | 4/11 (36%) | 6/11 (55%) | −18% (−54% to 23%)
Nil discharge | 9/31 (29%) | 3/15 (20%) | 6/16 (38%) | −18% (−47% to 15%)
Scant discharge | 16/23 (70%) | 5/10 (50%) | 11/13 (85%) | −35% (−66% to 4%)
Moderate or profuse discharge | 31/35 (89%) | 16/16 (100%) | 15/19 (79%) | 21% (−1% to 44%)
Small (< 25%) perforation | 19/49 (39%) | 9/24 (38%) | 10/25* (40%) | −3% (−29% to 24%)
Large (≥ 25%) perforation | 35/38 (92%) | 15/17 (88%) | 20/21 (95%) | −7% (−31% to 13%)

* Perforation size was not estimated for two children in the non-swimming group at baseline.

4 Change in diagnosis (by child’s worst ear) from baseline to final ear examination

Outcome | Overall (n = 89) | Swimmers (n = 41) | Non-swimmers (n = 48)
Dry TMP to closed tympanic membrane | 4 (5%) | 1 (2%) | 3 (6%)
Dry TMP to dry TMP | 18 (20%) | 11 (27%) | 7 (15%)
Dry TMP to wet TMP | 9 (10%) | 3 (7%) | 6 (13%)
Wet TMP to closed tympanic membrane | 3 (3%) | 0 | 3 (6%)
Wet TMP to dry TMP | 8 (9%) | 5 (12%) | 3 (6%)
Wet TMP to wet TMP | 47 (53%) | 21 (51%) | 26 (54%)
Improved | 15 (17%) | 6 (15%) | 9 (19%)
Same | 65 (73%) | 32 (78%) | 33 (69%)
Got worse | 9 (10%) | 3 (7%) | 6 (13%)

TMP = tympanic membrane perforation.