In Part 2 of her series, Dr Louise Stone explores whether digital mental health services are providing the safety and value for money we expect.

In the previous article, I explored current policy positions on the use of digital mental health services, and discussed how these services are being actively promoted and incentivised to drive increased engagement. In this article, I will explore the cost, accessibility and safety of these interventions. With another $588 million invested in the sector in the recent Federal Budget, it is time to examine the cost–benefit equation more carefully. After all, this investment is the equivalent of 2350 full-time GPs doing long consultations.

Are digital mental health services cheap?

Digital services are marketed as a cost-effective solution to our burgeoning mental health crisis. However, they may not be providing the value for money we expect. It is almost impossible to obtain a complete picture of the public funding for digital mental health services because there are so many funding pathways and sources of data. Some services are funded directly by governments through core funding (eg, Lifeline). Some are commissioned by the various Primary Health Networks. There are also federal research grants to develop and evaluate programs through the National Health and Medical Research Council and the Medical Research Future Fund, as well as specific uncontested grants (eg, the Brain and Mind Centre’s development of Innowell). Some services also gain commercial benefits, with profits to shareholders and occasionally consumer contributions or sales.

Digital mental health services are marketed as a cost-effective solution to the mental health crisis (dodotone/Shutterstock).

For example, headspace received $23.5 million in core funding from the federal government in 2023, some of which will be directed towards digital mental health initiatives. Primary Health Networks were also funded, at $134 million in 2023, to deliver youth mental health services, a program that mandates commissioning of headspace. Headspace also attracts philanthropic and state government funding. The portion of this funding spent on digital health interventions is unclear.

However, even if we examine federal funding alone, digital mental health services are not cheap. The following table has been constructed using publicly available material, specifically GrantConnect, the government’s record of awarded federal grants, and the 2023–24 federal budget documents. This is a significant underestimate because three large sources of funding are not included. The first is grants from philanthropy, industry and other government sources (eg, state government grants). The second is core funding. The third is commercial revenue, with sales of apps and their related data providing significant profit for the companies that design them.

GP costs to the federal government are calculated using Medicare statistics for mental health item numbers.

Even restricting digital health funding to federal sources, each person who uses digital mental health services costs the taxpayer $78 in federal funding, whereas each person using a GP for their mental health costs the taxpayer $83.

With only $5 difference per person in federal funding between a digital health service and a GP service, digital mental health is hardly cheap.

Category and organisation | Funding (millions) | People seen (millions) | Federal investment
Federal budget announcement: digital mental health ongoing funding, 2023–24 | $44 | – | –
Federal research grants focused on digital mental health, 2023 | $27.3 | – | –
Selected federally funded programs (Black Dog Institute, StandBy Support After Suicide, InsideOut Institute’s eClinic and Digital GP Hub, Head to Health website), 2023–24 | $25.9 | – | –
Total new e-mental health investment by the federal government, 2023 | $97.2 | 1.248 (4.8% of population) | $78 per person using a digital health service
General practice mental health | $269 | 3.224 (12.4% of population) | $83 per person using a GP
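The per-person figures in the last column are simple divisions of total funding by the number of people seen. As a quick arithmetic check, here is a minimal Python sketch reproducing them; the dollar amounts and user counts come straight from the table, and rounding to the nearest dollar is my assumption.

```python
# Reproduce the per-person federal investment figures from the table above.
# Funding and people seen are both expressed in millions, so the units cancel.
services = {
    "Digital mental health (federal, 2023)": (97.2, 1.248),  # $m, people (m)
    "General practice mental health": (269.0, 3.224),
}

for name, (funding_m, people_m) in services.items():
    print(f"{name}: ${funding_m / people_m:.0f} per person")

# Output:
# Digital mental health (federal, 2023): $78 per person
# General practice mental health: $83 per person
```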

Are digital mental health services safe?

At the bare minimum, we need assurances that digital mental health interventions are safe. It is the developers’ responsibility to prove that there is a solid research basis for their claims of efficacy. There should also be transparent reporting of harm.

One of the biggest concerns is data safety and privacy, and this is where I have issues with the term “consumer”. Although many people can act as informed consumers when they are unwell, the term “consumer” implies that they are well enough to make informed decisions about therapeutic products. Some consumers are patients who may lack their normal energy and cognitive capacity when they are unwell. They have the added protection of regulation by the Therapeutic Goods Administration (TGA) when we prescribe a drug or device. This gives patients some confidence that the medication or intervention they are prescribed is safe. Of course, patients still need support to consider their options, either through conversations with clinicians, or open and transparent communication of risks and benefits with publicly available educational material.

With digital products, transparency is critical, because patients have the right to informed consent. Consumers need to be clearly informed about how their data will be collected, managed and used, because sound data stewardship underpins their trust in the clinicians who refer them. There is evidence that service users may have limited knowledge and awareness of privacy, confidentiality and security. Users have the right to decide how risk averse they wish to be with digital mental health tools, but developers may not make these risks clear, and many apps do not display their associated risks. Some digital tools passively monitor and collect data, with many forwarding data to commercial entities. Emerging apps that use sensor tracking and machine learning can provide personalised feedback on mental health, but they also capture granular data on a person’s health, which is a significant privacy risk.

Digital designers also need to be held accountable for the accuracy of their content. Some app content is inaccurate, or even harmful. Few digital mental health services consider diversity. Artificial intelligence (AI) is known for bias towards more privileged populations, with one author stating that the acronym stands for “Augmenting Inequity”.

In response to these concerns, there have been attempts to regulate digital health products, with the TGA producing a regulatory guideline. The guideline suggests that any service that is “based on established clinical practice guidelines that are referenced and displayed in the software” is exempt from the requirement to register. The decision tree for digital devices is found here. Based on this decision tree, it seems likely that most currently available digital mental health services are exempt.
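To make that exemption concrete, here is a deliberately simplified sketch of the quoted criterion. The TGA’s actual decision tree has many more branches (intended purpose, risk classification, and so on), and the function name and inputs below are illustrative assumptions, not the regulator’s terminology.

```python
# Hypothetical, simplified sketch of the single exemption criterion quoted
# above. The TGA's real decision tree for software-based medical devices
# has many more branches (intended purpose, risk class, etc.).
def appears_exempt_from_registration(
    based_on_established_guidelines: bool,
    guidelines_referenced_and_displayed: bool,
) -> bool:
    """True if a service meets the quoted exemption: content based on
    established clinical practice guidelines that are referenced and
    displayed in the software."""
    return based_on_established_guidelines and guidelines_referenced_and_displayed

# Most current digital mental health services cite established guidelines,
# which is why most appear to fall outside the registration requirement.
print(appears_exempt_from_registration(True, True))  # True -> exempt
```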

The blurring of the line between information and therapy is a concern. There is ample evidence supporting online education across all disciplines. Open, accessible information that takes a consumer-centric approach and is co-designed with consumers and carers is clearly empowering and effective.

However, digital mental health tools that claim to be therapeutic are a different matter. Therapeutic products should be held to higher standards than non-specific “wellness” apps or information sites, and should be required to meet accountability and transparency standards. There is an even greater need for regulation when these products use or are developed with AI.

There will always be a market for unregulated apps; we have seen this in the fitness industry. However, if GPs are to lend their credibility to digital tools by recommending them to patients, they need assurance that those tools deserve it. This requires more stringent regulation.

Dr Louise Stone is a Canberra GP with clinical, research, teaching and policy expertise in mental health. She is an associate professor in the social foundations of medicine group, Australian National University Medical School.

Read part 1 of her series on digital mental health services here.

The statements or opinions expressed in this article reflect the views of the authors and do not necessarily represent the official policy of the AMA, the MJA or InSight+ unless so stated. 


5 thoughts on “Are digital mental health services cost-effective and safe?”

  1. Dr Naomi Rutten says:

    I’m curious too: if we, as GPs, refer patients to these apps and there is an adverse outcome or an ineffective response, whose responsibility does it become?

  2. Kyla Bremner says:

    The main issue I find is that patients just don’t want therapy from a computer screen. People are isolated enough as it is and are already spending too much time on screens. They desperately want human warmth, empathy and connection with individual understanding of their lives. Who’s making money from these non-evidence-based schemes? That’s my question.

  3. Anonymous says:

    Dear Louise, I asked ChatGPT what it thought of this situation and here is the answer.
    How would ChatGPT cure mental health illness?
    ChatGPT:
    “ChatGPT, or any AI for that matter, isn’t equipped to cure mental health illnesses. Mental health treatment involves a complex interplay of factors including therapy, medication, support networks, lifestyle changes, and sometimes more. While AI like ChatGPT can offer support, provide information, and even offer coping strategies or suggestions for self-care, it’s not a substitute for professional medical or psychological help.

    AI can potentially augment mental health care by providing resources, offering supportive conversations, and even monitoring for concerning patterns in behavior or language that might indicate someone needs professional help. But ultimately, the treatment and management of mental health illnesses require the expertise of trained professionals.”

    Looks like AI will be referring back to us then!

  4. Anonymous says:

    Page 3 of today’s Age (broadsheet) reports Google AI’s “Cats on the moon” response, as well as, more disturbingly, a person “feeling depressed” being advised to jump off the Golden Gate Bridge.

    Also in SMH https://www.smh.com.au/business/companies/cats-on-the-moon-google-s-ai-tool-is-producing-misleading-responses-that-have-experts-worried-20240525-p5jgmk.html

  5. Caroline West says:

    Thank you Dr Louise Stone for outlining some of the issues. Found the relatively high cost of digital services quite a surprise. Also the concerns you flag about representing diversity are a real concern given the sway of AI.
