Now is the time for “futurists” to start dreaming and for general practice to engage with generative AI.

It is no secret that Australia is facing a growing shortage of general practitioners (GPs), particularly in rural and remote communities and in underserved outer metropolitan populations. This shortage is putting more pressure on the GPs who remain, who also face increased practice costs. This is compounded in regional areas by workforce shortages in primary care and indeed across the whole health workforce. Given current trends, Australia will have a deficit of over 10 000 GPs by 2031.

The appropriate use of artificial intelligence (AI), and generative AI (GenAI) in particular, could be part of the answer to alleviating some of these pressures. Existing GPs are leaving the profession and 13% fewer medical students are choosing to go into general practice. Burnout and poor career progression options are often cited as reasons for GP dissatisfaction. One of the main contributors to burnout is interacting with electronic health record systems and spending excessive time on administrative tasks rather than on patients and clinical encounters. Other sources of frustration for GPs worldwide include longer working hours and a lack of a supportive work environment.

This makes the intelligent use of GPs and their limited time a priority. After all, primary health care is still the most efficient and sustainable way to provide health care.

In this context, we believe that the emergence of GenAI could have a transformational role in general practice and help ensure it has a future.

Several software vendors already offer GenAI software to automate parts of the clinical note-taking processes (MUNGKHOOD STUDIO / Shutterstock).

How AI could reshape primary care

With their capabilities in text generation, text processing and “understanding”, conversational AI systems such as ChatGPT should be especially useful for clinical documentation tasks. A “digital scribe” that automates much of the documentation burden for GPs could mitigate some of the drivers of clinician burnout. This change is already happening, with several software vendors offering GenAI software to automate parts of the clinical note-taking process.

Although documentation automation is one of the most obvious uses, it is not the most transformative one on the horizon. GenAI may truly reshape how care is delivered, especially in primary care. The appropriate use of GenAI systems could facilitate care delivery in remote areas, addressing understaffing through administrative task automation. The AI-assisted physician could address and mitigate some of the perennial tribulations of primary care, such as managing diagnostic uncertainty.

It is on this reshaping that we need to concentrate our efforts in imagining, developing and testing the future of “augmented” primary care. What do we mean by augmented? Previous work has highlighted the need to develop “learning health systems”. In a similar light, the augmented consultation could mean a departure from the usual patient–doctor–computer interaction, where information generally flows in discrete exchanges. Instead, a continuous feed could run from the patient to their AI assistant, which then summarises this information for the doctor. This flow of information won’t end with the consultation itself: continuous ongoing exchanges between patients’ and doctors’ AI agents will keep providing information and guiding health management.

We could argue, then, that most primary care will happen outside the consultation room itself. Likewise, we could envision an “embodied AI” consultation room that listens to and sees patient cues and presents them in concise and interpretable ways to physicians or, potentially, to other AI agents. Moreover, the record itself would be transformed into something else entirely. Think of a chat interface that, by text or speech, brings up relevant patient information on the fly. You ask with your voice, or type in a box, “Show me the last HbA1c measurements”, and a graph of the values appears. Type a different request and the record changes. In this sense, the “AI-embodied” consultation room may not even have a computer screen or keyboard; it might resemble a comfortable lounge room, where doctors and patients interact directly while all the other elements (text, clinical history, measurements) recede into the background. Much of this has been foreshadowed, and GenAI now has the potential to make it happen.
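For readers curious how such a conversational record might work under the hood, the following is a minimal, purely illustrative sketch in Python. Everything in it is hypothetical: the record structure, the `answer_query` function and the simple keyword match (which stands in for the GenAI intent-parsing a real product would use) are assumptions for illustration, not any vendor's actual interface.

```python
# Hypothetical sketch: routing a free-text request to the relevant
# slice of a patient record. A real system would use a GenAI model to
# interpret the request; here a simple keyword match stands in for it.

# Toy patient record: measurement name -> list of (date, value) pairs.
PATIENT_RECORD = {
    "hba1c": [("2023-03-01", 7.2), ("2023-09-01", 6.8), ("2024-03-01", 6.5)],
    "blood_pressure": [("2024-01-15", "138/86"), ("2024-03-10", "132/84")],
}

def answer_query(query: str, record: dict) -> list:
    """Return the measurement series whose name appears in the query,
    or an empty list if nothing in the record matches."""
    q = query.lower()
    for measure, series in record.items():
        if measure in q or measure.replace("_", " ") in q:
            return series
    return []

# "Show me the last HbA1c measurements" -> the HbA1c series, which the
# consultation-room interface could then render as a graph.
series = answer_query("Show me the last HbA1c measurements", PATIENT_RECORD)
```

The point of the sketch is the interaction pattern, not the matching logic: the clinician speaks or types a request, software resolves it against the record, and the display updates, with no fixed forms or menus in between.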

AI as a training tool

GenAI systems are now multimodal, meaning they can also process and generate other media such as sound, images or even video. Current developments in multimodal AI systems will likewise allow real-time or near-real-time feedback on patient–doctor communication. Think of receiving cues on how the clinical conversation is going, whether the patient seems undecided about treatment, or whether the tone you are using to communicate information is inadequate. For instance, you could receive hints on what “stage of change” a patient is at for quitting smoking, based on non-verbal cues you might not have picked up on. The system might also suggest follow-up questions to reach a diagnosis.

Although this real-time feedback may initially be most useful in training scenarios, GPs could also use it as a continuous learning and development tool, reviewing (with the help of a GenAI assistant) past conversations and cases to improve their diagnostic and communication skills.

In this light, AI is also going to be decisive for medical education. Akin to airline pilots, who must spend a considerable number of hours in simulators before getting “behind the yoke”, students and future clinicians could use GenAI to develop their skills before they see their first real patient, allowing for better trained clinicians (eg, extensive practice at breaking bad news before facing real patients).

Caution is still required

Although there is indeed a bright future ahead, we should not race to a mindless use of these technologies. To continue with transport analogies, just as cars require multiple quality tests before they hit the market, AI will need robust testing before it can hit the “medical roads”, and its users will need training. Clinical users of AI may need to be qualified in how to prompt and interact with this novel form of computation so that, for instance, they are not confused by anomalous outputs. Although the most salient problems, such as the fabrication of answers (“hallucination”), are likely to be overcome in the coming years, GenAI departs from previous, deterministic forms of computing (where the same input always produces the same output) towards a more conversational style of interaction with computers. Variation in expression (articulating something in different words) by GenAI is acceptable, but we need to make sure these variations do not entail erroneous judgements, and that GenAI’s variability falls within its expertise, so that clinicians can establish a trusting relationship with it and regard its output as they would a peer’s.

Beyond primary care, there is a huge opportunity to rethink health care workflow from operating theatre to hospital switchboards. GPs need to be at the centre of this revolution as their expert generalism is going to be key to understanding, filtering and acting on the vast capabilities of GenAI.

We believe there is a bright future ahead, and we need to engage with it continuously. That enthusiasm should always be tempered with rational caution, because we do not wish to cause harm for innovation’s sake. This is primary care’s chance to rethink what it means to provide care, and how we provide it, for patients and the community.

Professor Enrico Coiera is the Director of the Centre for Health Informatics at the Australian Institute of Health Innovation, Macquarie University. Professor Coiera trained as a medical doctor and leads the Australian Alliance for AI in Healthcare.

Dr David Fraile Navarro is Postdoctoral Research Fellow on Generative AI at the Australian Institute of Health Innovation, Macquarie University, and a trained general practitioner.

The statements or opinions expressed in this article reflect the views of the authors and do not necessarily represent the official policy of the AMA, the MJA or InSight+ unless so stated. 


5 thoughts on “GPs urged to consider the benefits of generative AI”

  1. Louise Stone says:

    Why oh why does every single edition of every single journal contain at least one article “urging” GPs to change their practice in a way that will benefit the authors? Usually with something technologically sophisticated.

    The authors may be right. There may well be a place for AI in general practice but we won’t know if we don’t have the courtesy to consult with experts. And weird though I know it sounds in our current climate, the expert on how GP works is actually a GP. (I know one author has been a GP but that was briefly and in Spain in 2017. GP has moved on.)

    I am so tired of being “encouraged to overcome my barriers” or “think again about my choices” or even be regulated so my “resistance” can be “managed”. So let me put it bluntly.

    I know science doesn’t have the same purchase in our post truth world but we should at least try to generate evidence before we attempt to change a system with the cheapest, most accessible and most effective outcomes we have. GP is cheap, effective and accessible and has high patient satisfaction. If you want me to change, you’re going to have to prove to me that your shiny intervention is better than my current practice. In my context, not yours.

    New drugs get RCTs and have to pass the TGAs standards to prove they are better than standard care. Devices need to at least prove they are safe in humans. Healthcare interventions (like this one) need to have a compelling narrative, a bright idea, an enabling framework and usually a few blokes to sell an idea that will make my practice better, cheaper, faster, more connected and (usually) less private and more over governed. More harvestable data means more nudging from more bureaucrats. But the promise of data harvesting will get this into policy quicker than I can post this comment.

    One of my patients came back from seeing someone at Headspace recently and said it wasn’t helpful. I asked why. She really thought about it and said “I guess I realised within a few minutes she had her own agenda and didn’t want to get to know me and what I needed. Without that, how can I believe that what she suggests is going to work for me?”

    If you want to “convince” me that I should change my practice to incorporate your ideas and, let’s face it, buy your products, you need to know what I do. And that means involving GPs in your research or at least in your communication. Otherwise it’s just another didactic, patriarchal grab for a very large and lucrative market. If I’m wrong, do what my patient said. Show me you understand my industry before you tell me what to do.

    Otherwise, I will do what I am taught to do. I will be skeptical about products that claim to solve a problem that is ill defined and poorly understood. We knew that with the pharmaceutical reps in the 80s. Now it’s the data entrepreneurs. Stop it.

  2. Joel Bernstein says:

    Another major problem facing the use of AI, especially with regard to diagnosis and treatment, is the medico-legal aspect of responsibility and blame for bad outcomes.
    This could end up being the largest hurdle to overcome, and at what cost?

  3. Joachim Sturmberg says:

    Technically, well may be. Pragmatically – it fails most patients in general practice. It further entrenches the loss of “healing relationships” at the heart of medicine. Without listening and shared sense-making, medicine is without an essence and a soul.

    For those who want to reflect more deeply, listen to neurologist Iain McGilchrist’s talk: A Revolution in Thought? He makes the very pertinent observation, extending on Benjamin Franklin: [Man is] the tool-making animal [who] thinks like a machine and has therefore exported machine-like thinking everywhere into our environment, nature and life.

    https://www.youtube.com/watch?v=AuQ4Hi7YdgU

  4. Dr Philip Dawson says:

    Except for mental health consults, history taking is not onerous, and most encounters in GP need 2 lines of text. Talk about using a sledgehammer on a nail! Perhaps we should ask AI why graduates don’t want to do general practice, and what could be done about it? Everyone’s efforts over the last 30 years have been a dismal failure. Recognize this and think again. Replacing humans with IT solutions will be about as popular as online mental health tools. Despite pushing these for a decade, patients won’t or can’t use them. They want to talk to a real, sympathetic, empathetic person.
    AI is just a smarter search engine. It is so smart that it couldn’t work out that using the acronym AI would make rural people and anyone involved with animals, vets and the meat industry laugh. It has long been the acronym for what farmers do to breeding cows! Perhaps it’s not so smart. The acronym GIGO springs to mind (if you don’t know what it means, ask AI. Better still, type it into my favourite website, acronymfinder.org).

  5. Ewen McPhee says:

    The authors do paint a positive image of the future, and one might hope it transpires that we can be supported to deliver better outcomes. Regardless, it is still hyperbole, with hidden risks such as AI training that favours a company’s business model, or a lack of appreciation of critical data in decisions, especially with error-prone AI. Airlines won’t be using AI to replace training anytime soon, but they are investigating decision support and ergonomics. Finally, there was a lot of justification in the opening paragraph that is contentious, with 18–20% of doctors seeking primary care as a career when you include Rural Generalists. The expanded practice team may well be the model we pursue, along with alternative approaches to communication, patient activation and information sharing. AI has much to offer, but it’s not there yet, as well as being at the mercy of a solution looking for a problem to solve.
