AI scribes promise multiple benefits for general practitioners and their patients, but we may risk losing valuable human connection in the process, writes Dr Elizabeth Deveny.

AI scribes are entering general practice settings across Australia. These tools transcribe conversations and draft notes. The promise: less admin, better records and more patient focus.

In a system under strain, these benefits are not trivial. None of this is about blame: GPs are already under immense pressure.

But what do we all lose when the machine starts to write the notes?

Health care is deeply human. The consultation is more than a transaction or information exchange. It’s relational, contextual and shaped by silence as much as by speech. The introduction of AI into that space is not neutral. It changes not only what gets recorded, but what gets said.

We don’t yet have population-level data on the use of AI scribes in Australian general practice, but clinicians and consumers report mixed experiences. Formal research will follow; early signals like these deserve attention now.

I’ve heard many people say that something shifts in the room when their doctor starts using an AI scribe. They can’t always name what’s changed. Some feel less seen. Others miss the way their GP paused to jot things down. That pause helped the doctor think, and gave patients a moment to reflect, breathe and decide what to say.

That small moment of slowness mattered.

And then it’s gone.

The doctor’s still facing you, but something in the room has changed.

Clinicians and consumers report mixed experiences with AI scribes (Monkey Business Images / Shutterstock).

Speeding up consultations isn’t always helpful, especially for people with auditory processing issues or from culturally and linguistically diverse backgrounds. A faster clinical conversation may make it harder to participate. One woman told me she left a consultation in tears. She couldn’t keep up and didn’t want to interrupt the GP or the machine. Technology might help one person and accidentally shut someone else out.

A friend recently said this: “My GP seems to forget more of what we talked about than I do, since he started using that bot to write his notes.” She felt less remembered, less heard — enough to consider changing doctors.

This isn’t science, but the patterns are there, echoed in many conversations. Maybe I’m wrong. Maybe these are just teething issues, and we’ll all adapt. But something’s shifting, and I don’t think we can afford to ignore it. The presence of an AI scribe changes the feel of the room. It changes who clinicians speak to, what they notice, and what people are willing to say. If you’re early in your career, these shifts may not be obvious yet, but they shape trust over time.

Some clinicians have told me that reviewing the notes they personally wrote after a consultation often brings a fresh insight. A moment of clarity that helps them piece something together. The act of re-reading their own words, shaped by their memory and attention at the time, can bring nuance to clinical reasoning. I wonder whether AI-generated notes will offer the same depth. When the words aren’t your own and the emotional context is stripped away, does something vital get lost?

All of us in leadership roles — whether consumers, clinicians, system builders or policy makers — need to ask what the AI is learning. What counts as success? Who defines it? This is also a governance question. If we’re training machines on imperfect data, we may be encoding blind spots.

There are also concerns around training and consent. Where power imbalances exist, the rushed “This okay with you?” at the start of a consultation may not constitute meaningful consent. Consumers often say they didn’t have a real choice.

And what if a consumer does mind? What happens then? Consumers don’t want to be punished if they object to an AI scribe being used during their consultation. Some object because no one explains how the technology works or where the data goes. Others simply don’t want their conversations recorded. This is all reasonable. Health professionals need to remember that while they may believe AI scribes are the future, not all consumers are convinced.

We need to shape the use of AI scribes with care. That means co-designing standards with everyone affected, especially consumers. It means making space to pause, reflect and catch what might be missed.

And real consent, not just a rushed line at the start. Without this, we risk undermining trust — the very foundation of good care.

The potential benefits aren’t only for clinicians. We need to ask: what might a consumer experience as a gain? Are their needs being captured and acted on more clearly?

Peer-reviewed studies could help us understand how AI scribes affect not just clinician burnout and system efficiency, but also consumer safety and continuity of care. I’m not anti-technology. If this is progress, I’d say it’s time to get a second opinion — and ask what might quietly go missing.

Silence isn’t empty. It can hold hesitation, even courage. A human can be trained to notice this. A machine can only transcribe.

Even the best transcription is still a translation. And what gets lost in translation, especially in care, can be everything that matters.

Dr Elizabeth Deveny is currently the CEO of Consumers Health Forum of Australia, the national independent peak body for health consumers. Her recent roles include CEO of South Eastern Melbourne PHN, CEO of Bayside Medicare Local and Chair of Southern Metropolitan Partnership.

The statements or opinions expressed in this article reflect the views of the authors and do not necessarily represent the official policy of the AMA, the MJA or InSight+ unless so stated. 


6 thoughts on “AI scribes in general practice: support, silence and the shape of care”

  1. Judith Virag says:

    Another exceptionally insightful and thought-provoking article. Thank you!

  2. Philip Dawson says:

    I cannot see the use of it for the majority of consultations. It may be useful for long psychiatric consultations, but even then there are huge problems. I often find myself listening to an anxious, depressed patient’s diatribe about their family and friends. Often in our small town those family and friends are patients or even colleagues. I only record that the patient has “problems” or “disagreements” with their family/friends. I don’t want the substance of those complaints in writing, which might imply I agree with them. I wouldn’t want that subpoenaed.

    As for most review consultations for chronic illness, or to give results and arrange further care: a click to say the results are notified, a couple of lines and it’s done. Same for the referral letter: a few lines of typing, a few clicks to add results and past history and it’s done. The AI scribe, being standalone, cannot do that. Maybe when the Artificial Insemination scribe (oops, I live in a rural area and that’s what patients think when they hear AI!) is incorporated into the clinical software there may be a use case.

    However, like the AI incorporated into Google search, the Chrome and Bing browsers and MS Office, it may become intrusive and time wasting. I use it occasionally, and often have to click to get rid of it. It’s not very intelligent; it can’t tell when the basic Google search gets the best answer. It also cannot tell when there is no answer: it hallucinates and makes one up.

  3. Anonymous says:

    It is suggested by one College that Registrars restrict their use of AI scribes in Term 1. I think it should be longer than this. Writing our own notes after a consultation helps us in many ways: to think about what we forgot to do or ask, to synthesise information, and to develop critical thinking, to name a few. It also helps us to remember where we were up to, or what we were thinking, at the next consultation. I don’t think these skills are all fully developed after one term in general practice.

  4. Nicholas Lelos says:

    I believe that the statement “something has changed in the room”, repeated over and over, has accompanied every major technological implementation in healthcare. Certainly this line must have been used when doctors first started writing notes on mass-produced paper, and again when computers showed up in consultation rooms.
    Let’s face it, this is just another tool to help with an ever-increasing workload. Whether it is used as a transcription tool with appropriate consent, or as a dictation device after seeing a patient, it is still a great way of reorganising thoughts and keeping documentation up to date. It is the clinician’s responsibility to review the notes afterwards and do the usual lip-tapping rumination we all love doing after a consultation. Nothing gets lost; on that front there is only gain: increased accuracy, better recall and better medical record keeping.
    A better question is how to harness this technology efficiently and ethically to improve patient care and make our workplaces more centred around our patients than around endless paperwork.

  5. Dr Allen Chow says:

    Not only the above: I’ve seen a colleague’s AI notes, and I can say I don’t want to be the next doctor looking after his or her patients. The amount of information there is a minefield; imagine having to address each and every concern the patient raised in the AI notes. I always thought the salient points should be recorded and the distractions avoided.

  6. Jill Ramsey says:

    Fantastic insight from an experienced GP. I am a retired GP and I couldn’t agree with her more. Thank you!
