AI scribes promise multiple benefits for general practitioners and their patients, but we may risk losing valuable human connection in the process, writes Dr Elizabeth Deveny.

AI scribes are entering general practice settings across Australia. These tools transcribe conversations and draft notes. The promise: less admin, better records and more patient focus.

In a system under strain, these benefits are not trivial. And none of this is about blame: GPs are already under immense pressure.

But what do we all lose when the machine starts to write the notes?

Health care is deeply human. The consultation is more than a transaction or information exchange. It’s relational, contextual and shaped by silence as much as by speech. The introduction of AI into that space is not neutral. It changes not only what gets recorded, but what gets said.

We don’t yet have population-level data on the use of AI scribes in Australian general practice. Clinicians and consumers report mixed experiences. Formal research will follow, but early signals like these deserve attention now.

I’ve heard many people say that something shifts in the room when their doctor starts using an AI scribe. They can’t always name what’s changed. Some feel less seen. Others miss the way their GP paused to jot things down. That pause helped the doctor think, and gave patients a moment to reflect, breathe and decide what to say.

That small moment of slowness mattered.

And then it’s gone.

The doctor’s still facing you, but something in the room has changed.

Clinicians and consumers report mixed experiences with AI scribes (Monkey Business Images / Shutterstock).

Speeding up consultations isn’t always helpful, particularly for people with auditory processing issues or from culturally and linguistically diverse backgrounds. A faster clinical conversation may make it harder to participate. One woman told me she left a consultation in tears. She couldn’t keep up and didn’t want to interrupt the GP or the machine. Technology might help one person and accidentally shut someone else out.

A friend recently said this: “My GP seems to forget more of what we talked about than I do, since he started using that bot to write his notes.” She felt less remembered, less heard — enough to consider changing doctors.

This isn’t science, but the patterns are there, echoed in many conversations. Maybe I’m wrong. Maybe these are just teething issues, and we’ll all adapt. But something’s shifting, and I don’t think we can afford to ignore it. The presence of an AI scribe changes the feel of the room. It changes who clinicians speak to, what they notice, and what people are willing to say. If you’re early in your career, these shifts may not be obvious yet, but they shape trust over time.

Some clinicians have told me that reviewing the notes they personally wrote after a consultation often brings a fresh insight. A moment of clarity that helps them piece something together. The act of re-reading their own words, shaped by their memory and attention at the time, can bring nuance to clinical reasoning. I wonder whether AI-generated notes will offer the same depth. When the words aren’t your own and the emotional context is stripped away, does something vital get lost?

All of us in leadership roles — whether consumers, clinicians, system builders or policy makers — need to ask what the AI is learning. What counts as success? Who defines it? This is also a governance question. If we’re training machines on imperfect data, we may be encoding blind spots.

There are also concerns around training and consent. Where power imbalances exist, the rushed “This okay with you?” at the start of a consultation may not constitute meaningful consent. Consumers often say they didn’t have a real choice.

And what if a consumer does mind? What happens then? Consumers don’t want to be punished if they object to an AI scribe being used during their consultation. Some object because no one explains how the technology works or where the data goes. Others simply don’t want their conversations recorded. This is all reasonable. Health professionals need to remember that while they may believe AI scribes are the future, not all consumers are convinced.

We need to shape the use of AI scribes with care. That means co-designing standards with everyone affected, especially consumers. It means making space to pause, reflect and catch what might be missed.

And real consent, not just a rushed line at the start. Without this, we risk undermining trust — the very foundation of good care.

The potential benefits aren’t only for clinicians. We need to ask: what might a consumer experience as a gain? Are their needs being captured and acted on more clearly?

Peer-reviewed studies could help us understand how AI scribes affect not just clinician burnout and system efficiency, but also consumer safety and continuity of care. I’m not anti-technology. If this is progress, I’d say it’s time to get a second opinion — and ask what might quietly go missing.

Silence isn’t empty. It can hold hesitation, even courage. A human can be trained to notice this. A machine can only transcribe.

Even the best transcription is still a translation. And what gets lost in translation, especially in care, can be everything that matters.

Dr Elizabeth Deveny is currently the CEO of Consumers Health Forum of Australia, the national independent peak body for health consumers. Her recent roles include CEO of South Eastern Melbourne PHN, CEO of Bayside Medical Local and Chair of Southern Metropolitan Partnership. 

The statements or opinions expressed in this article reflect the views of the authors and do not necessarily represent the official policy of the AMA, the MJA or InSight+ unless so stated. 


14 thoughts on “AI scribes in general practice: support, silence and the shape of care”

  1. Anonymous says:

    I attended a specialist using AI for notes and later read the letter written to my physiotherapist. It was incoherent, incomplete and inaccurate.
    Of course, I could tell my own story to the physio, but some people can’t.
    However, I am old enough to recall GP referral letters on a tiny note (“Thank you for seeing X about Y”) with no other details, so I conclude that any tool is only as good as the person using it.

  2. Wally Jammal says:

    Insightful article, Elizabeth, thank you. As a GP using these tools, the feedback we get is generally positive. Yes, something has changed in the room: the doctor is concentrating on the patient rather than the computer. For years we have advocated for better patient-centred care that focuses on the humanity of our interactions. For years we have also advocated for transparency of information for our patients, including making a doctor’s notes visible to patients to improve quality and safety. These tools could help all of that, but it all depends on how they are used. I for one use them as a tool to look at people more, not to speed me up. I still have to spend time checking, removing and adding content, which I do before the patient leaves the room. And I am not ashamed to say that the AI scribe jotted things down I did not register! When I show the notes generated to patients, they are incredibly impressed. In my mind, if we learn to use these tools effectively, they could not only improve safety but also enhance the value of our humanity. Ultimately it is up to us to partner with our patients to make things better for everyone.

  3. Geoff Harding says:

    I have used Dragon Dictate for my notes for about 20 years now. I usually type the history in directly (I can’t dictate while the patient is talking).
    But after the physical exam I ask the patient to sit and hear what I am dictating, including the physical findings, my assessment and management, and my opinions of findings in investigations.
    The patient then gets a full print-out of the whole consult (and I send that to their GP via Medical Objects as well).
    The difference is that no outside corporation gets my notes, and what is in the notes is my dictation, not an AI-generated impression of the consult. I am in control.
    It works perfectly well for me (I use a small radio lapel mic).
    I have no interest in or need to “progress” to AI tools.

  4. alan mclean says:

    Every step ‘forward’ in tech over the last 30 years has promised savings in time and improved productivity. Unfortunately, the results have generally been disappointing and frustrating. I now feel pressure to try AI, but really it is the fear of missing out. I have always been told that notes and records are extremely important, and I take pride in my records. Better to stay in control of your own documentation.

  5. Christopher Steinfort says:

    Great comments above.
    Thank you for these warnings and advice.
    I will not use AI and will footnote my letters stating this.

  6. Matthew Cohen says:

    It is probably very different for us specialists who, prior to using an AI scribe, dictated a report back to the referring doctor after each patient, or at the end of each clinic if running late. Editing an AI-prepared draft is so much faster that I can either see more patients or get home on time. As for “something is missing”, I find I now engage much better with patients when I’m looking at them instead of the screen (or the keyboard, for us non-touch typists).

  7. Anonymous says:

    Okay, so I’m not a GP, I’m a Pain Specialist, but my comment would be the exact reverse of what is mentioned above. I’ve been using an AI scribe since September last year. I barely touch the keyboard during a consult now, and the feedback from patients has been overwhelmingly positive. They feel I am more connected and listening more intently rather than desperately trying to keep up with my notes (which over the years have become increasingly detailed due to medico-legal requirements). Furthermore, my reports go to the GPs as the patient leaves the room. In addition, my program can send an email summary of the consult to the patient afterwards (encrypted with a PIN code, of course). This feature has brought considerable positive feedback. Of course I always check and edit the AI notes, but I can’t imagine ever going back, nor would the vast majority of my patients want me to.

  8. Anonymous says:

    Another issue is the huge amount of energy and water used in AI data centres. For example, see a recent article from ABC news:
    https://www.abc.net.au/news/2025-05-20/data-centre-growth-energy-water-internet-future/105089876

  9. Judith Virag says:

    Another exceptionally insightful and thought provoking article. Thank you!

  10. Philip Dawson says:

    I cannot see the use of it for the majority of consultations. It may be useful for long psychiatric consultations, but even then there are huge problems. I often find myself listening to an anxious, depressed patient’s diatribe about their family and friends. Often in our small town those family and friends are patients or even colleagues. I only record that the patient has “problems” or “disagreements” with their family/friends. I don’t want the substance of those complaints in writing, which might imply I agree with them, and I wouldn’t want that subpoenaed. As for most review consultations for chronic illness, or to give results and arrange further care: a click to say the results are notified, a couple of lines and it’s done. Same for the referral letter: a few lines of typing, a few clicks to add results and past history and it’s done. The AI scribe, being standalone, cannot do that. Maybe when the Artificial Insemination scribe (oops, I live in a rural area and that’s what patients think when they hear AI!) is incorporated into the clinical software there may be a use case. However, like the AI incorporated into Google Search, the Chrome and Bing browsers, and MS Office, it may become intrusive and time-wasting. I use it occasionally, and often have to click to get rid of it. It’s not very intelligent; it can’t tell when the basic Google search gets the best answer. It also cannot tell when there is no answer; it hallucinates and makes one up.

  11. Anonymous says:

    It is suggested by one College that registrars restrict their use of AI scribes in Term 1. I think it should be longer than this. Writing our own notes after a consultation helps us in many ways: thinking about what we forgot to do or ask, synthesising information, and developing critical thinking, to name a few. It also helps us to remember where we were up to, and what we were thinking, at the next consultation. I don’t think these skills are all fully developed after one term in general practice.

  12. Nicholas Lelos says:

    I believe the statement “something has changed in the room”, repeated over and over, accompanies every major technological implementation in healthcare. Certainly this line must have been used when doctors started writing with implements on mass-produced paper, and then when computers showed up in consultation rooms.
    Let’s face it, this is just another tool to help with an ever-increasing workload. Whether it is used as a transcription tool with appropriate consent, or as a dictation device after seeing a patient, it is still a great way of reorganising thoughts and keeping documentation up to date. It is the clinician’s responsibility to review the notes afterwards and do the usual lip-tapping rumination we all love after a consultation. Nothing gets lost; only gain occurs on that front: increased accuracy, better recall and better medical record keeping.
    A better question is how to harness this technology efficiently and ethically to improve patient care and make our workplaces more centred on our patients than on endless paperwork.

  13. Dr Allen Chow says:

    Not only the above: I’ve seen a colleague’s AI notes, and I can say I don’t want to be the next doctor looking after his or her patients. The amount of information there is a minefield; imagine having to address each and every concern the patient raised in the AI notes. I always thought salient points should be recorded, and distractions avoided.

  14. Jill Ramsey says:

    Fantastic insight from an experienced GP. I am a retired GP and I couldn’t agree with her more. Thank you.
