
Where is the next generation of medical educators?

In reply: We thank Hart and Pearce for supporting the views raised in our editorial, noting the unmet demand for medical education expertise.

We also thank Kandiah for his response, and agree that medical graduates should be “clinically competent, reliable, keen to learn and show compassion to patients and colleagues”. We believe this outcome is best achieved by strong collaborations among “skilled clinicians and excellent mentors” and medical educators, many of whom are also practising clinicians. Clinicians provide critical input to ensure the validity and authenticity of what is taught and assessed, and are an essential element of the “triad” of patient, student and clinician in clinical learning.1 Collaboration between clinicians and medical educators is not difficult to achieve, because the two roles often reside in the same people.

The question of proof in medical education is the subject of much activity and, as Kandiah notes, there is an increasing output of scholarship in medical education. Moreover, the quality and rigour of this output is increasing, with a growing evidence base for medical educational practice.2 Generating new knowledge and applying it to medical student education is a key goal of an increasingly professionalised medical education community.

Medical education research is confounded by multiple factors, not least the powerful and uncontrolled effects of the diverse clinical environments in which students learn and practise as graduates.3 These make causal pathways difficult to unpick. While researching the effect of medical educators may be desirable, we believe that researching the effects of medical education interventions is more fruitful. For example, if one were to replace medical educators with radiologists in the question, how could one “prove” that radiologists have improved the health of the Australian population? Yet we are convinced that radiology does play an important role, based on multiple individual studies showing contributory evidence for this claim.

We welcome opportunities to work with health services and the community to examine the long-term performance of our students and their impact on the health system. Collaboratively defining and answering specific questions is likely to be much more productive than making artificial distinctions between clinicians and educators.

Supply and demand mismatch for flexible (part-time) surgical training in Australasia

Surgical training follows an apprenticeship model, traditionally involving long hours of full-time mentored practice over several years. Recently, this model has been challenged by several trends, including working-hour restrictions, falling case exposure, and a desire for work–life balance.1–3 Another challenge is an increasing demand for flexible (part-time) training.4,5

The Royal Australasian College of Surgeons (RACS) supports flexible training by allowing trainees to accredit part-time work, but it mandates a time commitment of at least 50% during any training year.6 However, the opportunity to train part-time in surgery is also influenced by hospital employers and supervisors, and the supply of part-time surgical training posts is limited.

There are currently no data from our region regarding the number of surgical trainees undertaking flexible training. Our primary aim was therefore to define current flexible surgical training uptake and demand in Australia and New Zealand. A secondary aim was to identify demographic and work-related factors motivating interest in flexible training.

Methods

All 1191 trainees enrolled in an RACS program during 2010 were identified through the College’s database and invited by email to complete an anonymous online questionnaire, with weekly reminders over 4 weeks. Approval was granted by the RACS Ethics Committee.

The survey comprised four sections with option buttons. The first section defined demographic characteristics, including age, sex, specialty and Surgical Education and Training (SET) program year. Demographic data were also analysed to determine whether the responding sample was representative of all RACS trainees. The second section defined current working hours, and the third section rated work-related fatigue. Trainees were asked if they worked part-time, full-time or were not currently in active clinical training (ie, interrupted or deferred, such as for research or parenting). Results concerning trainee working hours and impacts of fatigue have been reported elsewhere.3,7 The fourth section used Likert scales to ascertain respondents’ perceptions of their work–life balance and interest in flexible training. This section included the question, “Are you interested in the option of applying for flexible (less than full time) training during your surgical training?”. Responses were assessed for differences between sexes and specialties. Further analyses were undertaken to identify whether interest in flexible training was correlated with working hours or fatigue.

Analyses were performed using SPSS version 19.0 (IBM), using cross tabulations with the χ2 test (threshold P < 0.05).
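
As an illustration only (the original analysis used SPSS, not Python), the sketch below shows the kind of cross-tabulation and χ2 test described above, applied to the published counts for the sex comparison reported in the Results (94/173 women and 114/441 men interested in flexible training); the variable names are ours.

# Illustrative sketch: chi-squared test on a 2 x 2 cross-tabulation, as described in
# the Methods. Counts are taken from the published Results (sex v interest in
# flexible training); the study itself used SPSS, not Python.
import numpy as np
from scipy.stats import chi2_contingency

#                      interested  not interested
crosstab = np.array([
    [94,  173 - 94],    # women: 94/173 interested
    [114, 441 - 114],   # men: 114/441 interested
])

chi2, p, dof, _expected = chi2_contingency(crosstab)
print(f"chi-squared = {chi2:.1f}, df = {dof}, P = {p:.2g}")
# With these counts, P is well below the 0.05 threshold used in the study.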

Results

Of the 1191 trainees, 659 responded (response rate, 55.3%), and 587 respondents (89.1%) completed all relevant questions. Respondents were similar to all trainees in terms of specialty (P = 0.22) and sex (P = 0.09). Of the 659 respondents, 187 (28.4%) were female, with the proportion of women differing between specialties (P = 0.02), ranging from 2/19 in cardiothoracic surgery to 9/16 in paediatric surgery (Box 1). The median age of respondents was 32 years (range, 24–50 years).

Most of the 659 respondents (627, 95.1%) were engaged in full-time clinical training, with 30 (4.6%) not in active clinical training, and only two (0.3%) in a part-time clinical training position. Both respondents who were working part-time reported working 40 hours per week, compared with a median of 60 hours per week for those in full-time clinical work. The small number of part-time trainees precluded further comparisons with full-time trainees.

An interest in flexible training was reported by 208 respondents (31.6%), being more common among women than men (54.3% [94/173] v 25.9% [114/441]; P < 0.001) (Box 2, A). There was no statistically significant difference in interest in flexible training by state or country (P = 0.82) or by hospital setting (rural, regional, tertiary) (P = 0.07) (data not shown). There was also no difference in interest in flexible training by age (P = 0.21), but junior trainees were more likely to express an interest than senior trainees (45.8% [60/131] and 38.4% [56/146] for SET years 1 and 2, respectively, v 24.8% [26/105] and 26.4% [23/87] for SET years 4 and 5–6, respectively; P = 0.002). Trainees interested in part-time training were more likely to express concerns regarding fatigue impairing their work performance and limiting their social or family life, inadequate work–life balance, and insufficient time for things outside surgical training, including study or research (Box 3 and Appendix).

General and orthopaedic surgery trainees were most likely to report an interest in flexible training (41.6% [116/279] and 32.5% [37/114], respectively), while cardiothoracic and vascular surgery trainees were least likely (6.7% [1/15] and 8.0% [2/25], respectively). Female general surgery trainees were more likely to be interested (65.2% [58/89]) than female trainees in other specialties (P = 0.04).

There was no significant difference in work–life balance satisfaction between male and female surgical trainees with respect to working ≥ 60 or < 60 hours per week (P = 0.48). About two-thirds of trainees reported currently working “too many” or “far too many” hours in terms of their preferred work–life balance (men, 62.0% [259/418] and women, 65.1% [110/169]). Trainees’ opinions on whether they had satisfactory time in their lives for things outside of surgical training are shown in Box 2, B.

Discussion

This study demonstrates a striking mismatch between interest in flexible training among Australasian surgical trainees and the number of trainees currently in a part-time post. Although 32% of trainees were interested in flexible training, less than 1% were engaged in part-time clinical training.

These results show a previously undocumented high level of interest in flexible training among both male and female trainees. The rate of interest among men was higher than the 13% rate reported among male general surgical residents and medical students in the United States.4 The leading factor known to motivate interest in flexible training is time for parenting,4,8 and the median age of trainees in our study (32 years) coincides with the age when Australians would typically choose to become parents.9 Impact on family life has been found to be among the biggest regrets of US surgeons regarding their time in residency,4 and the American Surgical Association has previously called for increased flexibility in training to facilitate parenting.5 Our results show that similar efforts are required in Australasia. Limited opportunities for flexible training may discourage graduates from considering a surgical career.10–12

Our study also found that work-related factors are associated with interest in flexible training. On average, Australasian surgical trainees work more than 60 hours per week, and around 75% also perform on-call duties for a further 28 hours per week.7 A previous study of Younger Fellows of the RACS showed that those working more than 60 hours per week were at higher risk of “burnout”,13 which could perhaps be mitigated by increased work flexibility. Factors other than family and fatigue, such as health and academic interests, are also likely to contribute to interest in flexible training.4,8,14

Barriers to flexible surgical training that may explain the low uptake in our study include clinical, supervisory, trainee and employment factors.12,14,15 From a clinical and supervisory perspective, the potential impact of part-time training on continuity of care, and the associated need for additional handovers, is a concern.16 As part-time training occurs at a lower intensity and over a longer period, it may also be an impediment to gaining technical skills.15 Few educational data are currently available to assess this, but limited Australasian experience suggests quality outcomes can be achieved within the right model.14 Factors deterring trainees from flexible training include prolongation of training, a trade-off in salary and benefits, complexities in negotiating a part-time hospital contract, a perception of receiving “second-class training”, and discouragement from supervisors and other trainees.4,17 In addition, the limited availability of part-time hospital appointments is a key barrier.14 It may be difficult in some rotations to provide trainees with the necessary range of clinical experience (spanning acute, elective, operative and non-operative experience) in a flexible capacity.

Despite these challenges, opportunities exist to enhance the supply of flexible surgical training posts. Two possible models are job-sharing and stand-alone posts, both of which have successful precedents in Australia.12,14 Job-sharing can be facilitated by allowing trainees to “match” into suitable posts, but this typically must be planned months in advance and may be difficult in smaller specialties or regions. A possible path to advancing flexible training under a stand-alone model is through private hospital rotations, which are a focus of increasing interest in Australasia.18 Private sector training imposes opportunity costs on surgeon and hospital income,19 which could be partly offset by these positions being part-time, if quality operating exposure could be assured. Job-sharing in acute surgical units offers an opportunity to mitigate impacts on continuity of care, as scheduled handovers occur continuously during the 24-hour acute care model.20,21

Female students now outnumber male students in Australasian medical schools,22 and our study suggests the proportion of female trainees in surgery is also growing. Although only around 8% of qualified Australasian surgeons are women,23 we found that 28% of RACS trainees are women, indicating that a workforce transition is occurring. As we also found that more women than men are interested in flexible training, these demographic trends are likely to increase pressure for part-time training opportunities.

A limitation of our study is that the data were sourced from self-reported survey responses; however, the response rate was satisfactory.24 Further, a reported interest in flexible training may not translate into uptake of flexible training, even if the opportunities are available.

In conclusion, we believe efforts should be made to facilitate part-time surgical training in our region.

1 Sex of respondents by surgical specialty

2 Comparisons of male and female trainee responses to questions about interest in flexible training (A) and time outside training for other things (B)

* P < 0.001 for male v female (A); P = 0.06 for male v female (B).

3 Factors associated with an increased interest in flexible training

Factor | P*
Fatigue is impairing concentration or performance at work | 0.009
Fatigue is limiting participation in social or family life | < 0.001
Current working hours are in excess of preferred work–life balance | < 0.001
Perceived insufficient time in life for things outside of surgical training | < 0.001
Perceived insufficient time for surgical study and research needs | 0.003


* Each question was assessed on a five-point Likert scale. P values reflect χ2 analyses of Likert response and interest in flexible training. The full datasets for these analyses are provided as an Appendix (online at mja.com.au).

Flexible surgical training in Australasia

Establishing part-time surgical training posts may be difficult, but evidence suggests it is possible and increasingly necessary

The study by McDonald and colleagues confirms previous reports in Australia and elsewhere that there is a significant difference between the number of trainees who are undertaking some of their surgical training part-time and the number who would be interested in doing so.1

It cannot be denied that doctors entering a career in surgery are now seeking a more balanced lifestyle. Generational change and associated lifestyle expectations, the feminisation of the workforce and changing working environments have contributed to the trend.2

While the Royal Australasian College of Surgeons (RACS) has had a policy supporting flexible surgical training for some years,3 it recognises that, by itself, this is not enough. The RACS has recently established the Flexible Training Working Party to look at how we can make the option of part-time training more of a reality for our trainees. Members of the working party include jurisdictional (ie, hospital and health department) representatives, and representatives from Colleges with a higher uptake of part-time training than the RACS. Jurisdictional representation on this working party is extremely important, as the College itself does not employ trainees; while it can accredit trainee posts, it does not create them.

The Royal Adelaide Hospital has successfully implemented an effective model for part-time surgical training.4 Interestingly, the position came about after lobbying of the state government produced the funding for a stand-alone 0.5 full-time equivalent position. An early concern for the hospital was the lack of guarantee that the post would be filled from year to year. However, the post has proven popular with trainees, with some even relocating from interstate to take up the position. It is important to note that this post is an accredited position with appropriate learning experiences for the trainee, rather than merely a position designed to satisfy a trainee’s wish for a part-time commitment.

Several commentators have raised concerns that part-time training in surgery will affect the educational experience and that a lack of continuity may be an impediment to gaining proficiency in both technical and non-technical skills.5 Contrary to this is logbook evidence from the Royal Adelaide Hospital post, which showed equivalence between 12 months of flexible training and a full-time 6-month position. Furthermore, all trainees who undertook this part-time position subsequently obtained their Fellowship of the RACS.4

There is no doubt that some doctors are put off pursuing surgical training because it is not seen as compatible with raising a family or having an appropriate work–life balance. This contributes to the imbalance between the sexes in surgery, with only 30% of applicants for surgical training in 2012 being women. Although this is an improvement on the 9% of current active Fellows who are women,6 more than 50% of newly graduating doctors are women, so more needs to be done to ensure surgery is not limiting its pool of available candidates.

The educational concerns relating to part-time training will continue to be debated, but McDonald et al’s study demonstrates a desire among trainees for flexible training options. Although the current surgical clinical training environment is not conducive to establishing posts that are less than full-time, the evidence suggests that it is possible. There is a real risk that the limited availability of flexible training options may deter some doctors from entering the field of surgery. However, until a reasonable number of flexible training options are available, these issues and concerns cannot be fully substantiated or explored. It is hoped that the outcomes of the RACS Flexible Training Working Party will lead to the establishment of more flexible training posts, enabling the many questions and issues raised to be resolved.

Early obstetric simulators in Australia

Simulation became important for obstetric training in the 18th and 19th centuries

The increased numbers of medical students in Australia, coming from a greater number of medical schools, have put pressure on the traditional obstetric education requirement for students to personally deliver babies under supervision. There is a real chance that students may “miss out” on this experience, and universities may need to resort to using simulation to give students those experiences and skills.

Aside from being a substitute for performing procedures where there is less opportunity for students to do so on real patients, simulation is increasingly being used to improve medical education and training because it offers a risk-free environment for learning new skills and practising management of occasional but serious conditions that need prompt and effective intervention. Similar observations were made around 300 years ago, when simulators began to be used in obstetric training.1,2 Surgery simulators were used more than a thousand years earlier, but obstetrics was the first discipline to widely integrate simulation in training.1 In the second half of the 18th century, London was a centre of excellence for obstetrics and attracted students from all over the United Kingdom, Europe and the United States. A certain apothecary, Matthew Flinders, attended a simulation-based obstetric training course there and was then able to successfully manage several difficult deliveries.3 In this article, we outline this training and the development of simulation in the 18th and 19th centuries, and describe the historical obstetric simulators that are on display in Australia.

Obstetric training

In the 17th century it was recognised that the outcome of labour could be improved by intervention, and in the 18th century a new health care professional appeared — the man-midwife or accoucheur. William Smellie was an early specialist in this discipline and designed several simulators for use in his training courses. One of his students, Colin Mackenzie, became an assistant but left after an argument over the dubious manner in which the body of a woman pregnant with twins was obtained by Smellie for dissection.3 Mackenzie then established a school of midwifery in St Saviour’s Churchyard in Southwark, London. His lectures included the use of anatomical preparations and “machines” to demonstrate difficult deliveries, and on which students could practise.

Much as now, many health professionals in the 18th century were city-based. While apothecaries were originally compounders of medicines, many also consulted, particularly in the countryside. Some, such as Flinders, undertook additional training so that they could provide comprehensive health care, and these “apothecaries plus” were the progenitors of general practitioners. They were looked down upon by physicians and surgeons, although their training meant they were sometimes better providers of health care. After training as an apothecary, Flinders went to London where he “walked the wards” at The London Hospital on Whitechapel Road as a pupil of the surgeon Richard Grindall and studied obstetrics at Mackenzie’s School of Midwifery.

Flinders received his certificate in the “theory and practise of midwifery” on 16 July 1770. It was signed by David Orme, a man-midwife at the City of London Lying-in Hospital who supplemented his income by teaching midwifery at Mackenzie’s school. When Mackenzie died, Orme bought the school’s collection of simulators and anatomical preparations in order to continue the school. The collection must have been extensive, because Orme paid 1000 guineas for it, equivalent to around $250 000 today.4

Life-saving interventions were a point of differentiation between the man-midwife and the midwife, and justified the higher fees charged by the man-midwife. The fees alone did not reflect the time and disruption caused by attending a labour, but obstetrics became an important part of 18th century general practice because of the connection it created between families and the practitioner. However, failure in the form of a maternal death could quickly destroy a reputation, and simulation-based training on the management of difficult births was highly valued.5 We know from Flinders’ diaries that his training on simulators meant he was able to use obstetric forceps, hooks and the crotchet when necessary.6 Fortunately, there were no complications when Flinders delivered his first son in 1774, whom he also called Matthew. It was expected that the boy would become an apothecary like his father and grandfather.

Early obstetric simulators

Obstetric simulators were first developed at the beginning of the 18th century, and two distinct types emerged: one was based on a skeletal pelvis, and the other had the appearance of a female torso. The pelvis simulator was used with a fetal mannequin or preserved fetus cadaver to demonstrate the usual passage of the fetus through the birth canal and the consequences and management of other fetal presentations. Sometimes a leather (or later rubber) uterus was placed in the pelvis so that some procedures could be practised. The torso simulator was used with a fetal mannequin or preserved fetus cadaver to practise delivery and obstetric interventions.

Natural mannequins

In the 18th century, preserved fetal cadavers were often used in obstetric training. In the 19th century, deceased parturients were frequently used as “natural mannequins”. Johann Lucas Boër, the first director of obstetrics at the Vienna General Hospital who had trained in London, used a pelvis simulator for teaching. In 1823 he was replaced by the more progressive Johann Klein, who quickly introduced teaching through anatomical pathology.

The subjects for operation were the bodies of women who had died in the hospital, most likely in the lying-in division, because recent parturition would improve them as “material”. The abdomen was opened and the pelvic viscera removed by way of preparation, so as to make room for the foetus. After the foetal corpse had been placed into position, it was held by the teacher and the pupil proceeded to perform the operations of version, decapitation, etc. as required.7

Under Boër the maternal mortality from puerperal fever was around 1%, but under Klein it was frequently over 20%.8 One inquiry into what caused the high rate of puerperal fever at the hospital concluded it was the result of large numbers of foreign medical students.9 After thousands of maternal deaths, the relationship between dissection and puerperal fever was discovered by one of Klein’s assistants, Ignaz Semmelweis. Klein was not impressed and ensured that Semmelweis’s contract at the hospital was not extended. After appeals to the university, Semmelweis was eventually offered a private lectureship, but only on the condition that he teach using mannequins.

Advances in simulator technology

Some 18th century torso simulators had a uterus made from glass so that changes in fetal position during birth and the effect of obstetric manoeuvres could be observed for teaching and assessment. While the fetus could be put in different positions or changed, the maternal simulator was generally fixed. In the second half of the 19th century, simulators that could be adjusted were developed. The Budin–Pinard simulator, for example, had a mechanism to change the shape of the pelvis.10 The original Schultze simulator used preserved female external genitalia and internal organs but it was later redesigned with rubber parts. Preserved fetal cadavers continued to be used well into the 20th century.11

Matthew Flinders: the apothecary’s son turned navy captain

In the 18th century it was usual for the sons of apothecaries to train as apothecaries. We know what the training involved from the writing of another apothecary, James Parkinson, a contemporary of Flinders senior who also studied under Richard Grindall.

The first four or five years are almost entirely appropriated to the compounding of medicines; the art of which, with every habit of exactness, might just as well be obtained in as many months. The remaining years of his apprenticeship bring with them the acquisition of the art of bleeding, of dressing a blister, and, for the completion of the climax — of exhibiting an enema.12

Later in life, Flinders junior recalled that when he read Robinson Crusoe he realised there would be more excitement at sea than in apothecary training. He joined the Royal Navy, against his father’s wishes, for a career that would change Australia.

Early obstetric simulators in Australia

Pelvis and torso simulators typical of those used in the 18th and 19th centuries can be seen in collections in Australia. A ceramic pelvis and mannequin thought to have come from a collection started by James Simpson in Edinburgh is on display in the medical library at Flinders University in Adelaide (Box 1). Simpson established a large collection of obstetric specimens and apparatus in the middle of the 19th century and documented this as part of his application for the Chair of Obstetrics at the University of Edinburgh.13 The fetal mannequin has facial features, cranial sutures and fingers and toes. This meant it could be used by students to identify fetal presentation by “touch” and to practise version and the Mauriceau–Smellie–Veit manoeuvre in breech delivery.

A female torso simulator and fetal mannequin are on display at the Powerhouse Museum in Sydney (Box 2). Several manufacturers made these models in the 19th century and although some were quite distinct, copies with minor modifications were common. Most simulators at this time were, like the Powerhouse model, covered with leather for durability. The fetus could be put in the abdominal cavity in any position to demonstrate causes of prolonged labour or covered with a blanket for students to practise normal delivery, version, use of the perforator, and other procedures. Reed’s book, Operative obstetrics on the manikin for students and practitioners, shows how this was done.14

Also in the Powerhouse Museum collection is a glazed ceramic obstetric simulator designed by Douglas to be easy to clean after being used with fetal cadavers (Box 3).11 Its purpose is not instantly recognisable from its form, and Douglas noted that this meant it did not need to be put away or covered when not being used. The cost of consumables for this simulator was kept low by fashioning the pelvic floor and external genitalia from rubber balls.11

1 Ceramic pelvis and fetal doll on display at Flinders University

2 Obstetric simulator and fetal mannequin on display at the Powerhouse Museum in Sydney

3 Twentieth century ceramic obstetric simulator for use with preserved fetal cadavers*

* The original versions were made by Mayer and Phelps, but this one does not have a maker’s mark so it could be a copy.

Only the best: medical student selection in Australia

To the Editor: I share Mahar’s concern regarding any future screening of prospective medical students for signs that they are likely to develop mental or physical impairment.1 Although Wilson and colleagues do acknowledge that screening may not be ethical, their apparent conflation of the likelihood of illness with impaired long-term ability to practise is a separate and problematic issue.2 Mental illnesses, particularly depression and anxiety, are common in medical students.3

Even in the context of medical schools’ “fitness to practise” procedures, which Wilson et al consider more practical, it is important that criteria and processes for removal of students are not so broad that they can be applied selectively. Further, the tendency for the medical mind to seek to eliminate personified risk factors of future problems should be resisted; further research may not be “paramount”.2 It is not the possibility or probability of developing later illness that matters (dark actions wait at the end of that uncertain path) — individuals should be judged on the acts that demonstrate potential harm to patients.

Students and society will benefit if the attitude of schools to students suffering illness is one of compassionate support — many students’ conditions may improve. We should also not neglect the current weakness of our own “postmarketing surveillance” — our management of colleagues whose behaviour is far from ideal. We struggle with the fact that they, unlike students, are cloaked with the tribal defence of Fellowship, though this should not change our judgement of their actions — as it certainly does not change their harmful consequences.

Riding the waves of change

Change — in health care, its systems and community need — is one of the few certainties in medicine. A past Dean of Harvard Medical School, Sydney Burwell, put it this way in the 1950s: “My students are dismayed when I say to them, ‘Half of what you are taught as medical students will in 10 years have been shown to be wrong. And the trouble is, none of your teachers know which half.’” (BMJ 1956; 2: 113-116). A broader question is: do we ride out change or ride with it?

Many contributions in this issue of the MJA highlight not only change but also the associated challenges, constructive debates needed and hard decisions to be made as medicine and health care evolve.

An obvious, pressing development is the steady increase in medical graduate numbers, which are now double what they were in 2006. This surge means that a 400-place shortfall in first postgraduate-year training positions is forecast within 4 years. These “waves” of future graduates face questions about when, where, how and even if they will complete their junior medical officer year to become fully registered practitioners. Kevat and Lander (doi: 10.5694/mja12.10967) are concerned that the states’ “priority system” for selecting interns discriminates against interstate applicants, including those trying to return to their home state. Highlighting the Australian Capital Territory graduates now considered interstate applicants by the New South Wales system, they argue that this system contravenes the Australian Constitution. It is an issue that may be resolved not by the health system but in a court of law.

More medical students and graduates mean more competition for clinical experience as well as training positions. The new national registration standard will allow greater flexibility in obtaining the requisite clinical experience during internship. For example, Gosbell and colleagues (doi: 10.5694/mja13.10176) say that the new standard will allow emergency medicine rotations to be done outside of emergency departments, including in some general practice settings. Although access to placements may improve, they say the accompanying national accreditation framework must prevent any dilution of clinical experience.

The quality and extent of students’ clinical experiences may affect their later careers. As de Costa and Rane (doi: 10.5694/mja13.10109) discuss, greater student numbers and the demands of other newer disciplines in medical courses mean that not all medical schools require their students to perform normal deliveries in obstetric rotations. What would the Australian community think if it was generally known that some of our doctors may be graduating without the experience of at least assisting in uncomplicated labour? And what of interns’ confidence levels if they are required to manage labour in regional and rural rotations?

One might wonder whether core clinical experience is being overridden by the introduction of new subjects to medical education. But at least one of these curricular developments may be truly needed, owing in part to the increasingly international orientation of medical schools and their students. As Law and colleagues (doi: 10.5694/mja12.11463) report, around one in four medical students undertake overseas electives in developing countries, with attendant personal risks and educational benefits; accordingly, briefings before and debriefings after such terms need to be scaled up. Mitchell and colleagues (doi: 10.5694/mja12.11611) advocate formal postgraduate global health training, in line with North American courses, focusing on international health equity and fieldwork.

Change in any aspect of the medical profession and health care inevitably raises the perennial question: what is the purpose of medical education? Today, as it was 50 years ago, there is no clear, single answer. But continuing renewal and adaptation to medicine’s evolving circumstances would seem necessary. The question today is how we can continue to adapt successfully to the changing tides.

Should Australian medical students deliver babies?

Sharing the experience of normal vaginal birth is an important formative experience for medical students

Hearing or knowing “I’ve done this before” would, no doubt, be more comforting to both the labouring woman and the doctor than “Well, I’ve delivered three plastic babies — with a good deal of KY jelly — how much harder can it be?”

In late 2012, around 3500 new graduates emerged from Australia’s medical schools,1 more than twice the number that graduated in 2006.2 Whether all these young doctors will obtain intern places and future training posts has been the subject of robust discussion. Less attention seems to have been paid to the quality and extent of their clinical experience as students.3–5

Until the late 20th century, all medical students were required to perform at least some normal (uncomplicated) deliveries, under the watchful eyes (and, at times, the hands) of senior midwives. However, these requirements have been reduced owing to increased student numbers, increased competition from student midwives for attending births, decreased birth rates and increased caesarean section rates, competing demands from the many new disciplines now included within the medical course, and a greater say by women themselves in what happens to them during labour and birth. A 2008 survey of the 19 Australian medical schools showed that 10 schools had mandatory requirements for students to perform between one and four normal deliveries during their time in obstetric clinical rotations.2 However, the remaining schools (some of which were in the process of developing their courses) all emphasised the importance of their students spending mandated and productive time observing labour, birth and other related birth-suite activities.

At James Cook University School of Medicine, we aim to prepare a proportion of our graduates for rural practice and have, since the inception of our Reproductive and Neonatal Health course in 2004, required our clinical students to witness three births and “perform” two. Our experience over the past 10 years has confirmed our belief that the requirement of compulsory deliveries — which also includes the student remaining with the woman and her support persons during labour, birth and the immediate postpartum period — sends a clear message to students and staff that delivering a baby under the skilled direction of a midwife is an important step towards becoming a doctor.

As second year interns, many of our graduates may find themselves rotated to small rural Queensland hospitals that no longer have maternity units but which, nevertheless, may be the immediate port of call for women who were booked elsewhere but unexpectedly go into labour. Of course, a week in a birth suite (including two normal deliveries) does not an obstetrician make; but it does give students some familiarity with the sights, sounds and smells of childbirth, which will be helpful in such situations. It will also help them to understand what is normal, what is not normal and also how quickly the normal can become abnormal.

Yet there is much more to it all than that. To explore the feelings of students, we asked our 2012 graduating students to write about their own experiences. As one recent graduate recounted:

On my last day of medical school I was asked what has been the best experience I had had as a student. I replied “My first time assisting in a delivery”.

This is an opinion that other students have echoed. Spending time with a labouring woman and her partner, and participating in what is usually an amazing and joyful event, is something that cannot be replicated by plastic models or videos. As another student wrote:

I went home that afternoon with goose bumps. To have assisted a mother to bring a beautiful life into the world was remarkable. This was the most memorable day of my entire medical school.

Such experiences will be vital if we are to continue recruiting competent junior doctors into our specialty training. Trainees should personally experience normal deliveries long before they commence registrar training. Some of our students have come to realise that they would like to make a career of obstetrics — as a specialist or general practitioner — or to work in the area of women’s health. One wrote:

It was in my birth suite week that my plans to become a gastroenterologist took a U-turn and I began to appreciate the field of obstetrics and gynaecology.

But even for students who will never practise obstetrics or see another baby born, the experience is part of the important process of becoming a competent and caring medical practitioner.

For all these reasons, we will continue to ensure that our medical students deliver babies during their clinical terms. We believe that it would be a great tragedy if, after years of study, our future graduates were to have their medical diplomas pressed into their hands by a vice-chancellor without having participated in the miracle of normal birth.

Should hospitals have intensivist consultants in-house 24 hours a day? – Yes

Onsite intensivist support is needed to improve clinical decisions and safety

An intensive care unit (ICU) is only as good as the care and decision making provided at 2 am. If we believe that intensivists really make a difference to patient outcomes, surely extended hours of onsite intensivist cover are necessary? A patient-centred approach to staffing that takes into account the potential for human error is needed. Most Australian ICUs are staffed after hours by registrars, some of whom are not vocational trainees, and their experience and clinical skill vary. Onsite intensivist support tends to be concentrated throughout the day, with the on-call specialists often required to be onsite for 12 hours or more and on call overnight. Challenges exist in providing uniform levels of clinical expertise around the clock to ICU patients while maintaining a healthy and safe work routine for clinicians.

The ICU is a complex operating environment that requires high-risk decision making day and night. Early work on errors in the ICU emphasised adverse incidents; current research concentrates on diagnostic error. A recent systematic review of autopsy studies on ICU patients found a substantial incidence of critical misdiagnoses, including vascular events and infections.1 These missed diagnoses included pulmonary embolus, myocardial infarction, pneumonia and aspergillosis. Perhaps extended onsite intensivist cover would help reduce misdiagnoses?

Intensive care services in acute care hospitals are not only provided within increasingly large ICUs (30 or more beds is not uncommon); many ICUs also provide rapid response services to the wards. Night duty is associated with an increased risk of error because it coincides with the circadian nadir of medical staff and with mild-to-moderate sleep deprivation. A study that examined sleep patterns in a tertiary Australasian ICU found that many registrars were sleep deprived while working on night duty (45% had woken before 16:00 and 48% had less than 5 hours’ sleep before shifts).2 It has been shown that even a modest sleep deficit can impair waking neurobehavioural functions.3

A recent study examined cognitive errors in the ICU and reviewed current research on dual process theory in relation to diagnostic error.4 In essence there are considered to be two types of clinical thinking: pattern recognition (intuitive thinking) and analytical thinking. An experienced clinician mainly uses intuitive thinking, and only uses analytical reasoning when encountering a new situation. Clinical reasoning is often influenced by cognitive bias. Many such biases have been described, including confirmation bias (selecting information to confirm the diagnosis), the anchoring heuristic (relying on an initial impression despite subsequent information) and framing effects (judgements biased by the way information is presented).

It follows that intuitive thinking is where most cognitive error occurs. Individuals with sleep deprivation and task saturation are more likely to revert to intuitive thinking, which requires less effort than thinking analytically.

It can thus be argued that what is needed is an environment that promotes optimal decision making 24 hours a day. Specialists working extended days and on-call overnight to support junior onsite medical staff is not optimal. While all clinicians will be subject to the pressures of night duty outlined earlier, ICUs need a senior clinician who is awake and immediately available.

There have been arguments for and against intensivist staffing of ICUs after hours, with no clear resolution.5 Those opposing 24-hour intensivist staffing have made arguments on the basis of no discernible difference in outcome, intensivist lifestyle and burnout, the need for registrars to have a degree of autonomy in their training, and cost. There are practical difficulties in moving to this system, including night duty fatigue and clinical handover. Importantly, it requires a shift from continuity of care provided by individuals to one of system-based continuity. Market forces may eventually drive change towards 24-hour in-house specialist staffing. Increasing numbers of trained specialists and a limited pool of specialist positions have the potential to decrease the demand for intensive care training. Another problem is the smaller ICUs, where 24-hour specialist cover is impractical — although the remote telemedicine model with 24-hour intensivist supervision of multiple ICUs may be the answer here.

Hospitals have a duty to provide safe care. Ideally there should be a specialist awake and available to the ICU at all times. This is a major change in intensivist work practices. Evening shift rostering for intensivists may provide a transition to safer cover for ICUs as well as optimising clinician work routines. Most tertiary hospitals now have specialist anaesthetists and emergency physicians working evening shifts. It might be naive to think that intensive care, which is so closely affiliated to these acute care specialties, should be different.

The need for quality and quantity in emergency medical care rotations for interns

To the Editor: A national registration standard for internship will apply from 2014.1 In addition to 10-week rotations in medicine and surgery, interns will need to obtain 8 weeks’ experience in emergency medical care. A national framework for the accreditation of intern training programs is also being developed.2

The new standard allows emergency medical care rotations outside of emergency departments (EDs), including selected general practices. Although primary care settings can facilitate valuable training, there is limited evidence that a community placement can effectively substitute for an emergency medicine term.

Emergency medicine terms expose interns to a broad range of acute undifferentiated illness not often encountered in other rotations.3 These terms also facilitate acquisition of key skills and knowledge, including the ability to prioritise under time pressure, recognise “sick” and “well” patients, perform common procedures and interact with other health care team members.3,4 EDs are the most appropriate setting for this generalist medical experience.

Rigorous assessment of interns is an important but under-recognised capability of the emergency medicine experience. A recent study showed that ED-based terms are crucial in detecting underperformance,5 which probably relates to the proximity of supervision, the requirement for interns to act as primary treating clinician and the necessity for decision making. EDs may be the only setting where interns’ clinical skills are directly observed.

Supervisory capacity may limit provision of ED-based experiences4 and, with expanding graduate numbers,6 demand will increase further. Although emergency medicine terms in alternative settings may improve access to placements, the accreditation framework2 must protect against any dilution of clinical experience. Guidelines must define minimum standards for training opportunities, supervision and assessment, not just casemix.

Solutions that sustainably increase ED training capacity should be supported, including innovative models of supervision. Structured teaching and simulation also have roles.4 The More Learning for Interns in Emergency (MoLIE) program, for example, increases training capacity and simultaneously enhances the educational experience.7

Australia must continue to support learning in emergency medicine by sufficiently resourcing EDs to deliver high-quality teaching and training to interns, and the unique elements of emergency medicine rotations must be protected by robust accreditation standards.

An audit of dabigatran etexilate prescribing in Victoria

To the Editor: In 2009, the RE-LY (Randomized Evaluation of Long-Term Anticoagulant Therapy) trial compared dabigatran etexilate with warfarin for prevention of stroke and systemic embolism in 18 113 patients with non-valvular atrial fibrillation (AF) and at least one additional risk factor for stroke.1 In April 2011, dabigatran became accessible in Australia (Pradaxa; Boehringer Ingelheim) under a product familiarisation program funded by the manufacturer; however, the Pharmaceutical Benefits Advisory Committee expressed concern that without informed and appropriate prescription, clinical trial outcomes may not be reproducible.2 Case reports from Europe3 and New Zealand4 identified the need for caution when treating older patients, those with low body weight or patients with renal impairment because of the risk of serious bleeding.

We performed a retrospective audit of the available criteria of indication, renal function and time in therapeutic range (TTR) of 362 patients at a private anticoagulant clinic who were transferred from warfarin to dabigatran between 1 June 2011 and 30 November 2011. Patients recorded as having AF were presumed to have non-valvular heart disease, although this was not confirmed. The dose of dabigatran, the CHADS2 score (congestive heart failure, hypertension, age ≥ 75 years, diabetes, 1 point each; prior stroke or transient ischaemic attack, 2 points), the weight of the patient, and the patient versus clinician preference for therapy were not known to pathology service staff.
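
For illustration only, the CHADS2 definition quoted above can be expressed as a short calculation. The sketch below is ours (the function name and inputs are hypothetical and were not part of the audit); it simply encodes the scoring rule given in the text.

# Illustrative sketch: CHADS2 score as defined above. Not part of the audit;
# the function name and inputs are hypothetical.
def chads2_score(heart_failure, hypertension, age, diabetes, prior_stroke_or_tia):
    # 1 point each for congestive heart failure, hypertension, age >= 75 years
    # and diabetes; 2 points for prior stroke or transient ischaemic attack.
    score = 0
    if heart_failure:
        score += 1
    if hypertension:
        score += 1
    if age >= 75:
        score += 1
    if diabetes:
        score += 1
    if prior_stroke_or_tia:
        score += 2
    return score

# Example: a 78-year-old with hypertension and a prior TIA scores 1 + 1 + 2 = 4.
print(chads2_score(heart_failure=False, hypertension=True, age=78,
                   diabetes=False, prior_stroke_or_tia=True))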

The patients in our cohort were older (mean, 76 years) than participants in the RE-LY trial (mean, 71 years). Fewer of our patients had significant renal impairment (14% had an estimated glomerular filtration rate [eGFR] < 50 mL/min/1.73 m2, versus 19% of RE-LY participants), but 2% had an eGFR < 30 mL/min/1.73 m2, and 12% had not had a renal function assessment in the past 12 months. Twenty-nine patients (8%) did not meet the indication of having non-valvular AF.

A RE-LY subanalysis suggested that dabigatran had no advantage over warfarin in reducing non-haemorrhagic stroke or death in patients with excellent international normalised ratio (INR) control (defined as TTR > 72.6%).5 In our cohort, TTR was assessed over a minimum of 6 months before patients were switched from warfarin to dabigatran, and the mean TTR (70%) was higher than in RE-LY (64%). In addition, one-third of our patients had TTR ≥ 79%, indicating high-quality anticoagulant control.
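
The letter does not state how TTR was calculated; a commonly used approach is Rosendaal linear interpolation between successive INR measurements. The sketch below illustrates that general method with invented INR values and an assumed therapeutic range of 2.0–3.0; it is not the clinic’s actual procedure.

# Illustrative sketch: time in therapeutic range (TTR) by Rosendaal-style linear
# interpolation. The audit's actual TTR method is not stated; values are invented.
from datetime import date

def ttr_linear_interpolation(measurements, low=2.0, high=3.0):
    # measurements: list of (date, INR) pairs sorted by date.
    # Interpolate INR linearly for each day between successive measurements and
    # return the percentage of interpolated days with INR within [low, high].
    in_range = 0
    total = 0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        for day in range(span):
            inr = inr0 + (inr1 - inr0) * day / span
            total += 1
            if low <= inr <= high:
                in_range += 1
    return 100.0 * in_range / total if total else 0.0

# Invented example spanning about 6 weeks of INR monitoring.
inrs = [(date(2011, 6, 1), 1.8), (date(2011, 6, 15), 2.4),
        (date(2011, 7, 1), 3.3), (date(2011, 7, 15), 2.6)]
print(f"TTR = {ttr_linear_interpolation(inrs):.0f}%")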

Our study confirms the prescription of dabigatran to a local population that differed from RE-LY participants in age, renal function and TTR, and occasionally for an unapproved indication. Although we cannot comment on clinical consequences, care should be taken in extrapolating positive outcomes to non-equivalent patient groups, and continuing education campaigns are needed.