© The Financial Times Ltd 2016
October 30, 2010 12:33 am
Every British doctor will have attended at least one lecture on patient safety in which the NHS is compared to the airline industry. In my first hospital job, a retired pilot came to speak to the junior doctors. He described the list of checks he had to carry out before he was allowed to take off in a Boeing 747. The list was lengthy and explicit and more thorough than the kind of thoughts most of us were having before writing drug scripts, for example. Prescribing paracetamol, which junior doctors may do 20 or 30 times a week, often involves little more self-interrogation than: Is this the right drug chart? Are they allergic to this? In fact there are at least two more questions, even for this relatively innocuous drug, that should be asked but are often skipped: Is this patient on anything else that contains paracetamol? And, do they have paracetamol prescribed already – perhaps on another chart?
The jumbo jet theme recurred on a training day a few years later. In the morning we rehearsed staged medical emergencies; in the afternoon we sat and listened to a black-box recording in which a pilot and his co-pilot misunderstood one another regarding a crucial detail of altitude. The plane crashed. Medicine can learn from the aviation industry, the pilot said; we can teach you the importance of good communication, of explicit checklists, of standardisation.
The government has long believed this to be true. In 2000 the then chief medical officer, Liam Donaldson, wrote the NHS patient safety manifesto, “An Organisation with a Memory”, in which aviation figures as a central theme: “Plane crashes are not usually caused by pilot error per se but by an amalgam of … factors which predispose to human error or worsen its consequences … Experience and research from other sectors, in particular the airline industry, show the impact of human error can be reduced.” The National Patient Safety Agency models much of its work on aviation protocols, gathering stories of medical “near misses” in order to analyse them and publish recommendations for change.
It makes good sense, and yet, when you are actually at work, it is worth remembering the many ways in which a hospital does not resemble an aircraft. Medicine has to be one of the few professions in which you can start a new job at 10 o’clock at night, in a building that you have never seen before. You have probably never met the people you’re going to work with, and you may not see them again once your shift has ended. You will receive a verbal handover from your outgoing colleagues; this will take place in a room filled with discarded scrubs, sandwich wrappers and bloodstained surgical clogs.
Both medicine and the airline industry follow strict protocols in the case of disaster – in medicine, it is the algorithm-driven “crash call” that dictates what to do when someone’s heart stops beating. But when the crash bleep goes off at three in the morning in a building you’ve only occupied for five-and-a-half hours, and a voice tells you to attend a cardiac arrest on 34b, it may take some time to even find the disaster before you can go and apply a protocol to it. And Boeing 747 cockpits are not, presumably, all laid out in different ways, whereas the equipment needed for a cardiac arrest – a trolley containing a defibrillator, drugs, breathing equipment and syringes – may live anywhere on a ward, its position determined by nothing more logical than local custom and available space.
Sophie Harrison is a hospital doctor in South Yorkshire