You wouldn’t immediately think that airline pilots and hospital consultants, particularly surgeons, have much in common. Yet both are in positions of power and control, both are ‘authority figures’ with ‘God-like personas’. There is one very large difference: the airline industry today operates a ‘just culture’, a ‘no blame culture’. Is this the case in medicine? This was the subject of a recent BBC radio documentary, here.
Aeroplane disasters make for sensational headlines; there is often great loss of life, and afterwards a very thorough inquiry. Three disasters are classics in the industry.
In 1977, two jumbo jets collided at Tenerife, killing 583 people. The captain of one plane believed he had been cleared for take-off; other members of his flight crew doubted that they were. During the take-off run, his plane hit the other on the runway. The captain was a very senior pilot, very much a ‘Sir’. He was wrong; the junior staff were right.
In 1978, there was a malfunction in the landing gear instrumentation of a plane approaching Portland in the US. The wheels were fully down, but the micro-switches that should have confirmed this to the pilots had malfunctioned. While the pilots faffed about wondering how to rectify a fault that wasn’t there, the flight engineer repeatedly told them that the plane was running out of fuel; he was ignored, and the plane crashed.
In 1989, on a British Midland flight from London to Belfast, one of the plane’s two engines malfunctioned. The pilots believed, wrongly, that it was the engine on the right-hand side, and they shut that engine down. The cabin crew and passengers could see that the left-hand engine was on fire. The plane crashed near Kegworth.
These three classic aviation disasters all had human error as the basic cause. Four main types of error were identified:
- Deference to authority
- Information overload
- Distraction
- Communication problems
In 2005, Elaine Bromiley was admitted to hospital for a minor procedure on her nose. What happened next is described by her husband, Martin Bromiley, here; it’s a rather harrowing video. (There’s a much more technical description, here.) Note that the nurses produced a tracheostomy kit off their own bat while the consultants faffed around, in circumstances alarmingly similar to the Portland crash. In that crash, the crew concentrated on one problem, the landing gear, and ignored the fact that they were running out of fuel. In Mrs Bromiley’s case, the consultants concentrated on achieving an airway, ignoring her dangerously low oxygen levels, which should have prompted an alternative course of action, as the nurses tried to suggest. ATLS (Advanced Trauma Life Support) courses in the UK and the Republic of Ireland have for the past 25 years taught participants how to do a cricothyroidotomy; all recently appointed consultants and juniors in anaesthesia and surgery must have completed such a course.
Martin Bromiley is a commercial airline pilot; he recognised that the circumstances surrounding his wife’s death were caused by human error, and he knew the classic cases in the aviation industry. He saw a learning opportunity, not a litigation opportunity. The New Statesman has an excellent article about him and the problem of human error, here; it’s long but well worth reading. He has since gone on to found the Clinical Human Factors Group, here.
The classical response in the NHS to such situations has been blame rather than learning: the involvement of reams of lawyers, litigation extending over years, and, in the case of whistleblowers, gagging clauses. There was an ‘accident’, therefore someone was at fault. I have put ‘accident’ in scare quotes because ‘accidents’ don’t just happen; they are caused, they are not ‘Acts of God’. And so often the cause is human error; it’s not wilful, it’s not deliberate, it’s not arrogance, it’s a failure of cognition and of comprehension, a failure of understanding in difficult circumstances; errare humanum est. We are all liable to it, for we are all human; but we can learn how best to mitigate it.

Today, there are checklists before an operation is begun; for example, surgeons are expected to mark the site of the proposed operation themselves with an indelible marker. And if someone in the operating theatre suspects a problem, there are ‘trigger words’ that they can use:
- I am concerned
- I am uncomfortable
- This is unsafe
- We need to stop
These words and phrases are very difficult to ignore.
The bigger problem is the organisational culture within which any of us works.
There are four responses open to those working in an unpleasant culture: to accept it, to change themselves, to change the system, or to leave. Changing the culture of an organisation is very difficult; entrenched ideas and ways of working are hard to overcome, and turning ‘blame’ into ‘learning’ and into a ‘just culture’ is a major challenge. So often it takes a disaster to expose the problems in an industry; but the airline industry learned the real lessons from its disasters, and changed its culture in response.
Does your organisation have a blame culture? Is it judgemental? Are you congratulated if you make a mistake and recognise it? And if not, what are you doing about it?

Robert Campbell is a retired surgeon.