The paper by Dr Martin Allnutt, a military aviation psychologist, is now 15 years old.1 That it was published in a mainstream anaesthesia journal that long ago reflects credit on the then Editor of the journal and his staff. For, so far as medicine was (and, regrettably, still is to a degree) concerned, this paper remains ahead of its time. Yet Dr Allnutt would be the first to point out that most of what he says could be regarded by psychologists as mainstream knowledge.2
There are so many concepts and messages in this paper that strike at the heart of error production and which are fundamental to patient safety improvements, that it must be regarded as required reading—by which I mean “understanding and acknowledging”—for all medical practitioners.
Perhaps the most powerful statement in this power packed paper is that it is “an absolutely basic tenet … that all human beings, without any exception whatsoever, make errors and that such errors are a completely normal and necessary part of human cognitive function. For a pilot or doctor to accept that he or she is as likely as anyone else to make a catastrophic error today is the first step towards prevention; whereas to claim exemption on the grounds of being a test pilot, senior professor, commanding officer or consultant, or of having 30 years’ experience or 3000 accident-free hours, is the first step on the road to disaster.”
Is there a single medical practitioner who can deny the penetrating accuracy of that statement? It compels the abandonment of the often unspoken but absurd precept still fluttering about inside medicine that “doctors never make mistakes”, and demands an understanding that errors are essential learning processes for all human beings. The paper pointed clearly to the need for a culture change in medicine which we are still struggling to achieve. Put simply, to deny one’s errors is both ridiculous and dangerous.
Why did medicine persist in denying its fallibility in the face of irrefutable psychological and clinical (not to mention common sense) evidence to the contrary? The reasons are neither complicated nor mysterious; they are simple, human ones.
There is an eloquent Chinese proverb which says:

Victory has many fathers; defeat is an orphan!
Doctors, like all other human beings, have been reluctant to admit they were wrong. Now let us be fair. Medicine has achieved much and will continue to do so. After all, medical practitioners have obviously quite often been right! But we are also wrong at times, in keeping with the rest of the human race, and, as Allnutt explains, such errors are fundamental to progress.
Historically, the profession has occupied a privileged place in communities. That age-old “mystery” surrounding healing (now added to in modern times by many equally “mysterious” technological advances), the relative scarcity of a university education, and the vulnerable position in which bad health places anyone led patients over the centuries to place in us their complete trust and respect. Sadly, albeit uncommonly, this trust has been abused. Like all fallible humans, we let this privileged position go to our heads. It resulted in some of us—present and past—adopting a patronising posture in which admission of ignorance or error was unthinkable. This “culture” began to be perpetuated, particularly among some senior members of the profession, and even influenced the attitudes of medical undergraduates. Absurdly, the patient was pushed into the background while we paraded around in a flood of self-importance, competing for attention and seniority. Poor patients became frightened even to question us about their own health and our decisions concerning their welfare. As for being honest and open with patients and relatives about our errors, “advice” from our medical insurers and our vanity made this uncommon.
But improvements are discernible. Importantly, patients are no longer willing to sit “dumb and accepting” of our pontifications. Society has woken up to our fallibility and, happily, it demands the truth from us now. To be fair, this emerging change of attitude has received some of its impetus from within the profession itself.3–5 The cynic may say that the heat of medicolegal scrutiny initiated this, but there are—and always were—good and honest medical practitioners. An attitude of openness and honesty with patients, relatives, and colleagues is now widely practised and advised.
Of course the present tort system in the courts, with its focus on outcome and the need to assign “fault” (often to an individual), tends to fly in the face of such advice.6 Having made an error, often while subject to complex and stressful demands imposed by a less than perfect system,7 and while actually trying to do our best, medical and nursing personnel have been encouraged to “admit nothing” by our medical insurance institutions and are then subject to a legal process that operates in this “blame” culture. As mentioned above, a desire not to lose face encourages such behaviour. This approach drives error admission and analysis underground, so that methods of preventing a recurrence are never considered. It also carries the potential for injustice, where an individual who forms the final pathway for an accident, thrust there by a flawed environmental system, takes the blame for slips and mistakes (for example, judgmental errors) made hours, days, or months earlier by other persons at the blunt managerial end. Furthermore, it is an ironic fact that keeping harmed patients and their relatives uninformed about an error militates powerfully towards their seeking legal redress.
Emphasising the rapid decay and complete unreliability of human memory over time—particularly when stressed—the paper firmly supports the concept of early and full reporting of incidents, near misses, and adverse events by those directly involved. Such incident reporting data, correctly amassed and analysed, are already making powerful contributions to patient safety improvements. Dr Allnutt’s implied support for this technique is hardly surprising, remembering that the original concept of “critical incident reporting” appeared 30 years earlier in a seminal paper by a pioneer military aviation psychologist.8
The insight into human cognitive processes provided by Dr Allnutt’s paper is fundamental to error prevention. Its messages should become part of medical education curricula at all levels of learning. Yet appreciable unawareness of these concepts persists among many medical practitioners—persons who could reasonably have been expected to be among the most informed in such matters. Dr Allnutt is telling us not only that “patterns of human error are identifiable, predictable, repetitive and lend themselves to classification and analysis”,1 but that they offer the very information that is the pathway to their prevention.
This is a classic paper for all interested in improving the safety of our patients, our colleagues, and ourselves.