
COMMENTARY
  1. J Williamson1,
  2. P Barach2
  1. Specialist Consultant, Australian Patient Safety Foundation, GPO Box 400, Adelaide 5001, South Australia; john.williamson@apsf.net.au
  2. Associate Professor, Department of Anesthesiology; Medical Director of Quality and Safety, Jackson Memorial Hospital, and Director, Miami Center for Patient Safety, University of Miami, USA


    UNDERSTANDING OURSELVES IN THE HEALTHCARE SYSTEM: PSYCHOLOGICAL INSIGHTS

    Many people working in health care know very little about the human and organisational precursors of error. But, as technology advances and both workloads and complexity in health care increase, the risk of error and adverse patient outcome grows. In the face of these trends, public expectations of health care are rising and tolerance of error is diminishing. The paper by Professor James Reason, although focusing on anaesthetic mishaps, contains generic information that should now be considered a required part of the undergraduate and postgraduate medical curricula.

    Learning from others

    Health care has long been characterised by “silo” thinking! All around it in the community, other professions and occupations—such as aviation, nuclear power plant operation, military command, fire prevention, and rescue organisations—have developed and employed successful safety measures that are directly applicable to many healthcare activities. Until recently, medical workers took little notice, but we are now learning via the psychology pipeline. For example, some disciplines—led by dentistry,1 nursing,2 and anaesthesia3—have already made effective use of the powerful technique of incident reporting and analysis.4

    From such data has come a dawning realisation that “latent errors” and “system failures” play a significant role in producing healthcare error, rather than simply a person erring at “the sharp end”.5,6 As Reason has pointed out, blaming such an individual implies delinquency, which is usually dealt with by measures carrying a “shaming” flavour. This has no remedial value (it is often actually counterproductive) at the level of the individual, who “… did not choose to err in the first place”. (It may be observed here that such understanding appears still to lag within some legal circles, despite centuries of their own evidence.)

    Signs of a culture change

    Hand in hand with these insights is the gradual culture change occurring in health care, which is learning to admit that human error is inevitable, universal and “… a completely normal and necessary part of human cognitive function”.7 Incident, near miss, and adverse event data are now being meaningfully gathered. This is leading to the development and application of effective preventive strategies—that is, strategies derived from real world data—particularly against such “system” or “organisational” failures. The result has been some real improvements in patient and staff safety.8–10 Such appropriate use of incident data has been referred to as “closing the loop”.

    The need for humility

    The earlier absence from medical and nursing curricula of patient safety as a discrete topic, and of any instruction in human error psychology, bred generations of medical and nursing personnel lacking insight into such matters. This “latent knowledge-based mistake” has since combined with several “sharp end” triggers, including vast technological advances, large increases in the healthcare workload, greatly increased public expectations and, sadly, a previously perpetuated (albeit ridiculous) culture in medicine of “doctor infallibility”. This combination has contributed to the awful iatrogenic injury (and “near miss”) rates with which we are presently struggling.11

    Happily, such insights are bringing with them a long overdue reduction in medical arrogance, an increase in humility and, perhaps most importantly, the recognised need for early, full, and open communication with affected patients, their families, and their carers following any healthcare error. (This last trend, hardly surprisingly, carries profound potential for diminishing patient and family dissatisfaction, and their tendency to sue, when an error occurs.)

    Of course these medical self-criticisms (mea culpa) may exhibit some degree of “hindsight bias”, for one is now aware of the “outcome”—the unacceptably high rates of iatrogenic harm—despite most medical and nursing folk having generally tried their hardest over the years to do the right thing. Back then we were “armed only with foresight”! Many healthcare workers (and patients and their relatives) knew things were not right, but they did not know what to do about it, other than to apportion individual blame. Again in Reason’s words, “… It usually takes someone else, with a fresh view of the situation, to detect a deviation from some adequate path”.

    A “classic” paper

    This paper, in a nutshell, condenses what every healthcare worker should understand about errors and their prevention. This must include the people within the higher echelons of the healthcare system—politicians, administrators, planners, lawyers and senior consultants. The paper carries generic truths that will be as valid in 100 years’ time as they were in 1995 and are today. This is because the focus is on how humans basically think and behave. It will indeed be a long time before there are any changes in these attributes!

    REFERENCES