Safety management
Medical applications of industrial safety science
T W van der Schaaf

Correspondence to: Dr T W van der Schaaf, Associate Professor of Human Factors in Risk Control, Eindhoven University of Technology, Safety Management Group, PAV U-8, P O Box 513, 5600 MB Eindhoven, The Netherlands; t.w.v.d.schaaf@tm.tue.nl


Healthcare workers could learn much from the engineering and civil aviation industries about safety management.

The medical community is becoming more open to learning safety lessons from other domains, but it should remain aware of the sometimes differing tasks and contexts. These same differences could also challenge experienced safety researchers to cross the boundary from industry and transportation into health care in order to contribute to the understanding, measurement, and enhancement of “patient safety”.

Over the last few decades safety science has developed rapidly thanks to the (frequently disaster-related) “challenges” in the areas of industry (chemical and nuclear) and transportation (civil aviation), where failures could simply no longer be tolerated by the public and government. These domains have risked their “licence to operate” on several occasions and concluded that their control of processes had to change at the system level. This sense of urgency is not yet widespread in the medical profession nor among patients. So how can the brave storm troopers from a limited number of medical sectors get the most out of the available experiences? This editorial attempts to summarise the main lessons available to healthcare workers.

PERSPECTIVES AND ATTITUDES

Probably the most valuable lesson that industry has learned is that safety management is more than buying and applying a set of tools and techniques: without the proper changes in culture, perspective, and attitude towards errors, failures, and their causes, introducing tools in the hope of a “quick fix” will largely miss the point. Safety must be recognised as a systems problem1 instead of the present “blaming and shaming” of individuals working at the sharp end. The focus of incident investigations must therefore be on the latent factors2 and not just on the immediate precursors and local triggers. These underlying factors are also present in “near misses”3; one really does not have to wait for an actual injury to a patient to discover the root causes and to address them proactively. Early discovery is much more likely when the risk assessment of a medical process and its countermeasures are based on a large database of incidents, instead of the all too frequent ad hoc “firefighting” of symptoms after each and every (usually major) mishap. This will require a bold change in the attitudes of healthcare managers and regulators alike.

The role of staff members must also be recognised—not just as a cause of adverse events but also as the strongest safety link in health care.3, 4 Many imperfections in the organisational and technical context in which health care is delivered are detected, understood, and corrected by staff before patients are harmed. This duality of the human component in health care allows us to address adverse events on two fronts: by preventing initial errors and failures, and by building timely and effective recovery opportunities into the system. Eventually, such perspectives and attitudes will consolidate into a proper safety culture.5

MODELS, TOOLS AND TECHNIQUES

Industry has operationalised safety culture and attitudes in a number of widely used models, tools, and techniques which can be subdivided into prospective, retrospective, and organisational learning approaches. Prospective approaches (aimed at predicting risk factors) are relatively new in health care. Failure mode and effects analysis (FMEA), for instance, widely used for decades in automotive engineering, was recently identified by the president of the JCAHO in the US as a critical tool for enhancing safety.6 There are also quantitative candidates from the industrial “error management tool box”, ranging from probabilistic risk assessment (PRA) in nuclear power to the HEART (human error assessment and reduction technique) approach. Retrospective approaches, which describe and analyse actual incidents into their root causes, have been around much longer. Reason's error model2 is the basis of a number of medical incident analysis tools.7
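To make the prospective approach concrete: a standard FMEA ranks each potential failure mode by a risk priority number (RPN), the product of severity, occurrence, and detectability scores. The sketch below illustrates this for a hypothetical medication process; the failure modes and scores are invented for illustration only, not drawn from any validated healthcare FMEA.

```python
# Minimal FMEA sketch: rank hypothetical failure modes of a medication
# process by risk priority number (RPN = severity x occurrence x
# detectability, each conventionally scored 1-10).
from dataclasses import dataclass


@dataclass
class FailureMode:
    description: str
    severity: int       # 1 (negligible harm) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (almost certain detection) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk priority number: the product of the three scores
        return self.severity * self.occurrence * self.detectability


# Invented example failure modes for a medication process
modes = [
    FailureMode("wrong dose calculated", severity=8, occurrence=4, detectability=6),
    FailureMode("drug given to wrong patient", severity=9, occurrence=2, detectability=3),
    FailureMode("allergy not checked", severity=10, occurrence=3, detectability=7),
]

# Countermeasures are targeted at the highest-RPN modes first
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.rpn:4d}  {m.description}")
```

The point of the exercise is not the arithmetic but the discipline it imposes: teams must score each failure mode before any harm occurs, which is precisely the proactive stance the editorial argues for.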

Finally, reports from the US Institute of Medicine8 and the UK Department of Health9 underscored the essential mechanisms for organisational learning and the value of event and “near miss” reporting mechanisms.10 These tools allow large databases to be created quickly, but they are also instruments for changing the medical culture by involving, and relying upon, all levels of staff to provide input through (voluntary) sharing of experiences at the “sharp end”. Such reporting systems depend on the trust of frontline workers, which makes them a potentially powerful driving force towards achieving a “just culture”.11
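The value of such a database lies in aggregation: individual reports are tagged with latent (root-cause) categories, and recurring system factors emerge from the counts rather than from any single mishap. A minimal sketch, with invented reports and an illustrative (not validated) cause taxonomy:

```python
# Hypothetical near-miss database sketch: each voluntary report is
# tagged with one or more latent cause categories; aggregating the
# tags shows which system factors recur most often.
from collections import Counter

# Invented example reports; category names are illustrative only
reports = [
    {"outcome": "near miss", "causes": ["look-alike packaging", "interruption"]},
    {"outcome": "near miss", "causes": ["interruption", "understaffing"]},
    {"outcome": "adverse event", "causes": ["look-alike packaging"]},
]

# Count how often each latent cause appears across all reports
cause_counts = Counter(c for r in reports for c in r["causes"])

for cause, n in cause_counts.most_common():
    print(f"{n}x  {cause}")
```

Note that two of the three reports here are near misses: the recurring latent factors surface without any patient having been harmed, which is exactly the proactive learning the reporting systems are meant to enable.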

CHALLENGING DIFFERENCES

Healthcare tasks and contexts differ from industry and transportation in a number of important ways which should lead to appropriate adjustments of the safety management tools and techniques.

Perhaps the most striking difference is the highly dynamic nature and lack of standardisation in health care.12 At any given time it is not always clear what the proper actions are, and even less so how these change over time and across contexts. Physicians pride themselves on practising the art of medicine and are reluctant to follow accepted guidelines, leading to large variation in practice. If this is true, it is rather disconcerting that the medical community looks predominantly to solutions developed by a domain which is possibly at the very other end of this scale—namely, civil aviation. Aviation owes its major leaps forward in safety mainly to the very rigid and consistent standardisation of its technology, tasks, procedures, and personnel. A unique feature of healthcare safety is that the patients themselves are key players, not only as the object of protection but also as an additional source of error. At times patients contribute to errors, but they can also help with the timely detection of errors—for instance, when the medication or treatment they receive differs from what they expected.13

It remains to be seen how useful tools and techniques developed to identify, analyse, and prevent errors by individuals will be in health care, where the bulk of patient care is delivered by teams; the emphasis on team awareness and team training found in other domains is still foreign to most healthcare workers. These differences, however, may be used to persuade safety researchers and practitioners to take an active interest in patient safety problems and solutions. Such a “win-win” situation for the medical and non-medical communities could overcome the traditional barriers to cooperation, thereby boosting patient safety through the rapid deployment of proven solutions that can be applied today.


REFERENCES
