The systems approach to medicine: controversy and misconceptions
  1. Sidney W A Dekker1,
  2. Nancy G Leveson2
  1. 1Safety Science Innovation Lab, Griffith University, HUM, Nathan Campus, Queensland, Australia
  2. 2Aeronautics and Astronautics and Engineering Systems Division, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA
  1. Correspondence to Professor Sidney W A Dekker, Safety Science Innovation Lab, Griffith University, HUM, N16 Macrossan Building, 170 Kessels Road, Nathan Campus, QLD 4111, Australia; s.dekker@griffith.edu.au

Abstract

The ‘systems approach’ to patient safety in healthcare has recently led to questions about its ethics and practical utility. In this viewpoint, we clarify the systems approach by examining two popular misunderstandings of it: (1) that it amounts to the systematisation and standardisation of practice, which reduces actor autonomy; and (2) that it seeks explanations for success and failure outside of individual people. We argue that both giving people a procedure to follow and blaming the system when things go wrong misconstrue the systems approach.

  • Quality improvement
  • Patient safety
  • Human error
  • Human factors
  • Complexity

Introduction

The ‘systems approach’ to patient safety has recently led to questions about its ethics and practical utility. Levitt, a retired neurosurgeon, for example, wrote that ‘the medical profession has put its faith in a systems approach to the problem … [a] so-called solution that doesn't address the problem’.1 These arguments stem in part from misunderstanding the systems approach as (1) equating to the standardisation of practice and a reduction of individual autonomy (eg, creating more rules, policies and compliance demands)2 and (2) blaming the system rather than holding people accountable.3–5 Neither characterisation captures the essence of the systems approach as practised in industries that have used it to increase safety to extremely high levels, such as commercial aviation. Here we briefly explain the systems approach in the broader ways it has been applied elsewhere, and then reflect on the questions of standardisation and accountability it has generated in healthcare.

What is a system and a systems approach?

A system, such as a hospital, is a dynamic and complex whole, interacting as a structured functional unit to achieve goals (eg, treating patients). One system may be nested within another system—for example, a hospital is nested within a larger healthcare system; an intensive care unit exists inside a hospital. The behaviour of a system reflects the linkages and interactions among the components that make up the entire system. All medicine is practised within a system. The behaviour of the components or entities that exist within that system is influenced by the system design and structure, such as remuneration schemes, time and financial pressures, the accuracy of available information about the patient or the procedure being performed, and much more. These system design factors can help or hinder medical professionals in doing their job. While it is laudable that professionals accept responsibility for their actions, it is unrealistic to believe that their behaviour is not affected by the context in which it occurs. We can influence behaviour by careful design of the structure and incentives of the systems in which it occurs.

Reducing the systems approach to following a checklist or standardised procedure trivialises what can be accomplished by careful system design. Checklists, protocols and other devices that aim to streamline practice and reduce variation play a role in a number of safety-critical fields. The goal of a systems approach, however, is not to reduce human behaviour to rule-following, but to design a system in which individual responsibility and competence can effectively help create desired outcomes. The usefulness of standardised responses depends on the thinking and engineering that went into the system design, as well as on the human ingenuity involved in selecting, applying and even modifying standard responses. Procedures or checklists per se do not reduce harm. Mistakes in using checklists in aviation, for example, do not directly produce catastrophe because of the careful engineering and design that preceded operational use. At the same time, human resilience fills the gap between work-as-imagined and work-as-done: autonomy is maintained for a variety of processes (eg, how and when to configure an airliner for landing—within certain parameters), and in many situations checklists are not useful because of time constraints or decision ambiguity.

Thus, standardisation, or giving people a procedure to follow, does not constitute a systems approach, and advice given to hospitals or medical specialties to that effect should not be taken at face value. Claiming that a systems approach doesn't work because standardisation doesn't always work is equivalent to prescribing a treatment of limited efficacy for a particular disease and then concluding that the disease is untreatable and that a more powerful and comprehensive treatment regimen would be no more effective.

Does a systems approach conflict with personal accountability?

The systems approach argues that a flawed hospital system, rather than flawed individuals, is responsible for patient harm.6 Some then invert this, suggesting that a systems approach entails just blaming the system, not the individual. This critique seems to be more prevalent in medicine than in safety-critical industries that more freely acknowledge and engineer against human fallibility.7

But a systems approach does not eschew individual responsibility and accountability. First, the rate at which healthcare produces ‘second victims’ compared with other domains shows just how much individual accountability its practitioners assume.8 Second, in a system, each component has specific responsibilities that help attain the system's ultimate goals. While surgeons, for instance, have and take responsibility for performing surgery safely and effectively, others have a responsibility to ensure that required resources are available. A New Zealand surgeon, for example, was criminally prosecuted for a number of deaths of patients in his care. What received scant attention was that he was forced to operate with help from medical students, because of a lack of available competent assistance.9 Prosecuting the surgeon, who had little control over the context in which he worked, did not solve the problem. After all, that context would have similarly affected most people practising surgery in the same environment.

Blame is the enemy of safety.10 Emphasising blame and punishment results in hiding errors and eliminates the possibility of learning from them. So-called ‘just culture’ programmes and systems have been effective in aviation by encouraging the reporting of errors so that steps can be taken to reduce them—or their consequences.11 A just culture can also fairly adjudicate how to respond to undesired practice, particularly when it is made clear who gets to determine the response, and when those persons are familiar with the messy details of practice.12 Such things are consistent with a systems approach, which, after all, considers error to be reducible through processes, procedures, training, and system design, including the design of the incentive structure around practitioners. Similarly, the management of (in)competence can be seen as a system issue, by carefully looking at training, selection, continuing development, and life-long competency checking. In aviation, individual competence is taken as a system responsibility—too important to leave the retaining, refreshing and checking of it to an individual professional.13 Structures are in place to oversee and eliminate incompetent practice, instead of leaving its discovery and management only to individual moral valence. This can be made more effective in medicine too, so that patients can be protected.14, 15

Some years ago, Atul Gawande published a reflection on an emergency tracheotomy he bungled. Gawande concluded that ‘although the odds were against me, it wasn't as if I had no chance of succeeding. Good doctoring is all about making the most of the hand you're dealt, and I failed to do so’.3 But, while good doctoring may be making the most of the hand one is dealt, the systems approach has always been about providing a better hand in order to improve the opportunity to do the right thing. Merely accepting the hand as dealt and banking on personal virtue to do the rest is both practically and ethically irresponsible.

References

Footnotes

  • Contributors The authors contributed equally with ideas, method and writing of text.

  • Competing interests None.

  • Provenance and peer review Not commissioned; externally peer reviewed.