Qual Saf Health Care 12:311-312 doi:10.1136/qhc.12.4.311
  • Classic paper

FOR THOSE CONDEMNED TO LIVE IN THE FUTURE

  J M Flach
  Department of Psychology, Wright State University, Dayton, OH 45435, USA; john.flach@wright.edu

      “In situations where information is limited and indeterminate, occasional surprises—and resulting failures—are inevitable. It is both unfair and self-defeating to castigate decision makers who have erred in fallible systems without admitting to that fallibility and doing something to improve the system”.3 (page 298)

      A common goal of many people concerned with the “error problem” in medicine is ultimately to improve the system. However, there is a great debate about the best strategy for accomplishing this goal. The extreme poles in this debate might be caricatured as the error elimination strategy1 and the safety management strategy.2 The error elimination strategy tends to rely heavily on hindsight. This strategy tries to reconstruct the history of events in order to identify the “causes” of the errors. It is believed that, by systematically eliminating the causes of error, the system is made progressively safer. The safety management strategy tends to rely more on foresight. This strategy tries to integrate past experiences to better understand the evolving work ecology. This includes trying to anticipate the functional constraints that shape the opportunities and risks associated with work and the information that might best specify those constraints to decision makers. Proponents of this approach believe that making the relevant constraints more salient to decision makers is the most promising direction for increasing safety.

      Fischhoff’s work3,4 on the “hindsight bias” suggests that either strategy is vulnerable to errors, and re-reading this important work should provide a healthy dose of humility for people on both sides of the debate. Woods et al5 summarized this vulnerability as follows:

      “Given knowledge of outcome, reviewers will tend to simplify the problem-solving situation that was actually faced by the practitioner. The dilemmas, the uncertainties, the tradeoffs, the attentional demands, and double binds faced by practitioners may be missed or under-emphasized when an incident is viewed in hindsight. . . . Possessing knowledge of the outcome, because of the hindsight bias, trivializes the situation confronting the practitioners and makes the correct choice seem crystal clear.” (pages 7–8)

      The error elimination strategy is particularly vulnerable because this approach depends on the ability to accurately reconstruct the past in order to identify causal chains. In fact, Fischhoff4 suggests that the very notion of “causality” may be a symptom of hindsight bias in which an “outcome seems a more or less inevitable outgrowth of the reinterpreted situation” in light of hindsight (page 343). This is one reason why the safety management strategy prefers to focus on “constraints” rather than “causes”. However, the safety management strategy is not immune: judgments about the natural salience of information are themselves affected by hindsight bias, so the salience of relevant information may be overestimated in the light of hindsight. Despite this clear danger, I tend to believe that the safety management strategy, while fallible, provides the best way forward for improving system safety.

      One lesson of Fischhoff’s work is that the human memory system is not designed to reconstruct the past accurately (as is explicitly assumed by much memory research, which measures memory solely in terms of its accuracy in recalling past events), but rather to adapt to the future. This adaptation involves “making sense” of the past in order to better anticipate the future. This is clearly not a perfect system:

      “‘Making sense’ out of what we are told about the past is, in turn, so natural that we may be unaware that outcome knowledge has had any effect on us. Even if we are aware of there having been an effect, we may still be unaware of exactly what it was. In trying to reconstruct our foresightful state of mind, we will remain anchored in our hindsightful perspective, leaving the reported outcome too likely looking.”3 (page 343)

      However, the “biases” in judging the past may have positive as well as negative implications when projected into the future. Consider the quote shown in box 1, taken from the analysis by Dominguez6 of the conversion decision in laparoscopic cholecystectomy. Perhaps the error was an inevitable result of the uncertainties associated with surgery. And perhaps this surgeon is castigating himself too severely, given the inevitability of errors in this type of environment. However, I suspect that this surgeon will move into the future with a greater awareness of the potential for danger and that he will be a much better (safer) surgeon as a consequence. In this sense, the adaptation process may involve a migration toward the boundaries of safety. The consequence of crossing a boundary (an error) may be an overcorrection in favor of caution (in this sense it is a bias—incorrectly feeling that he should have seen the error coming). While clearly not optimal in a statistical sense, this may lead to a system that satisfices in favor of safety! Such a system—one that errs in the direction of caution—may well be more likely to survive in an uncertain world than one that optimizes around a particular history [that might reflect both real and imagined (luck) constraints].

      Box 1 Quote from Dominguez6

      “I would be trying to shoot an intraoperative cholangiogram before I’d go ahead and clip that but then again that’s just my own bias from my own previous experience from having a ductal injury. In that girl, [she] had a fairly acute disease, wasn’t quite as bad looking as this but everything was fine until 5 days post-op when she came back to the office just still puking her guts out. And I’d just destroyed her hepatic duct, her common hepatic duct, because I hadn’t realized where we were and that was an error on my part and I had been fooled by the size of her cystic duct. The stone, it had been a good size stone, it had worked its way down chronically to the cystic duct enlarging it so that it looked like the infundibulum of the gallbladder and then at the common duct junction I thought the common duct was the cystic duct so I went ahead and clipped it, divided and then started cauterizing. Well when you cauterize up through there you’ve got the hepatic duct line right behind it and I eventually burned that part. If you talk to any other surgeons who’ve had that kind of an injury, I mean I lost sleep for several nights over that. It’s one of those things that haunt you and you hate it, you just hate it.”

      Fischhoff’s research suggests that it is impossible to reconstruct the past accurately. Any approach to the medical error problem that depends on an accurate reconstruction of the past is therefore doomed to fail. However, it is also important to note that the past is only an imperfect predictor of the future. Even a perfect memory of past events will not allow an unambiguous projection of the future. A system that studies the past with an eye to the future, coupled with a healthy dose of humility and caution, may therefore provide the best path forward. Today, the safety management strategy reflected in the cognitive systems engineering approach2,7 offers the best hope for a medical system that is destined to live in the future.

      REFERENCES
