FOR THOSE CONDEMNED TO LIVE IN THE FUTURE
Department of Psychology, Wright State University, Dayton, OH 45435, USA
“In situations where information is limited and indeterminate, occasional surprises—and resulting failures—are inevitable.
It is both unfair and self-defeating to castigate decision makers who have erred in fallible systems without admitting to
that fallibility and doing something to improve the system”.3 (page 298)
A common goal of many of the people concerned with the “error problem” in medicine is ultimately to improve the system. There is, however, great debate about the best strategy for accomplishing this goal. The extreme poles in this debate might be caricatured as the error elimination strategy1 and the safety management strategy.2 The error elimination strategy relies heavily on hindsight: it tries to reconstruct the history of events in order to identify the “causes” of errors, on the belief that systematically eliminating those causes makes the system progressively safer. The safety management strategy relies more on foresight: it tries to integrate past experiences to better understand the evolving work ecology, including anticipating the functional constraints that shape the opportunities and risks associated with work, and the information that might best specify those constraints to decision makers. Proponents of this approach believe that making the relevant constraints more salient to decision makers is the most promising direction for increasing safety.
Fischhoff’s work3,4 on the “hindsight bias” suggests that either strategy is vulnerable to errors, and re-reading this important work should provide …