Dhaliwal's comment [1] on Zwaan et al [2] nicely refutes what has been called "the hypothesis of special cause" [3]: the notion that when things turn out wrong, the cognitive processes leading to that outcome must have been fundamentally different (i.e., error-prone) from when they turn out right. Dhaliwal's argument recapitulates thinking that is over 100 years old; one of the early contributors to psychology, Ernst Mach, wrote (in 1905): "Knowledge and error flow from the same mental source; only success can tell one from the other" [4].
What is interesting here is not that the hypothesis of special cause is wrong, but rather why it has been so popular and persistent. What is it about the notion of humans as fundamentally irrational, poor decision-makers that gives this idea such wide appeal? After all, broad acceptance of this sort is not the norm for most psychological or medical research; controversy, argument, or outright disbelief are much more common [5]. Christensen-Szalanski and Beach surveyed decision-making studies in psychology and reported that, although the studies' conclusions were roughly evenly divided between finding good or poor decision-making performance (56% vs 44%), studies reporting human performance as flawed were cited almost six times more frequently than those reporting it as good. Citations outside psychology journals were overwhelmingly used to advance the claim that people are poor decision-makers [5].
One reason for this strange popularity is that the people-are-irrational claim provides benefits for those who have rationality to sell: guideline authors, health care managers, and other proponents of scientific-bureaucratic medicine [6,7]. Another is that it paradoxically provides individual benefits: once we understand the clever puzzles of heuristics-and-biases problems, even in retrospect, we tend to feel that we must be pretty clever also. A final, and likely the strongest, influence is that it protects organizations and elites: attributing adverse events to flawed mental processes at the front lines serves as a kind of lightning rod, conducting the harmful consequences of bad outcomes down an organizationally safe pathway [8].
Unfortunately, the history of patient safety to date does not suggest that cautions such as Dhaliwal's will have much effect; such cautions have been raised and ignored before [9-12]. Patient safety's fixation on 'medical error' as the fundament of medical harm serves many (perhaps extraneous) purposes, but is based on an ontological will-o'-the-wisp [3,13,14]. Given general agreement on the meagre progress of the patient safety movement to date [15-18], a fundamental re-thinking of our basic premises and hidden assumptions is desperately needed if we are to move forward. And as with many fixations, a sea change of this sort is not likely to come from within the present patient safety movement, but must come from the outside [19,20]. We can only hope 'these barbarians' challenge us sooner rather than later [21].
1. Dhaliwal G. Premature closure? Not so fast. BMJ Quality & Safety 2016; online ahead of print (bmjqs-2016-005267).
2. Zwaan L, Monteiro S, Sherbino J, Ilgen J, Howey B, Norman G. Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups. BMJ Quality & Safety 2016.
3. Hollnagel E. Safety-I and Safety-II: The Past and Future of Safety Management. Farnham, UK: Ashgate; 2014, 187 pages.
4. Mach E. Knowledge and Error. Translated by Foulkes P, McCormack TJ. Dordrecht, Netherlands: Reidel Publishing Co; 1905 (English translation 1976), 393 pages.
5. Lopes LL. The Rhetoric of Irrationality. Theory & Psychology 1991;1(1):65-82.
6. Harrison S, Moran M, Wood B. Policy emergence and policy convergence: the case of 'scientific-bureaucratic medicine' in the United States and United Kingdom. The British Journal of Politics & International Relations 2002;4(1):1-24.
7. Wears RL, Hunte GS. Seeing patient safety 'Like a State'. Safety Science 2014;67:50-57.
8. Cook RI, Nemeth C. "Those found responsible have been sacked": some observations on the usefulness of error. Cogn Technol Work 2010;12(1):87-93.
9. Henriksen K, Kaplan H. Hindsight bias, outcome knowledge and adaptive learning. Qual Saf Health Care 2003;12(Suppl 2):ii46-ii50.
10. Dekker SWA. Patient Safety: A Human Factors Approach. Boca Raton, FL: CRC Press; 2011, 250 pages.
11. Hollnagel E. Does human error exist? In: Senders JW, Moray NP, eds. Human Error: Cause, Prediction, and Reduction. Hillsdale, NJ: Lawrence Erlbaum Associates; 1991: pp 153.
12. Wears RL. The error of chasing 'error'. Northeast Florida Medicine 2007;58(3):30-31.
13. Dekker SWA. Is it 1947 yet? http://www.safetydifferently.com/is-it-1947-yet/, accessed 19 May 2015.
14. Woods DD, Dekker SWA, Cook RI, Johannesen L, Sarter N. Behind Human Error. 2nd ed. Farnham, UK: Ashgate; 2010, 271 pages.
15. National Patient Safety Foundation. Free From Harm: Accelerating Patient Safety Improvement Fifteen Years after To Err Is Human. Cambridge, MA: National Patient Safety Foundation; 2015, http://www.npsf.org/custom_form.asp?id=03806127-74DF-40FB-A5F2-238D8BE6C24C, accessed 8 December 2015, 59 pages.
16. Pronovost PJ, Ravitz AD, Stoll RA, Kennedy SB. Transforming Patient Safety: A Sector-Wide Systems Approach: Report of the WISH Patient Safety Forum 2015. Qatar: World Innovation Summit for Health; 2015, http://dpnfts5nbrdps.cloudfront.net/app/media/1430, accessed 18 February 2015, 52 pages.
17. Baker GR, Black G. Beyond the Quick Fix. Toronto, ON: University of Toronto; 2015, http://ihpme.utoronto.ca/wp-content/uploads/2015/11/Beyond-the-Quick-Fix-Baker-2015.pdf, accessed 12 November 2015, 32 pages.
18. Illingworth J. Continuous improvement of patient safety: the case for change in the NHS. London, UK: The Health Foundation; 2015, http://www.health.org.uk/sites/default/files/ContinuousImprovementPatientSafety.pdf, accessed 12 November 2015, 40 pages.
19. De Keyser V, Woods DD. Fixation Errors: Failures to Revise Situation Assessment in Dynamic and Risky Systems. In: Colombo AG, de Bustamante AS, eds. Systems Reliability Assessment: Springer Netherlands; 1990: pp 231-251.
20. Woods DD, Cook RI. Perspectives on human error: hindsight biases and local rationality. In: Durso FT, Nickerson RS, Schvaneveldt RW, et al., eds. Handbook of Applied Cognition. 1st ed. New York, NY: John Wiley & Sons; 1999: pp 141-171.
21. Cavafy C. Waiting for the Barbarians. http://www.cavafy.com/poems/content.asp?id=119&cat=1, accessed 6 March 2014.