Education And Debate

Five pitfalls in decisions about diagnosis and prescribing

BMJ 2005; 330 doi: https://doi.org/10.1136/bmj.330.7494.781 (Published 31 March 2005) Cite this as: BMJ 2005;330:781
  1. Jill G Klein (jill.klein{at}insead.edu), associate professor of marketing, INSEAD, 1 Ayer Rajah Avenue, 138676 Singapore
  • Accepted 1 February 2005

Everyone makes mistakes. But our reliance on cognitive processes prone to bias makes treatment errors more likely than we think

Introduction

Psychologists have studied the cognitive processes involved in decision making extensively and have identified many factors that lead people astray. Because doctors' decisions have profound effects on their patients' health, these decisions should be of the best possible quality. All doctors should therefore be aware of the pitfalls in medical decision making and take steps to avoid the unnecessary errors they cause. In this article, I present five examples of cognitive biases that can affect medical decision making and offer suggestions for avoiding them.

Psychology of decision making

Doctors often have to make rapid decisions, either because of a medical emergency or because they need to see many patients in a limited time. Psychologists have shown that rapid decision making is aided by heuristics—strategies that provide shortcuts to quick decisions—but they have also noted that these heuristics frequently mislead us.1 Good decision making is further impeded by the fact that we often fall prey to various cognitive biases.

To make correct decisions in clinical practice, doctors must first gather information on which to base their judgments. According to decision making experts Russo and Schoemaker,2 the best way to do this is to ask the most appropriate questions, to interpret the answers properly, and to know when to stop searching. Straightforward though this sounds, misleading heuristics and cognitive biases create pitfalls throughout the process.

Doctors may believe that, as highly trained professionals, they are immune to these pitfalls. Unfortunately, they are just as prone to errors in decision making as anyone else.3-5 Even worse, it is common for people who are particularly prone to cognitive biases to believe that they are good decision makers.2 As Shakespeare put it, “The fool doth think he is wise, but the wise man knows himself to be a fool.”w1 Studies based on both simulated cases and questionnaires show that doctors are susceptible to decision making biases,6 7 including insensitivity to known probabilities,7 overconfidence,w2 a failure to consider other options,w3 the attraction effect,w4 and the availability heuristic.w5 The good news is that training in these dangers can reduce the probability of flawed medical decision making.w6


[Figure: Cognitive biases can impair decision making. Credit: Zodie Hawkins/Trevillion]

Pitfall 1: the representativeness heuristic

The representativeness heuristic is the assumption that something that seems similar to other things in a certain category is itself a member of that category. Kahneman and Tversky showed this heuristic in a classic experiment in which they presented participants with descriptions of people who came from a fictitious group of 30 engineers and 70 lawyers (or vice versa).8 The participants then rated the probability that the person described was an engineer. Their judgments were much more affected by the extent to which the description corresponded to the stereotype of an engineer (for example, “Jack is conservative and careful”) than by base rate information (only 30% were engineers), showing that representativeness had a greater effect on the judgments than did knowledge of the probabilities.

The representativeness heuristic has also been shown in nursing. Nurses were given two fictitious scenarios of patients with symptoms suggestive of either a heart attack or a stroke and asked to provide a diagnosis.9 The heart attack scenario sometimes included the additional information that the patient had recently been dismissed from his job, and the stroke scenario sometimes included the information that the patient's breath smelt of alcohol. The additional information had a highly significant effect on the diagnosis and made it less likely—consistent with the representativeness heuristic—that the nurses would attribute the symptoms to a serious physical cause. The effect of the additional information was similar for both qualified and student nurses, suggesting that training had little effect on the extent to which heuristics influenced diagnostic decisions.

How can we avoid being led astray by the representativeness heuristic? The key is to be aware not only of the likelihood of a particular event (such as a stroke) given a piece of situational information (such as alcohol on the breath), but also of how likely the event is in the absence of that information. In other words, it is important to be aware of the base rates at which a condition occurs and to avoid giving too much weight to any one piece of information. By the same token, if a disease is extremely rare, it may still be an unlikely diagnosis even when a patient has the signs and symptoms of that disease, as the calculation below illustrates.
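
To make this concrete, here is a minimal worked example (in Python, with hypothetical numbers rather than real clinical data) applying Bayes' theorem to a rare disease whose typical presentation the patient happens to match:

```python
# Illustrative sketch with hypothetical numbers, not real clinical data.
prevalence = 0.001            # base rate: 1 in 1000 patients has the disease
p_signs_if_disease = 0.95     # the presentation fits the disease very well
p_signs_if_healthy = 0.05     # but the same signs also occur without it

# Bayes' theorem:
#   P(disease | signs) = P(signs | disease) * P(disease) / P(signs)
p_signs = (p_signs_if_disease * prevalence
           + p_signs_if_healthy * (1 - prevalence))
p_disease_given_signs = p_signs_if_disease * prevalence / p_signs

print(f"P(disease | typical signs) = {p_disease_given_signs:.3f}")  # ~0.019
```

Even though these signs are 19 times more likely with the disease than without it, the probability of the disease remains below 2%, because the low base rate dominates the calculation.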

Pitfall 2: the availability heuristic

When we use the availability heuristic, we place particular weight on examples of things that come to mind easily, perhaps because they are easily remembered or recently encountered. In general, this guides us in the right direction, as things that come to mind easily are likely to be common, but it may also mislead. The availability heuristic is apparent after a major train crash, when some people choose to travel by car instead of by rail, in the incorrect belief that it is safer.w7

In the medical setting, one study asked doctors to judge the probability that medical inpatients had bacteraemia. The probability was judged to be significantly higher when doctors had recent experience of caring for patients with bacteraemia.10 Another example is the documented tendency of doctors to overestimate the risk of addiction when prescribing opioid analgesics for pain relief and to undertreat severe pain as a result.11-13 w8-w11 Risk of addiction is actually low when patients receive opioids (particularly controlled release formulations) for pain,14 15 but opiate addiction tends to receive high publicity and so—through the availability heuristic—its likelihood may be overestimated.

To avoid falling prey to the availability heuristic, doctors should try to be aware of all the diverse factors that influence a decision or diagnosis. They should ask if their decision is influenced by any salient pieces of information and, if so, whether these pieces of information are truly representative or simply reflect recent or otherwise particularly memorable experiences. Knowing whether information is truly relevant, rather than simply easily available, is the key.
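
A simple simulation (hypothetical parameters only) shows why judging frequency from the handful of cases that come to mind most easily, rather than from the full record, produces unstable and potentially inflated estimates:

```python
import random

random.seed(1)
TRUE_RATE = 0.07  # hypothetical long-run rate of bacteraemia among inpatients

# 1000 patient outcomes drawn at the true rate
cases = [random.random() < TRUE_RATE for _ in range(1000)]

# An estimate based only on the 10 most recent (most "available") cases
# varies greatly from one window of experience to the next;
# the full record stays close to the true rate.
recent_estimate = sum(cases[-10:]) / 10
overall_estimate = sum(cases) / len(cases)

print(f"estimate from last 10 cases: {recent_estimate:.2f}")
print(f"estimate from full record  : {overall_estimate:.2f}")
```

A clinician who has just seen a chance cluster of bacteraemia cases is, in effect, reading off the volatile "last 10 cases" estimate rather than the stable long-run one.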

Rules for good decision making

  • Be aware of base rates

  • Consider whether data are truly relevant, rather than just salient

  • Seek reasons why your decisions may be wrong and entertain alternative hypotheses

  • Ask questions that would disprove, rather than confirm, your current hypothesis

  • Remember that you are wrong more often than you think

Pitfall 3: overconfidence

To use our knowledge effectively, we must be aware of its limitations. Unfortunately, most of us are poor at assessing the gaps in our knowledge, tending to overestimate both how much we know and how reliably we know it (see bmj.com for an example). Research has shown that almost all of us are more confident about our judgments than we should be. Since medical diagnoses typically involve some uncertainty, it follows that almost all doctors make more mistakes in diagnosis than they think they do. Overconfidence also comes into play when doctors rate their clinical skills. Larue et al found that both primary care doctors and medical oncologists rated their ability to manage pain highly, even though they actually had serious shortcomings in their attitudes toward and knowledge of pain control.16

The dangers of overconfidence are obvious. Doctors who overestimate how well they manage a condition may continue to prescribe suboptimal treatment, unaware that their management could be improved. Overconfidence in diagnostic ability may also lead to a diagnosis being made too hastily, when further tests are needed. It is critical, therefore, to be aware of the limits of your knowledge and to keep that knowledge up to date. Awareness of your shortcomings makes it more likely that you will gather further information. It can also be helpful to make a habit of seeking the opinions of colleagues.17
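
Calibration can also be audited directly. The sketch below (with invented records) compares stated diagnostic confidence against how often those diagnoses were later confirmed; overconfidence shows up as stated confidence consistently exceeding actual accuracy:

```python
# Hypothetical log of (stated confidence, diagnosis later confirmed?)
records = [
    (0.9, True), (0.9, False), (0.9, True), (0.9, False), (0.9, True),
    (0.7, True), (0.7, False), (0.7, False), (0.7, True), (0.7, False),
]

for level in sorted({conf for conf, _ in records}, reverse=True):
    outcomes = [ok for conf, ok in records if conf == level]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"stated confidence {level:.0%}: actually correct {accuracy:.0%}")

# With these invented records: 90% confidence -> 60% correct,
# 70% confidence -> 40% correct, i.e. systematic overconfidence.
```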

Pitfall 4: confirmatory bias

Confirmatory bias is the tendency to look for, notice, and remember information that fits with our pre-existing expectations. Similarly, information that contradicts those expectations may be ignored or dismissed as unimportant.1 2 Confirmatory bias has been shown to affect peer-reviewers' assessments of manuscripts. Mahoney sent fictitious manuscripts with identical methods but different results to reviewers.18 Reviewers gave significantly better ratings to the methods section when the results supported their pre-existing beliefs.

Once again, doctors are not immune to confirmatory bias. In taking medical histories, doctors often ask questions that solicit information confirming early judgments. Even worse, they may stop asking questions because they reach an early conclusion, thus failing to unearth key data. More generally, the interpretation of information obtained towards the end of a medical work-up might be biased by earlier judgments.19

The confirmatory bias can also lead to treatment errors. It is natural to expect that the drug you are about to administer is the correct drug. Apparently obvious information that you have the wrong drug—for example, a label marked ephedrine instead of the expected epinephrine—may be ignored or misinterpreted to confirm your expectation that the drug is correct.20

Summary points

Psychologists have extensively studied the cognitive processes involved in making decisions

Heuristics and biases that lead to poor decisions are widespread, even among doctors

Awareness of the cognitive processes used to make decisions can reduce the likelihood of poor decisions

Although the danger of confirmatory bias is greatest when making decisions about diagnosis, ongoing treatment decisions are also affected. It is thus critical to remain constantly vigilant for any information that may contradict your existing diagnosis, and to give any such information careful consideration, rather than dismissing it as irrelevant. It is also a good idea to try to think of specific reasons why your current theory might be wrong and to ask questions that could potentially disprove your hypothesis. Always be aware of alternative hypotheses and ask yourself whether they may be better than your current ideas.
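
One way to see why disconfirming questions pay off is to compare how much each possible finding could move your belief. In the hypothetical sketch below, a finding that is common under both the working diagnosis and its rival barely shifts the probability, whereas a finding that the rival makes unlikely shifts it substantially:

```python
def posterior(prior, p_if_diagnosis, p_if_rival):
    """Bayesian update of P(working diagnosis) after observing a finding."""
    numerator = p_if_diagnosis * prior
    return numerator / (numerator + p_if_rival * (1 - prior))

prior = 0.6  # current belief in the working diagnosis

# A "confirming" finding expected under BOTH hypotheses tells you little.
weak = posterior(prior, p_if_diagnosis=0.9, p_if_rival=0.8)

# A finding that the rival diagnosis makes unlikely is far more informative.
strong = posterior(prior, p_if_diagnosis=0.9, p_if_rival=0.1)

print(f"after a finding common to both : {weak:.2f}")    # ~0.63
print(f"after a discriminating finding : {strong:.2f}")  # ~0.93
```

Questions chosen merely to confirm tend to be of the first kind; questions that could disprove your hypothesis are of the second.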

Pitfall 5: illusory correlation

Illusory correlation is the tendency to perceive two events as causally related when in fact the connection between them is coincidental or even non-existent. (It overlaps with confirmatory bias in that we are more likely to notice causes that fit our pre-existing ideas.) Homoeopathy provides an excellent example of illusory correlation. Homoeopaths will often notice when patients improve after being treated with a homoeopathic remedy and claim this as evidence that homoeopathic treatment works. However, no convincing evidence exists that homoeopathic treatments are effective.w12 w13 Illusory correlation is probably at work: homoeopaths are likely to remember the occasions when their patients improve after treatment and to forget those when they do not.

Falling prey to illusory correlation can reinforce incorrect beliefs, which in turn can lead to the persistence of suboptimal practices. Ask yourself whether any instances do not fit with your assumed correlations. A straightforward way to do this is simply to keep written records of events that you believe to be correlated, making sure that all relevant instances are recorded.
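
Keeping complete records means filling in all four cells of the contingency table, not just the memorable "treated and improved" cell. A sketch with invented counts shows how an apparent correlation can disappear once every instance is counted:

```python
# Hypothetical record of 200 consultations (invented counts):
#               improved   did not improve
# treated           60            40
# not treated       60            40

p_improved_given_treated = 60 / (60 + 40)
p_improved_given_untreated = 60 / (60 + 40)

print(f"P(improved | treated)   = {p_improved_given_treated:.2f}")    # 0.60
print(f"P(improved | untreated) = {p_improved_given_untreated:.2f}")  # 0.60

# Improvement is equally likely either way, so the apparent association
# is illusory; remembering only the 60 treated-and-improved patients
# is what creates the impression that the treatment works.
```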

Conclusions

Doctors often have to make decisions quickly. However, the greatest obstacle to making correct decisions is seldom insufficient time but rather the distortions and biases in the way information is gathered and assimilated. Being aware that decisions can be biased is an important first step towards overcoming those biases. In real life, of course, biases may not fit neatly into any one of the categories described above but may result from a complex interaction of different factors, which increases the potential for poor decisions still further. The good news is that it is possible to train yourself to be vigilant for these errors and to improve your decision making as a result (box).

Acknowledgments

I thank Adam Jacobs of Dianthus Medical for help in preparing the manuscript.

Footnotes

  • References w1-w13 and an example of how we overestimate our knowledge are on bmj.com

  • Contributors and sources JGK has a PhD in social psychology and has spent much of her academic career conducting research on biases in impression formation. Sources cited in this article were derived from extensive searches of Medline and Embase.

  • Funding This paper was prepared with financial assistance from Janssen-Cilag.

  • Competing interests JGK has received speaking and consultancy fees from Janssen-Cilag, which manufacture an opioid analgesic patch.
