
Analysis

Errors in clinical reasoning: causes and remedial strategies

BMJ 2009; 338 doi: https://doi.org/10.1136/bmj.b1860 (Published 08 June 2009) Cite this as: BMJ 2009;338:b1860
Ian A Scott, associate professor of medicine
Department of Internal Medicine and Clinical Epidemiology, Princess Alexandra Hospital, Ipswich Road, Brisbane, Australia 4102
Correspondence to: I A Scott ian_scott{at}health.qld.gov.au
  • Accepted 4 January 2009

Everyone makes mistakes, but greater awareness of the causes would help clinicians to avoid many of them, as Ian Scott explains

Most errors in clinical reasoning are not due to incompetence or inadequate knowledge but to frailty of human thinking under conditions of complexity, uncertainty, and pressure of time. To minimise such cognitive error we need to understand its prevalence and causes. In this article I discuss why errors occur and describe strategies that may help avoid them.

Prevalence of reasoning error

The first step to optimal care is making the correct diagnosis, which is missed or delayed in between 5% and 14% of acute hospital admissions.1 2 Autopsy studies confirm diagnostic error rates of 10-20%,3 4 with autopsy disclosing previously undiagnosed problems in up to 25% of cases.3 Even if the diagnosis is correct, up to 45% of patients with acute or chronic medical conditions do not receive recommended evidence based care,5 while between 20% and 30% of administered investigations and drugs are potentially unnecessary.6 Clinicians are sometimes less willing to adopt new beneficial interventions than to abandon old ineffective ones.7

The extent to which these deficits relate directly to reasoning error by clinicians, rather than environmental determinants beyond their control, remains uncertain, although recent studies of adverse events in hospital patients may give some indication.8 9 A third of the identified adverse events involved errors of execution (slips, lapses, or oversights in carrying out appropriate management in correctly diagnosed patients), but almost half involved errors of reasoning or decision quality (failure to elicit, synthesise, decide, or act on clinical information). Such reasoning errors led to death or permanent disability in at least 25% of cases, and at least three quarters were deemed highly preventable.9

Of some concern is the discrepancy between prevalence of reasoning error and clinicians’ appreciation of the scale and causes of the problem. For example, no more than 10% of clinicians admit, when asked, to any error in diagnosis over the past year,10 and 40% of diagnoses about which clinicians were certain were proved wrong at autopsy.11 Clinicians often stay wedded to an incorrect diagnosis, even if the correct one is suggested by colleagues or by decision support tools.12 In terms of management, no correlation exists between actual rates of guideline concordance and how closely clinicians perceive themselves as adhering to guidelines.13 14 Being an older and presumably more experienced clinician also does not guarantee better quality care w1 or lower risk of reasoning error.w2

Cognitive psychology of clinical reasoning

More research has focused on diagnostic reasoning than on management (or treatment) reasoning,w3 but the cognitive psychology of both shares common properties. Diagnosis begins with acquisition of data through history taking and clinical examination. Clinicians use these data, almost subconsciously, to frame or contextualise the patient’s problem as a clinically meaningful representation. They then use various semantic or abstract linkages w4 to transform individual clinical findings into coherent clinical syndromes or schemes that then inductively trigger one or more diagnostic ideas. For many previously encountered problems, experienced clinicians then proceed in a non-analytical fashion, relying on pattern recognition—selecting the best match from a large mental library of example cases.w5 The diagnosis is then verified quickly through a small number of confirmatory inquiries.

In more novel, ambiguous, or complex situations, clinicians switch to a more analytical mode of reasoning. Several diagnostic ideas are iteratively tested by slower, deliberate, and selective gathering of additional data that, by a process of deduction, narrows the list of possibilities towards the provisional diagnosis.w6 Diagnosis is then verified according to whether the observed natural course, results of investigations, or initial response to treatment corresponds to what is expected for the assumed diagnosis. Novice clinicians may complement this method with other types of reasoning based on pathophysiological principles or algorithmic approaches.w7

Management reasoning can be similarly conceptualised. A diagnosis will lead the clinician to frame or contextualise basic management goals with the aim of controlling symptoms, avoiding clinical complications, or simply reassuring and monitoring the patient. For common, straightforward diagnoses for which there are well known effective treatments, the appropriate management options will come quickly and intuitively from imprinted care patterns or “mindlines.”w8 But if the disease is unfamiliar to the clinician or there are competing treatment risks or comorbidities, selection of management options will proceed more analytically, with explicit weighing up of the pros and cons of different treatments in light of the patient’s circumstances and preferences.w9 w10

Sources of cognitive error

Irrespective of whether diagnosis or management is the focus, or whether analytical or non-analytical reasoning modes predominate, all decision making is vulnerable to different forms of cognitive and affective (emotional) bias or error.w11-w13 With the benefit of hindsight, clinicians will offer various explanations for wrong decisions (box 1),w13 w14 many of which relate to embedded ways of thinking, including the use of mental heuristics (maxims, shortcuts, rules of thumb). These heuristics are very efficient and accurate in many situations (box 2) but can sometimes predispose to wrong decisions.15

Box 1 Commonly stated explanations for decision errors

Errors in diagnosis
  • It (the correct diagnosis) never crossed my mind

  • I paid too much attention to one finding, especially laboratory results

  • I did not listen enough to the patient’s story

  • I was in too much of a hurry

  • I didn’t know enough about the disease

  • I let the consultant or specialist convince me

  • I didn’t reassess the situation when things didn’t fit

  • The patient had too many problems at once

  • I was overly influenced by a similar case

  • I failed to convince the patient to have further investigations

  • I was in denial of an upsetting diagnosis

Errors in management
  • The treatment seemed to work well on the last patient who had the same problem

  • Most of my colleagues were very keen on this new drug so I used it too

  • I was too concerned about possible side effects and underestimated the potential benefit of treating with drug x.

  • I thought I should be seen to be doing something, even though I knew the treatment had little chance of success

  • I had so many therapeutic options to choose from, and as I wasn’t sure which one would work best, I stuck with the one I was most familiar with

  • I did not fully appreciate how difficult it would be for the patient to stick to my advice

  • I wasn’t as aggressive as I should have been in treating this patient’s hypertension and hyperlipidaemia as I didn’t appreciate just how high his risk was of an adverse outcome

Box 2 Commonly used heuristics

  • If it looks like a duck, sounds like a duck, and walks like a duck, it is a duck

  • Common conditions occur commonly (including their atypical variants): “If you hear hoof beats, don’t think zebras”

  • Look for a single diagnosis that can explain all the findings (Occam’s razor)

  • Favour a diagnosis (or choose diagnostic investigations) that explains the clinical findings (or are most likely to verify the diagnosis)—go where the money is (Sutton’s law)

  • The best medicine may be to do nothing—first do no harm

  • Treat the patient, not the numbers

Other forms of bias can be internal to the clinician (such as value bias based on the clinician’s beliefs and values,w15 expectation bias based on what the clinician expects of the patient-doctor relationship,w16 agency bias in which clinicians put their interests ahead of those of the patient,w17 and affective bias arising from clinicians’ emotions and personality w18), or external (such as social bias contingent on past professional socialisation and influence of peer opinion,w19 and externality bias due to constraints of time, resources, and skill w20). Also highly relevant is the presence or absence of ill health, fatigue, interruptions, and time pressure, which can blunt attention span and fracture cognitive integrity.w21 The successful decision maker has to reconcile these at times dissonant internal and external worlds and select the most appropriate form of reasoning for the decision requirements at the time.

Types and examples of reasoning error

More than 40 forms of cognitive error have now been described,16 and several texts and articles explore these in depth using narrative case studies.w22-w24 Tables 1 and 2 define the commonest errors in diagnostic w25-w30 and management w9 w10 w31-w37 reasoning and provide examples. Many error types are inter-related, and more than one can feature in a patient’s care. Importantly, deficiencies in medical knowledge are rarely responsible for diagnostic errors, with premature acceptance of the most favoured diagnosis being highly prevalent (up to 90%) and independent of level of expertise.w29 Similarly, cognitive resistance to altering past habits and mindsets has a much more prominent role than ignorance in errors of management reasoning.w37 w38

Table 1 Cognitive errors in diagnostic reasoning and debiasing strategies (table not reproduced here)

Table 2 Cognitive errors and debiasing strategies for management reasoning (table not reproduced here)

Strategies for preventing reasoning error

At the system level several interventions can improve decision quality:

  • Good training and ongoing professional development programmes that expand clinical expertise, using both didactic and experiential teaching

  • Collegiate ethos of seeking second opinions and advice without fear of ridicule

  • Educational outreach by respected and seasoned peers w39

  • Clinical decision support systems that remind and prompt clinicians to consider evidence based recommendations and clinical decision rules w40

  • Robust handover and information systems providing seamless transmission of patient data and clinician reasoning from one individual clinician or team to another w41

  • Feedback in the form of clinical audits, mortality and morbidity reviews, and sentinel event analyses in which causes and consequences of faulty decisions can be discussed openly and dispassionately.w42

Feedback is especially important given observations that clinicians are not good at assessing their own performance.w43

At the level of the individual clinician, maintaining continuity of care with individual patients over the long term ensures awareness of past mistakes that take time to emerge.w44 Another strategy is for clinicians to develop an understanding of basic error theory and skills in meta-cognition—that is, thinking about their thinking.w45-w47 The ability to critique your own reasoning, particularly in circumstances where error is more likely to occur (situational awareness), and the ability to activate thought processes that make decisions less susceptible to bias and error (debiasing strategies) are valuable skills. Croskerry has proposed cognitive approaches that can be generic (being familiar with major types of reasoning error and the debiasing strategies that may be applied) and specific (being aware of specific clinical scenarios in which classic errors are more predictably made and selecting the most appropriate debiasing strategy in response).15

Common clinical scenarios associated with increased risk of diagnostic error include back pain in the presence of known malignancy, wherein anchoring may cause osteoarthritis and other common causes of mechanical back pain to be considered ahead of metastatic spinal disease; and patients with dyspnoea, raised jugular venous pressure, and hypotension, for whom systolic heart failure is prematurely accepted as the diagnosis when pulmonary thromboembolism and cardiac tamponade can present with similar features. Similarly, for management error, patients with atrial fibrillation may have anticoagulant therapy withheld because of overestimation of bleeding risk (omission bias, regret bias) or the inconvenience to the patient or doctor of long term monitoring (contextual error, clinical inertia); and patients with end stage heart failure or lung disease may receive inappropriately aggressive treatment (commission bias) when end of life supportive care may be more suitable. Box 3 gives further examples.

Box 3 Common scenarios associated with a high likelihood of reasoning error

Diagnostic reasoning
  • Seemingly minor and non-specific infection in immunocompromised patients—Anchoring bias, confirmation bias, and premature closure may lead to diagnosis of common, uncomplicated viral or bacterial infections without adequate consideration and exclusion of more sinister opportunistic infections that can present with protean manifestations

  • Mimics of acute stroke in patients with odd neurological signs—Anchoring bias and premature closure may result in acceptance of acute stroke in situations where diagnoses of demyelinating disorders such as multiple sclerosis or inflammatory disorders such as meningoencephalitis are more likely to account for atypical presentations

  • Temporal arteritis in patients with atypical headaches—Anchoring bias and premature closure may cause muscle contraction or tension headaches to be diagnosed when more sinister diseases such as temporal arteritis need to be considered

  • Coeliac disease in patients diagnosed as having eating disorders or benign dyspepsia—Anchoring bias and premature closure may lead to diagnosis of eating disorders or dyspepsia in patients presenting with non-specific abdominal pain or bloating whereas a more open mind, coupled with more probing questions relating to weight loss and bowel symptoms, may prompt consideration of alternative diagnoses such as coeliac disease

  • Inflammatory bowel disease in patients with irritable colon symptoms—Representativeness heuristic, anchoring bias, and confirmation bias may lead to irritable colon being diagnosed in a patient with a several month history of constipation alternating with diarrhoea but who also exhibits weight loss, lower abdominal pain, and mild fever—features suggesting more sinister disease such as Crohn’s disease

  • Thyroid disease in elderly patients—Anchoring bias and premature closure may cause non-specific features of lethargy and altered mentation to be attributed to depression, dementia, or old age when increased or decreased levels of thyroxine are responsible.

Management reasoning
  • Assessing and minimising perioperative cardiac risk—Framing effects may lead to overestimation of the absolute risk of perioperative cardiac events if clinicians fail to use validated tools in predicting risk (such as the revised cardiac risk index; a minimal illustrative calculation follows this box). Extrapolation error, commission bias, and outcome bias may also lead to inappropriate use of β blockers in patients who have little to gain and indeed could be harmed by such therapy

  • Venous thromboembolism prophylaxis—Framing effects also apply in this example: hospital medical patients at low risk of venous thromboembolism (<1%) may receive anticoagulants while surgical patients at much higher risk (>20%) do not, because of a perceived (but false) risk of postoperative bleeding and underappreciation of the risk of thromboembolism

  • Management of transient ischaemic attack—Framing effects and clinical inertia may lead to complacency in the management of patients presenting with transient ischaemic attack, the aim of which is to prevent future stroke, a risk that is up to 15% at 30 days in high risk patients

  • Management of poorly controlled type 2 diabetes—Omission bias and clinical inertia may lead to unjustifiably conservative blood sugar targets in patients with longstanding disease and absence of overt diabetic complications, especially if accompanied by past treatment resistance and seemingly adequate control of other vascular risk factors

  • Management of hypertension, falls, delirium, and dementia in elderly people—Omission bias (unsure of intervention effectiveness), regret bias (no wish to harm), contextual errors (failure to elicit patient/carer preferences), and clinical inertia may stop clinicians from aggressive management or prevention of geriatric syndromes despite the existence of treatments and simple strategies with potential benefit

  • Management of first unprovoked seizures in adults—Commission bias (wanting to prevent further seizures), multiple alternative bias (not sure whether to observe, undertake extensive investigations, or empirically start anticonvulsants), and contextual factors (not wanting to label the patient epileptic, which carries occupational and insurance liabilities) may all conspire to cause clinicians to treat such patients in varied ways, even though studies indicate that a first unprovoked seizure associated with a normal neurological examination, computed tomography of the head, and electroencephalogram is unlikely to recur and, if it does, will recur within the next few months
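
The perioperative cardiac risk item above mentions the revised cardiac risk index as an example of a validated prediction tool. As a minimal illustrative sketch (not a clinical tool), the Python snippet below shows how such an index might be scored; the six predictors are those of the original Lee index, and the quoted complication rates are approximate figures from its validation cohort.

```python
# Minimal sketch: scoring the revised cardiac risk index (Lee index).
# The predictors and the approximate complication rates below are taken from
# the original validation cohort and are quoted for illustration only.

RCRI_FACTORS = (
    "high_risk_surgery",               # intraperitoneal, intrathoracic, or suprainguinal vascular
    "ischaemic_heart_disease",
    "congestive_heart_failure",
    "cerebrovascular_disease",
    "insulin_treated_diabetes",
    "creatinine_over_177_umol_per_l",  # serum creatinine >177 micromol/L (>2.0 mg/dL)
)

# Approximate rate of major perioperative cardiac complications by score
# (scores of three or more are pooled).
APPROXIMATE_RISK = {0: "about 0.4%", 1: "about 0.9%", 2: "about 6.6%", 3: "about 11%"}


def revised_cardiac_risk_index(patient: dict) -> tuple:
    """Return the RCRI score and the approximate complication rate for a
    patient described as a dict of boolean risk factors."""
    score = sum(1 for factor in RCRI_FACTORS if patient.get(factor, False))
    return score, APPROXIMATE_RISK[min(score, 3)]


if __name__ == "__main__":
    # Hypothetical patient with two risk factors.
    patient = {"ischaemic_heart_disease": True, "insulin_treated_diabetes": True}
    print(revised_cardiac_risk_index(patient))  # (2, 'about 6.6%')
```

Anchoring the estimate to an explicit, reproducible score of this kind, rather than to unaided clinical impression, is one practical counter to the framing effects described above.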

Croskerry15 16 and others w22 w24 have developed several corrective (or debiasing) strategies for minimising errors of reasoning (tables 1 and 2). Although this approach has face validity and is being adopted in think aloud simulation exercises,w48 clinical coaching,w49 and hypothetical vignettes,w50 its effectiveness in preventing error is yet to be evaluated and unintended consequences are possible. These include decisional delays (“paralysis by analysis” or constant second guessing), an increase in unnecessary investigations in response to expanded lists of differential diagnoses, patient anxiety arising from clinicians’ expressions of uncertainty, and more errors as more investigative and treatment options have to be considered.w51 More effort may also be needed to deal with negative emotions and cognitive impairment arising from work stress and personal predispositions that cause clinicians to jump to wrong decisions.w52

Implications for clinical training

It is important that experienced clinicians act as role models in good clinical reasoning and explicitly discuss how they arrive at the decisions they have made. This requires “thinking out loud” as they grapple with clinical problems in real time, articulating problem representations, highlighting pivotal or key features in diagnosis and management, and explaining the pros and cons of different courses of action.17 18 19 The approach should be used not only for cases solved successfully but also for cases characterised by blind alleys and false starts.

Students should learn how cognitive biases can mislead and be taught simple corrective maxims to lessen their effects, including judicious application of evidence based medicine and clinical decision support. Novice clinicians need to be encouraged to think and question using a democratic (not authoritarian) style and to have their reasoning heard and appraised with specific, timely, and constructive feedback that avoids harsh judgments based on hindsight. Finally, patients, families, and carers need to be encouraged to help improve decision quality by being aware of circumstances pertaining to themselves (such as a tendency towards hypochondriasis) or to the environment (a busy emergency department where staff may be overworked) that predispose to clinician error, to participate in decision making and sound the warning bell if they feel at risk,20 and to accept a certain level of uncertainty when the right course of action is not immediately obvious.

Footnotes

  • Contributors and sources: IAS has a research masters in clinical education, has written and implemented a curriculum and developed resource materials for a course in clinical reasoning skills within the University of Queensland graduate medical course and is developing a self directed learning programme in clinical reasoning for trainees registered with the Royal Australasian College of Physicians.

  • Competing interests: None declared.

  • Provenance and peer review: Not commissioned; externally peer reviewed.

References