Identifying diagnostic errors in primary care using an electronic screening algorithm

Arch Intern Med. 2007 Feb 12;167(3):302-8. doi: 10.1001/archinte.167.3.302.

Abstract

Background: Diagnostic errors are the leading basis for malpractice claims in primary care, yet these errors are underidentified and understudied. Computerized methods used to screen for other types of errors (eg, medication related) have not been applied to diagnostic errors. Our objectives were to assess the feasibility of computerized screening to identify diagnostic errors in primary care and to categorize diagnostic breakdowns using a recently published taxonomy.

Methods: We used an algorithm to screen the electronic medical records of patients at a single hospital that is part of a closed health care system. A Structured Query Language-based program detected the presence of 1 of 2 mutually exclusive electronic screening criteria: screen 1, a primary care visit (index visit) followed by a hospitalization in the next 10 days; or screen 2, an index visit followed by 1 or more primary care, urgent care, or emergency department visits within 10 days. Two independent, blinded reviewers determined the presence or absence of diagnostic error through medical record review of visits with positive and negative screening results.
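The two mutually exclusive screening criteria described above can be sketched as SQL queries run from Python's standard `sqlite3` module. This is a minimal illustration, not the study's actual program: the `visits` table schema, column names, and sample rows are all hypothetical, and the real algorithm ran against a full electronic medical record system.

```python
import sqlite3

# Hypothetical schema: one row per patient encounter.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE visits (
        patient_id INTEGER,
        visit_date TEXT,   -- ISO date, e.g. '2005-03-01'
        visit_type TEXT    -- 'primary_care', 'urgent_care', 'ed', 'hospitalization'
    )
""")
conn.executemany(
    "INSERT INTO visits VALUES (?, ?, ?)",
    [
        (1, "2005-03-01", "primary_care"),     # index visit
        (1, "2005-03-06", "hospitalization"),  # within 10 days -> screen 1
        (2, "2005-04-10", "primary_care"),     # index visit
        (2, "2005-04-15", "urgent_care"),      # within 10 days -> screen 2
        (3, "2005-05-01", "primary_care"),     # no follow-up -> neither screen
    ],
)

# Screen 1: an index primary care visit followed by a
# hospitalization within the next 10 days.
SCREEN1 = """
    SELECT DISTINCT a.patient_id, a.visit_date
    FROM visits a JOIN visits b ON a.patient_id = b.patient_id
    WHERE a.visit_type = 'primary_care'
      AND b.visit_type = 'hospitalization'
      AND julianday(b.visit_date) - julianday(a.visit_date) BETWEEN 1 AND 10
"""
screen1 = conn.execute(SCREEN1).fetchall()

# Screen 2: an index visit followed by 1 or more primary care,
# urgent care, or ED visits within 10 days. Index visits already
# captured by screen 1 are excluded to keep the screens mutually
# exclusive.
screen2 = conn.execute("""
    SELECT DISTINCT a.patient_id, a.visit_date
    FROM visits a JOIN visits b ON a.patient_id = b.patient_id
    WHERE a.visit_type = 'primary_care'
      AND b.visit_type IN ('primary_care', 'urgent_care', 'ed')
      AND julianday(b.visit_date) - julianday(a.visit_date) BETWEEN 1 AND 10
      AND (a.patient_id, a.visit_date) NOT IN (""" + SCREEN1 + """)
""").fetchall()

print(screen1)  # index visits flagged by screen 1
print(screen2)  # index visits flagged by screen 2
```

In this sketch, patient 1's index visit is flagged by screen 1, patient 2's by screen 2, and patient 3 (no return encounter) by neither; records matching neither screen would serve as the control pool for chart review.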

Results: Among screen 1 and 2 positive visits, 16.1% and 9.4%, respectively, were associated with a diagnostic error. The error rate was 4% in control cases that met neither screening criterion. The most common primary errors in the diagnostic process were failure or delay in eliciting information and misinterpretation or suboptimal weighing of critical pieces of data from the history and physical examination. The most common secondary errors were suboptimal weighing or prioritizing of diagnostic probabilities and failure to recognize urgency of illness or its complications.

Conclusions: Electronic screening has the potential to identify records that may contain diagnostic errors in primary care, and its performance is comparable to that of screening tools for other types of errors. Future studies that validate these findings in other settings could inform improvement initiatives in this area.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Aged
  • Algorithms*
  • Diagnostic Errors* / classification
  • Feasibility Studies
  • Female
  • Hospitalization
  • Humans
  • Male
  • Medical Records Systems, Computerized*
  • Middle Aged
  • Predictive Value of Tests
  • Primary Health Care*
  • Software*