Why a sociotechnical framework is necessary to address diagnostic error
  1. Meagan M Ladell1,
  2. Sarah Yale1,
  3. Brett J Bordini1,
  4. Matthew C Scanlon1,
  5. Nancy Jacobson2,
  6. Elizabeth Lerner Papautsky3
  1. Pediatrics, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
  2. Emergency Medicine, Medical College of Wisconsin, Milwaukee, Wisconsin, USA
  3. Department of Biomedical & Health Information Sciences, University of Illinois Chicago, Chicago, Illinois, USA

  Correspondence to Dr Meagan M Ladell; mladell@mcw.edu


Diagnostic error: the problem

Failures in the diagnostic process are thought to affect at least 15% of patient encounters, to cause 34% of adverse events in hospitals and to be a leading source of major malpractice claims and payouts; they are recognised as a top priority in patient safety research.1–3 The National Academies of Sciences, Engineering, and Medicine defines diagnostic error as a failure to establish an accurate and timely explanation of a patient’s medical problem; such errors are estimated to contribute to the morbidity and mortality of 795 000 patients each year in the USA.1 Although diagnostic error has received significant research attention across multiple clinical settings over the last several decades, it continues to pose consequential challenges, and both its systematic investigation and operational interventions require improvement.3 4 Additionally, few effective mitigation strategies have been designed for widespread prevention of diagnostic error.3 4

To date, much of the research on diagnostic error has been based on malpractice and autopsy data. By nature, both sources are subject to outcome and hindsight biases, which give little insight into the contributing and interacting factors that lead to diagnostic error.1 4 Furthermore, these data do not represent the true frequency of diagnostic error in clinical practice: they likely underestimate its prevalence because errors are recognised only when they lead to poor outcomes.1 Focusing on diagnostic outcomes (ie, whether the diagnosis is correct or incorrect) and then retrospectively reviewing cases in which the diagnosis went awry unintentionally neglects the vital question of how physicians arrive at a diagnosis.

Diagnostic errors rarely occur as isolated events; rather, they span multiple care episodes.3 By studying diagnosis solely in retrospect and neglecting to evaluate the process for vulnerabilities, current approaches limit our understanding of how and why diagnostic error occurs. Therefore, efforts focused only on developing error taxonomies or characterising the cognitive biases that contribute to faulty logic are insufficient for achieving a deep understanding of this complex problem space.

Diagnostic error: the problem with conventional approaches

Outside of research settings, there are few practical tools for analysing diagnostic error events. Complex safety events that span multiple care domains, such as diagnostic error, are hard to identify adequately.5 Once they are brought to the attention of the institution, there are few evidence-based investigation techniques and intervention methodologies.3 4 When cases of diagnostic error are reviewed, traditional approaches such as root cause analysis (RCA) and/or morbidity and mortality conferences (M&M) may be employed. M&Ms are highly heterogeneous in their format and quality.6 If M&Ms are not led by a skilled facilitator in an environment that supports safety culture, they risk placing blame on individuals rather than seeking to uncover and discuss system failures. Further, some M&Ms do not include any discussion of improvement ideas or expected changes to the system.6 Although RCAs help to highlight the potential for concrete system changes, they are reductionist and therefore do not sufficiently address complexity.7 RCAs are also often narrow in scope, failing to inform sustainable improvements because they tend to arrive at one or a few root causes and miss the multiple interacting factors that contribute to any diagnostic error.7 8

Diagnostic error is often framed as the result of providers’ cognitive bias or of an individual system-based failure.3 5 9 However, sorting causes into these dichotomous silos fails to appreciate the interplay of multiple factors within a sociotechnical system.10 11 In the Safer Dx framework, Singh and Sittig made progress towards highlighting the need for systems thinking in investigating and addressing diagnostic error; however, their model does not fully explore the significance of the interactions between system factors.5 Current categorical definitions and frameworks for diagnostic error, even those that acknowledge multifactorial contributions to error, fail to capture the dynamic interactions among the people, clinical environment, technology and tools that constitute the diagnostic milieu; these dynamic interactions are better conceptualised and analysed within a sociotechnical framework. For example, Singh’s sociotechnical framework references eight dimensions of the system but discusses them as if they were static and separate entities.5 Holden et al propose a framework that is dynamic and offer structured tools to capture the interactive components of the sociotechnical system.12 Without addressing these interactions, we downplay the dynamic roles of provider work, patient work, provider communication and patient communication.

Diagnostic error: a complex systems problem

Conventional models of diagnostic error have consistently failed to account for the complexity of diagnostic work. Diagnosis is non-linear, multifactorial and often occurs in concert with therapeutic interventions.5 13 Clinical teams may need to address the patient’s condition through interventions while simultaneously carrying out diagnostic tasks.13 Diagnosis may be conducted by multiple clinicians and is dynamic, characterised by uncertainty, time pressure, high clinician workload and dependencies on other people, tools and technologies for information gathering, synthesis and task performance.14 15 All of this work is done while balancing competing demands from other patients and providers. Given these complexities, effective mitigation strategies cannot be developed without a sociotechnical systems perspective that captures and characterises the different types of diagnostic processes contextualised in clinical settings. The existing models of the diagnostic process in the literature lend themselves to a simplistic understanding, which may lead to the design of ineffective interventions; we suspect these models have been created from a view of the diagnostic process based on work as imagined rather than work as done.16 By focusing on how we think healthcare is supposed to work, we ignore the realities and complexities of provider workarounds and adaptations that must inevitably occur to complete diagnostic work.16 We believe that diagnostic error is a product of dynamic, complex systems and of the interactions of system factors, including the patient and family, providers, technologies, hospital systems, the work environment, external environments and dynamic cognitive load. Expanding the conversation to include a sociotechnical systems perspective is crucial to improving our understanding of diagnostic error and to developing effective mitigation strategies. It is these dynamic interactions that explain how the same patient, presenting to the same clinician under different circumstances, can have different diagnostic outcomes.

The field of human factors engineering (HFE) offers both theory and methods for applying a systems perspective in complex work domains such as healthcare.17 18 At the core of HFE is the recognition that humans perform within a context of interacting sociotechnical factors, with both safe and unsafe care being emergent properties of a system. Therefore, it is necessary to consider both the system interactions that influence work as done and the context that explains why a provider arrived at their diagnostic decisions.16–18 These factors must be studied and accounted for in developing effective solutions. The Systems Engineering Initiative for Patient Safety (SEIPS) 101 model, developed by human factors scientists, is a sociotechnical framework that characterises the interactions of work system factors, including people, tools, tasks and settings, as well as the associated processes and outcomes.12 The SEIPS 101 model and the seven simple tools published by Holden and Carayon are flexible and have multiple applications; one such tool is the People, Environment, Tools and Tasks (PETT) scan.19 It serves to help researchers and providers account for the breadth of the work system.19 The PETT scan is intended to be flexible and can thus be tailored to one’s objectives to guide the identification and characterisation of sociotechnical system factors, as well as barriers and facilitators.19 The PETT scan can be applied to diagnostic error research projects and, practically, to diagnostic error investigations to help expand systems thinking and operationalise system complexity, giving a better understanding of the process and its competing demands.
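As a purely illustrative sketch, and not part of the published SEIPS 101 toolkit, the short Python snippet below shows one way a review team might record PETT scan domains with their barriers and facilitators in a structured, reusable form. The class names, fields and example entries are our own assumptions; the example barriers are drawn loosely from the case narrative that follows, and the full set of barriers and facilitators for the case appears in table 1.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class PettDomain:
    """One domain of a PETT-style worksheet, holding barriers and facilitators."""
    name: str
    barriers: List[str] = field(default_factory=list)
    facilitators: List[str] = field(default_factory=list)


def new_pett_scan() -> Dict[str, PettDomain]:
    """Create an empty scan covering the four PETT domains."""
    return {name: PettDomain(name) for name in ("People", "Environment", "Tools", "Tasks")}


if __name__ == "__main__":
    scan = new_pett_scan()
    # Illustrative entries only, loosely based on the case narrative below.
    scan["People"].barriers.append("Rotating trainee with limited paediatric experience")
    scan["People"].barriers.append("Caregiver with limited English proficiency")
    scan["Tools"].barriers.append("Disposable stethoscope limited the physical examination")
    scan["Environment"].barriers.append("High volume of bronchiolitis patients at end of shift")
    for domain in scan.values():
        print(f"{domain.name:12} barriers: {domain.barriers} facilitators: {domain.facilitators}")
```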

Application of the SEIPS 101 PETT scan

To illustrate system factors and their interactions, we compiled a case study of a poor outcome, labelled as a diagnostic error, in a paediatric tertiary care academic hospital setting. A traditional RCA of a case like this might attribute the event’s root causes to a combination of communication error between providers and the cognitive bias of premature closure. An expert panel of four clinicians and one human factors scientist held discussions to identify and characterise facilitators of and barriers to achieving an accurate diagnosis for the case described below. Participants included attending physicians with training in paediatric critical care (MCS), paediatric hospital medicine (BJB and SY) and paediatric emergency medicine (MML), with additional expertise in patient safety and diagnostic error. The team met bimonthly for ten 1-hour sessions to discuss cases of diagnostic error. The case below is fictional, inspired by multiple cases experienced by the clinicians involved. The PETT scan template from the SEIPS 101 manuscript was used, adapted to the needs of the case and completed together as a team. The product of these discussions is the PETT scan table (table 1). The team then iterated and refined the findings through meetings and asynchronous communication over 6 months.

Table 1

People, Environment, Tools and Tasks scan applied to a diagnostic error case study19

Case example

The patient is a 21-day-old infant presenting to a paediatric emergency department with respiratory distress. The patient was primarily seen by a rotating trainee with limited paediatric experience. The trainee was at the end of a shift during which they had seen multiple young children with viral bronchiolitis. The patient’s father, who had limited English proficiency, reported that a sibling was also sick. The infant was struggling to breathe, was believed to have a self-limited viral respiratory infection with hypoxia and was admitted to the hospital for supportive oxygen therapy. On day 3 of admission, the team underappreciated that the child was not improving as expected for a viral aetiology. After clinical decompensation and cardiac arrest, the care team realised that the patient’s deterioration was likely caused by previously unidentified critical congenital heart disease, in addition to the viral illness. A standard institutional investigation attributed the event to diagnostic error due to anchoring and premature closure.

The information in table 1 breaks down the facilitators and barriers that illustrate the complexities of the work system underlying the diagnostic outcome for this patient. Because of the barriers identified in the table, the limited information available about the patient’s history (including feeding patterns and weight loss) offered little evidence to support a diagnosis other than viral infection. Similarly, the need for personal protective equipment and the use of disposable stethoscopes (due to the multiple respiratory viruses in the community) created significant barriers to an effective physical examination. The provider was forced to make do with the resources available under the additional constraints that existed.

Additionally, while the underlying pathophysiology for this patient was a congenital heart defect, the decompensation was likely precipitated by a superimposed viral illness; that is, this patient had a viral respiratory illness AND congenital heart disease. As this case demonstrates, it is neither cognitive errors alone nor ‘system’ errors alone that result in diagnostic errors. Instead, both diagnostic error and diagnostic accuracy result from the interactions between the factors of sociotechnical systems. It is the dynamic nature of both the sociotechnical system factors AND their interactions that explains how the same ‘system’ can produce both correct and incorrect diagnoses.

The authors acknowledge the limitation of an isolated case example; however, they feel it is a poignant demonstration of the failure of current diagnostic error investigation methodology to fully acknowledge and explore complexity. We propose that stepping outside of conventional patient safety case review methodology may deepen our understanding of diagnostic error and of other complex patient safety problems in healthcare. We argue for expanding the application of human factors safety science to diagnostic error and for further research on using these principles in the clinical environment.

Call for increased application of the sociotechnical lens in future work

Diagnosis is a dynamic process, and the lack of improvement over more than 15 years suggests that diagnostic errors are often complex and multifactorial. Given that healthcare is a complex sociotechnical system, capturing and characterising factor interactions is not intuitive; it requires the conscious application of sociotechnical models, which can benefit research, improvement efforts and error investigations. Although many methods have focused on failures of the system, they have neglected to consider when the diagnostic process succeeds. Examining system facilitators offers a novel strategy for improving the diagnostic process. Implementation of SEIPS offers a unique way to consider vulnerabilities while simultaneously guiding the characterisation of system resiliency. Future research should seek to identify and characterise the complex and dynamic interactions within work systems and their role in the diagnostic process. To make progress on long-standing patient safety threats such as diagnostic error, partnership with experts in HFE and the reframing of patient safety issues through sociotechnical models are necessary.

Ethics statements

Patient consent for publication

Ethics approval

Not applicable.

References

Footnotes

  • Contributors MML: conceptualisation, investigation, project administration, writing—original draft preparation, guarantor. SY and BJB: investigation, writing—review and editing. MCS: conceptualisation, methodology, supervision, visualisation, writing—review and editing. NJ: writing—review and editing. ELP: methodology, supervision, writing—review and editing.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.