Safety Science

Volume 49, Issue 6, July 2011, Pages 939-945

Technical Communication
The complexity of failure: Implications of complexity theory for safety investigations

https://doi.org/10.1016/j.ssci.2011.01.008

Abstract

Complexity theory suggests that we see performance as an emergent property, the result of complex interactions and relationships. This can clash, however, with what stakeholders see as legitimate and normal in accident investigations. When systems fail, it is still common to blame components (e.g. human errors), and when they succeed spectacularly, to think in terms of individual heroism (e.g. the A320 Hudson River landing). In this paper, we lay out a Newtonian analysis of failure, which can be recognized in many efforts at safety analysis and improvement, and contrast it with a view inspired by complexity theory. The Newtonian analysis makes particular assumptions about the relationship between cause and effect, the foreseeability of harm, time-reversibility and the ability to produce the “true story” of an accident. From the perspective of complexity theory, failure is instead seen as an emergent property of a complex system. We explore what that means for safety science and work towards a post-Newtonian analysis of failure in complex systems.

Research highlights

► In complex systems there is no linear relationship between component behavior and system-level outcomes.
► According to complexity theory, reconstructing one “true” story of what happened is impossible.
► Investigations should gather multiple narratives from different perspectives inside the complex system.
► These narratives offer partially overlapping and contradictory accounts of emergent outcomes.
► Narrative diversity is more valuable than one official account: it offers more opportunities for learning.

Introduction

Complexity is a defining characteristic of today’s high-technology, high-consequence systems (Perrow, 1984), and recent submissions to this journal highlight the importance of taking a systems- or complexity view of failure in such systems (Goh et al., 2010). Despite this, single-factor explanations that condense accounts of failure to individual human action or inaction often prevail. An analysis by Holden (2009) showed that between 1999 and 2006, 96% of investigated US aviation accidents were attributed in large part to the flight crew. In 81%, people were the sole reported cause. The language used in these analyses also points to failures or shortcomings of components. “Crew failure” or a similar term appeared in 74% of probable causes; the remaining cases contained language such as “inadequate planning, judgment and airmanship,” “inexperience” and “unnecessary and excessive, ... inputs.” “Violation” of written guidance was implicated as cause or contributing factor in a third of all cases (Holden, 2009).

Single-factor, judgmental explanations for complex system failures are not unique to aviation—they are prevalent in fields from medicine (Wachter and Pronovost, 2009), to military operations (e.g. Snook, 2000), to road traffic (Tingvall and Lie, 2010). Much discourse around accidents in complex systems remains tethered to language such as “chain-of-events”, “human error” and questions such as “what was the cause?” and “who was to blame?” (Douglas, 1992, Catino, 2008, Cook and Nemeth, 2010). The problem of reverting to condensed, single-factor explanations rather than diffuse and system-level ones (Galison, 2000) has of course been a central preoccupation of system safety (Reason, 1997, Maurino et al., 1999, Dekker, 2006), and the difficulty in achieving systemic stories of failure has been considered from a variety of angles (Fischhoff, 1975, Woods et al., 2010).

This paper adds to the literature by critiquing the traditional philosophical–historical and ideological bases for sustained linear thinking about failure in complex systems. By linear thinking we mean a process that follows a chain of causal reasoning from a premise to a single outcome. In contrast, systems thinking regards an outcome as emerging from a complex network of causal interactions and, therefore, not as the result of a single factor (Leveson, 2002). We lay out how a Newtonian analysis of failure makes particular assumptions about the relationship between cause and effect, the foreseeability of harm, time-reversibility and the ability to come up with the “true story” of an accident. Whereas such thinking has long been equated with science and rationality in the West, an acknowledgment of the complex, systemic nature of many accidents necessitates a different approach. We take steps toward the development of such an approach, which we call a post-Newtonian analysis, in the second half of the paper.

Section snippets

The Cartesian–Newtonian worldview and its implications for system safety

The logic behind Newtonian science is easy to formulate, although its implications for how we think about accidents are subtle and pervasive. Classical mechanics, as formulated by Newton and further developed by Laplace and others, encourages a reductionist, mechanistic methodology and worldview. Many still equate “scientific thinking” with “Newtonian thinking.” The mechanistic paradigm is compelling in its simplicity, coherence and apparent completeness, and largely consistent with intuition and …

The view from complexity and its implications for system failure

Analytic reduction cannot tell us how a number of different things and processes act together when exposed to a number of different influences at the same time. This is complexity, a characteristic of a system. Complex behavior arises because of the interactions between the components of a system. Complexity asks us to focus not on individual components but on their relationships. The properties of the system emerge as a result of these interactions; they are not contained within individual components.
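The point can be made concrete with a toy model that is not from the paper: Conway’s Game of Life. Every cell obeys the same trivial local rule, yet a “glider”, a pattern that travels diagonally across the grid, emerges at the system level. No single cell contains or causes the glider; it exists only in the interactions between cells, and inspecting any one component in isolation would never reveal it. A minimal sketch:

```python
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) live cells."""
    # Count how many live neighbors each candidate cell has.
    neigh = Counter((x + dx, y + dy)
                    for (x, y) in live
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                    if (dx, dy) != (0, 0))
    # Local rule: birth on exactly 3 neighbors; survival on 2 or 3.
    return {c for c, n in neigh.items()
            if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):  # the glider's period is 4 generations
    state = step(state)

# The pattern has translated diagonally by (1, 1): same shape, new location.
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True
```

The glider’s movement is a system-level property in exactly the sense discussed above: it emerges from the relationships between components and cannot be attributed to the behavior, let alone the failure, of any one of them.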

Conclusion

When accidents are seen as complex phenomena, there is no longer an obvious relationship between the behavior of parts in the system (or their malfunctioning, e.g. “human errors”) and system-level outcomes. Instead, system-level behaviors emerge from the multitude of relationships and interconnections deeper inside the system, and cannot be reduced to those relationships or interconnections. Investigations that embrace complexity, then, might stop looking for the “causes” of failure or success.

References (51)

  • P. Cilliers, Difference, identity and complexity. Philosophy Today (2010)
  • R.I. Cook et al., Those found responsible have been sacked: some observations on the usefulness of error. Cognition, Technology & Work (2010)
  • W. Cronon, A place for stories: nature, history, and narrative. The Journal of American History (1992)
  • S.W.A. Dekker, The Field Guide to Understanding Human Error (2006)
  • S.W.A. Dekker, Just culture: who draws the line? Cognition, Technology & Work (2009)
  • S.W.A. Dekker, Drift into Failure: From Hunting Broken Components to Understanding Complex Systems (2011)
  • M. Douglas, Risk and Blame: Essays in Cultural Theory (1992)
  • P. Feyerabend, Against Method. London, ... (1993)
  • B. Fischhoff, Hindsight ≠ foresight: the effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance (1975)
  • GAIN, Roadmap to a just culture: enhancing the safety environment. Global Aviation Information Network (Group E: ... (2004)
  • P. Galison, An accident of history. In: P. Galison, A. Roland (Eds.), Atmospheric Flight in the Twentieth Century. ... (2000)
  • F. Heylighen, Causality as distinction conservation: a theory of predictability, reversibility and time order. Cybernetics and Systems (1999)
  • F. Heylighen, P. Cilliers, et al., Complexity and Philosophy. Brussel, BE, Vrije Universiteit Brussel: ... (2006)
  • R.J. Holden, People or systems: to blame is human. The fix is to engineer. Professional Safety (2009)
  • E. Hollnagel, Barriers and Accident Prevention. Aldershot, UK, ... (2004)