Technical Communication

The complexity of failure: Implications of complexity theory for safety investigations
Research highlights
- In complex systems there is no linear relationship between component behavior and system-level outcomes.
- According to complexity theory, reconstructing one “true” story of what happened is impossible.
- Investigations should gather multiple narratives from different perspectives inside the complex system.
- These narratives offer partially overlapping and contradictory accounts of emergent outcomes.
- Narrative diversity is more valuable than one official account: it offers more opportunities for learning.
Introduction
Complexity is a defining characteristic of today’s high-technology, high-consequence systems (Perrow, 1984), and recent submissions to this journal highlight the importance of taking a systems or complexity view of failure in such systems (Goh et al., 2010). Despite this, single-factor explanations that condense accounts of failure to individual human action or inaction often prevail. An analysis by Holden (2009) showed that between 1999 and 2006, 96% of investigated US aviation accidents were attributed in large part to the flight crew. In 81%, people were the sole reported cause. The language used in these analyses also points to failures or shortcomings of components. “Crew failure” or a similar term appeared in 74% of probable causes; the remaining cases contained language such as “inadequate planning, judgment and airmanship,” “inexperience” and “unnecessary and excessive, ... inputs.” “Violation” of written guidance was implicated as a cause or contributing factor in a third of all cases (Holden, 2009).
Single-factor, judgmental explanations for complex system failures are not unique to aviation—they are prevalent in fields from medicine (Wachter and Pronovost, 2009), to military operations (e.g. Snook, 2000), to road traffic (Tingvall and Lie, 2010). Much discourse around accidents in complex systems remains tethered to language such as “chain-of-events”, “human error” and questions such as “what was the cause?” and “who was to blame?” (Douglas, 1992, Catino, 2008, Cook and Nemeth, 2010). The problem of reverting to condensed, single-factor explanations rather than diffuse and system-level ones (Galison, 2000) has of course been a central preoccupation of system safety (Reason, 1997, Maurino et al., 1999, Dekker, 2006), and the difficulty in achieving systemic stories of failure has been considered from a variety of angles (Fischhoff, 1975, Woods et al., 2010).
This paper adds to the literature by critiquing the traditional philosophical–historical and ideological bases for sustained linear thinking about failure in complex systems. By linear thinking we mean a process that follows a chain of causal reasoning from a premise to a single outcome. In contrast, systems thinking regards an outcome as emerging from a complex network of causal interactions and, therefore, not as the result of a single factor (Leveson, 2002). We lay out how a Newtonian analysis of failure makes particular assumptions about the relationship between cause and effect, foreseeability of harm, time-reversibility and the ability to come up with the “true story” of an accident. Whereas such thinking has long been equated with science and rationality in the West, an acknowledgment of the complex, systemic nature of many accidents necessitates a different approach. In the second half of the paper, we take steps toward the development of such an approach, which we call a post-Newtonian analysis.
Section snippets
The Cartesian–Newtonian worldview and its implications for system safety
The logic behind Newtonian science is easy to formulate, although its implications for how we think about accidents are subtle and pervasive. Classical mechanics, as formulated by Newton and further developed by Laplace and others, encourages a reductionist, mechanistic methodology and worldview. Many still equate “scientific thinking” with “Newtonian thinking.” The mechanistic paradigm is compelling in its simplicity, coherence and apparent completeness, and largely consistent with intuition and …
The view from complexity and its implications for system failure
Analytic reduction cannot tell us how a number of different things and processes act together when exposed to a number of different influences at the same time. This is complexity, a characteristic of a system. Complex behavior arises from the interaction between the components of a system. Complexity asks us to focus not on individual components but on their relationships. The properties of the system emerge as a result of these interactions; they are not contained within individual components.
Conclusion
When accidents are seen as complex phenomena, there is no longer an obvious relationship between the behavior of parts in the system (or their malfunctioning, e.g. “human errors”) and system-level outcomes. Instead, system-level behaviors emerge from the multitude of relationships and interconnections deeper inside the system, and cannot be reduced to those relationships or interconnections. Investigations that embrace complexity, then, might stop looking for the “causes” of failure or success.
References (51)
- et al. I knew it would happen: remembered probabilities of once-future things. Organizational Behavior and Human Performance (1975)
- et al. Applying systems thinking concepts in the analysis of major incidents and safety culture. Safety Science (2010)
- et al. Cross-cultural comparisons of traffic safety, risk perception, attitudes and behavior. Safety Science (2009)
- Risk management in a dynamic society: a modelling problem. Safety Science (1997)
- ASW, 2002. Failure to Minimize Latent Hazards Cited in Taipei Tragedy Report. Air Safety Week. Washington, DC, Aviation...
- BP, 2010. Deepwater Horizon Accident Investigation Report....
- Flight 427: Anatomy of an Air Disaster (2002)
- CAIB, 2003. Report Volume 1, August 2003. Washington, DC, Columbia Accident Investigation...
- A review of literature: individual blame vs. organizational function logics in accident analysis. Journal of Contingencies and Crisis Management (2008)
- Complexity, deconstruction and relativism. Theory, Culture & Society (2005)
- Difference, identity and complexity. Philosophy Today
- Those found responsible have been sacked: some observations on the usefulness of error. Cognition, Technology and Work
- A place for stories: nature, history, and narrative. The Journal of American History
- The Field Guide to Understanding Human Error
- Just culture: who draws the line? Cognition, Technology & Work
- Drift into Failure: From Hunting Broken Components to Understanding Complex Systems
- Risk and Blame: Essays in Cultural Theory
- Hindsight ≠ foresight: the effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance
- Causality as distinction conservation: a theory of predictability, reversibility and time order. Cybernetics and Systems
- People or systems: to blame is human. The fix is to engineer. Professional Safety