The paradoxes of almost totally safe transportation systems
Introduction: an implicit and general-purpose safety model
In most industries, safety remains governed by a few simple and self-evident principles:
1. Conceptual designs generate systems with a high theoretical performance and safety potential, subject to technical and human failings. Breakdowns and human errors jeopardise operational and optimal safety, acting as ‘noise’ disturbing operations; ideally they should be totally eliminated or at least minimised. Conceptually, breakdowns and errors are symmetrically assessed, and detection logic, calculation
Safe and ultra-safe systems, and the relative effectiveness of measures aimed at reducing breakdowns and human errors
Systems which are now ultra-safe, i.e. where the risk of a disastrous accident is below one accident per million events, behave differently, safety-wise, from less safe systems.
They can be placed in the context of other systems:
1. Dangerous systems, where the risk of accident is greater than one accident per 1000 events (10⁻³), e.g. bungee jumping or mountain climbing
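The risk bands introduced above can be sketched as a simple classification by accident rate. This is only an illustration: the thresholds 10⁻³ (dangerous) and 10⁻⁶ (ultra-safe) come from the text, while the name of the intermediate band and the helper function itself are assumptions for the sake of the example.

```python
def classify_system(accidents: int, events: int) -> str:
    """Classify a system by its accident rate per event, following
    the risk bands discussed in the text:
      dangerous   rate > 1e-3
      ultra-safe  rate < 1e-6
      safe        in between (band name assumed for illustration)
    """
    rate = accidents / events
    if rate > 1e-3:
        return "dangerous"
    if rate < 1e-6:
        return "ultra-safe"
    return "safe"

# A sport with 2 fatal accidents per 1000 outings sits in the dangerous band,
# while a transport mode with 1 accident per 10 million events is ultra-safe.
print(classify_system(2, 1_000))        # → dangerous
print(classify_system(1, 10_000_000))   # → ultra-safe
```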
Developments in research on human error
Research studies on human error are nothing new. However, many concepts were extensively reviewed during this century (see Fig. 1 for a summary). In psychophysics and behavioural psychology, scientists perceived error as an instrument helping to measure the performance of experimental subjects. The study of error was not a goal per se, but just a means to assess other psychological concepts. Threshold psychophysics was based on the delivery of the ‘good’ answer (correct perception of the
Extending the ecological safety model to incidents
Breakdowns and incidents follow the same logic as errors. An incident-free system becomes mute, and its safety can no longer be tuned. Investments stop being directed at safety and are earmarked towards improving performance; the control and the spontaneous limitation induced by the occurrence of incidents no longer play their role. The system can then abruptly find itself in a situation of disastrous accident because its over-stretched performance has given rise to new risks (Rasmussen, 1993).
General development model for large systems
This article started with an obvious statement: errors and breakdowns must be eliminated to increase safety. This paradigm holds true when optimising the safety of man–machine systems up to a risk of one disastrous failure per 100,000 events (1×10⁻⁵). To reach this level of effectiveness, definitions do not need to be overly clarified, nor are complex cognitive processes required to come into play. Increased procedures, improved staff training, automation, and conventional error blocking solutions
Conclusion
Ultra-safe systems have reached today's safety level through a lengthy, extensive and crisis-ridden optimisation process, during which these systems have aged and matured. However, this process has also made systems more fragile and less adaptive. These systems will eventually be replaced by others having a different safety potential; such is the cycle of technological development. However, the challenge for today's safety practitioners is to manage the transition from these older technologies
Disclaimer
The ideas expressed in this paper reflect only the opinion of the author and should not be taken as the official views of any national or international authority or official body to which the author belongs.
References (42)
- Error detection processes in statistical problem solving. Cognitive Science (1984)
- et al. Metaknowledge for time and reliability. Reliability Engineering and System Safety (1992)
- et al. Communication patterns and errors in flight simulation. Reliability Engineering and System Safety (1992)
- Abbott, K., Slotte, S., Stimson, D. (Eds.), 1996, June. The Interfaces Between Flightcrews and Modern Flight Deck...
- La conduite des systèmes à risques (1996)
- Amalberti, R., 1997. Paradoxes aux confins de la sécurité absolue. Annales des Mines, Fév. 97, 9–15. [Paradoxes of...
- Automation in aviation: a human factors perspective
- Amalberti, R., Wioland, L., 1997. Human error in aviation. Invited paper to the International Aviation Safety...
- Overcoming Organisational Defenses (1990)
- On the difficulties people have in dealing with difficulty. Simulation & Games (1980)