System accident
A system accident is an accident that results from the complexity of a system. The complexity can be technological or organizational. Very often, such an accident results from a difficult-to-predict sequence of events.
A mature technology, such as aviation, may accumulate many overlays of improvements. Each overlay is indeed an improvement, but together they create a situation with too many possible action pathways to plan for all contingencies, or to plan for them with any degree of thoroughness. The situation may approach combinatorial explosion. Consider also the management principle, "When everything is a priority, nothing is."
The key insight is that this can also happen with safety systems: each added safeguard is itself another layer of complexity.
Charles Perrow and William Langewiesche are two writers who have explored these concepts.
Discussion
"Too many corporations take the view, 'Follow The Rules!' They would instead be better off if they encouraged their employees to stay engaged and alert."
External Links
- [1] This article discusses the tragedy of ValuJet 592, which crashed in the Everglades on May 11, 1996, after a very difficult-to-predict sequence of events, with no single step involving a blatant human error. It can also be argued that the safety procedures involved were opaque rather than transparent.
- [2] An eight-page paper titled "Organizationally Induced Catastrophes" by Charles Perrow. This is a good overview of his general approach.
- [3] Three Mile Island. ". . . it is the same way every partial core meltdown has gone. People haven't believed the instrumentation as they went along." Perhaps because the system gives so many goofy readings in normal operation?
References
William Langewiesche, "The Lessons of ValuJet 592," The Atlantic Monthly, March 1998, Vol. 281, No. 3, pp. 81–98.
Charles Perrow, Normal Accidents: Living with High-Risk Technologies, 1984.