Accident theory

Accident theory deals with the causes and course of accidents in complex systems as described by Charles Perrow, so-called system accidents.

The development of an accident can be broken down into six stages:

Initial phase, latency phase, potency phase, acute phase, crisis phase and, once the crisis point has been exceeded, the system accident.
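
The sequence can be pictured as an ordered progression through these stages. Below is a minimal Python sketch of that idea; the phase names follow the list above, while the class and function names and the escalation logic are purely illustrative assumptions, not part of Perrow's model.

    from enum import IntEnum

    class AccidentPhase(IntEnum):
        """Ordered stages of accident development, as listed above."""
        INITIAL = 1
        LATENCY = 2
        POTENCY = 3
        ACUTE = 4
        CRISIS = 5
        SYSTEM_ACCIDENT = 6  # reached once the crisis point has been exceeded

    def escalate(phase: AccidentPhase) -> AccidentPhase:
        """Advance an unchecked incident by one stage; the final stage is absorbing."""
        if phase is AccidentPhase.SYSTEM_ACCIDENT:
            return phase
        return AccidentPhase(phase + 1)

    # Illustrative run: an incident that is never contained passes through every stage.
    phase = AccidentPhase.INITIAL
    while phase is not AccidentPhase.SYSTEM_ACCIDENT:
        phase = escalate(phase)
    print(phase.name)  # SYSTEM_ACCIDENT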

An understanding of accident theory makes it possible to develop safety concepts that help avoid accidents and errors.

Perrow's theories can still be empirically confirmed today.

Similar sequences of events that lead to a negative development and ultimately to system accidents can be found in the following areas:

  • Construction defects and damage (hidden defects, shortened service life of components)
  • Crises (bankruptcies, scandals, disruptions)
  • Nuclear accidents
  • Diseases
  • Production errors

System accidents

Normal Accidents: Living with High-Risk Technologies is a 1984 book by Yale sociologist Charles Perrow that provides a detailed analysis of complex systems from a social science perspective. It was the first to propose "a framework for characterizing complex technological systems such as air transport, maritime transport, chemical plants, dams and especially nuclear power plants according to their risk". Perrow argues that multiple and unexpected failures are built into society's complex and tightly coupled systems, and that such accidents are unavoidable.

Perrow's argument, based on human error, is that major accidents tend to escalate and that the problem is not the technology but the organizations involved. Each of these principles remains relevant today.

Perrow calls such accidents "normal accidents" or system accidents because they are unavoidable in extremely complex systems. Given the characteristics of the systems involved, multiple failures occur which, despite efforts to avoid them, interact with one another. Perrow said that operator error is a very common problem, that many failures are organizational rather than technical, and that large accidents almost always have very small beginnings. Such events appear trivial at first, before cascading unpredictably through the system to create a major event with dire consequences.

Normal Accidents contributed key concepts to a number of intellectual developments in the 1980s that revolutionized the understanding of safety and risk. It made the case for examining technological failures as the product of highly interacting systems and highlighted organizational and management factors as the main causes of failures. Technological disasters could no longer be attributed to isolated equipment malfunctions, operator errors or force majeure.

Perrow identifies three conditions that make a system susceptible to normal accidents (see the sketch after this list). These are:

  • The system is complex.
  • The system is tightly coupled.
  • The system has catastrophic potential.
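
As a rough illustration only, and not Perrow's own formalization, the three conditions can be read as a joint check on a system description. The following minimal Python sketch makes that reading explicit; the System class, its attribute names and the nuclear-plant instance are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class System:
        """Hypothetical description of a system in terms of Perrow's three conditions."""
        complex_interactions: bool    # many unplanned and unexpected interactions
        tightly_coupled: bool         # little slack or buffering between components
        catastrophic_potential: bool  # failures can harm many people or the environment

    def prone_to_normal_accidents(system: System) -> bool:
        """A system is susceptible to normal accidents when all three conditions hold."""
        return (system.complex_interactions
                and system.tightly_coupled
                and system.catastrophic_potential)

    # Example: a nuclear power plant as characterized in the text above.
    plant = System(complex_interactions=True, tightly_coupled=True, catastrophic_potential=True)
    print(prone_to_normal_accidents(plant))  # True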

Construction defects

A construction defect can have various underlying defects as its cause. A distinction is made between obvious defects, hidden defects and fraudulently concealed defects.

As expert reports show, hidden defects can develop in the same way as accidents, from the interplay of several unpredictable events.

As with other sensitive systems, corresponding maintenance plans are drawn up, and sensitive or hard-to-access building components are equipped with sensors for monitoring.

Crises

Crises are internal or external events that pose acute dangers to living beings, the environment, assets or the reputation of a company or an institution.

Crisis communication comprises all measures for the communicative avoidance (potential crisis phase), early detection (latent crisis phase), coping (acute crisis phase) and follow-up (post-crisis phase) of crisis situations.

A distinction is made between three types of crises:

  • financial crises ("bankruptcies")
  • communicative crises ("scandals")
  • operational crises ("disruptions")

According to the institute's surveys, around 40,000 financial crises and around 280 operational and communicative crises (that have become public) occur every year in the DACH region (Germany, Austria and Switzerland).

Nuclear accidents

The inspiration for Perrow's book was the Three Mile Island accident of 1979, in which a nuclear accident resulted from the unexpected interaction of multiple failures in a complex system. The event was an example of a normal accident because it was "unexpected, incomprehensible, uncontrollable and inevitable".

Perrow concluded that the failure at Three Mile Island was a consequence of the system's immense complexity. Such modern high-risk systems, he realized, were prone to failure however well they were managed. It was inevitable that they would eventually suffer what he called a "normal accident". Therefore, he suggested, we might do better to contemplate a radical redesign or, if that were not possible, to abandon such technology entirely.

New reactor designs

A disadvantage of any new nuclear reactor technology is that safety risks may initially be greater because reactor operators have little experience with the new design. Nuclear engineer David Lochbaum has stated that almost all serious nuclear accidents have occurred with what was the most modern technology of the time. He argues that "the problem with new reactors and accidents is twofold: scenarios arise that are impossible to plan for in simulations, and people make mistakes". As the director of a US research laboratory put it, "fabrication, construction, operation and maintenance of new reactors will face a steep learning curve: advanced technologies will have a heightened risk of accidents and mistakes. The technology may be proven, but people are not."

Sometimes technical redundancies that are put in place to ensure safety can backfire and produce less, not more, reliability. This can happen in three ways: first, redundant safety devices result in a more complex system that is more prone to errors and accidents; second, redundancy can lead to workers shirking responsibility; third, redundancy can lead to increased production pressures, resulting in a system that operates at higher speed but is less safe.
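
The first of these points can be made concrete with a stylized reliability calculation. The sketch below compares one safety channel with a redundant pair whose failures are partly common-cause and whose extra voting logic can itself fail; the numbers and the beta-factor-style split into independent and common-cause failures are assumptions chosen for illustration, not figures from Perrow.

    # Stylized comparison: one safety channel vs. a redundant pair.
    # All probabilities are assumed values chosen only for illustration.

    p_channel = 1e-3       # failure probability of a single safety channel
    beta = 0.1             # assumed share of failures that disable both channels at once
    p_voting_logic = 5e-4  # assumed failure probability of the added comparison/voting logic

    # Single channel.
    p_single = p_channel

    # Redundant pair: the independent parts of the failures must hit both channels,
    # while common-cause failures and the extra logic affect the pair as a whole.
    p_both_independent = ((1 - beta) * p_channel) ** 2
    p_common_cause = beta * p_channel
    p_redundant = p_both_independent + p_common_cause + p_voting_logic

    print(f"single channel : {p_single:.2e}")     # 1.00e-03
    print(f"redundant pair : {p_redundant:.2e}")  # about 6.01e-04
    # The naive expectation p_channel ** 2 = 1e-06 is never reached: shared failure
    # modes and the added components claw back most of the promised improvement.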

Readership

Normal Accidents is a very widely cited book, with more than 1,000 citations in the Social Science Citation Index and in the Science Citation Index up to 2003. A German translation of the book was published in 1987 with a second edition in 1992.

Literature

  • Literature by and about accident theory in the catalog of the German National Library

References

  1. Daniel E. Whitney: Normal Accidents by Charles Perrow. Massachusetts Institute of Technology, 2003.
  2. Charles Perrow: Normal Accidents: Living with High-Risk Technologies. Basic Books, New York 1984, p. 5.
  3. Charles Perrow: The President's Commission and the Normal Accident. In: D. Sills, C. Wolf, V. Shelanski (Eds.): Accident at Three Mile Island: The Human Dimensions. Westview, Boulder 1982, pp. 173-184.
  4. Benjamin K. Sovacool: A Critical Evaluation of Nuclear Power and Renewable Electricity in Asia. In: Journal of Contemporary Asia, Vol. 40, No. 3, August 2010, p. 381.
  5. Scott D. Sagan: Learning from Normal Accidents. In: Organization & Environment, March 2004. Archived from the original on July 14, 2004; retrieved November 6, 2017.