Swiss cheese model

The Swiss cheese model is a pictorial representation of the latent and active human failures that contribute to the breakdown of complex systems; it describes the chain of causes leading to an accident. The model was originally set out by the British psychologist James Reason of the University of Manchester and has since gained wide acceptance.

The Swiss cheese model likens safety layers to slices of cheese stacked one behind the other. The holes in the cheese, as in an Emmental, represent the imperfections of safety or protective measures in a safety system. As weak points, the holes can unexpectedly change their size and position. In an unfavorable combination of many causal factors, individual faults develop into damage, accidents, or catastrophic consequences. In the model, the holes then line up and, in Reason's words, there is an “opportunity for a trajectory” that passes through all safety barriers. Rather than a model, the Swiss cheese can also be regarded simply as a useful metaphor.

The "Swiss cheese model" of the accident development shows that although there are many layers of defense between dangers and accidents, there are deficiencies in every layer of defense which, if they are "coordinated", can cause the accident to occur.

Latent and active failures

Reason examined the accident reports of disasters such as Three Mile Island, Bhopal, the Challenger, King's Cross, Chernobyl, and Zeebrugge and proposed a distinction between active and latent failure; later he speaks of person and system errors. Reason's hypothesis is that most accidents can be traced back to one or more of four areas of failure: organizational influences, supervision, preconditions, and specific acts. Latent errors of the system were created long before the adverse event, at the management and organizational level, and, together with unfavorable circumstances and the unsafe acts of individuals (active failure), led to the accident.
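
The four areas of failure can be pictured as a simple taxonomy. The Python sketch below is a reading aid only; the factor descriptions are invented examples, not taken from Reason's case studies.

    from dataclasses import dataclass
    from enum import Enum

    class FaultArea(Enum):
        ORGANIZATIONAL_INFLUENCES = "organizational influences"  # latent
        SUPERVISION = "supervision"                              # latent
        PRECONDITIONS = "preconditions"                          # latent
        SPECIFIC_ACTS = "specific acts"                          # active

    @dataclass
    class ContributingFactor:
        description: str
        area: FaultArea

        @property
        def failure_type(self) -> str:
            # Only unsafe acts at the sharp end are active failures;
            # everything upstream is a latent condition.
            return "active" if self.area is FaultArea.SPECIFIC_ACTS else "latent"

    factors = [
        ContributingFactor("budget cuts shorten training", FaultArea.ORGANIZATIONAL_INFLUENCES),
        ContributingFactor("inexperienced crew paired for night flight", FaultArea.SUPERVISION),
        ContributingFactor("fatigue and poor weather", FaultArea.PRECONDITIONS),
        ContributingFactor("wrong switch selected", FaultArea.SPECIFIC_ACTS),
    ]
    for f in factors:
        print(f"{f.failure_type:6} | {f.area.value:26} | {f.description}")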

Application

The Swiss cheese model of accident causation is used in risk analysis and risk management. It is a starting point for safety planning in, among other fields, engineering and healthcare, rail transport, and aviation. In work and organizational psychology it is applied to system safety.

In everyday life, mistakes occur that, in the model, pass through a hole in one slice of cheese; in a functioning system they are stopped by the next slice, which acts as a safety barrier.

The Swiss cheese model also gave rise to critical incident reporting systems (CIRS) for reporting critical incidents and near misses.
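
What such a reporting system records can be sketched with a minimal data type; the fields below are a plausible assumption for illustration, not the schema of any actual CIRS.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class IncidentReport:
        # Hypothetical CIRS entry; the field names are invented for this sketch.
        reported_on: date
        description: str
        near_miss: bool            # True: a later barrier stopped the event
        barriers_breached: int     # how many 'cheese slices' were passed
        contributing_factors: list

    report = IncidentReport(
        reported_on=date(2013, 9, 1),
        description="Prescription typo caught by the pharmacist",
        near_miss=True,
        barriers_breached=1,
        contributing_factors=["time pressure", "illegible handwriting"],
    )

Collecting near misses, and not only actual harm, is what gives such systems their value: each report reveals a hole in one slice before the remaining slices are put to the test.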

Frosch described Reason's model in mathematical terms as a model of percolation theory, which he analyzed on a Bethe lattice.
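
A standard result from percolation theory indicates the kind of analysis involved (stated here generically, not as Frosch's specific computation): on a Bethe lattice in which every node has z neighbours, an unbounded connected cluster first appears when the occupation probability p exceeds the critical threshold

    p_c = 1 / (z − 1)

For z = 3, for example, p_c = 1/2: below the threshold, chains of connected “holes” die out almost surely; above it, an unbroken failure path through the defenses becomes possible.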

Error: person or system

When an adverse event occurs, accusations are mostly directed against the actors at the “front line”. According to Reason, however, damage or an accident can be viewed in two ways: as a person error or as a system error.

Unsafe conditions can originate in the ranks of management or supervision, for example when inexperienced pilots are paired as a crew for a night flight in known poor weather conditions. Organizational influences manifest themselves in such cases as spending cuts in pilot training in times of tight budgets.

The same basic framework is also applied in healthcare. A doctor prescribes a harmful dose because of a typo on the prescription: in one documented case, “3 x 5 tablets” of a drug daily was written instead of the intended “5 x 1 tablets”, tripling the daily dose. This can be seen as a careless mistake by the doctor, of the kind that occurs under time pressure. German doctors spend more than a quarter of their working time on bureaucracy in the healthcare system (bureaucracy costs: 4.3 billion euros per year in the practices of established German physicians).

The details of an active failure are often difficult to predict. Latent preconditions, however, can be identified before an adverse event occurs (see emergence), in order to “make systems robust against fundamentally existing personal and operational risks”. The approach to system safety is thus not primarily the avoidance of errors but the prevention of their consequences.

Theories and models of the causes of accidents

In his understanding of accident causation, Reason sets himself apart from the person approach that had been customary for years. Theories with a person approach consider primarily the unsafe acts or omissions of people, attributed, for example, to “forgetfulness, inattention, poor motivation, carelessness, negligence and recklessness”. Human errors are treated as a moral problem: “bad things happen to bad people”, which psychologists call the just-world fallacy.

The “symptoms versus causes” theory suggests that instead of focusing on the obvious faults immediately preceding the accident, its root causes should be investigated, because unsafe acts and unsafe conditions are merely symptoms. In accident causation research, “symptoms versus causes” is viewed as an admonition rather than a theory.

The domino model by Herbert William Heinrich (1931) is regarded as the first model of accident causation. The accident is compared to a row of dominoes: as soon as one stone falls, it knocks over all the others. It is a complete chain of factors, with the last stone corresponding to the accident itself. The model thus describes a simple cause-and-effect principle and should not be confused with the domino effect.

Apart from the domino and Swiss cheese models, other models of accident causation have attracted little attention. Criticism of the Swiss cheese model is rare.

References

  1. James Reason: The Contribution of Latent Human Failures to the Breakdown of Complex Systems. In: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. Vol. 327, No. 1241, April 12, 1990, pp. 475–484, doi:10.1098/rstb.1990.0090, JSTOR 55319.
  2. James Reason: Human error: models and management. In: British Medical Journal. Vol. 320, No. 7237, March 18, 2000, pp. 768–770, doi:10.1136/bmj.320.7237.768, PMID 10720363, PMC 1117770 (free full text).
  3. Erik Hollnagel, David D. Woods, Nancy Leveson (eds.): Resilience Engineering: Concepts and Precepts. Ashgate Publishing, Burlington 2006, ISBN 978-0-7546-4641-9.
  4. A typo and its consequences. (PDF; 72 kB) In: Rheinisches Ärzteblatt. Medical Association of North Rhine-Westphalia, September 2013, p. 16, accessed on November 18, 2013: “At the beginning there is a typo: instead of ‘5 x 1 tablets’ of a drug daily, ‘3 x 5 tablets’ are incorrectly noted on the prescription.”
  5. Hannes Meuli: Safety is a top priority. (PDF; 1419 kB) In: Swiss Traffic 42. Federal Office of Transport (FOT), Bern, April 2007, p. 7, archived from the original on January 18, 2015; accessed on May 20, 2016.
  6. Guidelines for Aviation English Training Programs. (PDF; 638 kB) Cir 323. ICAO, 2009, p. 35, accessed on December 7, 2013 (English).
  7. Friedemann W. Nerdinger, Gerhard Blickle, Niclas Schaper: Industrial and Organizational Psychology. Springer, Berlin/Heidelberg 2008, chap. 27: Psychology of occupational safety, pp. 501 f. (springer.com [accessed December 6, 2013]).
  8. Robert A. Frosch: Notes toward a theory of the management of vulnerability. In: Philip E. Auerswald, Lewis M. Branscomb, Todd M. La Porte, Erwann Michel-Kerjan (eds.): Seeds of Disaster, Roots of Response: How Private Action Can Reduce Public Vulnerability. Cambridge University Press, 2006, ISBN 0-521-85796-1, p. 88.
  9. Douglas A. Wiegmann, Scott A. Shappell: A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Ashgate Publishing, 2003, ISBN 0-7546-1873-0, pp. 48–49 (English).
  10. Jens Flintrop, Heike Korzilius: Bureaucracy in practices and hospitals: On the attempt to squeeze everyday life into numbers. In: BÄK and KBV (eds.): Deutsches Ärzteblatt. Vol. 109, No. 13, March 30, 2012, p. A-634/B-550/C-546 (aerzteblatt.de [PDF; 371 kB; accessed on November 18, 2013]): “According to a current KBV survey, a physician in private practice spends an average of 26 percent of his working time on bureaucracy.”
  11. Rebecca Beerheide, Heike Korzilius: You need staying power. (PDF; 189 kB) Reducing bureaucracy in medical practices. In: Deutsches Ärzteblatt. German Medical Association and National Association of Statutory Health Insurance Physicians (KBV), December 11, 2015, pp. 2120–2122, accessed on December 15, 2015 (figures from the Federal Statistical Office on behalf of the National Regulatory Control Council).
  12. Abdul Raouf: Theory of accident causes. In: Jeanne Mager Stellman (ed.): Encyclopaedia of Occupational Health and Safety. 4th edition. International Labour Office, Geneva 1998, ISBN 92-2-109203-8, Part VIII: Accident prevention, p. 56.6.
  13. M. J. Lerner: The desire for justice and reactions to victims. In: J. McCauley, L. Berkowitz (eds.): Altruism and Helping Behavior. Academic Press, New York 1970; quoted from James Reason: Human error: models and management. In: British Medical Journal. Vol. 320, No. 7237, March 18, 2000, pp. 768–770.
  14. Herbert William Heinrich: Industrial Accident Prevention: A Scientific Approach. McGraw-Hill, New York 1931 (H. W. Heinrich, D. Petersen, N. Roos: Industrial Accident Prevention. 5th edition. McGraw-Hill, New York 1980).
  15. Thierry Meyer, Genserik L. L. Reniers: Engineering Risk Management. Walter de Gruyter, Berlin 2013, ISBN 978-3-11-028515-4.
  16. Jop Groeneweg: Hazard analysis: the accident causation model. In: Jeanne Mager Stellman (ed.): Encyclopaedia of Occupational Health and Safety. 4th edition. International Labour Office, Geneva 1998, ISBN 92-2-109203-8, Part VIII: Accidents and safety management, pp. 57.6–57.12.
  17. Revisiting the “Swiss Cheese” Model of Accidents. (PDF; 533 kB) In: EEC Note No. 13/06. European Organisation for the Safety of Air Navigation (Eurocontrol), October 2006, accessed on November 21, 2013.