Human error denotes an error for which a person is responsible through action (operator error), through inaction, or through their physical and mental condition. Such misconduct can be committed knowingly or unknowingly.
The counterpart to human error is the technical defect.
Human mistakes can happen in all areas of life, in all situations, and in every group of people.
Errors by machine operators, plant operators, medical personnel and the like (e.g. in power plants, vehicles, or computer systems) are particularly serious for others; the so-called human factor is of great importance here.
Human errors often have the following causes:
- lack of concentration (concentration disorder)
- lack of responsiveness
- insufficient situational awareness
- carelessly made decisions, e.g. overlooking severe side effects when administering a drug; cf. decision-making
- assumption of incorrect premises (e.g. misunderstandings, incorrect understanding of messages or instructions, contradictory instructions, chains of command, communication problems such as interruption of the information path or incorrect perception; cf. perception errors, perception disorders and perceptual illusions)
- carelessness (misjudging or ignoring warning signals: "That can't really be" or "It will be fine somehow!"; cf. the Titanic)
- listlessness or even lethargy caused by a task (in extreme situations)
- disregard of common practice (e.g. rules, the state of the art, common procedures)
- overload or stress, e.g. through understaffing,
  - due to an accumulation of pending tasks (e.g. failure of supporting systems, the human-machine interface), or
  - due to a time-critical sequence of events; cf. time pressure
- lack of knowledge and skills
- excessive belief in the safe mastery of routines and in one's own competence; cf. competence trap
- overtiredness (the body's intrinsic system shutdown; in rail transport countered by the dead man's switch)
- illness, including pain
- influence of intoxicating or impairing substances (alcohol, drugs, or medication)
Incorrect organization of learning processes
According to Levinthal and March, organizations face the dilemma of having to develop new, high-quality sources of knowledge without neglecting the existing knowledge of their employees. Organizations address this problem by (over-)specializing or (over-)simplifying learning processes. Overall, this results in a bias in favor of exploiting existing knowledge resources and existing experience, which can be described as a threefold organizational myopia: with regard to operations in remote areas, to distant times (both past and future), and to errors already made. This means that organizations seldom recognize the learning potential that lies in mistakes. This sometimes also applies to engineering-psychological error research.
Some wrong decisions can take on a life of their own through error chains, i.e. errors, also from a technical point of view, become mutually dependent (domino effect), as in the Chernobyl disaster.
Many accidents can be traced back to human error, including major maritime disasters such as the sinking of the Titanic and the accident of the Exxon Valdez off Alaska. Human errors almost always have an impact, both internally and externally.
The internal effects include negative emotions, reproaches, a "guilty conscience", trauma, despair and, in extreme cases, suicide.
The external effects can include an accident or a disaster, the occurrence of damage, environmental pollution, guilt, punishment, dismissal, separation from partners or the public and, in extreme cases, the loss of human life.
Many fatal human errors are evaluated so that precautionary measures prevent them from happening again in the future. This can be achieved through further training of the target group, through revision of specifications, technical processes or designs, and through innovations in technology.
To avoid human errors, there are very often control bodies and work instructions, e.g. regulations issued by the state, associations, professional bodies, or companies.
Depending on the severity of the fault and the reproach, those responsible can face a variety of legal consequences, e.g. under civil law and under sanction norms. In serious cases, a criminal court can also impose a professional ban as a secondary penalty.
Under civil law, the question is generally whether the person must answer for the fault. The consequences include financial claims (recourse or compensation), warnings, termination of contracts, or transfers. Under employment and civil-service law, the provisions on conduct (e.g. "full dedication", "legal compliance") and, for example, disciplinary consequences come into consideration.
In criminal law, negligence comes into play, possibly in its special form of recklessness. Relevant sanction norms include negligent bodily harm, traffic offenses (mostly endangering road traffic), and negligent homicide.
The term human failure is a commonly used synonym for human error. However, it carries a prejudice that humans are the cause, without considering inadequate technology. In many cases, people did not fail as such; rather, given their abilities, they were unable to solve a problem, whether originating from themselves or from the technology, in good time.
There are also effects that can be traced back to both human error and technical defects. Since technical defects are often directly or indirectly attributable to human error (lack of maintenance, inadequate quality control, design errors, lack of preventive and follow-up care), most accidents can in fact be traced back to human error. However, technology can also fail due to random errors. The probability of this can be reduced, but not to zero; see, for example, soft errors, the bit error rate, and error correction methods.
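One such error correction method mentioned above can be illustrated with a minimal sketch (not from the source): triple modular redundancy, where each bit is stored three times and a majority vote corrects any single random flip. The function names here are hypothetical and chosen for clarity.

```python
def encode_tmr(bit: int) -> list[int]:
    """Triple modular redundancy: store the bit as three identical copies."""
    return [bit, bit, bit]

def decode_tmr(triplet: list[int]) -> int:
    """Majority vote over the three copies; corrects any single flipped copy."""
    return 1 if sum(triplet) >= 2 else 0

# A random soft error (e.g. a cosmic-ray strike) flips one stored copy...
stored = encode_tmr(1)
stored[0] ^= 1
assert decode_tmr(stored) == 1  # ...but the majority vote recovers the value

# Two simultaneous flips exceed the scheme's correction capability:
stored = encode_tmr(1)
stored[0] ^= 1
stored[1] ^= 1
assert decode_tmr(stored) == 0  # residual error: probability reduced, not zero
```

The second case illustrates the article's point: redundancy lowers the probability of an undetected random error, but cannot drive it to zero.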
Error management is a separate research area. It describes how humans in a human-machine system deal with errors, regardless of their cause. The ability to manage errors is an essential reason for keeping people in positions of high responsibility within technical systems. Good error management can decide whether a human error or a technical defect leads to a catastrophe or not.
- Petra Badke-Schaub, Gesine Hofinger, Kristina Lauche (eds.): Human Factors. Psychology of safe action in high-risk industries. 2nd Edition. Springer, Heidelberg 2012, ISBN 978-3-642-19885-4 .
- Dietrich Dörner : The logic of failure - strategic thinking in complex situations . Rowohlt, Reinbek 1993, ISBN 3-499-19314-0 .
- Ulrich Frey: The Blind Spot: Cognitive Errors in Science and Their Evolutionary Basis. Ontos, Heusenstamm 2007, ISBN 978-3-938793-51-0.
- Manfred Osten : The art of making mistakes. Plea for a mistake-friendly mistaken society . Suhrkamp, Frankfurt am Main 2006, ISBN 978-3518417447 .
- James Reason: Human Error. Psychological Risk Factors and Modern Technologies. Spektrum, Heidelberg 1994 (German edition of Human Error), ISBN 3-86025-098-1.
- James Reason: Human error . Cambridge University Press, Cambridge 1990, ISBN 0-521-30669-8 .
- Daniel A. Levinthal, James G. March: The Myopia of Learning. Strategic Management Journal, Volume 14, Special Issue S2, February 1993, pp. 95–112.
- Hans-Jürgen Weißbach, Michael Florian, Eva-Maria Illigen et al.: Technology Risks as Cultural Deficits. Sigma, Berlin 1994, p. 43 ff.