FORDEC

FORDEC or FOR-DEC is an acronym denoting a method for structured decision-making that is used mainly in aviation. It was developed by employees of the German Aerospace Center alongside the introduction of crew resource management training for pilots and flight attendants. The FOR-DEC model represents the current doctrine on decision-making processes in aviation in German-speaking countries.

Overview

The letters of the acronym stand for the individual steps of the decision-making process:

Facts: What is the situation?
Options: Which options for action are available?
Risks & Benefits: Which risks and benefits are associated with each option?
-
Decision: Which option is chosen?
Execution: The chosen option is carried out.
Check: Does the chosen course of action lead to the desired goal?

The hyphen "-" separates the situation-analysis phases from the rest of the decision-making process. It symbolizes a brief pause before the preferred option is implemented. In situations where a precise diagnosis of the situation matters, this pause can prevent important elements from being overlooked because of haste or strong prior assumptions (see below).
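The phase sequence can also be read procedurally. The following Python sketch is purely illustrative: the scenario, option names, scoring and goal check are invented for this example and are not part of FOR-DEC itself; only the order of the steps mirrors the acronym.

```python
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    benefit: int  # rough benefit score (invented scale)
    risk: int     # rough risk score (invented scale)

def for_dec(facts, options, goal_met):
    # F - Facts: take stock of the situation.
    print("Facts:", facts)

    # O - Options: which courses of action are available?
    # R - Risks & Benefits: weigh what speaks for and against each option.
    ranked = sorted(options, key=lambda o: o.benefit - o.risk, reverse=True)

    # "-" : brief pause before committing to the preferred option.
    print("Pause: reviewing", [o.name for o in ranked])

    # D - Decision: choose the preferred option.
    decision = ranked[0]

    # E - Execution: carry out the chosen option (only printed here).
    print("Executing:", decision.name)

    # C - Check: does the chosen path lead to the desired goal?
    if not goal_met(decision):
        print("Check failed: the cycle would start again with updated facts.")
    return decision

# Hypothetical usage with invented data:
opts = [Option("divert to alternate", benefit=8, risk=3),
        Option("continue to destination", benefit=5, risk=7)]
for_dec({"fuel": "low", "weather": "deteriorating"}, opts, goal_met=lambda o: o.risk < 5)
```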

Decisions made according to this metacognitive rule are assumed to be more robust against hasty impulses and emotional influences. In dynamic situations, however, a disadvantage can arise if decision-makers try to find the "optimal" solution regardless of the time available. Under pressure to act in time-critical situations, an option should first be chosen that improves the safety margin and, where possible, gains additional time reserves. FOR-DEC is currently used mainly in aviation and medicine, but the principle of structured decision-making is universally applicable.

What types of FOR-DEC are there?

FOR-DEC can be used for individual decisions as well as for group decisions in a team. In situations under high time pressure, however, the processing depth of the first three phases (FOR) is limited. The primary goal is then to find and apply a safe option that bridges an acute crisis: the point is not to make the absolute best decision, but to find a way immediately to avert a possible disaster. Damage control is the main goal here. If, on the other hand, time plays a subordinate role, for example in ordinary problem solving, the decision-making process can be improved through greater processing depth: a more careful situation analysis and a more extensive weighing of options and risks are then possible.

Is FOR-DEC relevant?

In more intuitive decision-making, two mechanisms known as heuristics play a major role: similarity matching and frequency gambling. These decision heuristics enable quick, automated decisions and actions, but they neglect information from the current situation. Instead, decision-makers draw on their own wealth of experience and rely on the best available fit with previous situations. Complex and unfamiliar situations fall outside this routine and therefore often lead to wrong decisions.

According to the human reliability approach, the likelihood of errors at work increases exponentially with time pressure, the complexity of the task, and the unfamiliarity of the situation. The estimated error frequency in simple routine tasks is around 1 in 1000. If, on the other hand, complex tasks have to be handled in unusual situations under time pressure, errors occur on average in three out of ten cases. This finding points to the great importance of situational factors in the occurrence of human error.

The Cockpit Human Factors Research Project of Deutsche Lufthansa describes four sources of error: technical problems (e.g. failure of a system), operational problems (e.g. time pressure), social conflicts and human error. Almost 40 percent of all errors in air traffic arise from a combination of operational problems, social conflicts and human error; the remaining variance is explained by the other 14 possible combinations of the error sources. Only five percent of all errors are based solely on human error. The human error rate rises drastically only under acute stress caused by time pressure or uncertainty. If mistakes by other crew members are not compensated for because teamwork fails, the consequences can be catastrophic. This is in line with Bubb's finding that time pressure, complexity and novelty lead to more errors.

FOR-DEC is used to analyze the situation and thus to bring structure to what is new and complex. The situation analysis prevents simple heuristics such as similarity matching and frequency gambling from determining the decision. Instead, the situation is analyzed deliberately, and options and risks are weighed on the basis of these facts.

FOR-DEC as an intervention - how can FOR-DEC be taught?

FOR-DEC training has led to a significant improvement in the decision-making skills of experienced pilots. But how were these effects achieved? And are they transferable to other groups of people? In 2002 Goeters carried out a longitudinal study with 17 professional pilots. The experimental design consisted of three phases: a pre-phase with competence assessment, the intervention, and a post-phase with renewed competence assessment. After the intervention, the pilots were better at analyzing problems: they gathered more facts and generated more options for action. Risk assessment and outcome evaluation, which were hardly observable before the intervention, also increased significantly.

The intervention consisted of CRM training (crew resource management training) in which FOR-DEC was embedded: leadership skills, communication, cooperation, situational awareness and decision-making were trained. Decision-making, the area in which the pilots in Goeters's study had performed worst, was trained particularly intensively, as this was where the greatest need for action existed. Goeters used FOR-DEC as the model for structuring decisions; collecting facts, generating options, weighing risks and evaluating results are steps of FOR-DEC, and after the intervention these steps were carried out much more frequently. Situational awareness includes the perception of the context and of the time pressure, on which the possible depth of the situation analysis depends. Leadership skills and the ability to cooperate both facilitate social interaction, and good social interaction in turn simplifies group decisions ("Decision") and the joint implementation ("Execution") of the decision. It remains to be investigated whether the transfer and success of FOR-DEC interventions depend on social skills being part of the intervention.

An awareness phase preceded the actual intervention. The pilots were informed about the frequency and circumstances of human error in air accidents. This was intended to emphasize the personal relevance of training measures that reduce the proportion of human error.

Realistic and authentic case scenarios and exercise contexts are crucial for the acceptance and success of FOR-DEC interventions. In the longitudinal study described above, the pilots were given written descriptions of realistic flight scenarios with incidents during the FOR-DEC exercise phase; this served as the first step in becoming familiar with FOR-DEC. The individual stages of FOR-DEC were then practiced in further exercises: the pilots created a checklist for checking the facts and trained divergent thinking in order to learn how to generate as many options for action as possible. In a third and final step, the pilots came together in small groups and analyzed crisis situations they had experienced in their own careers.

Six months after the intervention, the pilots still showed significant improvements in problem-solving skills in the flight simulator, which suggests that FOR-DEC had been integrated into their own problem-solving strategies over the long term.

Two psychological principles were used in the successful teaching of FOR-DEC: emphasizing personal relevance (a hold factor) serves to arouse interest and motivate participants to open up to the learning content, while practical training with case studies and personal experiences increases the likelihood of transfer. These principles are general and can be applied to different learning content. So far, however, there has been no research into whether FOR-DEC can be made usable for groups of people other than pilots with the help of practical training and hold factors.

However, it is not just pilots who have deficits in decision-making, but people in general. We are influenced by a variety of heuristics and biases. In situation analysis, we tend to overestimate the importance of the facts with which we are familiar: what is immediately noticeable and what we have already encountered in the past forms the basis for our decisions. This amounts to a strong selection from the available facts, and the selection criterion remains unconscious. FOR-DEC is intended to make the gathering of facts and the decision-making steps based on them consciously and deliberately controllable.

References

  1. H.-J. Hörmann: Judgment behavior and decision-making. In: H. Eißfeldt, K.-M. Goeters, H.-J. Hörmann, P. Maschke, A. Schiewe (eds.): Effective teamwork: Crew resource management training for pilots and air traffic controllers. German Aerospace Center, Hamburg 1994, ISSN 0939-298X.
  2. H.-J. Hörmann: FOR-DEC. A prescriptive model for aeronautical decision making. In: R. Fuller, N. Johnston, N. McDonald (eds.): Human Factors in Aviation Operations. Proceedings of the 21st Conference of the European Association for Aviation Psychology (EAAP). Vol. 3, Avebury Aviation, Aldershot, Hampshire 1995, ISBN 0-291-39825-1, pp. 17-23.
  3. H.-J. Hörmann: Training of aircrew decision making. In: AGARD Conference Proceedings 588, Aerospace Medical Panel Symposium '96. Prague, Czech Republic 1996.
  4. K.-M. Goeters: Evaluation of the effects of CRM training by the assessment of non-technical skills under LOFT. In: Human Factors and Aerospace Safety. (2, 1), 2002, pp. 71-86.
  5. Pateisky: Why Rumpelstiltskin had to fail, or: How do I make better decisions - alone and in a team? Part 1. In: Speculum - Journal of Gynecology and Obstetrics. (24, 3), 2006, pp. 14-16.
  6. M. P. Carlson, I. Bloom: The Cyclic Nature of Problem Solving: An Emergent Multidimensional Problem-Solving Framework. In: Educational Studies in Mathematics. (58), 2005, pp. 45-75.
  7. G. Fahnenbruck: On the emergence and development of a safety culture in aviation. 2010. [PowerPoint slides]. Retrieved from: http://www.mh-hannover.de/19593.html?&key=1-3
  8. Bubb: Human Reliability: A Key to Improved Quality in Manufacturing. In: Human Factors and Ergonomics in Manufacturing. (15, 4), 2005, pp. 353-368.
  9. M. Müller: Risk and risk management in aviation. In: Journal for Medical Training and Quality Assurance. (98, 7), 2004, pp. 559-565.
  10. H. Eißfeldt, K.-M. Goeters, H.-J. Hörmann, P. Maschke, A. Schiewe (eds.): Effective teamwork: Crew resource management training for pilots and air traffic controllers. German Aerospace Center, Hamburg 1994, ISSN 0939-298X.
  11. M. Mitchell: Situational interest: Its multifaceted structure in the secondary school mathematics classroom. In: Journal of Educational Psychology. (85, 3), 1993, pp. 424-436.
  12. E. Tulving, D. M. Thomson: Encoding specificity and retrieval processes in episodic memory. In: Psychological Review. (80, 5), 1973, pp. 352-373.
  13. D. Kahneman, P. Slovic, A. Tversky: Judgment under uncertainty: Heuristics and biases. Cambridge University Press, Cambridge 1982.