Belief perseverance


Belief perseverance, also known as conceptual conservatism or insistence on beliefs, is the tendency to maintain a belief even when confronted with new information that contradicts it. The initial hypothesis that is retained in this way is called the persistent initial hypothesis.

Since rationality entails conceptual flexibility, belief perseverance is compatible with the view that people sometimes act irrationally. The philosopher F. C. S. Schiller held that belief perseverance deserves to be "counted among the fundamental 'laws of nature'."

Empirical evidence from experimental psychology

According to Lee Ross and Craig A. Anderson, "beliefs are remarkably resilient in the face of empirical evidence that appears logically devastating." The following experiments can be understood, or reinterpreted, in terms of belief perseverance.

The first study of belief perseverance was conducted by Festinger, Riecken, and Schachter. These psychologists spent time with members of a cult who believed the world would end on December 21, 1954. After the prediction failed to come true, most of the followers still clung to their beliefs.

In studies in which test subjects were asked to revise probability estimates in light of new information, they showed a clear tendency to give the new evidence insufficient weight.
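This underweighting of new evidence can be made concrete with a small sketch. The following Python example is purely illustrative and not taken from the cited studies; the damping factor and the numbers are assumptions. It compares a full Bayesian revision of a probability estimate with a "conservative" revision that moves only part of the way toward the Bayesian posterior.

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Full Bayesian update of P(hypothesis) given one piece of evidence."""
    return (prior * likelihood_h) / (
        prior * likelihood_h + (1 - prior) * likelihood_not_h
    )

def conservative_update(prior, likelihood_h, likelihood_not_h, damping=0.5):
    """Illustrative 'conservative' update: moves only part of the way
    from the prior toward the full Bayesian posterior (damping is an
    assumed parameter, not an empirically measured one)."""
    posterior = bayes_update(prior, likelihood_h, likelihood_not_h)
    return prior + damping * (posterior - prior)

# Evidence that is four times as likely if the hypothesis is true.
prior = 0.5
full = bayes_update(prior, 0.8, 0.2)             # 0.80
damped = conservative_update(prior, 0.8, 0.2)    # 0.65: evidence underweighted
print(full, damped)
```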

In another study, mathematically competent adolescents and adults were given seven arithmetic problems and first asked to estimate the answers. They were then asked to compute the exact results with a pocket calculator that had been secretly manipulated to produce increasingly incorrect results (e.g., 252 × 1.2 = 452.4, although the correct result is 302.4). Despite reflecting on their estimation skills and techniques, around half of the subjects completed all seven tasks without ever questioning their belief that calculators are infallible.
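The following Python sketch is a hypothetical reconstruction of such a rigged calculator; the sample tasks and the growing error schedule are assumptions for illustration, not the ones used in the study.

```python
def rigged_calculator(tasks, base_error=0.3, error_step=0.1):
    """Hypothetical rigged calculator: multiplies each correct result
    by an error factor that grows from task to task."""
    for i, (a, b) in enumerate(tasks):
        correct = round(a * b, 1)
        displayed = round(correct * (1 + base_error + i * error_step), 1)
        yield correct, displayed

tasks = [(252, 1.2), (13, 17), (48, 9.5)]  # assumed example problems
for correct, displayed in rigged_calculator(tasks):
    print(f"correct: {correct}, calculator shows: {displayed}")
```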

Lee Ross and Craig A. Anderson led some subjects to believe that there was a positive correlation between a firefighter's stated preference for risk taking and his or her job performance; other subjects were told that the correlation was negative. In a detailed debriefing, the subjects were then informed that no correlation between risk taking and performance existed. The authors found that interviews conducted after the debriefing indicated a significant degree of belief perseverance.

In another study, subjects spent about four hours working through a hands-on instruction manual. At one point the manual introduced a formula that led them to believe that the volume of a sphere is 50% larger than it actually is. The subjects were then given real spheres and asked to determine their volumes in two ways: first by applying the formula, and then by filling each sphere with water, pouring the water into a box, and measuring the volume of the water directly.
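The size of the error the subjects had the opportunity to detect can be illustrated with the correct formula for the volume of a sphere, V = (4/3)πr³. The following Python sketch is a minimal illustration, assuming the manual's erroneous formula simply overstated the true volume by 50% (equivalent to V = 2πr³); the exact formula in the manual and the radius value here are assumptions.

```python
import math

def sphere_volume(radius):
    """Correct volume of a sphere: V = (4/3) * pi * r^3."""
    return (4 / 3) * math.pi * radius ** 3

def inflated_volume(radius):
    """Hypothetical stand-in for the manual's erroneous formula,
    assumed to overstate the true volume by 50%: V = 2 * pi * r^3."""
    return 2 * math.pi * radius ** 3

radius_cm = 5.0  # arbitrary example radius
formula_result = inflated_volume(radius_cm)   # what the subjects computed
measured = sphere_volume(radius_cm)           # what the water measurement shows
print(f"formula: {formula_result:.1f} cm^3, measured: {measured:.1f} cm^3")
# The measurement contradicts the formula, yet most subjects kept the formula.
```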

In the last experiment of this series, all 19 subjects held doctoral degrees in the natural sciences and were employed as researchers or professors at two large universities. They also performed the comparison between the two volume measurements a second time, with a larger sphere. All but one of the subjects stuck to the erroneous formula despite their empirical observations:

“Taken together, such experiments lead to a surprising conclusion: even when we deal with ideologically neutral conceptions of reality, and even though these conceptions have only recently been acquired, have come to us from unknown sources, or have been assimilated for spurious reasons, and even though abandoning them apparently carries few risks or costs, we tend, at least for a considerable time, neither to doubt such conceptions verbally nor to abandon them in practice, even when later events clearly contradict them.”

References

  1. Ulrich Frey: The blind spot: Cognitive errors in science and their evolutionary basis. Walter de Gruyter, 2013, p. 99.
  2. Moti Nissani: A cognitive reinterpretation of Stanley Milgram's observations on obedience to authority. In: American Psychologist. Vol. 45, No. 12, 1990, ISSN 1935-990X, pp. 1384–1385, doi:10.1037/0003-066x.45.12.1384.
  3. Roy F. Baumeister, Kathleen D. Vohs (eds.): Encyclopedia of Social Psychology. Sage Publications, Thousand Oaks, CA 2007, ISBN 978-1-4522-6568-1, pp. 109–110.
  4. J. F. Voss et al. (eds.): Informal Reasoning and Education. Erlbaum, Hillsdale 1991, p. 172.
  5. Leo H. T. West, A. Leon Pines: Cognitive Structure and Conceptual Change. Academic Press, Orlando, FL 1985, ISBN 978-0-12-744590-8, p. 211.
  6. William Ian Beardmore Beveridge: The Art of Scientific Investigation. Norton, New York 1957, p. 110.
  7. Daniel Kahneman: Judgment under Uncertainty: Heuristics and Biases. Cambridge University Press, Cambridge 1982, p. 144.
  8. Leon Festinger, Henry W. Riecken, Stanley Schachter: When Prophecy Fails. University of Minnesota Press, Minneapolis 1956.
  9. B. Kleinmuntz (ed.): Formal Representation of Human Judgment. Wiley, New York 1968, pp. 17–52.
  10. Lois Timnick: Electronic Bullies. In: Psychology Today. Vol. 16, 1982, pp. 10–15.
  11. C. A. Anderson: Abstract and Concrete Data in the Conservatism of Social Theories: When Weak Data Lead to Unshakeable Beliefs. In: Journal of Experimental Social Psychology. Vol. 19, No. 2, 1983, pp. 93–108, doi:10.1016/0022-1031(83)90031-8.
  12. Moti Nissani, Donna Marie Hoefler-Nissani: Experimental Studies of Belief Dependence of Observations and of Resistance to Conceptual Change. In: Cognition and Instruction. Vol. 9, No. 2, 1992, ISSN 0737-0008, pp. 97–111, doi:10.1207/s1532690xci0902_1.