Almost surely

From Wikipedia, the free encyclopedia

Almost surely (or almost certainly) is a concept of probability theory and a special case of the measure-theoretic concept almost everywhere. A random event that has probability one is called almost certain. Similarly, an event is called almost impossible if the probability of its occurrence is zero. These terms play an important role, for example, in the almost sure convergence of random variables, as occurs in the law of large numbers.

Definition

In a probability space $(\Omega, \mathcal{A}, P)$, an event $A \in \mathcal{A}$ is called almost certain if

$P(A) = 1$

applies. It is called almost impossible if

$P(A) = 0$.

Not every almost certain event must necessarily occur; it merely occurs on a set of measure one. The certain event $\Omega$ is itself almost certain, since it has probability one.

An almost impossible event may possibly occur, but only on a set of measure zero. The impossible event $\emptyset$ is likewise almost impossible.
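The distinction between certain and almost certain can be made concrete with a small finite example (not from the article; the space and names below are my own illustration): a two-point space in which a proper subset already carries the full probability mass.

```python
# Illustration: a finite probability space where an almost certain event
# differs from the certain event Omega.
# Omega = {0, 1} with point masses P({0}) = 1 and P({1}) = 0.

omega = {0, 1}
prob = {0: 1.0, 1: 0.0}  # probability of each elementary outcome

def P(event):
    """Probability of an event (a subset of omega)."""
    return sum(prob[w] for w in event)

certain = omega                # the certain event Omega
almost_certain = {0}           # proper subset, yet probability 1
almost_impossible = {1}        # nonempty, yet probability 0

print(P(certain))            # 1.0
print(P(almost_certain))     # 1.0 -> almost certain, but not all of Omega
print(P(almost_impossible))  # 0.0 -> almost impossible, but not empty
```

Here the outcome 1 can occur in principle, yet the event {1} is almost impossible, exactly as described above.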

Examples

For a uniform distribution on the interval $[0, 1]$:

  • The probability of randomly hitting exactly a given number $x \in [0, 1]$ is 0, although this event is not impossible. Likewise, the probability of hitting any number except $x$ is 1, but this event need not necessarily occur.
  • The probability of hitting any rational number in $[0, 1]$ is likewise 0, since there are only countably many rational numbers in this interval, so their set has Lebesgue measure 0. In contrast, the probability of hitting an irrational number is 1, although this event does not have to occur.
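The first example above can be sketched by simulation (a hedged illustration with my own variable names; `target` stands for the fixed number $x$): drawing many uniform samples from $[0, 1)$, essentially none ever equals a fixed number exactly, while missing it happens every time.

```python
# Monte Carlo sketch: under a uniform distribution on [0, 1), hitting one
# fixed number exactly has estimated probability 0, missing it probability 1.
import random

random.seed(0)      # fixed seed for reproducibility
n = 100_000
target = 0.5        # the fixed number x

hits = sum(1 for _ in range(n) if random.random() == target)
misses = n - hits

print(hits / n)     # estimated probability of hitting exactly 0.5
print(misses / n)   # estimated probability of hitting anything else
```

The event `random.random() == target` is not impossible (0.5 is exactly representable as a float), but it is almost impossible: in floating point its probability per draw is on the order of $2^{-53}$, mirroring the measure-zero argument in the article.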

Literature

  • Manfred Precht, Karl Voit, Roland Kraft: Mathematics 1 for Non-Mathematicians. 7th edition. Oldenbourg, Munich/Vienna 2006, ISBN 3-486-27407-4, section 5.6, p. 178.
  • Gerd Christoph, Horst Hackel: Starthilfe Stochastik. Teubner, Stuttgart et al. 2002, ISBN 3-519-00341-4, section 2.6, p. 32.