Conditional probability

from Wikipedia, the free encyclopedia

Conditional probability is the probability of the occurrence of an event $A$ under the condition that the occurrence of another event $B$ is already known. It is written as $P(A \mid B)$. The vertical line is to be read as "given" and is to be understood as follows: if the event $B$ has occurred, the possible outcomes are restricted to those in $B$. This also changes the probability; this new probability for the event $A$ is given by $P(A \mid B)$. The conditional probability can therefore be interpreted as a reassessment of the probability of $A$ when the information is available that the event $B$ has already occurred. Sometimes the notation $P_B(A)$ is also used, but it can also have other meanings.

For a generalized, abstract notion of conditional probability, see conditional expectation.

Motivation and definition

Sometimes one would like to investigate how strong the statistical influence of one variable is on another. For example, one may want to know whether smoking (event $A$) is carcinogenic, i.e. causes cancer (event $B$). The logical implication $A \Rightarrow B$ would require that the conclusion hold for all instances, i.e. that every smoker gets cancer. A single smoker who does not get cancer would falsify the statement "smoking causes cancer with logical certainty" or, equivalently, "every smoker gets cancer". Nevertheless, although there are smokers without cancer, there is a statistical connection between these two events: the probability of developing cancer is increased among smokers. This is the conditional probability $P(B \mid A)$ that someone will get cancer, given that they are a smoker.

The probability $P(A \mid B)$ that someone smokes, given that they have cancer, can also be examined stochastically. In the calculus of probability it should be noted that the term "condition" is not tied to any causal or temporal relationship. The conditional probability $P(B \mid A)$ is a measure of how strong the statistical influence of $A$ on $B$ is. It can be viewed as a stochastic measure of how likely the conclusion $A \Rightarrow B$ is. However, like all statistical quantities, it says nothing about the possible causality of the connection.

With this motivation one arrives at the following definition:

If $A$ and $B$ are arbitrary events and $P(B) > 0$, then the conditional probability of $A$ given $B$ (also: the probability of $A$ under the condition $B$), written $P(A \mid B)$ (with a vertical line between $A$ and $B$), is defined by:

$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$

Here $P(A \cap B)$ is the probability that $A$ and $B$ occur together; it is called the joint probability, compound probability, or intersection probability. $A \cap B$ denotes the set-theoretic intersection of the events $A$ and $B$.
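A minimal numerical sketch of this definition, using the smoking example from the motivation above; all frequencies are invented for illustration, not real data:

# Conditional probability by definition: P(A|B) = P(A ∩ B) / P(B).
# Here B = "the person smokes" and A = "the person develops cancer";
# all numbers are hypothetical.
p_B = 0.30            # P(B)
p_A_and_B = 0.06      # P(A ∩ B)

p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)    # 0.2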

Multiplication theorem

[Figure: the multiplication theorem illustrated by a decision tree.]

Rearranging the defining formula yields the multiplication theorem for two events:

$P(A \cap B) = P(A \mid B) \cdot P(B) = P(B \mid A) \cdot P(A)$

Generalizing the above expression of the multiplication theorem, which is valid for two events, yields the general multiplication theorem. Consider the case of $n$ random events $A_1, A_2, \dots, A_n$:

$P(A_1 \cap A_2 \cap \dots \cap A_n) = P(A_1) \cdot P(A_2 \mid A_1) \cdot P(A_3 \mid A_1 \cap A_2) \cdots P(A_n \mid A_1 \cap \dots \cap A_{n-1})$

Calculating with a decision tree is particularly intuitive here, since the diagram "calculates along", as it were: the data are easy to enter and lead sequentially through the correct calculation.
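The sequential calculation along a decision tree amounts to multiplying the conditional probabilities along one path. A small sketch; the card-drawing scenario is our own illustration, not taken from the article:

from fractions import Fraction

# General multiplication theorem along one path of a decision tree:
# P(A1 ∩ A2 ∩ A3) = P(A1) · P(A2 | A1) · P(A3 | A1 ∩ A2).
# Example: drawing three aces in a row from a 52-card deck without
# replacement; each factor is the conditional probability of the next step.
p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p)  # 1/5525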

Law of total probability

If only the conditional probabilities and the probabilities of the conditioning event are known, the total probability of $A$ results from

$P(A) = P(A \mid B) \cdot P(B) + P(A \mid \bar{B}) \cdot P(\bar{B})$

where $\bar{B}$ denotes the complementary event to $B$.

There is also a generalization here. Given are events $B_1, \dots, B_n$ with $P(B_i) > 0$ for all $i$ that form a partition of the sample space $\Omega$, i.e. they are pairwise disjoint and $\bigcup_{i=1}^{n} B_i = \Omega$. Then:

$P(A) = \sum_{i=1}^{n} P(A \mid B_i) \cdot P(B_i)$.
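A short sketch of the generalized formula, with hypothetical values for $P(B_i)$ and $P(A \mid B_i)$:

# Law of total probability over a partition B1, ..., Bn:
# P(A) = Σ P(A | Bi) · P(Bi). All values are invented for illustration.
p_B = [0.5, 0.3, 0.2]             # P(Bi); a partition, so they sum to 1
p_A_given_B = [0.10, 0.40, 0.25]  # P(A | Bi)

p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # 0.05 + 0.12 + 0.05 = 0.22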

Stochastic independence

If and only if $A$ and $B$ are stochastically independent, the following holds:

$P(A \cap B) = P(A) \cdot P(B)$,

which then leads to:

$P(A \mid B) = P(A)$ or $P(B \mid A) = P(B)$ (provided $P(B) > 0$ or $P(A) > 0$, respectively).

In other words: regardless of whether the event $B$ has occurred or not, the probability of the event $A$ is always the same.
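A quick check on the fair die, with two events chosen here so that they happen to be independent:

from fractions import Fraction

# A = "even number", B = "at most four" on a fair die. Then
# P(A ∩ B) = P(A) · P(B), hence P(A | B) = P(A).
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {1, 2, 3, 4}

def P(E):
    return Fraction(len(E), len(omega))  # Laplace probability

print(P(A & B) == P(A) * P(B))  # True: independent
print(P(A & B) / P(B) == P(A))  # True: P(A | B) equals P(A)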

Bayes' theorem

For the relationship between $P(A \mid B)$ and $P(B \mid A)$, Bayes' theorem results directly from the definition and the multiplication theorem:

$P(A \mid B) = \frac{P(B \mid A) \cdot P(A)}{P(B)}$.

The denominator $P(B)$ can be computed using the law of total probability.
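A numerical sketch of Bayes' theorem with the denominator expanded by the law of total probability; the numbers describe a stylized diagnostic-test setting and are invented:

# Bayes' theorem: P(A | B) = P(B | A) · P(A) / P(B).
# A = "condition present", B = "test positive"; hypothetical rates.
p_A = 0.01             # prior P(A)
p_B_given_A = 0.95     # P(B | A), sensitivity
p_B_given_notA = 0.05  # P(B | ¬A), false-positive rate

p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)  # total probability
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 3))  # ≈ 0.161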

Continuous random variable

For two random variables $X$ and $Y$ with joint density $f_{X,Y}$, a density $f_X$ of $X$ is given by

$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, \mathrm{d}y$.

If $f_X(x) > 0$, one can define a conditional density of $Y$, given (or assuming) the event $X = x$, by

$f_{Y \mid X}(y \mid x) = \frac{f_{X,Y}(x,y)}{f_X(x)}$.

Instead of $f_{Y \mid X}(y \mid x)$, one also writes $f(y \mid x)$ for the conditional density. The latter notation is not to be understood as the density of a random variable $Y \mid X = x$.

A joint density of $X$ and $Y$ is then obtained from the formula

$f_{X,Y}(x,y) = f_{Y \mid X}(y \mid x) \cdot f_X(x)$.

From this, a form of the law of total probability can be derived:

$f_Y(y) = \int_{-\infty}^{\infty} f_{Y \mid X}(y \mid x) \cdot f_X(x) \, \mathrm{d}x$

This process is known as marginalization.

It should be noted that densities yielding the same integral values represent the same probability distribution; densities are therefore not uniquely determined. An admissible choice for $f_X$, $f_Y$ and $f_{X,Y}$ is any measurable function that, in the integral, yields the correct probabilities $P(X \in A)$, $P(Y \in B)$ or $P(X \in A, Y \in B)$ for arbitrary $A$, $B$. The function $f_{Y \mid X}$ must satisfy

$f_{X,Y}(x,y) = f_{Y \mid X}(y \mid x) \cdot f_X(x).$

The formulas given above therefore apply only if the various densities are chosen appropriately.
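As a numerical sketch, the following approximates the marginal and the conditional density for the textbook joint density $f_{X,Y}(x,y) = x + y$ on the unit square (our own choice of example; here $f_X(x) = x + \tfrac12$):

import numpy as np

# Marginalization and conditional density for f(x, y) = x + y on [0,1]².
ys = np.linspace(0.0, 1.0, 200001)
x = 0.3

f_joint = x + ys                # f(x, y) along the slice X = x
f_X = float(np.mean(f_joint))   # ∫₀¹ f(x, y) dy ≈ x + 1/2 = 0.8
f_cond = f_joint / f_X          # f(y | x) = f(x, y) / f_X(x)

print(f_X)                      # ≈ 0.8
print(float(np.mean(f_cond)))   # ≈ 1.0: f(· | x) integrates to one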

Examples

Depending on the degree of overlap of two events $A$ and $B$, i.e. the size of the intersection $A \cap B$, the occurrence of the event $B$ can raise or lower the probability that the event $A$ has occurred, up to 1 ($A$ has almost certainly occurred) or down to 0 ($A$ has almost certainly not occurred).

Examples with dice

The examples below always relate to rolls of a fair standard die. The notation $\{1,2,3\}$ denotes the event that a one, a two or a three was rolled.

Special cases

$A$ and $B$ have no common elements ($A \cap B = \emptyset$). If $B$ occurs, $A$ can no longer occur, and vice versa.
Example:
Event $A = \{1,2,3\}$, event $B = \{4,5,6\}$. If $B$ occurs (i.e. a four, a five or a six is rolled), $A$ is certainly no longer possible.
$P(A \mid B) = 0$.
The event $A$ is a subset of the event $B$ ($A \subseteq B$).
Example:
Event $A = \{6\}$, event $B = \{2,4,6\}$.
$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(A)}{P(B)}$.
The probability of $A$ (here a priori $\tfrac16$) increases in this case inversely proportionally to the probability of $B$ (here $\tfrac12$; the probability increases here by a factor of 2, to $\tfrac13$).
In this case, knowledge of the absolute probabilities $P(A)$ and $P(B)$ is sufficient to calculate the conditional probability of $A$ given $B$. The exact size of the intersection does not need to be known, since $A \cap B = A$.
The event $A$ is a superset of the event $B$, i.e. the event $B$ is a subset of the event $A$ ($B \subseteq A$).
Example: Event $B = \{6\}$, event $A = \{2,4,6\}$. If $B$ has occurred (a six was rolled), $A$ (an even number) must therefore also have occurred.
$P(A \mid B) = 1$.

General case

More generally, in a Laplace experiment one needs the number of elements of the intersection $A \cap B$ to calculate the conditional probability of $A$ given $B$:

$P(A \mid B) = \frac{|A \cap B|}{|B|}$

The event $B = \{4,5,6\}$ of throwing at least a four (i.e. 4 or higher) has the a priori probability $P(B) = \tfrac36 = \tfrac12$.

If it is now known that an even number was rolled, i.e. that the event $A = \{2,4,6\}$ has occurred, then the conditional probability for $B$ given $A$ results from

$P(B \mid A) = \frac{|B \cap A|}{|A|} = \frac{|\{4,6\}|}{|\{2,4,6\}|} = \frac{2}{3}$.

The conditional probability in this case is higher than the a priori probability: $\tfrac23 > \tfrac12$.

If an odd number was rolled, i.e. the event $A' = \{1,3,5\}$ has occurred, then the conditional probability of $B$ given $A'$ equals

$P(B \mid A') = \frac{|B \cap A'|}{|A'|} = \frac{|\{5\}|}{|\{1,3,5\}|} = \frac{1}{3}$.

The conditional probability in this case is smaller than the a priori probability.

The event $A' = \{1,3,5\}$ has the a priori probability $\tfrac12$. When we know that the event $B = \{4,5,6\}$ has occurred, the probability of $A'$ changes according to

$P(A' \mid B) = \frac{|A' \cap B|}{|B|} = \frac{|\{5\}|}{|\{4,5,6\}|} = \frac{1}{3}$.

In this example too, the occurrence of the event $B$ makes the event $A'$ less likely; that is, the probability that the event $A'$ occurred as a result of the throw has become smaller than the a priori probability, because it is known that the event $B$ in any case occurred as a result of the throw.
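The dice calculations above can be reproduced directly from the counting formula $P(A \mid B) = |A \cap B| \,/\, |B|$:

from fractions import Fraction

# Conditional probabilities for a fair die by counting outcomes.
def cond(E, C):
    return Fraction(len(E & C), len(C))  # P(E | C) = |E ∩ C| / |C|

B = {4, 5, 6}       # "at least a four"
A = {2, 4, 6}       # "an even number"
A_odd = {1, 3, 5}   # "an odd number"

print(cond(B, A))      # P(B | A)  = 2/3
print(cond(B, A_odd))  # P(B | A') = 1/3
print(cond(A_odd, B))  # P(A' | B) = 1/3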

Throwing machine

[Figure: conditional probability as a ratio of partial areas.]

An illustrative example enables conditional probabilities to be understood directly using set diagrams. Consider a throwing machine that randomly throws objects (e.g. balls, darts) onto a certain surface $M$ (e.g. a wall), so that every location on the wall is hit with the same probability. The function $F$ assigns to the surface $M$, or to a certain partial area $A$ of the wall (e.g. any circle marked with a pen), its area $F(M)$ or $F(A)$, respectively. Then the probability that the projectile strikes within $A$ is proportional to the ratio of the partial area to the total area: $P(A) = \frac{F(A)}{F(M)}$.

Now it is additionally assumed that the projectile has hit within another partial area $B$ that overlaps with the partial area $A$. Then the probability that the projectile strikes within $B$ is $P(B) = \frac{F(B)}{F(M)}$. The conditional probability $P(A \mid B)$ that the projectile, under the additional condition $B$, also hits within the overlapping partial area $A$ is proportional to the area of that part of $A$ that also lies in $B$, i.e. the area of the intersection $F(A \cap B)$. Conversely, for an equally large intersection, the larger $F(B)$ is assumed to be, the less likely it is that a projectile hitting within $B$ also hits within $A$; $P(A \mid B)$ is thus inversely proportional to $F(B)$.

Thus, the probability of an impact in $A$, given an additionally assumed impact in $B$, results as the conditional probability

$P(A \mid B) = \frac{F(A \cap B)}{F(B)} = \frac{F(A \cap B)/F(M)}{F(B)/F(M)} = \frac{P(A \cap B)}{P(B)}$,

in accordance with the definition.
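A Monte Carlo sketch of the throwing machine, with the wall taken as the unit square and $A$, $B$ chosen as two overlapping vertical strips (our own choice); the relative hit frequencies approximate the area ratios:

import random

# Uniform hits on the unit square as the wall M; A and B overlap.
random.seed(1)
N = 200_000
hits_A = hits_B = hits_AB = 0
for _ in range(N):
    x, y = random.random(), random.random()
    a = x <= 0.5       # A: left half,            F(A)/F(M) = 0.5
    b = x >= 0.25      # B: right three-quarters, F(B)/F(M) = 0.75
    hits_A += a
    hits_B += b
    hits_AB += a and b

print(hits_A / N)          # P(A)     ≈ 0.5
print(hits_AB / hits_B)    # P(A | B) ≈ F(A ∩ B)/F(B) = 0.25/0.75 ≈ 0.333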

Further examples

  • For example, the conditional probability $P(\text{the ground is wet} \mid \text{it is raining})$ is usually high, because assuming that it is raining at a given point in time, one should expect the ground to get wet. Conditional probability thus asks how likely one event is when another is already known to have occurred: in this example, I know that it is raining and ask how likely it is that the ground is wet. Obviously, this conditional probability differs from the unconditional one.
  • The probability that someone who speaks French is French is neither the same as the probability that someone who is French also speaks French, nor do the two probabilities add up to 100%.
  • People v. Collins (1968): In this California criminal case, a defendant was wrongly convicted of robbery partly because, according to witness statements, the perpetrator, like the defendant, had both a beard and a mustache, which was considered rare. Whoever wears a beard, however, very often also wears a mustache; the court did not base its erroneous judgment on the conditional probabilities, as would have been correct, but treated the individual features as if they were independent.
  • Sports draws: In 2013, two German and two Spanish teams reached the semi-finals of the Champions League. The probability that the draw in this constellation would produce a purely German and a purely Spanish semi-final is one third, not around fifty percent. We are looking for the probability that the second German (Spanish) club is drawn as the second club, given that a German (Spanish) club was the first to be drawn from the pot. But if a German (Spanish) club was drawn as the first team, only one of the three teams remaining in the pot is also German (Spanish). Therefore, the probability sought is $\tfrac13$. This can also be seen from the possible pairings: six pairings of the four teams are possible for a single match, and the four teams can be split into two semi-final matches in three ways, of which only one yields a purely German (and hence purely Spanish) semi-final; that option is opposed by two others.
This simple case can also be solved elementarily without any conditional probability: each of the four teams is drawn against one of the other three teams with equal probability, and only one of those three teams comes from the same country. So the probability sought is $\tfrac13$ (see also the simulation sketch after this list).
  • In medicine, there is often only a conditional probability (conditionality) for the cause (causality) or etiology of a disease.[1]
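A minimal simulation of the semi-final draw discussed above (team labels are our own):

import random

# Two German (G) and two Spanish (S) teams drawn into two pairings;
# estimates P(purely German and purely Spanish semi-finals) ≈ 1/3.
random.seed(0)
teams = ["G1", "G2", "S1", "S2"]
N = 100_000
hits = 0
for _ in range(N):
    order = random.sample(teams, 4)   # random draw order
    tie1 = order[:2]                  # first pairing drawn
    if tie1[0][0] == tie1[1][0]:      # same country in the first tie
        hits += 1                     # then the second tie matches too
print(hits / N)                       # ≈ 0.333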


Web links

Wikibooks: Conditional Probabilities (learning and teaching materials)


Remarks

  1. See Dietrich von Engelhardt : Causality and conditionality in modern medicine. In: Heinrich Schipperges (Ed.): Pathogenesis. Basic features and perspectives of a theoretical pathology. Berlin / Heidelberg / New York / Tokyo 1985, pp. 32-58.