Discrete probability distribution

A discrete (probability) distribution or a discrete probability measure is a special probability measure in probability theory. In contrast to general probability measures, discrete probability distributions are easy to handle because they are defined only on mathematically "small" sets. On the one hand, this avoids the paradoxes exhibited, for example, by Vitali's theorem, and with them the need for more complex set systems such as the Borel σ-algebra; on the other hand, the use of integrals can be dispensed with in favor of (finite or infinite) sums.

The simplest example of a discrete probability distribution is the toss of a possibly biased coin: the probability distribution assigns to the event "the coin shows heads" a number corresponding to the probability of heads, and to the event "the coin shows tails" a number corresponding to the probability of tails. According to the intuitive understanding of probability, these two numbers add up to one.

This article deals with properties of discrete probability distributions that are characteristic of them. For the general properties of probability measures that also apply to discrete probability distributions, see the main article on probability measures.

Definition

A probability distribution is called a discrete probability distribution if one of the following three cases applies:

  • It is defined on a finite set.
  • It is defined on a countably infinite set (mostly the natural numbers).
  • It is defined on an arbitrary set, but assumes a positive value on at most countably many elements of this set. This means that there is an at most countable set $A$ with $P(A) = 1$ (mostly the natural numbers, embedded in the real numbers).

Random variables whose distribution is a discrete probability distribution are also referred to as discrete random variables .

Examples

An example of a distribution defined on a finite set is the coin toss mentioned above. It is defined on the set $\{\text{heads}, \text{tails}\}$ and in the fair case is given by

$P(\{\text{heads}\}) = P(\{\text{tails}\}) = \tfrac{1}{2}$.

Often the sides of the coin are also coded, for example as heads $\mapsto 1$, tails $\mapsto 0$ or as heads $\mapsto 1$, tails $\mapsto -1$. The coding does not change the fact that the distribution is discrete.
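As a minimal illustration of such a finite discrete distribution (a sketch, not part of the article; the names `coin` and `prob_of_event` are made up for this example), the fair coin can be written as a probability function whose values sum to one:

```python
from fractions import Fraction

# Fair coin: a discrete probability distribution on the set {"heads", "tails"}.
coin = {"heads": Fraction(1, 2), "tails": Fraction(1, 2)}

# Defining property of a probability function: the values sum to one.
assert sum(coin.values()) == 1

def prob_of_event(distribution, event):
    """Probability of an event, i.e. of a subset of the base set."""
    return sum(distribution[outcome] for outcome in event)

print(prob_of_event(coin, {"heads"}))           # 1/2
print(prob_of_event(coin, {"heads", "tails"}))  # 1
```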

A typical example of a probability distribution on a countably infinite set, more precisely on $\mathbb{N}_0 = \{0, 1, 2, \dots\}$, is the Poisson distribution. For a real parameter $\lambda > 0$ it is defined by

$P(\{k\}) = \frac{\lambda^k}{k!} e^{-\lambda}$ for $k \in \mathbb{N}_0$.

That this defines a probability distribution, i.e. that the probabilities sum to one, follows from the power series representation of the exponential function, since $\sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = e^{\lambda}$.
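A short numerical check of this normalization, as a sketch with an arbitrarily chosen parameter value λ = 2.5:

```python
from math import exp, factorial

lam = 2.5  # arbitrary positive real parameter, chosen for this sketch

def poisson_pmf(k, lam):
    """Probability function of the Poisson distribution: P({k}) = lam**k / k! * e**(-lam)."""
    return lam**k / factorial(k) * exp(-lam)

# Summing the first terms approaches 1, because sum(lam**k / k!) = e**lam.
partial_sum = sum(poisson_pmf(k, lam) for k in range(100))
print(partial_sum)  # ~1.0 up to floating-point error
```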

Both of the above examples can also be understood as discrete probability distributions on the base space $\mathbb{R}$. This makes it possible, for example, to define a distribution function and to convolve these distributions with further, possibly continuous, probability distributions.

Construction

Discrete probability distributions are mostly defined by means of probability functions. In the case of the base space $\mathbb{N}$, these are functions that assign to every natural number $k$ a real number $p_k$ between zero and one; all of these numbers must add up to one. One then sets

$P(A) = \sum_{k \in A} p_k$ for every $A \subseteq \mathbb{N}$.

In the example of the Poisson distribution above, the probability function would be, for example,

$p_k = \frac{\lambda^k}{k!} e^{-\lambda}$.

This method works for every discrete probability distribution. In fact, the discrete probability distributions are precisely those distributions that can be defined by a probability function, so the correspondence between discrete probability distributions and probability functions is bijective.
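The construction by means of a probability function can be sketched in a few lines of Python (the helper name `measure_from_pmf` is invented for this illustration): the probability of an arbitrary subset of the natural numbers is obtained by summing the weights of its elements.

```python
from math import exp, factorial

def measure_from_pmf(pmf):
    """Turn a probability function k -> p_k into a set function A -> P(A) = sum of p_k over A."""
    def P(event):
        return sum(pmf(k) for k in event)
    return P

# Example: the probability function of the Poisson distribution with parameter lam = 1.0.
lam = 1.0
poisson_pmf = lambda k: lam**k / factorial(k) * exp(-lam)

P = measure_from_pmf(poisson_pmf)
print(P({0}))        # e**(-1), about 0.3679
print(P(range(50)))  # about 1.0, the (truncated) whole base space
```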

Properties

Distribution function

Distribution function of a Bernoulli distribution, with characteristic jumps at 0 and at 1.

If one embeds a discrete distribution on the natural numbers (or on an arbitrary, at most countable subset of the real numbers) into the real numbers, a distribution function can be assigned to the distribution. For discrete probability distributions it is characterized by being piecewise constant: at every point $x$ with $P(\{x\}) > 0$ the distribution function has a "jump" upwards, by exactly the value $P(\{x\})$.
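To illustrate the piecewise constant shape and the jumps, here is a small sketch of the distribution function of a Bernoulli distribution; the parameter value p = 0.3 is an arbitrary choice for this example, not taken from the article.

```python
def bernoulli_cdf(x, p=0.3):
    """Distribution function F(x) = P(X <= x) of a Bernoulli distribution embedded in the reals."""
    if x < 0:
        return 0.0      # no probability mass below 0
    if x < 1:
        return 1.0 - p  # jump of height P({0}) = 1 - p at x = 0
    return 1.0          # jump of height P({1}) = p at x = 1

for x in (-0.5, 0.0, 0.5, 1.0, 1.5):
    print(x, bernoulli_cdf(x))
```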

Probability generating function

In addition to the classic generating functions (moment-generating function, cumulant-generating function and characteristic function), a probability-generating function can be assigned to discrete probability distributions. This is a polynomial or a power series that is uniquely associated with each such distribution. Probability-generating functions simplify, for example, the computation of moments such as the expected value or the variance, and they yield simple convolution identities.
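As a hedged sketch of how a probability-generating function yields moments (assuming the Poisson distribution from above with an arbitrarily chosen λ = 2): the expected value is G'(1) and the variance is G''(1) + G'(1) − G'(1)², approximated here by finite differences on the truncated power series.

```python
from math import exp

lam = 2.0  # arbitrarily chosen Poisson parameter for this sketch

def pgf(t, terms=150):
    """Probability generating function G(t) = sum_k p_k * t**k of the Poisson distribution,
    evaluated from its truncated power series."""
    total = 0.0
    term = exp(-lam)               # p_0 * t**0
    for k in range(terms):
        total += term
        term *= lam * t / (k + 1)  # p_{k+1} * t**(k+1) from p_k * t**k
    return total

# Numerical derivatives of G at t = 1 via central differences.
h = 1e-4
g1 = (pgf(1 + h) - pgf(1 - h)) / (2 * h)            # G'(1)  = E[X]
g2 = (pgf(1 + h) - 2 * pgf(1) + pgf(1 - h)) / h**2  # G''(1) = E[X(X - 1)]

print("mean     ~", g1)               # approximately lam = 2
print("variance ~", g2 + g1 - g1**2)  # approximately lam = 2
```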

Special discrete probability distributions

Some important discrete probability distributions and their constructions are listed below. The classification is not rigid; some distributions can be constructed in several ways.

Derived from the Bernoulli distribution

A starting point for modelling is the Bernoulli distribution. It models the toss of a coin in which "heads" is coded as 1 and "tails" as 0. The probability of "heads" is given by a number $p \in [0, 1]$; the result is thus a discrete probability distribution on $\{0, 1\}$.

From this distribution, the binomial distribution (the number of heads in $n$ independent tosses), the geometric distribution (the number of tosses up to the first head) and the negative binomial distribution can be derived directly. Geometric and negative binomial distributions are also defined in different variants. Further derivable probability distributions are the beta-binomial distribution (the success probability of the coin is itself assumed to be beta distributed), the Rademacher distribution and the two-point distribution (Bernoulli distributions on special values), and the Dirac distribution (the degenerate limiting case of a coin that always shows the same result).
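As an illustrative sketch of one such derivation (not part of the article; the parameter values n = 10 and p = 0.4 are arbitrary): the number of heads in n independent Bernoulli trials follows the binomial distribution, which a small simulation confirms empirically.

```python
import random
from math import comb

n, p, trials = 10, 0.4, 100_000
random.seed(1)

# Simulate: the sum of n Bernoulli(p) variables is the number of heads in n tosses.
counts = [0] * (n + 1)
for _ in range(trials):
    heads = sum(1 for _ in range(n) if random.random() < p)
    counts[heads] += 1

# Compare the empirical frequencies with the binomial probability function.
for k in range(n + 1):
    binom_pk = comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, round(counts[k] / trials, 4), round(binom_pk, 4))
```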

Derived from the urn model

Another starting point for modelling is the urn model, which is based on the discrete uniform distribution. A number of balls are divided into several groups (by colour, numbering etc.) and placed in an urn. Balls are then drawn from the urn, either with or without replacement, each ball being equally likely to be drawn (according to the uniform distribution). In this way distributions such as the hypergeometric distribution (drawing without replacement) can be constructed, for example.
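A small sketch of the urn model without replacement, compared against the hypergeometric probability function; the urn composition and the number of draws are arbitrary illustrative choices.

```python
import random
from math import comb

N, K, n = 20, 7, 5  # N balls in total, K of them red, n draws without replacement
trials = 100_000
random.seed(0)

urn = ["red"] * K + ["blue"] * (N - K)
counts = [0] * (n + 1)
for _ in range(trials):
    drawn = random.sample(urn, n)  # drawing without replacement
    counts[drawn.count("red")] += 1

# Hypergeometric probability function: P(k red balls among the n drawn).
for k in range(n + 1):
    hyper_pk = comb(K, k) * comb(N - K, n - k) / comb(N, n)
    print(k, round(counts[k] / trials, 4), round(hyper_pk, 4))
```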
