A probability-generating function, often simply called a generating function, is a special real function in probability theory. Every discrete probability distribution on the natural numbers, and every random variable with values in the natural numbers, can be assigned a probability-generating function. Conversely, the probability distribution or the distribution of the random variable can be uniquely reconstructed from its probability-generating function.
Because of this one-to-one correspondence, probability-generating functions make it possible to transfer properties of distributions and operations on random variables to properties and operations of functions. For example, the derivatives of the probability-generating function are related to the expected value, the variance, and the other moments of the random variable. Likewise, the addition of stochastically independent random variables, or equivalently the convolution of probability distributions, corresponds to the multiplication of the corresponding probability-generating functions. This simplification of important operations enables, for example, the investigation of complex stochastic objects such as the Bienaymé-Galton-Watson process.
Definition
The probability-generating function can be introduced in two ways: on the one hand by means of a probability distribution, on the other hand by means of the distribution of a random variable. The two approaches are equivalent in the sense that every probability distribution can be understood as the distribution of a random variable, and every distribution of a random variable is again a probability distribution. For both definitions we set $0^0 := 1$. Here $\mathbb{N}_0$ denotes the set of natural numbers including 0.
For probability distributions
If $P$ is a probability distribution on $\mathbb{N}_0$ with probability mass function $f_P(k) = P(\{k\})$, then the function

$$ m_P \colon [0,1] \to [0,1] $$

defined by

$$ m_P(t) = \sum_{k=0}^{\infty} f_P(k)\, t^k $$

is called the probability-generating function of $P$ or of $f_P$.
For random variables
For a random variable $X$ with values in $\mathbb{N}_0$, the probability-generating function

$$ m_X \colon [0,1] \to [0,1] $$

of $X$ (or of its distribution $P_X$) is defined as

$$ m_X(t) = \sum_{k=0}^{\infty} P(X=k)\, t^k. $$
Thus the probability-generating function of a random variable is precisely the probability-generating function of its distribution. Alternatively, the probability-generating function of a random variable can be defined via the expected value as

$$ m_X(t) = \operatorname{E}\left(t^X\right). $$
Elementary examples
Let $X$ be a Bernoulli-distributed random variable, that is $X \sim \operatorname{Ber}(p)$. Then $P(X=0) = 1-p$ and $P(X=1) = p$. Purely formally, one regards $X$ as a random variable with values in all of $\mathbb{N}_0$ and sets $P(X=n) = 0$ for $n \geq 2$. Then

$$ m_X(t) = \sum_{k=0}^{\infty} t^k P(X=k) = 1 - p + pt. $$
If the random variable $X$ is binomially distributed with parameters $n$ and $p$, that is $X \sim \operatorname{Bin}_{n,p}$, then

$$ P(X=k) = \binom{n}{k} p^k (1-p)^{n-k} \quad \text{for } k \leq n $$

and $P(X=k) = 0$ for $k > n$. The probability-generating function is then

$$ m_X(t) = \sum_{k=0}^{n} \binom{n}{k} (pt)^k (1-p)^{n-k} = (1 - p + pt)^n. $$

This follows from the binomial theorem.
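As a small numerical sanity check, the defining series and the closed form obtained from the binomial theorem agree; a minimal sketch in Python (function names and parameter values are ours, not from the text):

```python
from math import comb, isclose

def binomial_pgf_series(n, p, t):
    """PGF of Bin(n, p), evaluated as the defining series
    sum_k P(X = k) * t**k."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * t**k
               for k in range(n + 1))

def binomial_pgf_closed(n, p, t):
    """Closed form (1 - p + p*t)**n via the binomial theorem."""
    return (1 - p + p * t)**n

# The two expressions agree for every t in [0, 1]:
for t in (0.0, 0.25, 0.5, 1.0):
    assert isclose(binomial_pgf_series(10, 0.3, t),
                   binomial_pgf_closed(10, 0.3, t))
```

Note that evaluating either form at $t = 1$ returns 1, since the probabilities sum to 1.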
Properties
Properties as a function
Every probability-generating function is a power series and has a radius of convergence of at least 1; in particular, it converges for all $t \in [0,1]$. This follows from the fact that all coefficients of the power series are nonnegative and sum to 1; hence

$$ \sum_{k=0}^{\infty} \left| t^k P(X=k) \right| \leq 1 \quad \text{for } t \in [-1,1]. $$

The probability-generating functions therefore inherit all properties of power series on the interval under consideration: they are continuous on $[0,1]$ and infinitely differentiable on $[0,1)$.
Since each of the monomials $t \mapsto t^k$ is convex and monotonically increasing, and these properties are preserved under conical combinations, the probability-generating function is itself convex and monotonically increasing.
Invertibility
The probability-generating function determines the distribution of $X$ uniquely:

- If $X$ and $Y$ are $\mathbb{N}_0$-valued random variables with $m_X(t) = m_Y(t)$ for all $t \in [0,c]$ with some $c > 0$, then $P(X=k) = P(Y=k)$ for all $k \in \mathbb{N}_0$.
By the Taylor formula, for all $k \in \mathbb{N}_0$

$$ P(X=k) = \frac{m_X^{(k)}(0)}{k!}. $$

This relationship shows that the probabilities $P(X=k)$ can be "generated" by $m_X$, and that the probability mass function can be reconstructed from the probability-generating function.
Convolution and sums of random variables
If $X$ and $Y$ are independent $\mathbb{N}_0$-valued random variables, then the probability-generating function of $X + Y$ satisfies

$$ m_{X+Y}(t) = m_X(t)\, m_Y(t), $$

because along with $X$ and $Y$, the random variables $t^X$ and $t^Y$ are also independent. This generalizes directly to finite sums of independent random variables: if $X_1, \ldots, X_n$ are independent $\mathbb{N}_0$-valued random variables, then

$$ m_{X_1 + \dots + X_n}(t) = \prod_{i=1}^{n} m_{X_i}(t). $$
For the probability-generating function of the convolution $P * Q$ of two probability measures, it then follows directly that

$$ m_{P * Q}(t) = m_P(t)\, m_Q(t). $$
Example
Let $X_1, X_2$ be independent Bernoulli-distributed random variables with the same parameter $p$. Then, as is well known, the sum of the random variables is binomially distributed with parameters $2$ and $p$, so $X_1 + X_2 \sim \operatorname{Bin}_{2,p}$. With the probability-generating functions for the Bernoulli distribution and the binomial distribution derived above in the section Elementary examples,

$$ m_{X_1 + X_2}(t) = m_{X_1}(t)\, m_{X_2}(t) = (1-p+pt)^2 = m_{\operatorname{Bin}_{2,p}}(t). $$
Moment generation
For an $\mathbb{N}_0$-valued random variable $X$ and $k \in \mathbb{N}_0$,

$$ \operatorname{E}\left[\binom{X}{k}\right] = \frac{\lim_{t \uparrow 1} m_X^{(k)}(t)}{k!} $$

or, equivalently, for the $k$-th factorial moment,

$$ \operatorname{E}\bigl[X(X-1)\cdots(X-k+1)\bigr] = \lim_{t \uparrow 1} m_X^{(k)}(t). $$

Both sides of these equations are finite if and only if $\operatorname{E}\left[X^k\right]$ is finite.
In particular, the expected value and the variance of an $\mathbb{N}_0$-valued random variable can be determined from its probability-generating function:

$$ \operatorname{E}[X] = \lim_{t \uparrow 1} m_X'(t), $$

$$ \operatorname{Var}(X) = \lim_{t \uparrow 1} \left( m_X''(t) + m_X'(t) - \bigl(m_X'(t)\bigr)^2 \right). $$
Taking the left-hand limit is necessary here, since a power series need not be differentiable at the boundary of its interval of convergence.
Example
Let $X$ be a binomially distributed random variable, $X \sim \operatorname{Bin}_{n,p}$. Then

$$ m_X(t) = (pt+1-p)^n, \quad m_X'(t) = np(pt+1-p)^{n-1} \quad \text{and} \quad m_X''(t) = n(n-1)p^2(pt+1-p)^{n-2}. $$

Both derivatives are polynomials and can therefore be evaluated at $t = 1$ without any problems, so the left-hand limit need not be considered. This gives

$$ m_X'(1) = np, \quad m_X''(1) = n(n-1)p^2. $$

With the above results it follows that

$$ \operatorname{E}[X] = np, \quad \operatorname{Var}(X) = n(n-1)p^2 + np - (np)^2 = np(1-p). $$
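The derivative formulas for mean and variance can also be checked numerically. The following sketch in Python (parameter values and step size are ours) approximates the derivatives of the binomial probability-generating function by central finite differences, which is legitimate here because the pgf is a polynomial and so extends smoothly beyond $t = 1$:

```python
def pgf(t, n=10, p=0.3):
    # Probability-generating function of Bin(10, 0.3)
    return (1 - p + p * t)**n

h = 1e-4
d1 = (pgf(1 + h) - pgf(1 - h)) / (2 * h)            # approximates m'(1)
d2 = (pgf(1 + h) - 2 * pgf(1) + pgf(1 - h)) / h**2  # approximates m''(1)

mean = d1                   # E[X]   = m'(1)
var = d2 + d1 - d1**2       # Var(X) = m''(1) + m'(1) - m'(1)^2

assert abs(mean - 10 * 0.3) < 1e-4          # E[X] = n*p = 3
assert abs(var - 10 * 0.3 * 0.7) < 1e-4     # Var(X) = n*p*(1-p) = 2.1
```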
Linear transformation of random variables
A linear transformation of the random variable affects the probability-generating function as follows: for $a, b \in \mathbb{N}_0$,

$$ m_{aX+b}(t) = t^b\, m_X\left(t^a\right). $$
Example
If $X$ is a Bernoulli-distributed random variable, that is $X \sim \operatorname{Ber}(p)$, then for $a, b \in \mathbb{N}$ the random variable $aX + b$ has a two-point distribution on $\{b, a+b\}$. Its probability-generating function is then

$$ m_{aX+b}(t) = t^b (1-p+pt^a) = (1-p)t^b + p\, t^{a+b}. $$
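The transformation rule for the Bernoulli case can be checked directly; a minimal sketch in Python (the concrete values of $p$, $a$, $b$ are ours):

```python
from math import isclose

p, a, b = 0.4, 2, 3

def pgf_ber(t):
    # PGF of Ber(p)
    return 1 - p + p * t

def pgf_transformed(t):
    # Direct pgf of a*X + b, which takes the value b with probability
    # 1 - p and the value a + b with probability p.
    return (1 - p) * t**b + p * t**(a + b)

# Check m_{aX+b}(t) = t**b * m_X(t**a) at several points:
for t in (0.0, 0.5, 1.0):
    assert isclose(pgf_transformed(t), t**b * pgf_ber(t**a))
```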
Convergence
Pointwise convergence of probability-generating functions can be directly related to convergence in distribution:

- If $X, X_1, X_2, X_3, \dots$ are $\mathbb{N}_0$-valued random variables with associated probability-generating functions $m, m_1, m_2, m_3, \dots$, then $X_n$ converges in distribution to $X$ if and only if the probability-generating functions $m_n$ converge pointwise to $m$ for all $t \in [0, \varepsilon)$ with some $\varepsilon \in (0,1)$.

The statement holds correspondingly for the probability-generating functions of probability distributions and weak convergence.
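A classical illustration of this criterion is the law of rare events: $\operatorname{Bin}_{n, \lambda/n}$ converges in distribution to the Poisson distribution with parameter $\lambda$, and accordingly the binomial pgfs converge pointwise to the Poisson pgf $e^{\lambda(t-1)}$. A minimal numerical sketch in Python (the choice of $\lambda$ and $t$ is ours):

```python
from math import exp

lam, t = 2.0, 0.5

def pgf_bin(n):
    # PGF of Bin(n, lam/n): (1 - p + p*t)**n with p = lam / n
    p = lam / n
    return (1 - p + p * t)**n

pgf_poisson = exp(lam * (t - 1))  # pgf of Poisson(lam)

# As n grows, the binomial pgfs approach the Poisson pgf pointwise,
# reflecting convergence in distribution.
errors = [abs(pgf_bin(n) - pgf_poisson) for n in (10, 100, 1000)]
assert errors[0] > errors[1] > errors[2]
assert errors[2] < 1e-3
```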
Probability-generating functions of random sums
Using probability-generating functions, sums with a random number of summands can be handled easily. If $X_1, X_2, \dots$ are independent, identically distributed random variables with values in $\mathbb{N}_0$, and $T$ is a further random variable with the same range that is independent of the $X_i$, then the random variable

$$ Z = \sum_{i=1}^{T} X_i $$

has the probability-generating function

$$ m_Z(t) = m_T\bigl(m_{X_1}(t)\bigr). $$
This property is used, for example, in the analysis of the Galton-Watson process. By the above rule for computing the expected value combined with the chain rule, and using $m_{X_1}(1) = 1$,

$$ \operatorname{E}[Z] = m_T'(1)\, m_{X_1}'(1) = \operatorname{E}[T]\, \operatorname{E}[X_1], $$

which corresponds to Wald's equation.
For the variance,

$$ \operatorname{Var}(Z) = \operatorname{E}[T]\, \operatorname{Var}(X_1) + \operatorname{Var}(T)\, \operatorname{E}[X_1]^2. $$

This is exactly the Blackwell-Girshick equation. It likewise follows from the above rules for determining the variance, together with the product rule.
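A well-known instance of the composition rule for random sums is Poisson thinning: summing $T \sim \operatorname{Poi}(\lambda)$ independent Bernoulli variables $X_i \sim \operatorname{Ber}(p)$ yields $m_Z(t) = \exp(\lambda(1-p+pt-1)) = \exp(\lambda p (t-1))$, the pgf of $\operatorname{Poi}(\lambda p)$. A minimal sketch in Python (parameter values are ours):

```python
from math import exp, isclose

lam, p = 3.0, 0.25

def pgf_poisson(t, mean):
    # PGF of Poisson(mean)
    return exp(mean * (t - 1))

def pgf_ber(t):
    # PGF of Ber(p)
    return 1 - p + p * t

# Z = sum_{i=1}^T X_i with T ~ Poisson(lam), X_i ~ Ber(p):
# m_Z(t) = m_T(m_X(t)) coincides with the pgf of Poisson(lam * p).
for t in (0.0, 0.5, 1.0):
    m_z = pgf_poisson(pgf_ber(t), lam)
    assert isclose(m_z, pgf_poisson(t, lam * p))
```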
Multivariate probability-generating function
If $X = (X_1, \dots, X_k)$ is a $k$-dimensional random vector with values in $\mathbb{N}_0^k$, then the probability-generating function of $X$ is defined as

$$ m_X(t) := m_X(t_1, \dots, t_k) = \operatorname{E}\left( \prod_{i=1}^{k} t_i^{X_i} \right) = \sum_{x_1, \ldots, x_k = 0}^{\infty} f_P(x_1, \ldots, x_k)\, t_1^{x_1} \cdots t_k^{x_k} $$

with $f_P(x_1, \ldots, x_k) = P(X_1 = x_1, \dotsc, X_k = x_k)$.
Expected value, variance, and covariance
Analogously to the one-dimensional case, for all $i \in \{1, \dots, k\}$

$$ \operatorname{E}(X_i) = \frac{\partial m_X}{\partial t_i}(1, \dots, 1) $$

as well as

$$ \operatorname{Var}(X_i) = \frac{\partial^2 m_X}{\partial t_i^2}(1, \dots, 1) + \frac{\partial m_X}{\partial t_i}(1, \dots, 1)\left(1 - \frac{\partial m_X}{\partial t_i}(1, \dots, 1)\right) $$

and, for $i \neq j$,

$$ \operatorname{Cov}(X_i, X_j) = \frac{\partial^2 m_X}{\partial t_i\, \partial t_j}(1, \dots, 1) - \frac{\partial m_X}{\partial t_i}(1, \dots, 1) \cdot \frac{\partial m_X}{\partial t_j}(1, \dots, 1). $$
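These formulas can be illustrated with the multinomial distribution, whose pgf is $(p_1 t_1 + \dots + p_k t_k)^n$. The sketch below (Python; the concrete parameters and the finite-difference step are ours) approximates the partial derivatives at $(1, \dots, 1)$ numerically and recovers $\operatorname{E}(X_1) = np_1$ and $\operatorname{Cov}(X_1, X_2) = -np_1p_2$:

```python
n, p1, p2, p3 = 6, 0.2, 0.3, 0.5

def pgf(t1, t2, t3):
    # PGF of the multinomial distribution Mult(n; p1, p2, p3)
    return (p1 * t1 + p2 * t2 + p3 * t3)**n

h = 1e-4
# First partials at (1, 1, 1) give the marginal means E[X_i] = n * p_i.
d1 = (pgf(1 + h, 1, 1) - pgf(1 - h, 1, 1)) / (2 * h)
d2 = (pgf(1, 1 + h, 1) - pgf(1, 1 - h, 1)) / (2 * h)
# Mixed second partial via central differences.
d12 = (pgf(1 + h, 1 + h, 1) - pgf(1 + h, 1 - h, 1)
       - pgf(1 - h, 1 + h, 1) + pgf(1 - h, 1 - h, 1)) / (4 * h**2)

cov = d12 - d1 * d2
assert abs(d1 - n * p1) < 1e-5                 # E[X_1] = 1.2
assert abs(cov - (-n * p1 * p2)) < 1e-4        # Cov(X_1, X_2) = -0.36
```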
Examples
The following table lists the probability-generating functions of some common discrete distributions. Probability-generating functions of distributions that are not listed here can be found in the respective article on the distribution.

| Distribution | Probability-generating function |
| --- | --- |
| Bernoulli distribution $\operatorname{Ber}(p)$ | $1-p+pt$ |
| Two-point distribution on $\{a,b\}$ | $(1-p)t^a + p\,t^b$ |
| Binomial distribution $\operatorname{Bin}_{n,p}$ | $(1-p+pt)^n$ |
| Geometric distribution on $\mathbb{N}_0$ | $\dfrac{p}{1-(1-p)t}$ |
| Negative binomial distribution on $\mathbb{N}_0$ | $\left(\dfrac{p}{1-(1-p)t}\right)^r$ |
| Discrete uniform distribution on $\{1,\dots,n\}$ | $\dfrac{t(1-t^n)}{n(1-t)}$ |
| Logarithmic distribution | $\dfrac{\ln(1-pt)}{\ln(1-p)}$ |
| Poisson distribution $\operatorname{Poi}(\lambda)$ | $e^{\lambda(t-1)}$ |
| Generalized binomial distribution | $\prod_{i=1}^{n}(1-p_i+p_i t)$ |
| Multinomial distribution (multivariate) | $(p_1 t_1 + \dots + p_k t_k)^n$ |
In particular, the probability-generating function of the binomial distribution is the $n$-fold product of the probability-generating function of the Bernoulli distribution, since a binomially distributed random variable is precisely the sum of $n$ independent Bernoulli-distributed random variables. The same relationship holds between the geometric distribution and the negative binomial distribution.
Connection with other generating functions
The probability-generating function of a random variable $X$ with probability mass function $p$ is a special case of a generating function with coefficients $a_i = p(i)$ for $i \in \mathbb{N}_0$. Besides the probability-generating function, three further generating functions are used in stochastics; they are also defined for distributions that are not discrete. The moment-generating function is defined as $M_X(t) := \operatorname{E}\left(e^{tX}\right)$; accordingly, $m_X\left(e^t\right) = M_X(t)$. The characteristic function is defined as $\varphi_X(t) := \operatorname{E}\left(e^{itX}\right)$; accordingly, $m_X\left(e^{it}\right) = \varphi_X(t)$.

There is also the cumulant-generating function, defined as the logarithm of the moment-generating function; the notion of a cumulant is derived from it.
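The relation $m_X\left(e^t\right) = M_X(t)$ between the probability-generating and the moment-generating function can be checked numerically; a minimal sketch in Python for the binomial distribution (function names and parameters are ours):

```python
from math import comb, exp, isclose

n, p = 5, 0.4

def pgf(t):
    # PGF of Bin(n, p)
    return (1 - p + p * t)**n

def mgf(t):
    # Moment-generating function of Bin(n, p), computed from the pmf:
    # M_X(t) = E[exp(t X)] = sum_k P(X = k) * exp(t * k)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * exp(t * k)
               for k in range(n + 1))

# Check m_X(e^t) = M_X(t) at several points:
for t in (-1.0, 0.0, 0.5):
    assert isclose(pgf(exp(t)), mgf(t))
```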
Literature
- Klaus D. Schmidt: Measure and Probability. Springer, Berlin Heidelberg 2009, ISBN 978-3-540-89729-3, p. 370 ff.
- Achim Klenke: Probability Theory. 3rd edition. Springer-Verlag, Berlin Heidelberg 2013, ISBN 978-3-642-36017-6.
- Ulrich Krengel: Introduction to Probability Theory and Statistics. For studies, professional practice and teaching. 8th edition. Vieweg, Wiesbaden 2005, ISBN 3-8348-0063-5.
- Hans-Otto Georgii: Stochastics. Introduction to Probability Theory and Statistics. 4th edition. Walter de Gruyter, Berlin 2009, ISBN 978-3-11-021526-7.
- Christian Hesse: Applied Probability Theory. 1st edition. Vieweg, Wiesbaden 2003, ISBN 3-528-03183-2.
Individual evidence
1. Ehrhard Behrends: Elementary Stochastics. A learning book, co-developed by students. Springer Spektrum, Wiesbaden 2013, ISBN 978-3-8348-1939-0, p. 108, doi:10.1007/978-3-8348-2331-1.
2. Achim Klenke: Probability Theory. 3rd edition. Springer-Verlag, Berlin Heidelberg 2013, ISBN 978-3-642-36017-6, p. 79, doi:10.1007/978-3-642-36018-3.
3. Hans-Otto Georgii: Stochastics. Introduction to Probability Theory and Statistics. 4th edition. Walter de Gruyter, Berlin 2009, ISBN 978-3-11-021526-7, p. 111, doi:10.1515/9783110215274.
4. Hans-Otto Georgii: Stochastics. Introduction to Probability Theory and Statistics. 4th edition. Walter de Gruyter, Berlin 2009, ISBN 978-3-11-021526-7, p. 114, doi:10.1515/9783110215274.
5. Achim Klenke: Probability Theory. 3rd edition. Springer-Verlag, Berlin Heidelberg 2013, ISBN 978-3-642-36017-6, p. 83, doi:10.1007/978-3-642-36018-3.