Central limit theorem

Figure: approximation of symmetric (above) and skewed (below) binomial distributions (red) by the normal distribution (green)

The central limit theorem (of Lindeberg–Lévy) is an important result of probability theory. It explains the phenomenon that the additive superposition of many small independent random effects produces an overall effect that is at least approximately normally distributed, provided that none of the individual effects has a dominant influence on the variance.

The theorem is named after Lindeberg and Lévy.

There are various generalizations for which identical distribution is not a necessary requirement. Instead, other conditions are imposed that ensure that none of the variables has too great an influence on the result; examples are the Lindeberg condition and the Lyapunov condition. Further generalizations even allow a “weak” dependence of the random variables. This class of generalizations is collectively referred to as central limit theorems.

The name goes back to G. Pólya's 1920 paper “On the central limit theorem of probability theory and the moment problem”.

The central limit theorem of statistics under identical distribution

Let $X_1, X_2, X_3, \ldots$ be a sequence of random variables that are defined on the same probability space, all have the same distribution, and are independent (i.i.d. = independent and identically distributed). Assume further that both the expected value $\mu$ and the standard deviation $\sigma > 0$ of their common distribution exist and are finite.

Consider now the $n$-th partial sum $S_n = X_1 + X_2 + \cdots + X_n$ of these random variables. The expected value of $S_n$ is $n\mu$ and its variance is $n\sigma^2$. If one forms the standardized random variable

$$Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}},$$

then the central limit theorem states that the distribution function of $Z_n$ converges pointwise to the distribution function $\Phi$ of the standard normal distribution as $n \to \infty$. This corresponds exactly to the concept of convergence in distribution in stochastics. If $F_n$ denotes the distribution function of $Z_n$, this means that for every real $z$

$$\lim_{n \to \infty} F_n(z) = \Phi(z).$$

In a slightly different notation one obtains

$$\lim_{n \to \infty} P\!\left( \frac{\overline{X}_n - \mu}{\sigma / \sqrt{n}} \le z \right) = \Phi(z),$$

where

$$\overline{X}_n = \frac{1}{n} \left( X_1 + X_2 + \cdots + X_n \right)$$

is the mean of the first $n$ summands.
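
This convergence can be illustrated by simulation. The following sketch (Python with NumPy; the choice of exponential summands, sample sizes, and evaluation points is an arbitrary illustration, not prescribed by the theorem) simulates the standardized sums $Z_n$ and compares their empirical distribution function with $\Phi$:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 1.0   # mean and standard deviation of the Exponential(1) distribution
reps = 50_000          # number of simulated realizations of Z_n


def phi(z: float) -> float:
    """Distribution function of the standard normal distribution."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))


for n in (1, 5, 30, 200):
    samples = rng.exponential(scale=1.0, size=(reps, n))
    s_n = samples.sum(axis=1)                        # partial sums S_n
    z_n = (s_n - n * mu) / (sigma * math.sqrt(n))    # standardized sums Z_n
    for z in (-1.0, 0.0, 1.0):
        f_n = np.mean(z_n <= z)                      # empirical value of F_n(z)
        print(f"n={n:4d}  z={z:+.1f}  F_n(z)={f_n:.4f}  Phi(z)={phi(z):.4f}")
```

Even though the exponential distribution is strongly skewed, the printed values of $F_n(z)$ approach $\Phi(z)$ already for moderate $n$.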

Remarks

  • The central limit theorem can also be proved in an elementary way, that is, without the deep tool of the characteristic function. For this purpose, expected values of the form $\mathbb{E}\big(f(Z_n)\big)$ are examined, which on the one hand, when $f$ is the indicator function of a closed interval, correspond to a probability, and on the other hand can be approximated well when $f$ is a sufficiently smooth function. This method of elementary proof goes back to Jarl Waldemar Lindeberg.
  • Finite sample sizes raise the question of the quality of convergence. Under certain conditions the Berry–Esseen theorem provides an answer: if the third absolute centered moment $\mathbb{E}\big(|X_1 - \mu|^3\big)$ exists and is finite, then the convergence to the normal distribution is uniform and the speed of convergence is at least of order $1/\sqrt{n}$ (see the numerical sketch after this list).
  • Since the sum of stochastically independent normally distributed random variables is again normally distributed, the central limit theorem holds for them already in the finite case; more precisely, $Z_n$ is then already standard normally distributed for every $n \ge 1$.
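
The order $1/\sqrt{n}$ from the Berry–Esseen theorem can be observed numerically. The following sketch (Python with NumPy and SciPy; the Bernoulli parameter and the sample sizes are arbitrary choices) computes the exact supremum distance between the distribution function of $Z_n$ and $\Phi$ for Bernoulli summands; the product $\sqrt{n} \cdot \sup_z |F_n(z) - \Phi(z)|$ stays roughly constant, consistent with a rate of order $1/\sqrt{n}$:

```python
import math
import numpy as np
from scipy.stats import binom, norm

p = 0.3                                   # Bernoulli parameter
mu, sigma = p, math.sqrt(p * (1 - p))     # mean and standard deviation of X_1

for n in (10, 40, 160, 640):
    k = np.arange(0, n + 1)
    z_k = (k - n * mu) / (sigma * math.sqrt(n))   # jump points of F_n
    f_at = binom.cdf(k, n, p)                     # F_n at the jump points
    f_left = binom.cdf(k - 1, n, p)               # left-hand limits of F_n there
    d_n = np.max(np.maximum(np.abs(f_at - norm.cdf(z_k)),
                            np.abs(f_left - norm.cdf(z_k))))
    print(f"n={n:4d}  sup|F_n - Phi| = {d_n:.4f}  sqrt(n)*sup = {math.sqrt(n) * d_n:.3f}")
```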

Generalizations

A generalization of the central limit theorem is the multidimensional central limit theorem. It provides information about the convergence of the distributions of random vectors to the multidimensional standard normal distribution.
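
A minimal two-dimensional sketch (Python with NumPy; the mean vector, covariance matrix, and the centered-exponential summands are arbitrary illustrative choices) forms standardized sums of i.i.d. random vectors, checks that their sample mean and sample covariance approach those of the bivariate standard normal distribution, and compares one marginal with $\Phi$:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, -2.0])                 # mean vector of the summands
cov = np.array([[2.0, 0.8],
                [0.8, 1.0]])               # covariance matrix Sigma
L = np.linalg.cholesky(cov)                # Sigma = L L^T
L_inv = np.linalg.inv(L)

n, reps = 200, 20_000
# i.i.d. non-normal vectors: centered exponential components (mean 0, covariance I),
# then linearly transformed to have mean mu and covariance Sigma
raw = rng.exponential(size=(reps, n, 2)) - 1.0
x = mu + raw @ L.T

s_n = x.sum(axis=1)                                  # vector-valued partial sums S_n
z_n = ((s_n - n * mu) / np.sqrt(n)) @ L_inv.T        # standardized random vectors

print("sample mean of Z_n:\n", z_n.mean(axis=0))     # close to (0, 0)
print("sample covariance of Z_n:\n", np.cov(z_n.T))  # close to the identity matrix
print("P(first component <= 1):", np.mean(z_n[:, 0] <= 1.0), " (Phi(1) = 0.8413)")
```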

Another generalization is the central limit theorem of Lindeberg–Feller. It also allows certain dependencies between the random variables by combining them into groups and requiring independence only within these groups. The sequence of these groups is called a scheme (triangular array) of random variables. The Lindeberg condition and the Lyapunov condition can also be formulated for schemes of random variables and thus provide criteria for convergence in this setting.
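
The non-identically distributed case can also be explored by simulation. In the following sketch (Python with NumPy; the scheme of uniform variables with varying scales is an arbitrary example satisfying the Lindeberg condition, since the variables are uniformly bounded and the variances of the row sums grow without bound), the standardized row sums approach the standard normal distribution:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
reps = 10_000

for n in (10, 100, 1000):
    scales = 1.0 + 0.5 * np.sin(np.arange(1, n + 1))    # bounded, non-constant scales a_k
    # row n of the scheme: independent X_k ~ Uniform(-a_k, a_k), not identically distributed
    x = rng.uniform(-1.0, 1.0, size=(reps, n))
    x *= scales
    s_n_sq = np.sum(scales**2 / 3.0)                     # Var(Uniform(-a, a)) = a^2 / 3
    z_n = x.sum(axis=1) / math.sqrt(s_n_sq)              # standardized row sums
    print(f"n={n:5d}  mean={z_n.mean():+.3f}  var={z_n.var():.3f}  "
          f"P(Z_n <= 1) = {np.mean(z_n <= 1.0):.4f}  (Phi(1) = 0.8413)")
```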

References

  1. Central limit theorem. In: Guido Walz (ed.): Lexicon of Mathematics. 1st edition. Spektrum Akademischer Verlag, Mannheim/Heidelberg 2000, ISBN 3-8274-0439-8.
  2. Jeff Miller: Earliest Known Uses of Some of the Words of Mathematics.
     George Pólya: On the central limit theorem of probability theory and the moment problem. Mathematische Zeitschrift 8, 1920, pp. 171–181.
  3. Jarl Waldemar Lindeberg: A new derivation of the exponential law in the calculus of probability. Mathematische Zeitschrift 15, 1922, pp. 211–225.
     See also Jörg Bewersdorff: Statistics – how and why it works. A mathematical reader. Vieweg+Teubner Verlag 2011, ISBN 978-3-8348-1753-2, doi:10.1007/978-3-8348-8264-6, pp. 139–146.