De Moivre-Laplace theorem

As the number of trials increases, the discrete binomial distribution approaches the continuous normal distribution.

The de Moivre-Laplace theorem, also called the theorem of de Moivre-Laplace or the de Moivre-Laplace central limit theorem, is a proposition of probability theory. According to this theorem, the binomial distribution converges for $n \to \infty$ and probabilities $0 < p < 1$ to the normal distribution. For large sample sizes the normal distribution can therefore be used as an approximation of the binomial distribution, which is exploited in particular in the normal approximation and in hypothesis tests. For $p = \tfrac{1}{2}$ the approximation can be illustrated experimentally with a Galton board.

Historically, the de Moivre-Laplace theorem is the first central limit theorem. Abraham de Moivre proved the statement for $p = \tfrac{1}{2}$ in 1730, and Pierre-Simon Laplace proved the general case in 1812.

Statement

Let $X_1, X_2, X_3, \dots$ be a sequence of independent Bernoulli-distributed random variables with parameter $p \in (0,1)$, and set $\sigma^2 = p(1-p)$. Then the sum $S_n = \sum_{i=1}^{n} X_i$ is binomially distributed with parameters $n \in \mathbb{N}$ and $p \in (0,1)$, and we have:

(1) $\quad \operatorname{P}\left(S_{n}=k\right) = B(k \mid p, n) \approx \frac{1}{\sqrt{2\pi n\sigma^{2}}}\,\exp\left(-\frac{n}{2\sigma^{2}}\left(\frac{k}{n}-p\right)^{2}\right)$
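The local approximation (1) can be checked numerically with a short sketch using only the Python standard library; the helper names `binom_pmf` and `local_approx` are illustrative, not part of any established API.

```python
from math import comb, exp, sqrt, pi

def binom_pmf(n, p, k):
    """Exact binomial probability P(S_n = k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def local_approx(n, p, k):
    """Right-hand side of (1), with sigma^2 = p(1 - p)."""
    var = p * (1 - p)
    return exp(-(n / (2 * var)) * (k / n - p)**2) / sqrt(2 * pi * n * var)

# Near k = np the two expressions already agree well for moderate n.
n, p = 100, 0.3
for k in (25, 30, 35):
    print(k, round(binom_pmf(n, p, k), 5), round(local_approx(n, p, k), 5))
```

Away from $k \approx np$ the relative error grows, which is one reason the interval version (2) below is the more robust formulation.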

(2) $\quad \lim_{n\to\infty} \operatorname{P}\left(x_{1} \leq \frac{S_{n}-np}{\sqrt{n}} \leq x_{2}\right) = \frac{1}{\sqrt{2\pi\sigma^{2}}}\int_{x_{1}}^{x_{2}}\exp\left(-\frac{x^{2}}{2\sigma^{2}}\right)\mathrm{d}x \quad$ for all $x_{1}, x_{2} \in \mathbb{R}$ with $x_{1} < x_{2}$.

The de Moivre-Laplace theorem states that the distribution of the random variables $\frac{S_{n}-np}{\sqrt{n}}$ converges weakly for $n \to \infty$ to the normal distribution $\mathcal{N}\left(0, \sigma^{2}\right)$ with variance $\sigma^{2} = p(1-p)$.
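The weak convergence in (2) can likewise be observed numerically. The sketch below compares the exact binomial probability that the standardized sum lies in $[x_1, x_2]$ with the limiting integral, written via the distribution function of the standard normal; the function names are assumptions for illustration, and the probabilities are evaluated in log space so that large $n$ does not underflow.

```python
from math import lgamma, log, exp, erf, sqrt

def binom_pmf(n, p, k):
    """P(S_n = k), evaluated in log space to avoid underflow for large n."""
    lp = (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
          + k * log(p) + (n - k) * log(1 - p))
    return exp(lp)

def lhs(n, p, x1, x2):
    """P(x1 <= (S_n - np)/sqrt(n) <= x2) from the exact binomial weights."""
    return sum(binom_pmf(n, p, k) for k in range(n + 1)
               if x1 <= (k - n * p) / sqrt(n) <= x2)

def rhs(p, x1, x2):
    """The limiting integral in (2), expressed through Phi with sigma^2 = p(1-p)."""
    s = sqrt(p * (1 - p))
    phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
    return phi(x2 / s) - phi(x1 / s)

# For n = 2000 the two sides differ by less than 0.01.
n, p = 2000, 0.3
print(lhs(n, p, -0.5, 0.5), rhs(p, -0.5, 0.5))
```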

Applications

The de Moivre-Laplace theorem is the theoretical basis of the normal approximation, a method by which the binomial distribution is approximated by the normal distribution.

Reformulating the statement above by a substitution and using the distribution function $\Phi$ of the standard normal distribution, one obtains

$\lim_{n\to\infty}\left|\operatorname{P}(S_{n} \leq t) - \Phi\left(\frac{t-np}{\sqrt{np(1-p)}}\right)\right| = 0$

for all $t \in \mathbb{R}$.

In this way, probabilities for the binomially distributed random variable $S_{n}$ can be approximated by values of the distribution function of the standard normal distribution, which are usually taken from a table of the standard normal distribution.

The de Moivre-Laplace theorem gives sufficiently good approximations if $n$ and $p$ satisfy the following condition:

$np(1-p) > 9$

In the normal approximation, a so-called continuity correction is also introduced to reduce the approximation error; it consists of correction terms $\pm\tfrac{1}{2}$ and is intended to compensate for the transition from a discrete to a continuous probability distribution.
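As a minimal sketch, the normal approximation with continuity correction can be implemented with the standard library alone, computing $\Phi$ from the error function; the function names below are assumptions for illustration.

```python
from math import erf, sqrt

def phi(x):
    """Distribution function Phi of the standard normal, via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def binom_cdf_approx(n, p, t, continuity=True):
    """Approximate P(S_n <= t) for a B(n, p) variable by the normal approximation.

    The rule of thumb np(1-p) > 9 should hold; with continuity=True the
    correction term +1/2 is applied."""
    mu = n * p
    sigma = sqrt(n * p * (1 - p))
    c = 0.5 if continuity else 0.0
    return phi((t + c - mu) / sigma)

# Example: B(48, 1/4), P(S_n <= 14), with and without continuity correction
print(binom_cdf_approx(48, 0.25, 14), binom_cdf_approx(48, 0.25, 14, continuity=False))
```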

Example

Plot of the density of the normal distribution with μ = 12 and σ = 3 and the binomial distribution with n = 48 and p = 1/4

The following calculations illustrate the effect of the continuity correction.

Given is a binomial distribution with $n = 48$ and $p = \tfrac{1}{4}$; consequently $np(1-p) = 9$ holds. We compare it with a normal distribution with mean $\mu = np = 12$ and variance $\sigma^{2} = np(1-p) = 9$.

We now seek the answer to the question: what is the probability that the values are less than or equal to $\mu - 3\sigma = 3$? The calculations and estimates give the following results:

• Binomial distribution
$P(0 \leq X \leq 3) = \sum_{k=0}^{3}\binom{48}{k}\cdot\left(\frac{1}{4}\right)^{k}\cdot\left(\frac{3}{4}\right)^{48-k} \approx 0.004$
The approximate value was taken from the adjacent plot.
• Normal distribution with continuity correction
$\begin{aligned} P(0 \leq X \leq 3) &\approx \Phi\left(\tfrac{3+0.5-12}{3}\right) - \Phi\left(\tfrac{0-0.5-12}{3}\right) = \Phi\left(-2.83\right) - \Phi\left(-4.17\right) \\ &= 1 - \Phi\left(2.83\right) - 1 + \Phi\left(4.17\right) \approx -0.99767 + 1 = 0.00233 \end{aligned}$
In this calculation it should be noted that $\Phi(-x) = 1 - \Phi(x)$ holds for reasons of symmetry, and that $\Phi(x) \approx 1$ for $x > 4.09$.
• Normal distribution without continuity correction
$\begin{aligned} P(0 \leq X \leq 3) &\approx \Phi\left(\tfrac{3-12}{3}\right) - \Phi\left(\tfrac{0-12}{3}\right) = \Phi\left(-3\right) - \Phi\left(-4\right) \\ &= 1 - \Phi\left(3\right) - 1 + \Phi\left(4\right) = -0.99865 + 0.99997 = 0.00132 \end{aligned}$

Overall, the values of these calculations suggest that the continuity correction yields a better agreement with the value of the binomial distribution.
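The two normal-approximation values of the example can be reproduced in a few lines; computing $\Phi$ from the error function is slightly more precise than reading table values, so the last digits deviate marginally from the figures above. The variable names are illustrative.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal distribution function via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

mu, sigma = 12.0, 3.0  # mu = np, sigma = sqrt(np(1-p)) for n = 48, p = 1/4

# P(0 <= X <= 3) with and without continuity correction
with_cc = phi((3 + 0.5 - mu) / sigma) - phi((0 - 0.5 - mu) / sigma)
without_cc = phi((3 - mu) / sigma) - phi((0 - mu) / sigma)

print(round(with_cc, 5), round(without_cc, 5))
```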