# Law of Large Numbers

Visualization of the strong law of large numbers: the y-axis shows the relative frequency of sixes rolled, while the x-axis shows the number of rolls. The horizontal gray line marks the probability of rolling a six, 16.67 % (= 1/6); the black line shows the proportion of sixes rolled in a specific experiment up to the respective number of rolls.
Visualization of the weak law of large numbers in the dice example: for increasing n, the distribution of the relative frequency concentrates ever more tightly around the value 1/6.

The term law of large numbers (German: Gesetz der großen Zahlen, abbreviated GGZ) refers to certain limit theorems of probability theory.

In their simplest form, these theorems state that the relative frequency of a random outcome usually stabilizes around its theoretical probability when the underlying random experiment is repeated over and over under the same conditions. The frequently used formulation that the relative frequency "approaches the probability more and more" is misleading, since outliers can occur even after a large number of repetitions; the approximation is not monotonic.

Formally, these are statements about the convergence of the arithmetic mean of random variables, usually divided into "strong" laws of large numbers (almost sure convergence) and "weak" laws of large numbers (convergence in probability).

## Example: tossing a coin

The probability that a coin lands heads up when tossed is ½. The more often the coin is tossed, the less likely it becomes that the proportion of tosses showing heads (i.e. the relative frequency of the event "heads") deviates from the theoretical probability ½ by more than any given amount. On the other hand, it is quite likely that the absolute difference between the number of heads and half the total number of tosses grows.

In particular, these laws of large numbers do not mean that an event which has so far occurred less often than average will at some point make up its "deficit" and consequently must occur more frequently in the future. This is a common fallacy among roulette and lottery players, who believe that the "overdue" number must now catch up in order to match the statistical uniform distribution again. There is no such law of equalization.

An example: Assume that a series of coin tosses begins with "heads", "tails", "heads", "heads". Then "heads" has been thrown three times, "tails" once; "heads" has a lead of two tosses. After these four tosses, the relative frequency of "heads" is ¾, that of "tails" ¼. After 96 more tosses, a ratio of 47 times "tails" to 53 times "heads" has been established. The lead of "heads" is even greater after 100 tosses than after four, but the relative distance between "heads" and "tails" has decreased significantly, and, this being the statement of the law of large numbers, so has the difference between the relative frequency of "heads" and its expected value: the value $\tfrac{53}{100} = 0.53$ is much closer to the expected value 0.5 than ¾ = 0.75.
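The running tally in this example is easy to reproduce. The following sketch (illustrative only; the function name is our own) computes the running relative frequency of heads, starting from the four tosses above and then continuing with a longer simulated series:

```python
import random

def running_relative_frequency(flips):
    """Return the running relative frequency of heads (1) after each flip."""
    freqs, heads = [], 0
    for n, flip in enumerate(flips, start=1):
        heads += flip
        freqs.append(heads / n)
    return freqs

# The worked example from the text: heads, tails, heads, heads (1 = heads).
print(running_relative_frequency([1, 0, 1, 1])[-1])  # 0.75, i.e. 3/4 after four tosses

# A longer simulated series: the relative frequency settles near 1/2,
# even though the absolute surplus of heads may keep growing.
random.seed(42)  # fixed seed so the run is reproducible
flips = [random.randint(0, 1) for _ in range(10_000)]
freqs = running_relative_frequency(flips)
print(round(freqs[-1], 3))
```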

## Weak law for relative frequencies

The simplest case of a law of large numbers, the weak law for relative frequencies, is the main result of Jakob I Bernoulli's Ars Conjectandi (1713). A random experiment with exactly two outcomes, called success and failure, i.e. a Bernoulli experiment, is repeated $n$ times independently. If $p \in (0,1)$ denotes the probability of success in a single trial, then the number of successes $X_n$ is binomially distributed with parameters $n$ and $p$. The expected value of $X_n$ is $\operatorname{E}(X_n) = np$ and its variance is $\operatorname{Var}(X_n) = np(1-p)$.

For the relative frequency $R_n = \tfrac{1}{n} X_n$ it follows that $\operatorname{E}(R_n) = p$ and $\sigma^2 = \operatorname{Var}(R_n) = \tfrac{1}{n^2}\operatorname{Var}(X_n) = \tfrac{p(1-p)}{n}$. Chebyshev's inequality applied to $R_n$ thus yields

$$\operatorname{P}\left(|R_n - p| \geq \varepsilon\right) \leq \frac{\sigma^2}{\varepsilon^2} = \frac{p(1-p)}{n\varepsilon^2}$$

for every $\varepsilon > 0$. Since the right-hand side of the inequality converges to zero as $n \to \infty$, it follows that

$$\lim_{n \to \infty} \operatorname{P}\left(|R_n - p| \geq \varepsilon\right) = 0,$$

that is, for every small $\varepsilon > 0$, the probability that the relative frequency of successes does not lie in the interval $(p - \varepsilon, p + \varepsilon)$ tends to zero as the number of trials tends to infinity.
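This bound can be checked numerically. The sketch below (a Monte Carlo estimate; the helper name is our own) estimates $\operatorname{P}(|R_n - p| \geq \varepsilon)$ for the die example $p = 1/6$ and compares it with the Chebyshev bound $p(1-p)/(n\varepsilon^2)$:

```python
import random

def empirical_deviation_probability(n, p, eps, trials=2000, seed=1):
    """Estimate P(|R_n - p| >= eps) by simulating many Bernoulli series."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        successes = sum(rng.random() < p for _ in range(n))  # binomial draw
        if abs(successes / n - p) >= eps:
            hits += 1
    return hits / trials

p, eps = 1 / 6, 0.05  # die example: success = rolling a six
for n in (100, 400, 1600):
    bound = p * (1 - p) / (n * eps * eps)  # Chebyshev bound from the text
    est = empirical_deviation_probability(n, p, eps)
    print(n, round(est, 3), round(min(bound, 1.0), 3))
```

Both columns shrink as $n$ grows; the Chebyshev bound is loose but suffices for the limit statement.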

## Weak law of large numbers

A sequence of random variables $X_1, X_2, X_3, \dotsc$ with $\operatorname{E}(|X_i|) < \infty$ is said to satisfy the weak law of large numbers if for

$$\overline{X}_n = \frac{1}{n} \sum_{i=1}^{n} \left(X_i - \operatorname{E}(X_i)\right)$$

and all positive numbers $\varepsilon$ the following holds:

$$\lim_{n \to \infty} \operatorname{P}\left(\left|\overline{X}_n\right| > \varepsilon\right) = 0,$$

i.e. the arithmetic means of the centered random variables $X_i - \operatorname{E}(X_i)$ converge in probability to $0$.

There are various conditions under which the weak law of large numbers holds; they typically impose requirements on the moments or on the dependence structure of the random variables. Two important sufficient conditions are:

• If the random variables $X_i$ are pairwise independent and identically distributed with existing expected value, then the weak law of large numbers holds.
• If the random variables $X_i$ are pairwise uncorrelated and the sequence of their variances is bounded, then the weak law of large numbers holds.

Further formulations can be found in the main article. In particular, the requirement in the second statement that the variances be bounded can be weakened.
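The second condition does not require identical distributions. The following sketch (an assumed setup for illustration, not from the source) averages centered variables that are independent but not identically distributed, with variances bounded by 9/12; their centered mean still tends to 0:

```python
import random

# X_i uniform on [0, b_i] with b_i cycling through 1, 2, 3, so
# E(X_i) = b_i / 2 and Var(X_i) = b_i**2 / 12 <= 9/12: bounded variances.
rng = random.Random(7)
n = 20_000
total = 0.0
for i in range(n):
    b = 1 + i % 3
    x = rng.uniform(0, b)
    total += x - b / 2  # center each variable at its own mean
centered_mean = total / n
print(round(centered_mean, 3))  # close to 0 for large n
```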

## Strong law of large numbers

A sequence of random variables $(X_n)_{n \in \mathbb{N}}$ with $\operatorname{E}(|X_n|) < \infty$ is said to satisfy the strong law of large numbers if for

$$\overline{X}_n = \frac{1}{n} \sum_{i=1}^{n} \left(X_i - \operatorname{E}(X_i)\right)$$

the following holds:

$$\operatorname{P}\left(\limsup_{n \to \infty} \left|\overline{X}_n\right| = 0\right) = 1,$$

that is, the arithmetic means of the centered random variables almost surely converge to $0$.

For example, the strong law of large numbers holds if one of the following conditions is satisfied:

• The $X_i$ are pairwise independent and identically distributed with finite expected value.
• The $X_i$ are pairwise uncorrelated and $\sup_{i \in \mathbb{N}} \operatorname{Var}(X_i) < \infty$.

The strong law of large numbers implies the weak law of large numbers. More general forms of the strong law, which also apply to dependent random variables, are the individual (pointwise) ergodic theorem and the $L^p$ ergodic theorem, both of which hold for stationary stochastic processes.

## Interpretation of the formal statements

In contrast to classical sequences such as those studied in analysis, probability theory generally cannot make an absolute statement about the convergence of a sequence of random outcomes. The reason is that, for example, in a series of dice rolls, sequences of outcomes such as 6, 6, 6, … cannot be ruled out; for such a sequence, the arithmetic means formed from it would not converge to the expected value 3.5. However, the strong law of large numbers says that the event in which the arithmetic means do not converge to the expected value 3.5 has probability 0. Such an event is also called an almost impossible event.
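The dice example can be simulated directly. A minimal sketch (illustrative only) tracks the running mean of simulated rolls, which by the strong law almost surely converges to the expected value 3.5:

```python
import random

# Running arithmetic mean of simulated fair-die rolls.
rng = random.Random(0)  # fixed seed for reproducibility
total, means = 0, []
for n in range(1, 100_001):
    total += rng.randint(1, 6)
    means.append(total / n)
print(round(means[99], 2), round(means[-1], 3))  # after 100 and 100000 rolls
```

Early on the running mean may wander noticeably; after many rolls it stays close to 3.5, while a pathological sequence like 6, 6, 6, … remains possible but has probability 0.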

Given a sequence of random variables $X_1, X_2, X_3, \dotsc$, the subject of the laws of large numbers is the sequence of arithmetic means of the centered random variables,

$$\overline{X}_1 = X_1 - \operatorname{E}(X_1),\quad \overline{X}_2 = \tfrac{1}{2}\left((X_1 - \operatorname{E}(X_1)) + (X_2 - \operatorname{E}(X_2))\right),\quad \overline{X}_3 = \tfrac{1}{3}\left((X_1 - \operatorname{E}(X_1)) + \dotsb + (X_3 - \operatorname{E}(X_3))\right),\ \dotsc$$

Because of the problems described, the formal characterization of the convergence of the sequence $\overline{X}_1, \overline{X}_2, \overline{X}_3, \dotsc$ to the value 0 cannot rely only on an arbitrarily small prescribed tolerance $\varepsilon > 0$, as is the case for a classical sequence of numbers. In addition, an arbitrarily small tolerance probability $p_{\text{max}} > 0$ is prescribed. The statement of the weak law of large numbers then means: for any given tolerance $\varepsilon$ and tolerance probability $p_{\text{max}}$, if the index $n$ is chosen sufficiently large, a deviation $|\overline{X}_n - 0| = |\overline{X}_n|$ exceeding the tolerance $\varepsilon$ occurs with probability at most $p_{\text{max}}$. The strong law of large numbers, by contrast, refers to the event that any of the deviations $|\overline{X}_n|, |\overline{X}_{n+1}|, |\overline{X}_{n+2}|, \dotsc$ exceeds the tolerance $\varepsilon$.

## Practical meaning

Insurance
The law of large numbers is of great practical importance in insurance: it allows an approximate prediction of future claims. The larger the number of insured persons, goods, and assets exposed to the same risk, the smaller the influence of chance. However, the law of large numbers says nothing about who in particular will suffer a loss. Unpredictable major events and trends, such as climate change, which alter the basis for calculating averages, can render the law at least partially unusable.
Medicine
When demonstrating the effectiveness of medical procedures, it can be used to eliminate random influences.
Natural sciences
The influence of (non-systematic) measurement errors can be reduced by repeated measurements.
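A minimal sketch of this averaging effect (assumed noise model and helper name, for illustration only): the spread of the average of $n$ noisy measurements shrinks roughly like $1/\sqrt{n}$:

```python
import random
import statistics

def mean_of_measurements(n, true_value=10.0, noise_sd=1.0, rng=None):
    """Average n measurements of true_value with independent Gaussian noise."""
    rng = rng or random.Random()
    return sum(true_value + rng.gauss(0, noise_sd) for _ in range(n)) / n

rng = random.Random(3)
# Spread (standard deviation) of the result for 1 vs. 100 repetitions:
spread_1 = statistics.stdev(mean_of_measurements(1, rng=rng) for _ in range(2000))
spread_100 = statistics.stdev(mean_of_measurements(100, rng=rng) for _ in range(2000))
print(round(spread_1, 2), round(spread_100, 2))  # roughly 1.0 vs. roughly 0.1
```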

## History of the Laws of Large Numbers

A law of large numbers was first formulated by Jakob I Bernoulli in 1689; it was published only posthumously, in 1713. Bernoulli called his version of the weak law of large numbers the Golden Theorem. The first version of a strong law of large numbers, for the special case of a coin toss, was published in 1909 by Émile Borel. In 1917, Francesco Cantelli was the first to prove a general version of the strong law of large numbers.

The history of the strong law of large numbers reached a certain conclusion with N. Etemadi's theorem, proved in 1981. Etemadi's theorem establishes the strong law of large numbers under the assumptions that the random variables are integrable (i.e. have finite expectation), identically distributed, and pairwise independent. The existence of a variance is not assumed.

## Individual evidence

1. Norbert Henze : Stochastics for beginners. An introduction to the fascinating world of chance. 10th, revised edition. Springer Spectrum, Wiesbaden 2013, ISBN 978-3-658-03076-6 , p. 218 f.
2. Jörg Bewersdorff: Statistics - how and why it works. A math reader. 2011, Chapter 2.8, pp. 103–113.
3. Jörg Bewersdorff: Statistics - how and why it works. A math reader. 2011, Chapters 2.7 and 2.8, pp. 90–113.
4. Nasrollah Etemadi: An elementary proof of the strong law of large numbers. In: Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete (continued as: Probability Theory and Related Fields). Vol. 55, No. 1, 1981, pp. 119–122, doi:10.1007/BF01013465