Cramér's theorem (large deviations)


Cramér's theorem is a result in probability theory. The paper Sur un nouveau théorème-limite de la théorie des probabilités, in which Harald Cramér published his theorem in 1938, is regarded as the foundation of the theory of large deviations. Cramér's theorem quantifies how quickly the probability of certain rare events converges to zero. The Chernoff inequality transfers this asymptotic statement to the finite case; to distinguish it from other results named after Cramér, one also speaks of the Cramér-Chernoff theorem.

Introductory example

Suppose a physical quantity is measured repeatedly and independently; the arithmetic mean of the measurements is then an unbiased estimator of its expected value. If, for example, the quantity in question is uniformly distributed on the unit interval $[0,1]$, then by the strong law of large numbers the mean of the measurements converges almost surely to $1/2$. This, however, says nothing yet about the speed of convergence. The typical behavior of the sum is determined by this limit value, but it cannot be ruled out that for an arbitrarily large number of measurements the mean still deviates significantly from it, for instance by exceeding the expected value by some fixed amount. Cramér's theorem now provides a general procedure for determining the quality of the estimator from the underlying distribution of the quantity.
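A minimal numerical sketch of this effect (not part of the original article), assuming uniform measurements on $[0,1]$ and an arbitrarily chosen deviation level of 0.6: the Monte Carlo estimates below suggest that the probability of the mean exceeding 0.6 decays roughly exponentially in the number of measurements, as indicated by the stabilising value of $-\log(p)/n$.

    import numpy as np

    # Monte Carlo sketch: the mean of n uniform[0,1] samples concentrates
    # around 1/2; the probability of exceeding the (arbitrary) level 0.6
    # decays roughly exponentially in n.
    rng = np.random.default_rng(0)
    threshold = 0.6
    for n in (10, 25, 50, 100):
        means = rng.random((200_000, n)).mean(axis=1)
        p = (means >= threshold).mean()
        print(f"n={n:3d}  P(mean >= {threshold}) ~ {p:.2e}  -log(p)/n ~ {-np.log(p) / n:.3f}")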

Statement

Let $(X_i)_{i \in \mathbb{N}}$ be a sequence of independent and identically distributed (i.i.d.) real random variables whose cumulant-generating function $\Lambda(t) := \log \mathbb{E}\left(e^{t X_1}\right)$ is finite for every $t \in \mathbb{R}$. For real $x$, let

$\Lambda^*(x) := \sup_{t \in \mathbb{R}} \left( t x - \Lambda(t) \right)$

denote the Legendre transform of $\Lambda$. Cramér's theorem now states that for all $x > \mathbb{E}(X_1)$,

$\lim_{n \to \infty} \frac{1}{n} \log P\left( X_1 + \dots + X_n \ge n x \right) = -\Lambda^*(x).$
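As an illustration (not part of the original article), the following Python sketch evaluates the Legendre transform $\Lambda^*(x)$ numerically by maximising $tx - \Lambda(t)$. Standard normal variables are used as an example, since there $\Lambda(t) = t^2/2$ and hence $\Lambda^*(x) = x^2/2$ is available in closed form for comparison; the use of scipy.optimize is a choice made here purely for illustration.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def cumulant_gen(t):
        # cumulant-generating function of N(0, 1): log E[exp(t X)] = t^2 / 2
        return 0.5 * t * t

    def rate_function(x):
        # Legendre transform: sup over real t of (t*x - Lambda(t)),
        # computed by minimising the negated expression
        res = minimize_scalar(lambda t: cumulant_gen(t) - t * x)
        return -res.fun

    for x in (0.5, 1.0, 2.0):
        print(f"x={x}  numerical={rate_function(x):.6f}  closed form={x**2 / 2:.6f}")

For $x = 2$ this gives a rate of $2$, i.e. the probability that the mean of $n$ standard normal variables exceeds $2$ decays like $e^{-2n}$ up to sub-exponential factors.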

Remarks

  • That the limit on the left-hand side of the equation exists is not obvious, but is part of the statement.
  • The specific choice of the base $e$ is not essential for the theorem. The cumulant-generating function can be replaced by $\log_b \mathbb{E}\left(b^{t X_1}\right)$ for an arbitrary base $b > 1$, and the limit is then taken with the logarithm to base $b$ accordingly.
  • Cramér's theorem can be derived from the more general theorem of Sanov or, alternatively, it can be proven in an elementary way using the exponential Chebyshev inequality.
  • Except for sublinear additive terms in the exponent (i.e. sub-exponential factors), asymptotically $P(X_1 + \dots + X_n \ge nx) \approx e^{-n \Lambda^*(x)}$ holds. The (general) Chernoff inequality states that even for every finite $n$, $P(X_1 + \dots + X_n \ge nx) \le e^{-n \Lambda^*(x)}$ holds; a short derivation sketch follows this list.
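As a sketch of the elementary argument mentioned above (standard reasoning, included here only for illustration): for every $t > 0$ and $S_n := X_1 + \dots + X_n$, Markov's inequality applied to $e^{t S_n}$ gives

$P(S_n \ge nx) = P\left(e^{t S_n} \ge e^{t n x}\right) \le e^{-t n x}\,\mathbb{E}\left(e^{t S_n}\right) = e^{-t n x}\left(\mathbb{E}\, e^{t X_1}\right)^{n} = e^{-n\,(t x - \Lambda(t))},$

and optimising over $t$ yields the finite-$n$ bound $P(S_n \ge nx) \le e^{-n \Lambda^*(x)}$ for $x > \mathbb{E}(X_1)$, since in this case the supremum defining $\Lambda^*(x)$ is attained at some $t \ge 0$. Cramér's theorem states that this exponent is asymptotically sharp.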

Chernoff-Hoeffding theorem

An important special case of Cramér's theorem arises when the $X_i$ are Bernoulli-distributed with expected value $p$. For a probability $a$ with $p < a < 1$, let

$D(a \,\|\, p) := a \log \frac{a}{p} + (1-a) \log \frac{1-a}{1-p}$

denote the Kullback-Leibler divergence between Bernoulli distributions with parameters $a$ and $p$ (measured in nats). The cumulant-generating function of a Bernoulli variable is $\Lambda(t) = \log\left(1 - p + p e^t\right)$, and from this it is easily shown that the Cramér rate function $\Lambda^*(a)$ here is precisely the divergence $D(a \,\|\, p)$. The sum $S_n = X_1 + \dots + X_n$ is binomially distributed with expected value $np$. With the help of the Chernoff inequality one now obtains, for $a > p$,

$P(S_n \ge an) \le e^{-n D(a \,\|\, p)}.$

This is the Chernoff-Hoeffding theorem; it is also referred to as the additive Chernoff inequality.
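A small numerical check (not part of the original article; scipy is used here only to obtain the exact binomial tail): for the illustrative choice $p = 0.5$ and $a = 0.7$, the bound $e^{-n D(a\,\|\,p)}$ is compared with the exact probability $P(S_n \ge an)$.

    import numpy as np
    from scipy.stats import binom

    def kl_bernoulli(a, p):
        # Kullback-Leibler divergence D(a || p) between Bernoulli(a) and Bernoulli(p), in nats
        return a * np.log(a / p) + (1 - a) * np.log((1 - a) / (1 - p))

    p, a = 0.5, 0.7
    for n in (10, 100, 1000):
        k = int(round(a * n))                  # a*n is an integer for these n
        exact = binom.sf(k - 1, n, p)          # exact tail P(S_n >= a*n)
        bound = np.exp(-n * kl_bernoulli(a, p))
        print(f"n={n:5d}  exact={exact:.3e}  Chernoff-Hoeffding bound={bound:.3e}")

The bound is never below the exact tail probability, and on a logarithmic scale the two agree up to sub-exponential factors, as Cramér's theorem predicts.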

References

  1. Harald Cramér: Sur un nouveau théorème-limite de la théorie des probabilités. Colloque consacré à la théorie des probabilités. In: Actualités scientifiques et industrielles. Vol. 736, 1938, pp. 2-23.
  2. Hugo Touchette: Large Deviation Theory: History, Sources, References, and Pointers. Division of Applied Mathematics, Stellenbosch University, Stellenbosch, South Africa.
  3. Norbert Straumann: Statistical Mechanics: Introduction and further information. Springer, Berlin & Heidelberg 2017, ISBN 978-3-662-52950-8, p. 10, doi:10.1007/978-3-662-52950-8.
  4. Matthias Löwe: Large deviations. (PDF; 418 kB) Westfälische Wilhelms-Universität Münster, Institute for Mathematical Stochastics.
  5. Thomas M. Cover, Joy A. Thomas: Elements of Information Theory. 2nd edition. John Wiley & Sons, Hoboken, NJ 2006, pp. 392 f.
  6. Wassily Hoeffding: Probability Inequalities for Sums of Bounded Random Variables. In: Journal of the American Statistical Association. Vol. 58, 1963, pp. 13-30, doi:10.2307/2282952.
  7. Devdatt P. Dubhashi, Alessandro Panconesi: Concentration of Measure for the Analysis of Randomized Algorithms. Cambridge University Press, New York, NY 2009, ISBN 978-0-521-88427-3.