Chebyshev's inequality


Chebyshev's inequality, also known as the Bienaymé–Chebyshev inequality, is an inequality in stochastics, a branch of mathematics. It is named after Irénée-Jules Bienaymé and Pafnuty Lvovich Chebyshev, whose name appears in the literature in various transliterations, including Tschebyscheff, Tschebyschow, Tchebycheff and Chebyshov. Chebyshev's inequality bounds the probability that a random variable deviates from its expected value by more than a given threshold in terms of its variance.

Statement

Let $X$ be a random variable with expected value

$\operatorname{E}(X) = \mu$

and finite variance

$\operatorname{Var}(X) = \sigma^2$.

Then for all real numbers $k > 0$:

$P(|X - \mu| \geq k) \leq \frac{\sigma^2}{k^2}$.

By passing to the complementary event, one obtains

$P(|X - \mu| < k) \geq 1 - \frac{\sigma^2}{k^2}$.
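As an illustration, the bound can also be checked empirically. The following Python sketch uses an arbitrarily chosen exponential distribution (for which $\mu = \sigma^2 = 1$), an arbitrary sample size and arbitrary thresholds $k$:

    # Empirical check of P(|X - mu| >= k) <= sigma^2 / k^2 for X ~ Exp(1),
    # which has expected value mu = 1 and variance sigma^2 = 1.
    import random

    random.seed(0)
    n = 200_000
    mu, sigma2 = 1.0, 1.0
    samples = [random.expovariate(1.0) for _ in range(n)]

    for k in (1.0, 1.5, 2.0, 3.0):
        tail = sum(abs(x - mu) >= k for x in samples) / n
        bound = sigma2 / k ** 2
        print(f"k = {k}: empirical tail {tail:.4f} <= bound {bound:.4f}")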

Goodness of estimate

The bounds given by Chebyshev's inequality are sharp in the sense that there are random variables for which equality holds in the estimate.

This is the case, for example, for a discrete random variable $X$ with

$P(X = -a) = P(X = a) = \frac{1}{2k^2}$

and

$P(X = 0) = 1 - \frac{1}{k^2}$,

where $a$ is a strictly positive real number and $k \geq 1$. Then $\operatorname{E}(X) = 0$ and $\operatorname{Var}(X) = \frac{a^2}{k^2}$, so the estimate

$P(|X - \operatorname{E}(X)| \geq a) \leq \frac{\operatorname{Var}(X)}{a^2} = \frac{1}{k^2}$

follows, and it holds with equality, since then $P(|X - \operatorname{E}(X)| \geq a) = P(X \neq 0) = \frac{1}{k^2}$.
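This sharpness example can be reproduced numerically; in the following Python sketch the values of $a$ and $k$ are arbitrary choices satisfying $a > 0$ and $k \geq 1$:

    # Discrete distribution P(X = -a) = P(X = a) = 1/(2k^2), P(X = 0) = 1 - 1/k^2.
    # The Chebyshev bound Var(X)/a^2 = 1/k^2 is attained with equality.
    a, k = 2.0, 3.0

    probs = {-a: 1 / (2 * k ** 2), 0.0: 1 - 1 / k ** 2, a: 1 / (2 * k ** 2)}
    mean = sum(x * p for x, p in probs.items())
    var = sum((x - mean) ** 2 * p for x, p in probs.items())

    tail = sum(p for x, p in probs.items() if abs(x - mean) >= a)
    print(mean, var)           # 0.0 and a^2/k^2
    print(tail, var / a ** 2)  # both equal 1/k^2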

In general, however, the estimates are rather weak; for example, they are trivial for $k \leq \sigma$. Nevertheless, the theorem is often useful because it requires no distributional assumptions about the random variable and can therefore be applied to every distribution with finite variance (in particular to those that differ greatly from the normal distribution). Moreover, the bounds are easy to compute.

Variants

Deviations expressed by the standard deviation

If the standard deviation $\sigma$ is nonzero and $\lambda$ is a positive number, one obtains an often-cited variant of Chebyshev's inequality:

$P(|X - \mu| \geq \lambda\sigma) \leq \frac{1}{\lambda^2}$.

This inequality provides a meaningful estimate only for $\lambda > 1$; for $\lambda \leq 1$ it is trivial, because probabilities are always bounded by 1.
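The effect of the threshold $\lambda$ can be tabulated directly; in the following short Python sketch the chosen values of $\lambda$ are arbitrary, and the output shows that the bound becomes informative only for $\lambda > 1$:

    # P(|X - mu| >= lambda * sigma) <= 1 / lambda^2, capped at 1.
    for lam in (0.5, 1.0, 2.0, 3.0, 5.0):
        print(f"lambda = {lam}: bound {min(1.0, 1 / lam ** 2):.3f}")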

Generalization to higher moments

Chebyshev's inequality can be generalized to higher moments. In its measure-theoretic formulation, this generalized inequality is often (simply) also called the Chebyshev inequality (English: Chebyshev's inequality), while in the context of probability theory it is sometimes called the Markov inequality (English: Markov's inequality). Some authors also refer to the generalized inequality as the Chebyshev–Markov inequality (or Markov–Chebyshev inequality, or similar).

The generalized inequality states that for a measure space $(\Omega, \mathcal{A}, \mu)$, a measurable function $f \colon \Omega \to \mathbb{R}$ and real numbers $\varepsilon > 0$ and $r > 0$, the inequality

$\mu(\{\omega \in \Omega : |f(\omega)| \geq \varepsilon\}) \leq \frac{1}{\varepsilon^r} \int_\Omega |f|^r \, \mathrm{d}\mu$

always holds.

This follows from

$\int_\Omega |f|^r \, \mathrm{d}\mu \geq \int_{\{|f| \geq \varepsilon\}} |f|^r \, \mathrm{d}\mu \geq \varepsilon^r \, \mu(\{|f| \geq \varepsilon\})$.

The version of the inequality given above is obtained as a special case by putting $\mu = P$, $f = X - \operatorname{E}(X)$, $\varepsilon = k$ and $r = 2$, because then

$P(|X - \operatorname{E}(X)| \geq k) \leq \frac{1}{k^2} \int_\Omega |X - \operatorname{E}(X)|^2 \, \mathrm{d}P = \frac{\operatorname{Var}(X)}{k^2}$.
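The generalized inequality can also be illustrated on a small finite measure space. In the following Python sketch the weights, the function $f$ and the parameters $\varepsilon$ and $r$ are arbitrary choices:

    # mu({|f| >= eps}) <= eps^(-r) * integral of |f|^r dmu on Omega = {0, ..., 4}.
    omega = range(5)
    mu = {0: 0.5, 1: 1.0, 2: 0.25, 3: 2.0, 4: 0.75}   # not a probability measure
    f = {0: -3.0, 1: 0.5, 2: 4.0, 3: -1.0, 4: 2.5}

    for eps, r in [(1.0, 1), (2.0, 2), (2.0, 4)]:
        lhs = sum(mu[w] for w in omega if abs(f[w]) >= eps)
        rhs = sum(abs(f[w]) ** r * mu[w] for w in omega) / eps ** r
        print(f"eps = {eps}, r = {r}: {lhs:.3f} <= {rhs:.3f}")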

Exponential Chebyshev inequality

The fact that the generalization holds for all positive moments simultaneously can be used to prove the so-called exponential Chebyshev inequality. Let $X$ be a real random variable distributed according to $P$, and let $\lambda > 0$ and $a$ be real numbers. In the notation above, we now put $\mu = P$, $f = \mathrm{e}^{\lambda X}$, $\varepsilon = \mathrm{e}^{\lambda a}$ and $r = 1$ and obtain

$P(X \geq a) \leq \frac{\operatorname{E}\left(\mathrm{e}^{\lambda X}\right)}{\mathrm{e}^{\lambda a}}$.

The numerator is the moment-generating function of $X$. Applying the exponential Chebyshev inequality to a sum of independent and identically distributed random variables is the decisive step in the proof of the Chernoff inequality.
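For a concrete case the exponential bound can be evaluated directly. The following Python sketch assumes a standard normally distributed $X$, for which $\operatorname{E}(\mathrm{e}^{\lambda X}) = \mathrm{e}^{\lambda^2/2}$ is known in closed form; the thresholds $a$ and the values of $\lambda$ are arbitrary choices, and $\lambda = a$ minimizes the bound:

    # Exponential Chebyshev bound P(X >= a) <= E(e^{lambda X}) / e^{lambda a}
    # for X ~ N(0, 1), where E(e^{lambda X}) = exp(lambda^2 / 2).
    import math

    for a in (1.0, 2.0, 3.0):
        for lam in (0.5, 1.0, a):
            bound = math.exp(lam ** 2 / 2 - lam * a)
            print(f"a = {a}, lambda = {lam}: P(X >= a) <= {bound:.4g}")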

History

In most textbooks, the inequality bears only the name of Pafnuty Lvovich Chebyshev. He published his proof for discrete random variables in 1867 simultaneously in St. Petersburg and Paris, in the latter case in Joseph Liouville's journal Journal de Mathématiques Pures et Appliquées. A more general proof, however, had already been given by Irénée-Jules Bienaymé in 1853 in the paper Considérations à l'appui de la découverte de Laplace sur la loi de probabilité dans la méthode des moindres carrés, which was even reprinted in the same journal immediately before Chebyshev's publication in Liouville's journal. In a later publication, Chebyshev acknowledged Bienaymé's earlier publication.

Applications

Examples

Example 1

For example, suppose the length of Wikipedia articles has an expected value of 1000 characters with a standard deviation of 200 characters. From Chebyshev's inequality one can then deduce that with a probability of at least 75 % a Wikipedia article is between 600 and 1400 characters long ($k = 400 = 2\sigma$).

The probability value is calculated in the following way:

$P(600 < X < 1400) = P(|X - 1000| < 400) \geq 1 - \frac{200^2}{400^2} = 1 - \frac{1}{4} = 0.75$.
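For checking, the bound can also be evaluated in one line of Python, using the numbers from the example:

    # 1 - sigma^2 / k^2 with mu = 1000, sigma = 200, k = 400
    mu, sigma, k = 1000, 200, 400
    print(1 - sigma ** 2 / k ** 2)   # 0.75, i.e. at least 75 %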

Example 2

Another consequence of the theorem is that for every probability distribution with mean $\mu$ and finite standard deviation $\sigma$, at least half of the values lie in the interval $(\mu - \sqrt{2}\,\sigma, \mu + \sqrt{2}\,\sigma)$ ($k = \sqrt{2}\,\sigma$).

Example 3

A random event occurs in a trial with probability $p$. The trial is repeated $n$ times; the event occurs $k$ times in these repetitions. Then $k$ is binomially distributed with expected value $np$ and variance $np(1-p)$; the relative frequency $\frac{k}{n}$ of occurrence therefore has expected value $p$ and variance $\frac{p(1-p)}{n}$. For the deviation of the relative frequency from its expected value, Chebyshev's inequality yields

$P\left(\left|\frac{k}{n} - p\right| \geq \varepsilon\right) \leq \frac{p(1-p)}{n\varepsilon^2} \leq \frac{1}{4 n \varepsilon^2}$,

where the second estimate uses the relation $p(1-p) \leq \frac{1}{4}$, which follows directly from the inequality of arithmetic and geometric means.
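How rough this estimate is can be seen by comparing it with the exact binomial tail probability; in the following Python sketch the values of $p$, $n$ and $\varepsilon$ are arbitrary choices:

    # Exact P(|k/n - p| >= eps) for a binomial distribution versus the Chebyshev bounds.
    from math import comb

    p, n, eps = 0.3, 200, 0.1
    exact = sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
                for k in range(n + 1) if abs(k / n - p) >= eps)
    print(f"exact tail probability:      {exact:.4f}")
    print(f"bound p(1-p)/(n eps^2):      {p * (1 - p) / (n * eps ** 2):.4f}")
    print(f"coarser bound 1/(4 n eps^2): {1 / (4 * n * eps ** 2):.4f}")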

This formula is a special case of the weak law of large numbers, which shows the stochastic convergence of the relative frequencies to the expected value.

For this example, Chebyshev's inequality provides only a rough estimate; the Chernoff inequality gives a quantitative improvement.

Proof sketch

Most authors present Chebyshev's inequality as a special case of the Markov inequality

$P(|Y| \geq \varepsilon) \leq \frac{\operatorname{E}(f(|Y|))}{f(\varepsilon)}$

with $Y = X - \operatorname{E}(X)$ and the function $f(x) = x^2$.

How the Markov inequality can be obtained by elementary (school-level) means from an immediately transparent comparison of areas, and how this version of Chebyshev's inequality can then be derived from it, can be found, for example, in Wirths. For a direct proof, one defines

$A := \{\omega \in \Omega : |X(\omega) - \operatorname{E}(X)| \geq k\}$.

If $\chi_A$ denotes the indicator function of the set $A$, then for all $\omega$ the inequality

$(X(\omega) - \operatorname{E}(X))^2 \geq k^2 \, \chi_A(\omega)$

holds.

Indeed, if $\omega \notin A$, then the right-hand side is zero and the inequality holds. If $\omega \in A$, then by the definition of the set $A$ the left-hand side is at least $k^2$, and the inequality is again satisfied. Using the monotonicity of the expected value and its elementary calculation rules, together with the definition of the variance, it follows that

$\operatorname{Var}(X) = \operatorname{E}\left((X - \operatorname{E}(X))^2\right) \geq \operatorname{E}\left(k^2 \chi_A\right) = k^2 \, P(A) = k^2 \, P(|X - \operatorname{E}(X)| \geq k)$.

Dividing by $k^2$ gives the inequality.
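The chain of estimates in the proof can also be traced numerically. The following Python sketch uses an arbitrarily chosen normal sample and threshold $k$ and checks the pointwise inequality as well as the bound that results from taking expectations:

    # Pointwise: (x - mean)^2 >= k^2 * indicator(|x - mean| >= k);
    # averaging gives Var(X) >= k^2 * P(|X - E(X)| >= k).
    import random

    random.seed(1)
    xs = [random.gauss(0.0, 2.0) for _ in range(100_000)]
    mean = sum(xs) / len(xs)
    k = 3.0

    assert all((x - mean) ** 2 >= k ** 2 * (abs(x - mean) >= k) for x in xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    p_a = sum(abs(x - mean) >= k for x in xs) / len(xs)
    print(var, ">=", k ** 2 * p_a)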

Related results

Literature

Web links

Wikibooks: Description with example  - learning and teaching materials

References and comments

  1. Norbert Henze: Stochastics for Beginners. An introduction to the fascinating world of chance. 10th edition. Springer Spektrum, Wiesbaden 2013, ISBN 978-3-658-03076-6, p. 165, doi:10.1007/978-3-658-03077-3.
  2. Hans-Otto Georgii: Stochastics. Introduction to probability theory and statistics. 4th edition. Walter de Gruyter, Berlin 2009, ISBN 978-3-11-021526-7, p. 112, doi:10.1515/9783110215274.
  3. Robert B. Ash: Real Analysis and Probability. 1972, pp. 84–85 and p. 227.
  4. A. N. Širjaev: Probability. 1988, p. 572.
  5. R. G. Laha, V. K. Rohatgi: Probability Theory. 1979, p. 33.
  6. Heinz Bauer: Measure and Integration Theory. 1992, p. 128.
  7. Matthias Löwe: Large Deviations. (PDF; 418 kB) Westfälische Wilhelms-Universität Münster, Institute for Mathematical Stochastics, p. 4.
  8. Chebyshev, Pafnutii Lvovich. In: Michiel Hazewinkel (Ed.): Encyclopaedia of Mathematics. Springer-Verlag, Berlin 2002, ISBN 978-1-55608-010-4 (English, online).
  9. V. V. Sazonov: Bienaymé, Irenée-Jules. In: Michiel Hazewinkel (Ed.): Encyclopaedia of Mathematics. Springer-Verlag, Berlin 2002, ISBN 978-1-55608-010-4 (English, online).
  10. Heinz Bauer: Probability Theory. 2002, p. 69 ff.
  11. Achim Klenke: Probability Theory. 3rd edition. Springer-Verlag, Berlin/Heidelberg 2013, ISBN 978-3-642-36017-6, p. 110, doi:10.1007/978-3-642-36018-3.
  12. Hans-Otto Georgii: Stochastics. Introduction to probability theory and statistics. 4th edition. Walter de Gruyter, Berlin 2009, ISBN 978-3-11-021526-7, p. 122, doi:10.1515/9783110215274.
  13. Klaus D. Schmidt: Measure and Probability. 2nd, revised edition. Springer-Verlag, Heidelberg/Dordrecht/London/New York 2011, ISBN 978-3-642-21025-9, p. 288, doi:10.1007/978-3-642-21026-6.
  14. H. Wirths: The expectation value – sketches for the development of concepts from grades 8 to 13. In: Mathematik in der Schule 1995, no. 6, pp. 330–343.
  15. Ehrhard Behrends: Elementary Stochastics. A learning book, co-developed by students. Springer Spektrum, Wiesbaden 2013, ISBN 978-3-8348-1939-0, pp. 229–230, doi:10.1007/978-3-8348-2331-1.