# Moment (stochastics)

Moments of random variables are parameters of descriptive statistics and play an important role in probability theory. The concepts of expected value, variance, skewness, and kurtosis used to describe a random variable are closely related to its moments.

A distribution function is determined by specifying all moments of the corresponding random variable, provided the moments exist and the series of the moment-generating function converges. Determining a distribution from given moments is called the moment problem, which also plays an important role in engineering mechanics.

There are distributions whose moments exist only up to a certain order. An example is the t-distribution, whose moments exist only for orders smaller than the number of degrees of freedom. In the special case of the Cauchy distribution, even the first moment (the expected value) does not exist; the same holds for the Lévy distribution.

## Definition

Let $X$ be a random variable and $k$ a natural number. Then the moment of order $k$ of $X$, or in short the $k$-th moment of $X$, is the expected value of the $k$-th power of $X$ (assuming that it exists):

$m_k := \operatorname{E}\left(X^k\right),$

and the $k$-th absolute moment of $X$ is the expected value of the $k$-th power of the absolute value $|X|$ of $X$:

$M_k := \operatorname{E}\left(\left|X\right|^k\right).$

In theoretical investigations, moments of non-integer order $\kappa$ are sometimes also considered.

The existence of moments of a certain order generally provides information about how the probability mass is distributed.

The first moment is the expected value. It is usually denoted by $\mu$ and can be interpreted as the mean value of the distribution.

## Representation for real random variables

If $X$ is a real random variable defined on a probability space $(\Omega, \Sigma, P)$ with distribution function $F_X(x) = P(X \leq x)$, then it follows from the definition of the expected value as a Stieltjes integral that

$m_k = \int\limits_{-\infty}^{\infty} x^k \, \mathrm{d}F_X(x)$.

If $X$ is a continuous random variable with density function $f_X$, then

$m_k = \int\limits_{-\infty}^{\infty} x^k f_X(x) \, \mathrm{d}x$,

and for a discrete random variable with values $x_i$ and corresponding probabilities $p_i = P(X = x_i)$,

$m_k = \sum_{i=1}^{\infty} x_i^k \cdot p_i$.

Using the Lebesgue integral with respect to the probability measure $P$, these cases can be written uniformly as

$m_k = \int_{\Omega} X^k \, \mathrm{d}P$.
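The discrete sum and the continuous integral above can be sketched numerically. The following is a minimal illustration (the function names are chosen for this example, and the midpoint rule is only a crude stand-in for the exact integral):

```python
# k-th raw moment of a discrete random variable: m_k = sum_i x_i^k * p_i.
def raw_moment(values, probs, k):
    return sum(x**k * p for x, p in zip(values, probs))

# Fair six-sided die: m_1 = 3.5, m_2 = 91/6.
faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
m1 = raw_moment(faces, probs, 1)  # the expected value
m2 = raw_moment(faces, probs, 2)

# For a continuous variable, m_k = integral of x^k * f(x) dx; here a
# midpoint rule for the uniform density on [0, 1], where f(x) = 1 and
# the exact value is m_k = 1/(k+1).
def uniform_moment(k, n=100_000):
    h = 1.0 / n
    return sum(((i + 0.5) * h) ** k * h for i in range(n))
```

For the die, `m1` evaluates to 3.5, and `uniform_moment(2)` approximates $1/3$.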

## Central moments

In addition to the moments defined above, the central moments are defined, which describe the distribution of the probability mass around the expected value $\mu = \operatorname{E}(X)$ of the random variable $X$:

$\mu_k := \operatorname{E}\left(\left(X - \mu\right)^k\right)$

and

$\bar{\mu}_k := \operatorname{E}\left(\left|X - \mu\right|^k\right).$

They are called central moments because they are centered on the mean $\mu$. From the definition it follows immediately that the first central moment is always 0:

$\mu_1 = \operatorname{E}\left(X - \mu\right) = \operatorname{E}\left(X\right) - \mu = 0.$

The first absolute central moment is the mean absolute deviation:

$\bar{\mu}_1 = \operatorname{E}\left(\left|X - \mu\right|\right).$

The second central moment is the variance:

$\mu_2 = \operatorname{E}\left(\left(X - \mu\right)^2\right).$

The third central moment

$\mu_3 = \operatorname{E}\left(\left(X - \mu\right)^3\right)$

yields, after normalization with the standard deviation $\sigma$, the skewness (also called the 3rd normalized or standardized moment):

$\gamma := \operatorname{E}\left(\left(\frac{X - \mu}{\sigma}\right)^3\right).$

The fourth central moment

$\mu_4 = \operatorname{E}\left(\left(X - \mu\right)^4\right)$

yields, after normalization with the standard deviation $\sigma$, the kurtosis (also called the 4th normalized or standardized moment):

$\beta_2 := \operatorname{E}\left(\left(\frac{X - \mu}{\sigma}\right)^4\right).$

Skewness and kurtosis are collectively referred to as higher moments. The kurtosis is often used as a measure of the deviation from the normal distribution, the skewness as a measure of the deviation from a symmetric distribution.
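For a distribution with known closed-form moments these normalized quantities can be checked directly. A small sketch for a Bernoulli($p$) variable, whose skewness is $(1-2p)/\sqrt{p(1-p)}$ and whose kurtosis is $(1-6p(1-p))/(p(1-p)) + 3$ (the helper names are illustrative):

```python
import math

# k-th central moment of a discrete random variable.
def central_moment(values, probs, k):
    mu = sum(x * p for x, p in zip(values, probs))
    return sum((x - mu) ** k * p for x, p in zip(values, probs))

# Skewness: third central moment normalized by sigma^3.
def skewness(values, probs):
    sigma = math.sqrt(central_moment(values, probs, 2))
    return central_moment(values, probs, 3) / sigma**3

# Kurtosis: fourth central moment normalized by sigma^4.
def kurtosis(values, probs):
    sigma = math.sqrt(central_moment(values, probs, 2))
    return central_moment(values, probs, 4) / sigma**4

# Bernoulli(p) takes the value 1 with probability p, else 0.
p = 0.3
vals, ps = [0, 1], [1 - p, p]
skew_val = skewness(vals, ps)
kurt_val = kurtosis(vals, ps)
```

Here `skew_val` matches $(1-2p)/\sqrt{p(1-p)} \approx 0.873$ and `kurt_val` matches $(1-6p(1-p))/(p(1-p)) + 3 \approx 1.762$.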

## Moments, characteristic function and cumulants

By differentiating the characteristic function $\varphi_X$ repeatedly, the ordinary moments can be expressed in terms of the characteristic function as

$\operatorname{E}(X^k) = \frac{\varphi_X^{(k)}(0)}{i^k} \quad (k = 1, 2, \dots).$

The $k$-th moment can also be determined with the moment-generating function. It is also possible to represent the $k$-th moment as a polynomial of degree $k$ in the first $k$ cumulants $\kappa_1, \dots, \kappa_k$. This polynomial is exactly the $k$-th complete Bell polynomial $B_k$:

$m_k = B_k(\kappa_1, \dots, \kappa_k)$.
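The Bell-polynomial identity can be tested on a distribution whose cumulants are known: for a Poisson($\lambda$) variable all cumulants equal $\lambda$, so the $k$-th raw moment should be $B_k(\lambda, \dots, \lambda)$. A sketch using the standard recursion for complete Bell polynomials (the function names are chosen for this example):

```python
import math

# n-th complete Bell polynomial via the recursion
#   B_m = sum_{i=0}^{m-1} C(m-1, i) * B_{m-1-i} * x_{i+1},  B_0 = 1,
# where x is the list (x_1, ..., x_n).
def complete_bell(n, x):
    B = [1.0]
    for m in range(1, n + 1):
        B.append(sum(math.comb(m - 1, i) * B[m - 1 - i] * x[i]
                     for i in range(m)))
    return B[n]

lam, k = 2.0, 4
# Raw moment predicted by the cumulant identity m_k = B_k(lam, ..., lam).
m_bell = complete_bell(k, [lam] * k)

# Direct computation of E(X^k) from the Poisson pmf (truncated series).
m_direct = sum(n**k * math.exp(-lam) * lam**n / math.factorial(n)
               for n in range(60))
```

For $\lambda = 2$ and $k = 4$ both computations give $m_4 = 94$.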

## Markov's inequality

The significance of the moments is made clear by the following theorem:

If the $k$-th absolute moment $M_k$ of the random variable $X$ exists, then

$P(|X| \geq x) \leq \frac{M_k}{x^k}$.

This is Markov's inequality, which bounds the probability that $X$ takes values of large magnitude. In the special case $k = 2$, applied to the centered variable $X - \operatorname{E}(X)$, the well-known Chebyshev inequality follows with the variance $\sigma^2$ of $X$:

$P(|X - \operatorname{E}(X)| \geq x) \leq \frac{\sigma^2}{x^2}$,

which bounds the probability of large deviations of the random variable $X$ from its expected value.
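Both bounds can be checked against a distribution with a known tail. For an Exponential(1) variable, $P(X \geq x) = e^{-x}$, $\operatorname{E}(X) = 1$, and $\operatorname{Var}(X) = 1$, so the exact tail must lie below both bounds (a sketch, not a proof):

```python
import math

# Markov (k = 1) for X ~ Exponential(1): P(X >= x) = exp(-x) <= M_1 / x = 1/x.
markov_ok = all(math.exp(-x) <= 1.0 / x for x in (1.0, 2.0, 5.0))

# Chebyshev (k = 2) for the same X: E(X) = 1, Var(X) = 1.  For x >= 1 the
# event X <= 1 - x is impossible (X >= 0), so
# P(|X - 1| >= x) = P(X >= 1 + x) = exp(-(1 + x)) <= 1 / x^2.
chebyshev_ok = all(math.exp(-(1.0 + x)) <= 1.0 / x**2 for x in (1.0, 2.0, 3.0))
```

Both flags evaluate to `True`; for example at $x = 2$, the exact tail $e^{-3} \approx 0.050$ is well below the Chebyshev bound $1/4$.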

## Joint moments

The concept of moments can also be extended to several random variables. For two random variables $X$ and $Y$, the joint moments of $X$ and $Y$ are

$m_{k\ell} = \operatorname{E}\left(X^k Y^\ell\right) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^k y^\ell f_{XY}(x, y) \, \mathrm{d}x \, \mathrm{d}y$

with the joint density $f_{XY}$.

The central joint moments of $X$ and $Y$ are defined as

$\mu_{k\ell} = \operatorname{E}\left(\left(X - \operatorname{E}(X)\right)^k \left(Y - \operatorname{E}(Y)\right)^\ell\right).$

In particular, $\mu_{11}$ is the covariance of $X$ and $Y$.
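For a discrete pair $(X, Y)$ the joint moments are finite sums, and $\mu_{11}$ can be checked against the covariance formula $\operatorname{Cov}(X,Y) = \operatorname{E}(XY) - \operatorname{E}(X)\operatorname{E}(Y)$. A sketch using an illustrative toy distribution on $\{0,1\}^2$:

```python
# Joint raw moment m_{kl} = E(X^k Y^l) for a discrete pair given as a
# table of triples (x, y, p).
def joint_moment(table, k, l):
    return sum(x**k * y**l * p for x, y, p in table)

# Central joint moment mu_{kl} = E((X - E X)^k (Y - E Y)^l).
def central_joint_moment(table, k, l):
    ex = joint_moment(table, 1, 0)
    ey = joint_moment(table, 0, 1)
    return sum((x - ex) ** k * (y - ey) ** l * p for x, y, p in table)

# Toy joint distribution with positive dependence (assumed example):
# P(X=Y) = 0.8, marginals are Bernoulli(0.5).
table = [(0, 0, 0.4), (0, 1, 0.1), (1, 0, 0.1), (1, 1, 0.4)]
cov = central_joint_moment(table, 1, 1)  # mu_11 = Cov(X, Y)
```

Here `cov` equals $\operatorname{E}(XY) - \operatorname{E}(X)\operatorname{E}(Y) = 0.4 - 0.25 = 0.15$.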

## Calculation

An approximation method for calculating moments is the first-order second-moment method.