# Ergodicity

Ergodicity (from Greek ἔργον, "work", and ὁδός, "way") is a property of dynamical systems. The term goes back to the physicist Ludwig Boltzmann, who examined this property in connection with the statistical theory of heat. In mathematics, ergodicity is studied in ergodic theory.

## General

Ergodicity refers to the average behavior of a system. Such a system is described by a sample function that determines the evolution of the system over time as a function of its current state. One can now average in two ways:

1. one can follow the evolution over a long period of time and average over this time, i.e. form the time average, or
2. one can consider all possible states and average over them, i.e. form the ensemble average.

A system is called strictly ergodic if the time average and the ensemble average lead to the same result with probability one. Intuitively, this means that all possible states are reached during the evolution of the system, i.e. the state space is completely filled over time. In particular, in such systems the expected value does not depend on the initial state.

A system is called weakly ergodic if only the expected value and the variance of the two averages agree, while higher-order moments are neglected.

An exact mathematical proof of ergodicity, in particular of strict ergodicity, can be given only in special cases. In practice, weak ergodicity is demonstrated on one or a few sample functions.

## Examples

A simple physical example of an ergodic system is a particle that moves randomly in a closed container (Brownian motion). The state of this particle can be described in a simplified manner by its position in the three-dimensional space bounded by the container. This space is then also the state space, and the movement in this space can be described by a random function (more precisely: a Wiener process). If one follows the trajectory of the particle, it will have passed every point of the container after a sufficiently long time (more precisely: it will have come arbitrarily close to every point). Therefore it does not matter whether one averages over time or over space: the system is ergodic.
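This equality of the two averages can be illustrated numerically. The following sketch is not part of the original text: it uses a simplified one-dimensional model, a random walk with reflecting walls on the interval [0, 1], and compares the time average of one long trajectory with the ensemble average of many independent particles. Both come out close to the midpoint 0.5, the mean of the uniform equilibrium distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

def reflect(x):
    """Fold positions back into [0, 1] (reflecting container walls)."""
    x = np.abs(x) % 2.0
    return np.where(x > 1.0, 2.0 - x, x)

step_std = 0.05  # standard deviation of a single random step (assumed)

# Time average: one particle followed over many steps.
steps = rng.normal(0.0, step_std, size=200_000)
trajectory = reflect(0.5 + np.cumsum(steps))
time_average = trajectory.mean()

# Ensemble average: many particles, each followed briefly,
# evaluated at the final time.
n_particles, n_steps = 20_000, 400
paths = 0.5 + rng.normal(0.0, step_std, size=(n_particles, n_steps)).cumsum(axis=1)
ensemble_average = reflect(paths)[:, -1].mean()

print(time_average, ensemble_average)  # both close to 0.5
```

Folding a free random walk with the triangle map is the standard way to construct a reflected walk, so no per-step boundary checks are needed.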

In statistical mechanics, the assumption that real particles actually behave ergodically is of central importance for the derivation of macroscopic thermodynamic quantities; see ergodic hypothesis.

Another example is rolling dice: the mean number of pips can be determined both by throwing one die 1000 times in succession and by throwing 1000 dice once at the same time. This is because the 1000 dice thrown simultaneously will all be in slightly different states (position in space, orientation of the edges, speed, etc.) and thus represent an average over the state space. This is where the term "ensemble average" comes from: for an ergodic system, one can determine the properties of a whole "ensemble" of initial states at the same time and thus obtain the same statistical information as when observing a single initial state over a longer period of time. This is also exploited in measurements in order to obtain reliable results from noisy data in a short time.
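The dice comparison is easy to check with a pseudorandom generator. The sketch below is an illustration added here, not from the original text; it uses a larger count than 1000 so that the sampling noise around the true mean of 3.5 is small:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Time average: one die thrown n times in succession.
one_die_repeated = rng.integers(1, 7, size=n)
time_average = one_die_repeated.mean()

# Ensemble average: n dice thrown once, all at the same time.
many_dice_once = rng.integers(1, 7, size=n)
ensemble_average = many_dice_once.mean()

print(time_average, ensemble_average)  # both close to (1+2+...+6)/6 = 3.5
```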

A simple example of a non-ergodic process is obtained as follows: a fair coin is tossed once. If it shows "heads", one takes the constant sequence $(1,1,1,1,\ldots)$, otherwise the constant sequence $(0,0,0,0,\ldots)$. The ensemble averages here equal $\tfrac{1}{2}$, but the time averages are 1 or 0 (each with probability $\tfrac{1}{2}$).
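This coin construction can likewise be simulated. In the sketch below (an illustration under the assumptions of the text), each realization of the process is a constant sequence, so its time average is exactly 0 or 1, while the average across many realizations at any fixed time is close to 1/2:

```python
import numpy as np

rng = np.random.default_rng(7)
n_realizations, seq_length = 10_000, 100

# One coin toss per realization decides the whole (constant) sequence.
tosses = rng.integers(0, 2, size=n_realizations)  # 0 = tails, 1 = heads
sequences = np.repeat(tosses[:, None], seq_length, axis=1)

time_averages = sequences.mean(axis=1)     # per realization: exactly 0 or 1
ensemble_average = sequences[:, 0].mean()  # across realizations: close to 1/2

print(ensemble_average)
```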

## Ergodicity in time series analysis

For statistical inference with time series, assumptions have to be made, since in practice there is usually only one realization of the process generating the time series. The assumption of ergodicity means that sample moments obtained from a finite time series converge for $T \rightarrow \infty$ to the moments of the population. For $\operatorname{E}[x_t] = \mu$ and $\operatorname{Var}[x_t] = \sigma^2$ constant, the process is called mean ergodic if

$$\lim_{T \rightarrow \infty} \operatorname{E}\left[\left(\frac{1}{T}\sum_{t=1}^{T} x_t - \mu\right)^2\right] = 0$$

and variance ergodic if

$$\lim_{T \rightarrow \infty} \operatorname{E}\left[\left(\frac{1}{T}\sum_{t=1}^{T} (x_t - \mu)^2 - \sigma^2\right)^2\right] = 0.$$

These properties cannot be verified empirically for dependent random variables and must therefore be assumed. In order for a stochastic process to be ergodic, it must be in statistical equilibrium, i.e. it must be stationary.
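The mean-ergodicity condition can be probed empirically for a simple stationary process. The sketch below is a hypothetical illustration: the AR(1) model and all parameter values are assumptions, not from the text. It estimates the mean-squared deviation of the sample mean from the true mean by Monte Carlo for two sample lengths and shows that it shrinks as the length $T$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1_sample_means(phi, T, reps, burn_in=200):
    """Sample means of `reps` independent stationary AR(1) paths of length T.

    The process is x_t = phi * x_{t-1} + eps_t with standard normal noise,
    so its true mean is mu = 0; burn-in steps remove the initial transient.
    """
    x = np.zeros(reps)
    total = np.zeros(reps)
    for t in range(burn_in + T):
        x = phi * x + rng.normal(size=reps)
        if t >= burn_in:
            total += x
    return total / T

phi = 0.5  # assumed autoregressive coefficient

# Monte Carlo estimate of E[(sample mean - mu)^2] for two lengths T.
mse_short = np.mean(ar1_sample_means(phi, T=100, reps=500) ** 2)
mse_long = np.mean(ar1_sample_means(phi, T=2000, reps=500) ** 2)

print(mse_short, mse_long)  # mse_long is much smaller: the mean is ergodic
```

The decay is roughly proportional to $1/T$, consistent with the limit in the mean-ergodicity condition being zero.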

## Special use cases

Ergodicity economics examines the conditions under which agents with similar qualifications cooperate, contrary to classical competitive situations, and minimize their long-term individual risk. It has thus been interpreted as an approach against the division of Western societies.

## Related terms

The concept of mixing is closely related; it represents a strengthening of ergodicity. For a finer classification, mixing systems are further divided into "strongly mixing" and "weakly mixing".

## Literature

• Peter Walters: An Introduction to Ergodic Theory. Springer, New York 1982, ISBN 0-387-95152-0.

## References

1. Ole Peters, Murray Gell-Mann: Evaluating gambles using dynamics. In: Chaos: An Interdisciplinary Journal of Nonlinear Science. American Institute of Physics, February 2, 2016, accessed January 6, 2020.
2. Ole Peters: The ergodicity problem in economics. In: Nature Physics. Nature Research, December 2, 2019, accessed January 6, 2020.
3. Mark Buchanan: How ergodicity reimagines economics for the benefit of us all. In: Aeon. Aeon Media Group Ltd., August 14, 2019, accessed January 6, 2020.
4. Paul Jerchel: There is more when we share. In: Contraste: Self-organization newspaper. Association for the Promotion of Self-Administration and Ecology e.V. (ed.), vol. 37, no. 428, May 2020, ISSN 0178-5737, p. 8.