# Stationary stochastic process

A stationary stochastic process is a special kind of stochastic process and thus an object of study in probability theory. One distinguishes

• weakly stationary processes (rarely also called covariance-stationary processes)
• strongly stationary processes, where the qualifier "strongly" is often dropped, so that one speaks simply of stationary processes.

Both kinds of process have properties that do not change over time.

## Definition

*Figure: a time series of residuals that is not stationary according to the Dickey-Fuller test.*

A stochastic process ${\displaystyle (x_{t})_{t\in \mathbb {T} }}$ is said to be strongly stationary if the distribution of ${\displaystyle (x_{s+t})_{t\in \mathbb {T} }}$ does not depend on the shift ${\displaystyle s\in \mathbb {T} }$. Here ${\displaystyle \mathbb {T} }$ denotes an arbitrary index set on which an addition ${\displaystyle +}$ is defined, usually the integers, sometimes also the natural or the real numbers; often ${\displaystyle \mathbb {T} }$ models time.

A stochastic process ${\displaystyle (x_{t})_{t\in \mathbb {T} }}$ is called weakly stationary (rarely covariance-stationary) if

1. the expected value is constant, i.e. ${\displaystyle E(x_{t})=\mu }$ holds for all ${\displaystyle t\in \mathbb {T} }$,
2. the variance is finite, i.e. ${\displaystyle \operatorname {Var} (x_{t})<\infty }$ holds for all ${\displaystyle t\in \mathbb {T} }$, and
3. the autocovariance does not depend on the shift ${\displaystyle s\in \mathbb {T} }$, i.e. ${\displaystyle \operatorname {Cov} (x_{t_{1}},x_{t_{2}})=\operatorname {Cov} (x_{s+t_{1}},x_{s+t_{2}})}$ holds for all ${\displaystyle s,t_{1},t_{2}\in \mathbb {T} }$.
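The three conditions can be checked empirically on simulated data. The following sketch (function names and tolerances are our own choices) estimates mean, variance, and a lag-5 autocovariance on two disjoint segments of a simulated Gaussian white-noise series, which is weakly stationary:

```python
import numpy as np

# Sketch: empirically check the three conditions of weak stationarity
# on simulated Gaussian white noise (a weakly stationary process).
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=100_000)

def sample_autocov(x, lag):
    """Sample autocovariance Cov(x_t, x_{t+lag})."""
    n = len(x) - lag
    return np.cov(x[:n], x[lag:lag + n])[0, 1]

# 1. constant mean: compare the sample mean over two halves of the series
m1, m2 = x[:50_000].mean(), x[50_000:].mean()

# 2. finite variance
v = x.var()

# 3. autocovariance depends only on the lag, not on the position in time:
#    Cov(x_t, x_{t+5}) estimated on two disjoint segments should agree
c1 = sample_autocov(x[:50_000], 5)
c2 = sample_autocov(x[50_000:], 5)

print(abs(m1 - m2) < 0.05, np.isfinite(v), abs(c1 - c2) < 0.05)
```

For a non-stationary series (e.g. one with a trend), the two segment means would differ systematically rather than by mere sampling noise.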

## Interpretation

Stationarity is one of the most important properties of stochastic processes in time series analysis. Stationarity yields properties that hold not merely at individual points in time but are invariant over time: the time series has the same expected value and the same variance at all times. (The most important class of non-stationary processes are the integrated processes.)

Using the first property, one can pass to the new process ${\displaystyle x_{t}-E(x_{t})}$, for which ${\displaystyle E(x_{t}-E(x_{t}))=0}$ holds. This process is also called the centered process. One can therefore assume, without loss of generality, that a stationary stochastic process has mean 0.

The second property simply says that each of the random variables has finite variance and thus belongs to the Hilbert space ${\displaystyle L^{2}}$. From this it also follows that the expected value ${\displaystyle E(x_{t})}$ exists.

The third requirement establishes a relationship between the different points in time and is therefore the most important property. It states that the covariance between two points in time does not depend on the two points themselves, but only on the distance ${\displaystyle r=t_{2}-t_{1}}$ between them. The condition can also be formulated by saying that ${\displaystyle \gamma (r)=\operatorname {Cov} (x_{t_{1}},x_{t_{1}+r})}$ is a function of the single variable ${\displaystyle r}$. One consequence of this is that ${\displaystyle \Gamma =E(xx^{*})-E(x)E(x^{*})}$ is an infinite block Toeplitz matrix.
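The Toeplitz structure can be made concrete for a stationary AR(1) process, whose autocovariance function is known in closed form. The following sketch (our own construction) builds a finite section of the covariance matrix from the function ${\displaystyle \gamma }$ and verifies that every diagonal is constant:

```python
import numpy as np

# Sketch: for a stationary AR(1) process x_t = phi * x_{t-1} + e_t with
# Var(e_t) = 1, the autocovariance function is
#     gamma(r) = phi**|r| / (1 - phi**2),
# so the covariance matrix of (x_1, ..., x_n) is the Toeplitz matrix
# Gamma[i, j] = gamma(i - j); it depends only on the lag i - j.
phi, n = 0.6, 5
gamma = lambda r: phi ** abs(r) / (1 - phi ** 2)
Gamma = np.array([[gamma(i - j) for j in range(n)] for i in range(n)])

# Toeplitz check: entries are constant along each diagonal
assert all(np.allclose(np.diag(Gamma, k), gamma(k)) for k in range(-n + 1, n))
print(np.round(Gamma, 3))
```

Because the matrix is determined by the single-argument function ${\displaystyle \gamma }$, only $n$ numbers are needed to describe all $n^2$ entries.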

## Geometric meaning

The geometric interpretation of the univariate case (${\displaystyle n=1}$) uses the Hilbert space ${\displaystyle L^{2}}$, whose elements are the individual random variables of the process. The geometric interpretation supports a deeper understanding of the concept of stationarity.

Since ${\displaystyle E(x_{t}^{2})}$ is the squared norm in ${\displaystyle L^{2}}$, the requirement ${\displaystyle E(x_{t}^{2})=\gamma (0)}$ may be interpreted as saying that all process variables have the same length, i.e. lie on a sphere.

${\displaystyle E(x_{t+s}x_{t})=\gamma (s)}$ then says, following the above interpretation, that for fixed ${\displaystyle s}$ all pairs ${\displaystyle x_{t+s},x_{t}}$ enclose the same angle. If ${\displaystyle s}$ is increased by one, the rotation continues by the same angle each time.

The requirement of a constant expected value means nothing other than ${\displaystyle \langle x_{t},1\rangle =\mu }$, i.e. the angle between the unit (the constant random variable 1) and each process variable is constant. This cuts a circle of latitude out of the sphere.

## Stationarization

Making a non-stationary time series stationary is an important first task in time series analysis. Widespread methods are differencing, rescaling, or taking logarithms of the time series. More generally, one can try to obtain a stationary time series by fitting a suitable trend-season model.
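Two of these steps, taking logarithms and differencing, can be illustrated on a synthetic series with exponential growth (the series, its parameters, and the tolerance below are our own illustration, not a prescribed recipe):

```python
import numpy as np

# Sketch: stationarizing a synthetic exponentially growing series.
# Its log has a linear trend, and the log-differences fluctuate around
# the constant growth rate, i.e. they form a (weakly) stationary series.
rng = np.random.default_rng(1)
t = np.arange(500)
y = np.exp(0.01 * t + 0.1 * rng.normal(size=500))  # trending, non-stationary

log_y = np.log(y)          # rescaling / taking logarithms
d_log_y = np.diff(log_y)   # differencing removes the linear trend

# the differenced log series fluctuates around the growth rate 0.01
print(abs(d_log_y.mean() - 0.01) < 0.01)
```

Whether one difference suffices, or whether a series must be differenced repeatedly, is exactly what unit-root tests such as the Dickey-Fuller test are used to decide.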

## Examples

The most important (weakly) stationary process is white noise. Furthermore, certain Gaussian processes and ARMA models are stationary. Of theoretical importance are also harmonic processes, which are stationary under certain conditions. Furthermore, Markov chains started in their stationary distribution are stationary processes.
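The contrast between white noise and an integrated process can be seen by simulation: the variance of white noise is constant in time, while the variance of its cumulative sum (a random walk, the simplest integrated process) grows linearly. The thresholds below are our own choices:

```python
import numpy as np

# Sketch: white noise is weakly stationary, its cumulative sum is not.
# We estimate Var(x_t) at an early and a late time point from many
# independent sample paths.
rng = np.random.default_rng(2)
noise = rng.normal(size=(10_000, 200))   # 10 000 paths, 200 time steps
walk = noise.cumsum(axis=1)              # random walk paths

var_noise_early, var_noise_late = noise[:, 10].var(), noise[:, 190].var()
var_walk_early, var_walk_late = walk[:, 10].var(), walk[:, 190].var()

print(abs(var_noise_early - var_noise_late) < 0.1)  # roughly constant
print(var_walk_late > 5 * var_walk_early)           # grows linearly in t
```

For the random walk, ${\displaystyle \operatorname {Var} (x_{t})=t\,\sigma ^{2}}$, which violates the requirement that the variance be the same at all times.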

## Properties

Stationary stochastic processes in discrete time that are given as canonical processes can be understood as a measure-preserving dynamical system. To do this, the shift operator ${\displaystyle \tau }$ is defined as

${\displaystyle \tau ((\omega _{n})_{n\in \mathbb {N} })=(\omega _{n+1})_{n\in \mathbb {N} }}$.

Then ${\displaystyle X_{n}(\omega )=X_{0}(\tau ^{n}(\omega ))}$ holds, and the process arises through iterated application of ${\displaystyle \tau }$. It is therefore a dynamical system, which is measure-preserving because of the stationarity. Building on this, ergodic stochastic processes can also be defined, for which important theorems of ergodic theory, such as the individual ergodic theorem, hold and thus provide strong laws of large numbers for dependent sequences of random variables.
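The practical content of the individual ergodic theorem is that the time average along a single path converges to the ensemble mean. The following sketch illustrates this for a stationary AR(1) process (the process, its parameters, and the tolerance are our own example, not part of the general theory):

```python
import numpy as np

# Sketch: for an ergodic stationary process, the individual ergodic
# theorem says the time average along one sample path converges to the
# ensemble mean. Illustrated with a stationary AR(1) process with mean 0,
# started in its stationary distribution.
rng = np.random.default_rng(3)
phi, n = 0.5, 200_000
x = np.empty(n)
x[0] = rng.normal(scale=np.sqrt(1 / (1 - phi ** 2)))  # stationary start
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

time_average = x.mean()
print(abs(time_average) < 0.05)  # close to the ensemble mean E(x_t) = 0
```

Note that the samples here are dependent, so this is genuinely stronger than the classical law of large numbers for independent variables.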

## Literature

• Peter J. Brockwell, Richard A. Davis: *Time Series: Theory and Methods*. Springer, Berlin 2002, ISBN 0-387-97429-6
• G. E. P. Box, G. M. Jenkins: *Time Series Analysis: Forecasting and Control*. 3rd edition, ISBN 0-13-060774-6

## Footnotes

1. Only 332 Google search results, mainly university lecture notes and statistics books, compared with approx. 149,000 results, some of comparable quality, for "weakly stationary". Both terms are about equally common in English: 2,360,000 vs. 2,870,000 results. Retrieved on May 27, 2012.
2. Covariance stationarity is sometimes defined so that only the autocovariance is stable under time shifts, and not necessarily the expected value; see e.g. Kirchgässner et al., Introduction to Modern Time Series Analysis, Springer 2013, p. 14. If both the mean (mean stationarity) and the autocovariance (covariance stationarity) are stable under time shifts, this is defined as weak stationarity.