Vector process

In time series analysis, a vector process is the combination of m real-valued random variables that are observable simultaneously at each point in time t.
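In symbols, the m components are collected into one random vector per point in time (a minimal notation sketch; the symbol y_t for the observation vector is chosen here for illustration and is not fixed by the article):

y_t = (y_{1t}, y_{2t}, \dots, y_{mt})', \qquad t \in \mathbb{Z}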

An example from economics is the yield curve: the interest rates for the different remaining maturities form the m variables, whose changes can be observed over time.

Joint stationarity of the vector process implies the stationarity of each of the univariate component processes. Conversely, a combination of m > 1 stationary univariate processes is not necessarily a jointly stationary vector process: stationarity of the component processes is a necessary but not a sufficient condition. Sufficiency holds if the coefficient matrices are square-summable.
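The converse direction can be illustrated with a small simulation. The following sketch (Python with numpy; the construction y_t = (-1)^t x_t is a textbook-style counterexample chosen here for illustration, not taken from the article) produces two series that are each white noise and hence stationary on their own, but whose cross-covariance at lag zero flips sign with t, so the pair is not a jointly stationary vector process.

import numpy as np

rng = np.random.default_rng(0)
T = 100_000
t = np.arange(T)

x = rng.standard_normal(T)   # i.i.d. noise: stationary on its own
y = x * (-1) ** t            # sign flips each period: still white noise,
                             # hence stationary as a univariate process

# The cross-covariance Cov(x_t, y_t) = (-1)^t depends on t, so the
# two-dimensional process (x_t, y_t) is not jointly stationary.
print(np.mean(x[t % 2 == 0] * y[t % 2 == 0]))  # close to +1 at even t
print(np.mean(x[t % 2 == 1] * y[t % 2 == 1]))  # close to -1 at odd t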

Vector processes can be written in a moving-average representation (VMA), an autoregressive representation (VAR), or a combination of both representations (VARMA). Such a process is called a linear or purely non-deterministic vector process. The vector white noise must be uncorrelated across different points in time; correlation within the same point in time, however, is permitted. This is known as contemporaneous correlation. The variances of the noise variables collected in the vector may differ, but must be constant over time.
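A minimal simulation sketch of a VAR(1) process with contemporaneously correlated noise (the coefficient matrix A and the noise covariance Sigma below are illustrative values, not taken from the article):

import numpy as np

rng = np.random.default_rng(1)

# Illustrative VAR(1): y_t = A @ y_{t-1} + u_t with m = 2 components.
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])

# Noise covariance: different variances on the diagonal, constant over time;
# the off-diagonal entry is the permitted contemporaneous correlation.
Sigma = np.array([[1.0, 0.4],
                  [0.4, 2.0]])

T = 500
u = rng.multivariate_normal(mean=np.zeros(2), cov=Sigma, size=T)
y = np.zeros((T, 2))
for s in range(1, T):
    y[s] = A @ y[s - 1] + u[s]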

A vector process is invertible when the AR coefficient matrices are absolutely summable. Such a process, however, is not necessarily stationary; it is stationary when all zeros of the determinant of the AR matrix polynomial lie outside the unit circle. A stationary vector process in MA representation can be inverted if all zeros of the determinant of the MA matrix polynomial lie outside the unit circle.
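For the VAR(1) case the stationarity condition can be checked numerically: the zeros of det(I - Az) are the reciprocals of the eigenvalues of A, so they all lie outside the unit circle exactly when every eigenvalue of A has modulus below one. A short sketch (the matrix A is an illustrative value):

import numpy as np

# Illustrative AR coefficient matrix of a VAR(1) process y_t = A @ y_{t-1} + u_t.
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])

# All zeros of det(I - A z) lie outside the unit circle
# iff every eigenvalue of A has modulus strictly less than 1.
eigenvalues = np.linalg.eigvals(A)
is_stationary = np.all(np.abs(eigenvalues) < 1)
print(eigenvalues, is_stationary)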

With regard to the uniqueness of the ARMA representation of a vector process, the duality that holds for univariate processes no longer applies. Rather, to a given vector process with its associated covariance matrix function there also corresponds a finite AR, MA, or ARMA vector process.