Multicollinearity

from Wikipedia, the free encyclopedia

Multicollinearity is a problem in regression analysis that occurs when two or more explanatory variables are very strongly correlated with one another. On the one hand, as multicollinearity increases, the estimation of the regression coefficients becomes numerically unstable and statements about the regression coefficients become increasingly imprecise. On the other hand, the interpretation of the model is no longer clear-cut. The classic symptom of strong multicollinearity is a high coefficient of determination combined with low t-values for the individual regression parameters.

Problems of multicollinearity

Perfect collinearity makes the computational implementation of linear regression analysis impossible and usually arises from a misspecification of the underlying model (the true model).

Numerical instability

Mathematically, the least squares solution of the multiple linear regression problem for the regression coefficients can be written in vector-matrix notation as

$\hat{\mathbf{b}} = (\mathbf{X}^\top \mathbf{X})^{-1} \mathbf{X}^\top \mathbf{y}.$

The vector $\hat{\mathbf{b}}$ contains the estimated regression coefficients, while the vector $\mathbf{y} = (y_1, \ldots, y_n)^\top$ and the data matrix $\mathbf{X}$ contain the $n$-dimensional observation values. The problem lies in computing the inverse of the sum-of-products matrix $\mathbf{X}^\top \mathbf{X}$: the stronger the multicollinearity, the closer this matrix comes to a singular matrix, i.e. a matrix that has no inverse.
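
As an illustrative sketch not taken from the article, the following Python/NumPy snippet (simulated data and variable names chosen purely for illustration) shows how a nearly collinear regressor makes $\mathbf{X}^\top \mathbf{X}$ ill-conditioned:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100

    x1 = rng.normal(size=n)
    # x2 is almost a linear function of x1 -> near-perfect multicollinearity
    x2 = 2.0 * x1 + rng.normal(scale=1e-4, size=n)

    X = np.column_stack([np.ones(n), x1, x2])   # data matrix with intercept column
    y = 1.0 + 0.5 * x1 - 0.3 * x2 + rng.normal(size=n)

    # The condition number of X'X explodes as the regressors approach exact collinearity
    print("condition number of X'X:", np.linalg.cond(X.T @ X))

    # Least squares estimate; lstsq avoids forming the explicit inverse of X'X
    b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("estimated coefficients:", b_hat)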

Model interpretation

If the regression model is

$y = b_0 + b_1 x_1 + b_2 x_2 + u$

and there is perfect multicollinearity, i.e.

$x_2 = c_0 + c_1 x_1$

or, rearranged,

$x_1 = -\frac{c_0}{c_1} + \frac{1}{c_1} x_2,$

then inserting each of these equations into the regression model yields

$y = (b_0 + b_2 c_0) + (b_1 + b_2 c_1) x_1 + u$ (1)
$y = \left(b_0 - b_1 \frac{c_0}{c_1}\right) + \left(b_2 + \frac{b_1}{c_1}\right) x_2 + u$ (2)

In model (1), $y$ depends only on $x_1$, and in model (2), $y$ depends only on $x_2$. The question now arises which of the two models is the "right" one. In economics, one speaks of non-identifiable models.
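
As a small illustration of this non-identifiability (a sketch with simulated data, not part of the original article), the following Python snippet constructs two different coefficient vectors that produce exactly the same fitted values when $x_2 = c_0 + c_1 x_1$:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 50

    x1 = rng.normal(size=n)
    x2 = 3.0 + 2.0 * x1                       # perfect multicollinearity: c0 = 3, c1 = 2
    X = np.column_stack([np.ones(n), x1, x2])

    # Model (1): all weight on x1; model (2): all weight on x2 (with b0 = 1.0, b1 = 0.5, b2 = -0.3)
    b_model1 = np.array([1.0 + (-0.3) * 3.0, 0.5 + (-0.3) * 2.0, 0.0])
    b_model2 = np.array([1.0 - 0.5 * 3.0 / 2.0, 0.0, -0.3 + 0.5 / 2.0])

    print(np.allclose(X @ b_model1, X @ b_model2))   # True: the data cannot distinguish the models
    print(np.linalg.matrix_rank(X))                  # 2 < 3 columns: X'X is singular, no unique estimate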

Identification of multicollinearity

Because empirical data always exhibit a certain degree of multicollinearity, a number of indicators have been developed that provide evidence of multicollinearity. However, there is no unambiguous guideline.

Correlation

Multicollinearity can be revealed, for example, by analysing the correlation coefficients of the regressors. Very high positive or negative correlation coefficients indicate a strong relationship between the regressors and thus multicollinearity. However, a low correlation between the regressors does not automatically mean the absence of multicollinearity (see the example referenced below); linear combinations of regressors, rather than individual regressors, may also be highly positively or negatively correlated and lead to the problems mentioned above. A high correlation between the regressors can be identified from the correlation matrix.
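
As a minimal sketch (simulated data, not from the article), the correlation matrix of the regressors can be inspected with NumPy as follows:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200

    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.3, size=n)   # strongly correlated with x1
    x3 = rng.normal(size=n)                   # unrelated regressor

    # Off-diagonal entries near +1 or -1 flag possible multicollinearity
    R = np.corrcoef(np.column_stack([x1, x2, x3]), rowvar=False)
    print(np.round(R, 2))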

Coefficient of determination

A high coefficient of determination $R_j^2$ for the auxiliary regressions

$x_j = c_0 + c_1 x_1 + \dotsb + c_{j-1} x_{j-1} + c_{j+1} x_{j+1} + \dotsb + c_p x_p,$

i.e. the $j$-th regressor is well predicted by all other regressors, indicates multicollinearity.
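
A minimal sketch of this auxiliary regression in Python (the helper auxiliary_r2 and the simulated data are assumptions of this sketch, not part of the article):

    import numpy as np

    def auxiliary_r2(X, j):
        # R^2 of the regression of column j of X on all remaining columns plus an intercept
        y = X[:, j]
        Z = np.column_stack([np.ones(X.shape[0]), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        return 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

    rng = np.random.default_rng(3)
    x1 = rng.normal(size=100)
    x2 = 0.9 * x1 + rng.normal(scale=0.2, size=100)
    x3 = rng.normal(size=100)
    X = np.column_stack([x1, x2, x3])

    print(auxiliary_r2(X, 0))   # close to 1 -> x1 is well explained by the other regressors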

Tolerance

The tolerance $T_j = 1 - R_j^2$, with $R_j^2$ the coefficient of determination of the auxiliary regression above, is used to assess multicollinearity. A value of $T_j$ close to zero indicates strong multicollinearity.

Variance Inflation Factor (VIF)

The greater the variance inflation factor

$\mathit{VIF}_j = \frac{1}{1 - R_j^2}$ (with $R_j^2$ as the coefficient of determination of the regression of the $j$-th explanatory variable on all other explanatory variables),

the stronger the indications of multicollinearity. There is no definitive value above which the VIF indicates (too) high multicollinearity. As a rule of thumb, VIF values above 10 are often classified as "too high".
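
The VIF and the tolerance can be computed from the same auxiliary regression; the following sketch (simulated data and a hypothetical helper named vif, both assumptions of this sketch) illustrates this:

    import numpy as np

    def vif(X, j):
        # VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j on the other columns
        y = X[:, j]
        Z = np.column_stack([np.ones(X.shape[0]), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        return 1.0 / (1.0 - r2)

    rng = np.random.default_rng(4)
    x1 = rng.normal(size=200)
    x2 = 0.95 * x1 + rng.normal(scale=0.2, size=200)
    x3 = rng.normal(size=200)
    X = np.column_stack([x1, x2, x3])

    for j in range(X.shape[1]):
        v = vif(X, j)
        print(f"regressor {j}: VIF = {v:.1f}, tolerance = {1.0 / v:.3f}")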

Condition index

The sum-of-products matrix $\mathbf{X}^\top \mathbf{X}$ is positive semidefinite, i.e. all eigenvalues of the matrix are positive or zero. If the matrix becomes singular, at least one eigenvalue is zero. If the condition index

$\mathit{KI}_j = \sqrt{\lambda_{\max} / \lambda_j}$

takes a value greater than 30 for some $j$, this is also referred to as strong multicollinearity.
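
A minimal sketch of this computation (simulated data; note that the columns are not scaled here, which some treatments recommend before computing the index):

    import numpy as np

    rng = np.random.default_rng(5)
    n = 200
    x1 = rng.normal(size=n)
    x2 = 0.98 * x1 + rng.normal(scale=0.1, size=n)
    X = np.column_stack([np.ones(n), x1, x2])

    eigvals = np.linalg.eigvalsh(X.T @ X)            # eigenvalues of the symmetric matrix X'X
    cond_index = np.sqrt(eigvals.max() / eigvals.min())
    print("eigenvalues:", np.round(eigvals, 3))
    print("condition index:", round(cond_index, 1))  # values above roughly 30 flag strong multicollinearity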

Literature

  • L. von Auer: Econometrics - An Introduction. 7th edition. Springer, Berlin 2016, ISBN 978-3-662-47868-4, pp. 561-588.

References

  1. http://www.sgipt.org/wisms/EWA/EWA0.htm#Unauffaellige%20Korrelationsmatrix
  2. For the rule of thumb and a discussion of it, see Wooldridge, Introductory Econometrics: A Modern Approach, 2013, p. 98.