Linear predictor

From Wikipedia, the free encyclopedia

In statistics, and particularly in parametric regression analysis, a linear predictor is a linear combination of a series of coefficients (regression coefficients) and explanatory variables (independent variables) whose value is used to predict a response variable. This additive, linear systematic component is a central part of linear regression models.

Definition

In parametric regression analysis, a search space of potential regression functions is constructed using several regression parameters. The parameter configuration is then sought that provides the best fit to the observed values of the response variable and the explanatory variables. The most important model classes of parametric regression analysis are, on the one hand, the class of linear models and, on the other hand, the class of generalized linear models. The adjective "linear" stems from the fact that both model classes are based on the linear predictor, which is defined as follows:

\eta = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k.

This linear predictor is made up of the explanatory variables x_1, \dots, x_k and the fixed but unknown regression parameters \beta_0, \beta_1, \dots, \beta_k; the leading entry of the covariate vector is usually set equal to one (x_0 = 1). The parameter \beta_0 is thus the intercept of the regression line or, more precisely, of the "regression hyperplane". It determines the level of the linear predictor and is therefore also called the level parameter. Regression analysis is concerned with estimating the intercept \beta_0, the slope parameters \beta_1, \dots, \beta_k and the variance of the disturbance variables.
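As a minimal sketch (the coefficient and covariate values below are hypothetical, chosen only for illustration), the linear predictor is simply a sum of coefficient–variable products:

```python
# Linear predictor eta = beta_0*x_0 + beta_1*x_1 + beta_2*x_2, with x_0 = 1.
# All numeric values here are hypothetical, for illustration only.
beta = [2.0, 0.5, -1.5]   # intercept beta_0, then slope parameters
x = [1.0, 4.0, 2.0]       # leading entry set to one by convention
eta = sum(b * xi for b, xi in zip(beta, x))
print(eta)  # 2.0*1.0 + 0.5*4.0 + (-1.5)*2.0 = 1.0
```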

Linear models vs. generalized linear models

Linear models assume the following relationship between the regression function and the linear predictor:

\mu = \operatorname{E}(y) = \eta = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k.

Generalized linear models, on the other hand, assume that the expected value of the response variable is related to the linear predictor only through a suitable invertible link function g:

g(\mu) = \eta.

With the inverse of the link function, the response function h = g^{-1}, the regression function in this case results as

\mu = h(\eta) = g^{-1}(\eta).
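For example, with the logit link of logistic regression (a standard choice of link function, used here only as a sketch), the link function g and its inverse, the response function h, can be written as:

```python
import math

def g(mu):
    # Logit link function: maps the mean mu in (0, 1) to the linear predictor.
    return math.log(mu / (1.0 - mu))

def h(eta):
    # Response function h = g^{-1}: maps the linear predictor back to the mean.
    return 1.0 / (1.0 + math.exp(-eta))

eta = 0.8            # hypothetical value of the linear predictor
mu = h(eta)          # expected value of the response, mu = h(eta)
print(abs(g(mu) - eta) < 1e-9)  # True: g(mu) = eta, as required
```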

In additive models and generalized additive models with observations y_i of a continuous response variable and explanatory variables x_{i1}, \dots, x_{ik}, additional continuous explanatory variables z_{i1}, \dots, z_{iq} are included. The predictor is then given by:

\eta_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_k x_{ik} + f_1(z_{i1}) + \cdots + f_q(z_{iq}).

The functions f_1, \dots, f_q represent the non-linear smooth effects of the explanatory variables and are modeled and estimated using non-parametric methods.
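A minimal sketch of such an additive predictor, with hypothetical coefficients and a fixed cubic standing in for a smooth function f that would in practice be estimated non-parametrically:

```python
import numpy as np

# Additive predictor eta_i = beta_0 + beta_1 * x_i + f(z_i).
# In a real (generalized) additive model f would be estimated
# non-parametrically; here it is a fixed smooth function for illustration.
def f(z):
    return z**3 - z

beta0, beta1 = 2.0, 0.5                  # hypothetical parameters
x = np.array([4.0, 1.0, 3.0, 0.0, 2.0])  # covariate with linear effect
z = np.linspace(-1.0, 1.0, 5)            # covariate with smooth effect
eta = beta0 + beta1 * x + f(z)
print(eta)  # values: 4.0, 2.875, 3.5, 1.625, 3.0
```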

Vector matrix notation

In vector matrix notation, the linear predictor can be written as follows:

\eta = \mathbf{x}^\top \boldsymbol{\beta}, where \mathbf{x} = (1, x_1, \dots, x_k)^\top and \boldsymbol{\beta} = (\beta_0, \beta_1, \dots, \beta_k)^\top.

Here \boldsymbol{\beta} is a ((k+1) \times 1) column vector and \mathbf{x}^\top is a transposed column vector, that is, a (1 \times (k+1)) row vector, so that the product \mathbf{x}^\top \boldsymbol{\beta} results in a (1 \times 1) matrix, i.e. a scalar.
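In this notation, stacking the transposed covariate vectors of several observations into a design matrix yields one linear predictor per row; a sketch with hypothetical values:

```python
import numpy as np

# Each row of X is a covariate vector x_i^T = (1, x_i1, x_i2);
# the values are hypothetical, for illustration only.
X = np.array([[1.0, 4.0, 2.0],
              [1.0, 1.0, 0.0],
              [1.0, 3.0, 5.0]])
beta = np.array([2.0, 0.5, -1.5])  # beta = (beta_0, beta_1, beta_2)^T
eta = X @ beta                     # x_i^T beta for every observation i
print(eta)  # one scalar linear predictor per row: 1.0, 2.5, -4.0
```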

Use in linear regression

An example of the use of a linear predictor is linear regression, in which an additive disturbance term \varepsilon_i is superimposed on the relationship between the explanatory variables and the response variable. In multiple linear regression, the relationship can be written as follows:

y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_k x_{ik} + \varepsilon_i, \quad i = 1, \dots, n.
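As a sketch, the regression coefficients of this relationship can be recovered from simulated data by ordinary least squares (the true coefficients and noise level below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate y_i = beta_0 + beta_1*x_i1 + beta_2*x_i2 + eps_i and then
# estimate the coefficients by ordinary least squares.
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept column
beta_true = np.array([2.0, 0.5, -1.5])                      # hypothetical
y = X @ beta_true + rng.normal(scale=0.1, size=n)           # add disturbance

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # estimates close to (2.0, 0.5, -1.5)
```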

References

  1. ^ Ludwig Fahrmeir, Thomas Kneib, Stefan Lang, Brian Marx: Regression: Models, Methods and Applications. Springer Science & Business Media, 2013, ISBN 978-3-642-34332-2, p. 536.