Mean squared deviation

From Wikipedia, the free encyclopedia
Two estimation functions: the choice of a biased statistic can be advantageous with regard to its expected deviation from the true value compared to an unbiased statistic.

The mean squared deviation, also called the expected squared error or mean squared error and abbreviated MQA or MSE (after the English name mean squared error), is a concept of mathematical statistics. In estimation theory it indicates how strongly a point estimator scatters around the value to be estimated, which makes it a central quality criterion for estimators. In regression analysis it is interpreted as the expected squared distance of an estimator from the true value.

Definition

Given a statistical model $(X, \mathcal{A}, (P_\vartheta)_{\vartheta \in \Theta})$ and a point estimator $T$ for a function $g(\vartheta)$ to be estimated (in the parametric case, the parameter function). Then

$$\operatorname{MSE}(T) := \mathbb{E}_\vartheta\big[(T - g(\vartheta))^2\big]$$

is called the mean squared deviation of $T$. Here $\mathbb{E}_\vartheta$ denotes the expected value with respect to the probability measure $P_\vartheta$. By the shift theorem for the variance, the equivalent representation

$$\operatorname{MSE}(T) = \operatorname{Var}_\vartheta(T) + \big(\operatorname{Bias}_\vartheta(T)\big)^2$$

follows. Here $\operatorname{Bias}_\vartheta(T) := \mathbb{E}_\vartheta[T] - g(\vartheta)$ is called the bias of the estimator.

For estimators that take values in a general decision space equipped with a norm $\|\cdot\|$, the mean squared deviation can be defined as

$$\operatorname{MSE}(T) := \mathbb{E}_\vartheta\big[\|T - g(\vartheta)\|^2\big].$$
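The decomposition can be checked numerically. The following is a minimal sketch in Python (not part of the article; the deliberately biased estimator $T = 0.9\,\bar{X}$, the true mean, and the sample size are hypothetical choices for illustration) that estimates the MSE by Monte Carlo simulation and compares it with $\operatorname{Var}(T) + \operatorname{Bias}(T)^2$:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, n, runs = 2.0, 20, 100_000          # true mean, sample size, repetitions

    samples = rng.normal(loc=mu, scale=1.0, size=(runs, n))
    T = 0.9 * samples.mean(axis=1)          # a deliberately biased estimator of mu

    mse = np.mean((T - mu) ** 2)            # direct Monte Carlo estimate of the MSE
    var = T.var()                           # variance of the estimator
    bias = T.mean() - mu                    # bias of the estimator

    print(mse, var + bias ** 2)             # the two values agree up to simulation noise

Both printed values should be close to the exact $0.81/20 + (0.1 \cdot 2)^2 \approx 0.0805$.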

Interpretation

In the classical case, a small mean squared deviation means that both the bias and the variance of the estimator are small. With such an estimator one is, on average, close to the functional to be estimated (small bias) and at the same time knows that the estimated values scatter little (small variance) and are very likely to be close to their expected value.

The MSE therefore makes it possible to compare estimation methods with one another. The idea is that it can be advantageous to prefer a slightly biased estimator that in return has a much smaller variance. The estimation procedure with the smaller MSE is generally considered the better one.

The problem is that the MSE depends on the unknown population parameter to be estimated.
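To make this dependence concrete, the following minimal sketch (a hypothetical setup, not from the article) compares the exact MSE of the unbiased sample mean of $n$ i.i.d. $\mathcal{N}(\mu, 1)$ observations with that of the shrinkage estimator $0.9\,\bar{X}$, using the decomposition from the definition. Which estimator is better depends on the true $\mu$:

    n, a = 20, 0.9
    for mu in (0.0, 0.5, 1.0, 2.0):
        mse_mean = 1.0 / n                         # sample mean: unbiased, Var = 1/n
        mse_shrunk = a**2 / n + ((a - 1) * mu)**2  # a * mean: Var + Bias^2
        print(mu, mse_mean, mse_shrunk)

For $\mu$ near 0 the biased estimator has the smaller MSE; for larger $\mu$ the sample mean wins. Without knowledge of $\mu$ there is no uniform ranking.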

Example

A typical case is the estimation of the mean of a normal distribution. We assume random variables $X_1, \dots, X_n$, each normally distributed with unknown expected value $\mu$ and variance 1. The classical estimator is the sample mean

$$\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i.$$

Here the bias is zero:

$$\operatorname{Bias}(\bar{X}) = \mathbb{E}[\bar{X}] - \mu = 0,$$

since the empirical mean is unbiased for $\mu$. Since $\bar{X}$ is normally distributed with expected value $\mu$ and variance $\tfrac{1}{n}$, it follows that

$$\operatorname{MSE}(\bar{X}) = \operatorname{Var}(\bar{X}) = \frac{1}{n}.$$
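The value $\operatorname{MSE}(\bar{X}) = 1/n$ can be verified by simulation. A minimal sketch (the seed, true mean, and sample sizes are arbitrary choices for illustration):

    import numpy as np

    rng = np.random.default_rng(1)
    mu, runs = 2.0, 200_000

    for n in (5, 20, 100):
        xbar = rng.normal(mu, 1.0, size=(runs, n)).mean(axis=1)
        print(n, np.mean((xbar - mu) ** 2), 1.0 / n)   # simulated MSE vs. exact 1/n

The simulated MSE also visibly shrinks toward zero as $n$ grows, which leads to the next concept.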

Mean square consistency

An estimator $T_n$, based on a sample of size $n$, is called mean-square consistent if, as $n \to \infty$,

$$\operatorname{MSE}(T_n) \to 0.$$

For example, the sample mean above has $\operatorname{MSE}(\bar{X}) = \tfrac{1}{n} \to 0$ and is therefore mean-square consistent.

Efficiency of estimators

Let $T$ and $T'$ be two estimators. The estimator $T$ is called MSE-efficient relative to $T'$ if

$$\operatorname{MSE}(T) \leq \operatorname{MSE}(T')$$

holds for all admissible distributions. Furthermore, an estimator is called most MSE-efficient if its MSE is the smallest among all estimators for all admissible distributions.
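A classic illustration (a standard textbook example, not taken from this article) is the estimation of the variance $\sigma^2$ of a normal sample: the biased maximum likelihood estimator with divisor $n$ has a smaller MSE than the unbiased estimator with divisor $n - 1$. A minimal simulation sketch:

    import numpy as np

    rng = np.random.default_rng(2)
    sigma2, n, runs = 1.0, 10, 200_000

    x = rng.normal(0.0, np.sqrt(sigma2), size=(runs, n))
    s2_unbiased = x.var(axis=1, ddof=1)   # divisor n - 1, unbiased
    s2_mle = x.var(axis=1, ddof=0)        # divisor n, biased downward

    print(np.mean((s2_unbiased - sigma2) ** 2))   # about 2*sigma^4/(n-1), here ~0.22
    print(np.mean((s2_mle - sigma2) ** 2))        # smaller (~0.19), despite the bias

Neither estimator is most MSE-efficient here; the example only shows that unbiasedness alone does not guarantee the smaller MSE.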

Classification and related concepts

If estimation theory is interpreted as a statistical decision problem, then every point estimator is a decision function. The deviation of the decision function from the value to be estimated is weighted by a loss function, which indicates how great the "damage" caused by an estimate is. The loss function is then combined with the decision function to form the risk function, which indicates the average damage incurred when using a particular decision function.

In this context, the mean squared deviation is the risk function that arises when using the Gaussian loss function

$$L(g(\vartheta), t) = (g(\vartheta) - t)^2.$$

The risk function is then obtained by taking the expected value.

With an analogous construction using the Laplace loss

$$L(g(\vartheta), t) = |g(\vartheta) - t|,$$

one obtains the mean error $\mathbb{E}_\vartheta\big[|T - g(\vartheta)|\big]$.
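Both risk functions can be approximated for a concrete estimator by replacing the expected value with an empirical average. A minimal sketch for the sample-mean estimator (setup as in the example above; all parameter values are assumptions for illustration):

    import numpy as np

    rng = np.random.default_rng(3)
    mu, n, runs = 2.0, 20, 100_000

    T = rng.normal(mu, 1.0, size=(runs, n)).mean(axis=1)   # sample-mean estimator

    gauss_risk = np.mean((T - mu) ** 2)     # E[(T - mu)^2]: the mean squared deviation
    laplace_risk = np.mean(np.abs(T - mu))  # E[|T - mu|]:   the mean error
    print(gauss_risk, laplace_risk)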
