Loss function (statistics)


A loss function is a special function in mathematical statistics and part of a statistical decision problem. It assigns to each decision, whether in the form of a point estimate, an interval estimate, or a test, the damage caused by a decision that deviates from the true parameter. Together with the decision function, the loss function forms the risk function, which indicates the potential damage incurred when a given decision function is used.

Definition

A statistical model $(X, \mathcal{A}, (P_\vartheta)_{\vartheta \in \Theta})$ and a decision space $(D, \mathcal{D})$ are given. A function $L \colon \Theta \times D \to [0, \infty)$ is then called a loss function if, for every fixed $\vartheta \in \Theta$, the function $d \mapsto L(\vartheta, d)$ is $\mathcal{D}$-measurable. The $L$ stands for loss.

The loss function indicates the loss $L(\vartheta, d)$ that occurs if the parameter is $\vartheta$ and one decides for $d$.
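As a minimal illustrative sketch (not from the article), a loss function on $\Theta = D = \mathbb{R}$ can be written as a plain function of the parameter and the decision; the squared deviation used below is only one possible example choice.

```python
def loss(theta: float, d: float) -> float:
    """Example loss function L(theta, d) on Theta = D = R: the squared deviation."""
    return (theta - d) ** 2


# Deciding d = 1.5 when the true parameter is theta = 2.0 incurs a loss of 0.25.
print(loss(2.0, 1.5))  # 0.25
```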

Classic loss functions

Let the parameter function to be estimated, $g \colon \Theta \to D$, from the parameter set $\Theta$ to the decision space $D$ be given, that is $g(\vartheta) \in D$, together with a norm $\| \cdot \|$ on the decision space. Usually $D \subseteq \mathbb{R}^n$.

Laplace and Gauss loss

A typical loss function is then

$L(\vartheta, d) = \| g(\vartheta) - d \|^p$

for a $p \geq 1$. If $p = 1$, that is

$L(\vartheta, d) = \| g(\vartheta) - d \|$,

one speaks of the Laplace loss. If $p = 2$, that is

$L(\vartheta, d) = \| g(\vartheta) - d \|^2$,

one speaks of the Gauss loss.

If the Gauss loss is chosen in estimation theory, L-unbiasedness simplifies to (ordinary) unbiasedness and the risk function to the mean squared error. Similarly, if the Laplace loss is chosen, L-unbiasedness becomes median-unbiasedness and the risk function becomes the mean absolute error.
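The following Python sketch illustrates the Laplace and Gauss losses under assumed example values (the sample mean as estimator of the mean of a normal distribution with unit variance). Averaging the Gauss loss over many simulated samples approximates the mean squared error of the estimator, and averaging the Laplace loss approximates its mean absolute error.

```python
import numpy as np

rng = np.random.default_rng(0)


def laplace_loss(g_theta, d):
    """Laplace loss: L(theta, d) = ||g(theta) - d||  (absolute error for scalars)."""
    return np.abs(g_theta - d)


def gauss_loss(g_theta, d):
    """Gauss loss: L(theta, d) = ||g(theta) - d||^2  (squared error for scalars)."""
    return (g_theta - d) ** 2


# Example setup: estimate the mean mu of a N(mu, 1) distribution by the sample mean.
mu = 2.0          # true parameter, g(theta) = mu
n = 25            # sample size
runs = 100_000    # number of simulated samples

estimates = rng.normal(mu, 1.0, size=(runs, n)).mean(axis=1)

# Averaging the Gauss loss over many samples approximates the risk under the
# Gauss loss, i.e. the mean squared error (here roughly 1/n = 0.04).
print("approx. mean squared error:", gauss_loss(mu, estimates).mean())

# Averaging the Laplace loss approximates the risk under the Laplace loss,
# i.e. the mean absolute error.
print("approx. mean absolute error:", laplace_loss(mu, estimates).mean())
```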

0-1 loss

Another important loss function is the so-called 0-1 loss. It is defined as

$L(\vartheta, d) = \begin{cases} 0 & \text{if } \| g(\vartheta) - d \| \leq \varepsilon \\ 1 & \text{if } \| g(\vartheta) - d \| > \varepsilon \end{cases}$

for an $\varepsilon > 0$. It does not penalize decisions that are close enough to the "correct" decision at all, and it penalizes all decisions that exceed a certain distance from it equally severely. In the context of region (interval) estimation, the 0-1 loss is also defined as

$L(\vartheta, d) = \begin{cases} 0 & \text{if } g(\vartheta) \in d \\ 1 & \text{if } g(\vartheta) \notin d \end{cases}$,

since the decisions are then sets and no longer individual values.
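A Python sketch of both variants of the 0-1 loss; the tolerance $\varepsilon$, the point decisions, and the interval below are arbitrary example values.

```python
def zero_one_loss(g_theta, d, eps):
    """0-1 loss for point decisions: 0 if d lies within eps of g(theta), else 1."""
    return 0 if abs(g_theta - d) <= eps else 1


def zero_one_loss_region(g_theta, region):
    """0-1 loss for set-valued decisions: 0 if the region covers g(theta), else 1."""
    lower, upper = region
    return 0 if lower <= g_theta <= upper else 1


# Point decision: estimating g(theta) = 2.0 by d = 2.3 with tolerance eps = 0.5
# costs nothing, while d = 3.0 is penalized with loss 1, no matter how far off it is.
print(zero_one_loss(2.0, 2.3, eps=0.5))  # 0
print(zero_one_loss(2.0, 3.0, eps=0.5))  # 1

# Set-valued decision: an interval (1.5, 2.5) covering g(theta) = 2.0 incurs no loss.
print(zero_one_loss_region(2.0, (1.5, 2.5)))  # 0
```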

Neyman-Pearson loss function

A modification of the 0-1 loss, the so-called Neyman-Pearson loss function, is used for statistical tests. If the parameter space is decomposed into hypothesis and alternative as $\Theta = \Theta_0 \cup \Theta_1$, and if $d_0$ denotes the decision for the hypothesis and $d_1$ the decision for the alternative, then the loss function is defined by

$L(\vartheta, d_i) = \begin{cases} L_0 & \text{if } i = 1 \text{ and } \vartheta \in \Theta_0 \\ L_1 & \text{if } i = 0 \text{ and } \vartheta \in \Theta_1 \\ 0 & \text{otherwise} \end{cases}$.

Here $L_0, L_1 > 0$. $L_0$ then corresponds to the loss in the case of a type I error, $L_1$ to the loss in the case of a type II error.
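A Python sketch of the Neyman-Pearson loss; encoding the decision by an index $i$, the concrete hypothesis region, and the constants $L_0$ and $L_1$ below are example choices made only for illustration.

```python
def neyman_pearson_loss(theta, i, theta_0, L0=1.0, L1=1.0):
    """Neyman-Pearson loss.

    theta_0 : predicate describing the hypothesis Theta_0 (its complement is Theta_1).
    i = 1 while theta is in Theta_0  -> type I error, loss L0.
    i = 0 while theta is in Theta_1  -> type II error, loss L1.
    Correct decisions incur no loss.
    """
    in_hypothesis = theta_0(theta)
    if i == 1 and in_hypothesis:
        return L0  # type I error: deciding for the alternative although the hypothesis holds
    if i == 0 and not in_hypothesis:
        return L1  # type II error: deciding for the hypothesis although the alternative holds
    return 0.0


# Example: hypothesis Theta_0 = {theta <= 0}, alternative Theta_1 = {theta > 0}.
theta_0 = lambda theta: theta <= 0

print(neyman_pearson_loss(-1.0, i=1, theta_0=theta_0))  # 1.0, type I error
print(neyman_pearson_loss(0.5, i=0, theta_0=theta_0))   # 1.0, type II error
print(neyman_pearson_loss(0.5, i=1, theta_0=theta_0))   # 0.0, correct decision
```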
