Uniformly best unbiased estimator

from Wikipedia, the free encyclopedia

A uniformly best unbiased estimator, also called a uniformly best estimator or best estimator for short, is a special estimator in estimation theory, a branch of mathematical statistics. Uniformly best unbiased estimators are unbiased point estimators for a given estimation problem, that is, estimators without a systematic error. Due to the randomness of the sample, every unbiased estimator scatters around the value to be estimated, but some scatter less than others. Uniformly best unbiased estimators are those unbiased estimators that, for the given problem, scatter less than every other unbiased estimator; that is, they have the smallest variance among all unbiased estimators for the estimation problem. Uniformly best unbiased estimators are therefore “good” estimators in the sense that they have no systematic error and that their estimate is, on average, closer to the value to be estimated than that of any other unbiased estimator. However, there may be biased estimators that are uniformly better with regard to the mean squared error than a uniformly best unbiased estimator; see, for example, the James–Stein estimator.
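The James–Stein phenomenon mentioned above can be checked numerically. The following sketch (the dimension, true mean vector, and repetition count are illustrative assumptions, not from the article) compares the mean squared error of the unbiased estimator $X$ with that of the positive-part James–Stein shrinkage estimator for the mean of a multivariate normal distribution:

```python
import numpy as np

rng = np.random.default_rng(1)
d, reps = 10, 20000                 # dimension and number of repetitions (illustrative)
theta = np.full(d, 0.5)             # true mean vector (illustrative choice)

# One observation X ~ N(theta, I_d) per repetition.
X = rng.normal(theta, 1.0, size=(reps, d))

# Unbiased estimator: X itself. Its MSE is d (the trace of the covariance).
mse_unbiased = ((X - theta) ** 2).sum(axis=1).mean()

# Positive-part James-Stein estimator: shrink X toward the origin.
norm_sq = (X ** 2).sum(axis=1)
shrink = np.clip(1.0 - (d - 2) / norm_sq, 0.0, None)
JS = shrink[:, None] * X
mse_js = ((JS - theta) ** 2).sum(axis=1).mean()
```

For dimension $d \ge 3$ the shrinkage estimator is biased, yet its mean squared error comes out smaller than that of the unbiased estimator $X$.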

Uniformly best unbiased estimators are also called variance-minimizing estimators or uniformly minimal estimators. Some authors also use the abbreviation UMVUE estimator or UMVU estimator, adopted from the English Uniformly Minimum Variance Unbiased Estimator.

Definition

A statistical model $(\mathcal{X}, \mathcal{A}, (P_\vartheta)_{\vartheta \in \Theta})$ and a parameter function to be estimated $g\colon \Theta \to \mathbb{R}$ are given.

Then an unbiased estimator $T$ with finite variance for $g$ is called a uniformly best unbiased estimator for $g$ if for every further unbiased estimator $S$ with finite variance for $g$:

$\operatorname{Var}_\vartheta(T) \le \operatorname{Var}_\vartheta(S)$ for all $\vartheta \in \Theta$,

or, equivalently due to unbiasedness,

$\operatorname{E}_\vartheta\big[(T - g(\vartheta))^2\big] \le \operatorname{E}_\vartheta\big[(S - g(\vartheta))^2\big]$ for all $\vartheta \in \Theta$.
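As an illustration of the definition (a simulation sketch; the normal model, sample size, and repetition count are illustrative choices, not from the article): for i.i.d. normal observations both the sample mean and the sample median are unbiased for the location parameter, but the sample mean, which is the uniformly best unbiased estimator in this model, scatters less:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, reps = 1.0, 50, 20000        # true mean, sample size, repetitions (illustrative)

# Draw `reps` samples of size `n` from N(mu, 1).
samples = rng.normal(mu, 1.0, size=(reps, n))

means = samples.mean(axis=1)         # unbiased estimator, variance 1/n
medians = np.median(samples, axis=1) # also unbiased (by symmetry), but larger variance

var_mean = means.var()
var_median = medians.var()
```

In this simulation both estimators are (approximately) centered at $\mu$, but the sample mean has noticeably smaller variance, as the definition of a uniformly best unbiased estimator requires.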

Remarks

Uniformly best estimators are intuitively easy to motivate: given two unbiased estimators, one would consider the one that scatters less around the value to be estimated to be the “better” one. The estimator that is better in this sense than all other unbiased estimators is then the uniformly best estimator.

Mathematically, however, the following problems exist:

  • In general, a uniformly best estimator does not have to exist.
  • Even if a uniformly best estimator exists, it is not clear how to find it.
  • If an unbiased estimator is given, it is difficult to determine whether it is a uniformly best estimator. The problem is that the set of unbiased estimators against which it must be compared is difficult to specify.

Important statements

Central statements regarding uniformly best estimators are the Cramér–Rao inequality, the Rao–Blackwell theorem, and the Lehmann–Scheffé theorem.

Covariance method

The covariance method provides a way of using covariance to construct uniformly best estimators or to check for a given estimator whether it is a uniformly best estimator.

If $T$ is an unbiased estimator with finite variance, then $T$ is a uniformly best unbiased estimator if and only if for every zero estimator $N$, i.e. every estimator with $\operatorname{E}_\vartheta[N] = 0$ for all $\vartheta \in \Theta$,

$\operatorname{Cov}_\vartheta(T, N) = 0$ for all $\vartheta \in \Theta$

applies. More generally, the covariance method can be applied to any linear subspace of the estimator functions: if $L$ is such a linear subspace, $E_L$ is the set of unbiased estimators with finite variance in $L$, $N_L$ is the set of zero estimators in $L$, and for some $T \in E_L$

$\operatorname{Cov}_\vartheta(T, N) = 0$ for all $\vartheta \in \Theta$ and all $N \in N_L$,

then $T$ is a uniformly best unbiased estimator for $g$ within $E_L$.
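As a sketch of the covariance method (the Bernoulli model, the zero estimator $N = X_1 - X_2$, and the helper name are illustrative assumptions, not from the article): for $n$ i.i.d. Bernoulli($p$) trials the sample mean is unbiased for $p$, the statistic $N = X_1 - X_2$ has expectation zero for every $p$, and the exact covariance between the two vanishes, as the covariance method requires of a uniformly best estimator:

```python
from itertools import product

def cov_mean_null(n, p):
    """Exact Cov_p(sample mean, X1 - X2) for n i.i.d. Bernoulli(p) trials,
    computed by enumerating all 2**n outcomes (illustrative helper)."""
    e_mean = e_null = e_prod = 0.0
    for x in product((0, 1), repeat=n):
        prob = 1.0
        for xi in x:
            prob *= p if xi else 1.0 - p
        xbar = sum(x) / n        # unbiased estimator of p
        null = x[0] - x[1]       # a zero estimator: E_p[N] = 0 for every p
        e_mean += prob * xbar
        e_null += prob * null
        e_prod += prob * xbar * null
    return e_prod - e_mean * e_null
```

Here $\operatorname{Cov}_p(\bar X, X_1 - X_2) = p(1-p)/n - p(1-p)/n = 0$ for every $p$, which the enumeration confirms numerically.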

Generalizations

If the concept of a uniformly best estimator is too strong, it can be weakened: instead of requiring that the variance of an estimator be smaller than the variance of every other unbiased estimator for all $\vartheta \in \Theta$, one only requires that its variance at a fixed $\vartheta_0 \in \Theta$ be smaller than the variance of all other unbiased estimators at $\vartheta_0$. This leads to the concept of the locally minimal estimator.
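The weakened requirement can be written out as follows (a sketch using the notation of the definition above; $\vartheta_0$ denotes the fixed parameter):

```latex
% T (unbiased, with finite variance) is a locally minimal estimator at \vartheta_0
% if, for every further unbiased estimator S with finite variance,
\operatorname{Var}_{\vartheta_0}(T) \le \operatorname{Var}_{\vartheta_0}(S),
% i.e. the variance comparison is required only at the fixed \vartheta_0,
% not uniformly for all \vartheta \in \Theta.
```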
