Locally minimal estimator


A locally minimal estimator, also called a locally optimal estimator, is a special unbiased point estimator in estimation theory, a branch of mathematical statistics. A locally minimal estimator scatters less than every other unbiased estimator with respect to one given probability measure, i.e. its variance is minimal there. Locally minimal estimators are thus a weakening of uniformly best unbiased estimators, which scatter less than all other unbiased estimators with respect to an entire class of probability measures.

Definition

A statistical model $(X, \mathcal{A}, (P_\vartheta)_{\vartheta \in \Theta})$ and a parameter function to be estimated $g\colon \Theta \to \mathbb{R}$ are given.

Let $\mathcal{E}_g$ be the set of unbiased estimators for $g$ and

$\mathcal{E}_g^{\vartheta_0} := \{ T \in \mathcal{E}_g \mid \operatorname{Var}_{\vartheta_0}(T) < \infty \}$

the set of all unbiased estimators for $g$ with finite variance with respect to $P_{\vartheta_0}$, where $\vartheta_0 \in \Theta$.

An estimator $T^* \in \mathcal{E}_g^{\vartheta_0}$ is then called locally minimal in $\vartheta_0$, or locally optimal in $\vartheta_0$, if for all other $T \in \mathcal{E}_g^{\vartheta_0}$

$\operatorname{Var}_{\vartheta_0}(T^*) \le \operatorname{Var}_{\vartheta_0}(T)$

holds.
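As an illustration that is not taken from the original article, the following sketch applies the definition in the normal location model with known variance; the model choices are assumptions made only for this example.

```latex
% Illustrative example (assumed, not from the article): the normal location
% model X_1,\dots,X_n \sim \mathcal{N}(\vartheta,\sigma^2) i.i.d. with known
% \sigma^2 > 0 and parameter function g(\vartheta) = \vartheta.
\[
  \bar{X} := \frac{1}{n}\sum_{i=1}^{n} X_i
  \quad\text{satisfies}\quad
  \mathbb{E}_\vartheta(\bar{X}) = \vartheta \ \text{for all } \vartheta,
  \qquad
  \operatorname{Var}_{\vartheta_0}(\bar{X}) = \frac{\sigma^2}{n}.
\]
% By the Cramér–Rao inequality every T \in \mathcal{E}_g^{\vartheta_0} has
% \operatorname{Var}_{\vartheta_0}(T) \ge \sigma^2/n, so \bar{X} is locally
% minimal in every \vartheta_0 (and in fact uniformly best).
```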

Covariance method

The covariance method provides a way to construct locally minimal estimators by means of the covariance, or to check whether a given estimator is locally minimal. Let $\mathcal{N}$ denote the set of all null estimators, i.e. of all estimators $N$ with $\mathbb{E}_\vartheta(N) = 0$ for every $\vartheta \in \Theta$, and let

$\mathcal{N}^{\vartheta_0} := \{ N \in \mathcal{N} \mid \operatorname{Var}_{\vartheta_0}(N) < \infty \}$

denote the set of all null estimators with finite variance with respect to $P_{\vartheta_0}$.

If a $T \in \mathcal{E}_g^{\vartheta_0}$ is given, then $T$ is locally minimal in $\vartheta_0$ if and only if for all $N \in \mathcal{N}^{\vartheta_0}$

$\operatorname{Cov}_{\vartheta_0}(T, N) = 0$

holds.
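As a numerical sketch that is not part of the original article, the following Python snippet checks the necessary direction of this criterion in an assumed Poisson model: the sample mean, which is locally minimal there, is uncorrelated under $P_\lambda$ with the particular null estimator $X_1 - X_2$. The model, sample size and estimator choices are illustrative assumptions; a full verification would have to consider all null estimators.

```python
import numpy as np

# Illustrative Monte Carlo check of the covariance criterion (assumed setup,
# not from the article): X_1, ..., X_n i.i.d. Poisson(lam), g(lam) = lam.
rng = np.random.default_rng(0)
n, lam, reps = 5, 3.0, 200_000

X = rng.poisson(lam, size=(reps, n))

T = X.mean(axis=1)      # sample mean: unbiased for lam, locally minimal
N = X[:, 0] - X[:, 1]   # one null estimator: E(X_1 - X_2) = 0 for every lam

# The criterion requires Cov(T, N) = 0 under P_lam for *all* null estimators;
# here we only check it against this one particular N.
cov_TN = np.cov(T, N)[0, 1]
print(f"estimated Cov(T, N) = {cov_TN:.4f}  (theoretical value: 0)")
```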

More generally, the covariance method can be applied to any linear subspace of the estimator functions. If $V$ is such a linear subspace, then for a $T \in \mathcal{E}_g^{\vartheta_0} \cap V$

$\operatorname{Cov}_{\vartheta_0}(T, N) = 0 \quad \text{for all } N \in \mathcal{N}^{\vartheta_0} \cap V$

holds if and only if $T$ is locally minimal in $\vartheta_0$ for $\mathcal{E}_g^{\vartheta_0} \cap V$.
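As an illustration not contained in the original, choosing $V$ as the subspace of linear estimators of an i.i.d. sample recovers the best linear unbiased estimator at $\vartheta_0$; the following sketch works this out under assumed moment conditions.

```latex
% Illustrative special case (assumed, not from the article): X_1,\dots,X_n
% i.i.d. with \mathbb{E}_\vartheta(X_i) = g(\vartheta), g \not\equiv 0, and
% \operatorname{Var}_{\vartheta_0}(X_i) = \sigma^2 \in (0,\infty); let
% V := \{\sum_i a_i X_i : a \in \mathbb{R}^n\} be the linear estimators. Then
% \mathcal{E}_g^{\vartheta_0}\cap V = \{\sum_i a_i X_i : \sum_i a_i = 1\} and
% \mathcal{N}^{\vartheta_0}\cap V = \{\sum_i c_i X_i : \sum_i c_i = 0\}, and
\[
  \operatorname{Cov}_{\vartheta_0}\!\Big(\sum_i a_i X_i,\ \sum_i c_i X_i\Big)
  = \sigma^2 \sum_i a_i c_i = 0
  \quad\text{for all } c \text{ with } \textstyle\sum_i c_i = 0
  \;\Longleftrightarrow\;
  a_1 = \dots = a_n = \tfrac{1}{n},
\]
% so the sample mean is locally minimal in every \vartheta_0 within the
% linear unbiased estimators.
```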

Existence and uniqueness

Existence statements for locally minimal estimators are mostly based on functional-analytic concepts. The locally minimal estimators correspond exactly to the minima of the functional $\Phi\colon \mathcal{E}_g^{\vartheta_0} \to \mathbb{R}$ that is defined by

$\Phi(T) := \mathbb{E}_{\vartheta_0}(T^2)$;

since every $T \in \mathcal{E}_g^{\vartheta_0}$ has the same expected value $g(\vartheta_0)$, minimizing $\Phi$ amounts to minimizing the variance. The fundamental theorem of the calculus of variations, for example, then provides an existence statement. Concretely one can conclude: if $(P_\vartheta)_{\vartheta \in \Theta}$ is dominated by $P_{\vartheta_0}$, all density functions $\tfrac{\mathrm{d}P_\vartheta}{\mathrm{d}P_{\vartheta_0}}$ lie in $L^2(P_{\vartheta_0})$ (see Lp space) and $\mathcal{E}_g^{\vartheta_0}$ is not empty, then there exists an estimator that is locally minimal in $\vartheta_0$.
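A possible reasoning behind this existence statement, added here as a sketch and not quoted from the article, is the Hilbert space projection argument in $L^2(P_{\vartheta_0})$:

```latex
% Sketch (added for illustration): write the unbiasedness constraints with
% respect to P_{\vartheta_0},
\[
  \mathcal{E}_g^{\vartheta_0}
  = \Big\{ T \in L^2(P_{\vartheta_0}) \;:\;
      \int T \,\frac{\mathrm{d}P_\vartheta}{\mathrm{d}P_{\vartheta_0}}\,
      \mathrm{d}P_{\vartheta_0} = g(\vartheta)
      \ \text{for all } \vartheta \in \Theta \Big\}.
\]
% Since each density dP_\vartheta/dP_{\vartheta_0} lies in L^2(P_{\vartheta_0}),
% every constraint is a continuous linear equation, so this set is closed,
% convex and non-empty by assumption; its element of minimal L^2-norm exists
% (projection of 0 onto a closed convex set), minimizes \Phi, and is therefore
% locally minimal in \vartheta_0.
```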

The covariance method also yields the uniqueness of a locally minimal estimator: if a locally minimal estimator exists in $\vartheta_0$, then it is $P_{\vartheta_0}$-almost surely uniquely determined.
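A short sketch of this uniqueness, added here for illustration using the covariance method:

```latex
% Sketch (added for illustration): let T_1, T_2 both be locally minimal in
% \vartheta_0. Then N := T_1 - T_2 is a null estimator with finite variance
% with respect to P_{\vartheta_0}, and the covariance criterion gives
\[
  \operatorname{Var}_{\vartheta_0}(T_1 - T_2)
  = \operatorname{Cov}_{\vartheta_0}(T_1, N) - \operatorname{Cov}_{\vartheta_0}(T_2, N)
  = 0 - 0 = 0,
\]
% and since \mathbb{E}_{\vartheta_0}(T_1 - T_2) = 0, it follows that
% T_1 = T_2 holds P_{\vartheta_0}-almost surely.
```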

Important statements

In addition to the statements for uniformly best unbiased estimators, which also hold pointwise and therefore for locally minimal estimators, the following results are important:

  • Theorem of Barankin and Stein: it characterizes the locally minimal estimator via the closure of the linear span of the density functions of the probability measures involved.
  • Chapman–Robbins inequality: it provides a lower bound for the variance of an estimator and yields, in the limit, a pointwise version of the Cramér–Rao inequality; a standard form of the bound is sketched after this list.
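
For orientation, a standard form of the Chapman–Robbins bound (the exact formulation is not reproduced in the original article) reads:

```latex
% Standard form of the Chapman–Robbins bound (added for orientation): for an
% unbiased estimator T of g with finite variance with respect to P_{\vartheta_0},
\[
  \operatorname{Var}_{\vartheta_0}(T)
  \;\ge\;
  \sup_{\substack{\vartheta \neq \vartheta_0 \\ P_\vartheta \ll P_{\vartheta_0}}}
  \frac{\big(g(\vartheta) - g(\vartheta_0)\big)^2}
       {\mathbb{E}_{\vartheta_0}\!\left[\left(\dfrac{\mathrm{d}P_\vartheta}{\mathrm{d}P_{\vartheta_0}}\right)^{\!2}\right] - 1},
\]
% and letting \vartheta \to \vartheta_0 recovers, under regularity conditions,
% the Cramér–Rao bound (g'(\vartheta_0))^2 / I(\vartheta_0).
```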
