Lehmann-Scheffé theorem

The Lehmann-Scheffé theorem is a key result of estimation theory, a branch of mathematical statistics. Building on the Rao-Blackwell theorem, it provides criteria under which unbiased point estimators are also uniformly best unbiased estimators, i.e. have a variance at most as large as that of every other unbiased estimator.

The theorem is named after Erich Leo Lehmann and Henry Scheffé.

Statement

The Lehmann-Scheffé theorem can be formulated in different ways, which differ in notation and in the structures used but are identical in content.

For σ-algebras

Let a statistical model $(X, \mathcal{A}, (P_\vartheta)_{\vartheta \in \Theta})$ be given, and let $\mathcal{E}_g$ denote the set of all unbiased estimators with finite variance for the parameter function $g(\vartheta)$. Let the sub-σ-algebra $\mathcal{S} \subseteq \mathcal{A}$ be both sufficient for $(P_\vartheta)_{\vartheta \in \Theta}$ and complete for $(P_\vartheta)_{\vartheta \in \Theta}$.

If $T \in \mathcal{E}_g$, then the Rao-Blackwell improvement of $T$ with respect to $\mathcal{S}$, that is $\operatorname{E}(T \mid \mathcal{S})$, is a uniformly best unbiased estimator for $g$. That is, it satisfies

$\operatorname{Var}_\vartheta\big(\operatorname{E}(T \mid \mathcal{S})\big) \le \operatorname{Var}_\vartheta(T')$

for all $\vartheta \in \Theta$ and all other $T' \in \mathcal{E}_g$.
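
A standard illustration, with notation introduced only for this example, is the Bernoulli model: for $n$ independent observations $X_1, \dots, X_n \sim \operatorname{Ber}(p)$ with unknown $p \in (0,1)$ and parameter function $g(p) = p$, the σ-algebra $\mathcal{S} = \sigma(X_1 + \dots + X_n)$ is sufficient and complete. Starting from the unbiased estimator $T = X_1 \in \mathcal{E}_g$, the Rao-Blackwell improvement is

$\operatorname{E}(T \mid \mathcal{S}) = \frac{1}{n} \sum_{i=1}^n X_i = \bar{X}_n,$

so by the theorem the sample mean $\bar{X}_n$ is a uniformly best unbiased estimator for $p$.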

For statistics

The formulation using statistics follows directly from the above: the sufficient, complete σ-algebra is replaced by the σ-algebra $\sigma(S)$ generated by a sufficient, complete statistic $S$. The conditional expectation $\operatorname{E}(T \mid \sigma(S))$ is then also written as $\operatorname{E}(T \mid S)$. This does not mean that the statement applies only to parametric models. In full formulation, the statement then reads: if $T \in \mathcal{E}_g$, then $\operatorname{E}(T \mid S)$ is a uniformly best unbiased estimator for $g$, i.e. it satisfies

$\operatorname{Var}_\vartheta\big(\operatorname{E}(T \mid S)\big) \le \operatorname{Var}_\vartheta(T')$

for all $\vartheta \in \Theta$ and all other $T' \in \mathcal{E}_g$.
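
As an example of such a Rao-Blackwell improvement (with all notation specific to this example), consider $n$ independent observations $X_1, \dots, X_n \sim \operatorname{Poi}(\lambda)$ and the parameter function $g(\lambda) = e^{-\lambda} = P_\lambda(X_1 = 0)$. The statistic $S = X_1 + \dots + X_n$ is sufficient and complete, and $T = \mathbf{1}\{X_1 = 0\}$ is an unbiased estimator for $g$. Since $X_1$ given $S = s$ is binomially distributed with parameters $s$ and $1/n$, one obtains

$\operatorname{E}(T \mid S) = \left(1 - \frac{1}{n}\right)^{S},$

which by the Lehmann-Scheffé theorem is a uniformly best unbiased estimator for $e^{-\lambda}$.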

Alternative formulations

Possible reformulations of the above statements are:

  • If $S$ is sufficient and complete for $(P_\vartheta)_{\vartheta \in \Theta}$ and $\operatorname{E}_\vartheta(h(S)) = g(\vartheta)$ for all $\vartheta \in \Theta$, then $h(S)$ is a uniformly best unbiased estimator for $g$.
  • If there is a complete sufficient statistic $S$ and there exists a $T$ that is an unbiased estimator for $g$, then there exists a uniformly best unbiased estimator for $g$. This is true because $\operatorname{E}_\vartheta\big(\operatorname{E}(T \mid S)\big) = \operatorname{E}_\vartheta(T) = g(\vartheta)$, so $\operatorname{E}(T \mid S)$ is itself an unbiased estimator for $g$ and a function of $S$. Setting $h(S) = \operatorname{E}(T \mid S)$ in the first formulation yields this one; a numerical sketch of this construction follows below.
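
A minimal Monte Carlo sketch in Python (using NumPy and the Poisson example from the previous section; the sample size and parameter values are chosen only for illustration) shows the variance ordering between the naive unbiased estimator $T$ and its Rao-Blackwellization $\operatorname{E}(T \mid S)$:

    import numpy as np

    rng = np.random.default_rng(0)
    n, lam, reps = 10, 2.0, 200_000        # sample size, Poisson mean, Monte Carlo replications

    x = rng.poisson(lam, size=(reps, n))   # reps independent samples of size n
    s = x.sum(axis=1)                      # complete sufficient statistic S = X_1 + ... + X_n

    t_naive = (x[:, 0] == 0).astype(float) # T = 1{X_1 = 0}, unbiased for exp(-lambda)
    t_rb = (1.0 - 1.0 / n) ** s            # E(T | S) = (1 - 1/n)^S, the Lehmann-Scheffé estimator

    print("target exp(-lambda):", np.exp(-lam))
    print("empirical means:", t_naive.mean(), t_rb.mean())
    print("empirical variances:", t_naive.var(), t_rb.var())

Both estimators have empirical mean close to $e^{-\lambda} \approx 0.135$, while the Rao-Blackwellized estimator shows a markedly smaller empirical variance, as the theorem guarantees.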

Generalizations

The Lehmann-Scheffé theorem can be obtained as a special case of the theorem of Barankin and Stein, which describes the structure of locally minimal estimators.
