Chapman-Robbins inequality

The Chapman-Robbins inequality is a mathematical statement in estimation theory, a branch of mathematical statistics. For an unbiased estimator it provides a lower bound on the variance of the estimator and thus also an assessment of its quality. Under additional regularity conditions, the Chapman-Robbins inequality also yields a pointwise version of the Cramér-Rao inequality.

The inequality is named after Douglas George Chapman and Herbert Robbins.

Formulation

Framework

A statistical model $(\mathcal{X}, \mathcal{A}, (P_\vartheta)_{\vartheta \in \Theta})$ with $\Theta \subseteq \mathbb{R}$ is given. Let $(P_\vartheta)_{\vartheta \in \Theta}$ be dominated by a $\sigma$-finite measure $\mu$, that is, for every $\vartheta \in \Theta$ there exists a density function

$f(x, \vartheta) := \frac{\mathrm{d}P_\vartheta}{\mathrm{d}\mu}(x)$

of $P_\vartheta$ with respect to $\mu$.

Furthermore, let $\mathcal{L}^2(P_\vartheta)$ be the set of all functions that are square-integrable with respect to $P_\vartheta$ (see Lp-space) and $E_g$ the set of all unbiased estimators for the parameter function $g \colon \Theta \to \mathbb{R}$.

Then

$E_g^\vartheta := E_g \cap \mathcal{L}^2(P_\vartheta)$

is the set of all unbiased estimators for $g$ with finite variance with respect to $P_\vartheta$, and

$D_\vartheta := \left\{ f(\cdot, \vartheta') \;\middle|\; \vartheta' \in \Theta,\ \tfrac{f(\cdot,\vartheta')}{f(\cdot,\vartheta)} \in \mathcal{L}^2(P_\vartheta) \right\}$

is the set of all density functions whose quotient with $f(\cdot, \vartheta)$ has finite variance with respect to $P_\vartheta$.

Statement

For every estimator $T \in E_g^\vartheta$ it holds that

$\operatorname{Var}_\vartheta(T) \;\geq\; \sup\left\{ \frac{\bigl(g(\vartheta') - g(\vartheta)\bigr)^2}{\operatorname{Var}_\vartheta\!\left( \frac{f(\cdot,\vartheta')}{f(\cdot,\vartheta)} \right)} \;\middle|\; \vartheta' \in \Theta,\ \vartheta' \neq \vartheta,\ f(\cdot,\vartheta') \in D_\vartheta \right\}.$
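
As an illustration, the following is a minimal numerical sketch of the right-hand side, assuming a single observation from a Bernoulli($\vartheta$) model dominated by the counting measure on $\{0, 1\}$, with parameter function $g(\vartheta) = \vartheta$ and the unbiased estimator $T(X) = X$; the model, the estimator and the grid of alternatives $\vartheta'$ are chosen purely for illustration.

    import numpy as np

    def chapman_robbins_bound(theta, alternatives):
        """Right-hand side of the Chapman-Robbins inequality for a single
        Bernoulli(theta) observation and the parameter function g(theta) = theta."""
        bounds = []
        probs = np.array([1.0 - theta, theta])      # P_theta on the support {0, 1}
        for tp in alternatives:
            if abs(tp - theta) < 1e-12:
                continue                            # the supremum runs over theta' != theta
            # Likelihood ratio f(x, theta') / f(x, theta) evaluated at x = 0 and x = 1.
            ratio = np.array([(1.0 - tp) / (1.0 - theta), tp / theta])
            # Its mean under P_theta is 1, so its variance is E[(ratio - 1)^2].
            var_ratio = np.sum(probs * (ratio - 1.0) ** 2)
            bounds.append((tp - theta) ** 2 / var_ratio)
        return max(bounds)

    theta = 0.3
    alternatives = np.linspace(0.05, 0.95, 19)      # grid of alternative parameters theta'
    print("Chapman-Robbins bound:", chapman_robbins_bound(theta, alternatives))
    print("Var of T(X) = X:      ", theta * (1.0 - theta))

In this model the quotient equals $\vartheta(1-\vartheta)$ for every alternative $\vartheta'$, so the Chapman-Robbins bound coincides with the variance of $T$ and is attained.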

Transition to the Cramér-Rao inequality

The Chapman-Robbins inequality yields a pointwise version of the Cramér-Rao inequality under the following conditions:

  • The derivative $\frac{\partial}{\partial \vartheta} f(x, \vartheta)$ exists at $\vartheta_0$ for all $x$.
  • The quotient $\frac{f(\cdot,\vartheta) - f(\cdot,\vartheta_0)}{(\vartheta - \vartheta_0)\, f(\cdot,\vartheta_0)}$ converges for $\vartheta \to \vartheta_0$ in $\mathcal{L}^2(P_{\vartheta_0})$ to $\frac{\partial}{\partial \vartheta} \ln f(\cdot, \vartheta) \big|_{\vartheta = \vartheta_0}$.
  • The parameter function $g$ is differentiable at $\vartheta_0$.

From these assumptions it follows that

$\lim_{\vartheta \to \vartheta_0} \frac{g(\vartheta) - g(\vartheta_0)}{\vartheta - \vartheta_0} = g'(\vartheta_0)$

as well as

$\lim_{\vartheta \to \vartheta_0} \operatorname{Var}_{\vartheta_0}\!\left( \frac{f(\cdot,\vartheta)}{(\vartheta - \vartheta_0)\, f(\cdot,\vartheta_0)} \right) = I(\vartheta_0)$,

where $I(\vartheta_0)$ is the Fisher information at the point $\vartheta_0$.

From the Chapman-Robbins inequality it then follows for every $T \in E_g^{\vartheta_0}$ that

$\operatorname{Var}_{\vartheta_0}(T) \;\geq\; \frac{g'(\vartheta_0)^2}{I(\vartheta_0)}$,

which is the Cramér-Rao inequality at the point $\vartheta_0$.
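
This limiting behaviour can be checked numerically. The following sketch, assuming a single observation from a Poisson($\vartheta$) model with $g(\vartheta) = \vartheta$ (so that $I(\vartheta_0) = 1/\vartheta_0$), evaluates the Chapman-Robbins quotient for alternatives $\vartheta' = \vartheta_0 + \delta$ and shows how it approaches the Cramér-Rao bound as $\delta \to 0$; the model and the values of $\delta$ are chosen only for illustration.

    import numpy as np
    from scipy.stats import poisson

    def chapman_robbins_quotient(theta0, theta_alt, k_max=100):
        """(g(theta') - g(theta0))^2 / Var_{theta0}(f(., theta')/f(., theta0))
        for a single Poisson(theta0) observation and g(theta) = theta."""
        k = np.arange(k_max + 1)                 # truncated support, ample for small theta0
        p0 = poisson.pmf(k, theta0)
        ratio = poisson.pmf(k, theta_alt) / p0   # likelihood ratio on the support
        var_ratio = np.sum(p0 * (ratio - 1.0) ** 2)
        return (theta_alt - theta0) ** 2 / var_ratio

    theta0 = 2.0
    cramer_rao = theta0                          # g'(theta0)^2 / I(theta0) = 1 / (1/theta0)

    for delta in [1.0, 0.5, 0.1, 0.01]:
        q = chapman_robbins_quotient(theta0, theta0 + delta)
        print(f"theta' = theta0 + {delta:5.2f}: quotient = {q:.6f}")
    print(f"Cramér-Rao bound g'(theta0)^2 / I(theta0) = {cramer_rao:.6f}")

For every fixed $\vartheta' \neq \vartheta_0$ the quotient here lies below its limit, and the limit recovers the Cramér-Rao bound; since the Chapman-Robbins bound is a supremum over all admissible $\vartheta'$, it is never weaker than the pointwise Cramér-Rao bound obtained in this way.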
