Deviance Information Criterion

In statistics, the deviance information criterion (DIC) is a measure (criterion) of the prediction error of a model.

This measure is an information criterion and is used in Bayesian model comparison. The smaller the deviance information criterion, the better the model fit. The deviance information criterion can be viewed as the Bayesian counterpart of the Akaike information criterion.
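For reference, the usual definition (as given in Spiegelhalter et al., reference 1 below) can be stated as follows, where D(θ) is the deviance, D̄ its posterior mean, θ̄ the posterior mean of the parameters, and p_D the effective number of parameters:

```latex
% Standard definition of the DIC (Spiegelhalter et al. 2002), added here for reference.
D(\theta)    = -2 \log p(y \mid \theta) + C                   % deviance (C is a constant)
\bar{D}      = \mathrm{E}_{\theta \mid y}\!\left[ D(\theta) \right]  % posterior mean deviance
p_D          = \bar{D} - D(\bar{\theta})                      % effective number of parameters
\mathrm{DIC} = D(\bar{\theta}) + 2\, p_D = \bar{D} + p_D
```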

When comparing two models by their deviance information criteria, the following rough guideline applies: if the difference is greater than 10, the model with the higher deviance information criterion can essentially be ruled out; differences between 5 and 10 are substantial; if the difference is less than 5 and the models are formulated quite differently, it may be necessary to consider both models.
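As an illustration of how such a comparison could be carried out, the following sketch estimates the DIC of a model from posterior draws; computing it for two competing models and taking the difference gives the quantity to which the rule of thumb above refers. The names (dic_from_samples, log_lik, posterior_mu) and the toy normal model are illustrative assumptions, not taken from any particular library.

```python
import numpy as np

def dic_from_samples(log_lik, theta_samples):
    """Estimate the DIC from posterior draws (illustrative sketch).

    log_lik(theta) must return the log-likelihood log p(y | theta) for one
    parameter vector; theta_samples is an (n_draws, dim) array of MCMC draws.
    """
    theta_samples = np.atleast_2d(theta_samples)
    # Deviance D(theta) = -2 log p(y | theta) evaluated at every posterior draw
    deviances = np.array([-2.0 * log_lik(t) for t in theta_samples])
    d_bar = deviances.mean()                                  # posterior mean deviance
    d_at_mean = -2.0 * log_lik(theta_samples.mean(axis=0))    # deviance at the posterior mean
    p_d = d_bar - d_at_mean                                   # effective number of parameters
    return d_bar + p_d                                        # DIC = D_bar + p_D

# Toy example: normal observations with unknown mean and known variance 1.
rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=50)

def log_lik(theta):
    mu = theta[0]
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * len(y) * np.log(2 * np.pi)

# Pretend these are posterior draws of mu (e.g. produced by an MCMC sampler).
posterior_mu = rng.normal(loc=y.mean(), scale=1.0 / np.sqrt(len(y)), size=(2000, 1))
print("DIC:", dic_from_samples(log_lik, posterior_mu))
```

Repeating the last two steps for a second candidate model and subtracting the two DIC values yields the difference to which the thresholds of 5 and 10 apply.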

References

  1. Spiegelhalter, D. J., Best, N. G., Carlin, B. P. & Van der Linde, A. (2002). "Bayesian Measures of Model Complexity and Fit (with Discussion)". Journal of the Royal Statistical Society, Series B, 64 (4): 583–616.
  2. Explanation of the DIC in the WinBUGS documentation