Rényi entropy

In information theory, the Rényi entropy (named after Alfréd Rényi) is a generalization of the Shannon entropy. The Rényi entropy belongs to a family of functions used to quantify the diversity, uncertainty or randomness of a system.

The Rényi entropy of order α, where α > 0 (and α ≠ 1), is defined as

    H_\alpha(X) = \frac{1}{1 - \alpha} \log_2 \left( \sum_{i=1}^{n} p_i^{\alpha} \right)

Here, X is a random variable with range of values {x_1, x_2, ..., x_n} and p_i is the probability that X = x_i. If the probabilities p_i are all equal, then H_\alpha(X) = \log_2 n, independent of α. Otherwise the entropies are monotonically decreasing as a function of α.
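
For example, for a fair coin with p_1 = p_2 = 1/2 the sum in the definition is 2 · (1/2)^α = 2^{1-α}, so

    H_\alpha(X) = \frac{1}{1 - \alpha} \log_2 2^{1-\alpha} = 1

for every order α, illustrating that a uniform distribution over n values always gives \log_2 n bits.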

Here are a few special cases:

For α = 0,

    H_0(X) = \log_2 n

which is the logarithm of the cardinality (the number of values) of X and is sometimes also called the "Hartley entropy" of X.

In the limit as α approaches 1 (by L'Hôpital's rule), we get

    H_1(X) = - \sum_{i=1}^{n} p_i \log_2 p_i

which corresponds to the Shannon entropy (information entropy).

Furthermore, for α = 2,

    H_2(X) = - \log_2 \sum_{i=1}^{n} p_i^2

which corresponds to the collision entropy (also called the correlation entropy). The limit of H_\alpha(X) for α → ∞ is

    H_\infty(X) = - \log_2 \max_i p_i

and is also called the min-entropy, because it is the smallest value of H_\alpha(X).
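
The special cases above can be checked numerically. The following is a minimal Python sketch (the function name renyi_entropy and the example distribution are illustrative, not taken from the article) that computes H_α in base 2 for several orders and shows the monotone decrease in α:

    import numpy as np

    def renyi_entropy(p, alpha):
        """Rényi entropy (base 2) of a discrete distribution p, for order alpha >= 0."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                       # zero-probability outcomes contribute nothing
        if alpha == 1:                     # limit case: Shannon entropy
            return -np.sum(p * np.log2(p))
        if np.isinf(alpha):                # limit case: min-entropy
            return -np.log2(p.max())
        return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

    p = [0.5, 0.25, 0.125, 0.125]          # example distribution with n = 4 outcomes
    for a in [0, 0.5, 1, 2, np.inf]:
        print(a, renyi_entropy(p, a))
    # prints H_0 = 2.0 (= log2 4), then values decreasing with alpha,
    # down to H_inf = 1.0 (= -log2 0.5)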

The Rényi entropies are important in ecology and statistics as indices of diversity. They also lead to a spectrum of indices of the fractal dimension .
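
For example, the diversity index of order α used in ecology (the Hill number) can be written as the exponential of the Rényi entropy; with the base-2 logarithm used above,

    {}^{\alpha}D = 2^{H_\alpha(X)}

so that H_\alpha(X) is the binary logarithm of the effective number of equally common types.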
