Differential entropy

Differential entropy is a concept from information theory: a measure of the entropy of a continuous random variable, analogous to the Shannon entropy of a discrete random variable.

Strictly speaking, it is a quantity associated with a probability distribution. It can be used to compare two continuous random variables, but it does not carry the same interpretation as the Shannon entropy.

Definition

A continuous random variable can take uncountably many values, i.e. if one could determine its values exactly, the probability of any particular value would be zero:

    P(X = x) = 0

The information content of each exact value would thus be infinite:

    I(x) = -\log_2 P(X = x) \to \infty

Let X be a continuous random variable with probability density function f(x). Its differential entropy is then defined as

    h(X) = -\int_{-\infty}^{\infty} f(x) \, \log_2 f(x) \, dx

with the logarithm taken to base 2, so that h(X) is measured in bits (the natural logarithm gives nats).

In contrast to the Shannon entropy, the differential entropy can also be negative.
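As a minimal numerical sketch (not part of the original article; the helper `differential_entropy_bits` and the chosen intervals are illustrative), the defining integral can be evaluated with SciPy, and a uniform density on an interval shorter than 1 indeed yields a negative value:

```python
import numpy as np
from scipy.integrate import quad

def differential_entropy_bits(f, lo, hi):
    """Numerically evaluate h(X) = -integral of f(x) * log2 f(x) dx over [lo, hi]."""
    integrand = lambda x: -f(x) * np.log2(f(x)) if f(x) > 0 else 0.0
    return quad(integrand, lo, hi)[0]

# Uniform density on [0, 2]: h(X) = log2(2) = 1 bit
print(differential_entropy_bits(lambda x: 0.5, 0.0, 2.0))   # ~ 1.0

# Uniform density on [0, 1/2]: h(X) = log2(1/2) = -1 bit, i.e. negative
print(differential_entropy_bits(lambda x: 2.0, 0.0, 0.5))   # ~ -1.0
```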

Since the differential entropy is not scale-invariant (see below), it is advisable to normalize the random variable appropriately so that it is dimensionless.

Properties

  • The differential entropy is shift-invariant: h(X + c) = h(X) for every constant c. It therefore suffices to consider zero-mean random variables.
  • Under scaling, h(AX) = h(X) + \log_2 |\det A| for a random vector X and a matrix A, where |\det A| is the absolute value of the determinant; in the scalar case, h(aX) = h(X) + \log_2 |a|. Both properties are checked empirically in the sketch below.
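A quick empirical check of both properties (a sketch; scipy.stats.differential_entropy, available from SciPy 1.6, estimates h(X) from samples, so the printed values agree only approximately):

```python
import numpy as np
from scipy.stats import differential_entropy

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # zero-mean Gaussian samples
a, c = 3.0, 5.0               # arbitrary scale factor and shift

h_x = differential_entropy(x, base=2)
h_shifted = differential_entropy(x + c, base=2)  # shift leaves h unchanged
h_scaled = differential_entropy(a * x, base=2)   # scaling adds log2|a|

print(h_x, h_shifted)                   # nearly identical
print(h_scaled, h_x + np.log2(abs(a)))  # nearly identical
```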

Differential entropy for different distributions

For a given variance, the Gaussian distribution has the maximum differential entropy; among all distributions with that variance, its "randomness" or surprise value is the greatest. It is therefore also used to model interference in the channel model, since it represents a worst-case model for interference (see also additive white Gaussian noise).
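This maximality can be seen numerically by fixing a common variance and comparing the closed-form entropies listed in the table below (a sketch; the variance-to-parameter conversions are the standard textbook ones):

```python
import numpy as np

var = 1.0  # common variance for all three distributions

# Gaussian with variance var: h = 0.5 * log2(2*pi*e*var)
h_gauss = 0.5 * np.log2(2 * np.pi * np.e * var)

# Laplace with scale b has variance 2*b**2, so choose b = sqrt(var / 2)
b = np.sqrt(var / 2)
h_laplace = np.log2(2 * b * np.e)

# Uniform on [0, a] has variance a**2 / 12, so choose a = sqrt(12 * var)
a = np.sqrt(12 * var)
h_uniform = np.log2(a)

print(h_gauss, h_laplace, h_uniform)  # ~2.047 > ~1.943 > ~1.792 bits
```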

For a finite range of values, i.e. a given maximum magnitude of the variable, the uniform distribution has the maximum differential entropy.

Overview of different probability distributions and their differential entropy
Each entry lists the probability density function f(x), the differential entropy h(X) (in bits), and the support:

  • Uniform distribution: f(x) = 1/a for 0 \le x \le a; h(X) = \log_2 a; support [0, a]
  • Normal distribution: f(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-(x-\mu)^2/(2\sigma^2)}; h(X) = \log_2(\sigma\sqrt{2\pi e}); support (-\infty, \infty)
  • Laplace distribution: f(x) = \frac{1}{2b} \, e^{-|x-\mu|/b}; h(X) = \log_2(2be); support (-\infty, \infty)
  • Symmetric triangular distribution: f(x) = (a - |x|)/a^2; h(X) = \log_2(a\sqrt{e}); support [-a, a]
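These closed forms can be checked by numerical integration of -f \log_2 f over each support (a sketch with arbitrarily chosen parameters a = 2, sigma = b = 1, mu = 0). Note that the uniform distribution on [-a, a] would attain \log_2(2a) = 2 bits, more than the triangular distribution on the same support, consistent with the maximality remark above:

```python
import numpy as np
from scipy.integrate import quad

def differential_entropy_bits(f, lo, hi):
    """Same numerical helper as in the sketch under Definition."""
    integrand = lambda x: -f(x) * np.log2(f(x)) if f(x) > 0 else 0.0
    return quad(integrand, lo, hi)[0]

a, sigma, b = 2.0, 1.0, 1.0  # arbitrary example parameters; mu = 0 throughout

# Uniform on [0, a]: expect log2(a) = 1.0
print(differential_entropy_bits(lambda x: 1 / a, 0.0, a))

# Normal with sigma = 1: expect log2(sigma * sqrt(2*pi*e)) ~ 2.047
gauss = lambda x: np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
print(differential_entropy_bits(gauss, -20.0, 20.0))

# Laplace with b = 1: expect log2(2*b*e) ~ 2.443
laplace = lambda x: np.exp(-abs(x) / b) / (2 * b)
print(differential_entropy_bits(laplace, -20.0, 20.0))

# Symmetric triangle on [-a, a]: expect log2(a * sqrt(e)) ~ 1.721
triangle = lambda x: (a - abs(x)) / a**2
print(differential_entropy_bits(triangle, -a, a))
```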
