Joint entropy


The joint entropy is a measure of the average information content of several message sources.

Definition

The joint entropy of two sources is defined as

H(X, Y) = -\sum_{x \in X} \sum_{y \in Y} p(x, y) \log_2 p(x, y)

Here, x and y denote individual symbols of the source alphabets X and Y, and p(x, y) is the probability that the two sources emit x and y jointly.
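
For example, for two binary sources with the (arbitrarily chosen) joint probabilities p(0, 0) = 1/2, p(0, 1) = 1/4, p(1, 0) = 1/8 and p(1, 1) = 1/8, the joint entropy evaluates to

H(X, Y) = \tfrac{1}{2} \cdot 1 + \tfrac{1}{4} \cdot 2 + \tfrac{1}{8} \cdot 3 + \tfrac{1}{8} \cdot 3 = 1.75 \text{ bit}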

In the special case of statistically independent sources,

p(x, y) = p(x) \cdot p(y)

and thus

H(X, Y) = H(X) + H(Y).
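
The following Python sketch (with arbitrarily chosen example distributions) illustrates the definition and the additivity property for independent sources:

import math
from itertools import product

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bit of a joint probability table.

    p_xy maps (x, y) pairs to probabilities that sum to 1."""
    return -sum(p * math.log2(p) for p in p_xy.values() if p > 0)

def entropy(p):
    """Entropy H(X) in bit of a marginal distribution given as a dict."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Two statistically independent sources: the joint probability is the
# product of the marginals, p(x, y) = p(x) * p(y).
p_x = {"a": 0.5, "b": 0.5}      # fair binary source
p_y = {"0": 0.25, "1": 0.75}    # biased binary source
p_xy = {(x, y): p_x[x] * p_y[y] for x, y in product(p_x, p_y)}

# For independent sources, H(X, Y) = H(X) + H(Y).
print(joint_entropy(p_xy))            # ~1.8113 bit
print(entropy(p_x) + entropy(p_y))    # ~1.8113 bit

Both printed values agree (approximately 1.8113 bit), as expected for independent sources.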

For more than two sources, this generalizes to

H(X_1, X_2, \ldots, X_n) = -\sum_{x_1 \in X_1} \cdots \sum_{x_n \in X_n} p(x_1, \ldots, x_n) \log_2 p(x_1, \ldots, x_n)
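
For example, n statistically independent fair binary sources together have a joint entropy of H(X_1, \ldots, X_n) = n bit, since each source contributes exactly one bit.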

See also

Literature

  • Martin Werner: Information and Coding. Basics and Applications, 2nd edition, Vieweg+Teubner Verlag, Wiesbaden 2008, ISBN 978-3-8348-0232-3.
  • Peter Bocker: Data Transfer. Volume I: Basics. Springer Verlag, Berlin/Heidelberg 1976, ISBN 978-3-662-06499-3.

External links

References

  1. ^ T. M. Cover, J. A. Thomas: Elements of Information Theory, 2nd edition, Wiley-Interscience, 2006, ISBN 978-0-471-24195-9, p. 16 f.