From Wikipedia, the free encyclopedia

Negentropy is the abbreviation for negative entropy and is a special case of synentropy. In general, negentropy is defined as entropy with a negative sign. It can be interpreted as a measure of the deviation of a random variable from the uniform distribution. Since the entropy (disorder or randomness) of a uniformly distributed random sequence is maximal, the negentropy of such a sequence is minimal. In the information-theoretic interpretation of entropy, negentropy is high when a character string contains a lot of information, and low for a random character string.


Definition

Negentropy is defined in more than one way. The simplest definition (see above) is: negentropy is negative entropy:

J(X) = −H(X)

Another definition introduces the negentropy of a discrete random variable X so that it corresponds to the redundancy:

J(X) = H(ψ_X) − H(X)

where

  • H(X) is the entropy of X; it is crucial that, in accordance with the above definition, it enters with a negative sign;
  • ψ_X is a suitably normalized, uniformly distributed variable (with the same correlation and covariance matrix as X).

By means of this normalization one achieves that the negentropy of the uniformly distributed variable itself is equal to zero:

J(ψ_X) = H(ψ_X) − H(ψ_X) = 0
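For a discrete variable with N outcomes, the maximum-entropy distribution is the uniform one with H = log₂ N, so the definition above reduces to J(X) = log₂ N − H(X). The following sketch illustrates this; the function names and example distributions are illustrative choices, not taken from the source:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    """Negentropy as the deviation from the maximum-entropy (uniform)
    distribution over the same number of outcomes: J = log2(N) - H(X)."""
    n = len(probs)
    return math.log2(n) - entropy(probs)

uniform = [0.25, 0.25, 0.25, 0.25]   # uniform over 4 outcomes
skewed = [0.7, 0.1, 0.1, 0.1]        # structured (redundant) distribution

print(negentropy(uniform))  # 0.0 -- the uniform distribution has zero negentropy
print(negentropy(skewed))   # positive: the distribution deviates from uniform
```

Note that negentropy defined this way is always non-negative and vanishes exactly on the uniform distribution, matching the normalization described above.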

Interpretation and miscellaneous

The term negative entropy was coined by Erwin Schrödinger in his book What is Life?, or rather adopted by him from Boltzmann. He defines life as something that absorbs and stores negative entropy. This means that life exports entropy and keeps its own entropy low: negentropy import is entropy export.

Even though Schrödinger, as he noted in a footnote, actually meant free energy by negative entropy, this does not contradict the second law of thermodynamics, contrary to an opinion that is often put forward, since the process takes place with a supply of energy (in plants, for example, from sunlight).

Léon Brillouin later shortened the term to negentropy in order to express the facts in a more "positive" way: a living system absorbs negentropy and stores it. Organisms change negentropically through the use of energy. Because of their energy requirements, organisms are open systems.

As for its use in other areas, the term negentropy does not seem to be clearly defined.

In the Lexicon of Biology, negentropy is defined as the average information content of the individual characters within a given character string, which establishes a link to information theory. This also corresponds to the above example of the uniform distribution, since a "uniformly distributed" variable carries no additional information compared to the uniform distribution itself.
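This reading can be made concrete by measuring the average information content per character from the character frequencies of a string. The following sketch (the helper name and example strings are my own, for illustration only) shows that a redundant string has low per-character entropy, and hence high negentropy, while a string with uniform character frequencies attains the maximum:

```python
import math
from collections import Counter

def per_char_entropy(s):
    """Average information content per character, in bits
    (Shannon entropy of the empirical character frequencies)."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

structured = "aaaaaaab"  # highly redundant string
varied = "abcdefgh"      # every character distinct (uniform frequencies)

print(per_char_entropy(structured))  # low: roughly 0.54 bits per character
print(per_char_entropy(varied))      # 3.0 bits per character, the maximum for 8 symbols
```

The redundant string therefore has the higher negentropy relative to the 3-bit maximum, in line with the intuition that negentropy reflects structure rather than randomness.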

The term is defined somewhat differently by sociological systems theorists, namely as "negation of entropy" or as "increase in complexity". Thus, negentropy is synonymous with order or information and is therefore a characteristic of the emergence or delimitation of systems. Another (free) translation would be "absence of (relatively complete) entropy" or, correspondingly, "absence of chaos".

Negentropy and order

The term order, which is usually equated with negentropy, is used in a similarly inconsistent manner as the term negentropy itself.

Literature


  • Norbert Wiener: Cybernetics or Control and Communication in the Animal and the Machine. MIT Press, Massachusetts, 1948.
  • Heinz Herwig, Tammo Wenterodt: Entropy for Engineers. 1st edition. Springer Fachmedien, Wiesbaden 2012, ISBN 978-3-8348-1714-3, section 2.10.1 Negentropy.

References

  1. ISO/IEC DIS 2382-16
  2. Quote: "By the way, 'negative entropy' is not my invention at all. Indeed, it is the term around which Boltzmann's independent discussion revolved." Source: Erwin Schrödinger: What is Life? – Looking at the Living Cell Through the Eyes of the Physicist. Piper Paperback, 1989, ISBN 978-3492211345, p. 130.
  3. Léon Brillouin: "Negentropy Principle of Information". Journal of Applied Physics, vol. 24, no. 9, 1953, pp. 1152–1163.
  4. Léon Brillouin: La science et la théorie de l'information. Masson, 1959.
  5. Lexicon of Biology. Herder Verlag, 1988.