Entropy (social sciences)

The concept of entropy or social entropy found its way into the social sciences, especially into sociological systems theory within sociology, as a measure of inequality or disorder originally taken from physics. It is mostly understood as a measure of the degree of order or disorder within a social system, for example to describe social inequality. Sometimes, instead of a scalable measure, social entropy is understood as the tendency toward change in social structures.

Concept of entropy

An intuitive way of presenting the meaning of entropy is to consider not the entropy itself, but the change in entropy, i.e. its increase or decrease.

  • An increase in entropy is a loss of information:
Gilbert Newton Lewis wrote in 1930: "Gain in entropy always means loss of information, and nothing more."
Murray Gell-Mann's 1994 explanation of entropy can be summarized as: "Entropy is a lack of information, the magnitude of which is measured by the effort that would be required to remedy this lack of information." (For details see below.)
  • To reduce entropy, a system needs its environment:
A system can only reduce its entropy by burdening its environment; to do so, it must be open. If one of its subsystems reduces its entropy, either the sum of the entropies of the other subsystems of the overall system must increase, or the overall system must burden its environment with entropy.
  • An increase in entropy reduces the ability of a system to change:
A system with low entropy can change more easily without burdening its environment than a system with high entropy. If a system changes independently of its environment, then its entropy increases. A system with maximum entropy can no longer change at all on its own. These two facts apply equally to the subsystems of a system.

Critique of the concept of entropy

Choice of words

The physicist Leon Cooper attributes difficulties in understanding the concept of entropy essentially to Rudolf Clausius' unfortunate choice of term: "... instead of taking the name from the body of contemporary languages, he succeeded in coining a word which meant the same thing for everyone: nothing." Clausius justified his choice by saying that he preferred ancient languages for the names of important scientific quantities, because in ancient languages they mean the same thing to everyone. Claude Shannon, in turn, chose the term entropy for information theory instead of his originally intended term "uncertainty", taking up a probably somewhat ironic suggestion by John von Neumann: "... no one knows what entropy really is, so in a debate you will always have the advantage." This is why the Israeli thermodynamicist Arieh Ben-Naim (* 1934) opens a textbook on statistical thermodynamics with an explanation of why he replaces the concept of entropy with the term "missing information" (MI).

The relationship between entropy and "disorder"

Because entropy is frequently and popularly equated with "disorder", the term suggests itself to every science that studies ordering processes and the order of social systems. In popular presentations of social science topics, entropy is often used as a synonym for disorder. One example is "social entropy", a term used in macro-sociology which in public discussion often simply means "social disorder". However, equating entropy with disorder is controversial. It is not inadmissible in principle, but its admissibility depends on many additional conditions, including subjective views and normative evaluations. Using entropy as a synonym for disorder therefore easily leads to statements that are confusing, unclear and contestable. Arieh Ben-Naim shows in Entropy Demystified (2007) that the notion of order and disorder is not helpful in explaining entropy and that an explanation of the mechanisms of thermodynamic entropy is possible precisely by means of information theory and probability theory. The concept of entropy was first developed in physics, but it rests on information and probability theory.

How humans deal with entropy

When dealing with entropy, humans can rely on intuition only to a limited extent. Their main concern is therefore the search for so-called energy sources, without taking into account the entropy production associated with every energy conversion and the need for entropy sinks (e.g. storage sites for waste materials). However, the processes by which entropy is exported into the environment are increasingly being noticed because of their harmful effects; this includes in particular the production of waste materials. In economics and politics, entropy production is treated as a burden on the shared environment, one that is already taxed in many countries. Such entropy taxes, with which environmental pollution is taxed according to the polluter-pays principle, are now usually implemented as eco-taxes. The reduction of entropy production can also be subsidized, for example by promoting efficient drives, heating systems, etc. With emissions trading, the right to export entropy into the shared biosphere became a tradable commodity.

Intuition and implicit knowledge of entropy

"How could it ... be attained that what is born, become, compounded and subject to decay should not decay: that is not possible."

Humans have an implicit knowledge of the limits of the reversibility of the destruction of structures (for example things and living beings). When dealing with entropy and the laws connected with it, humans intuitively recognize processes that are perceived as impossible because they would violate the second law of thermodynamics, probability theory, etc. If a person sees in a film how a house rebuilds itself from rubble and ashes, he knows intuitively that the film is being played backwards. Humans thus have an implicit knowledge of the laws of irreversibility and reversibility. They sense, often without being able to state the reason, that the entropy of a system cannot decrease without the help of the system's environment.

Human intuition is based on vital implicit knowledge. Knowledge of the consequences of one's own actions, by contrast, can partly inhibit one's own opportunities for development. Here people are warned not by the knowledge that their own actions can affect the environment, but by the knowledge of social sanctions and personal losses. Insofar as warnings about consequences likely to harm the agent are not needed, repression and avoidance of knowledge can help to keep disruptive knowledge from impairing one's own development, which has higher priority. This finds structure in belief systems in which miracles are permitted and which thus free the acting person from responsibility. These are reasons for the difficulties humans have with their current possibilities of energy conversion, which grew by orders of magnitude within such a short span of evolutionary history that humans have not yet been able to build up the corresponding implicit and collective knowledge.

Information theoretical entropy

As a measure from information theory, the Shannon entropy found its way into the social sciences with the development of concentration measures and indicators for the inequality of distributions, for example of income and wealth. The entropy measures of Henri Theil, Anthony Atkinson and Serge-Christophe Kolm were derived from it. The Theil index and the Atkinson measure are used not only in the study of income and wealth distributions but also, for example, in sociology to observe the segregation of population groups.

Example: the Theil index

The term "entropy measure" causes difficulties even for outstanding scientists. In On Economic Inequality (chapter 2.11) Amartya Sen remarks that the term entropy is not exactly intuitive. The Theil entropy used by his co-author James Foster is "interesting", but he was surprised that a highly unequal distribution of income, for example, should correspond to high entropy and thus high disorder. Sen even goes so far as to call Theil's formula "arbitrary".

Redundancy is the difference between two entropies

If a system has redundancy in one of its categories, this means that the system can change within this category using its own resources. How the redundancy of a system (here for the category of its internal resource distribution) can be determined from the difference between two of its entropies is shown by the calculation of the Theil index, a measure of inequality used in sociometry and econometrics that Henri Theil derived from information theory.

For the sake of general readability this article otherwise does without formulas; only in the following is an exception made, with the Theil index standing in for all measures of unequal distribution designed as entropy measures. The details are explained in the main article, Theil index. With the variant of the formula given here, unequal distributions can be calculated in which the $N$ groups delimited by quantiles (in this example, groups of income earners) have different widths: let $E_i$ be the income in the $i$-th group and $A_i$ the number (or the percentage share) of income earners in the $i$-th group. $E = \sum_{i=1}^{N} E_i$ is the total income of all $N$ groups and $A = \sum_{i=1}^{N} A_i$ is the total number of income earners (or 100%).

In this case Sen's astonishment is not due to the problem (see above) of equating entropy with disorder; rather, his puzzle is resolved when Theil's formulas are split into two parts and presented as follows:

  • The formula for how the population is distributed over the incomes, split into a left and a right part, uses income-weighted summands:
$$T_T = \ln\frac{A}{E} - \sum_{i=1}^{N} \frac{E_i}{E}\,\ln\frac{A_i}{E_i}$$
  • The formula for how the income is distributed over the parts of the population, split into a left and a right part, uses population-weighted summands:
$$T_L = \ln\frac{E}{A} - \sum_{i=1}^{N} \frac{A_i}{A}\,\ln\frac{E_i}{A_i}$$

In fact, each of the two formulas contains two entropies. The part to the left of the minus sign is the maximum entropy (for a uniform distribution). The part to the right of the minus sign is the actual entropy (for the actually given distribution). In information theory, such a difference is not an entropy but a redundancy.

Amartya Sen was right to wonder: the Theil index is not a Theil entropy but a Theil redundancy. As the difference between two entropies, however, the redundancy remains within the domain of entropy. Although Theil's, Atkinson's and Kolm's measures of unequal distribution are redundancies, they can therefore still be called entropy measures. Calling them "entropies", on the other hand, can lead to confusion.

With normalized data, $E = 1$ and $A = 1$ (so that $E_i$ and $A_i$ denote income and population shares), the formulas become even simpler:

$$T_T = -\sum_{i=1}^{N} E_i \ln\frac{A_i}{E_i} \qquad\qquad T_L = -\sum_{i=1}^{N} A_i \ln\frac{E_i}{A_i}$$

The maximum entropy is shifted to zero. The actually given distribution then presents itself as a negative entropy. That is the redundancy.

The mean value of both redundancies is a symmetrized Theil index (here with normalized data):

$$T_s = \frac{T_T + T_L}{2} = \frac{1}{2}\sum_{i=1}^{N}\left(E_i - A_i\right)\ln\frac{E_i}{A_i}$$
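The following minimal sketch (Python, with hypothetical example data for the normalized income shares E_i and population shares A_i) illustrates how the two redundancies and the symmetrized Theil index could be computed from the formulas above; it is only an illustration of the formulas, not code from Theil or the cited literature.

```python
import math

# Hypothetical example: N = 3 groups of income earners.
# E[i]: income share of group i, A[i]: population share of group i
# (normalized data, so sum(E) == sum(A) == 1).
E = [0.6, 0.3, 0.1]
A = [0.2, 0.3, 0.5]

# Income-weighted redundancy (Theil-T): T_T = -sum_i E_i * ln(A_i / E_i)
T_T = -sum(e * math.log(a / e) for e, a in zip(E, A))

# Population-weighted redundancy (Theil-L): T_L = -sum_i A_i * ln(E_i / A_i)
T_L = -sum(a * math.log(e / a) for e, a in zip(E, A))

# Symmetrized Theil index: the mean of the two redundancies.
T_s = (T_T + T_L) / 2

# All three values are zero only for a completely equal distribution (E == A),
# i.e. when the redundancy vanishes and the entropy is maximal.
print(T_T, T_L, T_s)
```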

One person's order is another's disorder

The example also illustrates the problem of equating entropy with disorder. When a completely equal distribution is reached, the Theil index takes the value zero. That is maximum entropy, or minimum redundancy. But does that also mean "maximum disorder"? Is maximum concentration "maximum order"? If easy findability were a criterion for order, then maximum order would indeed coincide with maximum unequal distribution, for this distribution results when, for example, the entire wealth of a society is held by a single person. The criterion of easy findability would be consistent with the information-theoretical definition of entropy as a measure of missing information: to locate the wealth that is all held by one person, only one address has to be requested and communicated. This small lack of information is easy to remedy. In a real society, however, this maximum order would also mean maximum strife, caused by an injustice perceived as maximal. For egalitarians, by contrast, maximum order is the equal distribution of wealth. Empirical studies show, however, that many people consider an unequal distribution somewhere in a fuzzy region between total equality and total inequality to be optimal.

Correspondence between formula and entropy explanation

Theil's formula is also in accordance with Gell-Mann's explanation of entropy, which speaks of an "effort". To the left of the minus sign stands, for example, the equally distributed wealth per owner. How is the information effort calculated that is required to represent the quotient resulting from this fraction? This is done with a scaling function, the logarithm: the logarithm of a number is proportional to the effort it takes to write that number down. This is the amount of information required to report the evenly distributed per-capita wealth. To the right of the minus sign stands the actual wealth per owner group; again, the logarithm of this number describes the space needed to write it down. Since each group carries less weight than the whole, the individual efforts still have to be weighted.
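As a small numerical illustration of this "writing effort" reading of the logarithm (a sketch only, not taken from Theil or Gell-Mann): the decimal logarithm of a number grows in step with the number of digits needed to write it down.

```python
import math

# The decimal logarithm tracks the writing effort (number of digits) of a number.
for n in (7, 42, 1000, 987654321):
    digits = len(str(n))                      # effort: digits actually written
    estimate = math.floor(math.log10(n)) + 1  # log-based estimate of that effort
    print(n, digits, estimate)
```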

Entropy as a measure of "ignorance"

Murray Gell-Mann's explanation of entropy as the effort necessary to reduce "ignorance" was given only in abbreviated form at the beginning of this article. His explanation was criticized by Ilya Prigogine, who argued that on this reading one could reduce the entropy of a closed system simply by changing one's knowledge. It is therefore important to note that Gell-Mann's definition does not require the lack of information actually to be remedied; it is about the potential effort of remedying it. If that effort were realized within a closed system and a reduction in entropy resulted, then this reduction would at least be compensated by the increase in entropy caused by the effort itself. Here we come full circle to Leó Szilárd (see above), who on the basis of this connection showed that intelligent beings (including Maxwell's demon) cannot reduce the entropy of a closed system.

Statistical Physics

Entropy was first used as a physical quantity in thermodynamics, where it was understood phenomenologically. The processes and connections underlying it can be explained with the statistical physics (also known as statistical mechanics) founded by Ludwig Boltzmann. Boltzmann's entropy therefore offers itself systematically as a common basis for explaining both thermodynamic entropy (according to Clausius) and information-theoretical entropy (according to Shannon).

Macrostate and its microstates

Social scientists use statistics to describe, for example, the macrostates of a society that result from the many microstates of the people acting in their environment. These microstates may themselves be macrostates that can be broken down into even smaller microstates. This leads to complex systems that can only be managed at the macro level. Knowledge of the interaction of the microstates helps to assess the possible macrostates, but the microstates themselves are beyond practical manageability.

In statistical physics, too, the macrostates of a system are treated as a statistical description of the microstates that lead to them. The temperature, for example, is a statistical quantity at the macro level which fluctuates (spatially and temporally) at the micro level and results from the individual behavior of the many particles of the body for which the temperature is measured. Even if the behavior of a single particle could be observed, it would not be possible to observe all the particles of the bodies visible to the naked eye in our everyday world. This example serves to introduce the terms macrostate and microstate before one of the possible explanations of entropy follows: "Entropy and information are closely related. Indeed, entropy can be understood as a measure of ignorance. If it is only known that a system is in a given macrostate, then the entropy of the macrostate describes the degree of ignorance about the microstate. This degree is the number of bits of additional information still needed to specify the microstate completely, all microstates being treated as equally likely." In a systems theory for psychologists, Norbert Bischof describes entropy in a similar way as "the measure of the quantity of unpredictable actuality, that is, of 'information', that is generated on average by each selection act of the signal source." According to Richard Becker, entropy is not ignorance but a function of ignorance: "S = k ln (ignorance)". Here, too, entropy thus remains a measure of ignorance.
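A minimal sketch (in Python) of the quoted reading of entropy as bits of missing information, assuming W equally likely microstates; the conversion to Boltzmann's S = k·ln W is included only to indicate that the physical and the information-theoretical measure count the same ignorance on different scales.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K


def missing_bits(num_microstates: int) -> float:
    """Bits of additional information needed to single out one of the
    equally likely microstates compatible with a given macrostate."""
    return math.log2(num_microstates)


def boltzmann_entropy(num_microstates: int) -> float:
    """The same ignorance expressed as thermodynamic entropy S = k * ln(W)."""
    return K_B * math.log(num_microstates)


# Example: 8 equally likely microstates -> 3 yes/no questions (3 bits) of ignorance.
print(missing_bits(8))        # 3.0
print(boltzmann_entropy(8))   # ~2.87e-23 J/K
```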

Fluctuations

Fluctuations are conceivable short-term, local differences in concentration and thus local reductions in entropy. In a large macro system the occurrence of local fluctuations can be observed: considered in isolation, many small probabilities of occurrence can add up to a sufficiently high probability that an unexpected event will be observable somewhere.
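A small numerical sketch of this point (with hypothetical numbers): even if a local fluctuation is very improbable at any single site, the chance of observing it somewhere in a sufficiently large system approaches certainty.

```python
# Probability that a rare local fluctuation (probability p per independent site)
# is observed at least once among n sites: 1 - (1 - p)**n
p = 1e-6  # hypothetical probability at a single site
for n in (10**3, 10**6, 10**7):
    at_least_once = 1 - (1 - p) ** n
    print(n, round(at_least_once, 4))
```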

The possibility of fluctuations means that in systems analysis it must be carefully examined whether, in systems with few macrostates, short-term fluctuations have to be taken into account, because of which successive states can no longer be identified by an increase in entropy. Moreover, measured against infinity, and even against the length of human history, "short" can also mean many years. For the social sciences this means that it is not always certain whether a development (e.g. economic growth) is of lasting duration or merely a fluctuation.

Merging of the different entropy concepts

The structural coupling between the economy and society, and between society and the material environment, allows different types of entropy transfer and thus also an interaction of information entropies and thermodynamic entropies. How far the physical entropy can be merged with the metaphysical entropy, however, is a matter of debate. Two ways that lead there have already been mentioned:

  • The entropy of information theory is transferred to thermodynamics.
  • The thermodynamic entropy is explained in terms of information and probability theory.

In fact, an explanation of the mechanisms of thermodynamic entropy is possible precisely by means of probability theory.

The quantitative specification of entropy and the quantitative description of entropy changes in complex systems remain difficult or even impossible, which can be explained by the entropy of the describing systems themselves (people and their tools). Nevertheless, an understanding of the effect of entropy on systems (see the introduction to this article) is helpful for understanding the interaction between systems and their environment. This includes the fundamental impossibility of self-sufficient action: when a system opens itself, the purpose of this opening is a renunciation of independence.

Literature

  • Kenneth D. Bailey: Social entropy theory . State University of New York Press, New York 1990, ISBN 978-0-7914-0056-2 (English, 310 pages).
  • Karl-Michael Brunner: Social entropy: The nature-society difference using the example of thermodynamic models of society. In: Karl-Siegbert Rehberg (ed.): Difference and Integration. Volume 2. Westdeutscher Verlag, Opladen 1997, ISBN 3-531-12878-7 (online - conference contribution).
  • Thomas Jakl, Manfred Sietz (eds.): Making sustainability tangible - increase in entropy as a measure of sustainability. Proceedings of the symposium of April 27, 2012, Diplomatic Academy, Vienna 2012.
  • Welf A. Kreiner: Entropy - what is it? Overview for students and teachers.

Web links

Wiktionary: Entropy  - explanations of meanings, word origins, synonyms, translations

References

  1. ^ Ditmar Brock et al.: Sociological paradigms according to Talcott Parsons: An introduction . 1st edition. VS Verlag für Sozialwissenschaften, Wiesbaden 2009, ISBN 978-3-531-16216-4 , p. 388 ( limited preview in Google Book search).
  2. ^ Lewis, Gilbert Newton (1930): The Symmetry of Time in Physics , Science , 71, 0569
  3. ^ Leon Neil Cooper (1968): An Introduction to the Meaning and the Structure of Physics
  4. a b Ben-Naim, Arieh (2008): A Farewell to Entropy: Statistical Thermodynamics Based on Information
  5. Wöhlcke, Manfred (2003): The end of civilization. About social entropy and collective self-destruction.
  6. a b Ben-Naim, Arieh (2007): Entropy Demystified , Chapter 8.2: The Association of Entropy with "Disorder"
  7. a b Verbeeck, Bernhard (1998): Anthropology of Environmental Destruction
  8. Scheer, Hermann (1998): Solar Strategy , therein The Entropy Tax (Chapter 8)
  9. Peter Kafka (1998): Time to Get Up (Memento of October 22, 2007 in the Internet Archive): "This tax on 'messing up the biosphere' could also be called an 'entropy tax'."
  10. ^ Luhmann, Niklas (1988): The economy of society. In the book, Luhmann shows that within the economy money is the sole medium of communication; a physical quantity such as "entropy" cannot be communicated in an operationally closed economic system. The economy is, however, structurally coupled to its environment to such an extent that the environment triggers communication in the economic system. Yet this always happens by means of money as the medium of communication. For this reason, no directly perceptible relationship between environmental pollution and costs is conveyed in this economic communication.
  11. Siddhartha Gautama (presumably passed down from the year 483 BC): Dígha Nikáya (DN 16.3.6), Maháparinibbána Sutta
  12. a b Cowell, Frank A. (2002, 2003): Theil, Inequality and the Structure of Income Distribution (PDF; 320 kB), London School of Economics and Political Science (with references to the "Class of Kolm Indices", which are measures of unequal distribution such as the Theil index).
  13. ^ Theil, Henri (1971): Principles of Econometrics
  14. a b Tsui, Kai-Yuen (1995): Multidimensional Generalizations of the Relative and Absolute Inequality Indices: The Atkinson -Kolm- Sen Approach . Journal of Economic Theory 67, 251-265.
  15. Sen, Amartya (1997): On Economic Inequality , Enlarged Edition with a substantial annexe after a Quarter Century with James Foster, Oxford
  16. The notation with E and A follows the notation of a small collection of formulas by Lionnel Maugis: Inequality Measures in Mathematical Programming for the Air Traffic Flow Management Problem with En-Route Capacities (for IFORS 96), 1996
  17. ^ Cuong Nguyen Viet (2007): Do Foreign Remittances Matter to Poverty and Inequality? Evidence from Vietnam (Memento of March 25, 2009 in the Internet Archive). (The formulas there apply to groups of equal width (equally spaced quantiles); the formulas given in this article can also be applied to groups of different widths.)
  18. Helpman, Elhanan (2004): The Mystery of Economic Growth (The two formulas correspond to the two methods of calculating the Theil index mentioned by the author on p. 150: in the first formula the weighting is based on the income share, in the second on the population share. The mean value of both formulas leads to the symmetrized Theil index used in the article Theil index for comparison with the Hoover unequal distribution.)
  19. ISO / IEC DIS 2382-16: 1996 defines redundancy in information theory
  20. Amiel / Cowell, F. A. (1999): Thinking about Inequality
  21. ^ Prigogine, Ilya (1997): The End of Certainty, Time, Chaos and the New Laws of Nature
  22. Ben-Naim, Arieh (2007): Entropy Demystified (popular science); Ben-Naim, Arieh (2008): Statistical Thermodynamics Based on Information: A Farewell to Entropy (textbook)
  23. Malik, Fredmund (1984/2006): Strategy of the management of complex systems
  24. Gell-Mann, Murray (1994): The Quark and the Jaguar (quote slightly adjusted syntactically and grammatically for better understanding.)
  25. Bischof, Norbert (1998): Structure and meaning (in the original "source" instead of "signal source" in the quote.)
  26. Becker, Richard (1955): Theory of Heat
  27. Vojta, Günter / Vojta, Matthias (2000): Pocket book of statistical physics , Chapter 7 Statistical Physics and Information Theory
  28. ^ Lange, Franz H. (1966): Signals and Systems , Volume 3, p. 64.
  29. Ebeling, W. / Freund, J., Schweitzer, F. (1998): Complex structures: Entropy and information
  30. Vojta, Günter / Vojta, Matthias (2000): Taschenbuch der Statistischen Physik, chapter 13.2.2 Stochastic Theories of Behavior: "After all, the methods of statistical physics open up new ways to research and understand the behavior of very complex systems in sociology, ecology and economics."