# Entropy

**Physical quantity**
- Name: entropy
- Formula symbol: $S$
- SI unit: J·K⁻¹
- SI dimension: L²·M·T⁻²·Θ⁻¹
*When ice melts, the ordered ice crystal passes over into random motion of the individual water molecules: the entropy of the water in the ice cube increases in the process (Rudolf Clausius, 1862).*

The entropy (a coinage from Ancient Greek ἐντροπία *entropía*, from ἐν *en* 'in, into' and τροπή *tropḗ* 'turn, transformation') is a fundamental thermodynamic state variable with the SI unit joules per kelvin (J/K).

All processes that take place spontaneously within a system increase its entropy, as does the supply of heat or matter. Examples of such processes are mixing, heat conduction, chemical reactions, and the conversion of mechanical into thermal energy through friction (see dissipation, energy devaluation). The entropy of a system can decrease only through the release of heat or matter. Therefore, in an isolated system (one that exchanges neither energy nor matter with its environment) the entropy cannot decrease, but can only increase over time (second law of thermodynamics). Processes that increase the entropy of one system can only be undone if another system absorbs that entropy; such processes are called irreversible.

For example, in a system consisting of a cold and a hot body in an insulating box, i.e. a practically isolated system, heat transport sets in and the temperature difference disappears. After a certain time both bodies have the same temperature, and the system has reached the state of greatest entropy. In such an isolated system we practically never observe the colder body spontaneously cooling further while the warmer one heats up.

In statistical mechanics, a macrostate of a system, defined exclusively by macroscopic thermodynamic quantities, is the more probable the greater the number of microstates that can realize it and that can merge into one another through internal processes. This number determines the entropy of the system in that macrostate. In a system left to itself in an arbitrary initial state, the spontaneous internal processes then drive the system, with overwhelming probability, toward the macrostate that can be realized at the same energy by the greatest number of different microstates, i.e. the one with the highest possible entropy.

Colloquially this is often paraphrased by saying that entropy is a "measure of disorder". However, disorder is not a well-defined physical concept and therefore has no physical measure. It is more accurate to understand entropy as an objective measure of the amount of information that would be required to infer the actual microstate of the system from the observable macrostate. This is what is meant when entropy is described as a "measure of ignorance of the states of all individual particles".

## Historical overview

For a long time in the history of physics there was a dispute over the meaning of the term "heat": one side advocated the theory that the phenomena of heat are based solely on the vis viva ("living force", i.e. kinetic energy) of the atoms; the other claimed that heat is a substance, which was given the name caloricum (French *calorique*, English *caloric*).

In 1789 Antoine Laurent de Lavoisier distinguished chaleur (heat) from calorique (caloricum). The caloricum was supposed, among other things, to exert a repulsive force between the atoms of a solid, so that upon supply of a sufficient amount of caloricum the solid would first become liquid and then gaseous. Together with Pierre Simon Laplace he constructed an ice calorimeter. Lavoisier and Laplace deliberately left open whether the vis viva or the caloricum substance is the cause of the heat phenomena. Joseph Black distinguished temperature from the quantity of heat, based among other things on the latent heat of melting. He noticed that a quantity of heat must be carried along with the steam escaping from a boiler.

Benjamin Thompson, Count Rumford, investigated in Munich in 1798 the temperature of the chips produced when boring cannon barrels. Because an arbitrarily large amount of heat could arise from the mechanical boring work, he doubted that the caloricum could be a conserved substance, which gave the proponents of the vis viva theory a boost.

The namesake of the Carnot process, Nicolas Léonard Sadi Carnot, wrote in 1824 that the power of a steam engine is due not to the consumption of calorique, but to its transport from a warm body to a cold one, thereby anticipating the concept of entropy. The experiments of Robert Mayer and James Prescott Joule in the early 1840s showed that mechanical work can be converted quantitatively into heat. This was the basis for the general law of conservation of energy, i.e. the first law, formulated by Hermann von Helmholtz in 1847. Since then the physical term heat has been fixed in its energetic meaning.

Another 20 years later, however, Rudolf Clausius discovered that when the energy form heat is transferred, a second, quantity-like quantity must flow as well. He saw this quantity as the cause of disgregation during melting and called it entropy. As worked out by Wilhelm Ostwald in 1908 and Hugh Longbourne Callendar in 1911, Clausius's entropy corresponds to the calorique of Lavoisier and Carnot.

With the work of Ludwig Boltzmann and Willard Gibbs, it became possible around 1875 to give entropy a statistical definition that explains microscopically the quantity previously defined macroscopically. The entropy $S$ of a macrostate is calculated from the probabilities $p_i$ of the microstates $i$:

${\displaystyle S = -k_\mathrm{B} \sum_i p_i \ln(p_i)}$

The proportionality factor $k_\mathrm{B}$ is the Boltzmann constant; Boltzmann himself, however, never determined its value.
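As a minimal numeric sketch of this formula, the following snippet evaluates the Gibbs entropy of a discrete distribution; the value of $k_\mathrm{B}$ is the exact SI value, and the uniform distribution is chosen because it reproduces $S = k_\mathrm{B} \ln \Omega$:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI revision)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum_i p_i ln(p_i) of a discrete distribution."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p = 0 contribute nothing (p ln p -> 0) and are skipped:
    return -K_B * sum(p * math.log(p) for p in probs if p > 0.0)

# A uniform distribution over 16 microstates reproduces S = k_B * ln(16):
uniform = [1.0 / 16] * 16
print(gibbs_entropy(uniform))
```

A distribution concentrated on a single microstate gives zero entropy, the smallest possible value.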

The entropy defined in this way can be used meaningfully in many contexts.

Connections between entropy and information emerged as early as the 19th century in discussions of Maxwell's demon, a thought experiment that became topical again in the context of miniaturization in the computer age. Computer science uses Shannon's information entropy, which corresponds to the statistical interpretation, as an abstract measure of information with no direct relation to its physical realization. Norbert Wiener also used the concept of entropy to describe information phenomena, but with the opposite sign to Shannon's. That Shannon's convention prevailed is due above all to the better technical usability of his work.

## Classical thermodynamics

In thermodynamics, a system can exchange energy with its environment in two ways: as heat or as work, where various variants of work exist depending on the system and the process, among them volume work, chemical work, and magnetic work. In the course of such an energy exchange the entropy of both the system and the environment changes. Only if the global sum of the entropy changes is positive does the change occur spontaneously.

### Basics

Entropy $S$ (unit J/K) is an extensive state variable of a physical system and behaves additively when several systems are combined, just like volume, electric charge, or amount of substance. The physicist Rudolf Clausius introduced the term in 1865 to describe cyclic processes. Dividing $S$ by the mass of the system yields the specific entropy $s$ with the unit J/(kg·K), an intensive state variable.

According to Clausius, in reversible processes between equilibrium states, the differential $\mathrm{d}S$ is the ratio of the transferred heat $\delta Q_\mathrm{rev}$ and the absolute temperature $T$:

${\displaystyle \mathrm{d}S = \frac{\delta Q_\mathrm{rev}}{T} \qquad (1)}$

This change in entropy is positive when heat is supplied and negative when heat is removed. In this notation, the non-italic $\mathrm{d}$ emphasizes that $\mathrm{d}S$ is a complete (exact) differential, in contrast to $\delta Q$, which cannot be one because the heat $Q$ is a process variable. The reciprocal absolute temperature thus plays the role of an "integrating factor" that turns the reversibly supplied or removed heat, mathematically an incomplete differential, into the corresponding complete differential $\mathrm{d}S$. Consequently the change of entropy in reversible processes, unlike the heat supplied or removed, is path-independent. Once an arbitrary value is fixed for a reference state, entropy becomes a state variable determined solely by the respective state.
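Because $\mathrm{d}S = \delta Q_\mathrm{rev}/T$ is exact, the entropy change for reversibly heating a body of constant heat capacity has the closed form $m\,c\,\ln(T_2/T_1)$. A small sketch, assuming a constant specific heat of water (a simplification; real $c$ varies slightly with temperature):

```python
import math

def entropy_change_heating(mass_kg, c_J_per_kgK, T1_K, T2_K):
    """Integrate dS = delta_Q_rev / T = m*c*dT / T for constant heat capacity.

    The closed form is m*c*ln(T2/T1); path independence means only the
    end states matter.
    """
    return mass_kg * c_J_per_kgK * math.log(T2_K / T1_K)

# Heating 1 kg of water (c ~ 4186 J/(kg K) assumed constant) reversibly
# from 20 degC (293.15 K) to 80 degC (353.15 K):
dS = entropy_change_heating(1.0, 4186.0, 293.15, 353.15)
print(round(dS, 1), "J/K")
```

Running the calculation backwards (from 353.15 K to 293.15 K) gives the same magnitude with negative sign, as expected for heat removal.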

In this sense, for reversible process management, entropy can also be defined as "thermal energy weighted by $\tfrac{1}{T}$". Below we treat the problem of how far the energy of a system can be converted into work.

Using the first law of thermodynamics, $\mathrm{d}U = \delta W + \delta Q$, i.e. that the change in energy $\mathrm{d}U$ is composed of the work performed and the heat supplied, and inserting for the work all processes the experimenter can realize by changing the system variables, $\delta W = -p\,\mathrm{d}V + \mu\,\mathrm{d}N + \dots$, one obtains from (1) for the change of entropy as a function of the thermodynamic variables (still in the reversible case)

${\displaystyle \mathrm{d}S = \frac{1}{T}\,(\mathrm{d}U + p\,\mathrm{d}V - \mu\,\mathrm{d}N - \dots)}$
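For an ideal gas at fixed particle number, this relation can be integrated in closed form: with $\mathrm{d}U = n C_V\,\mathrm{d}T$ and $p = nRT/V$ one gets $\Delta S = n C_V \ln(T_2/T_1) + nR \ln(V_2/V_1)$. A sketch under these ideal-gas assumptions:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def ideal_gas_entropy_change(n_mol, Cv_molar, T1, T2, V1, V2):
    """Integrate dS = (dU + p dV)/T for an ideal gas at fixed particle number.

    With dU = n*Cv*dT and p = n*R*T/V this yields
    dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1).
    """
    return n_mol * Cv_molar * math.log(T2 / T1) + n_mol * R * math.log(V2 / V1)

# Isothermal doubling of the volume of one mole of a monatomic gas
# (Cv = 3/2 R): only the volume term contributes, dS = R * ln 2.
dS = ideal_gas_entropy_change(1.0, 1.5 * R, 300.0, 300.0, 1.0, 2.0)
print(dS)  # about 5.76 J/K
```

The same function also covers isochoric heating (equal volumes, different temperatures), where only the first term survives.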

Clausius also dealt with irreversible processes and showed that in an isolated thermodynamic system the entropy can never decrease:

${\displaystyle \Delta S \geq 0 \qquad (2),}$

where the equals sign applies only to reversible processes. $\Delta S = S_e - S_a$ is the entropy change of the system, with $S_a$ the entropy of the state at the beginning of the state change and $S_e$ that of the state at the end of the process.

From (2) the following inequality follows for closed systems, in which heat energy can cross the system boundaries:

${\displaystyle \Delta S \geq \Delta S_Q = \int \frac{\delta Q}{T} \qquad (3a)}$

$\Delta S_Q$ is the entropy contribution resulting from the supply of heat across the system boundary. The formula also applies to the removal of heat from the system, in which case $\Delta S_Q$ is negative. Inequality (3a) becomes an equation only for purely reversible processes.

When analyzing thermodynamic systems in engineering, a balance analysis is often carried out. For this, one writes inequality (3a) in the following form:

${\displaystyle \Delta S = \Delta S_Q + \Delta S_\mathrm{irr} \qquad (3)}$

Here $\Delta S_\mathrm{irr} \geq 0$ is the entropy contribution that arises from irreversible processes inside the system. These include, for example, mixing processes after removing an internal partition, thermal equalization processes, the conversion of electrical or mechanical energy (ohmic resistance, stirrer) into heat, and chemical reactions. If the irreversible processes are limited exclusively to the dissipation of mechanical or electrical work, then $\Delta S_\mathrm{irr}$ can be expressed through the dissipated work $\delta W_\mathrm{diss}$ or the dissipated power $P_\mathrm{diss}$:

${\displaystyle \Delta S_\mathrm{irr} = \int \frac{\delta W_\mathrm{diss}}{T} = \int \frac{P_\mathrm{diss}}{T}\,\mathrm{d}t}$

If the irreversible process runs quasi-statically, so that the system is always close to an equilibrium state, then (3) can also be written with time derivatives:

${\displaystyle \dot{S} = \dot{S}_Q + \dot{S}_\mathrm{irr} \qquad}$

Here $\dot{S}_Q$ is called the entropy transport flow and $\dot{S}_\mathrm{irr}$ the entropy production flow.
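The dissipation integral above lends itself to a simple numerical sketch. The resistor scenario and its numbers are illustrative assumptions, not taken from the article:

```python
def entropy_production(P_diss, T, duration_s, steps=10_000):
    """Numerically integrate S_irr = int P_diss(t) / T(t) dt (midpoint rule).

    P_diss: dissipated power in W as a function of time t in s
    T:      absolute temperature in K as a function of time t in s
    """
    dt = duration_s / steps
    return sum(P_diss((k + 0.5) * dt) / T((k + 0.5) * dt) * dt
               for k in range(steps))

# A 100 W ohmic resistor dissipating into a bath held at 300 K for 60 s:
S_irr = entropy_production(lambda t: 100.0, lambda t: 300.0, 60.0)
print(S_irr)  # 100 W * 60 s / 300 K = 20 J/K of produced entropy
```

Because `P_diss` and `T` are passed as functions of time, the same routine handles time-varying power or temperature without modification.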

From the first law of thermodynamics

${\displaystyle \Delta U = W + Q}$

it follows that the product $T\,\Delta S$ ($= Q$) represents the unused portion ("waste heat") in the isothermal generation of work $W$ from the existing internal energy $\Delta U$. The maximum value of this work is the so-called free energy

${\displaystyle \Delta F = \Delta U - T\,\Delta S}$.

This is an equivalent form of the 2nd law.

A consequence of this is the impossibility of a perpetuum mobile of the second kind. Clausius formulated it:

"There is no such thing as a cycle whose only effect is to transport heat from a colder reservoir to a warmer reservoir."

Otherwise, evidently, an inexhaustible source of energy would have been constructed. If it were possible to construct such a cyclic process, one could continuously take energy from the warm reservoir and perform work with it. The dissipated work would then be fed to the cold reservoir and, via the cyclic process mentioned, would benefit the warm reservoir again. Equivalent to this is the formulation of William Thomson, later Lord Kelvin:

"There is no such thing as a cycle that takes a quantity of heat from a reservoir and completely converts it into work."

An ideal process that can be reversed at any time without frictional losses is called reversible. The entropy then remains unchanged during the process, $\Delta S = 0$; a well-known example is the adiabatic compression and expansion in the cycle of a Carnot machine. Changes of state with constant entropy are also called isentropic; however, not every isentropic change of state is adiabatic. If a process is adiabatic and reversible, it is always isentropic as well.

If in a cyclic process the heat $Q_\mathrm{h}$ is absorbed at the temperature $T_\mathrm{h}$ and the amount of heat $Q_\mathrm{l}$ is released again at $T_\mathrm{l}$, and if the heat absorption and release are reversible, then the entropy does not change:

${\displaystyle \oint \mathrm{d}S = 0}$; or ${\displaystyle \frac{Q_\mathrm{h}}{T_\mathrm{h}} = \frac{Q_\mathrm{l}}{T_\mathrm{l}}}$.

From this one can derive the maximum work performed $W = Q_\mathrm{h} - Q_\mathrm{l}$ and the maximum efficiency $\eta$, the so-called Carnot efficiency:

${\displaystyle \eta = \frac{W}{Q_\mathrm{h}} = \frac{T_\mathrm{h} - T_\mathrm{l}}{T_\mathrm{h}}\,.}$

Carnot's efficiency represents the maximum work yield for all heat engines. Real machines usually have a considerably lower efficiency. In them, part of the theoretically available work is dissipated, e.g. by friction. Thus entropy is produced in a real machine, and more heat is transferred to the cold reservoir than necessary. It therefore operates irreversibly.
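The Carnot formula is a one-liner to evaluate; the reservoir temperatures below are illustrative assumptions:

```python
def carnot_efficiency(T_hot_K, T_cold_K):
    """Maximum (Carnot) efficiency of a heat engine between two reservoirs,
    eta = (T_hot - T_cold) / T_hot, with absolute temperatures."""
    if not (T_hot_K > T_cold_K > 0):
        raise ValueError("need T_hot > T_cold > 0 (absolute temperatures)")
    return 1.0 - T_cold_K / T_hot_K

# Steam at 800 K against cooling water at 300 K (illustrative figures):
eta = carnot_efficiency(800.0, 300.0)
print(eta)  # 0.625: even an ideal engine converts at most 62.5 % of Q_h
```

Note that the efficiency depends only on the temperature ratio, not on the working substance or the construction of the machine.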

The third law (the so-called "Nernst heat theorem") fixes the entropy of a perfectly crystalline substance, in which for example no spin degeneracy occurs, as zero at absolute zero:

${\displaystyle S(T=0) \equiv 0\,.}$

One conclusion is, for example, that the heat capacity of a system vanishes at low temperatures and, above all, that the absolute zero point of temperature cannot be reached (this holds also in the presence of spin degeneracy).

If a substance does not fulfill the condition of being perfectly crystalline (e.g. if there are several configurations, or if it is a glass), an entropy can be ascribed to it even at absolute zero (zero-point entropy).

### Partial derivatives of entropy

From the second law, statements about the partial derivatives of the entropy follow, e.g. with respect to the temperature $T$ or the volume $V$. The second law states first of all that for a reversible change of state $\mathrm{d}S = \tfrac{\delta Q_\mathrm{rev}}{T}$. Together with the first law it follows that $\mathrm{d}S = \tfrac{\mathrm{d}U - \delta W}{T}$, because according to the first law the sum of the work $\delta W$ supplied to the system under consideration and the supplied heat $\delta Q$ (individually not state functions!) yields a state function, namely the internal energy $U$ of the system. It is assumed here that the changes of volume and temperature proceed adiabatically slowly, so that no irreversible processes are generated.

So

${\displaystyle \mathrm{d}S = \frac{1}{T}\frac{\partial U(T,V)}{\partial V}\,\mathrm{d}V + \frac{1}{T}\frac{\partial\,(U(T,V) + p\cdot V(T))}{\partial T}\,\mathrm{d}T,}$

where $\delta W = -p\,\mathrm{d}V$ was used.

${\displaystyle \Rightarrow \frac{\partial S}{\partial V} = \frac{1}{T}\frac{\partial U(T,V)}{\partial V}}$ and
${\displaystyle \frac{\partial S}{\partial T} = \frac{1}{T}\frac{\partial\,(U(T,V) + p\cdot V(T))}{\partial T}}$.

Similar relations arise when the system depends on further variables besides density or volume, e.g. on electric or magnetic moments.

From the third law it follows that both $\tfrac{\partial S}{\partial T}$ and $\tfrac{\partial S}{\partial V}$ must vanish sufficiently fast as $T \to 0$, and that (as can be shown) this is fulfilled only if at low temperatures not classical physics but quantum physics applies.

## Statistical physics

Statistical mechanics, founded among others by James Maxwell, explains the behavior of macroscopic thermodynamic systems by the microscopic behavior of their components, i.e. elementary particles and systems composed of them such as atoms and molecules. With regard to entropy the question arises how it can be interpreted here, and whether the time-directed second law can be derived from a microscopic, time-reversal-invariant theory.

A microstate is given classically by specifying all positions and momenta of the particles belonging to the system. Such a microstate $({\vec{q}}, {\vec{p}})$ is accordingly a point in a 6N-dimensional space, which in this context is called phase space. The canonical equations of classical mechanics describe the time evolution of the system, the phase trajectory. All phase points reachable under given macroscopic boundary conditions, such as total energy $E$, volume $V$, and particle number $N$, form a coherent phase-space volume $\Omega$.

Around 1880 Ludwig Boltzmann was able to find a quantity on a microscopic level that meets the definition of thermodynamic entropy:

${\displaystyle S = k_\mathrm{B} \ln \Omega}$

The constant $k_\mathrm{B}$ is the Boltzmann constant. Entropy is therefore proportional to the logarithm of the phase-space volume belonging to the values of the thermodynamic variables.

An equivalent formula is

${\displaystyle S = -k_\mathrm{B} \int w \ln(w)\,\mathrm{d}\Omega}$

(with $w$ the probability for the microstate when the variables belonging to the thermodynamic system are known, and the integral running over phase space with the natural measure $\mathrm{d}^{3N}q\,\mathrm{d}^{3N}p$). If, in the absence of other information, one takes the probability to be constant, $w = 1/\Omega$, with $\Omega$ the phase-space volume belonging to the values of the thermodynamic variables, this immediately leads to the "Boltzmann formula", since the integrand is independent of the phase-space point: $S = -k_\mathrm{B}\, w \ln(w) \int \mathrm{d}\Omega = k_\mathrm{B} \ln \Omega$ with $w = 1/\Omega = \text{const}$. The similarity of this expression to Shannon's measure of information suggests interpreting entropy as the information deficit about the microstate associated with knowledge of the macroscopic variables. The greater the entropy, the less we know about the microscopic state and the less information about the system is available.
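A quick numeric illustration of the Boltzmann formula: for a system of N independent two-state particles (an assumed toy model) the state count is $\Omega = 2^N$, so $S = N k_\mathrm{B} \ln 2$, i.e. entropy grows proportionally to system size (it is extensive):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the SI)

def boltzmann_entropy(ln_omega):
    """S = k_B * ln(Omega); taking ln(Omega) as input avoids overflow
    for astronomically large state counts like 2**(6e23)."""
    return K_B * ln_omega

# One mole of independent two-state particles, Omega = 2**N_A:
N_A = 6.02214076e23
S_mole = boltzmann_entropy(N_A * math.log(2.0))
print(S_mole)  # about 5.76 J/K, equal to R * ln 2
```

Working with $\ln \Omega$ instead of $\Omega$ itself is the standard trick, since $2^{6\cdot 10^{23}}$ cannot be represented as a floating-point number.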

The latter was worked out by E. T. Jaynes under the heading "information-theoretical entropy" into a concept that understands entropy as an epistemic (he called it "anthropomorphic") quantity, for example in the following quote:

"For example, I have been asked several times whether, in my opinion, a biological system, say a cat, which converts inanimate food into a highly organized structure and behavior, represents a violation of the second law. The answer I always give is that, until we specify the set of parameters which define the thermodynamic state of the cat, no definite question has been asked!"

- ET Jaynes

It becomes clear that entropy, like a thermodynamic system in general, is defined only through a selection of variables and depends on them. It cannot be assigned to a microstate. Critics object that entropy here appears to have the status of a subjective quantity, which is inappropriate in an objective description of nature.

### Proof of the second law

Boltzmann above all tried to derive the second law (that entropy can only increase) statistically. The intuitive idea is that during a mixing process something very probable happens, while the reverse process of unmixing would be very improbable. This had to be made mathematically precise; with his H-theorem he achieved a partial success here. However, "Loschmidt's reversibility objection" makes clear that microscopically every process could equally well run backwards, so a time-directed law cannot be derived microscopically. The recurrence theorem also calls the possibility of such a law into question.

Understood in terms of information theory, the second law means that the information about the microstate can only decrease when observing the macroscopic variables. Here the proof is much easier:

According to Liouville's theorem, the phase-space volume of the microstates associated with an initial value of the thermodynamic variables remains constant over time. If one also assumes that the description by thermodynamic variables is unambiguous, i.e. that all microstates end up macroscopically in the same final state, then the phase-space volume of the microstates associated with this final value of the thermodynamic variables cannot be smaller than the initial phase-space volume. It can, however, be larger, since not all microstates are necessarily "reached". The entropy can therefore only increase.

One can put it differently. A distinction is made between the von Neumann or "fine-grained" or "entanglement" entropy (that of microphysics, i.e. of systems correlated by wave mechanics) and the thermal entropy (the entropy of classical, macroscopic thermodynamics, also called "coarse-grained" entropy). Without correlation the entanglement entropy $S = k_B \ln N$ is zero ($N = 1$: only a single state at a time, the "pure state"). With entanglement (correlation) more states are available, and the entanglement entropy is greater than zero. In macrophysics one considers phase-space regions, such as a spherical volume around a point ("coarse-graining"), not individual points or microstates. The region of phase space of a system defined by initial conditions is consequently covered by spherical volumes that contain more phase-space points than the microscopic initial state. The fine-grained entropy is therefore always smaller than the coarse-grained entropy. This is the statement of the second law. The information corresponds to the difference between coarse-grained entropy and fine-grained entropy. Details can be found in the book by Susskind and Lindesay.

The temporal asymmetry of the second law concerns the knowledge about the system, not the ontology of the system itself. This avoids the difficulties of obtaining an asymmetrical law from a theory that is symmetric under time reversal. However, the proof also relies on the uniqueness of the thermodynamic description, which rests on stochastic arguments. To understand the temporal asymmetry of events in the world, a reference to the initial state of the universe is further necessary.

### Entropy as a "measure of disorder"

The picture shows, in simplified form, the states of a system in which four atoms can each be on the right or left side of a container. The columns are sorted by the total number of particles on the right and left, respectively. W indicates the number of possibilities in the respective category and is the phase-space volume $\Omega$ referred to in the previous section.

A vivid but scientifically imprecise interpretation of entropy is to understand it as a measure of disorder; see e.g. the chemistry textbook by Holleman-Wiberg. In particular, in the cup example of mixing above, the right-hand picture with complete mixing appears to most observers more ordered than the left-hand one with streaks, so it seems incomprehensible to describe the mixed state as the messier one of higher entropy.

However, this definition can be reconciled with the previous definitions using the picture on the right. The picture shows a container with four atoms in it, each of which can be on the right or the left side of the container. If one assumes that all 16 states are equally probable, then the probabilities for the individual columns are given by $p = \tfrac{W}{16}$, with $W$ from the figure indicating the number of states in the respective column.

Now assume that we could distinguish macroscopically how many atoms are on the left. The first column has N = 4 atoms on the left, the second N = 3 atoms on the left, and so on. The probability that all four atoms are on the left is then only $p = \tfrac{1}{16}$, while the middle column has the higher probability $p = \tfrac{6}{16}$. With the formula $S = k_\mathrm{B} \ln W$, the macroscopic state with N = 2 has the highest entropy.

In this picture one can now clearly see that the first and last columns are tidier than the cases in between with higher entropy. The following statements can now be made: if all 16 states are equally probable and one starts in the state N = 4, it is very likely that the next examination will find one of the states of higher entropy. However, the system can also move from the middle column to the first or the last; it is merely less probable to find the low-entropy state N = 4 than the state N = 2. In this purely statistical sense the system can also spontaneously pass into a state of lower entropy; it is just unlikely to do so.

This remains true when one considers $10^{23}$ atoms in a container. The probability that these spontaneously pass into the state of lower entropy, in which all atoms are on the left, cannot be ruled out, but it is extremely small.
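The four-atom counting exercise above can be reproduced by brute-force enumeration of all 16 microstates:

```python
import math
from itertools import product
from collections import Counter

# Enumerate all 16 microstates of 4 atoms, each either left (L) or right (R),
# and group them by the macroscopic variable "number of atoms on the left":
counts = Counter(state.count("L")
                 for state in map("".join, product("LR", repeat=4)))

for n_left in sorted(counts, reverse=True):
    W = counts[n_left]   # number of microstates realizing this macrostate
    p = W / 16           # probability if all microstates are equally likely
    print(n_left, W, p, math.log(W))  # last column: S / k_B = ln W

# The N=2 macrostate has W=6 and hence the highest entropy;
# N=4 ("all left") has W=1 and entropy S = k_B * ln 1 = 0.
```

The same enumeration with `repeat=N` shows why large systems never visibly leave the most probable macrostate: the middle bin grows as the binomial coefficient, overwhelming the extremes.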

## Entropy as a quantity

A local balance equation ( continuity equation ) can be formulated for entropy as an extensive quantity :

${\displaystyle \frac{\partial \rho_S}{\partial t} + \vec{\nabla} \cdot \vec{j}_S = \sigma}$

Here $\rho_S$ is the entropy density, $\vec{j}_S$ the current density, and $\sigma$ the generation rate per volume. In some physics textbooks, in particular those of the Karlsruhe physics course, an idea of entropy as an extensive, "quantity-like" quantity is suggested. According to Wilhelm Ostwald and Hugh Longbourne Callendar, what entropy measures can be identified with Carnot's caloricum. Since the caloricum largely coincides with the colloquial notion of heat, entropy can therefore also be understood as a measure of (colloquial) heat. Note that a generation rate $\sigma$ stands on the right-hand side of the continuity equation, which is why no conservation law can be derived from it.

Of two otherwise identical bodies, the one whose temperature is higher contains more entropy. If you combine two bodies into a single system, the total entropy is the sum of the entropies of both bodies.

If two bodies at different temperatures are in heat-conducting contact with one another, the temperature difference causes an entropy flow. Entropy flows out of the warmer body, which lowers its temperature. The colder body absorbs this entropy (and the additional entropy generated in this process), causing its temperature to rise. The process comes to a standstill when the temperatures of both bodies have become the same.
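The equalization process just described can be quantified for two identical bodies: the warmer one loses entropy, the colder one gains more, and the balance is positive. The material values below (iron, constant heat capacity) are illustrative assumptions:

```python
import math

def equalization_entropy(m, c, T_hot, T_cold):
    """Total entropy change when two identical bodies reach a common
    temperature by heat conduction (constant heat capacity assumed)."""
    T_final = 0.5 * (T_hot + T_cold)  # energy balance for equal m*c
    dS_hot = m * c * math.log(T_final / T_hot)    # negative: entropy flows out
    dS_cold = m * c * math.log(T_final / T_cold)  # positive, and larger
    return dS_hot + dS_cold

# Two 1 kg iron blocks (c ~ 450 J/(kg K) assumed constant) at 400 K and 300 K:
print(equalization_entropy(1.0, 450.0, 400.0, 300.0))  # positive, about 9.3 J/K
```

The result is positive for any pair of unequal temperatures, and zero exactly when the temperatures are already equal, in line with the second law.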

As a quantity, entropy can be compared to electrical charge (for which, however, a strict conservation law applies): A charged capacitor contains electrical charge, and thus electrical energy as well. During the discharge process, not only does electrical charge flow from one capacitor plate to the other via the circuit, but energy also flows from the capacitor to a consumer , where it can be converted into other forms of energy. Correspondingly, when heat is transferred from a hot to a cold body, in addition to thermal energy, another quantity-like quantity is transferred: entropy . Just as the potential difference between the capacitor plates - i.e. the electrical voltage - drives the electrical current, the temperature difference between the two reservoirs creates an entropy flow . If there is a heat engine between the two bodies , some of the heat can be converted into another form of energy. The heat transfer can therefore be described in a purely formal manner analogous to an electrical circuit, although the entropy newly generated during the process must also be taken into account.

An energy transfer, i.e. work $W$ or heat $Q$, can be written as the product of the change of a quantity-like extensive quantity and a conjugate intensive quantity. Examples of such pairs are the electric charge and the electric potential, or the volume and the (negative) pressure of a gas. By adding the extensive quantity, the intensive quantity increases (in general). For example, if one adds a small amount of charge $\mathrm{d}q$ to a capacitor plate at the potential $\varphi$, one performs the work $\delta W = \varphi\,\mathrm{d}q$ on the capacitor plate and thereby raises the potential of the plate. The same applies to other pairs of quantities. For thermal processes, the temperature $T$ corresponds to the potential and the entropy to the charge: if one adds the amount of entropy $\mathrm{d}S$ to a body, the heat $\delta Q = T\,\mathrm{d}S$ is transferred and the temperature rises (except at phase transitions).

Entropy is not a conserved quantity like the electric charge, because entropy can be generated. According to the second law of thermodynamics, however, it cannot be destroyed: $\Delta S \geq 0$. As long as the entropy of a closed system remains the same ($\Delta S = 0$), all processes taking place in the system are reversible. As soon, however, as entropy is generated ($\Delta S > 0$), which can happen e.g. through friction, through heating of an ohmic resistor, or through mixing processes, there is no way back without entropy being carried away through external action. One then speaks of irreversible processes.

Another special feature of entropy is that one cannot extract an arbitrary amount of entropy from a body. While charge can be added to and removed from the plate of a capacitor, making the potential positive or negative, there is a natural limit to the removal of entropy, namely the absolute zero of temperature. In particular, the absolute temperature can never become negative.

## Application examples

### Mixing of warm and cold water

The increase in entropy in a system that exchanges neither mass nor energy with its surroundings (closed system) is illustrated by the mixing of two amounts of water at different temperatures. Since the process is isobaric, the state variable enthalpy is used for the energy balance.

State variables for water according to the equations of the industry standard IAPWS-IF97 (Properties of Water and Steam)

System 10: mass m10 = 1 kg, pressure = 1 bar, temperature = 10 °C, enthalpy h10 = 42.12 kJ/kg, entropy s10 = 151.1 J/(kg·K); System 30: mass m30 = 1 kg, pressure = 1 bar, temperature = 30 °C, enthalpy h30 = 125.83 kJ/kg, entropy s30 = 436.8 J/(kg·K)

Irreversible mixing

The thermodynamic state after irreversible mixing (adiabatic, no work released) follows from the law of energy conservation:

HM = H10 + H30, hM = (m10·h10 + m30·h30)/(m10 + m30), hM = 83.97 kJ/kg


From the state variables enthalpy and pressure, the remaining state variables of the mixed state follow:

Temperature tM = 19.99 °C (293.14 K), entropy sM = 296.3 J/(kg·K)


Reversible mixing

With reversible mixing (dSirr = 0) the entropy of the overall system does not increase; it results from the sum of the entropies of the subsystems:

SM = S10 + S30 + dSirr, sM = 293.9 J/(kg·K)


From the state variables entropy and pressure, the remaining state variables of the mixed state follow:

Temperature tM = 19.82 °C (292.97 K), enthalpy hM = 83.26 kJ/kg


In this case, the overall system is no longer closed, but exchanges work with the environment.

Differences between irreversible and reversible mixing: entropy: 2.4 J/(kg·K), enthalpy: 0.71 kJ/kg, temperature: 0.17 K.

After irreversible mixing, the entropy of the overall system is 2.4 J/(kg·K) greater than in the reversible process. Reversible mixing could be achieved by using a Carnot engine: an infinitesimally small amount of energy would be extracted from the subsystem at the higher temperature and transferred as heat to the Carnot engine at the system boundary with an infinitesimally small temperature difference. In a corresponding way, the energy would be fed to the subsystem at the lower temperature. The temperatures of the two subsystems would thus gradually equalize, and the Carnot factor of the engine would tend from initially 0.066 towards zero. The Carnot engine would extract the otherwise devalued enthalpy difference of 0.71 kJ/kg from the overall system as mechanical work. In the irreversible case, this energy corresponds to the work dissipated within the system according to equation (3). Through the dissipated work, the entropy produced is raised from absolute zero to the temperature of 19.99 °C.
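The entropy balance of the irreversible mixing can be cross-checked with a simple sketch that replaces the IAPWS-IF97 property equations by a constant heat capacity for liquid water (an assumption; cp ≈ 4.19 kJ/(kg·K)):

```python
import math

# Constant-pressure heat capacity of liquid water (assumed constant here;
# the text above uses the full IAPWS-IF97 property equations instead)
CP = 4190.0  # J/(kg K)

def mix_irreversibly(m1, T1, m2, T2):
    """Adiabatic mixing without work extraction: the energy balance gives
    the mixed temperature, the entropy balance the produced entropy."""
    Tm = (m1 * T1 + m2 * T2) / (m1 + m2)           # energy conservation
    dS = m1 * CP * math.log(Tm / T1) + m2 * CP * math.log(Tm / T2)
    return Tm, dS                                   # dS > 0: produced entropy

# 1 kg at 10 °C mixed with 1 kg at 30 °C
Tm, dS = mix_irreversibly(1.0, 283.15, 1.0, 303.15)
print(f"mixed temperature: {Tm - 273.15:.2f} °C")
print(f"entropy produced:  {dS / 2.0:.2f} J/(kg·K)")  # per kg of mixture
```

With these inputs the sketch reproduces a mixing temperature of about 20 °C and an entropy production of about 2.4 J/(kg·K), in line with the tabulated difference between the irreversible and the reversible path.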

### Entropy of mixing

The entropy of mixing characterizes the "well-mixed" two-liquid state in the right-hand glass

The picture on the right shows the mixing of a brown dye in water. At the beginning the dye is unevenly distributed; after waiting long enough, the water takes on a uniform colour.

Entropy is a measure of ignorance, i.e. of the ignorance of the microscopic state the system under consideration is in. As a measure of disorder, one must pay attention to the terminology. In the picture example, the liquid in the right-hand glass is mixed "more thoroughly", yet there is greater disorder there because of the thorough intermingling of water and dye particles: there are more microscopically possible states the glass could be in. The entropy is therefore higher there than in the left-hand glass. About the dye we know only that it is distributed throughout the water in the right-hand glass. The picture on the left tells us more: we can make out areas of high dye concentration and areas free of dye.

Josiah Willard Gibbs pointed out the apparent contradiction that the increase in entropy should also occur if water is poured into the water glass instead of ink (Gibbs' paradox).

The number of arrangements of the dye molecules at the beginning is significantly smaller than when the dye can spread throughout the entire volume, because the dye molecules are initially concentrated in a few regions. In the right-hand picture they can reside anywhere in the glass. The entropy is greater there, which is why the system tends towards this uniform distribution over time.

### Increase in entropy with irreversible and reversible isothermal expansion

Two experiments, starting from the same initial state, reach the same final state via an isothermal expansion, and in both the same change in entropy occurs when the volume increases. One is the Gay-Lussac experiment; it serves as a model for Boltzmann's concept of entropy. The second experiment, which illustrates Clausius's formulation of entropy, is the isothermal expansion in the first step of the Carnot cycle.

#### Irreversible expansion, thermodynamic

Gay-Lussac experiment. An experiment with an ideal gas in a closed system shows that the initial temperature is restored after pressure and temperature equalization ($t_{2} = t_{1}$)

The figure opposite shows the overflow experiment of Gay-Lussac. Assuming an ideally closed system, the first law of thermodynamics states that the total energy of the system does not change ($\mathrm{d}U = 0$). For an ideal gas, therefore, there is no overall temperature change when it flows over into a larger volume.

Since entropy is a state variable, it is path-independent. Instead of opening the tap, one can also imagine the gas expanding slowly while a partition is slid to the right. For an infinitesimal shift the volume increases by $\mathrm{d}V$ and the entropy by $\mathrm{d}S = \frac{\delta Q}{T}$. From the first law $\mathrm{d}U = \delta Q + \delta W$, with $\mathrm{d}U = 0$, it follows that all the expansion work done by the gas must be returned to it in the form of heat. Hence $\delta Q = -\delta W$, from which $\delta Q = -(-p\,\mathrm{d}V)$ follows and thus

$\mathrm{d}S = \frac{p}{T}\,\mathrm{d}V$

From the equation of state for ideal gases ($N$ is the number of gas particles):

$p = \frac{k_{\mathrm{B}} N T}{V}$

follows:

$\mathrm{d}S = \frac{k_{\mathrm{B}} N}{V}\,\mathrm{d}V$.

This results in integration:

$\Delta S = \int_{S_{1}}^{S_{2}} \mathrm{d}S = \int_{V_{1}}^{V_{2}} \frac{k_{\mathrm{B}} N}{V}\,\mathrm{d}V = k_{\mathrm{B}} N \ln(V_{2}/V_{1})$.

For one mole of a gas, doubling the volume gives

$\Delta S = 1\,\mathrm{mol} \cdot N_{\mathrm{A}} k_{\mathrm{B}} \ln 2 = 5.76\,\mathrm{J/K}$

by inserting the numerical values of the Boltzmann constant $k_{\mathrm{B}} = 1.3807 \cdot 10^{-23}\,\mathrm{J/K}$ and the Avogadro constant $N_{\mathrm{A}} = 6.0220 \cdot 10^{23}\,\mathrm{mol}^{-1}$.

#### Irreversible expansion, statistical

The idea is based on the Gay-Lussac overflow experiment. A tap is opened and an ideal gas spreads spontaneously over twice the volume. According to Boltzmann, the corresponding entropy values are obtained from the statistical weights (= number of microstates) before and after the expansion: if $n$ molecules are distributed over the two halves of the volume such that $n_{1}$ are in one half and $n_{2}$ in the other, then the statistical weight of this macrostate is

$W(n_{1}, n_{2}) = \frac{n!}{n_{1}!\,n_{2}!}$

and the entropy of that state is $S(n_{1}, n_{2}) = k_{\mathrm{B}} \cdot \ln W$. If a whole mole ($n = N_{\mathrm{A}}$) is in one half (and nothing in the other), then

$W(N_{\mathrm{A}}, 0) = \frac{N_{\mathrm{A}}!}{N_{\mathrm{A}}!\,0!} = 1$

and the entropy

$S(N_{\mathrm{A}}, 0) = k_{\mathrm{B}} \cdot \ln 1 = 0$.

If the molecules are distributed evenly,

$W\left(\frac{N_{\mathrm{A}}}{2}, \frac{N_{\mathrm{A}}}{2}\right) = \frac{(6 \cdot 10^{23})!}{(3 \cdot 10^{23})!\,(3 \cdot 10^{23})!}$.

The factorial can be approximated with Stirling's formula, where one can restrict oneself to $n! \approx n^{n}$. The logarithm of $n^{n}$ is $n \cdot \ln n$. Thus

$\ln\left((6 \cdot 10^{23})!\right) \approx 6 \cdot 10^{23} \cdot \ln(6 \cdot 10^{23})$

and

$\ln\left[W\left(\frac{N_{\mathrm{A}}}{2}, \frac{N_{\mathrm{A}}}{2}\right)\right] \approx 6 \cdot 10^{23} \cdot \ln(6 \cdot 10^{23}) - 2 \cdot 3 \cdot 10^{23} \cdot \ln(3 \cdot 10^{23}) = 6 \cdot 10^{23} \cdot (\ln 6 - \ln 3)$.

One obtains for the entropy after the expansion

$S = k_{\mathrm{B}} \cdot \ln\left[W\left(\frac{N_{\mathrm{A}}}{2}, \frac{N_{\mathrm{A}}}{2}\right)\right] = 5.76\,\mathrm{J/K}$,
$\Delta S = 5.76\,\mathrm{J/K} - 0 = 5.76\,\mathrm{J/K}$.
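The numerical agreement between the statistical counting and the thermodynamic formula can be checked in a few lines (a sketch; it uses Stirling's approximation ln n! ≈ n ln n − n, for which the difference of the two logarithms comes out exactly as N_A ln 2):

```python
import math

K_B = 1.3807e-23   # Boltzmann constant, J/K
N_A = 6.0220e23    # Avogadro constant, 1/mol

# Thermodynamic route: Delta S = kB * N * ln(V2/V1) with V2/V1 = 2
dS_thermo = K_B * N_A * math.log(2)

# Statistical route: ln W for the even split of N_A molecules,
# using Stirling's approximation ln n! ~ n ln n - n
def ln_factorial(n):
    return n * math.log(n) - n

ln_W = ln_factorial(N_A) - 2 * ln_factorial(N_A / 2)
dS_stat = K_B * ln_W

print(f"thermodynamic: {dS_thermo:.3f} J/K")
print(f"statistical:   {dS_stat:.3f} J/K")
```

Both routes give 5.76 J/K for the doubling of the volume of one mole, as derived above.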

Since the particles exert no attractive forces on one another and the walls of the vessel are rigid, no work is done, not even against the external air pressure. The molecules hit the wall and are reflected without losing energy. The system is not in equilibrium during the overflow.

#### Reversible isothermal expansion

The second experiment corresponds to the isothermal expansion in the first step of the Carnot cycle. Heat is transferred from outside to the gas. In the course of the expansion, work is done by releasing the corresponding energy to a connected mechanism, which stores it externally as potential energy. None of this energy remains in the medium, and the system is at equilibrium at all times. The process is reversible. Clausius formulated the change in entropy in terms of this added heat as $\Delta S = \frac{Q_{\mathrm{rev}}}{T}$. In the reversible case, with $Q_{\mathrm{rev}} = nRT \cdot \ln\frac{V_{2}}{V_{1}}$, one also obtains $\Delta S = 5.76\,\mathrm{J/K}$ for one mole.

#### Numerical equivalence of the results

In Boltzmann's treatment of the overflow experiment, neither heat nor temperature occurs, hence no dimensioned quantity at all, but only the logarithm of the dimensionless statistical weight, $\ln W$. But since the change in entropy is the same as in the reversible case (same initial and final state; state function), Planck introduced the constant $k_{\mathrm{B}}$. With it, one obtains the same numerical result, with the same unit J/K, for the change in entropy in the irreversible isothermal expansion as for the reversible one from the experiment in which heat is added. In the real experiment, however, this is only the case once the system has reached equilibrium, i.e. the maximum statistical weight according to the Boltzmann distribution.

### Biomembranes

If lipids, which occur in living organisms as building blocks of biomembranes, for example, are added to water, spontaneously closed membrane structures, so-called vesicles, form. Since temperature and pressure are given (heat bath and pressure ensemble), the thermodynamic potential that tends towards a minimum is the free enthalpy, $\Delta G = \Delta H - T\Delta S$. The enthalpy $\Delta H$ can be measured experimentally and is positive. Since the process is spontaneous, $\Delta G$ must be negative; that is, the entropy must increase. At first glance this is confusing, because entropy is usually the reason why substances mix (entropy of mixing). The increase in entropy is due to a special property of water. It forms hydrogen bonds between the individual water molecules, which fluctuate constantly and thus make a high contribution to the entropy of the water. When the lipids are dissolved in water, a larger region is created around the long fatty-acid chains in which hydrogen bonds can no longer form. In the regions around the fatty-acid chains, the entropy contribution of the hydrogen bonds is missing, so that the entropy decreases overall. This decrease is significantly greater than the increase expected from simply mixing the water and the lipid. When the fatty-acid chains aggregate, more hydrogen bonds can form and the entropy increases. Another way of phrasing this is that water's ability to form fluctuating hydrogen bonds drives lipids out of solution. Ultimately, this property is also one of the reasons for the poor solubility of many non-polar substances that interfere with the formation of hydrogen bonds.

### Calculation and use of tabulated entropy values

The molar entropy S mol at a certain temperature T 2 and at constant pressure p is obtained with the help of the molar heat capacity c p ( T ) by integrating from absolute zero to the current temperature:

$S_{\mathrm{mol}} = \int_{0}^{T_{2}} \frac{c_{p}}{T}\,\mathrm{d}T = \int_{0}^{T_{2}} c_{p}\,\mathrm{d}(\ln T)$

In addition, there are entropy contributions from phase transitions. Following Planck, the entropy of ideally crystallized, pure solids is set to zero at absolute zero (mixtures or frustrated crystals, on the other hand, retain a residual entropy). Under standard conditions one speaks of the standard entropy S 0 . From the statistical point of view, too, entropy and heat capacity are related: a high heat capacity means that a molecule can store a lot of energy, which can e.g. be due to a large number of low-lying and therefore easily reachable energy levels. Correspondingly, there are many different ways of distributing the molecules over these levels, and this also leads to a high entropy value for the most probable state.
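The integration of $c_p/T$ from absolute zero can be sketched numerically. The heat capacity model below is hypothetical: a Debye-type law $c_p = aT^3$ is assumed, because the integral then has the closed form $S = aT^3/3$ against which the numerical result can be checked:

```python
# Molar entropy by integrating cp(T)/T from absolute zero (sketch).
# A Debye-type cp = a*T^3 is assumed so the result can be compared
# with the closed form S(T) = a*T^3 / 3.
A = 1.0e-5  # J/(mol K^4), hypothetical material constant

def cp(T):
    return A * T**3

def molar_entropy(T2, steps=100_000):
    """Trapezoidal integration of cp(T)/T dT from 0 to T2."""
    dT = T2 / steps
    s, prev = 0.0, 0.0          # cp/T -> 0 as T -> 0 for cp ~ T^3
    for i in range(1, steps + 1):
        T = i * dT
        cur = cp(T) / T
        s += 0.5 * (prev + cur) * dT
        prev = cur
    return s

T2 = 298.15
print(molar_entropy(T2), A * T2**3 / 3)  # numerical vs. closed form
```

For a real substance, the tabulated $c_p(T)$ data would replace the model function, and the entropies of any phase transitions up to $T_2$ would be added.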

In electrochemical reactions, the reaction entropy ∆S results from the measured change of the electromotive force E with temperature:

$\Delta S = z \cdot F \cdot \left(\frac{\mathrm{d}E}{\mathrm{d}T}\right)_{p}$ (z = number of charges transferred, F = Faraday constant)

The change in entropy in ideal mixtures is obtained with the help of the mole fractions x i of the substances involved:

$\Delta S_{\mathrm{id}} = -R \cdot \sum_{i=1}^{k} x_{i} \cdot \ln x_{i}$

In real mixtures there is an additional entropy contribution due to the change in the intermolecular forces during mixing.
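A minimal sketch of the ideal mixing-entropy formula (the mole fractions are illustrative values):

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def ideal_mixing_entropy(mole_fractions):
    """Delta S_id = -R * sum(x_i * ln x_i), per mole of mixture."""
    assert abs(sum(mole_fractions) - 1.0) < 1e-9
    return -R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Equimolar binary mixture: Delta S_id = R * ln 2 per mole
print(ideal_mixing_entropy([0.5, 0.5]))
```

Since every $x_i < 1$ makes $\ln x_i$ negative, the sum is always positive: ideal mixing never decreases the entropy.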

If new molecules arise during a chemical reaction, the highest entropy occurs in a particular state of equilibrium, in which the molecules can be distributed over both the educt and the product levels. The equilibrium constant K can be calculated via the following relation, in which the differences between the standard entropy values ∆S 0 of the substances involved play an essential role:

$\ln K = -\frac{\Delta H^{0} - T \cdot \Delta S^{0}}{RT}$

(Here, ∆ denotes the change in the respective quantity for complete conversion.) What determines the strength of the drive of a spontaneous process (e.g. chemical reactions, dissolution and mixing processes, establishment of phase equilibria and their temperature dependence, osmosis, etc.) is the increase in total entropy between the initial and the equilibrium state, that of the reactants and that of the surroundings taken together (→ chemical equilibrium ). The spontaneous increase in entropy, in turn, is a consequence of the constant movement of the molecules.
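The relation between the standard reaction quantities and the equilibrium constant can be sketched as follows; the values of ∆H⁰ and ∆S⁰ are hypothetical:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def equilibrium_constant(dH0, dS0, T):
    """ln K = -(dH0 - T*dS0) / (R*T), i.e. K = exp(-dG0 / RT)."""
    return math.exp(-(dH0 - T * dS0) / (R * T))

# Hypothetical reaction: dH0 = -40 kJ/mol, dS0 = -50 J/(mol K), T = 298.15 K
K = equilibrium_constant(-40_000.0, -50.0, 298.15)
print(K)  # K > 1: the equilibrium lies on the product side
```

In this example the favourable (negative) reaction enthalpy outweighs the unfavourable (negative) reaction entropy at room temperature; at sufficiently high T the T·∆S⁰ term would dominate and push K below 1.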

In short: The standard entropy of substances can be calculated from the course of the heat capacity with the temperature. Knowing tabulated entropy values ​​(together with the enthalpies of reaction) enables the chemical equilibrium to be predicted.

## Quantum mechanics

In quantum statistics, a microstate is a pure state given by a vector $|n\rangle$ in the Hilbert space $\mathcal{H}$ of the many-body system. As in classical statistical mechanics, this is a space with an extraordinarily large number of dimensions, even if only a few different energy eigenstates are available to each individual particle. For example, in nuclear magnetic resonance there are only two energy eigenstates for each proton spin, but together they span a Hilbert space whose dimension is $2$ raised to the number of protons in the sample (e.g. $10^{20}$ in a small water droplet). The associated macrostate is a mixed state, which is described by a statistical operator, also called a density operator.

This contains all information about the system that is accessible through an ideal measurement (which is much less than for the pure state $|n\rangle$, the microstate). The macrostate is classically given by an ensemble of those microstates that share certain "typical macroscopic quantities", such as energy, volume, and particle number. Classically, the distribution of the microstates in phase space is given by a distribution function; in the quantum mechanical description, the density operator takes its place:

$\rho = \sum_{i} p_{i}\,|i\rangle\langle i|$.

If the states $|i\rangle$ are all orthogonal, $p_{i}$ is the probability that the system under consideration is in the "pure" quantum mechanical state $|i\rangle$.

The expectation value of an observable in the mixed state described by the density operator is obtained by taking a trace:

$\langle A\rangle = \operatorname{Tr}\left(\rho A\right)$.

The trace of an operator is defined as $\operatorname{Tr}(A) = \sum_{m} \langle m|A|m\rangle$ for an arbitrary (complete) basis $\{|m\rangle\}$.

### Von Neumann entropy

The von Neumann entropy (after John von Neumann ) is defined as the expected value of the density operator:

$S = -\operatorname{Tr}\left(\rho \ln \rho\right) = -\langle \ln \rho \rangle$.

Multiplying this dimensionless von Neumann entropy by the Boltzmann constant $k_{\mathrm{B}}$ yields an entropy with the usual unit.

In terms of the probabilities $p_{i}$ of the individual pure quantum mechanical states $|i\rangle$ in the macrostate, the entropy is

$S = -k_{\mathrm{B}} \operatorname{Tr}\left(\rho \ln \rho\right) = -k_{\mathrm{B}} \sum_{i} p_{i} \ln p_{i} \qquad (*)$,

where $p_{i}$ is the probability of being in the $i$-th microstate. The probabilities can assume values between $0$ and $1$. (The singularity of the logarithm at $p_{i} = 0$ is irrelevant because $\lim_{x \to 0} x \ln x = 0$.) Thus $p_{i} \ln p_{i} \leq 0$, and the entropy $S = -k_{\mathrm{B}} \sum_{i} p_{i} \ln p_{i} \geq 0$ is positive semidefinite. If the mixture is in a pure state, one of these probabilities has the value $1$ while all others are zero; in this case the entropy is zero, its minimum value. Positive values of the entropy are obtained when more than one microstate has a non-zero probability.

As an example, consider a spin system of four electrons. Spin and magnetic moment are antiparallel, so the magnetic moment $\mu$ of a spin pointing down has the energy $-\mu B$ in the external magnetic field $B$. The total energy of the system is taken to be $E_{0} = -2\mu B$. This leads to the four microstates:

$[\uparrow\downarrow\downarrow\downarrow],\quad [\downarrow\uparrow\downarrow\downarrow],\quad [\downarrow\downarrow\uparrow\downarrow],\quad [\downarrow\downarrow\downarrow\uparrow].$

It follows that the spin degeneracy is $\Omega = 4$, with $p_{1} = p_{2} = p_{3} = p_{4} = \frac{1}{4}$, and as above, $S = k_{\mathrm{B}} \cdot \ln \Omega$ holds here as well.
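For a density operator that is diagonal in the chosen basis, formula (*) reduces to a sum over the probabilities; a short sketch reproduces both the pure-state minimum and the four-microstate spin example:

```python
import math

K_B = 1.3807e-23  # Boltzmann constant, J/K

def gibbs_entropy(probabilities):
    """S = -kB * sum(p_i * ln p_i); terms with p_i = 0 are skipped,
    consistent with the limit x*ln(x) -> 0 for x -> 0."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# Pure state: one microstate certain -> S = 0 (minimum)
print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))

# Four equally likely spin microstates -> S = kB * ln(4)
print(gibbs_entropy([0.25] * 4), K_B * math.log(4))
```

Dividing the result by $k_{\mathrm{B}} \ln 2$ would give the same quantity in bits, anticipating the link to Shannon's entropy below.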

The general formula (*) above is identical, except for a constant factor, to the formula for Shannon's information entropy. This means that the physical entropy is also a measure of the information about the microstate that is missing from knowledge of the macrostate.

$S = -k_{\mathrm{B}} \sum_{i} p_{i} \ln p_{i} = -k_{\mathrm{B}} \sum_{i} p_{i}\,\frac{\log_{2} p_{i}}{\log_{2} e} = \frac{k_{\mathrm{B}}}{\log_{2} e}\, S_{\text{Shannon}}$

### Properties of the statistical entropy of a quantum mechanical state

Let $\rho$ and $\tilde{\rho}$ be density operators on the Hilbert space $\mathcal{H}$.

• Inequality (for arbitrary density operators $\rho$ and $\tilde{\rho}$)
$S(\rho) = -k_{\mathrm{B}} \operatorname{Tr}\left(\rho \ln \rho\right) \leq -k_{\mathrm{B}} \operatorname{Tr}\left(\rho \ln \tilde{\rho}\right)$
• Invariance under unitary transformations of $\rho$ (with $UU^{\dagger} = 1$)
$S(U\rho U^{\dagger}) = S(\rho)$
• Minimum
$S(\rho) \geq 0$
The minimum is attained for pure states $\rho = |\Psi\rangle\langle\Psi|$
• Maximum
$S(\rho) \leq k_{\mathrm{B}} \ln(\operatorname{dim}\mathcal{H})$
The maximum is attained when all possible state vectors occur with equal probability $1/\operatorname{dim}\mathcal{H}$
• Concavity
$S\left(\lambda\rho + (1-\lambda)\tilde{\rho}\right) \geq \lambda S(\rho) + \left(1-\lambda\right) S\left(\tilde{\rho}\right)$   with   $0 \leq \lambda \leq 1$
• Triangle inequality
Let $\rho$ be a density operator on $\mathcal{H} = \mathcal{H}_{a} \otimes \mathcal{H}_{b}$, and $\rho_{a}$ and $\rho_{b}$ its reduced density operators on $\mathcal{H}_{a}$ and $\mathcal{H}_{b}$; then
$|S(\rho_{a}) - S(\rho_{b})| \leq S(\rho) \leq S(\rho_{a}) + S(\rho_{b})$

## Bekenstein-Hawking entropy of black holes

In his doctoral thesis, Jacob Bekenstein pointed out similarities between the physics of black holes and thermodynamics. Among other things, he compared the second law of thermodynamics, $\Delta S \geq 0$, with the observation that the surface area of a black hole always seems to grow when matter falls in and that no matter can escape. The entropy formula that resulted is

$S_{\mathrm{BH}} = \frac{k_{\mathrm{B}}\,c^{3} A}{4\hbar G}$,

Here $A$ is the surface area of the event horizon, $G$ the gravitational constant, $c$ the speed of light, $\hbar$ the reduced Planck constant, and $k_{\mathrm{B}}$ the Boltzmann constant.

Stephen Hawking objected that the black hole would then also have to have a temperature. A body with non-vanishing temperature, however, emits black-body radiation, which contradicts the assumption that nothing can escape from the black hole. Hawking resolved this paradox by postulating the Hawking radiation named after him: in the quantum mechanical description of the vacuum, vacuum fluctuations of particle-antiparticle pairs are constantly present. If, when a pair forms just outside the event horizon, one of the two partner particles is "captured" by the black hole while the other escapes, this corresponds physically to thermal radiation of the black hole. Regarding the reality of such thermal radiation, observers in different reference frames make different observations, i.e. measure temperature $0$ or an intrinsic temperature. Hawking discovered that an observer far away from a black hole with event horizon $r = 2MG/c^{2}$ measures the Schwarzschild temperature

$T = \frac{\hbar c^{3}}{8\pi G M\,k_{\mathrm{B}}}$

The investigation of a free quantum field theory in Rindler coordinates then led to the discovery of Hawking radiation as the evaporation of the black hole by particles with low angular momentum, while those with higher angular momentum are reflected from the walls of the hole.

The black hole can dissolve if the energy of the emitted Hawking radiation (which causes the mass of the black hole to decrease) exceeds the energy content of the incident matter for a sufficiently long period of time.

## Literature

Scripts
• Georg Job, Regina Rüffler: Physical chemistry. Part 1: Fundamentals of material dynamics. Eduard Job Foundation for Thermo- and Material Dynamics, September 2008, accessed on December 10, 2014 (in particular Chapter 2).
• F. Herrmann: Thermodynamics. (PDF; 12.87 MB) Physics III. Department for Didactics of Physics, University of Karlsruhe, September 2003, archived from the original ; accessed on June 6, 2020 .
Textbooks and review articles
• Klaus Stierstadt, Günther Fischer: Thermodynamics: From Microphysics to Macrophysics (Chapter 5) . Springer, Berlin, New York 2010, ISBN 978-3-642-05097-8 ( limited preview in Google book search).
• R. Frigg, C. Werndl: Entropy - A Guide for the Perplexed (PDF; 301 kB). In: C. Beisbart, S. Hartmann (Ed.): Probabilities in Physics. Oxford University Press, Oxford 2010. (Overview of the various entropy terms and their links).
• G. Adam, O. Hittmair : Heat theory. 4th edition. Vieweg, Braunschweig 1992, ISBN 3-528-33311-1 .
• Richard Becker : Theory of Heat. 3rd, supplementary edition. Springer, 1985, ISBN 3-540-15383-7 .
• Arieh Ben-Naim: Statistical Thermodynamics Based on Information: A Farewell to Entropy. 2008, ISBN 978-981-270-707-9 .
• Johan Diedrich Fast: Entropy. Huethig, 1982, ISBN 3-87145-299-8 .
• Ulrich Nickel: Textbook of Thermodynamics. A clear introduction. 3rd, revised edition. PhysChem, Erlangen 2019, ISBN 978-3-937744-07-0 .
• EP Hassel, TV Vasiltsova, T. Strenziok: Introduction to Technical Thermodynamics. FVTR GmbH, Rostock 2010, ISBN 978-3-941554-02-3 .
• Arnold Sommerfeld : Lectures on theoretical physics - thermodynamics and statistics. Reprint of the 2nd edition. Harri Deutsch, 1988, ISBN 3-87144-378-6 .
• Leonhard Susskind and James Lindesay: An Introduction to BLACK HOLES, INFORMATION and the STRING THEORY REVOLUTION, World Scientific, 2005, ISBN 978-981-256-083-4 .
• Andre Thess: The entropy principle - thermodynamics for the dissatisfied . Oldenbourg-Wissenschaftsverlag, 2007, ISBN 978-3-486-58428-8 .
• Wolfgang Glöckner, Walter Jansen, Hans Joachim Bader (eds.): Handbook of experimental chemistry. Secondary level II. Volume 7: Mark Baumann: Chemical energetics . Aulis Verlag Deubner, Cologne 2007, ISBN 978-3-7614-2385-1 .
• André Thess: What is entropy? An answer for the dissatisfied. In: Research in Engineering. Vol. 72, No. 1, January 17, 2008, pp. 11-17, doi:10.1007/s10010-007-0063-7.
Popular science presentations
• Arieh Ben-Naim: Entropy Demystified - The Second Law Reduced to Plain Common Sense. World Scientific, Expanded Ed., New Jersey 2008, ISBN 978-981-283-225-2 . (popular science but exact explanation based on statistical physics).
• H. Dieter Zeh: Entropy. Fischer, Stuttgart 2005, ISBN 3-596-16127-4 .
• Eric Johnson: Anxiety and the Equation: Understanding Boltzmann's Entropy. The MIT Press, Cambridge, Massachusetts 2018, ISBN 978-0-262-03861-4 .
• Jeremy Rifkin, Ted Howard: Entropy: A New World View. Viking Press, New York 1980 (German: Entropy: A New World View. Hofmann & Campe, Hamburg 1984).


1. Richard Becker : Theory of heat . Springer, Heidelberg 2013, p.  253 ( books.google.de [accessed on June 16, 2015] reprint from 1961).
2. Antoine Laurent Lavoisier : Oeuvres de Lavoisier: Traité élémentaire de chimie, opuscules physiques et chimiques, volume 1 . Ministre de L'instruction Publique et des Cultes, 1864 ( page 410, original from the Bayerische Staatsbibliothek, digitized December 8, 2009 - register entry from 1789).
3. ^ Roger Hahn: Pierre Simon Laplace, 1749-1827: A Determined Scientist . Harvard University Press, 2005, ISBN 0-674-01892-3 ( limited preview in Google Book Search).
4. ^ Joseph Black : Lectures on the Elements of Chemistry . Edinburgh, 1807 ( Original from Michigan State University, digitized Oct. 16, 2013 in Google Book Search - posthumous publication of lectures from 1760).
5. a b Pierre Kerszberg: Natural philosophy . In: Knud Haakonssen (Ed.): The Cambridge History of Eighteenth-Century Philosophy . tape 1 . Cambridge University Press, 2006, ISBN 0-521-86743-6 ( limited preview in Google Book Search).
6. James D. Stein: Cosmic Numbers: The Numbers That Define Our Universe . Basic Books, 2011 ( limited preview in Google Book Search).
7. ^ Sadi Carnot : Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance . Bachelier, 1824 ( original from Lyon Public Library, digitized 29 Sept. 2014 in Google Book Search).
8. Rudolf Clausius : About different, for the application convenient forms of the main equations of the mechanical heat theory . In: Annals of Physics and Chemistry . tape  125 , 1865, pp. 353-400 ( Textarchiv - Internet Archive [accessed April 24, 2019] also lecture to the Zurich Natural Research Society).
9. Rudolf Clausius : About the second law of mechanical heat theory . 1867 ( Original from Michigan State University, digitized June 29, 2007 in the Google Book Search - lecture given at a general meeting of the 41st meeting of German naturalists and doctors in Frankfurt am Main on September 23, 1867).
10. a b Wilhelm Ostwald : The energy . Publisher by Johann Ambrosius Barth, Leipzig 1908, p.  77 .
11. ^ A b Hugh Longbourne Callendar : Proceedings of the Royal Society of London. Series A: Containing Papers of a Mathematical and Physical Character . tape  134 , no. 825 , January 2, 1932, p. xxv ( snippet in Google Book search).
12. a b c d Gottfried Falk , Wolfgang Ruppel : Energy and Entropy . Springer-Verlag, 1976, ISBN 3-540-07814-2 .
13. Tomasz Downarowicz: Entropy . In: Scholarpedia . tape 2 , no. 11 , 2007, p. 3901 , doi : 10.4249 / scholarpedia.3901 (revision # 126991).
14. ^ Roman Frigg and Charlotte Werndl: Entropy - A Guide for the Perplexed. (PDF; 294 kB) June 2010, accessed on December 12, 2014 (English).
15. In the 1st law of thermodynamics, in contrast to the 2nd law, there is no such "integrating factor". The 1st law states that the sum of the added work ${\displaystyle \delta W}$ and the added heat ${\displaystyle \delta Q}$ always yields the complete differential ${\displaystyle \mathrm {d} U}$ of a state function, the so-called internal energy ${\displaystyle U}$, even though the two individual differentials are not complete. In contrast to the function ${\displaystyle S}$, no distinction is made for ${\displaystyle U}$ between reversible and irreversible heat supply.
16. Hans Dieter Baehr, Stephan Kabelac: Thermodynamics - Basics and technical applications. 16th edition. Springer Vieweg, Braunschweig 2016, ISBN 978-3-662-49567-4, sections 3.1.2 (The formulation of the 2nd law by postulates) and 3.1.3 (The entropy balance equation for closed systems), pp. 92–101.
17. E. T. Jaynes: Gibbs vs. Boltzmann entropies. In: American Journal of Physics. Vol. 33, no. 5, 1965, p. 398.
18. H. J. W. Müller-Kirsten: Basics of Statistical Physics. 2nd edition. World Scientific, 2013, ISBN 978-981-4449-53-3, pp. 28–30.
19. L. Susskind and J. Lindesay: An Introduction to Black Holes, Information and the String Theory Revolution. World Scientific, 2005, ISBN 978-981-256-083-4, pp. 69–77.
20. A. F. Holleman, E. Wiberg, N. Wiberg: Textbook of Inorganic Chemistry. 101st edition. Walter de Gruyter, Berlin 1995, ISBN 3-11-012641-9, p. 54.
21. W. A. Kreiner: Entropy - what is it? An overview. doi:10.18725/OPARU-2609
22. Gustav Jaumann: Closed system of physical and chemical differential laws. In: Sitzungsber. Akad. Wiss. Vienna, Nat.-Naturwiss. Class. IIA, no. 120, 1911, pp. 385–503.
23. Erwin Lohr: Entropy principle and closed system of equations. In: Memoranda of the Akad. Wiss. Vienna. Nat.-Naturwiss. Class, no. 93, 1916, pp. 339–421 (phys.huji.ac.il [PDF; accessed June 11, 2020]).
24. Georg Job: New presentation of the theory of heat - The entropy as heat. Akademische Verlagsgesellschaft, Frankfurt am Main (reprint from 1961).
25. Friedrich Herrmann: The Karlsruhe Physics Course. 9th edition. Part 1: Energy Momentum Entropy. Aulis Verlag, 2010, ISBN 978-3-7614-2517-6.
26. Hans Fuchs: The Dynamics of Heat. Springer, New York 2010.
27. Georg Job and Regina Rüffler: Physical Chemistry - An Introduction to a New Concept . Vieweg + Teubner, Wiesbaden 2011.
28. F. Herrmann: KPK - university scripts, thermodynamics. (PDF; 20.8 MB) 2015, accessed June 8, 2020.
29. This corresponds to Clausius' equation (2), which can also be put into words as follows: "In a closed thermodynamic system, the entropy increases until equilibrium is reached".
30. G. Wedler: Textbook of Physical Chemistry. Verlag Chemie, Weinheim, Deerfield Beach, Basel 1982, ISBN 3-527-25880-9, chap. 4.2, p. 632.
31. Jacob D. Bekenstein: Black holes and entropy. In: Phys. Rev. D, no. 7, 1973, pp. 2333–2346 (phys.huji.ac.il [PDF; accessed December 9, 2014]).
32. Stephen W. Hawking: Particle Creation by Black Holes. In: Commun. Math. Phys. Vol. 43, 1975, pp. 199–220, doi:10.1007/BF02345020.
33. Susskind, Lindesay: An Introduction to Black Holes, Information and the String Theory Revolution: The Holographic Universe. World Scientific, Singapore 2004, pp. 39–42.
34. Susskind, Lindesay: An Introduction to Black Holes, Information and the String Theory Revolution: The Holographic Universe. World Scientific, Singapore 2004, pp. 48–49.
35. Stephen W. Hawking: A Brief History of Time. 1st edition. Rowohlt Verlag, 1988, ISBN 3-498-02884-7 (limited preview in Google Book Search).