Statistical Physics


Statistical physics is a branch of physics that uses methods of probability theory to describe physical systems. This enables statistical physics to make statements about the properties and behavior of large composite systems without following the behavior of each of their parts in detail. Typical statements of statistical physics have the character of probabilities, which, however, come ever closer to certainties as the number of parts of the system increases.

Statistical physics is mainly concerned with explaining the physical behavior of many-body systems such as solids, liquids and gases from the properties of their atoms and molecules. Its methods are also applied to many questions in other natural and engineering sciences such as biology, chemistry, neuroscience and process engineering, as well as in the social, economic and linguistic sciences (see sociophysics, econophysics, statistical linguistics).

Statistical physics is a fundamental physical theory. Starting from the simplest laws for the motion of the individual particles and with the help of a few additional physical hypotheses, it can, among other things, derive and justify the laws of thermodynamics, but also the statistical fluctuations around a steady state of equilibrium. Questions that are currently still open mainly concern irreversible processes, such as the calculation of transport coefficients from microscopic properties.

Statistical mechanics and statistical quantum mechanics are branches of statistical physics.

Basics

General

Statistical relationships can be formulated in physics wherever an observable physical quantity of an overall system depends on the current states of many of its subsystems, but these states are not known precisely. For example, 1 liter of water contains around 3.3 · 10²⁵ water molecules. In order to describe the flow of 1 liter of water in a pipe, it would be impractical to try to trace the paths of all 33,000,000,000,000,000,000,000,000 water molecules individually at the atomic level. It is sufficient to understand the behavior of the system on a large scale.

The basic approach is that the subsystems can behave in any way within their individual possibilities. In principle, the overall system could then also show a combination of macroscopic values that contradicts all previous observations; but this turns out to be so improbable that it can sensibly be ruled out. An example would be that in one liter of air all molecules spontaneously gather in one half of the volume; this would show up on average once if one looked about $10^{(10^{22})}$ times in succession.
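A rough order-of-magnitude sketch of this figure, assuming roughly $N \approx 2.7\cdot 10^{22}$ molecules in one liter of air under normal conditions, each of which is found in the chosen half with probability 1/2:

$$P = \left(\tfrac{1}{2}\right)^{N} = 10^{-N\log_{10}2} \approx 10^{-8\cdot 10^{21}},$$

so on average one would have to look of the order of $10^{10^{22}}$ times before observing such a configuration once.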

With such systems it is practically impossible to determine the current states of all subsystems in detail in order to infer the values of the observable quantities or the further behavior of the whole system, especially since these states also change much faster than the quantities observable in the whole system. It turns out, however, that knowledge of the details of all subsystems is often not needed at all if one wants to obtain practical statements about the behavior of the overall system.

On the basis of a few basic assumptions, which cannot themselves be proven further, statistical physics provides concepts and methods with which statements about the system as a whole can be made from the known laws for the behavior of its subsystems, down to the individual particles or quanta.

Statistical justification of thermodynamics

In the 18th and 19th centuries, the concepts and laws of classical thermodynamics were first obtained phenomenologically on macroscopic systems, primarily on systems in a state of equilibrium or not far from it. With statistical physics, they can today be traced back to the properties and behavior of the smallest particles of those systems (mostly atoms or molecules). For every state of the system defined by macroscopic values, referred to as a macrostate, there are always many possibilities of assigning states to the individual particles in such a way that, taken together, they produce the given macroscopic values of the system. The exact distribution of the particles over their individual states is called a microstate, and to every macrostate there belongs a certain set of microstates. Since the particles are in motion and undergo interaction processes internal to the system, no microstate is in general preserved over time. From a microscopic point of view it changes deterministically, but the outcome can only be predicted with a certain probability. If the macrostate is to be a temporally stable equilibrium state of the macroscopic system, this means that the microstate does not migrate out of the set of microstates belonging to this macrostate.

The thermodynamic equations of state, i.e. the laws governing the stable equilibrium state of a macroscopic system, can then be derived as follows: for a fictitiously assumed macrostate of the system, the set of associated microstates is determined. This set is determined for various macrostates, and the one is selected that, as a whole, does not change over time through the system-internal processes, or changes only with the minimum possible probability. The selection criterion is very simple: one chooses the largest of these sets.
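This selection criterion can be stated compactly with Boltzmann's entropy formula (a standard relation; $\Omega$ here denotes the number, or phase-space volume, of microstates belonging to a macrostate):

$$S = k_\mathrm{B} \ln \Omega ,$$

so that choosing the largest set of microstates is the same as maximizing the entropy $S$ under the given macroscopic constraints.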

In the case of any other macrostate, which is not an equilibrium state, the changes of the microstate through system-internal processes lead to gradual changes of macroscopic quantities, i.e. to other macrostates. In such a case, statistical physics can explain for many physical systems why this macroscopic change takes place as a relaxation towards equilibrium and how fast it proceeds.

In addition, this statistical treatment shows that the state of thermodynamic equilibrium is only stable when viewed macroscopically, but must exhibit fluctuations over time when viewed microscopically. These fluctuations are real, but in relative terms they become ever more insignificant the larger the system under consideration. In typical macroscopic systems they are many orders of magnitude below the achievable measurement accuracy and are therefore irrelevant for most applications of thermodynamics. With such statements, statistical physics goes beyond classical thermodynamics and allows its range of validity to be delimited quantitatively. The fluctuations explain phenomena such as critical opalescence and Brownian motion, which has been known since the beginning of the 19th century. More precise measurements of such fluctuations were carried out on mesoscopic systems at the beginning of the 20th century. The fact that these measurement results also agreed quantitatively with the predictions of statistical physics contributed significantly to its breakthrough and thus to the acceptance of the atomic hypothesis. It was also the consideration of such fluctuations that led Max Planck to his radiation formula and Albert Einstein to the light quantum hypothesis, from which quantum physics arose.

Basic assumptions of statistical treatment

The starting point is the microstate of a large physical system. In the domain of classical physics it is given by specifying the instantaneous positions and momenta of all its particles, i.e. microscopically; in the high-dimensional phase space of the system it occupies a single point. According to the general picture in the previous section, a measure for the size of a subset of phase space is required. In classical physics, the points of the individual microstates form a continuum in phase space. Since these points cannot be counted, the natural measure is given by the volume of the subset. To this end, one can imagine phase space divided into small volume elements, each containing equal sets of very similar states. If a volume element is to contain only one state, it is called a phase space cell.

In the domain of quantum physics, the microstate is given by a pure quantum mechanical state of the many-body system, as defined e.g. by a projection operator onto a 1-dimensional subspace of the Hilbert space of the entire system, or represented by a normalized vector from that subspace. The Hilbert space here is also the phase space. The dimension of the relevant subspace of the Hilbert space serves as the measure for a subset of states (if the basis is countable).

In the course of time, the point or state vector indicating the current microstate of the system wanders around in phase space, for example because the positions and velocities of the particles constantly vary or individual particles change from one energy level to another. All macroscopic variables of the system (such as volume and energy, but also quantities such as the center of mass and its velocity) could be calculated from the data of the current microstate, if these were completely known. In a macroscopic state of the system, the starting point of macroscopic thermodynamics, only these macroscopic values are given. A macrostate, whether in equilibrium or not, is realized by a certain set of many different microstates. Which of them is present at a given point in time is treated as a matter of chance, because it is practically impossible to determine it beforehand. In order to be able to calculate the probability for this whole set of microstates, a basic assumption is needed, according to the rules of probability theory, about the a priori probability with which a particular individual microstate occurs. This is:

  • Basic assumption for the a priori probability: In a closed system, all accessible microstates have the same a priori probability.

If the microstates form a continuum, this assumption refers not to a single point in phase space but to a volume element with microstates that belong to the same macrostate with sufficient accuracy: the a priori probability is proportional to the size of the volume element. This basic assumption cannot be proven, but it can be made plausible by means of the ergodic hypothesis put forward by Boltzmann: it is assumed that, for a closed system, the point of the respective microstate wanders around in the phase space of the system in such a way that it reaches every single microstate with the same frequency (or comes arbitrarily close to it). The choice of the volume as the measure of probability means, in effect, that not only the microstates but also their trajectories fill phase space with constant density.
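Written out, this basic assumption is the microcanonical distribution: if $\Omega(E)$ is the number of microstates accessible to a closed system with energy $E$ (within a small energy interval; cf. the remarks at the end of the article), then every accessible microstate $i$ has the probability

$$p_i = \frac{1}{\Omega(E)} .$$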

Since phase space comprises all possible microstates of the system, the microstates belonging to a given macrostate form a subset of it. The volume of this subset is the sought-for measure of the probability that the system is currently in this given macrostate. This volume is often referred to as the "number of possible states" belonging to the given macrostate, although in classical physics it is not a pure number but a quantity with a dimension, determined by a power of action that grows with the number of particles. Because the logarithm of this phase-space volume is required in the statistical formulas for thermodynamic quantities, it must still be converted into a pure number by relating it to the phase space cell. If the entropy of an ideal gas is calculated in this way, fitting to the measured values shows that the phase space cell (per particle and per degree of freedom of its motion) is exactly as large as Planck's constant $h$. The number indicating the probability typically takes on very large values, which is why, in contrast to the mathematical probability, it is also referred to as the thermodynamic probability. In quantum statistics, the volume is replaced by the dimension of the relevant subspace of the Hilbert space. Outside statistical physics, too, some quantum mechanical calculations of the phase-space volume use the approximation of determining it classically by integration and dividing the result by a corresponding power of the quantum of action.
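For $N$ identical classical particles, the dimensionless "number of states" is accordingly obtained from the phase-space volume by dividing by one factor of $h$ per degree of freedom (and by $N!$ for identical particles; cf. the Gibbs paradox discussed below):

$$\Omega = \frac{1}{N!\, h^{3N}} \int_{\text{macrostate}} \mathrm{d}^{3N}q\, \mathrm{d}^{3N}p ,$$

whose logarithm, multiplied by $k_\mathrm{B}$, reproduces the measured entropy of the ideal gas (Sackur-Tetrode equation).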

All macroscopic values of interest can be calculated as mean values of the corresponding microscopic quantities over the density distribution of the microstates in phase space.
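Explicitly, with a density function $\rho(q,p)$ normalized over phase space, a macroscopic quantity $A$ is obtained as the average

$$\langle A \rangle = \int A(q,p)\, \rho(q,p)\, \mathrm{d}^{3N}q\, \mathrm{d}^{3N}p , \qquad \int \rho(q,p)\, \mathrm{d}^{3N}q\, \mathrm{d}^{3N}p = 1 ;$$

in the quantum mechanical case the corresponding expression is $\langle A\rangle = \operatorname{Tr}(\hat\rho\, \hat A)$ with the density operator $\hat\rho$.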

Stable state of equilibrium

A state of equilibrium that is stable from a microscopic point of view cannot exist. With given macroscopic values of the system variables, the best approximation is provided by that macrostate which has the greatest possible probability. The success of statistical physics rests essentially on the fact that this criterion singles out the macrostate with extraordinary sharpness, provided that the system consists of a sufficiently large number of subsystems (cf. the law of large numbers). Even for small deviations, all other states lose probability to such an extreme degree that their occurrence can be neglected.

An example that illustrates this fact: which spatial density distribution is the most probable for the molecules of a classical gas? If $N$ molecules are in the volume $V$, of which a small part $\Delta V$ is being considered, there are $\binom{N}{N_{\Delta V}}$ ways of distributing the molecules such that $N_{\Delta V}$ of them are in the partial volume and $N - N_{\Delta V}$ in the remainder (binomial distribution). If the molecules have the same distribution as the others with regard to all other features of their states, this formula is already a measure of the number of states. The binomial distribution has the expected value $\langle N_{\Delta V}\rangle = N\,\Delta V/V$ and there a maximum with relative width $1/\sqrt{\langle N_{\Delta V}\rangle}$. For air under normal conditions with, e.g., $\Delta V = 1\,\mathrm{mm^3}$, this gives $\langle N_{\Delta V}\rangle \approx 2.7\cdot 10^{16}$ and a relative width of about $6\cdot 10^{-9}$. In the most probable macrostate, the spatial density on a millimeter scale therefore agrees with the mean value to better than 8-digit accuracy for about 2/3 of the time. Larger relative deviations also occur, but deviations of more than about five times this width occur only about $10^{-6}$ of the time (see normal distribution).
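A minimal numerical sketch of this estimate, assuming the number density of air under standard conditions (the Loschmidt constant, about $2.7\cdot 10^{25}\,\mathrm{m^{-3}}$); the variable names are only illustrative:

```python
# Numerical check of the density-fluctuation estimate for 1 mm^3 of air.
# Assumes the ideal-gas number density at 0 °C and 1 atm (Loschmidt constant).
import math

n_loschmidt = 2.687e25              # particles per m^3 under standard conditions
dV = 1e-9                           # 1 mm^3 expressed in m^3
N_mean = n_loschmidt * dV           # expected number of molecules in the sub-volume
rel_width = 1 / math.sqrt(N_mean)   # relative width of the binomial distribution peak

print(f"<N_dV>         ≈ {N_mean:.2e}")     # about 2.7e16 molecules
print(f"relative width ≈ {rel_width:.1e}")  # about 6e-9, i.e. constancy to ~8 digits
```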

Quantum statistics of indistinguishable particles

The statistical weight of a macrostate depends heavily on whether microstates that differ only by the interchange of two physically identical particles are counted individually. If this were the case, the formula for the entropy in statistical mechanics would contain a summand that is not additive in the number of particles (and is therefore wrong). This problem became known as the Gibbs paradox. It can be eliminated by an additional rule supplementing Boltzmann's way of counting: interchanges of identical particles are not to be counted. The deeper reason for this could only be supplied by quantum mechanics. According to it, one must fundamentally distinguish, for indistinguishable particles, whether their spin is integer (particle type boson) or half-integer (particle type fermion). For fermions there is the additional law that the same single-particle state cannot be occupied by more than one particle, while for bosons this number can be arbitrarily large. If these rules are observed, the uniform classical (or Boltzmann) statistics turns into the Fermi-Dirac statistics for identical fermions and the Bose-Einstein statistics for identical bosons. Both statistics differ seriously from each other and from classical statistics in the behavior of systems with several identical particles at low temperatures (for thermal radiation at every temperature, for the conduction electrons in a metal even at room temperature), and they do so for any number of particles.
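For non-interacting identical particles the difference can be stated compactly through the mean occupation number of a single-particle state with energy $\varepsilon$ in equilibrium at temperature $T$ and chemical potential $\mu$ (a standard result of quantum statistics):

$$\langle n(\varepsilon)\rangle = \frac{1}{e^{(\varepsilon-\mu)/k_\mathrm{B}T} + 1} \quad\text{(Fermi-Dirac)}, \qquad \langle n(\varepsilon)\rangle = \frac{1}{e^{(\varepsilon-\mu)/k_\mathrm{B}T} - 1} \quad\text{(Bose-Einstein)};$$

for $e^{(\varepsilon-\mu)/k_\mathrm{B}T} \gg 1$ both approach the classical Boltzmann distribution $e^{-(\varepsilon-\mu)/k_\mathrm{B}T}$.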

Connected systems, equilibria, ensembles

General

Also important are statements about non-closed systems that can exchange energy, particles, volume or some other physical quantity with their environment. In order to be able to use the methods described above, the environment is also regarded as a system (B) which, together with the system of interest (A), forms an overall system (A*B), which is now assumed to be closed. The surrounding system B is often referred to as a reservoir (in the case of energy exchange also as a heat bath). The exchanged quantities are to be conserved quantities, so that their total amount remains constant. To a macrostate of the overall system there then belongs a certain set of microstates, each of which is a pair of one microstate of system A and one of reservoir B. In each pair, the conserved quantities subject to the exchange can be divided up differently, but always in such a way that the values specified for the overall system are guaranteed. The division for which the number of such pairs is greatest determines the thermodynamic equilibrium state of the overall system A*B. This is a well-defined macrostate of the overall system. Not so the set of microstates of the considered system A that occur in the pairs belonging to the equilibrium of A*B: they belong to different macrostates of system A, because they can differ in macroscopic quantities of system A, namely in energy, number of particles etc., depending on the type of exchange considered. For the reservoir B it is assumed that it consists of so many particles and is in such a state that its properties are not noticeably changed by the exchange with system A.

The set of microstates of system A that occur in the stationary equilibrium state of the overall system is referred to as an ensemble (or totality). Specifically, the ensemble of microstates of a closed system in equilibrium is called the microcanonical ensemble, that of a system with energy exchange with its environment the canonical ensemble, and that of a system with energy and particle exchange the macrocanonical or grand canonical ensemble.

For each individual microstate of system A, a function $\rho$ (or, in the case of a quantum mechanical calculation, the corresponding matrix element of a density operator) indicates the frequency with which that microstate occurs in the ensemble. (In a system of classical particles, the argument of $\rho$ is a complete list of the canonical coordinates and conjugate momenta; in quantum mechanical calculations a matrix is given whose row and column indices run through all microstates of system A.) This function is called the density function (or density distribution, distribution function) of the ensemble, the matrix its density matrix. Determining such density functions or density matrices for concrete thermodynamic ensembles is a central task of statistical physics. The general solution: the sought frequency of a microstate of A in the equilibrium state is just equal to the probability of finding the reservoir B in a matching macrostate, such that system and reservoir together yield the fixed values of the macroscopic variables of the overall system A*B. This probability is therefore given by the number of possible microstates that the reservoir can then have in its phase space.
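For the two most important non-closed cases this general solution leads to the well-known standard forms of the density function (written here for classical microstates $x=(q,p)$ with energy $E(x)$ and particle number $N(x)$; the quantum mechanical density operators are analogous):

$$\rho_{\text{canonical}}(x) = \frac{e^{-E(x)/k_\mathrm{B}T}}{Z}, \qquad \rho_{\text{grand canonical}}(x) = \frac{e^{-(E(x)-\mu N(x))/k_\mathrm{B}T}}{Z_G},$$

where the partition functions $Z$ and $Z_G$ normalize the distributions, $T$ is the temperature and $\mu$ the chemical potential of the reservoir.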

In a somewhat different representation of these relationships, which is also often found in textbooks, the ensemble is described as a large number of copies of system A, each of which forms the same equilibrium state of the overall system A*B with the environment B, and among the copies each of the relevant microstates of A is represented with a frequency corresponding to its probability. If the ensemble is interpreted in this way, the desired temporal mean values of a system A result not from the temporal averaging over the evolution of this one system, but from the ensemble averages over all copies of A occurring in the ensemble. For, according to the ergodic hypothesis, the system must in the course of time assume all microstates of the ensemble with the probability given by the density function, i.e. for a corresponding fraction of the time. This is summarized as "ensemble average = time average" and is often also referred to as the ergodic hypothesis.

System with many particles

If system A itself also consists of a sufficiently large number of particles, it turns out for the equilibrium state of the overall system A*B that only an extremely small range of closely neighboring macrostates of A contributes the microstates that make up by far the largest proportion of all pairs in the equilibrium state. The single macrostate of system A which, with its microstates, makes the largest individual contribution is called the macrostate in which system A is in equilibrium with its environment B. The mentioned range of closely neighboring macrostates is the range of the frequent fluctuations of system A around this state of equilibrium with the environment B.

In this way, statistical physics arrives at independent fundamental interpretations of quantities such as temperature, chemical potential, etc., with which the equilibrium states of coupled systems are characterized in macroscopic thermodynamics. They are examples of state variables (a compact statistical formulation is sketched after the following list):

  • The temperature is the parameter that has to match between two systems so that there is equilibrium with regard to the exchange of energy.
  • The chemical potential has the same meaning for particle exchange.
  • The pressure has the same meaning for the expansion of one system at the expense of another.
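The compact statistical formulation mentioned above: with the entropy $S = k_\mathrm{B}\ln\Omega$ of an isolated system, the standard identifications read

$$\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N}, \qquad \frac{p}{T} = \left(\frac{\partial S}{\partial V}\right)_{E,N}, \qquad \frac{\mu}{T} = -\left(\frac{\partial S}{\partial N}\right)_{E,V},$$

and maximizing the total entropy of two coupled systems requires equality of $T$ for energy exchange, of $p$ for volume exchange and of $\mu$ for particle exchange.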

System with few particles

The relationships presented above also apply, with one exception, to small systems, down to individual particles, that form a state of thermodynamic equilibrium with their environment, e.g. to each individual particle of a large system. The particle is then the system A, the rest the environment B. The system A can even consist of just a single degree of freedom of a particle. The exception mentioned concerns the statement that the microstates of system A belonging to the equilibrium of the overall system lie close to a certain macrostate of A and that the probability of larger deviations quickly becomes negligibly small. The distribution, for large as well as small systems A, is given by the universally valid Boltzmann factor $e^{-E/(k_\mathrm{B}T)}$ ($k_\mathrm{B}$ is the Boltzmann constant, $T$ the absolute temperature of the equilibrium state, and $E$ the energy of the relevant microstate of A). In large systems, the energy scale $k_\mathrm{B}T$ of the Boltzmann factor is usually negligibly small compared to the excitation energies of the system in the range of interest; it characterizes the size of the thermal fluctuations. In a sufficiently small system, however, energy changes of this size have a significant effect on its behavior. Therefore the microstates of a small system A (e.g. with only one or a few particles) spread over the whole interesting (and accessible) part of its phase space when it is part of a large system A*B that is in equilibrium.

A striking example of this is the density distribution of the air. The system A is here formed by the height coordinate of a randomly selected molecule, and the Boltzmann factor with its potential energy in the gravitational field yields the barometric height formula. If one considers larger particles instead of individual molecules, their larger potential energy is decisive, and the height distribution shrinks in inverse proportion to the mass, in the case of macroscopic bodies so much that one no longer needs to speak of a height distribution at all.
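With the potential energy $E_\mathrm{pot} = mgh$ in the Boltzmann factor, the particle density as a function of height becomes (the barometric height formula, assuming constant temperature $T$):

$$n(h) = n(0)\, e^{-mgh/(k_\mathrm{B}T)} .$$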

This result is often used for large many-particle systems in order to determine the actual distribution of the individual particles over their possible single-particle states or energy levels. One need not pay attention here to the subtleties and different uses of the term ensemble, because a single system with a large number of (identical) particles already embodies a whole ensemble of physically identical systems A*B, where system A stands for any single particle and system B for the remainder, which is physically the same in each case.

History

Statistical physics evolved from the mechanical theory of heat advocated by Francis Bacon and Robert Boyle in the 17th and 18th centuries. As the first relevant application of mechanical laws, Daniel Bernoulli published a purely mechanical explanation of the law of Boyle and Mariotte in 1738. He interpreted the pressure of a gas as the time-averaged momentum transfer to the wall per area and time and calculated how large this value is on average for elastic collisions of the gas molecules. The pressure calculated in this way turns out to be proportional to the mass $m$, the mean squared speed $\overline{v^2}$ and the number density $n$ of the particles; it follows that $p = \tfrac{1}{3}\, n\, m\, \overline{v^2}$. The general gas equation then immediately gives a mechanical interpretation of the (absolute) temperature: it simply indicates the average kinetic energy of the particles, $\tfrac{1}{2} m \overline{v^2} = \tfrac{3}{2} k_\mathrm{B} T$, on its own scale. This interpretation of temperature applies strictly only to the ideal gas. It has, however, become widespread and is also the basis of the definition of the temperature unit K (kelvin) in force since 2019, by reference to the energy unit J (joule).

In the further course, Mikhail Wassiljewitsch Lomonossow, Georges-Louis Le Sage, John Herapath and John James Waterston developed the beginnings of kinetic gas theory from Bernoulli's approach, but these were largely ignored. Only from 1860 did kinetic gas theory find broader recognition through the work of Rudolf Clausius, James Clerk Maxwell and Ludwig Boltzmann. In 1860 Maxwell calculated the velocity distribution of the molecules in a gas and thereby introduced the concept of the distribution function. In 1872 Boltzmann was able to show with his H-theorem that all other distributions gradually approach the Maxwell distribution through statistically uncorrelated collisions of the particles. This provided a purely mechanical interpretation of the law of the irreversible increase of entropy, i.e. of the 2nd law of thermodynamics. In 1884, Boltzmann also published the idea, fundamental for statistical equilibrium, that an equilibrium state is characterized by a certain property of the distribution function. Around 1900, Josiah Willard Gibbs developed from this the concept of thermodynamic ensembles. At the same time, the frequency distribution of fluctuations was investigated more intensively, which led Max Planck to his radiation formula and Albert Einstein to the explanation of Brownian motion and of the quantum structure of light. All three discoveries are milestones on the way to modern physics.

At the same time, statistical mechanics was also strongly contested, even into the 20th century, among others by the eminent natural scientists Ernst Mach and Wilhelm Ostwald, since this theory depends entirely on the existence of atoms or molecules, which at that time was still regarded as a mere hypothesis.

In the 1920s, statistical physics was extended by, among others, Enrico Fermi and again Einstein, who discovered the two types of quantum statistics of indistinguishable particles, which explain essential properties of matter through their differences from classical Boltzmann statistics. Since computers have become available, transport problems and other questions of statistical physics have also increasingly been solved by direct calculation using Monte Carlo methods or molecular dynamics simulations (see the figure).

Figure: Simulation of a random walk in two dimensions with 229 steps and a random step size from the interval [−0.5; 0.5] for the x and y directions.
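A minimal sketch that generates a walk of the kind described in the caption (the seed and the printed quantity are only illustrative choices):

```python
# Two-dimensional random walk: 229 steps, each step component drawn
# uniformly from the interval [-0.5, 0.5] for the x and y directions.
import random

random.seed(1)                        # fixed seed only to make the sketch reproducible
x, y = 0.0, 0.0
path = [(x, y)]
for _ in range(229):
    x += random.uniform(-0.5, 0.5)    # random step in x
    y += random.uniform(-0.5, 0.5)    # random step in y
    path.append((x, y))

end_to_end = (x ** 2 + y ** 2) ** 0.5  # distance from the starting point
print(f"end-to-end distance after 229 steps: {end_to_end:.2f}")
```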

Remarks

  1. Exception: the macrostate at the absolute minimum of energy. Here it may be that only a single microstate is possible.
  2. Some representations allow an infinitesimal energy interval.

Literature

  • L. D. Landau, E. M. Lifschitz: Statistische Physik (vol. 6 of the textbook of theoretical physics), Akademie-Verlag Berlin, 1966
  • Klaus Stierstadt: Thermodynamics - From Microphysics to Macrophysics, Springer Verlag, 2010, ISBN 978-3-642-05097-8, e-ISBN 978-3-642-05098-5, doi:10.1007/978-3-642-05098-5
  • Wolfgang Nolting: Statistical Physics (Basic Course Theoretical Physics Vol. 6), 5th edition, Springer Verlag, 2007
  • Friedrich Hund : History of physical terms , part 2, BI university pocket books 544, 2.
  • Richard Becker: Theory of Heat. Heidelberg pocket books, photomechanical reprint of the corrected edition. Springer-Verlag, Berlin, Heidelberg, New York 1966.
  • Gerd Wedler , Hans-Joachim Freund : Textbook of physical chemistry . 6th edition. Wiley-VCH, Weinheim 2012, ISBN 978-3-527-32909-0 , 4. The statistical theory of matter, 5. Transport phenomena, 6. Kinetics - especially applications in chemistry.
  • Hermann Haken : Synergetics An introduction - non-equilibrium phase transitions and self-organization in physics, chemistry and biology . 2nd Edition. Springer, Berlin 1983, ISBN 3-540-12597-3 (especially non-equilibrium processes, also examples from chemistry, biology and stochastic processes in sociology and economics).
