In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy $S$ is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant $k_\mathrm{B}$. Formally (assuming equiprobable microstates), $S = k_\mathrm{B} \ln \Omega$.
Macroscopic systems typically have a very large number of possible microscopic configurations. For example, the entropy of an ideal gas is proportional to the number of gas molecules $N$. Roughly twenty liters of gas at room temperature and atmospheric pressure contains about $6 \times 10^{23}$ molecules (Avogadro's number). At equilibrium, each of the possible configurations can be regarded as random and equally likely.
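To make the formula concrete, here is a minimal sketch (a toy model of $N$ two-state particles, not an example from the original text) evaluating $S = k_\mathrm{B} \ln \Omega$:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

# For N independent two-state particles there are Omega = 2**N microstates,
# so ln(Omega) = N*ln(2).  We work with ln(Omega) directly, since Omega
# itself overflows any float for macroscopic N.
N = 6.02214076e23            # one mole of particles (Avogadro's number)
ln_omega = N * math.log(2)   # ln(2**N)
S = K_B * ln_omega           # S = k_B * ln(Omega)
print(f"S = {S:.3f} J/K")    # ~5.763 J/K for a mole of two-state particles
```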
The second law of thermodynamics states that the entropy of an isolated system never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount so that the total entropy increases. Entropy is a function of the state of the system, so the change in entropy of a system is determined by its initial and final states. In the idealization that a process is reversible, the entropy does not change, while irreversible processes always increase the total entropy.
Because it is determined by the number of random microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder or randomness of a system, or of the lack of information about it. The concept of entropy plays a central role in information theory.
Boltzmann's constant, and therefore entropy, have dimensions of energy divided by temperature, with the unit joule per kelvin (J⋅K^{−1}) in the International System of Units (or kg⋅m^{2}⋅s^{−2}⋅K^{−1} in terms of base units). The entropy of a substance is usually given as an intensive property: either entropy per unit mass (SI unit: J⋅K^{−1}⋅kg^{−1}) or entropy per unit amount of substance (SI unit: J⋅K^{−1}⋅mol^{−1}).
The first law of thermodynamics, deduced from the heat-friction experiments of James Joule in 1843, expresses the concept of energy, and its conservation in all processes; the first law, however, is unable to quantify the effects of friction and dissipation.
In the 1850s and 1860s, German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction. Clausius described entropy as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state. This was in contrast to earlier views, based on the theories of Isaac Newton, that heat was an indestructible particle that had mass.
Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. In 1877 Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the natural logarithm of the number of microstates such a gas could occupy. Henceforth, the essential problem in statistical thermodynamics has been, according to Erwin Schrödinger, to determine the distribution of a given amount of energy E over N identical systems. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.
To derive the Carnot efficiency, which is $1 - \frac{T_C}{T_H}$ (a number less than one), Kelvin had to evaluate the ratio of the work output to the heat absorbed during the isothermal expansion with the help of the Carnot–Clapeyron equation, which contained an unknown function known as the Carnot function. The possibility that the Carnot function could be the temperature as measured from a zero temperature was suggested by Joule in a letter to Kelvin. This allowed Kelvin to establish his absolute temperature scale.
Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle, work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy, and it is central to the first law of thermodynamics.
Now equating the expression for the work from the Carnot efficiency, $W = \left(1 - \frac{T_C}{T_H}\right) Q_H$, with the expression from the first law, $W = Q_H - Q_C$, gives
$\frac{Q_H}{T_H} - \frac{Q_C}{T_C} = 0$
or
$\frac{Q_H}{T_H} = \frac{Q_C}{T_C}.$
This implies that there is a function of state which is conserved over a complete Carnot cycle. Clausius called this state function entropy. One can see that entropy was discovered through mathematics rather than through laboratory results. It is a mathematical construct and has no easy physical analogy. This makes the concept somewhat obscure or abstract, akin to how the concept of energy arose.
Clausius then asked what would happen if there should be less work produced by the system than that predicted by Carnot's principle. The right-hand side of the first equation would be the upper bound of the work output by the system, which would now be converted into an inequality: $W < \left(1 - \frac{T_C}{T_H}\right) Q_H.$
When the second equation is used to express the work as a difference in heats, we get $Q_H - Q_C < \left(1 - \frac{T_C}{T_H}\right) Q_H$, or $\frac{Q_C}{T_C} > \frac{Q_H}{T_H}$.
So more heat is given up to the cold reservoir than in the Carnot cycle. If we denote the entropies by $S_i = Q_i/T_i$ for the two states, then the above inequality can be written as a decrease in the entropy: $S_H - S_C < 0$, or $S_H < S_C$.
The entropy that leaves the system is greater than the entropy that enters the system, implying that some irreversible process prevents the cycle from producing the maximum amount of work predicted by the Carnot equation.
The Carnot cycle and efficiency are useful because they define the upper bound of the possible work output and the efficiency of any classical thermodynamic system. Other cycles, such as the Otto cycle, Diesel cycle and Brayton cycle, can be analyzed from the standpoint of the Carnot cycle. Any machine or process that converts heat to work and is claimed to produce an efficiency greater than the Carnot efficiency is not viable because it violates the second law of thermodynamics. For very small numbers of particles in the system, statistical thermodynamics must be used. The efficiency of devices such as photovoltaic cells requires an analysis from the standpoint of quantum mechanics.
While Clausius based his definition on a reversible process, there are also irreversible processes that change entropy. Following the second law of thermodynamics, the entropy of an isolated system always increases for irreversible processes. The difference between an isolated system and a closed system is that heat may not flow to and from an isolated system, but heat flow to and from a closed system is possible. Nevertheless, for both closed and isolated systems, and indeed, also in open systems, irreversible thermodynamic processes may occur.
According to the Clausius theorem, for a reversible cyclic process: $\oint \frac{\delta Q_\text{rev}}{T} = 0.$ This means the line integral $\int_L \frac{\delta Q_\text{rev}}{T}$ is path-independent.
So we can define a state function $S$ called entropy, which satisfies $dS = \frac{\delta Q_\text{rev}}{T}.$
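As a worked illustration (a standard textbook case with approximate values assumed, not taken from the original text): for a reversible transfer of heat at constant temperature, the definition integrates to $\Delta S = Q_\text{rev}/T$, e.g. for melting ice at its melting point:

```python
# Entropy change for a reversible, isothermal process: dS = dQ_rev / T
# integrates to Delta_S = Q_rev / T.  Example: melting 100 g of ice at
# 0 degrees C, with an approximate latent heat of fusion of 334 J/g.
mass_g = 100.0
latent_heat_J_per_g = 334.0   # approximate value for water ice
T = 273.15                    # K, melting point at 1 atm

Q_rev = mass_g * latent_heat_J_per_g
delta_S = Q_rev / T
print(f"Delta S = {delta_S:.1f} J/K")  # about 122 J/K
```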
Clausius coined the name entropy (Entropie) for $S$ in 1865. He gives "transformational content" (Verwandlungsinhalt) as a synonym, paralleling his "thermal and ergonal content" (Wärme- und Werkinhalt) as the name of internal energy, but preferring the term entropy as a close parallel of energy, formed by replacing the root of ἔργον "work" by that of τροπή "transformation". In his words (translated from the German, p. 390): "If one seeks a descriptive name for S, one could, similarly to how the quantity U is said to be the heat and work content of the body, say of the quantity S that it is the transformation content of the body. But since I hold it better to take the names of such scientifically important quantities from the ancient languages, so that they can be used unchanged in all modern languages, I propose to call the quantity S, after the Greek word ἡ τροπή, the transformation, the entropy of the body. I have deliberately formed the word entropy to be as similar as possible to the word energy, for the two quantities that these words are to denote are so closely related in their physical meanings that a certain similarity in their names seems to me appropriate."
To find the entropy difference between any two states of a system, the integral must be evaluated for some reversible path between the initial and final states.
Since entropy is a state function, the entropy change of the system for an irreversible path is the same as for a reversible path between the same two states. However, the entropy change of the surroundings will be different. We can only obtain the change of entropy by integrating the above formula. To obtain the absolute value of the entropy, we need the third law of thermodynamics, which states that S = 0 at absolute zero for perfect crystals.
From a macroscopic perspective, in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. In any process where the system gives up energy $\Delta E$, and its entropy falls by $\Delta S$, a quantity at least $T_R \Delta S$ of that energy must be given up to the system's surroundings as unusable heat ($T_R$ is the temperature of the system's external surroundings). Otherwise the process cannot go forward. In classical thermodynamics, the entropy of a system is defined only if it is in thermodynamic equilibrium.
The interpretation of entropy in statistical mechanics is the measure of uncertainty, or mixed-up-ness in the phrase of Gibbs, which remains about a system after its observable macroscopic properties, such as temperature, pressure and volume, have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates. In contrast to the macrostate, which characterizes plainly observable average quantities, a microstate specifies all molecular details about the system including the position and velocity of every molecule. The more such states available to the system with appreciable probability, the greater the entropy. In statistical mechanics, entropy is a measure of the number of ways in which a system may be arranged, often taken to be a measure of "disorder" (the higher the entropy, the higher the disorder).
Specifically, entropy is a logarithmic measure of the number of states with significant probability of being occupied: $S = -k_\mathrm{B} \sum_i p_i \ln p_i,$
or, equivalently, the expected value of the logarithm of the probability that a microstate will be occupied: $S = -k_\mathrm{B} \langle \ln p \rangle,$
where k_{B} is the Boltzmann constant, equal to $1.380649 \times 10^{-23}$ J⋅K^{−1}. The summation is over all the possible microstates of the system, and p_{i} is the probability that the system is in the ith microstate (Frigg, R. and Werndl, C., "Entropy – A Guide for the Perplexed", in Probabilities in Physics, Beisbart, C. and Hartmann, S., eds., Oxford University Press, Oxford, 2010). This definition assumes that the basis set of states has been picked so that there is no information on their relative phases. In a different basis set, the more general expression is $S = -k_\mathrm{B} \operatorname{Tr}\left(\widehat{\rho} \log \widehat{\rho}\right),$
where $\widehat{\rho}$ is the density matrix, $\operatorname{Tr}$ is the trace and $\log$ is the matrix logarithm. This density matrix formulation is not needed in cases of thermal equilibrium so long as the basis states are chosen to be energy eigenstates. For most practical purposes, this can be taken as the fundamental definition of entropy since all other formulas for $S$ can be mathematically derived from it, but not vice versa.
In what has been called the fundamental assumption of statistical thermodynamics or the fundamental postulate in statistical mechanics, the occupation of any microstate is assumed to be equally probable (i.e. $p_i = 1/\Omega$, where $\Omega$ is the number of microstates); this assumption is usually justified for an isolated system in equilibrium.
In thermodynamics, such a system is one in which the volume, number of molecules, and internal energy are fixed (the microcanonical ensemble).
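A minimal numerical sketch (illustrative, not from the original text) of the formula $S = -k_\mathrm{B} \sum_i p_i \ln p_i$, checking that the equiprobable postulate recovers the Boltzmann form:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * ln(p_i)); zero-probability states contribute 0."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Equiprobable case p_i = 1/Omega recovers the Boltzmann form k_B*ln(Omega):
omega = 1000
uniform = [1 / omega] * omega
assert math.isclose(gibbs_entropy(uniform), K_B * math.log(omega), rel_tol=1e-9)

# Any non-uniform distribution over the same states has lower entropy:
skewed = [0.5] + [0.5 / (omega - 1)] * (omega - 1)
print(gibbs_entropy(skewed) < gibbs_entropy(uniform))  # True
```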
The most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system. This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model.
The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has deep implications: if two observers use different sets of macroscopic variables, they see different entropies. For example, if observer A uses the variables U, V and W, and observer B uses U, V, W, X, then, by changing X, observer B can cause an effect that looks like a violation of the second law of thermodynamics to observer A. In other words: the set of macroscopic variables one chooses must include everything that may change in the experiment; otherwise one might see decreasing entropy.
Entropy can be defined for any Markov process with reversible dynamics and the detailed balance property.
In his 1896 Lectures on Gas Theory, Boltzmann showed that this expression gives a measure of entropy for systems of atoms and molecules in the gas phase, thus providing a measure for the entropy of classical thermodynamics.
In a thermodynamic system, pressure, density, and temperature tend to become uniform over time because the equilibrium state has higher probability (more possible microstates) than any other state.
As an example, for a glass of ice water in air at room temperature, the difference in temperature between the warm room (the surroundings) and the cold glass of ice and water (the system, not part of the room) begins to equalize as portions of the thermal energy from the warm surroundings spread to the cooler system of ice and water. Over time the temperature of the glass and its contents and the temperature of the room become equal. In other words, the entropy of the room has decreased as some of its energy has been dispersed to the ice and water.
However, as calculated in the example, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. In an isolated system such as the room and ice water taken together, the dispersal of energy from warmer to cooler always results in a net increase in entropy. Thus, when the "universe" of the room and ice water system has reached a temperature equilibrium, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.
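The sign of the net change can be checked with a small sketch (the numbers are assumed for illustration; the article's original worked values are not reproduced here):

```python
# Heat Q flows from the warm room (surroundings) into the ice water (system).
Q = 1000.0       # J transferred (assumed value)
T_room = 298.0   # K, warm surroundings
T_ice = 273.0    # K, ice-water system (approximately constant while melting)

dS_room = -Q / T_room    # surroundings lose entropy
dS_system = Q / T_ice    # system gains more entropy than that
print(f"net dS = {dS_room + dS_system:+.3f} J/K")  # +0.307 J/K > 0
```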
Thermodynamic entropy is a nonconserved state function that is of great importance in the sciences of physics and chemistry.
Unlike many other functions of state, entropy cannot be directly observed but must be calculated. Entropy can be calculated for a substance as the standard molar entropy from absolute zero (also known as absolute entropy) or as a difference in entropy from some other reference state defined as zero entropy. Entropy has the dimension of energy divided by temperature, with the unit joule per kelvin (J/K) in the International System of Units. While these are the same units as heat capacity, the two concepts are distinct. Entropy is not a conserved quantity: for example, in an isolated system with non-uniform temperature, heat might irreversibly flow and the temperature become more uniform, such that entropy increases. The second law of thermodynamics states that the entropy of a closed system may increase or otherwise remain constant. Chemical reactions cause changes in entropy, and entropy plays an important role in determining in which direction a chemical reaction spontaneously proceeds.
One dictionary definition of entropy is that it is "a measure of thermal energy per unit temperature that is not available for useful work". For instance, a substance at uniform temperature is at maximum entropy and cannot drive a heat engine. A substance at nonuniform temperature is at a lower entropy (than if the heat distribution is allowed to even out) and some of the thermal energy can drive a heat engine.
A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there is no net exchange of heat or work – the entropy change is entirely due to the mixing of the different substances. At a statistical mechanical level, this arises from the change in available volume per particle with mixing.
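For ideal mixing at the same temperature and pressure, the statistical result reduces to the standard formula $\Delta S_\text{mix} = -n R \sum_i x_i \ln x_i$; a sketch (assuming ideal behavior):

```python
import math

R = 8.314462618  # ideal gas constant, J/(mol K)

def entropy_of_mixing(moles):
    """Ideal entropy of mixing: -n_total * R * sum(x_i * ln(x_i))."""
    n_total = sum(moles)
    return -n_total * R * sum(
        (n / n_total) * math.log(n / n_total) for n in moles if n > 0
    )

# Mixing one mole each of two different ideal gases (x_i = 0.5 each):
print(f"{entropy_of_mixing([1.0, 1.0]):.2f} J/K")  # ~11.53 J/K = 2 R ln 2
```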
It follows from the second law of thermodynamics that the entropy of a system that is not isolated may decrease. An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), which the air conditioner transports and discharges to the outside air, always makes a bigger contribution to the entropy of the environment than the decrease of the entropy of the air of that system. Thus, the total entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics.
In mechanics, the second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.
The entropy change of a system at temperature $T$ absorbing an infinitesimal amount of heat $\delta q$ in a reversible way is given by $\delta q/T$. More explicitly, an energy $T_R S$ is not available to do useful work, where $T_R$ is the temperature of the coldest accessible reservoir or heat sink external to the system. For further discussion, see Exergy. Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in an isolated system. Although this is possible, such an event has a small probability of occurring, making it unlikely.
The applicability of the second law of thermodynamics is limited to systems near or in an equilibrium state. At the same time, the laws governing systems far from equilibrium are still debated. One of the guiding principles for such systems is the maximum entropy production principle, which claims that non-equilibrium systems evolve so as to maximize their entropy production.
Since both internal energy and entropy are monotonic functions of temperature T, implying that the internal energy is fixed when one specifies the entropy and the volume, the fundamental thermodynamic relation $dU = T\,dS - P\,dV$ is valid even if the change from one state of thermal equilibrium to another with infinitesimally larger entropy and volume happens in a non-quasistatic way (so during this change the system may be very far out of thermal equilibrium, and then the entropy, pressure and temperature may not exist).
The fundamental thermodynamic relation implies many thermodynamic identities that are valid in general, independent of the microscopic details of the system. Important examples are the Maxwell relations and the relations between heat capacities.
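As one concrete instance (a standard derivation, sketched here rather than quoted from the original): equality of the mixed second derivatives of the internal energy in the fundamental relation $dU = T\,dS - P\,dV$ yields a Maxwell relation.

```latex
% T and -P are the first derivatives of U(S, V):
%   T = (\partial U/\partial S)_V,   P = -(\partial U/\partial V)_S.
% Equality of the mixed second derivatives of U then gives
\frac{\partial^2 U}{\partial V\,\partial S}
  = \left(\frac{\partial T}{\partial V}\right)_{S}
  = -\left(\frac{\partial P}{\partial S}\right)_{V}.
```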
The thermodynamic entropy therefore has the dimension of energy divided by temperature, and the unit joule per kelvin (J/K) in the International System of Units (SI).
Thermodynamic entropy is an extensive property, meaning that it scales with the size or extent of a system. In many processes it is useful to specify the entropy as an intensive property independent of the size, as a specific entropy characteristic of the type of system studied. Specific entropy may be expressed relative to a unit of mass, typically the kilogram (unit: J⋅kg^{−1}⋅K^{−1}). Alternatively, in chemistry, it may be expressed per mole of substance, in which case it is called the molar entropy with a unit of J⋅mol^{−1}⋅K^{−1}.
Thus, when one mole of substance at about 0 K is warmed by its surroundings to 298 K, the sum of the incremental values of $q_\text{rev}/T$ constitute each element's or compound's standard molar entropy, an indicator of the amount of energy stored by a substance at 298 K.
Entropy change also measures the mixing of substances as a summation of their relative quantities in the final mixture. Entropy is equally essential in predicting the extent and direction of complex chemical reactions. For such applications, $\Delta S$ must be incorporated in an expression that includes both the system and its surroundings, $\Delta S_\text{universe} = \Delta S_\text{surroundings} + \Delta S_\text{system}$. This expression becomes, via some steps, the Gibbs free energy equation for reactants and products in the system: $\Delta G = \Delta H - T\,\Delta S$.
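A short sketch of how the sign of $\Delta G$ is used in practice (the numbers are assumed, chosen to resemble the melting of ice):

```python
def gibbs_free_energy(dH, T, dS):
    """Delta G = Delta H - T*Delta S; negative means spontaneous at constant T, P."""
    return dH - T * dS

# Assumed values resembling ice -> water: dH = +6010 J/mol, dS = +22.0 J/(mol K).
for T in (263.0, 273.15, 283.0):
    dG = gibbs_free_energy(6010.0, T, 22.0)
    verdict = "spontaneous" if dG < 0 else "non-spontaneous"
    print(f"T = {T:6.2f} K: dG = {dG:+7.1f} J/mol ({verdict})")
```

As expected, $\Delta G$ changes sign near the melting point: the process is non-spontaneous below it and spontaneous above it.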
To derive a generalized entropy balance equation, we start with the general balance equation for the change in any extensive quantity Θ in a thermodynamic system, a quantity that may be either conserved, such as energy, or non-conserved, such as entropy. The basic generic balance expression states that dΘ/dt, i.e. the rate of change of Θ in the system, equals the rate at which Θ enters the system at the boundaries, minus the rate at which Θ leaves the system across the system boundaries, plus the rate at which Θ is generated within the system. For an open thermodynamic system in which heat and work are transferred by paths separate from the paths for transfer of matter, using this generic balance equation, with respect to the rate of change with time $t$ of the extensive quantity entropy $S$, the entropy balance equation is:
$\frac{dS}{dt} = \sum_{k=1}^{K} \dot{M}_k \hat{S}_k + \frac{\dot{Q}}{T} + \dot{S}_\text{gen}$
where $\sum_{k=1}^{K} \dot{M}_k \hat{S}_k$ is the net rate of entropy flow due to the flows of mass into and out of the system (with $\hat{S}_k$ the entropy per unit mass), $\dot{Q}/T$ is the rate of entropy flow due to the flow of heat across the system boundary, and $\dot{S}_\text{gen}$ is the rate of entropy production within the system. The overdots represent derivatives of the quantities with respect to time.
Note, also, that if there are multiple heat flows, the term $\dot{Q}/T$ is replaced by $\sum_j \dot{Q}_j/T_j,$ where $\dot{Q}_j$ is the heat flow and $T_j$ is the temperature at the $j$th heat flow port into the system.
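A minimal sketch of evaluating this balance for an open system (all port names and values below are assumed for illustration):

```python
# dS/dt = sum_k Mdot_k*s_k  +  sum_j Qdot_j/T_j  +  Sdot_gen
mass_ports = [(0.5, 3.2), (-0.5, 3.5)]         # (Mdot kg/s, spec. entropy kJ/(kg K))
heat_ports = [(100.0, 600.0), (-80.0, 300.0)]  # (Qdot kW, port temperature K)
S_gen = 0.05                                   # kW/K, generated within the system

dS_dt = (sum(m * s for m, s in mass_ports)
         + sum(q / T for q, T in heat_ports)
         + S_gen)
print(f"dS/dt = {dS_dt:+.3f} kW/K")  # may have either sign for an open system
```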
For the isothermal expansion or compression of an ideal gas from volume $V_0$ to $V$, the entropy change is $\Delta S = n R \ln \frac{V}{V_0}.$ Here $n$ is the number of moles of gas and $R$ is the ideal gas constant. These equations also apply for expansion into a finite vacuum or a throttling process, where the temperature, internal energy and enthalpy for an ideal gas remain constant.
Similarly at constant volume, the entropy change is $\Delta S = n C_v \ln \frac{T}{T_0},$ provided that the constant-volume molar heat capacity $C_v$ is constant and that there is no phase change.
At low temperatures near absolute zero, heat capacities of solids quickly drop off to near zero, so the assumption of constant heat capacity does not apply.
Since entropy is a state function, the entropy change of any process in which temperature and volume both vary is the same as for a path divided into two steps – heating at constant volume and expansion at constant temperature. For an ideal gas, the total entropy change is $\Delta S = n C_v \ln \frac{T}{T_0} + n R \ln \frac{V}{V_0}.$
Similarly if the temperature and pressure of an ideal gas both vary, $\Delta S = n C_p \ln \frac{T}{T_0} - n R \ln \frac{P}{P_0}.$
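The two formulas can be checked for consistency with a short sketch (a monatomic ideal gas is assumed, so $C_v = \tfrac{3}{2}R$ and $C_p = \tfrac{5}{2}R$): doubling $T$ at constant pressure also doubles $V$, and both routes must give the same $\Delta S$.

```python
import math

R = 8.314462618  # ideal gas constant, J/(mol K)

def dS_T_V(n, Cv, T0, T, V0, V):
    """Entropy change when temperature and volume both vary (ideal gas)."""
    return n * Cv * math.log(T / T0) + n * R * math.log(V / V0)

def dS_T_P(n, Cp, T0, T, P0, P):
    """Entropy change when temperature and pressure both vary (ideal gas)."""
    return n * Cp * math.log(T / T0) - n * R * math.log(P / P0)

Cv, Cp = 1.5 * R, 2.5 * R  # monatomic ideal gas
a = dS_T_V(1.0, Cv, 300.0, 600.0, 1.0, 2.0)  # T doubled, V doubled
b = dS_T_P(1.0, Cp, 300.0, 600.0, 1.0, 1.0)  # same process, constant P
print(f"{a:.4f} J/K == {b:.4f} J/K")         # both 2.5*R*ln(2) ~ 14.41 J/K
```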
In Boltzmann's definition, entropy is a measure of the number of possible microscopic states (or microstates) of a system in thermodynamic equilibrium. Consistent with the Boltzmann definition, the second law of thermodynamics needs to be reworded to state that entropy increases over time, though the underlying principle remains the same.
Similarly, the total amount of "order" in the system is given by: $\text{Order} = 1 - \frac{C_O}{C_I},$
where C_{D} is the "disorder" capacity of the system, which is the entropy of the parts contained in the permitted ensemble, C_{I} is the "information" capacity of the system, an expression similar to Shannon's channel capacity, and C_{O} is the "order" capacity of the system.
Ambiguities in the terms disorder and chaos, which usually have meanings directly opposed to equilibrium, contribute to widespread confusion and hamper comprehension of entropy for most students. As the second law of thermodynamics shows, in an isolated system internal portions at different temperatures tend to adjust to a single uniform temperature and thus produce equilibrium. A recently developed educational approach avoids ambiguous terms and describes such spreading out of energy as dispersal, which leads to loss of the differentials required for work even though the total energy remains constant in accordance with the first law of thermodynamics (compare discussion in next section). Physical chemist Peter Atkins, for example, who previously wrote of dispersal leading to a disordered state, now writes that "spontaneous changes are always accompanied by a dispersal of energy".
Thus, the fact that the entropy of the universe is steadily increasing means that its total energy is becoming less useful: eventually, this will lead to the "heat death of the Universe".
In quantum statistical mechanics, the entropy is given by the von Neumann entropy $S = -k_\mathrm{B} \operatorname{Tr}(\rho \log \rho),$ where ρ is the density matrix and Tr is the trace operator.
This upholds the correspondence principle, because in the classical limit, when the phases between the basis states used for the classical probabilities are purely random, this expression is equivalent to the familiar classical definition of entropy, $S = -k_\mathrm{B} \sum_i p_i \log p_i,$ i.e. in such a basis the density matrix is diagonal.
Von Neumann established a rigorous mathematical framework for quantum mechanics with his work Mathematische Grundlagen der Quantenmechanik (Mathematical Foundations of Quantum Mechanics). In this work he provided a theory of measurement, in which the usual notion of wave function collapse is described as an irreversible process (the so-called von Neumann or projective measurement). Using this concept, in conjunction with the density matrix, he extended the classical concept of entropy into the quantum domain.
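A numerical sketch of the von Neumann entropy (illustrative, in units where $k_\mathrm{B} = 1$), computed from the eigenvalues of the density matrix:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho) via eigenvalues; units with k_B = 1."""
    p = np.linalg.eigvalsh(rho)   # rho is Hermitian, so eigenvalues are real
    p = p[p > 1e-12]              # 0*ln(0) -> 0 by convention
    return float(-np.sum(p * np.log(p)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: S = ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))  # 0.0  ~0.693
```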
In information theory, entropy is the measure of the amount of information that is missing before reception and is sometimes referred to as Shannon entropy.
Shannon entropy is a broad and general concept which finds applications in information theory as well as thermodynamics. It was originally devised by Claude Shannon in 1948 to study the amount of information in a transmitted message. The definition of the information entropy is, however, quite general, and is expressed in terms of a discrete set of probabilities $p_i$ so that $H = -\sum_i p_i \log p_i.$ In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of the average amount of information in a message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message.
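A sketch of the equal-probability case just described (illustrative): for $M$ equally likely messages, the Shannon entropy in bits is $\log_2 M$, the number of yes/no questions needed.

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 8 equally likely messages: H = log2(8) = 3 bits = 3 yes/no questions.
print(shannon_entropy_bits([1 / 8] * 8))   # 3.0
# A biased source carries less information per message:
print(shannon_entropy_bits([0.9, 0.1]))    # ~0.469 bits
```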
The question of the link between information entropy and thermodynamic entropy is a debated topic. While most authors argue that there is a link between the two, a few argue that they have nothing to do with each other. The expressions for the two entropies are similar. If $W$ is the number of microstates that can yield a given macrostate, and each microstate has the same a priori probability, then that probability is $p = 1/W$. The Shannon entropy (in nats) is then $H = -\sum_i p_i \ln p_i = \ln W$, and if entropy is measured in units of $k$ per nat, then the entropy is given by $S = k \ln W,$
which is the famous Boltzmann entropy formula when $k$ is Boltzmann's constant, and which may be interpreted as the thermodynamic entropy per nat. There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term "uncertainty" instead (Schneider, T., DELILA system (Deoxyribonucleic Acid Library Language), Information Theory Analysis of Binding Sites, Laboratory of Mathematical Biology, National Cancer Institute, Frederick, MD).
The resulting relation describes how entropy changes $dS$ when a small amount of energy $dQ$ is introduced into the system at a certain temperature $T$.
The process of measurement goes as follows. First, a sample of the substance is cooled as close to absolute zero as possible. At such temperatures, the entropy approaches zero, due to the definition of temperature. Then, small amounts of heat are introduced into the sample and the change in temperature is recorded, until the temperature reaches a desired value (usually 25 °C). The obtained data allow the user to integrate the equation above, yielding the absolute value of entropy of the substance at the final temperature. This value of entropy is called the calorimetric entropy.
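A numerical sketch of that procedure (the heat-capacity model below is assumed for illustration; real calorimetric data would replace it): integrate $C_p/T$ from near absolute zero to the final temperature.

```python
import numpy as np

# Assumed toy model of C_p(T): Debye-like a*T^3 at low T, then constant.
T = np.linspace(1.0, 298.15, 2000)   # K, from near absolute zero to 25 C
a, Cp_high = 1.94e-4, 25.0           # assumed parameters, J/(mol K^4), J/(mol K)
Cp = np.minimum(a * T**3, Cp_high)

# Calorimetric (absolute) entropy: S(T_f) = integral of C_p/T dT,
# evaluated here with the trapezoidal rule.
f = Cp / T
S = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(T)))
print(f"S(298.15 K) ~ {S:.1f} J/(mol K)")
```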
If the universe can be considered to have generally increasing entropy, then – as Roger Penrose has pointed out – gravity plays an important role in the increase because gravity causes dispersed matter to accumulate into stars, which collapse eventually into black holes. The entropy of a black hole is proportional to the surface area of the black hole's event horizon.
Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes, if they are totally effective matter and energy traps. However, the escape of energy from black holes might be possible due to quantum activity (see Hawking radiation). The role of entropy in cosmology has remained a controversial subject since the time of Ludwig Boltzmann. Recent work has cast some doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly, moving the universe further from the heat death with time, not closer.
Current theories suggest the entropy gap to have been originally opened up by the early rapid exponential expansion of the universe.
In economics, Georgescu-Roegen's work has generated the term 'entropy pessimism'. Since the 1990s, leading ecological economist and steady-state theorist Herman Daly, a student of Georgescu-Roegen, has been the economics profession's most influential proponent of the entropy pessimism position.

