Entropy (thermodynamics)
Revision as of 08:53, 10 November 2009
Entropy is a function of the state of a thermodynamic system. It is a size-extensive[1] quantity, invariably denoted by S, with dimension energy divided by temperature (SI unit: joule/K). Entropy has no analogous mechanical meaning, unlike volume, a similar size-extensive state parameter. Moreover, entropy cannot be measured directly: there is no such thing as an entropy meter, whereas state parameters like volume and temperature are easily determined. Consequently entropy is one of the least understood concepts in physics.[2]
The state variable "entropy" was introduced by Rudolf Clausius in 1865[3], when he gave a mathematical formulation of the second law of thermodynamics.
The traditional way of introducing entropy is by means of a Carnot engine, an abstract engine conceived in 1824 by Sadi Carnot[4] as an idealization of a steam engine. Carnot's work foreshadowed the second law of thermodynamics. The "engineering" manner—by an engine—of introducing entropy will be discussed below. In this approach, entropy is the amount of heat (per degree kelvin) gained or lost by a thermodynamic system that makes a transition from one state to another. The second law states that the entropy of an isolated system increases in spontaneous (natural) processes leading from one state to another, whereas the first law states that the internal energy of the system is conserved.
In 1877 Ludwig Boltzmann[5] gave a definition of entropy in the context of the kinetic gas theory, a branch of physics that developed into statistical thermodynamics. Boltzmann's definition of entropy was furthered by John von Neumann[6] to a quantum statistical definition. The quantum statistical point of view, too, will be reviewed in the present article. In the statistical approach the entropy of an isolated (constant energy) system is kB logΩ, where kB is Boltzmann's constant, Ω is the number of different wave functions ("microstates") of the system belonging to the system's energy (Ω is the degree of degeneracy; the system is equally likely to be in any one of the Ω microstates), and the function log stands for the natural (base e) logarithm.
Not satisfied with the engineering type of argument, the mathematician Constantin Carathéodory gave in 1909 a new axiomatic formulation of entropy and the second law of thermodynamics.[7] His theory was based on Pfaffian differential equations. His axiom replaced the earlier Kelvin-Planck and the equivalent Clausius formulation of the second law and did not need Carnot engines. Carathéodory's work was taken up by Max Born,[8] and it is treated in a few textbooks.[9] Since it requires more mathematical knowledge than the traditional approach based on Carnot engines, and since this mathematical knowledge is not needed by most students of thermodynamics, the traditional approach is still dominant in the majority of introductory works on thermodynamics.
Traditional definition
The state (a point in state space) of a thermodynamic system is characterized by a number of variables, such as pressure p, temperature T, amount of substance n, volume V, etc. Any thermodynamic parameter can be seen as a function of an arbitrary independent set of other thermodynamic variables, hence the terms "property", "parameter", "variable" and "function" are used interchangeably. The number of independent thermodynamic variables of a system is equal to the number of energy contacts of the system with its surroundings.
An example of a reversible (quasi-static) energy contact is offered by the prototype thermodynamical system, a gas-filled cylinder with piston. Such a cylinder can perform work on its surroundings,
:<math>
DW = p\,dV,
</math>
where dV stands for a small increment of the volume V of the cylinder, p is the pressure inside the cylinder, and DW stands for a small amount of work. Work by expansion is a form of energy contact between the cylinder and its surroundings. This process can be reverted: the volume of the cylinder is decreased, the gas is compressed, and the surroundings perform the work DW = pdV < 0 on the cylinder.
The small amount of work is indicated by D, and not by d, because DW is not necessarily a differential of a function. However, when we divide DW by p, the quantity DW/p becomes obviously equal to the differential dV of the differentiable state function V. State functions depend only on the actual values of the thermodynamic parameters (they are local in state space), and not on the path along which the state was reached (the history of the state). Mathematically this means that integration from point 1 to point 2 along path I in state space is equal to integration along a different path II,
:<math>
\int_{1,\,\mathrm{path\ I}}^{2} dV = \int_{1,\,\mathrm{path\ II}}^{2} dV = V_2 - V_1.
</math>
The amount of work (divided by p) performed reversibly along path I is equal to the amount of work (divided by p) along path II. This condition is necessary and sufficient that DW/p is the differential of a state function. So, although DW is not a differential, the quotient DW/p is one.
Reversible absorption of a small amount of heat DQ is another energy contact of a system with its surroundings; DQ is again not a differential of a certain function. In a completely analogous manner to DW/p, the following result can be shown for the heat DQ (divided by T) absorbed reversibly by the system along two different paths (along both paths the absorption is reversible):
:<math>
\int_{1,\,\mathrm{path\ I}}^{2} \frac{DQ}{T} = \int_{1,\,\mathrm{path\ II}}^{2} \frac{DQ}{T}. \qquad\qquad(1)
</math>
Hence the quantity dS defined by
:<math>
dS \equiv \frac{DQ}{T}
</math>
is the differential of a state variable S, the entropy of the system. In the next subsection equation (1) will be proved from the Kelvin-Planck principle. Observe that this definition of entropy only fixes entropy differences:
:<math>
S_2 - S_1 = \int_1^2 \frac{DQ}{T}.
</math>
Note further that entropy has the dimension energy per degree temperature (joule per degree kelvin) and recalling the first law of thermodynamics (the differential dU of the internal energy satisfies dU = DQ − DW), it follows that
:<math>
dS = \frac{dU + p\,dV}{T}.
</math>
(For convenience's sake only a single work term was considered here, namely DW = pdV, work done by the system.) The internal energy is an extensive quantity. The temperature T is an intensive property, independent of the size of the system. It follows that the entropy S is an extensive property. In that sense the entropy resembles the volume of the system. We reiterate that volume is a state function with a well-defined mechanical meaning, whereas entropy is introduced by analogy and is not easily visualized. Indeed, as is shown in the next subsection, it requires a fairly elaborate reasoning to prove that S is a state function, i.e., that equation (1) holds.
Proof that entropy is a state function
Equation (1) gives the sufficient condition that the entropy S is a state function. The standard proof of equation (1), as given now, is physical, by means of an engine making Carnot cycles, and is based on the Kelvin-Planck formulation of the second law of thermodynamics.
Consider the figure. A system, consisting of an arbitrary closed system C (only heat goes in and out) and a reversible heat engine E, is coupled to a large heat reservoir R of constant temperature T0. The system C undergoes a cyclic state change 1-2-1. Since no work is performed on or by C, it follows that
:<math>
DQ = dU_\mathrm{C} \quad\Longrightarrow\quad \oint DQ = \oint dU_\mathrm{C} = 0,
</math>
where DQ is the heat delivered by E to C.
For the heat engine E it holds (by the definition of thermodynamic temperature) that
:<math>
\frac{DQ_0}{T_0} = \frac{DQ}{T}.
</math>
Hence
:<math>
\oint DQ_0 = T_0 \oint \frac{DQ}{T}.
</math>
From the Kelvin-Planck principle it follows that W is necessarily less than or equal to zero, because there is only the single heat source R from which W is extracted. Invoking the first law of thermodynamics we get,
:<math>
W = \oint DQ_0 - \oint DQ = \oint DQ_0,
</math>
so that
:<math>
W = T_0 \oint \frac{DQ}{T} \le 0 \quad\Longrightarrow\quad \oint \frac{DQ}{T} \le 0.
</math>
Because the processes inside C and E are assumed reversible, all arrows can be reverted and in the very same way it is shown that
:<math>
\oint \frac{DQ}{T} \ge 0,
</math>
so that equation (1) holds (with a slight change of notation, subscripts are transferred to the respective integral signs):
:<math>
\oint \frac{DQ}{T} = 0 \quad\Longrightarrow\quad \int_{1,\,\mathrm{path\ I}}^{2} \frac{DQ}{T} = \int_{1,\,\mathrm{path\ II}}^{2} \frac{DQ}{T}.
</math>
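The path independence of ∫DQ/T can also be illustrated numerically. The following sketch (not part of the proof; the state values and the monatomic heat capacity CV = 3R/2 are illustrative assumptions) integrates DQ/T for one mole of an ideal gas along two different paths connecting the same pair of states:

```python
import math

R = 8.314      # molar gas constant, J/(mol K)
Cv = 1.5 * R   # heat capacity at constant volume, monatomic ideal gas (assumption)

def dQ_over_T_isochoric(T1, T2, n=100000):
    # At constant V: DQ = Cv dT, so DQ/T = Cv dT/T (midpoint-rule quadrature)
    total, dT = 0.0, (T2 - T1) / n
    for i in range(n):
        T = T1 + (i + 0.5) * dT
        total += Cv / T * dT
    return total

def dQ_over_T_isothermal(T, V1, V2, n=100000):
    # At constant T: dU = 0, so DQ = p dV = (R T / V) dV and DQ/T = R dV/V
    total, dV = 0.0, (V2 - V1) / n
    for i in range(n):
        V = V1 + (i + 0.5) * dV
        total += R / V * dV
    return total

# State 1: (T1, V1); state 2: (T2, V2) -- arbitrary illustrative values
T1, V1, T2, V2 = 300.0, 0.010, 450.0, 0.025

# Path I: heat at constant V1, then expand at constant T2
path_I = dQ_over_T_isochoric(T1, T2) + dQ_over_T_isothermal(T2, V1, V2)
# Path II: expand at constant T1, then heat at constant V2
path_II = dQ_over_T_isothermal(T1, V1, V2) + dQ_over_T_isochoric(T1, T2)

print(path_I, path_II)  # both equal Cv*ln(T2/T1) + R*ln(V2/V1)
```

Both paths give the same entropy difference, as equation (1) demands.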
Entropy of an ideal gas
The equation of state of one mole of an ideal gas is
:<math>
pV = RT, \qquad\qquad(\mathrm{E1})
</math>
where R is the molar gas constant, p the pressure, and V the volume of the gas. Note that the limit T → 0 implies V → 0, so that the zero temperature limit of an ideal gas is ill-defined.
The entropy of one mole of an ideal gas depends on the molar gas constant R and the molar heat capacity at constant volume, CV,
:<math>
S(T,V) = C_V \log(T) + R \log(V) + C_0 = R\log\big( T^{\frac{C_V}{R}}\, V\big) + C_0,
</math>
where C0 is a constant independent of T, V, and p. From statistical thermodynamics it is known that for an atomic ideal gas CV = 3R/2, so that the exponent for T becomes 3/2. For a diatomic ideal gas CV = 5R/2 and for an ideal gas of arbitrarily shaped molecules CV = 3R. For an ideal gas CV is constant, independent of T, V, or p.
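As a small numerical illustration (the two states chosen are arbitrary), the two equivalent forms of the ideal gas entropy can be evaluated; the constant C0 drops out of entropy differences:

```python
import math

R = 8.314        # molar gas constant, J/(mol K)
Cv = 1.5 * R     # atomic (monatomic) ideal gas: Cv = 3R/2

def S_minus_C0(T, V):
    # S(T,V) - C0 = Cv*log(T) + R*log(V)
    return Cv * math.log(T) + R * math.log(V)

def S_minus_C0_compact(T, V):
    # equivalent compact form: R*log(T**(Cv/R) * V)
    return R * math.log(T**(Cv / R) * V)

# The constant C0 cancels in an entropy difference between two states:
dS = S_minus_C0(450.0, 0.025) - S_minus_C0(300.0, 0.010)
print(dS)  # equals Cv*ln(450/300) + R*ln(0.025/0.010)
```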
The ideal gas entropy follows easily by substituting the ideal gas law (E1) into the following general differential equation for the entropy as function of T and V—valid for any thermodynamic system,
:<math>
dS = \frac{C_V}{T}\, dT + \left(\frac{\partial p}{\partial T}\right)_V dV. \qquad\qquad(\mathrm{E2})
</math>
Integration gives
:<math>
\begin{align}
\int_1^2 dS &= C_V\, \int_1^2 \frac{dT}{T} + R\, \int_1^2 \frac{dV}{V} \Longrightarrow \\
S_2-S_1 &= C_V\log(T_2) + R \log(V_2) - C_V \log(T_1) - R\log(V_1).
\end{align}
</math>
Write
:<math>
C_0 \equiv S_1 - C_V \log(T_1) - R\log(V_1) \quad\hbox{and}\quad S_2 \equiv S,\; T_2\equiv T,\; V_2\equiv V,
</math>
and the result follows.
Proof of differential equation for S(T,V)
The proof of the differential equation (E2) follows by some typical classical thermodynamic calculus.
First, the internal energy at constant volume follows thus,
:<math>
dU = \left(\frac{\partial U}{\partial T}\right)_V dT + \left(\frac{\partial U}{\partial V}\right)_T dV\; \underset{\scriptstyle\mathrm{constant}\; V}{\Longrightarrow}\; dU = \left(\frac{\partial U}{\partial T}\right)_V dT .
</math>
The definition of heat capacity and the first law (DQ = dU + pdV; for constant volume: DQ = dU) give,
:<math>
DQ \equiv C_V\, dT = dU = \left(\frac{\partial U}{\partial T}\right)_V dT,
</math>
so that the heat capacity at constant volume is given by
:<math>
C_V = \left(\frac{\partial U}{\partial T}\right)_V.
</math>
The first and second law combined (TdS = dU + pdV) gives
:<math>
dS = \underbrace{\frac{C_V}{T}}_{\frac{\partial S}{\partial T}} dT +
\underbrace{\frac{1}{T} \left[\left(\frac{\partial U}{\partial V}\right)_T + p\right]}_{\frac{\partial S}{\partial V}} dV. \qquad\qquad(\mathrm{E}3)
</math>
From
:<math>
\frac{\partial}{\partial V} \frac{\partial S}{\partial T} = \frac{\partial}{\partial T} \frac{\partial S}{\partial V}
</math>
and
:<math>
\frac{\partial}{\partial V} \frac{\partial S}{\partial T} = \frac{\partial}{\partial V}\frac{C_V}{T}
= \frac{1}{T} \frac{\partial C_V}{\partial V} = \frac{1}{T} \frac{\partial^2 U}{\partial V\,\partial T}
</math>
and
:<math>
\frac{\partial}{\partial T} \frac{\partial S}{\partial V} =
\frac{\partial}{\partial T} \frac{1}{T}
\left[\left( \frac{\partial U}{\partial V} \right)_T + p\right]
=
-\frac{1}{T^2} \left[ \left(\frac{\partial U}{\partial V}\right)_T +p\right] +
\frac{1}{T}\left[ \left(\frac{\partial p}{\partial T}\right)_V + \frac{\partial^2 U}{\partial T\,\partial V} \right]
</math>
follows
:<math>
0 = -\frac{1}{T^2} \left[ \left(\frac{\partial U}{\partial V}\right)_T +p\right] +
\frac{1}{T} \left(\frac{\partial p}{\partial T}\right)_V
\quad\Longrightarrow\quad
\left(\frac{\partial U}{\partial V}\right)_T = -p + T \left(\frac{\partial p}{\partial T}\right)_V.
</math>
Substitute the very last equation into equation (E3), and the equation to be proved follows,
:<math>
dS = \frac{C_V}{T}\, dT + \left(\frac{\partial p}{\partial T}\right)_V dV.
</math>
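The ingredients of (E2) can be checked numerically for the ideal gas (one mole, illustrative state values): a finite-difference estimate of (∂p/∂T)_V reproduces R/V, and (∂U/∂V)_T = −p + T(∂p/∂T)_V vanishes, showing that the internal energy of an ideal gas does not depend on its volume:

```python
R = 8.314   # molar gas constant, J/(mol K)

def p(T, V):
    # ideal gas law (E1): p = R*T/V for one mole
    return R * T / V

def dp_dT_const_V(T, V, h=1e-3):
    # central finite difference for (∂p/∂T)_V
    return (p(T + h, V) - p(T - h, V)) / (2 * h)

T, V = 300.0, 0.010        # arbitrary illustrative state
lhs = dp_dT_const_V(T, V)
print(lhs, R / V)          # (∂p/∂T)_V = R/V, the dV coefficient in (E2)

# (∂U/∂V)_T = -p + T*(∂p/∂T)_V vanishes for an ideal gas
print(-p(T, V) + T * lhs)
```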
Entropy in statistical thermodynamics
In classical (phenomenological) thermodynamics it is not necessary to assume that matter consists of small particles (atoms or molecules). While this has the advantage of keeping the theory transparent, not obscured by microscopic details, it has the disadvantage that it cannot predict the value of any parameters. For instance, the heat capacity of a monoatomic ideal gas at constant volume CV is equal to 3R/2, where R is the molar gas constant. One needs a microscopic theory to find this simple result.
Before the 1920s the microscopic (molecular) theory of thermodynamics was based on classical (Newtonian) mechanics and on the kind of statistical arguments that were first introduced into physics by Maxwell and developed by Gibbs and Boltzmann. Since the 1920s microscopic thermodynamics invokes quantum mechanics. The branch of physics that tries to predict thermodynamic properties departing from molecular properties is known as statistical thermodynamics or statistical mechanics.
In this section it will be shown that the statistical mechanics expression for the entropy is
:<math>
S = -k_\mathrm{B}\, \mathrm{Tr}\big[\,\hat{\rho}\, \log\hat{\rho}\,\big],
</math>
where the density operator is given by
:<math>
\hat{\rho} = \frac{e^{-\hat{H}/(k_\mathrm{B} T)}}{\mathrm{Tr}\big[\, e^{-\hat{H}/(k_\mathrm{B} T)}\, \big]}.
</math>
Further kB is Boltzmann's constant, <math>\hat{H}</math> is the quantum mechanical energy operator of the total system (the energies of all particles plus their interactions), and the trace (Tr) of an operator is the sum of its diagonal matrix elements.
It will also be shown under which circumstance the entropy may be given by Boltzmann's celebrated equation
:<math>
S = k_\mathrm{B} \log\Omega.
</math>
Density operator
In his book[6] John von Neumann introduced into quantum mechanics the density operator (called "statistical operator" by von Neumann) for a system whose state is only partially known. He considered the situation in which certain real numbers pm are known that correspond to a complete set of orthonormal quantum mechanical states | m ⟩ (m = 0, 1, 2, …, ∞).[10] The quantity pm is the probability that state |m⟩ is occupied, or in other words, it is the percentage of systems in a (very large) ensemble of identical systems that are in the state |m⟩. As is usual for probabilities, they are normalized to unity,
:<math>
\sum_{m=0}^\infty p_m = 1.
</math>
The averaged value of a property with quantum mechanical operator <math>\hat{P}</math> of a system described by the probabilities pm is given by the ensemble average,
:<math>
\langle\langle \hat{P}\, \rangle\rangle \equiv \sum_{m=0}^\infty p_m \langle m | \hat{P} | m \rangle,
</math>
where <math>\langle m | \hat{P} | m \rangle</math> is the usual quantum mechanical expectation value.
The expression for ⟨⟨P ⟩⟩ can be written as a trace of an operator product. First define the density operator:
:<math>
\hat{\rho} \equiv \sum_{m=0}^\infty |m\rangle\, p_m\, \langle m |,
</math>
then it follows that
:<math>
\langle\langle \hat{P}\, \rangle\rangle = \mathrm{Tr}\big[\hat{P}\hat{\rho}\,\big].
</math>
Indeed,
:<math>
\mathrm{Tr}\big[\hat{P}\hat{\rho}\,\big] = \sum_n \langle n |\hat{P}\hat{\rho}\,| n\rangle
= \sum_{n,m} \langle n |\hat{P}| m\rangle\, p_m\, \langle m | n \rangle
= \sum_m p_m \langle m |\hat{P}| m\rangle = \langle\langle \hat{P}\, \rangle\rangle,
</math>
where ⟨ m | n ⟩ = δmn, the Kronecker delta.
A density operator has unit trace,
:<math>
\mathrm{Tr}\big[\hat{\rho}\,\big] = \sum_m p_m = 1.
</math>
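The trace identities above can be made concrete with a toy two-state example (the probabilities and the matrix chosen for the observable are arbitrary illustrative numbers):

```python
# Two orthonormal states |0>, |1> with occupation probabilities p0, p1.
# In this basis the density operator is the diagonal matrix rho = diag(p0, p1).
p_m = [0.7, 0.3]                      # normalized: the probabilities sum to unity
rho = [[p_m[0], 0.0], [0.0, p_m[1]]]

# Some observable P written as a Hermitian matrix in the same basis (arbitrary values)
P = [[2.0, 0.5], [0.5, -1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

# Ensemble average two ways: Tr[P rho] and sum_m p_m <m|P|m>
avg_trace = trace(matmul(P, rho))
avg_sum = sum(p_m[m] * P[m][m] for m in range(2))
print(avg_trace, avg_sum)   # both 0.7*2.0 + 0.3*(-1.0)
print(trace(rho))           # unit trace
```

The off-diagonal elements of P do not contribute, because rho is diagonal in this basis.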
Closed isothermal system
For a thermodynamic system of constant temperature (T), volume (V), and number of particles (N), one considers eigenstates of the energy operator <math>\hat{H}</math>, the Hamiltonian of the total system,
:<math>
\hat{H} | m \rangle = E_m | m \rangle.
</math>
Assume that pm is proportional to the Boltzmann factor, with the proportionality constant K determined by normalization,
:<math>
p_m = K e^{-E_m/(k_\mathrm{B}T)}, \qquad \sum_m p_m = K \sum_m e^{-E_m/(k_\mathrm{B}T)} = 1,
</math>
where kB is the Boltzmann constant. It is common to designate the partition function of the system of constant T, N, and V by Q,
:<math>
Q \equiv \sum_m e^{-E_m/(k_\mathrm{B}T)}, \qquad\hbox{so that}\quad K = \frac{1}{Q}.
</math>
Hence, using that
:<math>
e^{-\hat{H}/(k_\mathrm{B}T)}\, | m \rangle = e^{-E_m/(k_\mathrm{B}T)}\, | m \rangle,
</math>
it is found
:<math>
\hat{\rho} = \sum_m |m\rangle\, \frac{e^{-E_m/(k_\mathrm{B}T)}}{Q}\, \langle m |
= \frac{e^{-\hat{H}/(k_\mathrm{B}T)}}{Q},
</math>
where it is used that the states form a complete set, which gives rise to the following resolution of the identity operator,
:<math>
\hat{1} = \sum_m |m\rangle \langle m| = \sum_n |n\rangle \langle n| .
</math>
In summary, the canonical ensemble[11] average of a property with quantum mechanical operator <math>\hat{P}</math> is given by
:<math>
\langle\langle \hat{P}\, \rangle\rangle = \mathrm{Tr}\big[ \hat{P}\hat{\rho}\big] = \frac{1}{Q}\mathrm{Tr}\big[ \hat{P} e^{-\hat{H}/(k_\mathrm{B} T)} \big].
</math>
Internal energy
The quantum statistical expression for internal energy is
:<math>
U \equiv \langle\langle \hat{H}\, \rangle\rangle = \frac{1}{Q}\mathrm{Tr}\big[ \hat{H}\, e^{-\hat{H}/(k_\mathrm{B} T)} \big] = \frac{1}{Q} \sum_m E_m\, e^{-E_m/(k_\mathrm{B} T)}.
</math>
From
:<math>
\frac{\partial \log Q}{\partial T} = \frac{1}{Q}\, \frac{\partial Q}{\partial T}
= \frac{1}{Q} \sum_m \frac{E_m}{k_\mathrm{B} T^2}\, e^{-E_m/(k_\mathrm{B} T)}
</math>
follows
:<math>
k_\mathrm{B} T^2\, \frac{\partial \log Q}{\partial T}
= \frac{1}{Q} \sum_m E_m\, e^{-E_m/(k_\mathrm{B} T)} = U.
</math>
The quantum statistical expression for the internal energy U becomes
:<math>
U = k_\mathrm{B} T^2\, \frac{\partial \log Q}{\partial T}
= k_\mathrm{B} T^2\, \frac{\partial \log Q}{\partial T}\; \mathrm{Tr}\big[\hat{\rho}\,\big]
= \mathrm{Tr}\Big[ \hat{\rho}\; k_\mathrm{B} T^2\, \frac{\partial \log Q}{\partial T} \Big],
</math>
where it is used that a scalar may be taken out of the trace and that the density operator is of unit trace.
In classical thermodynamics the internal energy is related to the entropy S and the Helmholtz free energy A by
:<math>
U = A + T S.
</math>
Define
:<math>
\hat{S} \equiv -k_\mathrm{B} \log\hat{\rho}
\quad\hbox{and}\quad
\hat{A} \equiv \hat{H} - T\hat{S} \qquad\qquad(1)
</math>
and accordingly
:<math>
S \equiv \mathrm{Tr}\big[\hat{\rho}\,\hat{S}\,\big] = -k_\mathrm{B}\,\mathrm{Tr}\big[\hat{\rho}\,\log\hat{\rho}\,\big]
</math>
and
:<math>
A \equiv \mathrm{Tr}\big[\hat{\rho}\,\hat{A}\,\big]
= \mathrm{Tr}\big[\hat{\rho}\,\big(\hat{H} + k_\mathrm{B}T \log\hat{\rho}\,\big)\big]
= -k_\mathrm{B} T \log Q\; \mathrm{Tr}\big[\hat{\rho}\,\big] = -k_\mathrm{B} T \log Q,
</math>
where <math>\log\hat{\rho} = -\hat{H}/(k_\mathrm{B}T) - \log Q</math> is used.
In summary,
:<math>
A + T S = \mathrm{Tr}\big[\hat{\rho}\,\hat{A}\,\big] + T\,\mathrm{Tr}\big[\hat{\rho}\,\hat{S}\,\big]
= \mathrm{Tr}\big[\hat{\rho}\,\hat{H}\,\big] = U,
</math>
which agrees with the quantum statistical expression for U, which in turn means that the definitions (1) of the entropy operator and Helmholtz free energy operator are consistent.
Note that neither the entropy nor the free energy is given by an ordinary quantum mechanical operator; both depend on the temperature through the partition function Q. Furthermore Q is defined as a trace:
:<math>
Q = \mathrm{Tr}\big[\, e^{-\hat{H}/(k_\mathrm{B} T)}\, \big]
</math>
and thus samples the whole (Hilbert) space containing the state vectors | m ⟩. Almost all quantum mechanical operators that represent observable (physical) quantities have a classical (electromagnetic or mechanical) counterpart. Clearly the entropy operator lacks such a parallel definition, and this is probably the main reason why entropy is a concept that is difficult to comprehend.
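For a toy model of the same kind (two arbitrary levels, kB = 1) one can check numerically that the statistical entropy −kB Σ pm log pm coincides with the thermodynamic combination (U − A)/T:

```python
import math

kB = 1.0
T = 1.5
E = [0.0, 2.0]            # two-level toy system (arbitrary energies)

Q = sum(math.exp(-Em / (kB * T)) for Em in E)
p = [math.exp(-Em / (kB * T)) / Q for Em in E]

U = sum(pm * Em for pm, Em in zip(p, E))   # internal energy
A = -kB * T * math.log(Q)                  # Helmholtz free energy
S_thermo = (U - A) / T                     # from U = A + T*S

# -kB Tr[rho log rho]; rho is diagonal in the energy eigenbasis
S_stat = -kB * sum(pm * math.log(pm) for pm in p)
print(S_thermo, S_stat)   # identical up to rounding
```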
Boltzmann's formula for entropy
Let us consider an isolated system (constant U, V, and N). Traces are taken only over states with energy U. Let there be Ω(U, V, N) of these states. This is in general a very large number; for instance, for one mole of a monatomic ideal gas consisting of N = NA ≈ 10²³ atoms (Avogadro's number) it holds that[12]
:<math>
\Omega \approx \left[ \frac{V}{N} \left( \frac{4\pi m U}{3 N h^2} \right)^{3/2} e^{5/2} \right]^{N}.
</math>
Here m is the mass of an atom, h is Planck's constant, V is the volume of the vessel containing the gas, and e ≈ 2.7.
The sum in the partition function shrinks to a sum over the Ω states of energy U, hence
:<math>
Q = \sum_{m=1}^{\Omega} e^{-U/(k_\mathrm{B}T)} = \Omega\, e^{-U/(k_\mathrm{B}T)}.
</math>
Likewise,
:<math>
p_m = \frac{e^{-U/(k_\mathrm{B}T)}}{Q} = \frac{1}{\Omega}, \qquad m = 1,\ldots,\Omega,
</math>
so that Boltzmann's celebrated equation follows[13]
:<math>
S = -k_\mathrm{B} \sum_{m=1}^{\Omega} p_m \log p_m
= -k_\mathrm{B}\, \Omega\, \frac{1}{\Omega} \log\frac{1}{\Omega} = k_\mathrm{B} \log\Omega.
</math>
Boltzmann's equation is derived as an average over an ensemble consisting of identical systems of constant energy, number of particles, and volume; such an ensemble is known as a microcanonical ensemble. However, it can be shown that energy fluctuations around the mean energy in a canonical ensemble (constant T) are extremely small, so that taking the trace over only the states of mean energy is a very good approximation. In other words, although Boltzmann's formula does not hold formally for a canonical ensemble, in practice it is a very good approximation, also for isothermal systems.
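A small numerical sketch of the microcanonical case (the value of Ω is arbitrary): when all Ω microstates are equally probable, the general expression −kB Σ pm log pm reduces to Boltzmann's kB log Ω:

```python
import math

kB = 1.0
Omega = 1000              # number of microstates at the single energy U (arbitrary)

# All Omega microstates are equally probable: p_m = 1/Omega
p = [1.0 / Omega] * Omega

# The Gibbs/von Neumann entropy reduces to Boltzmann's formula
S_gibbs = -kB * sum(pm * math.log(pm) for pm in p)
S_boltzmann = kB * math.log(Omega)
print(S_gibbs, S_boltzmann)
```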
- ↑ A size-extensive property of a system becomes x times larger when the system is enlarged by a factor x, provided all intensive parameters remain the same upon the enlargement. Intensive parameters, like temperature, density, and pressure, are independent of size.
- ↑ It is reported that in a conversation with Claude Shannon, John (Johann) von Neumann said: "In the second place, and more important, nobody knows what entropy really is [...]". M. Tribus, E. C. McIrvine, Energy and information, Scientific American, vol. 224 (September 1971), pp. 178–184.
- ↑ 3.0 3.1 R. J. E. Clausius, Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der Mechanischen Wärmetheorie [On several forms of the fundamental equations of the mechanical theory of heat that are useful for application], Annalen der Physik, (is Poggendorff's Annalen der Physik und Chemie) vol. 125, pp. 352–400 (1865) pdf. Around the same time Clausius wrote a two-volume treatise: R. J. E. Clausius, Abhandlungen über die mechanische Wärmetheorie [Treatise on the mechanical theory of heat], F. Vieweg, Braunschweig, (vol I: 1864, vol II: 1867); Google books (contains two volumes). The 1865 Annalen paper was reprinted in the second volume of the Abhandlungen and included in the 1867 English translation.
- ↑ S. Carnot, Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance (Reflections on the motive power of fire and on machines suited to develop that power), Chez Bachelier, Paris (1824).
- ↑ L. Boltzmann, Über die Beziehung zwischen dem zweiten Hauptsatz der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht, [On the relation between the second fundamental law of the mechanical theory of heat and the probability calculus with respect to the theorems of heat equilibrium] Wiener Berichte vol. 76, pp. 373-435 (1877)
- ↑ 6.0 6.1 J. von Neumann, Mathematische Grundlagen der Quantenmechanik, [Mathematical foundation of quantum mechanics] Springer, Berlin (1932)
- ↑ C. Carathéodory, Untersuchungen über die Grundlagen der Thermodynamik [Investigation on the foundations of thermodynamics], Mathematische Annalen, vol. 67, pp. 355-386 (1909).
- ↑ M. Born, Physikalische Zeitschrift, vol. 22, p. 218, 249, 282 (1922)
- ↑ H. B. Callen, Thermodynamics and an Introduction to Thermostatistics. John Wiley and Sons, New York, 2nd edition, (1965); E. A. Guggenheim, Thermodynamics, North-Holland, Amsterdam, 5th edition (1967)
- ↑ In order to distinguish the macroscopic thermodynamical states of a system (determined by a few thermodynamic parameters, such as T and V) from the quantum mechanical states (functions of 3N parameters, the coordinates of the N particles), the quantum mechanical states are often referred to as "microstates".
- ↑ A large number of systems with constant T, V, and N is known as a canonical ensemble; the term is due to Willard Gibbs.
- ↑ T. L. Hill, An introduction to statistical thermodynamics, Addison-Wesley, Reading, Mass. (1960) p. 82
- ↑ The equation S = k log W is on the tombstone of the family grave of Boltzmann, see Photo Boltzmann tombstone.
Entropy as disorder
In common parlance the term entropy is used for lack of order and gradual decline into disorder. One can find in many introductory physics texts the statement that entropy is a measure for the degree of randomness in a system.
The origin of these statements is Boltzmann's 1877 equation S = kB logΩ that was discussed above. The third law of thermodynamics states the following: when T → 0, the number of accessible states Ω goes to unity, and the entropy S goes to zero. That is, if one interprets entropy as randomness, then at zero kelvin there is no disorder whatsoever; matter is in complete order. Clearly, this low-temperature limit supports the intuitive notion of entropy as a measure of chaos.
It was shown above that Ω gives the number of quantum states accessible to a system. It can be argued that the more quantum states are available to a system, the greater the complexity of the system. If one equates complexity with randomness, as is often done in this context, it confirms the notion of entropy as a measure of disorder. The second law of thermodynamics, which states that a spontaneous process in an isolated system strives toward maximum entropy, can be interpreted as the tendency of the universe to become more and more chaotic.
However, the view of entropy as disorder, as a measure of chaos, is disputed. For instance, Lambert[1] contends that entropy is a "measure for energy dispersal". If one reads "energy dispersal" as heat divided by temperature, this is true by the classical (phenomenological) definition of entropy. Lambert states that from a molecular point of view, entropy increases when more microstates become available to the system (i.e., Ω increases) and the energy is dispersed over the greater number of accessible microstates. This interpretation agrees with the discussion above. Lambert argues further that the view of entropy as disorder is "so misleading as actually to be a failure-prone crutch".
If one rejects completely the idea of entropy as randomness, one discards a convenient mnemonic device. Generations of physicists and chemists have remembered that a gas contains more entropy than a crystal "because a gas is more chaotic than a crystal". This is easier to remember than "because the gas has more microstates at its disposal and its energy is dispersed over this larger number of microstates", although the latter statement is the more correct one.
Footnotes
- ↑ F. L. Lambert, Disorder—A Cracked Crutch for Supporting Entropy Discussions, Journal of Chemical Education, vol. 79 pp. 187–192 (2002)