Entropy

This article is about entropy in thermodynamics. For other uses, see Entropy (disambiguation).
Not to be confused with Enthalpy.
For a more accessible and less technical introduction to this topic, see Introduction to entropy.
In statistical mechanics, entropy (usual symbol S) is related to the number of microscopic configurations Ω that a thermodynamic system can have when in a state as specified by some macroscopic variables. Specifically, assuming for simplicity that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant kB. Formally,

S = kB ln Ω (assuming equiprobable states).
This is consistent with 19th-century formulas for entropy in terms of heat and temperature, as discussed below. Boltzmann's constant, and therefore entropy, have dimensions of energy divided by temperature.
For example, gas in a container with known volume, pressure, and energy could have an enormous number of possible configurations of the collection of individual gas molecules. At equilibrium, each instantaneous configuration of the gas may be regarded as random. Entropy may be understood as a measure of disorder within a macroscopic system. The second law of thermodynamics states that an isolated system's entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount.
Since entropy is a function of the state of the system, a change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.
In the mid-19th century, the change in entropy (ΔS) of a system undergoing a thermodynamically reversible process was defined by Rudolf Clausius as:

ΔS = ∫ δQrev / T

where T is the absolute temperature of the system, dividing an incremental reversible transfer of heat into that system (δQrev). (If heat is transferred out, the sign would be reversed, giving a decrease in entropy of the system.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.

Entropy is an extensive property. It has the dimension of energy divided by temperature, which has a unit of joules per kelvin (J K⁻¹) in the International System of Units (or kg m² s⁻² K⁻¹ in terms of base units). But the entropy of a pure substance is usually given as an intensive property: either entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance (SI unit: J K⁻¹ mol⁻¹).

The absolute entropy (S rather than ΔS) was defined later, using either statistical mechanics or the third law of thermodynamics, by which an otherwise arbitrary additive constant is fixed such that the entropy of a pure substance at absolute zero is zero. In statistical mechanics this reflects the fact that the ground state of a system is generally non-degenerate, so that only one microscopic configuration corresponds to it.

In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final state. It is often said that entropy is an expression of the disorder, or randomness, of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics through the modern definition of entropy.
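As a simple numerical illustration of the macroscopic definition (a sketch added for concreteness, using approximate textbook values for ice): for reversible, isothermal heat transfer the integral collapses to ΔS = Q/T.

```python
# Reversible, isothermal heat transfer: dS = Q/T (the integral collapses
# because T is constant). Approximate textbook values for melting one mole of ice.
Q = 6.01e3   # J, molar enthalpy of fusion of ice (approximate)
T = 273.15   # K, melting point at 1 atm

dS_system = Q / T        # ice gains entropy as it melts: about +22.0 J/K
dS_reservoir = -Q / T    # the heat source loses the same amount
print(dS_system, dS_reservoir, dS_system + dS_reservoir)  # net 0: reversible
```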
1 History

Main article: History of entropy

The French mathematician Lazare Carnot proposed in his 1803 paper Fundamental Principles of Equilibrium and Movement that in any machine the accelerations and shocks of the moving parts represent losses of moment of activity; in other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, which posited that in all heat-engines, whenever "caloric" (what is now known as heat) falls through a temperature difference, work or motive power can be produced from the actions of its fall from a hot to a cold body. He made the analogy with how water falls in a water wheel. This was an early insight into the second law of thermodynamics.[1] Carnot based his views of heat partially on the early 18th-century "Newtonian hypothesis" that both heat and light were types of indestructible forms of matter, which are attracted and repelled by other matter, and partially on the contemporary views of Count Rumford, who showed (1789) that heat could be created by friction, as when cannon bores are machined.[2] Carnot reasoned that if the body of the working substance, such as a body of steam, is returned to its original state at the end of a complete engine cycle, no change occurs in the condition of the working body.

The first law of thermodynamics, deduced from the heat-friction experiments of James Joule in 1843, expresses the concept of energy and its conservation in all processes; the first law, however, is unable to quantify the effects of friction and dissipation.

In the 1850s and 1860s, German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and gave this change a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction.[3] Clausius described entropy as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[3] This was in contrast to earlier views, based on the theories of Isaac Newton, that heat was an indestructible particle that had mass.

Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. In 1877 Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the logarithm of the number of microstates such a gas could occupy. Henceforth, the essential problem in statistical thermodynamics has been, according to Erwin Schrödinger, to determine the distribution of a given amount of energy E over N identical systems. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.

[Figure: Rudolf Clausius (1822–1888), originator of the concept of entropy.]

2 Definitions and descriptions
"Any method involving the notion of entropy, the very existence of which depends on the second law of thermodynamics, will doubtless seem to many far-fetched, and may repel beginners as obscure and difficult of comprehension."
– Willard Gibbs, Graphical Methods in the Thermodynamics of Fluids[4]

There are two related definitions of entropy: the thermodynamic definition and the statistical mechanics definition. Historically, the classical thermodynamics definition developed first. In the classical thermodynamics viewpoint, the system is composed of very large numbers of constituents (atoms, molecules), and the state of the system is described by the average thermodynamic properties of those constituents; the details of the system's constituents are not directly considered, but their behavior is described by macroscopically averaged properties, e.g. temperature, pressure, entropy, heat capacity. The early classical definition of the properties of the system assumed equilibrium. The classical thermodynamic definition of entropy has more recently been extended into the area of non-equilibrium thermodynamics. Later, the thermodynamic properties, including entropy, were given an alternative definition in terms of the statistics of the motions of the microscopic constituents of a system, modeled at first classically (e.g. Newtonian particles constituting a gas) and later quantum-mechanically (photons, phonons, spins, etc.). The statistical mechanics description of the behavior of a system is necessary because the definition of the properties of a system using classical thermodynamics becomes an increasingly unreliable method of predicting the final state of a system that is subject to some process.
2.1 Function of state

There are many thermodynamic properties that are functions of state. This means that at a particular thermodynamic state (which should not be confused with the microscopic state of a system), these properties have a certain value. Often, if two properties of the system are determined, then the state is determined and the other properties' values can also be determined. For instance, a quantity of gas at a particular temperature and pressure has its state fixed by those values and thus has a specific volume that is determined by those values. As another instance, a system composed of a pure substance of a single phase at a particular uniform temperature and pressure is determined (and is thus a particular state) and is at not only a particular volume but also at a particular entropy.[5] The fact that entropy is a function of state is one reason it is useful. In the Carnot cycle, the working fluid returns to the same state it had at the start of the cycle, hence the line integral of any state function, such as entropy, over this reversible cycle is zero.

2.2 Reversible process

Entropy is conserved for a reversible process. A reversible process is one that does not deviate from thermodynamic equilibrium, while producing the maximum work. Any process which happens quickly enough to deviate from thermal equilibrium cannot be reversible. In these cases energy is lost to heat, total entropy increases, and the potential for maximum work to be done in the transition is also lost. More specifically, total entropy is conserved in a reversible process and not conserved in an irreversible process.[6] For example, in the Carnot cycle, while the heat flow from the hot reservoir to the cold reservoir represents an increase in entropy, the work output, if reversibly and perfectly stored in some energy storage mechanism, represents a decrease in entropy that could be used to operate the heat engine in reverse and return to the previous state; thus the total entropy change is still zero at all times if the entire process is reversible. An irreversible process increases entropy.[7]
2.3 Carnot cycle

The concept of entropy arose from Rudolf Clausius's study of the Carnot cycle.[8] In a Carnot cycle, heat QH is absorbed isothermally at temperature TH from a 'hot' reservoir and given up isothermally as heat QC to a 'cold' reservoir at TC. According to Carnot's principle, work can only be produced by the system when there is a temperature difference, and the work should be some function of the difference in temperature and the heat absorbed (QH). Carnot did not distinguish between QH and QC, since he was using the incorrect hypothesis that caloric theory was valid, and hence heat was conserved (the incorrect assumption that QH and QC were equal) when, in fact, QH is greater than QC.[9][10] Through the efforts of Clausius and Kelvin, it is now known that the maximum work that a heat engine can produce is the product of the Carnot efficiency and the heat absorbed from the hot reservoir:

W = (1 − TC/TH) QH    (1)

To derive the Carnot efficiency, which is 1 − TC/TH (a number less than one), Kelvin had to evaluate the ratio of the work output to the heat absorbed during the isothermal expansion with the help of the Carnot–Clapeyron equation, which contained an unknown function known as the Carnot function. The possibility that the Carnot function could be the temperature as measured from a zero point of temperature was suggested by Joule in a letter to Kelvin. This allowed Kelvin to establish his absolute temperature scale.[11] It is also known that the work produced by the system is the difference between the heat absorbed from the hot reservoir and the heat given up to the cold reservoir:

W = QH − QC    (2)

Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle, work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy and it became the first law of thermodynamics.[12]

Now equating (1) and (2) gives

QH/TH − QC/TC = 0

or

QH/TH = QC/TC

This implies that there is a function of state which is conserved over a complete cycle of the Carnot cycle. Clausius called this state function entropy. One can see that entropy was discovered through mathematics rather than through laboratory results. It is a mathematical construct and has no easy physical analogy. This makes the concept somewhat obscure or abstract, akin to how the concept of energy arose.

Clausius then asked what would happen if there should be less work produced by the system than that predicted by Carnot's principle. The right-hand side of the first equation would be the upper bound of the work output by the system, which would now be converted into the inequality

W < (1 − TC/TH) QH
When the second equation is used to express the work as a difference in heats, we get

QH − QC < (1 − TC/TH) QH

or

QC > (TC/TH) QH

So more heat is given up to the cold reservoir than in the Carnot cycle. If we denote the entropies by S = Q/T for the two states, then the above inequality can be written as a decrease in the entropy,

SH − SC < 0

or

SH < SC

The entropy that leaves the system is greater than the entropy that enters the system, implying that some irreversible process prevents the cycle from producing the maximum amount of work predicted by the Carnot equation.
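The bookkeeping above can be checked numerically. The following sketch assumes hypothetical reservoir temperatures and an arbitrary sub-Carnot work output; it is illustrative only:

```python
# Entropy bookkeeping for a heat engine between two reservoirs.
# Temperatures and heat intake are hypothetical, for illustration only.
T_H, T_C = 500.0, 300.0   # K
Q_H = 1000.0              # J drawn from the hot reservoir

# Reversible (Carnot) engine: Q_C = Q_H * T_C / T_H, so entropy out = entropy in.
Q_C_rev = Q_H * T_C / T_H
print(Q_H / T_H - Q_C_rev / T_C)   # 0.0 -- no net entropy change over the cycle

# Sub-Carnot engine: less work, so more heat reaches the cold reservoir
# and Q_C / T_C exceeds Q_H / T_H, a net entropy increase.
W = 0.8 * (1 - T_C / T_H) * Q_H    # only 80% of the Carnot work
Q_C = Q_H - W
print(Q_C / T_C - Q_H / T_H)       # > 0 -- entropy production
```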
The Carnot cycle and the Carnot efficiency are useful because they define the upper bound of the possible work output and the efficiency of any classical thermodynamic system. Other cycles, such as the Otto cycle, Diesel cycle and Brayton cycle, can be analyzed from the standpoint of the Carnot cycle. Any machine or process that converts heat to work and is claimed to produce an efficiency greater than the Carnot efficiency is not viable because it violates the second law of thermodynamics. For very small numbers of particles in the system, statistical thermodynamics must be used. The efficiency of devices such as photovoltaic cells requires an analysis from the standpoint of quantum mechanics.
2.4 Classical thermodynamics

Main article: Entropy (classical thermodynamics)

The thermodynamic definition of entropy was developed in the early 1850s by Rudolf Clausius and essentially describes how to measure the entropy of an isolated system in thermodynamic equilibrium with its parts. Clausius created the term entropy as an extensive thermodynamic variable that was shown to be useful in characterizing the Carnot cycle. Heat transfer along the isotherm steps of the Carnot cycle was found to be proportional to the temperature of a system (known as its absolute temperature). This relationship was expressed in increments of entropy equal to the ratio of incremental heat transfer divided by temperature, which was found to vary in the thermodynamic cycle but eventually return to the same value at the end of every cycle. Thus it was found to be a function of state, specifically a thermodynamic state of the system. Clausius wrote that he intentionally formed the word entropy as similar as possible to the word energy, basing the term on the Greek τροπή (tropē), transformation.[13][note 1]

While Clausius based his definition on a reversible process, there are also irreversible processes that change entropy. Following the second law of thermodynamics, the entropy of an isolated system always increases. The difference between an isolated system and a closed system is that heat may not flow to and from an isolated system, but heat flow to and from a closed system is possible. Nevertheless, for both closed and isolated systems, and indeed also in open systems, irreversible thermodynamic processes may occur.

According to the Clausius equality, for a reversible cyclic process: ∮ δQrev/T = 0. This means the line integral ∫ δQrev/T is path-independent. So we can define a state function S, called entropy, which satisfies dS = δQrev/T.

To find the entropy difference between any two states of a system, the integral must be evaluated for some reversible path between the initial and final states.[14] Since entropy is a state function, the entropy change of the system for an irreversible path is the same as for a reversible path between the same two states.[15] However, the entropy change of the surroundings will be different.

We can only obtain the change of entropy by integrating the above formula. To obtain the absolute value of the entropy, we need the third law of thermodynamics, which states that S = 0 at absolute zero for perfect crystals.

From a macroscopic perspective, in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. In any process where the system gives up energy ΔE and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat (TR is the temperature of the system's external surroundings). Otherwise the process cannot go forward. In classical thermodynamics, the entropy of a system is defined only if it is in thermodynamic equilibrium.
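To make the prescription "evaluate the integral along a reversible path" concrete, the following sketch (with assumed values roughly matching a diatomic ideal gas) sums δqrev/T over many small heating steps and compares the result with the closed form given in Section 5:

```python
# Summing dq_rev / T over many small steps of constant-pressure heating,
# compared with the closed form n*Cp*ln(T1/T0). Assumed illustrative values.
import math

n, Cp = 1.0, 29.1                 # mol, J/(mol K): roughly a diatomic gas
T0, T1, steps = 300.0, 400.0, 100_000

dT = (T1 - T0) / steps
dS = sum(n * Cp * dT / (T0 + (i + 0.5) * dT) for i in range(steps))

print(dS)                          # numerical integral of dq_rev / T
print(n * Cp * math.log(T1 / T0))  # closed form; the two agree closely
```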
2.5 Statistical mechanics

The statistical definition was developed by Ludwig Boltzmann in the 1870s by analyzing the statistical behavior of the microscopic components of the system. Boltzmann showed that this definition of entropy was equivalent to the thermodynamic entropy to within a constant factor, which has since been known as Boltzmann's constant. In summary, the thermodynamic definition of entropy provides the experimental definition of entropy, while the statistical definition of entropy extends the concept, providing an explanation and a deeper understanding of its nature.

The interpretation of entropy in statistical mechanics is the measure of uncertainty, or "mixedupness" in the phrase of Gibbs, which remains about a system after its observable macroscopic properties, such as temperature, pressure and volume, have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates. In contrast to the macrostate, which characterizes plainly observable average quantities, a microstate specifies all molecular details about the system, including the position and velocity of every molecule. The more such states are available to the system with appreciable probability, the greater the entropy. In statistical mechanics, entropy is a measure of the number of ways in which a system may be arranged, often taken to be a measure of disorder (the higher the entropy, the higher the disorder).[16][17][18] This definition describes the entropy as being proportional to the natural logarithm of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which could give rise to the observed macroscopic state (macrostate) of the system. The constant of proportionality is the Boltzmann constant.
Specifically, entropy is a logarithmic measure of the number of states with significant probability of being occupied:

S = −kB Σᵢ pᵢ log pᵢ

or, equivalently, the expected value of the logarithm of the probability that a microstate is occupied,

S = −kB E(log pᵢ)

where kB is the Boltzmann constant, equal to 1.38065×10⁻²³ J/K. The summation is over all the possible microstates of the system, and pᵢ is the probability that the system is in the i-th microstate.[19]
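A short sketch of the formula above (toy probabilities, chosen only for illustration): the Gibbs sum reproduces kB ln Ω when the Ω microstates are equally probable, and gives a smaller entropy for a more sharply peaked distribution.

```python
# Gibbs entropy S = -k_B * sum(p_i * ln p_i) for a toy microstate distribution.
import math

k_B = 1.380649e-23  # J/K

def gibbs_entropy(probs):
    # States with p = 0 contribute nothing to the sum.
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

omega = 4
print(gibbs_entropy([1 / omega] * omega))   # uniform case: equals k_B * ln(4)
print(k_B * math.log(omega))
print(gibbs_entropy([0.7, 0.1, 0.1, 0.1]))  # peaked distribution: lower entropy
```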
This definition assumes that the basis set of states has been picked so that there is no information on their relative phases. In a different basis set, the more general expression is

S = −kB Tr(ρ̂ log ρ̂)

where ρ̂ is the density matrix, Tr denotes the trace, and log is the matrix logarithm. This density matrix formulation is not needed in cases of thermal equilibrium so long as the basis states are chosen to be energy eigenstates. For most practical purposes, this can be taken as the fundamental definition of entropy, since all other formulas for S can be mathematically derived from it, but not vice versa.

In what has been called the fundamental assumption of statistical thermodynamics or the fundamental postulate in statistical mechanics, the occupation of any microstate is assumed to be equally probable (i.e. pᵢ = 1/Ω, where Ω is the number of microstates); this assumption is usually justified for an isolated system in equilibrium.[20] Then the previous equation reduces to

S = kB log Ω

In thermodynamics, such a system is one in which the volume, number of molecules, and internal energy are fixed (the microcanonical ensemble).

The most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system.[21] This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model.

The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has deep implications: if two observers use different sets of macroscopic variables, they see different entropies. For example, if observer A uses the variables U, V and W, and observer B uses U, V, W, X, then, by changing X, observer B can cause an effect that looks like a violation of the second law of thermodynamics to observer A. In other words: the set of macroscopic variables one chooses must include everything that may change in the experiment; otherwise one might see decreasing entropy.[22]

Entropy can be defined for any Markov process with reversible dynamics and the detailed balance property. In Boltzmann's 1896 Lectures on Gas Theory, he showed that this expression gives a measure of entropy for systems of atoms and molecules in the gas phase, thus providing a measure for the entropy of classical thermodynamics.
2.6 Entropy of a system

Entropy arises directly from the Carnot cycle. It can also be described as the reversible heat divided by temperature. Entropy is a fundamental function of state.

In a thermodynamic system, pressure, density, and temperature tend to become uniform over time because the equilibrium state has higher probability (more possible combinations of microstates) than any other state.

[Figure: A thermodynamic system, showing the system, its surroundings, and the boundary between them.]

As an example, for a glass of ice water in air at room temperature, the difference in temperature between a warm room (the surroundings) and a cold glass of ice and water (the system and not part of the room) begins to equalize as portions of the thermal energy from the warm surroundings spread to the cooler system of ice and water. Over time the temperature of the glass and its contents and the temperature of the room become equal. In other words, the entropy of the room has decreased as some of its energy has been dispersed to the ice and water.

However, as calculated in the example, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. In an isolated system such as the room and ice water taken together, the dispersal of energy from warmer to cooler always results in a net increase in entropy. Thus, when the 'universe' of the room and ice water system has reached a temperature equilibrium, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.

Thermodynamic entropy is a non-conserved state function that is of great importance in the sciences of physics and chemistry.[16][23] Historically, the concept of entropy evolved to explain why some processes (permitted by conservation laws) occur spontaneously while their time reversals (also permitted by conservation laws) do not; systems tend to progress in the direction of increasing entropy.[24][25] For isolated systems, entropy never decreases.[23] This fact has several important consequences in science: first, it prohibits "perpetual motion" machines; and second, it implies the arrow of entropy has the same direction as the arrow of time. Increases in entropy correspond to irreversible changes in a system, because some energy is expended as waste heat, limiting the amount of work a system can do.[16][17][26][27]
Unlike many other functions of state, entropy cannot be directly observed but must be calculated. Entropy can be calculated for a substance as the standard molar entropy from absolute zero (also known as absolute entropy) or as a difference in entropy from some other reference state defined as zero entropy. Entropy has the dimension of energy divided by temperature, which has a unit of joules per kelvin (J/K) in the International System of Units. While these are the same units as heat capacity, the two concepts are distinct.[28] Entropy is not a conserved quantity: for example, in an isolated system with non-uniform temperature, heat might irreversibly flow and the temperature become more uniform, such that entropy increases. The second law of thermodynamics states that a closed system has entropy which may increase or otherwise remain constant. Chemical reactions cause changes in entropy, and entropy plays an important role in determining in which direction a chemical reaction spontaneously proceeds.

[Figure: A temperature–entropy diagram for steam. The vertical axis represents uniform temperature, and the horizontal axis represents specific entropy. Each dark line on the graph represents constant pressure, and these form a mesh with light gray lines of constant volume. Dark blue is liquid water, light blue is liquid–steam mixture, and faint blue is steam; grey-blue represents supercritical liquid water.]

One dictionary definition of entropy is that it is "a measure of thermal energy per unit temperature that is not available for useful work". For instance, a substance at uniform temperature is at maximum entropy and cannot drive a heat engine. A substance at non-uniform temperature is at a lower entropy (than if the heat distribution is allowed to even out), and some of the thermal energy can drive a heat engine.
A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there is no net exchange of heat or work; the entropy change is entirely due to the mixing of the different substances. At a statistical mechanical level, this results from the change in available volume per particle with mixing.[29]
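For the ideal case, the mixing contribution can be written as ΔSmix = −n R Σ xᵢ ln xᵢ, where the xᵢ are mole fractions. A minimal sketch under that ideal-mixture assumption (amounts are illustrative):

```python
# Ideal entropy of mixing: dS_mix = -n_total * R * sum(x_i * ln x_i),
# a function of the mole fractions only. Amounts below are illustrative.
import math

R = 8.314  # J/(mol K)

def entropy_of_mixing(moles):
    n_total = sum(moles)
    return -n_total * R * sum((n / n_total) * math.log(n / n_total)
                              for n in moles if n > 0)

print(entropy_of_mixing([1.0, 1.0]))  # 50/50 mixture: 2*R*ln(2), about 11.5 J/K
print(entropy_of_mixing([1.9, 0.1]))  # lopsided mixture mixes less: about 3.3 J/K
```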
3 Second law of thermodynamics

Main article: Second law of thermodynamics
See also: Thermodynamic equilibrium and Non-equilibrium thermodynamics

The second law of thermodynamics requires that, in general, the total entropy of any system cannot decrease other than by increasing the entropy of some other system. Hence, in a system isolated from its environment, the entropy of that system tends not to decrease. It follows that heat cannot flow from a colder body to a hotter body without the application of work (the imposition of order) to the colder body. Secondly, it is impossible for any device operating on a cycle to produce net work from a single temperature reservoir; the production of net work requires flow of heat from a hotter reservoir to a colder reservoir, or a single expanding reservoir undergoing adiabatic cooling, which performs adiabatic work. As a result, there is no possibility of a perpetual motion machine. It follows that a reduction in the increase of entropy in a specified process, such as a chemical reaction, means that it is energetically more efficient.

It follows from the second law of thermodynamics that the entropy of a system that is not isolated may decrease. An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), which the air conditioner transports and discharges to the outside air, always makes a bigger contribution to the entropy of the environment than the decrease of the entropy of the air of that system. Thus, the total of the entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics.

In mechanics, the second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.[30] The entropy change of a system at temperature T absorbing an infinitesimal amount of heat δq in a reversible way is given by δq/T. More explicitly, an energy TR ΔS is not available to do useful work, where TR is the temperature of the coldest accessible reservoir or heat sink external to the system. For further discussion, see Exergy.

Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in an isolated system. Although this is possible, such an event has a small probability of occurring, making it unlikely.[31]

The applicability of the second law of thermodynamics is limited to systems which are near or in an equilibrium state.[32] At the same time, laws governing systems far from equilibrium are still debatable. One of the guiding principles for such systems is the maximum entropy production principle.[33][34] It claims that a non-equilibrium system evolves so as to maximize its entropy production.[35][36]
4 Applications

4.1 The fundamental thermodynamic relation

Main article: Fundamental thermodynamic relation

The entropy of a system depends on its internal energy and its external parameters, such as its volume. In the thermodynamic limit, this fact leads to an equation relating the change in the internal energy U to changes in the entropy and the external parameters. This relation is known as the fundamental thermodynamic relation. If external pressure p bears on the volume V as the only external parameter, this relation is:

dU = T dS − p dV

Since both internal energy and entropy are monotonic functions of temperature T (implying that the internal energy is fixed when one specifies the entropy and the volume), this relation is valid even if the change from one state of thermal equilibrium to another with infinitesimally larger entropy and volume happens in a non-quasistatic way (so during this change the system may be very far out of thermal equilibrium, and then the whole-system entropy, pressure and temperature may not exist).

The fundamental thermodynamic relation implies many thermodynamic identities that are valid in general, independent of the microscopic details of the system. Important examples are the Maxwell relations and the relations between heat capacities.
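For instance, one Maxwell relation follows directly from the fundamental relation by introducing the Helmholtz free energy F = U − TS (a standard derivation, sketched here):

```latex
% One Maxwell relation, derived from dU = T dS - p dV.
% Define the Helmholtz free energy F = U - TS, so that
%   dF = dU - T dS - S dT = -S dT - p dV .
% Since dF is an exact differential, its mixed second derivatives agree:
\[
  \left(\frac{\partial S}{\partial V}\right)_{T}
  = -\frac{\partial^{2} F}{\partial V \, \partial T}
  = \left(\frac{\partial p}{\partial T}\right)_{V},
\]
% relating an entropy derivative to a directly measurable p--T slope.
```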
4.2 Entropy in chemical thermodynamics

Thermodynamic entropy is central in chemical thermodynamics, enabling changes to be quantified and the outcome of reactions predicted. The second law of thermodynamics states that entropy in an isolated system (the combination of a subsystem under study and its surroundings) increases during all spontaneous chemical and physical processes. The Clausius equation of δq/T = ΔS introduces the measurement of entropy change, ΔS. Entropy change describes the direction and quantifies the magnitude of simple changes such as heat transfer between systems, always from hotter to cooler spontaneously.

The thermodynamic entropy therefore has the dimension of energy divided by temperature, and the unit joule per kelvin (J/K) in the International System of Units (SI).

Thermodynamic entropy is an extensive property, meaning that it scales with the size or extent of a system. In many processes it is useful to specify the entropy as an intensive property independent of the size, as a specific entropy characteristic of the type of system studied. Specific entropy may be expressed relative to a unit of mass, typically the kilogram (unit: J kg⁻¹ K⁻¹). Alternatively, in chemistry, it is also referred to one mole of substance, in which case it is called the molar entropy, with a unit of J mol⁻¹ K⁻¹.

Thus, when one mole of substance at about 0 K is warmed by its surroundings to 298 K, the sum of the incremental values of q/T constitutes each element's or compound's standard molar entropy, an indicator of the amount of energy stored by a substance at 298 K.[37][38] Entropy change also measures the mixing of substances as a summation of their relative quantities in the final mixture.[39]

Entropy is equally essential in predicting the extent and direction of complex chemical reactions. For such applications, ΔS must be incorporated in an expression that includes both the system and its surroundings, ΔS(universe) = ΔS(surroundings) + ΔS(system). This expression becomes, via some steps, the Gibbs free energy equation for reactants and products in the system: ΔG [the Gibbs free energy change of the system] = ΔH [the enthalpy change] − T ΔS [the entropy change].[37]
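A small numerical sketch of the sign logic of the Gibbs equation (the ΔH and ΔS values below are hypothetical, chosen only to show how the spontaneous direction can reverse with temperature):

```python
# Sign logic of dG = dH - T*dS. The reaction values are hypothetical,
# chosen so that the favorable enthalpy term loses to -T*dS at high T.
dH = -92.2e3   # J/mol, exothermic
dS = -198.7    # J/(mol K), system entropy decreases

for T in (298.0, 500.0, 1000.0):   # K
    dG = dH - T * dS
    print(T, dG / 1000, "spontaneous" if dG < 0 else "non-spontaneous")
# dG changes sign near T = dH/dS, about 464 K, for these numbers.
```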
4.3 Entropy balance equation for open systems

[Figure: an open system with heat Q̇ added, shaft work Ẇshaft performed external to the boundary, and enthalpy streams Hin and Hout crossing the open system boundary. During steady-state continuous operation, an entropy balance applied to an open system accounts for system entropy changes related to heat flow and mass flow across the system boundary.]

In chemical engineering, the principles of thermodynamics are commonly applied to "open systems", i.e. those in which heat, work, and mass flow across the system boundary. Flows of both heat (Q̇) and work, i.e. ẆS (shaft work) and P(dV/dt) (pressure-volume work), across the system boundaries, in general cause changes in the entropy of the system. Transfer as heat entails entropy transfer Q̇/T, where T is the absolute thermodynamic temperature of the system at the point of the heat flow. If there are mass flows across the system boundaries, they also influence the total entropy of the system. This account, in terms of heat and work, is valid only for cases in which the work and heat transfers are by paths physically distinct from the paths of entry and exit of matter from the system.[40][41]

To derive a generalized entropy balance equation, we start with the general balance equation for the change in any extensive quantity Θ in a thermodynamic system, a quantity that may be either conserved, such as energy, or non-conserved, such as entropy. The basic generic balance expression states that dΘ/dt, i.e. the rate of change of Θ in the system, equals the rate at which Θ enters the system at the boundaries, minus the rate at which Θ leaves the system across the system boundaries, plus the rate at which Θ is generated within the system. For an open thermodynamic system in which heat and work are transferred by paths separate from the paths for transfer of matter, using this generic balance equation with respect to the rate of change with time t of the extensive quantity entropy S, the entropy balance equation is:[42][note 2]

dS/dt = Σ(k=1..K) Ṁk Ŝk + Q̇/T + Ṡgen

where

Σ(k=1..K) Ṁk Ŝk = the net rate of entropy flow due to the flows of mass into and out of the system (where Ŝ = entropy per unit mass),
Q̇/T = the rate of entropy flow due to the flow of heat across the system boundary,
Ṡgen = the rate of entropy production within the system. This entropy production arises from processes within the system, including chemical reactions, internal matter diffusion, internal heat transfer, and frictional effects such as viscosity occurring within the system from mechanical work transfer to or from the system.

Note, also, that if there are multiple heat flows, the term Q̇/T is replaced by Σj Q̇j/Tj, where Q̇j is the heat flow and Tj is the temperature at the j-th heat flow port into the system.
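At steady state (dS/dt = 0) the balance can be solved for the production term Ṡgen, which must be non-negative. A minimal sketch with invented stream and heat-flow data:

```python
# Steady state (dS/dt = 0): solve the entropy balance for S_gen.
# Stream data and heat flow are invented, for illustration only.
streams_in  = [(2.0, 1.20)]    # (mass flow kg/s, specific entropy kJ/(kg K))
streams_out = [(2.0, 1.05)]
Q_dot = -350.0                 # kJ/s removed across a boundary at T = 320 K
T_boundary = 320.0

dS_flow = (sum(m * s for m, s in streams_in)
           - sum(m * s for m, s in streams_out))  # net entropy carried by mass
dS_heat = Q_dot / T_boundary                      # entropy carried by heat

S_gen = -(dS_flow + dS_heat)   # from 0 = dS_flow + dS_heat + S_gen
print(S_gen)                   # ~0.79 kJ/(K s); must be >= 0 to be feasible
```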

5 Entropy change formulas for simple processes

For certain simple transformations in systems of constant composition, the entropy changes are given by simple formulas.[43]

5.1 Isothermal expansion or compression of an ideal gas
For the expansion (or compression) of an ideal gas from an initial volume V0 and pressure P0 to a final volume V and pressure P at any constant temperature, the change in entropy is given by:

ΔS = nR ln(V/V0) = −nR ln(P/P0)

Here n is the number of moles of gas and R is the ideal gas constant. These equations also apply for expansion into a finite vacuum or a throttling process, where the temperature, internal energy and enthalpy for an ideal gas remain constant.

5.2 Cooling and heating

For heating or cooling of any system (gas, liquid or solid) at constant pressure from an initial temperature T0 to a final temperature T, the entropy change is

ΔS = nCP ln(T/T0)

provided that the constant-pressure molar heat capacity (or specific heat) CP is constant and that no phase transition occurs in this temperature interval.

Similarly at constant volume, the entropy change is

ΔS = nCv ln(T/T0)

where the constant-volume heat capacity Cv is constant and there is no phase change.

At low temperatures near absolute zero, heat capacities of solids quickly drop off to near zero, so the assumption of constant heat capacity does not apply.[44]

Since entropy is a state function, the entropy change of any process in which temperature and volume both vary is the same as for a path divided into two steps: heating at constant volume and expansion at constant temperature. For an ideal gas, the total entropy change is[45]

ΔS = nCv ln(T/T0) + nR ln(V/V0)

Similarly if the temperature and pressure of an ideal gas both vary,

ΔS = nCP ln(T/T0) − nR ln(P/P0)
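Because entropy is a state function, the combined formula must equal the sum of the two single steps. A quick numerical check, assuming a monatomic ideal gas (Cv = 3R/2) and arbitrary initial and final states:

```python
# The combined formula equals the two-step path: isochoric heating then
# isothermal expansion. Monatomic ideal gas assumed; states are arbitrary.
import math

R = 8.314
n, Cv = 1.0, 1.5 * R    # monatomic ideal gas: Cv = (3/2) R
T0, V0 = 300.0, 0.010   # K, m^3
T1, V1 = 450.0, 0.025

step1 = n * Cv * math.log(T1 / T0)  # heating at constant volume
step2 = n * R * math.log(V1 / V0)   # expansion at constant temperature
combined = n * Cv * math.log(T1 / T0) + n * R * math.log(V1 / V0)

print(step1 + step2)  # ~12.7 J/K
print(combined)       # identical: the entropy change is path-independent
```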
5.3 Phase transitions

Reversible phase transitions occur at constant temperature and pressure. The reversible heat is the enthalpy change for the transition, and the entropy change is the enthalpy change divided by the thermodynamic temperature. For fusion (melting) of a solid to a liquid at the melting point Tm, the entropy of fusion is

ΔSfus = ΔHfus / Tm

Similarly, for vaporization of a liquid to a gas at the boiling point Tb, the entropy of vaporization is

ΔSvap = ΔHvap / Tb
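Applying these formulas to water, with approximate textbook enthalpies of transition:

```python
# dS = dH / T at the transition temperature, with approximate textbook
# enthalpies for water.
dH_fus, T_m = 6.01e3, 273.15    # J/mol, K (melting)
dH_vap, T_b = 40.7e3, 373.15    # J/mol, K (boiling)

print(dH_fus / T_m)   # ~22 J/(mol K), entropy of fusion
print(dH_vap / T_b)   # ~109 J/(mol K), entropy of vaporization is far larger,
                      # reflecting the much greater freedom of the gas phase
```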
6 Approaches to understanding entropy

As a fundamental aspect of thermodynamics and physics, several different approaches to entropy beyond that of Clausius and Boltzmann are valid.

6.1 Standard textbook definitions

The following is a list of additional definitions of entropy from a collection of textbooks:

• a measure of energy dispersal at a specific temperature.
• a measure of disorder in the universe or of the availability of the energy in a system to do work.[46]
• a measure of a system's thermal energy per unit temperature that is unavailable for doing useful work.[47]

In Boltzmann's definition, entropy is a measure of the number of possible microscopic states (or microstates) of a system in thermodynamic equilibrium. Consistent with the Boltzmann definition, the second law of thermodynamics needs to be re-worded such that entropy increases over time, though the underlying principle remains the same.

6.2 Order and disorder

Main article: Entropy (order and disorder)
Entropy has often been loosely associated with the amount of order or disorder, or of chaos, in a thermodynamic system. The traditional qualitative description of entropy is that it refers to changes in the status quo of the system and is a measure of molecular disorder and the amount of wasted energy in a dynamical energy transformation from one state or form to another. In this direction, several recent authors have derived exact entropy formulas to account for and measure disorder and order in atomic and molecular assemblies.[48][49][50] One of the simpler entropy order/disorder formulas is that derived in 1984 by thermodynamic physicist Peter Landsberg, based on a combination of thermodynamics and information theory arguments. He argues that when constraints operate on a system, such that it is prevented from entering one or more of its possible or permitted states, as contrasted with its forbidden states, the measure of the total amount of disorder in the system is given by:[49][50]

Disorder = CD/CI

Similarly, the total amount of order in the system is given by:

Order = 1 − CO/CI

in which CD is the disorder capacity of the system, which is the entropy of the parts contained in the permitted ensemble, CI is the information capacity of the system, an expression similar to Shannon's channel capacity, and CO is the order capacity of the system.[48]
monograph by R. Giles.[59] In the setting of Lieb and Yn-
gvason one starts by picking, for a unit amount of the sub-
6.3 Energy dispersal stance under consideration, two reference states X0 and
X1 such that the latter is adiabatically accessible from the
Main article: Entropy (energy dispersal) former but not vice versa. Dening the entropies of the
reference states to be 0 and 1 respectively the entropy of
The concept of entropy can be described qualitatively as a a state X is dened as the largest number such that X
measure of energy dispersal at a specic temperature.[51] is adiabatically accessible from a composite state consist-
Similar terms have been in use from early in the history of ing of an amount in the state X1 and a complementary
classical thermodynamics, and with the development of amount, (1) , in the state X0 . A simple but important
statistical thermodynamics and quantum theory, entropy result within this setting is that entropy is uniquely deter-
changes have been described in terms of the mixing or mined, apart from a choice of unit and an additive con-
spreading of the total energy of each constituent of a stant for each chemical element, by the following prop-
system over its particular quantized energy levels. erties: It is monotonic with respect to the relation of adi-
abatic accessibility, additive on composite systems, and
Ambiguities in the terms disorder and chaos, which usu- extensive under scaling.
ally have meanings directly opposed to equilibrium, con-
tribute to widespread confusion and hamper comprehen-
sion of entropy for most students.[52] As the second law 6.6 Entropy in quantum mechanics
of thermodynamics shows, in an isolated system inter-
nal portions at dierent temperatures tend to adjust to Main article: von Neumann entropy
a single uniform temperature and thus produce equilib-
rium. A recently developed educational approach avoids
In quantum statistical mechanics, the concept of entropy
ambiguous terms and describes such spreading out of en-
was developed by John von Neumann and is generally re-
ergy as dispersal, which leads to loss of the dierentials
ferred to as "von Neumann entropy",
required for work even though the total energy remains
constant in accordance with the rst law of thermody-
namics[53] (compare discussion in next section). Physical
S = kB Tr( log )
chemist Peter Atkins, for example, who previously wrote
of dispersal leading to a disordered state, now writes that where is the density matrix and Tr is the trace operator.
This upholds the correspondence principle, because in the classical limit, when the phases between the basis states used for the classical probabilities are purely random, this expression is equivalent to the familiar classical definition of entropy,

S = −kB Σᵢ pᵢ log pᵢ

i.e. in such a basis the density matrix is diagonal.

Von Neumann established a rigorous mathematical framework for quantum mechanics with his work Mathematische Grundlagen der Quantenmechanik. He provided in this work a theory of measurement, where the usual notion of wave function collapse is described as an irreversible process (the so-called von Neumann or projective measurement). Using this concept, in conjunction with the density matrix, he extended the classical concept of entropy into the quantum domain.
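A minimal numerical sketch of the von Neumann formula for a single qubit: diagonalizing ρ reduces the trace to a Gibbs-type sum over eigenvalues (the example states are illustrative):

```python
# Von Neumann entropy S = -k_B Tr(rho log rho) for a single qubit, computed
# from the eigenvalues of the density matrix.
import numpy as np

k_B = 1.380649e-23  # J/K

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # zero eigenvalues contribute nothing
    return -k_B * float(np.sum(evals * np.log(evals)))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])  # maximally mixed state

print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # k_B * ln 2, about 9.57e-24 J/K
```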
6.7 Information theory

I thought of calling it information, but the word was W

overly used, so I decided to call it uncertainty. [...] H = p log(p) = log(W )


Von Neumann told me, You should call it entropy, i=1

for two reasons. In the rst place your uncertainty and if entropy is measured in units of k per nat, then the
function has been used in statistical mechanics under entropy is given[67] by:
that name, so it already has a name. In the second place,
and more important, nobody knows what entropy re-
ally is, so in a debate you will always have the advantage.
H = k log(W )

Conversation between Claude Shannon and John von which is the famous Boltzmann entropy formula when k
Neumann regarding what name to give to the attenuation is Boltzmanns constant, which may be interpreted as the
in phone-line signals[60] thermodynamic entropy per nat. There are many ways of
Main articles: Entropy (information theory), Entropy in demonstrating the equivalence of information entropy
thermodynamics and information theory, and Entropic and physics entropy, that is, the equivalence of Shan-
uncertainty non entropy and Boltzmann entropy. Nevertheless,
some authors argue for dropping the word entropy for
the H function of information theory and using Shannons
When viewed in terms of information theory, the entropy other term uncertainty instead.[68]
state function is simply the amount of information (in the
Shannon sense) that would be needed to specify the full
microstate of the system. This is left unspecied by the
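A brief sketch of the correspondence (toy message distributions, illustrative only): Shannon entropy in bits counts yes/no questions, and rescaling by kB ln 2 expresses the same count in thermodynamic units.

```python
# Shannon entropy in bits; for W equally likely messages it is log2(W),
# the number of yes/no questions needed.
import math

def shannon_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_bits([1 / 8] * 8))                # 3.0 bits = log2(8)
print(shannon_bits([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits: less uncertainty

k_B = 1.380649e-23
print(k_B * math.log(2) * shannon_bits([1 / 8] * 8))  # same count in J/K
```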
7 Interdisciplinary applications of entropy

Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study, including information theory, psychodynamics, thermoeconomics/ecological economics, and evolution.[69][70][71][72][73] For instance, an entropic argument has recently been proposed for explaining the preference of cave spiders in choosing a suitable area for laying their eggs.[74]
7.1 Thermodynamic and statistical mechanics concepts

• Entropy unit: a non-S.I. unit of thermodynamic entropy, usually denoted "e.u." and equal to one calorie per kelvin per mole, or 4.184 joules per kelvin per mole.[75]
• Gibbs entropy: the usual statistical mechanical entropy of a thermodynamic system.
• Boltzmann entropy: a type of Gibbs entropy, which neglects internal statistical correlations in the overall particle distribution.
• Tsallis entropy: a generalization of the standard Boltzmann–Gibbs entropy.
• Standard molar entropy: the entropy content of one mole of substance, under conditions of standard temperature and pressure.
• Residual entropy: the entropy present after a substance is cooled arbitrarily close to absolute zero.
• Entropy of mixing: the change in the entropy when two different chemical substances or components are mixed.
• Loop entropy: the entropy lost upon bringing together two residues of a polymer within a prescribed distance.
• Conformational entropy: the entropy associated with the physical arrangement of a polymer chain that assumes a compact or globular state in solution.
• Entropic force: a microscopic force or reaction tendency related to system organization changes, molecular frictional considerations, and statistical variations.
• Free entropy: an entropic thermodynamic potential analogous to the free energy.
• Entropic explosion: an explosion in which the reactants undergo a large change in volume without releasing a large amount of heat.
• Entropy change: a change in entropy dS between two equilibrium states is given by the heat transferred δQrev divided by the absolute temperature T of the system in this interval.
• Sackur–Tetrode entropy: the entropy of a monatomic classical ideal gas determined via quantum considerations.

7.2 The arrow of time

Main article: Entropy (arrow of time)

Entropy is the only quantity in the physical sciences that seems to imply a particular direction of progress, sometimes called an arrow of time. As time progresses, the second law of thermodynamics states that the entropy of an isolated system never decreases. Hence, from this perspective, entropy measurement is thought of as a kind of clock.

7.3 Cosmology

Main article: Heat death of the universe

Since a finite universe is an isolated system, the second law of thermodynamics states that its total entropy is constantly increasing. It has been speculated, since the 19th century, that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.

If the universe can be considered to have generally increasing entropy, then, as Roger Penrose has pointed out, gravity plays an important role in the increase because gravity causes dispersed matter to accumulate into stars, which collapse eventually into black holes. The entropy of a black hole is proportional to the surface area of the black hole's event horizon.[76] Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes, if they are totally effective matter and energy traps. However, the escape of energy from black holes might be possible due to quantum activity; see Hawking radiation. In 2014 Hawking changed his stance on some details, in a paper which largely redefined the event horizons of black holes, positing that black holes do not exist.[77]

The role of entropy in cosmology remains a controversial subject since the time of Ludwig Boltzmann. Recent work has cast some doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly, moving the universe further from the heat death with time, not closer.[78][79][80] This results in an "entropy gap" pushing the system further away from the posited heat death equilibrium.[81] Other complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.[82]
Current theories suggest the entropy gap to have been originally opened up by the early rapid exponential expansion of the universe.[83]

7.4 Economics

See also: Nicholas Georgescu-Roegen § The relevance of thermodynamics to economics, and Ecological economics § Methodology

Romanian-American economist Nicholas Georgescu-Roegen, a progenitor in economics and a paradigm founder of ecological economics, made extensive use of the entropy concept in his magnum opus on The Entropy Law and the Economic Process.[84] Due to Georgescu-Roegen's work, the laws of thermodynamics now form an integral part of the ecological economics school.[85]:204f[86]:29–35 Although his work was blemished somewhat by mistakes, a full chapter on the economics of Georgescu-Roegen has approvingly been included in one elementary physics textbook on the historical development of thermodynamics.[87]:95–112

In economics, Georgescu-Roegen's work has generated the term 'entropy pessimism'.[88]:116 Since the 1990s, leading ecological economist and steady-state theorist Herman Daly, a student of Georgescu-Roegen, has been the economics profession's most influential proponent of the entropy pessimism position.[89]:545f

8 See also

• Autocatalytic reactions and order creation
• Brownian ratchet
• Clausius–Duhem inequality
• Configuration entropy
• Departure function
• Enthalpy
• Entropic force
• Entropic value at risk
• Entropy (information theory)
• Entropy (computing)
• Entropy and life
• Entropy (order and disorder)
• Entropy rate
• Entropy production
• Geometrical frustration
• Harmonic entropy
• Heat death of the universe
• Laws of thermodynamics
• Multiplicity function
• Negentropy (negative entropy)
• Orders of magnitude (entropy)
• Principle of maximum entropy
• Stirling's formula
• Thermodynamic databases for pure substances
• Thermodynamic potential
• Thermodynamic equilibrium
• Wavelet entropy
• Phase space

9 Notes

[1] A machine in this context includes engineered devices as well as biological organisms.

[2] The overdots represent derivatives of the quantities with respect to time.

10 References

[1] "Carnot, Sadi (1796–1832)". Wolfram Research. 2007. Retrieved 2010-02-24.

[2] McCulloch, Richard S. (1876). Treatise on the Mechanical Theory of Heat and its Applications to the Steam-Engine, etc. D. Van Nostrand.

[3] Clausius, Rudolf (1850). "On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat". Poggendorff's Annalen der Physik. LXXIX (Dover reprint). ISBN 0-486-59065-8.

[4] The Scientific Papers of J. Willard Gibbs in Two Volumes. 1. Longmans, Green, and Co. 1906. p. 11. Retrieved 2011-02-26.

[5] J. A. McGovern, "2.5 Entropy". Archived from the original on 2012-09-23. Retrieved 2013-02-05.

[6] "6.5 Irreversibility, Entropy Changes, and 'Lost Work'". web.mit.edu. Retrieved 21 May 2016.

[7] Lower, Stephen. "What is entropy?". www.chem1.com. Retrieved 21 May 2016.

[8] Lavenda, Bernard H. (2010). "2.3.4". A New Perspective on Thermodynamics (Online ed.). New York: Springer. ISBN 978-1-4419-1430-9.
[9] Carnot, Sadi (1986). Fox, Robert, ed. Reflexions on the motive power of fire. New York, NY: Lilian Barber Press. p. 26. ISBN 978-0-936508-16-0.
[10] Truesdell, C. (1980). The tragicomical history of thermodynamics 1822–1854. New York: Springer. pp. 78–85. ISBN 978-0-387-90403-0.
[11] Clerk Maxwell, James (2001). Pesic, Peter, ed. Theory of heat. Mineola: Dover Publications. pp. 115–158. ISBN 978-0-486-41735-6.
[12] Rudolf Clausius (1867). The Mechanical Theory of Heat: With Its Applications to the Steam-engine and to the Physical Properties of Bodies. J. Van Voorst. p. 28. ISBN 978-1-4981-6733-8.
[13] Clausius, Rudolf (1865). Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie: vorgetragen in der naturforsch. Gesellschaft den 24. April 1865. p. 46.
[14] Atkins, Peter; Julio De Paula (2006). Physical Chemistry, 8th ed. Oxford University Press. p. 79. ISBN 0-19-870072-5.
[15] Engel, Thomas; Philip Reid (2006). Physical Chemistry. Pearson Benjamin Cummings. p. 86. ISBN 0-8053-3842-X.
[16] Licker, Mark D. (2004). McGraw-Hill concise encyclopedia of chemistry. New York: McGraw-Hill Professional. ISBN 978-0-07-143953-4.
[17] Sethna, James P. (2006). Statistical mechanics: entropy, order parameters, and complexity ([Online-Ausg.] ed.). Oxford: Oxford University Press. p. 78. ISBN 978-0-19-856677-9.
[18] Clark, John O.E. (2004). The essential dictionary of science. New York: Barnes & Noble. ISBN 978-0-7607-4616-5.
[19] Frigg, R. and Werndl, C. "Entropy – A Guide for the Perplexed". In Probabilities in Physics; Beisbart C. and Hartmann, S., Eds; Oxford University Press, Oxford, 2010.
[20] Schroeder, Daniel V. (2000). An introduction to thermal physics ([Nachdr.] ed.). San Francisco, CA [u.a.]: Addison Wesley. p. 57. ISBN 978-0-201-38027-9.
[21] "EntropyOrderParametersComplexity.pdf www.physics.cornell.edu" (PDF). Retrieved 2012-08-17.
[22] Jaynes, E. T., "The Gibbs Paradox". In Maximum Entropy and Bayesian Methods; Smith, C. R.; Erickson, G. J.; Neudorfer, P. O., Eds; Kluwer Academic: Dordrecht, 1992, pp. 1–22 (PDF). Retrieved 2012-08-17.
[23] Sandler, Stanley I. (2006). Chemical, biochemical, and engineering thermodynamics (4th ed.). New York: John Wiley & Sons. p. 91. ISBN 978-0-471-66174-0.
[24] McQuarrie, Donald A.; Simon, John D. (1997). Physical chemistry: a molecular approach (Rev. ed.). Sausalito, Calif.: Univ. Science Books. p. 817. ISBN 978-0-935702-99-6.
[25] Haynie, Donald, T. (2001). Biological Thermodynamics. Cambridge University Press. ISBN 0-521-79165-0.
[26] Daintith, John (2005). A dictionary of science (5th ed.). Oxford: Oxford University Press. ISBN 978-0-19-280641-3.
[27] de Rosnay, Joel (1979). The Macroscope – a New World View (written by an M.I.T.-trained biochemist). Harper & Row, Publishers. ISBN 0-06-011029-5.
[28] J. A. McGovern, "Heat Capacities". Archived from the original on 2012-08-19. Retrieved 2013-01-27.
[29] Ben-Naim, Arieh (21 September 2007). "On the So-Called Gibbs Paradox, and on the Real Paradox" (PDF). Entropy. 9 (3): 132–136. doi:10.3390/e9030133.
[30] Daintith, John (2005). Oxford Dictionary of Physics. Oxford University Press. ISBN 0-19-280628-9.
[31] Saha, Arnab; Lahiri, Sourabh; Jayannavar, A. M. (2009). "Entropy production theorems and some consequences". Physical Review E. 80: 1–10. doi:10.1103/PhysRevE.80.011117.
[32] Martyushev, L. M.; Seleznev, V. D. (2014). "The restrictions of the maximum entropy production principle". Physica A: Statistical Mechanics and its Applications. 410: 17–21. arXiv:1311.2068. doi:10.1016/j.physa.2014.05.014.
[33] Ziegler, H. (1983). An Introduction to Thermomechanics. North Holland, Amsterdam.
[34] Onsager, Lars (1931). "Reciprocal Relations in Irreversible Processes". Phys. Rev. 37: 405. doi:10.1103/PhysRev.37.405.
[35] Kleidon, A.; et al. (2005). Non-equilibrium Thermodynamics and the Production of Entropy. Heidelberg: Springer.
[36] Belkin, Andrey; et al. (2015). "Self-assembled wiggling nano-structures and the principle of maximum entropy production". Scientific Reports. 5. doi:10.1038/srep08323.
[37] Moore, J. W.; C. L. Stanistski; P. C. Jurs (2005). Chemistry, The Molecular Science. Brooks Cole. ISBN 0-534-42201-2.
[38] Jungermann, A.H. (2006). "Entropy and the Shelf Model: A Quantum Physical Approach to a Physical Property". Journal of Chemical Education. 83 (11): 1686–1694. Bibcode:2006JChEd..83.1686J. doi:10.1021/ed083p1686.
[39] Levine, I. N. (2002). Physical Chemistry, 5th ed. McGraw-Hill. ISBN 0-07-231808-2.
[40] Late Nobel Laureate Max Born (8 August 2015). Natural Philosophy of Cause and Chance. BiblioLife. pp. 44, 146–147. ISBN 978-1-298-49740-6.
[41] Haase, R. (1971). Thermodynamics. New York: Academic Press. pp. 1–97. ISBN 0-12-245601-7.
[42] Sandler, Stanley, I. (1989). Chemical and Engineering Thermodynamics. John Wiley & Sons. ISBN 0-471-83050-X.
[43] "GRC.nasa.gov". GRC.nasa.gov. 2000-03-27. Retrieved 2012-08-17.
[44] Franzen, Stefan. "Third Law" (PDF). ncsu.ed.
[45] "GRC.nasa.gov". GRC.nasa.gov. 2008-07-11. Retrieved 2012-08-17.
[46] Gribbin, John (1999). Gribbin, Mary, ed. Q is for quantum: an encyclopedia of particle physics. New York: Free Press. ISBN 0-684-85578-X.
[47] "Entropy: Definition and Equation". Encyclopædia Britannica. Retrieved 22 May 2016.
[48] Brooks, Daniel R.; Wiley, E. O. (1988). Evolution as entropy: toward a unified theory of biology (2nd ed.). Chicago [etc.]: University of Chicago Press. ISBN 0-226-07574-5.
[49] Landsberg, P.T. (1984). "Is Equilibrium always an Entropy Maximum?". J. Stat. Physics. 35: 159–169. Bibcode:1984JSP....35..159L. doi:10.1007/bf01017372.
[50] Landsberg, P.T. (1984). "Can Entropy and 'Order' Increase Together?". Physics Letters. 102A (4): 171–173. Bibcode:1984PhLA..102..171L. doi:10.1016/0375-9601(84)90934-4.
[51] Lambert, Frank L. "A Student's Approach to the Second Law and Entropy". entropysite.oxy.edu. Archived from the original on 17 July 2009. Retrieved 22 May 2016.
[52] Watson, J.R.; Carson, E.M. (May 2002). "Undergraduate students' understandings of entropy and Gibbs free energy" (PDF). University Chemistry Education. 6 (1): 4. ISSN 1369-5614.
[53] Lambert, Frank L. (February 2002). "Disorder – A Cracked Crutch for Supporting Entropy Discussions". Journal of Chemical Education. 79 (2): 187. doi:10.1021/ed079p187.
[54] Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X.
[55] Sandra Saary (Head of Science, Latifa Girls' School, Dubai) (23 February 1993). "Book Review of 'A Science Miscellany'". Khaleej Times. Galadari Press, UAE: XI.
[56] Lathia, R; Agrawal, T; Parmar, V; Dobariya, K; Patel, A (2015-10-20). "Heat Death (The Ultimate Fate of the Universe)". doi:10.13140/rg.2.1.4158.2485.
[57] Lieb, Elliott H.; Yngvason, Jakob (March 1999). "The physics and mathematics of the second law of thermodynamics". Physics Reports. 310 (1): 1–96. doi:10.1016/S0370-1573(98)00082-9.
[58] Carathéodory, C. (September 1909). "Untersuchungen über die Grundlagen der Thermodynamik". Mathematische Annalen (in German). 67 (3): 355–386. doi:10.1007/BF01450409.
[59] R. Giles (22 January 2016). Mathematical Foundations of Thermodynamics: International Series of Monographs on Pure and Applied Mathematics. Elsevier Science. ISBN 978-1-4831-8491-3.
[60] M. Tribus, E.C. McIrvine, "Energy and information", Scientific American, 224 (September 1971), pp. 178–184.
[61] Balian, Roger (2004). "Entropy, a Protean concept". In Dalibard, Jean. Poincaré Seminar 2003: Bose-Einstein condensation – entropy. Basel: Birkhäuser. pp. 119–144. ISBN 978-3-7643-7116-6.
[62] Brillouin, Leon (1956). Science and Information Theory. ISBN 0-486-43918-6.
[63] Georgescu-Roegen, Nicholas (1971). The Entropy Law and the Economic Process. Harvard University Press. ISBN 0-674-25781-2.
[64] Chen, Jing (2005). The Physical Foundation of Economics – an Analytical Thermodynamic Theory. World Scientific. ISBN 981-256-323-7.
[65] Kalinin, M.I.; Kononogov, S.A. (2005). "Boltzmann's constant". Measurement Techniques. 48 (7): 632–636. doi:10.1007/s11018-005-0195-9.
[66] Ben-Naim, Arieh (2008). Entropy demystified – the second law reduced to plain common sense (Expanded ed.). Singapore: World Scientific. ISBN 9789812832269.
[67] "Edwin T. Jaynes – Bibliography". Bayes.wustl.edu. 1998-03-02. Retrieved 2009-12-06.
[68] Schneider, Tom, DELILA system (Deoxyribonucleic acid Library Language), (Information Theory Analysis of binding sites), Laboratory of Mathematical Biology, National Cancer Institute, FCRDC Bldg. 469. Rm 144, P.O. Box. B Frederick, MD 21702-1201, USA.
[69] Brooks, Daniel, R.; Wiley, E.O. (1988). Evolution as Entropy – Towards a Unified Theory of Biology. University of Chicago Press. ISBN 0-226-07574-5.
[70] Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 981-238-399-9.
[71] Yockey, Hubert, P. (2005). Information Theory, Evolution, and the Origin of Life. Cambridge University Press. ISBN 0-521-80293-8.
[72] Chiavazzo, Eliodoro; Fasano, Matteo; Asinari, Pietro (2013). "Inference of analytical thermodynamic models for biological networks". Physica A: Statistical Mechanics and its Applications. 392 (5): 1122–1132. Bibcode:2013PhyA..392.1122C. doi:10.1016/j.physa.2012.11.030.
[73] Chen, Jing (2015). The Unity of Science and Economics: A New Foundation of Economic Theory. Springer. https://www.springer.com/us/book/9781493934645.
[74] Chiavazzo, Eliodoro; Isaia, Marco; Mammola, Stefano; Lepore, Emiliano; Ventola, Luigi; Asinari, Pietro; Pugno, Nicola Maria (2015). "Cave spiders choose optimal environmental factors with respect to the generated entropy when laying their cocoon". Scientific Reports. 5: 7611. Bibcode:2015NatSR...5E7611C. PMC 5154591. PMID 25556697. doi:10.1038/srep07611.
[75] IUPAC, Compendium of Chemical Terminology, 2nd ed. (the "Gold Book") (1997). Online corrected version: (2006) "Entropy unit".
[76] von Baeyer, Christian, H. (2003). Information – the New Language of Science. Harvard University Press. ISBN 0-674-01387-5. Srednicki, M (August 1993). "Entropy and area". Phys. Rev. Lett. 71 (5): 666–669. Bibcode:1993PhRvL..71..666S. PMID 10055336. arXiv:hep-th/9303048. doi:10.1103/PhysRevLett.71.666. Callaway, D. J. E. (April 1996). "Surface tension, hydrophobicity, and black holes: The entropic connection". Phys. Rev. E. 53 (4): 3738–3744. Bibcode:1996PhRvE..53.3738C. PMID 9964684. arXiv:cond-mat/9601111. doi:10.1103/PhysRevE.53.3738.
[77] Buchan, Lizzy. "Black holes do not exist, says Stephen Hawking". Cambridge News. Retrieved 27 January 2014.
[78] Layzer, David (1988). Growth of Order in the Universe. MIT Press.
[79] Chaisson, Eric J. (2001). Cosmic Evolution: The Rise of Complexity in Nature. Harvard University Press. ISBN 0-674-00342-X.
[80] Lineweaver, Charles H.; Davies, Paul C. W.; Ruse, Michael, eds. (2013). Complexity and the Arrow of Time. Cambridge University Press. ISBN 978-1-107-02725-1.
[81] Stenger, Victor J. (2007). God: The Failed Hypothesis. Prometheus Books. ISBN 1-59102-481-1.
[82] Benjamin Gal-Or (1987). Cosmology, Physics and Philosophy. Springer Verlag. ISBN 0-387-96526-2.
[83] Albrecht, Andreas (2004). "Cosmic inflation and the arrow of time" (PDF). In Barrow, John D.; Davies, Paul C.W.; Harper, Charles L., Jr. Science and Ultimate Reality: From Quantum to Cosmos. Cambridge, UK: Cambridge University Press. arXiv:astro-ph/0210527. Retrieved 28 June 2017 (in honor of John Wheeler's 90th birthday).
[84] Georgescu-Roegen, Nicholas (1971). The Entropy Law and the Economic Process (Full book accessible in three parts at SlideShare). Cambridge, Massachusetts: Harvard University Press. ISBN 0-674-25780-4.
[85] Cleveland, Cutler J.; Ruth, Matthias (1997). "When, where, and by how much do biophysical limits constrain the economic process? A survey of Nicholas Georgescu-Roegen's contribution to ecological economics" (PDF). Ecological Economics. Amsterdam: Elsevier. 22 (3): 203–223. doi:10.1016/s0921-8009(97)00079-7. Archived from the original (PDF) on 2015-12-08.
[86] Daly, Herman E.; Farley, Joshua (2011). Ecological Economics. Principles and Applications (PDF contains full book) (2nd ed.). Washington: Island Press. ISBN 978-1-59726-681-9.
[87] Schmitz, John E.J. (2007). The Second Law of Life: Energy, Technology, and the Future of Earth As We Know It (Link to the author's science blog, based on his textbook). Norwich: William Andrew Publishing. ISBN 0-8155-1537-5.
[88] Ayres, Robert U. (2007). "On the practical limits to substitution" (PDF). Ecological Economics. Amsterdam: Elsevier. 61: 115–128. doi:10.1016/j.ecolecon.2006.02.011.
[89] Kerschner, Christian (2010). "Economic de-growth vs. steady-state economy" (PDF). Journal of Cleaner Production. Amsterdam: Elsevier. 18 (6): 544–551. doi:10.1016/j.jclepro.2009.10.019.

11 Further reading

• Adam, Gerhard; Otto Hittmair (1992). Wärmetheorie. Vieweg, Braunschweig. ISBN 3-528-33311-1.
• Atkins, Peter; Julio De Paula (2006). Physical Chemistry (8th ed.). Oxford University Press. ISBN 0-19-870072-5.
• Baierlein, Ralph (2003). Thermal Physics. Cambridge University Press. ISBN 0-521-65838-1.
• Ben-Naim, Arieh (2007). Entropy Demystified. World Scientific. ISBN 981-270-055-2.
• Callen, Herbert, B (2001). Thermodynamics and an Introduction to Thermostatistics (2nd ed.). John Wiley and Sons. ISBN 0-471-86256-8.
• Chang, Raymond (1998). Chemistry (6th ed.). New York: McGraw Hill. ISBN 0-07-115221-0.
• Cutnell, John, D.; Johnson, Kenneth, J. (1998). Physics (4th ed.). John Wiley and Sons, Inc. ISBN 0-471-19113-2.
• Dugdale, J. S. (1996). Entropy and its Physical Meaning (2nd ed.). Taylor and Francis (UK); CRC (US). ISBN 0-7484-0569-0.
• Fermi, Enrico (1937). Thermodynamics. Prentice Hall. ISBN 0-486-60361-X.
• Goldstein, Martin; Inge, F (1993). The Refrigerator and the Universe. Harvard University Press. ISBN 0-674-75325-9.
• Gyftopoulos, E.P.; G.P. Beretta (2010). Thermodynamics. Foundations and Applications. Dover. ISBN 0-486-43932-1.
• Haddad, Wassim M.; Chellaboina, VijaySekhar; Nersesov, Sergey G. (2005). Thermodynamics – A Dynamical Systems Approach. Princeton University Press. ISBN 0-691-12327-6.
• Kroemer, Herbert; Charles Kittel (1980). Thermal Physics (2nd ed.). W. H. Freeman Company. ISBN 0-7167-1088-9.
• Lambert, Frank L.; entropysite.oxy.edu
• Müller-Kirsten, Harald J.W. (2013). Basics of Statistical Physics (2nd ed.). Singapore: World Scientific. ISBN 978-981-4449-53-3.
• Penrose, Roger (2005). The Road to Reality: A Complete Guide to the Laws of the Universe. New York: A. A. Knopf. ISBN 0-679-45443-8.
• Reif, F. (1965). Fundamentals of statistical and thermal physics. McGraw-Hill. ISBN 0-07-051800-9.
• Schroeder, Daniel V. (2000). Introduction to Thermal Physics. New York: Addison Wesley Longman. ISBN 0-201-38027-7.
• Serway, Raymond, A. (1992). Physics for Scientists and Engineers. Saunders Golden Sunburst Series. ISBN 0-03-096026-6.
• Spirax-Sarco Limited, Entropy – A Basic Understanding – A primer on entropy tables for steam engineering
• von Baeyer, Hans Christian (1998). Maxwell's Demon: Why Warmth Disperses and Time Passes. Random House. ISBN 0-679-43342-2.
• Entropy for beginners – a wikibook
• An Intuitive Guide to the Concept of Entropy Arising in Various Sectors of Science – a wikibook
• Moriarty, Philip; Merrifield, Michael (2009). S Entropy. Sixty Symbols. Brady Haran for the University of Nottingham.
• Entropy – Scholarpedia

12 External links

• Entropy and the Second Law of Thermodynamics – an A-level physics lecture with detailed derivation of entropy based on the Carnot cycle
• Khan Academy: entropy lectures, part of the Chemistry playlist
  • Proof: S (or Entropy) is a valid state variable
  • Thermodynamic Entropy Definition Clarification
  • Reconciling Thermodynamic and State Definitions of Entropy
  • Entropy Intuition
  • More on Entropy
• The Second Law of Thermodynamics and Entropy – Yale OYC lecture, part of Fundamentals of Physics I (PHYS 200)
• Entropy and the Clausius inequality – MIT OCW lecture, part of 5.60 Thermodynamics & Kinetics, Spring 2008
• The Discovery of Entropy – by Adam Shulman. Hour-long video, January 2013.