
What is the Second Law of Thermodynamics?

The laws of thermodynamics describe the relationships between thermal energy, or heat, and other forms of energy, and how energy affects matter. The First Law of Thermodynamics states that energy cannot be created or destroyed; the total quantity of energy in the universe stays the same. The Second Law of Thermodynamics is about the quality of energy. It states that as energy is transferred or transformed, more and more of it is wasted. The Second Law also states that there is a natural tendency of any isolated system to degenerate into a more disordered state.
Saibal Mitra, a professor of physics at Missouri State University, finds the Second Law to be the most interesting of the four laws of thermodynamics. "There are a number of ways to state the Second Law," he said. "At a very microscopic level, it simply says that if you have a system that is isolated, any natural process in that system progresses in the direction of increasing disorder, or entropy, of the system."
Mitra explained that all processes result in an increase in entropy. Even when order is
increased in a specific location, for example by the self-assembly of molecules to form a
living organism, when you take the entire system including the environment into
account, there is always a net increase in entropy. In another example, crystals can
form from a salt solution as the water is evaporated. Crystals are more orderly than salt
molecules in solution; however, vaporized water is much more disorderly than liquid
water. The process taken as a whole results in a net increase in disorder.
History
In his book, "A New Kind of Science," Stephen Wolfram wrote, "Around 1850 Rudolf Clausius and William Thomson (Lord Kelvin) stated that heat does not spontaneously flow from a colder body to a hotter body." This became the basis for the Second Law.
Subsequent works by Daniel Bernoulli, James Clerk Maxwell, and Ludwig
Boltzmann led to the development of the kinetic theory of gases, in which a gas is
recognized as a cloud of molecules in motion that can be treated statistically. This
statistical approach allows for precise calculation of temperature, pressure and volume
according to the ideal gas law.
This approach also led to the conclusion that while collisions between individual molecules are completely reversible, i.e., they work the same when played forward or backward, for a large quantity of gas the speeds of individual molecules tend over time to form a normal or Gaussian distribution, sometimes depicted as a bell curve, around the average speed. The result of this is that when hot gas and cold gas are placed together in a container, you eventually end up with warm gas. However, the warm gas will never spontaneously separate itself into hot and cold gas, meaning that the process of mixing hot and cold gases is irreversible. This has often been summarized as, "You can't unscramble an egg." According to Wolfram, Boltzmann realized around 1876 that the reason for this is that there must be many more disordered states for a system than there are ordered states; therefore, random interactions will inevitably lead to greater disorder.
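As a rough illustration of that statistical argument (not from the article), the following Python sketch lets two populations of simulated molecules, one "hot" and one "cold," exchange energy through random pairwise collisions. Each exchange conserves the pair's total energy, yet the two halves drift irreversibly toward the same average; the numbers and the exchange rule are arbitrary toy choices.

```python
import random
import statistics

# Toy model: "hot" and "cold" molecules exchange kinetic energy at random.
# Every exchange conserves the pair's total energy, yet the two populations
# end up sharing the same average; they never un-mix.
random.seed(0)
hot = [random.gauss(6.0, 1.0) ** 2 for _ in range(5000)]   # arbitrary energy units
cold = [random.gauss(2.0, 1.0) ** 2 for _ in range(5000)]
gas = hot + cold

for _ in range(200_000):
    i, j = random.randrange(len(gas)), random.randrange(len(gas))
    total = gas[i] + gas[j]
    split = random.random()                 # redistribute the pair's energy randomly
    gas[i], gas[j] = split * total, (1 - split) * total

print("mean energy of originally hot half :", round(statistics.mean(gas[:5000]), 2))
print("mean energy of originally cold half:", round(statistics.mean(gas[5000:]), 2))
```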
Work and energy
One thing the Second Law explains is that it is impossible to convert heat energy to
mechanical energy with 100 percent efficiency. After the process of heating a gas to
increase its pressure to drive a piston, there is always some leftover heat in the gas that
cannot be used to do any additional work. This waste heat must be discarded by
transferring it to a heat sink. In the case of a car engine, this is done by exhausting the
spent fuel and air mixture to the atmosphere. Additionally, any device with movable
parts produces friction that converts mechanical energy to heat that is generally
unusable and must be removed from the system by transferring it to a heat sink. This is
why claims for perpetual motion machines are summarily rejected by the U.S. Patent
Office.
When a hot and a cold body are brought into contact with each other, heat energy will
flow from the hot body to the cold body until they reach thermal equilibrium, i.e., the
same temperature. However, the heat will never move back the other way; the
difference in the temperatures of the two bodies will never spontaneously increase.
Moving heat from a cold body to a hot body requires work to be done by an external
energy source such as a heat pump.
"The most efficient engines we build right now are large gas turbines," said David McKee, a professor of physics at Missouri State University. "They burn natural gas or other gaseous fuels at very high temperature, over 2,000 degrees C [3,600 F], and the exhaust coming out is just a stiff, warm breeze. Nobody tries to extract energy from the waste heat, because there's just not that much there."

The arrow of time


The Second Law indicates that thermodynamic processes, i.e., processes that involve
the transfer or conversion of heat energy, are irreversible because they all result in an
increase in entropy. Perhaps one of the most consequential implications of the Second
Law, according to Mitra, is that it gives us the thermodynamic arrow of time.
In theory, some interactions, such as collisions of rigid bodies or certain chemical
reactions, look the same whether they are run forward or backward. In practice,
however, all exchanges of energy are subject to inefficiencies, such as friction and
radiative heat loss, which increase the entropy of the system being observed.
Therefore, because there is no such thing as a perfectly reversible process, if someone
asks what is the direction of time, we can answer with confidence that time always flows
in the direction of increasing entropy.
The fate of the universe
The Second Law also predicts the end of the universe, according to Boston University.
"It implies that the universe will end in a heat death in which everything is at the same
temperature. This is the ultimate level of disorder; if everything is at the same
temperature, no work can be done, and all the energy will end up as the random motion
of atoms and molecules.
In the far distant future, stars will have used up all of their nuclear fuel ending up
as stellar remnants, such as white dwarfs, neutron stars or black holes, according to
Margaret Murray Hanson, a physics professor at the University of Cincinnati. They will
eventually evaporate into protons, electrons, photons and neutrinos, ultimately reaching
thermal equilibrium with the rest of the Universe. Fortunately, John Baez, a
mathematical physicist at the University of California Riverside, predicts that
this process of cooling down could take as long as 10^(10^26) years (a 1 followed by 10^26, or 100 septillion, zeros), with the temperature dropping to around 10^-30 K (10^-30 degrees C above absolute zero).
Second Law of Thermodynamics
Second Law of Thermodynamics - The Laws of Heat Power
The Second Law of Thermodynamics is one of three Laws of Thermodynamics. The term "thermodynamics" comes from two root words: "thermo," meaning heat, and "dynamic," meaning power. Thus, the Laws of Thermodynamics are the Laws of "Heat Power." As far as we can tell, these Laws are absolute. All things in the observable universe are affected by and obey the Laws of Thermodynamics.
The First Law of Thermodynamics, commonly known as the Law of Conservation of
Matter, states that matter/energy cannot be created nor can it be destroyed. The
quantity of matter/energy remains the same. It can change from solid to liquid to gas to
plasma and back again, but the total amount of matter/energy in the universe remains
constant.
Second Law of Thermodynamics - Increased Entropy
The Second Law of Thermodynamics is commonly known as the Law of Increased
Entropy. While quantity remains the same (First Law), the quality of matter/energy
deteriorates gradually over time. How so? Usable energy is inevitably used for
productivity, growth and repair. In the process, usable energy is converted into unusable
energy. Thus, usable energy is irretrievably lost in the form of unusable energy.
"Entropy" is defined as a measure of unusable energy within a closed or isolated
system (the universe for example). As usable energy decreases and unusable energy
increases, "entropy" increases. Entropy is also a gauge of randomness or chaos within
a closed system. As usable energy is irretrievably lost, disorganization, randomness and
chaos increase.
Second Law of Thermodynamics - In the Beginning...
The implications of the Second Law of Thermodynamics are considerable. The universe
is constantly losing usable energy and never gaining. We logically conclude the
universe is not eternal. The universe had a finite beginning -- the moment at which it
was at "zero entropy" (its most ordered possible state). Like a wind-up clock, the
universe is winding down, as if at one point it was fully wound up and has been winding
down ever since. The question is who wound up the clock?
The theological implications are obvious. NASA Astronomer Robert Jastrow commented
on these implications when he said, "Theologians generally are delighted with the proof
that the universe had a beginning, but astronomers are curiously upset. It turns out that
the scientist behaves the way the rest of us do when our beliefs are in conflict with the
evidence." (Robert Jastrow, God and the Astronomers, 1978, p. 16.)
Jastrow went on to say, "For the scientist who has lived by his faith in the power of
reason, the story ends like a bad dream. He has scaled the mountains of ignorance; he
is about to conquer the highest peak; as he pulls himself over the final rock, he is
greeted by a band of theologians who have been sitting there for centuries." (God and
the Astronomers, p. 116.) It seems the Cosmic Egg that was the birth of our universe
logically requires a Cosmic Chicken...


The Second Law of Thermodynamics


The use of thermodynamics in biology has a long history rich in
confusion. Harold J. Morowitz (1)
Sometimes people say that life violates the second law of thermodynamics. This is not
the case; we know of nothing in the universe that violates that law. So why do people
say that life violates the second law of thermodynamics? What is the second law of
thermodynamics?
The second law is a straightforward law of physics with the consequence that, in a closed system, you can't finish any real physical process with as much useful energy as you had to start with; some is always wasted. This means that a perpetual motion machine is impossible. The second law was formulated after nineteenth-century
machine is impossible. The second law was formulated after nineteenth century
engineers noticed that heat cannot pass from a colder body to a warmer body by itself.
According to philosopher of science Thomas Kuhn, the second law was first put into
words by two scientists, Rudolf Clausius and William Thomson (Lord Kelvin), using
different examples, in 1850-51 (2). American quantum physicist Richard P. Feynman,
however, says the French physicist Sadi Carnot discovered the second law 25 years
earlier (3). That would have been before the first law, conservation of energy, was
discovered! In any case, modern scientists completely agree about the above principles.
Thermodynamic Entropy
The first opportunity for confusion arises when we introduce the term entropy into the
mix. Clausius invented the term in 1865. He had noticed that a certain ratio was
constant in reversible, or ideal, heat cycles. The ratio was heat exchanged to absolute
temperature. Clausius decided that the conserved ratio must correspond to a real,
physical quantity, and he named it "entropy".
Surely not every conserved ratio corresponds to a real, physical quantity. Historical
accident has introduced this term to science. On another planet there could be physics
without the concept of entropy. It completely lacks intuitive clarity. Even the great
physicist James Clerk Maxwell had it backward for a while (4). Nevertheless, the term
has stuck.

The American Heritage Dictionary gives as the first definition of entropy, "For a closed
system, the quantitative measure of the amount of thermal energy not available to do
work." So it's a negative kind of quantity, the opposite of available energy.
Today, it is customary to use the term entropy to state the second law: Entropy in a
closed system can never decrease. As long as entropy is defined as unavailable
energy, this paraphrase of the second law is equivalent to the earlier ones above. In a
closed system, available energy can never increase, so (because energy is conserved)
its complement, entropy, can never decrease.
A familiar demonstration of the second law is the flow of heat from hot things to cold,
and never vice-versa. When a hot stone is dropped into a bucket of cool water, the
stone cools and the water warms until each is the same temperature as the other.
During this process, the entropy of the system increases. If you know the heat
capacities and initial temperatures of the stone and the water, and the final temperature
of the water, you can quantify the entropy increase in calories or joules per degree.
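For readers who want to see that bookkeeping, here is a minimal Python sketch of the stone-and-water example; the masses, heat capacities, and temperatures are invented for illustration, not taken from the article.

```python
import math

# Hedged illustration of the stone-and-water example above, with made-up values.
m_stone, c_stone = 1.0, 800.0      # kg, J/(kg*K)  (typical rock, assumed)
m_water, c_water = 5.0, 4186.0     # kg, J/(kg*K)
T_stone, T_water = 373.15, 288.15  # K (hot stone, cool water)

# Final temperature from energy conservation (no losses to the surroundings).
C_stone, C_water = m_stone * c_stone, m_water * c_water
T_final = (C_stone * T_stone + C_water * T_water) / (C_stone + C_water)

# Entropy change of each body: dS = m*c*ln(T_final / T_initial).
dS_stone = C_stone * math.log(T_final / T_stone)   # negative (the stone cools)
dS_water = C_water * math.log(T_final / T_water)   # positive (the water warms)

print(f"final temperature: {T_final:.2f} K")
print(f"dS stone: {dS_stone:+.1f} J/K, dS water: {dS_water:+.1f} J/K")
print(f"dS total: {dS_stone + dS_water:+.1f} J/K (always > 0)")
```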
You may have noticed the words "closed system" a couple of times above. Consider
simply a black bucket of water initially at the same temperature as the air around it. If
the bucket is placed in bright sunlight, it will absorb heat from the sun, as black things
do. Now the water becomes warmer than the air around it, and the available energy has
increased. Has entropy decreased? Has energy that was previously unavailable
become available, in a closed system? No, this example is only an apparent violation of
the second law. Because sunlight was admitted, the local system was not closed; the
energy of sunlight was supplied from outside the local system. If we consider the larger
system, including the sun, available energy has decreased and entropy has increased
as required.
Let's call this kind of entropy thermodynamic entropy. The qualifier "thermodynamic" is
necessary because the word entropy is also used in another, nonthermodynamic sense.
Logical Entropy
Entropy is also used to mean disorganization or disorder. J. Willard Gibbs, the
nineteenth century American theoretical physicist, called it "mixedupness."
The American Heritage Dictionary gives as the second definition of entropy, "a measure
of disorder or randomness in a closed system." Again, it's a negative concept, this time
the opposite of organization or order. The term came to have this second meaning
thanks to the great Austrian physicist Ludwig Boltzmann.
In Boltzmann's day, one complaint about the second law of thermodynamics was that it seemed to impose upon nature a preferred direction in time. Under the second law, things can only go one way. This apparently conflicts with the laws of physics at the molecular level, where there is no preferred direction in time; an elastic collision between molecules would look the same going forward or backward. In the 1880s and 1890s, Boltzmann used molecules of gas as a model, along with the laws of probability, to show that there was no real conflict. The model showed that, no matter how it was introduced, heat would soon become evenly diffused throughout the gas, as the second law required.
The model could also be used to show that two different kinds of gasses would become
thoroughly mixed. The reasoning he used for mixing is very similar to that for the
diffusion of heat, but there is an important difference. In the diffusion of heat, the entropy
increase can be measured with the ratio of physical units, joules per degree. In the
mixing of two kinds of gases already at the same temperature, if no energy is dissipated, the ratio of joules per degree (thermodynamic entropy) is irrelevant. The non-dissipative mixing process is related to the diffusion of heat only by analogy (5).
Nevertheless, Boltzmann used a factor, k, now called Boltzmann's constant, to attach
physical units to the latter situation. Now the word entropy has come to be applied to the
simple mixing process, too. (Of course, Boltzmann's constant has a legitimate use: it relates the average kinetic energy of a molecule to its temperature.)
Entropy in this latter sense has come to be used in the growing fields of information
science, computer science, communications theory, etc. The story is often told that in
the late 1940s, John von Neumann, a pioneer of the computer age, advised
communication-theorist Claude E. Shannon to start using the term "entropy" when
discussing information because "no one knows what entropy really is, so in a debate
you will always have the advantage" (6).
Richard Feynman knew there is a difference between the two meanings of entropy. He
discussed thermodynamic entropy in the section called "Entropy" of his Lectures on
Physics published in 1963 (7), using physical units, joules per degree, and over a dozen
equations (vol I section 44-6). He discussed the second meaning of entropy in a
different section titled "Order and entropy" (vol I section 46-5) as follows:
So we now have to talk about what we mean by disorder and what we mean by order. ... Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure "disorder" by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the "disorder" is less.

This is Boltzmann's model again. Notice that Feynman does not use Boltzmann's constant. He assigns no physical units to this kind of entropy, just a number (a logarithm). And he uses not a single equation in this section of his Lectures.
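A tiny numerical version of that counting argument (my own illustration, with made-up numbers) takes only a few lines of Python: confine the black molecules to one side and count arrangements, then lift the restriction and count again.

```python
import math

# Illustrative counting in the spirit of Feynman's description: 20 volume
# elements, 10 on the left and 10 on the right, and 10 black molecules
# (at most one per occupied element).
cells, left_cells, black = 20, 10, 10

W_separated = math.comb(left_cells, black)   # black confined to the left side
W_mixed = math.comb(cells, black)            # no restriction on where they go

print(W_separated, W_mixed)                  # 1 versus 184756
print(f"log W separated = {math.log(W_separated):.2f}")
print(f"log W mixed     = {math.log(W_mixed):.2f}")
# Many more ways to arrange the mixed case, so its "entropy" (log of the
# number of ways) is larger, with no physical units attached.
```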
Notice another thing. The "number of ways" can only be established by first artificially
dividing up the space into little volume elements. This is not a small point. In every real
physical situation, counting the number of possible arrangements requires an arbitrary
parceling. As Peter Coveney and Roger Highfield say (7.5):
There is, however, nothing to tell us how fine the [parceling] should
be. Entropies calculated in this way depend on the size-scale decided
upon, in direct contradiction with thermodynamics in which entropy
changes are fully objective.
Claude Shannon himself seems to be aware of these differences in his famous 1948 paper, "A Mathematical Theory of Communication" (8). With respect to the parcelling he writes, "In the continuous case the measurement is relative to the coordinate system. If we change coordinates the entropy will in general change" (p 37, Shannon's italics).

In the same paper Shannon attaches no physical units to his entropy and never
mentions Boltzmann's constant, k. At one point he briefly introduces K, saying tersely,
"The constant K merely amounts to a choice of a unit of measure" (p 11). Although the
the 55-page paper contains more than 300 equations, K appears only once again, in
Appendix 2, which concludes, "The choice of coefficient K is a matter of convenience
and amounts to the choice of a unit of measure" (p 29). Shannon never specifies the
unit of measure.
This sort of entropy is clearly different. Physical units do not pertain to it, and (except in
the case of digital information) an arbitrary convention must be imposed before it can be
quantified. To distinguish this kind of entropy from thermodynamic entropy, let's call
it logical entropy.
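For concreteness, here is a short Python sketch of logical (Shannon-style) entropy; note that the result is a pure number of bits, with no joules or kelvins anywhere. The function name and example probabilities are mine, not Shannon's.

```python
import math

def shannon_entropy(probabilities):
    """Shannon's H = -sum(p * log2 p), in bits (a pure number, no physical units)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit of entropy per toss; a heavily biased coin far less.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # about 0.08
```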
The equation S = k log W + const appears without an elementary theory, or however one wants to say it, devoid of any meaning from a phenomenological point of view. Albert Einstein, 1910 (8.5)

In spite of the important distinction between the two meanings of entropy, the rule as stated above for thermodynamic entropy seems to apply nonetheless to the logical kind: entropy in a closed system can never decrease. And really, there would be nothing mysterious about this law either. It's similar to saying things never organize themselves. (The original meaning of organize is "to furnish with organs.") Only this rule has little to do with thermodynamics.

It is true that crystals and other regular configurations can be formed by unguided processes. And we are accustomed to saying that these configurations are "organized." But crystals have not been spontaneously "furnished with organs." The correct term for such regular configurations is "ordered." The recipe for a crystal is already present in the solution it grows from; the crystal lattice is prescribed by the structure of the molecules that compose it. The formation of crystals is the straightforward result of chemical and physical laws that do not evolve and that are, compared to genetic programs, very simple.
The rule that things never organize themselves is also upheld in our everyday
experience. Without someone to fix it, a broken glass never mends. Without
maintenance, a house deteriorates. Without management, a business fails. Without new
software, a computer never acquires new capabilities. Never.
Charles Darwin understood this universal principle. It's common sense. That's why he
once made a note to himself pertaining to evolution, "Never use the words higher or
lower" (9). However, the word "higher" in this forbidden sense appears half a dozen
times in the first edition of Darwin's Origin of Species (10).
Even today, if you assert that a human is more highly evolved than a flatworm or
an amoeba, there are darwinists who'll want to fight about it. They take the position,
apparently, that evolution has not necessarily shown a trend toward more highly
organized forms of life, just different forms:

All extant species are equally evolved. Lynn Margulis and Dorion Sagan,
1995 (11)

There is no progress in evolution. Stephen Jay Gould, 1995 (12)

We all agree that there's no progress. Richard Dawkins, 1995 (13)

The fallacy of progress. John Maynard Smith and Eörs Szathmáry, 1995 (14)

But this ignores the plain facts about life and evolution.
Life is Organization
Seen in retrospect, evolution as a whole doubtless
had a general direction, from simple to complex, from dependence on to relative
independence of the environment, to greater and greater autonomy of individuals,
greater and greater development of sense organs and nervous systems conveying and
processing information about the state of the organism's surroundings, and finally
greater and greater consciousness. You can call this direction progress or by some
other name. Theodosius Dobzhansky (15)

Progress, then, is a property of the evolution of life as a whole by almost any conceivable intuitive standard.... Let us not pretend to deny in our philosophy what we know in our hearts to be true. Edward O. Wilson (16)
Life is organization. From prokaryotic cells, eukaryotic cells, tissues and organs, to
plants and animals, families, communities, ecosystems, and living planets, life is
organization, at every scale. The evolution of life is the increase of biological
organization, if it is anything. Clearly, if life originates and makes evolutionary progress
without organizing input somehow supplied, then something has organized itself.
Logical entropy in a closed system has decreased. This is the violation that people are
getting at, when they say that life violates the second law of thermodynamics. This
violation, the decrease of logical entropy in a closed system, must happen continually in
the darwinian account of evolutionary progress.
Most darwinists just ignore this staggering problem. When confronted with it, they seek
refuge in the confusion between the two kinds of entropy. [Logical] entropy has not
decreased, they say, because the system is not closed. Energy such as sunlight is
constantly supplied to the system. If you consider the larger system that includes the
sun, [thermodynamic] entropy has increased, as required.
Recent Writing About Entropy and Biology
An excellent example of this confusion is given in a popular 1982 treatise against
creationism, Abusing Science, by Philip Kitcher. He is aware that entropy has different
meanings, but he treats them as not different: "There are various ways to understand
entropy.... I shall follow the approach of classical thermodynamics, in which entropy is
seen as a function of unusable energy. But the points I make will not be affected by this
choice" (17).
Another typical example of confusion between the two kinds of entropy comes from a
similar book by Tim M. Berra, Evolution and the Myth of Creationism. The following
paragraph from that book would seem to indicate that any large animal can assemble a
bicycle (18).
For example, an unassembled bicycle that arrives at your house in a shipping carton is
in a state of disorder. You supply the energy of your muscles (which you get from food
that came ultimately from sunlight) to assemble the bike. You have got order from
disorder by supplying energy. The Sun is the source of energy input to the earth's living
systems and allows them to evolve.
A rare example of the use of mathematics to combine the two kinds of entropy is given
in The Mystery of Life's Origin, published in 1984. Its authors acknowledge two kinds of
entropy, which they call "thermal" and "configurational." To count the "number of ways"
for the latter kind of entropy they use restrictions which they later admit to be unrealistic.
They count only the number of ways a string of amino acids of fixed length can be
sequenced. They admit in the end, however, that the string might never form. To impose

the units joules per degree onto "configurational" entropy, they simply multiply by Boltzmann's constant (19). Nevertheless, they ultimately reach the following conclusion
(p 157-158):
In summary, undirected thermal energy is only able to do the chemical and thermal entropy work in polypeptide synthesis, but not the coding (or sequencing) portion of the
configurational entropy work.... It is difficult to imagine how one could ever couple
random thermal energy flow through the system to do the required configurational
entropy work of selecting and sequencing.
In Evolution, Thermodynamics and Information, Jeffrey S. Wicken also adopts the
terms "thermal" and "configurational." But here they both pertain only to the nonenergetic "information content" of a thermodynamic state, and "energetic" information is
also necessary for the complete description of a system. Shannon entropy is different
from all of these, and not a useful concept to Wicken. Nevertheless, he says that
evolution and the origin of life are not separate problems and, "The most parsimonious
explanation is to assume that life always existed" (19.5)!
Roger Penrose's treatment of entropy is worth mentioning. In The Emperor's New
Mind (20), he nimbly dodges the problem of assigning physical units to logical entropy
(p 314, Penrose's italics):
In order to give the actual entropy values for these compartments we should have to
worry a little about the question of the units that are chosen (metres, Joules, kilograms,
degrees Kelvin, etc.). That would be out of place here, and in fact, for the utterly
stupendous entropy values that I shall be giving shortly, it makes essentially no
difference at all what units are in fact chosen. However, for definiteness (for the
experts), let me say that I shall be taking natural units, as are provided by the rules of
quantum mechanics, and for which Boltzmann's constant turns out to be unity: k = 1.
Someday in the future, an extension of quantum theory might provide a
natural way to parcel any real physical situation. If that happens, one of
the problems with quantifying logical entropy in a real physical situation
will be removed. But nobody, not even Penrose, is suggesting that this
is the case today. And even if that day comes, still we will have no
reason to attach thermodynamic units to logical entropy. (Although the
word "stupendous" appears again, no "actual entropy values" follow the
quoted passage.)

In The Refrigerator and the Universe (21), Martin Goldstein and Inge F. Goldstein wonder if there is "an irreconcilable difference" between the two kinds of entropy. They begin their consideration of logical entropy by discussing the possible arrangements of playing cards, where the parceling is not arbitrary; the number of possibilities can be counted. When they move to the world of physics, they are not concerned over the fact that parceling must now be done arbitrarily. They are concerned, initially, about attaching physical units to logical entropy: "...Entropy is measured in units of energy divided by temperature.... W [counting microstates] is a pure number" (p 173). But ultimately they apply Boltzmann's constant. No calculations using logical entropy with physical units ensue. The next time they mention logical entropy is in the section "Information and Entropy," where they divide the previous product by Boltzmann's constant to remove the physical units!
An ambitious treatment of entropy as it pertains to biology is the book Evolution as
Entropy, by Daniel R. Brooks and E. O. Wiley. They acknowledge that the distinction
between the different kinds of entropy is important (22):
It is important to realize that the phase space, microstates, and macrostates described
in our theory are not classical thermodynamic constructs.... The entropies are array
entropies, more like the entropies of sorting encountered in considering an ideal gas
than like the thermal entropies associated with steam engines....
In fact the authors acknowledge many kinds of entropy; they describe physical entropy,
Shannon-Weaver entropy, cohesion entropy, and statistical entropy, for example. They
rarely use or mention Boltzmann's constant. One of their main arguments is that
although the progress of evolution seems to represent a reduction in entropy, this
reduction is only apparent. In reality, evolution increases entropy as the second law
requires. But evolution does not increase entropy as fast as the maximum possible rate.
So, by comparison to the maximum possible rate, entropy appears to be decreasing.
Our eyes have deceived us!
In another book entitled Life Itself, mathematical biologist Robert Rosen of Columbia
University seems to have grasped the problem when he writes, "The Second Law thus
asserts that... a system autonomously tending to an organized state cannot be closed" (23). But immediately he veers away, complaining that the term "organization"
is vague. Intent on introducing terms he prefers, like "entailment," he does not consider
the possibility that, in an open system, life's organization could be imported into one
region from another.
Hans Christian von Baeyer's 1998 book, Maxwell's Demon, is engaging and
informative about the scientists who pioneered the second law. The story concludes
with an interview of Wojciech Zurek of the Theoretical Division of the Los Alamos
National Laboratory. Zurek introduces yet another kind of entropy, because, "Like all
scientific ideas, the concept of entropy, useful as it is, needs to be refurbished and
updated and adjusted to new insights. Someday... the two types of entropy will begin to
approach each other in value, and the new theory will become amenable
to experimental verification" (23.5).
One of the most profound and original treatments of entropy is that by the Nobel prize-winning chemist Ilya Prigogine. He begins by noticing that some physical processes create surprising patterns such as snowflakes, or exhibit surprising behavior such as oscillation between different states. In From Being To Becoming he says, in effect, that things sometimes do, under certain circumstances, organize themselves. He reasons that these processes may have produced life (24):

It seems that most biological mechanisms of action show that life involves far-from-equilibrium conditions beyond the stability of the threshold of the thermodynamic branch. It is therefore very tempting to suggest that the origin of life may be related to successive instabilities somewhat analogous to the successive bifurcations that have led to a state of matter of increasing coherence.
Some find such passages obscure and tentative. One critic complains that work along
the lines advocated by Prigogine fifteen years earlier has borne little fruit subsequently.
"I don't know of a single phenomenon he has explained," said Pierre C. Hohenberg of
Yale University (25).
Dr. Hubert P. Yockey gives the subject of entropy and biology a probing and insightful
treatment in his monograph, Information theory and molecular biology (26). He
emphatically agrees that there are different kinds of entropy that do not correlate. "The
Shannon entropy and the Maxwell-Boltzmann-Gibbs entropy... have nothing to do with
each other" (p 313). But Shannon entropy (which pertains to information theory) makes
no distinction between meaningful DNA sequences that encode life, and random DNA
sequences of equal length. (Shannon wrote, "These semantic aspects of
communication are irrelevant to the engineering problem.") With no distinction between
meaningful and meaningless sequences, Yockey is able to conclude that evolution does
not create any paradox for Shannon entropy. Nevertheless, Yockey proves with
impressive command of biology and statistics that it would be impossible to find the new
genes necessary for evolutionary progress by the random search method currently in
favor. He is deeply sceptical of the prevailing theories of evolution and the origin of life
on Earth.
In 1998, computer scientist Christoph Adami agrees that trouble dogs the marriage of biology and logical entropy. In Introduction to Artificial Life (27), he comments on "the decades of confusion that have reigned over the treatment of living systems from the point of view of thermodynamics and information theory..." (p 59). He says, "information is always shared between two ensembles" (p 70), a restriction that sounds promising. Yet in his section entitled "Second Law of Thermodynamics," he says that as a thermodynamic system is put into contact with another one at a lower temperature, and thermal equilibrium is reached, the total entropy of the combined ensemble "stays constant" (p 99). This flatly contradicts the second law. Later, applying the second law to information, he explains that only the "conditional entropy" increases in such examples. "The unconditional (or marginal) entropy given by conditional entropy plus mutual entropy... stays constant" (p 118, Adami's italics). More new kinds of entropy.
In 1999's The Fifth Miracle (28), theoretical physicist and science writer Paul Davies devotes a chapter, "Against the Tide," to the relationship between entropy and biology. In an endnote to that chapter he writes, "'higher' organisms have higher (not lower) algorithmic entropy..." (p 277, Davies' italics), another reversal of the usual understanding. He concludes, "The source of biological information, then, is the organism's environment" (p 57). Later, "Gravitationally induced instability is a source of information" (p 63). But this "still leaves us with the problem.... How has meaningful information emerged in the universe?" (p 65). He gives no answer to this question.
The Touchstone of Life (1999) follows Prigogine's course, relying on Boltzmann's
constant to link thermodynamic and logical entropy (29). Author Werner Loewenstein
often strikes the chords that accompany deep understanding. "As for the origin of
information, the fountainhead, this must lie somewhere in the territory close to the big
bang" (p 25). "Evidently a little bubbling, whirling and seething goes a long way in
organizing matter.... That understanding has led to the birth of a new alchemy..." (p 48-49). Exactly.
Conclusion
In my opinion, the audacious attempt to reveal the formal equivalence of
the ideas of biological organization and thermodynamic order ...must be
judged to have failed. Peter Medawar (30)
Computer scientist Rolf Landauer wrote an article published in June 1996, which contains insight that should discourage attempts to physically link the two kinds of entropy. He demonstrates that "there is no unavoidable minimal energy requirement per transmitted bit" (31). Using Boltzmann's constant to tie together thermodynamic entropy and logical entropy is thus shown to be without basis. One may rightly object that the minimal energy requirement per bit of information is unrelated to logical entropy. But this supposed requirement was the keystone of modern arguments connecting the two concepts.
It is surprising that mixing entropy and biology still fosters confusion. The
relevant concepts from physics pertaining to the second law of
thermodynamics are at least 100 years old. The confusion can be
eradicated if we distinguish thermodynamic entropy from logical entropy,
and admit that Earth's biological system is open to organizing input from
outside.


2nd Law of Thermodynamics


The Second Law of Thermodynamics states that the state of entropy of the entire
universe, as an isolated system, will always increase over time. The second law also
states that the changes in the entropy in the universe can never be negative.

Introduction
Why is it that when you leave an ice cube at room temperature, it begins to melt? Why do we get older and never younger? And why is it that whenever rooms are cleaned, they become messy again in the future? Certain things happen in one direction and not the other; this is called the "arrow of time," and it encompasses every area of science. The thermodynamic arrow of time (entropy) is the measurement of disorder within a system. Denoted as ΔS, the change of entropy suggests that time itself is asymmetric with respect to the order of an isolated system, meaning: a system will become more disordered as time increases.

Major players in developing the Second Law

Nicolas Léonard Sadi Carnot was a French physicist who is considered the "father of thermodynamics," for he is responsible for the origins of the Second Law of Thermodynamics, as well as various other concepts. The current form of the second law uses entropy rather than caloric, which is what Sadi Carnot used to describe the law. Caloric relates to heat, and Sadi Carnot came to realize that some caloric is always lost in the motive cycle. Thus, perfect thermodynamic reversibility was shown to be unattainable: irreversibility is the result of every real system involving work.
Rudolf Clausius was a German physicist, and he developed the Clausius
statement, which says "Heat generally cannot flow spontaneously from a material at a
lower temperature to a material at a higher temperature."
William Thomson, also known as Lord Kelvin, formulated the Kelvin statement, which states "It is impossible to convert heat completely into work in a cyclic process." This means that there is no way for one to convert all the energy of a system into work without losing some of it.
Constantin Carathéodory, a Greek mathematician, created his own statement of the second law, arguing that "In the neighborhood of any initial state, there are states which cannot be approached arbitrarily close through adiabatic changes of state."

Fig. 1: Nicolas Carnot (left), Rudolf Clausius (second from left), William Thomson (second from right), Constantin Carathéodory (right)

Probabilities
If a given state can be accomplished in more ways, then it is more probable than a state that can be accomplished in only a few ways (or just one).
Assume the pieces of a jigsaw puzzle are jumbled in their box. The probability that a given piece will land randomly, away from where it fits perfectly, is very high; almost every piece will land somewhere away from its ideal position. The probability of a piece landing correctly in its position is very low, as that can happen in only one way. Thus, the misplaced jigsaw pieces have a much higher multiplicity than the correctly placed pieces, and we can correctly assume that the misplaced arrangement represents a higher entropy.
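The multiplicity argument can be made concrete with a few lines of Python (an illustration only; the 20-piece puzzle and the use of Boltzmann's S = k ln W are my choices, not part of the text above).

```python
import math

# Rough sketch of the jigsaw argument: with n pieces dropped at random into n
# slots, exactly one arrangement is "solved", while the scrambled macrostate
# covers essentially all n! arrangements.
n = 20
W_solved = 1
W_scrambled = math.factorial(n)

# Boltzmann's S = k * ln(W); k is Boltzmann's constant in J/K.
k = 1.380649e-23
print(f"W solved    = {W_solved},  S = {k * math.log(W_solved):.3e} J/K")
print(f"W scrambled = {W_scrambled:.3e},  S = {k * math.log(W_scrambled):.3e} J/K")
# Higher multiplicity means higher entropy, exactly as the paragraph argues.
```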

Derivation and Explanation


To understand why entropy increases and decreases, it is important to recognize that two changes in entropy have to be considered at all times: the entropy change of the surroundings and the entropy change of the system itself. The entropy change of the universe is the sum of the changes in entropy of the system and surroundings:

\[\Delta S_{univ} = \Delta S_{sys} + \Delta S_{surr} = \frac{q_{sys}}{T} + \frac{q_{surr}}{T} \tag{1}\]

In an isothermal reversible expansion, the heat q absorbed by the system from the surroundings is

\[q_{rev} = nRT\ln\frac{V_2}{V_1} \tag{2}\]

Since the heat absorbed by the system is the amount lost by the surroundings, \(q_{sys} = -q_{surr}\). Therefore, for a truly reversible process, the entropy change is

\[\Delta S_{univ} = \frac{nRT\ln\frac{V_2}{V_1}}{T} + \frac{-nRT\ln\frac{V_2}{V_1}}{T} = 0 \tag{3}\]

If the process is irreversible, however, the entropy change is

\[\Delta S_{univ} = \frac{nRT\ln\frac{V_2}{V_1}}{T} > 0 \tag{4}\]

If we put the two equations for \(\Delta S_{univ}\) together for both types of processes, we are left with the second law of thermodynamics,

\[\Delta S_{univ} = \Delta S_{sys} + \Delta S_{surr} \geq 0 \tag{5}\]

where \(\Delta S_{univ}\) equals zero for a truly reversible process and is greater than zero for an irreversible process. In reality, however, truly reversible processes never happen (or would take an infinitely long time to happen), so it is safe to say that all thermodynamic processes we encounter every day are irreversible in the direction they occur.
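A quick numerical check of Equations 1-5, using made-up values for one mole of ideal gas doubling its volume at 298 K, shows the reversible path giving ΔS_univ = 0 and the irreversible (free-expansion) path giving ΔS_univ > 0:

```python
import math

# Illustrative values, not from the text: 1 mol of ideal gas, V doubles at 298 K.
R, n, T = 8.314, 1.0, 298.0         # J/(mol*K), mol, K
V1, V2 = 1.0, 2.0                   # arbitrary units; only the ratio matters

dS_sys = n * R * math.log(V2 / V1)  # system entropy change, same for either path

# Reversible isothermal expansion: the surroundings give up q_rev at the same T.
q_rev = n * R * T * math.log(V2 / V1)
dS_surr_rev = -q_rev / T
print("reversible:   dS_univ =", round(dS_sys + dS_surr_rev, 6), "J/K")   # 0.0

# Free (irreversible) expansion into vacuum: no heat is exchanged, so q = 0.
dS_surr_irrev = 0.0
print("irreversible: dS_univ =", round(dS_sys + dS_surr_irrev, 6), "J/K")  # > 0
```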

Note
The second law of thermodynamics can also be stated as "all spontaneous processes produce an increase in the entropy of the universe."

Gibbs Free Energy


Given another equation:

\[\Delta S_{total} = \Delta S_{univ} = \Delta S_{surr} + \Delta S_{sys} \tag{6}\]

The formula for the entropy change in the surroundings is \(\Delta S_{surr} = -\Delta H_{sys}/T\). If this expression is substituted into the previous formula, and the equation is then multiplied by T and by -1, it results in the following formula:

\[-T\Delta S_{univ} = \Delta H_{sys} - T\Delta S_{sys} \tag{7}\]

If the left side of the equation is replaced by \(\Delta G\), which is known as Gibbs energy or free energy, the equation becomes

\[\Delta G = \Delta H - T\Delta S \tag{8}\]
Now it is much simpler to conclude whether a system is spontaneous, nonspontaneous, or at equilibrium.

ΔH refers to the heat change for a reaction. A positive ΔH means that heat is taken from the environment (endothermic). A negative ΔH means that heat is given off to the environment (exothermic).
ΔG is a measure of the change in a system's free energy when a reaction takes place at constant pressure (P) and temperature (T).
According to the equation, when the entropy decreases and the enthalpy increases, the free energy change, ΔG, is positive and the reaction is not spontaneous, no matter what the temperature of the system is. Temperature comes into play when the entropy and enthalpy both increase or both decrease. When both entropy and enthalpy are positive, the reaction is not spontaneous at low temperatures and spontaneous at high temperatures. When both entropy and enthalpy are negative, the reaction is spontaneous at low temperatures and not spontaneous at high temperatures. Because all spontaneous reactions increase the entropy of the universe, one can use the spontaneous nature of a reaction to determine how its entropy changes (Equation 8).

Table 1: Matrix of Conditions Dictating Spontaneity

Case | ΔH | ΔS | Temperature | ΔG | Answer
1    | -  | +  | high        | -  | Spontaneous
1    | -  | +  | low         | -  | Spontaneous
2    | -  | -  | high        | +  | Nonspontaneous
2    | -  | -  | low         | -  | Spontaneous
3    | +  | +  | high        | -  | Spontaneous
3    | +  | +  | low         | +  | Nonspontaneous
4    | +  | -  | high        | +  | Nonspontaneous
4    | +  | -  | low         | +  | Nonspontaneous
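Table 1 can also be restated as a small helper function; this is only a sketch of the ΔG = ΔH - TΔS bookkeeping, and the function and variable names are mine, not from the text.

```python
def spontaneity(dH, dS, T):
    """Classify a reaction using dG = dH - T*dS (dH in kJ, dS in J/K, T in K)."""
    dG = dH - T * (dS / 1000.0)   # convert dS to kJ/K so the units match dH
    return dG, ("spontaneous" if dG < 0 else
                "nonspontaneous" if dG > 0 else "at equilibrium")

# dH < 0 and dS > 0: spontaneous at any temperature (case 1 in Table 1).
print(spontaneity(-100.0, +50.0, 298.15))
print(spontaneity(-100.0, +50.0, 2000.0))
# dH > 0 and dS < 0: nonspontaneous at any temperature (case 4 in Table 1).
print(spontaneity(+100.0, -50.0, 298.15))
```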

Example 1
Let's start with an easy reaction:

2 H2(g) + O2(g) → 2 H2O(g)

The enthalpy, ΔH, for this reaction is -241.82 kJ, and the entropy, ΔS, of this reaction is -233.7 J/K. If the temperature is 25 °C, then there is enough information to calculate the standard free energy change, ΔG.
The first step is to convert the temperature to kelvin, so add 273.15 to 25; the temperature is 298.15 K. Next, plug ΔH, ΔS, and the temperature into ΔG = ΔH - TΔS:

ΔG = -241.8 kJ - (298.15 K)(-233.7 J/K)
   = -241.8 kJ + 69.7 kJ   (don't forget to convert joules to kilojoules)
   = -172.1 kJ
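A few lines of Python reproduce the arithmetic of Example 1, including the joules-to-kilojoules conversion:

```python
# Quick arithmetic check of Example 1, keeping the values given above.
dH = -241.8          # kJ
dS = -233.7 / 1000   # J/K converted to kJ/K
T = 25 + 273.15      # K

dG = dH - T * dS     # Equation 8
print(f"dG = {dG:.1f} kJ")   # about -172.1 kJ, negative, so spontaneous
```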
Example 2
Here is a little more complex reaction:

2 ZnO(s) + 2 C(g) → 2 Zn(s) + 2 CO(g)

This reaction occurs at room temperature (25 °C), and the enthalpy, ΔH, and standard free energy, ΔG, are given as -957.8 kJ and -935.3 kJ, respectively. Since the free energy is given, one must work backwards using the same equation from Example 1:

-935.3 kJ = -957.8 kJ - (298.15 K)(ΔS)    (rearranging ΔG = ΔH - TΔS)
22.5 kJ = -(298.15 K)(ΔS)                 (add 957.8 kJ to both sides)
ΔS = -0.0755 kJ/K                         (divide both sides by -298.15 K)

Multiply the entropy by 1000 to convert the answer to joules per kelvin: ΔS ≈ -75.5 J/K.
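The same rearrangement can be checked numerically for Example 2, using the values given above:

```python
# Back-solving Example 2 for dS from dG = dH - T*dS.
dG = -935.3      # kJ (given)
dH = -957.8      # kJ (given)
T = 298.15       # K

dS = (dH - dG) / T                 # kJ/K
print(f"dS = {dS * 1000:.1f} J/K") # about -75.5 J/K
```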
Example 3
For the following dissociation reaction,

O2(g) → 2 O(g)

under what temperature conditions will it occur spontaneously?

SOLUTION
By simply viewing the reaction one can determine that the reaction increases the number of moles of gas, so the entropy increases. Now all one has to do is figure out the enthalpy of the reaction. The enthalpy is positive, because covalent bonds are broken. When covalent bonds are broken, energy is absorbed, which means that the enthalpy of the reaction is positive. Another way to determine whether the enthalpy is positive is to use the formation data and subtract the enthalpy of the reactants from the enthalpy of the products to calculate the total enthalpy. So, if the temperature is low, it is probable that ΔH is greater than TΔS, which means the reaction is not spontaneous. If the temperature is high, then TΔS will be larger than the enthalpy, which means the reaction is spontaneous.
Example 4
The following reaction

CO(g) + H2O(g) → CO2(g) + H2(g)

occurs spontaneously under what temperature conditions? The enthalpy of the reaction is about -40 kJ.

SOLUTION
One may have to calculate the enthalpy of the reaction, but in this case it is given. If the enthalpy is negative, then the reaction is exothermic. Now one must find whether the entropy is greater than zero to answer the question. Using the entropy of formation data and the enthalpy of formation data, one can determine that the entropy of the reaction is -42.1 J/K and the enthalpy is -41.2 kJ. Because both enthalpy and entropy are negative, the spontaneous nature varies with the temperature of the reaction. (The temperature would also determine the spontaneous nature of a reaction if both enthalpy and entropy were positive.) When the reaction occurs at a low temperature, the free energy change is negative and the reaction is spontaneous. However, at high temperature the reaction becomes nonspontaneous: the free energy change turns positive, because the large temperature multiplied by the negative entropy outweighs the negative enthalpy.
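For a case like Example 4, where ΔH and ΔS are both negative, the crossover temperature at which ΔG changes sign can be computed directly (a quick check using the values derived above):

```python
# Crossover temperature for Example 4: dG = dH - T*dS = 0 at T = dH / dS.
dH = -41.2          # kJ   (value computed in Example 4)
dS = -42.1 / 1000   # kJ/K

T_cross = dH / dS
print(f"spontaneous (dG < 0) below about {T_cross:.0f} K")  # roughly 979 K
```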
Example 5
Under what temperature conditions does the following reaction occur spontaneously?

H2(g) + I2(g) → 2 HI(g)

SOLUTION
Only after calculating the enthalpy and entropy of the reaction is it possible to answer the question. The enthalpy of the reaction is calculated to be -53.84 kJ, and the entropy of the reaction is 101.7 J/K. Unlike the previous two examples, the temperature has no effect on the spontaneous nature of the reaction. If the reaction occurs at a high temperature, the free energy change is still negative, and ΔG is still negative if the temperature is low. Looking at the formula for spontaneous change, one can easily come to the same conclusion, for there is no possible way for the free energy change to be positive. Hence, the reaction is spontaneous at all temperatures.

Application of the Second Law


The second law occurs all around us all of the time, existing as the biggest, most
powerful, general idea in all of science.
Explanation of Earth's Age
When scientists were trying to determine the age of the Earth during the 1800s, they failed to even come close to the value accepted today. They also were incapable of understanding how the earth had transformed over time. Lord Kelvin, who was mentioned earlier, hypothesized that the earth's surface had once been extremely hot, similar to the surface of the sun, and that it had been cooling at a slow pace ever since. Using this information, Kelvin applied thermodynamics to conclude that the earth was at least twenty million years old, for it would take about that long for the earth to cool to its current state. Twenty million years is not even close to the actual age of the Earth, but this is because scientists during Kelvin's time were not aware of radioactivity. Even though Kelvin was incorrect about the age of the planet, his use of the second law allowed him to predict a more accurate value than the other scientists of his time.

Evolution and the Second Law


Some critics claim that evolution violates the Second Law of Thermodynamics, because organization and complexity increase in evolution. However, this law refers to isolated systems only, and the earth is not an isolated or closed system; it constantly receives energy from the sun. So, while order on earth may increase, the universe as a whole becomes more disordered, because the sun releases energy and grows more disordered as it does so. This is also how the second law and cosmology are related.
