
The Second Law of Thermodynamics

The use of thermodynamics in biology has a long history rich in confusion. Harold J. Morowitz (1)

Sometimes people say that life violates the second law of thermodynamics. This is not the case; we know of nothing in
the universe that violates that law. So why do people say that life
violates the second law of thermodynamics? What is the second law
of thermodynamics?
The second law is a straightforward law of physics with the
consequence that, in a closed system, you can't finish any real
physical process with as much useful energy as you had to start with; some is always wasted. This means that a perpetual motion
machine is impossible. The second law was formulated after
nineteenth century engineers noticed that heat cannot pass from a
colder body to a warmer body by itself.
According to philosopher of science Thomas Kuhn, the second law
was first put into words by two scientists, Rudolf Clausius and
William Thomson (Lord Kelvin), using different examples, in 1850-51 (2). American quantum physicist Richard P. Feynman, however,
says the French physicist Sadi Carnot discovered the second law 25
years earlier (3). That would have been before the first law,
conservation of energy, was discovered! In any case, modern
scientists completely agree about the above principles.

Thermodynamic Entropy
The first opportunity for confusion arises when we introduce the
term entropy into the mix. Clausius invented the term in 1865. He
had noticed that a certain ratio was constant in reversible, or ideal,
heat cycles. The ratio was heat exchanged to absolute temperature.
Clausius decided that the conserved ratio must correspond to a real,
physical quantity, and he named it "entropy".
Surely not every conserved ratio corresponds to a real, physical
quantity. Historical accident has introduced this term to science. On
another planet there could be physics without the concept of entropy.
It completely lacks intuitive clarity. Even the great physicist James
Clerk Maxwell had it backward for a while (4). Nevertheless, the term has stuck.


The American Heritage Dictionary gives as the first definition of
entropy, "For a closed system, the quantitative measure of the
amount of thermal energy not available to do work." So it's a
negative kind of quantity, the opposite of available energy.
Today, it is customary to use the term entropy to state the second
law: Entropy in a closed system can never decrease. As long as
entropy is defined as unavailable energy, this paraphrase of the
second law is equivalent to the earlier ones above. In a closed
system, available energy can never increase, so (because energy is
conserved) its complement, entropy, can never decrease.
A familiar demonstration of the second law is the flow of heat from
hot things to cold, and never vice-versa. When a hot stone is dropped
into a bucket of cool water, the stone cools and the water warms
until each is the same temperature as the other. During this process,
the entropy of the system increases. If you know the heat capacities
and initial temperatures of the stone and the water, and the final
temperature of the water, you can quantify the entropy increase in
calories or joules per degree.
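For concreteness, here is a minimal Python sketch of that calculation. Every numerical value below is an illustrative assumption, not a figure from the text:

```python
import math

# Illustrative, assumed values: a hot stone dropped into cool water.
C_stone = 800.0      # heat capacity of the stone, J/K (assumed)
C_water = 4186.0     # heat capacity of the water, J/K (assumed: about 1 kg of water)
T_stone = 370.0      # initial stone temperature, K (assumed)
T_water = 290.0      # initial water temperature, K (assumed)

# Final temperature from energy conservation (no heat lost to the surroundings).
T_final = (C_stone * T_stone + C_water * T_water) / (C_stone + C_water)

# Entropy change of each body: dS = C * ln(T_final / T_initial).
dS_stone = C_stone * math.log(T_final / T_stone)   # negative: the stone cools
dS_water = C_water * math.log(T_final / T_water)   # positive: the water warms

print(f"T_final  = {T_final:.1f} K")
print(f"dS_stone = {dS_stone:+.2f} J/K")
print(f"dS_water = {dS_water:+.2f} J/K")
print(f"dS_total = {dS_stone + dS_water:+.2f} J/K  (always positive)")
```

The totals come out in joules per degree, and the sum is always positive, as the second law requires.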
You may have noticed the words "closed system" a couple of times
above. Consider simply a black bucket of water initially at the same
temperature as the air around it. If the bucket is placed in bright
sunlight, it will absorb heat from the sun, as black things do. Now
the water becomes warmer than the air around it, and the available
energy has increased. Has entropy decreased? Has energy that was
previously unavailable become available, in a closed system? No,
this example is only an apparent violation of the second law.
Because sunlight was admitted, the local system was not closed; the
energy of sunlight was supplied from outside the local system. If we
consider the larger system, including the sun, available energy has
decreased and entropy has increased as required.
Let's call this kind of entropy thermodynamic entropy. The qualifier
"thermodynamic" is necessary because the word entropy is also used
in another, nonthermodynamic sense.

Logical Entropy
Entropy is also used to mean disorganization or disorder. J. Willard
Gibbs, the nineteenth century American theoretical physicist, called
it "mixedupness." The American Heritage Dictionary gives as the
second definition of entropy, "a measure of disorder or randomness
in a closed system." Again, it's a negative concept, this time the
opposite of organization or order. The term came to have this second

meaning thanks to the great Austrian physicist Ludwig Boltzmann.


In Boltzmann's day, one complaint about the second
law of thermodynamics was that it seemed to impose
upon nature a preferred direction in time. Under the
second law, things can only go one way. This
apparently conflicts with the laws of physics at the
molecular level, where there is no preferred direction in time: an elastic collision between molecules
would look the same going forward or backward. In
the 1880s and 1890s, Boltzmann used molecules of gas as a model,
along with the laws of probability, to show that there was no real
conflict. The model showed that, no matter how it was introduced,
heat would soon become evenly diffused throughout the gas, as the
second law required.
The model could also be used to show that two different kinds of
gases would become thoroughly mixed. The reasoning he used for
mixing is very similar to that for the diffusion of heat, but there is an
important difference. In the diffusion of heat, the entropy increase
can be measured with the ratio of physical units, joules per degree.
In the mixing of two kinds of gases already at the same temperature, if no energy is dissipated, the ratio of joules per degree, thermodynamic entropy, is irrelevant. The non-dissipative
mixing process is related to the diffusion of heat only by analogy (5).
Nevertheless, Boltzmann used a factor, k, now called Boltzmann's
constant, to attach physical units to the latter situation. Now the
word entropy has come to be applied to the simple mixing process,
too. (Of course, Boltzmann's constant has a legitimate use: it
relates the average kinetic energy of a molecule to its temperature.)
Entropy in this latter sense has come to be used in the growing fields
of information science, computer science, communications theory,
etc. The story is often told that in the late 1940s, John von Neumann,
a pioneer of the computer age, advised communication-theorist
Claude E. Shannon to start using the term "entropy" when discussing
information because "no one knows what entropy really is, so in a
debate you will always have the advantage" (6).
Richard Feynman knew there is a difference between the two
meanings of entropy. He discussed thermodynamic entropy in the
section called "Entropy" of his Lectures on Physics published in
1963 (7), using physical units, joules per degree, and over a dozen
equations (vol I section 44-6). He discussed the second meaning of
entropy in a different section titled "Order and entropy" (vol I
section 46-5) as follows:

So we now have to talk about what we mean by disorder and what we mean by order. ... Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure "disorder" by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the "disorder" is less.

This is Boltzmann's model again. Notice that Feynman does not use
Boltzmann's constant. He assigns no physical units to this kind of
entropy, just a number (a logarithm). And he uses not a single
equation in this section of his Lectures.
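A toy version of Feynman's counting can make the point concrete. In the sketch below, the number of volume elements and molecules are arbitrary assumptions; the "entropy" that comes out is just a logarithm, a pure number, exactly as Feynman presents it:

```python
import math

# Toy numbers (assumed, not from the text): V volume elements, each molecule
# may sit in any one of them.
V = 10          # number of little volume elements (assumed even)
n_white = 6     # white molecules (assumed)
n_black = 6     # black molecules (assumed)

# Unrestricted case: every molecule may occupy any of the V elements.
W_unrestricted = V ** (n_white + n_black)

# Separated case: whites confined to the left half, blacks to the right half.
W_separated = (V // 2) ** n_white * (V // 2) ** n_black

# Feynman's "entropy" is the logarithm of the number of ways: a pure number.
print("ln W (separated)    =", round(math.log(W_separated), 2))
print("ln W (unrestricted) =", round(math.log(W_unrestricted), 2))
# The separated count is smaller, so its logarithm, the "disorder", is smaller.
```

No Boltzmann constant and no physical units appear anywhere in the counting.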
Notice another thing. The "number of ways" can only be established
by first artificially dividing up the space into little volume elements.
This is not a small point. In every real physical situation, counting
the number of possible arrangements requires an arbitrary parceling.
As Peter Coveney and Roger Highfield say (7.5):
There is, however, nothing to tell us how fine the [parceling] should be. Entropies calculated in this way depend on the size-scale decided upon, in direct contradiction with thermodynamics in which entropy changes are fully objective.
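That scale dependence is easy to exhibit. In this hypothetical sketch, the same collection of particles is parceled into coarser or finer cells, and the resulting logical entropy (the logarithm of the number of ways) changes with the choice:

```python
import math

# Illustrative demonstration (assumed numbers): the "number of ways" to place
# N particles depends on how finely the space is parceled into cells.
N = 100                      # particles (assumed)
for m in (10, 100, 1000):    # three different parcelings of the same space
    # Each particle may occupy any of the m cells, so W = m**N microstates.
    ln_W = N * math.log(m)
    print(f"{m:5d} cells:  ln W = {ln_W:8.1f}")
# The computed "entropy" (ln W) changes with the cell size chosen,
# which is the scale dependence Coveney and Highfield describe.
```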

Claude Shannon himself seems to be aware of these differences in his famous 1948 paper, "A Mathematical Theory of Communication" (8). With respect to the parceling he writes, "In the continuous case the measurement is relative to the coordinate system. If we change coordinates the entropy will in general change" (p 37, Shannon's italics).

In the same paper Shannon attaches no physical units to his entropy and never mentions Boltzmann's constant, k. At one point he briefly introduces K, saying tersely, "The constant K merely amounts to a choice of a unit of measure" (p 11). Although the 55-page paper contains more than 300 equations, K appears only once again, in Appendix 2, which concludes, "The choice of coefficient K is a matter of convenience and amounts to the choice of a unit of measure" (p 29). Shannon never specifies the unit of measure.


This sort of entropy is clearly different. Physical units do not pertain
to it, and (except in the case of digital information) an arbitrary
convention must be imposed before it can be quantified. To
distinguish this kind of entropy from thermodynamic entropy, let's
call it logical entropy.
The equation S = k log W + const appears without an elementary theory, or however one wants to say it, devoid of any meaning from a phenomenological point of view. Albert Einstein, 1910 (8.5)

In spite of the important distinction between the two meanings of entropy, the rule as stated
above for thermodynamic entropy seems to
apply nonetheless to the logical kind: entropy
in a closed system can never decrease. And
really, there would be nothing mysterious
about this law either. It's similar to
saying things never organize themselves. (The
original meaning of organize is "to furnish
with organs.") Only this rule has little to do
with thermodynamics.

It is true that crystals and other regular configurations can be formed by unguided
processes. And we are accustomed to saying
that these configurations are "organized." But crystals have not been
spontaneously "furnished with organs." The correct term for such
regular configurations is "ordered." The recipe for a crystal is
already present in the solution it grows from; the crystal lattice is
prescribed by the structure of the molecules that compose it. The
formation of crystals is the straightforward result of chemical and
physical laws that do not evolve and that are, compared to genetic
programs, very simple.
The rule that things never organize themselves is also upheld in our
everyday experience. Without someone to fix it, a broken glass never
mends. Without maintenance, a house deteriorates. Without
management, a business fails. Without new software, a computer
never acquires new capabilities. Never.
Charles Darwin understood this universal principle. It's common
sense. That's why he once made a note to himself pertaining to
evolution, "Never use the words higher or lower" (9). However, the
word "higher" in this forbidden sense appears half a dozen times in
the first edition of Darwin's Origin of Species (10).
Even today, if you assert that a human is more highly evolved than a
flatworm or an amoeba, there are darwinists who'll want to fight
about it. They take the position, apparently, that evolution has not
necessarily shown a trend toward more highly organized forms of life, just different forms:

All extant species are equally evolved. Lynn Margulis and Dorion Sagan, 1995 (11)

There is no progress in evolution. Stephen Jay Gould, 1995 (12)

We all agree that there's no progress. Richard Dawkins, 1995 (13)

The fallacy of progress. John Maynard Smith and Eörs Szathmáry, 1995 (14)

But this ignores the plain facts about life and evolution.

Life is Organization
Seen in retrospect, evolution as a whole doubtless had a general
direction, from simple to complex, from dependence on to relative
independence of the environment, to greater and greater autonomy
of individuals, greater and greater development of sense organs and
nervous systems conveying and processing information about the
state of the organism's surroundings, and finally greater and greater
consciousness. You can call this direction progress or by some other
name. Theodosius Dobzhansky (15)
Progress, then, is a property of the evolution of life as a whole by
almost any conceivable intuitive standard.... Let us not pretend to
deny in our philosophy what we know in our hearts to be
true. Edward O. Wilson (16)
Life is organization. From prokaryotic cells, eukaryotic cells, tissues
and organs, to plants and animals, families, communities,
ecosystems, and living planets, life is organization, at every scale.
The evolution of life is the increase of biological organization, if it is
anything. Clearly, if life originates and makes evolutionary progress
without organizing input somehow supplied, then something has
organized itself. Logical entropy in a closed system has decreased.
This is the violation that people are getting at, when they say that life
violates the second law of thermodynamics. This violation,
the decrease of logical entropy in a closed system, must happen
continually in the darwinian account of evolutionary progress.

Most darwinists just ignore this staggering problem. When confronted with it, they seek refuge in the confusion between the two
kinds of entropy. [Logical] entropy has not decreased, they say,
because the system is not closed. Energy such as sunlight is
constantly supplied to the system. If you consider the larger system
that includes the sun, [thermodynamic] entropy has increased, as
required.

Recent Writing About Entropy and Biology
An excellent example of this confusion is given in a popular 1982
treatise against creationism, Abusing Science, by Philip Kitcher. He
is aware that entropy has different meanings, but he treats them as
not different: "There are various ways to understand entropy.... I
shall follow the approach of classical thermodynamics, in which
entropy is seen as a function of unusable energy. But the points I
make will not be affected by this choice" (17).
Another typical example of confusion between the two kinds of
entropy comes from a similar book by Tim M. Berra, Evolution and
the Myth of Creationism. The following paragraph from that book
would seem to indicate that any large animal can assemble a
bicycle (18).
For example, an unassembled bicycle that arrives at your house in a shipping
carton is in a state of disorder. You supply the energy of your muscles (which you
get from food that came ultimately from sunlight) to assemble the bike. You have
got order from disorder by supplying energy. The Sun is the source of energy input
to the earth's living systems and allows them to evolve.

A rare example of the use of mathematics to combine the two kinds of entropy is given in The Mystery of Life's Origin, published
in 1984. Its authors acknowledge two kinds of entropy, which they
call "thermal" and "configurational." To count the "number of ways"
for the latter kind of entropy they use restrictions which they later
admit to be unrealistic. They count only the number of ways a string
of amino acids of fixed length can be sequenced. They admit in the
end, however, that the string might never form. To impose the units
joules per degree onto "configurational" entropy, they simply
multiply by Boltzmann's constant (19). Nevertheless, they ultimately
reach the following conclusion (p 157-158):
In summary, undirected thermal energy is only able to do the chemical and thermal
entropy work in polypeptide synthesis, but not the coding (or sequencing) portion of
the configurational entropy work.... It is difficult to imagine how one could ever
couple random thermal energy flow through the system to do the required
configurational entropy work of selecting and sequencing.
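The counting they describe can be sketched as follows. The chain length here is an arbitrary assumption; the point is that the count of sequences is a pure number until Boltzmann's constant is multiplied in:

```python
import math

# Sketch of the "configurational entropy" counting described above.
N = 100                       # residues in the amino acid chain (assumed)
k = 1.380649e-23              # Boltzmann's constant, J/K

ln_W = N * math.log(20)       # 20 possible amino acids per position: W = 20**N
S_config = k * ln_W           # units of J/K appear only because k is applied

print(f"ln W     = {ln_W:.1f}   (a pure number)")
print(f"k * ln W = {S_config:.2e} J/K")
```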

In Evolution, Thermodynamics and Information, Jeffrey S. Wicken also adopts the terms "thermal" and "configurational." But here they both pertain only to the non-energetic "information content" of a thermodynamic state, and "energetic" information is
also necessary for the complete description of a system. Shannon
entropy is different from all of these, and not a useful concept to
Wicken. Nevertheless, he says that evolution and the origin of life
are not separate problems and, "The most parsimonious explanation
is to assume that life always existed" (19.5)!
Roger Penrose's treatment of entropy is worth mentioning. In The
Emperor's New Mind (20), he nimbly dodges the problem of
assigning physical units to logical entropy (p 314, Penrose's italics):
In order to give the actual entropy values for these compartments we should have
to worry a little about the question of the units that are chosen (metres, Joules,
kilograms, degrees Kelvin, etc.). That would be out of place here, and in fact, for
the utterly stupendous entropy values that I shall be giving shortly, it makes
essentially no difference at all what units are in fact chosen. However, for
definiteness (for the experts), let me say that I shall be taking natural units, as are
provided by the rules of quantum mechanics, and for which Boltzmann's constant
turns out to be unity: k = 1.

Someday in the future, an extension of quantum theory might provide a natural way to parcel any real physical situation. If that happens, one of the problems with quantifying logical entropy in a real physical situation will be removed. But nobody, not even Penrose, is suggesting that this is the case today. And even if that day comes, still we will have no reason to attach thermodynamic units to logical entropy. (Although the word "stupendous" appears again, no "actual entropy values" follow the quoted passage.) (Penrose, May 2012)
In The Refrigerator and the Universe (21), Martin Goldstein and
Inge F. Goldstein wonder if there is "an irreconcilable difference"
between the two kinds of entropy. They begin their consideration of
logical entropy by discussing the possible arrangements of playing
cards, where the parceling is not arbitrary; the number of possibilities can be counted. When they move to the world of
physics, they are not concerned over the fact that parceling must now
be done arbitrarily. They are concerned, initially, about attaching
physical units to logical entropy. "...Entropy is measured in units of
energy divided by temperature.... W [counting microstates] is a pure
number" (p 173). But ultimately they apply Boltzmann's constant.
No calculations using logical entropy with physical units ensue. The
next time they mention logical entropy is in the section "Information
and Entropy," where they divide the previous product by
Boltzmann's constant to remove the physical units!
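The card example can be made concrete. The count of orderings of a deck needs no arbitrary parceling, and, as the Goldsteins end up doing, one can multiply its logarithm by Boltzmann's constant to attach units and divide by the same constant to strip them off again:

```python
import math

# Arrangements of a standard 52-card deck: here the counting needs no
# arbitrary parceling; the possibilities are simply 52!.
k = 1.380649e-23                    # Boltzmann's constant, J/K
W = math.factorial(52)              # number of possible orderings of the deck
ln_W = math.log(W)                  # a pure number, about 156.4

S_with_units = k * ln_W             # attach J/K by multiplying by k
back_to_number = S_with_units / k   # dividing by k removes the units again

print(f"ln W       = {ln_W:.1f}")
print(f"k ln W     = {S_with_units:.2e} J/K")
print(f"(k ln W)/k = {back_to_number:.1f}  (dimensionless again)")
```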
An ambitious treatment of entropy as it pertains to biology is the book Evolution as Entropy, by Daniel R. Brooks and E. O. Wiley. They acknowledge that the distinction between the different kinds of
entropy is important (22):
It is important to realize that the phase space, microstates, and macrostates
described in our theory are not classical thermodynamic constructs.... The
entropies are array entropies, more like the entropies of sorting encountered in
considering an ideal gas than like the thermal entropies associated with steam
engines....

In fact the authors acknowledge many kinds of entropy; they describe physical entropy, Shannon-Weaver entropy, cohesion
entropy, and statistical entropy, for example. They rarely use or
mention Boltzmann's constant. One of their main arguments is that
although the progress of evolution seems to represent a reduction in
entropy, this reduction is only apparent. In reality, evolution
increases entropy as the second law requires. But evolution does not
increase entropy as fast as the maximum possible rate. So, by
comparison to the maximum possible rate, entropy appears to be
decreasing. Our eyes have deceived us!
In another book entitled Life Itself, mathematical biologist Robert
Rosen of Columbia University seems to have grasped the problem
when he writes, "The Second Law thus asserts that... a system
autonomously tending to an organized state cannot be closed " (23).
But immediately he veers away, complaining that the term
"organization" is vague. Intent on introducing terms he prefers, like
"entailment," he does not consider the possibility that, in an open
system, life's organization could be imported into one region from
another.
Hans Christian von Baeyer's 1998 book, Maxwell's Demon, is
engaging and informative about the scientists who pioneered the
second law. The story concludes with an interview of Wojciech
Zurek of the Theoretical Division of the Los Alamos National
Laboratory. Zurek introduces yet another kind of entropy,
because, "Like all scientific ideas, the concept of entropy, useful as it
is, needs to be refurbished and updated and adjusted to new insights.
Someday... the two types of entropy will begin to approach each
other in value, and the new theory will become amenable to
experimental verification" (23.5).
One of the most profound and original treatments of entropy is that by the Nobel prize-winning chemist Ilya Prigogine. He begins by noticing that some physical processes create surprising patterns such as snowflakes, or exhibit surprising behavior such as oscillation between different states. In From Being To Becoming he says, in effect, that things sometimes do, under certain circumstances, organize themselves. He reasons that these processes may have produced life (24):
It seems that most biological mechanisms of action show that life involves far-from-equilibrium conditions beyond the stability of the threshold of the thermodynamic branch. It is therefore very tempting to suggest that the origin of life may be related to successive instabilities somewhat analogous to the successive bifurcations that have led to a state of matter of increasing coherence.

Some find such passages obscure and tentative. One critic complains
that work along the lines advocated by Prigogine fifteen years earlier
has borne little fruit subsequently. "I don't know of a single
phenomenon he has explained," said Pierre C. Hohenberg of Yale
University (25).
Dr. Hubert P. Yockey gives the subject of entropy and biology a
probing and insightful treatment in his monograph, Information theory and molecular biology (26). He emphatically agrees that
there are different kinds of entropy that do not correlate. "The
Shannon entropy and the Maxwell-Boltzmann-Gibbs entropy... have
nothing to do with each other" (p 313). But Shannon entropy (which
pertains to information theory) makes no distinction between
meaningful DNA sequences that encode life, and random DNA
sequences of equal length. (Shannon wrote, "These semantic aspects
of communication are irrelevant to the engineering problem.") With
no distinction between meaningful and meaningless sequences,
Yockey is able to conclude that evolution does not create any
paradox for Shannon entropy. Nevertheless, Yockey proves with
impressive command of biology and statistics that it would be
impossible to find the new genes necessary for evolutionary progress
by the random search method currently in favor. He is deeply
sceptical of the prevailing theories of evolution and the origin of life
on Earth. (Cynthia Yockey, 2005 )
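Yockey's point that Shannon entropy ignores meaning is easy to illustrate. In the sketch below, a made-up gene-like string and a shuffled copy of it have identical Shannon entropy, because the measure depends only on symbol frequencies, not on anything the sequence encodes:

```python
import math
import random
from collections import Counter

def shannon_entropy_bits(seq):
    """Shannon entropy per symbol, in bits, from the symbol frequencies alone."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A hypothetical, made-up "meaningful" DNA string and a shuffled version of it.
gene_like = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
shuffled = "".join(random.sample(gene_like, len(gene_like)))

print(f"{shannon_entropy_bits(gene_like):.4f} bits/symbol (gene-like order)")
print(f"{shannon_entropy_bits(shuffled):.4f} bits/symbol (shuffled)")
# Both values are identical: Shannon entropy sees frequencies, not meaning.
```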
In 1998, computer scientist Christoph Adami agrees that trouble
dogs the marriage of biology and logical entropy. In Introduction to
Artificial Life (27), he comments on "the decades of confusion that
have reigned over the treatment of living systems from the point of
view of thermodynamics and information theory..." (p
59). He says, "information is always shared between
two ensembles" (p 70), a restriction that sounds
promising. Yet in his section entitled "Second Law of
Thermodynamics," he says that as a thermodynamic
system is put into contact with another one at a lower temperature, and thermal equilibrium is reached, the
total entropy of the combined ensemble "stays constant" (p 99). This
flatly contradicts the second law. Later, applying the second law to
information, he explains that only the "conditional entropy" increases in such examples. "The unconditional (or marginal) entropy given by conditional entropy plus mutual entropy... stays constant" (p 118, Adami's italics). More new kinds of entropy.
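The decomposition Adami invokes is the standard information-theoretic identity H(X) = H(X|Y) + I(X;Y). A small sketch with an assumed joint distribution for two binary variables:

```python
import math

# Assumed joint distribution p(x, y) for two binary variables (illustrative).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

H_X = H(p_x.values())
H_Y = H(p_y.values())
H_XY = H(p_xy.values())

H_X_given_Y = H_XY - H_Y      # conditional entropy H(X|Y)
I_XY = H_X + H_Y - H_XY       # mutual information I(X;Y)

# The marginal entropy equals conditional entropy plus mutual information.
print(f"H(X)            = {H_X:.4f} bits")
print(f"H(X|Y) + I(X;Y) = {H_X_given_Y + I_XY:.4f} bits")
```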
In 1999's The Fifth Miracle (28), theoretical physicist and science
writer Paul Davies devotes a chapter, "Against the Tide," to the
relationship between entropy and biology. In an endnote to that
chapter he writes, "'higher' organisms have higher (not lower)
algorithmic entropy..." (p 277, Davies' italics) another reversal of
the usual understanding. He concludes, "The source of biological
information, then, is the organism's environment" (p 57). Later,
"Gravitationally induced instability is a source of information" (p
63). But this "still leaves us with the problem.... How
has meaningful information emerged in the universe?" (p 65). He
gives no answer to this question.
The Touchstone of Life (1999) follows Prigogine's course, relying
on Boltzmann's constant to link thermodynamic and logical
entropy (29). Author Werner Loewenstein often strikes the chords
that accompany deep understanding. "As for the origin of
information, the fountainhead, this must lie somewhere in the
territory close to the big bang" (p 25). "Evidently a little bubbling,
whirling and seething goes a long way in organizing matter.... That
understanding has led to the birth of a new alchemy..." (p 48-49).
Exactly.

Conclusion
In my opinion, the audacious attempt to reveal the
formal equivalence of the ideas of biological
organization and thermodynamic order... must be
judged to have failed. Peter Medawar (30)
Computer scientist Rolf Landauer wrote an article
published in June, 1996, which contains insight that
should discourage attempts to physically link the two kinds of
entropy. He demonstrates that "there is no unavoidable minimal
energy requirement per transmitted bit" (31). Using Boltzmann's
constant to tie together thermodynamic entropy and logical entropy
is thus shown to be without basis. One may rightly object that the
minimal energy requirement per bit of information is unrelated to
logical entropy. But this supposed requirement was the keystone of
modern arguments connecting the two concepts.
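For reference, the conventional conversion that such arguments leaned on equates one bit with k ln 2 of thermodynamic entropy. The arithmetic below uses standard constants and an assumed room temperature; the figures are not taken from Landauer's paper:

```python
import math

# The conventional bit-to-entropy conversion that arguments linking the two
# concepts relied on (standard constants; temperature is an assumption).
k = 1.380649e-23          # Boltzmann's constant, J/K
T = 300.0                 # an assumed room temperature, K

entropy_per_bit = k * math.log(2)      # about 9.57e-24 J/K per bit
energy_per_bit = entropy_per_bit * T   # about 2.87e-21 J per bit at 300 K

print(f"k ln 2   = {entropy_per_bit:.3e} J/K per bit")
print(f"k T ln 2 = {energy_per_bit:.3e} J per bit at {T:.0f} K")
```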
It is surprising that mixing entropy and biology still fosters confusion. The relevant concepts from physics pertaining to the second law of thermodynamics are at least 100 years old. The confusion can be eradicated if we distinguish thermodynamic entropy from logical entropy, and admit that Earth's biological system is open to organizing input from outside.

What'sNEW
Entropy is an anthropomorphic concept. E.P. Wigner (32)
Stability and its manifestation in the chemical and biological
worlds, Robert Pascal and Addy Pross,
doi:10.1039/C5CC06260H, Chem. Commun., 07 Oct 2015. Through
a kinetic perspective it can be demonstrated that the process of life's
emergence, as well as its subsequent evolution, seemingly
inconsistent, if not at odds with thermodynamic logic, actually have
a mathematical/logical basis.
23 Feb 2015: The second law essentially says that the universe
must have had a beginning and an end.
Roger Penrose, Cycles of Time: An Extraordinary New View of the
Universe, ISBN: 9780099505945, Vintage, May 2012. "We must
indeed bear in mind that there is likely to be always some
measure of subjectivity in the precise value of the entropy that
one might assign to a system." (p 37)
Jeremy L. England, "Statistical physics of self-replication"
[abstract | pdf], doi:10.1063/1.4818538, v 139 n 121923, J. Chem.
Phys., Aug 2013; and commentary: A New Physics Theory of
Life by Natalie Wolchover, Quanta Magazine, 22 Jan 2014.
Charles Lineweaver, Paul C.W. Davies and Michael Ruse,
eds., Complexity and the Arrow of Time, ISBN 978-1-10702725, Cambridge University Press, 2013; and a review by Daniel
W. McShea, "Unnecessary Complexity" [html],
doi:10.1126/science.1245386, p 1319-1320 v 342, Science, 13 Dec
2013.
18 Oct 2013: Manfred Eigen's new book, From Strange Simplicity
to Complex Familiarity.
13 Sep 2013: A major problem with all origin-of-life theories: how did biological "self-preservation" arise?
2 Jan 2013: Shufeng Zhang sends links to his paper, "Entropy: A
concept that is not a physical quantity".
BaBar Experiment Confirms Time Asymmetry, SLAC National
Accelerator Laboratory, 19 Nov 2012.
Granville Sewell, "A second look at the second law" [pdf],
doi:10.1016/j.aml.2011.01.019, Applied Mathematics Letters, 2011
[retracted]. "If an increase in order is extremely improbable when a
system is closed, it is still extremely improbable when the system is
open, unless something is entering which makes it not extremely
improbable."
The unavoidable cost of computation revealed by Philip Ball, Nature News, 7 Mar 2012.


Peter A. Corning, "Thermoeconomics: Beyond the Second Law"
[abstract | 38-page PDF], Institute for the Study of Complex
Systems, 2001. "We believe that the entire strategy associated with
various attempts to reduce biological evolution and the dynamics of
living systems to the principles either of classical, irreversible
thermodynamics or to statistical mechanics... is a theoretical cul de
sac."
Arieh Ben-Naim, Entropy Demystified [publisher's promo],
ISBN:978-981-270-055-1, World Scientific Publishing Co., May
2007. "The units of entropy (J/K)... should not be used to express
entropy at all" (p204).
28 Apr 2009: Life is nothing but an electron looking for a place to
rest.
F. Alexander Bais and J. Doyne Farmer, "Physics of Information"
[PDF], SFI Working Paper 07-08-029, 2007. "...From the point of
view of thermodynamics entropy is a purely macroscopic quantity."
Howard Landman disagrees and offers helpful discussion,
beginning 29 Sep 2007.
Fraústo da Silva recommends Peter A. Corning, 21 Jun 2007.
John Whitfield, "Survival of the Likeliest?" [text],
10.1371/journal.pbio.0050142, e142, v 5 n 5, PLoS Biology,
published 15 May 2007. "Throughout the universe, the interaction of
energy and matter brings regular structures, be they stars, crystals, eddies in fluids, or weather systems in atmospheres, into being.
...Could [living things] be part of the same phenomenon?"
Philip Dorrell disagrees, 26 Dec 2006.
Brig Klyce interviewed about the Evolution Prize by Tom
Barbalet of biota.org, 2 Sep 2006.
Anonymous replies with a compliment, 21 Dec 2005.
Cynthia Yockey replies to amend our comments on Hubert P.
Yockey, 17 Nov 2005.
Dr. Shu-Kun Lin, publisher of the online journal Entropy,
disagrees with Rossell, 26 Sep 2005.
Sergio Rossell suggests that when two gasses mix, the
thermodynamic entropy change is measurable, 27 Aug 2005.
Jean-Bernard Brissaud, "The meanings of entropy" [abstract | pdf],
p 68-96 v 7, Entropy, 14 Feb 2005. "A confusion about the nature of
entropy comes from the fact that a perfectly compressed message is
of maximum entropy, containing a maximum amount of information,
while a random sequence of 0s and 1s, also of maximum entropy,
contains no information."
24 Apr 2005: Information Theory, Evolution and the Origin of
Life, by Hubert Yockey
Richard Dawkins, "Human Chauvinism and Evolutionary
Progress," p 206-217, A Devils Chaplain, Mariner Books, 2004.
"Evolution turns out to be clearly and importantly progressive,"
Dawkins now says.
Todd L. Duncan and Jack S. Semura, "The Deep Physics Behind the Second Law: Information and Energy As Independent Forms of Bookkeeping" [abstract], p 21-29 v 6, Entropy, Mar 2004.
22 Mar 2004: Stephen Wolfram quote
Dr. Attila Grandpierre, Konkoly Observatory, Hungary replies, 22
Jan 2004.
Andreas Greven et al., eds., Entropy, ISBN: 0-691-11338-6
[promo] [Chapter 1.pdf], Princeton University Press, 2003. "We
hope that these seemingly mysterious relations become clearer by
reading through this book."
Harvey S. Leff and Andrew F. Rex, Maxwell's Demon 2: Entropy,
Classical and Quantum, Information, Computing, Institute of
Physics Publishing, 2003.
The Adjacent Possible: Stuart Kauffman talks about "the need
for a theory of organization," n 127, Edge, 3 Nov 2003.
Is Intelligence a Biological Imperative?: Part IV, of a forum
entitled, "The Drake Equation Revisited," held in Palo Alto, CA, 26
August 2003. In the discussion between Peter Ward and David
Grinspoon, the latter invokes non-equilibrium thermodynamics to
enable life to decrease its logical entropy.
Henry Gee, "Progressive evolution: Aspirational thinking" [text], p
611 v 420, Nature, 12 Dec 2002. "Progressive evolution... stems
from a profoundly idealistic, pre-evolutionary view of life." (Gee
agrees with Ruse.)
"Claude Shannon, Mathematician, Dies at 84," The New York
Times, 27 February 2001.
2000, November 23: Monad to Man, by Michael Ruse, who
doubts evolutionary progress.
2000, October 25: Jim Galasyn comments on the second law.
2000, June 12: Ernst Mayr does not doubt evolutionary progress.
Christoph Adami, Charles Ofria and Travis C. Collier, "Evolution
of biological complexity" [abstract], doi:10.1073/pnas.97.9.4463,
p4463-4468 v97, Proc. Natl. Acad. Sci., USA, 25 Apr 2000.
1998, December 23: Gert Korthof compliments this page.

References
1. Harold J. Morowitz, Beginnings of Cellular Life: Metabolism
Recapitulates Biogenesis, Yale University Press, 1992. p 69.
2. Thomas Kuhn, Black-Body Theory and the Quantum
Discontinuity, 1894-1912, The University of Chicago Press, 1978. p
13.
3. Richard P. Feynman, Robert B. Leighton and Matthew Sands, The
Feynman Lectures on Physics, v I; Reading, Massachusetts:
Addison-Wesley Publishing Company, 1963. section 44-3.
4. Harvey S. Leff and Andrew F. Rex, Maxwell's Demon: Entropy,
Information, Computing, Princeton University Press, 1990. p 6.
5. For technical discussions of this difference see The Maximum
Entropy Formalism, Raphael D. Levine and Myron Tribus, eds., The
MIT Press, 1979. Also see correspondence with Sergio Rossell beginning 27 Aug 2005 and 26 Sep 2005. Following these exchanges we have changed our text from "...if no heat is exchanged..." to "...if no energy is dissipated...".
6. Myron Tribus and Edward C. McIrvine. "Energy and
Information," p 179-188 v 225, Scientific American, September,
1971.
7. Richard P. Feynman, Robert B. Leighton and Matthew Sands, The
Feynman Lectures on Physics, v I; Reading, Massachusetts:
Addison-Wesley Publishing Company, 1963.
7.5. Peter Coveney and Roger Highfield, The Arrow of Time,
Ballantine Books, 1990. p 176-177.
8. C. E. Shannon, "A Mathematical Theory of Communication" p
379-423 and 623-656, v 27, The Bell System Technical Journal, July,
October, 1948. PostScript and pdf reprints are available.
8.5. [Einstein quoted in] F. Alexander Bais and J. Doyne Farmer,
"Physics of Information" [PDF], SFI Working Paper 07-08-029,
2007. p 35. [Also quoted in] Constantino Tsallis, Murray Gell-Mann
and Yuzuru Sato, "Asymptotically scale-invariant occupancy of
phase space makes the entropy Sq extensive" [abstract],
doi:10.1073/pnas.0503807102, p 15377-15382 v 102, Proc. Natl.
Acad. Sci., USA, 25 Oct 2005.
9. Ernst Mayr, Toward a New Philosophy of Biology, Harvard
University Press, 1988. p 251.
10. Charles Darwin, On the Origin of Species by Means of Natural
Selection, or the Preservation of Favoured Races in the Struggle for
Life. London: John Murray, Albemarle Street, 1859.
11. Lynn Margulis and Dorion Sagan, What Is Life? Simon and
Schuster, 1995. p 44.
12. Stephen Jay Gould, [interviewed in] The Third Culture, by John
Brockman, Simon and Schuster, 1995. p 52.
13. Richard Dawkins, [interviewed in] The Third Culture by John
Brockman, Simon and Schuster, 1995. p 84.
14. John Maynard Smith and Eörs Szathmáry, The Major
Transitions in Evolution, W.H. Freeman and Company Limited,
1995. p 4.
15. Theodosius Dobzhansky, Studies in the Philosophy of Biology:
Reduction and Related Problems, Francisco J. Ayala and Theodosius
Dobzhansky, eds. University of California Press, 1974. p 311.
16. Edward O. Wilson, The Diversity of Life, Harvard University
Press, 1992. p 187. Wilson acknowledges Charles S. Peirce, who wrote, "Let us not pretend to doubt in philosophy what we do not doubt in our hearts." "Some Consequences of Four Incapacities," Collected Papers of Charles Sanders Peirce, v 5, Charles Hartshorne and Paul Weiss, eds., Harvard University Press,
1934.
17. Philip Kitcher, Abusing Science, The MIT Press, 1982. p 90.
18. Tim M. Berra, Evolution and the Myth of Creationism: A Basic
Guide to the Facts in the Evolution Debate, Stanford University
Press, 1990. p 126.
19. Charles B. Thaxton, Walter L. Bradley and Roger L. Olsen, The Mystery of Life's Origin: Reassessing Current Theories, New York: Philosophical Library, 1984. p 136-142. The website has three online
chapters.
19.5. Jeffrey S. Wicken, Evolution, Thermodynamics and
Information: Extending the Darwinian Program, Oxford University
Press, 1987. p 59.
20. Roger Penrose, The Emperor's New Mind, Oxford University
Press, 1989.
21. Martin Goldstein and Inge F. Goldstein, The Refrigerator and
the Universe: Understanding the Laws of Energy, Harvard
University Press, 1993.
22. Daniel R. Brooks and E. O. Wiley, Evolution as Entropy, second
edition; The University of Chicago Press, 1988. p 37-38.
23. Robert Rosen, Life Itself: A Comprehensive Inquiry Into the
Nature, Origin and Fabrication of Life, Columbia University Press,
1991. p 114.
23.5. Hans Christian von Baeyer, Maxwell's Demon: Why Warmth
Disperses and Time Passes, Random House, 1998. p 165. [review in
physicsworld.com by Rolf Landauer, 8 Jan 1999].
24. Ilya Prigogine, From Being To Becoming, New York: W. H.
Freeman and Company, 1980. p 123.
25. [quoted in] John Horgan, "From Complexity to Perplexity," p
104-109, Scientific American June 1995.
26. Hubert P. Yockey, Information theory and molecular biology,
Cambridge University Press, 1992.
27. Christoph Adami, Introduction to Artificial Life, Telos (Springer-Verlag), 1998.
28. Paul Davies, The Fifth Miracle, Simon and Schuster, 1999.
29. Werner R. Loewenstein, The Touchstone of Life: Molecular
Information, Cell Communication, and the Foundations of Life,
Oxford University Press, 1999.
30. Peter Medawar, Pluto's Republic, Oxford University Press, 1984.
p 226.
31. Rolf Landauer, "Minimal Energy Requirements in
Communication" p 1914-1918 v 272 Science, 28 June 1996.
32. E.P. Wigner, [cited by] E. T. Jaynes, "Gibbs vs Boltzmann
Entropies" p 391-398 v 33 n 5 American Journal of Physics, May
1965.