Thermodynamics
Thermodynamic Entropy
The first opportunity for confusion arises when we introduce the
term entropy into the mix. Clausius invented the term in 1865. He
had noticed that a certain ratio was constant in reversible, or ideal,
heat cycles. The ratio was heat exchanged to absolute temperature.
Clausius decided that the conserved ratio must correspond to a real,
physical quantity, and he named it "entropy".
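In modern notation (my summary of the standard textbook account, not Clausius's own formulation), the constancy he noticed is this: for a reversible Carnot cycle between temperatures T_h and T_c the heat-to-temperature ratios match, and around any reversible cycle the ratio sums to zero, which is what allows entropy to be defined as a state function:

```latex
\frac{Q_h}{T_h} = \frac{Q_c}{T_c},
\qquad
\oint \frac{\delta Q_{\mathrm{rev}}}{T} = 0,
\qquad
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
```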
Surely not every conserved ratio corresponds to a real, physical
quantity. Historical accident has introduced this term to science. On
another planet there could be physics without the concept of entropy.
It completely lacks intuitive clarity. Even the great physicist James
Clerk Maxwell had it backward for a while (4). Nevertheless, the term has become a permanent fixture of physics.
Logical Entropy
Entropy is also used to mean disorganization or disorder. J. Willard
Gibbs, the nineteenth century American theoretical physicist, called
it "mixedupness." The American Heritage Dictionary gives as the
second definition of entropy, "a measure of disorder or randomness
in a closed system." Again, it's a negative concept, this time the
opposite of organization or order. The term came to have this second meaning by way of statistical mechanics. Richard Feynman explains:
So we now have to talk about what we mean by disorder and what we mean by order. ... Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure "disorder" by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the "disorder" is less.
This is Boltzmann's model again. Notice that Feynman does not use
Boltzmann's constant. He assigns no physical units to this kind of
entropy, just a number (a logarithm). And he uses not a single
equation in this section of his Lectures.
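A toy calculation makes the counting concrete. This Python sketch is my illustration, not Feynman's (the molecule and cell counts are arbitrary choices): it counts the arrangements with and without the color restriction and takes the logarithm, attaching no physical constant and no units.

```python
from math import comb, log

# Toy version of Feynman's black-and-white-molecule count. The numbers
# are my own arbitrary choices: a box of 2*m volume elements, m per
# side, holding n black and n white molecules, at most one per element.
m, n = 10, 4

# Separated: whites confined to the left half, blacks to the right half.
ways_separated = comb(m, n) * comb(m, n)

# Unrestricted: any molecule may occupy any free element.
ways_mixed = comb(2 * m, n) * comb(2 * m - n, n)

# The "entropy" is just the logarithm of the number of ways --
# a pure number, with no Boltzmann's constant attached.
print(log(ways_separated))  # ~10.7
print(log(ways_mixed))      # ~16.0: more ways, so more "disorder"
```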
Notice another thing. The "number of ways" can only be established
by first artificially dividing up the space into little volume elements.
This is not a small point. In every real physical situation, counting
the number of possible arrangements requires an arbitrary parceling.
As Peter Coveney and Roger Highfield say (7.5):
There is, however, nothing to tell us how fine the [parceling] should be. Entropies calculated in this way depend on the size-scale decided upon, in direct contradiction with thermodynamics in which entropy changes are fully objective.
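The dependence on parceling is easy to exhibit. In this sketch (my illustration, not Coveney and Highfield's), N distinguishable molecules occupy a fixed volume divided into c cells; there are c to the power N arrangements, so the logical entropy N log c grows with the arbitrary choice of c:

```python
from math import log

# N distinguishable molecules in a fixed volume, parceled into c cells.
# Each molecule may occupy any cell, so there are c**N arrangements and
# the logical entropy is log(c**N) = N * log(c).
N = 100
for c in (10, 100, 1000):  # three different parceling scales
    print(f"{c:5d} cells -> entropy {N * log(c):7.1f}")
# The same physical situation gets a different entropy for each choice
# of cell size -- exactly the objection quoted above.
```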
"The fallacy of progress." John Maynard Smith and Eörs Szathmáry, 1995 (14)
Life is Organization
Seen in retrospect, evolution as a whole doubtless had a general
direction, from simple to complex, from dependence on to relative
independence of the environment, to greater and greater autonomy
of individuals, greater and greater development of sense organs and
nervous systems conveying and processing information about the
state of the organism's surroundings, and finally greater and greater
consciousness. You can call this direction progress or by some other
name. Theodosius Dobzhansky (15)
Progress, then, is a property of the evolution of life as a whole by
almost any conceivable intuitive standard.... Let us not pretend to
deny in our philosophy what we know in our hearts to be
true. Edward O. Wilson (16)
Life is organization. From prokaryotic cells, eukaryotic cells, tissues
and organs, to plants and animals, families, communities,
ecosystems, and living planets, life is organization, at every scale.
The evolution of life is the increase of biological organization, if it is
anything. Clearly, if life originates and makes evolutionary progress
without organizing input somehow supplied, then something has
organized itself. Logical entropy in a closed system has decreased.
This is the violation people are getting at when they say that life violates the second law of thermodynamics. This violation, the decrease of logical entropy in a closed system, must happen continually in the Darwinian account of evolutionary progress.
Prigogine
Some find such passages obscure and tentative. One critic complains
that work along the lines advocated by Prigogine fifteen years earlier
has borne little fruit subsequently. "I don't know of a single
phenomenon he has explained," said Pierre C. Hohenberg of Yale
University (25).
Dr. Hubert P. Yockey gives the subject of entropy and biology a
probing and insightful treatment in his monograph, Information Theory and Molecular Biology (26). He emphatically agrees that
there are different kinds of entropy that do not correlate. "The
Shannon entropy and the Maxwell-Boltzmann-Gibbs entropy... have
nothing to do with each other" (p 313). But Shannon entropy (which
pertains to information theory) makes no distinction between
meaningful DNA sequences that encode life, and random DNA
sequences of equal length. (Shannon wrote, "These semantic aspects
of communication are irrelevant to the engineering problem.") With
no distinction between meaningful and meaningless sequences,
Yockey is able to conclude that evolution does not create any
paradox for Shannon entropy. Nevertheless, Yockey proves with
impressive command of biology and statistics that it would be
impossible to find the new genes necessary for evolutionary progress
by the random search method currently in favor. He is deeply
sceptical of the prevailing theories of evolution and the origin of life
on Earth. (Cynthia Yockey, 2005)
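Yockey's point about Shannon entropy is easy to demonstrate. In the following sketch (my illustration; the sequence is an arbitrary stand-in, not a real gene), a sequence and a random shuffle of it have identical symbol frequencies and therefore identical Shannon entropy, though at most one of them could be meaningful:

```python
import random
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Per-symbol Shannon entropy, H = -sum(p * log2(p)), in bits."""
    n = len(seq)
    counts = sorted(Counter(seq).values())  # sorted for a stable sum
    return -sum((c / n) * log2(c / n) for c in counts)

# A stand-in for a "meaningful" DNA sequence (hypothetical, chosen for
# illustration only) and a random permutation of the same bases.
meaningful = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
scrambled = "".join(random.sample(meaningful, len(meaningful)))

# Identical base composition, therefore identical Shannon entropy.
print(shannon_entropy(meaningful) == shannon_entropy(scrambled))  # True
```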
Writing in 1998, computer scientist Christoph Adami agrees that trouble
dogs the marriage of biology and logical entropy. In Introduction to
Artificial Life (27), he comments on "the decades of confusion that
have reigned over the treatment of living systems from the point of
view of thermodynamics and information theory..." (p
59). He says, "information is always shared between
two ensembles" (p 70), a restriction that sounds
promising. Yet in his section entitled "Second Law of
Thermodynamics," he says that as a thermodynamic
system is put into contact with another one at a lower
temperature, and thermal equilibrium is reached, the
total entropy of the combined ensemble "stays constant" (p 99). This
flatly contradicts the second law. Later, applying the second law to
information, he explains that only the "conditional entropy" increases in such examples. The unconditional (or marginal) entropy, he maintains, stays constant.
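Classical thermodynamics gives the opposite result. As a standard textbook check (my addition, not Adami's), let heat Q flow from the hotter system at temperature T_h into the colder one at T_c:

```latex
\Delta S_{\mathrm{total}} = \frac{Q}{T_c} - \frac{Q}{T_h}
  = Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) > 0
  \qquad \text{since } T_h > T_c
```

The combined entropy strictly increases as the pair approaches equilibrium; it does not stay constant.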
Conclusion
In my opinion, the audacious attempt to reveal the
formal equivalence of the ideas of biological
organization and thermodynamic order ... must be
judged to have failed. Peter Medawar (30)
Physicist Rolf Landauer wrote an article, published in June 1996, which contains insight that
should discourage attempts to physically link the two kinds of
entropy. He demonstrates that "there is no unavoidable minimal
energy requirement per transmitted bit" (31). Using Boltzmann's
constant to tie together thermodynamic entropy and logical entropy
is thus shown to be without basis. One may rightly object that the
minimal energy requirement per bit of information is unrelated to
logical entropy. But this supposed requirement was the keystone of
modern arguments connecting the two concepts.
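For background (my addition, standard material rather than anything quoted from Landauer's article): the "minimal energy requirement" usually cited is Landauer's bound for erasing a bit, and Boltzmann's constant k_B is the factor that converts a count of logical states into thermodynamic units:

```latex
E_{\mathrm{erase}} \ge k_B T \ln 2 \quad \text{per bit},
\qquad
S = k_B \ln W
```

Landauer's 1996 result is that transmitting a bit, unlike erasing one, carries no such unavoidable cost, which is why the energy-per-bit bridge between the two entropies fails.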
What'sNEW
Entropy is an anthropomorphic concept. E.P. Wigner (32)
Stability and its manifestation in the chemical and biological worlds, Robert Pascal and Addy Pross, doi:10.1039/C5CC06260H, Chem. Commun., 07 Oct 2015. Through a kinetic perspective it can be demonstrated that the process of life's emergence, as well as its subsequent evolution, seemingly inconsistent with, if not at odds with, thermodynamic logic, actually has a mathematical/logical basis.
23 Feb 2015: The second law essentially says that the universe
must have had a beginning and an end.
Roger Penrose, Cycles of Time: An Extraordinary New View of the
Universe, ISBN: 9780099505945, Vintage, May 2012. "We must
indeed bear in mind that there is likely to be always some
measure of subjectivity in the precise value of the entropy that
one might assign to a system." (p 37)
Jeremy L. England, "Statistical physics of self-replication"
[abstract | pdf], doi:10.1063/1.4818538, v 139 n 121923, J. Chem.
Phys., Aug 2013; and commentary: A New Physics Theory of
Life by Natalie Wolchover, Quanta Magazine, 22 Jan 2014.
Charles Lineweaver, Paul C.W. Davies and Michael Ruse, eds., Complexity and the Arrow of Time, ISBN 978-1-10702725, Cambridge University Press, 2013; and a review by Daniel W. McShea, "Unnecessary Complexity" [html], doi:10.1126/science.1245386, p 1319-1320 v 342, Science, 13 Dec 2013.
18 Oct 2013: Manfred Eigen's new book, From Strange Simplicity
to Complex Familiarity.
13 Sep 2013: A major problem with all origin-of-life theories: how did biological "self-preservation" arise?
2 Jan 2013: Shufeng Zhang sends links to his paper, "Entropy: A
concept that is not a physical quantity".
BaBar Experiment Confirms Time Asymmetry, SLAC National Accelerator Laboratory, 19 Nov 2012.
Granville Sewell, "A second look at the second law" [pdf],
doi:10.1016/j.aml.2011.01.019, Applied Mathematics Letters, 2011
[retracted]. "If an increase in order is extremely improbable when a
system is closed, it is still extremely improbable when the system is
open, unless something is entering which makes it not extremely
improbable."
"The unavoidable cost of computation revealed", by Philip Ball.
References
1. Harold J. Morowitz, Beginnings of Cellular Life: Metabolism
Recapitulates Biogenesis, Yale University Press, 1992. p 69.
2. Thomas Kuhn, Black-Body Theory and the Quantum
Discontinuity, 1894-1912, The University of Chicago Press, 1978. p
13.
3. Richard P. Feynman, Robert B. Leighton and Matthew Sands, The
Feynman Lectures on Physics, v I; Reading, Massachusetts:
Addison-Wesley Publishing Company, 1963. section 44-3.
4. Harvey S. Leff and Andrew F. Rex, Maxwell's Demon: Entropy,
Information, Computing, Princeton University Press, 1990. p 6.
5. For technical discussions of this difference see The Maximum
Entropy Formalism, Raphael D. Levine and Myron Tribus, eds., The
MIT Press, 1979. Also see correspondence with Sergio Rossell