
In 1943 Erwin Schrödinger used the concept of “negative entropy” in his popular-science book What is Life?. The actual term “negentropy” was later coined by Léon Brillouin. In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy.

Schrödinger introduced the concept when explaining that a living system exports entropy in order to maintain its own entropy at a low level. By using the term "negentropy", he could express this fact in a more "positive" way: a living system imports negentropy and stores it.

Strictly speaking, negentropy is not a distinct quantity but a way of describing entropy flows: a system loses entropy when more entropy flows out of it than into it. A thermodynamic example, where the entropy transferred is heat divided by temperature, is the flow of energy from hot to cold through an intermediate medium. In steady state this medium is in energetic equilibrium, that is, it emits as much energy as it receives. Yet emission occurs at a lower temperature than absorption. This means that more entropy flows out than in, so the transiting medium exports entropy and thereby keeps its own entropy low. The total amount of entropy nevertheless rises, because the heat is now conducted at a lower temperature than before.
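As a worked example (the numbers here are illustrative, not from the original text): suppose the medium conducts Q = 100 J of heat from a reservoir at T_hot = 400 K to one at T_cold = 300 K. With entropy transfer S = Q/T,

\[
S_{\text{in}} = \frac{100\,\mathrm{J}}{400\,\mathrm{K}} = 0.250\ \mathrm{J/K},
\qquad
S_{\text{out}} = \frac{100\,\mathrm{J}}{300\,\mathrm{K}} \approx 0.333\ \mathrm{J/K},
\]

so the medium exports about 0.083 J/K more entropy than it imports, while the total entropy of the two reservoirs rises by that same amount.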

In a note to What is Life? Schrödinger explains his usage of this term.

Let me say first, that if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things. (Erwin Schrödinger)

Further, Schrödinger acknowledges the ambiguity of his term usage but is explicit in defining it:

"Hence the awkward expression 'negative entropy' can be replace by a better one: entropy, taken with the negative sign, is itself a measure of order. Thus the device by which an organism maintains itself stationary at a fairly high level of orderliness (= low level of entropy) really consists in continually sucking orderliness from the environment." (What Is Life?, pg 79)

The natural phenomenon which Schrödinger recognized has also been described by others. In his 1996 Investigations lecture series, Stuart A. Kauffman describes essentially the same phenomenon in different terms. First, he defines autonomous agents, a class that includes living organisms:

"It is precisely because E. coli is a living cell, a collectively reproducing organization of matter, energy, and process capable of self maintenance, reproduction, and evolution, that we unhesitatingly think of the E. coli cell as an AUTONOMOUS AGENT ... In order to be an Autonomous Agent, a system must carry out work cycles by virtue of which it maintains and amplifies itself. These work cycles require that the Devices carry out work in the "+" direction. At exact equilibrium no work cycles can be carried out. Hence no Agency exists." Kauffman, 1996 Lecture Series, sec. 3.1.9

But rather than "sucking orderliness" to escape equilibrium, as Schrödinger described the phenomenon, Kauffman describes autonomous agents (living organisms) as "ratcheting" themselves away from equilibrium:

"The fact that Autonomous Agents are necessarily displaced from equilibrium and perform work cycles means that agents can, and do "ratchet" themselves further from equilibrium." Kauffman, 1996 Lecture Series, Preface

Though controversy exists over the terminology, the phenomenon being observed has been described by multiple sources, and it can be experimentally tested and measured.

References[]

  • Kauffman, Stuart A. (1996). Investigations: The Nature of Autonomous Agents and the Worlds They Mutually Create. 4th ed. New York: Santa Fe Institute.
  • Schrödinger, Erwin (1967). What Is Life? & Mind and Matter. Cambridge: Cambridge University Press. ISBN 052109397



Information theory[]

In information theory, “negentropy” is used as a measure of distance to normality. Given a signal with a certain distribution, negentropy quantifies how far that distribution is from a Gaussian (normal) distribution with the same mean and variance. Negentropy is always non-negative, is invariant under any linear invertible change of coordinates, and vanishes if and only if the signal is Gaussian.

Negentropy is defined as

\[
J(x) = S(\varphi_x) - S(x),
\]

where \(\varphi_x\) stands for the Gaussian density with the same mean and variance as \(x\) and \(S\) is the differential entropy:

\[
S(x) = -\int f_x(u)\,\ln f_x(u)\,du.
\]
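Since the Gaussian density maximizes differential entropy among all densities with a given variance (a standard result, stated here for illustration), negentropy can never be negative. For a one-dimensional Gaussian with variance \(\sigma^2\),

\[
S(\varphi_x) = \tfrac{1}{2}\ln\!\left(2\pi e \sigma^2\right) \ \ge\ S(x),
\qquad\text{hence}\qquad
J(x) = S(\varphi_x) - S(x) \ \ge\ 0,
\]

with equality exactly when \(x\) itself is Gaussian.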

Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in Independent Component Analysis.
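As a rough sketch of how negentropy is estimated in practice, the snippet below uses the classical moment-based approximation J(x) ≈ E[x³]²/12 + kurt(x)²/48 for a standardized signal, an approximation common in the ICA literature; the function name and sample sizes are illustrative, not taken from any cited source:

    import numpy as np

    def negentropy_approx(x):
        """Moment-based negentropy approximation for a 1-D signal:
        J(x) ~ E[x^3]^2 / 12 + kurt(x)^2 / 48, with x standardized.
        Near 0 for Gaussian data, > 0 for non-Gaussian data."""
        x = np.asarray(x, dtype=float)
        x = (x - x.mean()) / x.std()           # zero mean, unit variance
        skew_term = np.mean(x**3) ** 2 / 12.0  # squared-skewness term
        kurt = np.mean(x**4) - 3.0             # excess kurtosis (0 for Gaussian)
        return skew_term + kurt**2 / 48.0

    rng = np.random.default_rng(0)
    print(negentropy_approx(rng.normal(size=100_000)))   # ~0: Gaussian signal
    print(negentropy_approx(rng.uniform(size=100_000)))  # >0: non-Gaussian signal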

  • Comon, P. (1994). Independent component analysis, a new concept? Signal Processing, 36: 287–314.



This page uses Creative Commons Licensed content from Wikipedia.