Negentropy
In 1943 Erwin Schrödinger used the concept of “negative entropy” in his popular-science book What is Life?. The actual term “negentropy” was later coined by Léon Brillouin.
Schrödinger introduced the concept when explaining that a living system exports entropy in order to keep its own entropy at a low level. By using the term "negentropy", he could express this fact in a more "positive" way: a living system imports negentropy and stores it.
In a note to What is Life?, Schrödinger explains his use of the term:
- Let me say first, that if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things. (Erwin Schrödinger)
Information theory
In information theory, negentropy is used as a measure of distance to normality: it quantifies how far the distribution of a signal is from a Gaussian (normal) distribution with the same mean and variance. Negentropy is always non-negative, is invariant under any invertible linear change of coordinates, and vanishes if and only if the signal is Gaussian.
Negentropy is defined as
- <math>J(p_x) = S(\phi_x) - S(p_x)</math>
where <math>\phi_x</math> is the Gaussian density with the same mean and variance as <math>p_x</math>, and <math>S</math> denotes the differential entropy:
- <math>S(p_x) = - \int p_x(u) \log p_x(u) du</math>
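The definition can be illustrated with a small numerical sketch (not part of the original article). The code below estimates <math>J(p_x)</math> from samples, assuming SciPy's nonparametric differential_entropy estimator (SciPy 1.6+) for <math>S(p_x)</math> and the closed-form entropy <math>\tfrac{1}{2}\log(2\pi e \sigma^2)</math> of the matched Gaussian; the helper name negentropy is illustrative only.

```python
# Minimal sketch: estimate negentropy J(p) = S(phi) - S(p) for a 1-D sample.
# S(phi) is the differential entropy of a Gaussian with the same mean and
# variance, known in closed form as 0.5 * ln(2 * pi * e * sigma^2) (in nats).
import numpy as np
from scipy.stats import differential_entropy  # requires SciPy >= 1.6

def negentropy(samples):
    """Estimate J(p_x) in nats for a 1-D array of samples (illustrative helper)."""
    var = np.var(samples, ddof=1)
    s_gauss = 0.5 * np.log(2 * np.pi * np.e * var)  # entropy of matched Gaussian
    s_x = differential_entropy(samples)             # nonparametric estimate of S(p_x)
    return s_gauss - s_x

rng = np.random.default_rng(0)
print(negentropy(rng.normal(size=100_000)))   # near 0: a Gaussian signal has zero negentropy
print(negentropy(rng.uniform(size=100_000)))  # about 0.176 nats for a uniform signal
```

As the definition predicts, the estimate is close to zero for Gaussian data and strictly positive for a non-Gaussian (here uniform) signal.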
Negentropy is used in statistics and signal processing; in particular, it serves as a measure of non-Gaussianity in Independent Component Analysis.
P. Comon, "Independent Component Analysis - a new concept?", Signal Processing, 36:287–314, 1994.