Everyday entropy: Shannon and Hawking

“Opposites are Complementary” – Niels Bohr coat of arms


“Truth and clarity are complementary.” – Niels Bohr

  • As quoted in Quantum Theory and the Flight from Realism : Philosophical Responses to Quantum Mechanics (2000) by Christopher Norris, p. 234


Before virtual reality replaces physical reality with information, it may be worth our while to ask what distinguishes information from the physical stuff it is replacing.  In particular, why is Stephen Hawking, the physicist, concerned with the loss of information as matter falls into a black hole, and why did Claude Shannon, the information theorist, describe the amount of information a system can transmit as its entropy?  The simple answer is that information and entropy are complementary in Bohr’s terminology, so that each conforms to the other.  Entropy is the configuration of a body, which may be measured in terms of energy (Gibbs entropy), information transmission capacity (Shannon entropy), or even its quantum state (von Neumann entropy).  Shannon entropy is the quantum of information an information system can transmit, while Hawking information is the quantum description of the physical body.  These informations are fundamentally distinct outside of a black hole, but what does their differentiation mean for our virtual future?

When Hawking talks about the information of a body swallowed by a black hole, he means the quantum mechanical description of its particles, or the Hamiltonian measure of its energy.  When Shannon talks about the entropy of an information system, he means the sum of distinguishable messages that can be transmitted from one physical body to another with measurable accuracy.  Their formulae are nearly identical, but the critical distinction is that Shannon’s information is epistemological and hermeneutic – communicable data – while Hawking information is ontological – the essential stuff that defines the body itself.  Shannon information is data that one body can “know” about another body and communicate to a third body, while Hawking information is the intrinsic characteristics that make something distinguishable as itself.  Hawking information is true but unknowable, partly because it is inaccessible without changing the thing and partly because of Heisenberg uncertainty.  Shannon information is knowable but not true with regard to any physical reality because the true information is inaccessible.
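How nearly identical are the two formulae?  A few lines of Python make it concrete (the probability distribution here is purely illustrative): Shannon’s H = −Σ p log₂ p and Gibbs’s S = −k_B Σ p ln p have exactly the same shape and differ only by the constant factor k_B ln 2 – a unit conversion from bits to joules per kelvin.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy in J/K: S = -k_B * sum(p * ln(p))."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# An illustrative distribution over three states:
probs = [0.5, 0.25, 0.25]
h = shannon_entropy(probs)   # 1.5 bits
s = gibbs_entropy(probs)     # the same quantity, scaled by k_B * ln(2)

# Dividing out the constant recovers the Shannon value exactly:
print(h, s / (K_B * math.log(2)))  # both are 1.5
```

The formulae coincide; what differs, as the paragraph above argues, is what the probabilities are taken to describe – messages in a channel, or microstates of a body.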

These two informations are complementary in Niels Bohr’s epistemology of quantum mechanics.  Shannon information is knowable but not true with regard to any physical reality: first, because the true information is inaccessible; second, because that information would need to be translated for transmission; and third, because even if it were somehow accessible and measurable, there would be too much of it to fit into any information system.  Hawking information is infinitely variable and uncertain, while Shannon information is finite and certain.  This mirrors the difference between fiction and non-fiction.  Fiction is untrue but certain: the author can determine facts with an arbitrary degree of certainty, from who said what all the way down to the momentum and position of a particle, which is impossible in reality but perfectly normal in an imaginary universe.  Non-fiction is true but uncertain: from “he said, she said” all the way down to quantum uncertainty, the author cannot establish any fact with complete certainty, yet there is an infinite amount of information about the facts that can be presented as if it formed a coherent narrative.

In his Nobel lecture, Max Planck hinted at the challenge of knowing anything more than the binary resonance of matter, which is the Shannon information that it transmits in radiation:

“The general connection between the energy of a resonator of specific natural period of vibration and the energy radiation of the corresponding spectral region in the surrounding field under conditions of stationary energy exchange. The noteworthy result was found that this connection was in no way dependent upon the nature of the resonator, particularly its attenuation constants – a circumstance which I welcomed happily since the whole problem thus became simpler, for instead of the energy of radiation, the energy of the resonator could be taken and, thereby, a complex system, composed of many degrees of freedom, could be replaced by a simple system of one degree of freedom…

…from the start I tried to get a connection, not between the temperature but rather the entropy of the resonator and its energy, and in fact, not its entropy exactly but the second derivative with respect to the energy since this has a direct physical meaning for the irreversibility of the energy exchange between resonator and radiation.”

But look what happens when he mentions entropy: “…not its entropy exactly…”  Planck is a towering figure among Nobel laureates, whose eponymous distance is so small that it cannot be measured.  When he says “not exactly,” something strange is happening.  Entropy was Planck’s gateway, his path and his inspiration for resolving the mystery of blackbody radiation energy, but the entropy itself was unnecessarily complex.  The information transmitted by the resonance of matter is not its entropy, but a much simpler function of its energy.  Note, also, that the second derivative of the entropy with respect to the energy – the rate of change of the rate of change of the entropy as energy is exchanged – is the form of the system’s irreversibility, and it is consistent for any system out of equilibrium, whether informational or physical.
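The “much simpler function” can be written out.  In the standard reconstruction of Planck’s 1900 interpolation (a sketch, not a quotation from the lecture), the second derivative of the resonator’s entropy $S$ with respect to its energy $U$ takes the simple form

$$\frac{\partial^2 S}{\partial U^2} \;=\; -\frac{\alpha}{U\,(\beta + U)}.$$

Integrating once and applying the thermodynamic identity $\partial S/\partial U = 1/T$ gives

$$\frac{1}{T} \;=\; \frac{\alpha}{\beta}\,\ln\!\left(\frac{\beta + U}{U}\right)
\qquad\Longrightarrow\qquad
U \;=\; \frac{\beta}{e^{\beta/(\alpha T)} - 1},$$

which, once the constants are identified as $\alpha = k_B$ and $\beta = h\nu$, is Planck’s law for the mean energy of a resonator – the result that quietly introduced the quantum.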

Shannon and Hawking information can’t be reconciled into one system, but they are inherently connected through entropy.  One is always becoming the other and vice versa.  By emitting photons, sharing electrons and generally bumping into one another, things are constantly converting their Hawking information into Shannon information and transmitting it to other things, which convert the Shannon information back into Hawking information.  In an information system, Shannon entropy measures the limit on the information that can be transmitted, a limit imposed by the physical configuration (Hawking information) of the system – information cannot normally transmit itself.  But every transmission of information also results in an increase in the physical entropy of the information system.  Shannon entropy is related to physical (Gibbs) entropy by the fact that matter is transformed by the transmission and receipt of information, and information only exists in transmission between a physical transmitter and receiver.  The entropies transform one another.

The meaning of the information, both for the transmitter and for the recipient, is infinitesimally variable and infinitely expandable, but the information that is certain is confined to the capacity of the system.  This is why Shannon entropy matters.  The infinite and infinitesimal variability of the meaning of the message for sender and receiver is a function of physical entropy, because only matter can transmit or receive information.  A photon may be entangled with another photon by their mutual interaction with matter, but one photon can’t transmit its information to another.

Researchers at Delft University of Technology have recently confirmed, with a loophole-free Bell test, that local realism is untenable, which means that Shannon and Hawking information cannot be reconciled, i.e., the information you can know about a thing cannot be reconciled with the information in the thing because the information in the thing is not determined locally.  Truth cannot be communicated and communication cannot be true.  This is not to say that communication is trivial or that truth is irrelevant – both fiction and non-fiction are socially, politically and scientifically essential – only that you have to respect the difference between truth (Hawking information) and communication (Shannon information) to avoid falling into social, political and scientific fantasy.

Probability and information: note that there is no information when the probability is too low, and also no information when the probability is too high.
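That relationship between probability and information is easy to see with the binary entropy function (a small illustrative sketch): a yes/no event carries no information when its probability is near 0 or near 1 – the outcome is already settled – and carries the most, exactly one bit, when the probability is 0.5.

```python
import math

def binary_entropy(p):
    """Shannon entropy in bits of a yes/no event with probability p."""
    if p <= 0 or p >= 1:
        return 0.0  # a certain or an impossible event carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(f"p = {p:4}: H = {binary_entropy(p):.3f} bits")
# H vanishes toward both extremes and peaks at exactly 1 bit when p = 0.5
```

Too improbable and the message never arrives; too probable and it tells you nothing you did not already know.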
Finally, it is important to remember that every “measure” of entropy is fiction.  From the statistical mechanics of the average momentum of the molecules of an ideal gas in isolation to the notions of energy and temperature, everything that describes entropy has to be made up, because actually measuring physical entropy would require resolving Heisenberg uncertainty.  Having said that, observations have shown conclusively that the fiction is almost exactly what you would observe if you could watch what was going on in the non-resonant relationships between energy and matter.  The inner transformation of matter really works like the distribution of momentum and position of the molecules of an ideal gas in an isolated container.  You just can’t observe it actually happening.  Ever.  That use of fiction to describe truth in a way that is unobservable is precisely why fiction is so important in all forms of communication, and why art can’t be removed from science without losing contact with half of reality.
