Everyday entropy: Shannon’s intuition

[Image PIA21386: whorls in Jupiter's clouds]
“Much like the kinetic energy of a massive object, the energy of a photon is a reference-frame-dependent quantity. A nice way to put it is to say the energy of light is expressed by the relationship between the emitter and receiver.”
“E = mc² famously suggests the idea that you can get a lot of energy out of a small amount of mass. But that’s not what Einstein had in mind, really, and you won’t find that equation in the original paper. The way he wrote it was m = E/c², and the original paper had a title that was a question, which was, “Does the inertia of a body depend on its energy content?” So right from the beginning Einstein was thinking about the question of could you explain mass in terms of energy. It turned out that the realization of that vision, the understanding of how not only a little bit of mass but most of the mass, 90 percent or 95 percent of the mass of matter as we know it, comes from energy. We build it up out of massless gluons and almost massless quarks, producing mass from pure energy. That’s the deeper vision.”
Frank Wilczek, Theoretical Physicist, MIT
Shannon’s use of “entropy” to describe information has created some confusion, possibly because Boltzmann’s equation is taken as the definition of entropy rather than an explanation of it. The same thing happened with E = mc². Note that Einstein was only interested in the relationship between energy and mass, not their definitions. One reason his equation was converted into a definition of energy was the crisis of confidence arising from statistical mechanics and the quantisation of light. The notion that energy was made of pure electromagnetic waves was crumbling, so a clear definition of energy was very attractive.
But Einstein didn’t define energy or mass; he just said they were inextricably linked to one another and to the speed of light. Because people felt they had a handle on what mass was, i.e. weight, E = mc² looked like a definition of energy that was stable and comprehensible in a way that statistical-mechanical entropy was not. Einstein meant to deconstruct the definitions of mass and inertia, not to create a definition of energy. The popularity of E = mc² filled the vacuum of understanding around energy, and it led to a general misunderstanding of energy as well as mass. Shannon and von Neumann, far from confusing the issues of entropy and information, recovered the relationship between energy, information and matter, albeit in a way that is extremely confusing and maybe not entirely intentional.
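A back-of-the-envelope check makes Wilczek’s point concrete (a rough sketch using standard textbook values, not figures from the quote itself). Written the way Einstein originally posed it,

\[
m = \frac{E}{c^2},
\]

the proton’s rest energy is about \(938\ \mathrm{MeV}\), while the rest masses of its two up quarks and one down quark add up to only about \(2(2.2) + 4.7 \approx 9\ \mathrm{MeV}\). The overwhelming remainder is the energy of the gluon field and of the quarks’ confined motion – mass built out of energy, just as Wilczek describes.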

John von Neumann is usually credited with suggesting the term “entropy” for the state of information in a communication system in a flippant or arbitrary way, telling Shannon that “nobody really knows what entropy is anyway, so you’ll have the upper hand in any debate”. But von Neumann was an entropy guy – he brought the concept of entropy into quantum mechanics, where it now measures entanglement – so he was fluent in Einstein’s use of entropy to describe the quantisation of light energy. He could see, intuitively, the relationship between the quantisation of information and the quantisation of light, and their mutual relationship with the quantisation of momentum in Boltzmann’s statistical mechanics. Besides, Shannon never credited von Neumann, and was himself convinced of the deeper relationship between information and thermodynamics. In his 1968 Encyclopædia Britannica article “Information Theory”, Shannon wrote the following telling statement:

“The formula for the amount of information is identical in form with equations representing entropy in statistical mechanics, and suggests that there may be deep-lying connections between thermodynamics and information theory. Some scientists believe that a proper statement of the second law of thermodynamics requires a term related to information. These connections with physics, however, do not have to be considered in the engineering and other [fields?].”
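The formal identity Shannon points to is easy to display side by side. In modern notation (my own summary, not Shannon’s typography): the entropy of an information source with symbol probabilities \(p_i\), the Gibbs entropy of statistical mechanics with microstate probabilities \(p_i\), and von Neumann’s quantum entropy of a density matrix \(\rho\) are

\[
H = -\sum_i p_i \log_2 p_i, \qquad
S = -k_B \sum_i p_i \ln p_i, \qquad
S_{\mathrm{vN}} = -\operatorname{Tr}(\rho \ln \rho),
\]

measured in bits, joules per kelvin and nats respectively. Apart from the base of the logarithm and Boltzmann’s constant \(k_B\) – which amount to a choice of units – the three expressions share one functional form, which is exactly the “identical in form” of Shannon’s article.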

Like the phases of Jupiter’s hydrogen – gas, liquid and metal – energy, information and mass are in constant circulation through one another, and never sufficiently static in space or time to have definitions that “make sense.”  Entropy, in all of its forms, tries to make sense of their relationships with one another and with space and time.