Everyday entropy: what is it again, again?

[Image: Crashing at 120 mph]

“entropy is a global quantity, like energy or angular momentum, and shouldn’t be localized on the horizon. The various attempts to identify the microstates responsible for black hole entropy, are in fact constructions of dual theories, that live in separate spacetimes.”
– Stephen Hawking

So what is entropy again, again?  There is a civil war in physics between those who believe in the statistical mechanical definition of entropy and those who believe that the classical thermodynamic definition is still more conceptually applicable.  Statistical mechanics is obviously more correct, but the classical school says it is just too hard to understand and apply in practical terms.  The big difference between the two is that one defines entropy in terms of momentum, space and quantity (statistical mechanics), while the other defines entropy in terms of heat, mass and temperature (classical thermodynamics).

In statistical mechanics, entropy measures the number of ways the momenta of a set of particles can be distributed within a defined space or configuration.
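
To make that concrete, here is a toy sketch in Python (my own illustration, assuming a simple “Einstein solid” of energy quanta shared among oscillators, not anything from the debate above): Boltzmann’s formula S = k·ln(Ω) just counts the ways the energy can be spread around.

```python
# Toy sketch: Boltzmann's statistical entropy S = k_B * ln(Omega), where
# Omega counts the microstates, i.e. the distinct ways the energy can be
# spread over the particles.  Assumed model: q indistinguishable quanta
# shared among N oscillators, so Omega = C(q + N - 1, q).
from math import comb, log

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def statistical_entropy(num_oscillators: int, num_quanta: int) -> float:
    """Entropy in J/K from counting the microstates of the toy solid."""
    omega = comb(num_quanta + num_oscillators - 1, num_quanta)
    return K_B * log(omega)

# Doubling the available energy raises the entropy, but by less than double:
print(statistical_entropy(100, 100))   # roughly 1.9e-21 J/K
print(statistical_entropy(100, 200))   # roughly 2.6e-21 J/K
```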

In classical terms, entropy is the heat that goes into a volume of mass divided by the temperature at which that heat is transferred.
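
Here is the classical version as a sketch, with assumed numbers for a kilogram of water: since dS = dQ/T, heating something with a roughly constant heat capacity gives ΔS = m·c·ln(T2/T1).

```python
# Toy sketch with assumed numbers: the classical entropy change of a mass
# that is heated.  dS = dQ / T, and for a constant heat capacity the
# integral gives  delta_S = m * c * ln(T2 / T1).
from math import log

def classical_entropy_change(mass_kg, c_J_per_kgK, t1_kelvin, t2_kelvin):
    """Entropy gained (J/K) by heating the mass from t1 to t2."""
    return mass_kg * c_J_per_kgK * log(t2_kelvin / t1_kelvin)

# One kilogram of water warmed from 20 C to 80 C:
print(classical_entropy_change(1.0, 4186.0, 293.15, 353.15))  # about 780 J/K
```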

Microscopically, there is no difference between Q(reversible) and Q(actual), so the distinction between energy and entropy only takes on meaning in collections of matter large enough to have a well-defined average, which is to say a temperature.  When you remember that momentum and temperature are related to mass and energy by Boltzmann’s constant, you can see that the difference between the two formulations is essentially the same as the difference between the Catholic and Protestant definitions of the Christ.
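
A quick sketch of that link, using standard ideal-gas formulas as my own illustration: Boltzmann’s constant converts a temperature into the typical energy and momentum of a single particle.

```python
# Sketch of the link: Boltzmann's constant converts temperature into the
# energy and momentum of a single particle.  For an ideal gas, the mean
# kinetic energy is (3/2) k_B T and a typical momentum is sqrt(3 m k_B T).
from math import sqrt

K_B = 1.380649e-23   # Boltzmann's constant, J/K
M_N2 = 4.65e-26      # approximate mass of a nitrogen molecule, kg

def mean_kinetic_energy(temperature_k):
    return 1.5 * K_B * temperature_k

def typical_momentum(mass_kg, temperature_k):
    return sqrt(3.0 * mass_kg * K_B * temperature_k)

# Room-temperature air: tiny numbers per particle, and a temperature only
# exists at all because we average over something like 10**23 of them.
print(mean_kinetic_energy(300.0))      # roughly 6.2e-21 J
print(typical_momentum(M_N2, 300.0))   # roughly 2.4e-23 kg*m/s
```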

Before getting too carried away in what is right or what is practical, you have to ask what energy is.  It isn’t one of the fundamental forces, and there is no “energy” field in space.  Then there is the cosmological mystery surrounding something physicists call “dark energy”, which may make up 68% of all the “energy” in the universe.  Like space and time, “energy” seems so obvious that we feel we can define it in a circle with matter, without defining either as anything other than itself.  But what they lack in definition, they make up for in consistency, and this is where entropy comes in.  There is a consistent relationship between energy, mass and the space they take up, and this relationship is expressed in the average momentum and position of the particles (statistical mechanics) as well as the wavelength of the radiation they emit (temperature in classical thermodynamics).
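
One concrete face of that last relationship is Wien’s displacement law, added here as an illustration rather than anything claimed above: the temperature of a body fixes the peak wavelength of the radiation it emits.

```python
# Wien's displacement law: the peak wavelength of the radiation a body
# emits is fixed by its temperature, lambda_peak = b / T.
WIEN_B = 2.898e-3   # Wien's displacement constant, m*K

def peak_wavelength_m(temperature_k):
    return WIEN_B / temperature_k

print(peak_wavelength_m(5800.0) * 1e9)   # the Sun: about 500 nm, visible light
print(peak_wavelength_m(300.0) * 1e6)    # room temperature: about 10 um, infrared
```
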
The bottom line is that energy and entropy are relationships, not things.  Energy is the relationship between mass and motion, while entropy is the relationship between energy/mass and space/information.  Does this help?  Not really.  But when someone tries to tell you about “efficiency,” you can be pretty sure that they are either hiding something or don’t know what they are talking about.
Usually, when someone says “efficient”, they mean that a task will be accomplished in less time, with less energy or in less space than normal.  It is important to remember that these things are not independent of one another.  Usually, something that takes less time takes more energy or space; something that takes less energy takes more time or space; and something that takes less space takes more time or energy.  Every efficiency measure also depends on more information than the ordinary process it replaces.  In the case of a miniature operating system like that found in a smart phone, the design process is many times more information intensive than the design required for a macro-computer operating system, where the system can be spread out over acres of CPU space.  And the miniaturised system, while more “efficient” for processing small amounts of information, simply can’t handle the quantity of information that a macro-computer can.  Efficiency always has a cost.  What that cost is, and where it is hidden, will make the difference between good decisions and bad ones.
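
As a computing illustration of that trade-off (my own example, with made-up timings): caching trades space for time, so the cached version is faster only because it keeps a table of answers around.

```python
# Computing analogy for the trade-off: caching spends memory (space) to
# save time.  The second loop is fast only because the answers are stored.
from functools import lru_cache
import time

def slow_square(n):
    time.sleep(0.01)          # stand-in for expensive work
    return n * n

@lru_cache(maxsize=None)      # the hidden cost: a table kept in memory
def cached_square(n):
    time.sleep(0.01)
    return n * n

start = time.perf_counter()
for _ in range(100):
    slow_square(7)
print("no cache:", time.perf_counter() - start)   # about 1 second

start = time.perf_counter()
for _ in range(100):
    cached_square(7)
print("cached:  ", time.perf_counter() - start)   # about 0.01 seconds
```
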
Regardless of how you choose to define entropy, the entropy of a dynamic system will change along an exponential relaxation curve that mirrors the way a ball bounces to a standstill.  This curve is remarkably consistent, whether it is traced in momentum, temperature, space or time.
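
Here is what that curve looks like in a toy sketch with assumed parameters: successive bounce heights fall off geometrically, and a relaxing system’s entropy climbs toward equilibrium along the matching exponential.

```python
# Toy sketch with assumed parameters: bounce heights fall off geometrically,
# and entropy relaxes toward its equilibrium value along the matching
# exponential curve.
from math import exp

def bounce_heights(h0, restitution, n_bounces):
    """Each bounce keeps a fixed fraction of the previous height."""
    return [h0 * restitution ** n for n in range(n_bounces)]

def entropy_relaxation(s_max, s0, tau, t):
    """Entropy climbing toward its equilibrium value with time constant tau."""
    return s_max - (s_max - s0) * exp(-t / tau)

print(bounce_heights(1.0, 0.6, 6))
# [1.0, 0.6, 0.36, 0.216, 0.1296, 0.07776]
print([round(entropy_relaxation(10.0, 2.0, 1.0, t), 2) for t in range(6)])
# [2.0, 7.06, 8.92, 9.6, 9.85, 9.95]
```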