Everyday entropy: what is it again?

[Image: javelin-women.jpg]

Force and feedback

“You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.”

  • John von Neumann, suggesting to Claude Shannon a name for his new uncertainty function; as quoted in Scientific American Vol. 225, No. 3 (1971), p. 180.

Obviously, I can’t tell you what entropy is.  What I can say is that entropy only exists as a side effect of the definition of energy.  Energy is defined kinetically as a force applied over a distance (one joule of energy is what it takes to apply one newton of force over one meter, and one watt is one joule delivered per second).  Physicists generally define energy as the ability to do work, and as animals that need to go out and find food, it makes intuitive sense for us to think about energy in terms of locomotion.  But that’s so obviously arbitrary and biased that physicists immediately sweep it under the rug of measurable observations.  There is no general concept of energy that describes it in the abstract.  Not only that, but when energy does do work, some of it always goes slightly astray in its perambulations from one form to another.  No matter how perfect your system of transmission, it always takes more than one joule of energy to actually apply one newton of force over an entire meter in a particular direction.  This matters because we know that the energy doesn’t disappear along the way.  The energy is still there; it just can’t do any work.  So in order to maintain our definition of energy as the ability to do work, we need a concept to take care of the energy that gets away, to distinguish the free energy that we can use to find food from all that useless energy that is unavailable for work.  For this, we need entropy.  For a plant, entropy is immaterial.
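
Spelled out as a worked restatement of those definitions (nothing here goes beyond the units already named above), the relations are:

E = F \cdot d, \quad 1\,\mathrm{J} = 1\,\mathrm{N} \times 1\,\mathrm{m}; \qquad P = \frac{E}{t}, \quad 1\,\mathrm{W} = \frac{1\,\mathrm{J}}{1\,\mathrm{s}}

In a lossless transmission, exactly one joule would push one newton through one meter; the claim above is that a real transmission always spends a little more than that, and the remainder is the subject of the rest of this post.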

The problem with entropy is not that nobody understands it; the problem is that everybody thinks they understand energy, when they don’t.  I am not going to fix that here, but as we stare into a future with uncertain access to free energy, it is important to understand the ways in which energy can become entropy and entropy can become energy.  If free energy is transmissible force or heat, then entropy is the result of that force or heat when it is exhausted.  The energy doesn’t change or dissipate, but it is absorbed into the relationships between things.

In this formulation, energy is the communication of force or radiation from one thing to another, while entropy is the set of relationships established by that communication.  This translation between energy and entropy explains where energy comes from when it seems to be “generated” and where it goes when it seems to be “dissipated.”  It also holds for all scales, whether you are talking about particles, billiard balls or human beings in a society.  If you shoot a cue ball into a group of balls, you can see how its force is translated into a set of relationships after it strikes the balls, and that each ensuing collision communicates force from one ball to another, while altering all of the relationships (relative momentum and position) between the balls.  This last point will be significant.
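
To make the bookkeeping concrete, here is a toy sketch (my illustration, not anything from the text: equal masses, a head-on, perfectly elastic collision in one dimension).  The cue ball’s momentum does not vanish on impact; it is redistributed, and what changes is the relationship between the two balls, their relative momentum and position.

# Hypothetical, simplified model: two equal-mass balls, head-on elastic collision.
# Total momentum and kinetic energy are conserved; what changes is the
# relationship between the balls (their relative momentum and position).

def elastic_collision(m1, v1, m2, v2):
    """Return the post-collision velocities of a 1-D head-on elastic collision."""
    v1_after = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    v2_after = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1_after, v2_after

m = 0.17                     # kg, roughly one billiard ball
cue_v, object_v = 2.0, 0.0   # m/s: a moving cue ball strikes a resting ball

v1, v2 = elastic_collision(m, cue_v, m, object_v)

print("velocities after impact:", v1, v2)                   # cue stops, object ball rolls on
print("total momentum:", m * v1 + m * v2)                   # still 0.34 kg*m/s
print("total kinetic energy:", 0.5 * m * (v1**2 + v2**2))   # still 0.34 J

On a real table the collision is not perfectly elastic: some of that 0.34 J is exhausted as sound, heat and deformation, and that exhausted part, absorbed into the relationships between the balls, the cloth and the air, is what this post is calling entropy.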

This formulation also illustrates why Shannon’s equation relating information with entropy looks the same as Boltzmann’s equation relating momentum and location with entropy.  Energy, like information, only exists in transmission.  Both equations define how much stuff, either information or energy, can be communicated by the configuration or relationships of a given physical system.  This is more obvious for information, but remember that there is no way to know how much energy is “contained” in an object unless it transmits some of its energy or information to another object.  This is why the zeroth law of thermodynamics is fundamental.  Two systems that are both in equilibrium with a third system are in equilibrium with each other.  This is also true of resonance, where two systems that are both in tune with a third system are in tune with each other.  In this way, the zeroth law also allows for the concept of time within a static frame of reference, in which, if two periodic phenomena are both in unison with a third periodic phenomenon, they are in unison with each other.  Finally, the zeroth law allows for the transmission of information between resonators in the form of resonant energy.  This last point ties information together with thermodynamics: in order for equilibrium to be established between two systems, they must share information, either with one another or with a third system.  This is why it makes sense for Arieh Ben-Naim to describe Boltzmann’s entropy equation (and its Gibbs generalization) as a special case of Shannon’s measure of information.  The one quibble I would have is that neither one defines what entropy really is, so what the Boltzmann equation describes is the limit of what is knowable as uncertainty approaches infinity.
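
Here is a minimal sketch of that point (my illustration, not Ben-Naim’s derivation).  Shannon’s measure over a probability distribution p is H = −Σ p ln p, and when all W microstates are equally likely it collapses to ln W, which is Boltzmann’s S = k_B ln W with the physical constant divided out.

import math

def shannon_entropy(probs):
    """Shannon's measure of information, H = -sum(p * ln p), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

K_B = 1.380649e-23            # Boltzmann's constant, J/K

W = 1000                      # number of equally likely microstates
uniform = [1.0 / W] * W

H_uniform = shannon_entropy(uniform)   # equals ln(W) = 6.9078...
S_boltzmann = K_B * math.log(W)        # thermodynamic entropy of the same system

print(H_uniform, math.log(W))          # the same number: the two formulas share one form
print(S_boltzmann / K_B)               # divide out k_B and Shannon's measure reappears

# A more constrained (non-uniform) distribution carries less uncertainty:
biased = [0.5] + [0.5 / (W - 1)] * (W - 1)
print(shannon_entropy(biased) < H_uniform)   # True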

Perhaps more important, information can come in two forms: first, the abstract representation of the energy/mass equivalence of a body; and second, the physical mass and location of a body in relation to space and all of the bodies with which it can interact.  The first type of information is the abstract information about an object, which may be transmitted to another object, and allows the object to be known.  The second type is the physical information in the object, which is the sum of its mass and its kinetic degrees of freedom, and defines the object internally as itself, which cannot be known due to Heisenberg uncertainty.  This, again, is the difference between the epistemological and the ontological, between knowable fiction and unknowable truth.

Because the entropy of a system is much more ontological than epistemological, it is hard to talk about it.  Statistical mechanics relates energy and entropy conceptually, but in a way that isn’t terribly helpful to a real person.  Energy is defined as system A’s ability to accelerate system B in a particular direction.  This means that system A must constrain the freedom of system B to move in any other direction or to remain still.  In this way, energy is the name for system A applying a constraint to system B.  When system A accelerates system B, both systems lose the constraint of isolation from one another, and in this loss of isolation their entropy increases by an amount that is equal to the acceleration of system B.  Because both systems have also lost a degree of internal isolation and some additional isolation from their surroundings, the total entropy is greater than the energy of acceleration applied to system B.  This is why any application of energy is accompanied by an increase in entropy.  Entropy is the relationship between energy, matter and space, which can be described mathematically in terms of heat by the thermodynamic formula in which the change in entropy equals the heat transferred divided by the temperature, and in terms of mechanics by the Boltzmann equation, in which entropy equals Boltzmann’s constant multiplied by the logarithm of the number of possible microstates consistent with the observable macrostate.  Both equations trace the energy that is exhausted into the relationships between the parts of the system.
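
Written out, the two formulas that paragraph paraphrases are the Clausius form and the Boltzmann form (the latter is quoted again below):

\Delta S = \frac{Q_\mathrm{rev}}{T} \qquad \text{and} \qquad S = k_\mathrm{B} \ln W

where Q_rev is the heat exchanged reversibly, T is the absolute temperature, and W is the number of microstates consistent with the observed macrostate.  The Gibbs generalization, S = -k_\mathrm{B} \sum_i p_i \ln p_i, is the version that lines up term for term with Shannon’s measure.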

This is equally true of information.  A message is the transmitter’s application of a constraint on the receiver by a connection through a channel (which also increases the physical entropy of the system).  In information systems there is also the informational entropy of the language used to convey the message, so there is confusion between the physical entropy of the system and the abstract entropy of the code being used, but I will leave this confusion with you for now.
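
As a rough sketch of that second kind of entropy (my example, with an arbitrary sample sentence): estimate the per-letter entropy of a message from its own letter frequencies and compare it with the ceiling a 26-letter alphabet could carry.  The gap is redundancy in the code itself, which is quite separate from the physical entropy generated in the channel.

import math
from collections import Counter

def per_symbol_entropy(symbols):
    """Entropy in bits per symbol of the empirical distribution of a sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# An arbitrary sample message (any English text would do for the illustration).
message = "the energy does not disappear it is absorbed into the relationships between things"
letters = [ch for ch in message if ch.isalpha()]

H_message = per_symbol_entropy(letters)
H_max = math.log2(26)          # if all 26 letters were equally likely

print(f"entropy of this message's code: {H_message:.2f} bits per letter")
print(f"ceiling for a 26-letter alphabet: {H_max:.2f} bits per letter")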

Clausius used the word “entropy” to describe the limit of the efficiency of a heat engine for a reason.  There was a transformation hidden in the stuff of the engine that couldn’t be converted from momentum back to heat or from heat back to momentum.  This was the inner transformation of matter that was thermodynamically “irreversible”.  Statistical thermodynamics defined the kinetic and thermal energy of a system such that this transformation could be defined and predicted probabilistically, but in a way that the reversible and irreversible parts of the transformation are indistinguishable and the transformation itself is a meaningless restatement of the equality of mass and energy:

S = k_\mathrm{B} \ln W

Clausius wanted to understand why some processes only go in one direction.  He didn’t define entropy in a way that was determinative, but his definition was evocative, and it still applies to molar entropy in chemistry in the form of joules per mole kelvin, J mol⁻¹ K⁻¹, or energy per amount of substance per unit of temperature.
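
That unit falls straight out of the Clausius definition once the entropy is divided by the amount of substance:

dS = \frac{\delta Q_\mathrm{rev}}{T} \;\Rightarrow\; [S] = \mathrm{J\,K^{-1}}, \qquad S_m = \frac{S}{n} \;\Rightarrow\; [S_m] = \mathrm{J\,mol^{-1}\,K^{-1}}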

The statistical mechanics model of entropy is always correct, but it takes the transformation out of entropy, which is like taking the caffeine out of coffee, the sex out of romance, or the alcohol out of beer.  It gives you a notion about the thing, but it misses the point.  To say that the entropy of a system is Boltzmann’s constant multiplied by the logarithm of its total number of possible micro-states is a bit like the old joke about the balloonist who is lost in a fog: when he sees a person on the ground, the balloonist shouts down “where am I?”, to which the person replies “you’re in a balloon.”  Perfectly accurate, and no help at all.

No one has ever created a satisfying definition of entropy, which is why everyone uses it differently.
