Everyday entropy – linear entropy

When you think about it formally, the statistical mechanical definition of entropy comes off as quaint in a world dominated by electrical equipment and electronic communications.  What relevance has the Boltzmann equation for an automobile driven by electric motors attached to lithium chemical batteries?  Boltzmann entropy deliberately excludes the potential energy of the system in favour of an objective kinetic model, which makes sense if you think of entropy as the energy not available for work, but it raises the question: what is work?  This is the arbitrary factor that creeps into statistical mechanics and makes it impossible to explain entropy to someone who hasn’t been steeped in mechanics for years.  We have to remember that work is an entirely macroscopic concept.  The concepts of work and play, energy and entropy, active and passive, are all indistinguishable at microscopic scales, and only diverge gradually as particles are accumulated into macroscopic averages.  Q(reversible) and Q(actual) are identical for any given particle, but diverge as Q becomes the average value for more and more particles.  This is why a big wave crashing looks different from a small wave crashing, and a big rock rolling down a hill looks different from a small rock rolling down a hill, even though they are made of the same material and subject to the same accelerations.

Linear entropy is used to calculate only the mixedness of quantum states, and it can be expressed on a scale from zero to unity, which seems more useful in general as a way of stating entropy.  Linear entropy from zero to one is a far more helpful expression than any other formula, because it can be read as a universal quality rather than an esoteric value.  The value of entropy in joules per gram-kelvin may be helpful for chemists working with reactions of pure substances, but it is profoundly unhelpful in mechanics and even more unhelpful in structural analysis.  kB ln W is, of course, no more helpful or comparable than J/(g·K), so the reduction to linear entropy for general conversations about classical physical properties would at least allow people to speak about entropy in general terms, rather than formulating a specific entropy for each conversation.
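To make that concrete: for a quantum state with density matrix ρ, the linear entropy is S_L = 1 − Tr(ρ²), which is zero for a pure state; the normalized form d/(d − 1) · (1 − Tr(ρ²)) runs from zero to one in any dimension d.  A minimal sketch in Python (numpy assumed):

```python
import numpy as np

def linear_entropy(rho, normalize=True):
    """Linear entropy 1 - Tr(rho^2) of a density matrix rho.

    Raw value: 0 for a pure state, 1 - 1/d for the maximally mixed
    state in d dimensions.  The normalized form rescales to [0, 1]
    so states of different dimension are directly comparable.
    """
    purity = np.real(np.trace(rho @ rho))
    s = 1.0 - purity
    if normalize:
        d = rho.shape[0]
        s *= d / (d - 1)
    return s

pure = np.array([[1, 0], [0, 0]], dtype=complex)   # a pure qubit
mixed = np.eye(2, dtype=complex) / 2               # maximally mixed qubit

print(linear_entropy(pure))    # 0.0
print(linear_entropy(mixed))   # 1.0
```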

Statistical mechanics relates the macroscopic measures of heat and force to the microscopic kinetic energies of the particles that make up the whole.  This misses all of the potential energies that exist in relationships that are neither crystalline nor uniform, and it leaves you with a number that has no comparative value.  The value of entropy is only meaningful for one system, and only once.  You can’t compare entropies for dissimilar systems, and this makes it impossible for entropy to be a word with communication value.  If a word never means the same thing more than once, it is indefinite and meaningless.  So while the word “entropy” may, at the moment, point you towards the equations of thermodynamics and statistical mechanics, it can’t mean anything by itself, which is why it means something different to everyone, and something different again every time it is used.

If we say that entropy is the kinetic energy (derived from statistical mechanics) divided by the total energy (the sum of the kinetic and potential energies of the system), then we have a number that is essentially the percentage of the total energy that has already been converted from potential to kinetic energy.  Not only that, but it makes sense that it will not spontaneously go down, because the conversion of kinetic energy back into potential energy is far less likely than the reverse.  For example, if you are sitting on a truckload of dynamite, the entropy of your system, in terms of statistical mechanics, is the same as if you were sitting on a truckload of sand.  But you know, intuitively and chemically, that the entropy of your dynamite-truck system is much lower than the entropy of your sand-truck system.  What’s helpful about entropy as kinetic energy over the Hamiltonian is that it makes it clear that the differential between kinetic and potential energy is the essential function of entropy, and that the situations are now comparable.  The sand-truck entropy is near one, while the dynamite-truck entropy is near zero.
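The numbers below are invented, order-of-magnitude assumptions purely to illustrate the ratio this paragraph proposes (kinetic energy over kinetic plus potential); they are not measured values.  A toy sketch in Python:

```python
def entropy_ratio(kinetic_j, potential_j):
    """Fraction of the total energy already present as kinetic energy."""
    return kinetic_j / (kinetic_j + potential_j)

# Assumption: a tonne of material at room temperature holds on the
# order of 1e8 J of thermal (kinetic) energy in either case.
thermal = 1e8

# Assumption: dynamite stores roughly 4 MJ/kg of releasable chemical
# potential energy; sand stores almost none.
dynamite_potential = 4e6 * 1000    # ~4e9 J per tonne
sand_potential = 1e5               # nominal small value

print(entropy_ratio(thermal, dynamite_potential))   # ~0.02, near zero
print(entropy_ratio(thermal, sand_potential))       # ~0.999, near one
```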

Energy isn’t a thing that you get from coal and distribute through an electricity grid.  That is like the “caloric” theory of heat that existed before modern thermodynamics.  Energy should be defined using Boltzmann’s entropy equation, or the Hamiltonian if you want to include charge and quantum properties.  The energy in the coal doesn’t go anywhere.  When the coal burns, the entropy of the system increases.  That increase is translated into temperature in the boiler, which is translated into space in the steam, which is translated into pressure in the pressure vessel, which is translated into force in the turbine, which is translated into momentum in the drive shaft, which is translated into charge in the dynamo, which is shared with the circuits in your toaster, which is translated back into temperature in the element, which increases the entropy of your slice of bread, which gives you toast.  All of the energy in the coal is fully accounted for in its byproducts if you do the stoichiometry, so it didn’t go anywhere.  There was, however, a very clever set of contraptions that established a relationship between your toast and the burning coal, so that you got to share in a bit of the increase in entropy.  It’s much easier to think of this in terms of energy as a sort of abstract force that can be sent from here to there in different forms, but that convenience means that we are left with this really confusing concept of entropy for all of the physical relationships that make the energy happen but don’t fit our conception of what energy is.

Finally, considering the modern world’s reliance on electricity, which is entirely excluded from both the statistical mechanical and thermodynamic definitions of entropy, it seems pretty important that we use a definition that can accommodate electromagnetic potential in both batteries and mains power.  Traditionally, physicists have simply divided mechanics and electromagnetism into two separate fields, and left mechanics to be taught to children.  By analogy with classical mechanics, the Hamiltonian is commonly expressed as the sum of operators corresponding to the kinetic and potential energies of a system.  If you look at the Hamiltonian, the kinetic energy plus the potential energy, you can see an obvious opening for a more practical understanding of entropy in the difference between the kinetic and potential energies of a system.  Rather than combining the kinetic and potential energies into one probability function, the potential energy function is broken out as a function of the spatial configuration of the system and time (a particular set of spatial positions at some instant of time defines a configuration), separate from the actual kinetic energy of the particles.  This avoids the confusion between the total energy and the entropy, and in particular eliminates the arbitrary terms “microstate” and “macrostate” and the extremely arbitrary notion of “work” in the form of heat or force.
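As a sketch of that split, here is a discretized one-dimensional harmonic oscillator (Python with numpy, units ħ = m = ω = 1): the Hamiltonian is literally a kinetic matrix plus a potential matrix built from the spatial configuration, and the kinetic fraction of the total energy falls straight out.

```python
import numpy as np

n, span = 400, 20.0
x = np.linspace(-span / 2, span / 2, n)
dx = x[1] - x[0]

# Kinetic energy: -(1/2) d^2/dx^2 via a second-order finite difference.
T = (-0.5 / dx**2) * (np.diag(np.ones(n - 1), -1)
                      - 2.0 * np.eye(n)
                      + np.diag(np.ones(n - 1), 1))

# Potential energy: (1/2) x^2, a pure function of spatial configuration.
V = np.diag(0.5 * x**2)

H = T + V                          # the Hamiltonian: kinetic plus potential
energies, states = np.linalg.eigh(H)
psi = states[:, 0]                 # ground state

t = psi @ T @ psi                  # <T>, kinetic expectation
v = psi @ V @ psi                  # <V>, potential expectation
print(t / (t + v))                 # ~0.5: the virial theorem gives <T> = <V>
```

Different systems give different kinetic fractions, which is exactly the comparability the paragraph above is after.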
