Everyday entropy: ataxophobia and anxiety

[Image source: https://factualfacts.com/ataxophobia/]

Ataxophobia is a lot like arachnophobia. Most people have a little ataxophobia, the fear of disorder, and that is probably healthy. Science is dominated by people who have a lot. This, more than anything else, is the barrier to understanding entropy.

Arieh Ben-Naim’s passion comes from the utter failure of traditional texts to actually teach anybody about entropy. If entropy pedagogy were in any way adequate, you would expect a few people, at least ones with advanced chemistry degrees, to understand entropy. But they don’t. Von Neumann’s (possibly apocryphal) assessment is still accurate: “nobody really knows what entropy is.” Ben-Naim has spent his life trying to figure out why science students fail to grasp entropy, and how to fix that. He has assailed chemistry and physics textbooks for misinforming students about the true nature of entropy. But, like Boltzmann, Ben-Naim is able to see thermodynamics in terms of probability and uncertainty, so Shannon’s measure of information seems like a simpler and easier way to understand entropy than the traditional “spreading out” or “mixing up.” The thing is that if ataxophobia is like arachnophobia, the fear of uncertainty is like all other fears combined. Telling people that entropy is actually uncertainty is like telling them that what they thought was a single spider is actually a big box of snakes, and they’re on a high ledge, in the dark, and they have to speak publicly in the nude (uncertainty fuels anxiety). Fear of uncertainty is so common that it has been upgraded from a phobia to an emotion: anxiety.

Ben-Naim is not confused about entropy, but he doesn’t appreciate that his vision is unique and his preference for uncertainty over disorder is peculiar. Most people literally can’t see entropy this way, and probably can’t see entropy at all. They generally fall into two camps: those who, like Lambert, view entropy as old and settled, and so not very interesting; and those who see entropy as too terrifying for serious consideration. Both are wrong, but their wrongness seems to stem from an emotional need for stability rather than a failure of pedagogy. Lambert (quoted below) cannot be credited with helping to correct this situation. At least Ben-Naim is trying. But the problem is not in the pedagogy. It is ataxophobia. Entropy is precisely what people go into science to avoid. Ben-Naim hopes that an informational definition will help calm the ataxophobia that science students get from physical entropy, but the replacement is anxiety. The problem is far deeper than misunderstanding. When confronted with entropy, people feel terror and naturally settle into denial.

To go back to Ben-Naim’s argument, defining entropy as disorder begs the question: order, in the thermodynamic sense, is no better understood than entropy. It is not the static and arbitrary order of a tidy room or a neat arrangement. Repetitive patterns may describe certain instances of low informational entropy, but that misses the point of thermodynamic entropy, which is concerned with momentum. Order, in thermodynamics, means greater freedom in less space. This may correspond with the appearance of order, but it is the freedom that matters, not the pattern or tidiness.

The difference between diamond and graphite illustrates how entropy relates to freedom and space. In a diamond, the carbon nuclei vibrate in three dimensions while each of the four valence electrons can overlap with an electron from one of the four bonding partners in the tetrahedral lattice. Compare this to graphite, where each carbon atom is bonded to three others in a sheet of hexagonal rings. The fourth valence electron is confined to the plane of the sheet, and the nucleus is pinned in two dimensions. Furthermore, there is more distance between the sheets than between the corners of the diamond’s tetrahedra. What gives a diamond its extraordinary thermal conductivity is the freedom of its parts to move within the tiny space of its lattice. It is not that the tetrahedron is more orderly than the honeycomb sheets in any abstract or Platonic sense, but that it provides carbon atoms with tremendous freedom in minimal space.
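
For a rough numerical anchor, the standard molar entropies point the same way. The figures in the sketch below are approximate textbook values at 298 K, quoted from memory rather than from anything above, but only the direction of the comparison matters: the tightly packed diamond lattice is the lower-entropy, more ordered form.

```python
# Approximate standard molar entropies of carbon at 298 K, in J/(mol*K).
# Rough textbook values, used here purely for illustration.
S_DIAMOND = 2.4    # tetrahedral lattice: vibrational freedom in very little space
S_GRAPHITE = 5.7   # loosely stacked hexagonal sheets

delta_S = S_DIAMOND - S_GRAPHITE   # hypothetical graphite -> diamond change at 298 K
print(f"S(diamond)  ~ {S_DIAMOND} J/(mol K)")
print(f"S(graphite) ~ {S_GRAPHITE} J/(mol K)")
print(f"Delta S (graphite -> diamond) ~ {delta_S:.1f} J/(mol K)")
# Negative: the form with greater freedom in less space is the low-entropy,
# ordered one, in the sense of order used in this essay.
```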

Order, in the entropic sense, is only a function of freedom to move in space, not the static arrangement of parts. This is why dust settled on the floor has higher entropy than dust floating through the air, even though it looks more orderly. The dust on the floor has lost its freedom to move in three dimensions.
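
That claim sounds backwards until you follow the energy. Here is a back-of-the-envelope sketch with assumed, illustrative numbers (the grain size, room volume, drop height, and temperature are my own choices, not figures from the text): the grain does give up a sliver of positional freedom when it lands, but the potential energy it dissipates into the air on the way down spreads out as heat, and that term is larger by many orders of magnitude, so the settled configuration ends up the higher-entropy state overall.

```python
from math import pi, log

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K (assumed)
g = 9.81             # m/s^2

# Assumed, illustrative numbers for a single dust grain:
radius = 5e-6        # a 10-micron grain, m
density = 2000.0     # kg/m^3
height = 2.0         # distance it falls before settling, m
mass = density * (4.0 / 3.0) * pi * radius**3

# Heat dumped into the room air as the grain's potential energy dissipates:
dS_air = mass * g * height / T

# Positional freedom the grain gives up: from roaming a 30 m^3 room
# to sitting in a roughly 1 mm layer over a 12 m^2 floor.
V_room, V_floor_layer = 30.0, 12.0 * 1e-3
dS_position = -k_B * log(V_room / V_floor_layer)

print(f"entropy gained by the air          : {dS_air:.1e} J/K")       # ~ 7e-14 J/K
print(f"positional entropy the grain loses : {dS_position:.1e} J/K")  # ~ -1e-22 J/K
# The dissipation term dwarfs the positional loss by roughly eight orders of
# magnitude: settling raises total entropy even though the room looks tidier.
```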

Contrast the freedom of carbon atoms with the constipation of a city in gridlock. It is not the “order” that matters but the momentum it allows. A hurricane can transfer more energy in a tighter package than a nor’easter because its spiral is the perfect shape for packing and flow.

If you add the dispersal of information to the classical dispersal of energy, mass, or freedom, then Ben-Naim’s informational interpretation of entropy folds fluidly into the more intuitive explanations of entropy. “Disorder” stands out as misunderstood, but a thorough exploration of order in terms of freedom in space or informational expectation clarifies that point as well.
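
A minimal sketch of that folding, using Shannon’s measure on a made-up example (the cell counts are arbitrary, chosen only for illustration): when the same particle can be found in more equally likely places, the uncertainty about where it is grows in lockstep with how far it has spread.

```python
import math

def shannon_entropy(probs):
    """Shannon measure of information, H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A particle confined to one quarter of a coarse-grained box (4 cells)
# versus free to roam the whole box (16 cells), each cell equally likely.
confined = [1 / 4] * 4
dispersed = [1 / 16] * 16

print(shannon_entropy(confined))    # 2.0 bits of uncertainty about its location
print(shannon_entropy(dispersed))   # 4.0 bits
# Spreading the same probability over more accessible cells raises the Shannon
# measure: dispersal in space and growth in uncertainty are the same change.
```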

In every science book there is a land of the unknown, and an impression that some new discovery might just push open the door to that land.  But in every field, the new discoveries, year after year, just circle around the promised land, occasionally a little bit closer, but never substantially.  This is where the actual entropy starts.  It isn’t science, although there are great scientific models for it.  Boltzmann’s entropy equations are wonderful, but they can’t tell you anything about the actual entropy of your environment.  They are like the score to Beethoven’s 9th symphony.  If you know how to read them, they can give you an impression of what the music might sound like, but they are nothing like the music.

Entropy might seem like a scientific concept, but it’s actually the asymptotic limit of what can be known scientifically.  The entropy of an isolated system increases or stays the same. How unhelpful and unscientific is that?  What’s really crazy is that sometimes entropy makes sense in terms of uncertainty, sometimes in terms of distribution of momentum, and sometimes in terms of distribution in space, for no good reason.

From Frank L. Lambert’s review of Entropy Demystified: The Second Law Reduced to Plain Common Sense (Paperback):

“Fifty years ago, Arieh Ben-Naim, as every student in a physics or chemistry class of that era, was mystified by his introduction to entropy and the second law of thermodynamics. Although he was a professor of chemistry before retiring 15 years ago, Ben-Naim has evidently not kept up with the teaching of those topics in current chemistry texts. Thus, he seems unaware that most general chemistry texts currently published in the US (16) and three in physical chemistry – most available from Amazon.com – now clearly and simply present entropy and the second law.

The connection between spontaneous chemical reactions or physical processes, dispersal of energy, and entropy is integral, tight, and generally accepted. It does not require 200 pages to justify.  Therefore, his 217 pages of “Entropy Demystified” that are necessary to develop his personal viewpoint (an information theory variant, not present in a US undergraduate chemistry textbook) can be clarified by 3-4 pages in any of the chemistry texts listed with their ISBN numbers (for exact Amazon.com identification) in […] at “May 2009”. In fact, a conceptual summary of the second law and entropy for most interested readers can be abstracted from these texts in two sentences: “Energy of all types changes from being localized to become more dispersed, spread out, distributed in space (and abstractly, in more energy quantum states, microstates) if that energy is not constrained.” Then, “entropy change is the quantitative measure of how much more widely distributed the initial energy becomes in a spontaneous process.” Thus, in real processes, energy spreads out spatially. Probabilistic methods are one way of quantifying thermodynamic entropy.

Unfortunately, Professor Ben-Naim’s fundamental error, summarized on page 204 but weakening all previous pages, is his misinterpretation of what occurs in real systems of molecules, especially in the simple isothermal expansion of ideal gases or in their “mixing”/expansion. These cases have misled him to focus on their particular lack of change in the total energy of the system, rather than on what is the fundamental cause of all thermodynamic entropy change: the increased spreading of the initial energy of actual molecules in space when constraints are removed – e.g., their spontaneously moving into a greater volume from a smaller volume (with unchanged energy) in a process such as expansion/mixing. This is what traditional thermodynamic entropy readily measures and, as just stated, can be readily understood.

The disconnect between information and the second law is stressed on page 203 by “a measure of information cannot be used to explain the Second Law of Thermodynamics.” This is true, indeed. The connection between the second law and information is tenuous. Contrast this with the modern view in beginning collegiate chemistry texts, e.g. “whenever a product-favored chemical or physical process occurs, energy becomes more dispersed…This is summarized in the second law of thermodynamics, which states that the total entropy of the universe … is continually increasing.” (Moore, Stanitski, and Jurs; 3rd edition.) A physical chemistry text that is used world-wide states “…the Second Law of thermodynamics, may also be expressed in terms of another state function, the entropy, S. …entropy…is a measure of the energy dispersed in a process…” (Atkins and de Paula, 8th edition.)”
