Everyday entropy: distant relationships


Through the fog

Energy is the resonance that passes from one material to another.  It may be heat, radiation, or force, but there is a direct translation from the resonance of one thing into a change in the resonance of another.  Entropy is all of the other relationships between the parts of a system.  The way we calculate entropy follows from the tendency of these relationships to average each other out over many interactions, until the internal resonance of a system matches its surroundings.  In an open system, however, all of these relationships matter to the overall evolution of the system: a random alignment of particles in one region can induce wave action and rotational momentum in the whole body, and with the addition of randomly distributed energy from outside, the randomly induced angular momentum can grow instead of dissipating.  This is how a hurricane forms from randomly generated undulations in the layers of the upper atmosphere.

The butterfly’s wing-beat is resonant, and because of that it was attractive to chaos theorists as the smallest unit of energy that might translate into the formation of a hurricane three thousand miles away.  But the butterfly’s wing-beat is only marginally more relevant than the position and momentum of each molecule of air in the vortices that would exist in the butterfly’s air column whether it beat its wings or not, and the actual momentum and position of those air molecules are fundamentally uncertain as well as too numerous to count.  Every relationship between every particle in the atmosphere bears on the formation of a hurricane, including relationships that have no energetic connection through resonance, heat or force.  The entropy of a system is not limited to its thermodynamic or statistical-mechanical properties; those are simply the aspects directly related to measurements we can actually make.

Entropy, in terms of statistical mechanics, marks the absolute limit of scientific information, but there is a set of relative phenomena beyond that limit.  In particular, there is the change in entropy that is not reversed when a removed constraint is put back.  Likewise, the damping of resonance is not itself resonant.  In most engineering, these things are established through observation and then applied as empirical corrections to specific calculations, but they have no scientific basis beyond the second law of thermodynamics.  Entropy is a function of the distinguishable relationships between a body and its parts.  In classical and informational terms, it grows with the logarithm of the number of distinguishable configurations a system can take.

Flood control illustrates how the sum of microscopic details fails to account for the macroscopic relationships of the whole.  If every town along a river measures the average range of flooding and builds flood defences that will protect it from the average flood, the flood controls will almost certainly fail.  The upriver flood walls keep more water in the channel, moving faster, so that water levels downriver will now be higher than they were when measured.  The flood controls will work for the upriver towns, but at some point down the river, the build-up of water held in the channel by the upriver walls will overwhelm walls that were not built to withstand the augmented flow.  What’s interesting is that the towns below that point of failure may be spared, because the river slows at the point of failure and goes back to being average for the lower stretch.  Over time, this point of failure will move up and down the river, depending on the distribution of the rain that causes it, which makes the flooding seem random rather than the result of the upriver alterations.  Entropy in the sense of relationships beyond the immediate or locally obvious has a tendency to look like occult, religious, or paranormal activity, where things happen for reasons that are hidden from view.
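The cascade described above can be sketched as a toy model (all numbers are invented for illustration): every town builds its wall to the historical average crest, and each wall holds part of the over-bank water in the channel and passes it downstream instead of letting it spread over the floodplain.

```python
WALL = 5.0        # every town builds to the historical average crest (m)
BANKFULL = 4.0    # height the channel carries without spilling (m) -- toy value
RETAINED = 0.5    # fraction of over-bank water a wall keeps moving downstream

def simulate(crests):
    """Toy flood routing: water that walls confine in-channel is passed on."""
    carried = 0.0              # excess pushed down by upstream walls
    failed = []
    for town, crest in enumerate(crests):
        level = crest + carried
        if level > WALL:
            failed.append(town)      # wall overtopped
            carried = 0.0            # river spills, spreads out, and slows
        else:
            # The wall holds the over-bank water in the channel, so it
            # arrives at the next town on top of the local crest.
            carried += RETAINED * max(0.0, level - BANKFULL)
    return failed

# A storm that is exactly "average" everywhere still overtops walls,
# because each wall was sized without the upstream walls in place.
print(simulate([5.0] * 8))   # -> [1, 3, 5, 7]
```

Every town was "protected" against this storm in isolation, yet half of them flood, and the spared towns sit just below the failure points, exactly as the paragraph describes.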

One curiosity is that there is energy in the space between things.  Release an ounce of gas from a 100 ml can into a litre, and the escaping gas is noticeably colder.  The number of molecules is the same, and the total energy is the same, so where did the temperature go?  The escaping gas cools because it does work: it pushes back the surrounding air, and in a real gas the molecules also do work against their mutual attraction as they spread apart.  Strictly, an ideal gas expanding freely into a vacuum would not cool at all, yet its entropy still rises, because the same energy is now spread through more space and more configurations.  This may be where entropy and quantum fluctuations begin to interact.  Perhaps entropy defines a volume of space that is equal to a quantum of energy in matter or resonance.
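The entropy gained by that spreading-out can be put into numbers.  For an ideal gas, the entropy change of expansion at constant temperature (including free expansion) is ΔS = nR ln(V₂/V₁); treating the ounce of gas as roughly one mole of air is an assumption made purely for illustration.

```python
import math

R = 8.314   # gas constant, J/(mol.K)
n = 1.0     # an ounce (~28 g) of air is roughly one mole -- an assumption
V1, V2 = 0.100, 1.000   # litres; only the ratio matters

# Ideal-gas entropy change for expansion from V1 to V2 at the same energy:
# dS = n * R * ln(V2 / V1)
dS = n * R * math.log(V2 / V1)
print(f"Entropy gained by spreading into 10x the volume: {dS:.1f} J/K")
```

Nothing here depends on the gas cooling: the entropy rises simply because the same molecules and the same energy now have ten times the volume of distinguishable arrangements available.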

Classically, energy is defined as heat or force, but a great deal of energy is bound up in the structure of matter and in its location relative to the matter around it.  That energy, whether in the structure of crystals or in the spacing between neighbouring particles, isn’t directly translatable into heat or force, but it can transform into heat or force (or absorb them) through reactions and interactions, so it is energy by a different name.  You might say that entropy is the energy of the relationships between things, excluding what can be transferred as heat or force.  The maximum entropy of a system is reached when heat and force are distributed evenly throughout it, so that no more heat or force can be transferred from one part to another.  This drive towards even distribution is behind both climate change and refugee flows.

Nineteenth-century science has no insights for climate change, deforestation or refugees.  These developments are relational and entropic.  They cannot be broken down into manageable bits and solved piecemeal.  You can’t measure a relationship; you just have to see how it evolves.  How many times have you met a random person and found that you already had many relationships with them, even though you couldn’t have known about them before the meeting?  This is not to say that science is not valuable, only that there is science beyond measurement.  Boltzmann didn’t measure entropy; he imagined it.  He understood that the relationships between the momentum and position of each particle made up the energy of the whole, and that those relationships were as important as the measurable temperature.
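Boltzmann's imagined quantity can be sketched directly: entropy is his constant times the logarithm of the number of distinguishable arrangements, S = k_B ln W.  A minimal example (the box and the split sizes are invented for illustration) counts the ways N particles can divide between the two halves of a box:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann_entropy(n_particles, n_left):
    """S = k_B * ln(W), where W counts the ways to place n_left of
    n_particles in the left half of a box (the rest go right)."""
    W = math.comb(n_particles, n_left)
    return k_B * math.log(W)

# The even split has vastly more microstates, hence more entropy,
# than a lopsided one -- which is why systems drift towards it:
print(boltzmann_entropy(100, 50) > boltzmann_entropy(100, 90))   # True
```

No individual particle is measured here; the entropy comes entirely from counting the distinguishable relationships between the parts and the whole, which is the point of the paragraph above.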

When you look at the financial crisis of 2007, the devolution of war in Iraq, the refugee crises at the US/Mexican border and the EU/Africa/Middle East border, it seems inconceivable that intelligent people would let these things happen.  But remember that the people in charge don’t see entropy.  They have built their success on breaking problems down into manageable chunks, which works for resonance problems where the offending harmonic can be isolated and damped, but it leaves them incompetent when faced with any problem that doesn’t respond as a resonator.  It was the failure to appreciate the relationships between the people and their space that led to the US military failures in Vietnam and Iraq.  If we do not mention entropy when we talk about energy; if we do not mention free space when we talk about development; if we do not mention relationships when we talk about physical crises; then all is lost here.

When it comes to relationships, information is the enemy.  The Afghan war alliance flowchart was both incoherent and obsolete by the time it was drafted.

[Image: the Afghan war alliance flowchart (afghan ppt.jpg)]


