Everyday entropy: beyond efficiency

Beyond Efficiency: Entropy, Information, and Energy Conservation, or why the relationship between information and loss is not trivial.

[Images: low entropy (left); high entropy (right)]

The average single-family home now uses about 10% more energy (site level) than a home built in 1980.  The average SUV can go about as far on a gallon of gasoline as a Model T Ford.  This isn’t because of a regulatory failure or an industrial conspiracy.  If anything, it’s a failure of imagination.  Every piece and part of a modern home is more efficient than its 1980 counterpart, and every part and component of a Ford Explorer is more efficient than its Model T equivalent.  So what happened?  Homes and cars spread out, got bigger and incorporated more stuff.  This seems so obvious that it’s hardly worth saying, let alone studying, but if it’s so obvious, why did it happen?  It’s easy to say that hindsight is 20/20, but it’s more accurate to say that entropy is obvious in hindsight and very hard to see coming.  Not only is it obvious that big houses full of stuff will consume more energy; it’s obvious that people with money will move into bigger houses and buy more stuff.  The only question is: what happened to all the energy-efficient technology that people bought so that they could save money with a clear conscience?  Technology can hide and redistribute entropy in more or less clever ways, but reversing entropy is no more possible than reversing gravity or time.

Entropy is a great scapegoat.  It’s everywhere; it can’t be stopped; and it explains everything that’s wrong with the world, from the mess in my closet to climate change.  It often seems like life is a never-ending battle against entropy, but if you imagine a universe without entropy, it’s clear that entropy isn’t the problem: without entropy, sunlight wouldn’t diffuse through leaf cuticles to the chloroplasts, so there would be no photosynthesis, no gaseous oxygen, and no bananas; every day the earth’s surface would heat up, but the energy wouldn’t diffuse away from the surface, so the surface would melt and then evaporate, leaving a cloud of hot vapour in cold space.  But take a step back: without entropy, the energy in the sun’s core wouldn’t diffuse outward to the photosphere, so we wouldn’t even get any sunlight, and the heat building up in the core would drive runaway nuclear fusion until the sun blew itself apart in a supernova.  So, whatever issues we may have with entropy, a universe without it would be like a universe without gravity: not good at all.

But we don’t talk about entropy.  Despite playing a central role in energy efficiency, entropy doesn’t show up even in high-concept papers about energy policy.  I suspect there are two reasons why we repress entropy: first, it seems counterintuitive and technical, and we’re afraid of getting it wrong; and second, entropy makes us think of death and Sisyphean tasks like tidying our closets.  But entropy isn’t evil; it’s just indescribable.  You can describe it as the probability of the parts of a system occupying a given configuration, but that’s a lot like the old joke about the lost balloonist asking a person on the ground “where am I,” to which the person responds “you’re in a balloon.”  You could say that entropy is the diffusion of energy, but that’s not always correct.  Entropy literally means “inner transformation,” which may be the best possible description.  Any interaction between matter and energy will result in a transformation, and the essence of entropy is that some part of that transformation is not reversible.  This irreversible transformation is usually hidden, but it’s obvious when you bend a paperclip and can’t bend it back again.  We all have an intuitive understanding of entropy, in the same way that we have an intuitive understanding of gravity and radiation, even if we can’t understand the math.

One way to make yourself aware of your intuitive sense of entropy is to locate energy around you.  If the energy is concentrated in a particular place, entropy is low; if you can’t locate the energy, entropy is high.  In the images on the left, the energy is concentrated into a moment of force, while on the right, the energy is evenly distributed.  Bright lights, loud sounds, heat on your skin, and caustic smells tell you that entropy is low, for the moment.  We love watching explosions and listening to powerful engines in part because they are loci of energy, and we perceive profound beauty in moments when entropy is low.
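
To put a number on that intuition, here is a minimal sketch: treat “where is the energy?” as a distribution over a handful of hypothetical cells (the values are made up for illustration) and compute its Shannon entropy.  A concentrated distribution scores low; an evenly spread one scores high.

```python
# Minimal sketch: "can you locate the energy?" as Shannon entropy.
# The cell values below are invented for illustration only.
import math

def entropy_bits(energies):
    """Shannon entropy (in bits) of the fraction of energy in each cell."""
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log2(p) for p in probs)

concentrated = [100, 0, 0, 0, 0, 0, 0, 0, 0, 0]   # all the energy in one place
spread_out   = [10] * 10                          # energy evenly diffused

print(entropy_bits(concentrated))  # 0.0 bits  -- you know exactly where it is
print(entropy_bits(spread_out))    # ~3.32 bits -- log2(10), maximally uncertain
```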

But moments of low entropy are poignant and fleeting.  Entropy never rests.  Let’s go back to the diffusion of sunlight through the leaf cuticle to the chloroplast where it pumps protons across the thylakoid membrane.  Every reaction in the process of photosynthesis is spontaneous and entropic: the plant uses less than 2% of the energy it receives in sunlight, and the energy losses just pile up from there.  The trick lies in the information that organises the molecules and infrastructure in the right place and the right order to take advantage of the rising entropy that starts with the arrival of sunlight.  Paradoxically, life doesn’t struggle against entropy; life surfs on the rising waves of entropy that roll across the earth every day.  It looks like magic from the outside, but a close investigation reveals the entropy inside.

Magicians also use information and organisation to create the illusion of inverted entropy.  The classic rabbit in the hat trick is one example of hiding entropy in information and organisation.  There is nothing magical about pulling a rabbit from a hat.  If someone walked up to you and pulled a rabbit from a hat, you’d be slightly bewildered as to why they had a rabbit in their hat, but you wouldn’t call it magic.  The key to the trick is convincing you that the hat is empty first, by showing it to you, maybe even letting you physically inspect it, so that you imagine an empty space inside the hat.  Now that your information about the hat is established, the trick is to add a rabbit without changing your information.  This is done by sleight of hand, and it’s a beautiful example of information concealing physical entropy – the diffusion of matter through space.  That is also the magic of television and more advanced telecommunications.  We seem to be defying entropy.  But like the rabbit in the hat trick, it’s a mixture of information, energy and sleight of hand.  Entropy does not bend.

Another great trick is the reaction in instant ice packs, where solid crystals become liquid and suck the heat out of the surrounding environment.  This, again, runs counter to our entropy intuition, which says that energy diffuses from hot things to cold things, so things can’t spontaneously get colder than their surroundings.  But the reaction isn’t reverse entropy.  When the solid crystals turn to liquid, the molecules are freed from their highly organised crystalline prison, and they become a disorganised fluid.  It is possible to imagine the endothermic part of this reaction as thermal energy being converted into kinetic energy when the molecules are suddenly free to move around: entropy from heat to motion.  However, it may be more accurate to say that the information required to describe the physical state of the molecules, their position and momentum, increases spectacularly when they become fluid: entropy from heat to information.  And to me, as an information guy, that’s magical.
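
As a rough back-of-envelope check, using approximate textbook values for ammonium nitrate dissolving in water (treat the numbers as illustrative, not authoritative): the entropy the crystal gains by dissolving outweighs the heat it absorbs, so the Gibbs free energy change comes out negative and the “cold” reaction runs by itself.

```python
# Back-of-envelope sketch: why an endothermic ice-pack dissolution is spontaneous.
# Approximate values for ammonium nitrate dissolving in water; illustrative only.
T       = 298.0     # temperature, kelvin (room temperature)
delta_H = +25.7e3   # enthalpy change, J/mol: the dissolution absorbs heat
delta_S = +108.0    # entropy change, J/(mol*K): ordered crystal -> disordered solution

delta_G = delta_H - T * delta_S   # Gibbs free energy change, J/mol
print(delta_G)  # ~ -6.5e3 J/mol: negative, so the process happens on its own
```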

There are two ideas I’d like to introduce here: first, entropy looks the same whether it is affecting energy, momentum or information; and second, the distinction between mass, energy and information becomes fuzzy at the scale of subatomic particles.  Entropy seems to commute between these forms.  Which brings me to the point.  The fact that entropy is constant and inevitable doesn’t mean there is nothing we can do about it.  Life has been surfing entropy for at least a billion years by pumping and moving the right equipment into the right place to ride the waves: information, decision, location, execution.  Human consciousness adds a recursive loop to this cascade, but even for us, efficiency cannot escape entropy.  It’s also important to remember that information itself is subject to entropy, so no matter how hard we try to set our knowledge in stone, over time the information will be lost.  Every organisation is continuously forgetting how it should be organised.  Life isn’t evolving to improve itself; it is evolving to rediscover the information it has lost to entropy.  Again, human knowledge can loop much faster than DNA, but it isn’t free from the process.  That’s why you have to buy a new computer every few years, even if they’re not really getting any better.
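
One concrete bridge between information entropy and thermodynamic entropy is Landauer’s principle: erasing a single bit of information at temperature T dissipates at least kT·ln 2 of heat.  A quick sketch of how small, but never zero, that floor is at room temperature (the temperature here is my assumption; the constant is the standard Boltzmann constant):

```python
# Landauer's limit: minimum heat dissipated when one bit of information is erased.
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 298.0          # assumed room temperature, K

min_energy_per_bit = k * T * math.log(2)
print(min_energy_per_bit)  # ~2.9e-21 J per erased bit: tiny, but not free
```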

Take one more look at the images above.  It’s easy to see how the conditions on the left would evolve into the conditions on the right, but not the other way around.  And that’s the most important thing to understand about entropy in all of its forms: entropy is what consistently happens spontaneously, but doesn’t reverse by itself.  The point is that many if not most of our failings as people are not anybody’s fault, but if we know about entropy, and we should all know about entropy, the failings shouldn’t come as a surprise.  In planning, if we don’t have a plan for the increase in entropy, then we don’t have a plan.

 

[Image: energy use]
