Everyday entropy: David Foster Wallace and the meaning of meaning

“we, like diminished kings or rigidly insecure presidents, are reduced to being overwhelmed by info and interpretation, or else paralyzed by cynicism and anomie, or else – worse – seduced by some particular set of dogmatic talking-points, whether these be PC or NRA, rationalist or evangelical, “Cut and Run” or “No Blood for Oil.”  The whole thing is (once again) way too complicated to do justice to in a guest intro, but one last, unabashed bias/preference in BAE ’07 (Best American Essays, 2007) is for pieces that undercut reflexive dogma, that essay to do their own Decidering in good faith and full measure, that eschew the deletion of all parts of reality that do not fit the narrow aperture of, say for instance, those cretinous fundamentalists who insist that creationism should be taught alongside science in public schools, or those sneering materialists who insist that all serious Christians are just as cretinous as the fundamentalists.
Part of our emergency is that it’s so tempting to do this sort of thing now, to retreat to narrow arrogance, pre-formed positions, rigid filters, the “moral clarity” of the immature. The alternative is dealing with massive, high-entropy amounts of info and ambiguity and conflict and flux; it’s continually discovering new vistas of personal ignorance and delusion. In sum, to really try to be informed and literate today is to feel stupid nearly all the time and to need help.”
David Foster Wallace, Best American Essays, 2007, introduction, excerpted in Both Flesh and Not, p. 316.
Although “entropy” may be misplaced here, DFW has an intuitive sense that high entropy means noise, and that a flood of information makes people uncomfortable to the point where ignorance is preferable to any attempt at informed decision making.  Bigoted, isolationist political discourse is particularly appealing in this context, but this is not new.  This is not the first time a bloviating, chauvinist megalomaniac has taken power in a major industrialised nation, so it is impossible to place the blame at the feet of digital technology or total noise.  Still, digital technology serves no purpose other than to alleviate ignorance, so we may start with the question: given the immense public and private investment in digital technology, why has it failed so spectacularly at the one thing it was supposed to accomplish?
One answer lies in David Foster Wallace’s mistake.  He was right to say there is a massive amount of information, but wrong about its entropy.  High entropy may be intuitively synonymous with noise, but in information theory, low entropy is also noise, equally useless for communication.  The internet and cable television give us information equal in daily volume to an unabridged Oxford English Dictionary, but with only 100 definitions repeating over and over.  Low-entropy noise is hard to recognise as noise, and it is much harder to filter out than high-entropy noise.
Think of raindrops as high-entropy noise – random and unpredictable.  Low-entropy noise is more like a steady dripping tap – ordered and predictable.  In Shannon’s terms, the steady drip is totally compressible: you could compress an hour of steady drips into one drip on repeat with no loss of information.  Its informational entropy is low.  However, while your mind can effortlessly ignore raindrops, that steady drip keeps coming back, drilling into your head.  The same is true of the modern information overload.  It is possible that we have no more cultural (human) information now than we had in 1932; we just have an immeasurable number of duplicates, replicas and semblances.  You could, for example, compress a week’s worth of news, drama and sport into an hour without missing anything.  However, because the entropy of the information is low, you can’t block it out of your daily routine.  Not only are the duplicates uninformative, they make it harder to find anything worth looking at, new or old.
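A minimal sketch of that contrast in Shannon’s terms, using Python’s zlib as a stand-in for an ideal compressor (the drip timing and the size figures here are illustrative assumptions, not measurements from anywhere):

```python
import random
import zlib

random.seed(0)

# A steady drip: one five-byte pattern repeated for an hour; perfectly predictable.
drip = b"drip " * 720                      # one drip every five seconds, 3,600 bytes

# Raindrops: random bytes of the same length; effectively unpredictable.
rain = bytes(random.getrandbits(8) for _ in range(len(drip)))

print(len(drip), "->", len(zlib.compress(drip)))   # 3600 -> a few dozen bytes
print(len(rain), "->", len(zlib.compress(rain)))   # 3600 -> ~3600 bytes (nothing to squeeze out)
```

The low-entropy signal collapses to almost nothing because the compressor only needs to record “drip, 720 times”; the random signal cannot be shortened at all.  In Shannon’s framing the drip carries almost no information, which is exactly why repeating it for a week adds nothing new.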
But Wallace isn’t just talking about Shannon entropy here.  He’s also talking about the high uncertainty of the meaning of the information, which is something Shannon never addressed.  You can see why this leap is tempting, but judging whether the meaning of a statement is ordered or disordered, compressible or incompressible, is many times more complex than distinguishing between an ordered and a disordered message, which is itself an uncertain task outside of the most formal settings.

Meaning is not a sign of low entropy.  Meaning is a transformation.  Like the difference between the reversible entropy and the actual entropy of a heat engine, meaning is that part of a communication that is irreversible.  That’s the part that gets lost in calculation.  The meaningful part of entropy, the thing that Clausius worked on for 15 years, was not the overall “mixed-upness” or disorder of the system; it was the irreversible transformation of the system.  Entropy means inner transformation, and whether it is due to an irreversible disorder or an irreversible order is irrelevant.  The point is that you can’t get back to the starting point without adding something: energy, mass, charge, space, etc.  Every interaction causes a transformation, but at the moment of the interaction there is no way to distinguish between the part of the transformation that is reversible and the part that is irreversible.  You have to wait and see what doesn’t spontaneously reverse itself.  It all looks the same until the Carnot cycle doesn’t quite come back to its starting position.  That is why the guts of entropy are inaccessible.  Entropy has a form and a moment, but the part that is meaningful is indistinguishable from the part that is meaningless.
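For readers who want the thermodynamic statement behind “the part that doesn’t come back”, the standard textbook form is Clausius’s inequality (ordinary physics, not something this essay derives):

$$ dS = \frac{\delta Q}{T} + dS_{\mathrm{irr}}, \qquad dS_{\mathrm{irr}} \ge 0, $$

with $dS_{\mathrm{irr}} = 0$ only for a perfectly reversible process.  Over a complete Carnot cycle the working fluid returns to its starting state, so $\oint dS = 0$, but any $dS_{\mathrm{irr}} > 0$ generated along the way has already been dumped into the surroundings and cannot be recovered; that unrecoverable remainder is the “actual” entropy contrasted above with the reversible ideal.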

This irreversible transformation is fundamental to the meaning of information as well.  All information causes a transformation.  Meaningful information causes a transformation that is irreversible.  Like physical entropy, meaning also has a form and a moment, but no measurable dimensions to distinguish it from meaningless information.  Meaning is the information you can’t un-know, for good or ill and for reasons that are inexplicable.  Remember the noises of drips and rain?  They may be annoying or soothing, but once they stop they are gone.  The transformation is reversible.  The songs that get stuck in your head are different.  One thing about meaningful information is that its entropy (Shannon entropy) can’t be too low or too high.  You can’t get a steady beat stuck in your head any more than you can get a whole symphony stuck in your head, but think of the themes from Mozart’s Requiem, Dvořák’s Cello Concerto, or Beethoven’s 5th and 9th symphonies, those simple tunes that haunt the whole ensemble of sound and never leave you.  The entropy of songs that get stuck in your head is a little too low to engage your intellect, so they tend to be annoying after a while, but they’re just right for some part of your brain that predates your sense of control, maybe from way back before the dinosaurs.  But I digress.
The meaning of a message is the part that changes how you will perceive the world, not the sum total or average value of its statements.  Like the song that changes how you hear the world, a meaningful message may be trite and annoying, or it may be transcendent and powerful, but either way its meaning cannot be predicted from its content.  One of the problems with pop music and daily news is that they are very much like information that people have generally found meaningful, without actually being meaningful.  Part of the cognitive dissonance that Wallace is talking about is the mind feeling that pop music and daily news should be meaningful, and then failing to find any meaning in them.  But this isn’t all bad.  Every so often something beautiful bubbles up through the scum.  Like entropy, meaning can’t be measured or controlled, but if you pay attention you can see a pattern in the form and moment of the transformation, so maybe you can stop beating the horse before it is completely and irreversibly dead.  Meaning is a matter of relationships and configurations, not a sum of information, so to understand the meaning of information, it is necessary to understand its entropy.