Everyday entropy: science and industry




Science is wonderful, but it is important to distinguish between the practice of science, which involves the discovery and dissemination of knowledge, and the industry of science, which involves taking profit from access to knowledge.  This is a lot like the difference between the practice of banking, which involves defining and circulating money, and the banking industry, which involves extracting profit from access to that money.  In both cases, as long as the system has plenty of free information, the difference is immaterial, but as the system approaches maximum entropy, the rent required to sustain the industry overwhelms the benefit of the practice, and the system collapses.

The important thing is not to try to innovate or discover new practices.  The important thing is to avoid making arbitrary copies.  The first car was, in fact, awesome – the 10 billionth car, not so much.  When Laird Hamilton tow-surfed Jaws for the first time, that was awesome – a big-wave day with 50 jet-ski teams in the water jockeying for a wave, not so much.  Trying to innovate will yield arbitrarily novel distinctions that don’t make a difference (I’m talking about you, Windows 10), while crowding out really good ideas that are old, tried, and tested.  Avoiding arbitrary duplication makes space for new ideas and old ideas that actually work.

People who enjoy completing tasks are very good at duplicating awesome things, and people who enjoy organising are very good at providing taskers with raw materials and assembly lines, and at first, the copies seem just as awesome as the original.  It is only after the copies start getting in each other’s way, like cars on the interstate or jet-skis at Jaws, that the not-so-awesome impact of all that duplication becomes apparent.  But by then, nobody can remember what they did before they started copying that long-lost awesome original.

The Wartime Philosophy of “Shut up and Calculate”


“Veterans of the intense, multidisciplinary wartime projects came to speak of a new type of scientist. They touted the war-forged ‘radar philosophy’ and the quintessential ‘Los Alamos man’: a pragmatist who could collaborate with everyone from ballistics experts to metallurgists, and who had a gut feeling for the relevant phenomena without getting lost in philosophical niceties.

Leading scientists and policy-makers actively sought to continue the wartime spirit of collaboration across disciplines. The Atomic Energy Commission oversaw a new network of national laboratories to pursue both civilian and defence research. The labs featured interdisciplinary teams that mixed physicists, mathematicians and chemists with engineers of many stripes. A similar set-up appeared across dozens of US universities: facilities straddling several academic departments, such as the Research Laboratory for Electronics and the Laboratory for Nuclear Science and Engineering, both founded at MIT by the end of 1945.

The facilities hummed with surplus equipment and know-how culled from the wartime projects. Physicist Bruno Rossi, for one, studied cosmic rays after the war by adapting the sensitive timing circuits he had built at Los Alamos to measure nuclear-fission rates.

Openly philosophical areas of physics, the intellectual roots of which stretched back before the war, became increasingly marginalized, such as grand questions about the birth and fate of the Universe, the thin border between order and disorder in chaotic systems, or the subtle foundations of quantum theory. Sometimes these were denigrated as not even being ‘real physics’ by influential physicists in the United States, although research in these areas advanced in other parts of the world.”


Everyday entropy: fake news and blue lies

It turns out that people prefer fake news to real news.  Investigative journalism exists because you can get better stories in the field than you can make up in your head.  Also, people don’t want the same fake news over and over again. They want real facts that confirm their false assumptions over and over again, although alternative facts can be substituted where real ones fail.  

Blue lies are false statements that support the community’s identity or absolve it of responsibility for its failings. Together, they can support a whole ethos of fake civilization. This is not to say that Trumpian America is not real, only that it is not civilized. Its comforts come from the privileges bestowed upon it by civilizations past: by the New Deal, the GI Bill and Ike’s interstate highway system.

But there is no financially viable way to de-escalate in any given news cycle. If the truth doesn’t fit into a conceptual niche with a predictable outcome, it is very hard to sell.  Only insanely curious people prefer reality to belief, and those people are prone to clinical depression, alcoholism and suicide – not a good customer base.


4 reasons why people ignore facts and believe fake news

Dr. Michael Shermer is the author of “Why People Believe Weird Things.” He is the publisher of Skeptic Magazine and the Presidential Fellow at Chapman University, where he teaches Skepticism 101.

The new year has brought us the apparently new phenomena of fake news and alternative facts, in which black is white, up is down, and reality is up for grabs.

The inauguration crowds were the largest ever. No, that was not a “falsehood,” proclaimed Kellyanne Conway as she defended Sean Spicer’s inauguration attendance numbers: “our press secretary…gave alternative facts to that.”

George Orwell, in fact, was the first to identify this problem in his classic essay “Politics and the English Language” (1946). In the essay, Orwell explained that political language “is designed to make lies sound truthful” and consists largely of “euphemism, question-begging and sheer cloudy vagueness.”

But if fake news and alternative facts are not a new phenomenon, and popular writers like Orwell identified the problem long ago, why do people still believe them? Well, there are several factors at work.

Cognitive simplicity

In general, when our brains process information, belief comes quickly and naturally, skepticism is slow and unnatural, and most people have a low tolerance for ambiguity. Research shows that when we process and comprehend a statement, our brain automatically accepts it as true, whereas subsequent skepticism of the statement requires an extra cognitive step, which is a heavier load to lift. It is easier to just believe it and move on.

fMRI brain-scan research shows that when we understand a statement we get a hit of dopamine in the reward areas of our brain, so comprehension is positively rewarded and feels good. By contrast, the brain appears to process false or uncertain statements in regions linked to pain and disgust, especially in judging tastes and odors, giving new meaning to the idea of a claim “passing the taste test” or “passing the smell test.”

Cognitive dissonance

Cognitive dissonance is the uncomfortable tension that comes from holding two conflicting thoughts at the same time. It’s easier to dispute the facts than to alter one’s deepest beliefs. Creationists, for example, challenge the evidence for evolution not for scientific reasons but because they fear that if the theory is true they have to give up their religion. Climate deniers don’t dispute the data from tree rings, ice cores, and the rapid increase of greenhouse gases out of scientific curiosity – but because they’re afraid that if it’s true it might mean more restrictive government regulations on business and industry.

Backfire Effect

Cognitive simplicity and dissonance lead to a peculiar phenomenon in which people seem to double down on their beliefs in the teeth of overwhelming evidence against them. This is called the backfire effect. In a series of experiments at Dartmouth College, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as the existence of WMDs in Iraq.

When subjects were then given a corrective article that WMDs were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite. And more: they reported being even more convinced there were WMDs after the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In the real world, when WMDs were not found, liberals who supported the war declared that they had never supported the war, and conservatives who supported the war insisted there were WMDs.

Tribal unity

We are a social primate species and we want to signal to others that we can be trusted as a reliable group member.

This means being consistent in agreeing with our other group members—whether that group is our political party or our religious faith—that we will not stray too far from our group’s core beliefs.

Thus, cognitive simplicity and cognitive dissonance may have an evolutionarily adaptive purpose, as the social psychologist Carol Tavris outlined in an email to me:

“When you find any cognitive mechanism that appears to be universal—such as the ease of creating ‘us-them’ dichotomies, ethnocentrism (‘my group is best’), or prejudice—it seems likely that it has an adaptive purpose; in these examples, binding us to our tribe would be the biggest benefit. In the case of cognitive dissonance, the benefit is functional: the ability to reduce dissonance is what lets us sleep at night and maintain our behavior, secure that our beliefs, decisions, and actions are the right ones. The fact that people who cannot reduce dissonance usually suffer mightily (whether over a small but dumb decision or because of serious harm inflicted on others) is itself evidence of how important the ability to reduce it is.”

Ultimately we are all responsible for what we believe and it is incumbent on us to be our own skeptics of fake news and alternative facts. When in doubt, doubt.

Ask “how do you know that’s true?” “What’s the source of that claim?” “Who said it and what is their motivation?” We must always be careful not to deceive ourselves, and we are the easiest people to deceive. As George Orwell wrote in a poignantly titled 1946 essay In Front of Your Nose:

“To see what is in front of one’s nose needs a constant struggle. … The point is we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield.”

Other books by Dr. Michael Shermer include “Why Darwin Matters,” “The Science of Good and Evil,” and “The Moral Arc.” His upcoming book is “Heavens on Earth: The Quest for Immortality and Perfectibility.” Follow him on Twitter @michaelshermer.


By Clare Wilson

Seeing shouldn’t always be believing. We all have blind spots in our vision, but we don’t notice them because our brains fill the gaps with made-up information. Now subtle tests show that we trust this “fake vision” more than the real thing.

If the brain works like this in other ways, it suggests we should be less trusting of the evidence from our senses, says Christoph Teufel of Cardiff University, who wasn’t involved in the study. “Perception is not providing us with a [true] representation of the world,” he says. “It is contaminated by what we already know.”

The blind spot is caused by a patch at the back of each eye where there are no light-sensitive cells, just a gap where neurons exit the eye on their way to the brain.

We normally don’t notice blind spots because our two eyes can fill in for each other. When vision is obscured in one eye, the brain makes up what’s in the missing area by assuming that whatever is in the regions around the spot continues inwards.

Trick of the mind

But do we subconsciously know that this filled-in vision is less trustworthy than real visual information? Benedikt Ehinger of the University of Osnabrück in Germany and his colleagues set out to answer this question by asking 100 people to look at a picture of a circle of vertical stripes, which contained a small patch of horizontal stripes.

The circle was positioned so that with one eye obscured, the patch of horizontal stripes fell within the other eye’s blind spot. As a result, the circle appeared as though there was no patch and the vertical stripes were continuous.

Next to this was another circle of vertical stripes without a patch of horizontal stripes. People were asked to choose which circle seemed most likely to have continuous stripes.

Ehinger’s team were expecting that people would choose the circle without a patch more often. “It would be more logical to choose the one where they can really see all the information,” he says.

Cognitive bias

In fact, people chose the circle that had a filled-in patch 65 per cent of the time. “We never expected this,” says Ehinger. “The brain trusts its own generated information more than what it sees outside in the world.”
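As a back-of-the-envelope check (not from the study itself), a binomial calculation shows how far a 65 per cent split in a two-choice task sits from pure chance, under the simplifying assumption that each of the 100 participants contributed one independent fifty-fifty choice; the real experiment used many trials per person, so this only sketches the scale of the effect.

```python
from math import comb

# Probability of 65 or more "filled-in" choices out of 100
# if people were genuinely choosing at random (p = 0.5).
n, k = 100, 65
p_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

print(f"P(X >= {k}) = {p_tail:.4f}")  # well below the usual 0.05 threshold
```

Under these assumptions the tail probability comes out below one per cent, which is why a 65/35 split reads as a real preference rather than noise.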

This fits in with what we know about cognitive biases, says Ehinger. When people hold strong beliefs, they are likely to ignore any evidence to the contrary.

There is no obvious benefit to our brains ignoring external information, says Ehinger. “I’ve talked to many people about it, and they all say it doesn’t make sense,” he says.

As well as blind spots in vision, there are other ways in which our perceptions are changed by the brain’s expectations, says Teufel. For instance, if a well-known song is converted to MIDI format, which strips out the vocals, people can usually “hear” words that aren’t really there.

Blue Lies

“Blue lies are a different category altogether, simultaneously selfish and beneficial to others—but only to those who belong to your group. As University of Toronto psychologist Kang Lee explains, blue lies fall in between generous white lies and selfish “black” ones. “You can tell a blue lie against another group,” he says, which makes it simultaneously selfless and self-serving. “For example, you can lie about your team’s cheating in a game, which is antisocial, but helps your team.”

In a 2008 study of seven, nine, and 11-year-old children—the first of its kind—Lee and colleagues found that children become more likely to endorse and tell blue lies as they grow older. For example, given an opportunity to lie to an interviewer about rule-breaking in the selection process of a school chess team, many were quite willing to do so, older kids more than younger ones. The children telling this lie didn’t stand to selfishly benefit; they were doing it on behalf of their school. This line of research finds that black lies drive people apart, white lies draw them together, and blue lies pull some people together while driving others away.

Around the world, children grow up hearing stories of heroes who engage in deception and violence on behalf of their in-groups. In Star Wars, for example, Princess Leia lies about the location of the “secret rebel base.” In the Harry Potter novels (spoiler alert!), the entire life of double-agent Severus Snape is a lie, albeit a “blue” one, in the service of something bigger than himself.

That explains why most Americans seem to accept that our intelligence agencies lie in the interests of national security, and we laud our spies as heroes. From this perspective, blue lies are weapons in intergroup conflict. As Swedish philosopher Sissela Bok once said, “Deceit and violence—these are the two forms of deliberate assault on human beings.” Lying and bloodshed are often framed as crimes when committed inside a group—but as virtues in a state of war.

This research—and those stories—highlight a difficult truth about our species: We are intensely social creatures, but we’re prone to divide ourselves into competitive groups, largely for the purpose of allocating resources. People can be prosocial—compassionate, empathic, generous, honest—in their groups, and aggressively antisocial toward out-groups. When we divide people into groups, we open the door to competition, dehumanization, violence—and socially sanctioned deceit.

“People condone lying against enemy nations, and since many people now see those on the other side of American politics as enemies, they may feel that lies, when they recognize them, are appropriate means of warfare,” says George Edwards, a Texas A&M political scientist and one of the country’s leading scholars of the presidency.

If we see Trump’s lies not as failures of character but rather as weapons of war, then we can come to see why his supporters might see him as an effective leader. From this perspective, lying is a feature, not a bug, of Trump’s campaign and presidency.

Research by Alexander George Theodoridis, Arlie Hochschild, Katherine J. Cramer, Maurice Schweitzer, and others has found that this kind of lying seems to thrive in an atmosphere of anger, resentment, and hyper-polarization. Party identification is so strong that criticism of the party feels like a threat to the self, which triggers a host of defensive psychological mechanisms.

For millions and millions of Americans, climate change is a hoax, Hillary Clinton ran a sex ring out of a pizza parlor, and immigrants cause crime. Whether they truly believe those falsehoods or not is debatable—and possibly irrelevant. The research to date suggests that they see those lies as useful weapons in a tribal us-against-them competition that pits the “real America” against those who would destroy it.”

Everyday entropy: entourage

Blowing Smoke

The entourage effect means that the active ingredient is not the whole experience.  This is a classic case of a theory that can’t be tested scientifically.  Take coffee for example.  Very few people use caffeine tablets in the morning.  They would rather drink bad coffee than swallow a tasteless pill.  That may seem irrational, but most coffee drinkers will tell you that the uplift begins before the first sip, when the aroma hits the olfactories.  The feeling is in the experience as a whole, and while the active ingredient plays a part in that experience, if you’ve had a great cup of coffee before, you’ll get a good feeling from the memory inspired by a lesser cup later on.

The feeling you get from taking drugs, whether coffee, beer or bud, is different every time.  If you went to the same coffee shop every day, and they used the same roaster and the same supplier and beans from the same plantation every day, you would still have a different feeling after the cup every day.  A scientific study of the feeling you get from coffee would average out as something very similar to taking a caffeine pill every morning, because when you average out the different ups, downs and ripples, the simple stimulation is all that is left.  It is exactly like the statistical mechanics of an ideal gas.  Every molecule is moving at a different speed, but on average, the speed is constant and determined entirely by the temperature.  The basic assumption of statistical mechanics, without which entropy makes no sense, is that every microscopic constituent is interchangeable.  This simply isn’t possible for organic plant material.  No two batches of beer are identical, no two wine vintages are identical, and no two pots of coffee are identical.  Not only that, but oxidization changes the flavor and chemistry of coffee and wine from one minute to the next.  Bees prefer fake flowers with the “natural” amount of caffeine or nicotine to fake flowers with too much or too little, so the way the dose is received matters as well:

“The researchers used artificial flowers in a tightly-monitored flight arena in the laboratory to mimic how flowering plants use animals as pollen carriers and reward pollinators with sugars found in floral nectar.

Thirty bees were allowed to forage on two types of flowers – one type contained a sugar solution and was blue in colour. The second type of artificial flower was purple in colour and had different concentrations of nicotine. Another 30 bees were tested with the two flower colours having the opposite contents.

The experiment was repeated with the nicotine-laced flowers having three different concentrations of nicotine – two of which were found within the natural range and another that was much higher. Only the unnaturally high concentration of nicotine deterred the bees from foraging for nectar.

“Here we find that bees not only remember such flowers better, but even keep coming back for more when these flowers are demonstrably poorer options, as if they were truly hooked on these flowers.”
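The averaging argument above – that pooling many variable coffee experiences leaves only the constant stimulant effect – is just the law of large numbers, and can be sketched in a few lines. The numbers here are arbitrary illustration, not measurements:

```python
import random

random.seed(42)  # reproducible illustration

PILL_EFFECT = 1.0  # hypothetical constant "caffeine pill" lift, arbitrary units

# Each cup delivers the same stimulant plus day-to-day ups, downs and
# ripples, modelled here as Gaussian noise around the pill effect.
cups = [PILL_EFFECT + random.gauss(0, 0.3) for _ in range(10_000)]

average_lift = sum(cups) / len(cups)
print(f"average over {len(cups)} cups: {average_lift:.3f}")
```

The average converges on the pill: the individual character of each cup – the part the entourage effect is about – is exactly what the averaging throws away.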

Crazy Medicine:


By Michael Le Page

In some cultures, it’s traditional for elders to smoke grass, a practice said to help them pass on tribal knowledge. It turns out that they might just be onto something.

Teenagers who toke perform less well on memory and attention tasks while under the influence. But low doses of the active ingredient in cannabis, THC, might have the opposite effect on the elderly, reversing brain ageing and restoring learning and memory – at least according to studies of mice.

“We repeated these experiments many times,” says team leader Andreas Zimmer at the University of Bonn, Germany. “It’s a very robust and profound effect.”

Zimmer’s team has been studying the mammalian endocannabinoid system, which is involved in balancing out our bodies’ response to stress. THC affects us by mimicking similar molecules in this system, calming us down.

The researchers discovered that mice with genetic mutations that stop this endocannabinoid system from working properly age faster than normal mice, and show more cognitive decline. This made Zimmer wonder if stimulating the endocannabinoid system in elderly mice might have the opposite effect.

Brain boost

To find out, the team gave young (2-month-old), middle-aged (12-month-old) and elderly (18-month-old) mice a steady dose of THC. The amount they received was too small to give them psychoactive effects.

After a month, the team tested the mice’s ability to perform cognitive tasks, such as finding their way around mazes, or recognising other individuals.

In the control groups, which received no THC, the young mice performed far better than the middle-aged and elderly mice. But the middle-aged and elderly mice who had been given THC performed as well as the young mice in the control group.

Further studies showed that THC boosted the number of connections between brain cells in the hippocampus, which is involved in memory formation. “It’s a quite striking finding,” says Zimmer.

Age effect

But THC seemed to have the opposite effect in young mice: when they were given THC, their performance in some tasks declined.

Young people also perform worse in learning and memory tests in the hours and days after smoking cannabis, but a joint delivers far higher doses than the mice received. Claims that heavy marijuana use can permanently impair cognition are disputed.

Zimmer thinks his findings show that both too much and too little stimulation is harmful. The endocannabinoid system is most active in young mice (and people), so extra THC may overstimulate it. In older mice, by contrast, endocannabinoid activity declines, so a little THC restores it to optimum levels.

Human trial

The team’s findings aren’t that surprising, says neuropsychopharmacologist David Nutt of Imperial College London. Animal studies have shown that the cannabinoids the body produces itself can have beneficial effects on the brain. And Nutt and his colleagues have also found that THC use protects alcoholics from alcohol-induced brain damage.

Zimmer’s team is now planning human trials to find out whether older people can benefit from low doses of THC too and, if so, from what age it is beneficial. “There is no formula to translate mouse months into human years,” Zimmer says.

The trials will use purified THC rather than weed so the dosage can be precisely controlled. It might be administered as a mouth spray, for example.

Even if the trials get similar results, it is unlikely that doctors will start prescribing spliffs to older people. “The dosing is important,” Zimmer says. “Smoking marijuana is very different.”

Everyday entropy: reforesting China

Recovering forests, with deforested areas in the background, in Wolong, China

It is too easy to criticise China for making the same environmental choices that Britain and the US made in previous centuries.  Air quality in Beijing is hardly worse than it was in London at the height of the industrial revolution.  Equally, it is too easy to congratulate China for joining the deforestation-NIMBY club.  Like Europe and the United States, China no longer tolerates internal deforestation, but, also like Europe and the US, Chinese investment is now contributing to deforestation in South America, Africa, Indonesia, and the rest of Southeast Asia.  In a pattern that mirrors reforestation in colonial Europe, China has found that deforestation can be exported as easily as opium, tobacco, and electronics.

It looks like a classic case of misunderstanding the relationship between a heat engine and its heat sink, imagining that the entropy of the environment can increase infinitely and indefinitely. 

In reading the reports on forest cover, it also seems somehow important that the difference between a 50-year-old tree and a 300-year-old tree isn’t readily apparent unless you are standing next to its base.  Ecologically, ancient trees are not just fatter, and it is impossible to reestablish an old-growth forest in one human lifetime, let alone one political cycle.  We cannot live long enough to see successful reforestation, if it is ever achieved, so we have no way of knowing whether any of our reforestation efforts can be classified as a success.

As it is, imported timber accounts for more than half of China’s total timber supply. With the new logging ban in place, China is expected to fix its gaze abroad to meet domestic demand with even more imported timber. Already, evidence suggests that a large portion of the wood imports entering China are illegally sourced – particularly certain rosewood species from the Mekong region of Southeast Asia and, increasingly, from Africa. In addition, the limited availability of desirable species that plantations can’t supply could exacerbate the existing illegal logging issues in the Russian Far East, whose temperate forests border China’s and share many of the same species.

At a global scale, both historical and recent tree cover loss followed clear spatial and macro-economic patterns. Spatially, losses have disproportionately affected areas with high population pressure and easy access (figure 3a; electronic supplementary material, figure S4). Consequently, today most (71%) tree cover remains within areas with lower human population pressure. In low-income countries (with a GDP per capita of less than US$ 10 000), a large fraction of tree cover still coincides with areas of high population pressure, where it is currently being lost at an alarming rate: between 2000 and 2012 approximately 25 000 km² yr⁻¹ (6%) (electronic supplementary material, figure S5). A slightly different picture emerges for high-income countries with a large proportion of boreal forests (electronic supplementary material, figure S6): here, there have also been substantial tree cover losses at distances of more than 1000 km from towns and in areas of very low population density, reflecting the proneness of boreal forests to natural loss dynamics (e.g. caused by fire, storm, insects and pathogenic fungi). Protected areas appeared to hold back tree cover losses, but only to an extent: recent proportionate tree cover losses outside protected areas were approximately twice as high as inside (1.6 million km² (5%) versus 94 000 km² (2.9%); electronic supplementary material, figure S6).

The proportion of tree cover losses not compensated by within-country tree cover gains (loss/(loss + gain)) was highest in countries with high levels of poverty, urbanization, population growth, a high GDP reliance on agriculture and with expanding food production (figure 6; electronic supplementary material, figures S7 and S8). There was a strong continental signal whereby the fraction of uncompensated losses was particularly high in African countries. Losses appeared to be lower in SE Asia, but this hides the fact that large amounts of natural tree cover there are replaced with tree cash crops such as rapidly growing oil palm, rubber and Eucalyptus. There was a small negative correlation (R = −0.21, p ≤ 0.001, d.f. = 176) between the proportionate historical tree cover loss (per country) and the recent proportionate uncompensated loss, indicating that countries that have historically deforested their territories may now have shifted to protecting the remaining tree cover.

The global picture of tree cover losses illustrates China’s challenging situation: China has one of the world’s greatest coincidences of tree cover and people (figure 4), with 88% of China’s tree cover located in areas with more than 10 people per km² and 33% in areas with greater than 100 people per km². Globally, only 29% and 6% of tree cover are located in these population density classes, respectively. Our calculations showed that China also has one of the world’s highest correlations between area suitability for tree cover and for agriculture (globally R = 0.35; p ≤ 0.001, d.f. = 8 318 389; in China R = 0.73, p ≤ 0.001, d.f. = 546 123; China ranked ninth in the world in terms of this correlation), and the zone that is suitable for tree cover therefore overlaps to 84% with the zone suitable for agriculture. Finally, China has one of the world’s lowest per person areas of suitable agricultural land (approx. 3 × 10⁻³ km² per person, which is less than one-third of the global average). As is the case globally, both historical and recent tree cover losses in China have mostly affected areas with high population pressures, high agricultural suitability and easy access (figure 3b). Today, most of China’s tree cover remains on sloping lands (1–10°), while in flat areas as much as approximately 90% of original tree cover may have been lost (electronic supplementary material, figure S9). Between 2000 and 2010, approximately 10 000 km² of tree cover have been lost from large and unbroken expanses of intact natural forest areas [23] (see electronic supplementary material, Results S4), and between 2000 and 2012, almost 3000 km² of tree cover with crown cover of 50% or more has been lost from protected areas (3.2%). This was slightly higher than the global proportionate loss from protected areas (2.9%) and similar to the proportionate loss outside protected areas in China (4%), raising concerns over protected area efficacy.

China’s forest recovery shows hope for mitigating global climate 

China’s sweeping program to restore forests across the country is working.

The vast destruction of China’s forests, leveled after decades of logging, floods and conversion to farmland, has become a story of recovery, according to the first independent verification published in today’s Science Advances by Michigan State University (MSU) researchers.

“It is encouraging that China’s forest has been recovering in the midst of its daunting environmental challenges such as severe air pollution and water shortages,” said co-author Jianguo “Jack” Liu, Rachel Carson Chair in Sustainability and director of MSU’s Center for Systems Integration and Sustainability (CSIS).  “In today’s telecoupled world, China is increasingly connected with other countries both socioeconomically and environmentally. Every victory must be measured holistically, or we aren’t getting a true picture.”

Forests are crucial to ensuring soil and water conservation and climate regulation. The fate of forests in the world’s most populous nation has global consequences by virtue of the country’s sheer magnitude and its rapid development.

Since the beginning of the 21st Century, China has implemented one of the largest forest conservation and restoration programs in the world, the Natural Forest Conservation Program (NFCP), which bans logging and, in some forested areas, compensates residents for monitoring activities that prevent illegal timber harvesting.

The MSU scientists used a unique combination of data, including the big-picture view of NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) annual Vegetation Continuous Fields tree cover product, along with high spatial resolution imagery available in Google Earth. Then they combined data at different scales to correlate the status of the forests with the implementation of the NFCP.

And, as the Chinese government has contended, the program is working and forests are recovering, with about 1.6 percent, or nearly 61,000 square miles, of China’s territory seeing a significant gain in tree cover, while 0.38 percent, or 14,400 square miles, experienced significant loss.

“Our results are very positive for China,” said author Andrés Viña of MSU-CSIS. “If you look at China in isolation, its program is working effectively and contributing to carbon sequestration in accordance to its agenda for climate change mitigation. But on the other hand, China is not in a vacuum.”

In the future, it is important to quantify how much China’s forest gain and improved carbon sequestration may be a loss for places like Madagascar, Vietnam and Indonesia. Those are among the countries that are chopping down their forests to sell products to China. And the global increase in greenhouse gases and loss of biodiversity may have just changed addresses.

Viña noted more research is needed to document the broader impacts of forest degradation and recovery around the world. He also noted that the voracious appetite for natural resources – both timber and the agricultural products grown on converted forestland – is not just China’s issue.

“We are all part of the problem one way or another,” he said. “We all buy products from China, and China has not changed their imports and exports of wood at all. What has changed is where timber is coming from.”

Besides Viña and Liu, “Effects of conservation policy on China’s forest recovery” was written by MSU associate professor William McConnell, and CSIS PhD students Hongbo Yang and Zhenci Xu.

Everyday entropy: the awesome, terrible importance of make-believe


Social conservatism and supply-side economics are examples of social realities that have lost connection with physical reality. No legitimate science supports the notion that homosexuality is unnatural or that lower taxes pay for themselves with higher revenue. Nor, given that 10-20% of pregnancies end in miscarriage, is there any reason to believe that God wants all fetuses to reach full term. But shared belief requires shared interests, whether in pork, alcohol or headgear, and dead babies rank high on the list of easy ideas to sell. The ideas persist against all evidence because they have social significance. People use them to show where they stand, not to understand reality. This may seem stupid, but social reality is almost pure information. It is simple and Manichean, with virtually no intrusion from entropy or uncertainty, and it obviates the need for costly observation. A social map is efficient.

Asperger’s syndrome illustrates the terribly awesome benefits of social mapping. The problem for people with Asperger’s is not that they can’t see the benefits of social integration, but that they can’t see the threads of the social map: the made-up reality that brings people together.

The child who pointed out that the emperor was naked probably had Asperger’s.  Most children are happy enough to make-believe like everyone else.  Some are different.  This difference, the inability to go along to get along, makes it very difficult to manage an Asperger’s person, but every once in a while the ability to see through the veil of social reality makes that person invaluable.

In the triad of impairments for Asperger’s syndrome, it is obvious why the impairments of social communication and social interaction cause problems, but it is hard to explain the importance of social imagination.  Make-believe play helps children develop all kinds of practical skills, including math, science, language and music, but children with Asperger’s are often precociously gifted in these skills.  Not only that, but it isn’t clear why make-believe should make relationships easier for adults: why pretend when you can just tell the truth?

There is value in having a social map, a local idiom, a proper way of going about things. The political importance of imaginative play cannot be overestimated.  In adult human interactions, there is a great deal of make-believe, a social agreement that we will all pretend that THIS is important and THIS is unimportant.  For someone with Asperger’s, this make-believe world of social significance, like hair-cuts and polished shoes, religion and career, is unimaginable.  This gloss of fantasy may be boring and futile, involving appropriate attire, table manners, personal development and social networking, but it’s a make-believe world that makes it possible for people to interact seamlessly with people they’ve never met before or don’t actually like.

But here’s the thing.  The adult fantasy world is still make-believe, and it has a way of drifting away from physical reality, sometimes to the point where starting a war for obviously made-up reasons seems totally appropriate.  Part of the emergency in civilisation is that the make-believe political world of middle-class improvement is falling apart, but people are so good at playing this game of make-believe that they don’t have a way to look at the reality and change the game.

What most people don’t seem to understand about Asperger’s is that social imagination isn’t fun for people who have it.  Playing dress-up or make-believe – trying to imagine what someone else would do in this situation – is just stressful.  Curiously, Dungeons & Dragons and similar abstract role-playing games are easier because they involve observation rather than impersonation.

“People with Asperger’s can be imaginative in the conventional use of the word. For example, many are accomplished writers, artists and musicians. But people with Asperger syndrome can have difficulty with social imagination. This can include:

  • imagining alternative outcomes to situations and finding it hard to predict what will happen next
  • understanding or interpreting other people’s thoughts, feelings or actions. The subtle messages that are put across by facial expression and body language are often missed
  • having a limited range of imaginative activities, which can be pursued rigidly and repetitively, eg lining up toys or collecting and organising things related to his or her interest.”







Everyday entropy: waggle dance


“The bee’s dance is analog.

– Orrin W. “Rob” Robinson

This is the thing that information theory can’t describe.  The bee’s dance is literally analog, not figuratively digital.  Bees are not trying to convey 7.4 bits of information in an approximate way; they are just dancing in a way that sometimes conveys information.  Only about 10% of bees follow a waggler’s instructions, and among them, the quantum of onboard information is uncertain.  The dance itself isn’t being recorded, stored or copied, so its digital equivalent is moot.  What information the recruits actually understand, and how they understand it, are both hidden inside the whole system: nerves, muscles and skeleton.  It is possible to conceptualise the dance in terms of information, so that a digital replica of the dance would need about 8 bits, but the dance itself is physical and entropic.

The dance is also resonant – the vibrations of the dancer’s waggle resonate with the physical structures of the listening bees.  The dance imprints vibrations and vectors; it commutes between bees in waves of approximation.


The spatial information content of the honey bee waggle dance

Roger Schürch and Francis L. W. Ratnieks

“In 1954, Haldane and Spurway published a paper in which they discussed the information content of the honey bee waggle dance with regard to the ideas of Norbert Wiener, who had recently developed a formal theory of information. We return to this concept by reanalyzing the information content in both vector components (direction, distance) of the waggle dance using recent empirical data from a study that investigated the accuracy of the dance. Our results show that the direction component conveys 2.9 bits and the distance component 4.5 bits of information, which agrees to some extent with Haldane and Spurway’s estimates that were based on data gathered by von Frisch. Of course, these are small amounts of information compared to what can be conveyed, given enough time, by human language, or compared to what is routinely transferred via the internet. Nevertheless, small amounts of information can be very valuable if it is the right information. The receivers of this information, the nestmate bees, know how to react adaptively so that the value of the information is not negated by its low information content.

In 1954, Haldane and Spurway (1954) published a paper in the scientific journal Insectes Sociaux with the title “A statistical analysis of communication in Apis mellifera and a comparison with communication in other animals.” Haldane and Spurway (1954), using the data set of Karl von Frisch, looked at the waggle dance communication using an information theory approach, at least in terms of the direction communicated by a dancing bee (von Frisch, 1946, 1967). Of course, von Frisch’s primary target was to understand the dance language, not to obtain a precise calibration to study where the bees had foraged: he chose to work only with good dancers (Chittka and Dornhaus, 1999), which seems to underestimate systematically the error present in the dances (Schürch and Couvillon, 2013; Schürch et al., 2013) and therefore bias the data.

Despite the limitations outlined above, our calculations are a first step, and important questions arise from the calculations. For example, how much information in a dance is useful to a colony? Is one bit of spatial information helpful, that is, fly north or south? How useful are two bits that could communicate four directions unambiguously (north, east, south, west)? And how much better are the 2.9 bits, that is sectors of the circle of about 50°, that bees communicate? Much will probably depend on the environment (Sherman and Visscher, 2002; Donaldson-Matasci and Dornhaus, 2012; Okada et al., 2012), or the benefits of the spatial information may also depend on colony size (Donaldson-Matasci et al., 2013). For example, if a hive were situated in the middle of a large-scale farming landscape with mass flowering crops, a dance with relatively little information might be informative, whereas in a more fragmented landscape with small flower patches, more information will be necessary to allow a dance follower to find an advertised resource. Future honey bee foraging models should incorporate variability in the dance’s information to investigate the relationship between spatial information content and adaptiveness of the dance.”
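The paper’s figures can be sanity-checked with a line of arithmetic: resolving the 360° circle into sectors of roughly 50° (the sector width quoted in the abstract above) carries log2(360/50) ≈ 2.85 bits, close to the reported 2.9, and adding the 4.5-bit distance component gives the ~7.4 bits mentioned earlier. A minimal back-of-envelope sketch:

```python
import math

# Direction: ~50-degree sectors of the 360-degree circle, i.e. about
# 7.2 distinguishable headings (per Schurch & Ratnieks' quoted figure).
direction_bits = math.log2(360 / 50)   # ~2.85 bits, consistent with the reported 2.9

# Total spatial information, using the paper's reported components.
total_bits = round(2.9 + 4.5, 1)       # direction + distance = 7.4 bits

print(round(direction_bits, 2), total_bits)
```

Rounding 7.4 up to a whole number of symbols is what licenses the essay’s “8 bits for a digital replica” figure.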

Everyday entropy: ataxophobia and anxiety



Ataxophobia is a lot like arachnophobia. Most people have a little ataxophobia, fear of disorder, and that is probably healthy. Science is dominated by people who have a lot.  This, more than anything else, is the barrier to understanding entropy.

Arieh Ben-Naim’s passion comes from the utter failure of traditional texts to actually teach anybody about entropy.  If entropy pedagogy were in any way adequate, you would expect a few people, at least ones with advanced chemistry degrees, to understand entropy.  But they don’t.   Von Neumann’s (possibly apocryphal) assessment is still accurate: “nobody really knows what entropy is.”  Ben-Naim has spent his life trying to figure out why science students fail to grasp entropy, and how to fix that.  He has assailed chemistry and physics textbooks for misinforming students about the true nature of entropy.  But, like Boltzmann, Ben-Naim is able to see thermodynamics in terms of probability and uncertainty, so Shannon’s measure of information seems like a simpler and easier way to understand entropy than the traditional “spreading out” or “mixing up.” The thing is that if ataxophobia is like arachnophobia, the fear of uncertainty is like all other fears combined.  Telling people that entropy is actually uncertainty is like telling them that what they thought was a single spider is actually a big box of snakes, and they’re on a high ledge, in the dark, and they have to speak publicly in the nude (uncertainty fuels anxiety).  Fear of uncertainty is so common that it has been upgraded from a phobia to an emotion: anxiety.

Ben-Naim is not confused about entropy, but he doesn’t appreciate that his vision is unique and his preference for uncertainty over disorder is peculiar.  Most people literally can’t see entropy this way, and probably can’t see entropy at all.  They generally fall into two camps: those who, like Lambert, view entropy as old and settled, and so not very interesting; and those who see entropy as too terrifying for serious consideration.  Both are wrong, but their wrongness seems to stem from an emotional need for stability rather than a failure of pedagogy. Lambert (quoted below) cannot be credited with helping to correct this situation.  At least Ben-Naim is trying.  But the problem is not in the pedagogy.  It is ataxophobia.  Entropy is precisely what people go into science to avoid.  Ben-Naim hopes that an informational definition will help calm the ataxophobia that science students get from physical entropy, but the replacement is anxiety.  The problem is far deeper than misunderstanding.  When confronted with entropy, people feel terror and naturally settle into denial.

To go back to ben-Naim’s argument, defining entropy as disorder does beg the question.  Order, in the thermodynamic sense, is no better understood than entropy.  It is not the static and arbitrary order of a tidy room or neat arrangement.  Repetitive patterns may describe certain instances of low informational entropy, but that misses the point of thermodynamic entropy, which is concerned with momentum. Order, in thermodynamics, means greater freedom in less space. This may correspond with the appearance of order, but it is the freedom that matters, not the pattern or tidiness.

The difference between diamond and graphite illustrates how entropy relates to freedom and space. In a diamond, the carbon nuclei vibrate in three dimensions within the tetrahedral lattice, while each of the four valence electrons can overlap with an electron from one of the four bonding partners. Compare this to graphite, where each carbon atom is bonded to three others in a sheet of hexagonal rings. The fourth valence electron is stuck with its parent, and the nucleus is pinned in two dimensions. Furthermore, there is more distance between the sheets than between the diamond’s tetrahedral corners. What gives a diamond its incredible thermal conductivity is the freedom of its parts to move within the tiny space of its lattice. It is not that the tetrahedron is more orderly than honeycomb sheets in any abstract or platonic sense, but that it provides carbon atoms with tremendous freedom in minimal space.

Order, in the entropic sense, is only a function of freedom to move in space, not the static arrangement of parts. This is why dust settled on the floor has higher entropy than dust floating through the air, even though it looks more orderly. The dust on the floor has lost its freedom to move in three dimensions.

Contrast the freedom of carbon atoms with the constipation of a city in gridlock. It is not the “order” that matters but the momentum it allows. A hurricane can transfer more energy in a tighter package than a nor’easter because its spiral is the perfect shape for packing and flow.

If you add the dispersal of information to the classical dispersal of energy, mass or freedom, then Ben-Naim’s informational interpretation of entropy folds fluidly into the more intuitive explanations of entropy.  “Disorder” stands out as misunderstood, but a thorough exploration of order in terms of freedom in space or informational expectation clarifies that point as well.
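The equivalence between “freedom in space” and Shannon’s measure can be shown in a few lines. A minimal sketch (the eight-cell toy system is a hypothetical illustration, not an example from Ben-Naim’s book): a particle free to roam more of its space has a flatter occupancy distribution, and that flatness is exactly what the Shannon measure counts.

```python
import math

def smi(probs):
    """Shannon measure of information, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A particle confined to 2 of 8 cells, versus one free to occupy all 8:
confined = [0.5, 0.5] + [0.0] * 6
free = [1 / 8] * 8

print(smi(confined))  # 1.0 bit:  little freedom, little uncertainty
print(smi(free))      # 3.0 bits: maximal freedom, maximal uncertainty
```

More accessible space means a higher Shannon measure, so the informational reading and the “dispersal” reading assign the same ranking to the same states.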

In every science book there is a land of the unknown, and an impression that some new discovery might just push open the door to that land.  But in every field, the new discoveries, year after year, just circle around the promised land, occasionally a little bit closer, but never substantially.  This is where the actual entropy starts.  It isn’t science, although there are great scientific models for it.  Boltzmann’s entropy equations are wonderful, but they can’t tell you anything about the actual entropy of your environment.  They are like the score to Beethoven’s 9th symphony.  If you know how to read them, they can give you an impression of what the music might sound like, but they are nothing like the music.

Entropy might seem like a scientific concept, but it’s actually the asymptotic limit of what can be known scientifically.  The entropy of an isolated system increases or stays the same. How unhelpful and unscientific is that?  What’s really crazy is that sometimes entropy makes sense in terms of uncertainty, sometimes in terms of distribution of momentum, and sometimes in terms of distribution in space, for no good reason.
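The unhelpful-sounding law can at least be watched in action. A hedged sketch (an Ehrenfest-style toy model of my own construction, not a claim about real gases): particles hop at random between the two halves of a box, and the Shannon entropy of the left/right occupancy climbs from near zero toward its one-bit maximum, then just sits there, fluctuating.

```python
import math
import random

def occupancy_entropy(counts):
    """Shannon entropy (bits) of an occupancy distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

random.seed(1)
particles = [0] * 100          # all 100 particles start in the left chamber
history = []
for step in range(2000):
    i = random.randrange(len(particles))
    particles[i] ^= 1          # one randomly chosen particle crosses the partition
    left = particles.count(0)
    history.append(occupancy_entropy([left, len(particles) - left]))

# Early on the occupancy entropy is near 0 (almost everything on the left);
# at equilibrium it hovers near the 1-bit maximum (roughly half and half).
print(round(history[0], 3), round(history[-1], 3))
```

Nothing in the update rule prefers spreading out; the entropy rises simply because there are vastly more half-and-half arrangements than lopsided ones, which is the whole content of “increases or stays the same.”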



By Frank L. Lambert
This review is from: Entropy Demystified: The Second Law Reduced to Plain Common Sense (Paperback)

“Fifty years ago, Arieh Ben-Naim, as every student in a physics or chemistry class of that era, was mystified by his introduction to entropy and the second law of thermodynamics. Although he was a professor of chemistry before retiring 15 years ago, Ben-Naim has evidently not kept up with the teaching of those topics in current chemistry texts. Thus, he seems unaware that most general chemistry texts currently published in the US (16) and three in physical chemistry – most available from Amazon.com – now clearly and simply present entropy and the second law.

The connection between spontaneous chemical reactions or physical processes, dispersal of energy, and entropy is integral, tight, and generally accepted. It does not require 200 pages to justify.  Therefore, his 217 pages of “Entropy Demystified” that are necessary to develop his personal viewpoint (an information theory variant, not present in a US undergraduate chemistry textbook) can be clarified by 3-4 pages in any of the chemistry texts listed with their ISBN numbers (for exact Amazon.com identification) in […] at “May 2009”. In fact, a conceptual summary of the second law and entropy for most interested readers can be abstracted from these texts in two sentences: “Energy of all types changes from being localized to become more dispersed, spread out, distributed in space (and abstractly, in more energy quantum states, microstates) if that energy is not constrained.” Then, “entropy change is the quantitative measure of how much more widely distributed the initial energy becomes in a spontaneous process.” Thus, in real processes, energy spreads out spatially. Probabilistic methods are one way of quantifying thermodynamic entropy.

Unfortunately, Professor Ben-Naim’s fundamental error, summarized on page 204 but weakening all previous pages, is his misinterpretation of what occurs in real systems of molecules, especially in the simple isothermal expansion of ideal gases or in their “mixing”/expansion. These cases have misled him to focus on their particular lack of change in the total energy of the system, rather than on what is the fundamental cause of all thermodynamic entropy change: the increased spreading of the initial energy of actual molecules in space when constraints are removed – e.g., their spontaneously moving into a greater volume from a smaller volume (with unchanged energy) in a process such as expansion/mixing. This is what traditional thermodynamic entropy readily measures and, as just stated, can be readily understood.

The disconnect between information and the second law is stressed on page 203 by “a measure of information cannot be used to explain the Second Law of Thermodynamics.” This is true, indeed. The connection between the second law and information is tenuous. Contrast this with the modern view in beginning collegiate chemistry texts, e.g. “whenever a product-favored chemical or physical process occurs, energy becomes more dispersed…This is summarized in the second law of thermodynamics, which states that the total entropy of the universe … is continually increasing.” (Moore, Stanitski, and Jurs; 3rd edition.) A physical chemistry text that is used world-wide states “…the Second Law of thermodynamics, may also be expressed in terms of another state function, the entropy, S. …entropy…is a measure of the energy dispersed in a process…” (Atkins and de Paula, 8th edition.)”