It turns out that people prefer fake news to real news. Investigative journalism exists because you can get better stories in the field than you can make up in your head. Also, people don’t want the same fake news over and over again. They want real facts that confirm their false assumptions over and over again, although alternative facts can be substituted where real ones fail.
Blue lies are false statements that support the community’s identity or absolve it of responsibility for its failings. Together, they can support a whole ethos of fake civilization. This is not to say that Trumpian America is not real, only that it is not civilized. Its comforts come from the privileges bestowed upon it by civilizations past: by the New Deal, the GI Bill, and Ike’s interstate highway system.
But there is no financially viable way to de-escalate in any given news cycle. If the truth doesn’t fit into a conceptual niche with a predictable outcome, it is very hard to sell. Only insanely curious people prefer reality to belief, and those people are prone to clinical depression, alcoholism and suicide – not a good customer base.
4 reasons why people ignore facts and believe fake news
Dr. Michael Shermer is the author of “Why People Believe Weird Things.” He is the publisher of Skeptic Magazine and the Presidential Fellow at Chapman University, where he teaches Skepticism 101.
The new year has brought us the apparently new phenomena of fake news and alternative facts, in which black is white, up is down, and reality is up for grabs.
The inauguration crowds were the largest ever. No, that was not a “falsehood,” proclaimed Kellyanne Conway as she defended Sean Spicer’s inauguration attendance numbers: “our press secretary…gave alternative facts to that.”
George Orwell, in fact, was the first to identify this problem in his classic “Politics and the English Language” (1946). In the essay, Orwell explained that political language “is designed to make lies sound truthful” and consists largely of “euphemism, question-begging and sheer cloudy vagueness.”
But if fake news and alternative facts are not a new phenomenon, and popular writers like Orwell identified the problem long ago, why do people still believe them? Several factors are at work.
In general, when our brains process information, belief comes quickly and naturally, skepticism is slow and unnatural, and most people have a low tolerance for ambiguity. Research shows that when we process and comprehend a statement, our brain automatically accepts it as true, whereas subsequent skepticism of the statement requires an extra cognitive step, which is a heavier load to lift. It is easier to just believe it and move on.
fMRI research shows that when we understand a statement we get a hit of dopamine in the reward areas of the brain: comprehension is positively rewarded and feels good. By contrast, the brain appears to process false or uncertain statements in regions linked to pain and disgust, especially those involved in judging tastes and odors, giving new meaning to the idea of a claim “passing the taste test” or “passing the smell test.”
Cognitive dissonance is the uncomfortable tension that comes from holding two conflicting thoughts at the same time. It’s easier to dispute the facts than to alter one’s deepest beliefs. Creationists, for example, challenge the evidence for evolution not for scientific reasons but because they fear that if the theory is true they have to give up their religion. Climate deniers don’t dispute the data from tree rings, ice cores, and the rapid increase of greenhouse gases out of scientific curiosity – but because they’re afraid that if it’s true it might mean more restrictive government regulations on business and industry.
Cognitive simplicity and dissonance lead to a peculiar phenomenon in which people double down on their beliefs in the teeth of overwhelming evidence against them. This is called the backfire effect. In a series of experiments at Dartmouth College, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as the existence of WMDs in Iraq.
When subjects were then given a corrective article that WMDs were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite. And more: they reported being even more convinced there were WMDs after the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In the real world, when WMDs were not found, liberals who supported the war declared that they had never supported the war, and conservatives who supported the war insisted there were WMDs.
We are a social primate species and we want to signal to others that we can be trusted as a reliable group member.
This means signaling, by consistently agreeing with our fellow group members—whether that group is our political party or our religious faith—that we will not stray too far from the group’s core beliefs.
Thus, cognitive simplicity and cognitive dissonance may have an evolutionary adaptive purpose, as the social psychologist Carol Tavris outlined it in an email to me:
“When you find any cognitive mechanism that appears to be universal—such as the ease of creating ‘us-them’ dichotomies, ethnocentrism (‘my group is best’), or prejudice—it seems likely that it has an adaptive purpose; in these examples, binding us to our tribe would be the biggest benefit. In the case of cognitive dissonance, the benefit is functional: the ability to reduce dissonance is what lets us sleep at night and maintain our behavior, secure that our beliefs, decisions, and actions are the right ones. The fact that people who cannot reduce dissonance usually suffer mightily (whether over a small but dumb decision or because of serious harm inflicted on others) is itself evidence of how important the ability to reduce it is.”
Ultimately we are all responsible for what we believe and it is incumbent on us to be our own skeptics of fake news and alternative facts. When in doubt, doubt.
Ask “How do you know that’s true?” “What’s the source of that claim?” “Who said it and what is their motivation?” We must always be careful not to deceive ourselves, and we are the easiest people to deceive. As George Orwell wrote in a poignantly titled 1946 essay, “In Front of Your Nose”:
“To see what is in front of one’s nose needs a constant struggle. … The point is we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield.”
By Clare Wilson
Seeing shouldn’t always be believing. We all have blind spots in our vision, but we don’t notice them because our brains fill the gaps with made-up information. Now subtle tests show that we trust this “fake vision” more than the real thing.
If the brain works like this in other ways, it suggests we should be less trusting of the evidence from our senses, says Christoph Teufel of Cardiff University, who wasn’t involved in the study. “Perception is not providing us with a [true] representation of the world,” he says. “It is contaminated by what we already know.”
The blind spot is caused by a patch at the back of each eye where there are no light-sensitive cells, just a gap where neurons exit the eye on their way to the brain.
We normally don’t notice blind spots because our two eyes can fill in for each other. When vision is obscured in one eye, the brain makes up what’s in the missing area by assuming that whatever is in the regions around the spot continues inwards.
Trick of the mind
But do we subconsciously know that this filled-in vision is less trustworthy than real visual information? Benedikt Ehinger of the University of Osnabrück in Germany and his colleagues set out to answer this question by asking 100 people to look at a picture of a circle of vertical stripes, which contained a small patch of horizontal stripes.
The circle was positioned so that with one eye obscured, the patch of horizontal stripes fell within the other eye’s blind spot. As a result, the circle appeared as though there was no patch and the vertical stripes were continuous.
Next to this was another circle of vertical stripes without a patch of horizontal stripes. People were asked to choose which circle seemed most likely to have continuous stripes.
Ehinger’s team were expecting that people would choose the circle without a patch more often. “It would be more logical to choose the one where they can really see all the information,” he says.
In fact, people chose the circle that had a filled-in patch 65 per cent of the time. “We never expected this,” says Ehinger. “The brain trusts its own generated information more than what it sees outside in the world.”
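As a rough sanity check on the reported effect, the two numbers quoted in the article (100 participants, 65 per cent choosing the filled-in circle) are enough to ask how likely such a lopsided result would be if people were simply guessing. The sketch below is not from the study itself, just a one-sided binomial tail computed from those quoted figures:

```python
# If choices were random (50/50), how likely is it that 65 or more of
# 100 people would pick the circle containing the filled-in blind spot?
from math import comb

def binomial_tail(n: int, k: int, p: float = 0.5) -> float:
    """One-sided P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_value = binomial_tail(100, 65)
print(f"P(>= 65 of 100 under chance) = {p_value:.4f}")  # roughly 0.002
```

A tail probability of about two in a thousand suggests the preference for self-generated information is unlikely to be a chance fluke, at least under this simple guessing model.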
This fits in with what we know about cognitive biases, says Ehinger. When people hold strong beliefs, they are likely to ignore any evidence to the contrary.
There is no obvious benefit to our brains ignoring external information, says Ehinger. “I’ve talked to many people about it, and they all say it doesn’t make sense,” he says.
As well as blind spots in vision, there are other ways in which our perceptions are changed by the brain’s expectations, says Teufel. For instance, if a well-known song is converted to MIDI format, which strips out the vocals, people can usually “hear” words that aren’t really there.
Blue lies are a different category altogether, simultaneously selfish and beneficial to others—but only to those who belong to your group. As University of Toronto psychologist Kang Lee explains, blue lies fall in between generous white lies and selfish “black” ones. “You can tell a blue lie against another group,” he says, which makes it simultaneously selfless and self-serving. “For example, you can lie about your team’s cheating in a game, which is antisocial, but helps your team.”
In a 2008 study of seven-, nine-, and 11-year-old children—the first of its kind—Lee and colleagues found that children become more likely to endorse and tell blue lies as they grow older. For example, given an opportunity to lie to an interviewer about rule-breaking in the selection process of a school chess team, many were quite willing to do so, older kids more than younger ones. The children telling this lie didn’t stand to selfishly benefit; they were doing it on behalf of their school. This line of research finds that black lies drive people apart, white lies draw them together, and blue lies pull some people together while driving others away.
Around the world, children grow up hearing stories of heroes who engage in deception and violence on behalf of their in-groups. In Star Wars, for example, Princess Leia lies about the location of the “secret rebel base.” In the Harry Potter novels (spoiler alert!), the entire life of double-agent Severus Snape is a lie, albeit a “blue” one, in the service of something bigger than himself.
That explains why most Americans seem to accept that our intelligence agencies lie in the interests of national security, and we laud our spies as heroes. From this perspective, blue lies are weapons in intergroup conflict. As Swedish philosopher Sissela Bok once said, “Deceit and violence—these are the two forms of deliberate assault on human beings.” Lying and bloodshed are often framed as crimes when committed inside a group—but as virtues in a state of war.
This research—and those stories—highlight a difficult truth about our species: We are intensely social creatures, but we’re prone to divide ourselves into competitive groups, largely for the purpose of allocating resources. People can be prosocial—compassionate, empathic, generous, honest—in their groups, and aggressively antisocial toward out-groups. When we divide people into groups, we open the door to competition, dehumanization, violence—and socially sanctioned deceit.
“People condone lying against enemy nations, and since many people now see those on the other side of American politics as enemies, they may feel that lies, when they recognize them, are appropriate means of warfare,” says George Edwards, a Texas A&M political scientist and one of the country’s leading scholars of the presidency.
If we see Trump’s lies not as failures of character but rather as weapons of war, then we can come to see why his supporters might see him as an effective leader. From this perspective, lying is a feature, not a bug, of Trump’s campaign and presidency.
Research by Alexander George Theodoridis, Arlie Hochschild, Katherine J. Cramer, Maurice Schweitzer, and others has found that this kind of lying seems to thrive in an atmosphere of anger, resentment, and hyper-polarization. Party identification is so strong that criticism of the party feels like a threat to the self, which triggers a host of defensive psychological mechanisms.
For millions and millions of Americans, climate change is a hoax, Hillary Clinton ran a sex ring out of a pizza parlor, and immigrants cause crime. Whether they truly believe those falsehoods or not is debatable—and possibly irrelevant. The research to date suggests that they see those lies as useful weapons in a tribal us-against-them competition that pits the “real America” against those who would destroy it.