This story is cross-posted from You Are Not So Smart.
The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
The last time you got into, or sat on the sidelines of, an argument online with someone who thought they knew all there was to know about health care reform, gun control, gay marriage, climate change, sex education, the drug war, Joss Whedon or whether or not 0.9999 repeated to infinity was equal to one -- how did it go?
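(As an aside, for anyone who has been dragged into that last one: the standard algebraic argument that 0.999 repeating equals one goes like this.)

```latex
\begin{aligned}
x &= 0.999\ldots \\
10x &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots \\
9x &= 9 \\
x &= 1
\end{aligned}
```

It rarely convinces anyone mid-argument, which is rather the point of this article.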
Did you teach the other party a valuable lesson? Did they thank you for edifying them on the intricacies of the issue, cursing their former ignorance and doffing their virtual hat as they parted from the keyboard a better person?
No, probably not. Most online battles follow a similar pattern, each side launching attacks and pulling evidence from deep inside the web to back up their positions until, out of frustration, one party resorts to an all-out ad hominem nuclear strike. If you are lucky, the comment thread will get derailed in time for you to keep your dignity, or a neighboring commenter will help initiate a text-based dogpile on your opponent.
What should be evident from the studies on the backfire effect is that you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making your opponent feel even more sure of their position than before the debate started. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.
Have you ever noticed the peculiar tendency you have to let praise pass through you, but feel crushed by criticism? A thousand positive remarks can slip by unnoticed, but one "you suck" can linger in your head for days. One hypothesis as to why this and the backfire effect happen is that you spend much more time considering information you disagree with than you do information you accept. Information which lines up with what you already believe passes through the mind like a vapor, but when you come across something which threatens your beliefs, something which conflicts with your preconceived notions of how the world works, you seize up and take notice. Some psychologists speculate there is an evolutionary explanation. Your ancestors paid more attention and spent more time thinking about negative stimuli than positive because bad things required a response. Those who failed to address negative stimuli failed to keep breathing.
In 1992, Peter Ditto and David Lopez conducted a study in which subjects dipped little strips of paper into cups filled with saliva. The paper wasn't special, but the psychologists told half the subjects the strips would turn green if they had a terrible pancreatic disorder, and told the other half the strips would turn green if they were free and clear. For both groups, they said the reaction would take about 20 seconds. The people who were told a green strip meant they were safe tended to wait much longer to see the results, far past the time they were told it would take. When the strip didn't change color, 52 percent retested themselves. The other group, the ones for whom a green strip would be very bad news, tended to wait the 20 seconds and move on. Only 18 percent retested.
When you read a negative comment, when someone shits on what you love, when your beliefs are challenged, you pore over the data, picking it apart, searching for weakness. The cognitive dissonance locks up the gears of your mind until you deal with it. In the process you form more neural connections, build new memories and put out effort -- once you finally move on, your original convictions are stronger than ever.
When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn't misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn't, we subtly tip the scales in our favor.
- Psychologist Dan Gilbert in The New York Times
The backfire effect is constantly shaping your beliefs and memory, keeping you consistently leaning one way or the other through a process psychologists call biased assimilation. Decades of research into a variety of cognitive biases show you tend to see the world through thick, horn-rimmed glasses forged of belief and smudged with attitudes and ideologies. When scientists had people watch Bob Dole debate Bill Clinton in 1996, they found that people who supported a candidate before the debate tended to believe he had won it. In 2000, when psychologists studied Clinton lovers and haters throughout the Lewinsky scandal, they found Clinton lovers tended to see Lewinsky as an untrustworthy homewrecker and found it difficult to believe Clinton lied under oath. The haters, of course, felt quite the opposite. Flash forward to 2011, and you have Fox News and MSNBC battling for cable journalism territory, both promising a viewpoint which will never challenge the beliefs of a certain portion of the audience. Biased assimilation guaranteed.
Biased assimilation doesn't only happen in the presence of current events. Michael Hulsizer of Webster University, Geoffrey Munro at Towson, Angela Fagerlin at the University of Michigan, and Stuart Taylor at Kent State conducted a study in 2004 in which they asked liberals and conservatives to opine on the 1970 shootings at Kent State, where National Guard soldiers fired on Vietnam War demonstrators, killing four and injuring nine.
As with any historical event, the details of what happened at Kent State began to blur within hours. In the years since, books and articles and documentaries and songs have plotted a dense map of causes and motivations, conclusions and suppositions with points of interest in every quadrant. In the weeks immediately after the shooting, psychologists surveyed the students at Kent State who witnessed the event and found that 6 percent of the liberals and 45 percent of the conservatives thought the National Guard was provoked. Twenty-five years later, they asked current students what they thought. In 1995, 62 percent of liberals said the soldiers committed murder, but only 37 percent of conservatives agreed. Five years later, they asked the students again and found conservatives were still more likely to believe the protesters overran the National Guard while liberals were more likely to see the soldiers as the aggressors. What is astonishing is that the beliefs were stronger the more the participants said they knew about the event. The bias for the National Guard or the protesters was stronger the more knowledgeable the subject. The people who only had a basic understanding experienced a weak backfire effect when considering the evidence. The backfire effect pushed those who had put more thought into the matter farther from the gray areas.