This is a post on what is called “The Backfire Effect.” Essentially, it holds that:
Confronting a belief with facts to the contrary
only strengthens the initial belief.
I was discouraged when I sat down to write this post. How could it be, I wondered, that so many people were clinging so tenaciously to a candidate so obviously (to me) unqualified?
The more I learned of this Backfire Effect, the more depressed I became.
I needed to start with a little fun.
Remember "Blame It on the Bossa Nova"? Eydie Gormé made it a hit in 1963, and here she is singing it:
Hold that tune in mind. "Blame It on the Oxytocin," with its magic spell, I'll explain a bit later.
The world is filled with people who believe opposite things, who hold opposing points of view, and behave in diametrically opposite ways. Usually, I find that just makes my world more interesting.
Diversity, in my experience, brings a zest, a spice to an otherwise rather vanilla existence. AND, as I try to make clear in my memoir, it helps me understand my own positions, my own culture, my own decisions, more fully.
Have you ever divided the world into various dichotomies? I did. Those with sailboats vs. those with motorboats was my first. Then there were the PC users vs. the Mac users. Coffee vs. tea. Morning vs. night people. Fun stuff.
But when we get into differences in politics, science, history, religion, health claims, and world views, things get serious. Conversations don’t help and are generally fraught with dissension. Why is that?
Abortion. Affirmative Action. Capital Punishment. Climate Change. Donald Trump. Education. Euthanasia. Evolution. Guns. Health Care. Hillary Clinton. Immigration. Marijuana. Vaccinations. War. Welfare.
Can you think of the one thing they all have in common?
Each of these issues has a community of believers, supporters, defenders. Some are better organized than others, but a community. A tribe.
And it is that pull to stay with our tribe that gets reinforced over and over again. Let’s start at the beginning.
Back in 1954, a small apocalyptic group called The Seekers, believing that they would be rescued from the coming apocalypse by a flying saucer sent from the Planet Clarion, gathered to await the departure that never came. They had left behind their jobs, their families, and their homes.
When their anticipated salvation didn't happen, rather than admit their mistake and return to "normal" life, they created an explanation for the disconnect. They had "spread so much light" with their obvious dedication that God had decided to spare earth from the coming catastrophe. Therefore, not only had they saved all of earth; there was no longer any need to rescue them via direct UFO express.
Over the ensuing years, their belief became stronger than ever. And they became more public about their beliefs.
This was fascinating stuff to me as a first-year sociology major reading Leon Festinger's 1956 work, When Prophecy Fails. I still have my copy and had a bit of fun over the weekend perusing it. Festinger was interested then in what I'm interested in now: how is it that people cling to a belief in spite of evidence to the contrary?
We see what we expect to see, hear what we want to hear, and believe what reinforces our already held ideology.
You know, immediately, where you stand on each of these issues I listed above. You might feel more committed to some than others, but I’ll bet that you have a firm opinion on each of them. Great.
Now, imagine you are faced with overwhelming evidence that shows your position is wrong. Just imagine. Whatever your initial position, you are shown evidence that you are wrong.
Sit with it. Feel your anterior cingulate cortex doing what it’s programmed to do.
We're hard-wired to hang onto our beliefs.
And the wiring is in our anterior cingulate cortex.
Actually, Charles Bukowski's observation notwithstanding, the more intelligent you are, the better you are at rationalizing your position in the light of conflicting evidence.
Confronting a belief with facts to the contrary
only strengthens the initial belief.
Let me explain.
Homophily is the “birds of a feather” force that binds us to our tribe and signals who is the outsider. Safely ensconced within our tribe, we feel safe, secure.
Oxytocin flows when we belong (they don't call it the trust hormone for nothing). It explains that warm, fuzzy feeling we get when we feel connected to others. We are social creatures at heart. And we like to feel good. We love to dance. Now you can hum a few bars of "Blame It on the Oxytocin."
When we are shown evidence that we may be wrong, that we’ve taken the wrong path, or perhaps we’re not dancing to the right beat, our anterior cingulate cortex, which governs how we perceive pain, kicks in.
Cognitive dissonance hurts! Embarrassment, shame, humiliation! We want it to go away. And we are uncannily creative in coming up with explanations to put the new information in its place. To make us feel safe again.
Our survival instinct is, I believe, the strongest instinct we have and it kicks in at the most unexpected times, in its Neanderthal form. To the extent that our beliefs help define who we are, they are tied directly to our self-esteem, our self-confidence. We are emotionally invested in them.
And, when our beliefs are threatened, when evidence appears that challenges a deeply held belief, one which helps us identify who we are and where we belong, we react. And we react from that part of our brain that deals with emotions and feelings, not the part that deals with reason.
It’s just how we are wired.
As a result, to say "I don't know" can be scary.
It can be lonely.
It’s not always about being right. It’s about how fixed we are in a given belief, how closed-minded we are in allowing the possibility of another point of view. It’s about whether we take on the mantle of the True Believer. It’s the absoluteness, the unquestionability of our belief that creates the distress.
“What about the ‘courage of one’s convictions’?” you fairly ask.
To hold fast to your belief, without having to prove the other wrong — and all the while acknowledging the possibility that you could be wrong yourself — might that be one definition of maturity?
I think back to Carl Sagan, who died far too soon. His lovely book The Demon-Haunted World has a chapter that lays out his "Baloney Detection Kit."
And Maria Popova has made it easy to find Sagan's list through her weekly blog, Brain Pickings. To whet your appetite, here's a quote from Sagan:
Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
References used to produce this post:
David Gal and Derek Rucker, 2010. "When in Doubt, Shout! Paradoxical Influences of Doubt on Proselytizing."
Brendan Nyhan and Jason Reifler, 2010. “When Corrections Fail: The persistence of political misperceptions.” Political Behavior 32(2): 303-330
Jason Reifler and Brendan Nyhan, 2011. “Opening the Political Mind? The effects of self-affirmation and graphical information on factual misperceptions.”
James Kuklinski, 2000. “Misinformation and the Currency of Democratic Citizenship.”
Chris Mooney, 2014. Mother Jones. “Here are 5 Infuriating Examples of Facts Making People Dumber.”
What can be done?
The experts aren't sure. They've been more focused on the what, why, where, and how than on what's to be done. Or, to put it less politically, on how these effects can be minimized. Two points, however, come up often:
- We tend to believe things are true when they are repeated often (and from multiple sources). Ad execs have certainly taken this one to heart.
- It is nearly impossible to use logic and reason to dissuade someone from a belief that he or she did not arrive at through logic and reason. An appeal to emotion works better.
“It’s never been easier for people to be wrong, and at the same time feel more certain that they are right.” So wrote Joe Keohane in a 2010 article in The Boston Globe, “How Facts Backfire. Researchers discover a surprising threat to democracy: our brains.” And six years later, I fear that’s even truer. But at least now we might have a way to move forward.
I’m not talking only of the True Believers who will vote for “he who shall not be named,” and the Republican leadership who have chosen party over country. I’m thinking also of the “Bernie or Bust” folks, the idealists and young zealots who couch the debate by vilifying the opposition.
Unless we give them a tribe to come home to, I fear we may yet get the apocalypse that never came in 1954. And this time, there'll be no UFO to carry us away.