by Marsha Rakestraw
It seems like common sense that when we learn that something we believed is false, when new facts contradict what we thought we knew, we'll change our beliefs to fit the new information. Right? After all, we're thinking, logical beings.
But numerous studies show that's not the case, at least when it comes to our deeply held beliefs. When new information contradicts beliefs we've long held as important, we're much more likely to cling to those beliefs. Or, as one article notes, "If information doesn't square with someone's prior beliefs, he discards the beliefs if they're weak and discards the information if the beliefs are strong."
Another journalist summarized our counter-intuitive reaction this way:
"Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we're right, and even less likely to listen to any new information."
This phenomenon is popularly known as the “backfire effect.”
Why does this happen? Several reasons, including:
- We don’t like to be wrong.
- We like consistency.
- We apply motivated reasoning, which means our emotions often “decide” before our brains do, and confirmation bias, which means we give greater weight to information that supports our current beliefs.
- Once we internalize "facts," it's very difficult for us to change our views about them.
- We’re really good at justifying.
The default strategy for inspiring change has often rested on the assumption that once people know the facts, they'll want to change their behavior. However, studies indicate otherwise. As one article notes, "In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever."
What do researchers say we can do?
In an article about the backfire effect by Chris Mooney, researchers note that “If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction. … In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.”
In another article, one researcher showed that people who felt good about themselves were more likely to listen to new information.
Research also indicates that opinion leaders “can make it possible for messages to spread broadly.”
And psychologist Stephan Lewandowsky notes, “What our research shows is that if people are aware of the possibility that they might be misled ahead of time, then they’re much better at recognizing corrections later on.”
Lewandowsky and his colleague John Cook have written a free “Debunking Handbook” (PDF), which highlights several backfire effect types and offers tips for successful debunking of myths and misinformation.
As humane educators and changemakers, it’s important that we learn to identify when a backfire effect may be operating in our own lives and to find ways not to trigger it in those we’re seeking to educate and empower.
One key takeaway: When engaging in conversations with people, teaching courses, writing articles, and educating in other ways, it's essential that we find common ground with others' values and viewpoints. If we can show how what we're talking about connects to their need for safety or for saving money, supports their values of fairness or compassion, or upholds their religious beliefs or their desire not to contribute to violence or suffering, then we can interweave the information we want to share and build bridges to a better understanding of how we all can contribute to creating a humane world for everyone.
- “The Backfire Effect” by David McRaney. You Are Not So Smart. 10 June 2010.
- "How Facts Backfire" by Joe Keohane. Boston Globe. 11 July 2010.
- “How to Debunk False Beliefs Without Having It Backfire” by Susannah Locke. Vox. 22 December 2014.
- “I Don’t Want to Be Right” by Maria Konnikova. The New Yorker. 16 May 2014.
- “The Science of Why We Don’t Believe Science” by Chris Mooney. Mother Jones. May/June 2011.