About 10 years ago when I was employed, my manager assigned our team a project and asked us to distribute the responsibilities among ourselves. I got tasked with watching a video and capturing its highlights.
I put the task on the back burner and spent days scrolling Twitter instead (it was new back then). On the final night before the deadline, I skimmed through the video and made a few quick notes. I knew I had not done a good job, but hoped I would get away with it.
What happened the next day caught me off-guard.
We met in the conference room to present our work to the project leader, who was also our teammate. When I showed mine, he highlighted a few points I hadn’t captured and said I hadn’t watched the video. He was right on both counts, but I took it as an accusation, probably because he flung it at me in front of the whole team.
I countered by saying I had watched the video twice, and that the points he mentioned weren’t in the video. Immediately, he played that part of the video to call out my lie. And I accused him of targeting me. Tempers flared, things got ugly, and the meeting had to be postponed.
For the next hour, I told anyone within earshot how I was being victimized instead of being appreciated for my effort. By lunchtime, I genuinely believed I had watched the video twice and worked on it for three days.
In fact, I lugged that lie around as a truth for a long time.
One day about six years later, the memory returned while I was meditating. Instead of dismissing it, I chose to observe it. That’s how I meditate — I let thoughts rise, fall, and flow like waves. Sitting with thoughts and memories helps me gain clarity on them.
Staying with the unpleasant memory made me see it for what it was for the first time. I remembered everything. I hadn’t watched the video, I had lied about watching it twice, and shockingly, I believed the lie for years.
In hindsight, it shouldn’t have been shocking. Though I didn’t know it at the time, I had fallen prey to a cognitive bias called the backfire effect.
What is the Backfire Effect?
Author and journalist David McRaney presented a simple explanation for the cognitive bias:
The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.
The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
For instance, I believed I was a dedicated executive. But the project leader’s accusation, however accurate, challenged that belief. The added fear of being seen as a screw-up in front of my teammates was enough to activate my fight-or-flight response.
And the more evidence he presented that challenged my beliefs, the deeper the untruth that I had watched the video twice embedded itself in me.
In his book You Are Now Less Dumb, McRaney wrote:
Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead. Over time, the backfire effect makes you less skeptical of those things that allow you to continue seeing your beliefs and attitudes as true and proper.
This effect is more common in our daily lives than we know.
If someone who prides himself on being a great human comes across evidence that challenges his belief, he may not just reject it; he could use it to reinforce his existing beliefs. The same occurs with investors who get emotionally attached to a stock or people who view their toxic partners through rose-tinted glasses.
In fact, this phenomenon is just all in a day’s work on the internet. Climate change, vaccination, politics — the more people come across facts that challenge their beliefs, the more they cling to their misconceptions.
How to Overcome the Backfire Effect
Facts don’t change people’s minds. In fact, they worsen situations because people see attitude-inconsistent information as an attack on their identity. As a result, they either recoil into their shell or fight back with everything they have.
Here are better ways to help them overcome this bias:
1. Appeal to their identity
I wonder what would happen if my teammate had said, “If you’re saying the point is not in the video, then it probably isn’t. You’re not a liar,” rather than proving that I’d done a slipshod job.
Instead of a conflict between him and me, his words would’ve created a conflict between me and myself. I would’ve lied about doing good work AND about not being a liar. I probably would’ve felt like crap, confessed to him after the meeting, and never repeated the mistake.
Appeal to people’s self-esteem and allow them to save face when they walk away. You might lose the battle, but you stand a better chance to win the war.
2. Paint a bright picture
I also wonder how things would’ve turned out if he had shared his thoughts in a positive way. What if he had said, “This is well done. Just a couple of points are missing. Could you have a quick look at it again?” Chances are I would’ve gladly complied instead of getting into an ego war.
Calling people out only makes them cling tighter to their beliefs. It’s better to encourage them and then ask for a few rough edges to be smoothed out. This technique is simple, but it’s not easy. I succumb to my ego and fail to apply this lesson far more than I would like. But I try to get a wee bit better each day.
3. Don’t fool yourself
For six years, I lived with the feeling that I had been wronged when the truth was that I was wrong. Figuring that out should’ve taken me less than 24 hours.
It’s essential to admit that we’re not immune to the backfire effect. We’re good at spotting cognitive biases in others but poor at spotting them in ourselves, as Tali Sharot wrote in The Optimism Bias. The irony is that this itself becomes a bias: we believe we’re immune to cognitive biases.
You’re just as prone to the backfire effect as anyone else. Train yourself to be objective while looking at evidence that challenges your views and readjust your perspectives accordingly. As Daniel Dennett wrote,
“The chief trick to making good mistakes is to not hide them — especially not from yourself.”
Making mistakes indicates that you’re human. Acknowledging and working on them indicates that you’re learning.
The backfire effect makes us cling tighter to our deepest convictions when we encounter evidence that challenges them. This cognitive bias makes us stand in our own way as we seek growth and the truth.
Overcoming it is no mean feat, and we’re never really done. Because like all cognitive biases, it shows up over and over again and demands that we stay on our toes.
Being aware of this effect is the first step in the right direction. It makes you flexible, lets you remove the shame from saying “I don’t know,” and broadens your horizons.
It’s impossible to be smart all the time. A better approach is to be less foolish, which in part means allowing yourself the luxury of accepting that you were wrong and changing your mind.
Such a mindset is incredibly liberating. And a liberated life is a happy one.