New Research Shows We Have An Evolutionary Safeguard Against False Information


This ancient instinct can help protect you from fake news now and in the future.
Image by Author via Canva

We've all done it, and right now, it's hard to avoid.

You know, you're scrolling happily along on social media when something comes up.

Something so shocking it's almost unbelievable.

Maybe a friend sent it to you or a relative. Or it's a sponsored post.

You read the story, becoming more incensed by the second, and by the time you're done, you're convinced.

Suddenly, everything makes sense.

Or does it?

Because if you'd taken the time to check your sources, you might have noticed that they're not exactly reliable.

This is the world of online misinformation.

If you use social media, you're like a fish swimming among thousands of lures every time you sign on.

But wouldn't it be great if there was a way to protect yourself?

A way to make sure you don't get fooled, especially if you've fallen for something before?

If you've been taken in by false information, don't feel too bad.

Not only does it happen to the best of us, but knowing you've fallen for it might be the key to immunizing yourself from it in the future.

Research published by the Association for Psychological Science has found that learning you've believed misinformation can help you remember the truth.

Yes, that Who song, "Won't Get Fooled Again," is more than just hyperbole because this study shows that once you realize you've fallen for something, it's harder to fool you the next time.

Not only can you become more discerning, but you might even be able to help rehabilitate that friend who's fallen down the social media rabbit hole.

The key is flagging the flaw in their information and dropping a truth bomb because a recent experiment found that when we notice corrections, we remember the truth.
Image by Author via Canva

Here's how it went:

First, people read true and false statements taken from news websites.

Then they read reports that corrected inaccurate information.

Some of the corrections were preceded by notices pointing out that the original statement was false, and some weren't.

Then the researchers asked participants whether they believed the corrected information and whether they could remember both the truth and the original lie.

Turns out, subjects remembered more truth when they understood that they had been given false information in the first place.

So, the experiment showed that highlighting the lie as well as the correction increased recall and belief.

I know this works because it's happened to me.

When a "documentary" about the pandemic was released last year, I came close to being sucked in myself.

A friend who's always seemed reasonably sensible started going on about it.

When I mentioned this to my husband, he did a bunch of research.

Then he took me through the finer points with actual facts and data and steered me back on track.

Because he debunked it with reliable sources, I was able to give my head a shake and move on.

Not only that, I was prepared and looking for it. I'd become more thoughtful and discerning about what I was reading.

It was like a veil was lifted, and suddenly I could see how much misinformation was actually coming at me.

So this bit of research should give hope to those of us who've lost friends and loved ones to this modern cult of political fiction.

Because when people understand they've believed something that isn't true, just showing them how it's happened can raise awareness and instill a bit of immunity.

Though this might not work for someone who's seriously indoctrinated, it can help someone who isn't fully bought in or is only lightly convinced.

If you know someone wading into these dangerous waters, getting them to fact-check, or sending them fact-checked information, might help.

Because our brains are hardwired to remember we've been wrong.

It would make sense to have this type of system evolve for safety.

It's also important to remember which information was wrong and what's actually correct.

It's a matter of survival.

Ten thousand years ago, you might see a plant, think it's edible, and pick a bunch of it for your tribe.

But if the evidence tells you it's poisonous — say a bunch of people get sick and die — that's a big deal.

After that, you'll not only have to make a mental note of the mistake (which plant killed everyone) but also remember the correction. You'll want to remember the plant that looked a lot like the poisonous one but was actually edible.

This is the territory of evolutionary psychology.

It's how we adapt to problems in our environment, with the role of proof being a crucial element in replacing bad information.

Whether you're doom-scrolling propaganda or getting social media shares about a "documentary," realizing there's a lie there and getting a proven correction can have a real impact.

It's the understanding that you believed something that wasn't true and then replaced the information with facts that defends you against it happening again.

So, understanding that you've believed false information can be a game-changer for someone open to proof.
Image by Author via Canva

This can be helpful for anyone who's struggling with what to believe.

If you can convince someone they've been misled, they might not believe false information the next time they see it.

This can be pretty powerful stuff because what someone believes informs how they live.

The better someone's information is, the more informed their choices become.

Someone who doesn't follow safety protocols can endanger themselves and others if they've been led to believe those protocols don't matter.

So, giving them truthful information from a reliable source can change their attitude and actions.

If they understand they've been misled, they'll be more likely to remember to double-check their sources.

"But how could they believe that in the first place?"

Some of the reasons people are so susceptible to believing false information, especially on social media, are:

  • If a group of people seems to know something, we automatically assume there's some kind of wisdom afoot. It's an instinct. Think of following a herd or running out of a burning building. If you see a bunch of people running in one direction when the fire alarm goes off, you might not know where the exit is, but you'll follow the crowd because it seems like someone must know something.
  • When someone abandons their personal beliefs for the beliefs of a group, it's a form of tribalism. That's just a fancy way of describing the urge to fit in. So if most of your friends are posting a particular dogma, you might start to ignore your own beliefs so you can fit in with your group.
  • We can end up believing something that's not true simply because we've been exposed to it over and over. If you see the same lie repeated, it starts to feel like fact because usually only truths are enduring, and we instinctively know this.
  • Sometimes people don't want to fact-check because they're afraid it will contradict their beliefs. If those beliefs protect them from feeling insignificant or vulnerable, admitting they were wrong might feel unsafe or shameful.
  • And of course, we can't forget that some people follow the path that lets them express the worst part of themselves. For them, the doctrine allows them to hurt with impunity and avoid consequences. Unfortunately, it takes an extraordinary kind of epiphany to make someone like that want to change their sources.

These are all things that the people pumping out fake news have probably researched extensively and are counting on.

So if you know someone taken in by misinformation, instead of arguing, maybe try helping them identify the things they're wrongly presuming to be true.

If they're not in a cult-like state of suspended disbelief, this could help protect them from manipulation in the future.

If someone you love has a cult-like devotion to misinformation, take heart.

If they decide to research the truth, it's out there, and realizing the extent of their brainwashing could help protect them in the future.

For the rest of us, all we can do is protect ourselves by double-checking what we're told and what we read on social media.
Image by Author via Canva

You can immunize yourself against fake news by doing your own fact-checking.

The University of California, Berkeley Library offers a helpful set of fact-checking resources.

And did you know that fact-checking is so important there is a whole day devoted to it?

Yes, that's a fact!

You can even fact-check it.

April second is International Fact-Checking Day — an entire day dedicated to truth.

If you want to take your fact-finding up a notch, go to the International Fact-Checking Network's educational website.

You can improve your critical thinking skills with their fake news trivia quiz.

This dedicated site fights fake news with many resources, including area-specific quizzes, articles, and educational resources. It's a treasure trove of facts-first, non-partisan information for anyone who wants to double-check their sources.

So if you know someone taken in by the cult of disinformation, try to open a discussion that can lead to an open-minded analysis of their beliefs.

By replacing misinformation with facts, they'll be able to tap into evolutionary psychology, that instinctual part of themselves that knows that truth is always safer than fiction.

By embracing this ancient part of ourselves, we can learn to navigate the futuristic world of social media in a much safer and savvier way.

Published by

Musician, writer, toddler wrangler. Author of "How To Be Wise AF" guided journal available on Amazon as well as "The Automatic Parent" due out in Feb. 2022.