September 30, 2021
I’ve been thinking a lot about beliefs lately.
It started about six weeks ago when someone made a comment on one of my Instagram posts, a video clip about my latest guide, How to Change a Misinformed Mind. The guide offers how-to tips for trying to get someone you know, love, or care about to stop believing misinformation.
Here’s how the entire string went, from start to finish, in response to my video:
river.r_: "I am holding on firmly to my beliefs so trying to change my mind is basically you setting yourself up for discouragement and disappointment. I will not change my mind further down the line."
nikakabiri: "I appreciate the constructive feedback!"
This was distressing. First of all, I clearly made no attempt to change this person's mind, so they weren't really reacting to me. Are they going through life making decisions based on a skewed sense of reality? That's no good.
But also, comments like these are a reminder of how resistant some of us are to new information, and how we justify our resistance with a commonly shared understanding that it's a good thing to be a person of conviction.
But is it?
It’s been weighing on my mind, this idea of “beliefs.” What is a belief anyway? What do I believe in? What do you believe in?
On the one hand, beliefs ground us. They orient us to the world. They make decision-making easier (“Decide according to your values.”). According to my friend Jamie McIntyre (the former CNN correspondent, not the cheesy Australian “success coach”), “It’s good to have an open mind, but if your mind is too open, your brain can fall out.”
But beliefs can also hurt us. It’s not just that many of us are hearing and sharing misinformation. We are believing it. Completely. And in many cases, all the way to the grave.
Which raises the question: How do you know when your beliefs are helpful and when they are harmful?
As a social scientist, I approach the world with a healthy dose of doubt. I tend not to believe anything, because once I do, I close myself off to further investigation, which is necessary for greater understanding. Plus, I’ve been sure about things in the past and then proven wrong often enough to be wary.
But doubt can be dangerous; the brain reacts to doubt by craving immediate certainty, which is why we jump to conclusions, whether those conclusions are accurate or not.
What brings me peace about it all is the discipline of statistics.
I know… what a boring hero, but hear me out.
Instead of thinking, “I know this to be true,” what if we put a likelihood to it? What if we instead asked, “What are the chances?” And, instead of assuming that likelihood is never-changing, what if we considered that it can shift in light of new information? I mean, it's an estimate anyway, and estimates aren't set.
For example, instead of "Should I quit my job?" or "Should I quit my relationship?" maybe we could ask, "What are the chances I'd be happier if I quit? Thirty percent? Fifty? Eighty?"
And also, "At what point would quitting be worth it?"
What if you get an amazing job offer, or find out your partner has been cheating? Wouldn't this new information change your estimate? And wouldn't that impact your decision to leave?
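If you like seeing the arithmetic, the update-your-estimate idea can be sketched as a toy Bayes' rule calculation. The 30% starting estimate and the likelihood numbers below are invented purely for illustration; the point is only that new information moves the number, not that these particular numbers are right.

```python
# A toy Bayesian update for the "would I be happier if I quit?" example.
# All numbers here are made up for illustration.

def update(prior, likelihood_if_true, likelihood_if_false):
    """Return the revised probability after one piece of new information."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# Start: I estimate a 30% chance I'd be happier if I quit.
chance_happier = 0.30

# New information: an amazing job offer arrives. Suppose (hypothetically)
# such offers are four times as likely in the scenario where quitting
# would make me happier than in the scenario where it wouldn't.
chance_happier = update(chance_happier, likelihood_if_true=0.8,
                        likelihood_if_false=0.2)

print(f"Updated estimate: {chance_happier:.0%}")  # roughly 63%
```

Nothing about the decision itself changed; only the information did, and the estimate moved with it.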
A belief would occur when you estimate a one-hundred percent chance of something being true. So, to believe something, you better be able to back it up 100%, and you better have evidence that this will never change, even with new information.
Given that we can’t really predict the future, or know it all, we can’t bet on 100% as often as we think we can. But too many of us do. Seeing the world in absolutes – where the chances of things are either 0% or 100% – leads to poorly informed decisions, which explains a lot of how we end up in the messes we do.
This approach, using probabilistic thinking, is just a suggestion. It works for me, anyway. I figure there's a 50% chance that it would work for you too. And a 90% chance that it’s worth a try.
That is all.