I've been reading a lot about QAnon and other conspiracy theories. I'm surprised people believe in them. I'd like to think I make my decisions based on facts, but these people don't seem to know what facts are. Why do they believe dumb things?
- Anonymous in Maryland
In the early 90s, when I was in my early 20s, I believed in a government cover-up of alien visitations.
What of it?
I blame it on David Duchovny. He was hot in the X-Files. Pre-sex addiction scandal.
Scratch that. I blame it on Chris Carter, who developed the X-Files. He told compelling stories, and the production of that show had such an eerily attractive air about it. Really catchy. Really entertaining. And though it was a stretch to believe the plot lines of many X-Files episodes (especially the flesh-eating worm-man who lurked in the bowels of port-o-potties), the main storyline, which revolved around a government cover-up of alien visitation, was captivating.
For years before watching the X-Files, I had heard and read about UFO sightings, alien abductions, and government cover-ups in papers, books, and documentaries. I saw video footage taken by average people of strange objects flying around in the air. I heard interviews with military pilots who said they saw spaceships. I saw Close Encounters of the Third Kind.
I was hooked.
The truth is out there.
Yeah, I admit it… fact, farce, and fiction all get muddled together sometimes. When a film like Close Encounters employs plot elements that even mildly resemble science, the human brain is more inclined to find the story realistic. And at that point, you’re just one click away from thinking it’s real.
This isn’t just true of alien abductions; it’s true of every story you hear. Behavioral scientists call this the illusion of validity: it all seems to add up, so it must be true, right? I mean… I’m not dumb. It’s not like I walked out of Independence Day thinking aliens would invade the earth, or that Randy Quaid’s character could really fly a plane.
Then there was that notice I saw in the paper about a UFO abductees survivor group meeting that was open to everyone (as long as they were respectful). So of course I went, because why would I want to die not having that experience?! And the people there surprised me. They weren’t crazy. They were calm, collected, mild-mannered. They were average folks living average lives. Most of the meeting was spent talking about things unrelated to aliens, like social or personal issues, and I found it hard to disagree with their opinions. I was a Rice University graduate, at the time attending law school at the University of Texas. I had to be somewhat qualified to identify an unintelligent conversation. And I didn't find these people to be idiots. They all had similar abduction experiences. They all independently described their alien abductors the same way.
I couldn’t leave that meeting dismissing their stories.
So there I was, with all I needed to get the ball rolling. Throw the following ingredients into a saucepan and stir: a great TV show, a good film, an intriguing book I found at a used bookstore, and real stories told by seemingly normal and intelligent people sitting right there next to me. You’ve now got yourself a delicious recipe for believing that the truth is out there.
Wait… there’s one more important ingredient: the very strong feeling that I didn’t belong on planet earth. That’s a big one. Stay tuned for more on that.
I’m not in my early 20s anymore. I don’t think David Duchovny is hot anymore (sorry Dave), and I don’t believe in alien abductions anymore. Not because I’ve decided the evidence isn’t credible. I’ll explain my shift in faith later, but for now it’s enough to know that I’m in a different head space these days.
Now I spend most of my time exploring the ins and outs of human decision-making, and as a decision-making specialist, I’ve been interviewed recently by reporters wanting insight on why people believe in conspiracy theories like QAnon. In each of those interviews, as the conversations went on, one question seemed more important and much harder to answer than others: “I understand why people believe in conspiracy theories," the reporters would say, "but why do intelligent people believe them?”
No answer I gave was good enough. Well… I’m not 100% certain that my answers weren’t good enough for them. I don’t read minds. But I got a sense that my answers didn’t adequately scratch that itch. So I knew I had some more thinking and work to do.
Why do intelligent people believe wild conspiracy theories? The facts to debunk them are a few keystrokes away. The evidence to prove such theories false is right in front of us; all we have to do is get online and choose to see it. So how could someone who is well-educated, well-read, and with a relatively high IQ fall for such scams?!?
It doesn’t add up, which is frustrating. If intelligence isn’t an antidote… if being educated and well-learned won’t protect you from believing lies… then are we doomed? It’s not like conspiracy theorists these days believe the earth is flat and leave it at that. They’re organizing. They’re moving to action. They’re causing social disorder. It’s scary. So if it’s possible that the power of QAnon is greater than the power of reason, then… well… we’re all pretty fucked.
I don’t know about you, but I don’t want to be fucked.
IQ intelligence isn’t everything.
I was talking with a radio show host last week. He interviewed me on this very subject, and after the taping, we stayed on the line for a while to talk a bit more. He, like many, was perplexed about why intelligent people – people who should know better – buy such outlandish stories about laser beams from outer space causing wildfires, or lizard-people infiltrating the government. “These are people who learned in school how to look for evidence. These are people who should know how to look for facts.” I’m paraphrasing, but that was the gist.
In that moment I wondered… what is intelligence anyway? I asked him that question, and he pointed out that we measure it using the IQ test. Some of us are stronger cognitive performers than others. Though as humans we’re equal, not all of us share the same physical, mental, or emotional attributes. Our attributional differences shouldn’t entitle us to special treatment, but they’re still there.
And certainly, people with high IQs, though they shouldn’t be treated any better than the rest of us, should still be able to discern fact from fantasy.
I’d like to think so. I want to believe so. But what I think should be the case and what I really, really, really want to be the case isn’t always the case. So what’s the deal?
The deal is this: intelligence is a construct (and the radio show host I was talking with agreed). The IQ test didn’t exist until we created it. Some heads got together and decided: this is how we’re going to define intelligence, and this is how we’re going to measure it. It’s not much different from the work I do with my business clients every day: for example, this is how we’re going to define customer loyalty, or brand strength, or the addressable market – and here is how we’re going to measure that.
And it just might be that how we measure IQ isn’t necessarily how we measure the ability to overcome shortcuts and biases in our decision-making. In other words, if it’s the case that intelligence has nothing to do with the ability to be mindful of irrationality, then intelligent people are no more or less immune from believing stories (including conspiracy theories) than the rest of us.
As it happens, work has been done on this subject. In a 2015 article in Scientific American, Keith Stanovich argues that many of us wrongly believe that the IQ test is a complete measure of all aspects of intelligence. In fact, the IQ test fails to measure a key mental attribute: the capacity for rational thinking. Stanovich coined the term dysrationalia to refer to this inadequacy and cites a number of examples of how highly educated individuals (including doctors and legal professionals) make poor decisions.
The truth is out there. And the truth is that even smart people aren't immune from irrationality.
Intelligent speed-thinkers need some speed bumps.
Stanovich points to two causes of dysrationalia. First is the tendency to be a “cognitive miser,” to employ mental shortcuts or heuristics in making decisions, rather than slowing down and thinking things through. We all tend to do this from time to time, and some intelligent people do this too. I mean, imagine deciding what to watch on Netflix by first firing up the laptop, pulling up a spreadsheet, typing out all options down column one, listing all tradeoffs across the columns, and assessing how each option performs on each of the tradeoffs. I’d rather poke my eyes out. It’s better to scan some choices and go with your gut.
But many times, this fast method of decision-making can be dangerous. If you’re a doctor, you don’t want to decide on the right medical treatment for your patient by using your gut. But some doctors do. Some lawyers do. Many politicians do. I’ve worked with C-level executives who tout the value of data-driven decision-making only to shoot from the hip and call audibles. Apparently for some intelligent people, thinking things through isn’t worth the energy, even when the stakes are high.
If you aren’t slowing down when you need to… if you’re not burning that cognitive energy you need to burn to do the right thing… then it doesn’t matter if you graduated from Harvard summa cum laude. You could make decisions that make no sense – including the decision to believe that the Democratic party is really just a child pornography ring.
Evidence doesn’t matter to people who don’t want to take the time.
Smart people still decide with their hearts.
Thinking fast happens in a number of ways, and it’s impossible to cover them all here. But when it comes to conspiracy theories, one particular type of speed-thinking stands out: the use of substitution questions in making decisions.
Many choices we make are complicated. There are a lot of facts, and we may not have access to enough of them. Or, we may have access, but the information is just too much to process. We have kids to manage, jobs to do, lives to live. It's simply not feasible to do the necessary first-hand, rigorous analysis required to evaluate our options well. Can you imagine sifting through all the COVID vaccine research data yourself to determine if the vaccine is in fact safe and effective? I don’t even know how I could do that.
For the sake of sanity, we often replace hard, complicated, and complex questions with easier ones. Rather than evaluate all the pros and cons ourselves, we rely on others to do the evaluating for us – such as experts or public figures. And then instead of asking ourselves the real question (“Should I believe in this conspiracy theory?”), we ask an easier question: "Do I trust this person who's telling me about this conspiracy theory?" If the answer is yes, we're more open to going along with whatever they propose.
But trust involves some deep analysis too - and intelligent people are just as likely to stop short here as well. Rather than evaluate trustworthiness by examining all the evidence (how often the person in question has lied, cheated, or taken advantage of others to get ahead), we substitute the question of trustworthiness with one of like-ability. We don’t do the research. Most of the time we can’t. We just assume that if we like someone, we can trust them.
So if you looked up to Trump back when you were a teenager in the 80s or 90s, and saw him portrayed on TV as a rich, successful real estate tycoon who women wanted to be with and men wanted to be, or if The Art of the Deal changed your life, or if The Apprentice gave you the impression that Trump was the shit, then it doesn’t matter how educated you are. If you like him, you’re more likely to believe him. You’re likely to go along with what he tells you is true, and what those who support him tell you is true. And if he’s touting conspiracies of stolen elections, then why wouldn't you believe? That's exactly what rose-colored glasses make you do.
Are you smart? Have you fallen in love? Have you fallen for lies told by someone you love? Well then you get it.
If you first hear of QAnon from someone you like, chances are higher you’ll believe in it. And if, since that time, more people you get along with, see eye-to-eye with, or like to have beer with also preach the gospel of QAnon, then you’re even more likely to get sucked in. Even when the stories are crazy. Even if you’ve got a high IQ. Because what you wouldn’t likely have in such instances is the tendency to spend mental energy on making the right decisions.
Quick fixes are appealing. Even for smarter folks.
What are the chances?
According to Stanovich, the second way in which intelligent people can act irrationally is due to an insufficient knowledge of probability, logic, and scientific inference. In other words, you can be smart but not have the skills to be rational.
Yes. I'm saying it's possible.
Thinking scientifically and probabilistically requires doing two things. First, pose your decision-related questions as conditional questions rather than in terms of yes/no. For example, a yes/no question would be: “Is the government being infiltrated by lizard creatures that are pulling the strings?” Yes/no questions are more efficient, so we prefer them. They make us think quickly: “Lizard creatures in government?!? How absurd!” But also, “Well that would explain why our government isn’t doing what’s in our best interests!”
A conditional question would be: “Under what conditions might we find lizard creatures in our government?” Asking questions this way doesn’t come naturally, and it’s certainly not automatic, but it makes you slow down a bit. It opens you up to actually imagining what would have to be in place for lizard people to pull off that sort of thing. For example, the rest of the government, including Donald Trump, would have to be really clueless not to notice. Or if they noticed, too apathetic to do anything about it.
And what are the chances of that? This is the second part of thinking scientifically and probabilistically: estimating the likelihood of an event - but in a specific way. If you rely on mental recall to judge the likelihood of future events, you're doing it wrong. You need to think about how the world works, not what's in your memory. Don’t politicians want power? Won’t they respond to threats to that power? Is it really likely that some guy named Bob from Dallas knows all about the lizard people, but the Senate Majority Leader is just letting it slide?
I'm not offering up evidence. Evidence can easily be swept under the ottoman if you want to dismiss it badly enough. I'm not even dismissing the lizard people theory. I'm just saying... what specific things would have to be in place for such a thing to happen, and what are the chances that those specific things are actually in place? We don't need someone to catch Nancy Pelosi with her guard down, and to snap a picture of a lizard tail poking out from behind her skirt. If the chances are low enough, there are better answers to look for.
Interestingly, if you employ scientific and probabilistic thinking, you’ll realize that a lot of conspiracy theories aren’t that kooky, and many of them are actually true. Take for example the long-held suspicion among Iranians that the CIA was behind the 1953 coup to overthrow their Prime Minister and put an end to their democratic government. I heard Iranians say for years that they just knew the CIA was involved. They sounded like crazy conspiracy theorists because it didn’t add up. Why would the U.S. government, a bastion of democracy and a fighter for individual freedoms everywhere, depose a democratic leader and pave the way for a despot to replace him? That’s crazy talk. Except that decades later, the U.S. government declassified documents that in fact proved that conspiracy theory to be true.
Governments do wild shit. If you pay attention to politics, you know. Probabilistic thinking doesn't make us shoo away every story we don't believe. It forces us to slow down and think things through. It can help us discern likelihoods that are high from those that are not.
But unfortunately, intelligence does not predispose you to probabilistic thinking. You have to learn it. It's a skill. But most education programs – from elementary school on upwards – don’t teach these skills enough. Most universities require some courses on logic or probability, but being trained well enough to employ probabilistic thinking in everyday decision-making is not required to graduate from even the most prestigious institutions.
Even when we are trained in thinking rationally, to think like scientists, there’s no guarantee that we’ll apply what we learn to our daily lives. I knew a professor who specialized in rational choice theory but who made choices with his love life that were impulsive and destructive. I was a teaching assistant for a professor who was a statistics genius, who told me I was lazy based on a single data point: even though I delivered above and beyond what was expected in all other aspects of my role, I didn’t promptly return her calls over the weekend. What were the chances that my reluctance to be hassled over the weekend had anything to do with laziness? Very low. But intelligent people are human, and that was her human conclusion.
When things are bad, we need a bad guy.
Uncertainty makes us edgy. Imagine going to sleep each night not knowing whether bombs will be dropped in your neighborhood. Not knowing whether the leader of your country will gas your street or send the goon squad to rape the women in your family. They might. They might not. You don’t know. What would you do? Stay and fight? Flee? If you knew what to expect you could make a choice. It might not be the best circumstance and you might have shitty options, but you could do something. You could have hope. But if you don’t know whether you’ll wake up tomorrow in a peaceful, idyllic neighborhood or in a disaster area, then you could short-circuit.
Sometimes things happen that we don’t like, and we want an answer. We need to have explanations because we want control, and explanations give us a target for our control. Control ensures our safety, our security, and ultimately our contentment. If our favorite candidate doesn’t win the election, we’re upset. If we’re led to believe he should have – either through misinformation or our own cognitive biases – we’re angry. Many believed that Trump was the winner because everyone in their community voted for Trump. This bias blinded them to seeing that their community was not representative of the entire country (an example of probability neglect). But their perception persisted, and what they perceived was hard to swallow. How could such a terrible, corrupt thing happen (real or imagined)? Without an answer, how can we protect ourselves from corruption and injustice?!
This lack of control is paralyzing, and hopelessness can suck the life out of you. No matter how smart you are. It’s understandable why intelligent people attach themselves to nonsensical explanations for why bad things happen. Because when it comes down to it, hope matters more than being accurate. Feeling safe and secure matters more than knowing facts.
And when you’re not trained in probabilistic thinking, no matter how intelligent you are, your answers aren’t going to be well-vetted. In other words, if the desire or need to pin injustice on a culprit is very strong, then you need to be that much better at employing probability, logic, and scientific thinking. You need to employ it like an Olympic athlete.
The more hopeless the situation, the more vulnerable we are to thinking irrationally. And according to Stanovich, thinking rationally is not the same as being intelligent.
Even smart people need hope.
Intelligent people have baggage.
We all have stories we latch onto. The ones we choose are the ones that speak to our pain.
I used to believe that I didn’t belong on this planet. I couldn’t have; I was too odd, too poor of a fit. There had to be an explanation. So when I heard about aliens, I had hope. “Maybe I was born on another planet,” I thought. “Maybe I got left here by mistake.” Crazy story, especially for someone so educated. But I really wanted to believe it; it would have explained so much.
At that time of my life, in my early 20s, I went through many bouts of depression and suicidal ideation. I was having panic attacks on a regular basis. I was acting out trauma in ways I didn’t understand. I looked around and no one was going through what I was going through. People saw me as a freak. No one saw the world the way that I did – as a scary, confusing, painful, and illogical place.
I lost hope more times than I could write about in my journal. I was smart but in survival mode. And when you’re in that frame of mind, who cares about facts? I was trying to make it to the next day, and if fantasies about alien abductions did the trick, then so be it.
Judge me. I don’t care. I'm still here.
You don’t have to be suicidal to need hope; you can just be sad, or confused, or hurt. You don’t have to have experienced childhood trauma to need peace; you can simply feel unloved, unnoticed, and misunderstood. Maybe you feel like you've fallen through the cracks. Maybe people in your life have treated you poorly.
Look around. A lot of people treat others poorly.
We all have our pain, large and small, and it all impacts us in unique ways. Most of us hide our pain from each other, so when intelligent people make irrational choices to manage that pain, it's not obvious. Many of us hide our pain from ourselves. No judgement. I get it. It just means that when we believe things that aren’t true, we may not even realize what we’re doing.
Bottom line is that intelligent people are still whole people. They, like all of us, are making decisions with more than just their brains. A recent study out of Emory University suggests that, among other traits, low self-esteem, social detachment, depression, and anxiety are correlated with conspiratorial thinking. Other research reveals that people with conspiratorial ideations adhere to a conspiratorial ideology or world view. In other words, the specific details of each theory may not be drawing people in as much as the fact that each theory tells the same underlying story, one in which powerful people are collaborating to get what they want. If you follow a syllogism from the assumption that conspiracies are a fact of life, then it's not hard to conclude that conspiratorial stories are true, even if the best logic is employed.
Everyone relies on their deeply-held beliefs to make decisions. Smart, educated people are no different. The President of the United States believes in God. He makes decisions based on that belief. If you swap God for the illuminati, then imagine the choices he'd make.
What might seem unintelligent from the outside may simply be the outcome of a compromise: do we forfeit what’s rational and feel like shit, or do we risk believing a lie just to get by a little easier? Do we dig for facts that challenge our way of life, or do we hang onto the beliefs that make us who we are?
The truth is out there – but it’s elusive.
I don’t believe in alien abductions anymore. But don’t be disappointed; I’ve still got an open mind. Rather than decide “yes” or “no,” I’ve decided to be comfortable in not knowing. I’m open to the evidence. The jury is still out. If I caught wind of another alien abductee survivor's group meeting, I’d go in a heartbeat.
I’m just a lot more rational than I used to be, but only because I’ve worked at it. I’ve dealt with my demons, gotten some hard-core social scientific training, and even applied what I’ve learned about probabilistic and scientific thinking to my personal life in a way that’s helped me recover from trauma more than therapy ever did or could.
Understanding why intelligent people believe in dumb things has been an obsession of mine for a long time. And yet, a specific and satisfying answer still eludes me. Even at the end of writing this, after I've shared with you what I think might explain why smart people believe crazy conspiracy stories, I'm still surprised and perplexed by the irrational choices intelligent people make. It’s in these moments of confusion – in arguments with loved ones or debates with colleagues where I just don’t understand how they can think what they think – that I realize I’m the irrational one. I’m the one struggling with uncertainty. I’m the one who feels uncomfortable not knowing why the people I count on make choices that make no sense. I’m the one who needs an answer, and needs it right away, in order to feel a sense of control and calm. I find myself hunting for answers. I find myself concocting explanatory stories. I find myself believing in things about the people in my life that I have no evidence for, all because I need to know. Because I need to feel secure.
As a result, in an effort to understand why other intelligent people in my life believe dumb things, I end up being the one believing dumb things.
Being human is a rough road. There’s no doubt about it. It’s hard to live in the gray, hard to not know when a threat to our security is lurking around the corner. I guess I’ve decided that it’s ok to be uncomfortable if it means being honest about the fact that I don't know. If it means being real. And I’m lucky I have the scientific tools to do it with less stress than uncertainty used to give me. But still… if I were handed the red pill or the blue pill, I’m not always certain which one I’d take.