You just saw a clip of someone expressing a strong political opinion with complete certainty, yet unable to name the three branches of government, a cornerstone of American democracy. Someone off-camera even questioned why such knowledge would matter for politics at all. Today, there's no shortage of moments like these. Time and time again, we hear strong opinions expressed with great confidence but without much understanding of the subject. You'll see it among MAGA supporters, progressives, religious fundamentalists, conspiracy communities, and everywhere in between. It's confidence over competence, presentation over substance, and emotional appeal over careful reasoning. As a result, we're drawn to overly simplified explanations of reality. We accept claims not because they are necessarily true, but because they align with what we already believe (or want to believe) or because they're delivered with strong rhetoric. For the same reason, we embrace narratives such as conspiracy theories despite the lack of evidence. So, how does confident ignorance arise, and why does it spread so effectively in modern-day culture? Let's find out.

My name is Stefan. This is not an AI voice; I'm a real person. Subscribe to my newsletter on Substack to stay updated on all my content. You can also support my work on Patreon and find my books on Amazon. Thank you, and I hope you'll enjoy this video.

Let's start with a little story. Back in 2017, I decided to plunge into the hype of the moment: the cryptocurrency boom. I was still working at a bank during the day, earning a modest wage, and trying to make it as a freelance writer at night. I wanted to be self-employed. And since my freelance writing career wasn't really going anywhere, I decided to hop on the crypto train, putting my small savings into the market. Of course, I had no idea what I was doing. I bought Bitcoin, Ethereum, and a couple of altcoins, and started watching YouTubers talk about the latest assets and ICOs. I tried to learn how blockchain technology really works. After spending a full day watching videos about it, I still couldn't clearly explain how it worked or why it was supposed to be the greatest thing since sliced bread. But I was pretty sure crypto was it. I had to be. After all, I was heavily invested in it.

During the 2017 boom, I was doing pretty well. And I wasn't the only one. Friends, family members, colleagues: many of them were in the market too. I couldn't stop talking about crypto. I was extremely confident about what I was doing. It felt like I couldn't go wrong. Until the crash in 2018. In reality, I didn't know what I was investing in. I hadn't read any whitepapers. I barely understood blockchain. I hardly knew the difference between investing and speculating. Yet I had been walking around like a crypto expert, right up until the inevitable humiliation.

So, what was going on there? A number of psychological forces were at play. One of them is known as the Dunning-Kruger effect. In 1999, psychologists David Dunning and Justin Kruger found that people with low ability in a given domain often overestimate their competence, largely because they lack the insight required to recognize their incompetence. By contrast, those with real expertise often doubt themselves because they understand how complex the subject is and how much they don't know.
Tom Nichols, author of The Death of Expertise, puts it more bluntly (and I quote): "The Dunning-Kruger Effect, in sum, means that the dumber you are, the more confident you are that you're not actually dumb." Dunning and Kruger more gently label such people as "unskilled" or "incompetent." But that doesn't change their central finding: "Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it." End quote.

In my case, it's funny to look back on the conversations I had about crypto with people who were just as ignorant about it as I was, tossing around terms like HODL and early adoption. "Just buy the dip, bruh." We thought we knew stuff others didn't, but in reality, we were just reinforcing each other's ignorance. Luckily for me, it took only one crash, a wake-up call, to realize I was standing on what's often called "Mount Stupid": peak confidence with minimal knowledge. After tumbling down Mount Stupid, I had to admit that I didn't truly understand what I was investing in. In many ways, I was simply gambling, and, to an extent, I was fine with that.

In the country I'm from, the Netherlands, the Dunning-Kruger effect runs rampant whenever our national soccer team plays in the European Championship or the World Cup. We always say that during these tournaments, our national team has 17 million coaches, or whatever the country's population is at the moment. Why? Because nearly everyone and their neighbor has an opinion about the team's performance, and people are not reluctant to share these opinions, often very confidently. And I'm pretty sure the vast majority of them are not experts on soccer.

Being wrong about soccer as a layperson watching a match on TV is pretty harmless, in my opinion. But there are subjects in which 'confident ignorance' isn't all fun and games. I'm talking about essential topics: the stuff that concerns our health, well-being, safety, and how we treat each other. And that's where Mount Stupid becomes dangerous, especially when people don't come down; that is, when they hold strong convictions about these issues without recognizing the limits of their understanding.

Let's face it: Mount Stupid is a pretty nice place to be. It feels good to believe you're an expert on what you're talking about, to have the world all figured out. You're free of doubt, free of ambiguity. Consider the 'meaning of life,' for example. You may have heard someone say something along the lines of: "Listen… I don't understand these people endlessly discussing the meaning of life, as it's so obvious that we are here on Earth to procreate!" I've heard that one often, by the way. But once we look more closely, it becomes clear that describing a biological function as the "meaning" or "purpose" of life is a bit simplistic. For some, though, the oversimplification is the point. It's easy, it's digestible, and, often, it aligns with how they want to see the world. But when people begin to adopt these overly reductive explanations and premature conclusions about vital topics, such as medicine, climate, different cultures and religions, politics, or the economy, things can get… pretty dark. And they may get even darker if such views are amplified and people refuse to give them up or update them.

Convincing people you're right is often not so much a matter of substance as of confidence. At least, studies in psychology show that people tend to trust confident speakers more than hesitant ones. And so bad information is often perceived as good because of its confident delivery, while good information is often undervalued because of its hesitant delivery, which in turn influences people's decision-making. We can see this phenomenon, also known as the 'confidence heuristic,' in many areas of life.
We see charismatic politicians "fascinate the fools," as Bertrand Russell put it, by presenting misinformation as absolute truth. We see religious leaders convince the masses of fundamentalist ideas that aren't grounded in facts. We see door-to-door salesmen with charming, slick sales pitches. And we see confident employees climbing the corporate ladder while more qualified but less confident ones are overlooked.

Back in my crypto days, the crypto-bro YouTubers I was watching were usually very confident. They talked as if they were experts in their field, especially the ones pulling up charts and throwing around terms like falling wedges, golden crosses, consolidation, RSI, moving averages, and Fibonacci retracements. And I'd listen, thinking they'd really done their homework. So when they'd confidently make a prediction or praise some new altcoin project, they sounded pretty believable. I suspect many people thought the same thing, which often led them to get "rekt."

Confident speakers often present complex issues as simple and clear. This is persuasive because many people are uncomfortable with ambiguity and naturally gravitate toward certainty. Psychologist Arie Kruglanski introduced the concept of the 'need for cognitive closure': the desire for a firm, definitive answer on a topic, as a way to avoid ambiguity and uncertainty. And this is exactly what many confident speakers offer: quick, simple, definitive explanations of complex issues. Another thing confident speakers do is use impressive-sounding jargon that most laypeople barely understand; language that feels authoritative, even when it isn't, simply because it is delivered decisively. In such cases, people listen to incoherent word salad, short on substance but delivered convincingly, and believe they've just heard a wonderful piece of supreme knowledge.

Yet the appeal of confidence is only part of the story. There is another reason why people hold their beliefs so firmly, and it has less to do with how something is said and more with how we wish to see the world. Are Democrats stupid? I'm not going to answer that question, but I think it's a nice starting point for exploring another possible cause of 'confident ignorance'.

At the moment, American politics is a daily spectacle. For some news commentators, it's the only topic they talk about. What strikes me is how differently each news outlet reports on the exact same events. For example, when CNN says that Trump's State of the Union was misleading and full of lies, Fox News says that he slammed the Democrats. And we see these contradictory views all across the board. What's perceived as a disaster by one side is a 'win' for the other. So how can the same reality produce such radically different conclusions?

We can look at ourselves and our convictions honestly and ask: do we want the truth, or rather a truth we prefer? Of course, I claim that I want 'the truth'… but could it be that there's a little bias there, a little tendency to pick a more convenient, preferable version of the truth over another? We like to think we are seekers of objective truth. But to a great extent, we might actually be seekers of confirmation. Truth is messy. It's ambiguous and often uncomfortable. So instead, we lean toward information that aligns with our existing beliefs and resist information that threatens those beliefs. Psychologists call this 'confirmation bias.' Nichols describes it as "the most common—and easily the most irritating—obstacle to productive conversation, and not just between experts and laypeople." People with this bias tend to cherry-pick information that supports their pre-existing beliefs and ignore any challenges to them. Counterarguments usually don't stand a chance; they'll simply discard them and confidently stick to their version of the truth. You'll hear things like, "Let's agree to disagree." Or you'll see them rejecting statistics from a trusted source "because I don't believe in numbers." Pythagoras is rolling in his grave. I've even heard (and read) academic research dismissed as "leftist indoctrination," followed by the suggestion that one should "do their own research." But if academia cannot be trusted, what reliable sources of knowledge remain, and do we even have the skills to recognize them?
Today, we see plenty of people online (a lot of them on YouTube) who aren't experts on what they're talking about… which is fine; I'm no expert either. But things become problematic when such people pose as experts, as authorities on truth, claiming to provide the real facts while opposing those with better credentials and track records, as well as more reliable sources, such as peer-reviewed studies. By appealing to what people want to hear (often exploiting distrust of authority and science) and delivering their message with certainty, they can build massive followings. We also see communities emerge in which the same ideas are repeated again and again, rarely questioned or challenged. I'm talking about echo chambers: mostly online spaces where opposing views are filtered out, often keeping people stuck on Mount Stupid.

My father once said to me (and I paraphrase): "There's a difference between today's dumb people and those of the past. Those of the past were at least aware that they were dumb. Today's dumb people often think they're actually quite smart." At the time, I found the remark harsh and arrogant. I remember laughing it off. But beneath the bluntness, he touched on an interesting idea about human self-awareness.

It's not necessarily bad to be ignorant. There's no shame in not knowing something, or being uneducated, or lacking a certain expertise. Ignorance is human, and we're all ignorant about most things, if we're really honest with ourselves. Following my dad's observation, it could be that acknowledging one's ignorance was more socially acceptable in the past. Today, we live in a culture that rewards confidence and constant expression. We also expect our opinions to be treated as valuable, regardless of expertise, and many think their opinions are just as credible and substantive as those of experts. Speaking up with conviction seems to matter more than being well-informed, just as winning the argument seems more important than speaking the truth. In such an environment, people may feel they cannot afford to say, "I don't know."

I remember my father literally telling me that there's nothing wrong with saying "I don't know." He also felt comfortable just shutting up when people were talking about a subject he knew little about. We haven't spoken in years, for reasons I won't get into here, but he had his moments of insight.

So, after looking into this subject (and yes, I did my own research), I've come to believe that the real danger isn't ignorance itself, but the absence of something called 'metacognition': the ability to recognize the limits of our own understanding. Put bluntly: if we cannot see our own ignorance, we may feel completely confident that what we believe is true, even when it isn't. And when beliefs rooted in delusion turn hateful and divisive, the consequences can be catastrophic. As the saying often attributed to Mark Twain goes: "It ain't what you don't know that gets you in trouble. It's what you know for sure that just ain't so." Thank you for watching.