Recently, I came across a quote from Charlie Munger that has been stuck in my head:
“I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”
At first glance, it seems like common sense. But the more I sit with it, the more I realise how rare, and radical, it actually is.
We live in a time when belief, not evidence or nuance or data, often seems to be the primary currency of truth. As long as something aligns with what we already think, it feels “right.” The problem is, that kind of thinking doesn’t leave much room for understanding. It turns political identity into a kind of faith system: I believe, therefore it is true. You don’t, therefore you are wrong, or worse, dangerous.
What Munger’s talking about is a kind of intellectual discipline, or maybe just humility. It’s the idea that before you criticise someone’s viewpoint, you should be able to explain it fairly and fully, in a way they would recognise. It’s not about agreeing with them. It’s about giving their position the respect of genuine attention.
Why This Matters in How We Argue
I think about this every time I scroll through political debates online. So many arguments are just people yelling past each other, beating up a cartoon version of the other side. We call this strawmanning, the act of misrepresenting someone’s position so it’s easier to attack. Instead of engaging with what someone actually believes, we invent a flimsy imitation and tear that down.
Closely related, and just as common, is the misuse of a rhetorical tool called reductio ad absurdum. In its proper form, it’s a logical technique used to disprove a point by showing that it leads to an absurd or contradictory outcome (Britannica). But in everyday argument, it’s often weaponised: someone exaggerates their opponent’s view to an extreme or ridiculous scenario, and then critiques that instead.
For example:
“You want to regulate car emissions? What’s next — banning all cars and making us walk to work?”
At that point, the original argument isn’t even in the room anymore. It’s not dialogue, it’s mockery dressed up as logic.
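For contrast, it helps to see the technique doing honest work. The classic proof that the square root of 2 is irrational is a textbook reductio: assume the opposite of the claim, follow the logic until it breaks, and reject the assumption. A minimal sketch:

```latex
% Reductio ad absurdum, proper form: assume the negation of the claim,
% derive a contradiction, and conclude the claim itself must hold.
\textbf{Claim.} $\sqrt{2}$ is irrational.

\textbf{Proof.} Suppose, for contradiction, that $\sqrt{2} = p/q$ for
integers $p, q$ with no common factor. Squaring both sides gives
$p^2 = 2q^2$, so $p^2$ is even, which forces $p$ to be even; write $p = 2k$.
Substituting, $4k^2 = 2q^2$, so $q^2 = 2k^2$ and $q$ is even as well.
Now $p$ and $q$ share the factor $2$, contradicting our assumption.
Hence no such fraction exists, and $\sqrt{2}$ is irrational.
```

Notice the difference from the car-emissions line above: here the absurd conclusion is derived from the position itself, step by step, rather than bolted on from outside. That derivation is exactly what the everyday, weaponised version skips.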
What we need more of is the opposite: steelmanning, engaging with the strongest, most thoughtful version of someone else’s view, even if (especially if) we disagree with it.
The Reverence and Risk of Belief
Belief and faith are often seen as admirable, especially in religious traditions. They signal conviction, loyalty, identity. But that admiration can come at a cost when belief is decoupled from evidence.
Philosopher W.K. Clifford made this point forcefully:
“It is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence.”
(Clifford’s Principle)
Clifford argued that belief isn’t just a private matter; it shapes our actions and our society. When we build entire ideologies, policies, or communities around ideas we haven’t truly tested, we open the door to harm.
And yet, the opposite view exists too. William James argued that sometimes belief without evidence is necessary, like believing in the possibility of love before it’s proven, or the success of a risky venture. He called this “The Will to Believe” (source).
Both positions reflect the complexity of belief. But in public discourse, when beliefs calcify into identity, when “I believe” becomes a substitute for “I know”, the danger increases.
This Isn’t Just Politics
It’s tempting to think this only applies to politics: Labour vs Tory, Democrat vs Republican, Leave vs Remain. But the same dynamic shows up everywhere:
- iOS vs Android
- Religion vs Atheism
- Meat eaters vs Vegans
- Pro-vaccine vs Vaccine-sceptic
- Climate change believers vs deniers
- Free speech absolutists vs content moderation advocates
- Science vs spirituality
- Parenting styles: Gentle vs Authoritative
These debates often escalate not because the issues are unsolvable, but because each side reduces the other to caricature. We don’t engage, we retreat into tribes.
But what if, instead of digging in, we paused and asked: Could I explain their point of view, fairly, generously, clearly? That’s the kind of thinking Munger was getting at.
Why We Struggle: The Psychology of Stubborn Belief
Why is this so hard?
Because we’re not nearly as rational as we think we are. Multiple psychological effects explain why we hold tight to beliefs, even when they’re wrong.
- Motivated reasoning: We unconsciously distort facts to support what we already believe. As psychologist Peter Ditto puts it, this is “the emotional tail wagging the rational dog.” (Maine Public)
- Confirmation bias: We selectively seek and favour information that confirms our views and avoid what challenges them.
- Belief perseverance: Even after being shown that a belief is wrong, we often continue to cling to it. The original belief becomes anchored in our sense of identity.
And at a deeper level, Moral Foundations Theory, developed by Jonathan Haidt, shows that people weight moral values like fairness, loyalty, authority, or sanctity very differently, which makes certain arguments persuasive to one group and meaningless to another. (moralfoundations.org)
A Simple Challenge
So what do we do with all this?
Here’s a challenge I’m trying to live by, and one I think Charlie Munger would appreciate:
- Pause before jumping into disagreement.
- Steelman the opposing view: articulate it as clearly and charitably as you can.
- Ask yourself: “Would they agree with my summary of their position?”
- Only then — if you still feel the need — share your response.
Because understanding doesn’t weaken your argument. It strengthens your humanity.
And maybe, if more of us did this, we’d stop yelling across divides and start building bridges across them.