[sei]

[the genius filter]

How To Catch Your Own Bad Reasoning

We like to think we're rational.

We gather facts, weigh evidence, and reach sound conclusions. But that story misses something fundamental about how our minds actually work: We don't reason to find truth. We reason to persuade others and to defend the positions we've already taken.

Hugo Mercier, a cognitive scientist at the French National Centre for Scientific Research, spent over a decade studying why humans reason the way we do. His conclusion challenges our conventional assumptions about logic and belief: the point of reason isn’t for solitary truth-seeking, but for social interaction. It's a tool built for arguing, justifying, and convincing.

This explains why smart people hold onto bad ideas, and why we're better at spotting flaws in other people's arguments than our own. It also explains why debates rarely change minds.

It’s not that we’re broken thinkers. We're using reason exactly as it was designed—just not in the way we assumed.

This issue delves into how reason actually functions, and how an awareness of that changes how you approach decisions, disagreements, and your own blind spots.

[the spark]

The Argumentative Theory of Reasoning

Reasoning was not designed to pursue the truth. Reasoning was designed to help us win arguments.
- Hugo Mercier

Mercier's research reveals something counterintuitive about how we think. Our reason doesn’t work as a truth-finding mechanism, but as a social tool for persuasion and justification. His studies show that we're remarkably skilled at poking holes in other people's logic while remaining blind to the weaknesses in our own.

He calls this tendency "myside bias." In experiments, participants produced far more reasons supporting their own views than opposing them, even when incentivized to be balanced. The asymmetry is consistent and measurable. We're not lying to ourselves. We're doing exactly what our cognitive machinery was built for. Advocacy.

The problem surfaces when we reason alone, unchecked by opposing views. That's when confirmation bias runs wild. But in group settings where ideas get challenged, a back-and-forth dialogue produces better outcomes. Each person filters the other's weak arguments and amplifies the strong ones. This collective process reaches conclusions neither would find alone.

Two thousand years ago, pilgrims climbed Mount Parnassus to reach the Temple of Apollo at Delphi. Before they could consult the Oracle, they passed beneath an inscription carved into stone: "Know Thyself."

Generations of philosophers took this as a call to introspection: Sit quietly. Examine your thoughts. Trust what you find inside.

Mercier's work suggests they had it backwards. Knowing yourself means recognizing that your own mind is the last place to look for objectivity. The solution isn't bouncing more ideas off the walls of your own mind. It's dialogue. Seek out people who see it differently and let them challenge your thinking.

Your mind sharpens through friction, not isolation.

[the science]

There's no such thing as impartial reasoning.

In 1990, psychologist Ziva Kunda published "The Case for Motivated Reasoning," a landmark paper examining a simple but uncomfortable pattern: people say they want accurate conclusions, yet their reasoning often supports what they already believe.

Kunda asked whether this distortion is deliberate or whether motivation unconsciously shapes the reasoning process itself.

After reviewing decades of experiments, Kunda found that reasoning is guided by two competing forces:

  • Accuracy motivation: The desire to reach correct conclusions
  • Directional motivation: The desire to reach preferred conclusions

Accuracy motivation prevails when the truth matters more than our opinion of it. For example, you might not want it to be raining outside, but when you look out the window, you want to make an accurate assessment of the weather so you can prepare accordingly.

Directional motivation takes over when the conclusion feels personal. Say you've just started a new diet, and you come across an article claiming it doesn't work. Instead of weighing the evidence neutrally, you scan for flaws in the study, recall friends who lost weight doing exactly what you're doing, and decide the article must be biased.

It’s not that your conclusion is entirely made up; you've just searched your memory selectively, picking out supporting facts and ignoring conflicting ones. The reasoning feels objective. But the direction was set before you started thinking.

That’s the pattern Kunda saw over and over again in her studies. Participants judged evidence that favored their position as stronger and more credible. They rated opposing evidence as weak or flawed. They generated more counterarguments against threatening claims than against supportive ones. All the while, they believed they were being objective.

[the takeaways]

1) Assume Your Reasoning is Biased
Your first thoughts on any position are usually directionally motivated. Treat your initial reasoning as a draft argument, not a verdict.

2) Stress-Test New Ideas
Reasoning improves through resistance. A belief that has never faced opposition is a belief that has never been tested.

3) Separate Identity From Opinion
Arguments fail when beliefs become identity markers. Keep a distance between your sense of self and your conclusions.

4) Design Decisions Around Dialogue
Groups reach better conclusions when disagreement is built into the process. A consensus reached too quickly is usually a fragile consensus.

5) Treat Debates as R&D
Debates are framed as competitions, but their real function is error detection. Your goal shouldn’t be to win. Use debates as an opportunity to expose weak reasoning before reality does it for you.

Stay tuned for next week’s newsletter to get one step closer to finding your genius.

[sei]

Unsubscribe · Preferences

Subscribe to The Genius Filter