How to Update Your Outdated Beliefs (faster than everyone else)


[sei]

[the genius filter]

Most people desperately cling to what they know.

When the facts shift, they double down. When evidence contradicts them, they rationalize their delusion. Call it stubbornness, but it's our human nature.

But belief updating (changing your mind when presented with a new reality) is the defining trait of people who see the future more clearly than the rest of us. It's what separates those who learn from those who stagnate. And it's a skill anyone can cultivate.

To master belief updating, you need to track how often you're right and where you go wrong. That means making forecasts you can score, not vague predictions you can spin any direction.

Then, when new evidence arrives, you adjust. Not wildly or emotionally, but incrementally, in proportion to what you've learned.
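"Forecasts you can score" just means probabilities paired with outcomes. One standard way to score them is the Brier score: the average squared gap between what you predicted and what happened. A minimal sketch (the variable names and numbers are illustrative, not from Tetlock's work):

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes.

    Each forecast is (probability, outcome), where outcome is 1 if the
    event happened and 0 if it didn't. Lower is better; always guessing
    50% earns 0.25.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Three scored forecasts: two correct high-confidence calls, one miss.
history = [(0.9, 1), (0.8, 1), (0.7, 0)]
print(round(brier_score(history), 2))  # 0.18
```

The point isn't the arithmetic; it's that a number like 0.18 can't be spun any direction, so it forces the feedback loop the vague hunch avoids.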

This week is all about that skill: How to recognize when a belief has expired, and how to loosen your grip without losing conviction.

Essentially, we’re breaking down belief calibration.

[the spark]

A Rather Ordinary Superpower

The best forecasters treat their beliefs as hypotheses to be tested, not treasures to be guarded.
- Philip Tetlock

Philip Tetlock, a psychology professor at the University of Pennsylvania, spent twenty years tracking how experts predict the future.

His finding? The average pundit was roughly as accurate as a dart-throwing chimpanzee. But buried in that data was something hopeful: a small group consistently outperformed everyone else.

Tetlock called them "superforecasters," and what set them apart wasn't their IQ or credentials. It was how they thought. They gathered information from many sources, stayed humble about what they didn't know, and updated their views incrementally as new evidence emerged.

The superforecasters broke big questions into smaller, testable pieces. They assigned probabilities in fine gradations, not vague maybes. Most critically, they tracked their accuracy and learned from their mistakes. Their method proved that foresight is more than an innate gift. It's a practiced, cultivated skill.

And the central habit of that practice is belief calibration: holding your conclusions loosely enough that new information can reshape them.

Superforecasters compare confidence to reality. If events they gave a 70% chance of happening come true only half the time, they notice.

If they are right more often than expected, they notice that too. Over time, their beliefs start to tightly fit the reality of the world around them.
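That comparison is easy to mechanize once forecasts are logged. A sketch of a calibration check, assuming each forecast is recorded as a (probability, outcome) pair (the function name and data are illustrative):

```python
from collections import defaultdict

def calibration_table(forecasts):
    """Group logged forecasts into 10%-wide confidence buckets and
    compare each bucket's stated confidence to its actual hit rate."""
    buckets = defaultdict(list)
    for prob, outcome in forecasts:
        buckets[round(prob, 1)].append(outcome)  # e.g. 0.72 -> 0.7 bucket
    return {
        b: (b, sum(outs) / len(outs))  # (stated, actual frequency)
        for b, outs in sorted(buckets.items())
    }

# Four "70%" calls that came true only half the time: overconfident.
log = [(0.7, 1), (0.7, 0), (0.7, 1), (0.7, 0)]
for stated, actual in calibration_table(log).values():
    print(f"said {stated:.0%}, happened {actual:.0%}")
```

A well-calibrated forecaster's buckets line up: the "70%" column happens about 70% of the time.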

Tetlock’s core finding is simple, if uncomfortable: Our minds resist updating because certainty feels good. Accuracy requires giving that up.

[the science]

Give your judgment a reality check.

In 2011, Tetlock launched the Good Judgment Project, a massive 4-year forecasting tournament sponsored by the research arm of US intelligence.

Over 20,000 volunteers made predictions on hard geopolitical questions: Will Greece leave the eurozone? Will North Korea test a nuclear weapon? Will there be a coup in Thailand? The format was simple but rigorous. Teams competed to assign probabilistic estimates to hundreds of questions about events months to a year in the future.

Surprise, surprise: The superforecasters won. In fact, they crushed the competition. The superforecasters beat the control group (ordinary forecasters) by more than 50%. They beat a prediction market populated by professional intelligence analysts with access to classified information, too.

The organizers actually ended the tournament early because the gap had become so large.

But the superforecasters' edge came from habits that look, on the surface, pretty mundane. They broke complex questions into smaller, testable pieces. They didn't make one big forecast and lock in. They drafted many small forecasts on the same question, updating their estimates in small increments as new facts arrived.

This constant, incremental recalibration meant they caught shifting facts and adjusted without overreacting. They consulted multiple perspectives, treating forecasting like a puzzle where different angles reveal different patterns. They tracked their accuracy ruthlessly, studying what they got wrong and adjusting their methods.

The research revealed something else: forecasting could be taught. Even a one-hour training module in probabilistic reasoning, reference classes, and bias recognition improved a forecaster’s accuracy by around 10%, with effects persisting for at least a year.

The tournament proved something important: Discipline and methodology trump raw talent, and, despite its name, superforecasting isn’t really a superpower. It’s a skill built on intellectual humility, probabilistic thinking, and relentless self-correction.

The superforecasters had studied their own minds, recognized their own biases, and engineered systems to compensate. That's a workflow available to all of us.

[the takeaways]

1) Make Predictions You Can Score
Vague hunches don't teach you anything. Write down specific forecasts with probabilities attached. Create a tight feedback loop between belief and outcome.

2) Update in Small Increments
When new information arrives, don't swing wildly or dig in. Adjust your confidence by a few percentage points at a time.

3) Break Big Questions Into Pieces
Complex predictions become manageable when you split them into smaller, testable components. Ask questions and let the answers build a clearer picture.

4) Seek Perspectives That Challenge You
Accuracy improves when you treat opposing viewpoints as data, not threats. Certainty feels good. It can also become a blind spot.

5) Track Your Accuracy Over Time
Review your past forecasts regularly. Study where you were overconfident, where you hedged too much. That habit turns judgment into an edge that sharpens with use.
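The second takeaway's "in proportion to what you've learned" has a precise form: Bayes' rule, which shifts your confidence according to how much better the new evidence is explained by your belief being true than by it being false. A sketch with made-up numbers:

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability after one piece of evidence.

    Belief moves in proportion to the evidence: strong evidence moves
    it a lot, weak evidence barely at all.
    """
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Start at 60% confident; see evidence twice as likely if you're right.
p = bayes_update(0.6, likelihood_if_true=0.4, likelihood_if_false=0.2)
print(round(p, 2))  # 0.75
```

Note the restraint built into the math: one piece of moderately diagnostic evidence nudges 60% to 75%; it doesn't flip it to 99% or leave it untouched, which is exactly the middle path between swinging wildly and digging in.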

Stay tuned for next week’s newsletter to get one step closer to finding your genius.
