How To Lead With Humility


[sei]

[the genius filter]

I can live with doubt and uncertainty and not knowing. I think it's much more interesting to live not knowing than to have answers which might be wrong.
- Richard Feynman

The smartest person in the room is often the one willing to say "I don't know."

The executive who speaks without hesitation; the expert who never says "I don't know"; the leader who projects total confidence in every decision. We tend to mistake confidence for competence. We confuse the appearance of knowledge with the real thing.

What looks like a strength is often a trap. When leaders become too attached to being right, they stop listening, and they stop learning. They surround themselves with people who confirm what they already believe, and they dismiss anyone who questions their assumptions. Then, they find themselves in a culture where no one dares to speak up, where problems hide until they explode.

But real authority, the kind that commands respect, doesn't come from knowing everything. It comes from knowing what you don't know, and being willing to admit it.

This is humility: the recognition that your knowledge has limits, that you might be wrong, and that admitting it makes you stronger.

Few embodied this better than Richard Feynman.

[the spark]

The Power of Asking Simple Questions

On January 28, 1986, the Space Shuttle Challenger broke apart 73 seconds after launch.

Seven crew members died, including Christa McAuliffe, a high school teacher selected to be the first teacher in space. The nation watched it happen live. President Reagan appointed a commission to investigate, led by former Secretary of State William Rogers and staffed with astronauts, generals, and one Nobel Prize-winning physicist named Richard Feynman.

Feynman joined the Rogers Commission reluctantly. At 67, battling cancer, he had no interest in bureaucratic theater. But his wife convinced him he was the only one who would actually look for the truth instead of following the script. Turns out, she was right.

While the commission sat through technical presentations filled with acronyms and formalities, Feynman spent his free hours talking directly to NASA's engineers. He asked basic questions that others assumed had obvious answers: What were the O-rings made of? Who decided when it was safe to launch? How did NASA calculate the odds of disaster?

What he found disturbed him. NASA managers estimated the risk of catastrophic failure at one in 100,000. The engineers who actually built the rockets put it closer to one in 200. When Feynman asked how management arrived at their number, he discovered they had simply made it up. They weren't working from data. They were working from faith in their own systems.

The more Feynman learned, the more hubris he uncovered. The O-rings had been leaking on previous flights. But the damage seemed minor: small burns, slight erosion, easy enough to ignore. Each time, the flight succeeded anyway, so NASA treated the problem as acceptable. They were playing Russian roulette and calling it risk management.
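The gap between those two estimates is easy to quantify. A back-of-the-envelope sketch: the per-flight odds come from the estimates above, and the count of 24 shuttle missions flown before Challenger is used for illustration.

```python
# Probability that a shuttle program sees zero failures across n
# independent launches, given a per-flight failure probability.
def survival_probability(per_flight_failure: float, flights: int) -> float:
    """Chance of no failures in `flights` consecutive launches."""
    return (1.0 - per_flight_failure) ** flights

PRIOR_FLIGHTS = 24  # missions flown before Challenger's final launch

# Management's estimate: 1 in 100,000 per flight.
management = survival_probability(1 / 100_000, PRIOR_FLIGHTS)  # ~0.9998

# The engineers' estimate: closer to 1 in 200 per flight.
engineers = survival_probability(1 / 200, PRIOR_FLIGHTS)  # ~0.89

print(f"Odds of 24 clean flights (management): {management:.4f}")
print(f"Odds of 24 clean flights (engineers):  {engineers:.4f}")
```

Under management's number, an unbroken streak was a near certainty; under the engineers' number, there was roughly a one-in-nine chance of losing a shuttle somewhere in those first 24 flights. A clean record was entirely consistent with a deeply risky system.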

During a televised hearing, Feynman dropped a piece of O-ring rubber into a glass of ice water. He let it sit, then squeezed it with a clamp. When he released the pressure, the rubber stayed compressed. It had lost its resilience at 32 degrees Fahrenheit, close to the temperature on the morning Challenger launched. The cameras caught everything, and the nation watched as Feynman revealed years of denial.

In his personal report, filed separately from the commission's official findings, Feynman wrote that NASA had "exaggerated the reliability of the shuttle to the point of fantasy." They had fooled themselves into believing the shuttle was safer than their own engineers knew it to be.

The courage to say "I don't know" requires more than honesty. It requires letting go of the need to appear certain when you're not.

[the science]

Humble minds make better leaders.

Psychologists Tenelle Porter and Karina Schumann set out in 2018 to measure what happens when people recognize the limits of their own knowledge. Their research focused on intellectual humility, which they defined as the willingness to acknowledge that one's beliefs might be wrong and to remain open to evidence that contradicts them.

Across multiple studies, they found that individuals who scored higher in intellectual humility were significantly better at distinguishing strong arguments from weak ones, regardless of whether those arguments supported their existing views.

The implications extend beyond personal reasoning. Organizational research by Bradley Owens in 2013 demonstrated that when leaders model this kind of openness, it spreads. Teams led by humble leaders showed higher rates of learning behavior and innovation, largely because members felt safe enough to voice unconventional ideas, flag potential problems, and admit their own uncertainties without fear of judgment.

Feynman's willingness to ask basic questions at NASA, to say "I don't understand this" in rooms full of experts, created exactly this kind of environment. The engineers who had been silenced by a culture of false certainty finally had someone who would listen.

His humility was contagious, and it surfaced the truth that overconfidence had buried.

[the takeaways]

1) Test Assumptions With Simple Tools
When complexity obscures truth, find the simplest experiment that reveals whether the foundation holds. Strip away the layers until you can see what actually breaks under pressure.

2) Close the Gap Between Levels
Skip the polished presentations and speak with the people doing the work. The distance between perception and reality grows in proportion to the layers you allow between yourself and the truth.

3) Recognize When Success Hides Failure
A system that survives despite its flaws is living on borrowed time. When you see warning signs but still get good outcomes, you're probably just waiting for your luck to run out.

4) Demand Clarity Over Credentials
Jargon protects people from admitting what they don't understand. If someone cannot explain a risk in plain language, they likely don't grasp it themselves.

5) Make Honesty Contagious
When leaders admit their own uncertainty, it permits others to surface problems that pride would otherwise bury. Your willingness to say "I don't know" determines whether your team will tell you what you need to hear.

Stay tuned for next week’s newsletter to get one step closer to finding your genius.
