
[the genius filter]

How to Avoid Playing the Wrong Games

Smart people make bad decisions all the time.

Not because they lack intelligence. Not because they lack good intentions. But because the systems they operate within reward the wrong things. The gap between what we say we value and what we get paid to do is often where we start to go sideways.

Stated values drift from rewarded behavior. Institutional missions diverge from operational metrics. Personal ideals bend around income dependencies. The drift happens in small ways: painless little misalignments. But the effect accumulates.

Here's what makes this dangerous: incentives don't just shape behavior. They shape perception first. Over time, they reshape belief. When the structure rewards the wrong outcome, your reasoning will bend toward justifying it. You won't even notice the bend, and in the end, you'll call it pragmatism.

The wrong game can feel productive for years while moving you further from the outcomes you set out to achieve.

This issue is about identifying the game you're actually playing and asking whether its reward structure really produces what you claim to value.

[the spark]

Structures Designed to Make You Drift

It is difficult to get a man to understand something when his salary depends upon his not understanding it.
- Upton Sinclair

Sinclair wrote this in 1935, but by that time, he'd spent decades watching it happen. As a muckraking journalist in early 20th-century America, he documented meatpacking plants, oil monopolies, and political machines.

What struck him wasn't the existence of corruption; that was nothing new. It was how people inside these systems genuinely believed they were doing good work. The problem was that their livelihood depended on their not seeing clearly.

Sinclair's insight cuts deeper than it first appears. While he may have framed his findings as an economic commentary, he was really describing a structural constraint on cognition itself.

When your income depends on a particular conclusion, the range of thoughts you can think narrows. You don't consciously decide to ignore evidence; your mind simply stops surfacing it. The rationalization happens before you’re even aware of it.

The principle extends far beyond your pay.

Promotion pathways shape what you're willing to say.
Reputation capital shapes what you're willing to question.
Audience perception shapes what you're willing to believe.
Access to power networks shapes what you're willing to criticize.

Wherever reward concentrates, interpretation will drift toward whatever protects the reward.

Sinclair saw this through simple observation. He watched smart, decent people defend systems they would have opposed if their position didn't depend on it. Modern economics and behavioral science have since formalized the mechanics. The phenomenon has a structure. It can be measured, predicted, and in some cases, designed around.

What you depend on will slowly become what you defend.

[the science]

Losing Time Pursuing a Proxy

In 1975, British economist Charles Goodhart observed a pattern while studying monetary policy: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes."

The idea became known as Goodhart's Law, often summarized as: when a measure becomes a target, it ceases to be a good measure.

The mechanism is straightforward. Once a metric is tied to rewards or evaluation, people optimize for the metric itself rather than the underlying objective it was meant to represent. The number becomes the goal. The thing the number was supposed to track gets neglected.

This plays out everywhere. Schools optimize for test scores, not learning. Media platforms optimize for engagement, not accuracy. Companies optimize for quarterly results, not durable value. In each case, the metric substitutes for reality. Actors game the measurable dimension while the actual objective degrades.

But awareness of this pattern is also the exit. Once you see how metrics bend behavior, you can design around the distortion. Use multiple indicators rather than a single target, and always separate measurement from reward. Revisit what you're actually trying to achieve and ask whether the numbers still serve that aim. The trap only holds when it operates invisibly. Name it, and you can begin to build systems that reward what matters rather than what's easy to count.

What gets measured gets managed. What gets rewarded gets gamed.

[the takeaways]

1) Find the Real Scoreboard
Look past stated goals to operational rewards. What behaviors actually get promoted, praised, or paid? Stated goals and real incentives can be miles apart.

2) Trace Your Points of Dependency
Map where you're economically or socially exposed. The more clearly you see your dependencies, the less power they hold over your reasoning.

3) Check the Alignment
Imagine someone who pursued the incentive structure perfectly. Would their behavior match the stated mission? If not, the game itself is misaligned.

4) Build Optionality
Diversify income streams. Limit reliance on single-approval channels. The more dependent you are on one source, the more your perception will bend to protect it.

5) Choose Games With Long-Term Rewards
Favor environments where long-term value matters more than short-term optics. Over time, the game you select will shape what you defend and who you become.

Stay tuned for next week’s newsletter to get one step closer to finding your genius.
