Confirmation Bias
You don't see the world as it is. You see it as you already believe it to be.
Your brain has a filter. It lets in information that agrees with what you think — and quietly blocks everything that doesn't. This is not stupidity. It's how all human brains work. The filter is fast, automatic, and invisible.
When you read a headline that matches your view, it feels true. You share it. When you read one that challenges your view, it feels wrong. You scroll past. Same brain, same person — different reaction based entirely on what you believed before you started reading.
This is confirmation bias. And it is the single most exploited feature of the human mind.
Why it matters
Every system that wants to influence you — media, politics, advertising, algorithms — starts here. If they know what you already believe, they know exactly what to feed you. Not lies, necessarily. Just the right selection of truths.
You won't feel manipulated. You'll feel informed. That's what makes it work.
How to notice it
Pay attention to how information makes you feel. If a news story makes you feel righteous, vindicated, or angry at "the other side" — slow down. That emotional charge is a signal that your filter is running, not your judgement.
Try this: pick something you believe strongly. Then spend ten minutes looking for the best argument against it. Not a weak one you can dismiss — the strongest one you can find. If you can't do it, or if it feels physically uncomfortable, that's the bias at work.
The hard part
You cannot turn this off. No one can. But you can learn to notice when it's running. And noticing is the difference between being guided by your beliefs and being trapped by them.