Essay · 6 min read

Protecting Attention in the AI Era: The Paradox of Filter Success

Working in a startup, I used to treat constant input as discipline. More often it was anxiety in disguise. The problem isn't filter failure; it's filter success. The human body is a control system, and constant noise prevents it from ever reaching stability.

Lately, I have been noticing an uncomfortable pattern in myself.

Working in a startup team, I tell myself that staying informed is part of the job. I do not want to miss a shift in the market, a new model release, a product insight, or a change in user behavior. So I keep checking: articles, demos, threads, notes, product launches, and new tools. At some point, however, that stops being research. It becomes anxiety with good branding.

I used to blame the outside world. Social media is flooded with recycled ideas. AI has made content cheaper, faster, and easier to repackage. Recommendation systems keep finding new ways to make everything feel relevant. But the harder truth is that the system is not failing to filter information. It is succeeding too well.

In 2008, Clay Shirky famously declared, "It's not information overload. It's filter failure." It was a comforting thought—if the problem was just bad filters, we could simply build better ones. But three years later, technology writer Nicholas Carr pointed out the flaw in this logic. Carr argued that better filters do not mitigate information overload; they intensify it. As he put it: "It would be more accurate to say: It's not information overload. It's filter success."

Carr made a crucial distinction between situational overload (trying to find a needle in a haystack) and ambient overload (being surrounded by haystack-sized piles of needles). Our modern algorithms are incredibly good at finding the needles. The problem is that they deliver so many highly relevant, perfectly targeted needles that we are overwhelmed by the sheer volume of things we actually want to see. The cause of situational overload is too much noise, but the cause of ambient overload is too much signal.

That is what makes it dangerous. In the AI era, distraction increasingly arrives disguised as work. You can spend hours consuming summaries, expert opinions, and strategy posts, feeling incredibly productive, while still avoiding making a real decision.

I keep coming back to a simple idea from cybernetics: a person is also a control system.

Human as a Control System

In control theory, a system needs a clean feedback loop to stay stable; biologists call the equivalent state homeostasis. When a system is hit by an input or disturbance, it needs a certain amount of "settling time" to return to a steady state. If you keep injecting new setpoints and high-frequency noise into a PID loop faster than it can settle, it never converges: it just keeps chasing the disturbance.
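To make that concrete, here is a minimal, illustrative sketch in Python (not from the essay): a toy first-order plant under PID control, run once with a quiet input and once with a fresh disturbance injected on every step. The gains, time step, and noise level are arbitrary placeholders; the only point is that the quiet loop settles near its setpoint while the continuously perturbed one never does.

```python
import random

def mean_late_error(noise_amplitude, steps=400, dt=0.05):
    """Average distance from the setpoint over the last 100 steps."""
    kp, ki, kd = 2.0, 1.0, 0.1          # illustrative gains, not tuned for any real system
    setpoint, state = 1.0, 0.0
    integral, prev_error = 0.0, setpoint - state
    errors = []
    for _ in range(steps):
        error = setpoint - state
        integral += error * dt
        derivative = (error - prev_error) / dt
        control = kp * error + ki * integral + kd * derivative
        prev_error = error
        # First-order plant dynamics, plus a fresh disturbance on every single step.
        disturbance = random.uniform(-noise_amplitude, noise_amplitude)
        state += (control - state) * dt + disturbance
        errors.append(abs(error))
    return sum(errors[-100:]) / 100

random.seed(0)
print("quiet loop:", round(mean_late_error(0.0), 4))   # settles close to the setpoint
print("noisy loop:", round(mean_late_error(0.3), 4))   # keeps chasing the noise, never settles
```

The exact numbers do not matter. What matters is that the second loop is never given its settling time, so it never reaches a steady state.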

The human brain operates on similar principles. We improve when we take in a signal, act on it, observe the result, and adjust our internal models. But constant, high-frequency input breaks that loop. It creates what psychologists call allostatic load—the wear and tear on the body and brain from chronic stress and constant adaptation. When we never allow our cognitive systems the settling time they need, novelty starts to feel like importance, and mere stimulation masquerades as progress.

This constant context-switching also introduces a severe cognitive penalty. Researcher Sophie Leroy coined the term attention residue to describe what happens when we shift our focus from one target to another. A residue of our attention remains stuck on the previous task, significantly reducing our cognitive performance on the new one. Every time I pause to check a "highly relevant" AI news update, I am not just losing those five minutes; I am smearing attention residue across whatever actual work I was trying to accomplish.

This creates another problem I know too well: having too many ideas.

If I have three good ideas, that is useful. I can choose one, test it, and learn something. If I have twenty, that is usually a bad sign. It means I am spending too much time in possibility-space and not enough time in reality. As the American Psychological Association's reporting on decision fatigue suggests, even choices about things we enjoy draw down the same limited cognitive reserves, one decision at a time. Too many ideas create false momentum. I feel active because my mind is busy, but an untested idea is often just a mood.

What matters is not how many ideas I can generate. What matters is how quickly I can turn one into a test, get feedback, and update my model. That is the operating principle I want to keep: Do not optimize for maximum inspiration. Optimize for judgment and iteration.

River vs Bucket Mindset

Author Oliver Burkeman offers a brilliant metaphor for this: treat your reading list like a river, not a bucket. A bucket is something you feel compelled to empty; it creates a constant sense of obligation and failure. A river is a stream that flows past you. You do not need to empty the river. You just need to step into it deliberately, take the water you need for the day, and step back out before the current sweeps you away.

So I am trying to enforce simpler rules for my own control system:

- Consume less, but more deliberately.
- Keep fewer ideas alive at the same time.
- Turn thoughts into small tests faster.
- Replace imagination with feedback.
- Protect time where no new input is allowed.

In the AI era, the scarcest resource is not information. It is attention that can still hold its shape.

If I can protect that, I can think more clearly. If I can think more clearly, I can judge better. And if I can judge better, I can build with less noise. That is the standard I want now: not more input, not more ideas, just better filters, cleaner feedback, and more contact with reality.
