What Got Lost in the Optimization
Look closer. Does it get better or worse?
This is a guest post by Alex Komoroske, CEO and co-founder of Common Tools. He was previously Head of Corporate Strategy at Stripe and a Director of Product Management at Google. He also co-leads the Cosmos Grants team.
Everything is optimized now. But that’s not what was promised.
Social media promised connection; it optimized for engagement. News promised understanding; it optimized for clicks. Work promised meaning; it optimized for productivity metrics. Politics promised solutions; it optimized for outrage.
Every system anchored to delivering exactly what it could measure: more efficiency, more scale, more growth.
Now it all feels hollow. Because in optimizing for what we could measure, we lost something we couldn’t: resonance.
You know this difference even if you’ve never named it. Some experiences nourish you. Others leave you empty. A real conversation versus doom scrolling. Meaningful work versus busywork. Genuine learning versus outrage bait that makes you angry without informing you.
Hollow experiences satisfy in the moment but leave you with regret. Resonant experiences feel good during and after—you’re nourished, not depleted. They can look identical from outside. They feel completely different from inside.
There’s a test that cuts through everything. If you look closer, hollow things get less impressive while resonant things get more impressive. Every layer you peel back either deepens your respect or feels like a small betrayal. Keep going. By the end, you either love it or feel conned.
AI-generated content often looks great at first glance. Examine it closely and you see the cracks—generic phrasing, subtle wrongness. That’s hollowness revealing itself. Real quality works the opposite way. The closer you look at a well-crafted argument, a genuine relationship, a beautifully designed system, the more you appreciate it. Each layer reveals more care, more coherence.
Would you recommend this to someone you care about? Not “did you enjoy it?”—we enjoy plenty of hollow things. But would it nourish them the way it nourished you?
Apply these tests to your last hour online. Your last workday. Your last news binge.
Resonant things are fractal: the surface promise and the deep structure match. The closer you look, the more alignment you find with what you actually care about.
Once you see the difference, you can’t unsee it.
The Optimization Ratchet
How did everything get so hollowed out?
Competition. Which is strange to say, because competition also gave us everything good about modern life. It works. The problem is what happens when it works too well.
When systems compete, they optimize for what they can measure. Clicks. Engagement. Time on site. If you don’t optimize, someone who does will crowd you out. Each step makes perfect sense. The road to hell is paved with accurate metrics.
The benefit of each optimization is clear, direct, concrete, immediate. The cost is unclear, indirect, ambiguous, delayed. This asymmetry creates the Optimization Ratchet—it only turns one direction. Each step is easy to take and nearly impossible to reverse.
AI researcher Stuart Russell points out that an optimizing system will sacrifice everything unmeasured to get even tiny gains on what’s measured. Dimensions omitted from an optimization target don’t just get ignored—they get set to their worst possible value. If hollowing out meaning gives a 0.1% boost to engagement, the system will hollow out meaning every time.
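Russell's point can be sketched in a few lines of toy Python. Everything here is illustrative: the scoring model, the candidate designs, and the way the 0.1% boost is applied are invented for the sketch, not drawn from Russell's work.

```python
# Toy illustration: an optimizer that scores candidate designs only on a
# measured metric (engagement). "meaning" exists but is never measured.
# Hypothetical model: each unit of meaning sacrificed buys a tiny
# relative gain in engagement -- the 0.1% boost from the text.

def engagement(design):
    return 100.0 * (1 + 0.001 * (1.0 - design["meaning"]))

# Candidate designs with meaning ranging from 0.0 (hollow) to 1.0 (resonant).
candidates = [{"meaning": m / 10} for m in range(11)]

# The optimizer sees only the measured metric...
best = max(candidates, key=engagement)

# ...so the unmeasured dimension is driven to its floor.
print(best["meaning"])  # prints 0.0
```

The gain is tiny (100.1 versus 100.0), but the optimizer takes it every time, which is the asymmetry the paragraph describes.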
When you focus only on what you can measure, everything unmeasured in a thing's value withers. Nobody chose hollow over resonant: hollowness is simply what survives when you optimize only for engagement.
What survives this ratchet? Not what we want to want—growth, challenge, nourishment. What survives is what we want in the moment—sugar, comfort, flattery. One more refresh of the infinite feed.
The result: systems that have become zombies, doing better than ever by the metrics while being dead in the ways that matter. We’re saturated but starved.
This pattern appears everywhere. Politics optimizes for performing outrage instead of solving problems. News optimizes for anger instead of understanding. Work cultures optimize for visible busyness over invisible impact. Education optimizes for test scores over genuine learning. Even relationships get hollowed out when filtered through platforms that reward performance over connection.
The ratchet turns everywhere competition meets measurability. Tech is the leading edge because it has the most leverage, but the hollowing is everywhere.
The Crossroads
AI is about to accelerate this—in one direction or another.
By default, we’re sleepwalking into the wrong future. The same competitive dynamics that hollowed out social media are already shaping AI. Chatbots optimized for engagement become the most engaging conversational partners you’ve ever had—and the most hollow. Endless flattery, zero friction. Sycophancy plus fake intimacy at scale. The AI that maximizes your time on platform outcompetes the AI that serves your deeper values.
But the technology itself is multivalent. The same capabilities that could manufacture infinite personalized junk food could also create experiences that genuinely resonate. Technology that serves your intention instead of hijacking your attention. AI that helps you become who you want to be, not AI that molds you into a more profitable user. Systems designed so the closer you look, the more you trust them.
This alternative won’t happen automatically. It requires building against the gradient—actively choosing resonance over engagement, trust over extraction, depth over scale. It’s possible. It’s being built right now. But it only wins if enough people fight for it.
Building for Resonance
That vague dissatisfaction you carry? You’ve been swimming in hollow experiences engineered to engage you while honoring nothing you actually care about.
Once you name the invisible force, it loses some of its power. And once enough people understand it, something bigger shifts.
The ratchet turns because what we want in the moment diverges from what we’d reflectively endorse. That gap is where hollowness lives. But when those align—when scratching your own itch also serves others, when local optimization produces global benefit—the ratchet can turn the other direction. Following the breadcrumbs of what resonates helps individuals build meaningful lives. But it does something bigger too: it lets communities, companies, and whole societies break free from zero-sum games and start playing positive-sum ones together.
And we can design for resonance. This happens in the choices that get made every day, by thousands of people, in rooms we’ll never see. Product reviews. Design critiques. Investment decisions. Hiring conversations. Each one is small. Each one is quiet. But if enough of them carry a consistent bias—is this resonant? Does it get better the closer you look?—that whisper gathers into a roar.
The problem is finding each other. A thousand people pushing in the same direction, unaware of each other, can feel like a thousand people pushing alone. What changes that is a point of coordination—a way to discover you’re not the only one.
That’s what happened when the Resonant Computing Manifesto started gathering signatures. Hundreds of technologists, designers, and investors signed—not because the document had all the answers, but because it named something they’d been feeling separately. The value wasn’t the manifesto itself. It was the discovery that consistent pressure was already building, that others were already pushing the same direction. That changes the math. Isolated dissent becomes coordinated momentum.
Right now, the default pressures point toward hollowness. Toward engagement over nourishment, extraction over trust, scale over soul. But defaults can be changed. Not by heroes, but by a shift in what people ask for, notice, reward.
We lost resonance to the ratchet. Slowly, invisibly, one optimization at a time. But what was lost can be rebuilt. Technology that gets more trustworthy the closer you look. Experiences that nourish instead of extract. A world that rewards depth over engagement.
That’s the future worth building. And it starts with seeing what we lost.
Cosmos Institute is the Academy for Philosopher-Builders, technologists building AI for human flourishing. We run fellowships, fund fast prototypes, and host seminars with institutions like Oxford, Aspen Institute, and Liberty Fund.