algorithms are eating you
buddha once said, "what you think, you become". in 2025, it's more like, "what you consume, you become".
every interaction on our digital devices is meticulously optimized for maximum engagement. this algorithmic shadow follows you everywhere. every second you linger on a video, every late-night impulse buy, every ad you barely notice—it's all data, all feeding the machine. and the machine isn't just predicting what you'll do next. it's deciding it for you.
think you chose that product, formed that opinion, or selected that movie independently? the algorithm served it up, tailored to your psychological profile. it knows your triggers, fatigue points, and vulnerabilities after studying millions just like you. it precisely understands what makes you click, subscribe, buy, and vote.
this cycle is relentless. the more you consume, the more precisely it shapes you—not just your spending habits but your worldview, emotions, and identity. your very self is being rewritten, line by line, with each interaction.
false freedom
you live in a paradox: endless choices, vanishing free will. every "personalized experience" just narrows your vision. algorithms don't show you what expands your mind; they show you what keeps you locked in a comfortable prison of predictability. the illusion of choice masks the reality of manipulation.
the true crisis isn't just that algorithms serve you more of the same—it's that they systematically remove exposure to the unexpected, the challenging, the truly novel. they filter out intellectual friction and cognitive dissonance—the very elements that spark growth and new understanding. your digital experience becomes increasingly frictionless, comfortable, and unchallenging—a perfectly calibrated echo chamber that never disrupts your existing worldview.
consider how nearly impossible it is to stumble upon truly different perspectives online anymore. the algorithm has mapped your psychological boundaries and carefully avoids transgressing them. information that might expand your thinking beyond your current patterns becomes increasingly invisible—not actively censored, but effectively erased from your digital reality through the passive violence of non-selection.
behavioral architects
is this inherently problematic? in middle school, I learned that an algorithm is simply a set of computer instructions designed to process information. that definition wasn't wrong—but it's now incomplete.
in 2025, algorithms are behavioral architects. they shape your desires, nudge your actions, and reinforce patterns that benefit the platforms running them. they optimize not for efficiency but for control.
every major company has become an algorithmic company because algorithms provide unparalleled leverage at scale. they determine what you see and buy, who you connect with, and how you think. they create feedback loops that condition your behavior.
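the feedback loop is simple enough to sketch. this is a toy simulation, not any platform's actual system: a greedy recommender shows the topic you already engage with most, and each engagement nudges your interests toward what was shown. the topic count, drift rate, and step count are all illustrative assumptions—the point is only that interest entropy collapses.

```python
import math
import random

def entropy(p):
    """Shannon entropy of a probability distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def simulate(topics=10, steps=500, drift=0.05, seed=0):
    """Greedy engagement loop: recommend the topic the user engages
    with most, then nudge the user's interests toward what was shown."""
    rng = random.Random(seed)
    interests = [1.0 / topics] * topics  # start with broad, uniform tastes
    start = entropy(interests)
    for _ in range(steps):
        # platform recommends the currently most-engaging topic
        shown = max(range(topics), key=lambda t: interests[t])
        # user engages with probability proportional to current interest
        if rng.random() < interests[shown]:
            # consumption reinforces interest in what was shown;
            # everything else is proportionally crowded out
            interests = [w * (1 - drift) for w in interests]
            interests[shown] += drift
    return start, entropy(interests)

before, after = simulate()
print(f"interest entropy: {before:.2f} -> {after:.2f} bits")
```

run it and the entropy of the simulated user's interests falls from its broad starting value toward near zero: the system never coerces, it only reinforces, and that is enough to narrow you.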
erosion of autonomy
the digital age promised us freedom but delivered conditioning. we click and scroll through curated realities, each interaction tightening the algorithmic bubble around us. this isn't coincidence—it's by design.
you've outsourced your thinking to systems. the endless content stream doesn't expand your mind—it narrows and specializes it, making you more predictable.
what you call "personalization" is actually fragmentation—society split into countless algorithmic tribes, each convinced their reality is the only reality. the greatest trick algorithms ever pulled was convincing you that you're in control—that your feed is truly "yours," your choices spontaneous, your thoughts original. all while they silently steer your attention, emotions, and beliefs.
in a world where algorithms are eating away at your autonomy, awareness is your only defense. what you consciously choose to consume is the last frontier of your free will. everything else is just predetermined programming disguised as personal preference.
stolen agency
technology should expand your human potential, not exploit your weaknesses. it should serve your highest aspirations, not your lowest impulses. you need systems designed for human flourishing, not human extraction.
silicon valley has lost its way. it no longer prioritizes meaning or takes responsibility for what it's building.
to my friends building computing from palo alto to tel aviv, bangalore to tokyo: you wield invisible power. your algorithms silently reconstruct consciousness itself—not through coercion but through the imperceptible architecture of attention. with each elegant function, you're rewriting the neural pathways of civilization while people scroll, click, and surrender.
remember when your code emerged from genuine idealism? somewhere along the way, that vision transformed into something unrecognizable. the machine learning systems identifying patterns have themselves become patterns too powerful to perceive—frameworks that don't just predict behavior but predetermine it. engagement optimization has become existential manipulation.
fresh code
we need a radical reimagining of what technology could be. not tools that exploit our vulnerabilities, but ones that strengthen our capacities for focus, discipline, and achievement. not systems that fragment human attention and willpower, but ones that help us become stronger and more resilient versions of ourselves. not platforms that addict you to distraction, but infrastructures that foster excellence and human potential.
the markets will eventually recognize this. as users become more aware of algorithmic manipulation, they'll increasingly seek out and reward technologies that respect autonomy rather than exploit vulnerability. the platforms that survive long-term won't be those that perfect psychological traps, but those that genuinely enhance human cognition. digital environments that cultivate rather than degrade, that liberate rather than addict, will ultimately create more value. here's to those who build for empowerment rather than entrapment.
what you consume, you become.
thanks to naveed gorgani, jay sahnan, quinn liu, and adrianna lakatos for reading drafts of this.