What You Want To Want
Harry Frankfurt, technology, and "addiction"
“Only two industries call their customers users: drug dealers and technology companies.” The saying is provocative, but is the comparison accurate?
Harry Frankfurt, the late American philosopher, distinguished between first-order desires (what we want in the moment) and second-order desires (what we want to want). In his framework, “addiction” occurs when an agent has second-order desires but cannot make them effective. It describes a structural incapacity to shape one’s will rather than a simple failure of resolve. To know whether we’re addicted, we need to ask two things: whether the self that reflects on our desires endorses our appetite for Twitter or TikTok, and whether that self can make its endorsement stick.
The bear case: we’re all addicts
Modern technology differs fundamentally from its precursors. These differences flow from a combination of fidelity (content is stickier, more varied, and of higher quality), patterns of usage (we need to use some kind of screen to get by in the modern world), and the role of algorithmic mediation (the system is actively seeking to retain our attention).
Modern media is brighter, faster, and more enveloping than anything that came before. A radio emits sound from the corner of the room, but a phone is a high-definition light source held inches from the face. The colors are rich and the sound crisp, which means the baseline level of stimulation is higher than that of both the TVs of the past and the stuff of everyday life. As Matt Groening’s Futurama put it in the middle of the 2000s: “but this is HD TV, it has better resolution than the real world.”
We rely on our devices as the basic interfaces through which work, communication, and navigation now occur, which means they occupy the same cognitive space as tools we genuinely need in order to function. Email and payroll and maps and childcare photos all arrive through the same surface that also contains YouTube or TikTok. The boundary between instrument and diversion is getting harder to pin down, and this blurring of worlds makes disengagement harder than with older media.
But perhaps the most worrisome element of modern technology is the role of algorithmic mediation. Globally, people spend around five hours a day on social media, streaming video, and music platforms, and AI undergirds all of it. Even assuming that only about two-thirds of this time is genuinely steered by recommendation engines, roughly one-fifth of our waking life is already mediated by algorithms.
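A back-of-envelope version of that estimate, taking a waking day of roughly sixteen hours as an assumed figure rather than one drawn from the sources above:

```latex
% Back-of-envelope check; the ~16-hour waking day is an assumption
\[
5\,\text{h/day} \times \tfrac{2}{3} \approx 3.3\,\text{h/day},
\qquad
\frac{3.3\,\text{h/day}}{16\,\text{h/day}} \approx 0.21 \approx \tfrac{1}{5}
\]
```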
The most obvious instance of their stickiness can be seen in social media, where the feed presents a continuous stream that proactively selects what you will see next before you have even decided to keep looking. It learns what prolongs a session, refines itself around micro-signals of interest, and then delivers more of the same with a latency so low that the user never has to re-enter the process of active deliberation. The choice is nominally still ours, yet the conditions in which we make that choice have been engineered so that abstention is effortful and continuation frictionless.
Some research suggests that smartphone use encourages the release of dopamine. Many take this to mean that phone use works like drug use, since both trigger the same physical response. But more recent research shows that the popular story gets the mechanism backwards. We often talk as though dopamine delivers a feeling of pleasure in the moment, like a chemical boost handed out for pressing the right button. In reality, dopamine tells the brain that something in the environment is worth pursuing and that the next moment might contain a reward. It is tied to anticipation rather than satisfaction insofar as it amplifies the urge to keep seeking before any pleasure has actually arrived.
This distinction matters because it shows that the smartphone keeps us engaged by sustaining anticipation of reward rather than delivering satisfaction. Anticipation still requires the user’s ongoing participation, which means some form of will is present. With drug use, the stimulus saturates the system so strongly that reflection is displaced; with phone usage, the stimulus is weak enough that reflection can still intervene.
Certainly, most people want to reduce their smartphone use: one survey of more than 3,000 students found that 74% of respondents had actively tried to cut the time they spend on their phones. A separate poll of U.S. adults found that 44% said they could not go 24 hours without their phone, and 76% reported feeling nervous if they didn’t know where their phone was.
But wanting to reduce something is not the same as being unable to reduce it. Most people also say they want to sleep more or drink less, and we do not treat every shortfall between intention and behavior as evidence of addiction. This suggests that addiction, defined as the structural inability to make one’s second-order desires effective, must involve the consistent overruling of our higher-order wants. The existence of regret does not by itself prove compulsion. It may simply show that people are negotiating competing goods in a world where attention is genuinely scarce.
The bull case: we are the captains of our fate
Even with the much-maligned smartphone, some of our use is clearly functional rather than compulsive. People book flights, check maps, or send emails without having to be “addicted” to their little black screen. A habit only resembles addiction when the user not only acts against their better judgment but cannot sustain the effort to bring their conduct in line with it. Much of ordinary phone use probably does not meet that threshold.
We also retain autonomy by deleting apps and setting limits, which allows us to redesign the environment in which our impulses arise and to bring it closer to the shape of the life we say we want. But we should be cautious about treating acts of self-correction as decisive proof of freedom. Addicts also have moments of resolve when they flush the bottle, delete the dealer’s number, or announce a fresh start before giving in to temptation. The presence of episodic self-control is not enough; what we are looking for is whether the higher-order stance can stably govern behavior rather than merely interrupt it.
This matters because it shows that second-order desires can still intervene and restructure the field of options, pruning temptations and changing the default conditions in which we choose. If that’s right, it suggests that technology can also train self-governance, pulling us toward the person we wish to be. We can uninstall apps, mute notifications, and set hard boundaries, and autonomy becomes more available the more often it is exercised. In this setting, friction of use is not “proof of capture” but the stuff through which discipline is formed. I call this the “skill issue” objection: the idea that technology affords us the opportunity to strengthen our will by resisting its pull.
Of course, not everyone is equally positioned to treat distraction as a training ground. It is one thing to say that autonomy can be cultivated through resistance, but another to assume that most people have the spare cognitive bandwidth to run that drill dozens of times a day. The same environment that allows for will-strengthening also produces a constant tax on attention, and there is a real question about whether repeated micronegotiations of impulse count as “practice” or as exhaustion.
This objection has force, but it overlooks that autonomy has never been a matter of raw willpower alone. It has always depended on structuring the environment so that good choices become easier to make. We do not become virtuous by staring temptation in the face all day, but neither do we become virtuous by hiding from it. What matters is shaping the environment so that right action is easier to choose, and then choosing it often enough that it becomes part of one’s character.
Phones, laptops, and even televisions are all useful in their own ways. They let us navigate unknown places, coordinate care, learn on demand, transact instantly, and remain reachable in moments where distance once meant isolation. These technologies widen the range of actions available to an ordinary person and let us pursue goods that would otherwise be out of reach. It’s true that widening the range of options also widens the range of tensions, but it’s also true that plural goods will always pull against each other.
The addiction framing risks mischaracterizing our situation because most people can regain control without external treatment, institutional support, or withdrawal management. A single act of reconfiguration, like deleting an app or changing a setting, actually changes the behavior. It’s annoying and requires conscious effort, but the fact that it works tells us that we may not be “addicted” in the traditional sense. More than that, phones are platforms: you can be addicted to certain applications that run on the phone, like gambling apps, without being addicted to the phone itself.
The Frankfurt framework
The essential problem is whether technology reflects, forms, or distorts our preferences. If we accept that technology influences desire, we need to know the extent to which particular influences are compatible with reflective endorsement. In Frankfurt’s terms, the question is whether our first-order appetite can still be brought under the guidance of a second-order stance about the kind of person we intend to be.
Influence alone is not the issue (influence is, after all, a permanent feature of human life). The issue is whether the influence leaves room for reflection to take hold, whether there is still a point at which we can say “this is a desire I stand behind” or “this is a desire I would rather reshape.” Addiction appears when the first-order impulse races ahead of our capacity to judge it.
We might say our relationship to technology is neither addiction nor full independence. For many people the phone does not overpower the will, but it does outrun it when desire arrives faster than evaluation (even though reflection can still reassert itself after the fact). We redesign our digital and physical environments because the second-order volitions, however contested among themselves, still hold authority. There may already be a tail of users for whom reflection no longer reliably intervenes, just as there is a tail of pathological gamblers. The claim is not that no one is addicted, but that addiction is not the median case.
The addiction framing fails because most people can still bring their behavior back under reflective control, which means the higher-order will still governs the lower appetite. Where in drug use the craving generally displaces reflection, in our use of technology human reflection can often interrupt and override our immediate wants. In this sense, because degrees of compulsion vary dramatically, it’s wrong to say that most of us are technology junkies. The stakes are high, the waters rough, but the choice is ours.
Cosmos Institute is the Academy for Philosopher-Builders, technologists building AI for human flourishing. We run fellowships, fund fast prototypes, and host seminars with institutions like Oxford, Aspen Institute, and Liberty Fund.


