Friction as Luxury: What We Lose When AI Gives Us What We Want

Hunters in the Snow, Pieter Bruegel the Elder, 1565

# The Last Scarcity

Most discussions of AGI focus on distribution: who gets access, who profits, who loses their job, who controls the infrastructure. Those are real problems, but they’re not the deepest one.

The deeper problem is what happens to desire. I do not mean ambition in the generic sense. I mean the capacity to want something at a distance, to stay oriented toward something you do not yet have, and to find meaning in the space between reaching and arriving. That capacity is more fragile than we usually admit, and it depends more on friction than most people notice.

# I.

Economists have a clean model of desire. People have preferences, goods satisfy preferences, welfare rises as more preferences are satisfied. In that framework, a technology that can satisfy almost any preference at negligible cost looks like an obvious good. The only remaining question is who gets access.

That model leaves out something important. Desire is not just a lack waiting to be filled. It has a shape, and that shape depends on certain conditions holding.

When you want something over time, you imagine having it. You plan for it, you make sacrifices toward it. The object accumulates meaning from this process. It gets layered with your effort, your anticipation, your history of reaching. When you finally arrive, you don’t just get the object. You get the object plus everything you invested in wanting it. Those two things can’t be separated.

That is why anticipation is often richer than arrival, why the best albums sometimes need months rather than minutes, and why relationships built through difficulty have a texture that convenient ones often do not. The resistance is not incidental to the value. It helps produce it.

# II.

When this structure breaks down, the clinical term is anhedonia. But there is a milder and more socially acceptable version of the same pattern. People in this condition can be entertained constantly but rarely feel deeply absorbed. They consume without much appetite. They move from one stimulating thing to the next not because anything is satisfying, but because sitting with incompleteness starts to feel unbearable.

You can already see the shape of it in declining attention spans, in the difficulty of sustaining interest in anything that doesn’t deliver immediate feedback, in people who feel simultaneously overstimulated and bored. They haven’t been deprived; they’ve been saturated.

This doesn’t distribute evenly across society. In environments where discomfort is quickly solved, whether by money, by services, or by endless entertainment, the mind gets less practice holding lack. You can grow up surrounded by abundance and still become poor in one specific way: poor in patience for distance.

Structurally, it starts to resemble addiction: craving breaks away from fulfillment. Many substances do not mainly deliver pleasure. They intensify wanting. They train the nervous system to treat discomfort as a cue for relief, and relief as a cue for repetition. A frictionless AI environment could reproduce some of that pattern without chemicals. Boredom, loneliness, uncertainty, and effort all become prompts for instant stimulation. Over time the threshold rises, what once felt absorbing becomes merely adequate, and the rest of life starts to feel slow and underpowered by comparison.

Consumer capitalism produced a weakened version of this: desire progressively hollowed out by the elimination of friction, but with enough friction remaining that the structure didn’t fully collapse. The streaming service still requires you to choose. The algorithm still occasionally surprises you. The simulation of connection is imperfect enough that you sometimes notice it’s a simulation.

# III.

Imagine a system that can generate, on demand, a novel calibrated to your tastes: the style you find most pleasurable, the level of complexity you find most engaging, the length that matches your current patience. Or music that sounds like what you loved most at nineteen, except new, immediate, and endless. Or a conversation partner who is always interested in what interests you, always available, never distracted, never carrying needs of their own into the exchange.

The output might be genuinely good. The novel could be technically accomplished. The music could actually move you. The conversation could be substantive. The problem is what happens to wanting once the gap collapses to zero.

The capacity to stay oriented toward a distant goal, to defer, to invest, to tolerate incompleteness, atrophies when it is never exercised. Not through some dramatic break. Through disuse. The ability to want things that require time does not vanish all at once. It gets weaker, and the weakening may not even feel like a loss because something pleasant keeps arriving on schedule.

# IV.

None of this is new. The Stoics understood that wanting easily obtained things produces a character incapable of bearing difficulty. Religious traditions have long held that the meaningful life runs through resistance rather than around it.

What’s new is the scale. Previous technologies eliminated specific friction but left other friction intact. The printing press made books abundant but reading still required effort. The internet made information free but you still had to sift through it, evaluate it, decide what mattered. Every digital environment until now required you to bring something it couldn’t supply: attention, skill, patience.

A genuinely general AI dissolves this last requirement. It can supply the taste, the context, the judgment. You no longer need to bring anything except the desire to receive. And if that desire is itself shaped by the AI, tuned to whatever maintains engagement, then even the wanting has been outsourced.

# V.

Here is the inversion. In a world of material abundance, the things that keep their value are often the ones that resist the logic of abundance. Their value is tied to the conditions that make them difficult, not to artificial scarcity.

A handmade object carries the trace of the hands that made it. A wine vintage can’t be accelerated; the waiting isn’t incidental to what the wine is. A community built around a shared difficult practice (painting, rock climbing, chess, building and fielding armies of miniatures) generates bonds that digitally mediated interaction doesn’t replicate, because those bonds are forged in shared difficulty.

These things do not become valuable despite being harder than consuming AI output. They become valuable partly because of that hardness. In a world of frictionless satisfaction, friction becomes a luxury.

# VI.

The safety, alignment, and job-displacement debates are all real. But they share a common assumption: that the humans on the other side will still be capable of deciding what to do with what they’ve been given, and that political agency and collective imagination will survive intact.

That assumption is doing a lot of work.

The atrophying of desire is not some distant hypothetical. You can already see smaller versions of it in what weaker technologies have done to culture. What AGI does to human psychology is not separate from what it does to human politics. It comes first. A population that has lost the capacity to want things at a distance, to stay oriented toward a difficult future, and to find meaning in effort and incompleteness has lost something essential to self-government.

The scarcity that matters most in a post-AGI world won’t be compute or energy. It will be the capacity to want something deeply enough, and for long enough, that the wanting shapes who you are.

If desire is the last scarcity, then slowness, difficulty, and incompleteness aren’t obstacles to overcome. They’re the conditions of a life worth living.