Chapter 2

The Old Deal

The social contract that traded time for security — and why it is unwinding.


"The implicit promise of the American workforce was always: if you are smart and work hard, you will be fine."
— Ben Thompson, Stratechery, 2024


My father worked for the same company for twenty-three years. He was an engineer. He showed up early, stayed late when asked, learned whatever the company needed him to learn, and in return received something that seemed as natural as gravity: a career. Not a job — a career. A thing with a trajectory. A thing that, if he held up his end, would hold up its end back.

That deal had a structure to it, even if nobody wrote it down. You traded your productive hours and your intellectual loyalty for a predictable income, health insurance, a pension or retirement plan, and the reasonable expectation that if you did your work competently, you would continue to be employed. The company absorbed the market risk. You absorbed the personal risk of dependency — of having no customers, no product, no fallback. But dependency didn't feel like risk while the deal was being honored. It felt like stability.

For most of the twentieth century, this arrangement worked well enough that it became invisible. People didn't think of it as a deal. They thought of it as how things worked.


The deal started breaking before AI showed up. It's important to see that, because the tendency right now is to treat artificial intelligence as the cause of whatever is happening to white-collar employment. AI is an accelerant. The structural cracks were already there.

Start with the numbers that were moving before GPT-4 existed. In the United States, median tenure at a company dropped from 4.6 years in 2014 to 4.1 years by 2022. Layoff waves became cyclical — not tied to recessions anymore, but to quarterly earnings calls and strategic pivots. The phrase "reduction in force" entered everyday language. Companies that were profitable laid people off. Companies that were growing laid people off. The logic wasn't always about survival. It was about optimization.

The gig economy was another signal, though it got misread. When people talked about Uber drivers and TaskRabbit workers, the framing was usually about exploitation — which was sometimes accurate. But the deeper signal was that millions of people were voluntarily choosing arrangements with no job security at all over the traditional employment model. That's not a labor market glitch. That's revealed preference.

Remote work was another crack. The pandemic forced the experiment, and the results were unambiguous: most knowledge workers could do their jobs from anywhere, and many preferred to. But what that experiment also revealed — what companies noticed more quietly — was how much of the office infrastructure, the middle management layer, the coordination overhead, existed to facilitate the physical proximity that turned out to be optional. Once that was visible, it couldn't become invisible again.


Here's what changed with AI, and it's worth being precise about the mechanism.

The old deal rested on an implicit assumption: that cognitive labor — the kind of work knowledge workers do — was expensive because it was scarce. Writing a legal brief took a trained lawyer. Analyzing a market took a trained analyst. Building a software feature took a trained engineer. The training was long, the supply was limited, and that scarcity justified the compensation.

AI didn't eliminate the need for cognitive labor. It eliminated the scarcity.

When a language model can produce a competent first draft of a legal brief in seconds, the economics of legal work don't change gradually. They change categorically. The same thing is happening in software development, financial analysis, research, content creation, customer support, and every other domain where the core work is manipulating information according to learned patterns.

Microsoft's internal data identified 5 million white-collar positions — management analysts, customer service representatives, sales engineers, technical writers — at what they called "extinction risk." Anthropic CEO Dario Amodei has estimated that nearly half of entry-level white-collar jobs in tech, finance, law, and consulting could be replaced or eliminated. In the first five months of 2025, US employers announced 696,309 job cuts — an 80% increase from the prior year. The cuts weren't concentrated in struggling industries. They were concentrated in the kinds of roles where the work product is information.

These numbers describe a specific thing: the deal that knowledge workers relied on — trade cognitive skill for stable employment — is being repriced. Not slowly. Not eventually. Now.


I want to say something about how this feels, because the economics alone don't capture it.

I know people — smart, accomplished, well-credentialed people — who are watching this happen and feeling something they've never felt in their careers: expendability. Not the abstract expendability of "anyone could be laid off." The specific, personal expendability of watching a machine do their job competently. Not perfectly. But competently, and cheaply, and at scale, and without needing health insurance.

That feeling is new. Previous waves of automation threatened manual labor, and knowledge workers could watch from a distance, secure in the belief that their work required something machines couldn't replicate. Creativity. Judgment. Nuance. Context. The things you develop over years of education and experience.

Those words still mean something. But they mean something different when the first 80% of the work — the research, the drafting, the analysis, the synthesis — can be handled by a system that costs fifteen dollars a month. What's left is the last 20%: the judgment, the taste, the relationships, the decisions about what to build in the first place. That 20% is genuinely valuable. But it's not what most people's jobs consisted of.

Most people's jobs consisted of the 80%. And the 80% is what's being repriced.


There's a version of this observation that leads to despair, and I've seen a lot of that version circulating. The "learn to code" advice inverted itself — now the coders are worried. The people who followed every piece of conventional career wisdom feel betrayed, and they're not wrong to feel that way. The deal was real. They held up their end. The deal broke anyway.

But there's another observation sitting right next to that one, and it doesn't get nearly as much attention because it doesn't travel as well as fear does.

The same economic shift that's making employment fragile is making something else newly possible. The tools that let a company replace three analysts with a prompt are the same tools that let one person build something that would have required three analysts to create. The lever works in both directions. Companies are using it to reduce headcount. Individuals can use it to increase capability.

That symmetry is the thing most people miss. The conversation about AI and work has two sides, and almost all the attention is on the side about what's being lost. The other side — what's being enabled — is where the next chapter of this starts.

But before we get there, it's worth sitting with something uncomfortable. The old deal didn't just provide income. It provided identity. Ask someone what they do, and they'll tell you their job title. The deal structured people's days, their social lives, their sense of purpose. Losing that — or watching it become unreliable — isn't just an economic event. It's a psychological one.

The question isn't whether the old deal is breaking. It's breaking. The question is what replaces it.


Next: The Leverage Shift — what AI actually gives individuals, and why it's not what most people think.