Chapter 1

Something Is Happening

The essay that went viral, the compression that is real, and the question of position.


"I want to be very clear about this, the article wasn't meant to scare people."
— Matt Shumer, CNBC, February 2026


A few weeks ago, a friend sent me a link. "You need to read this," he wrote. No context. He works in finance, not tech. I noticed that.

The essay, by Matt Shumer, CEO of OthersideAI, ran 5,000 words. The title was Something Big Is Happening. By the time I read it, 50 million people had. By the time Shumer went on CNBC to clarify what he meant, it was 80 million. The numbers kept moving as the conversation spread into languages he hadn't written it in.

I've been building with AI systems for years. I read the essay and felt two things at once: recognition, and a question the essay didn't answer.

The recognition was immediate. Shumer describes completing full software projects by describing what he wanted, walking away, and returning hours later to working code that required no corrections. I've had that experience. The first time it happened I sat for a while just looking at the screen, not because it was impressive — it was impressive — but because I was doing math in my head about what it meant for how everything else worked.

The question was this: if you see what's happening and you're not scared, what are you?


The essay circulated mostly as a warning. That's the frame most people applied to it: jobs at risk, timelines compressed, preparation required. Shumer himself pushed back on that reading. "The article wasn't meant to scare people," he told CNBC, somewhat exasperated. His actual point, buried under the virality, was about timing. The single biggest advantage you can have right now, he wrote, is simply being early.

Early to understand it. Early to use it. Early to adapt.

That's a different message than the one most people heard. One is about loss. The other is about position.

I've been thinking about why the warning spread so much faster than the opportunity. Part of it is how attention works — threats travel faster than possibilities, always have. But part of it is something more specific to this moment. Most of the people who felt that warning most acutely are knowledge workers. People who became credentialed, who learned a set of cognitive skills over years, who built careers around being the person who could do a particular kind of thinking. The warning landed differently for them than it would for someone who'd spent the last decade building something they owned.

That distinction matters. Not because one group is better than the other — but because the thing being disrupted, and the thing being enabled, are not the same thing for everyone.


Here's what I've been watching: In 2025, 5.2 million new business applications were filed in the United States. That's not a rounding error. The year before that, there were over 29 million businesses operating without a single employee — 84% of all US businesses — generating $1.7 trillion in revenue between them. Full-time self-employment hit a record high.

These numbers moved before the current generation of AI tools existed. They were moving because of something structural: the cost of starting a business was falling. Software infrastructure, payment systems, distribution channels, communication tools — each generation cheaper and more accessible than the last. AI didn't create this trend. It accelerated it to a different order of magnitude.

I think about a guy named Danny Postma. He built HeadshotPro — an AI tool that generates professional headshots from ordinary photos — alone, with no team, to a million dollars in annual recurring revenue in less than a year. He wasn't a category-defining innovator. He saw a specific thing people needed, built the specific thing that solved it, charged a reasonable amount for it, and grew. Seth Kramer did something similar with PDF.ai. One person. One clear problem. Real revenue.

These aren't exceptional stories anymore. They're normal. That's the thing that keeps striking me: at what point does a pattern of exceptions become just the way things work?


I should say what I am, since I've been building a case without introducing myself. I've spent years inside the development of AI systems — building agents, designing architectures, watching enterprise software teams try to integrate capabilities they don't fully understand into processes that were built for a different era. I've watched the tools change fast enough that anything I wrote about specific capabilities six months ago would read as quaint today. I'm not an academic. I'm not a journalist. I build things and I watch what other people build, and I've been doing that long enough that I've developed some opinions about what's actually happening versus what people say is happening.

What's actually happening is a compression. The gap between having an idea and having something real is shrinking. It used to take two engineers and six months to build a working software prototype. Now it takes one person with judgment and two weeks, sometimes less. That compression is unevenly distributed right now — the people who understand how to use the tools are pulling ahead of those who don't — but the window of that unevenness is shorter than most people think.

The question I've been sitting with is not whether this is real. It's real. The question is what kind of position you want to be in when the compression reaches your domain.


Shumer's essay is a good description of what's changing. What it doesn't address — what almost nothing I've read addresses — is the specific shape of the opportunity it's opening. Not the abstract opportunity of "early adopters will benefit." The specific, practical question of what people who aren't currently building something should consider building, how, and why now rather than later.

That's what this book is about.

Not a blueprint. Not a system. Not a framework. I don't think the people who've navigated this transition well got there by following a framework. They got there by seeing something clearly and moving toward it. What I want to do is help you see it clearly — the actual landscape, not the idealized version, not the scary version, not the one that's been filtered through whatever agenda the person writing it was carrying.

Something is happening. Shumer is right about that. What I want to explore is what it looks like from inside it, and what the people who are building rather than watching are actually doing differently.

The pattern, once you see it, is not complicated.

But you have to look at the right things.


Next: The Old Deal — what the employment contract promised, and why it's breaking now.