
Everyone's debating what AI will take. Nobody's talking about what it's already separating. The gap isn't coming. It's compounding right now, quietly, in orders of magnitude, while the world argues about job titles.
I've been living inside the AI world for the last few years.
Every model launch. Every "we raised X million" announcement. Every new framework. Every pricing change. Every tiny detail about who partnered with who and why.
I'm not saying that to flex. I'm saying it because what I'm about to say didn't come from a headline or a think piece. It came slowly. From watching this thing evolve in real time. Day after day. Year after year.
Late last year, it finally clicked.
AI is the biggest inequality machine we've ever built.
I don't mean the research papers from the 80s. I don't mean the machine learning that quietly powered recommendations and fraud detection for years. I mean the kind of AI anyone can open in a tab and use to multiply their output. The ChatGPTs, the Claudes, the Geminis, the "agents" that actually do work with you.
That version of AI is basically three years old.
And in those three years, the gap it's creating is already bigger and faster than anything we've seen before.
Let me make it real.
Imagine two developers.
One is old school. He can build you a clean, functional website in about three weeks of focused work.
The other uses AI.
The same website takes him one week. He leans on a model for boilerplate, tests, copy, refactors. He's three times faster out of the gate.
Then a new coding model drops. It's better at reading entire codebases, better at planning, better at writing end to end.
Now that same dev is doing the website in three days.
The first guy is still at three weeks.
Another upgrade. New agents. Better tools. The second dev is down to one day for the same scope. Then a few hours.
The first developer is still at three weeks.
That's what I mean by inequality. The gap between them isn't growing like 1, 2, 3. It's growing like 1, 3, 9, 27. It's compounding.
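To make the compounding concrete, here's a toy calculation. The numbers are illustrative, lifted straight from the two-developer story above, not measured data: one developer's pace stays flat at three weeks while the other's speed multiplies by roughly 3x with each model generation.

```python
# Toy model of a compounding productivity gap (illustrative numbers only).
# Developer A stays at a fixed pace; developer B gets ~3x faster
# with each new model generation.
baseline_days = 21  # both start here: a website in three weeks

for generation in range(4):
    b_days = baseline_days / (3 ** generation)
    gap = baseline_days / b_days  # how many times faster B is than A
    print(f"gen {generation}: A = {baseline_days} days, "
          f"B = {b_days:g} days, gap = {gap:g}x")
```

The gap column is the 1, 3, 9, 27 from the paragraph above: each generation multiplies the distance instead of adding to it, which is the whole point of calling it compounding rather than growth.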
And they both "do the same job."
People love to say "AI won't take your job, someone using AI will."
They skip a detail.
The person using AI isn't just a bit more efficient. They can be ten times more productive on the same clock. At some point that's not "an edge" anymore. That's a different category of human output.
Now put cost on top of that.
When ChatGPT Plus came out, it was 20 dollars a month. That felt accessible. Annoying for some, but doable for a lot of people in tech.
Fast forward. You now have Pro, Max, Business, Enterprise tiers. Vertical tools. Stacks. It's normal for a serious AI user to be paying 100 to 400 dollars a month across ChatGPT, Claude, Gemini, Perplexity, image tools and APIs.
That's one side of the world.
On the other side, the median income in many countries is still in the low thousands of dollars per year, and there are places where people live on just a few dollars a day. For a lot of the world, 200 dollars a month is not a "tool budget." It's rent. Food. School fees.
So when a new "Pro" tier drops at 200 a month, here's what happens.
A software engineer in San Francisco shrugs. Light work. It's the cost of two nice dinners. He upgrades day one.
A young dev in Lagos or Nairobi or Dhaka opens the pricing page and just closes the tab. That's more than half of a month's income. There's no mental math to even justify it.
Next month a new model drops behind that same high tier. The San Francisco engineer just slides up another level. The dev who couldn't afford the first jump can't afford the second one either.
Again, the distance between them isn't drifting. It's exploding.
Cost is only one part.
The second part is technical know-how.
Every time a new tool lands, it comes with a layer of friction that quietly filters who can use it.
Take something like Claude Code. If you're technical, it's your new receptionist, your pair programmer, your admin assistant, your everything. It sits in your terminal. It works with your editor. It talks to your local files.
If you're not technical, the moment you see "open your terminal and run this command," you mentally tap out. Terminal. Config. API keys. Git. Forget it.
Nobody announces that as a barrier. But it works as one.
So what happens is simple. The technical person adapts at every release. They wire the new thing into their workflow. They go from "I can code faster" to "I can ship whole products alone in days."
The non-technical person hears the same news, opens the same blog post, and then hits the wall at step one.
More releases. More friction. More walls.
With every iteration, the technical person steps up another floor in the building. The non-technical person is still looking for the elevator.
Then there's speed.
Speed is what turns all of this from "unfair" into "terrifying."
In a normal world, you might spend three months learning a tool or a workflow. You get good. You feel confident. You start to rely on it.
Four months later, a new model comes out that does the whole thing with one prompt.
All that learning isn't totally wasted. You still understand the domain. But a huge chunk of your practical advantage just vanished. You're now back at zero, trying to learn the new way.
Now imagine this loop repeating every quarter.
Some people can live like that. They have the time, the energy, the community, and the money to keep relearning the same job every few months.
Most people don't.
Even for me, someone who tracks this space more obsessively than anyone I know in real life, there are parts I've already been left behind on. I look up and realize, oh, there are whole workflows now that I never adopted, and catching up would take weeks.
If I feel that while living in this space every day, what does it look like for someone who is just trying to survive and occasionally hears "you should really learn AI"?
So far this has all been about individuals.
But the deepest inequality sits below that. In the structure.
There are Cursor-style AI cafes in the US now. People sitting in public spaces pairing with AI on their laptops. There are Claude meetups in key cities. Invite-only Discords. "Friends of the lab" gatherings. Accelerator programs for AI founders. Private beta groups.
Inside those rooms, people see demos months before the public. They hear about roadmap directions. They know what's coming next and can position themselves ahead of time.
You can't subscribe your way into that.
It's geography. It's network. It's who you know and where you live.
Even if AI subscriptions were free worldwide, that "ecosystem layer" would still be heavily tilted toward a few cities and a few communities.
And under all of that sits compute.
A tiny number of companies own the data centers, the GPUs, the networking hardware that all of this runs on. If you want to train state-of-the-art models at scale, you're almost always renting power from the same handful of providers.
That means the entire stack is concentrated from the bottom up. The infrastructure. The models. The distribution. The branding. The conferences. The narratives.
By the time AI reaches an "end user," inequality has already been baked in at three or four layers above them.
This is why I call AI the biggest inequality machine we've ever built.
Not because it's evil. Not because someone sat down and designed it that way. But because of how fast it moves, who can actually keep up, and who quietly gets left behind at every step.
Previous technology waves had gaps, but they also had something we don't have here.
They had time.
Electricity, the printing press, the internet, they all rolled out over decades. Whole generations had a chance to catch up, switch careers, move cities, adjust.
Accessible, general-purpose AI has been in people's hands for around three years.
In those three years, we already have people who build a three-week project in a day, people who pay more for AI every month than others earn, people who sit in the rooms where the future is demoed before it's announced, and people who are not even in the building.
The inequality is not hypothetical. It is here. It is compounding. And it is happening in our faces while we argue about whether AI will "take jobs."
None of this has a clean fix. You and I can't redistribute GPUs or rewrite global pricing. We're not going to magically "solve" AI inequality in a thread.
The only thing you control is how you move through it.
That means touching the models yourself. Touching the tools. Letting yourself feel stupid and lost every time something new drops, and doing it anyway. People who study learning and adaptation keep finding the same thing: growth lives at the edge of that discomfort, not inside your old habits.
So if there's a mental model here, it's this: build a habit of getting uncomfortable.
Not once. Repeatedly. Release after release.
It won't erase the inequality machine we've already built. But staying on the sidelines guarantees you live on the wrong side of it.