[Illustration: workers from successive eras, a medieval cobbler, a textile weaver, an assembly-line worker, a 1950s office worker, a modern software engineer, stand on a long conveyor belt, each more ghostlike than the last. At the far end, only a blinking cursor in darkness. A shadowy figure watches from above, present in every era.]

The Replaceable Human

How Capital Has Always Tried to Own Your Skills

Right now, capital is making the largest coordinated investment in a single technology in human history.

We’re talking trillions of dollars. More than the Manhattan Project. More than the Apollo Program. More than the Interstate Highway System. Adjusted for inflation, the money being poured into AI development dwarfs every major coordinated capital investment humanity has ever attempted.

The question nobody is asking loudly enough is: why?

Not “what is AI?” Not “what can it do?” But why are the people who control the world’s wealth willing to spend more money on this technology than they’ve spent on anything else in history?

The answer they give you is progress. Innovation. The next frontier of human achievement.

The real answer is older than computers. Older than factories. Older than capitalism itself.

It’s control.

Specifically: control over labour. Control over the people whose skills, knowledge, and expertise capital has always needed but never wanted to depend on.

To understand why trillions are being spent on AI right now, you don’t need to understand machine learning or transformer models or context windows. You need to understand the history of a word most people haven’t thought about since a high school history class.

You need to understand what it meant to be a vagrant.

The Vagrant: When Freedom Became a Crime

Before capitalism, most people in England didn’t work for anyone.

They worked for themselves.

Common land (fields, forests, rivers) was shared by the communities that lived alongside it. Peasants grew their own food, grazed their own animals, cut their own timber. Craftsmen owned their tools, their workshops, their relationships with their customers. A cobbler didn’t work for a shoe company. He was the shoe company. His skills, his reputation, his craft had been passed down through generations and belonged entirely to him.

This wasn’t prosperity by modern standards. But it was independence. The means of survival were in your own hands.

Capital couldn’t function in a world like this.

If people can feed themselves, shelter themselves, clothe themselves through their own land and their own craft, they don’t need to work for you. You can’t build a factory, a mine, a plantation if the people you need to staff it have a viable alternative. Why would anyone accept dangerous, degrading, poorly paid wage labour when they could work their own land instead?

The answer was to eliminate the alternative.

Between the 15th and 19th centuries, the Enclosure Acts systematically transferred common land from communities to private owners. Fields that families had farmed for generations were fenced off overnight. Forests that had provided timber and food became private property. The commons, the shared foundation of rural self-sufficiency, were privatised out of existence.

Millions of people who had never needed to work for anyone suddenly had nothing. No land. No commons. No alternative. Just their labour, which they now had no choice but to sell.

And then, having engineered this mass dispossession, the ruling class turned around and called the dispossessed criminals.

The Vagrancy Acts made it illegal to be without employment or a fixed home. People who wandered looking for work, people who had been deliberately stripped of any alternative to wandering, were classified as vagrants. Arrested. Whipped. Branded. Conscripted. Imprisoned.

The cobbler who had owned his shop for generations, whose grandfather had owned it before him, who had been forced to surrender his independence and work for someone else: he wasn’t a victim of economic violence. He was a potential vagrant. A threat to public order. The lowest of the low.

This is how capital writes history. It engineers the conditions that force people into dependency, then criminalises the people it has made dependent.

The vagrant wasn’t a lazy criminal who refused to contribute to society. The vagrant was a free person who had been robbed of everything that made freedom possible and then punished for the consequences.

The Luddite: Capital’s First Broken Promise

The word “Luddite” has become an insult. A technophobe. Someone who fears progress. Someone who wants to stop the march of innovation because they’re too stupid or too scared to adapt.

This is one of the most successful historical rewrites capital has ever pulled off.

The Luddites weren’t afraid of technology. They were responding to betrayal.

These were skilled craftsmen: weavers, textile workers, framework knitters. They had already made the sacrifice the vagrant had been forced to make. They had already surrendered the independence of owning their own means of production. They had already accepted the terms of wage labour. They showed up. They gave their skills, their time, their expertise to someone else’s enterprise.

Capital made them an implicit promise: we need you. Your skills have value. Work for us and you’ll have stability, income, a place in the new order we’re building.

Then the machines arrived.

Not machines that worked alongside them or made their jobs easier. Machines specifically designed to replace them. To take the complex, skilled craft that a trained weaver had spent years mastering and reduce it to something a child could operate for a fraction of the wage.

The Luddites didn’t smash machines because they hated technology. They smashed machines because they understood exactly what was happening. Capital had extracted their labour for years, had benefited from their skills and expertise, had allowed them to abandon every other path their lives might have taken and then discarded them the moment a cheaper alternative appeared.

They weren’t technophobes. They were people who had been lied to, used, and thrown away. And they were angry about it.

The British government’s response was telling. At the height of the Luddite uprisings, more British soldiers were deployed against textile workers in the English Midlands than were deployed against Napoleon in the Iberian Peninsula. Frame breaking (destroying the machines) was made a capital offence. Seventeen Luddites were hanged. Many more were transported to Australia.

The message was clear: capital’s right to replace you is protected by the full force of the state. Your right to resist is not.

Sound familiar?

The Luddite story isn’t a historical curiosity. It’s a template. It’s what happens every time capital makes a promise to labour, extracts what it needs, and then discards the people it no longer requires.

The craftsman became the vagrant. The skilled weaver became the Luddite. The pattern was set.

But capital still had a problem.

Even in the new industrial order, some workers still had knowledge that couldn’t easily be replaced. And knowledge, it turned out, was a form of power that would take capital another century to figure out how to break.

Breaking that power would take a new science of its own.

Taylorism: The Science of Making You Replaceable

By the late 19th century, capital had a problem it couldn’t solve with enclosure acts or soldiers.

Skilled workers had power.

Not political power; the ruling class had that locked down. But economic power. The kind that came from knowing things. From being the only person in the factory who understood how the machine worked, why it kept breaking down, how to get the most out of it. From being the foreman who knew every worker’s strengths and could organise a floor efficiently. From being the engineer who held the institutional knowledge of a decade of production problems and solutions in their head.

This knowledge couldn’t be fenced off like common land. It couldn’t be smashed like a loom. It lived inside people. And people, it turned out, knew they had it.

When you can’t be replaced, you have leverage. You can negotiate wages. You can withdraw your labour through strikes and actually hurt the business. You can demand better conditions because the cost of losing you is real and immediate. Internal promotion was the norm not because capital was generous but because it was cheaper to reward the person who already knew everything than to find and train someone new.

Capital was still dependent on labour in a way that made it deeply uncomfortable.

Enter Frederick Winslow Taylor.

Taylor was an American engineer who in the 1880s developed what he called “scientific management.” The idea was simple and revolutionary: stop treating work as a craft and start treating it as a system.

Watch every worker. Time every movement. Break every job down into its smallest possible components. Identify the single most efficient way to perform each component. Then train workers to do only that one thing, repeatedly, as fast as possible.

The genius of Taylorism wasn’t efficiency. Efficiency was the sales pitch.

The genius was replaceability.

If your job is to tighten bolt seven on an assembly line, you don’t need to understand how the engine works. You don’t need to know what comes before or after you. You don’t need institutional knowledge, years of experience, or any skill beyond the single motion you perform hundreds of times a day.

And if you can be trained to do that in a week, so can anyone else.

Suddenly the leverage was gone. Strike? Fine. You’ll be replaced by lunchtime. Demand a pay rise? There are ten people outside who’ll do the same job for less. The knowledge that had given workers power, the craft, the expertise, the institutional memory, had been systematically extracted, documented, and transferred into the process itself.

Taylor didn’t just change how factories worked. He changed the fundamental power relationship between capital and labour.

Work was no longer something you knew how to do. It was something you were instructed to do. The knowledge lived in the system, not in you. Systems don’t go on strike.

Taylorism spread from factories to offices. Every complex role was broken down, documented, systematised. Accounting. Administration. Customer service. Management itself was Taylorised – reduced to measurable metrics, standardised processes, replaceable parts.

For most of the 20th century, capital had finally achieved what the enclosure acts had started: a workforce that was dependent, replaceable, and largely powerless.

Until computers arrived. And with them, an entirely new class of worker that capital hadn’t anticipated.

The IT Exception: The Accidental Craftsmen

Capital loved information technology at first.

And why wouldn’t it? The entire value proposition of early enterprise software was replacing other workers. Automate the payroll department. Simplify HR. Digitise the filing system. Put the travel bookings online. Replace the typing pool with word processors. Replace the switchboard with automated routing.

IT was Taylorism’s most powerful tool yet. Not just breaking jobs into smaller components but eliminating them entirely. Every new system deployed meant fewer workers needed. Every process automated meant headcount reduced while capacity expanded.

Capital was thrilled. IT budgets were investments that paid for themselves in redundancies.

But something unexpected happened.

The people building and maintaining these systems started accumulating something capital thought it had permanently destroyed: irreplaceable knowledge.

Not just technical knowledge though that was part of it. Domain knowledge. The understanding of why the system was built the way it was. The memory of the decision made in 2003 that created the dependency nobody documented. The awareness that if you touch this part of the codebase, three other things break in ways that aren’t obvious until production goes down at 2am on a Friday. The relationship with the business stakeholders that meant you understood not just what the software did but what the business actually needed.

This knowledge lived in people. Again.

And it couldn’t be Taylorised. Not easily. Because software wasn’t a series of repeatable physical motions that could be timed and optimised. It was a creative, problem-solving discipline that required understanding context, history, and consequence. The cobbler had returned. Except now he was wearing a hoodie and drinking cold brew.

Salaries exploded. A good senior software engineer with deep domain knowledge of a complex system became one of the most expensive employees in any organisation. Not because capital wanted to pay them well but because the cost of losing them was catastrophic.

Capital tried the obvious solution: offshore it.

If Australian and American software engineers were expensive, surely Indian and Eastern European engineers were cheaper? The labour arbitrage seemed obvious. Same skills, fraction of the cost.

It worked. Partially. For a while. For certain things.

New greenfield projects with clean requirements and no legacy complexity could be built offshore reasonably well. Routine maintenance, bug fixing, well-documented features: these could be distributed across time zones with acceptable results.

But the domain knowledge problem remained stubbornly unsolvable.

You can’t email institutional memory. You can’t document what you don’t know you know. The senior engineer who had been with the company for twelve years, who could tell you exactly why the payment processing system had that bizarre exception for transactions over $50,000 on the last day of the financial quarter: that person couldn’t be replaced by a cheaper equivalent in Bangalore, because the knowledge that made them valuable wasn’t in any document. It was in their head.

Offshoring created its own costs: communication overhead, timezone friction, cultural misalignment, quality issues, the eventual expensive repatriation of critical systems after things went wrong. Many companies that aggressively offshored in the 2000s and 2010s spent the 2020s quietly rebuilding local teams.

The leverage remained.

Software engineers had stumbled into the same position as the skilled weavers before the power loom, the same position as the experienced foremen before Taylor arrived with his stopwatch. They had knowledge that capital needed and couldn’t easily replace.

Capital noticed. And this time, it decided to solve the problem properly.

AI: The Final Taylorism

Now we’re back to the question we started with.

Why is capital spending more on AI than any coordinated investment in human history? Why are the people who control the world’s wealth willing to pour trillions into a technology that may or may not deliver on its promises?

Because they’ve been here before. And this time they want to finish it.

The craftsman had to be dispossessed through enclosure. The skilled weaver had to be replaced through mechanisation. The factory worker had to be broken down through Taylorism. The office worker had to be systematised through enterprise software. Each time, capital found a way to take knowledge that lived in people and transfer it somewhere capital could own and control.

But software engineers resisted that process. Not through strikes or sabotage (though those remain theoretically available) but simply through the nature of what they do. Complex, creative, context-dependent problem solving that resisted systematisation. Domain knowledge that couldn’t be documented, offshored, or broken into repeatable components.

Until now. Or at least, that’s the bet.

The AI investment thesis, stripped of all the marketing language about innovation and human flourishing, is this: replace the complex skill of software engineering with a simple English language interface. Take the last category of workers with genuine leverage and make them replaceable.

Think about what that actually means.

Right now, if you want to build software, you need people who understand programming languages, system architecture, security, performance, databases, APIs, deployment pipelines, and a dozen other deeply technical disciplines. You need people who can hold complexity in their heads, reason about consequences, and make judgment calls that require years of accumulated experience.

That’s leverage. That’s the cobbler owning his shop. That’s the weaver whose craft took years to master.

Capital’s dream is a world where instead of needing that person, you just… describe what you want. In plain English. And the AI builds it. Where the domain knowledge that made software engineers expensive and irreplaceable gets absorbed into a model that capital owns, controls, and can scale infinitely without paying salaries, superannuation, or sick leave.

This is Taylorism applied to cognitive work. Break the complex skill into its simplest possible component (describing a requirement in natural language) and make everyone else in the chain replaceable.

The Manhattan Project built one bomb. The Apollo Program went to the moon. The Interstate Highway System connected a continent. All of those were enormous, historically significant investments.

The AI investment is larger than all of them combined, adjusted for inflation.

That doesn’t happen because technologists are excited. That happens because the people who control capital have looked at the last category of workers with real leverage and decided this is the moment to break it.

Whether it works depends on one thing: whether AI keeps improving.

And here’s what should concern every software engineer reading this: there are no signs of it slowing down. Every benchmark, every model release, every capability threshold crossed ahead of schedule points in the same direction. The trajectory isn’t flattening. If anything, the next phase, recursive self-improvement, in which AI systems assist in training and improving the next generation of AI systems, threatens to make the current pace of development look glacial.

Capital has been trying to make labour replaceable since the enclosure acts. It got there with physical labour. It got there with routine cognitive labour. It is now making the largest single bet in economic history that it can get there with complex cognitive labour too.

The vagrant was told their freedom was criminality. The Luddite was told their resistance was ignorance. The software engineer is being told this is progress.

The framing changes. The pattern doesn’t.

The Last Replaceable Human

Every generation of workers has believed their skills were irreplaceable.

The craftsman who had spent a lifetime mastering his trade couldn’t imagine a machine doing what his hands knew how to do. The skilled weaver had such intimate knowledge of tension, thread, and pattern that mechanisation seemed impossible. The factory foreman whose floor management came from decades of experience couldn’t conceive of a clipboard and a stopwatch capturing what he knew. The typist, the bookkeeper, the travel agent, the bank teller… all of them, at some point, were the last line of complexity that capital couldn’t automate.

None of them were.

And now we’re here. Software engineers, the accidental craftsmen of the digital age, the last workers to stumble into genuine leverage, watching capital make the largest bet in economic history that we’re next.

Maybe it doesn’t work. Maybe AI hits limits we can’t currently see. Maybe the domain knowledge problem proves as stubborn for machines as it did for offshore teams. Maybe the gap between “generate some code that looks right” and “build systems that work reliably at scale in complex real-world environments” turns out to be uncrossable without human judgment.

Maybe we get a reprieve.

But here’s what five centuries of this history tells us: a reprieve isn’t a solution. If AI can’t replace software engineers today, capital will keep investing until something can. The goal doesn’t change. Only the timeline does.

And this time, the stakes are different in a way they’ve never been before.

Every previous wave of labour replacement had somewhere for the displaced workers to go. The dispossessed peasant became the factory worker. The replaced weaver became the machinist. The automated typist became the data entry clerk. The offshored manufacturer became the service worker. Each time, new categories of work emerged to absorb the people capital no longer needed in the old ones.

Those new categories emerged because the automation was always partial. It replaced physical labour but not cognitive labour. It replaced routine thinking but not complex thinking. There was always a next rung on the ladder that humans could climb while machines took the rung below.

AI doesn’t automate a rung. It automates the ladder.

Physical labour, routine cognitive labour, complex cognitive labour, creative work, judgment, pattern recognition, language, reasoning, the same technology is making inroads into all of it simultaneously. There is no obvious next category. No clear rung above that humans can retreat to while we figure out what comes next.

This is why the question that has always sat underneath this entire history, the one that was manageable when displacement was partial and gradual, now demands an urgent answer.

What happens to people when capital no longer needs their labour at all?

Not some people. Not workers in one industry or one skill category. All of them. What happens to a system built entirely on the premise that people survive by selling their labour, when the largest coordinated capital investment in human history is explicitly designed to make that labour unnecessary?

The vagrant was created by enclosure and told they were a criminal. The Luddite was created by mechanisation and told they were a technophobe. The displaced worker of the coming decade will be created by AI and told they need to retrain, upskill, adapt, embrace the future.

But retrain into what? Upskill toward what? Adapt into a labour market where the same technology that displaced you is also displacing every category you might have moved into?

We have been here before, in pieces. We have never been here in full.

The enclosure acts stole the commons and called it progress. The power loom broke the craftsman and called it innovation. Taylorism destroyed worker leverage and called it efficiency. Offshoring gutted local industries and called it globalisation.

And now capital is spending more money than it has ever spent on anything, on a technology explicitly designed to make the last skilled workers replaceable, and calling it the next frontier of human achievement.

The framing changes. The pattern doesn’t. And this time, there may be no next rung to climb to.

The question isn’t whether you’ll be replaced.

The question is what kind of world we’re going to build for the people who are.