If you’re AI-first, you’re already human-last

"We’re an AI-first company now."

The phrase AI-first is everywhere now, passed around in press releases with the same reverence once reserved for “disruption” or “digital transformation.” It sounds reasonable, forward-thinking, strategic even. But what it really means, when you strip away the optics, is that we’ve started removing people from the equation. Humans got tired of being human. That’s the only explanation I’ve got. The way we’re rolling out the red carpet for AI is bordering on self-erasure. Whole departments are being “reimagined.” Roles once filled by people with pensions and Monday headaches are now being handed to clusters of code. The spin is clean; the implications are not.

Duolingo, for example, announced it’s now an “AI-first” company. No big drama, just a casual update: the human contractors who used to create the language lessons? Gone; AI’s got it covered. Klarna’s CEO said they stopped hiring because “AI can already do everything a human can do.” A bold claim, and also a depressing one. And Artisan, a start-up I’d never heard of until they launched a campaign saying “Hire Artisans, not humans,” has built AI agents that write sales emails, book meetings, and do the stuff that entry-level people once relied on to get their foot in the door. But don’t worry. It’s all very “innovative.”

This is what “AI-first” actually means: human-last. No one wants to say it, but I will. The structure makes it clear. AI gets the seat at the table; you get a polite LinkedIn post saying how proud everyone is of your contribution. Or silence, loud, piercing silence. And before anyone says, “But it’s just the low-level stuff,” here’s what’s really happening: a 2024 survey found that 70% of hiring managers believe AI can now do intern-level tasks. Another study showed 30% of companies already replaced workers with AI last year, with more on the way. Entry-level jobs are being wiped out; the middle tier will go next. The top? Well, once your AI can make a deck and present it without a stammer, you’re probably on borrowed time too.

PwC’s report estimated that up to 30% of jobs could be at risk of automation by the mid-2030s, and I think that’s a very conservative view. The World Economic Forum suggests 83 million jobs may disappear by 2027, replaced by 69 million new ones, most of which require advanced tech literacy, access, and a level of career resilience that, let’s be honest, not everyone has. We’re watching a slow, smiling removal of people from the picture. It’s called “automation.” But here’s a more interesting question: even if we pull it off, even if we automate the entire supply chain, the creative departments, the customer support teams, the interns, the assistants, the marketers, and the teachers, who, exactly, are we building all this for?

The consequences of this aren’t theoretical. When people don’t work, they don’t earn. When they don’t earn, they don’t spend. And when they don’t spend, the whole glorious engine of capitalism wheezes to a stop. The economy, for all its complexity, is embarrassingly basic at its core. No income, no demand. No demand, no growth. It turns out, even the most beautifully optimised AI system still can’t buy a cup of coffee, not the way I like it! But this isn’t just about economics, it’s about erosion. When work disappears, so does structure. So does pride. So does that basic human need to be needed. What’s replacing it? An endless scroll of online courses on how to “pivot.” A flood of newsletters telling you to learn prompt engineering before the bots eat your lunch.

It would be laughable if it weren’t so tragic. There’s this persistent myth that AI is somehow doing all this on its own. That it’s rolling in, Terminator-style, and taking over. The truth is far sadder: we are handing over the keys ourselves, cheerfully, even. We’re not being overthrown; we’re surrendering. And why? Because somewhere along the line, we convinced ourselves that faster was always better. That scale was always the goal. Anything that introduced friction, even if that friction was a human being with a point of view, or a bad day, or a sick day, or a burst of creativity, was a problem to be solved. We are calling this progress. We keep dressing it up in metrics and KPIs and strategy decks. But you don’t need a PhD in ethics to see what’s really happening here: we are building systems that function beautifully without us, and then we’re calling it a win.

No one’s stopping to ask the harder question: If AI is first, where do we go? This isn’t a Luddite lament. Technology moves, but there’s a difference between evolving systems and erasing people. When the goal shifts from augmenting human intelligence to extracting it, codifying it, replacing it, making it redundant, then we’re not in a revolution. We’re in a reordering, of power, of value, of purpose.

Here’s the tangent, because every time I write about this, I get a little stuck on one thought.

We’ve got AI that can write your emails, schedule your meetings, generate campaign ideas, even suggest what to buy your kid for their birthday. Except it doesn’t know my kid. It doesn’t know that she likes lip balms, not just any lip balm, and not even a specific brand, but the ones with the right texture, the right smell, the ones she wants me to buy, not her dad, because that matters. And her dad is expected to get her the “main” present, not the small ones, because there’s an unspoken rule to these things. AI doesn’t get that, it doesn’t know the conversations we’ve had about what kind of birthday she wants this year. It can suggest things based on trends or past purchases or predicted preferences, but it doesn’t know her. It doesn’t understand care. It doesn’t understand love. It just knows what’s been said, not what hasn’t. And yet somehow, this is the system we’re trusting to design our future.

There is another way to do this, a more intentional way that doesn’t treat people like deprecated features. We could design systems where AI supports human work rather than replacing it. Where creativity, care, and judgment are not optional. Where the metric isn’t how much faster we can get rid of people, but how much better we can make their time count. This is exactly why Human-Driven AI (HDAI) matters. It’s a fundamental, almost stubborn change in how we choose to build. It puts people at the centre of the process, as the actual drivers of how AI is applied. HDAI insists that intelligence without context, without empathy, without accountability, isn’t intelligence at all. And in a moment like this, that principle is essential.

We need to remember: AI didn’t ask to be first. We made that choice, and we can unmake it. That may be the most human thing we have left to do.
