I'm tired of the discourse. "AI will replace designers." "AI will replace writers." "AI will replace developers." Every week there's a new profession on the chopping block, a new LinkedIn thought leader who automated their entire job in an afternoon and wants you to know about it. The takes cycle between breathless excitement and existential dread, and honestly, I find the whole thing exhausting.

But here's what bugs me: the people saying "don't worry, AI won't replace you" are also missing the point. Not because AI isn't powerful — it absolutely is, and it's getting better at a pace that should make anyone pay attention. The real risk isn't that a machine takes your job overnight. The real risk is subtler, slower, and much harder to notice. It's atrophy.

The threat isn't replacement. It's the slow, comfortable erosion of the skills you stopped practicing because a machine was doing them for you.

GPS Killed Your Sense of Direction

Think about what GPS did to navigation. Twenty years ago, you had to actually know where you were going. You built mental maps of your city. You recognized landmarks, memorized highway exits, developed an intuition for north versus south. You could feel when a route was wrong before you even looked at a map.

Now? You follow the blue line. You arrive at your destination perfectly fine. But try to retrace the route without your phone and you'll stare blankly at the intersection like you've never been there before — even though you've driven it fifty times.

GPS didn't make you a worse driver. It made you a worse navigator. And you didn't notice because you were still getting where you needed to go, right on time, every time. The capability atrophied in the background while the outcome stayed the same.

AI is doing the same thing to knowledge work. Right now. To you. To me.

Understanding vs. Producing

I use AI every day. I build products with Claude Code, I brainstorm with it, I use it to debug problems I'd otherwise spend hours on. It has made me dramatically more productive. And I've started to notice a pattern that worries me.

When I first started building with AI, I read every line of code it generated. I asked it to explain decisions I didn't understand. I looked up concepts, challenged its suggestions, pushed back when something felt wrong. I was slow, but I was learning. The AI was a teacher as much as a tool.

Six months later, I caught myself copying entire files into my project without reading them. They worked. Why bother understanding them? Ship it, move on, next task.

That's the trap. "It works" is the lowest possible bar. Understanding why it works is what lets you debug it when it breaks. Knowing what it does is what gives you the judgment to decide what to build next. Without that understanding, you're not building — you're assembling. And assemblers are the first to be automated.

Vibe Coding: A Cautionary Tale

There's a term floating around developer circles: "vibe coding." The idea is simple — describe what you want to an AI, it writes the code, you ship it. Don't read the code. Don't understand it. Don't even look at it too hard. Just vibe with it.

I get the appeal. I really do. I'm a non-technical founder who ships products through conversation with an AI. The last person who should be gatekeeping is me. But there's a critical difference between using AI as a lever and using AI as a blindfold.

When something breaks at 2am — and it will — the vibe coder pastes the error back into the AI and prays. The person who read the code, who understood the architecture, who asked "why" along the way? They have at least a rough idea of where to look. They might be slower, but they're not helpless.

Don't Panic

Using AI doesn't make you lazy. Refusing to understand what it produces does. There's a vast middle ground between "write everything from scratch" and "accept everything blindly." The whole point is to find that middle ground and live there deliberately.

The Sweet Spot: Amplify, Don't Abdicate

I think of AI like a guitar amplifier. Plug in a skilled guitarist and the amp makes them incredible — louder, richer, more powerful. Plug in someone who can't play and the amp just makes the bad playing louder. The amplifier doesn't create the music. It amplifies what's already there.

The sweet spot is using AI to amplify your skills while staying in the driver's seat. That means a few concrete things:

  • Read what it produces. Every time. Not every line of every file, but enough to understand the approach, the tradeoffs, and the assumptions.
  • Do the hard version sometimes. Write the first draft yourself before asking AI to improve it. Sketch the layout by hand before generating it. Keep the muscle alive.
  • Stay curious about the "why." AI is great at the "what" and the "how." The "why" is still yours. If you can't explain why something works, you don't really understand it.
  • Build your taste. AI can generate a thousand options. Knowing which one is actually good — that's taste. Taste comes from reps, from paying attention, from doing the work yourself enough to recognize quality.

The future belongs to people who can use AI and still think for themselves. Who have taste, judgment, and the hard-won understanding that comes from doing the work — not just delegating it.

The Uncomfortable Truth

The people who will thrive aren't the ones who can use AI. Everyone will be able to use AI — that's the whole point of it. The people who will thrive are the ones who stayed sharp while everyone else got comfortable. The ones who used AI to go faster without forgetting how to walk.

AI won't replace you. But if you let it do all your thinking, you might just replace yourself — one outsourced decision at a time, until the day someone asks you to work without it and you realize you've forgotten how.

Stay in the driver's seat. The machine is an incredible engine. But you still need to know where you're going.