
You Don't Need AI For That
By Alex R. · 15 April 2026
"You're in a meeting. You have an informed opinion. You've seen this pattern before. And then someone across the table, not a developer, not an architect, not someone who has ever written a line of production code, pulls out a response they got from a prompt and uses it to argue you down. Suddenly everyone's a system architect. Everyone's a senior engineer. Because they asked."
At some point in the last two years, a lot of development teams stopped thinking and started prompting. Not for the hard stuff. For everything.
And it looks like productivity. It really does. Until you realise nobody in the room actually has an opinion anymore. They just have outputs.
Most of what people are prompting AI for, they already know. Or they should. Or, and this is the part nobody wants to say out loud, they should be working it out with the actual humans they share an office with. Because that process? That messy, slow, sometimes frustrating process of figuring things out together? That's not a waste of time. That's the whole point.
When you replace it with a prompt, you don't just skip the thinking. You skip everything that matters. The debate. The pushback. The person who says "we tried that, it was a disaster, here's why." The moment where two people disagree, dig in, and both walk away sharper. All of it gone. Replaced by a confident-sounding paragraph that has no idea what it's talking about in your specific situation.
And here's the thing about AI answers: they're not wrong, exactly. They're just not yours. They're not specific to your project, your client, your team, or the weird legacy decision made three years ago that everyone has quietly agreed to never touch. They're the average answer for the average situation. And if your situation were average, you probably wouldn't need to be asking.
The trust issue is what really kills me, though. When every question goes through a model first, you quietly stop relying on each other. Not in one big moment; just gradually, until the habit of turning to a colleague starts to feel like extra steps. Why ask Sarah when Claude answers faster and won't judge you? Why have the conversation when you can skip straight to the conclusion?
But Sarah has context. Sarah was in the room when the client changed their mind three times. Sarah pushed back on the last "simple" feature that turned into a six-week spiral. Sarah has actual opinions, formed from actual experience, about what works and what doesn't in this specific place. That's not replaceable. That's not something you can prompt your way to. And the moment you stop asking Sarah, you lose access to all of it, and she eventually stops offering.
Which brings up the part that really doesn't get talked about. What happens to the developer with ten, fifteen years of experience when their judgment gets routinely overruled by a chatbot? Not argued down. Not challenged with a better idea. Just quietly dismissed because the AI said something different. That's a special kind of demoralising. The kind that makes people shut up, do the work, and stop caring.
But there's something even more absurd happening. And if you've been in this industry long enough, you've probably already run into it.
You're in a meeting. Someone raises a technical concern: architecture, infrastructure, a decision with real long-term consequences. You have an informed opinion. You've seen this pattern before. You know the tradeoffs. And then someone across the table, not a developer, not an architect, not someone who has ever written a line of production code, pulls out a response they got from a prompt and uses it to argue you down.
Suddenly everyone's a system architect. Everyone's a senior engineer. Because they asked.
And now you're in the strange position of having to explain to a non-technical person armed with a chatbot why your years of actual experience lead you to a different conclusion than a language model that has never shipped anything. How do you even have that argument? Who are you actually debating: the person in front of you, or the bot they're ventriloquising? Because it doesn't feel like a conversation anymore. It feels like being fact-checked by a Magic 8-Ball.
And honestly, is it even worth it? That's the question nobody wants to ask. Because if every technical decision can be settled by whoever has the better prompt, what exactly is the point of expertise? Why bother developing deep knowledge over years if it can be casually overruled in thirty seconds by someone who has never had to live with the consequences of a bad architectural call?
Here's what nobody promoting their AI workflow will tell you: we all have access to the same tools. Every single one of us. There is no competitive edge in having a ChatGPT account. There is no secret power in knowing how to write a prompt. The person sitting next to you has it. The client has it. The intern has it. The guy who just started last Monday and has never deployed anything in his life? He has it too.
So no, having AI at your disposal does not make you a genius. It does not make you an architect. It does not make your opinion more valid, your decisions more sound, or your experience more real. It makes you someone who can retrieve information quickly. That's it. That's the whole trick.
And here's the part that should really sting: we are genuinely getting dumber. Not dramatically, not all at once, but every time you outsource a thought you could have had yourself, that muscle gets a little weaker. Every time you skip the struggle of figuring something out because the answer is one prompt away, you lose a tiny piece of the process that actually builds expertise. Do that enough times, across enough teams, over enough years, and you end up with an industry full of people who are very confident and increasingly incapable.
And the productivity promise? Let's talk about that. Because the whole pitch was that AI would free us up. That we'd work less, think more, do the meaningful stuff. How's that going? Because from where most developers are standing, we're working more than ever. More output expected, more speed demanded, more done with less because the tools should be handling it, right? The tools are handling it. And somehow the to-do list just keeps growing.
AI is not a silver bullet. It is not the answer to everything. It is a tool: a useful one, in the right hands, for the right job. But a tool used without judgment is just noise. And a team that has replaced its collective brain with a subscription plan isn't more productive. It's just more confident about being wrong.
Use AI. But use your head first. Because that's the one thing it actually can't replace, and the one thing too many people have quietly stopped bothering with.