Last week my team Slack blew up. Someone dropped a link to the NYT’s “Coding After Coders” piece — the one arguing that AI is rendering programmers obsolete — and suddenly everyone had opinions. One of my senior engineers called it “tech journalism cosplay.” A junior dev sent a string of 😬 emojis and nothing else.

I read the piece twice. Then I went back to my actual week: three architecture review meetings, a gnarly production incident at 2 AM, and seventeen code reviews. And I thought: these journalists talked to the right people about the wrong things.

Here’s my honest breakdown.

What the NYT Got Right

Let me start by giving credit where it’s due — because the piece isn’t wrong, it’s incomplete.

Boilerplate is genuinely dying. Not one engineer on my team writes CRUD endpoints from scratch anymore. Unit tests for straightforward logic? AI. Basic data transformations? AI. Scaffolding a new service? AI gets you 80% there in minutes. This is real, it’s happening, and pretending otherwise is cope.
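To be concrete about what I mean by boilerplate: the sketch below — a hypothetical in-memory CRUD store, names and shape entirely illustrative, not from any real service of ours — is exactly the kind of code nobody on my team types out by hand anymore.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

# Illustrative boilerplate: a generic in-memory CRUD store.
# This is the layer AI tools now generate in minutes.
@dataclass
class CrudStore:
    _items: Dict[int, dict] = field(default_factory=dict)
    _next_id: int = 1

    def create(self, data: dict) -> int:
        """Store a record and return its new id."""
        item_id = self._next_id
        self._items[item_id] = data
        self._next_id += 1
        return item_id

    def read(self, item_id: int) -> Optional[dict]:
        """Return the record, or None if it doesn't exist."""
        return self._items.get(item_id)

    def update(self, item_id: int, data: dict) -> bool:
        """Replace an existing record; False if the id is unknown."""
        if item_id not in self._items:
            return False
        self._items[item_id] = data
        return True

    def delete(self, item_id: int) -> bool:
        """Remove a record; False if the id is unknown."""
        return self._items.pop(item_id, None) is not None
```

Mechanical, predictable, interchangeable — which is precisely why it was never the valuable part of the job.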

Entry-level ticket work is shrinking. The “write this simple utility function” or “fix this obvious type error” tasks that used to be good training grounds for juniors — those are increasingly handled by AI before a junior ever sees them. I’ve watched this happen on my own team over the past 18 months. The ramp-up is different now.

Something is genuinely changing. The shape of programming work is shifting. Less time on implementation, more time on what to implement and why. If you’re not noticing this, you might not be paying attention.

What They Completely Missed

Here’s where the piece falls apart: they interviewed product managers, startup founders, and AI researchers. What they didn’t do — or, if they did, didn’t weight heavily enough — was talk to people actually running engineering teams shipping production software at scale.

Because the moment you zoom out from “write me a function” to “keep a complex distributed system alive and evolving,” the picture looks completely different.

Architecture is not a prompt. Last month we had to decide how to handle state synchronization between multiple services with different latency requirements and partially overlapping ownership. I spent two days whiteboarding with two engineers, going back and forth on trade-offs. I used Claude to think through edge cases. But the actual decision — the one that will shape how this system grows for the next two years — came from understanding our specific constraints, our team’s strengths, our company’s priorities, and a dozen historical context points that no AI has access to.

Debugging production is detective work. I’ve found AI useful for debugging — I’ve written about it. But a 2 AM incident last week involved multiple services, several teams, conflicting metrics, and a race condition that only manifested under specific load patterns. You know what helped most? The engineer who had built that service 18 months ago and remembered a specific edge case. That kind of institutional knowledge doesn’t prompt well.

Cross-team coordination is entirely human. Getting two teams to agree on an API contract when they have different timelines and priorities? Getting a platform team to actually prioritize your migration? This is politics, trust, and relationship management. AI doesn’t attend your planning meetings.

The Work AI Cannot Do (Yet)

To be concrete about where AI actually is right now — as opposed to where it might be in five years — here’s a real comparison from my week:

| Task | AI performance | Why |
| --- | --- | --- |
| Write a parser for a known data format | Excellent | Well-defined, bounded problem |
| Generate tests for a pure function | Excellent | Clear input/output, no hidden state |
| Review a PR for obvious bugs | Good | Pattern matching on known anti-patterns |
| Design a new service’s data model | Mixed | Needs business context AI doesn’t have |
| Decide whether to refactor vs. rewrite | Poor | Requires org context, team capacity, risk tolerance |
| Debug a race condition across services | Poor | Needs real-time data, system history, intuition |
| Convince a stakeholder to delay a feature | Not applicable | Relationships, trust, organizational dynamics |

The NYT piece blurs the line between “AI can generate code” and “AI can do engineering.” Those are not the same thing.
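To make the first two table rows concrete, here’s the shape of task AI handles well: a small pure function and the kind of tests a model can generate for it. The function and values are hypothetical, mine rather than the NYT’s — the point is that the problem is fully bounded by its inputs and outputs.

```python
def parse_semver(version: str) -> tuple:
    """Parse a 'MAJOR.MINOR.PATCH' version string into a tuple of ints."""
    parts = version.strip().split(".")
    if len(parts) != 3:
        raise ValueError(f"expected MAJOR.MINOR.PATCH, got {version!r}")
    # int() raises ValueError on non-numeric parts, which is what we want.
    return tuple(int(p) for p in parts)

# Tests like these are trivially generatable: clear inputs,
# clear outputs, no hidden state, no organizational context.
assert parse_semver("1.4.2") == (1, 4, 2)
assert parse_semver(" 10.0.1 ") == (10, 0, 1)
try:
    parse_semver("1.4")
except ValueError:
    pass
else:
    raise AssertionError("malformed version should raise")
```

Now try writing an `assert` for “was splitting this service the right call?” That’s the gap between the top and bottom halves of the table.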

What This Means for Your Career

If you’re reading this defensively, I get it. But defensive is the wrong posture.

The engineers on my team who are thriving are not the ones who write the most code. They’re the ones who make the best decisions. They ask “should we build this at all?” before “how do we build this?” They can hold context across systems and surface the non-obvious trade-off. They know when to push back on a product requirement.

If your entire value-add is writing syntactically correct code quickly — yes, that is under pressure. But if you’ve been treating coding as a means to solving problems, you’re in a better position than you might feel right now.

The engineers adapting fastest are the ones using AI as a force multiplier — check out how AI is changing development workflows for a more tactical look at this.

The Uncomfortable Truth About Junior Roles

I want to be honest here because I think a lot of senior people are being evasive about this.

Yes, the junior developer path is getting harder. The “learn by doing simple tasks” pipeline is partially broken because AI does those simple tasks now. I’m genuinely uncertain how to run a good junior program in this environment. We’re figuring it out.

What I’ve landed on: junior developers need to be closer to the decision-making process faster. Less “implement this ticket,” more “sit in this architecture discussion and tell me what you learned.” It’s a different kind of apprenticeship.

If you’re a junior developer: the answer is not to avoid AI. It’s to use it to accelerate your own learning while intentionally building the judgment that AI cannot replace.

My Actual Take

Programming isn’t dead. But “programming” is being redefined in real time, and the NYT piece captures a real change while missing the iceberg below the surface.

The code-writing part of engineering is becoming table stakes — necessary but not sufficient. The thinking, deciding, designing, and coordinating parts are becoming more central, not less.

The engineers who will struggle are the ones who’ve optimized for the part AI is eating. The ones who will thrive are the ones who’ve always understood that writing code is just how you express an idea — and that having good ideas, understanding complex systems, and working well with other humans is the actual job.

I’ve been using AI tools in my workflow for over a year — from code review to pair programming to sprint planning. The tools are genuinely powerful. They’ve changed how my team works. They haven’t changed what we’re ultimately responsible for.

The NYT got the headline. They missed the story.


Want something more tactical? My piece on AI tools for tech leads in 2026 covers the specific workflows my team has adopted.