As AI-driven existential dread spreads among software developers across the world, I’m watching reactions ranging from the shit-eating grins of those convinced that we must repent for tech-bros’ arrogance, through the crazy-eyed prophecies of a new golden age, to the smug assertions that the phrase “software developer/engineer” will at last regain its true meaning. Because most of us just glue stuff together and are nothing like the giants of prior glorious ages.
While the first two are easy to ignore—what do I care about misplaced envy and madness—the last one got me thinking about how it came to be that most of us probably fall into this less-than-respectable category. How come the majority of programmers are actually fitting premade ugly pieces into even uglier creations, with the only sexy part left being—sometimes—the paycheck? Not building elegant solutions to complicated problems and squeezing additional cycles from red-hot silicon.
The thing is, those lobotomized children of the software industry are its own creation. It is poorly understood by the wider public that startup-world rockstars, FAANG employees who memorized half of the LeetCode exercises, child prodigies with multiple PhDs, giants of OSS, and nestors of architecture are in fact a high-profile but very small minority of the whole landscape. Most of us are doing boring stuff quietly.
The show must go on
A long time ago, managers believed that software was a commodity. One that could be designed upfront and built according to that design—that it could be done, finished, and then used or sold. It took around 30 years of mostly failing at that, badly missing deadlines and budgets, until another concept slowly took hold: that software is a process of discovering what exactly needs to be built as you build it.
The dawning realization that you cannot plan for things you don’t know you don’t know swung the pendulum in the opposite direction. We started to be “agile,” which was supposed to mean delivering bit by bit, reevaluating, cleaning up the mess—rinse and repeat. Yet, more often than not, it meant a constant tug of war, with the waterfall-like mental model of management projected onto the more-or-less “agile” process of software developers—who often got the worst of both worlds: responsibility for arbitrarily set deadlines in the long run, and constant pressure to deliver anything at all in the short term.
Then came the third revolution, as the specter of the dot-com bubble disappeared behind the horizon. The Internet started to gobble up another slow trickle of investor money, which quickly turned into a raging river. Those who could swim in it could get really far—under one condition: a hardened-steel conviction that the end justifies the means. Nothing else matters but delivering a passable solution to a paying customer before the funds and patience of investors run out, consequences be damned—we will clean up the mess, maybe, sometime in the future, if nothing else comes up.
The problem is that all of the software created in the “good old times” with all the flaws characteristic of that particular era mostly still exists—and needs to be run, maintained, enhanced, and updated. If you are not the Ronaldo of software engineering or computer science, you are probably one of the suckers who are doing this soul-crushing job—hopefully for a fat paycheck financing therapy.
You might be forgiven for thinking that over these 60 years we learned a thing or two—that workplaces where long-term planning, short-term business needs, and daily hygiene are in healthy balance would by now be dominant, allowing you to practice your craft like a good artisan: not a legend of old, but a solid engineer who hones their skills to find smart solutions to complex problems.
Yet the industry is still in the strong grip of the past and the traumas induced by those three periods, mostly ignoring decades of research proving that not shitting where you eat makes you go faster, not slower; and that you can save an hour of careful consideration with just two weeks of coding (or was it the other way around?).
Waterfall taught us that long-term planning often fails—so we stopped doing it, even when it makes sense. The corpse of “agile,” after Scrum ground it to dust, taught us that nobody gives a flying fuck about anything beyond the next two-week sprint. Finally, the startup era dropped any pretense that our craft is to build coherent, well-architected systems—it can wait, always.
It’s sad because the teachings of all three eras were not wrong; they were just taken too far without any deeper, balanced reflection. Which brings us back to the lobotomized children of the software industry, who once were bright-eyed hobbyists—not Pelés of IT, but people deeply enjoying building stuff and wanting to get better at it. Then, they hit the wall at full speed.
Most of the work out there is not about making things better and getting better yourself in the process—it’s about keeping stuff running exactly as it is, a twisted reflection of the organization that built it. You try, you struggle, go through all stages of grief, and then you learn that there are no prizes for rocking the boat. The operation was successful—we hope your frontal lobe is mostly intact; you will need it—some of it at least.
Now you are like the rest of the herd. Smart enough not to ask how exactly we plan to get this new microservices architecture right if we could not maintain a healthy monolith in the first place. You have your thoughts on what kind of torture the guy who came up with the idea deserves, but if you have not already put in your notice, you will keep them to yourself. You will make it work, just like with the rotten monolith, gluing shit together exactly as you are told. No point in stepping out of line—nobody cares.
Failed mercy killing
A good bit of the allure of coding agents is putting all of those poor souls out of their misery. Those who—often for years—were taught to stick to simple assembly work and to just work around the increasingly common bits held together by a thread and a bit of spit are now told that they will become obsolete if they do not develop skills that were deliberately suppressed for years.
Suddenly, everybody who is not an “AI” scientist or S-level expert in some critical niche should become a strange mix of Product Manager, Engineering Manager, and Architect on top of being well-versed in the craft itself, while whipping obedient coding agents to harvest cotton faster. Impostors—now defined by their “AI” illiteracy—will be exposed and expunged, not to stain the good name of the software engineering profession.
This is the dream of those who employ us—often full of suspicion that our arcane craft is not being employed fully toward the benefit of shareholders—or of those convinced that the demoscene is the last bastion of “true” IT nerds who can do a dial-up modem voice. Who knows, it might be the future; some of us will adapt and persevere, and some of us have hopefully paid off the mortgage already.
Here comes the twist. This is not what I actually expect will happen.
If you understand how coding agents view code and manage context, then you know that it is somewhat similar to how we do it. Assess what is relevant, read it, find patterns, come up with a plan based on what you learned—or completely lose the plot if there is no predictable structure and no patterns that can be trusted. Excessive cognitive load or a context stuffed with irrelevant things gets you similar results—or no results at all. Both protein- and silicon-based peons require clarity, crisp separation of concerns, and limited distractions to be effective, as both process language—natural and otherwise—in similar ways.
That’s why there is a glaring difference between businesses that got it right before the fourth era of “AI”-assisted or “AI”-driven development, and those that didn’t but are salivating at the thought of a “cheat code” allowing them to skip the hard part. The first kind will keep on winning, rewarded with the increased productivity of smart people who were already doing smart things. The second kind will hand a bucket of gasoline and a box of matches to people they spent years indoctrinating to believe that good craftsmanship is at best secondary to delivery and that they should not dig too deep, stunting their growth in the process.
Reading a postmortem devoid of substance, clearly generated by AI, elicited only a crooked smile, but it was just the first dead canary in the coal mine. Ever since a frustrated friend vented to me that they had just been given all the tokens in the world to burn and a strict ban on any tech-debt repayment in an already deeply flawed codebase, I have considered this not a scary possibility but a new normal in the making.
Ironically, this might actually be the saving grace for all those poor souls who reached the point of not caring as long as the paycheck clears. The ceiling of what current technology can realistically deliver is much lower than advertised, and there is no AGI on the horizon. This means that clearing the mess, when it reaches critical mass in the next few years, will require a lot of warm bodies in the trenches.