by cat-whisperer on 5/27/25, 10:33 AM with 407 comments
by whstl on 5/27/25, 11:31 AM
Funny, because I did some freelancing work fixing disposable vibe-coded landing pages recently. And if there's one thing we can count on, it's that the biggest control freaks will always have that one extra stupid requirement that completely befuddles the AI and pushes it into making an even bigger mess, and then I'll have to come fix it.
It doesn't matter how smart the AI becomes, the problems we face with software are rarely technical. The problem is always the people creating accidental complexity and pushing it to the next person as if it was "essential".
The biggest asset of a developer is saying "no" to people. Perhaps AIs will learn that, but with competing AIs I'm pretty sure we'll always get one or the other to say yes, just like we have with people.
by jstummbillig on 5/27/25, 11:34 AM
> It's architecting systems. And that's the one thing AI can't do.
Why do people insist on this? AI absolutely will be able to do that, because it increasingly can do that already, and we are now goalposting around what "architecting systems" means.
What it cannot do, even in theory, is decide for you to want to do something and decide for you what that should be. (It can certainly provide ideas, but the context space is so large that I don't see how it would realistically be better than you at seeing an issue that exists in your world, including what you can do, who you know, and what interests you.)
For the foreseeable future, we will need people who want to make something happen. Being a developer will mean something else, but that does not mean that you are not the person most equipped to handle that task and deal with the complexities involved.
by vinceguidry on 5/27/25, 3:16 PM
The only people that value quality are engineers. Any predictions of the future by engineers that rely on other people suddenly valuing quality can safely be ignored.
by fhd2 on 5/27/25, 11:53 AM
1. The push for "software architects" to create plans and specifications for those pesky developers to simply follow. I remember around 2005, there was some hype around generating code from UML and having developers "just" fill in the blanks. The results I've observed were insanely over-engineered systems where even just adding a new field to be stored required touching like 8 files across four different layers.
2. The "agile transformation" era that followed shortly after, where a (possibly deliberate) misunderstanding of agile principles lead to lots of off-the-shelf processes, roles, and some degree of acceptance for micro managing developers. From what I've seen, this mostly eroded trust, motivation and creativity. Best case scenario, it would create a functioning feature factory that efficiently builds the wrong thing. More often than not, it just made entire teams unproductive real fast.
What I've always liked to see is non-developers showing genuine interest in the work of developers, trying to participate or at least support, embracing the complexity and clarifying problems to solve. No matter what tools teams use and what processes they follow, I've always seen this result in success. Any effort around reducing the complexity inherent in software development did not.
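The "8 files across four different layers" pattern from point 1 can be sketched concretely. This is a minimal, hypothetical reconstruction (all names invented) of the kind of generated-from-UML layering being described — note how many places one new field would have to be threaded through:

```typescript
// Hypothetical 2005-style "enterprise" layering. Adding one field
// (say, `nickname`) means editing every layer below, plus the UML
// model, the controller, and the generated test stubs.

interface UserDto { id: number; name: string }           // layer 1: transport DTO
interface UserEntity { id: number; name: string }        // layer 2: persistence entity
interface UserRepository { save(u: UserEntity): void }   // layer 3: repository contract

class UserMapper {                                       // layer 4: DTO <-> entity mapper
  toEntity(d: UserDto): UserEntity {
    return { id: d.id, name: d.name };                   // field list duplicated again here
  }
}

class UserService {                                      // service facade over it all
  constructor(private repo: UserRepository, private mapper: UserMapper) {}
  store(d: UserDto): void {
    this.repo.save(this.mapper.toEntity(d));
  }
}

// The direct version the layering replaced is roughly one line:
const users: Array<{ id: number; name: string }> = [];
```

The accidental complexity is visible in the duplication: the same field list appears in the DTO, the entity, and the mapper, so every schema change fans out across files that add no behavior.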
by dinfinity on 5/27/25, 11:26 AM
> And as we'll see, that's the one skill AI isn't close to replacing.
Yet we never 'see' this in the article. It just restates it a few times without providing any proof.
I'd argue the opposite: specifically asking AI for designing an architecture already yields better results than what a good 30% of 'architects' I've encountered could ever come up with. It's just that a lot of people using AI don't explicitly ask for these things.
by crakhamster01 on 5/27/25, 2:19 PM
The author of this post is right, code is a liability, but AI leaders have somehow convinced the market that code generation on demand is a massive win. They're selling the industry on a future where companies can maintain "productivity" with a fraction of the headcount.
Surprisingly, no one seems to ask (or care) about how product quality fares in the vibe code era. Last month Satya Nadella famously claimed that 30% of Microsoft's code was written by AI. Is it a coincidence that Github has been averaging 20 incidents a month this year?[1] That's basically once a work day...
Nothing comes for free. My prediction is that companies over-prioritizing efficiency through LLMs will pay for it with quality. I'm not going to bet that this will bring down any giants, but not every company buying this snake oil is Microsoft. There are plenty of hungry entrepreneurs out there that will swarm if businesses fumble their core value prop.
by nhumrich on 5/27/25, 11:22 AM
Yes, this. 100% this. The goal is for a program to serve a goal/purpose with the least amount of code possible. AI does the exact opposite. Now that code generation is easy, there is no longer a natural constraint preventing too much liability.
by mpweiher on 5/27/25, 8:18 PM
http://www.softwarepreservation.org/projects/FORTRAN/BackusE...
by sunegg on 5/27/25, 12:52 PM
The current generation of generative AI based on LLMs simply won't be able to properly learn to code large code bases, and won't make the correct evaluative choices of products. Without being able to reason and evaluate objectively, you won't be a good "developer" replacement. It's similar to asking LLMs about (complex) integrals: they will often end their answer with "solution proved by derivation", not because they have actually done it (they will end with this on incorrect integrals too), but because that's what their training data does.
by IshKebab on 5/27/25, 12:11 PM
It's entirely possible that in 5 or 10 years at least some developers will be fully replaced.
(And probably a lot of people in HR, finance, marketing, etc. too.)
by overflow897 on 5/27/25, 2:43 PM
But if it's false, there's no saying you can't eventually have an ai model that can read your entire aws/infra account, look at logs, financials, look at docs and have a coherent picture of an entire business. At that point the idea that it might be able to handle architecture and long term planning seems plausible.
Usually when I read about developer replacement, it's with the underlying assumption that the agents/models will just keep getting bigger, better and cheaper, not that today's models will do it.
by janalsncm on 5/27/25, 5:20 PM
Think about it this way: five years ago plenty of companies hired more SWEs to increase productivity, gladly accepting additional cost. So it’s not about cost imo.
I might be wrong, but perhaps a useful way to look at all of this is to ignore stated reasons for layoffs and look at the companies themselves.
by JimDabell on 5/27/25, 12:18 PM
by gherkinnn on 5/27/25, 1:59 PM
I don't quite agree. I see the skill in translating the real world with all its inconsistencies into something a computer understands.
And this is where all the no/lo-code platforms fall apart. At some point that translation step needs to happen and most people absolutely hate it. And now you hire a dev anyway. As helpful as they may be, I haven't seen LLMs do this translation step any better.
Maybe there is a possibility that LLMs/AI take the moron out of the "extremely fast moron" that computers are, in ways I haven't yet seen.
by hintymad on 5/27/25, 6:53 PM
I think this time there is a key difference: AI coding is fully embedded into a software dev's workflow, and it indeed cuts loads of work for at least some projects and engineers. In contrast, few, if any, engineers would go to a No-Code/Low-Code tool and then maintain its output in their repo.
The impact would be that we will need fewer engineers as our productivity increases. That alone may not be enough to change the curve of supply and demand. However, combined with the current market condition of lacking business growth, the curve will change: the fewer new problems we have, the more repetitive our solutions become; the more repetitive the solutions we work on, the more accurate AI-generated code gets, and therefore the less code we will need a human to write.
So, this time it will not be about AI replacing engineers, but about AI replacing enough repetitive work that we will need fewer engineers.
by gwbas1c on 5/27/25, 12:43 PM
> The result wasn't fewer developers
Makes me wonder if the right thing to do is to get rid of the non-developers instead?
by kookamamie on 5/27/25, 12:21 PM
And the most valuable skill in defending a stance is moving goal posts.
by dakiol on 5/27/25, 12:00 PM
by hermitcrab on 5/27/25, 1:51 PM
Visual programming (NoCode/LowCode) tools have been very successful in quite a few domains. Animation, signal processing, data wrangling etc. But they have not been successful for general purpose programming, and I don't think they ever will be. More on this perennial HN topic at:
https://successfulsoftware.net/2024/01/16/visual-vs-text-bas...
by protocolture on 5/27/25, 10:48 PM
by brunoborges on 5/27/25, 6:57 PM
I agree with the core of the idea though, and I have written about it as well (https://www.linkedin.com/posts/brunocborges_ai-wont-eliminat...).
by tbrownaw on 5/27/25, 11:26 AM
Also, something being a liability and something having upkeep costs are not the same thing.
by plainOldText on 5/27/25, 11:46 AM
> Code is not an asset, it's a liability.
> Every line must be maintained, debugged, secured, and eventually replaced. The real asset is the business capability that code enables.
> The skill that survives and thrives isn't writing code. It's architecting systems. And that's the one thing AI can't do.
by nailer on 5/27/25, 2:11 PM
God I felt like I was the only one that noticed. People would say 'DevOps can code' as if that made DevOps a new thing, but being able to automate anything was a core principle of the SAGE-style systems admin in the 90s / early 2000s.
by FrameworkFred on 5/27/25, 7:54 PM
But AI can do some architecting. It's just not really the sort of thing where an unskilled person with a highly proficient LLM is going to be producing a distributed system that does anything useful.
It seems to me that the net effect of AI will be to increase the output of developers without increasing the cost per developer. Effectively, this will make software development cheaper. I suppose it's possible that there is some sort of peak demand for software that will require fewer developers over time to meet, but, generally, when something becomes cheaper, the demand for that thing will tend to increase.
I think the rumors of our demise are overblown.
by ddtaylor on 5/27/25, 11:45 AM
This was the response by non-developers to make it obsolete to need to spell out your business details to an expensive programmer who, we presume, will just change them anyhow and make up their own numbers!
That didn't work for shit either, although to the authors point it did create a ton of jobs!
by bob1029 on 5/27/25, 2:56 PM
Seeing the difference in complexity between a distributed "monolith" and an actual one makes me wonder how serious some of us are about serving the customer. The speed with which you can build a rails or PHP app makes everything proposed since 2016 seem kind of pointless from a business standpoint. Many SaaS B2B products could be refactored into a single powershell/bash script.
It can take a very firm hand to guide a team away from the shiny distractions. There is no way in hell an obsequious AI contraption will be able to fill this role. I know for a fact the LLMs are guiding developers towards more complexity because I have to constantly prompt things like "do not use 3rd party dependencies" and "demonstrate using pseudocode first" to avoid getting sucked into npm Narnia.
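The "do not use 3rd party dependencies" prompt above has a well-worn concrete instance: the kind of utility an LLM reaches for a package to solve is often a single built-in call. A small illustrative sketch (the function name here is just an example, not from any comment above):

```typescript
// The famous `left-pad` npm package is one call on the standard
// String type -- no dependency, no install, no supply-chain risk.
function leftPad(s: string, width: number, fill = " "): string {
  return s.padStart(width, fill);
}

// Likewise, in modern runtimes: deep cloning (`lodash.clonedeep`) is
// the built-in structuredClone(), and UUIDs (`uuid`) are
// crypto.randomUUID(). The package is pure liability in those cases.
```

Each avoided dependency is one less thing to audit, update, and maintain — which is exactly the code-is-a-liability point the article makes.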
by elzbardico on 5/27/25, 12:40 PM
by ahofmann on 5/27/25, 11:29 AM
Nobody knows what the future will look like, but I would change that sentence slightly:
"It's architecting systems. And that's the one thing AI can't yet do."
by holtkam2 on 5/27/25, 6:58 PM
Now just for the heck of it I’ll attempt to craft the strongest rebuttal I can:
This blog misses the key difference between AI and all other technologies in software development. AI isn't merely good at writing code. It's good at thinking. It's not going to merely automate software development, it's going to automate knowledge work. You as a human have no place in a world where your brain is strictly less capable in all realms of decision-making compared to machines.
by legulere on 5/27/25, 7:12 PM
I think a better comparison is to Jevons Paradox. New Technologies make developers more efficient and thus cheaper. This increases demand more than what is gained by the efficiency increases.
I don't see us anytime soon running out of things that are worth automating, especially if the cost for that continues to drop.
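The Jevons point above reduces to a demand-elasticity calculation. A minimal sketch with a constant-elasticity demand curve (illustrative numbers only, not from the comment):

```typescript
// Constant-elasticity demand: quantity = k * price^(-elasticity).
// Returns total-spend ratio after a price change. If a cost halving
// raises quantity demanded by more than 2x (elasticity > 1), total
// spend on developers goes UP, not down -- the Jevons effect.
function spendRatio(priceRatio: number, elasticity: number): number {
  const quantityRatio = Math.pow(priceRatio, -elasticity);
  return priceRatio * quantityRatio;
}
```

For example, halving the cost of software with elasticity 1.5 gives a spend ratio of about 1.41 — efficiency gains increase total demand. With inelastic demand (elasticity below 1), the same halving shrinks total spend, which is the "developers get replaced" scenario. The whole argument hinges on which regime software is in.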
by joshuakelly on 5/27/25, 12:32 PM
This time it really _is_ different, and we're looking at a world totally saturated with an abundance of bits. This will not be a simple restructuring of labor markets but something very significant and potentially quite severe.
https://www.lesswrong.com/posts/6Xgy6CAf2jqHhynHL/what-2026-...
by octo888 on 5/27/25, 3:16 PM
Which makes it really obvious their aim is to get rid of (expensive) developers, not to unlock our time to enable us to work on higher things
by analog31 on 5/27/25, 12:14 PM
This could explain the cycle by itself. Dynamic equations often tend to oscillate. Anything that temporarily accelerates the production of code imposes a maintenance cost later on.
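The dynamic described above — accelerated production imposing maintenance cost later — can be illustrated with a toy difference equation. This is a made-up model with made-up coefficients, purely to show the shape of the feedback:

```typescript
// Toy model: a team with fixed capacity splits effort between
// maintaining existing code and writing new code. `g` is a
// productivity multiplier on new-code output (better tools, AI, ...).
function simulate(periods: number, g: number, capacity = 100, maintPerLoc = 0.02) {
  let loc = 0;                  // accumulated lines of code (the liability)
  const fresh: number[] = [];   // new code written each period
  for (let t = 0; t < periods; t++) {
    // maintenance claims capacity in proportion to the codebase
    const slack = Math.max(0, capacity - maintPerLoc * loc);
    const written = g * slack;  // productivity only amplifies the slack
    loc += written;
    fresh.push(written);
  }
  return { loc, fresh };
}
```

With these numbers, tripling productivity (g = 3) triples early output but converges to the same ~5000-line ceiling (capacity / maintPerLoc): acceleration reaches the maintenance wall sooner, it doesn't raise it. That is the "cost imposed later" half of the oscillation.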
by inadequatespace on 5/28/25, 2:20 PM
by neallindsay on 5/27/25, 1:45 PM
by msgodel on 5/28/25, 4:18 PM
This is why LLMs will not replace developers.
by bahmboo on 5/27/25, 8:06 PM
by throwawayobs on 5/27/25, 3:03 PM
by chasing on 5/27/25, 10:30 PM
by ogogmad on 5/27/25, 5:46 PM
by the__alchemist on 5/27/25, 11:52 AM
by eduction on 5/27/25, 1:55 PM
Writing code should almost be an afterthought to understanding the problem deeply and iteratively.
by simultsop on 5/28/25, 11:53 AM
a must-have principle throughout the entire software engineering career.
by catigula on 5/27/25, 12:51 PM
>What actually happens isn't replacement, it's transformation.
"Statement --/, negation" pattern is the clearest indicator of ChatGPT I currently know.
by martzoukos on 5/27/25, 11:59 AM
by moralestapia on 5/27/25, 7:45 PM
Weak conclusion as AI already does that quite well.
by wayeq on 5/27/25, 9:59 PM
Tell that to Coca-Cola.. whose most valuable asset is literally an algorithm
by 1vuio0pswjnm7 on 5/27/25, 3:46 PM
Can it persist in times when borrowing money is not free (nonzero interest rates)
by hcfman on 5/27/25, 1:44 PM
by exodust on 5/27/25, 11:30 AM
And the disdain for marketing sites continues. I'd argue the thing that's in front of your customer's face isn't "disposable"! When the customer wants to tinker with their account, they might get there from the familiar "marketing site". Or when potential customers and users of your product are weighing up your payment plans, these are not trivial matters! Will you really trust Sloppy Jo's AI in the moment customers are reaching for their credit cards? The 'money shot' of UX. "Disposable"? "Doesn't matter"? Pffff!
by readthenotes1 on 5/27/25, 1:29 PM
The same day that a tutorial from a Capgemini consultant on how to write code using AI appeared here, I heard from a project manager who has AI write up code that is then reviewed by the human project team--because that is far easier.
I expect most offshoring to go the way of the horse and buggy because it may be easier to explain the requirements to cursor, and the turnaround time is much faster.
by mediumsmart on 5/27/25, 2:49 PM
I keep saying that - AI is the brick maker, you build the house. And it's your decision to build a house that only needs bricks in the right place ...