from Hacker News

The Myth of Developer Obsolescence

by cat-whisperer on 5/27/25, 10:33 AM with 407 comments

  • by whstl on 5/27/25, 11:31 AM

    > For agency work building disposable marketing sites

    Funny, because I did some freelance work fixing disposable vibe-coded landing pages recently. And if there's one thing we can count on, it's that the biggest control freaks will always have that one extra stupid requirement that completely befuddles the AI and pushes it into making an even bigger mess, and then I'll have to come fix it.

    It doesn't matter how smart the AI becomes; the problems we face with software are rarely technical. The problem is always the people creating accidental complexity and pushing it onto the next person as if it were "essential".

    The biggest asset of a developer is saying "no" to people. Perhaps AIs will learn that, but with competing AIs I'm pretty sure we'll always get one or the other to say yes, just like we have with people.

  • by jstummbillig on 5/27/25, 11:34 AM

    I think the article is mostly wrong about why it is right.

    > It's architecting systems. And that's the one thing AI can't do.

    Why do people insist on this? AI absolutely will be able to do that, because it increasingly can do that already, and we are now goalposting around what "architecting systems" means.

    What it cannot do, even in theory, is decide for you to want to do something and decide for you what that should be. (It can certainly provide ideas, but the context space is so large that I don't see how it would realistically be better at seeing an issue that exists in your world, including what you can do, who you know, and what interests you.)

    For the foreseeable future, we will need people who want to make something happen. Being a developer will mean something else, but that does not mean that you are not the person most equipped to handle that task and deal with the complexities involved.

  • by vinceguidry on 5/27/25, 3:16 PM

    This article makes a fundamental mistake: the author thinks that business values quality. Business has never valued quality. Customers can value quality, but business only values profit margins. If customers will only buy quality, then that's what business will deliver. But customers don't value quality either, most of the time. They value bang-for-buck. They'll buy the cheapest tools on Amazon and happily vibe code their way into a hole, then throw the broken code out and vibe code some more.

    The only people that value quality are engineers. Any predictions of the future by engineers that rely on other people suddenly valuing quality can safely be ignored.

  • by fhd2 on 5/27/25, 11:53 AM

    I think a few revolutions are missing from the list, ones that weren't technical but organisational:

    1. The push for "software architects" to create plans and specifications for those pesky developers to simply follow. I remember around 2005, there was some hype around generating code from UML and having developers "just" fill in the blanks. The results I've observed were insanely over-engineered systems where even just adding a new field to be stored required touching something like 8 files across four different layers.

    2. The "agile transformation" era that followed shortly after, where a (possibly deliberate) misunderstanding of agile principles led to lots of off-the-shelf processes, roles, and some degree of acceptance for micromanaging developers. From what I've seen, this mostly eroded trust, motivation and creativity. Best case scenario, it would create a functioning feature factory that efficiently builds the wrong thing. More often than not, it just made entire teams unproductive real fast.

    What I've always liked to see is non-developers showing genuine interest in the work of developers, trying to participate or at least support, embracing the complexity and clarifying the problems to solve. No matter what tools teams use and what processes they follow, I've always seen this result in success. Any effort aimed at reducing the complexity inherent in software development did not.

  • by dinfinity on 5/27/25, 11:26 AM

    > The most valuable skill in software isn't writing code, it's architecting systems.

    > And as we'll see, that's the one skill AI isn't close to replacing.

    Yet we never 'see' this in the article. It just restates it a few times without providing any proof.

    I'd argue the opposite: specifically asking AI to design an architecture already yields better results than what a good 30% of 'architects' I've encountered could ever come up with. It's just that a lot of people using AI don't explicitly ask for these things.

  • by crakhamster01 on 5/27/25, 2:19 PM

    I'm increasingly certain that companies leaning too far into the AI hype are opening themselves up to disruption.

    The author of this post is right that code is a liability, but AI leaders have somehow convinced the market that code generation on demand is a massive win. They're selling the industry on a future where companies can maintain "productivity" with a fraction of the headcount.

    Surprisingly, no one seems to ask (or care) how product quality fares in the vibe-code era. Last month Satya Nadella famously claimed that 30% of Microsoft's code was written by AI. Is it a coincidence that GitHub has been averaging 20 incidents a month this year?[1] That's basically once a work day...

    Nothing comes for free. My prediction is that companies over-prioritizing efficiency through LLMs will pay for it with quality. I'm not going to bet that this will bring down any giants, but not every company buying this snake oil is Microsoft. There are plenty of hungry entrepreneurs out there that will swarm if businesses fumble their core value prop.

    [1] https://www.githubstatus.com/history

  • by nhumrich on 5/27/25, 11:22 AM

    > code is not an asset—it's a liability

    Yes, this. 100% this. The goal is for a program to serve a goal/purpose with the least amount of code possible. AI does the exact opposite. Now that code generation is easy, there is no longer a natural constraint preventing too much liability.

  • by mpweiher on 5/27/25, 8:18 PM

    “Since FORTRAN should virtually eliminate coding and debugging…” -- FORTRAN Preliminary report, 1954

    http://www.softwarepreservation.org/projects/FORTRAN/BackusE...

  • by sunegg on 5/27/25, 12:52 PM

    The issue with these AI systems is how incredibly well they function in isolated circumstances, and how much they crash and burn when they have to be integrated into a full tech stack (even if the tech stack is also written by the same model).

    The current generation of generative AI based on LLMs simply won't be able to properly learn to code large code bases, and won't make the correct evaluative choices about products. Without being able to reason and evaluate objectively, you won't be a good "developer" replacement. It's similar to asking LLMs about (complex) integrals: they will often end their answer with "solution proved by derivation", not because they have actually done it (they will also end with this on incorrect integrals), but because that's what their training data does.

  • by IshKebab on 5/27/25, 12:11 PM

    These kinds of articles are arguing against nothing. Anybody can see that AI can't really replace developers today (though it can certainly save you huge chunks of time in some situations). But what about in 5 years? 10 years? Things are changing rapidly and nobody knows what's going to happen.

    It's entirely possible that in 5 or 10 years at least some developers will be fully replaced.

    (And probably a lot of people in HR, finance, marketing, etc. too.)

  • by overflow897 on 5/27/25, 2:43 PM

    I think articles like this rest on the big assumption that progress is going to plateau. If that assumption is true, then sure.

    But if it's false, there's no saying you can't eventually have an AI model that can read your entire AWS/infra account, look at logs, financials, and docs, and have a coherent picture of an entire business. At that point the idea that it might be able to handle architecture and long-term planning seems plausible.

    Usually when I read about developer replacement, it's with the underlying assumption that the agents/models will just keep getting bigger, better and cheaper, not that today's models will do it.

  • by janalsncm on 5/27/25, 5:20 PM

    I think for the most part the layoffs in software are because of uncertainty, not because of technology. They are being justified after the fact with technobabble. If there wasn’t economic uncertainty, companies would gladly accept the extra productivity.

    Think about it this way: five years ago plenty of companies hired more SWEs to increase productivity, gladly accepting additional cost. So it’s not about cost imo.

    I might be wrong, but perhaps a useful way to look at all of this is to ignore stated reasons for layoffs and look at the companies themselves.

  • by JimDabell on 5/27/25, 12:18 PM

    You can go back further than the article describes as well. Back in the 90s the same sorts of articles were written about how WYSIWYG editors like FrontPage, Dreamweaver, etc. were going to make web developers obsolete.

  • by gherkinnn on 5/27/25, 1:59 PM

    > The most valuable skill in software isn't writing code, it's architecting systems.

    I don't quite agree. I see the skill in translating the real world, with all its inconsistencies, into something a computer understands.

    And this is where all the no/lo-code platforms fall apart. At some point that translation step needs to happen and most people absolutely hate it. And now you hire a dev anyway. As helpful as they may be, I haven't seen LLMs do this translation step any better.

    Maybe there is a possibility that LLMs/AI take the moron out of the "extremely fast moron" that computers are, in ways I haven't yet seen.

  • by hintymad on 5/27/25, 6:53 PM

    > The NoCode/LowCode Revolution

    I think this time there is a key difference: AI coding is fully embedded into a software dev's workflow, and it indeed cuts loads of work for at least some projects and engineers. In contrast, few, if any, engineers would go to a No-Code/Low-Code tool and then maintain its output in their repo.

    The impact would be that we will need fewer engineers as our productivity increases. That alone may not be enough to change the curve of supply and demand. However, combined with the current market condition of weak business growth, the curve will change: the fewer new problems we have, the more repetitive the solutions we work on become, the more accurate the code generated by AI gets, and therefore the less code we will need a human to write.

    So, this time it will not be about AI replacing engineers, but about AI replacing enough repetitive work that we will need fewer engineers.

  • by gwbas1c on 5/27/25, 12:43 PM

    > "Why hire expensive developers when anyone can build an app?"

    > The result wasn't fewer developers

    Makes me wonder if the right thing to do is to get rid of the non-developers instead?

  • by kookamamie on 5/27/25, 12:21 PM

    > The most valuable skill in software isn't writing code, it's architecting systems.

    And the most valuable skill in defending a stance is moving goal posts.

  • by dakiol on 5/27/25, 12:00 PM

    I think that until LLMs can assertively say "no" to your requests, we won't be able to rely on them autonomously. The greatest downside of ChatGPT, Copilot, and similar tools is that they always give you something in return; they always provide some sort of answer and rarely challenge your original request. That's the biggest difference I've noticed so far between working with humans and machines. Humans will usually push back, and together you can come up with something better (perhaps with less code, fewer processes, or fewer dependencies). Chatbots (as of now) just throw at you one of the thousands of potential solutions to shut you up.

  • by hermitcrab on 5/27/25, 1:51 PM

    >The NoCode/LowCode Revolution

    Visual programming (NoCode/LowCode) tools have been very successful in quite a few domains: animation, signal processing, data wrangling, etc. But they have not been successful for general-purpose programming, and I don't think they ever will be. More on this perennial HN topic at:

    https://successfulsoftware.net/2024/01/16/visual-vs-text-bas...

  • by protocolture on 5/27/25, 10:48 PM

    AI is an unlimited line of credit at the bank of technical debt.

  • by brunoborges on 5/27/25, 6:57 PM

    The number of em dashes in this article is quite telling...

    I agree with the core of the idea though, and I have written about it as well (https://www.linkedin.com/posts/brunocborges_ai-wont-eliminat...).

  • by tbrownaw on 5/27/25, 11:26 AM

    The first listed iteration is too late; what about the COmmon Business-Oriented Language?

    Also, something being a liability and something having upkeep costs are not the same thing.

  • by plainOldText on 5/27/25, 11:46 AM

    I think these are key thoughts worth considering going forward:

      > Code is not an asset, it's a liability.
    
      > Every line must be maintained, debugged, secured, and eventually replaced. The real asset is the business capability that code enables.
    
      > The skill that survives and thrives isn't writing code. It's architecting systems. And that's the one thing AI can't do.

  • by nailer on 5/27/25, 2:11 PM

    > The sysadmins weren't eliminated; they were reborn as DevOps engineers with fancy new job titles and substantially higher compensation packages.

    God I felt like I was the only one that noticed. People would say 'DevOps can code' as if that made DevOps a new thing, but being able to automate anything was a core principle of the SAGE-style systems admin in the 90s / early 2000s.

  • by FrameworkFred on 5/27/25, 7:54 PM

    I agree with some of the article. I agree that code is a liability that's distinct from the asset the code is part of. It's like tires on a car: they're liability-like, whereas the car as a whole can be thought of as an asset.

    But AI can do some architecting. It's just not really the sort of thing where an unskilled person with a highly proficient LLM is going to be producing a distributed system that does anything useful.

    It seems to me that the net effect of AI will be to increase the output of developers without increasing the cost per developer. Effectively, this will make software development cheaper. I suppose it's possible that there is some sort of peak demand for software that will require fewer developers over time to meet, but, generally, when something becomes cheaper, the demand for that thing tends to increase.

    I think the rumors of our demise are overblown.

  • by ddtaylor on 5/27/25, 11:45 AM

    I think we should look at one even earlier: COBOL.

    This was non-developers' attempt to make it unnecessary to spell out your business details to an expensive programmer who, we presume, will just change them anyhow and make up their own numbers!

    That didn't work for shit either, although to the author's point it did create a ton of jobs!

  • by bob1029 on 5/27/25, 2:56 PM

    I strongly agree with the architecture piece.

    Seeing the difference in complexity between a distributed "monolith" and an actual one makes me wonder how serious some of us are about serving the customer. The speed with which you can build a Rails or PHP app makes everything proposed since 2016 seem kind of pointless from a business standpoint. Many SaaS B2B products could be refactored into a single PowerShell/bash script.

    It can take a very firm hand to guide a team away from the shiny distractions. There is no way in hell an obsequious AI contraption will be able to fill this role. I know for a fact the LLMs are guiding developers towards more complexity because I have to constantly prompt things like "do not use 3rd party dependencies" and "demonstrate using pseudocode first" to avoid getting sucked into npm Narnia.

  • by elzbardico on 5/27/25, 12:40 PM

    One thing that I observed is that my company now strongly leans "build" in all the "build vs buy" decisions. And it is not a tech company. And yes, AI is not magical; I am working 10 hours a day because of that, even with the non-negligible help from AI.

  • by ahofmann on 5/27/25, 11:29 AM

    > It's architecting systems. And that's the one thing AI can't do.

    Nobody knows what the future will look like, but I would change that sentence slightly:

    "It's architecting systems. And that's the one thing AI can't yet do."

  • by holtkam2 on 5/27/25, 6:58 PM

    I loved this article and it is the strongest argument I’ve ever heard for “why I shouldn’t be freaking out about the future of my engineering career”.

    Now just for the heck of it I’ll attempt to craft the strongest rebuttal I can:

    This blog misses the key difference between AI and all other technologies in software development. AI isn’t merely good at writing code. It’s good at thinking. It’s not going to merely automate software development; it’s going to automate knowledge work. You as a human have no place in a world where your brain is strictly less capable in all realms of decision-making compared to machines.

  • by legulere on 5/27/25, 7:12 PM

    I don't think the higher pay is true. There are simply fewer people proficient in a new technology at the beginning. You're simply seeing classic supply and demand play out. After a while things will calm down again.

    I think a better comparison is to Jevons Paradox. New technologies make developers more efficient and thus cheaper. This increases demand more than what is gained by the efficiency increases.

    I don't see us anytime soon running out of things that are worth automating, especially if the cost for that continues to drop.

  • by joshuakelly on 5/27/25, 12:32 PM

    Read this, and then compare it to Daniel Kokotajlo's "What 2026 Looks Like" published 4 years ago.

    This time it really _is_ different, and we're looking at a world totally saturated with an abundance of bits. This will not be a simple restructuring of labor markets but something very significant and potentially quite severe.

    https://www.lesswrong.com/posts/6Xgy6CAf2jqHhynHL/what-2026-...

  • by octo888 on 5/27/25, 3:16 PM

    At my company they're doubling down. Forcing us to use AI, and product people and managers suddenly cosplaying architect and senior developer, attempting to muscle in on developers'/architects' roles. I.e. trying to take away the thing the developers would have more time for if the AI tools achieved their aims. And to triple down, they're offshoring.

    Which makes it really obvious that their aim is to get rid of (expensive) developers, not to unlock our time to enable us to work on higher things.

  • by analog31 on 5/27/25, 12:14 PM

    >>> Here's what the "AI will replace developers" crowd fundamentally misunderstands: code is not an asset—it's a liability. Every line must be maintained, debugged, secured, and eventually replaced. The real asset is the business capability that code enables.

    This could explain the cycle by itself. Dynamic equations often tend to oscillate. Anything that temporarily accelerates the production of code imposes a maintenance cost later on.
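
    As a toy illustration of that oscillation (a sketch of my own with assumed symbols, not anything from the article or the thread): let $C(t)$ be the code stock, $M(t)$ the maintenance load currently being felt, $E$ the total engineering effort available, $a$ the tooling speed-up on producing new code, $\mu$ the maintenance demand per unit of code, and $\tau$ the lag before new code starts demanding maintenance. Then

      \[ \frac{dC}{dt} = a\bigl(E - M(t)\bigr), \qquad \frac{dM}{dt} = \frac{\mu\,C(t) - M(t)}{\tau} \]

    has characteristic equation $\lambda^2 + \lambda/\tau + a\mu/\tau = 0$, so whenever $4a\mu\tau > 1$ the roots are complex and $C$ and $M$ spiral toward equilibrium rather than settling monotonically: a burst of cheap code generation is followed by a delayed maintenance crunch, then a pullback, and so on.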

  • by inadequatespace on 5/28/25, 2:20 PM

    So, what do all of these responses and the article itself seem to dance around? It's not that AI makes developers obsolete, but rather that it increases inequality. In other words, it either creates a class of inferior developers because they don't have whatever new skill, or, in the case of offshoring, literally creates a lower class of developers.

  • by neallindsay on 5/27/25, 1:45 PM

    The picture at the top of the article seems to be a bad (AI-generated, I assume) illustration of the Gartner Hype Cycle. There are supposed to be five stages, but the text at the bottom doesn't line up with the graph because it is missing the "peak of inflated expectations" while the graph seems to be missing the "plateau (of) productivity" stage.

  • by msgodel on 5/28/25, 4:18 PM

    To put it in the language of "programming as theory building", AI-produced artifacts are stillborn. No one had or has the theory in their mind that corresponds to them. A software developer could read the output and develop that theory, but then you're doing actual software development again.

    This is why LLMs will not replace developers.

  • by bahmboo on 5/27/25, 8:06 PM

    Funny how no one has commented on the graphic being wrong. The enlightenment and disillusionment labels are swapped.

  • by throwawayobs on 5/27/25, 3:03 PM

    I'm old enough to remember when you wouldn't need to hire expensive developers anymore because object-oriented programming would make it possible to have semi-skilled employees assemble software from standardized parts. They even talked about the impending "software factory".

  • by chasing on 5/27/25, 10:30 PM

    For vibe coding to replace software engineering, vibe coding will have to become… software engineering.

  • by ogogmad on 5/27/25, 5:46 PM

    I think that as programmer productivity increases, demand for programmers also increases, but only INITIALLY. However, if productivity improves too much, and programming gets automated too much, then demand for programmers will begin to drop very rapidly. It's non-linear.
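
    One way to make that non-linearity precise (a sketch of my own with assumed symbols, not from the comment): if each programmer produces $a$ units of software at wage $w$, the unit price is roughly $w/a$, the market buys $Q(w/a)$ units, and the required headcount is

      \[ H(a) = \frac{Q(w/a)}{a}, \qquad \frac{d\,\ln H}{d\,\ln a} = \varepsilon(p) - 1, \quad \varepsilon(p) = -\frac{d\,\ln Q}{d\,\ln p}. \]

    Headcount grows while demand for software is still elastic ($\varepsilon > 1$, the Jevons-style regime another commenter mentions) and shrinks once demand saturates ($\varepsilon < 1$), so the same productivity gains first create programmer jobs and later eliminate them.
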
  • by the__alchemist on 5/27/25, 11:52 AM

    The article's point about LLMs being poor at architecture aligns with my primary rule for using them in code: don't have them design data structures or function signatures. They can fill them in when appropriate, but I will not let an LLM define them (structs, enums, fn sigs, etc.).
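
    A minimal Rust sketch of that split (hypothetical names and types of my own, not the commenter's code): the enum, struct, and function signature below are the hand-written part; only the function body is the kind of fill-in work being delegated.

      // Hand-designed domain types: the LLM never gets to invent these shapes.
      #[derive(Debug, Clone, PartialEq)]
      enum PaymentStatus {
          Pending,
          Settled { reference: String },
          Failed { reason: String },
      }

      #[derive(Debug, Clone)]
      struct Payment {
          id: u64,
          amount_cents: i64,
          status: PaymentStatus,
      }

      // Hand-written signature: inputs, output, and ownership are fixed up front.
      // Only the body is the sort of thing being delegated to the model.
      fn total_settled_cents(payments: &[Payment]) -> i64 {
          payments
              .iter()
              .filter(|p| matches!(p.status, PaymentStatus::Settled { .. }))
              .map(|p| p.amount_cents)
              .sum()
      }

      fn main() {
          let payments = vec![
              Payment { id: 1, amount_cents: 1250, status: PaymentStatus::Settled { reference: "A1".into() } },
              Payment { id: 2, amount_cents: 400, status: PaymentStatus::Pending },
              Payment { id: 3, amount_cents: 900, status: PaymentStatus::Failed { reason: "card declined".into() } },
          ];
          // With the shapes pinned down by a human, generated code has far less
          // room to smuggle in new structures or change the public surface.
          assert_eq!(total_settled_cents(&payments), 1250);
          println!("settled: {} cents", total_settled_cents(&payments));
      }
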
  • by eduction on 5/27/25, 1:55 PM

    Kind of funny that the things that time and again save developers, especially expensive US/California-based ones, are the things they tend to hate: meetings, writing prose, and customer service.

    Writing code should almost be an afterthought to understanding the problem deeply and iteratively.

  • by simultsop on 5/28/25, 11:53 AM

    > code is not an asset—it's a liability. Every line must be maintained, debugged, secured, and eventually replaced. The real asset is the business capability that code enables.

    A must-have principle to carry through an entire software engineering career.

  • by catigula on 5/27/25, 12:51 PM

    I love how I instantly know if something is written with GPT-4o now.

    >What actually happens isn't replacement, it's transformation.

    The "statement, then negation" pattern is the clearest indicator of ChatGPT I currently know.

  • by martzoukos on 5/27/25, 11:59 AM

    AI does a pretty good job at helping you learn architecture, though.

  • by moralestapia on 5/27/25, 7:45 PM

    >It's architecting systems. And that's the one thing AI can't do.

    Weak conclusion, as AI already does that quite well.

  • by wayeq on 5/27/25, 9:59 PM

    > code is not an asset—it's a liability.

    Tell that to Coca-Cola... whose most valuable asset is literally an algorithm.

  • by 1vuio0pswjnm7 on 5/27/25, 3:46 PM

    The Myth of Developer Relevance

    Can it persist in times when borrowing money is not free (nonzero interest rates)?

  • by hcfman on 5/27/25, 1:44 PM

    Man! Those of us in Europe would love this double-the-salary scenario to apply here.

  • by exodust on 5/27/25, 11:30 AM

    > "For agency work building disposable marketing sites, this doesn't matter"

    And the disdain for marketing sites continues. I'd argue the thing that's in front of your customer's face isn't "disposable"! When the customer wants to tinker with their account, they might get there from the familiar "marketing site". Or when potential customers and users of your product are weighing up your payment plans, these are not trivial matters! Will you really trust Sloppy Jo's AI in the moment customers are reaching for their credit cards? The 'money shot' of UX. "Disposable"? "Doesn't matter"? Pffff!

  • by readthenotes1 on 5/27/25, 1:29 PM

    The author seems to miss the point that code being a liability has not affected the amount that is written by people who don't care.

    The same day that a tutorial from a Capgemini consultant on how to write code using AI appeared here, I heard from a project manager who has AI write code that is then reviewed by the human project team--because that is far easier.

    I expect most offshoring to go the way of the horse and buggy, because it may be easier to explain the requirements to Cursor, and the turnaround time is much faster.

  • by mediumsmart on 5/27/25, 2:49 PM

    >The most valuable skill in software isn't writing code, it's architecting systems.

    I keep saying that: AI is the brick maker, you build the house. And it's your decision to build a house that only needs bricks in the right place...