from Hacker News

Ask HN: Are LLMs the new compilers?

by calflegal on 3/19/25, 5:52 PM with 5 comments

It seems clear to me that AIs are going to write more source code going forward. At the moment I review a lot of it; that seems necessary. But it occurs to me the same may have been true of early compilers. Is it the case that early engineers would review and change the resulting assembly? That certainly isn't how most of us operate today, so I'm wondering if the same is bound to happen at this new abstraction level. Will we stop reviewing the output and write... English? Why or why not?

Update based on comments: Perhaps by using 'LLM' I was too specific to today's non-deterministic LLM systems. I suppose I meant AI systems in general.

  • by subject4056 on 3/19/25, 6:02 PM

    I would guess not. An important feature of compilers is that they are guaranteed to emit code with certain properties in response to specific inputs (memory safety guarantees, asymptotic performance, calling convention, etc.). If they don't do that, you can file a bug report.

    You cannot file a bug report against an LLM for producing an unexpected output, because there is no expected output; the core feature of an LLM is that neither you nor the LLM developer knows what it will output for a wide range of inputs. I think there are many applications for which the LLM's core value proposition of "no one knows a priori what this tool will emit" is disqualifying.

  • by nickpsecurity on 3/19/25, 6:00 PM

    Compilers are usually deterministic in their function. They use algorithms designed for their purpose, are tested against many inputs, and are integration-tested by compiling many programs. They don't hallucinate or do probabilistic things in most cases. They're also much faster than LLMs, especially if parallelized.

    I could see LLMs being used to generate compiler configurations or source annotations to improve compiler behavior. They might be good at code generation for prototyping changes to an existing compiler. They're too unreliable and resource-heavy to be a good alternative in most cases, though.

  • by taylodl on 3/19/25, 7:30 PM

    No - and you don't want them to be. You want an AI system to produce human-readable code that can then be input to a compiler or interpreter. If anything, you should think of AI as being a transpiler. The AI is transpiling a natural language description of a function into a programming language.

    I have been wondering: if we used formal modeling languages such as ArchiMate and UML, a set of Gherkin tests, and a deployment specification of some sort (such as Workik AI), could an AI generate all the artifacts needed to compile, deploy, and run?

    Then, if there's a bug, create a new Gherkin test that recreates the bug and regenerate the whole shebang. Realistically, that's where I see things headed over the next 5-10 years.

  • by techpineapple on 3/19/25, 6:10 PM

    I do wonder if there will be an evolution in language design, but I also think English is probably too ambiguous. "Bi-weekly" can mean both twice per week and once every two weeks. Which one should "bi-weekly" compile to? Maybe a centralized config? Is that config in plain English, or structured? And then fuggedaboutit if you have both native and non-native speakers of English on the same team. Or, I imagine, in some companies non-English speakers are writing code.

    If your language is non-deterministic it would really suck if your application compiled one way on Tuesday, and another way the next Tuesday.
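
    A toy sketch of that worry in Python (every name and behavior here is hypothetical, purely to make the determinism contrast concrete, not a real compiler or LLM API):

```python
import random

def compile_deterministic(source: str) -> str:
    # A real compiler is a pure function of its input (and flags):
    # the same source always maps to the same output.
    return source.upper()  # stand-in for code generation

def generate_llm_style(source: str, temperature: float = 1.0) -> str:
    # Stand-in for token sampling: several plausible outputs per input.
    candidates = [source.upper(), source.lower(), source.title()]
    if temperature == 0.0:
        return candidates[0]          # greedy decoding is repeatable
    return random.choice(candidates)  # sampling is not

# The "compiler" builds identically on any Tuesday:
assert compile_deterministic("print hello") == compile_deterministic("print hello")
# The sampling path may return a different candidate on each call.
```

    Even with temperature pinned to zero, real systems add further sources of drift (model updates, context changes), which is the gap the comment above is pointing at.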

  • by JohnFen on 3/19/25, 7:39 PM

    > Is it the case that early engineers would review and change resulting assembly?

    I'm a graybeard (~40 years as a dev) and I've never heard of people doing that, at least not because of any wariness about compilers. That's not to say it didn't happen, of course, just that I never saw it.

    It did sometimes happen that something needed to be written in assembly for performance or space reasons, and devs would start with assembly generated by a compiler to save dev time, though. Not that often, but it happened.