by squircle on 6/17/25, 1:25 AM with 30 comments
by bob1029 on 6/17/25, 9:59 AM
NUMA only got more complicated over time. The range of latency differences is more extreme than ever. We've got L1 running at nanosecond delay, and on the other end we've got cold tapes that can take a whole day to load. Which kind of memory/compute to use in a heterogeneous system (CPU/GPU) is also something that can be difficult to figure out. Multi-core is likely the most devastating dragon to arrive since this article was written.
Premature optimization might be evil, but it's the only way to efficiently align the software with the memory architecture. E.g., in a Unity application, rewriting from game objects to ECS is basically like starting over.
If you could only focus on one aspect, I would keep the average temperature of L1 in mind constantly. If you can keep it semi-warm, nothing else really matters. There are very few problems that a modern CPU can't chew through ~instantly assuming the working set is in L1 and there is no contention with other threads.
This is the same thinking that drives some of us to use SQLite over hosted SQL providers. Thinking in terms of not just information, but the latency domain of the information, is what can unlock those bananas 1000x+ speed ups.
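A minimal sketch of the effect being described, not a rigorous benchmark (no JIT warmup, no statistical rigor, class and variable names invented for the example): the same reduction over the same values, once through a contiguous primitive array and once through a linked list of boxed values. The array walk keeps a dense, prefetch-friendly working set; the list walk chases pointers across the heap and tends to miss cache on every hop.

    // Illustrative sketch only; numbers will vary wildly by machine and JVM.
    import java.util.LinkedList;
    import java.util.List;

    public class LocalitySketch {
        public static void main(String[] args) {
            final int n = 2_000_000;

            long[] dense = new long[n];                 // contiguous, cache/prefetch friendly
            List<Long> scattered = new LinkedList<>();  // one heap node (plus a boxed Long) per element
            for (int i = 0; i < n; i++) {
                dense[i] = i;
                scattered.add((long) i);
            }

            long t0 = System.nanoTime();
            long sumDense = 0;
            for (int i = 0; i < n; i++) sumDense += dense[i];   // sequential scan
            long t1 = System.nanoTime();

            long sumScattered = 0;
            for (long v : scattered) sumScattered += v;         // pointer chasing plus unboxing
            long t2 = System.nanoTime();

            System.out.printf("array: %d in %d ms%n", sumDense, (t1 - t0) / 1_000_000);
            System.out.printf("list:  %d in %d ms%n", sumScattered, (t2 - t1) / 1_000_000);
        }
    }

The same reasoning motivates the SQLite point: keeping the data in the process's own latency domain instead of behind a network round trip.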
by varjag on 6/17/25, 5:40 PM
Nowadays one often encounters the opinion that in the sixties programming has been an overpaid profession, and that in the coming years programmer salaries may be expected to go down. Usually this opinion is expressed in connection with the recession, but it could be a symptom of something different and quite healthy, viz. that perhaps the programmers of the past decade have not done so good a job as they should have done. Society is getting dissatisfied with the performance of programmers and of their products.
by saghm on 6/17/25, 3:14 PM
Interestingly, designing a language that enforces that every loop (or recursion) comes with a proof that it terminates is actually possible; Coq, for example, does pretty much exactly this from what I understand. That means it isn't Turing complete, but Turing completeness may not be as necessary for as many things as it might otherwise seem.
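For a concrete flavor of what that enforcement looks like, here is a small sketch in Lean 4 (which behaves like Coq on this point); the function names are just for illustration. The recursive call is on a structurally smaller argument, so the termination checker accepts the definition as total; a definition with no decreasing argument is rejected unless it is explicitly marked partial, which is exactly the Turing-completeness trade-off mentioned above.

    -- Accepted: the recursive call is on the structurally smaller `n`.
    def sumTo : Nat → Nat
      | 0     => 0
      | n + 1 => (n + 1) + sumTo n

    -- Something like `def spin (n : Nat) : Nat := spin n` is rejected
    -- unless it is marked `partial`.
    #eval sumTo 10  -- 55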
by ddtaylor on 6/17/25, 2:43 PM
> When these machines were announced and their functional specifications became known, quite a few among us must have become quite miserable; at least I was. It was only reasonable to expect that such machines would flood the computing community, and it was therefore all the more important that their design should be as sound as possible. But the design embodied such serious flaws that I felt that with a single stroke the progress of computing science had been retarded by at least ten years: it was then that I had the blackest week in the whole of my professional life. Perhaps the most saddening thing now is that, even after all those years of frustrating experience, still so many people honestly believe that some law of nature tells us that machines have to be that way. They silence their doubts by observing how many of these machines have been sold, and derive from that observation the false sense of security that, after all, the design cannot have been that bad. But upon closer inspection, that line of defense has the same convincing strength as the argument that cigarette smoking must be healthy because so many people do it.
by Michelangelo11 on 6/17/25, 3:17 PM
Absolutely right, with the implication that new capabilities suddenly available to everyone often end up making the playing field more unequal, not less.
by Nicook on 6/17/25, 4:59 PM
> baroque monstrosity
Warnings. Probably beating a dead horse here, but way too many tools, and they keep adding more.
by stereolambda on 6/17/25, 10:46 AM
I guess there is still something left, here and there, of the concept of a programming language as a tool for top-down shaping and guiding the thinking of its users. Pascal being the classic example. Golang tries to be like that. I get how annoying it can be. I don't know how JS/TypeScript constructs evolve, but I suspect this is more Fortran-style committee planning than trying to "enlighten" people into doing the "right" things. Happy to be corrected on this.
Maybe the hardest point to interpret in hindsight is that in the sixties programming was an overpaid profession, that hardware costs would keep dropping, and that software costs could not stay the same (You cannot expect society to accept this, and therefore we must learn to program an order of magnitude more effectively). Yeah, in some sense, what does paying for software even mean anymore?
But interestingly, the situation now is kind of similar to the very old days: bunch of mainframe ("cloud") owners paying programmers to program and manage their machines. And maybe the effectiveness really has gone up dramatically. There's relatively little software running in comparison to the crazy volume of metal machines, even though the programmers for that scale are still paid a lot. It's not like you get a team of 10 guys for programming each individual server.
by augustk on 6/17/25, 1:52 PM
by puttycat on 6/17/25, 1:13 PM
by selcuka on 6/17/25, 6:35 AM
Apparently the ISO/IEC 1539-1:2023 [1] committee didn't get the memo.
by buckfactor on 6/17/25, 3:48 PM
by b0a04gl on 6/17/25, 12:34 PM
by enord on 6/17/25, 6:53 AM
Maybe his incisive polemic, which I greatly enjoy, was all but pandering to a certain elitist sensibility in the end.
To make manageable programs, you have to trade off execution speed both on the CPU and in the organization. His rather mathematized prescriptions imply we should hire quarrelsome academics such as him to reduce performance and slow down product development [initially…], all in the interest of his stratified sensibilities of elegance and simplicity.
Sucks to be right when that’s the truth.
by ddtaylor on 6/17/25, 2:39 PM
Oh boy does that read VERY true today!
by gobblik on 6/17/25, 10:45 AM
by jsonchao on 6/17/25, 7:41 AM
by dkarl on 6/17/25, 2:45 PM
JITs have taken this to an even higher level — people don't just argue that the machine is fast enough to run their convoluted code with countless unnecessary layers, they argue that their code as they've written it won't be run at all: the JIT will reduce it to a simpler form that can be handled efficiently.
But they can't explain why their poor coworkers who have to read and maintain the code don't deserve the same consideration as the machine!
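A tiny, hypothetical illustration of that argument (names invented for the example): after warmup, HotSpot will usually inline a chain of one-line delegating methods, so callers pay roughly what the direct expression costs. Whether the humans who have to read the chain get the same benefit is exactly the point above.

    public class JitLayers {
        static int addTax(int cents)       { return cents + cents / 10; }
        static int applyTax(int cents)     { return addTax(cents); }      // delegates
        static int priceWithTax(int cents) { return applyTax(cents); }    // delegates again

        static int priceWithTaxDirect(int cents) { return cents + cents / 10; }

        public static void main(String[] args) {
            long layered = 0, direct = 0;
            for (int i = 0; i < 50_000_000; i++) {  // enough iterations for the JIT to warm up
                layered += priceWithTax(i);
                direct  += priceWithTaxDirect(i);
            }
            // Same result; after inlining, very nearly the same cost.
            System.out.println(layered == direct);
        }
    }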