from Hacker News

The Humble Programmer (1972)

by squircle on 6/17/25, 1:25 AM with 30 comments

  • by bob1029 on 6/17/25, 9:59 AM

    > Secondly, we have got machines equipped with multi-level stores, presenting us problems of management strategy that, in spite of the extensive literature on the subject, still remain rather elusive.

    NUMA has only grown more complicated over time, and the range of latency differences is more extreme than ever: L1 answers in nanoseconds, while at the other end cold tapes can take a whole day to load. Which kind of memory/compute to use in a heterogeneous (CPU/GPU) system is also difficult to figure out. Multi-core is likely the most devastating dragon to arrive since this article was written.

    Premature optimization might be evil, but it's the only way to efficiently align the software with the memory architecture. E.g., in a Unity application, rewriting from game objects to ECS is basically like starting over.

    If you could only focus on one aspect, I would keep the average temperature of L1 in mind constantly. If you can keep it semi-warm, nothing else really matters. There are very few problems that a modern CPU can't chew through ~instantly assuming the working set is in L1 and there is no contention with other threads.

    This is the same thinking that drives some of us to use SQLite over hosted SQL providers. Thinking in terms of not just the information, but the latency domain of the information, is what can unlock those bananas 1000x+ speedups.
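    A toy sketch of the "keep L1 warm" point. Python is used purely for brevity, so the absolute numbers mostly measure interpreter overhead, and the sizes and stride below are illustrative assumptions, not tuned to any particular CPU. Both loops touch the same number of elements; one walks contiguously (cache-line friendly), the other jumps a large stride, defeating spatial locality.

```python
import time

N = 1 << 20          # ~1M ints
STRIDE = 4096        # jump far enough that nearly every access is a new cache line
data = list(range(N))

def touch_contiguous(buf, count):
    """Sum `count` elements walking sequentially from the start."""
    total = 0
    for i in range(count):
        total += buf[i]
    return total

def touch_strided(buf, count, stride):
    """Sum `count` elements, jumping `stride` slots each time (wrapping)."""
    total = 0
    n = len(buf)
    idx = 0
    for _ in range(count):
        total += buf[idx]
        idx = (idx + stride) % n
    return total

count = N // 8
t0 = time.perf_counter()
touch_contiguous(data, count)
t1 = time.perf_counter()
touch_strided(data, count, STRIDE)
t2 = time.perf_counter()
print(f"contiguous: {t1 - t0:.4f}s  strided: {t2 - t1:.4f}s")
```

    On most machines the strided walk comes out slower, though the gap varies; in a systems language without interpreter overhead the effect is far more dramatic.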

  • by varjag on 6/17/25, 5:40 PM

    Such an evergreen observation:

    > Nowadays one often encounters the opinion that in the sixties programming has been an overpaid profession, and that in the coming years programmer salaries may be expected to go down. Usually this opinion is expressed in connection with the recession, but it could be a symptom of something different and quite healthy, viz. that perhaps the programmers of the past decade have not done so good a job as they should have done. Society is getting dissatisfied with the performance of programmers and of their products.

  • by saghm on 6/17/25, 3:14 PM

    > A study of program structure had revealed that programs —even alternative programs for the same task and with the same mathematical content— can differ tremendously in their intellectual manageability. A number of rules have been discovered, violation of which will either seriously impair or totally destroy the intellectual manageability of the program. These rules are of two kinds. Those of the first kind are easily imposed mechanically, viz. by a suitably chosen programming language. Examples are the exclusion of goto-statements and of procedures with more than one output parameter. For those of the second kind I at least —but that may be due to lack of competence on my side— see no way of imposing them mechanically, as it seems to need some sort of automatic theorem prover for which I have no existence proof. Therefore, for the time being and perhaps forever, the rules of the second kind present themselves as elements of discipline required from the programmer. Some of the rules I have in mind are so clear that they can be taught and that there never needs to be an argument as to whether a given program violates them or not. Examples are the requirements that no loop should be written down without providing a proof for termination nor without stating the relation whose invariance will not be destroyed by the execution of the repeatable statement.

    Interestingly, designing a language that enforces termination proofs for loops is actually possible; Coq, for example, does pretty much exactly this, as I understand it. The consequence is that the language isn't Turing-complete, but Turing completeness may not be as necessary, for as many things, as it might otherwise seem.
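    A minimal sketch in Lean 4, a close relative of Coq on this point (the function names are made up): a total language accepts a recursive definition only when the checker can see why it terminates, here because the argument is structurally smaller on each call.

```lean
-- Accepted: structural recursion, the argument shrinks from `n + 1` to `n`.
def sumTo : Nat → Nat
  | 0     => 0
  | n + 1 => (n + 1) + sumTo n

-- Rejected: no decreasing argument, so the termination checker errors out.
-- def spin : Nat → Nat
--   | n => spin n
```

    The price, as saghm says, is Turing completeness: every definition such a language accepts provably halts.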

  • by ddtaylor on 6/17/25, 2:43 PM

    What computer is he referring to?

    > When these machines were announced and their functional specifications became known, quite a few among us must have become quite miserable; at least I was. It was only reasonable to expect that such machines would flood the computing community, and it was therefore all the more important that their design should be as sound as possible. But the design embodied such serious flaws that I felt that with a single stroke the progress of computing science had been retarded by at least ten years: it was then that I had the blackest week in the whole of my professional life. Perhaps the most saddening thing now is that, even after all those years of frustrating experience, still so many people honestly believe that some law of nature tells us that machines have to be that way. They silence their doubts by observing how many of these machines have been sold, and derive from that observation the false sense of security that, after all, the design cannot have been that bad. But upon closer inspection, that line of defense has the same convincing strength as the argument that cigarette smoking must be healthy because so many people do it.

  • by Michelangelo11 on 6/17/25, 3:17 PM

    > The first effect of teaching a methodology —rather than disseminating knowledge— is that of enhancing the capacities of the already capable, thus magnifying the difference in intelligence.

    Absolutely right, with the implication that new capabilities made suddenly available to everyone often end up making the playing field more unequal, not less.

  • by Nicook on 6/17/25, 4:59 PM

    Java really needs to take the

    > baroque monstrosity

    warning to heart. Probably beating a dead horse here, but there are way too many tools, and they keep adding more.

  • by stereolambda on 6/17/25, 10:46 AM

    In the articles and talks from that time, people often take the perspective of what society as a whole (with its organizations) wants from the "automatic computers" and from programmers as a profession. Compare also something like Grace Hopper's 1982 talk on YouTube. Now I think it's mostly the perspective of companies, teams, the industry. Did this shift happen in the 1990s? I'm guessing here.

    I guess there is still something left of the concept of a programming language as a tool for top-down shaping and guiding the thinking of its users, Pascal being the classic example. Golang tries to be like that. I get how annoying it can be. I don't know how JS/TypeScript constructs evolve, but I suspect it's more Fortran-style committee planning than trying to "enlighten" people into doing the "right" things. Happy to be corrected on this.

    Maybe the hardest point to interpret in hindsight is that in the sixties programming was an overpaid profession, that hardware costs would keep dropping, and that software costs could not stay the same ("You cannot expect society to accept this, and therefore we must learn to program an order of magnitude more effectively"). Yeah, in some sense, what does paying for software even mean anymore?

    But interestingly, the situation now is kind of similar to the very old days: a bunch of mainframe ("cloud") owners paying programmers to program and manage their machines. And maybe effectiveness really has gone up dramatically: there's relatively little software running compared to the crazy volume of metal machines, even though programmers at that scale are still paid a lot. It's not like you get a team of 10 guys programming each individual server.

  • by augustk on 6/17/25, 1:52 PM

    The interview "Discipline in Thought" is also quite interesting:

    https://www.youtube.com/watch?v=mLEOZO1GwVc

  • by puttycat on 6/17/25, 1:13 PM

    What a joy to find a plaintext HTML page (and such a wonderful text of course).

  • by selcuka on 6/17/25, 6:35 AM

    > The sooner we can forget that FORTRAN has ever existed, the better, for as a vehicle of thought it is no longer adequate: it wastes our brainpower, is too risky and therefore too expensive to use.

    Apparently the ISO/IEC 1539-1:2023 [1] committee didn't get the memo.

    [1] https://www.iso.org/standard/82170.html

  • by b0a04gl on 6/17/25, 12:34 PM

    Dijkstra's take aged better than most things from that era. Still see teams chasing fast output over clean design and hitting walls later. The mind-map linked in the thread does a decent job condensing it. Worth a skim even if you've read the essay before.

  • by enord on 6/17/25, 6:53 AM

    It’s a real shame Dijkstra rubbed so many people the wrong way.

    Maybe his incisive polemic, which I greatly enjoy, was all but pandering to a certain elitist sensibility in the end.

    To make manageable programs, you have to trade off execution speed, both on the CPU and in the organization. His rather mathematized prescriptions imply we should hire quarrelsome academics such as him to reduce performance and slow down product development [initially…], all in the interest of his stratified sensibilities of elegance and simplicity.

    Sucks to be right when that’s the truth.

  • by ddtaylor on 6/17/25, 2:39 PM

    > In this sense the electronic industry has not solved a single problem, it has only created them, it has created the problem of using its products.

    Oh boy does that read VERY true today!

  • by gobblik on 6/17/25, 10:45 AM

    Or, for the esolangers: The Less Humble Programmer http://digitalhumanities.org/dhq/vol/17/2/000698/000698.html

  • by jsonchao on 6/17/25, 7:41 AM

    This is what I thought~

  • by dkarl on 6/17/25, 2:45 PM

    > But if you take as “performance” the duty cycle of the machine’s various components, little will prevent you from ending up with a design in which the major part of your performance goal is reached by internal housekeeping activities of doubtful necessity

    JITs have taken this to an even higher level — people don't just argue that the machine is fast enough to run their convoluted code with countless unnecessary layers, they argue that their code as they've written it won't be run at all: the JIT will reduce it to a simpler form that can be handled efficiently.

    But they can't explain why their poor coworkers who have to read and maintain the code don't deserve the same consideration as the machine!
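    The pattern dkarl describes can be sketched with a deliberately contrived pair of functions (Python here purely for brevity; the argument is usually made about JVM and JavaScript JITs, and the names below are invented for illustration). Both compute the same sum of squares; a good JIT may well collapse the first into the second, but the reader never gets that luxury.

```python
class Box:
    """A needless wrapper: exists only so the 'layered' path has layers."""
    def __init__(self, value):
        self.value = value

    def get(self):
        return self.value


def apply_op(box, op):
    """Indirection for its own sake: unwrap the box, then call back."""
    return op(box.get())


def layered_sum_of_squares(xs):
    # Every element takes a detour through an object and a callback.
    return sum(apply_op(Box(x), lambda v: v * v) for x in xs)


def direct_sum_of_squares(xs):
    # What the JIT (and the human reader) would rather see.
    return sum(x * x for x in xs)
```

    Whether the optimizer actually erases the layers is runtime-dependent; the maintenance cost of reading through them is not.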