by morph123 on 4/2/23, 7:08 PM with 46 comments
I always look enviously back at when you had to figure out bootstrappy ways to solve problems due to inherent hardware limitations, instead of just sitting there gluing frameworks together like a loser.
Am I wrong or was programming just way cooler back then?
by alkonaut on 4/2/23, 10:35 PM
For anything that isn’t IO bound you are very often bound by memory access time. CPUs are incredibly fast and feeding them data to work on is very difficult, more so today than before!
The big difference is that a category of programming jobs has appeared where this is almost never a concern because it’s about shoving text around between servers.
by lousken on 4/2/23, 11:31 PM
These days, developers have no shame - even Microsoft showed that Teams was an utter piece of garbage in both memory and CPU usage https://youtu.be/CT7nnXej2K4 . Are they even serious with 22-second load times on modern hardware and multiple gigabytes of memory usage for a stupid chat app?
We still use laptops with 8GB of RAM in our company for positions like HR, marketing and others, and the fact that Teams and a couple of browser tabs can totally kill their PCs is insane. Laptop manufacturers haven't caught up with this either: they still charge a premium for 32GB machines, while 8GB is no longer sufficient for anything and 16GB should be the baseline these days.
And of course this is not just Teams but many other applications. Too bad most users don't understand the issue and think that something is wrong with their computers.
Regular users are not great testers. Testing used to be a dedicated job...
Anyway I think there should be a lot more pressure on developers to use our current hardware more efficiently
by GianFabien on 4/2/23, 10:15 PM
Since you mention an interest in this space, consider buying an Arduino kit and building some cool personal projects. Once you have some experience, look around your part of the world and identify any needs that you could solve with an embedded solution.
by rerdavies on 4/2/23, 11:22 PM
The machine had an 8-bit Z80 processor, 128k of bank-swappable memory (64k address space), and two 8" 240kb floppy disk drives.
The 8080 assembler source for my module was spread over 14 floppy disks. Compiling involved meticulously swapping floppies in the correct order for about a day and a half. Inserting a floppy in the wrong order would cause the build to fail, and you'd have to start over from scratch.
Despite that, one had a breathless sense that you were working with a technology that was about to change everything. The implications of the coming technological age were unmistakable. And the only limit was imagination and ingenuity.
Having a MILLION instructions a second to play with seemed limitless; but having to fit things into 16kb memory pages felt a bit constraining. For the most part, user data had to fit in a 16kb memory page because that was all that was left over after the operating system was loaded.
In retrospect, there's only so much you can do with a 4MHz processor when you only have 16kb of data to work with, so processing power wasn't the constraint.
Probably exactly the same as what it's like to be working with AI technology today. Everything is going to change. And the future is going to be unrecognizable. And you're sitting in the middle of it.
So yes, absolutely, it was way cooler back then. Unless you're currently working on AI projects.
by drpixie on 4/2/23, 11:02 PM
Compared with (say) 20 years ago, you've got 20-50 times more memory, and your data is 20-50 times (or more) bigger.
We're doing the same as always, but using more bits, because bits are cheaper.
by jitl on 4/3/23, 12:50 AM
Sure, there’s more memory these days and many languages make memory-correctness errors a thing of the past, but play your cards wrong and you’ll exhaust your users’ resources just the same. In many ways it’s harder today because you can’t easily look at an abstraction and understand its memory or cache cost in languages like Java, Python, or JavaScript. You just pray it’ll get optimized or won’t cost too much, and that big-O analysis is enough. I can’t tell you how many times I’ve heard programmers working with GC languages worry about GC pressure or GC pauses.
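For a sense of how opaque those costs can be, here’s a small Python sketch comparing a list of boxed integers with a packed array of the same values (sizes are CPython-specific and approximate):

```python
import sys
from array import array

# One million small integers as a Python list: each element is a
# full heap object (~28 bytes in CPython) plus an 8-byte pointer
# stored in the list itself.
boxed = list(range(1_000_000))

# The same values packed as raw 8-byte machine integers.
packed = array("q", range(1_000_000))

list_bytes = sys.getsizeof(boxed) + sum(sys.getsizeof(n) for n in boxed)
array_bytes = sys.getsizeof(packed)

print(f"list:  {list_bytes / 1e6:.1f} MB")
print(f"array: {array_bytes / 1e6:.1f} MB")
```

The two containers hold identical values, yet the list costs several times more memory, and nothing in the source code hints at that.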
You can always work in Zig, C++, Rust, etc. to make reasoning about memory use easier. There are more positions today working in those contexts than in the past, even though they’re a smaller percentage of the overall market.
by PaulHoule on 4/2/23, 7:11 PM
I’d say that garbage collection was key to large-scale software reuse: without it, you’d need a lot of cooperation between libraries and applications (how does the application know the library doesn’t need a piece of memory, or vice versa?).
by xboxnolifes on 4/2/23, 10:34 PM
The more constraints you remove, the more something transitions from being a puzzle to being a canvas. Each end appeals to different kinds of people.
by bell-cot on 4/2/23, 8:08 PM
Memory mattered. Ditto CPU cycles, disk, and network bandwidth. (Not that your code could count on there being a network, in a lot of cases.) Security wasn't a Garden of Eden...but now that can feel more like being a night-shift ICU nurse during a COVID surge that never ends. And the software stack under your code was orders of magnitude smaller, and slower-changing. These days - that can feel like you aren't a programmer, but instead a Wall Street attorney who's trying to pilot some giga-corporate merger through the shifting regulatory and political obstacles in 37 different countries.
by dave4420 on 4/2/23, 7:56 PM
I don’t miss low level programming in a work context at all.
by japhyr on 4/3/23, 12:04 AM
That was a time when Python was rarely someone's first language. People would say, "Try Python, your programs will be about a third as long as they are now!" I was skeptical, but holy heck those people were right.
I am academic enough to enjoy the intellectual challenge of being conscious of memory usage. But when you had to pay attention to it in every project, it was not always fun. It was really fun to use Python and forget about memory for a while. I built many projects in Python that I probably wouldn't have made time for in Java.
I really enjoy programming these days, where you can forget about memory until you have a reason to optimize. Then when you do need to optimize, you can profile your code with fantastic tools, reason about memory enough to get things working efficiently again, and move on with the project. I really enjoy hopping back and forth between focus on a larger project, and focus on the inner workings of a project.
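Python's built-in tracemalloc is one of those tools; a minimal sketch of that "profile when you need to" workflow (the list comprehension is just a stand-in for a real hotspot):

```python
import tracemalloc

tracemalloc.start()

# A stand-in hotspot: building a large throwaway list.
squares = [n * n for n in range(500_000)]

# Snapshot of allocations made since start(): current live bytes
# and the high-water mark.
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"current: {current / 1e6:.1f} MB, peak: {peak / 1e6:.1f} MB")
```

A few lines of instrumentation, reason about the numbers, fix the hotspot, and get back to the larger project.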
I think we're seeing the same kind of fundamental change with GPT-assisted programming. There's a whole generation starting to use these tools right now who will ask a question on a 2040s version of HN, "Was programming more interesting when correct syntax was a concern?"
by mitchellpkt on 4/2/23, 9:33 PM
by Dwedit on 4/2/23, 10:25 PM
by wruza on 4/3/23, 3:03 AM
I was pretty late into it (ca 2000) and can only imagine the pain of switching banks, mapping windows, dealing with segments and overlays, unloading things. When you hear about “cache invalidation and naming things”, don’t you ask yourself what the hell is “cache invalidation”? Lucky you.
There’s so much you can do today in just a few hours. But if you want that feeling, start with npm-remove-ing your bundler. That’s not the same, but the feeling will be the same.
by picklebarrel on 4/2/23, 11:31 PM
by eesmith on 4/2/23, 8:20 PM
However, people go into programming for many reasons. For those who were not interested in squeezing out every word, it was not interesting. Probably some people realized their tasks couldn't be done in the limited hardware of the time, so didn't even try until memory became significantly cheaper.
The past can seem like a golden era of mythic heroes.
Some from the generation you envy are themselves envious of the previous generation, where programmers were expected to write their own operating system - the vendor didn't provide one - and pull out a soldering iron and oscilloscope for debugging.
When many were writing yet another COBOL program for accounts processing.
by jrexilius on 4/3/23, 12:04 AM
[edit to add:] Although, another part of having to understand memory usage, for example, meant that you really had to understand the fundamental building materials you were working with. Whereas now, no one has the man-hours to really understand the stack of tens of thousands of lines of node shit you have to glue together to do some basic task. You feel more excited when you are really mastering the materials, I think.
by fsociety on 4/2/23, 11:03 PM
by cc101 on 4/2/23, 11:28 PM
by helph67 on 4/2/23, 10:20 PM
by quechimba on 4/2/23, 11:41 PM
by kadoban on 4/2/23, 11:30 PM
You're wrong, to be blunt.
Sure there's some interesting challenges to solve around keeping memory usage low, but mostly it was just tedious and limiting.
Programming has enough challenges and interesting optimizations without worrying as much as one used to have to about memory.
by DamonHD on 4/2/23, 7:13 PM
by markus_zhang on 4/2/23, 9:29 PM
by mistrial9 on 4/2/23, 10:37 PM
by revelio on 4/2/23, 8:52 PM
Anyway. No, it really wasn't cool.
The thing to realize is that software wasn't really that much better optimized for RAM than today. Maybe a bit, but mostly it just created a lot of pain:
- Every single API was specified to be able to fail due to OOM, and a lot of programmers tried to check for this and recover. However, there were no good ways to simulate or test it, and unit testing was in its infancy as a practice (e.g. no widely adopted frameworks for testing), and so in practice this just yielded a ton of codepaths and boilerplate that never really got tested and probably didn't work.
- Because RAM was so tight, you pretty much had to use the operating system's APIs for everything, even if they sucked or were full of bugs. It wasn't just RAM of course, it was also disk space, CD space, nearly non-existent network bandwidth. Duplication of what Windows had wasn't feasible. It meant everything was pretty consistent, which had its upsides, but it also meant that everything was consistently quite ugly and the horrible Windows/macOS APIs were all you got.
- Memory limits resulted in awkward APIs. No type safety, no enums (only bit flags), no reflection, lots of annoying (and by the late 90s obsolete) memory locking protocols that were holdovers from win16, and "rerendering" literally meant re-drawing small areas of the screen because pixels weren't cached. In particular a lot of APIs required you to call them twice, once to figure out how much memory something would require, and then a second time to fill out that memory once it was allocated.
- Error messages? Logging? Hah no. You get 32 bit error codes because there isn't enough memory or disk space for all the strings proper logging and errors would require. So you got really good at decoding HRESULTs.
- Garbage collection existed as a tech but because swap was so slow and RAM so tight it was very easy for it to trigger swap storms, so in common practice automatic memory management (when it existed) was all based on refcounting. So you couldn't quite just forget about it because you could still leak memory pretty easily even using a higher level language like VB.
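The classic failure mode of pure refcounting is the reference cycle; CPython still refcounts today, and its cycle collector exists precisely because of this. A quick demonstration:

```python
import gc

class Node:
    def __init__(self):
        self.partner = None

# Two objects referring to each other: their refcounts never drop
# to zero, so under pure refcounting (as in classic VB/COM) they
# would simply leak once the last outside reference is dropped.
a, b = Node(), Node()
a.partner, b.partner = b, a
del a, b

# CPython's supplementary cycle collector reclaims them.
leaked = gc.collect()
print(f"cycle collector found {leaked} unreachable objects")
```

Under a runtime with only refcounting, that `del` would be a silent leak, which is exactly why "automatic" memory management back then still demanded attention.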
The transition towards relative memory abundance was fairly painful. For example, Java became popular with devs because it had all those nice things that used a lot of memory, but it led to slow/swappy apps that were painful for users. You still see the same complaints w.r.t. Electron apps, though today machines can take it, so it's more about people feeling it's wasteful rather than it literally killing the responsiveness of the machine like it used to.
Of course in the above I say "was" and "used to be" but if you ever have to do any Win32 or low level POSIX programming you'll be right back in that world.
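Decoding those 32-bit error codes, by the way, is just bit-twiddling once you know the HRESULT layout (severity bit, facility field, error code). A sketch in Python:

```python
def decode_hresult(hr: int) -> dict:
    """Split a 32-bit HRESULT into its main fields."""
    return {
        "failed":   bool(hr & 0x8000_0000),  # bit 31: severity
        "facility": (hr >> 16) & 0x7FF,      # bits 16-26: facility
        "code":     hr & 0xFFFF,             # bits 0-15: error code
    }

# 0x80070005: failed, facility 7 (FACILITY_WIN32),
# code 5 (ERROR_ACCESS_DENIED).
print(decode_hresult(0x80070005))
```

That was the debugging experience: a hex constant, a facility table, and a Win32 error-code list instead of a log message.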
by butterisgood on 4/2/23, 11:30 PM
by eurticket on 4/3/23, 12:14 AM
by zxcvbnm on 4/2/23, 7:24 PM
by kazinator on 4/2/23, 11:48 PM
I checked this morning; it was still the case.
by nathants on 4/2/23, 8:40 PM
by boppo1 on 4/2/23, 11:58 PM
by czzr on 4/2/23, 10:30 PM