by whatrocks on 3/19/23, 2:37 AM with 88 comments
by jmmv on 3/19/23, 3:25 AM
But I somehow find it a little bit sad that this is the case, so I’ll plug my own https://www.endbasic.dev/ because it’s very fitting in this context :) I’ve been building it precisely as a way to understand everything that’s going on by offering that 1980s experience (although it’s still far from fulfilling the full promise).
Also, buried in the article is a reference to the https://10print.org/ book. I recently came across it at HPB and it has been a pretty entertaining read. Couldn’t believe there was so much to write about such a simple little program!
by timsneath on 3/19/23, 3:39 AM
That said, those who missed out on this era also missed out on quite how limited the available sources of information were. Without online services to consult, the primary source of information being trial and error, and one incorrect machine code instruction leading to loss of all data entered, progress was very slow going. A committed learner could certainly make far faster progress with a more modern environment.
For today's generation, I'm grateful for books like Charles Petzold's Code [4], which constructs a computer architecture from first principles. The joy is still there waiting to be found!
[1]: https://archive.org/details/sinclair-user-magazine-033/page/...
[2]: https://archive.org/details/PersonalComputerWorld1984-01/pag...
[3]: https://archive.org/details/sinclair-user-magazine-044/page/...
[4]: https://www.microsoftpressstore.com/store/code-the-hidden-la...
by DeathArrow on 3/19/23, 9:53 AM
Even if they were simple, it was hard to program them because there was a very limited amount of information on how to do it. The few books available didn't cover a lot of things. Learning was mostly done by trying lots of things until you succeeded. And that took a lot of time and patience. The Internet, Stack Overflow, Reddit, YouTube, forums, Udemy, GitHub, and the thousands of tutorials, examples and documentation sites make things a lot easier.
I started to learn programming on an 8-bit Sinclair ZX Spectrum, and the only good things that came out of that are that it taught me to work on very constrained systems and to build up the patience and will to try and fail until I succeeded.
4 or 5 years later, when I experienced IBM PCs at school, it felt like going from a horse and carriage to a rocket. Yes, the rocket might be a bit harder to maneuver, but you can do many more things, faster.
by inpdx on 3/19/23, 4:03 AM
by dmje on 3/19/23, 8:00 AM
Many weekends were spent typing lines and lines of code to make simple games. Then we'd save stuff to tape - after we'd spent hours and hours debugging, of course.
Elite occupied me for months. Then Forest at World's End, which I mapped out on a bunch of sheets of A4 taped together.
When I got older I hacked my joystick port and connected it to a water chaos wheel I'd made out of an old bicycle rim and some other bits, then I wrote a BASIC program to visualise the movement of the wheel, monitoring direction via the hacked joystick port.
Oh man. Fun, fun times :-)
by cortesoft on 3/19/23, 3:33 AM
by jmull on 3/19/23, 3:54 AM
by DeathArrow on 3/19/23, 10:18 AM
Not entirely true. While as a kid I learned many of the ins and outs of my ZX Spectrum clone, from the limited info I could gather and from tinkering, I tried to learn about more complex systems later, as much as I could.
I learned x86 assembly under MS-DOS, I learned to write device drivers in C for Windows, I learned a bit of Linux system programming in university, I learned a bit of OpenGL and shaders, I picked up a few bits about hardware, and I learned about logic gates like NAND and simple digital circuitry. And those are basic things I learned a long time ago.
Having low-level knowledge is useful, but so is having higher-level knowledge. I think concepts like algorithms, parallel and concurrent programming, formal languages and automata theory, cryptography, statistics, machine learning and other high-level stuff I came across in university were equally useful.
I tackled many areas of programming: desktop software, device drivers, embedded software, video games, mobile apps, web front-end, web back-end. Now I am building microservice-based apps with Kubernetes and Azure. I am thinking of brushing up my knowledge on ML.
I liked pretty much everything I did and I approached everything with a learning mentality.
One can't learn everything like in the '80s, but one can learn a lot of things to stay entertained and to accomplish great things, while still having enough knowledge of how things work under the hood.
I am probably not an expert in any one field of programming, but I know enough to be useful in many areas. I'd rather be a jack of all trades than highly specialized, because more than one thing interests me, I am always curious about different things, and I like to learn. That being said, being an expert in one thing is not a bad place to be, and experts can be paid a lot.
by zabzonk on 3/19/23, 4:47 AM
10 print "fuck you"
20 goto 10
and then running out.
some may be running major companies now.
by smackeyacky on 3/19/23, 5:19 AM
Those of us who owned offbeat PCs in the early 1980s probably were more motivated to learn our machines as the games and whatnot were much more limited. Wish I had never given away my Microbee...
by teddyh on 3/19/23, 4:26 PM
The most widely used version of Basic, the one which Commodore (and other) platforms shipped, does not have functions. This makes Basic programs tend toward spaghetti and unreadable code, especially considering the constant memory constraints of those platforms. I grew up on these systems, and every time I think back on it I wish something like Forth had taken its place – i.e. something with a clean and scalable pattern for abstraction. Basic, on the other hand, doesn't do abstractions. It barely has data types, and what it calls “functions” are an even more limited form of Python-style lambdas; every piece of code which can actually do something looks like “GOSUB 11600” when you call it. No naming, no abstractions, nothing. (No parameters or return values, only global variables.)
(This is in some ways even worse than assembler, which usually has labeled goto’s.)
When I programmed in Basic those many years ago, I stalled when my programs reached a certain level of complexity. I was then mostly halted in my education and development as a programmer for many years, because the language did not make program composition easy. It was not until I had the opportunity to learn other languages which did have proper functions and other means of program composition that I could break through the barrier, so to speak.
(Reportedly, BBC Basic on the BBC Micro did have proper named functions, and later versions of Basic like on the Atari ST and Amiga also had them. I believe that those versions of Basic would have been vastly more productive and taught people the usefulness of abstracting things as you go, building ever higher abstractions, etc. But this is never the version of Basic which people talk about, or used by all those listings in magazines, etc. These are, for all intents and purposes, not the “80s style Basic” which everybody remembers with such apparent and baffling fondness.)
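To make the difference concrete, here is a rough sketch from memory (exact syntax varies by dialect, and the line numbers and names are purely illustrative). Commodore-style Basic, where the “subroutine” has no name, no parameters and no local variables:

    10 A=3 : B=4 : GOSUB 1000 : PRINT H
    20 END
    1000 REM HYPOTENUSE OF A,B INTO H - EVERYTHING IS A GLOBAL
    1010 H=SQR(A*A+B*B)
    1020 RETURN

versus something like BBC Basic, where you can define and call a named function with real parameters:

    10 PRINT FNhyp(3,4)
    20 END
    30 DEF FNhyp(a,b) = SQR(a*a+b*b)

The second style scales to bigger programs; the first collapses into a pile of GOSUBs and shared variables rather quickly.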
(Mostly a repost of a previous post of mine: https://news.ycombinator.com/item?id=34033513)
by rtpg on 3/19/23, 9:39 AM
I value "infinite online resources" but having integrated books of documentation in the IDE includes such valuable writing. I miss it so much when going through the hastily-written "getting started" tutorials I end up with nowadays (the scope of problems trying to be solved is way different of course)
by dang on 3/19/23, 4:09 AM
Learning BASIC Like It's 1983 - https://news.ycombinator.com/item?id=17900494 - Sept 2018 (160 comments)
by Koshkin on 3/19/23, 2:30 PM
https://micromite.org/product-category/maximites/colour-maxi...
by logicalshift on 3/19/23, 10:11 AM
An interesting thing is that RISC OS is still available for the Raspberry Pi and it's a direct descendant from the operating system of the BBC Micro - not emulated. It still has the same level of direct hardware access, so if you ever wanted to use peek and poke (well, those are the ! and ? operators in BBC BASIC) on some modern graphics hardware, there's a way to do it. There's a built-in ARM assembler in there too.
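For anyone who hasn't seen them, ? and ! are BBC BASIC's byte and word indirection operators, which cover the same ground as PEEK/POKE. A small sketch from memory (the buffer here is just ordinary memory reserved by the program, not a real hardware register):

    10 DIM buf 16         : REM reserve a small block of memory; buf holds its address
    20 ?buf = 42          : REM write one byte, like POKE buf,42
    30 buf!4 = &CAFE      : REM write a 4-byte word at offset 4
    40 PRINT ?buf, buf!4  : REM read them back, like PEEK

On RISC OS you can point the same operators at memory-mapped hardware, which is what keeps the old peek-and-poke style possible.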
What I think was really different about the time was the quality of the documentation. Nothing modern has the same sense of empathy for the user or achieves the same combination of conciseness and comprehensiveness. For instance, here's the BBC Micro's Advanced User Guide: https://stardot.org.uk/mirrors/www.bbcdocs.com/filebase/esse... (it's of particular historical note, because today's ARM architecture grew out of this system). You could build the entire computer from parts using just this 500 page manual, and you'll note that it's not actually a huge amount more complicated than Ben Eater's 6502 breadboard computer.
Weird thing: RISC OS actually has backwards compatibility with some of the old APIs so some of the stuff in the advanced user guide still works today on a Raspberry Pi (plus it comes with a BBC Micro emulator which was originally written because Acorn didn't want their new machine to fail due to a lack of software). These days there's also https://bbcmic.ro of course :-)
The Programmer's Reference Manual for RISC OS is similarly well written, and surprisingly quite a lot of it is still relevant: most things still work on a Raspberry Pi, and even modern operating systems still work pretty much the same way on the architecture. While things like MEMC, IOC and VIDC are long dead, there's a pretty direct lineage from the modern hardware back to these older chips too.
by MissTake on 3/19/23, 12:25 PM
Learnt so much about the 6502 that way!
by PopAlongKid on 3/19/23, 2:06 PM
That was another way to learn BASIC like it's 1983 that I haven't seen mentioned yet.
by Jemm on 3/19/23, 12:22 PM
School had an Apple II but wouldn't let students use them. Instead they forced them to use punch cards so I didn't take the class. Instead I used a Commodore PET that was on display at a department store.
Eventually owned an Apple II clone and an IBM PC clone. First work computer was a Compaq Luggable with an amazing orange phosphor monochrome display.
Before the Internet was available we used BBSs and CompuServe. BBSs were horrible little fiefdoms run by basement-dwelling trolls.
Networking was still a toss-up between Ethernet and ARCnet. I liked ARCnet. You had to configure interrupts, ports, baud rates, stop bits, and parity for the ISA network cards. It was a pain.
Most business LANs used Novell NetWare or LANtastic. I loved LANtastic; it was easy and even had a voice-over-network feature. Still have a t-shirt from them somewhere.
The Internet arrived before Windows was usable and Microsoft wasn't ready. So you had to use a SOCKS client.
I made a lot of money in those days simply by hanging out in the computer section of the big book store. Managers would wander in like Bambi on a highway. When they saw me reading a book on computers they inevitably asked questions. It turned in to consulting work.
Fun times but also very frustrating. No real multi user, buggy products and operating systems, Linux was still very much 'assembly required'.
Now we have non-typed, high level, abstracted languages, and agile methodologies which are possibly a step too far in the other direction.
by bitwize on 3/19/23, 4:18 PM
8-bit micro BASIC development was based on the idea that it's the programmer's job to produce a complete program, and to understand all that it does from the time you type RUN until the program ends or is interrupted (by pressing Ctrl-C or RUN STOP, resetting the computer, etc.).
Today, most software developers develop program fragments that are plugged into a framework. The framework takes care of most of the details and only calls into your code for specialized logic. If you grew up programming BASIC (or Forth or Logo or Turbo Pascal), it can be confusing and frustrating to work this way because your intuitive sense of the program flow is completely disrupted.[0] I've found that younger programmers have fewer issues writing framework code. When their brains were still pliable, they learned that this is what programming is, so they adjusted to it. Even game programming, long the purview of hardcore bit diddlers, is high-level and framework-oriented thanks to engines like Unreal and Unity. Older programmers like me sometimes found their instincts and intuitions getting in the way. The ones who thrived are the ones who adapted, who stopped worrying and learned to love the framework.
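As a toy illustration of the old style (Commodore-ish Basic, sketched from memory; the key handling is purely illustrative): you own the entire flow, from polling the keyboard to drawing, and nothing runs that you didn't write.

    10 X=20
    20 GET K$               : REM poll the keyboard yourself
    30 IF K$="A" THEN X=X-1
    40 IF K$="D" THEN X=X+1
    50 IF X<1 THEN X=1
    60 IF X>38 THEN X=38
    70 PRINT TAB(X);"*"
    80 GOTO 20

In a framework, the equivalent of that loop belongs to the engine; you only hand it the key-handling and drawing bits as callbacks it invokes when it decides to.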
The entire discipline of programming is going to be disrupted again -- by AI. So today's programmers are going to be confused and frustrated when their jobs switch from writing code to prompt-engineering a model into writing code. But Gen Z will be right at home with it.
[0] I've found that working in Spring is for me an aggravating process because it involves guessing how to make ill-specified framework magic do what I want instead of, you know, writing a program that does what I want.
by heywhatupboys on 3/19/23, 1:34 PM
oh dearest, save us from false revisionist shit political takes on gender/race/politics in every modern journalistic piece. There was no stigma or bias for women going into CS/programming in 1983.
by j16sdiz on 3/19/23, 5:52 AM
by retrocryptid on 3/20/23, 1:26 PM
You had to go to your local toy or department store to get a C64 or a 1541. I remember the day the C64 price hit $200 at BEST (a discount, toy-heavy catalog store). But the 1541 was still something like $400, so... OOF!
by taubek on 3/19/23, 6:52 AM
by actually_a_dog on 3/19/23, 3:45 AM
And that's all a good story, except that it mostly didn't happen.
Not everybody was typing in BASIC listings, then spending hours debugging where the typo was. There was more than enough premade software, including games, out there for the major computers on the market (C64, Apple II, TI-99) that you could have a lot of fun without ever seeing a BASIC prompt.
And, while the manuals were much lower level then than they would be just a few short years later, and you could certainly learn a lot about how to control the machine by PEEKing and POKEing memory locations, the fact that POKEing some address changes the screen color doesn't tell you anything about how that happened. It's just as mysterious as how moving the mouse on a modern computer moves a pointer on the screen.
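The canonical example (if I remember the magic numbers right, these are the C64's VIC-II border and background color registers):

    10 POKE 53280,0             : REM border turns black
    20 POKE 53281,6             : REM background turns blue
    30 PRINT PEEK(53280) AND 15 : REM read the border color back; only the low 4 bits matter

Neat, but knowing the number 53280 tells you nothing about the video chip that actually does the work.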
BASIC is too high level to teach you anything meaningful about what's going on under the hood. But, fortunately, the machine languages of the home computers of this era were generally pretty well documented. However, even then, while you'd know there's this thing called a CPU and that it has things called registers and that you can give it instructions to read and write memory locations, those things are still pretty big abstractions if you want to claim to understand "how it all works."
by DeathArrow on 3/19/23, 10:22 AM
Learn a bit about GPUs, some OpenGL, DirectX, a shading language or Vulkan, and tinker with the GPU.
by DeathArrow on 3/19/23, 9:57 AM
Actually the best games and software were programmed in assembler and you had to load them from cassette tapes.
You couldn't do much in Basic.
by jbverschoor on 3/19/23, 9:48 AM
Couldn't find it online, but found the Sony one at https://hansotten.file-hunter.com/uploads/files/sonymsx2basi... Fun to see the difference.
by hfo on 3/19/23, 6:45 AM
I tried typing in listings, but you only knew that there was a typo somewhere after you finished hundreds of lines. Finding the typo was out of the question for me. Also, it was obvious that the result of the listing, usually a game, was of much lower quality than the games I already had or could copy from friends.
The reason I learned Basic was that I wanted to know how games work. They always fascinated me since I saw Space Invaders somewhere. I quickly understood, mostly with the help of more advanced friends, that you couldn't make games with Basic, it was too slow. You had to learn machine language.
So that's what I eventually did, and that's how I really understood the machine, down to the point where I could tell what almost every single one of the roughly 40000 available bytes did. It took a long time to get there; in hindsight, those 8-bit machines were already quite layered. How 6502 instructions, assembler, disk I/O, joystick input and graphical output were tucked together into what today would probably be called the Atari API was not immediately obvious; it was the result of 20 years of technical development, and nowhere was it explained for a 12-year-old!
My enlightenment moment was this dialog with a friend. Me: "Why does this assembler program crash? Why do I have to reset? Why can't the computer handle it?" Friend: "Because deep down, executing your program is also a program. If your program crashes, that program crashes." I think that was the most profound lesson ever for me. It's programs all the way down!
So, yes, I know that CPUs have their own instructions and that every programming language ultimately compiles to that. But that knowledge helped me little with what I consider the next big learning steps over the decades: learning C on x86, learning how Unix/Linux works, learning what the internet is fundamentally built on, learning Javascript+HTML5, learning how fundamentally different asynchronous programming is when you can't assume that I/O will respond immediately, or possibly ever.
My favorite language today is vanilla javascript. I love the simplicity, no compiler insisting on type safety, a great UI, almost platform independent, lots of cool APIs. I think JS is as remote from Assembler as you can get.
Bottom line, I think it really doesn't matter to know about machine instructions, same as it didn't matter at the time how CPUs worked on the hardware level. That still mystifies me: the 6502 equivalent of an if-clause was branch-not-equal (BNE), but how did that work in reality? What's happening on the silicon then? How can a lifeless thing make a decision? Never really understood what's beneath the turtles.
by ar9av on 3/19/23, 4:17 AM