from Hacker News

The Worst CPUs Ever Made (2021)

by fzliu on 5/19/22, 8:00 PM with 151 comments

  • by hajile on 5/19/22, 9:09 PM

    This doesn't seem to be the best-researched article out there.

    If they thought Itanium was bad, they should have looked into the i860. Itanium was an attempt to fix a bunch of the i860 ideas. i860 quickly went from a supercomputer chip to a cheap DSP alternative (where it had at least the hope of hitting more than 10% of its theoretical performance).

    Intel's iAPX 432 was preached as the second coming back in the 80s, but failed spectacularly. The i960 was take two, and their joint venture BiiN also shuttered. Maybe Rekursiv would be worthy of a mention here too.

    We now know that Core 2 dropped all kinds of safety features, resulting in the Meltdown vulnerabilities. It also partially explains why AMD couldn't keep up, as these shortcuts gave a big advantage (though security papers at the time predicted that Meltdown-style attacks existed due to the changes).

    Rather than an "honorable mention", the Cell processor should have easily topped the list of designs they mentioned. It was terrible in the PS3 (with few games, if any, able to make full use of it) and it was terrible in the couple of supercomputers that got stuck with it.

    I'd also note that Bulldozer is maligned more than it should be. There's a lot to like about the concept of CMT, and for the price, they weren't the worst. I'd even go so far as to say that if AMD hadn't been so starved for R&D money during that period, they might have been able to make it work. ARM's latest A510 shares more than a few similarities. A big/little or big/little/little CMT architecture seems like a very interesting approach to explore in the future.

  • by scrlk on 5/19/22, 10:08 PM

    A different twist on the Itanium: technically bad but ended up as a strategic win for Intel.

    SGI, Compaq and HP mothballed development of their own CPUs (MIPS/Alpha/PA-RISC) as they all settled on Itanium for future products.

    After Itanium turned out to be a flop, those companies adopted x86-64 - Intel killed off 3 competing ISAs by shipping a bad product.

  • by SeanLuke on 5/19/22, 9:09 PM

    What does "worst CPU" mean? I think that it means, regardless of market success, the CPU that most hindered, indeed retarded, progress in CPU engineering history. In this regard, #1 and #2 are clearly the 8088 and 80286 respectively.

  • by drallison on 5/20/22, 5:25 AM

    Of course, what constitutes "worst" is a difficult question.

    Signetics made the 2650, a nice processor with a highly regular architecture and a condition code register. After every arithmetic operation, including loads and stores, the ALU updated the condition code register.

    The National 32032 processor was a wonderful part with a clarity of design that made it a great choice for a workhorse processor. Unix running on the machine was stable and efficient, except that every few weeks there would be a disastrous crash. With a tremendous amount of effort the source of the problem was found: a race condition in the interrupt control logic that returned from the wrong stack and scribbled over memory.

    The Intel i860 exposed the internal computational pipeline to the programmer. Context switching was complicated by the conflict between real-time operating performance requirements and a deep pipeline with no way to grab the context and drain the pipeline. Eventually a dedicated team got a Unix OS running on the part, but it performed poorly.

    The MasPar MP-1 was a SIMD machine. It was cool to test new library functions by seeing if, say, sqrt(x)*sqrt(x)==x for all floating point numbers. Customers wanted the MasPar machine to be timeshared, but the architecture made that difficult, since the CPU state was very large and memory was not mapped.
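
    The exhaustive check described above can be sketched in software. This is a slow, scalar stand-in for what a SIMD array does in parallel, spot-checking a contiguous run of bit patterns rather than all 2^32 floats; the helper names are illustrative, not from any MasPar library.

```python
import math
import struct

def fl(v):
    """Round a Python float to the nearest float32 value."""
    return struct.unpack("<f", struct.pack("<f", v))[0]

def bits_to_f32(b):
    """Reinterpret a 32-bit pattern as a float32 value."""
    return struct.unpack("<f", struct.pack("<I", b))[0]

def round_trip_ok(x):
    # Emulate single precision: take the float32 sqrt, square it,
    # round back to float32, and compare with the input.
    s = fl(math.sqrt(x))
    return fl(s * s) == x

# A SIMD machine could sweep every 32-bit pattern across its array;
# here we just walk a contiguous run of patterns starting at 1.0f.
start = 0x3F800000                      # bit pattern of 1.0f
sample = [bits_to_f32(b) for b in range(start, start + 10_000)]
failures = sum(not round_trip_ok(x) for x in sample)
print(f"{failures} of {len(sample)} values fail the sqrt round-trip")
```

    The point of the anecdote is that on a wide SIMD array such a sweep over the entire float32 space finishes quickly, which made exhaustive library testing practical.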

    Intel's 8048 (and simplified versions like the 8021 and enhanced versions like the 8051) did not perform as well in terms of speed or code size as many of the competing microcontrollers. The competition offered very simple, asymmetric architectures that could be programmed (possibly with external hardware assists) to accomplish embedded tasks, albeit with significant effort over several days or weeks. The Intel part was not quite as efficient in memory use and speed, but could be programmed in an afternoon. And another engineer/programmer could look at the code and understand it without much deep thought.

    The Motorola 68000 was a wonderful machine with a clear instruction set. But the original 68000 could not support virtual memory, since it could not recover from a mid-instruction bus fault and restart the faulting instruction.

    There have been all sorts of different architectures tried which seem strange today but came about because the architecture was thought to provide an engineering solution to an immediate problem. There was a time when register machines were thought to be a bad architecture, far inferior to a simple stack architecture.

  • by bstar77 on 5/19/22, 10:01 PM

    I would vote for the Pentium 4 for all the reasons mentioned in the article, but more importantly because it was initially coupled with Rambus memory. Intel pushed that tech hard to try to squeeze out AMD. Super-high-frequency, high-bandwidth, expensive memory with terrible latency was not the future anyone wanted. Intel's hubris back then was off the charts.

    I know Intel wanted Itanium to succeed for the same reasons, but the P4 hit closer to home since it actually shipped to consumers. Oddly enough, ExtremeTech was a huge shill for Intel back in those days. Funny they don't mention that in this article.

  • by McGlockenshire on 5/19/22, 9:43 PM

    I'm currently building a homebrew system based on the TMS99105A CPU, one of the final descendants of the TMS9900.

    It's a nifty little CPU. There are a lot of hidden little features once you dig in. It can actually address multiple separate 64K memory spaces: data memory, instruction memory, macroinstruction memory, and mapped memory with the assistance of a then-standard chip. Normally these are all the same space and just need external logic to differentiate them. There's also a completely separate serial and parallel hardware interface bus.

    The macroinstruction ("Macrostore") feature is pretty fun. There are sets of opcodes that decode as illegal instructions but, instead of immediately erroring out, go looking for a PC and workspace pointer (the "registers") in memory and jump there. Their commercial systems like the 990/12 used this feature to add floating point and other features like stack operations.

    Yup, there's no stack. Just the 16 "registers," which live in main memory. There are specific branch and return instructions that store the previous PC and register pointer in the top registers of the new "workspace," giving you direct access to the caller's context. The assembly language is simple and straightforward with few surprises, but it's also clearly an abstraction over the underlying mechanisms of the CPU. I believe that classifies this CPU as CISC incarnate.
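
    The workspace-switching call convention described above can be sketched as a toy model. This is a simplified, not bit-accurate, emulation of the TMS9900's BLWP/RTWP mechanism (real register addresses are at WP + 2n bytes, and this toy memory is word-addressed):

```python
# Toy model of the TMS9900 workspace scheme: the 16 "registers" are
# just 16 consecutive words in RAM pointed to by the workspace pointer
# (WP). A BLWP-style call switches WP to a fresh workspace and stashes
# the caller's context in the new workspace's top registers
# (R13 = old WP, R14 = old PC, R15 = old status).

ram = [0] * 0x8000          # word-addressed toy memory

class CPU:
    def __init__(self, wp, pc, st=0):
        self.wp, self.pc, self.st = wp, pc, st

    def reg(self, n):
        return ram[self.wp + n]

    def set_reg(self, n, value):
        ram[self.wp + n] = value

    def blwp(self, new_wp, new_pc):
        """Branch and load workspace pointer: a 'call' with no stack."""
        old_wp, old_pc = self.wp, self.pc
        self.wp = new_wp                 # switch the register file
        self.set_reg(13, old_wp)         # caller's workspace
        self.set_reg(14, old_pc)         # return address
        self.set_reg(15, self.st)        # caller's status
        self.pc = new_pc

    def rtwp(self):
        """Return with workspace pointer: restore the caller's context."""
        self.st = self.reg(15)
        self.pc = self.reg(14)
        self.wp = self.reg(13)

cpu = CPU(wp=0x0100, pc=0x2000)
cpu.blwp(0x0200, 0x3000)        # call with a fresh workspace
# The callee can read the caller's saved PC directly out of R14.
print(hex(cpu.wp), hex(cpu.pc), hex(cpu.reg(14)))
cpu.rtwp()
print(hex(cpu.wp), hex(cpu.pc))  # back in the caller
```

    Because the caller's entire register file is still sitting in RAM, the callee can inspect or modify it through R13, which is exactly the "direct access to the context of the caller" the comment describes.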

    There are some brilliant and insane people on the AtariAge forums! One of them managed to extract and post the data for a subset of those floating point instructions, then broke down how it all worked. Some are building new generations of previous TMS9900 systems. One is replicating the CPU in an FPGA. A few others are building things like a full-featured text editor and, of course, an operating system.

    I've learned a hell of a lot during this project. I've been documenting what I'm doing and am planning to eventually make it into a pretty build log. I think this is a beautiful dead platform that deserved better.

  • by StillBored on 5/19/22, 10:06 PM

    Man, that 6x86 CPU is still getting the short end of the stick nearly three decades later despite being a pretty solid chip.

    So, first, it generally had a higher IPC than anything else available (ignoring the P6). The smart marketing people at Cyrix decided they were going to sell it on a PR rating, which was the average performance on a number of benchmarks versus a similar Pentium - i.e. a Cyrix PR166 (clocked at 133MHz) was roughly the same perf as a 166MHz Pentium. Now, had they actually been selling it for an MSRP similar to a Pentium 166, that might have seemed a bit shady, but they were selling it closer to the price of a Pentium 75/90.

    Then along comes Quake, which is hand-optimized for the Pentium's U/V pipeline and happens to use floating point too. A number of people had pointed out that the 6x86's floating point perf was closer to its actual clock speed than to its PR rating, so suddenly you have a chip performing at much less than its PR rating, and certain people then proceeded to bring up the fact that it was more like a 90MHz Pentium in Quake than a 166MHz Pentium (something I'm sure made, say, Intel really happy) at every chance they got.

    So, yeah, here we are decades later putting a chip with what was generally a higher IPC than its competitors on a "shit" list mostly because of one benchmark - while hopefully all being aware that these shenanigans continue to this day: a certain company will be more than happy to cherry-pick a benchmark and talk up its product while ignoring all the benchmarks that make it look worse.

    Now, as far as motherboard compatibility: that was true to a certain extent if you didn't ensure your motherboard was certified for the higher bus rates the Cyrix required, and it also tended to draw more sustained current than the Intel chips the boards were initially designed for. So, yeah, the large print said "compatible with Socket 7," the fine print later added that boards needed to be qualified, and the whole thing paved the way for the Super Socket 7 spec that AMD made use of. And of course lots of people didn't put large enough heatsinks/fans on them, which they needed to be stable.

    So, people are shitting on a product that got a bad rep because they were mostly ignorant of what we have all come to accept as normal business when you're talking about differing microarchitectural implementations.

    PS: Proud owner of a 6x86 that cost me about the same as a Pentium 75, and not once do I think it actually performed worse than that, while for the most part (compiling code, and running everything else including Unreal) it was significantly better than my roommate's Pentium 75.

  • by rondrabkin on 5/19/22, 11:40 PM

    OK then: I was very heavily involved in both the item in the intro (the flaw in the first Pentium; I was the production control guy in the sole-source factory) and #1 on the list (Itanium; I was trying to get hardware companies to work with their software suppliers to port to the new architecture, using a very significant budget).

    The common thread was Intel marketing pushing something that was a dog, for marketing reasons:

    1. It is amazing, and not in a good way, when you think you have enough inventory but someone from HQ calls up the warehouse and has the older CPUs crushed by a bulldozer (you don't want to throw them out; they're quite usable).

    2. It was amazing that sucker ran so hot that tech support got a call about test boxes catching on fire.

  • by louissan on 5/19/22, 10:23 PM

    Anyone remember Pentium II and their new <del>sockets</del> cartridges?

    That didn't last long. Like what, one generation?

    Good.

    (Saying that, but I remember purchasing a dual Pentium II motherboard with two 400MHz CPUs to speed up 3D Studio 4 renderings under Windows NT 4... xD)

  • by xbar on 5/19/22, 9:03 PM

    I'm so ashamed to have owned a Cyrix, a P4, and an AMD Bulldozer.

    They were all awful.

  • by tangental on 5/19/22, 9:15 PM

    I visited this page hoping to see the PowerPC 970 at the top of the list, but all it gets is a "Dishonorable Mention". After going through three Power Mac G5s, all of which had their processors die within 4 years, I still bear a grudge.

  • by cwilkes on 5/19/22, 10:46 PM

    I wondered what happened to the head of Cyrix, Jerry Rogers. He died 2 years ago:

    https://obits.dallasnews.com/us/obituaries/dallasmorningnews...

  • by grp000 on 5/19/22, 10:16 PM

    For a bit of time, I ran an overclocked FX-8320 and CrossFire 7970s. The heat that machine put out was tremendous. I only had a wall-mounted AC unit, so I practically had to take my shirt off when I loaded it up.

  • by easytiger on 5/19/22, 9:25 PM

    My first PC had a 333MHz Cyrix CPU. Ran just fine! But I was learning C in Borland Turbo C and DJGPP, so it didn't have to do much. Running Java on it... well, that wasn't fun with 32MB of RAM.

    Worked on Itanium too. It was even more amazing that Microsoft actually had support for it.

  • by annoyingnoob on 5/19/22, 9:36 PM

    I've owned 4 or 5 of the CPUs on that list over the years. I'm sure there are worse.

  • by nesarkvechnep on 5/19/22, 8:46 PM

    Cyrix wasn't the first company to build an SoC; Acorn was.

  • by hilbert42 on 5/20/22, 1:34 AM

    "Note: Plenty of people will bring up the Pentium FDIV bug here, but the reason we didn’t include it is simple: Despite being an enormous marketing failure for Intel and a huge expense, the actual bug was tiny."

    The fact that the fault was tiny and that few people were affected is definitely NOT the point.

    The so-called Pentium 'bug' was the result of fundamentally terrible engineering on Intel's part in that the underlying design wasn't fit for purpose - it wasn't just a bug.

    It seems to me the authors of this story do not understand the implications: what Intel did was fundamentally wrong, in that its math processing was flawed by design from the outset - otherwise they would have included the Pentium in their list.

    In order to achieve increased math processing speed, Intel broke math algorithms down into part algorithm and part lookup table - instead of having the algorithm complete the whole task (which is the logical way of doing things). If the algorithm itself were wrong, then every calculation would be wrong and the problem obvious from the outset. Adding a lookup table makes calculations faster, but one then has to test every entry in the table - and Intel didn't.

    Look at the problem like this: think of a set of log or trig tables, and now think of the implications if one of those table entries is incorrect. What Intel did was deliberate cheating, and it failed to get away with it. Intel would have known this from the outset, and thus the problem was an integral design fault rather than a bug.
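
    The failure shape described above can be sketched with a toy model. This is not the actual SRT quotient-selection logic the Pentium used - just a table-seeded reciprocal routine with one mis-programmed entry, showing how a single bad table value silently poisons only the inputs that land on it while everything else tests clean:

```python
# Toy model: reciprocal of d (for 1.0 <= d < 2.0) via a lookup table
# with linear interpolation between entries. One corrupted entry gives
# wrong answers only for inputs that index into it, so ordinary spot
# testing can miss the flaw entirely.

def make_table(n=64, bad_index=None):
    # table[i] approximates 1/d at d = 1 + i/n
    table = [1.0 / (1.0 + i / n) for i in range(n + 1)]
    if bad_index is not None:
        table[bad_index] *= 0.999   # simulate one mis-programmed entry
    return table

def recip(d, table):
    # Interpolate between the two adjacent table entries.
    n = len(table) - 1
    pos = (d - 1.0) * n
    i = int(pos)
    frac = pos - i
    return table[i] * (1 - frac) + table[min(i + 1, n)] * frac

good = make_table()
bad = make_table(bad_index=20)      # corrupts d near 1 + 20/64 = 1.3125

for d in (1.10, 1.3125, 1.90):
    g, b = recip(d, good), recip(d, bad)
    print(f"d={d}: good={g:.9f} bad={b:.9f} match={g == b}")
```

    Most divisors never touch the bad entry, so the routine looks correct almost everywhere - which is why verifying every table entry, not just sampling results, is the only honest test of such a design.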

    Intel knowingly implemented a design that had flawed data integrity at its most fundamental level. What Intel did was so nasty that it's hard to think of how it could have made matters worse than if it had deliberately tried to introduce a fault.

    In my opinion, any company that would stoop to such low ethical tactics as Intel did with the Pentium's design would have demonstrated that it cannot be trusted - and I've never trusted Intel from that point onward.

    If anyone ever needs a reason for why processors should have open design architectures that are subject to third-party scrutiny then this is the quintessential example.

  • by alkaloid on 5/19/22, 9:53 PM

    As a system builder for a "custom computer shop" back in 1997/98, I came here just to make sure Cyrix was on the list.
  • by velcrovan on 5/19/22, 8:39 PM

    CTRL+F "transmeta crusoe": Not found

    ah well

  • by masklinn on 5/19/22, 8:58 PM

    The lack of Alpha seems odd, though maybe that belongs on a worst-ISA list rather than one of merely individual CPUs?