from Hacker News

I was at AMD in the mid-late 2000s helping design CPU/APU/GPUs

by jxub on 7/5/24, 12:23 PM with 130 comments

  • by basilgohar on 7/5/24, 12:50 PM

    I love this insider view into this interesting point in computing history, especially about AMD. However, I was a little put off by the glorification of nVidia's shady practices and lock-in policies as key to their current leading position. While technically true, I dislike "ends justify the means"-style thinking.

    All this as the OP glorifies AMD's engineering and grit-based culture for driving through all those tough missteps and missed opportunities.

    To expand on that, I really do feel AMD has a great engineering culture, but they keep falling into the same traps. They do not invest strongly enough in software support or vendor relationships. Neither of these necessitates the more evil monopolistic practices of vendor lock-in and proprietary, non-free (as in libre) software. If they can navigate that without turning evil, they'd be a company for the ages.

    And I can't close without mad respect for Dr. Lisa Su and her admirable leadership, itself bookworthy. Also, quick fact: she and Jensen are cousins!

  • by AlexandrB on 7/5/24, 12:51 PM

    > SUPERIOR PRODUCTS LOSE TO SUPERIOR DISTRIBUTION LOCK-INS & GTM.

    This takeaway was a little odd to me in the context of 2008. I had been an AMD stalwart in my PCs since about 2000 (Athlon Thunderbird), but IIRC in 2008 Intel had the better processor. Better single core performance, better performance/watt, and I think AMD processors tended to have stability issues around this time. I remember I built a PC in 2009 with a Core processor for these reasons.

    Obviously this is a niche market (gaming PC) perspective. But I don't think it was so clear cut.

  • by difosfor on 7/5/24, 1:01 PM

    > I seriously wish Nvidia and AMD could merge now – a technology cross-licensing that takes advantages of each other’s fab capabilities is going to help a lot in bringing the cost of GPU cycles down much further!

    Given Nvidia's track record, I'd sooner imagine them just slacking off and overcharging more for lack of competition. I wish AMD would actually compete with them on GPUs (for graphics, not AI). Interestingly, Intel seems to be trying to work up to that now.

  • by gpderetta on 7/5/24, 12:49 PM

    > We did launch a “true” dual core, but nobody cared. By then Intel’s “fake” dual core already had AR/PR love.

    Practicality beats purity 100% of the time. This echoes "Worse is better".

  • by btouellette on 7/5/24, 1:50 PM

    Is he really trying to say that AMD had a superior product in the Core 2 Duo era and Intel was only dominating due to marketing? It's hard to take any of the rest of his opinions seriously when he starts with that take.

  • by tambourine_man on 7/5/24, 1:30 PM

    I never worked at a large company and he was right there, but there are so many outstanding things in this thread, it’s hard not to be surprised.

    Not understanding the importance of GPUs in 2006, or of being first-to-market, while confusing OpenGL with OpenCL (twice), survivorship bias (BELIEVE IN YOUR VISION)…

  • by andruby on 7/5/24, 2:32 PM

    It's unbelievable that INTC's market cap is only $133B, AMD's only $274B, and NVDA's $3,130B. That's 23x INTC and 11x AMD.
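
    A minimal sketch of that arithmetic in Python, taking the quoted market caps as given (the figures in $B come from the comment, not from independently checked market data):

      # Market caps in $B, as quoted in the comment above (mid-2024).
      caps_b = {"INTC": 133, "AMD": 274, "NVDA": 3130}

      for ticker in ("INTC", "AMD"):
          print(f"NVDA is {caps_b['NVDA'] / caps_b[ticker]:.1f}x {ticker}")

      # NVDA is 23.5x INTC
      # NVDA is 11.4x AMD
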
  • by Zambyte on 7/5/24, 2:49 PM

    > I seriously wish Nvidia and AMD could merge now – a technology cross-licensing that takes advantages of each other’s fab capabilities is going to help a lot in bringing the cost of GPU cycles down much further!

    It's interesting that they see such a monopoly as something that would bring costs down. It seems more to me like competing with AMD does much more to keep Nvidia's costs down (if they can be described as "down") than combining resources would.

  • by alberth on 7/5/24, 5:53 PM

    > I spent 6+yrs @ AMD engg in mid to late 2000s helping design the CPU/APU/GPUs that we see today.

    Is that a fair statement to make, given ~20 years have passed?

  • by lotsofpulp on 7/5/24, 12:49 PM

    > a technology cross-licensing that takes advantages of each other’s fab capabilities is going to help a lot in bringing the cost of GPU cycles down much further!

    What does this mean? I thought neither has any “fab” (manufacturing) facilities.

  • by modeless on 7/5/24, 4:31 PM

    > In fact, AMD almost bought Nvidia but

    Imagine the wealth destruction if they had merged way back then! I don't love the way mergers are regulated today but I do feel like preventing companies from growing too big through mergers is desirable.

  • by nickpeterson on 7/5/24, 12:51 PM

    People keep yelling about Nvidia stock, but that feels like a huge bubble. AI disillusionment will hit and the stock will implode. Nvidia hasn’t made any inroads on producing actual systems, just GPUs. Once Apple or Microsoft have a fast enough chip (TOPS-wise), nobody will care about Nvidia's lead except in the datacenter. Seems like a failing position to me.

  • by theandrewbailey on 7/5/24, 1:25 PM

    > We didn’t want a GPU company so much that the internal joke was AMD+ATI=DAMIT.

    I remember reading that in places like The Register, but they kept the second A, so DAAMIT.

  • by chollida1 on 7/5/24, 1:50 PM

    Minor curiosity point.... Does anyone know why engg has two g's here?

    I'm sure it means engineering, but I've never seen that abbreviation. He mentioned he's from India; is that where this comes from, or is it just an individual quirk?

  • by dooglius on 7/5/24, 1:44 PM

    Dupe of https://news.ycombinator.com/item?id=40696384; no idea why that was flagged.

  • by OliverGuy on 7/5/24, 12:48 PM

    Why is AMD green on that graph and Nvidia red.......

  • by fulafel on 7/6/24, 8:59 AM

    > We wanted to merge GPU+CPU into an APU but it took years of trust & process-building to get them to collaborate. Maybe if we had Slack, but we only had MSFT Sharepoint

    I wonder how many companies had this problem.

  • by carlsborg on 7/5/24, 1:08 PM

    Back in 2015, AMD was trading at $2.40 and Nvidia at about 50 cents (adjusting for stock splits). $1,000 invested then would be ~$70,000 and ~$256,000 respectively today.
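
    A rough Python check of those numbers; the ~$168 and ~$128 "today" prices are back-solved from the comment's own ~$70k and ~$256k outcomes, not quoted from market data:

      invested = 1_000  # USD, per the comment

      # (price_then, price_now) in split-adjusted USD; the "now" prices are
      # implied by the comment's outcomes rather than looked up independently.
      prices = {"AMD": (2.40, 168.0), "NVDA": (0.50, 128.0)}

      for name, (then, now) in prices.items():
          shares = invested / then          # shares bought in 2015
          print(f"{name}: ${shares * now:,.0f}")

      # AMD: $70,000
      # NVDA: $256,000
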
  • by sublinear on 7/5/24, 12:50 PM

    > We were always engineering-led and there was a lot of hubris...

    So, long story short: most engineers, especially ones as fanboyish as this, are wildly out of place in decision-making and can't see the forest for the trees?

    It doesn't seem that surprising.

  • by Apreche on 7/5/24, 3:34 PM

    I predicted years ago they would make a CPU and you would be able to buy an all-Nvidia PC. I think the reason that hasn't happened is the failed purchase of Arm. And looking at the market dominance of Nvidia, it seems regulators were right to block that acquisition.

  • by _zoltan_ on 7/5/24, 5:34 PM

    Since we're talking about Nvidia... :)

    Is there anybody here who has access to a GB200 NVL72 with working external NVLink switches and wants to share non-marketing impressions?

  • by paulmd on 7/5/24, 5:48 PM

    > We did launch a “true” dual core, but nobody cared. By then Intel’s “fake” dual core already had AR/PR love. We then started working on a “true” quad core, but AGAIN, Intel just slapped 2 dual cores together & called it a quad-core. How did we miss that playbook?!

    It is wild the way AMD engineers can't stop themselves from throwing stones, even with 20 years of distance, and even when their entire product strategy in 2024 rides on gluing together these cores.

    People forget that Intel saying AMD was gluing together a bunch of cores comes after years of AMD fans whining that Intel was gluing together a bunch of cores. It was always an insult thrown at Intel users that Pentium D wasn't a real chip, that Core 2 Quad wasn't a real chip (not like QuadFather, that's a real quad-core platform!). And you see that play out here: this guy is still salty that Intel was the first to glue together some chips in 2002 or whatever!

    And the first time AMD did it, they rightfully took some heat for doing it... especially since Naples was a dreadful product. Rome was a completely different league; Naples really was glued-together garbage in comparison to Rome or to a monolithic chip. You can argue that (like DLSS 1.0) maybe there was a vision or approach there that people were missing, but people were correct that Naples was a dogshit product that suffered from its glued-together nature. Even consumer Ryzen was a real mixed bag; vendors basically took one look at Naples and decided to give AMD two more years to cook. People were still so wound up in it that they sent death threats to GamersNexus over the “i7 in production, i5 in gaming” verdict, which frankly was already quite generous given the performance.

    Frankly, I find it very instructive to go back and read through some of the article titles and excerpts on SemiAccurate, because it is unthinkable now how blindly tribal things were, but this shit is how people thought 10 years ago. Pentium D is bad because it's glued together! Core 2 Quad is bad because it's glued together! And that from the actual engineers who have the perspective and the understanding to know what they're looking at and judge the merits, with 20 years of retrospect and distance! If you instead look at what the discourse of the time was like...

    https://www.semiaccurate.com/tag/nvidia/page/6/

    "NVIDIA plays games with GM204"

    "how much will a GM204 card cost you!?"

    "Why mantle API will outlive DX12 [as a private playground for API development outside the need for standardization with MS or Khronos]"

    "GP100 shows that NVIDIA is over four years behind AMD in advanced packaging"

    "NVIDIA profits are up in a fragile way".

    Like, why are AMD people like this? Inside the company and out. It’s childish. None of the other brands' engineers are out clowning on Twitter (Frank Azor? Chris Hook? etc.), and none of the other fans are sending death threats when their brand’s product isn’t good. Like, you wanna make a $10 bet over it???