from Hacker News

Why OpenAI's $157B valuation misreads AI's future (Oct 2024)

by pcurve on 1/28/25, 1:17 AM with 132 comments

  • by openrisk on 1/28/25, 8:10 AM

    "Linux ultimately prevailed, not because it was better from the start, but because it allowed developers to modify the code freely, run it more securely and affordably, and build a broader ecosystem that enabled more capabilities than any closed system"

    DeepSeek followed Llama and will be followed by others in the usual mushroom fashion of open source. People really don't appreciate the magnitude of the disruptive force that is unleashed by the open-source paradigm. A year from now the landscape will be brimming with new initiatives. In a few years nobody will even remember "open"ai.

    Conventional economic theory will always misread the future of computing (and thus "AI"). The zero marginal cost and infinite replicability is not a bug, it's a feature. But so far we don't really have a good model for how to think about it and merge it with mainstream business models. Something must pay the bills eventually, but these are very different bills from those of conventional scarcity-based businesses. Ironically, in the end, the main scarcity is human ingenuity. Read the interview with the DeepSeek founder on why their models are open source.

  • by mbowcut2 on 1/28/25, 6:00 AM

    DeepSeek has demonstrated that there is no technical moat. Model training costs are plummeting, and the margins for APIs will only get slimmer. Plus, model capabilities are plateauing. Once model improvement slows down enough, it seems to me the battle will be fought in the application layer. Whoever makes the killer app will capture the market.
  • by nextworddev on 1/28/25, 4:34 AM

    Just playing devil's advocate:

    VCs (especially those who missed out on OAI) are heavily incentivized to root for OAI to fail and to commoditize the biggest COGS item (AI models).

    This guy is just talking his book.

  • by Stokley on 1/28/25, 4:13 AM

    Whether you love or hate OpenAI, the CapEx involved with this company will be viewed as historic in the future, and will change (has already changed) the paradigm of how tech startups/projects are funded.
  • by elijahbenizzy on 1/28/25, 5:03 AM

    We've just learned that it's possible to do AI on less compute (DeepSeek). If the problem is that OpenAI doesn't scale, then I'd argue that in the long run, if you believe in their ability to do research, this week's news is a very bullish sign.

    IMO the equivalent of Moore's law for AI (on both software and hardware development) is baked into the price, which doesn't make the valuation all that crazy.

  • by trhway on 1/28/25, 4:20 AM

    > But while Facebook’s costs decreased as it scaled, OpenAI’s costs are growing in lockstep with its revenue, and sometimes faster

    And here comes DeepSeek and takes the steam out of this and the cost arguments that follow it.

  • by blackeyeblitzar on 1/28/25, 5:01 AM

    People are announcing the death of foundational models too early. Don’t people realize that the big AI players will take all of the proprietary things they’ve been building up behind closed doors and simply layer onto them all the winning techniques everyone else is publishing (like what DeepSeek has used)? DeepSeek itself is taking ideas that have proven out in various other papers and stacking them up to produce their gains (which they’ve been transparent about in their papers).

    I also still don't believe their cost figures, and think they're leaving out the capital to acquire their secret GPU stash and the cost of pre-training their base model (DeepSeek-V3-base). I also suspect their training corpus, which they've only vaguely described, would reveal that the savings came from working off other foundational models' work without counting those costs in their figure.

    For now, I treat the cost claim as a calculated strategy for China to avoid looking behind in the most important race, to deter investors from continuing to fund US technology by sowing doubt about the ROI, and to take value out of the US stock market, as they did today.

  • by tempeler on 1/28/25, 5:20 AM

    Pricing a valuation is betting, or wishing. The buyer thinks it will increase; the seller thinks it's enough. No one knows what will happen. Maybe the Fed will print too much money. Does anyone really know what the future holds?
  • by stego-tech on 1/28/25, 5:15 AM

    A pretty good read that succinctly picks apart the realities of current AI businesses. Easily something I'd reference as a "primer" for someone who is more business-minded than technically minded.

    One point I’ll agree on is his final one: that the true big players haven’t even been founded yet. Right now, the AI hype seems to still revolve around the dream of replacing humans with machines and still magically making Capitalism work in the process, which is something I (and other “contrarians”) have beaten to death in other threads. That said, what these companies have managed to demonstrate is that transformer-based predictive models are a part of the future - just not AGI.

    If I were a VC, I’d be looking at startups that take the same training techniques but apply them in niche fields with higher success rates than general models. An example might be a firm that puts in the grunt work of training a foundational model in a specific realm of medicine, and then makes it easier for a hospital network to run said model locally against patient data while also continuously training and fine-tuning the underlying model. I wouldn’t want to get into the muck of SaaS in these cases, because data sovereignty is only going to become an ever-thornier issue in the coming decades, and these prediction models can leak user data like a sieve if not implemented correctly. Same goes for other narrow applications, like single-mode logistics networks or on-site hospitality interfaces. The real money will be in the ability to run foundational models against your own data in privacy and security, with inference at the edge or on-device rather than off in a hyperscaler datacenter somewhere.

    Then again, I could be totally wrong. Guess we’ll all find out together.

  • by dralley on 1/28/25, 4:16 AM

    A year ago Sam Altman was going around trying to convince people we all needed to drop 7 trillion dollars to build hundreds of fabs and nuclear power plants to fuel his AI ambitions. Only a week ago he was triumphantly announcing 500 billion dollar deals with our new President.

    The (regrettably temporary) ousting of Sam Altman looks like the right call, in hindsight. Of course some amount of showmanship is expected, but the extreme nature of this self-serving BS is just laughable.

    Six months from now we may be looking at Sam Altman the way we look at Adam Neumann.

  • by a13n on 1/28/25, 5:51 AM

    > I’d argue that the most valuable companies of the AI era don’t exist yet. They’ll be the startups that harness AI’s potential to solve specific, costly problems across our economy—from engineering and finance to healthcare, logistics, legal, marketing, sales, and more.

    I feel like the author's concluding point contradicts itself. There is a gold rush, and OpenAI is selling shovels.

  • by alephnerd on 1/28/25, 3:46 AM

    Glad to finally see Ashu Garg's writings on HN.
  • by halfcat on 1/28/25, 4:39 AM

    It's the year 2000. We have the internet, a technology that will change the world. Yahoo is the most valuable company on earth. Among the coolest things people do is go to CompUSA and pay money for a web browser, Netscape Navigator, because it supports the <blink> tag so you can make your GeoCities page even more awesome. Google is still operating out of a garage somewhere, and won't be a household name until after the bubble bursts.

    That’s where we are in the AI journey in 2025. The year 2000.

  • by tonyhart7 on 1/28/25, 4:37 AM

    And Microsoft literally spent $80 billion on top of that. Like, bro, imagine: an $80 billion company would be in the top 0.01 percent.

    And that valuation could crumble because of DeepSeek.

  • by poorcedural on 1/28/25, 5:46 AM

    Why do we not value QBASIC in the billions? Honestly, we value Van Gogh paintings in the billions today. The past cost us more; we got here because that art fought through decades of litigation. Does progress mean we forget all of that and hope on a promise of easy answers?
  • by refulgentis on 1/28/25, 5:24 AM

    For about a month now I've been paying $20-$30/day to delegate the bulk of my coding to Sonnet. The agentic loop that's trained into it is simply not matched by any other model.

    I can't admit to myself that there's any open question as to whether there is long-term value.

    I expect within 2 years, this will seem like a non-controversial idea, and it won't bring in a ton of assumptions about the speaker.

    I have invested much time and effort making sure local models are a peer to remote ones in my app, and none, including DeepSeek's local models, are remotely close to the things needed to make that flow work.

    EDIT: Reply-throttled, so answering replies here:

    - The machine is building the machine: Telosnex, a cross-platform Flutter app

    - it can do 90% of the scope, especially after I wrote precanned instructions for doing e.g. property-based testing.

    - Things it's done mostly wholesale:
      - a secure iframe environment, on all 6 platforms, to execute JS in or render React components it wrote
      - completely refactoring my llama.cpp inference to use non-deprecated APIs

    - Codebase is about 40K real lines of code. (I have to think this helps a lot; I doubt that, from scratch, it would be able to build a Flutter app that used llama.cpp.)

    - $30/day!?! Yeah, it's crazy; it's up an order of magnitude from my busiest days when I just copy-pasted back and forth. It reads as much code as it wants, and you're literally doing more work, so it adds up.

    - $20/day is a realistic average

    - Lines added per day +55%, lines deleted per day +29%, files changed per day 9 -> 21 https://x.com/jpohhhh/status/1881453489852948561
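
    For scale, those daily figures pencil out as follows. A back-of-envelope sketch; the 22 workdays/month is an assumption, not something the comment states:

    ```python
    # Rough monthly/annual spend from the daily figures quoted above
    # ($20/day realistic average, $30/day peak).
    # Assumes ~22 workdays per month (an assumption, not a measured number).
    avg_daily, peak_daily = 20, 30   # USD/day
    workdays = 22

    avg_monthly = avg_daily * workdays        # 440
    peak_monthly = peak_daily * workdays      # 660
    print(f"average: ${avg_monthly}/mo, ${avg_monthly * 12}/yr")
    print(f"peak:    ${peak_monthly}/mo, ${peak_monthly * 12}/yr")
    # average: $440/mo, $5280/yr
    # peak:    $660/mo, $7920/yr
    ```

    Either way, an order of magnitude above a typical $20/month chatbot subscription.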

  • by ripped_britches on 1/28/25, 6:21 AM

    What a great take, I have thought this for a while.
  • by Jasondells on 1/28/25, 1:35 PM

    The OpenAI vs. DeepSeek debate is fascinating... but I think people are oversimplifying both the challenges and the opportunities here.

    First, OpenAI’s valuation is a bit wild—$157B on 13.5x forward revenue? That’s Meta/Facebook-level multiples at IPO, and OpenAI’s economics don’t scale the same way. Generative AI costs grow with usage, and compute isn’t getting cheaper fast enough to balance that out. Throw in the $6B+ infrastructure spend for 2025, and yeah, there’s a lot of financial risk. But that said... their growth is still insane. $300M monthly revenue by late 2023? That’s the kind of user adoption that others dream about, even if the profits aren’t there yet.
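
    To put that multiple in concrete terms, here's a quick sketch using only the figures quoted above ($157B valuation, 13.5x forward revenue, $300M monthly revenue):

    ```python
    # What forward revenue does a 13.5x multiple on $157B imply,
    # and how does it compare to a ~$300M/month run rate?
    valuation = 157e9                 # USD, reported valuation
    forward_multiple = 13.5           # forward revenue multiple

    implied_forward_revenue = valuation / forward_multiple   # ~$11.6B
    annualized_run_rate = 300e6 * 12                         # ~$3.6B
    growth_priced_in = implied_forward_revenue / annualized_run_rate

    print(f"implied forward revenue: ${implied_forward_revenue / 1e9:.1f}B")
    print(f"growth priced in vs. current run rate: ~{growth_priced_in:.1f}x")
    # implied forward revenue: $11.6B
    # growth priced in vs. current run rate: ~3.2x
    ```

    So the price assumes revenue roughly triples from the quoted run rate, before profitability even enters the picture.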

    Now, the “no moat” argument... sure, DeepSeek showed us what’s possible on a budget, but let’s not pretend OpenAI is standing still. These open-source innovations (DeepSeek included) still build on years of foundational work by OpenAI, Google, and Meta. And while open models are narrowing the gap, it’s the ecosystem that wins long-term. Think Linux vs. proprietary Unix. OpenAI is like Microsoft here—if they play it right, they don’t need to have the best models; they need to be the default toolset for businesses and developers. (Also, let’s not forget how hard it is to maintain consistency and reliability at OpenAI’s scale—DeepSeek isn’t running 10M paying users yet.)

    That said... I get the doubts. If your competitors can offer “good enough” models for free or dirt cheap, how do you justify charging $44/month (or whatever)? The killer app for AI might not even look like ChatGPT—Cursor, for example, has been far more useful for me at work. OpenAI needs to think beyond just being a platform or consumer product and figure out how to integrate AI into industry workflows in a way that really adds value. Otherwise, someone else will take that pie.

    One thing OpenAI could do better? Focus on edge AI or lightweight models. DeepSeek already showed us that efficient, local models can challenge the hyperscaler approach. Why not explore something like “ChatGPT Lite” for mobile devices or edge environments? This could open new markets, especially in areas where high latency or data privacy is a concern.

    Finally... the open-source thing. OpenAI’s “open” branding feels increasingly ironic, and it’s creating a trust gap. What if they flipped the script and started contributing more to the open-source ecosystem? It might look counterintuitive, but being seen as a collaborator could soften some of the backlash and even boost adoption indirectly.

    OpenAI is still the frontrunner, but the path ahead isn’t clear-cut. They need to address their cost structure, competition from open models, and what comes after ChatGPT. If they don’t adapt quickly, they risk becoming Yahoo in a Google world. But if they pivot smartly—edge AI, better B2B integrations, maybe even some open-source goodwill—they still have the potential to lead this space.

  • by coldpepper on 1/28/25, 4:21 AM

    AI is still a fad.
  • by andrewmcwatters on 1/28/25, 4:41 AM

    Oh! It's "Open"AI because there's no moat! /s