from Hacker News

AI Needs So Much Power, It's Making Yours Worse

by arbuge on 12/30/24, 3:15 AM with 11 comments

  • by mikewarot on 12/30/24, 6:21 AM

    It's behind a paywall, but the gist of it is that so many bad power supplies are being connected to the grid that the sine waves are getting distorted. It's based on this research from Whisker Labs[1].

    Better power supplies can help with the distortion, but it really shouldn't matter much except in extreme cases where the power grid is running at 99% of its limits on a hot day.

    By the way, you can learn interesting things from analysis of the power grid. Long ago I remember a Slashdot comment about using it, together with covertly gathered recordings, to estimate the effective yield of nuclear enrichment operations in the Middle East. I've been trying to find that thread, but Google isn't what it used to be.

    [1] https://www.whiskerlabs.com/analysis-of-total-harmonic-disto...
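
For reference, the linked Whisker Labs analysis is about total harmonic distortion (THD): the energy a voltage waveform carries at harmonics of the grid's fundamental frequency, relative to the fundamental itself. A minimal sketch of that calculation on a synthetic sampled waveform (the sample rate and harmonic amplitudes here are illustrative assumptions, not measured grid data or Whisker Labs' method):

    # Estimate THD of a sampled waveform from its FFT.
    import numpy as np

    FS = 10_000   # samples per second (assumed)
    F0 = 60.0     # fundamental grid frequency, Hz

    # Synthetic "distorted" voltage: fundamental plus small 3rd and 5th harmonics.
    t = np.arange(0, 1.0, 1.0 / FS)
    v = (np.sin(2 * np.pi * F0 * t)
         + 0.05 * np.sin(2 * np.pi * 3 * F0 * t)
         + 0.03 * np.sin(2 * np.pi * 5 * F0 * t))

    def thd(signal, fs, f0, n_harmonics=10):
        """THD = sqrt(sum of squared harmonic amplitudes) / fundamental amplitude."""
        spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
        freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
        amp = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
        harmonics = [amp(k * f0) for k in range(2, n_harmonics + 1)]
        return np.sqrt(sum(a * a for a in harmonics)) / amp(f0)

    print(f"THD = {thd(v, FS, F0) * 100:.1f}%")   # ~5.8% for this synthetic signal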

  • by _xerces_ on 12/30/24, 10:49 AM

    It's better than using it on crypto mining, which is a huge waste of resources for something so useless to humanity.
  • by toomuchtodo on 1/2/25, 2:11 AM

  • by avidiax on 12/30/24, 9:28 AM

    Is the problem essentially that too much of the grid load consists of AC->DC converters that draw power only at the peak of each cycle?

    Maybe there's an alternative design for AC->DC conversion that can use the full cycle?
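
That is essentially the mechanism: a conventional rectifier feeding a bulk capacitor conducts only near the voltage peaks, which makes its input current harmonic-rich, and "use the full cycle" designs do exist in the form of power-factor-corrected (PFC) front ends that shape the input current to follow the voltage. A rough numerical sketch of the peaky conduction, with all component values (mains voltage, capacitor size, load) chosen purely for illustration and with source impedance and diode drops ignored:

    import numpy as np

    FS = 120_000              # simulation sample rate, Hz
    F0 = 60.0                 # grid frequency, Hz
    V_PEAK = 170.0            # ~120 Vrms mains
    C = 470e-6                # bulk capacitor, farads (assumed)
    R_LOAD = 100.0            # DC-side load, ohms (assumed)

    dt = 1.0 / FS
    t = np.arange(0, 0.5, dt)             # 0.5 s is long enough to reach steady state
    v_in = V_PEAK * np.sin(2 * np.pi * F0 * t)

    v_cap = 0.0
    i_in = np.zeros_like(t)
    for k, v in enumerate(v_in):
        v_rect = abs(v)                   # ideal full-bridge rectifier
        if v_rect > v_cap:
            # Diodes conduct: the grid supplies the load and recharges the capacitor.
            i_in[k] = np.sign(v) * (C * (v_rect - v_cap) / dt + v_rect / R_LOAD)
            v_cap = v_rect
        else:
            # Diodes off: the capacitor alone feeds the load; grid current is zero.
            v_cap -= v_cap / (R_LOAD * C) * dt

    # Examine the last full cycle (steady state) and measure how "peaky" it is.
    n = int(FS / F0)                      # exactly 2000 samples per 60 Hz cycle
    i_cycle = i_in[-n:]
    spectrum = np.abs(np.fft.rfft(i_cycle)) / n
    fundamental = spectrum[1]             # bin 1 is 60 Hz when the window is one cycle
    harmonics = spectrum[2:40]
    print(f"conducting {np.mean(i_cycle != 0) * 100:.0f}% of the cycle")
    print(f"input-current THD = {np.sqrt(np.sum(harmonics ** 2)) / fundamental * 100:.0f}%")

With values like these the supply draws short, high-current pulses for only a fraction of each cycle; a PFC front end would instead draw a near-sinusoidal current at close to unity power factor.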

  • by rapjr9 on 12/30/24, 8:23 AM

    Require that new data centers be powered by their own renewables/batteries instead of being attached to the grid. Problem solved. I'm surprised anyone building a data center isn't doing this already.