from Hacker News

Charging lithium-ion batteries at high currents first increases lifespan by 50%

by snazz on 9/8/24, 10:06 PM with 109 comments

  • by starky on 9/9/24, 1:23 AM

    Having worked a bit in the industry, I'm a bit skeptical about this study; I've definitely seen studies and experiments that used different initial charging conditions and that would have shown better fade performance if this were true.

    Not to mention, how much does the increased SEI change the impedance of the cell (thus reducing the subsequent charge speed) and the available capacity?

  • by mensetmanusman on 9/8/24, 10:33 PM

    Such a cool finding if it pans out in production. A process variable hiding in plain sight.
  • by rkagerer on 9/9/24, 1:03 AM

    TLDR: During a battery's initial "formation" charge, some of the lithium deactivates, forming a squishy, protective layer around the negative electrode called the solid electrolyte interphase (SEI). Today, manufacturers typically do a slow formation charge, during which about 9% of the lithium is lost to the SEI. It was thought this was needed to form a robust layer. But the researchers found that at the higher initial charge currents used in this study, about 30% becomes SEI - so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles (rough numbers in the sketch below).
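
    A rough back-of-the-envelope sketch of that capacity tradeoff, using only the 9% and 30% figures from this comment; the cell size is an arbitrary example, not a number from the paper:

        # Hypothetical 5 Ah cell: compare usable capacity after formation for the
        # two formation strategies described above. The loss fractions come from
        # the comment; the nominal capacity is an arbitrary example value.
        NOMINAL_AH = 5.0  # example cell capacity before formation, in Ah

        def usable_capacity(nominal_ah: float, sei_loss_fraction: float) -> float:
            """Capacity left after a fraction of the lithium is consumed by the SEI."""
            return nominal_ah * (1.0 - sei_loss_fraction)

        slow_formation = usable_capacity(NOMINAL_AH, 0.09)  # slow formation, ~9% to SEI
        fast_formation = usable_capacity(NOMINAL_AH, 0.30)  # fast formation, ~30% to SEI

        print(f"slow formation: {slow_formation:.2f} Ah usable")  # 4.55 Ah
        print(f"fast formation: {fast_formation:.2f} Ah usable")  # 3.50 Ah
        # The claimed upside of fast formation isn't first-cycle capacity (it is
        # clearly lower here) but slower capacity fade over subsequent cycles.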
  • by sharpshadow on 9/9/24, 11:28 AM

    I was able to revive lithium batteries that had been discharged too much and wouldn't charge by connecting them to a fully charged one for a couple of seconds.
  • by mleonhard on 9/9/24, 1:24 AM

    Since a good SEI layer on the electrode is important, couldn't they put the layer on the electrode before assembling the battery? Then they could make the layer's shape more even.
  • by dzhiurgis on 9/9/24, 12:19 AM

    What's a battery lifespan? Is it capacity degradation or random failure?

    If the discovery slows down capacity degradation, but now your EV battery is 100x more likely to spontaneously fail ($$$), it's not really an improvement. Maybe OK for consumer devices, though.

  • by jostmey on 9/9/24, 1:22 AM

    I’m confused… Is this just a prediction or has it been experimentally verified?
  • by Euphorbium on 9/9/24, 1:28 AM

    I remember a recent paper that found that charging at double the current, but with a 2 kHz square wave, basically eliminated battery degradation.
  • by keepamovin on 9/9/24, 3:43 AM

    Probably burns in the microstructure, making it more resistant to filament formation, like the way high-voltage electricity etches wood.
  • by fencepost on 9/8/24, 10:57 PM

    TL;DR: the high current causes a layer on the negative electrode to form a bit differently (and obviously faster); previously it was thought that a slower initial charge led to better formation. This is an incremental process-tweak improvement, not anything truly fundamental.
  • by westurner on 9/8/24, 10:33 PM

    But are there risks and thus costs?
  • by solarkraft on 9/9/24, 3:00 AM

    Rule of thumb: It’s a battery innovation/“breakthrough”, so the chance it’ll reach the market any time soon is slim.
  • by userbinator on 9/8/24, 11:04 PM

    "is 30 times faster"

    Faster than what?

    It turns out this is about the very first charge after assembly of the cell, not regular use.

    However, I doubt that this finding will be used much, except perhaps in applications like aerospace; it is in manufacturers' economic interests that their products have short lives.

    Edit: looks like, as usual, comments that expose the truth get buried ;-)