from Hacker News

Last fifty years of integer linear programming: Recent practical advances (2024)

by teleforce on 6/14/25, 6:15 AM with 66 comments

  • by aquafox on 6/14/25, 6:18 PM

    Could someone maybe give a high-level explanation of why commercial ILP solvers (e.g. Gurobi) are that much better than free/open-source ones? Is it that ILP is inherently so difficult to solve (I know it's NP-hard) that the best solvers are just a large ensemble of heuristics for very specific sub-problems, and thus no general "good" strategy has made its way into the public domain?
  • by dcminter on 6/14/25, 5:44 PM

    I vaguely recall building a resource allocation tool using IBM's "ILOG" mixed integer linear programming library, and we realised that if we'd built the tool about 20 years earlier, it would still have been running on the same problems we were now solving within 5 minutes.

    As I recall it, the raw computer power had increased by a factor of around a thousand and the algorithms had improved by about the same, giving us a factor of a million improvement overall.

    Worth pondering when trying to predict the future!

    The "resources" in question were diamonds by the way...

  • by tormeh on 6/14/25, 8:25 PM

    Can anyone share how this is used in practice? Somehow I imagine implementing numerical optimization often fails due to the usual problems with data-driven approaches (trust, bad data, etc.), and ultimately someone important just decides how things are going to be done based on gut feeling.
  • by pradn on 6/15/25, 4:05 AM

    "... between 1988 and 2004, hardware got 1600 times faster, and LP solvers got 3300 times faster, allowing for a cumulative speed-up factor higher than 5 × 106, and that was already 20 years ago!"

    "The authors observed a speedup of 1000 between [the commercial MILP solvers of] 2001 and 2020 (50 due to algorithms, 20 due to faster computers)."

    I wonder if we can collect these speedup factors across computing subfields, decomposed into the contributions of algorithmic improvements and faster computers.

    In compilers, there's "Proebsting's Law": compiler advances double computing power every 18 years.
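    For a rough sense of scale, the quoted factors can be annualized and set against Proebsting's Law. The sketch below does that arithmetic in Python; the year spans are taken from the quotes above, and Proebsting's Law is read as a 2x compiler speedup every 18 years.

        # Compound annual improvement implied by the quoted speedup figures.
        def annual_rate(total_speedup, years):
            """Equivalent compound improvement per year."""
            return total_speedup ** (1.0 / years) - 1.0

        print(f"LP, 1988-2004, ~5e6 total:    {annual_rate(5e6, 16):.0%}/year")
        print(f"MILP, 2001-2020, 1000 total:  {annual_rate(1000, 19):.0%}/year")
        print(f"Proebsting's Law (2x / 18y):  {annual_rate(2, 18):.1%}/year")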

  • by djoldman on 6/14/25, 1:23 PM

    I've heard Gurobi is fairly expensive. Anyone willing to share pricing details?
  • by FabHK on 6/14/25, 5:14 PM

    I remember implementing some version of Gomory cutting planes back in the 1990s in Maple (for learning, not for production); the basic fractional cut is sketched after the quote below. Looks like the field has progressed...

    > if we needed two months of running time to solve an LP in the early 1990s, we would need less than one second today. Recently, Bixby compared the machine-independent performance of two MILP solvers, CPLEX and Gurobi, between 1990 and 2020 and reported speed-ups of almost 4×10^6.
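    A minimal sketch of the fractional cut mentioned above (not the original Maple code, just the textbook rule): given a simplex-tableau row with a fractional right-hand side, keeping only the fractional parts of the coefficients yields a valid inequality that cuts off the fractional LP solution.

        import math

        def gomory_fractional_cut(row_coeffs, rhs):
            """From a tableau row  x_B + sum_j a_j * x_j = b  (b fractional),
            return the Gomory fractional cut  sum_j frac(a_j) * x_j >= frac(b)."""
            frac = lambda v: v - math.floor(v)
            return {j: frac(a) for j, a in row_coeffs.items()}, frac(rhs)

        # Example row: x1 + 0.5*x3 + 1.5*x4 = 2.75  ->  0.5*x3 + 0.5*x4 >= 0.75
        print(gomory_fractional_cut({"x3": 0.5, "x4": 1.5}, 2.75))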

  • by beret4breakfast on 6/14/25, 6:19 PM

    It feels like there’s a significant lack of “ML/AI”-based approaches applied to these kinds of problems. I’ve seen a lot of examples of RL/GNN papers that do attempt to solve smaller problems, but it always feels like the best option is to just pay for a Gurobi license and have at it. I’ve been doing some scheduling optimisation recently (close to job shop scheduling), and while there are some examples of using RL, they just don’t seem to cut it. I’ve resorted to evolutionary algorithms to get reasonable solutions to some big problems. Maybe it’s just always more efficient to use OR-type approaches when you can formulate the problem well.
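    For the "formulate it well and hand it to a solver" route, even a toy model shows the shape of it. The sketch below is a hypothetical 3-job, single-machine sequencing problem (minimize total tardiness) written with the open-source PuLP/CBC stack rather than Gurobi; the job data is made up.

        from pulp import LpProblem, LpMinimize, LpVariable, lpSum, PULP_CBC_CMD

        proc = {"A": 4, "B": 2, "C": 3}   # processing times (hypothetical)
        due  = {"A": 5, "B": 4, "C": 9}   # due dates (hypothetical)
        jobs = list(proc)
        M = sum(proc.values())            # big-M: total horizon length

        prob = LpProblem("single_machine_tardiness", LpMinimize)
        start = LpVariable.dicts("start", jobs, lowBound=0)
        tardy = LpVariable.dicts("tardy", jobs, lowBound=0)
        pairs = [(i, j) for i in jobs for j in jobs if i < j]
        # before[(i, j)] = 1 if job i runs before job j
        before = LpVariable.dicts("before", pairs, cat="Binary")

        prob += lpSum(tardy[j] for j in jobs)                # objective: total tardiness
        for j in jobs:
            prob += tardy[j] >= start[j] + proc[j] - due[j]  # tardiness definition
        for i, j in pairs:                                   # one job on the machine at a time
            prob += start[i] + proc[i] <= start[j] + M * (1 - before[(i, j)])
            prob += start[j] + proc[j] <= start[i] + M * before[(i, j)]

        prob.solve(PULP_CBC_CMD(msg=False))
        for j in sorted(jobs, key=lambda k: start[k].value()):
            print(j, "starts at", start[j].value(), "tardiness", tardy[j].value())

    The same disjunctive big-M pattern extends to multi-machine job-shop formulations.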
  • by perching_aix on 6/14/25, 2:20 PM

    title could use [pdf] [2024]
  • by CyberDildonics on 6/14/25, 4:56 PM

    Integer linear programming doesn't sound very complex.