from Hacker News

Solving SAT via Positive Supercompilation

by Hirrolot on 2/1/24, 5:01 PM with 38 comments

  • by inasio on 2/1/24, 5:53 PM

    There are many very simple heuristics for solving 3-SAT, some even with analytical convergence estimates. A simple random walk (Algorithm P in Knuth's Satisfiability fascicle), which iterates through the clauses and flips a variable at random whenever it encounters a violated clause, will do the trick too.
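    A minimal sketch of such a random walk (a close variant of the algorithm mentioned above: here a violated clause is picked at random rather than scanned in order; clause encoding follows the DIMACS convention, where literal `v` means variable `|v|` is true and `-v` means it is false):

    ```python
    import random

    def random_walk_sat(clauses, n_vars, max_flips=100_000, seed=0):
        """Random-walk heuristic for 3-SAT: start from a random assignment,
        and while some clause is violated, flip a random variable from a
        violated clause."""
        rng = random.Random(seed)
        # assign[i] is the truth value of variable i (index 0 unused).
        assign = [rng.choice([False, True]) for _ in range(n_vars + 1)]
        sat = lambda lit: assign[abs(lit)] == (lit > 0)
        for _ in range(max_flips):
            violated = [c for c in clauses if not any(sat(l) for l in c)]
            if not violated:
                return assign  # satisfying assignment found
            # Pick a violated clause, then flip one of its variables at random.
            lit = rng.choice(rng.choice(violated))
            assign[abs(lit)] = not assign[abs(lit)]
        return None  # gave up; the formula may still be satisfiable

    # (x1 ∨ x2 ∨ ¬x3) ∧ (¬x1 ∨ x3 ∨ x2) ∧ (¬x2 ∨ ¬x3 ∨ x1)
    clauses = [[1, 2, -3], [-1, 3, 2], [-2, -3, 1]]
    model = random_walk_sat(clauses, n_vars=3)
    ```

    For satisfiable instances this walk succeeds with high probability given enough flips, but it cannot certify unsatisfiability, which is why it is a heuristic rather than a decision procedure.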
  • by kazinator on 2/2/24, 7:31 AM

    Conversion of an NFA graph to DFA (in regex processing) seems to be a kind of supercompilation. The NFA is treated as a process, and is simulated for all possible input symbols at every step. The supercompiler makes note of what states it goes into and turns sets of them into DFA states.
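    The construction described above (classically called the subset construction) can be sketched as follows; the NFA is given as a hypothetical dict mapping `(state, symbol)` to a set of successor states, and epsilon transitions are omitted for brevity:

    ```python
    from collections import deque

    def nfa_to_dfa(nfa, start, alphabet):
        """Subset construction: simulate the NFA on every input symbol at
        every step, and turn each reachable *set* of NFA states into a
        single DFA state."""
        start_set = frozenset([start])
        dfa = {}  # frozenset of NFA states -> {symbol: frozenset}
        queue = deque([start_set])
        while queue:
            state_set = queue.popleft()
            if state_set in dfa:
                continue
            dfa[state_set] = {}
            for sym in alphabet:
                # Union of everywhere the NFA could go on this symbol.
                target = frozenset(t for s in state_set
                                     for t in nfa.get((s, sym), ()))
                dfa[state_set][sym] = target
                if target not in dfa:
                    queue.append(target)
        return dfa

    # NFA for strings over {a, b} ending in "ab"; state 0 is
    # nondeterministic on 'a' (stay put, or guess the final "ab" starts).
    nfa = {(0, 'a'): {0, 1}, (0, 'b'): {0}, (1, 'b'): {2}}
    dfa = nfa_to_dfa(nfa, start=0, alphabet='ab')
    ```

    The analogy to supercompilation is that both drive a process over all possible inputs and memoize the configurations encountered, folding repeated configurations back into a finite graph.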
  • by arcastroe on 2/1/24, 6:21 PM

    Ah, should've double checked before posting. Thank you everyone who corrected me. Leaving it up for others.

    > SAT is an NP-complete problem, meaning that it is at least as hard as any other problem in NP

    I think this statement is backwards. Instead, any other problem in NP is at least as hard as SAT

  • by zellyn on 2/1/24, 7:25 PM

    I'm almost completely clueless with both 3-SAT and supercompilation, but I'm willing to bet that if you found a neat trick that solves 3-SAT problems easily, then (a) there are classes and/or sizes of 3-SAT problems for which your solution breaks down badly (goes exponential), or (b) what you're doing reduces to a 3-SAT strategy that the people who enter 3-SAT contests[1] already knew about a long time ago, or both.

    [1] http://www.satcompetition.org/

  • by theGnuMe on 2/1/24, 7:17 PM

    Is this just compiler-optimized memoization? I mean, this is great for parallelization as well, but I thought this was standard in the functional programming paradigm. Maybe I am wrong.