by popol12 on 5/11/25, 8:50 AM with 20 comments
by rainsford on 5/11/25, 12:53 PM
So as far as I can tell from some quick skimming, the paper's title is entirely clickbait. Regardless of the size of the numbers involved, this is not really "RSA-2048" because no one would construct an actual RSA-2048 key this way. And if they did, I think it would be susceptible to classical attacks like Fermat factorization, no "quantum computer" needed.
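For context, Fermat factorization exploits factors that are close together: write N = a^2 - b^2 = (a - b)(a + b) and search upward from sqrt(N). If the two primes differ only in low-order bits, they are nearly equal and this finds them almost immediately. A minimal sketch (function name and parameters are my own, not from the paper):

```python
from math import isqrt

def fermat_factor(N, max_iter=1_000_000):
    """Fermat factorization: find a, b with N = a^2 - b^2 = (a - b)(a + b).
    Fast when N's two factors are close together."""
    a = isqrt(N)
    if a * a < N:
        a += 1  # start at ceil(sqrt(N))
    for _ in range(max_iter):
        b2 = a * a - N
        b = isqrt(b2)
        if b * b == b2:          # a^2 - N is a perfect square
            return a - b, a + b
        a += 1
    return None

# Two primes differing by 30: found on the very first iteration.
print(fermat_factor(10007 * 10037))  # → (10007, 10037)
```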
To be fair, the paper does eventually admit this has no real impact on actual RSA-2048, but it does still try to characterize this as some sort of looming threat.
by jcranmer on 5/11/25, 1:12 PM
It's not. It's a factorization of the product of two 1024-bit numbers that are known to differ in only two bits (and the bit positions in which they differ may also be an input to the algorithm; not clear on that). The only relevance to RSA-2048 is that it's not technically a lie that they factored a 2048-bit integer.
by hbartab on 5/12/25, 8:15 PM
Moreover, D-Wave's quantum computers rely on quantum annealing, not Shor's algorithm. Quantum annealers are NOT gate-based machines. Only for the latter is there a theoretical exponential speedup over a classical computer. For the former, we still don't know if there is any speedup at all. And if there is, it probably is not applicable in general: getting lucky with a specific integer does not count.
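To illustrate the distinction: an annealer doesn't run Shor's algorithm at all. It attacks factoring by encoding it as an optimization problem, e.g. minimizing a cost like N mod p, and letting the annealing process search for the minimum. A toy *classical* simulated-annealing sketch of that formulation (everything here, names and parameters included, is my own illustration; no speedup of any kind is implied):

```python
import random
from math import exp

def anneal_factor(N, nbits, steps=50_000, temp=5.0, seed=1):
    """Toy classical simulated annealing over nbits-bit candidate factors p,
    minimizing the cost N mod p. This shows the optimization *formulation*
    that annealing-based factoring uses, nothing more."""
    rng = random.Random(seed)

    def cost(p):
        # Trivial or out-of-range candidates get infinite cost.
        return N % p if 2 <= p <= N // 2 else float("inf")

    p = rng.randrange(2, 1 << nbits)
    c = cost(p)
    for step in range(steps):
        if c == 0:
            return p                          # found a nontrivial factor
        q = p ^ (1 << rng.randrange(nbits))   # propose: flip one random bit
        cq = cost(q)
        t = temp * (1 - step / steps) + 1e-9  # linear cooling schedule
        if cq < c or rng.random() < exp(-(cq - c) / t):
            p, c = q, cq                      # Metropolis acceptance
    return p if c == 0 else None

print(anneal_factor(143, 4))  # finds 11 or 13
```

Even on this toy scale it's just a randomized search over a cost landscape; nothing about it resembles the gate-model circuit Shor's algorithm requires.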
by bawolff on 5/11/25, 12:59 PM
by antimatter15 on 5/11/25, 12:54 PM
The paper only applies to "special integers" where the prime factors are known to only differ by two bits.
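If that's the setup, the classical attack is straightforward: for each candidate pair of bit positions (i, j), the two-bit structure fixes q - p up to sign, and each candidate reduces to solving one quadratic. A sketch of that idea (my own construction, not the paper's algorithm):

```python
from math import isqrt

def factor_two_bit_diff(N):
    """Classically factor N = p*q when p and q differ in exactly two bits.
    If q = p XOR (2^i | 2^j) with q > p, then d = q - p is 2^j + 2^i or
    2^j - 2^i, and p*(p + d) = N gives p = (sqrt(d^2 + 4N) - d) / 2."""
    nbits = N.bit_length() // 2 + 1  # factors are ~half the size of N
    for j in range(1, nbits):
        for i in range(j):
            for d in (2**j + 2**i, 2**j - 2**i):
                disc = d * d + 4 * N
                r = isqrt(disc)
                if r * r == disc and (r - d) % 2 == 0:
                    p = (r - d) // 2
                    if p > 1 and N % p == 0:
                        return p, N // p
    return None

# 10007 and 10037 differ only in bit positions 1 and 5.
print(factor_two_bit_diff(10007 * 10037))  # → (10007, 10037)
```

That's O(n^2) integer-square-root checks for n-bit factors, and O(1) if the two bit positions are given as inputs, which is presumably what the paper's "constant-level solution space" refers to.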
by formerly_proven on 5/11/25, 12:59 PM
> When factoring this class of integers, their special properties will make the exponential-level solution space search problem in the factorization simplify to a constant-level solution space search problem, which greatly saves computational resources.
"We elected to solve an O(1) subset instead of the actual problem"
by thrance on 5/11/25, 1:16 PM
Here, they picked artificially constructed numbers designed to be easy to factor, something classical computers could do far more efficiently, mind you. But hey, maybe some guy won't read the article and will invest a few extra bucks in D-Wave based on the headline, in which case it was all worth it. It only required further degrading the credibility of this clown industry.