from Hacker News

An Argument Against Quantum Computers

by amaks on 2/7/18, 6:44 PM with 62 comments

  • by stcredzero on 2/7/18, 10:01 PM

    A note for the savvy: A quantum computer is not a magic bit-string that mysteriously flips to the correct answer. An n-qubit quantum computer is not like 2^n phantom computers running at the same time in some quantum-superposition phantom zone. That's the popular misconception, but it's effectively ignorant techno-woo.

    Here's what really happens. If you have a string of n qubits, when you measure them, they might end up randomly in any one of the 2^n possible configurations. However, if you apply some operations to your string of n qubits using quantum gates, you can usefully bias their wave function, such that certain configurations become much more likely to appear. (You can't apply too many of these operations, however, as that runs the risk of decoherence.) Hopefully, you can do this in such a way that the biased configurations are the answer to a problem you want to solve.

    So then, if you have a quantum computer in such a setup, you can run it a bunch of times, and if everything goes well, after enough iterations you will be able to notice a bias towards certain configurations of the string of bits. If you can do this often enough to get statistical significance, then you can be pretty confident you've found your answers.
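
    To make that bias-then-sample picture concrete, here's a minimal statevector sketch of Grover's search on 3 qubits in plain NumPy (no quantum libraries; the marked index and iteration count are illustrative choices, not anything from the article):

        import numpy as np

        n = 3                     # qubits
        N = 2 ** n                # 8 possible configurations
        marked = 5                # the configuration we want the bias to favor

        state = np.full(N, 1 / np.sqrt(N))     # uniform superposition

        # Each Grover iteration nudges amplitude toward the marked
        # configuration -- the "useful bias" described above.
        for _ in range(2):                     # ~(pi/4)*sqrt(N) iterations
            state[marked] *= -1                # oracle: flip marked amplitude
            state = 2 * state.mean() - state   # diffusion: reflect about mean

        # "Run it a bunch of times": sample measurements, look for the bias.
        probs = state ** 2                     # amplitudes stay real here
        samples = np.random.choice(N, size=1000, p=probs)
        print(np.bincount(samples, minlength=N))   # marked index dominates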

    https://www.youtube.com/watch?v=IrbJYsep45E

    https://www.youtube.com/watch?v=wUwZZaI5u0c

    EDIT: I rather like Isaac Arthur, but unfortunately, his Quantum Computing episode is an example of exactly this kind of popular misconception. I've called him out on it in comments.

    https://www.youtube.com/watch?v=wgCuKTN8sX0

    EDIT: I can't find my comment anymore, and I've also discovered that I'm now banned from the public Facebook group! Hmmm.

    EDIT: It seems that Isaac did correct his video, kind of. He still seems to advocate the 2^n-parallelism picture, but then explains why that can't work around 18 minutes in.

  • by tagrun on 2/8/18, 12:58 AM

    This illustrates the disconnect of mathematicians trying to do physics. Not only is his error model unrealistic (to represent fluctuations in control, you essentially need a frequency-dependent power spectral density that decays according to a power law at high frequencies, sometimes called 1/f noise, though the power doesn't have to be 1; you also don't get that kind of strong spatial correlation in real systems, and while cross-talk among spins is a more realistic problem, it is not what he's modeling either; see the sketch after the quote below), his argument is based on something that is just wrong:

    > that the effort required to obtain a low enough error level for any implementation of universal quantum circuits increases exponentially with the number of qubits, and thus, quantum computers are not possible.
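
    As an aside on the noise-model point above: here's a minimal sketch of how one can synthesize noise with a power-law spectral density (PSD ~ 1/f^alpha) by spectral shaping. This is a generic textbook recipe for illustration, not Kalai's error model; the sample count and alpha are arbitrary.

        import numpy as np

        def power_law_noise(n_samples, alpha, rng):
            # Synthesize noise whose PSD falls off as 1/f**alpha by shaping
            # the spectrum of white Gaussian noise.
            white = rng.standard_normal(n_samples)
            spectrum = np.fft.rfft(white)
            freqs = np.fft.rfftfreq(n_samples)
            freqs[0] = freqs[1]                 # avoid division by zero at DC
            spectrum *= freqs ** (-alpha / 2)   # |spectrum|^2 ~ 1/f**alpha
            return np.fft.irfft(spectrum, n_samples)

        rng = np.random.default_rng(0)
        trace = power_law_noise(4096, alpha=1.0, rng=rng)  # classic "1/f" case
        print(trace[:4])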

    The quoted claim is exactly what dynamical decoupling (DD), i.e. pulse sequences, addresses: you can get arbitrarily high-quality quantum gates (typically, but not always, at the cost of increasing the gate times; removing the most significant first-order error typically increases the gate time by less than an order of magnitude, think 2x-4x) without increasing the number of physical qubits at all. People don't just rely on surface codes; anyone serious about implementing a quantum computer uses surface codes after DD has reduced the infidelity to the threshold they require. That is why you don't need hundreds of physical qubits to have a single robust logical qubit.

    For those who are not familiar: DD is one of the oldest tricks in the bag; it's nothing like a new, cutting-edge quantum error correction code. In fact, the oldest form of DD, spin echo, precedes any real discussion of quantum computers by decades.

    DD is possible essentially because quantum operations don't commute, so errors don't simply add up as they do with classical errors; this makes it possible to obtain a better gate by carefully combining noisy gates such that the (significant) errors cancel, as in the sketch below.
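
    For illustration, here's a toy NumPy sketch of the simplest such cancellation, the Hahn spin echo: an unknown dephasing error Rz(theta) is refocused by a pi pulse precisely because X and Z rotations don't commute. (A two-line toy model, not a simulation of any real hardware.)

        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)   # pi pulse (Pauli X)

        def Rz(theta):
            # Unwanted dephasing: rotation about Z by an unknown angle theta.
            return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

        theta = 0.37   # the unknown error angle; any value works

        # Without the echo, two free-evolution periods just add up:
        naive = Rz(theta) @ Rz(theta)        # = Rz(2*theta), error doubled

        # Hahn echo: evolve, pi pulse, evolve. Since X Rz(t) X = Rz(-t),
        # the second period undoes the first for *any* theta.
        echo = Rz(theta) @ X @ Rz(theta)

        print(np.round(naive, 3))   # still depends on theta
        print(np.round(echo, 3))    # exactly X: the dephasing cancelled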

  • by GuiA on 2/7/18, 9:36 PM

    I love that the journalist asked “What do your critics say to that?”, and that the researcher gave a meaningful, non-disparaging reply. This should be expected from everyone doing research work, but it is sadly far from the norm.

    It’s along the same lines as, in a debate, asking each participant to sum up their opponent’s point of view in a way the opponent agrees with. A fundamental tool for productive discourse.

    Really enjoyed the interview.

  • by kirrent on 2/7/18, 9:41 PM

    Scott Aaronson still has his $100,000 wager available for an actual refutation of scalable QC. The impetus for the bet actually came from arguing with Gil Kalai in the comments of a blog post.

    https://www.scottaaronson.com/blog/?p=902

  • by Robotbeat on 2/7/18, 10:09 PM

    The flip side of this argument is that if a quantum computer can never show "quantum supremacy" over a classical computer, then perhaps there are much better ways of simulating quantum phenomena with classical computers than we've yet discovered.

    It's important to remember that the motivation for quantum computing came from the realization that it's intractable to simulate quantum phenomena on classical computers. So any claim that classical computers can do just as well as quantum ones needs to butt up against this realization at some point (IMHO). The sketch below shows why the brute-force approach blows up.
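
    As a rough illustration of that intractability: a statevector of n qubits holds 2**n complex amplitudes, so the memory for naive classical simulation grows exponentially. (Assuming 16 bytes per complex128 amplitude; specialized methods can beat this naive bound for particular circuits.)

        # Back-of-the-envelope memory cost of brute-force statevector
        # simulation: 2**n complex amplitudes at 16 bytes each.
        for n in (10, 30, 50, 80):
            gib = (2 ** n) * 16 / 2 ** 30      # bytes -> GiB
            print(f"{n:3d} qubits -> {gib:.3e} GiB")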

  • by sgillen on 2/7/18, 8:44 PM

    Here is a paper Kalai published that obviously goes into a lot more depth than the interview: https://arxiv.org/abs/1605.00992

  • by philipov on 2/7/18, 8:31 PM

    It's annoying how they refer obliquely to theorems without ever citing what they are talking about. Is he referring to the Church-Turing thesis?

  • by pankajdoharey on 2/8/18, 2:38 PM

    I don't buy this argument, for one reason and one reason alone. When we started integrating silicon, we had similar problems of noise and distortion due to circuits being packed so close together. There was always scope for corruption of signals, and the problem got amplified when these circuits started working at higher frequencies. A high-end processor these days can run at 4 GHz, and data transactions at such high frequencies already create distortion, so we employed many schemes to circumvent these problems, most notably using different wire interconnects, going smaller, and using various chemical shielding. His key argument revolves around noise/distortion, which is not a good argument; if the math doesn't work, then it's a real problem. The massiveness argument is also not correct, because the same was said of computers, that we would never be able to build smaller ones, and yet we did.

  • by roywiggins on 2/7/18, 8:48 PM

    I haven't read it yet, but this looks like a fairly accessible (but still technical) explanation of the argument:

    http://www.ams.org/journals/notices/201605/rnoti-p508.pdf

  • by reacweb on 2/8/18, 10:28 AM

    "There is a Hebrew proverb that says that trouble comes in clusters." In France, we have a famous quote from Jacques Chirac "La merde, ça vole toujours en escadrille !" (Shit always flies in squadrons! ).
  • by lavamantis on 2/7/18, 9:11 PM

    Is there a distinction being made between the quantum computer being impossible, and it being impossible without a future breakthrough in engineering or physics?

  • by frgtpsswrdlame on 2/7/18, 8:26 PM

    Really interesting contrarian opinion, although I wish he had expanded a bit when he said this:

    > I agree that these are unusual and perhaps even strange lines of analysis.

  • by zitterbewegung on 2/7/18, 10:00 PM

    Gil Kalai can have his contrarian opinion on quantum computers, but there is a much simpler argument against them: that they will remain infeasible long enough for traditional computers to catch up.

    What I mean by traditional computers catching up is that we will see innovations in traditional algorithms that either keep quantum computers extremely pricey relative to traditional computers for a very long time, or that bypass the things quantum computers are good at without violating any of our current theories.

    Currently, we have an enormous effort invested in our silicon architecture. We seem to be hitting the limits of silicon, and the industry seems to be making a large bet on quantum computers to eventually allow it to proceed. But if quantum algorithms take a long time to have useful implementations, then we end up at a local maximum where quantum computers aren't close enough to implement without investing a massive amount of money, while silicon just becomes cheaper and cheaper. When Intel finally reaches the limit of traditional computing, that doesn't mean silicon will stop getting cheaper.

    Intel, IBM, Microsoft, D-Wave, Google, and other companies have small bets on quantum computer engineering strategies that may or may not pay off. There also seems to be a growing number of researchers publishing papers on the theory of quantum algorithms, and there are cross-collaborations with academia on quantum computation (these collaborations are basically how research institutions work with the quantum engineering departments at those companies).

    If we find that quantum computers can only speed up a limited set of algorithms, then they will probably end up as accelerator cards attached to traditional computers. I expect an order of magnitude of inefficiency in translating problems to and from quantum computers, whether due to the interconnect or to however we figure out how to operate them, so we will see these systems relegated to supercomputer workloads.

    On the algorithms side, funding for quantum algorithms research will probably dry up the way funding for string theory research has.

    Hopefully we will see a large breakthrough that will make quantum computers feasible. From my reading of the research, it seems like topological quantum computers have the best chance.