by amaks on 2/7/18, 6:44 PM with 62 comments
by stcredzero on 2/7/18, 10:01 PM
Here's what really happens. If you have a string of n qubits, when you measure them, they end up randomly in one of the 2^n possible configurations. However, if you apply some operations to your string of n qubits using quantum gates, you can usefully bias their wave functions, such that certain configurations become much more likely to appear. (You can't apply too many of these operations, however, as that runs the risk of decoherence.) Hopefully, you can do this in such a way that the biased configurations are the answer to a problem you want to solve.
So then, if you have a quantum computer in such a setup, you can run it a bunch of times, and if everything goes well, after enough iterations you will be able to notice a bias toward certain configurations of the string of bits. If you can do this often enough to get statistical significance, then you can be pretty confident you've found your answers.
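The bias-then-sample loop described above can be sketched with a toy classical state-vector simulation. The Grover-style "bias" step, the 4-qubit size, and the target bitstring below are all illustrative assumptions, not a claim about any particular hardware or algorithm the commenter had in mind:

```python
import math
import random
from collections import Counter

def grover_sample(n_qubits, target, iterations, shots, rng):
    """Toy sketch: bias the amplitudes toward `target` with Grover-style
    steps, then repeatedly 'measure' and tally the observed configurations."""
    dim = 2 ** n_qubits
    # Uniform superposition: every configuration starts equally likely.
    amps = [1.0 / math.sqrt(dim)] * dim
    for _ in range(iterations):
        # Oracle: flip the sign of the target's amplitude.
        amps[target] = -amps[target]
        # Diffusion: reflect every amplitude about the mean.
        mean = sum(amps) / dim
        amps = [2.0 * mean - a for a in amps]
    probs = [a * a for a in amps]
    # "Run it a bunch of times": sample measurement outcomes.
    outcomes = rng.choices(range(dim), weights=probs, k=shots)
    return Counter(outcomes)

rng = random.Random(0)
counts = grover_sample(n_qubits=4, target=0b1010, iterations=3, shots=1000, rng=rng)
most_common, freq = counts.most_common(1)[0]
print(most_common == 0b1010, freq)  # the biased configuration dominates the tally
```

With 4 qubits and 3 iterations the target's probability is about 0.96, so the bias is obvious after a thousand shots; with no biasing operations at all, every one of the 16 configurations would appear roughly 62 times and nothing would stand out.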
https://www.youtube.com/watch?v=IrbJYsep45E
https://www.youtube.com/watch?v=wUwZZaI5u0c
EDIT: I rather like Isaac Arthur, but unfortunately, his Quantum Computing episode is an example of exactly this kind of popular misconception. I've called him out on it in the comments.
https://www.youtube.com/watch?v=wgCuKTN8sX0
EDIT: I can't find my comment anymore, and I've also discovered that I'm now banned from the public Facebook group! Hmmm.
EDIT: It seems that Isaac did correct his video, kind of. He still seems to advocate the 2^n parallelism, but then explains why that can't work around 18 minutes in.
by tagrun on 2/8/18, 12:58 AM
> that the effort required to obtain a low enough error level for any implementation of universal quantum circuits increases exponentially with the number of qubits, and thus, quantum computers are not possible.
That's what dynamical decoupling (DD) or pulse sequences are for: you can get arbitrarily high-quality quantum gates (typically, but not always, at the cost of increased gate times; removing the most significant first-order error typically increases the gate time by less than an order of magnitude, think 2x-4x) without increasing the number of physical qubits at all. People don't just rely on surface codes; anyone serious about implementing a quantum computer uses surface codes after DD to reduce the infidelity to the threshold those codes require. Which is why you don't need hundreds of physical qubits to have a single robust logical qubit.
For those who are not familiar, DD is one of the oldest tricks in the bag; it's nothing like a new cutting-edge quantum error correction code. In fact, the oldest form of DD, spin echo, precedes any real discussion of quantum computers by decades.
DD is possible essentially because quantum operations don't commute, so errors don't simply add up as classical errors do; this makes it possible to obtain a better gate by carefully combining noisy gates such that the (significant) errors cancel.
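That cancellation can be seen in the oldest DD sequence mentioned above, the spin echo: a dephasing error of unknown angle accumulated during free evolution is refocused by interleaving it with X (pi) pulses, precisely because X does not commute with the error and flips its sign. A minimal sketch with bare 2x2 complex matrices (no quantum library assumed; the sequence and angle are illustrative):

```python
import cmath

def matmul(A, B):
    """Multiply two 2x2 complex matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

X = [[0, 1], [1, 0]]  # pi pulse (bit flip)

def dephase(phi):
    """Free evolution under an unknown detuning: diag(e^{-i phi}, e^{+i phi})."""
    return [[cmath.exp(-1j * phi), 0], [0, cmath.exp(1j * phi)]]

def echo_sequence(phi):
    """Time order: dephase, X, dephase, X (matrix product applies right-to-left).
    Conjugating the error by X reverses its sign, so the two dephasing
    periods cancel exactly and the net operation is the identity."""
    U = dephase(phi)
    M = U
    M = matmul(X, M)
    M = matmul(U, M)
    M = matmul(X, M)
    return M

M = echo_sequence(phi=0.7)  # arbitrary unknown error angle
identity_err = abs(M[0][0] - 1) + abs(M[1][1] - 1) + abs(M[0][1]) + abs(M[1][0])
print(identity_err < 1e-12)  # True: the dephasing error is refocused
```

The cancellation is exact for any phi because X * diag(a, a*) * X = diag(a*, a); no such trick exists for classical bit-flip errors, which simply accumulate.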
by GuiA on 2/7/18, 9:36 PM
It’s along the same lines as, in a debate, asking each participant to sum up their opponent’s point of view in a way that they agree with. A fundamental tool for productive discourse.
Really enjoyed the interview.
by kirrent on 2/7/18, 9:41 PM
by Robotbeat on 2/7/18, 10:09 PM
It's important to remember that the motivation for quantum computing came from the realization that it's intractable to simulate quantum phenomena on classical computers. So any claim that classical computers can do just as well as quantum ones needs to butt up against this realization at some point (IMHO).
by sgillen on 2/7/18, 8:44 PM
by philipov on 2/7/18, 8:31 PM
by pankajdoharey on 2/8/18, 2:38 PM
by roywiggins on 2/7/18, 8:48 PM
by socalnate1 on 2/8/18, 12:48 AM
by reacweb on 2/8/18, 10:28 AM
by lavamantis on 2/7/18, 9:11 PM
by frgtpsswrdlame on 2/7/18, 8:26 PM
I agree that these are unusual and perhaps even strange lines of analysis.
by zitterbewegung on 2/7/18, 10:00 PM
What I mean by traditional computers catching up is that we will see innovations in traditional algorithms that either keep quantum computers extremely pricey relative to traditional computers for a very long time, or that bypass the things quantum computers are good at without violating any of our current theories.
Currently, we have an enormous effort invested in our silicon architecture. We seem to be hitting the limits of silicon, and the industry seems to be making a large bet that quantum computers will eventually allow it to proceed. But if quantum algorithms take a long time to have useful implementations, then we have a local maximum: quantum computers aren't close enough to implement without investing a massive amount of money, while silicon just keeps getting cheaper. Even when Intel finally reaches the limit of traditional computing, that doesn't mean silicon will stop getting cheaper.
Intel, IBM, Microsoft, D-Wave, Google, and other companies have small bets on quantum computer engineering strategies that may or may not pay off. There also seems to be a growing number of researchers publishing papers on the theory of quantum algorithms, and there are cross-collaborations in which academic research institutions work with the quantum engineering departments at those companies.
If we find that quantum computers can only speed up a limited set of algorithms, then they will probably end up as accelerator cards attached to traditional computers. I expect an order of magnitude of inefficiency in translating problems to, or doing work on, quantum computers, whether due to the interconnect or to however we figure out how to operate them. So we will see these systems relegated to supercomputer workloads.
On the algorithms side, funding for research into quantum algorithms will probably dry up in the way that string theory funding has.
Hopefully we will see a large breakthrough that will make quantum computers feasible. From my reading of the research, it seems like topological quantum computers have the best chance.