Some Scientists Question Whether Quantum Computer Really Is Quantum
gbrumfiel writes "Last week, Google and NASA announced a partnership to buy a new quantum computer from Canadian firm D-Wave Systems. But NPR news reports that many scientists are still questioning whether the new machine really is quantum. Long-time critic and computer scientist Scott Aaronson has a long post detailing the current state of affairs. At issue is whether the 512 quantum bits at the processor's core are 'entangled' together. Measuring that entanglement directly destroys it, so D-Wave has had a hard time convincing skeptics. As with all things quantum mechanical, the devil is in the details. Still, it may not matter: D-Wave's machine appears to be far faster at solving certain kinds of problems (PDF), regardless of how it works."
Read the blog post (Score:5, Interesting)
The problem is that it's not actually faster; there is a study concluding that it is, but the blog post specifically rebuts it.
About the paper claiming it's faster:
Re:Not General Purpose (Score:4, Interesting)
But doesn't this suggest that arrays of narrow-domain analog computers of this type might be constructed in such a way as to produce a *really* fast general-purpose supercomputer? For example, sorting routines are built into most software frameworks. Could we not hybridize a system wherein NP-hard problems are called from the framework, which transfers the problem to a quantum adiabatic solver and returns an answer?
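The hybrid idea above can be sketched in a few lines: a framework-level solver routes a hard combinatorial problem to an annealing backend when one is available, and falls back to a classical solver otherwise. This is a minimal sketch, not D-Wave's actual API; `backend`, `solve_max_cut`, and the brute-force fallback are all hypothetical names invented for illustration, using MAX-CUT as a stand-in hard problem.

```python
# Hedged sketch of framework-level dispatch to a (hypothetical) annealer.
# None of these names correspond to a real D-Wave or Ocean SDK API.

def classical_max_cut(edges, n):
    """Brute-force MAX-CUT: try every bipartition (only sane for tiny n)."""
    best = 0
    for mask in range(1 << n):
        # Count edges whose endpoints land on opposite sides of the cut.
        cut = sum(1 for u, v in edges if ((mask >> u) & 1) != ((mask >> v) & 1))
        best = max(best, cut)
    return best

def solve_max_cut(edges, n, backend=None):
    """Route to an annealing backend if present, else solve classically."""
    if backend is not None:
        return backend.solve(edges, n)   # hypothetical annealer call
    return classical_max_cut(edges, n)

# Triangle graph: any bipartition cuts exactly 2 of the 3 edges at best.
print(solve_max_cut([(0, 1), (1, 2), (0, 2)], 3))  # -> 2
```

The point of the pattern is that application code calls one framework entry point and never needs to know which kind of hardware answered.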
Proved the Market (Score:5, Interesting)
Whether this thing turns out to be the real McCoy (dammit Jim, I'm a quantum annealer) or not, one thing D-Wave has done is proven that there are customers who will pay $10M to be on the cutting edge of quantum computing for a few years. This should help boost investment and entrepreneurship in other companies. Eventually, one of them will revolutionize everything.
Re:D-Wave's Dirty Little Secret (Score:5, Interesting)
Is there really any difference between quantum entanglement and magic?
Yes. There's a tendency to view entanglement as spooky, magical, and hard to understand. But this really isn't the case, and it's more due to the confusing way that quantum mechanics is often taught: as a series of counterintuitive results tacked on to classical physics. If one adjusts one's perspective to think of quantum mechanics as the consequences of using a 2-norm, and then looks at the structure imposed on vectors by unitary transformations, things make a lot more sense. Scott Aaronson (mentioned in the summary above) recently published a book on just this subject, "Quantum Computing since Democritus", which is aimed at explaining these issues to people outside his field who have a comfortable background in other technical fields: essentially no more than some linear algebra, basic probability, and complex numbers. The book is highly readable, and Scott is a very funny writer, so there are a lot of amusing asides.
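The "2-norm plus unitary transformations" view is easy to check numerically: a quantum state is a unit vector in the 2-norm, and a unitary gate preserves that norm, so the squared amplitudes always stay a valid probability distribution. A minimal sketch with plain Python complex numbers (the Hadamard gate is standard; the helper function names are just made up here):

```python
# Illustration of the "QM as 2-norm" view: unitaries preserve vector length.
import math

def apply_2x2(U, v):
    """Multiply a 2x2 complex matrix by a 2-vector."""
    return [U[0][0] * v[0] + U[0][1] * v[1],
            U[1][0] * v[0] + U[1][1] * v[1]]

def norm2(v):
    """Euclidean (2-)norm of a complex vector."""
    return math.sqrt(sum(abs(a) ** 2 for a in v))

# The Hadamard gate, a real unitary matrix.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1 + 0j, 0 + 0j]         # the basis state |0>, a unit vector
state = apply_2x2(H, state)      # now an equal superposition
print(round(norm2(state), 12))   # -> 1.0: the 2-norm is preserved
```

The same invariance holds for any unitary, which is exactly the structural fact the "2-norm first" pedagogy builds on.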
Re:Read the blog post (Score:5, Interesting)
Because if it is quantum it's a generation 0 (barely out of prototype) implementation going up against a generation... oh I don't know... 30+ classical computer. If it's not quantum, if it's basically an ASIC chip designed to solve simulated annealing problems (intentionally or not), it's worthless even as research. What they are selling is a research and training system, so that engineers can learn what kinds of problems can be solved on the hardware that will, presumably, get much more powerful going forward.
Look at it this way: the current D-Wave machine has 512 qubits and a modern PC can match its speed. Double the qubits, though, and the state space a classical simulation has to cover grows by a factor of 2^512; a 15x gap is going to seem laughable when the problem is astronomically larger and the D-Wave handles it natively while your PC grinds through an exponentially growing search space. If, if, what D-Wave is selling really is using quantum effects.
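The scaling claim above is easy to sanity-check: classically simulating n qubits means tracking 2^n complex amplitudes, so going from 512 to 1024 qubits multiplies the state space by 2^512. A quick back-of-envelope (the `amplitudes` helper is just a name for this sketch):

```python
# Classical simulation cost of an n-qubit state vector: 2**n amplitudes.

def amplitudes(n_qubits):
    """Number of complex amplitudes in an n-qubit state vector."""
    return 2 ** n_qubits

growth = amplitudes(1024) // amplitudes(512)
print(growth == 2 ** 512)   # -> True: doubling qubits squares the space
print(len(str(growth)))     # -> 155: the growth factor has 155 decimal digits
```

So the doubling doesn't make the classical job "millions of times" harder; it makes it harder by a 155-digit factor, which is the whole point of the exponential argument.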
Re:It's much cooler if we *don't* know how it work (Score:5, Interesting)
Anything can be pushed to the limits of what we know, and on occasion, things work, but not for the reasons you think it did. This is sufficiently close to the cutting edge that it may be operating correctly, but that we only think we understand why.
F'rinstance, for years, we thought about electricity as a liquid. Voltage equaled pressure. Amps equaled volume. The math worked. Nature wiggled its eyebrows suggestively.
BUT, electricity is NOT a liquid. It works the way it does for completely different reasons. It just took a while for us to figure that out. Yet, even before we understood this, we built practical machinery.
Re:Entanglement isn't the only issue (Score:4, Interesting)
I would argue that we should be open-minded at first and see what they can actually do. Maybe analog computers are in fact not as outdated as some people claim. Maybe we could build some sort of "analog FPGA" and do massively useful things with it. I still remember an HP computer graphics subsystem that used analog computers!
Surely digital computers have the advantage of simple control of temperature, aging, and general error-margin issues, but it comes at a massive cost in the number of transistors needed to perform a given function. Fewer than ten transistors can perform an analog multiplication, while you need tens of thousands, if not hundreds of thousands, of transistors to perform a floating-point multiplication. Also, the analog multiplier will operate at much higher speed (easily 10x). Again, if we could control temperature and aging-related issues and have high integration and programmability (FPGA-style), maybe we could do massively useful things at very low power levels or with massive parallelism. I do NOT think that analog computers are dead forever. It might be more of a cultural thing that we currently don't use them much ("digital is always better", "digital is modern", and similar semi-truths).
If you put one seasoned computer scientist and one seasoned electrical engineer in a room, task them with doing what I described, and give them massive funding (say 3 million dollars), I am sure they could come up with something massively useful. For example, digital circuits could periodically calibrate the analog circuits to compensate for all kinds of drift and aging. Software could automate the drudgery of manual circuit synthesis, and it could model crosstalk and similar things.
Well, maybe Analog Devices already has this kind of thing...