Supercomputing

Some Scientists Question Whether Quantum Computer Really Is Quantum

gbrumfiel writes "Last week, Google and NASA announced a partnership to buy a new quantum computer from Canadian firm D-Wave Systems. But NPR news reports that many scientists are still questioning whether the new machine really is quantum. Long-time critic and computer scientist Scott Aaronson has a long post detailing the current state of affairs. At issue is whether the 512 quantum bits at the processor's core are 'entangled' together. Measuring that entanglement directly destroys it, so D-Wave has had a hard time convincing skeptics. As with all things quantum mechanical, the devil is in the details. Still, it may not matter: D-Wave's machine appears to be far faster at solving certain kinds of problems (PDF), regardless of how it works."

  • Read the blog post (Score:5, Interesting)

    by oGMo ( 379 ) on Wednesday May 22, 2013 @09:48AM (#43793327)

    The problem is that it's not faster. There is a study that concludes it is, but the blog post specifically rebuts it (for the unfamiliar, a toy annealing sketch follows at the end of this comment):

    Namely, the same USC paper that reported the quantum annealing behavior of the D-Wave One, also showed no speed advantage whatsoever for quantum annealing over classical simulated annealing. In more detail, Matthias Troyer’s group spent a few months carefully studying the D-Wave problem—after which, they were able to write optimized simulated annealing code that solves the D-Wave problem on a normal, off-the-shelf classical computer, about 15 times faster than the D-Wave machine itself solves the D-Wave problem! Of course, if you wanted even more classical speedup than that, then you could simply add more processors to your classical computer, for only a tiny fraction of the ~$10 million that a D-Wave One would set you back.

    About the paper claiming it's faster:

    As I said above, at the time McGeoch and Wang’s paper was released to the media (though maybe not at the time it was written?), the “highly tuned implementation” of simulated annealing that they ask for had already been written and tested, and the result was that it outperformed the D-Wave machine on all instance sizes tested. In other words, their comparison to CPLEX had already been superseded by a much more informative comparison—one that gave the “opposite” result—before it ever became public. For obvious reasons, most press reports have simply ignored this fact.
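
    To make the comparison concrete, here is a minimal sketch of classical simulated annealing on a random Ising instance, the problem family D-Wave's hardware targets. The instance, cooling schedule, and parameters are illustrative assumptions; Troyer's actual optimized code is far more sophisticated.

    ```python
    import math
    import random

    def simulated_annealing(J, sweeps=2000, t_hot=5.0, t_cold=0.05):
        """Anneal a random spin configuration toward low energy.
        J: symmetric n x n coupling matrix with zero diagonal."""
        n = len(J)
        s = [random.choice((-1, 1)) for _ in range(n)]
        for sweep in range(sweeps):
            # linear cooling schedule from t_hot down to t_cold
            t = t_hot + (t_cold - t_hot) * sweep / (sweeps - 1)
            for i in range(n):
                field = sum(J[i][j] * s[j] for j in range(n))
                dE = -2 * s[i] * field  # energy change if spin i flips
                if dE < 0 or random.random() < math.exp(-dE / t):
                    s[i] = -s[i]
        return s

    def energy(J, s):
        n = len(s)
        return sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))

    # Usage: a random 32-spin instance with +/-1 couplings.
    random.seed(0)
    n = 32
    J = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            J[i][j] = J[j][i] = random.choice((-1.0, 1.0))
    s = simulated_annealing(J)
    print("final energy:", energy(J, s))
    ```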

  • by gestalt_n_pepper ( 991155 ) on Wednesday May 22, 2013 @10:11AM (#43793575)

    But doesn't this suggest that arrays of narrow-domain analog computers of this type might be constructed in such a way as to produce a *really* fast general-purpose supercomputer? For example, optimization routines are built into most software frameworks. Could we not hybridize a system wherein NP-hard problems called from the framework are handed off to a quantum adiabatic solver, which returns an answer (sketched below)?
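
    A hypothetical sketch of that kind of dispatch layer, assuming the annealer exposes a QUBO-style interface (D-Wave's hardware targets Ising/QUBO optimization; it wouldn't help with ordinary sorting). ClassicalFallback, solve_qubo, and minimize are made-up names for illustration, not any real framework or D-Wave API:

    ```python
    import itertools

    class ClassicalFallback:
        """Brute-force QUBO solver for tiny instances; a real fallback
        would use simulated annealing or tabu search."""
        def solve_qubo(self, Q):
            n = 1 + max(max(i, j) for i, j in Q)
            best_x, best_e = None, float("inf")
            for bits in itertools.product((0, 1), repeat=n):
                e = sum(c * bits[i] * bits[j] for (i, j), c in Q.items())
                if e < best_e:
                    best_x, best_e = bits, e
            return dict(enumerate(best_x))

    def minimize(Q, backend=None):
        """Route a QUBO to the attached annealer backend if any, else solve
        classically. A real backend just needs a solve_qubo(Q) method."""
        return (backend or ClassicalFallback()).solve_qubo(Q)

    # Usage: minimize x0 + x1 - 2*x0*x1, whose minima have x0 == x1.
    print(minimize({(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}))
    ```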

  • Proved the Market (Score:5, Interesting)

    by bill_mcgonigle ( 4333 ) * on Wednesday May 22, 2013 @10:21AM (#43793671) Homepage Journal

    Whether this thing turns out to be the real McCoy (dammit Jim, I'm a quantum annealer) or not, one thing D-Wave has done is prove that there are customers who will pay $10M to be on the cutting edge of quantum computing for a few years. This should help boost investment and entrepreneurship in other companies. Eventually, one of them will revolutionize everything.

  • by JoshuaZ ( 1134087 ) on Wednesday May 22, 2013 @10:22AM (#43793677) Homepage

    Is there really any difference between quantum entanglement and magic?

    Yes. There's a tendency to view entanglement as spooky, magical, and hard to understand. But this really isn't the case; it's more due to the confusing way that quantum mechanics is often taught, as a series of counterintuitive results tacked onto classical physics. If one adjusts one's perspective to think of quantum mechanics as the consequences of using a 2-norm, and then looks at the structure unitary transformations impose on those vectors, things make a lot more sense (tiny illustration below). Scott Aaronson (mentioned in the summary above) has a recent book on just this subject, "Quantum Computing since Democritus", which is aimed at explaining these issues to people outside his field who have a comfortable background in other technical fields: essentially no more than some linear algebra, basic probability, and complex numbers. The book is highly readable and Scott is a very funny writer, so there are a lot of amusing asides.
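
    A tiny numpy illustration of that framing, in the spirit of Aaronson's presentation but not taken from his book: states are unit vectors under the 2-norm, gates are unitary (norm-preserving) matrices, and entanglement falls out of ordinary linear algebra:

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate (unitary)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])                # controlled-NOT (unitary)

    zero = np.array([1.0, 0.0])                    # |0> as a unit vector
    state = np.kron(H @ zero, zero)                # apply H to the first qubit
    bell = CNOT @ state                            # (|00> + |11>)/sqrt(2): entangled

    print(bell)                                    # ~[0.707 0 0 0.707]
    print(np.linalg.norm(state), np.linalg.norm(bell))  # both 1.0: unitaries preserve the 2-norm
    ```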

  • by MozeeToby ( 1163751 ) on Wednesday May 22, 2013 @10:47AM (#43794023)

    Because if it is quantum, it's a generation-0 (barely out of prototype) implementation going up against a generation... oh I don't know... 30+ classical computer. If it's not quantum, if it's basically an ASIC designed to solve simulated annealing problems (intentionally or not), then it's worthless even as research. What they are selling is a research and training system, so that engineers can learn what kinds of problems can be solved on hardware that will, presumably, get much more powerful going forward.

    Look at it this way: the current D-Wave machine has 512 qubits and a modern PC can match its speed. Double the qubits and the state space doesn't just double, it squares (2^1024 versus 2^512 configurations); the 15x speedup is going to seem laughable when the problem you are solving is vastly larger and the D-Wave solves in constant time while your PC runs an algorithm that's O(n^2) (back-of-the-envelope sketch below). If, if, what D-Wave is selling is using quantum effects.
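
    A back-of-the-envelope version of that argument. The cost models (annealer roughly constant per solve, classical 15x faster today but scaling as O(n^2)) are the parent's assumptions, not measured data:

    ```python
    def classical_time(n):
        return (1 / 15) * (n / 512) ** 2   # normalized: 15x faster at n = 512

    def annealer_time(n):
        return 1.0                          # assumed roughly constant per solve

    for n in (512, 1024, 2048, 4096, 8192):
        c = classical_time(n)
        winner = "classical" if c < annealer_time(n) else "annealer"
        print(f"n={n:5d}  classical={c:8.2f}  annealer=1.00  -> {winner} wins")
    ```

    Under those assumptions the classical advantage evaporates around n of roughly 2000 (512 * sqrt(15)).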

  • by gestalt_n_pepper ( 991155 ) on Wednesday May 22, 2013 @10:55AM (#43794121)

    Anything can be pushed to the limits of what we know, and on occasion things work, but not for the reasons we think they do. This is sufficiently close to the cutting edge that it may be operating correctly even though we only think we understand why.

    F'rinstance, for years we thought about electricity as a liquid. Voltage equaled pressure. Amps equaled volume. The math worked. Nature wiggled its eyebrows suggestively.

    BUT, electricity is NOT a liquid. It works the way it does for completely different reasons. It just took a while for us to figure that out. Yet, even before we understood this, we built practical machinery.
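
    For what it's worth, the math that "worked" is ordinary linear proportionality: Ohm's law has the same shape as resistance to flow in a pipe. Illustrative numbers only:

    ```python
    def voltage_drop(current_amps, resistance_ohms):
        return current_amps * resistance_ohms   # electrical: V = I * R

    def pressure_drop(flow_rate, pipe_resistance):
        return flow_rate * pipe_resistance      # hydraulic: same equation

    print(voltage_drop(2.0, 5.0))   # 10.0 volts
    print(pressure_drop(2.0, 5.0))  # 10.0 -- same math, different (and wrong) mental model
    ```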

  • by Anonymous Coward on Wednesday May 22, 2013 @11:25AM (#43794397)

    I would argue that we should be open-minded at first and see what they can actually do. Maybe analog computers are in fact not as outdated as some people claim. Maybe we could build some sort of "analog FPGA" and do massively useful things with that. I still remember an HP computer graphics subsystem that used analog computers!

    Surely digital computers have the advantage of simple control of temperature, aging, and general error-margin issues, but it comes at a massive cost in the number of transistors needed to perform a given function. Fewer than ten transistors can perform an analog multiplication, while you need tens of thousands, if not hundreds of thousands, of transistors to perform a floating-point multiplication. Also, the analog multiplier will operate at much higher speed (easily 10x). Again, if we could control temperature- and aging-related issues and have high integration and programmability (FPGA-style), maybe we could do massively useful things at very low power levels or with massive parallelism. I do NOT think that analog computers are dead forever. It might be more of a cultural thing that we currently don't use them much ("digital is always better", "digital is modern", and similar semi-truths).

    If you put one seasoned computer scientist and one seasoned electrical engineer in a room, tasked them with doing what I described, and gave them massive funding (say, 3 million dollars), I am sure they could come up with something massively useful. For example, digital circuits could periodically calibrate the analog circuits to compensate for all kinds of drift and aging (see the toy sketch at the end of this comment). Software could automate the drudgery of manual circuit synthesis and model crosstalk and similar effects.

    Well, maybe Analog Devices already has this kind of thing.....
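
    As a toy model of the digital-calibrates-analog idea above (entirely hypothetical, not any real Analog Devices part): a drifting "analog" multiplier is periodically measured against a known digital reference, and the drift divided back out:

    ```python
    import random

    class DriftyAnalogMultiplier:
        """Stand-in for an analog multiplier whose gain drifts with
        temperature and aging (modeled as a random walk)."""
        def __init__(self):
            self.gain = 1.0

        def multiply(self, a, b):
            self.gain += random.gauss(0, 0.001)   # slow drift
            return self.gain * a * b

    def calibrated_multiply(mult, a, b, ref=3.0):
        # Digital side: measure the current gain on a known reference
        # input, then divide the drift back out of the analog result.
        measured_gain = mult.multiply(ref, ref) / (ref * ref)
        return mult.multiply(a, b) / measured_gain

    random.seed(1)
    m = DriftyAnalogMultiplier()
    for _ in range(5000):                 # let the gain wander for a while
        m.multiply(1.0, 1.0)
    print("raw:       ", m.multiply(6.0, 7.0))               # drifted away from 42
    print("calibrated:", calibrated_multiply(m, 6.0, 7.0))   # back near 42
    ```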