Some Scientists Question Whether Quantum Computer Really Is Quantum
gbrumfiel writes "Last week, Google and NASA announced a partnership to buy a new quantum computer from Canadian firm D-Wave Systems. But NPR news reports that many scientists are still questioning whether the new machine really is quantum. Long-time critic and computer scientist Scott Aaronson has a long post detailing the current state of affairs. At issue is whether the 512 quantum bits at the processor's core are 'entangled' together. Measuring that entanglement directly destroys it, so D-Wave has had a hard time convincing skeptics. As with all things quantum mechanical, the devil is in the details. Still, it may not matter: D-Wave's machine appears to be far faster at solving certain kinds of problems (PDF), regardless of how it works."
If it works - it works (Score:2, Insightful)
Does it really matter so long as it does what it says on the tin? It works faster, surely that's all that matters?
Re:If it works - it works (Score:5, Informative)
You really need to RTFA. It's slower than an optimized implementation of the same thing on a classical computer (and one that costs a lot less than $10m).
Re:If it works - it works (Score:5, Informative)
Indeed, the summary is misleading.
Quoting from Aaronson's blog:
Among the many interesting comments below, see especially this one by Alex Selby, who says he’s written his own specialist solver for one class of the McGeoch and Wang benchmarks that significantly outperforms the software (and D-Wave machine) tested by McGeoch and Wang on those benchmarks—and who provides the Python code so you can try it yourself.
and
As I said above, at the time McGeoch and Wang’s paper was released to the media (though maybe not at the time it was written?), the “highly tuned implementation” of simulated annealing that they ask for had already been written and tested, and the result was that it outperformed the D-Wave machine on all instance sizes tested. In other words, their comparison to CPLEX had already been superseded by a much more informative comparison—one that gave the “opposite” result—before it ever became public. For obvious reasons, most press reports have simply ignored this fact.
In other words, if it works, it works, except that it doesn't.
Re: (Score:2)
...been superseded by a much more informative comparison—one that gave the “opposite” result—before it ever became public.
Sounds eerily like every discussion of quantum mechanics I've heard! ("opposite" result; gave it before we even knew it existed; etc.)
Re: (Score:2)
maybe they can use a quantum software robot, surely that's the answer
Re: (Score:2)
I'm planning a kickstarter for a cloud based quantum software robot with an integrated 3D printer so it can print its own spare parts.
(Actually just posting nonsense to undo a faulty moderation I did..)
Re: (Score:2)
well, as long as you license the patent for the quantum software robot from me, it's all good ;)
oh and might as well tell you about the DRM requirement
Re: (Score:2)
Careful! If you keep spouting crap like that, you'll get buried in VC offers.
Re: (Score:2)
Dear World,
Our previous $15 million machine wasn't as fast as your Dell, but this one is, honest!
Experimental machines people are writing academic papers about are one thing, but this is a commercial product, with commercial claims made about it.
Re: (Score:2)
There's a world market for about 6 computers.
Re: (Score:2)
That might be considerably overestimating the world market for $15 million machines that don't do what they say they do.
Re: (Score:3)
Here is a link to a good article.
There were 3 tests pitting the D-Wave against a generic algorithm.
It tied on 2 of the 3 tests, but won the third, beating the generic algorithm by a factor of 3,600.
However, when it went up against a specialized algorithm, the specialized code was just as fast.
http://www.economist.com/news/science-and-technology/21578027-first-real-world-contests-between-quantum-computers-and-standard-ones-faster [economist.com]
Re:If it works - it works (Score:5, Funny)
You may say that now, but wait until PETA find out about the number of cats and flasks of cyanide their prototype gets through every month...
Re: (Score:2)
Why aren't you asking why those dogs and cats come to PETA in the first place?
Re: (Score:2)
Actually, it is mentioned, though there is no "criticism" section.
The reason being that "the article as written contains far too much criticism to fit into a single section".
I.e., their efforts to maintain neutrality on the subject are bordering on partiality.
http://en.wikipedia.org/wiki/People_for_the_Ethical_Treatment_of_Animals#Euthanasia_of_shelter_animals [wikipedia.org]
Read the blog post (Score:5, Interesting)
The problem is that it's not faster, and while there's a study that concludes it is, the blog post specifically invalidates this:
About the paper claiming it's faster:
Re: (Score:2)
Sounds like it really does not matter then; who cares whether it's quantum or not when it provides less value at a higher price?
Re:Read the blog post (Score:5, Interesting)
Because if it is quantum it's a generation 0 (barely out of prototype) implementation going up against a generation... oh I don't know... 30+ classical computer. If it's not quantum, if it's basically an ASIC chip designed to solve simulated annealing problems (intentionally or not), it's worthless even as research. What they are selling is a research and training system, so that engineers can learn what kinds of problems can be solved on the hardware that will, presumably, get much more powerful going forward.
Look at it this way: the current D-Wave machine has 512 qubits and a modern PC can match its speed. Double the qubits and you end up with a simulation space 2^512 times larger; that 15x advantage is going to seem laughable when the problem you are solving is trillions of times larger and the D-Wave solves in constant time while your PC runs an algorithm that's O(n^2). If, that is, what D-Wave is selling is actually using quantum effects.
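To put rough numbers on that, here's a minimal sketch in Python (nothing D-Wave-specific, just the textbook state-space argument):

import math

# Describing the state of n qubits classically takes 2**n complex
# amplitudes, so brute-force simulation cost explodes with qubit count.
for n in (8, 16, 512, 1024):
    digits = int(n * math.log10(2)) + 1  # decimal digits in 2**n
    print(f"{n:4d} qubits -> 2^{n} amplitudes (~{digits} decimal digits)")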
Re: (Score:3)
That's the problem - nobody is sure if it really IS a quantum computer. If it were just a slow quantum computer, it would be interesting. If it's a slow classical computer, it's not. Some people would like to know which it is before they buy one and their engineers invest a lot of time "learn[ing] what kinds of problems can be solved on the hardware."
Also the "get much more powerful going forward" is in doubt. There's a lot of expert criticism that D-Wave's approach cannot be easily scaled up because th
No really, READ IT ALL (Score:3)
It's pretty obvious who is and who is not reading the article here:
The author concedes that it is possible that this may happen, but:
Re: (Score:2)
Aaronson's language is so political. He seems to have rabbit ears, more concerned with what people are saying than what he's doing. The real question is why can't we explore both D-Wave's approach and "academic QC programs" in parallel? Economics is not a good reason; economics should serve the advance of knowledge, not throttle it.
Re: No really, READ IT ALL (Score:2)
Funding has to come from somewhere, and fraud in a field tends to end future funding in that field, even if it holds promise. Case in point: organic semiconductors took a huge funding hit after the Bell Labs fiasco. Another case in point: public funding of bubble fusion research was basically banned after the Oak Ridge controversy, even though the basic principle holds promise.
Re: (Score:2)
Well, that's why we have tech, obviously, so that we can make the machine that costs $10M and makes the "BOING" noise. /s
Re: Read the blog post (Score:4, Insightful)
But what if I build a standard internal combustion engine, wrap it in a sheet of tin foil, and proclaim that I have created a portable cold fusion generator? Is that worth $10m for advancing cold fusion technology, despite the fact that it's not actually cold fusion?
The issue with D-Wave isn't that it's not as good as classical methods, it's that it probably isn't what it claims to be.
Re: (Score:2)
D-Wave is selling this as a commercial product, and as a commercial product it seems lacking. Now, if we were talking about increasing funding for R&D – that would be a different story.
Re: (Score:2)
Surely there is still value in advancing the technology, $10m seems a small price to pay considering.
I believe that's exactly the scam here. D-Wave is selling something, something which solves problems, but is much less interesting than claimed.
Paying millions for really impressive PowerPoint slides only leads to scams. Actual proof of technological advancement is worth paying for, but unproven claims of such are worthless.
Re: (Score:2)
Aaronson's selling cynicism. Wrapped in emotional language that uses caps for emphasis!
Re: (Score:2)
Does it really matter so long as it does what it says on the tin?
But if you put in the wrong figures, do you get the correct answer?
Re: (Score:2)
Does it really matter so long as it does what it says on the tin? It works faster, surely that's all that matters?
Nope. Need to know how it works. Need to be able to examine every aspect of it. Need to be able to model it and predict its behavior in different situations.
Re: (Score:2)
It only works for certain types of problems. You won't break RSA with it.
Re: (Score:2)
But what if it isn't quantum and we've built an entire computer and don't know how it really works? At that point you may as well throw up your hands and yell magic.
I'm not so sure that anyone can really explain how a computer works, from the transistor level up thru the functioning of various hardware blocks and then thru maybe 30 layers of software abstraction. What we do have is piles of people who are very good at improving one level or another, and the results to date suggest these improvements tend to be independent, i.e. an improvement at one level rarely if ever forces a degradation at another level.
But, heck, if someone creates a *reproducible* new concept
Re: (Score:3)
Maybe you can't, but when I was in university we had to wire up transistors to make gates, then wire up gates to make adders, then wire up adders to make ALUs, then wire up ALUs to make computers. Then we went and wrote assemblers and compilers to run on those. They were simple and limited, yes, but the concepts are just the same. The problem with D-Wave is that nobody is really sure how their computers work, on any level.
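For anyone who never did that exercise, the progression fits in a few lines of Python. This toy sketch (purely illustrative, nothing to do with D-Wave's hardware) builds gates from NAND, then a full adder from gates:

def nand(a, b): return 1 - (a & b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    s = xor_(xor_(a, b), carry_in)
    carry_out = or_(and_(a, b), and_(carry_in, xor_(a, b)))
    return s, carry_out

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10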
Re: (Score:2)
But what if it isn't quantum and we've built an entire computer and don't know how it really works? At that point you may as well throw up your hands and yell magic.
Sounds like being a parent.
Heisenberg Uncertainty principle is intact... (Score:5, Funny)
So they know where the D-Wave system is, but they cannot definitively measure whether it is actually a quantum computer or not...
Forget whether it is Quantum or not... (Score:2, Funny)
This crazy scientist kidnapped my cat and put it in a box! See if the D-Wave computer can tell me if my cat is alive or not.
Re: (Score:2)
This crazy scientist kidnapped my cat and put it in a box!
I have a crazy cat that regularly kidnaps physicists and puts them in boxes. The cat is very skeptical of quantum mechanics though, as the physicists always wind up dead.
Re:Heisenberg Uncertainty principle is intact... (Score:5, Funny)
It's amazing to watch the confusion among scientists as the entire structure of the machine goes into a superposition of being real and not real at the same time.
All quantum computers are real, unless declared integer.
Re: (Score:3)
The difference between classical probability and quantum superposition is that if you flip one coin many times you will get a binomial distribution, but if you send one photon at two slits you will get a wave-like multinomial distribution.
Quoting Dirac (quoted in http://en.wikipedia.org/wiki/Quantum_superposition [wikipedia.org]):
Re: (Score:2)
The word "multinomial" probably isn't correct. What I meant was: when you repeatedly send a single photon through the double-slit apparatus, you get a wave interference pattern. Flipping a coin won't produce such a probability distribution, unless it can somehow interfere with itself.
Is any quantum computer really quantum? (Score:5, Funny)
A real quantum computer both is and isn't at the same time.
Re:Is any quantum computer really quantum? (Score:5, Funny)
It is a quantum computer only when no one is looking at it.
Re:Is any quantum computer really quantum? (Score:5, Funny)
It is a quantum computer only when no one is looking at it.
That sounds like the invasion of Weeping Turings...
Re: (Score:2)
Has a D-Wave quantum-nature?
This is the most serious question of all.
If you say yes or no,
You lose your own quantum-nature.
D-Wave's Dirty Little Secret (Score:5, Funny)
Re:D-Wave's Dirty Little Secret (Score:5, Insightful)
Is there really any difference between quantum entanglement and magic?
Re: (Score:2, Funny)
Is there really any difference between quantum entanglement and magic?
You may have to ritually sacrifice the occasional intern to keep magic working, but otherwise, no.
Re:D-Wave's Dirty Little Secret (Score:5, Interesting)
Is there really any difference between quantum entanglement and magic?
Yes. There's this tendency to view entanglement as spooky, magical, and hard to understand. But this really isn't the case, and is more due to the confusing way that quantum mechanics is often taught, as a series of counterintuitive results tacked on to classical physics. If one adjusts one's perspective to think of quantum mechanics more as the consequences of using a 2-norm and looking then at the structure imposed on vectors by unitary transformations, things make a lot more sense. Scott Aaronson (mentioned in the summary above) has a recent book on just this subject, "Quantum Computing since Democritus", aimed at explaining these issues to people outside his field who have a comfortable background in other technical fields: essentially no more than some linear algebra, basic probability, and complex numbers. The book is highly readable and Scott is a very funny writer, so there are a lot of amusing asides.
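For the mathematically inclined, that view is compact enough to demo. A minimal sketch, assuming only numpy: a qubit as a unit vector in C^2, a gate as a norm-preserving (unitary) matrix:

import numpy as np

state = np.array([1, 0], dtype=complex)              # |0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # a unitary gate

superposed = hadamard @ state                        # (|0> + |1>)/sqrt(2)
print(np.linalg.norm(superposed))   # 1.0: the 2-norm is preserved
print(np.abs(superposed) ** 2)      # [0.5 0.5]: outcome probabilities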
Re: (Score:3, Funny)
>using a 2-norm and looking then at the structure imposed on vectors by unitary transformations
Like, obviously!
Re: (Score:2, Insightful)
" If one adjusts one's perspective to think of quantum mechanics more as the consequences of using a 2-norm and looking then at the structure imposed on vectors by unitary transformations, things make a lot more sense."
Which means absolutely nothing to the vast majority of people; hence spooky, magical and hard to understand.
Re: (Score:3)
I think my mathematics and quantum physics are not bad (although if they were genuinely strong, perhaps I would have gone further than ABD), and I have trouble making "a lot more sense" of "using a 2-norm". Yes, I understand that the wave functions are complex, but that is not enlightening without a lot of math.
Most "spooky" results are simply the result of not accepting that everything is a wave form, and waves cannot understand that they are not supposed to be stretched across space. Which is another way of
Re: (Score:2)
You can describe magic with math too, and it starts to make a lot more sense.
Re: (Score:2)
My guess? Free will, since particles are not completely random but can be in states such as i/2|0> + sqrt(3)*i/2|1>.
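That state checks out, for what it's worth; the squared amplitudes sum to 1:

import numpy as np

# The state from the comment above: (i/2)|0> + (sqrt(3)*i/2)|1>
amps = np.array([1j / 2, np.sqrt(3) * 1j / 2])
print(np.abs(amps) ** 2)          # [0.25 0.75]: measurement probabilities
print((np.abs(amps) ** 2).sum())  # 1.0: properly normalized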
Re: (Score:3)
Any insufficiently advanced magic is just technology.
Re: (Score:2)
"This particle collides with that one, causing the two to spin in opposite directions."
Hidden variable theory states the above; Bell's experiment was designed specifically to test it. Conclusion: Einstein was wrong; local hidden variables don't exist.
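The CHSH form of Bell's argument fits in a few lines. This is a sketch of the standard textbook calculation, not an experiment: for the singlet state, quantum mechanics predicts correlation E(a, b) = -cos(a - b), and the CHSH combination reaches 2*sqrt(2), past the local-hidden-variable bound of 2:

import numpy as np

def E(a, b):
    return -np.cos(a - b)   # QM correlation for measurement angles a, b

a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4

chsh = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(chsh), 2 * np.sqrt(2))    # ~2.828 > 2: local hidden variables fail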
Re:D-Wave's Dirty Little Secret (Score:4, Informative)
One is a bunch of mathematical equations modeling a universe.
Math is not real. The models are not real. They are virtual.
Not saying that quantum mechanics doesn't have some robust models. But it is not "real" in an empirical sense. It is also not complete.
Re: (Score:3)
Simply put any math we use only defines a simulation. You only get real when you observe things. Which QM takes extreme issue with =p
It's much cooler if we *don't* know how it works. (Score:2)
Then we not only get a useful machine, we eventually get new science in the bargain. I *like* it!
Re:It's much cooler if we *don't* know how it work (Score:5, Interesting)
Anything can be pushed to the limits of what we know, and on occasion, things work, but not for the reasons you think it did. This is sufficiently close to the cutting edge that it may be operating correctly, but that we only think we understand why.
F'rinstance, for years, we thought about electricity as a liquid. Voltage equaled pressure. Amps equaled volume. The math worked. Nature wiggled its eyebrows suggestively.
BUT, electricity is NOT a liquid. It works the way it does for completely different reasons. It just took a while for us to figure that out. Yet, even before we understood this, we built practical machinery.
Comment removed (Score:5, Informative)
Re:it is and it isnt (Score:5, Informative)
Namely, the same USC paper that reported the quantum annealing behavior of the D-Wave One, also showed no speed advantage whatsoever for quantum annealing over classical simulated annealing. In more detail, Matthias Troyer’s group spent a few months carefully studying the D-Wave problem—after which, they were able to write optimized simulated annealing code that solves the D-Wave problem on a normal, off-the-shelf classical computer, about 15 times faster than the D-Wave machine itself solves the D-Wave problem! Of course, if you wanted even more classical speedup than that, then you could simply add more processors to your classical computer, for only a tiny fraction of the ~$10 million that a D-Wave One would set you back.
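For reference, plain simulated annealing is simple enough to sketch. This toy version on a random Ising chain (the general problem family the D-Wave targets) is emphatically not Troyer's optimized solver, just an illustration of the classical technique being compared:

import math
import random

random.seed(0)
n = 64
J = [random.choice((-1.0, 1.0)) for _ in range(n - 1)]  # random couplings
spins = [random.choice((-1, 1)) for _ in range(n)]

def energy(s):
    return -sum(J[i] * s[i] * s[i + 1] for i in range(n - 1))

T = 2.0
while T > 0.01:                      # cool slowly toward the ground state
    for _ in range(10 * n):
        i = random.randrange(n)
        # Energy change from flipping spin i (only its neighbors matter).
        dE = 0.0
        if i > 0:
            dE += 2 * J[i - 1] * spins[i - 1] * spins[i]
        if i < n - 1:
            dE += 2 * J[i] * spins[i] * spins[i + 1]
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i] = -spins[i]     # accept the flip
    T *= 0.95

print(energy(spins), "vs. ground state", -(n - 1))  # this chain is frustration-free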
Re: (Score:2)
I'm not sure that this really helps clarify whether this is actually a quantum computer.
The question is whether the D-Wave is a quantum computer, not whether it is faster than a classical computer. A quantum computer that is slower than a desktop calculator is still useful for research purposes.
Computer. (Score:2)
The problem is calling it a "Computer", which it is not really.
A better analogy might be, like on some old-school computers, a math co-processor. A math co-processor wasn't a "computer" but rather a processor that offloaded certain tasks it could handle more efficiently.
I see this as a similar situation. It is really only good at solving very specific problems; outside of those, you'd be better served by a normal computer.
Re: (Score:2)
Except that, as per the article, somebody has shown a purely classical algorithm that reproduces the bimodal success distribution. The blog author in the article actually accepts that distribution as evidence (finally) that the D-Wave is actually doing something quantum, but then has to back off on it when that result comes up.
Entanglement isn't the only issue (Score:5, Insightful)
Scott's blog post and the comment thread there are really worth reading. Entanglement isn't the only issue. A major part of this is also whether the process the D-Wave machine is doing is at all faster (either in practice or asymptotically) than a classical computer. Right now, the answer to the first is clearly no: Scott describes mildly optimized systems which have been shown to outperform D-Wave at its own problem. The second question, of asymptotic performance, is a little trickier, but the answer looks like "probably not". It is also worth noting that the D-Wave machines do one very specific optimization problem, of unclear usefulness, and cannot be used at all for many of the things we think of as wanting a quantum computer for, like Shor's algorithm http://en.wikipedia.org/wiki/Shor's_algorithm [wikipedia.org] to factor integers.
In fairness to D-Wave, though, if one thinks of this as essentially a research machine, then not doing as well as conventional systems isn't that much of a mark against it, any more than very early cars being slower than horses was. However, D-Wave is trying to sell these machines commercially, and Scott expresses worry that there's going to be a serious backlash against quantum computing as a whole when the D-Wave hype bubble bursts. Apparently, D-Wave has now gotten about $100 million in funding, so at minimum this is an extremely suboptimal allocation of resources when much more promising academic quantum computing research projects are getting much less money.
Re: (Score:3)
However, D-Wave is trying to sell these machines commercially. And Scott expresses worry that there's going to be a serious backlash against quantum computing as a whole when the the D-Wave hype bubble bursts.
No one with the money to afford one of their machines is stupid enough to buy one as anything other than a research machine. Even if someone hadn't already come up with a way to match its speed with a classical computer (which has happened), a 10,000x speed increase on a very narrow problem set, at more than 100,000x the cost of just buying additional hardware (not to mention the cost of learning how to use this new and exotic machine), makes it obvious that it is not yet an economical solution for anyone.
Re:Entanglement isn't the only issue (Score:4, Interesting)
I would argue that we should be open-minded at first and see what they can actually do. Maybe analog computers are in fact not as outdated as some people claim. Maybe we could build some sort of "analog FPGA" and do massively useful things with that. I still remember an HP computer graphics subsystem that used analog computers!
Surely digital computers have the advantage of simple control of temperature, aging, and general error-margin issues, but it comes at a massive cost in the number of transistors needed to perform a given function. Fewer than ten transistors can perform an analog multiplication, while you need tens of thousands if not hundreds of thousands of transistors to perform a floating-point multiplication. Also, the analog multiplier will operate at much higher speed (easily 10x). Again, if we could control temperature- and aging-related issues and have high integration and programmability (FPGA-style), maybe we could do massively useful things at very low power levels or with massive parallelism. I do NOT think that analog computers are dead forever. It might be more of a cultural thing that we currently don't use them much ("digital is always better", "digital is modern", and similar semi-truths).
If you put one seasoned computer scientist and one seasoned electrical engineer in a room, task them to do what I described, and give them massive funding (say 3 million dollars), I am sure they could come up with something massively useful. For example, digital circuits could periodically calibrate the analog circuits to compensate for all kinds of drift and aging. Software could automate the drudgery of manual circuit synthesis, and it could model crosstalk and similar things.
Well, maybe Analog Devices already has this kind of thing.....
Mod parent up (Score:2)
Damn. All those mod points last week, and now when I for once run into an under-modded AC post, of course I have none. I would love to read a comparison between the D-Wave computer and analog computers.
Re: (Score:2)
It would presumably get its ass kicked. The D-Wave solves an annealing optimization problem. The analog computer called "my lunch" solves those problems on a far larger scale every time I put it in the microwave and the cheese melts.
Proved the Market (Score:5, Interesting)
Whether this thing turns out to be the real McCoy (dammit Jim, I'm a quantum annealer) or not, one thing D-Wave has done is proven that there are customers who will pay $10M to be on the cutting edge of quantum computing for a few years. This should help boost investment and entrepreneurship in other companies. Eventually, one of them will revolutionize everything.
I need more evidence! (Score:3)
Scott Aaronson comes off as an egotistical man-child who is just angry he's not directly involved in all this.
It's still love-love in this match.
Stupid Question of the Day!!!! (Score:3)
Now that I've RTFA and read through the commentary threads, as a dumb ignorant layperson I get why Scott Aaronson is right to call out D-Wave. I also get the counter-argument that there needs to be some sort of hype in order to sustain interest in QC. And, the damn thing's got to work eventually. What I'm wondering though is this: Are we (as a society) making an error in trying to use QC to solve problems that are particular to classical computing?
The reason I ask is that a while back on /. I was educated about the nature of Base-10 computing. Prior to this, I'd spent my entire life thinking that Base-10 WAS mathematics, and I'd had no reason to assume or even imagine that there could be any other type of mathematics than Base-10. Base-10 was the pinnacle of mathematics to me. Then I find out that Base-10 is probably the most efficient to date for our society, but that it is not the only way to count; and that Pi is only Pi because of Base-10. Which led me to look at mathematics in a whole new light. Similar with Quantum mechanics--the more I understand about Quantum Mechanics, the more I realize that I have to set aside everything I know about Newtonian physics, because trying to understand quantum physics from a newtonian perspective will always result in failure--while there is a bridge between the two, if I don't take that "bridge" into account then I'm metaphorically trying to judge apples based on my prior experience in dog shows.
Given this, is it fair to hold QC to the same standards as Classical Computing, or should we be looking at entirely new applications of computing? And, is there anyone out there who's staring into the vast unknown and saying "What happens if we do THIS with a QC?"
Re:Stupid Question of the Day!!!! (Score:4, Informative)
The reason I ask is that a while back on /. I was educated about the nature of Base-10 computing. Prior to this, I'd spent my entire life thinking that Base-10 WAS mathematics, and I'd had no reason to assume or even imagine that there could be any other type of mathematics than Base-10. Base-10 was the pinnacle of mathematics to me. Then I find out that Base-10 is probably the most efficient to date for our society, but that it is not the only way to count; and that Pi is only Pi because of Base-10.
No. Pi will be the same regardless of base. The digits of Pi will be different if you write it in a different base, but this is simply a representation, not a change in what the number is. If you do calculations involving Pi in one base and do the same thing with another base and then convert the answer from one to the other you will get the same thing.
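To make that concrete: same number, different digit strings.

import math

def fractional_digits(x, base, n):
    """First n fractional digits of x (0 <= x < 1) in the given base."""
    out = []
    for _ in range(n):
        x *= base
        d = int(x)
        out.append(d)
        x -= d
    return out

print("base 10: 3 +", fractional_digits(math.pi % 1, 10, 10))  # [1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
print("base 2:  3 +", fractional_digits(math.pi % 1, 2, 10))   # [0, 0, 1, 0, 0, 1, 0, 0, 0, 0]
# Converting either representation back yields the same value.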
Your general question is a good one. In fact, one of the major things people want to use quantum computers for is to do simulations of quantum systems, which they can do, but which are extremely inefficient (both in terms of time and memory) on a classical computer. So people are looking at problems which are practically not doable on a classical computer. At the same time though, we know that a quantum computer can be simulated on a classical computer with massive resource overhead (essentially exponential slowdown), so we know that anything you can do on a quantum computer you can do on a classical computer if one is patient enough.
Re: (Score:3)
Pi will be the same regardless of base.
Most bases. Not base 1 (unary), base 0 or base -1.
Re: (Score:2)
Those aren't bases you can write numbers in in general. That's not that Pi is different, that's that those bases can't be used to represent all real numbers. Pi would still be the same.
Unary is still very much in use. Nearly every child starts by counting fingers, and often we still count by making a mark for each thing we count.
Yes, unary doesn't have a way of representing reals, which means that pi can only be represented if someone manages to square the circle. Otherwise, you're stuck with fractions and other representations based on whole numbers. The concept of an exact pi only exists outside the unary system.
As for base 0, it most certainly can represent all real numbers. In fac
Re: (Score:2)
As for base 0, it most certainly can represent all real numbers. In fact, it cannot not do so. But there's no way to convert pi from base 0 to any other base.
I don't follow. What do you mean?
Re: (Score:2)
Regarding unary, you are still confusing the issue of whether one has a representation of the number with whether it exists. These are not the same thing.
Not if you claim it can be converted from one base to another. Then the base must have a way to represent it, and unary doesn't (unless, as said, someone manages to square the circle).
I don't follow. What do you mean?
In base 0, all numbers are equal, real or not: 0*N = 0 for every possible value of N. This makes base 0 rather useless for mathematics, of course, but it's a valid philosophical concept, a relative of atheistic solipsism.
Re: (Score:2)
Then I find out that Base-10 is probably the most efficient to date for our society, but that it is not the only way to count; and that Pi is only Pi because of Base-10.
No, pi is pi in all bases equal to or higher than two, provided you assume Euclidean geometry.
If you meant that pi is 3.14159265358979323846... only because of base 10, you're correct (again, as long as you ignore relativity and curved space).
Re: (Score:2)
"I was educated about the nature of Base-10 computing. Prior to this, I'd spent my entire life thinking that Base-10 WAS mathematics."
Really? We had to do some base-10 computing in undergrad, of the BCD nature. It was a pain in the ass and everyone was glad to get back to good old base 2. And that was base-10 coded in base-2. I haven't heard of a computer using base-10 natively since the ones in the middle of the 20th century that wore skirts.
Re: (Score:2)
I don't think you properly understand the implications of changing bases, possibly because you say you learned about it from Slashdot. Mathematics doesn't care what base you use. When I took honours calculus we didn't really use numbers at all - everything was symbolic. Number systems are just convenient ways we use to represent values that are larger (or smaller) than the number of unique symbols we care to make use of. As someone else pointed out, pi is pi no matter what base you use. If you calculat
Re: (Score:2)
If you're trying to understand, perhaps you should read the explanation without dismissing it as pedantic.
The whole universe is Quantum (Score:2)
Of course it is quantum. The whole universe, including the reactions in the D-Wave are based on the laws of quantum mechanics. Regular computer chips have to take quantum effects into account too, although they try to defeat quantum effects rather than utilize them. Nevertheless, at a basic level the transistors work because of the laws of quantum mechanics.
So how do you want to describe the D-Wave? Do you want to describe it using the laws of Quantum Mechanics? Or do you want to approximate it using a
What? (Score:3)
Then they win a Nobel (Score:2)
Re:Not General Purpose (Score:4, Interesting)
But doesn't this suggest that arrays of narrow-domain analog computers of this type might be constructed in such a way as to produce a *really* fast general-purpose supercomputer? For example, sorting routines are built into most software frameworks. Could we not hybridize a system wherein NP-hard problems are called from the framework, which transfers the work to a quantum adiabatic solver and returns an answer?
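That kind of hybrid dispatch is easy to sketch. Everything below is hypothetical: the AnnealerBackend just brute-forces a QUBO classically as a stand-in for specialized hardware (real D-Wave access goes through the vendor's own tools):

from typing import Dict, List

class AnnealerBackend:
    """Hypothetical stand-in for annealing hardware; brute-forces a QUBO."""
    def solve_qubo(self, Q: Dict[tuple, float], n: int) -> List[int]:
        best, best_e = None, float("inf")
        for mask in range(2 ** n):
            x = [(mask >> i) & 1 for i in range(n)]
            e = sum(c * x[i] * x[j] for (i, j), c in Q.items())
            if e < best_e:
                best, best_e = x, e
        return best

def solve(problem, annealer=AnnealerBackend()):
    # Route by problem shape: QUBO-style work goes to the "annealer",
    # ordinary tasks (like sorting) stay on the classical path.
    if isinstance(problem, dict):   # {(i, j): coefficient}
        n = 1 + max(max(i, j) for (i, j) in problem)
        return annealer.solve_qubo(problem, n)
    return sorted(problem)

print(solve([3, 1, 2]))                                  # classical: [1, 2, 3]
print(solve({(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}))  # annealer: [1, 0]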
Re:Not General Purpose (Score:5, Informative)
Re: (Score:2)
Exactly what does it mean for a problem to be "intermediate between P and NP"?
AFAIK P is a subset of NP. Whether it is a proper subset is the big open question (most complexity theorists believe it is).
Re: (Score:2)
if you want to solve a particular type of optimisation problem, you can run it on a D-Wave at quite some speed
Except, according to the blog post linked, the D-Wave has so far been slower than optimised algorithms running on classical computers...