MIT And HP Announce Joint Quantum Computer Project 174
MetaCow writes: "CNN is running this article which describes a joint effort between MIT and HP to build a quantum computer. Nothing expected any time soon, though: 'Quantum computing research is farsighted, and it may take 10 years to develop a fully operational quantum computer ...'"
Correction needed... (Score:2, Insightful)
Alright, a little 'bit picky', but
"While the classical bit can store any number between 0 and 255 on each of its eight bytes,
Byte and bit should be reversed in this sentence fragment.
:^)
Ryan Fenton
Re:Correction needed... (Score:3, Insightful)
Actually, that still doesn't make sense. Perhaps, "While the classical byte can store any number between 0 and 255 using all of its eight bits"?
Indeed. (Score:1)
;^)
Ryan Fenton
Re:Correction needed... (Score:1)
Quantum computing uses the properties of quantum physics to perform calculations. The basic unit of computation is the qubit, or quantum bit. Unlike classical bits, the qubit is not just 0 or 1 but a superposition of both. In other words, it is both on and off at the same time.
While the classical digital byte can store any number between 0 and 255 using all of its eight bits, it can only represent one of those numbers at a time. But a qubyte can be all of the numbers between 0 and 255 simultaneously.
This allows much more information to be stored in a quantum bit than in a classical bit, and allows massively parallel processing on a quantum scale. One calculation can give the answer for all the numbers in the byte at the same time, according to an explanation posted on the MIT Web site. In other words, in each clock cycle a quantum computer could perform 256 calculations in the time a digital computer can perform just one.
CNN have the older version up... the correction should be along shortly.
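The "all 256 values at once" picture can be made concrete with a toy state-vector sketch (plain Python, no quantum library assumed): applying a Hadamard gate to each of eight qubits turns |00000000> into an equal superposition over all 256 basis states.

```python
# State of one qubit after a Hadamard gate on |0>: (|0> + |1>)/sqrt(2).
h0 = [1 / 2 ** 0.5, 1 / 2 ** 0.5]

# Tensor eight such qubits together: the register is described by
# 2^8 = 256 amplitudes, one per classical byte value, all equal.
state = [1.0]
for _ in range(8):
    state = [a * b for a in state for b in h0]

print(len(state))                              # 256 values tracked at once
print(round(sum(a * a for a in state), 6))     # probabilities sum to 1.0
# The catch: measuring the register collapses it to a single one of the
# 256 values, which is why "256 calculations per cycle" is only a loose
# metaphor.
```

Note the exponential cost of this classical simulation: each extra qubit doubles the list, which is precisely why a quantum register is interesting.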
Re:Correction needed... (Score:2)
On the other hand, the "article-typer" probably heard the words "quantum" and "qubit" and decided to write an article. His research probably consisted of noticing that his 56-kilobit modem was getting transfer rates of ~5 kilobytes per second on a Win9x-reported connection of ~40 kbps, and deducing that bits are bigger than bytes.
Following this, he decided that he was an Authority (search for False Authority on
Re:Correction needed... (Score:1)
About time (Score:1)
They may or may not succeed. (Score:3, Funny)
anyone else find it odd? (Score:1)
Mr. Clarke was 10 years off perhaps? HMMMMMM
time frame (Score:1)
Which means the NSA should be turning theirs on, right..... about...... NOW
Re:time frame (Score:1)
They actually don't have anything working right now, beyond what many public labs have. What they do have however is an essentially inexhaustible supply of money being thrown at this. They can hire the best talent and get any equipment they want.
The NSA isn't there yet, but they sure do want to be first.
What? (Score:1)
Please tell me this is just bad grammar. I mean, doesn't this sound like you need 64 bits to represent 255 on a binary computer (which we all know is false), and that somehow these "magical" quantum computers can do it in only 8 bits?...
Imagine the compression we could have by not using a whole byte to store a bit of information...
Some real info...article lacks it (Score:4, Informative)
The article fails to make any real points. It's merely a PR article for HP and MIT.
Unlike classical bits, the qubit can be not just 0 or 1 but a superposition of both, in differing proportions.
Um...wrong. The qubit can be in the 0 or 1 state, but can also be in both at the same time, with varying rotations, which is what makes it impossible for us to decode them. It is the multiple-state superposition that is interesting to us, and what does the parallel computing. We just don't know how to utilize it just yet. There have been various articles. Qubit.org [qubit.org] is a great place to start reading up on this stuff. IBM Almaden [ibm.com] has a nice article [ibm.com] that will actually tell you something useful.
Re:Some real info...article lacks it (Score:2, Informative)
I also thought the article lacked a lot of relevant info and, what's more, said some things that were just wrong. A few additional pieces of information that might be of interest:
A quantum computer is not simply faster than a classical computer
Quantum computers use a different set of fundamental logical operations, which allow them to do everything a classical computer (Turing machine) can, plus some additional operations using the superposition or entanglement of states. This means there are some problems for which fundamentally different algorithms can be written that run much faster, in polynomial time as opposed to exponential; however, there are just a few of these. There's no reason to think it will be faster for everything. For most tasks, odds are its algorithms will be just as efficient as a classical computer's. In addition, its logic operations will likely be slower, meaning it may well be slower for the vast majority of tasks. So don't expect a desktop QC in 20 years; it would probably be used only for specialized computational tasks for the foreseeable future.
A quantum computer is not right around the corner
Though the article gives little detail, the computer they're talking about making in 10 years will probably not be among the most powerful in the world when it is turned on. Odds are it will be more like the size of a house with the power of a pocket calculator, old-school mainframe style. This is due to the difficulties they mention, though the article doesn't really make clear that the difficulties they face aren't just practical. Even theoretical calculations suggest that it's not feasible with currently theorized techniques to make a quantum computer of more than about 100 qubits, due to decoherence and computer-control factors. Quoth one paper, "...the absolute limit on what any practical NMR computer can handle remains well below 100 bits" (Cory et al., Proc. Natl. Acad. Sci. USA 94, 1997). I've read similar estimates for all forms of quantum computation currently in the works. It seems it takes a computer with on the order of 500 to 1000 qubits before it actually becomes more useful than a classical computer. This should not be hard to imagine: what do you think you could do with a computer that could only hold 100 bits of memory at a time? The point is that in ten years, we won't be seeing any supercomputer.
Quantum computers are not the next step in miniaturization
The article says that we're reaching the limit of classical computation and suggests that quantum computers are the next step. It is true that Feynman suggested in his original paper on the topic in 1983 that when computer technology reached the atomic scale, quantum mechanics would have to become part of computation, but quantum computers as they are talked about today have little to do with miniaturization. In many of the schemes, such as atom-trap quantum computers, the memory and computation elements (when control devices are included) are much larger than those for classical computers. This can also be said for RF-SQUIDs and quantum dots. In short, the promise of quantum computers has nothing to do with miniaturization; it has to do with the fact that they can perform fundamentally different algorithms.
All that being said, I think it is good that they're doing this research, and I do think that quantum computers will likely someday be useful, both as computers and as tools for researching quantum mechanics further. I am not saying I don't think they'll work, simply that the article seems to paint a picture which is overly optimistic. However, with the great promise they hold, I don't think we can afford not to do research like this.
Ah Richard P Feynman... (Score:2)
His biography is quite the read. You can get it here [amazon.com].
Actually. (Score:1)
You must enjoy your Mac.
only $25 million in funds? (Score:2)
With the results that a Quantum computer could generate I cannot believe that there isn't a larger sum designated for this project...
Re:only $25 million in funds? (Score:1, Funny)
year 8: go see what CMU has done with Quantum computing
year 9: reimplement & call press conferences to upstage CMU, cuz we're MIT dammit
This is progress? (Score:2, Funny)
This is something of a drop from conventional computer performance, in which the answer is merely often wrong.
Somehow I think this article 'dumbed things down' a bit too much...
186,282 mi/s...not just a good idea, its the law!
Re:This is progress? (Score:2, Funny)
Re:This is progress? (Score:2)
Max Headroom was brilliant, and totally did not see the advent of the Internet. They envisioned a world where TV was mandatory, and everything was run by TV conglomerates.
They missed the internet entirely. But aside from that, they were not all that bad.
- - -
Radio Free Nation [radiofreenation.com]
is a news site based on Slash Code
"If You have a Story, We have a Soap Box"
- - -
Quick, somebody call the FBI (Score:1)
Re:uhm... what? (Score:2)
Of course when you look to see what color it is, the act of looking changes the color.
Re:uhm... what? (Score:2)
Yeah right. While no one is looking, right?
Of course when you look to see what color it is, the act of looking changes the color.
The amazing thing about quantum computing is that it only works when nobody is looking. As soon as you take a look, all the in-between values disappear into thin air. It's like saying you can jump as high as the Empire State Building when nobody is looking. Whatever happened to physics being an experimental science? It's looking more and more like chicken feather voodoo physics to me. But to each his own, I suppose.
Re:uhm... what? (Score:2)
Quantum computing does have one benefit, though. At least when you look you find out whether the cat is dead or alive. :-)
Why HP? (Score:2)
As it stands, without some serious changes in senior management and a total overhaul of their product line, it is unlikely HP (as we know it) will see 2010....they're on the 3COM/SGI track right now.
Re:Why HP? (Score:1)
cracking on a whole new level. (Score:1)
Re:cracking on a whole new level. (Score:1)
Great but.... (Score:1)
What sort of frame rates will I be getting in Quake3?
Fnord!
Re:Great but.... (Score:1)
All of them, but of course you won't be able to see them.
Microsoft on reading data from quantum computers (Score:2)
--CTH
The Quantum Computing Swindle (Score:2, Interesting)
This isn't flamebait - it's definitely a subject worthy of discussion. I, for one, have great reservations about whether this is a viable technology. This is especially important since so much money and attention is being poured into research, perhaps often without a real understanding of the basic principles. I happen to know people in Gershenfeld's lab, and know full well their tendencies to let the hype get out of hand.
Perhaps HP is spending the money as a marketing/PR effort rather than intending to get real work done. That would explain the press release.
So here it is; I hope nightlight3 will chime in.
- - - - -
"If one existed, a quantum computer would be extremely powerful; building one, however, is extremely challenging,"
Extremely challenging, like in "it can't work and it won't ever work, but I hope the government and the industry sponsors won't find that out, at least until I retire, preferably after I am dead."
The whole field of Quantum Computing is a mathematical abstraction (fine, as any pure math is, as long as you don't try to claim that's how the real world works). Its vital connection with the real world is based on a highly dubious (even outright absurd, according to some physicists, including Einstein) conjecture about entangled quantum states (roughly, a special kind of "mystical" non-local correlation among events) which has never actually been confirmed experimentally. And without that quantum entanglement the whole field is an exercise in pure abstract math with no bearing on reality.
While there have been a number of claims of an "almost" confirmation of this kind of quantum correlation (the so-called Bell inequality tests), there is always a disclaimer (explicit or, in recent years, between the lines as the swindle got harder to sell), such as "provided the combined setup and detection efficiency in this situation can be made above 82%" (even though it is typically well below 1% overall in the actual experiment; the most famous of its kind, the Aspect experiment from the early 1980s, had only 0.2% combined efficiency, while 82% is needed for an actual, "loophole free" proof), or provided we assume that the undetected events follow such-and-such statistics, etc. The alternative explanations of those experiments (requiring no belief in mystical instant action-at-a-distance), which naturally violate those wishful assumptions, are ignored by the "mystical" faction, or ridiculed as unimportant loopholes when it is forced to debate the opposition. After all, without believing their conjecture, all the magic of quantum computing, quantum cryptography, and quantum teleportation, along with the funding, would vanish.
For those interested in the other side of these kinds of claims, why it doesn't work and why it will never work, check the site of Trevor Marshall, a reputable British physicist who, along with a small group of allies, has been fighting the "quantum magic" school for years:
Quantum Mechanics is not a Science [demon.co.uk]
Unfortunately, the vast bulk of the research funding in this area goes to the mystical faction. As long as there are fools with money, there will always be swindlers who will part the two.
For a more popular account, accessible to non-physicists, of the opposing view, you can check a site by a practical statistician (and general sceptic) Caroline Thompson:
Caroline Thompson's Physics [aber.ac.uk]
Quantum Computing Is One ofthe Biggest Hoaxes Ever (Score:1, Troll)
This is so true. David Deutsch is a half-crazed crackpot and con artist who managed to convince a bunch of gullible people that his chicken feather voodoo physics is real science. I never thought I'd live to see the day when science is turned into in-your-face superstition by a bunch of swindlers. Do physicists think that they are beyond public scrutiny? Do they really think they can throw any crap at the public and that the public is forced to swallow it? I think they should be careful, because the public is not as stupid as they want us to believe. One day, we'll wake up from our stupor and wipe that smug superiority smile off their faces. After all, we pay their salaries, and we reserve the ultimate right to decide what is good science and what is not.
It is up to us, it is up to the citizens of a free society to either accept the chauvinism of science without contradiction or to overcome it by the counterforce of public action. Paul Feyerabend
How the hell did you hit the +1 karma bonus? (Score:1)
I hope you have a great sense of humor, or else have just neglected your physics education, because currently I have little faith in your powers of logic.
OTOH, I suppose you're a necessary part of the scientific ecosystem. We always need people criticizing the accepted standards, pushing us to put up (give some physical proof) or shut up. I would prefer, though, if you would adhere to the same standards of intellectual honesty that your typical scientist does. For instance, if you could come up w/ some alternate theories w/ solid predictions that could be verified or disproven, I might listen. Otherwise, you're little more than a crackpot claiming that stuff doesn't work because it doesn't make sense to your math-illiterate brain. Well, tough. The Universe is not mandated to 'make sense' to Joe Public, especially if he refuses to try. The power of populism only goes so far, you know.
At least the original post linked [demon.co.uk] to someone who is willing to make predictions that can be proven or disproven. That man is a scientist, IMHO, whether I agree w/ him or not. Your link is to a crackpot, OTOH.
Insufferable Pomposity (Score:1, Troll)
Re:Insufferable Pomposity (Score:1)
First of all, your logical argument against travel in spacetime makes no sense to me. You find it self-referential because, to find the velocity along the time axis, the equation would be v = dt/dt.
If a time dimension does exist, it would not be a spatial dimension, and therefore 'velocity' within it wouldn't make any sense in the first place. Velocity is a phenomenon that is found in our (3D) world. However, there is a connection between relative velocity and time itself. The faster you travel, the slower time appears to be. This has been proven with atomic clocks on space flights. 'Time' itself did slow down, even if only by fractions of a second.
I don't understand your point. Another thing: you claim they are mathematical abstractions and have no counterparts in nature. Yes, they are abstractions, but so is any language. The human mind takes experiences and puts them in nice little symbolic packages so they can be transferred between human beings (language). This is math, and even English. Just because they are abstractions does not make them false.
You seem to me to be very anti-science. Also I think you take Feyerabend out of context. I deeply respect Feyerabend but he is not against science. His point was to make science more humanistic. Using any one methodology would slow down the progress of science because science is a human art. An interpretation of reality that becomes a world view. I would have to agree with David Bohm that insight comes before the mathematical equations. The math(or language, paintings, etc) just describes this insight.
Think of scientists more like painters who use math as their paintbrush to interpret the world. Science is a very fulfilling and beautiful thing; do not resent it.
Again, what is your point, my friend? Science sometimes gets bogged down in its own abstractions and becomes a little crazy, but what human endeavor does not?
Re:Insufferable Pomposity (Score:2)
If a time dimension does exist, it would not be a spatial dimension, and therefore 'velocity' within it wouldn't make any sense in the first place.
You are agreeing with me. It is precisely for this reason that there is no time travel. IOW, there is no motion in spacetime. Therefore there is no spacetime and no time dimension either.
Velocity is a phenomenon that is found in our (3D) world. However, there is a connection between relative velocity and time itself. The faster you travel, the slower time appears to be.
An observer will notice a moving clock to run slower. Time itself cannot change because changing time is self-referential.
This has been proven with atomic clocks on space flights. 'Time' itself did slow down, even if only by fractions of a second.
Not entirely correct. Time dilation is a misnomer. Time can neither slow down nor speed up. Only processes (clocks) slow down. Clocks are used to measure invariant temporal intervals. Intervals measured with a moving clock or a clock under the influence of gravity will seem longer than with a non-moving or inertial clock. "Time dilation" is not time travel. If a slowed clock actually traveled in time, it would simply disappear from view. I suggest you take a look at the book "Relativity from A to B" by Robert Geroch. Here's an excerpt:
There is no dynamics within space-time itself: nothing ever moves therein; nothing happens; nothing changes. [...] In particular, one does not think of particles as "moving through" space-time, or as "following along" their world-lines. Rather, particles are just "in" space-time, once and for all, and the world-line represents, all at once the complete life history of the particle.
Also check out the work of relativists like Bianchi, Pavsic and Horwitz, especially the latter's invariant tau formalism. I am not making any of this stuff up.
Another thing: you claim they are mathematical abstractions and have no counterparts in nature.
I never made any such claim. I said that the spacetime model has no counterpart in nature. If it did, there would be no motion. Please do not use this strawman against me. I'd appreciate a little bit of honesty from my critics.
You seem to me to be very anti-science.
No I am not. I love science. I am against the "chauvinism of science", as Feyerabend puts it in his "Against Method." I am especially against quackery that passes itself as science. I am against any group of people who would set itself up as some kind of privileged priesthood over the public.
Again, what is your point, my friend? Science sometimes gets bogged down in its own abstractions and becomes a little crazy, but what human endeavor does not?
We, the public, pay good money for scientific research. A lot of money. We deserve to get good science for our money, not snake oil. Quackery from famous charlatans is orders of magnitude more detrimental to humanity than crackpottery from your run-of-the-mill crackpot. It is like a monkey wrench in the works. It can set us back for decades if not centuries.
Re:Quantum Computing Is One of the Biggest Hoaxes (Score:1, Troll)
It can't create an infinite number of universes either. Nor can it allow time travel (a la Stephen Hawking and Kip Thorne) or have a particle's state be two mutually exclusive values concurrently. Especially when nobody is looking.
Why the bitterness towards advanced Math/Science? (Score:1)
Why are so many people so bitter about, and ignorant of, the pure sciences such as Mathematics and Physics? If there is something you don't understand, due to lack of education, where do you think the answers lie...God? I am not religious, and am not of a faith bound to a single God, but apparently from what I understand you can't experimentally prove Him, you can't touch, see, or smell Him, and He is a helluva lot more abstract and unprovable than anything Physics or Mathematics could possibly hope to achieve. Yet Quantum Mechanics seems to be something you wholeheartedly dismiss as hype and lies. Why? To myself and many others, Mathematics and Science are God...the language of the universe. There is no "mysticism" in that.
I am a Pure Mathematician, as I have stated in many of my other posts. Mathematics is the language of the Universe. All of science is Mathematics in the end, and as such it holds the answers to every question the universe holds. Every mathematical equation and answer has been abstract at some point, even the original Arabic concept of zero. Mathematics follows very, very strict rules. If mathematics, which underlies Physics, says that something exists, then it does. What it may be is up to interpretation, which is where most Physics is done, but the pure mathematics is not. So, if an idea is too abstract for you to comprehend, does that mean it is automatically false? No, of course not. It just means you don't understand it.
Mathematicians and Physicists, especially ones at a place like MIT, are not there to "scam" or "swindle" you. Neither you nor I is a student or professor at MIT, and we therefore have no right to judge their intelligence, integrity, or their minds. Quantum Mechanics is a science, and an incredibly important one at that. You terribly misquoted Einstein and others when you made the blatantly incorrect reference to his stance on Quantum Mechanics (which is based on probabilities, unlike Einstein's relatively 'flat' universe). Einstein actually helped create Quantum Mechanics, and his famous remark about it was that it involved "spooky action at a distance." That quote has nothing to do with what you blatantly messed up. He said it was spooky to him...something one of the greatest minds that has ever lived didn't completely understand. So why do you think you should be able to?
Let's take an everyday example for a moment. Have you ever gone to the grocery store and purchased something? Did the cashier wave your product, with its bunch of bars in a small box called a UPC, over some lasers, and *presto*...your total came up on the cash register? Are you someone who never thinks twice about how that works, or are you someone who takes the time to find out? Well, you would have no clue how it works without a college course in Contemporary Abstract Algebra. The math that makes your UPC work involves such abstract things as Group Theory, Ring Theory, and Modular Multiplication. Looked at plain-faced, these things mean nothing to you. But dig deeper, become educated in the pure science of mathematics, and all of a sudden you realize that without Group Theory, Ring Theory, and Modular Multiplication nothing would work at all. Now, to explain: most CS majors know what modular multiplication is, right? Well, when applied to a Ring (which I have no room to explain here) you can choose a prime number (another of those abstract ideas) as your Ring modifier, apply the prime Ring to your modular multiplication, and you have the ability to create a code that has basically only one real solution (well, technically possibly more than one solution depending on your prime seed, but it would take reverse-engineering the UPC mathematics to figure it out, and no one cares, unless you have a Pure Math degree, a permanent marker, a code sheet of the vendor's UPC codes, and a lot of spare time). Hidden in that UPC bar code is the correct number to undo the Ring, thus giving you the Product ID of the food you purchased.
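For the curious, the arithmetic actually hiding in a UPC bar code is a mod-10 checksum; here is a minimal sketch of the standard UPC-A check-digit rule (this is genuine modular arithmetic, though simpler than the full ring-theoretic picture painted above):

```python
# UPC-A check digit: triple the digits in odd positions (1st, 3rd, ...),
# add the digits in even positions, and pick the check digit that brings
# the total to a multiple of 10 -- i.e., work in the integers mod 10.
def upc_check_digit(first11: str) -> int:
    digits = [int(c) for c in first11]
    total = 3 * sum(digits[0::2]) + sum(digits[1::2])
    return (10 - total % 10) % 10

# A scanner recomputes this and rejects the scan on a mismatch.
print(upc_check_digit("03600029145"))   # 2, completing the code 036000291452
```

The check digit catches any single-digit misread, which is why the scheme was chosen for bar codes.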
Since Abstract Algebra has been around for 200 or more years now, do you think anyone then could have imagined a UPC code in the 20th/21st century? No, of course not. And always be careful of reading "rebuttal" sites on the net or in print. None of them are written by anyone even remotely qualified to say anything, especially judging by the article you referenced and your apparent respect for it. Please never stop dreaming, and never doubt something out of hand simply because you don't understand it. Peace.
Observing without interacting (Score:1)
Can phenomena which are in no particular state (i.e., the wave that lives in the space between non-existence and existence) have interim unobserved effects during the "particle"->"wave"->"particle" transition? Or is the wave/particle duality an impenetrable boundary, the "wave state" something that can never be "known" in itself?
It seems that in such a "quantum system" one could induce some potential course of action, and by measuring the existence or nonexistence of a resulting effect infer the meaning of the result based on the original parameters. In this case you can use time as the controlling variable, and all is right with the world.
(Introducing time into the equation is the only means I can think of to observe a system without necessarily interacting with it. At least, you probably only need to interact half of the time.)
Once you've made a logical branch based on the result (or non-result) then you can happily reset the system to a known state and set up the next "instruction."
I realize this is an oblique notion, but it makes a weird kind of sense if you've done enough acid!
DMCA vs Quantum computers (Score:3, Interesting)
Re:DMCA vs Quantum computers (Score:1)
See item 1 in my previous comment [slashdot.org].
Interesting Implications (Score:2, Interesting)
1. No more encryption. Quantum computers can crack block ciphers with ease, as well as asymmetric public-key ciphers. Bigger keys? Just use more qubits. Hmm... can anonymous networks (MixMaster, Freenet, Publius, etc...) exist without encryption? Can banking exist without encryption? How about online transactions in general?
2. Uber compression. Everything digital occurs in the Pi sequence somewhere, right? Well, quantum computers might be able to find that offset and length within Pi, LCGs, or any other kind of sequence.
Imagine downloading a 4-hour DivX using 20 bytes: 4b sequence ID, 8b offset, 8b length. That is the same length as an IP header...
3. Massive optimization. Remember all those NP-complete problems you learned in comp. sci.? No more simulated annealing, genetic algorithms, or guesstimation methods. Qubits can find the optimal solution instantly. No more intense calculations for hours/days to find meager 'near'-optimal solutions. P.S. NP-complete-type problems show up in almost every complex system in every field/domain.
So what are the implications of this kind of computing becoming available in ten years? It's a wonder we don't hear more about this when reading about quantum computers. The effect they will have when available is almost more interesting than the implementation.
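To see what's at stake in point 3, here is a minimal brute-force sketch of an NP-complete problem (subset sum); the classical search space doubles with every element, which is exactly the blow-up qubits are hoped to erase (though no known quantum algorithm actually does so in polynomial time):

```python
from itertools import combinations

# Brute-force subset sum: is there a subset of `nums` summing to `target`?
# Trying all subsets costs O(2^n) -- the hallmark of NP-complete search.
def subset_sum(nums, target):
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # (4, 5)
```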
Re:Interesting Implications (Score:2)
Re:Interesting Implications (Score:2)
If it weren't the case, people would already be using Pi as a compression scheme on not-so-large data -- sure, it might take days/weeks/months to find the encoding of a specific piece of data in Pi, but considering how fast it would uncompress (if the offset were really significantly smaller than the data it was compressing, which wouldn't be often), it would be worth it for some uses.
Re:Interesting Implications (Score:1)
The only problem is that very few people have a quantum computer:)
Not quite that good (Score:1)
This is actually not quite true. So far no one has found a quantum algorithm which solves an NP-complete problem in polynomial time. This is perhaps one of the things that tends to get people overly excited about quantum computers. They will most likely be built and become more or less practical, depending on the amount of technological progress, but they are not magic. So far, most problems for which there are fast quantum algorithms are problems which can already be solved in less than exponential time on an ordinary computer. There are a few exceptions, like simulating a quantum system, but these problems are not NP-complete.
However, there is no mathematical proof that a polynomial-time quantum algorithm for an NP-complete problem could not be found; but then, the same is true for a classical algorithm for NP-complete problems.
Re:Interesting Implications (Score:4, Insightful)
Claim #1: No encryption.
There is a quantum algorithm for factoring large numbers (Shor's algorithm). It will break RSA... but so what? There are elliptic curves and countless other methods of cryptography still available. Quantum computing might or might not be able to break these.
And then there is quantum encryption,
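As a sketch of why factoring is the vulnerable step: the number-theoretic core of Shor's algorithm, simulated classically below, reduces factoring N to finding the multiplicative order r of a base a mod N. The quantum speedup lies entirely in the order-finding loop; here it is done by (exponentially slow) brute force.

```python
from math import gcd

def factor_via_order(N, a):
    """Factor N using the order of a mod N, as in Shor's algorithm."""
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky: a already shares a factor with N
    r = 1
    while pow(a, r, N) != 1:      # order finding -- the quantum part in Shor
        r += 1
    if r % 2:
        return None               # odd order: pick another base and retry
    f = gcd(pow(a, r // 2) - 1, N)
    return f if 1 < f < N else None

print(factor_via_order(15, 7))    # 3: 7 has order 4 mod 15, gcd(7^2 - 1, 15) = 3
```

Everything before and after the order-finding loop is classical post-processing, which is why breaking RSA needs only that one quantum subroutine.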
Claim #2: Improved compression
The concept of compression is a complex issue of information/communication theory. It applies to information, not to computing. Quantum computing is basically just computing; you might be able to compute the compressed file faster, but no computational method can squeeze information beyond the theoretical limit.
The pi scheme mentioned here is totally unusable: it takes as many bits to represent the index as it takes to represent the data itself.
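The counting argument behind that claim can be spelled out (illustrative arithmetic only):

```python
import math

# There are 2^n distinct n-bit strings, so an offset into any fixed
# sequence (pi's digits, an LCG stream, ...) must be able to take at
# least 2^n different values to name them all -- and writing such an
# offset takes n bits.  On average, nothing is saved.
n = 64                                    # size of the data, in bits
distinct_strings = 2 ** n
offset_bits = math.ceil(math.log2(distinct_strings))
print(offset_bits)                        # 64 -- as long as the data itself
```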
Claim #3: Faster computing
This too is slightly incorrect. There are some things that are faster to implement with quantum computing. Some things, like adding two and two, are suitable for classical computers. Even Shor's algorithm uses a classical FFT at one step. I don't think it's certain that a quantum computer can solve NP-complete problems faster than classical computers.
Besides, for some things *the* optimal solution simply can't be expressed in any computer, quantum or classical. (Think, for example, of the equation x^2 = 2. The answer can't be represented numerically in a computer; however, it can be approximated to as many digits as necessary.)
Quantum computing does not necessarily imply massively parallel computing. For some applications something like this happens, but it's not the same as having n different computations running simultaneously; it's more like the same computation running over and over with some variation.
Last of all, we don't get full certainty. Shor's algorithm (IIRC) can give us an answer with very high probability, but that probability isn't 100%. (99.999%, perhaps.)
Re:Interesting Implications (Score:2, Insightful)
1. There are algorithms for efficient factoring and discrete log, so many public-key cryptosystems can indeed be broken by a quantum computer. However, there is no known quantum algorithm for lattice-based cryptosystems such as NTRU. Moreover, for private-key cryptosystems all we have is a generic search algorithm that finds the key in time SQRT(2^n) where n is the key length, instead of 2^n for a brute-force classical search. Thus by doubling the key length of the cipher, you make the work as hard for a quantum computer as the original cipher was for a classical computer.
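The key-doubling claim is just arithmetic; here it is as an illustrative Python one-liner (the function name is mine):

```python
# Grover's-style generic search finds a key among 2**n candidates in about
# sqrt(2**n) = 2**(n/2) steps, versus 2**n for classical brute force.
# Doubling the key length therefore restores the original security margin.
def work(key_bits: int, quantum: bool = False) -> float:
    steps = 2 ** key_bits
    return steps ** 0.5 if quantum else steps

# A 256-bit key against quantum search costs as much as 128 bits classically.
assert work(256, quantum=True) == work(128)
```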
2. As pointed out in previous replies, the offset will usually be longer than the data. This follows from a simple counting argument.
3. There is no evidence that a quantum computer can efficiently solve NP-complete problems. In fact, many quantum computing researchers believe no such algorithm exists.
The current quantum algorithms are very tantalizing, but restricted in applicability. We really don't know yet what types of problems will benefit from quantum computing beyond the few known classes.
Wait, wait.. a "computer"? (Score:4, Insightful)
Or is this going to be, like, you know, a real *computer*? Something that can be given general calculations and work through them? By using the word "computer", are they thinking that what they make is actually going to be something Turing-complete, or at least comparable to ENIAC, or maybe even one of the behemoths that used to sit in the basement of a college (where the computer science students would sign up for a block of time, then come by, drop off large stacks of punchcards, and then wander by the next day to collect the results of the program)?
More importantly, though-- and this is what i'm really wondering about-- if they actually are building a quantum computer that is capable of going into the realm of *running actual programs of some sort*.. what programming language will be used? How will these programs be written? What will the "machine code" look like, and how possible will it be to write software for this in high-level languages? (I.e., will it be possible to do HLL abstractions as we do with current computers, at least at first, or will hand-optimisation be too necessary to allow things like "compiling"?) I am not 100% sure what a "von Neumann" architecture is, but as far as i understand things, there are some implicit assumptions in the way that things like C work that only make sense if computers are designed at least generally the way they are now. How different would the architecture of a quantum computer be in a general sense, and how much would current programming languages have to change to make sense in that architecture? Which language in its current state is closest to something that would make sense for writing programs on a quantum computer architecture-- C, Python/Java, LISP/Scheme, Haskell/ML, or APL/Mercury? Or something i've never heard of?
Or is it that special boards or setups whatever will have to be hardwired and specially set up for each specific task (although it will do those tasks really quickly), and this will not be a general-purpose computer capable of doing things like loading and executing an arbitrary written-as-software program?
And to get into the complete castles-in-the-air speculation realm.. if it is a true general-purpose computer, are they going to try to give it, like, you know, an operating system with things like a kernel and process manager and networking capabilities? Are they going to just stick with letting programs be fed in manually, or is the thing that they say will take ten years something you could realistically build, set in the basement of a college, and let all the students telnet to it and build and run programs while using some equivalent of unix talk/write to message each other, with tyrannical sysadmins constantly watching to see if anyone is playing quantum games so they can kill those processes? (I don't care if they actually *do* that. Just if that's realistic, my mind is totally blown. I doubt it's realistic.)
Or is anything that might be completed so far in the future that they can't really say what it will look like at this point?
I am deathly curious. I desire explanations, or at least links to academic webpages explaining, what sorts of things this computer would do and how we would go about giving it its programmatic instructions. Pleasepleaseplease, i thank you in advance?
-mcc
If It Can't Process Church Numerals Then What Good Is It
Re:Wait, wait.. a "computer"? (Score:2)
Normal computers would still be needed to work with quantum computers, and in fact, any kind of quantum computer would likely be a regular digital computer with a 'Quantum CoProcessor' to crunch on the difficult combinatorial optimization / state space search part of the problem.
Everything else would be done by the digital computer.
So, as a short answer to your question, think of Quantum Computers as a math coprocessor that you use in tandem with a digital computer to solve very complex problems that cannot be solved using conventional digital computers (or take prohibitively long to do so).
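A toy sketch of that coprocessor model in Python; 'QuantumUnit' and its 'search' method are entirely made-up names for illustration, with the quantum speedup faked by an ordinary loop:

```python
# The classical host does all the ordinary control flow and ships only the
# hard search kernel to a (hypothetical) quantum coprocessor.
class QuantumUnit:
    def search(self, predicate, space):
        # Stand-in: a real QPU could do this in ~sqrt(len(space)) steps.
        return next(x for x in space if predicate(x))

def crack(check, keyspace, qpu):
    # Classical host: set up the problem, hand off the search, use the result.
    return qpu.search(check, keyspace)

# Usage: find the key k with k*k == 1225 in a 100-element keyspace.
print(crack(lambda k: k * k == 1225, range(100), QuantumUnit()))  # -> 35
```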
So what you're saying is... (Score:1)
Great general QC background. (Score:2)
http://tph.tuwien.ac.at/~oemer/doc/quprog/index.html [tuwien.ac.at]
Bob
Re:Wait, wait.. a "computer"? (Score:5, Informative)
First, two key facts.
So yes, we're talking about a fully programmable computer. However, this universality means an increase in the computer size, so for practical reasons you'll see two shortcuts that undermine it.
First, everything that can be done classically is done outside the quantum computer -- a quantum-classical dualism, if you wish. Suppose you have an algorithm that performs operation Q on some register, and applies it 200 times in a loop. You could put the loop counter as part of the quantum computer and program it to do an increment+test+Q operation. However, currently it's much cheaper to put the loop counter outside the quantum computer and simply tell it 200 times to apply a Q operation.
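That split can be sketched in plain Python: the loop counter lives in the classical program, and the "quantum computer" is a simulated two-dimensional state vector (the rotation Q is my arbitrary choice for illustration, not from the source):

```python
import math

# theta chosen so that 200 applications rotate |0> all the way to |1>.
theta = math.pi / 400

def apply_Q(state):
    # Q: a small rotation, standing in for some single-qubit unitary.
    a, b = state
    c, s = math.cos(theta), math.sin(theta)
    return (c * a - s * b, s * a + c * b)

state = (1.0, 0.0)            # the quantum register, starting in |0>
for _ in range(200):          # the loop counter: plain classical code
    state = apply_Q(state)    # "apply Q" is the only quantum instruction

# After 200 rotations of pi/400 each, the state is (0, 1), i.e. |1>.
```

Putting the counter inside the quantum machine would work too, in principle; it's just far more expensive in qubits, which is the point the paragraph above makes.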
Second, the hardware is made as simple and specialized as possible, and optimized for a specific algorithm.
Now, these shortcuts are clearly present in the tiny quantum computers built so far (latest is IBM's, featuring 8 qubits), and will probably be used for quite some time. But that's mere practice, not theory.
OK, so how do you program these things? All quantum algorithms to date have been expressed using one of two formalisms: either using algebra in the mathematical "Hilbert space" that represents all the states the computer can be in, or using a "quantum circuit", which is just like a normal logic circuit (with gates like AND and NOT) except the gates are different -- in fact the gates are expressed as operations in the Hilbert space, as in the first approach, but it's often easier to see the overall picture if you combine simple gates using the circuit formalism. To be truthful, there's also the "and then you do that N more times, and then you measure the first register" type of formalism.
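For concreteness, here's a minimal Python sketch of the Hilbert-space view: a gate is a small matrix, and applying it is a matrix-vector product. The Hadamard gate below is a standard textbook example, my choice rather than anything from the post:

```python
import math

# The Hadamard gate as a 2x2 matrix over the single-qubit state space.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    # Applying a gate = multiplying the state vector by the gate's matrix.
    return [sum(g * s for g, s in zip(row, state)) for row in gate]

ket0 = [1.0, 0.0]                    # the basis state |0>
superposition = apply(H, ket0)       # equal superposition of |0> and |1>

# Measurement probabilities are amplitudes squared: 50/50 here.
probs = [a * a for a in superposition]
print(probs)
```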
So you see, currently researchers are working at a level way below assembly language, and they're pretty happy about it because the algorithms are very small too. But what about higher-level representations? All efforts I've seen so far use quantum-classical dualism: you write a classical program that manipulates a quantum register (data), but the logic is purely classical. Some do it with a dedicated programming language, some with a class library, but the idea is the same (see a comprehensive list [ex.ac.uk]; all of these are mere simulators for now, of course).
Now, this is a very reasonable approach, but it seems rather halfway there -- wouldn't it be beneficial to allow quantum operations in flow control, not only in data? Well, we don't know yet. Currently there are very few "patterns" used in quantum computing, and none of them seems easier to represent that way. It's hard to design a paradigm, let alone a language, for solving problems that don't yet exist. On the other hand, if we invent such a paradigm it might help us find new quantum algorithms. This is a vast open research area.
As for your speculations on how quantum computers will be used, the answer is yet again that we don't know. Here are the two extreme cases, both easily imagined and consistent with current knowledge. First, it could be that quantum computers will be found to be good for nothing except a few very specialized tasks, and that only a few RSA-cracking devices will be built by intelligence agencies at a prohibitive cost. On the other hand, it could be that a new class of quantum algorithms will be discovered that addresses more common needs, leading first to something in the college basement and later to a chip in everybody's computer. No one currently knows what these chips could be good for, if anything, though there's some intuition about what's more likely. I do venture to say that which of these possibilities becomes reality depends primarily on usefulness, since long-term prospects for mass-production seem quite real, given sufficient demand.
I hope this clears things up a bit. I wish I could be more concrete, but it really takes a few hours to get a rough grasp of how these things really work, and a full-semester course to understand the interesting algorithms. Please don't hesitate to e-mail me [mailto] if you want to discuss this further.
Scheduling challenges (Score:3, Funny)
Of course, we won't know if the project worked or not until someone looks inside the lab in 2005.
I am getting this funny feeling..... (Score:1)
-Approaching the singularity
So long! (Score:2)
Re:So long! (Score:1)
Engineering schools are snapping up anyone who's lucky enough to make it through the HS math sump alive. Those a little braver are still discouraged by parents and media arguing for higher salaries, more partying, and of course more corporate-sponsored and prescribed merchandise.
I'm disgusted.
Re:So long! (Score:2)
Some of the challenges (Score:3, Interesting)
"Dr Cory says that the program for factorising large numbers [400 digits] will require about 1,000 qubits simply to store the problem. But each of these qubits will require dozens of extra qubits for error-checking. That means that a useful computer will need tens of thousands of qubits to come up with a reliable answer. At seven and counting, that goal is a long way off. "
Hmm (Score:1)
Why joke... Just do it with the Next Big Thing...
This article is gibberish (Score:2)
Cool! (Score:1)
even if its vapor (Score:2)
With any luck at all... (Score:1)
Since HP likes to name their products with numbers (Score:2)
The HP [0-9][0-9][0-9][A-D|G|N|L|R|P|Q|X][T|S|X|Z]
REJECTED????? (Score:1)
Uhhh... yeah. So I submitted this exact same story a few days ago when the original article appeared on IDG.net [idg.net]. It was rejected.
Guess I should go for CNN as the authoritative recycler of science content next time instead of getting it from the source. After all, we all know how CNN is always right. :/
Arbitrary decisions aside, at least this is some encouragement that the irresistible force of Moore's Law [intel.com] won't meet the immovable object of the physical limits of silicon, etc., and our universe will continue to exist!
Re:REJECTED????? (Score:1)
Ooops! Forgot to add this link to the Quantum Applications Symposium 2001 site [erim.org]
quantum memory in NY Times (Score:2)
article [nytimes.com] on quantum memory.
This is not the same as quantum computing, but it does use quantum states of atoms to propose extremely dense memory.
Some background Spin (Score:1)
I am the Yeti!!!
Another Quantum Computing Article (Score:1)
Yo Mama (Score:1)