MIT And HP Announce Joint Quantum Computer Project

MetaCow writes: "CNN is running this article which describes a joint effort between MIT and HP to build a quantum computer. Nothing expected any time soon, though: 'Quantum computing research is farsighted, and it may take 10 years to develop a fully operational quantum computer ...'"

  • Alright, a little 'bit picky', but

    "While the classical bit can store any number between 0 and 255 on each of its eight bytes, ..."

    Byte and bit should be reversed in this sentence fragment.

    :^)

    Ryan Fenton
    • Actually, that still doesn't make sense. Perhaps, "While the classical byte can store any number between 0 and 255 using all of its eight bits"?


      • There should be a new law requiring all computer-related articles to be edited by the Slashdot boards. Then it would be easy to recognize any computer-related article by its first words - "First post!"

        ;^)

        Ryan Fenton
      • I know the reporter who wrote the story, and this is the un-subbed version... She's added in this paragraph instead:

        Quantum computing uses the properties of quantum physics to perform calculations. The basic unit of computation used is the qubit, or quantum bit. Unlike classical bits, the qubit is not just 0 or 1 but is in a superposition of both. In other words, it is both on and off at the same time.

        While the classical digital byte can store any number between 0 and 255 using all of its eight bits, it can only represent one of those numbers at a time. But a qubyte can be all of the numbers between 0 and 255 simultaneously.

        This allows much more information to be stored on a quantum bit than a classical bit, and allows massively parallel processing on a quantum scale. One calculation can give the answer for all the numbers on the byte at the same time, according to an explanation posted on the MIT Web site. In other words, for each clock cycle a quantum computer could perform 256 calculations in the same time a digital computer can perform just one.

        CNN have the older version up... the correction should be along shortly.
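
        For what it's worth, the standard textbook way to write the qubyte claim above (standard notation, not the article's wording) is as a superposition over all 256 basis states:

        $$ |\psi\rangle = \sum_{x=0}^{255} c_x\,|x\rangle, \qquad \sum_{x=0}^{255} |c_x|^2 = 1 $$

        A measurement still returns only a single x, with probability |c_x|^2, which is why the "256 calculations per clock cycle" picture needs the caveats raised further down this discussion.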
    • Well, that's probably why they are having problems reading the information back, as mentioned farther on in the article. If MIT and HP cannot figure out the difference between bits and bytes, then they are simply providing a demonstration of the adage, "Crap in, crap out".

      On the other hand, the "article-typer" probably heard the words "quantum" and "qubit" and decided to write an article. His research probably consisted of noticing that his 56-kilobit modem was getting transfer rates of ~5 kilobytes per second on a Win9x-reported connection of ~40 kbps, and deducing that bits are bigger than bytes.

      Following this, he decided that he was an Authority (search for False Authority on /. when the indexes have been rebuilt) and conned a PHB editor, with a deluge of buzzwords, into letting him publicly insert his foot into his mouth.
      • Okay - I wrote the stupid thing. And the whole sentence got mangled. A correction has been posted on the IDG site but it hasn't got through to CNN yet... Not MIT's fault - mine. D'oh.
  • I was getting tired of Geforce2 quality graphics.
  • by SpanishInquisition ( 127269 ) on Monday August 20, 2001 @03:45PM (#2199314) Homepage Journal
    Someone will have to observe them to get a definitive answer.
  • So it's going to take EXACTLY the same amount of time as it will for HAL to develop into maturity [slashdot.org]?????

    Mr. Clarke was 10 years off perhaps? HMMMMMM
  • "and it may take 10 years to develop a fully operational quantum computer ..."

    Which means the NSA should be turning theirs on, right..... about...... now.
    • I actually got to tour the main NSA facility devoted to quantum computing research as a potential employee. (If they have more than one major lab working on this, no one told us).

      They actually don't have anything working right now, beyond what many public labs have. What they do have however is an essentially inexhaustible supply of money being thrown at this. They can hire the best talent and get any equipment they want.

      The NSA isn't there yet, but they sure do want to be first.
  • While the classical bit can store any number between 0 and 255 on each of its eight bytes, the qubit can store all the numbers between 0 and 255 on a byte of eight qubits.


    Please tell me this is just bad grammar. I mean, doesn't this sound like you need 64 bits to make 255 on a binary computer (which we all know is false) and that somehow these "magical" quantum computers can do it in only 8 bits?...

    Imagine the compression we could have by not using a whole byte to store a bit of information...

  • by akiaki007 ( 148804 ) <[aa316] [at] [nyu.edu]> on Monday August 20, 2001 @03:49PM (#2199341)
    Quite a lame article, IMO.

    The article fails to make any real points. It's merely a PR article for HP and MIT.

    Unlike classical bits, the qubit can be not just 0 or 1 but a superposition of both, in differing proportions.

    Um...wrong. The qubit can be in the 0 or 1 state, but can also be both at the same time, and have varying rotations, which is what makes it impossible for us to decode them. It is this multiple-state property that is interesting to us, and that does the parallel computing. We just don't know how to utilize it just yet. There have been various articles. Qubit.org [qubit.org] is a great place to start reading up on this stuff. IBM Almaden [ibm.com] has a nice article [ibm.com] that will actually tell you something useful.
    • I also thought the article lacked a lot of relevant info and, what's more, said some things that were just wrong. A few additional pieces of information that might be of interest:


      A quantum computer is not simply faster than a classical computer

      Quantum computers use a different set of fundamental logical operations, which allow them to do all the things a classical computer (Turing machine) can, plus some other operations that use the superposition or entanglement of states. This means there are some problems for which fundamentally different algorithms can be written that run much faster, in polynomial time as opposed to exponential; however, there are just a few of these (a rough back-of-the-envelope comparison follows at the end of this comment). There's no reason to think it will be faster for everything. For most tasks, odds are its algorithms will be no more efficient than a classical computer's. In addition, its logic operations will likely be slower, meaning it may well be slower for the vast majority of tasks. So don't expect a desktop QC in 20 years; it would probably be used only for specialized computational tasks for the foreseeable future.


      A quantum computer is not right around the corner

      Though the article gives little detail, the computer they're talking about making in 10 years will probably not be amongst the most powerful in the world when it is turned on. Odds are it will be more like the size of a house with the power of a pocket calculator, old-school mainframe style. This is due to the difficulties they mention. This is not really made very clear in the article, but the difficulties they face aren't just practical. Even theoretical calculations suggest that it's not feasible with currently theorized techniques to make a quantum computer of more than about 100 qubits, due to decoherence and computer-control factors. Quoth one paper, "...the absolute limit on what any practical NMR computer can handle remains well below 100 bits" (Cory et al., Proc. Natl. Acad. Sci. USA 94, 1997). I've read similar estimates for all forms of quantum computation currently in the works. It seems that it takes a computer on the order of 500 to 1,000 qubits before it actually becomes more useful than a classical computer. This should not be hard to imagine. What do you think you could do with a computer that could only hold 100 bits of memory at a time? The point is that in ten years, we won't be seeing any supercomputer.


      Quantum computers are not the next step in miniaturization

      The article says that we're reaching the limit of classical computation and suggests that quantum computers are the next step. It is true that Feynman suggested in his original paper on the topic in 1983 that when computer technology reached the atomic scale, quantum mechanics would have to become part of computation, but quantum computers as they are talked about today have little to do with miniaturization. In many of the schemes, such as atom-trap quantum computers, the memory and computation elements (when control devices are included) are much larger than those for classical computers. This can also be said for RF SQUIDs and quantum dots. In short, the promise of quantum computers has nothing to do with miniaturization; it has to do with the fact that they can perform fundamentally different algorithms.


      All that being said, I think it is a good thing that they're doing this research, and I do think that quantum computers will likely someday be useful, both as computers and as tools for researching quantum mechanics further. I am not saying I don't think they'll work, simply that the article seems to suggest a picture which is overly optimistic. However, with the great promise they hold, I don't think we can afford not to do research like this.
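
      To put some very rough numbers on the "polynomial versus exponential" point above, here is a back-of-the-envelope sketch (my own, not from the article or the paper quoted); constants, error correction and clock speeds are ignored, so only the growth rates mean anything:

      # Rough growth-rate comparison for factoring an n-bit number (constants ignored).
      # Classical: general number field sieve, sub-exponential in n.
      # Quantum:   Shor's algorithm, roughly O(n^3) elementary operations.
      import math

      def gnfs_ops(n_bits):
          """Asymptotic operation count for the general number field sieve."""
          ln_n = n_bits * math.log(2)  # ln N for an n-bit modulus N
          return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

      def shor_ops(n_bits):
          """Asymptotic operation count for Shor's algorithm (~n^3)."""
          return n_bits ** 3

      for n in (512, 1024, 2048):
          print(f"{n:5d}-bit modulus: GNFS ~10^{math.log10(gnfs_ops(n)):.0f} ops, "
                f"Shor ~10^{math.log10(shor_ops(n)):.0f} ops")

      The gap is what makes factoring the poster child for quantum speedups; for most other tasks no comparable separation is known.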

  • The brilliant physicist who first visualized this concept.


    His biography is quite the read. You can get it here [amazon.com].

  • The article mentions only $25 million. For a project that *could* take longer than the estimated 10 years, that just seems small...

    With the results that a quantum computer could generate, I cannot believe that there isn't a larger sum designated for this project...
    • by Anonymous Coward
      years 0-7: pizza, beer, sleep till 3pm every day.
      year 8: go see what CMU has done with Quantum computing
      year 9: reimplement & call press conferences to upstage CMU, cuz we're MIT dammit
  • It seems that a quantum computer could calculate every possible double answer simultaneously, given a 64-qubit processor. The only problem is that, if the answer is read out, it's guaranteed to be wrong! ;-)


    This is something of a drop from conventional computer performance, in which the answer is merely often wrong.


    Somehow I think this article 'dumbed things down' a bit too much...


    186,282 mi/s... not just a good idea, it's the law!

    • Err, I thought it was only wrong if you tried to read the answer before the computation was done.
    • I can see that, some years in the future, the technical jargon in Star Trek will be as dated as the technical data in a show like the old Max Headroom (from the late '80s).

      Max Headroom was brilliant, and totally did not see the advent of the Internet. They envisioned a world where TV was mandatory, and everything was run by TV conglomerates.

      They missed the internet entirely. But aside from that, they were not all that bad.

      - - -
      Radio Free Nation [radiofreenation.com]
      is a news site based on Slash Code
      "If You have a Story, We have a Soap Box"
      - - -

  • These evil crackers are trying to circumvent factor based encryption, a copyright protection scheme. Lock 'em up
  • HP has cast itself as the manufacturer of all of the disposables of computing (PCs, printers, cameras, etc.). While it is laudable for them to go after future marketspace, their R&D might be better focused on how to beat Dell at low-cost manufacturing and inventory management, seeing as this is the commodity market they have chosen to compete in.

    As it stands, without some serious changes in senior management and a total overhaul of their product line, it is unlikely HP (as we know it) will see 2010....they're on the 3COM/SGI track right now.

  • Imagine the power we will have to crack passwords when this quantum computing goes through
  • This is all nice, but:

    What sort of frame rates will I be getting in Quake3?

    Fnord!
  • This excerpt pretty much sums up the state of this vapor(hard)ware:
    Problems arise when it comes to reading the information back. Any interactions with the environment -- including trying to read the information stored -- affect the qubits so that they change from a pure quantum state to a mixed state. This is known as decoherence and any reading taken from this state will be wrong.
    Bill Gates: "Yes! This computer works perfectly. You just don't have the technology to read the result without corrupting the data, but for $5 per compute cycle, Microsoft will be happy to license our proprietary qbit data reading technology to you."

    --CTH
  • This is a re-post of a fine piece by nightlight3 some months ago. I'd simply post the link [slashdot.org], but slashdot archives aren't working. (I retrieved this from google cache).

    This isn't flamebait - it's definitely a subject worthy of discussion. I, for one, have great reservations about whether this is a viable technology. This is especially important since so much money and attention is being poured into research, perhaps often without a real understanding of the basic principles. I happen to know people in Gershenfeld's lab, and know full well their tendencies to let the hype get out of hand.

    Perhaps HP is spending the money as a marketing/PR effort, rather than out of any intention to get real work done. That would explain the press release.

    So here it is; I hope nightlight3 will chime in.
    - - - - -

    "If one existed, a quantum computer would be extremely powerful; building one, however, is extremely challenging,"

    Extremely challenging, like in "it can't work and it won't ever work, but I hope the government and the industry sponsors won't find that out, at least until I retire, preferably after I am dead."

    The whole field of Quantum Computing is a mathematical abstraction (fine, as any pure math is, as long as you don't try to claim that's how the real world works). Its vital connection with the real world is based on a highly dubious (even outright absurd, according to some physicists, including Einstein) conjecture about entangled quantum states (roughly, a special kind of "mystical" non-local correlation among events) which was actually never confirmed experimentally. And without that quantum entanglement, the whole field is an exercise in pure abstract math with no bearing on reality.

    While there have been a number of claims of an "almost" confirmation of this kind of quantum correlation (the so-called Bell inequality tests), there is always a disclaimer (explicit or, in recent years, between the lines as the swindle got harder to sell), such as "provided the combined setup and detection efficiency in this situation can be made above 82%" (even though it is typically well below 1% overall in the actual experiment; the most famous of its kind, the Aspect experiment from the early 1980s, had only 0.2% combined efficiency, while 82% is needed for an actual, "loophole free" proof), or provided we assume that the undetected events follow such and such statistics, etc. The alternative explanations of those experiments (requiring no belief in mystical instant action-at-a-distance), which naturally violate those wishful assumptions, are ignored, or ridiculed as unimportant loopholes when the "mystical" faction is forced to debate the opposition. After all, without believing their conjecture, all the magic of quantum computing, quantum cryptography and quantum teleportation, along with the funding, would vanish.

    For those interested in the other side of these kinds of claims, why it doesn't work and why it will never work, check the site by a reputable British physicist Trevor Marshall, who has been fighting, along with a small group of allies, the "quantum magic" school for years:

    Quantum Mechanics is not a Science [demon.co.uk]

    Unfortunately, the vast bulk of the research funding in this area goes to the mystical faction. As long as there are fools with money, there will always be swindlers who will part the two.

    For a more popular account, accessible to non-physicists, of the opposing view, you can check a site by a practical statistician (and general sceptic) Caroline Thompson:

    Caroline Thompson's Physics [aber.ac.uk]

    • Extremely challenging, like in "it can't work and it won't ever work, but I hope the government and the industry sponsors won't find that out, at least until I retire, preferably after I am dead."

      This is so true. David Deutsch is a half-crazed crackpot and con artist who managed to convince a bunch of gullible people that his chicken-feather voodoo physics is real science. I never thought I'd live to see the day when science is turned into in-your-face superstition by a bunch of swindlers. Do physicists think that they are beyond public scrutiny? Do they really think they can throw any crap at the public and that the public is forced to swallow it? I think they should be careful, because the public is not as stupid as they want us to believe. One day, we'll wake up from our stupor and wipe that smug superiority smile off their faces. After all, we pay their salaries and we reserve the ultimate right to decide what is good science and what is not.

      "It is up to us, it is up to the citizens of a free society to either accept the chauvinism of science without contradiction or to overcome it by the counterforce of public action." -- Paul Feyerabend
      • Dude, I looked at your .sig link (Nasty Little Truth About Spacetime Physics [gte.net]) and found no useful math whatsoever. This guy wants to debate fine points of physics, and doesn't understand the language at all. It's like debating comp sci w/o any understanding of if-then logic, much less And-Or logic. My guess is he's a philosophy major who was frequently humiliated in class by the guys who knew math, and thus is on a crusade against anyone who passed calculus.


        I hope you have a great sense of humor, or else have just neglected your physics education, because currently I have little faith in your powers of logic.


        OTOH, I suppose you're a necessary part of the scientific ecosystem. We always need people criticizing the accepted standards, pushing us to put up (give some physical proof) or shut up. I would prefer, though, that you adhere to the same standards of intellectual honesty that your typical scientist does. For instance, if you could come up w/ some alternate theories w/ solid predictions that could be verified or disproven, I might listen. Otherwise, you're little more than a crackpot claiming that stuff doesn't work because it doesn't make sense to your math-illiterate brain. Well, tough. The Universe is not mandated to 'make sense' to Joe Public, especially if he refuses to try. The power of populism only goes so far, you know.


        At least the original post linked [demon.co.uk] to someone who is willing to make predictions that can be proven or disproven. That man is a scientist, IMHO, whether I agree w/ him or not. Your link is to a crackpot, OTOH.

        • If there is one thing that a great many physicists and mathematicians have in common, it's their insufferable pomposity. My site was not created for you. It's for the lay public. They are the ones who need to wake up and wipe that smug superiority smile off your faces and remind you who the real bosses are. So if you don't like it, don't read it.
          • Sir, I have visited your site in the past, looked it over, and even considered emailing you about it. Yet I decided not to. However, I feel this is a chance to ask you some questions about the ideas listed on said webpage.

            First of all, your logical argument against travel in spacetime makes no sense to me. You find that it is self-referential, because to find the velocity along the time axis the equation would be v = dt/dt.
            If a time dimension does exist, it would not be a spatial dimension; therefore 'velocity' within it wouldn't make any sense in the first place. Velocity is a phenomenon found in our (3D) world. However, there is a connection between relative velocity and time itself. The faster you travel, the slower time appears to be. This has been proven with atomic clocks on space flights. 'Time' itself did slow down, even though only by fractions of a second.

            I don't understand your point. Another thing: you claim they are mathematical abstractions and have no counterparts in nature. Yes, they are abstractions, but so is any language. The human mind takes experiences and puts them in nice little symbolic packages so they can be transferred between human beings (language). This is math, and even English. Just because they are abstract does not make them false.

            You seem to me to be very anti-science. Also I think you take Feyerabend out of context. I deeply respect Feyerabend but he is not against science. His point was to make science more humanistic. Using any one methodology would slow down the progress of science because science is a human art. An interpretation of reality that becomes a world view. I would have to agree with David Bohm that insight comes before the mathematical equations. The math(or language, paintings, etc) just describes this insight.

            Think of scientists more like painters who use math as their paintbrush to interpret the world. Science is a very fulfilling and beautiful thing; do not resent it.

            Again, what is your point, my friend? Science sometimes gets bogged down in its own abstractions and becomes a little crazy, but what human endeavor does not?
            • First of all, your logical argument against travel in spacetime makes no sense to me. You find that it is self-referential, because to find the velocity along the time axis the equation would be v = dt/dt.
              If a time dimension does exist, it would not be a spatial dimension; therefore 'velocity' within it wouldn't make any sense in the first place.


              You are agreeing with me. It is precisely for this reason that there is no time travel. IOW, there is no motion in spacetime. Therefore there is no spacetime and no time dimension either.

              Velocity is a phenomenon found in our (3D) world. However, there is a connection between relative velocity and time itself. The faster you travel, the slower time appears to be.

              An observer will notice a moving clock to run slower. Time itself cannot change because changing time is self-referential.

              This has been proven with atomic clocks on space flights. 'Time' itself did slow down, even though only by fractions of a second.

              Not entirely correct. Time dilation is a misnomer. Time can neither slow down nor speed up. Only processes (clocks) slow down. Clocks are used to measure invariant temporal intervals. Intervals measured with a moving clock or a clock under the influence of gravity will seem longer than with a non-moving or inertial clock. "Time dilation" is not time travel. If a slowed clock actually traveled in time, it would simply disappear from view. I suggest you take a look at the book "Relativity from A to B" by Robert Geroch. Here's an excerpt:

              There is no dynamics within space-time itself: nothing ever moves therein; nothing happens; nothing changes. [...] In particular, one does not think of particles as "moving through" space-time, or as "following along" their world-lines. Rather, particles are just "in" space-time, once and for all, and the world-line represents, all at once the complete life history of the particle.

              Also check out the work of relativists like Bianchi, Pavsic and Horwitz, especially the latter's invariant tau formalism. I am not making any of this stuff up.

              Another thing: you claim they are mathematical abstractions and have no counterparts in nature.

              I never made any such claim. I said that the spacetime model has no counterpart in nature. If it did, there would be no motion. Please do not use this strawman against me. I'd appreciate a little bit of honesty from my critics.

              You seem to me to be very anti-science.

              No I am not. I love science. I am against the "chauvinism of science", as Feyerabend puts it in his "Against Method." I am especially against quackery that passes itself as science. I am against any group of people who would set itself up as some kind of privileged priesthood over the public.

              Again, what is your point, my friend? Science sometimes gets bogged down in its own abstractions and becomes a little crazy, but what human endeavor does not?

              We, the public, pay good money for scientific research. A lot of money. We deserve to get good science for our money. Not snake oil. Quackery from famous charlatans is orders of magnitude more detrimental to humanity than crackpottery from your run of the mill crackpot. It is like a monkey wrench in the works. It can set us back for decades if not centuries.
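
              (For reference, the standard special-relativity relation behind the "moving clocks run slower" claim being argued over above is Δt = Δτ / sqrt(1 - v^2/c^2): Δτ is the interval shown on the moving clock, Δt the interval measured in the frame in which it moves, so the moving clock records the smaller number - which is what the atomic-clock flights mentioned above measured.)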

    • Why are so many people so bitter about, and ignorant of, the pure sciences such as Mathematics and Physics? If there is something you don't understand, due to lack of education, where do you think the answers lie... God? I am not religious, and am not of a faith bound to a single God, but apparently, from what I understand, you can't experimentally prove Him, you can't touch, see, or smell Him, and He is a helluva lot more abstract and unprovable than anything Physics or Mathematics could possibly hope to achieve. Yet Quantum Mechanics seems to be something you wholeheartedly dismiss as hype and lies. Why? To myself and many others, Mathematics and Science is God... it is the language of the universe. There is no "mysticism" in that.
      I am a Pure Mathematician, as I have stated in many of my other posts. Mathematics is the language of the Universe. All of science is Mathematics in the end, and as such holds the answers to every question the universe holds. Every mathematical equation and answer has been abstract at some point. Even the original Arabic concept of the zero. Mathematics follows very, very strict rules. If mathematics, which is Physics, says that something exists, then it does. The idea of what it may be may be up to interpretation, which is where most Physics is done, but the pure mathematics is not. So, if an idea is too abstract for you to comprehend, does it mean it is automatically false? No, of course not. It just means you don't understand it.
      Mathematicians and Physicists, especially ones at a place like MIT, are not there to "scam" or "swindle" you. Neither you nor I am a student or professor at MIT, and we therefore have no right to judge their intelligence, integrity, or their minds. Quantum Mechanics is a science, and an incredibly important one at that. You terribly misquoted Einstein and others when you made the blatantly incorrect reference to his stance on Quantum Mechanics (which is based on probabilities, unlike Einstein's relatively 'flat' universe). Einstein actually helped create Quantum Mechanics and famously described entanglement as "spooky action at a distance." That quote has nothing to do with what you blatantly messed up. He said it was spooky to him... something one of the greatest minds that has ever lived didn't completely understand. So why do you think you should be able to?
      Let's take an everyday example for a moment. Have you ever gone to the grocery store and purchased something? Did the cashier wave your product, which carries a bunch of bars in a small box called a UPC, over some lasers, and *presto*... your total came up on the cash register? Are you someone who never thinks twice about how that works, or are you someone who takes the time to find out? Well, you would have no clue how it works without a college course in Contemporary Abstract Algebra. The math that makes your UPC work involves such abstract things as Group Theory, Ring Theory, and modular multiplication. These things, looked at plain-faced, mean nothing to you. But dig deeper, become educated in the pure science of mathematics, and all of a sudden you realize that without Group Theory, Ring Theory, and modular multiplication nothing would work at all. Now, to explain: most CS majors know what modular multiplication is, right? Well, when applied to a Ring (which I have no room to explain here) you can choose a prime number (another of those abstract ideas) as your Ring modifier, apply the prime Ring to your modular multiplication, and you have the ability to create a code that has basically only one real solution (well, technically possibly more than one solution depending on your prime seed, but it would take reverse-engineering the UPC mathematics to figure it out, and no one cares, unless you have a Pure Math degree, a permanent marker, a code sheet of the vendor's UPC codes, and a lot of spare time). Hidden in that UPC bar code is the correct number to undo the Ring, thus giving you the Product ID of the food you purchased. (A minimal check-digit example follows at the end of this comment.)
      Since Abstract Algebra has been around for 200 years or more now, do you think anyone back then could have imagined a UPC code in the 20th/21st century? No, of course not. And always be careful of reading "rebuttal" sites on the net or in print. None of them are written by anyone even remotely qualified to say anything - especially judging by the article you referenced and your apparent respect for it. Please never stop dreaming, and never doubt something out of hand simply because you don't understand it. Peace.
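
      One concrete, public piece of the modular arithmetic in a UPC is its check digit: the 12th digit is chosen so that a weighted sum of all twelve digits is 0 mod 10. A minimal sketch of that standard algorithm (not of any vendor's code, and much more pedestrian than the ring-theory framing above):

      # Standard UPC-A check-digit computation (modular arithmetic, base 10).
      # The first 11 digits identify the product; the 12th is chosen so that the
      # weighted sum below is congruent to 0 modulo 10.
      def upc_check_digit(first_11_digits: str) -> int:
          digits = [int(d) for d in first_11_digits]
          odd_sum = sum(digits[0::2])       # positions 1, 3, 5, ... (1-indexed)
          even_sum = sum(digits[1::2])      # positions 2, 4, 6, ...
          return (10 - (3 * odd_sum + even_sum) % 10) % 10

      print(upc_check_digit("03600029145"))  # -> 2, completing the code 036000291452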
    • A simple physics question, or maybe not so simple:

      Can phenomena which are in no particular state (i.e., the wave that lives in the space between non-existence and existence) have interim unobserved effects during the "particle"->"wave"->"particle" transition? Or is the wave/particle duality an impenetrable boundary, the "wave state" something that can never be "known" in itself?

      It seems that in such a "quantum system" one could induce some potential course of action, and by measuring the existence or nonexistence of a resulting effect infer the meaning of the result based on the original parameters. In this case you can use time as the controlling variable, and all is right with the world.

      (Introducing time into the equation is the only means I can think of to observe a system without necessarily interacting with it. At least, you probably only need to interact half of the time.)

      Once you've made a logical branch based on the result (or non-result) then you can happily reset the system to a known state and set up the next "instruction."

      I realize this is an oblique notion, but it makes a weird kind of sense if you've done enough acid! ;-) I'm untainted by any deep understanding of quantum computing, so I'm just riffing off my intuition. Does quantum computing rely on preserving states in such a way that my theoretical brute-force approach would topple the system?
  • by roman_mir ( 125474 ) on Monday August 20, 2001 @04:05PM (#2199444) Homepage Journal
    I wonder if the DMCA would render building, selling and using such machines illegal, since quantum computers can be used to compute security keys for any encryption algorithm in a feasible amount of time?
  • Here are a few things that quantum computers (when fully realized and sufficiently powerful) may bring with them in the future:

    1. No more encryption. Quantum computers can crack block ciphers with ease, as well as asymmetric public-key ciphers. Bigger keys? Just use more qubits. Hmm... can anonymous networks (MixMaster, Freenet, Publius, etc.) exist without encryption? Can banking exist without encryption? How about online transactions in general?

    2. Uber compression. Everything digital occurs in the Pi sequence somewhere, right? Well, quantum computers might be able to find that offset and length within Pi, LCGs, or any other kind of sequence.

    Imagine downloading a 4-hour DivX using 20 bytes: a 4-byte sequence ID, an 8-byte offset, and an 8-byte length. That is the same length as an IP header...

    3. Massive optimization. Remember all those NP-complete problems you learned about in comp. sci.? No more simulated annealing, genetic algorithms, or guesstimation methods. Qubits can find the optimal solution instantly. No more intense calculations for hours/days to find meager 'near'-optimal solutions. P.S. NP-complete-type problems show up in almost every complex system in every field/domain.

    So what are the implications of this kind of computing becoming available in ten years? It's a wonder we don't hear more about this when reading about quantum computers. The effect they will have when available is almost more interesting than the implementation ;)
    • About the Pi thing: most combinations are likely to be found at a numerically higher offset than the number itself, therefore making the total bigger ;(
      • You are correct. The offset into PI will be larger than the data segment you are trying to find in it in all but the most trivial cases. Simple information theory dictates this.


        If that weren't the case, people would already be using Pi as a compression scheme on not-so-large data -- sure, it might take days/weeks/months to find the encoding of a specific piece of data in Pi, but considering how fast it would uncompress (if the offset really were significantly smaller than the data it was compressing, which wouldn't be often), it would be worth it for some uses.
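
        A quick counting sketch of why it has to be this way (my numbers, not the parent's): there are 256^L distinct files of L bytes, but a k-byte "pointer" (offset plus length, however encoded) can only name 256^k of them, so covering every possible file forces k >= L. And in a random-looking digit stream, the expected position of the first occurrence of a given L-byte string is itself on the order of 256^L, so writing the offset down takes about as many bytes as the file did.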

    • You are forgetting that the same technology that allows you to break all conventional encryption also allows you to create unbreakable encryption. Not only that, but it will also be impossible to intercept a quantum-coded message without alerting the receiver.

      The only problem is that very few people have a quantum computer:)
    • "Massive optimization. Remember all those NP-complete problems you learned in comp. sci. ? No more simmulated annealing, genetic algorithm, guesstimation methods. Qubits can find the optimal solution instantly."

      This is actually not quite true. So far no one has found a quantum algorithm which solves an NP-complete problem in polynomial time. This is perhaps one of the things that tends to get people overly excited about quantum computers. They will most likely be built and become more or less practical, depending on the amount of technological progress, but they are not magic. So far, most problems for which there are fast quantum algorithms are problems which can already be solved in less than exponential time on an ordinary computer. There are a few exceptions, like simulating a quantum system, but these problems are not NP-complete.
      However, there is no mathematical proof that a polynomial-time quantum algorithm for an NP-complete problem could not be found - but the same is true for a classical algorithm for NP-complete problems.
    • by arasinen ( 22038 ) on Monday August 20, 2001 @05:02PM (#2199689)
      All three points are more or less incorrect. Quantum computing does not necessarily improve every aspect of classical computing. The main difference is that classical computers are dead-on deterministic, while quantum computers are more of a probabilistic sort.

      Claim #1: No encryption.

      There is a quantum algorithm for factoring large numbers (Shor's algorithm). It will break RSA... but so what? There are elliptic curves and countless other methods of cryptography still available. Quantum computing might or might not be able to break these.

      And then there is quantum encryption, ...

      Claim #2: Improved compression

      The concept of compression is a complex issue of information/communication theory. It applies to information, not to computing. Quantum computing is basically just computing; you might be able to compute the compressed file faster, but no computational method can squeeze information beyond the theoretical limit.

      The pi scheme mentioned here is totally unusable: it takes as many bits to represent the index as it takes to represent the data itself.

      Claim #3: Faster computing

      This too is slightly incorrect. There are some things that are faster to implement with quantum computing. Some things, like adding two and two, are better suited to classical computers. Even Shor's algorithm uses a classical FFT at one step. I don't think it's certain that a quantum computer can solve NP-complete problems faster than classical computers.

      Besides, for some things *the* optimal solution simply can't be expressed in any computer, quantum or classical. (Think, for example, of the equation x^2 = 2. The answer can't be represented exactly in a computer; however, it can be approximated to as many digits as necessary.)

      Quantum computing does not necessarily imply massively parallel computing. For some applications something like this happens, but it's not the same as having n different computations running simultaneously; it's more like the same computation running over and over with some variation.

      Last of all, we don't get full certainty. Shor's algorithm (IIRC) can give us an answer with very high probability, but that probability isn't 100%. (99.999%, perhaps.)
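
      To make the factoring claim a bit more concrete, here is a toy sketch of the skeleton of Shor's algorithm. The order-finding step is the only part a quantum computer actually speeds up; it is brute-forced below, so this illustrates the structure, not the speedup:

      # Toy sketch: the classical skeleton of Shor's algorithm. A quantum computer's
      # only job is the order-finding step; brute force stands in for it here.
      import math
      import random

      def find_order(a, N):
          """Smallest r > 0 with a^r = 1 (mod N) - the step Shor's algorithm
          does in polynomial time on a quantum computer."""
          r, x = 1, a % N
          while x != 1:
              x = (x * a) % N
              r += 1
          return r

      def shor_factor(N):
          while True:
              a = random.randrange(2, N)
              g = math.gcd(a, N)
              if g > 1:
                  return g               # lucky: a already shares a factor with N
              r = find_order(a, N)
              if r % 2 == 1:
                  continue               # need an even order; try another a
              y = pow(a, r // 2, N)
              if y == N - 1:
                  continue               # trivial square root of 1; try again
              return math.gcd(y - 1, N)  # nontrivial factor of N

      print(shor_factor(15))             # -> 3 or 5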
    • Wrong on all three counts, I'm afraid.

      1. There are quantum algorithms for efficient factoring and discrete log, so many public-key cryptosystems can indeed be broken by a quantum computer. However, there is no known quantum attack on lattice-based cryptosystems such as NTRU. Moreover, for private-key cryptosystems all we have is a generic search algorithm that finds the key in time SQRT(2^n), where n is the key length, instead of 2^n for a brute-force classical search. Thus, by doubling the key length of the cipher, you make the work as hard for a quantum computer as the original cipher was for a classical computer (a quick numeric sketch follows at the end of this comment).

      2. As pointed out in previous replies, the offset will usually be longer than the data. This follows from a simple counting argument.

      3. There is no evidence that a quantum computer can efficiently solve NP-complete problems. In fact, many quantum computing researchers believe no such algorithm exists.

      The current quantum algorithms are very tantalizing, but restricted in applicability. We really don't know yet what types of problems will benefit from quantum computing beyond the few known classes.
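
      A quick numeric sketch of point 1, using only the square-root speedup described above (my numbers, nothing more exotic assumed):

      # Rough work factors for exhaustive key search, ignoring constants and memory:
      # classical brute force ~2^n trials, quantum (Grover-style) search ~2^(n/2).
      for n in (128, 256):
          print(f"{n}-bit key: classical ~2^{n}, quantum search ~2^{n // 2}")

      A 128-bit key drops to roughly 2^64 quantum work, which is uncomfortably close to feasible; a 256-bit key still leaves about 2^128 quantum work, which is why doubling the key length restores the original margin.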
  • by mcc ( 14761 ) <amcclure@purdue.edu> on Monday August 20, 2001 @04:09PM (#2199462) Homepage
    I am really curious as to what they mean by a "computer" in this specific case. I mean, i have heard that they've done quantum computers capable of picking phone numbers out of a list of four, and such. Which while a HUGE accomplishment is still rather primitive.. Is this just going to be another one of those? A simplistic test machine?

    Or is this going to be, like, you know, a real *computer*? Something that can be given general calculations and work through them? By using the word "computer", are they thinking that what they make is actually going to be something Turing-complete, or at least comparable to ENIAC, or maybe even one of the behemoths that used to sit in the basement of a college (where the computer science students would sign up for a block of time, then come by, drop off large stacks of punchcards, and then wander by the next day to collect the results of the program)?

    More importantly, though-- and this is what i'm really wondering about-- if they actually are building a quantum computer that is capable of going into the realm of *running actual programs of some sort*... what programming language will be used? How will these programs be written? What will the "machine code" look like, and how possible will it be to write software for this in high-level languages? (I.e. will it be possible to do HLL abstractions as we do with current computers, at least at first, or will hand-optimisation be too necessary to allow things like "compiling"? I am not 100% sure what a "von Neumann" architecture is, but as far as i understand things there are some implicit assumptions in the way that things like C work that kind of only make sense if computers are designed at least generally the way they are now. How different would the architecture of a quantum computer be in a general sense, and how much would current programming languages have to change to make sense in that architecture? Which language is in its current state closest to something that would make sense for the creation of programs on a quantum computer architecture-- C, Python/Java, LISP/Scheme, Haskell/ML, or APL/Mercury? Or something i've never heard of?)

    Or is it that special boards or setups or whatever will have to be hardwired and specially set up for each specific task (although it will do those tasks really quickly), and this will not be a general-purpose computer capable of doing things like loading and executing an arbitrary written-as-software program?

    And to get into the complete castles-in-the-air speculation realm... if it is a true general-purpose computer, are they going to try to give it, like, you know, an operating system with things like a kernel and process manager and networking capabilities? Are they going to just stick with letting programs be fed in manually, or is the thing that they say will take ten years something where it's at least realistic to think that you could build one, set it in the basement of a college, and let all the students telnet to it and build and run programs, while using some equivalent of unix talk/write to message each other, and tyrannical sysadmins constantly watch to see if anyone is playing quantum games so they can kill those processes? (I don't care if they actually *do* that. Just if that's realistic, my mind is totally blown. I doubt it's realistic.)

    Or is anything that may be completed so far in the future they can't really say what it will look like at this point?

    I am deathly curious. I desire explanations, or at least links to academic webpages explaining, what sorts of things this computer would do and in what way we would go about giving it its programmatical instructions. Pleasepleaseplease i thank you in advance?

    -mcc
    If It Can't Process Church Numerals Then What Good Is It
    • Quantum computers are strange, in that they are good for mainly one kind of computation: Combinatorial optimization / state space search.

      Normal computers would still be needed to work with quantum computers, and in fact, any kind of quantum computer would likely be a regular digital computer with a 'Quantum CoProcessor' to crunch on the difficult combinatorial optimization / state space search part of the problem.

      Everything else would be done by the digital computer.

      So, as a short answer to your question, think of Quantum Computers as a math coprocessor that you use in tandem with a digital computer to solve very complex problems that cannot be solved using conventional digital computers (or take prohibitively long to do so).
    • The link posted above contained a really good general QC background, written up as part of this fellow's Master's thesis, intended for CS people, starting with basic quantum physics and going on to qubits and basic quantum computing. I think it is worthwhile pointing out the link :

      http://tph.tuwien.ac.at/~oemer/doc/quprog/index.html [tuwien.ac.at]

      Bob

    • by Insount ( 11174 ) <slashdot2eran&tromer,org> on Monday August 20, 2001 @06:59PM (#2200206) Homepage
      These are excellent questions. I'll try to answer them as far as my knowledge of the field allows, but please bear in mind that a good answer to some of the issues you raised would significantly enhance our state of knowledge.

      First, two key facts.
      • A quantum computer can run any classical algorithm. There is some overhead due to the reversibility constraint and due to the practical need for error correction, but it's probably a low-order polynomial overhead at worst. In particular, a quantum computer can run a Universal Turing Machine. So it's fully programmable with respect to classical programs.
      • There exists a Universal Quantum Turing Machine that acts analogously to a classical Universal Turing Machine, i.e., it lets a fixed quantum computer run any quantum algorithm given a description of that algorithm as additional input. (It does so to finite accuracy, but the error can be made arbitrarily small.)


      So yes, we're talking about a fully programmable computer. However, this universality means an increase in the computer's size, so for practical reasons you'll see two shortcuts that undermine it.

      First, everything that can be done classically is done outside the quantum computer -- a quantum-classical dualism, if you wish. Suppose you have an algorithm that performs operation Q on some register, and applies it 200 times in a loop. You could put the loop counter as part of the quantum computer and program it to do an increment+test+Q operation. However, currently it's much cheaper to put the loop counter outside the quantum computer and simply tell it 200 times to apply a Q operation.

      Second, the hardware is made as simple and specialized as possible, and optimized for a specific algorithm.

      Now, these shortcuts are clearly present in the tiny quantum computers built so far (latest is IBM's, featuring 8 qubits), and will probably be used for quite some time. But that's mere practice, not theory.

      OK, so how do you program these things? All quantum algorithms to date have been expressed using one of two formalisms: either using algebra in the mathematical "Hilbert space" that represents all the states the computer can be in, or using a "quantum circuit", which is just like a normal logic circuit (with gates like AND and NOT), except the gates are different -- in fact the gates are expressed as operations in the Hilbert space, as in the first approach, but it's often easier to see the overall picture if you combine simple gates using the circuit formalism. To be truthful, there's also the "and then you do that N more times, and then you measure the first register" type of formalism. :-)

      So you see, currently researchers are working at a level way below assembly language, and they're pretty happy about it because the algorithms are very small too. But what about higher-level representations? All efforts I've seen so far use quantum-classical dualism: you write a classical program that manipulates a quantum register (data), but the logic is purely classical. Some do it with a dedicated programming language, some with a class library, but the idea is the same (see a comprehensive list [ex.ac.uk]; all of these are mere simulators for now, of course).

      Now, this is a very reasonable approach, but it seems rather halfway there -- wouldn't it be beneficial to allow quantum operations in flow control, not only in data? Well, we don't know yet. Currently there are very few "patterns" used in quantum computing, and none of them seems easier to represent that way. It's hard to design a paradigm, let alone a language, for solving problems that don't yet exist. On the other hand, if we invent such a paradigm it might help us find new quantum algorithms. This is a vast open research area.

      As for your speculations on how quantum computers will be used, the answer is yet again that we don't know. Here are the two extreme cases, both easily imagined and consistent with current knowledge. First, it could be that quantum computers will be found to be good for nothing except a few very specialized tasks, and that only a few RSA-cracking devices will be built by intelligence agencies at a prohibitive cost. On the other hand, it could be that a new class of quantum algorithms will be discovered that addresses more common needs, leading first to something in the college basement and later to a chip in everybody's computer. No one currently knows what such chips would be good for, if anything, though there's some intuition about what's more likely. I do venture to say that which of these possibilities becomes reality depends primarily on usefulness, since long-term prospects for mass production seem quite real, given sufficient demand.

      I hope this clears things up a bit. I wish I could be more concrete, but it really takes a few hours to get a rough grasp of how these things really work, and a full-semester course to understand the interesting algorithms. Please don't hesitate to e-mail me [mailto] if you want to discuss this further.
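
      If it helps to make the "operations in Hilbert space" picture concrete, here is a toy classical simulation of a two-qubit circuit (a sketch in the spirit of the simulators linked above, not one of them): a Hadamard gate on one qubit followed by a controlled-NOT, leaving the register in an entangled superposition of 00 and 11.

      # Toy state-vector simulation of a 2-qubit circuit, using numpy.
      # The register is a vector of 2^n complex amplitudes; gates are unitary matrices.
      import numpy as np

      def apply_single(state, gate, target, n):
          """Apply a 2x2 gate to qubit `target` (qubit 0 = most significant bit)."""
          op = np.array([[1.0]])
          for q in range(n):
              op = np.kron(op, gate if q == target else np.eye(2))
          return op @ state

      n = 2
      state = np.zeros(2 ** n, dtype=complex)
      state[0] = 1.0                          # start in |00>

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
      state = apply_single(state, H, 0, n)    # qubit 0 -> (|0> + |1>) / sqrt(2)

      CNOT = np.array([[1, 0, 0, 0],          # control = qubit 0, target = qubit 1
                       [0, 1, 0, 0],
                       [0, 0, 0, 1],
                       [0, 0, 1, 0]], dtype=complex)
      state = CNOT @ state                    # now (|00> + |11>) / sqrt(2)

      print(np.abs(state) ** 2)               # ~[0.5, 0, 0, 0.5]: measure 00 or 11

      Everything here runs on a classical machine in exponential space, which is exactly why such simulators stop being useful beyond a few dozen qubits.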
  • by isomeme ( 177414 ) <cdberry@gmail.com> on Monday August 20, 2001 @04:16PM (#2199513) Journal

    The project, announced last week, is part of a $25 million, five-year alliance launched in June 2000.

    Of course, we won't know if the project worked or not until someone looks inside the lab in 2005.
  • ...that the world is beginning to look like a long game of Alpha Centauri. When do we see the /. article on the Threshold of Transcendence?

    -Approaching the singularity
  • God! TEN WHOLE YEARS for something that will revolutionise computing, and perhaps even make us reassess every way in which we view the world! So long! What are these scientists wasting our time for? Surely they should have produced this yesterday - it sounds easy enough.

    • What are scientists wasting our time for?? What are you talking about? The reason the advances are so slow is that the college graduation rate of physics majors is pitiful. Our flagship public university, with 50k students, graduates 25 physics majors per year. You wanna know why? Math and the sciences, but math in particular, are being pushed to the side by MTV and Art Class. No one seems to want to do the DiffEq or LinAlg needed to get any work done.

      Engineering schools are snapping up anyone who's lucky enough to make it through the HS math sump alive. Those a little braver are still discouraged by parents and media arguing for higher salaries, more partying, and of course more corporate-sponsored and prescribed merchandise.

      I'm disgusted.
      • I was joking! The article seemed to suggest that ten years was ages to wait, but this is an ABSURDLY revolutionary step in terms of computing and how we view the world, and to be told that it'll happen in ten years is almost frightening for it being so soon...
  • by JPMH ( 100614 ) on Monday August 20, 2001 @04:53PM (#2199649)
    The Economist had a good article back in May about the state of the art (7 qubits), and some of the practical difficulties [economist.com] facing quantum computing.

    "Dr Cory says that the program for factorising large numbers [400 digits] will require about 1,000 qubits simply to store the problem. But each of these qubits will require dozens of extra qubits for error-checking. That means that a useful computer will need tens of thousands of qubits to come up with a reliable answer. At seven and counting, that goal is a long way off. "

  • by TFloore ( 27278 )
    Someone joked about going back in time and patenting the AND gate.

    Why joke... Just do it with the Next Big Thing...
  • "While the classical bit can store any number between 0 and 255 on each of its eight bytes, the qubit can store all the numbers between 0 and 255 on a byte of eight qubits."
    So a classical bit has 8 bytes and ranges 0-255? But a quantum byte has 8 qubits and ranges 0-255. Ummmm..... no.
  • I have heard a lot about quantum computers; I bet DT [designtechnica.com] does a feature on what they are, etc., very soon. This is the technology that will change the world.
  • I'm certain that Dr. Feynman would be very proud that we're actually close to building something he had envisioned. It's also something that I think all of us have been patiently waiting for, but often thought of as being a 50-years-off kind of thing.
  • ...it won't be a "Pavilion"
  • what will they call this product? Maybe:



    The HP [0-9][0-9][0-9][A-D|G|N|L|R|P|Q|X][T|S|X|Z]


  • Uhhh... yeah. So I submitted this exact same story a few days ago when the original article appeared on IDG.net [idg.net]. It was rejected.

    Guess I should go for CNN as the authoritative recycler of science content next time instead of getting it from the source. After all, we all know how CNN is always right. :/

    Arbitrary decisions aside, at least this is some encouragement that the irresistible force of Moore's Law [intel.com] won't meet the immovable object of the physical limits of silicon, etc., and our universe will continue to exist!

  • Today's NY Times has an article [nytimes.com] on quantum memory. This is not the same as quantum computing, but it does use a quantum state of atoms to propose extremely dense memory.
  • The New York Times [nyt.com] has an interesting, not-too-technical article [nytimes.com] with information on Spintronics. Spintronics is the art and science of developing practical applications which take advantage of an electron's inherent property of spin. Some discussion of M-RAM is presented which would be a very important first step in the actual deployment of a quantum computer. As a side note, my crypto professor at the U always said that if quantum computers ever became a reality, all current encryption methods would quickly become quite useless. I was wondering if anyone has looked at developing encryption algorithms which could specifically take advantage of the possibilities of quantum computing and remain secure.

    I am the Yeti!!!
  • This article [e-open.ws] at Open magazine [openmagazine.net] (loosely related to /.) also talks about Quantum Computing.
