Technology

Quantum Computers 61

joecool12321 writes: "Although Richard Feynman spoke about quantum computers in 1981, technology is only now starting to catch up. This article at Scientific American discusses recent developments towards the goal of 'infinite computing,' and research suggests that scalability, and thus scalable qubits, may not be far away."
  • Um, could somebody please mod the parent up? Or am I missing something?
  • Penrose's argument (as elucidated in "The Emperor's New Mind" anyway) starts by saying that brains aren't Turing Machines (by appealing to Goedel). He then goes on (at the end of Emperor and the whole of the next book, which I haven't read fully) to say that the reason brains aren't Turing Machines could be quantum effects at the neuron level.

    But I, like you and the previous poster, think he's out on a limb (if not out of his tree completely).
    --
    Non-meta-modded "Overrated" mods are killing Slashdot
  • You are right, quantum computing isn't "infinite". However, it IS "arbitrarily large" which is good enough for a layperson.

    You know the difference between DFAs and NFAs, right? An NFA is like a "parallel processing DFA" (speaking loosely). So here's the analogy: NFA:DFA::QuantumComputer:TuringMachine.
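
    A loose illustration of that analogy (my own sketch, not from the article): simulating an NFA on a classical machine means tracking the whole set of states it could be in at once, much as classically simulating a quantum register means tracking every branch of the superposition.

        # Toy NFA simulator; the transition table and state names are made up
        # purely for illustration.
        def nfa_accepts(transitions, start, accepting, word):
            """transitions maps (state, symbol) -> set of possible next states."""
            current = {start}
            for symbol in word:
                current = set().union(*(transitions.get((s, symbol), set()) for s in current))
            return bool(current & accepting)

        # NFA accepting binary strings whose second-to-last symbol is 1.
        t = {("q0", "0"): {"q0"}, ("q0", "1"): {"q0", "q1"},
             ("q1", "0"): {"q2"}, ("q1", "1"): {"q2"}}
        print(nfa_accepts(t, "q0", {"q2"}, "0110"))  # True
        print(nfa_accepts(t, "q0", {"q2"}, "0101"))  # False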
    --
    Non-meta-modded "Overrated" mods are killing Slashdot
  • A reliable /dev/random!
    Now if only I could get a /dev/null that can hook up to a USB port, everything would be fine.

  • "The Colorado group and Reichel's group are working on running Bose-Einstein condensates through their microchip devices, a development that would allow true quantum studies to begin."

    You know things are getting cool when components of a computer start getting named after scientists (remember the Heisenberg Compensator [startrek.com]?)

  • by zCyl ( 14362 ) on Friday March 09, 2001 @05:15AM (#374942)
    You can construct a rudimentary USB-compatible /dev/null using a Dixie cup and some salt water.
  • by Merk ( 25521 ) on Friday March 09, 2001 @07:56AM (#374943) Homepage
    First, quantum computers have to be perfectly reversible. That means for every output there's an input and vice versa. And there has to be no way of knowing the initial states of the data. You don't process data, you process probabilities in a quantum computer; if you know exactly what any one value is throughout the computation, you can find out all of the values: the superposition ends and you're stuck with a useless chunk of machinery.

    Actually let me clarify a bit of this. First of all, an example of what it means to be reversible. The best example is that you can't clear memory / registers. Setting something to zero is a destructive, non-reversible process. Basically any "program" run on a quantum computer would be runnable backwards, and using all the outputs you could find all the inputs. Even a simple program like C = A | B would have to keep another bit of data, a "D", that would enable you to reconstruct A and B using C.
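
    A minimal classical sketch of a reversible OR (my own illustration, assuming the standard trick of carrying the inputs along and XORing the result into a scratch bit, rather than the single extra bit "D" mentioned above):

        # Map (a, b, t) -> (a, b, t ^ (a | b)). Because the inputs are carried
        # along, the map is a bijection on 3-bit states, and applying the same
        # gate a second time undoes it.
        def reversible_or(a, b, t):
            return a, b, t ^ (a | b)

        a, b, c = reversible_or(1, 0, 0)            # forward: c = 1 | 0 = 1
        assert reversible_or(a, b, c) == (1, 0, 0)  # backward: inputs recovered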

    Now the types of things that quantum computers would in theory do really well take advantage of being able to use an input state that is a superposition of all possible inputs. The prime example is factoring huge numbers. The number to be factored is entered as one input to the process, and the second input to the process is a superposition of every number from zero to the number to be factored.

    The quantum computer then divides the number to be factored by this input vector, and retains the remainder, which is a superposition of all possible remainders from that division.

    Now there will be patterns in the remainders from the division, and if you take a Fourier transform of those remainders you will get big peaks that correspond to the factors.

    At this point your calculation is done so you measure the output. Remember that everything that has happened so far has been happening internally to the "quantum computer" and has not been observed. Your observation of the output collapses the probability and you get one output point, but if you repeat this operation a hundred times or so, most of your output points will be somewhere in this peak.

    The cool thing about this process is that it takes advantage of the fact that you can do a Fourier transform in the intermediate step before you collapse the probability. To get enough points to do a Fourier transform in the intermediate state on a traditional computer you'd need to run thousands of input vectors, but the quantum computer only needs one.
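
    For the curious, here is a rough classical stand-in for the period-finding idea (my own toy sketch; the real quantum algorithm extracts the period with a Fourier transform over a superposition rather than this brute-force loop):

        from math import gcd

        def find_period(a, N):
            # Smallest r > 0 with a**r mod N == 1; a classical stand-in for
            # the step the quantum Fourier transform performs.
            r, v = 1, a % N
            while v != 1:
                v = (v * a) % N
                r += 1
            return r

        N, a = 15, 7              # toy numbers; real keys are hundreds of digits
        r = find_period(a, N)     # -> 4
        # For even r with a**(r//2) != -1 (mod N), these gcds are nontrivial factors.
        print(gcd(a ** (r // 2) - 1, N), gcd(a ** (r // 2) + 1, N))  # -> 3 5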

    (btw, IANLAPBITEPIS (I am no longer a physicist but I took engineering physics in school), so if I messed up somewhere here and someone can correct me please do)

  • Quantum computers will likely mark the end of Moore's law. They don't scale the same way as conventional computers. Think of a quantum computer as being able to process 2^n values (where n is the number of bits per item being processed) for every single value processed in a conventional computer. If we can keep the same kind of mojo going for quantum computers that we've kept going for classical computers, then we might roughly experience computing power squaring every 18 months or so. (Of course, there is a whole new set of challenges facing us in the field of quantum computing that we've never seen in classical computers.)
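
    A back-of-envelope way to see the "squaring" claim (my own numbers, purely illustrative): if a classical register doubles from n to 2n bits, a quantum register going from n to 2n qubits goes from 2**n to 2**(2n) = (2**n)**2 states.

        # Doubling the qubit count squares the size of the state space.
        for n in (8, 16, 32):
            print(f"{n} qubits: {2 ** n} states -> {2 * n} qubits: {2 ** (2 * n)} states")
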
  • The Bose-Einstein condensate has nothing to do with computers as a component, at least not yet. A Bose-Einstein condensate is what happens when a collection of molecules gets chilled to near 0 K. Since there is almost no motion of molecules in the condensate, the group is hoping to use it to perform tests on the molecules, atoms, or anything else they can hit.
  • Yes, quantum computing could smash traditional encryption while offering new and improved methods. But the new, expensive technology will initially be in the hands of the wealthy and powerful, leading to a new wave of accumulation.
  • What got me about Penrose is that he goes through the gyrations of showing the limitations of computers. GEB did a better job of that IMHO. But then he fails to prove to my satisfaction that humans have a special insight into math that computers cannot have.
  • by Anonymous Coward
    Read up on quantum encryption -- quantum computing is a two-edged sword.

    Actually, even a quantum computer cannot break (within reasonable time) the newest block ciphers. AES is specified with 128, 192, or 256 bit keys. Why go up to 256 bits, when 128 will withstand any conceivable brute force attempt for at least the next thousand years or so?

    Because a QC can only do so much better than a regular computer. In this case, the square root of the effort by a normal computer. That's a 2**64, 2**96, or 2**128 effort, even with a full QC. So even a quantum computer (or a million million million of them) cannot brute force a 256 bit key cipher.
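
    The arithmetic behind that "square root of the effort" claim (my own illustration, assuming a Grover-style search over the keyspace):

        # A quantum search of a k-bit keyspace costs on the order of 2**(k/2)
        # operations, so each key length keeps roughly half its bits of strength.
        for k in (128, 192, 256):
            print(f"{k}-bit key -> about 2**{k // 2} quantum operations")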

    Similarly for RSA. It can factor numbers a lot faster than we can now, but if you pick your primes big enough you'll be safe.

    Sadly, quantum encryption isn't much of a defence against QCs. It requires dedicated optical channels, which realistically are pretty rare (and you certainly can't use it over email).

  • I can just see the headlines...

    In the news today, a local overclocker and avid QuakeIII player died in an accident when his AMD Megatron 100,000,000 GHz suddenly overheated and self destructed.

    A fellow gamer, who goes by the name "1337", was in a game at the time the accident occurred. "The last thing I remember he was up about 15,000 frags, and boasting about how he was getting 2,000,000 FPS on his machine," 1337 said. "Then suddenly he disappeared. Served him right, he had an unfair advantage."

    Experts who have been investigating the accident have managed to piece together a probable chain of events. "From what we can figure, a pump failed in the cooling tower that he had in the back yard, and caused the chip to overheat," claims Cho Man Foo, a scientist at Los Alamos National Laboratory, one of the many quantum physics experts called in to reconstruct the chain of events. "He may have survived the incident," states Mr. Foo, "except when the chip detonated, it started a chain reaction that caused the magnetic shields around his Seagate black hole quantum hard drive to collapse, immediately consuming everything within 500 feet."

    Pictures of the aftermath can be found here [primex.co.uk].

    In related events, LAN party co-ordinators have been advised to postpone all further gatherings until a method of confirming that no overclocked machines will be used in game meetings can be established. AMD and motherboard maker ASUS will be taking steps to ensure that further incidents will not happen. In the next version of their products, the clock multipliers will be locked at the factory, hopefully preventing further injuries from their products. Also, tobacco giant Phillip Morris is suing AMD, claiming that the chips pose a danger to their large customer base, and they could potentially lose a large number of customers if there were to be a major accident.

    Related articles:

    Overclocker Creates Rift in Space-Time Continuum [bbspot.com]


    "Everything that can be invented has been invented."

  • Quantum computers are not non-deterministic computers regardless of how much we want them to be. IIRC, to get exponential speedup you need to use a Fourier transform. This has been applied to things like factorization and search, but not to NP-complete problems. Last I heard, it was still an open question as to whether or not P=NP on quantum computers. Much of the confusion comes from people assuming that factorization is an NP-complete problem. It is really hard and time consuming, but it is not NP-complete. It just happens to be hard enough and time consuming enough to be practical for encryption, assuming you have a big enough number.

    Until the theoreticians tell us whether or not P=NP on quantum computers, a good NP-complete encryption algorithm will work just fine.

  • How close are we, with optical switching, to being able to build an all-optical computer?
    Instead of just on or off, each 'bit' could have different states depending on the shade of light.
    Imagine a system where a few thousand different shades of light are recognized.


  • I agree. Human brains take Years to develop intelligence (such as learning a language, spatial recognition, and common sense). There is nothing stopping someone from writing a very long computer program to emulate these processes; it's been done before. But we can't expect the program to be written in just 5 years.

    In other words, using this analogy, we (humans) are a program with some random input (e.g. body size, gender, hell... anything), and a human life is simply that program executing, interacting with other 'programs' along the way. The program is already written, and it still takes Years to fully develop a 'person'. In addition, some people theorize that a person is Never fully developed, but I digress.

    Therefore, maybe in 15-20 years, someone will have developed a complete learning 'program' that can be set free, and we'll see what happens. Plenty of people have already begun.

    Of course, this is all a 'theory', and I'm certainly not an expert... but it's interesting nonetheless.

    -Egon
  • To whom it may concern: I'm currently filing a patent for a Quantum computer case with "Don't Panic" stamped clearly on the front.
  • by billstewart ( 78916 ) on Friday March 09, 2001 @10:32AM (#374954) Journal
    One thing I've never seen explained is whether there's a way to link together multiple QC widgets (No, I didn't say a Beowulf Cluster of them! :-) or whether you're limited to the resolution of one widget, which Heisenberg limits to a value around Planck's constant (~10**-46 = ~140 bits.) If you could do the physics and precision construction to get this resolution, it would be lots of fun, but it doesn't fundamentally change cryptography, because it doesn't get you unlimited exponential growth - you can always add another 140 bits to your key length.
  • One flaw in the idea that the mind uses quantum effects as a quantum computer would is that we don't have any special skill at solving the sorts of problems that quantum computers could.

    Agreed. People like to come up with lots of examples to show how humans can use our minds to do things that a computer could never do, like look at code and find the infinite loop, etc. Unfortunately, when humans do these tricks, we don't use formal logic or exhaustive search to solve them (like most computers do); we use heuristics to guess at what a good answer will look like and proceed from there. We form these heuristics from years or even decades of experience. To put it in layman's terms, we just make educated guesses. When we build a computer that can form its own heuristics, then give it decades of experience, I think we will see some amazing results.

  • So how long does this "infinite computing" machine take to complete an infinite loop?
  • We're not going to compile QMP (Quantum Multi Processing) enabled kernels tomorrow at 5PM either...

    But when it comes, we will have to QUICKLY design a better encryption system than our current public-key based one !

    What is the point of encryption? Someone out there is always going to be able to break it, and the net, because of its design and the weaknesses of the computers attached to it, will always be an insecure channel of communication.

    Quantum computers may be the fastest brains on earth, but how is that beneficial to the majority of people? Science? Maybe... running sims will certainly be faster... but where is the advantage?

    Smaller and faster seems to be the trend these days, but no quantum computer or 'next big thing' is going to change the way we do things. Computer technology isn't that different from what existed years ago... I would say we've maxed out the real potential of digital systems... all they're good for is making our lives more complex, rushed, and information-swamped.

    So why are people looking to the next wave of technology as some great savior? Wouldn't the greater advancement of the human species be when we've supplanted capitalism, advertising, and the supposed 'market-driven economy' with something a little less sensational for the media to turn around and cram down our throats?

    I mean, really... quantum computers might be nice to have, but why not fix what's wrong with society first so we'll still be around to enjoy the toys?

    I'm not ranting..just raving mad, hey... woo!

  • so it's not exactly the sort of room temperature you'd want to have in your cube - unless you happen to be Mr. Freeze.
  • Your suspicion is based on disbelief in spookiness, no matter how well accepted and reproducible it is in the laboratory. Quantum "spookiness" is real. Quantum algorithms that solve NP-complete problems in polynomial time are real. They do this because the spookiness of superposition allows exponential parallelization.

    If there is a stumbling block for quantum computers, it is in engineering them. The math and physics behind them are sound. Engineering a qubit that maintains phase relationships with other qubits for a long enough period of time to carry out a calculation is the issue, not your superstition that spooky-fast solvers are impossible in principle.

    Bingo Foo

    ---

    Don't working quantum computers (in addition to easy factoring of large composites) imply easy solution of all NP Complete problems?

    No, IIRC, you only get exponential speedup if you can use a Fourier transform to solve the problem. Search and factorization have been solved in this manner, but neither is NP-complete.

  • You can draw up the plans for a pair of shoes that when equipped correctly will let the wearer fly, but I don't see that happening any time soon.

    Will we see the dawn of Quantum Computing? I don't see why not... but the sheer power, the programming that would go into it, and the understanding (hell, it's still just theory) will make this a reality further in the future...

    still is fun to dream ...

  • There's an earlier thread here [slashdot.org] that's very stimulating, about advances in nanoscale devices at the molecular level and how they could be used in the implementation and realization of an actual working quantum computer.

    Highly recommended reading.

  • by Jace of Fuse! ( 72042 ) on Friday March 09, 2001 @12:04AM (#374963) Homepage
    Sometimes I wonder if AI and Quantum Computers wouldn't complement each other nicely.

    I mean, I've always suspected that true, self aware computers might only be possible in a Quantum form.

    Sure, we can do some excellent AI with faster Digital Computers, but for a system to be both intelligent and diverse it needs to be able to store a lot of data and process all of it quickly.

    With today's computers, one can assume that the more complex the information an AI is dealing with, and the more it "Learns", the more it has to process. Theoretically, this wouldn't be a problem for a Quantum Computer.

    "Good morning computer."
    "What's so good about it? You're just going to ask me to check your e-mail, read you the news at Slashdot, and give you the stock report. Then you're going to drink your coffee and head off to work, leaving me here alone as ussual. Good morning indeed."


    "Everything you know is wrong. (And stupid.)"
  • So how fast could we do a full Linux kernel compile with one of these babies? Will quantum computing even be useful for a (mostly) sequential operation such as this, or will it be mostly used for massively "parallelizable" functions such as decryption?

    And the real burning question, how long has the NSA had one of these???

    Thanks!

  • by nachoworld ( 232276 ) on Friday March 09, 2001 @12:07AM (#374965) Homepage
    I remember seeing something about atom trapping. I was able to find a toned-down version of the Science magazine article here: www.academicpress.com/inscight/06022000/graphb.htm [academicpress.com]

    Schmiedmayer, who's mentioned in the parent story, is also in this story from mid-last year.

    A recent slashdot article [slashdot.org] that I submitted also concerns the aspect of using silicon buckyballs as cages for qubits.

    The crux of the matter still remains unsolved in this SciAm article, and I have yet to see any explanation of how to solve it in any of the scientific journals that I read: that is, we can't yet maintain pure quantum states to preserve the very fickle quantum condition. When we can do that - there has already been enough postulation on what a qubit can consist of - then we can seriously consider quantum computing in the future.

    ---
  • If we get QMP, then we will probably have quantum encryption up and running. Since this means that any recipient can tell if their message has been intercepted, you can be sure your one-time pad has been safely and securely transmitted. From that point, any interceptor doesn't stand a chance; every message looks like every other message of the same length.
  • Read up on quantum encryption -- quantum computing is a two-edged sword.
  • by Nick ( 109 )
    Michael Crichton's latest book, Timeline, has a pretty interesting spin on Feynman's work and references it a bit, if anyone is interested (it also happens to be a damned good book; I might have to say it's his best work).
  • If you're interested in quantum computing, try to check out Damian Conway's talk on the Perl module Quantum::Superpositions [cpan.org]. It's very funny and actually quite useful!

    Damian is travelling around the world talking to perl user groups. Check out his schedule [yetanother.org] to see if he's due to talk near you.

  • Wow, many interesting and informative science articles on Slashdot this morning. Nice one. Keep it up.

    Claric
    --

  • What exactly is infinite computing?
    As far as I'm aware, quantum computing just claims to work on a finite number of inputs simultaneously... I'm not too sure what it can do that is infinite.
  • by peccary ( 161168 ) on Friday March 09, 2001 @02:47AM (#374972)
    This is (was?) Roger Penrose's postulate as to why true AI was not possible -- because the human brain has truly random processes, and computers do not. His theory, not mine. I think it's hogwash.

  • You might need a quantum computer to find the information necessary to build one. No wonder it's so difficult.

  • In what sense is quantum computing, "infinite computing"? Computing is inherently finite. See typed lambda-calculus, intuitionistic logic, and Curry-Howard isomorphism, if you don't understand why computing means finite.

    Now, I anticipate someone bringing up non-terminating Turing machines and untyped recursive lambda-functions as counterexamples to my claim... Those don't represent computing! They are nothing more than shortcomings of our formal systems. They have no logical meaning. Mathematicians, logicians, and computer scientists have put great effort towards gaining an understanding of computing that weeds out such nonsensical non-constructs.
    An effective procedure that doesn't terminate with a result is not an effective procedure. Now, if by "infinite", you mean unbounded, then you have done nothing more than abuse terminology. "Infinity" does not mean the same thing as "unbounded", in a computational sense of the words.
  • A good place to start would be this article [aip.org]. With quantum encryption you can determine whether there's an eavesdropper, thanks to quantum entangled particles. Any eavesdropping will create a detectable effect on the message. There are relevant slashdot articles here [slashdot.org] and here [slashdot.org].
  • For those who don't already know, an editor put together Feynman's lecture notes into a great book [barnesandnoble.com], which he used during his famous 1980s lectures. It's worth the $30, if you are interested in computers, computation theory, physics, or you just really like Feynman's stuff. This book is one of those classics that everyone has on their shelf.
  • Sorry, bad grammar. I didn't mean to say that the book was used by Feynman in his lectures. I meant to say that the lecture notes used to write the book are the same notes that Feynman used during his famous 1980s lectures. Dangling modifiers always get me.
  • I have a question. (And no, I'm not trying to be a wise guy.)

    Lots of articles about quantum computers talk about how processing is done and how fast it will be.

    How does quantum memory work?

    Are mass/heat constraints a problem?

    The reason I ask is that a while back there was some interest in using the properties of chemical reactions, specifically the polymerization of DNA, to solve NP-complete problems fast. This strategy was used to solve some easy travelling salesman problems, for example. It was quickly pointed out, however, that to solve hard problems was going to take too much DNA. That is, the DNA system was fine for solving travelling salesman with 5 nodes, but 100 nodes (or something) would take converting the entire mass of the universe into DNA, so DNA computers are a dead end.
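
    Rough numbers behind why the DNA approach blows up (my own back-of-envelope figures, not from the parent): encoding every candidate tour of an n-city travelling salesman instance takes on the order of n! strands, which blows past the roughly 10**80 atoms in the observable universe long before n reaches 100.

        from math import factorial

        # The number of candidate tours grows factorially with the city count.
        for n in (5, 20, 100):
            print(f"{n} cities -> ~{factorial(n):.2e} candidate tours")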

    Are there similar constraints with quantum computers? Why not?
  • Actually this hasn't much to do with Penrose's theory.

    His theory is based on Goedel's theorem - basically, a Turing Machine is weaker than a human mind because we can build a formal mathematical proposition, based on the Turing Machine's code, that that Turing Machine won't be able to prove true or false (it's undecidable in the Turing Machine's formal system) but that we humans can prove true or false thanks to Goedel's theorem.

    I personally don't subscribe to this theory because I feel his proof is weak around the edges - most importantly, a human that proves a theorem doesn't produce the proof in the same sense that a formal system produces a proof. There's always room for a slight error in the human thought process.

    On a sidenote, he doesn't give much insight on what he thinks quantum computers might be able to do for AI.

  • We'll see a quantum computer the day after there is a fusion reactor (magnetic jar vessel) to power it... :) Sadly, this technology appears to be in the same vein as fusion: always 5 years away. Shit, I didn't even bother to read the article. SA ran one about 4 or 5 months ago that I did bother to read, and while the technology seems nifty, one has to question when it will actually show. I think SA was having what is the magazine equivalent of a "slow news day".
  • well isn't that the pot calling the kettle faggort.

    ***

  • One flaw in the idea that the mind uses quantum effects as a quantum computer would is that we don't have any special skill at solving the sorts of problems that quantum computers could. Personally, I can't factor large numbers quicker than a slow 8 bit micro could.

  • My suspicion is that quantum computers will turn out to be impossible in principle, because some of the mathematical implications are as spooky as action at a distance.

    Don't working quantum computers (in addition to easy factoring of large composites) imply easy solution of all NP Complete problems? Some of the other mathematical implications are even stranger. Whole classes of problems would go from insoluble over the age of the universe to trivial.
  • "Overheating" for a quantum computer would mean reaching a degree or two above absolute zero. Anything higher than that and the qubits decohere.
  • Undetermined. There has been research for years into the automatic parallelization of algorithms written to the usual serial assumptions of computer languages, but so far as I know it has not looked specifically at converting them to the needs of a hypothetical quantum computer.

    It's likely any real-world quantum computer would employ a hybrid architecture. Considering we are unlikely to see commercial quantum computers outside the lab for at least a decade, conventional processors at that time should be somewhere in the 50-100GHz range for a desktop system. Even if some application has been discovered that needs the extra quantum boost, most things we do now will run more than fast enough on a conventional processor, and given the availability of conventional software it's likely most of the system will still use the conventional processor, calling out to the quantum system only for specialized tasks.

    Tim

  • We're not going to compile QMP (Quantum Multi Processing) enabled kernels tomorrow at 5PM either...

    But when it comes, we will have to QUICKLY design a better encryption system than our current public-key based one !
  • by TheOutlawTorn ( 192318 ) on Thursday March 08, 2001 @11:54PM (#374987)
    From the article:
    "In five years we will know if it's an interesting physics problem or if it's really something that we can use"

    Unfortunately, this was the general opinion 5 years ago, and it will probably be the general opinion 5 years from now. It's like with AI, we're always on the cusp of a breakthrough, but that breakthrough never seems to come.

    Ah well, someday...
  • > But when it comes, we will have to QUICKLY design a better encryption system than our current public-key based one !

    Well, I still believe that _a_ public system that relies on scrutiny by everyone will be more secure - just look at the last year and the spate of security breaks we've seen. At least with general scrutiny, these holes are likely to be found faster (which can still take a long time; see the PGP alternate key situation).
  • Excellent! It's a maglev rollercoaster for atoms!
  • If Moore's law has anything to say about it, these will become practical immediately after the 1 atom transistor CPU becomes commonplace. After that, the limitations of silicon will have been reached, and we will need to use a new technology.

    The question is, who will have this technology? The early computers were the sole preserve of governments, but transistorised computers were more commonly seen in big businesses.

    Early silicon based designs were bought solely by home hobbyists. Later silicon monsters once again went to business, but this time those of all sizes. Will we see these more widely spread than the current breed, or more selectively targeted?
  • Cuz if it does I would like to leap back in time about 2 years and sell all my NASDAQ stock... Do I get the little guardian angel who smokes cigars? Maybe CowboyNeal can be him!
  • What's with all these technology pieces? Next thing we'll hear is that they have cloned dinosaurs - or almost have; you can bet the computers they tried this on will be using a plastic chip that you print out on your portable (Linux powered) computer developed at Bell Labs - but it won't be shipping with DVD playback.

    sheesh.
  • by Klerck ( 213193 ) on Friday March 09, 2001 @12:23AM (#374993) Homepage
    First, I'd like to point out that quantum computation and quantum encryption are two almost completely separate concepts. Quantum encryption is based on the fact that quantum states cannot be measured without altering them. The most common example is the polarization of a photon, but it will work for any quantum state, so long as there exist, effectively, two unique states that can transmit the data.
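
    A very rough classical mock-up of the polarization idea (my own sketch of a BB84-style scheme, not from the article): sender and receiver each pick a random basis per photon, keep only the bits where the bases happen to match, and an eavesdropper measuring in the wrong basis disturbs the states in a detectable way.

        import random

        n = 16
        alice_bits  = [random.randint(0, 1) for _ in range(n)]
        alice_bases = [random.choice("+x") for _ in range(n)]
        bob_bases   = [random.choice("+x") for _ in range(n)]

        # With no eavesdropper, Bob reads Alice's bit whenever the bases agree;
        # those positions become the shared key material.
        key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
        print("shared key bits:", key)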

    Quantum computation, however, is much more complex and much more interesting. Quantum computers are based on the concept of quantum entanglement, the ability of a quantum state to exist in a superposition of all of its mutually exclusive states: It's a 1 and a 0. However, this is not as easy to use as one might think. While it's true that if you have n quantum logic gates you have the ability to input 2^n data values simultaneously (as opposed to only 1 piece of data if you have n digital logic gates), this is not going to be the end of classical computing for a few reasons.

    First, quantum computers have to be perfectly reversible. That means for every output there's an input and vice versa. And there has to be no way of knowing the initial states of the data. You don't process data, you process probabilities in a quantum computer; if you know exactly what any one value is throughout the computation, you can find out all of the values: the superposition ends and you're stuck with a useless chunk of machinery. This means YOU CAN ONLY GET ONE RESULT FROM ANY QUANTUM COMPUTATION, THE END RESULT. You can't see what the data in the middle is or the computer becomes useless. (Landauer's principle makes heat loss data loss. When your processor gets hot, it's losing data. If the same thing happened to a quantum computer, it wouldn't be quantum anymore.) Decoherence is what happens when you randomly lose data to the environment by design, not by choice, and the superposition ends. This is bad for Q.C.

    Oh, and quantum computers can only do *some* things faster, like prime factorization and discrete logarithms. Not multiplication or addition. Plus, the circuits that would do basic arithmetic would be bigger and slower than what you've currently got.

    So what does this all mean? It means that quantum computers are going to provide some advantages (real quick big number factorization) and some disadvantages (that whole RSA standard). The most realistic initial use of quantum computers will be as add-ons to existing supercomputers to resolve certain types of NP-Complete headaches that regular math can't simplify yet. At best they will someday be an add-on to your PC, but they will never replace the digital computer.

    If you want more info, check out http://www.qubit.org [qubit.org]; it's got some decent tutorials.
  • This issue was raised in Singh's book The Code Book. There is a working model to create a "quantum one time pad," or something thereof.
  • ...Feynman's notes are still available (from the Feynman link):

    Papers, 1933-1988. Feynman's correspondence, course and lecture notes, talks, speeches, publications, manuscripts, working notes and calculations and commentary on the work of others are all included in this extensive collection.

    Collection size: 91 boxes, 39 linear ft.

    That's a big twinkie.

  • We must find out how to overclock Quantum Computers.
  • by Glowing Fish ( 155236 ) on Friday March 09, 2001 @01:36AM (#374997) Homepage
    Um, how did this get modded down to -1 with no explanation? This looks to be totally on-topic and interesting. Did some moderator's cat run across the keyboard again?
