Supercomputing / Science

First Quantum Computing Gate on a Chip

An anonymous reader writes "After recent success with superconducting qubits, researchers from Delft have demonstrated the first Controlled-NOT quantum gate on a chip. 'A team has demonstrated a key ingredient of such a computer by using one superconducting loop to control the information stored on a second. Combined with other recent advances, the result may pave the way for devices of double the size in the next year or two--closer to what other quantum computing candidates have achieved, says physicist Hans Mooij of the Delft University of Technology in the Netherlands. Unlike today's computers, which process information in the form of 0s and 1s, a quantum computer would achieve new levels of power by turning bits into fuzzy quantum things called qubits (pronounced cue-bits) that are 0 and 1 simultaneously. In theory, quantum computers would allow hackers to crack today's toughest coded messages and researchers to better simulate molecules for designing new drugs and materials.'"
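For the curious, the behavior described here is easy to sketch as a classical state-vector simulation. The following is a minimal NumPy toy that simulates the math of a CNOT acting on a control qubit in superposition, not the superconducting hardware itself:

```python
import numpy as np

# CNOT in the basis |00>, |01>, |10>, |11> (first qubit = control,
# second = target): it flips the target only when the control is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of 0 and 1,
# the "0 and 1 simultaneously" state from the summary.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
zero = np.array([1, 0], dtype=complex)

state = np.kron(H @ zero, zero)   # control in superposition, target |0>
state = CNOT @ state              # the control loop steers the target

print(np.round(state, 3))         # [0.707 0 0 0.707]: (|00> + |11>)/sqrt(2),
                                  # an entangled (Bell) state
```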
  • A solid milestone... (Score:5, Interesting)

    by teebob21 ( 947095 ) on Sunday June 24, 2007 @05:48PM (#19630241) Journal
    I find it interesting that the first electronic computing gates devised were the AND/OR gates, using basic diode logic. Quantum computing research has developed the NOT-style gates (NOT, and now Controlled-NOT) first. I think this has something to do with the esoteric nature of quantum computing: AND/OR gates collapse two inputs into a single value, whereas NOT is merely an inverter. The idea of entanglement makes the inversion process a likely first step in quantum research.

    For those wondering why this is important: the first true electronic gates were invented in the early 1920s, predating the point-contact transistor, invented in 1947, by roughly 25 years. Sixty years after that, here we are with transistor computing in every aspect of our lives.

    At the rate quantum computing is advancing, I think we can expect to see quantum transistors (in the lab, at least) by 2020. A truly useful quantum computer may be available less than 50 years from now. Hopefully by then someone will pick up the slack and have the Linux kernel ported to the Q-CPU architecture!
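    A side note on why inverter-like gates are the natural starting point: quantum gates have to be reversible (they are unitary), and a classical AND throws information away, while NOT and Controlled-NOT are their own inverses. A quick plain-Python illustration of that asymmetry (truth tables only, no quantum mechanics):

    ```python
    # Classical AND is irreversible: three different input pairs all map to
    # output 0, so the inputs cannot be recovered from the output.
    preimages_of_zero = [(a, b) for a in (0, 1) for b in (0, 1) if (a and b) == 0]
    print(preimages_of_zero)       # [(0, 0), (0, 1), (1, 0)] -- information lost

    # CNOT is self-inverse: applying it twice returns the input, the kind of
    # behavior a unitary quantum gate must have.
    def cnot(control, target):
        return control, target ^ control   # flip target iff control is 1

    for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        assert cnot(*cnot(*bits)) == bits  # CNOT followed by CNOT = identity
    print("CNOT is reversible on every classical input")
    ```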

  • by Ant P. ( 974313 ) on Sunday June 24, 2007 @05:54PM (#19630277)
    What is a quantum computer good for, anyway? So far all I've seen is cracking encryption and other stuff involving gigantic calculations. Is there anything in the mainstream market it'd be useful for, like sound/video processing?
    Come to think of it, arithmetic encoding is a bit like encryption...
  • Quantum states (Score:4, Interesting)

    by arashi no garou ( 699761 ) on Sunday June 24, 2007 @06:03PM (#19630323)
    I'm no quantum theory expert, not by a long shot, but it was my understanding that there are 32 quantum states of electrons, not just on/off (1/0) like in the binary computer world. So, if we now have a quantum NOT gate, doesn't that mean there are 32 possible states of the NOT gate? Also, according to the article, the CNOT gates they created can be both 0 and 1 simultaneously. In my mind this would cause errors and actually stop the flow of information instead of speeding it up.

    Someone with some understanding of this stuff please elaborate, before my head asplodes.
  • by asuffield ( 111848 ) <asuffield@suffields.me.uk> on Sunday June 24, 2007 @06:30PM (#19630521)

    "For those wondering why this is important: the first true electronic gates were invented in the early 1920s, predating the point-contact transistor, invented in 1947, by roughly 25 years. Sixty years after that, here we are with transistor computing in every aspect of our lives."


    However, it is important to realise that the theory of computation had been in development since the early 1800s (and the logic underlying that had been around for centuries); by the time the first electronic devices were created, we already had a good understanding of what they could be used for, because we had been doing exactly the same things by hand for over 50 years at that point (the word "computer" originally meant a person who performed such computations, and an "electronic computer" was just a device to replicate the task that person was doing).

    We can't do quantum computations by hand, so we have no real experience with the theory, and the underlying statistical methods are relatively recent developments: quantum computers do not use the classical logic that we're all familiar with. This is a massive setback compared to the development of the electronic computer - and advances in theory usually can't be accelerated all that much. It is likely to be between 50 and 100 years before we know enough to build non-trivial applications out of quantum computers. Not because we can't build the hardware, but because we don't know how to write any software to run on them.

    The entire field of software development will have to be reinvented, and we don't actually know that it will be useful for anything. Unlike the first electronic computers, which had very real and obvious applications performing the tasks that were then being done by hand, we have only vague theories and ideas about what quantum computers might be useful for. (Even the much-quoted method for breaking certain encryption algorithms rests on various unproven assumptions; we don't know for sure yet whether quantum computers will actually be able to run it.)

    We'll get there eventually, but it will probably take a long time, and we can't really predict at this stage whether it'll be particularly useful. From what we know so far, these things are going to be incredibly arcane and opaque to work with, and that is going to make it difficult. We might see it in our lifetimes, but I wouldn't place any bets on it; it might take much longer. The things we're playing with today may turn out to be the Babbage engines of quantum computing.
  • Inverted qubit? (Score:2, Interesting)

    by Lobais ( 743851 ) on Sunday June 24, 2007 @06:47PM (#19630611)
    If a qubit is both 0 and 1 at the same time, what is the point of inverting it? Would it then be 1 and 0 at the same time?
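    One hedged way to answer (a NumPy sketch, not anything from the article): a qubit is not equally 0 and 1 in general; it carries a separate amplitude for each, and the quantum NOT (the X gate) swaps those amplitudes, which also shuffles relative phase:

    ```python
    import numpy as np

    X = np.array([[0, 1], [1, 0]], dtype=complex)   # quantum NOT (Pauli-X)

    # A lopsided superposition: mostly 0, a little 1.
    lopsided = np.array([np.sqrt(0.9), np.sqrt(0.1)], dtype=complex)
    print(np.round(X @ lopsided, 3))   # mostly 1 now -- the weights swapped

    # Even on an equal-weight state the gate can matter, via relative phase:
    balanced = np.array([1, -1], dtype=complex) / np.sqrt(2)   # (|0> - |1>)/sqrt(2)
    print(np.round(X @ balanced, 3))   # (-|0> + |1>)/sqrt(2): sign structure moved
    ```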
  • by Anonymous Coward on Sunday June 24, 2007 @06:52PM (#19630647)
    Recently, D-Wave's founder Geordie Rose was asked on his blog [wordpress.com]


    (comment 2): "How come Delft U has been able to perform a CNOT with two qubits using superconducting technology? I thought Rose/D-Wave claimed it was extremely difficult to do discrete quantum gates with superconducting technology. What are the present & future limitations of the Delft 'quantum computer'?"


    Rose IGNORED the question. The quantum computer built by D-Wave [wikipedia.org] is an adiabatic computer (essentially an analog computer), whereas the Delft people have built a discrete-gate quantum computer. Does the Delft computer make D-Wave's computer obsolete?
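    The gap between the two machines can be caricatured in a few lines. This is a toy sketch of the distinction only (a single qubit in NumPy), not a model of either D-Wave's or Delft's actual device:

    ```python
    import numpy as np

    X = np.array([[0, 1], [1, 0]], dtype=float)
    Z = np.array([[1, 0], [0, -1]], dtype=float)

    # Gate model (Delft-style): apply a discrete unitary in one exact step.
    zero = np.array([1.0, 0.0])
    print(X @ zero)                  # |0> -> |1>, one clean gate application

    # Adiabatic model (D-Wave-style): deform a Hamiltonian H(s) slowly and
    # stay in its ground state; the answer is read off at s = 1.
    for s in np.linspace(0, 1, 5):
        H = (1 - s) * (-X) + s * (-Z)
        energies, vecs = np.linalg.eigh(H)     # eigenvalues in ascending order
        print(f"s={s:.2f}  ground state ~ {np.round(vecs[:, 0], 2)}")
    ```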

  • by jp102235 ( 923963 ) on Sunday June 24, 2007 @06:59PM (#19630673)
    Well, inverting logic is, well, logical (no pun intended) to most modern digital logic designers of the CMOS type. CMOS logic (and its variants) is inherently inverting. That is, the basic gate in CMOS is an inverter. The next step up in complexity is the NAND and NOR gates (AND-NOT / OR-NOT). To make an AND gate requires a NAND and an inverter; same thing for OR: a NOR and an inverter. Although the quantum abstraction of computation may not be the same as CMOS (inverting layers of logic), it is not surprising at all that the designers tried to make an inverter first. Had they started during the days of relays, we might have had a different gate altogether.

    JP
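    The parent's construction is easy to verify with truth tables (plain Python, illustration only):

    ```python
    # An inverter is a NAND with its inputs tied together, and
    # AND = NAND followed by an inverter, exactly as in CMOS.
    def nand(a, b):
        return 0 if (a and b) else 1

    def inv(a):
        return nand(a, a)          # tying a NAND's inputs together inverts

    def and_gate(a, b):
        return inv(nand(a, b))     # NAND + inverter

    for a in (0, 1):
        for b in (0, 1):
            assert and_gate(a, b) == (a & b)
    print("AND built from inverting primitives checks out")
    ```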
  • by Short Circuit ( 52384 ) <mikemol@gmail.com> on Monday June 25, 2007 @01:22AM (#19632665) Homepage Journal

    "Really? I know they can get close, and it's possible to prove that power must be consumed to change state, but I'd love to see a device with no leakage. Gates such as those used in NAND flash devices (dual MOSFETs) get pretty close, but I'm pretty sure even they leak, especially on read operations."

    You got me there. I'd forgotten about leakage across the insulating layer due to quantum tunneling. OTOH, a vacuum tube doesn't work well without power to the heating element. Come to think of it, though, one could still get current flow at ambient temperatures; it just wouldn't be nearly as large as when you have a hot cathode.

    "My point is that quantum computers are not just special transistors with slightly different properties. They really are a completely different thing. You're never going to make an amplifier out of quantum gates, and you're never going to make a quantum computer out of transistors."

    Point taken and noted. I'm still not very clear on how quantum computing works, but I'm looking forward to reading more discussions like this one in the future.
  • IANAQP (I am not a quantum physicist), but the theory I read explains a system gaining processing power from shared computation by a single processor replicated across multiple realities. Each qubit is an answer calculated by a machine in one reality, and the combination of those answers presumably gives you the correct response. David Deutsch [wikipedia.org] wrote a book on this called "The Fabric of Reality" that works through the concept of a basic Turing machine - where computers all come from - and how this can be reworked into a quantum processor.

    There's a lot more math to it than that, but the idea is that a simpler approximation formula, replicated infinitely across realities, gives an accurate response much faster than any single reality calculating the absolute answer.

    Cooler yet: if they're actually making functional quantum gates, does this mean the processing power is actually being derived from other realities? That would be awesome and totally Outer Limits material.

    -Matt
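    The canonical small example of that idea is Deutsch's own algorithm, which decides with one oracle query whether a one-bit function is constant or balanced, where a classical machine needs two. Below is a NumPy simulation of the textbook algorithm (a sketch, not code from the book):

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    def deutsch(f):
        """Decide whether f: {0,1} -> {0,1} is constant, with one oracle call."""
        # Oracle U_f |x>|y> = |x>|y XOR f(x)>, built as a 4x4 permutation matrix.
        U = np.zeros((4, 4), dtype=complex)
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1

        state = np.kron(H @ np.array([1, 0]), H @ np.array([0, 1]))  # |+>|->
        state = U @ state                 # one "query": f evaluated in superposition
        state = np.kron(H, np.eye(2)) @ state
        # Measuring the first qubit gives 0 for constant, 1 for balanced.
        prob_first_is_0 = abs(state[0])**2 + abs(state[1])**2
        return "constant" if prob_first_is_0 > 0.5 else "balanced"

    print(deutsch(lambda x: 0))      # constant
    print(deutsch(lambda x: x))      # balanced
    print(deutsch(lambda x: 1 - x))  # balanced
    ```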

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein

Working...