First Quantum Computing Gate on a Chip
An anonymous reader writes "After recent successes with superconducting qubits, researchers from Delft have demonstrated the first controlled-NOT (CNOT) quantum gate on a chip. 'A team has demonstrated a key ingredient of such a computer by using one superconducting loop to control the information stored on a second. Combined with other recent advances, the result may pave the way for devices of double the size in the next year or two--closer to what other quantum computing candidates have achieved, says physicist Hans Mooij of the Delft University of Technology in the Netherlands. Unlike today's computers, which process information in the form of 0s and 1s, a quantum computer would achieve new levels of power by turning bits into fuzzy quantum things called qubits (pronounced cue-bits) that are 0 and 1 simultaneously. In theory, quantum computers would allow hackers to crack today's toughest coded messages and researchers to better simulate molecules for designing new drugs and materials.'"
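For readers unfamiliar with the gate itself: a CNOT flips the target qubit only when the control qubit is 1, and applied to a superposed control it produces entanglement. A minimal NumPy sketch of the ideal gate matrix (a textbook idealization, not a model of the Delft superconducting hardware):

```python
import numpy as np

# Ideal CNOT in the computational basis |control, target>:
# |00> -> |00>, |01> -> |01>, |10> -> |11>, |11> -> |10>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Put the control in the superposition (|0> + |1>)/sqrt(2) via a Hadamard,
# leave the target in |0>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)
state = np.kron(H @ ket0, ket0)   # (|00> + |10>)/sqrt(2)

entangled = CNOT @ state          # (|00> + |11>)/sqrt(2): a Bell state
print(np.round(entangled, 3))
```

Applying CNOT to that superposed input yields equal amplitudes on |00> and |11> only: the two qubits' measurement outcomes are now perfectly correlated, which is the "one loop controlling the other" behavior the article describes.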
A solid milestone... (Score:5, Interesting)
For those wondering why this is important: the first true electronic logic gates were invented in the early 1920s, predating the point-contact transistor (invented in 1947) by roughly 25 years. Sixty years after that, transistor computing is in every aspect of our lives.
At the rate quantum computing is advancing, I think we can expect to see quantum transistors (in the lab, at least) by 2020. A truly useful quantum computer may be available less than 50 years from now. Hopefully by then someone will pick up the slack and have the Linux kernel ported to the Q-CPU architecture!
Re:A solid milestone... (Score:3, Interesting)
Come to think of it, arithmetic encoding is a bit like encryption...
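The comparison holds loosely: an arithmetic coder collapses a whole message into one number inside a shrinking interval, and without the probability model that number is opaque. A toy sketch (illustrative only, not a production codec; symbol probabilities are made up for the example):

```python
from fractions import Fraction

# Toy arithmetic coder: each symbol narrows the interval [0, 1) in
# proportion to its probability. Without the model, the encoded number
# is meaningless -- the loose sense in which it's "a bit like encryption".

def intervals(probs):
    """Map each symbol to its slice of [0, 1)."""
    cum, c = {}, Fraction(0)
    for sym, p in probs.items():
        cum[sym] = (c, c + Fraction(p))
        c += Fraction(p)
    return cum

def arith_encode(message, probs):
    """Narrow [0, 1) once per symbol; return a number in the final interval."""
    cum = intervals(probs)
    low, high = Fraction(0), Fraction(1)
    for sym in message:
        span = high - low
        lo, hi = cum[sym]
        low, high = low + span * lo, low + span * hi
    return (low + high) / 2

def arith_decode(code, probs, length):
    """Recover the message by re-tracing the interval narrowing."""
    cum = intervals(probs)
    low, high = Fraction(0), Fraction(1)
    out = []
    for _ in range(length):
        offset = (code - low) / (high - low)
        for sym, (lo, hi) in cum.items():
            if lo <= offset < hi:
                out.append(sym)
                span = high - low
                low, high = low + span * lo, low + span * hi
                break
    return "".join(out)

probs = {"a": "1/2", "b": "1/4", "c": "1/4"}
code = arith_encode("abca", probs)
print(code, arith_decode(code, probs, 4))  # roundtrips back to "abca"
```

Of course this is compression, not cryptography: anyone who knows the model can decode it, so the resemblance is only surface-deep.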
Quantum states (Score:4, Interesting)
Someone with some understanding of this stuff please elaborate, before my head asplodes.
Re:A solid milestone... (Score:5, Interesting)
However, it is important to realise that the theory of computation had been in development since the early 1800s (and the logic underlying that had been around for centuries); by the time the first electronic devices were created, we already had a good understanding of what they could be used for, because we had been doing exactly the same things by hand for over 50 years at that point (the word "computer" originally meant a person who performed such computations, and an "electronic computer" was just a device to replicate the task that person was doing).
We can't do quantum computations by hand, so we have no real experience with the theory, and the underlying statistical methods are relatively recent developments: quantum computers do not use the classical logic that we're all familiar with. This is a massive setback compared to the development of the electronic computer, and advances in theory usually can't be accelerated all that much. It is likely to be between 50 and 100 years before we know enough to build non-trivial applications out of quantum computers. Not because we can't build the hardware, but because we don't know how to write any software to run on them. The entire field of software development will have to be reinvented, and we don't actually know that it will be useful for anything. Unlike the first electronic computers, which had very real and obvious applications performing the tasks that were already being done by hand, we have only vague theories and ideas about what quantum computers might be useful for. (Even the much-quoted method for breaking certain encryption algorithms rests on various assumptions that aren't proven; we don't know for sure whether quantum computers will actually be able to run it yet.)
We'll get there eventually, but it will probably take a long time and we can't really predict at this stage whether it'll be particularly useful. From what we know so far, these things are going to be incredibly arcane and obtuse to work with, and that is going to make it difficult. We might see it in our lifetimes, but I wouldn't place any bets on it, it might take much longer. The things we're playing with today may turn out to be the Babbage engines of quantum computing.
Inverted qubit? (Score:2, Interesting)
Does this make D-Wave's quantum computer obsolete? (Score:1, Interesting)
(comment 2): "How come Delft U has been able to perform a CNOT with two qubits using superconducting technology? I thought Rose/D-Wave claimed it was extremely difficult to do discrete quantum gates with superconducting technology. What are the present and future limitations of the Delft 'quantum computer'?"
Rose IGNORED the question. The quantum computer built by D-Wave [wikipedia.org] is an adiabatic computer (which is an analog computer), whereas the Delft people have built a discrete-gate quantum computer. Does the Delft computer make D-Wave's computer obsolete?
Re:A solid milestone... (Score:4, Interesting)
Re:A solid milestone... (Score:2, Interesting)
Really freaking fast processing by estimation (Score:3, Interesting)
There's a lot more math to it than that, but the idea is that a simpler approximation formula, replicated across countless realities, gives an accurate answer much faster than any single reality computing the exact one.
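The "many realities" picture is one popular way of describing quantum parallelism; more concretely, interference between superposed computation paths lets a single query answer a global question. A minimal NumPy sketch of Deutsch's algorithm, the textbook example (not from the article): one oracle call decides whether f: {0,1} -> {0,1} is constant or balanced, which classically needs two evaluations.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H2 = np.kron(H, H)  # Hadamard on both qubits

def oracle(f):
    """Unitary U|x, y> = |x, y XOR f(x)> for f: {0,1} -> {0,1}."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Decide constant vs. balanced with one oracle call."""
    state = np.zeros(4)
    state[0b01] = 1                        # start in |0>|1>
    state = H2 @ state                     # superpose both inputs
    state = oracle(f) @ state              # evaluate f "on both at once"
    state = np.kron(H, np.eye(2)) @ state  # interfere the paths
    p0 = state[0b00]**2 + state[0b01]**2   # prob. first qubit measures 0
    return "constant" if p0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))  # constant
print(deutsch(lambda x: x))  # balanced
```

The speedup comes from the final Hadamard making the two evaluation paths interfere, so the measurement reveals a joint property of f without ever reading out f(0) or f(1) individually; no extra "realities" need to be invoked to state that.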
Cooler yet: if they're actually making functional quantum gates, does this mean the processing power is actually being derived from other realities? That would be awesome and totally Outer Limits material.
-Matt