IBM Raises the Bar with a 50-Qubit Quantum Computer (technologyreview.com)

IBM said on Friday it has created a prototype 50-qubit quantum computer as it further increases the pressure on Google in the battle to commercialize quantum computing technology. The company is also making a 20-qubit system available through its cloud computing platform, it said. From a report: The announcement does not mean quantum computing is ready for common use. The system IBM has developed is still extremely finicky and challenging to use, as are those being built by others. In both the 50- and the 20-qubit systems, the quantum state is preserved for 90 microseconds -- a record for the industry, but still an extremely short period of time. Nonetheless, 50 qubits is a significant landmark in progress toward practical quantum computers. Other systems built so far have had limited capabilities and could perform only calculations that could also be done on a conventional supercomputer. A 50-qubit machine can do things that are extremely difficult to simulate without quantum technology. Whereas normal computers store information as either a 1 or a 0, quantum computers exploit two phenomena -- entanglement and superposition -- to process information differently.
  • by Anonymous Coward on Friday November 10, 2017 @11:44AM (#55526363)

    But can it run Linux?

  • encryption (Score:4, Interesting)

    by Anonymous Coward on Friday November 10, 2017 @11:51AM (#55526417)

    One of the reasons the three-letter agencies like to store even encrypted communications is that quantum computers will allow breaking encrypted data in ways that classical computers can't in any practical sense. An example is Shor's Algorithm for factoring numbers, which runs in a practical amount of time on a quantum computer and could be used to break public-key crypto. If they have saved today's encrypted traffic, they can break it later once quantum computing arrives.

    Quantum computing is not quite there yet, but it is coming; a rough sketch of the classical half of Shor's algorithm is below.
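
    A rough sketch of the classical half of Shor's algorithm (my illustration, not from the article): the quantum hardware's only job is to find the period r of f(x) = a^x mod N; once r is known, ordinary arithmetic recovers the factors. Here the period is brute-forced for a tiny N just to show how the factors fall out.

    from math import gcd

    def period(a, N):
        """Smallest r > 0 with a**r % N == 1 -- the step a quantum computer speeds up."""
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    N, a = 15, 7                # toy example; a must be coprime to N
    r = period(a, N)            # r = 4
    assert r % 2 == 0 and pow(a, r // 2, N) != N - 1
    print(N, "=", gcd(pow(a, r // 2) - 1, N), "*", gcd(pow(a, r // 2) + 1, N))   # 15 = 3 * 5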

    • I wouldn't be surprised to see a move to lattice-based algorithms or other crypto that is resistant to quantum factoring in the next few years, once a significant key has been factored. Or perhaps, when a key handshake is done, part of it could involve keeping a shared secret for later, so that even if the public/private part of the encryption is broken, the shared secret, though not as strong on its own, would still protect the data.
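
      A minimal sketch of that hybrid idea (illustrative only; the names and the KDF construction are my assumptions, not a real protocol): mix the handshake-derived secret with a separately shared secret, so breaking the public-key part alone does not reveal the session key.

      import hashlib, hmac, os

      def derive_session_key(handshake_secret, preshared_secret, context=b"example-hybrid-kdf"):
          # HKDF-style extract-then-expand using HMAC-SHA256.
          prk = hmac.new(preshared_secret, handshake_secret, hashlib.sha256).digest()
          return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

      handshake_secret = os.urandom(32)   # stand-in for the (quantum-vulnerable) key-exchange output
      preshared_secret = os.urandom(32)   # stand-in for a secret shared earlier out-of-band
      print(derive_session_key(handshake_secret, preshared_secret).hex())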

      • Quantum is no silver bullet. Basically only public-key cryptography is in need of an overhaul, as the algorithms we currently use, chosen for nice short keys, are vulnerable. Elsewhere it's not an issue: for example, it's proven that a quantum computer can only break a hash of at most twice the length that an equivalent non-quantum computer could. Yes, having to double the hash length means the quantum attack is a real speed-up, but the only practical effect is hashes being slightly more cumbersome for a human to read.
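
        A quick back-of-the-envelope version of that claim (my sketch, assuming the quantum attack is a Grover-style quadratic speedup on preimage search):

        def preimage_cost(hash_bits):
            # classical brute force ~2**n hash evaluations; Grover-style search ~2**(n/2) queries
            return {"classical": 2 ** hash_bits, "quantum": 2 ** (hash_bits // 2)}

        for n in (128, 256):
            print(n, preimage_cost(n))
        # A 256-bit hash leaves ~2**128 quantum work -- the same margin a 128-bit
        # hash offered against classical search, hence "just double the length".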

      • Google has experimented with "New Hope", a lattice-based key exchange for securing communication. It would be immune to Shor's Algorithm, so safe in a post-quantum world. Unfortunately, there are other issues with it: someone might find a classical-computing way to break it, it might leak information, it can sometimes fail, etc. There are other possible quantum-resistant algorithms. XMSS can be used to sign documents, but the signatures and keys are huge compared to what we have now. There is also supersingular isogeny key exchange.
  • by king neckbeard ( 1801738 ) on Friday November 10, 2017 @11:57AM (#55526449)
    50 cubits is an awfully large computer, and why do Americans have to use such archaic units?
    • The summary is wrong. These must be cubic cubits, as measuring a computer in a single dimension makes no sense at all.

    • by sl3xd ( 111641 )

      I think it all comes down to what you base your measurement system from.

      Most of the world's measurements are centered around "the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom." All of the other basic units require that duration as part of their derivation.

      America has realized that it's a poor basis, as the duration can vary significantly depending on one's relative velocity vs. a frame of reference.

  • by Myria ( 562655 ) on Friday November 10, 2017 @12:06PM (#55526485)

    I get the feeling that we're going to find out that maintaining coherence requires energy that's exponential in the number of qubits, which would make quantum computing mostly useless.

    Our universe has always tended to stop those who try to break the rules; try making a perpetual motion machine, for example.

    • Re:Coherence (Score:5, Informative)

      by JoshuaZ ( 1134087 ) on Friday November 10, 2017 @12:17PM (#55526543) Homepage
      We sort of already know that isn't the case, at least not for generic states. We know that because we can construct Bose-Einstein condensates https://en.wikipedia.org/wiki/Bose%E2%80%93Einstein_condensate [wikipedia.org], which are in a certain sense coherent states of lots of things together. That said, Gil Kalai has made more technical claims and conjectures which seem to follow from a similar intuition https://gilkalai.wordpress.com/2014/03/18/why-quantum-computers-cannot-work-the-movie/ [wordpress.com]. Note that this isn't really like the thermodynamic situation of perpetual motion; no law of physics appears to be violated by quantum computers, they just don't match our intuitions well.
    • Perpetual motion is easy. Extracting energy from said machine is impossible.

      A machine which appears to be perpetual motion and provides extractable energy is also easy. It fails as soon as you leave whatever environmental conditions you're exploiting.

    • by GuB-42 ( 2483988 )

      I think the question is: is that quantum computer able to factor integers into primes faster than a classical computer using the same amount of power?
      If that's the case, even if it proves too impractical to break cryptography right now, it should at least prove that there is something to be gained from quantum computing.

    • by Luthair ( 847766 )
      What about Twitter? They developed a perpetual emotion machine.
    • "try making a perpetual motion machine"

      Not all that difficult. We're powering ours with human stupidity, which is infinite, filtered through a mesh of hashed bitcoin, which are well known to be imaginary. The math -- which involves dividing stupidity by cellphone-user intelligence (zero) -- shows that perpetual motion is not only possible, but inevitable. We'll be taking our product to market just as soon as we handle a couple of engineering glitches.

  • by JoshuaZ ( 1134087 ) on Friday November 10, 2017 @12:13PM (#55526531) Homepage

    We're getting closer and closer to testing quantum supremacy -- the hypothesis that quantum computers can practically solve problems that classical computers cannot https://en.wikipedia.org/wiki/Quantum_supremacy [wikipedia.org]. Note that this is a practical statement; anything a quantum computer can do, a classical computer can also do, but with potentially exponential slowdown. This follows from the fact that BQP https://en.wikipedia.org/wiki/BQP [wikipedia.org], the set of problems a quantum computer can solve in polynomial time, is contained in PSPACE https://en.wikipedia.org/wiki/PSPACE [wikipedia.org], the set of things a classical computer can do with polynomial space (and since polynomial-space calculations live in EXPTIME, the set of things solvable in at most exponential time, the result follows).
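
    A toy illustration of that "classical but exponentially slower" point (my sketch; the gate is the standard Hadamard, nothing to do with IBM's API): simulate n qubits by tracking all 2**n complex amplitudes.

    import numpy as np

    def apply_single_qubit_gate(state, gate, target, n):
        # Apply a 2x2 gate to qubit `target` of an n-qubit state vector.
        state = state.reshape([2] * n)
        state = np.tensordot(gate, state, axes=([1], [target]))
        state = np.moveaxis(state, 0, target)
        return state.reshape(-1)

    n = 3
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                                 # start in |000>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    for q in range(n):
        state = apply_single_qubit_gate(state, H, q, n)
    print(np.round(np.abs(state) ** 2, 3))         # uniform over all 2**n basis states

    At 50 qubits the state vector alone would need 2**50 complex numbers (petabytes of memory), which is why "just simulate it classically" stops being an option.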

    It is very likely that before we see genuinely useful quantum computing (e.g. for factoring large numbers or simulating complicated chemical systems) we'll have an answer to the quantum supremacy question. I suspect that it is more likely that we'll have an answer in terms of boson sampling before we have an answer involving a universal quantum computer.

    Essentially, boson sampling works by just looking at the distribution of bosons (well, for convenience, photons) as they pass through very simple optical elements. Boson sampling has two major advantages. First, we know it is actually *hard* in a technical sense for a classical computer to do unless some conjectures that pretty much everyone believes turn out to be false. In particular, Scott Aaronson and Alex Arkhipov proved that if a classical computer can do boson sampling efficiently then the polynomial hierarchy collapses https://www.scottaaronson.com/papers/optics.pdf [scottaaronson.com]. For those who aren't theoretical compsci people, the polynomial hierarchy not collapsing is a statement only marginally stronger than P != NP and is very widely believed. This is in contrast, for example, with factoring large numbers, where if it turned out that classical computers could factor efficiently, the only major conjecture to fall would be the difficulty of factoring itself. Second, boson sampling is much easier in many respects than what IBM is trying to do, which requires much fancier systems: supercooled qubits, careful protection from stray particles, careful preservation of entanglement, and all sorts of other stuff. Still, what IBM is doing is important and very necessary if we're ever going to have practical quantum computers.
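
    For background (my addition, not stated in the comment above): the classically hard part of boson sampling is that the output probabilities are given by matrix permanents, and even the best known exact algorithm, Ryser's formula, takes time growing like 2**n. A toy version:

    from itertools import combinations
    import numpy as np

    def permanent_ryser(A):
        # Ryser's formula: perm(A) = (-1)**n * sum over column subsets S of
        # (-1)**|S| * product over rows of (sum of the row restricted to S).
        n = A.shape[0]
        total = 0.0
        for k in range(1, n + 1):
            for cols in combinations(range(n), k):
                total += (-1) ** k * np.prod(A[:, cols].sum(axis=1))
        return (-1) ** n * total

    print(permanent_ryser(np.array([[1.0, 2.0], [3.0, 4.0]])))   # 1*4 + 2*3 = 10.0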

    • Comment removed based on user account deletion
      • So, everyone agrees that if one can get enough qubits to work together in a universal computer, then they will be useful. The exact number isn't clear, but pretty much everyone agrees that by the time you get to around 200 qubits there will very likely be tasks where it is practically useful.
  • Well, if they repeat the same for network speeds, maybe in 10 years we can run Lotus Notes in a usable way.
  • Is it truly 50 qubits, i.e. all 50 entangled together, or is it 'n' smaller (e.g. 4-qubit) units?
  • Wait, isn't D-Wave already providing a 1000+ Qubit computer? What's the difference?

    • The D-Wave computers are very far from universal computers. Each qubit is only able to talk to a small number of qubits very close to it. The specific method that D-Wave is using is a variant of quantum annealing and it isn't clear that this provides any speedup over classical approaches https://www.scottaaronson.com/blog/?p=3192 [scottaaronson.com]. So for multiple reasons, the IBM approach is very different, and frankly, much more likely to pan out in the long term.
    • by sl3xd ( 111641 )

      There's no small amount of controversy as to whether D-Wave is even "quantum". It's definitely not general purpose.

      D-Wave's current offering is "15x" faster than a single-core silicon microprocessor -- and the tasks it's useful for are embarrassingly parallel. Modern laptops are starting to be offered with 18 or more cores -- meaning that even a laptop CPU can outperform D-Wave's "1000+ Qubits".

      Scientific publications have, by and large, found that a traditional multicore silicon chip can easily outperform what

  • by MouseR ( 3264 )

    Canadian-owned and operated D-Wave [dwavesys.com] computers have way more than 50 qubits, with a 1000-qubit model available and a 2000-qubit model in the works.

    • by sl3xd ( 111641 )

      D-Wave makes quantum annealing processors, which are only useful for a sliver of useful computing (adiabatic quantum computing).

      There's no small amount of controversy [wikipedia.org] as to whether D-Wave systems are truly quantum machines. A number of groups found "no quantum speedup" and have shown better performance using traditional silicon, and studies [sciencemag.org] have been published [arxiv.org] to that effect.

      Having worked in supercomputing for a decade, I've looked hard at D-Wave's "quantum" computing, and give it slightly more credibility t

      • D-Wave makes quantum annealing processors - and is only useful for a sliver of useful computing (adiabatic quantum computing).

        Whew, that's lucky! Quantum computing wouldn't get very far at all without obese diabetic programmers participating.

    • by dissy ( 172727 )

      Canadian-owned and operated D-Wave computers have way more than 50 qubits, with a 1000-qubit model available and a 2000-qubit model in the works.

      D-Wave works on a completely different design. Their systems cannot manipulate individual qubits; instead, all of their qubits sit in a big pool functioning together, such that they can only manipulate the entire grouping.

      Instead of reading out individual qubits, they read the energy level of the entire pool of qubits summed together.

      This makes it easier to actually set up all of those qubits in the first place, but it limits them to solving "lowest energy state" problems (a toy sketch of that framing follows below).

      IBM and Google are using designs that
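
      A toy version of that "lowest energy state" framing (my illustration, not D-Wave's actual interface; the couplings and fields are made-up values): encode a problem as an Ising energy function and look for the spin assignment that minimizes it, which is what an annealer tries to settle into physically.

      from itertools import product

      J = {(0, 1): 1.0, (1, 2): -1.0}   # couplings between spins (hypothetical values)
      h = [0.1, 0.0, -0.2]              # per-spin fields (hypothetical values)

      def energy(spins):
          e = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
          return e + sum(hi * s for hi, s in zip(h, spins))

      best = min(product([-1, +1], repeat=len(h)), key=energy)
      print(best, energy(best))
      # Brute force takes 2**n evaluations, which is exactly the cost an annealer hopes to avoid.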

  • I don't know how many times I've read the now rote description of quantum computing in some sciencey magazine or blog:

    "Our regular computers have bits that can be only 1 and 0. A quantum computer has bits that use Superposition, the bits can be both 1 and 0 at the same time"

    Every time I read it, I ask myself: so what? So a bit can be both 1 and 0 at the same time. That didn't explain anything at all.

    • by Jeremi ( 14640 )

      Every time I read it, I ask my self, so what? So a bit can be both 1 and 0 at the same time. That didn't explain anything at all.

      Right, they forget to mention the hoped-for consequence, which only becomes apparent when you consider a system containing more than one qubit at once.

      i.e.

      1 qubit = 2 simultaneous states (== 2x potential speedup vs classical)
      2 qubits = 4 simultaneous states (== 4x potential speedup vs classical)
      3 qubits = 8 simultaneous states (== 8x potential speedup vs classical)
      [...]
      64 qubits = 18446744073709551616 simultaneous states (== 18446744073709551616x potential speedup vs classical)

      It's the old rice-on-the-chessboard story [singularitysymposium.com].

      • by Anonymous Coward

        Yes everyone always explains that.

        But they never explain how that is in any way useful.

        How do you give an input (like a ciphertext) and have it spit out an encryption key? Yes, 128 qubits could take on the state of any 128-bit encryption key, but how do you collapse it to the correct key?

    • It's very weird. Much like everything in the quantum world. Imagine you had 2 classical bits. They could be in the configuration of:
      1,1
      1,0
      0,1
      0,0
      but only one of those 4 states. With qubits, they are all of those at the same time. As such, where we would just read the bits in a classical computer, referencing the bits is not enough; we have to provide a coefficient as well, to tell which quantum state we are checking. That means essentially we can derive 4 bits (2 coefficients + 2 bits) worth of information.
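
      A minimal numpy sketch of the point above (the particular state is an illustrative choice): two qubits carry four complex amplitudes, one per classical configuration 00, 01, 10, 11, and measurement collapses to a single configuration.

      import numpy as np

      # Equal superposition of |00> and |11>, amplitudes ordered 00, 01, 10, 11.
      amps = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

      probs = np.abs(amps) ** 2
      probs = probs / probs.sum()                    # guard against float round-off
      for bits, p in zip(["00", "01", "10", "11"], probs):
          print(bits, p)

      print("measured:", np.random.choice(["00", "01", "10", "11"], p=probs))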

    • That didn't explain anything at all.

      Unfortunately, only Cubots can understand the theory, and they can also not understand it at the same time.

    • That didn't explain anything at all.

      Well. It does and it doesn't.

  • 90 microseconds before the qubits get destroyed, I assume rendering the "CPU" unusable? That's still longer than most iPhones last according to the last reliability survey.
  • by Chris Mattern ( 191822 ) on Friday November 10, 2017 @01:58PM (#55527241)

    What's a qubit?
