
Quantum Computing Startup Says It Will Beat IBM To Error Correction (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: On Tuesday, the quantum computing startup QuEra laid out a road map that would bring error correction to quantum computing in only two years and enable useful computations by 2026, years ahead of when IBM plans to offer the equivalent. Normally, this sort of thing should be dismissed as hype. Except the company is QuEra, which is a spinoff of the Harvard University lab that demonstrated the ability to identify and manage errors using hardware that's similar in design to what QuEra is building. Also notable: QuEra uses the same type of qubit that a rival startup, Atom Computing, has already scaled up to over 1,000 qubits. So, while the announcement should be viewed cautiously -- several companies have promised rapid scaling and then failed to deliver -- there are some reasons it should be taken seriously as well. [...]

As our earlier coverage described, the Harvard lab where the technology behind QuEra's hardware was developed has already demonstrated a key step toward error correction. It created logical qubits from small collections of atoms, performed operations on them, and determined when errors occurred (those errors were not corrected in these experiments). But that work relied on operations that are relatively easy to perform with trapped atoms: two qubits were superimposed, and both were exposed to the same combination of laser light, essentially performing the same manipulation on both simultaneously. Unfortunately, only a subset of the operations likely to be needed for a calculation can be done that way. So, the road map includes a demonstration of additional types of operations in 2024 and 2025. At the same time, the company plans to rapidly scale the number of qubits. Its goal for 2024 hasn't been settled on yet, but [QuEra's Yuval Boger] indicated that it is unlikely to be much more than double the current 256. By 2025, however, the road map calls for over 3,000 qubits, and over 10,000 a year later. This year's small step will add to the pressure for progress in the ensuing years.
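The scaling targets above can be laid out as data to see how aggressive the road map is. A sketch, with one assumption flagged: the article gives no firm 2024 number, only "unlikely to be much more than double the current 256", so the 512 below is an illustrative guess; 2025 and 2026 use the stated minimums.

```python
# QuEra's roadmap qubit targets as described above. The 2024 value is an
# assumption (the article says only "unlikely to be much more than double
# the current 256"); the 2025 and 2026 values are the stated minimums.
roadmap = {2023: 256, 2024: 512, 2025: 3000, 2026: 10000}

# Year-over-year growth factor, showing where the hard jump lands.
years = sorted(roadmap)
for prev, cur in zip(years, years[1:]):
    factor = roadmap[cur] / roadmap[prev]
    print(f"{prev} -> {cur}: {factor:.1f}x more qubits")
```

The 2024-to-2025 step is the steep one (nearly 6x under this assumption), which is why the summary notes that this year's small step adds pressure to the following years.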

If things go according to plan, the 3,000-plus qubits of 2025 can be combined to produce 30 logical qubits, meaning about 100 physical qubits per logical one. That ratio allows fairly robust error correction schemes and has undoubtedly been influenced by QuEra's understanding of the error rate of its current atomic qubits. Thirty logical qubits isn't enough to perform any algorithm that can't be simulated on today's classical hardware, but it would be more than sufficient to let people gain experience developing software for the technology. (The company will also release a logical-qubit simulator to help here.) QuEra will undoubtedly use this system to refine its error correction process -- Boger indicated that the company expects it to be transparent to the user. In other words, people running operations on QuEra's hardware can submit jobs knowing that, while they're running, the system will handle the error correction for them. Finally, the 2026 machine will enable up to 100 logical qubits, which is expected to be sufficient to perform useful calculations, such as the simulation of small molecules. More general-purpose quantum computing will have to wait for higher qubit counts still.
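The overhead arithmetic above is simple enough to sanity-check directly. A hedged sketch: it treats the roughly 100:1 physical-to-logical ratio from the article as a flat constant, whereas real error-correcting codes scale the overhead with code distance and physical error rate.

```python
# Sanity check of the error-correction overhead described above.
# Assumption: a flat ~100:1 physical-to-logical ratio, per the article;
# real codes vary the overhead with code distance and error rate.

def logical_qubits(physical: int, overhead: int = 100) -> int:
    """Logical qubits available given a physical-per-logical overhead."""
    return physical // overhead

print(logical_qubits(3000))   # 30  -> the 2025 target
print(logical_qubits(10000))  # 100 -> the 2026 target
```

Both roadmap milestones are consistent with the same ~100:1 ratio, which suggests the plan is to scale qubit count rather than shrink the error-correction overhead.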

  • All those qubits (Score:3, Informative)

    by backslashdot ( 95548 ) on Tuesday January 09, 2024 @10:38PM (#64145901)

    They claim to have all those qubits, but it's a lie. If they actually had a 3000 qubit computer, they'd be able to show it cracking cryptography. Instead, most of the qubits are wasted on "error correction" BS. When will they make one capable of solving any real-world problem? Nobody knows. It seems to be up there with cold fusion for fleecing investor money. So far the only thing they show is simulating quantum systems. That's like saying a swimming pool is a computer because it can solve wave mechanics problems. Ironically, that's actually more useful than current fake "quantum computers".

    • When will they make one capable of solving any real world applicable problem?

      There are a lot of physics experiments that do "quantum computing" in the sense of building a QM system, measuring its parameters, and doing something useful with the results.

      Semiconductors, lasers, nanoparticles with useful properties, you name it.

      But the research is usually slow, the progress incremental, and the hodge-podge of methods used too difficult to distill into a simple pseudo-metric like "qubits", so journos ignore it :)

      • by gweihir ( 88907 )

        Nope. This is not "computing". Computing means you abstract a problem to some formal representation, have a machine work on that, get an abstract, formal result and transfer that back to the real world.

    • by gweihir ( 88907 )

      The term is "effective qubits", and only those matter. Kind of like "AI" is not intelligent and the real term for that is AGI.

      With 3000 raw qubits, they have maybe 150 effective ones, maybe less. With that they can break maybe RSA 50, if the entanglement holds up long enough. Error correction cannot prevent entanglement collapse from sources other than noise. And that's if they can actually do universal computations.

      Well, my 30 year old programmable pocket calculator running Basic can break RSA 60 and probably can

    • It's actually in the summary.

      3,000 of their qubits is actually only 30 qubits.

      Their hardware isn't great.
      • by gweihir ( 88907 )

        Hahah, so even worse than I thought. They can (at best) break RSA 10 with that and probably have less computing power than the most weak-ass pocket calculator ever made. That is not "useful". That is pathetic.

      • It's actually in the summary. 3,000 of their qubits is actually only 30 qubits.

        The point is that if you can reliably get X% effective qubits and that percentage holds regardless of scale, then building an arbitrarily-large quantum computer is just a matter of throwing money at it. To get 3000 effective qubits, you have to build a machine with 300,000 qubits, which would be incredibly expensive. So expensive that probably only a handful of nation-state intelligence agencies could afford to build them... but nation-state intelligence agencies could build them.

        And, of course, technolo

        • Uh, you're assuming linear scaling.

          • Uh, you're assuming linear scaling.

            Yes, obviously.

            Though more generally I'm just assuming that the progress made on error correction in recent years will continue.

    • by tlhIngan ( 30335 )

      They claim to have all those qubits, but it's a lie. If they actually had a 3000 qubit computer they'll be able to show it cracking cryptography. Instead most of the qubits are wasted on "error correction" BS. When will they make one capable of solving any real world applicable problem? Nobody knows. It seems to be up there with cold fusion for fleecing investor money. So far the only thing they show is simulating quantum systems. That's like saying a swimming pool is a computer because it can solve wave me
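The numbers traded back and forth in this thread can be sanity-checked with back-of-the-envelope arithmetic. A hedged sketch: the 2n + 3 logical-qubit count for Shor's algorithm is one common textbook estimate (real resource counts are far higher once error correction and circuit depth are included), and the fixed physical-to-logical ratio is exactly the "linear scaling" assumption one reply calls out.

```python
# Back-of-the-envelope numbers for the thread above. Assumptions (not
# from the article): Shor's algorithm needs roughly 2n + 3 logical
# qubits to factor an n-bit RSA modulus, and the physical-to-logical
# ratio stays fixed as the machine scales ("linear scaling").

PHYSICAL, LOGICAL = 3000, 30  # the roadmap's 2025 figures (~100:1)

def max_rsa_bits(logical: int) -> int:
    """Largest modulus size n (in bits) with 2n + 3 <= logical qubits."""
    return (logical - 3) // 2

def physical_needed(logical_target: int) -> int:
    """Physical qubits needed for a logical target at a fixed ratio."""
    return logical_target * PHYSICAL // LOGICAL

print(max_rsa_bits(30))       # 13: a 13-bit modulus, nowhere near RSA-2048
print(max_rsa_bits(150))      # 73: the same ballpark as the "RSA 50" claim
print(physical_needed(3000))  # 300000: the 300,000-qubit figure in the thread
```

Under these assumptions, both sides of the thread are roughly right: 30 logical qubits crack nothing of interest, and thousands of logical qubits imply a machine two orders of magnitude larger than anything on the road map.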

  • by Mr. Dollar Ton ( 5495648 ) on Tuesday January 09, 2024 @10:40PM (#64145903)

    Normally, this sort of thing should be dismissed as hype. Except

    this is a slashvertisement, so it should be dismissed with prejudice ;)

  • And IBM will own it all. Buy those stock options!

  • by kmoser ( 1469707 ) on Tuesday January 09, 2024 @11:55PM (#64145979)
    Year of the Linux desktop, or year of error correcting quantum computing?
  • They can make their founders rich though because many investors have no clue.

    Actually useful practical QCs are even more remote than they were before. 50 years, 100 years, 1000 years and "never" are all realistic estimates, with "never" looking more and more likely as time goes by and this technology has now delivered nothing except promises for something like 40 years.

  • by Opportunist ( 166417 ) on Wednesday January 10, 2024 @05:26AM (#64146289)

    Now go play and present it when you have something to present; in the meantime, the grownups are gonna continue with their work.
