Google Claims Breakthrough in Quantum Computer Error Correction (ft.com)
Google has claimed a breakthrough in correcting for the errors that are inherent in today's quantum computers, marking an early but potentially significant step in overcoming the biggest technical barrier to a revolutionary new form of computing. From a report: The internet company's findings, which have been published in the journal Nature, mark a "milestone on our journey to build a useful quantum computer," said Hartmut Neven, head of Google's quantum efforts. He called error correction "a necessary rite of passage that any quantum computing technology has to go through."
Quantum computers struggle to produce useful results because the quantum bits, or qubits, they are based on only hold their quantum states for a tiny fraction of a second. That means information encoded in a quantum system is lost before the machine can complete its calculations. Finding a way to correct for the errors this causes is the hardest technical challenge the industry faces. [...] Google's researchers said they had found a way to spread the information being processed in a quantum computer across a number of qubits in a way that meant the system as a whole could retain enough to complete a calculation, even as individual qubits fell out of their quantum states. The research published in Nature pointed to a reduction of only 4 per cent in the error rate as Google scaled up its technique to run on a larger quantum system. However, the researchers said this was the first time that increasing the size of the computer had not also led to a rise in the error rate.
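A rough classical analogy for what the article describes (this is not Google's actual surface-code scheme, just the underlying idea): a repetition code spreads one logical bit across several noisy physical bits and recovers it by majority vote, so the whole survives even when individual bits fail. A minimal Python sketch, with the physical error rate picked arbitrarily for illustration:

import random

def encode(bit, n=5):
    # Spread one logical bit across n physical bits.
    return [bit] * n

def apply_noise(bits, p=0.05):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit if fewer than half flipped.
    return int(sum(bits) > len(bits) / 2)

trials = 100_000
logical_errors = sum(decode(apply_noise(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {logical_errors / trials:.4f}")  # far below the 5% physical rate

Real quantum error correction cannot simply copy qubits (no-cloning), so surface codes spread the information into entangled states and measure parity checks instead, but the trade-off is the same: more redundancy means fewer logical errors, provided the physical errors are rare enough.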
I'm ready for the singularity! (Score:1, Redundant)
Transporters not far off (Score:3)
Seems like Google is trying to invent the Heisenberg Compensator.
If you're not a Trek fan, then just mod me down and move on. :)
Re: (Score:2)
Re: Transporters not far off (Score:1)
Re: (Score:2)
Physics turtle is ashamed for your recognition of fundamental constants.
Re: (Score:2)
Or dead, for that matter.
Or anti-matter...
Don't like my jokes, what's the matter?
Re: (Score:1)
Don't care (Score:2)
Re: (Score:2)
You can make your own NMR/MRI machine with an electromagnet, a permanent magnet or even the Earth's magnetic field. Nobody does, though.
It's unlikely you'll ever be able to make a quantum computer, even a shitty one, without cryogenics and a lot of shielding.
TFS asks the wrong question (Score:2)
4% BER decrease? Pfft, not enough.
>The research published in Nature pointed to a reduction of only 4 per cent in the error rate as Google scaled up its technique to run on a larger quantum system. However, the researchers said this was the first time that increasing the size of the computer had not also led to a rise in the error rate.
So this doesn't address the important bit, which is: "Can the method behind the 4% reduction be extended toward a 100% reduction, or is it constrained in some way?"
Of course "4% reduct
Re:TFS asks the wrong question (Score:4, Informative)
Replying to myself... I skimmed the paper.
>Although our device is close to threshold, reaching algorithmically
>relevant logical error rates with manageable resources will require an
>error-suppression factor Λ_{d/(d+2)} ≫ 1. On the basis of the error budget
>and simulations in Fig. 4, we estimate that component performance
>must improve by at least 20% to move below threshold, and substantially improve beyond that to achieve practical scaling.
So the answer is no. The QECC can't scale to address arbitrary BERs. The device performance has to improve to allow the QECC to correct it.
So the universe is still in alignment. Quantum computers are still BS. The barrier to quantum error correction is still that the noise overwhelms the system in ways that error correction can't fix. This is not a new ECC that gets out of this constraint.
Business as usual.
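For context, the standard surface-code heuristic (a textbook rule of thumb, not a formula from this paper) is that the logical error rate scales roughly as p_L ≈ A * (p/p_th)^((d+1)/2) with code distance d: below threshold (p < p_th) adding qubits suppresses errors exponentially, above threshold adding qubits makes things worse. A rough Python sketch with made-up numbers:

# Textbook surface-code scaling heuristic: p_L ~ A * (p / p_th) ** ((d + 1) / 2)
# The constants below are illustrative, not measurements from the paper.
A = 0.1          # fitting prefactor (assumed)
p_th = 0.01      # threshold physical error rate (order-of-magnitude figure)

def logical_error_rate(p_phys, d):
    return A * (p_phys / p_th) ** ((d + 1) / 2)

for p_phys in (0.02, 0.005):   # above vs. below threshold
    rates = {d: logical_error_rate(p_phys, d) for d in (3, 5, 7)}
    trend = "worse" if rates[7] > rates[3] else "better"
    print(f"p_phys={p_phys}: bigger code gets {trend}: {rates}")

Which is the parent's point: the code only buys you anything once the hardware is already below threshold.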
Re: (Score:3)
You don't need 0% errors from a quantum system. You can verify the answers with a traditional computer. Even if it takes many attempts to get a correct answer, it's still likely to be quicker than trying to do it the traditional way.
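The check-cheaply-and-retry pattern the parent describes, sketched in Python with a stand-in random "solver" in place of a real quantum backend (the factoring instance and the noise model are invented for illustration):

import random

N = 3 * 1_000_003  # a number whose factorization we want (toy example)

def noisy_quantum_factor(n):
    # Stand-in for a noisy quantum subroutine: usually wrong, occasionally right.
    if random.random() < 0.1:
        return 3, n // 3
    return random.randrange(2, n), random.randrange(2, n)

def verify(n, p, q):
    # Classical verification is cheap: just multiply.
    return p * q == n and p > 1 and q > 1

attempts = 0
while True:
    attempts += 1
    p, q = noisy_quantum_factor(N)
    if verify(N, p, q):
        print(f"factored {N} = {p} * {q} after {attempts} attempts")
        break

The catch, as the replies below point out, is that this only works when checking an answer is cheap.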
Re: (Score:2)
Repeating the algorithm until it gives the right answer only works when you can efficiently test that the answer is correct. That covers any problem in NP, for instance.
For algorithms to solve things like the Travelling Salesman Problem, that isn't possible. So while repetition is a form of error correction (called, unsurprisingly, a repetition code), it doesn't cover an important class of problems, and unless your BER is low enough it doesn't even help probabilistically.
As with other fields (like Physically Unclonable Funct
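On the "unless your BER is low enough" point: for a simple repetition scheme with majority vote, the numbers make it concrete (a quick sketch, not tied to any real device):

from math import comb

def majority_vote_error(p, n):
    # Probability that more than half of n independent trials are wrong.
    k_min = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

for p in (0.01, 0.3, 0.6):
    print(p, [round(majority_vote_error(p, n), 4) for n in (3, 5, 9)])
# Below p = 0.5, more repetitions help; above it, they actively make things worse.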
Re: (Score:2)
But do you need to get the best answer to your NP problem, or just one better than all the ones we have now? Or get an answer very quickly and be fairly sure it's at least close to ideal?
Re: (Score:2)
Since I work in cryptography, the discrete log problem is the one that matters, and probabilistic solutions won't cut it.
Potentially a really big deal (Score:4, Informative)
Okay, first off - this is not remotely my area. So I'm sure someone is going to point out some misstatement.
A week or two ago I was listening to a talk where a quantum researcher touched on qubit error correction. I was unaware just how big an issue this is. We always hear announcements about, for instance, IBM and its new 400+ qubit quantum computer... sounds like we're getting close to a practical quantum computer, since we only need a couple thousand qubits, right? But according to this speaker it's much less impressive than it sounds, because the error issues mean you need quite a large number of physical qubits (maybe as high as a million) to get the reliable, usable equivalent of a 1-2K-qubit quantum computer. So it seems we're still a long way from a general-purpose quantum computer (assuming it ever happens at all).
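The speaker's figure is roughly consistent with the common textbook estimate that a distance-d surface code needs on the order of 2*d^2 physical qubits per logical qubit. A back-of-envelope sketch (all numbers are assumptions, none come from the article):

# Back-of-envelope surface-code overhead, using the common ~2*d^2
# physical-qubits-per-logical-qubit rule of thumb. All inputs are assumptions.
def physical_qubits(logical_qubits, code_distance):
    return logical_qubits * 2 * code_distance**2

for d in (11, 17, 25):   # plausible code distances for low logical error rates
    print(f"d={d}: {physical_qubits(1_000, d):,} physical qubits for 1,000 logical")
# d=25 already lands around 1.25 million, in line with the "maybe a million" figure.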
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Quantum Computing Explained (Score:3)
Maybe I'm too much into classical computing (a sequential/structured/OO programmer since 1994) but I.JUST.DON'T.GET the governing principle behind quantum computing. I don't understand how you'd write programs for it, in what kind of languages, or what they'd look like.
Can someone who can relate please recommend a book/article/video/movie/person-to-talk-to to make things clearer for the programmer in me?
Re:Quantum Computing Explained (Score:4, Informative)
https://quantum.country/ [quantum.country]
It's about the best introduction I've found. Combine with this:
https://barghouthi.github.io/2... [github.io]
So you can play with the concepts in written-from-scratch code.
Summary: quantum computing is fairly simple linear algebra. Like any linear algebra, when the matrices get bigger, computations get a lot slower. Quantum computers exploit natural phenomena to do those specific matrix manipulations really fast, just like a classical computer exploits natural phenomena to do arithmetic really fast. If you can map your problem onto that particular set of matrix operations, you're good to go. Theoretically.
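To make the "it's just linear algebra" point concrete, here is a tiny two-qubit simulation in plain numpy (my own sketch, nothing to do with the linked tutorials specifically): states are vectors, gates are unitary matrices, and a circuit is just matrix multiplication.

import numpy as np

# Single-qubit basis state and gates as plain vectors/matrices.
zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# |00> -> (H on the first qubit) -> CNOT gives the Bell state (|00> + |11>) / sqrt(2)
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

probs = np.abs(state) ** 2
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))   # 00 and 11 each at 0.5

A "program" is just a choice of which gates (matrices) to apply in which order; higher-level toolkits such as Qiskit or Cirq wrap exactly this picture, plus hardware back-ends.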
Re: (Score:1)