Microsoft and Quantinuum Say They've Ushered in the Next Era of Quantum Computing (techcrunch.com)
Microsoft and Quantinuum today announced a major breakthrough in quantum error correction. Using Quantinuum's ion-trap hardware and Microsoft's new qubit-virtualization system, the team was able to run more than 14,000 experiments without a single error. From a report: This new system also allowed the team to check the logical qubits and correct any errors it encountered without destroying the logical qubits. This, the two companies say, has now moved the state-of-the-art of quantum computing out of what has typically been dubbed the era of Noisy Intermediate Scale Quantum (NISQ) computers.
"Noisy" because even the smallest changes in the environment can lead a quantum system to essentially become random (or "decohere"), and "intermediate scale" because the current generation of quantum computers is still limited to just over a thousand qubits at best. A qubit is the fundamental unit of computing in quantum systems, analogous to a bit in a classic computer, but each qubit can be in multiple states at the same time and doesn't fall into a specific position until measured, which underlies the potential of quantum to deliver a huge leap in computing power.
It doesn't matter how many qubits you have, though, if you barely have time to run a basic algorithm before the system becomes too noisy to get a useful result -- or any result at all. Combining several different techniques, the team was able to run thousands of experiments with virtually no errors. That involved quite a bit of preparation and pre-selecting systems that already looked to be in good shape for a successful run, but still, that's a massive improvement from where the industry was just a short while ago. Further reading: Microsoft blog.
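The summary's claim that a qubit "can be in multiple states at the same time and doesn't fall into a specific position until measured" can be illustrated with a toy statevector sketch in plain Python. This is a classical simulation of a single idealized qubit, not tied to any real hardware or quantum SDK; the function names are mine:

```python
import random

# A single-qubit state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 == 1. Measurement collapses it to 0 or 1.
def measure(alpha: complex, beta: complex) -> int:
    """Collapse the state: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Equal superposition (what a Hadamard gate produces from |0>):
alpha = beta = 2 ** -0.5

random.seed(0)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1

# Roughly half the shots land in each basis state.
print(counts)
```

Each run collapses the superposition to a definite 0 or 1; only the statistics over many shots reveal the underlying amplitudes, which is why quantum algorithms are repeated thousands of times, as in the experiments described above.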
"Noisy" because even the smallest changes in the environment can lead a quantum system to essentially become random (or "decohere"), and "intermediate scale" because the current generation of quantum computers is still limited to just over a thousand qubits at best. A qubit is the fundamental unit of computing in quantum systems, analogous to a bit in a classic computer, but each qubit can be in multiple states at the same time and doesn't fall into a specific position until measured, which underlies the potential of quantum to deliver a huge leap in computing power.
It doesn't matter how many qubits you have, though, if you barely have time to run a basic algorithm before the system becomes too noisy to get a useful result -- or any result at all. Combining several different techniques, the team was able to run thousands of experiments with virtually no errors. That involved quite a bit of preparation and pre-selecting systems that already looked to be in good shape for a successful run, but still, that's a massive improvement from where the industry was just a short while ago. Further reading: Microsoft blog.
not sure what scares me more (Score:2, Funny)
Re: (Score:2)
Don't worry, Microsoft does not even have a reliable QC on the level of a 30-year-old pocket calculator, and they will not get one anytime soon and may never get one. The whole article is carefully designed to obscure how pathetic QC performance is after 50 years of research.
Re: (Score:1)
Re: (Score:2)
"It's a failure"? Billions invested in the technology by industry leaders & governments, entire industry sectors preparing for it to become a reality: https://www.ncsc.gov.uk/whitep... [ncsc.gov.uk]
Just because the research hasn't provided a commercial product for you yet, doesn't mean the research is a failure.
Re: (Score:3)
Yes, billions; the "harvest now, decrypt later" strategy is concerning and being prepared for.
https://www.cisa.gov/sites/def... [cisa.gov]
Re:not sure what scares me more (Score:4, Interesting)
Did you know the largest number factored by Shor's algorithm is 21?
There's been debate about this; the numbers are higher depending on the method used. For example, https://arxiv.org/pdf/2212.123... [arxiv.org]
"Using this algorithm, we have successfully factorized the integers 1961 (11-bit), 48567227 (26-bit) and 261980999226229 (48-bit), with 3, 5 and 10 qubits in a superconducting quantum processor, respectively. The 48-bit integer, 261980999226229, also refreshes the largest integer factored by a general method in a real quantum device. We proceed by estimating the quantum resources required to factor RSA-2048. We find that a quantum circuit with 372 physical qubits and a depth of thousands is necessary to challenge RSA-2048 even in the simplest 1D-chain system. Such a scale of quantum resources is most likely to be achieved on NISQ devices in the near future."
But that's about all I understand from the research :)
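For a sense of scale: the quoted integers are small enough to factor classically in well under a second. A plain trial-division check (nothing quantum about it) confirms, for instance, that 1961 = 37 × 53:

```python
def smallest_prime_factor(n: int) -> int:
    """Return the smallest prime factor of n > 1 by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime


# Two of the integers quoted from the arXiv paper above:
for n in (1961, 48567227):
    p = smallest_prime_factor(n)
    print(f"{n} = {p} * {n // p}")
# first line printed: 1961 = 37 * 53
```

That a laptop disposes of these instantly is the skeptics' point in this thread; the paper's interest is in the method and its projected scaling, not the numbers themselves.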
Re: (Score:2)
It is essentially lies by misdirection. Shor's algorithm needs about 3x the effective (!) qubits of the bits in the number factorized. The only exceptions are products of primes of very specific form, i.e. the results are not general in any way. Also note that you get much, much fewer effective qubits than physical qubits.
The QC field does have a long history of lying about their capabilities though.
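Taking the parent's ~3x rule of thumb at face value (a back-of-envelope assumption, not a rigorous resource estimate; published estimates vary widely), the gap to breaking RSA-2048 is easy to quantify:

```python
# Back-of-envelope using the parent's ~3x rule of thumb (an assumption,
# not a rigorous resource estimate).
bits = 2048                    # RSA-2048 modulus size
logical_needed = 3 * bits      # ~3 effective qubits per bit, per the parent
logical_demonstrated = 2       # entangled logical qubits in the MS/Quantinuum result

print(logical_needed)          # prints 6144
print(logical_needed // logical_demonstrated)  # prints 3072
```

In other words, even granting the announcement at face value, the demonstrated logical-qubit count is three to four orders of magnitude short of cryptographically relevant sizes.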
Re: (Score:2)
There's been debate about this; the numbers are higher depending on the method used. For example, https://arxiv.org/pdf/2212.123... [arxiv.org]
"Using this algorithm, we have successfully factorized the integers 1961 (11-bit), 48567227 (26-bit) and 261980999226229 (48-bit), with 3, 5 and 10 qubits in a superconducting quantum processor, respectively. The 48-bit integer, 261980999226229, also refreshes the largest integer factored by a general method in a real quantum device.
These results are basically cheating, though. They've been achieved with so-called "adiabatic quantum computing," which is a form of quantum annealing. Note that there is still no proof that quantum annealers and the algorithms which run on them can in principle outperform their classical counterparts. While it has been proven conclusively that a (not existing yet, for all practical purposes) quantum computer running Shor's algorithm does outperform any classical algorithm so thoroughly as to break the underlying encryption.
Re: (Score:2)
The breakthrough is just around the corner, all we need is for you to invest just a little bit more...
Re: (Score:2)
Just because the research hasn't provided a commercial product for you yet, doesn't mean the research is a failure.
That's true. They are a failure because they've been talking about it for 50 years and cannot seem to get past "pocket calc" levels of performance. At some point, after enough delays and hand waving, speculation becomes wishing & talking shit. QC is way over that line in my book.
Re: (Score:2)
Indeed.
Re: (Score:2)
People throwing money at something does not mean it is going anywhere or will ever work. Investors are _dumb_.
Re: (Score:2)
Indeed. The scalability is simply not there, and there are good reasons to think it cannot be there in this physical universe. It will never be more than a party trick.
Re: (Score:1)
Yes, Windows produces Schrodinger Output.
Except it has 3 states: A) Normal cat, B) Mutant cat, C) BSOD
Re: (Score:2)
Nothing more scary than NFT as world's currency.
So which is it? (Score:2)
Were they able to run with no errors, or were they able to correct the errors they encountered without destroying the qubits? It can't be both.
Re: (Score:2, Funny)
Surely it can, it's a superposition!
Re:So which is it? (Score:5, Informative)
The source article is pretty interesting: https://cloudblogs.microsoft.c... [microsoft.com]
From what I understand, it's using a hybrid approach that combines physical qubits into a virtual (logical) qubit to reduce error rates.
This part caught my attention:
"Three fundamental criteria to advance from noisy, intermediate-scale quantum computing to reliable quantum computing are:
1) Achieve a large separation between logical and physical error rates.
2) Correct all individual circuit errors.
3) Generate entanglement between at least two logical qubits.
We have demonstrated, for the first time on record, that all three of the above criteria have been met. For the first criterion, we achieved an 800x improvement in logical error rate compared to the physical error rate. To quantify this 800x improvement, we entangled qubits and performed runtime error diagnostics and error corrections on the measurements (as seen in Figures 1 and 2), thus satisfying the second and third criteria.
In addition to meeting the three criteria above, we have demonstrated several rounds of active syndrome extraction on two logical qubits, which marks the transition to reliable quantum computing. This achievement is a prerequisite for building a hybrid classical-quantum supercomputer that outperforms even the most powerful classical computers."
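The "large separation between logical and physical error rates" in the first criterion can be illustrated with a toy classical analogy: a 3-bit repetition code decoded by majority vote. This is emphatically not Quantinuum's actual error-correcting code (theirs protects quantum states, which classical repetition cannot), but it shows the principle that encoding suppresses the error rate quadratically:

```python
import random

def logical_error_rate(p: float, shots: int, n: int = 3) -> float:
    """Monte Carlo estimate of the logical bit-flip rate for an
    n-bit repetition code decoded by majority vote, given a
    physical flip probability p per bit."""
    errors = 0
    for _ in range(shots):
        flips = sum(random.random() < p for _ in range(n))
        if flips > n // 2:  # majority of bits flipped => decoder fails
            errors += 1
    return errors / shots

random.seed(1)
p_phys = 0.01
p_log = logical_error_rate(p_phys, shots=200_000)
print(p_phys, p_log)  # p_log is roughly 3*p^2, well below p_phys
```

For p = 1%, the logical rate lands near 0.03%, a ~30x suppression from just three bits; real quantum codes with more qubits and repeated syndrome extraction are how the blog's 800x figure becomes plausible.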
I'm just waiting for the AI - quantum crossover (Score:2)
Where the hype will grow so large that it'll form a Bullshit Black Hole with swirling jets of CEO investor promises and SPACs.
Largest number factored? (Score:1)
So what were the factors? Was this a 20-digit number?
I'm still waiting (Score:2)
Have 33 or 35 been factored using Shor's algorithm yet?
C'mon, Microsoft. With your massive qubit numbers and error-free operation, this should be easy-peasy, no?