
Google Identifies Low Noise 'Phase Transition' In Its Quantum Processor (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: Back in 2019, Google made waves by claiming it had achieved what has been called "quantum supremacy" -- the ability of a quantum computer to perform operations that would take a wildly impractical amount of time to simulate on standard computing hardware. That claim proved to be controversial, in that the operations were little more than a benchmark that involved getting the quantum computer to behave like a quantum computer; separately, improved ideas about how to perform the simulation on a supercomputer cut the time required down significantly.

But Google is back with a new exploration of the benchmark, described in a paper published in Nature on Wednesday. It uses the benchmark to identify what it calls a phase transition in the performance of its quantum processor, and from that to pin down conditions where the processor can operate with low noise. Taking advantage of that, the researchers again show that, even giving classical hardware every potential advantage, it would take a supercomputer a dozen years to simulate things.
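For readers who want to see what is actually being benchmarked: Random Circuit Sampling (RCS) scores a processor by how well its output bitstrings track the ideal output distribution of a random circuit, summarized by the linear cross-entropy fidelity (near 1 for a perfect device, near 0 for pure noise). Below is a toy numpy sketch of that scoring at a laptop-friendly size; the single dense random unitary is a stand-in of ours for Google's layered gate sequence, not the actual circuit.

```python
# Toy Random Circuit Sampling (RCS) + linear cross-entropy benchmarking
# (XEB). The state-vector approach used here needs 2^n amplitudes,
# which is exactly why it becomes hopeless near the paper's 67 qubits.
import numpy as np

rng = np.random.default_rng(0)
n = 4                         # qubits (toy size)
dim = 2 ** n

def random_unitary(d):
    """Haar-random unitary via QR decomposition of a Gaussian matrix."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

# "Run" the circuit: one dense random unitary stands in for the chip's
# layered sequence of one- and two-qubit gates.
state = np.zeros(dim, dtype=complex)
state[0] = 1.0
state = random_unitary(dim) @ state

probs = np.abs(state) ** 2    # ideal output distribution
probs /= probs.sum()          # guard against floating-point drift

# Sample bitstrings the way a noiseless processor would.
samples = rng.choice(dim, size=100_000, p=probs)

# Linear XEB fidelity: F = 2^n * mean(P(sampled x)) - 1.
f_xeb = dim * probs[samples].mean() - 1
print(f"linear XEB fidelity: {f_xeb:.3f}")   # ~1.0 here; noise drags it toward 0
```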

  • Simple request: Just embargo and hold these news stories from general release until quantum computers are available for purchase by large corporations and run reliably for 5 years without millions of dollars spent on maintenance.

    Same goes for the regular hype about human brain -> computer interfaces. A trickle of products over the last 40 years, yet no general-purpose augmentation of a human brain with a computer has happened, yet that general-purpose augmentation is always listed or hinted at in each news article

    • Ah, because new research isn't suitable for a "News for Nerds" site until it's boringly practical. Got it.

      • by Shaitan ( 22585 )

        If it isn't at least practical for even one person with the access and knowledge then it isn't news yet.

        Their approach is so much of a dead end that they went back to find ways to work around the criticism instead of just implementing something else. You only do that when you can't make something else work. I don't know what you call that but it certainly isn't supremacy.

        You'll know you've actually got quantum supremacy when your next move isn't to give a shit about the criticism or even to publish but to i

        • All for the research into quantum computers.

          Just want the repetitious news articles claiming 'it will be any day now' for the last X years on these technologies to stop.

          It's lazy journalism to throw in these lofty promises for years when those promises don't materialize. At what point do you disregard the news stories that put this lofty goal in them when the lofty goal has not been reached or is not really any closer after 5 or more years of hype?

          Different topic: Like how there are lots of programs research or go

    • by gweihir ( 88907 )

      Anyone know the correct term for these "appeal to a grand objective" phrases used in marketing and news articles?

      I think "Big Lie" covers it nicely: https://en.wikipedia.org/wiki/... [wikipedia.org]

    • Ah, so you're a "new battery tech" reader!

    • Don't hate the player!

      Those marketing dudes are just trying to win a Nobel Prize in Physics, like the rest of us!

    • They are never going to sell them.

      They've realized how much money is in being a middleman, so quantum computing, if it works, will be cloud only.

    • Just embargo and hold these news stories from general release until quantum computers are available for purchase by large corporations and run reliably for 5 years without millions of dollars spent on maintenance.

      So we can then bitch about 'corporate lack of transparency.'

  • even giving classical hardware every potential advantage, it would take a supercomputer a dozen years to simulate things.

    What kind of metric is that? Simulate things? This is why Nature is going downhill.

    • by gweihir ( 88907 )

      Indeed. And it is even worse: the lie is built on the fantasy that a QC is somehow simulating itself. That makes absolutely no sense whatsoever, and it is not how you quantify computing power. By that metric, a glass of water has more computing power than the largest supercomputer ever built. It is just not computing power that is useful in any way. Much like "simulating" a QC is not useful for anything.

      At this time, much of the QC field has apparently gone the same way as much of the AI field: If you

  • by gweihir ( 88907 ) on Thursday October 10, 2024 @12:58AM (#64852973)

    Yes, there is (some) research being done. But claims of actually useful computing devices are simply direct and shameless lies, nothing else. At this time it is still not clear whether an actually useful "QC"-type computing device is even possible in this universe. Scalability of these things is so abysmally bad, and stability for longer computations too, that it is quite possible the only thing a QC will ever be better at "simulating" is a QC. That is, of course, just another lie. If you take a modern computer and claim it is a simulation of what its electronics do, you suddenly have a "computing mechanism" that is massively more powerful, and that would be the "fair" comparison. Obviously, it is a comparison that makes no sense at all.

    Hence, nothing to see here, Google just lying a bit more about how great it is.

    • yay - let's sit on our hands and not try advancing humanity then - it might not meet your approval
      • yay - let's sit on our hands and not try advancing humanity then - it might not meet your approval

        Well, that certainly is a garbage take. Congratulations.

        Nobody complained about trying to advance technology.

        The complaint was about lying about it.

        Try learning to read, it helps.

    • by Viol8 ( 599362 )

      I've often wondered how, given that information processing takes a certain amount of energy, quantum computers would seem able to solve complex problems if not for free, then for a tiny amount of energy. Something isn't right.
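For what it's worth, the bound this comment is reaching for is Landauer's principle: only erasing a bit carries a mandatory energy cost, kT ln 2, and idealized quantum computation (like idealized reversible classical computation) erases nothing until readout, which is why "almost free" is not automatically a violation of physics. A quick back-of-the-envelope at room temperature:

```python
# Landauer limit: minimum energy to erase one bit is k_B * T * ln(2).
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K (exact, SI 2019)
T = 300.0                     # room temperature, K

e_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_bit:.2e} J per erased bit")
# ~2.87e-21 J/bit: erasing a billion bits costs on the order of 3 nJ.
```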

    • by HiThere ( 15173 )

      If they can actually get the noise per qubit down enough, then there are definitely applications for which quantum computers would be useful. But I don't think there's any evidence yet that they are even potentially more useful on most common problems.

      That said, they've clearly got promise in the areas of both number factoring and emulation of small interacting molecules. Quite possibly in all the areas covered by quantum physics but not modeled well by Newtonian or relativistic physics.
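On the factoring point: in Shor's algorithm the quantum processor's only job is the order-finding step; everything before and after is ordinary classical arithmetic. A minimal sketch of that classical half, with the quantum subroutine replaced by brute force (function names here are ours, for illustration):

```python
# Classical half of Shor's algorithm. find_period() is a brute-force
# stand-in for the quantum order-finding subroutine -- the one step a
# quantum computer would (in principle) speed up.
from math import gcd

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_with_period(N: int, a: int):
    g = gcd(a, N)
    if g != 1:                 # lucky: a already shares a factor with N
        return g, N // g
    r = find_period(a, N)
    if r % 2 == 1:             # odd period: retry with another a
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:             # trivial square root: retry with another a
        return None
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_with_period(15, 7))   # -> (3, 5)
```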

  • by mattr ( 78516 ) <`moc.ydobelet' `ta' `rttam'> on Thursday October 10, 2024 @05:15AM (#64853253) Homepage Journal

    IANAP but I read the paper. What I got out of it was:
    - The quantum processors are so noisy that there is a struggle to find a use for them; the most likely use, the conclusion says, is certified random number generation
    - Random Circuit Sampling (RCS) is apparently the method people are using to characterize quantum computers
    - The researchers identified a regime in which there is relatively low noise. It seems to match up with a 2023 paper saying the noise-limited boundary coincides with the lower boundary of error correction, which is good.
    - To get this data they used a 67-qubit Sycamore chip, from which they read out 70M random bit strings with 97% fidelity

    All this is fine. Those poor guys. But then uh-oh!
    - They estimate how many FLOPS the computation this chip is supposedly performing would take on a classical computer, based on a supercomputer having taken 15 hours, a year or so ago, to do an RCS simulation. They decide that using the fastest supercomputer in the world it would take 1,000 years, or maybe 12 years if every computer and all its hard disks were used... whatever. And this proves quantum computing is supreme!

    Look, I would have been fine with everything up to the comparison with classical supercomputers. Go for it, guys. But suddenly they compare a chip that is just a very noisy random bit generator to a modern supercomputer. WTF, guys!? IANAQCP, so maybe I am missing something here, but it really does not seem like the chip is doing any computation at all. They are comparing how many random bits they can generate on a chip that is, at the moment, just an expensive random number generator, to a supercomputer running an RCS simulation, which I haven't read that paper, but which I take to be a simulation of a *working* quantum computer chip.

    Maybe all RCS papers try to one-up each other by saying how quantum supreme they are, or the chip manufacturer gave them a chip on the condition that they could tweet the same. Whatever. If RCS is a valid method of investigating quantum noise / error correction, then I would leave it at that and stop comparing "look at how many angles all the snowflakes in the Alps have!" to "look at how much computing power it takes to calculate all those angles! Must be a zillion years! W00T! Quantum Supremacy111!!" How aggravating. Random number generation is not computing, and any comparison of computing power requires actual computation to be involved, not simulated theoretical capability.
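For scale, the standard back-of-the-envelope behind "it would take a supercomputer N years" claims is below; note the Nature paper's own estimate rests on FLOP counts for tensor-network contraction, not this naive state-vector memory figure.

```python
# Memory cost of brute-force state-vector simulation at the paper's size.
n = 67
amps = 2 ** n                    # ~1.5e20 complex amplitudes
zettabytes = amps * 16 / 1e21    # complex128 = 16 bytes each
print(f"state vector for {n} qubits: {zettabytes:.1f} ZB")   # ~2.4 ZB
# Frontier-class supercomputers have on the order of 10 PB of RAM,
# roughly 200,000x too small -- which is why simulators fall back to
# tensor-network contraction and disk, and why the estimates run to years.
```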

    • by tap ( 18562 )

      And to think, those snowflakes in the Alps compute all those angles instantly! How long would it take a supercomputer to generate all those angles? Snowflake Supremacy!!
