Scientific Breakthrough Gives New Hope To Building Quantum Computers (ft.com)

Google has achieved a major breakthrough in quantum error correction that could enable practical quantum computers by 2030, the company announced in a paper published Monday in Nature. The research demonstrated significant error reduction when scaling up from 3x3 to 7x7 grids of quantum bits, with errors dropping by half at each step. The advance addresses quantum computing's core challenge of maintaining stable quantum states, which typically last only microseconds.
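
A rough way to see why "errors dropping by half at each step" matters: in the standard surface-code picture, the logical error rate is suppressed by a roughly constant factor each time the code distance grows by two (3x3 → 5x5 → 7x7 patches). Below is a minimal sketch of that textbook scaling model; the suppression factor and starting error rate are assumed purely for illustration and are not Google's measured numbers.

```python
# Toy model of surface-code error suppression: eps_L(d) = eps_L(3) / LAMBDA**((d - 3) / 2).
# LAMBDA and EPS_D3 are assumed values for illustration, not measured data.
LAMBDA = 2.0      # suppression factor per step ("errors dropping by half")
EPS_D3 = 3e-3     # assumed logical error rate per cycle for a 3x3 (distance-3) patch

def logical_error_rate(d: int) -> float:
    """Logical error rate for an odd code distance d >= 3 under the toy model."""
    return EPS_D3 / LAMBDA ** ((d - 3) // 2)

for d in (3, 5, 7, 11, 25):
    print(f"distance {d:2d} ({d}x{d} patch): ~{logical_error_rate(d):.1e} per cycle")
```

If that suppression factor stays above 1 as the patches grow, the logical error rate falls exponentially with distance, which is why the halving per step is the headline result.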

Google's new quantum chip, manufactured in-house, maintains quantum states for nearly 100 microseconds -- five times longer than previous versions. The company aims to build a full-scale system with about 1 million qubits, projecting costs around $1 billion by decade's end.

IBM, Google's main rival, questioned the scalability of Google's "surface code" error correction approach, claiming it would require billions of qubits. IBM is pursuing an alternative three-dimensional design requiring new connector technology expected by 2026. The breakthrough parallels the first controlled nuclear chain reaction in 1942, according to MIT physics professor William Oliver, who noted that both achievements required years of engineering to realize theoretical predictions from decades earlier.

Further reading: Google: Meet Willow, our state-of-the-art quantum chip.


Comments Filter:
  • by TechyImmigrant ( 175943 ) on Monday December 09, 2024 @01:13PM (#65001441) Homepage Journal

    "IBM, Google's main rival, questioned the scalability of Google's "surface code" error correction approach"

    I believe I questioned the scalability right here on Slashdot, in the discussion of the original surface code paper. The numbers in that paper showed the error correction could not keep a lid on the BER as the number of qubits and iterations increases.

    The text of TFS is underwhelming. "Dropped by half". So you had 10 errors, now you have 5. Your algorithm still doesn't work.

    I haven't read the paper yet - I guess I've got to go and do that now just so I can see why it won't work.

  • Baloney computing (Score:4, Insightful)

    by backslashdot ( 95548 ) on Monday December 09, 2024 @01:17PM (#65001449)

    They still haven't shown they can factorize a number greater than 35. That's the true test of a useful quantum computer: "can it factorize large numbers?" So far the record is the number 35. A human can do that in his head in a few seconds, if not instantly. It means our most powerful quantum computer is pathetic. I'm not saying we ought to give up, but man, we have a long way to go.

    • Re:Baloney computing (Score:4, Informative)

      by LordHighExecutioner ( 4245243 ) on Monday December 09, 2024 @02:14PM (#65001581)
      35 integers ought to be enough for anybody.
    • by jonadab ( 583620 )
      To be practical, a quantum computer needs to have something it can do faster than a contemporary electronic computer. And yes, factoring large numbers is potentially a candidate for such a task, but I think we can be more general than that. To be _practical_ (i.e., faster than a similarly expensive electronic computer) in five years, or even in ten, it would need to _already_ be faster than a bargain-bin electronic computer from thirty years ago.

      So the question I would ask is, is there _anything_ these qu
      • by ceoyoyo ( 59147 )

        Sure. They can produce random numbers quickly with very controllable correlation structures. The current state of the art is competitive with large clusters of conventional computers running the latest algorithms, certainly much, much faster than anything we could do in the 1990s, never mind on a single desktop.

        That doesn't sound all that useful, but it's what you need for doing quantum simulations. The most useful of those is probably quantum chemistry, where we're up to simulating fairly simple molecules
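
As a purely classical toy illustration of what "random bits with a controllable correlation structure" means (not a simulation of a quantum circuit, and not the parent's method), one can threshold correlated Gaussians so that neighbouring bits agree with a tunable probability; the correlation strength rho below is an assumed parameter.

```python
# Classical toy: draw bitstrings whose neighbouring bits are correlated with strength rho.
# This only illustrates the idea of a tunable correlation structure; it is not quantum sampling.
import numpy as np

def correlated_bits(n_samples: int, n_bits: int, rho: float, seed: int = 0) -> np.ndarray:
    """Threshold correlated Gaussians: rho in [0, 0.5) sets the nearest-neighbour correlation."""
    rng = np.random.default_rng(seed)
    cov = np.eye(n_bits) + rho * (np.eye(n_bits, k=1) + np.eye(n_bits, k=-1))
    z = rng.multivariate_normal(np.zeros(n_bits), cov, size=n_samples)
    return (z > 0).astype(int)

bits = correlated_bits(100_000, 8, rho=0.4)
print((bits[:, 0] == bits[:, 1]).mean())   # agreement rises above 0.5 as rho grows
```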

    • I guess it's finally time to replace my AES-6 encryption.

    • by gweihir ( 88907 )

      Yep. Even a slow 4-bit MCU does orders of magnitude better. And that is after 50 years of research. Any other mechanism would have been dropped as a failure long ago. But some people think QCs are magic, so more money is wasted.

    • by tlhIngan ( 30335 )

      I think the Chinese demonstrated factoring a 36 bit number, not 35.

      36 bits is basically a number in the 64 billion range. This is still small enough that a classical computer can brute force fairly easily, or a human can do it with a pencil and paper, and maybe a calculator.

      Of course, the recommended bit length for RSA is 4096, so there's still a ways to go.
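
For scale, here is a quick sketch of the classical brute force described above: trial division factors a ~36-bit semiprime essentially instantly on any laptop. The number below is a made-up example, not the one from the Chinese demonstration.

```python
# Trial division: factor n by testing odd candidates up to sqrt(n).
import math

def trial_division(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n; returns (n, 1) if n is prime."""
    if n % 2 == 0:
        return 2, n // 2
    for p in range(3, math.isqrt(n) + 1, 2):
        if n % p == 0:
            return p, n // p
    return n, 1

n = 65_537 * 786_433                        # example 36-bit semiprime (both factors prime)
print(n.bit_length(), trial_division(n))    # 36 (65537, 786433)
```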

    • by Zarhan ( 415465 )

      What about this?

      https://www.nature.com/article... [nature.com]

      It says 23 bits (8219999 = 32749x251).

      Was there something special about this method that doesn't allow it to be considered a general case?

    • That's the true test of a useful quantum computer, "can it factorize large numbers?"

      It's true that quantum computers will eventually need a killer app that justifies the cost, but it's OK if it's not factorization. Not sure how factorization could even be that killer app. As far as I'm aware, big number factorization is only used for encryption. It would just force people to move on to something else for encryption. More like the Y2K problem than the spreadsheet...

  • by Myria ( 562655 ) on Monday December 09, 2024 @01:28PM (#65001477)

    Wake me when I can factor 1024-bit RSA keys. The Nintendo DSi and I have unfinished business.

    I'm a pessimist, so I'm guessing--with no evidence--that we will find out that keeping N qubits coherent requires energy exponential in N, meaning that quantum computers are mostly useless.

    • by HiThere ( 15173 )

      It shouldn't require energy exponential in n. Just build it beyond Earth's orbit, and use a shade to keep the sun from heating it. You might need a large radiator, but the default temperature would be about 3.5K. (That's a bit warmer than 2.7K which you could get with a more distant orbit.)

      • You do know that quantum computers operate inside dilution refrigerators below 100 mK, right? Oh, and it's not like space is full of radiation or anything...oh, wait...
  • by MooseTick ( 895855 ) on Monday December 09, 2024 @01:32PM (#65001489) Homepage

    Quantum computers have been "just a few years away" for decades, like cold fusion, flying cars, and self-driving cars. I think I'll wait to get my hopes up until there are real, reproducible results.

    Also, the main use I've seen for QC is factoring large numbers, which would break the most popular current encryption methods. I guess that could be useful, except there are already several alternative solutions ready to go as soon as QC is a legitimate threat.

    • It's getting rather late in the game to not believe in self-driving cars [external-preview.redd.it].
      • I think FSD will happen, eventually. Flying cars exist for very special cases. But Musk has been promising INVESTORS and the world for a decade now that FSD will be here "next year". And not even limited FSD: he's stated a person can summon a car across the country and it can find its way to you without any human intervention.

          • They are getting there and will get there. Fact is, he's trying, unlike all the other automakers. Without Elon promising it and pushing it relentlessly, it would have taken Benz 50 years to even try to implement it. It would have taken decades to get there incrementally with basic ADAS features. Thanks to Elon pushing for it, we'll have it. Same thing with robotics. It would have taken Boston Dynamics decades to make a useful humanoid robot, and that's if they hadn't run out of money.

            • Umm, Boston Dynamics have produced a human-like robot, unlike Musk and his farcical demonstration unit, which was never seen again. I suggest you check out Boston Dynamics' YouTube channel.

              As for flying cars, give me a fucking break. They're a cartoon fantasy that, for many reasons, will remain a toy for the rich, just like helicopters are now.

          • Re: (Score:2, Insightful)

            by MooseTick ( 895855 )

            You seem to have a lot of faith in someone who has yet to build anything beyond an electric car. I will give him credit for throwing a lot of money at the problem though.

        • The Musk-mobiles are still confused by shadows: the speckled shadows of my trees routinely turn on my windshield wipers, and my car braked the other day on a Texas highway because I abruptly crossed into deep shade from tall trees adjacent to the freeway. Now I don't even use basic cruise control in the evening, because unnecessary braking in Texas can get you rear-ended or shot.
        • I don't know if Tesla is even on the right track. Musk's big bet on robotaxis to the exclusion of the long-awaited cheap Tesla (and updating the other models) is a MASSIVE risk if you ask me. (That said, the guy has been known to gamble and win.)

          Waymo, on the other hand, is delivering an exponentially growing number of paid driverless rides per month, per the graph I linked to. And the deal they just inked for Miami shows they are open to partnering with other companies for operations, which is scalable.

        • by bill_mcgonigle ( 4333 ) * on Monday December 09, 2024 @03:21PM (#65001719) Homepage Journal

          Some people bought a Model 3 on the promise that "next year" you could rent it out as a robotaxi during the day and have a net negative cost.

          They count on 80% of people forgetting what they hear.

  • The list of authors is a full page of tiny type, all by itself!

    I remember when people complained about Physical Review papers which had 50 authors...

  • Many were claiming that practical QCs would be here by 2030, or even before that, before this breakthrough. Can you say "hype"?
    • by HiThere ( 15173 )

      Yes, and many have been claiming that QCs will turn out to be impossible. My guess is that they'll be possible, but of really limited utility.

      • by gweihir ( 88907 )

        QCs are definitely possible. We have them, but they are tiny. But QCs of useful size are a completely different proposition. The scaling of all known mechanisms (including this new one) is so abysmally bad that they may well never scale high enough to become useful. So far, the scaling seems to be inverse exponential with effort. Unless and until that can at the very least be made linear, QCs are a completely lost cause.

        • by HiThere ( 15173 )

          Yeah, I should have said QCs that are better than a classical computer and are general computers rather than just, e.g., relaxation machines.

          • Quantum computers will never be general computers; that's not what they do.

            They are more like a co-processor. Ideally they will solve certain types of problems quickly, just like a vector processing unit does: not great for general computing, but amazing at what they do.
            • by gweihir ( 88907 )

              For variable ways of "quickly". For the problems they are useful for, if they ever scale high enough, "quickly" can mean weeks, months or years.

              • I have a hypothesis that you can prove that quantum computers can't scale asymptotically. Still haven't worked it out yet.
                • by gweihir ( 88907 )

                  Do you mean "exponentially"? Because that is what conventional computers have done for a long time and seem to have stopped doing in the last 10 years or so.

                  • I mean asymptotically they don't scale better than conventional computers. Of course reducing your c constant can be enough for practical purposes.
                    • by gweihir ( 88907 )

                      I see. Well, my impression is they scale inverse-exponentially. That makes some sense, because all qubits have to be entangled and that gets progressively harder. There is no such requirement with conventional computers. You do have a communication (interconnect) limit there, but it is much, much more benign.

                    • Yeah, but currently, because of error correction, they scale exponentially. Unless this Google thing turns out to be as amazing as the press release.
  • Traditionally, in IT, whenever anybody predicts that anything will be available in ten years, everyone with any experience in such matters mentally corrects the statement to "In ten years, they'll still be saying it'll be available within ten years."

    Do I now need to start doing that when people make five-year predictions as well?
    • by gweihir ( 88907 )

      Looks like it. This prediction here is simply a complete lie, nothing else. The "could" that usually gets thrown in does not make the lie any better.

  • If we assume quantum mechanics is the fundamental theory of reality, then isn't the universe some huge quantum computer? If that is true, how can a computer simulate itself?

    • Re: (Score:3, Insightful)

      by gtall ( 79522 )

      If we assume the Sun and its planets use the fundamental theory of gravity, isn't the solar system some huge analog computer? Hey, this is fun, anyone can make shit up.

    • To simulate the entire universe with perfect accuracy would require a computer of equal complexity.

      However, we tend to simplify the models, limit their scope, and run them at vastly lower resolution than reality.

      This is how we can simulate electrons using electronics.

      • by Stalyn ( 662 )

        >This is how we can simulate electrons using electronics.

        But how well does that scale? Also, one of quantum computers' use cases is exactly that, because classical computers are so bad at it.

        • You can never run a simulation more complex than the simulator. The technology you use is irrelevant, this is a fundamental limit of information.

          "Does that scale" doesn't really apply.

    • by sfcat ( 872532 )
      That's a much bigger assumption than you think. There is good reason to think that QM is just a statistical illusion and something else we don't understand is happening. For example, QM can't explain the two-slit experiment. QM's main strength is that it can make very accurate predictions. Its problem is that clearly there are things happening that we don't understand; otherwise we could make a first-principles version of QM that worked. It's extremely clever math, but there aren't many physicists
    • by gweihir ( 88907 )

      That argument is nonsense. Using a part of a complex system as a computer is not a problem for the complex system. Otherwise you would run into this effect already with electronic computers or even an Abacus.

      That said, the problem with QCs is that they scale abysmally badly, probably inverse-exponentially with effort. That means they will never scale to relevant sizes.

      • by Stalyn ( 662 )

        >That argument is nonsense. Using a part of a complex system as a computer is not a problem for the complex system. Otherwise you would run into this effect already with electronic computers or even an Abacus.

        Actually a complex system talking or referring to itself is indeed a huge problem. See Gödel's Incompleteness and the Halting Problem.

        >Otherwise you would run into this effect already with electronic computers or even an Abacus.

        You do run into these types of problems.. at least at sufficient scal

        • by gweihir ( 88907 )

          >That argument is nonsense. Using a part of a complex system as a computer is not a problem for the complex system. Otherwise you would run into this effect already with electronic computers or even an Abacus.

          Actually a complex system talking or referring to itself is indeed a huge problem. See Gödel's Incompleteness and the Halting Problem.

          These are problems with theories, not with physical reality.

          >Otherwise you would run into this effect already with electronic computers or even an Abacus.

          You do run into these types of problems.. at least at sufficient scale. An Abacus is bad at math where the numbers are larger than the Abacus itself. Electronic computers are bad at quantum chemistry.

          No, you do not. Your argument is complete nonsense. Obviously, any practical QC will be of limited size just as well.

          • by Stalyn ( 662 )

            >These are problems with theories, not with physical reality.

            Theories don't exist in physical reality? The model of a physical system doesn't share the same complexity?

            > Your argument is complete nonsense.

            You don't get the argument to begin with. Anyway we do agree that QCs are BS.

            • by gweihir ( 88907 )

              >These are problems with theories, not with physical reality.

              Theories don't exist in physical reality? The model of a physical system doesn't share the same complexity?

              That is a beginner's question. Obviously, a model never has the complexity of the thing it models. That is the very purpose of a model.

              While theories can be _described_ in physical reality, the problems the theories have do not transfer to physical reality for that reason alone. You need a bit more. And for the two you quoted, incompleteness is a property of formal systems, which do not exist in physical reality and can only be described there. The halting problem assumes a computing mechanism that is

        • by Mal-2 ( 675116 )

          And GPUs are great at massively parallel multiply-and-accumulate, which translates into matrix operations being really fast. This is a large part of why generative AIs are built the way they are, because their workflow decomposes into matrix multiplication, which GPUs are good at. I'm not so convinced that's actually the ideal way to do it, but it's the expedient way because the hardware is already good at that.

          Last I heard, the chatter was about how AI was going to eat quantum computing's lunch in many (ma
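
A minimal sketch of the matrix-multiplication point above: scaled dot-product attention, the core operation in generative models, is essentially two matmuls wrapped around a softmax, which is exactly the workload GPU-style hardware is built for. Shapes and values below are illustrative, not taken from any particular model.

```python
# Scaled dot-product attention decomposes into matrix multiplications.
import numpy as np

def attention(Q: np.ndarray, K: np.ndarray, V: np.ndarray) -> np.ndarray:
    """matmul -> row-wise softmax -> matmul."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])                  # matmul #1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)           # softmax over keys
    return weights @ V                                       # matmul #2

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 64)) for _ in range(3))
print(attention(Q, K, V).shape)   # (8, 64)
```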

    • The quantum computer isn't trying to simulate the universe, it's trying to find prime numbers.
  • Need to sell them right now to some sucker...

  • These things are wayyyyyyyyyyyyyyy too small to be useful. If they are now just wayyyyyyyyyyyyy too small, that does not change much. And no, 2030 is simply a complete lie. Maybe in several hundred years. Or maybe never.
