Supercomputing

Nvidia CEO: Quantum Computers Won't Be Very Useful for Another 20 Years (pcmag.com) 30

Nvidia CEO Jensen Huang said quantum computers won't be very useful for another 20 years, causing stocks in this emerging sector to plunge more than 40% for a total market value loss of over $8 billion. "If you kind of said 15 years for very useful quantum computers, that'd probably be on the early side. If you said 30, is probably on the late side. But if you picked 20, I think a whole bunch of us would believe it," Huang said during a Q&A with analysts. PCMag reports: The field of quantum computing hasn't gotten nearly as much hype as generative AI and the tech giants promoting it in the past few years. Right now, part of the reason quantum computers aren't currently that helpful is because of their error rates. Nord Quantique CEO Julien Lemyre previously told PCMag that quantum error correction is the future of the field, and his firm is working on a solution. The errors that qubits, the basic unit of information in a quantum machine, currently make result in quantum computers being largely unhelpful. It's an essential hurdle to overcome, but we don't currently know if or when quantum errors will be eliminated.
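
To make the error-rate point above concrete, here is a purely classical, back-of-envelope sketch (not anything PCMag or Nord Quantique published): if every gate operation independently fails with some probability, the odds of a long computation finishing without a single error collapse quickly. The gate counts, error rates, and helper function are made-up illustrative values; real error correction is about driving the effective "logical" error rate far below the physical one.

```python
# Illustrative, purely classical sketch: why per-operation error rates dominate.
# The numbers are made up for illustration; real quantum error correction
# (surface codes and the like) is far more involved than this.

def success_probability(per_gate_error: float, num_gates: int) -> float:
    """Chance a circuit runs error-free if each gate fails independently."""
    return (1.0 - per_gate_error) ** num_gates

# A modest algorithm with a million gate operations:
for p in (1e-3, 1e-6, 1e-9):
    print(f"per-gate error {p:.0e}: "
          f"overall success {success_probability(p, 1_000_000):.3%}")

# Roughly: at ~1e-3 (today's physical qubits) the computation essentially never
# finishes cleanly, while at ~1e-9 (the target for error-corrected "logical"
# qubits) it almost always does -- which is why error correction is the hurdle.
```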

Chris Erven, CEO and co-founder of Kets Quantum, believes quantum computers will eventually pose a significant threat to cybersecurity. "China is making some of the largest investments in quantum computing, pumping in billions of dollars into research and development in the hope of being the first to create a large-scale, cryptographically relevant machine," Erven tells PCMag in a statement. "Although they may be a few years away from being fully operational, we know a quantum computer will be capable of breaking all traditional cyber defenses we currently use. So they, and others, are actively harvesting now, to decrypt later."
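
For context on why "harvest now, decrypt later" matters: the public-key schemes protecting most of today's traffic (RSA, elliptic-curve key exchange) rest on problems such as factoring, which Shor's algorithm would solve efficiently on a large, error-corrected quantum machine. The toy sketch below breaks a deliberately tiny RSA key by classical trial division, standing in for what Shor would do to real 2048-bit keys; the key values and the crack_toy_rsa helper are invented for the example.

```python
# Toy illustration only: a tiny RSA key broken by trial division. Against real
# 2048-bit moduli, factoring is classically infeasible; Shor's algorithm on a
# large fault-tolerant quantum computer is what would change that.

def crack_toy_rsa(n: int, e: int, ciphertext: int) -> int:
    # Factor the public modulus (trivial at this size, hopeless at 2048 bits).
    p = next(d for d in range(2, n) if n % d == 0)
    q = n // p
    # Derive the private exponent from the factors (Python 3.8+ modular inverse).
    d = pow(e, -1, (p - 1) * (q - 1))
    return pow(ciphertext, d, n)

n, e = 3233, 17                  # toy public key: n = 61 * 53
secret = 1234
harvested = pow(secret, e, n)    # ciphertext recorded today
print(crack_toy_rsa(n, e, harvested))  # decrypted "later" -> 1234
```
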
"The 15 to 20-year timeline seems very realistic," said Ivana Delevska, investment chief of Spear Invest, which holds Rigetti and IonQ shares in an actively managed ETF. "That is roughly what it took Nvidia to develop accelerated computing."


Comments Filter:
  • To be powered by helium-3 fusion reactors on Musk's Mars mansion!

  • by taustin ( 171655 ) on Thursday January 09, 2025 @07:35PM (#65076851) Homepage Journal

    "Right now, part of the reason quantum computers aren't currently that helpful is because of their error rates."

    That hasn't slowed AI hype so far.

    Quantum computing was the scam, er, marketing hype from a couple of years ago. Now it's AI. Next year, it will be something else.

    It's hard work coming up with new scams every year or two, but the marks aren't as gullible as we want them to be! They keep catching on!

    • That hasn't slowed AI hype so far

      Slap me a high 6, bro!

    • Hate to break it to you but quantum encryption is already a commercial thing. Just the tip of the iceberg. But go ahead and buggy whip yourself into believing it has nothing to do with you any time soon.

      • It would be a mistake to over-emphasize the similarities between 'quantum encryption' and 'quantum computing'.

        If you are doing quantum key distribution you are exploiting the fact that quantum phenomena are exceptionally delicate to set up optical channels that cannot be tapped or MiTMed without perturbing them. Very handy if you have exceptionally tight requirements and direct fiber or free air paths; but not a lot of comfort to people trying to do quantum computation; and specifically struggling with t
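
A heavily simplified, purely classical simulation of the intercept-and-resend detection idea the parent comment is describing (the BB84 flavor of quantum key distribution). Real QKD involves actual photons, an authenticated classical channel for basis reconciliation, and privacy amplification, none of which is modeled here; the function names and photon counts are invented for the sketch.

```python
import random

# BB84 intuition: measuring in the wrong basis randomizes the bit, so an
# eavesdropper who intercepts and re-sends shows up as ~25% errors in the
# sifted key, while an untapped channel shows ~0%.

def measure(bit, prep_basis, meas_basis):
    # Same basis: faithful readout. Different basis: outcome is a coin flip.
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def sifted_key_error_rate(n_photons=100_000, eavesdropper=False):
    errors = kept = 0
    for _ in range(n_photons):
        bit = random.randint(0, 1)
        basis_alice = random.choice("+x")
        sent_bit, sent_basis = bit, basis_alice
        if eavesdropper:
            basis_eve = random.choice("+x")
            sent_bit = measure(bit, basis_alice, basis_eve)  # Eve measures...
            sent_basis = basis_eve                           # ...and re-sends
        basis_bob = random.choice("+x")
        result = measure(sent_bit, sent_basis, basis_bob)
        if basis_bob == basis_alice:      # keep only matching-basis rounds
            kept += 1
            errors += (result != bit)
    return errors / kept

print(f"no eavesdropper:   {sifted_key_error_rate():.1%}")                   # ~0.0%
print(f"with eavesdropper: {sifted_key_error_rate(eavesdropper=True):.1%}")  # ~25%
```
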
      • by taustin ( 171655 )

        Perhaps you should point that out to the idiot that wrote the article.

        I was merely pointing out that if he applied the same logic to AI hype as he did to quantum computing hype, he wouldn't say such stupid things.

    • by jma05 ( 897351 )

      Not even close.

      AI has unlimited use cases today. Every single one of us will be using AI backed tools and services, whether we choose to or not.

      Neither is a "scam". One has matured to usefulness; the other will too, it just hasn't yet.

      • by taustin ( 171655 )

        Not even close.

        AI has unlimited use cases today.

        I didn't say otherwise. Which you'd know if you'd actually read what I posted. It is, indeed, in widespread use today. That's why we have so many examples of how spectacularly it fails [cio.com] on a regular basis.

        Every single one of us will be using AI backed tools and services, whether we choose to or not.

        That's why it's a problem. It's like Russian roulette. And the more we use it, the fewer empty chambers there are in the revolver. We're rapidly approaching playing Russian roulette with an automatic.

        Neither is a "scam". One has matured to usefulness; the other will too, it just hasn't yet.

        Given how spectacularly, and often, AI fails, no, it hasn't matured to usefulness, no matter how much you ge

        • Not even close.

          AI has unlimited use cases today.

          I didn't say otherwise. Which you'd know if you'd actually read what I posted. It is, indeed, in widespread use today. That's why we have so many examples of how spectacularly it fails [cio.com] on a regular basis.

          Guess what? For the use cases where AI isn't good enough (like AGI, autonomous vehicles and robots), it's not used to generate revenue. For the use cases where AI is already good enough (image recognition, NLP, language translation, medical diagnostics, graphic design, etc.), it has been and will continue to generate a lot of revenue. Go figure.

    • Quantum computers wish they had the error rates of AI. They'd be amazing if you could reduce the error rate to even 99% wrong 1% right, for something simple like factoring a product of two large primes. Each of your qubits literally collapses if you look at it funny, and you need to smack thousands of them around a bunch of logic gates while keeping them quantum.

    • by gtall ( 79522 )

      Quantum AI is starting to become a thing. Google for Bob Coecke; he references what he's working on near the end of

            https://www.youtube.com/watch?... [youtube.com]

      Coecke is a physicist and logician. Dunno if he has any papers yet on quantum AI but I expect they'll be along before long.

  • Ulterior motive? (Score:5, Interesting)

    by Tablizer ( 95088 ) on Thursday January 09, 2025 @08:00PM (#65076905) Journal

    Is it possible he's trying to sabotage the quantum investment market so investors turn instead to AI?

  • by rsilvergun ( 571051 ) on Thursday January 09, 2025 @08:01PM (#65076909)
    While I don't think quantum computers are anywhere near usability, I could see Nvidia wanting to keep investment dollars away from them, if only because it means investors who might have thrown some money at quantum computing would instead throw it at Nvidia.
  • As far as I can see, they have none of the characteristics of what we call a computer. At best, it seems they are a chunk of weird logic that can be attached to a conventional computer.
    • by Jeremi ( 14640 )

      If you limit your definition of "computer" to something with a traditional Von Neumann architecture (with a CPU, RAM, etc), then they certainly aren't that. But the history of computing includes a large number of other architectures as well (mechanical computers, water clocks, even people with notebooks and instructions) and as long as they are able to compute a result, they are, in the broader sense, computers.

    • Somebody hasn't studied the basics. Somebody named up. Maybe consider studying the basics. There's probably a quantum computing for dummies page out there somewhere. If not try asking chatgpt :)

      (And see why Jensen is kinda shitting bricks)

  • by Tough Love ( 215404 ) on Thursday January 09, 2025 @08:44PM (#65076957)

    Self-serving nvidia spin as usual. Of course nvidia is invested up to the eyeballs in classical computing and is not prepared to service the upcoming quantum universe. So this is all about making noise to distract geeks like you and me from the interesting questions. Um, what's that called again? Oh yeah! FUD.

    Bunch of fucking nvidia FUD, nothing more. If you want to know what impact quantum computing is going to have on you and your data center, do not listen to Jensen, whose only interest is preserving nvidia's de facto monopoly over their shitty GPUs and garden variety tensor processors.

    • The one thing that makes it seem more likely that they are at least somewhat sincere is that, while talk (especially in contexts where the SEC can't nab you for being materially misleading) is cheap, Nvidia's acquisition activities are not a secret; they're too publicly traded and antitrust-scrutinized for that; and right now they've got approximately all the money in the world to go shopping while potential future competitors are still cheap and highly speculative; whether it be buying controlling interests or poac
    • by ledow ( 319597 )

      Given that there isn't a single commercial quantum processor in existence, and even if there were, all current technology is huge, clunky, expensive, and basically unavailable commercially, and nVidia deals in everything from consumer to datacentre... it's not really FUD, is it?

      Even if someone came up with a working, useful, practical, stable quantum processor tomorrow... of course it's going to take 10-20 years for you to see ANYTHING AT ALL.

      The current best quantum processor is 1000 qubits (completely useless for a

      • by ledow ( 319597 )

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        For reference if you want to look at the progress.

        Progress is even slower than in traditional computing's origins, when the first electrical computers emerged from secret wartime projects funded with stupendous amounts of money.

        The difference between ENIAC in 1945 and the computing of, say, 1965 - 20 years later - "the DDP-116 is announced at the 1965 Spring Joint Computer Conference. It was the world's first commercial 16-bit minicomputer and 172 systems were sold. The basic computer cost $28,50

  • This seems like a dramatic and unimpressive showing for how smart and well informed the money sloshing around is.

    Not only is Huang a non-obvious candidate for advice on the technical viability of quantum computing (he's not just some finance bro, but electrical engineering and semiconductor design are fairly distinct from the sort of physics that the would-be builders of quantum computers are drawing on); Nvidia is publicly traded and...not exactly...short on cash at the moment, or showing any interest in
  • with all the investment in cracking passwords when authorities insist on having their backdoors anyway.

  • "we know a quantum computer will be capable of breaking all traditional cyber defenses we currently use."
    What does this mean exactly? I'll admit I know next to nothing about quantum computing, but a quick web search seems to suggest that at least some traditional security mechanisms will remain secure in a post-quantum world.
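
The usual textbook picture behind that search result: Shor's algorithm breaks RSA and elliptic-curve cryptography outright, while Grover's search only halves the effective bit-strength of symmetric ciphers and hashes, so symmetric schemes with long keys are expected to survive. The figures and the 100-bit "comfort threshold" in the sketch below are illustrative assumptions for a back-of-envelope comparison, not a real resource estimate.

```python
# Back-of-envelope comparison under common textbook assumptions; illustrative only.
schemes = {
    # name:                (classical security bits, post-quantum security bits)
    "RSA-2048":            (112, 0),    # broken outright by Shor at scale
    "ECC P-256":           (128, 0),    # likewise
    "AES-128":             (128, 64),   # Grover: ~2^64 effective work
    "AES-256":             (256, 128),  # still comfortably out of reach
    "SHA-256 (preimage)":  (256, 128),
}

THRESHOLD = 100  # arbitrary illustrative cutoff for "comfortable" security

for name, (classical, quantum) in schemes.items():
    verdict = "needs replacing" if quantum < THRESHOLD else "still fine"
    print(f"{name:20s} classical ~{classical:3d} bits, "
          f"quantum ~{quantum:3d} bits -> {verdict}")
```
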
  • “I think there is a world market for maybe five computers.”

    Thomas Watson, president of IBM, 1943

"But this one goes to eleven." -- Nigel Tufnel
