Supercomputing

'Quantum Computing Has a Hype Problem' (technologyreview.com) 48

"A reputed expert in the quantum computing field puts it in black and white: as of today, quantum computing is a paper tiger, and nobody knows when (if ever) it will become commercially practical," writes Slashdot reader OneHundredAndTen. "In the meantime, the hype continues."

In an opinion piece for MIT Technology Review, Sankar Das Sarma, a "pro-quantum-computing" physicist who has "published more than 100 technical papers on the subject," says he's disturbed by some of the quantum computing hype he sees today, "particularly when it comes to claims about how it will be commercialized." Here's an excerpt from his article: Established applications for quantum computers do exist. The best known is Peter Shor's 1994 theoretical demonstration that a quantum computer can solve the hard problem of finding the prime factors of large numbers exponentially faster than all classical schemes. Prime factorization is at the heart of breaking the universally used RSA-based cryptography, so Shor's factorization scheme immediately attracted the attention of national governments everywhere, leading to considerable quantum-computing research funding. The only problem? Actually making a quantum computer that could do it. That depends on implementing an idea pioneered by Shor and others called quantum-error correction, a process to compensate for the fact that quantum states disappear quickly because of environmental noise (a phenomenon called "decoherence"). In 1994, scientists thought that such error correction would be easy because physics allows it. But in practice, it is extremely difficult.
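
To make the RSA connection concrete, here is a minimal editorial sketch (not from Das Sarma's article): RSA's private key falls out of the public key as soon as the modulus is factored, which is why Shor's exponential speedup for factoring drew so much government attention. The tiny modulus, the trial_factor helper, and all constants below are illustrative assumptions only; the modular inverse via pow(e, -1, phi) needs Python 3.8+.

    # Toy sketch (illustrative only): RSA security rests on the hardness of
    # factoring n = p * q. At toy sizes, brute-force trial division recovers
    # the private key instantly; Shor's algorithm would do the same for real
    # key sizes, given a large fault-tolerant quantum computer.

    def trial_factor(n: int) -> int:
        """Return a nontrivial factor of odd n by trial division (toy sizes only)."""
        f = 3
        while f * f <= n:
            if n % f == 0:
                return f
            f += 2
        return n

    # Deliberately tiny "RSA" key -- far too small to be secure.
    p, q = 1009, 1013
    n = p * q                              # public modulus
    e = 65537                              # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (Python 3.8+)

    # An attacker who can factor n rebuilds the private key from public data alone.
    p_found = trial_factor(n)
    q_found = n // p_found
    d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))

    msg = 42
    cipher = pow(msg, e, n)                    # encrypt with the public key
    assert pow(cipher, d_cracked, n) == msg    # decrypt with the recovered key

At real key sizes (a 2048-bit modulus), trial division and every known classical algorithm become astronomically slow; Shor's algorithm on a fault-tolerant quantum computer would not, which is exactly the threat the excerpt describes.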

The most advanced quantum computers today have dozens of decohering (or "noisy") physical qubits. Building a quantum computer that could crack RSA codes out of such components would require many millions if not billions of qubits. Only tens of thousands of these would be used for computation -- so-called logical qubits; the rest would be needed for error correction, compensating for decoherence. The qubit systems we have today are a tremendous scientific achievement, but they take us no closer to having a quantum computer that can solve a problem that anybody cares about. It is akin to trying to make today's best smartphones using vacuum tubes from the early 1900s. You can put 100 tubes together and establish the principle that if you could somehow get 10 billion of them to work together in a coherent, seamless manner, you could achieve all kinds of miracles. What, however, is missing is the breakthrough of integrated circuits and CPUs leading to smartphones -- it took 60 years of very difficult engineering to go from the invention of transistors to the smartphone with no new physics involved in the process.
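
As a rough sanity check on the excerpt's numbers, here is a hedged back-of-the-envelope sketch (editorial, not from the article). It assumes a surface-code-style layout needing roughly 2*d^2 physical qubits per logical qubit at code distance d, and takes the "tens of thousands" of logical qubits the excerpt cites; the distances and the 20,000 figure are illustrative, and real estimates vary with error rates, runtimes, and other overheads.

    # Editorial back-of-the-envelope estimate of error-correction overhead.
    # Assumption (hedged): a surface-code-style scheme uses ~2 * d**2 physical
    # qubits per logical qubit at code distance d. The distances and the
    # 20,000 logical-qubit figure are illustrative, not taken from the article.

    def physical_qubits(logical_qubits: int, code_distance: int) -> int:
        """Rough total physical-qubit count for a given code distance."""
        return logical_qubits * 2 * code_distance ** 2

    logical = 20_000                    # "tens of thousands" of logical qubits
    for d in (15, 25, 35):              # plausible surface-code distances
        total = physical_qubits(logical, d)
        print(f"distance {d:2d}: ~{total / 1e6:.0f} million physical qubits")

    # distance 15: ~9 million physical qubits
    # distance 25: ~25 million physical qubits
    # distance 35: ~49 million physical qubits

Even under these generous assumptions the count lands in the many millions, against the dozens of noisy physical qubits available today, which is the gap the vacuum-tube analogy is pointing at.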

Comments Filter:
  • by Anonymous Coward
    If you don't hype it you can't have all that cash to burn through while accomplishing nothing but more hype so that you have more cash to burn through.

    Let's face it, government and many investors generally don't have the best judgement.
  • quantum computing is a paper tiger, and nobody knows when (if ever) it will become commercially practical

    I believe that would make it Schroedinger's tiger?

  • Wait 'til he finds out about blockchains and NFTs. Hooo boy.

  • Photons maintain coherence crossing the entire universe; electrons can't compete.

  • by BrendaEM ( 871664 ) on Monday March 28, 2022 @07:35PM (#62398437) Homepage
    Hype is not a stranger to our land.
  • I'm not going to care about quantum computing until I can ssh into a machine that can do programmable quantum computing at a practical scale.

    I'm excited people are looking into it. But personally, not gonna care for a while.

    • by gweihir ( 88907 )

      I am in the same camp, minus the excitement. I had an actual expert explain things to me about 30 years ago. He was spot-on regarding the "progress" to be expected.

      • You should talk to your expert again. The QC world changed a lot in the last year. A lot of us who were ignoring it have realized we maybe can't safely do that any more. https://tech.slashdot.org/comm... [slashdot.org]
        • > You should talk to your expert again. The QC world changed a lot in the last year.

          Not sure how much these kinds of progress in QC change an evaluation of overall progress in QC:

          "... 2021 also saw the first experimental demonstration of fault-tolerant Bacon-Shor code in a single logical qubit of a trapped-ion system, i.e. a demonstration for which the addition of error correction is able to suppress more errors than is introduced by the overhead required to implement the error correction

          • by gweihir ( 88907 )

            Indeed. All that was proven is that _theoretically_ error correction can work. It may still blow up hardware size out of all proportion and make building it completely infeasible in practice, especially if you need, for example, to do a complex calculation with 6k qubits to break 2k RSA. The last number I saw was 50 effective qubits after error correction, I think from IBM, and no statement of how the effort for correction scales with rising numbers of effective qubits. Error correction has a tendency to scale a

        • by gweihir ( 88907 )

          You should talk to your expert again. The QC world changed a lot in the last year. A lot of us who were ignoring it have realized we maybe can't safely do that any more.

          It did not. Marketing got better (i.e., the lies got more convincing), but that is it. Incidentally, I am not ignoring what is happening (as a security expert that would be quite foolish), but I continue to be quite unimpressed. I think the longer this goes on and the more advances are made, the worse the outlook for an actually working and meaningful application becomes. The last year was _no_ exception to that.

    • I'm not going to care about quantum computing until I can ssh into a machine that can do programmable quantum computing at a practical scale.

      I'm excited people are looking into it. But personally, not gonna care for a while.

      I care.

      I build crypto software and crypto security infrastructure for devices that are used by billions of people and need to last years, and in some cases decades. It's unlikely that practical quantum computers capable of breaking in-use asymmetric cryptography will appear in the next 5-10 years, but it is possible, and the probability has increased significantly in the last year.

      Until very recently, quantum computing was still in a state where error correction efficiency fell below the threshold

      • I care... but not that much.

        The biggest problem in crypto is just getting people to care at all - look at the sheer scale of data breaches where internal connections/databases are completely unprotected. Unfortunately, that often means purposely setting the bar as low as we can without scaring off users - for instance, better to have users encrypting with only 3DES than with nothing at all. This is closely coupled to integrating low-level crypto (e.g., AES) into high-level libraries accessible to typical de

        • I'm not concerned about the application layer. Applications are easy to update (relatively, not saying there aren't challenges, just that they're much, much easier to deal with). I'm concerned about system integrity, firmware signing, etc. The signing keys used for firmware for millions to billions of devices can't be changed, last for years to decades, and the low-level software on devices is the foundation for absolutely everything else done on them. Pwn the bootloader (or the ROM!) and the rest of the

      • by godrik ( 1287354 )

        Yeah, I am not in the security business. I am in the optimization and high performance business. And I suppose I have the luxury to ignore quantum computing until it becomes clear it will eventually work.

        At this point, my PhD is 15 years old and the people who were working on that "very exciting technology about to become practical" don't have much to show for it. It is likely I'll retire in 20 years and QC will still not be practical at that time.

  • Everything right now has a hype problem. Nothing is going to change without major change. Between politics, major world outlook and relations, Blockchain, global warming, cancer, artificial intelligence, and on and on... The world is full of talk and hype and no real action. That won't change in any respect without something big happening.
    • by gweihir ( 88907 )

      Quite true. Many people are looking for meaning and game-changers, and quite a few are willing to exploit that to make a dollar. That almost assures stagnation, because real progress is slow and very rarely comes in larger steps.

  • For the kinds of computing typical of most businesses, quantum computing has little to offer. Most business operations are basic integer arithmetic or algebra. For data analysis problems you want massive parallelism with the winner raising their hand first. We've had that for years and better solutions are being added all the time. You need event-based computing tied to parallel data stores that can detect and report events as they happen. You want smart memory that can interact directly with the CPU - whi
  • by ganv ( 881057 ) on Monday March 28, 2022 @08:46PM (#62398541)
    One thing many don't notice is that often in cases of huge hype, the main players are not in it for technological development. The business types want stock prices to become speculative. The career scientists want funding for their research and to make a name for themselves. The news media want something exciting that sells. None of these need the hyped topic of the moment to work in the near future. In fact, they will mostly do just fine if it never works. They will simply switch to the next case of big hype. I wonder whether the hype game will ever stop working, with people realizing that the need to create hype far outstrips the rate of actual new and revolutionary ideas.
  • Transistor -> smartphone is a bad analogy.

    Smartphones are not efficient computing devices. That isn't their purpose and they use transistors very inefficiently. Further, transistors are second-generation computation elements. When the first transistor was invented, there were already computers built out of tens of thousands of vacuum tubes.

    A better comparison is the first vacuum tube to the first vacuum tube computer. ENIAC had 19,000 vacuum tubes. That would be 1904 to 1940 or so depending on w

    • by evanh ( 627108 )

      The transistor was needed before lithographic miniaturisation was possible. Sure, computers worked with valves, but they were primitive as all hell.

      Sankar Das Sarma is saying that, up against modern conventional computing, valve-equivalent quantum computing just won't ever cut it. I.e., he expects it'll need a fundamental revolution first, like the transistor was.

    • by sjames ( 1099 )

      An even better comparison would be from Babbage to an actual practical computer.

      The incredibly slow (by today's standards) and failure prone tube based computers were already a huge advancement over Babbage's theoretical (since only parts of it were ever built) mechanical system.

      As TFA pointed out, the things will need to be about 6 orders of magnitude larger to actually crack RSA.

      Imagine a demo of early smartphone technology where if you combined "only" one million of them together they could make a phone

  • It's only a problem for investors who think they are going to get huge sums of money out of it. It could eventually be a useful tool for certain problems. Maybe there will be a breakthrough in 40 years, who knows.

    I don't see the problem here unless you're an investor.

    • by sjames ( 1099 )

      Put another way, don't worry about quantum computing cracking your keys. You'll be long dead by the time cracking your key will be cheap enough to be worth it, even if it's your bitcoin wallet.

      • I disagree because that seems like it's one of the few applications of quantum computing. Also, being "worth it" is not an issue for governments.

  • My opinion on quantum computing is (1/sqrt(2))( |great idea> + |empty hype> )

  • Building a quantum computer that could crack RSA codes out of such components would require many millions if not billions of qubits. Only tens of thousands of these would be used for computation -- so-called logical qubits; the rest would be needed for error correction, compensating for decoherence.

    Brains and microglia. [youtu.be]

  • Superconducting or ion qubits are a waste of time & resources. They don't scale to the number needed for a useful QC.
  • The technical term is "hype supremacy".

  • Is that people translate "QC can have business applications" into "QC can break cryptography". It is surely more entertaining to talk about super duper sci-fi machines that defeat spies and armies in a sort of futuristic Mr Robot scenario than to talk about the many potential applications in chemistry, material science, logistics, quantum simulation, etc. Stuff like "searching for organic catalysts that improve industrial ammonia production by 0.46%". But the reality is that the latter applications a
  • ... is the new nuclear fusion?
  • The results don't live up to the hype.

    Maybe one day they will. And that day, the price of storage will crash. Because the TLAs of the world will start crunching through their stored encrypted data to decrypt it and find the interesting stuff, freeing up a lot of storage for more productive uses. That'll be more-or-less current levels of storage equipment, not the storage that was relevant at the time the data was stored by the TLA.

  • On the way to finding that quantum transitions are NOT instantaneous, Yale physicists ginned up a way to infer quantum state without directly observing [and therefore collapsing the state function] the state. So the problem of quantum error correction may be solvable, but maybe not for the crop of quantum computers now on laboratory benches. https://quantuminstitute.yale.... [yale.edu]
  • In many designs not all physical qubits are interconnected - correcting for that yields quantum volume. Then, correcting for the % of qubits doing error correction gives algorithmic qubits. Laser-controlled "ion trap" quantum computers get full n x n interconnection without the heat of exponentially many wires. Further, their most recent step from ytterbium to barium ions further decreases added energy by requiring a lower-frequency regulating laser. Estimated state set-up accuracy increases from 99.5% to
