D-Wave's 5,000-Qubit Quantum Computing Platform Handles 1 Million Variables (venturebeat.com)

D-Wave today launched its next-generation quantum computing platform available via its Leap quantum cloud service. The company calls Advantage "the first quantum computer built for business." In that vein, D-Wave today also debuted Launch, a jump-start program for businesses that want to begin building hybrid quantum applications. From a report: "The Advantage quantum computer is the first quantum computer designed and developed from the ground up to support business applications," D-Wave CEO Alan Baratz told VentureBeat. "We engineered it to be able to deal with large, complex commercial applications and to be able to support the running of those applications in production environments. There is no other quantum computer anywhere in the world that can solve problems at the scale and complexity that this quantum computer can solve problems. It really is the only one that you can run real business applications on. The other quantum computers are primarily prototypes. You can do experimentation, run small proofs of concept, but none of them can support applications at the scale that we can." Quantum computing leverages qubits (unlike bits that can only be in a state of 0 or 1, qubits can also be in a superposition of the two) to perform computations that would be much more difficult, or simply not feasible, for a classical computer. Based in Burnaby, Canada, D-Wave was the first company to sell commercial quantum computers, which are built to use quantum annealing. But D-Wave doesn't sell quantum computers anymore. Advantage and its over 5,000 qubits (up from 2,000 in the company's 2000Q system) are only available via the cloud. (That means through Leap or a partner like Amazon Braket.)
  • by DontBeAMoran ( 4843879 ) on Tuesday September 29, 2020 @11:08AM (#60553992)

    Either the technology is progressing a lot more rapidly than I expected, or I fell asleep for a decade or two. Wasn't there news about 10-100 qubits only a year or two ago?

    • by mustafap ( 452510 ) on Tuesday September 29, 2020 @11:19AM (#60554022) Homepage

      You were asleep. In other updates, wear a mask when you leave the building.

    • Re:5000 qubits?! (Score:5, Interesting)

      by igor.sfiligoi ( 1517549 ) on Tuesday September 29, 2020 @11:28AM (#60554040)

      Either the technology is progressing a lot more rapidly than I expected, or I fell asleep for a decade or two. Wasn't there news about 10-100 qubits only a year or two ago?

      DWave does not build "real quantum computers"... not in the traditional sense.

      Their systems are "Adiabatic Quantum Annealers"... basically a specialized, analog minimization engine... which happens to use quantum effects in the process.
      Still impressive technology, though.
       

      • by ceoyoyo ( 59147 )

        Not sure what "traditional sense" is when it comes to quantum computers. DWave machines, which were first, are special purpose quantum computers. Gate-based quantum computers are also special purpose, but a bit less so than DWave machines. This is pretty much the same evolution as traditional computers took, starting with special purpose machines and evolving more generality. Except with quantum, there are hard limits to how general you can get.

        A DWave machine is analogous to something like a pre-GeForce 3D

        • Not sure what "traditional sense" is when it comes to quantum computers.

          Can it solve 4096-bit RSA in seconds? Can it solve knapsack problems in polynomial time? If so, then it's a quantum computer like we were promised. If not, then it's a rip-off and we want our money back.

          It's like turning up with my flying car and telling me "this is a flying car, but you aren't allowed to go higher than two feet and the maximum speed is 15mph". Sorry, we saw the comics. We know what we want. This isn't it.

          • by ceoyoyo ( 59147 )

            Lol. Fortunately there are people who don't live based on comics. There's very good reason to believe that you will never be able to solve 4096-bit RSA in seconds, and why would you want to? The actually exciting possibilities for quantum computers are things like computational chemistry.

            DWave's machines are quantum annealers. Annealing is an algorithm for solving non-convex optimization problems. Non-convex optimization is one of those basic algorithms that's extremely useful for a great many things. Just like a

              There's very good reason to believe that you will never be able to solve 4096-bit RSA in seconds, and why would you want to?

              I've had enough of my porn; I want to see your porn too.

              But being serious, there is serious profit to be made from breaking 4096 (or even smaller) RSA keys. Much of online banking is protected by this and so you could immediately start transferring money from one account to another. In a sense I hope that this never happens or that we have moved on to better substitutes before it does, however if it does happen I want to know. That's the most important thing - if quantum computers are going to be able to

              • by ceoyoyo ( 59147 )

                If a quantum computer gets remotely close to breaking RSA with a reasonable key size, NIST and the other government standards bodies will remove their approval, and your bank, as well as mine, will switch to something else. A couple of governments that archive everything will have some fun reading old messages, and if it's accessible enough (that's a big if) maybe the FBI will get to read some ten-year-old terrorist IMs. All this has happened before, repeatedly, such as when DES was broken or RSA w

      • And this article suggests that IBM and Google are at about 75 Qubits these days.

        So yes, apples to oranges. "real" quantum computers are in the 10-100 range.

        https://singularityhub.com/202... [singularityhub.com]

      • Sounds a lot like the Wikipedia definition for simulated annealing. I vaguely remember writing a paper on that when I was in college. This is the same thing though (presumably) much faster than simulated annealing done on traditional hardware. Useful, but it’s not the “quantum computer” everyone is talking about, the one that can run Shor’s algorithm for breaking crypto.
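For readers who haven't seen it since college, the classical algorithm the comment refers to fits in a few lines of Python; the cooling schedule, step size, and toy energy function here are illustrative choices, not D-Wave's actual process:

```python
import math
import random

def simulated_annealing(energy, x0, steps=20_000, t0=2.0):
    """Minimize `energy` with a random walk that accepts uphill moves
    with probability exp(-dE/T), while the temperature T cools to zero."""
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9      # linear cooling schedule
        cand = x + random.gauss(0, 0.5)      # propose a nearby state
        de = energy(cand) - e
        if de < 0 or random.random() < math.exp(-de / t):
            x, e = cand, energy(cand)
            if e < best_e:                   # keep the best state seen
                best_x, best_e = x, e
    return best_x

# Non-convex toy energy with two basins; the global minimum is near x = -1.3.
bumpy = lambda x: x**4 - 3 * x**2 + x
```

Because early high-temperature steps accept uphill moves, a run started in the wrong basin can hop the barrier that would trap plain gradient descent; a quantum annealer pursues the same goal in hardware, using quantum effects rather than thermal randomness.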
        • by jythie ( 914043 )
          Apparently, so far, the D-Wave systems have not been performing as well as simulating the same process on conventional hardware. They are claiming 'any day now', but nothing has been independently verified. Their big pitch is trying to get companies to experiment with the technology, so that when they produce something that IS better, the companies that believed in them will be ahead of the game and ready to make use of their tech.
        • This is the same thing though (presumably) much faster than simulated annealing done on traditional hardware

          Quantum computers are slower at all known algorithms they're capable of performing.

          You should never presume a real quantum device is faster at anything than a digital computer, because it has never happened. Whenever quantum means faster, or more capable, it means a mathematician is talking out their ass and pretending their formula is an actual machine. But it isn't, and they have no clue how to engineer such a device.

      • Distilling what seems like BS alerts you cite: curve fitting through iterative processing.

        There's a case for that. Here and there. Maybe. Can't we use AI and neural nets instead?


        • Distilling what seems like BS alerts you cite: curve fitting through iterative processing.

          There's a case for that. Here and there. Maybe. Can't we use AI and neural nets instead?


          First of all there is no “AI”. All of machine learning is just an optimization solve. Second, a neural net is just a parameterized function. It’s no different than any other math function with unknown parameters, like a Taylor or Fourier series. All machine learning does is use a gradient descent method to find a minimum of your loss function; there’s literally nothing intelligent about it.

          A gradient descent method is a local optimization method. It just moves in a direction downhill un

          • by DavenH ( 1065780 )

            First of all there is no “AI”. All of machine learning is just an optimization solve.

            This is simplistic and reductionist. Machine learning is "just" curve fitting, and human brains are "just" chemical potentials spiking. That statement diminishes the power and (potential) intelligence of their emergent structures not a whit.

            All machine learning does is use a gradient descent method to find a minimum of your loss function; there’s literally nothing intelligent about it.

            Intelligence is not defined or limited by the process that created it, simple or complex; it's defined functionally by the results. For example, there's nothing intelligent about evolution, yet there are intelligent organisms.

            • This is simplistic and reductionist. Machine learning is "just" curve fitting, and human brains are "just" chemical potentials spiking. That statement diminishes the power and (potential) intelligence of their emergent structures not a whit.

              Intelligence is not defined or limited by the process that created it, simple or complex; it's defined functionally by the results. For example, there's nothing intelligent about evolution, yet there are intelligent organisms.

              And this is just ignorance of simple math. Stop trying to anthropomorphize it into something it’s fundamentally not.

              • by DavenH ( 1065780 )
                You're going to have to be more precise. I really can't see how your statements map to anything I just said.
                • by jythie ( 914043 )
                  I think the person is trying to argue that since AI uses math instead of ethereal god goo it can never amount to anything, kinda like how since transistors are just tiny switches computers don't really do anything.
              • by keltor ( 99721 ) *
                Can I step in and say that at this stage, AI (and all its subsets like ML) is still currently limited to running on binary computers. Brains are not binary computers, and that does limit what current AI can do. Now can we build an organic computer that can "work" just like the human brain? I think 100% we CAN, and maybe we will. That's just not the situation in 2020.
            • And humor is about the observation of incredulity.

              My BS meter went off the rails when I read the above description.

            • This is simplistic and reductionist. Machine learning is "just" curve fitting, and human brains are "just" chemical potentials spiking.

              Machine learning really is just curve fitting.

              Human brains are much more complicated than your simplification.

              Don't be an idiot with false equivalencies. You're trying to quibble over a subjective characterization. You can only be wrong when you argue with somebody else's subjective characterization. That is not only not a hill to die on, it is a hill only an idiot fights over. And you also ignored the substance.

          • Distilling what seems like BS alerts you cite: curve fitting through iterative processing.

            There's a case for that. Here and there. Maybe. Can't we use AI and neural nets instead?


            First of all there is no “AI”. All of machine learning is just an optimization solve. Second, a neural net is just a parameterized function. It’s no different than any other math function with unknown parameters, like a Taylor or Fourier series. All machine learning does is use a gradient descent method to find a minimum of your loss function; there’s literally nothing intelligent about it.

            A gradient descent method is a local optimization method. It just moves in a direction downhill until it finds a region where the gradient is zero. Here’s the sticking point: gradient descent only finds the global minimum when the function you’re minimizing is convex, that is to say, when there are no local minima. Otherwise it will reach different minima depending on the initial search point in parameter space.

            By contrast, the D-Wave machine evaluates all points in parameter space simultaneously. The annealing reduces all these states to a single state at the global minimum. So even if your optimization has multiple local minima, it will find the global minimum. This is also faster than trying to evaluate the model at every point in space.

            Translation: The D wave machine does the same thing, more slowly, and it can't be optimized for the case where there is only one possible answer; it can only be run, slowly, across the whole range. It is faster at no cases, a little bit slower at the one case it is useful for, and the same speed as that for simple cases, even if you know they're simple.

            A math dildo, basically.
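The initial-point dependence described in the comment above is easy to demonstrate with a minimal gradient-descent loop; the toy function, learning rate, and helper name below are my own choices for illustration:

```python
def gradient_descent(grad, x0, lr=0.01, steps=5_000):
    """Repeatedly step against the gradient until (hopefully) reaching
    a stationary point. Which minimum you land in depends on x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = x^4 - 3x^2 + x is non-convex: a local minimum sits near x = 1.13,
# while the global minimum sits near x = -1.30.
grad_f = lambda x: 4 * x**3 - 6 * x + 1
```

Starting from x0 = 2 it settles into the local minimum near 1.13; starting from x0 = -2 it finds the global one near -1.30. Plain descent gives no warning that the first answer is not the global optimum, which is exactly the gap annealing-style methods try to close.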

        • "training" a neural net IS just curve fitting through iterative processing... and yes, researchers have tried using curve fitting to solve a "metaproblem" about optimizing curve fitting; there are a few papers but it's still in theory-land.

          in theory, quantum annealing could accelerate neural net training but this system is far too small to help very much, even in theory, with an interesting neural net.

          in practice, with this machine i suppose you could work on a few NP-complete problems. however i'm suspicio

          • in theory, quantum annealing could accelerate neural net training

            An odd theory it would be that exists before the conditions necessary to form the hypothesis. ;)

            Quantum annealing can't be a candidate to accelerate neural net training based on current observations. (Those are the observations that have been made. Compare to observations that haven't been made to see the significance)

            If you came up with a hypothesis for how to use this type of parameter minimization to improve your curve-fitting, then you could test that, if it worked, you'd be ready to come up with a theo

        • AI and neural nets, like this device, fail to improve upon the carefully crafted digital algorithms in actual use for solving these problems where they need to be solved.

          This is a tool for theoreticians to throw money at the clouds so they can use a slower, more expensive API. But it makes them sound cool to people who don't know WTF they're talking about.

      • The only part of the bullshit that needs to be detected to see through it is the part about it having "5000 qubits" but then it turns out not to use a gate model.

        It is like claiming an analog transistor has some number of "bits." I'm sure their blah-blah means exactly blah-blah, like they said, but it sure as fuck isn't a "5000 qubit quantum computer." It isn't even a computer. It is like calling any mechanical linkage a "computer," not because it does programmable computations but merely because you can

      • by Motor ( 104119 )

        It's designed to solve optimisation problems. It doesn't use 'gates', but it does use quantum effects, and they believe those speed up the process.

        That's not bullshit. Neither is their engineering - which is top notch stuff.

        It *might* not deliver the fabled quantum supremacy... that's yet to be determined, but only idiots are dismissing DWave as bullshitters.

    Last time I checked, D-Wave advertised only the total number of qubits, not the ones being entangled. E.g. a year back they had "n" 2-qubit pairs, and I believe this machine is along the same lines, having "n" parallel "m"-qubit machines, where n*m = 5000.
    • by gweihir ( 88907 )

      For actual computations, they are still at around 50 effective qubits. You massively lose numbers in error correction. And they simulate stringing longer calculations together today (losing all advantages), hence no actual factorization or the like. Also, 50 qubits lets you factor numbers up to around 16 bits. For comparison, my pocket calculator could factor numbers up to 60 bits long or so 30 years back.

      For the D-Wave, these are not only "raw" qubits, they are in addition not all entangled with each other _and_

  • I've been wondering: what is the largest integer a computer like this can handle? Is it still 64-bit? Or can it handle numbers with millions of digits?

    • The better question is, do we even have use cases for numbers that fall outside the range of 64 bit?

      Also, are we supposed to be impressed by "handles 1 million variables"? After all, variables are stored in RAM; they make it sound as if 1MB of RAM were a breakthrough. Or did they mean 1 million CPU registers, in which case yes, that would be impressive*.

      * I suppose someone will reply that GPUs can access all their RAM as registers.

      • Also are we supposed to be impressed by "handles 1 million variables"?

        Yes, being able to perform a global optimization on a million-dimensional function is game-changing. Attempting this with a brute-force approach would require M^N evaluations, where M is the resolution of your scan and N is the number of parameters. A million dimensions with just 10 scan points in each dimension would require 10^1000000 evaluations.
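The arithmetic in that comment checks out. Since M^N itself is astronomically large, it's easier to compute its number of decimal digits; the helper name below is my own:

```python
import math

def grid_scan_digits(points_per_dim, dims):
    """Decimal digits in M**N, the evaluation count for an exhaustive
    grid scan, computed without ever building the huge integer."""
    return math.floor(dims * math.log10(points_per_dim)) + 1
```

A 10-point-per-dimension scan over a million variables is a count with just over a million digits, which is why nobody proposes brute force and why heuristic global optimizers (annealing included) exist at all.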

        • Yes being able to perform a global optimization on a million dimensional function is game changing.

          Not if you realize that they're just unrolling a loop so that they have to calculate every possible value. And that their "computer" doesn't have memory, at least not that is accessible to the calculation, so obviously you can only calculate everything, you can never respond in the middle of the algorithm by stopping it early.

          You can build a digital computer the same way with an FPGA, and sometimes it is done, but no it is not interesting or game-changing. Their thing is slower than brute-forcing it with tr

          • you can never respond in the middle of the algorithm by stopping it early.

            There’s literally no reason you would ever want to do that.

        • Yes being able to perform a global optimization on a million dimensional function is game changing.

          If you were to attempt this using a brute force of approach would require M^N evaluations where M is the resolution of your scan, and N is the number of parameters.

          "The combination of the number of qubits and the connectivity between those qubits determines how large a problem you can solve natively on the quantum computer," Baratz said. "With the 2,000-qubit processor, we could natively solve problems within the 100- to 200-variable range. With the Advantage quantum computer, having twice as many qubits and twice as much connectivity, we can solve problems more in the 600- to 800-variable range."
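The "variables" being counted are the binary variables of a QUBO (quadratic unconstrained binary optimization) problem, the native input format for D-Wave's annealers. A toy brute-force solver shows what the hardware is searching over; the example Q matrix is made up:

```python
from itertools import product

def solve_qubo(Q):
    """Exhaustively minimize sum Q[i,j]*x[i]*x[j] over binary vectors x.
    Exponential in n, so toy-scale only; an annealer explores this
    space in hardware instead of enumerating it."""
    n = 1 + max(k for pair in Q for k in pair)
    cost = lambda x: sum(c * x[i] * x[j] for (i, j), c in Q.items())
    return min(product([0, 1], repeat=n), key=cost)

# "Pick exactly one of two options": the linear (diagonal) terms reward
# picking each variable, the quadratic term penalizes picking both.
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}
```

Native variable counts are limited by qubit count and connectivity, as Baratz says; the "1 million variables" headline figure refers to larger problems decomposed by D-Wave's hybrid classical/quantum solvers rather than embedded whole.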

      • The better question is, do we even have use cases for numbers that fall outside the range of 64 bit?

        In cryptography, all the time.

      • Real quantum computers don't have memory, they only have I/O, so there are no software variables.

        That's why that bit of the bullshit is in there. They do have variables, but they don't have a computer. It is like building a rope and pulley system, and counting the threads of yarn in the rope as variables.

    • They're not to 1 yet, but they're pretty good at -1.

      (Puns not intended)

    • by gweihir ( 88907 )

      I've been wondering: what is the largest integer a computer like this can handle? Is it still 64-bit? Or can it handle numbers with millions of digits?

      None at all. This thing cannot do regular calculation. It can only do annealing, and even that more slowly than a classical algorithm on a classical computer.

  • When it comes to quantum computing, not telling how many qubits can actually be entangled simultaneously amounts to cheating. Can someone enlighten us as to how many qubits this machine can entangle?

    • by Zak3056 ( 69287 )

      FTFA, it looks like 15.

    • by jythie ( 914043 )
      0. While, as I understand it, their process uses quantum effects in order to solve a problem, it really is not the type of 'computer' people are generally talking about regarding 'quantum computing'.
  • So how does the Universe end, D-Wave? You have the power to find out now, right?
  • ...does it support Full Self-Driving?

  • This is great news. Now you just need a quantum business problem to solve, which doesn't exist.

    • How do you get Theoreticians to throw Real Money at a Fake Cloud running an Imaginary Quantum Non-Computer?

      If the business problem didn't exist, I doubt IBM would be building quantum computers; they build real supercomputers already, they know the difference.

      • by dmay34 ( 6770232 )

        I'm confused. Has it historically been difficult to get theoreticians to throw real money at imaginary things? I think that's kinda their whole job.

        But beyond that, you missed my point. There is no doubt in my mind that IBM and other companies working on quantum computers are working just as hard thinking about and inventing WHOLE NEW NEEDS for that technology that businesses didn't even know they had.

    • Finally a solution looking for a problem!

      I think this may be a solution for extracting more investment money from those with too much of said money. Or sadly, those in control of tax payer money.
    • by gweihir ( 88907 )

      There is exactly one business problem this thing is trying to solve: Keeping the D-Wave scam alive a bit longer.

  • ya.
    being an xbox player.
    i need a speed edge.
    so what is the frames per second with one of these things
  • Call me when a quantum computer can factor RSA-1024.

  • Making this available online only is highly suspicious. Add that everything this thing can do can be done faster on a not-very-powerful classical computer using classical algorithms, and I see a company trying to perpetuate what was a scam from the very beginning.
