AI Technology

Nvidia Reveals Next-Gen AI Chips, Roadmap Through 2028 (cnbc.com)

Nvidia unveiled its next wave of AI processors at GTC on Tuesday, announcing Blackwell Ultra chips that will ship in the second half of 2025, followed by the Vera Rubin architecture in 2026. CEO Jensen Huang also revealed that the company's 2028 chips will be named after physicist Richard Feynman.

The Blackwell Ultra maintains the same 20 petaflops of AI performance as standard Blackwell chips but increases memory from 192GB to 288GB of HBM3e. Nvidia claims these chips can process 1,000 tokens per second -- ten times faster than its 2022 hardware -- enabling AI reasoning tasks like running DeepSeek-R1 models with 10-second response times versus 1.5 minutes on H100 chips.
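As a quick sanity check on those figures, here is a back-of-envelope sketch in Python; the ~10,000-token response size is inferred from the claimed rate and response time, not stated by Nvidia:

```python
# Rough check of the claims above: 1,000 tokens/sec, "ten times faster"
# than 2022 hardware (the H100), and 10 s vs. 1.5 min response times.
blackwell_ultra_tps = 1_000              # claimed tokens per second
h100_tps = blackwell_ultra_tps / 10      # implies the H100 runs at ~100 tok/s

# A 10-second response at 1,000 tok/s implies a ~10,000-token output (assumption).
response_tokens = blackwell_ultra_tps * 10

h100_seconds = response_tokens / h100_tps
print(f"Same response on an H100: {h100_seconds:.0f} s (~{h100_seconds / 60:.1f} min)")
# Prints ~100 s, i.e. ~1.7 min, roughly matching the quoted "1.5 minutes on H100 chips".
```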

Vera Rubin will deliver a substantial leap to 50 petaflops in 2026, featuring Nvidia's first custom Arm-based CPU design called Olympus. Nvidia is also changing how it counts GPUs -- Rubin itself contains two dies working as one chip. The annual release cadence represents a strategic shift for Nvidia, which previously introduced new architectures every two years before the AI boom transformed its business.


Comments Filter:
  • Nvidia claims these chips can process 1,000 tokens per second -- ten times faster than its 2022 hardware

    So if we go by Nvidia's bullshit performance metrics from their 50-series graphics cards, we can expect around a 10% perf boost?

  • We just can't get graphics cards. Intel put out a pretty nice chip that's supposed to retail for 250 bucks, but the cheapest you're going to find it, unless you get lucky or live near a Micro Center, is 350. I've heard tales of people who paid MSRP for one, but I'm pretty convinced you're more likely to see Bigfoot than an MSRP GPU right now.

    I suspect years from now, if we ever have antitrust law enforcement again here in America, it'll get revealed that there was collusion going on. Like when LCD monitors were so expensive for so long even though technology had advanced so much.
    • by Anonymous Coward
      It isn't collusion, it is basic economics. They make more money selling inflated-price cards to AI vendors at the moment. Gaming cards take up precious fab time; why waste that on GPUs you can get a few grand for when you could be using it to fab GPUs for AI at tens of thousands a pop? Thus the atrocious supply, which allows everyone to jack up prices. Thankfully AMD cards this gen are pretty good for mid-range, so unless you are desperate for the 4K ultra perf of the 5080 and above, there are still decent options ab
    • I suspect years from now, if we ever have antitrust law enforcement again here in America, it'll get revealed that there was collusion going on. Like when LCD monitors were so expensive for so long even though technology had advanced so much.

      My guess is that what we'll find is free-market economics at work. Wafers made into data center chips are worth far more than the same wafers made into gaming chips. I don't know the prices, but I'm guessing the data center chips can be sold for around 10x the price of the gaming chips. That's the problem: every gaming chip Nvidia sells is throwing huge profits away. Yes, there are complaints, some vociferous, about the scarcity and the price of gaming GPUs. However, until AMD poses more of a market sha
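      To make that opportunity-cost argument concrete, here is a toy calculation in Python; the 10x price ratio is the guess above, and the die count and gaming price are made-up round numbers:

```python
# Toy opportunity-cost math for a single wafer; all inputs are illustrative.
dies_per_wafer = 60                       # assumed usable dies per wafer
gaming_asp = 1_000                        # assumed gaming GPU selling price ($)
datacenter_asp = 10 * gaming_asp          # the guessed ~10x data center multiple

gaming_revenue = dies_per_wafer * gaming_asp          # $60,000 per wafer
datacenter_revenue = dies_per_wafer * datacenter_asp  # $600,000 per wafer

print(f"Revenue foregone per gaming wafer: ${datacenter_revenue - gaming_revenue:,}")
# Under these assumptions, every wafer sold as gaming GPUs forgoes ~$540,000.
```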

      • by Holi ( 250190 )

        Most likely Nvidia will eventually get out of the consumer GPU market altogether. None of their advances are really helping in the rasterization department, which has been stagnating; all the new advances are in using AI to upscale.

"You must have an IQ of at least half a million." -- Popeye

Working...