Nvidia's Huang Says His AI Chips Are Improving Faster Than Moore's Law (techcrunch.com)
Nvidia's AI chips are advancing faster than Moore's Law, the semiconductor industry's historical performance benchmark, according to chief executive Jensen Huang. "Our systems are progressing way faster than Moore's Law," Huang told TechCrunch. Nvidia's chips have improved thousand-fold over the past decade, outpacing Moore's Law's prediction of doubled transistor density every year, Huang said. He adds: We can build the architecture, the chip, the system, the libraries, and the algorithms all at the same time. If you do that, then you can move faster than Moore's Law, because you can innovate across the entire stack.
[...] Moore's Law was so important in the history of computing because it drove down computing costs. The same thing is going to happen with inference where we drive up the performance, and as a result, the cost of inference is going to be less.
Revisionist History In The Making (Score:5, Insightful)
Re: (Score:3)
While GPU complexity and GPU power usage have gone through the roof, the actual GPU rendering performance has only been improving by about 20% a year for the past 5 years. It seems like they're running headfirst into a giant efficiency wall.
Re: (Score:2)
He's not talking about rendering performance.
BTW, a lot of these performance gains are "real, but with a catch". They've gone from FP16 to FP8 to FP4 offerings, and all of those are useful for AI inference (not so much for training). Each halving of precision gets you roughly double the FLOPS. So yeah, you can run inference a lot faster, but it's not exactly an apples-to-apples comparison.
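To put rough numbers on that (a back-of-the-envelope sketch; the 7B parameter count is made up, and numpy only goes down to FP16, so the FP8/FP4 figures are just the same halving continued):

```python
import numpy as np

# Hypothetical 7B-parameter model: each precision step halves the weight
# footprint, which is also roughly why FLOPS double on hardware that can
# pack twice as many narrow values per vector lane.
n_params = 7_000_000_000

for dtype in (np.float32, np.float16):
    size_gb = n_params * np.dtype(dtype).itemsize / 1e9
    print(f"{np.dtype(dtype).name}: {size_gb:.0f} GB of weights")

# FP8 and FP4 continue the pattern (~7 GB and ~3.5 GB here), but they need
# hardware/library support that plain numpy doesn't provide.
```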
Re:Revisionist History In The Making (Score:5, Insightful)
Yeah, I was recently in a situation where an AI approach was being compared with a simulation, and people noted that the AI approach got to "good enough" results much, much faster than the simulation. Then someone tweaked the traditional simulation to use precision similar to what the AI approach ultimately used, and *of course* it ran much, much faster than even the AI approach. Massive speedups through precision reduction.
An indirect boon to some fields is that the "you don't need supreme precision for every use case" mindset may improve efficiency in scenarios that have always demanded FP64 despite not really having inputs or outputs that benefit from that much precision. Which should be no surprise to physicists who know that gravity is 10 m/s^2.
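A toy version of that, just to illustrate (made-up numbers, simple Euler integration, and g = 10 m/s^2 to keep the joke going): for inputs and outputs in this kind of range, single precision already agrees with double precision far better than the model itself deserves.

```python
import numpy as np

def fall_time(height, g, dtype, dt=1e-3):
    """Euler-integrate a dropped object until it hits the ground."""
    h, v, t = dtype(height), dtype(0.0), dtype(0.0)
    g, dt = dtype(g), dtype(dt)
    while h > 0:
        v += g * dt
        h -= v * dt
        t += dt
    return float(t)

t64 = fall_time(100.0, 10.0, np.float64)
t32 = fall_time(100.0, 10.0, np.float32)
# the FP32/FP64 disagreement is tiny next to the "g = 10" approximation itself
print(t64, t32, abs(t64 - t32))
```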
Re: (Score:2)
AI inference is unusual for its tolerance of low precision. But it's the same in brains as well. Brains are very "noisy". Neural networks, however, whether artificial or biological, are extremely forgiving of noise; they tend to converge on solutions rather than diverge.
That doesn't work for training, though. Typical training weight adjustments are like 1e-5 at most, or whatnot.
Re: (Score:2)
the actual GPU rendering performance has only been improving by about 20% a year for the past 5 years. It seems like they're running headfirst into a giant efficiency wall.
A 20% yearly performance improvement means a performance doubling every four years. I wish my 401k was hitting that kind of wall.
Re: (Score:3)
Moore's Law was originally doubling every year, later revised to doubling every two years. The latter is 41%/year -- quite a lot more than the GP's number for GPU rendering performance.
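Both conversions are just compound growth, if anyone wants to check (rough sketch):

```python
import math

# 20%/year compounds to a doubling in ln(2)/ln(1.2) ~ 3.8 years
print(math.log(2) / math.log(1.2))

# a doubling every two years is an annual rate of 2**(1/2) - 1 ~ 41%
print(2 ** (1 / 2) - 1)
```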
Re: (Score:2)
Look Tim, the reality is that most people have never heard of Moore's law. Hell, most people have never even heard of Gordon Moore! Sure, it was on Jeopardy once, but Alex told everyone watching that the law states that "chips" double every 18 months... How the researchers got it that wrong is beyond comprehension.
Of course, even the users here get it wrong. One guy is even quoting, verbatim, the nonsense Google's AI spits out when you search for Moore's Law.
The "law" comes from an article by Gordon Moor
Re: (Score:2)
I think most people have heard of Moore's law.
You can't possibly be serious.
Re: (Score:2)
Double the physical size, multiply the power draw by ten times, get twice the performance, and hey, they are doing so great, right?
Mixing metaphors here... (Score:2)
Moore's Law was originally about price/bit.
Until about 2007, we also saw Dennard scaling [wikipedia.org] which was driving the increase in clock speed.
Barring a really new and exciting technology, the only way inference is going to get faster is massive parallelism. Which is hard, both from an algorithmic point of view and from a chip-cooling one.
Re:Mixing metaphors here... (Score:5, Informative)
Moore's law [wikipedia.org] is the observation that the number of transistors in an integrated circuit doubles about every two years. There's nothing in it about price/bit, or whatever that means.
Re: (Score:3)
Moore's law is the observation that there's an optimum number of components in an IC that yields the lowest price per component, and that this optimum was increasing exponentially.
It's not about performance or the number of transistors on a chip (or about transistors at all, technically); it's about the number of components at the optimum price-per-component point.
Re: (Score:2)
False, as I explained above.
Re: (Score:2)
False, as I explained above.
At least I provided a source for my information.
Re: (Score:2)
Wikipedia is not a good source for precise information. It's a good source for pretty good approximations in non-controversial areas. I don't know whether it's still the case, but actual experts used to be systematically excluded. (You had to be reporting on a publication somewhere else, not reporting your own results.)
Re: (Score:2)
The place Intel would have to horn in on to hurt nVidia would be via MAX and OneAPI which... doesn't seem to be happening in any significant sense.
So are mine! (Score:1)
Buy my stock.
What? 10 years Moore's law would mean 1024 (Score:2)
Re: (Score:2)
Moore's law hasn't been taken to mean a doubling every year for decades.
According to Wikipedia Moore adjusted it to every two years in the mid 70s.
Re: (Score:1)
"Transistors" sounded too woke in the new political climate, so they renamed them "Cistors".
Salesman says salesman things. (Score:2)
When a C-suite exec says something that borders on fantasy, it could just be salesmanship. At some point in the past decade or so, we decided that salesmanship was more important than a grasp on reality. Doesn't matter if what they're selling is real. It's how they sell it that matters! Stock price should continue to climb for another few clicks with that level of salesmanship and that much disconnection from reality. How long can Wall Street operate on daydreams? I guess we're gonna find out!
Re: (Score:2)
It would be nice if $ per flop and watts per flop had decreased as much as performance increased, since a single H200 is about $35,000 and a DGX box packs 8 of those. Nvidia wasn't selling anything at that price scale a decade ago.
Re: (Score:2)
The loonies are in charge of the asylum, and truth is treason in the kingdom of lies.
Corporatism is so cool.
May I be forgiven a little skepticism? (Score:4, Funny)
"Huang says his AI chips are improving faster than Moore's Law"
Yeah, well let me introduce all you ladies out there to my ten-inch Huang.
Huang has lied before (Score:1)
He is lying now. What a piece of shit.
Moore's law is not tied to a specific application (Score:2)
hyperbole (Score:2)
"We can build the architecture the architecture, the chip, the system, the libraries, and the algorithms"
Nice sales pitch, and I'm sure they are making a lot of progress, but if you include the support structures, the chassis, and all the software in your metric, it starts to have little relevance to Moore's Law. They are breaking new ground, and in situations like that there can be very rapid progress at first. Later there will be diminishing returns.
Bullshit (Score:2)
Nvidia's chips have improved thousand-fold over the past decade, outpacing Moore's Law's prediction of doubled transistor density every year, Huang said.
No. 1000 2^10.
Re: (Score:2)
Forgot this was slashdot and I have to type HTML. So here's my second attempt: 1000 < 2^10.
Errrr does Huang know math? (Score:4, Informative)
Nvidia has improved 1000x over 10 years. Better than Moore's law of doubling every year?
Nvidia: x * 1000 = 1000x
Moore's: x * 2^10 = 1024x
Looks like Nvidia is underperforming Moore's law by his own numbers. To say nothing of the fact that performance != transistor density, so they were talking about two different things in the first place. I thought this guy was an engineer.
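Quick sanity check, taking the summary's "thousand-fold over the past decade" at face value:

```python
# implied annual factor for 1000x over 10 years: just shy of doubling per year
print(1000 ** (1 / 10))  # ~ 1.995

# what an actual doubling every year would give over a decade
print(2 ** 10)           # 1024
```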
Re: (Score:2)
I thought this guy was an engineer.
Yes, sarcasm, I get it. And not to mention other public figures: those who wave their arms about, shouting at crowds and making unverified claims, are grifters who enjoy the sound of their own voice and their personal wealth. I can't think of a word for the idiots who listen intently to this drivel. Maybe investors?
Because it's not Moore's law (Score:2)
Does not apply to graphics cards (Score:3)
In 2018, I got a laptop with a 1060Ti graphics card and 6 GB of RAM for under $800. If NVIDIA chip performance has increased 1000x in 10 years across the board, then I would expect a 64x improvement in graphics as well. Mr. Huang, please send me a link where I can buy a laptop with a 64x speed improvement over the 1060Ti for under $800.
Based on benchmarks, the desktop RTX 4090 is 5.6x faster than the GTX 1060Ti, and the RTX 4090 is way more expensive than the GTX 1060Ti was in 2018. The recently announced RTX 5070, priced at $549 (which Huang claims has the speed of a 4090), would match a roughly 6x performance improvement at a similar price to the GTX 1060Ti. So in 7 years, we get a 5.6x improvement. That is way slower than Moore's Law.
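Taking that 5.6x-in-7-years figure at face value, the implied annual rate (rough check, ignoring price entirely):

```python
speedup, years = 5.6, 7
print(speedup ** (1 / years))  # ~ 1.28, i.e. ~28% per year
print(2 ** (1 / 2))            # ~ 1.41/yr, the two-year-doubling pace
```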
Moore's observation (Score:1)
https://www.intel.com/content/... [intel.com]
It's not a law. It's not a principle. It's just a thing people say. NVIDIA won't "beat" a nothing law, but good on them for innovation.
AI isn't real. NVIDIA is a fad. Bitcoin is not an investment.
Stop making up crap and pretending it's science. Or news. It's just crap. Wipe, flush, done.
So is the price... (Score:2)
There was only Wright's Law (Score:2)
Chips didn't follow "Moore's Law"; that was the apparent effect of them following Wright's Law, which is not time-based but volume-based: a certain % cost reduction every time the total volume manufactured doubles. Chips happened to double sales every two years; when sales quit doubling that fast, chips stopped following "Moore's Law".
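For the curious, Wright's Law is just a learning curve: unit cost drops by a fixed fraction each time cumulative volume doubles. A minimal sketch (the 20% learning rate and $100 first-unit cost are made-up illustration numbers, not chip-industry data):

```python
import math

def wright_unit_cost(cumulative_units, first_unit_cost=100.0, learning_rate=0.20):
    """Unit cost after `cumulative_units` have been built in total.

    Each doubling of cumulative volume cuts cost by `learning_rate`,
    so the exponent is log2(1 - learning_rate), which is negative.
    """
    b = math.log2(1 - learning_rate)
    return first_unit_cost * cumulative_units ** b

# If volume happens to double every two years, a time-based decline that
# looks like "Moore's Law" falls out of this purely volume-based rule.
for doublings in range(6):
    units = 2 ** doublings
    print(f"after {units:>2} units: ${wright_unit_cost(units):6.1f} per unit")
```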