Technology

Nvidia's Huang Says His AI Chips Are Improving Faster Than Moore's Law (techcrunch.com) 46

Nvidia's AI chips are advancing faster than Moore's Law, the semiconductor industry's historical performance benchmark, according to chief executive Jensen Huang. "Our systems are progressing way faster than Moore's Law," Huang told TechCrunch. Nvidia's chips have improved thousand-fold over the past decade, outpacing Moore's Law's prediction of doubled transistor density every year, Huang said. He adds: We can build the architecture, the chip, the system, the libraries, and the algorithms all at the same time. If you do that, then you can move faster than Moore's Law, because you can innovate across the entire stack.

[...] Moore's Law was so important in the history of computing because it drove down computing costs. The same thing is going to happen with inference where we drive up the performance, and as a result, the cost of inference is going to be less.


Comments Filter:
  • by toddz ( 697874 ) on Wednesday January 08, 2025 @10:25AM (#65072627)
    Nvidia knows they can't beat Moore's Law so they are just going to redefine it.
    • by leonbev ( 111395 )

      While GPU complexity and GPU power usage have gone through the roof, actual GPU rendering performance has only been improving by about 20% a year for the past 5 years. It seems like they're running headfirst into a giant efficiency wall.

      • by Rei ( 128717 )

        He's not talking about rendering performance.

        BTW, a lot of these performance gains are "real, but with a catch". They've gone from FP16 to FP8 to FP4 offerings. And all of those are useful for AI inference (not so much for training). And each halving of size gets you roughly double the FLOPS. So yeah, you can run inference a lot faster. But it's not exactly an apples to apples comparison.
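
A rough sketch of that "halve the width, double the throughput" trade. The model size and baseline throughput are made-up numbers for illustration, not Nvidia's figures:

```python
# Illustrative only: hypothetical model size and accelerator throughput.
PARAMS = 70e9          # hypothetical 70B-parameter model
BASE_TFLOPS = 1000.0   # hypothetical FP16 throughput of some accelerator

for name, bits in [("FP16", 16), ("FP8", 8), ("FP4", 4)]:
    weights_gb = PARAMS * bits / 8 / 1e9
    speedup = 16 / bits   # roughly 2x FLOPS per halving of width
    print(f"{name}: {weights_gb:6.0f} GB of weights, "
          f"~{BASE_TFLOPS * speedup:5.0f} TFLOPS ({speedup:.0f}x vs FP16)")
```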

        • by Junta ( 36770 ) on Wednesday January 08, 2025 @11:36AM (#65072775)

          Yeah, I was recently in a situation where an AI approach was being compared with a simulation, and people were noting that the AI approach reached "good enough" results much, much faster than the simulation. Then someone tweaked the traditional simulation to use precision similar to what the AI approach ultimately used, and *of course* it then ran much, much faster than even the AI approach. Massive speedups through precision reduction.

          An indirect boon to some fields is that the "you don't need supreme precision for all use cases" mindset may improve efficiency in scenarios that have always demanded things like FP64 exclusively, despite not really having inputs and outputs that benefit from such precision. Which should be no surprise to physicists who know that gravity is 10 m/s^2.
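
A toy version of that precision experiment, standing in for whatever the actual simulation was: the same Jacobi-style smoothing loop timed at two widths. Narrower floats move half the data per step:

```python
import time
import numpy as np

def relax(grid, steps=200):
    # Simple Jacobi-style smoothing: average each cell with its neighbors.
    for _ in range(steps):
        grid = 0.25 * (np.roll(grid, 1, 0) + np.roll(grid, -1, 0) +
                       np.roll(grid, 1, 1) + np.roll(grid, -1, 1))
    return grid

for dtype in (np.float64, np.float32):
    grid = np.random.rand(1000, 1000).astype(dtype)
    t0 = time.perf_counter()
    relax(grid)
    print(f"{np.dtype(dtype).name}: {time.perf_counter() - t0:.2f}s")
```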

          • by Rei ( 128717 )

            AI inference is unusual for its tolerance of low precisions. But it's the same in brains as well. Brains are very "noisy". Neural networks however, whether artificial or biological, are extremely forgiving of noise; they tend to converge on solutions rather than diverge.

            It doesn't work, though, for training. Typical training weight adjustments are tiny, like max 1e-5 or whatnot.
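
A two-line numpy illustration of why updates that small need width, assuming a plain add of an update to a weight:

```python
import numpy as np

# A ~1e-5 update vanishes outright when added in float16 (machine
# epsilon ~9.8e-4), but survives in float32.
print(np.float16(1.0) + np.float16(1e-5) == np.float16(1.0))  # True: lost
print(np.float32(1.0) + np.float32(1e-5) == np.float32(1.0))  # False: kept
```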

      • the actual GPU rendering performance has only been improving by about 20% a year for the past 5 years. It seems like they're running headfirst into a giant efficiency wall.

        A 20% yearly performance improvement means a performance doubling every four years. I wish my 401k was hitting that kind of wall.

        • by Entrope ( 68843 )

          Moore's Law was originally doubling every year, later revised to doubling every two years. The latter is 41%/year -- quite a lot more than the GP's number for GPU rendering performance.
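
Checking those rates numerically (the 20%/yr figure is the GP's estimate):

```python
import math

print(1.20 ** 4)                     # ~2.07: 20%/yr does double in ~4 years
print(2 ** 0.5 - 1)                  # ~0.414: 2-year doubling = ~41%/yr
print(math.log(2) / math.log(1.2))   # ~3.8: years to double at 20%/yr
```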

    • So? The transistor count doesn't matter to users, and people already interpret Moore's law to mean "computers get faster".
      • by narcc ( 412956 )

        Look Tim, the reality is that most people have never heard of Moore's law. Hell, most people have never even heard of Gordon Moore! Sure, it was on Jeopardy once, but Alex told everyone watching that the law states that "chips" double every 18 months... How the researchers got it that wrong is beyond comprehension.

        Of course, even the users here get it wrong. One guy is even quoting, verbatim, the nonsense Google's AI spits out when you search for Moore's Law.

        The "law" comes from an article by Gordon Moore, "Cramming More Components Onto Integrated Circuits," published in Electronics in 1965. What he actually observed was that the number of components per chip at the minimum cost per component had been doubling roughly every year.

        • No, I think most people have heard of Moore's law. Certainly people who care about chip performance have. And actually, Moore's law is much more famous than Gordon Moore himself, so it's no surprise they don't know about Gordon Moore.
    • by Targon ( 17348 )

      Double the physical size, multiply the power draw by ten times, get twice the performance, and hey, they are doing so great, right?

  • Moore's Law was originally about price/bit.
    Until about 2007, we also saw Dennard scaling [wikipedia.org] which was driving the increase in clock speed.

    Barring a really new and exciting technology, the only way inference is going to get faster is through massive parallelism. Which is hard, both algorithmically and in terms of chip cooling.

    • It's not even a "law" in any sense but rather a business slogan.
    • by Pieroxy ( 222434 ) on Wednesday January 08, 2025 @11:19AM (#65072737) Homepage

      Moore's law [wikipedia.org] is the observation that the number of transistors in an integrated circuit doubles about every two years. There's nothing in it about price/bit, or whatever that means.

      • by ceoyoyo ( 59147 )

        Moore's law is the observation that there's an optimum number of components in an IC that provides the lowest price per transistor and that optimum was increasing exponentially.

        It's not about performance or the number of transistors on a chip (or transistors at all, technically), it's about the number of components in the optimum price / component package.

      • by narcc ( 412956 )

        False, as I explained above.

      • by HiThere ( 15173 )

        Wikipedia is not a good source for precise information. It's a good source for pretty good approximations in non-controversial areas. I don't know about currently, but actual experts used to be systematically excluded. (You had to be reporting about a publication somewhere else, not reporting actual results.)

  • Buy my stock.

  • So, they are basically even with Moore's law, even though they have redefined it from transistor count to performance.
    • by AvitarX ( 172628 )

      Moore's law hasn't been taken to mean a doubling every year for decades.

      According to Wikipedia Moore adjusted it to every two years in the mid 70s.

    • by Tablizer ( 95088 )

      "Transistors" sounded too woke in the new political climate, so they renamed them "Cistors".

  • When a C Suite says something that borders on fantasy, it could just be salesmanship. At some point in the past decade or so, we decided that salesmanship was more important than a grasp on reality. Doesn't matter if what they're selling is real. It's how they sell it that matters! Stock price should continue to climb for another few clicks with that level of salesmanship and that much disconnection from reality. How long can Wall Street operate on daydreams? I guess we're gonna find out!

    • The NVidia hardware / software stack has definitely become much, much faster over the last decade; there isn't any doubt about that.

      It would be nice if $ per flop and watts per flop had decreased as much as performance increased, since a single H200 is about $35,000 and a DGX box packs 8 of those. Nvidia wasn't selling anything at that price scale a decade ago.
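
A back-of-envelope dollars-per-TFLOPS calculation from the price quoted above. The throughput number is an assumption for illustration, not a spec sheet figure, and street prices vary:

```python
H200_PRICE_USD = 35_000     # from the comment above
H200_FP16_TFLOPS = 1_000    # assumed dense FP16 tensor throughput

print(f"~${H200_PRICE_USD / H200_FP16_TFLOPS:.0f} per TFLOPS")
print(f"8-GPU box: ${8 * H200_PRICE_USD:,} "
      f"for ~{8 * H200_FP16_TFLOPS:,} TFLOPS")
```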

    • Marketing Defined Reality is a very real thing, and arguing against said Reality can indeed make people see you as crazy.
      The loonies are in charge of the asylum, and truth is treason in the kingdom of lies.

      Corporatism is so cool.
  • by hyades1 ( 1149581 ) <hyades1@hotmail.com> on Wednesday January 08, 2025 @11:12AM (#65072729)

    "Huang says his AI chips are improving faster than Moore's Law"

    Yeah, well let me introduce all you ladies out there to my ten-inch Huang.

    ;-)

  • He is lying now. What a piece of shit.

  • Moore's law can't be both dead and alive; the silicon doesn't know what you're doing with the transistors, after all. He's just trying to backtrack on his "Moore's law is dead" comments, which he used as an excuse to increase prices 400% or more.
  • "We can build the architecture the architecture, the chip, the system, the libraries, and the algorithms"

    Nice sales pitch and I'm sure they are making a lot of progress, but if you include the support structures including the chassis and all the software in your metric it starts to have little relevance to Moore's Law. They are breaking new ground, and in situations like that there can be very rapid progress at first. Later there will be diminishing returns.

  • Nvidia's chips have improved thousand-fold over the past decade, outpacing Moore's Law's prediction of doubled transistor density every year, Huang said.

    No. 1000 < 2^10.

    • by zmooc ( 33175 )

      Forgot this was slashdot and I have to type HTML. So here's my second attempt: 1000 < 2^10.

  • by thegarbz ( 1787294 ) on Wednesday January 08, 2025 @12:35PM (#65072967)

    Nvidia has improved 1000x over 10 years. Better than Moore's law of doubling every year?

    Nvidia: x * 1000 = 1000x
    Moore's: x * 2^10 = 1024x

    Looks like Nvidia is underperforming Moore's law by the numbers. To say nothing of the fact that performance != transistor density, so they were talking about two different things in the first place. I thought this guy was an engineer.
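
The compounding behind the two claims, spelled out:

```python
print(2 ** 10)             # 1024: doubling yearly for a decade
print(1000 ** (1 / 10))    # ~1.995: 1000x/decade is just shy of 2x/yr
```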

    • I thought this guy was an engineer.

      Yes, sarcasm, I get it. And not to mention other public figures: those who wave their arms about, shouting at crowds and making unverified claims, are grifters who enjoy the sound of their own voice and their personal wealth. I can't think of a word for the idiots who listen intently to this drivel. Maybe "investors"?

    • by etash ( 1907284 )
      Doubling every two years. That means 2^5, as there are only 5 two-year periods in a decade.
  • It never was! Law of diminishing returns. Or the whatever-curve, I don't remember the name, but basically: you spend a ton of money inventing something at all, then it takes $100 to make the new product 30% perfect, $1,000 to make it 80%, $10,000 to make it 90%, and $100,000 to make it 99% perfect. It's a new product. You threw a reasonable amount of money at chips that do this new-ish thing and, shocker, it got WAY better. That's how almost all product development works. We learned this in
  • by u19925 ( 613350 ) on Wednesday January 08, 2025 @01:44PM (#65073129)

    In 2018, I got a laptop with a 1060Ti graphics card with 6 GB RAM for under $800. If NVIDIA chip performance has increased 1000x in 10 years across the board, then I would expect a 64x improvement in graphics as well. Mr. Huang, please send me a link where I can buy a laptop with a 64x speed improvement over the 1060Ti for under $800.

    Based on the benchmarks, the desktop RTX 4090 is 5.6x faster than the GTX 1060Ti, and the RTX 4090 is way more expensive than the GTX 1060Ti was in 2018. The recently announced RTX 5070, priced at $549 (the card Huang claims matches the speed of the 4090), works out to about a 6x performance improvement at a similar price to the GTX 1060Ti. So in 7 years, we get roughly a 5.6x improvement. That is way slower than Moore's law.
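
Annualizing those figures for comparison, using the benchmark numbers cited above:

```python
print(5.6 ** (1 / 7) - 1)   # ~0.28: about 28% per year over 7 years
print(2 ** (1 / 2) - 1)     # ~0.41: Moore's law as 2-year doubling
```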

    • Are graphics cards even focused on FPS any more? It's all about the highly parallel AI enablement shit.
  • https://www.intel.com/content/... [intel.com]

    It's not a law. It's not a principle. It's just a thing people say. NVIDIA won't "beat" a nothing law, but good on them for innovation.

    AI isn't real. NVIDIA is a fad. Bitcoin is not an investment.

    Stop making up crap and pretending it's science. Or news. It's just crap. Wipe, flush, done.

  • But not in the right direction.
  • Chips didn't follow "Moore's Law"; that was the apparent effect of them following Wright's Law, which is not time-based but volume-based: a certain % cost reduction every time the total amount manufactured doubles. Chip sales happened to double every two years; when sales quit doubling that fast, chips stopped following "Moore's Law".
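
A minimal sketch of Wright's law with an assumed 20% learning rate (illustrative, not a measured semiconductor figure). If cumulative volume happens to double on a fixed two-year cadence, the resulting cost curve is indistinguishable from a time-based "law":

```python
# Unit cost falls by a fixed fraction with each doubling of cumulative volume.
LEARNING_RATE = 0.20   # assumed: 20% cost drop per doubling

cost, volume = 100.0, 1.0
for _ in range(10):
    volume *= 2
    cost *= 1 - LEARNING_RATE
    print(f"cumulative volume {volume:5.0f}x -> unit cost ${cost:6.2f}")
```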

The only thing cheaper than hardware is talk.
