
Nvidia Takes an Added Role Amid AI Craze: Data-Center Designer (msn.com)

Nvidia dominates the chips at the center of the AI boom. It wants to conquer almost everything else that makes those chips tick, too. From a report: Chief Executive Jensen Huang is increasingly broadening his company's focus -- and seeking to widen its advantage over competitors -- by offering software, data-center design services and networking technology in addition to its powerful silicon brains. More than a supplier of a valuable hardware component, he is trying to build Nvidia into a one-stop shop for all the key elements in the data centers where tools like OpenAI's ChatGPT are created and deployed -- or what he calls "AI factories."

Huang emphasized Nvidia's growing prowess at data-center design following an earnings report Wednesday that exceeded Wall Street forecasts. The report came days after rival AMD agreed to pay nearly $5 billion to buy data-center design and manufacturing company ZT Systems to try to gain ground on Nvidia. "We have the ability fairly uniquely to integrate to design an AI factory because we have all the parts," Huang said in a call with analysts. "It's not possible to come up with a new AI factory every year unless you have all the parts." It is a strategy designed to extend the business success that has made Nvidia one of the world's most valuable companies -- and to insulate it from rivals eager to eat into its AI-chip market share, estimated at more than 80%. Gobbling up more of the value in AI data centers both adds revenue and makes its offerings stickier for customers.

[...] Nvidia is building on the effectiveness of its 17-year-old proprietary software, called CUDA, which enables programmers to use its chips. More recently, Huang has been pushing resources into a superfast networking protocol called InfiniBand, after acquiring the technology's main equipment maker, Mellanox Technologies, five years ago for nearly $7 billion. Analysts estimate that InfiniBand is used in most AI-training deployments. Nvidia is also building a business that supplies AI-optimized Ethernet, a form of networking widely used in traditional data centers. The Ethernet business is expected to generate billions of dollars in revenue within a year, Chief Financial Officer Colette Kress said Wednesday. More broadly, Nvidia sells products including central processors and networking chips for a range of other data-center equipment that is fine-tuned to work seamlessly together.


Comments Filter:
  • by xack ( 5304745 ) on Tuesday September 03, 2024 @09:57AM (#64758814)
    Expand CUDA into a full ISA, and then expand the Nvidia driver into a full kernel and distro. With everything controlled by one central AI unit, it would be the base for more extended systems. Nvidia could have the iPhone of AI if it wanted to.
    • by quonset ( 4839537 ) on Tuesday September 03, 2024 @10:17AM (#64758866)

      Expand CUDA into a full ISA, and then expand the Nvidia driver into a full kernel and distro. With everything controlled by one central AI unit, it would be the base for more extended systems. Nvidia could have the iPhone of AI if it wanted to.

      Odd you should say that. That is exactly what some analysts are saying about the company: it wants to be the next Apple, with both the software and hardware sides. You need AI infrastructure? They have it all in one package, or they can configure it to your needs.

    • AMD ensured that all major AI-related frameworks (e.g., PyTorch) run natively on its hardware.

  • Ok, you know how Elon Musk has lied about having full self-driving ready "later this year" every year since 2016? Well, Nvidia has been doing the exact same thing, except unlike Tesla they have nothing in production to show. What does this have to do with AI? Everything: if Nvidia chips were great at AI, they would have solved FSD. https://www.autoconnectedcar.c... [autoconnectedcar.com]

    How come they get a free pass on that, just because Elon is doing it too?

    • by HBI ( 10338492 ) on Tuesday September 03, 2024 @10:37AM (#64758916)

      I think the reason is that no one really cares. The expectation that cars would drive for you has been by and large dashed for now. Big promises, no real delivery. You still need to pay 100% attention to the road. I mean, if you don't want to die.

      What they do care about now is the current AI hype. When that doesn't pan out, Nvidia's current valuation is going to be a fond memory.

      • Only if you're talking about Tesla.

        Waymo is doing it. This [reddit.com] is what rolling out real self-driving cars looks like. Not a press release. Steady, real-world progress.

    • What has Nvidia lied about? They aren't claiming to have self-driving car capabilities.

      It is well understood what Nvidia boards and APIs do, and they do it well.

        • That's an incorrect prediction about the future, not a dishonest claim about something they are currently selling.

          Now, Tesla has flirted with - and in my opinion crossed - that line. I wouldn't mind a bit if the FTC forced Tesla to stop calling what they are currently selling "full self driving," because it's not.

  • At some point, enough people will realize that LLMs are just glorified, unfunny clowns with no insight, and projects will get canceled en masse. At that point, Nvidia will crash hard because most of its business will vanish from one day to the next.

    • by GoTeam ( 5042081 )
      They're involved with AI and cryptocurrency. Investors get blue balls just hearing those words together. I guess once folks understand that LLMs are basically an algorithm, half of the Nvidia wet dream will suffer.
    • by Calibax ( 151875 )

      People have been saying that Nvidia will crash for about two years, but it hasn't yet. They are sold out for 2024 and well on the way to being sold out for 2025, if you believe the analyst firms whose job is to forecast company performance for the market.

      It's very possible they will crash, but I'm not selling until there's an indication that the competition is killing them or that companies have stopped buying their AI stuff. Until then I'm holding the 2k shares I bought at $4 (actually 200 bought at $40 way

      • by Kartu ( 1490911 )

        Wrong. Two years ago is when ChatGPT (GPT-3.5) was released and the general public was just getting a taste of what LLMs are.

        At this point hundreds of billions are being poured into "AI", with the majority of projects not only having nothing in production but not even seeing a concrete path to it.

        Unless a new LLM-like breakthrough (LLMs surprised even their own developers) happens and all those chips suddenly become useful, this madness will crash.

        • by gweihir ( 88907 )

          Exactly. At the current development stage, the only thing that could really work (on a far smaller scale) is cheap specialist models. But they are 5-10 years away, maybe more, and it is possible that even they will not be good enough.

    • When Nvidia "crashes" they're going to still be in a pretty sweet position. Yeah, they will have to shed a lot of the company, so what? They will still have the best GPU hardware on the planet, and the most experience integrating it with CPU cores. They will have to go back to being... *checks notes* ...the most successful GPU manufacturer on the planet. And they will be sitting on a big pile of cash which they can use to transition to the next big fad.

    • LLMs have value. A lot of hype has been spent using them wrong, and I'm not sure the killer use case has been found yet... but there surely is one. The ability to convert between structured data and natural language, in both directions, can't help but pay off for someone eventually.
    • At some point, enough people will realize that LLMs are just glorified, unfunny clowns with no insight, and projects will get canceled en masse. At that point, Nvidia will crash hard because most of its business will vanish from one day to the next.

      If LLMs are shown definitively not to be useful, then Nvidia's sales and stock will crash. Of course, if anyone could prove now that LLMs are not useful and won't be, that person would be famous and lauded as a genius. History includes technologies that didn't pan out alongside technologies that initially weren't impressive but eventually proved industry-changing. It's obvious that it's not at all obvious which category LLMs belong to.

  • This seems like Google's, Microsoft's, and many other large tech companies' playbook.

    First, you master one vertical (search, desktop OS, graphics/AI chips). Then, you make way too much money with your new pseudo-monopoly.

    But then you have a new problem: all your investors want you to keep growing at the crazy rate you did when you first started cornering the market ... but you can't, because you've already cornered that market.

    So instead, you go into some other vertical, which is at least related to what yo

    • by Kartu ( 1490911 )

      This looks more like wishful thinking from Huang than an actual business.

      Nvidia's stock went down after the recent earnings call, the next-gen AI chip has issues, and even before a possible perfect storm (AMD's MI300X and what comes after it), Nvidia is at 70% of the AI market, with a big flashy "fuck you, Huang" from juggernauts like Apple, which opted for Google's specialized chips.
