Nvidia Takes an Added Role Amid AI Craze: Data-Center Designer (msn.com)
Nvidia dominates the chips at the center of the AI boom. It wants to conquer almost everything else that makes those chips tick, too. From a report: Chief Executive Jensen Huang is increasingly broadening his company's focus -- and seeking to widen its advantage over competitors -- by offering software, data-center design services and networking technology in addition to its powerful silicon brains. More than a supplier of a valuable hardware component, he is trying to build Nvidia into a one-stop shop for all the key elements in the data centers where tools like OpenAI's ChatGPT are created and deployed -- or what he calls "AI factories."
Huang emphasized Nvidia's growing prowess at data-center design following an earnings report Wednesday that exceeded Wall Street forecasts. The report came days after rival AMD agreed to pay nearly $5 billion to buy data-center design and manufacturing company ZT Systems to try to gain ground on Nvidia. "We have the ability fairly uniquely to integrate to design an AI factory because we have all the parts," Huang said in a call with analysts. "It's not possible to come up with a new AI factory every year unless you have all the parts." It is a strategy designed to extend the business success that has made Nvidia one of the world's most valuable companies -- and to insulate it from rivals eager to eat into its AI-chip market share, estimated at more than 80%. Gobbling up more of the value in AI data centers both adds revenue and makes its offerings stickier for customers.
[...] Nvidia is building on the effectiveness of its 17-year-old proprietary software, called CUDA, which enables programmers to use its chips. More recently, Huang has been pushing resources into a superfast networking protocol called InfiniBand, after acquiring the technology's main equipment maker, Mellanox Technologies, five years ago for nearly $7 billion. Analysts estimate that InfiniBand is used in most AI-training deployments. Nvidia is also building a business that supplies AI-optimized Ethernet, a form of networking widely used in traditional data centers. The Ethernet business is expected to generate billions of dollars in revenue within a year, Chief Financial Officer Colette Kress said Wednesday. More broadly, Nvidia sells products including central processors and networking chips for a range of other data-center equipment that is fine-tuned to work seamlessly together.
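The report doesn't show what CUDA programming actually looks like, but as a rough sketch of the kind of GPU code it enables (using Numba's Python CUDA bindings purely as an illustration, not something the article mentions): each thread on the chip computes one element of a data-parallel vector operation.

```python
import numpy as np
from numba import cuda  # Numba's CUDA JIT compiles Python kernels for Nvidia GPUs

@cuda.jit
def saxpy(a, x, y, out):
    # One GPU thread per vector element: the classic data-parallel CUDA pattern.
    i = cuda.grid(1)
    if i < out.size:
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads_per_block = 256
blocks_per_grid = (n + threads_per_block - 1) // threads_per_block
# Numba copies the host arrays to the GPU, launches the kernel, and copies results back.
saxpy[blocks_per_grid, threads_per_block](np.float32(2.0), x, y, out)
print(out[:4])
```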
Make AI a full stack (Score:3)
Re:Make AI a full stack (Score:5, Interesting)
Expand CUDA into a full ISA and then expand the Nvidia driver into a full kernel and distro. With everything controlled by one central AI unit, it would be the base for more extended systems. Nvidia could have the iPhone of AI if they wanted to.
Odd you should say that. That is exactly what some analysts are saying about the company. It wants to be the next Apple. They'll have both the software and the hardware side. You need AI infrastructure? They have it all in one package. Or they can configure it to your needs.
CUDA is not that relevant in AI (Score:1)
AMD has ensured that all major AI-related frameworks (e.g. PyTorch) run natively on its hardware.
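As a hedged illustration of what "runs natively" means here: ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda device API, so device-agnostic code along these lines should run unchanged on either vendor's hardware (the tensor sizes are arbitrary, and CPU is the fallback if no GPU is present).

```python
import torch

# On ROCm builds of PyTorch, torch.cuda reports AMD GPUs; on stock builds, Nvidia GPUs.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
print(f"Running on {device} ({backend} build)")

# A small matmul that executes identically regardless of the GPU vendor.
x = torch.randn(1024, 1024, device=device)
w = torch.randn(1024, 1024, device=device)
y = x @ w
print(y.shape)
```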
Nvidia lies about AI capabilities (Score:1)
OK, you know how Elon Musk has lied about having full self-driving ready "later this year" every year since 2016? Well, Nvidia has been doing the exact same thing, except unlike Tesla it has nothing in production to show. What does this have to do with AI? Everything: if Nvidia chips were great at AI, they would have solved FSD. https://www.autoconnectedcar.c... [autoconnectedcar.com]
How come they get a free pass on that, just because Elon is doing it too?
Re:Nvidia lies about AI capabilities (Score:4, Insightful)
I think the reason why is that no one really cares. The expectation that cars would drive for you has been by and large dashed for now. Big promises, no real delivery. You still need to pay 100% attention to the road. I mean if you don't want to die.
What they do care about now is the current AI hype. When that doesn't pan out, Nvidia's current valuation is going to be a fond memory.
Re: (Score:2)
Waymo is doing it. This [reddit.com] is what rolling out real self-driving cars looks like. Not a press release. Steady, real-world progress.
Re: (Score:2)
It is well understood what Nvidia boards and APIs do, and they do it well.
Re: (Score:2)
Here: https://futurism.com/a-video-c... [futurism.com]
Re: (Score:2)
Now, Tesla has flirted with - and in my opinion crossed - that line. I wouldn't mind a bit if the FTC forced Tesla to stop calling what they are currently selling "full self driving," because it's not.
Sun Microsystems Modular Data Center (Score:4, Informative)
Nvidia is headed for a brutal crash (Score:2)
At some point, enough people will realize that LLMs are just glorified unfunny clowns with no insight, and projects will get cancelled en masse. At that point, Nvidia will crash hard because most of their business will just vanish from one day to the next.
Re: (Score:2)
Re: (Score:2)
People have been saying that Nvidia will crash for about two years. But they haven't yet. They are sold out for 2024 and well on the way to being sold out for 2025 if you believe the analytical companies whose job is to forecast company performance for the market.
It's very possible they will crash, but I'm not selling until there's an indication that the competition is killing them or that companies have stopped buying their AI stuff. Until then I'm holding the 2k shares I bought at $4 (actually 200 bought at $40, way back before the split).
Re: (Score:1)
Wrong. Two years ago is when ChatGPT (GPT-3.5) was released and the general public was just getting its first taste of what an LLM is.
At this point, hundreds of billions are being poured into "AI", with the majority of projects not only having nothing in production but not even seeing any concrete path to it.
Unless another LLM-like breakthrough (LLMs surprised even their own developers) happens and suddenly all those chips become useful, this madness will crash.
Re: (Score:2)
Exactly. At the current development stage, the only thing that could really work (on a far smaller scale) is cheap specialist models. But they are 5-10 years away, maybe more, and it is possible that even they will not be good enough.
Re: (Score:2)
When Nvidia "crashes" they're going to still be in a pretty sweet position. Yeah, they will have to shed a lot of the company, so what? They will still have the best GPU hardware on the planet, and the most experience integrating it with CPU cores. They will have to go back to being... *checks notes* ...the most successful GPU manufacturer on the planet. And they will be sitting on a big pile of cash which they can use to transition to the next big fad.
Re: (Score:2)
Maybe. Or maybe not. A complex organization cannot simply go back to what it did before.
Re: Nvidia is headed for a brutal crash (Score:2)
Sure, but there will be something else for them to move to by then. Plus it's a great excuse for a mass layoff.
Re: (Score:2)
Sure. And if they do that wrong, they will not have a company 10 years later.
Re: Nvidia is headed for a brutal crash (Score:2)
Re: (Score:2)
At some time, enough people will realize that LLMs are just glorified unfunny clowns with no insight and projects will get cancelled en-masse. At that time, Nvidia will crash hard because most of their business will just vanish from one day to the next.
If LLMs are shown definitively to not be useful, then Nvidia's sales and stock will crash. Of course, if anyone could prove now that LLMs are not useful and won't be useful, that person would be famous and lauded as a genius. History includes technologies that didn't pan out alongside technologies that initially weren't that impressive but eventually proved industry-changing. It's obvious that it's not at all obvious whether LLMs belong to the former or the latter.
This Seems Familiar (Score:2)
This seems like Google's, Microsoft's, and many other large tech companies' playbook.
First, you master one vertical (search, desktop OS, graphics/AI chips). Then, you make way too much money with your new pseudo-monopoly.
But then you have a new problem: all your investors want you to keep growing at the crazy rate you did when you first started cornering the market ... but you can't, because you've already cornered that market.
So instead, you go into some other vertical, which is at least related to what you already do.
Re: (Score:1)
This looks more like wishful thinking from Huang than actual business.
His stock went down after the recent earnings call, the next-gen AI chip has issues, and even before a possible perfect storm (AMD's MI300X and whatever comes after it), Nvidia is at 70% of the AI market, with a big flashy "fuck you, Huang" from juggernauts like Apple, which opted for Google's specialized chips.