Nvidia's Revenue Triples As AI Chip Boom Continues
Nvidia's fiscal third-quarter results surpassed Wall Street's predictions, with revenue growing 206% year over year. However, Nvidia shares are down after the company called for a negative impact in the next quarter due to export restrictions affecting sales in China and other countries. CNBC reports: Nvidia's revenue grew 206% year over year during the quarter ending Oct. 29, according to a statement. Net income, at $9.24 billion, or $3.71 per share, was up from $680 million, or 27 cents per share, in the same quarter a year ago. The company's data center revenue totaled $14.51 billion, up 279% and more than the StreetAccount consensus of $12.97 billion. Half of the data center revenue came from cloud infrastructure providers such as Amazon, and the other half from consumer internet entities and large companies, Nvidia said. Healthy uptake came from clouds that specialize in renting out GPUs to clients, finance chief Colette Kress said on the call.
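The figures quoted above can be cross-checked against one another: net income divided by EPS gives the implied share count, and the data-center total with its 279% growth rate gives the implied year-ago base. A quick sketch, using only numbers from the summary:

```python
# Cross-check the quarter's figures against each other (all from the summary above).
net_income = 9.24e9          # dollars
eps = 3.71                   # dollars per share
shares = net_income / eps    # implied (diluted) share count
print(f"Implied shares outstanding: {shares / 1e9:.2f}B")  # -> 2.49B

dc_revenue = 14.51e9         # data center revenue, dollars
dc_growth = 2.79             # "up 279%" year over year
dc_year_ago = dc_revenue / (1 + dc_growth)
print(f"Implied year-ago data center revenue: ${dc_year_ago / 1e9:.2f}B")  # -> $3.83B
```

The two implied values are internally consistent with the reported growth percentages; none of this relies on figures outside the summary.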
The gaming segment contributed $2.86 billion, up 81% and higher than the $2.68 billion StreetAccount consensus. With respect to guidance, Nvidia called for $20 billion in revenue for the fiscal fourth quarter. That implies nearly 231% revenue growth. [...] Nvidia faces obstacles, including competition from AMD and lower revenue because of export restrictions that can limit sales of its GPUs in China. But ahead of Tuesday's report, some analysts were nevertheless optimistic.
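The "nearly 231%" figure can be worked backwards from the $20 billion guidance; a sketch using only the numbers quoted above:

```python
# Working backwards: a ~231% increase to $20B implies a year-ago
# fiscal Q4 of roughly $6.04B (20 / 3.31).
guidance = 20e9               # fiscal Q4 revenue guidance, dollars
growth = 2.31                 # "nearly 231%" year-over-year growth
year_ago_q4 = guidance / (1 + growth)
print(f"Implied year-ago Q4 revenue: ${year_ago_q4 / 1e9:.2f}B")  # -> $6.04B
```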
AI is hot (Score:3)
What are some cool buzzwords I can put in my resume? Ideally a sentence or two would be great.
Re: (Score:2)
> What are some cool buzzwords I can put in my resume?
Transformers
Back propagation
Re: AI is hot (Score:2)
Re: (Score:3)
What are some cool buzzwords I can put in my resume? Ideally a sentence or two would be great.
Why not ask ChatGPT or OpenAI for some help?
Re: (Score:1)
What are some cool buzzwords I can put in my resume? Ideally a sentence or two would be great.
Why not ask ChatGPT or OpenAI for some help?
And the moral is: how could you not see this coming? I guess people wanted them to fail, but they had GPUs figured out a long time ago...
Re: (Score:3)
Me> What are some cool AI buzzwords I can put in my resume? Ideally a sentence or two would be great.
ChatGPT>
Certainly! Including relevant AI buzzwords on your resume can help showcase your expertise and catch the attention of employers. Here are some examples:
"Implemented machine learning algorithms for predictive analytics, enhancing decision-making processes and optimizing business outcomes."
"Utilized natural language processing (NLP) techniques to develop a sentiment analysis tool, improving custo
Re: (Score:3)
Makes me want to stand up and yell, "BINGO!"
That is one of the finest buzzword bingo presentations I've ever seen. Finally, ChatGPT has provided value.
Re: (Score:2)
"Machiavelli"
Re: (Score:2)
I guess Nvidia is lucky these days... (Score:3)
I guess Nvidia is lucky these days: they rode two frivolous hype waves in a row, crypto and AI, and they're still riding AI as the crypto wave has faded somewhat.
Re: (Score:2)
AMD makes GPUs but couldn't benefit as much from either crypto or AI.
Intel couldn't do it. They tried to make GPUs but failed to deliver on time.
Re: (Score:2)
Intel has a long history of failing in the gpu market going back to much earlier days. I'm pretty sure they've never once managed to make a competitive high end gpu. Their bottom end stuff was fine for reading email in cheap systems but I'm not talking about the cheap stuff.
When I buy computers I always assume I'll need a discrete video card if it comes only with intel graphics.
Re: (Score:2)
More people look at video rendered by Intel graphics chips than by probably any other chip.
I don't understand how they can't figure out how to make GPUs in a world where there is massive demand for them. They have the foundries, they've gobbled up so many AI chip companies, they have access to so much talent and money, but they can't seem to execute!
Re: (Score:2)
Lack of vision and bean-counting management?
I have no other explanation. I agree they have everything going for them but have never managed to make a worthwhile GPU.
Re: I guess Nvidia is lucky these days... (Score:2, Insightful)
They're not lucky, they're competent.
CUDA is the reason everyone uses Nvidia for GPGPU.
Maybe if Intel or AMD could do a driver or an API worth a fuck they would be dominating this market instead.
Re: (Score:3)
Intel can't do gpu hardware much less drivers and apis.
My AMD GPUs were great when they worked, but it was night and day when I switched to Nvidia. The occasional glitches in certain software just went away.
I'd never go back even if the amd equivalent was a lot cheaper.
Re: (Score:2)
Maybe if Intel or AMD could do a driver or an API worth a fuck they would be dominating this market instead.
That wouldn't be enough, there's too much lock in and too much existing investment in CUDA.
Using OpenCL or vulkan-compute you can already do cross platform gpgpu programming, quite well I might add.
From-scratch projects are easy; it's integrating existing work that's an issue. It's not insurmountable, but there's enough friction to make it not worth people's while.
Nvidia got in bed with academia pretty early with cuda, so there's whole decades of university graduates of the engineering type that don't know any b
Re: (Score:1)
That wouldn't be enough, there's too much lock in and too much existing investment in CUDA.
How would they or anyone know? They've never put together anything as good as CUDA and then ridden it out for long enough for it to catch on.
Re: (Score:2)
If the point of CUDA is to expose hardware capabilities, OpenCL has been pretty great since 1.1/1.2 over 12 years ago. That's not a short length of time, and they can't very well go back in time to predate CUDA.
If you're referring to the myriad of libraries [nvidia.com] Nvidia supports and supplies, I'd argue that that isn't really in the scope of an API designed to facilitate efficient use of compute hardware. As useful as it may be for coders.
Re: (Score:3)
I guess Nvidia is lucky these days, they rode two frivolous hype waves in a row, cryptos and AI, still riding AI as crypto wave has faded some.
Erm... NVIDIA have made good GPUs for years. Long before the crypto-fart nonsense.
They're still making them and haven't forgotten that it was gamers that ultimately made NVIDIA. I can't hate them for a little success.
Making the money while they can (Score:2)
Re: (Score:2)
nVidia is fabless.
They've got their own GPUs (Score:2)
Re: (Score:2)
They're at least 10 years behind now but in 20 years they'll be 10 years ahead of where we are now.
Re: (Score:2)
I don't think you can make that prediction so easily. Being ahead is very hard, because instead of trying to get one thing working, you need to explore dozens of possibilities, most of which will fail. That costs an order of magnitude more money than playing catch-up.
I think it's most likely that they'll end up only little behind, such that the sanctions don't matter. Of course that's not good for Nvidia since they'll have a bunch of competitors, but great for us as consumers who'll have more choice than ev
Re: (Score:2)
Wouldn't it be funny if China imposed sanctions on the US for GPUs if they catch up to / get ahead of Nvidia / AMD / Intel's GPUs?
Anyway, if they don't have good drivers, the GPUs may not be worth using. So that's something they have to work on.
On another note, if they ever get even somewhat close to Nvidia's offerings but at half the price, that may actually get most of the world to start using their gear / skip CUDA for whatever the China people are offering. Wide availability, cheap (especially fo
privacy nightmare (Score:2)
If I try to use this, it forces me to enable device history so they can use my voice for training the LLM; and if I refuse to turn it on, it won't let me use the feature at all.
So no, this is a privacy nightmare, it honestly looks to me like they are training deepfakes on the voices of everybody using it.
Uh. (Score:2)