Google AI

Sundar Pichai Says Google and Nvidia Will Still Be Working Together 10 Years From Now (cnbc.com)

Sundar Pichai said Google's longstanding relationship with chipmaker Nvidia isn't going to change any time soon -- in fact, he expects it to continue over the next 10 years. From a report: In an interview Wired published Monday, the Google CEO said the company worked "deeply" with Nvidia on Android and other initiatives for over a decade, adding that Nvidia has a "strong track record" with AI innovation. "Look, the semiconductor industry is a very dynamic, cooperative industry," Pichai said. "It's an industry that needs deep, long-term R&D and investments. I feel comfortable about our relationship with Nvidia and that we are going to be working closely with them 10 years from now."
This discussion has been archived. No new comments can be posted.
Comments Filter:
  • by ozzymodus12 ( 8111534 ) on Monday September 11, 2023 @10:25AM (#63839034)
    Unless this means cheaper graphics cards, no one cares. Although Google going Skynet is amusing as it is.
  • Sure, why not. Both companies treat their user base the same. Their goals are the same: greed.

  • Translation: (Score:5, Interesting)

    by MachineShedFred ( 621896 ) on Monday September 11, 2023 @10:36AM (#63839072) Journal

    Here's the translation of all this chummy good will: Google learned that making a processor that's worth a damn while still having great power efficiency isn't as easy as they were led to believe. Don't expect their "Tensor" to overtake Nvidia in literally anything, any time soon.

    • 1. The fabric matters: quad-channel LPDDR5 memory on a Cloudripper versus eight-channel LPDDR5X on a GH200, plus NVLink to connect many of them together at a low level.
      2. Eight cores in an unusual big/medium/little configuration, compared to 72 cores that are equivalent to the family of the previously mentioned "big" cores.
      3. 480 GB of RAM shared by GPU and CPU in a GH200. I couldn't find Cloudripper's max, but it's more than 12 GiB (probably 64, and less than 256).
      4. Ready-to-run products like the DGX GH200.

      • The shared memory doesn't do a whole lot for you except make programming a little easier. Off-GPU bandwidth is so tiny that shared memory and message passing are equivalent performance-wise.

        Large models are handled through pipelining.
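
        The idea can be sketched in a few lines of plain Python (the stage functions and split sizes below are hypothetical, just to show the shape of it): a model is split into sequential stages, each of which would live on its own device, and a batch is split into micro-batches that flow through the stages so the devices can overlap work. Only activations cross stage boundaries, which is why the limited off-GPU bandwidth is tolerable.

        ```python
        # Toy sketch of pipeline parallelism. Each "stage" stands in for a
        # slice of a model that would sit on its own device; on real hardware
        # stage k can process micro-batch i+1 while stage k+1 processes
        # micro-batch i. The stage bodies here are made up for illustration.

        def stage0(x):  # e.g. embedding + early layers (hypothetical)
            return [v * 2 for v in x]

        def stage1(x):  # e.g. middle layers (hypothetical)
            return [v + 1 for v in x]

        def stage2(x):  # e.g. final layers / head (hypothetical)
            return [v ** 2 for v in x]

        STAGES = [stage0, stage1, stage2]

        def pipeline_forward(batch, num_microbatches=4):
            """Split a batch into micro-batches and push each through all stages.

            Only the (small) activations move between stages, so inter-device
            traffic stays tiny relative to on-device compute.
            """
            size = len(batch) // num_microbatches
            micro = [batch[i * size:(i + 1) * size] for i in range(num_microbatches)]
            out = []
            for mb in micro:
                for stage in STAGES:
                    mb = stage(mb)  # activations hand off to the next "device"
                out.extend(mb)
            return out

        print(pipeline_forward([1, 2, 3, 4, 5, 6, 7, 8]))
        ```

        A real implementation schedules the micro-batches concurrently across devices; this serial loop only shows the data flow.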

    • One extra on top of that, "The moment we hire enough of their engineers or figure out how to do it ourselves, we're dumping those mother fuckers! We are embarrassed and humiliated we can't already build our own processors and that we have to publicly kiss ass. This is sooooo wrong!"

    • Which also indicates that Google is not a well run company. I am too chicken shit to ever short a stock, but I would love to short Google.
  • Or just maybe somebody owns a boatload of Nvidia stock.
  • Business is business: they will still be working together in ten years if there is something in it for both of them. As soon as that is no longer the case for the powers-that-be at either company, the relevant party will drop the other instantly.
  • That and a few bucks will buy a cup of coffee at McDonald's. If it's not written down, it means nothing.
