Inflection AI Develops Supercomputer Equipped With 22,000 Nvidia H100 AI GPUs
Inflection AI, an AI startup, has built a cutting-edge supercomputer equipped with 22,000 NVIDIA H100 GPUs. Wccftech reports: For those unfamiliar with Inflection AI, it is a company that aims to create "personal AI for everyone." The company is widely known for its recently introduced Inflection-1 AI model, which powers the Pi chatbot. Although the AI model hasn't yet reached the level of ChatGPT or Google's LaMDA models, reports suggest that Inflection-1 performs well on "common sense" tasks, making it well suited for applications such as personal assistance.
Coming back, Inflection announced that it is building one of the world's largest AI supercomputers, and it looks like we finally have a glimpse of what it will look like. The Inflection supercomputer is reported to be equipped with 22,000 H100 GPUs, and based on analysis, it would contain almost 700 four-node racks of Intel Xeon CPUs. The supercomputer will draw an astounding 31 megawatts of power.
The surprising fact about the supercomputer is the acquisition of 22,000 NVIDIA H100 GPUs. We are all well aware that, in recent times, it has been challenging to acquire even a single H100, since they are in immense demand and NVIDIA cannot keep up with the influx of orders. In Inflection AI's case, NVIDIA is considering becoming an investor in the company, which is why it was easier for them to get their hands on such a massive number of GPUs.
Aass! (Score:5, Funny)
Now you too can subscribe to our Ai-as-a-Service. Sorry if it is a bit shitty, butt it's literally in the name.
Re: (Score:2)
You mean all those video gehmerz can't buy video cards?
H100s are not video cards and can't help your gaming.
Seem so very gimicky to me (Score:2)
It reminds me of a smart watch. Bling that isn't even blingy. They just seem so dorky.
Re: Seem so very gimicky to me (Score:2)
Re: (Score:3)
Yeah, I'm kind of shocked they have the resources to get that much gear. We are talking about a pretty trivial chatbot in a crowded field of platforms doing both chatbot and using LLM for more stuff.
I suppose investors may be imagining them to grow up to do bigger and better things, but it's an awfully crowded field with competitors well ahead of them.
Infection AI (Score:2, Funny)
I originally read the headline as "Infection AI". I'm not sure it isn't more correct that way, though.
Re: (Score:2)
Re: (Score:2)
My watch nags me whenever I don't do at least 250 steps every hour. It's smart enough to say, "Get off your ass, you lazy slob!"
My watch congratulated me for completing my exercise for the day about 20 minutes after driving on the freeway. As if I am a cheetah who completed a 10K....
Re: (Score:2)
Re: Seem so very gimicky to me (Score:2)
There's nothing gimmicky about this. You can see in the previous generation of AI breakthrough (image processing) that there's huge numbers of practical applications for this kind of tech. That generation got us things like huge improvements in the quality of photography; massive leaps in various scientific fields, like astronomy and particle physics; big improvements in medical imaging…
LLMs have already got us things like much better automated translation, tutoring systems fo
nice use of resources (Score:1)
meaningless metrics (Score:2)
31 Mega-Watts (Score:1)
Re: 31 Mega-Watts (Score:2)
Re: (Score:2)
Your kitchen oven may have a power usage of 2,000 W. That means it uses 2,000 joules every second.
That AI center has a power usage of 31,000,000 W, hence uses 15,500 times more energy per second than your oven.
Clear enough?
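The oven comparison above is easy to check directly. A quick sketch (the 2,000 W oven figure is the parent's assumption, not a measured value):

```python
# Power is energy per unit time: 1 watt = 1 joule per second.
oven_w = 2_000              # assumed typical kitchen oven draw, in watts
datacenter_w = 31_000_000   # reported 31 MW for the Inflection supercomputer

# Ratio of instantaneous power draw (and thus of energy used per second)
ratio = datacenter_w / oven_w
print(ratio)  # 15500.0
```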
Re: (Score:3)
Watts are a unit of power, thus it's not per day or per hour, it just is. Watt-hours are a measure of energy, thus hypothetically this datacenter running flat out would do 31 MWh per hour.
Re: (Score:2)
It's 31 megawatts any time it is in full use. Energy usage is usually billed in watt-hours.
Furthermore, the rule of thumb is that a power plant producing 1 megawatt can serve 1,000 homes - because some will be using more power and others less, so the total draw stays under 1 megawatt at any given time.
That means when going full tilt this super computer uses the same amount of energy as a small to medium sized city of 31,000 homes. Every hour. Do note, that is 31K HOMES, not people. The average is multiple people per
Re: (Score:3)
i'd love to see data centers integrated into other industrial processes where heating is needed - it'd be a much more efficient use of energy..
Re: 31 Mega-Watts (Score:2)
Yeah are they just dumping those 31MW of heat into the atmosphere? Kind of a waste.
Re: (Score:2)
The average is multiple people per home...
Huh?
Re: (Score:1)
Infection AI!? (Score:1)
I want a personal AI thing. (Score:1)
I'm sick and tired of being marketed to on every single thing I own. My TV, even when I'm watching streaming media, my phone, my car*, my refrigerator*, washer and dryer*, hell, I'm surprised my lamps and/or light bulbs don't shit out advertisements to me at seemingly random times of the day. I want a personal AI instance to run on my own server(s) that will not log, evaluate and sell my information to third party advertisers to try to market more shit to me that I have no desire for, and have to ignore
Re: I want a personal AI thing. (Score:3)
There are open source efforts underway to reduce the hardware requirements for large language models. QLoRA [github.com] is one example; it allows you to fine-tune a model on your own dataset.
Weird hardware spend (Score:2)
Reminds me of supercomputer budgets in the 2000s. What did they run? Besides insane electric bills, just dick measuring benchmarks mostly. It would be interesting to know how they came to require that size of system and if they have done anything remotely big.
Nice funding (Score:2)
The GPUs alone cost $880M. Having raised $1.5B, it will be interesting to see how long they'll stay in the red before operational costs threaten to defund them. I would love a flood of second-hand H100s to enter the market!
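The $880M figure is consistent with a commonly quoted street price of roughly $40,000 per H100 (an assumed unit price, not stated in the article):

```python
gpus = 22_000          # reported H100 count
unit_price = 40_000    # assumed per-H100 price in USD (hypothetical, for illustration)
total = gpus * unit_price
print(f"${total / 1_000_000:,.0f}M")  # $880M
```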
Impressive? (Score:2)
This headline just reminds me of ENIAC's proud boast of using 18,000 vacuum tubes.