Sam Altman Seeks Trillions of Dollars To Reshape Business of Chips and AI (wsj.com) 54
Sam Altman was already trying to lead the development of human-level artificial intelligence. Now he has another great ambition: raising trillions of dollars to reshape the global semiconductor industry. From a report: The OpenAI chief executive officer is in talks with investors including the United Arab Emirates government to raise funds for a wildly ambitious tech initiative that would boost the world's chip-building capacity, expand its ability to power AI, among other things, and cost several trillion dollars, according to people familiar with the matter. The project could require raising as much as $5 trillion to $7 trillion, one of the people said.
The fundraising plans, which face significant obstacles, are aimed at solving constraints to OpenAI's growth, including the scarcity of the pricey AI chips required to train large language models behind AI systems such as ChatGPT. Altman has often complained that there aren't enough of these kinds of chips -- known as graphics processing units, or GPUs -- to power OpenAI's quest for artificial general intelligence, which it defines as systems that are broadly smarter than humans. Such a sum of investment would dwarf the current size of the global semiconductor industry. Global sales of chips were $527 billion last year and are expected to rise to $1 trillion annually by 2030. Global sales of semiconductor manufacturing equipment -- the costly machinery needed to run chip factories -- last year were $100 billion, according to an estimate by the industry group SEMI.
Problem and solution (Score:4, Interesting)
Problem: You don't have enough money to buy chips
Solution: Spend trillions of dollars to build factories that make chips cheaper
I think it is better to just wait for the real AI researchers to solve this problem using something other than brute force.
Re:Problem and solution (Score:5, Insightful)
It seems likely to me that GPUs are good, but only because we've structured "AI" to work well with them. If some chip designers and the AI software people get together, I'll bet there's an "AIPU" that will do the job better than anything we've seen today.
I can't see that needing an entire reshaping of the industry; more like some bright people from TSMC and elsewhere figuring it out with a couple of OpenAI people. The supply issues that may or may not result are secondary, and may take trillions of dollars to resolve, but I can't see how Sam Altman needs to spend those dollars himself.
Still, he's rich, and evidently has some rich mates. If they want to actually spend their money rather than hoard it, then I'm all for it.
Re: (Score:3)
If I'm understanding correctly, Sam wants to develop newer, better, state-of-the-art processes for making chips, and new foundries to actually print those chips. Not so much "Nvidia, make 100x more chips" as "let's re-invent how we make chips."
GPUs are simpler processors than CPUs, but are well suited for AI.
Re: Problem and solution (Score:2)
Re:Problem and solution (Score:5, Informative)
Application Specific Integrated Circuits (ASICs) [wikipedia.org] will always provide you more bang for your buck, BUT they require a lot of development time, AND they can only perform a single task well.
Field Programmable Gate Arrays (FPGAs) [wikipedia.org] offer the speed of ASICs without the custom fabrication, but they have physical limits in scope and still require a lot of custom design work.
At this time, GPUs offer the most bang for the buck, and will likely rule the high performance computing world until a new combination of Memristors and FPGAs becomes a commodity product. [ieee.org]
Re: (Score:1)
>FPGA offer the speed of ASIC
No they don't, and it should be obvious why, but you can read the article you yourself linked if you're not convinced.
Re: (Score:3)
It seems likely to me that GPUs are good, but only because we've structured "AI" to work well with them.
GPUs are a good fit for AI primarily because they already had a good memory subsystem with lots of memory bandwidth. They also happen to have parallel compute units, but the custom AI processors arguably have better compute units, at least for some models. Where the custom processors struggle is with memory bandwidth. GPUs fortuitously had to address that problem previously for graphics and then for supercomputing.
The idea that AI has adapted to GPUs is wrong. In fact, it's just the opposite. GPUs are w
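The bandwidth point can be made concrete with a rough roofline-style estimate. The chip spec numbers and matrix size below are illustrative assumptions, not any real part's datasheet:

```python
# Rough roofline-style check of why batch-1 LLM inference tends to be
# memory-bandwidth-bound on a GPU-like chip (spec numbers are assumptions).
peak_flops = 1.0e15          # assumed peak tensor throughput: 1 PFLOP/s
peak_bw = 3.0e12             # assumed memory bandwidth: 3 TB/s

# Generating one token multiplies a vector through each weight matrix:
# ~2*N^2 flops while streaming ~N^2 fp16 weights (2 bytes each).
N = 8192
flops = 2 * N * N
bytes_moved = N * N * 2

intensity = flops / bytes_moved      # flops performed per byte moved
balance = peak_flops / peak_bw       # flops per byte the chip can sustain

print(f"arithmetic intensity: {intensity:.1f} flop/byte")
print(f"chip balance point:   {balance:.0f} flop/byte")
print("memory-bound" if intensity < balance else "compute-bound")
```

At one flop per byte against a balance point in the hundreds, the compute units sit idle waiting on memory, which is exactly why the GPUs' inherited bandwidth advantage matters.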
Re: (Score:2)
It seems likely to me that GPUs are good, but only because we've structured "AI" to work well with them. If some chip designers and the AI software people get together, I'll bet there's an "AIPU" that will do the job better than anything we've seen today.
There are already companies figuring out what an AIPU will look like. Check out Cerebras, a company building wafer-scale AI chips that massively outperform GPUs: https://www.cerebras.net/produ... [cerebras.net]
Re: (Score:3)
I'll bet there's an "AIPU" that will do the job better than anything we've seen today.
There already is: the TPU (Tensor Processing Unit), a GPU sibling with massive amounts of memory, custom float types, and caches optimized for training / inferring ML models. Nvidia's H200, a GPU built around dedicated Tensor Cores, is the closest analogue. I'm not sure what Sam is gunning for here; maybe it's cheaper accelerators, because H200s, if you could find one, run $40K apiece.
If his idea is to come up with a processor that doesn't rely on tensors, that would require redefining the underlying basis for virtually everything we call machine learning tod
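For context on why tensors are so hard to escape: virtually every layer these accelerators speed up bottoms out in a dense tensor contraction. A minimal NumPy sketch of one such layer (not any vendor's API, just the math):

```python
import numpy as np

# One dense layer: the kind of tensor op that Tensor Cores / TPUs
# accelerate. Attention and MLP blocks alike reduce to stacks of these.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 512))    # a batch of 32 activation vectors
W = rng.standard_normal((512, 512))   # weight matrix
b = np.zeros(512)                     # bias

y = np.maximum(x @ W + b, 0.0)        # matmul + bias + ReLU
print(y.shape)                        # (32, 512)
```

A processor that "doesn't rely on tensors" would need a different primitive for all of this, which is why it would mean redefining the field rather than just the silicon.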
Re: (Score:2)
They are working with LLMs, which, as the GP poster points out, essentially use a brute-force method of statistical modeling across vast corpora of data. Unless and until this results in something reminiscent of a general AI, the 'real AI researchers' will be everyone not working on an LLM. And no, they may not have a hunger for more and faster chips, because they might not be brute-forcing data.
When the bubble pops, this will all be more apparent.
Re: (Score:2)
I think it is better to just wait for the real AI researchers to solve this problem using something else than brute force.
You'd think that, but as near as I can tell, brute force is the only way we've made progress in AI. Back in the day, we tried creating all sorts of rule-based production systems, and they got nowhere near as far or as fast as throwing transistors at the problem.
Re: (Score:2)
Problem: You don't have enough money to buy chips Solution: Spend trillions of dollars to build factories that make chips cheaper
As long as it's private money, I have no problem with this. Knock yer socks off.
If he wants taxpayer money to do this, go pound sand. There are lots of investors out there trying to decide which of a zillion investments make sense. Just because he personally wants boatloads of AI processors doesn't mean we should give up robots, networks, cars, houses, and farms to give it to him.
Power (Score:5, Interesting)
How does Altman intend to power all these GPUs?
He should be considering that too, and not leaving it as a problem to be solved only by governments whose main interest is selling more fossil fuels. Instead, he should be working to develop a lot of GPUs and a suitably non-carbon-emitting power source to go with them.
Re: (Score:2)
Microsoft has announced an atomic energy station to power its AI needs.
Re: (Score:2)
Microsoft has announced an atomic energy station to power its AI needs.
You're not wrong. Maybe AI will be the thing to bring nuclear back into favor? https://www.theverge.com/2023/9/26/23889956/microsoft-next-generation-nuclear-energy-smr-job-hiring [theverge.com]
The new bitcoin (Score:3, Interesting)
So he wants to produce zillions of processors to burn every bit of electricity, all for a gimmicky search engine. Getting nothing useful done.
Re: (Score:2)
There are certainly zillions of processors busy processing stories after stories after ever-fucking-boring stories about AI right now...
hmmm (Score:2)
Maybe we should not entrust trillions of dollars to a college drop-out?
Re: (Score:2, Insightful)
Just like Bill Gates then.
I've no axe to grind one way or the other with regard to either of them, but you don't need a degree to do something useful or world-changing - you need brains, a good idea, business nous and some luck.
A degree prepares you for what's already out there; it doesn't necessarily prepare you to create something entirely new.
Re: (Score:3)
You think Bill Gates did useful things?
Re: hmmm (Score:2)
Initially yes. Then he got greedy.
Wrong problem. (Score:5, Interesting)
If you're going to raise trillions of dollars to address a real problem then he should be targeting climate change. Seriously, how much money is even being invested in saving our ecosystem? Whatever it is, it's not nearly enough.
Re: (Score:2)
If you're going to raise trillions of dollars to address a real problem then he should be targeting climate change.
because we as a species with billions of individuals can only do one thing at a time...
Re:Wrong problem. (Score:5, Insightful)
Sure, we can multi-task, but in this particular case the 'trillions of dollars' would mean starving out other endeavors. $7 trillion is roughly 30% of US GDP. His suggestion would be equivalent to devoting almost a third of US economic activity just to drive his initiative.
Further, as things stand, this would increase energy demand by 4 PWh a year, or about a 14% increase in global energy consumption, actively working against climate change efforts.
The scale of this ask is just stupidly bad.
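A back-of-envelope check of the two claims above, using round figures (an assumed US GDP of ~$25T and global electricity consumption of ~29 PWh/year; the 14% figure only works out against electricity use, not total energy):

```python
# Sanity-check the scale claims with round, assumed figures.
us_gdp_t = 25.0          # assumed US GDP, trillions of USD
ask_t = 7.0              # upper end of the reported fundraising ask

share = ask_t / us_gdp_t
print(f"Ask as share of US GDP: {share:.0%}")       # ~28%

added_pwh = 4.0          # claimed extra demand, PWh/year
global_elec_pwh = 29.0   # assumed global electricity use, PWh/year

elec_share = added_pwh / global_elec_pwh
print(f"Extra demand vs. global electricity: {elec_share:.0%}")  # ~14%
```

Even with generous rounding, the order of magnitude holds: the ask is a double-digit percentage of the entire US economy.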
Re: (Score:3)
Re: (Score:2)
The obvious problem being that we have no idea how to make AGI.
will run out of power (Score:2)
Sam needs to work on efficiency. While scale is important, efficiency is the only thing that will help his plan work at scale.
Can you clarify? (Score:1)
I also seek trillions of dollars ... (Score:4, Funny)
Re: (Score:2)
Psshsht! Over-reaching (Score:3)
Or should I say One Million Dollars! [youtu.be]
Re: (Score:3)
I like a thing I read:
"I want a version of "Who wants to be a millionaire?" but the contestants are all billionaires so it's more of a threat"
Re: (Score:2)
Bonus points if there's a catchy dance number [youtube.com] at the end.
Re: (Score:2)
Better ask for 10 million - in this economy and with this inflation, 1 might not even be enough to retire with.
Does everyone else not see this as a scam? (Score:5, Interesting)
Yes!!!! Give me trillions and THEN I'll live up to the "Intelligence" part of Generative AI.
These are the same tactics used by multi-level marketers and religious cults:
1. make a promise that's too good to be true, but throw in a good reason why it's true
2. when you give them your money and the promise isn't realized, they blame you and say you didn't give enough: "If you had worked harder or given more money, then you would have achieved what we promised."
I guess he's hoping the UAE is as dumb as YouTube portrays it to be. Sam Altman is not the next Bill Gates... he's a more intelligent version of Elizabeth Holmes. He's the next Sam Bankman-Fried. Generative AI is cryptocurrency and NFTs all over again.
Re: (Score:3)
Re: (Score:2)
Maybe I'm the dumb one here, but I do find ChatGPT and MS Copilot to be highly useful. So far, I've used them to help me:
- Generate job descriptions for candidates I'm hiring
- Build PPT slides for talks I give at work
- Provide instructions for specific tasks on stage lighting consoles
- Suggest ways to solve DIY challenges around the house
- Build code in programming languages that are not my primary languages
I could go on. I honestly don't get the pushback from those who say AI doesn't do anything useful. I fi
I hope this code isn't mission-critical (Score:2)
- Build code in programming languages that are not my primary languages
I hope to god this code was not for mission-critical work or housed in a computing environment that contains sensitive information. If you don't understand a language well enough that you need generative AI to write it for you, you probably shouldn't be working with that language professionally.
This is basically the intro to every other security breach story I've read. You're describing the origin story of the next major virus.
You're hacking things in your free time for fun? Great, enjoy!...You'r
Re: (Score:2)
I think you have an inflated view of the importance of knowing every nuance of every language you work in. In my career, I've written code in more than 25 different languages. After a while, you start to see that they can all do the same things, they just use different words. Pasting together SQL commands is a bad idea whether you're using VB, C, C++, C#, Python, JavaScript, or whatever.
As with spoken languages, as a person gains proficiency, they can understand and read far more quickly than they can compo
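The "pasting together SQL commands" point is the same in every one of those languages; here is a minimal sketch using Python's sqlite3 purely as a stand-in (table and input are made up for illustration):

```python
import sqlite3

# Toy database with a single row, to show why string-pasting SQL is unsafe.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

evil = "alice' OR '1'='1"   # hostile input rewrites a pasted-together query

# Bad: the input becomes part of the SQL itself.
unsafe = f"SELECT count(*) FROM users WHERE name = '{evil}'"
n_unsafe = conn.execute(unsafe).fetchone()[0]
print(n_unsafe)   # 1 -- the injected OR clause matches every row

# Good: a parameterized query treats the input as data, not SQL.
safe = "SELECT count(*) FROM users WHERE name = ?"
n_safe = conn.execute(safe, (evil,)).fetchone()[0]
print(n_safe)     # 0 -- the whole string is compared as a literal name
```

Which placeholder syntax you use (`?`, `%s`, `:name`) varies by driver, but the principle carries across VB, C#, Python, JavaScript, and the rest.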
Chips (Score:5, Funny)
Has anyone thought of reaching out to Frito-Lay?
I heard they have lots of chips.
Smells (Score:2)
Re: (Score:2)
Re: (Score:1)
"Take the blue pill, Coppertop!"
Pop! (Score:2)
Yes, AI has some good uses & it'll help change a lot of processes in certain sectors but it's unlikely to be as world changing as the kinds of money that are being thrown at it right now. This ain't sustainable.
Wouldn't matter much... (Score:2)
You aren't going to get some magical leap that allows an LLM to run affordably on a desktop computer, much less a laptop or phone. So you have to have these in the cloud, but you still have to power and cool them all. And you aren't getting 2x the performance every 18 months. It was three years at best, and in terms of tensor accelerators, it will probably take five years to see if feature shrinks will allow for the core and memory gains needed for the next 2x.
The push to run smaller models with AI specific chip features i