OpenAI Needs At Least $207 Billion By 2030 Just To Keep Losing Money, HSBC Estimates (ft.com) 83
OpenAI will need to raise at least $207 billion in new funding by 2030 to sustain operations while continuing to lose money, according to a new analysis from HSBC that models the company's cloud computing commitments against projected revenue. The bank's US software team updated its forecasts after OpenAI announced a $250 billion cloud compute rental deal with Microsoft in late October and a $38 billion deal with Amazon days later, bringing total contracted compute capacity to 36 gigawatts.
HSBC projects cumulative rental costs of $792 billion through 2030. Revenue growth remains strong in the model -- the bank expects OpenAI to reach 3 billion users by decade's end, up from roughly 800 million today -- but costs rise in lockstep, meaning OpenAI will still be subsidizing users well into the next decade. If revenue growth disappoints and investors turn cautious, the company's best option might be walking away from some data center commitments.
That's crazy! (Score:5, Funny)
Forget $207 Billion, they should invest in me instead because I would keep losing money through 2030 even if you only gave me $1 billion!
Re: (Score:2)
I'll bet you two billion I can lose money better than you. :-p
Re: (Score:2)
Good joke, but you left out the balloon. The $207-billion balloon?
Just using the number from the story. The actual AI balloon is MUCH bigger than that.
Perspective (Score:5, Insightful)
Just to put into perspective how gobsmackingly huge that amount of money is: for all the flak and conspiracy theories about the US military-industrial complex, this is more money than the top 4 "MIC" classic military contractors earned in revenue for 2024. But bubbles aren't real, right?
Lockheed: $66B
Northrop: $41B
Raytheon: $26B
Boeing: $66B
Re:Perspective (Score:5, Insightful)
Re: Perspective (Score:3, Interesting)
Anyone reminded of "the Everything Bubble" in 2019 and the predictions of an imminent crash? The crash actually happened, but due to COVID, not anything the predictors had in mind, and then the Fed just printed money faster than prices rose. What if the Fed had distributed the printed money equally (to avoid the Cantillon effect)?
Re: (Score:1)
The expectation is that this will run everything. Including most products by these companies.
Every single one on the list is already all-in on battlefield AI applications. Everything from drones to ISR processing to something as basic as hardening communications against jamming is in process of being moved to AI-based systems.
We've already had some test runs of very early versions of things like AI-based jamming and drone control in Ukraine, and it's clear that it's vastly superior to existing mainly either
Re: (Score:3)
All of that can be true, but it can still be a bubble and it can still be a stupid amount of money. This is also about 3x the entire military budget of Russia ($66B).
And if this is so crucial to the military, then I would hope we could spare some of that free-flowing money for Ukraine to, you know, do the drone warfare they seem to have become experts at (at a much lower cost than all this) and provide us valuable field research and testing, while also putting pressure on the geopolitical antagonists we are worried
Re: (Score:2)
$207 billion over five years is about one Northrop's worth of annual revenue.
Re: (Score:2)
Lockheed: $264B
Northrop: $164B
Raytheon: $104B
Boeing: $264B
OpenAI needs: $207B.
Changes your conclusion a bit when you actually measure apples to apples, doesn't it?
Re: (Score:2)
What is my conclusion? That there's an AI bubble?
Re: (Score:2)
Just to put into perspective how gobsmackingly huge that amount of money is: for all the flak and conspiracy theories about the US military-industrial complex, this is more money than the top 4 "MIC" classic military contractors earned in revenue for 2024.
And literally compare 4-5 years of revenue against 1 and use that as the basis for your conclusion?
Whether or not there is a bubble is entirely orthogonal to your dumbshit argument. What's truly impressive is that it reached +5 despite being based solely on a 3rd-grade logic error.
Welcome to slashdot. But by all means- keep defending that truly dumbshit post, lol.
Re: (Score:2)
What was the title of my post? Perspective, that's all: picture that amount of billions in a different way. I also said it was 1 year of revenue, but go off, chief.
Re: (Score:2)
Not that bubbly looking anymore, is it, chief?
Re: (Score:2)
I dunno, $52 billion a year for something that, in reality, only acts as a glorified search of an FAQ? What real-world money-making applications has this thing done? (Or any of these "AIs", not necessarily ChatGPT.) When used as an agent, this thing has basically failed at almost everything. When your product is worse than outsourcing to India, that is very impressive to me! See the recent drive-thru AIs: these things have no intelligence at all, shown by the fact they have no idea 1 man in an SUV probably
Re: (Score:2)
I dunno, $52 billion a year for something that, in reality, only acts as a glorified search of an FAQ?
Google makes $350B a year.
What real world money making applications has this thing done?
You just said what it's done.
They have like 35 million paying customers pulling in ~$4B a year.
That's a far cry from $52B, but their growth is also obscene.
Of course the vast majority are free users. This is the case for all products like this. They'll hit $10B a year by next year unless there's a collapse in growth.
If they could fix the hallucination problem (they can't) - then they could trust it to do real tasks (like really handling orders etc) - then it would be worth the valuation. But they can't !
I think you probably shouldn't judge based on what you're seeing on the free shit models.
People are using it today in businesses. Agentic workflows are all over
Re: Perspective (Score:1)
You probably (well, let's face it, couldn't possibly) care what I think, but while I frequently disagree with some of your viewpoints, I find I notice them.
(e.g: you got the chops)
Keep being the spike in my living under a tree world.
In all seriousness, it's why I like my tree, and I hope to not give it up anytime soon.
Posted from McKenzie Bridge after 3 2oz tequila shots.
"But there's no bubble", they say. (Score:5, Informative)
OpenAI will need to raise at least $207 billion in new funding by 2030 to sustain operations while continuing to lose money
Kinda pops the balloon of those that say AI isn't some massive bubble that dwarfs the dot-com one.
Re: (Score:1)
Re: "But there's no bubble", they say. (Score:1)
Can the Fed solve the (potential) problem?
Re: (Score:1)
Technically a lot of current build out seems to be already pretty circular. Nvidia invests into AI companies, AI companies invest in datacenters, buy hardware to run them from Nvidia.
Re: (Score:2)
This is about to be the fourth "worst investment bubble" in my lifetime.
So let's not get too excited, there's probably time for a few more bigger ones. There's fusion and space mining and geo-engineering all still to come.
I am such a dinosaur (Score:5, Funny)
I *completely* missed when we converted over from gigaflops to gigawatts as a measure of compute capacity. Can anyone bring me up to speed? Is it anything like gallons of horsepower?
Re:I am such a dinosaur (Score:5, Informative)
I *completely* missed when we converted over from gigaflops to gigawatts as a measure of compute capacity. Can anyone bring me up to speed? Is it anything like gallons of horsepower?
Exactly like that. If I give you a gallon of fuel, you may get a highly variable amount of horsepower out of it. Likewise, a gigawatt of power will give you a constantly changing number of gigaflops, depending on the hardware. For planning purposes it makes sense to go with the more consistent measure.
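A rough sketch of why that conversion is so variable: the same power budget yields very different compute depending on hardware efficiency. The FLOPS-per-watt figures below are hypothetical round numbers for illustration only, not vendor specs:

```python
# Gigaflops obtainable from a fixed power budget varies with hardware
# efficiency. Efficiency figures are made up for illustration.
POWER_BUDGET_GW = 1.0

# Hypothetical GFLOPS-per-watt for three accelerator generations
efficiency_gflops_per_watt = {
    "gen_a": 50,    # older accelerator
    "gen_b": 150,
    "gen_c": 400,   # newer accelerator
}

watts = POWER_BUDGET_GW * 1e9
for gen, gflops_per_w in efficiency_gflops_per_watt.items():
    total_gflops = watts * gflops_per_w
    print(f"{gen}: {total_gflops:.3g} GFLOPS from {POWER_BUDGET_GW} GW")
```

Same gigawatt, an 8x spread in compute, and the spread changes every hardware generation, which is why the gigawatt is the number that stays meaningful in a multi-year contract.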
Re: (Score:2)
In your defense, it happened recently. The AI billionaires are too ignorant to realize how stupid this measure is.
Re: (Score:2)
Re: (Score:2)
FLOPs are a better metric than Watts, though. Too bad you're too stupid to understand even that.
"... you want to continue using a meaningless measure..."
What measure do I use? Citations please.
Re: (Score:2)
FLOPs are a better metric than Watts, though.
No, they're not. They're literally meaningless.
Too bad you're too stupid to understand even that.
erm, lol.
They're literally meaningless. That's all there is to understand.
AI inference has zero dependence on FLOPS. None.
It does have a dependency on power, even if there is a hidden factor in the middle.
This is simply math.
What measure do I use? Citations please.
"FLOPs are a better metric than Watts, though."
You're so fucking stupid- it's amazing, lol
Re: (Score:2)
A FLOP has just as much meaning as a watt, yet is far more relevant to computation. You can say what you want, but all you're telling us is how stupid you are.
"AI inference has zero dependence on FLOPS. None."
Completely false, and yet you want to claim one is meaningless while the other isn't? How do watts relate to inferences?
"FLOPs are a better metric than Watts, though."
Because they are, but where is a single example of where I've used that metric with respect to inferences?
Do you even speak English,
Re: (Score:2)
Re: (Score:2)
What measurement would you prefer, since FLOPS is literally irrelevant to them?
It's like measuring cars in a race by the amount of iron in their motor.
The usefulness of a GPU to one of these companies is not measured in FLOPS, because no GPU can do inference anywhere near their peak FLOPS.
Re: (Score:2)
"What measurement would you prefer, since FLOPS is literally irrelevant to them?"
Ignoring that FLOPS are extremely relevant, despite your moronic claims to the contrary, we certainly don't want WATTS which are completely meaningless.
Let's not forget that as stupid as your comments are regarding FLOPs, you make them in defense of something even more stupid.
Re: (Score:2)
1) power usage is more important at this juncture.
2) inference is memory bandwidth limited, not compute limited.
A device with 17 bf16 TFLOPS may be able to do inference at precisely the same rate as a device with 4 bf16 TFLOPS.
i.e., for very large model inference, TFLOPS isn't a great measure.
I suppose we could transition to aggregate GB/s as our unit of measure.
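The bandwidth-bound claim above is easy to sketch as a back-of-envelope calculation. The model size and bandwidth figures below are illustrative round numbers, not measurements of any particular GPU:

```python
# Back-of-envelope: for autoregressive decode, each generated token must
# stream roughly the whole model's weights from memory, so single-stream
# throughput is bounded by memory bandwidth, not peak FLOPS.
# All numbers below are assumptions for illustration.
model_bytes = 70e9 * 2       # hypothetical 70B-parameter model at bf16 (2 bytes/param)
mem_bandwidth = 3.35e12      # bytes/s, an assumed ~3.35 TB/s HBM figure

# Bandwidth-bound ceiling on single-stream decode speed:
max_tokens_per_s = mem_bandwidth / model_bytes
print(f"bandwidth-bound ceiling: ~{max_tokens_per_s:.1f} tokens/s")
```

Under these assumptions the ceiling is about 24 tokens/s regardless of whether the chip advertises 4 or 17 TFLOPS, which is the parent's point: extra FLOPS beyond the bandwidth limit buy you nothing for this workload.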
Re: (Score:2)
"1) power usage is more important at this juncture."
Is it? At this "juncture"? If we're talking about AI inferences, how is power usage important at all?
"2) inference is memory bandwidth limited, not compute limited."
So what? How is "gigawatts" useful in that regard?
"A device with 17 bf16 TFLOPS may be able to do inference at precisely the same rate as a device with 4 bf16 TFLOPS."
So what?
"i.e., for very large model inference, TFLOPS isn't a great measure."
So what? Who's using that? The measure is a gig
Re: (Score:2)
Is it? At this "juncture"? If we're talking about AI inferences, how is power usage important at all?
Because it's the primary bottleneck for datacenter capacity.
GPUs are cheap. Hauling mobile megawatt generators to datacenters because utilities can't provide the power is not.
So what? How is "gigawatts" useful in that regard?
It isn't. But it's more useful than FLOPS.
FLOPS are meaningless. At least power has some meaning.
So what?
lolwut?
"I *completely* missed when we converted over from gigaflops to gigawatts" That's what, lol.
I suppose I'll repeat myself: FLOPS are meaningless.
So what? Who's using that? The measure is gigawatts, the dumbest measure of all.
lolwut?
"I *completely* missed when we converted over from gigaflops to gigawatts"
You are truly a moron. Maybe you could transition to teaspoons of sugar as well.
You
Re: (Score:2)
"Because it's the primary bottleneck for datacenter capacity."
Citation please. Power is A bottleneck, it's not the PRIMARY bottleneck.
"GPUs are cheap."
They are not.
"Hauling mobile megawatt generators to datacenters because utilities can't provide the power is not."
And this is a false choice. There are more constraints than these two.
"It isn't. But it's more useful than FLOPS."
No it's not, and no matter how many times you repeat it it remains about the most idiotic thing you could say.
"You are so fucking i
Re: (Score:2)
1) While the FLOPS-per-megawatt ratio is going to change over time, the limiting factor right now isn't compute per se, it's the ability to get power to the computers. Azure recently told my employer not to expect any additional capacity in the Virginia datacenter that our primary servers are hosted in, because they can't get any additional power.
2) It isn't. But the fact that memory bandwidth is a problem doesn't make FLOPS any less obsolete. Being able to do 50 TFLOPS with an on-chip cache just doesn't matte
Re: (Score:2)
"1) While the FLOPS-per-megawatt ratio is going to change over time, the limiting factor right now isn't compute per se, it's the ability to get power to the computers."
Citation please. Obviously, billionaires want free power and are making a point of this, but so what? Furthermore, while watts is a measure of power, it is NOT a measure of inference.
"But the fact that memory bandwidth is a problem doesn't make FLOPS any less obsolete."
FLOPS are not obsolete, nor have FLOPS ever been the sole measure of comp
Re: (Score:2)
Citation: https://www.cbre.com/insights/... [cbre.com]
"Construction completion timelines have been extended by 24 to 72 months due to power supply delays."
If you want to measure the speed at which CPUs can multiply numbers stored in on-chip cache, then FLOPS are the right unit of measure, because that constraint is very closely aligned with the thing you care about.
If you want to measure the scale of a datacenter buildout today, gigawatts of utility power is a reasonable metric, because right now, that's a very
Re: (Score:1)
The most important measure for OpenAI is R - C. Since physicists tell us that nothing can go faster than C, then, as the article posits, this all-important quantity (P) is likely to be less than zero forever. Scientists have always been at a loss to explain why bubbles such as those we observe in AI, where R is always less than C, can form or persist. There is clearly some form of dark irrationality out there whose relative value can expand at times but which is eventually(*) counteracted by the force of ratio
Re: I am such a dinosaur (Score:1)
When you look at a historical graph of the stock market, are the crashes really that big, or does the market end up much higher in the end?
Re: (Score:2)
Is it anything like gallons of horsepower?
A gallon of horsepower is a gallop.
Re: (Score:1)
Gigawatts instead of gigaflops is purely a facility-planning perspective, like square footage instead of the number (or quality) of people you can fit into a large office facility. How you use that available square footage or gigawatts is up to you and can change every year, or based on the type of people and their per-seat requirements, etc. Similarly, what type and quantity of compute you deploy can change based on various factors, but your data center capacity is basically fixed to power capacity and built u
Re: (Score:2)
So, GHz is useless now? So is memory bandwidth (GB/s)?
So, because my computer (at 100% usage, running Cudo Miner at night) only consumes maybe 300-350 watts (24-core Threadripper 3960X, Titan X GPU, 128GB RAM), it's just a doorstop? My machine could run circles around one of the LLM-AI machines rendering video in Vegas (raw horsepower) versus a purpose-built LLM-AI machine (specialized GPU with tons of RAM on it, lots of system RAM, everything on some variety of SSD/NVMe, probably a really good processor
Re: (Score:2)
"But yes it took me a month to accept that MW or GW is ok to use to denote data center capacity. Now every other number from investment in compute or number of people or amount of water is also thumbruled to per MW of IDC capacity."
It took you a month to realize that the metrics billionaires use exist to quantify how rapidly they can gain wealth, not how much technical capacity exists (which they don't even care about)? This is right in the metric itself, you should understand that immediately.
"Just scalin
3 billion people, really? (Score:2)
OpenAI Needs At Least $207 Billion By 2030 Just To Keep Losing Money ...
the bank expects OpenAI to reach 3 billion users by decade's end, up from roughly 800 million today
Over 1/3 of the world's population, really? Although, think of how much money OpenAI will have lost by then. /s
Re: (Score:2)
Over 1/3 of the world's population, really? Although, think of how much money OpenAI will have lost by then. /s
My first thought as well, though I was looking at it from the perspective of, "The entire US population is only 341 million!!!". 1/3rd of the global population is far fetched as fuck.
Inference will get cheaper (Score:1, Troll)
The current price decrease is exponential*, and if we assume it stops being exponential in two years (or alternatively slows down and drops only two orders of magnitude by 2030), we are at $2.07 billion in costs. Given 3 billion users, we are at less than $1 per user. If they keep the price of $20 for the basic subscription, they need 1 in 20 users to pay to keep the service up. If, on top of that, 1 in 200 users takes the $200 subscription instead of the $20 one, they start being profitable.
This currently
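The parent's arithmetic is easy to sanity-check, taking the post's own figures at face value. The monthly-billing assumption is mine; the post doesn't state a billing period:

```python
# Sanity check of the parent post's break-even arithmetic.
# The $2.07B cost and 3B-user figures are the post's own projections;
# monthly billing is an assumption added here.
annual_cost = 2.07e9
users = 3e9
cost_per_user = annual_cost / users
print(f"cost per user per year: ${cost_per_user:.2f}")  # under $1, as claimed

payers_20 = users / 20 - users / 200   # 1 in 20 pays, minus those upgraded
payers_200 = users / 200               # 1 in 200 on the $200 tier instead
annual_revenue = payers_20 * 20 * 12 + payers_200 * 200 * 12
print(f"annual revenue: ${annual_revenue / 1e9:.1f}B vs cost ${annual_cost / 1e9:.2f}B")
```

Under those assumptions revenue dwarfs the assumed cost, so the whole argument hinges on the premise that inference cost actually falls two orders of magnitude, not on the subscription math.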
Re:Inference will get cheaper (Score:5, Interesting)
So far all we are seeing with the generative AI delusion is an exponentially exploding waste of resources in order to pollute my Youtube feed with slop. Every enterprise is trying "AI" and essentially all of them are finding it does not do what the people selling the tin claim it can.
There were no Amazon, or Uber or Internet evangelists trying to convince everyone that those things were useful or invent uses for them because there was no need: the value was obvious and real.
Re: (Score:2)
The difference between the AI slop machine and Amazon or Uber is that even when those were losing money, it was nonetheless clear that if they scaled up, scaling efficiencies would yield a lower cost per unit and they'd become profitable. The pathway to making money instead of setting it on fire clearly existed. It existed because it was clear, even before they super-scaled, that Amazon and Uber were doing something useful for which there was demand.
So far all we are seeing with the generative AI delusion is an exponentially exploding waste of resources in order to pollute my Youtube feed with slop. Every enterprise is trying "AI" and essentially all of them are finding it does not do what the people selling the tin claim it can.
There were no Amazon, or Uber or Internet evangelists trying to convince everyone that those things were useful or invent uses for them because there was no need: the value was obvious and real.
Isn't Uber still losing money?
Amazon had a plan for profitability, so much so that they took on more debt in the early days to scale up. A gamble that paid off because they had a solid plan to begin with, not the "hope the magic beans drop into our laps before we run out of money" type of plan that AI companies have. Uber's business plan was "let's keep doing illegal shit that our competitors can't and just hope we become big enough not to fail".
Re: (Score:2)
And Uber is not providing value. They are modern slavery and profit from mediating "slave labor" to others. Their own contribution is just the app; otherwise they try to stay out of things so they don't have to pay for, for example, insurance for passenger transportation (which in many countries is NOT covered by the driver's insurance when the driver gets paid but doesn't have insurance similar to what a taxi driver has).
We can hate OpenAI all we want and with good reason, but they are providing a service. Th
Re: (Score:1)
I am using GPT-5 and Claude Sonnet 4.5 at work in copilot agent mode.
What these f*cks can code based on barely coherent statements is scary levels of shockingly good.
E.g. oracle to postgres migration utility written in go, with plan/migrate/verify, blob chunking (ai agent has patched existing go lib for that!) and what freaking not.
My role was mostly splitting the tasks into smaller ones.
What I see free versions do is not even remotely close to the premium model capabilities, so tha
Re: (Score:2)
From your first linked article:
"There is no doubt we will see rapid advancements in some of the areas, but for others, like quantization, it is less clear. So while the cost of LLM inference will likely continue to decrease, its rate may slow down."
"Will LLM prices continue to decline at this rate? This is very hard to predict."
But go ahead and assume massive declines in inferencing costs, we believe you. No doubt Altman and Musk will pass those savings on to the little man.
"or alternatively slows down a
Re: (Score:2)
The second link kinda confirms the first link. We will see what the numbers in early 2026 look like and what models you can use by then.
Your comment about Altman and Musk is off, because most of the models in the chart are neither xAI nor OpenAI models. I don't even think Microsoft hosts Phi themselves; you either run it on your PC or use it through a pay-per-token API provider.
I'd recommend following /r/LocalLlama from time to time, then you get the relevant news about new interesting models (in p
It's not supposed to be profitable (Score:5, Insightful)
The goal here is to replace as many workers as possible and eliminate the dependency on consumers.
The ultra wealthy want to go back to being like kings. Basically feudalism.
They will have a very tiny number of guildsman and scribes and a handful of knights to keep them in line.
Everyone else has a lifestyle below that of a medieval peasant because you're not even needed to tend the land anymore, they will have machines for that.
It never ceases to amaze me how many people don't realize what's happening here. Even more so there are the people who realize it but just kind of put it out of their mind because the idea of the ultra wealthy dismantling capitalism is so far outside what people view as possible that they can't emotionally comprehend it even if they can understand it intellectually.
And of course there are the numbskulls who think that they are somehow going to profit from the collapse of modern civilization. It's a big club, boys, and you ain't in it.
Re: It's not supposed to be profitable (Score:1)
What if we stopped having kids and concentrated on spiritual enlightenment, leaving the rich to deal with their own karma?
Not enough time (Score:2)
So long before our population could adjust, we're going to get hit with huge waves of layoffs that will cause massive social strife. There's no getting away from it.
Re: (Score:2)
It never ceases to amaze me how many people don't realize what's happening here. Even more so there are the people who realize it but just kind of put it out of their mind because the idea of the ultra wealthy dismantling capitalism is so far outside what people view as possible that they can't emotionally comprehend it even if they can understand it intellectually.
You act as if there's something we can do about it. The vast majority of the public *hate* all the AI garbage being packed into every piece of software and have been very vocal about it. The response of the ultra-wealthy is, "We know and we don't care because this is GOING to happen because we said so." Most people are more aware than you give them credit for, but we have a limited amount of energy we can put toward all the things happening in the world right now and we understand this is one of those thing
Re: (Score:2, Troll)
You could also get over that stupid 12-year-old's feeling of "it's not fair" when you see somebody having food and shelter without being miserable 40 or more hours per week.
But you're not going to do that. Or if you do your friends and famil
Re: It's not supposed to be profitable (Score:1)
What if Democrats stopped thinking in zero sum terms that a basic income must be tax-funded, which takes away the main Republican objection?
Re: (Score:3)
I think everyone who cares enough to consider it realizes it. Most don't bother thinking about it, at least beyond how they can make a buck. You are either a predator or a victim; the only solution is not to allow the game to be played. The sad thing is that most who imagine themselves predators don't realize they are the targets.
There is no post-AI economy, there is only homelessness, poverty and starvation. That's exactly what billionaires intend.
Re: (Score:1)
Re: (Score:2)
You cannot as a regular person comprehend the kind of greed that a man like Elon Musk or Bill Gates experiences as their normal state of being. It is way past just wanting money or yachts or any of that and into the point where they w
Re: (Score:2)
Correct, and no one should ever forget that.
Re: (Score:2)
"We aren't the same as you."
LOL, sure you aren't. But at least you draw a paycheck from Putin.
Re: (Score:2)
"Wealth prefers a dystopian hell-whole?"
No, wealth has no preference, it doesn't have agency. The wealthy, however, do prefer exactly that. Everything for them, nothing for you.
"Have our financial masters become stupid in their success? That would be a damning critique of post-modern wealth; say nothing of the institutions that educated them. "
Perhaps you should observe current behavior of the wealthy, it's easier now than ever. And learn something from history.
Re: (Score:1)
Hahahaha, incredible! (Score:4, Insightful)
They are NOT going to make it. And then the whole bubble will burst.
Pay this back with what money? (Score:3)
Here comes the next round trip! (Score:1)
OpenAI will be fine. They just need a few more infusions of cash from Microsoft, AMD, Nvidia, and Oracle that they can use to buy stuff from Microsoft, AMD, Nvidia, and Oracle. This can go on forever because this is not a scam. Just trust your capitalist masters; they are CEOs, and in America CEOs know what's best for us all.
by some you mean.. (Score:1)
the company's best option might be walking away from some data center commitments.
'some' is a really funny way to spell 'most'
This is going to be one hell of a shit show for Wall Street (probably not Main Street), but hoo boy is this going to pull down valuations of some big NASDAQ components... when OpenAI goes tits up, or as likely gets parted out and sold off in pieces.
AI needs data (Score:2)
There isn't enough data to train on for robotics. They are going to have to create it, using humans. They are going to have to show how to do all the things humans can do. AI needs millions of hours of driving videos just to figure out how to drive. Meanwhile, (almost) any human can learn to do it with a few hours of training.
How is a plumbing, electrician, or roofing robot going to get the data it needs? It will need thousands of hours of video of the various permutations and possibilities that i
Re: (Score:2)
Current approaches to AI may need this, but what that says is they are doing it wrong. An "intelligence" is more than a neural network, no matter how big.
A human figures out what do to, an AI fakes it based on ample precedent it has been trained on. The two are not the same.
& ironically the war will be won (Score:2)
so, we have trillions to invest in AI but no money (Score:2)
So, we have trillions to invest in AI all of a sudden, but for decades we never had money available to put into chip foundries and power generation capacity.
This circle jerk of an AI bubble will wipe trillions from the markets, compounded by the already massive disruption to the workforce and spiking unemployment.
Curious what happens to all these companies/governments that "adopted AI" when the AI company implodes... those will be some interesting times for sure, as companies scramble for solutions.
If OpenAI disappeared? (Score:2)
I'm 100% convinced that, beyond spending irresponsible amounts of money on building infrastructure that is only competitive because they are willing to outspend their peers, they don't offer anything of value.
I currently am using glm-4.5 on a computer with 64000 cpu cores and 304 H200 GPUs. I share the machine with 10 other users. It's pretty fast. It gives me an idea of how A
36 gigawatts? The Doc would have something to say (Score:3)
I mean, you only need 1.21 gigawatts to fucking travel through time!
Re: (Score:2)
that's jiggawatts, much different!