
OpenAI Is Walking Away From Expanding Its Stargate Data Center With Oracle (cnbc.com)

OpenAI is reportedly backing away from expanding its AI data center partnership with Oracle because newer generations of Nvidia GPUs may arrive before the facility is even operational. CNBC reports: Artificial intelligence chips are getting upgraded more quickly than data centers can be built, a market reality that exposes a key risk to the AI trade and Oracle's debt-fueled expansion. OpenAI is no longer planning to expand its partnership with Oracle in Abilene, Texas, home to the Stargate data center, because it wants clusters with newer generations of Nvidia graphics processing units, according to a person familiar with the matter.

The current Abilene site is expected to use Nvidia's Blackwell processors, and the power isn't projected to come online for a year. By then, OpenAI is hoping to have expanded access to Nvidia's next-generation chips in bigger clusters elsewhere, said the person, who asked not to be named due to confidentiality.
In a post on X, Oracle called the reports "false and incorrect." However, it only said existing projects are on track and didn't address expansion plans.

CNBC notes: "Oracle secured the site, ordered the hardware, and spent billions of dollars on construction and staff, with the expectation of going bigger."
  • Oracle Is Walking Away From ... Oracle

    Someone needs a bit more coffee.

  • by joshuark ( 6549270 ) on Tuesday March 10, 2026 @02:05PM (#66033686)

    From the headline I'd say Oracle has some serious corporate dysfunction in its outer joins...

    JoshK.

    • by Tailhook ( 98486 )

      Given what they've been doing, one is forced to stop and think for a moment whether or not that's a typo. It is a typo, but yeah, still makes you think.

    • Needs the Spider-Man pointing at Spider-Man meme, wearing Larry face-masks.

  • by ebunga ( 95613 )

    Imagine throwing away billions of dollars.

    And then when those other datacenters are nearing completion, they'll also never be used because they're just on the cusp of the next generation, and then when those are about to be turned on...

    • by zlives ( 2009072 )

      the AI will fix that imagination for you.

    • Re:LOL (Score:4, Funny)

      by Comboman ( 895500 ) on Tuesday March 10, 2026 @03:03PM (#66033820)

      But Larry Ellison needs those billions of dollars so his nepobaby can buy more movie studios!

    • by EvilSS ( 557649 )
      It will get used. There is still high demand for GPU workloads on the big cloud providers from companies looking to do AI or other GPU-enabled workloads, just not at OpenAI or Anthropic scale. It just won't be guaranteed 100% usage from one big customer.
    • by Gilmoure ( 18428 )

      Case planning an extraction from a half-built data center out in the desert

    • Because too many rich idiots copied Zuck's behavior of spending tens of billions per year on data centers, server hardware, and VR without a profitable product requiring them, much less a plan. Sort of like Trump's wag-the-dog Epstein Iran war. Hypernormalization: just doing things based on vibes while ignoring reality and fundamentals.
      • Also, while the technofeudal aristocrats waste zillions on boondoggles, millions lack health insurance, food, and housing; they've stolen around $50 trillion from us since the Reagan era while obliterating the middle class and driving millions into homelessness. Anyone professing libertarianism, deregulation, or market nonsense is just a shill for the criminal pedo class at this point.
  • by greytree ( 7124971 ) on Tuesday March 10, 2026 @02:18PM (#66033708)
    OpenAI -> Closed AI, Non-profit -> For-profit, cash from Nvidia -> no cash from Nvidia, No developer jobs in 3 months -> Slightly less cringey version 5.3 in 3 months, refusing Pentagon money -> accepting Pentagon money, Stargate -> no Stargate.

    Sam Altman is a dirty, lying cunt.

    Happily, OpenAI will soon go bust.
    • by zlives ( 2009072 )

      too big to fail?

    • Pretty presumptuous of you to assume that I don't already hate Sam Altman at least as much as you do.

    • To be fair, I don't think this has anything to do with lying; more to do with the rapidly changing nature of the industry. E.g.:

      - Open was never fully open. There was always a closed aspect to it.
      - Non-profit was fine being non-profit right until cost and contracts needed to get involved. It's hard to run a non-profit if your business depends on insane capital injections, you can't raise that kind of money as a non-profit. Elon knew this as well which is one of the reasons he bailed.
      - Cash from NVIDIA was pro

  • You can be first to scale up or have the latest technology, but not both; and not expanding is not the same as cancelling. The current construction seems like a needed intermediate step before new GPUs become available, so that they stay relevant. That said, OpenAI seems comfortable with ChatGPT's current 'name brand' position even as competitors like Claude, Grok, and Gemini rapidly meet or exceed its functionality.
  • I was hoping this was recursively odd like Samsung not selling RAM to Samsung [slashdot.org].

    Nope, just an uncaught typo.

  • Funding was supposed to come from Gulf States as a quid-pro-quo for US military protection from Israel and Iran.

  • by Tony Isaac ( 1301187 ) on Tuesday March 10, 2026 @05:10PM (#66034108) Homepage

    because it wants clusters with newer generations of Nvidia graphics processing units, according to a person familiar with the matter.

    Hmmm, is that the real reason? Do the new processors provide something so special that the software can't run on older ones? Or are the new processors just *faster* than the old ones? Would a cloud customer back out of a deal just because the hardware is a few months old? Maybe, but maybe it's an excuse to get out of the deal, and OpenAI doesn't want to disclose the real reason. Maybe this is another sign of the AI bubbles starting to burst.

    • A generation-old chip for a cloud customer means inference takes a few extra milliseconds. For OpenAI it means their training jobs take extra weeks to finish, they're late to market, they get leap-frogged by competitors, etc.
      • They're also going to wait *many* extra weeks because of backing out of this deal; they won't be able to provision the new fancy chips instantly. On one hand, slower chips that take longer to process training, and on the other, logistics that take longer to even *start* training.

        • Right, they need to find a balance. But not being able to train with new chips as soon as they come out is the worst case scenario for them.
          • Have you used cloud infrastructure? It doesn't seem so, because if you had, your conclusion wouldn't make sense. In the cloud, it's *really* easy to move stuff around, to scale up or down, to change where your software is running. Hosting on older chips and moving to the newer ones when they're ready is *not* a major undertaking, in the scheme of things.

        • The cost of backing out may be offset by the efficiency change as well. Projects need to be reviewed at all stages of execution, and there should always be a willingness to back out if some variable changes. I'm not sure what happened here, but I've certainly been on a project where there was a financially beneficial reason why the project was delayed. We were literally told to slow down our progress in the name of making the project more viable.

          • I don't think cost is OpenAI's primary concern. As the previous poster noted, *time* is of the essence, not cost.

            • That's a very narrow view when talking about a company that has just had literally tens of billions in investment evaporate, largely due to questions of potential profitability.

              AI is like Bitcoin was when the first ASICs hit the market. There's a very, VERY real return-on-investment equation tied to not having yesterday's hardware.

  • Quietly having the energy companies pass the buck onto us plebs is over; we are on to your scam. And OpenAI doesn't want to pay for it either.
  • They ran out of other people's money to spend before showing ROI. They've exhausted angels, sovereign wealth, institutional investors, and public markets. No one else wants to give them more money, including the financialized funny money Nvidia sloshes around. Buuuubble about to ...
  • For those who don't know, the Stargate datacentre is currently Samsung's single largest customer, having purchased all of Samsung's DRAM wafer supply (900,000 wafers per month) through the next two years.

    Fuck OpenAI and what they are doing to the computing industry. I'm glad they are walking away. I hope they cancel their DRAM order.

    • by whitroth ( 9367 )

      Yep. And on top of which, Cory Doctorow and Ed Zitron are proven right - by the time the datacenter's built, it will already be moving toward obsolescence. New chips, *all* of which will require replacing the racks, which are the most expensive part of the datacenter.

    • Isn't that hoarding? Here in the US at least for basic necessities like food, hoarding such as this is outright illegal. I guess I'm saying there ought to be a law...
      • Purchasing something with a lead time isn't hoarding. Hoarding is buying something without the intent to use it. The question now is, will the order be cancelled? I doubt OpenAI intends to take delivery of a rapidly depreciating asset which generates no revenue for them.

  • You shouldn't think of computer programs anthropomorphically--they hate that.
