
Anthropic Reveals $30 Billion Run Rate, Plans To Use 3.5GW of New Google AI Chips (theregister.com)

Anthropic says its annualized revenue run rate has surpassed $30 billion and disclosed plans to secure roughly 3.5 gigawatts of next-generation Google TPU compute starting in 2027. Broadcom will supply the key chips and networking gear for the effort, the company announced. The Register reports: News of the two deals emerged today in a Broadcom regulatory filing that opens with two items of news. One is a "Long Term Agreement for Broadcom to develop and supply custom Tensor Processing Units ("TPUs") for Google's future generations of TPUs." Google and Broadcom have collaborated to produce custom TPUs. Broadcom CEO Hock Tan recently shared his opinion that hyperscalers don't have the skill to create custom accelerators and predicted Broadcom's chip business will therefore win over $100 billion of revenue from AI chips in 2027 alone.

Working on next-gen TPUs for Google will presumably help to make that prediction a reality. So will the second part of Broadcom's announcement: a "Supply Assurance Agreement for Broadcom to supply networking and other components to be used in Google's next-generation AI racks through up to 2031." Broadcom's filing also revealed one user of Google's next-gen TPU will be Anthropic, which starting in 2027, "will access through Broadcom approximately 3.5 gigawatts as part of the multiple gigawatts of next generation TPU-based AI compute capacity committed by Anthropic."


Comments Filter:
  • So, by Doc Brown's units of power, that would be just shy of three lightning strikes, but continuous. Great Scott!
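    For anyone checking the math, a quick sanity check (the 1.21 GW per strike is the film's figure; the 3.5 GW is from the article):

    ```python
    # One lightning strike, per Doc Brown: 1.21 GW. Anthropic's planned draw: 3.5 GW.
    LIGHTNING_GW = 1.21
    ANTHROPIC_GW = 3.5

    strikes = ANTHROPIC_GW / LIGHTNING_GW
    print(f"{strikes:.2f} continuous lightning strikes")  # 2.89 - just shy of three
    ```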

    • Can somebody please travel back in time and make sure Dario Amodei's parents never meet? Thanks, I'd appreciate it.
      • Can somebody please travel back in time and make sure Dario Amodei's parents never meet? Thanks, I'd appreciate it.

        Motion seconded. But, you know, while on the mission, Sam Altman's parents, Mark Zuckerberg's, Jeff Bezos', you know what, we'll compile a list and get it to you before mission launch.

    • by jd ( 1658 )

This means you should NOT, under any circumstances, run Claude at 88mph. Unless you really want to.

    • by Luckyo ( 1726890 )

      It's honestly pretty interesting that we've gone from units of computational performance to units of electricity in measuring AI compute.

      Essentially, this tells us that the bottleneck has moved from the silicon's computational performance to being able to provide said silicon with enough power to perform said computational tasks.

  • Wait... (Score:5, Funny)

    by Locke2005 ( 849178 ) on Tuesday April 07, 2026 @02:10PM (#66081714)
    We're measuring CPUs in gigawatts, not megabytes or operations per second now? Dudes, the goal isn't to waste as much energy as possible! That's the most disgusting dick size measuring contest ever!
    • by leonbev ( 111395 )

      And the really annoying thing about this is that when the AI bubble inevitably crashes, it's going to be difficult to repurpose all of these specialized AI processors into something useful.

      This won't be like the 2018-2023 crypto bubble, where we ended up with a ton of cheap used GPUs and power supplies available for resale. This stuff will mostly end up in the landfill, scavenged for its raw materials.

      • by JoshZK ( 9527547 )
Yeah, but the power grid is going to be sick.
        • No, they are going to need to massively upgrade the power grid to support AI, hopefully paid for by the AI investment billions. And that will have beneficial effects long after the AI companies have all gone bankrupt.
Wasn't Freeman Dyson at least skeptical of ever bigger and better particle accelerators, as reaching diminishing returns on the amount of physics knowledge gleaned per dollar spent?

            Forget Freeman Dyson. Hasn't anyone taken ECON 101 as a college freshman and remembered "diminishing marginal returns"? And forget the harm to the environment: isn't the outrageous electric bill a sign of more and more resources thrown at something to "scale it up" without considering where the scaling law levels off?

      • by tlhIngan ( 30335 )

        And the really annoying thing about this is that when the AI bubble inevitably crashes, it's going to be difficult to repurpose all of these specialized AI processors into something useful.

        No, it's basically impossible. AI doesn't need high-precision math - 16-bit floating point, 8-bit floating point, even 4-bit floating point is all that's required.

        There aren't many uses for such low-precision ALUs outside of AI.
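        To illustrate how little precision that is, here's a minimal stdlib sketch (Python's struct module exposes IEEE-754 half precision via the 'e' format; it has no FP8/FP4 types, so FP16 stands in for the point):

        ```python
        import struct

        def to_fp16(x: float) -> float:
            # Round-trip a double through IEEE-754 half precision (struct's 'e' format).
            return struct.unpack('e', struct.pack('e', x))[0]

        print(to_fp16(3.14159))   # 3.140625 - only ~3 decimal digits survive
        print(to_fp16(2049.0))    # 2048.0 - integers above 2048 aren't exactly representable
        ```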

    • That's the most disgusting dick size measuring contest ever!

      Well... if you've seen a dick size measuring contest and said, "those are really nice dicks" then you have likely just found out something new about yourself. ;)

    • We're measuring CPUs in gigawatts, not megabytes or operations per second now? Dudes, the goal isn't to waste as much energy as possible! That's the most disgusting dick size measuring contest ever!

      You clearly didn't get the new dick measuring spec sheets. We clearly ARE trying to waste as much energy as possible. Along with all other resources available. That's what AI is. An outward manifestation of the greed we have worshipped for forty years or more in the United States. Even the framing of it is based on greed. "We have to, or someone else might." It's as tribal and greedy as anything we've ever done. Gotta climb aboard, or you'll get run over by it. Or so we keep getting told.

Yes, when it comes to AI, that's how we measure it, because that's the important metric. They have a certain amount of power that they are able to consume. They're buying from Google "however many CPUs maximise our compute given a power budget of 3.5GW."

    Hehe... the only thing that measurement defines is how much power it needs to run. What does that 3.5 jiggawatts help with? More AI slop? Writing your kid's essay for them? Goody!
      Maybe, I dunno... dedicate the excess compute power to cancer research (because not every server is busy every compute cycle across every Anthropic location... have idle machines kick over to something like BOINC for five minutes).
      If an Antminer can draw 1400W for a thing that'll fit in a shoebox, I could

  • AI has brought nothing good to the world.
AI used in the medical and scientific sectors, for example being used to spot cancers, has done massive good. It's the AI that is being used as a grift for ad revenues and to make its creators wealthy through IPOs that has yet to be of any benefit.
    • by jd ( 1658 )

      I've designed a few machines - some rather more insane than others - in meticulous detail using AI. What I have not done, so far, is get an engineer to review the designs to see if any of them can be turned into something that would be usable. My suspicion is that a few might be made workable, but that has to be verified.

      Having said that, producing the design probably took a significant amount of compute power and a significant amount of water. If I'd fermented that same quantity of water and provided wine

    • by znrt ( 2424692 )

      welp, i disagree. i find it quite useful for now, using it sensibly and sparingly.

      but the whole circus and freakshow around it is also about to make a huge bunch of rich clueless motherfuckers lose huuuuge piles of money. think of the spectacle and have some romanticism, life is short, ffs!

      other than that, yeah, like any other tool it's going to be abused, and not in people's favor, but that's sort of an inevitability, the history of our species ...

  • by greytree ( 7124971 ) on Tuesday April 07, 2026 @02:18PM (#66081734)
    What is this, News For Miserable Economists ?

    "AI Overview
    Revenue Run Rate (RRR) is a financial metric that projects a company’s future annual revenue based on current, short-term performance (usually monthly or quarterly). It helps startups and rapidly growing companies estimate annual revenue by assuming current performance trends will remain consistent for a full year."
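    Or as a one-liner (the $2.5B monthly figure below is illustrative, not from the article):

    ```python
    def run_rate(revenue: float, period_months: int = 1) -> float:
        # Annualize a recent revenue figure by assuming it holds for 12 months.
        return revenue * (12 / period_months)

    # A hypothetical $2.5B month annualizes to a $30B run rate:
    print(run_rate(2.5e9))  # 30000000000.0
    ```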
    • by gweihir ( 88907 )

      Probably. I mean they all are keen on obscuring how very far too low their revenue is to even break even.

    • by znrt ( 2424692 )

      they're announcing they're about to win a few bucks and thus it's reasonable to invest yet another whole lot of bucks in more datacenters. it's vibe entrepreneuring.

  • Why are they measuring it in gigawatts instead of tons of CO2? That's weird. There must be a reason. Oh, it must all be solar and nuclear. That must be it. I'm sure it is.
  • by ZipNada ( 10152669 ) on Tuesday April 07, 2026 @02:40PM (#66081780)

    The Claude models are the best by far for coding assistance in my experience. Apparently a lot of other people think so too because Anthropic is getting swamped. They are having to ration out their compute resources and in some cases have raised their fees to 2-3 times more than the lesser competition charges. I'm finding that in order to keep costs down I'm having to use 2nd-tier models for simpler work and revert to Claude for the heavy lifting. A hassle.

    Clearly the demand is there. At this point I expect Anthropic is revenue-limited by their infrastructure availability so it makes sense that they recruit the big players to help beef it up.

    • It appears that three AI companies are neck and neck: GitHub, Claude, and Cursor. https://www.cbinsights.com/res... [cbinsights.com]

      My experience is a bit different from yours. I personally use GitHub Copilot, which lets you use models from all the major companies, including Claude. Whenever I've tried Claude models, I get good results, but the execution is *S*L*O*W*. Like, 3-5 times slower than, say, GPT-5.4. So I keep reverting (for now) to GPT.

Claude can be slow, I agree. But I submit that is because so many people are demanding to use it.

        I work with other models too and get good results at times but if it is something particularly difficult the Anthropic models tend to do a better job for me.

  • Why should anybody care if this drives electric bills up to $1000/mo for the typical household?

    We have unlimited energy, no?

    Dipshits aren't creating a global energy crisis right now.

    The world economy isn't headed for a global depression.

    Natgas should be burned for LLM hallucinations and cats driving motorcycles, not converted into fertilizer to stave off a massive African famine.

    Western woke governments haven't spent the past fifty years blocking new energy generation at every opportunity.

    Right?

    Don't invit

  • As in "Take the money and run"?
