
The Accounting Uproar Over How Fast an AI Chip Depreciates (msn.com)

Tech giants including Meta, Alphabet, Microsoft and Amazon have all extended the estimated useful lives of their servers and AI equipment over the past five years, sparking a debate among investors about whether these accounting changes are artificially inflating profits. Meta this year increased its depreciation timeline for most servers and network assets to 5.5 years, up from four to five years previously and as little as three years in 2020. The company said the change reduced its depreciation expense by $2.3 billion for the first nine months of 2025. Alphabet and Microsoft now use six-year periods, up from three in 2020. Amazon extended to six years by 2024 but cut back to five years this year for some servers and networking equipment.

Michael Burry, the investor portrayed in "The Big Short," called extending useful lives "one of the more common frauds of the modern era" in an article last month. Meta's total depreciation expense for the nine-month period was almost $13 billion against pretax profit exceeding $60 billion.


  • by Mr. Dollar Ton ( 5495648 ) on Monday December 08, 2025 @11:36AM (#65843337)

    It isn't like all these videocards will burn up in the race for "AGI". They will remain in working order long after the so-called "AI" bubble is gone.

    • Most of them aren't video cards as they don't have video output. A DAC and ports cost money that you don't need to spend to run LLMs. The other uses for these cards are mostly scientific, and there's not enough money in that to justify owning them. Perhaps the AI bubble crashing will lead to a push towards some kind of crypto still efficiently mined with GPGPUs. Eew.

      • If the AI crash happens in the next couple of years, these HUGE companies will likely hold onto the accelerators for a couple more years after that, and by the time they are sold at rock-bottom prices they will be truly worthless... which at least saves us from them being used by miners...
        • which at least saves us from them being used by miners

          My concern is that the companies that bought them to run LLMs will become miners themselves to try to recoup some of the costs.

          • by Junta ( 36770 )

            The question is: mine what? Ethereum moved away from GPU-friendly proof of work, and Bitcoin proof of work hasn't been viable on GPUs in ages...

            • Well I'm not an expert on this, but search results say "Monero, Ravencoin, Vertcoin, and Grin" are all still good to mine with CPU or GPU. Also there's no reason they couldn't introduce another coin or ten.

              • by Junta ( 36770 )

                I mean, sure you can mine some crypto, but the perceived value of those is essentially nothing.

                The market cap for all those cited coins together is about $7 billion (Monero being the vast majority of that). So mining them won't do much to recoup the expense unless the crypto-bros suddenly abandon BTC in favor of Monero (Ethereum is at $380 billion, BTC is at $1.8 trillion).

                • Perhaps they could establish a payment network for them or otherwise make them actually useful. File and pay your taxes for you.

          • My concern is that the part of the AI bubble that doesn't vanish when the bubble pops will be left behind to scoop up under-utilised datacenter capacity: surveillance and tracking.

            Oracle, Palantir, etc, are not going away anytime soon.

      • Whatever, matrix multipliers then.

      • It's still the exact same silicon and it's got the same problems. Not all of them burn out but some of them do.

        The real question is how long until it's replaced by newer or better hardware. Basically, will we see custom hardware replace video cards for LLM acceleration soon, similar to what we saw with Bitcoin?

        That won't help consumers because the fab capacity is just going to go to different silicon, but it does mean that a whole shitload of these GPUs will become worthless. I guess some of them wi
        • It's still the exact same silicon and it's got the same problems. Not all of them burn out but some of them do.

          The real question is how long until it's replaced by newer or better hardware.

          This is the real question, how long the useful life of the hardware is. Whether the companies or Burry is right is mainly determined by the actual lifetime of the hardware. Do companies keep these expensive GPUs longer than 3 years? If so, then a depreciation schedule of more than 3 years is the right thing. Is Burry suggesting that GPUs are being scrapped after about 3 years? Or is he just complaining that the depreciation schedule changed and that the change alone points to bad accounting? I'm guess

      • What do you mean these cards don't have video output?

        Aren't those the same cards used on CSI to flash all those faces and fingerprints when trying to make a match? (j/k!)

    • by Gleenie ( 412916 )

      I have heard that actually, they often don't. They run so hot that they often don't last more than 2-3 years before they start failing.

      Be that as it may, if they throw it away after 3 years even though it works because it's now 3 generations old and the new ones have 8x the compute for 90% of the power then depreciating over 6 years *is* fraud.

    • by Junta ( 36770 )

      They aren't "video cards", since they generally neither have video ports, nor do they fit in a standard form factor 'slot' form factor.

      If the LLM bubble evaporates, the workload appropriate to these devices will be dramatically lower. You *could* perhaps make a go of VDI and maybe someone takes another swing at a cloud gaming service (if someone went all in on Grace, then neither of those use cases would be well served either), but hard to imagine any of those markets sustaining the absurd footprint built

      • I know, but the label has stuck with me from the olden times.

        We've been using nvidia video boards for sciencey stuff (mostly nuclear power calculations) long before anyone thought about burning them for shitcoins or "AI", back when the main use was still cool gaming graphics.

    • It's a bit like a jar of jam. You can keep scraping it for a little more for quite a while, but eventually there isn't going to be any useful jam. Then you'll have to buy new jam. This is how depreciation works, you figure out when it's time to buy new jam and write off the "loss" of your asset over that predicted schedule.

      Because of the accounting and the second-hand market, sending those graphics cards to the dump is likely going to be a bigger net benefit than trying to sell used compute cards with no di

      • Of course. It isn't like I'm making a serious comment here, quite the opposite. If they've cooked the books, it is fraud.

        The question is, will the people who need to max out the bubble (and they are now above any control mechanism) let this fraud be investigated or punished.

        And you know the answer as well as I do.

          • Some of them will not only face no consequences, they'll be rewarded. Like the investors that convinced millions of 401K holders to sink their retirement into this garbage. It will be like Goldman Sachs in 2008, but with more of the guardrails torn down since then.

    • It isn't like all these videocards will burn up in the race for "AGI". They will remain in working order long after the so-called "AI" bubble is gone.

      Will they? The point of depreciation is that the device is deemed to be past its useful life and replaced over that period. That's what the write-off is all about. I'm with you by the way, I think the cards should remain in use, and I think companies should be forced to utilise them at full capacity for the entire duration of their write-off period.

      They want to claim they will be used continuously over that time, then let's force them to stick by their accounting trick.

      • Yep. I sure hope they'll dump the depreciated ones onto the market, so that I can pick some up cheaply; the prices due to the bubble are plain ridiculous. On the other hand, given how deep the nvidias are in the game, who knows.

        You think they want cheap competition in the form of their products from a few years ago?

        In the end, it may come to a sell-off due to other reasons than depreciation...

  • by dskoll ( 99328 ) on Monday December 08, 2025 @11:37AM (#65843341) Homepage

    I have always thought three years was too short for servers and network equipment. Especially now that Moore's Law is slowing down, I think a 5-year depreciation period for servers makes sense.

    For AI processors, though, I think three years might be too long given how much change is going on in that space.

    • by Malc ( 1751 ) on Monday December 08, 2025 @11:47AM (#65843363)

      Indeed. I learnt recently that one of our GitLab VM hosts for Linux build runners is hardware from 2012. The dev team discovered this when a vendor sent us an updated library that dropped SSE4.2 support and required AVX2, causing our smoke tests to fail and thus fail the builds. Why throw away hardware that is still working and performant?

      • by DarkOx ( 621550 ) on Monday December 08, 2025 @12:58PM (#65843577) Journal

        Just because an asset is fully depreciated on the books does not mean a business has to throw it away.

        They may want to, because a new asset might be more efficient, more reliable, have lower maintenance costs, and of course it might be safer, which could translate into lower insurance rates. And obviously depreciation goes against profitability and therefore gains you some favorable tax treatment, which shifts forward the margin at which you might replace a depreciable asset.

        That said, I worked at a company about a decade ago that was regularly running a punch press built in 1898. Pretty sure that was fully depreciated. It was still shaping metal cases just fine and nobody saw a reason to buy a new one.

        I think we are just now entering an age of computing where decade-plus-old hardware really might do a given job as well as anything new. Depending on the scale, other concerns like energy consumption might not matter much either.

        If you look at just PCs (server/data center applications are more complicated), from the mid 1970s up until maybe the mid 2000s each generation was obviously a significant leap forward for the typical home user. Electron, more complex cryptography, etc., mean that a c2005 PC is pretty much hopeless, but by the time you get to 2015 or so you could still be using that system today if it was high-end at the time. For most business applications, a 2017-or-newer system is probably indistinguishable from the latest and greatest for all but the most demanding users (recognize that Slashdot bias probably means you're a demanding user). As far as Jim in AP is concerned, when he selects re-calculate from the menu in Excel it was fast then, it's fast now, and the pie chart he sends his VP to include in the executive briefing looks about the same in terms of resolution and color.

          It really is, though, quite a new thing for the response to 'new PCs' to be 'why?'

        Which is, I think, why this AI hype cycle seems outsized even as tech hype cycles go; if the industry can't sell you AI accelerators, they don't really know what else to do...

        • by Malc ( 1751 )

          Yes, I totally agree, you have to pick a reasonable length of time for depreciation. Three years is clearly too short for a lot of devices now, although I wouldn't suggest 18 years is appropriate either!

        • by Gleenie ( 412916 )

          Writing this on a PC from 2017. Admittedly it has had a mid-life CPU and GPU upgrade as well as additional disks, but the core is still the same. The only reason I will be retiring it from active service is that I'll be f'ed before I get Windows 11. Just waiting for next year's AMD video cards (AMD tending to work slightly more easily in Linux) and then I'll build a new one, which will finally be a Linux daily driver again for the first time in like 20 years.

        • Just because an asset is fully depreciated on the books does not mean a business has to throw it away.

          What you're talking about is the opposite of the problem at hand. What is being discussed here is pushing out depreciation cycles despite replacing hardware with new cutting edge stuff to spread out the tax benefits. I think businesses should be forced to use equipment at full capacity for the duration of their depreciation period.

          If they want to give it up early, fuck them let them take an impairment charge.

          Bonus points if for every piece of hardware that doesn't meet the estimated timeframe an accountant'

      • Why throw away hardware that is still working and performant?

        Working and performant is relative to the task at hand. You may not need cutting edge, but when cutting edge is the difference between making money in a new market and not making money in a new market then the hardware ceases to be usable long before it is end of life.

        By all means keep your server from 2012 if it works for you, as long as my painfully slow laptop is replaced next year when its 3-year cycle is up.

    • Precisely. Also, related to your observation, process shrinks have slowed down as we approach atomic-scale shrinks. There will come a time sooner rather than later when we can't shrink any more, and w/ that, all those cost reductions that companies factor into their execution plans will no longer be valid: it'll be a constant no matter what

      So the fabs may be completely depreciated, be it year 1 or year 5, but as long as there is demand for their chips, they'll keep producing, rather than getting convert

  • I've appreciated the cheap, practically new equipment on eBay for pennies. But yeah, it's absurd. I've had a total of 2 ports fail on a switch in the last 18 years. Just run them till something goes wrong. Why else have redundancy?

      It's like the old adage: The architect 2x's the design for resiliency, the engineer doubles it again for extra redundancy, the carpenter reinforces it 2x for safety and suddenly you're 8x instead of 2x.

  • by Targon ( 17348 ) on Monday December 08, 2025 @11:39AM (#65843349)

    If a new generation of product comes out every one or two years, many companies will push the idea of, "customers will buy every new generation" as they inflate their projected numbers. They will also just assume that companies will continue to just throw money at buying AI based products, even when they already have enough to meet their needs.

    There is SOME merit to expecting that after a 30 percent performance boost in a new generation, companies MAY decide to upgrade/replace equipment, but that is not automatic.

    • by Whateverthisis ( 7004192 ) on Monday December 08, 2025 @12:07PM (#65843413)
      I think you're missing the point. This isn't about inflated revenue projections. It's about inflated profits.

      Each of these companies is spending tremendous amounts on building servers and data centers right now. The cost of that CapEx is depreciated over its useful service life, which can vary quite a bit depending on what it is. Servers are typically 3 years or so, whereas real estate can be up to 28 or 30 years. Depreciation is a non-cash expense, but they get to claim it and amortize it out over many years, which reduces their profits and thus their tax basis.

      The problem is that it reduces profits, which makes the companies seem like they're spending too much money. As a calculated value, it's open to manipulation to make the company look better. It doesn't really matter what number of years you use for a given piece of equipment, as long as it's consistent and it makes sense. Changing your amortization schedule from what it was historically sends a signal that the company is artificially adjusting its numbers to make things look better.

      Using the numbers above, if Meta had the same pre-tax profit of $60B now but was using the 3-year depreciation schedule they used in 2020 instead of the current 5.5 years, then instead of depreciation being $13B it'd be $23.8B, meaning they'd lose nearly $11B in recorded profits, just from a calculation. So in essence this boosts their stock price by making them look more profitable than they are.
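
      To spell out that arithmetic, here's a minimal sketch in Python (the figures are the rough ones quoted in this thread, not anything taken from actual filings):

      ```python
      # Rough sketch: how much expense the same in-service asset base produces
      # under a 5.5-year vs. a 3-year straight-line schedule. Figures are the
      # approximate ones quoted above, not Meta's actual fixed-asset register.

      reported_depreciation = 13e9   # ~$13B on the current 5.5-year schedule
      current_life = 5.5
      old_life = 3.0
      pretax_profit = 60e9

      implied_asset_base = reported_depreciation * current_life   # ~$71.5B
      expense_on_old_schedule = implied_asset_base / old_life     # ~$23.8B

      print(f"Expense on 5.5-year schedule: ${reported_depreciation / 1e9:.1f}B")
      print(f"Expense on 3-year schedule:   ${expense_on_old_schedule / 1e9:.1f}B")
      print(f"Swing in reported profit:     "
            f"${(expense_on_old_schedule - reported_depreciation) / 1e9:.1f}B "
            f"against ${pretax_profit / 1e9:.0f}B pretax")
      ```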

      • by dgatwood ( 11270 )

        Using the numbers above, if Meta had the same pre-tax profit of $60B now but was using the 3-year depreciation schedule they used in 2020 instead of the current 5.5 years, then instead of depreciation being $13B it'd be $23.8B, meaning they'd lose nearly $11B in recorded profits, just from a calculation. So in essence this boosts their stock price by making them look more profitable than they are.

        True, but only momentarily. At the end of the first depreciation cycle, assuming purchasing of hardware is not accelerating, you're depreciating 5x as much hardware over 5x the time, and your momentary bubble in the stock price is gone.

        And even if hardware purchasing is growing right now, eventually, that will flatten out, and the above will be true.
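
        A minimal sketch of that steady-state effect, assuming a hypothetical flat $10B/year of hardware spend (the numbers are placeholders, not anyone's actual capex):

        ```python
        # With flat annual hardware spend, total annual depreciation converges to
        # the annual spend regardless of schedule length -- the schedule only
        # changes how quickly the expense ramps up.

        def annual_depreciation(yearly_capex: float, life_years: int, n_years: int) -> list[float]:
            """Straight-line expense per year when the same capex recurs every year."""
            expense = [0.0] * n_years
            for purchase_year in range(n_years):
                for age in range(life_years):
                    year = purchase_year + age
                    if year < n_years:
                        expense[year] += yearly_capex / life_years
            return expense

        capex = 10e9  # hypothetical flat yearly spend
        for life in (3, 6):
            series = annual_depreciation(capex, life, n_years=10)
            print(f"{life}-year schedule:", [round(x / 1e9, 2) for x in series])
        # Both series flatten out at ~$10B/year; only the early years differ.
        ```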

        The only real question should be whether the depreciation rate is reasonable. If you're still getting substantial use out of the hardware after five years, th

        • All of what you say is true, but the short term is the point, I think. They need to keep their stock price up given how much they're spending on AI that has no clear path to profitability, where it's tricky to measure whether it's all worth it. If their profits look like garbage because of these investments, it could bring their stock price down quite a bit, creating a cycle that's difficult to escape from.
      • by ceoyoyo ( 59147 )

        So in essence this boosts their stock price by making them look more profitable than they are.

        Sure it does. Any serious investor is going to look at their basic financial statement, not to mention the numerous articles written on the subject, and make an informed decision.

        The rest aren't going to give a shit what their profits are. Most of them think revenue is profit anyway.

      • Using the numbers above, if Meta had the same pre-tax profit of $60B now but was using the 3-year depreciation schedule they used in 2020 instead of the current 5.5 years, then instead of depreciation being $13B it'd be $23.8B, meaning they'd lose nearly $11B in recorded profits, just from a calculation. So in essence this boosts their stock price by making them look more profitable than they are.

        So, if the law lets them do that, then allow them their $11B, but then have the EPA & IRS clock them for $11B for the nation to process and bury their e-waste. Do this to all of them. It's a race, and when the bubble bursts only a couple will be left standing to service demand at a fourth or less of what they are predicting. It's all ego; the money is how they keep track of who's winning.

  • Who would have thought...

  • To some extent, it's always been the case that the value of just about anything is arbitrary, and varies according to context. But the underpinnings of what we call The Economy are becoming more and more divorced from any consistency or standards. Increasingly it's all a dirty exercise in what ranges from misplaced optimism to opportunism, extortion, and fraud.

    There seems to no longer be even a pretense of fairness, or duty to society, or basic decency among corporate interests. My Slashdot sig was meant to

  • Those AI servers are expensive, so why retire one early on the same terms as a normal server, which could be 10x cheaper? Better for the environment too, to an extent (if powered by carbon-neutral sources).
  • Because they will replace that chip in 2 to 3 years?

  • If you have a fleet of these servers in production, they can last a long time, many years.

    Eventually, they become too expensive to operate vs. new servers, mainly because the new servers are more power efficient and power is one of the largest operational costs of running a data center. The new servers tend to be faster and fewer are required, but they don't typically offer any new capability; it just comes down to the cost per unit of work.
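
    As a toy illustration of that cost-per-unit-of-work framing (every number below is invented, just to show the shape of the comparison):

    ```python
    # Toy comparison: an old server that still "works" vs. a newer one with
    # higher throughput. All numbers are invented placeholders, not real specs.

    POWER_PRICE_PER_KWH = 0.10  # assumed $/kWh

    def power_cost_per_unit(units_per_hour: float, watts: float) -> float:
        """Electricity cost to produce one unit of work."""
        kwh_per_hour = watts / 1000.0
        return kwh_per_hour * POWER_PRICE_PER_KWH / units_per_hour

    old = power_cost_per_unit(units_per_hour=1.0, watts=800)   # baseline box
    new = power_cost_per_unit(units_per_hour=3.0, watts=1000)  # 3x throughput, 25% more power

    print(f"Old server: ${old:.4f} per unit of work")
    print(f"New server: ${new:.4f} per unit of work")
    # The old box keeps running, but each unit of work costs ~2.4x as much in
    # electricity -- which is what eventually makes it too expensive to keep.
    ```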

  • by apilosov ( 1810 ) on Monday December 08, 2025 @01:34PM (#65843659) Homepage

    While one can reasonably argue if depreciation should be 3 or 6 years, the article is utterly dumb, quote:
    > "However, some types of assets fall in value more sharply early on, then stabilize and decline more gradually on a predictable curve. A case in point: According to Silicon Data, which tracks pricing for Nvidia chips, the average resale value for an H100 system in its third year of use recently was around 45% of the price for a new H100."

    1) Depreciation is unrelated to market value. Real estate is depreciated over 30 years whether its value has increased or decreased.
    Useful life is however long the item is actually used and contributes to company P&L -- unless otherwise declared by accounting guidelines. There are no binding guidelines here.

    2) Annual depreciation _could_ be taken (but usually isn't) as "(book value - salvage value) / useful life". Using the numbers in the article, if 2.5 years in the value of the asset is 45% of the purchase price, annual depreciation is ~55%/2.5 = 22% -- resulting in roughly the same depreciation as a 5-year accounting schedule.
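
    Written out with the article's H100 figure (and the purchase price normalized to 1.0), point 2 looks roughly like this:

    ```python
    # The "(cost - salvage) / useful life" arithmetic from point 2, using the
    # article's figure of ~45% resale value roughly 2.5 years into service.

    purchase_price = 1.00   # normalized
    resale_value = 0.45     # ~45% of new price, per the quoted Silicon Data figure
    years_in_service = 2.5

    implied_annual_rate = (purchase_price - resale_value) / years_in_service  # ~22%/year
    implied_writeoff_horizon = purchase_price / implied_annual_rate           # ~4.5 years

    print(f"Implied annual depreciation: {implied_annual_rate:.0%} of purchase price")
    print(f"Implied write-off horizon:   {implied_writeoff_horizon:.1f} years")
    # ~22%/year -- in the same ballpark as a 5-year (20%/year) schedule.
    ```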

    • 1) Depreciation is unrelated to market value. Real estate is depreciated over 30 years, whether or not the value is increased or decreased.

      Providing you don't double dip, there's nothing wrong with that. The concepts are different for an asset disposed of at end of life vs. an asset that is kept for its value. You'll find that in the accounting world no one (who isn't committing fraud) is depreciating real estate. They are depreciating the value of the buildings (39 years, not 30 years), while the land itself is kept as an asset on the books. The IRS would have some very interesting things to say to you if you attempted to depreciate the

  • by Casandro ( 751346 ) on Monday December 08, 2025 @01:40PM (#65843687)

    I work at a company that does do "AI", but we have a different approach than most companies you hear about: we make what's called a "Profit". We sell services for more money than they cost us. I am aware that this is a very freaky business model in AI.

    I have recently talked to the people more involved in the technical details, and they said it doesn't matter how advanced the model they use is; cheaper models work just as well as more expensive ones... for their use cases. They also mentioned that they use older and cheaper "graphics cards", as they are sufficient.

    It could be that once the bubble bursts and the demand for hardware collapses, many of the cards currently in service will be used for many years to come. It's a bit like network equipment after the .com bubble burst.

    • The issue is that the hardware costs money to run. If you don't have a way to generate a proportionate return from using it, then you are still just sinking money into the black hole, and that is not sustainable.

      Think about how it works with BTC mining - at a certain coin price and electricity cost, a given chip cannot mine a coin for less than the cost of the electricity to do so. So you would be a fool to run such a chip under those conditions, even if the chip was free.
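
      To put rough numbers on that break-even point (every figure here is invented):

      ```python
      # Toy break-even check for the mining analogy: even a free chip is only
      # worth running if it earns more than the electricity it burns.

      power_price = 0.10      # $/kWh, assumed
      chip_watts = 700        # draw of the hypothetical chip
      coins_per_day = 0.0005  # hypothetical mining yield
      coin_price = 150.0      # hypothetical $ per coin

      daily_revenue = coins_per_day * coin_price               # $0.075
      daily_power_cost = chip_watts / 1000 * 24 * power_price  # $1.68

      print(f"Daily revenue:    ${daily_revenue:.2f}")
      print(f"Daily power cost: ${daily_power_cost:.2f}")
      print("Worth running even for free?", daily_revenue > daily_power_cost)
      # Revenue below the electricity cost means the chip loses money even
      # though the hardware itself costs nothing.
      ```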

      AI at the moment is not generating

      • Well, the trick is that you are able to sell your services for a higher price than they cost you, for example by selling things people actually want to pay money for but that are cheap to produce.

  • As Cory Doctorow noted in a recent blog post, the chip manufacturers, and the purchasers, are claiming the chips will last five years... but the reality is three... unless they put a seriously heavy compute load on them, in which case some will burn out in 56 DAYS.

    Now consider the non-fully depreciated chips (and the related warranty) when they need replacing in *under* three years. The result... is, of course, "you should invest in AI, it's the future!!!", as they burn through ever more capital.

    • I predict the future by extrapolating from the past. All Corporations have lied in the past. I expect all Corporations are lying now and will continue to lie as long as lying is profitable. Prove me wrong.
  • Feels like a simple thing to audit: do a snap inspection and see the average age of the equipment currently in use.
    If it's what they're claiming, give them a refund.

  • In New Zealand, the IRD sets the depreciation rates, NOT the companies.

    Obviously they saw the reverse was full of opportunities for tax cheating.
    It's hard to believe, right.
  • Um, was there supposed to be anything magical about AI chips keeping their value--when GPUs and CPUs have not appreciated?
  • ...I remember when companies wanted to accelerate depreciation, so they could take the write-off sooner.

    I also remember the days when the tax laws made me a depreciable asset in my role doing tech research. I never quite worked up enough nerve to call the accountants and ask what happened after they had depreciated away my full value :^)
  • Things are happening a bit too quickly to be able to know how useful e.g. an H200 or B200 GPU today will be in five years. But it's not completely crazy to think it will still be useful to some extent. The oldest GPUs that we are still finding generally useful today for ML are Ampere. It's been five years since A100 was released and it's still quite useful today. V100, on the other hand, its Volta predecessor released in mid 2017, is not quite so useful. It's not very powerful for the electricity and cooli
