Memory Prices Have Nearly Doubled Since Last Quarter (counterpointresearch.com)
Memory prices across DRAM, NAND and HBM have surged 80 to 90% quarter-over-quarter in Q1 2026, according to Counterpoint Research's latest Memory Price Tracker. The price of a 64GB RDIMM has jumped from a Q4 2025 contract price of $450 to over $900, and Counterpoint expects it to cross $1,000 in Q2.
NAND, relatively stable last quarter, is tracking a parallel increase. Device makers are cutting DRAM content per device, swapping TLC SSDs for cheaper QLC alternatives, and shifting orders from the now-scarce LPDDR4 to LPDDR5 as new entry-level chipsets support the newer standard. DRAM operating margins hit the 60% range in Q4 2025 -- the first time conventional DRAM margins surpassed HBM -- and Q1 2026 is on track to set all-time highs.
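A quick sanity check on the tracker's numbers, using only the figures quoted in the summary (illustrative arithmetic, not Counterpoint's methodology):

```python
# Figures from the summary: 64GB RDIMM contract price, Q4 2025 vs Q1 2026.
q4_price = 450.0
q1_price = 900.0

# The headline RDIMM example implies a 100% quarter-over-quarter jump,
# slightly above the 80-90% range quoted for the broader market.
qoq_growth = q1_price / q4_price - 1  # 1.0, i.e. +100% QoQ

# For the price to cross $1,000 in Q2 as Counterpoint expects,
# growth only needs to slow to about 11% QoQ.
q2_growth_needed = 1000.0 / q1_price - 1  # ≈ 0.111
```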
Gauging (Score:2)
DDR, NAND, and even spinning rust have gone through the roof.
Lucky for me, I did all my buying during Microsoft's forced-upgrade push, so I won't need any memory until after the AI hardware market collapse.
Re: (Score:1)
What gauge is it?
Re: (Score:3)
12 gauge, 20 gauge. Whatever it takes.
Re: Gauging (Score:3)
A rookie mistake. I once used a 12 gauge to open a can of beans. Beans everywhere. Which is why every household needs a variety of gauges and calibers suitable for many situations.
Re: (Score:2)
Re: (Score:2)
Same here. The only thing I got was a spare 2TB SSD, still at old prices.
The collapse of the LLM insanity cannot come soon enough.
Re: (Score:3)
Why would the "LLM insanity" collapse? The demand is strong and growing! I cannot remember a "technology" with a faster adoption rate in my lifetime.
Four out of five of the top AI companies have weathered multiple economic downturns and have other business ventures funding the LLM wing. OpenAI is the only one with all its eggs in one basket, and I am sure Microsoft, Google, or Meta would buy it up in a heartbeat just for the server space. The fifth, nVidia, makes the gold rush shovels, a consumable: GPUs, t
Re: Gauging (Score:2)
If only pets.com had said they would ship for free for only $139/year, then they would have been as successful as Amazon.
Re: (Score:2)
Or maybe pets.com should have started with something lighter, like books for dogs, that the US postal service would ship cheaply.
Then over 25 years build up a shipping and warehouse network that fits their shopping regions while using USPS and UPS to fill in any holes.
Re: Gauging (Score:2)
Better get those circular deals going (Score:3)
Re: (Score:3)
No problem for the AI hyperscalers... they'll just do some 100-trillion-dollar deals back and forth and it will all work out.
I genuinely don't think the circle-jerk deals are the real force behind this. Microsoft "investing" in OpenAI doesn't inherently create datacenter demand. OpenAI "investing" in nVidia doesn't either.
What does create demand is all the people asking CoPilot to summarize an e-mail for them. Or to write a paragraph of code they could Google the results of. Or create a picture of their wife's head on a walrus. The load is what's screwing us. All the useless, pointless, wasteful load.
Re: (Score:2)
No problem for the AI hyperscalers... they'll just do some 100-trillion-dollar deals back and forth and it will all work out.
I genuinely don't think the circle-jerk deals are the real force behind this. Microsoft "investing" in OpenAI doesn't inherently create datacenter demand. OpenAI "investing" in nVidia doesn't either. What does create demand is all the people asking CoPilot to summarize an e-mail for them. Or to write a paragraph of code they could Google the results of. Or create a picture of their wife's head on a walrus. The load is what's screwing us. All the useless, pointless, wasteful load.
Agree. That could be solved by the AI-bros actually charging for the services, which will have to happen someday. They will have to start making actual profit on this shit, someday. Although, I'd also think that a lot of the useless shit people ask AI to do could be handled by local models, which would be cheaper to run. I would think summarizing an email should certainly not require 10 top of the line GPUs to do by now.
Re: (Score:2)
It doesn't, and they wouldn't. Your request doesn't get 10 top of the line GPUs, it probably gets one for a second or so, or maybe a few for a fraction of that. Then the next of your billion friends gets to use it. By the time they've had their turn you've had a chan
Re: (Score:2)
No problem for the AI hyperscalers... they'll just do some 100-trillion-dollar deals back and forth and it will all work out.
I genuinely don't think the circle-jerk deals are the real force behind this. Microsoft "investing" in OpenAI doesn't inherently create datacenter demand. OpenAI "investing" in nVidia doesn't either. What does create demand is all the people asking CoPilot to summarize an e-mail for them. Or to write a paragraph of code they could Google the results of. Or create a picture of their wife's head on a walrus. The load is what's screwing us. All the useless, pointless, wasteful load.
Agree. That could be solved by the AI-bros actually charging for the services, which will have to happen someday. They will have to start making actual profit on this shit, someday. Although, I'd also think that a lot of the useless shit people ask AI to do could be handled by local models, which would be cheaper to run. I would think summarizing an email should certainly not require 10 top of the line GPUs to do by now.
They'll monetize this shit through shoveling ads at the users.
And while you are correct that local models would fit most use cases, there's *MUCH* more incentive to keep all of this based on datacenters. The companies behind it want your data. They *NEED* your data. They desperately need your data. It is their lifeblood. And not just a little of your data. No, if you have a thought that registers high enough for you to ask an AI a question, they want that question, and your response to the answers, because
Re: (Score:2)
Re: (Score:2)
Nah. That's the miracle of the circular deal:
The RAM manufacturer invests $100 Million into the AI company, in return the AI company buys $100 Million in RAM.
The AI company gets the RAM they need, and their market valuation is inflated by the new investment.
The RAM manufacturer gets to book a $100 Million sale, AND they have an asset on their books of a share in an ever-increasingly valuable AI company, all of which looks good to their investors and drives their share prices higher. The RAM company loses
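The round trip described above can be sketched as a toy two-party ledger (the $100M figure and the account names are illustrative, not from any real filing):

```python
# Toy ledger for the "circular deal": no net cash moves, but both
# parties end up with something new on their books.
ram_maker = {"cash": 0.0, "revenue": 0.0, "equity_stake": 0.0}
ai_co = {"cash": 0.0, "ram_assets": 0.0, "capital_raised": 0.0}

# Step 1: the RAM maker invests $100M in the AI company.
ram_maker["cash"] -= 100.0
ram_maker["equity_stake"] += 100.0
ai_co["cash"] += 100.0
ai_co["capital_raised"] += 100.0

# Step 2: the AI company spends that same $100M on RAM.
ai_co["cash"] -= 100.0
ai_co["ram_assets"] += 100.0
ram_maker["cash"] += 100.0
ram_maker["revenue"] += 100.0

# Both parties end up cash-neutral, yet the RAM maker books $100M of
# revenue plus a $100M stake, and the AI company books $100M of
# hardware plus a $100M raise.
```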
More than doubled (Score:5, Informative)
Up until September, the G.Skill memory kit I'd been buying to build computers was $119.99, and now it's over $400. That's close to a 400% increase.
Re:More than doubled (Score:5, Interesting)
Yes, it's nearly 4x since September. The article is blinkered.
Re:More than doubled (Score:5, Funny)
That's a 233% increase, not 400%.
Perhaps, if you had more memory, your calculator could figure it out.
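For anyone following along at home, here is the arithmetic behind the correction, using the prices from the grandparent post:

```python
old_price = 119.99  # pre-September price of the kit
new_price = 400.00  # current price

percent_increase = (new_price - old_price) / old_price * 100  # ≈ 233.4%
price_multiple = new_price / old_price                        # ≈ 3.33x

# A "400% increase" would mean the kit now costs 5x its old price,
# i.e. about $600 -- the new price is a bit over 3.3x the old one.
```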
Re: (Score:2, Insightful)
Maybe if you had some common courtesy...
Re: (Score:2)
Courtesy for somebody that proudly posts grossly wrong numbers? Naa. Not justified. Do better.
Re: (Score:3)
Nope. No amount of common courtesy makes your math work. Sorry, Charlie.
Re: (Score:2)
That was really a quite gentle correction for these environs.
Taking the time to make it humorous makes it A+
Re: (Score:2)
Then it wouldn't be as funny.
Re: (Score:3)
Re: (Score:3)
The DDR5 kit I got in August has gone from $121 to $590.
It's a very low-latency kit, I know they haven't all spiked as much but certain desirable specs are going crazy.
Be thankful that it's mostly only RAM, for now... (Score:5, Insightful)
Re: (Score:2)
It's no competition really. Billionaires get to decide the priority, and it will be for their money making data centers and not the nearby communities.
Well boys and girls (Score:5, Interesting)
The bloatware show is over. We coders aren't going to be allowed to import a massive library to solve some tiny problem in our projects. We might have to actually design and code up lightweight solutions again if we want our programs to actually fit on people's computers. People who put 8GB in their mid-range laptop are going to be stuck there for quite some time, at least another two CPU generations.
And that doesn't mean you can write your apps to eat up 7GB of RAM; no, people will still want to run more than one program on their computer at a time.
Re: (Score:2)
LOL, as if coders pushing text messaging apps like WhatsApp (which is currently using just shy of 1GB on my machine) give a rat's arse about consumers. No, the bloatware won't change. Libraries save coding time, coding time is dollars, and dollars are the only thing anyone cares about. The consumer is no longer right. We live in the enshittification age now: we remove features while increasing resource consumption.
Re: (Score:2)
Computer prices have varied a lot in the past 40 years, and especially if you take inflation into account, they've been in the cheap, commodity-priced position for a while now. There have been plenty of times it cost $2,000+ for what was considered, if not an entry-level PC, a "good but not premium" PC. That is, the type of thing your office would equip you with.
What I'm trying to say I guess is that, alas, no, developers aren't going to stop producing bloated stuff that imports hundreds of unverified thir
Re: (Score:2)
The bloatware show is over. We coders aren't going to be allowed to import a massive library to solve some tiny problem in our projects. We might have to actually design and code up lightweight solutions again if we want our programs to actually fit on people's computers. People who put 8GB in their mid-range laptop are going to be stuck there for quite some time, at least another two CPU generations. And that doesn't mean you can write your apps to eat up 7GB of RAM; no, people will still want to run more than one program on their computer at a time.
Bull. We'll be told by the powers that be that running "programs" at all on the local machine is no longer possible, to justify yet more data-suck. All tasks easily performed on the weakest of systems will be relegated to the cloud, and we'll be left with overpowered dumb terminals at home, because someone out there covets your data. Nobody's going to tell us to turn around on coding bloat beyond bloat. Not when there are non-existent datacenters that need a justification for eventually existing.
So has the secondary server market (Score:2)
I need to upgrade my ZFS pool from an old Xeon workstation to a proper server. There are no deals to be had anymore. Everyone is pulling the RAM and selling it separately. Used storage isn't getting any cheaper either. I'd like to upgrade my 8TB SAS drives to something larger, but not at these prices.