DRAM Prices Soar as China Eyes Self-Reliance For High-End Chips (nikkei.com) 30

Standard DDR4 DRAM prices doubled between May and June 2025, with 8-gigabit units reaching $4.12 and 4-gigabit units hitting $3.14 -- the latter's highest level since July 2021, according to electronics trading companies cited by Nikkei Asia. The unprecedented single-month doubling follows speculation that Chinese manufacturer ChangXin Memory Technologies has halted DDR4 production to shift factories toward DDR5 memory for AI applications.

DDR4 currently comprises 60% of desktop PC memory while DDR5 accounts for 40%, per Tokyo-based BCN research. Samsung Electronics, SK Hynix, and Micron Technology controlled 90% of the global DRAM market in Q2 2025.

  • 10 bucks on a low-end laptop? Most everything better is DDR5.

    • by aaarrrgggh ( 9205 ) on Tuesday August 05, 2025 @05:59PM (#65568642)

      And then you have Apple, who will likely increase the price of an extra 4GB from $200 to $400.

      • My 5 year old Dell laptop recently wore out and shopping for a new one I was surprised how little, if any, specs have advanced for the same money (~$2k). In particular it doesn't look like anybody even makes a 15" 4k oled display any more. I suppose the chips are faster, so that's good, but the RAM, storage, and resolution are all about what they were.
        • 32GB is common now, 16GB is the norm, and 8GB is low end. It used to be 4, 8, and 16 a few years ago.

          Screen sizes and resolutions are determined by big companies making big orders. I have had two OLED laptops, and I chose LED for my newest, and the LED cost more. Higher rez and lower battery use, please. That said, most people are happy with 4k, and the companies follow the market. Also, I can get a 4TB nvme drive for much less now than I could a few years ago. 8TB drives have come down a lot in price too, despite inflation.

          • I'm curious why you chose LED?

            I wasn't looking for anything above 4k, even for that you have to get somewhat up there in the lineup.

            Same 1 TB as the last time for me, almost sorry to say I just haven't found a use for more in a laptop.

            • Wanted 4k+ in a 13" laptop and better battery life. OLED sucks power, especially if you have lit backgrounds. I run at HD a lot for the battery life, and then 4k for movies. Also, brighter than OLED. And the OLED wasn't 4k. It was 3500 or some such nonsense.

              • Huh, I had thought OLED was usually lower power. I can see why not with a bright background though: many sources of light vs. a few bigger ones.
                • Not with the two OLED laptops I owned. They sucked at battery life, so badly I returned the ASUS creator. I still have my old alienware. But it's irritating to use a black background all the time in lit rooms.

        • 4k is way overkill for a 15" display, probably why nobody makes it. I'm surprised anyone did, but I suppose there's money to be made from people who spend more time looking at the spec sheet than the screen.

          My own ca. 2020 Dell laptop was $1200 with either 4 or 8 GB RAM (forgot the original spec) and a 256 GB SSD. I just bought a desktop for a similar price ($1400) with 32 GB RAM (DDR5 now) and 2 TB SSD. That's nearly an order of magnitude more storage.

          • For me the "killer app" for a high-res laptop display is maps. My outdoor hobbies call for looking at a lot of maps, including e.g. tracing out trails in aerial imagery. And lots of small labels. Maps look so nice in high-res.

            Also, I find that laptops with lower-res screens generally lack the gpu horsepower to drive an external high-res screen (I'm sure there are exceptions). Plugging the laptop into my 4k tv makes a very nice setup for me now that I need reading glasses on the laptop.

        • by mjwx ( 966435 )

          My 5 year old Dell laptop recently wore out and shopping for a new one I was surprised how little, if any, specs have advanced for the same money (~$2k). In particular it doesn't look like anybody even makes a 15" 4k oled display any more. I suppose the chips are faster, so that's good, but the RAM, storage, and resolution are all about what they were.

          I explicitly don't want 4K on a 15" screen. I don't even want it on a 17" screen. I suspect I'm not alone here as higher resolutions have meant that a lot of things have become too small to see comfortably. Also as a gamer, it takes a lot less GPU to run 1080p on a laptop. I think 4K screens have just become willy-waving, especially on laptops. Refresh rate is more important these days as a gamer, G2G times were always more important with LCD/LED screens.

          The Asus I bought in 2022 still has at least 18 m
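          The GPU-load point above is easy to put a number on: fragment workload scales roughly with pixel count, and 4K pushes exactly four times the pixels of 1080p. A first-order sketch (real GPU load also depends on geometry, shaders, and settings, so this is only the rasterization side of the story):

```python
# Rough pixel-count comparison between common laptop resolutions.
# This is a first-order sketch: real GPU load also depends on
# shader complexity, geometry, and quality settings.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>5}: {pixels:>10,} pixels ({pixels / base:.2f}x 1080p)")
# 4K rasterizes 4.00x the pixels of 1080p, which is why the same
# laptop GPU that holds a steady frame rate at 1080p struggles at 4K.
```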

          • by Bert64 ( 520050 )

            A higher resolution making things small is due to poorly designed software.
            Sizes should be specified in real-world units (e.g. 1pt is 1/72 of an inch), not arbitrary measurements like pixels.
            A higher resolution screen, when used properly, just results in more detail. Things will only get smaller if you decide to scale them down; by default they should be the same size.

      • 8 more GB for $200 on the macbook air.

  • Is this just noise being manipulated into a narrative designed to reinforce fears due to scarcity, which exists primarily in our heads?

    • Volatility in DRAM spot prices certainly isn't anything new, but it probably doesn't help that we've got a push by the high-end vendors to move more HBM at fancy AI-part margins. We're also at the somewhat annoying point in the DDR4/DDR5 transition where you can still put together a pretty plausible computer with either, but you need to pick a completely different CPU to do so, rather than it being one of those where memory controllers currently support both and it's just a question of how they laid th
  • by rsilvergun ( 571051 ) on Tuesday August 05, 2025 @06:28PM (#65568700)
    Because one of the big manufacturers said they're going to stop making ddr4 so of course it's shooting up. The industry is moving on to ddr5.

    There's nothing unprecedented about this it happens every time during the final transition between ram generations. The price will shoot up for a little while and then it will collapse as people's old ram makes it onto ebay.

    I just bought 32 GB of ddr3 for an old i5 board I've got kicking around and I paid $30 for it shipped.
    • Because one of the big manufacturers said they're going to stop making ddr4 so of course it's shooting up. The industry is moving on to ddr5.

      "Samsung Electronics, SK Hynix, and Micron Technology controlled 90% of the global DRAM market in Q2 2025"

      So, at most ChangXin Memory Technologies has 10% of the worldwide DRAM market. Can a "minor" manufacturer really cause market prices to double?

      There are indications that Q2 DRAM sales increased at least for some of the big 3. This indicates that demand was strong, and this is the likely reason for increased prices. If there was an issue with supply, perhaps the high demand and higher margins for HBM

      • That's the issue. They've more or less been keeping the market going. Among budget gamers it's a big deal because you can still get cheap am4 boards and you can pick up a Ryzen 5600 for under 100 bucks pretty easily. Basically for $250 you can set up a good motherboard, CPU, and RAM combo and then throw a $300 used GPU at it, or spend an extra hundred bucks for a 9060 XT 16 gig if you want to go all out.

        If ddr4 prices shoot up then that blows that up and you are going to end up having to step up to am5.
        • by ls671 ( 1122017 )

          ddr4 and older pcie suck speed-wise. With recent pcie (5+), ddr5, and a recent CPU you can easily run AI workloads with very acceptable performance without a GPU. I recently upgraded an old monster server running 50+ vms with 256GB ram and 48 CPUs, and I replaced it with a relatively smaller 128GB ram server with 12 CPUs (12 x AMD EPYC 4244P 6-Core Processor, 1 Socket). The new server is running much faster since it has ddr5, pcie5, uses SSDs only, and some vms run AI loads without any GPU, which I couldn't

  • are we really commenting on memory at $4/GB? I remember taking out a loan to get 64k on my TRS-80. You kids today.

    This screed typed on a machine with 32G RAM. wow.
    • It seems like a rounding error to me. Besides, I thought Micron was still around cranking out competitive chips, and we are not so dependent on a Chinese manufacturer?
      • It seems like a rounding error to me. Besides, I thought Micron was still around cranking out competitive chips, and we are not so dependent on a Chinese manufacturer?

        It's a global market so what happens in China could impact those around the world. It kind of depends on how much of the total supply this impacts.

        There must be a lot of factors to consider. How much is produced in the USA versus imported from elsewhere? How much of the end user cost is in shipping costs such as labor and fuel? Tariffs? Any piracy or such impeding trade? (For example, is the Panama Canal operating as it should?)

        Then could be trading on futures. What could impact future supplies and d

        • Re-reading it, prices doubled when one producer who makes 10% of the chips went offline temporarily. Even though I think the chips are still very cheap, that still kind of smells.
    • {Said in a Yorkshire accent}

      TRS-80? Luxury!

      {I'll leave as an exercise for the reader how the conversation goes from there to the punchline.}

      And if you tell the young people today they won't believe you.

      • Those were the days! I got my father to buy 16KB of RAM for $50 in the early '80s. It was heaven! $50 back then must have been a chunk of change; I didn't comprehend that part of it. It was for a TS1000 computer. My friend had a TRS-80 that I got to play on too. I did appreciate it very much, and wrote a lot of software. I just wish I had published it. It was just pure fun back then.
