AMD Technology

AMD is Making Another Run at Nvidia With New 4K-Ready GPUs as Sales Collapse

AMD will launch its new Radeon RX 9070-series graphics cards in March 2025, promising "high-quality gaming to mainstream players" amid struggling sales. The company's gaming division reported $563 million in Q4 2024 revenue, down 59% year-over-year. The new cards will target the same market segment as Nvidia's RTX 4070 Ti ($799) and 4070 Super ($599), featuring a 4nm TSMC manufacturing process, ML-enhanced FSR 4 upscaling, and next-generation ray-tracing accelerators.

Steam Hardware Survey shows AMD's current RX 7000-series cards have minimal market presence, with only the 7900 XTX and 7700 XT registering on the list. Industry research indicates AMD sells approximately one GPU for every seven or eight sold by Nvidia. The launch timing could be opportune, as Nvidia's upcoming RTX 5070 features fewer CUDA cores than the RTX 4070 Super it replaces.


Comments Filter:
  • by Pseudonymous Powers ( 4097097 ) on Thursday February 06, 2025 @10:26AM (#65146877)
    I love how Nvidia's market cap apparently jumped by a factor of 10 or 100 or whatever due to the twin crazes of cryptocurrency and generative AI, and rather than recognizing the tenuous nature of this surge and immediately starting a plan for some sort of soft landing from the inevitable crash, its CEO responded by just wearing crazier leather jackets.
    • Intel crashed up for some 40-odd years, until it crashed down. I wouldn't hold your breath waiting for the crash down.
    • Need more marble edifice around kitchen appliances!

    • Re:TULIPS FOREVER (Score:5, Interesting)

      by Luckyo ( 1726890 ) on Thursday February 06, 2025 @11:51AM (#65147153)

      AMD actually cashed in on crypto harder than Nvidia did. Their cards were utter unobtainium during the Ethereum craze, because they were more efficient for that specific workload.

      And that meant AMD let the gaming market for its cards just die. Gamers basically couldn't get their cards in any meaningful amounts for years. So game makers saw no sense in optimizing games for them, or for that matter implementing new techniques for them at any meaningful level of quality. For several years. For all of Nvidia's faults, they made a good effort at staying relevant for gamers during the Ethereum boom.

      And now, they're basically the only game left in town. And it is in fact mostly AMD's own fault.

    • Nvidia's market cap apparently jumped by a factor of 10 or 100 or whatever due to the twin crazes of cryptocurrency and generative AI

      The chip co's are probably thinking if they dive into enough fads, at least one will actually take off.

      I would suggest AMD focus on making a gaming graphics chipset that is optimized for games and only games. Gaming has relatively stable demand. Crypto and AI could zoom to space or crash. AMD probably doesn't have the funds to compete in all the fads such that picking a sta

    • I love how Nvidia's market cap apparently jumped by a factor of 10 or 100 or whatever due to the twin crazes of cryptocurrency and generative AI, and rather than recognizing the tenuous nature of this surge and immediately starting a plan for some sort of soft landing from the inevitable crash, its CEO responded by just wearing crazier leather jackets.

      Crypto really was a craze, as its only two known practical uses were money laundering and price speculation. Nvidia knew from the start that crypto was a passing fad. That's why they unsuccessfully tried to steer sales away from crypto miners and toward gamers.

      AI is different. Yes, it's unknown what the practical uses of ChatGPT are, but gen AI is much larger than ChatGPT or Deepseek, and AI is much larger than just gen AI. In contrast to crypto, AI has already produced practical use cases that have already earned b

    • A CEO's job requirement is to not plan for crashes. The line must go up.

    • Did you short and get your free money? Momentum might even be on your side now.

  • Sales are down because people are waiting for the new models, which are all sold out. AMD is not going to jump in there and get these buyers.

      • Some of those people are waiting for AMD instead, because this generation of Nvidia cards offers too little benefit for the price increase. The serious gamers who will pay this much don't want cards that are only faster with more fake frames.

      • Outside of a halo 5090, the other three cost exactly the same as the previous gen, including 5070, 5070 Ti and 5080. So, what price increase are you talking about?
        • No new GPU is available outside of the scalper market, and the AIBs are upping their prices well above FE MSRP.

          It's a significant price increase if you actually want to purchase one, over prices from just November. Often double or triple the price from scalpers. The 4090 was almost at MSRP when they stopped production.

    • I would disagree. The initial reviews on the 5000 series are that they are underwhelming, especially for the increasing prices Nvidia is charging. If AMD can put out a price-competitive product, people will buy it. Also, while new cards generally sell out, many feel this launch was more of a paper launch where few if any cards were actually shipped. If AMD actually has cards people can buy, people will buy them.
    • by Targon ( 17348 )

      You misunderstand the difference between the high end and the lower end of the video card market. For those looking for decent mid-range performance at a good price, AMD has the better product. Sure, you have a LOT of idiots with an RTX XX60-tier card who think that NVIDIA is the best, but in that price range, AMD has often been a far better choice.

      Do you think people at the low end turn on ray tracing? If they leave ray tracing off, then NVIDIA has no actual advantage over AMD at

  • by Virtucon ( 127420 ) on Thursday February 06, 2025 @10:32AM (#65146903)

    Yesterday: "AMD outsells Intel in the Data Center!"
    Today: "AMD is losing in the GPU market!"

    Feast or famine. Frankly, I'm not interested in $2K video cards. I'm not crypto-mining, and I'm not looking to run LLMs. My RX 6800 is doing fine, and I don't have to take out a home equity loan to pay for its electrical usage.

    • by KingFatty ( 770719 ) on Thursday February 06, 2025 @10:41AM (#65146927)
      Couldn't it just be that both are true simultaneously/concurrently? I feel like the data center market is separate from the GPU market, so AMD can win in one market while losing in another?
      • When comparing AMD vs Intel in the data center, it's their general compute products: EPYC vs Xeon, or whatever Intel calls it now. AMD is nowhere close to Nvidia in GPU sales, in the data center or in consumer products.
      • GPUs are big business in the Data Center as well. For example, El Capitan (LLNL) has almost 44,000 AMD GPUs in it, along with the same number of EPYC CPUs. It's all in HP packaging, but it's AMD CPUs and GPUs.

    • Tomorrow: "Intel integrated graphics beat NVidia in portable devices!"

      Nvidia: Rock
      Intel: Paper
      AMD: Scissors

      • by Luckyo ( 1726890 )

        Actually, AMD has utterly crushed Nvidia in ultraportables. The only exception is Nintendo, which never cared about performance. Everyone else went with a deeply specialized AMD SoC specifically designed for those systems, be it the Steam Deck, major Chinese manufacturers, or major Western manufacturers. AMD's ultraportable SoC has amazing CPU + GPU performance and low power consumption on top of it.

        Well, except for MSI iirc. But that intel based portable was a joke that no one bought.

    • by Gilmoure ( 18428 )

      [hashtag] The New Beleaguered

  • by haxor.dk ( 463614 ) on Thursday February 06, 2025 @10:42AM (#65146935)

    I'm very impressed by the continuous advances in GPU technology having been made in the past 25 years, but instead of the next monster gfx/ai/ultranumbercruncher card, I'd really just like a small gfx device with a consistently low power draw, low need for active cooling, made with quality components that don't give up the ghost after two or three years of use, and with quality FLOSS driver and source code support.

    Nvidia and ATI/AMD have had over two decades to get their stuff together, and still Intel's integrated offerings are the winning bet for the above criteria.

    • by JBMcB ( 73720 )

      Matrox, but they aren't cheap. Pretty sure they use ATI chips, but they are industrial-strength, designed to be run non-stop driving video walls or multi-monitor setups for control rooms.

      • by Luckyo ( 1726890 )

        ATI was bought by AMD a long time ago. Matrox today is basically out-of-date Radeon chips with Matrox's firmware/drivers on boards that have a lot of outputs.

    • You need to look for a processor with integrated graphics.

    • by e3m4n ( 947977 )

      I thought AMD also had integrated gfx for their processors? Nothing high end of course, but still decent.

      • AMD uses cutdown Radeon cores for integrated graphics CPUs. Sometimes the GPU is a generation or two behind depending on the model.
    • by Mspangler ( 770054 ) on Thursday February 06, 2025 @11:39AM (#65147119)

      I have a Ryzen 4600G that works just fine for my needs.

      My desktop may never have need of a separate GPU. The kid's PC is running some long obsolete video card from a decade ago. A new video card is desired there, but it must draw less than 200 watts.

      The industry went haring off after the 0.01% elite gamers for bragging rights, and then they wonder why sales are down. Look at Intel's B580 cards. It should be just fine as a midrange video card, but it doesn't work properly on CPUs older than 10th generation. The kid has an i7-8700. Oops. That sale is lost.

    • by Luckyo ( 1726890 )

      Steamdeck?

    • by CAIMLAS ( 41445 ) on Thursday February 06, 2025 @12:24PM (#65147237)

      You can buy a low power integrated option from AMD (eg. the mobile or NUC-type devices) which is far better than the Intel integrated stuff. There's an entire line of AI-enabled/GPU stuff now, and has been for a while. Eg. the Ryzen AI Max coming up...

    • by AvitarX ( 172628 )

      It's been a while since I shopped, but if they still do the GE APUs that sounds like what you want. They were the integrated graphics on the upper mid range desktop chips, something like 80% the clock, but 50% the overall power.

    • Low end Nvidia quadro cards (now called RTX) fill that niche. For example, a T400 uses 30W and can drive 3 x 4K monitors at 120 Hz. In a lab, I've seen these used 24/7 for more than 6 years without issue (to give an idea here, a server-grade motherboard failure would be more common). The drivers may be open-source now: https://developer.nvidia.com/b... [nvidia.com], at least on Linux.

      If you want to game, then I wouldn't recommend one of these cards just on a performance/price basis. Also the lower-end RTX cards usua

    • I'd really just like a small gfx device with a consistently low power draw, low need for active cooling, made with quality components that don't give up the ghost after two or three years of use, and with quality FLOSS driver and source code support.

      You might want to check out IGPs. Both Intel's and AMD's have been pretty good in recent years.

    • That exists. It's called the iGPU, and you simply need to buy an appropriate CPU with one included. Or failing that, get an NVIDIA GT 710 for less than $50; the power consumption there is low enough that the cards are fanless.

      Honestly, I don't think you've looked. There are far more computers out there with your desired kind of GPU than you think. They are in offices by the hundreds of thousands.

  • People stop buying cards, waiting for something good that is not a heater and not crazy priced.

    Whales don't care and still buy Nvidia at their crazy prices, but other people are trying to wait and only buy something if they really need to (a broken card, or the need for a new PC)... also you get the clueless who buy "cheap" Nvidia just because they know it is good, while its performance/price is actually pretty bad right now.

    AMD needs to release something that performs well, doesn't use 800W and doesn't cost $1k+, forget a

    • by Targon ( 17348 )

      You know it's winter, don't you? Many people go Intel and NVIDIA, just because that heats up a room VERY well.

      • by higuita ( 129722 )

        You know that there are more economical ways to generate heat... and also remember that winter ends, and then you have all the other months when generating heat might not be ideal, like summer!! Yes, if you have an HVAC system you can use it to cool down the home, but then you pay more than double on the electric bill, as you pay for generating the heat and then pay even more to remove that same amount of heat and deal with all the inefficiencies.

        even ignoring that, having a GPU using 800W is expensive i

  • by EnsilZah ( 575600 ) <EnsilZah@GIRAFFE ... minus herbivore> on Thursday February 06, 2025 @10:54AM (#65146979)

    I use my home computer to occasionally play games but also for professional applications for 3D, video, and the like.
    Most of that software is written to take advantage of CUDA, so while GPUs from AMD might have advantages like sharing system RAM with the CPU, or Intel's new GPUs might be competitively priced, I guess I'll have to stick with Nvidia for the foreseeable future.

  • "AMD's new GPUs are competitively priced and reasonably fast but the drivers are still a glitchy train wreck disaster from hell just like they were for the last 10+ years so don't buy them"
    • by Targon ( 17348 )

      Strange that AMD drivers have been very stable, while you need to roll back NVIDIA drivers a full year at times to fix problems.

      • by beep999 ( 229889 )
        I've found just the opposite to be the case. I've run NVIDIA for years and had almost no driver issues, either on Windows or Linux. But my teenaged son had an AMD card for a while and had nothing but problems with the Windows driver in both compatibility and stability. We finally gave up and switched him to NVIDIA and traded in the AMD card. I would LOVE to support AMD and switch to their cards, but the driver issues keep me from pulling that trigger.
  • "Nvidia's upcoming RTX 5070 features fewer CUDA cores than the RTX 4070 Super it replaces"

    I'm pretty sure the base 5070 replaces the base 4070, not the 4070Ti or 4070 Super. That would be replaced by a 5070 Super.

    • by Luckyo ( 1726890 )

      To be fair, CUDA cores don't actually have the same performance between architectures. It's in the name: the U stands for "unified". It's what allows different architectures with different performance to be programmed for once, with a loss of around 20% (highly variable minimum and maximum based on the specific task) compared to a fully optimized system, at a much lower programming overhead.

      But the "Super of the same SKU in the previous generation is better than the lowest-end card of that SKU in the next" situation is indeed pretty new for Nvidia.

        • Nah, it happened with the 3070 Ti (6144 CUDA cores) and the 4070 (5888 CUDA cores). Good point about the importance of architecture, though; I think a 4070 will outperform my 3070 Ti. Sure, fewer cores, but they're faster, tied to faster memory, and each core is probably more capable.
        • by Luckyo ( 1726890 )

          30-series CUDA cores were significantly worse than 40-series, if memory serves correctly. That was in fact one of the selling points of the 40 series over the otherwise excellent 30 series, which went toe to toe with its quite disappointing 40-series equivalents in gaming, and absolutely trashed them on RRP-to-performance ratio.

  • Headline: "AMD is Making Another Run at Nvidia With New 4K-Ready GPUs as Sales Collapse"
    Body: "The company's gaming division reported $563 million in Q4 2024 revenue, down 59% year-over-year."

    This implies that AMD is launching new graphics cards BECAUSE their sales have decreased. GPU research begins years in advance, and production takes months to ramp up and stockpile before launches. They've been planning to sell these video cards for years... it just so happens that their sales have fallen recently as well.

    • by Luckyo ( 1726890 )

      Thing is, you do actually make a production run before releasing the cards, so you have something to sell. Usually. Nvidia, for example, didn't for their current release, which is why they had a paper launch in the middle of the longest Chinese holiday, when every guest worker goes home to their family.

      But usually you make a bunch of cards and store them in warehouses so you actually have something to sell once they're released, rather than doing a paper launch with cards slowly trickling out over the next few months.

      So

      • by Targon ( 17348 )

        AMD makes the GPUs, and those GPUs need to go out to video card makers MONTHS before a launch. Have you noticed how you can't find those RTX 5000 series cards ANYWHERE? Even on launch day, many places had a total of 9 RTX 5090 cards available for sale, meaning they sold out instantly. That's at the level of a paper launch. AMD not rushing to release products may allow there to be a decent number of Radeon 9070 XT cards available at launch. The real question is what the performance is, and how much fas

  • Or maybe the market for gaming cards is not at $800 like during COVID, but maybe $400?
    • I'd wager that the majority of the people who can spend $700 on an AMD card can spend a bit more and get a 50-series nVidia card.

      The folks who can't spend $700 on a GPU are likely a prime market for the $250 Battlemage GPUs from Intel. They benchmark pretty favorably for the creative market, and AMD isn't offering double the performance at double the price.

      I'll add to the argument that in addition to price, there's not as much impetus to spend $700 on a GPU. Crysis is nearly 20 years old, and I haven't hear

  • How can a company that just took the server market from Intel do so badly at making GPUs and drivers?
    • I’m fairly certain it’s being done on purpose. Collusion with nvidia. The government needs to force them to spin off ATI.

    • by Targon ( 17348 )

      Do you consider the RTX 5070/5070Ti to be "weak GPUs"? The 9070 and 9070XT are looking to target those NVIDIA cards for the price/performance. If you only compare to the RTX 5090 which costs $2100+ even if not being sold by scalpers, then sure, AMD isn't even trying to sell at that price point.

    • How can a company that just took the server market from Intel do so badly at making GPUs and drivers?

      Because a CPU is not a GPU? A Ryzen doesn't have drivers. It's hardware. And far different tech than GPUs.

  • GPU prices are still hilariously high, and AMD has failed to compete with NVDA on the high end for a decade. At this point I have to assume it’s collusion: AMD agrees not to cut into NVDA’s GPU profits, and NVDA doesn’t encroach on AMD by developing high-end ARM chips.

    AMD should spin off ATI so we can get a true GPU competitor again.

    • by Targon ( 17348 )

      What makes you think cards that have 16GB of VRAM, need more layers on the PCB, need larger coolers than the old cards, and have more features are overpriced? The price of EVERYTHING is higher today than in 2017, so $550 for a video card today vs. a GeForce 1080 or Vega 64 (which was $600 at launch), and you are complaining about things being overpriced?

        • The 1080 retailed for $599 at launch. It’s not appreciably smaller than the 4080, which retailed at $1200. The price of all other electronics, including CPUs, has barely budged in that time span.

        “More features”? You stupid fanboy, what do you think happens as technology marches on?

        • by sinij ( 911942 )
          RTX 5080 retails for $1600 and you can't find one anywhere. It is absolutely insane. GPU should not be priced as a luxury item. This will kill PC gaming.
  • I will never buy another AMD GPU after getting repeatedly burned by drivers.
    • Hear, hear. Despite over 20 years of promises, AMD drivers have always been awful. I remember the huge Catalyst rebranding campaign that went nowhere.

      nVidia's drivers are rock solid, especially so for legacy applications.

  • It's shocking how many people have their heads in the sand about AI. It's no fad; it's bigger than the Internet. And what does AMD do? Release a GPU with 16GB of VRAM. There is no serious competition to NVIDIA, and boards that can handle video as well as local AI workloads are the future. If they think they can specialize in gaming, they are dreaming. NVIDIA can afford a far higher R&D budget.
