
AMD Unveils RDNA 3-Based Radeon RX 7900 XTX and 7900 XT Graphics Cards (hothardware.com) 50

Slashdot readers MojoKid and williamyf share the news of AMD's two new high-end graphics cards, the Radeon RX 7900 XTX and 7900 XT. "Priced at $999 and $899 respectively and available in December this year, the new Radeon cards are expected to go toe-to-toe with NVIDIA's GeForce RTX 4080 and 4090," writes MojoKid. HotHardware reports: AMD states that its goals for RDNA 3 are to extend its performance-per-watt leadership and to raise the bar for high-resolution, high-framerate gaming. To accomplish these goals, AMD has turned to a chiplet architecture, a first for gaming GPUs. The chiplet complex consists of a 5nm graphics compute die (GCD) flanked by up to six 6nm memory cache dies (MCDs). The RX 7900 XTX uses the full complement of six MCDs, which aggregate to a 384-bit memory bus (64 bits per die) fed by GDDR6 memory running at 20Gbps per pin. The RX 7900 XT uses five MCDs with a corresponding 320-bit bus.
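As a quick sanity check on those bus figures, here is a small back-of-envelope sketch (illustrative arithmetic only, not from the article) of the aggregate memory bandwidth implied by the bus width and the 20Gbps GDDR6 per-pin data rate:

    # Aggregate memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8
    def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
        return bus_bits * gbps_per_pin / 8  # bits -> bytes

    print(f"RX 7900 XTX: {bandwidth_gbs(384, 20):.0f} GB/s")  # 960 GB/s
    print(f"RX 7900 XT:  {bandwidth_gbs(320, 20):.0f} GB/s")  # 800 GB/s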

All of this increased bandwidth and these added resources translate to what AMD claims is up to a 1.7X performance uplift for the Radeon RX 7900 XTX versus its previous-gen Radeon RX 6950 XT in high-resolution gaming. That could put the card within striking distance of NVIDIA's GeForce RTX 4090, but it's hard to say until cards ship to independent reviewers for testing. Regardless, gamers will appreciate the RX 7900 XTX's price point versus NVIDIA's $1,600 top-end beast.

This discussion has been archived. No new comments can be posted.

  • I know what I'm buying next, and with those two AI units the art world will never be the same.

  • Price-wise, these look like they will put a lot of pressure on Nvidia to stop being greedy cunts.
    • by mobby_6kl ( 668092 ) on Friday November 04, 2022 @04:34AM (#63023619)

      Seems unlikely; these will probably be around 4080 performance. Especially considering the shit compute support, and probably lower raytracing performance as well, I suspect Nvidia isn't worried. Not great for us either; it's still a thousand bucks for a card.

      • by quall ( 1441799 )

        For a top-shelf card, $1000 isn't bad. Considering that the new 4090 performs nearly twice as fast as a 3090 Ti, even at $1600 that is a good deal. If you want a cheaper card, then just buy a lower-spec one.

        Even if the new AMD card performs 10% slower than a 4090... well, it's nearly 40% cheaper, so that's a great deal.
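        To make that argument concrete, a tiny hypothetical perf-per-dollar sketch (the 90%-of-a-4090 figure below is an assumption for illustration, not a benchmark result):

            amd_price, nvidia_price = 999, 1599
            amd_perf, nvidia_perf = 0.90, 1.00  # assumed relative performance
            print(f"AMD is {(1 - amd_price / nvidia_price) * 100:.0f}% cheaper")  # ~38%
            print(f"perf per dollar advantage: {(amd_perf / amd_price) / (nvidia_perf / nvidia_price):.2f}x")  # ~1.44x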

        • If you only want to use it for gaming that's fine, but nvidia is where compute is at.

          I'm appalled by these prices in general, though; they are still at cryptocurrency-inflated levels despite the precipitous drop in mining. People need to stop paying more for a GPU than for the whole rest of their system if we want prices to ever come down. Sure, GPU power has gone up, but so has CPU power, yet CPUs haven't seen price increases that significantly outpace inflation.

          • If you only want to use it for gaming that's fine, but nvidia is where compute is at.

            Only if you want to use their fixed-function CUDA pipeline, locking you into that hardware.

            Vulkan compute is the most cross-platform option, and OpenCL is easier to use. A lot of pre-existing applications are CUDA-only, so if you plan on running someone else's stuff, even then you have to hope you have the "right" fixed-function hardware to use it.

            AMD may have slightly less compute performance than you'd get from Nvidia's fixed functions, but at least the software will run on anything, even your more capable card next year.

            • Only if you want to use their fixed function cuda pipeline, locking you into that hardware.

              Vulkan compute is the most cross platform, and openCL easier to use.

              The relative performance difference is enough that there are many applications out there that use CUDA when hardware is available and something else as a fallback when it's not.

              • Depends on what you're doing; a lot of the heavy-lifting stuff has no fallback by default, and if it's using something like PyTorch, all you can typically do is make it use the CPU.

                Which for many purposes is useless compared to utilizing the GPU that is present.

                CUDA got an early stranglehold in academia by being a first mover; the continued addiction to it isn't necessarily a good thing.
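                As a minimal sketch of that CPU fallback (illustrative only; ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda interface):

                    import torch

                    # Use an accelerator when one is available, otherwise fall back to the CPU.
                    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

                    x = torch.randn(1024, 1024, device=device)
                    y = x @ x  # runs on the GPU when present, otherwise (slowly) on the CPU
                    print(f"matmul ran on {device}")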

          • Yeah, that's the problem. If I'm supposed to spend a THOUSAND BUCKS on a video card, I want it to be good at everything, not just making shiny games that I don't have much time to play anyway.

            It looks like the 4090 is selling pretty well, but then it does have pretty ridiculous performance. Hopefully nobody buys their $1200 4080 and they'll have to go back to more reasonable prices. Especially once AMD's cards are out.

        • The 4090 is probably going to get recalled or NVIDIA is going to pony up for shipping better power cables that don't melt. https://www.theverge.com/2022/... [theverge.com]

          • The 4090 is probably going to get recalled or NVIDIA is going to pony up for shipping better power cables that don't melt.

            Even among modular power supplies there's no standard for the cables, and the problem is that the socket used on the GPU is of a design that is inadequate for the TDP; you cannot fix it with a new plug. A hardware fix would involve both reworking the card with a new socket and crimping a new plug onto the power supply leads with a superior contact design that can actually handle the load. That is simply not going to happen.

            It will probably be cheapest to throttle the cards, and take a hit in a class

            • the problem is that the socket used on the GPU is of a design which is inadequate for the TDP

              No. The problem is that the socket on the GPU is sensitive to mishandling (e.g. bent pins / pins not making contact). The socket used on the GPU is perfectly fine and has ample spare power-handling capacity relative to the draw from the GPU. It's just a shitty connector design that is nonetheless working fine for the overwhelming majority of people.

              There won't be a recall or patch for this. I highly suspect there will be a lawsuit, because lawyers are like that, but it won't be won by consumers.

              • by raynet ( 51803 )

                The connector is just fine; the problem is the shitty soldered adapter, which can break its solder joints when bent. The sense wires still tell the GPU that 450W is available, so the outermost pins begin to heat up while drawing the power via the center pins.

                • Yep, exactly. Link to some investigation if anyone is interested: https://www.igorslab.de/en/ada... [igorslab.de]

                  We really need to put this stupid "the connector isn't rated high enough" thing to bed. It's conservatively rated for 600W (the contacts are good for roughly 660W) with only a 40C temperature rise, so it can handle more than 600W before anything melts, and the card connected to it draws 450W, 75W of which come from the motherboard slot anyway.
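                  For a rough sense of the headroom, a back-of-envelope sketch (assumes the commonly cited figures of six 12V contacts rated around 9.5A each; illustrative only):

                      pins, amps_per_pin, volts = 6, 9.5, 12.0
                      connector_capacity_w = pins * amps_per_pin * volts  # ~684W physical capability
                      card_tdp_w, slot_power_w = 450.0, 75.0
                      through_connector_w = card_tdp_w - slot_power_w     # ~375W actually via the connector
                      print(connector_capacity_w, through_connector_w, through_connector_w / volts / pins)  # ~5.2A per pin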

          • No, there's nothing wrong with the 4090. There's a problem with *some* shitty adapters and the cable-side connector, which relies on soldered terminals. That is it. At most they may need to ship out some adapters.

        • I know this is 2022 and this is a top-of-the-line card, but man, $1000? I remember 3dfx Voodoo add-on cards, and of course in absolute dollars they were less, but did I really pay the equivalent of $1000 back in 1993?
          • After some googling: at launch, the 4 MB Voodoo (1) was $299 in 1996, which an inflation calculator puts at about $570 now. That happens to be pretty close to the price of a PS5.

            As a fraction of the total computer cost, I recall getting a Pentium 90 Dell XPS in 1995 that was close to $3000, so the Voodoo was 10% of the overall computer.
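            That inflation adjustment is easy to sanity-check (illustrative; the CPI figures below are approximate annual averages):

                voodoo_price_1996 = 299.0
                cpi_1996, cpi_2022 = 156.9, 292.7  # approximate CPI-U annual averages
                print(f"${voodoo_price_1996 * cpi_2022 / cpi_1996:.0f} in 2022 dollars")  # roughly $560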

            • Ahh, remember the days when tech got better and cheaper?

              Now Nvidia has discovered they have an effective monopoly and can milk us, and we'll keep buying their shit.

              Come to think of it, phones, too, seem to be getting more and more expensive. I guess people will buy Apple stuff at any price, and Samsung and other OEMs are happy to match them on price.

              • Yes, straight from the horse's mouth, Nvidia's CEO:

                "Moore's law is dead," Huang said in response to PC World's Gordon Ung's question during the virtual press session. "And the ability for Moore's Law to deliver twice the performance at the same cost, or at the same performance and half the cost, every year and a half is over. It's completely over."

                Phones... don't even get me started.

              • by raynet ( 51803 )

                Thanks to inflation it is getting cheaper :)

        • For a top shelf card $1000 isn't bad. Considering that the new 4090 performs nearly twice as fast as a 3090ti, even at $1600 that is a good deal. If you wanted a cheaper card, then just buy a lower spec one.

          The 3090 Ti was a fucking joke though. Very little improvement over the 3080, let alone the 3090, for a ton of money. It just makes the 4090 look reasonable.

          The problem with "just buy a cheaper card" is that they're trying to move everything to a higher price point. The 4080 is now $1200, the fake 4080 aka 4070 is $900. I bought the GTX 1070 for around $380. Yes it was a while ago but not enough to explain more than doubling the price with inflation.

          • by raynet ( 51803 )

            The 3090 Ti was just Nvidia making sure they could claim to have the fastest GPU out there, and giving people who don't care about money the fastest gaming rig.

          • by nojayuk ( 567177 )

            The 4080 is now $1200, the fake 4080 aka 4070 is $900. I bought the GTX 1070 for around $380. Yes it was a while ago but not enough to explain more than doubling the price with inflation.

            The 1070 Ti is PCIe 3.0 with 8GB of GDDR5 RAM, ~1700MHz clocks, roughly 2400 CUDA cores, etc., with a peak compute figure of about 8 Tflops from roughly 7 billion transistors. The 4090 is PCIe 4.0 with 24GB of GDDR6X RAM, 2200MHz+ clocks, over 16,000 CUDA cores, and peak compute approaching 100 Tflops from 76 billion transistors.

            I
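            For anyone curious where those Tflops figures come from, a rule-of-thumb sketch (core counts and boost clocks are the commonly listed figures; 2 FLOPs per core per clock assumes one FMA per cycle):

                def peak_fp32_tflops(cores: int, boost_ghz: float) -> float:
                    return 2 * cores * boost_ghz / 1000.0  # 2 FLOPs (one FMA) per core per clock

                print(f"GTX 1070 Ti: {peak_fp32_tflops(2432, 1.683):.1f} Tflops")  # ~8.2
                print(f"RTX 4090:    {peak_fp32_tflops(16384, 2.52):.1f} Tflops")  # ~82.6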

    • AMD are quite competitive on raw gaming performance. They do, however, lack severely in some areas: no support for AI-based upscaling (DLSS is far better than FidelityFX), poor raytracing support and performance, and poor compute performance in general.

      If all you do is play standard video games, an AMD card is for you. But there are many tasks out there (including playing raytraced games) where NVIDIA is far more compelling and worth the extra money for what you get.

      • poor compute performance in general.

        Poor compute support (for lack of CUDA), sure. Poor compute performance, not really. There is a reason the Vega-series cards and 6800s were highly sought after by miners and others who heavily used OpenCL or Vulkan: the performance per watt and performance per dollar were worth it to them.

  • Do AMD GPUs work better with AMD CPUs?

    I expect AMD would deny it, for reasons, but in reality, do they?
    • Do AMD GPUs work better with AMD CPUs?

      I expect AMD would deny it, for reasons, but in reality, do they?

      Gamers Nexus mentioned something about the driver using special features in AMD CPUs, to be introduced further down the road.

      • Gamers nexus mentioned something about the driver using special features in AMD CPUs to be introduced further down the road.

        All software does this if it's relevant. Games do it. Application software does it. The OS does it. This is already SOP across the industry. There's no reason to imagine it will make that combination any better than other combinations, because Nvidia will do it too; they will want their hardware to perform when paired with an AMD CPU.

        I've been running AMD+nvidia for ages and it's turned out to be the right choice now that I'm actually using CUDA.
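        For a sense of what "using special CPU features" usually amounts to, a toy runtime feature-detection sketch (Linux-only, illustrative of the general idea rather than anything a GPU driver actually ships):

            def cpu_flags() -> set:
                # Read the feature flags the kernel exposes for the first CPU.
                with open("/proc/cpuinfo") as f:
                    for line in f:
                        if line.startswith("flags"):
                            return set(line.split(":", 1)[1].split())
                return set()

            flags = cpu_flags()
            print("AVX2:", "avx2" in flags, "| SSE4.2:", "sse4_2" in flags)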

    • When I was researching parts for a new PC build I wondered the same thing. I couldn't find an official answer, but I went with an AMD CPU and GPU because, well, superstition in all honesty. It just "felt" better. (And the Radeon was a better value for my use case.)

      That said, there are scores and scores of people using AMD CPUs with NVIDIA GPUs with no issues at all, it seems.

      • That said there are scores and scores of people using AMD CPUs with NVIDIA GPUs with no issues at all it seems

        I've been doing this since I had a TNT (and then a TNT2) on a K6/2. Back then, the AMD CPU was the problem, but not specifically with the TNT drivers. It was just a problem :) And IME the problems with AMD video cards (or more to the point, their drivers) on any CPU were way, way bigger than any problems with nvidia on AMD. Now I only have problems with AMD drivers on Windows; when I got my HP laptop it came with Win10, and out of the box I did nothing but run Windows Update and suddenly the video driver s

        • by DarkOx ( 621550 )

          I really doubt the CPU was the issue. Really, the K6/K6-2/K6-3 were good chips. They could be very stable and were good performers clock-for-clock against their fellow x86 competitors, and in terms of performance per $$ as well. The ALi and VIA chipsets in so many systems of that era had major problems and really ended up giving AMD systems a bad rap.

          They just were not stable when paired with 400MHz+ CPU parts.

          • I really doubt the CPU was the issue. Really k6/k6-2/k6-3 were good chips.

            o/~ some of these things are not like the others o/~

            You needed patches for games and even other software to make them work correctly; some of those were CPU ID patches and some of them weren't. The K6/2 was not quite compatible with the Pentium MMX. By the K6/3 they had it all worked out. The K6/2 was a great CPU, but its performance when trying to pretend to be an Intel chip was frankly shit. If you compiled software just for it, then you got very good performance, but that was rarely done in the industry a

    • by zekica ( 1953180 )
      Their marketing would lead you to think so, and in laptops it indeed does. But in reality, all the technologies they use, except for AMD SmartShift and possibly AMD SmartAccess Video, are industry standards.
  • AMD always produces a GPU which is almost, but not quite, entirely unlike a good GPU.
    • by AmiMoJo ( 196126 )

      I am interested in what the AV1 encoding performance is like. Their H.264 encoding is notoriously poor, but their H.265 encoding is on a par with Nvidia's. None are as good as software encoding, but the performance gap is massive.
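      For anyone wanting to compare for themselves, a rough sketch of that kind of test (assumes an ffmpeg build with AMD's AMF hardware encoders; file names are placeholders):

          import subprocess

          src = "input.mp4"  # hypothetical source clip
          # Hardware HEVC encode on the GPU vs. software x265 on the CPU.
          subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "hevc_amf", "hw_out.mp4"], check=True)
          subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx265", "-crf", "23", "sw_out.mp4"], check=True)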

    • The 6900 XT was and still is a great card. Too bad it was so hard to get one for so long.

  • The only thing that has kept me on Team Green for so long is the selection of games that utilize the tech. Gaming is my hobby, and I like to squeeze every frame and every detail I can out of my machines. So when Nvidia basically started bribing game devs to use their proprietary shit, I went where the cool graphics were. But it looks like some devs have pledged to support AMD tech going forward so that is cool. Only a couple of games in my library support FSR for example.
    • The PS5 and Xbox Series X use AMD 6xxx-series GPU hardware, so any game that wants to run on current consoles will be optimized for AMD graphics cards.

      • No. They will be optimised for their respective consoles. There's very little transferable benefit between the highly specialised hardware in a console (even if it is based on hardware similar to desktop GPUs) and a desktop PC.

        • The GPU part of the AMD SoCs in current-gen consoles is based on the RDNA 2 [xbox.com] series of GPUs AMD makes.

          The memory model is a unified one, but to say that it isn't using the same architecture is flatly wrong.

  • How much longer do we have to wait for the release of mid-tier RDNA3-based APUs, such as a "7600G"?
