Graphics Hardware

Radeon RX 6800 and 6800 XT Performance Marks AMD's Return To High-End Graphics (hothardware.com) 62

MojoKid writes: AMD officially launched its Radeon RX 6800 and Radeon RX 6800 XT graphics cards today, previously known in the PC gaming community as Big Navi. The company claimed these high-end GPUs would compete with NVIDIA's best GeForce RTX 30 series, and it appears AMD made good on its claims. AMD's new Radeon RX 6800 XT and Radeon RX 6800 are based on the company's RDNA 2 GPU architecture, with the former sporting 72 Compute Units (CUs) and a 2250MHz boost clock, while the RX 6800 sports 60 CUs at a 2105MHz boost clock. Both cards come equipped with 16GB of GDDR6 memory and 128MB of on-die cache AMD calls Infinity Cache, which improves effective bandwidth and latency by sitting in front of the card's 256-bit GDDR6 memory interface, feeding the GPU.
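
For a rough sense of what those specs imply, here is a back-of-the-envelope sketch of peak FP32 throughput and raw memory bandwidth. It assumes 64 FP32 lanes per RDNA 2 CU, two FLOPs per fused multiply-add, and 16Gbps GDDR6 on the 256-bit bus; the memory data rate is not stated in the summary, so treat these as ballpark figures rather than vendor numbers.

# Back-of-the-envelope peak throughput for the specs quoted above.
# Assumptions not in the summary: 64 FP32 lanes per CU, 2 FLOPs per FMA,
# and 16Gbps GDDR6 on the 256-bit bus.

def peak_tflops(cus, boost_ghz, lanes_per_cu=64):
    """Peak FP32 TFLOPS = CUs * lanes * 2 (FMA) * boost clock (GHz) / 1000."""
    return cus * lanes_per_cu * 2 * boost_ghz / 1000.0

def gddr6_bandwidth_gbs(bus_bits, gbps_per_pin=16.0):
    """Raw bandwidth in GB/s = bus width in bytes * data rate per pin."""
    return bus_bits / 8 * gbps_per_pin

for name, cus, boost in [("RX 6800 XT", 72, 2.250), ("RX 6800", 60, 2.105)]:
    print(f"{name}: ~{peak_tflops(cus, boost):.1f} TFLOPS FP32, "
          f"~{gddr6_bandwidth_gbs(256):.0f} GB/s raw GDDR6 bandwidth")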

In the benchmarks, it is fair to say the Radeon RX 6800 is typically faster than an NVIDIA GeForce RTX 3070, just as AMD suggested. Things are not as cut and dried for the Radeon RX 6800 XT, though, as the GeForce RTX 3080 and Radeon RX 6800 XT trade victories depending on the game title or workload, but the RTX 3080 has an edge overall. In DXR ray tracing performance, NVIDIA has a distinct advantage at the high end. Though the Radeon RX 6800 wasn't too far behind an RTX 3070, neither the Radeon RX 6800 XT nor the 6800 came close to the GeForce RTX 3080. Pricing is set at $649 and $579 for the AMD Radeon RX 6800 XT and Radeon RX 6800, respectively, and the cards are on sale as of today. However, demand is likely to be fierce, as this new crop of high-end graphics cards from both companies has been quickly evaporating from retail shelves.

  • Caveats (Score:1, Interesting)

    by Anonymous Coward

    All good except:

    * RTRT performance is severely lacking
    * No alternative to NVIDIA's tensor cores (powerful AI features) and DLSS (which makes 4K + RTRT at 60fps doable)

    Kinda sucks buying a GPU for the future that doesn't properly support today's tech.

    • Yeah. The card (the XT at least) is great in rasterization performance, easily beating nvidia in terms of price and power / performance. By a bit.

      But I'm planning on building a full new PC (my first since the Core2 Quad days lol) as soon as the new GPUs and CPUs are available without a waiting list and at reasonable prices, and I don't really see the case for AMD here. Nobody cares about 20 watts on the desktop and you're only saving like $50 but giving up a lot of RT performance and DLSS. For some reason a

      • by Luckyo ( 1726890 )

        The problem with DLSS is that it's self-defeating at the moment. By the time the AI neural net has gathered enough data to have less than a huge impact on image quality, the game is old.

        That's why the feature is overwhelmingly tested for quality with games that launched around the RTX 2000 series release, where it can finally be said that the quality impact is on par with or better than most other popular forms of supersampling, with performance in the same ballpark. If you test it in games that just came out, like the War Thu

        • The problem with DLSS is that it's self-defeating at the moment. By the time the AI neural net has gathered enough data to have less than a huge impact on image quality, the game is old.

          No, that's just not correct [pcgamer.com]; it simply doesn't take that long to generate data for DLSS to be effective. Call of Duty Black Ops was released only days ago and gets massive performance boosts with DLSS.

          • by Luckyo ( 1726890 )

            All games get a massive boost in framerate with DLSS, because DLSS renders the game at a much lower resolution.

            Image quality of DLSS is absolute garbage early on, because just like the name suggests, it's a deep LEARINING system. It needs time to learn. And quoting "DLSS image quality is better than original at the same resolution, herpderp" PCGamer as authority on it is hilarious.
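
On the render-resolution point above: the framerate gain comes largely from shading far fewer pixels before the upscale. A minimal pixel-count sketch, assuming commonly cited scale factors of roughly 0.67x per axis for a "Quality" mode and 0.5x for a "Performance" mode (those factors are an assumption here, not from the thread):

# Why rendering below target resolution and then upscaling (DLSS-style)
# boosts framerate: far fewer pixels are shaded per frame.
# Per-axis scale factors below are assumptions, not figures from the thread.

TARGET = (3840, 2160)  # 4K output

def shaded_pixels(scale, target=TARGET):
    w, h = target
    return round(w * scale) * round(h * scale)

native = shaded_pixels(1.0)
for mode, scale in [("Quality (~0.67x)", 2 / 3), ("Performance (0.5x)", 0.5)]:
    print(f"{mode}: shades {shaded_pixels(scale) / native:.0%} of the native "
          f"4K pixel count before upscaling back to {TARGET[0]}x{TARGET[1]}")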

              • I think you mean “LEARNING”, but yes, that is how it works. I think you’re confused about how long it takes, though.

              Anyhow here you can see it in action on the very recently released Call of Duty Black Ops:
              https://www.youtube.com/watch?v=L0N91Q_Wc2M [slashdot.org]

              Can you post some links to the ‘absolute garbage’ results you’re seeing?

              • by Luckyo ( 1726890 )

                Are you PCGamer's professional twitter user that wrote the article? Nevermind the fact that you mangled the link, do you understand that whatever you put on youtube is re-encoded for severe bitrate constraints making it utterly useless for graphical fidelity comparison?

                You should probably talk to the other guy who is claiming that DLSS makes things look better than normally rendering, also using a PCGamer article as a source. I suspect you have a lot in common.

                • Are you PCGamer's professional twitter user that wrote the article?

                  No. Are you?

                  Nevermind the fact that you mangled the link

                  Right, the text was there, it's not that complicated.

                  do you understand that whatever you put on youtube is re-encoded for severe bitrate constraints making it utterly useless for graphical fidelity comparison?

                  Yes, but the difference is so significant that even with compression it's still visible. If you have difficulty discerning that, then good for you; you don't need features like this.

                  You should probably talk to the other guy who is claiming that DLSS makes things look better than normally rendering

                  Why? That would be a silly claim that has no basis in reality. Nobody (who knows what they're talking about) is saying it's better than native 4K, just that it's better than 1440p.

                  Can you post some screenshots demonstrating the 'absolute garbage' res

                  • by Luckyo ( 1726890 )

                    War Thunder's last update makes for a good example. They had a major overhaul of their engine and added DLSS. This is a game where telling details apart is not just important, but critical to your success. Aiming at and hitting the point where ammunition is stored is effectively an instakill. Hitting a section just a little off is a bounce or overpenetration that does basically nothing. This is not a game where you can hand-wave away washed-out textures, bad resolution on far-away surfaces and so on as "can't see it,

                    • This is a game that benefits massively from higher resolution playing as you have more aiming points and aiming references, but downscaling it is better for accuracy than DLSS.

                      Right. Nobody is saying DLSS is the only way to play. If you can get an appropriate framerate at native 4K then that is obviously better; in the general case 1440p supersampled to 4K is going to be better than native 1440p, and there will be niches where that's not the case either.

                      I have the XT version and an A6000 (which is effectively a 3090 with more RAM and lower memory bandwidth). Experiments with the default DLSS 2.0 model on arch-viz datasets have been really impressive, and even using the nv card to t

                    • by Luckyo ( 1726890 )

                      >If you can get an appropriate framerate at native 4K then that is obviously better; in the general case 1440p supersampled to 4K is going to be better than native 1440p, and there will be niches where that's not the case either.

                      And the point that you're dancing around is that DLSS is that "edge case" until the ML algorithm has had ample time to learn how to upscale the game.

                      >If you can't tell the difference between raytraced reflections and screenspace reflections, or between shadow maps and raytraced shadows

                    • And the point that you're dancing around is that DLSS is that "edge case" until the ML algorithm has had ample time to learn how to upscale the game.

                      Not with DLSS 2.0, no, but even then, who cares? Even if it were an edge case why would that matter to anybody?

                      First of all, this is quite a hypocritical statement to make as someone who doesn't care about massive quality loss of meaningful detail to the point of having extremely detrimental effect on competitive play where accurate scene representation is relevant.

                      Who's the "someone" who doesn't care about this "massive loss of meaningful detail"? I certainly never said or implied that. Also do you have screenshots showing what you're talking about?

                      "That loss of quality doesn't matter, but look at those slightly different looking reflections and shadows that are sorta kinda more realistic*!"

                      Who are you quoting?

                      *reflections and shadows may not be more realistic, depending on the specific game, quality level, driver version and general implementation. But they will incur a massive performance impact. And they will not make your shots any more accurate.

                      Again, nobody claimed reflections and shadows will be magically more realistic, but light transport is certainly much better simulated by raytracing than by rasterization, which is why we use it in

                    • by Luckyo ( 1726890 )

                      >Not with DLSS 2.0, no, but even then, who cares? Even if it were an edge case why would that matter to anybody?

                      Yes with DLSS 2.0, because it's still a deep learning system, which by definition needs to be trained extensively to produce non-awful results. You're also apparently missing the extremely obvious sarcasm about "edge case", as that "edge case" is far more relevant than RTX, as I point out, because the edge case is where it actually helps more than it hurts.

                      But your line of straight man questioning sug

                    • Yes with DLSS 2.0, because it's still a deep learning system, which by definition needs to be trained extensively to produce non-awful results.

                      I think you're missing the difference between DLSS versions: the first version needed to be trained on a per-game basis; the more recent 2.0 version does not. But if it's so bad then just don't use it.

                      I do see the difference. And I find that difference is indeed "difference" rather than "improvement", exactly as I note above.

                      Then turn it off and don't use it. Again why are you complaining about the existence of a feature you don't want?

                      Did you notice something absolutely critical missing in this statement? Hint: it's the complete lack of reason given. I may as well state that we need to move to full physics simulation for everything. For the same reason. And it would be just as wrong on all the same merits.

                      The pursuit of visual realism, again if you're not interested then don't use it, don't invest in games that pursue it.

                      And now, let me answer the very question you fail to address. Why we want ray tracing?

                      Because once we can actually do it properly, it will massively cut down on development time. Developers will no longer need to spend a large amount of resources in basically hand crafting lighting in each 3D environment. They can simply set the light sources and let the GPU do the lighting in real time.

                      No, that's how we do it today; it's just that the GI gets baked in via an offline proc
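
To make the baked-versus-real-time distinction above concrete, here is a toy sketch: the same direct-lighting term can be evaluated once offline and stored in a lightmap, or evaluated every frame as a ray-traced renderer would (with visibility rays added). The scene, names and numbers are purely illustrative.

# Toy contrast between baked lighting (computed once offline, looked up at run
# time) and lighting evaluated per frame as a ray traced renderer would do.
# Scene, names and numbers are illustrative only; shadow rays are omitted.
import math

def direct_light(p, n, light_pos, intensity):
    """Lambertian diffuse from a point light: intensity * max(0, N.L) / r^2."""
    d = [light_pos[i] - p[i] for i in range(3)]
    r = math.sqrt(sum(c * c for c in d))
    n_dot_l = max(0.0, sum(n[i] * d[i] / r for i in range(3)))
    return intensity * n_dot_l / (r * r)

surface_point, surface_normal = (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)

# "Baking": run the computation once, store the result; if the light later
# moves, the stored value is stale until the level is re-baked offline.
baked = direct_light(surface_point, surface_normal, (0.0, 2.0, 0.0), 10.0)

# Per-frame evaluation tracks the light wherever it is on this frame.
live = direct_light(surface_point, surface_normal, (1.0, 2.0, 0.0), 10.0)

print(f"baked (stale after the light moves): {baked:.2f}")
print(f"evaluated this frame:                {live:.2f}")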

                    • by Luckyo ( 1726890 )

                      >The pursuit of visual realism, again if you're not interested then don't use it, don't invest in games that pursue it.

                      I notice an utter absence of demands for realistic physics, in spite of that being far more meaningful than implementing mostly-fake ray tracing. And the reason not to implement it is pretty much the same as for not implementing ray tracing: performance impact.

                      >i don't like it and i don't want other people to like it and i don't want other people to spend time on it.

                      You're projecting your bi

                    • And they don't use them, as evidenced in utter lack of support for ray tracing in modern games.

                      So what are you complaining about? You're so confused that you can't even figure out what you're angry about. First it's that games have raytracing, now you're whining that they don't have support for it.

                      Problem is that marketers for GPU makers and fanboys that sit in the two percentile do the "mememe" act that you're doing now, demanding that they be catered to at expense of overwhelming majority that already voted with their wallets against games like Control.

                      So your concern is for nvidia's bottom line since they're the one who paid for that feature?

                      That is your argument reversed, "I like it and I want massive amount of development resources be burned on the altar of serving mememe".

                      Nope, I'm not telling developers what to do. I am a developer and I'll do what I like and not be beholden to an entitled little shit like you.

                      to the point where shadows and reflections are often more unrealistic rather than less.

                      SHOW SCREENSHOTS

      • My guess is that this is just like with Ryzen. The next one will be better, but they obviously wanted to get something out there now: rather than waiting, releasing a product that is already fine for many people at least gets you some percentage of the sales. (Which also helps finance developing the next one.)

        Remember when their mobile CPU lineup wasn't that great, even after desktop Ryzen had already become great? Now those are good too. So I'll wait and see: 2 or 3 generations in, with all the bells and whistle

      • by Agripa ( 139780 )

        But I'm planning on building a full new PC (my first since the Core2 Quad days lol) as soon as the new GPUs and CPUs are available without a waiting list and at reasonable prices, and I don't really see the case for AMD here.

        Waiting lists for nVidia cards are a major problem right now. AMD may win simply because of better availability.

    • for the long haul. Not that $600-$700 is cheap, but in my more free-spending pre-kid days I could easily see dropping that, gaming on the card for 2-3 years and then upgrading.

      It's not like when you drop $1200+ and expect to be kicking ass and taking names for 5-7 years.
    • Re: (Score:3, Informative)

      by gravewax ( 4772409 )
      None of the GPUs from either vendor are particularly good at RT; Nvidia is just less sucky than AMD.
    • by Luckyo ( 1726890 )

      Ray tracing, for all the marketing fluff, is basically absent in gaming today. This is easily seen when people try to benchmark it: most games that use it date from around the nvidia 2000 series release.

      And even in those games, the performance hit is massive, but the effect on output quality is barely visible. That's why nvidia has to cheat with DLSS, which drops the game's render resolution and then upscales the output, resulting in significant image quality degradation. So you're getting a questionable minor uplift

    • yeah, this gen of GPUs is actually pretty crap.

      * AMD GPUs offer 16GB of VRAM, but crap RT performance.
      * the nvidia 3080 offers good RT and raster performance, but only has 10GB, which isn't enough long-term.
      * the nvidia 3090 has 24GB, but its price is just stupid.

      plus, we don't even have true next-gen games on PC for benchmarks yet.

      and the shit cherry on top of the shit sundae is that TVs and receivers are experiencing HDMI 2.1 issues.

      shame, but the wise thing to do is either wait for the 3080t
  • by thegarbz ( 1787294 ) on Wednesday November 18, 2020 @06:02PM (#60740588)

    It's good to see AMD coming back and kicking arse in not just one, but two fields.

    • Yep. Building a new PC this Black Friday/Cyber Monday. Going all AMD due to recent price-performance stats.
      • by fazig ( 2909523 )
        I hear that often. Do you really expect to get good deals on these brand-new, super-low-availability parts that will make up the most expensive components in the computer?
        What kind of insane retailer would do such a thing when demand is so high that people are willing to pay a lot more?
      • Good luck finding parts. Seriously. Most of the Zen3 CPUs are long gone, and the Radeon RX 6800s were sold out in seconds on Wednesday.

        I bought the motherboard and RAM I want in my final build, and then bought a cheapo used Ryzen off eBay for $90, which will get swapped out with a Ryzen 9 5900X when they actually are available. And I'll be bringing over my GTX 1070 Ti until I can lay hands on a current generation GPU.

        It's like the semiconductor industry forgot how to actually build product this year, and

    • by fazig ( 2909523 )
      I think they could have done even better if they cut back on the VRAM to keep the prices down a bit more.

      Benchmarks like the ones from Godfall show that developer statements about a 12GB VRAM requirement for 4K and Epic graphics settings are not accurate. The RTX 3080 with its 10GB of GDDR6X ties with the RX 6800 XT and its 16GB of GDDR6 in that title at those settings, according to what Hardware Unboxed [youtube.com] found.
      Don't get me wrong, that still makes the RX 6800 XT the better card here, because it is less expensive
      • Not everything is about gaming, you know. On my 2070 Super I can't use DAIN to process 1080p video because of its 8GB VRAM limitation. A 16GB card would be most welcome for far more than just throwing pixels at a screen.

        There's also the idea that, in conjunction with future technologies like DirectStorage, you may need an extra buffer in VRAM for background transfers of new data before the current data is released, with the goal of eliminating loading screens. This could put upwards pressure on VRAM r
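
A crude sketch of the streaming idea just described: while one asset set is resident and being drawn, its replacement is loaded into a second VRAM region in the background, so for a while both have to fit. All the sizes below are invented for illustration.

# Crude model of why background asset streaming (DirectStorage-style) puts
# upward pressure on VRAM: during a transition, the currently drawn asset set
# and the incoming one must both be resident. All sizes are made up.

resident_gb = 7.5   # assets the current scene is drawing from
incoming_gb = 2.0   # next area's assets being streamed in the background
other_gb = 1.0      # framebuffers, render targets, etc.

steady_gb = resident_gb + other_gb
peak_gb = resident_gb + incoming_gb + other_gb
print(f"steady state: ~{steady_gb:.1f} GB, during streaming: ~{peak_gb:.1f} GB")
# With these invented numbers an 8 GB card would spill into system RAM during
# the transition, while a 16 GB card has plenty of headroom.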

        • by fazig ( 2909523 )
          I'm well aware that not everything is gaming.

          Last year I started with 3D art, and texturing large texture sets using UDIMs requires tons of VRAM for the virtual textures if the software is to run smoothly, because once it starts moving data into regular RAM things get annoyingly slow. Even 16GB of VRAM isn't enough there.
          I have a Quadro P6000 for that, which I got for a good price. Its performance is hideous compared to much cheaper modern cards, but the 24GB of GDDR5X still makes it worth
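
To put rough numbers on why UDIM texturing eats VRAM: a single uncompressed 4K map is already sizeable, and a UDIM workflow multiplies that by tiles and by channel maps. The tile and map counts below are invented for illustration.

# Rough VRAM estimate for a UDIM texturing session: 4K tiles, RGBA, 8 bits per
# channel, uncompressed. Tile and map counts are invented for illustration.

bytes_per_tile = 4096 * 4096 * 4 * 1   # one 4K RGBA 8-bit map = 64 MiB
udim_tiles = 30                        # e.g. a character plus props
maps_per_tile = 5                      # basecolor, normal, roughness, metal, AO

total_gib = bytes_per_tile * udim_tiles * maps_per_tile / 2**30
print(f"~{total_gib:.1f} GiB of texture data before mip maps or padding")
# At 16-bit or 32-bit channel depth (common while painting) this doubles or
# quadruples, which is how even a 16 GB or 24 GB card ends up spilling to RAM.
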
    • by AmiMoJo ( 196126 )

      AMD still has some catching up to do but at least it looks like they are back in solid competition. Gaming performance is competitive now, and the fact that you can actually buy one really helps (latest Nvidia cards are very hard to get). Also Linux support is far better and Nvidia are generally just dicks.

      Areas where AMD is lagging a bit:

      - Raytracing performance.

      - DLSS, basically upscaling with some AI to improve image quality. It really boosts frame rates with raytracing enabled and visual quality is 80%

      • Yes I agree wholeheartedly. I make quite a lot of use of CUDA applications and AMD is really targeting the RAW gamer here. For anything AI related for example NVIDIA is still the clear winner.

      • You actually cannot buy one - they immediately went from "coming soon" to "out of stock" on AMD's website, and every retail partner either has every single card out of stock, or "coming soon". B&H Photo gave up and just put up a notice saying that they don't have any, so stop nuking their servers with refreshes.

        These GPU launches are a total shit show.

  • I was able to get one of those 3080s on release, and it's been real nice, even on a slightly older CPU (4790K). Most games other than MS Flight Sim and emulators don't care about the older CPU; only like a 2 FPS loss max.

    But from most benchmarks, the top-of-the-line AMD cards now perform basically on par with that, which is awesome!

    Go for it, if you can get one, and you're into PC gaming and have a 4k monitor. The whole reason I jumped on the 3080 was that when I got my 4k monitor, I promised myself to jump on the

    • For me, DLSS is a big deal. Frequently, you will get a more detailed rendering than native, with less strain on the GPU. It's like magic, and AMD graphics will be a hard sell for me as long as they don't support something similar.

      • by Luckyo ( 1726890 )

        >Frequently, you will get a more detailed rendering than native

        Even nvidia doesn't make marketing statements this ridiculous. But if you have a fanboy lens on, the smudged out textures and weird texture shifts close in may look like an improvement. YMMV.

        • Quote from:
          https://www.pcgamer.com/nvidia... [pcgamer.com]

          What's truly surprising is that, true to Nvidia's word in some ways, the DLSS 2.0 scene is often a little more clearly defined than its native counterpart. I know, who saw that coming?

          I've played Control entirely with DLSS enabled and seen this effect with mine own eyes. Not always, not on every texture, but more often than you'd think.

          • by Luckyo ( 1726890 )

            Yeah, that's PCGamer, not nvidia. Which shows its professional twitter user's "deep technical expertise/postmodern interpretation of what he heard an engineer say while he was high".

            In reality, nvidia is fairly clear that after they run the AI through its learning paces for a few months, it will get to about 80% or so of the original image quality. Which is why DLSS and RTX are mostly talked about in Control, the best "game that's actually only a benchmark that no one plays", the Ashes of the Singularity of nvidia in

    • One thing to remember: AMD got all the console design wins for this current generation. It stands to reason that the optimized code for those consoles will be optimized for the AMD GPU in your desktop as well.

      • One thing to remember: AMD got all the console design wins for this current generation. It stands to reason that the optimized code for those consoles will be optimized for the AMD GPU in your desktop as well.

        Unlikely; it's a different system architecture. While the underlying GPUs are both RDNA2, the console versions use a shared memory architecture, so unlike on the PC there's no PCIe bus to copy data across from system memory to GPU memory. Different system architecture, different clock speeds, different numbers of cores, different amounts of graphics memory and, at least in the case of the PS5, a different underlying graphics API.

        Games on the consoles will be optimized for a high speed SSD connected to u
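
A quick back-of-the-envelope on the PCIe point above: on a PC, asset data has to cross the bus into dedicated VRAM, which costs time a shared-memory console simply does not spend. The bandwidth figure below is a ballpark assumption, not a measurement.

# Ballpark transfer-time sketch: copying assets from system RAM to dedicated
# VRAM over PCIe, a step the shared-memory consoles skip entirely.
# Effective bandwidth is an assumption (~22 GB/s usable on PCIe 4.0 x16).

assets_gb = 8.0            # working set to move into VRAM
pcie_effective_gbs = 22.0  # assumed achievable throughput, below the 32 GB/s peak

copy_seconds = assets_gb / pcie_effective_gbs
print(f"~{copy_seconds:.2f} s just to move {assets_gb:.0f} GB across PCIe 4.0 x16")
# On a unified-memory console the GPU reads the same physical memory, so this
# particular copy is free (other costs, such as decompression, remain).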

  • I have used AMD CPUs for many years now - close to 20, probably. I have also used AMD graphics cards whenever possible. My current machine is a second-generation Ryzen 7. Unfortunately, for ML applications AMD is a poor option if you want things to just work, so I have had to go with NVIDIA GPUs for several years now. I wish AMD took ML seriously and made an effort to support TensorFlow and other ML frameworks out of the box. There are some options, but they involve a lot of effort to get them to work.
  • by fleeped ( 1945926 ) on Wednesday November 18, 2020 @06:51PM (#60740788)

    Or still pretending that it doesn't exist? ... As much as I love AMD these days (both my machines have Ryzen, plus one of my two GPUs), this is a sore point.
    But I'm truly, truly glad that AMD is stepping up on their GPU game as well, kudos! The NVIDIA monopoly is not healthy.

  • by Pimpy ( 143938 ) on Wednesday November 18, 2020 @07:02PM (#60740812)

    For those of us that aren't interested in gaming, it would be nice if AMD also made some of their newer high-powered GPUs more directly accessible for ML applications. Some more competition in this space would be great, but whereas NVIDIA treats ML as a first-class application, AMD has been pretty quiet on this front so far.

  • by Tough Love ( 215404 ) on Wednesday November 18, 2020 @07:47PM (#60740954)

    Also marks my return to a GPU upgrade. Been limping along with a 580 for quite some time, waiting for AMD to hit the sweet spot. Which this appears to be.

  • Availability of new stuff from AMD and Nvidia is pretty bad at the moment.

"Pull the trigger and you're garbage." -- Lady Blue

Working...