5K Gaming Is Too Hard, Even for an RTX 5090D (pcmag.com)

Asus has been showcasing its new 5K 27-inch ROG Strix 27 Pro gaming monitor running at 5,120 x 2,880 resolution and up to 180Hz, but even Nvidia's flagship RTX 5090 struggles to deliver smooth frame rates at this demanding pixel count. In testing conducted by Asus, the RTX 5090D -- a Chinese-exclusive variant with weaker AI performance -- achieved just 51 frames per second in a Cyberpunk 2077 benchmark at ultra ray traced settings. The test system ran an AMD Ryzen 9950X3D processor, had DLSS set to balanced, and kept frame generation disabled. The same configuration running at 4K managed 77 fps, around 50% higher.

The underlying math is simple: 5K resolution requires rendering 78% more pixels than 4K. That 218 PPI pixel density delivers impressive sharpness up close, but Asus chose an IPS panel over OLED technology to reach it, trading away deeper black levels and faster response times. Asus appears to be positioning the monitor as a dual-mode display -- 5K for productivity and video, 1440p at up to 330Hz for gaming. Early Chinese listings have it priced at the equivalent of $800, roughly what you'd pay for a larger 4K OLED panel.
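The summary's percentages can be sanity-checked directly from the pixel counts. A quick sketch (not from the article; the fps extrapolation is a naive assumption that performance scales linearly with pixel count):

```python
# Pixel-count math behind the summary's numbers.

def pixels(width: int, height: int) -> int:
    """Total pixels per frame at a given resolution."""
    return width * height

five_k = pixels(5120, 2880)   # 14,745,600 pixels
four_k = pixels(3840, 2160)   #  8,294,400 pixels

extra = five_k / four_k - 1   # fractional increase going 4K -> 5K
print(f"5K renders {extra:.0%} more pixels than 4K")  # -> 78%

# If frame rate scaled perfectly (inversely) with pixel count, 77 fps
# at 4K would drop to roughly 43 fps at 5K.
print(f"naive scaling estimate: {77 / (five_k / four_k):.0f} fps")  # -> 43
```

The measured 51 fps at 5K beats the naive 43 fps estimate, consistent with the benchmark not being purely pixel-bound.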
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Tuesday December 23, 2025 @01:39PM (#65877711)

    ... running at 180Hz?! Aside perhaps from some high-performance VR setup that probably costs $50k, if it's even available to regular people.

    4k at 60hz is luxurious. At a regular living room distance humans can't even make out single pixels with 4k.

    Honestly, at this point I'd be waaaay more interested in edging up color bandwidth and brightness contrast than increasing resolution by yet another iteration. There is still room there, and while my cheap-ass 27" 1080p business display is perfectly fine for me, I do like the colors and contrast on my Samsung tablet with its AMOLED display. They should work on making that larger and cheaper.

    • Seriously. LCDs and modern displays have square pixels, whereas on old CRTs the pixels ended up being kind of rectangular (I might have gotten that vice versa; it's sometimes hard to keep track of which is which).

      If you have ever played an old DOS game on a modern display and it felt a little off, that's why. It's especially noticeable with RTSes, but you will probably also notice it with other games that make heavy use of the mouse.

      Going up to 4K helps because the pixels become smaller and the differences
    • I really don't get who this is for. I've been a gamer my entire life; since the mid 90s I've been an avid PC gamer. My current gaming system has a 4090 in it, and I've loved it and never seen any issues with playing any game I desire. I play on a 1440p/175Hz OLED, and everything I play is set to 175fps. That is more than enough for any game I want to play. I think a lot of the push for 4K, 5K and higher was due to poor LCD technology. Back in the day LCDs really sucked, especially when you were sitting
      • 1440p is the sweet spot. Going higher in resolution you really get diminishing returns in picture quality, the difference is only noticeable in some difficult scenes. If you can even see a difference, my eyes aren't getting better.

        The only reason I dislike my ultrawide monitor is the extra pixels it has to push, reducing the frame rate. I wish I could set the outer 15-20% to render at a lower resolution.

    • by gweihir ( 88907 ) on Tuesday December 23, 2025 @02:20PM (#65877797)

      Indeed. Unless you have inhumane reflexes (hint: you do not have them), 180Hz is complete nonsense. The receptors in your eyes are slow-ass chemical ones that can detect, under optimal conditions, 13ms pulses. That is around 75Hz. Anything faster and you will not see it.

      And 5k? You need to sit very close or use a magnifier to even see that.

      In short: Anybody with this set-up has fallen for the ads and is wasting money, plain and simple. And many will be kidding themselves into thinking it gives them an advantage. This can always be observed with tech optimized to a useless level. The same happens when "audiophiles" buy completely useless "audio network cables" for $1000 per piece.
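The 13ms-pulse figure above maps to a refresh rate like this. A sketch of the commenter's arithmetic only; it makes no claim about the vision-science literature:

```python
def pulse_to_hz(pulse_ms: float) -> float:
    """Refresh rate whose per-frame time equals the given pulse duration."""
    return 1000.0 / pulse_ms

# 1000 / 13 ≈ 76.9 Hz, i.e. the "around 75Hz" in the comment.
print(f"{pulse_to_hz(13):.0f} Hz")  # -> 77 Hz
```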

      • by Pascoea ( 968200 ) on Tuesday December 23, 2025 @02:47PM (#65877847)

        hint: you do not have them

        There are people who will swear up, down, sideways, and backwards that they can tell the difference. The same people that will pay for the $3,000 network cable for their HiFi system.

        • by gweihir ( 88907 )

          Yes. And they are right. You can detect the difference via inference. It is even easy: Just drag your mouse in a circle. But you cannot use that speed difference. Not possible.

          On the $3000 network cable, that possibility to detect shrinks down to a look at the invoice though.

      • No they haven't. If the refresh rate is super high, you don't have to enable vsync to prevent tearing. It's not about people being able to see 180 fps. Just because you don't have a need for something doesn't make it worthless.

        • Wtf are you talking about? Tearing?? What kind of cheap shit equipment from the 90s are you using?

        • by gweihir ( 88907 )

          That is complete insightless nonsensical crap. If you do not have vsync, you will get tearing, _regardless_ of refresh rate.

            • The screen is refreshed more often the higher the refresh rate, so there's less opportunity for tearing. Are you a gamer? If not, then you probably don't need a high refresh rate monitor. Higher refresh rates open up more frame rate caps for smoother play. My monitor is 240Hz; I generally lock the frame rate on games at 80 fps. 40 fps is a nice compromise frame rate for consoles that is not available on a 60Hz display; you need at least 120Hz. Just because you don't have a use case doesn't mean there isn't one. I
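The frame-rate-cap point above comes down to integer divisors: a cap gives even frame pacing only when it divides the refresh rate, so every game frame is held for a whole number of display refreshes. A minimal sketch:

```python
def even_caps(refresh_hz: int, min_fps: int = 30) -> list[int]:
    """Frame-rate caps that divide the refresh rate evenly, so each
    game frame is displayed for a whole number of refresh cycles."""
    return [refresh_hz // d for d in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % d == 0]

print(even_caps(60))   # -> [60, 30]: no even 40 fps cap on a 60Hz panel
print(even_caps(120))  # -> [120, 60, 40, 30]: 40 fps becomes available
print(even_caps(240))  # -> [240, 120, 80, 60, 48, 40, 30]
```

This matches the comment: an 80 fps lock paces evenly on a 240Hz panel (3 refreshes per frame), and 40 fps needs at least 120Hz.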

            • by gweihir ( 88907 )

              I am a gamer. And I understand Physics and electronics. Tearing is not a speed issue, it is a SYNC issue. You are just cluelessly blubbering here.

      • What about movement? With 180Hz you'll see a series of close-together images that look like movement, rather than that jumpy thing you get with 50Hz. I don't know... like when you move the mouse and it stops moving and starts jumping? Any idea?
      • by AmiMoJo ( 196126 )

        There have been a few tests showing that, at least up to 240Hz, ordinary people's reaction times decrease as frame rate increases. It's not huge, but it is a definite advantage in some games.

        • by gweihir ( 88907 )

          These tests likely do not meet scientific standards. For example, if people know they are on a faster display, they will try harder. Purely imaginary effect.

          • by AmiMoJo ( 196126 )

            IIRC Linus Tech Tips did a blind test. It's not perfect but I see no reason why it couldn't be true.

            • by gweihir ( 88907 )

              Linus Tech Tips did and does a lot of shady stuff. My guess is you just do not know how to do science right. Blind tests are actually pretty hard to get right.

      • The receptors in your eyes are slow ass chemical ones, that can detect, under optimal conditions 13ms pulses. That is around 75Hz. Anything faster and you will not see it.

        20+ years ago, I bought a CRT monitor that could adjust the refresh rate from 60Hz to 144Hz. I was able to see the difference in refresh from 60Hz to 120Hz. I could tell you if the monitor were running at 80Hz, 100Hz, or 60Hz and be EXACTLY correct every time. I would usually guess correctly between 120Hz and 144Hz, but my ability to ascertain the difference was not as solid as between 60Hz and 120Hz.

        Long story short, do all the math and logic that you want but my brain notices a difference where your math and l

    • High refresh rate means no tearing with vsync off. It definitely has a use.

    • Aside perhaps from some high performance VR setup that probably costs 50k of it even is available for regular people.

      Errr given we have higher than 4k resolution in VR headsets already costing $500 running at 120Hz, I don't think the incredibly minor performance bump will add 2 zeroes to the price.

    • 60 Hz isn't "luxurious". I consider 80-100 fps the minimum for gaming these days. Really I aim to be around 140, and I use DLSS frame gen to get there. Yes, fake frames. They're the only good thing AI has given us, you can't scrutinize them fast enough to see the mistakes. But they do make it smooth and actually luxurious.

    • >running at 180hz?

      There is a much more interesting angle to this, and that is for CRT-emulating purposes. Which I believe wants 240Hz to get lift-off.

      I haven't looked too deeply into it, but I know emulators and image scaler boxes have modes that use high resolution and refresh to give an image closer to a CRT. Yeah, I see the irony, but until someone finds a business model to start making CRTs again, I applaud this and will keep an eye out for a good match whenever I buy a new monitor.

    • It's about field of view. If you need an immersive experience and have normal eyesight you need 5K. (Note, I have better than 20/10 vision .. so I can see pixels on 4K monitors and TVs at farther distances than most people).

    • ... running at 180hz?!?? Aside perhaps from some high performance VR setup that probably costs 50k of it even is available for regular people.

      If you hadn't started your sentence in the Subject Line, my quote would make more sense.

      Why don't you just admit that you have no use for high resolution and high refresh but acknowledge that other people may have a use for it. I, personally, do not want to live under the limitations of your experiences. But yeah, just be negative about the whole thing because YOU do not personally find a use for it.

      While 24 FPS at NTSC resolutions (and colors) may be acceptable for a very large portion of people, it is no

  • by alvinrod ( 889928 ) on Tuesday December 23, 2025 @01:43PM (#65877717)
    Turn off the RT and the performance would be more than fine. Ultra settings often deliver very little uplift over high as is, but ray tracing is so little of an upgrade for the performance cost that the only reason to run it at all is to justify paying $2000+ for a GPU. I think ray tracing is generally pointless for most games as well. The performance cost is not worth the marginally better visuals, and most games don't want hyper-realistic lighting because it gets in the way of design and gameplay. No game can truly be designed for it until the technology creeps down into the mainstream and low end of the market, and we're several hardware generations away from that still.
    • Turn off the RT and the performance would be more than fine.

      You would think so, but then you'd be wrong. There are plenty of titles that struggle to reach high frame rates at 4K even with a 5090, without using ray tracing. E.g. in Starfield, the only way to get acceptable frame rates is to use DLSS, ideally combined with frame generation, i.e. fake it. Money alone can't render the 4K pixels even without ray tracing.

      High resolutions SUCK for gaming.

    • RT done the right way, like in Cyberpunk 2077* or Black Myth: Wukong, makes a huge difference in the overall lighting quality. The games have to be designed with it in mind from the beginning, preferably with environments that show it off. For many games with RT bolted on later, just to check a box for marketing, it's just a waste of electricity.

      *Cyberpunk in 2025 with all the settings maxed; it wasn't fully there for years after release.

    • Turn off the RT and the performance would be more than fine. Ultra settings often deliver very little uplift over high as is, but ray tracing is so little of an upgrade for the performance cost

      I am so relieved that someone is protecting me from my poor tastes. I do not know what I would do if I paid to see actual reflections in puddles of water in Cyberpunk 2077. You have saved me money. Thank you.

      We should create a Slashdot group where we tell people what their tastes should be so they do not waste any money on anything worthless like "Night Watch" by Rembrandt. That thing costs millions of dollars... and yet someone somewhere will be stupid enough to buy it once it is on sale again. Whatever w

  • by RobinH ( 124750 ) on Tuesday December 23, 2025 @02:00PM (#65877755) Homepage
    This reminds me of the rush to build 3D televisions. It was pushed by industry like it was the next big thing and you needed to buy these new TVs to keep up, but the technology was underwhelming, most movies weren't shot in proper 3D (about the only one was the original Avatar), and consumers didn't want to wear special glasses, sit almost straight on to the TV, give up lying sideways on the couch, etc. We don't need 5K even for games. If you think you do, just wait a couple years and your eyesight will degrade enough that you can't see the difference anyway.
  • I have the LG 5K2K (5120 x 2160) and a 5080 (ASUS TUF RTX 5080 12GB) and it does NOT have trouble hitting at least 60fps on most titles on high settings when using DLSS. The trouble comes when I try to enable advanced lighting techniques like full path tracing. Even Doom Eternal struggles to run with full path tracing at 5K. That said, plenty of titles run great and look amazing in 5K.
    • Isn't the point of DLSS to run games at a lower resolution in order to maintain frame rate, and then upscale?

      It also really pisses me off that a card as expensive and powerful as the 5080 comes with just 12 gigs of RAM. That is explicitly designed to force you to upgrade that video card in a few years... but that's Nvidia for you.
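That is indeed how DLSS upscaling works: the GPU renders at a lower internal resolution and the network upscales to the output size. The per-axis scale factors below are the commonly cited ones for the DLSS presets; treat them as approximate rather than official constants:

```python
# Approximate per-axis render-scale factors commonly cited for DLSS presets.
DLSS_SCALE = {
    "quality": 0.667,
    "balanced": 0.58,
    "performance": 0.50,
    "ultra_performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output size and DLSS preset."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# The 5K benchmark in the summary used DLSS "balanced":
print(internal_resolution(5120, 2880, "balanced"))  # -> (2970, 1670)
```

So the "5K" Cyberpunk result was actually rendered at roughly 1670p internally before upscaling, which puts the 51 fps figure in perspective.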
  • 2 x 5090 will solve your problems. Except for paying for food, shelter, school, etc.
    • They've never really been able to get it to work all that well. It's too complicated and too crash-prone, and requires custom software from the developers. So it's never been worth the cost.
    • Plz. In 10 years, largely due to Jensen's NEED for the world's largest collection of leather jackets; you'll just be renting AAA diversity slop for a monthly fee, on rented hardware for a low, low monthly fee, in perpetuity.

    • SLI causes problems with a lot of titles and also isn't available on Linux any longer. (There is still a feature called SLI in the Linux driver but it is NOT SLI.)

    • With how parallelized the workload is, and the high bandwidth of PCIe 5, they are basically just SLI on a single card now. If you are judging by how much power a 5090 uses, it is like two high-end cards from the SLI era.

      It also costs 4 times as much. I thought tech was supposed to get cheaper.

  • Desktop gamers simply do not sit close enough to 4K displays for the increased resolution to matter over 1440p.

    Couch gamers would need a 10-foot screen about 6 feet away from them to see this 8K nonsense.

    10bit color and proper HDR support would be far more meaningful to the average gamer.
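Whether extra resolution is visible comes down to pixels per degree at the viewing distance; roughly 60 ppd is the commonly cited 20/20-acuity threshold. A sketch with assumed (not measured) desktop geometry:

```python
import math

def pixels_per_degree(h_pixels: int, width_in: float, distance_in: float) -> float:
    """Angular pixel density as seen by the viewer."""
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# Assumptions: a 27" 16:9 panel is about 23.5" wide; desktop viewing
# distance around 24". Both values are illustrative, not from the article.
print(f"4K: {pixels_per_degree(3840, 23.5, 24):.0f} ppd")  # -> 74 ppd
print(f"5K: {pixels_per_degree(5120, 23.5, 24):.0f} ppd")  # -> 98 ppd
```

Both exceed the ~60 ppd threshold at that distance, which supports the parent's point for typical eyesight (and explains why better-than-20/20 viewers report otherwise).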

  • I'm running the LG 45in, 5k x 2k (which is a bit smaller vertically) on a 4090 and it runs most games at 120fps, albeit with some settings turned off. Obviously Ray tracing is the biggest one, but you can also turn off stuff like post processing, global illumination, and maybe turn on DLSS to quality in some cases, and you will hit 120fps without much issues.
  • 16K gaming has been possible since at least 2017 on properly coded games; there have been many videos about it. But vibe coders just pump out Unreal Engine slop.
  • ...should be enough for everyone.
  • ... I don't PC game like I used to and don't care about this new stuff. I still use 1080p, VGA, DVI, a KVM from Y2K, etc. They all still work fine for me!

  • I've been running one (an old HP Z27q that requires dual DP connections to drive it) for about eight years. I bought it used, since they were pretty rare and expensive at the time. It came with a couple of dead pixels, but they're so small they just look like a piece of dust. Even knowing they're there, I really have to hunt to find them.

    Honestly, I think 5k (2880p @ 27") is the sweet spot for productivity. It lets you run at a 200% scale factor (so an effective 1440p resolution), but all your text i
