
Are We on the Verge of an 8K Resolution Breakthrough in Gaming? (arstechnica.com)

An anonymous reader shares a report: With the 2020 release of the Xbox Series X and PlayStation 5, we've started to see the era of console games that finally make full use of TVs capable of 4K resolutions (i.e., "Ultra HD" 3840x2160 pixels) that have become increasingly popular in the marketplace. Now, though, at least one TV manufacturer is already planning to support 8K-capable consoles (i.e., 7680x4320 resolution) that it thinks could launch in the next year or two.

Polish gaming site PPL reports on a recent public presentation by Chinese TV and electronics maker TCL. Tucked away in a slide during that presentation is a road map for what TCL sees as "Gen 9.5" consoles coming in 2023 or '24. Those supposed consoles -- which the slide dubs the PS5 Pro and "New Xbox Series S/X" -- will be capable of pushing output at 8K resolution and up to 120 frames per second, according to TCL's slide.
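
For scale, the slide's numbers amount to a fourfold jump in pixels per frame over 4K; a quick sanity check (illustrative Python arithmetic, not figures from the report):

    # Pixels per frame at each resolution
    uhd_4k = 3840 * 2160          # 8,294,400  (~8.3 megapixels)
    uhd_8k = 7680 * 4320          # 33,177,600 (~33.2 megapixels)
    print(uhd_8k / uhd_4k)        # 4.0 -- four times the pixels per frame

    # Raw pixel throughput an "8K at 120 fps" console would have to shade and move
    print(uhd_8k * 120)           # ~3.98 billion pixels per second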


Comments Filter:
  • by unique_parrot ( 1964434 ) on Friday May 27, 2022 @01:14PM (#62570888)
    ...I am gaming with 640k. Should be enough for everybody.
    • by GlennC ( 96879 )

      Beat me to it. Please accept a virtual +1 from me.

    • Luxury! My Commodore has only 64k. We didn't even have REUs but I was allowed. No fancy ports of Sonic the Hedgehog from the Sega Master System for us.
  • Not for 2D (Score:4, Insightful)

    by systemd-anonymousd ( 6652324 ) on Friday May 27, 2022 @01:14PM (#62570890)

    There's not really much point of 4k+ for games, but there's a massive need for it for VR, which requires high resolution since the screens are so close to your eyes. It also requires 60+ FPS at all times and a refresh rate >90 Hz. VR and machine learning are the main things that are going to be pushing requirements forward, I think.

    • We do need 8k per eye for VR. With foveated rendering, current generation consoles can easily support it. In foveated rendering, the GPU doesn't need to render the whole image at 8k: the human eye can only see a few degrees of central vision at high detail. Therefore, if the headset can use an internal camera to determine exactly where the person is looking, the rest of the image only needs to be rendered at 640x480 (basic VGA) or less, as long as the frame rate can be kept above 120.
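
      Rough arithmetic for why that helps (a sketch; the window size and peripheral resolution below are illustrative assumptions, not specs of any real headset):

        # Naive cost of a full 8K render per eye
        full_8k = 7680 * 4320                       # ~33.2 M pixels

        # Foveated alternative (illustrative numbers): a small gaze-tracked
        # window at full density plus a coarse 640x480 pass for the periphery.
        fovea_fraction = (10 / 100) ** 2            # ~10 degrees out of ~100 -> ~1% of the area
        fovea = full_8k * fovea_fraction            # ~332,000 pixels at full density
        periphery = 640 * 480                       # 307,200 pixels for everything else

        print((fovea + periphery) / full_8k)        # ~0.02 -- about 2% of the naive pixel work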

      • by Z00L00K ( 682162 )

        I'd go for it when we have 8k vertically.

      • The resolution doesn't make sense without knowing the size and layout. Since you said "foveated imaging", which means varying pixel density, the "resolution" is even more meaningless. There's really only so much you can see in a headset even when turning the eyes and not the head. For example, I wear progressive glasses - so they're set up to look straight forward and also up and down, but not necessarily left or right by too much. So I can read my computer monitor without moving my head, but my eyes will move. F

    • by quall ( 1441799 )

      I game on a TV and there is quite a noticeable difference between even 1440p and 4k. 4k is ok without AA, but maybe 8k won't need it at all. So I'd say it is needed if we are still using post-processing effects to fix resolution problems.

      With that said, we can barely do 4k gaming at 60fps on max detail, so no, 8k is not nearly here. Unless people plan to put all of their detail settings down to low and then play at 30fps.

    • It also requires 60+ FPS at all times and a refresh rate >90 Hz.

      Huh?

    • There's not really much point of 4k+ for games

      Sure there is. Most games use deferred rendering these days and that means no antialiasing.

      No antialiasing means you need higher resolutions.

      • Sure there is. Most games use deferred rendering these days and that means no antialiasing.

        No antialiasing means you need higher resolutions.

        No. This article is about TVs doing 8K. You can render games at a higher internal resolution and downsample to 4K to solve your issue without pointless expensive TVs.

        Also double no. Deferred rendering just means AA needs to be done differently in the rendering pipeline. Nearly all games on the market today provide some form (or several forms) of AA, and DirectX 11 introduced AA on deferred rendering 12 years ago.
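
        A minimal sketch of the "render at a higher internal resolution and downsample" idea mentioned above (NumPy, assuming a plain 2x2 box filter; real engines use smarter filters):

          import numpy as np

          def downsample_2x(image):
              """Average each 2x2 pixel block -- crude supersampling anti-aliasing.
              `image` is an (H, W, C) array with even H and W."""
              h, w, c = image.shape
              return image.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

          frame_hi = np.random.rand(8, 8, 3)        # stand-in for a 7680x4320 internal render
          print(downsample_2x(frame_hi).shape)      # (4, 4, 3); 8K -> 4K works the same way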

    • There's not really much point of 4k+ for games

      with the current state of AA, 4k is not nearly enough.

    • Raw resolution is mostly only an important number when discussing rendering capability. When discussing displays, it's the pixel density and expected view distance that matter.
    • by AmiMoJo ( 196126 )

      I hope 8k takes off for computers, because I have 4k monitors and they are close to perfect, but not quite. I can see some pixels.

      8k should be perfect in all conditions. It should be the end game.

  • Yes, they will be able to upscale to 8k and pretend that is the same thing, so they can sell a new generation of TVs that are ahead of the computing power needed to actually push the content. 4k isn't even close to being a reliable standard yet, and it is going to take 4x the power to compute 8k vs 4k. Unless the 4XXX series cards have some quantum level of power in them, 8K is still quite a ways off. Not to mention 8k offers such tiny, diminishing returns without having a full deep HDR color gamut that ma

    • Yeah, exactly - we can barely manage 4k right now with the highest end PC cards. Ain't no 8K cards coming for under $600 in the next 3 years, and sure as shit no under $600 consoles with real 8k support coming in

      8K on TVs is dumb unless you have a 100"+ screen, agreed, but for VR there are use cases...

    • by Luckyo ( 1726890 )

      Joke's on you, Nvidia's marketing for 3090 was "video card for 8k gaming".

      Or more likely, joke was on Nvidia marketing, since pretty much everyone laughed at the silliness of it.

  • by bradley13 ( 1118935 ) on Friday May 27, 2022 @01:15PM (#62570896) Homepage
    OK, I know 640k RAM was enough, but...serious question: Does 8k actually make sense? Will you really notice the difference, unless you are inches away from a huge display?
    • by Junta ( 36770 )

      8k is about the point where a display covering your entire field of view still matches the angular resolution considered 'retina' quality (about 60 pixels per degree).

      So for any display that's used as a TV or monitor, it's beyond your ability to resolve the detail.

      For a hypothetical VR headset, you've hit all you need per eye to hit the resolution facet of realism even at full vision coverage, of which you'd only be able to resolve a small portion of it at that resolution a
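
      Roughly the numbers behind that 60-pixels-per-degree point (illustrative arithmetic; the field-of-view figures are assumptions and estimates vary):

        retina_ppd = 60           # pixels per degree, a common "retina" threshold
        horizontal_fov = 130      # degrees of useful horizontal field of view (assumed)
        vertical_fov = 70         # degrees vertically (assumed)

        print(retina_ppd * horizontal_fov)   # 7800 -- close to 8K's 7680 horizontal pixels
        print(retina_ppd * vertical_fov)     # 4200 -- close to 8K's 4320 vertical pixels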

    • The answer is no. If you just google this question you will find multiple cites for people not being able to differentiate 4k and 8k content in testing. In fact sometimes they say the 4k looks better. Unless you have a very big screen and sit fairly close to it, you can't tell the difference.

      On the plus side, a lot of games claim to support whatever resolution and then they choke at that res a lot, so if people think their game can handle 8k, maybe it will not choke at 4k. so bring on the 8k! /still gaming

      • by quall ( 1441799 )

        Did they test via gaming?

        And was AA turned off for games, which specifically exists to hide resolution issues?

        If the answer to either of these is "No", then the study has a pretty big hole in it.

        • You can't test 4k vs 8k in a double blind test with gaming, because if you turn the settings up far enough to where there even is a difference to try to detect, you're going to affect frame rates and then it won't be double blind any more.

          • by quall ( 1441799 )

            You can just lock the frame rate to something that both can render at - problem solved. When comparing a 4k to 8k image, why wouldn't you do this anyways? You are testing visual fidelity, not frame rate. And why would you want to perform a double blind test? You're testing an opinion, and the people administering the test need zero contact with the "test subjects".

            Of course people will notice a choppy screen at 8k if the hardware can't handle it. The question is about whether they will see a visual differen

            • If the question is about what people are going to realistically see while console gaming in their living room then I don't see why proving the general point doesn't apply to gaming. If you can't see stuff, it can't make a difference. If the question is about PC gaming then it's a whole other ball of wax — you're going to have to fiddle settings around to the point that you're not necessarily even trying to make a direct comparison because if you're using a lower resolution, then you can turn on more f

      • by spun ( 1352 )

        Betteridge's Law of Headlines states, "if it asks a question, the answer is always 'no.'"

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        • Betteridge's Law of Headlines states, "if it asks a question, the answer is always 'no.'"

          Ah, but what if the headline asks "Is Betteridge's Law Always Accurate?"

          • by spun ( 1352 )

            Then the answer is "no." I mean, it's a rule of thumb but the basic idea is, if the answer was "yes", the editors would phrase the headline as a statement. For example, "We are on the verge of an 8k breakthrough in gaming" is just a better heading, if that fact is true. And "we are not on the verge of an 8k breakthrough in gaming" is a terrible headline. It's basically saying "No need to read this article at all. This isn't really even news. The status quo remains unchanged." You don't report on a non-even

        • by narcc ( 412956 )

          Betteridge's Law of Headlines states, "if it asks a question, the answer is always 'no.'"

          That's not what your link says.

          • by spun ( 1352 )

            The link says "Any headline that ends in a question mark can be answered by the word no." so, how is that different? I just said it in fewer words.

            • by narcc ( 412956 )

              OMG. "the answer can be no" is in no way identical to "the answer is always 'no.'"

              • by spun ( 1352 )

                Jesus Christ, learn English. "Can be answered by the word no" means "the answer is no." It does not mean "the answer can be no" for fucks sake, that saying would mean absolutely nothing, and would not be called a law.

                You are literally claiming that Betteridge's Law says "if the headline asks a question, the answer might be no, or it might be yes." do you see how utterly stupid you sound saying that? God I fucking hate dumb people. Go away, and stop being stupid around me. Just stop giving opinions, on anyth

                • by narcc ( 412956 )

                  Really? I've corrected your illiterate ass twice now. Are you really going to double-down on stupid?

                  "Can be answered by the word no" means "the answer is no."

                  No, it does not, you drooling moron. "Can" does not mean "must". "Can", in this context, means "there exists the possibility".

                  From the fucking dictionary, not that you can read:

                  1. to be able to; have the ability, power, or skill to: She can solve the problem easily, I'm sure.
                  2. to know how to: He can play chess, although he's not particularly good at it.
                  3. to have the power or means to: A dictator can impose his will on the people.
                  4. to have the right or qualifications to: He can change whatever he wishes in the script.
                  5. may; have permission to: Can I speak to you for a moment?
                  6. to have the possibility: A coin can land on either side.

                  Your brain-damaged imaginary definition does not exist.

                  You are literally claiming that Betteridge's Law says "if the headline asks a question, the answer might be no, or it might be yes."

                  That's my claim because that's what it means, you shit-faced cock goblin. When a headline asks a question, it is implied that the answer is "yes". Betteridge is telling us

                  • by spun ( 1352 )

                    Hahah, oh man, so much verbiage for so little effect. The phrase means the answer is no, full stop, no questions, no debate. That's what it means. Anything else is pointless. It would not be coined as a law if your take were even remotely true. You don't get to redefine commonly known phrases to suit yourself.

                    You can try to interpret it how you like, but that's only true in your own fetid imagination. Keep it there. No one else cares about your opinion.

                    Let me type it out again for you: "Any headline that en

                    • by narcc ( 412956 )

                      true meaning of Betteridge's Law: "Any headline that ends in a question mark is able to be answered by the word no."

                      Look at you back-pedal! Pathetic! LOL! I'm glad you finally agree with me, but it's pretty sad that you can't just admit you were wrong.

                      Let me repeat your idiotic first post, in case you've forgotten how stupid it was:

                      Betteridge's Law of Headlines states, "if it asks a question, the answer is always 'no.'"

                      In case you've gone soft in the head again after being exposed to your brain-damaged post, I'll explain things to you again:

                      When a headline asks a question, it is implied that the answer is "yes". Betteridge is telling us not to just blindly accept that, and instead entertain the idea tha

                    • by spun ( 1352 )

                      If you can answer a question with the word no, the answer is no. It is not maybe no, maybe yes. If it were "yes" you would not be able to answer the question with the word no. I can't believe I even have to explain that to you.

                      Just admit you are wrong and move on.

                    • by narcc ( 412956 )

                      OMG, you're so fucking stupid.

                      Again, the word 'can' does not mean 'must'. Get a fucking dictionary.

                      A "yes or no" question can be answered with either a 'yes' or a 'no'. That's what makes it a "yes or no" question. Do you understand how questions work?

                      If it were "yes" you would not be able to answer the question with the word no.

                      OMFG, you're so, so, stupid. No, you moron, that's not how fucking questions work. Try this simple example:

                      "Would you like to have salad for lunch?"

                      This question can be answered with a 'no'. It can also be answered with a 'yes'. This is true of all '

                    • by spun ( 1352 )

                      Look, I am just trying to explain how the phrase is actually used by journalists. You are saying that your interpretation is right but that doesn't matter, it doesn't change how the phrase is actually used.

                      Google "headline asks a question and the answer is yes" and read the results. Here's the top two, to get you started.

                      https://training.npr.org/2019/... [npr.org]
                      https://www.barrypopik.com/ind... [barrypopik.com]

                      These are journalists talking about the phrase, and the way they mean it is how I used it. Sorry, you can argue semantics a

                    • by narcc ( 412956 )

                      Look, I am just trying to explain how the phrase is actually used by journalists. You are saying that your interpretation is right but that doesn't matter, it doesn't change how the phrase is actually used.

                      Nope. See your own links. I'm tired of explaining basic things to you:

                      Newspaper headlines sometime ask a question, and the answer to that question is usually “no.”

                      Or are you going to argue like an idiot that "usually" also means "always".

                      this isn't about definitions of the word "can" it is about how the phrase is used by people in the profession in question.

                      Keep moving those goalposts. Try as you might, you'll still be completely wrong.

                      Now fuck off and stop bothering me. I'm done. If you can't accept that you are wrong, I have nothing else to say to you. I have made the only argument that matters, so it is now on you. You can continue to be an asshole and an idiot, or you can learn and accept a new bit of knowledg

      • if people think their game can handle 8k, maybe it will not choke at 4k.

        Worth mentioning that 8k has a LOT more pixels than 4k (four times as many).

    • These 640k games often had a nice story or some clever logic behind them, but it could also be just me, getting older.
    • by fazig ( 2909523 )
      It might be interesting for VR or AR, where pixel density can still be an issue here and there.

      And of course it can be beneficial for digital art, and many other professional application where pixel-"real estate" can bring advantages.
      But for gaming? People already hardly see differences between 1440p and 2160p if the pictures are in motion due to more or less natural motion blur that LCD panels have.

      In essence you'd have to have a display that has an extremely high refresh rate and a very low response time
  • No (Score:4, Insightful)

    by TheDarkMaster ( 1292526 ) on Friday May 27, 2022 @01:16PM (#62570900)
    Most gamers don't even have the hardware needed to play in 4K, let alone 8K.
    • I agree. I do have the hardware to support 4k, but many games in 4k drive my 3080ti to 80 degrees at 60FPS. Some games like Far Cry 6 struggle to get 60fps in 4K. As a result I mostly run games in 2K. I suppose if you set games on their lowest settings you might be able to maintain 4k at a higher rate, but 8K would be at a pitiful frame rate. That is not the point of having a high-end video card. You want to see the best and the tech is not there yet.

      For reference: Threadripper 3960X, 128 GB RAM, NVMe PCIe 4.0 r

      • Your threadripper is the bottleneck. They are not so great at games, and Far Cry 6 has some pretty radical single thread cpu issues as it is. Check your BIOS for a "gaming mode" toggle.
        • Your threadripper is the bottleneck.

          CPUs are basically never the bottleneck when games struggle only at very high resolutions like 4K. That one CPU thread doesn't give a crap about screen resolution. There's a reason CPU benchmarks are done with games at 1080p; sites that compare 1080p and 4K show that at 4K your CPU becomes borderline irrelevant to performance.

      • That's why I want VR headsets with foveated rendering and high-resolution displays. With foveated rendering, unlimited resolution of the display can be supported with any modern GPU since only the small area you're looking at is rendered at high resolution. The rest of the image is rendered at a low resolution because the eye sees the surrounding area blurred anyway. Current generation graphics cards could easily support it.

    • You must be confusing the word "breakthrough" with "mass adoption commodity". Yeah I get it, all the letters are very similar so it's an easy mistake to make.

    • Most gamers don’t even want 4K if it comes at the expense of framerate. Just look how lopsided the results were when GameFAQs asked its users a few months back: https://gamefaqs.gamespot.com/... [gamespot.com]

      It’s not even close. Resolution is a secondary concern when you’re playing something that’s flying around constantly. Being able to tell what’s going on depends much more on framerate in those moments, not on having fewer jaggies if you’re sitting too close to the screen.

    • by mjwx ( 966435 )

      Most gamers don't even have the hardware needed to play in 4K, let alone 8K.

      nononononononono... you don't get it. This is for Microsoft and Sony to appear relevant in the gaming industry. They get to (falsely) claim that their consoles can do 8K... and the next generation is going to 11K. Meanwhile, they're being outpaced by a gaming PC built years ago.

  • by Striek ( 1811980 )

    No.

  • There are no 8K monitors available except the one from Dell, and it has a 60 Hz refresh rate... so we aren't on the verge of anything.
    • There are no 8K monitors available except the one from Dell, and it has a 60 Hz refresh rate... so we aren't on the verge of anything.

      Perhaps we are on the verge of being on the verge though?
      Think of the possibilities!

  • That will kill any hope of streaming games from the cloud.
    ISPs may choke if a full node of users is trying to do it at the same time, and you can hit your data cap so fast that you may as well have a game that lasts as long as Dragon's Lair.

    • by ffkom ( 3519199 )
      Streaming services are well trained in selling pixel-mush as "high resolution" at ridiculously low bandwidths that compress away all detail. I have no doubt they'll even sell you "16k streaming" (with still only a few Mbit/s of bandwidth) for a hefty premium.
      • by bn-7bc ( 909819 )
        No, and here is why: the target market for "16k streaming" (at least initially) will be the super-quality geeks (the ones with a dedicated home cinema room and the budget for a 16k projector and all the rest), and those people will complain bitterly and/or cancel the service if they see artefacts on their giant screens. And there is only so much compression will do, so expect those streams to run at 100 Mbps+ (probably significantly more), and it will not be from Netflix et al. but some dedicated enthusiast
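
        For a sense of how bitrates like that collide with the data caps mentioned above, a quick sketch (the 1 TB cap is only an example figure):

          bitrate_mbps = 100                            # the "100 Mbps+" stream floated above
          gb_per_hour = bitrate_mbps / 8 * 3600 / 1000  # 45.0 GB per hour of streaming
          cap_gb = 1000                                 # example 1 TB monthly cap
          print(gb_per_hour)                            # 45.0
          print(cap_gb / gb_per_hour)                   # ~22 hours before the cap is gone
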
  • No. I will not play at high resolutions if the refresh rate is lower than 144 Hz. After years of using a high-refresh monitor, even watching TV is like an eye-bleeding slide show. To say that the current generation of console hardware "fully utilizes" 4k is a fucking lie.
  • About 10% of my gamer buds are running 4k. Most are still running 1920x1080.

    I'm sure it'll be a neat breakthrough for the bleeding edge, but until available and affordable video cards and monitors trickle down, it's just going to be commercial eye-candy.

    What video card can be had for $300 or less? (Still an "LOL" kind of question.) What kind of monitor can be had for $150?

    • Those price points are what I would consider to be "bottom tier" gaming hardware. As in, barely passable at 1080p. And you can absolutely obtain high refresh 1080p monitors for that and a gpu that will push it.
      • $300 for a GPU being bottom tier is fucking terrible.

        My last GPU purchase was $150 and it was literally enough to play AAA games, albeit not at top settings and resolutions.

        Even with inflation it shouldn't be more than $200 for an entry level GPU today.

        I'm just going to keep playing old games until the prices come down, which might mean I keep playing old games forever. So be it.

        • Because of mining and production shortages, your $150 GPU is probably still selling on the used market for around $400. Newest-gen bottom tier is retailing at around $249 but you can't find it anywhere except from scalpers for around $350. It's a terrible time to buy anything gaming related.

          • Because of mining and production shortages, your $150 GPU is probably still selling on the used market for around $400.

            Nah, it was a Zotac 950 AMP! (now you know how long ago it was) that I bought to pair with another one which they gave me as a refurb for a 750 Ti. Just checked my gmail and I got it from newegg (heh) for $159.99. Now I look at ebay... $110. I will say this, the shortages have definitely helped it keep its value. If I sold them both I could get enough to buy a used 980 AMP! which would actually be faster than this... without SLI. (2x950 is ~1x970.) But not worth the hassle and potential to have a bad card,

          • by Luthair ( 847766 )
            A lot of cards are currently at or near MSRP.
        • by Luthair ( 847766 )
          A $150 card wasn't going to play an AAA game on anything but the lowest rendering settings. Heck, a Voodoo2 25 years ago was $400; accounting for inflation that is about $720, which is pretty close to the MSRP of a 3080.
            A $150 card wasn't going to play an AAA game on anything but the lowest rendering settings.

            In general my potato has been able to grunt out mostly-60-fps at mid settings on AAA games. People unfairly maligned the FX-8350 for not being very fast, which is unfair not because it's untrue, but because it's not very slow either :) I have no delusions about how slow it is by modern standards, though. What's more interesting is how well the 750 and 950 have held up for low-end gaming, in that they're still viable (in particular the 950 which has only recently dropped into maintenance-only status, and whi

        • $300 for a GPU being bottom tier is fucking terrible.
          My last GPU purchase was $150 and it was literally enough to play AAA games, albeit not at top settings and resolutions.

          There's a difference between a tier and being able to play games. Also there's nothing "terrible" about it. A top tier graphics card is expensive. Always has been. Hell the Voodoo2 was an inflation adjusted $550 card in today's dollars. Top of the line hardware has always been expensive and has *never* been required to play current games.

          Keep your old GPU, it'll keep working just fine. Just stop being judgemental of people who actually are fanatically into gaming and would prefer to run games at high resolu

            Keep your old GPU, it'll keep working just fine.

            I am keeping it, but I'd really also like to move up to at least 2k so I can enjoy the last of my eyesight.

            Just stop being judgemental for people who actually are fanatically into gaming and would prefer to run games at high resolutions / with ray tracing rather than settling for the bottom tier experience you are happy with.

            The only people I'm judging are the cryptococks. Fuck those guys, I want my old GPU pricing back.

            No need to stick to old games even today's titles work just fine.

            Meh. I can play them at low-low settings, but that's not acceptable. I was able to play them at mostly medium settings, but that now produces unacceptable frame rates.

            Obviously I'd need a whole new motherboard &c, but it's the GPU prices specifically that keep me out of the market. Plus I could sell thi

  • Is that resolution high enough to allow dogs the ability to watch and enjoy?
  • Linus, of LTT fame, had become a joke during the nVidia 3090 launch days for promoting 8k gaming. Fast forward almost 2 years and the same 3090 still struggles to keep up decent frame rates at 4K if you turn on more interesting visual features such as RT. Sure, you can rasterize around 100 FPS on high-end video cards, but there is more than just pixel density to improve. Availability, for one, is a bigger problem than 4k panels, let alone 8k.

    • Linus, of LTT fame, had become a joke during the nVidia 3090 launch days

      He was a joke long before then. But he's a good entertainer who knows how to capture a market with a fancy production. God knows he has to be, no one watches his channel for his whiney voice.

  • I changed my mind. Yes we are going to see 8k* gaming consoles in the next couple of years.





    *up to 8k resolution upscaled from 1080p. Max refresh rate of 24 Hz.
  • The people that can afford 8K monitors for gaming are probably old enough that they can't see that in decent fidelity anymore :-)

    • But are they dumb enough to not get their eyes checked and buy corrective lenses?

      • by Teckla ( 630646 )

        But are they dumb enough to not get their eyes checked and buy corrective lenses?

        Old(-ish) guy here checking in.

        Even with corrective lenses, the majority of my (similarly aged) friends and I can barely tell the difference between 1080p and 4k.

        For us, 4k alone (never mind 8k) isn't worth the higher price and substantial performance hit.

        Given the choice, we'll take higher frame rates instead.

        • I can barely tell the difference between 1080p and 4k

          When you say that, you really need to tell us which monitor size you use, and at what distance you're sitting from it.

          I'm sitting pretty close to a 32" 4K monitor and I'm 40-ish, and I feel it's much better than the full HD I first had.

  • Tucked away in a slide during that presentation is a road map for what TCL sees as "Gen 9.5" consoles coming in 2023 or '24. Those supposed consoles -- which the slide dubs the PS5 Pro and "New Xbox Series S/X" -- will be capable of pushing output at 8K resolution and up to 120 frames per second, according to TCL's slide.

    So what? That doesn't mean they'll be capable of gaming at that resolution. The 8k is for streaming video and the 120Hz is for gaming or for 3D TV. Does TCL still sell 3D TVs? I know the market for those pretty much cratered, but they used to have a whole bunch of models with 3D.

  • You would have to be several inches away from a UHD TV to perceive the pixels. Consoles & computers would be better off improving the poly count, draw distance, anti-aliasing, shading & shadows than worrying about rendering at a resolution that nobody can possibly see.
  • There is not enough power and cooling to game at 8k. I run at 4k with the highest end consumer equipment available and it struggles at 4k. That being said, I REALLY want 8k. 4k was a huge relief, but not quite far enough. I do not want to ever notice a single pixel. 4k fools me most of the time, so it is acceptable. 8k would likely fool my eyes and brain all of the time.

  • How close do you have to be to the monitor to actually see the 8K difference? You need a theater-sized screen or you need to sit right in front of it - otherwise you cannot see the difference. The returns over 4K diminish very quickly.
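
    Rough numbers behind that viewing-distance point (a sketch assuming about 1 arcminute of visual acuity and a 65-inch 16:9 panel as an example):

      import math

      diagonal_in = 65
      width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~56.7 inches of picture width
      one_arcmin = math.radians(1 / 60)                 # roughly the 20/20 acuity limit

      for name, h_pixels in [("4K", 3840), ("8K", 7680)]:
          pitch = width_in / h_pixels                   # pixel pitch in inches
          max_dist_ft = pitch / math.tan(one_arcmin) / 12
          print(name, round(max_dist_ft, 1), "ft")      # ~4.2 ft for 4K, ~2.1 ft for 8K

    Past roughly four feet, even 4K pixels already blend together on a screen that size, so the extra 8K detail is invisible from a typical couch.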

  • Okay, so a TV/panel manufacturer could make a display that pushes out 8k; but it's completely pointless if the average consumer needs to hock a kidney to afford a video card capable of that resolution. (Or for consoles: can the makers even source sufficient hardware for a mass rollout?)

    it's a bit like trying to launch a video streaming service before ubiquitous broadband.

  • The PS2 could output at 1080i but virtually nothing was able to run at that resolution.

    The PS3 and Xbox 360 could output at 1080p, but virtually nothing actually ran at that resolution.

    The PS4, PS5, Xbox One and Xbox Series systems can output at 4k, but virtually nothing, even now, runs at full 4k, all the time. They might output at it but they are upscaling 1440p for example.

    Has anyone actually sat in front of a 4k display at the average viewing distance your typical dummy sits from such a display?

    We don

  • Why do so many APIs use 32-bit or larger integers for holding width and height when after decades of development we are still only up to 8K?
    My theory is that software architects are way too optimistic about the potential lifespan of their creations.

    • Because no one currently needs more space than there are atoms in the universe for a video game. Nothing game developers have ever produced has come even close to needing that kind of space. Not even the procedurally generated stuff, with all of its flaws. We're a long ways off from being able to make something that would need that much space. Let alone anything that would make sense to have in context. It's not just a limitation of engineering, but also design. We need a ridiculous amount of effort put in
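
      A quick arithmetic check on that headroom question (my framing, plain Python; the 64K-square buffer is just an illustrative case):

        width_8k, height_8k = 7680, 4320
        print(width_8k < 2**16, height_8k < 2**16)   # True True -- 16 bits would already hold a dimension
        print(2**31 - 1)                             # 2,147,483,647 -- max signed 32-bit dimension
        print(width_8k * height_8k)                  # 33,177,600 pixels per 8K frame, still comfortable
        print(65536 * 65536 > 2**32 - 1)             # True -- a 64K x 64K buffer already overflows 32-bit pixel counts
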
  • Why would anyone care about this if the content and game-play sucks?

  • Maybe when I was 12 and I sat close to the TV, higher resolutions mattered. Don't get me wrong, better anti-aliasing and image stabilization are good things. And I guess a quick and dirty way to do that is to keep cranking the resolution up. But if it means I have to spend $1,000 on a GPU just to hit acceptable frame rates, then you've lost me.

    That said, I wouldn't be surprised to see people try to turn gaming into a luxury hobby. Everybody's always chasing that Apple computer model where you sell Veblen goods
  • After 4k I'm out; 4k has enough resolution and video games are realistic enough for me already. 8k games will vastly increase the computing resources needed for rendering and moving data around, and I just don't need that in my life

  • 8k is not needed; it's a higher resolution than the human eye can see. We have reached the limit, just like 16 million colours is all we need because humans cannot see more

    Most gamers, and video buffs, cannot tell the difference between 4k and 8k unless they get far far too close, and some will swear the 4k is better

    Find something else to improve ..
