Graphics AMD Hardware

Dual GPU Battle: GTX 980 Ti SLI vs. Radeon R9 Fury X Crossfire

jjslash writes: High-end GPUs from Nvidia and AMD are plenty fast these days. While top-end cards from both can provide playable performance at 4K, many games still dip to 30fps or below. Folks looking to achieve smooth 4K gameplay will undoubtedly be eyeing dual GTX 980 Ti or Fury X cards to realize their PC gaming machine's full potential. TechSpot puts both to the test in SLI and Crossfire modes, at stock and overclocked speeds, in over 10 games to see who gets the bragging rights. As it turns out, AMD has a tiny advantage in average frame rates. The two split wins on frame times, but AMD won by bigger margins. Once the cards are overclocked, Nvidia is the clear winner, and power consumption favors Nvidia as well.
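For reference, average FPS and frame-time percentiles come from the same per-frame capture; here is a minimal Python sketch (the sample numbers are made up, not TechSpot's data) of how the two metrics differ:

    # Hypothetical per-frame render times in milliseconds.
    frame_times_ms = [16.2, 17.1, 15.9, 33.4, 16.5]

    # Average frame rate: total frames divided by total time.
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    # A high-percentile frame time exposes stutter that the average hides.
    p99_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]

    print(f"average: {avg_fps:.1f} fps, 99th percentile: {p99_ms:.1f} ms")

A card can win on average FPS while losing on frame times if its frames arrive unevenly, which is why the review reports both.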
  • by whathappenedtomonday ( 581634 ) on Saturday August 08, 2015 @07:16PM (#50277037) Journal
    4K is cool and all, I get it. Still, why would I want my PC to render 4K at suboptimal frame rates when I can play at full HD and have the display present that 1080p image in a way no native HD display could? Really, I want one of those things just for the real estate and the freedom to place more, better-looking windows where I want and need them, but for gaming at native resolution, I just don't get it.
    • I agree. When monitors are 40 or 50 inch, then 4K will make sense for gaming. I just don't see it now. You need a lot of screen real estate to take advantage of that many pixels.

      • by FlyHelicopters ( 1540845 ) on Saturday August 08, 2015 @07:55PM (#50277125)

        I agree. When monitors are 40 or 50 inch, then 4K will make sense for gaming. I just don't see it now. You need a lot of screen real estate to take advantage of that many pixels.

        If that were true, there would be no need for anti-aliasing...

        Since there is, your point is simply incorrect...

        Until there is no need for AA, the resolution isn't high enough.

        • When monitors are 40 or 50 inch, then 4K will make sense for gaming

          What do you mean, "when"? 40+ inch UHD ("4K") gaming monitors are already here! [pcper.com] They even come with FreeSync.

        • by Jupix ( 916634 )

          On a 4K display, anti-aliasing is already completely pointless. It uses a huge amount of rendering horsepower for a blur effect that is impossible to notice without A/B comparisons. Competent system testers use it in benchmarks only to put more load on the systems, and incompetent ones to prove that SLI/CF builds are inadequate for 4K. Popular but incompetent review sites like IGN do the latter regularly, which is really counterproductive because it only increases FUD and slows down 4K adoption.

        • I agree. When monitors are 40 or 50 inch, then 4K will make sense for gaming. I just don't see it now. You need a lot of screen real estate to take advantage of that many pixels.

          If that were true, there would be no need for anti-aliasing...

          Since there is, your point is simply incorrect...

          Until there is no need for AA, the resolution isn't high enough.

          Perhaps all that 4k rendering should be in the monitor. Just send the video image deltas and step out of the way.
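          The delta idea is easy to sketch; a minimal Python/numpy version (function names hypothetical, and ignoring compression and transport) might look like:

              import numpy as np

              def frame_delta(prev, curr):
                  # Positions and new values of every pixel that changed.
                  ys, xs = np.nonzero(np.any(prev != curr, axis=-1))
                  return ys, xs, curr[ys, xs]

              def apply_delta(frame, delta):
                  ys, xs, values = delta
                  frame[ys, xs] = values  # patch only the changed pixels
                  return frame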

      • Re: (Score:2, Informative)

        by Anonymous Coward

        You need a lot of screen real estate to take advantage of that many pixels.

        Not really. 28" at 2' occupies the same visual arc as 42" at 3'.
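        The arithmetic holds; a quick Python check using the standard visual-angle formula, treating the diagonal as the size:

            import math

            def visual_angle_deg(size_in, distance_ft):
                # angle subtended = 2 * atan(size / (2 * distance)), both in inches
                return math.degrees(2 * math.atan(size_in / (2 * distance_ft * 12)))

            print(visual_angle_deg(28, 2))  # ~60.5 degrees
            print(visual_angle_deg(42, 3))  # ~60.5 degrees, the same arc as claimed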

      • Works pretty well on 28" on a normal-sized desk.

      • by Jupix ( 916634 )

        Having used a 55" 4K 60Hz panel (Sony 55X9005A) as my gaming display since 2013, I can say that high resolution gaming is pretty much the same thing as high refresh rate gaming or VR gaming: you won't "get it" until you try it.

        Furthermore, in my time I've observed three primary types of gamers:

        • "graphics & performance don't matter" gamers
        • "resolution & fidelity is everything" gamers
        • "fps & low latency is everything" gamers

        If you're not in category 2 then I'm afraid you'll never "get" these very expensive displays.

    • Someone always has to pay the cost so that the bleeding edge can become commonplace. Enthusiasts pay for the high-cost, high-margin products that help make up for selling the mid-range, run-of-the-mill parts at low margins to the rest of us. Yesterday's luxury is today's commodity.
    • Okay, let's pretend 4K monitors didn't exist (they do). I'll give you another reason: supersampled anti-aliasing, where you render at a higher resolution, apply any effects (and maybe FXAA), and then downscale the frame to native resolution at the end. Downscaling the final frame is basically the best form of anti-aliasing you can have, given how downscaling algorithms derive each output pixel from multiple source pixels.
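      A minimal numpy sketch of that final downscale (a plain 2x2 box filter, assuming the frame was rendered at exactly twice the target resolution; real drivers may use fancier filters):

          import numpy as np

          def downscale_2x(frame):
              # Average each 2x2 block of source pixels into one output pixel.
              h, w, c = frame.shape
              blocks = frame.reshape(h // 2, 2, w // 2, 2, c).astype(np.float32)
              return blocks.mean(axis=(1, 3)).astype(frame.dtype)

          # e.g. render at 3840x2160, then show the result on a 1920x1080 panel:
          hires = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
          native = downscale_2x(hires)  # shape (1080, 1920, 3)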

    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Saturday August 08, 2015 @08:49PM (#50277329) Homepage Journal

      More pixels per inch means better resolution with less detail-killing anti-aliasing. Back when resolutions over 1600x1200 were a lot, I used to regularly get kills in FPSes on people who couldn't even see me yet, because I had a big fancy .22-pitch monitor.
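      The dot-pitch figure maps roughly to pixel density (roughly, because CRT pitch is measured on the diagonal):

          dot_pitch_mm = 0.22
          ppi = 25.4 / dot_pitch_mm  # 1 inch = 25.4 mm
          print(f"{ppi:.0f} pixels per inch")  # ~115 PPI, high for a CRT of that era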

    • by Kjella ( 173770 )

      4K is cool and all, I get it. Still, why would I want my PC to render 4K at suboptimal frame rates when I can play at full HD

      Not all games are about competitive FPS; it's really that simple. If you're playing The Witcher, Metro: Last Light, Tomb Raider, Civilization or any of a bunch of other games, you just want it to be smooth, and otherwise you crank up the resolution/details/effects as far as you can.

  • by __aaclcg7560 ( 824291 ) on Saturday August 08, 2015 @07:38PM (#50277091)
    Nvidia TNT2 for Windows desktop and software-rendered games, and a pair of 3Dfx Voodoo 2 SLI boards for 3Dfx games. Got me through many rounds of Quake and Quake 2. Now get off my lawn!
  • In over 10 games (Score:2, Insightful)

    by Anonymous Coward

    Can marketing majors everywhere please quit using this phrase?

    Just say how many. Quit with the "over X" bullshit. Every time I read that, I hear DUR IM PADDING NUMBERS BC IM STUPID.

    • by Anonymous Coward

      Even better, it's *exactly* 10 games...
      So not only is it stupid, it's flat-out wrong.

    • Hey, don't knock it. It's an improvement on "up to X or more".

"Look! There! Evil!.. pure and simple, total evil from the Eighth Dimension!" -- Buckaroo Banzai

Working...