Dual GPU Battle: GTX 980 Ti SLI vs. Radeon R9 Fury X Crossfire 51
jjslash writes: High-end GPU parts from Nvidia and AMD are plenty fast these days. While top-end cards from both can provide playable performance at 4K, many games dip down to and below 30fps. Folks looking to achieve smooth 4K gameplay will undoubtedly be eyeing dual GTX 980 Ti or Fury X cards to realize their PC gaming machine's full potential. TechSpot puts both cards to the test in SLI and Crossfire modes, at stock and overclocked speeds, in over 10 games to see who gets the bragging rights. As it turns out, AMD has a tiny advantage in average frame rates. The two split wins on frame time, but AMD won by bigger margins. When the cards are overclocked, Nvidia is the clear winner, and power consumption favors Nvidia as well.
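[Ed. note: the summary distinguishes average frame rates from frame times, and the distinction matters. A minimal sketch, using made-up frame-time numbers (not TechSpot's data), of how a run with stutter can still report a healthy average FPS while the 99th-percentile frame time exposes it:]

```python
import math

# Hypothetical capture: 98 smooth ~60fps frames plus two 100 ms stutters.
frame_times_ms = [16.7] * 98 + [100.0] * 2

# Average FPS hides the spikes in the mean.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Nearest-rank 99th-percentile frame time catches the worst 1% of frames.
k = math.ceil(0.99 * len(frame_times_ms))
p99_ms = sorted(frame_times_ms)[k - 1]

print(f"average FPS: {avg_fps:.1f}")          # still looks like ~54 fps
print(f"p99 frame time: {p99_ms:.1f} ms")     # reveals the 100 ms stutter
```

This is why review sites that only report averages can miss exactly the micro-stutter that multi-GPU setups are notorious for.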
Please enlighten me (Score:3)
Re: (Score:3)
I agree. When monitors are 40 or 50 inch, then 4K will make sense for gaming. I just don't see it now. You need a lot of screen real estate to take advantage of that many pixels.
Re:Please enlighten me (Score:5, Interesting)
I agree. When monitors are 40 or 50 inch, then 4K will make sense for gaming. I just don't see it now. You need a lot of screen real estate to take advantage of that many pixels.
If that were true, then there would be no need for Anti Aliasing...
Since there is, then your point is simply incorrect...
Until there is no need for AA, then the resolution isn't high enough.
Re: (Score:2)
Why would AA be needed on a display where the eye could not distinguish between individual pixels?
Re: (Score:2)
And so... for this person who has fantastic eyes providing amazing vision: when they are given a screen at a resolution where they cannot distinguish individual pixels - would they need AA? Come on child, summon the brain trust and gather their collective wisdom.
Re: (Score:2)
Sorry, but you are wrong. It's common to play "2K" or 4K with AA off on 27" monitors for the performance and frame-rate gains. In fact, I would know - I recently upgraded from 1080p to 2K. 2K at lower settings with AA off looks better than 1080p at high settings with 8x AA. No jaggies anywhere on 2K without AA on a 27" monitor.
Re: (Score:3)
When monitors are 40 or 50 inch, then 4K will make sense for gaming
What do you mean, "when"? 40+ inch UHD ("4K") gaming monitors are already here! [pcper.com] It even comes with FreeSync.
Re: (Score:2)
On a 4K display, antialiasing is already completely pointless. It uses a huge amount of rendering horsepower for a blur effect that is impossible to notice without A/B comparisons. Competent system testers use it in benchmarks only to put more load on the systems, and incompetent ones to prove that SLI/CF builds are inadequate for 4K. Popular but incompetent review sites like IGN like to do that latter part regularly, which is really counterproductive because it only increases FUD and slows down 4K adoption.
Re: (Score:2)
I agree. When monitors are 40 or 50 inch, then 4K will make sense for gaming. I just don't see it now. You need a lot of screen real estate to take advantage of that many pixels.
If that were true, then there would be no need for Anti Aliasing...
Since there is, then your point is simply incorrect...
Until there is no need for AA, then the resolution isn't high enough.
Perhaps all that 4k rendering should be in the monitor. Just send the video image deltas and step out of the way.
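[Ed. note: a toy sketch of the "just send the deltas" idea above - diff the new frame against the previous one and transmit only the changed pixels. Frame layout and pixel values here are made up for illustration:]

```python
# Two consecutive "frames" as flat lists of pixel values.
prev = [0, 0, 0, 0, 0, 0]
curr = [0, 0, 9, 0, 7, 0]

# Sender: transmit only (index, new_value) pairs for pixels that changed.
delta = [(i, v) for i, (p, v) in enumerate(zip(prev, curr)) if p != v]

# Receiver: reconstruct the current frame from the previous frame + delta.
recon = prev[:]
for i, v in delta:
    recon[i] = v
```

Real protocols (VNC, video codecs' inter-frame prediction) do something far more sophisticated - block-based motion compensation rather than per-pixel diffs - but the bandwidth-saving principle is the same.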
Re: (Score:2, Informative)
You need a lot of screen real estate to take advantage of that many pixels.
Not really. 28" at 2' occupies the same visual arc as 42" at 3'.
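[Ed. note: the parent's claim checks out, since visual angle depends only on the size-to-distance ratio, and 28/24 equals 42/36. A quick sketch of the arithmetic:]

```python
import math

def visual_angle_deg(diagonal_in, distance_ft):
    """Visual angle subtended by a screen diagonal at a viewing distance."""
    distance_in = distance_ft * 12
    return math.degrees(2 * math.atan(diagonal_in / (2 * distance_in)))

a = visual_angle_deg(28, 2)  # 28" screen viewed from 2 feet
b = visual_angle_deg(42, 3)  # 42" screen viewed from 3 feet
print(f"{a:.2f} vs {b:.2f} degrees")  # identical visual arc
```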
Re: (Score:2)
Works pretty well on 28" on a normal sized desk.
Re: (Score:2)
Having used a 55" 4K 60Hz panel (Sony 55X9005A) as my gaming display since 2013, I can say that high resolution gaming is pretty much the same thing as high refresh rate gaming or VR gaming: you won't "get it" until you try it.
Furthermore, in my time I've observed three primary types of gamers:
If you're not in category 2 then I'm afraid you'll never "get" these very expensive displays.
Re: (Score:2)
Re: (Score:2)
The resolution matters more than the 4:2:0 chroma. Try it with A/B testing.
Re: (Score:3)
Re: Please enlighten me (Score:2)
Okay, let's pretend 4K monitors didn't exist (they do). I'll give you another reason: supersampled anti-aliasing, where you render at a higher resolution, apply any effects and maybe FXAA, and then downscale the frame at the end to native resolution. Downscaling the final frame is basically the best form of anti-aliasing one can actually have, due to how downscaling algorithms derive pixel values from multiple source pixels.
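[Ed. note: a minimal sketch of the supersampling idea described above - render at 2x resolution, then average each 2x2 block down to one native pixel (a simple box filter; real downscalers use fancier kernels). The tiny grayscale "frame" is invented for illustration:]

```python
def downscale_2x(frame):
    """Box-filter downscale: each output pixel averages a 2x2 input block."""
    h, w = len(frame), len(frame[0])
    return [
        [
            (frame[2 * y][2 * x] + frame[2 * y][2 * x + 1]
             + frame[2 * y + 1][2 * x] + frame[2 * y + 1][2 * x + 1]) / 4
            for x in range(w // 2)
        ]
        for y in range(h // 2)
    ]

# A 4x4 "supersampled" frame rendered at 2x the native 2x2 resolution.
hi_res = [
    [0,   0,   255, 255],
    [0,   0,   255, 255],
    [255, 255, 0,   0],
    [255, 255, 0,   0],
]
lo_res = downscale_2x(hi_res)  # each native pixel blends 4 rendered samples
```

Because every output pixel blends four rendered samples, edge pixels land on intermediate values instead of hard jaggies - exactly the anti-aliasing effect the parent describes.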
Re: Please enlighten me (Score:2)
Okay, let's pretend 4K monitors didn't exist (they do). I'll give you another reason: supersampled anti-aliasing, where you render at a higher resolution, apply any effects and maybe FXAA, and then downscale the frame at the end to native resolution. Downscaling the final frame is basically the best form of anti-aliasing one can actually have, due to how downscaling algorithms derive pixel values from multiple source pixels. That said, I wouldn't allow my computer to use such methods when I don't still get high en
Re:Please enlighten me (Score:4, Interesting)
More pixels per inch means better resolution with less detail-killing antialiasing. Back when resolutions over 1600x1200 were a lot, I used to regularly get kills in FPSes on people who couldn't even see me yet, because I had a big fancy .22mm dot-pitch monitor.
Re: (Score:2)
4k is cool and all, I get it. Still, why would I want my PC to render 4k at suboptimal frame rates, when I can play at full HD
Not all games are about competitive FPS, it's that simple really. If you're playing The Witcher, Metro: Last Light, Tomb Raider, Civilization or any of a bunch of other games you just want it to be smooth and otherwise crank up the resolution/details/effects as far as you can.
Re: (Score:2)
"X is better than Y for the use cases I'm seeing" is a perfectly valid argument. Now, it's best to clarify as such, and not simply extrapolate out to every single possible use under the sun, a la "Well, Nvidia/AMD gets 3 more frames per second on $GAME, so therefore it's better for EVERYTHING." That's stupid - but something like my recent experience, where I had major crash problems with AMD drivers on some of the games I like to play, and none wit
Re: (Score:1)
I fail to see what's invalid about that argument, though.
What argument? Is there any argument in GGP's post beyond the premise "nVidia rocks, AMD sucks"? Not that I can see.
Re: (Score:2)
The same predictable douchey driver argument that rears its ugly head in every single A/B article that's posted on /.
FTFY - As for red and green, it's not Christmas — and July was last month.
Re: (Score:1)
A smoking hot combo in the late 1990's... (Score:4)
Re: (Score:2)
Remember those Quantum3D Obsidian (I believe they were called) 3D-only cards with shit-tons of 3Dfx Voodoo chips on 'em? :)
Re: (Score:2)
In over 10 games (Score:2, Insightful)
Can marketing majors everywhere please quit using this sentence?
Just say how many. Quit with the "Over X" bullshit. Every time I read that, I hear DUR IM PADDING NUMBERS BC IM STUPID.
Re: (Score:1)
Even better, it's *exactly* 10 games...
So not only is it stupid, it's flat out wrong.
Re: (Score:2)
Hey, don't knock it. It's an improvement on "up to X or more".
Re: (Score:2)
Geforce driver improvements? They don't even install properly on Windows 10.
Re: (Score:2)
you're assuming an OS and related software that can't run in high DPI modes. iMac with 5k display is beautiful, and it's only 27".