
FPS Benchmarks No More? New Methods Reveal Deeper GPU Issues

crookedvulture writes "Graphics hardware reviews have long used frames per second to measure performance. The thing is, an awful lot of frames are generated in a single second, and averaging them into one FPS number can mask brief moments of perceptible stuttering that only a closer inspection of individual frame times can quantify. This article explores the subject in much greater detail. Along the way, it also effectively illustrates the 'micro-stuttering' attributed to multi-GPU solutions like SLI and CrossFire. AMD and Nvidia both concede that stuttering is a real problem for modern graphics hardware, and benchmarking methods may need to change to properly account for it."

  • by tepples ( 727027 ) on Friday September 09, 2011 @10:21AM (#37351160)
    An oversimplification in the article:

    After all, your average geek tends to know that movies happen at 24 FPS

    Movies happen at a motion-blurred 24 fps. Video games could use an accumulation buffer (or whatever they call it in newer versions of OpenGL and Direct3D) to simulate motion blur, but I don't know if any do. (A sketch of the idea follows this comment.)

    and television at 30 FPS

    Due to interlacing, TV is either 24 fps, when a show is filmed and telecined, or a hybrid between 30 and 60 fps, when a show is shot live or on video. Interlaced video can be thought of as having two frame rates in a single image: parts in motion run at 60 fps and half vertical resolution, while parts not in motion run at 30 fps and full resolution. It's up to the deinterlacer in the receiver's DSP to find which parts are which using various field-to-field correlation heuristics. (The second sketch after this comment shows the crudest version of such a heuristic.)

    and any PC gamer who has done any tuning probably has a sense of how different frame rates "feel" in action.

    Because of the lack of motion blur, 24 game fps doesn't feel like 24 movie fps. And because of interlacing, TV feels a lot more like 60 game fps than 30.

  • Re: time/frame (Score:4, Informative)

    by lgw ( 121541 ) on Friday September 09, 2011 @10:28AM (#37351250) Journal

    The Crysis test loop measures the slowest frame, starting with the second loop (to avoid measuring disk performance). That "minimum FPS" number is what I personally use to benchmark graphics cards - it has always been the speed through the slow-to-render part of the map that matters.

  • by sanosuke001 ( 640243 ) on Friday September 09, 2011 @11:11AM (#37351714)
    If it makes you feel better, I write 3D applications and our software has stuttering issues when loading new texture data (very large texture sets, so it's a tradeoff we accept for the most part). It is a problem, and taking an average over time for FPS is mostly bullshit. I actually do some per-frame render time benchmarks when I'm developing, as they're more useful when trying to test consistency.
