FPS Benchmarks No More? New Methods Reveal Deeper GPU Issues

crookedvulture writes "Graphics hardware reviews have long used frames per second to measure performance. The thing is, an awful lot of frames are generated in a single second. Calculating the FPS can mask brief moments of perceptible stuttering that only a closer inspection of individual frame times can quantify. This article explores the subject in much greater detail. Along the way, it also effectively illustrates the 'micro-stuttering' attributed to multi-GPU solutions like SLI and CrossFire. AMD and Nvidia both concede that stuttering is a real problem for modern graphics hardware, and benchmarking methods may need to change to properly take it into account."
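
To put the summary's point in concrete terms, here is a minimal sketch (the frame-time trace and the 60 Hz budget are invented for illustration; this is not code from the article) of how a healthy-looking FPS average can hide perceptible hitches:

```python
# Sketch: why average FPS can hide stutter. Frame times are in milliseconds;
# the trace is made up (mostly 10 ms frames plus a few 80 ms hitches).
frame_times_ms = [10.0] * 595 + [80.0] * 5

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds       # looks healthy
worst_ms = max(frame_times_ms)                       # does not
budget_ms = 1000.0 / 60.0                            # ~16.7 ms per frame at 60 Hz
slow_frames = sum(1 for t in frame_times_ms if t > budget_ms)

print(f"average FPS:        {avg_fps:.1f}")          # ~94 FPS overall
print(f"worst frame:        {worst_ms:.1f} ms")      # an 80 ms hitch
print(f"frames over budget: {slow_frames} of {len(frame_times_ms)}")
```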
  • Twelve pages of graphs and data? Couldn't he just have said "standard deviations and percentiles" and be done with it?
    • by jhoegl ( 638955 )
      Someone has never written a term paper :)
      • by gknoy ( 899301 )

        Or perhaps the author has no background or experience with statistics, and wouldn't know what is meaningful about one or the other.

    • So first, I have not read TFA, but there are at least two different types of people: those that take in data through visual presentation and those that prefer text and numbers.
      When I was an engineer, I presented everything to my management in text and numbers that "any idiot" could quickly understand. Just a couple of slides with the essential bullet points and some "standard deviations and percentiles". I never knew why my managers ended up getting frustrated, bored, or began working "on other projects"...
      • It's called "make it shiny." As an engineer/tech guy myself, watching PP presentations by manager types makes me want to either barf or put my head through the table. They violate basically every single rule of graphical presentation. Yet "it's shiny," so everyone is happy.

  • by Tei ( 520358 ) on Friday September 09, 2011 @10:19AM (#37351144) Journal

    Our eyes detect 'deltas' better than 'speeds', so if the odd-numbered frames have a shorter delay than the others, our eyes will detect it. But this only affects setups with multiple GPUs. And it's easy to fix: just calculate the delta of the latest frame and force the same delta, maybe using a buffer. This is not a problem once it has been detected. It may need some minor changes in engines, but that's all. IMHO. (A rough sketch of this kind of pacing appears after this sub-thread.)

    • Right up until some disk IO causes your last frame to be 2 seconds long, now every frame in the future is forced to 2 seconds between updates! Awesome for the win.

      It's only slightly (and by slightly, I mean a lot) more complicated than you think it is.

    • That sort of sounds like the solution presented on page 11 [techreport.com]: "More intriguing is another possibility Nalasco mentioned: a 'smarter' version of vsync that presumably controls frame flips with an eye toward ensuring a user perception of fluid motion."
    • That's the weird thing... on Starcraft 2 (from the article), the jitter was more pronounced on the single chips than in the multi-GPU configurations. It's not that simple ;)

    • by RMingin ( 985478 )

      Actually, if you read TFA, it's not only multi-GPU setups doing it. Also, the 'solution' you describe has been used by Nvidia since the GeForce 8 era. They call it 'frame metering', and it's not a perfect solution either.
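
The pacing idea from this sub-thread, together with the objection about long hitches raised above, can be sketched roughly as follows. This is only an illustration under assumptions, not Nvidia's frame metering or any real engine's code; render(), present(), and the smoothing/clamp constants are hypothetical placeholders.

```python
import time

def paced_loop(render, present, smoothing=0.1, max_pace_ms=50.0):
    """Rough frame-pacing sketch: present frames at a smoothed interval instead
    of immediately, so alternating short/long frames look more even. The clamp
    (max_pace_ms) addresses the objection above: a single 2-second hitch must
    not become the permanent pacing target."""
    target_ms = 1000.0 / 60.0                  # start near a 60 Hz budget
    last_present = time.perf_counter()
    while True:
        t0 = time.perf_counter()
        frame = render()                       # hypothetical: produce a frame
        render_ms = (time.perf_counter() - t0) * 1000.0

        # Exponentially smoothed estimate of recent frame cost, clamped so one
        # catastrophic frame cannot drag the pace down for good.
        target_ms = min((1 - smoothing) * target_ms + smoothing * render_ms,
                        max_pace_ms)

        # Wait until the paced presentation time, then flip.
        delay = last_present + target_ms / 1000.0 - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
        present(frame)                         # hypothetical: display the frame
        last_present = time.perf_counter()
```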

  • by mfh ( 56 )

    tldr; benchmarks ignore consistency in their measurements and are therefore nonscientific marketing devices.

  • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Friday September 09, 2011 @10:21AM (#37351160) Homepage Journal
    An oversimplification in the article:

    After all, your average geek tends to know that movies happen at 24 FPS

    Movies happen at a motion-blurred 24 fps. Video games could use an accumulation buffer (or whatever they call it in newer versions of OpenGL and Direct3D) to simulate motion blur, but I don't know if any do.

    and television at 30 FPS

    Due to interlacing, TV is either 24 fps, when a show is filmed and telecined, or a hybrid between 30 and 60 fps, when a show is shot live or on video. Interlaced video can be thought of as having two frame rates in a single image: parts in motion run at 60 fps and half vertical resolution, while parts not in motion run at 30 fps and full resolution. It's up to the deinterlacer in the receiver's DSP to find which parts are which using various field-to-field correlation heuristics.

    and any PC gamer who has done any tuning probably has a sense of how different frame rates "feel" in action.

    Because of the lack of motion blur, 24 game fps doesn't feel like 24 movie fps. And because of interlacing, TV feels a lot more like 60 game fps than 30.
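
A minimal sketch of the accumulation-buffer idea mentioned above, done on the CPU with NumPy for clarity rather than with a real graphics API; render_at() and the sub-sample count are assumptions, not how any particular game implements motion blur.

```python
import numpy as np

def motion_blurred_frame(render_at, t, dt, subsamples=4):
    """Accumulation-style motion blur sketch: render several instants inside
    one frame interval and average them. render_at(t) is a hypothetical
    callable returning an HxWx3 float image for simulation time t."""
    acc = None
    for i in range(subsamples):
        sub = render_at(t + dt * i / subsamples)
        acc = sub if acc is None else acc + sub
    return acc / subsamples

# Toy usage: a fake "renderer" whose brightness depends on time.
frame = motion_blurred_frame(lambda t: np.full((2, 2, 3), t), t=0.0, dt=1 / 60)
```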

    • by Tr3vin ( 1220548 )

      Movies happen at a motion-blurred 24 fps. Video games could use an accumulation buffer (or whatever they call it in newer versions of OpenGL and Direct3D) to simulate motion blur, but I don't know if any do.

        The Force Unleashed II uses a technique similar to this, but there are probably others that do, too. It renders frames at about 30 frames a second but updates the screen at a stable 60. It uses an interpolating motion blur to make the gameplay feel nice and smooth. This allows for more geometry to be drawn while still providing a "good" user experience. It is slightly different from a simple accumulation blur, since they predict the motion when doing the blur instead of simply blurring previous frames.

    • by TheLink ( 130905 )
      Movies happen at a motion blurred 24 fps AND I think that sucks. On the "big screen" I can visibly see the stuff "rippling" down at 24 fps especially on scenery pans.

      The 24 fps rate is not because the human eye can't see faster than that (it can), it's a compromise due to technology limitations nearly 100 years ago.

      Except for "special effects" I prefer that the stuff be updated much faster and let our eyes do the motion blurring. And I don't really like blurred special effects.

      I dislike movie scenes where s
      • When you look at a fast moving object it's sharp (unless it's moving so fast that your eyes can't track it).

        But the background will be blurred. Motion blurring usually has its place; if done well you will not even notice it, it just seems natural. Of course that often involves no effects at all; the camera just uses the same limitations as your eyes. If you were to use a very high shutter speed in a scene with a fast-moving object it will seem strange: your eyes won't do the motion blurring right because they're fed a completely sharp image to begin with, which is not what happens when your eyes pan in a real scene...

        • by cynyr ( 703126 )

          The problem is that "film"-based cameras cannot film with infinite depth of field. This makes it rather impossible to do what you're asking for in outdoor shots. For indoor or limited-depth bright scenes it may be possible to use a higher f-stop and trade light for greater depth of field.

          Anyways, fully rendered stuff can be rendered with perfect camera separation and infinite depth of field.

        • by TheLink ( 130905 )
          1) You miss the point. The background might be blurred, but that's when I am not looking at it (e.g. not interested). BUT if I choose to look at it, it stops being blurred. What is "background" depends on what/where I choose to look.

          My complaint is about the situation where movie directors artificially and unnecessarily make bits of the picture out of focus. Sometimes there are technological limitations, but often nowadays most of the scene is rendered and then artificially blurred (e.g. Avatar). Sometimes it's part of the story, or for good effect, in which case I don't mind.
          • 1) You miss the point. The background might be blurred, but that's when I am not looking at it (e.g. not interested). BUT if I choose to look at it, it stops being blurred. What is "background" depends on what/where I choose to look.

            Maybe I misunderstood you. I agree that it shouldn't be overdone, but I think that if you were to remove motion blur (MB) altogether from such a scene it would look very strange. Try enabling/disabling MB and play a suitable scene in a modern video game. It *will* look strange without MB, also with high fps/resolution on a fast monitor. I don't think that having everything completely sharp will work (if that's what you mean).

            My complaint is about the situation where movie directors artificially and unnecessarily make bits of the picture out of focus. Sometimes there are technological limitations, but often nowadays most of the scene is rendered and then artificially blurred (e.g. Avatar). Sometimes it's part of the story, or for good effect, in which case I don't mind.

            It hurts my eyes when it's overdone.

            Exactly.

            2) I know they are not the same, I'm just complaining about them all. I wrote: "out of focus or motion blurred". I dislike both effects. And I also dislike lower frame rates (which is another separate thing - the low-frame-rate frames could be perfectly sharp and in focus).

            I think we're agreeing in principle, but I took what you wrote to mean "do

    • by grumbel ( 592662 )

      to simulate motion blur, but I don't know if any do.

      Pretty much all modern games use motion blur. Some of course use it better than others, and it is often not quite as high quality as one might want it to be, but motion blur itself is almost everywhere these days (most noticeable when swinging the camera around in a third-person shooter).

    • by Anonymous Coward

      It's way more than motion blur that makes movies tolerable at 24 fps - motion blur is just one sign of the fundamental difference between recording and animation. If an animation runs at 60 fps, you are getting information about 60 frames in one second. This is not true for video - each of the 24 frames is actually an average over many millions of frames (look up the Planck time on Google - the fps is actually far more than a million). The universe runs at a tremendous number of fps, and the 24 frames you see...

    • by Hatta ( 162192 )

      Did Voodoo cards apply any sort of motion blur? I've long noticed this stuttering, e.g. when circle strafing around a corner it's pretty easy to see that the corner doesn't move smoothly. It even happens when running very non-demanding games (prboom, openarena, UT, etc) at 75hz under vsync with modern hardware under any OS.

      The only hardware I've ever used where I didn't see this effect is my Voodoo 2 setup. Those things are smooth as butter, as long as you don't throw too many polygons at them.

    • Again, an oversimplification.

      Movies are shot at 24 fps with a 1/180 second exposure time. When they make the exposure time longer, you notice, and it gets hard to watch. Listen to the commentary by Joss Whedon on Serenity for more, but the scenes on the reaver planet are shot at 1/120 and are a little unsettling. Things like panning include blur. The scenes in Saving Private Ryan on the beach are shot at 1/60 and are noticeably difficult to watch: when the camera moves, everything blurs. When something flies by...

    • The biggest difference in your example is between WATCHING a video game with visually imperceptible stuttering and PLAYING one. The latter leaves the gamer confused about why their controls are suddenly unresponsive.
      • I'm guessing you've not played any multithreaded games then?

        The controls are plenty responsive. The visual feedback is not.

    • In the real world you can follow an object and motion blur will only be an issue for the background. That doesn't work at all in the movies, which is why fast panning shots look atrocious. Following objects with your eyes on a persistent display actually creates fake blur as your eyes try to track in-between positions of the object that do not actually exist. This was a non-issue on CRT displays, as the image relied on persistence of vision and that compensates for eye tracking. Modern TVs try to compensate...
    • Note: Not all TV shows are at 50 or 60 Hz. Some TV shows use the progressive encoding scheme that halves framerate.

      The two fields of interlace are hijacked to form a single still image. It's a single image built from two passes. Hence the name "progressive".

      They sacrifice framerate for the sales pitch. I don't know why though. Possibly to make the freeze-frame look better, dunno.

      • Most progressive TV shows I've seen appear to have been shot on film (or video processed to look like film), and I mentioned those: "24 fps, when a show is filmed and telecined".

        They sacrifice framerate for the sales pitch. I don't know why though.

        TV producers use film to make the show appear more cinematic. Audiences are used to seeing film-like frame rates and other artifacts of film for scripted programming as opposed to video for "reality" programming such as news and sports.

        Possibly to make the freeze-frame look better

        The freeze-frame doesn't look better when you hit it at the wrong field dominance.

    • You can sure as hell tell the difference between 24fps and 60fps video. It is real, REAL easy to see. I have an AVCHD cam that will shoot 24p, 30p, or 60p (HDC-TM900 if you are interested) and it is amazing how smooth the 60p video looks. You just don't get that kind of smoothness in movies.

      Some people don't like it, they think it looks "too smooth" and "not cinematic". In other words, they've gotten used to the choppiness of 24fps and associate that with movies.

      What's more, there's a market in TVs that try

  • This will certainly make benchmarking a bit more complex. One hopes that the gamers like going back to stats class.

    You'll need the FPS value, as before (ideally with a worst-case FPS reported); but you'll also want a measure of the deviation of every frame's draw time from the average draw time being reported. And likely a measure of how atypically bad frames are distributed (i.e. 5 seconds of super-low framerate during some sort of loading is annoying. 20 25 millisecond frames scattered throughout action-heavy areas is really annoying...)
    • Well, we definitely need a measure of output... Maybe I'm daft, but how about FPS achieved for X million textured triangles or polygons [with shadow]?
      • I suspect that such a simplified benchmark would work just fine, but it would be of interest to relatively few people. Most users of graphics cards either don't care at all and have integrated graphics, don't care about theoretical performance but do care about sniping n00bs in Medal of Halo 3, or are GPU compute users, who have their own quite specific demands.

        Even in the realm of game benchmarking, you can see some pretty dramatic differences, between engines, in how Nvidia's approach or ATI's approach...
      • by Shinobi ( 19308 )

        Ah, back to the good old days of 3D graphics, when output was measured in polygons per second, and all the jiggling about, until the big boys all standardized on 50 pixel polygons, gouraud-shaded, at a resolution of 1280x1024, with a colour depth of 24 bits... And then more stuff was tacked on...

        Well, ok, almost all the big boys... Intergraph kept trying to pass 25 pixel polygon performance numbers as a valid comparison to 50 pixel polygons....

    • you'll also want a measure of the deviation of every frame's draw time from the average draw time being reported. And likely a measure of how atypically bad frames are distributed (i.e. 5 seconds of super-low framerate during some sort of loading is annoying. 20 25 millisecond frames scattered throughout action-heavy areas is really annoying...)

      As a non-stats-guy I hereby invent the standard-deviation-for-slow-down-only-including-outliers metric. You're welcome :)

      But seriously, couldn't they just include the whole distribution graph of time between frames, with a description of what it means? And maybe provide the underlying data as a spreadsheet so that you can run whatever analysis you're most comfortable with on it. Oh well, back to stats class.

      This post is going nowhere, so I'm gonna save it for posterity.
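
For what it's worth, the quantities being argued about in this sub-thread (deviation, percentiles, and some kind of "badness" budget) are only a few lines of NumPy; the 50 ms threshold below is an arbitrary example value, not necessarily the one the article uses.

```python
import numpy as np

def frame_time_stats(frame_times_ms, budget_ms=50.0):
    """Summarize a frame-time trace: mean, spread, percentiles, and how much
    total time was spent inside frames longer than a 'badness' threshold."""
    t = np.asarray(frame_times_ms, dtype=float)
    return {
        "mean_ms": t.mean(),
        "std_ms": t.std(),
        "p95_ms": np.percentile(t, 95),
        "p99_ms": np.percentile(t, 99),
        "worst_ms": t.max(),
        # Total milliseconds spent beyond the threshold, summed over long frames.
        "time_beyond_budget_ms": (t[t > budget_ms] - budget_ms).sum(),
    }
```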

  • by Anonymous Coward

    The point of doing a FPS-benchmark is to reveal how the graphic card performs in the games that most people play. People don't care about the theoretical performance. They just want to know if it can run the new cool game with the good graphics. Either a game renders fast enough, or it is so slow that you can't turn out the special effects that make the game look really good. It's all about the game. The other stuff is not so important.

    • by tepples ( 727027 )

      The point of doing a FPS-benchmark is to reveal how the graphic card performs in the games that most people play.

      What about people who don't play a lot of first-person shooters? Some of these benchmarks measure only fps in FPS, not fps in other genres.

      Either a game renders fast enough, or it is so slow that you can't turn out the special effects that make the game look really good.

      The point of the article is that an isolated frame that takes too long to render can jerk one out of being absorbed in the effects.

    • Wrong.

      The point of an FPS benchmark is to get a higher number than some guy on a forum so you can brag about it.

      FPS has never been about how well a system can render a scene because it's a shitty measure of it. It ignores quality, complexity, accuracy, stuttering... (Why is this supposed to be new? GPU stuttering is less noticeable than disk IO causing jitter, and NOW the framerate no longer matters? That's just dumb.)

      There is nothing 'new' about the frame rate being a shitty method of rating performance, well,

  • At this point a large use of GPUs seems to be for processes where they are more efficient than CPU. The most obvious is vector processing. If one is doing heavy computational work then the standard benchmarks seem fine. What fraction of the GPU market is for actual graphical use?
    • by tepples ( 727027 )
      Which isn't unlike the benchmark that the article uses: 99th percentile frame rendering time. You want this to be under 16 ms in order to keep a consistent 60 fps.
      • You completely missed the point. The 99th percentile frame rendering time gives you a reasonable approximation of 1/fps. What we REALLY want to know about is those few frames that fall above the 99th percentile -- those are the ones that cause stuttering.
        • by AvitarX ( 172628 )

          I agree. I am not sure whether the 99th percentile is working or not now, but that still allows for a bad frame every other second.

          More useful would be the average of that worst 1%, or cranking it up to 99.9% (a bad frame every twenty seconds), or longer, to some acceptable rate.

          Even if the 99th percentile works now, it's ripe for abuse by the manufacturers if it becomes a common metric (I imagine there could be trade-offs in a driver where one frame every 2 seconds is terrible, rather than one frame a second being OK,
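
The refinement proposed above (look past the 99th percentile at the tail itself) is easy to compute; a sketch, assuming a frame-time trace in milliseconds:

```python
import numpy as np

def tail_metrics(frame_times_ms):
    """Report the 99th-percentile frame time plus the mean and worst of the
    slowest 1% of frames -- the tail the 99th percentile alone ignores."""
    t = np.sort(np.asarray(frame_times_ms, dtype=float))
    p99 = np.percentile(t, 99)
    tail = t[t > p99]
    if tail.size == 0:                  # degenerate case: perfectly flat trace
        tail = t[-1:]
    return {"p99_ms": p99, "worst_1pct_mean_ms": tail.mean(), "worst_ms": t[-1]}
```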

  • Don't you only need 60 FPS to have the illusion of animation anyway?

    • Just for the illusion of motion? That's around 12fps, which is what cartoons use. 60fps is the start of fairly smooth motion, but 120hz is in uncanny valley territory for me. Too smooth, but not smooth enough.

      • by Ferzerp ( 83619 )

        Are you sure about this, or are you basing it on experience with really poor TV frame interpolation? That's not 120Hz. That's more like 30Hz with lots of fake frames.

        • You have a point on the fake frames. I don't have anything capable of generating 120Hz as source material, or for displaying it. So in-store demo is really all I have. I suppose I'd have to watch the 120Hz interpolation in slow motion to see if the artifacts are the cause. Maybe it's because too much motion blur is missing after the interpolation (more than should be for that amount of interpolation).

  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Friday September 09, 2011 @10:40AM (#37351390)
    Comment removed based on user account deletion
    • by tonywong ( 96839 )
      Not only that, but he couldn't perceive the microstuttering most of the time. That article should have been turned away for better research.
    • by Kjella ( 173770 )

      Actually, he measured the rendering just fine. But what nVidia told him is that they're doing some timing magic before they display it in SLI setups, which is currently not possible to measure with FRAPS or any of the other standard FPS tools. So right now you would need a high-speed camera to snap pictures of the screen to know what the user sees. But regardless of that, they can't get rid of all the stuttering that easily; the slowest frame still takes much longer to render than the average. Also, this...

    • by sanosuke001 ( 640243 ) on Friday September 09, 2011 @11:11AM (#37351714)
      If it makes you feel better, I write 3D applications and our software has stuttering issues when loading new texture data (very large texture sets, so it's a tradeoff we accept for the most part). It is a problem, and taking an average over time for FPS is mostly bullshit. I actually do some per-frame render time benchmarks when I'm developing, as it's more useful when trying to test consistency.
      • You should probably review video cards, given your insight; most tech sites (even Anand) still just run canned FPS games. Maybe you can even make a little $ from doing articles.
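
A bare-bones version of the per-frame instrumentation described in the parent comment; render_frame() and the 25 ms spike threshold are placeholders, not the poster's actual tooling.

```python
import time

def instrumented_loop(render_frame, n_frames=1000, spike_ms=25.0):
    """Log individual frame times during development and flag spikes,
    instead of relying on an FPS counter averaged over a whole second."""
    times_ms = []
    for i in range(n_frames):
        t0 = time.perf_counter()
        render_frame()                                  # hypothetical work
        ms = (time.perf_counter() - t0) * 1000.0
        times_ms.append(ms)
        if ms > spike_ms:
            print(f"frame {i}: {ms:.1f} ms (spike)")
    return times_ms
```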

    • If I'm reading the same spot, that's talking about latency between the call when the game wants to draw it and when it's actually rendered, in response to a technique Nvidia is using to smooth out FPS, not variation/jitter in the FPS, which is what a majority of the article is talking about.

      I'd like your five minutes and mine for taking the time to read your response too.
    • Why would they? This is /. after all.
  • Most of the time when I notice any stuttering is also the same time my hard drive lights up. Usually either the game or some background service decides to flood the disk with IO requests. In a few instances I've even had Windows become completely unresponsive until whatever disk operation is running completes. It doesn't matter how much RAM I have. I haven't purchased any SSDs yet, but I'm sure they help a lot to alleviate the problem. The question is: is this a fault in how programs or the operating system...
    • 1) Pagefile/swapping - the OS thinks it is from the '90s and tries to do memory swapping (no matter how much RAM you have).

      2) Disk controller drivers run in kernel mode. Any rise in disk queue length (pending requests) hangs the system. Arggh.

      • I'm sorry, what OS are you using to which these things don't apply? OSX, Linux, Windows are all exactly the same here; they all do these things (well, except for hanging the system, unless of course we're hanging because we're waiting on something to be paged so the app we're running appears frozen), and are expected to, for obvious reasons. Well, clearly not obvious to you.

        So again, what OS are you using, that those things don't apply to? QNX or BeOS?

        • Most of the time when I notice any stuttering is also the same time my hard drive lights up

          As the drive(s) get busy, pending operations get queued up (disk queue depth) and as such, the machine starts "hanging" waiting for the IOs to finish.

          unless of course we're hanging because we're waiting on something to be paged so the app we're running appears frozen), and are expected to, for obvious reasons.

          It is obvious. The point here is that the paging operation of the machine makes the problem worse. Especially if you have slow disk(s), or the pagefile/swap space dynamically changes.

          Primarily, Linux does a much better job of not paging unless it starts running out of memory, whereas Windows seems to start using it as soon as it boots.

          • The original poster could check disk queue length and then start running their apps. Then, when it starts chugging, go check the stats to see if there is a spike in disk IO or queue length.
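
One rough way to follow that suggestion is to sample OS disk counters alongside the frame timer and line the two up. The sketch below assumes the third-party psutil package is available; it is a crude stand-in for watching disk queue length, not an exact equivalent.

```python
import time
import psutil

def sample_disk_activity(seconds=10, interval=1.0):
    """Print per-interval disk I/O deltas so stutter can be correlated with
    bursts of reads and writes."""
    prev = psutil.disk_io_counters()
    for _ in range(int(seconds / interval)):
        time.sleep(interval)
        cur = psutil.disk_io_counters()
        print(f"reads/s: {(cur.read_count - prev.read_count) / interval:.0f}  "
              f"writes/s: {(cur.write_count - prev.write_count) / interval:.0f}  "
              f"read MB/s: {(cur.read_bytes - prev.read_bytes) / interval / 1e6:.1f}")
        prev = cur
```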
    • Comment removed based on user account deletion
      • Erm, isn't that what we already do, at least on Linux? I mean, the application simply uses read/write and possibly seek against an arbitrary filename - it's up to the OS and FS to handle actually getting or saving the data. Which means that the application doesn't know or care whether the file's on spinning media, a slow flash drive, SSD or cached in system ram!
        The only thing that may not be optimal is that we have the sync() function, which usually won't return until the data has actually been saved to disk...

        • Comment removed based on user account deletion
  • As if the issue of micro-stuttering hasn't already been covered in great detail numerous times in the past. I ran SLI for a while; if it's a problem, then features like vsync can help. If you are only running one GPU, like 99.9% of folks out there, then you don't need to waste your time on this article.
    FWIW, FPS is still a fine benchmark. Like any benchmark, it only tells part of the story. That's why you use tools like 3DMark that run a battery of benchmarks to aggregate a rating, and then measure actual...
  • I find the content and discussion very interesting. For me, this was something obvious because of my line of work, but I can imagine that most people reading (and writing) GPU reviews had no clue whatsoever about this.

    As much as I find the content interesting, its presentation is awful. Although it is interesting to present some figures on a frame-count basis, most of the overview figures should be on an equivalent time basis, allowing a proper comparison of the test sequences. I'd have shown one frame-based...
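
The "equivalent time base" asked for here is just the cumulative sum of frame times; a small sketch, assuming a per-frame trace in milliseconds:

```python
import numpy as np

def to_time_base(frame_times_ms):
    """Convert a per-frame trace to a time axis so runs with different frame
    counts can be compared on the same x-axis (plot the second return value,
    in ms, against the first, in seconds)."""
    t = np.asarray(frame_times_ms, dtype=float)
    elapsed_s = np.cumsum(t) / 1000.0
    return elapsed_s, t
```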

  • Comment removed based on user account deletion
    • by cynyr ( 703126 )

      I agree with you, much like the recent AMD Bulldozer discussion where some Anandtech benches were linked to. The slightly higher-clocked non-turbo quads beating the 6-ways in x264 encodes... seems to me they need to turn some more x264 options on.

  • The only way I have ever been able to test what real performance will be like in a given game or rendering in a given program is to play that game or render in that program. Even built-in benchmarks like in HL2 don't seem to take gameplay into account well enough. While (at best) benchmarks can be a help in deciding what to buy in a very general way, I have learned to be skeptical and trust my experience only. Even framerate monitors in games often don't reflect the smoothness of the experience of the game.

  • I'm glad somebody started looking at ms per frame instead of frames per second. Since what we really care about for game performance is whether frames are rendered quickly enough to give satisfactory reaction times etc, using frames per second is misleading.

    Another example where the same thing happens is fuel consumption: we keep talking about miles per gallon, but what we primarily care about is the fuel consumed in our driving, not the driving we can do on a given amount of fuel, so this is misleading. To

    • Changing from frames/sec to msec/frame doesn't fix the problem at all.

      I was playing Minecraft the other day with a buggy mod installed, and was getting 240 fps, but choppy performance. Sometimes I'd get 1 second spikes, and the fps monitor would change to show something like 30fps before creeping back up to 240. If you convert that to msec/frame, those numbers still look really good.

      Other games that run at 50fps looked much better than this buggy minecraft mod. Taking the inverse (as you suggest) doesn't

      • by jensend ( 71114 )

        Of course just looking at an average of either quantity isn't going to be sufficient- you need to look at the distribution of values, not just the mean. (Looking at the 99th percentile, as they did in the article, is a start.) But at least this way we're looking at the distribution of the right numbers.
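
The anecdote above is easy to put numbers on. The two traces below are invented for illustration (not the poster's actual data): the choppy run wins on average FPS and still plays far worse.

```python
import numpy as np

# Hypothetical frame-time traces (ms): one run mostly at ~240 fps with a few
# 1-second hitches, one steady 50 fps run.
choppy = np.array([1000.0 / 240] * 14000 + [1000.0] * 3)
steady = np.full(3000, 1000.0 / 50)

for name, t in (("choppy ~240 fps", choppy), ("steady 50 fps", steady)):
    avg_fps = 1000.0 * len(t) / t.sum()
    stuck_ms = (t[t > 50.0] - 50.0).sum()     # time spent inside long frames
    print(f"{name}: avg {avg_fps:.0f} fps, worst {t.max():.0f} ms, "
          f"time beyond 50 ms: {stuck_ms:.0f} ms")
```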

  • They already show Min Frame Rate next to average on Tom's Hardware....
  • Tom's Hardware posted a similar sort of analysis a few weeks ago: http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995.html [tomshardware.com]
  • Why would anyone need a framerate faster than the refresh rate of the display you're using?

    I've never understood why anyone would push a graphics card faster than the refresh rate of the display you're using. Why not just cap it off at the max refresh rate, and let the card take more time in rendering each frame?

    It seems as though there should be some sort of "dynamic rendering" option. You want the framerate to match the refresh rate of the monitor, so why can't the rendering engine decide...
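
A rough sketch of the "dynamic rendering" idea in this comment: cap presentation at the display's refresh interval and spend any slack on quality, dialing quality back when frames run long. The refresh rate, the quality knob, and the thresholds are all assumptions, not an existing engine feature.

```python
import time

def adaptive_loop(render, present, refresh_hz=60, quality=1.0):
    """Cap output at the display refresh rate and adjust a quality knob so the
    frame cost stays inside the refresh budget."""
    budget = 1.0 / refresh_hz
    while True:
        t0 = time.perf_counter()
        frame = render(quality)                    # hypothetical renderer
        cost = time.perf_counter() - t0

        # Nudge quality toward whatever fits inside the budget.
        if cost > budget * 0.95:
            quality = max(0.25, quality * 0.9)     # back off detail
        elif cost < budget * 0.6:
            quality = min(1.0, quality * 1.05)     # spend spare time on detail

        slack = budget - (time.perf_counter() - t0)
        if slack > 0:
            time.sleep(slack)                      # wait for the next refresh
        present(frame)
```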

  • When Black Ops came out I was getting decent FPS, but the stuttering made it unplayable. It seems that this game requires a CPU with an onboard memory controller *and* 3/4 available cores. If you don't meet both of those requirements, the engine will stutter just as this article describes. This was a problem for users of Core2-series CPUs; even Core2 Quads were inadequate. Activision refuses to acknowledge the severity of this problem to this day.
