AMD Graphics Hardware

New GPU Testing Methodology Puts Multi-GPU Solutions In Question

Vigile writes "A big shift in the way graphics cards and gaming performance are tested has been occurring over the last few months, with many review sites now using frame times rather than just average frame rates to compare products. Another unique testing methodology, called Frame Rating, has been started by PC Perspective: it uses video capture equipment capable of recording uncompressed high-resolution output direct from the graphics card, a colored bar overlay system, and post-processing on that recorded video to evaluate performance as it is seen by the end user. The benefit is that there is literally no software interference between the data points and what the user sees, making it as close to an 'experience metric' as any yet developed. Interestingly, multi-GPU solutions like SLI and CrossFire have very different results when viewed in this light, with AMD's offering clearly presenting poorer, more stuttery animation."
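
(For the curious, here is a rough sketch of the kind of post-processing such a capture-based approach implies. It is not PC Perspective's actual tooling; assume each captured video frame has already been reduced to the color of the overlay bar visible in it, so a game frame that stays on screen for several refreshes repeats its color.)

  # Hypothetical post-processing sketch: given one overlay-bar color per captured
  # display refresh, collapse runs of identical colors into per-frame display times.

  REFRESH_MS = 1000.0 / 60.0  # 60 Hz capture/display interval

  def displayed_frame_times(bar_colors):
      """Return how long (ms) each rendered frame actually stayed on screen."""
      times = []
      run = 1
      for prev, cur in zip(bar_colors, bar_colors[1:]):
          if cur == prev:
              run += 1
          else:
              times.append(run * REFRESH_MS)
              run = 1
      times.append(run * REFRESH_MS)
      return times

  # 'a' shown for two refreshes, 'b' for one, 'c' for three (a visible stutter), ...
  print(displayed_frame_times(list("aabcccde")))
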
This discussion has been archived. No new comments can be posted.

  • That unpossible. :-P

    • Mind you, the only thing worse than stuttery animation is jerky sound.
  • by amanicdroid ( 1822516 ) on Friday February 22, 2013 @03:37PM (#42983035)
    My AMD is cranking out Bitcoin hashes 15 times faster than an equivalently priced Nvidia so I'm okay with the results of this article.
    • by gstoddart ( 321705 ) on Friday February 22, 2013 @03:41PM (#42983095) Homepage

      My AMD is cranking out Bitcoin hashes 15 times faster than an equivalently priced Nvidia so I'm okay with the results of this article.

      Out of curiosity, what's your break even point?

      If you went out now, and bought one of these video cards solely for this ... how long would it take to recoup the cost of the card? Or is this something you'd run for a long time, and get two bucks out of, but still have had to pay for your electricity?

      I hear people talking about this, but since I don't follow BitCoin closely enough, I have no idea if it's lucrative, or just geeky.

      • by Anonymous Coward on Friday February 22, 2013 @03:52PM (#42983271)

        People that "mine" bitcoins don't pay for their own electricity. Most people don't have the basement circuits metered separately from the rest of the house.

      • by lordofthechia ( 598872 ) on Friday February 22, 2013 @03:57PM (#42983323)

        Don't forget electrical costs. At $0.10 a kWh you are paying $0.24 a day (24 hours) per 100 watts of continuous average power consumption. This is $7.20 per month per 100W @ $0.10 /kWh or $87.60 a year. Adjust up/down for your cost of electricity and power usage (120W and $0.12/kWh = 1.2 * 1.2 = 1.44x adjustment)

        Now add to this the waste heat vented into your house on the months you cool your house + the depreciated costs (and wear and tear) of the computer assets you tied up processing Bitcoins. Then you'll have your true cost, and you can calculate your break-even point based on initial investment + ongoing costs - product (bitcoins) produced.
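
        (A minimal sketch of that break-even arithmetic, for anyone who wants to plug in their own numbers; every figure below is a placeholder, not data from this thread.)

          # Placeholder numbers throughout; swap in your own card price, power draw,
          # electricity rate, mining yield and exchange rate.

          CARD_COST_USD = 150.0   # up-front hardware cost (hypothetical)
          POWER_WATTS   = 120.0   # average continuous draw while mining
          RATE_PER_KWH  = 0.10    # electricity price
          COINS_PER_DAY = 0.05    # hypothetical mining yield
          USD_PER_COIN  = 30.0    # hypothetical exchange rate

          electricity_per_day = POWER_WATTS / 1000.0 * 24 * RATE_PER_KWH   # $0.288/day here
          net_per_day = COINS_PER_DAY * USD_PER_COIN - electricity_per_day

          if net_per_day > 0:
              print(f"Break even after roughly {CARD_COST_USD / net_per_day:.0f} days")
          else:
              print("Never breaks even: the electricity costs more than the coins are worth")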

        • Don't forget electrical costs.

          You'll note that I didn't. ;-)

        • Waste heat? You mean my ATI Radeon 200W Space Heater® that takes away the night chills?
          • by h4rr4r ( 612664 )

            That costs a lot more to run than a natural gas furnace.

            • Mine is a joke? People call it waste heat, I use and enjoy the waste heat so I made a joke about it.

              I doubt a natural gas furnace even has a PCIe x16 slot, much less matches even a budget Nvidia's hashing specs.

              ^See what I did there? Another joke.
          • by mikael ( 484 )

            In every house or apartment that has frosted windows in the doors or skylight windows above the doors, a single laptop screen would light up the entire floor, much to the annoyance of those who went to bed early, courtesy of those who wanted to stay up reading Slashdot.

        • by megamerican ( 1073936 ) on Friday February 22, 2013 @04:44PM (#42983937)

          Don't forget electrical costs. At $0.10 a kWh you are paying $0.24 a day (24 hours) per 100 watts of continuous average power consumption. This is $7.20 per month per 100W @ $0.10 /kWh or $87.60 a year. Adjust up/down for your cost of electricity and power usage (120W and $0.12/kWh = 1.2 * 1.2 = 1.44x adjustment)

          Believe me, I do not. With electricity costs taken into account I make around $4 per day (from 4 video cards) from Bitcoin or Litecoin on 2 gaming systems I rarely use. When I use my main gaming system it is slightly less.

          Now add to this the waste heat vented into your house on the months you cool your house

          Living in a colder climate, these costs roughly offset, though I have no hard numbers. The slightly higher electricity cost in the summer months is offset by savings in natural gas in the winter months.

          + the depreciated costs (and wear and tear) of the computer assets you tied up processing Bitcoins

          The goal is to maximize profits and not necessarily maximize the amount of bitcoins/litecoins I mine, so thanks to the power curve of most cards, it is more profitable to slightly underclock the core and/or memory clock which helps minimize wear and tear on the cards. The cards I've had since 2009 are still running and producing the same MH/s as they always have.

          Many people who still mine bitcoins with GPUs are people who don't pay for electricity, thanks to the difficulty rise from FPGAs and ASICs. That pushed out any profitability for me, but I still have profitability with Litecoin, which is a similar cryptocurrency.

          Even if there were no profits and I were just breaking even, I would still do it, because I'd like a use for my gaming machines; I rarely game anymore but still want to sit down and play every couple of weeks.

          • Considering the cards overheat above 120 C and overclocking doesn't really do anything for hashing, miners in hot climates could put the boxes in a protected area outside and suffer few ill effects.
          • If you would like a use for your gaming machines, why not BOINC? You can choose where to donate computing power, although I'm not sure how many projects work on the GPU.

        • by antdude ( 79039 )

          Or how much heat is added during the hot times. Ugh. It's nice during the winter though!

      • by amanicdroid ( 1822516 ) on Friday February 22, 2013 @04:07PM (#42983465)
        Haha, I'm at less than 1:1 electricity to bitcoin ratio after ~5 months.
        Kill-A-Watt says I've used approx $68.23 of electricity at 11.5 cents per kWh. Bitcoins currently trade at 1 to $30 and I've got 2.2 bitcoins. The Radeon 6770 was (and still is) ~$110.

        Additional factors to consider:
        -The bitcoin machine is also my daily workstation so if it were running headless and otherwise unused it would have probably done better in the electricity used category.
        -It makes a damn fine space heater and I've enjoyed it immensely this winter.
        -My focus in this project was to learn hands-on about scientific computing applications and it's been great for that.

        In conclusion: as a business it would have been a flop, partially because I haven't optimized the setup for that application. As a learning opportunity and 200 watt heater it's been phenomenal.
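
        (Plugging the figures above into a quick script, treating the $30/BTC price as fixed, which of course it isn't:)

          # The figures quoted above, as a rough sanity check (assumes a static BTC price).
          card_cost   = 110.00   # Radeon 6770
          electricity = 68.23    # Kill-A-Watt total over ~5 months at $0.115/kWh
          btc_mined   = 2.2
          usd_per_btc = 30.0

          revenue = btc_mined * usd_per_btc
          print(f"revenue ${revenue:.2f}, electricity ${electricity:.2f}, "
                f"net after card cost ${revenue - electricity - card_cost:.2f}")
          # -> revenue $66.00, electricity $68.23, net after card cost $-112.23
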
        • In conclusion: as a business it would have been a flop, partially because I haven't optimized the setup for that application. As a learning opportunity and 200 watt heater it's been phenomenal.

          Well, a learning opportunity and a 200 watt heater are fine outcomes then. :-P

        • That is the most insight I've ever gotten into the bitcoin economy. I've always passed because the hardware is worth more and can break from such use, especially in the long term. I'm still passing, but it's good to know about the electricity. As for TFA, I don't think the author realizes that there are a ton of video cards with multiple GPUs on board; it's not all CrossFire and SLI, and hasn't been for the last decade. The method they're using seems legit on the surface until you read at the bottom that they

        • by retep ( 108840 )

          Keep in mind that for Bitcoin the individuals like you running tiny little mining setups that might not be actually profitable as a fun hobby are a very good thing. Bitcoin needs mining power to be as well distributed as possible to make it difficult to co-opt, so the hundreds or maybe even thousands of individuals like you help that goal. However, it's helped best if you actually validate your blocks properly, and that means mining with P2Pool right now.

          Bitcoin is lucky that the costs to mine for a small r

          • ADDED BONUS: many already have all the equipment they need to get started. Like you said, it's a fun hobby.
      • The break-even point for GPU mining doesn't exist anymore if you have to pay for power and it's a very, very long time if you don't. Why? ASICs.
      • by pclminion ( 145572 ) on Friday February 22, 2013 @05:23PM (#42984555)

        Out of curiosity, what's your break even point?

        I don't know where the break even point is, but once you pass it, you can be very profitable. One of my friends built a custom "supercomputer" out of cheap motherboards and graphics cards for about $80k -- along with completely custom software to automatically tune clock speeds and fan rates in real time (all of which was written in bash script). At peak performance, his machine generated about $20k worth of bitcoin every month, which easily paid for the $12k monthly electric bill.

        After a couple of difficulty-doublings, and the imminent arrival of the ASIC miners, this lost its profitability, and he went back to being a DBA... The machine is still out at the farm, cranking away. I think he'll disassemble it and part it out for cash in a month or two.

        • So one of your friends is using company equipment in a server farm to mine bitcoins? Sounds very illegal
          • I don't get it. Are you assuming that anybody who spends $80k on something must be using someone else's money? You're a moron. This was his private project, which he managed to live off of for almost two years.
            • The machine is still out at the farm

              So you are telling me he had the upfront money AND a server farm that rented him space, using a completely custom, unknown type of machine that no one had a problem with? Sounds legit

              Or am I an asshole and it is on a real vegetable/animal farm?
              • by pclminion ( 145572 ) on Friday February 22, 2013 @07:05PM (#42985725)

                Dude, it's a farm. A fucking farm. 40 acres of red wheat.

                He designed the rack system himself, along with custom power supply headers that he had fabbed at a nearby plant. He even tried to reduce equipment costs by hiring a Taiwanese company to produce custom GPU cards for him for $70 a piece (they didn't work very well).

                Nobody does that shit anymore. It was like watching Steve Wozniak.

        • It makes me sad that someone could run up a $12K monthly electric bill without assigning an environmental cost to where that power was coming from.

          • by pclminion ( 145572 ) on Friday February 22, 2013 @08:02PM (#42986213)

            It makes me sad that someone could run up a $12K monthly electric bill without assigning an environmental cost to where that power was coming from.

            Making assumptions is bad.

            Before the Bitcoin operation got started, my friend's business was making biodiesel out of local rendered chicken fat and other things. He single-handedly supplied most of the farmers in a 5 mile radius with fuel for their farm operations. Prior to the biodiesel years, he ran the largest privately owned solar grid in the county, providing something like 25 kilowatts back to the grid, for a couple of years solid. He is the most environmentally obsessed person I know, and has certainly contributed far more to the local green economy than he has taken out of it.

            The ultimate plan, which did not come to fruition (because of the rising difficulty of mining bitcoin, as I stated earlier), was to completely cover the 40 acre property with an array of solar panels, each panel having a custom GPU mining module installed on the underside -- open air cooling of the machines, solar power for the bitcoins, and it would have qualified as the largest solar array in the United States.

            To think that he's some kind of forest-destroying air-blackening capitalist is about as far from the truth as you can get. Check your assumptions.

            • To think that he's some kind of forest-destroying air-blackening capitalist is about as far from the truth as you can get. Check your assumptions.

              His heart is pure, so his free money scheme has no externalities? All righty, then.

              • His heart is pure, so his free money scheme has no externalities? All righty, then.

                Is YOUR net carbon emission negative? Have you ever bothered to even measure it?

    • by Anonymous Coward

      This is only because nvidia intentionally cripples all consumer grade GPUs, artificially reducing their precision to make them near useless as GPGPU devices.
      This is simply to price gouge the high end.
      And they get away with it because they have a near monopoly on the market.
      AMD is a latecomer and for whatever reason they don't have the platform and support to break into the GPGPU market. (Nvidia really did take the torch and pioneer the field.)

      • by amanicdroid ( 1822516 ) on Friday February 22, 2013 @04:28PM (#42983735)
        This is the explanation I've been given for the disparity between Nvidia and AMD:
        https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F [bitcoin.it]

        Specifically:

        Secondly, another difference favoring Bitcoin mining on AMD GPUs instead of Nvidia's is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs (BIT_ALIGN_INT), but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add). This alone gives AMD another 1.7x performance advantage (~1900 instructions instead of ~3250 to execute the SHA-256 compression function).

        For GPU programming I've enjoyed Nvidia's CUDA package greatly over wrangling OpenCL that Radeon relies on.
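
        (A toy illustration of the rotate point from that quote, in Python rather than GPU kernel code; AMD's BIT_ALIGN_INT is essentially a funnel shift that does in one instruction what the emulated version below does with two shifts and an OR.)

          # SHA-256 leans heavily on 32-bit right-rotates. Without a native rotate it is
          # emulated with two shifts plus an OR (the "2 shifts + 1 add" quoted above).

          MASK32 = 0xFFFFFFFF

          def rotr32_emulated(x, n):
              """Right-rotate a 32-bit word using only shifts and OR."""
              return ((x >> n) | (x << (32 - n))) & MASK32

          def big_sigma0(x):
              # One of the Sigma functions from the SHA-256 compression step,
              # built entirely out of rotates.
              return rotr32_emulated(x, 2) ^ rotr32_emulated(x, 13) ^ rotr32_emulated(x, 22)

          print(hex(rotr32_emulated(0x12345678, 8)))   # 0x78123456
          print(hex(big_sigma0(0x12345678)))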

        • by tyrione ( 134248 ) on Friday February 22, 2013 @08:43PM (#42986499) Homepage

          This is the explanation I've been given for the disparity between Nvidia and AMD: https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F [bitcoin.it] Specifically:

          Secondly, another difference favoring Bitcoin mining on AMD GPUs instead of Nvidia's is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs (BIT_ALIGN_INT), but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add). This alone gives AMD another 1.7x performance advantage (~1900 instructions instead of ~3250 to execute the SHA-256 compression function).

          For GPU programming I've enjoyed Nvidia's CUDA package greatly over wrangling OpenCL that Radeon relies on.

          You're living on borrowed time with CUDA. The entire industry has already moved to OpenCL and it will only expand when all the heavy Engineering and Science vendors are fully on-board. When Ansys 14.5 already moved to OpenCL for its latest release you know such a conservative corporation is one of the last to make the transition.

          • Does OpenCL support device-to-device remote copy over InfiniBand?

            Honestly asking, because that's an absolute killer feature for HPC applications. PCIe is abhorrently, soul-crushingly slow from the GPU's perspective, and being able to RDMA without ever moving through the host's memory saves half your PCIe bandwidth use.
  • by Anonymous Coward

    I'm sure that AMD, the losing party, will dispute the results and come up with its own methodology to counter this.

    Then again, everyone knew nVidia high end cards are better, so was this new test really necessary??

    • I'm sure that AMD, the losing party, will dispute the results and come up with its own methodology to counter this.

      Then again, everyone knew nVidia high end cards are better, so was this new test really necessary??

      The point of the "new test" is that framerate is a terrible metric because it averages out what you care about.
      When you measure frame times individually you can then quantify how often a game slows down and by how much.
      You don't just have an average FPS, or a MAX/AVG/MIN.
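
      (A toy example with made-up frame times: both runs below average ~60 FPS, but only one of them is smooth, which is exactly what an average hides.)

        # Two made-up runs with the same average FPS but very different frame pacing.
        smooth  = [16.7] * 60          # steady ~16.7 ms frames
        stutter = [8.0, 25.4] * 30     # same total time, alternating short/long frames

        for name, times in (("smooth", smooth), ("stutter", stutter)):
            avg_fps = 1000.0 * len(times) / sum(times)
            print(f"{name:8s} avg {avg_fps:.1f} FPS, worst frame {max(times):.1f} ms")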

  • Regardless... (Score:5, Interesting)

    by Cinder6 ( 894572 ) on Friday February 22, 2013 @03:44PM (#42983163)

    As an owner of a Crossfire setup, it's obviously not a 2x improvement over a single card; however, it's also a marked improvement over a single card. When I first set up this rig (August), I had problems with micro-stutter.* Now, though, after AMD's newer drivers and manually limiting games to 59 FPS, I don't see it anymore; games appear smooth as silk.

    At a mathematical level, it may not be a perfect solution, but at a perceptual level, I am perfectly satisfied with my purchase. With that said, buying two mid-line cards instead of one high-end card isn't a good choice. Only buy two (or more) cards if you're going high-end.

    *I was initially very disappointed with the Radeons. That's no longer the case, but I will probably still go nVidia the next time I upgrade, which hopefully won't be for years.

    • Limiting your framerate to only 59fps on Crossfire is acceptable to you? What resolution are you pushing where 59fps doesn't defeat the purpose of having Crossfire?
      • by Cinder6 ( 894572 )

        Why yes, it's acceptable, because 59 is more than enough for smooth animations--your eyes don't notice the difference, and your monitor probably couldn't even refresh fast enough to show it, anyway. My games never drop below this rate, so it looks smooth throughout.

        • Why yes, it's acceptable, because 59 is more than enough for smooth animations--your eyes don't notice the difference ...

          proves the point that only suckers buy into SLI/CF scheme

          • by epyT-R ( 613989 )

            People who want and can see the difference between 60hz and 120 aren't suckers for their willingness to pay up, but it is true that SLI doesn't always deliver. We are far from the 3dfx days where a second card gave an automatic 100% performance boost in every application. As someone who can easily see the difference, I always shoot for a single GPU whenever possible.

            • 3dfx cards only gave 100% performance gains when your rendering pipeline was two pass... one for models, one for textures. Not every game did it this way. Quake 2 did, however
          • proves the point that only suckers buy into SLI/CF scheme

            SLI/CF decrease the chances that the frame rate will drop below an acceptable level. They're pointlessly rendering if they go beyond what you, and your monitor, can perceive.

            The only point proven is that you do not understand FPS, nor do you understand the purpose of SLI/CF.

            • They're pointlessly rendering if they go beyond what you, and your monitor, can perceive.

              The only point proven is that you do not understand FPS, nor do you understand the purpose of SLI/CF.

              I understand FPS plenty, more than the OP, because I know you can spot more than 60Hz. You need to be a really big sucker if you:
              a) believe going over 60Hz will be unnoticeable, and
              b) pay for two cards anyway.

      • by Anonymous Coward

        Limiting your framerate to only 59fps on Crossfire is acceptable to you?

        You must be one of those fucking morons who think there's a visual difference between 120fps and 60fps on a 60Hz LCD monitor.

        • by epyT-R ( 613989 )

          You must be one of those fucking morons who:
          1. doesn't realize real 120hz panels exist.
          2. doesn't realize that even a vsync-disabled 60hz display still allows for lower input latency if the graphics cards allow a higher framerate.
          3. doesn't realize that 60hz+ isn't the only reason people do multigpu. Having twice the fillrate helps in other areas too.

          • 2. doesn't realize that even a vsync-disabled 60hz display still allows for lower input latency if the graphics cards allow a higher framerate.

            No, no it doesn't, unless the input system is part of the graphics thread and running on the frame update timer, which is conspicuously not the case these days in any case of competence.

          • by Anonymous Coward

            How much better is your gaming experience with a Monster brand HDMI cable?

        • by zipn00b ( 868192 )
          There is a difference as I have to blink twice as fast at 120fps...............
      • Ever hear of Eyefinity? 5760x1200 is a lot of pixels to push.

    • 59fps? Why that number? And how does that work with your monitor, which I assume refreshes at 60 Hz?

  • by WilliamGeorge ( 816305 ) on Friday February 22, 2013 @03:45PM (#42983177)

    It started when people began to look not only at average frame rate, but at *minimum* frame rate during a benchmark run. That shows how low the FPS can dip, which was the beginning of acknowledging that something in the user-experience mattered beyond average frame rate. It has gotten a lot more advanced, as pointed out in the article here, and this sort of information is very helpful for people building or buying gaming computers. I use info like this on an almost daily basis to help my customers get the best system for their needs, and I greatly appreciate the enthusiasts and websites which continue to push the ways we do testing!

  • by gman003 ( 1693318 ) on Friday February 22, 2013 @04:02PM (#42983411)

    99th percentile frame times. That gives you a realistic minimum framerate, discarding most outliers (many games, particularly those using UE3, tend to have a few very choppy frames right at level load that don't really reflect in-game performance).
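
    (A minimal sketch of that metric, assuming you already have per-frame times in milliseconds from FRAPS, a capture rig, or whatever else; the numbers in the example are made up.)

      # Frame times in milliseconds in, 99th-percentile frame time out.
      def percentile_frame_time(frame_times_ms, pct=99):
          ordered = sorted(frame_times_ms)
          idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
          return ordered[idx]

      def report(frame_times_ms):
          avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
          p99 = percentile_frame_time(frame_times_ms)
          print(f"avg {avg_fps:.1f} FPS, 99th-percentile frame time {p99:.1f} ms "
                f"(~{1000.0 / p99:.0f} FPS floor once the worst 1% is discarded)")

      # Made-up run: mostly steady frames, a few mild dips, two level-load hitches.
      report([16.7] * 500 + [30.0] * 5 + [120.0, 150.0])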

    • UE3 uses some kind of deferred loading. Notice when you first enter the menu and whatnot that everything looks like garbage for a moment - the hitching you see at the start is because of texture uploads to your VRAM and the like.

      • Yeah, I knew that was why it happened. Many games, even most open-world games do that - they have low-res textures loaded for everything, and dynamically load and unload the higher-res ones depending on what the scene needs. Late UE2.5 and early UE3 titles seem to stick out as the ones that preload *no* high-res textures until the level actually starts. UT3, The Last Remnant, Bioshock, games like that.

        Rage is another example of one that is extremely aggressive about unloading textures - you can look at a wa

    • I'm not interested in that, but I'm fine with chopping the first and last 5% of the frames off before calculating the frame times. I want to know what the frame rate is going to be when everything is exploding and a texture is being loaded and my bus is congested etc etc.

  • Developers (Score:5, Insightful)

    by LBt1st ( 709520 ) on Friday February 22, 2013 @04:20PM (#42983637)

    This is interesting from a developer standpoint as well. It means we are wasting processing time rendering frames that are only displayed for a handful of milliseconds. These frames could be dropped entirely and that processing time could go to use elsewhere.

    • by epyT-R ( 613989 )

      I guess.. if you're targeting your game at mouthbreathing harelips.. might as well just produce your shovelware for the consoles then and not worry about multigpu PC at all.

    • I understand that http://www.lucidlogix.com/product-virtu-mvp.shtml [lucidlogix.com] does this selective dropping of frames.
    • But isn't that par for the course? I mean, whenever the frame rate exceeds the refresh rate of the monitor, you're using resources to render literally invisible frames. Yet it's my impression that games/drivers will still render those frames. Isn't that right? I would appreciate games that saved me energy, or used those resources to make better AI decisions, etc.
    • Re:Developers (Score:4, Informative)

      by mikael ( 484 ) on Friday February 22, 2013 @06:25PM (#42985301)

      Developers still like to have everything on a "main loop" (render static scenery, get user move, render player, get network player moves, render network players, render auxiliary data). Other stuff will be spinning and bobbing up and down on its own based on timers. Some frames might never be rendered, but they help keep the "tempo" or smoothness of the animation. Each PC monitor can have a different refresh rate, anything from 50Hz to 120/240Hz, so every rendered frame is only going to be visible for several milliseconds (50Hz = 20 milliseconds, 100Hz = 10 milliseconds). If a frame is rendered, it will be perceived even if not consciously.

      Early home computers allowed the program to synchronize animation updates to the VBI (Vertical Blank Interrupt) and HBI (Horizontal Blank Interrupt). That way, you could do smooth jitter-free physics synchronised to the frame flipping.

      16-bit console system programmers would render out lines across the current scan-line to see how much processing they could do in each frame. While the tiles were updated during the VBI, the physics could be updated during the CRT scanning.

      These days, I would guess you would need either a vertical blank callback for the CPU or shader for the GPU.

    • Turn on Vsync and then you won't render more frames than the monitor can display. If you want to go a step further, fix your engine so that everything isn't stuck on the main loop waiting for a frame to be rendered, like many developers still do many years after the proliferation of multi-core CPUs.

      • It's generally desirable to have the AI and physics run at a fixed time step because it allows you to reproduce results exactly. That way you can record gameplay by just recording the inputs. So usually you will have a 'simulation' thread running at a fixed tick rate and a 'render' thread producing frames as fast as possible. I agree about the Vsync; there is no point whatsoever in making frames faster than the display can show them.

        And in fact that's the problem with this frame-time benchmarking, if th
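
        (A minimal sketch of the fixed-tick-plus-free-running-render pattern described above; update(), render() and the vsync-style sleep are schematic stand-ins, not any particular engine's API.)

          # Schematic only: the sleep stands in for waiting on vsync.
          import time

          TICK = 1.0 / 60.0              # fixed simulation timestep (seconds)

          def update(state, dt):         # deterministic game logic, always the same dt
              state["t"] += dt
              return state

          def render(state, alpha):      # draw, interpolating between the last two ticks
              print(f"render at t={state['t']:.3f} (alpha={alpha:.2f})")

          def main_loop(duration=0.1):
              state = {"t": 0.0}
              accumulator = 0.0
              previous = time.monotonic()
              deadline = previous + duration
              while time.monotonic() < deadline:
                  now = time.monotonic()
                  accumulator += now - previous
                  previous = now
                  while accumulator >= TICK:   # run as many fixed ticks as we owe
                      state = update(state, TICK)
                      accumulator -= TICK
                  render(state, accumulator / TICK)
                  time.sleep(TICK)             # crude stand-in for waiting on vsync

          main_loop()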

  • by Anonymous Coward

    As a long-time GTX 295 owner, I've known for quite a while that my eyes are really good at seeing stuttering. For a few years, my GTX 295 did a splendid job keeping up with games, and as long as I could manage 60 FPS everything seemed pretty smooth. I did have a few moments where I did see micro-stuttering, but I found that either enabling V-sync or enabling frame limiting solved the problem. As you can see in this diagram http://www.pcper.com/files/review/2013-02-22/fr-3.png it's very possible that you

  • I discovered this about a year ago, when i wanted to add a 3rd monitor to my system, and discovered I couldn't do it in Crossfire mode with my dual 4850s, but COULD do it if i turned it off. Productivity being slightly more important to me than game performance, I turned it off and hooked up my 3rd monitor.

    A few days later I decided to fire up Skyrim, and didn't notice any discernible drop in performance at all. My settings were all on medium, just because the cards were a few years old, but still, I expect

    • Your specific series of GPU didn't have a Crossfire profile for a specific game, resulting in no performance increase, and this means that both Crossfire and SLI are worthless? Granted, they aren't for everyone, but for higher resolutions at high settings they border on mandatory and, at least in the case of SLI, make a massive performance difference.
      • In the case of Skyrim, the game didn't work AT ALL with Crossfire on my system when it was first released. Had to disable Crossfire at first, then ATI released a profile that finally got it working, but it wasn't a performance boost, it was a bug fix.

        I'm not saying Crossfire/SLI didn't give any performance boosts, what I'm saying is that for most models of graphics cards, it is a marketing gimmick to get you to pay twice for something that might average out to a 10% performance boost across all your games.
