New GPU Testing Methodology Puts Multi-GPU Solutions In Question
Vigile writes "A big shift in the way graphics cards and gaming performance are tested has been occurring over the last few months, with many review sites now using frame times rather than just average frame rates to compare products. Another testing methodology, called Frame Rating, has been introduced by PC Perspective: it uses video capture equipment capable of recording uncompressed, high-resolution output directly from the graphics card, a colored-bar overlay system, and post-processing on that recorded video to evaluate performance as it is seen by the end user. The benefit is that there is literally no software interference between the data points and what the user sees, making it as close to an 'experience metric' as any yet developed. Interestingly, multi-GPU solutions like SLI and CrossFire have very different results when viewed in this light, with AMD's offering clearly presenting poorer, more stuttery animation."
Marketing claims exaggerated? (Score:2, Redundant)
That unpossible. :-P
Re: (Score:2)
Re: (Score:2)
Hmm, I've not actually used an AMD/ATI video card in 10 years, and the last one I had was an oddball PCI one in an oddball laptop (Fujitsu P2110, if anyone cares). I did finally get it to work, but it took most of a year and around 4 distros. That said, I'm on my 4th or 6th Nvidia card, and installing the Nvidia drivers and running "xorg --configure" was the most I have had to do (well, back in the bad old days I had to swap the driver line in xorg.conf). These days everything seems to just work, and there are n
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Mind..... you, the o......only thing w..w....w....w...orse thanananananan stutttttttttttttttttttttttery animation is jerkyerkyerky so........und.
FTFY
Re: (Score:2)
SHODAN? Is that you?
Re: (Score:1)
Re: (Score:1)
You use GPUs for video games? (Score:5, Funny)
Re:You use GPUs for video games? (Score:4, Interesting)
Out of curiosity, what's your break even point?
If you went out now, and bought one of these video cards solely for this ... how long would it take to recoup the cost of the card? Or is this something you'd run for a long time, and get two bucks out of, but still have had to pay for your electricity?
I hear people talking about this, but since I don't follow BitCoin closely enough, I have no idea if it's lucrative, or just geeky.
Re:You use GPUs for video games? (Score:4, Funny)
People that "mine" bitcoins don't pay for their own electricity. Most people don't have the basement circuits metered separately from the rest of the house.
Re:You use GPUs for video games? (Score:5, Insightful)
Don't forget electrical costs. At $0.10 a kWh you are paying $0.24 a day (24 hours) per 100 watts of continuous average power consumption. This is $7.20 per month per 100W @ $0.10 /kWh or $87.60 a year. Adjust up/down for your cost of electricity and power usage (120W and $0.12/kWh = 1.2 * 1.2 = 1.44x adjustment)
Now add to this the waste heat vented into your house on the months you cool your house + the depreciated costs (and wear and tear) of the computer assets you tied up processing Bitcoins, then you'll have your true cost and you can calculate your break even point based on initial investment + ongoing costs - product (bitcoins) produced.
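A rough sketch of that arithmetic in Python (the card price, daily coin revenue, wattage, and electricity rate below are made-up placeholders, not anyone's real numbers):

    # Break-even sketch for a mining card. Revenue per day is whatever your
    # miner actually earns; it is not derived from difficulty here.
    def power_cost_per_day(watts, usd_per_kwh):
        # Continuous draw: watts -> kWh per day -> dollars per day.
        return watts / 1000.0 * 24 * usd_per_kwh

    def break_even_days(hardware_cost, revenue_per_day, watts, usd_per_kwh):
        # Days until the card pays for itself, or None if it never does.
        net_per_day = revenue_per_day - power_cost_per_day(watts, usd_per_kwh)
        return hardware_cost / net_per_day if net_per_day > 0 else None

    print(power_cost_per_day(100, 0.10))            # 0.24/day, i.e. the parent's $7.20/month
    print(break_even_days(110.0, 1.50, 120, 0.12))  # ~95 days for this hypothetical card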
Re: (Score:2)
You'll note that I didn't. ;-)
Re: (Score:3)
Re: (Score:2)
That costs a lot more to run than a natural gas furnace.
Re: (Score:3)
I doubt a natural gas furnace even has a PCIe x16 slot, much less matches even a budget Nvidia card's hashing specs.
^See what I did there? Another joke.
Re: (Score:2)
In every house or apartment that has frosted windows in the doors or skylight windows above the doors, a single laptop screen would light up the entire floor - much to the annoyance of those who went to bed early to sleep vs. those who wanted to read slashdot.
Re:You use GPUs for video games? (Score:4, Interesting)
Don't forget electrical costs. At $0.10 a kWh you are paying $0.24 a day (24 hours) per 100 watts of continuous average power consumption. This is $7.20 per month per 100W @ $0.10 /kWh or $87.60 a year. Adjust up/down for your cost of electricity and power usage (120W and $0.12/kWh = 1.2 * 1.2 = 1.44x adjustment)
Believe me, I do not. With electricity costs taken into account I make around $4 per day (from 4 video cards) from Bitcoin or Litecoin on 2 gaming systems I rarely use. When I use my main gaming system it is slightly less.
Now add to this the waste heat vented into your house on the months you cool your house
Living in a colder climate these costs offset, however I have no hard numbers. The slightly higher electricity cost in the summer months are offset from a savings in natural gas cost in the winter months.
+ the depreciated costs (and wear and tear) of the computer assets you tied up processing Bitcoins
The goal is to maximize profits and not necessarily maximize the amount of bitcoins/litecoins I mine, so thanks to the power curve of most cards, it is more profitable to slightly underclock the core and/or memory clock which helps minimize wear and tear on the cards. The cards I've had since 2009 are still running and producing the same MH/s as they always have.
Thanks to the difficulty increase brought on by FPGAs and ASICs, most people who still mine bitcoins with GPUs are people who don't pay for their electricity. This pushed me out of profitability, but Litecoin, a similar cryptocurrency, is still profitable for me.
Even if there were no profits and I was just breaking even I would still do it because I would like a use for my gaming machines since I rarely game anymore but still want to sit down and play every couple of weeks.
Re: (Score:2)
Re: (Score:2)
If you would like a use for your gaming machines, why not BOINC? You can choose where to donate computing power, although I'm not sure how many projects run on the GPU.
Re: (Score:2)
Or how much heat is added during the hot times. Ugh. It's nice during the winter though!
Re:You use GPUs for video games? (Score:5, Interesting)
Kill-A-Watt says I've used approximately $68.23 of electricity at 11.5 cents per kWh. Bitcoins currently trade at about $30 each, and I've got 2.2 bitcoins. The Radeon 6770 was (and still is) ~$110.
Additional factors to consider:
-The bitcoin machine is also my daily workstation so if it were running headless and otherwise unused it would have probably done better in the electricity used category.
-It makes a damn fine space heater and I've enjoyed it immensely this winter.
-My focus in this project was to learn hands-on about scientific computing applications and it's been great for that.
In conclusion: as a business it would have been a flop, partially because I haven't optimized the setup for that application. As a learning opportunity and 200 watt heater it's been phenomenal.
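For what it's worth, plugging the parent's own figures into that same arithmetic (Python; this treats the card as a pure expense and ignores its resale value):

    electricity = 68.23        # Kill-A-Watt estimate at $0.115/kWh
    coins = 2.2 * 30.0         # 2.2 BTC at roughly $30 each
    card = 110.0               # Radeon 6770 purchase price
    print(coins - electricity - card)   # about -112: a loss, before counting the heat and the learning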
Re: (Score:2)
Well then, a learning opportunity and a 200 watt heater are fine outcomes then. :-P
Re: (Score:2)
That is the most insight I've ever gotten into the bitcoin economy. I've always passed because the hardware is worth more and can break from such use, especially in the long term. I'm still passing, but it's good to know about the electricity. As for TFA, I don't think the author realizes that there are a ton of video cards with multiple GPUs on board; it's not all CrossFire and SLI, and hasn't been for the last decade. The method they're using seems legit on the surface until you read at the bottom that they
Re: (Score:2)
Re: (Score:2)
Keep in mind that for Bitcoin the individuals like you running tiny little mining setups that might not be actually profitable as a fun hobby are a very good thing. Bitcoin needs mining power to be as well distributed as possible to make it difficult to co-opt, so the hundreds or maybe even thousands of individuals like you help that goal. However, it's helped best if you actually validate your blocks properly, and that means mining with P2Pool right now.
Bitcoin is lucky that the costs to mine for a small r
Re: (Score:2)
Re: (Score:2)
Re:You use GPUs for video games? (Score:4, Insightful)
Re:You use GPUs for video games? (Score:5, Informative)
Out of curiosity, what's your break even point?
I don't know where the break even point is, but once you pass it, you can be very profitable. One of my friends built a custom "supercomputer" out of cheap motherboards and graphics cards for about $80k -- along with completely custom software to automatically tune clock speeds and fan rates in real time (all of which was written in bash script). At peak performance, his machine generated about $20k worth of bitcoin every month, which easily paid for the $12k monthly electric bill.
After a couple of difficulty-doublings, and the imminent arrival of the ASIC miners, this lost its profitability, and he went back to being a DBA... The machine is still out at the farm, cranking away. I think he'll disassemble it and part it out for cash in a month or two.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
So you are telling me he had the upfront money AND a server farm that rented him space, using a completely custom, unknown type of machine that no one had a problem with? Sounds legit
Or am I an asshole and it is on a real vegetable/animal farm?
Re:You use GPUs for video games? (Score:4, Informative)
Dude, it's a farm. A fucking farm. 40 acres of red wheat.
He designed the rack system himself, along with custom power supply headers that he had fabbed at a nearby plant. He even tried to reduce equipment costs by hiring a Taiwanese company to produce custom GPU cards for him for $70 a piece (they didn't work very well).
Nobody does that shit anymore. It was like watching Steve Wozniak.
Re: (Score:2)
It makes me sad that someone could run up a $12K monthly electric bill without assigning an environmental cost to where that power was coming from.
Re:You use GPUs for video games? (Score:4, Informative)
It makes me sad that someone could run up a $12K monthly electric bill without assigning an environmental cost to where that power was coming from.
Making assumptions is bad.
Before the Bitcoin operation got started, my friend's business was making biodiesel out of local rendered chicken fat and other things. He single-handedly supplied most of the farmers in a 5 mile radius with fuel for their farm operations. Prior to the biodiesel years, he ran the largest privately owned solar grid in the county, providing something like 25 kilowatts back to the grid, for a couple of years solid. He is the most environmentally obsessed person I know, and has certainly contributed far more to the local green economy than he has taken out of it.
The ultimate plan, which did not come to fruition (because of the rising difficulty of mining bitcoin, as I stated earlier), was to completely cover the 40 acre property with an array of solar panels, each panel having a custom GPU mining module installed on the underside -- open air cooling of the machines, solar power for the bitcoins, and it would have qualified as the largest solar array in the United States.
To think that he's some kind of forest-destroying, air-blackening capitalist is about as far from the truth as you can get. Check your assumptions.
Re: (Score:2)
His heart is pure, so his free money scheme has no externalities? All righty, then.
Re: (Score:2)
His heart is pure, so his free money scheme has no externalities? All righty, then.
Is YOUR net carbon emission negative? Have you ever bothered to even measure it?
Re: (Score:1)
This is only because Nvidia intentionally cripples all consumer-grade GPUs, artificially reducing their precision to make them near useless as GPGPU devices.
This is simply to price-gouge the high end.
And they get away with it because they have a near monopoly on the market.
AMD is a latecomer, and for whatever reason they don't have the platform and support to break into the GPGPU market. (Nvidia really did take the torch and pioneer the field.)
Re:You use GPUs for video games? (Score:5, Informative)
https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F [bitcoin.it]
Specifically:
For GPU programming I've enjoyed Nvidia's CUDA package greatly over wrangling OpenCL that Radeon relies on.
Re:You use GPUs for video games? (Score:4, Insightful)
This is the explanation I've been given for the disparity between Nvidia and AMD: https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F [bitcoin.it] Specifically:
For GPU programming I've enjoyed Nvidia's CUDA package greatly over wrangling OpenCL that Radeon relies on.
You're living on borrowed time with CUDA. The entire industry has already moved to OpenCL, and adoption will only grow as the heavy engineering and science vendors come fully on board. When a corporation as conservative as Ansys moves to OpenCL for its 14.5 release, you know one of the last holdouts has made the transition.
Re: (Score:2)
Honestly asking, because that's an absolute killer feature for HPC applications. PCIe is abhorrently, soul-crushingly slow from the GPU's perspective, and being able to RDMA without ever moving through the host's memory saves half your PCIe bandwidth.
Re: (Score:2)
Ansys isn't really designed for your football-field-sized HPC supercomputer, but for high-end workstations or a small in-house farm. Going OpenCL must have been seen as a good idea: AMD still makes FireGL cards, and using OpenCL lets Ansys run everywhere.
AMD will dispute (Score:1)
I'm sure that AMD, the losing party, will dispute the results and come up with its own methodology to counter this.
Then again, everyone knew nVidia's high-end cards are better, so was this new test really necessary?
Re: (Score:2)
I'm sure that AMD, the losing party, will dispute the results and come up with its own methodology to counter this.
Then again, everyone knew nVidia's high-end cards are better, so was this new test really necessary?
The point of the "new test" is that framerate is a terrible metric because it averages out what you care about.
When you measure frame times individually you can then quantify how often a game slows down and by how much.
You don't just have an average FPS, or a MAX/AVG/MIN.
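A toy illustration in Python (both frame-time traces are invented): the two runs report the same ~60 FPS average, but only the per-frame numbers reveal that one of them stutters.

    # Frame times in milliseconds; both lists average ~16.7 ms (about 60 FPS).
    smooth = [16.7] * 60
    stutter = [12.0] * 50 + [40.0] * 10   # same average, but ten 40 ms hitches

    def summarize(times_ms):
        avg_fps = 1000.0 / (sum(times_ms) / len(times_ms))
        worst_ms = max(times_ms)
        slow_frames = sum(1 for t in times_ms if t > 25.0)  # slower than ~40 FPS
        return avg_fps, worst_ms, slow_frames

    print(summarize(smooth))   # (~60 FPS, 16.7 ms worst, 0 slow frames)
    print(summarize(stutter))  # (~60 FPS, 40.0 ms worst, 10 slow frames)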
Regardless... (Score:5, Interesting)
As an owner of a Crossfire setup, it's obviously not a 2x improvement over a single card, but it is still a marked improvement. When I first set up this rig (August), I had problems with micro-stutter.* Now, though, after AMD's newer drivers and manually limiting games to 59 FPS, I don't see it anymore; games appear smooth as silk.
At a mathematical level, it may not be a perfect solution, but at a perceptual level, I am perfectly satisfied with my purchase. With that said, buying two mid-line cards instead of one high-end card isn't a good choice. Only buy two (or more) cards if you're going high-end.
*I was initially very disappointed with the Radeons. That's no longer the case, but I will probably still go nVidia the next time I upgrade, which hopefully won't be for years.
Re: (Score:2)
Re: (Score:2)
Why yes, it's acceptable, because 59 is more than enough for smooth animations--your eyes don't notice the difference, and your monitor probably couldn't even refresh fast enough to show it, anyway. My games never drop below this rate, so it looks smooth throughout.
Re: (Score:2)
Why yes, it's acceptable, because 59 is more than enough for smooth animations--your eyes don't notice the difference ...
proves the point that only suckers buy into SLI/CF scheme
Re: (Score:2)
People who want and can see the difference between 60hz and 120 aren't suckers for their willingness to pay up, but it is true that SLI doesn't always deliver. We are far from the 3dfx days where a second card gave an automatic 100% performance boost in every application. As someone who can easily see the difference, I always shoot for a single GPU whenever possible.
Re: (Score:2)
Re: (Score:2)
SLI/CF decrease the chances that the frame rate will drop below an acceptable level. They're pointlessly rendering if they go beyond what you, and your monitor, can perceive.
The only point proven is that you do not understand FPS, nor do you understand the purpose of SLI/CF.
Re: (Score:2)
They're pointlessly rendering if they go beyond what you, and your monitor, can perceive.
The only point proven is that you do not understand FPS, nor do you understand the purpose of SLI/CF.
I understand FPS plenty, and more than the OP, because I know you can spot more than 60 Hz. You need to be a really big sucker if you:
a) believe going over 60 Hz will be unnoticeable
b) pay for two cards
Re: (Score:1)
Limiting your framerate to only 59fps on Crossfire is acceptable to you?
You must be one of those fucking morons who think there's a visual difference between 120fps and 60fps on a 60Hz LCD monitor.
Re: (Score:3)
You must be one of those fucking morons who:
1. doesn't realize real 120hz panels exist.
2. doesn't realize that even a 60 Hz display with vsync disabled still allows for lower input response latency if the graphics cards deliver a higher framerate.
3. doesn't realize that 60 Hz+ isn't the only reason people go multi-GPU. Having twice the fillrate helps in other areas too.
Re: (Score:1)
It has everything to do with input lag. If I move my mouse just under half-a-frame (monitor frame) before the monitor refreshes, then the monitor gives me a frame it started rendering just OVER half-frame before the refresh, I get to wait another full frame before my mouse movement is reflected in-game. If the GPU produces frames significa
Re: (Score:2)
2. doesn't realize that even a 60 Hz display with vsync disabled still allows for lower input response latency if the graphics cards deliver a higher framerate.
No, no it doesn't, unless the input system is part of the graphics thread and running on the frame update timer, which is conspicuously not the case these days in any competently written engine.
Re: (Score:1)
How much better is your gaming experience with a Monster brand HDMI cable?
Re: (Score:1)
Re: (Score:3)
Ever hear of Eyefinity? 5760x1200 is a lot of pixels to push.
Re: (Score:2)
59fps? Why that number? And how does that work with your monitor, which I assume refreshes at 60 Hz?
This trend has been going on a little longer (Score:4, Interesting)
It started when people began to look not only at average frame rate, but at *minimum* frame rate during a benchmark run. That shows how low the FPS can dip, which was the beginning of acknowledging that something in the user-experience mattered beyond average frame rate. It has gotten a lot more advanced, as pointed out in the article here, and this sort of information is very helpful for people building or buying gaming computers. I use info like this on an almost daily basis to help my customers get the best system for their needs, and I greatly appreciate the enthusiasts and websites which continue to push the ways we do testing!
Re: (Score:2)
Re: (Score:3)
The new methodology shows that nVidia SLI gets excellent results. However, I've owned SLI solutions, both two 285s and a 290. Neither one was satisfactory. The higher failure rate was of particular note: one of the 285s failed, I RMA'd it, got a 290, and eventually that failed too... On top of that, even when things were working, there were lots of problems. Some games didn't run correctly. Crysis, for example, didn't seem to like SLI; it would get into a bizarre state where the screen would begin flickering rapidly
Another test I'm seeing more of (Score:5, Interesting)
99th percentile frame times. That gives you a realistic minimum framerate, discarding most outliers (many games, particularly those using UE3, tend to have a few very choppy frames right on level load, which don't really affect performance).
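A sketch of how such a metric might be computed (Python; the trace and the choice to skip the first few level-load frames are illustrative, not any review site's actual pipeline):

    def percentile_frame_time(frame_times_ms, pct=99.0, skip_first=10):
        # Sort the frame times after dropping the level-load frames, then take
        # the value below which pct percent of frames fall.
        samples = sorted(frame_times_ms[skip_first:])
        idx = min(len(samples) - 1, int(len(samples) * pct / 100.0))
        return samples[idx]

    # A trace where only the load-in frames are choppy still reports ~17 ms,
    # i.e. a "realistic minimum" of roughly 59 FPS.
    trace = [80.0] * 5 + [17.0] * 995
    print(percentile_frame_time(trace))           # 17.0 ms
    print(1000.0 / percentile_frame_time(trace))  # ~58.8 FPS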
Re: (Score:3)
UE3 uses some kind of deferred loading. Notice when you first enter the menu and whatnot that everything looks like garbage for a moment - the hitching you see at the start is because of texture uploads to your VRAM and the like.
Re: (Score:2)
Yeah, I knew that was why it happened. Many games, even most open-world games do that - they have low-res textures loaded for everything, and dynamically load and unload the higher-res ones depending on what the scene needs. Late UE2.5 and early UE3 titles seem to stick out as the ones that preload *no* high-res textures until the level actually starts. UT3, The Last Remnant, Bioshock, games like that.
Rage is another example of one that is extremely aggressive about unloading textures - you can look at a wa
Re: (Score:2)
I'm not interested in that, but I'm fine with chopping the first and last 5% of the frames off before calculating the frame times. I want to know what the frame rate is going to be when everything is exploding and a texture is being loaded and my bus is congested etc etc.
Developers (Score:5, Insightful)
This is interesting from a developer standpoint as well. It means we are wasting processing time rendering frames that are only displayed for a handful of milliseconds. Those frames could be dropped entirely, and that processing time could be put to use elsewhere.
Re: (Score:2)
I guess.. if you're targeting your game at mouthbreathing harelips.. might as well just produce your shovelware for the consoles then and not worry about multigpu PC at all.
Re: (Score:1)
Re: (Score:2)
Re:Developers (Score:4, Informative)
Developers still like to have everything on a "main loop" (render static scenery, get user move, render player, get network player moves, render network players, render auxiliary data). Other stuff will be spinning and bobbing up and down on its own based on timers. Some frames might never be displayed, but they help keep the "tempo," or smoothness, of the animation. Each PC screen can have a different refresh rate, anything from 50 Hz to 120/240 Hz, so every rendered frame is only going to be visible for several milliseconds (50 Hz = 20 milliseconds, 100 Hz = 10 milliseconds). If a frame is displayed, it will be perceived, even if not consciously.
Early home computers allowed the program to synchronize animation updates to the VBI (Vertical Blank Interrupt) and HBI (Horizontal Blank Interrupt). That way, you could do smooth jitter-free physics synchronised to the frame flipping.
16-bit console system programmers would render out lines across the current scan-line to see how much processing they could do in each frame. While the tiles were updated during the VBI, the physics could be updated during the CRT scanning.
These days, I would guess you would need either a vertical blank callback for the CPU or shader for the GPU.
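A crude modern stand-in for that VBI-synchronised pacing (Python sketch; update, render, and swap_buffers are hypothetical callables, and a real engine would block on the driver's vsync rather than sleep):

    import time

    REFRESH_HZ = 60.0
    FRAME = 1.0 / REFRESH_HZ

    def run(update, render, swap_buffers):
        deadline = time.perf_counter()
        while True:
            update(FRAME)             # advance animation by one display period
            render()
            deadline += FRAME
            wait = deadline - time.perf_counter()
            if wait > 0:
                time.sleep(wait)      # crude stand-in for waiting on the vertical blank
            swap_buffers()            # present right around the refresh boundary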
Re: (Score:2)
Turn on Vsync and then you won't render more frames than the monitor can display. If you want to go a step further, fix your engine so that everything isn't stuck on the main loop waiting for a frame to be rendered, as many developers' engines still are many years after the proliferation of multi-core CPUs.
Re: (Score:2)
It's generally desirable to have the AI and physics run at a fixed time step because it allows you to reproduce results exactly; that way you can record gameplay by just recording the inputs. So usually you will have a 'simulation' thread running at a fixed tick rate and a 'render' thread producing frames as fast as possible. I agree about the Vsync: there is no point whatsoever in making frames faster than the display can show them.
And in fact that's the problem with this frame-time benchmarking, if th
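A minimal sketch of the fixed-tick split described above (Python; read_input, simulate_tick, and render are hypothetical placeholders): the simulation advances in constant steps, so a recorded input stream replays identically, while rendering runs as often as the display allows.

    import time

    TICK = 1.0 / 60.0   # fixed simulation step: deterministic, hence replayable

    def game_loop(read_input, simulate_tick, render):
        accumulator = 0.0
        previous = time.perf_counter()
        while True:
            now = time.perf_counter()
            accumulator += now - previous
            previous = now
            while accumulator >= TICK:            # catch the simulation up in fixed steps
                simulate_tick(read_input(), TICK)
                accumulator -= TICK
            render(accumulator / TICK)            # draw, interpolating into the next tick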
Re: (Score:2)
If I'm buying a GPU for a video game, I only care about how it benchmarks on video games.
Uh... No, you don't. You care how it actually PLAYS the games. Benchmarks are just an indication used for purchasing the GPU, and this article talks about a better way of measuring gaming performance to make better decisions.
Multi-GPU works somewhat... (Score:1)
As a long-time GTX 295 owner, I've known for quite a while that my eyes are really good at seeing stuttering. For a few years, my GTX 295 did a splendid job keeping up with games, and as long as I could manage 60 FPS everything seemed pretty smooth. I did have a few moments where I did see micro-stuttering, but I found that either enabling V-sync or enabling frame limiting solved the problem. As you can see in this diagram http://www.pcper.com/files/review/2013-02-22/fr-3.png it's very possible that you
Crossfire/SLI are worthless (Score:2)
I discovered this about a year ago, when I wanted to add a 3rd monitor to my system and found I couldn't do it in Crossfire mode with my dual 4850s, but COULD do it if I turned it off. Productivity being slightly more important to me than game performance, I turned it off and hooked up my 3rd monitor.
A few days later I decided to fire up Skyrim, and didn't notice any discernible drop in performance at all. My settings were all on medium, just because the cards were a few years old, but still, I expect
Re: (Score:1)
Re: (Score:2)
In the case of Skyrim, the game didn't work AT ALL with Crossfire on my system when it was first released. Had to disable Crossfire at first, then ATI released a profile that finally got it working, but it wasn't a performance boost, it was a bug fix.
I'm not saying Crossfire/SLI didn't give any performance boosts, what I'm saying is that for most models of graphics cards, it is a marketing gimmick to get you to pay twice for something that might average out to a 10% performance boost across all your games.
Re: (Score:2)
> Besides, 60 fps, which is the refresh of the "image" is more then the human eye can distinguish
No it is not. A few of us gamers can tell the difference between 60 Hz and 120 Hz.
The "practical" limit is anywhere from 72 Hz to 96 Hz. Sadly, I'm not aware of anyone who has actually done a study on what the practical limit is.
Re: (Score:2)
> ability to detect with consistent frame rates, and ability to detect a stutter
This a thousand times!
Lag is OK as long as it is consistent.
The nice thing about targeting a _minimum_ of 120+ Hz is that you can get proper 3D at 60 Hz per eye "for free." :-)