AMD Graphics Hardware

New GPU Testing Methodology Puts Multi-GPU Solutions In Question

Vigile writes "A big shift in the way graphics cards and gaming performance are tested has been occurring over the last few months, with many review sites now using frame times rather than just average frame rates to compare products. Another unique testing methodology, called Frame Rating, has been developed by PC Perspective: it uses video capture equipment capable of recording uncompressed high-resolution output direct from the graphics card, a colored bar overlay system, and post-processing on that recorded video to evaluate performance as it is seen by the end user. The benefit is that there is literally no software interference between the data points and what the user sees, making it as close to an 'experience metric' as any yet developed. Interestingly, multi-GPU solutions like SLI and CrossFire show very different results when viewed in this light, with AMD's offering clearly presenting poorer, more stuttery animation."
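To illustrate the distinction the summary is drawing, here is a minimal Python sketch (the frame times are invented for illustration; this is not PC Perspective's tooling) of how two cards reporting identical average FPS can behave very differently frame to frame:

    # Two hypothetical captures, both averaging 25 ms/frame (40 FPS);
    # the second alternates fast and slow frames (classic micro-stutter).
    smooth = [25.0] * 8                 # frame times in milliseconds
    stutter = [10.0, 40.0] * 4

    for name, times in (("smooth", smooth), ("stutter", stutter)):
        avg_fps = 1000.0 / (sum(times) / len(times))
        print(f"{name}: {avg_fps:.0f} FPS average, worst frame {max(times):.0f} ms")

    # Both print a 40 FPS average, but the second spends half its time
    # on 40 ms frames, which is exactly what average FPS hides.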
  • by gstoddart ( 321705 ) on Friday February 22, 2013 @03:41PM (#42983095)

    > My AMD is cranking out Bitcoin hashes 15 times faster than an equivalently priced Nvidia so I'm okay with the results of this article.

    Out of curiosity, what's your break-even point?

    If you went out now, and bought one of these video cards solely for this ... how long would it take to recoup the cost of the card? Or is this something you'd run for a long time, and get two bucks out of, but still have had to pay for your electricity?

    I hear people talking about this, but since I don't follow Bitcoin closely enough, I have no idea if it's lucrative or just geeky.

  • Regardless... (Score:5, Interesting)

    by Cinder6 ( 894572 ) on Friday February 22, 2013 @03:44PM (#42983163)

    As an owner of a CrossFire setup, it's obviously not a 2x improvement over a single card; it is, however, a marked improvement. When I first set up this rig (August), I had problems with micro-stutter.* Now, though, after AMD's newer drivers and manually limiting games to 59 FPS, I don't see it anymore; games appear smooth as silk.

    At a mathematical level it may not be a perfect solution, but at a perceptual level I am perfectly satisfied with my purchase. That said, buying two mid-range cards instead of one high-end card isn't a good choice. Only buy two (or more) cards if you're going high-end.

    *I was initially very disappointed with the Radeons. That's no longer the case, but I will probably still go nVidia the next time I upgrade, which hopefully won't be for years.

  • by WilliamGeorge ( 816305 ) on Friday February 22, 2013 @03:45PM (#42983177)

    It started when people began to look not only at average frame rate, but at *minimum* frame rate during a benchmark run. That shows how low the FPS can dip, which was the beginning of acknowledging that something in the user experience mattered beyond average frame rate. It has gotten a lot more advanced, as pointed out in the article here, and this sort of information is very helpful for people building or buying gaming computers. I use info like this on an almost daily basis to help my customers get the best system for their needs, and I greatly appreciate the enthusiasts and websites that continue to push the ways we do testing!

  • by gman003 ( 1693318 ) on Friday February 22, 2013 @04:02PM (#42983411)

    99th percentile frame times. That gives you a realistic minimum framerate, discarding most outliers (many games, particularly those using UE3, tend to have a few very choppy frames right at level load that don't really reflect overall performance).
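    As a rough sketch of that statistic (the frame times below are invented; a capture tool would supply real ones), a nearest-rank 99th percentile trims exactly those load-spike outliers:

        # 1000 hypothetical frame times: mostly smooth, some sustained
        # stutter, plus two huge level-load spikes at the end.
        frame_times_ms = [16.7] * 980 + [33.3] * 18 + [250.0, 180.0]

        def percentile(values, pct):
            """Nearest-rank percentile of a list of numbers."""
            ordered = sorted(values)
            rank = max(1, round(pct / 100.0 * len(ordered)))
            return ordered[rank - 1]

        p99 = percentile(frame_times_ms, 99)
        print(f"99th percentile frame time: {p99} ms "
              f"(~{1000.0 / p99:.0f} FPS realistic minimum)")
        # The two load spikes land in the top 1% and are ignored;
        # the sustained 33.3 ms frames set the reported figure.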

  • by amanicdroid ( 1822516 ) on Friday February 22, 2013 @04:07PM (#42983465)
    Haha, I'm at less than a 1:1 bitcoin-to-electricity ratio after ~5 months.
    Kill-A-Watt says I've used approximately $68.23 of electricity at 11.5 cents per kWh. Bitcoins currently trade at about $30 each and I've got 2.2 bitcoins. The Radeon 6770 was (and still is) ~$110.

    Additional factors to consider:
    -The bitcoin machine is also my daily workstation so if it were running headless and otherwise unused it would have probably done better in the electricity used category.
    -It makes a damn fine space heater and I've enjoyed it immensely this winter.
    -My focus in this project was to learn hands-on about scientific computing applications and it's been great for that.

    In conclusion: as a business it would have been a flop, partially because I haven't optimized the setup for that application. As a learning opportunity and 200 watt heater it's been phenomenal.
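    Plugging the figures above into a quick script confirms the "less than 1:1" claim (all prices and costs are the poster's numbers, not current ones):

        # Back-of-the-envelope mining economics from the post above.
        electricity_cost = 68.23   # USD, per the Kill-A-Watt reading
        btc_mined = 2.2
        btc_price = 30.0           # USD per bitcoin at the time
        card_cost = 110.0          # Radeon 6770

        revenue = btc_mined * btc_price
        print(f"revenue: ${revenue:.2f} vs electricity: ${electricity_cost:.2f}")
        print(f"net including the card: ${revenue - electricity_cost - card_cost:.2f}")
        # Revenue ($66.00) does not even cover the electricity ($68.23),
        # let alone the card, hence "less than 1:1".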
  • by megamerican ( 1073936 ) on Friday February 22, 2013 @04:44PM (#42983937)

    > Don't forget electrical costs. At $0.10 a kWh you are paying $0.24 a day (24 hours) per 100 watts of continuous average power consumption. This is $7.20 per month per 100W @ $0.10/kWh, or $87.60 a year. Adjust up/down for your cost of electricity and power usage (120W and $0.12/kWh = 1.2 * 1.2 = 1.44x adjustment).

    Believe me, I do not. With electricity costs taken into account I make around $4 per day (from 4 video cards) from Bitcoin or Litecoin on 2 gaming systems I rarely use. When I use my main gaming system it is slightly less.

    > Now add to this the waste heat vented into your house in the months you cool your house.

    Living in a colder climate, these costs offset, though I have no hard numbers. The slightly higher electricity costs in the summer months are offset by savings in natural gas costs in the winter months.

    > + the depreciated costs (and wear and tear) of the computer assets you tied up processing Bitcoins

    The goal is to maximize profits, not necessarily the number of bitcoins/litecoins I mine, so thanks to the power curve of most cards it is more profitable to slightly underclock the core and/or memory clock, which also helps minimize wear and tear on the cards. The cards I've had since 2009 are still running and producing the same MH/s as they always have.

    Many of the people who still mine bitcoins with GPUs are people who don't pay for electricity, thanks to the difficulty rise from FPGAs and ASICs. That pushed me out of profitability, but I can still mine Litecoin, a similar cryptocurrency, profitably.

    Even if there were no profits and I were just breaking even, I would still do it, because I'd like a use for my gaming machines: I rarely game anymore but still want to sit down and play every couple of weeks.
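    The quoted rule of thumb reduces to a one-line formula; this sketch just restates the parent's numbers in code:

        # Cost of continuous power draw: watts -> kWh -> dollars.
        def power_cost_per_day(watts, usd_per_kwh):
            return watts / 1000.0 * 24.0 * usd_per_kwh

        base = power_cost_per_day(100, 0.10)
        print(f"100 W @ $0.10/kWh: ${base:.2f}/day, "
              f"${base * 30:.2f}/month, ${base * 365:.2f}/year")

        # Scaling check: 120 W at $0.12/kWh is 1.2 * 1.2 = 1.44x baseline.
        adj = power_cost_per_day(120, 0.12)
        print(f"120 W @ $0.12/kWh: ${adj:.2f}/day ({adj / base:.2f}x)")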
