Folding@home GPU2 Beta Released, Examined 149
ThinSkin writes "Stanford has recently released an update to their Folding@home GPU-accelerated client, which includes notable upgrades such as support for more current Radeon graphics cards and even a visualizer to see what's going on. ExtremeTech takes a good look at the new Folding@home GPU2 client and interviews Director Dr. Vijay Pande about the project. To the uninitiated, Folding@home is a distributed computing project in which hundreds of thousands of PCs and PS3s devote a portion of their computing power to crunch chunks of biological data. The goal is 'to understand protein folding, misfolding, and related diseases.'"
Global Warming! (Score:5, Funny)
(I am joking, for those of you who are humor impaired)
Re:Global Warming! (Score:5, Funny)
Re:Global Warming! (Score:5, Funny)
Re: (Score:2)
Re:Global Warming! (Score:5, Funny)
Perhaps later. Too tired now. *yawn*.
Re: (Score:3, Funny)
Re: (Score:2, Interesting)
Re: (Score:2, Funny)
Re: (Score:2)
Perhaps this machine could assist in our efforts to run through all possible permutations to discover the true name of God [lucis.net]. . .
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Global Warming! (Score:5, Informative)
Folding@Home on a PS3 costs the average participant around $150-200 a year in electricity if they run it 24x7 — up to $400+ in places where electricity is more expensive. PCs average less, but only because so many of them are lower-power machines, while all PS3s are high wattage.
I think it's a worthwhile project, but the electricity people are donating isn't free, and F@H uses a lot more electricity than most people think. "Oh, I've got my PC on anyway" and "Oh, it can't be as much as my fridge" are both mistaken: your fridge uses a fraction of what a PS3 running F@H does, and even if your PC is on anyway, idling or sleeping uses a LOT less power than maxing out the CPU and/or GPU 24x7.
A PS3 running at 280W, 24x7 for a year:
280W x 24h/d x 365d/y = 2,452,800 watt-hours/year, or ~2453 kWh/y
At $0.12/kWh that'll cost you about $294/year.
Then multiply that by the number of PC's running it... it adds up fast.
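The arithmetic above can be sketched in a few lines; the 280 W draw and $0.12/kWh rate are the poster's assumed figures, not measured values.

```python
# Annual electricity cost of a device running 24x7.
# 280 W and $0.12/kWh are the parent post's assumptions.

def annual_cost(watts, price_per_kwh):
    """Return the yearly cost in dollars of a constant load."""
    hours_per_year = 24 * 365              # 8760 h/y
    kwh_per_year = watts * hours_per_year / 1000
    return kwh_per_year * price_per_kwh

print(round(annual_cost(280, 0.12), 2))    # ~294.34 dollars/year
```

Plug in your own wattage and local rate to see what a 24x7 client actually costs you.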
Like I said, it's a good program and a good cause, BUT it's not free. A kid/teen shouldn't be running it without a parent's permission and understanding of the cost.
I don't like the F@H 'propaganda' because I think it's somewhat deceptive about the costs. It relies on people's attitude that their free CPU time is truly free to keep them from thinking about the real costs. If you probe, they don't lie about the costs, but ethically they really should be more upfront about them.
And now that there is money involved, I should choose the best use of it. When I'm faced with a decision of where best to donate $300, I think there are other causes more worthy of my money than F@H. But that's a personal choice. If you want to donate to F@H, by all means do so.
One final issue - generally when you donate more than $10-20 to charity you get a tax receipt, and $150-500 is quite a bit more than $10.
Re: (Score:2, Insightful)
Re:Global Warming! (Score:4, Insightful)
Fair enough. But it's a little dishonest if you don't REALIZE how much you are investing. That's my biggest issue. Once people know what it costs, I have no issue if they're still willing to contribute. But it bugs me, especially since I believe a very significant proportion of the people contributing to F@H are not the ones paying the bills.
The other part is: how much do F@H results actually cost, in aggregate? Is it good value for the science produced? They've consumed between $50 and $100 million in electricity. Could they have made better progress towards their goals if they were given the money directly? At the very least, if they built their own supercomputer and managed the costs directly, the waste would be far, far less.
Not only would they be paying industrial rates for electricity instead of residential rates, they'd also be using far less of it, because they'd have racks of CPUs rather than full PCs needlessly powering hard drives and whatnot.
Hell, just take a look at the numbers from their site (for the purposes of this I've assumed it costs 'volunteers' on average $10 per CPU per month in electricity):
~191,000 PCs generating 182 TFLOPS. Total cost ~$1.9M/month, or ~$10,494/TFLOP/month.
~41,000 PS3s generating 1,257 TFLOPS. Total cost ~$0.4M/month, or ~$326/TFLOP/month.
What moron would keep the PCs running?
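The $/TFLOP comparison above reduces to one division; the ~$10/machine/month electricity figure is the poster's assumption, so treat the outputs as rough estimates.

```python
# Monthly electricity cost per TFLOP of aggregate throughput,
# using the parent post's assumed ~$10/month per contributing machine.

def dollars_per_tflop(machines, tflops, cost_per_machine=10.0):
    return machines * cost_per_machine / tflops

print(int(dollars_per_tflop(191_000, 182)))    # 10494 -- PCs
print(int(dollars_per_tflop(41_000, 1257)))    # 326   -- PS3s
```

The ~30x gap per TFLOP is the poster's whole point: the PS3's Cell is vastly more energy-efficient at this workload than a typical desktop CPU.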
A final note about overhead: you lose 10-20% efficiency right off the top with F@H due to the lack of a tax receipt. I can donate $250 to a registered charity at the same after-tax cost to me as buying $200 worth of electricity. Or, conversely, when you donate $200 to F@H, -you- pay an extra $20-50 in taxes versus having given the same $200 to a registered charity.
But if I had to choose, and if I had a choice, I'd rather invest in an @home project. I find it a lot more intrinsically motivating than knowing I'm keeping a statistic alive who in 10-20 years might start earning their country some money through taxation because they've had a K-6 education.
Between those two I'm inclined to agree. I tend to mostly donate to small local organizations myself.
Re:Global Warming! (Score:4, Insightful)
Re: (Score:2)
Re: (Score:3, Interesting)
Re:Global Warming! (Score:4, Informative)
So you can go and buy a second PS3.
Re:Global Warming! (Score:4, Informative)
There was information when PS3 F@H launched that consumption was 280-300W, but apparently it was actually around 200-220W, so my post above was off by ~$70; and now, with the newer lower-wattage PS3s, the cost comes down even more.
But even at 135W, assuming the same
Re: (Score:2)
Having a PC on, but idling, certainly consumes less power than one with a maxed CPU/GPU.
The only way to know for sure exactly how much of a difference it makes for you would be to stick an ammeter on your power cable and measure it both ways.
Re: (Score:2, Informative)
Re: (Score:2)
There are legit reasons to have a machine 'on' all the time, but most of them should go to sleep or even hibernate, if not just get turned right off.
Re: (Score:2, Interesting)
Good reason to run FireFox and AdBlock or FlashBlock. Even better, turn your PC off when you are not using it.
Re: (Score:3, Funny)
Re: (Score:2)
One final issue - generally when you donate more than $10-20 to charity you get a tax receipt, and $150-500 is quite a bit more than $10.
The last time I donated to charity (clothing, not money), I got a receipt that said I had donated, but not how much -- I was responsible for filling in the details and providing any documentation of value I needed. If that's acceptable for Goodwill, etc., it should work for Folding@home. They don't need to come up with a dollar figure; you can do that. They already tell you how many work units you did, right? Accounting for the electricity cost is your problem, but they should provide the details of
Re: (Score:3, Informative)
They would need to be a registered charity, though, for tax purposes. You can't just say you donated money to X and call it a day.
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
Good point.
It'd be interesting to see someone try to claim it, though. I wonder if the IRS would agree to its validity.
It would probably help if they provided you with a proper receipt of some sort, which they don't.
And I don't think it would help non-Americans even if they did, unless they were registered as a charity in other countries as well.
Re: (Score:2)
Re: (Score:2)
Your math is WAY off. (Score:2, Interesting)
Old PS3s (90nm):
Folding@Home with visuals: 215 watts.
Folding@Home screen saver: 185 watts.
New PS3s (65nm):
Running Folding@home: 157 watts.
Considering the GPU is still 90nm, that 157 figure should drop to ~127 watts when the screen saver kicks in.
Typical energy costs are also more like $.10/kWh.
127W x 24h/d x 365d/y = 1112520 Watt-hours/y or 1113 kWh/y
at $.10/kWh that actually costs more like $111/y.
Or if for some reason you're paying $.12/kWh, that's still only ~$134, less than half of you
Re: (Score:2)
F@H with visuals (map + protein thumbnail): 148 watts
F@H with screensaver: 131 watts
Oddly, the screensaver did not drop consumption by the same number of watts, despite the GPU being on the same 90nm process as in the older PS3s; in fact, the drop was only 57% of that seen on the 90nm PS3. I am uncertain as to why.
Regardless, 131 watts is still only ~$120/year for me, and that's certainly manageable for the sheer amount of work and fol
Re: (Score:3, Informative)
As for the price of electricity, and your assertion that it's 10c vs. my 12c: now we're just playing statistics. I could justify mine by noting that electricity prices are generally higher in Europe and Japan. (It's the equivalent of
Re: (Score:2)
But even if you were right, I'd say one cannot (for example) complain about the government not funding NASA and at the same time not run Folding @ home for economical reasons. Both are great science and both are worthy of (at least) a modest amount of investment.
Re: (Score:2)
You are right; my figures were off. The PS3 watt rating for the 90nm version is 190-220, not 280. The 65nm version is more efficient at around 157.
That brings the cost down to $188 and $137 respectively. But it bears mentioning that the
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
And that's an extremely reasonable and well-thought-out approach to it. But do you really think you are representative of the average F@H contri
Re: (Score:3, Informative)
Please show your work:
W : wattage of your PC running full tilt
P : price of electricity in $/kWh in your area
8760 : hours/year
W x 8760 = Wh (watt-hours)
Wh / 1000 = kWh (convert from Wh to kWh)
kWh x P = Total
I'd like to see how you get to $24. Because that would require either telling me that your "FX-55 gaming rig" is averaging ~16 watts at full load, tha
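Running the parent's formula backwards shows why a $24/year claim is suspect; the $0.12/kWh rate is an assumption, and other plausible rates shift the implied wattage only slightly.

```python
# Given a claimed annual dollar cost of a 24x7 machine, what constant
# average wattage would it have to draw?  $0.12/kWh is an assumed rate.

def implied_watts(annual_dollars, price_per_kwh=0.12):
    kwh = annual_dollars / price_per_kwh   # kWh bought per year
    return kwh * 1000 / 8760               # back to average watts

print(round(implied_watts(24)))            # ~23 W at $0.12/kWh
print(round(implied_watts(24, 0.17)))      # ~16 W at $0.17/kWh
```

Either figure is far below what any gaming rig draws under full load, which is the poster's point.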
Not "examined" (Score:2)
I do the laundry once a week (Score:5, Funny)
Re: (Score:2, Interesting)
visualization (Score:1)
Support for NVIDIA GPUs coming? (Score:4, Insightful)
From the benchmarks I have seen, it seems that there are currently no games that can effectively utilize, for example, 2 9800 GX2s. If Folding@home releases an Nvidia client, those people who have plunked $1000 into graphics cards may finally be able to put them to use!
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Insightful)
Re:Support for NVIDIA GPUs coming? (Score:5, Funny)
Re: (Score:2, Funny)
Re:Support for NVIDIA GPUs coming? (Score:5, Funny)
Re: (Score:2, Funny)
Re: (Score:2)
Re: (Score:2)
Re:Support for NVIDIA GPUs coming? (Score:4, Informative)
Ati Only (Score:4, Informative)
However, it only runs on R600-based ATI cards right now. It also requires
Interestingly also, it claims to parallelize processing the atoms, so it must use the individual stream processors on the graphics card directly.
Re: (Score:2)
I guess my cluster will sit there with just its CPUs crunching numbers and its GPUs idle for a while longer.
Crude statement (Score:2, Funny)
Sounds like Pornography@home to me...
Doing this at work? (Score:1)
I, of course, would have to get the okay to do this, but I am not even sure I would want to...
Has anyone done this? How did you go about it? What concerns are there (security, reliability)?
Re:Doing this at work? (Score:5, Insightful)
Re: (Score:1)
Just because you have a 240-watt power supply doesn't mean you pull 240 watts constantly. In fact, with drives and monitors off, you might be pulling 75. At least for most of our common computers.
I am environmentally aware, but I did the calculation, and 16 hours of a computer running costs less than 5 minutes of a $40,000 PHB's time. So the attempt to enforce a policy of shutting down computers nightly doesn't add up to the exec
Re:Doing this at work? (Score:4, Insightful)
Your execs are right to dismiss the notion of shutting down a computer that's idle; it's NOT consuming much. However, when that same computer is crunching Folding@home numbers... THAT is a huge cost.
Re: (Score:1)
Re: (Score:1)
BTW, you have a terribly inefficient computer if it's sucking down 134W when idle; typical for an office computer is around 65W. Unless, of course, you're talking about a gaming machine... (just realized the vid card you ref'd... d'oh!)
Re: (Score:1)
So is the program that intensive? Will it really pull that much power?
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Printer friendly link (Score:2, Informative)
Translation of "protein folding related diseases." (Score:3, Informative)
FYI: This means prion-related diseases => mad cow disease
Re: (Score:2, Offtopic)
Re: (Score:2)
Heart disease is one area you can generally attribute to a genetic program not designed for the era we live in rather than specific defects (or accumulated
Shameless stat plug. (Score:1)
Idea: F@H to help filter spam? (Score:4, Interesting)
That way, if you read spam, at least you know that you contributed to F@H. If you want less spam, you turn up your threshold for how many work units the sender has to do.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
If they've got a botnet, and I can force them to use it to help F@H, I see that as a good thing.
And as I said in another post, if someone sends a legitimate email, I think it makes sense that you could return their credit when you say, "Ah yes, this isn't spam."
Re: (Score:2)
Re: (Score:3, Informative)
There are all sorts of third parties involved in sending email. I'm not proposing a solution for everyone - I'm suggesting one possibility.
Re: (Score:3, Insightful)
It's worth a shot at thinking outside the box, but they have the CPU cycles and can likely hack past any kind of attempt to node lock the work units.
I suppose a minor benefit would be that some kind of work gets done before a spam message was sent out, but there's got to be a way to get past that requirement -- F@H is based on a measure of trust (and some cross-validation) that parti
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
Actually, a very similar system was tried; I don't know if it's still in any sort of widespread use (or as widespread as it ever got) or not.
Hashcash [wikipedia.org] involved calculating a hash, taking up CPU time, and sticking it in the email header. The recipient could easily verify that you'd spent CPU time to send this message, hence, in theory, proving that you're not a spammer.
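The scheme described above can be sketched in a few lines; SHA-1 with 20 leading zero bits mirrors the classic hashcash parameters, and the `mint`/`verify` names here are mine, not from any real library.

```python
import hashlib

def mint(resource, bits=20):
    """Grind counters until SHA-1("resource:counter") has `bits` leading zero bits."""
    counter = 0
    while True:
        stamp = f"{resource}:{counter}".encode()
        value = int.from_bytes(hashlib.sha1(stamp).digest(), "big")
        if value >> (160 - bits) == 0:     # SHA-1 digests are 160 bits
            return stamp
        counter += 1

def verify(stamp, bits=20):
    """A single hash checks the stamp -- cheap for the receiver."""
    value = int.from_bytes(hashlib.sha1(stamp).digest(), "big")
    return value >> (160 - bits) == 0
```

Minting costs ~2^bits hashes on average while verification costs one; that asymmetry is what makes a stamp expensive for a bulk spammer yet trivial for the recipient to check.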
Re: (Score:2)
All sorts of people filter based on the amount of work that went in to the communication. From least impressive to most impressive - send someone an email, send a fax, place a phone call, put a letter in the mail, show up in person to talk to them, have a friend of theirs say that you'd like to get in touch, put a full page ad in the
Re: (Score:2)
Of course you'd still use your whitelist and blacklist. When did I ever say anything else?
I'm talking about unsolicited emails from addresses and domains you've never heard of before. The email saying, "Hey, I think we were in college together - and I'm going to be in town this weekend - want to get a beer?" This case is exactly where your greylist fails, because there's no way for the sender to raise the likelihood of you reading his email. My proposal would make it possible.
Re: (Score:2)
You could also try not starting a conversation with "that's the stupidest idea I've seen all week." It makes it more enjoyable for both parties.
You may enjoy the challenge of debating or discussing a moving target, I don't.
Well, when I write a full specification for the system, I'll give you a call. Until then, I was posting an idea on a public forum. If you don't enjoy discussing evolving ideas, don't respond when they're posted
Re: (Score:2)
Email is inherently unreliable. I'm not proposing a mechanism that everyone would have to use to send email; I'm proposing one way that people could use to increase the likelihood that their unsolicited email would be read by the receiver.
Use your whitelist and your blacklist. For everything in between, it's a hard problem with no clear solutions, so that's why I've got a proposal.
What about AI? (Score:2)
I know it takes billions of neurons to do anything, but with all this extra power lying around, we might just have enough to do it.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I guess they will still be using drivers for the cards, though, even if they are not using DirectX? But this is closer to banging right on the hardware =p. If cards were all made to conform to a certain instruction set (presumably along the lines of how all x86 processors share the same basic instructions), we'd be able to eliminate the need for drivers there
Re: (Score:2)