Sacrificing Accuracy For Speed and Efficiency In Processors
Skudd writes "Modern computing has always been reliant on accuracy and correct answers. Now, a professor at Rice University in Houston posits that some future applications could be revolutionized by 'probabilistic computing.' Quoting: 'This afternoon, Krishna Palem, speaking at a computer science meeting in San Francisco, will announce results of the first real-world test of his probabilistic computer chip: The chip, which thrives on random errors, ran seven times faster than today's best technology while using just 1/30th the electricity. ... The high density of transistors on existing chips also leads to a lot of background "noise." To compensate, engineers increase the voltage applied to computer circuits to overpower the noise and ensure precise calculations. Palem began wondering how much a slight reduction in the quality of calculations might improve speed and save energy. He soon realized that some information was more valuable than other information. For example, in calculating a bank balance of $13,000.81, getting the "13" correct is much more important than the "81." Producing an answer of $13,000.57 is much closer to being correct than $57,000.81. While Palem's technology may not have a future in calculating missions to Mars, it probably has one in such applications as streaming music and video on mobile devices, he said.'
Re:Accuracy with financial calculations. (Score:3, Informative)
Hasn't this guy ever watched Superman 3?
Maybe he just watched Office Space and missed the whole Superman 3 reference?
Multimedia processor (Score:3, Informative)
If you're about to join the upcoming avalanche of smartass comments, try reading the UDP-Lite RFC [ietf.org] first. For some applications (notably real-time voice and video), timeliness and efficiency are more important than accuracy.
If this means my music player or phone get more battery life, I'm all for it.
Primality testing (Score:3, Informative)
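The title presumably refers to probabilistic primality tests such as Miller-Rabin, a classic case of trading a tiny, controllable error probability for enormous speed: the test can wrongly call a composite "prime" with probability at most 4^-rounds, which is made negligible by running a few dozen rounds. A minimal sketch:

```python
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin: False means n is definitely composite;
    True means n is prime with error probability <= 4**-rounds."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 as d * 2**s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)           # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False           # witness found: definitely composite
    return True                    # almost certainly prime
```

Deterministic trial division would need exponentially more work for large n; here the "sloppiness" is precisely quantified and tunable.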
Re:uhhh.... (Score:5, Informative)
NEWS FLASH: Binary consists of 1 and 0.
NEWS FLASH: People use computers for calculations with more than single-digit binary results.
"probablistic computing" is another way of saying "sloppy engineering".
No: insisting on excessive precision where an "almost certainly right to within +/- x%" solution would be more than good enough, and much simpler to obtain, is known as overengineering.
I suspect that the financial examples chosen didn't illustrate the point as well as intended (financial companies generally don't like *any* inaccuracy), but that doesn't change the general principle.
Would you prefer a routing algorithm that gobbled up power and took ages to run for a guaranteed shortest route or one that was far more efficient and 99.9% certain to give a route that was within 3% of the shortest possible distance?
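The routing tradeoff above can be made concrete with a toy example (hypothetical, not from the article): an exact brute-force tour over n points costs O(n!), while a nearest-neighbor heuristic runs in O(n^2) and is usually close to optimal.

```python
import itertools
import math
import random

def tour_len(points, order):
    """Length of the closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def exact_tour(points):
    """Guaranteed shortest closed tour: O(n!) brute force."""
    n = len(points)
    best = min(itertools.permutations(range(1, n)),
               key=lambda rest: tour_len(points, (0,) + rest))
    return [0] + list(best)

def greedy_tour(points):
    """Nearest-neighbor heuristic: fast, no optimality guarantee."""
    order, left = [0], set(range(1, len(points)))
    while left:
        nxt = min(left, key=lambda j: math.dist(points[order[-1]], points[j]))
        order.append(nxt)
        left.remove(nxt)
    return order
```

The heuristic can only ever be as good as or worse than the exact answer, but for many inputs the gap is small and the speedup is astronomical.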
Re:Accuracy with financial calculations. (Score:5, Informative)
Quite correct, the thousands digit is far more important than the cents. However, by that logic 13,810.00 is really close to 13,000.81, right? It has all the same digits in a similar order.
Not by the fuzzy logic the guy's using. He's going for scientific accuracy, i.e. 13,000.81 (+/- 0.001%). It's just our symbol-comparing brains that would consider those numbers 'close'.
In which case a $810 error in a $13k account is a big friggen error, and would violate the standards of the chip he's working on. Now, I don't know HOW he's making sure high order bits are done more accurately than the low order ones, but that's what the article mentions him doing.
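The article doesn't detail Palem's circuit, but one standard trick in the approximate-computing literature is exactly what the parent describes: spend full voltage only on the high-order bits and let the low-order bits be cheap or truncated, which gives a hard bound on the error. A hypothetical sketch (the function and parameter names are my own, not from the article):

```python
def approx_add(a: int, b: int, k: int = 4) -> int:
    """Add two non-negative ints after truncating the k low-order
    bits of each operand (simulating a cheap/absent low-order adder).
    Each operand loses at most 2**k - 1, so the total error is
    bounded by 2*(2**k - 1) regardless of operand size: the
    high-order digits stay trustworthy."""
    mask = ~((1 << k) - 1)
    return (a & mask) + (b & mask)
```

The point is that the error is small *and bounded*, unlike the summary's $810 example, where a random flip in a high-order digit would violate the guarantee.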
Re:DSP's? (Score:3, Informative)
Re:Nothing new here... (Score:1, Informative)
JPEG and MPEG are more about a tradeoff of accuracy for space, not a tradeoff of accuracy for speed or for electricity.
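The space-for-accuracy trade in JPEG happens mostly in its quantization stage: transform coefficients are divided by a step size and rounded, so small values collapse to zero and compress well, at the cost of a bounded reconstruction error. A minimal sketch of that one stage (not a full codec):

```python
def quantize(values, step):
    """Lossy step: divide by the step size and round, as in
    JPEG's quantization of DCT coefficients."""
    return [round(v / step) for v in values]

def dequantize(quantized, step):
    """Reconstruction: multiply back. Error per value <= step / 2."""
    return [q * step for q in quantized]
```

A larger step means more zeros (better compression) and more error, which is the knob JPEG quality settings turn.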
Re:Bank balance (Score:5, Informative)
'It's like drinking from a well. Connoisseurs may claim to be able to taste the difference between it and tap water, but that's just the extra tang from all the bull shit.'
Probably not the best example. Humans have an amazing ability to taste very minute differences in water. My TDS meter tells me that tap water here is extremely pure to begin with, but I can pick out water that has undergone carbon and RO filtering versus straight tap water in a blind taste test with 100% accuracy. I'm certainly no connoisseur.
Actually, I'm from rural Illinois, and all the water there, be it tap or a properly maintained well, is fairly sweet with minimal filtering. The streams are a bit muddy tasting, but the water itself is sweet as it flows. It definitely beats this Florida swamp water. I tasted unfiltered Florida well water once (most Florida wells have filters built in) and I vomited. The tap water here won't make you sick and isn't that nasty, but it still tastes funky.
That said, I doubt I could tell the difference between tap, well, Illinois, or Florida water that has had that additional filtering (Carbon and Reverse Osmosis, any of those machines for $0.39/gallon at the grocery store will do). My TDS meter shows a difference in purity even from one dispensing machine to the next, but I can't taste that difference. Whatever minerals survive that process are probably pretty much the same anywhere and taste good. That filtered water tastes better than any of the unfiltered waters.
Article badly misrepresents the idea (Score:5, Informative)
The author of the linked article has completely misunderstood what this research is about. It is NOT about tolerating errors in the output of computations; that would be completely infeasible. It's about tolerating errors in intermediate values, by using redundancy. For example, three adders made out of unreliable transistors plus a control unit to have them vote, may be smaller and use less power than one adder made out of reliable transistors. However, you can't make everything out of unreliable transistors. In particular, the control unit, and the parts that compare results to each other, have to work reliably and can't be duplicated. That is what is meant by "some information was more valuable than other information", not the low-order bits of a numeric computation.
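The adders-plus-voting scheme the parent describes is easy to simulate. A toy Monte Carlo sketch, under the assumption (disputed in a reply below for real hardware) that each unreliable adder fails independently with probability p; majority voting then fails only when at least two of three adders fail, roughly 3p^2 for small p:

```python
import random

def unreliable_add(a, b, p):
    """Correct sum with probability 1 - p; on failure, a corrupted
    value (here modeled as a flipped low bit)."""
    s = a + b
    return s ^ 1 if random.random() < p else s

def voted_add(a, b, p):
    """Three unreliable adders plus a (reliable) majority-vote unit."""
    results = [unreliable_add(a, b, p) for _ in range(3)]
    for r in results:
        if results.count(r) >= 2:
            return r
    return results[0]  # unreachable with this two-outcome error model

def error_rate(add_fn, trials=100_000, p=0.05):
    random.seed(0)
    wrong = sum(add_fn(7, 8, p) != 15 for _ in range(trials))
    return wrong / trials
```

With p = 0.05 the single adder is wrong about 5% of the time, while the voted triple lands near 0.7%, which is the sense in which redundancy buys back reliability from cheap components.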
Re:A Little Bit of History Repeating (Score:2, Informative)
Basically, he uses probability, while fuzzy logic uses degrees of certainty. It's a subtle distinction unless you really like math, but if you want to argue it, it's worth arguing.
Re:Article badly misrepresents the idea (Score:1, Informative)
The errors are not completely random, so 3 adders combined with voting doesn't help your accuracy. If you are using these kinds of adders you're better off just using one, and accepting the error rate in the results.