Sacrificing Accuracy For Speed and Efficiency In Processors
Skudd writes "Modern computing has always been reliant on accuracy and correct answers. Now, a professor at Rice University in Houston posits that some future applications could be revolutionized by 'probabilistic computing.' Quoting: 'This afternoon, Krishna Palem, speaking at a computer science meeting in San Francisco, will announce results of the first real-world test of his probabilistic computer chip: The chip, which thrives on random errors, ran seven times faster than today's best technology while using just 1/30th the electricity. ... The high density of transistors on existing chips also leads to a lot of background "noise." To compensate, engineers increase the voltage applied to computer circuits to overpower the noise and ensure precise calculations. Palem began wondering how much a slight reduction in the quality of calculations might improve speed and save energy. He soon realized that some information was more valuable than other information. For example, in calculating a bank balance of $13,000.81, getting the "13" correct is much more important than the "81." Producing an answer of $13,000.57 is much closer to being correct than $57,000.81. While Palem's technology may not have a future in calculating missions to Mars, it probably has one in such applications as streaming music and video on mobile devices, he said.'
Bank balance (Score:5, Funny)
Re:Bank balance (Score:5, Insightful)
Re:Bank balance (Score:5, Insightful)
If you are listening to music on a portable media device, it's safe to say that you aren't going to be able to hear the difference between the lossy format and the lossless format.
It's like drinking from a well. Connoisseurs may claim to be able to taste the difference between it and tap water, but that's just the extra tang from all the bull shit.
Re: (Score:2)
I expect financial calculations to be accurate to the penny, or even calculated to the third or fourth decimal place, then rounded to the nearest 2nd decimal place.
But I agree, audio & video playback and other things are different. An occasional error on the 15th or 16th bit isn't going to be audible in real-world portable circumstances.
Re:Bank balance (Score:5, Interesting)
High accuracy is required for encoding music and video, though.
Maybe we could have selective accuracy, where programmers can set their needs via registers or direct work to different CPU cores: accurate cores for banking transactions, AV-stream encoding and 2D GUI operations, while inaccurate cores handle AV-stream decoding, 3D rendering in games and AI decisions.
There's a whole lot we are calculating now without the need for more than 3 significant digits - and a whole bunch where we intentionally use random numbers, sometimes even with strong hardware entropy gathering.
These are all cases where we could just scrap the accuracy for faster processing or longer battery times. No one cares about single bit errors in portable audio decoders or in high fps 3d gaming.
Or maybe we could... (Score:3)
Maybe we could have a selective accuracy, where programmers can set their needs via registers or direct it to different CPU cores ... just keep using (say, in C) short, int, long, long long (no, AV codecs should not require floating point, but if you wish, there are floats, doubles, long doubles, etc.).
Of course, a proper implementation of an AV decoder on a modern processor will use the available SIMD instructions (MMX and friends), where the programmer can easily trade off accuracy (in byte-sized chunks) for speed.
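In the same spirit, here is a crude scalar sketch of trading sample precision for data volume (plain Python, no SIMD; the function name is made up for illustration):

```python
import array

def quantize16_to_8(samples: array.array) -> array.array:
    """Keep only the high byte of each signed 16-bit sample.

    Coarser audio, but half the memory traffic -- the same
    accuracy-for-throughput trade SIMD code makes when it packs
    narrower lanes per register.
    """
    return array.array('b', (s >> 8 for s in samples))

pcm16 = array.array('h', [0, 1000, -32768, 32767])
print(list(quantize16_to_8(pcm16)))  # [0, 3, -128, 127]
```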
Re:Bank balance (Score:4, Insightful)
Re: (Score:3, Insightful)
You wha?
I think hardware and software designers already have that covered when they perform processing on different parts of your system, such as your CPU, or GPU...
Specialization is a good thing, unless you have a preference for the performance of the directx reference rasterizer....
Re: (Score:3, Insightful)
Any numeric analyst worth their pay will have thought hard about every calculation. If there's a subtraction, then either it won't be two floating-point numbers of similar magnitude, or the result won't be crucial (e.g. it might be an error estimate rather than the actual result).
If the characteristics of the hardware are known, then algorithms can be designed to suit them.
Re:Bank balance (Score:5, Insightful)
There are countless applications for a computer that don't depend on accuracy, but do depend on speed. For example: gaming, stock analysis, scientific/mathematical research, etc. Just about every use for the computer can benefit from this. Bear in mind these applications can take the hit of inaccuracy, if not benefit from it depending on the situation. Yes, there are some instances where accuracy is crucial, but that's why they will continue to make both kinds of processor. It's what they call a free market, and there will always be a new niche to fill.
Re: (Score:3, Interesting)
Given that, I would expect this hardware - if it proves useful - would primarily be in the "entertainment" sector of the market. Of course making this judgment
Re:Bank balance (Score:4, Insightful)
The randomness in these processes occurs in particular places, in particular quantities. This processor presumably produces some characteristic amount of randomness in each calculation, but the odds of it being a meaningful amount for whatever arbitrary calculation you're doing are vanishingly small; and given it's apparently tuned to give different amounts of randomness in different bits, it's almost certainly non-uniform as well.
In almost all simulations you want to make use of extremely well-controlled random numbers - something which adding some noise as part of your floating-point calculations is not.
Re: (Score:3, Insightful)
Accuracy might not matter for some steps in an implicit, iterative numerical scheme.
Re: (Score:3, Insightful)
... In the US cents in my checkbook are important....
Would that not depend on whether the errors made were equally in your favor and against it? If you did a large number of transactions and half of them added a penny or two to your account and half of them subtracted a similar amount, it would all even out in the end, would it not? Only if the errors were unidirectional would there be a problem in the long term.
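That intuition can be sketched as a random walk (a toy model, assuming independent, symmetric one-cent errors):

```python
import random

# 100,000 transactions, each randomly adding or subtracting one cent.
random.seed(0)
drift = sum(random.choice([-0.01, 0.01]) for _ in range(100_000))

# The mean drift is zero, but the *spread* grows like sqrt(n):
# after 100,000 symmetric errors, being a few dollars off is typical.
print(round(drift, 2))
```

So "evening out" holds for the average across many accounts, but any individual account can still wander by a few dollars over enough transactions.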
Re: (Score:3, Interesting)
There are a few areas where games care a little bit about accuracy. You've got to be really careful about it in any kind of flight game with playing fields more than a few miles in size. It's amazing the kind of graphical artifacts you can get if you don't take the error in floating points into account when you handle that sort of thing. In our first naive implementation of the engine, at around 50 kilometers out every character would jitter constantly every time they moved because of the last floating point
Re: (Score:3, Insightful)
Don't forget there are errors in the hardware processes anyway and error correction algorithms running on the software side that take care of them.
Re:Bank balance (Score:5, Informative)
'It's like drinking from a well. Connoisseurs may claim to be able to taste the difference between it and tap water, but that's just the extra tang from all the bull shit.'
Probably not the best example. Humans have an amazing ability to taste very minute differences in water. My TDS meter tells me that the tap water here is extremely pure to begin with, but I can pick water that has undergone carbon and RO filtering out from straight tap water in a blind taste test with 100% accuracy. I'm certainly no connoisseur.
Actually, I'm from rural Illinois, and all the water there, be it tap or properly maintained well, is fairly sweet with minimal filtering. Admittedly the streams there are a bit muddy tasting, but the water itself is sweet as it flows. It definitely beats this Florida swamp water. I tasted unfiltered Florida well water once (most Florida wells have filters built in) and I vomited. The tap water here won't make you sick and isn't that nasty, but it still tastes funky.
That said, I doubt I could tell the difference between tap, well, Illinois, or Florida water that has had that additional filtering (Carbon and Reverse Osmosis, any of those machines for $0.39/gallon at the grocery store will do). My TDS meter shows a difference in purity even from one dispensing machine to the next, but I can't taste that difference. Whatever minerals survive that process are probably pretty much the same anywhere and taste good. That filtered water tastes better than any of the unfiltered waters.
Re: (Score:3, Insightful)
In all probability this would never make it into usage as a CPU, more likely a dedicated section of silicon performing the function of a GPU/FPU/DSP. Actual control instructions would be high-integrity, just the low significance digits would get the 'cheap but a bit more unreliable' methods.
I'd like to see how much an image would change if you imposed a 1% chance of a 10% error in either the chroma or luminance of each pixel. I'm willing to bet it'd be 'take a magnifying glass and the images next to each
Re: (Score:3, Interesting)
Re: (Score:2)
Cutting off at the cents place isn't arbitrary--it's done because the cent is the smallest unit of currency produced by the US.
Re: (Score:3)
There's rounding in virtually every transaction you already encounter. Do you live in a location with sales tax?
In pay periods where my paycheck is mathematically supposed to be consistent, it also fluctuates by a cent sometimes. The value averages out but there's still rounding and it's quite obvious.
Re: (Score:3, Interesting)
Well, the odds are you probably wouldn't even notice if a few bits here and there were wrong in your audio stream. I'm not sure what the error rate is, but if it's less than 17 times as much as we have now it'd be worth considering for some applications.
I figure if they do use this technology they'd more than likely use the multi-core system currently in place and make one a high-accuracy CPU while the other 2-4 cores are high-speed CPUs. Like someone said, it'd be used for gaming and streaming video/audio where 'accuracy
Primality testing (Score:3, Informative)
Re: (Score:3, Interesting)
Even more so: we intentionally gather entropy to improve the pseudorandom numbers. With intentionally inaccurate CPU cores, we could scrap all that, gather entropy en passant, AND be much faster anyway.
Re: (Score:3, Insightful)
Yeah, but there is a big difference between "random" and "incorrect".
The errors resulting from undesirable interactions between transistors are probably a lot less random than a good pseudorandom number generator for these purposes.
Re: (Score:3, Insightful)
I'm not even sure such things exist. Even if you meant pseudorandom, that still has zero to do with the point under discussion.
Re: (Score:2)
For example, in calculating a bank balance of $13,000.81, getting the "13" correct is much more important than the "81."
Not to an accountant it isn't.
Besides, to take the example further: if getting the $800 right is much more important than the rest of the $800,000,000,000, does it mean no-one will care if my account suddenly goes from $2000 to $20,000?
Reminds me of... (Score:5, Funny)
Q: Why did Intel call it the Pentium instead of the 586?
A: Because they added 486 and 100 on the first Pentium and got 585.999983605.
Re: (Score:2)
I was thinking along the same lines.
Although considering they call this thing a probabilistic chip, maybe we'll soon have the Infinite Improbability Drive.
Re: (Score:2)
I was thinking along the same lines.
Although considering they call this thing a probabilistic chip, maybe we'll soon have the Infinite Improbability Drive.
We're going to need a really hot cup of tea...
Re:Reminds me of... (Score:5, Funny)
My computer's not slow, it's just being careful.
Accuracy with financial calculations. (Score:5, Funny)
Accuracy with financial calculations is extremely important. Hasn't this guy ever watched Superman 3?
Re: (Score:3, Informative)
Hasn't this guy ever watched Superman 3?
Maybe he just watched Office Space and missed the whole Superman 3 reference?
Re: (Score:2)
I'll note that he was making the point that the thousands part is far more important than the cents part.
Basically, it sounds like he dumbed down the answer too much. Of course the cents are important in your bank account. And, more importantly, it's fairly trivial for us to KEEP that accuracy.
Still, when you start expanding it to, say, a company's balance on the books, you tend to get errors. Think of it like a warehouse inventory - every time you do an inventory, there's a chance that somebody will count
Re: (Score:2)
Quite correct, the thousands is far more important than the cents; however, 13,810.00 is really close to 13,000.81, right? It has all the same numbers in a similar order.
Banks calculate out to the ten-thousandths place simply because you need that much for rounding errors. Heck, in my business most items are priced out to the ten-thousandths place, giving pricing like $121.3456 for a cost.
Re:Accuracy with financial calculations. (Score:5, Informative)
Quite correct, the thousands is far more important than the cents; however, 13,810.00 is really close to 13,000.81, right? It has all the same numbers in a similar order.
Not by the fuzzy logic the guy's using. He's going for scientific accuracy, i.e. 13,000.81 (±0.001%). It's just our brains, which compare symbols, that would consider those numbers 'close'.
In which case a $810 error in a $13k account is a big friggen error, and would violate the standards of the chip he's working on. Now, I don't know HOW he's making sure high order bits are done more accurately than the low order ones, but that's what the article mentions him doing.
Use in MP3 Players (Score:2, Funny)
I guess the question is can Cher sue over this technology?
Re:Use in MP3 Players (Score:5, Interesting)
If you bought a popular artist recently, your music is autotuned already.
Anyway, this means that less than 0.1 percent of your decoded audio samples will be less than 0.1 percent off. This is probably very acceptable outside concert halls and living rooms if it delivers large bonuses in battery savings or calculation speed.
For example, we could run a much beefier compression algorithm than MP3, or run current algorithms even longer, on even smaller devices.
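The parent's figures are easy to sanity-check with a toy simulation (the 0.1% rate and 0.1% magnitude are the comment's hypothetical numbers, not measurements of the chip):

```python
import random

def sloppy_decode(samples, err_rate=0.001, err_mag=0.001):
    """Perturb a random ~0.1% of samples by at most 0.1% -- a crude
    model of rare low-order errors in a probabilistic decoder."""
    out = []
    for s in samples:
        if random.random() < err_rate:
            s *= 1 + random.uniform(-err_mag, err_mag)
        out.append(s)
    return out

random.seed(1)
decoded = sloppy_decode([1.0] * 100_000)
wrong = sum(1 for s in decoded if s != 1.0)
print(wrong)  # on the order of 100 samples out of 100,000
```

Every corrupted sample stays within 0.1% of its true value, which is far below the quantization noise of cheap earbuds.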
Cher Patent Term Extension Act (Score:2)
I guess the question is can Cher sue over this technology?
Only if lawmakers pass the Cher Patent Term Extension Act [kuro5hin.org].
Old Tech? (Score:2)
wll, (Score:5, Funny)
i scrfcd accrc 4 spd a lng tm ago
Re:wll, (Score:5, Interesting)
This whole thing is old and silly.
Seymour Cray is known for asking "Do you want fast or accurate?" Back then there was no IEEE 754 floating-point spec (and even IEEE 754 is not infinitely precise), machines were pretty primitive, and his machines used Newton's-method approximations for many numeric calculations that were accurate only to a point, just like John Carmack later did (in software) with Quake III's fast inverse square root.
The moral of the story is that in 2009 and beyond it's probably best for hardware to continue to be accurate. This is why we have digital 1s and 0s instead of some other base of computation.
Now, in software, feel free to make things as sloppy as you want. If your bank (not mine) wants to round 13,000.81 to some other value, then by all means go for it. But I think that most of us are OK with accurate hardware.
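For reference, that famous trick can even be reproduced in Python by reinterpreting the float's bits (the magic constant is the well-known 0x5f3759df; one Newton step brings the error under roughly 0.2%):

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    """Approximate 1/sqrt(x): bit-level initial guess plus one Newton step."""
    i = struct.unpack('<I', struct.pack('<f', x))[0]  # float bits as uint32
    i = 0x5f3759df - (i >> 1)                         # magic initial guess
    y = struct.unpack('<f', struct.pack('<I', i))[0]  # bits back to float
    return y * (1.5 - 0.5 * x * y * y)                # one Newton-Raphson step

print(fast_inv_sqrt(4.0))  # ~0.499, vs. the exact 0.5
```

Exactly the accuracy-for-speed bargain under discussion, made in software decades ago.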
Re:wll, (Score:5, Funny)
i scrfcd accrc 4 spd a lng tm ago
and it was going so well too... until you got thirsty and told your friend ..
"hy! I wnt sm ck!"
Top Ten Slogans (Score:5, Funny)
9.9999973251 - It's a FLAW, Dammit, not a Bug
8.9999163362 - It's the new math
7.9999414610 - Nearly 300 Correct Opcodes
6.9999831538 - "You Don't Need to Know What's Inside" (tm)
5.9999835137 - Redefining the PC -- and Mathematics As Well
4.9999999021 - We Fixed It, Really
3.9998245917 - Division Considered Harmful
2.9991523619 - Why Do You Think They Call It *Floating* Point?
1.9999103517 - We're Looking for a Few Good Flaws
0.9999999998 - "The Errata Inside" (tm)
Re:Top Ten Slogans (Score:5, Funny)
TOP TEN SLOGANS:
runs Excel just as well as always :-)
It seems like when you need a precise calculation (Score:5, Insightful)
like financial things, you could compare the result with the result of the same calculation run either through the chip again or through another chip (or, increasingly likely, another core).
It would still be faster, too.
Re: (Score:2)
Repeating the process is not going to improve the results.
Suppose you repeat your calculation 3 times. How will you know that the result of comparing the three results with one another is correct?
What if the only answer you can obtain is "A equals B with a probability of 0.9998"? Recursively repeat this comparison and then compare the results? :)
DSP's? (Score:3, Insightful)
My question is, if it's just as well to use a DSP, why not just use a damned DSP?
Re:DSP's? (Score:5, Insightful)
I think the point would be using a DSP/math coprocessor that uses 1/30th the power in exchange for a .001% loss in accuracy for non-essential tasks like music decoding.
I mean, combined with the lousy earbuds most people use, who'd notice? Especially if it makes their MP3 player last 3 times as long as ones that use more traditional and technically accurate DSP/decoder?
Re: (Score:3, Informative)
Re: (Score:2)
http://de.wikipedia.org/wiki/GPGPU [wikipedia.org]
Not completely correct (Score:4, Insightful)
If you read the chapter on the history of the IEEE FP standard in Computer Architecture: A Quantitative Approach, you will see that in the past accuracy was already sacrificed for speed in supercomputers.
Hmmm (Score:2)
What happens when you add up millions of these inaccurate accounts and end up gaining or losing millions of dollars?
Re:Hmmm (Score:5, Funny)
You get a one-way ticket to pound-me-in-the-ass prison.
Watch out for your cornhole, bud.
Re:Hmmm (Score:5, Funny)
Re: (Score:2)
Given the current state of difficulty settings and boss AI I'm not entirely sure players would notice the difference.
gfx (Score:5, Insightful)
can't this be used in gfx cards, i mean with anti-aliasing and high resolutions it doesn't really matter so much if 1/2 a pixel is #ffffff or #f8f4f0 , hell you can probably even get a pixel entirely wrong for one frame and nobody will care (as long as it doesn't happen too often).
Re: (Score:2, Interesting)
I agree. In fact, with real-time photorealistic rendering, these slight deviations would probably make frames look more accurate, since real video is full of low-level random noise.
The film-grain shader on Left 4 Dead wouldn't be necessary any more.
Re: (Score:2)
Re: (Score:2)
hell you can probably even get a pixel entirely wrong for one frame and nobody will care (as long as it doesn't happen too often).
You must be new enough here to have never been fragged by a shock spell across a dark room. Flashes of light -- even small ones -- are important, at least in Oblivion.
Multimedia processor (Score:3, Informative)
If you're about to join the upcoming avalanche of smartass comments, try reading the UDP-Lite RFC [ietf.org] first. For some applications (notably real-time voice and video), timeliness and efficiency are more important than accuracy.
If this means my music player or phone get more battery life, I'm all for it.
Re: (Score:2)
I'm with you there. I want to listen to my music for weeks without having to plug it in at all, especially if I'm camping.
Re: (Score:2)
Another appropriate comparison would be UDP with Forward Error Correction (FEC), which is a probabilistically guaranteed delivery mechanism (i.e., the data will probably arrive in full, but there is no guarantee) for times when accepting that risk is better than requiring delivery.
Sacrifices are expected (Score:4, Interesting)
I have spent the last 9 months coding up a dynamic, scalable UI for the Nokia tablets.
I have had to make huge compromises to accuracy to obtain the desired performance.
I had the choice of using the full-featured (but slow) widget sets and graphical primitives which existed already, or finding a way to make it work as I expected it to.
The results have left people breathless :)
take a look here: http://www.youtube.com/watch?v=iMXp0Dg_UaY [youtube.com]
Re:Sacrifices are expected (Score:5, Funny)
That was a very confusing video. What I learned from it: you haven't done some stuff, Zoom Fish!, widgets, Zoom Fish!, behind schedule, zoom, Fish!, widget framework, Fish!
I guess it's a system that lets you zoom in on fish?
Suitable for streaming media??? (Score:2, Funny)
I'd like to see executives at CBS explain how nipples showed ON TOP of a superbowl performer's outfit.
Talk about a wardrobe malfunction.
I can see the defense now:
Your honor: we ran probabilistic tests with our processors, and while we couldn't really duplicate the problem, we were able to show a penis during one test run. We'd really like to show it to you, but Ms. Jackson has stated that she would, quote, "sue us into the ground," unquote.
Nothing new here... (Score:4, Insightful)
Isn't that essentially what JPEG, MPEG and every other lossy codec or transform does?
Re: (Score:2, Interesting)
Re:Nothing new here... (Score:4, Interesting)
There have been dedicated MPEG encoding and decoding chips for many years. DXR3 comes to mind.
I think the only new twist is applying the idea to general calculations as a whole as opposed to a specific function or set of functions in software. An interesting idea. Maybe we'll end up with double-precision, single-precision and ballpark floats.
Re: (Score:2)
No. Those are lossy but deterministic.
Games? (Score:4, Insightful)
I could definitely foresee this being used in game systems, especially for graphics.
As long as it mostly looks right, that's all that really matters.
A Safety Sticker ? (Score:2)
So, what, future computers may come with a big sticker :
WARNING : Should not be used in life-critical calculations.
On the other hand, if the errors are really rare and random, and he can make chips 7 times faster at 1/30th the power drain, then you could array 3 such chips at 7 times the speed and 1/10 the power usage, and do your computations by majority vote.
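The arithmetic in that last suggestion checks out, taking the article's 7x/1-30th figures at face value:

```python
# Three redundant chips, each at the article's claimed 1/30th power draw.
power_each = 1 / 30
chips = 3
total_power = chips * power_each
print(total_power)  # 0.1 -> one tenth of a conventional chip's power

# Speed stays ~7x: all three compute the same answer in parallel, so the
# majority-voted result is ready as soon as the (7x faster) chips finish,
# minus whatever the voting logic costs.
```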
Cool for my entertainment (Score:2)
My bank would love this (Score:2, Interesting)
What a waste of grant money... (Score:2, Insightful)
They cannot be used for ANY result that is later used by anything else -- after all, calculations based on bad data yield bad results.
Next thing you know, some offshore manufacturer will use the "imprecise" (cheaper) chips instead of the "accurate" ones, and simple things we depend on everyday will fail in wonky ways.
A bit-flip on a microwave will make a 30-second timer not expire at 0, and
Re: (Score:3, Insightful)
Any system which fails permanently due to a single bit error is unstable and not robust (in the numeric sense). If the system is really critical, you should better be ready for bit errors.
This approach is basically similar to what would be required in analog systems; after all, analog engineering was quite possible. Most of the meat of decoding MP3 is not seeking in the stream, it's a lot of Fourier transforms and postprocessing of the waveform -- loss there can be completely acceptable.
19 times out of 20 (Score:2)
Don't forget that probability is a curve. You can design it so most of the time the lost accuracy lands where you don't care about it (the cents). But if a million people are banking, then probability says some of them will have significant errors (in the thousands column).
This is not good for
a) firing a missile
b) driving a car
c) designing a bridge or building
d) my bank balance
e) general computing
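The scaling worry above is easy to quantify (the per-transaction probability below is a made-up illustration, not a measured figure for this chip):

```python
def prob_at_least_one(p: float, n: int) -> float:
    """Chance that at least one of n independent trials suffers a rare error."""
    return 1.0 - (1.0 - p) ** n

# Even a one-in-a-billion high-order error per transaction adds up:
print(prob_at_least_one(1e-9, 1_000_000))      # ~0.001
print(prob_at_least_one(1e-9, 1_000_000_000))  # ~0.63
```

A failure rate that is negligible per calculation becomes near-certain once you run enough of them, which is exactly why the banking column matters.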
The first thing that comes to mind... (Score:4, Insightful)
...is gaming applications.
Programmers spend a lot of time coming up with algorithms that simulate randomness for AI or cloud generation or landscapes or whatever... if the processor was just wonky to begin with it'd make certain things a lot more natural looking.
It's interesting that AI in games is always touted as being "ultra-realistic", but always ends up being insanely easy to trip up. Having something "close enough" would add just enough realism/randomness to situations to perhaps make games and environment more dynamic.
I wouldn't want these things processing my bank balance, though, unless it rounded up.
A Little Bit of History Repeating (Score:2, Interesting)
So basically he's advocating fuzzy logic [wikipedia.org], which was big in AI research in the 80's?
Re: (Score:3, Insightful)
Well, it's a *little* more important (Score:3, Interesting)
For example, in calculating a bank balance of $13,000.81, deliberately risking getting the "13" incorrect is fraud that risks $13,000 in damages and $1,000,000 in statutory penalties, and risking getting the "81" incorrect is fraud that only risks $0.81 in damages and $1,000,000 in statutory penalties. Surely saving a couple watt-microseconds is worth that!
Bad example... (Score:4, Insightful)
It looks like you got sucked into bad-example land. Later on, the article mentions that it's intended for stuff where accuracy isn't paramount: multimedia applications rather than space missions or bank calculations.
I mean, there are 1.764 million pixels in the screen I'm typing this post on at the moment. Does it really introduce much error if I round that to 1.8M? I'm also running at 60Hz. Do you think I'd really notice if there's a .01% chance that instead of getting white (255) I get white (254)? That'd be an average of 176 pixels a refresh, assuming an all-white screen. Thing is, those pixels wouldn't be the same ones every time. Then again, logically each pixel would tend towards red/green/blue depending on the error, but only slightly. In an HD movie, are you really going to notice?
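That back-of-the-envelope can be simulated directly (screen size and error rate taken from the parent's figures):

```python
import random

WIDTH, HEIGHT = 1680, 1050   # ~1.764 million pixels
ERROR_RATE = 0.0001          # the parent's 0.01% chance per pixel

def corrupt_frame(frame):
    """Knock affected pixels from 255 down to 254 -- a one-step error."""
    return [254 if random.random() < ERROR_RATE else v for v in frame]

random.seed(42)
frame = corrupt_frame([255] * (WIDTH * HEIGHT))
errors = sum(1 for v in frame if v != 255)
print(errors)  # roughly 176 wrong pixels, and different ones every refresh
```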
hopeless article (Score:2)
No information in TFA whatsoever, only a rather bad example.
Probabilistic algorithms are very useful in computer science, and introducing uncertainty at the hardware level might be very interesting for some applications. But how did they implement this? How do they save power? How are they faster than normal processors?
Bah. I hate non-articles like this.
So, to get it right... (Score:2)
... they would need to use a crowd of these processors and some kind of "wisdom of crowds" algorithm to figure out which of the output values is good.
So, in rough figures: if 30 processors are enough to get a good, reliable answer from the 'crowd' of processors, and the overhead of the "wisdom of crowds" algorithm is less than 14%, then maybe we have a system that uses the same power and is about 6x as fast, but with no power savings.
If a less good answer is acceptable, then maybe only a few processors are necessary
Obligatory NASA research (Score:5, Funny)
Sooo... (Score:3, Insightful)
He's invented analog?
Analog? (Score:3, Insightful)
Re: (Score:3, Funny)
Transistors are naively analog
Oh those simple-minded transistors. When will they learn?
Early Pentium 60 fiasco (Score:3, Interesting)
You probably remember when some scientists noticed it was generating math errors in their applications, so they contacted Intel about it.
Intel's response was, "Well, these are early processors in design and this shouldn't affect 99% of the population." Of course the scientists created awareness of the issue and the public raised a stink about it, even though 99% of them might never run into the bug in their applications.
So now, 15+ years later, this pops up saying it is OK to have these errors in non-critical applications for the sake of speed. People are going to wonder about these chips in their products when something isn't working right, and may cry foul.
Article badly misrepresents the idea (Score:5, Informative)
The author of the linked article has completely misunderstood what this research is about. It is NOT about tolerating errors in the output of computations; that would be completely infeasible. It's about tolerating errors in intermediate values, by using redundancy. For example, three adders made out of unreliable transistors plus a control unit to have them vote, may be smaller and use less power than one adder made out of reliable transistors. However, you can't make everything out of unreliable transistors. In particular, the control unit, and the parts that compare results to each other, have to work reliably and can't be duplicated. That is what is meant by "some information was more valuable than other information", not the low-order bits of a numeric computation.
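A toy software sketch of that redundancy-plus-voting idea (the error model and rates below are illustrative; the real mechanism is at the transistor level):

```python
import random

def unreliable_add(a: int, b: int, err_rate: float = 0.01) -> int:
    """Model an adder built from noisy transistors: occasionally a bit flips."""
    r = a + b
    if random.random() < err_rate:
        r ^= 1 << random.randrange(8)  # flip one of the low 8 bits
    return r

def voted_add(a: int, b: int) -> int:
    """Three unreliable adders plus a reliable voter: majority answer wins."""
    results = [unreliable_add(a, b) for _ in range(3)]
    for r in results:
        if results.count(r) >= 2:
            return r
    return results[0]  # no majority at all -- rare; pick one arbitrarily

random.seed(7)
trials = 10_000
correct = sum(voted_add(40, 2) == 42 for _ in range(trials))
print(correct / trials)  # voting drives the ~1% per-adder error rate way down
```

Note that the voter itself must be built reliably, which is the point the comment makes: only some of the hardware can be allowed to be sloppy.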
And this is a major step forward in the research (Score:3, Insightful)
.... of artificial intelligence.
Already here (Score:3, Interesting)
We already have accuracy issues with the processors available today. I've worked with quant teams who will insist on only having machines with Intel processors in their compute farm, because they get a different result from the same code running on AMD machines. As the business has signed off the Intel numbers, Intel it is.
And nonsensical applications again.... (Score:3, Interesting)
In order to stream audio or video meaningfully, you need to play/display the contents. This has an energy consumption high enough that energy savings in the computations will not matter at all before long. In addition, such a chip will always be very special-purpose, and most IT experts will not be able to program it. I call this a dead end.
Proof at last... (Score:3, Funny)
... that the economy is now based on Monopoly money.
Now when you log on to your online banking account, you'll get a Chance card:
Bank errors are in your favor... at the moment.
Re: (Score:2)
But think of the power savings! :P
Honestly, this sounds more like a DSP replacement. Most computers need exact math, but some don't.
Re: (Score:2)
Re: (Score:2)
well, lets argue it a bit anyway, for fun.
If fractions are rounded up and down randomly, I mean with an equal chance, then there should be no overall loss or gain, in the long run. Of course, there might be outliers, some accounts might get shorted a bit more, some might get lucky, some people may try to game the system. Sounds just like real life. It wouldn't make such a big difference.
Still, you're right, nobody would accept such a system, because we all like the illusion of control.
Re: (Score:2)
And if you want to argue this, I suggest you go talk to the bean counters where you work.
Bean counters won't use this (at least I hope not). Telecom engineers will.
Re:uhhh.... (Score:5, Informative)
NEWS FLASH: Binary consists of 1 and 0.
NEWS FLASH: People use computers for calculations with more than single-digit binary results.
"probablistic computing" is another way of saying "sloppy engineering".
No- insisting on excessive precision where an "almost certainly right to within +/- x%" solution would be more than good enough and much simpler to obtain is known as overengineering.
I suspect that the financial examples chosen didn't illustrate the point as well as intended (financial companies generally don't like *any* inaccuracy), but that doesn't change the general principle.
Would you prefer a routing algorithm that gobbled up power and took ages to run for a guaranteed shortest route or one that was far more efficient and 99.9% certain to give a route that was within 3% of the shortest possible distance?
Re: (Score:3, Insightful)
You say what you say, but for the life of me, I have never felt concerned with the supposed quality of audio. I just can't tell the difference between 128kbps and 256kbps, MP3 from FLAC, 22khz from 44khz.
I also cannot tell the difference between HD and upscaled SD when in motion.
It just feels so ridiculous; I swear people are only fooling themselves with this high-def nonsense. I can't be sure of that, though; maybe there really is something I'm missing.