Sacrificing Accuracy For Speed and Efficiency In Processors 499

Skudd writes "Modern computing has always been reliant on accuracy and correct answers. Now, a professor at Rice University in Houston posits that some future applications could be revolutionized by 'probabilistic computing.' Quoting: 'This afternoon, Krishna Palem, speaking at a computer science meeting in San Francisco, will announce results of the first real-world test of his probabilistic computer chip: The chip, which thrives on random errors, ran seven times faster than today's best technology while using just 1/30th the electricity. ... The high density of transistors on existing chips also leads to a lot of background "noise." To compensate, engineers increase the voltage applied to computer circuits to overpower the noise and ensure precise calculations. Palem began wondering how much a slight reduction in the quality of calculations might improve speed and save energy. He soon realized that some information was more valuable than other information. For example, in calculating a bank balance of $13,000.81, getting the "13" correct is much more important than the "81." Producing an answer of $13,000.57 is much closer to being correct than $57,000.81. While Palem's technology may not have a future in calculating missions to Mars, it probably has one in such applications as streaming music and video on mobile devices, he said.'
  • by johnny cashed ( 590023 ) on Sunday February 08, 2009 @02:13PM (#26773965) Homepage
    And $81,000.31 is a much more correct answer!
    • Re:Bank balance (Score:5, Insightful)

      by CastrTroy ( 595695 ) on Sunday February 08, 2009 @02:17PM (#26774003) Homepage
      I agree. The whole problem with the example given in the summary is that your bank balance should never be wrong; there is no room for error in calculating bank balances. I also don't want to hear skips and pops in my music because someone thought it would be more energy efficient to use a processor that produces errors. I already get 26 hours of charge out of my MP3 player; I'd rather they focus on delivering more storage for less money so I can carry lossless audio on my portable player.
      • Re:Bank balance (Score:5, Insightful)

        by BadAnalogyGuy ( 945258 ) <> on Sunday February 08, 2009 @02:21PM (#26774039)

        If you are listening to music on a portable media device, it's safe to say that you aren't going to be able to hear the difference between the lossy format and the lossless format.

        It's like drinking from a well. Connoisseurs may claim to be able to taste the difference between it and tap water, but that's just the extra tang from all the bull shit.

        • I expect financial calculations to be accurate to the penny, or even calculated to the third or fourth decimal place, then rounded to the nearest 2nd decimal place.

          But I agree, audio & video playback and other things are different. An occasional error on the 15th or 16th bit isn't going to be audible in real-world portable circumstances.

          • Re:Bank balance (Score:5, Interesting)

            by phoenix321 ( 734987 ) * on Sunday February 08, 2009 @02:51PM (#26774409)

            High accuracy is required for encoding music and video, though.

            Maybe we could have a selective accuracy, where programmers can set their needs via registers or direct it to different CPU cores. Accurate cores for banking transactions, AV-stream encoding and 2D GUI operations while inaccurate cores are used for AV-stream decoding, and computer game 3d drawing and AI decisions.

            There's a whole lot we are calculating now without the need for more than 3 significant digits - and a whole bunch where we intentionally use random numbers, sometimes even with strong hardware entropy gathering.

            These are all cases where we could just scrap the accuracy for faster processing or longer battery times. No one cares about single bit errors in portable audio decoders or in high fps 3d gaming.
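A user-land sketch of that selective-accuracy idea: route money through exact arithmetic and tolerant workloads through a deliberately error-prone path. All names and the error model here are hypothetical, purely to illustrate the split between "accurate" and "inaccurate" cores:

```python
import random
from decimal import Decimal

def exact_add(a: str, b: str) -> Decimal:
    # "accurate core": exact decimal arithmetic for banking
    return Decimal(a) + Decimal(b)

def sloppy_add(a: float, b: float, err_prob: float = 1e-3) -> float:
    # "inaccurate core": a stand-in for a low-voltage adder that
    # occasionally perturbs a low-order bit of the result
    r = a + b
    if random.random() < err_prob:
        r += r * 2.0 ** -20   # tiny relative error, low-order only
    return r
```

A program would then dispatch bank balances to `exact_add` and decoded audio samples to `sloppy_add`, which is exactly the per-task routing the parent describes.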

            • Maybe we could have a selective accuracy, where programmers can set their needs via registers or direct it to different CPU cores ... just keep using (say, in C) short, int, long, long long (no, AV codecs should not require floating point, but if you wish, there are floats, doubles, long doubles, etc.).

              Of course proper implementation of some AV decoder on a modern processor will use available SIMD instructions (MMX and friends) where programmer can easily trade off accuracy (in byte-sized chunks) for speed
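To put numbers on that byte-sized trade-off: quantising a full-scale sine costs roughly 6 dB of signal-to-noise ratio per bit dropped, so 8-bit processing sits around 50 dB versus roughly 98 dB for 16-bit. A self-contained sketch using only the standard library:

```python
import math

N = 44_100
tone = [math.sin(2 * math.pi * 440 * i / N) for i in range(N)]  # a 440 Hz sine

def quantize(samples, bits):
    # round each sample to a signed integer grid of the given width
    scale = 2 ** (bits - 1) - 1
    return [round(x * scale) / scale for x in samples]

def snr_db(ref, approx):
    # signal power over quantisation-noise power, in decibels
    sig = sum(r * r for r in ref)
    noise = sum((r - a) ** 2 for r, a in zip(ref, approx))
    return 10 * math.log10(sig / noise)
```

Comparing `snr_db(tone, quantize(tone, 16))` against `snr_db(tone, quantize(tone, 8))` shows the roughly 48 dB gap, which is the "accuracy in byte-sized chunks" knob the parent mentions.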

            • Re:Bank balance (Score:4, Insightful)

              by memco ( 721915 ) on Sunday February 08, 2009 @03:48PM (#26775035) Journal
              Ok, so not only do I have to give up efficiency in the chip itself, but now my efficiency suffers because I now have to determine which chips are useful for which applications. I don't want to have to start thinking about whether or not I plan to use my new laptop for anything requiring accuracy greater than such and such a percentage. I suppose this might be effective for niche markets, but it seems messy if you try to make it part of all computing platforms.
              • Re: (Score:3, Insightful)

                by ozphx ( 1061292 )

                You wha?

                I think hardware and software designers already have that covered when they perform processing on different parts of your system, such as your CPU, or GPU...

                Specialization is a good thing, unless you have a preference for the performance of the directx reference rasterizer....

          • Re:Bank balance (Score:5, Insightful)

            by Hojima ( 1228978 ) on Sunday February 08, 2009 @02:56PM (#26774467)

            There are countless applications for a computer that don't depend on accuracy, but do depend on speed. For example: gaming, stock analysis, scientific/mathematical research, etc. Just about every use for the computer can benefit from this. Bear in mind these applications can take the hit of inaccuracy, if not benefit from it, depending on the situation. Yes, there are some instances where accuracy is crucial, but that's why they will continue to make both kinds of processors. It's what they call a free market, and there will always be a new niche to fill.

            • Re: (Score:3, Interesting)

              Obviously this is the potential argument for these chips, but most of these systems already face a trade-off between speed and accuracy: most numerical mathematical methods gain speed only by giving up precision. It's pointless to gain speed in hardware if it pulls your accuracy down more than simply reducing the complexity of your algorithms would.

              Given that, I would expect this hardware - if it proves useful - would primarily be in the "entertainment" sector of the market. Of course making this judgment

            • Re: (Score:3, Interesting)

              by mdarksbane ( 587589 )

              There are a few areas where games care a little bit about accuracy. You've got to be really careful about it in any kind of flight game with playing fields of more than a few miles in size. It's amazing the kind of graphical artifacts you can get if you don't take the error in floating points into account when you handle that sort of thing. In our first naive implementation of the engine, at around 50 kilometers out every character would jitter constantly every time they moved because of the last floating p
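That jitter falls straight out of single-precision spacing: near 50 km from the origin, adjacent float32 values are about 4 mm apart, so any motion finer than that snaps to the grid. A small sketch that measures the gap by bit manipulation (no external libraries):

```python
import struct

def ulp32(x: float) -> float:
    # gap between x and the next representable 32-bit float:
    # reinterpret the float's bits as an integer, bump it by one,
    # and reinterpret back
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    nxt = struct.unpack('<f', struct.pack('<I', bits + 1))[0]
    return nxt - x

# at 50 km from the origin, float32 positions snap to a ~3.9 mm grid
print(ulp32(50_000.0))   # 0.00390625
```

This is why large worlds typically recenter coordinates around the camera or use doubles for world positions.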

          • Re: (Score:3, Insightful)

            by shaitand ( 626655 )

            Don't forget there are errors in the hardware processes anyway and error correction algorithms running on the software side that take care of them.

        • Re:Bank balance (Score:5, Informative)

          by shaitand ( 626655 ) on Sunday February 08, 2009 @05:32PM (#26776191) Journal

          'It's like drinking from a well. Connoisseurs may claim to be able to taste the difference between it and tap water, but that's just the extra tang from all the bull shit.'

          Probably not the best example. Humans have an amazing ability to taste very minute differences in water. My TDS meter tells me the tap water here is extremely pure to begin with, but I can pick out the same water after carbon and RO filtering versus straight tap water in a blind taste test with 100% accuracy. I'm certainly no connoisseur.

          Actually, I'm from rural Illinois, and all the water there, be it tap or a properly maintained well, is fairly sweet with minimal filtering. The streams are a bit muddy tasting, but the water itself is sweet as it flows. It definitely beats this Florida swamp water. I tasted unfiltered Florida well water once (most Florida wells have filters built in) and I vomited. The tap water here won't make you sick and isn't that nasty, but it still tastes funky.

          That said, I doubt I could tell the difference between tap, well, Illinois, or Florida water that has had that additional filtering (Carbon and Reverse Osmosis, any of those machines for $0.39/gallon at the grocery store will do). My TDS meter shows a difference in purity even from one dispensing machine to the next, but I can't taste that difference. Whatever minerals survive that process are probably pretty much the same anywhere and taste good. That filtered water tastes better than any of the unfiltered waters.

      • Re: (Score:3, Interesting)

        I disagree. Cents is already an arbitrary cut off for the calculations' accuracy, why not just cut it off at dollars? I do my budget religiously every night and go over everything with my wife a couple of times a month. If every transaction were rounded up to the nearest dollar, it wouldn't destroy my finances. I doubt it would seriously mess up my finances if they were off by $5. Now, if I were able to give up efficiency and accuracy in my financial calculations for something else that I considered valuabl
        • by Sancho ( 17056 ) *

          Cutting off at the cents place isn't arbitrary--it's done because the cent is the smallest unit of currency produced by the US.

      • Re: (Score:3, Interesting)

        by Tuoqui ( 1091447 )

        Well, the odds are you probably wouldn't even notice if a few bits here and there were wrong in your audio stream. I'm not sure what the error rate is, but if it's less than 17 times as much as we have now, it'd be worth considering for some applications.

        I figure if they do use this technology they'd more than likely use the multi-core system currently in place and make one a high accuracy CPU while the other 2-4 cores high speed CPUs. Like someone said it'd be used for gaming and streaming video/audio where 'ac

    • Primality testing (Score:3, Informative)

      by 2.7182 ( 819680 )
      Random algorithms are used all the time. In RSA for example, random primes must be generated. This is done with an algorithm that probably gives the right answer, which is good enough. The chance that it would fail is so tiny as to not matter.
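The textbook example is the Miller-Rabin test: each random base that fails to expose a composite cuts the error probability by at least a factor of four, so a few dozen rounds push it far below any hardware failure rate. A sketch (not constant-time, so a demonstration only, not production crypto):

```python
import random

def is_probably_prime(n: int, rounds: int = 20) -> bool:
    if n < 2:
        return False
    # quick trial division by the first few primes
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n-1 as d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # a is a witness: n is definitely composite
    return True            # probably prime: error < 4^(-rounds)
```

A composite slipping through 20 rounds has probability under 4^-20, roughly one in 10^12, which is exactly the "so tiny as to not matter" regime the parent describes.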
      • Re: (Score:3, Interesting)

        by phoenix321 ( 734987 ) *

        Even more so: we intentionally gather entropy to improve the pseudo random numbers. With intentionally inaccurate CPU cores, we could scrap all that and gather entropy en-passant AND be much faster anyway.

      • Re: (Score:3, Insightful)

        by Dahamma ( 304068 )

        Yeah, but there is a big difference between "random" and "incorrect".

        The errors resulting from undesirable interactions between transistors are probably a lot less random than a good pseudorandom number generator for these purposes.

      • Re: (Score:3, Insightful)

        by Hognoxious ( 631665 )

        Random algorithms are used all the time.

        I'm not even sure such things exist. Even if you meant pseudorandom, that still has zero to do with the point under discussion.

    • For example, in calculating a bank balance of $13,000.81, getting the "13" correct is much more important than the "81."

      Not to an accountant it isn't.

      Besides, to take the example further: if getting the $800 right is much more important than the rest of the $800,000,000,000, does it mean no-one will care if my account suddenly goes from $2000 to $20,000?

  • by rob1980 ( 941751 ) on Sunday February 08, 2009 @02:14PM (#26773973)
    Q: Why didn't Intel call the Pentium the 586?
    A: Because they added 486 and 100 on the first Pentium and got 585.999983605.
  • by onion2k ( 203094 ) * on Sunday February 08, 2009 @02:15PM (#26773981) Homepage

    Accuracy with financial calculations is extremely important. Hasn't this guy ever watched Superman 3?

    • Re: (Score:3, Informative)

      by Briareos ( 21163 ) *

      Hasn't this guy ever watched Superman 3?

      Maybe he just watched Office Space and missed the whole Superman 3 reference?

      np: Fennesz - Vacuum (Black Sea)

    • I'll note that he was making the point that the thousands part is far more important than the cents part.

      Basically, it sounds like he dumbed down the answer too much. Of course the cents are important in your bank account. And, more importantly, it's fairly trivial for us to KEEP that accuracy.

      Still, when you start expanding it to, say, a company's balance on the books, you tend to get errors. Think of it like a warehouse inventory - every time you do an inventory, there's a chance that somebody will cou

      • Quite correct, the thousands are far more important than the cents. However, 13,810.00 is really close to 13,000.81, right? It has all the same digits in a similar order.

        Banks calculate out to the ten-thousandths place simply because you need that much to absorb rounding errors. Heck, in my business most items are priced out to the ten-thousandths place, giving pricing like $121.3456 for a cost.

        • by Firethorn ( 177587 ) on Sunday February 08, 2009 @03:41PM (#26774957) Homepage Journal

          Quite correct, the thousands are far more important than the cents. However, 13,810.00 is really close to 13,000.81, right? It has all the same digits in a similar order.

          Not by the fuzzy logic the guy's using. He's going for scientific accuracy, i.e. 13,000.81 (±.001%). It's just our brains, which compare symbols, that would consider those numbers 'close'.

          In which case a $810 error in a $13k account is a big friggen error, and would violate the standards of the chip he's working on. Now, I don't know HOW he's making sure high order bits are done more accurately than the low order ones, but that's what the article mentions him doing.

  • So what you're saying is that it might make all my MP3s sound like they are AutoTuned? But the battery will last 30 times longer?

    I guess the question is can Cher sue over this technology?
  • Didn't Intel already implement this technology in the original Pentium's FPU? Didn't seem very desirable to me...
  • wll, (Score:5, Funny)

    by greenguy ( 162630 ) <<estebandido> <at> <>> on Sunday February 08, 2009 @02:21PM (#26774033) Homepage Journal

    i scrfcd accrc 4 spd a lng tm ago

    • Re:wll, (Score:5, Interesting)

      by hackstraw ( 262471 ) on Sunday February 08, 2009 @03:26PM (#26774797)

      This whole thing is old and silly.

      Seymour Cray is known for asking "Do you want fast or accurate?" Back then there was no IEEE 754 floating-point spec (which is itself not infinitely precise), machines were pretty primitive, and his machines used Newton-style approximations for many numeric calculations that were accurate only to a point, much like the fast inverse square root John Carmack popularized (in software) in Quake III.

      The moral of the story is that in 2009 and beyond it's probably best for hardware to stay accurate. This is why we compute with digital 1s and 0s instead of some other base.

      Now, in software, feel free to make things as sloppy as you want. If your bank (not mine) wants to round 13,000.83 to some other value, then by all means go for it. But I think that most of us are OK with accurate hardware.
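For reference, the software trick alluded to above, famous from the Quake III source, reinterprets a float's bits as an integer to get a cheap first guess at 1/sqrt(x), then polishes it with one Newton step. A Python rendition using struct for the bit-cast:

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    # reinterpret the single-precision bits of x as an integer
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    i = 0x5F3759DF - (i >> 1)          # the famous magic-constant hack
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # one Newton-Raphson iteration brings the error under ~0.2%
    return y * (1.5 - 0.5 * x * y * y)
```

It is a neat illustration of the thread's theme: a deliberately approximate answer, produced much more cheaply than the exact one, is good enough for graphics.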

    • Re:wll, (Score:5, Funny)

      by vux984 ( 928602 ) on Sunday February 08, 2009 @04:14PM (#26775327)

      i scrfcd accrc 4 spd a lng tm ago

      and it was going so well too... until you got thirsty and told your friend ..

      "hy! I wnt sm ck!"

  • by AmigaHeretic ( 991368 ) on Sunday February 08, 2009 @02:22PM (#26774057) Journal

    9.9999973251 - It's a FLAW, Dammit, not a Bug

    8.9999163362 - It's the new math

    7.9999414610 - Nearly 300 Correct Opcodes

    6.9999831538 - "You Don't Need to Know What's Inside" (tm)

    5.9999835137 - Redefining the PC -- and Mathematics As Well

    4.9999999021 - We Fixed It, Really

    3.9998245917 - Division Considered Harmful

    2.9991523619 - Why Do You Think They Call It *Floating* Point?

    1.9999103517 - We're Looking for a Few Good Flaws

    0.9999999998 - "The Errata Inside" (tm)
  • by rolfwind ( 528248 ) on Sunday February 08, 2009 @02:23PM (#26774071)

    For critical things like financial calculations, you compare the end product with the result of running the same calculation through the chip again, or through another chip (or, increasingly likely, another core).

    Still would be faster too.

    • Repeating the process is not going to improve the results.

      Suppose you repeat your calculation 3 times. How will you know that the result of comparing the three results with one another is correct?

      What if the only answer you can obtain is "A equals B with a probability of 0.9998"? Recursively repeat this comparison and then compare the results? :)

  • DSP's? (Score:3, Insightful)

    by Zantetsuken ( 935350 ) on Sunday February 08, 2009 @02:24PM (#26774075) Homepage
    Isn't that the point of using a DSP? So you can use a slower CPU to run the firmware and the DSP do that grunt work of decoding, thus letting you save power with the low voltage CPU?

    My question is, if it's just as well to use a DSP, why not just use a damned DSP?
    • Re:DSP's? (Score:5, Insightful)

      by Firethorn ( 177587 ) on Sunday February 08, 2009 @02:52PM (#26774425) Homepage Journal

      I think the point would be using a DSP/math coprocessor that uses 1/30th the power in exchange for a .001% loss in accuracy for non-essential tasks like music decoding.

      I mean, combined with the lousy earbuds most people use, who'd notice? Especially if it makes their MP3 player last 3 times as long as ones that use more traditional and technically accurate DSP/decoder?

      • Re: (Score:3, Informative)

        by Heather D ( 1279828 )
        Even with good earbuds this would probably not be noticeable. This will have a huge impact upon DSP systems if it pans out. It could have many other applications as well. Robotics, artificial intelligence, fuzzy logic, neural networks, just to name a few.
  • by chthon ( 580889 ) on Sunday February 08, 2009 @02:26PM (#26774099) Homepage Journal

    If you read the chapter on the history of the IEEE FP standard in Computer Architecture: A Quantitative Approach, you will see that accuracy was already sacrificed for speed in supercomputers in the past.

  • What happens when you add up millions of these inaccurate accounts and end up gaining or losing millions of dollars?

  • gfx (Score:5, Insightful)

    by RiotingPacifist ( 1228016 ) on Sunday February 08, 2009 @02:27PM (#26774117)

    Can't this be used in gfx cards? I mean, with anti-aliasing and high resolutions it doesn't really matter much whether half a pixel is #ffffff or #f8f4f0; hell, you could probably even get a pixel entirely wrong for one frame and nobody would care (as long as it doesn't happen too often).

    • Re: (Score:2, Interesting)

      by retroStick ( 1040570 )

      I agree. In fact, with real-time photorealistic rendering, these slight deviations would probably make frames look more accurate, since real video is full of low-level random noise.
      The film-grain shader on Left 4 Dead wouldn't be necessary any more.

    • Funny you should mention that; as far as I'm aware some graphics cards *do* tolerate these sorts of minor glitches. IIRC, I heard this in connection with a /. article that discussed using GFX chips as general-use processors.
    • by FlyByPC ( 841016 )

      hell you can probably even get a pixel entirely wrong for one frame and nobody will care (as long as it doesn't happen too often).

      You must be new enough here to have never been fragged by a shock spell across a dark room. Flashes of light -- even small ones -- are important, at least in Oblivion.

  • Multimedia processor (Score:3, Informative)

    by Gerald ( 9696 ) on Sunday February 08, 2009 @02:27PM (#26774121) Homepage

    If you're about to join the upcoming avalanche of smartass comments, try reading the UDP-Lite RFC [] first. For some applications (notably real-time voice and video), timeliness and efficiency are more important than accuracy.

    If this means my music player or phone get more battery life, I'm all for it.

    • by ndogg ( 158021 )

      I'm with you there. I want to listen to my music for weeks without having to plug it in at all, especially if I'm camping.

    • by mbone ( 558574 )

      Another appropriate comparison would be UDP with Forward Error Correction (FEC), which is a probabilistically guaranteed delivery mechanism (i.e., the data will probably arrive in full, but there is no guarantee) for times when accepting that risk is better than requiring delivery.

  • by LiquidCoooled ( 634315 ) on Sunday February 08, 2009 @02:32PM (#26774177) Homepage Journal

    I have spent the last 9 months coding up a dynamic scalable UI for the nokia tablets.

    I have had to make huge compromises to accuracy to obtain the desired performance.

    I had the choice of using the full-featured (but slow) widget sets and graphical primitives that already existed, or finding a way to make it work as I expected it to.

    The results have left people breathless :)

    take a look here: []

  • by Anonymous Coward

    I'd like to see executives at CBS explain how nipples showed ON TOP of a superbowl performer's outfit.

    Talk about a wardrobe malfunction.

    I can see the defense now:

    Your honor: We ran probabilistic tests with our processors, and while we couldn't really duplicate the problem, we were able to show a penis during one test run. We'd really like to show it to you, but Ms. Jackson has stated that she would, quote, "Sue us into the ground," unquote.

  • by chill ( 34294 ) on Sunday February 08, 2009 @02:33PM (#26774197) Journal

    Isn't that essentially what JPEG, MPEG and every other lossy codec or transform does?

  • Games? (Score:4, Insightful)

    by ndogg ( 158021 ) <the,rhorn&gmail,com> on Sunday February 08, 2009 @02:40PM (#26774267) Homepage Journal

    I could definitely foresee this being used in game systems, especially for graphics.

    As long as it mostly looks right, that's all that really matters.

  • So, what, future computers may come with a big sticker :

    WARNING : Should not be used in life-critical calculations.

    On the other hand, if the errors are really rare and random, and he can make chips 7 times faster at 1/30th the power drain, then you could array 3 such chips at 7 times the speed and 1/10 the power usage, and do your computations by majority vote.
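That majority-vote arrangement is classic triple modular redundancy. A quick simulation (the bit-flip error model is entirely made up) shows how voting crushes the error rate:

```python
import random

def noisy_add(a: int, b: int, bit_error: float = 0.01) -> int:
    # stand-in for an unreliable adder: each result bit may flip
    r = a + b
    for bit in range(max(r.bit_length(), 1)):
        if random.random() < bit_error:
            r ^= 1 << bit
    return r

def voted_add(a: int, b: int) -> int:
    # run three unreliable adders and take the bitwise majority
    x, y, z = noisy_add(a, b), noisy_add(a, b), noisy_add(a, b)
    return (x & y) | (y & z) | (x & z)

trials = 2_000
raw_errs = sum(noisy_add(20, 11) != 31 for _ in range(trials))
voted_errs = sum(voted_add(20, 11) != 31 for _ in range(trials))
```

With a per-bit flip chance p, the voted per-bit error is about 3p^2, so a 1% raw error rate drops to roughly 0.03% after voting, which is the intuition behind running three cheap chips and comparing.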

  • Wow... soon, I will be able to comprehend music and video at 7x the standard playing speed! Wonderful!
  • by Celc ( 1471887 )
    I'm sure my bank would love to argue that they at least got the "13" right as they skim a penny off every transaction. I'm sure this is the most awesome thing since sliced bread, but can we please avoid trying to argue it from the standpoint of people's bank accounts when it introduces random error?
  • Processors that provide different output for the same input cannot be used for anything that wants predictable output.

    They cannot be used for ANY result that is later used by anything else -- after all, calculations based on bad data yield bad results.

    Next thing you know, some offshore manufacturer will use the "imprecise" (cheaper) chips instead of the "accurate" ones, and simple things we depend on everyday will fail in wonky ways.

    A bit-flip on a microwave will make a 30-second timer not expire at 0, and

    • Re: (Score:3, Insightful)

      by cnettel ( 836611 )

      Any system that fails permanently due to a single bit error is unstable and not robust (in the numeric sense). If the system is really critical, you had better be ready for bit errors anyway.

      This approach is basically similar to what would be required in analog systems, and after all, analog engineering was quite possible. The meat of decoding MP3 is not seeking in the stream; it's a lot of Fourier transforms and postprocessing of the waveform, and loss there can be completely acceptable.

  • Don't forget the probability curve. You can design it so that most of the time the lost accuracy lands where you don't care about it (the cents). But if a million people are banking, probability says some of them will have significant errors (in the thousands column).

    This is not good for
    a) firing a missile
    b) driving a car
    c) designing a bridge or building
    d) my bank balance

    General computing.
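The parent's worry is just expectation at scale: multiply a tiny per-operation error probability by a huge operation count and the expected number of bad results stops being negligible. Every number below is invented purely for illustration:

```python
# hypothetical: one balance update in a billion corrupts a high-order digit
p_big_error = 1e-9
updates_per_day = 50_000_000      # made-up daily volume for a large bank

expected_per_day = p_big_error * updates_per_day
expected_per_year = expected_per_day * 365
print(expected_per_year)          # 18.25 badly wrong balances a year
```

Even a one-in-a-billion chance of a "thousands column" error produces a steady trickle of seriously wrong balances at bank scale, which is why the banking example in the summary falls flat.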

  • by Lordfly ( 590616 ) on Sunday February 08, 2009 @02:52PM (#26774423) Homepage Journal

    ... gaming applications.

    Programmers spend a lot of time coming up with algorithms that simulate randomness for AI or cloud generation or landscapes or whatever... if the processor was just wonky to begin with it'd make certain things a lot more natural looking.

    It's interesting that AI in games is always touted as being "ultra-realistic", but always ends up being insanely easy to trip up. Having something "close enough" would add just enough realism/randomness to situations to perhaps make games and environment more dynamic.

    I wouldn't want these things processing my bank balance, though, unless it rounded up.

  • So basically he's advocating fuzzy logic [], which was big in AI research in the 80's?

    • Re: (Score:3, Insightful)

      by osu-neko ( 2604 )
      Sorry, I missed the part back in the 80's where using fuzzy logic caused my processor to consume 1/30th the power.
  • by roystgnr ( 4015 ) <roystgnr&ticam,utexas,edu> on Sunday February 08, 2009 @02:58PM (#26774481) Homepage

    For example, in calculating a bank balance of $13,000.81, deliberately risking getting the "13" incorrect is fraud that risks $13,000 in damages and $1,000,000 in statutory penalties, and risking getting the "81" incorrect is fraud that only risks $0.81 in damages and $1,000,000 in statutory penalties. Surely saving a couple watt-microseconds is worth that!

    • Bad example... (Score:4, Insightful)

      by Firethorn ( 177587 ) on Sunday February 08, 2009 @03:30PM (#26774843) Homepage Journal

      It looks like you got sucked into bad-example land. Later on, the article mentions that this is intended for tasks where accuracy isn't paramount: multimedia applications rather than space missions or bank calculations.

      I mean, there's 1.764 Million pixels in my screen that I'm typing my post on at the moment. Does it really introduce much error if I round it to 1.8M? I'm also running at 60Hz. Do you think that I'd really notice if there's a .01% chance that instead of getting white(255) I get white(254)? That'd be an average of 176 pixels a refresh, assuming an all-white screen. Thing is, those pixels wouldn't be the same every time. Then again, logically each pixel would tend towards red/blue/yellow depending on the error. But only slightly. In a HD movie, are you really going to notice?
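The arithmetic above checks out, assuming a 1680x1050 panel (which gives exactly 1.764 million pixels):

```python
pixels = 1680 * 1050          # 1,764,000 pixels, as in the parent post
p_err = 0.0001                # the hypothetical 0.01% per-pixel error chance
expected_wrong = pixels * p_err
print(expected_wrong)         # 176.4 wrong pixels per refresh, on average
```

At 60 Hz those ~176 off-by-one pixels land in different places every frame, which is why they would read as faint noise rather than visible defects.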

  • No information in TFA whatsoever, only a rather bad example.

    Probabilistic algorithms are very useful in computer science, and introducing uncertainty at the hardware level might be very interesting for some applications. But how did they implement this? How do they save power? How are they faster than normal processors?

    Bah. I hate non-articles like this.


  • ... they would need to use a crowd of these processors and some kind of "wisdom of crowds" algorithm to figure out which of the output values is good.

    So, in rough figures, if 30 processors are enough to get a reliable answer from the 'crowd' of processors, and the overhead of the "wisdom of crowds" algorithm is less than 14%, then maybe we have a system that uses the same power and is about 6x as fast, but with no power savings.

    If a less good answer is acceptable, then maybe only a few processors are necessar

  • by DirePickle ( 796986 ) on Sunday February 08, 2009 @03:11PM (#26774639)
    From NASA []:

    Computer scientist Arthur Boran was ecstatic. A few minutes earlier, he had programmed a basic mathematical problem into his prototypical Akron I computer. His request was simply, "Give me the sum of every odd number between zero and ten." The computer's quick answer, 157, was unexpected, to say the least. With growing excitement, Boran requested an explanation of the computer's reasoning. The printout read as follows: THE TERM "ODD NUMBER" IS AMBIGUOUS. I THEREFORE CHOOSE TO INTERPRET IT AS MEANING "A NUMBER THAT IS FUNNY LOOKING." USING MY AESTHETIC JUDGEMENT, I PICKED THE NUMBERS 3, 8, AND 147, ADDED THEM UP, AND GOT 157.

    A few moments later there was an addendum: I GUESS I MEANT 158.

    Followed shortly thereafter by: 147 IS MORE THAN 10, ISN'T IT? SORRY.

  • Sooo... (Score:3, Insightful)

    by thetzar ( 30126 ) on Sunday February 08, 2009 @05:16PM (#26776031) Homepage

    He's invented analog?

  • Analog? (Score:3, Insightful)

    by w0mprat ( 1317953 ) on Sunday February 08, 2009 @06:35PM (#26776823)
    Transistors are natively analog. It's ironic that we use them for digital logic locked to a frequency cycle.
  • by Darkk ( 1296127 ) on Sunday February 08, 2009 @07:04PM (#26777183)

    You probably remember the original Pentium's FDIV bug: scientists noticed it was generating math errors in their applications, so they contacted Intel about it.

    Intel's response was, "Well, these are early processors in design, and this shouldn't affect 99% of the population." Of course the scientists raised awareness of the issue, and the public made a stink about it even though 99% of them might never hit the bug in their applications.

    So now, 15+ years later, this pops up saying it's OK to have such errors in non-critical applications for the sake of speed. People are going to wonder about these chips in their products when something isn't working right, and may cry foul.

  • by Jimmy_B ( 129296 ) <slashdot AT jimrandomh DOT org> on Sunday February 08, 2009 @07:28PM (#26777445) Homepage

    The author of the linked article has completely misunderstood what this research is about. It is NOT about tolerating errors in the output of computations; that would be completely infeasible. It's about tolerating errors in intermediate values, by using redundancy. For example, three adders made out of unreliable transistors plus a control unit to have them vote, may be smaller and use less power than one adder made out of reliable transistors. However, you can't make everything out of unreliable transistors. In particular, the control unit, and the parts that compare results to each other, have to work reliably and can't be duplicated. That is what is meant by "some information was more valuable than other information", not the low-order bits of a numeric computation.

  • by 3seas ( 184403 ) on Sunday February 08, 2009 @08:18PM (#26777875) Homepage Journal

    .... of artificial intelligence.

  • Already here (Score:3, Interesting)

    by Builder ( 103701 ) on Monday February 09, 2009 @03:24AM (#26780545)

    We already have accuracy issues with the processors available today. I've worked with quant teams who will insist on only having machines with Intel processors in their compute farm, because they get a different result from the same code running on AMD machines. As the business has signed off the Intel numbers, Intel it is.

  • by gweihir ( 88907 ) on Monday February 09, 2009 @04:07AM (#26780679)

    In order to stream audio or video meaningfully, you need to play/display the contents. That consumes enough energy that savings in the computation will hardly matter. In addition, such a chip will always be very special-purpose, and most IT experts will not be able to program it. I call this a dead end.

  • by Roadkills-R-Us ( 122219 ) on Monday February 09, 2009 @01:58PM (#26786459) Homepage

    ... that the economy is now based on Monopoly money.

    Now when you log on to your online banking account, you'll get a Chance card:

    Bank errors are in your favor... at the moment.
