IBM Releases Open Source Machine Learning Compiler

sheepweevil writes "IBM just released Milepost GCC, 'the world's first open source machine learning compiler.' The compiler analyses the software and determines which code optimizations will be most effective during compilation using machine learning techniques. Experiments carried out with the compiler achieved an average 18% performance improvement. The compiler is expected to significantly reduce time-to-market of new software, because lengthy manual optimization can now be carried out by the compiler. A new code tuning website has been launched to coincide with the compiler release. The website features collaborative performance tuning and sharing of interesting optimization cases."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by contr0l ( 1590249 ) on Friday July 03, 2009 @01:52AM (#28568707)
    I'm not a programmer at all, but have dabbled in a few different languages, as I find programming very interesting. (Got pretty good at mirc scripting when I was younger, which lead to visual basic, C++, and now C# dballing that nvr leads to anything). This said, I have a basic knowledge of programming in general. My question is, What things can a compiler do to your code to 'optimize' it for you? I would think majority of any good optimizations might require rethinking whole methods of doing things and/or recoding chunks of code. If the compiler tries to do this, wouldn't it likely screw your code up? Or how would it know 'what' your really trying to do? Outside of removing comments, can someone please explain other Basic optimization methods, (I say basic, like removing comments - You know that cant screw anything up), that a compiler can do on your code that wont screw it up? Thanks in advance.
  • Re:Oh really? (Score:5, Interesting)

    by lee1026 ( 876806 ) on Friday July 03, 2009 @01:53AM (#28568715)

    The last one is actually quite possible, and indeed is a huge area of compiler research.

  • by Anonymous Coward on Friday July 03, 2009 @02:17AM (#28568827)
    Classic examples:
    • Replace a mod (e.g. x % 32) with a bitwise-and (e.g. x & 31) when the divisor is a power of two. Nearly every compiler does this now, but twenty years ago it was a common manual optimization trick.
    • Replace a branch with an arithmetic operation that yields the same result.

      i < 0 ? -i : i

      vs

      cdq
      xor eax, edx
      sub eax, edx

  • by TheRaven64 ( 641858 ) on Friday July 03, 2009 @05:10AM (#28569551) Journal

    Another very similar one, and one that comes up more commonly, is the replacement of a multiplication or division by a constant by a series of additions, subtractions, and bitshifts.

    ARGH! Mod parent down! Please, please, please don't ever repeat this again to people asking things about optimisation. On most modern computers, shifts are slow. They are often even microcoded as multiplications, because they are incredibly rare in code outside segments where someone has decided to 'optimise'. Even when they're not, a typical CPU has more multiply units than shift units and the extra operations needed from the shift and add sequence bloat i-cache usage and cause pipeline stalls by adding adjacent dependencies. The 'optimised' version you describe will almost certainly be slower than the version using the multiply instruction.

    I did some benchmarks of this exact optimisation on a Core 2 Duo a few months back and discovered that in the simplest case the add-and-shift version was as slow as the multiply; in any more complex case it was slower. There's a reason why GCC hasn't done this for some years.

  • Re:Oh really? (Score:4, Interesting)

    by DiegoBravo ( 324012 ) on Friday July 03, 2009 @08:45AM (#28570541) Journal

    Confusing summaries like that one are so frequent that sometimes I have to go RTFA!

    Seriously, the summaries should be subject to moderation too. (I don't know if the firehose thing lets you do that.)

  • Re:Automation... (Score:2, Interesting)

    by tkinnun0 ( 756022 ) on Friday July 03, 2009 @09:02AM (#28570671)

    See idiocracy. Go out and watch it. I'll wait

    The main tenets in Idiocracy were that IQ is hereditary and those with less IQ spend more time procreating. Automation was merely allowing their society to function, barely. IOW, I don't see your point. Can you elaborate, please?

  • Re:Oh really? (Score:3, Interesting)

    by AceJohnny ( 253840 ) <jlargentaye@gmailCOUGAR.com minus cat> on Friday July 03, 2009 @09:08AM (#28570729) Journal

    While the summary is wrong on this subject, I can tell you that, yes, manual optimization is part of our work and can slow down the release of our product. I work in the embedded multimedia field, and if we tell a customer that we will be able to do VGA 30FPS H.264 encode, code optimization on our custom core is going to take some time and effort.

    I think we're going to be very, very interested in this project.
