

Are All Bugs Shallow? Questioning Linus's Law

Posted by kdawson
from the defending-a-shibboleth dept.
root777 writes to point out a provocative blog piece by a Microsoft program manager, questioning one of the almost unquestioned tenets of open source development: that given enough eyeballs, all bugs are shallow. Are they? Shawn Hernan looks at DARPA's Sardonix experiment and the Coverity static-analysis bug discovery program in open source projects to conclude that perhaps not enough eyeballs are in evidence. Is he wrong? Why? "Most members of the periphery [those outside the core developer group] do not have the necessary debugging skills ... the vast numbers of 'eyeballs' apparently do not exist. ... [C]ode review is hardly all that makes software more secure. Getting software right is very, very difficult. ... Code review alone is not sufficient. Testing is not sufficient. Tools are not sufficient. Features are not sufficient. None of the things we do in isolation are sufficient. To get software truly correct, especially to get it secure, you have to address all phases of the software development lifecycle, and integrate security into the day-to-day activities."
  • by Zebra_X (13249) on Tuesday February 16, 2010 @12:47AM (#31152216)

    But then again, you get what you pay for so... oh wait

  • by iserlohn (49556) on Tuesday February 16, 2010 @12:53AM (#31152244) Homepage

    "The proof of the pudding is in the eating", or as in the case of Microsoft, "the proof of the FUDding is in the beating"...

  • by haruchai (17472) on Tuesday February 16, 2010 @01:39AM (#31152514)

    Let me rephrase this for him -

    "For 25 years, we deliberately chose to ignore the bitter lessons that were learned by the big vendors, to take shortcuts
    to ship shit software first and fix it later, and to build up massive layers of cruft in the name of backward compatibility. Now we are caught in a nice pickle,
    as we've spent years trying to fill the leaks in our crap - some of which is so insecure that, 8 years after launch, we still have record numbers of bugs in
    Windows XP almost every fucking Patch Tuesday - and restructure it into something rock solid.
    However, until we can get this done, we need to play smoke and mirrors, convince you to toss Win XP - and your old PC, most likely - buy our latest
    and greatest, and spit out ever more FUD about how nobody else can get stuff done except us.

    Ladies and gentlemen, I give you the M$ business plan and I'm pleased to say that it's working as well as ever and thank you all"

  • Re:Silent L (Score:4, Funny)

    by deniable (76198) on Tuesday February 16, 2010 @02:03AM (#31152634)
    I get it, ULSER. Good one. They cause me that sort of stress too.
  • by rebelscience (1717928) on Tuesday February 16, 2010 @02:09AM (#31152666) Homepage

    Of course, humans cannot think of everything, but with the right software model and the right tools, we will be able to - for the same reason that we use tools to perform complex calculations flawlessly, calculations that we used to have an extremely hard time doing reliably by hand. We don't have the right software model in which to construct rock-solid applications because we are not thinking outside the box. We are addicted to our way of doing things.

    I defend the hypothesis that the two major crises that afflict the computer industry (unreliability and low productivity) are due to our having adopted the Turing Machine as the de facto computing model in the last century. The thread concept (algorithm) is fundamentally flawed, and the use of multithreading on multicore processors exacerbates the productivity and reliability problems by at least an order of magnitude. The only way to solve the crisis is to switch to a non-threaded, non-algorithmic, synchronous (deterministic), reactive and implicitly parallel model.

    The big surprise in all this is that the solution to the crisis is not rocket science. It is based on a simple parallelizing concept that has been in use for decades. We already use it to simulate parallelism in video games, simulations and cellular automata. Use two buffers: while processing buffer A, fill buffer B with all the objects to be processed during the next cycle. When buffer A is done, swap buffers and repeat the cycle. Two buffers are used to prevent race conditions and ensure robust timing. No threads, no fuss, and the resulting code is deterministic. We just need to take the concept down to the instruction level within the processor itself and adopt a synchronous reactive software model. It's not rocket science.

    Folks, the days of Turing, Babbage and Lady Ada are soon coming to an end. It's time to wake up and abandon the flawed ideas of the baby-boomer generation and forge a new future. The boomers were wildly successful but this is a new age, the age of massive parallelism and super complex programs. The boomers need to retire and pass the baton to a new generation of computists. Sorry but that's the way I see it.
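The double-buffer cycle described above can be sketched in a few lines. This is a minimal illustration only - a toy 1-D cellular automaton where each cell becomes the sum of its two neighbors mod 2; the rule and all names are invented for the example, not taken from the comment:

```python
def step(read_buf, write_buf):
    """Compute the next generation into write_buf, reading only read_buf."""
    n = len(read_buf)
    for i in range(n):
        left = read_buf[(i - 1) % n]   # wrap around at the edges
        right = read_buf[(i + 1) % n]
        write_buf[i] = (left + right) % 2

def run(initial, cycles):
    buf_a = list(initial)           # buffer A: current generation (read-only this cycle)
    buf_b = [0] * len(initial)      # buffer B: next generation (write-only this cycle)
    for _ in range(cycles):
        step(buf_a, buf_b)
        # Swap: no cell ever reads a half-updated value, so the result is
        # deterministic regardless of update order - the point of the scheme.
        buf_a, buf_b = buf_b, buf_a
    return buf_a

print(run([0, 1, 0, 0], 1))  # → [1, 0, 1, 0]
```

Because every read goes to one buffer and every write to the other, the cells of a generation could be computed in any order, or in parallel, without changing the outcome.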

  • by harlows_monkeys (106428) on Tuesday February 16, 2010 @02:16AM (#31152708) Homepage

    Any technological endeavor human beings work towards will always be subject to "more eyeballs means improvement".

    So that's why the more people there are on the committee that designed a language or protocol, the better the result. I'd always wondered about that.

  • by Anonymous Coward on Tuesday February 16, 2010 @02:42AM (#31152800)

    I was about to write something about what sort of dumbass would go PLOKTA whilst running a debugger, then I remembered cats.

  • by Demonoid-Penguin (1669014) on Tuesday February 16, 2010 @03:32AM (#31153008) Homepage

    Yes, but thanks to proprietary software, none of those bugs will be fixed, only found and exploited.

    Didn't you read the referenced article? Microsoft has a superior, um, thingie - and so there are no bugs to be found. And if you disagree, they will crank up their Aesopian ghostwriting dept. and prove you wrong (probably with a backing soundtrack of screaming rabbits and crying babies).

    Having said that - where's my FanBoi(TM) t-shirt?

  • by Interoperable (1651953) on Tuesday February 16, 2010 @03:35AM (#31153024)

    Presumably Microsoft employees do look at the source code but some days I have my doubts. (I'm kidding of course)

  • by nextekcarl (1402899) on Tuesday February 16, 2010 @03:46AM (#31153080)

    ...A perfect program that is never written isn't very useful.

    It is, however, bug-free!

  • by Anonymous Coward on Tuesday February 16, 2010 @05:50AM (#31153578)

    Like "Quick and Dirty Operating System" from Seattle? Whatever happened to that?

  • by rasputin465 (1032646) on Tuesday February 16, 2010 @07:00AM (#31153796)
    I had the exact same thought. "Getting software right is very, very difficult" ... "trust us, we know; we still haven't figured out how to get it right".
