Security | Software | IT | Technology

Is Code Auditing of Open Source Apps Necessary? 108

An anonymous reader writes "Following Sun Microsystems' decision to release a raft of open source applications to support its secure cloud computing strategy, companies may be wondering if they should conduct security tests of their customized open source software before deployment. While the use of encryption and VPNs to extend a secure bridge between a company IT resource and a private cloud facility is very positive — especially now that Amazon is beta testing its pay-as-you-go private cloud facility — it's important that the underlying application code is also secure. What do you think?"
This discussion has been archived. No new comments can be posted.

  • Yes. (Score:5, Insightful)

    by wed128 ( 722152 ) on Wednesday December 23, 2009 @12:46PM (#30536246)

    Next Question.

  • by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Wednesday December 23, 2009 @12:54PM (#30536326)

    The answer is Yes. When you run software, you are running it under one of the following three assumptions:

    1. You implicitly trust the vendor
    2. You have tested it yourself and trust your tests
    3. You are oblivious (the vast majority of users are)

    What's more, since Open Source software lacks any single person you could possibly sue in case things go terribly wrong, it makes sense to mistrust it a priori. OSS isn't magically secure because it is open. It still needs testing and validation if you intend to run it in any serious corporate environment.

    To simply accept a software package without assuming it is riddled with bugs and security vulnerabilities is foolish. Whether it is a proprietary software package or an Open Source community project, any sane CIO will want some sort of evidence that the product will not end up losing them money and customer trust due to security vulnerabilities.
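    A minimal sketch of one such validation step, assuming a Python environment (the tarball name and published digest below are hypothetical placeholders, not real project values): verify that the open source release you are about to deploy matches the digest the upstream project published, before it goes anywhere near a build server.

        # Minimal sketch: check a downloaded open source release against a
        # published SHA-256 digest. File name and expected digest are
        # hypothetical placeholders.
        import hashlib
        import sys

        EXPECTED_SHA256 = "<published sha256 digest goes here>"  # placeholder
        TARBALL = "example-app-1.2.3.tar.gz"                     # placeholder

        def sha256_of(path: str) -> str:
            """Stream the file through SHA-256 so large tarballs stay out of memory."""
            digest = hashlib.sha256()
            with open(path, "rb") as fh:
                for chunk in iter(lambda: fh.read(1 << 20), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        if __name__ == "__main__":
            actual = sha256_of(TARBALL)
            if actual != EXPECTED_SHA256:
                print(f"Digest mismatch for {TARBALL}: got {actual}", file=sys.stderr)
                sys.exit(1)
            print(f"{TARBALL}: digest OK")

    Checksum verification is of course only the first rung of "testing and validation": it proves you got the bits the project published, not that those bits are free of vulnerabilities.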

  • Re:Yes. (Score:4, Insightful)

    by causality ( 777677 ) on Wednesday December 23, 2009 @12:57PM (#30536354)

    Next Question.

    No shit. I don't understand how this got to be a story. What's next, "Should Engineers Who Design Bridges Demonstrate Competency Before Thousands of Automobiles Drive on Those Bridges?"

  • by jimbobborg ( 128330 ) on Wednesday December 23, 2009 @12:59PM (#30536392)

    What's more, since Open Source software lacks any single person you could possibly sue in case things go terribly wrong, it makes sense to mistrust it a priori. OSS isn't magically secure because it is open. It still needs testing and validation if you intend to run it in any serious corporate environment.

    I still hear this every once in a while. So my question is, has anyone ever sued Microsoft for loss of data/trust? Have you not read the EULA?

  • by BronsCon ( 927697 ) <social@bronstrup.com> on Wednesday December 23, 2009 @01:08PM (#30536524) Journal

    It's different because users of paid merchandise or services can seek legal remediation if something goes terribly wrong. Unless, of course, the license agreement specifically states that there is no guarantee of the program's fitness for any specific purpose.

    There, fixed that for ya.

  • by minsk ( 805035 ) on Wednesday December 23, 2009 @01:09PM (#30536538)

    The payment creates an obligation.

    An obligation to include vicious anti-liability clauses and avoid any admission of wrong-doing?

  • by elnyka ( 803306 ) on Wednesday December 23, 2009 @01:26PM (#30536708)

    How are they auditing the code of the closed source apps they're using? If there are steps in place, use those as a minimum. If there aren't, then how's the blind faith of using those programs different than what's needed for open source?

    Flipping the question does not answer the original one, which is a valid one and deserves an answer. The answer is, just like anything, it depends. It depends on the open source artifacts in question; it depends on the specific audit/security requirements; it depends on how critical the app under development is; it depends on SLAs (if one exists and requires it).

    As you said, if there are steps in place, use those as a minimum, provided that they are sufficient for the requirements at hand.

    If there aren't any, you can't just cross your arms and say "well, if I didn't do them with COTS, why would I with FOSS?" If your project requires them, then shit, you implement them.

    The question of whether to security-audit something, be it COTS or FOSS, is predicated on the requirements at hand, not on whether a previous use of COTS (or FOSS) was properly audited in the past.
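    As a minimal sketch of what "you implement them" can look like in practice, assuming the audit requirement is codified as a CI gate and assuming a hypothetical findings.json written by whatever scanner the project actually uses: fail the build whenever a finding exceeds the severity the requirements allow.

        # Minimal sketch: enforce a project-specific audit requirement as a CI gate.
        # The findings.json format (a list of {"id": ..., "severity": ...} objects)
        # is a hypothetical stand-in for a real scanner's output.
        import json
        import sys

        SEVERITY_ORDER = ["low", "medium", "high", "critical"]
        MAX_ALLOWED = "medium"  # policy threshold, set by the project's requirements

        def blocking(findings, max_allowed=MAX_ALLOWED):
            """Return findings whose severity exceeds the allowed threshold."""
            limit = SEVERITY_ORDER.index(max_allowed)
            return [f for f in findings
                    if SEVERITY_ORDER.index(f.get("severity", "low")) > limit]

        if __name__ == "__main__":
            with open("findings.json") as fh:  # hypothetical scanner output
                findings = json.load(fh)
            blockers = blocking(findings)
            for f in blockers:
                print(f"blocking finding: {f['id']} ({f['severity']})", file=sys.stderr)
            sys.exit(1 if blockers else 0)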

  • Re:Yes. (Score:3, Insightful)

    by tool462 ( 677306 ) on Wednesday December 23, 2009 @02:13PM (#30537194)

    Interesting. I can think of another field where this could be useful:

    Require all fund managers to have a significant portion of their net worth in the funds they manage. If the fund collapses, they go down with the ship.

  • by Anonymous Coward on Wednesday December 23, 2009 @02:30PM (#30537354)

    The funny thing is, how many people are actually eyeballing the code? Are you, or do you just assume thousands of other people are?

"Experience has proved that some people indeed know everything." -- Russell Baker

Working...