
Is Code Auditing of Open Source Apps Necessary? 108

An anonymous reader writes "Following Sun Microsystems' decision to release a raft of open source applications to support its secure cloud computing strategy, companies may be wondering if they should conduct security tests of their customized open source software before deployment. While the use of encryption and VPNs to extend a secure bridge between a company IT resource and a private cloud facility is very positive — especially now that Amazon is beta testing its pay-as-you-go private cloud facility — it's important that the underlying application code is also secure. What do you think?"
This discussion has been archived. No new comments can be posted.

  • Yes. (Score:5, Insightful)

    by wed128 ( 722152 ) on Wednesday December 23, 2009 @12:46PM (#30536246)

    Next Question.

    • Re:Yes. (Score:4, Insightful)

      by causality ( 777677 ) on Wednesday December 23, 2009 @12:57PM (#30536354)

      Next Question.

      No shit. I don't understand how this got to be a story. What's next, "Should Engineers Who Design Bridges Demonstrate Competency Before Thousands of Automobiles Drive on Those Bridges?"

      • Re:Yes. (Score:5, Funny)

        by Thanshin ( 1188877 ) on Wednesday December 23, 2009 @01:08PM (#30536532)

        No shit. I don't understand how this got to be a story. What's next, "Should Engineers Who Design Bridges Demonstrate Competency Before Thousands of Automobiles Drive on Those Bridges?"

        No.

        They should pass an accelerated three-month course on how to mix cement, then spend six months mixing cement for $300/month, and then change jobs, claiming in their CV that they have five years of experience in construction. Only then are they ready to apply their experience to designing a bridge.

        When the first car goes over it and falls to its demise, they just have to patch the bridge.

        After a couple of years and innumerable patches, the bridge, now essentially a pile of cement over a chasm, will finally stop dropping more than a couple of cars per day into the void. At that point, the engineers are ready to find a management position.

        • Ahh, if they hired engineers like they hired software devs.

          Actually, where's the guy with the woodpecker destroying civilization in his sig when you need him?

        • by Anonymous Coward

          King of Swamp Castle: When I first came here, this was all swamp. Everyone
          said I was daft to build a castle on a swamp, but I built it all the same,
          just to show them. It sank into the swamp. So I built a second one. And that
          one sank into the swamp. So I built a third. That burned down, fell over,
          and then sank into the swamp. But the fourth one stayed up. And that's what
          you're going to get, Son, the strongest castle in all of England.

          • by Savage-Rabbit ( 308260 ) on Wednesday December 23, 2009 @01:59PM (#30537036)

            King of Swamp Castle: When I first came here, this was all swamp. Everyone
            said I was daft to build a castle on a swamp, but I built it all the same,
            just to show them. It sank into the swamp. So I built a second one. And that
            one sank into the swamp. So I built a third. That burned down, fell over,
            and then sank into the swamp. But the fourth one stayed up. And that's what
            you're going to get, Son, the strongest castle in all of England.

            That sounds a lot like the development history of Windows.

            • by Intron ( 870560 ) on Wednesday December 23, 2009 @02:47PM (#30537536)
              Except for the part about the 4th one staying up.
            • by Nutria ( 679911 )

              That sounds a lot like the development history of Windows.

              Well yes, and an example that "persistence pays off".

            • King of Swamp Castle: When I first came here, this was all swamp. Everyone
              said I was daft to build a castle on a swamp, but I built it all the same,
              just to show them. It sank into the swamp. So I built a second one. And that
              one sank into the swamp. So I built a third. That burned down, fell over,
              and then sank into the swamp. But the fourth one stayed up. And that's what
              you're going to get, Son, the strongest castle in all of England.

              That sounds a lot like the development history of Windows.

              So the first one is DOS (except for the "built it myself" part), the second one is Win3.1, the third is Win ME, and the fourth is XP? Where do Vista and 7 fit in?

              • Win3.1 wasn't an OS; it was only an app that ran on top of DOS. No idea about Win ME. The first truly new OS after DOS in the Windows family was WinNT.
        • Re:Yes. (Score:4, Interesting)

          by dkleinsc ( 563838 ) on Wednesday December 23, 2009 @01:55PM (#30536990) Homepage

          I'm reminded of the method of quality assurance used by the Romans: After putting in the capstone of an arch, the engineer responsible for creating that arch was required to stand under it while the wooden scaffolding was removed.

          • Re: (Score:3, Insightful)

            by tool462 ( 677306 )

            Interesting. I can think of another field where this could be useful:

            Require all fund managers to have a significant portion of their net worth in the funds they manage. If the fund collapses, they go down with the ship.

        • After a couple of years and innumerable patches, the bridge, now essentially a pile of cement over a chasm, will finally stop dropping more than a couple cars per day

          We usually call that "pile of cement over a chasm" a "dam"
          I used to think they were purpose built structures, but now I know that they're just a cement mixer's version of "bridge"

        • No, the first vehicle to cross the bridge should always be a heavy bus carrying all the engineers that designed it, as well as all the suppliers of materials for the bridge and the supervisors of the construction. I think we'll refer to this method as "Chinese Quality Control".
        • And if it was a Massively Multicar bridge, it wouldn't even cross the chasm all the way before it was opened.
      • Re: (Score:1, Interesting)

        by Anonymous Coward

        Next Question.

        No shit. I don't understand how this got to be a story. What's next, "Should Engineers Who Design Bridges Demonstrate Competency Before Thousands of Automobiles Drive on Those Bridges?"

        Going off on a tangent here.
        One of my college professors once spoke in class about a former student of our school.
        A bridge had collapsed, and the engineer testified in court that it was his college prof's fault.
        Why you might ask?
        The Answer:
        Because he had made the same mistake on a class project and had not been penalized for it then.

        I never did find out if my professor had made that one up or if it really was based on a real case.

    • Next up on easy question theatre... why are you hitting yourself? why are you hitting yourself? why are you hitting yourself?
    • Are you happy?

    • Re: (Score:1, Informative)

      by Anonymous Coward

      Code review of **every line** is best practice. That means independent, desk-check style code reviews. The reviewer needs to feel they could put their name on the code, or start writing action items. Any questions need to be addressed prior to the sit-down review with a disinterested moderator. Any burning questions that were not answered to everyone's satisfaction need to be researched until there are no more "I don't understand that section of code" moments.

  • OpenBSD (Score:2, Informative)

    by Anonymous Coward

    OpenBSD does code audits. All security-sensitive applications should be audited, if not by the developers, then by the people deploying them, if they have the resources.
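OpenBSD's audits are human-driven, but a first pass can be mechanized. The sketch below is a toy illustration (not OpenBSD's actual tooling): it flags calls to classically unsafe C string functions so a reviewer knows where to look first.

```python
import re

# Classically unsafe C functions an audit pass typically flags
# (no bounds checking, or prone to format-string bugs).
UNSAFE_CALLS = re.compile(r"\b(strcpy|strcat|sprintf|gets|scanf)\s*\(")

def flag_unsafe_calls(source: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs containing unsafe calls."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if UNSAFE_CALLS.search(line):
            hits.append((lineno, line.strip()))
    return hits

# Hypothetical snippet under review, used only to exercise the scanner.
c_source = """
#include <string.h>
void greet(char *name) {
    char buf[16];
    strcpy(buf, name);   /* no bounds check */
}
"""

for lineno, line in flag_unsafe_calls(c_source):
    print(f"line {lineno}: {line}")
```

Real audits go far beyond pattern matching (data-flow analysis, manual review of privilege boundaries), but tools in this vein, such as flawfinder, are a common way to triage a large codebase.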

  • Flip the question. (Score:2, Interesting)

    by tacarat ( 696339 )
    How are they auditing the code of the closed source apps they're using? If there are steps in place, use those as a minimum. If there aren't, then how's the blind faith of using those programs different than what's needed for open source?
    • It's different because users of paid merchandise or services can seek legal remediation if something goes terribly wrong. The payment creates an obligation. In free software, there's no corresponding obligation, because there has been no payment. Of course, paid OSS (e.g., from Red Hat) falls somewhere confusingly in the middle.
      • It's different because users of paid merchandise or services can seek legal remediation if something goes terribly wrong.

        They must not have read the EULA...

      • by BronsCon ( 927697 ) <social@bronstrup.com> on Wednesday December 23, 2009 @01:08PM (#30536524) Journal

        It's different because users of paid merchandise or services can seek legal remediation if something goes terribly wrong. Unless, of course, the license agreement specifically states that there is no guarantee of the program's fitness for any specific purpose.

        There, fixed that for ya.

        • IANAL, but that clause would be trivial to toss out. If a company is marketing their software as "the best financial package available" and a giant bug in it then causes massive losses for their customers, leaning on that clause just ain't gonna cut it.

          • Re: (Score:3, Informative)

            by Kjella ( 173770 )

            IANAL, but that clause would be trivial to toss out

            Lawyer: "I'm not a software developer, but it's trivial to use that java library in a C# application"

            That's about how many orders of magnitude wrong you are here. I also play my share of lawyer on Slashdot, but I know how to read cornell.edu - and it's amazing how much better the discussion would be if most people had - but I also know when to STFU and not make a fool out of myself. Like in this case: UCC 2-316, Exclusion or Modification of Warranties [cornell.edu], which quite clearly states that you can exclude any implied warranties.

            • Lawyer: "I'm not a software developer, but it's trivial to use that java library in a C# application"

              There are ways to do exactly that. A quick Google search turned up this discussion [bytes.com].

              If a company is selling a financial accounting package, and tries to state in their disclaimer that the software is "not fit for any particular purpose", I really can't see a judge signing off on that. Free software, because there is no contract between the parties, can get away with that. But when there's a contract you have to be much, much more explicit to avoid things like this.

        • It's different because users of paid merchandise or services can seek legal remediation if something goes terribly wrong. Unless, of course, the license agreement specifically states that there is no guarantee of the program's fitness for any specific purpose. Except, of course, when/where the law states that there is an automatic guarantee and automatic liability.

          There, fixed that for ya.

      • Re: (Score:3, Insightful)

        by minsk ( 805035 )

        The payment creates an[] obligation.

        An obligation to include vicious anti-liability clauses and avoid any admission of wrong-doing?

      • Has anybody sued MS and won because there was a bug in their product? Do you think you could sue any sizeable software company and get any money out of them because you lost money due to a bug in the product? Unless you are hiring a company to do custom software, and it's spelled out in the contract, there probably isn't much of a recourse for anybody who loses money/data due to a bug in software.
        • Re: (Score:3, Funny)

          by schon ( 31600 )

          Has anybody sued MS and won because there was a bug in their product?

          Of course not. Everyone knows that MS products don't have bugs.

    • by mrisaacs ( 59875 )

      It's not uncommon for large organizations to require access to code, have a third party audit it, or require some form of liability insurance from the vendor when closed source code is purchased. There's also the not very reliable, and very dangerous, assumption that vendors have already vetted the code against malicious/non-secure code.

      For open source code - there's no one accountable vouching for the code or offering insurance - so organizations are forced to audit the code. Plus there's the usually wrong

    • by elnyka ( 803306 ) on Wednesday December 23, 2009 @01:26PM (#30536708)

      How are they auditing the code of the closed source apps they're using? If there are steps in place, use those as a minimum. If there aren't, then how's the blind faith of using those programs different than what's needed for open source?

      Flipping the question does not answer the original one, which is valid and deserves an answer. The answer is, just like anything: it depends. It depends on the open source artifacts in question; it depends on the specific audit/security requirements; it depends on how critical the app under development is; it depends on SLAs (if one exists and requires it).

      As you said, if there are steps in place, use those as a minimum, provided that they are sufficient for the requirements at hand.

      If there aren't any, you can't just cross your arms and say "well, if I didn't do them with COTS, why would I with FOSS"? If there aren't, and your project requires them, then shit, you implement them.

      The question of whether to security-audit something, be it COTS or FOSS, is predicated on the requirements at hand, not on whether a previous usage of COTS (or FOSS) was properly audited in the past.

      • by tacarat ( 696339 )
        That's a great follow up line of thinking for folks that flipped the initial question. No mod points, though :(
    • How are they auditing the code of the closed source apps they're using? If there are steps in place, use those as a minimum. If there aren't, then how's the blind faith of using those programs different than what's needed for open source?

      Good point... however I would posit that somebody had better be auditing the code, be it open source or closed. In the closed case, it should be the vendor itself, or a neutral 3rd party. Now granted there is no guarantee that it is done properly in the closed source case, but that should be part of the vendor's liability. (yeah yeah, vendors don't take liability for shrink-wrap software, but they typically do for custom projects)

      As far as open source goes... none of us have the time or manpower to audit all

      • by Coz ( 178857 )

        Someone should be auditing Apache and Linux, and it had better be the vendors making the cash off it. If Red Hat and the others aren't reviewing the code base regularly, I want to know what my support contract's paying for. I should receive an assurance that the system has been audited for most known vulnerabilities, and every patch should have eyes on it (besides the maintainer's) that look for obvious things (buffer overflows, SQL injection vulnerabilities) and oddness (the nightmare of a multi-patch Ea

        • That's an inescapable reality and it's not unique to closed or open source software. You always have to contend with the fact that the developer may have left a bug intentionally that allows remote code execution or privilege escalation.

          You can audit code all day long, but the chance of a something getting through is high. You might be able to take a small application and with some assurance say it's bug free, but you'll never ever accomplish such a feat with a large project like the Linux kernel or the ent

    • IVV under NDA. Independent validation and verification under a non-disclosure agreement.

      That is, if anyone in private industry bothers to buy source and have it independently audited.

  • by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Wednesday December 23, 2009 @12:54PM (#30536326)

    The answer is Yes. When you run software, you are running it under 1 of the following 3 assumptions:

    1. You implicitly trust the vendor
    2. You have tested it yourself and trust your tests
    3. You are oblivious (the vast majority of users are)

    What's more, since Open Source software lacks any single person you could possibly sue in case things go terribly wrong, it makes sense to mistrust it a priori. OSS isn't magically secure because it is open. It still needs testing and validation if you intend to run it in any serious corporate environment.

    To simply accept a software package without assuming it is riddled with bugs and security vulnerabilities is foolish. No matter if it is a proprietary software package or an Open Source community project, any sane CIO will want some sort of evidence that the product will not end up losing them money and customer trust due to security vulnerabilities.

    • by jimbobborg ( 128330 ) on Wednesday December 23, 2009 @12:59PM (#30536392)

      What's more, since Open Source software lacks any single person you could possibly sue in case things go terribly wrong, it makes sense to mistrust it a priori. OSS isn't magically secure because it is open. It still needs testing and validation if you intend to run it in any serious corporate environment.

      I still hear this every once in a while. So my question is, has anyone ever sued Microsoft for loss of data/trust? Have you not read the EULA?

      • The EULA might be irrelevant, depending on the specifics of the case. In particular, there is a notion of an "implied warranty", that no EULA can break.

        In common law jurisdictions, an implied warranty is a contract law term for certain assurances that are presumed to be made in the sale of products or real property, due to the circumstances of the sale. These assurances are characterized as warranties irrespective of whether the seller has expressly promised them orally or in writing. They include an impli

      • I worked for one of the law firms that Microsoft hires to defend itself from lawsuits. They may have even written part of the EULA that waives the right to sue for damages and loss of data/trust.

        But if anyone does sue them, Microsoft can afford the best lawyers to fight it, and run up costs until any win is a Pyrrhic victory [wikipedia.org] that costs more in legal fees and court fees than was won from Microsoft.

        • by haruchai ( 17472 )

          Is that how things work in the US? I'm pretty sure that the loser has to reimburse the winner of a lawsuit for costs and fees in Canada. Also, the judge has the right to cap the amount that must
          be reimbursed - this is useful when one party has much greater resources than the other or when
          the judge feels that certain tactics were inappropriate.

          For example, let's say I sue M$ for infringement of something I created and they start burying me in paperwork ( I can only afford basic legal representation ), use

          • The only exception to this is to get a pro bono (i.e., free) lawyer to take the case. In order to practice before each court, a lawyer needs to take on pro bono cases for a certain number of billable hours in addition to passing the bar exam.

            Some non-profit organizations like the ACLU will pay for legal fees under certain cases like discrimination.

            In some cases the judge will award the winner of a lawsuit the legal fees from the other side, but only if the judge decides to do that as part of the damages, e

      • If that were enough to guarantee that it would be impossible to be held legally responsible for product failures or shortcomings, it would be sufficient to attach a EULA to all medication stating that the provider is not in any way responsible for death or disability caused by the drug. There's no way something like that would hold up in court if people experience severe problems from the drug.

        Microsoft might be responsible if they advertised their product as never causing any problems or resulting in los
        • If that were enough to guarantee that it would be impossible to be held legally responsible for product failures or shortcomings, it would be sufficient to attach a EULA to all medication stating that the provider is not in any way responsible for death or disability caused by the drug. There's no way something like that would hold up in court if people experience severe problems from the drug.

          I think it makes a difference that drugs are intended for human ingestion. Drugs are also regulated by the FDA.

    • by Xtifr ( 1323 )

      What's more, since Open Source software lacks any single person you could possibly sue in case things go terribly wrong

      Let's rephrase that - it may lack one. There are plenty of ways you can arrange to have OSS that has someone to sue. Most of those ways involve payment, however, which seems like a reasonable trade-off for the assumption of that risk.

  • You *think* the VPN and encryption software is secure. But flaws have been found in the past. The basic underlying strategy of security is a multi-layered defense.
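One cheap extra layer, assuming the transport (VPN/TLS) could someday be broken: authenticate the payload at the application level with a key the transport never sees. A minimal stdlib sketch (integrity only; a real deployment would also encrypt, e.g., with an AEAD cipher):

```python
import hashlib
import hmac
import secrets

# Application-layer key, independent of whatever key the VPN/TLS layer uses.
APP_KEY = secrets.token_bytes(32)

def seal(payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so tampering in transit is detectable."""
    tag = hmac.new(APP_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def open_sealed(blob: bytes) -> bytes:
    """Verify the tag; raise if the payload was modified."""
    payload, tag = blob[:-32], blob[-32:]
    expected = hmac.new(APP_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("payload failed integrity check")
    return payload

msg = seal(b"quarterly numbers")
assert open_sealed(msg) == b"quarterly numbers"

# Flip one bit of the payload: verification now fails even if the
# (assumed-secure) transport delivered the bytes without complaint.
tampered = bytes([msg[0] ^ 1]) + msg[1:]
try:
    open_sealed(tampered)
except ValueError:
    print("tampering detected")
```

The point is exactly the parent's: no single layer is trusted absolutely, so a compromise of the transport alone does not silently compromise the data.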

  • The fact that this question has to even be asked, tells you a lot about how applications are developed.

    The US has dedicated itself to a race to the bottom in quality and price. Testing is just one of those things companies throw out because it is an expense with no obvious benefits, to those who are not vested in the long term for their products.

    • The fact that this question has to even be asked, tells you a lot about how applications are developed.

      The US has dedicated itself to a race to the bottom in quality and price. Testing is just one of those things companies throw out because it is an expense with no obvious benefits, to those who are not vested in the long term for their products.

      Well of course. Concerns about larger long-term benefit might interfere with the All-Important concern about lesser short-term gain.

    • by flajann ( 658201 )

      The fact that this question has to even be asked, tells you a lot about how applications are developed.

      The US has dedicated itself to a race to the bottom in quality and price. Testing is just one of those things companies throw out because it is an expense with no obvious benefits, to those who are not vested in the long term for their products.

      There is so much pressure from the business side to rush to market that corners are inevitably cut, and the first place that usually gets cut is testing.

      The realities of today's high-tech business world almost demand that you release crappy code NOW just to get your foot in the door of the market. You can always release upgrades after the poor fools have bought into your software.

      In an ideal world, everything should receive security audits before release. If you are Big Company releasing to Open

    • Why should I pay people to test my products when I can get my customers to pay me for the privilege of testing my products? (No, I don't work for Microsoft -- I'm just playing Devil's Advocate here.)
      • *You* test to make sure the product is saleable.

        *They* test to make sure it meets their needs.

        If you don't do your part, they won't even get to their testing, as your product won't be considered.

  • If you want publicity in any way you can get it, feel free to skip testing. Data breaches make good news. It may not be the kind of publicity you want.

    Seriously, it depends on your level of trust and your level of need for security. Though, if you are using a supposedly secure transport, I imagine your need for security is relatively high. Besides, you are putting your trust in an external company, which means if that company gets breached your data is right there. If you don't encrypt it with a second

  • I think the answer reasonably is anywhere between "yes" and "absolutely yes". For example, auditing should probably be considered very important for software such as slashdotter Fyodor's Nmap.

    You can't trust everyone in the open source community to be completely white-hat all the time...

  • If you have the resources to vet the code without draining them, then it may be useful for you to do it. If you use closed source code, you just have to trust it (and maybe black-box test it). At a minimum, test everything to the same standard.
    If you barely have the resources to cobble together a quick and dirty IT system, then trying to security test open source software may not be the best way to grow your company (unless that's what you're intending to do as your business, in which case, you'll
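Black-box testing in this spirit can be as simple as hammering an interface with random input and checking that it fails cleanly. A toy sketch follows; `parse_kv` is a hypothetical stand-in for some third-party component, not any real product:

```python
import random

def parse_kv(blob: bytes) -> dict:
    """Hypothetical stand-in for a vendor parser: decodes 'k=v;k=v' pairs."""
    result = {}
    for pair in blob.decode("utf-8", errors="replace").split(";"):
        if "=" in pair:
            key, _, value = pair.partition("=")
            result[key] = value
    return result

def black_box_fuzz(parser, rounds: int = 1000) -> int:
    """Feed random byte strings to the parser; count inputs that raise."""
    rng = random.Random(42)  # fixed seed so the run is reproducible
    failures = 0
    for _ in range(rounds):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(64)))
        try:
            parser(blob)
        except Exception:
            failures += 1
    return failures

print("crashing inputs:", black_box_fuzz(parse_kv))
```

This proves nothing about correctness, only about crash behavior under hostile input, which is exactly the limited assurance black-box testing of closed source can give you.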

  • It's not just a matter of security. I would think you would want to verify, via some method (code review, etc) that the code is correct and provides the desired results, doesn't crash, is properly integrated, etc.

            Brett

  • > companies may be wondering if they should conduct security tests of their customized open source software before deployment ..

    If they haven't already conducted penetration tests before deployment and implemented a secure irrevocable auditing system, then they shouldn't even be in the business ..

  • ...the next question that gets posted as an article: [rubs crystal ball]Is Code Testing of Open Source Apps Necessary?[/rubs crystal ball]

  • The consequences of fixing a problem while it's being exploited are usually much more severe than not having the problem in the first place. Proactive security [openbsd.org] is the way to go. That's why BUGTRAQ is peppered with statements like, "This problem was fixed in OpenBSD about 6 months ago"
  • Seriously, this is a dumb question and reeks of someone trolling for a reply.
  • Uh, isn't one of the points of open source that you have thousands of eyeballs auditing the code? What the hell kind of question is this to ask, really?

  • The problem is that code auditing generally tries to detect bugs. Even in the best case scenario where you can have a complete, manual audit of the entire codebase, you will miss many, many bugs. A much cheaper and in many ways better option is to just take a look at the code. Would you be proud of having written it? Ashamed? If you'd be ashamed of it, I say auditing is useless - there will always be vulnerabilities you've missed. If you're proud of it, an audit might be worth the cost - but, then, you coul

  • I'm sure they're just opensourcing the bits of Sun's portfolio that they didn't want - sort of a cheap and easy way to divest themselves of responsibility for code and products they didn't want when they took over Sun.

    Rest assured, any bits they feel will help them make Oracle an even more ubiquitous player in the database niche of IT will not see the light of day any time soon. Frankly, I'm surprised they haven't killed MySQL yet (although they may have plans for it; and the fact that it was previously o

  • Companies should audit the code for these apps the same way they audit Linux, Bash, JBoss, and the various other OSS applications they deploy. Why should this code be any different?

  • Being open source in no way means a program is bug-free, or even that it does what it claims. Sure, chances are someone else has already found it if there is something horribly wrong, but the whole point of it being open source is that you can audit it yourself. If you don't bother to actually look at the code, it might as well be closed source, since you aren't looking at the code anyway.

  • Somebody said "it depends" with a certain level of sarcasm above, but I'm going to say it in all seriousness, and echo the "why was this posted" question, also coming from a different angle.

    The headline says "open source apps" without qualification, so I'll address all open source apps first.

    The criteria for wanting an audit are the same, and not all software requires an in-house audit for various and I would have said obvious reasons.

    But there are some observations that apply to open source that do not a

  • by cenc ( 1310167 ) on Wednesday December 23, 2009 @02:14PM (#30537210) Homepage

    Open source code development is by definition a sort of "self-auditing" process. That is all good. The bigger problem, unaddressed in the FOSS community at large as far as I can see, is what happens when the projects that run them fall apart. For example, in this case, whether the sun is going to set on Sun is still not known. What about MySQL?

    More commonly it is the problem of rag-tag bands of volunteers (increasingly novice these days), where a couple of major players move the project along, and if something happens to them the project goes off the rails. The rather high-profile example of this was the CentOS fiasco earlier this year.

    I know everyone is going to come back and say things like, "if you don't like it, fork it". That is a nice sentiment, but much harder to do in practice. Often it is like saying that if you don't like the service you get at Wal-Mart, you should start your own department store chain, bank, pharmacy, or whatever. Not something even most larger companies can do, let alone private end users.

    We need a system for auditing and reviewing open source projects for their viability and overall health so users (individuals, companies, and other projects that depend on them) can make real decisions about using what they produce. Right now it is more of an art than a science to determine if a project is going to live. I am not saying limit open source creativity or stop small projects, but provide transparency as to the health of the projects. We can see the structure of the code, we should be able to see the structure of community that builds and maintains it.

    • by Chirs ( 87576 )

      'I know everyone is going to come back and say things like, "if you don't like it, fork it". That is a nice sentiment, but much harder to do in practice.'

      You've always got the option of paying someone else to fork it, or else buying a commercial project. Nobody is *forcing* companies to use FOSS--they generally use it because it is a good business decision.

  • Not necessary if the application is not critical.

    CERN's LHC and my bank's software system are typical examples of critical applications. My neighbour's wifi router is not.

    • In what regard is CERN's LHC software critical to you? Your neighbor's wifi router can be critical to your neighbor, and it most likely is to its manufacturer. I'd be hesitant to call any piece of software more complex than "hello world" categorically non-critical. If it's made publicly available or sold, the maker is^Z should be responsible that it doesn't eat anyone's babies, unless of course that is its purpose.

  • If there is a good reason to do this then companies will do it because it serves their own self interest.
    • "If there is a good reason to do this then companies will do it because it serves their own self interest."

      That statement presumes enlightened self-interest on the part of those companies...

  • "Due diligence". That's all I have to say. Do I audit the code for my personal website? No. Would I audit code for a large commercial site? I should think so.
  • I'll go against everyone and say that no, you should not have to audit the code.

    The fact that in order to use a software package safely an expert has to go through every single instruction is an aberration that would be done away with by using a capability operating system like KeyKOS, CapROS, or Coyotos.

    Start OpenOffice or a PDF reader or whatever with 1) authorization to interact with its X11 window and 2) a means to call out to a trusted system dialog box for reading and saving files from/to the user's space.
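The capability discipline described above can be imitated even in ordinary code: instead of letting a component reach into the filesystem with ambient authority, the caller hands it exactly the open handles it may use. KeyKOS, CapROS, and Coyotos enforce this at the OS level; the toy Python sketch below only illustrates the shape of the idea.

```python
import io

def export_report(data: dict, out) -> None:
    """This component receives a writable stream capability, not a path.

    It cannot open arbitrary files, because it was never given that
    authority; it can only write to the one handle it was handed.
    """
    for key, value in sorted(data.items()):
        out.write(f"{key}: {value}\n")

# The caller holds the authority and delegates exactly one capability:
# a single in-memory sink (it could equally be one open file handle).
sink = io.StringIO()
export_report({"rows": 42, "status": "ok"}, sink)
print(sink.getvalue())
```

In a real capability OS the kernel enforces this boundary; in ordinary code it is only a convention, but even as a convention it shrinks the amount of code an auditor has to distrust.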
