Netscape Nondisclosing Mozilla Security Bugs?

AP writes: "Mozilla developers are contemplating disclosing Mozilla security bugs only to a limited group of people, keeping them unavailable to the public until a fix is found, as indicated in this news post and the discussion thread. Are Mozilla developers missing the point of open source (which implies open security bugs), or are they under pressure from Netscape? Tell Mozilla developers what you think." Please read this post from MozillaZine, which explains that this is merely an open /discussion/ about security and disclosure in the Mozilla security newsgroup. Thanks to Hard_Code for the hook-up.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    You forget something:

    Mozilla is not a daemon used by sysadmins,
    but a consumer product.

    Consumers don't care about bugs, and don't like to
    update their browser every week with a 56K modem.

    So hiding the bug from the kiddies' view is IMHO a good idea.
  • As a black hat, and a working sysadmin, I can tell you that: A) We already know about things like this, and much worse. The only people that are being informed are clueless sysadmins who don't have underground connections. B) As a sysadmin, I deserve to know about the security problems of a given software package so that I can make an informed decision whether to deploy, or pull from production, said software. Without this knowledge I am defenseless. This is not about open vs closed source, it's about being responsible and telling _everyone_ that the software package in question is no longer trustworthy.
  • by Anonymous Coward
    Disclose the bugs to those people likely to be able to help with the fixes and hopefully get a secure version out before hordes of script kiddies exploit them.

    Security through obscurity always fails long term, but if it is only for a few weeks it can be a winner.

  • Linguistics is a complete joke of a science, full of intellectual frauds like Chomsky. The whole thing should be thrown out of universities and recognised for the pseudoscience it is.
  • What's with all these beards and pony tails and body piercings anyway? Is there some kind of unwritten law of the Unix/slashdot zealot community that one must give the appearance of a fat hairy hippy to be taken seriously, or considered 31337?

    Or is it that the Unix zealots want to make themselves unapproachable so they can avoid having to answer those "difficult" questions, such as "Why is WindowsNT 2000 kicking Unix's ass all over the corporate data centers?"

  • by Anonymous Coward
    What is the point of a company (even an open-source one) telling the entire world about a security bug that they have not yet fixed? I know, let's post the Administrative passwords on the internet for anyone who installs Mozilla. That way anyone who wants to crack into the machine can. Stupid.

    Now onto the solution, give details of the security hole to a number of developers on the project so that they can produce a fix. Oh my, that's what they are doing.

    It is with common sense that they have decided to go down this route, anyone with half the intelligence of a gnat can see that. I can't believe that most of you think this is bad. It is without a doubt one of the most authoritative articles available on slashdot to prove that most of the people here don't actually think about the situation before posting some mindless rubbish.

    As you all want the security holes to be published, are you willing to install it also? Just to show your support for open-source? If you really want to know, read the source.

    You're a bunch of copy-cats, follow-my-leaders, short-sighted prats.

    Thinking is to reason about or reflect on something, to exercise the power of reason, as by conceiving ideas, drawing inferences, and using judgment.

  • That's wonderfully idealistic, but what if the flaw with the door falling off would allow anyone who knew about it to be able to steal your car too?

    Should the manufacturer just go off and announce "If you have a '97 Ford Tempo, by doing this and this you can open the car and steal it" before they have a fix for the problem?

    No thank you.

    The whole argument here is whether to announce the vulnerabilities BEFORE a fix is ready, or to keep it hush-hush until AFTER a patch is available. The latter seems perfectly reasonable to me.


  • > I think we agree on the following points:
    > - keep security bugs relatively quiet (Security Group Only) until
    > a fix is found, tested and committed
    > - Security Group needs to be different than ``Netscape only''

    This is completely flawed thinking. Nothing less than full disclosure is acceptable. What if I am using portions of the Mozilla code in another project? What right do you have to keep known problems with the open code in any way secret? If this is the intent, why bother keeping Mozilla open source at all?

    If AOL wants to do this with their internal "Netscape Navigator" code, fine, but creating an elite group of special people in the Mozilla community who have privileged access to the bug database is unacceptable.

    Haven't we all learned by this point that security through obscurity DOES NOT WORK?

    ----

  • No no no no no no no no, you are wrong there....

    I've seen Hackers, to be considered 3l337, you have to be British, use a fairly naff American accent and wear the right gear. This means a combination of motorbiking leathers, football (soccer) shirts and rollerblades.
    It's vitally important you also spray your portable 'puter in a camo. stylee. If you can then run some kind of unique OS combining the worst aspects of the Mac, Windows and unix, so much the better.

    Use weird slang, that also helps and if in doubt, get celebrities to hack for you.

    That was an extract from 'troc's guide to being a hacker'

    troc
  • Right, but they are not saying that the bugs will never be disclosed... only that during this beta period they will not disclose the bugs, because they do not have time to fix them all. What sort of sysadmin will be deploying beta software?
  • I can't even believe that this is an issue. Open source means that everyone has the ability to look at every aspect of the code and its bugs, whether functionality- or security-related. Otherwise, it isn't open source. If they choose to not let everyone have the ability to contribute, then they shouldn't proclaim their project as being open-sourced.
  • The issue here isn't whether anyone is saying they have "a right to know". The Mozilla team can do anything they want with their project. Just don't call it open-source if it isn't open-sourced.

    Truth in advertising.

  • Sometimes life isn't fair.
  • idealism has nothing to do with socialism, and yes I agree that gnu is strongly socialist (you own what you produce, that is all socialism means, nothing more, nothing less - many socialists are anarchists as well)

    idealism is a metaphysical position which holds that reality, at least the only one we "can be sure of", exists only conceptually (Plato, Hegel, et. al.)

    All totalitarian institutions, including fascism, Stalinism and the modern American corporation, have their roots in this philosophy, namely that of Hegel. Ideas = more important than people, hence organizations formed around ideas; well, you get the point.

    I could make a better case but I'm in a hurry.

    L8R
  • I think that what will happen is that the bug tracking project will fork if necessary.

    If anyone feels that an important bug they submitted into bugzilla is being ignored, they'll post it to bugtraq or wherever.

    All the mozilla folks are doing is calling for sensitivity.

    Hamish

  • But no-one in the Real World is using the product for Real Work. So what problems would releasing details about the bugs actually cause? I really don't see why Netscape has a problem with this.

    Now weary traveller, rest your head. For just like me, you're utterly dead.
  • How does limiting the reporting of security bugs make the source less open? The source is still there. All they're limiting is access to certain portions of the Bugzilla database. There's only a limited number of people who can perform commits on the mozilla.org CVS tree, too, and yet the project is still Open Source.

    I can't think of any open source project (off the top of my head) that doesn't recommend you report security bugs in private to the maintainers, and give them a chance to fix it before going to bugtraq. Most advisories you read on bugtraq contain those magic words "I contacted the maintainer about this, and [the fix is available here | they did not respond]". This is the correct way to do things. Report the bug to the maintainers. Give them a reasonable amount of time to patch it. When it's patched, or if the maintainers fail to respond, release the information. The only exception I can think of is if you know that the bug is already being exploited in the wild, and the users need immediate protection.

    There is nothing stopping anyone who finds a bug in mozilla from immediately posting it to bugtraq, alt.2600, or mailing it in a brown parcel to their maiden aunt. All the Mozilla maintainers are saying is that if a security bug is reported to their Bugzilla system, it'll be kept to a limited number of developers until a fix is found.

    The only problem with the system was the one being talked about in the referenced post: security bugs are being marked "Netscape-Only", which means giving Netscape employees who don't work on Mozilla access, while cutting trusted non-Netscape Mozilla developers out of the loop.

    Fair enough.

    <offtopic>

    Over the last few days, slashdot has really shown the results of its descent into the Animal Farm mentality, bleating its "four legs good, two legs bad" editorial mantras about really complicated subjects such as copyright law, the ethics of service theft, and now the question of publicising security holes.

    At the same time, the signal-to-noise ratio has plummeted to new depths. My occasional forays into M2, which once averaged showing me 8 good posts and 2 troll/flamebait posts, are now more likely to show 2 good posts, 2 bad posts wrongly moderated up, and six troll/flamebait posts. Of the posts that do end up under articles, it's getting harder and harder to find the well-informed posters, as distinct from the wannabes who think they know what they're talking about when actually they don't.

    Oh well. I guess it's time to go back to Usenet. At least there I can score posts on my own criteria.

    </offtopic>

    Charles Miller
    --
  • Patches, rather than complete re-issue of code, might be a good idea. Back in the days when I was a mainframe system programmer one of the most common methods for the supplier (ICL) to provide bug fixes was by issuing patches. These were applied against the executable file, and were typically only 20 or 30 lines in length.

    This technique does not seem to have become very popular with smaller systems. The only current example I can think of is fortify.net [fortify.net], who provide patches for 128-bit security for Netscape (not needed so much now that Netscape are allowed to export the 128-bit versions).

  • The initial news post says that some of the bugs are in the Netscape branded beta 1, so this does affect Netscape directly.
  • I feel it is up to the owner of each OpenSource project as to how they handle security problems. If I was writing a daemon that someone pointed out had a security hole, I'd fix it first and then inform the mailing lists of a new version to fix hole X. If it is going to take a while to fix, things would be different.
  • Letting the world know about bugs to force people like M$ to release patches is all well and good...

    BUT...

    Mozilla is intended to be a widely used application & if any security bugs were publicly announced it'd be next to impossible to upgrade all NN Million copies of it out there quickly and easily. The script kiddies that hang around on #l33t would have a field day!

    It's just not practical to re-release/patch Mozilla every few days just for the sake of being able to see the bugs in bugzilla. Fair enough, it's open source, but I'm sure if you check the license there's NOTHING about withholding information (such as bugs) about the product.
  • > It should be openly published. Nobody can know for sure that they are the first to discover the bug.

    That would be the reason to quietly pass it along to the main developers. By posting the problem widely, you are only alerting the bad guys (should they have missed it). Once the developers know about the problem, rest assured, there will be a fix out very soon.
    Now if the software in distress is not open source, the situation is different. The likes of M$ ain't likely to post a fix unless they have a compelling reason to do so. Public embarrassment and/or lawsuits are the only really compelling reasons for them...
    Now, Mozilla is open source, so if there is a problem, tell them, or fix the bug yourself and post the fix. If the Mozilla team finds a bug, they can tell the developers (themselves ;-), fix the bug and post the fix.
  • Now, that is a very valid argument that I can wholeheartedly buy.
    I still argue that public disclosure to get the bug fixed is invalid. (That was the statement that I replied to.)
  • The issue here is not whether or not the code is open source. The issue is: when a security bug is discovered by the Mozilla team, should they tell the world before they have a fix for the bug?
    If you find the bug by all means fix it or post the bug (or ignore, or exploit :-)
  • Bug reports on security issues tend to be a lot different than other kinds of bug reports, partly because the kinds of people who find security holes are a lot different from those who find, say, problems with menu rendering.

    Fixing security holes is not a time consuming issue because most of the time, the submitter includes a fix for it with the bug report! When this is not the case, the nature of the fix is almost always obvious based on the description of the problem.

    The vast majority of security exploits can be fixed with a one- or two-line change to the code. If the Mozilla developers don't have time to make a one-line security fix to their code in a timely fashion, they have much bigger problems to deal with than security.

    And *no one*, commercial or open, discloses information about their security bugs until they have a fix. It would be foolish to do so, not because it's bad for PR but because it's a disservice to the users. If that puts them in the same boat as Microsoft, then so be it--Microsoft can't be wrong all the time.

    OTOH, if the Mozilla team tried to delay release of the fix until Netscape also had a chance to get theirs out, that would be wrong. It also wouldn't do them much good, since the person reporting the bug is going to post it to BUGTRAQ sooner or later anyway...
  • I don't know how buggy Mozilla might be (all bugs, not just security bugs). I do know that Netscape Navigator 3.X and 4.X were both buggy as hell, and even at 4.72 many bugs remain. I've been personally tracking one bug that I first reported to Netscape at 4.0b2 which remains in 4.72.

    Due to new bugs in the 4.X series which affect my desktop, and the poor performance of 4.X, I am forced to continue using 3.X. 3.X's bugs are mainly in the Java and JavaScript implementations, which I leave disabled. While 4.X appears to be more stable in Java and JavaScript (but not 100% so), it has terrible rendering performance, especially with nested tables.

    Will Mozilla be better? I genuinely hope it is. Do I expect it to be? Perhaps. I was unable to get the copy I downloaded to even compile (but I have not tried in a while now).

    I hold out hope because I know that one of the things that makes for a horribly buggy program is the reliance on third-party code, especially commercial code. Fixing bugs related to that is a nightmare because the application programmers have to do as much work or more just to prove to the third party that the problem is in the third party code (which it sometimes is, and sometimes is just a matter of confusion about how it is supposed to work, e.g. a documentation bug). To the extent Mozilla does NOT use third party code, it has a chance of being more reliable and more secure.

    Hopefully that will also mean more time to address security bugs. But it should not be an excuse to waste time on security bugs.

    If a bug is found and is NOT known to those who would exploit it, then I am in favor of those bug reports being kept secret for a finite length of time (30 days at most). There's no reason the authors cannot address those bugs within 30 days. If the bug becomes known to those who would exploit it, I believe I should then be made aware of the bug so I can apply workarounds immediately (which might be to cease using Mozilla altogether if a specific workaround is not available).
  • Selective shitfit throwing is not a good excuse, my friend. It simply needs to be an open process, and perhaps THE reason for that is a bug that destroys your /usr/local/games directory... a LOT of people would be pissed off (including me). Keep it open, keep people happy. Any 'extreme exploits' will simply need to be fixed quickly. I know it's not always that easy, but by this time it should be... how long have the mozilla peeples been playing with that code? After a while, you kind of become meshed into the code yourself. Btw, since I'm too lazy to check myself, is Mozilla fully GPL'ed (not LGPL or any other stupidGPL, G-P-L) or are there use restrictions inherited from netscape anywhere in there?



    -=chiphead
    -=-=-=-
  • Mozilla is free software. You (slashdot) say they're hiding security holes... Oh really? How's that?

    Let's say they find out about a security problem, but they don't tell *you* (not a mozilla developer) about it, they just check in a fix. Well, all the information that anybody should need to know about it is contained in the source tree, isn't it? The only difference is that it wasn't packaged nicely for the script-kiddies. It's available in a form that is really only useful to people who care to know the code. Which is fine, since they're the only ones who are going to be fixing the bugs, aren't they?

    This approach is *not* security through obscurity, not by a long shot. And it's not unique to Mozilla; it's standard best practice in the free software community. Just ask debian, redhat, or openbsd; this is how they've been doing things for years, with *excellent* results.
  • > how do you earn the right to be trusted? There's no karma system at Mozilla.

    I always use -
    %export MOZILLA_KARMA="http://slashdot.org/users.pl"

    --
  • I had high hopes for Mozilla, till I used it. The flakiness I can deal with; the IE5 workalike/lookalike design choice just makes me sad. The non-disclosure, though, that's a different matter. We knew all along that it wasn't a real OS license, now we get to see why. This is why RMS is so adamant about FREE SOFTWARE as opposed to "OPEN". It's a big difference, folks.
  • > Everyone is whining about "security through obscurity". Well, sorry, the other forms of security are already gone -- that's what it means that there's a security bug. Security through obscurity is better than nothing.

    I agree.

    I think the /. elite need to remember that there are many, many users who do not continuously download the latest security fixes. Only people who are full-time admins or don't have a life can keep up with this stuff. The rest of us at best manage to update our system with the latest version of our favorite distribution.

    Ideally, the system should update itself, but that has its own problems. Maybe this is available already?

  • Ah, but you miss several points entirely.

    First of all, the GPL is not Linus Torvalds; it is an FSF idea. Second, the GPL does very well enforce ownership of the code. The GPL is a measure used to prevent abuse of the code; freely donated code; by those who would like to appropriate it for themselves. Any GPL code I release is mine and mine alone, and you may only distribute it if you follow the license agreement.

    Free software does not care whether or not corporations can make money off code. It is irrelevant. The freedom and evolution of the code are by far more important; if corporations cannot compete with what is to a large extent developed by enthusiasts in their spare time, then they have some serious problems and should try a less competitive business.

    Sure you can attempt to put some sort of socialist/capitalist/whatever perspective onto it, but you end up with a fundamentally flawed argument, because political leanings or economic theories aren't what it is about. It is about freedom and knowledge and software that works.
  • > Isn't it more disrespectful to open up about the bugs when a patch isn't almost done? What good can a bunch of average joes and script kiddies knowing about the bug do? Only the developers can fix it, so why shouldn't they be the only ones to know about it until they're a few days or so from fixing it?

    How would you feel if your system were cracked, using a known but unpublicized exploit -- and that knowledge of this problem was withheld from you, supposedly for your own protection? I don't know about you, but I'd be hopping mad.

    If a known problem is publicized, sysadmins can shut down or restrict the (potentially) compromised service, make backups, check for intrusions, watch for more intrusions. Programmers who aren't part of the "insider" group may nonetheless have some insight that will lead to a faster or more definitive fix.

    Microsoft will spread FUD regardless of what open source developers do. Just ignore them and do the right thing.

  • > Just one small detail, (admittedly I don't agree with your position, but I've covered that in other posts) how do you figure PGP is obscure? Phil Zimmermann made a point of posting the source code, so that in the grand tradition of cryptography anyone and everyone could look and see if anything was left weak.


    ...And in the grand tradition of open source, Netscape opened the source to their browser, so that anyone and everyone could look and see if anything was left weak.

    I agree with the mantra of all security professionals, "security through obscurity does not work." However, that mantra addresses an entirely different situation. Netscape is not saying, "It's secure, because we say it's secure. Trust us."

    What they're saying is, "We believe it's secure, but feel free to look for yourself." In the cases where they find a security problem, guess what... OPEN OR NOT, IT'S NOT SECURE!!!! Announcing that fact to the world does NOT make it more secure. It's already insecure.

    In that case, the benefits of publicizing the defect must be weighed against the dangers. If they publicize the defect, they may get some help fixing the defect from the public. However, given that the Mozilla project has had a fairly long history, they probably have a good idea of which members of the public will contribute. The likelihood of someone new stepping up to fix a security problem is very slim.

    Announcing the defect to the world will initiate a race between the Mozilla developers and malicious developers with their army of script kiddies. You can bet that as soon as a viable exploit is published, it will spread like wildfire, and people will get burned.

    When the fix is complete, they will post the patch AND the source code for review. Then the cycle of open review (and all the benefits that it brings) can continue.

    -----

  • The correct thing to do is to wait for the bugs to be fixed before releasing the software or declaring it out of Beta.

    Crackers have proved themselves smart enough to obtain source to packages to discover and exploit holes, and Mozilla will be no different. I certainly don't get a warm fuzzy feeling about Mozilla if AOL pressures them into releasing it with security bugs, whether they're advertised or not.

    I've used the web for e-commerce for a long time, and have always felt comfortable about it, but with the current spate of credit card heists, I am beginning to get a bit leery, and will personally choose to steer clear of Mozilla until I have reason to believe it has no known security bugs.
  • Sorry guy, but you are the whiner who needs to shut your trap. Security through obscurity does not work. Your argument is that the bug shouldn't be leaked until a solution or workaround is known. Well, duh! When you share the problem, the tons of smart programmers out there will come up with a solution, and I can promise you that a solution will come up within an hour even if it is not perfect. Name one security hole that has been discovered and couldn't be solved, thus leading to many attacks. None. The problem with security is that people don't patch up and keep in touch with it. But for the sake of your argument, let's assume we find a bug which is not easy to fix. What does that mean? It simply means we are using the wrong product... Sorry guy, security through obscurity does not work.

    I can tpye and speel.

  • You mis-represented that issue of cryptogram.

    Bruce was lashing out against vendors who disclose a security exploit merely to plug their own product.

    He clarifies this point, and promotes Full Disclosure in the next issue of cryptogram [counterpane.com].

    He also dismisses the points you made in your posting.

  • No, I think it is quite clear that unless you know a little bit about assembly language you can't understand how a "bug" can turn into a "security flaw". Many times a security flaw will not manifest itself as a bug (recognisable by the user). Yes, programmers make mistakes, but often security flaws are a result of not knowing what is bad programming practice from a security point of view. It takes a trained mindset to search for security flaws, and often that is not the skill set of your average C++ programmer. Just as someone with low-level assembly knowledge of security flaws may not know correct object-oriented programming principles, a C++ programmer will probably not know anything about security. I think it is fair to say that unless you open your source to the "script kiddies" who will willingly sit there all day looking for security flaws, you will not find them (look at Microsoft), and unless you announce discovered flaws to the script kiddies, you will have a longer period of abuse before a fix is implemented. (A minimal sketch of the kind of flaw in question follows this comment.)
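    As a rough illustration of the point above (a hypothetical, self-contained C sketch, not code from the Mozilla tree): an ordinary-looking string copy becomes a security flaw the moment the input is attacker-controlled, and the fix is often a one-line bounded copy.

    /* Hypothetical example -- not taken from the Mozilla source tree. */
    #include <stdio.h>
    #include <string.h>

    /* UNSAFE: copies an attacker-controlled URL into a fixed-size buffer.
       Anything longer than 63 bytes overruns 'buf' and tramples the stack --
       exactly the raw material a "script kiddie" is looking for. */
    static void handle_url_unsafe(const char *url) {
        char buf[64];
        strcpy(buf, url);               /* no length check: classic buffer overflow */
        printf("fetching %s\n", buf);
    }

    /* SAFER: bound the copy and guarantee NUL-termination. */
    static void handle_url_safer(const char *url) {
        char buf[64];
        snprintf(buf, sizeof buf, "%s", url);   /* truncates instead of overflowing */
        printf("fetching %s\n", buf);
    }

    int main(void) {
        const char *url = "http://www.mozilla.org/";
        handle_url_unsafe(url);   /* fine for short, trusted input... */
        handle_url_safer(url);    /* ...only the bounded copy stays safe for long input */
        return 0;
    }

    The unsafe version looks perfectly reasonable to a programmer thinking only about functionality, which is the commenter's point: spotting it takes a security mindset, not just C++ fluency.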
  • If there was a bug in my browser that would allow someone to tamper with my system, or track my actions, don't you think I would want to know so that I could use another browser?

    While I'm surfing away oblivious to this bug, malicious hackers who have attained knowledge of this bug are busy uploading Back Orifice on my system.

  • I'm a sysadmin by profession and choice. Now here's a hypothetical situation:

    I work for a company that has control over technology X, where it is vital during development that X stay an internal and closely held secret. This is consistent with a lot of companies. To keep X internal we have a firewall. Like almost every company these days, we find it necessary to allow http fairly free access through our firewall.

    In this circumstance a browser presents the most likely form of penetration of my firewall. My user runs up the buggy webbrowser (take your pick, they've all had read arbitrary file exploits with the exception of Opera and Lynx AFAIK), and joe@mycompition.com gets some funny html to them (emails it probably), oops.

    Http is one of the only protocols that is routinely passed through almost every firewall in the world. I feel a browser exploit is one of the most critical security issues in existence, for the corporate IT dept at least.

    And saying consumers shouldn't be aware of this is like saying we should remove low-oil lights from cars because they might scare users. The solution is to make the technobabble understandable. It's not that tricky; lots of us have to do it for management types every day :).


    ----
    Remove the rocks from my head to send email
  • > In that case, the benefits of publicizing the defect must be weighed against the dangers. If they publicize the defect, they may get some help fixing the defect from the public. However, given that the Mozilla project has had a fairly long history, they probably have a good idea of which members of the public will contribute. The likelihood of someone new stepping up to fix a security problem is very slim.

    Ah, but what about the benefits to someone other than Mozilla? What about me, and all the other sysadmins who have a responsibility, morally and legally, to keep our employers' networks secure? We, the customers of Mozilla (or Oracle, or M$, I'm not picking on Mozilla here) have a right to expect that we won't be kept in the dark about a security exploit that might be used against us. Mozilla has *NO* way of knowing, when they choose to keep a bug to themselves, that it is not being actively exploited in the wild. The only morally responsible thing for them to do is to spread the publicity far and wide so that I, as a user, have the option to suspend use of their product until I know it is once again safe.
    Furthermore, I believe this would provide Mozilla with a very real market differentiation from their competition (primarily IE).

    History has shown that when given the option to keep it to themselves, companies choose to take unconscionably long lengths of time to patch security holes.

    I am a professional who takes his responsibility to his customers very seriously. I find it reprehensible the manner in which our suppliers unilaterally decide to keep this information from us.

    ----
    Remove the rocks from my head to send email
  • Hrm, I obviously shouldn't write after an allnighter, I wasn't being as clear as I thought I was. Clever that. OK let me see if I can explain:

    Now, before it goes gold, include code that allows for remote notification of security problems. Code it in now, for the future. It zots off to mozilla.org and checks for a specific html page. If it's there, it renders it in a pop-up window. The page would be different for each version of mozilla.

    Now if there's a security exposure for the version you are using, when you boot Mozilla it would bring up a window saying, "We have identified a possible security exposure. (Suggested workarounds: disable Java temporarily, etc.) We are currently working on an updated version of Mozilla and will notify you as soon as we have one completed."

    When the new version is available the pop-up would come up again, and include a link to get your shiny new cracker-proof copy of Mozilla.

    Simple, straightforward, and it accomplishes the primary goal of full disclosure. It keeps everyone, the white hats, the black hats and the innocents (users), all on the same information footing. We can then make decisions in our various capacities having as much information in hand as is available at any given time. Users could choose not to use Mozilla until the bug fix comes out, for instance. Security officers could write a patch for the proxy on the firewall to log attacks.


    ----
    Remove the rocks from my head to send email
  • Just one small detail, (admittedly I don't agree with your position, but I've covered that in other posts) how do you figure PGP is obscure? Phil Zimmermann made a point of posting the source code, so that in the grand tradition of cryptography anyone and everyone could look and see if anything was left weak.

    As for M$ junk, the problem is that the holes that get out tend to cause more damage. Witness the recent spate of credit card info thefts from MSSQL servers. Exploits that are found are found by the black hats, and we only find out about them when they become too blatant and someone notices.

    Minupla
    ----
    Remove the rocks from my head to send email

  • Start_Mozilla();
    // begin security bugcheck
    if exists("http://www.mozilla.org/security/alert_v100.xml")
    {
        pop up a warning message containing the contents of
        http://www.mozilla.org/security/alert_v100.xml
    }
    // end security bugcheck

    Continue_Mozilla();

    So you see, the software knows nothing about new bugs; it just does a quick check of an online source when you start it, and displays the warning from it. The software is no smarter, it just knows where to look to see if the developers have some news to pass on. (A fuller sketch of such a check follows after this comment.)


    ----
    Remove the rocks from my head to send email
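    A rough, self-contained sketch of the start-up check described in the comment above (the URL comes from the poster's pseudocode; the use of libcurl and all function names here are illustrative assumptions, not Mozilla's actual implementation):

    /* Hypothetical sketch only: probe for a version-specific security-alert page
       at start-up and warn the user if it exists. Assumes libcurl is installed. */
    #include <stdio.h>
    #include <curl/curl.h>

    /* Discard the response body; we only care whether the alert page exists. */
    static size_t discard(void *ptr, size_t size, size_t nmemb, void *userdata) {
        (void)ptr; (void)userdata;
        return size * nmemb;
    }

    /* Returns 1 if an alert page exists for this browser version, 0 otherwise. */
    static int security_alert_exists(const char *alert_url) {
        CURL *curl = curl_easy_init();
        CURLcode rc;
        if (!curl)
            return 0;
        curl_easy_setopt(curl, CURLOPT_URL, alert_url);
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, discard);
        curl_easy_setopt(curl, CURLOPT_FAILONERROR, 1L);   /* treat 404 as "no alert" */
        rc = curl_easy_perform(curl);
        curl_easy_cleanup(curl);
        return rc == CURLE_OK;
    }

    int main(void) {
        /* Version-specific alert page, as in the pseudocode above. */
        const char *url = "http://www.mozilla.org/security/alert_v100.xml";
        curl_global_init(CURL_GLOBAL_DEFAULT);
        if (security_alert_exists(url))
            printf("Security alert found: a warning window would be shown here.\n");
        else
            printf("No alert for this version; continue start-up as normal.\n");
        curl_global_cleanup();
        return 0;
    }

    In a real browser the warning would be rendered in a pop-up window rather than printed, and the check would want a short timeout and a signed alert file so the notification channel does not itself become an attack vector.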
  • by CoolAss ( 62578 )
    Ok, ok... so I said I would never visit Slashdot again, I lied.

    I just can't keep away from all the hypocrisy!!!

    As far as this crap goes, GIVE ME A BREAK! If this was ANY Microsoft product the bugs would have been flooded all over the net and you people would be cheering.

    But it's Nutscrape, your pride and joy, so no no... we can't have bug reports spread. They are so good and happy, no bugs here! LOL!

    Netscape sucks. Period. Let the bugs fly!
  • This argument would also exempt Microsoft, IBM and any other company that refuses to acknowledge defects in their code. How can we complain about the MAX_INT number of errors in Win2K and not complain about this?

    If this behavior meets the current definition of Open Source, it is time to change the definition again!
  • > But when the Mozilla developers spot a bug, why the hell should they not keep it to themselves?

    So it can be fixed.

    > The bug is found, so it's not a question of "more eyes".

    Then it becomes a question of more hands coding. Look, do I really have to get into such a childish debate to explain it? It's like pulling teeth. You don't seem to get the core of the issue, which is: CRACKING is EASY. You don't need a schematic to slash a tire. How many virus creators have the Windows source code? Evil is easy, good is difficult.

    > They're already working on it, and it's pretty unlikely that anyone outside the project is going to have a far superior solution.

    Why is it people think the population is retarded? I work on one project where one fool didn't release details to people who COULD help, much less your so-called "outsiders". Elitism pisses me off. Anyone who doesn't code downloads the binary. You seem to forget, however, that the ones in the project are also in the general public. They may be working on a project and are interested in getting more details on how to interface with Gecko. Those people are very likely the ones who could help.

    > How in hell does it help security for them to give the cracker kiddies their raw material?

    Kiddies don't NEED the source. Software IS NOT a black art. Christ! You really don't get it.
  • > Are you really protecting people by keeping vulnerabilities secret? If you were a car manufacturer and had a defect in a car door that opened by accident, would you keep this secret until a fix was found, or would you make this information public? It might be wise to warn others so they might be able to prepare themselves, lest they end up finding out about this vulnerability at the wrong time.

    One difference: security bugs in Netscape/Mozilla are not lethal. (Least not to humans... systems maybe.) Point being, if you release bug info about a faulty hood latch, people aren't going to go around exploiting said bug. Whereas releasing security bug information will give "script kiddies" the chance to exploit and brag.

    > I prefer to use an operating system and utilities where users are quick to point out flaws, share a script to quickly prove its flaw, and convince others quickly that it's a bad bug. Usually, a fix is available by the time I see the exploit (hours).

    So would I. When you find one in which people use scripts to demonstrate and NOT exploit... let me know please :)
    Not saying that everyone exploits in the Linux/Open Source community.

    > Give me a free system where people are allowed to speak, work together, and make the product possible. Hiding details only benefits a group, not the public. Sharing problems opens you up to a community. It's nothing to be afraid of. It's progress and change.

    Agreed. Note, I am not disagreeing with you, just saying that there are two sides to the issue, and I can understand their reasoning for keeping the info within a small group. Limit the access, limit the liability.

    In truth, I'm torn. I am glad they want to protect me from "script kiddies" exploiting holes and bugs found in their software, but at the same time it bothers me. Then again, I'm not sure *all* information wants/needs to be free.

  • > AOL is a BIG chunk of the newbie market.

    Yes, exactly the same people who won't upgrade a browser when someone finds a security hole in it, because they don't read Bugtraq. Personally I have no patience with such people. :)
  • > but I can't see it going much anywhere on Windows. It's not, and never will be, wide spread enough or important enough to hide security bugs.

    You seem to forget that AOL now owns Netscape, and has hated Microsoft for some time. How long after NS6 will AOL release a version of their software based on it instead of IE5? Or how long until they have a set-top box based on Linux and Mozilla? Most likely in the next two years, IMHO, and AOL is a BIG chunk of the newbie market.
  • legal issues - the bugs probably contain code excerpts, API discussions, patches, and such that you can't see w/o netscape violating their RSA license for BSAFE. Besides, the code that those bugs apply to (PSM) has patent issues that mean you can't see it or work on it anyway.

    RSA has gone out of their way to make SSL an extremely sticky mess, that's why mozilla avoided it for so long...
  • Isn't it more disrespectful to open up about the bugs when a patch isn't almost done? What good can a bunch of average joes and script kiddies knowing about the bug do? Only the developers can fix it, so why shouldn't they be the only ones to know about it until they're a few days or so from fixing it? All releasing the information about bugs does is give competitors, no matter who they are, an opportunity to play on the ignorance of joe blow user. Think Microsoft FUD is bad now? Just wait until they get the chance to rip Mozilla to shreds because only Mozilla's developers publicly admit to all the bugs. That's the problem with admitting that they exist before a patch is almost finished: it gives companies like Microsoft an easy way to spread FUD and gain converts quickly.
  • Whether a fix exists or not, when a problem with a product is known to the developers, withholding it from the users is to strip the users of the right to information essential to deciding whether they want to keep using the product. Even though this software often comes with no warranty, it's a moral obligation to disclose the truth even if it might mean scaring away some cautious users. I think it's even so for the users who are the willing beta testers.
  • This should, IMHO, be rated a lot higher than it is. It seems that a lot of the people who've gotten high scores in this discussion haven't read the article. They just saw "Open Source project...not disclosing" and jumped.


    -RickHunter
  • by karmma ( 105156 )
    Why should the Mozilla community trust Netscape if Netscape doesn't trust the community?
  • [Devils_Advocate begin]
    No, you're right, security through obscurity doesn't work. But you've missed the point; that's not what they are doing. They are using the 'open source to find bugs' principle, and then not disclosing the details to the public until a fix is found. This is damage limitation: there is a period of time during which the bug exists and a patch does not, and by limiting the people who have access to this information they are attempting to limit the amount of damage that can be done by 'script kiddies'. It makes some sense; how good an idea this is is debatable, but it is not security through obscurity, and to call it such is to muddy the issue with buzzword bingo.
    [end Devils_Advocate]
    Thad
  • Linux isn't targeted at the average person, but any particular version of Netscape is. With this comes a larger percentage of people willing to screw other people over, and a larger percentage of people who have no clue what to do about it.

    Linux has the bonus of being a much smaller community and can be controlled by the people more. However, 'open sourcing' exploitable bugs and how to exploit them in a project such as THIS one can only mean problems for Average Joe User, and a field day for Average Joe HaXoR d00d.
    ----
    Don't underestimate the power of peanut brittle
  • I take it we'll be getting your crack of PGP any day soon, then? How about the non-open protocols used by the interbank system? Or perhaps you're working on those military cyphers? The NSA has a few coding machines which are kept under armed guard 24 hours a day! That's so obscure that it's bound to be incredibly weak.

    "Security through obscurity does not work" gets used as a complete substitute for thought about the relative uses of obscurity and openness. It's just as permissible to say "Security through openness does not work". In fact, security through any single one-word concept is likely to be weak, because security is a whole process.

    When it comes to things like my medical records, or bank statements, then I think I'd like them kept obscure, thank you very much. There are other forms of information, like nuclear bomb plans, which I'd also like as obscure as possible, simply to make things difficult enough that people won't bother.

    When it comes to software, then yeah, open the source. But there's no need to shout about bugs to people who have no intention of helping to fix them. The source of Mozilla is open, so if you spot a bug yourself, put it on your website, post it to slashdot, do whatever you like.

    But when the Mozilla developers spot a bug, why the hell should they not keep it to themselves? The bug is found, so it's not a question of "more eyes". They're already working on it, and it's pretty unlikely that anyone outside the project is going to have a far superior solution. How in hell does it help security for them to give the cracker kiddies their raw material?

    This story has nothing to do with "security through obscurity". The source of Mozilla remains open. It's just a group of people exercising their right to work together and not to invite outside interference.

    And people who first-post stupid slogans for the appreciation of the slashbot moderators should be ashamed of themselves.
  • Check it out at mozillaZine [mozillazine.org].
  • I'm willing to bet that the average end-user of Mozilla (consumer) will not actively check for updates, patches, security fixes etc (assuming that distributions such as Netscape will have the same kind of wide-spread usage that Netscape 3 enjoyed). So will Mozilla (or Mozilla distributions) check for updates - perhaps in a way similar to IE and Windows98? Otherwise your average consumer will be running an older version of the browser whose security bugs could potentially be all over the internet for every little script kiddie to exploit.
  • Windows update is nice, but it only alerts the user after the fix is known, not when the bug is reported.
    --
  • Because "the community" includes thousands of script kiddies. I think limited initial disclosure is easily justifiable.
  • 13785 [mozilla.org] was an error, it turns out, and has been remedied.

    I'm apparently to look out for other such cases of error, but some at Netscape don't think that I should be able to view Netscape-confidential bugs, so that might get tricky.

    We Shall See.

  • Thank you sir, for pointing out the obvious point everyone seemed to be missing.

    It seems the great Netscape/AOL group in their corporate wisdom forgot all the valid reasons why they made the damn product open source and began this project in the first place. It is time for the flood of perfectly flame-free reasonable commentary to everyone in Netscape's corporate ladder. Anyone out there have an email list? They need to be reminded of the reasons why they started this to begin with.
  • While in the long term security through obscurity doesn't work, in the short term I'd think that it's a workable solution. I wish that the lot of ethical hackers would announce their discoveries of bugs and holes to the companies whose products have them and, depending on the complexity of the issue, hold off on making any announcements until a fix is at least available.

    I mean, yes, the more skilled crackers will still have a chance to find out and exploit the holes, but it might at least prevent some of the script kiddie utilities from popping up, until at least a fix is available. If people get hit and there is no fix, it's basically terrorism (on a much larger scale). If people want to browse the web, they've got few real choices, so it's not like they can go "oh... IE sucks, i'll switch to netscape. Oh, netscape sucks too, i'll switch to Opera, oh... opera sucks, what do i do now?"

    If fixes are available, and people still get "hit" by the script kiddies, at least they have someone to point their finger at, with that someone being themselves.
  • I agree entirely.

    Let's take an ancient example: unix fingerd. It had bugs, you could get root exploits, and all that stuff. But note: NOW FIXED!
    You do still get "sysadmins" who think that "just not running a fingerd" is the way to save themselves from the bugs; but it's really no excuse to deny people a reasonable service (in the tcp sense if not legalistic!) just because "it used to have bugs". *Every* piece of software out there "used to", and every piece of software out there "still might have". So run it if you want, and keep it up to date, and pass on what you know, and implement security fixes....
    Oh look. Did I just describe something compatible only with keeping the whole thing open? Oops ;)
    ~Tim
    --
    .|` Clouds cross the black moonlight,
  • bah.. what do primarily C++ mozilla programmers know about fixing security flaws? Hands up, how many mozilla developers know how to write a buffer overflow? How many of you know how to find a buffer overflow.. I mean, you coded it in the first place, obviously you don't know what you coded was a bad thing. So yes, if someone on the mozilla team finds a security bug, go ahead and keep it secret.. the "script kiddies" trolling the source for overflows will do the same.
  • I think there's been ample evidence that this is not the case. Let's look at the IIS bug that came about this last summer. Big buffer overflow in a major server product. Came out in bugtraq among other garden spots. Exploits and fixes hours later.

    There are more white hats then there are black hats. We should use our numbers to solve these problems quickly by finding a solution and DISTRIBUTING the information as quickly and as widely as possible.

    Doing otherwise will alienate some of your best sources for fixing the problem: your open-source programmers. How long do you think a security bug would remain unpatched, given the number of eyes an open source software project like Mozilla has looking for a solution? Frankly, I'm sure we'd have a patch before they had an exploit for it, both of us starting from the same place.

    Security through obscurity gives a false sense of security to everyone.

    Consider the following (not totally unrealistic) scenario. (Not saying this would happen with Netscape/Mozilla, but it could and has happened with other software products.) Netscape finds a security bug, and generates an internal bug for it. It gets assigned to an overworked programmer (and let's face it, they're all overworked), who says, "well, I have this list of things to do, and this bug isn't crashing computers cuz no one knows about it, so I'll just put it on the back burner". In the meantime, someone else with access to internal bug reports decides to 'leak' it to a friend to prove how hooked in he is. It goes out to the 313t3 crowd, we have an exploit and script kiddies, and eventually the rest of the world finds out through bugtraq. Oops.

    Think it's far-fetched? Check your bugtraq archive site; there are piles of "I sent this to vendor X 3 months ago, they haven't done anything about it, so I'm submitting it to Bugtraq now" posts. Happens all the time. If we don't talk about it, we lose the biggest advantage the net gives us. Communications.

    Disclaimer: I am not affiliated with bugtraq in any way other then being a loyal reader for many years.

    ----
    Remove the rocks from my head to send email
  • > ...and only apply the embargo if the knowledge is not already available in the cracker community.

    Ah, now there is the crux of the problem. How do you know? Sure, it's easy if someone puts a 'sploit on bugtraq. But what if they don't? What if some group keeps it to themselves, abusing the bug in private? Now maybe you argue that 48 hrs isn't that bad. But what if that is 48 hrs during which some web site is infecting each computer connecting to it with a worm? We've seen lots of browser bugs that allow files on computers to be manipulated, so this isn't way out in fairy tale land. This would be a pretty standard "keep it quiet, we don't want anyone to know the emperor isn't wearing any clothes just yet" bug. I am a good internet user. I have a personal firewall between my browser and the net. I have the skills to be able to defend myself if I know I'm being attacked. Do you want to guess how irate I'd be if I found out Netscape/Moz. withheld details of the exploit that compromised my home computer, or worse yet, the company I work for, because they wanted to avoid bad press? Ha! Just watch the press: "Open Source software causes Internet Worm, Gates says, 'I told you so'". Seem far-fetched? I argue it's simply a matter of time.

    The ONLY socially responsible thing for a software organization to do is to publicize it as far and wide as they can, as soon as they can. That way they can use that as a defense when the dark smelly stuff hits the oscillating airflow assist device.

    ----
    Remove the rocks from my head to send email
  • They have missed the point of open source, but there are precedents for doing this, eg a developer notices a way mozilla could leak credit card info. However:

    1. By hiding the hole they are blocking interested developers from having a chance to explore and fix the problem

    2. In many cases the bug would probably be talked about (and possibly exploited) on Bugtraq anyway

    3. They should be secure enough in their beliefs that if a serious security bug is found, instead of hiding the fact, they should issue an announcement saying there is a bug, and leave it up to people to decide if the software should be used until a solution is found. There could be levels of this, eg:
    a) Bug is serious, and being exploited in the wild,
    b) Bug is serious and not being exploited
    c) Bug is minor, but some people may be affected

    Once a fix is found this could be distributed on a mailing list.
  • > Not only does the public have full access to the current source code, it also has access to all bug reports.

    There are some bugs that are closed to the public. At least four SSL bugs (all dependencies of 13785 [mozilla.org]) are: 28335 [mozilla.org], 28418 [mozilla.org], 28430 [mozilla.org], and 28333 [mozilla.org].

    --

  • The article goes on and on about "security through obscurity" vs. "many eyes", but fails to mention a single actual security related bug.

    The majority of the software mentioned is actually security software -- firewalls and intrusion-detection stuff.

    The only "exploit" mentioned is the DOS against Yahoo et al a couple weeks back. Which, they again fail to mention, did not exploit bugs in software. Or, it didn't exploit any bugs in the software run by the sites attacked.

    It seemed more along the lines of, oh, look at the security companies who gave us money to mention them. See their neat products.

    --Kevin
  • From reading the referenced post, it appears that there will be a system whereby "security" bugs only appear on bugzilla if you are logged in as a member of the security team.

    Currently, it is limited to "netscape only", which doesn't make that much sense, because not all of the main contributors are in that category.

    --Kevin
  • > I would recommend that all open source projects do the same. If you spot a security bug in the Linux kernel, or Apache, or sendmail or whatever, let the maintainers know quietly and give them a chance to announce and fix in good order; tell the world only if this procedure doesn't seem to work.

    This is generally what happens on mailing lists like Bugtraq. However, often it's better to just post to Bugtraq first off and let people confirm/analyze it, then figure out how to fix it (usually easy when you know what's wrong). For instance, over the weekend someone posted something that crashed his Slackware 7 / 2.2.14 machine - he thought it was a kernel bug. I tried it on my RH 6.1 / 2.2.14 machine (who needs uptime?), but nothing happened. So maybe it's a problem with PAM? Maybe something else? Probably it'll be figured out in a few days, and a fix will be applied. But it wasn't a kernel bug: he would have just been bothering the wrong people.

    Note this is not the case if you know what the problem is (ie, there is a buffer overflow in line xx in somefile.c in Apache, or whatever). In that case the right thing to do is to let the proper people know, then post to Bugtraq and/or other appropriate mailing list(s) after a week or two has passed and a fix is available.
  • It's generally considered good etiquette on mailing lists like Bugtraq to give the vendor at least a week advance notice before posting, to give them time to at least start working on a fix. However, if Mozilla is going to take bug reports and "cover them up" for an extended time, I can certainly see people posting exploits without bothering to let Mozilla know: why bother?

    In some situations, this method is acceptable (for instance, some security holes in DNS and Kerberos were hidden for years to allow fixes to be implemented). However, Mozilla (however over-hyped it is) will never have the wide spread adoption of DNS or Kerberos - if it's lucky and well managed it may become popular in the *nix/BeOS/MacOS/etc world, but I can't see it going much anywhere on Windows. It's not, and never will be, wide spread enough or important enough to hide security bugs.
  • Open Source is analogous, but not strictly equivalent to scientific peer review. Among the differing concepts are:

    "Given enough eyeballs, all bugs are shallow."

    and

    One is NOT allowed to propagate alterations to community work without disclosure.

    Neither of the above are true in scientific peer review, and both are frequently touted as essential elements of Open Source success.

  • > Mozilla is not a daemon used by sysadmins, but a consumer product.

    If I'm administering workstations, I may well install Mozilla. As admin, I should be concerned.

    > Consumers don't care about bugs, and don't like to update their browser every week with a 56K modem.

    Generally true, but presumptuous. This is a Microsoft-esque outlook: we know what you care about, and we're providing it for you, like it or not. Besides, by that logic, why not announce the bugs before there are fixes? The end-users probably won't update their browsers anyway.

  • > This is why Mozilla's modularity needs to be carried one step further, such that security updates can be found and fixed at the touch of a button. I envision something like this: an "Update" button on the toolbar.

    You know, such things have always struck me as potential security holes in themselves. For instance, a man-in-the-middle attack might be possible. I run Debian's apt-get, and I always wonder just a little...

  • > The problem here is that the open source model isn't designed because of efficiency (I'm definitely not saying it doesn't work); it is an ideal. Open source is more political than technical.

    I think you make a good point; often we don't talk about freedom, and we should. But nobody's talking about hiding the source. A bunch of people are just deciding under what circumstances they'd like to discuss the source. Since they're not talking about stopping anyone else from doing anything at all, I can't see that anyone's rights are infringed by this.

  • I think there are some good points being made on both sides of this. For my part, I (more or less mildly) disapprove of this scheme.

    If there's a security hole on my system as a result of my installing Mozilla, and the Mozilla developers know about it, I think they have a certain ethical responsibility to let me know about it. Then I can decide how to deal with it. Because I make the decision, my needs can be taken into account.

    The consensus in the free software community seems to be that empowering the user is a good thing in general. On the topic of security in particular, we think that if you empower both sides by letting them know what's going on with the code, it's a net gain for the "white hats".

    If we really hold such a belief, perhaps we should have the faith to see it through to its conclusion.

    If you don't disclose until there's a fix, I'm stuck with an insecure system, banking on the obscurity of the bug till someone lets me know what's going on. If you disclose right away, I have the choice to take whatever steps I feel appropriate, including suspending use of the program.

    IMHO, choice is a good thing here, as elsewhere. But like I said, the other side has some respectable points.

  • Well... okay, the bugs may get fixed. But if the Mozilla developers do not put the bugs with the rest (in Bugzilla), then they can expect to spend much more time fixing them. And perhaps some will never be fixed...

    Aside from that, having the bugs available for everyone to view allows users to decide whether they want to use Mozilla; each user can make an educated decision about whether Mozilla is secure enough for them. Holding back the bugs is like the Mozilla developers saying: "Mozilla is secure enough for you (end users)," which of course takes away some freedom. Users do still have the option of simply not using Mozilla until the bugs are fixed or disclosed.

    To me this sounds like a marketing decision. I imagine some Netscape suits do not want to ship Mozilla with known security bugs. So they have two options:

    1) Delay the shipment of the new Netscape until the important security bugs are fixed.

    2) Hide the bugs.

    What would you do? Microsoft would hide the bugs. Netscape seems to be no better than Microsoft and will likely share the same fate (watch the next couple of years for Microsoft's fate ;)
  • by Paul Crowley ( 837 ) on Tuesday March 28, 2000 @12:59AM (#1166357) Homepage Journal
    It was the February issue [counterpane.com] that discussed this. I agree with you and him, and I think Netscape could avoid this whole row by promising a fixed "sunset" after which bugs will be publicized no matter what. A month should do it - most security problems that can be fixed with patches at all seem to be "d'oh!"s that get fixed very rapidly after identification.

    I would recommend that all open source projects do the same. If you spot a security bug in the Linux kernel, or Apache, or sendmail or whatever, let the maintainers know quietly and give them a chance to announce and fix in good order; tell the world only if this procedure doesn't seem to work.
    --
  • by Millennium ( 2451 ) on Tuesday March 28, 2000 @01:06AM (#1166358)
    This is an outrage. It goes completely and totally against the principles under which Mozilla is being developed. While I can see where the idea comes from, it's the same shortsighted lunacy which affects most proprietary vendors today.

    It is precisely because security holes in Linux are announced openly that it is so secure. There will always be security holes in software, so the race for most secure will go to the one with the fastest bugfix turnaround time. That will go to the one with the most eyes looking at the code. And that means the one which doesn't try to "hide" its flaws.

    This does pose a potential problem, however, because you are dealing with Web browser users here. Most aren't savvy enough to upgrade the browser every time a new hole is found and patched. This is why Mozilla's modularity needs to be carried one step further, such that security updates can be found and installed at the touch of a button. I envision something like this: an "Update" button on the toolbar. The button is normally inactive, but when some central repository (Mozilla.org perhaps?) finds and posts a bug, it activates and glows (or undergoes some other obvious state change to alert the user that something is wrong). The button takes the user to the security site (wherever that might be), where the user is given an explanation of the bug and the chance to automatically download and install the fix (a rough sketch of such a check follows below). This last step has to be made as transparent as possible; remember, we're dealing with everyone from techno-gods to people who wonder why the cupholder on the front of their PC isn't working anymore after they set a huge drink on it.

    But the fact is, security through obscurity for an open-source project is hypocrisy of a very high order. I seriously hope Mozilla doesn't take this path. The results would be disastrous for Mozilla and Open-Source in general (M$ in particular will jump on this as a precedent: "see, even Open-Source guys know openness doesn't work!")
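
    For what it's worth, the check behind such an "Update" button could be as small as the sketch below; the advisory URL, the response format (a plain version string), and the version numbers are assumptions made for illustration, not anything Mozilla actually does:

        # Sketch: poll a central advisory endpoint and report whether a newer
        # (patched) version exists; the browser would light up the toolbar
        # button when this returns True.
        import urllib.request

        ADVISORY_URL = "https://www.example.org/mozilla/latest-safe-version.txt"  # hypothetical
        INSTALLED_VERSION = (0, 8, 1)                                             # example build

        def parse_version(text):
            # "0.8.2" -> (0, 8, 2); tuples compare component by component
            return tuple(int(part) for part in text.strip().split("."))

        def update_available():
            with urllib.request.urlopen(ADVISORY_URL) as resp:
                latest = parse_version(resp.read().decode("ascii"))
            return latest > INSTALLED_VERSION
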
  • 1. PGP has never relied on obscurity.

    2. "When it comes to things like my medical records, or bank statements, then I think I'd like them kept obscure"

    I'd rather have mine kept secure. You seem to be mixing the two. Obscuring your records by XORing them and posting them to USENET probably won't give you the desired result (see the quick demonstration below).

    3. The NSA also doesn't give KG-84's, KY-71's, or whatever they're using this year out to anyone who wants one; that gives them a decisive edge in this case (though I've always wondered why the physical boxes themselves were always security-rated lower than the information flowing through them).

    This story has a lot to do with security through obscurity, since "normal" bugs don't get this treatment, just security bugs.

    Maybe you should re-evaluate the meaning of the term "security through obscurity", as you seem to be missing it.

    Paul
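
    A quick demonstration of the point about XOR, with made-up data; applying the same keyed XOR twice hands back the original bytes, which is why it only obscures:

        # Obscurity, not security: anyone holding (or guessing) the key
        # recovers the plaintext exactly.
        def xor_bytes(data, key):
            return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

        record = b"blood type: O negative"
        key = b"secret"
        obscured = xor_bytes(record, key)
        assert xor_bytes(obscured, key) == record   # XORing again restores the original
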
  • by The Vorlon ( 15731 ) on Tuesday March 28, 2000 @05:33AM (#1166360)
    Many consider it bad form to post to BUGTRAQ without first giving a software vendor a chance to address the problem.

    It's also considered bad form for a vendor to sit on a security bug instead of dealing with it promptly.

    Up to this point, Mozilla has actually been fairly unusual in that not only does the public have full access to the current source code, it also has access to all bug reports. While this is a commendably open attitude, it's not the best way to protect the end user.

    All this policy change really means is that, in the short time between when the bug is first reported and when a fix is issued (or it shows up on BUGTRAQ without a fix), it's not posted in BugZilla for all the world to see. Effectively, what this means is that there are going to be fewer people out there trying to write exploits based on the bug report. This is no different from the policy in use by any other team I've ever encountered.

    If they publish information about the security bug while their product is still vulnerable, what do they gain? It will cause the userbase to worry more. Some may stop using the product until the fix is available, but what about those who don't have that option? What if you've deployed Mozilla in 30 public labs on campus, on 5 different platforms, you're the only one who can do anything about it, you have to be on campus to fix it, and you're reading about the problem while on vacation in Cancún?

    The above scenario is extreme, but not far-fetched. For most end users, the only real difference will be that it has suddenly become much more likely that someone will brew up an exploit for the security hole.

    So there's a security hole in the browser. So what? Do you really think the software you use is secure? There are a *lot* of free software programmers out there who write the kind of code that would spell "instant root shell" if it were ever run suid root. Even if none of the programmers working on Mozilla are like this, there are still a lot of things that can go wrong just because of the sheer complexity of the application.

    If you think the programs you're using are secure, you're kidding yourself. The most security-conscious development team I've ever had the pleasure of observing is the Samba team, and even they've had a security hole or two over the past few years. There are always going to be problems, and if they're security problems, I'd much prefer that only the program maintainers know about it, instead of publishing it for all the crackers of the world to see.

    I for one commend the Mozilla team for keeping the user's best interest in mind with this decision.
  • If you don't like the plans that mozilla.org and its contributors come up with for handling certain security issues, there are a couple of things you can do about it. You can grab the Bugzilla code (or some similar open source tool) and set up your own system. If you build something with enough value over the current Bugzilla, maybe people will start using it, and you can admin it and set the rules about how to mark certain bugs in certain ways. The Mozilla community has been holding very public discussions about how to deal with bug reports, from issues like security bugs to issues like what to do with personal ads that find their way into Bugzilla. Most of these discussions are still ongoing. If you contribute to Mozilla (in any way), then I think your input to these discussions is valuable. I participate. I voice my opinions. Bugzilla and Mozilla are the way they are in part because I participated in the process.

    If you're unhappy with the direction Mozilla is going on any issue the place to make it better is not on /. The place to make Mozilla better is Mozilla. This project is being developed by people who care about making something great. Even the people getting paid to work on it care. Tough decisions are made every day and they are made in a fishbowl world. If you don't like what Mozilla is then fix it.

    Asa

    (posted with a damn nice Mozilla nightly build 032708)

  • by Hard_Code ( 49548 ) on Tuesday March 28, 2000 @04:57AM (#1166363)
    Man, people...the first thing I do when I see a suspicious story is go to the SOURCE.

    On MozillaZine.org they have a post explaining that there is only open /discussion about/ security and disclosure:

    http://www.mozillazine.org/talkback.html?article=1268
  • by tqbf ( 59350 ) on Tuesday March 28, 2000 @01:58AM (#1166364) Homepage
    Not everything Bruce Schneier says is right. The article you're citing is particularly wrong, and I wrote a formal response to it which you can find at SecurityFocus [securityfocus.com] or my home page [pobox.com]. Schneier, and possibly Mozilla as well, is missing the point of full disclosure.

    We can debate the morality of nondisclosure ad nauseam. I'm more interested in engineering than morality --- and the fact of the matter is that policies that discourage full and open disclosure DO NOT WORK. They hinder the discovery of important security flaws and create an environment in which black hats have a significant advantage over white hats. Remember, the black hats don't give a damn about Mozilla's disclosure policies, and history tells us that they tend to find the problems first.

    Nondisclosure has (dubious) practical benefits for tightly-guarded closed projects. Mozilla obviously doesn't qualify, and the idea that security flaws in Mozilla's open codebase can be meaningfully hidden is ludicrous.

    As this is a Slashdot story, I don't trust the veracity of the claims that Mozilla is going to try to hide security information. However, if, contrary to the established best practices in the security community, they decide to go ahead with some sort of Mozilla "inner circle", they are going to give a nice black eye to Open Source's security argument.

  • by haggar ( 72771 ) on Tuesday March 28, 2000 @01:11AM (#1166365) Homepage Journal
    note: this is not strictly related to this post; it (unfortunately) may apply to many other stories that have appeared on /.

    Are Mozilla developers missing the point of open source (implying open security bugs) or are they under pressure from Netscape? Tell Mozilla developers what you think.

    OK, so why do we all have to think the same? The country I came from had a bitter fight for DEMOCRACY, which means we don't all have to think, speak, and act alike. Freedom of thought, freedom of expression, etc. So please don't imply that we will all tell Netscape that they don't get it, because, shock horror, we are not an army under the command of the /. generals.

    Sorry if this is off-topic; I think it had to be said sooner or later.

  • by SimonMcC ( 104927 ) on Tuesday March 28, 2000 @12:22AM (#1166366) Homepage Journal
    Surely all they are doing is withholding info on the bugs, not the actual source code, preventing people from taking advantage of the weakness until it's fixed. The code is there; go find the bugs and weaknesses yourself!
  • by Paul Johnson ( 33553 ) on Tuesday March 28, 2000 @12:23AM (#1166367) Homepage
    I think this is a good idea, provided that limits are placed on this non-disclosure. For example, an embargo of 48 hours to give the security people a head start over the script kiddies could be a very good idea.

    On the other hand if the script kiddies already have an exploit for a hole then not telling sysadmins about the problem is obviously counterproductive.

    So, limit it to 48 hours, and only apply the embargo if the knowledge is not already available in the cracker community.

    Paul.

  • by thogard ( 43403 ) on Tuesday March 28, 2000 @02:41AM (#1166368) Homepage
    You find a bug that is a security risk, and it's either:
    A) a major hole (lets a remote user run any code as root)
    B) a minor hole (you build a stack frame that may get called one time in a billion only if syslog times out when the moon is full)

    Then you can tell the development team which can:
    A) ignore you
    B) start working on a quick fix
    C) start working on complete fix

    If the bug is A:A, then some script kiddie will find it and make your day worse, but things like B:C are a real pain to fix correctly, and the developers may need time to think about the situation and then take corrective measures while discussing solutions that don't open up other holes.

    I think they are right in holding back major security holes, but when you report the bug you should get a message back saying:
    "Your bug has a number of security-related issues, and we feel that telling the world at this time will result in a number of systems being compromised. We therefore ask you to please wait until [a date a few days away] before disclosing this to sources that may result in exploits becoming widely available. We have set up a special open mailing list for this bug at bug76347634@just.a.dot.com."

    I would accept that as reasonable.
  • by FooBarSmith ( 85970 ) on Tuesday March 28, 2000 @12:29AM (#1166369)
    I think this is a responsible way to behave towards users of the software. The Apache Group works in a similar fashion; from the Apache website:

    Reporting Security Problems with Apache
    The Apache Group takes a very active stance in eliminating security problems and denial of service attacks against the Apache web server. We strongly encourage folks to report such problems to our private security mailing list first, before disclosing them in a public forum. The mailing address is: I-found-a-security-problem-in-the-apache-source-code@apache.org. We cannot accept regular bug reports or other queries at this address, we ask that you use our bug reporting page for those. All mail sent to this address that does not relate to security issues will be ignored.

    Note that all networked servers are subject to denial of service attacks, and we cannot promise magic workarounds to generic problems (such as a client streaming lots of data to your server, or re-requesting the same URL repeatedly). In general our philosophy is to avoid any attacks which can cause the server to consume resources in a non-linear relationship to the size of inputs.

  • by *borktheork* ( 123647 ) on Tuesday March 28, 2000 @12:25AM (#1166370)
    You just said that to get a fast first post without troll factor, right?

    Anyway, this isn't anything to get upset about. If you actually bothered to read Bugtraq, you'd see that this is pretty standard practice.

    Most of the time, when an exploitable bug is found, the vendor is contacted first and is given some time to come up with a fix. Sometimes a workaround is posted along with the exploit.

    Bottom line: making the world aware of a problem there isn't a fix for is usually bad policy. Don't give me that 'we have a right to know' crap. If you want to know, go and find the bugs yourself. Because otherwise, if you know, so do a million script kiddies. And telling people not to use Netscape whilst a fix is being worked on is hardly doable.

  • by Anonymous Coward on Tuesday March 28, 2000 @12:22AM (#1166371)
    Look, the point of open source software is that it is analogous to the scientific concept of "peer review". You'll notice that there is no scientific concept of "review by every lugnut who knows ftp". That's because scientists aren't prey to these libertarian/egalitarian visions in which "everyone can contribute". The fact is that the marginal contribution beyond the first hundred or so developers is pretty negligible.

    You have to think in terms of marginal benefit versus marginal cost. The Mozilla developers may not be super l33to sk33to, but they're at least competent to work on Mozilla. The various long-haired lugnuts, slashbots and script kiddies who will be filling up this thread with karma-whoring sermons on "give us the source!" are massively unlikely to add anything to the exercise but noise.

    This is the way forward. Open source, but peer review only during development. With a defined way to get into the "peer group". Thus shutting out the whiners and lamers, and not letting the whole product be compromised by someone's exploitation of a bug that they had no business seeing.
  • by arcade ( 16638 ) on Tuesday March 28, 2000 @01:23AM (#1166372) Homepage
    What if someone found a hole in Apache? Should they post it far and wide, or should they quitely pass it along to the main developers so that the hole can be closed before half the world's websites are replaced by "ThiZ Site HAXed by KeWl d00d"?

    It should be openly published. Nobody can know for sure that they are the first to discover the bug. It could've been circulating in hidden circles for years - without anybody knowing.

    It is blatantly disrespectful to the customers not to open the bugs to the community. Only that way can they secure themselves.

    --
    Rune Kristian Viken
    --
    "Rune Kristian Viken" - arcade@kvine-nospam.sdal.com - arcade@efnet
  • by Minupla ( 62455 ) <minupla@gmail . c om> on Tuesday March 28, 2000 @12:52AM (#1166373) Homepage Journal
    *sighs* OK, here we go again.

    1) Not disclosing a security hole does NOT make it go away.
    2) Software developers don't always know about all the security holes being actively exploited. It is entirely possible that the hole we're being 'protected' from is in fact being exploited in the wild, and the only thing non-disclosure accomplishes is keeping us from being careful in our use of the product until it can be patched.
    3) Non-disclosure tends to slow the repair of security holes in any environment, never mind open source, where your very strength is in your userbase.

    I personally would be in favor of draconian disclosure: when a security bug is discovered, pop up a dialog box forcing me to read the advisory and 'continue at my own risk' until a fix can be developed and a notice of that fix is distributed using the same draconian alert box (a rough sketch of the idea follows below).

    That way everyone who uses the product (rather than just those of us who read full-disclosure lists like Bugtraq) knows exactly what is going on and can change their habits accordingly. Additionally, you'll have every open source programmer on the planet competing to squash the bug.

    Seems like a no-brainer to me, people.

    ---
    Remove the rocks from my head to send email.
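
    A rough console-level sketch of the "draconian disclosure" idea above; the advisory feed URL and its one-advisory-per-line format are assumptions, and a real browser would use a dialog box rather than a terminal prompt:

        # Sketch: fetch the list of still-unfixed advisories and force an
        # explicit acknowledgement of each before the program continues.
        import urllib.request

        ADVISORY_FEED = "https://www.example.org/mozilla/open-advisories.txt"  # hypothetical

        def open_advisories():
            with urllib.request.urlopen(ADVISORY_FEED) as resp:
                lines = resp.read().decode("ascii").splitlines()
            return [line for line in lines if line.strip()]

        def require_acknowledgement():
            for advisory in open_advisories():
                answer = input("SECURITY ADVISORY: %s\nContinue at your own risk? [y/N] " % advisory)
                if answer.strip().lower() != "y":
                    raise SystemExit("Aborted until a fix is installed.")
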
  • by kevin805 ( 84623 ) on Tuesday March 28, 2000 @12:44AM (#1166374) Homepage
    Everyone is whining about "security through obscurity". Well, sorry, the other forms of security are already gone -- that's what it means that there's a security bug. Security through obscurity is better than nothing.

    Unlike with a closed source project, even a general description of the problem might be enough to find the exact bug in an open source project and develop an exploit for it. This means it's even more critical that the nature of the bug isn't leaked until a solution or workaround is known.

    What if someone found a hole in Apache? Should they post it far and wide, or should they quitely pass it along to the main developers so that the hole can be closed before half the world's websites are replaced by "ThiZ Site HAXed by KeWl d00d"?

    Bruce Schneier addressed this in a recent Crypto-Gram. See: http://www.counterpane.com/crypto-gram-0001.html [counterpane.com]

    Of course, there is the possibility of Netscape taking the Microsoft approach -- just ignore it until actual damage is caused -- but I think this is unlikely. They are already agreeing that the bugs need a wider distribution than Netscape-only. I think it's pretty unlikely they would let security bugs be swept under the rug.

    --Kevin
  • by zesnark ( 167803 ) <zsn AT fastfin DOT net> on Tuesday March 28, 2000 @12:33AM (#1166375)
    Mozilla is still under development and should not be treated as a released product. Security bugs should be published openly to ensure the fastest and most robust fix possible. There is no sense in concealing information about a product still under development.

    z
