When Is It Right To Go Public With Security Flaws?

nk497 writes "When it comes to security flaws, who should be warned first: users or software vendors? The debate has flared up again after Google researcher Tavis Ormandy published a flaw in Windows Support. As previously noted on Slashdot, Google has since promised to back researchers who give vendors at least 60 days to sort out a solution to reported flaws, while Microsoft has responded by renaming responsible disclosure 'coordinated vulnerability disclosure.' Microsoft is set to announce something related to community-based defense at Black Hat, but it's not likely to be a bug bounty, as the firm has again said it won't pay for vulnerabilities. So what other methods for managing disclosures could the security industry develop that balance vendors' need for time to develop a solution against researchers' need to work together and publish?"
  • by Rogerborg ( 306625 ) on Tuesday July 27, 2010 @10:38AM (#33044778) Homepage

    It's never, ever your responsibility. You didn't write the bug, you didn't miss it in testing, you didn't release it. You owe the developer nothing.

    The only ethical consideration should be your sole judgement about the best method to get a fix in the hands of vulnerable users.

    You don't like that, Microsoft? Then do your own vulnerability testing and don't release software with vulnerabilities: the problem goes away overnight. Until then, sit down, shut up, grow up, and quit your bitching about being caught with your pants down.

  • by Anonymous Coward on Tuesday July 27, 2010 @10:45AM (#33044862)

    No one is bright enough to find a security hole that couldn't have been discovered elsewhere before. So it's pretty likely the flaw is either known to the vendor, who might not have seen the need to fix it, or known to an attacker, who is already using the flaw and just hasn't appeared (yet) on the radar of any researcher or the vendor. And since you yourself might be monitored by somebody else, your finding may already be in the open that way. So it makes no sense to keep the public uninformed.

    cb

  • by Anonymous Coward on Tuesday July 27, 2010 @10:46AM (#33044880)

    WRT 3a: So the industry and the manufacturer are basically patting each other on the back, happy in the knowledge that if no one from the club talks about the problem, it's impossible to discover otherwise? It's going to be slightly icky to say "we told you so" when this is discovered independently and causes "a massive Zero Day event that would only harm consumers or leave them without the services of the software for several months." (Note that I said "when this is discovered", not "if". As you may be aware, if something can be done, it's only a matter of time until somebody does it.)

  • Re:Never (Score:4, Insightful)

    by Whalou ( 721698 ) on Tuesday July 27, 2010 @10:47AM (#33044886)
    You do your user name proud.
  • by Nadaka ( 224565 ) on Tuesday July 27, 2010 @10:50AM (#33044932)

    This is standard operating procedure and responsible disclosure as far as I can tell.

    The problem is that the company is likely to seek an injunction to stop the presentation, and possibly press blackmail charges against you.

    You need to amend the above procedure with anonymous notification and demonstration in order to protect the safety of those following responsible disclosure.

  • by mOdQuArK! ( 87332 ) on Tuesday July 27, 2010 @11:10AM (#33045294)

    Technically speaking, you don't owe the other users anything either - it's still a matter of courtesy.

  • by koinu ( 472851 ) on Tuesday July 27, 2010 @11:11AM (#33045302)

    Don't give the bad guys the chance to learn about a flaw before the users who are affected by it. If you don't publish the flaw, there is a real possibility that it will be sold on black markets and kept secret so it can be used against customers. You can see that full-disclosure groups are targets of commercial crackers. Full disclosure destroys the business of criminals.

    A customer should always be aware of a flaw and know how to protect himself against it.

    There is no need for exploit code. You should publish it BEFORE having a PoC, to warn people as early as possible (but this is pretty rare, because having a PoC is usually the first indication that a flaw exists). It would also help to give as much information as possible on how to protect against attacks (fixes/patches, what to avoid, what to disable, how to minimize the risk).

  • Re:Never (Score:3, Insightful)

    by jgtg32a ( 1173373 ) on Tuesday July 27, 2010 @11:30AM (#33045658)

    I thought the Brits cracked all of that?

  • Re:Never (Score:3, Insightful)

    by John Hasler ( 414242 ) on Tuesday July 27, 2010 @11:54AM (#33046102) Homepage

    Actually it was the Poles and the Brits who broke Enigma: the USA broke the Japanese codes. Irrelevant in any case though. The Germans had developed Enigma themselves and were using it only internally: there were no trusting "users" at risk.

  • by Hatta ( 162192 ) on Tuesday July 27, 2010 @12:12PM (#33046410) Journal

    Huh? If there's a severe vulnerability and the manufacturer refuses to fix it, you should release it immediately. Then at least those affected can mitigate their exposure. Otherwise, the black hats have free rein.

  • Re:Never (Score:3, Insightful)

    by bluefoxlucid ( 723572 ) on Tuesday July 27, 2010 @12:13PM (#33046422) Homepage Journal
    The vulnerabilities are the same regardless of who is at risk. The argument is that only 'good guys' are able to find vulnerabilities, and that 'bad guys' don't find such information, can't keep hold of it, or just can't use it. The GP assumes that keeping problems secret will never result in secret underground cults developing a cohesive, structured approach to abusing those problems.
  • "Unsupported OS" means "unsupported OS." The vendor disavows any responsibility for bad things that happen when using their software on your unsupported platform.

    This is a common thing for software vendors to do to close out tickets quickly. If it's an unsupported scenario (hardware, software, use case, etc.) then they can close it and keep their average ticket lifetime down.

    A little shady, I guess, but if they never claimed to support your platform I don't see what you could really complain about.
