Bug Security Software

Why Responsible Vulnerability Disclosure Is Painful and Inefficient 182

A recent rant up at Attrition.org highlights problems with the responsible disclosure of security issues. While some vendors are happy to do their own research and patch reported problems, others drag their feet and make unreasonable demands on a researcher's time and effort, making anonymous public disclosure an ever-more-tempting option. Quoting: "After a couple hours of poking, I found a huge unauthenticated confidentiality hole. Once the euphoria wore off, I realized I had a big problem on my hands. I had to tell my employer's app owners and we had to assess risk and make a decision on what to do about it. After some quick meetings with stakeholders, we decided to severely limit access to the thing while we worked with the vendor. The vendor refused to acknowledge it was a security issue. Odd, considering most everyone who sees the issue unmistakably agrees that it is not acceptable. Now I'm forced to play hardball, yet nobody wants to fully-disclose and destroy relations with this vendor, whose software is somewhat relied on. Meanwhile, I know there are hundreds of institutions, small and large, using this software who have no idea that it has flawed security and who would probably not find the risk acceptable. What can I do? Nothing. Oh well, sucks to be them. ... I've had a vendor tell me to put a webapp firewall in front of their software. Did they offer to pay for it? No. That would be like Toyota telling its customers to buy ejector seats (unsubsidized ejector seats, that is) to resolve the accelerator problem in their vehicles. I've had other vendors demand I spend time helping them understand the issue, basically consulting for free for them. Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Blow the Whistle (Score:2, Insightful)

    by Anonymous Coward on Sunday April 11, 2010 @11:25AM (#31808156)

    What can you morally do otherwise but blow the whistle?
    If you keep playing ball with these people, they will continue to act in the fashion that they do. Nothing will change. They are basically getting away with it.

    Give them some pain now and let this be a lesson both to them and to others.

  • by unity100 ( 970058 ) on Sunday April 11, 2010 @11:28AM (#31808184) Homepage Journal

    To the net. The current dominant dog-eat-dog capitalist corporate culture makes corporations suppress and hush up problems rather than spend the effort to fix them.

  • by YesIAmAScript ( 886271 ) on Sunday April 11, 2010 @11:32AM (#31808232)

    None of the problems listed seems to be with responsible disclosure itself. It's your job to disclose responsibly. And you should protect their secret for a while; after that, it's not really your problem if they won't or can't act.

    I agree there are also issues here with relying on code that you now know has security issues. But those have nothing to do with responsible disclosure either. If you posted it to the Internet, you'd still have issues relying on them. Same as if you didn't tell anyone.

    Look at it this way: when you tell a vendor what's wrong and try to help them fix it, you're really doing it to help yourself. You're doing it because you believe it will be less work than changing your entire system to not rely on their code.

    As an aside, I don't get a big rush when I find a problem in someone else's code. Maybe I'm just old and jaded now, but I'm just trying to get everything to work well. Finding that someone else didn't do their job doesn't usually make my situation any better (as you indicate here), so I don't relish it.

  • by Wrath0fb0b ( 302444 ) on Sunday April 11, 2010 @11:38AM (#31808262)

    IMO, RD is supposed to entail a good-faith effort to notify the vendor and delay public disclosure for a reasonable amount of time (i.e. not dragging their feet) while they push a patch. If you notify the vendor, preferably including a test case, and they refuse to acknowledge that it is a security issue or suggest ridiculous fixes, that's the end of your RD duties. Ethically speaking, you are in the clear. RD requires you to give them a chance to fix the problem before publicizing it, nothing more.

    Now, vendor/rube relations are another matter entirely that are distinct from your duty of responsible disclosure. I don't envy being in the situation where you fear their wrath for disclosure but want to do the right thing.

  • Re:Leak it (Score:5, Insightful)

    by egcagrac0 ( 1410377 ) on Sunday April 11, 2010 @11:39AM (#31808276)

    Interesting, but...

    You're then basically telling anyone who might employ such an exploit how to compromise your system.

    My responsibility is to protect my system first, and then to help other people protect their systems.

    I'm not saying that leaking an exploit isn't good; I'm saying you need to make sure that something is protecting your system from that exploit before you leak it.

  • Re:Leak it (Score:5, Insightful)

    by schon ( 31600 ) on Sunday April 11, 2010 @11:49AM (#31808334)

    You're then basically telling anyone who might employ such an exploit how to compromise your system.

    And you're assuming that someone who wants to exploit your system doesn't already know how.

  • by SwashbucklingCowboy ( 727629 ) on Sunday April 11, 2010 @11:50AM (#31808338)

    "I've had other vendors demand I spend time helping them understand the issue, basically consulting for free for them. Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."

    You say it happens; they can't reproduce it. Thus, you have to help them understand what it is you're doing. It's not unusual for people to think they've found a bug when in fact they have not.

  • by The Second Horseman ( 121958 ) on Sunday April 11, 2010 @11:51AM (#31808348)

    I'm betting this is software being used in an educational or other not-for-profit environment. If so, one thing you have going for you is that employees in that sector actually talk to people from other institutions. If disclosing your work-around wouldn't just give away the entire problem, I'd actually publish what you did to protect the application. That way, your peers can decide if they want to do it and get a head start. It puts the vendor on notice that someone is going to notice this problem eventually.

    If your workaround gives away the entire problem, that's more difficult. Assuming education / not-for-profit, I'd start by talking (verbally) to peers at schools using the application, and see if you can get some traction that way. A group of pissed-off customers might be more effective. Your CIO may be participating in regional or national organizations, and may be able to talk to his or her peers about the issue as well.

    If you just release it on your own in an "in-your-face" way without your bosses signing on, they could decide to take it out on you if the vendor gets pissed or tries to go after you for violating some gag clause in the licensing agreement (some have them) or damaging their business. They shouldn't, but I can think of a couple of really stupid, obnoxious vendors that might.

  • try the press (Score:1, Insightful)

    by Anonymous Coward on Sunday April 11, 2010 @11:53AM (#31808366)

    Send your information to the press and ask for anonymity. Watch them jump to solve your problem. Try dgoodin@theregister.com.

  • by Anonymous Coward on Sunday April 11, 2010 @11:54AM (#31808386)

    Because you might have to change business practices or requisition more software. Think of the VP who chose the vendor losing face; that can't happen now. Oh, and it would cost the company more time and money; retraining and learning things are so unproductive, and the board won't be impressed! At least with a security hole you can blame it on outside forces, and if it is a confidentiality hole, the customers have already paid you. Just hope it doesn't get exploited, and let the customers bear the cost of restitution.

  • by Anonymous Coward on Sunday April 11, 2010 @11:54AM (#31808390)

    US CERT, or your local version. They can wield the clue-bat on your behalf.

  • by S77IM ( 1371931 ) on Sunday April 11, 2010 @11:57AM (#31808402)

    The vendor refused to acknowledge it was a security issue.

    Then it's either a feature or a regular old non-security-related bug, and I don't see the problem with announcing it to everybody. Right?

      -- 77IM

  • by theshowmecanuck ( 703852 ) on Sunday April 11, 2010 @12:01PM (#31808434) Journal
    Or do they have a moral obligation to their shareholders not to spend money if they don't have to (keep up the bottom line)? Or do they have an ethical obligation? Just playing devil's advocate, since I think the idea of profitability IS a big part of this, and prioritizing shareholders' concerns over those of customers or employees seems to be the new morality for companies and corporations. If it is, then what they are doing probably seems just fine to them, and what the OP is complaining about probably seems strange.
  • Re:Leak it (Score:1, Insightful)

    by Anonymous Coward on Sunday April 11, 2010 @12:04PM (#31808454)

    Lol. It won't be so anonymous to the vendor if you were talking to them about it the day before.

    OK, they won't be able to prove it was you who leaked it, because someone else could have found the exact same problem at the exact same time, but you'll be the first suspect on the list, and then the police might be able to prove it was you.

  • by Anonymous Coward on Sunday April 11, 2010 @12:09PM (#31808488)

    I had this kind of problem about 15 years ago. It was a real pain. My problem was that I wasn't able to publish my findings, as the vendor made it pretty clear he'd sue me over that. So I reviewed my software requirements and found a solution I hadn't heard of before: Open Source. I've used Open Source ever since, and though it has some minor drawbacks, I don't regret the step.

    cb

  • by Aladrin ( 926209 ) on Sunday April 11, 2010 @12:10PM (#31808506)

    I agree, too. There is a certain amount of working together that is expected when you report a bug. Even once the bug is identified and reproduced, you should still be willing to help them verify that it is fixed.

    It's even in -your- best interest to do so, so that the bug gets fixed properly and quickly.

    I've done this before with third-party libraries that had issues with my code. I'll even admit that about 1/4 of the time, it was actually my fault and not a bug in their code after all. And sometimes the bug fix was extremely simple, yet I had to jump through hoops to prove that it existed... That's just life.

  • by Anonymous Coward on Sunday April 11, 2010 @12:14PM (#31808542)

    not understanding != not reproducible

    For all you know, he submitted a 2-page e-mail detailing the problem.

  • Re:Leak it (Score:2, Insightful)

    by Tim C ( 15259 ) on Sunday April 11, 2010 @12:22PM (#31808580)

    And you're assuming that someone who wants to exploit your system doesn't already know how.

    If the exploit is secret, maybe they do know, maybe they don't.

    If the exploit is publicly disclosed, they almost certainly do.

    Given the stated situation (of having a vulnerable system) the former is preferable to the latter.

  • by Anonymous Coward on Sunday April 11, 2010 @12:32PM (#31808658)
    Sure, but how many shares of the company did you need to buy in order to get a seat at their shareholders' meeting? Then the plane ticket to get to the meeting, etc. It's not really something most people are going to do - especially the submitter, who couldn't even be bothered to help them reproduce the problem.
  • by dragisha ( 788 ) <dragisha.m3w@org> on Sunday April 11, 2010 @12:33PM (#31808668)

    And we all know what it really is.

    Protect your system as well as you can - firewalls, backups, whatever - or just rely on your own obscurity, and publish!

    Act surprised, act I-told-you-so, be outraged at whatever happens, and then - in a few days - install a patch.

  • Re:Leak it (Score:4, Insightful)

    by Todd Knarr ( 15451 ) on Sunday April 11, 2010 @12:33PM (#31808678) Homepage

    If the exploit is secret, maybe they do know, maybe they don't.
    If the exploit is publicly disclosed, they almost certainly do.

    If the exploit is secret and maybe the bad guys know about it and maybe they don't, the only safe course is to assume they do know about it and act accordingly. It's like a door: maybe a burglar will come around trying it and maybe he won't, but you still don't go on vacation leaving the front door to your house unlocked because it's not worth the risk if he does come around.

  • by Z00L00K ( 682162 ) on Sunday April 11, 2010 @12:38PM (#31808728) Homepage Journal

    There is no need to play nice with security issues anymore. Just tell them, in general terms, that you have acquired knowledge of a bug (not that you found it yourself), and then provide that info to a suitable magazine or publish it on Wikileaks.

  • Re:Leak it (Score:3, Insightful)

    by schon ( 31600 ) on Sunday April 11, 2010 @12:44PM (#31808794)

    Given the stated situation (of having a vulnerable system) the former is preferable to the latter.

    *sigh* Logic. You fail it.

    Given that an adversary *might* be able to exploit a vulnerability, the "preferability" of any given scenario is completely irrelevant - you take steps to make sure they can't. Seriously, saying "oh, if I wish hard enough, maybe nobody will exploit it" is FUCKING BRAINDEAD.

    If a vendor refuses to fix a vulnerability, then you must (by necessity) make the problem as widely known as possible, so that as many of the vendor's customers (both current and future) as possible can apply pressure to get them to fix it.

  • There are two problems I see with that. First, no matter how well-intentioned your actions, it's not the company that pays - it's the people and companies who have the software installed; in essence making the person who publishes this directly responsible (though not legally) for damages caused by the exploit. The company will just say, "oops", shrug, and move on.

    The second one is the fact that for large software packages, there is no such thing as a "quick fix". When you have a few million lines of code, a regression test is non-trivial. And when you have thousands or millions of customers, you have to regression test before you can release. This process can take weeks or even months.

    And unfortunately, even in the case where the company is being a complete jackass and simply ignoring you (or asking you to consult for free), the first issue still applies. Say what you will about "security through obscurity" -- the fact remains that it is one of many valid and valuable tools in security. (Lest someone interpret that as an anti-OSS comment, let me also add that "many eyes" is another such valid and valuable tool. Not all tools are available to all people/products.)

  • by ErikTheRed ( 162431 ) on Sunday April 11, 2010 @12:57PM (#31808892) Homepage

    Simple answer:

    Responsible Disclosure should be limited to vendors that publicly pledge (or, preferably, contractually agree to via their licensing terms) to Responsibly Fix issues that are disclosed to them. If a company doesn't abide by their own Responsibly Fix policy, it should be disclosed so that others realize that it's null and void.

  • by u38cg ( 607297 ) <calum@callingthetune.co.uk> on Sunday April 11, 2010 @01:10PM (#31809000) Homepage
    I don't see a problem with this. The vendor has replied that it doesn't consider this to be a security issue, so surely a public notification isn't going to hurt anyone, right? Right? Oh, your customers are cancelling? I thought it wasn't an issue? What can I say, silly me.
  • gentlemen (Score:3, Insightful)

    by headonfire ( 160408 ) on Sunday April 11, 2010 @01:15PM (#31809030)

    Well, I think it works like this. If vendors as a group want to encourage more "responsible disclosure", they need to operate in such a way that they take potential vulnerabilities seriously, and I don't just mean in a "we're taking this very seriously" kind of way, but more of a "we have a dedicated, knowledgeable staff member/team to look into situations like this" sort of thing. If they decide it's not an issue after all, then any responsibility you have to the vendor regarding that issue is over. The same goes if they're not willing even to consider it an issue after you've made a good-faith effort to let them know how much of a problem you think it is.

    In short, a gentleman's agreement only works if both parties are gentlemen.

  • by jkroll ( 32063 ) on Sunday April 11, 2010 @01:27PM (#31809142)

    The vendor refused to acknowledge it was a security issue.

    Then it's either a feature or a regular old non-security-related bug, and I don't see the problem with announcing it to everybody. Right?

    If you are really certain it is a valid issue, take a look at their marketing page and find out who their reference customers are. If you can get some of their important customers to raise this issue as well, you may have better luck getting some action on it without disclosing the vulnerability to the world at large.

  • by TheLink ( 130905 ) on Sunday April 11, 2010 @01:27PM (#31809144) Journal
    My point is the situation often isn't as grave as many security researchers like to think.

    Yes there is a hole. But there are zillions of holes. If the hacker wants in, they've probably already got in via some PHP exploit or some malware infected PC.

    If you've found an exploit in IE, Firefox, Windows, OS X, or ssh, sure, go kick up a big fuss. Or just wait till the next pwn2own competition ;).

    Whereas if you've found an exploit in some ADSL modem router made by some Korean company, and they're not fixing it after you've reported it? Big deal. Hardly any of those routers will still be working 3 years from now. How many of you got pwned because of those infamous bugs that were present for years without anyone knowing? How many hackers would make a lot of $$$$ from exploiting some non-"vastly deployed" software in your company, and get away with it? Wouldn't they be better off concentrating on building Windows botnets?

    If it's a problem with some vendor supplied app that your company uses, disclose the problem to the bosses. If the vendor doesn't want to fix it, and you're not the boss, it's between the boss and the vendor. You provide info and advice. Of course you don't play down the risks - that's not your job - that's the vendor's job ;).
  • by Anonymous Coward on Sunday April 11, 2010 @03:03PM (#31810068)

    Send the vendor an email:

    Thank you for clarifying that this is not a security issue. I can now safely post information about this non-security issue on public websites and ask helpful hackers for workarounds. That would not be possible for security-related bugs, as those are best kept secret for a few weeks so that the vendor can provide and distribute a fix.

  • by cdrguru ( 88047 ) on Sunday April 11, 2010 @03:12PM (#31810134) Homepage

    First off, there are very few software packages of any size that are sold with terms and conditions that would allow anyone to sue the vendor for any reason. The only path I could possibly see is an agreement that bugs will be fixed - but this vendor is disclaiming that this is a bug. So even that probably isn't going to get you anywhere. Suing is almost certainly not an option unless you have money to burn.

    There are far too few details in the posting to explain how this software is used and what its function is. It could be something that is on an internal web site and the exposure is from potentially unqualified employees accessing internal information. It could also be a public web site that allows people in Eastern Europe to get enough medical information on celebrities to blackmail them. Who knows?

    If the vendor is saying it isn't a bug, it is doubtful many other customers are going to immediately see that it is a problem. They might after a while - again, depending on how the software is used. So going to other customers isn't likely to be very useful in the short term.

    Certainly this is the result of buying packaged software rather than writing everything yourself. You give up a certain amount of control. Open source isn't a solution, because if you aren't familiar with the code, it could take a very long time to learn enough about it to fix anything. I'm sure you could pay a consultant to learn the package and fix it, but then you could also just pay a consultant to write you a system the way you wanted in the first place.

    This is the age of the packaged software solution. It started around 1985 or so and continues to this day. Absolutely, a side effect of this is that you need to be on good working terms with your vendor(s) and your vendor(s) need to be competent. I haven't seen anything that says the vendor is incompetent, just that they disagree about the severity of the problem so far.

    Sure, as a programmer I would like to think that every single company should be employing programmers and custom-writing all their own stuff. Just like 1975. Except that isn't the way things turned out. Too bad, fewer programmers employed. But it is how things work today and nobody is going back.

  • A bit late (Score:4, Insightful)

    by Arker ( 91948 ) on Sunday April 11, 2010 @03:16PM (#31810166) Homepage
    The sad thing is that by doing the right thing and attempting to report this responsibly, the article's author has now set himself up to be scapegoated. If the exploit is leaked now, whether he does it or not, the vendor will blame him for it and focus as much on destroying his career as on fixing the problem. This is why many security professionals decided long ago *not* to report these things "responsibly" but simply to leak them anonymously in every case. Doing so forces the vendor to deal with the vulnerability without making the reporter vulnerable.
  • by Volguus Zildrohar ( 1618657 ) on Sunday April 11, 2010 @03:39PM (#31810362)

    I have no information about this instance, but the public face of the company you are speaking to is not always knowledgeable or intelligent. I work for a software company, and I have seen instances where a customer reports an issue and has it waved away without consultation by the first-contact support employee, usually under hand-waving like "no-one else is seeing this problem", or "I haven't been able to reproduce it". In most of these cases, an experienced person has not been consulted, and there has been no further investigation on our side.

    Yes, the support people are part of the company, which makes the response the company's responsibility, but at the end of the day it comes down to flawed human beings assuming they know more than they do. We try to take steps to prevent it, but it's hard to anticipate and recover from without manually inspecting each support incident after the fact - a step we just don't have the time or money to take.

  • Re:Sue Them (Score:3, Insightful)

    by SwashbucklingCowboy ( 727629 ) on Sunday April 11, 2010 @04:21PM (#31810814)

    > Sue them

    On what grounds? They didn't do what he wanted them to do?

  • One question (Score:1, Insightful)

    by Anonymous Coward on Sunday April 11, 2010 @06:32PM (#31811840)

    I've read through this and have one question that may determine whether there is a concern or not.

    1) Is the software designed to be deployed 'inside' the company firewall?

    If the answer is yes, then don't put it in the DMZ, and report your security concerns.

    Also, if the competition already knows about this, wouldn't they be likely to expose it?

    The only reason I posted this (anonymously; I forgot my password) is that I work for a software vendor, and for political reasons (not technical ones) certain companies will place software that is designed for internal use on the Internet. We highly recommend against this, and there is really no purpose in exposing it that way.

    I am not suggesting that each company purchase a firewall, but rather place the software behind the existing firewall.

    Too many things about this discussion make me believe the software in question was designed for internal use.
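
    A minimal sketch of what "behind the existing firewall" could look like, assuming a Linux gateway running iptables; the port (8080) and internal subnet (10.0.0.0/8) are hypothetical placeholders for whatever the app in question actually uses:

    ```shell
    # Hypothetical: the internal-only app listens on TCP 8080.
    # Accept connections only from the internal subnet, then
    # drop the port for every other source (default deny).
    iptables -A INPUT -p tcp --dport 8080 -s 10.0.0.0/8 -j ACCEPT
    iptables -A INPUT -p tcp --dport 8080 -j DROP
    ```

    The same intent can be expressed on most firewalls; the point is a default-deny rule for the app's port on equipment you already own, not a new purchase.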

  • by Stephen Samuel ( 106962 ) <samuel AT bcgreen DOT com> on Sunday April 11, 2010 @09:57PM (#31813096) Homepage Journal
    If they do know how to exploit your system, then it's too late, agreed. If, on the other hand, they don't (or they hadn't thought of it), the last thing you want to do is post a big message in your front window saying "The key is under the welcome mat." Some folks who might not otherwise have thought to ransack your house might just drop by to "take a look".

    As somebody else pointed out ... the least you owe your employer is to (attempt to) lock down your own system before you tell the world where the hole is.
