Why Responsible Vulnerability Disclosure Is Painful and Inefficient 182

A recent rant up at Attrition.org highlights problems with the responsible disclosure of security issues. While some vendors are happy to do their own research and patch reported problems, others drag their feet and make unreasonable demands on a researcher's time and effort, making anonymous public disclosure an ever-more-tempting option. Quoting: "After a couple hours of poking, I found a huge unauthenticated confidentiality hole. Once the euphoria wore off, I realized I had a big problem on my hands. I had to tell my employer's app owners and we had to assess risk and make a decision on what to do about it. After some quick meetings with stakeholders, we decided to severely limit access to the thing while we worked with the vendor. The vendor refused to acknowledge it was a security issue. Odd, considering most everyone who sees the issue unmistakably agrees that it is not acceptable. Now I'm forced to play hardball, yet nobody wants to fully-disclose and destroy relations with this vendor, whose software is somewhat relied on. Meanwhile, I know there are hundreds of institutions, small and large, using this software who have no idea that it has flawed security and who would probably not find the risk acceptable. What can I do? Nothing. Oh well, sucks to be them. ... I've had a vendor tell me to put a webapp firewall in front of their software. Did they offer to pay for it? No. That would be like Toyota telling its customers to buy ejector seats (unsubsidized ejector seats, that is) to resolve the accelerator problem in their vehicles. I've had other vendors demand I spend time helping them understand the issue, basically consulting for free for them. Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."
  • by 2.7182 ( 819680 ) on Sunday April 11, 2010 @10:24AM (#31808144)
    that there is an exploit that allows a user to bump their post up to first.
  • Blow the Whistle (Score:2, Insightful)

    by Anonymous Coward

    What can you morally do otherwise but blow the whistle?
    If you keep playing ball with these people, they will continue to act in the fashion that they do. Nothing will change. They are basically getting away with it.

    Give them some pain now and let this be a lesson both to them and to others.

    • Re:Blow the Whistle (Score:5, Interesting)

      by TheLink ( 130905 ) on Sunday April 11, 2010 @10:57AM (#31808398) Journal
      And they do get away with it.

      Maybe things are different now with more hackers looking for holes, but back then you could find bugs, disclose them responsibly and privately, have the vendor/site still not fix them for years, and nobody notices - not even the hackers.

      They eventually might fix it in some future release (not even the next major release :) ). Or they replace it with something totally different.

      So nobody's hurt. At least that I knew of, which is the other problem: who knows right? Maybe a hacker did find it and was very discreet. Ignorance is bliss or not ;)...

      It's just like leaving your car unlocked somewhere with the key in the ignition. In certain areas it's a sure thing that the car will be gone the next day, but not everywhere.

      For the more obscure areas, if you do disclose it publicly, the risks to the users go up more than if you had kept it quiet.

      In the long term maybe the users would be better off using something else, but who is to say the other stuff isn't as crappy? It's not like everyone has so much time to try to exploit everything. Even the security researchers don't look at "everything".
      • I think that there is hurt. First, the lack of acknowledgement itself hurts the trustworthiness of the vendor. If you can't patch it yourself, or find a way to diminish the threat represented by the problem, then you have a fiduciary and legal responsibility to your organization.

        I'd send information higher in the food chain than you're dealing with. Tenacity may be necessary to get people to understand the gravity of the situation.

        • by TheLink ( 130905 ) on Sunday April 11, 2010 @12:27PM (#31809144) Journal
          My point is the situation often isn't as grave as many security researchers like to think.

          Yes there is a hole. But there are zillions of holes. If the hacker wants in, they've probably already got in via some PHP exploit or some malware infected PC.

          If you've found an exploit in IE or firefox or Windows or OSX or ssh, sure go kick up a big fuss. Or just wait till the next pwn2own competition ;).

          Whereas if you've found an exploit in some ADSL modem router made by some Korean company, and they're not fixing it after you've reported it. Big deal. Hardly any of the routers will still be working 3 years from now. How many of you got pwned because of those infamous bugs that were present for years without anyone knowing? How many hackers would make a lot of $$$$ from exploiting some non-"vastly deployed" software in your company, and get away with it? Would they be better off concentrating on building windows botnets?

          If it's a problem with some vendor supplied app that your company uses, disclose the problem to the bosses. If the vendor doesn't want to fix it, and you're not the boss, it's between the boss and the vendor. You provide info and advice. Of course you don't play down the risks - that's not your job - that's the vendor's job ;).
          • by Anonymous Coward on Sunday April 11, 2010 @02:03PM (#31810068)

            Send the vendor an email:

            Thank you for clarifying that this is not a security issue. I can now safely post information about this non-security issue on public websites and ask for workarounds from helpful hackers. This would not be possible for security-related bugs, as those are best kept secret for a few weeks so that the vendor can provide and distribute a fix.

        • Re:Blow the Whistle (Score:4, Interesting)

          by commodore64_love ( 1445365 ) on Sunday April 11, 2010 @12:28PM (#31809154) Journal

          It's the year 2005. You know that Toyota cars have a programming bug that causes cars to accelerate and ignore any other inputs (the brakes, or a shift to neutral). What do you do?

          It's the year 1970. You know that Ford cars have a design flaw that makes the gas tank explode in an accident. What do you do?

          In the 1970 instance, a book author wrote a tell-all called "Unsafe At Any Speed" which revealed Ford's design flaw. In the 2005 case, I'd simply post what I discovered to Toyota-oriented websites and also call the U.S. Government Product Safety Commission. Otherwise Ford/Toyota would never have fixed the problems with their cars.

          I'd also report this software bug, since the vendor seems inclined to pretend it does not exist. Better to be a whistle-blower and save lives than wait until damage is done. You can't resurrect corpses, but you can warn people while they're still alive, so they can act to protect themselves.

          IMHO.

          Please don't mod me down just 'cause you disagree.

          • Re:Blow the Whistle (Score:5, Informative)

            by germansausage ( 682057 ) on Sunday April 11, 2010 @01:31PM (#31809704)
            You're thinking of the Ford Pinto, which was prone to fires and explosions when hit in rear-end collisions. It was first produced in 1971. Unsafe at Any Speed was written by Ralph Nader in 1965 about the American auto industry's reluctance to fix safety problems. The car it talked about (among other things) was the Chevrolet Corvair.
    • Re: (Score:3, Insightful)

      by Z00L00K ( 682162 )

      There is no need to play nice with security issues anymore. Just tell them in general terms that you have acquired knowledge of a bug (not that you found the bug yourself), and then provide that info to a suitable magazine or publish it on Wikileaks.

    • by u38cg ( 607297 ) <calum@callingthetune.co.uk> on Sunday April 11, 2010 @12:10PM (#31809000) Homepage
      I don't see a problem with this. The vendor has replied that it doesn't consider this to be a security issue, so surely a public notification isn't going to hurt anyone, right? Right? Oh, your customers are cancelling? I thought it wasn't an issue? What can I say, silly me.
    • Re:Blow the Whistle (Score:4, Interesting)

      by AVee ( 557523 ) <slashdot AT avee DOT org> on Monday April 12, 2010 @06:46AM (#31815344) Homepage

      What can you morally do otherwise but blow the whistle?

      You could sell it, responsibly.
      If you sell the exploit to something like the Zero Day Initiative [zerodayinitiative.com] or iDefense [idefense.com] you won't have to deal with the vendor, they will. And they are far more experienced at that as well. That way you'll get rid of your current problem, the issue is dealt with properly and you might even earn a few bucks in the process.

  • Disclosing Exploitz (Score:4, Interesting)

    by poena.dare ( 306891 ) on Sunday April 11, 2010 @10:28AM (#31808176)

    FTA:

    I've even been accused of being a spy for a company's competition (true... ask Jericho)...

    ME: "Hi, you left your headlights on."
    NEIGHBOR: "WHO SENT YOU? DID MY EX-WIFE SEND YOU? ARE YOU SLEEPING WITH HER?"

    WTF? Seriously?

    ---

    Compare how badly companies deal with vuln disclosure to how game companies deal with cheats and exploits. Well, MOST game companies...

  • by unity100 ( 970058 ) on Sunday April 11, 2010 @10:28AM (#31808184) Homepage Journal

    To the net. The current dominant dog-eat-dog capitalist corporate culture makes corporations suppress and hush up stuff rather than spend the effort to fix it.

    • Re: (Score:3, Insightful)

      There are two problems I see with that. First, no matter how well-intentioned your actions, it's not the company that pays - it's the people and companies who have the software installed; in essence making the person who publishes this directly responsible (though not legally) for damages caused by the exploit. The company will just say, "oops", shrug, and move on.

      The second one is the fact that for large software packages, there is no such thing as "quick fix". When you have a few million lines of a c

      • If customers support such companies, then maybe they ought to pay. Why should such customers sanction bad software?

    • A bit late (Score:4, Insightful)

      by Arker ( 91948 ) on Sunday April 11, 2010 @02:16PM (#31810166) Homepage
      The sad thing is that by doing the right thing and attempting to report this responsibly, the article's author has now set himself up to be scapegoated. If the exploit is leaked now, whether he does it or not, the vendor will blame him for it and focus as much on destroying his career as on fixing the problem. This is the reason many security professionals decided long ago *not* to report these things "responsibly" but simply to leak them anonymously in every case. Doing this forces the vendor to deal with the vulnerability without making the reporter vulnerable.
  • Leak it (Score:5, Interesting)

    by Aurisor ( 932566 ) on Sunday April 11, 2010 @10:30AM (#31808202) Homepage

    Leak a working exploit anonymously. If a vendor isn't concerned with the security of their users, let them pay the price.

    • Re:Leak it (Score:5, Insightful)

      by egcagrac0 ( 1410377 ) on Sunday April 11, 2010 @10:39AM (#31808276)

      Interesting, but...

      You're then basically telling anyone who might employ such an exploit how to compromise your system.

      My responsibility is to protect my system first, and then to help other people protect their systems.

      I'm not saying that a leak of an exploit isn't good, I'm saying you need to make sure that something is protecting your system from that exploit before you leak it.

      • Re:Leak it (Score:5, Insightful)

        by schon ( 31600 ) on Sunday April 11, 2010 @10:49AM (#31808334)

        You're then basically telling anyone who might employ such an exploit how to compromise your system.

        And you're assuming that someone who wants to exploit your system doesn't already know how.

        • Re: (Score:2, Insightful)

          by Tim C ( 15259 )

          And you're assuming that someone who wants to exploit your system doesn't already know how.

          If the exploit is secret, maybe they do know, maybe they don't.

          If the exploit is publicly disclosed, they almost certainly do.

          Given the stated situation (of having a vulnerable system) the former is preferable to the latter.

          • Re:Leak it (Score:4, Insightful)

            by Todd Knarr ( 15451 ) on Sunday April 11, 2010 @11:33AM (#31808678) Homepage

            If the exploit is secret, maybe they do know, maybe they don't.
            If the exploit is publicly disclosed, they almost certainly do.

            If the exploit is secret and maybe the bad guys know about it and maybe they don't, the only safe course is to assume they do know about it and act accordingly. It's like a door: maybe a burglar will come around trying it and maybe he won't, but you still don't go on vacation leaving the front door to your house unlocked because it's not worth the risk if he does come around.

              • A classic case of "no good deed goes unpunished" such as this often leads to the logical conclusion - use the crack to pillage the vendor. The vendor is destroyed, the "do-gooder" is handsomely rewarded, and somebody writes a book about it and gets royalties from the movie starring Sandra Bullock, who is aptly cast (see "The Net" & "Demolition Man") and could use the diversion.
          • Re: (Score:3, Insightful)

            by schon ( 31600 )

            Given the stated situation (of having a vulnerable system) the former is preferable to the latter.

            *sigh* Logic. You fail it.

            Given that an adversary *might* be able to exploit a vulnerability, the "preferability" of any given scenario is completely irrelevant - you take steps to make sure they can't. Seriously, saying "oh, if I wish hard enough, maybe nobody will exploit it" is FUCKING BRAINDEAD.

            If a vendor refuses to fix a vulnerability, then you must (by necessity) make the problem as widely known, so that as many of the vendor's customers (both current and future) can apply pressure to get them to

        • Just because the thieves know how to smash my windows and pick my locks doesn't mean I give them keys to my house.

        • by Stephen Samuel ( 106962 ) <samuel AT bcgreen DOT com> on Sunday April 11, 2010 @08:57PM (#31813096) Homepage Journal
          If they do know how to exploit your system, then it's too late, agreed. If on the other hand they don't (or they hadn't thought of it), the last thing you want to do is post a big message in your front window saying "The key is under the welcome mat". Some folks who might not otherwise have thought to ransack your house might just drop by to 'take a look'.

          As somebody else pointed out ... the least you owe to your employer is to (attempt to) lock down your own system before you tell the world where the hole is.

      • Right, that's the bigger issue. And really the only cure is going with a solution which allows for the user to export the data to an alternative. Admittedly it's not something you just do at the drop of a hat, but knowing that you can take the data with you even if it's a bit of a pain makes a real difference.

        If a security vulnerability that you consider to be very serious crops up and the vendor doesn't want to fix it, you can at least take your data and play elsewhere, even if it takes several months o
      • You're then basically telling anyone who might employ such an exploit how to compromise your system.

        Basically? That is exactly what you are doing. That is the whole point...

    • That could be illegal, depending how you do it. Be careful with this advice.
  • by Anonymous Coward on Sunday April 11, 2010 @10:31AM (#31808222)

    Use Hushmail and the Full-Disclosure mailing list to report stuff like this.

    There's no sense wasting time, do it anonymously, use tor.

    Also while you do it, be sure to post personal information about other full disclosure users so that your email is removed from the official archives.

  • by YesIAmAScript ( 886271 ) on Sunday April 11, 2010 @10:32AM (#31808232)

    None of those problems listed seem to be with responsible disclosure. It's your job to responsibly disclose. And you should protect their secret for a while. After that, it's not really your problem if they won't or can't act.

    I agree there are also issues here with relying on code that you now know has security issues. But those aren't anything to do with responsible disclosure either. If you posted it to the internet you'd still have issues relying on them. Same as if you didn't tell anyone.

    Look at it this way, when you tell a vendor what's wrong and try to help them fix it, you're really doing it to help yourself. You're doing it because you believe it will be less work than changing your entire system to not rely on their code.

    As an aside, I don't get a big rush when I find a problem in someone else's code. Maybe I'm just old and jaded now, but I'm just trying to get everything to work well, finding that someone else didn't do their job doesn't usually make my situation any better (as you indicate here), so I don't relish it.

    • by dissy ( 172727 ) on Sunday April 11, 2010 @01:04PM (#31809462)

      As an aside, I don't get a big rush when I find a problem in someone else's code. Maybe I'm just old and jaded now, but I'm just trying to get everything to work well, finding that someone else didn't do their job doesn't usually make my situation any better (as you indicate here), so I don't relish it.

      Additionally this guy has the added burden of relying on this company for what sounds like a major part of their IT infrastructure.

      If the software is extremely specialized then it is possible there might only be a couple hundred customers.
      If this guy mentioned that he found it, an anonymous leak any time soon (say, within 5 years) would make him a simple conclusion to jump to when placing blame.

      Remember, a company is not a court, so evidence and logic are not a requirement in producing an outcome.

      I would probably hand the info over to a big whitehat group (anonymously if need be, though an explanation of the situation will probably get the same response) and have THEM claim discovery and report it to the vendor using responsible disclosure.

      This way the group can say "You have ___ weeks to release a fix if you like, but on ____ date we will release this exploit to the public."

      It helps in the emotional/illogical areas too, as someone at the vendor who did correctly suspect the customer will have a much harder time convincing others of that when such a prominent name is actively and directly taking the blame.
      Additionally, the vendor's other customers will have no reason to hold negative thoughts against this one, if there is an active community of any shape around this software.

      Where I work, we have a similar situation with the software package that runs our enterprise.
      There are still a couple of Yahoo message groups where customers can post and talk with each other about ways of solving problems. The vendor's staff even monitor the groups, and official bug fixes have resulted simply from a single IT guy posting a complaint.

      It is a wise idea to remain in good standing with such a community, which might not realize that their soon-to-be-public security problems are the fault of the vendor for not fixing them, and not of this customer for being the messenger of bad news.

  • by Wrath0fb0b ( 302444 ) on Sunday April 11, 2010 @10:38AM (#31808262)

    IMO, RD is supposed to entail a good-faith effort to notify the vendor and delay public disclosure for a reasonable amount of time (i.e. not dragging their feet) while they push a patch. If you notify the vendor, preferably including a test case, and they refuse to acknowledge that it is a security issue or suggest ridiculous fixes, that's the end of your RD duties. Ethically speaking, you are in the clear. RD requires you to give them a chance to fix the problem before publicizing it, nothing more.

    Now, vendor/rube relations are another matter entirely, distinct from your duty of responsible disclosure. I don't envy being in the situation where you fear their wrath for disclosure but want to do the right thing.

    • by amorsen ( 7485 )

      The problem is that if you do the responsible thing, the vendor ignores it, and it gets posted to bugtraq, it can be pretty bad for you. Doesn't matter if it was you who posted there or not, or if it was morally right to post it.

      You have to make the choice beforehand: either go via the vendor or go via bugtraq, you can't do one and then switch to the other.

  • wow (Score:5, Funny)

    by phantomfive ( 622387 ) on Sunday April 11, 2010 @10:41AM (#31808290) Journal

    No. That would be like Toyota telling its customers to buy ejector seats (unsubsidized ejector seats, that is) to resolve the accelerator problem in their vehicles.

    Where can I sign the petition to make that happen?? O_O

  • by SwashbucklingCowboy ( 727629 ) on Sunday April 11, 2010 @10:50AM (#31808338)

    "I've had other vendors demand I spend time helping them understand the issue, basically consulting for free for them. Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."

    You say it happens, they can't reproduce it. Thus, you have to help them understand what it is you're doing. It's not unusual for people to think they've found a bug when they in fact have not.

    • by Aladrin ( 926209 ) on Sunday April 11, 2010 @11:10AM (#31808506)

      I agree, too. There is a certain amount of working together that is expected when you report a bug. Even once the bug is identified and reproduced, you should still be willing to help them verify that the bug is fixed as well.

      It's even in -your- best interest to do so, so that the bug gets fixed properly and quickly.

      I've done this before with third-party libraries that had issues with my code. I'll even admit that about 1/4 of the time, it was actually my fault and not a bug in their code after all. And sometimes the bug fix was extremely simple, yet I had to jump through hoops to prove that it existed... That's just life.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      not understanding != not reproducible

      For all you know, he submitted a 2-page e-mail detailing the problem.

      • The length of the submission isn't the concern; the quality is. And even if it was a high-quality description, that does not mean he included everything needed to reproduce the problem. Often a user does not understand the implementation well enough to know what is relevant and what is not.

    • You say it happens, they can't reproduce it.

      They say they can't. How hard have they tried? Have they really tried at all? Chances are, it's been thrown into their bug tracking system (if they have one) and some tester has spent 5 minutes poking at it, then gone back to working on paid projects instead.

      Thus, you have to help them understand what it is you're doing.

      "have"? I do not think that word means what you think it means.

      If you believe that you have a repeatable exploit, and you've given it to th

    • yeah. At least they're interested.... Half your work is done there... now all you have to do is prove to them that the breakage is because of what they're doing, not because of what you're doing.

      If they're willing to work with you to fully understand the bug, then chances are that they'll throw better money after good and actually fix it once they understand what went wrong.

    • by sjames ( 1099 )

      Or perhaps they're just not trying very hard. I say if you have the time, help until it's clear they're just not trying. Then suggest that if they want more help, you'll need all of the source and perhaps send them your hourly rates. If you don't have the time, just say so and leave it with them or tell them how much they'd have to pay for you to make time.

  • by The Second Horseman ( 121958 ) on Sunday April 11, 2010 @10:51AM (#31808348)

    I'm betting this is software being used in an educational or other not-for-profit environment. If so, one thing you have going for you is that employees in that sector actually talk to people from other institutions. If disclosing your work-around wouldn't just give away the entire problem, I'd actually publish what you did to protect the application. That way, your peers can decide if they want to do it and get a head start. It puts the vendor on notice that someone is going to notice this problem eventually.

    If your workaround gives away the entire problem, that's more difficult. Assuming education / not-for-profit, I'd start by talking (verbally) to peers at schools using the application, and see if you can get some traction that way. A group of pissed-off customers might be more effective. Your CIO may be participating in regional or national organizations, and may be able to talk to his or her peers about the issue as well.

    If you just release it on your own in an "in-your-face" way without your bosses signing on, they could decide to take it out on you if the vendor gets pissed or tries to go after you for violating some gag clause in the licensing agreement (some have them) or damaging their business. They shouldn't, but I can think of a couple of really stupid, obnoxious vendors that might.

  • by S77IM ( 1371931 ) on Sunday April 11, 2010 @10:57AM (#31808402)

    The vendor refused to acknowledge it was a security issue.

    Then it's either a feature or a regular old non-security-related bug, and I don't see the problem with announcing it to everybody. Right?

      -- 77IM

    • Re: (Score:2, Insightful)

      by jkroll ( 32063 )

      The vendor refused to acknowledge it was a security issue.

      Then it's either a feature or a regular old non-security-related bug, and I don't see the problem with announcing it to everybody. Right?

      If you are really certain it is a valid issue, take a look at their marketing page and find out who their reference customers are. If you can get some of their important customers to raise this issue as well, you may have better luck getting some action on it without disclosing the vulnerability to the world at large.

  • Sue Them (Score:4, Interesting)

    by Herkum01 ( 592704 ) on Sunday April 11, 2010 @11:00AM (#31808416)

    Sue them; take them to court, where bad publicity will scare the hell out of them into settling.

    As long as they view it as a technical issue, they are not interested. If they view it as a sales/marketing nightmare, they will come to the table in a hurry. As for further cooperation, they will suck it up, because other customers will judge how they themselves would be treated by how the vendor reacts.

    Like spousal abuse, as long as a bad working relationship can be hidden, they will get away with it with other individuals/companies. The only way to address the issue is to make their dirty laundry public.

  • by Anonymous Coward on Sunday April 11, 2010 @11:09AM (#31808488)

    I had this kind of problem about 15 years ago. It was a real pain. My problem was that I wasn't able to publish my findings as the vendor made pretty clear he'd sue me over that. So I reviewed my requirements on software and found a solution I haven't heard of before - Open Source. Since then I use Open Source and though it has some minor drawbacks I don't regret this step.

    cb

  • by medcalf ( 68293 ) on Sunday April 11, 2010 @11:09AM (#31808496) Homepage
    The shareholder meeting. Simply note that you reported bug # XXXX some months ago, and it has not been acted on. You wouldn't mention it except that it's a security vulnerability that, if disclosed, would tank the share price for the company. So in that light, when will this vulnerability be addressed? Let everyone else take it from there.
    • Re: (Score:2, Insightful)

      by Anonymous Coward
      Sure, but how many shares of the company did you need to buy in order to get a seat at their shareholder's meeting? Then the plane ticket to get to the meeting, etc. Not really something most people are going to do - especially the submitter who couldn't even be bothered to help them reproduce the problem.
      • by 49152 ( 690909 )

        I do not know the law in his jurisdiction, but here (and almost everywhere else) one (1) share would be enough.

        Of course you have to foot the bill yourself to travel there.

    • The shareholder meeting. Simply note that you reported bug # XXXX some months ago, and it has not been acted on. You wouldn't mention it except that it's a security vulnerability that, if disclosed, would tank the share price for the company. So in that light, when will this vulnerability be addressed? Let everyone else take it from there.

      In that same vein, get the company to include a (responsible) disclosure policy in its handbook.

      It'd make a handy stick to beat recalcitrant vendors with:
      "Here's the problem, here's my company policy on disclosure. You choose what happens next."

      (Of course, this assumes your company is willing to let you disclose anything)

  • If your company sold them security software with a bug, your company should first fix the bug and then give them a patch for it (free of charge).

    Not tell them that it's bugged and ask them to pay you to fix it. If they don't want to accept the patched version, that's their problem, but most of your other customers will.

  • And we all know what it really is.

    Protect your system as well as you can, firewalls, backups, whatever, or just rely on your own obscurity, and publish!

    Act surprised, act I-told-you-so, be outraged with whatever happens, and then - in a few days - install a patch.

  • Frankly the web app firewall idea is the most appropriate solution to this entire category of problem for your organization. You should want one, and this is just one more datapoint as to why.

    Secondly, if they won't fix the problem (and I don't mean won't do it quickly, I mean won't do it at all) then I'm sure someone else will discover it and anonymously disclose it. ...Cough... Right? ...Cough...

    • Full disclosure: I work for a company that, among other things, makes a commercial Layer 7 firewall...

    • by guruevi ( 827432 )

      Do you know what a web app firewall actually does? It's practically a proxy server that sanitizes the inputs, meaning for each application you have to make custom rules which are more involved than just "allow 0.0.0.0/0 to 192.168.1.1 port 80". Especially if you have a very custom-built application, for each data input (each field) you will have to specify what form of data it is and what the valid ranges are. This SHOULD be done within the application already by the developers, there is no reason to go ou
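
      For illustration, here is a minimal sketch (Python, with made-up field names and rules) of the kind of per-field validation a WAF rule set ends up encoding, and which, as argued above, the application's developers should already be doing themselves:

      import re

      # Hypothetical per-field rules: every input gets its own type/range constraint.
      FIELD_RULES = {
          "account_id": re.compile(r"^\d{1,10}$"),         # numeric id, at most 10 digits
          "email":      re.compile(r"^[^@\s]+@[^@\s]+$"),  # loose email shape
          "comment":    re.compile(r"^[^<>]{0,500}$"),     # no angle brackets, length cap
      }

      def sanitize(params):
          """Reject any request parameter that fails its field's rule."""
          for name, value in params.items():
              rule = FIELD_RULES.get(name)
              if rule is None or not rule.match(value):
                  raise ValueError("blocked: suspicious value for field %r" % name)
          return params

      if __name__ == "__main__":
          print(sanitize({"account_id": "42", "email": "user@example.com"}))
          try:
              sanitize({"comment": "<script>alert(1)</script>"})
          except ValueError as err:
              print(err)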

      • There is plenty to argue against your FUD*, but in reality I wasn't recommending a WAF in the traditional preemptive protective capacity.

        If you have a good WAF then when you, the person responsible for security, discover a vulnerability you, the person responsible for security, can patch it in minutes.

        Even if your applications are completely in-house developed it still takes time to get the vulnerability fixed and it's not within your control how long that process takes! With a WAF you take all of t
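
        The "patch it yourself in minutes" idea is usually called virtual patching. A rough, hypothetical sketch in Python (the blocked URL, the port, and the stand-in app are all invented for illustration): a filter sits in front of the vulnerable application and rejects requests matching a known-bad signature until the vendor ships a real fix.

        import re
        from wsgiref.simple_server import make_server

        # Invented exploit signature: block the (hypothetical) unauthenticated
        # report-export URL until the vendor fixes it properly.
        EXPLOIT_SIGNATURE = re.compile(r"^/reports/export", re.IGNORECASE)

        def vendor_app(environ, start_response):
            # Stand-in for the third-party application being protected.
            start_response("200 OK", [("Content-Type", "text/plain")])
            return [b"normal application response\n"]

        def virtual_patch(app):
            def middleware(environ, start_response):
                if EXPLOIT_SIGNATURE.search(environ.get("PATH_INFO", "")):
                    start_response("403 Forbidden", [("Content-Type", "text/plain")])
                    return [b"blocked by virtual patch\n"]
                return app(environ, start_response)
            return middleware

        if __name__ == "__main__":
            # Put the filter in front of the app; drop it once a real fix ships.
            make_server("127.0.0.1", 8080, virtual_patch(vendor_app)).serve_forever()

        The obvious caveat, raised in the reply below, is that the filter itself becomes one more piece of Internet-exposed code that has to be right.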

        • You're assuming that the WAF doesn't introduce its own security issues. I've seen more than a few that can be remotely crashed, thus DoSing everything behind them, and I've even seen one that can be exploited to run arbitrary code -- by adding the "security" of a WAF you might just be adding an Internet-exposed, remotely-exploitable host that not only has access to your internal app servers, but also all their input and output, and the ability to arbitrarily manipulate both.

          And then there's all the regular

          • Well, you're either naive and misinformed, or you're trying to be dense.

            First of all we're talking about a specific use case for a WAF- not WAFs as the panacea of network or application security. I don't think I ever stated that simply adding another "security appliance" necessarily increased security overall. I did say that with a WAF you can patch holes like this yourself. Do you really disagree with that?

            That said:

            Nothing you posted is an inherent problem with WAFs (over firewalls, routers, managed sw

  • ...anonymously.

    It's only a matter of time before the black hats find it, so by publicizing it and forcing it in the vendor's face they'll see it for what it is.

  • Lawsuit? (Score:3, Interesting)

    by RoFLKOPTr ( 1294290 ) on Sunday April 11, 2010 @11:55AM (#31808888)

    I don't know what kind of terms are written in your contract with these people (or how big your company is compared to the software company), but maybe your company can sue them for failing to properly maintain their software. Right now, find other companies that use this software and tell their administrators about the vulnerability and let the pressure build on the software company. Give them a little more time to either get their shit together or blow you off and then threaten lawsuit. It would be irresponsible of you for the sake of your own security to publicly disclose the vulnerability, so you should be doing all you possibly can to not have to do that.

  • by ErikTheRed ( 162431 ) on Sunday April 11, 2010 @11:57AM (#31808892) Homepage

    Simple answer:

    Responsible Disclosure should be limited to vendors that publicly pledge (or, preferably, contractually agree to via their licensing terms) to Responsibly Fix issues that are disclosed to them. If a company doesn't abide by their own Responsibly Fix policy, it should be disclosed so that others realize that it's null and void.

  • gentlemen (Score:3, Insightful)

    by headonfire ( 160408 ) on Sunday April 11, 2010 @12:15PM (#31809030)

    Well, I think it works like this. If vendors as a group want to encourage more "responsible disclosure", they need to operate in such a way that they take potential vulnerabilities seriously, and I don't just mean in a "we're taking this very seriously" kind of way, but more of a "we have a dedicated, knowledgeable staff member/team to look into situations like this" sort of thing. If they decide it's not an issue after all, then any responsibility you have to the vendor is over regarding that issue. If they're not willing to even consider it as an issue after you've made a good faith effort to let them know how much of a problem you think it is, any responsibility you have to the vendor is over regarding that issue.

    In short, a gentleman's agreement only works if both parties are gentlemen.

  • by Animats ( 122034 ) on Sunday April 11, 2010 @12:18PM (#31809058) Homepage

    It's hard getting the attention of some vendors. I see vulnerabilities in a slightly different context - hacked web sites hosting phishing pages. We distribute a list of major domains being exploited by active phishing scams. [sitetruth.com] This is obtained by processing PhishTank data, and we do this because we want to reduce the collateral damage from a tough blacklist system. At any given time, there are about 30 to 80 domains on the list.

    Some sites get themselves off the list quickly. By now, most of the better free hosting services and short-URL services are automatically checking PhishTank and the APWG blacklist to see when they've been hit. Today, if you run a service where anybody can put up a page that could be used for phishing (i.e. it's not full of your own headers and banners), you need automation to deal with attacks. I've been in contact with the abuse guy at "t35.com", which is a free hosting service. They've recently been hit by a flood of phishing attacks, with several hundred new reports in PhishTank per day. The attacks were coming in faster than the abuse guy could clean them out. They're now gaining on the problem, but haven't squashed it yet. Take-away lesson: automate this (see the sketch at the end of this comment).

    The ones near the top of the list have been there for a while. Note the dates, which are the date that the oldest phishing report still online and active appeared in PhishTank. Some just need help. Typically, these are small organizations like churches and nonprofits that have had a break-in and were partially taken over by a phishing site. I send them the Anti-Phishing Working Group's "What To Do if your Site Has Been Hacked". [antiphishing.org] Sometimes I give them a phone call. They deserve sympathy.

    Then there are the hard cases. These are sites with no visible contact address, or a clueless abuse department. At the moment, Google Sites and Google Spreadsheets are being used for phishing. Google is new to the free hosting business, and the phishers have discovered some tricks that Google can't yet handle. While Google puts a "report abuse" link on their site pages, it's possible to set up a file for downloading on Google Sites, and an HTML page can be served that way [phishtank.com], without Google's abuse checking. There's also an exploit of Google Spreadsheets [phishtank.com]. That one is an example of Habbo Hotel phishing. [bbc.co.uk] We've reported these to Google several times, but they haven't been fixed yet.

    We've been seeing a new type of attack recently - a phishing operation breaks into a shared hosting server and plants phishing pages on multiple domains on a single server. One of these hit one of the mysterious "*.websitewelcome.com" servers, which has "cloaked domain registration" and no useful default web page. These seem to be associated with "ThePlanet.com", but whether ThePlanet operates them, is providing wholesale hosting, is providing colocation, or is just the upstream connectivity provider is not clear.

    Hiding the contact information of a hosting provider is legally unwise. The hosting provider may lose the "safe harbor" protection of the DMCA. [cornell.edu] The "safe harbor" provision for "Information Residing on Systems or Networks At Direction of Users" only applies if "the service provider has designated an agent to receive notifications of claimed infringement... by making available through its service, including on its website in a location accessible to the public, and by providing to the Copyright Office, substantially the following information: the name, address, phone number, and electronic mail address of the agent." So when the RIAA or the MPAA come calling, a likely event for a hosting service, they get
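
    As for the "automate this" take-away above, here is a rough sketch of that kind of automation, assuming PhishTank's public CSV feed and its column layout (both assumptions) and placeholder domain names; run on a schedule, something like this is what keeps the abuse queue ahead of the flood.

    import csv
    import io
    import urllib.request
    from urllib.parse import urlparse

    # Assumed public feed and column layout; the domain names are placeholders.
    FEED_URL = "http://data.phishtank.com/data/online-valid.csv"
    MY_DOMAINS = {"example-hosting.com", "example-shorturl.net"}

    def phish_on_my_domains():
        """Yield phishing URLs from the feed that live on domains we operate."""
        with urllib.request.urlopen(FEED_URL) as resp:
            reader = csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8"))
            for row in reader:
                host = urlparse(row.get("url", "")).hostname or ""
                if any(host == d or host.endswith("." + d) for d in MY_DOMAINS):
                    yield row["url"]

    if __name__ == "__main__":
        for url in phish_on_my_domains():
            print("active phish on our infrastructure:", url)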

  • by wampus ( 1932 ) on Sunday April 11, 2010 @12:19PM (#31809068)

    That analogy. Stop it.

  • by pem ( 1013437 ) on Sunday April 11, 2010 @12:29PM (#31809170)
    This is not a programming problem any more. Much as we all hate lawyers, this is one case where they are useful. VERY useful. You put him on notice, it's not your problem any more. He puts the vendor's lawyer on ACTUAL notice (which has a specific legal meaning). He might even need to tell the other lawyer that he's going to have to report this issue in the next required SEC filing. IMO, if you already have the thing deployed, which it seems may be the case from your post, your FIRST stop should have been the attorney.
    • Re: (Score:3, Interesting)

      by Skapare ( 16644 )

      If the OP's company is a publicly traded corporation, and if this exploit represents any kind of risk to investors in any form, they are already required to include it in the next filing.

      • by pem ( 1013437 ) on Sunday April 11, 2010 @01:47PM (#31809914)

        Exactly.

        And this is the right way to go about it.

        And the nice thing is that when you report it in the filing, the other company will be shamed because other people will figure out who it is, yet you are not really adding to your own risk, because in the SEC filing you use words like those used in this article:

        "We have discovered a security vulnerability in a third party web application platform we use. If this is successfully exploited, (list of bad things that could happen). We have informed the vendor, and they have so far been unresponsive in fixing it. We are working diligently to deploy a firewall in front of the applications, but there is no guarantee that this vulnerability will not be exploited before we have that fully deployed, or the vendor gives us a fix."

        This is exactly the chain of command/control you want to use. You tell your counterparts at the other company that your company's policy requires you to report it to the lawyer, the lawyer tells his counterpart that SEC rules require him to disclose the vulnerability, and then he does it in the next report. End of story.

        Then the ball is in the other company's court to either fix the vulnerability or prove to your satisfaction that it isn't a problem.

  • As long as a good working relationship with a vendor that you and your company know is incompetent and unethical is more important than your security and the principle of doing things right, you get what you deserve.

    You know what the exploit is. But what makes you think you are the only one? What makes you think no unethical hackers already know about it, or won't find out about it during the life of this exploit (which, judging from the vendor's attitude, could be very long)?

    You (your company) needs to ta

  • by cdrguru ( 88047 ) on Sunday April 11, 2010 @02:12PM (#31810134) Homepage

    First off, there are very few software packages of any size that are sold with terms and conditions that would allow anyone to sue the vendor for any reason. The only path I could possibly see is an agreement that bugs will be fixed - but this vendor is disclaiming that this is a bug. So even that probably isn't going to get you anywhere. Suing is almost certainly not an option unless you have money to burn.

    There are far too few details in the posting to explain how this software is used and what its function is. It could be something that is on an internal web site and the exposure is from potentially unqualified employees accessing internal information. It could also be a public web site that allows people in Eastern Europe to get enough medical information on celebrities to blackmail them. Who knows?

    If the vendor is saying it isn't a bug, it is doubtful many other customers are going to immediately see that it is a problem. They might after a while - again, depending on how this software is used. So going to other customers isn't likely to be very useful short term.

    Certainly this is the result of buying packaged software rather than writing everything yourself. You give up a certain amount of control. Open source isn't a solution, because if you aren't familiar with the code it could take a very long time to learn enough about it to do anything like fixing it. I'm sure you could pay a consultant to learn the package and fix it, but then you could also just pay a consultant to write you a system the way you wanted in the first place.

    This is the age of the packaged software solution. It started around 1985 or so and continues to this day. Absolutely, a side effect of this is that you need to be on good working terms with your vendor(s) and your vendor(s) need to be competent. I haven't seen anything that says the vendor is incompetent, just that they disagree about the severity of the problem so far.

    Sure, as a programmer I would like to think that every single company should be employing programmers and custom-writing all their own stuff. Just like 1975. Except that isn't the way things turned out. Too bad, fewer programmers employed. But it is how things work today and nobody is going back.

  • I have no information about this instance, but the public face of the company you are speaking to is not always knowledgeable or intelligent. I work for a software company, and I have seen instances where a customer reports an issue and has it waved away without consultation by the first-contact support employee, usually under hand-waving like "no-one else is seeing this problem", or "I haven't been able to reproduce it". In most of these cases, an experienced person has not been consulted, and there has

  • by Anonymous Coward on Sunday April 11, 2010 @03:17PM (#31810788)

    The reason companies don't like people disclosing their security holes is not only do they have to release a fix, they also have to slip in a new hole and make sure most of their botnet successfully migrates to it. Since there is a gradual uptake of patches and people tend to drag their feet installing a given patch, botnet performance can be severely impacted, reducing the marketability of its services.

  • Zero Day Initiative (Score:2, Interesting)

    by martinlp ( 904606 )
    Why not get these [zerodayinitiative.com] guys to help you? Maybe the vendor will take them seriously.
  • Neighbours (Score:5, Funny)

    by lennier ( 44736 ) on Sunday April 11, 2010 @05:37PM (#31811874) Homepage

    "Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."

    And after dinner, did they then require you to take them to a movie, a concert, some clubs and a night of passionate...

    Excuse me, I'm just going to go check all the car headlights in my street. Be right back.

  • If a vendor blows you off with a problem this time, then you know that that's a stupid path to take next time. An anonymous report (preferably to other users, rather than to the public at large -- which is more likely to include black-hats) is a bit less of a political problem if they don't know that you already know about it.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...