Why Responsible Vulnerability Disclosure Is Painful and Inefficient
A recent rant up at Attrition.org highlights problems with the responsible disclosure of security issues. While some vendors are happy to do their own research and patch reported problems, others drag their feet and make unreasonable demands on a researcher's time and effort, making anonymous public disclosure an ever-more-tempting option. Quoting:
"After a couple hours of poking, I found a huge unauthenticated confidentiality hole. Once the euphoria wore off, I realized I had a big problem on my hands. I had to tell my employer's app owners and we had to assess risk and make a decision on what to do about it. After some quick meetings with stakeholders, we decided to severely limit access to the thing while we worked with the vendor. The vendor refused to acknowledge it was a security issue. Odd, considering most everyone who sees the issue unmistakably agrees that it is not acceptable. Now I'm forced to play hardball, yet nobody wants to fully-disclose and destroy relations with this vendor, whose software is somewhat relied on. Meanwhile, I know there are hundreds of institutions, small and large, using this software who have no idea that it has flawed security and who would probably not find the risk acceptable. What can I do? Nothing. Oh well, sucks to be them. ... I've had a vendor tell me to put a webapp firewall in front of their software. Did they offer to pay for it? No. That would be like Toyota telling its customers to buy ejector seats (unsubsidized ejector seats, that is) to resolve the accelerator problem in their vehicles. I've had other vendors demand I spend time helping them understand the issue, basically consulting for free for them. Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."
I'd just like to report (Score:5, Funny)
Re: (Score:2)
We call that a "race condition." It happens on every /. article.
Blow the Whistle (Score:2, Insightful)
What can you morally do otherwise but blow the whistle?
If you keep playing ball with these people, they will continue to act in the fashion that they do. Nothing will change. They are basically getting away with it.
Give them some pain now and let this be a lesson both to them and to others.
Re:Blow the Whistle (Score:5, Interesting)
Maybe things are different now with more hackers looking for holes, but back then you could find bugs, disclose them responsibly and privately, have the vendor/site still not fix them for years, and nobody notices - not even the hackers.
They eventually might fix it in some future release (not even the next major release
So nobody's hurt. At least, not that I knew of, which is the other problem: who knows, right? Maybe a hacker did find it and was very discreet. Ignorance is bliss, or not.
It's just like leaving your car unlocked somewhere with the key in the ignition. In certain areas it's a sure thing that the car will be gone the next day, but not everywhere.
For the more obscure areas, if you do disclose it publicly, the risks to the users go up more than if you had kept it quiet.
In the long term maybe the users would be better off using something else, but who is to say the other stuff isn't as crappy? It's not like everyone has so much time to try to exploit everything. Even the security researchers don't look at "everything".
Re: (Score:2)
I think that there is hurt. First, the lack of acknowledgement itself hurts the trustworthiness of the vendor. If you can't patch it yourself, or find a way to diminish the threat the problem represents, then you have a fiduciary and legal responsibility to your organization.
I'd send information higher in the food chain than you're dealing with. Tenacity may be necessary to get people to understand the gravity of the situation.
Re:Blow the Whistle (Score:5, Insightful)
Yes there is a hole. But there are zillions of holes. If the hacker wants in, they've probably already got in via some PHP exploit or some malware infected PC.
If you've found an exploit in IE or Firefox or Windows or OS X or ssh, sure, go kick up a big fuss. Or just wait till the next Pwn2Own competition.
Whereas if you've found an exploit in some ADSL modem router made by some Korean company, and they're not fixing it after you've reported it. Big deal. Hardly any of the routers will still be working 3 years from now. How many of you got pwned because of those infamous bugs that were present for years without anyone knowing? How many hackers would make a lot of $$$$ from exploiting some non-"vastly deployed" software in your company, and get away with it? Would they be better off concentrating on building windows botnets?
If it's a problem with some vendor supplied app that your company uses, disclose the problem to the bosses. If the vendor doesn't want to fix it, and you're not the boss, it's between the boss and the vendor. You provide info and advice. Of course you don't play down the risks - that's not your job - that's the vendor's job
Re:Blow the Whistle (Score:5, Insightful)
Send the vendor an email:
Thank you for clarifying that this is not a security issue. I can now safely post information about this non-security issue on public websites and ask for workarounds from helpful hackers. This would not be possible for security-related bugs, as those are best kept secret for a few weeks so that the vendor can provide and distribute a fix.
Re: (Score:3, Informative)
I wouldn't do that, since I suspect there's a big overlap in the group of vendors who go "nonsecurity issue" when it is one, and the group of vendors who would sue you if you post the "nonsecurity issue" publicly.
Maybe you could sell the vulnerability to one of those security sites, black, white or grey: http://blogs.verisign.com/idefense/2010/02/casablanca-buying-vulnerabilities-and-digressing.html [verisign.com]
Re:Blow the Whistle (Score:4, Interesting)
It's the year 2005. You know that Toyota cars have a programming bug that causes them to accelerate and ignore any other inputs (the brakes, or shifting into neutral). What do you do?
It's the year 1970. You know that Ford cars have a design flaw that makes the gas tank explode in an accident. What do you do?
In the 1970 instance, investigative journalists eventually exposed Ford's gas-tank flaw (Nader's "Unsafe at Any Speed" had done the same kind of whistle-blowing for GM's Corvair). In the 2005 case, I'd simply post what I discovered to Toyota-oriented websites and also call the U.S. National Highway Traffic Safety Administration. Otherwise Ford/Toyota would never have fixed the problems with their cars.
I'd also report this software bug, since the vendor seems inclined to pretend it does not exist. Better to be a whistle-blower and save lives than wait until damage is done. You can't resurrect corpses, but you can warn people while they're still alive, so they can act to protect themselves.
IMHO.
Please don't mod me down just 'cause you disagree.
Re: (Score:3, Insightful)
There is no need to play nice with security issues anymore. Just tell them in general terms that you have acquired knowledge of a bug (not that you found the bug yourself), and then provide that info to a suitable magazine or publish it on WikiLeaks.
Re:Blow the Whistle (Score:4, Interesting)
What can you morally do otherwise but blow the whistle?
You could sell it, responsibly.
If you sell the exploit to something like the Zero Day Initiative [zerodayinitiative.com] or iDefense [idefense.com] you won't have to deal with the vendor, they will. And they are far more experienced at that as well. That way you'll get rid of your current problem, the issue is dealt with properly and you might even earn a few bucks in the process.
Disclosing Exploitz (Score:4, Interesting)
FTA:
I've even been accused of being a spy for a company's competition (true... ask Jericho)...
ME: "Hi, you left your headlights on."
NEIGHBOR: "WHO SENT YOU? DID MY EX-WIFE SEND YOU? ARE YOU SLEEPING WITH HER?"
WTF? Seriously?
---
Compare how badly companies deal with vuln disclosure with how game companies deal with cheats and exploits. Well, MOST game companies...
Don't sweat it. When you find them, just slip it (Score:4, Insightful)
To the net. The current dominant dog-eat-dog capitalist corporate culture makes corporations suppress and hush up this stuff rather than spare the effort to fix it.
Re: (Score:3, Insightful)
The second one is the fact that for large software packages, there is no such thing as "quick fix". When you have a few million lines of a c
Re: (Score:2)
If customers support such companies, then maybe they ought to pay. Why should such customers sanction bad software?
Re: (Score:2)
Maybe I'm being harsh, but the best way to stop bad software is to not have customers buy it.
Re: (Score:2)
I am sorry, but I do not buy your argument about 2 million lines being closely tangled with a security hole. Usually code is structured into independent modules, and especially the authentication/authorization part is encapsulated in a server.
That varies vastly, depending on the product and the nature of the security flaw.
Second, a few weeks is not that much of a problem - if the customers are aware of the problem, they can apply workarounds to mitigate the problem while waiting for the fix. In the worst case they can stop using the software. Consider the worst case for the 'responsible disclosure' where the vendor hides their head in the sand while an exploit is being circulated in the black hat community.
Very true, but none of this builds a valid argument for "just tell the world and let the chips fall where they may".
Re: (Score:2)
Come on, you overdid it. 2 million lines tied to a security flaw is beyond scale and measure.
A bit late (Score:4, Insightful)
Re: (Score:2)
Does capitalism typically reward those who meet the customer's needs? How would we know that?
Leak it (Score:5, Interesting)
Leak a working exploit anonymously. If a vendor isn't concerned with the security of their users, let them pay the price.
Re:Leak it (Score:5, Insightful)
Interesting, but...
You're then basically telling anyone who might employ such an exploit how to compromise your system.
My responsibility is to protect my system first, and then to help other people protect their systems.
I'm not saying that a leak of an exploit isn't good, I'm saying you need to make sure that something is protecting your system from that exploit before you leak it.
Re:Leak it (Score:5, Insightful)
You're then basically telling anyone who might employ such an exploit how to compromise your system.
And you're assuming that someone who wants to exploit your system doesn't already know how.
Re: (Score:2, Insightful)
And you're assuming that someone who wants to exploit your system doesn't already know how.
If the exploit is secret, maybe they do know, maybe they don't.
If the exploit is publicly disclosed, they almost certainly do.
Given the stated situation (of having a vulnerable system) the former is preferable to the latter.
Re:Leak it (Score:4, Insightful)
If the exploit is secret, maybe they do know, maybe they don't.
If the exploit is publicly disclosed, they almost certainly do.
If the exploit is secret and maybe the bad guys know about it and maybe they don't, the only safe course is to assume they do know about it and act accordingly. It's like a door: maybe a burglar will come around trying it and maybe he won't, but you still don't go on vacation leaving the front door to your house unlocked because it's not worth the risk if he does come around.
Re: (Score:2)
First thing, though: the probability it's known is 100%. After all, you know about it, right? What makes you think you're so unique, so special, so much more brilliant than any other person in the world that you're the only one who's noticed it and figured it out? Nothing. In fact, given the number of people in the world it's a virtual certainty that someone smarter than you found this vulnerability before you. So the probability it's known is 1.00, the only question left is the cost to secure yourself agai
Re: (Score:3, Insightful)
Given the stated situation (of having a vulnerable system) the former is preferable to the latter.
*sigh* Logic. You fail it.
Given that an adversary *might* be able to exploit a vulnerability, the "preferability" of any given scenario is completely irrelevant - you take steps to make sure they can't. Seriously, saying "oh, if I wish hard enough, maybe nobody will exploit it" is FUCKING BRAINDEAD.
If a vendor refuses to fix a vulnerability, then you must (by necessity) make the problem as widely known, so that as many of the vendor's customers (both current and future) can apply pressure to get them to
Re: (Score:2)
Just because the thieves know how to smash my windows and pick my locks doesn't mean I give them keys to my house.
No reason to make it easier (Score:4, Insightful)
As somebody else pointed out ... the least you owe to your employer is to (attempt to) lock down your own system before you tell the world where the hole is.
Re: (Score:2)
If a security vulnerability that you consider to be very serious crops up and the vendor doesn't want to fix it, you can at least take your data and play elsewhere, even if it takes several months o
Re: (Score:2)
Basically? That is exactly what you are doing. That is the whole point...
Re: (Score:2)
That could be illegal...
The law is irrelevant..
Re: (Score:2)
So leak it first, wait for it to show up somewhere, and *then* contact the vendor and point to the leaked exploit.
Maybe I'm just cynical, having been blown off many times in the past. Generally speaking, the only way to get technical attention is to make non-technical people freak out.
Hushmail and full disclosure (Score:3, Interesting)
Use Hushmail and the Full-Disclosure mailing list to report stuff like this.
There's no sense wasting time; do it anonymously, use Tor.
Also while you do it, be sure to post personal information about other full disclosure users so that your email is removed from the official archives.
Re: (Score:2)
How much damage would be done to them financially if it was discovered they were warned of a massive vulnerability and they did nothing? From lost sales to possible actual financial losses stemming from lawsuits from the affected parties.
Re: (Score:2)
Or do they have a moral obligation to their shareholders to not spend money if they don't have to (keep up the bottom line)? Or do they have an ethical obligation?
Neither. They have a legal obligation to look after that bottom line. That means working to keep the company healthy in the long term. They could also hire bank robbers and cat-burglars to prop up the bottom line, but the long term effects on the company will be quite negative.
Anytime a company decides to make short-term cuts at the cost of long-term gains (to spare the bottom line), they "can't see the forest for the trees."
Re: (Score:2)
They may go down that path, but the "justification" is nothing more than the rationalizations of someone who finds a wallet and decides to keep the money. Of course, that is exactly the sort of thing that leads others to rationalize various cheating on licensing (they're ripping us off, I'm just evening out the balance by doing an extra couple installs). Corporations that understand no value other than direct profit are leading the way to a hellworld where everyone cheats all the time. I just hope that if t
Re: (Score:2)
Or do they have a moral obligation to their shareholders to not spend money if they don't have to (keep up the bottom line)?
I think that's the point: If the company is going to go by the "bottom-line-trumps-all" ideology, then users get to make it relevant to the company's bottom line by letting consumers know about the company's inferior customer care initiatives.
Are those really problems with res. disclosure? (Score:5, Insightful)
None of those problems listed seem to be with responsible disclosure. It's your job to responsibly disclose. And you should protect their secret for a while. After that, it's not really your problem if they won't or can't act.
I agree there are also issues here with relying on code that you now know has security issues. But those aren't anything to do with responsible disclosure either. If you posted it to the internet you'd still have issues relying on them. Same as if you didn't tell anyone.
Look at it this way: when you tell a vendor what's wrong and try to help them fix it, you're really doing it to help yourself. You're doing it because you believe it will be less work than changing your entire system to not rely on their code.
As an aside, I don't get a big rush when I find a problem in someone else's code. Maybe I'm just old and jaded now, but I'm just trying to get everything to work well, finding that someone else didn't do their job doesn't usually make my situation any better (as you indicate here), so I don't relish it.
Re:Are those really problems with res. disclosure? (Score:4, Interesting)
As an aside, I don't get a big rush when I find a problem in someone else's code. Maybe I'm just old and jaded now, but I'm just trying to get everything to work well, finding that someone else didn't do their job doesn't usually make my situation any better (as you indicate here), so I don't relish it.
Additionally this guy has the added burden of relying on this company for what sounds like a major part of their IT infrastructure.
If the software is extremely specialized then it is possible there might only be a couple hundred customers.
If this guy mentions that he found it, then blaming him for any anonymous leak that follows (say, within 5 years) would be a simple conclusion to jump to.
Remember, a company is not a court, so evidence and logic are not a requirement in producing an outcome.
I would probably hand the info over to a big whitehat group (anonymously if need be, thou an explanation of the situation will probably get the same response) and have THEM claim discovery and report it to the vendor using responsible disclosure.
This way the group can say "You have ___ weeks to release a fix if you like, but on ____ date we will release this exploit to the public."
It helps in the emotional/illogical areas too: someone at the vendor who did correctly suspect the customer will have a much harder time convincing others of that when such a well-known group is actively and directly taking the blame.
Additionally, the vendor's other customers will have no reason to harbor negative thoughts against this one, if there is an active community in any shape around this software.
Where I work, we have a similar situation with the software package that runs our enterprise.
There are still a couple of Yahoo message groups where customers can post and talk with each other about ways of solving problems. The vendor's staff even monitor the groups, and official bug fixes have resulted simply from a single IT guy posting a complaint.
It is a wise idea to remain in good standing with such a community, which might not realize that its soon-to-be-public security problems are the fault of the vendor for not fixing them, not of this customer for being the messenger of bad news.
Re: (Score:2)
I'd say you're usually paying more than a little bit.
But again, you're talking about a "my vendor sucks" problem, not a responsible disclosure problem. Irresponsible disclosure, or no disclosure at all, won't change the fact that your vendor sucks and that you now know you are relying on a product with known security flaws.
A misunderstanding of responsible disclosure ... (Score:5, Insightful)
IMO, RD is supposed to entail a good-faith effort to notify the vendor and delay public disclosure for a reasonable amount of time (i.e. not dragging their feet) while they push a patch. If you notify the vendor, preferably including a test case, and they refuse to acknowledge that it is a security issue or suggest ridiculous fixes, that's the end of your RD duties. Ethically speaking, you are in the clear. RD requires you to give them a chance to fix the problem before publicizing it, nothing more.
Now, vendor/rube relations are another matter entirely that are distinct from your duty of responsible disclosure. I don't envy being in the situation where you fear their wrath for disclosure but want to do the right thing.
Re: (Score:2)
The problem is that if you do the responsible thing, the vendor ignores it, and it gets posted to bugtraq, it can be pretty bad for you. Doesn't matter if it was you who posted there or not, or if it was morally right to post it.
You have to make the choice beforehand: either go via the vendor or go via bugtraq, you can't do one and then switch to the other.
wow (Score:5, Funny)
No. That would be like Toyota telling its customers to buy ejector seats (unsubsidized ejector seats, that is) to resolve the accelerator problem in their vehicles.
Where can I sign the petition to make that happen?? O_O
Re: (Score:2, Funny)
You could ask M5 [discovery.com] to mod your car.
The roof mod is extra, though.
Re: (Score:3, Funny)
If they amp the power enough, the roof mod will come free with the first use.
Re: (Score:2)
Depends on whether your name is Buster...
Not the same thing at all... (Score:5, Insightful)
"I've had other vendors demand I spend time helping them understand the issue, basically consulting for free for them. Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."
You say it happens, they can't reproduce it. Thus, you have to help them understand what it is you're doing. It's not unusual for people to think they've found a bug when they in fact have not.
Re:Not the same thing at all... (Score:5, Insightful)
I agree, too. There is a certain amount of working together that is expected when you report a bug. Even once the bug is identified and reproduced, you should still be willing to help them verify that the bug is fixed as well.
It's even in -your- best interest to do so, so that the bug gets fixed properly and quickly.
I've done this before with third-party libraries that had issues with my code. I'll even admit that about 1/4 of the time, it was actually my fault and not a bug in their code after all. And sometimes the bug fix was extremely simple and I had to jump through hoops to prove that it existed... That's just life.
Re: (Score:2, Insightful)
not understanding != not reproducible
For all you know, he submitted a 2-page e-mail detailing the problem.
Re: (Score:2)
The length of the submission isn't the concern, the quality is. And even if it was a high quality description that does not mean he included everything needed to reproduce the problem. Often times a user does not understand the implementation enough to know what is relevant and what is not.
Re: (Score:2)
They say they can't. How hard have they tried? Have they really tried at all? Chances are, it's been thrown into their bug tracking system (if they have one) and some tester has spent 5 minutes poking at it, then gone back to working on paid projects instead.
"have"? I do not think that word means what you think it means.
If you believe that you have a repeatable exploit, and you've given it to th
Re: (Score:2)
If they're willing to work with you to fully understand the bug, then chances are that they'll throw better money after good and actually fix it once they understand what went wrong.
Re: (Score:2)
Or perhaps they're just not trying very hard. I say if you have the time, help until it's clear they're just not trying. Then suggest that if they want more help, you'll need all of the source and perhaps send them your hourly rates. If you don't have the time, just say so and leave it with them or tell them how much they'd have to pay for you to make time.
Educational institution? (Score:3, Insightful)
I'm betting this is software being used in an educational or other not-for-profit environment. If so, one thing you have going for you is that employees in that sector actually talk to people from other institutions. If disclosing your work-around wouldn't just give away the entire problem, I'd actually publish what you did to protect the application. That way, your peers can decide if they want to do it and get a head start. It puts the vendor on notice that someone is going to notice this problem eventually.
If your workaround gives away the entire problem, that's more difficult. Assuming education / not-for-profit, I'd start by talking (verbally) to peers at schools using the application, and see if you can get some traction that way. A group of pissed-off customers might be more effective. Your CIO may be participating in regional or national organizations, and may be able to talk to his or her peers about the issue as well.
If you just release it on your own in an "in-your-face" way without your bosses signing on, they could decide to take it out on you if the vendor gets pissed or tries to go after you for violating some gag clause in the licensing agreement (some have them) or damaging their business. They shouldn't, but I can think of a couple of really stupid, obnoxious vendors that might.
Is it an issue or not? (Score:5, Insightful)
The vendor refused to acknowledge it was a security issue.
Then it's either a feature or a regular old non-security-related bug, and I don't see the problem with announcing it to everybody. Right?
-- 77IM
Re: (Score:2, Insightful)
The vendor refused to acknowledge it was a security issue.
Then it's either a feature or a regular old non-security-related bug, and I don't see the problem with announcing it to everybody. Right?
If you are really certain it is a valid issue, take a look at their marketing page and find out who their reference customers are. If you can get some of their important customers to raise this issue as well, you may have better luck getting some action on it without disclosing the vulnerability to the world at large.
Sue Them (Score:4, Interesting)
Sue them: take them to court, where the threat of bad publicity will scare the hell out of them into settling.
As long as they view it as a technical issue, they are not interested. If they view it as a sales/marketing nightmare they will come to the table in a hurry. As for further cooperation, they will suck it up because other customers will see how they react as to how they will treat the company.
Like spousal abuse, as long as a bad working relationship can be hidden, they will get away with it with other individuals/companies. The only way to address the issue is to make their dirty laundry public.
Re: (Score:3, Insightful)
> Sue them
On what grounds? They didn't do what he wanted them to do?
Re: (Score:2)
The product is not fit for the purpose it was sold for?
not a real help. anyway (Score:4, Insightful)
I had this kind of problem about 15 years ago. It was a real pain. My problem was that I wasn't able to publish my findings, as the vendor made it pretty clear he'd sue me over it. So I reviewed my requirements for software and found a solution I hadn't heard of before: Open Source. I've used Open Source ever since, and though it has some minor drawbacks, I don't regret the move.
cb
There is a great forum for fixing such bugs (Score:5, Interesting)
Re: (Score:2)
I do not know the law in his jurisdiction, but here (and almost everywhere else) one (1) share would be enough.
Of course, you have to foot the bill yourself to travel there.
Re: (Score:2)
The shareholder meeting. Simply note that you reported bug # XXXX some months ago, and it has not been acted on. You wouldn't mention it except that it's a security vulnerability that, if disclosed, would tank the share price for the company. So in that light, when will this vulnerability be addressed? Let everyone else take it from there.
In that same vein, get the company to include a (responsible) disclosure policy in its handbook.
It'd make a handy stick to beat recalcitrant vendors with:
"Here's the problem, here's my company policy on disclosure. You choose what happens next."
(Of course, this assumes your company is willing to let you disclose anything)
Patch it first (Score:2)
If your company sold them security software with a bug, your company should first fix the bug and then give them a patch for it (free of charge).
Not tell them that it's bugged and ask them to pay you to fix it. If they don't want to accept the patched version, that's their problem, but most of your other customers will.
Responsible disclosure is Security Through Obscurity (Score:2, Insightful)
And we all know what it really is.
Protect your system as well as you can (firewalls, backups, whatever), or just rely on your own obscurity, and publish!
Act surprised, act I-told-you-so, be outraged at whatever happens, and then, in a few days, install a patch.
Buy good WAF then blow the whistle (Score:2)
Frankly the web app firewall idea is the most appropriate solution to this entire category of problem for your organization. You should want one, and this is just one more datapoint as to why.
Secondly, if they won't fix the problem (and I don't mean won't do it quickly, I mean won't do it at all) then I'm sure someone else will discover it and anonymously disclose it. ...Cough... Right? ...Cough...
Re: (Score:2)
Full disclosure: I work for a company that, among other things, makes a commercial Layer 7 firewall...
Re: (Score:2)
Do you know what a web app firewall actually does? It's practically a proxy server that sanitizes the inputs, meaning for each application you have to make custom rules which are more involved than just "allow 0.0.0.0/0 to 192.168.1.1 port 80". Especially if you have a very custom-built application, for each data input (each field) you will have to specify what form of data it is and what the valid ranges are. This SHOULD be done within the application already by the developers, there is no reason to go ou
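The per-field whitelist rules the parent describes can be sketched in a few lines. This is a hypothetical illustration (the field names and patterns are made up, and real WAFs such as ModSecurity express this as rule configuration rather than application code), but it shows why the work duplicates input validation the application should already do:

```python
import re

# Hypothetical per-field whitelist: every input the app accepts must
# match an explicit pattern, or the request is rejected outright.
FIELD_RULES = {
    "account_id": re.compile(r"^[0-9]{1,10}$"),        # numeric id only
    "email":      re.compile(r"^[^@\s]+@[^@\s]+$"),    # crude email shape
    "comment":    re.compile(r"^[\w .,!?'-]{0,500}$"), # no markup, bounded length
}

def waf_check(params):
    """Return True only if every parameter is known and matches its rule."""
    for name, value in params.items():
        rule = FIELD_RULES.get(name)
        if rule is None or not rule.fullmatch(value):
            return False  # unknown field or bad value: block the request
    return True
```

Writing these rules means re-describing, in front of the proxy, every single input the application accepts, which is exactly the parent's point about the burden being shifted onto the customer.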
Re: (Score:2)
There is plenty to argue against your FUD, but in reality I wasn't recommending a WAF in the traditional preemptive protective capacity.
If you have a good WAF then when you, the person responsible for security, discover a vulnerability you, the person responsible for security, can patch it in minutes.
Even if your applications are completely in-house developed it still takes time to get the vulnerability fixed and it's not within your control how long that process takes! With a WAF you take all of t
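What the parent describes, patching the hole yourself in front of the app while you wait on the vendor, is often called "virtual patching". A minimal sketch as WSGI middleware (the `/export` endpoint is a hypothetical vulnerable path, not from the article):

```python
class VirtualPatch:
    """WSGI middleware sketch: block requests matching a known exploit
    pattern until the vendor ships a real fix."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        # Hypothetical hole: /export leaks data when hit without auth,
        # so deny the whole path at the edge.
        if environ.get("PATH_INFO", "").startswith("/export"):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Blocked by virtual patch"]
        return self.app(environ, start_response)
```

The appeal is the turnaround time: the rule goes in as soon as you understand the exploit, and comes out once a real fix ships, all without touching the vendor's code.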
Re: (Score:2)
You're assuming that the WAF doesn't introduce its own security issues. I've seen more than a few that can be remotely crashed, thus DoSing everything behind them, and I've even seen one that can be exploited to run arbitrary code -- by adding the "security" of a WAF you might just be adding an Internet-exposed, remotely-exploitable host that not only has access to your internal app servers, but also all their input and output, and the ability to arbitrarily manipulate both.
And then there's all the regular
Re: (Score:2)
Well, you're either naive and misinformed, or you're trying to be dense.
First of all we're talking about a specific use case for a WAF- not WAFs as the panacea of network or application security. I don't think I ever stated that simply adding another "security appliance" necessarily increased security overall. I did say that with a WAF you can patch holes like this yourself. Do you really disagree with that?
That said:
Nothing you posted is an inherent problem with WAFs (over firewalls, routers, managed sw
Expose it (Score:2)
...anonymously.
It's only a matter of time before the black hats find it, so by publicizing it and forcing it in the vendor's face they'll see it for what it is.
Lawsuit? (Score:3, Interesting)
I don't know what kind of terms are written in your contract with these people (or how big your company is compared to the software company), but maybe your company can sue them for failing to properly maintain their software. Right now, find other companies that use this software, tell their administrators about the vulnerability, and let the pressure build on the software company. Give them a little more time to either get their shit together or blow you off, and then threaten a lawsuit. Publicly disclosing the vulnerability would be irresponsible, if only for the sake of your own security, so you should be doing all you possibly can to avoid having to do that.
Limit responsible disclosure. (Score:5, Insightful)
Simple answer:
Responsible Disclosure should be limited to vendors that publicly pledge (or, preferably, contractually agree to via their licensing terms) to Responsibly Fix issues that are disclosed to them. If a company doesn't abide by their own Responsibly Fix policy, it should be disclosed so that others realize that it's null and void.
gentlemen (Score:3, Insightful)
Well, I think it works like this. If vendors as a group want to encourage more "responsible disclosure", they need to operate in such a way that they take potential vulnerabilities seriously, and I don't just mean in a "we're taking this very seriously" kind of way, but more of a "we have a dedicated, knowledgeable staff member/team to look into situations like this" sort of thing. If they decide it's not an issue after all, then any responsibility you have to the vendor is over regarding that issue. If they're not willing to even consider it as an issue after you've made a good faith effort to let them know how much of a problem you think it is, any responsibility you have to the vendor is over regarding that issue.
In short, a gentleman's agreement only works if both parties are gentlemen.
Getting their attention (Score:5, Interesting)
It's hard getting the attention of some vendors. I see vulnerabilities in a slightly different context - hacked web sites hosting phishing pages. We distribute a list of major domains being exploited by active phishing scams. [sitetruth.com] This is obtained by processing PhishTank data, and we do this because we want to reduce the collateral damage from a tough blacklist system. At any given time, there are about 30 to 80 domains on the list.
Some sites get themselves off the list quickly. By now, most of the better free hosting services and short-URL services are automatically checking PhishTank and the APWG blacklist to see when they've been hit. Today, if you run a service where anybody can put up a page that could be used for phishing (i.e. it's not full of your own headers and banners), you need automation to deal with attacks. I've been in contact with the abuse guy at "t35.com", which is a free hosting service. They've recently been hit by a flood of phishing attacks, with several hundred new reports in PhishTank per day. The attacks were coming in faster than the abuse guy could clean them out. They're now gaining on the problem, but haven't squashed it yet. Take-away lesson: automate this.
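The automation recommended above might look roughly like this: scan a PhishTank-style feed of reported phishing URLs for entries hosted on domains you operate, so they can be queued for takedown. The feed shape (a list of dicts with a `url` key) and the domain names are simplified assumptions, not PhishTank's exact schema:

```python
from urllib.parse import urlparse

# Hypothetical domains this hosting service operates.
OUR_DOMAINS = {"example-hosting.com", "short.example"}

def flag_our_urls(feed_entries):
    """Given PhishTank-style entries (dicts with a 'url' key),
    return the ones hosted on a domain we operate."""
    hits = []
    for entry in feed_entries:
        host = urlparse(entry["url"]).hostname or ""
        # Match the domain itself or any subdomain of it.
        if any(host == d or host.endswith("." + d) for d in OUR_DOMAINS):
            hits.append(entry)
    return hits
```

Run on a schedule against the feed, this is the difference between an abuse guy cleaning pages out by hand and actually gaining on a flood of attacks.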
The ones near the top of the list have been there for a while. Note the dates: each is the date the oldest phishing report still online and active appeared in PhishTank. Some just need help. Typically, these are small organizations like churches and nonprofits that have had a break-in and were partially taken over by a phishing site. I send them the Anti-Phishing Working Group's "What To Do if your Site Has Been Hacked". [antiphishing.org] Sometimes I give them a phone call. They deserve sympathy.
Then there are the hard cases. These are sites with no visible contact address, or a clueless abuse department. At the moment, Google Sites and Google Spreadsheets are being used for phishing. Google is new to the free hosting business, and the phishers have discovered some tricks that Google can't yet handle. While Google puts a "report abuse" link on their site pages, it's possible to set up a file for downloading on Google Sites, and an HTML page can be served that way [phishtank.com], without Google's abuse checking. There's also an exploit of Google Spreadsheets [phishtank.com]. That one is an example of Habbo Hotel phishing. [bbc.co.uk] We've reported these to Google several times, but they haven't been fixed yet.
We've been seeing a new type of attack recently - a phishing operation breaks into a shared hosting server and plants phishing pages on multiple domains on a single server. One of these hit one of the mysterious "*.websitewelcome.com" servers, which has "cloaked domain registration" and no useful default web page. These seem to be associated with "ThePlanet.com", but whether ThePlanet operates them, is providing wholesale hosting, is providing colocation, or is just the upstream connectivity provider is not clear.
Hiding the contact information of a hosting provider is legally unwise. The hosting provider may lose the "safe harbor" protection of the DMCA. [cornell.edu] The "safe harbor" provision for "Information Residing on Systems or Networks At Direction of Users" only applies if "the service provider has designated an agent to receive notifications of claimed infringement... by making available through its service, including on its website in a location accessible to the public, and by providing to the Copyright Office, substantially the following information: the name, address, phone number, and electronic mail address of the agent." So when the RIAA or the MPAA come calling, a likely event for a hosting service, they get
The real painful and inefficient thing? (Score:3, Funny)
That analogy. Stop it.
Inform your corporate attorney of all the facts (Score:5, Interesting)
Re: (Score:3, Interesting)
If the OP's company is a publicly traded corporation, and if this exploit represents any kind of risk to investors in any form, they are already required to include it in the next filing.
Re:Inform your corporate attorney of all the facts (Score:5, Interesting)
Exactly.
And this is the right way to go about it.
And the nice thing is that when you report it in the filing, the other company will be shamed because other people will figure out who it is, yet you are not really adding to your own risk, because in the SEC filing you use wording much like what was used in this article:
"We have discovered a security vulnerability in a third party web application platform we use. If this is successfully exploited, (list of bad things that could happen). We have informed the vendor, and they have so far been unresponsive in fixing it. We are working diligently to deploy a firewall in front of the applications, but there is no guarantee that this vulnerability will not be exploited before we have that fully deployed, or the vendor gives us a fix."
This is exactly the chain of command/control you want to use. You tell your counterparts at the other company that your company's policy requires you to report it to the lawyer, the lawyer tells his counterpart that SEC rules require him to disclose the vulnerability, and then he does it in the next report. End of story.
Then the ball is in the other company's court to either fix the vulnerability or prove to your satisfaction that it isn't a problem.
Where are your (company's) priorities? (Score:2)
As long as having a good working relationship with a vendor that you and your company know is incompetent and unethical is more important than your security and the principle of doing things right, you get what you deserve.
You know what the exploit is. But what makes you think you are the only one? What makes you think no unethical hackers already know about it, or won't find out about it during the life of this exploit (which, given the vendor's attitude, could be very long)?
You (your company) needs to ta
Reality check here folks (Score:3, Insightful)
First off, there are very few software packages of any size that are sold with terms and conditions that would allow anyone to sue the vendor for any reason. The only path I could possibly see is an agreement that bugs will be fixed - but this vendor is disclaiming that this is a bug. So even that probably isn't going to get you anywhere. Suing is almost certainly not an option unless you have money to burn.
There are far too few details in the posting to explain how this software is used and what its function is. It could be something that is on an internal web site and the exposure is from potentially unqualified employees accessing internal information. It could also be a public web site that allows people in Eastern Europe to get enough medical information on celebrities to blackmail them. Who knows?
If the vendor is saying it isn't a bug, it is doubtful many other customers are going to immediately see that it is a problem. They might after a while, again depending on how this software is used. So going to other customers isn't likely to be very useful in the short term.
Certainly this is the result of buying packaged software rather than writing everything yourself. You give up a certain amount of control. Open source isn't a solution, because if you aren't familiar with the code it could take a very long time to learn enough about it to do anything like fixing it. I'm sure you could pay a consultant to learn the package and fix it, but then you could also just pay a consultant to write you a system the way you wanted in the first place.
This is the age of the packaged software solution. It started around 1985 or so and continues to this day. Absolutely, a side effect of this is that you need to be on good working terms with your vendor(s) and your vendor(s) need to be competent. I haven't seen anything that says the vendor is incompetent, just that they disagree about the severity of the problem so far.
Sure, as a programmer I would like to think that every single company should be employing programmers and custom-writing all their own stuff. Just like 1975. Except that isn't the way things turned out. Too bad, fewer programmers employed. But it is how things work today and nobody is going back.
Make sure you speak to the right people (Score:2, Insightful)
I have no information about this instance, but the public face of the company you are speaking to is not always knowledgeable or intelligent. I work for a software company, and I have seen instances where a customer reports an issue and has it waved away without consultation by the first-contact support employee, usually under hand-waving like "no-one else is seeing this problem", or "I haven't been able to reproduce it". In most of these cases, an experienced person has not been consulted, and there has
company botnets (Score:4, Funny)
the reason companies don't like people disclosing their security holes is not only do they have to release a fix, they also have to slip in a new hole and make sure most of their botnet successfully migrates to it. since there is a gradual uptake of patches and people tend to drag their feet installing a given patch, botnet performance can be severely impacted, reducing the marketability of its services.
Zero Day Initiative (Score:2, Interesting)
Neighbours (Score:5, Funny)
"Have you ever knocked on a neighbor's door to tell them they left their headlights on? Did they then require you to cook them dinner? Exactly..."
And after dinner, did they then require you to take them to a movie, a concert, some clubs and a night of passionate...
Excuse me, I'm just going to go check all the car headlights in my street. Be right back.
Next time.... (Score:2)
did you post this in the wrong place? (Score:3, Funny)
Or perhaps this is some kind of steganographic secret message you are passing onto one of your field agents?
Your response has nothing at all to do with the situation here.
Re: (Score:2)
Re: (Score:2)