Google Cloud Security

Questioning Google's Disclosure Timeline Motivations 73

An anonymous reader writes "The exploitation of 0-day vulnerabilities is often a real and considerable threat to the Internet — particularly when very popular consumer-level software is the target. Google's stance on a 60-day turnaround of vulnerability fixes from discovery, and a 7-day turnaround of fixes for actively exploited unpatched vulnerabilities, is rather naive and devoid of commercial reality. As a web services company it is much easier for Google to develop and roll out fixes promptly — but for 95+% of the rest of the world's software development companies making thick-client, server and device-specific software this is unrealistic. Statements like these from Google clearly serve their business objectives. As predominantly a web services company with many of the world's best software engineers and researchers working for them, one could argue that Google's applications and software should already be impervious to vulnerabilities (i.e. they should have discovered the flaws themselves through internal QA processes) rather than relying upon external researchers and bug hunters stumbling over them."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • You suck (Score:5, Insightful)

    by AmiMoJo ( 196126 ) * on Saturday June 01, 2013 @10:31AM (#43882743) Homepage Journal

    If your company is producing security critical software and doesn't have a plan for quickly dealing with bugs you suck.

    • Re:You suck (Score:5, Insightful)

      by JMJimmy ( 2036122 ) on Saturday June 01, 2013 @10:49AM (#43882827)

      Also, even if they can't patch it quickly the point is to inform users so they can take appropriate precautions.

    • by Anonymous Coward

      "Security critical software" is anything on the Internet.

      Let's say I write a clever new web game, called "Cyber Tic Tac Toe." It's based on the classic idea of tic tac toe, but with the novel twist that it's on the Internet! As fun as my game is, and despite the fact that you are totally addicted to my awesome game and play it about 20 hours every week, questing to get your 3 Xs in a row, some people would say this is completely mundane and unimportant. (Killjoy bastards!) They would say my game, no matte

  • -1 Flamebait (Score:5, Insightful)

    by a_n_d_e_r_s ( 136412 ) on Saturday June 01, 2013 @10:31AM (#43882745) Homepage Journal

For the article. Sadly, you can't moderate an article.

    • by Anonymous Coward

      What article? All I see is some idiot's comment that somehow got posted as if it were an article. Timothy must be drunk again.

    • Exactly... (Score:5, Informative)

      by theshowmecanuck ( 703852 ) on Saturday June 01, 2013 @10:59AM (#43882895) Journal
Testing can find the presence of bugs, not the absence of them. An ages-old adage that has withstood the test of time because it is true. Only those new to the game, or the naive, would think otherwise.
    • Re: -1 Flamebait (Score:2, Insightful)

      by Anonymous Coward

Agreed. The submitter spouted the industry line, but if the industry wanted different standards they should try setting *some* standards instead of pretending that it's OK to ignore vulnerabilities.

  • Good catch (Score:2, Interesting)

    by Anonymous Coward

    Google has a decent point, but they're also trying to nudge the industry in a direction where more businesses will leave the driving to them [wikipedia.org], or to cloud competitors like Amazon.

  • Just Google? (Score:5, Insightful)

    by chrylis ( 262281 ) on Saturday June 01, 2013 @10:38AM (#43882767)

    Why single out Google? Shouldn't traditional software vendors have also run programs through QA?

    • Having worked in software development for 30 years, let me tell you a dark little secret.

      QA programs are never enough.

      It matters not how good your QA section is, things will get through. Why? Because the QA people have looked at the code, because QA managers are fond of believing that filling out paperwork is the same thing as careful retrospection, because the QA people work for a company whose vested interest is in getting that chunk of software out the door.

      In order to do a thorough job, one needs Diff

      • by chrylis ( 262281 )

        Oh, I'm completely with you: Developers should run QA, but there will always be bugs that slip through. I was responding to the AP's insinuation that Google should be catching all their security problems in internal QA but that Microsoft should get a pass for some reason.

      • Having worked in software development for 30 years, let me tell you a dark little secret.

        QA programs are never enough.

        Dropping previous moderation to comment here:

Having worked in SQA for over 17 years I can tell you PM NEVER listens to SQA unless the app would be DOA on more than 20% of the target customer's machines. On occasion an XXO would override PM and tell them no... you cannot RTM until you fix *THIS* issue... but security, data loss, and customer satisfaction were never significant guidance to PM or XXO decisions.

        And these issues are not deep... this is low hanging fruit, more often than not. PMs and DEVs hate SQA

    • Re:Just Google? (Score:5, Insightful)

      by Nerdfest ( 867930 ) on Saturday June 01, 2013 @11:58AM (#43883297)

Most other companies don't have well-funded FUD campaigns directed against them.

    • It's Google who are pushing a 7-day deadline for vulnerability disclosure. These companies are in effect saying "That's not fair, we can't fix our software that quickly!". It would be a real PR foot bullet if people understood what they're saying.

    • by antdude ( 79039 )

Or they have QA, but lack the time and resources.

  • by Anonymous Coward
Just because you commit to a certain turnaround doesn't mean you won't beat it. Anyone in business knows not to offer warranties or guarantees that you can't meet every time. People who over-promise and under-deliver typically aren't around for long.
  • by hsmith ( 818216 ) on Saturday June 01, 2013 @10:52AM (#43882857)
    And they simply don't do anything? I've contacted companies about security flaws I've found in their products and was met with deafening silence.

    Until I publicly announce them on platforms like Twitter, then you have their full attention.
    • Re: (Score:2, Insightful)

      by AmiMoJo ( 196126 ) *

If you say "I'm telling you privately but going public in 7 days" it sounds like a threat, and the most likely response is a lawsuit or the police hammering on your door. Even if you don't say anything about going public there is a fair chance of that happening. Your actions are pretty risky.

      Unfortunately unless the company has a history of dealing with security bug reports in a timely and proper manner the most responsible thing to do is just post about it anonymously to one of the full disclosure mailing l

  • by mattiaza ( 2567891 ) on Saturday June 01, 2013 @11:00AM (#43882909)
    Google recommends 7 days for "critical vulnerabilities under active exploitation", and 60 days for vulnerabilities that are assumed to not yet be known to attackers.

Frankly, even 7 days is too long for active attacks. Publishing the vulnerability lets users apply a workaround, or shut down the service or app entirely, until a fix is released.
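The two deadlines described above amount to a simple rule: disclose after 7 days if the bug is under active exploitation, after 60 days otherwise. A minimal sketch of that policy (the function name and dates are purely illustrative, not Google's actual tooling):

```python
from datetime import date, timedelta

def disclosure_deadline(discovered: date, actively_exploited: bool) -> date:
    """Public-disclosure date under the policy described above:
    7 days for critical bugs under active exploitation, 60 days otherwise."""
    grace = timedelta(days=7 if actively_exploited else 60)
    return discovered + grace

# e.g. a bug found June 1, 2013:
print(disclosure_deadline(date(2013, 6, 1), actively_exploited=True))   # 2013-06-08
print(disclosure_deadline(date(2013, 6, 1), actively_exploited=False))  # 2013-07-31
```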
  • by Anonymous Coward on Saturday June 01, 2013 @11:01AM (#43882917)

    A vulnerability that is already being exploited needs to be fixed right away. It's called 0-day for a reason, not 7-day. It should be disclosed immediately to force the vendor to do something about it.

    • by NotBorg ( 829820 )
The whole reason for restricting disclosure is to prevent exactly the situation that has already happened for anything in the "already being exploited" category. Waiting 7 days is pointless.
  • What?! (Score:5, Insightful)

    by CanEHdian ( 1098955 ) on Saturday June 01, 2013 @11:07AM (#43882957)

    a 7-day turnaround of fixes for actively exploited unpatched vulnerabilities, is rather naive and devoid of commercial reality. As a web services company it is much easier for Google to develop and roll out fixes promptly — but for 95+% of the rest of the world's software development companies making thick-client, server and device-specific software this is unrealistic

Hello there, mr/ms/mrs anonymous COWARD, what are you saying there? It COSTS TOO MUCH to promptly (as in, within a week) fix ACTIVELY EXPLOITED vulnerabilities? When you get the actual problem handed to you on a silver platter? What company do you work for?

It also seems to rely on the (severely dubious) assumption that, for 'actively exploited unpatched vulnerabilities', Google's disclosure will have much of an effect on anything aside from how slow the vendor looks.

      This isn't a 'Hey, we found a bug that nobody else knows about yet, we are going to release it just for giggles!' situation. Clock. Is. Already. Ticking. Google's 'deadline' may be the one at which you start looking increasingly incompetent in public; but the deadline at which everyone running your

  • by Bob9113 ( 14996 ) on Saturday June 01, 2013 @11:09AM (#43882985) Homepage

    a 7-day turnaround of fixes for actively exploited unpatched vulnerabilities, is rather naive and devoid of commercial reality.

    I read that and I was thinking, "Well, yeah, sure - I shoot for one hour and can't recall the last time it took more than a day to get a critical bug patch out, but that's not really reasonable for everyone. The team I work on is pretty focused on keeping the tracks polished so we can get high priority things through. I think 7 days is OK. It could be better, but it's OK. And Google isn't even saying it will take 7 days, they're saying 7 days is the max. But, whatever, I guess -- ultimately agitating for faster patches is something I support."

    for 95+% of the rest of the world's software development companies making thick-client, server and device-specific software this is unrealistic.

    What?!? You mean it's not realistic to get the patch available within 7 days? I mean, obviously you can't expect users to have their systems patched immediately, and sometimes a third party (like a walled garden approval path) can lock you out. But is the writer saying 95% of companies can't even have a patch pushed for release in 7 days?

If that is true, we, as a society, need to drop what we're doing and focus on security, build management, QA workflow, whatever it is that is making that a reality. Seven days is acceptable. 95% of companies can't hit seven days? That is not true in my experience. But if it is, it is not acceptable. There really are bad people out there trying to root our electronics. Seven days to get a patch out for a vulnerability being actively exploited in the wild is enough. Work the problem. Figure out why you can't hit that number, and fix it.

    • It is true. 7 days for 95% companies is unrealistic. If you make big enterprise software to be sold to big vendors (SAP?), the clauses are simple: Any regression bug that is noticed and someone screams too loud => heads roll. So they have test-cycles that take weeks, for anything, however small the change is. The problem is not with the development companies, the problem is with the user-companies. They buy software with a 90's mindset. Which software today doesn't have bugs, security bugs, and regressi
  • So the OP is bitching Google is like the Empire offering rewards that people like Boba Fett go after? I don't even

  • by linuxwrangler ( 582055 ) on Saturday June 01, 2013 @11:40AM (#43883173)

    It's no wonder this article was posted anonymously. The whole tone and writing style is exactly what one would expect in a position statement cranked out by a corporate PR flack. I wonder whose flack it is.

  • by Sloppy ( 14984 ) on Saturday June 01, 2013 @11:46AM (#43883221) Homepage Journal

Google's stance on a 60-day turnaround of vulnerability fixes from discovery, and a 7-day turnaround of fixes for actively exploited unpatched vulnerabilities, is rather naive and devoid of commercial reality.

    I think what you're saying, is that if someone is going around stabbing people in the heart, and if a doctor says these victims all need immediate medical attention (even the victims which are in isolated areas far from hospitals), then that doctor is being naive and devoid of medical reality.

I personally think you should quit blaming the doctor for the unfairness and horror inherent in the situation. Declaring that a problem urgently needs to be addressed isn't "naive", even if actually addressing it is incredibly hard or effectively impossible.

    If the doctor truly thinks the victims all really will get "immediate medical attention" then he'd be naive. But advising it isn't naive. Yelling at people "get that victim to the ER as fast as you can!" isn't naive. Telling people that heart stab wounds are very serious, isn't naive.

    And the analogy with Google here, is that you just got stabbed in the heart, they're advising and hoping you get immediate medical attention, and 7 days from now, if your wife asks Google if they've seen you lately, they're going to tell your wife, "I heard he got stabbed in the heart last week. You took him to the hospital, right? If not, you better get on that, right now." You're concerned Google is going to scare your wife?! Be concerned that you're not at the hospital yet!

    You think Google is being naive with unreasonably high expectations, but the need for those high expectations isn't their fault!

  • by russotto ( 537200 ) on Saturday June 01, 2013 @12:12PM (#43883409) Journal

    Google's standards are too hard to be realistic for the rest of the industry, but the standards for Google itself are 0 security defects ever. How does that work again?

    (disclosure: I work for Google. I don't speak for them.)

    • by NotBorg ( 829820 )
The whole reason for not disclosing is to prevent the bad guys from knowing about it. The words "actively exploited" mean the bad guys already know about it. Ergo there's no reason not to disclose it. Seven days isn't too short; it's pointless to wait at all under those circumstances.
    • The standards for *THE* leading IT company are always higher. Et tu, Brute? => His standard was set higher by Caesar :)
  • by Tom ( 822 )

    What a dumb opinion piece.

    The main difference between a client-installed application and a web-app these days is that a patch on a web application is available as soon as you deploy it, while the patch for the client application needs to be downloaded and installed, which is mostly done automatically.

    So, in terms of time, the difference is on the order of minutes, hours at most.

    Is it more difficult to create and/or test updates for clients or for browsers? Hard to say, but the difference isn't fundamental.

Industry is irrelevant. If your customers are being actively exploited, you are (or ought to be) liable if you don't fix it as fast as possible.

The debate has always been about whether to give any advance notification at all. That's why full-disclosure@lists.grok.org.uk is called "full disclosure". That's the spectrum of behaviour that's allowed, anything down to 0 days, i.e. "notify vendor and public simultaneously":

    https://en.wikipedia.org/wiki/Full_disclosure#Arguments_against_Coordinated_Disclosure

    Even if not for that, the complaining seems to rest on a vague suggestion that Google's acting somewhat in their own interest instead of always doing t

  • by Todd Knarr ( 15451 ) on Saturday June 01, 2013 @12:49PM (#43883681) Homepage

As a user, I don't care about the vendor's ability to fix it quickly. Really I don't. That's their problem. My problem is that my systems are vulnerable to compromise and I have to do something about it. I need to know what the vulnerability is, in enough detail to understand it myself, and I need to know the possible workarounds (not just the vendor's recommended one(s), which is another reason I need to know what the vulnerability actually is, so I can understand all the other possible ways of dealing with it). I need to evaluate my options and take whatever steps are necessary to protect my systems. If the vendor needs a month to get the fix through their change-control process, I still need to protect my systems today.

The vendor's advice will be based on their most-likely scenario. Problem is that my situation may be radically different from the vendor's most-likely one. There are definitely going to be local considerations even if my situation's one the vendor's workaround covers. I need to understand the vulnerability to be able to evaluate it intelligently. It may not even be relevant to my setup. If it is, I may have less-intrusive workarounds (e.g. for the SSH OTP authentication bug, if we've got a purely internal network that isn't accessible to the outside world or the Windows desktop portion of the network, it may be less intrusive to just monitor for attempted exploits and defer doing anything until I see someone having gotten past the air gap, rather than changing an authentication method that a lot of people depend on and that can't be exploited easily without being physically in the building). And if I need to take drastic steps like disabling the vulnerable SSH authentication method, I may have clients who insist they must be able to use it (maybe because their systems are based on it and they need my systems to integrate with their authentication because I'm providing services to them), and I need to be able to intelligently discuss exactly what's wrong, why it's simply not possible to use that method without being vulnerable, and why we've got to change to a different method despite the disruption. I can't do that unless I understand the vulnerability.

    Notice that in all the above I haven't mentioned the vendor at all. Like I said, the vendor isn't relevant at all. It's my systems that're vulnerable and me that has to do something about it. If the vendor already has a fix then well and good, but if they don't it doesn't change my situation. When vendors say they need more time, they're asking me to leave my systems vulnerable without telling me they're vulnerable. Sorry, but no. Not, that is, unless they're willing to shoulder 100% of all the costs resulting from that vulnerability being exploited. Not just direct costs, things like the costs of lost business and clean-up if the vulnerability is exploited and liabilities I may incur because of the compromise. If a vendor isn't willing to take on that liability, then they don't get to tell me I shouldn't have the information I need to protect myself from that liability. If they don't like it... this is the sound of the world's smallest violin, playing the world's saddest song just for them.

    • When vendors say they need more time, they're asking me to leave my systems vulnerable without telling me they're vulnerable. Sorry, but no. Not, that is, unless they're willing to shoulder 100% of all the costs resulting from that vulnerability being exploited.

      The bit that you're ignoring is that by telling you about the vulnerability they're also telling all the black hats about it. So while your systems are vulnerable either way, the choice is between you and all the hackers knowing or you and most of the hackers not knowing. Whether this increases or decreases your actual exposure depends on who is interested in attacking you and whether or not they already have this exploit.

      While you may be capable of implementing countermeasures to limit your vulnerability

      • The bit that you're ignoring is that by telling you about the vulnerability they're also telling all the black hats about it.

        Except that the black hats already know about it, and are actively using it to compromise systems. Telling me about it doesn't give the black hats any information they didn't already have. The only way keeping the information confidential helps me is if it's something the black hats can't possibly figure out without being told, and the track record says there's no such thing. I have y

        • Except that the black hats already know about it

          Some do. Which is a distinction I clearly drew in the post you responded to, apparently without reading all of it.

          • Some do. Which is a distinction I clearly drew in the post you responded to, apparently without reading all of it.

No, because it's a distinction you can't draw. There's a line from a Dr. Who episode: "Not every shadow, just any shadow." The same applies to attackers. Not every attacker will know the details, but any attacker may. And since by the time you know whether any particular attacker knows the details or not it's too late to defend, you have to assume that any attacker knows and defend yourself bef

  • by Anonymous Coward

    Pitiful cries of "*SNIFFFF* Butttt, butttt, we've been using paying customers as guinea pigs for decades, we'd actually have to pay someone to test our shit code now!!"
    "That's just not fair, we charge more because we have to pay stockholders and multi-million dollar bonuses to the dickheads who make our fucktarded decisions!"
    "Without paying testers, how else can we rake in more cash, more quickly, especially when our EULAs make it so they cannot sue us when our security software kills their PC!"

  • by Anonymous Coward

    Let's cut through several layers of meta-issue here and get to the heart of the matter, and then work our way back outwards to the issues this article is so wrong about:

    1) Software Development is arguably one of the most difficult, complex, and daunting tasks humankind can endeavor at. That is, assuming you care about factors like quality, efficiency, security, and correctness. What makes it so difficult is that you're molding things out of the most flexible clay ever dreamed of. Physics is pretty much t

  • I guess Chrome, ChromeOS, Android, numerous Android apps, Google Earth, Google Drive, Picasa and Google's many other traditional installable software products don't count.
