
Google Ups Ante For Disclosure Rules, Increases Bug Bounty

An anonymous reader writes "In a recent post by seven members of their security team, Google lashed out against the current standards of responsible disclosure, and implicitly backed the recent actions of Tavis Ormandy (who is listed as one of the authors). The company said it believed 60 days should be an 'upper bound' for fixing critical vulnerabilities, and asked to be held to the same standard by external researchers. In another, nearly simultaneous post to the Chromium blog, Google also announced they are raising the security reward for Chrome vulnerabilities to $3133.7, apparently in response to Mozilla's recent action."
This discussion has been archived. No new comments can be posted.

  • Elite (Score:5, Funny)

    by ceraphis ( 1611217 ) on Tuesday July 20, 2010 @10:13PM (#32973832)

    Google also announced they are raising the security reward for Chrome vulnerabilities to $3133.7

    That's quite the elite sum of money to use as a reward.

    • Well, only the elite will get the reward anyway.
    • Re: (Score:1, Offtopic)

      by cosm ( 1072588 )

      Google also announced they are raising the security reward for Chrome vulnerabilities to $3133.7

      That's quite the elite sum of money to use as a reward.

      Pre-WHOOSH, because I know they are coming.

    • Re: (Score:3, Insightful)

      by sarathmenon ( 751376 )

      And also, it's contradictory to what Google did earlier this year. They released a zero day for Windows [threatpost.com] and gave Microsoft hardly a week to patch it. And as a bonus, they made the disclosure public on a Sunday.

      I am all for more industry-standard accountability, but this looks very one-sided, with Google picking the instances where it gets good publicity.

      • by gmhowell ( 26755 )

        I work on Sundays, why can't Microsoft? Do I get to use their software for free on the weekend? No? Does malware take off on Sunday? No?

        Screw 'em.

        • People in China work 80 hour weeks for pathetic wages. Why can't you? There are tasks that need to be done 24/7, not just 40 hours a week!
        • Sleep? Weekends? (Score:2, Interesting)

          by Anonymous Coward

          Microsoft OS and app vulnerabilities are the only internet currency better than eGold. If you travelled in those circles you'd see how bad the situation is. I've been there and back, so I'll tell ya: it's bad. Bad. Really, really, really bad.

          If you'll pay $500, there's folks out there who will deliver the contents of your own email inbox unedited, for as far back as it goes, externally and without assistance. The most honest of them will sell you that info and let it go, but we all know there's a lot

      • Re:Elite (Score:5, Informative)

        by Undead Waffle ( 1447615 ) on Wednesday July 21, 2010 @12:34AM (#32974406)

        Looks like someone needs to RTFA.

        This article is basically laying out a policy Google will follow in the future. Here is the most critical bit:

        A lot of talented security researchers work at Google. These researchers discover many vulnerabilities in products from vendors across the board, and they share a detailed analysis of their findings with vendors to help them get started on patch development. We will be supportive of the following practices by our researchers:

        • Placing a disclosure deadline on any serious vulnerability they report, consistent with complexity of the fix. (For example, a design error needs more time to address than a simple memory corruption bug).
        • Responding to a missed disclosure deadline or refusal to address the problem by publishing an analysis of the vulnerability, along with any suggested workarounds.
        • Setting an aggressive disclosure deadline where there exists evidence that blackhats already have knowledge of a given bug.

        Now, that "zero day" (well, 5 days really) the Googler gave Microsoft came only because Microsoft would not commit to fixing it. That is perfectly consistent with the article, which points out that "responsible disclosure" is a two-way street and only works when the vendor with the vulnerability acts responsibly as well (which Microsoft didn't in this case). You could argue that he should have set a deadline regardless of whether Microsoft agreed to it, but I would not say they are contradicting themselves. They also point out in the article that responsible disclosure isn't always the best route. So I'm going to have to support Google in this article, which is simply about laying out the "supported" disclosure policy for their security researchers going forward.

        • Re: (Score:2, Insightful)

          by bloodhawk ( 813939 )

          Now that "zero day" (well 5 days really) the Googler gave Microsoft was only because Microsoft would not commit to fixing it. That is perfectly consistent with the article, which points out "responsible disclosure" is a 2 way street and only works when the person with the vulnerability acts responsibly as well (which Microsoft didn't in this case).

          That is twisting the truth more than a little. MS said they would get back to him with a timeline by the end of the week; he then went and published it anyway. The irresponsible party in that instance was definitely Tavis Ormandy.

          • Re:Elite (Score:4, Insightful)

            by taviso ( 566920 ) * on Wednesday July 21, 2010 @12:47PM (#32980344) Homepage

            Actually, his comment was entirely accurate.

            I've reported dozens of critical vulnerabilities in Microsoft software over the years, and I still have multiple open cases with Microsoft security, this particular case wasn't as simple as you have assumed. I would not be so presumptuous to explain the ethics of your work to you, but evidently you believe you're qualified to lecture me in mine.

            If I were to read the sensationalised lay-press coverage of your latest publication or project, would it prepare me to write a critique of your work?

            • Your ethics are your own business, unless of course your actions potentially harm other people. Nobody elected you as their Internet Security Guard, so drop the elitist attitude.

              • by taviso ( 566920 ) *

                You didn't elect me your doctor either, but I'm sure you would like me to tell you if your water supply was poisoned.

      • Re: (Score:1, Informative)

        by Anonymous Coward
        Google didn't release a Windows vulnerability; someone who happens to work at Google did. He took pains to point out that he was working independently, but MS and some in the media chose to imply that he was acting on Google's behalf. Don't take my word for it - read Tavis' original post [seclists.org] and spender's interesting, if bitter, analysis [seclists.org].
        • Re: (Score:1, Troll)

          It was a potential conflict of interest [wikipedia.org], given that he is a paid employee of Google who works as a security engineer. If this had been inconsistent with Google's policies, there would definitely have been a problem; the problem would have been Tavis' fault (not Google's), but it would have been up to Google to repudiate his actions if it believed Ormandy was not in compliance.

          This article instead suggests that Google's policy is consistent with Tavis' actions, so it really doesn't matter.

          Which is fair, but I don't see this new pol

          • Re:Elite (Score:5, Interesting)

            by Bigjeff5 ( 1143585 ) on Wednesday July 21, 2010 @02:20AM (#32974744)

            He actually gave his reasons for disclosure in the disclosure itself.

            Hcp vulnerabilities are a well known attack vector for Windows, and given that the specific vulnerability he found has existed in Windows XP for 9 years, he felt it was very likely that black hats had found the same technique and as such there was a very high likelihood that it was being actively exploited in the wild. I'm sure the ease with which it can be executed factored in as well - it's literally just a one-line hcp url with execution code in it. Therefore, he felt full disclosure so security professionals could begin mitigating the issue (i.e. disable help center) was more important than giving Microsoft ample time to fix the problem.

            Personally, I agree. Microsoft has a history of sitting on high-severity vulnerabilities for years if they aren't disclosed publicly, and this was an extremely easy to execute exploit. The prudent course here was to get the information out ASAP, with little more than a courtesy call to Microsoft before he did.

    • by mysidia ( 191772 )

      Naw, an elite sum would be $31,337.

      Which would seem more appropriate, if the security issue has to be exploitable to get it at least.

      $3133 is chump change compared to what, shall we say, the sale of a security flaw to others with, ahem, questionable intent would probably garner (at Google's expense).

    • That's the joke.

  • NERDS (Score:3, Funny)

    by Anonymous Coward on Tuesday July 20, 2010 @10:16PM (#32973854)

    NERDS!

    • by mcgrew ( 92797 ) *

      Yes, we are. Sorry you're too illiterate to read the site's masthead. The NASCAR site is somewhere else, Bubba.

  • by bi$hop ( 878253 ) on Tuesday July 20, 2010 @10:18PM (#32973858)
    Dear Google,

    I just found a bug in Gmail. We should talk.

    Sincerely,
    Chinese Hacker
    • by cosm ( 1072588 ) <thecosm3@gmai l . c om> on Tuesday July 20, 2010 @10:45PM (#32973998)
      Dear Chinese Hacker,

      I just found a bug in your government. We should square up.

      Sincerely,
      Google
      • Dear Google, We would like for you to meet us in the 'Square'. All manner of issues have been squashed in the Square, and I am sure we can come to some kind of arrangement over this issue. Sincerely, Chinese government.
      • Dear Google, here's a critical vulnerability that needs fixing: selling one's principles for 30 pieces of egg roll.
      • by mcgrew ( 92797 ) *

        Dear Google,

        That's not a bug, that's a design flaw^^^^^^^^^^^feature.

        Sincerely,
        Microsoft

  • by JoshuaZ ( 1134087 ) on Tuesday July 20, 2010 @10:19PM (#32973864) Homepage
    This is a sign of a truly competitive market. When Chrome and Mozilla are competing to the point where they need to bid on how much they pay for people to find flaws in their own software then there's serious competition. And the result is that we, the consumers, benefit the most. This is market dynamics with honest companies at their best.
    • by AHuxley ( 892839 )
      Yes, great for a laptop if needed, or charity spending if you're a trustafarian.
      The world gets better apps, Google gets to spread more ads, the SC community learns from bug fixes.
      I find Google's views on privacy and their past mistakes point to some deep issues.
      Better than the DRM lock-in/out and turn-off stagnation with MS and Apple, for now.
    • This is a sign of a truly competitive market. When Chrome and Mozilla are competing to the point where they need to bid on how much they pay for people to find flaws in their own software then there's serious competition. And the result is that we, the consumers, benefit the most. This is market dynamics with honest companies at their best.

      There's just so much wrong with your characterization of these bounties as "truly competitive" that I don't know where to begin.

      In reality, the only competition these bounties represent is a competition for free publicity and goodwill.

      • Missing the point. The bounties aren't what's competitive. What's competitive is the browser market. That they need to keep upping the amount of money they are offering to find problems in their browsers is a function of the competitive browser market.
    • Re: (Score:1, Troll)

      by MLS100 ( 1073958 )

      Or for the glass half empty types: Google and Mozilla aren't willing to pay more than $3134 to eliminate a remotely exploitable vulnerability that could be potentially disastrous for their users!

      • by cynyr ( 703126 )

        Or for the glass half empty types: Google and Mozilla aren't willing to pay more than $3134 to eliminate a remotely exploitable vulnerability that could be potentially disastrous for their users!

        should read: "... aren't willing or able to pay more than $3134 ... "

    • Mozilla is a foundation, not a company. That is, if we exclude the corporation subsidiary that deals with sponsorship etc.

      So, this is not some capitalist ideal of two companies competing for our benefit, rather a very non-capitalist foundation that is encouraging what capitalism discourages. If Mozilla didn't exist, Google might not have bothered to up their reward.

      RS

    • This is a sign of a truly competitive market. When Chrome and Mozilla are competing to the point where they need to bid on how much they pay for people to find flaws in their own software then there's serious competition. And the result is that we, the consumers, benefit the most. This is market dynamics with honest companies at their best.

      So if I were a conscienceless skilled hacker and discovered a vulnerability, then in a proper free market I'd be free to sell it to the highest-bidding criminal?

  • by Dwonis ( 52652 ) * on Tuesday July 20, 2010 @10:21PM (#32973868)

    I'm sure a lot of people here will lament that 60 days is way too long to release a fix for most vulnerabilities, and I think that's true. On the other hand, it's probably a "reasonable upper bound" for very complex problems like the TLS session re-negotiation vulnerability, which required coordination between multiple vendors and the IETF in order to fix.

    In other words, if you think you should get a 60-day head start to fix a security bug, your bug had better be at least as complex as CVE-2009-3555.

    • Re: (Score:1, Insightful)

      by Anonymous Coward

      I'm sure a lot of people here will lament that 60 days is way too long to release a fix for most vulnerabilities, and I think that's true. On the other hand, it's probably a "reasonable upper bound" for very complex problems like the TLS session re-negotiation vulnerability, which required coordination between multiple vendors and the IETF in order to fix. In other words, if you think you should get a 60-day head start to fix a security bug, your bug had better be at least as complex as CVE-2009-3555.

      OTOH, it's a lot easier to say that if your product that needs fixing is a few megabytes of browser (and your customers do most of their complex processing on the server) than if your product that needs fixing is gigabytes of operating system, with thousands of products that are much more complex than a browser running on top of it and that may be affected by the fix.

      • by fuzzyfuzzyfungus ( 1223518 ) on Tuesday July 20, 2010 @10:52PM (#32974042) Journal
        I think that your comment can be read on two levels:

        One. You are correct. Google is almost certainly taking advantage of the fact that browsers are substantially less complex (and people are comparatively tolerant of little rendering glitches, unless they scotch the whole page, or "people" happen to be graphic designers...). It is a cynical, but very logical, tactic to talk most about the virtues you can cultivate most easily (though, conceivably, 60 days might actually be a much tighter limit for some of their server stuff; I don't know how hairy that can get).

        Two. If your product is too large, and too tightly coupled, to turn around a fix in two months, you had better have a very compelling reason. Arguably, Microsoft's relatively tight coupling of an enormous number of pieces has been very good business, but not very good design. In the short term, Google's implicit dig is rather cynical. In the longer term, though, they are really scoring a point in a battle of architectural philosophies. Microsoft probably actually handles size, complexity, and tight inter-relation better than most (they'd be dead if they didn't), but the problems that it causes them are basically their fault. They made that mess; they deliberately coupled stuff for economic reasons that could have been decoupled for engineering ones.
        • I disagree.

          It's not like one guy wrote Windows - there are teams and teams of engineers. Microsoft employs thousands of them - there were about 1000 who wrote Windows 7, for example. Each part of the operating system is divided up into manageable components. Each of those components is divided into manageable parts, and each of those parts is divided into manageable functions, on down until you have a handful of engineers responsible for one small section of the OS.

          When a vulnerability is disclosed, the

          • As every IT manager knows, the amount of time it takes to produce some code is directly proportional to the number of people working on it!

            They should employ 100,000 coders, that way exploits will get fixed minutes after they're found!
            • As every IT manager knows, the amount of time it takes to produce some code is directly proportional to the number of people working on it!

              They should employ 100,000 coders, that way exploits will get fixed minutes after they're found!

              In that case, you can do even better: employ only 1/100,000 of a coder, and exploits will be fixed in nanoseconds! (Or wait, did you mean inversely proportional?)

          • Not to forget, Windows is a paid-for OS, so I would expect at least some reasonable support no matter the sophistication. Google's product is free to the public; how many free products do you know that provide amazing support, if any? Their only real responsibility is image and PR. That is why open source is so great: anyone interested can do the patching themselves instead of relying on the owner of the software to care.
      • by Dwonis ( 52652 ) * on Tuesday July 20, 2010 @10:52PM (#32974044)

        If your bug is so big that you can't fix it in 60 days, then you need to drop the secrecy anyway so that the rest of the world can help you fix it (or work around the fact that you can't).

        Remember that these bugs are things that shouldn't exist in the first place.

        • Re: (Score:2, Insightful)

          by Anonymous Coward
          It is not taking them 60 days to make a patch because of product complexity. The patch itself probably takes only a few hours; however, because of the huge ecosystem around Windows, they have to do a massive amount of regression testing to ensure they are not breaking anyone's products. Imagine how much Adobe would scream if a security patch broke their products, or Apple for iTunes, and you can bet the stories wouldn't be "Apple iTunes breaks because of poor Apple development practises", it w
          • by cynyr ( 703126 )

            But if Apple releases an update to OSX that breaks Photoshop, and they have fixed the bug in Xcode (the approved way of building software on OSX), then shouldn't Adobe just update their Xcode/OSX, press the "re-package, patch" button, and ship the update? Why should MS be concerned about 3rd-party apps? Does MS software still work, or receive patches? Good, problem solved.

            • That's easy to answer. Gates doesn't have a RDF so people expect Windows not to break 3rd party apps. Apple fans will put up with anything.

    • by mysidia ( 191772 )

      We should probably distinguish between simple coding mistakes and fundamental security flaws in standards-defined protocols.

      Just because some flaws are complex enough that it may justifiably and reasonably take more than 2 weeks to deliver a patch does not mean there is a free pass to be laid back and wait that long to patch all flaws.

      It's not justifiable to wait 30 days to fix a one-line coding mistake that allowed a buffer overflow or underflow condition.

    • It's a lot better than the 720 day upper bound* that Microsoft uses, though.

      * I don't actually think they have an "upper bound" that starts from when they discover the vulnerability. The clock for the upper bound doesn't seem to start until the vulnerability is publicly disclosed. I'm sure the only reason that 720-day-old high-severity vulnerability was fixed was because some Microsoft employee was bored that day.

    • I think that 60 days is a good time frame, because on top of fixing the vulnerability, you also have to convince users to install the patch. Deploying a patch to a large userbase is going to take time, and probably longer than it takes to fix the problem in the first place. That being said, maybe a more responsible approach would be to tell the vendor (and the world) that you're going to publish the vulnerability in X days or 30 days after they release a patch, whichever is longer. Now the vendor has serio
  • by martin-boundary ( 547041 ) on Tuesday July 20, 2010 @10:36PM (#32973942)
    Although it's great to have a company pledge responsible behaviour, the logical next step for the industry would be to put security vulnerability reports in escrow, with an automated time release. This could be as simple as having a CERT server distribute unique encryption keys, with each key being publically disclosed after a countdown from the time it is generated. A security researcher would encrypt each of their reports with such a key (a different one each time) and publish them on the web. Besides reducing the political squabbling between companies, this kind of system would also be great for priority disputes between researchers.
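A minimal sketch of the time-release escrow idea described above, in Python. Everything here is invented for illustration (`EscrowServer`, `xor_cipher`, the countdown numbers); the XOR keystream is a toy stand-in for real symmetric encryption, not production cryptography.

```python
import hashlib
import secrets
import time

class EscrowServer:
    """Toy model of a CERT-style key server: hands out fresh keys and
    only reveals them once their countdown has expired. (Hypothetical
    names; a real system would use vetted crypto such as AES-GCM.)"""

    def __init__(self):
        self._keys = {}  # key_id -> (key, release_timestamp)

    def new_key(self, countdown_seconds):
        key_id = secrets.token_hex(8)
        key = secrets.token_bytes(32)
        self._keys[key_id] = (key, time.time() + countdown_seconds)
        return key_id, key

    def get_key(self, key_id, now=None):
        key, release = self._keys[key_id]
        if (now if now is not None else time.time()) < release:
            raise PermissionError("key not yet released")
        return key

def xor_cipher(key, data):
    """Toy symmetric cipher: XOR with a SHA-256(key || counter) keystream.
    NOT real cryptography -- just enough to show the escrow flow."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

# Researcher: fetch a key with a 60-day countdown, publish ciphertext now.
server = EscrowServer()
key_id, key = server.new_key(countdown_seconds=60 * 86400)
ciphertext = xor_cipher(key, b"full vulnerability analysis...")

# Anyone can decrypt once the server releases the key after the deadline.
released = server.get_key(key_id, now=time.time() + 61 * 86400)
plaintext = xor_cipher(released, ciphertext)
```

Because the ciphertext is published immediately, its publication timestamp also settles priority disputes between researchers, as the comment suggests.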
    • This is an excellent idea.

    • by adtifyj ( 868717 )

      .. the logical next step for the industry would be to put security vulnerability reports in escrow, with an automated time release. ..

      This is an amazingly simple solution. I'm surprised nobody has implemented this already.

    • Unfortunately, the major vendors will be violently against such a thing, so it's something the researchers themselves will have to implement. In doing so they'll probably lose friendly relations with major vendors - and by "lose friendly relations" I mean be slandered constantly.

      I think it would be worth it though.

      • by kscguru ( 551278 )

        It's a really good idea.

        Part of the idea would have to be having a REPUTABLE escrow service disinterested in publicity - a service that can work with both the vendor and the security researcher and balance the competing interests.

        Every security researcher wants to maximize the severity rating of the bug, an instantaneous commitment to a fix timeline, and an absurdly tight deadline (expects vendor to drop everything to analyze the bug, fix it perfectly on the first try, and release immediately). Responsible

    • I think it's a terrible idea. It's like pre-planning your future investments and locking them in. What if you did that just before our latest recession? Putting your fate in a dead hand is never a good idea.

  • 60 days is not 5 (Score:3, Insightful)

    by TouchAndGo ( 1799300 ) on Tuesday July 20, 2010 @10:38PM (#32973962)

    So Google is defending the actions of an engineer who posted attack code for a Windows vulnerability 5 days after he reported it to Microsoft by saying that 60 days is more than enough time to fix a critical vulnerability... how exactly does that reasoning work?

    • by cosm ( 1072588 )
      Progress comes in little steps, and I would say this is better than nothing. Baby steps towards holding these megacorps' feet to the vulnerability fire incentivize disclosure of bugs, instead of cover-ups and cease-and-desist letters against people who just want their shit fixed for the betterment of the world (or the glory of being 1337).
    • by bunratty ( 545641 ) on Tuesday July 20, 2010 @10:55PM (#32974050)
      Google is saying that some companies *cough* Microsoft *cough* sit on security bugs for years until they're finally exploited, putting their users at risk. It's only by publicly disclosing the bug that these companies fix the problem.
      • by Tom ( 822 )

        Yes, and we full-disclosure supporters in the security community have literally been saying that for years. Good to see some of the bigger players finally waking up.

      • So what is the business case for a Google researcher to be looking for vulnerabilities in Windows code other than for competitive reasons? Does Google guarantee that there are zero vulnerabilities in all of their own code? If not, why isn't he still looking at Google's code?

    • Re:60 days is not 5 (Score:5, Informative)

      by Anonymous Coward on Tuesday July 20, 2010 @11:00PM (#32974066)

      Read the actual reporting on what happened. Tavis gave MS 60-days, but they refused to commit to any timeline. So, he went ahead and disclosed immediately, along with a fix for affected systems.

      It's also important to understand that Tavis has been reporting critical vulnerabilities to MS for years--and in some cases waited over a year for them to push a fix. This time he saw something trivial that should be fixed immediately and he put their feet to the fire. Oddly enough, they did push out their own fix in under 60 days after the vulnerability was made public. So you don't have to agree with his methods, but you should at least frame the situation correctly.

      • by benjymouse ( 756774 ) on Wednesday July 21, 2010 @12:10AM (#32974288)
        1. Tavis Ormandy reported the bug to Microsoft on a Saturday and wanted Microsoft to commit to a 60-day timeframe.
        2. On Tuesday (a Patch Tuesday, mind you) Microsoft told Mr. Ormandy that they would be able to present a plan the upcoming Friday - i.e. 3 days later, and 6 days after the bug had been reported.
        3. Wednesday, Mr. Ormandy went public.

        Microsoft *never* refused to commit to a timeline. They didn't commit to a timeline within 3 days, so 4 days after reporting the bug Mr. Ormandy went public. If he truly believed that 60 days would be reasonable, he could just have informed MS that he would go public exactly 60 days later. But no, Ormandy just needed an excuse to go public and show the world how much smarter than Microsoft he is.

        60 days may seem long, but it is actually very close to the current average for the largest software providers - not just Microsoft. Mozilla patches much faster, but we have also seen several incidents where a Mozilla patch broke the browser and/or was ineffective. Consider the fallout if suddenly all French Windows XP/Vista machines were unable to boot. MS needs to regression test each and every combination. Remember what happened when malware caused Windows XP machines to not boot, because an old DLL had been patched and addresses assumed by the malware had shifted?

        • by adtifyj ( 868717 )

          1. Tavis Ormandy reported the bug to Microsoft on a Saturday and wanted Microsoft to commit to a 60 day timeframe.
          2. On Tuesday (a patch tuesday, mind you) Microsoft told mr. Ormandy that they would be able to present a plan the upcoming Friday - i.e. 3 days later and 6 days after the bug had been reported.
          3. Wednesday mr. Ormandy went public.

          Microsoft *never* refused to commit to a timeline. They didn't commit to a timeline within 3 days, so 4 days after reporting the bug mr.

          Ormandy went public. If he truly believed that 60days would be reasonable he could just have informed MS that he would go public exactly 60 days later...

          That timeline doesn't look good. He should have waited five days for a commitment, as recommended by the RFPolicy [wikipedia.org].

          • Well, since he posted 5 days later, it sounds like that's exactly what he did.

            • Well, since he posted 5 days later, it sounds like that's exactly what he did.

              Actually, it says "five working days." Meaning you give them five non-holiday weekdays, and then disclose it on the sixth weekday.

              The sixth weekday in this case would have been 9 days after the author originally reported the bug; because the bug was reported on a Saturday, two weekends fall within the waiting period before the sixth weekday, on which you'd disclose it.
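The day-counting described above can be sketched in a few lines of Python. This is an illustration of the comment's "five working days, disclose on the sixth weekday" reading of RFPolicy, with holidays ignored for simplicity; the date used is an arbitrary Saturday, not necessarily the actual report date.

```python
from datetime import date, timedelta

def disclosure_date(reported, working_days=5):
    """First weekday after `working_days` non-weekend days have elapsed
    since `reported`. Holidays are ignored for simplicity."""
    d = reported
    elapsed = 0
    while elapsed < working_days:
        d += timedelta(days=1)
        if d.weekday() < 5:          # Mon=0 .. Fri=4 count as working days
            elapsed += 1
    d += timedelta(days=1)           # disclose on the *next* weekday
    while d.weekday() >= 5:
        d += timedelta(days=1)
    return d

# A bug reported on a Saturday: the sixth weekday falls 9 days later,
# matching the comment above.
reported = date(2010, 6, 5)          # an arbitrary Saturday
deadline = disclosure_date(reported)
```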

              • Meaning you give them five non-holiday weekdays

                The black hats aren't limited by weekends and holidays. Once the black hats start exploiting a vulnerability on the Wednesday night before US Thanksgiving, the start of a four-day weekend, then what should a legitimate security professional do?

                • The black hats aren't limited by weekends and holidays. Once the black hats start exploiting a vulnerability on the Wednesday night before US Thanksgiving, the start of a four-day weekend, then what should a legitimate security professional do?

                  adtifyj was the one who suggested Tavis Ormandy should have waited "five days" according to RFPolicy. I was pointing out to Bigjeff5 that RFPolicy actually says "five working days." If you want to argue the merits of RFPolicy, I suggest replying to adtifyj's post,

        • Re: (Score:3, Insightful)

          by xous ( 1009057 )

          bah.

          It's not the security researcher's responsibility to cover Microsoft's ass. Anything he gives them is a gift, not a god damned right. If you want to blame someone for all the exploits, blame the dumb ass that decided to couple HTML Help shit with everything and allow it to execute binaries. Just fucking stupid.

          Sounds to me like Microsoft sat on its ass for three days and then told him /we will get back to you on Friday/, which would piss me the fuck off too. You can't fucking figure out if you can commit t

        • MS needs to regression test each and every combination.

          Frankly, this argument just proves how bad their testing method is.

          At my job (and at Google too), we use agile methodologies, and especially TDD (plus more complete regression tests).
          TDD means you write the tests before writing the code, which allows any kind of component to be tested quickly and automatically.
          Regression tests are automated too, in order to locate any kind of problem early, and we run them in virtual machines to avoid installing tons of computers.

          From my point of vi
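The test-first workflow the comment describes can be sketched with Python's standard `unittest` module. The `sanitize` function and its behaviour are hypothetical examples invented for this sketch, not from any real project.

```python
import re
import unittest

# TDD step 1: write the test first -- it fails until the code below exists.
# (`sanitize` is a hypothetical example function.)
class TestSanitize(unittest.TestCase):
    def test_strips_script_tags(self):
        self.assertEqual(sanitize("<script>x</script>hi"), "hi")

# TDD step 2: write the minimal code that makes the test pass.
def sanitize(html):
    return re.sub(r"<script>.*?</script>", "", html)

# TDD step 3: the whole suite runs automatically on every change (e.g. in
# a CI job or a virtual machine), so regressions surface immediately.
if __name__ == "__main__":
    unittest.main()
```

The suite accumulates as the product grows, which is what makes the automated regression testing the comment mentions cheap to repeat.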

          • At my job (and at Google's company too), we are using agile methodologies, and especially TDD (and also more complete regression tests).
            TDD implies that you write the tests before writing code, and this allows to quickly test any kind of components automatically.
            Regression tests are automated too, in order to early locate any kind of problem, and we are doing it with virtual machines, to avoid installing tons of computers.

            If your application has problems, the user's computer continues working.

            If a Microsoft

            • And sometimes they don't take into account certain potential problems, such as malware preventing one file from updating, making the system Bluescreen at boot.

              You are mixing 2 events in your memory:
              1) blue-screen due to a malware which modified a driver.
              2) continuous reboots when updating an antivirus.

              These have nothing to do with Microsoft!

              In these cases, I'm siding with Microsoft: it's not their responsibility to be compatible with third-party software (malware and antivirus)!

              • You are mixing 2 events in your memory:
                1) blue-screen due to a malware which modified a driver.
                2) continuous reboots when updating an antivirus.

                You're correct. I was referring to the first incident, and was under the impression that the MS update replaced (or failed to replace) the driver file, but apparently it wasn't supposed to.

                For those who don't know what I'm talking about, KB977165 [microsoft.com] updated the kernel, and would cause computer infected by the Alureon rootkit to BSOD on reboot [technet.com].

        • by guruevi ( 827432 )

          People keep saying that Microsoft needs to regression test each language and version of their operating system. That is not true. A well-designed program operates independently of its translation files. All that is necessary in a well-designed program is to catch all instances of "translate ('english string')" and build a library out of them. In most software packages, and even operating systems, you can drop in and out of different languages even on a per-user basis.

          Also, other programs that are part of th
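
The "translate ('english string')" pattern the comment describes can be sketched as a gettext-style lookup: the program only ever calls `translate()`, and the string tables are plain data that can be swapped per user without touching (or re-testing) the program logic. The catalogues below are invented for illustration:

```python
# Message catalogues are pure data, keyed by the original English string.
# Shipping a new language means adding a table, not changing program code.
CATALOGUES = {
    "en": {},  # English is the source language: fall through to the key.
    "de": {"Save file": "Datei speichern", "Quit": "Beenden"},
    "fr": {"Save file": "Enregistrer le fichier", "Quit": "Quitter"},
}

def translate(text, lang="en"):
    """Look up `text` in the catalogue for `lang`, falling back to English."""
    return CATALOGUES.get(lang, {}).get(text, text)

# Per-user language, as the comment describes: same binary, different table.
print(translate("Save file", "de"))  # Datei speichern
print(translate("Quit", "fr"))       # Quitter
print(translate("Save file", "xx"))  # Save file (unknown language falls back)
```

Because every user-visible string flows through one function, none of the surrounding logic depends on which language is active, which is why a translation update should not require retesting the whole OS.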

      • Re: (Score:2, Troll)

        by abigsmurf ( 919188 )
        He did frame it correctly. He gave them 60 days to fix it. Not "60 days to fix it plus you must stroke my ego sufficiently and quickly enough".

        If you give someone a 60 day deadline, you stick to it. You don't throw a hissy fit and put far more computers at risk because they didn't behave exactly as you want.

        Yes, the code was known and being exploited, but he made the exploit far more widespread (just look at the explosion of malware abusing the bug in the days after he published it).

        Sorry,
        • by hkmwbz ( 531650 )
          What does this have to do with Google, exactly?
          • Perhaps try RTFS? The article says/speculates that this is in response to what we're discussing.
          • You are correct, Tavis Ormandy claims that he acted on his own. Which is a fair claim, except:

            1. Tavis Ormandy is employed by Google as a security researcher - so this is what he does for Google
            2. Tavis Ormandy used Google time and Google resources when researching this vulnerability - whether this was his 20% project or not.
            3. Tavis Ormandy communicated with fellow Google security researchers using their time and resources. He even thanked them publicly in his disclosure.

            If Tavis Ormandy were employed as a piccolo

    • by hkmwbz ( 531650 )
      What does this have to do with an individual doing something on his own behalf, not Google's? How is Google defending anything?
      • Because Google is putting out an article/official statement regarding vulnerability fix timelines and public disclosure with his name in the byline. The implication is that they fully support his view on the matter, though the timeline that's being touted as acceptable in the article is not one he stuck to. It's a lot of "do as I say, not as I do."

  • Jeopardy! (Score:4, Funny)

    by jrivar59 ( 146428 ) on Tuesday July 20, 2010 @10:46PM (#32974006)

    I can only conclude that this Jeopardy! winner [youtube.com] now works for Google.

  • I don't get it (Score:3, Insightful)

    by T Murphy ( 1054674 ) on Wednesday July 21, 2010 @02:35AM (#32974798) Journal
    What does this "eleeto" mean? Is it some sort of slang term or something?
  • Just release the discoveries and let the sane companies adapt and start testing their software properly before shipment. Pussyfooting around companies that have no interest in security other than PR is never going to accomplish anything.

  • Google should fix their own account authentication system before they throw the first stone. Internet is full of reports like these about the problem:
    • http://talk.maemo.org/showthread.php?t=48382
    • http://blogoscoped.com/archive/2009-05-19-n84.html
    • http://derickng.com/posts/103-google-account-hijacked-or-just-a-bug
    • http://www.google.sh/support/forum/p/youtube/thread?tid=4426cc7a854b727d&hl=en
    • http://answers.yahoo.com/question/index?qid=20100321162016AAZnwCC
    • http://www.google.pl/support/forum/p/gmail/thread?ti
  • When submitting stories, it's hardly "lashing out", is it?

    Though after the utter, utter fiasco of getting the systems used to facilitate law enforcement requests hacked, one would have thought that Clem Attlee's line, "A period of silence from you would be appreciated", might have suited Google on security matters.
