

NSS Labs Browser Report Says IE Is the Best, Google Disagrees

adeelarshad82 writes "Independent testing company NSS Labs recently published a report on the ability of popular browsers to block socially engineered malware attack URLs. The test, funded by Microsoft, reported a 99 percent detection rate by Internet Explorer 9 beta, 90 percent by Internet Explorer 8, and 3 percent by Google Chrome. However, Google doesn't entirely approve of this report's focus and conclusions. According to Google, the report not only tested the outdated Chrome 6 (the current version is Chrome 8); it also focused solely on socially engineered malware, excluding vulnerabilities in plug-ins or the browsers themselves. Google defended its browser by claiming that it was built with security in mind and emphasized protection of users from drive-by downloads and plug-in vulnerabilities."
  • Wai . . . What? (Score:3, Interesting)

    by rudy_wayne ( 414635 ) on Wednesday December 15, 2010 @04:38PM (#34565596)

    "Independent testing company NSS Labs . . . . . . . . . . The test, funded by Microsoft,"

    An "independent" test that was "funded by Microsoft". WTF? How is that independent?

  • by Dan East ( 318230 ) on Wednesday December 15, 2010 @05:04PM (#34565974) Journal

    I know this isn't in the spirit of the other posts on this topic today, but I applaud MS for concentrating on security and the best interests of their end users. It's good to see they are taking these matters seriously as part of the product development process.

    That said, I still use Firefox, followed by Chrome, for browsing, but at least they are looking out for those stuck with IE simply because it ships with their OS.

  • by rtfa-troll ( 1340807 ) on Wednesday December 15, 2010 @05:06PM (#34566018)

    So its results are unquestionably incorrect and/or irrelevant?

    They may be technically true in some sense or other. However, in past situations like this, Microsoft has been seen commissioning several similar reports, possibly even iterating the instructions for running them, then throwing away (under NDA) all the ones that don't match its marketing wishes. You can basically assume that whatever the report says is the opposite of the truth in some way or another, because if it were true they would be able to just say it directly instead of commissioning someone else to say it so they can avoid claims of false advertising (for example, their old "Get the Facts" campaign was one of the few things of this type the ASA has clearly stated was misleading [wikipedia.org]).

    And yes, most companies do this to some extent, but few other companies could come near to sustaining the level of deception Microsoft does, because eventually some employee would become disenchanted and start leaking results. For example, have a look at the Comes documents [groklaw.net], which only came out because of a lawsuit, to get some idea of the kind of things they can keep secret. Nowadays Microsoft's data destruction policies [theregister.co.uk] are much stricter, and they ensure that all deals are finalised by lawyers [pbs.org] and so are legally privileged. This kind of secrecy and professional deception means that almost any marketing claim from them should be disregarded until there is some level of independent confirmation.

  • by Anonymous Coward on Wednesday December 15, 2010 @05:40PM (#34566628)

    I work for UL. You don't know shit: UL's tests and the kind of stuff going on here are entirely different.

    You can actually reproduce UL's tests, and they aren't out there to "compare to another company".

    It'd be more like this:

    NSS Labs browser report says IE blocks 99% of socially engineered malware vectors.

    Nothing about "in comparison to chrome", or "excellence", or how well it does. Yet all of those are in the study.

    In fact, it's incredibly unethical for a testing house to editorialize on a product's performance as good, bad, or otherwise. That alone guarantees these studies are biased by their funding.

  • by eldavojohn ( 898314 ) * <eldavojohn.gmail@com> on Wednesday December 15, 2010 @05:45PM (#34566688) Journal

    I know this isn't in the spirit of the other posts on this topic today, but I applaud MS for concentrating on security and the best interests of their end users. It's good to see they are taking these matters seriously as part of the product development process.

    Don't get me wrong, I'm always happy when security is improved -- even in the most hated of products by the most hated of companies. The problem I have is when marketing gets hold of this and spins it to attack competitors, thereby improving the public perception of their own product. This could all have been avoided had Microsoft just kept the report internal, like most of NSS Labs' customers do. Doing this while comparing the latest IE9 to Chrome 6, and releasing that to the public as a 'current' report now ... well, that's what I have a problem with. If a Chrome user reads that report as today's news, they're going to think it was done with today's Chrome.

  • by natehoy ( 1608657 ) on Wednesday December 15, 2010 @06:32PM (#34567368) Journal

    The report is almost useless because it has compared the latest stable and dev releases of IE with versions of Firefox and Chrome that are years old.

    What. No, wait, what?

    Read on to the end, because later I'm going to tell you what's really wrong with the test and why it's bullshit, but I have to first burn down the obvious straw man you've introduced.

    The report was released in October 2010. http://www.nsslabs.com/assets/noreg-reports/NSS%20Labs_Q32010_Browser-SEM.pdf [nsslabs.com]

    It used Google Chrome 6, which was the current stable Chrome at the time (6 came out in September 2010). Google Chrome has gone from 6 to 8 in two months. It used Firefox 3.6, which is the current stable Firefox RIGHT NOW, two months after the report was released. 3.6 was released in January 2010, but Mozilla has only done "dot" releases since October. It also included Internet Explorer 8, which was released in March 2009.

    In other words, if you want to say "older is worse", then IE8 should have been absolutely fucking pasted by this test. Ummm, right? It's the oldest browser in the test by almost a year.

    Now we get to the point that won't upset you, because THIS is what is wrong with the test.

    According to their test, what they were really testing was vendor responsiveness to known threats (on-time maintenance of the blacklist), not some response internal to the browser. They took a bunch of really recent entries of bad sites from someone and plugged them into the browsers, getting a new batch of URLs every few hours. The time was measured in hours, so what this is really saying is that Microsoft seems to be the best vendor at maintaining the server-based "bad URLs" list, though it took them 4 hours on average to block sites as opposed to Firefox's 6 hours.

    If they got these sites from their paid sponsor, then the list could easily have been biased. But there's more actual provable bias to the test than just that.

    The real bias is in the percentages. They do not actually represent "Microsoft browsers blocked 90% of sites while Firefox only blocked 20%". They are a grade-type score, where 100% means all sites were blocked immediately and 0% means no sites were ever blocked. Early detection (measured in hours) seems to play a much larger role than the actual number of sites detected. The scores appear to have been computed on some form of normalization curve, with the sweet spot being somewhere around "one half hour longer than Internet Explorer".

    Otherwise, how does an increase in response time from 4 hours (IE, both versions to within a few minutes plus or minus) to 6 hours (Firefox) make your score go from 90% to 20%?
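    To see how a time-weighted score can turn a two-hour gap into a cliff, here is a purely hypothetical weighting. The decay function, its half_life_h knob, and the delay figures are all illustrative assumptions, not the report's actual (unpublished) formula:

```python
def time_weighted_score(delays_h, half_life_h=1.0):
    """Average per-URL protection credit, where credit halves for
    every extra half_life_h hours it takes to block a URL.
    (Made-up scoring curve, for illustration only.)"""
    return sum(0.5 ** (d / half_life_h) for d in delays_h) / len(delays_h)

ie = [4.0] * 10   # hypothetical: IE blocks every test URL in ~4 hours
ff = [6.0] * 10   # hypothetical: Firefox takes ~6 hours per URL

print(time_weighted_score(ie))  # 0.0625
print(time_weighted_score(ff))  # 0.015625
```

    Under this curve both browsers eventually block 100% of the URLs, yet the 4-hour browser scores four times the 6-hour one; the choice of decay constant alone decides how wide the winner's margin looks.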

    The net conclusion is, if you're going to use a web browser and you depend on vendor-maintained "baddie" lists as your primary line of defense (rather than script protections like NoScript, which don't depend on a vendor to maintain stuff for you), you're better off with Internet Explorer than any other mainstream browser in the market.
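    The vendor-maintained "baddie" list defense described above boils down to a lookup against a periodically refreshed blocklist before navigation. A minimal sketch under that assumption (the URLs are placeholders, and real services like Safe Browsing or SmartScreen use hashed URL prefixes and local caches rather than a plain set):

```python
# Hypothetical blocklist, refreshed from the vendor every few hours.
BAD_URLS = {"http://evil.example/payload.exe"}

def allow_navigation(url: str) -> bool:
    """Return False if the URL is on the vendor's current blocklist."""
    return url not in BAD_URLS

print(allow_navigation("http://evil.example/payload.exe"))  # False
print(allow_navigation("https://slashdot.org/"))            # True
```

    The protection window is exactly the gap between a malicious URL going live and the vendor pushing it onto the list, which is what the hours-to-block numbers above are really measuring.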

    It doesn't make you "70% safer" or protect you from "70% more threats", it means that it has, on average, 2 hours of lead time on the next-best browser in terms of the list of sites it protects you from. It's like saying that McAfee is better than Norton because McAfee generally releases specific virus signatures, on average, 2 hours before Norton does.

    So, the test is correct, it's just expressing the results in a very misleading way, showing a very low number for "everyone but Microsoft" because the test results were designed to score what IE did best in the highest way possible. They even spelled that out in their results:

    The value of this table is in providing context for the overall block rate, so that if a browser blocked 100% of the malware, but it took 264 hours (11 days) to do so, it is actually providing less protection than a browser with a 70% overall bloc[k rate]
