Google Hands Out Web Security Scanner

An anonymous reader writes "Apparently feeling generous this week, Google has released for free another of their internally developed tools: this time, a nifty web security scanner dubbed skipfish. A vendor-sponsored study cited by InformationWeek discovered that 90% of all web applications are vulnerable to security attacks. Are Google's security people trying to change this?"
This discussion has been archived. No new comments can be posted.

  • Google API (Score:5, Interesting)

    by Tokerat ( 150341 ) on Sunday March 21, 2010 @10:00AM (#31557624) Journal
    Considering how many web apps use Google APIs in some form or another these days, I'd say it's in their best interests to ensure those sites don't all become a liability to each other by way of their centralized cloud.
    • Re:Google API (Score:5, Interesting)

      by girlintraining ( 1395911 ) on Sunday March 21, 2010 @10:35AM (#31557846)

      I'd say it's in their best interests to ensure those sites don't all become a liability to each other by way of their centralized cloud.

      Given that most websites still use homebrew code for their database interactions, and that injected code is the most common route of infection, this only covers a small range of possible attack vectors.
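The injection risk the parent describes can be illustrated with a minimal, self-contained sketch (Python with an in-memory sqlite3 database; the table, column, and input values are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "' OR '1'='1"  # attacker-controlled value

# Vulnerable pattern: string concatenation lets the input rewrite the query,
# so the WHERE clause becomes: name = '' OR '1'='1' -- matches every row.
vulnerable = "SELECT * FROM users WHERE name = '%s'" % user_input
rows_vulnerable = conn.execute(vulnerable).fetchall()

# Safe pattern: a parameterized query treats the input as data, not SQL,
# so it matches no row named literally "' OR '1'='1".
rows_safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()

print(len(rows_vulnerable), len(rows_safe))
```

The homebrew-code problem is exactly that the first pattern is the one people write by hand; no scanner is needed to avoid it, just the second pattern.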

  • 2 side sword (Score:3, Interesting)

    by gmuslera ( 3436 ) on Sunday March 21, 2010 @11:00AM (#31557962) Homepage Journal
    It's VERY fast: observed at 500 requests/second against responsive internet servers, and 2000/sec on the same LAN. And of course it's targeted at dynamic apps, not static images/content. With that speed, the first vulnerability it will find is vulnerability to DoS attacks. The good news: when the bad guys try to find your application's vulnerabilities using this tool, that's the only one they'll find. Worst-case scenario: the code gets included in a botnet.
    • Re: (Score:1, Funny)

      by Anonymous Coward

      It's VERY fast: observed at 500 requests/second against responsive internet servers, and 2000/sec on the same LAN...

      Wow, it's almost like you read the FAQ [google.com] or something:

      500+ requests per second against responsive Internet targets, 2000+ requests per second on LAN / MAN networks...

    • Yeah, because no one else can write a C web client any more, only Google.

      </sarcasm>

      Really, do you work for Fox News or something?

  • When I click on "View a sample screenshot", my browser downloads the damn PNG file instead of simply displaying it like it should. Is it something wrong on Google's side or is it my browser?

    • That is weird. Given Google Chrome does it, too, I'd assume it's something wrong on their side.

      In particular, the headers for that URL are:

      200 OK
      Cache-Control: public, max-age=604800
      Connection: close
      Date: Sun, 21 Mar 2010 11:57:00 GMT
      Accept-Ranges: bytes
      Age: 18380
      Server: DFE/largefile
      Content-Length: 146941
      Content-Type: image/png
      Expires: Sun, 28 Mar 2010 11:57:00 GMT
      Last-Modified: Thu, 18 Mar 2010 19:13:33 GMT
      Client-Date: Sun, 21 Mar 2010 17:03:20 GMT
      Client-Peer: 209.85.225.82:80
      Client-Response-Num: 1
      Content-Disposition: attachment; filename="skipfish-screen.png"
      X-XSS-Protection: 0

      In other words, the server is deliberately telling your browser to treat it as an opaque attachment to be downloaded (and saved with that filename), and not something to be displayed.
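To make the behavior described above concrete, here's a minimal sketch using Python's standard http.server: one path sends the same Content-Disposition header as the skipfish screenshot URL, and omitting it (or using "inline") would let the browser display the image. The stub bytes and paths are placeholders, not a real image:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

PNG_STUB = b"\x89PNG\r\n\x1a\n"  # placeholder bytes, not a real PNG

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "image/png")
        if self.path == "/attachment":
            # This header is what forces the download prompt.
            self.send_header("Content-Disposition",
                             'attachment; filename="skipfish-screen.png"')
        self.send_header("Content-Length", str(len(PNG_STUB)))
        self.end_headers()
        self.wfile.write(PNG_STUB)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/attachment" % server.server_port
with urllib.request.urlopen(url) as resp:
    dispo = resp.headers["Content-Disposition"]
print(dispo)
server.shutdown()
```

Same Content-Type either way; only the Content-Disposition header decides display versus download.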

      • by Yvan256 ( 722131 )

        Is there any way to work around websites that do that for files that you know your browser can display by itself, such as PDF files?

        • The Open in Browser plug-in for Firefox works for files that Firefox supports natively; not sure if it can help with PDFs.

        • Yes, but it's annoying enough to be pointless. Your options are pretty much to patch your browser or to set up a proxy that filters that header. Either way, you need to think about how you're going to identify it -- with content-type, or with the filename extension? (I'd suggest content-type.)
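As a sketch of the content-type approach suggested above, the core logic a filtering proxy would apply is just this (a standalone function with a made-up name, not tied to any particular proxy software; the list of displayable types is an assumption you'd tune yourself):

```python
# Content types a browser can typically render inline; extend to taste.
DISPLAYABLE_PREFIXES = ("image/", "text/", "application/pdf")

def strip_forced_download(headers):
    """Drop "Content-Disposition: attachment" for displayable content types.

    `headers` is a plain dict of response headers; returns a filtered copy.
    """
    out = dict(headers)
    ctype = out.get("Content-Type", "").split(";")[0].strip().lower()
    dispo = out.get("Content-Disposition", "").strip().lower()
    if dispo.startswith("attachment") and ctype.startswith(DISPLAYABLE_PREFIXES):
        del out["Content-Disposition"]
    return out

filtered = strip_forced_download({
    "Content-Type": "image/png",
    "Content-Disposition": 'attachment; filename="skipfish-screen.png"',
})
print("Content-Disposition" in filtered)
```

Keying on Content-Type rather than the filename extension means the server's own claim about the data decides, which is the more robust of the two options mentioned.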

          Besides which, it actually makes sense to have this functionality. Sometimes you have a button that says "download" explicitly. In this case, some idiot put the screenshot in the "files" area, which is intended for downloads.

      • by shird ( 566377 )

        Well, they are linking to the "downloads" section (check out the downloads section, it's the same URL). It makes sense that the "downloads" area should serve things up as downloads rather than embedded content.

    • by gilgongo ( 57446 )

      Ironically, when I clicked that link, I thought "Woah! The server's trying to send me a file that's not an image! It must be 0wned!"

      But I carried on anyway because of my blind faith in all things Google, and was greeted by a rather ugly screenshot. And maybe an infected desktop or something...

  • by Anonymous Coward

    I peeked at the report, out of curiosity. They don't claim that 90% of web applications are vulnerable; they DO claim that 90% (well, 89%) of all web vulnerabilities are in web applications, which is quite a different thing.

  • by Anonymous Coward on Sunday March 21, 2010 @01:13PM (#31558794)

    We configured skipfish and pointed it at our custom platform with full administrator rights, and entered our system's custom file extensions into the skipfish dictionary.

    Overall the performance is quite good (>3k HTTP requests per second) after tweaking concurrent connection count. Orders of magnitude better than any scanner we have ever used.
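For reference, the kind of invocation and connection tweaking the parent describes looks roughly like this (flag names as given in the skipfish documentation; the target host and wordlist path are placeholders, so verify everything against `skipfish -h`):

```shell
# -o: report output directory; -W: dictionary/wordlist to use.
# -g and -m (global and per-IP connection caps, per the skipfish docs)
# are the knobs to tweak for throughput -- raising them is what pushes
# the request rate into the thousands per second on a LAN.
./skipfish -o report_dir -W dictionaries/complete.wl \
    -g 50 -m 10 http://internal-test-host/
```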

    The report UI seemed polished and provided quite a bit of useful data, with summaries and drill-down to detail. It would really help if, instead of simply posting raw request/response data, it would highlight the sections of the response that led it to flag a particular vulnerability.

    In terms of scan results, they look for quite a number of common vulnerabilities, and some of the checks are quite creative. I especially liked the check for "interesting" content. Some of our test data tripped it, which was perfectly reasonable given the content.

    I aborted the scanner at the 5 million HTTP request mark, roughly 20 minutes in.

    In terms of actual results against our system, of the several dozen possible vulnerabilities reported (XSRF, injection, etc.) there were no actual problems discovered: 100% false alarms.

    There is something really odd about some of the requests being made. I don't know if it's intentional (to discover bugs), but the folder/file parsing looks to be broken, and it's building stupid path names with the filename /subfolder. This seems to be causing most of the UI not to be crawled, as it ends up in the 404 category. Maybe this is my fault in dictionary configuration, but the system wastes way too many requests throwing the dictionary at each resource and not nearly enough time crawling the site and discovering what's available to exploit.

    I then took a cursory glance at the source code: all of the rule checking is hard-coded in C (see analysis.c), which to me seems quite stupid and useless.

    Still, the tool is a start, and already better than many freebie tools I have used over the years.

    My advice: first and foremost, abstract the analysis details out of the C code. Focus more on crawling, even for dynamic content, and bolt in some intelligence/expert system to direct the scan.

    I wouldn't be surprised if the actual number is much, much higher. This has always been a problem with software development; I'm not sure why anyone thought it got better when apps became web-based. When your business depends on apps being up and running (or running the newest, coolest features), security is usually not the highest priority.

    As a vendor I sit in meetings all the time with app architects and even security people (up to and including CISOs) at some of the biggest corporations in the world who

  • I just wanted to point out that many organizations and people are trying to resolve the global web-insecurity issue caused by many things including application insecurity. Google is just one participant in this effort. What is frustrating is that when Google talks people call it news. When these other organizations make contributions, nothing is heard.
