
Cross Site Scripting Discovered in Google

Security Test writes "Yair Amit posted a message early this morning to The Web Security Mailing List outlining a Cross Site Scripting flaw in Google that allows an attacker to carry out Phishing Attacks."

  • by Artifex ( 18308 ) on Wednesday December 21, 2005 @10:14AM (#14309026) Journal
    From TFA:
    -[ Solution

    Google solved the aforementioned issues at 01/12/2005, by using
    character encoding enforcement.

    --[ Acknowledgement

    The author would like to commend the Google Security Team for their
    cooperation and communication regarding this vulnerability.
    • by mwvdlee ( 775178 ) on Wednesday December 21, 2005 @10:17AM (#14309044) Homepage
      It's considered good practice to report security issues to the responsible parties in order to give them sufficient time to fix the problem well before disclosing it to the public.
      • It's considered good practice to report security issues to the responsible parties in order to give them sufficient time to fix the problem well before disclosing it to the public.


        Yes, I know. I was referring to the use of the word "allows" in the description. :)

    • by Pinky3 ( 22411 ) on Wednesday December 21, 2005 @10:25AM (#14309139) Homepage
      "Google solved the aforementioned issues at 01/12/2005, by using
      character encoding enforcement."

      12/01/2005 for those in the US.
      • by @madeus ( 24818 ) <slashdot_24818@mac.com> on Wednesday December 21, 2005 @10:35AM (#14309231)
        Obligatory ISO international date format advocacy (2005-12-01 for the win! :-)

      • "Google solved the aforementioned issues at 01/12/2005, by using
        character encoding enforcement."

        12/01/2005 for those in the US.


        This is why, ever since high school, I have used alphabetic abbreviations for the month when I write dates, like Dec 21, 2005, even when filling in forms that have pre-printed slashes.
      • OT: date format (Score:2, Interesting)

        by higuita ( 129722 )
        12/01/2005

        No offence, but I think this US format is plain stupid... really...

        Is that the 12th of January or the 1st of December? It's a format that has several possible interpretations and no real logic (middle time scale / low / high!?!)

        I can understand the 2005/12/01 and the 01/12/2005 orderings very well (I prefer the first, especially on computers, but the latter reads better on paper), but the mixed US format is weird and dangerous...

        Most of the time it looks like you must guess the correct date.

        So why doesn't the US kill this stupid format?
        • "so why dont the US kill this stupid format?"

          It was scheduled to be phased out on 01/03/02 but, well...you can guess what happened.
        • Re:OT: date format (Score:4, Informative)

          by amliebsch ( 724858 ) on Wednesday December 21, 2005 @03:51PM (#14311965) Journal
          Most of the time looks like you must guess the correct date.

          No, it is a de facto standard in this country. That is the way virtually all dates are written, so there is rarely confusion. For international compatibility, we use named months or the ISO format. The U.S. military, for example, has standardized on YYYYMMDD (and HHMM, obviously).

          Incidentally, it's not entirely without logic. The order of the numbers matches the way we usually talk, i.e., "December Twenty-First, Two-thousand and five". Except for the holiday colloquially known as the "4th of July," the vast majority of people say it in the format "month day, year." Whether the written ordering or the spoken one came first, or whether they arose together, I do not know, but they are at least consistent.

          • Here in Australia we would say, "Twenty-First of December, Two-thousand and five," which matches the way we would write the date, 21/12/2005.
          • The order of the numbers matches the way we usually talk, i.e., ("December Twenty-First, Two-thousand and five")

            In Strunk & White's "Elements of Style", a case is made for the logic and error robustness of "21 December 2005" (text separating numbers, and progressively larger units) ... and they are right. And that's the way I have talked and written ever since I first read Strunk & White, about 25 years ago.

            The ISO standard ordering YYYYMMDD is perfectly sensible, too, for computer documents.

    • Gee. They bought AOL today and they already had an insecurity? Quick workers, those Google Engineers.
    • by kawika ( 87069 ) on Wednesday December 21, 2005 @10:28AM (#14309167)
      If there ever was an endorsement for web-based applications, this is it. When a bug is fixed in Windows or Linux, it stays active in the wild for months or years because many users don't update. With web apps the user basically gets an "update" each time they visit the site. If Google fixed the problem on December 1, the vulnerability could have been announced the same day without any kind of negative impact.
      • This is one of the reasons I love terminal services on Windows.
        Have to patch 20 servers over the course of a week instead of patching 400 client PC's over the course of a year.
      • by b1t r0t ( 216468 ) on Wednesday December 21, 2005 @10:58AM (#14309426)
        If there ever was an endorsement for web-based applications, this is it. When a bug is fixed in Windows or Linux, it stays active in the wild for months or years because many users don't update. With web apps the user basically gets an "update" each time they visit the site.

        This is great when there is only one site to update. But when everybody is running their own copy of the web app on their web server, you get problems like the recent epidemic of PHP-based bulletin board exploits.

        • Agree with the grandparent, but still an interesting point. This aspect is probably the most pertinent topic related to this story. You could say this makes a case not simply for web-apps but for centrally hosted web services and APIs. (Like the Google Maps API, for example)
      • Well, yes, a bug-fix in a web application can be rolled out to a billion users - but so can the original vulnerability. Double-edged sword.
        • Ummm.... this isn't a double edged sword at all.

          Bug in a web application? Millions of users are exposed to the bug until a patch is released.
          Bug in a locally run application? Millions of users are exposed to the bug until a patch is released.

          Where's the difference here?
          • The difference is that you didn't include everything necessary here... it should be:

            Bug in a web application? Millions of users are exposed to the bug until a patch is released.
            Bug in a locally run application? Millions of users are exposed to the bug until a patch is released and they hear about it and they actually apply it.
            • Ah, but you're only describing one edge - the good edge - of the sword, which is that a web application is fixed across the board when the patch is applied.

              The phrase 'double edged sword' refers to a solution having good effects and bad effects. My comment meant to indicate that a web application did not have an applicable bad edge; it's only a single-edged sword.

              Now... the 'bad edge' could be that feature improvements introduce NEW bugs, or undesirable features, immediately across the board, but I'm sure
  • It's been fixed (Score:4, Informative)

    by b4k3d b34nz ( 900066 ) on Wednesday December 21, 2005 @10:18AM (#14309061)

    Although the article details an interesting exploit, Google fixed this on the 1st of this month, so the title is somewhat misleading. It is useful to know that the vulnerability was discovered on November 15th and Google fixed it about two weeks later.

    Also, for those of us unaccustomed to DD/MM/YYYY date format, that's the format of all dates in the article.

  • Others.. (Score:5, Informative)

    by slashkitty ( 21637 ) on Wednesday December 21, 2005 @10:19AM (#14309074) Homepage
    They've had others in the past, but were quick to fix them. They have even sent t-shirts as thanks for the help. Other sites are not so friendly or fast. This site shows active security holes [devitry.com] in various sites that have gone unresolved. (CSS, insecure logins, etc)
  • by Anonymous Coward
    Noooo, say it ain't so, Who'd 'a thunk it?

    I turned JavaScript off in 1999; that's one less glaring security issue for me to address. Before anyone starts talking smack about responsive web apps, just remind me what Ed Felten said about flying pigs.

    That's right, disable js and fix the web!

  • by thr0n ( 835565 ) on Wednesday December 21, 2005 @10:27AM (#14309151)
    I told them about the XSS (CSS) security holes 2 months ago;
    the response was something like: "We will work on it, or we won't, but we won't tell you ;)".
    Which sucks...

    Here we go:

    Original:
    https://www.vr-ebanking.de/index.php?RZBK=0280 [vr-ebanking.de]
    MY Version (XSS):
    https://www.vr-ebanking.de/help;jsessionid=XA?Action=SelectMenu&SMID=EigenesOrderbuch&MenuName=&InitHref=http://www.consti.de/secure [vr-ebanking.de]
    / Fälschung --> imitation (i.e., a forgery) /

    ... Hope they change their mind, sometime. :)

    Consti / thr0n

  • What bullshit... (Score:3, Interesting)

    by ninja_assault_kitten ( 883141 ) on Wednesday December 21, 2005 @10:27AM (#14309160)
    Now we're going to start posting every freaking XSS we find? This is a VERY low impact XSS vuln. Hell, it's not even persistent. Who freaking cares? Are we going to post the slew of recent Yahoo XSS bugs too? What about the bug in Google Analytics which allowed you to iterate through all the customer domains?
    • do you know of any xss bugs in yahoo?
        • None of those qualify as XSS. The JavaScript in the first example must be entered by the USER; it can't be done by a third party. While they should filter this input, it's not a security hole. Allowing yourself to run JS on any site in your own browser is not a security hole (in fact, it's easy to do). It's only a problem when it can be done by someone else.
    • This XSS problem is serious because Google cookies persist for about 2 weeks. You should think a bit before posting bullshit!
    • I have to agree with parent here. This is a low impact vuln that was already fixed.

      From the disclosure:

      Therefore, when sending an XSS attack payload, encoded in UTF-7, the payload will return in the response without being altered.

      For the attack to succeed (script execution), the victim's browser should treat the XSS payload as UTF-7.


      This is a complicated vulnerability to have exploited in practice, but now that it has been mentioned, it makes me wonder just how many other encoded XSS vulns could be done wit
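
      Concretely, here is a minimal sketch of why the encoding trick works (my own Python illustration, not the payload from the report): the UTF-7 spelling of a script tag contains no literal angle brackets, so a filter that only looks for "<" and ">" in the default encoding sees nothing to strip, while a browser that has guessed UTF-7 decodes it straight back into markup.

        # A hedged illustration (not the original exploit string): the UTF-7
        # form of "<script>alert(1)</script>" contains no literal angle brackets.
        encoded = "+ADw-script+AD4-alert(1)+ADw-/script+AD4-"

        # A naive filter that only strips "<" and ">" finds nothing to remove...
        assert "<" not in encoded and ">" not in encoded

        # ...but anything that treats the text as UTF-7 decodes it back into markup.
        print(encoded.encode("ascii").decode("utf-7"))  # <script>alert(1)</script>
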
  • by G4from128k ( 686170 ) on Wednesday December 21, 2005 @10:31AM (#14309188)
    This example illustrates the advantages of web applications. Google was able to patch the flaw and roll it out to 100% of the user base in a short time period. Providing applications online means centralized version control and patching -- there's no waiting for all the users to patch.


    The downside is that this only works if the app provider is a proprietary vendor with a closed architecture. If 3rd parties are allowed to create extensions or if users can create their own utilities/add-ons then centralized patching would likely introduce the same types of incompatibilities and breakages that current OS patches can introduce. Worse, centralized control might mean that users have no choice but to live with the patched version.

  • This is amazing. (Score:5, Interesting)

    by dada21 ( 163177 ) * <adam.dada@gmail.com> on Wednesday December 21, 2005 @10:32AM (#14309201) Homepage Journal
    I'm always blown away by how the Internet security market works and corrects itself without any regulation.

    A major web site has a flaw. White hat and black hat "hackers" find that flaw, exploit it, and either abuse it or let the web site know about it. The web programmers go in and close the exploit because it affects how their customers use the service and could open them up to some liability.

    This is the way the free market works. I'm a huge fan of how quickly the Internet (anthropomorphically) adapts to the changing needs of its billions of users. Some exploits that aren't fixed by the owners of the code are fixed by third parties, sometimes for profit and sometimes for free. Before we can even write one law to attempt to solve these problems, others are already attacking them.

    I'd like to see it stay this way. Every time we move forward to create legislation to protect the end user (see CAN-SPAM and a myriad of other laws), we see failure time and again. The loopholes in the laws make them irrelevant quickly, and all we get out of that is wasted money and wasted time.

    Let the growth and expansion occur freely. We'll see some bad times (new viruses and new spam exploits) but we'll see those fixed in short order. If they don't get fixed, why is the Internet still chugging along and growing every day?
    • Yeah you're blown away, without any doubt.

      The market doesn't work, the market doesn't eat, the market doesn't do crap; the market exists only as an abstract entity, a theoretical construct.

      Techies, geeks, hackers, whatever the label... it is knowledgeable, skilled people who do the fixing!
      Some of them are motivated primarily by money and secondarily by showing the company they're good at
      resolving problems as they come up, in the hope somebody will notice when it's firing time again... but it's just daydreaming. Others, a minority, do t
    • Oh, for pete's sake...

      I think the 90% of the world that doesn't like obsessing over security would disagree with you about laissez-faire and how well it is handling identity theft and other criminal conduct that has exploded thanks to the Internet. My dad *deserves* legal protections from phishing attacks (a specific example: banks should be required to guarantee client accounts... that is WHAT A BANK IS!!!). And a small business should have their online transactions safe from remote fraud (with banks agai
      • I think the 90% of the world that doesn't like obsessing over security would disagree with you about laissez-faire and how well it is handling identity theft and other criminal conduct that has exploded thanks to the Internet. My dad *deserves* legal protections from phishing attacks (a specific example: banks should be required to guarantee client accounts... that is WHAT A BANK IS!!!). And a small business should have their online transactions safe from remote fraud (with banks again being held responsible
      • You're quick to claim that all regulatory activity is a failure, but using your same (flawed) reasoning, technological remedies have also failed to 'solve' ID theft, viruses, trojans, spam, keyloggers, hacking, international abuses, and so on.

        Yes, clearly the unregulated (or minimally regulated) Internet has proven vastly inferior to the legally enforced areas like theft, rape, assault, and murder. It turns out that market forces don't eliminate 100% of problems, whereas clearly government regulation doe
  • Encoded post.. (Score:2, Insightful)

    by slashkitty ( 21637 )
    Does anyone have the real post that hasn't been mangled by the mailing list? What are these characters that they used? Does anyone have a working exploit of this type (encoded xss) on another site?
    • Re:Encoded post.. (Score:1, Informative)

      by Anonymous Coward

      Does anyone have the real post that hasn't been mangled by the mailing list? What are these characters that they used? Does anyone have a working exploit of this type (encoded xss) on another site?

      I think that the authors of the report did the responsible thing in informing Google first, waiting until the problem was fixed (within a reasonable amount of time) and then describing the vulnerability without providing an exploit.

      The message gives enough clues about how to create an exploit, though. You j

  • XSSholes! (Score:5, Funny)

    by digitaldc ( 879047 ) * on Wednesday December 21, 2005 @10:41AM (#14309294)
    "How common are XSS holes?"
    I had to laugh at that one.

    Only an XSShole would steal your cookies.
  • by Phosphor3k ( 542747 ) on Wednesday December 21, 2005 @10:44AM (#14309319)
    Someone is trying to get their PageRank up by submitting the story under the name "Security Test" and linking to their shoddy website. The site has only a few links, no content, and it says the page is for sale. Will Slashdot ever get their shit together and stop posting submissions with blatant PageRank-whoring links like this?
  • by Anonymous Coward on Wednesday December 21, 2005 @10:48AM (#14309352)

    This is reported as a Google.com bug, which is partially true. But this is only one half of the problem. The other half of the problem (mentioned in the full article) is due to a dubious feature in Internet Explorer: when it gets a page without a specified character encoding, it does not rely on default values for the encoding (which should be iso-8859-1 for HTML or UTF-8 for XHTML).

    Instead, Internet Explorer tries to guess the encoding of the contents by looking at the first 4096 bytes of the page and checking the non-ASCII characters. In the case of the cross-site scripting attack described here, the problem is that IE would silently set the encoding of a page to UTF-7 if some characters in the first 4096 bytes looked like UTF-7. This silent conversion to UTF-7 by Internet Explorer, in text that Google assumed to use the default encoding, allowed the attackers to bypass the way Google was filtering "dangerous" characters in some URLs.

    The article puts the full blame for the vulnerability on Google.com. I think that a part of the blame should also be shared by the Internet Explorer designers (and any other browser that does unexpected things while trying to guess what the user "really meant").
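
    As a rough sketch of what "character encoding enforcement" can look like in practice (a hypothetical Python handler, not Google's actual code), the defence is simply to declare the charset explicitly in the HTTP header and in the markup, so the browser never has to sniff the first 4096 bytes and cannot be steered into UTF-7:

      # Hypothetical example: serve a page with an explicit charset so the
      # browser never has to guess the encoding.
      from http.server import BaseHTTPRequestHandler, HTTPServer

      class NotFoundHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              body = ('<!DOCTYPE html><html><head>'
                      '<meta http-equiv="Content-Type" '
                      'content="text/html; charset=UTF-8">'
                      '<title>Not Found</title></head>'
                      '<body>404 Not Found</body></html>').encode("utf-8")
              self.send_response(404)
              # The charset parameter is the whole point: without it, 2005-era IE
              # would guess the encoding from the page contents.
              self.send_header("Content-Type", "text/html; charset=UTF-8")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

      if __name__ == "__main__":
          HTTPServer(("127.0.0.1", 8080), NotFoundHandler).serve_forever()
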

    • I don't think this is an IE-only misfeature. Having a look at the browsers I use:

      Camino: Default setting is "Automatically Detect Character Encoding"
      Firefox: Default setting is UTF-8
      Safari: Doesn't explicitly say, but I just fed UTF-8 into a text file with no encoding declared, and Safari picked it up. So I assume that its default is also 'automatically detect'.
    • I think that a part of the blame should also be shared by the Internet Explorer designers (and any other browser that does unexpected things while trying to guess what the user "really meant").

      The viability of the Web is entirely dependent on browsers trying to figure out from incomplete, incorrect and/or inconsistent information what users "really meant." A browser that only renders standards-compliant HTML with unambiguous character encodings would only be able to handle a few percent of the Web.

      The fund
    • As much as I hate Microsoft and Internet Explorer, it sounds to me like their crappy browser is irrelevant here; you should never trust the browser, because anything the browser sends you could actually have come from a malicious user who bypassed the browser entirely in order to send deliberately broken input, and you have to deal with that. Your error messages don't have to be graceful and verbose, but you have to trap errors even if they should never occur.
  • by 8127972 ( 73495 ) on Wednesday December 21, 2005 @11:06AM (#14309499)
    ..... seems to be very good. They acknowledged the problem quickly (the same day if I recall correctly) and fixed it within days. Maybe instead of treating this posting as if there is a bug out there that is a clear and present danger, perhaps we should be talking about how good their response was and why other software companies aren't as responsive?
  • IANAL, but I am always amazed at how these security issues are found and resolved, since the exploratory phases for white hats and black hats are, essentially, the same. (I have a similar pet peeve about journalists who, with their hidden cameras, are able to investigate the mysteries of illicit acts without any repercussions.)

    While it may be one thing to pull apart IE and Windows XP (they can be done remotely, in an unconnected lab, with zero impact to a larger community), where does one acquire the balls to go

  • Cookies (Score:3, Interesting)

    by kernelfoobar ( 569784 ) on Wednesday December 21, 2005 @11:23AM (#14309683)
    I don't know if it's related, but I've noticed a couple of times that when I get the search result page, I get asked to set a cookie from one of the sites in the results, without clicking on any of them (my Firefox is configured to ask before setting cookies). This is somewhat disturbing; I mean, if my FF were set to accept cookies automatically, I would have cookies for sites I have never visited...

    Did anyone else notice this?
    • Re:Cookies (Score:5, Informative)

      by aziraphale ( 96251 ) on Wednesday December 21, 2005 @11:43AM (#14309854)
      Sounds like prefetching.

      Firefox (and other Mozilla derivatives) support a prefetch link. When they encounter such a link in one page, they begin downloading the content for the linked page so they have it ready. For some types of search result (probably where one particular site ranks very highly for the term you searched for), Google assumes you're reasonably likely to click the first link, so it sends Mozilla/Firefox users a prefetch hint along with the search result page, pointing at the URL of the first result. Firefox does its thing and starts downloading the page content for that first result before you even click on it, including any cookies.
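
      For the curious, the hint itself is ordinary markup. A rough sketch (hypothetical page, not Google's actual output, with example.com standing in for the first result):

        # Hypothetical search-results page carrying a Mozilla prefetch hint:
        # Firefox fetches the linked URL in the background, cookies and all,
        # before the user clicks anything.
        results_page = """\
        <html>
          <head>
            <link rel="prefetch" href="http://www.example.com/">
          </head>
          <body>
            <a href="http://www.example.com/">First search result</a>
          </body>
        </html>
        """
        print(results_page)
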
    • I think it's either JavaScript or images in the AdSense ads.

  • Google vulnerable? (Score:5, Insightful)

    by Anonymous Cowhead ( 95009 ) on Wednesday December 21, 2005 @11:25AM (#14309698)
    It seems odd to blame this on Google. According to the linked mailing list posting, the problem is caused by the "auto detect character set" feature in IE (and probably other browsers) and the lack of a "charset" parameter in the HTTP response from Google. The HTTP spec is pretty clear that a missing charset parameter means ISO-8859-1, not "browser should guess", and certainly not UTF-7.

    So isn't it really the "auto detect" feature in the browser that causes the vulnerability, and not Google's lack of "charset encoding enforcement" as the mailing list posting from Watchfire Research claims? Let's put the blame where it belongs. I say we should applaud Google for going the extra kilometer to protect users with non-compliant browsers.
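
    It's easy enough to check what a server actually declares, by the way. A rough sketch (placeholder URL, not a test of Google's 2005 responses):

      # Check whether a response declares its charset explicitly. Per the
      # HTTP/1.1 spec, a missing charset on a text/* type defaults to
      # ISO-8859-1; it is not an invitation for the browser to guess.
      from urllib.request import urlopen

      with urlopen("http://www.example.com/") as resp:  # placeholder URL
          content_type = resp.headers.get("Content-Type", "")
          print(content_type)
          if "charset=" in content_type.lower():
              print("charset declared explicitly")
          else:
              print("no charset declared; ISO-8859-1 is the spec default")
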
    • I'm definitely with you on this one. The browser itself should be blamed for automatically assuming the encoding. It's like IE assumes anyone from Mexico is Italian; sure the languages share common words, but it doesn't make them the makers of spicy sausage, shoes, and Ducati motorcycles.

      As for Google going the extra 0.621371192 miles to make sure the end user is protected against this, I do praise them for their efforts. The part in the article that caught my eye was the segment near the bottom showi
    • Yah, but then people will call you a Google fan boy
    • This is not the only place Internet Explorer does something different to what HTTP says.

      As far as I know, HTTP says that if the HTTP headers have a Content-Type header, the browser should treat the data as though it were that content type regardless of the actual contents. But IE does not do this. IE will use the Content-Type header, the file extension AND the contents of the file to decide what to do with it. This means that even though the web server sent the file as text/plain, IE may not render it as pla
