Google Accelerator: Be Careful Where You Browse 89

Eagle5596 writes "It seems that there can be a serious problem with Google's Web Accelerator, and I'm not talking about the privacy concerns. Evidently some people have been finding that due to the prefetching of pages their accounts and data are being deleted."
  • by Anonymous Coward on Saturday May 07, 2005 @04:24PM (#12463755)
    Google should have beta tested it first.
    • This IS beta! When you use a beta version of anything, don't be surprised if something "breaks".
      • AC stumps high numbered /. login.

        News at 11:00.

      • You are an idiot. "Beta" does not mean "we can do whatever the hell we want and delete all your data". Beta means "we have tested this application to the best of our in-house abilities, and now need input from a wider audience." Implied in the concept of beta testing is the assumption that no catastrophic bugs will hit you; those have been burned out in alpha. An organisation like Google *especially* should have caught such a simple issue.

        I am not getting into the "this is cool" or "this is evil" argument.
  • Perhaps we should start keeping our own data secure, rather than relying on other people to do it for us? I mean, if you're worried about people using this program and gaining access to your "sensitive" data, then it's your own damn fault. Your data shouldn't be so wide open on web pages anyhow. Bah.
  • Another POV... (Score:3, Insightful)

    by Gothic_Walrus ( 692125 ) on Saturday May 07, 2005 @04:28PM (#12463785) Journal
    Something Awful had an article on this subject [somethingawful.com] a few days ago.

    I'm not sure if I agree with the "Google is the new Microsoft" sentiments, but thinking before you install new software is always a good idea.

    • Re:Another POV... (Score:4, Insightful)

      by Jerf ( 17166 ) on Saturday May 07, 2005 @06:18PM (#12464303) Journal
      Actually, that's yet another different problem, one where you get the wrong page from the cache, specifically somebody else's personalized page. It is completely unrelated, in the sense that one could fix either problem independently. (It is possible that they have the same root cause, but I doubt it.)

      This brings the current list of reasons not to use the Accelerator up to three, counting the obvious privacy issues.
      • In my defense, this article does use the phrase "in addition to."

        I just read it incorrectly. Not an uncommon event on my part... >_

      • "Actually, that's yet another different problem"

        Not necessarily. If you have to click a link to log out, that link may be prefetched and cached, so clicking it doesn't actually log you out. If the webapp uses a poor session implementation, that can lead to the same problem.

        Websites using session-based authentication really should use a form and do a POST to log out.

        Of course, if web sites used http-auth (as they should), this wouldn't be a problem at all.
        • Re:Another POV... (Score:3, Informative)

          by Jerf ( 17166 )
          No matter what links you click on, you can't see another user's page, unless the web application is just horrifically badly designed, well beyond merely not quite conforming to a strict interpretation of certain HTTP standards that actually say "should" instead of "must". It is reasonable to assume many web apps use GET in ways going against the spec's recommendation, but surely if merely clicking a link could log you in as arbitrary other users, it would have been noticed. Not to mention only other users o
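
          The POST-based logout suggested above needs only a tiny form; here is a minimal sketch (the /logout path is illustrative):

          ```html
          <!-- Logout as a form submission: prefetchers follow plain GET
               links, not form submissions, so this is never triggered
               automatically. -->
          <form action="/logout" method="post">
            <input type="submit" value="Log out">
          </form>
          ```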
    • I think the author is jumping the gun. I believe that this Google Web Accelerator was born from the "Hey, why not use Google's cache all the time when browsing sites on frequently slow servers?" idea, and that these issues are merely unintentional side effects that still need to be fixed (which will be pretty complicated if you ask me).

      Still, Google will have the opportunity to store virtually the entire browsing history of Google Web Accelerator users, which people should keep in mind when installing the

  • by keesh ( 202812 ) on Saturday May 07, 2005 @04:34PM (#12463811) Homepage
    According to the HTTP spec, GET requests must not be used to change content. POST actions must be used if you're deleting / changing something. And google doesn't prefetch POST, does it?
    • Unfortunately, I'm not aware of anything in the HTML spec that allows the page designer to attach a POST action to anything other than a submit button. It's not particularly difficult to add a POST action to a JavaScript event handler, but I'm sure that presents problems of its own.
      • input type=image (Score:3, Informative)

        by slashkitty ( 21637 )
        It's quite easy and common, and it's in the HTML spec. Too many people just create a GET link instead of a POST form because it's a little easier.
        • I'm sorry, but what is quite easy and common?

          The only two "common" ways that I'm aware of to submit a form as a POST action are to use a submit button or to submit the form from a scripted event.

          If you know of a way to submit a POST action from a text link without using javascript, please share it with the rest of us.
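
          For what it's worth, one JavaScript-free option is a one-button form whose submit button is restyled with CSS to look like a text link (the path and styles here are illustrative):

          ```html
          <!-- A real submit button, visually disguised as a text link. -->
          <form action="/account/delete/121" method="post" style="display: inline">
            <button type="submit"
                    style="background: none; border: none; padding: 0;
                           color: blue; text-decoration: underline; cursor: pointer">
              Delete this item
            </button>
          </form>
          ```

          The user sees an ordinary-looking link, but the browser sends a genuine POST, so nothing prefetches it.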
    • Unfortunately, it's not that simple in the real world though.

      If you want to POST something, the only way to do that is to use a form. Forms cause a few problems.

      IE and Opera render forms slightly "creatively". Wherever a form ends, the browser inserts vertical space in many situations, some of which are unavoidable. This usually makes the page render very strangely. If I want a list of links, and some of them have side-effects and some don't - my choices are to make some of them forms and some regular
      • Again, you're misusing the technology. HTML is a text markup language, not a page layout language. If you want pixel perfection, use PDFs or a similar format which was designed for that kind of thing.
      • by Anonymous Coward on Saturday May 07, 2005 @07:13PM (#12464584)

        If you want to POST something, the only way to do that is to use a form. Forms cause a few problems.

        With all due respect, even though forms aren't perfect, they've been around over a decade, and if you can't deal with them by now, don't bother calling yourself a web developer.

        Wherever a form ends, the browser inserts vertical space in many situations, some of which are unavoidable.

        You're kidding, right? If you don't want a bottom margin, say so with CSS. This is basic FAQ newbie stuff [htmlhelp.org].

        If you want a regular text link to submit a form, you have to use Javascript.

        You can use CSS to make the button look like a text link.

        This creates a dependency on Javascript

        No it doesn't. You can easily use Javascript without depending on it. That's the way it's supposed to be used. This too is basic newbie stuff.

        Other issues with form POSTing include the inability to use the back button after POSTing.

        Huh? Works fine here.

        there's no way for webmasters to tell the browser not to pop up with the "Are you sure you want to resend the POST action again?" window.

        That's not a bug, that's a feature! POST is not idempotent. Resubmitting a POST is something that absolutely needs to be warned about, because it's a fundamentally different action to reloading a page with GET.

        GET followed by refresh == just GET it again

        POST followed by refresh == send the server some more data

        So, if we choose to follow the HTTP guidelines, we break UI and style guidelines even worse.

        There is a reason submit buttons look different to links. It's because they do different things. There are semantics associated with clicking a button that aren't associated with clicking a link. If style guidelines instruct you to make submit buttons look like links, then the style guidelines are probably broken.

        So, if we choose to follow the HTTP guidelines, we break UI and style guidelines even worse. If we want to use POST we have to give up having the page render correctly in major browsers, break the back button, break the ability to bookmark state information (unless you encode some variables in the URL in GET fashion AND others in a POST), and make every link either an image (bad for accessibility and download speeds) or use some Javascript magic (even worse for bookmarkability and accessibility).

        Wow. Get with the times. No really. I'd expect this kind of attitude from a newbie developer in the mid 90s.

        • Wherever a form ends, the browser inserts vertical space in many situations, some of which are unavoidable.

          You're kidding, right? If you don't want a bottom margin, say so with CSS. This is basic FAQ newbie stuff.

          Yes, and IE ignores it in some situations, and in some places will size your table as if the space it added weren't there.

          there's no way for webmasters to tell the browser not to pop up with the "Are you sure you want to resend the POST action again?" window.

          That's not a bug, that's a feature!
          • I know Slashdot isn't a shining example of HTML compliance either

            Nuff said.

            That "Logout" link has a side effect of going to it, and it's a GET.

            I'll say it anyway. It shouldn't.

            At the most basic level, even tracking "how many people have seen this page" is an effect of loading it, that is affected by undesired prefetching. Keeping track of which pages are most recently accessed to handle server side caching of dynamic content is an effect of loading a page, even when no data on the page is changed.

          • One frequently used page had a tiny list of links on it to generate reports. "Daily"/"Weekly"/"Monthly"/"Yearly"/"All Records(Note: will take several minutes to generate)". GWA was following all of those links to prefetch them.


            There isn't even a way to ask GA not to prefetch certain links other than hiding them in javascript or forms.


            You mean that Google doesn't obey robots.txt?

            That surprises me.
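
            If it did, a robots.txt entry like this (paths are illustrative) would keep a compliant robot away from the expensive report links:

            ```
            # Hypothetical robots.txt fencing off the report generator
            User-agent: *
            Disallow: /reports/
            ```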
          • There's kind of a convention that setting cookies on the client side doesn't count as a side effect. You can follow a normal text link and receive a cookie, so it doesn't hurt too much to follow a 'Logout' link for a cookie to be deleted. If something happens to prefetch that page, it just won't apply the cookie changes. If 'Logout' ends up setting some state on the server, that's less appropriate and really should be a POST.

            I know there isn't an exact line between what counts as a side effect and what
        • Other issues with form POSTing include the inability to use the back button after POSTing.

          Huh? Works fine here.
          Just one thing for those who didn't get this one: you should always return a Location: header when replying to a POST.
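
          That redirect-after-POST pattern looks like this on the wire (paths and status line are illustrative; 303 See Other is the semantically correct response, though a plain 302 was common in practice):

          ```http
          POST /items HTTP/1.1
          Host: example.com
          ...

          HTTP/1.1 303 See Other
          Location: /items/42
          ```

          The browser then issues a plain GET for the Location URL, so Back and Refresh replay a harmless GET instead of resubmitting the POST.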
      • "I would love something like:

        <a href="/link.script" method="post" variables="a=1;b=2">"

        I guess it is fortunate for us that you'll never see it - no browser would implement such a thing. It is contrary to the spirit of HTML in general and links specifically.

        See the WhatWG discussion [dreamhost.com] of this sort of thing for more reasons why it sucks.
    • by Anonymous Coward
      Quoting from section 9.1.1 Safe Methods of the HTTP 1.1 RFC (2616):

      Implementors should be aware that the software represents the user in their interactions over the Internet, and should be careful to allow the user to be aware of any actions they might take which may have an unexpected significance to themselves or others.

      In particular, the convention has been established that the GET and HEAD methods SHOULD NOT have the significance of taking an action other than retrieval. These methods ought to be considered "safe".

    • Uh, doesn't that throw all of REST out the window?

    • AFAIK, GETs should be idempotent, but that just means hitting them n times, for any positive n, produces the same result; it does NOT mean that hitting a URL zero times has the same result as hitting it once (that would be identity in addition to idempotence). Logging out is a classic example of something with a side effect that is nonetheless idempotent: from any state, hitting a logout link takes you to the logged-out state, and hitting it again leaves you there.

    • Most websites use some sort of link-checking program on a schedule to make sure they didn't accidentally create broken links within their own website.

      Such link-check programs also follow all the links in your webpage.

      Bug in the webpage. Nothing to do with Google.
    • GWA also doesn't prefetch GETs with query strings. The problem is that Basecamp/Backpack apparently uses short/pretty URLs that don't contain query strings, e.g. http://host/account/delete/121 [host] instead of http://host/account?action=delete&id=121 [host]. And using GET for delete/add/whatever links isn't strictly prohibited: the spec says SHOULD NOT, not MUST NOT.
  • If it can't determine whether a dynamic link (like "delete this") is harmful, perhaps this could be the end of Google Accelerator?
    • Re:Well (Score:1, Informative)

      by Anonymous Coward

      If it can't determine whether a dynamic link (like "delete this") is harmful

      The thing is, it can determine whether a dynamic link is harmful. GET is supposed to always be safe; the HTTP specification says so. Stupid web developers used GET in an unsafe way and are paying the penalty, because Google assumed that something defined as always safe is, well, safe.

  • by Anonymous Coward

    The root of the problem is stupid web developers ignoring RFC 2616 and using the GET method to change state.

    Now that all the people who cut corners thinking it didn't matter have been caught with their pants down, they look silly because the web applications they wrote are losing data, so they have gotten angry and pointed the finger at Google.

    Sorry kids, but this is what happens when you don't follow the specs. They are there to make all our lives easier, you ignored them, you fucked up.

    Yeah, maybe G

  • Good to know, I've disabled prefetching in GWA as a result.
  • Sigh...YADA (Yet Another Duplicate Article)

    This was already posted on /. in the last day or two.
  • hey guys did i do this rite
  • If you can delete content by following a link, then this is a major security hole. Any website could easily embed such a link into java, javascript, even just an image link. Someone could send you an email with an image referencing the link. This is one place you should be following the spec. If you're making an important side-effect, use POST.
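
    For example, if deleting content is a GET at a guessable URL (the URL below is hypothetical), any web page or HTML email can trigger it:

    ```html
    <!-- The victim's browser fetches this "image" with their session
         cookies attached, silently performing the delete. -->
    <img src="http://host/account/delete/121" width="1" height="1" alt="">
    ```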

  • by Anonymous Coward
    Setting aside the fact that you now have accounts that are logged in, couldn't you just as easily make a public site that allows anonymous visitors to edit content -- let's say, a wiki -- with "delete" links sprinkled on it?

    What would you say to a webmaster that sticks "delete" links everywhere on their pages, and suddenly finds that Googlebot, in its daily rounds, wipes out their entire wiki?
  • by Anonymous Coward
    Link pre-fetching, as performed by Mozilla/Firefox [mozilla.org], is an opt-in thing. Webmasters add the rel='prefetch' attribute to the links they want prefetched, so software can prefetch intelligently.

    It's safe, it's an emerging standard, and webmasters maintain control. Why isn't Google following the standard?
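
    The opt-in mechanism is just a rel attribute (URLs illustrative):

    ```html
    <!-- Prefetch hints as implemented by Mozilla/Firefox -->
    <link rel="prefetch" href="/next-article.html">
    <!-- or on an ordinary link: -->
    <a href="/next-article.html" rel="prefetch">Next article</a>
    ```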
  • Nearly every highly-rated comment points the finger at "stupid" web designers rather than at Google, because GWA simply reveals that putting side effects on links is dangerous.

    I hope you appreciate the irony of posting such comments on a site whose Logout link is implemented via a GET (see the upper left of your screen). That's the point: nearly every site implements logout as a link, and Google should have recognized this.

    PS while I'm writing I might as well point out my previous GWA comment [slashdot.org] from a few days be
  • People on dial-up are going to use web accelerators. Concerns about privacy and the other nightmares accelerators cause (such as making graphics look like shit) are generally (though not exclusively) limited to people willing to pay $10-20 more a month for broadband (Netscape dial-up $9.95, AOL Dial-up $19.95, avg. DSL $29.95).

    All this stuff we bitch and moan about here probably won't make a dent in the adoption of Google's accelerator and they're just going to run roughshod over webmasters whose sites do

  • I went to http://webaccelerator.google.com/ [google.com] and I saw this message:
    "Thank you for your interest in Google Web Accelerator. We have currently reached our maximum capacity of users and are actively working to increase the number of users we can support."

    Maybe this has something to do with all these security concerns?
