
Google Accelerator: Be Careful Where You Browse

Eagle5596 writes "It seems there can be a serious problem with Google's Web Accelerator, and I'm not talking about the privacy concerns. Evidently some people have been finding that, due to the prefetching of pages, their accounts and data are being deleted."
This discussion has been archived. No new comments can be posted.

  • by keesh ( 202812 ) on Saturday May 07, 2005 @04:34PM (#12463811) Homepage
    According to the HTTP spec, GET requests must not be used to change content. POST actions must be used if you're deleting or changing something. And Google doesn't prefetch POST, does it?
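    A rough sketch of the distinction being drawn here, with made-up URLs and field names (a prefetcher will follow plain links, but it won't submit forms):

      <!-- Unsafe per the spec: a plain link issues a GET, and prefetchers follow links. -->
      <a href="/items/delete?id=42">Delete this item</a>

      <!-- The spec-compliant version: destructive actions go behind a POST form. -->
      <form action="/items/delete" method="post">
        <input type="hidden" name="id" value="42">
        <input type="submit" value="Delete this item">
      </form>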
  • by Anonymous Coward on Saturday May 07, 2005 @04:40PM (#12463851)

    The root of the problem is stupid web developers ignoring RFC 2616 and using the GET method to change state.

    Now that all the people who cut corners, thinking it didn't matter, have been caught with their pants down, they look silly: the web applications they wrote are losing data, so they've gotten angry and pointed the finger at Google.

    Sorry kids, but this is what happens when you don't follow the specs. They are there to make all our lives easier, you ignored them, you fucked up.

    Yeah, maybe Google could have guessed that you'd fucked up and hobbled their software to hide your bugs. But you've got no right to complain that they didn't mollycoddle your stupid, broken web applications when it's you who broke them in the first place trying to cut corners.

  • Re:Well (Score:1, Informative)

    by Anonymous Coward on Saturday May 07, 2005 @04:44PM (#12463874)

    If it can't determine whether or not a dynamic link (like "delete this") is harmful or not

    The thing is, it can determine whether a dynamic link is harmful. GET is supposed to always be safe; the HTTP specification says so. Stupid web developers used GET in an unsafe way and are paying the penalty because Google assumed that something defined as always safe is, well, safe.

  • input type=image (Score:3, Informative)

    by slashkitty ( 21637 ) on Saturday May 07, 2005 @04:56PM (#12463953) Homepage
    It's quite easy and common, and it's in the HTML spec. Too many people just create a GET link instead of a POST form because it's a little easier.
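    To make the point above concrete, here's roughly what an image-based submit looks like; the paths and field name are invented for illustration:

      <form action="/messages/delete" method="post">
        <input type="hidden" name="msg" value="1337">
        <!-- An image input submits its form just like a regular submit button. -->
        <input type="image" src="/icons/trash.gif" alt="Delete message">
      </form>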
  • by Anonymous Coward on Saturday May 07, 2005 @07:01PM (#12464511)
    Um, tell them to follow the spec? If not, what are specs for, then?
  • by sepluv ( 641107 ) <<moc.liamg> <ta> <yelsekalb>> on Saturday May 07, 2005 @07:12PM (#12464578)
    I wouldn't be quite so harsh. Isn't the point of early beta tests like this to find out how their UA works out there in the Real World? Apparently they've already issued a fix to solve the problem (or at least go some way towards it; I don't know the details).
  • by Anonymous Coward on Saturday May 07, 2005 @07:13PM (#12464584)

    If you want to POST something, the only way to do that is to use a form. Forms cause a few problems.

    With all due respect, even though forms aren't perfect, they've been around for over a decade, and if you can't deal with them by now, don't bother calling yourself a web developer.

    Wherever a form ends, the browser inserts vertical space in many situations, some of which are unavoidable.

    You're kidding, right? If you don't want a bottom margin, say so with CSS. This is basic FAQ newbie stuff [htmlhelp.org].

    If you want a regular text link to submit a form, you have to use Javascript.

    You can use CSS to make the button look like a text link (see the sketch at the end of this comment).

    This creates a dependency on Javascript

    No it doesn't. You can easily use Javascript without depending on it. That's the way it's supposed to be used. This too is basic newbie stuff.

    Other issues with form POSTing include the inability to use the back button after POSTing.

    Huh? Works fine here.

    there's no way for webmasters to tell the browser not to pop up with the "Are you sure you want to resend the POST action again?" window.

    That's not a bug, that's a feature! POST is not idempotent. Resubmitting a POST is something that absolutely needs to be warned about, because it's a fundamentally different action to reloading a page with GET.

    GET followed by refresh == just GET it again

    POST followed by refresh == send the server some more data

    So, if we choose to follow the HTTP guidelines, we break UI and style guidelines even worse.

    There is a reason submit buttons look different to links. It's because they do different things. There are semantics associated with clicking a button that aren't associated with clicking a link. If style guidelines instruct you to make submit buttons look like links, then the style guidelines are probably broken.

    So, if we choose to follow the HTTP guidelines, we break UI and style guidelines even worse. If we want to use POST we have to give up having the page render correctly in major browsers, break the back button, break the ability to bookmark state information (unless you encode some variables in the URL in GET fashion AND others in a POST), and make every link either an image (bad for accessibility and download speeds) or use some Javascript magic (even worse for bookmarkability and accessibility).

    Wow. Get with the times. No, really. I'd expect this kind of attitude from a newbie developer in the mid-'90s.
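    Picking up the points above (no stray vertical space after a form, a submit button styled as a text link, no JavaScript dependency), here is one possible sketch; the class name and URL are made up for illustration:

      <style>
        /* Suppress the extra vertical space some browsers add after a form. */
        form.inline-action { display: inline; margin: 0; }

        /* Render the submit button like an ordinary text link. */
        form.inline-action input[type="submit"] {
          background: none;
          border: none;
          padding: 0;
          color: blue;
          text-decoration: underline;
          cursor: pointer;
        }
      </style>

      <form class="inline-action" action="/account/logout" method="post">
        <input type="submit" value="Log out">
      </form>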

  • by sepluv ( 641107 ) <<moc.liamg> <ta> <yelsekalb>> on Saturday May 07, 2005 @08:22PM (#12464933)
    And just after that it goes on to say that, because GET requests are expected to be sent without the explicit permission of a user, the server side (web developers) accepts all responsibility for any breach of the preceding "SHOULD NOT" and has no right to blame the user side (users, Google) if it decides to make GETs do more than just retrieve a document.

    FFS, how can these stupid web designers be threatening to sue Google when HTTP itself (the protocol of the WWW, which they should all have read) says that it is their frigging fault and they should blame themselves if they use GET requests in that way?

  • Appreciate the irony (Score:2, Informative)

    by Presto_slashdot ( 573879 ) on Saturday May 07, 2005 @11:23PM (#12465723)
    Nearly every highly-rated comment points the finger at "stupid" web designers rather than at Google, because GWA simply reveals that putting side effects on links is dangerous.

    I hope you appreciate the irony of posting such comments on a site whose Logout link is implemented via a GET (see the upper left of your screen). That's the point: every site implements Logout as a link (a quick illustration follows this comment), and Google should have recognized this.

    PS while I'm writing I might as well point out my previous GWA comment [slashdot.org] from a few days before this whole controversy. I was kinda hoping to shed some light on this exact problem. No one noticed, so I went and told 37signals what was going on ;)
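    To illustrate the irony: a logout control done as a plain link is just an anchor, so any agent that prefetches the hrefs on a page (as GWA does) issues that GET with the user's cookies and logs them out. The URL here is invented:

      <!-- Following this "link" is an ordinary GET request. -->
      <a href="/logout?op=confirm">Log out</a>
      <!-- A prefetching proxy walking the page's anchors effectively sends
           GET /logout?op=confirm on the user's behalf. -->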
  • Re:Another POV... (Score:3, Informative)

    by Jerf ( 17166 ) on Saturday May 07, 2005 @11:47PM (#12465801) Journal
    No matter what links you click on, you can't see another user's page unless the web application is horrifically badly designed, well beyond merely not conforming to a strict interpretation of certain HTTP standards that actually say "should" rather than "must". It is reasonable to assume many web apps use GET in ways that go against the spec's recommendation, but surely, if merely clicking a link could log you in as arbitrary other users, it would have been noticed by now. Not to mention that only other users of Google's caching are showing up, which indicates the bug isn't coming from random link pseudo-clicking.

    If you're getting pages from other users, it is a distinct problem from aggressive precaching.
