Is the Web Heading Toward Redirect Hell?

Ant snips this excerpt from Royal Pingdom: "Google is doing it. Facebook is doing it. Yahoo is doing it. Microsoft is doing it. And soon Twitter will be doing it. We're talking about the apparent need of every web service out there to add intermediate steps to sample what we click on before they send us on to our real destination. This has been going on for a long time and is slowly starting to build into something of a redirect hell on the Web. And it has a price."
  • by alain94040 ( 785132 ) * on Thursday September 23, 2010 @03:09PM (#33678818) Homepage

    Funny, just this morning I noticed that it took at least five redirects for Google to let me log in to Analytics. It felt like my browser had a life of its own!

    The real problem, though, is the link shorteners. I'd like to vote with my feet and never click on them, but for many people they're like drugs, because they let you track your influence (how many people clicked) in real time. It's especially bad on slower connections such as smartphones. Not everyone has 1MB/s.

    Any ideas on how to convince people to stop?

    --
    Don't work on your startup project without a safety net [fairsoftware.net]

    • by duguk ( 589689 ) <dug@frag.co . u k> on Thursday September 23, 2010 @03:11PM (#33678842) Homepage Journal

      Not everyone has 1MB/s.

      Any ideas on how to convince people to stop?

      Surely it's the latency, not the bandwidth, that's the problem with 301s?
      They can't be much more than a few hundred bytes!
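
      For the curious, an entire 301 response really is tiny. Something like this (URL and exact headers purely illustrative) comes to roughly 150 bytes on the wire:

        HTTP/1.1 301 Moved Permanently
        Location: http://example.com/the/real/destination
        Content-Length: 0
        Connection: close

      The cost of each hop is the extra round trip (often plus a DNS lookup for the redirector's domain), not the bytes themselves.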

      • Re: (Score:2, Interesting)

        by sarx ( 1905268 )

        I agree; but to be fair, I think it is easy for people with a little less knowledge to heuristically lump bandwidth and latency together, especially if they aren't dealing with (say) satellite links, because links with very low latency are in practice somewhat more likely to have high bandwidth. So if it is wrong, it is at least understandably wrong.

        • by skids ( 119237 ) on Thursday September 23, 2010 @03:29PM (#33679098) Homepage

          Not to mention that when a shared medium or statistically multiplexed point-to-point link of low bandwidth gets congested, latency is higher than on a higher-bandwidth link, which builds up a much shallower queue and/or spends less time waiting for the 1500-byte packet that just started transmitting to get out of the way. The distinction is only really relevant when you're discussing the technicalities of TCP window scaling and bandwidth-delay product. Certainly not to the end user: "slow" is "slow".

        • Re: (Score:3, Informative)

          But it's easy to explain the difference, so it's not entirely understandable.

          It's like understanding the difference between top speed and acceleration. Not a terribly hard concept.

          The real problem is the "internet" is a magic black box. Most people don't understand it's really just a big network, and works like a network... actually, somewhat similar to a much-quicker-delivery postal system, in simplistic terms. Except that there's a "request" thing, not just a "send" thing.

          • Re: (Score:2, Informative)

            by Menkhaf ( 627996 )

            A better analogy is water pipes. Bandwidth is the width of the pipe (the wider, the greater the throughput); latency is the pressure (the higher, the faster "it" travels).
            Of course, this being /., your almost-car analogy is probably better suited.

            • Re: (Score:2, Funny)

              by Anonymous Coward

              So you're saying the internet really is a series of tubes!

            • True. I've used that one, too, hehe.

              Of course, then they ask what is "it" then? ... ;)

              • by Menkhaf ( 627996 )

                As the AC parent sibling is getting at, just explain to them that it's really made of tubes, and try to avoid any questions about why postage around the world still takes at least a few days "when I can play Farmville with my cousin in China in seconds".

          • It's like understanding the difference between top speed and acceleration

            More like the difference between a sports car and a truck.

            The sports car gets you faster to your destination (low latency), but the truck allows you to haul more stuff (high bandwidth).

        • Re: (Score:3, Funny)

          by spazdor ( 902907 )

          What an uncharacteristically even-handed Slashdot response!

          You must be a noob.

      • by Anonymous Coward on Thursday September 23, 2010 @03:34PM (#33679160)

        There's also the added DNS lookups to consider.

      • A few hundred bytes ought to be enough for anyone.

      • Re: (Score:3, Interesting)

        Any ideas on how to convince people to stop?

        This would require a browser plugin to create a dictionary by converting the short URLs into their long forms, and to share that dictionary with others. Ideally, only one person would actually click through the shortened URL to learn what the long URL is, while everyone else would take advantage of that knowledge.

        Basically, this amounts to creating a community-driven middleman for the URL-shortening middlemen. The required technology isn't any more sophisticated than the shorteners themselves.
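
        A rough sketch of the lookup such a plugin might do, in JavaScript. Everything here is hypothetical (the dictionary endpoints, the names), and note that reading the Location header of an un-followed redirect requires a privileged context such as a browser extension or a server, since ordinary page scripts only get an opaque response:

          // Hypothetical community dictionary for expanding shortened URLs.
          const DICT = 'https://dict.example.org'; // placeholder service

          async function expandShortUrl(shortUrl) {
            // Ask the shared dictionary first, so ideally only one person
            // ever has to hit the shortener itself.
            const hit = await fetch(DICT + '/lookup?u=' + encodeURIComponent(shortUrl));
            if (hit.ok) {
              const { longUrl } = await hit.json();
              if (longUrl) return longUrl;
            }
            // Cache miss: resolve the redirect ourselves, then share the answer.
            const probe = await fetch(shortUrl, { method: 'HEAD', redirect: 'manual' });
            const longUrl = probe.headers.get('location') || shortUrl;
            await fetch(DICT + '/report', {
              method: 'POST',
              headers: { 'Content-Type': 'application/json' },
              body: JSON.stringify({ shortUrl, longUrl }),
            });
            return longUrl;
          }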

    • by TooMuchToDo ( 882796 ) on Thursday September 23, 2010 @03:12PM (#33678854)

      Any ideas on how to convince people to stop?

      Create a web service where you can submit a shortened URL and it will respond with the full URL. Make sure this web service caches the redirect for at least 24 hours. You instantly kill any reason for the redirect to exist (their counts will no longer be accurate).

      If someone wants to use this sort of service, I'd be happy to throw it together and provide it for free.
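
      A minimal sketch of such a service (Node-style JavaScript; the port and parameter name are my own invention, and fetch with redirect: 'manual' exposes the Location header when run server-side):

        const http = require('http');

        const cache = new Map();          // shortUrl -> { longUrl, expires }
        const TTL = 24 * 60 * 60 * 1000;  // hold each result for 24 hours

        http.createServer(async (req, res) => {
          const short = new URL(req.url, 'http://x').searchParams.get('u');
          if (!short) { res.writeHead(400); return res.end('missing ?u='); }

          let entry = cache.get(short);
          if (!entry || entry.expires < Date.now()) {
            // Only on a cache miss does anyone actually touch the shortener,
            // which is what makes its click counts meaningless.
            const probe = await fetch(short, { method: 'HEAD', redirect: 'manual' });
            entry = { longUrl: probe.headers.get('location') || short,
                      expires: Date.now() + TTL };
            cache.set(short, entry);
          }
          res.writeHead(200, { 'Content-Type': 'text/plain' });
          res.end(entry.longUrl);
        }).listen(8080);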

    • by religious freak ( 1005821 ) on Thursday September 23, 2010 @03:23PM (#33679006)
      There is an RFC out there (I forget the number off the top of my head) which limits redirects to five. IE6 went above spec and allowed 20, I think. IE8 shortened that to 10 redirects. FF and Chrome allow the same or fewer. There is a limit on redirects by RFC, but many websites don't follow the rule, and many browsers are forced to compensate because of this.

      Ironically, I was just recently accessing a Gmail-based email system with an Android phone when I suddenly got the message "too many redirects". So now there's no way my Google phone can access my Google mail. -1 for that one, Google.
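
      (The RFC is almost certainly RFC 2068, which said a client SHOULD NOT follow more than five redirects; RFC 2616 later dropped the fixed number and just told clients to detect loops.) The capped chain-following loop browsers implement amounts to something like this sketch (Node-style fetch assumed):

        // Resolve a URL by hand, giving up after `max` hops.
        async function resolve(url, max = 5) {
          for (let hops = 0; hops < max; hops++) {
            const resp = await fetch(url, { method: 'HEAD', redirect: 'manual' });
            const next = resp.headers.get('location');
            if (!next) return url;               // not a redirect: we're done
            url = new URL(next, url).href;       // Location may be relative
          }
          throw new Error('too many redirects'); // what the Android browser showed
        }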
    • by JSBiff ( 87824 )

      1. Convince the cell phone industry to modernize the MMS/SMS system to allow messages longer than 160 characters. (Sort of done: you can send pics in MMS messages which are longer than that, so if the 'content' of your message is actually in the 'attachment' and the main text is essentially ignored, you could maybe get longer messages.)
      2. Update everybody's phones (everywhere in the world) to be able to receive the longer messages.
      3. Make it part of the SMS/TXT standard to allow HTML (or a simplified subset thereof) which can be rendered by the messaging app built into the phone (so that long URLs can be hidden in nice anchor tags, as in standard web pages).
      4. Update everybody's phones (everywhere in the world) to support the new standard with embedded HTML.
      5. Convince twitter (and similar services - facebook, identi.ca, diaspora, etc), now that phones can send and receive longer messages, with HTML embedded, to update their services to allow longer messages, with HTML embedded.
      6. Finally, convince all the people that really like the ability to track where clicks *really* come from [snowballfactory.com] that they should stop.

      Personally, I'd like to see url shorteners die, because of a number of problems, but realistically I know it's not gonna happen. Things I don't like about shorteners:

      • You don't really know where the URL is going to take you. Surprise! You're at a site with a browser exploit, or a phishing site pretending to be something else, or a goatse site. Perhaps such "shortened" URLs could even be hijacked by whoever owns the shortening service: redirected to somewhere other than what was originally intended, like a competitor's site.
      • You're doing research and trying to review tweets from two or five years ago (or maybe a search engine sent you to an older page about the topic you're researching, and someone in the comments included a shortened URL to a related page), and the shortener URL is no longer valid: it expired, or the service was bought by a competitor and shut down, or just went bankrupt, whatever, and you HAVE NO IDEA what was originally being linked. At least if you had the original URL and the original site was no longer available, there's a small chance you might find a copy of the linked page in the Wayback Machine/Internet Archive or similar.
      • Someone who isn't the site you're visiting is tracking your browsing of that site.

      There's probably other problems I haven't thought of just yet.

    • Re: (Score:3, Funny)

      by CraftyJack ( 1031736 )

      Funny, just this morning I noticed that it took at least five redirects for Google to let me log in to Analytics. It felt like my browser had a life of its own!

      Sure, but you're already saving 2-5 seconds per search with Google instant, so you still come out ahead.

    • by kurokame ( 1764228 ) on Thursday September 23, 2010 @03:46PM (#33679324)

      You know those exploding consoles on Star Trek? Did you ever wonder why someone would invent exploding keyboards? Now you know.

    • by cgenman ( 325138 )

      Well, for Twitter at least, you need link shorteners if you're posting a Google Maps or other generated URL. Some simply won't fit into 140 characters, and most won't fit into that with a description.

      Facebook's 400-character limit is much less objectionable, but you definitely bump up against it sometimes.

      • Actually, why do people take the 140 character limit as if it's the speed of light anyway? I know it's because of SMS, but this is the freaking 21st century! It's like forcing people to use only 26 letters to write stuff because it has to be compatible with the telegraph, and Morse only has encoding for said letters! Arrrgggghhh!!!

    • Standardize URLs and/or message boards/blogs such that any URL can be posted without being broken. Also, require URLs to be short enough to be passed from person to person by voice. Until then, link shorteners aren't going anywhere.
    • by richlv ( 778496 )

      message to google.

      i use your search engine a lot. quite often i also use it over slow, high-latency links. i also like to right-click damn urls and save/reuse them.

      oh, so, the main message.

      these redirect urls on the search result page suck, blow and fucking annoy me. not always, but some good 80% of the time i use your search engine. so, please, get rid of that crap. i don't feel like using bing or whatever, but you are just making it easier for somebody else to provide a better product/service.

  • by bziman ( 223162 ) on Thursday September 23, 2010 @03:14PM (#33678874) Homepage Journal
    I refuse to click on any "shortened" link, because I want to know PRECISELY where I'm going to end up. Thank you, Slashdot and goatse.cx. If it's important enough to visit, it's important enough to spell out properly. And thank you, but I don't live my life via SMS, so the few extra characters are worth my peace of mind.
  • by apoc.famine ( 621563 ) <apoc.famine@g m a i l . com> on Thursday September 23, 2010 @03:15PM (#33678892) Journal
    For those of us who use things like NoScript, the price can be that we don't get there. Ever.

    I know that when I go to a site that can't work unless I allow a half dozen or more other sites to run scripts, I sometimes decide that it's not worth my time. When I click a link that then has to contact several domains (sometimes ones I have specifically blocked), I might stop right there and close the tab.

    The web isn't just headed towards redirect hell - it's turning into a damn sketchy web of tentacles working their way into every page. When I find ones that I'm not comfortable having around, I don't go back.

    I'm not sure I like what the web has become. Thanks to NoScript, I at least know what it's become.
    • As with so many other things, the situation is worse because most people don't know / don't care / are willing to put up with it. I am guilty myself. The problem is that the people making design decisions are not the people most affected by them, and the people affected don't understand the decisions being made.
    • by Spazntwich ( 208070 ) on Thursday September 23, 2010 @03:29PM (#33679096)

      I've noticed this as well, and just consider it the price I have to pay to avoid losing my nerd credentials along with my tiny bank balance.

      But it is becoming more prevalent, and I'm not sure what the solution is. Part of me worries this is one of the setup steps in someone's grand scheme to make the internet "dangerous" enough that the "only solution" is to grant absolute internet authority to agency x. You know. To protect the children from all the sexual predators hiding kiddie porn in bit.ly links.

    • Re: (Score:3, Interesting)

      My university seems to have come up with a plan to advertise themselves to staff and students who already work/study here: provide no direct link to the university e-mail. They want you to go to the front page, see the latest news you're not interested in and ways to make donations to the university (hint, hint), then log in, be taken to more irrelevant news, links for course tools, and another link for e-mail, which will redirect you once or twice before reaching a Google mail system.

    • by Bazman ( 4849 )

      JavaScript has nothing to do with these things, surely. The server sends back an HTTP Redirect or Moved message, and your standards-compliant browser is supposed to go, "Kthx, I'll check there." It was part of the web's protocol from near enough day zero. NoScript won't stop you following them unless the redirect systems are abusing JavaScript for this. Are they? Oh dear god no.

    • The external script thing is a real bummer: there are a lot of news sites out there that won't display properly (or at all) because they load all kinds of tracking data/cookies/plugins from other sites that I have adblocked. Or they're coded poorly and unable to function at all. Facebook is a major culprit: I've been seeing a lot of little "share this article if you're a facebook douchebag" mini-Flash apps embedded in websites (in a frameset, no less). The crazy thing is, if the app can't pull the data it wants, it can break the whole page.
    • by cgenman ( 325138 )

      The web is becoming a series of interdependent systems that all interact with each other. You may have one company that serves your ads, another that helps you understand your users better, another that serves extra content, etc. All of these services "tentacle" together to create a modern mature web experience.

      That's the nature of things once a system matures. Middleware providers, essentially, move in. They start providing portions of service to web pages from 3rd party servers. This can be as simple as an embedded analytics script.

    • I'd never realized this was a problem. Sure, I've seen Facebook turn my YouTube links into facebook.com redirects, but I didn't think it was a big deal. It still accesses YouTube quickly.

      As for NoScript, I've never seen it balk. I have "Temporarily allow top-level sites by default" and "Allow 2nd level domains (noscript.net)".

  • by bogaboga ( 793279 ) on Thursday September 23, 2010 @03:15PM (#33678904)

    Folks at linuxtoday.com have been doing this for a long time. It's one reason I fled the site. Instead of taking me directly where I wanna go, they make me click twice on the same site. This, I believe, enables them to collect 'vital information' to present to their advertisers.

    The bad thing is that they lost me and many others in the process.

    By the way, it's intentional for me not to link to them from Slashdot directly.

    • It would be trivial to do something with JavaScript: put an onclick handler that does an XMLHttpRequest to save the "needed" information, without even needing to worry about header redirects and the like. The link can be something like

      <a href="http://www.thesite.com/path/to/page.html" onclick="return notifyBigBrother(this);">

      where notifyBigBrother() is a function that sends the click info to the search engine site. Why isn't this done?
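
      One classic answer to "why isn't this done?": a plain XMLHttpRequest can't call another domain (see the same-origin policy reply below), but an image beacon can, since browsers will happily GET an image from anywhere. A hypothetical notifyBigBrother along those lines (the tracker URL is made up):

        function notifyBigBrother(link) {
          // Fire-and-forget tracking "pixel"; image GETs are exempt from
          // the same-origin policy, unlike XMLHttpRequest.
          new Image().src = 'http://tracker.example.com/click?url=' +
                            encodeURIComponent(link.href);
          return true; // let the browser follow the real href
        }

      One catch: the browser may cancel the pixel request as it navigates away, which is a practical reason services prefer a redirect they fully control.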

  • by youn ( 1516637 ) on Thursday September 23, 2010 @03:16PM (#33678908) Homepage

    ... so they had to find yet another way to slow things down... so the web could live up to its reputation of "world wide wait" ;)

  • More ads, again... (Score:3, Insightful)

    by Drakkenmensch ( 1255800 ) on Thursday September 23, 2010 @03:17PM (#33678920)
    [Wait 30 seconds or click here to skip to comment]
  • optimize google (Score:5, Informative)

    by emkyooess ( 1551693 ) on Thursday September 23, 2010 @03:20PM (#33678966)

    The Optimize Google add-on for Firefox gets rid of some of their hellish redirects. Sadly, it isn't updated frequently and seems prone to breaking.

  • My Idea (Score:5, Funny)

    by wbav ( 223901 ) <Guardian.Bob+Slashdot@gmail.com> on Thursday September 23, 2010 @03:23PM (#33679010) Homepage Journal
    I want to create a redirect loop. Just imagine: google to tinyurl to bit.ly to dlvr.it and back to google.

    Or you could always just make a really long way to get to someone who'll never give you up, never let you down.
  • by Morth ( 322218 ) on Thursday September 23, 2010 @03:24PM (#33679022)

    If someone is paying me for the clicks I send to their site, I need to count them so that I know how much to charge, and they need to count them as well to know I'm not lying. They could do the count on the destination page, but usually it's far easier to make a special service for it.

    A redirect page is usually just a couple of hundred bytes. Cookies might add some clutter, but probably still less than 1 KB in each direction, which still fits in a single packet. I don't see the problem here.

  • Google and Facebook both use these "intermediate steps" to weed out malware-infested sites and warn the user. Sampling can also be useful in judging whether something is NSFW or, more importantly, for rickrolling prevention.
    • by spazdor ( 902907 )

      Google and Facebook could just as easily filter malware out of the hyperlinks before they present them to you in the first place. I know Facebook in particular doesn't even let you post such links to your wall, let alone allow anyone to click them.

  • Not all that new (Score:5, Informative)

    by shoptroll ( 544006 ) on Thursday September 23, 2010 @03:27PM (#33679056)

    Jeff Atwood hit on this issue in a blog post last year: http://www.codinghorror.com/blog/2009/06/url-shorteners-destroying-the-web-since-2002.html [codinghorror.com]

  • This used to be considered something that was potentially a Good Thing. To help prevent link rot and redesigns from breaking links, people thinking a lot about hypertext came up with initiatives like PURLs: http://purl.oclc.org/docs/ [oclc.org]

    Now that the primary usage of these redirects is simply to shorten links to something more convenient, we're taking the same tech (a 301) and using it in different ways. One question is: how many people use the "custom link name" feature of tinyurl.com vs. simply letting a random string be generated?

  • by Anonymous Coward on Thursday September 23, 2010 @03:28PM (#33679078)

    Less fragile, and less of an unnecessary intermediary for this Web 2.0 (or whatever) age, would be to catch the click of a link with onclick, set a cookie, and open the original, intended link. When the user next came back to the site, the cookie would be dumped to the site that so badly wanted to know what was clicked. Even if the user merely had some sort of embedded resource from this site open somewhere else, it could harvest the information and send it back.

    Instead, we seem to be ending up with endless chains of redirectors and opaque identifiers bound to organizations that won't necessarily exist in a year. What a joy to use technology driven by the needs of utter morons and the greed of those keen to squeeze the most information out of the morons...
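
    A sketch of that click-then-cookie idea in JavaScript (first-party cookie, read back whenever the user next talks to the site; the names are made up):

      function rememberClick(link) {
        // Stash the clicked URL in a first-party cookie...
        document.cookie = 'lastClick=' + encodeURIComponent(link.href) +
                          '; max-age=86400; path=/';
        return true; // ...then let the browser go straight to the destination
      }
      // On any later request to this site the cookie rides along in the
      // headers, so the click gets logged without ever redirecting anyone.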

  • by FuckingNickName ( 1362625 ) on Thursday September 23, 2010 @03:28PM (#33679086) Journal

    and there is no useful (i.e. non-light-entertainment) content created primarily through advertising revenue. Slashdot developers who have made their money over the last decade producing tat by not overestimating the intelligence of the general public cannot bear to admit this, but you simply cannot produce high-calibre content when your primary aim is to suck in as many as possible of the kind of people who take notice of adverts.

    Murdoch, often maligned for his lack of business sense but mysteriously still richer than all of us, seems to have tried and failed at pushing the subscription model. Obviously there are other viable models for producing information on the web, such as government sponsorship (BBC, academia) and well-organised groups of hobbyists (e.g. ham radio), but how will sites that do not already have a dedicated subscription base from their off-line heritage sustain themselves? Or maybe the answer is that they will not, the moment they take their eye off the advertiser-as-customer and start worrying directly about satisfying the reader's desire to intellectually advance himself.

  • Not only do you not know where you're going to end up, but the service can also track your behaviour. Obviously this latter reason is why all the companies want to do it.

    So, how do we get around it? I don't think we can. I think we're screwed, to be honest. It's just going to be like that, perhaps until the day an exploit comes out and re-targets all of a service's redirects (e.g. tinyurl) to some hostile domain. Then, and perhaps only then, would it get enough attention to bring it to the mainstream.

  • I didn't even know this was occurring. Guess it hasn't bothered me so far.

  • Facebook (Score:5, Informative)

    by Xacid ( 560407 ) on Thursday September 23, 2010 @03:32PM (#33679138) Journal

    To play devil's advocate: Facebook's redirects started as a way to filter out all the spam links.

  • As I commented on TFA:

    So we have jQuery, and we have AJAX. Why don't they just attach an onclick to their links that sends a quick POST to Google before sending the user on their way, directly to the site in question? It won't work for people without JavaScript on, but that's such a small percentage that I doubt it matters to them much. The important thing is that they could get their statistics while still avoiding a redirect. The service providers could argue that they need the tracking, but an asynchronous request would give them that without the detour.

    • Re: (Score:3, Insightful)

      by farble1670 ( 803356 )

      your browser's SOP (same-origin policy) prevents you from doing this. scripts aren't allowed to make net connections to sites outside the domain of the current page. this is to reduce XSS (cross-site scripting) attacks.

      i understand there are standards in the making to allow such things, securely.
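
      Those in-the-making standards became CORS: the remote server opts in by answering with Access-Control-Allow-* headers, and only then will the browser hand the cross-origin response to the page. A minimal opt-in endpoint, sketched in Node-style JavaScript with a hypothetical origin:

        const http = require('http');

        http.createServer((req, res) => {
          res.writeHead(204, {
            // Explicitly allow scripts served from one foreign origin to
            // POST here; without these headers the browser blocks the call.
            'Access-Control-Allow-Origin': 'https://search.example.com',
            'Access-Control-Allow-Methods': 'POST',
          });
          res.end();
        }).listen(8081);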

  • huh? what? web sites are redirecting?

    (ok, RDR is not that good, but it helps, and I'm sure as this becomes even more prevalent, people will work around it)

  • On the other side of this are the search engines. They may not follow the chain of links, especially if it involves "cookies". So a reference that uses a redirection service may not be credited as an inbound link for ranking purposes.

    Then there's the firewall/proxy issue. Firewalls need to see where you're really going, so they have to run down the link chain. This may result in bogus hits on the end site, if both the firewall and the browser separately do this.

  • Redirects are a minor inconvenience on the net. Much more insidious is the enormous number of sites that have to be accessed to get all the content on many webpages. Add to that the layers and layers of CSS needed to render them, the massive, often buggy stack of scripts they bring, and the server-side scripting that slows down fetching the pages and embedded content and CSS and scripts before you get them.

    It'd be interesting to see the average number of bytes transacted to render one of today's pages.

  • by bickerdyke ( 670000 ) on Thursday September 23, 2010 @03:46PM (#33679326)

    Remember when it was considered a security hole if you DIDN'T use a redirect on your page? IIRC there used to be an attack vector where malicious sites used links from freemail pages to steal session IDs from the Referer headers.

    • by JesseMcDonald ( 536341 ) on Thursday September 23, 2010 @04:29PM (#33679882) Homepage

      To be fair, that is nothing more than a workaround for several other major security issues:

      1. The referrer header itself. This header serves no useful purpose and leaks information that the destination website has no need to know. There is no way to use referrer information securely, since it can be trivially forged, yet it serves as an invaluable tool for malicious attacks and unwanted tracking.

      2. Session IDs should be validated to prevent hijacking. At the very least the session ID should be ignored if it comes from a different IP address than the one which created the session. It's not a perfect solution, given dynamic IPs, NAT, and proxies, but it would block most attacks without inconveniencing normal users.

      3. No private information, including session IDs, should ever go in the URL. HTTP POST requests or cookies are a better solution here. (Naturally, cookies should be valid only until the end of the session unless the user explicitly indicates otherwise.)
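
      For point 3, a sketch of the cookie variant (Node-style JavaScript, names mine): HttpOnly keeps the ID away from page scripts, Secure keeps it off plain HTTP, and omitting Expires/Max-Age makes it a session cookie that dies with the browser.

        const http = require('http');
        const crypto = require('crypto');

        http.createServer((req, res) => {
          // Random session ID, sent only as a cookie, never in the URL.
          const sid = crypto.randomBytes(16).toString('hex');
          res.writeHead(200, {
            'Set-Cookie': 'sid=' + sid + '; HttpOnly; Secure; Path=/',
            'Content-Type': 'text/plain',
          });
          res.end('session started');
        }).listen(8443);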

  • I think we could just write a Java program that loops through your top 10 URLs every minute. Then you would never have to click. Call it autoBrowser. Because no one should actually click on things.
  • Accused by a site that is dependent on scripts coming from other domains.
  • links you to Jap Porn.

  • I'm going to assume that most of these use HTTP redirects. This is where the server returns a 3xx result that tells the browser where to go, as opposed to rendering a full page and using JavaScript to redirect.

    The nice thing about HTTP redirects is that a service like Twitter can just follow them for you and cut all of the middlemen out of the chain. Even forthcoming server-side JavaScript interpreters could parse out JavaScript-based redirects.

  • Shady URLs (Score:5, Funny)

    by sirrunsalot ( 1575073 ) on Thursday September 23, 2010 @04:42PM (#33680044)

    Personally, I find the trend of redirecting to innocent sites via shady URLs much more alarming: http://5z8.info/foodporn_e0g0l_taliban-meetup [5z8.info]

    (I promise I'll get modded "troll" by someone who glanced at the link and assumed the worst. Hard to blame them, but I do love using those links whenever possible...)

  • by Reziac ( 43301 ) * on Thursday September 23, 2010 @08:51PM (#33682740) Homepage Journal

    ...are those that come in perfectly legitimate email, stuff that I actively subscribed to. They already know where I came from; it's their own damned email. Why does it need to go through a redirecting clicktracker?

    Furthermore, it lets even legit emails send me somewhere not only unanticipated but also a pain in the ass, like links that unexpectedly open a whopping great PDF.

    Many thanks to the folks who posted links to two URL de-obfuscator services, which are now permanently on my toolbar:

    http://unshorten.com/index.php [unshorten.com]
    http://www.longurlplease.com/ [longurlplease.com]

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...