Google Announces Google CDN

leetrout writes "Google has introduced their Page Speed Service which 'is the latest tool in Google's arsenal to help speed up the web. When you sign up and point your site's DNS entry to Google, they'll enable the tool which will fetch your content from your servers, rewrite your webpages, and serve them up from Google's own servers around the world.'"

  • at least when they finish taking over the world we'll be able to find things on the internet REALLY FAST.

    • by queen of everything ( 695105 ) on Thursday July 28, 2011 @07:55AM (#36906824)
      how long until we just rename it to the "googlenet"?
      • by Chrisq ( 894406 ) on Thursday July 28, 2011 @08:19AM (#36907078)
        Would that make a Wifi access point a G-spot? I can see the support calls "try pinging your G-spot"
      • T-1000 (beta)

    • yesterday we read about Akamai [slashdot.org], apparently the origin of 15-30% of web traffic. Google's service seems to be similar to Akamai's offering, but free of cost.

      Tomorrow Akamai, the day after tomorrow the world?

      • by Anrego ( 830717 ) *

        For now...

        Google says that Page Speed Service will be offered for free to a limited set of testers right now. Eventually, they will charge for it, and pricing will be “competitive”.

        Also, are there any sites left that are static? Could maybe be useful if you get a lot of traffic and separate out static stuff (images, scripts, CSS, whatever) and dynamic stuff into two domains... but for most of the internet?

        • by gmueckl ( 950314 )

          Most of the content I see is quasi-static. The actual page content does not change often enough to warrant a complete page regeneration on each request. Complete page generation for each request is really only justified if there are either too many pages to write them all to a static cache at the same time or if the pages are very dynamic (like very active forum threads or other dynamic data that needs to be delivered "fresh").

          • Complete page generation for each request is really only justified if there are either too many pages to write them all to a static cache at the same time

            Would an online shopping site with 80,000 products count? The product photos probably would though.

            or if the pages are very dynamic (like very active forum threads or other dynamic data that needs to be delivered "fresh").

            Would an online shopping site that displays which products a given user has recently viewed in this shopping session and what items are in the shopping cart count? Or possibly each page of product search results.

        • I'd say that traffic is more static than ever. With more and more websites using JS to update the content - instead of doing a full page request - the amount of dynamic data is very small.

          In fact, with proper caching (and HTML5 has some nice features for that), current webpages can actually be faster on dialup, by transferring 2KB of a JSON list instead of a 300KB HTML page for each update.
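
          A rough, untested sketch of that pattern (the /comments.json endpoint and element id are made up, just to show the idea of pulling a small JSON delta instead of re-fetching the whole page):

            // Hypothetical endpoint and element id -- not a real API, just the shape of it.
            async function refreshComments(): Promise<void> {
              const res = await fetch("/comments.json", { headers: { Accept: "application/json" } });
              if (!res.ok) return;                               // keep showing the stale view on error
              const comments: { id: number; text: string }[] = await res.json();
              const list = document.getElementById("comment-list");
              if (!list) return;
              // a couple of KB of JSON instead of re-downloading the whole page
              list.innerHTML = comments
                .map(c => `<li id="comment-${c.id}">${c.text}</li>`)
                .join("");
            }
            setInterval(refreshComments, 30_000);                // poll every 30 seconds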

      • by robmv ( 855035 )

        It is not free of cost; it is free for testing, and prices will be announced later.

      • This is why I am using CloudFlare.

        Somehow I trust them more than Google.

        • Can you quantify this? What metrics can one use to decide what CDN to use?

          • Google's blatant disregard for privacy and the ubiquitous tracking.

            CloudFlare is a relative newcomer that seems to be about security.

            Hm.

            • by s73v3r ( 963317 )

              ...OK. How about their actual service? Their prices, how well they actually do the things they claim to be able to do, that sort of thing.

              Increased privacy would be nice, but first and foremost, I have to go with the solution that works.

      • by Rival ( 14861 )

        yesterday we read about Akamai, apparently the origin of 15-30% of web traffic. Google's service seems to be similar to Akamai's offering, but free of cost.

        Tomorrow Akamai, the day after tomorrow the world?

        My thoughts exactly. Akamai will be pretty threatened by this, but I'm not sure what they can do about it other than offer superior service.

        I wonder if Google will try to buy them out, though -- Akamai has lost about half of its stock value over the past three quarters for some reason.

        Such a move would definitely cause alarm, though. I personally would not feel comfortable concentrating so much of the internet in one company. Single points of failure are bad.

  • by Crudely_Indecent ( 739699 ) on Thursday July 28, 2011 @07:59AM (#36906868) Journal

    So, it rewrites my HTML, but what about my PHP (Perl, Python, your_scripting_language_here)?

    • by Anrego ( 830717 ) *

      Largely my first thought. Not much of the web is static these days. Most people who just want "a basic page with some info on it" unfortunately just use Facebook now. Even really simple pages tend to have _some_ dynamic widget on them that relies on server-side activity.

      May be useful if one separated out static and dynamic content into separate domains... but for anything short of large scale this is a hassle.

    • by slim ( 1652 )

      Like any other proxy, no doubt it will heed the Cache-Control HTTP headers.
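
      Something like this, for instance (a minimal sketch; the paths and max-age values are made up), is all the origin has to send for a caching proxy to do the right thing:

        import { createServer } from "http";

        createServer((req, res) => {
          if (req.url && req.url.startsWith("/static/")) {
            // long-lived, publicly cacheable entry for images/CSS/JS
            res.setHeader("Cache-Control", "public, max-age=86400");
          } else {
            // dynamic page: any cache must revalidate on every request
            res.setHeader("Cache-Control", "private, no-cache");
          }
          res.end("...");
        }).listen(8080);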

    • Judging from most of Google's pages, they're more likely to ask "static pages? What are those, and who still uses them?"

      I recently discovered that Google actually has a subdomain for their static content (static.google.com, I believe), since they use so little of it. Somehow, I think Google is probably expecting most pages to be non-static.

      • Google uses a subdomain because they share so much static content across their subdomains (mail... plus... docs... etc etc), so sharing the static content speeds up those subdomains due to client side caching.

        • Actually, any site benefits from using a separate domain - browsers limit the number of connections per domain, so by using two you can speed up the site considerably.

          http://code.google.com/speed/page-speed/docs/rtt.html#ParallelizeDownloads [google.com]
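
          Roughly like this (illustrative only; img1/img2.example.com are made-up hostnames): shard asset URLs over a couple of hosts, hashing on the path so each asset always maps to the same host and the browser cache still works:

            const shards = ["img1.example.com", "img2.example.com"];

            function assetUrl(path: string): string {
              // cheap string hash so e.g. "/css/site.css" always lands on the same shard
              let h = 0;
              for (const ch of path) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
              return "https://" + shards[h % shards.length] + path;
            }

            // Different assets spread across both hosts, so the browser can open
            // twice as many parallel connections.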

          • by Beuno ( 740018 )

            After spending a lot of time benchmarking, this only holds true over HTTP. Over HTTPS, the overhead of the SSL negotiation kills what you gain very quickly.

          • Serving static content from a subdomain or just another domain (e.g., Facebook's fbcdn.net) can also improve load times because the browser won't have any cookies associated with that domain, and therefore won't lose time sending a pile of irrelevant cookie data along with every HTTP request.

          • by xnpu ( 963139 )

            If it's recommended to work around this browser limit, why is the limit there in the first place? What's the trade off here?

      • by jesseck ( 942036 )

        Somehow, I think Google is probably expecting most pages to be non-static.

        You may be right - but Google crawls the web daily, and would be a good judge (or at least a decent one) of which content changes, and how often.

    • What about non-static pages? Do you expect Google to magically host your entire site, in its proper environment?

      No - you send out the correct headers in response to queries about changes to the page - and if the content in your "non-static" page hasn't actually changed, you tell Google that (or hell, any client that is asking) and it serves up its cached copy. Even most dynamic pages won't change every second, so why run the page code for each request?
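
      A minimal sketch of that idea (renderPage() is a stand-in, and real code would handle ETags and second-granularity dates too): answer revalidation requests with 304 Not Modified so the cached copy keeps getting served:

        import { createServer } from "http";

        let lastChanged = new Date();              // bump this whenever the content actually changes

        function renderPage(): string {
          return "<html>...</html>";               // stand-in for the real page code
        }

        createServer((req, res) => {
          const since = req.headers["if-modified-since"];
          if (since && new Date(since).getTime() >= lastChanged.getTime()) {
            res.writeHead(304);                    // nothing new: the client/proxy keeps its copy
            res.end();
            return;
          }
          res.writeHead(200, {
            "Last-Modified": lastChanged.toUTCString(),
            "Cache-Control": "public, max-age=60",
          });
          res.end(renderPage());                   // only regenerate when something changed
        }).listen(8080);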

    • Perl and PHP, etc. still serve HTML to people.
  • by coinreturn ( 617535 ) on Thursday July 28, 2011 @07:59AM (#36906870)
    I presume they'll be inserting ads into your website!
    • by cdrudge ( 68377 )

      Half the pages already have Google ads inserted into them. They are just eliminating the additional server request...

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Thursday July 28, 2011 @08:07AM (#36906962)
      Comment removed based on user account deletion
      • by Amouth ( 879122 )

        I would say the hidden catch here is that they now know more about your site traffic than they did before.

      • Google is not offering this as a free service once it comes out of beta. The introduction page says that they intend to charge for it once it comes out of limited beta testing. Otherwise, I guess everyone would go with the cheapest possible web host and then have Google pick up the hosting slack.

    • by Anrego ( 830717 ) *

      Oh man... nostalgia flashback to the GeoCities days :D

      I remember entire sites dedicated to little bits of script you'd put in your pages to trick the various free website providers' ad-insertion code into plugging their ad code into an invisible frame or a commented-out section, or using JavaScript to remove the ad after the fact!

    • by repetty ( 260322 )

      I presume they'll be inserting ads into your website!

      How are they entitled to do that?

  • I wonder how long before this is used in elaborate spear-phishing attempts to bypass a lot of trust issues.

    "The page looks like it came from Google..."
  • by tomcatuk ( 999578 ) on Thursday July 28, 2011 @08:21AM (#36907118) Homepage
    So in 2010 they tell webmasters speed is now a ranking factor. A few months later they launch a paid-for service for webmasters to improve speed. Cynical? Me? Possibly...
  • This seems like an amazingly simple solution for the biggest bandwidth hogs on my servers: the images. But it seems like it's not set up to perform in this role satisfactorily. In the FAQs, it looks like they recompress images. I'm pretty sure I'd never want another site to monkey with my, or my clients', images. An elegant and nearly transparent way to install a CDN this may be, but unless they are willing to never ever mess with my content, I don't think this will work for me. At this point, move along, there's nothing to see here.
  • by guppysap13 ( 1225926 ) on Thursday July 28, 2011 @08:29AM (#36907202)
    Not sure if anyone has heard of it before, but Opera has had a similar feature built in for a while called Opera Turbo, which compresses pages on their servers before they are downloaded to the browser. It's also how Opera on the iPhone works, because of Apple's restrictions.
    • There's a current web site/service offering this but focused on protection: blocking SQL injection, botnet spam, etc... I can't for the life of me remember what it's called. They act as a CDN and reverse proxy too, but speeding up the sites was more of a side effect of reducing the number of queries by something like 30%.
  • Would this have Google take all the DDoS hits and not my server? Sounds good.
  • Comment removed based on user account deletion
  • ...don't expect your page to show up in search results above other sites that have signed up.
    • Have you seen any evidence that Google skews its ranking system in favor of clients that use other Google services? The most I've seen is ads being placed on top, clearly marked as ads. It is in Google's best interest to keep search excellent, regardless of any money that someone could throw at them. After all, they seem to be doing great without the need to sell out on search [google.com], don't they?
  • Akamai? Inktomi? (Score:5, Insightful)

    by IGnatius T Foobar ( 4328 ) on Thursday July 28, 2011 @09:47AM (#36908162) Homepage Journal
    Funny how all of the Slashdotters are talking about privacy issues instead of this service's potential to disrupt the paid CDN industry. I wonder what Akamai thinks about this development? Or the folks at Inktomi (now part of Yahoo, I believe) for that matter?
  • Ever since I started using Request Policy (a Firefox extension) I've noticed that several sites make requests to another domain that looks related but ends in cdn; for example, www.penny-arcade.com makes requests to pa-cdn.com, and there are many other examples of such.

    To me it sucks because if too many sites start requiring google-cdn.com I might as well stop using Request Policy - and no, I don't use google.com for my searches.

    • CDN stands for Content Distribution Network. The basic idea is that they locate a variety of servers topologically close to you so that hop count goes down (reduced latency), and potential bottlenecks or core route disruptions have little or no effect on you.

  • There have been too many dumb posts... not that that is too unusual... but really, it's not that hard:

    http://en.wikipedia.org/wiki/Content_delivery_network

    dimes

  • At what point are we going to just throw our hands up and allow Google to control every aspect of our internet experience?

    So far:

    -Dominating search
    -Branching off into the world of ISPs (with their new fiber in Kansas City)
    -DNS
    -Hosting/CDN
    -Browser
    -Social Media
    -Image hosting
    -Email
    -Chat/Video/Phone

    The way things are going, they will literally become the internet. Every single page request you make will involve Google in some way...

    As it stands, I'm pretty sure 90% of the websites I go to have at least one JS request going to Google.


  • and pricing will be “competitive”

    Indeed, I'll bet it will. Competitive with AWS? They don't say that you won't need to have a site of your own, but if they're hosting you, why would you?

    And it'll probably pay for itself, as the decrease in latency that you receive will improve your search ranking.
  • Now with the "I'm Feeling Lucky Eh" button!

  • Umm no thanks.