Google Announces Google CDN
leetrout writes "Google has introduced their Page Speed Service which 'is the latest tool in Google's arsenal to help speed up the web. When you sign up and point your site's DNS entry to Google, they'll enable the tool which will fetch your content from your servers, rewrite your webpages, and serve them up from Google's own servers around the world.'"
all your base... (Score:1)
at least when they finish taking over the world we'll be able to find things on the internet REALLY FAST.
Re:all your base... (Score:4, Funny)
Re:all your base... (Score:5, Funny)
Re: (Score:2)
I would do a WHOIS first. You can't be too careful these days.
Re: (Score:2)
T-1000 (beta)
Re: (Score:2)
If what you say is correct, then nobody will use it.
But somehow I have trouble believing that the customer would get nothing out of this. Even if it's only faster delivery to an end user , that is a very real and very tangible thing.
for end users, a tangible benefit is obvious to me (Score:2)
with metered internet caps, the speedup must be removing a lot of cruft from pages, thereby reducing the usage - maybe not as much as, say, Netflix's reductions, but certainly enough to be useful for low-limit cell data plans...
something that shaves a tenth off the load time must shave at least half that in bytes....
in my life I've authored webpages using wysiwyg editors that had an enormous amount of cruft.
Re: (Score:2)
Instead of cutting out bytes, they serve content from geographically closer servers. This allows lower latency and distributed load, which means faster page loads and better responsiveness for the end user.
And this one at least (Score:2)
also specifically optimizes the html & the images in the page, reducing the file sizes, keeping my assertion valid.
Re: (Score:2)
Well whoopee for you. I'd be more impressed if you'd written (if that's what the Americanism "authored" means) web pages that had an enormous amount of cruft, using a text editor. That would be showing serious dedication to generating cruft.
Using a cruft-generating machine to generate cruft is about as impressive as standing on a beach holding a broom and saying "I've decided to let the tide come swooshing in".
Re:all your base... (but not flash) (Score:3)
But somehow I have trouble believing that the customer would get nothing out of this. Even if it's only faster delivery to an end user , that is a very real and very tangible thing.
I just ran their test on a page from my web site. The page contains a flash photo presentation with accompanying music (still waiting for a non-Flash-based tool of comparable features and ease of use; nothing even remotely close exists). According to webpagetest.org the original page loaded in 2.4 seconds, while Google's "optimized" version took 21.3 seconds. Neither of them actually loaded the Flash presentation properly. Is this because Google dislikes Flash or is it a problem with webpagetest.org?
Re: (Score:2)
If the flash file loads just fine when you visit the page you tested, then there's obviously a problem with webpagetest.
Re: (Score:3)
I'd be okay with giving anyone who cares to ask a comprehensive list of my interests. Unfortunately for Google and advertisers, they only get my interests, not the ad revenue.
Apple is the one who has the little padded cells. Google have improved the internet, and probably even the whole tech industry the most out of any company in the last decade. There are a whole lot of benefits that have come our way - easily the best search since 1998, gigabytes of free inbox space, free online office suite capa
Re: (Score:2)
Microsoft have given us.. well, I can't think of anything to be honest.
How about the most popular operating system in the world? The most popular office suite in the world? A PC UI that doesn't suck ass (yes, I'm implying that OS X is ugly and unusable)? The best OS for PDAs (by today's standard, Windows Mobile sucks, but it was the best back in the day). In later years, they revolutionized console gaming with the Xbox and the first major online console gaming service. And now we have Windows Phone 7 and the Metro UI which is basically an orgasm shooting directly into your opt
Re: (Score:2)
Windows and Office were around before 1998.
Actually, I'll agree that Windows Mobile was the best for a while, I used to use it long before iOS came out, and I stuck with it until Android came along. They just let it stagnate though because that's what MS do when they're the only game in town.
Xbox may have done online gaming, but I hardly consider that "revolutionary" considering I'd already been gaming online for years before it. Evolutionary perhaps, but not revolutionary. Consoles such as the Dreamcast an
Re: (Score:2)
Windows and Office were around before 1998.
What makes 1998 relevant?
Actually, I'll agree that Windows Mobile was the best for a while, I used to use it long before iOS came out, and I stuck with it until Android came along. They just let it stagnate though because that's what MS do when they're the only game in town.
Same, and true. My last phone before my Droid X was a Samsung Omnia. WM6.1 was really showing its age at that point and I was ready to chuck my phone at the wall on several occasions.
Xbox may have done online gaming, but I hardly consider that "revolutionary" considering I'd already been gaming online for years before it. Evolutionary perhaps, but not revolutionary. Consoles such as the Dreamcast and PS2 had online gaming too.
Well perhaps "revolutionary" is too markety and buzzwordy but I was unaware of Dreamcast's online capabilities, and the PS2 required an additional modem part and not many games actually supported real online gameplay. Xbox Live was revolutionary in that it provided a framework that any game could use to
Re: (Score:2)
Well, I was talking about 1998, maybe I didn't make it all that obvious though:
There are a whole lot of benefits that have come our way - easily the best search since 1998
1998 was when Google came out, and my point was that since then they've done a lot more for the tech world than Apple and MS. 1998 also happens to be the year when I switched to using PCs rather than Macs and Amigas.
Since the desktop space is so established, "innovation" is not enough to switch people over - unless they are not tied down in any way by established Windows only applications. But basically hardly anyone is like that. Hom
Re: (Score:3)
You say the benefits only go one way, but aren't the users receiving more and more free services?
If you don't like the trade-off of seeing ads for those free services, you don't use them. How is this arrangement deceptive or evil? Just because they're big doesn't mean they're some nasty conspiracy.
Re: (Score:3)
Re: (Score:3)
Except in the areas that putting data together enables humans to do more with the data.
And Google has been pretty good about trying to make data more accessible to everyone on the planet. Again, not very evil.
Unless you refer only to your private data, and again Google is one of the rare companies that doesn't hand private data out to anyone. AOL, Yahoo, Microsoft, Facebook, etc. do hand your private data out to other people. When Google makes inroads into their markets, they're actually taking market share
Re: (Score:2)
Well, the problem is that even if you choose not to use their services, others will
So you don't get to feel part of the IN crowd?
Re: (Score:2)
actually it does.
Re: (Score:2)
yesterday we read about Akamai [slashdot.org], apparently the origin of 15-30% of web traffic. Google's service seems to be similar to Akamai's offering, but free of cost.
Tomorrow Akamai, the day after tomorrow the world?
Re: (Score:2)
For now...
Google says that Page Speed Service will be offered for free to a limited set of testers right now. Eventually, they will charge for it, and pricing will be “competitive”.
Also, are there any sites left that are static? Could maybe be useful if you get a lot of traffic and separate out static stuff (images, scripts, CSS, whatever) and dynamic stuff into two domains... but for most of the internet?
Re: (Score:2)
Most of the content I see is quasi-static. The actual page content does not change often enough to warrant a complete page regeneration on each request. Complete page generation for each request is really only justified if there are either too many pages to write them all to a static cache at the same time or if the pages are very dynamic (like very active forum threads or other dynamic data that needs to be delivered "fresh").
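The "quasi-static" idea above can be sketched in a few lines: regenerate a page only when its cached copy has gone stale, instead of on every request. This is a minimal illustration, not any real framework; the names `render_page` and `PageCache` are invented for the example.

```python
import time

def render_page(path):
    # Stand-in for an expensive template/database render.
    return f"<html><body>content for {path}</body></html>"

class PageCache:
    """Serve a cached copy while it is fresh; re-render only when stale."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}   # path -> (timestamp, html)
        self.renders = 0  # how many times we actually ran the page code

    def get(self, path, now=None):
        now = time.time() if now is None else now
        hit = self.store.get(path)
        if hit and now - hit[0] < self.ttl:
            return hit[1]             # fresh enough: serve cached copy
        html = render_page(path)      # stale or missing: regenerate
        self.renders += 1
        self.store[path] = (now, html)
        return html
```

With a 60-second TTL, a burst of requests to the same page runs the page code once, not once per request, which is exactly the saving the comment describes.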
Static pages on an online shopping site (Score:2)
Complete page generation for each request is really only justified if there are either too many pages to write them all to a static cache at the same time
Would an online shopping site with 80,000 products count? The product photos probably would though.
or if the pages are very dynamic (like very active forum threads or other dynamic data that needs to be delivered "fresh").
Would an online shopping site that displays which products a given user has recently viewed in this shopping session and what items are in the shopping cart count? Or possibly each page of product search results.
Re: (Score:2)
I'd say that traffic is more static than ever. With more and more websites using JS to update the content - instead of doing a full page request - the amount of dynamic data is very small.
In fact, with proper caching (and HTML5 has some nice features for that), current webpages can actually be faster on dialup, by transferring 2KB of a JSON list instead of a 300KB HTML page for each update.
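The size argument above is easy to demonstrate: shipping only the changed data as JSON is much smaller than re-sending a whole HTML page around it. The page markup below is invented purely for the comparison.

```python
import json

rows = [{"id": i, "title": f"Item {i}"} for i in range(20)]

# Full-page update: boilerplate plus markup wrapped around every row.
html_page = "<html><head><!-- CSS/JS references, nav, etc. --></head><body>"
html_page += "".join(
    f'<div class="row" id="row-{r["id"]}"><h2>{r["title"]}</h2></div>'
    for r in rows
)
html_page += "</body></html>"

# Incremental update: just the data, rendered client-side by JS.
json_update = json.dumps(rows)

print(len(html_page), len(json_update))
```

The gap only widens on real pages, where the HTML boilerplate is hundreds of kilobytes rather than one line.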
Re: (Score:2)
It is not free of cost; it is free for testing, and prices will be announced later.
CloudFlare for me (Score:2)
This is why I am using CloudFlare.
Somehow I trust them more than Google.
Re: (Score:2)
Can you quantify this? What metrics can one use to decide what CDN to use?
Re: (Score:2)
Google's blatant disregard to privacy and the ubiquitous tracking.
CloudFlare is a relative newcomer that seems to be about security.
Hm.
Re: (Score:2)
...Ok. How about their actual service? Their prices, how well they actually do the things they claim to be able to do, that sort of thing.
Increased privacy would be nice, but first and foremost, I have to go with the solution that works.
Re: (Score:2)
yesterday we read about Akamai, apparently the origin of 15-30% of web traffic. Google's service seems to be similar to Akamai's offering, but free of cost.
Tomorrow Akamai, the day after tomorrow the world?
My thoughts exactly. Akamai will be pretty threatened by this, but I'm not sure what they can do about it other than offer superior service.
I wonder if Google will try to buy them out, though -- Akamai has lost about half of its stock value over the past three quarters for some reason.
Such a move would definitely cause alarm, though. I personally would not feel comfortable concentrating so much of the internet in one company. Single points of failure are bad.
Re: (Score:3)
Re: (Score:2)
Yeah...
Google gets free hosting / bandwidth from most of the ISPs in the world because they want to reduce their bandwidth bill. Google container? What did you think they did with that tech?
95% of people watch 5% of the content... THAT will come from the container....
So if Google hosts your data there, there's a good possibility it will be just as close as, if not closer than, Akamai. With the increases in CPU / HDD space, why not?
But what about non-static pages? (Score:5, Insightful)
So, it rewrites my HTML, but what about my PHP (Perl, Python, your_scripting_language_here)?
Re: (Score:2)
Largely my first thought. Not much of the web is static these days. Most people who just want "a basic page with some info on it" unfortunately just use facebook now. Even really simple pages tend to have _some_ dynamic widget on them that relies on server side activity.
May be useful if one separated out static and dynamic content into separate domains... but for anything short of large scale this is a hassle.
Re:But what about non-static pages? (Score:5, Insightful)
Not much of the web is static these days.
Actually, almost all of it is, still.
Images: static. Videos: static. Big blob of CSS downloaded with the page: static. Big blob of JavaScript downloaded by the page: also static. Sure, there is some non-static HTML, but the job of that is to arrange a bunch of much larger static objects on a page.
Re: (Score:2)
Like any other proxy, no doubt it will heed the cache-control HTTP headers.
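What "heeding" Cache-Control might look like, very roughly: parse the header's directives and decide whether a shared proxy may cache the response and for how long. This sketch covers only the directives shown; real proxies implement far more of RFC 7234.

```python
def cache_policy(cache_control_header):
    """Return (cacheable_by_shared_proxy, ttl_seconds) for a Cache-Control value."""
    directives = {}
    for part in cache_control_header.split(","):
        part = part.strip().lower()
        if "=" in part:
            key, _, value = part.partition("=")
            directives[key] = value
        elif part:
            directives[part] = True

    # no-store and private forbid caching in a shared cache.
    if "no-store" in directives or "private" in directives:
        return (False, 0)

    # s-maxage overrides max-age for shared caches.
    ttl = int(directives.get("s-maxage", directives.get("max-age", 0)))
    return (ttl > 0, ttl)
```

So a site can keep its dynamic pages out of Google's cache with `Cache-Control: private` while letting static assets be cached for hours.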
Re: (Score:2)
heed [thefreedictionary.com]
Re: (Score:2)
Judging from most of Google's pages, they're more likely to ask "static pages? What're those and who still uses them?"
I recently discovered that Google actually has a subdomain for their static content (static.google.com, I believe), since they use so little of it. Somehow, I think Google is probably expecting most pages to be non-static.
Re: (Score:2)
Google uses a subdomain because they share so much static content across their subdomains (mail... plus... docs... etc etc), so sharing the static content speeds up those subdomains due to client side caching.
Re: (Score:3)
Actually, any site benefits from using a separate domain - browsers limit the number of connections per domain, so by using two you can speed up the site considerably.
http://code.google.com/speed/page-speed/docs/rtt.html#ParallelizeDownloads [google.com]
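A back-of-the-envelope model of why the second hostname helps: browsers cap simultaneous connections per hostname, so assets download in "waves" limited by total parallelism, and a second hostname roughly doubles that parallelism. All the numbers here are illustrative, not measurements.

```python
import math

def load_time(num_assets, per_asset_secs, conns_per_host, num_hosts):
    """Idealized load time: assets download in waves of parallel connections."""
    parallel = conns_per_host * num_hosts
    waves = math.ceil(num_assets / parallel)
    return waves * per_asset_secs

# 24 assets at 0.2 s each, with a typical 6-connections-per-host limit:
one_host = load_time(24, 0.2, conns_per_host=6, num_hosts=1)   # 4 waves
two_hosts = load_time(24, 0.2, conns_per_host=6, num_hosts=2)  # 2 waves
print(one_host, two_hosts)
```

The model ignores DNS lookups, TCP/TLS setup, and bandwidth sharing, which is exactly why (as a later reply notes) the benefit can vanish over HTTPS.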
Re: (Score:2)
After spending a lot of time benchmarking, this only holds true on http. On https, the overhead of the SSL negotiation kills what you gain very quickly.
Re: (Score:3)
Serving static content from a subdomain or just another domain (e.g., Facebook's fbcdn.net) can also improve the load times because the browser won't have any cookies associated with that domain, and therefore won't lose time sending a pile of irrelevant content along with every HTTP request.
Re: (Score:2)
If it's recommended to work around this browser limit, why is the limit there in the first place? What's the trade off here?
Re: (Score:2)
Read the link, it explains it better than I could.
Re: (Score:2)
Somehow, I think Google is probably expecting most pages to be non-static.
You may be right- but Google crawls the web daily, and would be a good judge (or at least a decent one) of which content changes, and how often.
Re: (Score:2)
What about non-static pages? Do you expect Google to magically host your entire site, in its proper environment?
No - you send out the correct headers in response to queries about changes to the page - and if the content in your "non-static" page hasn't actually changed, you tell Google that (or hell, any client that is asking) and it serves up its cached copy. Even most dynamic pages won't change every second, so why run the page code for each request?
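The revalidation dance described above can be sketched with ETags: the client (or a proxy in front of you) sends `If-None-Match`, and the origin answers 304 Not Modified instead of re-sending the body when nothing changed. This is a toy model of an origin server's logic, not any particular framework.

```python
import hashlib

def handle_request(current_body, if_none_match=None):
    """Return (status, etag, body) for a conditional GET against current content."""
    etag = hashlib.sha1(current_body.encode()).hexdigest()
    if if_none_match == etag:
        return (304, etag, "")           # cached copy is still good: no body
    return (200, etag, current_body)     # content changed (or first request)

# First request: full 200 response with an ETag.
status, etag, body = handle_request("<html>v1</html>")
# Revalidation with the matching ETag: empty 304, no page code re-run needed.
status2, _, body2 = handle_request("<html>v1</html>", if_none_match=etag)
```

Note the hash here is computed from the rendered body for simplicity; a real origin would track a version number or timestamp precisely so it can skip the render entirely.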
Re: (Score:2)
Holy shit (Score:2)
Holy shit, 6 out of 7 respondents to the GP (all but anredo) completely missed the point. [insert standard complaint about slashdot going downhill].
Web pages with script are not static, and caching the HTML script output does nothing. Server-side code generally has to be run per-visitor. Akamai has all sorts of crazy custom XML to specify which portions are static.
Setting up a proper CDN for the modern web is more complicated than just redirecting some DNS entries.
Re:Holy shit (Score:5, Insightful)
Holy shit, 6 out of 7 respondents to the GP (all but anredo) completely missed the point. [insert standard complaint about slashdot going downhill].
Web pages with script are not static, and caching the HTML script output does nothing. Server-side code generally has to be run per-visitor. Akamai has all sorts of crazy custom XML to specify which portions are static.
Setting up a proper CDN for the modern web is more complicated than just redirecting some DNS entries.
LOL. Talk about the pot calling the kettle black. This is what happens when you read the Slashdot summary instead of the source material. Allow me to explain what you are missing - what Google is doing is not a CDN at all; it's just a bad summary. They are providing an optimizing proxy - it couldn't care less whether your content is static or dynamic; as long as it generates HTML output, it will work. It is unclear at first glance whether the proxy is a caching proxy - I would guess it is - but even then it would be a stretch to call it a CDN in the modern sense of the word.
Re: (Score:2)
Does it sanitize input?
How does it handle dynamic code? E.g., when does it know to send you a cached copy, and when to send you a fresh one?
This sounds similar to a start-up I heard about a while ago, can't remember the name though. Either way, if they implement this well, this could be an awesome service.
Re: (Score:2)
Re: (Score:2)
Yeah, that sounds like it!
Can't remember the name though.
That's CloudFlare (Score:2)
Re: (Score:2)
Some of the things on your list can't be done automatically without possibly causing problems.
You can't rewrite inefficient selectors or remove CSS rules because classes can be applied and DOM elements inserted dynamically in JavaScript. Unless you're going to do fairly complex analysis of the JavaScript on the page to determine that it's not changing classes on DOM elements or adding/removing to/from the DOM.
You also can't easily move JavaScript to the bottom of the page without possibly breaking things. T
Re: (Score:2)
BUT "your_scripting_language_there" only outputs HTML.
Not necessarily. If you're using a framework like ExtJS, then the server most likely dishes up JSON or XML, which is parsed by the JavaScript in the browser, and the DOM is manipulated directly. I can't see Google doing anything to speed that up.
And insert ads (Score:3)
Re: (Score:2)
Half the pages already have google ads inserted into them. They are just eliminating the additional server request...
Comment removed (Score:5, Insightful)
Re: (Score:2)
I would say the hidden catch here is that they now know more about your site traffic than they did before.
Re: (Score:3)
Google is not offering this as a free service once this comes out of beta. The introduction page says that they intend to charge for it once it comes out of limited beta testing. Otherwise, I guess everyone would go with the cheapest possible webhost then have Google pick up the hosting slack.
Re: (Score:3)
Oh man.. nostalgia flashback to the geocities days :D
I remember entire sites dedicated to little bits of script you'd put in your pages to trick the various free website providers' "ad insertion code" into plugging their ad code into an invisible frame or commented-out section, or that used JavaScript to remove the ad after the fact!
Re: (Score:2)
I presume they'll be inserting ads into your website!
How are you entitled to do that?
Phishing/Ads nightmare? (Score:2)
"The page looks like it came from Google..."
No surprises here really (Score:5, Interesting)
Re:No surprises here really (Score:4, Insightful)
Speed should be a ranking factor. They still need to demonstrate better latency than competing CDNs if they want my business.
Re: (Score:2)
That has been known to affect Google ranking for a long time - any competent webmaster should know that.
CDN for images only? (Score:2)
Lossless recompression (Score:2)
So, perhaps I missed something: Where does it say that the compression will be lossless?
There are tools to convert still GIF images to indexed PNG images, which are smaller except in a tiny minority of the smallest images. There are tools such as pngout, OptiPNG, and pngcrush that losslessly recompress the image data in PNG files and strip non-essential metadata. There are also tools like jpegtran that recompress JPEG files by trimming out Exif metadata and thumbnails, making the Huffman (entropy coding) tables more efficient, and deciding which "progressive" coefficient order produces the sm
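The core idea behind those tools, stripped to its essence, is that "lossless recompression" just means spending more compressor effort on the same bytes. PNG's image data is a DEFLATE stream, which is what pngcrush and OptiPNG re-pack; zlib makes the effort/size trade-off easy to see on toy data (the sample string here is arbitrary).

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 200

fast = zlib.compress(data, level=1)  # minimal effort
best = zlib.compress(data, level=9)  # maximum effort, same underlying format

# Both round-trip to identical bytes: the recompression is lossless.
assert zlib.decompress(fast) == data
assert zlib.decompress(best) == data
print(len(fast), len(best))
```

Real PNG optimizers go further by also trying different filter strategies and stripping ancillary chunks, but the invariant is the same: the decoded pixels never change.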
Opera...again (Score:3)
Re: (Score:3)
Re: (Score:2)
On average, a website on CloudFlare ...
... loads twice as fast
... uses 60% less bandwidth
... has 65% fewer requests
... is way more secure
Re: (Score:2)
Re: (Score:2)
No. Not even close.
Knee-jerk much? At least spend a second Googling something you know nothing about before commenting.
This is EXACTLY like Opera Turbo - which is an optimizing proxy server - the only key difference is that Google's service is browser agnostic and Opera's designed to work with Opera browsers only.
-Em
Re: (Score:2)
I'd have to dig into Opera Turbo a bit more but I'm skeptical that Opera redirects all traffic through their server first so it can be compressed. That makes no sense. The extra time involved with that extra layer would wipe out any benefit of having it compressed in the first place.
That's precisely what Turbo does, but you miss the point of it. It's not so that you can squeeze an extra 50ms on your megabit pipe. It's for when you're on GPRS or dial-up, or other kind of connectivity that is bandwidth-limited.
DDOS protection? (Score:2)
Re: (Score:2)
Only if they're targeting you by your DNS name and not your IP address.
Re: (Score:2)
And if you don't sign up... (Score:2)
Re: (Score:2)
Re: (Score:2)
Google takes into account load times when ranking pages. If your site is served from the same datacenter then you'll have a better load time and therefore a better PageRank at the expense of everyone who doesn't pay for Google CDN. So, yes, this is anti-competitive.
I think it's highly unlikely that Google will fail to account for the same-data-center latency reduction.
The reason Google boosts the ranking of fast sites is because they want to provide a better experience for users (so users will keep coming back, so Google can keep showing ads to them). It does no good to favor the server in the other side of the building over one somewhere else if the Google server isn't also faster for the end user. I'm sure they have some clever ways of estimating the performance
Akamai? Inktomi? (Score:5, Insightful)
What is a CDN? (Score:2)
Ever since I started using Request Policy (a Firefox extension) I've noticed that several sites make requests to another domain that looks related but ends in cdn. For example, www.penny-arcade.com makes requests to pa-cdn.com, and there are many other examples of such.
To me it sucks because if too many sites start requiring google-cdn.com I might as well stop using Request Policy, and no I don't use google.com for my searches.
Re: (Score:2)
CDN stands for Content Delivery Network. The basic idea is that they locate a variety of servers topologically close to you so that hop count goes down (reduced latency), and potential bottlenecks or core route disruptions have little or no effect on you.
Before you post, please read about how CDNs work (Score:2)
There have been too many dumb posts... not that that is too unusual... but really it's not that hard:
http://en.wikipedia.org/wiki/Content_delivery_network
dimes
So.... (Score:2)
At what point are we going to just throw our hands up and allow google to control every aspect of our internet experience?
So far:
-Dominating search
-Branching off into the world of ISPs (with their new fiber in Ohio)
-DNS
-Hosting/CDN
-Browser
-Social Media
-Image hosting
-Email
-Chat/Video/Phone
The way things are going, they will literally become the internet. Every single page request you make will involve google in some way...
As it stands, I'm pretty sure 90% of the websites I go to have at least one js req
As an ex-Akamaiite, I can only say (Score:2)
nothing.
"Competitive pricing" (Score:2)
and pricing will be "competitive"
Indeed, I'll bet it will. Competitive with AWS? They don't say that you won't need to have a site of your own, but if they're hosting you, why would you?
And it'll probably pay for itself, as the decrease in latency that you receive will improve your search ranking.
GOOGLE CANADA WOOT! (Score:2)
Now with the "I'm Feeling Lucky Eh" button!
"rewrite your webpages" (Score:2)
Umm no thanks.
Re: (Score:2)
How do you expect a 3rd party without your TLS private key to proxy AND compress (i.e. modifying the content) your HTTPS connections?
Re: (Score:2)
How do you expect a 3rd party without your TLS private key to proxy AND compress (i.e. modifying the content) your HTTPS connections?
Its Google - we expect magic. Damn the common sense!!!
Seriously though - they could support it by you providing them with a key/cert - just like any other HTTPS proxy. The issue is that the way GHS works, it is very difficult, if not impossible, to support SSL. They would have to have a separate dedicated IP for each site (i.e. not just assigning your DNS to ghs.google.com) or a very large, very convoluted, ever-changing certificate with a LOT of aliases. This is why GHS never supported SSL - even for cont
Re: (Score:2)
Simple. They hire Bruce Schneier and he arranges for man-in-the-middle attacks on all your traffic. Google can then also insert some ads into the encrypted stream as your users surf your site, thus keeping the service free for the rest of us :)
Re: (Score:2)
Re: (Score:2)
It's just the next Wave of features from their Labs.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The government pretty much gets to bully around any webhost or registrar that it wants, be it Google or GoDaddy or BlueHost. But if Google refuses to publish your content on their CDN, you can simply redirect your DNS back to your original host, which needs to be maintained anyway because that's how Google would get updates. I wouldn't trust anyone to host my objectionable content without a backup. Look what happened to Wikileaks and the Amazon debacle.