Google Shares Insights On Accelerating Web Sites
miller60 writes "The average web page takes 4.9 seconds to load and includes 320 KB of content, according to Google executive Urs Holzle. In his keynote at the O'Reilly Velocity conference on web performance, Holzle said that competition from Chrome has made Internet Explorer and Firefox faster. He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place, and cited compressing headers as a powerful performance booster. Holzle also noted that Google's ranking algorithm now includes a penalty for sites that load too slowly."
google-analytics.com ? (Score:5, Insightful)
I saw my browser waiting on google-analytics.com quite often before I started using NoScript.
Why do sites put up with an ad server/analytics service that slows down a site by a large amount?
Re:google-analytics.com ? (Score:5, Insightful)
Because it's valuable data, and Google is the only game in town. You can see which keywords are converting, and for what dollar amount, and which keywords are money pits. Yes, it will occasionally hang, but you should look at the data it produces before saying it's not worth it.
Re:java sites screwed (Score:3, Insightful)
This really raises the question of what it tries to load. If it simply loads the HTML, then JavaScript-laden sites and Flash sites will have the edge over simple information sites that serve dynamic content. However, if it loads all referenced content, then the reverse may be true.
I would like it if the latter were true. What could be better than every Flash site being seen as a large bundle of data that simply displays "This site requires Flash"? When I surf the web, I surf for content, not pretty pictures. In my opinion, if a site can't simultaneously be surfed in Lynx, read in Braille, and parsed with a spider, then it really isn't a web site.
How fast? (Score:5, Insightful)
Google's ranking algorithm now includes a penalty for sites that load too slowly.
I'm not sure how I feel about this. My initial response was a happy one, but the more I think about it, the more it seems to be unnecessarily discriminating against those who are too far away from the bleeding edge. Do we really live in a world where 'Speed=Good' so completely that we need to penalize those who don't run fast enough? And where are we drawing the line between 'fast' and 'slow'?
Re:Ajax Libraries (Score:4, Insightful)
I'm OK with users having to pull even 100kb one time to have a nicer browsing experience all around.
I really wish I could get over my paranoia and link to the libraries on Google's code CDN. Slim chance, but if they go down while my sites are still up, there will be problems!
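For what it's worth, the usual hedge against a CDN outage is a local fallback: request the library from the CDN first, then test whether its global appeared and write a script tag for a local copy if it didn't. A sketch of the pattern (the local path `/js/jquery.min.js` is a made-up example; point it at wherever you actually host a copy):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined;
  // fall back to the locally hosted copy.
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
```

This way your site keeps working if the CDN is unreachable, while still getting the caching benefit when it isn't.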
Re:Ajax Libraries (Score:2, Insightful)
Frameworks are great but they are also overused.
jQuery is fantastic if you're doing a big site that you want to feel like an app, but many people load jQuery just to do an image fade or animation: stuff that you can easily code yourself. [richnetapps.com]
To add insult to injury, sites made with Joomla, WP, Drupal, etc. often rely on plugins, which use their own libraries. The end result is a site that loads jQuery, MooTools and script.aculo.us just to do some trivial effects that could be achieved just as well with document.getElementById(), setTimeout() and the element.style property.
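Those three primitives really are enough for a simple effect. A minimal sketch of a framework-free fade-out (the element ID in the usage comment is hypothetical):

```javascript
// Compute the opacity for a given animation step (pure, easy to test).
function opacityAt(step, totalSteps) {
  return Math.max(0, 1 - step / totalSteps);
}

// Fade an element out using only setTimeout and element.style —
// no library required. Hides the element once fully transparent.
function fadeOut(element, durationMs) {
  var steps = 20;
  var step = 0;
  (function tick() {
    step++;
    element.style.opacity = String(opacityAt(step, steps));
    if (step < steps) {
      setTimeout(tick, durationMs / steps);
    } else {
      element.style.display = 'none';
    }
  })();
}

// In a browser: fadeOut(document.getElementById('banner'), 400);
```

A couple of dozen lines like this versus ~70 KB of framework is exactly the trade-off the parent is talking about.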
Penalty for speed (Score:3, Insightful)
Re:Penalty for speed (Score:1, Insightful)
And the most reputable/best sites will not be filled with crappy flash adverts and affiliate marketing crapola.
I think that's what this is aimed at. Remember, it's not the only factor, so a slow but really popular/reputable site will still be high up, while an unpopular site peppered with keywords and Flash/JavaScript advertising shite will rank low.
I'm pretty sure the Google search engineers thought of this (and many other issues) when designing it. I'm fairly certain they didn't just stick in an "if(size>200kB){rank-=10}".
Re:Noscript (Score:5, Insightful)
That's what noscript is for. With noscript, your browser doesn't even download the .js files.
That's fine and dandy. IF.
If you don't care to see or experience the vast majority of web sites on the Intertubes today.
Honestly, when I see (yet another) pious elitist bleating about no-script or whatever, I wonder: Why don't you just surf in Lynx?
If you're surfing with no-script, you're missing 75% of the Internet. If it's not the 75% you want to see and/or experience, then good for you. But bleating about the creative uses of JavaScript on the World Wide Web is old news.
Re:And how is speed relevant to the content? (Score:3, Insightful)
Speed is relevant because crap mirror sites should be ranked lower than the originating site. [Or even vice versa, faster mirrors should be preferred over the original source]
You seem to be under the delusion that Google is just going to delete slow sites, or return results purely on speed regardless of content. I have no idea what could lead you to think this way (well, I do: knee-jerk reaction), because as far as I can tell the most relevant site will still be preferred; only when multiple sites are at approximately the same relevance is the faster one preferred.
Sounds like an excellent idea to me, lord knows that I've been pissed off waiting 45 seconds for a page to load when the next result loads instantly with similar information.
Re:Noscript (Score:5, Insightful)
If you're surfing with no-script, you're missing 75% of the Internet.
Actually, it's more like 95%. However, you did completely miss the point. Turning off Noscript for a site you choose to bless takes two mouse clicks and a reload.
You're not missing out on what you want to see. You're missing out on all the other random shit you couldn't care less about.
what about the other browsers? (Score:3, Insightful)
Holzle said that competition from Chrome has made Internet Explorer and Firefox faster.
Bull. Back when the last major releases of IE and Firefox came out, Chrome was a tiny drop in the bucket market-share-wise. January was the first time it passed Safari in market share. I think it's more accurate to say that competition in general has led to companies improving their browsers. I'd bet we could also attribute the performance improvements to better standards compliance by websites, since there are now so many mainstream browsers.
I'd say that Firefox vs IE competition (and Firefox vs Safari on the mac) have inspired the improvements...
Re:Noscript (Score:5, Insightful)
Likewise. And if I see flash it's a damn good indication I just don't care what's on the site.
Re:Noscript (Score:3, Insightful)
Or if you actually want to be able to use your [credit card/bank/net provider] site. The problem isn't what we, the users, allow or deny--the problem is the hubris of the programmers and web designers who want to stuff in all that bloatware, just because they can.
Dear Web Site Designer:
If your page takes more than a few seconds to load, your customer has moved on.
If your search algorithm brings up even more pages when another term is added, your customer won't slog through the cruft.
If the page is flashing and singing and offering 50 different subjects in 20 colours, your customer is confused and will not select anything.
If your customer has to fight to find what they want and pay for it, they will go to a brick and mortar.
If your reader wants to share a link or open it in another tab, finding out it is a script that can only be opened where it was found is annoying and likely to gain you some hostility.
If it takes effort to read and understand what you published, they won't care how pertinent your subject is or how true your opinion is.
Oooo . . . shiiiinnnny does not a good web page make.
Re:Noscript (Score:2, Insightful)
A tenth? What Internet are you using?
By my estimation, based on the scroll-bar position after counting for a while, I've got about 250 websites listed with site-specific preferences; some of those just have plugins enabled so I can look at PDFs, but most are to enable JavaScript.
They range from a couple dozen sites that I want to watch some Flash video on (enabling plugins but leaving JS off doesn't seem to work) to online stores (NewEgg and Best Buy both need JS for nearly essential features) to two of my banks (one of which doesn't actually need it I think, but it adds a few neat features; the other I think needs it) to some discussion-centric sites (even /. by default, but also things like Blogspot if you want to add comments) to the social networking sites to sites like this [realworldhaskell.org].
I can't say for certain what percentage of the sites I visit I have whitelisted, but rarely does a day go by when I don't discover some new site to add.
Re:How fast? (Score:2, Insightful)
Yes, we should penalize them.
Imagine there are two sites with the information I need. I would much prefer to see the faster site than the slower one, because by loading the faster site I get my information faster.
If I wanted my information slowly, I would walk to the library...
Re:java sites screwed (Score:3, Insightful)
Re:google-analytics.com ? (Score:5, Insightful)
Not worth it to who? It's not worth it to me. Noscript please.
Re:Noscript (Score:4, Insightful)
You're not missing out on what you want to see. You're missing out on all the other random shit you couldn't care less about.
Blocking stuff has taught me that there are some sites I just want to avoid. When I load a site and it not only requires Javascript to work when it's not using any dynamic features, and tries to dump scripts and cookies from a dozen known spammers on me, then I know I really don't need to view that content. Who cares if it was interesting? It's not THAT interesting.
Re:Ajax Libraries (Score:3, Insightful)
It does not give them cash (Score:3, Insightful)
It doesn't give them money, Dave, if I do not click an advert (the click) and do not buy the product referenced in the advert (the impression)...
They get nothing.
Are you a content producer by any chance?
Re:Penalty for speed (Score:3, Insightful)
Seconded! Penalizing slow sites only promotes those shitty mirror sites that all seem to be on fast pipes.
It's interesting: I've started noticing these spam sites creeping higher and higher up the page rank and was wondering what new trick the ad-spammers had developed to game Google. It would just figure that it turns out to be Google shooting themselves in the foot with asinine policies like this.
Re:Ajax Libraries (Score:3, Insightful)
Clever enough, but using MD5 still runs a slight risk of collisions; of course, if you also verify that the content length is the same, you reduce the risk of collision substantially.