Google Shares Insights On Accelerating Web Sites
miller60 writes "The average web page takes 4.9 seconds to load and includes 320 KB of content, according to Google executive Urs Holzle. In his keynote at the O'Reilly Velocity conference on web performance, Holzle said that competition from Chrome has made Internet Explorer and Firefox faster. He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place, and pointed to header compression as a powerful performance booster. Holzle also noted that Google's ranking algorithm now includes a penalty for sites that load too slowly."
Noscript (Score:5, Informative)
That's what NoScript is for. With NoScript, your browser doesn't even download the .js files.
Re:Ajax Libraries (Score:2, Informative)
All modern browsers support caching, and chances are, you aren't actually downloading a brand new set of libraries each time.
Re:java sites screwed (Score:3, Informative)
There's no inherent reason that Java should be slow. I run a discussion site (linked in my sig) that's running off an all-Java codebase, and while it has occasional load issues, we can render the content for the front page of the site in 20 ms or less (it's at the bottom of the page if you are curious). Java has a proper application model, so with smart use of singletons you can effectively keep the entire working set of a forum site in memory. Our performance is much poorer if you start browsing through archives, but that makes up a tiny percentage of our page views.
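The keep-the-working-set-in-memory idea is language-agnostic; here is a minimal sketch in JavaScript (all names hypothetical) of the singleton-cache pattern, where the database is only touched on a miss:

```javascript
// A long-lived server process can hold the hot data in one shared cache,
// so rendering the front page never touches the database.
const frontPageCache = new Map();

function getThread(id, loadFromDb) {
  if (!frontPageCache.has(id)) {
    frontPageCache.set(id, loadFromDb(id)); // expensive path: cache miss only
  }
  return frontPageCache.get(id);            // cheap path: pure memory read
}

// Usage: the loader runs once; repeat hits never call it again.
let dbCalls = 0;
const load = (id) => { dbCalls++; return { id, title: 'thread ' + id }; };
getThread(42, load);
getThread(42, load);
console.log(dbCalls); // 1
```

This is also why archive browsing is slower: old threads fall outside the cached working set and take the miss path every time.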
NONONONONO (Score:3, Informative)
"He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place"
The core Internet protocols and infrastructure were, and remain, a conduit of innovation /because/ they are agnostic to HTTP and all other application protocols. Optimizing for one small subset of protocols and for a single kind of contemporary usage would discourage all kinds of innovation using protocols we've not conceived yet, and would be the single largest setback the modern Internet has seen.
Re:java sites screwed (Score:5, Informative)
Java really only has problems with startup time (which a web spider will never see) and the delay when a servlet/JSP is hit for the first time. During web development we see that startup and first-load cost most of the time, which gives an appearance of slowness, but it is much better on a production server with regular traffic.
Most delay is ad-related. (Score:5, Informative)
Most real-world page load delay today seems to be associated with advertising. Merely loading the initial content usually isn't too bad, although "content-management systems" can make it much worse, as overloaded databases struggle to "customize" the content. "Web 2.0" wasn't a win; pulling in all those big CSS and JavaScript libraries doesn't help load times.
We do some measurement in this area, as SiteTruth [sitetruth.com] reads through sites trying to find a street address on each site rated. We never read more than 21 pages from a site, and for most sites, we can find a street address within 45 seconds, following links likely to lead to contact information. Only a few percent of sites go over 45 seconds for all those pages. Excessively slow sites tried recently include "directserv.org" (a link farm full of ads), "www.w3.org" (embarrassing), and "religioustolerance.org" (an underfunded nonprofit). We're not loading images, ads, JavaScript, or CSS; that's pure page load delay. It's not that much of a problem, and we're seeing less of it than we did two years ago.
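The budget logic described above (at most 21 pages and 45 seconds per site) boils down to simple bookkeeping. A hypothetical sketch, with simulated per-page fetch times in seconds standing in for real network calls:

```javascript
// Stop crawling a site after maxPages pages or maxSeconds of cumulative
// fetch time, whichever limit is hit first.
function pagesWithinBudget(fetchDurations, maxPages = 21, maxSeconds = 45) {
  let elapsed = 0, pages = 0;
  for (const d of fetchDurations) {
    if (pages >= maxPages || elapsed + d > maxSeconds) break;
    elapsed += d;
    pages++;
  }
  return { pages, elapsed };
}

// A fast site: all 21 pages fit comfortably in the time budget.
console.log(pagesWithinBudget(Array(30).fill(1)));  // { pages: 21, elapsed: 21 }
// A slow site: the 45-second budget cuts the crawl short.
console.log(pagesWithinBudget(Array(30).fill(10))); // { pages: 4, elapsed: 40 }
```

With a per-site cap like this, one pathologically slow site can never stall the whole rating run.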
Re:Ajax Libraries (Score:3, Informative)
Frameworks are great but they are also overused.
jQuery is fantastic if you're doing a big site that you want to feel like an app, but many people load jQuery just to do an image fade or animation - stuff that you can easily code yourself. [richnetapps.com]
To add insult to injury, sites made with Joomla, WP, Drupal, etc. often rely on plugins, which use their own libraries. The end result is a site that loads jQuery, MooTools and Scriptaculous just to do some trivial effects that would be achieved just as well with document.getElementById(), setTimeout() and the element.style property.
The problem with doing things yourself instead of using a framework for common things like fades is that you have to remember how to code for each of the common browsers. The libraries hide that nastiness from you.
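For what it's worth, a linear fade really is just element.style plus setTimeout. A minimal sketch (no library; function names are made up), with the step computation kept pure so it's easy to check:

```javascript
// Compute the opacity values for a linear fade-out: 1 down to 0.
function fadeSteps(steps) {
  const out = [];
  for (let i = steps; i >= 0; i--) out.push(+(i / steps).toFixed(2));
  return out;
}

// Apply them to an element over time -- this is the whole "framework".
function fadeOut(el, steps = 10, intervalMs = 50) {
  fadeSteps(steps).forEach((opacity, idx) => {
    setTimeout(() => { el.style.opacity = opacity; }, idx * intervalMs);
  });
}

// In a page: fadeOut(document.getElementById('banner'));
console.log(fadeSteps(4)); // [ 1, 0.75, 0.5, 0.25, 0 ]
```

The grandparent's caveat still stands, though: opacity itself needed browser-specific workarounds in older IE, which is exactly the nastiness the libraries hide.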
Re:google-analytics.com ? (Score:4, Informative)
Google's own documentation [google.com] recommends using asynchronous tracking, which should cause no page slowdown; even if you use the traditional code [google.com], it should go at the end, just before the closing body tag.
If a page is loading slowly because of google-analytics, blame the web site developer.
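The asynchronous approach boils down to injecting a script element with async set, so the download never blocks rendering. A hedged sketch of that pattern, with a tiny stand-in for document so it runs outside a browser (in a real page you'd use the global document and the exact snippet from Google's linked docs):

```javascript
// Inject a <script> element with async = true; the browser fetches it
// in the background instead of blocking the parser.
function injectAsyncScript(doc, src) {
  const s = doc.createElement('script');
  s.async = true;
  s.src = src;
  doc.head.appendChild(s);
  return s;
}

// Minimal stub of "document" so the sketch is runnable anywhere:
const doc = {
  head: { children: [], appendChild(el) { this.children.push(el); } },
  createElement: (tag) => ({ tag }),
};
const tag = injectAsyncScript(doc, 'https://ssl.google-analytics.com/ga.js');
console.log(tag.async, doc.head.children.length); // true 1
```

Done this way, a slow analytics server delays the tracking beacon, not the page.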
That's not insightful (Score:5, Informative)
NoScript doesn't turn off JavaScript. Most browsers already have an option for that. What NoScript does is make the control of JavaScript (and Flash) much more fine-grained and convenient.
Some typical cases:
1. Scripts on poor web sites just serve to detract from the content. Those you simply never turn on.
2. Scripts on good web sites improve access to content. Those sites you enable permanently the first time you visit (press the NoScript button in the lower right corner and select "enable permanently") and forget about it.
3. Some web sites contain a mix of the two. Here you can either explicitly enable a specific object (by clicking on a placeholder, like with flashblock), or temporarily enable scripts for that site.
Basically, NoScript makes more, not less, of the web accessible. The good web sites you use normally will not be affected (as they all will be allowed to run scripts), but following links from social web sites like /. becomes a much more pleasant experience.
Of course, most of the noise scripts distracting from content are ads, so AdBlock gives you much of the same benefit. But I don't want to hide ads, as that is how the sites pay their bills.
Re:Measuring speed from *where* exactly? (Score:5, Informative)
From the docs:
"Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature."
http://www.google.com/support/webmasters/bin/answer.py?answer=158541&hl=en
Re:I prefer low-tech solutions... (Score:4, Informative)
If you haven't already, get a router with QoS. Next, when he's in some other room 'maximising his bandwidth', set his max connections to 30 and his upload to 1/4 of your upload speed. You might also consider Tomato or DD-WRT if you have a compatible router.
Why not block ads if you don't click? (Score:5, Informative)
If you're not going to click adverts, I'm sure it costs nobody money; it just costs them bandwidth. The ad world is mostly CPC/PPC.
Content websites seem to think that if I do not block an advert, I will actually click it. That is ridiculous!
My principle is that advertising is like a bribe: they paid to put it in my face. That is a product I have no interest in. I will learn about products when I have a need for them.
Re:I feel happier with NoScript (Score:3, Informative)
It's people using AdBlock that cause sites to have annoying adverts in the first place.
That is simply false. In fact, reality is exactly the opposite: It’s the sites having annoying adverts that cause people to use AdBlock in the first place.
Annoying advertisements (the particularly annoying blinking animated GIF ones especially) have been around at least since I first started surfing the web back in the days of Netscape Navigator 2. AdBlock was pretty much unheard of back then, which meant I had no choice but to look at Flash ads for fungal foot creams on my Hotmail account.