Google Shares Insights On Accelerating Web Sites
miller60 writes "The average web page takes 4.9 seconds to load and includes 320 KB of content, according to Google executive Urs Holzle. In his keynote at the O'Reilly Velocity conference on web performance, Holzle said that competition from Chrome has made Internet Explorer and Firefox faster. He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place, singling out header compression as a powerful performance booster. Holzle also noted that Google's ranking algorithm now includes a penalty for sites that load too slowly."
Ajax Libraries (Score:3, Interesting)
Now if only every website didn't include 300 KB of JavaScript libraries that I have to download.
Hope they don't get too trigger-happy (Score:2, Interesting)
Moore's Law (Score:1, Interesting)
Not to discount how important it is to make your website as fast as possible but...
I doubt anyone with a decent internet connection is complaining about these 320k pages. Even on a cell phone it's not a big deal. As technology moves forward and speeds improve even more, these size-related complaints will matter less and less.
Think about it: who complains about a 340k file on their hard drive anymore? I'm sure in the mid '80s lots of geeks rightfully griped about one.
Re:google-analytics.com ? (Score:1, Interesting)
I run adblock, noscript, flashblock, betterprivacy, yslow... and...I run all of that stuff at work too.
Despite that, the website I run *does* run analytics, even though it's slow. That slowness is often DNS -- no clue why; I'd have thought that server farm would stay cached everywhere on the planet. The real point is this: if your browser hangs on that load, somebody wrote the page wrong. My analytics urchin -- except on the blog (damned WordPress) -- always runs at the end, after the content has finished rendering.
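For reference, "runs at the end" just means the classic urchin snippet goes last, immediately before the closing body tag, so it can only delay things after the content has already rendered. The account ID below is a placeholder:

```html
<!-- Last thing before </body>: content has already rendered by now. -->
<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
  _uacct = "UA-XXXXXX-X";  /* placeholder account ID */
  urchinTracker();
</script>
```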
Smart webmasters understand that their webpage is not a single document that just suddenly appears.
Unfortunately, that seems to be a rare breed -- these days I have trouble finding interns or programmers who understand even the basics like minification and mod_gzip, or who are comfortable setting cache headers.
Measuring speed from *where* exactly? (Score:5, Interesting)
Where are they measuring *from*?
I've moved a site from Linode New Jersey to Linode London, UK because the target audience are in London ( http://www.lfgss.com/ [lfgss.com] ).
However, in Google Webmaster Tools the reported page load time increased, suggesting that the measurements are taken from US datacentres, even though for the target audience the page load time decreased.
I would like to see Google use the geographic target preference and have the datacentre nearest the target perform the measurement... or better still, have both a local and a remote datacentre perform every measurement and compute a weighted time between them that might reflect real-world usage.
Otherwise, if the message is that I am being penalised for not hosting close to the Google datacentre where the measurements are made, then I will end up moving there, in spite of the fact that this isn't the right thing for my users.
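The weighted scheme suggested above could be as simple as this sketch; the function name and the example weight are my own assumptions, not anything Google does:

```javascript
// Hypothetical blend of load times measured from two vantage points.
// localWeight is the fraction of the real audience near the site.
function blendedLoadTime(localMs, remoteMs, localWeight) {
  return localWeight * localMs + (1 - localWeight) * remoteMs;
}

// e.g. a London-hosted site with a mostly-UK audience:
console.log(blendedLoadTime(400, 1200, 0.8)); // → 560
```

With the weight derived from the site's geographic target preference, a London-hosted site serving Londoners would no longer be dominated by a transatlantic round trip.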
Re:I feel happier with NoScript (Score:3, Interesting)
Re:Another trick we used: (Score:1, Interesting)
Pack all design (not content) images into one big image, and use cropping to display the right part in the right place.
Is there any kind of client performance issue if, for instance, I end up with a 500k image and then reference it 50 or 60 times to build up every single component of the page (navigation button styles, hover styles, etc.)? Genuine question: I don't know whether the browser throws away the cropped part of the image, so it's manipulating tiny images, or whether it's doing manipulation on the massive image each time. (I do know that when doing simple jQuery scrolling of large images in Firefox I get all kinds of slowdown, which I think is related to moving so many large images at once.)
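For what it's worth, the sprite technique being asked about looks like this in CSS -- the selectors, file name, sizes, and offsets here are invented for illustration. In general the browser decodes the sprite sheet once and each rule just paints a different window onto that one cached bitmap, so the per-use cost is small; the main cost is holding the one decoded image in memory.

```css
/* One sprite sheet; each element shows a different 32x32 region of it.
   File name and offsets are made up for illustration. */
.icon            { background-image: url(sprite.png); width: 32px; height: 32px; }
.icon-home       { background-position: 0 0; }
.icon-mail       { background-position: -32px 0; }
.icon-mail:hover { background-position: -32px -32px; }
```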