Google Shares Insights On Accelerating Web Sites

miller60 writes "The average web page takes 4.9 seconds to load and includes 320 KB of content, according to Google executive Urs Holzle. In his keynote at the O'Reilly Velocity conference on web performance, Holzle said that competition from Chrome has made Internet Explorer and Firefox faster. He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place, and pointed to header compression as a powerful performance booster. Holzle also noted that Google's ranking algorithm now includes a penalty for sites that load too slowly."
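
A single timed fetch gives a rough feel for how any one page compares with those averages, though it misses sub-resources, DNS lookups, and rendering, so it understates a full page load. Here is a minimal sketch in Python; the URL is a placeholder and nothing below reflects Google's actual measurement method:

    # Time one HTTP fetch and compare its size and duration with the averages cited above.
    # Note: this measures a single resource, not a complete page load.
    import time
    import urllib.request

    URL = "http://www.example.com/"  # placeholder

    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=30) as resp:
        body = resp.read()
    elapsed = time.perf_counter() - start

    print(f"fetched {len(body) / 1024:.1f} KB in {elapsed:.2f} s")
    print("averages cited in the keynote: 320 KB, 4.9 s")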

  • Ajax Libraries (Score:3, Interesting)

    by 0100010001010011 ( 652467 ) on Thursday June 24, 2010 @12:17AM (#32673626)

    Now if only every website didn't include 300kb of Javascript libraries that I had to download.

  • by cameljockey91 ( 1455491 ) on Thursday June 24, 2010 @12:21AM (#32673648) Homepage
    How many times will their crawler check a slowly loading website before it penalizes it?
  • Moore's Law (Score:1, Interesting)

    by Anonymous Coward on Thursday June 24, 2010 @01:07AM (#32673926)

    Not to discount how important it is to make your website as fast as possible but...

    I doubt anyone with a decent internet connection is complaining about these 320k pages. Even on a cell phone it's not a big deal. As technology moves forward and speed improves even more these size related complaints will get less and less important.

    Think about it - who complains about a 340k file on their hard drive anymore? I'm sure in the mid '80s lots of geeks rightfully griped about files that size.

  • by Anonymous Coward on Thursday June 24, 2010 @01:39AM (#32674112)

    I run adblock, noscript, flashblock, betterprivacy, yslow... and...I run all of that stuff at work too.

    Despite that, the website I run *runs* analytics, even though it's slow. Oh yeah, that slowness... it's often DNS. No clue why; I'd think that server farm would stay cached everywhere on the planet. The deal is, if your browser hangs on that load, somebody wrote the page wrong. My analytics urchin (except on the blog, damned WordPress) always runs at the end, after the content's done rendering.

    Smart webmasters understand that their webpage is not a single document that just suddenly appears.

    Unfortunately, that seems to be a rare breed; these days I have trouble finding interns or programmers who understand even the basics like minification and mod_gzip, or who are comfortable setting cache headers.
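
    For anyone who hasn't set those basics up, here is a minimal sketch of two of the server-side pieces mentioned above, gzip compression and cache headers, using only the Python standard library as a toy stand-in for mod_gzip/mod_deflate and Apache cache directives (the port, page content, and cache lifetime are placeholders, and minification is left out):

        import gzip
        from http.server import BaseHTTPRequestHandler, HTTPServer

        PAGE = b"<html><body>" + b"hello world " * 1000 + b"</body></html>"

        class Handler(BaseHTTPRequestHandler):
            def do_GET(self):
                body = PAGE
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                # Far-future cache header so repeat visitors skip the download entirely.
                self.send_header("Cache-Control", "public, max-age=31536000")
                # Compress when the client allows it, as mod_gzip/mod_deflate would.
                if "gzip" in (self.headers.get("Accept-Encoding") or ""):
                    body = gzip.compress(body)
                    self.send_header("Content-Encoding", "gzip")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()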

  • by buro9 ( 633210 ) <david&buro9,com> on Thursday June 24, 2010 @02:14AM (#32674316) Homepage

    Where are they measuring *from*?

    I've moved a site from Linode New Jersey to Linode London, UK because the target audience are in London (http://www.lfgss.com/).

    However, in Google Webmaster Tools the page load time increased, suggesting that the measurements are being taken from US datacentres, even though for the target audience the speed increased and the page load time decreased.

    I would like to see Google use the geographic target preference and to have the nearest datacentre to the target be the one that performs the measurement... or better still to have both a local and remote datacentre perform every measurement and then find a weighted time between them that might reflect real-world usage.

    Otherwise, if the message I'm being sent is that I'm penalised for not hosting close to a Google datacentre from which the measurements are taken, then I will end up moving there, despite the fact that this isn't the right thing for my users.
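
    The weighted measurement suggested in the parent is easy to sketch; the function name, weights, and timings below are invented purely for illustration and are not anything Google has published:

        # Toy version of combining a "near the audience" measurement with a remote one.
        def weighted_load_time(local_ms, remote_ms, local_share=0.8):
            """local_share: fraction of real visitors expected to be near the site."""
            return local_share * local_ms + (1 - local_share) * remote_ms

        # e.g. a London-hosted site measured from London and from a US datacentre
        print(weighted_load_time(local_ms=900, remote_ms=2400))  # -> 1200.0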

  • by delinear ( 991444 ) on Thursday June 24, 2010 @07:30AM (#32675944)
    Wait, they put up a sign welcoming the whole world to come into their house, and then you're saying it's their moral right to complain if you don't look at the ads on the walls of their house as payment? There was no contract or agreement in place prior to my entering their house, just an open invitation. If this is a prerequisite, they should display at the very least a click-through agreement that this is the understanding.

    I say this as someone who doesn't disable ads (I do support a free web, and for me it's easy to just ignore ads; I mentally filter them out, and if the site gets some benefit from my not physically filtering them out, all power to them). But unless you make it part of an explicit contract that you will only allow free views in exchange for enabling ads, you have no right to complain when someone follows a link to your site with adblock/noscript enabled.

    If you don't like it, don't accept incoming links: set up a login system and enforce a policy that accounts will be deleted if ads are disabled, then sit back and enjoy your very quiet life on the web...
  • by Anonymous Coward on Thursday June 24, 2010 @10:22AM (#32677470)

    Pack all design (not content) pictures in one big picture, and use cropping to use the parts in the right places.

    Is there any kind of client performance issue if, for instance, I end up with a 500k image and I then call that image into the page 50 or 60 times to build up every single component of the page (navigation button styles, hover styles, etc)? Genuine question; I don't know whether the browser throws away the cropped part of the image so it's manipulating tiny images, or whether it's doing manipulation with massive images (I know that when doing simple jQuery scrolling of large images in Firefox I get all kinds of slowdown issues, which I think are related to moving so many large images at once).
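
    For the packing step described above, here is a minimal sketch that stitches a few small images into one sprite sheet and prints the CSS offsets used to crop each piece back out. It assumes the Pillow imaging library and hypothetical file names, and it doesn't settle the memory question, since how a browser handles the cropping internally is implementation-specific:

        from PIL import Image

        files = ["nav_home.png", "nav_about.png", "nav_contact.png"]  # hypothetical assets
        images = [Image.open(f) for f in files]

        # Lay the images out side by side in one sheet.
        sheet_w = sum(img.width for img in images)
        sheet_h = max(img.height for img in images)
        sheet = Image.new("RGBA", (sheet_w, sheet_h), (0, 0, 0, 0))

        x = 0
        for name, img in zip(files, images):
            sheet.paste(img, (x, 0))
            # CSS shifts the sheet left to expose the right slice, hence the negative offset.
            print(f"{name}: background: url(sprite.png) -{x}px 0;")
            x += img.width

        sheet.save("sprite.png")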
