Google Shares Insights On Accelerating Web Sites

miller60 writes "The average web page takes 4.9 seconds to load and includes 320 KB of content, according to Google executive Urs Holzle. In his keynote at the O'Reilly Velocity conference on web performance, Holzle said that competition from Chrome has made Internet Explorer and Firefox faster. He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place, and pointed to header compression as a powerful performance booster. Holzle also noted that Google's ranking algorithm now includes a penalty for sites that load too slowly."
This discussion has been archived. No new comments can be posted.


  • First Post! (Score:2, Funny)

    by Anonymous Coward

    If only Slashdot loaded faster I could have had my first post!

  • Ajax Libraries (Score:3, Interesting)

    by 0100010001010011 ( 652467 ) on Wednesday June 23, 2010 @11:17PM (#32673626)

    Now if only every website didn't include 300kb of Javascript libraries that I had to download.

    • Noscript (Score:5, Informative)

      by iYk6 ( 1425255 ) on Wednesday June 23, 2010 @11:20PM (#32673640)

      That's what noscript is for. With noscript, your browser doesn't even download the .js files.

      • Re:Noscript (Score:5, Insightful)

        by Saeed al-Sahaf ( 665390 ) on Thursday June 24, 2010 @12:54AM (#32674200) Homepage

        That's what noscript is for. With noscript, your browser doesn't even download the .js files.

        That's fine and dandy. IF.

        If you don't care to see or experience the vast majority of web sites on the Intertubes today.

        Honestly, when I see (yet another) pious elitist bleating about no-script or whatever, I wonder: Why don't you just surf in Lynx?

        If you're surfing with no-script, you're missing 75% of the Internet. If that's not the 75% you want to see and/or experience, then good for you. But bleating about the creative uses of JavaScript on the World Wide Web is old news.

        • Re:Noscript (Score:5, Informative)

          by Dylan16807 ( 1539195 ) on Thursday June 24, 2010 @12:59AM (#32674238)
          75%? When I turn off javascript it only seems to affect about a tenth of the sites I visit.
          • Re:Noscript (Score:5, Insightful)

            by calmofthestorm ( 1344385 ) on Thursday June 24, 2010 @01:15AM (#32674320)

            Likewise. And if I see flash it's a damn good indication I just don't care what's on the site.

            • And if I see flash it's a damn good indication I just don't care what's on the site.

              Except for games...

            • Re: (Score:2, Funny)

              by Anonymous Coward

              Hey well looky here, we have 2 gramps in our midst.

              For that, not only am i going to stand on your lawn, i'm going to rip out grass in the shape of a stencil of goatse.

            • by Kozz ( 7764 )

              Likewise. And if I see flash it's a damn good indication I just don't care what's on the site.

              Oh, yeah? I'll see your smug and raise you a get off my lawn.

          • Re: (Score:2, Insightful)

            by EvanED ( 569694 )

            A tenth? What Internet are you using?

            By my estimation, based on the scroll bar position after counting for a while, I've got about 250 websites listed with site-specific preferences; some of those will just have plugins enabled so I can look at PDFs, but most are to enable JavaScript.

            They range from a couple dozen sites that I want to watch some Flash video on (enabling plugins but leaving JS off doesn't seem to work) to online stores (NewEgg and Best Buy both need JS for nearly essential features) to two of

        • Re:Noscript (Score:5, Insightful)

          by Jurily ( 900488 ) <jurily AT gmail DOT com> on Thursday June 24, 2010 @01:02AM (#32674262)

          If you're surfing with no-script, you're missing 75% of the Internet.

          Actually, it's more like 95%. However, you did completely miss the point. Turning off Noscript for a site you choose to bless takes two mouse clicks and a reload.

          You're not missing out on what you want to see. You're missing out on all the other random shit you couldn't care less about.

          • Also knock it up a notch with View -> Page Style -> No Style.

            This works really well for sites that put stories into tiny columns or use unreadable fonts.

          • Exactly. I also turn off pictures, CSS and that pointless HTML rendering. What’s the reason for that anyway?
            I just miss out on all that random shit that I couldn’t care less about. :P

          • Re:Noscript (Score:4, Insightful)

            by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday June 24, 2010 @07:14AM (#32676202) Homepage Journal

            You're not missing out on what you want to see. You're missing out on all the other random shit you couldn't care less about.

            Blocking stuff has taught me that there are some sites I just want to avoid. When I load a site and it not only requires Javascript to work despite not using any dynamic features, but also tries to dump scripts and cookies from a dozen known spammers on me, then I know I really don't need to view that content. Who cares if it was interesting? It's not THAT interesting.

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          Or if you actually want to be able to use your [credit card/bank/net provider] site. The problem isn't what we, the users, allow or deny--the problem is the hubris of the programmers and web designers who want to stuff in all that bloatware, just because they can.

          Dear Web Site Designer:

          If your page takes more than a few seconds to load, your customer has moved on.

          If your search algorithm brings up even more pages when another term is added, your customer won't slog through the cruft.

          If the page is flashing

        • I have noscript on and yes, it affects a lot of sites, but if I want I can always enable the scripts for a particular site, and it still helps not having them on by default.

        • by Per Abrahamsen ( 1397 ) on Thursday June 24, 2010 @01:43AM (#32674454) Homepage

          Noscript doesn't turn off Javascript. Most browsers already have an option for that. What Noscript does is to make the control of Javascript (and Flash) much more fine grained and convenient.

          Some typical cases:

          1. Scripts on poor web sites just serve to detract from the content. Those you simply never turn on.

          2. Scripts on good web sites improve access to content. Those sites you enable permanently the first time you visit (press the NoScript button in the lower right corner and select "enable permanently") and forget about it.

          3. Some web sites contain a mix of the two. Here you can either explicitly enable a specific object (by clicking on a placeholder, like with flashblock), or temporarily enable scripts for that site.

          Basically, Noscript makes more, not less, of the web accessible. The good web sites you use normally will not be affected (as they will all be allowed to run scripts). But following links from social web sites like /. becomes a much more pleasant experience.

          Of course, most of the noise scripts distracting from content are ads, so AdBlock gives you much of the same benefit. But I don't want to hide ads, as that is how the sites pay their bills.

          • by improfane ( 855034 ) on Thursday June 24, 2010 @04:29AM (#32675336) Journal

            If you're not going to be clicking adverts, I am sure it costs nobody money. It just costs them bandwidth. The adworld is mostly CPC/PPC.

            Content websites seem to think that if I do not block an advert, I will actually click it. That is ridiculous!

            My principle is that advertising is like a bribe: they paid to put it in my face. That is a product I have no interest in. I will learn about products when I have a need for them.

        • It's true that javascript is crucial to the web. However, good and important javascript is limited to a handful of sites. What you should do is take a liberal approach at the beginning, e.g. immediately turn on the whole of google.com and the other top ten whole domains that you use regularly. Next, if you find yourself allowing a site a second time, you do it long term. Finally, learn a few alternative sites that are less script dependent. The worst ones are ones which make you use javascript for m

          • . You will very occasionally temporarily allow a site and that costs you a second or so

            Or spend a few minutes filling in a form, then wondering why it doesn't submit, or realising that some dropdown isn't populating, etc. This is especially annoying on secure payment forms, which may have elements from a couple of other websites being blocked even if you've already enabled scripting on the main site. I don't like reloading forms that may charge me again, or having to go and restart the data entry process when I find out that the payment confirmation uses Javascript.

            I just use adblock to block ads

          • e.g. immediately turn on the whole of google.com

            But not google analytics... I'm tired of waiting for google to analyse my visit before the rest of the page is allowed to load.

        • I think Lynx is the wrong example to use, as it does not have Javascript.

          Run Internet Explorer for a few days. Go to the sites you currently visit and see how unresponsive they are:

          Enjoy the flash adverts sucking up your CPU.
          Enjoy the diet and belly fat banners
          Enjoy the accordion and menu animations
          Enjoy the Google Analytics loading
          Enjoy the updating banner ads
          Enjoy loading Prototype/jQuery/Google Ads again and again for every site you go to
          Enjoy vibrant media in-text adverts.
          Enjoy some of the sneak

          • You don't block or try counter adblockers. It's my computer, my bandwidth.

            You're asking their computer to do work to transfer over their bandwidth, accessing information that they spent money on preparing, where the payment they ask for is displaying ads alongside the content.

            It is certainly possible to sincerely argue that pirating this material is morally acceptable, using similar arguments as with other forms of piracy where the requested payment is different. However, when you're making their computer do stuff using their bandwidth in violation of their wishes, it is disingenuou

            • If anything, I'm *saving them* bandwidth.

              I will never click on one of their adverts, so why should I see it?

              I'd hazard a guess that they cost me more in CPU usage than it ever does them. It costs me MORE than I get for the content.

            • Re: (Score:3, Interesting)

              by delinear ( 991444 )
              Wait, they put up a sign welcoming the whole world to come into their house, and then you're saying it's their moral right to then complain if you don't look at the ads on the walls of their house as payment? There was no contract or agreement in place prior to my entering their house, just an open invitation - if this is a pre-requisite they should display at the very least a click through agreement that this is the understanding. I say this as someone who doesn't disable ads (because I do support a free w
              • by dave420 ( 699308 )
                Why would you assume that they don't care about ads being served? Why would they have them if they didn't serve a purpose (pardon the pun)? Yes, sites rarely (if ever) have text saying you must have adverts served, but it doesn't take a rocket surgeon to figure out they are there for a reason.
          • As to the content producers [..] screw them. [..] It's my computer, my bandwidth.

            That's the most childish thing I've read this week.

            It's also their bandwidth, and their content (which you seem to be getting some use out of).

            You have a right to try to block their ads, and they have a right to try to get around that (as long as they aren't actually infecting your machine with malware to do this). If they can't get their ads seen, they won't make money and soon there will be no content for you to mooch.

            • This is a key misunderstanding: SHOWING an advert does NOT mean money for the web site.

              This is a logical fallacy of the content producers. Simply showing me an advert does not translate to an instant income. I have to actually click it. It's like the cosmetic companies saying using their products will make you feel better after 2 weeks. Of course they're going to say that! It makes them cash.

              The only time I have EVER clicked an advert is to give a site owner money, which is against the advertisers ToS and r

              • I know it doesn't mean more money for the site, but it definitely means more chance of clicking one (even if this is only 0.01%, it's better than 0%). I'm against ads on principle, but I think I did actually click one the other day when I was on a machine without an adblocker.

                Donating is good, and I would happily pay for services to get them ad-free, though with web pages that's pretty much a moot point when you have adblock enabled!

              • Re: (Score:3, Informative)

                by delinear ( 991444 )
                Actually you're dead wrong, because ads don't just track click-throughs, they can also track impressions. If I visit a site with an ad for product X, and then two days later I go buy product X, there is a model which will see the original site owner rewarded, even though there was a disjoint between me seeing the ad and buying the product. The amount will likely be much less than a direct click-through-purchase model, but nevertheless it recognises the cumulative effect of having seen the ad in a few places
              • by dave420 ( 699308 )
                No, it depends on the site's advertising model. Lots of sites get paid for simply showing an advert. In fact, you can't tell the difference from just visiting the site. You clearly have no idea. There's a surprise.
          • by dave420 ( 699308 )
            It's people using AdBlock that cause sites to have annoying adverts in the first place. Site owners are simply trying to recoup the cost of bandwidth by serving ads. If you turn off the adverts, then they don't get any money. If a site's adverts annoy you, just don't go to the site. It's not just your bandwidth, it's the site's bandwidth. It's very selfish to ignore that. But I'm sure you'll post back with some pithy response about how it's up to you, and blah-blah-blah flash CPU blah blah.
            • Adverts pay for hosting.

              Adverts ONLY pay for hosting if I, the surfer:

              • Clicks an advert
              • Buys a product referenced in the advert or visits the site in the future

              Otherwise they get nothing. I should know, I have £80 in my adsense account and nobody clicked my adverts and I had 80,000 impressions.

              It makes NO difference if you show me an advert or not, I WILL NOT buy it or follow it. I immediately mistrust it. They had to pay to get it in my face. I would rather wait for word of mouth or a review.

              Does that

            • Re: (Score:3, Informative)

              by clone53421 ( 1310749 )

              It's people using AdBlock that cause sites to have annoying adverts in the first place.

              That is simply false. In fact, reality is exactly the opposite: It’s the sites having annoying adverts that cause people to use AdBlock in the first place.

              Annoying advertisements (particularly annoying, the blinking animated gif ones) have been around at least since when I was first starting to surf the web back in the days of Netscape Navigator 2. AdBlock was pretty much unheard of back then, which meant I had no choice but to look at Flash ads for fungal foot cremes on my Hotmail account.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      All modern browsers support caching, and chances are, you aren't actually downloading a brand new set of libraries each time.

      • Is your browser cache smart enough to deduplicate between multiple websites?

        Website A downloads jQuery
        Website B downloads jQuery

        The browser has no idea what the file contains except a filename. Since they are on different hosts, how would it know*?

        It's not possible for the browser to know which is which. The only way the browser cache would benefit this case is if you hotlink from the same domain, which is incredibly dangerous (say https://jquery.com/jscript.js [jquery.com])

        * One solution is to implement a 'web library stan

        • There's already an HTTP header called "Content-MD5". It's designed for end-to-end integrity checking, but it could very well be used by caching mechanisms: send an HTTP HEAD, get the md5, compare against the local cache database to see if there are any hits.

          • Re: (Score:3, Insightful)

            by clone53421 ( 1310749 )

            Clever enough, but using MD5 is still running the slight risk of collisions... of course if you verify that the content-length is the same size too, you’re reducing the risk of collision substantially.
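
            A rough sketch of the idea in the two comments above, assuming the server
            actually sends Content-MD5 (many don't) and allows a cross-site HEAD request;
            "localLibraryCache" is a hypothetical lookup table, not a real browser API:

            // Ask for the headers only, and reuse a cached copy when both the
            // Content-MD5 and the Content-Length match (the length check reduces
            // the collision risk mentioned above).
            function loadLibrary(url, localLibraryCache, onReady) {
              var head = new XMLHttpRequest();
              head.open('HEAD', url);
              head.onload = function () {
                var md5 = head.getResponseHeader('Content-MD5');
                var len = head.getResponseHeader('Content-Length');
                var key = md5 + ':' + len;
                if (md5 && localLibraryCache[key]) {
                  onReady(localLibraryCache[key]);   // cache hit: skip the download
                  return;
                }
                var get = new XMLHttpRequest();      // cache miss: fetch and remember it
                get.open('GET', url);
                get.onload = function () {
                  localLibraryCache[key] = get.responseText;
                  onReady(get.responseText);
                };
                get.send();
              };
              head.send();
            }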

        • by dave420 ( 699308 )
          Google hosts jQuery on their CDN, available for use in a script tag (or inclusion via Google's own JS library), which is not incredibly dangerous. Lots of sites use it, which means the library doesn't have to be re-downloaded for each site that does.
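
          A minimal example of the shared-CDN approach described in the comment above;
          the jQuery version in the URL is only illustrative:

          <!-- Every site that references this exact URL lets the browser reuse one
               cached copy instead of downloading its own bundled jQuery. -->
          <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
          <script>
            // jQuery is now available globally, e.g.:
            jQuery(function () {
              jQuery('body').addClass('js-enabled');
            });
          </script>
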
    • by hey ( 83763 )

      You can load those libraries from Google.
      I assume they don't penalize you for that!

      • by Korin43 ( 881732 ) *
        If you load them from Google, it's far less likely to impact loading times (since your browser will use the same cached script for every site that loads from the same place).
        • On the other hand, Google tends to use amazingly slow servers for some functions. When I decided to edit my hosts file to have www.google-analytics.com point to 0.0.0.0 it shaved a good ten to fifteen seconds off my page loading times because GA had such ridiculously long loading times.

          Maybe they've since brought their infrastructure in order and maybe they always hosted their libraries in a more reasonable manner than GA but I'd still be wary of Google-hosted libraries - if the user doesn't have them ca
    • Re:Ajax Libraries (Score:4, Insightful)

      by nappingcracker ( 700750 ) on Wednesday June 23, 2010 @11:49PM (#32673834)
      I disagree. Libraries have greatly improved the usability of many websites. I also doubt that many people are pulling down 300kb of libraries every time, since most are minified and gzipped. Even with a ton of bells and whistles it's hard to hit 100kb of .js. The ever-popular jQuery + jQuery UI is only ~30kb (with reasonably useful plugins like tabs, dialog, etc., not all the crazy and expensive FX).

      I'm OK with users having to pull even 100kb one time to have a nicer browsing experience all around.

      I really wish I could get over my paranoia and link to the libraries on google's code CDN. Slim chance, but if they go down and my sites are still up, there'll be problems!
    • Re: (Score:2, Insightful)

      by gaspyy ( 514539 )

      Frameworks are great but they are also overused.
      JQuery is fantastic if you're doing a big site that you want to feel like an app, but many people load JQuery just to do an image fade or animation - stuff that you can easily code yourself. [richnetapps.com]

      To add insult to injury, sites made with Joomla, WP, Drupal, etc. often rely on plugins, which use their own libraries. The end result is a site that loads JQuery, Mootools and Scriptaculous just to do some trivial effects that would be achieved just as well with document.getElementById(), setTimeout() and the element.style property.

      • Re: (Score:3, Informative)

        by micheas ( 231635 )

        Frameworks are great but they are also overused.
        JQuery is fantastic if you're doing a big site that you want to feel like an app, but many people load JQuery just to do an image fade or animation - stuff that you can easily code yourself. [richnetapps.com]

        To add insult to injury, sites made with Joomla, WP, Drupal, etc. often rely on plugins, which use their own libraries. The end result is a site that loads JQuery, Mootools and Scriptaculous just to do some trivial effects that would be achieved just as well with document.getElementById(), setTimeout() and the element.style property.

        The problem with doing things yourself instead of using a framework for common things like fades is that you have to remember how to code for each of the common browsers. The libraries hide that nastiness from you.
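
        For comparison, a minimal hand-rolled fade of the kind described above, using only
        document.getElementById(), setTimeout() and element.style; it deliberately ignores
        the legacy-browser opacity quirks that a library would otherwise hide:

        // Fade an element out over durationMs milliseconds, then hide it.
        function fadeOut(id, durationMs) {
          var el = document.getElementById(id);
          var steps = 20;
          var step = 0;
          function tick() {
            step++;
            el.style.opacity = String(1 - step / steps);
            if (step < steps) {
              setTimeout(tick, durationMs / steps);
            } else {
              el.style.display = 'none';
            }
          }
          tick();
        }

        // Usage: fadeOut('banner-ad', 400);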

        • Re: (Score:3, Insightful)

          by delinear ( 991444 )
          It also establishes a common method of implementing things - within an environment with more than one developer, it saves a lot of time if you're all building using the same framework rather than having to work out the nuances of each other's bespoke code all the time. For little throwaway projects the time saved in doing clever UI work and in maintenance thereafter in using a framework is massive. For bigger projects the gains are less, but as you mentioned, still worth it for creating a level browser play
    • For Flash heavy sites, will the time it takes for the Flash to load be taken into account? Or how about sites slowed down by all the external ads?

    • Comment removed based on user account deletion
      • Even better, use the Net tab in Firebug, which breaks it down and shows you how long each component takes to look up, request and receive.

  • How many times will their crawler check a slowly loading website before they penalize it?
    • NONONONONO (Score:3, Informative)

      by Magic5Ball ( 188725 )

      "He also cited the potential for refinements to TCP, DNS, and SSL/TLS to make the web a much faster place"

      The core Internet protocols and infrastructure were and remain a conduit of innovation /because/ they are agnostic to HTTP and all other application protocols. Optimizing for one small subset of protocols and for a single kind of contemporary usage would discourage all kinds of innovation using protocols we've not conceived yet, and would be the single largest setback the modern Internet has seen.

  • by Lord_of_the_nerf ( 895604 ) on Wednesday June 23, 2010 @11:22PM (#32673650)
    I find my browsing goes faster if I just yell at my housemate to stop downloading torrents that are *ahem* 'Barely Legal'.
    • by dintech ( 998802 ) on Thursday June 24, 2010 @03:56AM (#32675120)

      If you haven't already, get a router with QoS. Next, when he's in some other room 'maximising his bandwidth', set his max connections to 30 and his upload to 1/4 of your upload speed. You might also consider Tomato or DD-WRT if you have a compatible router.

    • That’s what traffic shaper scripts for your firewall are for.
      I’ve done my own, and I still get perfect ping and priority for surfing/im/ssh etc.

  • by corsec67 ( 627446 ) on Wednesday June 23, 2010 @11:27PM (#32673680) Homepage Journal

    I saw my browser waiting on google-analytics.com quite often before I started using No-Script.

    Why do sites put up with an ad server/analytics service that slows down a site by a large amount?

    • by Anonymous Coward on Wednesday June 23, 2010 @11:42PM (#32673778)

      Because it's valuable data, and google is the only game in town. You can see which keywords are converting, and for what dollar amount, and which keywords are money pits. Yes, it will on occasion hang but you should look at the data that it produces before saying it's not worth it.

      • by dintech ( 998802 ) on Thursday June 24, 2010 @04:05AM (#32675184)

        Yes, it will on occasion hang but you should look at the data that it produces before saying it's not worth it.

        Not worth it to who? It's not worth it to me. Noscript please.

        • by garcia ( 6573 )

          It's worth it to me. That's how I afford to post the content to the website which "you" consume information from.

        • But it is worth it, if it serves to make the sites you use better.

          I know people like to be short-sighted, selfish and need instant gratification, but there are good reasons to turn off Noscript in some places even when it's not directly and immediately helping you.

          Even when talking about a site you visit being able to gain other visitors, that's important if you have any interest in that site growing.

          Think big picture.

    • Isn't this sort of mitigated when people put Google Analytics at the bottom of the page? I've never noticed any slowdowns waiting on GA on any of my sites, and I have GA scripts at the bottom of every page. The absolute bottom.
      • If you have any body onload scripts, then they will wait for the entire page to load... including google analytics that are at the very bottom.
    • by Spikeles ( 972972 ) on Thursday June 24, 2010 @01:13AM (#32674306)

      Google's own documents [google.com] recommend that you should use asynchronous tracking, which should cause no page slowdowns, and even if you use the traditional code [google.com] it should be at the end, just before the closing body tag.

      If a page is loading slowly because of google-analytics, blame the web site developer.
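
      For reference, the asynchronous snippet Google's documentation described at the time
      looked roughly like the sketch below; UA-XXXXX-X is a placeholder account ID, so treat
      this as an approximation rather than canonical code:

      <script type="text/javascript">
        // The queue-style API records calls immediately, while ga.js itself is
        // injected with the async flag so it no longer blocks page rendering.
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXX-X']);
        _gaq.push(['_trackPageview']);
        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
                   + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();
      </script>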

    • google-analytics? That’s what Adblock is for. :)

  • How fast? (Score:5, Insightful)

    by tpstigers ( 1075021 ) on Wednesday June 23, 2010 @11:46PM (#32673814)

    Google's ranking algorithm now includes a penalty for sites that load too slowly.

    I'm not sure how I feel about this. My initial response was a happy one, but the more I think about it, the more it seems to be unnecessarily discriminating against those who are too far away from the bleeding edge. Do we really live in a world where 'Speed=Good' so completely that we need to penalize those who don't run fast enough? And where are we drawing the line between 'fast' and 'slow'?

    • While yours is a well thought out comment, from dealing extensively with web site latency for multiple sites, "bleeding edge" is often slower than "simple" or "old". As others have pointed out, it's the 300kb of javascript from 10 different social widget and ad sites that slow down the page. Most research on this topic today emphasizes client-side latency, as in the code and structure of what your browser downloads and in what sequence. Client side latency generally consumes > 90% of the user visible la
    • Cue lack of net neutrality and this becomes a nasty can of censorship worms.

    • Re: (Score:2, Insightful)

      by Wierdy1024 ( 902573 )

      Yes, we should penalize them.

      Imagine there are two sites with the information I need. I would much prefer to see the faster site than the slower one, because by loading the faster site I get my information faster.

      If I wanted my information slowly, I would walk to the library...

  • Penalty for speed (Score:3, Insightful)

    by csmanoj ( 1760150 ) on Thursday June 24, 2010 @12:12AM (#32673968)
    That would make Google search results bad, right? When I search I want the site with the best information, not the one that loads fastest.
    • The ironic thing from my perspective is that Google's own services (ads and analytics) are among the worst offenders for making web pages slow down, in my experience...

    • by dintech ( 998802 )

      The web has always been slow. Anyone who remembers the Netscape days will attest to that.

    • Re: (Score:3, Insightful)

      Seconded! Penalizing slow sites only promotes those shitty mirror sites that all seem to be on fast pipes.

      It's interesting: I've started noticing these spam sites creeping higher and higher up the page rank and was wondering what new trick the ad-spammers had developed to game Google. It would just figure that it turns out to be Google shooting themselves in the foot with asinine policies like this.

  • by Animats ( 122034 ) on Thursday June 24, 2010 @12:26AM (#32674052) Homepage

    Most real-world page load delay today seems to be associated with advertising. Merely loading the initial content usually isn't too bad, although "content-management systems" can make it much worse, as overloaded databases struggle to "customize" the content. "Web 2.0" wasn't a win; pulling in all those big CSS and JavaScript libraries doesn't help load times.

    We do some measurement in this area, as SiteTruth [sitetruth.com] reads through sites trying to find a street address on each site rated. We never read more than 21 pages from a site, and for most sites, we can find a street address within 45 seconds, following links likely to lead to contact information. Only a few percent of sites go over 45 seconds for all those pages. Excessively slow sites tried recently include "directserv.org" (a link farm full of ads), "www.w3.org" (embarrassing), and "religioustolerance.org" (an underfunded nonprofit). We're not loading images, ads, Javascript, or CSS; that's pure page load delay. It's not that much of a problem, and we're seeing less of it than we did two years ago.

  • by SuperBanana ( 662181 ) on Thursday June 24, 2010 @01:10AM (#32674288)

    Holzle said that competition from Chrome has made Internet Explorer and Firefox faster.

    Bull. Back when IE and Firefox's last major releases came out, Chrome was a tiny drop in the bucket market-share-wise. January was the first time it passed Safari in marketshare. I think it's more accurate to say that competition in general has led to companies improving their browsers. I'd bet we could also attribute the performance improvements to better standards compliance by websites, since there are now so many mainstream browsers.

    I'd say that Firefox vs IE competition (and Firefox vs Safari on the Mac) has inspired the improvements...

    • I sort of agree. The general competition has definitely led to improvement all round, but I seem to remember Chrome's main selling point when it was released was its speed (previously Opera's forte), which led to the other browsers improving their speed. Firefox's was security, stability and customisability, which has led to improvements in Internet Explorer's. And Internet Explorer has...erm...led to people paying greater attention to web standards as everyone slowly realises how shitty it is when a brows

  • by buro9 ( 633210 ) <david@buro9 . c om> on Thursday June 24, 2010 @01:14AM (#32674316) Homepage

    Where are they measuring *from*?

    I've moved a site from Linode New Jersey to Linode London, UK because the target audience are in London ( http://www.lfgss.com/ [lfgss.com] ).

    However in Google Webmaster Tools the page load time increased, suggesting that the measurements are being calculated from US datacentres, even though for the target audience the speed increased and page load time decreased.

    I would like to see Google use the geographic target preference and to have the nearest datacentre to the target be the one that performs the measurement... or better still to have both a local and remote datacentre perform every measurement and then find a weighted time between them that might reflect real-world usage.

    Otherwise if I'm being sent the message that I am being penalised for not hosting close to a Google datacentre from where the measurements are calculated, then I will end up moving there in spite of the fact that this isn't the right thing for my users.

    • by Anonymous Coward on Thursday June 24, 2010 @01:52AM (#32674510)

      From the docs:

      "Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature."

      http://www.google.com/support/webmasters/bin/answer.py?answer=158541&hl=en

    • w00t - my favourite forum gets a mention on /.!
  • I'll have to double-check my sites to be sure, but I think I'd be throwing a huge optimisation at any of my pages that got near 320KB, never mind averaging that large. That's just crazy-huge for a page given the amount of actual useful content that most pages have. If only people put in useful stuff instead of filling sites with pointless cruft.

    • by thijsh ( 910751 )
      This Slashdot page is 300 KB now... You seem to be forgetting that pages aren't just white with black text only. There is a lot of media being loaded, and 320 KB is only a little if you think about it. A large JPEG alone can be bigger, and multiple on one page will easily put a page in the megabyte range (I'm looking at you, Bing).
  • Pack all design (not content) pictures in one big picture, and use cropping to use the parts in the right places. Saves you separate requests and hence HTTP headers and establishing separate TCP connections.
    Also shorten all your links. A server-side script can handle the en- and decoding. (But beware that this stops Google from matching keywords against the URLs.)
    Much can also be done subjectively. Like never having elements with unknown heights hold up the rendering of elements below them. Always specify t
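
    A small sketch of the sprite technique described in the comment above; the file name
    and pixel offsets are invented for illustration:

    <!-- One combined image holds every design graphic; each element shows only its
         own region, so three icons cost one HTTP request instead of three. -->
    <style>
      .icon        { display: inline-block; width: 16px; height: 16px;
                     background-image: url(sprites.png); }
      .icon-home   { background-position: 0 0; }
      .icon-search { background-position: -16px 0; }
      .icon-rss    { background-position: -32px 0; }
    </style>
    <span class="icon icon-home"></span>
    <span class="icon icon-search"></span>
    <span class="icon icon-rss"></span>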
