Yahoo's YSlow Plug-in Tells You Why Your Site is Slow

Stoyan writes "Steve Souders, performance architect at Yahoo, announced today the public release of YSlow — a Firefox extension that adds a new panel to Firebug and reports a page's performance score, among other performance-related features. Here is a review, plus helpful tips on how to make the scoring system match your needs."
  • Why sites are slow (Score:2, Interesting)

    by Anonymous Coward on Wednesday July 25, 2007 @09:37AM (#19982525)
    Sites are only as fast as the slowest path through the site.

    If your site has 10 different affiliate links/sponsors, all hosted on different providers, your site will be slow.

    Similarly, if your site has 100 different java/javascript crapplets and widgets, your site will be even slower.

    Here is a simple guide for site creators:

    1. Don't overload on ads, I'm not going to view them anyway
    2. Put some actual content I'm interested in on your site
    3. Don't overload me with java/javascript crap, I don't care what my mouse pointer looks like, just let me click
    4. Not everything needs a php/mysql front/back end.

    Feel free to use this as a guide, and I might just visit those sites.
  • Re:/. gets a D (Score:5, Interesting)

    by jrumney ( 197329 ) on Wednesday July 25, 2007 @09:54AM (#19982683)
    My own site also got a 'D', so that seems to be the standard grade. It got an 'A' for everything that matters, except a 'B' on the test that says you shouldn't use non-inlined CSS (to reduce HTTP requests) and an N/A on the test that says you should (to take advantage of caching). Then there were a whole lot of irrelevant things it got an 'F' for: none of my site is hosted on a distributed network; I leave the browser cache to make its own decisions about expiring pages, since I don't know in advance when I'll next change them; and something about ETags, where I'm not sure whether it's saying I should have more of them or get rid of the ones I've got.
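    The caching dilemma the parent describes has a standard answer: a content-hash ETag lets the browser revalidate cheaply without the author having to predict an expiry date. A minimal sketch in Python (the hashing scheme and handler names here are illustrative, not YSlow's or any particular server's actual API):

    ```python
    import hashlib

    def make_etag(body):
        # A content-hash ETag: it changes exactly when the content changes,
        # so the server never has to predict an expiry date in advance.
        return '"%s"' % hashlib.md5(body).hexdigest()

    def respond(body, if_none_match=None):
        etag = make_etag(body)
        if if_none_match == etag:
            # 304 Not Modified: the client's cached copy is still good,
            # so no body is sent at all.
            return 304, {"ETag": etag}, b""
        return 200, {"ETag": etag}, body

    # First visit: full 200 response with an ETag.
    status, headers, _ = respond(b"<html>v1</html>")
    # Revisit: the browser echoes the ETag back in If-None-Match.
    status2, _, _ = respond(b"<html>v1</html>", headers["ETag"])
    ```

    The grader's "Expires header" test penalizes exactly this setup, since an ETag still costs one round trip per resource, but for content with unpredictable change dates it is the safer choice.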
  • Re:Another tool (Score:1, Interesting)

    by Anonymous Coward on Wednesday July 25, 2007 @10:24AM (#19982985)
    I tried the piggiest page on my own site (and thank you for the link BTW) just out of curiosity. Note that all images are almost completely necessary (it is, after all, about visual art). And I wrote it way back in 1998. IIRC there is a reprint somewhere on K5, sans graphics.

    URL: http://mcgrew.info/Art/ [mcgrew.info]
    Title: Steve's School of Fine Art
    Date: Report run on Wed Jul 25 09:10:42 CDT 2007

    Total HTML: 1
    Total HTML Images: 13
    Total CSS Images: 0
    Total Images: 13
    Total Scripts: 1
    Total CSS imports: 0
    Total Frames: 0
    Total Iframes: 0

    Connection Rate Download Time
    14.4K 384.00 seconds [wow that's six minutes! But as height and width attributes of the graphics are specified, the text loads first]
    28.8K 193.50 seconds
    33.6K 166.28 seconds
    56K 100.97 seconds
    ISDN 128K 33.00 seconds
    T1 1.44Mbps 5.60 seconds
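    For what it's worth, those download times imply the analyzer assumes a modem delivers only about 70% of its nominal bit rate rather than the full line speed. A quick check in Python, using the 491579-byte total page size from the same report (the 70%-efficiency reading is my own inference from the numbers, not documented tool behavior):

    ```python
    # Effective line efficiency implied by the report's download times.
    PAGE_BYTES = 491579                      # TOTAL_SIZE from the report

    times = {14_400: 384.00, 28_800: 193.50, 56_000: 100.97}

    for bps, seconds in times.items():
        ideal = PAGE_BYTES * 8 / bps         # seconds at 100% of line rate
        print(f"{bps / 1000:g}K: {ideal / seconds:.0%} line efficiency")
    ```

    Every row works out to roughly 70%, which is a reasonable allowance for TCP/IP and modem protocol overhead.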

    • TOTAL_HTML - Congratulations, the total number of HTML files on this page (including the main HTML file) is 1 which most browsers can multithread. Minimizing HTTP requests is key for web site optimization.
    • TOTAL_OBJECTS - Warning! The total number of objects on this page is 15 - consider reducing this to a more reasonable number. Combine, refine, and optimize your external objects. Replace graphic rollovers with CSS rollovers to speed display and minimize HTTP requests.
    • TOTAL_IMAGES - Warning! The total number of images on this page is 13, consider reducing this to a more reasonable number. Combine, refine, and optimize your graphics. Replace graphic rollovers with CSS rollovers to speed display and minimize HTTP requests.
    • TOTAL_SIZE - Warning! The total size of this page is 491579 bytes, which will load in 100.97 seconds on a 56Kbps modem. Consider reducing total page size to less than 30K to achieve sub eight second response times on 56K connections. Pages over 100K exceed most attention thresholds at 56Kbps, even with feedback. Consider contacting us about our optimization services.
    • TOTAL_SCRIPT - Congratulations, the total number of external script files on this page is 1. External scripts are less reliably cached than CSS files, so consider combining scripts into one, or even embedding them into high-traffic pages. [google ad, added later]
    • HTML_SIZE - Caution. The total size of this HTML file is 27045 bytes, which is above 20K but below 100K. With a 10K ad and a logo this means that your page will load in over 8.6 seconds. Consider optimizing your HTML and eliminating unnecessary features. To give your users feedback, consider layering your page or using positioning to display useful content within the first two seconds.
    • IMAGES_SIZE - Warning! The total size of your images is 460375 bytes, which is over 30K. Consider optimizing your images for size, combining them, and replacing graphic rollovers with CSS. [no redundant images or image rollovers here!]
    • SCRIPT_SIZE - Caution. The total size of your external scripts is 4159 bytes, which is above 4080 bytes and less than 8K. Consider optimizing your scripts and eliminating features to reduce this to a more reasonable size. [blame Google!]
    • MULTIM_SIZE - Congratulations, the total size of all your external multimedia files is 0 bytes, which is less than 4K.
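    Tallies like TOTAL_IMAGES and TOTAL_SCRIPT above are just counts of request-generating tags in the HTML. A toy Python version of that counting (a rough sketch; real analyzers also handle CSS background images, iframes, and so on):

    ```python
    from html.parser import HTMLParser

    class ObjectCounter(HTMLParser):
        """Count the external requests an HTML page triggers:
        images, external scripts, and linked stylesheets."""
        def __init__(self):
            super().__init__()
            self.counts = {"img": 0, "script": 0, "link": 0}

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "img" and attrs.get("src"):
                self.counts["img"] += 1
            elif tag == "script" and attrs.get("src"):
                self.counts["script"] += 1
            elif tag == "link" and attrs.get("rel") == "stylesheet":
                self.counts["link"] += 1

    page = """<html><head><link rel="stylesheet" href="a.css">
    <script src="ad.js"></script></head>
    <body><img src="1.png"><img src="2.png"></body></html>"""

    counter = ObjectCounter()
    counter.feed(page)
    print(counter.counts)   # {'img': 2, 'script': 1, 'link': 1}
    ```

    Each of those counts is one HTTP request on an empty cache, which is why every grader harps on combining objects.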


    I guess I flunk!

    -mcgrew
  • Re:/. gets a D (Score:4, Interesting)

    by daeg ( 828071 ) on Wednesday July 25, 2007 @01:17PM (#19985535)
    It depends on the headers (server), browser, and method, actually. Under some circumstances, for instance under SSL, full copies of all files will be downloaded for every request. As HTTP headers get more complex (some browsers with toolbars, etc., plus a plethora of cookies), the HTTP request/response cycle expands. It may not seem like a lot, but multiply a 0.5 KB request header by dozens of page elements and you quickly use up a lot of bandwidth. Firefox does a much better job than Internet Explorer under SSL, but not by much unless you enable disk-based caching.
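    To put a number on that, assuming roughly 0.5 KB of request headers and a few dozen page elements (both figures are the ballpark above, not measurements):

    ```python
    # Back-of-the-envelope for per-page request-header overhead.
    header_bytes = 512      # one request's worth of headers (assumed)
    elements = 30           # images, scripts, CSS on a typical page (assumed)

    upstream = header_bytes * elements
    print(f"{upstream / 1024:.0f} KB of headers per page view")  # 15 KB
    ```

    That 15 KB travels on the slow upstream side of an asymmetric connection, which makes it hurt more than the raw number suggests.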

    Something I would love to see is the headers being condensed by the browser and server. For instance, on first request the browser sends the full headers. In the reply headers, the server would set an X-SLIM-REQUEST header with a unique ID that represents that browser configuration's set of optional headers (Accept, Accept-Language, Accept-Encoding, Accept-Charset, User-Agent, and other static headers). Further requests from that browser would then simply send the X-SLIM-REQUEST header and unique ID, and the server would handle unpacking it -- if the headers are even needed. Servers that don't supply the header would continue to receive full requests, preserving full backward and forward compatibility.
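    The server side of that proposal could look something like this. Note that X-SLIM-REQUEST is a hypothetical header from the comment above, not a real standard, and the ID scheme here is just one possible sketch in Python:

    ```python
    import hashlib

    # Headers that rarely change for a given browser configuration.
    STATIC = ("Accept", "Accept-Language", "Accept-Encoding", "User-Agent")
    server_cache = {}   # slim ID -> the static headers it stands for

    def first_request(headers):
        """Remember this client's static headers under a short ID,
        which would be sent back as X-SLIM-REQUEST."""
        static = {k: v for k, v in headers.items() if k in STATIC}
        slim_id = hashlib.sha1(
            repr(sorted(static.items())).encode()).hexdigest()[:8]
        server_cache[slim_id] = static
        return slim_id

    def later_request(slim_id, extra_headers):
        """Expand the ID back into the full header set."""
        return {**server_cache[slim_id], **extra_headers}

    slim = first_request({"User-Agent": "Firefox/2.0",
                          "Accept": "text/html",
                          "Host": "example.com"})
    full = later_request(slim, {"Host": "example.com"})
    ```

    Later, real-world protocols attacked the same waste differently: HTTP/2's HPACK compresses repeated headers at the transport level rather than requiring application-visible IDs.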

    There are a few things that reduce response sizes for web applications. MOD_ASIS is one of the best. We use it as one of the last steps of our deployment process: all images are read in via script, compressed if they are over a certain threshold, and minimal headers are added. Apache then delivers them as-is -- reducing load on Apache as well as on the network (the only things Apache adds are the Server: and Date: lines). ETags and last-modified dates are calculated in advance. Also, for certain responses, such as simple HTTP Moved (Location:) redirects, GZip isn't used -- GZipping the response actually *adds* to the size because the document is so small.
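    The GZip point is easy to verify: gzip adds roughly 18 bytes of header and trailer, which swamps any savings on a tiny redirect body. A quick Python check (the redirect text is an arbitrary example):

    ```python
    import gzip

    # A minimal redirect body: far too small for compression to pay off.
    redirect = b"Location: http://example.com/new-page\r\n"

    compressed = gzip.compress(redirect)
    # The "compressed" form comes out larger than the original.
    print(len(redirect), "->", len(compressed))
    ```

    This is why well-tuned servers only gzip responses above a minimum size threshold.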
  • Re:/. gets a D (Score:5, Interesting)

    by mr_mischief ( 456295 ) on Wednesday July 25, 2007 @08:24PM (#19990311) Journal
    I've killed some time on this since it's a pretty interesting idea. It turns out there are plenty outside the D and F range. It does seem to like pages with a single Flash object and not much else, so that's bad. It also makes some pretty arbitrary decisions which don't mean squat to many sites. There are some sites that get enough traffic that speed is a factor but not so much that a content delivery network is really necessary, for example.

    I skipped the actual link and score on sites that are pretty much just representative of the sites around them. I wanted to include them by name, though, to show where they fall. I've stuck mostly to main index pages, and I've noted where I've gone deeper.

    A: Google [google.com] (99%), Altavista main page [altavista.com] (98%), Altavista Babelfish [altavista.com] (90%) (including upon doing a translation from English to French), Craigslist [craiglist.org] (96%), Pricewatch [pricewatch.com] (93%), Slackware Linux [slackware.com], OpenBSD [openbsd.org], Led Zeppelin site at Atlantic [ledzeppelin.com] (100%), supremecommander.com, w3m web browser site [w3m.org] (96%)

    B: Apache.org [apache.org] (87%), the lighttpd web server [lighttpd.net] (84%), Google Maps, which also got a C once [google.com] (84% in most cases), Perlmonks [perlmonks.org] (84%), Dragonfly BSD [dragonflybsd.org] (85%), Butthole Surfers band page [buttholesurfers.com] (81%), 37 Signals [37signals.com]

    C: One Laptop Per Child [olpc.com], ESR's homepage [catb.org], the Open Source Initiative [opensource.org] (78%), Google News [google.com] (73%), Lucid CMS [lucidcms.net] (74%), Perl.org [perl.org] (75%), lucasfilm.com, Charred Dirt game [charreddirt.com]

    D: gnu.org, The Register [theregister.co.uk], A9 [a9.com] (66%), kernel.org [kernel.org], Akamai [akamai.com] (64%), kuro5hin.org, freshmeat.net, linuxcd.org, Movable Type [movabletype.org] (61%), Postnuke [postnuke.com], blogster.com, Joel on Software [joelonsoftware.com] (67%), Fog Creek Software [fogcreek.com], metallica.com, gaspowered.com, Scorched 3D [scorched3d.co.uk] (68%), id software [idsoftware.com] (64%), ISBN.nu book search [isbn.nu]

    F: MS IIS [microsoft.com] (49%), microsoft.com, msn.com, linux.com, fsf.org, discovery.com, newegg.com, rackspace.com, the Simtel archive [simtel.net] (26%), CNet Download [download.com] (29%), Adobe [adobe.com] (58%), savvis.com, mtv.com, sun.com, pclinuxos.com, freebsd.org, phpnuke.org, use.perl.org, ruby-lang.org, python.org, java.com, Rolling Stones band page [rollingstones.com] (56%), powellsbooks.com, amazon.com, barnesandnoble.com, getfirefox.com

    My company's site (96%) gets an A (no, I'm not going to get it slashdotted); it's pretty simple, but it has a pic and some Javascript on it. Several sites I have done, or have helped design with someone else, get C or D ratings.
