Website Optimization

Michael J. Ross writes "As Internet users' expectations continue to ratchet upwards, it is increasingly essential that every Web site owner maximize the chances that those users will find the site in question, and, once found, that the site will perform well enough that those visitors become customers or members, and recommend the site to others. Key elements of a successful strategy include optimization for search engines, pay-per-click advertising, and visitor conversion, as well as responsive Web pages and fine-tuning of all the above, using various metrics. These topics and others are explored in Website Optimization: Speed, Search Engine & Conversion Rate Secrets by Andrew B. King." Keep reading for the rest of Michael's review.
Website Optimization
author: Andrew B. King
pages: 394
publisher: O'Reilly Media
rating: 8/10
reviewer: Michael J. Ross
ISBN: 0596515081
summary: Techniques for improving a site's SEO, conversion rates, and speed.
The book was published by O'Reilly Media on 15 July 2008, under the ISBNs 0596515081 and 978-0596515089. Website Optimization is organized into two major parts: search engine marketing optimization and Web performance optimization. The book's material, spanning 394 pages, is divided into 10 chapters, covering a range of topics: natural search engine optimization, an SEO case study, pay-per-click optimization, a case study thereof, conversion rate optimization, Web page performance, CSS optimization, AJAX optimization, server- and client-side performance techniques, and Web site metrics. The book begins with a foreword by Jim Sterne, a Web marketing and metrics consultant, followed by a preface in which Andy King provides an overview of what is to follow, as well as credits to four other individuals. These credits are confusing, because they do not make clear exactly what these individuals are being credited for. The reader is left wondering: Are these people the technical editing team? Did they write some of the material in the book, without byline? Or did they only provide research material to the primary author? In personal correspondence with me, Andy King mentions that this book was "written by a team of experts let [sic] by me." Thus, they are apparently co-authors, but are not identified as such in the book.

The first five chapters of the book focus on optimization of search engine marketing (SEM), which comprises search engine optimization (SEO), pay-per-click (PPC) advertising, and conversion rate optimization (CRO). The author(s) begin by demonstrating, through cited statistics, just how critical it is for Web sites to appear within the first few search engine results pages (SERPs); otherwise, the sites will probably not be found by the roughly 90 percent of Internet users who do not bother looking at any subsequent pages. This documented selectivity alone should serve as an energizing wake-up call to any Web site owners who — either through ignorance or laziness — make no effort to improve their rankings within the major search engine results. The first chapter delineates the most common SEO mistakes, as well as basic techniques for achieving higher rankings. The two categories could have been combined, simply by inverting the language of the first category; for instance, "develop an adequate number of popular inbound links" could replace "[avoid] a lack of popular inbound links." The bulk of the SEO information will be familiar to most Web marketing veterans, though even they should glean some new pointers. All of the advice is correct, up-to-date, and worthy of implementation on any site — existing or under development. However, some readers may confuse the "Step 3" and "Step 4" labels in Figures 1-6 and 1-7 with the identical section headings in the book's text. Note also that the KEI of "84,100" should instead read "84.100" (page 17). Lastly, the first and third sample URIs are missing GET keys (page 29).
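
To make the chapter's focus concrete: much of the basic on-page advice revolves around elements such as the title, meta description, and headings. A minimal sketch of keyword-rich markup, not taken from the book (the business name and keywords are hypothetical):

    <!-- Keyword-focused title, meta description, and heading: the
         on-page elements most of the basic SEO advice targets.
         (Business name and keywords are hypothetical.) -->
    <title>Custom Road Bikes - Handbuilt Wheels | Example Cycles</title>
    <meta name="description"
          content="Example Cycles builds custom road bikes and
                   handbuilt wheels. Order online or visit our shop.">
    <h1>Custom Road Bikes and Handbuilt Wheels</h1>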

The strategies for natural search engine optimization, presented in the first chapter, are illustrated in the second — through a case study of the SEO overhaul of a Philadelphia dental practice's Web site. The original version of the site lacked keyword-rich headers, body copy, inbound links, etc. (In addition, the dentist's e-mail address was exposed to spam harvesters in plain text. Andy King mentions the use of a contact form to resolve this problem, but does not mention that there are methods of displaying an e-mail address to human visitors while hiding it from spambots; one such method is sketched below.) This site's search engine results were dramatically boosted through two iterations of SEO fine-tuning, redesign, and release. While this particular dentist's site was greatly improved by the work described in this chapter, the book itself is not improved by the inclusion of said chapter, since no additional SEO techniques are offered to the reader, and the first chapter already had enough HTML code snippets to exemplify the concepts discussed. In fact, the case study results should have been boiled down to a few paragraphs and better presented as a sidebar at the end of the first chapter, or moved to the back as an appendix. This latter approach is further supported by the fact that the second chapter illustrates best practices discussed in chapters that the reader has presumably yet to read (5 and 6). The material in the actual last sidebar of the first chapter — on metadata and microformats — could likewise have been relegated to an appendix.
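
One common technique, for instance, is to assemble the address with JavaScript at page load, so that it never appears verbatim in the markup. A minimal sketch of the idea (the address and element ID here are hypothetical, not from the book):

    // Build the address from parts so it never appears verbatim in
    // the page source for harvesters to scrape. Assumes an empty
    // anchor element with id="contact-email" exists (hypothetical).
    var user = "frontdesk";
    var domain = "example-dental.com";
    var address = user + "@" + domain;
    var link = document.getElementById("contact-email");
    link.href = "mailto:" + address;
    link.appendChild(document.createTextNode(address));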

Search engine-based ad campaigns are among the most important elements in the marketing strategies of countless online vendors, and in Chapter 3, Andy King explains how to increase a site's pay-per-click results, click-through rates (CTRs), and conversion rates. He begins by explaining some key terms and concepts, which should be quite helpful for most readers — especially given how laden the online marketing world is with terminology and acronyms. The chapter reviews the advertising programs of the three top search engines, and discusses PPC optimization for those programs, with special emphasis given to Google AdWords. Like the first chapter of the book, this one does a competent job of explaining and illustrating the key ideas, and making clear topics that can be quite daunting to anyone new to the field. However, additional clarification of some terms would be helpful; many readers may be uncertain as to what is meant by terms such as "negative keywords," which unfortunately are left undefined. Even phrases imported from outside the online marketing industry, such as "second-price sealed bidding system," could confuse countless readers. More importantly, some of the material is discussed at a level higher than what would be really usable for most site owners and developers — in contrast to the first chapter, which generally presented more actionable details. In fact, for readers unfamiliar with all the factors involved in running a PPC ad campaign, the early portion of this chapter could prove quite bewildering. Returning to the issue of how best to present case studies, the "Bid Optimization in Action: The E-Grooming Book Example" section shows how illustrative examples can be presented much more concisely. In contrast, Chapter 4, which consumes eight pages, shows how not to illustrate concepts already discussed.

Considerable SEO and PPC efforts could pay off in the form of a huge increase in traffic to one's Web site. But all of that would be in vain if there were no corresponding increase in the rate at which those visitors become customers. Chapter 5 is devoted to conversion rate optimization, and presents some key elements of persuading online prospects, as well as the top 10 factors for maximizing one's conversion rates, from an online marketing and sales perspective. This chapter is rich in material that should inspire site owners to critically reevaluate their sites' contents, as well as their competitors'.

The sixth chapter, on Web page optimization, commences the second part of the book, and explores the most common pitfalls that lead to poor site performance, as well as ten techniques for increasing page display speeds — many of them based upon Steve Souders's book High Performance Web Sites. Andrew King correctly notes that this optimization can result in increased profits, customer satisfaction, and accessibility. However, he also claims that it will decrease costs as well as improve site maintainability and search engine rankings. He should have made clear that faster page loading per se will not provide those last three benefits; rather, they are potential secondary gains from the changes to code and other factors made in pursuit of shorter page load times. Nonetheless, even the most experienced Web developers should find one or more ideas in this chapter for reducing the total bandwidth consumption of the pages they create — particularly anyone serving video content, which receives substantial coverage in this chapter. Chapter 7, on CSS optimization, follows a pattern similar to its predecessor, presenting ten methods for improving one's CSS code as helpful rules. The advice is spot-on, and well illustrated with examples. The suggested methods are preceded by a brief discussion of reset rules, including mention of the (differing) reset rules advocated by Eric Meyer and Yahoo; a sketch of the general idea appears below. It would have been interesting to learn the author's perspective on the technical differences, and why he chose one set of rules over another. Incidentally, the paragraph describing the section, immediately below the "Tip #1" header, should have been located above it. Also, on page 195, " | inherit" should have been explained, or, better yet, excised. Lastly, the "|" appears to be missing from the similar instances on the three subsequent pages.
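
For readers who have not encountered them, reset rules simply zero out inconsistent browser defaults so that a site's own styles start from a known baseline. A minimal sketch in the spirit of Meyer's widely circulated reset (not the book's exact rules):

    /* Zero out browser-default margins and padding so that all
       browsers start from the same baseline. */
    html, body, div, h1, h2, h3, h4, h5, h6,
    p, blockquote, ul, ol, li, form, fieldset {
        margin: 0;
        padding: 0;
    }
    /* Normalize a few other common defaults. */
    img { border: 0; }
    ul, ol { list-style: none; }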

During the past several years, there has been a huge increase in the use of Asynchronous JavaScript and XML (AJAX) to update Web pages dynamically without full reloads, and to make Web sites behave more like desktop applications. Unfortunately, there are pitfalls in this approach, and Andy King discusses them in Chapter 8, in addition to numerous best practices for minimizing these problems within one's own AJAX code. Incidentally, in the tip on page 225, the author states that the sample AJAX application will not run on your desktop; this apparently means that it cannot run on a local Web server. An explanation as to why would most likely be of interest to the typical reader. Prior to getting into the details of JavaScript optimization, some tips on evaluating and choosing an AJAX library are presented.
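
One plausible reason is the browser's same-origin policy: if the sample's XMLHttpRequest calls target the book site's own domain, the browser will block them whenever the page is served from any other host, a local server included. A minimal sketch of the kind of call affected (the URL and element ID are hypothetical):

    // Cross-domain requests like this one are blocked by the browser's
    // same-origin policy unless the page itself was served from
    // example.com -- so a locally hosted copy of the page fails.
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "http://example.com/ajax/search?q=widgets", true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById("results").innerHTML = xhr.responseText;
        }
    };
    xhr.send(null);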

Chapter 9 covers additional optimization techniques — beyond the Web page and code techniques covered earlier — on both the server side and the client side. The former category consists of parallel downloads, frequent caching, HTTP compression, delta encoding, and rewriting URIs. The latter consists of load delaying, caching off-site files on one's own server so they can be loaded locally, JavaScript packing, and inlining images; the first of these is sketched below.
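
"Load delaying" generally means deferring non-critical scripts until after the page has rendered. A minimal sketch of the general technique, not the book's code (the script URL is hypothetical):

    // Inject a non-critical script only after the page has finished
    // loading, so that it cannot block the initial rendering.
    function loadDeferredScript(url) {
        var script = document.createElement("script");
        script.type = "text/javascript";
        script.src = url;
        document.getElementsByTagName("head")[0].appendChild(script);
    }
    window.onload = function () {
        loadDeferredScript("/js/analytics.js"); // hypothetical URL
    };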

The last chapter delves into Web site metrics for measuring the effectiveness of Web sites and of changes made to them. The author explains some of the most popular and telling metrics, the leading Web analytics software (both Web server log analysis and JavaScript page tagging), and how they can be used to improve one's search marketing strategies and results. The chapter concludes with a detailed discussion of Web performance metrics — i.e., measures of page load times, oftentimes broken out by site, request size, and content type. The material clearly shows that there are a great many options for testing the optimization techniques presented in all of the earlier chapters.
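
As a trivial illustration of the client-side flavor of such measurement, a page can timestamp itself at the top of the document and report the elapsed time once loading completes. A minimal sketch, not the book's code (the beacon URL is hypothetical):

    // Record a timestamp as early in the page as possible...
    var pageStart = new Date().getTime();
    // ...then report the elapsed load time via an image beacon
    // once everything has loaded.
    window.onload = function () {
        var elapsed = new Date().getTime() - pageStart;
        new Image().src = "/metrics/beacon?loadTime=" + elapsed;
    };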

Two Web sites offer additional information about the book: O'Reilly's book page has descriptions, the table of contents, and confirmed and reported errata (of which there are no significant ones, as of this writing). The author's own book site is more substantial, with chapter summaries, full-color figures, worksheets, all the sample code, and links to external reviews.

In general, the book achieves its goals. Aside from the occasional marketing term that will most likely puzzle the majority of readers (more on that in a moment), the writing is clear and the examples cited are applicable. The illustrations created and chosen for this book are more than adequate in quality and number, although some of the graph labels would be confusing if not clarified by the text, e.g., "Mean Fixation Duration" (page 2). Web site statistics and other data are well referenced throughout the manuscript.

On the other hand, the brief chapter summaries add nothing new to the reader's understanding, and could be disposed of without loss to the book's usefulness. Chapter summaries are more appropriate for books whose material is far lengthier and denser, thus justifying summaries as a way to convey the highlights to the reader. As noted earlier, the case study chapters similarly add very little value, if any, to Website Optimization, and could in future editions be folded into the relevant chapters as sidebars, or at least made much more concise and moved to the back as appendices. There is a fair bit of repetition, sometimes in the form of allusions to techniques that are covered in more detail in earlier or later chapters, and other times in the form of redundancy within chapters. For instance, the sidebar on page 156, concerning CSS and JavaScript placement, consists of a uselessly brief mention of information covered later in more detail. Trimming away all of the repeated material and the chapter summaries, and folding the case studies into the relevant chapters, would make the book leaner and a faster read. Furthermore, some of the phrases are not entirely clear in their meaning, at least to readers who are not SEO marketers. For instance, "flagged sites" (page 12) — flagged for what? Some of the phrasing is confusing, if not downright bizarre, e.g., "information scent" (page 2) and "the scent of a link" (page 122).

Admittedly, a Web site owner could learn much of this information by reading numerous articles freely available online. But most businesspeople value their time far too highly for that, and would probably find a significant amount of repetition among those articles, because they tend to "borrow" a lot from one another. This is especially true of writers who have never themselves done SEO work on a Web site, or run a PPC campaign.

Aside from the aforesaid weaknesses, Website Optimization is an engaging, comprehensive, and valuable resource for anyone who wishes to improve the online marketing results of their own business's Web sites or those of the clients they support. Online business owners and Web developers unfamiliar with core SEO and site optimization techniques are urged to read this book.

Michael J. Ross is a Web developer, writer, and freelance editor.

You can purchase Website Optimization from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Comments:
  • Why mention the ISBN-10? Everyone has moved to ISBN-13 (the one starting with 978). I'm surprised that O'Reilly (not to mention /.) has bothered to list the old one anywhere - nobody cares.
  • by bcrowell ( 177657 ) on Friday November 07, 2008 @03:34PM (#25679459) Homepage
    I'm afraid I'd want to steer clear of any book on internet marketing that's published by a company that spams. I'm a college physics professor, and O'Reilly has spammed me to advertise their book Head First Physics. I was really surprised and disappointed by the spam, because I have a whole bunch of O'Reilly books on the shelf next to my desk at home, and I'd been under the impression that O'Reilly really "got it" when it came to open source and the internet. But I really am pretty firm about not being willing to do business with spammers.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Sorry about the email regarding our Head First book. Would you mind sending me a copy of it so I can get to the bottom of this? You can email betty@oreilly.com. We don't want you or anyone else to feel like we are spamming you. I can also add you to our no email list. Thanks.

      • Sorry about the email regarding our Head First book. Would you mind sending me a copy of it so I can get to the bottom of this? You can email betty@oreilly.com. We don't want you or anyone else to feel like we are spamming you. I can also add you to our no email list. Thanks.

        Thanks for your response. I deleted the email, but if you want to find it in your logs, the string to grep for would be either lightandmatter.com or fullcoll.edu (I think it was the former). If you don't want anyone to feel like you'r

        • by weirdal ( 127848 )

          Actually, what you are referring to is not spam. O'Reilly are not spamming you if they are writing to you directly. Otherwise, everybody who writes to someone without having met the person first would be classified as a spammer. Actually, now that I think of it - this comment I'm writing to you now could be spam!

          Of course, O'Reilly should respect your wishes not to be contacted if you reply to the e-mail in question. You don't say whether you actually tried that - you said you just deleted it?

      • The spammer technique demonstrated in the parent post is called "listwashing [spamcop.net]": get the complainers off their spam lists so they can continue to spam everyone else.

        Also note that her mention of a "no email list" is an implicit admission that they send unsolicited bulk email (i.e. spam): if they only sent email to subscribers, they would not need any kind of suppression list.

        So, if parent post is legitimately from O'Reilly (which is not certain), then it's a double confirmation that they are spammers.

    • Re: (Score:3, Insightful)

      by cvos ( 716982 )
      Is slashdot now selling reviews? This seems like a paid piece, with gratuitous, anchor-text-heavy links back to the author.

      Nothing wrong with this, but disclosure would be nice.

    • by Crizp ( 216129 )

      Have you followed Tim O'Reilly's Twitter? God, that man spams that service with uninteresting marketing crap all the time; had to unfollow him. Agh.

  • It looks like Slashdot will need some website optimization with how flaky it's been today.

    • I was curious if it was just my connection. images.slashdot.org likes to stall out, and I've been getting connection resets and timeout errors recently. Linux, Windows, whatever. What gives, slash?
      • by ColaMan ( 37550 )

        I get something similar on the main page that completely glues up firefox for 20 seconds or so. Can't switch tabs, window doesn't redraw if I switch away and then back to firefox, it's completely stuck.

        Been happening now for a month, and only on /. - it's getting a bit tedious.

        Anyone else seen this? Firefox 2.0.0.14, moderately stock eeePC distro, no plugins except for adblock.

  • 1 - Don't use Flash.

    2 - DON'T use Flash.

    3 - DON'T USE FLASH.

    • Re: (Score:3, Insightful)

      Exactly. Because as everyone knows, Flash is evil [slashdot.org]
    • Don't use Flash as a presentation layout tool. Flash makes perfect sense for throwing in a little interactive graphic or diagram, or for inserting a video clip, but the website should be text and images, preferably well-described images (ALT and TITLE are your friends!)

      I used to see way too many websites that used a static image (or sliced image) as a page to avoid layout issues with HTML. Now they fade that stupid image in with Flash instead.

      • by supernova_hq ( 1014429 ) on Saturday November 08, 2008 @12:50AM (#25685397)

        Flash makes perfect sense for throwing in a little interactive graphic or diagram, or for inserting a video clip...

        Would you idiots PLEASE, for the LOVE OF ALL THAT IS HOLY, stop justifying the use of flash for videos when there is a perfectly good embed tag used for embedding *gasp* videos! Not only does this avoid the annoyance of a user with flash-block, it also lets the user use their own preferred video player, with easy fullscreen and proper streaming. And there are some operating systems (pretty much anything 64-bit, and most unix systems) that have abysmally shitty flash players, not to mention users who simply do not want it.

        In my opinion, as a user AND a web developer, the ONLY truly acceptable use of flash is for games and intensively interactive media. Even in these cases, Java is a much better alternative. Too bad it lost the browser war :(

        • by Baricom ( 763970 )

          Would you idiots PLEASE for the LOVE OF ALL THAT IS HOLY, stop justifying the use of flash for videos when there is a perfectly good embed tag used for embedding *gasp* videos!

          There's no such thing as an embed tag in HTML. If you believe differently, please reply with a link to the appropriate part of the HTML/XHTML spec.

          • But there is an object element to embed video!
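
            A minimal sketch of that approach (the file path and dimensions are illustrative):

                <!-- Standards-valid embedding via the object element; the
                     inner markup is a fallback for browsers that cannot
                     play the video. -->
                <object data="clips/demo.mp4" type="video/mp4"
                        width="640" height="360">
                    <p>Your browser cannot play this video.
                       <a href="clips/demo.mp4">Download it</a> instead.</p>
                </object>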

          • The embed tag is one of the official new elements in HTML5 [w3.org].

            We're not in an era anymore where the specs or validators can keep up with the advancement in browser technologies. Should we not use ARIA attributes to mark up our content to provide better support for assistive technologies because the W3C validators do not pass them as valid, despite the ARIA specification saying that they should be?

            We are in an era where both worlds can be meshed together. Put your content inside of the newly created video tag t

        • I see no valid difference from a security or usability perspective between using an OBJECT tag or Flash for showing videos. Both require a third-party embedded object on the website. One is very heavily used and therefore probably somewhat better audited for bugs.

        • by alexo ( 9335 )

          In my opinion, as a user AND a web developer, the ONLY truly acceptable use of flash is for games and intensively interactive media.

          Not being a web developer, I cannot comment on that part but, as a user, I sometimes stumble upon sites that use Flash in interesting ways. Yes, that means I have to add NoScript exceptions to view the content but that's not too much of an inconvenience if I believe it is "safe".

          One example is Blue Cat Networks [bluecatnetworks.com].
          Hover over the people, then try clicking some of them.

          (No, I don't

    • Re: (Score:3, Funny)

      Oh, and don't use Java.

      A website written in Java that runs quickly? hahahahahahahahahahahahahahahahahahaha

      • It depends. Seen the latest? I'd bet not.
  • This article was collapsed for me by default. I clicked on it because it looked interesting, only to find out it was a book review. Had it been labeled as such, I wouldn't have had to click on the link. Slashdot apparently isn't "optimized" if they're wasting cycles like these.
    • I clicked on it because it looked interesting but only to find out it was a book review. Had it been labeled as such I wouldn't have had to click on the link.

      Uh? What are you smokin, man? :P It says: Book Reviews: Website Optimization in the title.

  • VALIDATE (Score:4, Insightful)

    by Yarcofin ( 1397091 ) on Friday November 07, 2008 @05:40PM (#25681547)
    How about people actually validate their websites? Something like only 3% of the web is valid XHTML. http://validator.w3.org/ [w3.org]
    • by Twinbee ( 767046 )

      I've never understood the obsession with validation. Especially 100% validation.

      "Oh noes, my
        hasn't been closed. Maybe it will crash millions of browsers across the world"

      • Re: (Score:3, Funny)

        by Twinbee ( 767046 )

        Ooops, that was supposed to contain a br tag. Trust slashdot to mess up despite me posting as plain text.

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        Little bugs like that add up. What if every application developer adopted the same irresponsible view that you have?

        • by Twinbee ( 767046 )

          To me, it's a 'bug' that you even have to close the br tag, when one should suffice.

          I suppose the whole thing is comparable to grammar. Most times, unusual spelling or grammar will be for the worse. However, sometimes, it will make things clearer, shorter, or just be more logical.

          But in the end, the thing I would drum home is that the time spent getting to 100% validation would be better spent improving the web site in other ways.

          • Re:VALIDATE (Score:4, Informative)

            by supernova_hq ( 1014429 ) on Saturday November 08, 2008 @12:57AM (#25685423)

            I'm sorry, I really hate listening to so-called website developers complaining about having to have 100% validation. If you are any good at what you do (few web developers are these days), you should be able to write a website and only need to spend a few hours (for a rather large site) fixing validation.

            In fact, using today's frameworks (WordPress, CakePHP, etc.), you really only need to do this once for the layout; then the content is usually trivial.

            In my mind, a web developer who does not care about 100% validation is like a secretary not worrying about 100% spelling and grammar checking! It's simply irresponsible, lazy, and a bad work ethic.

            • by Twinbee ( 767046 )

              The difference there is the secretary's writings will be viewable to the public. In the case of HTML, the 'mistakes' are behind the scenes and not viewable to the public. If the end result is what really counts, then why the heck care?

              Yes, I appreciate how standards can help remove ambiguities, but as long as browsers render 100% validated HTML exactly the same, then it's only a bonus if they ALSO allow leeway for webmasters' mistakes/shortcuts. Those are two separate, but not incompatible goals.

              In the end,

              • The problem is that those validation errors are likely to crop up later. Maybe in $FUTURE_BROWSER. It's not like fixing those errors is hard.

      • Re: (Score:3, Insightful)

        by Ed Avis ( 5917 )

        One reason to validate is that if you write valid HTML, it will display according to the HTML standards on any browser that's not buggy. You can view it in your favourite browser (Firefox, Chrome, whatever) and be fairly sure it will work the same in anyone else's. If the HTML contains errors, then the browser must use heuristics to correct it, and these heuristics are not standardized. So it's a matter of luck whether it will work correctly in $random_browser. It probably will, and you can test it in v

        • it will display according to the HTML standards on any browser that's not buggy. You can view it in your favourite browser (Firefox, Chrome, whatever) and be fairly sure it will work the same in anyone else's

          Yeah right, that covers about 15% of the users. For the other 85%, you have to write workarounds anyway, valid HTML or not.

          If the HTML contains errors, then the browser must use heuristics to correct it,

          Well, a lot of the HTML validation errors are typically about things like unescaped ampersands in URL

          • Re: (Score:3, Insightful)

            by Ed Avis ( 5917 )

            Yeah right, that covers about 15% of the users. For the other 85%, you have to write workarounds anyway, valid HTML or not.

            Not really. It's 2008. Any web browser written in the last ten years is capable of displaying HTML4 pages correctly. OK, if you do advanced CSS stuff then there might be subtle differences in table border collapse properties or other arcana, but who really bothers with that? Most sites don't use or need anything advanced (and CSS degrades gracefully in pretty much every browser that

            • Any web browser written in the last ten years is capable of displaying HTML4 pages correctly. OK, if you do advanced CSS stuff then there might be subtle differences in table border collapse properties or other arcana,

              Well, some parts of my website have CSS stuff that failed horribly when I tried viewing them in IE7. I'm not talking about a pixel offset, but completely disappearing DIVs. But you're right that that isn't HTML4 per se.

              BTW, there are not four HTML dialects. The basic HTML grammar does not diff

              • Simple. Don't use the Transitional DTD. Use the Strict DTD only. As for XHTML, it's a bad idea to use it as IE still doesn't support it.
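
                For reference, opting into strict parsing is just a matter of the doctype; this is the standard HTML 4.01 Strict declaration:

                    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
                        "http://www.w3.org/TR/html4/strict.dtd">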

      • The vast majority of validation problems are actually syntax errors. Fixing them helps ensure the browser parses the document tree properly.

    • Valid XHTML? Or broken XHTML that really is HTML4?

  • 80 - 20 rule (Score:3, Informative)

    by SethJohnson ( 112166 ) on Friday November 07, 2008 @07:20PM (#25683091) Homepage Journal


    I used to travel the world solving performance issues for web sites. As an example, I was contracted to support voter.com on the eve of the 2000 elections.

    Want to see huge performance gains with minimal work? Here are the easy fixes:

    1. Check the error logs. Layout monkeys will frequently forget to bundle up a spacer gif or some other graphic when deploying to production. Each one of these requires a separate HTTP request to be sent, handled, a file system hit, and a 404 response returned. All four tasks can be eliminated for the server if you remove those HTML references from the site layout.

    2. Either turn off logging in production, or put the logs on a separate (physical) file system.

    3. Memory is cheap. If you can, mount your document root to virtual memory.

    4. Cache, cache, cache. If you are deploying new content, spider your site in testing, manually copy the cache to the doc root of production. Keep the load off the production DB.

    5. If you can, log all DB transactions over the course of a day. Check for repetitious SQL. Convert those commands to stored procedures, then update the dynamic page generation code to use the stored procs.

    I know these are obvious recommendations. You wouldn't believe how many high-traffic sites don't implement these basic techniques.

    Seth
    • Have you considered hardware acceleration? After suggestions 1-5 are exhausted, it can really improve your throughput if your server's CPU is getting bogged down.

        Hardware acceleration by purchasing more and/or faster CPUs is a longer-term performance improvement strategy. You can't just take a web server offline, put 8 new quad-core CPUs in the box, and reboot. A major web publisher is going to have to build that system in parallel, test it, then migrate it into production. That is a months-long process.

        Meanwhile, the tips listed here will give a poorly-implemented website a boost that will seem like 8 quad-cores were installed.

        Seth
        • A gzip compression card (if you have an available PCIe slot) would be another hardware route to take.
  • I lead WikiFur [wikifur.com], which was recently lent hosting by a fan. I didn't want to hog the server, so I scoured the web for tips on reducing the impact of websites. There turned out to be a lot of improvements that could be made [livejournal.com] which significantly increased our performance while drastically cutting the load. The biggest difference is not in reduced bandwidth or increased maintainability but in the user experience. Simply put, people love being on a fast site. No site will reach its potential if you have to wait t
  • YSlow Firebug addon (Score:3, Informative)

    by Spikeles ( 972972 ) on Friday November 07, 2008 @08:16PM (#25683643)
    YSlow [yahoo.com]
  • Definitely helps. It would help /. at least... I *always* get the "busy script" alert in Firefox here. Painful.
