GUI Software

Web 'Rules' Changing?

sempf writes "Lots of things have changed since we started with HTML. The IMAGE tag was a nice change, and multimedia via plugins like Flash provides a new look. What interests me the most, however, is the change in two of the hallowed GUI 'Rules' - the three click rule and the 7 +/- 2 rule. The Three click rule (which states that any page in a site or function in an application should be accessible in three clicks) was just debunked by Josh Porter in an article called Debunking the Three Click Rule. The 7 +/- 2 rule states that a user should never be presented with more than 5-9 choices at any given point in the site or application. James Kalbach has an excellent article debunking that rule at Dr. Dobb's Journal. Worried that there will be no more 'rules'? Never you mind - the Government has come up with New Rules for us to follow."
  • by Pingular ( 670773 ) on Sunday November 30, 2003 @11:08AM (#7592334)
    I'm worried that I didn't know about any rules, and that there are any in the first place.
  • Different times. (Score:5, Interesting)

    by liveD ehT ( 662508 ) on Sunday November 30, 2003 @11:08AM (#7592337) Homepage
    The thing about rules like "three clicks" is that they are based on the pre-bubble notion of buzzwords. That doesn't work anymore in the web design field. Now we have to provide tools that the customers want to have, and design stuff so that they can easily access it. Document trees, under the nice standards at the w3 [w3.org], are what has really changed with the internet, not to mention PHP, Perl and free db solutions like MySQL and the other guys.

    If users are leaving after 12 clicks now, like it says in the article, that says something about the level of web-smarts of the average user. But what I see in these charts is a kind of "split the difference" research insight.

    For clicking, it's 50/50 that people will go on to get what they want. For the percentage of unsatisfied users, it's 50% who are unsatisfied, according to their research.

    What they've said is: "Users weren't any more satisfied with shorter clickstreams than they were with longer clickstreams. The satisfaction of users doesn't depend on the number of clicks."

    So that means that in the old days, people were getting used to the infrastructure of web surfing, and things that were far away were annoying people. Today, people are used to the web... some teens have grown up on it, and therefore people as a whole are used to it. Therefore, things like design style and presentation mean more than how far clicks are, and if they know they can get what they want by going there.
    • by jackb_guppy ( 204733 ) on Sunday November 30, 2003 @11:24AM (#7592411)
      It is not different times.

      It is not smarter or better trained users.

      It is about UI design.

      It is about understanding the human that is doing the clicking and what is the value of that click.

      The GOV guidelines show some of that. You try to remove annoyances and learning curves. Use words instead of codes. Make it easy to look at, so that the eyes follow the flow.

      We had this same issue when screens were 24x80 or smaller (anyone remember 6x40?).

      Lastly, where most sites have made the biggest step forward is in allowing google or other search engines to index their sites. This lets the user simply "step" directly into the website where they need to start, rather than navigating through lots of bad panels to get to the one panel they need.

      • Re:Different times. (Score:5, Interesting)

        by Hoplite3 ( 671379 ) on Sunday November 30, 2003 @12:10PM (#7592586)

        It is not different times. It is not smarter or better trained users.

        Bah. C'mon. The nintendo generation is better at reading lots of information at once than their parents. Case in point: fighter planes. In Vietnam, pilots would turn off their SAM warning squawk-boxes because of information overload. The noisy box would begin to ping and the pilots would melt down from watching too many gauges. Nowadays, our pilots process much, much more information. Sure, some of this increased capacity comes from a proper layout of the cockpit, but much of it comes from training received from two Italian plumbers and buttons A and B.

        In exchange for shorter attention spans, we've gained the ability to process lots of information quickly.

        • by jackb_guppy ( 204733 ) on Sunday November 30, 2003 @01:16PM (#7592877)
          It is better layout. Include in that heads up displays and better control layout.

          All that can be shown from two Italian plumbers -- the action must happen in the field of view if you want someone to react.

          Does that not define "heads up displays"?
        • by A nonymous Coward ( 7548 ) * on Sunday November 30, 2003 @02:16PM (#7593153)
          It is absolutely traceable to better GUI design. Old style cockpits were full of gauges that had to be scanned, constantly, always checking temperature gauges and a zillion things which almost always had the proper readings and did not change, scan the instruments, scan outside, scan the instruments, scan outside ... boring as hell scanning those gauges, because they were almost always showing what they should have been ... can you spell repetitive? boring?

          Glass cockpits and HOTAS, Hands On Throttle And Stick, changed everything. The computer monitored instrumentation, and only showed what was out of spec, and alerted you when that happened. HOTAS meant doing everything from the two controls, stick and throttle. No more moving your hands from the primary flight controls to reach for one of dozens of toggle switches and dials which all looked the same, while pulling 5Gs and still trying to scan all those round gauges and track the situation outside and look where your fingers were.

          I knew a retired air force pilot who had flown patched up MiGs collected from battlefields, who said the biggest difference between planes of the same era was that the US planes had HOTAS and glass cockpits, and the Russians still had round gauges and toggle switches. Even if the Russian got on the tail of a US fighter, he had to reach up or over while pulling Gs, trying to reach the arming and firing switches and having to do it quickly with a hand which really should have stayed on the throttle and stick because he was in combat, but no, so he lost a bit of maneuvering while the American was doing it all with ease because his hands were on the controls that mattered and his eyes were outside the cockpit instead of scanning dozens of round gauges.

          *That* is a classic GUI redesign.
      • by symbolic ( 11752 ) on Sunday November 30, 2003 @02:22PM (#7593176)

        I personally don't mind longer clickstreams, as long as they make sense! What really gets me going are two things: Flash-only navigation, and pages/content that make no sense given your task at hand. Take Comcast's web site, for example -- a prime example of UI nastiness.

        First off, if I don't have access to a flash-enabled browser, I can't do anything associated with my account, or locate any contact information. Even after I have access to flash, there's even more trouble. Instead of providing you a list of service contacts so that you can easily scan through and locate the one in your area, you first have to know which of five or so regions you reside in (county incorporated, county unincorporated, etc). How the hell am I supposed to know this? Who cares? I only want some help with my problem, and suddenly I've got a whole new issue to deal with.

        Even worse, this is the exact same request screen that appears when you're looking to BUY service from comcast, so it almost looks like they're inter-mingling their support and sales functions- quite confusing, because you're never sure if you're on the right page. My only feedback in situations like this would be to hire a competent web designer/design firm who is well aware of UI issues, and come up with a better solution.
      • by spoco2 ( 322835 ) on Sunday November 30, 2003 @11:06PM (#7595888)
        But, to me, the Government website was a mess... WAY too many headings on the one page for my liking... why couldn't they have the headings on one page which linked through to the sub headings? Or, to keep it in one page, click on the heading to expand out the sub headings?

        Reducing the number of clicks to get somewhere just for the sake of reducing the number of clicks is ridiculous when the tradeoff is a site that is actually harder to take in at a glance.
    • Re:Different times. (Score:5, Interesting)

      by Ed Avis ( 5917 ) <ed@membled.com> on Sunday November 30, 2003 @12:32PM (#7592668) Homepage
      If you can have at most three clicks... and only seven choices at each point... then your site can hold only 7 ** 3 == 343 pages! So clearly at least one of the two rules is bogus.
      • Re:Different times. (Score:3, Interesting)

        by yamla ( 136560 )
        This is what I was told when I worked at Suncor, an oil company. I was maintaining a little over 2000 pages on the Intranet at that point and some high-priced consultant spat these two rules out at us. I said, 'Great, that takes care of 343 pages, what about the other 2000 pages on our intranet web site?' They were quite surprised we had so many pages (this was 1997 or 1998) but could offer no alternative. They just restated both rules.
      • by fishexe ( 168879 ) on Sunday November 30, 2003 @02:35PM (#7593228) Homepage
        If you can have at most three clicks... and only seven choices at each point... then your site can hold only 7 ** 3 == 343 pages! So clearly at least one of the two rules is bogus.

        Incorrect. Obviously, both rules are valid, and there exists simply the (correct) corollary that sites with more than 343 pages are improperly designed.
      • Re:Different times. (Score:3, Informative)

        by catenos ( 36989 )
        If you can have at most three clicks... and only seven choices at each point... then your site can hold only 7 ** 3 == 343 pages!

        Just for completeness: Your math is wrong.

        With 3 clicks you have 4 levels. The first (which you see without clicks), and the three following. The maximum number of pages you can have, given that there are no cross-links is:

        level 1: 1 page, links to 7 pages of level 2
        level 2: 7 pages, links to 7*7=49 pages of level 3
        level 3: 49 pages, links to 49*7=7^3=343 pages of level 4
        level 4: 343 pages

        So the maximum is 1 + 7 + 49 + 343 = 400 pages in total, not 343.
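        In general, with b choices per page and d clicks, the maximum number of reachable pages is the geometric sum

        \sum_{k=0}^{d} b^k \;=\; \frac{b^{d+1} - 1}{b - 1},
        \qquad b = 7,\ d = 3 \;\Longrightarrow\; \frac{7^4 - 1}{6} \;=\; 400 .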

    • Re:Different times. (Score:3, Interesting)

      by Atomic Frog ( 28268 )
      No, it's not about users getting smarter or anything. It is like how most Windows 95/98 users have come to expect, and put up with, crappy operating systems that crash and mess up all the time.

      People will go through 12-25 clicks or more, because they have surfed so many bad sites that they "expect" to have to do that to get at the information that they need.

      Imagine that the user visits a "bad" site that needed 20 clicks to get to the information. Then he/she visits a "good" site which gets them there in 3 clicks.
  • No rules... (Score:5, Insightful)

    by mOoZik ( 698544 ) on Sunday November 30, 2003 @11:09AM (#7592342) Homepage
    Unlike real life, the Internet has no rules, be it content, language, format, or organization. These rules are generally asserted to better help web designers (as there are some horrendously designed sites), but they are by no means written in stone. Follow what you think is best.

    • Re:No clue... (Score:3, Insightful)

      by rokzy ( 687636 )
      > they are by no means written in stone

      who said they were?

      here's a hint:
      http://dictionary.reference.com/search?q=rules
      3 A usual, customary, or generalized course of action or behavior: "The rule of life in the defense bar ordinarily is to go along and get along" (Scott Turow).

      the point about the article is that the ideas of what constitutes good design are changing, not that there are or aren't actual rules.

      > Follow what you think is best.

      what an empty statement, what was I going to do before y
    • Re:No rules... (Score:5, Insightful)

      by ax_42 ( 470562 ) on Sunday November 30, 2003 @01:14PM (#7592872)

      These rules are generally asserted to better help web designers (as there are some horrendously designed sites), but they are by no means written in stone. Follow what you think is best.


      No, the "rules" not there to help the webdesigner. They are a best practice to allow the USER of your site (you know, the one the site is there for) better reach the information he is looking for. If you have the talent, then what you think is best is best for the user -- if not, then please allow these "rules" help you to make a usable website.
  • by Anonymous Coward on Sunday November 30, 2003 @11:10AM (#7592348)
    The 7 +/- 2 rule doesn't apply on this site. On any given page, there can be what seems like 50-100 links! :D
  • Rules? (Score:5, Informative)

    by Pingular ( 670773 ) on Sunday November 30, 2003 @11:10AM (#7592350)
    Never you mind - the Government has come up with New Rules for us to follow
    It clearly states on the website that they're guidelines [usability.gov], not rules.
    • Re:Rules? (Score:5, Informative)

      by Chalybeous ( 728116 ) <chalybeous@@@yahoo...co...uk> on Sunday November 30, 2003 @11:21AM (#7592398) Homepage Journal

      It clearly states on the website that they're guidelines, not rules.

      ... and they're not all that new, either. I seem to recall that Jennifer Niederst [littlechair.com] wrote about government accessibility guidelines in her book, Web Design in a Nutshell [littlechair.com] (O'Reilly, 2001).
      Not that I do much web design myself, but I bought this book instead of HTML for Dummies purely on the basis that I resented the idea of purchasing a book that implied I was stupid. I was actually pleasantly surprised that Niederst's book was written for the intelligent individual, at a level accessible to beginners.
      Of course, I might be wrong - that book is about 40 miles away, in a box in my parents' attic, so I can't check to see if I remember right about any chapter on accessibility guidelines. But I liked it so much that I've gone on to buy O'Reilly books whenever I need a useful reference, like the time I considered switching to Linux. And when (if?) I get around to making a website, you can bet I'll be using WDiaN as my guide.

      Anyhow, that went a little OT. YMMV, of course.

  • by ikoleverhate ( 607286 ) on Sunday November 30, 2003 @11:11AM (#7592352)
    These are rules of UI design, not specific to the web... Bad headline ./
  • Rules (Score:5, Informative)

    by Rumagent ( 86695 ) on Sunday November 30, 2003 @11:13AM (#7592364)
    Imo, a good webpage should follow these two rules:
    1. Have actual content.

    2. Make content easily available.

    But maybe I am just old...
    • Re:Rules (Score:5, Insightful)

      by leerpm ( 570963 ) on Sunday November 30, 2003 @11:32AM (#7592446)
      That is an oversimplification of the issue. It is almost the same as saying 'all good webpages should be good'. Well of course. But what defines good? Similarly, what defines making content easily available? The answer is not as trivial as you might think. And the definition keeps changing, because both people and the technology they use keep changing.
      • Re:Rules (Score:5, Insightful)

        by placeclicker ( 709182 ) on Sunday November 30, 2003 @12:53PM (#7592755) Journal
        That is an oversimplification of the issue. It is almost the same as saying 'all good webpages should be good'. Well of course. But what defines good? Similarly, what defines making content easily available? The answer is not as trivial as you might think. And the definition keeps changing, because both people and the technology they use keep changing.
        Maybe, maybe not.

        It is true that Google [google.com] replaced Yahoo [yahoo.com] as THE search engine.

        It could well have succeeded because it lacks all that crap on the front page.
  • by phrostie ( 121428 ) on Sunday November 30, 2003 @11:13AM (#7592367)
    Not only is the Three Click Rule correct for Web sites, but also for applications. If you embed the final page/function so deep that the user can't find it, you might as well go back to the CLI, or just google to the final page, skipping all the intermediate menus/BS.
    • scripsit phrostie:

      if you embed the final page/function so deep that the user can't find it, you might as well go back to CLI

      You're quite right, but probably not for the reason you think. If you have to provide a large number of options, CLI is the best way to do it. That's why so many geeks that bothered to learn to use a shell still launch applications from xterms; it's more efficient than wading through menus. That's also why Google doesn't offer you a menu of the Internet, or a list of icons for

  • Yeah, right. (Score:3, Insightful)

    by pairo ( 519657 ) <gcbirzan@g[ ]l.com ['mai' in gap]> on Sunday November 30, 2003 @11:14AM (#7592369) Homepage
    So what if we have all these rules if the overwhelming majority of pages out there have Flash intros, and content only accessible if you take the time to go through 20 intermediary pages? How many web designers actually know these rules (guidelines) even exist? I, for one, strongly agree with these rules, since they enable you to actually USE the webpages, not simply drool over the shiny pictures, but most people out there simply don't know better.
  • by weave ( 48069 ) on Sunday November 30, 2003 @11:15AM (#7592376) Journal
    Click #1, site map

    Click #2, find link after futzing with page search if needed.

    Oh, I'm sorry, I guess that violates the 5-9 items on a page rule.

    I have better rules. How about ban senseless use of flash, annoying animated graphics, lazy conversion of printed matter to PDF documents instead of crafting true HTML pages, and sites with little or no content? But then again, who am I to argue with marketing "experts" who know what I want better than I do?

    • by Migraineman ( 632203 ) on Sunday November 30, 2003 @12:06PM (#7592564)
      Whenever I need information about a product or application, I very much appreciate having access to a PDF version. I can take it with me on my laptop when I'm in the field or at a customer site, and I can archive it on CD in the event that the product is discontinued (or the company goes tits-up, leaving me with the maintenance issues.)

      I've noticed that many companies have taken to presenting product data as HTML-only. I find that annoying because, assuming I'm interested, the first thing I end up doing is printing the HTML page to a PDF file so I can archive it. Usually I need to futz with the page formatting before I get a useful output, and that futzing costs me time and aggravation. I'm not advocating that all content should be PDF'd, but I do believe it has substantial value. Balancing the amount of HTML and PDF content presented is the tricky (and subjective) part.
      • by Minna Kirai ( 624281 ) on Sunday November 30, 2003 @12:36PM (#7592678)
        I find that annoying because, assuming I'm interested, the first thing I end up doing is printing the HTML page to a PDF file so I can archive it.

        This I can't believe. How can a PDF ever be better than HTML for digital archiving? HTML was meant to be read on computer; PDF is intended to be printed out.

        Unless you really meant "ugly HTML [tomshardware.com]" instead of merely "HTML". Stupid web pages with colorful toolbars, formatting, background pics, tables-for-layout, ad banners, 'related content' links and 'click here for page 3/21' on the bottom... they're a tough way to read documents, and I suppose a PDF could be an improvement.

        But the best way for publishers to present documentation is as simple, usable HTML [gnu.org]. Then, if the reader wants a PDF, she can print it herself, and it'll take whatever font and pagination she prefers. (PDFs created by publishers are greatly flawed in that the layout is frozen, instead of being dependent on the qualities of the output device. If I'm reading on a computer, there should be no page breaks.)
        • This I can't believe. How can a PDF ever be better than HTML for digital archiving? HTML was meant to be read on computer; PDF is intended to be printed out.

          Apparently he can't deal with the idea of a document that isn't contained within a single file. That's about the only archival advantage PDF has over HTML. Personally, I prefer the HTML.

        • by EvanED ( 569694 ) <{evaned} {at} {gmail.com}> on Sunday November 30, 2003 @01:58PM (#7593068)
          "PDFs created by publishers are greatly flawed in that the layout is frozen, instead of being dependent on the qualities of the output device. If I'm reading on a computer, there should be no page breaks."

          It depends on your purpose. If I'm printing some reference material (for instance, the make manual; ignore the fact that there is probably a more print-suitable PDF version available directly from GNU), I would print it to PDF first. Why? So I can see how it will look. With a lot of archival material, I want it to look as much like the original as possible. If I'm reading the report on the Columbia disaster, I want to read it in PDF so I can see how it was organized in print. I don't want to read the HTML version (like what NASA has for the Challenger's Rogers Report).

          Now, of course there are a lot of things for which PDF is unsuited, but there are many many cases where it is very helpful.
      • Whenever I need information about a product or application, I very much appreciate having access to a PDF version. I can take it with me on my laptop when I'm in the field or at a customer site, and I can archive it on CD in the event that the product is discontinued

        HTML pages save to disk too. They just don't come out as a single file.

        • Yes they do. It's called MHT. IE does it. If you ask them, they'll claim it's "microsoft html format", but it actually stands for "MIME HTML" (all the pages, images, etc, are encoded as MIME and embedded in a plaintext file).
    • Graphics are part of the generally accepted HCI guidelines - it is a Bad Thing to use distracting animations.

      There's also a general convention that says "Don't do things just because you can". (Do only what is needful - I'm sure there's a Yoda-ism for this).

      And PDFs are a preferred medium for transferring information which is going to be printed in a lot of places, although generally you should still provide information in HTML and PDF format (more HCI, multiple routes to any single goal).
    • by STrinity ( 723872 ) on Sunday November 30, 2003 @02:58PM (#7593337) Homepage
      You forgot the most important rule: Never assume people will view the site exactly like it appears on your screen.

      I don't like web designers controlling my browser, so I have Firebird set not to display blinking text, status-bar tickers, banner ads, or custom scrollbars; flash doesn't play unless I click on it; I have toolbar buttons to resize text; automatic pop-ups don't work, and links that are supposed to open in pop-ups go straight to a tab. There are lots of sites that look laughable (if not unusable) because the designer added all sorts of bells and whistles on the assumption that everyone uses IE with the defaults set.
  • by Kedder ( 529127 ) on Sunday November 30, 2003 @11:16AM (#7592379) Homepage
    It seems they don't follow their own rules [usability.gov] ...
  • by wowbagger ( 69688 ) on Sunday November 30, 2003 @11:17AM (#7592383) Homepage Journal
    The "Three click rule" and the 7 +/- 2" rule are good rules for designing simple UIs (of which web pages can be considered a subset), but simple inspection can reveil the problem with this idea.

    Suppose a UI were to scrupulously follow both rules. Then you would have a maximum number of choices of 9 ^ 3 = 729 choices. No more.

    That may be great IF the number of choices you have is less than 729, and IF the choices can naturally be grouped in bunches of 9.

    However, any complicated application may easily exceed this.

    Moreover, people CAN deal with more than 7 choices, as long as the choices are somewhat related - Baskin-Robbins' 31 flavors are all exactly that - "flavors". Imagine if a BR menu offered 31 choices of foods, drinks, plate colors, locations in the restaurant, server names, music, etc. ALL AT ONCE.

    7 +/- 2 and 3 click are useful GUIDELINES. Just as "using goto in C/C++ is generally a bad idea" or "pointing a loaded gun at any part of your body is a bad idea" are pretty good guidelines, there are times when you need to violate them (e.g. error handling in the absence of exceptions, demonstrating a bullet-resistant vest, or designing a complicated piece of test equipment).

    You should just use them AS GUIDELINES - "Hey, I really have a lot of items in this menu, perhaps I should take a break and see if I can come up with a different way to group them?"
  • Case in point, "provide printing options [usability.gov]." Any webhead worth her salt knows that providing a duplicate page just for printing is a waste of time and effort, when CSS can do it for you [alistapart.com].
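    What that looks like in practice, roughly -- a minimal print stylesheet sketch (the class names here are invented for the example, not from any particular site):

    @media print {
      /* hide navigation and other screen-only chrome */
      .nav, .banner, .sidebar { display: none; }
      /* let the content use the full width of the paper */
      #content { width: auto; margin: 0; }
      /* plain dark-on-white serif text prints best */
      body { color: #000; background: #fff; font-family: serif; }
    }

    The same HTML then serves both screen and print; only the stylesheet changes.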
    • When I tried forcing a page break via CSS in Mozilla (about half a year ago), it didn't work.
      w3c rules aren't very effective either...
    • The "no" comes when you have extra information which is required to give the printout context as a "standalone". One example might be a mail reader which has the From/Subject/Date in one frame, and the message body in another.

      Of course, generating a whole separate page is generally a waste of time, IMNSHO. What I like to do is diddle the HTML of the frame during the print operation. I haven't found a great general solution for Moz yet, but trapping onbeforeprint/onafterprint in IE and relying on the window
  • Wait... but that statement is a rule itself! So that's a contradiction, so there must be at least one rule.

    Q.E.D.

    Matt Fahrenbacher
  • by SharpFang ( 651121 ) on Sunday November 30, 2003 @11:23AM (#7592408) Homepage Journal
    1) Use as much stuff as you can. No matter how unnecessary it is, put it there.
    2) If you plan on creating something, put a link to an 'under construction' page with that thing's name. If you don't plan on creating it, put that link anyway.
    3) Put in as many javascripts and as much plugin content as possible. Best if you make all the navigation buttons separate java applets, or the "enter" button with flash.
    4) A right-click blocking script is a must.
    5) Use freestyle HTML. No tag should ever be closed; let's see how the browser handles undocumented parameters, and what about making up my own tags?
    6) Never forget about "Make this page your homepage" button!
    7) Graphics is everything. You may leave a 60x60px box for text content, but a huge background is essential. There should be at least half a megabyte of non-skippable intro in flash before the content proper.
    8) Instead of creating thumbnails in your gallery, use height= and width= parameters on original, full-size images.
    9) a href= is unfashionable. Use javascript to change pages.
    10) It's highly desirable to open the page in a new 'kiosk'-style popup window. Let's force people to disable their evil popup-blocker software; nobody dares use buttons like "reload" or "back", only site-provided navigation is allowed! ...add your own.
  • by MyNameIsFred ( 543994 ) * on Sunday November 30, 2003 @11:29AM (#7592435)
    In their analysis they had users complete different tasks and asked them about their experiences. And they noted no difference as a function of number of clicks. I think this misses the point of the 3-click rule.

    The 3-click rule gets to the importance of accomplishment -- getting that feeling of moving forward. Your typical e-commerce site takes several pages to enter credit card info, shipping address etc. But as I move thru it, I feel that I am accomplishing the task. On the other hand, if I go to the same site looking for a particular item to buy, I'll give their navigation and search tools about 3 chances to find the item before I move on to another site. If they can't get me close to what I'm wanting in 3 clicks, I'm out of there.

    This is the secret that Disney has learned. Their popular rides have LONG lines, but they keep you moving. They entertain you in line. A much better experience than a typical amusement park, where you stand stoically in line.

  • by ediron2 ( 246908 ) * on Sunday November 30, 2003 @11:32AM (#7592447) Journal
    I've just made the faux-pas of actually reading* the linked article that claimed that 3-click was debunked, and I don't agree.

    The 3-click rule says info should be accessible within three clicks.

    The article contesting this says they watched over 8000 user clicks, and most users clicked 25 times before 'giving up', when it appeared they were searching for stuff.

    The gap that I see is in not more-deeply analyzing how the clicks of users related to depth-of-tree (i.e., 1-click from home, 2-clicks, or 3-clicks, etc.) or perceived website quality. It is possible that people spent 25 clicks wandering but resurfaced to 'home' several times in trying to find the proper 3-click path to their desired target.

    My point is that truly debunking this concept would involve:

    1 - looking for 'back to home' patterns in click streams.
    2 - classifying users a few ways (Some people are too timid/stupid to use the 'back' button!)
    3 - validating user satisfaction on usability of sites that honor/ignore the 3-click rule.

    All the article does is prove that people are persistent, even in the face of crappy webpage design.

    * - My apologies; I hope admitting that I read the article doesn't completely destroy my /. karma. I promise I won't read the article ever ever ever again, so this should be a one-time problem for slashdotters, since obviously no-one else ever reads articles here.
    • I saw that study, and thought "what was the testing method?" Was it:
      • a bunch of internet users who agreed to have their internet activities monitored, with satisfaction monitored by a follow-up survey.
      • an analysis of web logs, with the follow-up survey based on a pop-up after they surfed away?
      • click-wrapped spyware.

      Each of these methods would bring along certain problems. Weblog analysis would (I believe) give the best population sample, but depending on the added popup for the "frustration data" would ske

    • Well, it shows that for this particular user population, people are persistent AND they aren't any more likely to be dissatisfied if they need a lot of clicks than they are if they could complete their task in few.

      However, there is one big red flag with this article: It doesn't describe the user population. The 3-click rule is VITAL when you are dealing with certain kinds of user populations, and irrelevant for others. If you're trying to make a sale of a new type of product for instance, you can expect t

  • The myth of 7 +/- 2 (Score:4, Interesting)

    by pommes ( 538066 ) on Sunday November 30, 2003 @11:36AM (#7592461)
    Periodically, we hear about the rule of 7 +/- 2 from inexperienced interaction designers: users can't handle more than 7 bullets on a page, seven items in a form list, or more than seven links in a menu. There is no evidence for this in reality - on the contrary. The psychologist George Miller's [princeton.edu] conclusions apply to what we can memorize - not what we can perceive.

    Current research strongly supports that broad structures perform better than deep structures. Users can more easily cope with broad structures, they have a greater chance of getting lost in deep hierarchical structures, and new visitors are able to get a better overview of a site's offerings from a broader structure.

    read more: The Myth of "Seven, Plus or Minus 2" [ddj.com]
    • by Da Fokka ( 94074 )
      Although it is true that Miller's rule applies to memorizing objects and not perceiving them, there is a link between the two.

      Perception gets a lot harder when there are more than 7+/-2 relevant objects. If you want to test this for yourself, have someone throw a couple of pencils or similar objects on the ground and count them. You will be able to count them at a glance if there are few objects, but at some number (and for most people, that's around 7) it suddenly gets a lot harder as you'll have to count them.
  • W3C [w3c.org]
    • Yup! (Score:3, Informative)

      by zonix ( 592337 )

      I agree - W3C is where it's at.

      I've just realized though that IE has a severe deficiency which is somewhat of a showstopper for the adoption of XHTML - it ignores the XML declaration in XHTML documents, like this:

      <?xml version='1.0' encoding='iso-8859-1'?>

      IE expects to encounter the DOCTYPE first, which doesn't make sense - that would be invalid XHTML markup. When you feed IE this as text/html, it throws it into quirks mode!

      Sure, the XML declaration is not strictly required, however if yo

  • by Cranky_92109 ( 414726 ) on Sunday November 30, 2003 @11:41AM (#7592479)
    I think as the web matures, these so called 'rules' will be rewritten. No hysterical 'end of rules' proclamations need be sounded.

    The 3 click rule made more sense during the bubble when there was a glut of sites for every category. Or when there really wasn't a definitive site for any one purpose. When a person knows there are a multitude of sites they can look at, they are reluctant to go too deep on any one site. I can recall using 3-5 search engines every time I was looking for something. I would look at the first result page and then try another engine. Now I only hit Google, but I'll look as deep as I need to.

    The 7+/-2 rule is based on a cognitive psychological idea first put forth in an article by George A. Miller, The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information [well.com]. In it he argues that the average person can really only hold about 5-9 things in immediate memory at one time.
    I don't believe that is an internet design 'rule' that should be ignored; too many choices in one space will overwhelm your average users.
  • by ediron2 ( 246908 ) * on Sunday November 30, 2003 @11:43AM (#7592490) Journal
    Everything already is 3-clicks away:
    1. Click in the google search box. Type search terms.
    2. Click on the 'search' button (power users: press the enter/return key)
    3. Click on appropriate link.
    ... natch!
    • Everything already is 3-clicks away:

      1. Click in the google search box. Type search terms.
      2. Click on the 'search' button (power users: press the enter/return key)
      3. Click on appropriate link.


      4. Repeat 200 times because the word "Graph" is a ridiculous name for a plotting program.

  • by spin2cool ( 651536 ) on Sunday November 30, 2003 @11:49AM (#7592508)
    These make my head hurt. Here is the complete list of web designer battle stories [tofslie.com]. Some notable excerpts:
    Client: "We want a website that can play DVD quality video, but we don't want to use streaming video and the load time must be zero."

    Designer: "That's impossible. Everything has a load time. DVD quality runs about 100 megs a minute."
    Client: "We'll take our business elsewhere..."

    Designer: so who will go to this site and for what reason will they go there?
    Client: i don't know
    Designer: well what is the purpose of this site!?!?
    client: i don't know

    client - I don't care if it doesn't work in netscape - I want it!
    Designer - umm.. okay...
    client (2 weeks later) -It looks all broken!
    designer - Huh? Nothing looks wrong on my end. ..blah blah blah... What version of IE are you using?
    client - Netscape.

    Client: Could you use a different font for every name, you know make it cool.
    Designer: Uh, that's probably not going to look so good, it'll be all cluttered and ugly
    Client: No, it'll look cool, so let's do it.
    (after doing what they want...)
    Client: Now it looks all cluttered.
    Designer: Aarrrgghhh

    And my favorite:
    "Can't we make the text blink?"
  • The "rules" and the "users" are--hopefully--interdependent. Those rules helped train a generation of web users, and now the users are setting forth their own rules. "I won't go to a site that's slow. I won't go to a site where I can't find anything."

    As people become more comfortable with the web, the rules should change to accommodate them.

    Good content, at any rate, always trumps the rules. Look at...ahem...Slashdot.
  • The article on three clicks suggests to me that web users have become so accustomed to the horrible "navigation" on so many websites that they are willing to put up with a lot of frustration.

    I think a better study would ask at what point a user's frustration level increases beyond what they consider acceptable. In other words, what is their patience level?

    -Thomas
  • by elgaard ( 81259 ) <<kd.loga> <ta> <draagle>> on Sunday November 30, 2003 @12:05PM (#7592558) Homepage
    The guidelines recommend optimizing for screen resolution and fonts. I think that is a bad idea.

    If the statistics that say most people have 800x600 screens are not already outdated, they will be soon. And how do you optimize for people's eyesight? If I want bigger fonts, I set the minimum font size in the browser, or tell it to ignore font sizes in webpages even if it breaks the design of some webpages.

    How about just making pages that work with any font size and window size, and not using absolute font sizes?
  • by nnnneedles ( 216864 ) on Sunday November 30, 2003 @12:06PM (#7592561)
    I use it as a rule of thumb all the time.

    The thing you need to think about though, especially on the web, is this:

    It's not about having only 7 links on a page. It's about grouping. You can group links using colors, a box, a header or just placement.

    The reason site maps are useless on most sites is that, if you have a web site with a good GUI, it is actually mentally cheaper to click a few times and wait for pages to load than to be overwhelmed by hundreds of links at the same time.
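    A rough CSS sketch of that kind of grouping (the class names are made up for illustration): each group of related links gets its own box and header, so the eye scans a handful of groups instead of dozens of individual links.

    /* one box per group of related links */
    .linkgroup { border: 1px solid #999; padding: 0.5em; margin-bottom: 1em; }
    /* the group header is what the user actually scans first */
    .linkgroup h3 { margin: 0 0 0.3em 0; font-size: 1.1em; }
    /* keep each group's own list short and tight */
    .linkgroup ul { margin: 0; padding-left: 1.2em; }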

  • by Craig Ringer ( 302899 ) on Sunday November 30, 2003 @12:07PM (#7592566) Homepage Journal
    I just fired this off to the admins of the site:
    ----

    Hi folks

    I have a few comments about your usability guidelines, most notably the font recommendations found at http://usability.gov/guidelines/fonts.html .

    While I agree that a 10pt font is ideal for many people, I think it's totally inappropriate for a website to ever set this. Many people are using high resolution or high-DPI screens where 10pt is unreadable; many need larger fonts because of visual impairment; some may want smaller text, etc. Setting an explicit point size will override any preferences the user may have made in their client.

    I have visually impaired users at work, and they find many websites appalling - I've had to set their browsers to ignore the website's font settings to make many sites usable. This is not a good situation for anybody, as the site designer uses font size and face as a significant cue for navigation and reading.

    As such, I'd love to see you note in your usability guidelines that font sizes should only be set using relative properties - the 'em' measure in CSS, the '%' measure in CSS, the 'larger'/'smaller' descriptive terms of CSS or the 'SIZE="+-n"' measures in the HTML <FONT> tag. CSS 'pt' or 'px' should never be used where accessibility is a concern.

    For an illustration of this problem, I suggest that you find a computer with a 19" monitor capable of at least 1600x1200 (or a 21" that can do 2048x1536) and try to use sites that are set to 10pt. Ideally find someone a bit older for this test. For even more fun, use an OS other than Windows that is not guaranteed to have access to the specific fonts the website designer previewed their site using.

    Another issue I think well worth mentioning is the use of leading/kerning controls in CSS, especially combined with the use of absolute measurements. Setting the leading in type may well make things look very 'crisp' and 'professional' on the designer's screen, but it often makes the content almost unreadable for people who don't have the same fonts, use large or small type, or otherwise differ from the configuration of the designer's test systems. Leading specified in 'px' or 'pt' is especially bad, as this causes lines of type to overlap when the font size is larger than what the page was designed for; it also causes lines to space out very annoyingly when using smaller type sizes. If leading must be specified, it should be expressed in relative measures like 'em' or percent, so that the leading scales with the type size.

    One final comment: some sites, while designed to work with a range of type sizes, fall down severely when viewed with _extremely_ large type as is needed by someone who is partly blind. One of the staff at work has serious vision problems, and she finds that on many sites the columns do not expand with the type. If the type is large enough that only one word fits in each column, this is hard to read - but as words aren't broken, if the columns are a little narrower the type can overlap. This makes a site unusable. Again, it's easily fixed - column and table sizes should be specified in relative measures such as 'em' or percent, never in pixels or point sizes.
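    As a rough sketch of what I mean (the selectors and specific values here are purely illustrative):

    /* respect the user's configured default size instead of imposing one */
    body { font-size: 100%; }
    h1 { font-size: 1.6em; }
    .fineprint { font-size: 0.85em; }
    /* unitless line-height scales with whatever size the type ends up at */
    p { line-height: 1.4; }
    /* columns sized in em or % grow with large type instead of clipping it */
    .column { width: 30%; max-width: 28em; }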

    Unfortunately, certain buggy web browsers - such as many versions of Microsoft Internet Explorer - have severely broken CSS implementations that make this more difficult than it should be. It is still possible to design good sites that work well even for people who need or prefer different type sizes, however - and I think this is an important thing to encourage.

    As monitor resolutions get higher and computer use even more universal, this will no doubt become more of an issue.

    I'd love to hear your comments on these suggestions.

    Craig Ringer
  • by swein515 ( 195260 ) on Sunday November 30, 2003 @12:09PM (#7592576) Journal
    I think the 3-click study is inherently flawed, since they studied the results of tests where people were asked to complete specific tasks; naturally they would *work harder* to complete them.

    Now analyze a bunch of random people, who are not privy to the study in their everyday web habits, and see how the 3-click rule holds up.
  • by bcrowell ( 177657 ) on Sunday November 30, 2003 @12:46PM (#7592723) Homepage
    It's not the clicks, it's the bandwidth. I was just paging through the Sunday NY Times Magazine, and I probably did fifty page turns, which is the dead-trees equivalent of fifty clicks. I didn't feel frustrated at all, because after I flipped the page, I didn't have to wait 20 seconds for the graphics to load.

    That's why it's a good idea to reuse the same graphics as much as possible on many pages of a web site, e.g., place a banner that identifies your company at the top of each page. Modem users will already have the graphics in their cache, and won't have to wait for them to load again.

    What really frustrates me is sites like Apple's, where you can't even tell what's on the page or how to navigate it until you wait for a megabyte of jpegs to load. Thirty three-second clicks is heaven. Three thirty-second clicks is hell.

  • by drmike0099 ( 625308 ) on Sunday November 30, 2003 @12:56PM (#7592773)
    There are actually good reasons behind both of those rules, and the 7+-2 article did a better job of mentioning them than the 3-click article did.

    The research that the 7+-2 rule is based on has to do with short-term memory, not with how many items people can read through. The point of this rule is that if people are "browsing" when they come to the site, meaning that they are not sure what they are looking for, they have to look through all the options and choose one. If there are more options than they can store in short-term memory, they have to do multiple browses to find what they want. As an example, if the site has 20 links, and the most appropriate link is link #10, the person needs to browse the whole list once, ask themselves if any of those were appropriate (which they may or may not remember), then rebrowse from the top for that choice, or start over. Since they might not remember even seeing an appropriate one, they may have to do this multiple times to move more of the list into long-term memory so they can analyze it better, or just make a choice that doesn't take into account all the options. If the list had been 7+-2 in length, they could have made that determination in their short-term memory much more quickly.

    If, on the other hand, every user coming to your site knows what they are looking for and where it is, they can look through 100 or more links to find it and as soon as they see it, they will click on it. They are not browsing, but searching for a specific thing.

    The 3-click rule is closely related to the above, and it involves browsing vs. searching. If a browsing user makes a choice at the top that they feel is appropriate (again, not sure if they're in the right spot) and they don't find what they're looking for in 3 clicks, they have probably decided that they chose incorrectly and will back up and start again. If they see definite progress towards their destination, they will go dozens of links deep to find it.

    A searcher who knows what they are looking for is more confident about their initial choice and will keep digging to find it. The 3-click rule doesn't really apply to them.

    The 3-click rule is much more of a guideline, and should really be that users need to see progress toward their goal after 3 clicks or they'll turn back. It was also created because you must have created a mess if someone has to dig through 25 steps to find what they're looking for; I would call that failed site design even if people were willing to go that far. The article referenced was generally pretty poor as far as studies go; they didn't give any information about what these people were doing, whether they knew what they were looking for, etc. It doesn't really prove anything, and certainly doesn't "debunk" the guideline, which is pretty much based on common sense.
  • Grain of salt... (Score:4, Insightful)

    by mechaZardoz ( 633923 ) on Sunday November 30, 2003 @01:29PM (#7592946)
    While the article sets out to debunk the web-design standard of the "3-click rule," the real object lesson here is an understanding of how the websites they examine succeed in breaking the latent frustration of visitors. Site "stickiness," keeping users clicking and exploiting links to content, must work against the natural human proclivities for exhaustion of novelty and short attention spans. It is certainly true, as was noted in the article, that years of exposure to an ever-increasing flood of information have increased our thresholds for sifting through data. Still, what really keeps someone coming back for more is a successful application of the reward principle. This shouldn't come as any surprise; game designers have plied this for years. Now, in the case of websites, we see a similar application of this principle. People will move through a task, even if it requires many multiples of 3 clicks, if their history of exploring navigational structures has shown they are moving towards a successful completion.

    Most likely, the real truth here is that the 3-click rule evolved out of an era when the 'ergonomics' of human-web interaction were poorly understood, providing a quick and easy rule of thumb by which content designers could easily throw up pages while still retaining visitors.

    In the end, though, one shouldn't come away from these articles with the notion that users will suffer any number of clicking injustices. It does show, however, that there is no substitute for a well-organized site that recognizes the processes by which a visitor will make use of the content.
  • by penguin7of9 ( 697383 ) on Sunday November 30, 2003 @01:39PM (#7592982)
    'IMAGE' is not an element in HTML 4 (check for yourself [w3.org]). Maybe it should be. Maybe it should stand for inline, base64 encoded images. But it doesn't.

    Makes you wonder when the submitter of the article last wrote a page of HTML...
  • by Comrade Pikachu ( 467844 ) on Sunday November 30, 2003 @01:45PM (#7593001) Homepage
    The number of choices that a person can retain in his memory (5-9 according to the cited study) is an important consideration when navigating a web site using a text-to-speech device.
  • by chhamilton ( 264664 ) on Sunday November 30, 2003 @02:07PM (#7593103)
    The 3-click rule is actually based on a little math, and doesn't just come from nowhere. The question is this: given a finite number of leaves (end destinations), how should a menu be arranged to minimize the average amount of time required to access any leaf? The assumptions are that each 'menu' (level of the tree) takes the same amount of time to read/load/listen to, and that each final menu choice is equally probable. Under these conditions, continuous optimization shows that a tree with exp(1) = 2.718... branches per node is optimal. Thus, 3 options per menu level is the usual choice.

    Again, this rule is based on some fairly strict assumptions, and realistically, an optimal menu layout (in terms of minimizing clicks) may conflict with a logical menu layout (in terms of hierarchical ordering).
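    A sketch of that optimization, under the stated assumptions (equal cost per menu level, equiprobable leaves): with N leaves and b branches per node the depth is log_b N, so the total cost of scanning b options at each level is

    \mathrm{cost}(b) \;=\; b \,\log_b N \;=\; \frac{b}{\ln b}\,\ln N ,
    \qquad
    \frac{d}{db}\!\left(\frac{b}{\ln b}\right) \;=\; \frac{\ln b - 1}{(\ln b)^2} \;=\; 0
    \;\Longrightarrow\; b \;=\; e \approx 2.72 ,

    which, rounded to a whole number of choices, gives the roughly 3 options per level mentioned above.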
  • 5+/- 2 (Score:4, Interesting)

    by herwin ( 169154 ) <herwin@theworldELIOT.com minus poet> on Sunday November 30, 2003 @02:41PM (#7593255) Homepage Journal
    I teach my students that, but in the context of the number of major elements to have in a system. I also tell them 3-15 is the range to be in. My point is that a system should have that number of subsystems to be 1) grokable, and 2) sufficiently complex to be worth defining.
  • by finelinebob ( 635638 ) on Sunday November 30, 2003 @02:42PM (#7593257) Homepage

    James Kalbach's article [ddj.com] points out how poorly understood the "7 +/- 2" "rule" is in general, but he seems to ignore that since its publication in 1956 psychologists have learned quite a bit about this "limitation" on information processing abilities. His suggestions are old news on this front and, instead of debunking 7 +/- 2, confirm its importance.

    Let's start off with an example from where the research was perhaps first applied -- telephone numbers (George Miller, the researcher who "discovered" this number, worked for Bell Labs). US telephone numbers, since 1947, have followed the 3-3-4 format: 3 numbers for the area code, 3 for the exchange and 4 for the line number. Add the 1 in front of any number for dialing long distance and you've got an 11-number sequence. Does this violate the 7 +/- 2 "rule"? Not really, for a number of reasons:

    1. First and foremost, this "rule" is a description of a limitation of our short term memory's (STM) ability to hold data. What constitutes a datum, however, can be quite flexible.
    2. Forget about the 1 for long distance. We all know it needs to be there. It's a true rule -- to the point that most (if not all) cell phones do not even require you to punch it in, they'll dial it for you when needed. So, in some cases, procedures related to the information you are trying to remember can reduce the demands on STM's processing, and in others the demands can be off-loaded onto technology devices that can assist our processing of the information.
    3. Area codes reduce the load of 3 digits to 1. You've probably got quite a few area codes stored in your long term memory (LTM). Even if you can't recall them all off the top of your head, you can recognize familiar ones and may even place them geographically without much trouble once you see them again. These familiar area codes allow you to "store" these 3 digits in STM as 1 datum.
    4. Exchanges, before faxes and cell phones and modems created the explosion in demand on phone numbers, used to mean a lot more than they do now. They were originally linked to telephone switching equipment and had names identifying them. Growing up, my home phone number wasn't 582-xxxx but LUzon 2-xxxx. The first two letters of the exchange name corresponded with the digits. So, like area codes, exchanges reduced the demand from 3 digits to 2 and possibly even one -- back when I was 10, there was a LU 1 and LU 2 in my area, but nothing else.

    Given these factors, a local phone number can place a demand on your STM of as little as 5 "bits" of data. Still, you might think that with the auto-dial features of phones these days, does this format really matter anymore? Well, maybe not to the technology in our phones that stores the information for us, or to the telephone switching technology that accepts and routes and connects our calls, but if someone gives you a phone number to remember, you'll have a much easier time of it if you at least recognize the area code, even if all you need to do is walk to the phone and dial (as opposed to memorizing it). That 3-3-4 pattern helps us cluster the data and retain it in STM longer than if we tried to hold a ten-digit sequence without any clustering or recognizable pattern.

    The point being that 7 +/- 2 is not a design "rule" that has anything to do with the underlying technology but, rather, how human brains work. Kalbach and others either have forgotten or never knew that the "7 +/- 2" pieces of info have nothing to do with what the technology can handle and everything to do with what one person can juggle in STM while trying to do something meaningful with that info.

    Chunking or clustering data is something we do naturally, without conscious effort, to reduce demands on our information processing. Use of cultural conventions (like requiring the 1 for long distance) that everyone familiar with a task can learn can also reduce these demands. By reducing these demands, you can help people

  • Issues:
    • bulleted lists don't use the <ul> tag. If they really wanted the kind of wrapping they got, they should have used the ul { list-style-position:inside; } rule
    • oodles of font tags specifying the same thing. Why are they trying to maintain compatibility with Netscape 3 and IE 3? CSS killed those off ages ago (a one-rule replacement is sketched below).
    • stupidest image map ever [usability.gov]
    • the font in their images hurts my eyes:)
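    For instance, a single CSS rule can replace all of those repeated font tags (the font stack and color here are just an example):

    /* one rule instead of a <font face size color> on every element */
    body { font-family: Verdana, Arial, sans-serif; color: #333; font-size: 100%; }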
  • Rules vs marketing (Score:4, Insightful)

    by The Winter Queen ( 39099 ) on Sunday November 30, 2003 @04:22PM (#7593727) Homepage

    I'm a dinosaur. I'm a damn good web coder. I used to love writing clean code. I loved the challenge of reproducing what the design people came up with using the least amount of resources.

    Marketing sucked the joy out of my work. I'd tell my boss "Look, it's fast and easy to use, and it looks the same in all browsers!" and he'd say "So? It needs more animations!"

    People like me are being replaced with flash monkeys and go tards with dreamweaver. People who can't write a style sheet by hand, or create simple javascripts.

    And look at the results! Sites that crash my browser, sites where I can't find any real content. Who the hell thinks a serious b to b site should be loaded down with flash? Why use java for ad banners?

    I doubt most non-tech-savvy users on dial-up connections are slogging through this crap.

    The internet is becoming less and less useful. And we have marketing weenies to thank.

  • by Dan East ( 318230 ) on Sunday November 30, 2003 @05:58PM (#7594288) Journal
    One factor I didn't see in the article is bandwidth. What does a "click" mean? Normally it means navigating to (i.e. transferring from the server to their PC) new content. As bandwidth has increased, which includes everything from server performance and internet infrastructure to the final mile, the delay until that new content is available at the client has decreased, meaning that clicks are less costly time-wise now.

    So as the penalty of clicking on a link has reduced, the tolerance to clicking has gone up.

    This should be a huge factor in the 3-click rule, which I don't remember seeing in the article.

    Dan East
  • by DocDJ ( 530740 ) on Sunday November 30, 2003 @07:10PM (#7594673)
    Is it me or does this report seem rather unscientific? I quote: "we looked at data from a recent study of 44 users attempting 620 tasks." But no mention of the conditions under which these tasks were set. It's obvious that variations in the experimental conditions will produce variations in the results. For example, someone trying to find a product on a particular website may be inclined to give up after 3 clicks if they know they can just click over to the Walmart site to look. On the other hand, if you say to someone "here's a task, achieve it using this website" it's likely that they will persevere a bit more. The cynic in me suggests that the main purpose of the article is to publicise their roadshow. But then, the report does have graphs. Who am I to argue with graphs?
  • by Stultsinator ( 160564 ) on Monday December 01, 2003 @05:45AM (#7597111)
    If you recall an earlier discussion here about ternary computing [slashdot.org] (base 3 instead of base 2) there is a scientific proof that the optimal balance between width-oriented menus (lots of choices at each level, decreasing the number of levels) and depth-oriented menus (few choices at each level, deeper levels) is to have e (~2.7) choices. Obviously you can't have .7 choices, but if the number of choices per level averages to e and you group your choices logically, you'll have a solid argument that your layout is optimal.
