GUI Software

Top 8 Reasons HCI is in its Stone Age

UltimaGuy writes "This editorial describes eight reasons why HCI (Human Computer Interaction) is in its stone age. It laments screen corners, the filesystem, GUI design, and 'spatialness'."

  • by yagu ( 721525 ) * <{yayagu} {at} {gmail.com}> on Tuesday September 06, 2005 @01:40PM (#13491275) Journal

    Some pretty good long-standing beefs listed on that blog -- beefs I've never seen addressed. (Kind of like a recent article I saw about cell phones, saying consumers would much prefer to see cell-phone issues and problems addressed before crap like cameras, mp3 players, video recorders, etc. gets incorporated into the "phones".)

    Off the top of my head I can add three that drive me crazy:

    1. In Windows I always set my task bar to auto-hide. Typically I dock it to the side of the screen, wide enough that when I mouse over it, it pops out far enough for meaningful text to show what the tasks really are. But it drives me freaking crazy when events trigger an auto-popout of the task bar, often right under my keyboard or mouse, and I end up typing something, hitting enter and triggering something I didn't want, or just plain obscuring something I'm trying to see. (It's so annoying when the network gets flaky and apps that disconnect and reconnect (gaim, "Hello" (Picasa), et al.) proudly interrupt what you're doing to announce they've reconnected for you. Fuck you. I get it!) (I had lunch with a best buddy and complained about that task bar behavior, and asked how to disable it -- figured he'd be the one to ask. He rubbed his chin for a second and said, "Hmmm, that's a good idea, I don't have a clue how to disable that!")
    2. Meaningless jargon in messages. (This was addressed in the blog.) I got a worried e-mail from my Mom -- she was trying to start "gaim", and it kept giving her a dialog message, "An instance of gaim is already running". What the fuck? Why do we give computerese like "instance" to lay people? I can think of a few more meaningful messages than that off the top of my head that would let her proceed with confidence.
    3. Cutesy tooltips. It's endlessly annoying when I have new apps installed and the "START" menu in XP puts up the "new programs installed" tooltip, obscuring the "logoff" or "turn off computer" button I'm really trying to get to.

    Yes, we're a LONG way off from interfaces that are easy to use and that make sense to the average user.

  • by CDMA_Demo ( 841347 ) on Tuesday September 06, 2005 @01:44PM (#13491310) Homepage

    From the article: "Every single little tiny-weeny little interaction-shraction requires your visual attention."

    We are obviously a long way from HCI, as the article does not seem to consider blind computer users to be human. If we focus on the hard problems (one of which is improving the interaction with disabled users) the easy ones will simply fall into place.
  • Great (Score:2, Insightful)

    by gowen ( 141411 ) <gwowen@gmail.com> on Tuesday September 06, 2005 @01:47PM (#13491351) Homepage Journal
    ... more unfounded opinion masquerading as insight and research. And about HCI again.

    Great.
  • by cataclyst ( 849310 ) on Tuesday September 06, 2005 @01:47PM (#13491352) Homepage
    From TFA:

    Have you ever seen a system which lets you, out-of-the-box, hit a corner in order to do anything at all even remotely related to anything having anything at all to do with a document or application?

    Hmmm... yea... yea, I have... In the lower left corner of the screen, for 99% of out-of-the-box systems, when they are on, there's that little Start button, which does have something remotely to do with apps & docs... Also: what about the menu bar at the top? Upper right-hand corner: close window...
    Honestly, I don't know WTF half the articles are on here for... other than us flaming the crap outta 'em...
  • by Prophet of Nixon ( 842081 ) on Tuesday September 06, 2005 @01:48PM (#13491363)
    Yea, I don't see why I should want an OS that performs arbitrary actions just because I moved a cursor to a screen corner. That would drive me mad. Also, how would it work for people who have their cursor wrap around the screen?
  • 1. Screen Corners (Score:3, Insightful)

    by oliverthered ( 187439 ) <oliverthered@nOSPAm.hotmail.com> on Tuesday September 06, 2005 @01:52PM (#13491376) Journal
    Crock of shit -- well, almost.

    I actually wrote an application that timed how long it took to click on a small red box with the words "click me" written on it, recording distance and time.

    After doing the math you could nicely fit a straight line to the points. I even tried splitting out the results based on the direction of movement, and there was very little difference, and I set up a test to explicitly test the 'corner of the screen' theory.

    In the end it was no quicker to reach the corners of the screen than a small box anywhere else on the screen. That is probably why no one utilizes the corners of the screen in the way suggested.

    I wrote a few more tests and was going to put together a Java applet so that world + dog could help out.
    Things like giving your menu entries sensible names and keeping things consistent were far more important for novice and experienced users. I was also looking at things like colour coding, 'vanishing' and growing buttons, and other UI elements changing depending on how often they were used, etc.

    The main reason for the lack of good user interfaces is that no one ever seems to do solid scientific testing on them, the kind of testing that proves innovations in UI outclass current designs instead of relying on a designer's hunch.
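
A minimal sketch (not the poster's actual application) of the kind of click-timing harness described above, assuming a plain Python/Tkinter window; the window size, target size, trial count and the simple least-squares fit are illustrative choices rather than anything from the original test.

# Sketch of a click-timing test: a small target appears at random positions
# (corners included); record how far the pointer travelled versus how long the
# click took, then fit a straight line to the (distance, time) pairs.
import random
import time
import tkinter as tk

WIDTH, HEIGHT, BOX, TRIALS = 800, 600, 40, 20   # all assumed values

class ClickTimer:
    def __init__(self, root):
        self.canvas = tk.Canvas(root, width=WIDTH, height=HEIGHT, bg="white")
        self.canvas.pack()
        self.canvas.bind("<Button-1>", self.on_click)
        self.samples = []                        # (distance_px, seconds)
        self.last_click = (WIDTH // 2, HEIGHT // 2)
        self.trials_left = TRIALS
        self.place_target()

    def place_target(self):
        self.x = random.randint(0, WIDTH - BOX)
        self.y = random.randint(0, HEIGHT - BOX)
        self.canvas.delete("all")
        self.canvas.create_rectangle(self.x, self.y, self.x + BOX, self.y + BOX, fill="red")
        self.canvas.create_text(self.x + BOX / 2, self.y + BOX / 2, text="click", fill="white")
        self.shown_at = time.monotonic()

    def on_click(self, event):
        if not (self.x <= event.x <= self.x + BOX and self.y <= event.y <= self.y + BOX):
            return                               # ignore misses
        elapsed = time.monotonic() - self.shown_at
        dist = ((event.x - self.last_click[0]) ** 2 +
                (event.y - self.last_click[1]) ** 2) ** 0.5
        self.samples.append((dist, elapsed))
        self.last_click = (event.x, event.y)
        self.trials_left -= 1
        if self.trials_left:
            self.place_target()
        else:
            self.report()

    def report(self):
        # Ordinary least-squares fit of time against distance travelled.
        n = len(self.samples)
        sx = sum(d for d, _ in self.samples)
        sy = sum(t for _, t in self.samples)
        sxx = sum(d * d for d, _ in self.samples)
        sxy = sum(d * t for d, t in self.samples)
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        print(f"time ~= {slope:.5f} * distance + {(sy - slope * sx) / n:.3f} s")
        self.canvas.master.destroy()

root = tk.Tk()
app = ClickTimer(root)
root.mainloop()
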
  • by TripMaster Monkey ( 862126 ) * on Tuesday September 06, 2005 @01:52PM (#13491381)

    The author of this article has some valid points here...it's unfortunate that he chooses to embed those few valid points in a sticky matrix of hyperbole, hysteria, and inaccuracies.

    Just a few things:
    From TFA:
    So is it possible to design a system that's suits both beginners and professionals? (No t33n-N30, the answer isn't Pr3f3r3nc3Zz!!!!!!!! 1337-H4XX0R5!!!.)
    That's funny....I was under the impression that preferences were exactly the answer to this issue.

    Also from TFA
    We wish to rotate an image, shrink it 50%, attach it to an e-mail and send it to a deaf musician.

    A. Utilizing a modern interface: The procedure would involve several clicks, mouse drags and keystrokes, and also require expert skills in order to complete the task in less time than one minute. Moreover, in order to complete the task at all, a number of subtasks (which are actually unrelated to the task at hand) need tending to. We need for instance worry about choosing a file name and a location in the process of storing the image, and then, from the e-mail application, locating the image we just stored in order to attach it.

    B. Say "Tip a quarter to the right, crop by half and e-mail to Stevie Wonder."

    By the way, did you know that one-knob faucets were originally designed for disabled persons?
    By the way, did you know that a) Stevie Wonder is blind, not deaf, and b) 'shrink' is not synonymous with 'crop'?
  • Clear as mud (Score:3, Insightful)

    by miketo ( 461816 ) <miketo@nRABBITwlink.com minus herbivore> on Tuesday September 06, 2005 @01:52PM (#13491382)
    I really tried to get more than halfway through the article. But after phrases like "a belly-barn shackle in the reunion of unjustified friends", I couldn't continue. He bemoans the lack of clarity in HCI, yet his writing is a stream-of-consciousness mess.

    If he can't communicate his ideas better, maybe he's not the best person to describe what's wrong with HCI. I'm not the brightest bulb on the billboard, but come on -- this guy needs an editor.
  • Pet peeves... (Score:5, Insightful)

    by It doesn't come easy ( 695416 ) * on Tuesday September 06, 2005 @01:55PM (#13491402) Journal
    Menus that change. Whoever thought up the idea of menus that hide unused items or change the displayed order based on frequency of use should be one of the first ones up against the wall when the revolution comes. Changing menus are one of the worst productivity enhancements of the last millennium. Forget that you can turn it off. It should never have been invented in the first place (no doubt it's patented, too).

    Unsolicited offers from the system to remove unused shortcuts on my desktop. I don't need help removing my unused shortcuts. They are there for a reason and just because I haven't clicked on them in a month doesn't mean they're not useful.

    Special buttons to page forward/page back in the web browser. I don't know how many times I've accidentally erased my latest diatribe by inadvertently paging backward on Slashdot. Good grief, at least put the function behind a modifier key.

    Caps Lock. Who named this key anyway? In Windows, it's not a caps lock key, it's a caps reverse key. And who the hell needs a caps reverse key? hAS aNYONE eVER rEALLY nEEDED tHIS fUNCTIONALITY bEFORE? I wonder where some people's brains are sometimes.

    I could go on...and on, and on, and on...
  • The largest key (Score:5, Insightful)

    by Se7enLC ( 714730 ) on Tuesday September 06, 2005 @01:56PM (#13491410) Homepage Journal
    ...what the LARGEST KEY ON THE KEYBOARD does. Well... this key? Right over here? Ah, the chubby one! It.. spaces... kind of... leaps.. a tiny bit. In the text... See...? Nothingness! Hey, I know how this must sound... Hey! Wait!! No!!

    Hey, how about maybe it's the largest key on the keyboard because it's the MOST FREQUENTLY USED? Wow, imagine that, making something that you use often larger and thus easier to find. Doesn't seem stone age to me, seems more like tried-and-true.
  • by ABaumann ( 748617 ) on Tuesday September 06, 2005 @01:58PM (#13491424)
    It's a rant on a stupid blog. Slashdot refers to it as an "Editorial".

    The guy's simply a moron. At least half of his "points" are opinions. Others are just not really points at all. "4. Multiple representation of the file system. ... See point six." And what's with 8 having no title? Point 8 isn't a point. It's a use case.

    Finally...

    We wish to rotate an image, shrink it 50%, attach it to an e-mail and send it to a deaf musician. Say Tip a quarter to the right, crop by half and e-mail to Stevie Wonder.

    You sir, have failed. You just sent it to a blind musician, not a deaf one.
  • Editorial? (Score:5, Insightful)

    by jdog1016 ( 703094 ) on Tuesday September 06, 2005 @01:59PM (#13491428)
    I'm sorry, but I don't think "editorial" is the terminology I would use here. The correct phrase is "random blog post." Who is this person? Nowhere on the page are the author's credentials listed, and nowhere in the post does he/she address anything directly related to HCI. Interfaces of popular OS's and windowing systems represent a very, very small subset of HCI, and attacking these with 8 poorly researched, poorly thought out, hardly substantiated claims is a laughable way to go about showing that HCI is in its "stone age." Human Computer Interaction is a very new thing, much newer even than computer science, which is also in its infancy, and most everyone who knows anything about HCI knows this. I realize that sensationalizing common knowledge with irrelevant bullshit is amusing to some people, but Slashdot is supposed to be about news.
  • by falcon203e ( 589344 ) on Tuesday September 06, 2005 @02:02PM (#13491463)
    Here's my problem with the screen corners. Because they're the easiest to get to, they're also the easiest to land on by mistake. To simply have a corner activate a process is annoying, so there must be some sort of confirmation. A click, perhaps. Well guess what, Apple already has you covered, as the top two corners, when clicked, activate the Apple menu and the Spotlight menu. If you put something in the corner, it requires some sort of input to activate, and some other sort of input to perform its task. I'm not sure what you'd want to put in the corners, but for the sake of example let's say you want your application switcher there. Are you sure about that? Would you really rather mouse to the corner, activate the switcher, mouse to the app you want to switch to, and click again? Or would you rather find your app in the Dock/Taskbar and click it?
  • by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Tuesday September 06, 2005 @02:03PM (#13491471) Homepage
    ### That's funny....I was under the impression that preferences were exactly the answer to this issue.

    The problem with preferences is that they are quite often not used to configure important stuff, but more in terms of "We don't know how to do it correctly, so let's have the user figure it out himself via Prefs." This then leads to inconsistency and trouble, since you can't predict how stuff will work on the user's computer (the MacOSX-style menu at the top is not much fun with focus-follows-mouse, etc.).

    But I agree that configurability is absolutely important, especially for the UI of tomorrow. Tomorrow's UI must be able to adapt to whatever problem I throw at it, but at its core it has to be consistent, so that even with changed preferences there is stuff one can depend upon when developing applications.
  • Alternatives? (Score:2, Insightful)

    by gbr ( 31010 ) on Tuesday September 06, 2005 @02:06PM (#13491496) Homepage
    A rant without viable alternatives is a waste of space.
  • by Mr Guy ( 547690 ) on Tuesday September 06, 2005 @02:09PM (#13491525) Journal
    If we focus on the hard problems (one of which is improving the interaction with disabled users) the easy ones will simply fall into place.


    Bull. Disabled users aren't the same as normal users and designing for them isn't the same. I'm willing to bet blind users would prefer a text only computer, with the information organized in table form so it's easy to follow the hierarchy of information. The CLI, I'd think, would be ideal for blind users.

    The real problem right now is that people who are technophobes don't like to admit how good a tool the computer really is, and how well suited for its purpose it is. Nearly every solution I've ever seen isn't practical for how computers are actually used. Voice activation in cubicles? 3D immersion just to check your mail?

    HCI isn't going to improve vastly until there's a good system for direct mental interaction, and even then it'll take a long time for people to trust it.
  • Nice Rant (Score:5, Insightful)

    by wayne606 ( 211893 ) on Tuesday September 06, 2005 @02:10PM (#13491542)
    I can't wait to see *his* UI design that addresses all these concerns.
  • by pla ( 258480 ) on Tuesday September 06, 2005 @02:11PM (#13491552) Journal
    Meaningless jargon in messages.

    Although a lot of programs may lay it on a tad thick, computer users NEED to learn a bit of jargon if they hope to have any shot of dealing with modern technology.

    You can't use a car without understanding what the brake and accelerator (and sometimes a clutch) do. When you take it in for repairs, even if you don't know how to fix it yourself, you want to know if you need a spark plug or a timing belt (not just "it broke, please pay $xxxx for the next 20,000 miles...").

    The same goes with computers. Your example, of an "instance", I consider not that bad... How do you phrase that better? "GAIM is already running"? Since such errors usually happen when you have a ghost process, I suspect most users would find that even more frustrating (I know how my grandfather would react -- "God damn it, if I already had it running I wouldn't have tried to start it, you worthless pile of (stream of obscenities omitted)").

    Cutesy tooltips.

    I agree 100%... You can actually turn those off, at least the ones that come from Windows itself, but XP has a rather obnoxious bug wherein you will eventually get them back, and can't turn them off again (because you already have them off).



    Oh, and your peeve about the task bar -- it drives me absolutely batty. To re-quote the grandfather, "God damn it, if I wanted to switch to that window, I'd click on it, you worthless pile of (stream of obscenities omitted)!". :)
  • by kisrael ( 134664 ) on Tuesday September 06, 2005 @02:13PM (#13491574) Homepage
    Good point about the corners.

    I think people who do HCI with a stopwatch are missing an important point: (A) initial friendliness to newbies, ideally letting them ramp up, and (B) "mental load" for experienced users -- how much they have to keep in their head -- are both as important as, or more important than, an extra millisecond.

    One random addition to this discussion:

    "If people were going to use computers all day, everyday, the design of such machines was not solely a technical problem-- it was also an aesthetic one. *A lousy interface would mean a lousy life.*"
    --Myron Krueger
  • by Minna Kirai ( 624281 ) on Tuesday September 06, 2005 @02:14PM (#13491582)
    The prime reason why HCI (aka "GUIs") is in such a poor shape is that each application still controls its own GUI.

    New OSes have little opportunity for HCI improvements because too many of the details are left for the application programmers to decide. At best, the OS vendor provides a shared GUI library (buttons + widgets), and a guidebook [apple.com] teaching app authors the "right" way to do it.

    But depending on each individual author to carry out the instructions is fundamentally limited and slow. Not every programmer will be aware of the guidelines, choose to obey them, or be capable of following them exactly even if he tries.

    And even if all coders were magically obedient to the published standard, it's still non-optimal. New ideas to improve the HCI guidelines cannot be uniformly implemented without waiting years for all programs to be updated. Computers are supposed to REDUCE redundant labor -- instead of each app's GUI being written separately, all trying to implement the same guidelines, one piece of code should handle all that functionality in one place. Code reuse is a fundamental rule of software design that has taken far too long to penetrate the HCI world.

    What we need are applications written to a high level GUI description service, so that the OS can implement a UI consistent with other programs and exactly tailored to the limitations of this user (Colorblind? Blind? No keyboard? No mouse? No muscular control besides blinking [medgadget.com]?)
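
A rough, purely illustrative sketch of the "high-level GUI description service" idea: the application declares intent (a title, fields, actions) and a hypothetical presentation service decides how to render that for the current user -- crude text standing in for widgets here, or a spoken prompt for a screen-reader profile. The names UIDescription and render_for are invented for this example; no real OS exposes such an API.

# Purely illustrative: a made-up "UI description" that an OS-level
# presentation service could render as widgets, speech, or key prompts.
# UIDescription / render_for are hypothetical names, not a real toolkit API.
from dataclasses import dataclass, field

@dataclass
class Action:
    label: str        # what the user is doing ("Send", "Cancel")
    shortcut: str     # suggested binding; the service may remap or drop it

@dataclass
class UIDescription:
    title: str
    fields: list = field(default_factory=list)    # (name, kind) pairs
    actions: list = field(default_factory=list)

def render_for(ui, profile):
    """Pretend presentation service: picks a rendering per user profile."""
    if profile.get("screen_reader"):
        parts = [f"Dialog: {ui.title}."]
        parts += [f"Field {name}, expects {kind}." for name, kind in ui.fields]
        parts += [f"Say '{a.label}' to {a.label.lower()}." for a in ui.actions]
        return " ".join(parts)                    # spoken form
    rows = [ui.title, "-" * len(ui.title)]        # crude stand-in for widgets
    rows += [f"{name}: [{kind}]" for name, kind in ui.fields]
    rows += [f"({a.shortcut}) {a.label}" for a in ui.actions]
    return "\n".join(rows)

compose = UIDescription(
    title="Send message",
    fields=[("To", "address"), ("Body", "text")],
    actions=[Action("Send", "Ctrl+Enter"), Action("Cancel", "Esc")],
)
print(render_for(compose, {}))                    # sighted, keyboard-and-mouse user
print(render_for(compose, {"screen_reader": True}))
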
  • by hey! ( 33014 ) on Tuesday September 06, 2005 @02:21PM (#13491658) Homepage Journal
    Design is communication. What's easier to use, an interface that communicates "This is how you do such and so," or an interface that communicates "Hey, you! I'm easy to use!"?

    Now, suppose you are marketing a product. Which message gets you the most sales?

    Software user interfaces pretty much respond to the same pressures as any other kind of interface. Most interfaces are designed to communicate messages of desirability, not anything as pedestrian as function. Most car dashboards are a mess for that reason. You can get custom color face plates for your cell phone so you have one to match every outfit in your closet, but it's still a piece of shit to use.

  • I agree. It drives me nuts when people use the car analogy for a 'good interface' and a keyboard as a bad one. I clearly remember learning to drive and thinking 'there's way too much to focus on, I'll never be able to do this for fun' and yet, after practice, studying, and more practice I learned how to do it and enjoy it.

    No one is able to just sit down in a car and drive down the turnpike; you need to spend some time up front with it. People need to realize that with computers as well.

    So I appreciate the additional car references you make.
  • by Jekler ( 626699 ) on Tuesday September 06, 2005 @02:42PM (#13491887)
    I RTFA, and it comes off as written by someone who isn't very well studied in the concepts of user interfaces. To be truthful, it sounds like the author just finished reading The Humane Interface: New Directions for Designing Interactive Systems by Jef Raskin.

    The editorialist makes a few good points, but it's a bit one-sided. He presents a very simplified view of what it takes to build a powerful user interface. There are thousands of scientists with PhDs studying the field of HCI, coming up with answers all the time, but there's a huge leap between what sounds good in theory and what actually works. One person's idea of a brilliant user interface is another person's nightmare that turns their operating system into something that resembles M.C. Escher's work.

    Games are the breeding ground for examples of where conceptually superior user interfaces often fail. Take a game like Black and White or Temple of Elemental Evil. Controlling a character or environment is no longer as simple as pushing some arrow keys; it's an exercise in digital dexterity. Even though conceptually it allows you to present more options in a smaller space, it's still foreign to everyone who has ever played another game.

    Every time you try a new user interface, it requires everyone who is comfortable to give up that comfort for the sake of eventually having an easier experience. The effect can be observed when people try using a Dvorak keyboard. Technically speaking, Dvorak might be a superior idea, but it also represents four weeks' worth of practice.

    The idea that we "should" find a better way to use computers has been around for a long time. Implementing those ideas in a way that the majority of users can accept is an enormous task. If the author really thinks implementing his ideas about user interfaces is a trivial task, he should build a prototype.

    Every couple years, someone comes up with a brilliant idea for a new way to interact with computers that involves some sort of surrealistic work of art like a Pyramid Keyboard you stick your fingers in like you're piloting an alien shuttle.

    The article is hypocritical. There's no table of contents for the numbered points. For all the talk of making things difficult, why do I need to scroll repeatedly up and down the page to locate information? And why use >> << as some sort of quotation mark replacement? He talks about how intuitive using corners is, but he can't use the same symbol to quote a person that almost every English document for the last three centuries has used. Glass house, meet stones.
  • by cbiffle ( 211614 ) on Tuesday September 06, 2005 @02:43PM (#13491903)
    The same goes with computers. Your example, of an "instance", I consider not that bad... How do you phrase that better? "GAIM is already running"?


    No.

    Instead, you pop up the existing GAIM instance.

    If the user clicked on the GAIM icon, s/he wanted GAIM. Give them GAIM. The dialog's wording is a red herring; the real problem is in the implementation.
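
A minimal sketch of the "launch or raise" behaviour being argued for, assuming a Linux-style abstract Unix-domain socket as the single-instance marker; the socket name and the RAISE message are invented for illustration, and this is not how gaim itself handled it.

# Launch-or-raise: if another copy is already listening on the local socket,
# tell it to raise its window and exit quietly; otherwise become that copy.
import socket
import sys

SOCKET_NAME = "\0example-messenger-single-instance"   # abstract namespace (Linux)

def try_become_primary():
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        srv.bind(SOCKET_NAME)        # fails if another instance already owns it
        srv.listen(1)
        return srv
    except OSError:
        srv.close()
        return None

def ask_existing_instance_to_raise():
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as c:
        c.connect(SOCKET_NAME)
        c.sendall(b"RAISE\n")        # the running copy brings its window forward

if __name__ == "__main__":
    server = try_become_primary()
    if server is None:
        ask_existing_instance_to_raise()
        sys.exit(0)                  # no "already running" dialog, no second copy
    print("primary instance: would start the GUI and service RAISE requests here")

Whether raising the existing copy is always the right default is a separate question (some users deliberately run several instances); the sketch only shows the mechanism that makes the "already running" dialog unnecessary.
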
  • Agreed (Score:2, Insightful)

    by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Tuesday September 06, 2005 @03:16PM (#13492226) Homepage Journal
    The problem with a lot of "preferences", however, is that they have little to do with actual preferences and much more to do with decorations and effects.


    Ideally, the preferences option would allow you to control the skin of the interface at virtually any level along with the paradigm the interface operates under.


    For example, some people don't want an "object oriented" UI, where specific data types are linked to specific applications. On the other hand, some people do.


    There are times when you want to be able to use piping in the interface to chain certain combinations of applets, wrappers and applications. This is trivial in a command-line shell, but very difficult to achieve by point-and-click methods in a GUI.


    And so on. The list of what you MIGHT want to do is endless. Since the underlying mechanism is simply a bunch of events that trigger actions, there is no reason why preferences should not exist to create sets and sequences of events that meet your own personal requirements.


    The upshot of all this is that you'd have an ultra-lightweight GUI that could do basic operations really, really fast. The GUI that the users then saw would be built from scripts and data (possibly XML) that in turn was generated through conventional preference selection, pick-lists, flow-charts, CASE tools and anything else the developers thought useful (a tiny sketch of this idea appears after this comment).


    You could then completely reprogram what you saw and how the interface operated in purely graphical terms. As you grew more sophisticated and your needs changed, you could rewire your GUI to meet your new requirements.


    All of this could be done right now. In fact, it has very nearly happened in some ways -- web interfaces dominate some markets, for example, and scripting within interfaces is increasingly common, although nowhere near the level I'm considering.


    It is very unlikely GUIs will ever evolve in this direction, however - GUIs are designed to shape how we think about what we're doing, they are deliberately NOT designed to be shaped BY how we think. In some ways, this is a good thing - it provides a focus and it makes it possible for two users to communicate methods. If everything were dynamically definable, there would be no provable common frame of reference.


    This would allow a far higher level of individual competency, but at the price of making group competency almost impossible. The current system sacrifices relatively little individual competency in order for groups to work in a standardized way, which allows you to have a much higher group competency.


    This is the age-old trade-off. You can't have something that is good for both individuals AND groups - whatever is good for one will hurt the other. As the majority of GUI users are in some sort of social setting, group competency is the more important, so that is what GUIs are aimed at supporting.


    So, yes, the "problem" can be solved, and could have been for quite some time now, but the cost is one that the majority of users simply wouldn't pay. Because of that, it is a "problem" nobody has any real interest in solving.
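
A toy sketch of the "GUI built from scripts and data" idea in the comment above: a small XML document describes buttons and the command pipelines they trigger, and a thin loader turns that into a toolbar. The XML schema and the example commands are invented for illustration; a real system would expose far richer events and actions.

# Toy data-driven toolbar: the XML below stands in for the "preferences output"
# that a user (or a preferences tool) could rewire; the loader only interprets it.
import shlex
import subprocess
import tkinter as tk
import xml.etree.ElementTree as ET

UI_SPEC = """
<toolbar title="My rewired toolbar">
  <button label="Shrink image">
    <run>convert in.png -resize 50% out.png</run>
  </button>
  <button label="List files">
    <run>ls -l</run>
  </button>
</toolbar>
"""

def make_handler(commands):
    def handler():
        for cmd in commands:                      # chain the configured steps
            try:
                subprocess.run(shlex.split(cmd), check=False)
            except FileNotFoundError:
                print("missing tool for:", cmd)   # the commands are placeholders
    return handler

root = tk.Tk()
spec = ET.fromstring(UI_SPEC)
root.title(spec.get("title", "toolbar"))
for btn in spec.findall("button"):
    cmds = [run.text for run in btn.findall("run")]
    tk.Button(root, text=btn.get("label"), command=make_handler(cmds)).pack(side="left")
root.mainloop()
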

  • by sootman ( 158191 ) on Tuesday September 06, 2005 @03:16PM (#13492237) Homepage Journal
    Ah yes, the top left and right corners: a mere 10 pixels away (yes, I measured) from two buttons you may want to use: the Apple menu in the top left, and the clock, username, or whatever you put up there in the top right. I laugh every time I see a PowerBook user go for the Apple menu with their trackpad and VWOOP! all their windows slide around. So they go up there, then move away so things are as they were, then approach again slowly. Real timesaver, that.

    Oh well, the Apple menu has been mostly worthless for four years now anyway. And who ever clicks on the time, anyway? Oh, that's right: EVERYONE, since you can't (without a hack) show the DATE up there. (Dear Apple: I generally know what day of the week it is, since job and school both operate on a standard M-F week. What freaking DATE is it?!?!? How hard would it be to add one more checkbox to the list in the date/time prefs?)

    I hear 10.4 makes the corners clickable for the Apple menu and Spotlight. It's also worth noting that XP (finally!) lets you activate the Start menu with the bottom-left-corner pixel, and Windows since '95 has let you close a maximized window with a click in the absolute top-right corner.
  • by JourneymanMereel ( 191114 ) on Tuesday September 06, 2005 @03:21PM (#13492288) Homepage Journal
    Highly useful, but not very intuitive...

    Which is kinda the point of this story ;)
  • by blincoln ( 592401 ) on Tuesday September 06, 2005 @03:35PM (#13492462) Homepage Journal
    Instead, you pop up the existing GAIM instance.

    No. Seriously.

    I like to run multiple instances of applications. If I tell my OS I want another copy of something open, I don't want it to switch to the one that's already running.

    It would be even worse to make some applications behave the current way, and others switch to the instance that's already running. This is what a lot of MS apps do now, and it's really annoying.
  • by fossa ( 212602 ) <pat7@gmx. n e t> on Tuesday September 06, 2005 @03:40PM (#13492521) Journal

    The commandline is broken. So many people hate it. Why? Lack of visual feedback? The need to memorize many commands and their options?

    The GUI is broken. Popup windows constantly getting in the way; windows obscuring where I'm looking. Why is "ls *.bmp | xargs -I{} convert {} {}.jpg" so difficult in a GUI?

    A complete rethinking of computer interfaces is needed. I think a lot of HCI research is of little use because it's starting from such flawed premises. You can only keep patching holes for so long. Projects like the late Jef Raskin's Archy are interesting and what I consider cutting edge HCI.

    Of course, we're so entrenched at this point that any out of the box HCI research is also of little use... For shame.

  • by Jekler ( 626699 ) on Tuesday September 06, 2005 @05:04PM (#13493410)
    That's exactly my point, though: it's not a matter of needing multiple instances, it's a matter of needing more functionality in the program.

    We were discussing a hypothetical situation in which applications should work intelligently, such as if you try to run a program that's already running, it brings to the foreground the already existing one.

    Continuing along the same hypothetical, you don't need two instances, you need one instance with more features. Just like the grandparent said about the warning dialog being a red herring, the need to open two instances is also a red herring. We don't need two instances, we need a single instance that does what we want. In this way, we should view the need to open two instances of a media player in the same way we view the need to play another file format. It's another function the application should provide. Allowing two instances doesn't fix the problem, it masks the problem.
  • by teridon ( 139550 ) on Tuesday September 06, 2005 @07:00PM (#13494535) Homepage
    just a box with that appeared randomly [...] at given locations on the screen (including points in the corners)

    Did you also track the eye movements of the users? Did they look at the box in the corner before clicking it?

    I would posit that moving the mouse to a screen corner *without looking at it* is faster than clicking a box which appears in the corner. The users in your test may have gotten used to boxes appearing at random screen locations, and having to look where it is so they could click on it. When the box appeared in the corner, they still looked at it, to verify it was all the way in the corner. (What if it were a few pixels away from the corner?)
  • by 0-9a-f ( 445046 ) <drhex0x06@poztiv.com> on Tuesday September 06, 2005 @07:38PM (#13494858) Homepage
    1. Spotting flaws in any technology - easy.
      Example: QWERTY keyboards sux0rs!
    2. Recommending a solution - good.
      Example: Dvorak keyboard r0x0rs!
    3. Fixing the problem before everyone gets used to the broken implementation - divine.
      Example: I've never met anyone who uses a Dvorak keyboard.
    Just like this guy's rant against Windows, it seems everyone knew all along that New Orleans was doomed. Problem was, everyone got used to it the way it was, and felt the money could be better used elsewhere.

    Wake me when there's some real news.

"And remember: Evil will always prevail, because Good is dumb." -- Spaceballs

Working...