Posted by michael from the up-up-down-down-left-right-left-right dept.
_iris writes: "As reported on the Dot, gesture control (apparently all the rage with the kids after the latest Opera release) is coming to KDE. You can find a _very_ early release of KGesture here." Sounds like a recipe for carpal tunnel to me.
This discussion has been archived.
No new comments can be posted.
Having used gestures in the Opera browser, I have seen that they are very useful and really enhance the WWW experience. They enhance it so much that I cannot understand the negative comments. As noted by other users, no other desktop OS has them, so this is a genuine chance for Linux to get a head start. Please, before posting, think first about whether your comment is constructive.
There are a large number of deaf people who cannot speak clearly because they don't know what sounds they are supposed to make with their mouths; they use ASL to communicate both ways ('listening' and 'speaking'). Because of this, they are very agile in ASL and can 'speak' much faster than they can type, particularly since most common English words are a single gesture instead of being spelled out. So for those in that situation, give them these gloves and a portable Linux device, pipe the output of the glove-interpretation program into a text-to-speech program, and, tada, these people now have the ability to 'talk' to any audience, deaf or not.
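Just to make that pipeline concrete: a minimal sketch, assuming the glove-interpretation program prints one recognized word per line and that an espeak-style command-line synthesizer is installed (any TTS you can call from the shell would do). The names here are illustrative, not any existing product's interface.

    #!/usr/bin/env python
    # Hypothetical glue: glove_reader | python sign_to_speech.py
    import subprocess
    import sys

    def speak(text):
        # Assumes the 'espeak' command-line synthesizer is available;
        # substitute whatever TTS tool is actually installed.
        subprocess.run(["espeak", text], check=False)

    def main():
        words = []
        for line in sys.stdin:
            word = line.strip()
            if not word:
                continue
            words.append(word)
            # Speak in small chunks so the listener isn't waiting
            # for an entire signed sentence to finish.
            if word.endswith((".", "?", "!")) or len(words) >= 8:
                speak(" ".join(words))
                words = []
        if words:
            speak(" ".join(words))

    if __name__ == "__main__":
        main()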
Way back in 1992 I implemented a small piece of software called "Lexicon" which achieved just this on my Atari ST. It was immune to the scale and speed with which the gesture was drawn (I called them doodles), and worked pretty well. I released it as public domain so it may still be floating around.
In 1995 I reimplemented it in Java, and for a while you could navigate my homepage by drawing simple gestures in an applet window. It was really simple and took about 3 hours to implement.
In 1998 I got quite into Window Maker, and started a conversation with Alfredo (Mr W.M) about integrating this functionality into it. I wrote the gesture parsing code, and he wrote a front-end, but it never really got past the experimental stage. I am sure that code is probably floating around somewhere too.
Before long I got bored with it, though; it is much faster to hit a key on the keyboard or press a button in a GUI than it is to draw a gesture which is often misinterpreted.
They are great gimmicks, but of limited practical use.
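For anyone curious how a "doodle" recognizer can be made immune to scale and speed, the usual trick is to resample the pointer trail to a fixed number of points (removes speed) and normalize its bounding box (removes scale) before comparing against stored templates. A rough sketch of that idea follows; it illustrates the general technique, not the Lexicon or KGesture code.

    import math

    def resample(points, n=32):
        # Arc-length parameterize the trail, then take n evenly spaced samples;
        # this removes any dependence on how fast the doodle was drawn.
        if len(points) < 2:
            return [points[0]] * n
        d = [0.0]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            d.append(d[-1] + math.hypot(x1 - x0, y1 - y0))
        total = d[-1] or 1.0
        out, j = [], 0
        for k in range(n):
            target = total * k / (n - 1)
            while j < len(d) - 2 and d[j + 1] < target:
                j += 1
            span = (d[j + 1] - d[j]) or 1.0
            t = (target - d[j]) / span
            (x0, y0), (x1, y1) = points[j], points[j + 1]
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
        return out

    def normalize(points):
        # Scale-invariance: translate to the origin and stretch the bounding box to 1x1.
        xs, ys = [p[0] for p in points], [p[1] for p in points]
        w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
        return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]

    def classify(stroke, templates):
        # templates: {"L": [(x, y), ...], "circle": [(x, y), ...], ...}
        probe = normalize(resample(stroke))
        def score(name):
            ref = normalize(resample(templates[name]))
            return sum(math.hypot(ax - bx, ay - by)
                       for (ax, ay), (bx, by) in zip(probe, ref))
        return min(templates, key=score)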
oh, to hell w/ voice recognition. as i don't work in the US i am missing out on the cubicle culture and i'm sitting in my office w/ 2 other people. i think all hell will break loose if we all start mumbling "close window", "open xterm" all day. heck, I imagine a support call coming in, someone here says "Close Window" and the guy on the other end of the phone actually does this... "Uh, no, not you, that was just someone wanting to CLOSE a WINDOW"... hm, chain reaction anyone?:)
on the other hand imagine all the pranks you could play on your co-workers computer if you can teach them to react to everyday words in a real weird way:)
Gesture control represents something new, and to many people, interesting. I am not aware of any other OS/Desktop with gesture interfaces as a component.
Anyway, this does look like a Freshmeat thing, but I think talking about gesture entry, rather than bitching about the story, is what the editors intended.
Gel wrist pads? You know you are not supposed to rest your wrists on anything as you type?
I'm aware of the "rules" for ergonomics. That said, I'm not resting *as* I type. I rest my wrists on them when I *stop* typing, but I still have to read. Reducing wrist travel makes it far more comfortable to deal with using the computer, not just typing or mousing about. It's a bit of a holdover from piano lessons. If you can't keep a quarter on the back of each wrist as you type/click, then you're doing it all wrong.
As for the questions regarding accidental gestures, the system I'm using traces my gestures, and only responds when I'm holding down a button. The result is a system that costs me my right mouse button (with an updated interface, I'm sure I could assign that to the fourth button on my logitech.), and even that can be turned off pretty quickly.
That said, the gesture system has its flaws, and so does excessive reliance on wrist pads. You have to know what you're dealing with before you hurt yourself doing something stupid.
I'd have to disagree with that. I've been running on a gesture system for a while now, and it's actually more convenient for most simple tasks. Since I don't have to move my hand from mouse to keyboard as often, and I have a decent setup (gel wrist pads are amazing), and I *stop* every hour or so to make sure that I don't overdo it, my wrists have been fine.
The reduction of mouse to keyboard switches has done wonders for my overall speed with my system, and caused the small amount of pain that I was already in to go away.
Maybe you should give it a shot before making any claims, Michael.
> from the up-up-down-down-left-right-left-right dept.
Karma points to whoever remembers the old Nintendo (yeah, the 8-bit thing I still play constantly in my dorm room) game this code is from. That is probably the only video game code I'll still remember when I'm 80:)
IIRC Mentor Graphics (a VLSI design tool) included gesture control. Since most of the time you were mousing components around anyway, it was convenient to use gestures for cut, copy, paste, etc. There were special selection "strokes", and other tool-specific commands. In fact, as I read about libstroke it sounds like the author of that library was inspired by similar CAD programs.
Really, the only thing you need to add for great gesture recognition is another mouse - one for each hand would really improve the gesture complexity you could generate, and make things much faster too. Ultimately I foresee some sort of VR glove (like in the oft-maligned Johnny Mnemonic) where you can type, move things around with your hands, and set up specific hand motions to do certain actions (a karate chop or a scissors motion means "cut", etc.). That would be the real convergence of the mouse and keyboard.
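As an aside, the libstroke library mentioned above apparently encodes a stroke as the sequence of cells it crosses on a 3x3 grid (numbered like a phone keypad) and then matches that string against user bindings. A small sketch of that general idea, assuming I have the scheme right; this is not libstroke's actual API.

    def grid_sequence(points, grid=3):
        # Map the pointer trail onto a grid x grid lattice over its bounding box
        # and record the sequence of cells visited, numbered 1..9 keypad-style.
        xs, ys = [p[0] for p in points], [p[1] for p in points]
        w = (max(xs) - min(xs)) or 1.0
        h = (max(ys) - min(ys)) or 1.0
        seq = []
        for x, y in points:
            col = min(grid - 1, int((x - min(xs)) / w * grid))
            row = min(grid - 1, int((y - min(ys)) / h * grid))
            cell = row * grid + col + 1
            if not seq or seq[-1] != cell:
                seq.append(cell)
        return "".join(str(c) for c in seq)

    # An "L" drawn down the left edge and along the bottom comes out as "14789",
    # which could then be looked up in a table of gesture -> action bindings.
    bindings = {"14789": "close window", "123": "next page"}   # hypothetical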
Caution: contents may be quarrelsome and meticulous!
You won first prize for the most stupid post of the day. Get your facts straight: KDE makes no money, has been GPL since day one, and is truly in the spirit of free software development.
I can't speak to the implementation in an OS, but there's a gestural component to Discreet compositing software, and believe me, it makes things *much* faster. Personally I've never had any trouble with CTS, nor has anyone else here, and I've certainly never heard of increased rates of this in folks in the biz. I suspect it's related more to implementation than anything else. I mean, if I was expected to rename a file with gestures, duh, but wiping the screen to cancel an operation, or wiggling to undo something - I could see that being a nice adjunct to the normal methods.
As long as I can turn it off when playing unreal tournament.:)
This switching from keyboard to mouse to keyboard to mouse is exactly why I prefer to drive everything from the keyboard. I use WinNT at work, and have found that it's quite easy to get around (most of) the system & developer tools with keyboard strokes only.
[OT: Can someone point me to good resource for keyboard-only navigation under Gnome? Or is KDE better in this regard? I'm using Sawfish at the moment.]
[OT(2): This is one of the things that really bugs me about the Macintosh; you must use the mouse on a Mac.]
Anyway, in order to reduce my travel from the keyboard to the mouse (necessary even when you can reduce the use of a mouse), I bought an ergo keyboard that has a built-in touchpad in the lower right corner. It took some getting used to, but now I love it.
My main point: I think gesture control would be easiest with one of these touchpads, and not with a mouse (and probably very difficult with a trackball).
I have used some high end CAD software like Mentor Graphics that uses gestures extensively. It is more productive than using a mouse when you get used to it.
Actually, I would LOVE to see this in a game like Diablo II. You learn spells, and then you cast them by making gestures with your mouse over your enemies. Being a magic user might actually take some talent then.
Grab UAE or dust off your Amiga and search for a copy of Tower's Curse.
Agree with you about "optional". One major issue that isn't immediately clear to me is how actions get bound to gestures. The more the user can customize and assign, the better; that's something that *VERY* few OSes or apps get right now. Of course, good intuitive defaults are important, but it's not nice to straitjacket users.
On the other hand, the other big unknown is how long it will take before these kinds of interfaces can progress beyond simple commands. I was playing around with OS X voice commands the other day, and I realized there aren't that many. While I'll probably be using a keyboard 'til I die, it's still interesting for another reason. As you start creating the ability for a computer to do something like "open a new document with vim in my complaint letters folder", progressing to "download all the images with thumbnails on this page and put them in a new folder called 'Kournikova'", you eventually start crossing over into the real ability of a computer to use language. No doubt, these kinds of simple OS-related tasks will be the first practical application of this.
Interesting, but I still prefer keyboard shortcuts. Heavy mouse usage makes me rest my hands, which makes my elbows tingle after a while. Seriously, keep your wrists in the air as much as possible. There's a reason you never heard about carpal tunnel syndrome in the era of manual typewriters.
Boss of nothin. Big deal.
Son, go get daddy's hard plastic eyes.
aww come on man! It doesn't take much to realize that punishing the creature makes it stop if you do it enough times. If he is holding food and you punish him (perhaps that 'food' is a villager) he will learn not to eat that food.
As long as it's intuitive, something many programmers and UI designers fail to grasp, I have no problem with this, as long as I can use other methods.
I don't want to have to make circles over and over again with my mouse to mean rearrange the icons by name, or in the opposite direction to sort by size.
The multi-platform 3D modeling program blender [blender.nl] (full featured and fits on a floppy!) has had this for a few years now. It's really pretty easy to get used to.
I mean, once it's a huge success and adopted by everyone, I'm sure someone will creep out from under some stone and claim the patent, and would all makers of software using gestures for input now please pay the license for the past 3 years (anyone remember GIFs?).
It's not that I think of this kind of input as an obvious concept, but maybe the folks implementing it should check whether someone is already applying for a patent. It'd also be a good idea to record where and with whom the idea first came up; it'd make looking for prior art so much easier in ten years' time.
My god man, look at your user id!!! After all this time you are STILL whining about unnoteworthy news. Well, it's okay. Maybe you were confused and thought you were on a different website. This is slashdot, for the love of god! Come on man! Wake up!
Sorry, had to get that horrible pun out of my system.
Seriously: while gesture controls may not be ready for prime time just yet, consider that the technology may prove useful for those who communicate best using only their hands. A 17-year-old from Colorado recently won the grand prize at the International Science and Engineering Fair [rockymountainnews.com] for designing a glove that can interpret the movements of someone "speaking" in American Sign Language (ASL) and then output the communication as text.
So, the hard of hearing could control their computer with ASL commands, or dictate letters the same way the rest of us can with a voice-powered word processor, using beefed-up gesture control technology and, of course, hardware that can reliably interpret their hand movements. It's somewhat tangential to the story, but an intriguing concept nonetheless.
I disagree. If you're already holding the mouse, you can move around really quickly with gesture-based navigation in Opera. Of course, this is assuming you're using a real mouse; with a laptop touchpad this wouldn't be convenient at all. --
It was an expensive MIDI composing program for the Mac. I can't remember the name of it. The last time I used it was in '94. It let you shake off a tool that you selected.
If KDE really wanted to improve their interface, they should make those tiny little toolbar buttons a lot bigger by adding labels. When you increase the size of a target (aka control, aka widget), the user can access it faster (something we in the UI industry call Fitts' law). Right now, KDE has billions of tiny buttons that aren't very forthcoming as to what they do (a problem alleviated by a label) and that have crappy access times as a result of their tininess. Just like all those buttons in M$ Office. I guarantee you that few users, if any, ever use the toolbar buttons in Word or Excel, because they're esoteric and have no speed advantage.

Another problem with KDE is lack of progressive disclosure, which is the concept of putting the most simple, basic options at the top level of an interface, and then giving the user the option of digging down to a more complex level if needed. KDE doesn't do this. They throw 18 billion menu entries, buttons, and other controls straight at the user. When this happens, users feel completely overwhelmed and won't know where to begin in using the program. Just looking at Konqueror makes my head spin.
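For reference, the usual (Shannon) formulation of the Fitts' law mentioned above, where T is the time to hit a target, D the distance to it, W its width along the direction of motion, and a, b are empirically fitted constants, is:

    T = a + b \log_2\left(\frac{D}{W} + 1\right)

Bigger targets (larger W) and shorter trips (smaller D) mean faster acquisition, which is exactly the argument for larger, labeled toolbar buttons.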
I'm not bashing KDE for adding a good advanced feature like gesturing, but this seems to be just one more instance of a trend that desktop environments have followed as of late: adding cool, trendy, buzzword-compliant technologies but then completely blowing it on the most basic and fundamental UI design principles.
"But you can customize it" people say
"But if you dig deep enough into the configuration, you can change" people say
Such are the ideas that hold linux from the desktop. Many users starting off will do neither, and shouldn't be expected to try to improve things that should have been improved to begin with. If there's something in an interface that is supposed to be done (e.g. labeling toolbar buttons) and makes an interface more usable, it should be the default.
Actually, I find that if I can get all my commands on the keyboard, I am much faster. If I have to pick up the mouse it really slows me down. I would rather every function in an application or desktop have a shortcut than a gesture. Since most do, or allow you to create your own, I love it. The mouse is just a burden.
I kinda like that "Blender" ( www.blender.nl ), a 3D graphics app that uses a few "mouse gestures" for common tasks. I've found I use this quite a bit. Might be interesting on the desktop.
I hope before this trend diversifies (and M$ claims to have innovated it several years from now, while pooh-poohing it in the short run), that someone sets about putting together some standard gestures so our environments will act similarly.
I decided to give it a try, and my keyboard usage has dropped way down. I really like rolling one's hand over the mouse buttons to move forward and back, not to mention the quick gestures to minimize or close.
I use a screen with a lot of real estate, so keeping mouse movements to a minimum by not having to hit small targets is a good thing. Besides, you can always use it the old-fashioned way.
Almost every Konami game included some cheat using that sequence. Ah, the actual sequence was up-up-down-down-left-right-left-right-B-A-B-A-START.
On Gradius you got all weapons, on Contra you got infinite lives, and the list goes on. On Gradius II your ship exploded:)
From the libstroke (the library behind KGesture) homepage [etla.net]: Thu May 10 - Version 0.5, featuring a GNOME version of the library written by Dan Nicolaescu, is being tested and will be released soon.
Does that mean we can expect GNOME to have the same feature soon?
Hello All,
I wrote KGesture the other day. It is a gesture recognition application
for KDE. As far as I know, it is the first of its kind and offers
something MS doesn't.
Problem 1:
KGesture relies heavily on DCOP to communicate with running applications.
Most of the feedback that I have been getting back from users is that they
can't do what they want because the application doesn't make it available.
I was wondering what the possibility of increasing the number of methods
available in a stub is? Does performance take a hit as the number of
methods in an interface increases?
For example, most users want back() and next() from Konqueror exposed so
they can control the browser with gestures. Also, minimizing applications
(I suppose through a KWin interface).
What are the chances of getting these exposed? Is KGesture going about
this the wrong way?
Problem 2:
Currently users must use the 'dcop' command to find the method they want
to trigger. Is there a DCOP browser out there, or a widget that builds a
tree from DCOP information?
Letting the user easily select a dcop function would be a big plus.
If you're interested, KGesture can be found here:
http://www.slac.com/~mpilone/projects/
It started as a little adventure in stroke recognition, but people seem to
like it, and it is a new approach to computer interaction (at least on the
desktop with a mouse! CAD apps have been doing it with a pen forever).
Thanks in advance,
-mike
Mike Pilone Computer Scientist
mpilone@slac.com http://www.slac.com/mpilone/
Visit http://master.kde.org/mailman/listinfo/kde-devel#unsub to unsubscribe
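Regarding Problem 2 in the mail above: if memory serves, the plain 'dcop' command already exposes the whole hierarchy (no arguments lists registered applications, one argument lists an application's objects, two list an object's functions), so a crude tree dump can be scripted on top of it. A hedged sketch along those lines, assuming the dcop binary is in the PATH:

    import subprocess

    def dcop(*args):
        # Shell out to the 'dcop' command and return its non-empty output lines.
        result = subprocess.run(["dcop", *args], capture_output=True, text=True)
        return [line.strip() for line in result.stdout.splitlines() if line.strip()]

    def dump_tree():
        # application -> object -> function, indented two spaces per level.
        for app in dcop():
            print(app)
            for obj in dcop(app):
                print("  " + obj)
                for func in dcop(app, obj):
                    print("    " + func)

    if __name__ == "__main__":
        dump_tree()

The same walk, fed into a QListView instead of print(), would be the tree widget the mail asks about; kdcop (mentioned further down in the thread) appears to do exactly this.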
I hope that gestures take off. I've been using Opera [opera.com] for a few days now and have played a bit of Black and White [bwgame.com] too.
I don't know if it's just the novelty value, but I'm finding them useful. Holding the right mouse button and moving around is very easy and quickens things up if you're browsing with the mouse (no need to reach for the keyboard again). It's a lot quicker to hold the right mouse button and drag up and down than it is to find the reload button and click it.
I agree with the poster above, there needs to be a well defined set of gestures that will work in all applications before it really takes off.
One thing that really helped deal with the pain was shoveling my car out of 18 inches of snow. Twice. (Damn inconsiderate snow plows. :) I've also found that visiting my local trigger point therapist [about.com] (random google link) has helped a lot. So, it's exercise and massage therapy if you don't want to be troubled with RSI.
I do have a pair of the exercise balls, and used them rather religiously for a while, at least a month. While they might have helped a bit, they didn't provide enough relief.
Another thing - acupuncture can be great too. That, and someone else says carpal tunnel/RSI results from a chronic intra-cellular magnesium deficiency - this stuff [naturalcalm.net] is a great magnesium supplement.
Am I the only one who quickly abandoned B&W, due to frequent misinterpretation of gestures? I'd try to punish my creature, and instead I'd get him to drop the food he's holding. Gesture commands are stupid, especially when we have a perfectly good keyboard to use.
The only "intuitive" interface is the nipple. After that, it's all learned.
After spending some time playing with this, I have learned a few things:
1) Don't bother with fancy gestures yet. Only simple ones, like an L, will work with any reliability.
2) You *must* pause before and after the gesture. This is CRUCIAL. A slight pause, gesture, pause.
3) Scale doesn't matter. Small l, big L, it doesn't matter. It would be nice if I could filter smaller motions.
4) Understand DCOP. dcop will allow you to do gestures for all sorts of KDE apps. For example, you could have a gesture that makes the Konqueror web browser "go back" or "reload", or one that has KMail check your mail. Try running kdcop, a graphical DCOP browser, to get an idea of what's possible.
DCOP is one of the most wicked things about KDE. And you know what? Each and every KDE application is linked against DCOP, so potentially all of them support it. Konqueror definitely supports DCOP, and has a nice interface for simpler methods like "open a new window"; for more complicated things like "back" or "reload" it gets a little tricky, but you CAN do it... fully control your Konqueror from the command line.
DCOP and the DCOP interfaces will only get better and better, KDE should push and promote this stuff. It's something GNOME really can't compete in, yet, not even with Bonobo.
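To make "control your Konqueror from the command line" concrete, a gesture daemon really only has to map recognized gestures onto dcop invocations. A minimal sketch, with the caveat that the application, object, and method names below are placeholders; the real ones have to be discovered with kdcop or the dcop command and differ between applications and versions:

    import subprocess

    # Gesture name -> dcop argument list. These targets are illustrative only;
    # browse kdcop to see what your applications actually expose.
    GESTURE_BINDINGS = {
        "swipe_left": ["konqueror", "konqueror-mainwindow#1", "back"],
        "swipe_up":   ["konqueror", "konqueror-mainwindow#1", "reload"],
        "letter_m":   ["kmail", "KMailIface", "checkMail"],
    }

    def trigger(gesture):
        args = GESTURE_BINDINGS.get(gesture)
        if args:
            # Equivalent to typing: dcop <application> <object> <method>
            subprocess.run(["dcop", *args], check=False)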
No, it's not new, and it is not revolutionary. But it is a good idea that has taken a surprisingly long time to catch on in the mainstream. While I can probably see better uses for it than in a web browser (e.g. a CAD program, or some other mainly graphical app, such as computer games or graphics applications), it is interesting to see if the time is ripe for such a good idea to finally reach the mainstream...
In case you don't like to take my word for it, gesture recognition has existed in at least one free GPL'd C++ library for quite a few years (long before Gtk came to life, and AFAIK also before Qt, at least before it became popular). The library is called Amulet [openip.org] and has several other interesting features, such as a constraint solver for geometry layout. Another interesting aspect of Amulet is that it does not use class-based inheritance, but uses a prototype-based approach (in the form of a C++ library) instead.
But the history goes even further back. Basically, Amulet is just a reimplementation in C++ of an even older Common Lisp library called Garnet [faqs.org]. Sadly, Garnet seems to be relatively unmaintained these days, but it provides many of the same features in a much better suited language (yes, Common Lisp is better suited for this kind of programming, although I must admit that modern C++ is a surprisingly flexible language, given its static nature).
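For readers who haven't met the term, "prototype-based" means objects get their behaviour by delegating to other objects rather than to classes; you clone a prototype and override only what differs. A toy illustration of the lookup rule (nothing to do with Amulet's actual implementation):

    class Proto:
        # Minimal prototype object: slots are looked up locally first,
        # then along the chain of prototypes.
        def __init__(self, proto=None, **slots):
            self.proto = proto
            self.slots = dict(slots)

        def get(self, name):
            if name in self.slots:
                return self.slots[name]
            if self.proto is not None:
                return self.proto.get(name)
            raise AttributeError(name)

        def clone(self, **overrides):
            return Proto(proto=self, **overrides)

    rectangle = Proto(width=10, height=5, color="black")
    red_rect = rectangle.clone(color="red")               # overrides only the color
    print(red_rect.get("color"), red_rect.get("width"))   # -> red 10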
Damn, I have the habit of always resting my wrists when I type [1]. I also do this when using the mouse, do you happen to know if that's correct behaviour?
[1] Well, I'm not resting them now and I'm typing slower and with more errors than before, also I have the tendency to have my wrists float away. Hope this will pass.
Unfortunately Gnome doesn't have anything like DCOP, therefore every program would have to implement this itself.
BTW: does anyone in here know how hard it would be for Gnome to be made compatible with DCOP (excluding the actual applications that would have to be modified)?
One question I've always had about this technology is: How often do you inadvertently make one of your programmed gestures with the mouse? Do you find yourself browsing the web and all of the sudden a new window is opened, or you accidentally close the browser window?
So, the hard of hearing could control their computer with ASL commands, or dictate letters the same way the rest of us can with a voice-powered word processor, using beefed-up gesture control technology [...]
While this sounds nice in theory--and it would be very cool to see something like this in action--I fail to see how it would be faster, cheaper or more efficient than a plain old computer keyboard. --
When the last /. gesture-based story [slashdot.org] was posted, I followed a poster's comment and downloaded Sensiva [sensiva.com]. But while the program works well*, has a beautiful interface and offers a new way of controlling the UI, it doesn't perform the killer function: working faster than existing solutions. No matter how fast I can wheel and deal with my mouse and enter all kinds of fancy symbols, it's still faster to enter a keyboard combo and not have to move my hands from the home keys.
Now in a closed system, such as a game, drawing program, etc., mouse usage is much more important. But for general use, gesture-based input will only work at the expense of speed.
* I found that it works well for simple symbols. More complicated glyphs, such as figure eights, and symbols with crossover lines, were hopeless, no matter what size or speed I tried them. --
First off, I'm not a doctor, but rather someone who uses Opera for their browsing. While gesture navigation does take some time to get used to, I would have to say that if implemented as well as in Opera, it could effectively reduce CTS. When using Opera, one can use the mouse for everything, and if your mouse is already ergonomic (a la my Logitech Mouseman+), then the way you move is no different than simply moving the mouse. It could reduce time spent typing, and movement of the mouse, because of the smaller area the mouse has to travel (assuming that the PS/2 sample rate is high enough, or that it's USB).
Not just for the hard of hearing. Other people could use it too.
It might look weird for someone to be flailing his/her arms in front of a computer, but for those who know it, ASL is probably more natural and easier than typing, since it operates with words rather than letters.
ASL is one of those things that I've wanted to learn for a long time (and that I think everyone should learn). If my computer could do this, it would give me more incentive to learn it.;-)
This is another thing like QWERTY keyboards: it's really slow. I used it in Opera for a bit, before switching back to K. It isn't really worth it if you're trying to get something done; keyboard shortcuts are a much better idea.
Quick close is Ctrl+Q, quick minimise is Ctrl+Shift+M:) I too use a high resolution, but then I also use a supersensitive mouse (ya, one of those...) set to move at high speed. It's a lot faster than gestures still, and I ain't switching. --
Yes. It's a revolutionary almost-new form of HCI. If you believe the hype, it's as significant as the change from command-line to point-and-click.
Hopefully, though, it'll never become mandatory, because just as mouse can be slower than keyboard, this isn't as fast as using a keyboard and mouse in the conventional way. --
When the "gesture control" program will be able to recognize more than 26 "gestures", they'll be able
to create a very nifty "letter gesturing" control.
This will be groundbreaking ! I suggest that we call this new feature "handwriting recognition". My friend next to me, always very creative, suggests that we call is "grafiti", but somehow I can't imagine that a company would be so misguided as to use such a silly name.
"A door is what a dog is perpetually on the wrong side of" - Ogden Nash
One question I've always had about this technology is: How often do you inadvertently make one of your programmed gestures with the mouse? Do you find yourself browsing the web and all of the sudden a new window is opened, or you accidentally close the browser window?
It's happened to me a few times with Opera: to open a link in a new window you drag the mouse down, and to close a window you drag the mouse down and right. It is a pain to want to open a new window, only to accidentally bump to the right during the gesture and end up losing the original window. That said, it doesn't occur often enough that I turn gestures off; I'm so used to it that I find myself attempting to gesture in other browsers.
What we really need is obscene gesture control. I'm thinking Evolution might be a likely candidate for this natural turn of events, so that the recipient of my emails can somehow interpret what finger I was holding up when I flamed them.;-)
Actually, I would LOVE to see this in a game like Diablo II. You learn spells, and then you cast them by making gestures with your mouse over your enemies. Being a magic user might actually take some talent then.
Depends on what you're looking at:) Actually, I tried Opera, and I loved the gesture-based navigation. I didn't like Opera very much, though. Since KDE is my primary desktop, this project interests me very much.
I cannot STAND the Gesture-based interface that Black and White uses. I can't even stand the game, actually, but that's another matter - my opinion on it is that it's a dumb game made even worse by an atrocious input system.
Gesture navigation has its place in some environments. I can see it working in CAD, maybe in some 3D games where less precision is required, and such. But why anyone would want to navigate through a Linux GUI, Windows, or whatever else you use with gestures beats me.
Close window: Slam mouse cursor three times at the top of the screen and scream "Close already!" Go back: Slam mouse cursor twice to the left of the screen. You can see how it will go.
Maybe those with an inability to make precise movements with their hands would benefit from this method of input (MS or Trisomy-21 come to mind), but it's just too cumbersome for normal use.
After someone mentioned that Blender was gestural (I knew it was, but didn't make the connection till recently), I dragged out my Wacom tablet and gave Blender another spin (I do some work in it, but always found it a pain before), and using the tablet, suddenly Blender is a lot more intuitive.
Not so much revolutionary as evolutionary. Just as keyboard shortcuts allow us to use the keyboard for much more than simply typing, gestures ("mouse shortcuts"?) allow us to use the mouse for much more than simply pointing.
I've been using Opera 5.1x for a few weeks now, and at first, I didn't use the gestures at all. But slowly, I'm incorporating them into the minutiae of my tasks. If I've already got my hand on the mouse, then often times a gesture is quicker than a keyboard shortcut (reload, new window, close). I still almost never use the forward and back gestures, but that may change in the future.
I'm not saying gestures should *replace* keyboard shortcuts, because they can't. I'm saying that they should be provided *in addition to* keyboard shortcuts. Any technique that helps me get my work (and play:-} done faster is welcome.
Am I the only one who quickly abandoned B&W, due to frequent misinterpretation of gestures?
There are limitations to gesture controls, but B&W is built for it. It's part of the game, because you're a god, so waving your hand around should produce miracles, not some function key being pressed. It's part of the mystique and aura of the game, and involves you in the story more. So while gesture based commands may be slightly gimmicky for everyday workstation use, it works well in a game where you're supposed to be a 'god', producing magical effects with the wave of a hand. I have temporarily given up the game cause it never really saves my games, but I really like the gestures for this game only. I'll stick to keyboard commands for everything else I think.
EA's game Black and White uses this (or something similar to it) for part of the game interface. Pretty cool, but I wasn't able to get comfortable with it.
It is hard in B&W to get comfortable with. But, the gestures are much more complex than the ones you find in Opera. In Opera the most complex gesture you'll ever do is left-right-left. In B&W, to cast a simple Miracle like Shower, you have to draw a spiral, and then an S. It takes some serious practice to get it working consistently. Even after playing for several hours, I still have to try 2 or 3 times usually.
I think this is from Contra, except the actual code was:
up-up-down-down-left-right-left-right-B-A-B-A start... don't quote me on that though.. it's been a while...
What's wrong with just using the old mouse button to launch programs/commands/etc.?
Aren't we just complicating the simplest of tasks? I mean, people already have a hard enough time trying to figure out the Jot letters on their Palm Pilot.
I can only agree with you. There are good developers doing great work on KDE, but one should give them some books from Nielsen, Tognazzini or Raskin for Christmas, so KDE will not only look better than Windows and MacOS (you can do anything with themes...) but also feel better and be faster to use.
I have used some high end CAD software like Mentor Graphics that uses gestures extensively. It is more productive than using a mouse when you get used to it.
This is something I'm not understanding in this particular story; people saying Gestures are more intuitive than using the mouse.
Am I missing something or do you make the gestures with the mouse? You're either using the mouse to click on things in a menu/structured manner or you're waving it around in a gesture. Either way you're using the mouse.
Personally I find using a mouse to navigate the pointer around a screen, and clicking on menus, plus using the keyboard with the other hand for eg CTRL-C+CTRL-V to copy and paste to be a pretty intuitive system and utilises both hands to good effect. Your mouse hand does the work of navigation+selection, your other hand performs functions such as copy, cut, paste, delete, close-window, move-to-other-desktop etc. With gestures you're doing this presumably with just one hand.
It doesn't seem intuitive; it still relies on one hand on a mouse wiggling around and pressing buttons. But instead of selecting discrete options with discrete clicks and movements, it necessitates learning relatively complex movements of the mouse. Something I find no better than mouse+keyboard (and mouse+keyboard is a damn sight more predictable in its response), and it automatically makes things hard for people with physical disabilities who may be able to use a mouse or trackball to select things but for whom making complex gestures is an impossibility.
I occasionally teach IT classes and I find the thought of trying to teach a 'gesture' to someone, and giving them, or the computer, the time to learn it, a potential nightmare.
This is a very small step from the standard input devices to the ultimate goal of mind-controlled input (where all you need to be able to do is think what you want to do and your computer then does it; anything else is just abstraction to one extent or another between what you want to do and how to tell your machine to do it).
I'm not 100% on this, but in the developer's notes in the README file (I believe), he mentions that Konqueror doesn't support DCOP, so you can't use KGesture to control it. Also, Mozilla's -remote option doesn't seem to be working with the GoBack, etc. commands yet. Not sure about Netscape 4.x's remote commands, though.
I have found it useful for skipping forward / back in XMMS though, especially if it is minimized, or on a different desktop, etc. Not having to switch applications is very convenient.
I have been using gestures in Opera and have fallen in love.
I can go back, forward, open new windows on links, duplicate windows, refresh... all without ever touching the keyboard.
The Opera gestures are simple and useful. The Black and White gestures are complex and hard to use.
I agree, though, that this is a somewhat lame article for Slashdot.
I'm sure at least some of you have done over-the-phone tech support (even a little).
Ever have trouble getting someone to right-click and drag? Try this:
"OK, now close your current window by holding down the right and middle mouse buttons while making a square, clockwise, starting from the top left corner. Now, open a new konsole by holding down the right mouse button, and then moving right a short distance, then down, then right a short distance, then up."
DarkWinter, part-time gesture phone support expert, fulltime psychotic.
After using the gestures in Black & White, I can truly say that they moderately suck, at best. With a mouse they aren't too awfully bad, but with a trackball they are a nightmare. Plus, there is nothing more frustrating than repeatedly making the wrong gesture because it's similar to the one you meant. I have enough problems clicking on icons on the desktop with my trackball... I won't stand a chance with gestures.
What hardcore user wants to reach over for the mouse and attempt gesturing something that, if you really cared to do it frequently, would be a keyboard chord already?
The only thing a mouse is good for is Quake. That is, until they invented a scrolly-wheel.
Part of WinCE on the PocketPC is the ability to do all these gestures as part of the handwriting recognition. Just using a stylus instead of a mouse. Select some text, click and swish around to delete it. Click and move quickly to the left to backspace, etc...
Erm, try right clicking on the toolbar handle and this is what you get:
Text Position->Icons Only, Text Only, Text aside icons, Text under icons.
There are your labels. They have been there ever since KDE 2.0 development first started. You can also select icon size, with a choice of small, medium, and large.
As for "progressive disclosure", I don't see this problem, but if you do, almost all of the menus and toolbars are constructed out of XML. Edit them if you think you can do better and post it to the KDE mailing lists (or the application author).
Great... (Score:3)
Netscape: "RAM! We need more RAM!"
Take it or leave it... (Score:1)
If you like keyboard shortcuts chances are you don't use the mouse very much either.
I just want to know if I can make these gestures do cool things when I am in Emacs
Re:the original mail from mike pilone (Score:1)
Yes, kdcop.
Re:Carpal Tunnel Syndrome from Gestures? (Score:2)
Read what JWZ has to say [jwz.org].
Og, I'm turning into a fan boy. Oh well, postmodern.
No it doesn't (Score:1)
What exactly are you referring to?
strokes (Score:2)
(Yes, XEmacs is an OS/Desktop!)
Re:Gesture Standards (Score:3)
By freezing a standard too quickly, bad implementations can be propagated more often than necessary.
Re:I've got a use for it (Score:1)
Could someone work on an interface for those old Nintendo Power Gloves [ebay.com]?
Re:Already in use... (Score:1)
Somebody, help pry this game out of my fingers! It's stuck and I can't do anything else...
Opera's Gesture Control is Great! (Score:1)
I like it. Even if you only use gestures to back up one page, it is much faster.
Yes it does (Score:1)
* draw a line to translate the current selection
* draw a v-shape to scale the current selection