
Windows 7 Touch, Dead On Arrival

snydeq writes "Ongoing Microsoft hype around its Surface touch technology has suggested that, with Windows 7, a touch-based UI revolution is brewing. Unfortunately, the realities of touch use in the desktop environment and the lack of worthwhile development around the technology are conspiring against the notion of touch ever finding a meaningful place on the desktop, as InfoWorld's Galen Gruman finds out reviewing Windows 7's touch capabilities. 'There's a chicken-and-egg issue to resolve,' Gruman writes. 'Few apps cry out for a touch UI, so Microsoft and Apple can continue to get away with merely dabbling with touch as an occasional mouse-based substitute. It would take one or both of these OS makers to truly touchify their platforms, using common components to pull touch into a great number of apps automatically. Without a clear demand, their incentive to do so doesn't exist.'"
This discussion has been archived. No new comments can be posted.

  • by EdZ ( 755139 ) on Tuesday September 15, 2009 @10:41PM (#29435253)
    And thank goodness for that. Touch interfaces are acceptable where there isn't room for anything else (though the lack of a physical keyboard is always highly unpleasant), but I'd hate to see multitouch become the 'standard' interface for desktop computing. Sure, it's fun to throw about a few snapshots or fly about Google Earth. For all of 5 minutes. Try actually DOING anything, however, and you'll quickly switch back to a 'traditional' interface in order to avoid grief.
  • by Brian Gordon ( 987471 ) on Tuesday September 15, 2009 @10:43PM (#29435261)

    The problem is that with laptops/desktops the screen isn't really in a good position to touch accurately.

    But I like the idea of getting rid of the persistent cursor. You just leave it lying somewhere on screen when you're not using it; there's no reason to leave it sitting there, or to navigate awkwardly between controls, when you can just touch.

    I'm reminded of the PC vs. console gaming argument about how mice are better because you can snap directly to a target instead of holding the control stick and waiting as you pan around. Well, touch vs. mouse is the same argument. With the mouse you have to start pushing your mouse across the mousepad, wait for it to reach its destination, and then fire. With touch you just tap the spot.

    Obviously touch would never work for FPS controls, but desktop controls are similar: "aiming" at the little 5-pixel-high link may be harder than it has to be.

  • A solution (Score:5, Interesting)

    by grasshoppa ( 657393 ) on Tuesday September 15, 2009 @10:55PM (#29435375) Homepage

    AKA: A solution in search of a problem.

    Having used touch screens for a variety of applications, I'm having a hard time envisioning their use in a home environment. We're all used to the precision offered by a mouse, and no one wants a touch screen TV.

    It would take a radically new appliance to thrust touch technology into the limelight.

  • by mysidia ( 191772 ) on Tuesday September 15, 2009 @10:57PM (#29435401)

    I see X as able to support all sorts of input devices; touch screen support should be standard.

    We should get touch features in common apps, and they should be done in a way that makes the experience superior to anything Windows can muster.

    Hey, if that ever happens, it could be the year of the Linux desktop :)

  • by fractoid ( 1076465 ) on Tuesday September 15, 2009 @10:59PM (#29435419) Homepage

    Exactly. In fact I'd say more along the lines of "nothing, ever." Touchscreens are a fun idea but except for very specific cases (pocketable computers, public terminals a la ticket machines at train stations for instance) they're horrible in practice. You get grubby fingerprints all over your screen and the ergonomics are bad - extended use will require either a weird sitting position or severe shoulder strain. On top of that, you always have your fingers/hands in front of whatever you're trying to select.

    What I really want to see is the idea that was floating around a few years ago for iPhone-style tablet devices, where the back of the device is a multitouch sensor and the touch points are displayed as cursors on the screen. No grubby fingerprints, no fat fingers in the way.

  • by fractoid ( 1076465 ) on Tuesday September 15, 2009 @11:04PM (#29435477) Homepage

    With the mouse you have to start pushing your mouse across the mousepad, wait for it to reach its destination, and then fire. With touch you just tap the spot

    You're forgetting the huge speed amplification you get with a mouse, and the fact that you still need to move something (your finger, or your cursor) to that spot to tap it. Moving my mouse about 2 inches moves my cursor through about 15 inches. Moving my finger 15 inches to press a button requires moving my whole arm 15 inches.

    What I want to see is accurate gaze tracking. If I stare at the center of a button, it stays static in my field of view - even if my eye's making microscopic movements, it should be possible to reverse-engineer the pattern to determine the point of gaze. Couple that with a physical switch to 'click' (I like the idea of making a 'click' noise with your tongue for a simple, intuitive, self-contained interface) and you have the only point-and-click device that will beat a mouse (no, you with the track ball sit down, it's just an upside down mouse).
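    The speed-amplification point above can be made concrete with Fitts's law, a standard pointing-performance model (not something cited in the thread): acquisition time grows with log2(D/W + 1), and a mouse's control-display gain lets a 2-inch hand movement cover 15 inches of screen. A rough sketch, with illustrative made-up device constants:

```python
import math

def fitts_time(d, w, a, b):
    # Fitts's law: MT = a + b * log2(d / w + 1)
    # d = distance to target, w = target width, (a, b) device-dependent constants.
    return a + b * math.log2(d / w + 1)

# Placeholder constants; real values come from controlled experiments.
MOUSE = (0.10, 0.20)   # (seconds, seconds/bit)
TOUCH = (0.05, 0.15)

d, w = 15.0, 0.2       # a 15" reach to a 0.2" target (roughly, that 5-pixel link)
for name, (a, b) in [("mouse", MOUSE), ("touch", TOUCH)]:
    print(f"{name}: {fitts_time(d, w, a, b):.2f} s")
```

    Note this is a simplification: strictly, control-display gain rescales both distance and target width in hand space, so its benefit shows up in the empirical constants rather than the index of difficulty.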

  • by 0100010001010011 ( 652467 ) on Tuesday September 15, 2009 @11:11PM (#29435549)

    Who says you have to touch the screen? OS X (10.6) and my MacBook Pro are an amazing blend of this technology.

    I have 1, 2, 3 and 4 finger gestures right on my track pad. Switch applications, show the desktop, Expose, launch, rotate, zoom, scroll. Everything is rather intuitive.

    The only thing is that it took me about a week to adjust from a standard button/trackpad concept to one large button, and the surface feeling is a bit ... different.

  • It's a lot less of an effort to use a mouse than it is to use a touchscreen.

    I think that depends very much where the touch is. For example, the touchpad on my laptop takes very little effort to use.

    On the other hand, I absolutely cannot play FPS reasonably on the thing, so maybe you're right.

  • by oferic ( 603861 ) on Tuesday September 15, 2009 @11:26PM (#29435667)
    I use handheld computers on a regular basis at work. When I switch back to using a laptop after spending some time using a touchscreen device, I naturally want to touch the screen to move windows, select items from the taskbar, etc. It's silly that the functionality is missing. There's no need for this to replace the mouse. Touch-display and mouse input should complement each other.
  • Touch Interfaces (Score:2, Interesting)

    by AvenNYC ( 1042622 ) on Tuesday September 15, 2009 @11:51PM (#29435849)
    As I've said before, if you can touchify an OS, it's great. I use a very specific version of Windows XP on the lighting console I use. The dual touch screens take the place of the mouse (there's a trackball built in, but it's really only used when a touch screen has issues) and of course tons of hard buttons and knobs, etc. By combining the two, touchscreens and keyboards (hard buttons), you can get everything done so fast you wouldn't believe. I don't think you can have only one or the other and go as fast as having both. That being said, it's built so the touch screens are at the right angle (and height, but that's up to you) and distance from the hard buttons, to make everything easier - you don't end up moving your hands too much. Even hours of using touch screens don't make you too fatigued.
  • by Runaway1956 ( 1322357 ) on Tuesday September 15, 2009 @11:51PM (#29435851) Homepage Journal

    Parent deserves mod points. The keyboard came first, after all. It took me some time to get used to the idea of a mouse, but today, they coexist on the very same computer. Imagine that, huh?

    So, go ahead, put the touch stuff up there. There are times when a stylus or a finger can do something that I will NEVER accomplish with that stupid mouse. Just don't kill my mouse off. I hate the little bastid, but I can't get along without him!!

  • Apple and Touch (Score:3, Interesting)

    by MidnightBrewer ( 97195 ) on Tuesday September 15, 2009 @11:58PM (#29435909)

    I assume the author's comment about Apple "merely dabbling" in touch interfaces was in reference to desktops only? Apple runs circles around Microsoft when it comes to successful touch interfaces built onto their OS's back end; look at the iPhone. Microsoft's own Windows Mobile platform makes almost no effort whatsoever by comparison.

  • Re:it's just useless (Score:3, Interesting)

    by eggnoglatte ( 1047660 ) on Wednesday September 16, 2009 @01:28AM (#29436377)

    Judging by the success of almost every recent 3D movie in the US, I would say they are here to stay.

    When "Jaws" came out in 3D long ago (longer than I care to admit), it was just like this. Then, over time, the novelty wore off, people realized that about 20% of them can't see stereo to begin with, another 10% got simulator sickness from the mismatch of depth cues (projection size, parallax, and focus all compete for representing a different depth), and the rest of the audience got annoyed at having to wear glasses. 2-3 years later we were back to 2D movies.

    I am taking bets that it'll go the same way this time around...

  • Clicking GUI buttons is a far cry from trying to draw accurately. The demands of a graphic artist are nothing, at all, like the demands of the general computer-using masses.

    Of course, a stylus and tablet are also nothing, at all, like a touchscreen.

    The millions of office workers out there really do not want to sit for eight hours a day holding their arms in front of them like mummies. I'd say it's likely to be physically impossible for a human to do that for more than a few minutes without the muscles fatiguing to the point where they are nonfunctional.

    This touchscreen garbage keeps coming up every so often, usually with a tone of regret, lamenting the fact that the technology hasn't made any real inroads. There's a reason it's made no inroads, and that's a lack of demand. The reason the lack of demand is there is because touchscreens pretty much suck.

    You iPhone-loving kids deal with touchscreens in a very specific, limited, handheld system for reasons I can't quite fathom but I will acknowledge that the technology seems to work for that very specific, limited, handheld system. Anything more complex and touchscreens seriously start to bite, and all attempts at integrating them into a normal computing experience have been met with failure because they bite.

    Other than the iPhone, which I still don't even like, I've only seen one useful, real-world application where touchscreens were a good idea, and that's POS systems, particularly in restaurants. As a waiter I could wander over to one, tap the screen a few times, and place or modify an order. But those were also severely limited systems, with a user interface designed with a small number of very specific functions arranged into large, easy-to-tap buttons. It didn't need to do anything else, it didn't do anything else, and so the touchscreen worked well for one-handed operation (and no risk of spilling crap all over a keyboard).

    Given the totally limited places touchscreens have ever been useful, I have to say WHO CARES if it never really goes anywhere?
  • Re:kinda like... (Score:5, Interesting)

    by mikael_j ( 106439 ) on Wednesday September 16, 2009 @02:13AM (#29436563)

    Well, it would probably make sense to make touch-enabled interfaces more table-like and less wall-like. That is to say, to make them horizontal.

    Also, for people like artists I can definitely see how a large multitouch surface with the ability to switch between "hand mode" and "stylus mode" could be very useful, it would be like an oversized Cintiq with the ability to move things about with your hands.

    Unfortunately most touchscreens coming out these days seem to be geared toward the same market segment that buys D-Link network equipment (that's the "cheaper is better even if it sucks compared to the competing product that costs 2% more" crowd) with pricing that resembles that of Wacom's professional products.

    /Mikael

  • by mosch ( 204 ) on Wednesday September 16, 2009 @03:05AM (#29436805) Homepage

    I just wish Apple would sell a desktop keyboard with a multi-touch pad attached to it.

    I really like it on the laptop, but then I switch to my desk, and... nada.

  • The obvious solution would be to put the touch-screen flat on the desk (and split the keyboard out to either side). Add eye-tracking to switch context/windows, multi-touch on-screen interaction, and built-in windex for a potentially workable solution..?

    Jeez. The obvious solution would be to use computers the way they are, until some serious shift in the nature of human-computer interaction is required. It works fine the way it is.

    Right now I can kick back in my chair, sit upright, slouch around, glance at the screen while talking to people, and so forth. It works fine for basically every computer user with two functional arms, and many with only one. In fact, right now my feet are on my desk as I type this on a laptop, and I can move one hand to control the workstation sitting on the desk if I need to.

    Your solution would require us to all sit hunched over our desks, staring straight down so we could see the screen, train ourselves to limit our eye movement, spread our hands on both sides of the desk like we're having trouble holding up our body weight (which, after sitting hunched over like that, we might)...

    I fail to see what is wrong with the current desktops and laptops as they stand today.
  • Re:kinda like... (Score:3, Interesting)

    by imakemusic ( 1164993 ) on Wednesday September 16, 2009 @04:21AM (#29437229)
    I'm a musician and a touch screen would be perfect for me. Unless you want to be staring at the computer, a mouse is very limiting. The time it takes to figure out where you left the cursor and reposition it over the thing you want to click on can be crucial, especially if you're performing live. Plus you get the touchy-feely-ness which is severely lacking in a keyboard & mouse setup.

    Yeah, it might get a bit greasy but then so will I.
  • by RonUSMC ( 823230 ) <RonUSMC@[ ]il.com ['gma' in gap]> on Wednesday September 16, 2009 @04:28AM (#29437265)
    ***********these are MY own personal opinions and not the opinions of my employer, they are mine and mine alone, just like the ones on my blog, http://rongeorge.com/ [rongeorge.com] *****************

    I work at MSFT and just happen to work on the Advanced Design Team that designs Natural User Interfaces for several products around the Org. I myself specialize in touch and multi-touch devices and gestural languages. The thing you have to remember is that Touch, Multi-Touch, and Pen are all already supported in the core of the Windows 7 operating system. This isn't a small feat. No other OS has that today. The bigger fact is that we have had that for over a year now. The API recognizing the difference, and the ability to track so many targets, is monumental in the input field. Ask any interaction designer who knows the history: it all goes back to input devices and drivers "tricking" the OS into thinking the input was something other than what it truly is. Silverlight 3 also has this functionality already built in. These are core functions that allow any software developer around the globe to start building multi-touch applications right NOW. Not next year, but right now. The code is there; build it.
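    For readers outside the Windows world, the multi-point tracking described here can be sketched platform-neutrally. The toy recognizer below is illustrative only (it is not the Windows 7 API, which exposes contacts through window messages such as WM_TOUCH): it classifies a two-finger frame pair as a pinch by the change in distance between the contacts.

```python
import math

def dist(p, q):
    # Euclidean distance between two (x, y) contact points.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_two_finger(prev, curr, threshold=5.0):
    """Toy gesture classifier (illustrative, not a real platform API).
    prev/curr: [(x, y), (x, y)] positions of two contacts on consecutive frames."""
    delta = dist(*curr) - dist(*prev)
    if delta > threshold:
        return "pinch-out"   # fingers spreading: typically zoom in
    if delta < -threshold:
        return "pinch-in"    # fingers closing: typically zoom out
    return "none"

# Two contacts spreading from 100 px apart to 180 px apart:
print(classify_two_finger([(0, 0), (100, 0)], [(0, 0), (180, 0)]))  # pinch-out
```

    A real recognizer would also track contact IDs across frames and handle more than two fingers, which is exactly the bookkeeping an OS-level API does for you.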

    We are by far not "merely dabbling"; I think that's ludicrous. Do you have a multi-touch device and is it working right now? Yes. That is not dabbling. There is a lot of great stuff that Microsoft has put out with this release and so many more great things to come. The one thing to remember, though, is that as a platform, we have to do things with other developers in mind. I came from the Surface Team before going to ADT and want to clarify something. Surface does respond to touch, but remember that it is a vision-based system and WAY ahead of the competition. It has hover, item recognition, and so many other capabilities that other companies can also build on. Once again, it is a platform. Don't confuse them; they are separate devices, but both with very rich interactions and uses.

    I also see all this about Apple and the iPhone. If you want to give credit where credit is due.... you should all say Wayne Westerman and not Apple. He is the genius that Apple bought and brought over to save their failing tablet and turn it into a phone. His company, Fingerworks, made an incredible product that still has very loyal fans.

    I stopped using a mouse 2 years ago, and have never looked back.

    PS: If any of you are in Seattle and would like a demonstration of Surface's capabilities along with a Win7 touch demonstration, please drop me a line, contact info is at my blog. I would be happy to show you around campus as long as you write about it here. Thanks for reading.
  • by MtViewGuy ( 197597 ) on Wednesday September 16, 2009 @07:04AM (#29437979)

    Indeed, what you described is why I think a variant of Mac OS X 10.6 will show up on the rumored Apple tablet. It will be firmware based, and the tablet itself will use the new Intel Atom N450 CPU, which will be available at the same time Apple finally ships their tablet computer.

  • by ledow ( 319597 ) on Wednesday September 16, 2009 @07:38AM (#29438155) Homepage

    I think you miss the point then.

    Multitouch is niche. Taking a percentage of, say, Windows 7 users: hardly anybody has the equipment. Hardly anybody has the software to support it (not just the OS but applications, etc.). Hardly anybody has a practical use for it - yeah, you can use gestures, etc., but one-finger/cursor gestures are just as easy, have been around for longer, and nobody really uses them at all. The common ground on those three is inherently small.

    It's so niche that despite being the "only OS" with it (I would contend that it depends merely on your definition of multitouch - multitouch support in software from a *user's* point of view has been there for years; it may be that Windows now has some *proper* interfacing for the code behind it, that's all) and having APIs and trying to get people to use it, not many do.

    There just aren't that many practical applications for it that aren't fulfilled more simply, cheaply, efficiently and easily by other means (i.e. just using a normal single-point touchscreen). It might make a cool interface for a Star Trek game. It might let you use *more* gestures if you can be bothered to learn them all, but it certainly does not replace a mouse on the average business desktop, or average home user. I don't even know of any business that *knows* what multitouch is - they don't really care either.

    It's a niche piece of technology - like stereoscopic 3D games/movies, like cool Wii controller addons, like £1000 sound systems. Yes, it's fun. Yes, loads of people will play with a demo. No, you're not going to run the world on it and including it in the standard OS is a bit of a waste of development time. Personally, I'd have been happier if MS hadn't spent so much time on it in their main OS and had just released it as a pay-for addon for those who wanted it (public kiosks, possibly? Air-traffic controllers? I don't know).

    If you stopped using a mouse, you're really too blinkered. Tell me how one plays a fast-paced FPS effectively on a multitouch screen without breaking their arm? Or drags and drops without rubbing their finger raw and/or dropping things all over the desktop? iPhones, etc. use multitouch because the screen space is limited and gestures are required to save "interface bandwidth" (i.e. the amount of things you can put on the screen at once). Desktops don't have those problems.

    It's not even that revolutionary a technology - nowhere NEAR what touchscreen was originally. It's a tiny addition from the user's point of view. I'm really unimpressed, to be honest. I'm actually more impressed by GlovePIE which has had a form of software multitouch for ages (i.e. multiple active cursors on an unmodified Windows desktop, each independently controlled by a vast array of possible hardware).

  • Re:kinda like... (Score:3, Interesting)

    by cyphergirl ( 186872 ) on Wednesday September 16, 2009 @08:14AM (#29438351) Homepage Journal

    I've always thought it would be kEwL to have a tablet-like computer mounted on the front of one of my kitchen cabinets, wirelessly downloading recipes from my main desktop through a custom cookbook client-server software. With a touch screen, I could easily control it to view parts of a recipe or do measurement conversions (after wiping my hands off, of course). Mounted vertically on a cabinet keeps it out of range of splashes and spills, and out of the hands of the kids. Alas, I am not a software engineer and my husband is a bit too busy to hack things like this together at the moment.

    So, I guess I'm the one screaming "Hey I WANT to put big honking greasy fingerprints on my screen!", but not "Oh, and I want my kids to scratch the living hell out of my screen....".

  • Neck pain (Score:3, Interesting)

    by DrYak ( 748999 ) on Wednesday September 16, 2009 @08:29AM (#29438441) Homepage

    Well, it would probably make sense to make touch-enabled interfaces more table-like and less wall-like. That is to say, to make them horizontal.

    Someone is definitely looking forward to having a sore neck.

    You see, your idea is great, except that for the past couple of million years since Lucy, we've slowly evolved and somewhat adapted to an upright position.
    Granted, our level of adaptation isn't optimal yet, given all the typical human diseases associated with the upright position.

    Nevertheless, the way we are organised, we're better at looking at things in front of us. Not at looking down.

    As an example, just ask any university student currently in an exam session:
    they stay the whole day in the library and spend this time reading - i.e., looking *down* on books *laid* on the table.
    Neck and back pain result from this. Much more than what's seen in people doing a day job with computers (where they watch a screen in front of them).

    Touch screens and other "Minority Report"-like hand-controlled stuff are really cool in theory. But there is a fundamental problem with all of these:

    • We are good at looking in front, not down on a table...
    • ...and our hands are better at rest down low (on a table, for example). More muscle strain if they are held in front of us the whole day.

    Therefore, biomechanically, we're just not well organised to have the input and output localised at the same level.

     
    (Which is one more point telling us that if we were indeed intelligently designed, the design wasn't intelligent in the first place. And probably drunk.)

  • by foniksonik ( 573572 ) on Wednesday September 16, 2009 @11:05AM (#29440439) Homepage Journal

    The big advantage of a touchscreen is that you don't have to find the cursor/pointer to start manipulating. With a mouse or a trackpad, every action you perform has to start where the last action left off. This means a lot of repetitive moving of the cursor/pointer to get from point a to b to c, back to a, back to b, etc. With a touch screen you avoid all of this repetitive input.

    For point and click users a touch screen could actually reduce the amount of input activity they have to do by 50% or potentially even more as touch gestures tend to be much more effective than having to click multiple buttons or keys to achieve the same results.

    The reality is that very few people are *constantly* interacting with the GUI. More typical is for people to manipulate a window (scroll) then read for 2 minutes, then repeat. On my laptop I could do that while resting my hand on the lower surface, touch the bottom scroll arrow with a finger or my thumb and not think twice about it. It would be no different than resting my finger above a down arrow key. Move a window, resize or minimize... these are very brief actions that occur every hour or so and a lot of people already avoid them with multi-touch input or key combinations.

    The question to ask is "What do we do repetitively and frequently with a mouse that would be a burden with a touch screen?"

    I honestly can't think of much. There are some accuracy issues with specific GUIs which would not work well with a touch screen if fingers were the only input option (a stylus would solve that) - but otherwise I just can't think of any job related or leisure time activity on a computer that is so repetitive and frequent that it would cause muscle fatigue if a touchscreen were used instead.

    If you are referring to typing - well everyone knows that a keyboard is the best interface for that activity, why would a touchscreen device not have a keyboard? We're not talking exclusively about Tablet PCs here... that's just one form factor.

    I think all laptops should be touchscreen and all monitors should also be touchscreen. They should both still have keyboards of course and potentially a trackpad or mouse for when you need very accurate input. However I think people would adopt the touch interface for 99% of their activities without breaking a sweat and in fact will work less hard and be less mentally fatigued at the end of the day as they will be able to relax that part of their mind which currently controls the mouse... something not everyone is good at.
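    The repetitive-travel claim above is easy to quantify: with a pointer, each acquisition starts where the last one ended, so total cursor travel is the sum of the legs along the visit sequence, while each touch tap starts fresh at the target. A quick sketch (the control positions are invented for illustration):

```python
import math

def pointer_travel(targets, sequence):
    """Total cursor distance when visiting named targets in order (a -> b -> c -> ...)."""
    pts = [targets[name] for name in sequence]
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

# Hypothetical screen positions (pixels) for three controls.
targets = {"a": (0, 0), "b": (300, 400), "c": (600, 0)}
print(pointer_travel(targets, ["a", "b", "c", "a", "b"]))  # 2100.0
```

    Whether that saved travel outweighs the reach and fatigue costs discussed elsewhere in the thread is, of course, exactly the open question.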
