GUI Software Education

Tangible Interfaces for Computers 158

Jesrad writes "A friend pointed me to this impressive demonstration of the SenseTable by James Patten, of the Tangible Media Group at MIT. This project aims at conceiving better human-machine interfaces by using the concept of physical objects that the user can manipulate, to represent abstract computer data and commands. The device looks and works a lot like what was envisioned in Minority Report: it uses pressure to track blocks on a sensitive surface, and feeds back to the user by superimposing graphical data. Want to change the volume of your MP3 player? Just put a block on it and turn it like you would a radio knob. Menus and commands are accessed by moving a block along a command hierarchy represented as a simple tree, or by touching the command's name. So far it only lacks a device for text input, like a keyboard, but maybe voice recognition will replace it?"
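For readers who want a feel for the knob metaphor described above, here is a minimal sketch of how a tracked block's rotation could be mapped onto a volume setting. The Player structure and the angle values fed in are invented for illustration and are not part of any published SenseTable API; the only real content is the angle unwrapping and scaling.

#include <stdio.h>

#define TWO_PI 6.283185307179586

typedef struct {
    double volume;   /* 0.0 (mute) .. 1.0 (full) */
} Player;

/* Treat the block like a radio knob: a change in its tracked angle
 * (in radians) nudges the volume, one full turn sweeping the whole range. */
static void apply_knob(Player *p, double prev_angle, double new_angle)
{
    double delta = new_angle - prev_angle;

    /* unwrap across the wrap-around point so a small physical twist
     * is never read as a huge jump */
    if (delta >  TWO_PI / 2) delta -= TWO_PI;
    if (delta < -TWO_PI / 2) delta += TWO_PI;

    p->volume += delta / TWO_PI;
    if (p->volume > 1.0) p->volume = 1.0;
    if (p->volume < 0.0) p->volume = 0.0;
}

int main(void)
{
    Player mp3 = { 0.5 };
    apply_knob(&mp3, 0.0, TWO_PI / 4);            /* a quarter turn clockwise */
    printf("volume is now %.2f\n", mp3.volume);   /* prints 0.75 */
    return 0;
}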
This discussion has been archived. No new comments can be posted.

  • I've seen virtual keyboards, but this is beyond amazing. I do not think in a few years we will be able to recognize a computer. It will have evolved that much.
    • by holt_rpi ( 454352 )
      SCOTTY: Computer....Computer? (Technician hands SCOTTY the mouse. SCOTTY uses it as a microphone) Hello, computer.

      TECHNICIAN: Just use the keyboard!

      SCOTTY: The keyboard? How quaint!
    • Or, alternatively, (Score:5, Interesting)

      by fireboy1919 ( 257783 ) <(rustyp) (at) (freeshell.org)> on Saturday November 08, 2003 @11:44AM (#7424159) Homepage Journal
      It won't work.

      The typewriter interface has been with us for over a century. We've become accustomed to it.

      I remember watching Minority Report and thinking "people don't like computers now. Do you think they'll be willing to learn such an obviously unintuitive and totally new interface?"

      This seemed like it would be especially true outside the tech sector, such as, for instance, in law enforcement.

      Remember that the only intuitive interface is the nipple. Everything else is learned. Some people may use this, yes, but I doubt most. I don't think most can deal with anything beyond using the mouse and keyboard.

      Otherwise, the following things would be used, since they're faster even though they have a higher learning curve:
      -mouse gestures would be HUGELY in use
      -keyboard shortcuts would be known by almost everyone
      -everyone would be using vi or emacs in a wysiwyg mode instead of wordpad/notepad/word.
      -User interfaces with only a single type of action (clicky-clicky) wouldn't be popular.

      When and if this is ever true of most of society, then we'll be ready for the new interfaces.
      • -mouse gestures would be HUGELY in use
        -keyboard shortcuts would be known by almost everyone
        -everyone would be using vi or emacs in a wysiwyg mode instead of wordpad/notepad/word.
        -User interfaces with only a single type of action (clicky-clicky) wouldn't be popular.

        I disagree. Those things make using a keyboard and mouse harder, whereas new input devices could be easier to use than a keyboard and mouse. The reason no new technologies have been widely used is because they have no signif

        • If a new input device (maybe a nipple? :) ) was introduced which was intuitive, easy to use and had a significant advantage over the current devices, I think it would be picked up in a second.

          A nipple is no good in its current implementation, which explains why I use a USB mouse with my Thinkpad. I find it very hard to suck on the nipple (Trackpoint), see the screen, and click the mouse buttons at the same time. Plus my boss accuses me of sleeping on the job due to "keyboard face".
          • Ted Selker invented and developed the red "joy button" keyboard clitoris into a product (the thinkpad) while he was at IBM Almaden Research Center.

            According to IBM's ad in Time magazine, it was "so hot we had to make it red". Ted made an even hotter version in the lab, that had TWO "joy buttons" -- one for each hand!

            ONE joy button was so hot they made it red, but the TWO nipple keyboard was too hot even for IBM management to handle.

            They tried to come up with a keyboard that was robust enough to withst

      • by Shrubber ( 552857 )
        I think it would *especially* be easier to implement outside of the tech sector where you do have a lot of people who are not used to the typewriter interface, even today.

        A huge number of people have no idea what they're doing with a computer in their jobs, they simply are trained to press buttons and click a mouse in a certain set of steps in order to do what they need to do in order to get their paycheck. Really most office workers aren't much different than Pavlov's dogs.

        On the other hand those peop
      • Something similar was said when the mouse was first invented...
      • "The typewriter interface has been with us for over a century. We've become accustomed to it."

        I agree with you that the typewriter interface isn't going anywhere, but I don't agree with your reasoning.

        These days, computer fear is dying. Go back to the 80's. How many people had computers? How many have computers today? Look at how kids use computers today, do you really think that they're suddenly not going to want to use them 30 years from now?

        So why do I feel that the typewriter interface isn't go
      • Before making baseless assertions about new technologies, perhaps you should try out a new system [ideanest.com] that takes its lead from the infamous Minority Report interface. Kids love it [ideanest.com] and find it quite intuitive, even though it's only a first cut and could benefit from many improvements [ideanest.com]. There are many other similar projects out there that have met with varying degrees of success (I'll let you do your own research), but to spout aphorisms like "it won't work, because it's not a nipple" is stupid.
      • by netsrek ( 76063 ) on Saturday November 08, 2003 @03:26PM (#7425032) Homepage
        Remember that the only intuitive interface is the nipple. Everything else is learned.

        Stop repeating this crap. Have you ever watched a baby have to learn how to breast feed? There's a reflex there to get to the nipple, but actually doing the feeding isn't intuitive at all.

        There are no intuitive interfaces, only ones which are similar to other interfaces you've already learnt how to use...
        • This is not true. I have three younger sisters.
          Two of them are young enough that I remember it, and besides, there's abundant scientific proof. I leave the burden of getting it on you since you're the one who doesn't believe, and I believe other readers will find this obvious.

          Newborns automatically know all the mechanics of how to feed, defecate, and cry. Walking and crawling are also known to be built-in, though they don't take effect until babies are physically capable of them.
          • Well it definitely wasn't intuitive for my daughter, the children of two friends, or the babies in the ward at the hospital. There is a reason why you have nurses showing new mothers and their babies how to breast feed... It's not just the mother not knowing how to do it. You watch next time you see a newborn attempt to breastfeed for the first time. There is an instinctive suckling going on, there is an instinctive move towards the breast, but they definitely don't know how to get milk out of the nipple
      • I remember watching Minority Report and thinking "people don't like computers now. Do you think they'll be willing to learn such an obviously unintuitive and totally new interface?"

        Funny. I remember thinking, "Boy would my arms and shoulders be sore after a few hours of that!"

        I'll take the "lazy" keyboard/mouse interface any day of the week; I only have to twitch my fingers and wrists to get something done. The only interface lazier and more effective than the keyboard has got to be direct BCI (Brain Co

    • Not to be a total plug, but FingerWorks MultiTouch technology offers seamless gestures, pointing, and typing on the same surface. Very similar to this MIT work, except you don't manipulate secondary objects, you manipulate fingers on the surface directly. Different finger combinations can attach to different controls like Zoom, Undo/Redo, etc. We'll soon have an SDK available with which you can directly connect zoom/scale/translate hand motions to your favorite GUI controls.

      For those who haven't noticed
    • I do not think in a few years we will be able to recognize a computer. It will have evolved that much.


      And the Linux geeks will be using *40* year old Unix commands, raving about vi and the great CLI.
  • ... just press the button to type the specific character?
    One could even have different keyboard layouts being switchable with a knob... oh, wonder, wonder!

    Feel free to add other irony below...
  • Dang, Slashdotted already. That didn't take long.
    • I managed to read the page but can't get at any of the demonstration videos.

      Anyway, as for a pressure sensitive table, it sounds like a great idea but... I thought they were working on a table that read variances in the magnetic flux caused by hands moving over the table.

  • umm... (Score:3, Insightful)

    by mozumder ( 178398 ) on Saturday November 08, 2003 @11:26AM (#7424092)
    So why can't you just put a volume knob on that MP3 player?
  • Now we just have to convince the guys who make these [slashdot.org] to associate with the Tangible Media people. Minority Report indeed.
  • by p4ul13 ( 560810 ) on Saturday November 08, 2003 @11:31AM (#7424110) Homepage
    "Where's the any key?"

    You'll have to reply with "Well where did you leave it last?"
  • Oral audible hell (Score:5, Insightful)

    by kfg ( 145172 ) on Saturday November 08, 2003 @11:32AM (#7424113)
    "So far it only lacks a device for text input, like a keyboard, but maybe voice recognition will replace it?"

    Or maybe they'll just plug a keyboard into it? Voice recognition may well have its uses, especially as an accessibility technology, but as a general input device it's really a pretty poor idea.

    Unless we're all supposed to sit in a cone of silence or something.

    KFG
    • on screen keyboard anyone?
      • on screen keyboard anyone?

        So long as you can touch type on it at 80 wpm, or even two finger "Columbus Method" at 40 wpm.

        On screen "keyboards" suck. A lot. And hard too. Not to mention what they cost in Windex.

        About the only thing worse is selecting words from a dictionary.

        KFG
    • by temojen ( 678985 ) on Saturday November 08, 2003 @03:26PM (#7425030) Journal
      number sign bach back back back hash back #include lessthan back <standard I oh back back back s t d i o dot back . h >
      int mane back maine back main bracket back ( int argh! see back back a r g c commet back , char star back ** a r g v ) brace back {
      print f ( quote back "hello world") semicolon back ;
      ] no not that brace the other one back back back back back back back back }
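      (For anyone decoding the dictation above: once all the "back" corrections are applied, the program being spelled out is just C's hello world, which rather makes the point about dictating code. Reconstructed as dictated:)

      #include <stdio.h>

      int main(int argc, char **argv)
      {
          printf("hello world");   /* no return was dictated; C99 lets main fall off the end */
      }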
  • Call me a skeptic (Score:5, Insightful)

    by GFW ( 673143 ) on Saturday November 08, 2003 @11:33AM (#7424116)
    While various tangible interfaces might be useful in specific circumstances, the typical user doesn't want more crap on their desk. They want a flat, easily positioned, brilliant screen (or three). They want a keyboard (which could be virtual, but most people prefer some tactile feedback for typing). They want something for pointing (which could be a glove, a mouse, entirely virtual, ...). They don't want a metaphor that looks like Play-School.
    • ummm maybe
      like ms drones
      they want all that cause they havent experienced anything else

      i know i havent....
      but would love the opportunity
    • by Anonymous Coward
      Additionally, it doesn't seem to me that there's much of a difference between this and current user interfaces. It looks shiny, but basically what they do is use a block of wood to point at things instead of a mouse. And orientation matters as well as position. Other than that, it's just drag-and-drop and point-and-click, except without mouse buttons and with shiny lights...

      Lourens
    • I bet that what most typical users really want is to get all this crap off the desk: the monitor smashed in, the keyboard thrown out of the window, and the mouse stuffed up the sysadmin's, because the software side of the interface is non-intuitive and can be frustrating to use.
    • Like the interface in Minority Report with the gloves. It was fantastic.
    • by gidds ( 56397 ) <slashdot@NospaM.gidds.me.uk> on Saturday November 08, 2003 @12:30PM (#7424324) Homepage
      How many people said something similar when the WIMP environment (e.g. Mac) went public? "Real computers need you to type everything! Anything worthwhile can be shown as text - if I want to see pretty pictures I'll go to an art gallery! And keep those mice in the toybox where they belong!"

      Initially, that took lots of space, seemed a waste of resources, and you couldn't do much with it. Since then, resources have increased tremendously, new applications and methods have been developed that make good use of it, and people see the extra desk space as worthwhile. I don't know if the same might happen to the SenseTable, but I do know that if so, it won't be because it fits today's hardware, apps, and interfaces, but because it'll fit tomorrow's.

    • by Wacky_Wookie ( 683151 ) on Saturday November 08, 2003 @12:54PM (#7424427) Journal
      This is perfect for Dyslexics!

      And I should know, I am one.

      For Dyslexics and people who have never used a computer before, a command-line-only interface is a MASSIVE hurdle. A GUI cuts the time it takes a dyslexic to learn about computers by a factor of 10. A tactile user interface would IMHO speed up the learning (and normal human/computer interaction) by a factor of 1000.

      For example, I cannot spell, yet I'm asked to write the User Docs for my firm's computer systems all the time. If I were in the land of Typewriters, I would probably not even have a job, let alone be asked to write for other people. So the GUI did for my interest in computers the same thing that computers with spell check did for my employability.

      As a dyslexic, a TUI (Tactile User Interface) matched with a good 2D or 3D GUI is the Holy Grail.

      In fact, a TUI would turn a 3D user interface into a useful human/computer interaction method.

      The human brain is designed to work in a 3D space with tactile feedback. Anything else requires the brain to waste resources on a "translator system" in order to use things like command-line-only interfaces. And for Dyslexics, everything gets mucked up in "translation".

      If computers had been command-line only when I was in school, I would not have been interested in them and would not be doing what I am doing right now: sitting in the office on Saturday night (I'm in London), posting on Slashdot instead of ironing out the kinks in these new computers that my firm just bought.

      Wait... maybe GUIs are bad :)
      • That's cool, I hadn't even considered the trouble a dyslexic would have with a text based interface.

        I was thinking as a musician (which some might consider to be another result of mis-wiring in the brain :-) that we are generally trying to minimize the amount of friction the interface contributes to the task of controlling the computer as an instrument. I believe this is one of the reasons it has taken so long for them to catch on in music in general, and are still rarely used in performance for anything
        The primitive parts of the human brain are designed to work in 3D space. The more advanced parts are perfectly comfortable with an abstract means of communication such as speech or typing. Interestingly, it appears that typing requires separate resources from speech, and that speech interferes with the resources used for thinking. So the keyboard interface is surprisingly powerful --- it takes advantage of the brain's well-developed abstract communication centers, and allows you to think and communicate at t
    • What I want for development is a wall size display with sensors that can tell where my fingers are and what my eyes are looking at. I want it to recognize gestures for scrolling around, linking, backing up, and so on. That's for examining code. When I write code, I want a keyboard. Voice control might be useful for a few queries, such as Who Wrote This, Where Is This Used, but in general I don't want to spend all day yakking to a computer.

      For home use, voice response for controlling a/v, lights, etc, t
  • by haydon4 ( 123439 ) on Saturday November 08, 2003 @11:36AM (#7424125)
    So far it only lacks a device for text input, like a keyboard, but maybe voice recognition will replace it?

    I talk to my computer enough as it is. The day that it actually listens to me is the day that I'll have to rebuild it every other week, and red will be the day when it starts talking back to me.
  • What if they make the blocks smarter by building a display on the top surface, wheels on the bottom, and a processor inside? The block-top interface could display additional information. The table could automatically move the blocks into a pre-designed configuration (or adjust the configuration to match user-initiated movements of some other blocks). The wheeled mechanism could provide haptic feedback as the user moves the blocks along the table. Distributed processing among the wirelessly networked bl
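    A purely speculative sketch of that idea, just to make it concrete: each block carries an id, a label for its top-surface display, and a target position assigned by the table, and a control loop rolls the block toward that target (the same loop is where a real block could push back against the user's hand for haptic feedback). None of these types or functions correspond to any real product; they are invented for illustration.

    #include <stdio.h>

    typedef struct {
        int    id;
        double x, y;          /* current position on the table, in cm  */
        double tx, ty;        /* target position assigned by the table */
        char   label[32];     /* what the block's top display shows    */
    } SmartBlock;

    /* One control step: drive a fraction of the remaining distance toward
     * the target; a real block would also apply resistance here as haptic
     * feedback when the user pushes it off course. */
    static void block_step(SmartBlock *b, double gain)
    {
        b->x += gain * (b->tx - b->x);
        b->y += gain * (b->ty - b->y);
    }

    int main(void)
    {
        SmartBlock b = { 1, 0.0, 0.0, 10.0, 5.0, "volume" };
        for (int i = 0; i < 5; i++) {
            block_step(&b, 0.5);
            printf("block %d (%s) at (%.2f, %.2f)\n", b.id, b.label, b.x, b.y);
        }
        return 0;
    }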
  • Audiopad (Score:4, Informative)

    by LeoDV ( 653216 ) on Saturday November 08, 2003 @11:40AM (#7424142) Journal
    A concept like this one has already been explored at MIT [mit.edu] with the Audiopad [mit.edu] (Google Cache [216.239.59.104]), which is used to make music but could really serve as a new, innovative kind of interface.

    What I'm waiting for is for someone to combine that Linux HD of the PS2 and the EyeToy into a Minority Report type interface.
    • The demonstration video in the article shows Audiopad, among other things. It is the very same MIT people that are working on AudioPad and the SenseTable.
    • This is actually the same thing, developed by James Patten and Ben Recht [mit.edu]. I was just on the site a couple of days ago looking at it for an MIS class. This page, however, is the first time that I have seen some of the other unique applications, such as SandScape.

      Now, earlier in the comments, fireboy1919 said that it wouldn't work because people are unwilling to learn a new interface in addition to the ones they are already good at. I think whether it is successful depends on the application of the system
    • eh, did you read the article? Your link is to the same project as in the article...
    • re: the parent comment.. the audiopad is also done by james patten. so it was him that "already explored" it.

      the reason this work looks like the table in minority report is because john underkoffler, a former member of the same group at the media lab, was science & technology advisor for minority report and designed/spec'd/envisioned/whatever some of the devices used in the film. some of john's research on which that table was based:
      http://tangible.media.mit.edu/projects/Luminous_Room/Luminous_Room.h
  • by Anonymous Coward on Saturday November 08, 2003 @11:42AM (#7424151)
    ...than this movement-sensitive plastic block I have on my desk right now. It actually responds to the physical movements of my hand and includes pressure-sensitive areas that allow me to interact with virtual desktop metaphors. I can actually move this device over the virtual mp3 player on my desktop and apply pressure to one of the sensitive areas to change the volume.
    • by Anonymous Coward
      Strange. I tried what you're describing, I moved the thing over the monitor to the iTunes window and pressed it on the volume control but it didn't work. That, and I can't see the screen behind the plastic block thing, that's most unpractical.
  • I thought my keyboard was already pretty tangible, but I just came out of watching Revolutions ... so, my brain hurts now. Maybe I'm still trying to connect what the "pinching" had to do with anything in that movie. Maybe nothing's real! It's all an intangible mess of connectors to something unknown, unreal.

    Blah! Hogwash!
  • by BinBoy ( 164798 ) on Saturday November 08, 2003 @11:48AM (#7424171) Homepage
    I hate this whole movement. Using computers should become EASIER. Who wants tired arms from searching on the computer or back pain from moving files? I'd prefer to do this stuff with a click of a mouse button.

    • Yes. Apparently the lesson of gorilla arm [astrian.net] has been forgotten. Each generation seems set on repeating the mistakes of its predecessors.

    • Maybe it would be good for people who can't (or at least shouldn't) use a mouse anymore. I know at least one person who uses a tablet instead of a mouse because of his CTS.

      In addition, it would certainly be nice to be able to have more than one focus point on your screen, especially in real-time programs like audio and video production (even in non-realtime apps; if you've ever played with Reason you know what I mean). There's also something to be said for increased precision when you don't have to
    • Who wants tired arms from searching on the computer or back pain from moving files? I'd prefer to do this stuff with a click of a mouse button.

      Let's use our imagination just a little. First, imagine that screen tech gets cheap enough so that anyone can have a 4x8 foot screen. Make a desktop of that screen and you will wonder how anyone ever made themselves stare at a tiny monitor all day. Papers could be laid out so that you can stare at all of the material needed at once and virtual desktops would real

  • by wfberg ( 24378 ) on Saturday November 08, 2003 @11:51AM (#7424180)
    I think "loseable" would be a better one.. I can't even find the remote control for my TV most of the time (and I have 3 RCs); it would be a BAD idea to have all sorts of controls that do different things and contain state information.. Can you imagine losing the volume knob?
  • by LeoDV ( 653216 ) on Saturday November 08, 2003 @11:53AM (#7424192) Journal
    Even when the technology is perfected to Star Trek standards, i.e. you don't even need to think about articulating to make yourself understood by the computer, keyboards will remain the preferred input method of many, including me, simply because they're the fastest. I haven't ever "learned" to type but I average around 100 WPM and peak at 120, without a Dvorak keyboard. I'd rather use that to jot down an idea, write a letter, program or post on Slashdot than voice recognition.
    • An average person speaks at about 100-150 words a minute, without much effort. An old lecturer of mine was once clocked at 250wpm while giving evidence as a scientific adviser (boy did you have to pay attention in his lectures!).
      So a person with no training at all, given perfect voice recognition, could dictate faster than you could type after (presumably) a lot of typing practice.
    • Your comment says it all. You didn't write I haven't ever learned to type without the quote marks around "learned", because that wouldn't be true. The truth is that you have learned to type, albeit informally.

      You, perhaps, are sure that a keyboard will be the most efficient input device you can ever hope to use. But I fail to see how that relates to the possibility of a more intuitive interface for future generations.

  • by bluethundr ( 562578 ) * on Saturday November 08, 2003 @12:02PM (#7424219) Homepage Journal
    This work reminds me of the work that Douglas Engelbart [stanford.edu] was doing in the 1960s. And while I think this new interface work is great and needed, I also believe that the biggest impediments to adopting new methods are cultural ones. While you could (and should) say that the delay in adoption of Engelbart's ideas (windowing systems, a mouse for input) was due to the technical challenge of bringing these methods to home computing machines, you can't forget that cultural forces were also at work slowing people's acceptance of the GUI.

    But a more dramatic example of the slowness of cultural change is the fact that I am typing this on a QWERTY keyboard. Dvorak [mwbrooks.com] has been around for years but still we type on devices that show their Victorian age heritage. Even when there is no need at all for the random shuffling of the alphabet across the current keyboard in the way we use it!

    Another fine example is the red-headed stepchild of the Engelbart revolution: the BAT [nanopac.com] keyboard. The BAT is supposedly easier to learn to use (I've never tried it myself) than a regular keyboard, and supposedly more ergonomic as well. It is also easier on the joints (or so they say). Now it's mostly sold to people who have Carpal Tunnel Syndrome and other injuries/disabilities. But it was originally thought to be a better input method for everyone (injured/disabled or not).

    Engelbart was right about most things (which were later refined by others into the form in which we now recognize them), but the BAT just never caught on. Too different, probably, from what people had already been using for over a century.

    • But a more dramatic example of the slowness of cultural change is the fact that I am typing this on a QWERTY keyboard. Dvorak [mwbrooks.com] has been around for years but still we type on devices that show their Victorian age heritage. Even when there is no need at all for the random shuffling of the alphabet across the current keyboard in the way we use it!

      You know that this is all a myth, don't you? It is one of those "geek myths" people keep on repeating to each other without really bothering to check
      • It's not a myth. A debatable point perhaps. As for checking the facts, here's some more. [dvorak-keyboard.com]
        • Hey, thanks for the link. I might have to change my tune again... :)

          • No Problem, the same thing happened to me :)

            My story went like this:

            1) Read that Dvorak was faster/better (told people).

            2) Read that it wasn't (told people).

            3) My wife tried Dvorak and said I should check it out. I told her it wasn't that great. I did some more research, found out it was (arguably) better and tried it.

            4) Now my wife and I both use Dvorak exclusively.

            All speed arguments aside, the main reason I changed my mind was that I found it to be more comfortable.
        • I have a simple question for you then. Why hasn't the Dvorak Simplified Keyboard penetrated the commercial market?

          If you truly could increase speed by 20%-40%, then you could reduce your support staff by the same. Large corporations are not stupid; spending a month on training in exchange for the ability to get rid of almost half your typing pool would have been done all over the place. Yet, during the post-war boom, there was no significant increase in DSK use. Nor were there mass layoffs of unneeded staff.

          Is it

          • What the hell are you on about? Who said anything about Dvorak being suppressed?

            As for commercial market penetration: who knows? There's plenty of arguably superior products that have gone the way of the Dodo.

            As for corporate adoption: where do you get the idea that, if the speed increases are true, increased typing speed translates to a commensurate increase in overall productivity; or, that the initial resistance to adoption of equipment is the same as changing a basic, and arguably perfectly servicea
            • What the hell are you on about?

              Well, that's what I mean. A lot of Dvorak users seem to be worse than Mac users. (Yes, I know that it's impossible to be more fanatical than a Mac user. Bear with me.) It's almost as if I've insulted their faith.

              As for corporate adoption: where do you get the idea that, if the speed increases are true, increased typing speed translates to a commensurate increase in overall productivity;

              You've thrown me for a loop there. Isn't the point behind Dvorak increasing product

              • Well, that's what I mean. A lot of Dvorak users seem to be worse than Mac users. (Yes, I know that it's impossible to be more fanatical than a Mac user. Bear with me.) It's almost as if I've insulted their faith.

                Agreed. Although I'll admit to being an... aficionado, zealots are just annoying.

                You've thrown me for a loop there. Isn't the point behind Dvorak increasing productivity?

                I don't know if this is the focus of most Dvorak users. I've certainly heard it argued that Dvorak is faster, and I've b
      • ...at least one study indicates that placing commonly used keys far apart, as with the QWERTY, actually speeds typing, since you frequently alternate hands.

        But I don't frequently alternate hands with my QWERTY. In fact, if you pay attention to what your hands are doing, it seems that your left hand does quite a bit more than your right. I kind of assumed that Dvorak fixed this by adjusting the balance.
  • so I can finally squeeze those tits on p0rn sites !! this is very noble I will cry a river.
  • So instead of moving my mouse to the volume bar and dragging it, then moving it up to a menu and scrolling down the menu, all with the same motions and buttons ... I have to lift something up, move it across my desk, and manipulate it in different ways depending on what I'm doing?

    This sounds like something they may have invented before the mouse. Maybe back in the day it was a bunch of blocks all over your desk that you had to move, then eventually they all got consolidated into one universal interface de

  • by G4from128k ( 686170 ) on Saturday November 08, 2003 @12:25PM (#7424304)
    I can understand why some people are appalled by tangible interface concepts. These are the same people that referred to GUIs as WIMPs (Windows, Icons, Menus, and Pointers). For some people, a command-line, keyboard-coded interface just works. But it is not the best interface for everyone or every application.

    1) Media creation: Who still creates CAD drawings with a keyboard only? I used some early versions of Autocad that were keyboard-only -- they sucked. Sometimes a tangible pointer with a 1-to-1 interface mapping between a 2-D surface and the screen is superior. For artists, the use of an LCD graphics pad and pressure-sensitive stylus means much higher productivity and finer control. (I've even seen academic research suggesting that a two-mouse interface could improve productivity.)

    2) Mapping to the real world: Go aboard an aircraft carrier and look at how they keep track of flight-deck operations. A miniature replica of the flight deck and miniature aircraft provide an intuitive 1-to-1 mapping between the model and the real world. I'd bet that they could improve flight-deck operations if those little aircraft moved automatically to reflect actual locations and if manual movements of the aircraft spawned automatic commands to flight-deck personnel.

    3) Multiuser interfaces: the demos of MIT's system that I have seen (a business-oriented supply chain visualization tool) leverage the table interface for multi-user applications. With the table, anyone around it can reach over and move a block. And everyone can easily see who moved the block.

    The power of tangible interfaces is that they can help create a more literal mapping between a digital artifact and the real-world. Sometimes less abstraction leads to better ease-of-use.
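    To make the "1-to-1 mapping" point above concrete, here is a small illustrative sketch (not from the article) of the difference between a mouse's relative mapping and a tablet- or table-style absolute mapping; the device and screen sizes are made-up example values.

    #include <stdio.h>

    typedef struct { double x, y; } Point;

    /* Relative (mouse-style): the device reports motion deltas and the cursor
     * accumulates them; where the device sits on the desk is irrelevant. */
    static Point map_relative(Point cursor, double dx, double dy, double gain)
    {
        cursor.x += gain * dx;
        cursor.y += gain * dy;
        return cursor;
    }

    /* Absolute (tablet/table-style): every spot on the surface always
     * corresponds to the same spot on the screen. */
    static Point map_absolute(Point device, double dev_w, double dev_h,
                              double scr_w, double scr_h)
    {
        Point p = { device.x / dev_w * scr_w, device.y / dev_h * scr_h };
        return p;
    }

    int main(void)
    {
        Point cursor = { 640, 512 };
        cursor = map_relative(cursor, 3.0, -2.0, 2.0);  /* a small mouse nudge */
        Point pen = map_absolute((Point){ 10.0, 5.0 }, 30.0, 20.0, 1280.0, 1024.0);
        printf("relative: (%.0f, %.0f)  absolute: (%.0f, %.0f)\n",
               cursor.x, cursor.y, pen.x, pen.y);
        return 0;
    }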
    • 1) Media creation: Who still creates CAD drawings with a keyboard only? I used some early versions of Autocad that were keyboard-only -- they sucked. Sometimes a tangible pointer with a 1-to-1 interface mapping between a 2-D surface and the screen is superior.
      I don't know.. when you are doing CAD, don't you need accuracy? Is it really easier to hit just the right pixel, or type (35,25)?

      DISCLAIMER: I don't do CAD, don't really know anything about it.

    • Who still creates CAD drawings with a keyboard only?

      Just about everyone. I know many many many engineers and have worked in many engineering offices and I have yet to see a digitizer in any of them. With 3D CAD these days there are a few 3D manipulators, but the mouse works just great with a scroll wheel etc. In CAD you don't use your pointer to place lines, you use smart snapping and parameters to define the drawing using exact values.

      • > > Who still creates CAD drawings with a keyboard only?

        Just about everyone. I know many many many engineers and have worked in many engineering offices and I have yet to see a digitizer in any of them. With 3D CAD these days there are a few 3D manipulators, but the mouse works just great with a scroll wheel etc.


        If you use the mouse, it's not keyboard-only. The mouse is a tangible manipulator that provides a good correspondence between X-Y motion of the hand and X-Y motion on the screen. The
  • by Doomdark ( 136619 ) on Saturday November 08, 2003 @12:26PM (#7424307) Homepage Journal
    So far it only lacks a device for text input, like a keyboard, but maybe voice recognition will replace it?"

    I'm certainly not the first poster to comment on this, but I just don't understand why so many assume voice input would be the preferred method, or that it'd even be better than physical controls (be it a keyboard, mouse, switches, joystick, whatever). There's enough aural noise in the environment even without adding more; there are accidental commands, specificity problems, technical issues... And except for niches where it does make sense (if one cannot use his/her hands or even legs), there just doesn't seem to be much to it beyond the 'coolness factor'. Just as you can get carpal tunnel syndrome, your throat can go sore, etc.; there are no health benefits; people can generally point/click/type faster than they can talk; GUIs are multi-dimensional (2 currently) while speech is generally single-dimensional, so there's one less way to distinguish the target (i.e. no location information)... and so on and on.

    Now as to Star Trek and other sci-fi movies (including Minority Report), isn't it fairly obvious why voice input was/is used? It's the easy way to indicate what a character is inputting, and what the results are! Even if it weren't for the futuristic touch, it's so much better for the needs of movies and TV series than, say, keyboard input (keyboard and mouse are only shown when realism is needed). Directors are in general experienced and smart professionals, and know that voice input is a very good solution for THEIR problems. Just as, even though there hasn't been any need to stay on the line for a trace to work for decades now, crime series still always imply there is, because that's a cheap (albeit unrealistic) trick to add some suspense to the plot. Just don't assume they are prophets showing what the future will be; they just show what works for them.

    • Now as to Star Trek and other sci-fi movies (including Minority Report), isn't it fairly obvious why voice input was/is used? It's the easy way to indicate what a character is inputting, and what the results are!

      At least four times an episode, some red shirt sitting at his little desk on the bridge thinks to himself, "Goddammit, Kirk, use your damn keyboard already."

  • At a glance, this sounds very much like the underlying interface stuff behind the Audiopad project (also from MIT, IIRC) - smart pucks moved around a projected image on the sensing surface. There were a few pucks which controlled various musical loops and one which acted as the microphone (the closer a loop was to the mic puck, the louder it played).

    Not that I'm putting anything down - my guess is they're now making more general use of the stuff they'd developed for Audiopad, or Audiopad was just the first a
  • by Anonymous Coward
    i will really be touching those hot cybergirls? ^^
  • I really want an intangible interface for computers! Holographics everywhere responding to my voice and movements. A virtual symphony of light, color, and sound as I dance gracefully throughout the room, twirling in a ballet of control. Fucking MS Office 2012 sucks now. And that was just to make the text bold...
  • Hey,

    Slashdot managed to take down an MIT web server! (the Media Lab's) That's not too bad for a Saturday.

    (the QuickTime movie was embedded in the page, all 5MB of it...)

    f
  • *yawn* wow, you mean i can control stuff by moving my hands, and making gestures? and all i need is some holographic projection mechanism, a darkened room, and tons of space?

    this is nothing that hasn't been done with touchscreens. this just takes wear and tear out of the equation.
  • The concept makes me think of the controls the Asgard have on their ships. Placing/moving the "stones" can have all sorts of different effects.
  • Neato (Score:1, Funny)

    by Jesus 2.0 ( 701858 )

    This project aims at conceiving better human-machine interfaces by using the concept of physical objects that the user can manipulate, to represent abstract computer data and commands.

    You mean they're going to invent the mouse and the keyboard? Awesome.

    • I think it's the other way around: instead of bringing an avatar of your hand into the abstract representation in the computer's screen (mouse pointer), it's the computer's data that will have avatars in your world.
  • by Anonymous Coward
    I am typing this on a Touchstream keyboard (by Fingerworks [fingerworks.com])--essentially a glide-pad, like on your average laptop, but keyboard-sized and with letters printed on. It's very interesting to use, and I've concluded that I'll stick to it ... bit weird to have a keyboard without any keys at first :) but you get used to it.

    It's definitely very cool to move the text cursor around, directly linked to the movement of your left index + middle finger (seemingly), and to be able to cut/paste by "picking" text with thum

  • by spellcheckur ( 253528 ) on Saturday November 08, 2003 @01:16PM (#7424517)
    It's not surprising this looks like Minority Report [imdb.com].

    John Underkoffler [mit.edu] is a former member of the Tangible Media Group, and was the science advisor [imdb.com] on the film.

  • by adrianbaugh ( 696007 ) on Saturday November 08, 2003 @01:28PM (#7424556) Homepage Journal
    Just look what they did to emacs :-O
    Seriously, while this probably has niche applications (previous posters have mentioned a few that sound plausible) I don't see that it offers much to the conventional desktop user (a keyboard and mouse require much less movement than the shenanigans Tom Cruise got up to in the movie and, other than keeping office workers fit, these interfaces will just lower productivity).
    So what about wearable computers? Something you wear on your belt with a head-mounted display, designed to be used while walking along? Well, to me it doesn't make much sense in this context either: again, if you end up requiring much odd movement on the user's part it won't work. In my opinion the future is far more likely to look like a next generation of Canon's eye-controlled (pupil-tracking) autofocus system, used to control a pointer on a head-mounted display, coupled with either (in the short term) an interface that minimizes the need for text input together with some kind of finger-based character input device[0], or (longer term) speech recognition of a standard where the software doesn't need training and can cope with background noise[1].

    [0] There was one mentioned on slashdot ages ago that looked a bit like a gripmaster (key for each finger plus the thumb), and text was typed by entering chords.
    [1] Incidentally, how much research has been done on using stereo input to speech recognition programs to reduce background noise? I would have thought that would help quite a lot, albeit at the expense of CPU time.
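    To sketch the chording idea in [0]: each key is one bit, a chord is the set of keys held down together, and a lookup table maps each chord to a character. The particular chord assignments below are invented for illustration and have nothing to do with the actual device's layout.

    #include <stdio.h>

    /* one bit per key: thumb plus the four fingers of one hand */
    enum { THUMB = 1, INDEX = 2, MIDDLE = 4, RING = 8, PINKY = 16 };

    /* made-up chord table, indexed by the bitmask of keys pressed together */
    static char chord_to_char(unsigned chord)
    {
        switch (chord) {
        case INDEX:          return 'e';
        case MIDDLE:         return 't';
        case INDEX | MIDDLE: return 'a';
        case THUMB | INDEX:  return 'o';
        case THUMB | MIDDLE: return 'n';
        case RING:           return 'i';
        case THUMB:          return ' ';
        default:             return '?';   /* unassigned chord */
        }
    }

    int main(void)
    {
        unsigned word[] = { MIDDLE, INDEX | MIDDLE, THUMB | INDEX, THUMB };
        for (unsigned i = 0; i < sizeof word / sizeof word[0]; i++)
            putchar(chord_to_char(word[i]));
        putchar('\n');   /* prints "tao " followed by a newline */
        return 0;
    }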

  • Throughout time, there has been one hallmark of the existing user interface - despite the absence of any real tactile interface (save the keyboard), it's efficient. It seems like some of these interfaces strip away this efficiency and replace it with flexibility. This isn't a bad thing, but it does lead us to the point that for any given task, we'll need to decide if a given interface will provide the results we need.

    I have to say that out of all the examples included on the MIT web site, the one I see wit
  • Remember the Italian Restaurant that was really a time-warp spaceship? Arguing with the waiter over the bill caused changes in the space-time continuum, driving the ship along.
  • With a tangible interface comes a new essential peripheral: cleaning fluid. As an alternative: disposable rubber gloves.
  • Ok, there was something similar in a recent Batman comic book I read. Basically, he wore these little LCD projector things over his eyes, and had a pair of Minority Report gauntlet things, and the information he needed was superimposed in front of him: monitors, virtual keyboard, everything you need floating right in front of you. He was able to type in the air, as if it were actually there. Imagine being able to increase or decrease the transparency of the controls, and you've got something I would give a lot to play
  • Apparently Zowie Interactive made a similar toy. It was a pirate ship with a serial cable, and moving the pieces around would make your computer respond. The product disappeared without a trace; very little is on the net, and eBay has nothing.

    Also apparently, the company was bought by LEGO, so there may be hope.

    These guys [tudelft.nl] have all available info, including a link to the above MIT paper.
  • Just like that.
    Yes, it's cool to watch. But no:
    1. You have to learn it. Those round thingies are used for this, star-like thingies for that.
    2. The implementation shown is not logical. You either project the supplemental interface from the top, in which case any hand movement distorts it, or from the bottom, where hand movement and the physical interface objects distort/shadow it. Simple solution?
    Make the object virtual.

    Oh, wait. Isn't that the type of interface those Hollywood designers made for Voyager, where you click, slide, a
  • While this may sound like it's really high-tech, these kinds of user interfaces have been part of science fiction movies and television shows for a long time. After all, you don't need a lot of props or complicated screen simulations to make it look good.
