
Google Works On Kinect-Like Interface For Android 49

Posted by Soulskill
from the operating-phones-becomes-more-like-voodoo dept.
bizwriter writes "A patent filing made public last week suggests that Google may be trying to implement a motion-detection interface, like Microsoft Kinect, for portable electronic gadgets. The patent application is for technology that turns a mobile device's camera into a motion-input system. In other words, it could be goodbye to fingerprints and streaks on the front of your tablet or smartphone. Google could incorporate such a feature into Android in general or keep it as a differentiating advantage for its acquisition of Motorola."
This discussion has been archived. No new comments can be posted.


  • More sales (Score:5, Funny)

    by agentgonzo (1026204) on Friday March 09, 2012 @11:23AM (#39300945)
    I get it. They're following the Wii model:

    Rapid gesticulation to control device
    Accidentally throw device across room
    Have to buy new device to replace broken one
    Profit!
    • by cpu6502 (1960974)

      I LOLed. :-)

      Except the throwing the device across the room usually isn't "accidental". ;-) The Wii control frustrates me to no end, especially when trying to play a rapid-paced game like Metroid Prime 3 or Sonic. I wish I could go back to using the GameCube controller because it is more precise & registers my inputs 99.999% of the time. (The Wii control is more like 90% of the time, which is lousy.)

      • Re: (Score:2, Funny)

        by Anonymous Coward

        You just know WP7 gesture control is going to involve throwing chairs across the room.

    • by AmiMoJo (196126)

      I recently saw some people trying Samsung TVs with gesture control. Flailing your arms about while seated in a deep couch is hard, you look like an idiot and the person next to you gets punched in the face every time you change channel.

      The phone version sounds ideal for use on public transport.

  • So... what do you do? Set your phone on the table while you dance in front of it to send a text message? It's cool on the Kinect, but seems weird on Android. Also, shouldn't they be putting more energy into changing the name of the Google Play Store back to Android Market?
    • by DrgnDancer (137700) on Friday March 09, 2012 @11:51AM (#39301233) Homepage

      I suspect more like: "wave hand in front of phone instead of swiping". You'll probably still have to type on the screen or get a hardware keyboard, but this could free up some of the constant tapping and swiping across your viewing surface that you need to do for even the grossest control movement on a phone or tablet. Between this and improved dictation you could remove most of the need to touch the screen, but you're not going to completely eliminate it without either a physical keyboard or someone coming up with a completely new paradigm.

      • by cpu6502 (1960974)

        I agree. 10 hours a day of waving my hand in front of my workscreen or phone would exhaust me. I am fundamentally lazy and prefer to make as little movement as possible (i.e. use a mouse). I became an engineer because I wanted to find easier, simpler ways of doing things (less work), and waving my arms around like Tom Cruise in Minority Report is not easier. It looks very tiring.

        I've also noticed in Star Trek TNG or DS9 whenever they want to do real work, they put down the PADD and transfer the screen to

    • So... what do you do? Set your phone on the table while you dance in front of it to send a text message? It's cool on the Kinect, but seems weird on Android. Also, shouldn't they be putting more energy into changing the name of the Google Play Store back to Android Market?

      Now come on. Clearly we'll all have to learn Sign Language in order to communicate with our phones, which is going to be tough to do while holding the phone if the sign requires two hands (unless, of course, your name is Zaphod).

  • This is something I can't get my head around.

    People, me included, spend hundreds of pounds on a phone and then worry about it getting damaged, forgetting the fact that it'll be replaced for free when they renew their contract two years down the line. People, not me, spend money on ugly cases that turn their phone into something that no longer looks like an expensive phone. Hundreds of pounds for engineered hard plastic, metal and Gorilla Glass, for what?

    Me? I keep my S2 in my jeans pocket, without coins and
    • Re:Fingerprints (Score:4, Informative)

      by hobarrera (2008506) on Friday March 09, 2012 @11:38AM (#39301091) Homepage

      It's not everywhere that your phone is free. I purchased a Nokia N900 (it's from 2009), used, for half a month's salary here. And I'm pretty much middle class as well. So I'd rather take care of it, since buying another in two years is out of the question.

    • Agree. I've never bought a protective case for my phone. Kind of voids the whole point of buying a 'sexy' looking phone!
    • by geekoid (135745)

      I got my Nexus S from Best Buy.
      For the same price as a screen and case I got a replacement plan. So if it becomes damaged, I can replace it. This means I don't worry as much and just keep it in my back pocket.

      As an added bonus: I paid 99 dollars for the phone, and if I take it back to Best Buy I get a 119 dollar gift card.

      I do occasionally buy custom back pieces. My last one had Dr. Venture on it.

  • Great - so now I have to memorise the Riverdance choreography to unlock my phone for use?

    I am predicting strange looks from my coworkers.

    • by hairyfeet (841228)
      At least it'll be easy to get to the customer service line of your carrier, you'll just give the phone the finger.
  • by Lord Grey (463613) on Friday March 09, 2012 @11:38AM (#39301101)

    From Claim 1 of the patent filing:

    A method of controlling a portable electronic device including an image capturing device, the method comprising: detecting, via the image capturing device, motions of an object over the image capturing device; determining a type of the detected motions using timing information related to the detected motions, the timing information comprising duration of at least one of the detected motions; and controlling the portable electronic device based on the determined motion type.

    Claim 2 then says:

    The method of claim 1, wherein the type of the detected motions comprises single tapping, double tapping, hovering, holding and swiping.

    Then there is a lot of refinement, talking about edge detection, direction of movement, the usual definition of a computing device with memory, and finally kicking off predetermined actions based on recognized motions.

    But look at Claim 2: "... comprises single tapping, double tapping, hovering, holding and swiping." To me, this patent seems to be a simple extrapolation of the gestures Apple made popular with their mobile UI, with the addition of "hovering" (assuming I understand the definition of that word, here). Same gestures, different input control.

    Is there a significant difference between, say, swiping across a phone's screen and making the same gesture a few inches away? (I'm thinking that if the device interpreted motions from a larger distance then the only thing that will reliably happen is a series of hilarious DoS attacks via interpretive dance.)

    • Also sounds like the stuff from eyeSight. I used this a few years ago (on Symbian): http://www.eyesight-tech.com/technology/ [eyesight-tech.com]
    • Is there a significant difference between, say, swiping across a phone's screen and making the same gesture a few inches away?

      Maybe, maybe not, but this probably isn't about even "a few inches away". Look at claim 3: "The method of claim 1, wherein detecting motions comprises: receiving images from the image capturing device, each of the received images is associated with a motion of the object; determining an illumination level for each of the received images; comparing each of the determined illumination
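
  • The technique the claims describe can be sketched roughly: Claim 3 compares per-frame illumination levels, and Claim 1 uses the timing of those changes to pick a motion type. A minimal illustration in Python (every name and threshold here is an assumption for the sketch, not anything from the actual filing):

    ```python
    # Hypothetical sketch of Claims 1-3: a hand passing over the camera
    # darkens the frames, and the duration/count of dark intervals
    # (the claim's "timing information") distinguishes motion types.

    def mean_illumination(frame):
        """Average pixel brightness of a grayscale frame (list of rows)."""
        pixels = [p for row in frame for p in row]
        return sum(pixels) / len(pixels)

    def classify_gesture(illum_levels, fps=30, dark_threshold=50):
        """Map a sequence of per-frame illumination levels to a motion type.

        Thresholds are illustrative: a short dark run is a tap, a long one
        a hold, and two separate dark runs a double tap.
        """
        dark_runs, run = [], 0
        for level in illum_levels:
            if level < dark_threshold:
                run += 1
            elif run:
                dark_runs.append(run)
                run = 0
        if run:
            dark_runs.append(run)

        if not dark_runs:
            return "none"
        if len(dark_runs) >= 2:
            return "double_tap"
        duration = dark_runs[0] / fps  # seconds the lens was covered
        return "tap" if duration < 0.3 else "hold"
    ```

    Hovering and swiping would need more than a global brightness average (e.g. which edge of the frame darkens first, per the claims' edge-detection refinements), but the tap/hold/double-tap cases really are just timing.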

  • I mean, I knew they were working on Gmail Motion [google.com], but I thought they dropped that idea when it stopped being April 1st.

  • Wasn't this an April Fools prank of theirs a while back?

  • I see it now.

    I, Phone.

    The first rule of Phone-botics. Never ask for any gestures from a human to unlock the phone that could hurt a human.

  • So basically they patent a decent motion detection algorithm, and they can do so just because it's used for controlling a portable electronic device including an image capturing device. How innovative. Don't get me wrong, I'm not against Google on this, such a feature could be great. But now everyone who was thinking of doing anything on Android that would use visual motion information (which, surprise, comes from images through the camera) can go find something else to do. Well, business as usual.
  • This is not the first such UI, there's already an existing gesture UI for Nokia's N9 phone. Relatively simple and experimental, but still.
    http://store.ovi.com/content/214364 [ovi.com]
    I have not tried it myself.

    Full disclosure: I work for Nokia, even though I've not had anything to do with this particular software.

  • As with voice, giving casual input to the detector that could be interpreted and acted on, even if not meant for the phone, is a potential danger.
  • by SalsaDoom (14830) on Friday March 09, 2012 @02:45PM (#39303455) Journal

    Because otherwise, Smartphones and Tablets really are getting out of control.

    I thought -- and still -- think it's stupid to have a speech interface for a phone. I mean you look stupid talking to a robot woman on your phone. The last thing I want to do is start dancing in front of my phone.

    "Give me a sec, I have to do the shuffle to unlock my phone, and then the achy-breaky to open my email."

    I'm actually pretty happy with smartphone interfaces these days, just the way things are.

    • by x1r8a3k (1170111)
      Voice interfaces have their place. For example, I can use speech recognition on my phone to say "Navigate to 123 Main St City State" much faster than I can go to the nav app, tap the search button, and type it in.

      However, you don't have to use it. You can still do it all through taps and typing. I'm sure this motion interface will be completely optional too, so just ignore it.
  • by Nyder (754090) on Friday March 09, 2012 @04:33PM (#39305109) Journal

    Poor cellphones.

    At first, they were held like normal phones, up to your ears.

    Then came text messaging, and everyone was typing on a keypad.

    Then came little keyboards, so we weren't using the number pads.

    Then we got rid of the keyboards and used the screen for typing.

    Now they want us to hold the phone in front of us with one hand, while waving the other (like a magician or something) in front of the phone.

    Wtf happened to using a phone like a fucking phone?

    Though I have to give props to Google here, chasing a patent before everyone else, since this is the next step.

    But what happened to voice control? It's a phone, we talk into it, why not actually control it with our voice? Shit, I better get that idea patented...
