Google's Project Tango Seeks To Map a 3D World

Nerval's Lobster writes "Google's Advanced Technology and Projects Group is working on a new initiative, Project Tango, which could allow developers to quickly map objects and interiors in 3D. At the heart of Project Tango is a prototype smartphone with a 5-inch screen, packed with hardware and software optimized to take 3D measurements of the surrounding environment. The associated development APIs can feed tons of positioning and orientation data to Android applications written in Java, C/C++, and the Unity Game Engine. In addition to a 'standard' 4-megapixel camera, the device features a motion-tracking camera and an aperture for integrated depth sensing; integrated into the circuitry are two computer-vision processors. Google claims it only has 200 developer units in stock, and it's willing to give them to independent developers who can submit a detailed idea for a project involving 3D mapping of some sort. The deadline for unit distribution is March 14, 2014. In theory, developers could use ultra-portable 3D mapping to create better maps, visualizations, and games. ('What if you could search for a product and see where the exact shelf is located in a super-store?' Google's Website asks at one point.) The bigger question is what Google intends to do with the technology if it proves effective. Google Maps with super-detailed interiors, anyone?"
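The "tons of positioning and orientation data" the summary mentions can be pictured with a toy dead-reckoning loop. The sketch below is purely illustrative: Tango's actual Java/C++ APIs are not shown here, and the (turn, distance) increments simply stand in for whatever the motion-tracking camera would report.

```python
import math

def integrate_pose(start, increments):
    """Accumulate (turn, distance) increments into an (x, y, heading) pose.

    A toy stand-in for the kind of motion-tracking data a device like
    Tango streams to applications; the real API surface is not shown.
    """
    x, y, heading = start
    for turn, dist in increments:
        heading += turn                    # rotate first
        x += dist * math.cos(heading)      # then translate along the heading
        y += dist * math.sin(heading)
    return x, y, heading

# Drive 1 m, turn 90 degrees left, drive 1 m: ends near (1, 1).
pose = integrate_pose((0.0, 0.0, 0.0), [(0.0, 1.0), (math.pi / 2, 1.0)])
```

Accumulating small relative motions like this is exactly why drift matters: any error in a single increment is carried forward into every later pose.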
  • Robotics (Score:5, Insightful)

    by Andrio ( 2580551 ) on Friday February 21, 2014 @12:49PM (#46304513)

    Everything Google is doing now is for their upcoming robotics division. This is how their robots will see and map the environment.
    Man, I wish my Roomba had this ability, rather than just randomly moving and adjusting direction based on what it bumped into.

    I'm really excited for the new robots industry :)

    • by Anonymous Coward

      A sensor like this could be *very* cool for their self driving cars too... Possibly radically bringing down the cost...

    • Now all we need are some anti-gravity thrusters and we have the mapping pups of Prometheus.

      Come on, Google, you've done the hard part by taking over the planet; now let's get working on gravity. (Or is Apple supposed to do that after they get the watch thingy to work?)

      • Apple can't do it. They're busy with their immortality project. What? You didn't think Steve was really dead did you?
    • Re:Robotics (Score:4, Insightful)

      by timeOday ( 582209 ) on Friday February 21, 2014 @01:16PM (#46304683)
      I think you are right that their interest is in robotics. But it may also wind up being a boon to 3D printing. After all, a copy machine needs a scanner as well as a printer.
    • Yup, what AI needs most right now is a way to digitize its environment into the kind of 3D space you see in video games. Even if you built the technology and no robots, you could map out the world and have video games that you play across the whole Earth :) Cannonball Run USA, anyone? My AI blog says the same thing: we need 3D digitization.
    • by MobyDisk ( 75490 )

      I wish my Roomba had this ability

      Some of the competitors' products do; I forget which one(s). There was a neat YouTube video of some company's robot that would decide when it was done with a room and not go back into it, just by mapping out where the doors were as it went along.

    • Think bigger. Pair this with some Google Glass/Oculus Rift hybrid (i.e., what that tech could become in the next decade or two) and some advanced augmented-reality software. If you can map your world accurately, then you can project what you want on top of it accurately.

      Anything you use but don't usually touch no longer has to actually exist -- it can be projected into reality. No TV. No screens of any sort really. Might be nice to have books, but you can just have one full of blank pages. Don't need artwork o

    • You can already get a SLAM enabled vacuum robot.

      I've had a Neato for 3 years and it makes Roomba look like a toy.
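SLAM itself won't fit in a comment, but the "knows when a room is done" behavior described above boils down to comparing the cells the robot has visited against the cells its map says are reachable. A minimal sketch follows; the grid format and flood fill are illustrative, not any vendor's actual algorithm.

```python
from collections import deque

def reachable_cells(grid, start):
    """Flood-fill the free cells ('.') reachable from start; '#' is a wall."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == '.' and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

def room_done(grid, start, visited):
    """The room is finished once every reachable free cell has been cleaned."""
    return reachable_cells(grid, start) <= set(visited)

# Toy one-room map: two free cells at (1, 1) and (1, 2).
room = ["###",
        "#..",
        "###"]
```

A bump-and-turn robot can never answer `room_done`; a mapping robot answers it with one set comparison, which is the practical difference the Neato comment is pointing at.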

  • Project: Invasion of Privacy

    And you thought you had any left.

  • Build this functionality into a small quadcopter and you could use it to scout an unknown environment. Map an unknown environment in 3D before sending in personnel.
    • by Anonymous Coward

      Been done already several times with every sensor known to man.

      • by bigpat ( 158134 )

        Yes, but this appears to just use a pair of cameras on the back of a regular phone. Most accurate 3D mapping for robotics is being done with expensive and more energy-intensive (battery-draining) LIDARs. Cameras are cheap and don't use a lot of power, so using cameras could drive down the cost of mapping 3D environments, or of mapping an object for 3D printing, if it is something that gets included in new phones. Again, sure, it has been done, but not in something that could just be added as a feature to everyone's phone.
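The cheap-camera point rests on simple geometry: for a calibrated, rectified stereo pair, depth falls straight out of the disparity between the two images via Z = f·B/d. A sketch, where the focal length, baseline, and pixel disparities are made-up phone-scale numbers for illustration:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by a rectified stereo camera pair.

    focal_px: focal length in pixels; baseline_m: distance between the
    two cameras in metres; disparity_px: horizontal shift of the point
    between the left and right images. Z = f * B / d.
    """
    if disparity_px <= 0:
        raise ValueError("point at infinity or mismatched correspondence")
    return focal_px * baseline_m / disparity_px

# 700 px focal length and an 8 cm baseline (illustrative values):
near = depth_from_disparity(700, 0.08, 56)   # 56 px disparity -> 1 m away
far = depth_from_disparity(700, 0.08, 28)    # half the disparity -> twice as far
```

The inverse relationship is also why stereo depth gets noisy at range: distant points produce sub-pixel disparities, which is one reason LIDAR still wins for long-range accuracy.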

        • Yes, but this appears to just use a pair of cameras on the back of a regular phone.

          Parent is not wrong. It has been done with every sensor known to man, including stereo pairs and structured-light projection.

          • by bigpat ( 158134 )

            No, not wrong. What has been done before is 3D mapping using cameras, and even putting those cameras on a quad-rotor. And yes, the first post about building the functionality into a quad-rotor was also a bit ignorant of recent work that has put cameras onto quad-rotors to do 3D mapping, but the underlying enthusiasm about the new capability being provided is still valid if you consider it a bit less narrowly.

            If you could just strap a smartphone that had a 3D camera onto a quad-rotor or ground robot...

  • not for me (Score:4, Funny)

    by clovis ( 4684 ) on Friday February 21, 2014 @01:16PM (#46304681)

    I'm from Flatland, you insensitive clod!

  • Will it help my wife find her glasses?

  • by Anonymous Coward

    Scan my feet/hands/body and sell me some shoes/gloves/clothes that fit.

  • Here's to hoping Google can force them to change their name again.
  • by Anonymous Coward

    Stealing a comment from Reddit:

    Warning: this video is not what it appears. To all of you saying "WHY THE HELL WOULD I WANT A 3D ENVIRONMENT ON MY PHONE?? TO PLAY GAMESS??", you are not the target consumer.

    To you, the average consumer, it may seem like a neat new project with some cool implications, like indoor navigation ("I'm inside the mall; how do I get to Macy's?" or "I am at a football game; where is the nearest hot dog stand?").

    Now think about what Google is: Google is evolving far past an advertising company and into a "big data" provider.

    Take it a step further. Combine this tech with Google Glass (or some other wearable peripheral we may not know about), and the possibilities expand ("Where does the user usually look?" "What is the optimal location for this new billboard? Based on Google Glass 3D mapping data, drivers usually look to the northwest when traveling down Route 33.").

    Take it even further to Google's long-known project to catalogue everything on the planet. This will expand into their goal to be able to "Google" real-life stuff. Like a real life Control + F. You lost your keys? You don't remember where you put them, but your phone combined with your Google Glass remembers exactly where they are. Even if it doesn't remember, perhaps you can swivel your head around the room, scanning it with a camera until it alerts you that you are looking directly at an object that looks just like a pair of keys.

    Now expand that further; big data. Have you been looking at sweaters a lot lately? Tango knows you've been shopping in department stores when you go to the mall. They can feed this data to advertisers, learn your color preferences, learn everything about you and be able to direct you to products.

    Google can learn the shopping habits - of EVERY PERSON IN THE WORLD - and relay that information to marketers: tell them the best places to arrange their products in brick-and-mortar stores, and what items to put on sale and when.

    Google will be able to provide sales data BEFORE the sale is even MADE. Perhaps in early September people start shopping for winter clothes in New York City, but in August they were googling a new jacket, maybe looked around a leather store and looked at some jackets, etc.

    This has huge market potential when combined with all of Google's other products. Remember, this project isn't just a side project; this is the result of huge acquisitions and a scientific approach to recruiting and retaining top talent (we are talking salaries in excess of $1 million).

    Google is moving to be the top "information manufacturer" in this new information age. Amazing! Wish I was a part of it.

    • Google can learn the shopping habits - of EVERY PERSON IN THE WORLD - and relay that information to marketers.

      The buzzword is "Comprehensive In-Store Analytics".

    • I've been waiting for tech like this to combine with Google Glass as well -- but to do the exact reverse of what that quote suggests.

      Instead of using Glass to scan reality into a digital model, *use it to project a digital model into reality*. This will allow MUCH better augmented reality than we currently have. Perhaps to the point where you can change the color of your bed sheets with the press of a button. If you could achieve that, there's a whole lot of manual labor that suddenly becomes pure information...

  • Integrate this project with Google Glass. Or have they already?
  • by Anonymous Coward
    If you send that to a 3D printer, we'll have TWO Earths! THAT's how powerful 3D printing is.
  • There are two potentially huge markets. I, for one, would like to be able to take a few (360-degree) photos of my house and have SketchUp (formerly owned by Google) deliver a 3D version that prospective buyers could "walk around in" via their browsers. Similarly, construction workers spend a lot of effort making site measurements to create estimates, order materials, etc. If that could be automatically produced via 3D renderings, all the better.
  • Jack Bauer and his pals already have 3D maps and schematics of every power plant, office building, warehouse, outhouse and chicken shack. Not to mention full control of the power, network and hot and cold water taps in each of them. And all in the time it takes Chloe to recalibrate the beam forming firewall protocols against the binary-coded output logs. Or something.
