
Why the Air Force Wants To Put Lidar On Robot Dogs (popsci.com)

An anonymous reader quotes a report from Popular Science: "Imagine being able to see the components of a potentially dangerous situation in live 3D and in fine detail without even having to survey the area," says Brian Goddin of Air Force Installation and Mission Support Center public affairs, in a video produced by the military. [...] Putting lidar on drones and on ground robots gives the military a way to map the interior of a building with a machine. With that lidar data transmitted to the computers in a command center, or even just to the tablet of an operator sitting outside the building, a human can see what the robot sees and direct the robot accordingly. (In the civilian world, lidar sensors are commonly used on self-driving cars as one tool for perceiving the world around them.) Goddin's presentation, released online December 9, 2021, shows lidar mounted on Spot, the Boston Dynamics dog-shaped robot. Ghost Robotics Q-UGV machines, also dog-shaped and sensor-rich, have been used to patrol the perimeter of Tyndall AFB, making Spot the second breed (or brand) of robot dog to serve the needs of the base.
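What the video describes is, at its core, a small geometric pipeline: raw lidar returns are converted to 3D points, placed into a shared map frame using the robot's pose, and downsampled into something compact enough to stream to an operator's tablet. Here is a minimal, illustrative Python sketch of that pipeline; the spherical-coordinate convention, the 10 cm voxel size, and the example pose are assumptions for illustration, not details of the Air Force or Boston Dynamics systems.

```python
import numpy as np

def scan_to_points(ranges, azimuths, elevations):
    """Convert raw lidar returns (range, azimuth, elevation) into XYZ points
    in the sensor frame. Angles are in radians, ranges in meters."""
    x = ranges * np.cos(elevations) * np.cos(azimuths)
    y = ranges * np.cos(elevations) * np.sin(azimuths)
    z = ranges * np.sin(elevations)
    return np.stack([x, y, z], axis=-1)

def transform_to_map(points, rotation, translation):
    """Place sensor-frame points into a shared map frame using the robot's
    pose estimate (3x3 rotation matrix, 3-vector translation)."""
    return points @ rotation.T + translation

def voxelize(points, voxel_size=0.10):
    """Downsample the accumulated cloud to one point per occupied voxel so
    the map stays small enough to transmit and render remotely."""
    keys = np.unique(np.floor(points / voxel_size).astype(np.int64), axis=0)
    return (keys + 0.5) * voxel_size  # voxel centers

# Hypothetical single scan: 360 beams in a horizontal ring, all returning 5 m.
azimuths = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
points = scan_to_points(np.full(360, 5.0), azimuths, np.zeros(360))
mapped = transform_to_map(points, np.eye(3), np.array([10.0, 2.0, 0.5]))
print(voxelize(mapped).shape)  # a few hundred occupied voxels
```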

All of this mapping at Tyndall is happening in the wake of Hurricane Michael, and creating a virtual 3D model of the buildings as they stand can guide future repair. Such a virtual model is a useful tool for regular maintenance and repair, and it provides a record of a prior state should disaster strike again. The same techniques could also allow better investigations of failure after the fact. By comparing lidar scans of downed or wrecked craft to scans taken before launch, and to surviving aircraft that made it back from a fight, the Air Force could learn how to build more durable craft. Scanning a wrecked plane with lidar also lets rescue workers and recovery teams know if and how they should act to save pilots and passengers, suggested Javier Rodriguez, a technician stationed at Tyndall.
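The before-and-after idea in that paragraph is essentially change detection between two registered point clouds. Below is a minimal sketch, assuming both scans have already been aligned into the same coordinate frame (in practice that registration step, usually something like ICP, is the hard part); the 5 cm threshold and the synthetic "dented roof panel" data are invented for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def changed_points(before, after, threshold=0.05):
    """Return points in the post-event scan that lie farther than `threshold`
    meters from every point in the pre-event scan: candidate damage regions."""
    distances, _ = cKDTree(before).query(after, k=1)
    return after[distances > threshold]

# Synthetic example: a flat 10 m x 10 m panel, with one section displaced
# 30 cm downward in the "after" scan.
before = np.random.rand(5000, 3) * np.array([10.0, 10.0, 0.01])
after = before.copy()
after[:500, 2] -= 0.3  # the dented section
print(len(changed_points(before, after)))  # roughly 500 flagged points
```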


Comments:
  • by frank_adrian314159 ( 469671 ) on Tuesday December 28, 2021 @08:08AM (#62121583) Homepage

    You put frickin' lasers on their heads. Then you'd have something. Add machine guns in their chests, too. All the toys a young gamer could want.

  • The largest military budget in the history of mankind. More spending than the next ten top nations combined. Nearly 77% of $1 trillion a year.
    • It's OK, they have lots of born-again (fake) Christians who can pray for the poor people who die because they can't afford the for-profit healthcare that could have been paid for by much of the money spent on figuring out how to kill healthy people.

    • That is just the official budget. There are many other spending bills that overlap with the military and should be included in such totals. I wouldn't be surprised if it were closer to $1 trillion per year... plus, within 10 years (barring imminent collapse), the budget will hit $1 trillion per year.

      We should be presenting it in 10-year spans, so it's at least $7 trillion. Just about everything that is good is presented in 10-year spans. Biden's tiny Build Back Better is a 10-year span; divide it by 10 and

  • But I also favor a software approach to the problem over a hardware approach. IR cameras set up in parallax can extrapolate a 3D environment and pick up details that slip by LIDAR (a rough depth-from-disparity sketch follows the comments below). Pushing software fixes to make things work better is also much easier to deploy than upgrading the hardware in the fleet.

    LIDAR is fine for gross navigation and 3D mapping. But for close quarters, where there could be a wire, a spike, or any number of things dangerous to a robot or a person, that can fall between

    • But I also favor a software approach to the problem over a hardware approach.

      But everybody knows you just add more, bigger, and fancier hardware to simultaneously localize and map your surroundings; it's only natural. Name just one system with two shoddy cameras, with so many defects that they need to twitch a lot just to take in proper data, that can still know where it is and build a reasonable working map of a building or structure.

      • That sort of thing works more or less okay when you have a big complicated brain behind the cameras, but it still has lots of problems. It's prone not only to missing things, but also to seeing things which aren't even there, because so much interpolation has to be done and sometimes the system gets it wrong, especially when it's been active for too many consecutive hours. Then it's capable of all kinds of wacky misinterpretations of the sensor input.

        Tesla is busy trying to prove your premise, but I d

        • That sort of thing works more or less okay when you have a big complicated brain behind the cameras, but it still has lots of problems. It's prone not only to missing things, but also to seeing things which aren't even there, because so much interpolation has to be done and sometimes the system gets it wrong, especially when it's been active for too many consecutive hours. Then it's capable of all kinds of wacky misinterpretations of the sensor input.

          It works great when you have a simplistic and minimalistic brain behind the sensors that is well adapted to those inputs, too. Hell, with a good enough system processing the data you don't even need a conventional brain, as with slime molds or single-cell organisms. With all those sensors comes massive data overflow, with tons of superficial information that takes substantial processing power to weed through. And it still doesn't solve the issue of what is regarded as “common sens

          • It works great when you have a simplistic and minimalistic brain behind the sensors that is well adapted to those inputs, too.

            As long as you only want to do simplistic and minimalistic things, sure.

            • As long as you only want to do simplistic and minimalistic things, sure.

              In nature, SLAM and reliable object avoidance are simplistic and minimal.

  • Is there anybody who doesn't want to put LIDAR on robot dogs?
    • "Is there anybody who doesn't want to put LIDAR on robot dogs?"

      Exactly! Like robot sharks with lasers, it's obvious!

  • That seems too big for some situations; will they make a highly miniaturized version too?

  • I thought ultrasound would be mandatory for dogs, otherwise they can't hear the dog-whistle.
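On the comment above about IR cameras set up in parallax: recovering 3D from two cameras is standard stereo vision, where per-pixel depth follows from disparity as depth = focal_length x baseline / disparity. Below is a minimal sketch using OpenCV's block matcher; the file names, focal length, and baseline are placeholders, and a rectified grayscale image pair is assumed.

```python
import cv2
import numpy as np

# Hypothetical, already-rectified grayscale stereo pair.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Classic block-matching stereo; compute() returns fixed-point disparities
# scaled by 16, so divide to get disparity in pixels.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

focal_px = 700.0    # assumed focal length, in pixels
baseline_m = 0.12   # assumed distance between the two cameras, in meters

# Depth from disparity: z = f * B / d; invalid matches have disparity <= 0.
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]
```

The trade-off raised in the thread cuts both ways: stereo depth degrades where texture is poor and its error grows roughly with the square of distance, while a spinning lidar's returns are spaced about range times angular resolution apart, so at 10 m a 0.2 degree beam spacing leaves gaps of roughly 3-4 cm in which a thin wire can hide.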
