Technology

Search and Rescue Robots

An anonymous submitter sent in: "Interesting article on the New York Times [need an acct, blah] about the various robots currently in use to search the rubble of the WTC. Not a very technical article (as expected). Seems to be all telepresence stuff. No mention of any autonomous devices, which is as I'd expect. Too bad we don't have a few platoons of those Sony bipeds with firefighter/S&R programming to make that first response."
  • I don't think those Sony walking computers would be very well suited for the rescue there. I see it as a big accomplishment for them to be able to go up stairs, let alone crawl over stuff. Give those guys enough shielding and send them into a radioactive area, sure.
    • Humanoid bots are probably stupid to build.

      Much more efficient would be some kind of spider bots, first with 2- or 3-jointed mechanical legs, then with many many-jointed legs, à la the Matrix or Doctor Octopus type limbs, ultimately something like an elephant's trunk.

      Much more stable, plenty of redundancy, and ironically, easier to program than a 2-limb humanoid walker.

  • I hope the batteries in those robots last longer than the one inside my notebook. *sigh*
    • I hope these robots will work with the "Three Laws".
      • Re:Battery (Score:4, Interesting)

        by rm-r ( 115254 ) on Thursday September 27, 2001 @06:35AM (#2357483) Homepage
        If anyone is interested in something about robots, but on a somewhat lighter note, try this article [bbc.co.uk] at the BBC. It's a robot that roams farms tracking and picking up slugs; the best bit is that it is fueled by their decomposing bodies!
        • If anyone is interested in something about robots, but on a somewhat lighter note, try this article [bbc.co.uk] at the BBC. It's a robot that roams farms tracking and picking up slugs; the best bit is that it is fueled by their decomposing bodies!

          This is old news [slashdot.org].

    • I doubt they rely on batteries for those particular robots. Using floodlights requires a lot of power, and you would want the operational times to be in hours, maybe tens of hours. I'm nearly positive they would have a tether on most of the robots that go into deep rubble; if something goes wrong you can usually use the umbilical to pull the robot back out. (Those things are expensive.) Also, the video transmission is much clearer (and easier) over an umbilical.

      Additionally, the chaotic nature of the environment wouldn't be one where you could depend on getting wireless coverage easily. It could be easy to drop into a dead spot and lose control until the coverage is somehow fixed.

      But I'm sure I made more out of this than what you were trying to say... the run times on our bigger robots are probably 4 hours (depending), but they can't traverse rubble.

      Brett
      • What follows is about the robot with tracks and flippers.


        I doubt they rely on batteries for those particular robots.



        Yes, we do use batteries. Right now they suck (NiCds), but we're in the middle of developing LiIon. That should give us $BIGNUM hours of life.


        I'm nearly positive they would have a tether on most of the robots that go into deep rubble; if something goes wrong you can usually use the umbilical to pull the robot back out.


        In this case, I imagine they did use a safety tether, and there is hard-line Ethernet and serial on the robot. However, we've been tasked not to use a tether. We're still experimenting with various comms, from wireless modems to 802.11b to even 802.11b downconverted to the military band (so we can up the power).


        Additionally, the chaotic nature of the environment wouldn't be one where you could depend on getting wireless coverage easily. It could be easy to drop into a dead spot and lose control until the coverage is somehow fixed.


        Losing comms is something everybody in this program has been trying to tackle for the last 3 years. There are 2 methods currently in use. The dumber robots backtrack along their path until they get back into comms. The smarter ones will either backtrack or, if they were given a task before comms dropped, finish the task and try to re-acquire.
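
        For the curious, a minimal sketch of that "backtrack until comms return" behavior; the names here (have_comms, drive_to) are hypothetical stand-ins, not our actual API:

            # Illustrative breadcrumb backtracking on comms loss.
            def run_with_breadcrumbs(robot, waypoints):
                trail = []                        # breadcrumbs dropped while driving
                for wp in waypoints:
                    robot.drive_to(wp)
                    trail.append(wp)
                    if not robot.have_comms():
                        # Dumb recovery: retrace the trail until the link returns.
                        for crumb in reversed(trail):
                            robot.drive_to(crumb)
                            if robot.have_comms():
                                break
                        return trail              # hand control back to the operator
                return trail

        The smarter behavior wraps the same loop: finish the assigned task first, then fall back to the retrace.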
  • Another link (Score:4, Interesting)

    by rm-r ( 115254 ) on Thursday September 27, 2001 @06:09AM (#2357438) Homepage
    the BBC has this piece [bbc.co.uk] on laser mapping of the rubble.
  • The BBC has a piece here [bbc.co.uk] that doesn't require all that messing about with an account.
  • Short on technical data? Try non-existent technical data: unless you count mentioning treads, there was absolutely NO description of anything in these babies.

    It would be good for rescue operations (not body-search ones) if the robots were equipped with a speaker/mic besides the telepresence equipment. That way, if they found somebody alive, the people topside could communicate with them.
    Give them some hope.

    Are these things tethered or wireless? The article shows that small tracked one without any cables sticking out, but it's not operating. I guess all the steel and iron would fuck up wireless comms; it's not like there's just line-of-sight or some office walls between the robots and the operator. And a very powerful tx/rx module would require a lot of juice.

    --
    T

    "If you can't improve the silence, don't speak."

    • Regarding speaker/mic: some of the Inuktuns (including the MicroTracs) have a microphone and speakers, with a headset that the operator can wear.


      Regarding tethered/wireless: the Inuktuns are tethered. This limits their range, but makes it possible to use them continuously, and there is really no risk of losing communications (unless the tether is physically broken). Further, if needed, the robot can be retrieved by following the tether.


      The iRobot/RWI robots are wireless, and would normally be capable of running with full autonomy (i.e. they have onboard processing and a sensor suite). They are limited by their battery life, and if the wireless communications are lost (due to interference or otherwise), they may be difficult to recover. For this reason, the Inuktuns were used primarily.


      Check out the USF USAR page [usf.edu] (not yet updated with WTC information) for more.

      • I worked with my college's robotics team at this year's AAAI/IJCAI robot competition. While I was working on the robotic waiter competition, we also competed in the USR contest and got a good score. Check out this NYT article [nytimes.com] for more information (I'm at top left in the first picture :-).

        Anyway, the field of robotic USR at this point is so new that a lot of interesting and very different technologies are being developed. A lot of groups use platforms that wouldn't be of great use in a real rescue situation, such as the iRobot Magellan Pros [irobot.com] we used, simply because they're research robots designed to roll around on flat surfaces. These in particular lack sensors specifically designed for rescue use: they have a ring of sonar and infrared range sensors around them and a video camera. Others, like the Urbans used at USF, are very rugged (they are tracked and can climb things) and have very sensitive FLIR sensors for detecting people.

        Autonomy versus teleoperation was the subject of some debate at the competition last summer. The scoring for the contest was done according to an equation that rewarded the number of people found, penalized the number of people it took to operate the robots, and so on. The formula gave a very clear advantage to teleoperated robots: even though teleoperating is very difficult (it's not like RC cars!), it's a lot easier for a human to look at a screen and navigate/find people than it is for a robot to derive this information from the same image. In a way, this is a very pragmatic approach: we need to develop useful technologies soon. On the other hand, this contest took place at an AI conference, where naturally advancing the state of the art of machine intelligence is viewed as a pretty important goal.
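
        To make that concrete, here is a hypothetical sketch of the shape of such a scoring formula (the actual NIST weights weren't published here, so the constants are invented):

            # Illustrative only: reward victims found, penalize human operators.
            def usar_score(victims_found, operators, other_penalties=0.0,
                           w_victim=10.0, w_operator=2.0):
                return w_victim * victims_found - w_operator * operators - other_penalties

        With weights like these, one extra victim found easily pays for a single operator, which is consistent with the point above that the formula favored teleoperation.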

        Because of the robots' differing abilities, the contest featured three separate NIST-developed standard courses: one with flat floors, one with more challenging debris, and one with what was essentially a big pile of rubble. Our Magellans could only handle the first room; the Urbans and a huge homebrew tracked robot from the Sharif University of Technology in Tehran took the more challenging rooms.

        In any case, here (at last) is a summary of some of the technologies used in the competition:

        • Swarthmore College (us): Standard Magellan Pro wheeled research bot with a Canon pan-tilt-zoom camera on top. Final version used "semi-autonomous" guidance, where an operator would tell the robot to go to a point not too far away and the robot would go there. Big lead-acid gel batteries; PC on board did lots of processing. <gimmick>Robot could also generate red-blue stereogram images of site on operator command by rotating slowly and compiling distance data.</gimmick>
        • University of South Florida: Teleoperated Urban tracked robots with FLIR sensors. Some info [usf.edu]. Probably lead-acid batteries; don't know about onboard processing (suspect not much).
        • Sharif University of Technology: Medium dog-sized tracked robot with binocular vision. Teleoperated by laptop or Palm. I wouldn't want to get in the way of this thing. Few or no sonars IIRC, giant lead-acid gel batteries, probably very little onboard processing.
        • University of Minnesota: Cute tube-shaped robots designed (no joke) on a DARPA grant to be shot out of a grenade launcher. Completely radio controlled; AFAIK no smarts on board. Sent a wireless video feed to monitors. A spring mechanism allowed them to jump out of tight spots. DARPA doesn't let them talk about the power supply or other details.
        • ?University of Utah?: A commodity approach: robots were Radio Shack RC cars with Basic STAMPs on board. While not completed at the contest, eventually they will swarm with complete autonomy around the disaster site, detect people (or other hot things) with IR sensors, and relay their position to a central controller. Power is a simple 9V battery.
        • University of Edinburgh: I don't recall exactly; despite all their valiant efforts, they couldn't get anything working in time. I think they were also working on an autonomous system of two or three bots.
        All of the teleoperated robots were controlled wirelessly; none had tethers.

        Enough rambling for now. In any case, it was a really cool experience to go out to Seattle and see all the stuff the teams were trying. For me, robotics is a great, growing field that is a whole lot of fun, and the conference was one unforgettable week.

        --Tom

    • It would be good for rescue operations (not body-search ones) if the robots were equipped with a speaker/mic besides the telepresence equipment.



      Yes, they will have speakers/mics. iRobot (the company that makes the chassis) has provisions for that in their upgrade path for the chassis.



      --Carlos V.

  • login NOT required (Score:5, Informative)

    by wickline ( 93513 ) on Thursday September 27, 2001 @07:07AM (#2357539)
    http://archive.nytimes.com/2001/09/27/technology/circuits/27ROBO.html [nytimes.com]

    This trick has been seen on Slashdot before. It should be a FAQ. Editors should automatically use the archive subdomain in NYT links rather than the www subdomain.
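
    If you want to automate the trick, a tiny sketch in Python (standard library only) that rewrites the subdomain:

        from urllib.parse import urlparse, urlunparse

        def deregister_nyt(url):
            # Swap the login-walled www subdomain for the open archive one.
            parts = urlparse(url)
            if parts.netloc == "www.nytimes.com":
                parts = parts._replace(netloc="archive.nytimes.com")
            return urlunparse(parts)

        # e.g. deregister_nyt("http://www.nytimes.com/2001/09/27/technology/circuits/27ROBO.html")
        # -> "http://archive.nytimes.com/2001/09/27/technology/circuits/27ROBO.html"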

    -matt
    • by at_18 ( 224304 )
      You can also use this account:

      Username: slash2001
      Password: slash2001

      Let's see how long it keeps working after 1,000 people start using it.
    • /. doesn't post the direct link because they could be sued by the NYT for linking straight to their subdomain. If it's the poster that does it, Slashdot holds no liability, because it's someone else. Doesn't anyone see that's why they never do it themselves, and ask others to do it?
    • Anyone else notice that on Sep 11, when the nytimes.com servers were struggling to keep up with the demand, the one page that popped up quickly was the registration page?
  • by hackman ( 18896 ) <bretthall@i e e e . org> on Thursday September 27, 2001 @07:32AM (#2357595) Homepage

    This is a really interesting opportunity for some high tech to be applied to a real, non-military situation. Robin Murphy gave a talk here at UCSD a few months ago, and she actually brought (and we got to drive) the tracked one with the flippers in front.

    Those things are not easy to drive. One of the most difficult things is getting a perspective on where the robot is in relation to its surroundings (very rough rubble). This is an ongoing research area for many robotics teams, and one we have been working on also.

    The submitter mentioned something about autonomous robots; I think they don't fully understand the difficulty of the problem robotics researchers are working on. Navigating uneven building wreckage autonomously is an incredibly difficult problem in general, especially under the conditions of the WTC rubble. There may be some small parts of the process which can be automated, but I doubt it would be useful in this situation anyway. They were using the robots as probes to discover what was inside areas where it was too dangerous for people to go, so a human is already "in the loop". The real use of these systems is for remote visualization (i.e. show me what's in there) in hard-to-reach areas.

    They didn't specify what types of cameras are being used, but this is a mostly visual problem from my understanding. Most robots have standard rectilinear camera views that are forward-facing; unfortunately, operation of these platforms is difficult because of the restricted field of view and the inability to see to the left, right and behind the robot. Multiple cameras help, but add significant complexity and disjoint views. A technology which really makes this easier is an omnidirectional video sensor (ODVS), which has a 360 deg. field of view around the sensor. These are ideal for "immersed" applications like this: they literally give the operator a view of the entire space around the bot (except for directly overhead) and let you determine the robot's orientation relative to obstacles easily. The same data can also be unwarped and used to create a perspective or panoramic view of the area in real time. A pair of these and stereo software (which has also been done in our lab, [shameless plug over]) can provide a full depth map of the area. The ODVS has the difficulty of limited resolution (same CCD, larger FOV), but this can be supplemented by a pan/tilt/zoom rectilinear camera.
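
    To illustrate the unwarping step, here's a minimal sketch; it assumes a ring-shaped ODVS image centered at (cx, cy) between radii r_min and r_max, and skips the real mirror calibration:

        import numpy as np

        def unwarp_odvs(img, cx, cy, r_min, r_max, out_w=720):
            # Map the donut-shaped mirror image to a panoramic strip:
            # columns sweep bearing 0..360 deg, rows run horizon -> center.
            out_h = int(r_max - r_min)
            thetas = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
            radii = np.linspace(r_max, r_min, out_h, endpoint=False)
            rr, tt = np.meshgrid(radii, thetas, indexing="ij")
            xs = np.clip((cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
            ys = np.clip((cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
            return img[ys, xs]    # nearest-neighbor lookup; one output row per radius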

    Really the interesting part of research in this direction is the remote operation and the visualizations that help the operator navigate through the area to achieve their goal. This is what my thesis is on, actually.

    More info: the UCSD CVRR Lab [ucsd.edu], The Page of Omnidirectional Vision [upenn.edu], and our source of ODVS [kyoto-u.ac.jp]. Also check Vstone [vstone.co.jp] (in Japanese; you may need to run that last one through Babelfish or something).

    Mobile robots are cool. We even have one that pulls cables for us in the drop ceiling of our lab... we're slowly working on a web page for that new one. I have a cool video of it already, but it's HUGE (100M or so). Anyway, I'll shut up.


    Brett
    • I actually did some research on this very topic (I was a student of Dr. Murphy while she was at the Colorado School of Mines). I have tried operating some of these very robots. Not an easy task at all.

      My research focused on using AI to assist the operator: for example, in a very cluttered environment with many sharp corners, the robot would automatically lower its top speed. Other people I worked with were researching AI-assisted telepresence.
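
      To give a flavor of the idea, a hypothetical sketch (the sensor interface and constants are invented for illustration):

          def speed_cap(sonar_ranges_m, v_max=1.0, near_m=0.75):
              # The more sonar returns inside near_m, the more cluttered the
              # space, and the lower we cap the operator's top speed.
              near = sum(1 for r in sonar_ranges_m if r < near_m)
              clutter = near / len(sonar_ranges_m)
              return v_max * max(0.2, 1.0 - clutter)   # never stall the operator outright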

      I am torn that this research is being put to use. Kind of like knowing first aid: you are glad you know what to do and glad to help, but you hope like hell you never have to apply it.
    • I'm on the JPL portion of TMR. TMR stands for Tactical Mobile Robotics and is the DARPA program headed by Lt. Col. John Blitch before he retired. It's the tracked robot with the flippers in front. The chassis is made by iRobot [irobot.com], and JPL is doing the sensors/autonomous-behavior portion of the program. You can see last year's effort on the previous version of the chassis at our website: http://robotics.jpl.nasa.gov/tasks/tmr [nasa.gov]

      Those things are not easy to drive. One of the most difficult things is getting a perspective on where the robot is in relation to its surroundings (very rough rubble). This is an ongoing research area for many robotics teams, and one we have been working on also.


      Yes, they are very difficult and disconcerting at first. It does take training. One of our most difficult long-distance runs had the operator out of sight of the robot, and the robot having to navigate through a forested area, go to a tower across a parking lot using cover, and climb up a set of stairs. The hard parts were the questions "Where am I?" and "Where can I go?" The operator had to depend heavily on the 360-degree camera and the computer bread-crumb trail to figure out where he was and where to go, but the question of traversal across obstacles remained.



      With a simple black-and-white camera it is hard to tell what's out there. Is there something behind that grass? Is it a rock, a hole, or a log that can high-center us? Now factor in trying to do that autonomously. The computer can see the clump of grass, but if it's thick enough, it'll see it as an obstacle instead of something it can just roll over. And even if it knows it can roll over it, it can't see whether there are any obstacles hidden in the grass. We're trying to use laser scanners and radar to solve that problem.
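
      As a rough illustration of what that check has to decide, a sketch with invented names and thresholds (not our actual code):

          def classify_patch(heights_m, clearance_m=0.15):
              # heights_m: laser height samples relative to the ground plane.
              if min(heights_m) < -clearance_m:
                  return "hole"
              if max(heights_m) - min(heights_m) > clearance_m:
                  return "obstacle"     # tall enough to high-center the robot
              return "driveable"        # e.g. grass the tracks can roll over

      The catch, of course, is that the laser happily returns the top of the grass, so the height samples themselves can lie to you.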


      Multiple cameras helps, but adds significant complexity and disjoint views. A technology which really makes this easier is an Omni-directional Video sensor (which has a 360 deg. field of view around the sensor).



      Yup, we have one of those from a company called Remote Reality [remotereality.com]. Nifty items. The one with the hole in the center of the picture gives the best resolution at the horizon; 180-degree fisheyes don't have the hole in the center, but suck at resolution on the horizon.



      --Carlos V.

  • by Muad'Dave ( 255648 ) on Thursday September 27, 2001 @08:03AM (#2357668) Homepage
    those yappy little Aibos! Small, mobile, autonomous. Add a speaker and microphone, and you can talk to survivors while they pet the 'bot and watch it do tricks.

    Better yet, the operators can cry, "Let slip the dogs of war!" or "release the hounds!" when they deploy them.
  • You can also look at Wired's version here [wired.com], with a few more pics than the NYT.
  • You mean the 1.5-foot-tall ones with little nuts for hands? I don't think they could lift anything to look for a person under it...
  • Build your own... (Score:2, Informative)

    by nyjx ( 523123 )
    The long-running RoboCup soccer challenge (Robocup.org [robocup.org]) started a search and rescue competition after one of those involved died in the Kobe earthquake:

    http://www.r.cs.kobe-u.ac.jp/robocup-rescue [kobe-u.ac.jp]

    You can build your own coordinated team of virtual robots to work in a simulated disaster zone, and there are also some search and rescue competitions using real robots. There's a download page for the simulators and disaster toolkits.

  • A friend of mine works in the acoustics lab at a nearby university. Not long after the attack, they received a request to build a device that would allow rescuers to find people in the rubble by listening for breathing or cries for help that couldn't be heard otherwise. The team did get something together for this and went up to the site, but they never got to try it out.

    This was a couple of days after the attack, and rescue workers had pretty much given up on survivors at that point (though the media continued to report otherwise). According to my friend, when he got there, the police and firefighters were concentrating the search on finding their own guys and had pretty much given up on finding anyone else.

    He said that it took about 6 hours to get cleared to go onto the site itself, and then, after that, you were put in a queue to use the site for your task (searching, removing rubble, et cetera). They were bumped twice before being told they weren't going to be used, so the device never had a field test, but none of the team complained. They were in complete awe of the scene itself, and, while they ultimately felt helpless, they felt privileged just to be there and to speak with other crews and native New Yorkers. They managed to get some audio recordings of the area as well, thinking it might be important for historical reasons.
  • Unfortunately, they wouldn't be much help in the WTC situation, where everything is buried under tons of rubble, but there are some really amazing things being done in the Association for Unmanned Vehicle Systems International's [auvsi.org] International Aerial Robotics Competition [gatech.edu]. My school [uwaterloo.ca] has had a team [uwaterloo.ca] for a few years, and they kick ass. The goal: autonomous flying robots with vision, image recognition, hazard avoidance, and more. This stuff is frickin' cool.
