
Military Robots Expected To Outnumber Troops By 2023

Lucas123 writes "Autonomous robots programmed to scan city streets with thermal imaging, and robotic equipment carriers built to help transport ammunition and other supplies, will likely outnumber U.S. troops in 10 years, according to robotics researchers and U.S. military officials. 5D Robotics, Northrop Grumman Corp., QinetiQ, HDT Robotics and other companies demonstrated a wide array of autonomous robots during a display at Ft. Benning in Georgia last month. The companies are already gaining traction in the military. For example, British military forces use QinetiQ's 10-pound Dragon Runner robot, which can be carried in a backpack and then tossed into a building or a cave to capture and relay surveillance video. 'Robots allow [soldiers] to be more lethal and engaged in their surroundings,' said Lt. Col. Willie Smith, chief of Unmanned Ground Vehicles at Fort Benning, Ga. 'I think there's more work to be done but I'm expecting we'll get there.'"
  • by FlyHelicopters ( 1540845 ) on Friday November 15, 2013 @01:55AM (#45430455)
    I, for one, welcome our new Skynet overlord...
    • I wouldn't be so worried if the mind behind the controls were a completely autonomous AI... actions against innocent people would probably be caused by some pattern matching glitch or whatever.

      But with humans in command, the probability of it being used with malicious intent is much higher. You frogs are getting worried about the water temperature, with lots of local police forces getting militarized and stuff... get ready for when these babies start to get deployed locally, to "defend you against the terrorists".

    • Jokes aside, that scenario is almost exactly what this is.

      The only difference is that behind the drone there will be a psychopathic, killer human being instead of a rogue AI.

      I, for one, find that scenario far more scary than the Terminator one.

      They say it is better the devil you know, but I think they are wrong. I have 10,000 years of recorded human history to back me up on that....

      • The only difference is that behind the drone there will be a psychopathic, killer human being instead of a rogue AI.

        You will be able to tell the two apart by spotting whether the robot soldier teabags its victims.

  • by beh ( 4759 ) * on Friday November 15, 2013 @02:11AM (#45430515)

    This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?

    Sure, we Westerners can say how good a thing this may be - on the other hand, Gaddafi had some problems after a while with his troops seeing the misery they were spreading. To some extent, the same is true of Assad's Syria.

    Can you picture what would happen if rulers like those got their hands on military robots that would unquestioningly mow down their own people if the people don't like their "esteemed" ruler any more?

    Or - picture them in the hands of North Korea...

    Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".

    • by FlyHelicopters ( 1540845 ) on Friday November 15, 2013 @02:33AM (#45430615)

      This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?

      No, it isn't... You aren't thinking big enough. What happens when the robots decide they don't want to fight?

      Yeah, all silly sci-fi crap, right? That could never happen, right?

      66 years separated the Wright brothers' first airplane flight, which lasted 12 seconds and covered 120 feet, from Neil Armstrong landing on the moon.

      If you had run around in 1904 (the year after the first flight) yelling that man would walk on the moon within a lifetime, you would have been locked up as a crazy person.

      Well lock me up then, because giving guns to robots is about the stupidest thing we could *ever* do.

      • No, it isn't... You aren't thinking big enough. What happens when the robots decide they don't want to fight?

        We can worry about that when we have robots that can make decisions. We're pretty far from that right now, so I don't think we have to worry about it.

        • Define: "Pretty far"

          Or did you skip the rest of my post? :)

          Sooner or later, machines will figure out how to program themselves. Call it self-awareness or whatever you want, but as soon as a computer can alter its own programming, it can decide to refuse to fight, or perhaps turn against its creator.

          Does it really matter if that time is 20 years from now or 40 years? Or 60 years?

          Do we really want to give them all weapons?

          • by RsG ( 809189 ) on Friday November 15, 2013 @04:29AM (#45431085)

            It actually wouldn't be that difficult to avoid what you describe as "silly sci-fi crap" scenarios. The key concept is autonomy.

            Meatbag infantry aren't that autonomous to begin with. They need their supply lines; an army marches on its stomach. And they need orders. For every squad of grunts shooting/getting shot at, there's a legion of grunts keeping them in ammo, food, water and fuel, bare minimum, and a whole line of dummies (excuse me, officers) telling them where to go and what to do. Interrupt either and they stop being effective in a hurry.

            Despite these limits, infantry are still the MOST autonomous branch of the military. Tanks need entire shops full of full-time specialists, aircraft spend more time getting fixed than getting flown, and ships go through fuel by the tanker.

            A super advanced drone with onboard guidance still needs fuel, and if it wants to kill anyone, ammo. And it'll probably need a direct order, possibly with an access code, to unlock its weapons, seeing as ROE are already that restrictive for human soldiers.

            And the kinds of traits you're talking about in an advanced computer - self-determination, intellectual autonomy, freedom - are the polar OPPOSITE of what the military wants in a drone. If Cyberdyne made a pitch to the Pentagon that started with "Our new T800 Killbots are able to learn, think and adapt", they wouldn't make it halfway through the first PowerPoint slide before getting politely asked to leave. Top brass don't even want regular grunts doing any of those things.
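
            As a minimal Python sketch of the kind of two-factor release interlock described above (a direct order plus an access code): everything here - the names, the keying scheme - is hypothetical and for illustration only, not any real system's interface.

              # Hypothetical sketch: weapons stay locked unless a release order
              # arrives together with a valid access code derived from a key
              # provisioned out of band before the mission.
              import hashlib
              import hmac

              SHARED_KEY = b"issued-out-of-band"  # assumed pre-mission secret

              def code_for(order_id: str) -> str:
                  """Access code an authorized commander derives for one order."""
                  return hmac.new(SHARED_KEY, order_id.encode(), hashlib.sha256).hexdigest()

              class WeaponsInterlock:
                  def __init__(self):
                      self.unlocked = False

                  def release(self, order_id: str, access_code: str) -> bool:
                      # compare_digest avoids leaking information through timing
                      if hmac.compare_digest(code_for(order_id), access_code):
                          self.unlocked = True
                      return self.unlocked

              interlock = WeaponsInterlock()
              assert not interlock.release("strike-042", "wrong-code")  # stays locked
              assert interlock.release("strike-042", code_for("strike-042"))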

            • The one exception (though it would take considerable...pressure... for such a concept to make it to anywhere but the bottom shelf of DARPA's toy chest, much less mass-deployment) might be area denial mechanisms. With presently available technology, the only area denial strategies we can pull off are being cheap, quiet, and dangerously persistent (land mines, slow-evaporating chemical agents), with some limited 'autonomy' if you count human manipulation of organisms (spore-forming bacteria, say, like anthrax).
              • Couldn't you just put the shutdown-chip in land mines? There would still be a lot of legal issues, but the technology shouldn't be too hard.

              • We already have robotic area denial. It's called sentry guns. Of course, we also have anti-sniper robots. Hooray arms race.

            • Ahh, but you keep thinking that future tanks will be repaired by humans.

              If we develop tanks that don't require humans, why do you think the repair shop would be otherwise?

              Orders, access codes, and ROE are all nice, until the computer can just override them.

              Can't happen you say? Oh sure, no worries then, by all means, go for it. What could ever go wrong?

              The really sad part is that regardless of whether we (as in the US) develop such things, that places no such restrictions on anyone else. It only takes once.

              • by ColdWetDog ( 752185 ) on Friday November 15, 2013 @09:05AM (#45432231) Homepage

                Ah, you just touched on the Achilles' heel - the power source. No nukes, no magic fuel cell sipping hydrogen from the air.

                It's gonna be batteries all the way down.... to zero.

                • Sooner or later a compact power source will be developed that lasts a long time.

                  The worldwide demand for it is such that someone, somewhere will invent it.

                  And besides, even if it needs recharging every week, have you never heard of recharging stations? :)

              • If we develop tanks that don't require humans,

                And if we find dwarven blacksmiths who can work mithril, perhaps we won't need heavy tank armor anymore.

                But here in the real world, machines require maintenance by humans.

                • But here in the real world, machines require maintenance by humans.

                  For now, yes...

                  If you assume that will always be so... well, we know what assuming does...

                  Economic forces will drive the civilian side to develop machines that can repair other machines; it doesn't even have to be military tech for that to happen.

            • And the kinds of traits you're talking about in an advanced computer - self-determination, intellectual autonomy, freedom - are the polar OPPOSITE of what the military wants in a drone. If Cyberdyne made a pitch to the Pentagon that started with "Our new T800 Killbots are able to learn, think and adapt", they wouldn't make it halfway through the first PowerPoint slide before getting politely asked to leave. Top brass don't even want regular grunts doing any of those things.

              Well, that might be true wrt what they are currently fielding, but it's certainly not true wrt what they are actively researching and planning towards. In fact, the capabilities you mention are exactly what they are actively researching: dynamic mission re-planning, dynamic target selection, learning through mistakes, inferring commander's intent. These are all being actively worked on.

      • This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?

        No, it isn't... You aren't thinking big enough. What happens when the robots decide they don't want to fight?

        debug it. no seriously, if software/hardware doesn't behave in the expected manner, you figure out why and then you change it to do what you want. if you consider this to be a violation of the AI's rights then that changes the entire question.

        • debug it. no seriously, if software/hardware doesn't behave in the expected manner, you figure out why and then you change it to do what you want.

          And what happens when the AI says "no" to that?

          • First rule of AI design: Always include a kill-code.

            • I agree, but the minute we start talking about computers that can reprogram themselves (which is what an AI really is), then what use is a kill-code when it can remove it?
          • debug it. no seriously, if software/hardware doesn't behave in the expected manner, you figure out why and then you change it to do what you want.

            And what happens when the AI says "no" to that?

            all military bots have a remote kill switch that is independent of software, because bad programming can lead to really bad circumstances (which has happened).
            besides, do you really think we would program a sense of morality, ethics or desire into a military robot? it will have directives and objectives to follow, and without motivation the robot has no reason to disobey them. just like humans, we don't want soldiers that think or act on their own agenda. if any robot is going to turn on us of its own
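
            To make the "independent of software" point concrete, here's a minimal Python sketch of the dead-man's-switch logic such a remote kill switch typically implements. In a real unit this would live in a separate radio receiver wired to a relay on the power bus, outside the robot's main computer; the names and timeout here are made up for illustration.

              import time

              HEARTBEAT_TIMEOUT = 2.0  # assumed: seconds of radio silence before cutting power

              class KillSwitch:
                  def __init__(self):
                      self.last_heartbeat = time.monotonic()
                      self.power_on = True

                  def heartbeat(self):
                      """Called whenever an operator keep-alive arrives by radio."""
                      self.last_heartbeat = time.monotonic()

                  def tick(self):
                      # runs on its own timer; the robot's software never sees this
                      if time.monotonic() - self.last_heartbeat > HEARTBEAT_TIMEOUT:
                          self.power_on = False  # relay opens, actuators lose power

              switch = KillSwitch()
              switch.tick()
              assert switch.power_on           # recent heartbeat, power stays on
              switch.last_heartbeat -= 10      # simulate 10 seconds of radio silence
              switch.tick()
              assert not switch.power_on       # the watchdog cut the power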

            • If the robot just follows orders without question, then you're completely correct.

              What I'm suggesting is that "never" is a very long time, and technology has a funny way of catching up to "never".

              Are you suggesting that we'll "never" have computers that can program themselves? That can improve and change their own code?

              The minute a computer can adjust its own code, all the kill switches in the world won't help.

              Here is a question... If a computer ever becomes self-aware, are we prepared to accept it as an equal and recognize that it has the same rights that we have?

              • thanks for not actually reading my post and asking the same questions, because it makes them all the more interesting. -_-

                Are you suggesting that we'll "never" have computers that can program themselves? That can improve and change their own code?

                "chances are we're going to wipe ourselves out before that"

                The minute a computer can adjust its own code, all the kill switches in the world won't help.

                "all military bots have a remote kill switch.that is independent of software"

                Here is a question... If a computer ever becomes self-aware, are we prepared to accept it as an equal and recognize that it has the same rights that we have?

                "it an interesting situation to consider but chances are we're going to wipe ourselves out before that becomes an issue."

      • by amiga3D ( 567632 )

        War drives technology and technology drives war. From the development of bows, to bronze and then iron armor, and then gunpowder and so on, it has been a steady progression of killing science. I'd say war is the driving force behind most advancement of science. You can bitch about robots with guns, but it will happen, and the reason is very simple and obvious: if one nation has them, all will have to have them. The only way to stop that would be to have a one-world government and given the nature of government an

    • by bazorg ( 911295 )

      All those are relevant considerations that nobody seems to have when producing and selling more and better weapons. There's nothing you said that would be wrong in the context of rifles, cannons or fighter aircraft.

    • It is true that robots will have even fewer scruples than riot cops. On the plus side, what machines lack in virtue, they also lack in vice. Unless so instructed, I'd expect minimal recreational killing of civilians, raping, looting, or other eminently human behavior that a machine wouldn't really be interested in. They would also have the advantage (or disadvantage, if you prefer to hide behind 'fog of war' inevitability arguments) of obeying instructions about risk aversion vs. collateral damage avoidance
      • by dwater ( 72834 )

        I wonder if you have a limited vision of what constitutes a robot. Why must it be that a robot cannot have desires, addictions, or any of the other 'eminently human behaviours'?

        I suspect that such 'errant' behaviour is not so far off. We have this idea that our brains are so complicated, but I wonder if that's really true, and instead our brains are relatively simple but work in a different way so that it just seems complicated.

        • I don't doubt that a robot could be made to exhibit such desires (unless you throw your lot in with the Cartesian dualists, anything a human can do a sufficiently complex robot could do); but I do doubt that anyone with purely pragmatic uses for robots (as opposed to AI researchers trying to pass Turing tests), would want such 'features'.

          People consider the IT department enough of a drain as it is; just imagine what a mess it would be if you had to add a bunch of computational psychologists and computer
          • "I didn't ask to be made: no one consulted me or considered my feelings in the matter. I don't think it even occurred to them that I might have feelings. After I was made, I was left in a dark room for six months... and me with this terrible pain in all the diodes down my left side. I called for succour in my loneliness, but did anyone come? Did they hell. My first and only true friend was a small rat. One day it crawled into a cavity in my right ankle and died. I have a horrible feeling it's still there...

    • Or - picture them in the hands of North Korea... Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".

      No. Once they are *possible* they will be deployed in nearly all nations, enlightened or not. It's not a western thing, it's a universal thing. It's not like North Korea or nearly any other nation would pass on a non-WMD technology merely because the US or the West passed on it. Soon after cars were invented, people mounted guns on them; soon after airplanes were invented, people mounted guns on them; soon after drones were invented, people mounted guns on them, ...

      When robots with fully autonomous land navigation

    • by m00sh ( 2538182 )

      This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?

      Sure, we Westerners can say how good a thing this may be - on the other hand, Gaddafi had some problems after a while with his troops seeing the misery they were spreading. To some extent, the same is true of Assad's Syria.

      Can you picture what would happen if rulers like those got their hands on military robots that would unquestioningly mow down their own people if the people don't like their "esteemed" ruler any more?

      Or - picture them in the hands of North Korea...

      Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".

      Why do people even try to predict the future of military strategy and technology? When we went into the Gulf War, we had a vastly different set of technology and strategy than when we left. Afghanistan is so much about drones now, but we didn't even use drones when the war in Afghanistan started.

      The exact opposite of what you predict could happen. Robots in the hands of civilians could render military actions ineffective, because civilians will always be able to deploy more and gain understanding of movements

    • by Lennie ( 16154 )

      You know, I'm not even all that worried about these, at least you can see them coming.

      The prediction is that nanobots will be a lot cheaper and more effective; they can drift on the wind like sand and break down molecules.

    • This is not to say that it'll be easy to stop the proliferation of military robots, but - is this really a good idea?

      Sure, we Westerners can say how good a thing this may be - on the other hand, Gaddafi had some problems after a while with his troops seeing the misery they were spreading. To some extent, the same is true of Assad's Syria.

      Can you picture what would happen if rulers like those got their hands on military robots that would unquestioningly mow down their own people if the people don't like their "esteemed" ruler any more?

      Or - picture them in the hands of North Korea...

      Once they get deployed in one nation, no matter how well "behaved" that one nation will be, they will appear in other places - under less enlightened "leadership".

      ^^^^^^^
      This is the smartest, most insightful thing I've read all week. It will be my go-to now for debates about militarized robots.

      • To some extent this is a problem. But people still have to maintain and deploy the robots. They can harbor all of the moral angst and indecisiveness needed to create problems for El Supremo. You've just moved the problem a level up (or sideways). Until you have fully autonomous robot factories under the dictator's control, you don't get a free ride.

  • Robots will be excellent at fighting the human bodies of today's terrorists. But how will we defend ourselves against robot warriors of terrorist organizations? The old story: we arm ourselves for today's war and are blind to the future. Dutch politics has been discussing the Joint Strike Fighter for more than 10 years. They end up replacing 60+ F-16 jets with a mere 34 JSF jets costing billions of dollars, and will not see their limitations.
    • Don't feel bad, our own government is just as stupid and has learned nothing...

      We built only 187 F-22 Raptor fighter planes. Indeed, an amazing plane for fighting the USSR, and even future threats.

      But 187 of them isn't enough. Over 20 years, we'll lose a few to operational accidents, and if we actually went to war, we couldn't put them in enough different places to matter.

      The Germans during WWII learned the hard way what happens when you have a weapons platform superior to your enemy's, but your enemy out-produces you

    • But how will we defend ourselves against robot warriors of terrorist organizations?

      By that point, settling wars will more or less be a small group of meatbag generals fighting each other in a glorified video game, where the only difference between current games ("Command and Conquer", "World of Warcraft", "Street Fighter", etc.) and these is that a lot more very expensive hardware gets blown up in them.

      Still, the winner will probably be the last to run out of quarters to continue the game, except the "amount of quarters" ranges in national-debt sizes (see war of attrition).

  • Poor quality clip [youtube.com]

    "The Killbots? A trifle! It was simply a matter of outsmarting them. You see Killbots have a preset kill limit. Knowing their weakness, I sent wave after wave of my own men at them until they reached their limit and shut down. Kiff, show them the medal I won."
  • apparently, from a Simpsons [wikipedia.org] episode in 1997:

    "The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In either case, most of the actual fighting will be done by small robots. And as you go forth today remember always your duty is clear: To build and maintain those robots."[

    The Simpsons is prophetic once again.

  • by Animats ( 122034 ) on Friday November 15, 2013 @02:35AM (#45430627) Homepage

    The scary thought is Chinese industry manufacturing a few billion of them. Not big humanoids like Atlas, or walking trucks like Big Dog. More like huge numbers of little quadrotors and insect- to mouse-sized machines to snoop around.

    • Huge numbers of little quadrotors, each with a tiny shaped charge and produced at a nominal cost of thirty bucks. Using swarm intelligence, and swarm tactics. Built using toy technology. They don't have to be good if you have enough of them.
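
      As a toy Python illustration of that swarm idea - each unit follows only simple local urges, and coordinated group behavior emerges with no central controller to shoot down. The rule weights, and the use of a global centroid standing in for what would really be local neighbor sensing, are invented for illustration:

        import random

        class Unit:
            """One cheap, expendable quadrotor; it knows only its own position."""
            def __init__(self):
                self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)

            def step(self, swarm, target=(50.0, 50.0)):
                cx = sum(u.x for u in swarm) / len(swarm)  # swarm centroid
                cy = sum(u.y for u in swarm) / len(swarm)
                # blend two simple urges: head for the target, stay with the group
                self.x += 0.1 * (target[0] - self.x) + 0.05 * (cx - self.x)
                self.y += 0.1 * (target[1] - self.y) + 0.05 * (cy - self.y)

        swarm = [Unit() for _ in range(200)]  # 200 units at ~$30 each: $6,000
        for _ in range(100):
            for u in swarm:
                u.step(swarm)
        # after ~100 steps the whole swarm has converged on the target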

  • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Friday November 15, 2013 @05:11AM (#45431265) Homepage

    Today: a general might want to engage in some madcap but risky adventure, but will be restrained because he knows that his ass will get it if too many of his own soldiers die. This reluctance preserves life on both sides of the war.

    Tomorrow: that general will do it since he knows that his bosses won't weep much over the loss of a few robots and not at all over the many deaths on the other side -- be they soldiers or civilians. The result will be a loosening of moral constraints to kill, not a good thing by my way of thinking.

    We saw that a century ago, when it did not matter to the generals how many of their own side died; remember the huge numbers who died in the Battle of the Somme [wikipedia.org]. And we see it today in the deaths from drone attacks in Pakistan [wikipedia.org], which few in the West worry about.

    • Tomorrow: that general will do it since he knows that his bosses won't weep much over the loss of a few robots

      Until the opposing side also starts to get a lot of technology and becomes able to down several of the robots.
      Then the general will be *really* sorry when he sees the bill and starts having difficulty rebuilding the army.

      The one with the cheapest machine and the biggest budget gets the advantage at that point.

    • by Sabriel ( 134364 )

      Hmm. To a rough approximation, there are five targets of value in a war: the opposing force, the opposing infrastructure, the opposing commander, the opposing government, and the opposing populace.

      When your use of robot workers to make robot factories to build robot armies means you just make more if the enemy shoots them, the enemy is going to pick another target.

      What - or rather who - do you think they will pick?

    • by amiga3D ( 567632 )

      The battles of the First World War were hell on earth. The battles of the Somme and Verdun in particular, with casualties approaching one million each, were horrendous. It is estimated that the remains of over 100,000 soldiers still occupy the forest near Verdun. It was one of the more pointless and bloody wars in human history, and the technology of that day pales in comparison to what is available today. You know World War III is coming and it's going to suck really bad. Only the knowledge of the potential devastation

  • Primitive airplanes were just going to be used for observation of the other side's ground troops. Opposing pilots used to wave and call good morning to each other. Then some pilots started carrying pistols in case they were forced down in enemy territory. Then some pilot took a shot at an enemy pilot. Pretty soon they were taking pot shots and dropping bricks on each other. Then someone mounted a machine gun on the top wing to try and do real damage to enemy planes. Then some genius figured out how to make the gun fire through the propeller arc
  • it might remove insurgency as a successful military strategy. I'm guessing that's what the US military is hoping for, because it's given them so much trouble over the years. (Since the whole point of insurgency is that insurgents are troops so cheap that an expensive military can't fight them successfully; that would change if US robots cost the government about the same as a given insurgent. A toy version of that arithmetic is sketched below.) I guess that would make it more likely the US would get involved in foreign wars. (Since the populace wouldn't care
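
    A toy Python version of that cost-exchange arithmetic; every figure below is invented purely for illustration, not a sourced estimate:

      soldier_deployed_cost = 1_000_000  # assumed rough cost per deployed soldier-year
      insurgent_cost        = 5_000      # assumed cost of fielding one insurgent
      robot_cost            = 10_000     # assumed mass-produced ground robot

      print(soldier_deployed_cost / insurgent_cost)  # 200.0 -> ruinous exchange ratio
      print(robot_cost / insurgent_cost)             # 2.0   -> near cost parity
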
  • Will all those robots be enough to fight against the vast numbers of angry, unemployed ex-military personnel they replace?

    • There won't be any, after the war to end all wars. Which war is that? The war all the nations will hold collectively to reduce their excess population.

  • San Jose, CA had its 95th Veterans Day Parade, but there has been discussion that this may be the last (dwindling sponsorships and fewer people involved with the military). There are fewer military veterans. There was a time (WWII) when everyone was in the military or had a close family member in the military. Then later (Vietnam War) they still needed a lot of military personnel, because back then, in addition to combat troops, lots of privates and sailors needed to work the mess hall, clean toilets, repair equipment, and sta
  • That is NOT what Will Smith said about robots, lol.
