
'Don't Fear the Robopocalypse': the Case for Autonomous Weapons (thebulletin.org) 150

Lasrick shares "Don't fear the robopocalypse," an interview from the Bulletin of the Atomic Scientists with the former Army Ranger who led the team that established the U.S. Defense Department policy on autonomous weapons (and has written the upcoming book Army of None: Autonomous Weapons and the Future of War). Paul Scharre makes the case for uninhabited vehicles, robot teammates, and maybe even an outer perimeter of robotic sentries (and, for mobile troops, "a cloud of air and ground robotic systems"). But he also argues that "In general, we should strive to keep humans involved in the lethal force decision-making process as much as is feasible. What exactly that looks like in practice, I honestly don't know."

So does that mean he thinks we'll eventually see the deployment of fully autonomous weapons in combat? I think it's very hard to imagine a world where you physically take the capacity out of the hands of rogue regimes... The technology is so ubiquitous that a reasonably competent programmer could build a crude autonomous weapon in their garage. The idea of putting some kind of nonproliferation regime in place that actually keeps the underlying technology out of the hands of people -- it just seems really naive and not very realistic. I think in that kind of world, you have to anticipate that there are, at a minimum, going to be uses by terrorists and rogue regimes. I think it's more of an open question whether we cross the threshold into a world where nation-states are using them on a large scale.

And if so, I think it's worth asking, what do we mean by "them"? What degree of autonomy? There are automated defensive systems that I would characterize as human-supervised autonomous weapons -- where a human is on the loop and supervising its operation -- in use by at least 30 countries today. They've been in use for decades and really don't seem to have brought about the robopocalypse or anything. I'm not sure that those [systems] are particularly problematic. In fact, one could see them as being even more beneficial and valuable in an age when things like robot swarming and cooperative autonomy become more possible.
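To make the "on the loop" distinction concrete, here is a minimal, purely illustrative sketch in Python: the machine proposes an engagement and proceeds only if no human veto arrives in time. The names and the five-second window are invented for illustration, not any real system's design.

```python
# Illustrative sketch of "human on the loop" supervision: the automated
# system proposes an action; a supervisor may veto it within a time window.
# All names and the window length are hypothetical.
import queue

VETO_WINDOW_SECONDS = 5.0  # assumed value for illustration

def supervised_engage(track_id: str, veto_queue: queue.Queue) -> bool:
    """Engage a track only if no human veto arrives within the window."""
    print(f"System proposes engaging {track_id}; supervisor may veto...")
    try:
        reason = veto_queue.get(timeout=VETO_WINDOW_SECONDS)
        print(f"Supervisor vetoed: {reason}")
        return False
    except queue.Empty:
        print("No veto received; engagement proceeds.")
        return True

if __name__ == "__main__":
    # With an empty queue, the window simply expires and the action proceeds.
    supervised_engage("track-42", queue.Queue())
```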

Comments:
  • "In general, we should strive to keep humans involved in the lethal force decision-making process as much as is feasible. What exactly that looks like in practice, I honestly don't know."

    So, you tell me not to fear something, and then the expert himself says this.

    Gee, I feel so much better that we have no idea how to do anything about the pending robopocalypse other than to wag our finger at the evil in the world and say "Remember to play nice and be honest and fair when trying to kill each other."

  • by sg_oneill ( 159032 ) on Monday January 15, 2018 @06:32AM (#55930557)

    Autonomous killing machines are a frankly horrific idea. In principle, machines should serve us, NEVER the other way around. On first principles alone, the idea that a machine could determine who to kill and who not to kill is chilling.

    And in practice, it's a terrible idea. Human soldiers already face baffling moral situations. Woman with child at a checkpoint, acting suspiciously. Maybe a suicide bomber. But she has a child. To shoot or not to shoot. That's the kind of thing guaranteed to give a marine a gnarly case of PTSD if he chooses wrong, and possibly also a dead mother and child (or conversely a dead platoon). But the possibility of a horrifically wrong choice means that marine is going to deploy every fragment of reason his brain can muster. How the hell would we entrust such a monumental decision to a robot? Its "wrong" choice has no repercussions for it. If it kills an innocent mother, it doesn't care; it's just a thing. If it opts for caution and chooses wrong, it still doesn't care; it's already dead. There's no incentive anywhere up the chain of command to get this right, because "Well, a robot chose badly, sorry, not our fault!" is a get-out-of-jail-free card to just let the bloody thing go robocop on a civilian population. We *morally* AND *practically* NEED humans in that decision loop, even if it's just some guy in an air-conditioned office with a VR headset.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      We *morally* AND *practically* NEED humans in that decision loop, even if it's just some guy in an air-conditioned office with a VR headset.

      Yes! We already have a ban on certain land mines since they kill without a human operator pulling the trigger, often harming innocent non-combatants.

      It is easy to deploy weapons that kill someone without you being anywhere near, and that don't require you to make a judgement call. Those weapons kill indiscriminately. And even in war there are rules of engagement that tell us what we can and cannot do, unless you want to be a war criminal. Non-combatants should not be targeted, for example. Perhaps you

    • A point I made above is that they are too easy and cheap to create to ignore. If we had been on the fence about using them and they had existed on 9/10/01, the next day would have decided it for most of the country. Ignoring the field is a bit like ignoring malware and virus development. The question to me is: how do you discuss and control something this easy and risky? Mind you, this is going to happen. I am sickened by the idea, but anyone who has read or watched American Sniper knows the question and t
    • Autonomous killing machines are a frankly horrific idea.

      Agreed. But the alternative, remote controlled killing machines, seems to be just as bad. We have already seen from leaked videos that soldiers given drones to pilot using a video feed seem to treat bombing people as some sort of fancy computer game.

      Given the two options, I am not sure which is worse: an emotionless killing machine that follows preset rules of when, and when not, to engage, or an emotional human who follows no predetermined patterns but one so removed that they regard your life as much as

      • It's not really like a video game to the drone operators: https://www.nbcnews.com/news/u... [nbcnews.com]

        You're quite correct that remote killing machines operated by humans are little better than fully autonomous ones.

      • There was an interesting study a few years back that found drone operators suffered wartime PTSD at nearly the same rate (i.e. pretty damn high) as combat soldiers, the implication being that the sort of adrenal escalation that is behind PTSD appears to happen just as much in people who kill remotely as in those who actually hold the gun.

        The problem, of course, is that it's unlikely the drone operator is going to realise what a head fuck he's in for until he's actually killed. Though I assume the same also appl

    • Since we're talking about the use of a robot capable of operating a checkpoint, consider this. A robot can safely approach and inspect that suspicious woman and child for explosives because it doesn't need to fear for its life like a human soldier does. It will perform that search because it is disposable and not free-thinking. If the woman is a suicide bomber then she kills herself and we lose a robot. We've eliminated the need to accidentally kill a civilian on suspicion alone. Activate another robot send t
    • In your example, the robot doesn't care if it gets blown up. Therefore, you reason that the people in charge will not care much about civilian casualties. Nothing could be further from the truth. The chain of command will be EXTRA careful to avoid casualties like that. They'll program the robot to be very conservative about firing on anyone who might be a civilian. Robot gets blown up? "Meh, we're out 150 kilodollars. No biggie. Nothing is more effective at swelling the enemy ranks than a few well-publiciz
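      If one wanted to see what "very conservative" might look like in code, a toy version (the threshold and names are invented here, not taken from any fielded system) could be nothing more than a lopsided decision rule:

```python
# Toy version of a deliberately conservative firing rule: any meaningful
# chance the target is a civilian means hold fire and risk the robot instead.
# The threshold is an invented illustration, not a real system's parameter.
CIVILIAN_RISK_THRESHOLD = 0.01

def may_fire(p_combatant: float) -> bool:
    """Allow fire only when the estimated civilian probability is essentially nil."""
    p_civilian = 1.0 - p_combatant
    return p_civilian < CIVILIAN_RISK_THRESHOLD

# Example: even 95% confidence it's a combatant still means hold fire.
assert may_fire(0.95) is False
assert may_fire(0.999) is True
```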
    • Good lord, you've made the best case for robots I've ever heard. A human being having to make a tough choice that can easily result in an innocent woman's death or his platoon's, and said choice scarring him for life? Even if he makes it correctly? Holy hell, man, protecting people from danger like that is why we build robots. Just offloading the decision to someone else can help prevent PTSD for the soldier on the ground. Robots can make use of sensors at a range to get a better probability assessment it's a

    • How the hell would we entrust such a monumental decision to a robot.

      Easy: You just don't care. We already don't care about drone strikes. Ever look up the number of civilian casualties in Iraq? Just the ones the US government acknowledges are enough to blow your mind. You'd think it'd make headlines. But it barely even registers.

      See, in war, if you're not personally getting blown to bits and neither is your family, then it's easy peasy to just ignore it.

    • machines should serve us, NEVER the other way around.

      I'm not afraid of whether we can keep autonomous machines on a leash. The question is, who holds the leash?

  • If I was in a gunfight I might think twice about killing another human. But a bot? No hesitation whatsoever. I'd blast the motherfucker.

    Sending bots just sounds like an expensive way to flush money down the toilet.

    • Re: (Score:2, Insightful)

      by Dutch Gun ( 899105 )

      If I was in a gunfight I might think twice about killing another human. But a bot? No hesitation whatsoever.

      I'm pretty sure the bot feels nothing about the prospect of killing you as well.

      • But I'm not sure if that's better or worse than humans, to be honest.

        With a human, they might be told to exterminate your village, but they might not be able to bring themselves to do it. They also might be told to keep casualties to a minimum, and then someone yells "look out, snipers!" and now they've got the rationale for exterminating your village.

        The robots are either set to exterminate the village or to select targets carefully. You get what you get, and there is a lot less uncertainty. Whether or not that's worse than sending scared/angry

  • by Anonymous Coward
  • Let them battle it out on a deserted remote island; whoever's AI robot army is still standing at the end wins.
    Of course, then the question becomes: why not just run a computer simulation instead?

      Let them battle it out on a deserted remote island; whoever's AI robot army is still standing at the end wins.

      General Zaroff and the BattleBots? That's one hell of a mash-up. Cool band name.

      Of course, then the question becomes: why not just run a computer simulation instead?

      Telling the kids it's just a game will be the prescription to prevent PTSD. Let's just hope they don't talk to Ender Wiggin.

    • Holy TOS Mr. Spock! No references to "A Taste of Armageddon [imdb.com]" yet? What has this site come to?
  • by johanw ( 1001493 ) on Monday January 15, 2018 @07:47AM (#55930705)

    It will be a lot safer for the public: no more cops who claim they were "under pressure" or "afraid" so they had to shoot first. A robot can afford to shoot last, only when fired upon first.

    • by AHuxley ( 892839 )
      Contain a no-go part of the city.
      Send in the robots with 24/7 aerial surveillance to counter all wifi/internet attempts to send out live streams or upload video clips.
      Set up a pathway out of the area for the citizens who want to surrender, be searched, and be moved to another city.
      Move in the robots to try some pacification on the looters who did not take up the offer to be searched and exit the area.
      Remove power and water, block networks, and wait for the criminals and looters who stayed to try and escape. Rob
    • by Agripa ( 139780 )

      It will be a lot safer for the public: no more cops who claim they were "under pressure" or "afraid" so they had to shoot first. A robot can afford to shoot last, only when fired upon first.

      I am sure law enforcement will be responsible when using this technology.

      https://en.wikipedia.org/wiki/... [wikipedia.org]

  • by 0100010001010011 ( 652467 ) on Monday January 15, 2018 @08:07AM (#55930741)

    As we all know, the enemy always follows the rules. Right on back to some dirty farmers hiding in the woods not following the British 'rules of war'.

    The technology to build these things is not difficult. In the US, a gun is the easiest part of the puzzle. Toss in some OpenCV, a webcam, and a solenoid and you can have your own private sentry.
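    To the parent's point about how low the bar is: a crude motion-triggered sentry really is just a few lines of OpenCV. This is a sketch under stated assumptions; trigger_solenoid() is a hypothetical stand-in for whatever relay or GPIO driver would pulse the hardware, and the pixel threshold is arbitrary.

```python
# Crude motion-detecting "sentry" sketch: webcam + OpenCV background
# subtraction. trigger_solenoid() is a hypothetical stand-in for real
# hardware; the 5000-pixel threshold is arbitrary.
import cv2

def trigger_solenoid():
    # Hypothetical: a real build would pulse a GPIO pin or a serial relay.
    print("Motion detected -- solenoid would fire.")

cap = cv2.VideoCapture(0)  # first attached webcam
subtractor = cv2.createBackgroundSubtractorMOG2()

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)          # foreground (motion) mask
    if cv2.countNonZero(mask) > 5000:       # enough changed pixels?
        trigger_solenoid()
    if cv2.waitKey(30) & 0xFF == 27:        # Esc quits
        break

cap.release()
```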

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Monday January 15, 2018 @08:14AM (#55930765)
    Comment removed based on user account deletion
    • War is about control, yes. But amazingly, no matter how great your nuclear arsenal... you can still bash people's heads in with a rock.

      Information-based warfare does not preclude physical violence. And if a group with the capacity to wage physical conflict decides it is losing the more civilized digital conflict, it can always fall back on guns and bombs.

  • US military thinking on how to win a war is not that secret; the desired results are well understood.
    The free fire zones over Vietnam https://en.wikipedia.org/wiki/... [wikipedia.org]
    The British response in the Second Boer War https://en.wikipedia.org/wiki/... [wikipedia.org]
    But with robots.
  • Everyone seems to forget we've had fully autonomous lethal weapons for decades - they're called land mines and sea mines. All we've done is make them smarter and more mobile. Making them smarter should only help reduce the number of false-positive casualties. To some extent, the same basic rules apply to robots as minefields - the person culpable is the one who deploys them. We've just got more control now than we did before.
    • That's a great comparison, but anti-personnel mines are illegal: https://www.un.org/disarmament... [un.org]

      • And of course, ISIS would never disobey the U.N.
        • And of course, ISIS would never disobey the U.N.

          If you're insinuating that respectable nations should copy the deplorable tactics of a terrorist organization to defeat them, then what separates the two? Should the US torture and behead its enemies too? I think we'd be better off researching ways to mitigate the actions of the terrorists rather than paving the way for them. For instance, there are probably dozens of ways to blind or otherwise incapacitate drones.

  • I really can't think of a reason why a military would not develop such weapons.

  • If people aren't being killed, there is no point to war. When we have robots fighting robots, it is just a show.
    • While certainly utterly defeating an opposing force is a way to win a war, destroying infrastructure in order to break morale of the other army through lack of resources is both easier (can't hide a factory as easily as an infantry unit) and more effective.
    • The point of war is to defeat the enemy. This can be done by physical or nonphysical means, including mostly bloodless ones. One of Napoleon's greatest successes was the non-battle around Ulm in 1805, in which he forced the surrender of an enemy army without serious fighting.

      Now, all of these nonphysical means rely on physical force, so it's important to have the ability to kill enemies, but it doesn't necessarily have to be used.

  • Not the current crap.

  • by starless ( 60879 ) on Monday January 15, 2018 @10:29AM (#55931301)

    Unlike a growing number of countries, the US hasn't yet agreed to a ban on the use of land mines.

    These (simple) machines can kill automatically and indiscriminately, with essentially no protection against civilian deaths, and they can remain active for many, many years.

  • This cat is out of the bag. There's little point in debating the ethics of it. Once the technology exists, and it already does, it can and will be abused. No matter how many countries say "we won't do it", it will be done. We may as well have them because, as sure as the sun sets in the west, someone else will.

  • Anyone who has installed an alarm system in their home and set it off accidentally should understand the main issue with arming your alarm system.

  • Nopenopenopenopenope
  • Forget about combat... these will be fantastic for robbing banks!

    hypothetically speaking, of course!

  • ... Come on baby, don't fear the robopocalypse
    Baby take my hand, don't fear the robopocalypse
    We'll be able to fly, don't fear the robopocalypse
    Baby I'm your man

    - New Oyster Cult

  • http://www.pdfernhout.net/reco... [pdfernhout.net] "Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
    Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar panels) to replace oil, or wh
