'Don't Fear the Robopocalypse': the Case for Autonomous Weapons (thebulletin.org) 150
Lasrick shares "Don't fear the robopocalypse," an interview from the Bulletin of the Atomic Scientists with the former Army Ranger who led the team that established the U.S. Defense Department policy on autonomous weapons (and has written the upcoming book Army of None: Autonomous Weapons and the Future of War). Paul Scharre makes the case for uninhabited vehicles, robot teammates, and maybe even an outer perimeter of robotic sentries (and, for mobile troops, "a cloud of air and ground robotic systems"). But he also argues that "In general, we should strive to keep humans involved in the lethal force decision-making process as much as is feasible. What exactly that looks like in practice, I honestly don't know."
So does that mean he thinks we'll eventually see the deployment of fully autonomous weapons in combat? I think it's very hard to imagine a world where you physically take the capacity out of the hands of rogue regimes... The technology is so ubiquitous that a reasonably competent programmer could build a crude autonomous weapon in their garage. The idea of putting some kind of nonproliferation regime in place that actually keeps the underlying technology out of the hands of people -- it just seems really naive and not very realistic. I think in that kind of world, you have to anticipate that there are, at a minimum, going to be uses by terrorists and rogue regimes. I think it's more of an open question whether we cross the threshold into a world where nation-states are using them on a large scale.
And if so, I think it's worth asking, what do we mean by "them"? What degree of autonomy? There are automated defensive systems that I would characterize as human-supervised autonomous weapons -- where a human is on the loop and supervising its operation -- in use by at least 30 countries today. They've been in use for decades and really seem to have not brought about the robopocalypse or anything. I'm not sure that those [systems] are particularly problematic. In fact, one could see them as being even more beneficial and valuable in an age when things like robot swarming and cooperative autonomy become more possible.
Re:Just creating them is dangerous. (Score:5, Interesting)
Given a choice, which would you rather have come to your village:
1. A carefully designed, programmed and tested robot
2. A squad of soldiers that haven't slept in two days or eaten in 18 hours, and who just medevaced a comrade who had his leg blown off below the knee by a booby trap
Are you really sure you want a human decision maker "in the loop"?
Re: (Score:3)
Given a choice, which would you rather have come to your village:
1. A carefully designed, programmed and tested robot
2. A squad of soldiers that haven't slept in two days or eaten in 18 hours, and who just medevaced a comrade who had his leg blown off below the knee by a booby trap
Are you really sure you want a human decision maker "in the loop"?
1. A carefully designed, programmed (to kill all humans) and tested robot
Can I just pick neither? (Score:2)
Re: (Score:3)
That greatly depends on the overall strategy for the war.
1. To drain the opposition of support
or
2. To crush the opposition by force
The former is what most Americans in living history now know, like the Korean war, the Vietnam war, the Gulf war, etc., and in that case it's true. You don't have rogue elements, you don't have soldiers on a power trip who rape, pillage and murder. You don't have soldiers who fear for their own lives and who'd rather cause incidental or accidental damage to civilians than risk ge
Re: (Score:2)
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
The argument that war being horrific will prevent it doesn't work very well historically. It's a bit better than "war is profitable", but that's a pretty low barrier.
To take your example, the middle ages are full of dukes/lords/kings leading armies back and forth across various battlefields until the area being fought over was so wrecked it couldn't support the people living there. That it often resulted in the duke/lord/king getting killed didn't stop things. It's not clear it slowed them down much. Peo
Re: (Score:2)
War is supposed to be horrific. That's a big incentive to prevent it.
America participates in more wars that any other country. As an American, if a particular war is too horrific, I can just close my browser tab for that conflict, and read about something else instead.
Re: (Score:3)
"a killing machine with unwavering loyalty and a complete absence of conscious"
That's not the problem.
If you, a civilian police officer, are faced with a terrorist attack by a semi-autonomous robotic rifle, trigger being pulled at maximum rate (or a bump stock being part of the mechanism), immune to small arms fire, able to target accurately out to 100 yards, and able to negotiate stairs and open doors, what do you do, wait for the military to respond?
Such devices are well within reach of determined and mod
Re: Just creating them is dangerous. (Score:1)
Anyone could make your same argument against our police. Have you watched a sci-fi movie recently? There's plenty of them demonstrating how brutal the once good guys could become once they have incredible power and little at risk. Throw in unwavering loyalty and you've got the evil Empire from Star Wars.
Re: (Score:2)
Sure. Different solution.
Glad my example is valid for other use cases.
Re: (Score:2)
immune to small arms fire, able to target accurately out to 100 yards, and able to negotiate stairs and open doors, what do you do, wait for the military to respond?
You need a swarm of remote-controllable drones able to get within sufficient range of the offender and deploy an artificial EMP.
The real problem is how much more quickly an autonomous attacker can kill/hurt a lot of people with precision BEFORE a credible response could be launched, and the fact the terrorist might have the advantage
Re: (Score:2)
"You need a swarm of remote-controllable drones able to get within sufficient range of the offender and deploy an artificial EMP."
Collateral damage prohibits this.
"The real problem is how much more quickly an autonomous attacker can kill/hurt a lot of people with precision BEFORE a credible response could be launched, and the fact the terrorist might have the advantage of targetting the very location, people, or things they need to target in order to snuff out or delay response efforts."
Response time is the
Re: (Score:2)
EMP is not even a real weapon yet.
People keep forgetting that.
Actual EMP weapons (not created by a nuclear blast) are still in the realm of sci-fi.
If they were not, our troops would be using them daily, simply because of how dang useful they'd be in a fight.
Re: (Score:2)
It's not the lack of conscience of the devices, it's the lack of conscience of the criminals building and deploying them. We face enemies that have a different conscience than we do, one incompatible with our typical morals. THAT is the problem. They will build and use weapons of unacceptable, unimaginable brutality and efficiency, all to achieve their goals.
I'm not precisely disagreeing with you, but consider:
It's not the lack of conscience of the devices, it's the lack of conscience of the governments building and deploying them. We face enemies that have a different conscience than we do, one incompatible with our typical morals. THAT is the problem. They will build and use weapons of unacceptable, unimaginable brutality and efficiency, all to achieve their goals.
If our government not building these devices would prevent them from coming into existence, I'd
Re: (Score:2)
Self-defense is likely the least common use case. I'm not concerned about that, but more about police use, where errors are so prevalent that elevating their capabilities will increase the lethality of mistakes/SWATing/overreaction, and the police are not yet held sufficiently accountable to discourage these.
Accountability isn't a uniquely police issue.
The "expert", has spoken (Score:1)
"In general, we should strive to keep humans involved in the lethal force decision-making process as much as is feasible. What exactly that looks like in practice, I honestly don't know."
So, you tell me don't fear something, and then the expert says this statement.
Gee, I feel so much better that we have no idea how to do anything about the pending robopocalypse other than to wag our finger at the evil in the world and say "Remember to play nice and be honest and fair when trying to kill each other."
Re: (Score:2)
The problem is, the MAD solution was unstable even with three players. This is going to start off with dozens of players, so MAD is critically unstable even before the "game" has started.
It's a terrible idea in principle AND practice. (Score:5, Insightful)
Autonomous killing machines are a frankly horrific idea. In principle, machines should serve us, NEVER the other way around. On first principles alone, the idea that a machine could determine whom to kill and whom not to kill is chilling.
And in practice, it's a terrible idea. Human soldiers already face baffling moral situations. A woman with a child at a checkpoint, acting suspiciously. Maybe a suicide bomber. But she has a child. To shoot or not to shoot. That's the kind of thing guaranteed to give a marine a gnarly case of PTSD if he chooses wrong, and possibly also a dead mother and child (or, conversely, a dead platoon). But the possibility of a horrifically wrong choice means that marine is going to deploy every fragment of reason his brain can muster. How the hell would we entrust such a monumental decision to a robot? Its "wrong" choice has no repercussions for it. If it kills an innocent mother, it doesn't care; it's just a thing. If it opts for caution and chooses wrong, it still doesn't care; it's already dead. There's no incentive anywhere up the chain of command to get this right, because "Well, a robot chose badly, sorry, not our fault!" is a get-out-of-jail-free card to just let the bloody thing go RoboCop on a civilian population. We *morally* AND *practically* NEED humans in that decision loop, even if it's just some guy in an air-conditioned office and a VR headset.
Re: (Score:2, Insightful)
We *morally* AND *practically* NEED humans in that decision loop, even if it's just some guy in an air-conditioned office and a VR headset.
Yes! We already have a ban on certain land mines since they kill without a human operator pulling the trigger, often harming innocent non-combatants.
It is easy to deploy weapons that kill someone without you being anywhere near, and that don't require you to make a judgement call. Those weapons kill indiscriminately. And even in war there are rules of engagement that tell us what we can and cannot do, unless you want to be a war criminal. Non-combatants should not be targeted, for example. Perhaps you
Re: (Score:2)
Re: (Score:2)
The problem is, it's stupid and insanely dangerous, but the danger is a long-term danger, and the profit is immediate. And there are LOTS of people who can make one with various degrees of sophistication. Being moral and holding out won't stop this; we need another answer... but what that could be, I don't know.
There's already enough people saying about this book or that "That was supposed to be a warning, not a handbook!". And that argument hasn't changed anything. Yes, 1984 was supposed to be a warning..
Is remote control any better? (Score:2)
Autonomous killing machines are a frankly horrific idea.
Agreed. But the alternative, remote controlled killing machines, seems to be just as bad. We have already seen from leaked videos that soldiers given drones to pilot using a video feed seem to treat bombing people as some sort of fancy computer game.
Given the two options I am not sure which is worse. An emotionless killing machine that follows preset rules of when, and when not, to engage or an emotional human who follows no predetermined patterns but one so removed that they regard your life as much as
Re: Is remote control any better? (Score:2)
It's not really like video game to the drone operators: https://www.nbcnews.com/news/u... [nbcnews.com]
You're quite correct that remote killing machines operated by humans are little better than fully autonomous ones.
Re: (Score:2)
There was an interesting study a few years back that found drone operators suffered wartime PTSD at nearly the same rate (i.e., pretty damn high) as combat soldiers, the implication being that the sort of adrenal escalation that is behind PTSD appears to happen just as much with people who kill remotely as with those who actually hold the gun.
The problem, of course, is that its unlikely the drone operator is going to realise what a head fuck he's in for until he's actually killed. Though I assume the same also appl
Re: (Score:1)
Re: (Score:1)
Backwards (Score:3)
Good lord, you've made the best case for robots I've ever heard. A human being having to make a tough choice that can easily result in an innocent woman's death or his platoon's, and said choice scarring him for life? Even if he makes it correctly? Holy hell, man, protecting people from danger like that is why we build robots. Just offloading the decision to someone else can help prevent PTSD for the soldier on the ground. Robots can make use of sensors at a range to get a better probability assessment it's a
Re: (Score:2)
Easy: You just don't care. We already don't care about drone strikes. Ever look up the number of civilian casualties in Iraq? Just the ones the US government acknowledges are enough to blow your mind. You'd think it'd make headlines. But it barely even registers.
See in war if you're not personally getting blown to bits and your family ain't then it's easy peasy to just ignore it.
Re: (Score:3)
machines should serve us, NEVER the other way around.
I'm not afraid of whether we can keep autonomous machines on a leash. The question is, who holds the leash?
Re: (Score:2)
You raise a valid point, but the problem is it depends entirely on the chosen "rules of engagement". And various governments have caused me to doubt that they would be merciful.
No qualms about killing a bot (Score:2, Insightful)
If I was in a gunfight I might think twice about killing another human. But a bot? No hesitation whatsoever. I'd blast the motherfucker.
Sending bots just sounds like an expensive way to flush money down the toilet.
Re: (Score:2, Insightful)
If I was in a gunfight I might think twice about killing another human. But a bot? No hesitation whatsoever.
I'm pretty sure the bot feels nothing about the prospect of killing you as well.
Re: (Score:2)
But I'm not sure if that's better or worse than humans, to be honest.
With a human, they might be told to exterminate your village, but they might not be able to do it. They also might be told to keep casualties to a minimum, and "look out, snipers!" and now they've got the rationale for exterminating your village.
The robots are either set to exterminate the village or to select targets carefully. You get what you get, and there is a lot less uncertainty. Whether or not that's worse than sending scared/angry
Recommended watch: Slaughterbots (Score:2, Informative)
Slaughterbots: https://www.youtube.com/watch?v=9CO6M2HsoIA&t= [youtube.com]
Re: (Score:3)
Also that Black Mirror episode [youtube.com]
Re: (Score:3)
Wasn't sure if you'd link to that one or this one. [youtube.com]
remote island (Score:1)
Let them battle it out on a deserted remote island; whoever's AI robot army is still standing in the end wins.
Of course, then the question becomes: why not just run a computer simulation instead?
Re: (Score:3)
Let them battle it out on a deserted remote island; whoever's AI robot army is still standing in the end wins.
General Zaroff and the BattleBots? That's one hell of a mash-up. Cool band name.
Of course, then the question becomes: why not just run a computer simulation instead?
Telling the kids it's just a game will be the prescription to prevent PTSD. Let's just hope they don't talk to Ender Wiggin.
Re: (Score:2)
Replace the police and SWAT teams by them (Score:5, Interesting)
It will be a lot safer for the public: no more cops who claim they were "under pressure" or "afraid" so they had to shoot first. A robot can afford to shoot last, only when fired upon first.
Re: (Score:2)
Send in the robots with 24/7 aerial surveillance to counter all wifi, internet attempts to send out live streams, upload video clips.
Set up a pathway out of the area for the citizens who want to surrender, be searched, and be moved to another city.
Move in the robots to try some pacification on the looters who did not take up the offer to be searched and exit the area.
Remove power, water, block networks and wait for the criminals and looters who stayed to try and escape. Rob
Re: (Score:2)
It will be a lot safer for the public: no more cops who claim they were "under pressure" or "afraid" so they had to shoot first. A robot can afford to shoot last, only when fired upon first.
I am sure law enforcement will be responsible when using this technology.
https://en.wikipedia.org/wiki/... [wikipedia.org]
"policy on autonomous weapons" (Score:5, Interesting)
As we all know, the enemy always follows the rules. Right on back to some dirty farmers hiding in the woods not following the British 'rules of war'.
The technology to build these things is not difficult. In the US a gun is easiest part of the puzzle. Toss in some OpenCV, webcam, a solenoid and you can have your own private sentry.
Comment removed (Score:5, Insightful)
Re: (Score:2)
War is about control, yes. But amazingly, no matter how great your nuclear arsenal... you can still bash peoples' heads in with a rock.
Information-based warfare does not preclude physical violence. And if a group with the capacity to wage physical conflict decides it is losing the more civilized digital conflict, it can always fall back on guns and bombs.
Re: (Score:2)
What exactly that looks like in practice... (Score:2)
The free fire zones over Vietnam https://en.wikipedia.org/wiki/... [wikipedia.org]
The British response in the Second Boer War https://en.wikipedia.org/wiki/... [wikipedia.org]
But with robots.
Mines (Score:1)
Re: Mines (Score:1)
That's a great comparison, but anti-personnel mines are illegal: https://www.un.org/disarmament... [un.org]
Re: (Score:2)
Re: (Score:1)
And of course, ISIS would never disobey the U.N.
If you're insinuating that respectable nations should copy the deplorable tactics of a terrorist organization to defeat them, then what separates the two? Should the US torture and behead its enemies too? I think we'd be better off researching ways to mitigate the actions of the terrorists rather than paving the way for them. For instance, there are probably dozens of ways to blind or otherwise incapacitate drones.
Not "if" but "when" (Score:2)
I really can't think of a reason why a military would not develop such weapons.
Missing the point (Score:2)
Re: (Score:2)
Re: (Score:2)
The point of war is to defeat the enemy. This can be done by physical or nonphysical means, including mostly bloodless ones. One of Napoleon's greatest victories was the nearly bloodless maneuver around Ulm in 1805, in which he forced the surrender of an enemy army without serious fighting.
Now, all of these nonphysical means rely on physical force, so it's important to have the ability to kill enemies, but it doesn't necessarily have to be used.
We fear only the "Second Variety" (Score:2)
Not the current crap.
US already uses lethal autonomous machines (Score:4, Insightful)
Unlike a growing number of countries, the US hasn't yet agreed to a ban on the use of land mines.
These (simple) machines can automatically and indiscriminately kill, with essentially no protection against civilian deaths, and can remain active for many, many years.
Re: (Score:2)
Smart bullets wouldn't work. Not as anything self propelling. If you make something that small, you need to make it a target seeking poison injector. Sort of like a mechanical mosquito. Diphtheria toxin would be effective. It would be nice if it could really target seek, then you could target the spine at the base of the neck and use Novocaine to temporarily paralyze them. (I think that would work, but it might stop autonomic as well as voluntary muscles, which would kill rather than paralyze.)
What for
It really doesn't matter (Score:1)
This cat is out of the bag. There's little point in debating the ethics of it. Once the technology exists, and it already does, it can and will be abused. No matter how many countries say "we won't do it," it will be done. We may as well have them, because as sure as the sun sets in the west, someone else will.
False Positive (Score:2)
Anyone who has installed an alarm system into their home and has set it off accidentally -- should understand the main issue with arming your alarm system.
Nope (Score:1)
forget about combat.... (Score:1)
forget about combat.... ...these will be fantastic for robbing banks!
hypothetically speaking, of course!
Needs more cow bell ... (Score:2)
... Come on baby, don't fear the robopocalypse
Baby take my hand, don't fear the robopocalypse
We'll be able to fly, don't fear the robopocalypse
Baby I'm your man
- New Oyster Cult
My usual on the irony of autonomous weapons... (Score:2)
http://www.pdfernhout.net/reco... [pdfernhout.net] "Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar panels) to replace oil, or wh
Re:They're ready: except costs (Score:5, Insightful)
Re: (Score:2)
A deployed soldier can be upwards of half a million a year in costs.
[Citation needed]
Re:They're ready: except costs (Score:5, Informative)
A deployed soldier can be upwards of half a million a year in costs.
[Citation needed]
In the land of $100 hammers and $1000 toilet seats, you really need proof of this? Give me a fucking break.
Yes, we need a citation, because the $500k number is total bullcrap. It is implausible that a deployed combat soldier, at the far end of a 10,000 mile supply chain, costs so little.
Here is a citation [cnn.com] that the actual cost is $850k to $1.4 million per soldier per year.
War ain't cheap.
Re:They're ready: except costs (Score:4, Informative)
This does not surprise me at all, especially as the cost is arrived at by dividing the total campaign cost by the number of soldiers.
Everything the US military does costs an eye-popping amount. The V-22 Osprey costs $64,000/hour to operate. The Bradley AFV costs over $50 for every mile driven. Recoilless rifle ammunition runs between $500 and $3,000 per round. Every time an A-10 opens up its mighty 3,900-round-per-minute cannon, each of those rounds costs $150.
The current administration's plans for increases in troop levels in Afghanistan are expected to cost the US taxpayer over a trillion dollars when all the downstream costs are included. In return they hope to secure access to about a trillion dollars in mineral reserves for US companies.
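The divide-total-by-headcount arithmetic in this thread is easy to sketch. Note this is a back-of-the-envelope illustration only: the rate-of-fire and per-round figures are the thread's example numbers, and the campaign cost and headcount below are assumed round numbers, not official data.

```python
def cost_per_soldier(total_campaign_cost, num_soldiers):
    """Average annual cost per deployed soldier (total divided by headcount)."""
    return total_campaign_cost / num_soldiers

def cannon_burst_cost(rounds_per_minute, cost_per_round, seconds):
    """Ammunition cost of firing for `seconds` at the stated rate."""
    return rounds_per_minute / 60 * seconds * cost_per_round

# Illustrative: a $100B/year campaign with 100,000 deployed soldiers
# lands squarely in the $850k-$1.4M per soldier-year range quoted above.
print(cost_per_soldier(100e9, 100_000))   # 1000000.0

# A 2-second burst at 3,900 rounds/minute and $150/round:
print(cannon_burst_cost(3900, 150, 2))    # 19500.0
```

At 3,900 rounds/minute that's 65 rounds per second, so even a short burst runs to five figures in ammunition alone, which is the point the comment is making.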
Re: (Score:2)
I should have been clearer: it will bring the total cost for the war into the trillion dollar range, counting all downstream costs.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I'd be surprised if it was as cheap as half a million.
Re: (Score:2)
Actually, that's conservative. CNN reports [cnn.com] $850,000 to $1.4 million for US soldiers in Afghanistan.
Re: (Score:2)
Re: (Score:2)
I think we have well and truly crossed the cost line now. Trained soldiers are expensive.
I think the greatest concern with autonomous weapons is they can entirely change the rules of the game re. asymmetric warfare.
Autonomous large-scale-deployable indiscriminate weapons can be just a less-efficient form of other WMDs such as Chemical Weapons, which are also banned.
Imagine a country deploys 10000 killer drones over a small county to spread panic and fear.
If the assailant has autonomous weapons in s
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Except we want war to be expensive. The cost is the only thing that keeps the warmongers in check. Or rather, it's the cost associated with protection and the fear of not being able to meet it, that keeps the warmongers in check. Huge expense is about the only thing that makes these psychopaths come to the negotiating table. Imagine what would happen if war was cheap. Rather than negotiate for mineral rights so that everyone benefits, you'd just have armed drones swoop in and
Thank you very much (Score:3)
Re: (Score:2)
So drone warfare will make war less expensive so more people can participate? Yea, that sounds like a good idea. Let's do that.
Re: (Score:3)
The recent drone attack on an army base has more or less convinced me that even non-state actors are finding automated weaponry a good choice. So costs can't be too much of a barrier.
OTOH, flexibility is still superior for the human. Most robots can only deal with a relatively small number of cases, and none are even approximately as good at self-repair. (But it's easier to replace parts...so that may be an even trade-off, except for costs.)
It strikes me as an extremely dangerous direction to head, but i
A very good point (Score:2)
They even up the playing field, because muslims don't value life (even their own) and will send suicide bombers. .
Actually this is a very good point. Is it worse to build an unfeeling robot to fight, or to remove all human emotion and compassion from humans like Islam does with its vial belief system?
Re: (Score:2)
Re: (Score:2)
The current version of this stuff is already so cheap it's being used by non-state actors. There was a drone assault on an army base recently. The attackers did not identify themselves.
This stuff is coming. It may be unfortunate (probably is), but it's cheap enough and easy enough that primitive versions are already in use. So far they all depend on remote control, but simple versions that don't are easy...they just aren't as flexible. About all you need to add is a GPS starting point, an inertial guid