US Military Moving Closer To Automated Killing 472
Doofus writes "A recent article in the Washington Post, A future for drones: Automated killing, describes the steady progress the military is making toward fully autonomous networks of targeting and killing machines. Does this (concern|scare|disgust) any of you? Quoting: 'After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look. Target confirmed. This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial "Terminators," minus beefcake and time travel.' The article goes on to discuss the dangers of surrendering to fully autonomous killing, concerns about the potential for 'atrocities,' and the nature of what we call 'common sense.'"
Better computers than humans (Score:2, Insightful)
Re: (Score:2)
I've always wanted to cream the Blue Team in Paintball. From home. I wonder when this tech will be available for toy companies. Especially when the 2012 Geneva Convention on Laws of Armed Robots in Combat declares them unfit, thereby resulting in a black market for jailbroken drones.
Re:Better computers than humans (Score:4, Insightful)
I don't think that having good ol' fashioned humans die in our wars is morally required of a sovereign people, rather, I question how we can truly feel ownership of our society if we do not control it, protect it, assist it and direct it.
I think there's another issue to consider before we even get near to asking questions about "societal ownership".
Automating front-line offensive & defensive forces makes it much easier for a government to use its military might against its own citizens, as there will be far less of a problem with human officers and front-line soldiers refusing to open fire on their fellow citizens, or refusing to issue orders to that effect.
Somebody in the White House, Pentagon, or some military installation just types a command and pushes the "Enter" key and people are automatically hunted down and killed. A tyrant's dream.
Robots and drones are already being utilized in domestic law enforcement, so how long would it be before these fully-automated weapons systems were used domestically? You know they will be eventually if we allow it. History shows us that human nature is all too predictable when it comes to governments having immense power over relatively defenseless citizens. Governments always seek more power & control, and it never ends well once they achieve a large amount of it.
Strat
Re: (Score:2)
Same thing as cars that drive themselves. People die on the road every day due to human error, but the moment a car with no driver crashes into something or hurts someone, all hell will break loose.
This thing could be way more effective than any man at doing its job. One mistake and it'd be dead.
There is this misconception that humans can fail but machines can't. What they forget is that the men who built the machine were human too, so it will never be perfect.
Re:Better computers than humans (Score:4, Insightful)
That's all well and good, but I am more concerned about our robotic overlords commanded by the one or few who need killers without conscience and without any sense of right or wrong.
We already have a government in the US that felt it was necessary to use contractors to perform acts beyond what military service members should do. But that's not good enough. They want killers who will kill, ask no questions, and tell no one about it.
Landmines (Score:5, Insightful)
Landmines do automated killing every day!
Re: (Score:2)
Re: (Score:3, Informative)
Which is why civilised countries [wikipedia.org] have already outlawed them. No decent human could encourage the spread of things which kill many civilians, animals and, in most cases, mine clearers for every attacking soldier they kill.
N.B. the treaty still allows anti-tank mines and even remotely triggered claymore mines, so it's still possible to do area denial against serious military forces. I will give the Koreans a small out in this case, in that their border was divided this way long before the treaty.
Not necessarily civilized (Score:5, Informative)
Just politically correct. The US already has policies in place that effectively meet and exceed the goals of the Ottawa treaty.
We stopped selling mines and destroyed old stockpiles. We have spent over a billion dollars clearing mines and helping victims (usually not our mines). Our new mines are self-destructing or self-disarming, and policy is to not place one without its position being recorded, and to remove it from any battlefield after its need has passed.
Even with that, the only place we actually use them is the Korean DMZ. The last time we used them in combat was the Gulf War, in limited numbers. These were scatterable mines, fired or dropped to a specific grid coordinate to deny that small area to the enemy. Since this was their first use, we did make mistakes; apparently not every shot was recorded and reported for later cleanup. Rules for their use have since been changed, and by now they should be converted to self-destructing or self-disarming anyway.
Re:Not necessarily civilized (Score:5, Informative)
Beyond that, the US has offered to sign the Ottawa treaty if an exemption for land mines in the Korean DMZ were allowed.
Re: (Score:2)
Bingo. The US has spent years phasing out land mines, and if it wasn't for the Korean DMZ, it would be a signatory to the Ottawa Treaty. It would be a backwards step if they built new weapons where humans do not make the targeting decision.
Re: (Score:2)
The word is that, with high-tech drones, they hope one day to get a better collateral-to-enemy ratio than with landmines.
Re: (Score:2)
And they are a pestilence in the areas where they are unleashed. Their main victims post-war are usually children who accidentally step on them while playing.
Re:Have we sunk to this as a nation? (Score:5, Funny)
Looks like they automated Godwin too.
Automated job killing (Score:2)
When these are combat ready, there will be many unemployed soldiers.
Re: (Score:3)
Re: (Score:3)
Re:Automated job killing (Score:4, Interesting)
Re:Automated job killing (Score:4, Informative)
Re: (Score:2)
Re: (Score:3)
I assume you're referring to Ron Paul here, but you're somewhat wrong about the idea that the only opposition to war comes from libertarians. Among others, you can point to Ron Paul's frequent Democratic ally on stopping wars, Dennis Kucinich - he's staunchly anti-war, and staunchly pro-welfare, and polls about as well nationally as Ron Paul.
Have automated enemies too (Score:3)
Move all violence to online simulations.
Re: (Score:3)
Re: (Score:2)
Yeah, but we don't have to execute them. Run a simulation, find out who would have won, and let people live their lives as they wish while the politicians play their games without hurting anyone!
not even competent, extremely experimental (Score:5, Insightful)
Re:not even competent, extremely experimental (Score:5, Insightful)
Camouflaged tanks in a forest shouldn't be too hard. Telling the difference between a soldier and a civilian - now that's a challenge.
Re: (Score:2)
Re: (Score:2)
Even humans find that hard, so if they can get the accuracy level up it might rival what we can do despite being imperfect.
Seeing all these 'liberations' ... (Score:2)
Re: (Score:2)
Not really: a soldier is a true hit and a civilian is collateral damage. Easy solution, ain't it?
Re: (Score:2)
...it would have a field day at a picnic party.
Re: (Score:2)
Re:not even competent, extremely experimental (Score:4, Funny)
I think at that point it might be a wee bit late. Today it's an orange tarp... tomorrow it's a camouflaged tank in a forest... and the day after it's a guy wearing a red-and-white striped shirt in a crowd.
$ cat killbot.log
Scanning crowd...
Target "Waldo" located.
Servo batteries one, two, and three lock on... fire!
Target "Waldo" destroyed.
$
Everything old is new again (Score:2)
Science fiction writer Cordwainer Smith called them "manshonyaggers" in a story published back in the 1950s. The word was supposed to be a far-future corruption of the German "Menschen" and "Jäger" - roughly, "man-hunter".
It looks like his future is going to get here a lot faster than he thought.
Re: (Score:2)
It probably won't be the Mark Elf and the Vom Achts though; it'll be the MQ-35 Wrath-bringer, and it'll only respond to commands prefaced with "God Bless America".
Solution (Score:3, Insightful)
Why don't we, instead of perfecting our killing methods, simply stop initiating economy destroying pointless wars?
I'm excited about all the trickle-down technology that'll eventually become consumer grade fare, and I appreciate the advancement in various technology that war brings, but I would much prefer it if the US stopped economically destroying itself (while giving the Middle East a "Great Satan" to fight) and instead let them get back to killing each other over tiny differences in interpretation of fundamentalist Islam.
Not even Bob the Builder can fix the Middle East at the moment. Not when you have God handing out the real estate titles and commanding the thousands of various splinter cells to annihilate everything that's not exactly identical to themselves, as trillions of dollars of oil money pour into the region to feed and fund it all.
Re: (Score:2)
What's bad for one part of the economy may be good for another part. What's 'good' for the economy is a matter of debate. I know it's a tiredly overused example, but if you owned stock in Halliburton in 2000 and hung onto it I'm sure you'd think that these pointless wars are pretty good for the economy.
Overall, I agree with your comments, but I don't think the pointless wars were a major drag on our economy. If anything, they probably helped. Lowering taxes during wartime - now that's a classic economic no-no.
Re: (Score:3)
Why don't we, instead of perfecting our killing methods, simply stop initiating economy destroying pointless wars?
Because war is a fantastically good way to seriously sort your country out. All you need to do is have a great big war and lose it.
Sure, it takes a few years but look at, say, Germany or Japan today versus where they were in 1945.
I reckon that's what the US is doing. Starting all these wars with a view to losing one.
Great idea for a movie. (Score:3)
Someone should make a movie about this. . .
If ever there was a story deserving... (Score:2)
...of a whatcouldgowrong tag, this would be it.
As long as the algorithm can't be a scapgoat (Score:2)
As long as the soldier who pushes the button to activate the drones is as responsible as the one who pushes the button to drop a dumb bomb, then I don't really see the issue.
As long as someone mucking up and causing friendly fire or collateral damage is equally liable, then I think this is just an arms race that has the potential to avoid casualties.
When you can start shoving blame around, so the soldier blames the programmer and vice versa, is where this becomes dangerous, I think. If the soldier can blame so
Re: (Score:2)
Not that anything past the first line of your response is relevant to the GP. I hate ACs that do this.
Defending against this sort of weapon is pretty much like attacking a minefield; utterly pointless.
I don't know what the compensation for a dead soldier is, but I don't think it's the millions it costs when one of these goes down. It's going to be cheaper to have soldiers than drones for a long time yet; the financial cost will hurt as much as the human cost. I think EMP technology will become a high priority as well.
You are making stupid hypothetical assumptions; the US still wins its wars (which are being calle
OMFG, mistakes will be made! (Score:2)
Yep, autonomous machines are certain to make mistakes and kill people who aren't the target, who are innocent, don't really deserve to die, etc.
So what?
Humans make those mistakes now, in abundance: friendly fire, massacres, genocide, innocent bystanders... you name it. What difference does it make whether it's a human or some artificial pseudo-intelligence that makes the mistakes?
I'll tell you what the difference is: one less human on the battlefield, and thus at the least one less human that can die from
Re: (Score:2)
Re: (Score:2)
So... program the machines to "feel remorse". That one should be easy, since remorse is (a) recognizing a possible mistake, (b) analyzing the causal decisions and events, and (c) altering the decision process to minimize repeating the same pattern. Sounds pretty straightforward to me, unlike some other emotions.
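That three-step loop is simple enough to sketch. A toy illustration in Python (the class, the confidence threshold, and the whole mechanism are my own invention for this comment, not anything from TFA):

```python
# Hypothetical sketch of "machine remorse" as a feedback loop:
# (a) recognize a possible mistake after the fact,
# (b) tie it back to the causal decision (an act at some confidence),
# (c) adjust the decision process so the pattern isn't repeated.

class RemorsefulDecider:
    def __init__(self, threshold=0.9):
        self.threshold = threshold  # confidence required to act

    def decide(self, confidence):
        # Act only when confidence clears the current threshold.
        return confidence >= self.threshold

    def review(self, confidence, was_mistake):
        # (a) a mistake was recognized after the fact
        if was_mistake:
            # (b) the causal decision was acting at this confidence level;
            # (c) raise the bar so the same decision is refused next time.
            self.threshold = max(self.threshold, confidence + 0.01)

d = RemorsefulDecider()
print(d.decide(0.95))              # True: acts at 0.95 confidence
d.review(0.95, was_mistake=True)   # "remorse" over that act
print(d.decide(0.95))              # False: same confidence now refused
```

Of course, this only captures the control-loop part; step (a), actually recognizing the mistake, is the part nobody knows how to automate.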
What could go wrong? (Score:2)
Could it kill 9 people and wound 14?
http://www.wired.com/dangerroom/2007/10/robot-cannon-ki/ [wired.com]
Toys. (Score:2)
I guess this explains... (Score:2)
the purpose of attackwatch.com [barackobama.com]
But they forgot to leave a way to upload pictures of the targets to be terminated. Oops.
Gone Fishing (Score:2)
Between globalization and robots it appears the golden age of leisure* is closer than ever.
* Where golden age of leisure = mass unemployment and food riots
Not Gonna Happen (Score:2)
There is no way that the military is going to permit autonomous combatant units. At least, not without having a stake put through its brain.
For starters, the PR would be through the floor if even one of these things killed a civilian (though I guess with how callous the US has been towards civilian collateral casualties for the past ten years, that might not be a big deal.)
The other main reason is that there's no way a manly man is ever going to give up on the idea of manly soldiers charging manly into bat
Re: (Score:2)
The US doesn't need autonomous killing machines. Sure, the US will develop them, but so long as the Americans are busy busting on sheep herders armed with AK47s, they won't use them. You might get to the point where drones are doing everything but pull the trigger, but having a human in the loop approving all death and destruction is cheap and easy. You don't gain anything from fully autonomous killing machines when you are fighting peasants with shitty guns.
The US will develop the technology thoug
Re: (Score:2)
"There is no way that the military is going to permit autonomous combatant units. At least, not without having a stake put through its brain."
You are implicitly assuming that the USA will be fighting inferior enemies in the future and thus will be more concerned about bad PR than coming out on top. A potential future conventional conflict with a heavily armed opponent capable of inflicting millions of casualties will change that (most likely China but there are also other potential candidates). And in such
"...out of the hands of humans" is a misnomer (Score:2)
Examples:
IF a target is a unique type of vehicle that can be easily identified by target recognition software that _already_ does this for normal pilots AND said target is within a set of coordinates that are known to only contain hostile vehicles of that type, THEN kill it, otherwise seek human double-check and weapons release confirmation.
If a target is in an area known to not contain friendlies and is detected firing a missile or weapon (like an AA gun for example), then kill it.
If there are friendlies o
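Those examples amount to a priority-ordered rule table with a human-confirmation fallback. A toy sketch (every rule and field name here is invented for illustration; no real targeting system looks like this):

```python
# Toy sketch of "autonomy with a human fallback" rules of engagement.
# All rules and field names are hypothetical illustrations of the
# parent's examples, not any real system's logic.

def weapons_decision(target):
    # Rule 1: target type positively identified by recognition software,
    # and inside a coordinate box known to contain only hostiles.
    if target["type_confirmed"] and target["in_hostile_only_zone"]:
        return "engage"
    # Rule 2: detected firing a weapon in an area known to be
    # free of friendlies.
    if target["firing_detected"] and not target["friendlies_nearby"]:
        return "engage"
    # Anything else: seek a human double-check and weapons
    # release confirmation.
    return "request_human_confirmation"

print(weapons_decision({"type_confirmed": True,
                        "in_hostile_only_zone": True,
                        "firing_detected": False,
                        "friendlies_nearby": False}))   # engage
```

The point being that "out of the hands of humans" is a spectrum: the interesting design question is which branches fall through to the human.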
Cliche but... (Score:5, Funny)
If ever there was an appropriate time for the "whatcouldpossiblygowrong" tag, this is it.
This does concern me (Score:2)
"Doofus" (Score:2)
Does this (concern|scare|disgust) any of you?
Why am I limited to these choices? Groupthink much?
Likely applications of automated killing (Score:2)
We're quite likely to see systems that kill anybody who is shooting at friendly troops. The U.S. Army has had artillery radar systems [fas.org] for years which detect incoming shells and accurately return fire. There have been attempts to scale that down to man-portable size, but so far the systems have been too heavy.
Sooner or later, probably sooner, someone will put something like that on a combat robot.
Autonomous kills means no one is responsible! (Score:2)
The most dangerous thing about this is that when a glitch, bug, or piece of malware causes a plane to blow up a wedding, no one is responsible. No one ordered it, and no one can be punished for it.
Something tells me (Score:2)
This is not good (Score:2)
Berserkers got their start that way (Score:2)
What bothers me... (Score:4, Insightful)
What bothers me is these things make war easier to wage. When Americans aren't coming home in coffins, it's a lot easier for the public and politicians to accept war, therefore we're more likely to start wars.
If we're risking our own soldiers and pilots, at least we might think twice and look for other solutions before starting a war. However, once you've made war palatable to your own public, too often it becomes the first resort, especially amongst the hawkish (and the religious right versus non-Christian enemies).
Re:What bothers me... (Score:4, Funny)
Perhaps they could call such a system WOPR :-)
Before Skynet, there was Strangelove (Score:2)
President Merkin Muffley: General Turgidson, I find this very difficult to understand. I was under the impression that I was the only one in authority to order the use of nuclear weapons.
Leave the Killing to the Humans (Score:2)
This sounds like a bad idea to me as far as fighting wars go.
War is supposed to be up close, personal, and horrific. Letting machines handle the dirty work removes a large amount of the deterrence that should be inherent in pursuing a war. Knowing the horrors of war should be a big motivator in seeking alternatives to it.
What's next? Just have computers simulate attacks, calculate damage and casualties, and then those on the casualty list report to a termination center?
Re: (Score:2)
Where does it say that? The article is discussing systems that don't require human approval for a kill.
Re: (Score:2)
Re:not autonomous (Score:5, Interesting)
I read somewhere recently a quote that, IIRC, was from Churchill. It was something about avoiding war, but if you must fight, fight with severity, for that is the most humane. I think that applies here. Though it sounds incredibly cruel, if people are not dying in your war, there will be no incentive for either side to stop.
Of course, Gadhafi, Hussein, Stalin, and similar madmen are somewhat of a counter example in that they don't give up no matter how many of their side are killed. Yet Japan in WWII is an example of the ruthless severity (nuclear bombs) causing an immediate and complete cessation of any attempts to create war.
Even modern times with Gadhafi and Hussein, the invasion of Iraq was much more severe than the Libyan rebels, thus the shorter amount of time to cause the government to capitulate. (Getting the rest of the population to stop fighting, much harder... we'll see how Libya does without the outside intervention.)
Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.
Re: (Score:2)
Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.
There will always be conflict. War is just one method to resolve conflict. Legal fights are another. Negotiation is another. Robot wars is one potential future method. In my opinion, machines killing each other is vastly preferable to people killing each other, people who would be brothers in a different situation.
Re: (Score:2)
You've got it the wrong way around.
The idea of winning a war by way of killing so many of the opposition that the rest will surrender or retreat is viable some of the time, but horrific. And truth be told, it doesn't work nearly as well in real life as it does on paper; people are unpredictable creatures at the best of times, and there are plenty of cases of soldiers or entire armies fighting to the very last, horrific fate be damned, rather than surrender. In particular populations and politicians may fa
Re: (Score:2)
the way that's had a better track record of making wars end, is to destroy the ability of the enemy to make war altogether.
Indeed, that's precisely what Allies were doing with those firebombings of Germany and Japan back in WW2. Keeping in mind that (civilian) workers manning the factories are a crucial resource required for making war...
Re: (Score:2)
Some hard-line Japanese had wanted to keep fighting even after the atomic bombs were dropped.
http://en.wikipedia.org/wiki/Ky%C5%ABj%C5%8D_Incident [wikipedia.org], for example
Re: (Score:2)
Yet Japan in WWII is an example of the ruthless severity (nuclear bombs) causing an immediate and complete cessation of any attempts to create war.
Even so, it almost didn't work. There was a coordinated attempt by some in the Japanese military to kidnap Emperor Hirohito and prevent him from capitulating to the United States.
Re: (Score:2)
Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.
Isn't the problem with this really that robot v robot doesn't actually resolve anything? I.e., one side will simply destroy the other side's robots eventually, but then what happens? Just because their robots are gone, doesn't mean the loser of that part of the war simply surrenders. Instead the humans then pick up guns and fight the remaining robots/other humans from the other side.
E.g. if China is invading your homeland, and their robots beat your robots, does your homeland just surrender after the rob
Re: (Score:2)
"Anyway, the point is that robot vs robot is war by proxy."
When I was a little kid, I read a sci-fi story (in an anthology, more than likely--I devoured them so fast I rarely remembered the authors' names) that was based on the premise that humans had spread throughout the stars, and in the process discovered a planet that had an indigenous race of diminutive humanoids. This race of humanoids was divided into clans and was in a multi-fronted, never-ending state of war--a total free-for-all. If I remember cor
Re: (Score:2)
It is worth mentioning that it was not immediate and that a Russian force was preparing to invade. The Japanese had a very good idea of what a Russian occupation would be like and that was a major influence in surrendering to the USA.
History is too messy to be told as a simple fairy tale with no substance other than cheering for your home team.
Re: (Score:2)
Wouldn't politicians killing other politicians be even better? Less pollution, and you're not feeding the ravenous beast, "The Global Military Industrial Complex" (apparently they are now colluding on a multinational basis to keep mass-murdering, high-profit wars going). All you need is a bunch of clubs and some campaign contributions; let them go at it, and the winner is the general public.
Those chicken hawk politicians that want war, let them fight it themselves.
Re: (Score:2)
a future where war is limited to robots killing other robots, and not humans killing each other, is a GOOD THING.
That is true in a naive way; the question is which countries the US is going to attack that can afford drones. The reality is going to be drones killing brown people with AK47s, and of course people with random objects that resemble AK47s, and people who are standing in the wrong place at the wrong time. Like the status quo, I guess.
Re:War is power. (Score:5, Insightful)
All power comes from the barrel of a gun. Aimed at you - to make you comply. Willingly, or otherwise.
All power comes from being able to make someone happy. Really, think about it. A gun is no guarantee that someone will comply. If they feel certain you will shoot, then it has almost no power at all. The power of a gun comes from the fact that you MIGHT make them happy by not killing them.
If your goal is to get people to do something, you'll do much better paying them than trying to threaten them. And if you can make them happy in other ways, you may be even more powerful than merely with money.
Obama didn't obtain the most powerful office in the world by threatening to kill people (King George tried that, and got a revolution). He got votes by giving people hope for change. How much change he delivered is a different thing (certainly he delivered some), but people were happy to believe that it might be true. So they voted for him.
The reality of power is different than what a lot of people think.
Re: (Score:2, Interesting)
"All power comes from being able to make someone happy. Really, think about it. A gun is no guarantee that someone will comply. "
I don't know if you've noticed, but you live in a world of millions of suffering people: you have billionaires and homeless people, not for a lack of homes but for a lack of guns on the part of the powerless to kill/subdue the rich. There is no rational reason to have as much suffering as we do in the developed world because of capitalism, but most people fear guns.
Re: (Score:3)
People voted for Obama because they believed he would do violence to others rather than themselves.
You either have a weird definition of violence, or a weird idea of your fellow citizens. I know nobody who voted for Obama because they thought he would do violence to someone.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
What's the difference between these organizations, and government? As you pointed out, they wouldn't work unless most members were agreeable.
Besides, I was saying that violence is the distinction between a cultural leader, and a governmental one.
Re: (Score:2)
The more classic definition:
Government is the monopoly of violence
They are violent and do not allow anyone else to be. Criminal gangs usually care only about profit; if someone is violent in their territory but profit is not at risk, they are not concerned. Of course, powerful criminal gangs in a weak country can fill the void of power and begin working partially as a government (becoming more like warlords).
You could argue that, in order for that monopoly to be effective, the government needs some bac
Re: (Score:2)
Re: (Score:2)
Besides, I was saying that violence is the distinction between a cultural leader, and a governmental one.
ok, that's probably a valid distinction.
Re: (Score:2)
Most criminals don't agree with being put in jail. You have to use the threat of violence to put them there. That is government in its purest, most basic form. Take that away, and you no longer have government.
Re: (Score:2)
How did you think he was going to accomplish that?
What liberty? (Score:3)
There are a lot of treaties that try to limit the number of nukes, land mines and other non-discr
Re: (Score:2)
And it has worked...how many terrorist attacks have there been on US soil in the past decade?
Since I started wearing an onion on my belt, my computer has not had any infections. It works!
Re: (Score:2)
Pushing a button in the CIC, as opposed to pulling a trigger right in front of you, removes the thought process from the act of killing.
Logic dictates: No thinking -> easily ignore moral implications -> war crimes easy as pie.
No my friend, some things should never be automated, lest the robots rule our world.
Wow, times must have changed... (Score:2)
I will admit that we do have a few bad apples (any large population will have outliers). But to use those as a basis to excoriate us as a whole...my friend, you are sorely mistaken.
My, have things changed. I was always taught that the honor of the unit lies with each man...
Oblig Stalin quote (Score:2)
Where is the Soviet Union now?
Re: (Score:3)
Re: (Score:2)
Because at the end of the day you still have to break the will of your opponent and have them do something they wouldn't ordinarily do. Chances are 'you lost at Rock 'Em Sock 'Em, so you now need to hand over your port' wouldn't be overly persuasive.
Re: (Score:2)
The reason why it won't be that way is because the side which will not bother with this kind of thing, will win a robot-vs-robot war.
Re: (Score:2)
I'm sure they will only use this in countries populated by brown (non-white) people that speak in funny languages.
Funny languages - you mean like French?
Re: (Score:2)
"our military has an unusually high percentage of people "with brown skin" both doing the killing and in positions of leadership".
Because soldiers are historically recruited from the lower classes.
When President Harry S. Truman desegregated the military in 1948, African-Americans saw the Army as a key avenue for advancement. Joining up became "a way out of a worse situation," said Gregory A. Black, a retired Navy dive commander and creator of blackmilitaryworld.com, a website devoted to the history of African-Americans in the military.
Re: (Score:2)
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
Think about it: If, as the "robots" see it (i.e. are programmed), NOT killing a few gazillion people would harm humanity -- well, then we'd better kill them! No?
The moral of this: You cannot program morals! (At least not easily.)