US Military Moving Closer To Automated Killing 472

Doofus writes "A recent article in the Washington Post, A future for drones: Automated killing, describes the steady progress the military is making toward fully autonomous networks of targeting and killing machines. Does this (concern|scare|disgust) any of you? Quoting: 'After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look. Target confirmed. This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial "Terminators," minus beefcake and time travel.' The article goes on to discuss the dangers of surrendering to fully autonomous killing, concerns about the potential for 'atrocities,' and the nature of what we call 'common sense.'"
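
For readers curious about the mechanics, the exercise described above is essentially a cross-cueing confirmation chain: each platform looks at the candidate target independently, and it is only "confirmed" when every stage agrees. Below is a minimal sketch of that idea; all platform names and the confidence threshold are hypothetical illustrations, not details from the article:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    platform: str       # e.g. "drone-1", "drone-2", "ugv" (hypothetical names)
    confidence: float   # 0..1 score from that platform's own sensor

def confirm_target(detections, required=("drone-1", "drone-2", "ugv"), threshold=0.9):
    """Confirm only if every required platform independently agrees."""
    scores = {d.platform: d.confidence for d in detections}
    return all(scores.get(p, 0.0) >= threshold for p in required)

# Two aircraft spot the tarp, then the unmanned ground vehicle takes
# the final close-up look -- only then is the target confirmed.
hits = [Detection("drone-1", 0.97), Detection("drone-2", 0.93), Detection("ugv", 0.95)]
print(confirm_target(hits))  # True
```

Requiring independent agreement from multiple sensors is a standard way to trade engagement speed for fewer false positives.
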
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward
    Given the number of friendly-fire deaths in recent wars, it would be interesting to see whether software has a better rate of IDing enemies than humans do.
    • I've always wanted to cream the Blue Team in Paintball. From home. I wonder when this tech will be available for toy companies. Especially when the 2012 Geneva Convention on Laws of Armed Robots in Combat declares them unfit, thereby resulting in a black market for jailbroken drones.

    • Same thing as cars that drive themselves. People die on the road every day due to human error, but the moment a car with no driver crashes into something or hurts someone, all hell will break loose.

      This thing could be way more effective than any man at doing its job. One mistake and it'd be dead.

      There is this misconception that humans can fail but machines can't. What they forget is that the men who built the machine were human too, so it'll never be perfect.

      • by erroneus ( 253617 ) on Wednesday September 21, 2011 @04:33AM (#37465688) Homepage

        That's all well and good, but I am more concerned about our robotic overlords commanded by the one or few who need killers without conscience and without any sense of right or wrong.

        We already have a US government that felt it necessary to use contractors to perform acts beyond what military service members should do. But that's not good enough. They want killers who will kill, ask no questions, and tell no one about it.

  • Landmines (Score:5, Insightful)

    by Anonymous Coward on Tuesday September 20, 2011 @11:24PM (#37464324)

    Landmines do automated killing every day!

    • Excellent point, and look what an indiscriminate job [handicapinternational.be] they do of it.
    • Re: (Score:3, Informative)

      by rtfa-troll ( 1340807 )

      Which is why civilised countries [wikipedia.org] have already outlawed them. No decent human could encourage the spread of things which, for every attacking soldier they kill, kill many civilians, animals and, most of all, mine clearers.

      N.B. the treaty still allows anti-tank mines and even remotely triggered claymore mines, so it's still possible to do area denial against serious military forces. I will give the Koreans a small out in this case, in that this was the way their border was divided long before the treaty.

      • by Quila ( 201335 ) on Wednesday September 21, 2011 @09:13AM (#37467884)

        The treaty is just political correctness. The US already has policies in place that effectively meet and exceed the goals of the Ottawa treaty.

        We stopped selling mines and we destroyed old stockpiles. We have spent over a billion dollars clearing mines and helping victims (usually not of our mines). Our new mines are self-destructing or self-disarming, and policy is not to place one without recording its position, and to remove it from any battlefield after its need has passed.

        Even with that, the only place we actually use them is in the Korean DMZ. The last time we used them in combat was the Gulf War, in limited numbers. These were scatterable mines, fired or dropped to a specific grid coordinate to deny use of that small area to the enemy. Since this was their first use, we did make mistakes; apparently not every shot was recorded and reported for later easy cleanup. Rules for their use have since been changed, and by now they should all be converted to self-destructing or self-disarming anyway.

    • Bingo. The US has spent years phasing out land mines, and if it wasn't for the Korean DMZ, it would be a signatory to the Ottawa Treaty. It would be a backwards step if they built new weapons where humans do not make the targeting decision.

    • The word is that, with the high-tech drones, they hope one day to get a better collateral-to-enemy ratio than with landmines.

    • And they are a pestilence in the areas where they are unleashed. Their main victims post-war are usually children who accidentally step on them while playing.

  • When these are combat ready, there will be many unemployed soldiers.

    • In the type of asymmetric warfare we are engaged in today, robots may not be very effective, especially at their current intelligence. Using a gun carries great responsibility, and attaching one to a robot and having it make intelligent life-and-death decisions is something that is far off. I hope that by the time we automate killing, we will have fixed most of the problems that cause us to go to war.
  • by blue trane ( 110704 ) on Tuesday September 20, 2011 @11:32PM (#37464374) Homepage Journal

    Move all violence to online simulations.

  • by phantomfive ( 622387 ) on Tuesday September 20, 2011 @11:36PM (#37464400) Journal
    The 'automated recognition' in this case was a large orange tarp. The difficulty of creating an automated recognition algorithm for an orange object in a natural background is extremely low. Wake us up when this thing can recognize camouflaged tanks in a forest.
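
    To make the point concrete, here is a minimal sketch of how little code a bright-orange-tarp detector needs, assuming OpenCV is available; the HSV bounds are illustrative guesses, not values from the exercise or any real targeting system:

```python
import cv2
import numpy as np

def find_orange_blobs(frame_bgr, min_area=500):
    """Flag large bright-orange regions by simple HSV thresholding.

    Toy illustration only; the bounds below are guesses, not values
    from the article or any real system.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Orange sits roughly at hue 10-25 on OpenCV's 0-179 hue scale.
    mask = cv2.inRange(hsv, np.array([10, 120, 120]), np.array([25, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only blobs big enough to be a tarp rather than pixel noise.
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```

    A camouflaged tank in a forest offers no such clean color separation, which is exactly the commenter's point.
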
  • Science fiction writer Cordwainer Smith called them "manshonyaggers" in a story published back in the 1950s. The word was supposed to be a far-future corruption of the German "Menschen" and "Jäger": "manhunter."

    It looks like his future is going to get here a lot faster than he thought.

    • by Nursie ( 632944 )

      It probably won't be the Mark Elf and the vom Achts, though; it'll be the MQ-35 Wrath-bringer, and it'll only respond to commands prefaced with "God Bless America"

  • Solution (Score:3, Insightful)

    by Sasayaki ( 1096761 ) on Tuesday September 20, 2011 @11:43PM (#37464420)

    Why don't we, instead of perfecting our killing methods, simply stop initiating economy-destroying, pointless wars?

    I'm excited about all the trickle-down technology that'll eventually become consumer-grade fare, and I appreciate the advances in various technologies that war brings, but I would much prefer it if the US stopped economically destroying itself (while giving the Middle East a "Great Satan" to fight) and instead let them get back to killing each other over tiny differences in interpretation of fundamentalist Islam.

    Not even Bob the Builder can fix the Middle East at the moment. Not when you have God handing out the real estate titles and commanding the thousands of various splinter cells to annihilate everything that's not exactly identical to themselves, as trillions of dollars of oil money pour into the region to feed and fund it all.

    • What's bad for one part of the economy may be good for another part. What's 'good' for the economy is a matter of debate. I know it's a tired, overused example, but if you had owned stock in Halliburton in 2000 and hung onto it, I'm sure you'd think these pointless wars are pretty good for the economy.

      Overall, I agree with your comments, but I don't think the pointless wars were a major drag on our economy. If anything, they probably helped. Lowering taxes during wartime - now that's a classic economic no-no.

    • by jimicus ( 737525 )

      Why don't we, instead of perfecting our killing methods, simply stop initiating economy-destroying, pointless wars?

      Because war is a fantastically good way to seriously sort your country out. All you need to do is have a great big war and lose it.

      Sure, it takes a few years but look at, say, Germany or Japan today versus where they were in 1945.

      I reckon that's what the US is doing. Starting all these wars with a view to losing one.

  • by mosb1000 ( 710161 ) <mosb1000@mac.com> on Tuesday September 20, 2011 @11:47PM (#37464448)

    Someone should make a movie about this...

  • ...of a whatcouldgowrong tag, this would be it.

  • As long as the soldier who pushes the button to activate the drones is equally responsible as the one who pushes the button to drop a dumb bomb then I don't really see the issue.

    As long as someone mucking up and causing friendly fire or collateral damage is equally liable, then I think this is just an arms race that has the potential to avoid casualties.

    Where this becomes dangerous, I think, is when you can start shoving blame around, so the soldier blames the programmer and vice versa. If the soldier can blame so

  • Yep, autonomous machines are certain to make mistakes and kill people who aren't the target, who are innocent, don't really deserve to die, etc.

    So what?

    Humans make those mistakes now, in abundance: friendly fire, massacres, genocide, innocent bystanders... you name it. What difference does it make whether it's a human or some artificial pseudo-intelligence that makes the mistakes?

    I'll tell you what the difference is: one less human on the battlefield, and thus at the least one less human that can die from

    • Massacres and genocide I would not call an honest "mistake," as they're usually done intentionally. Automating the battlefield, and particularly the killing machines, would only make such actions easier.
  • Could it kill 9 people and wound 14?
    http://www.wired.com/dangerroom/2007/10/robot-cannon-ki/ [wired.com]

  • Looks good on paper, but for now they are just expensive toys which may be more useful as recruiting tools (look, war is just like a video game, come play with us!). Barely useful in an asymmetrical conflict like the one in Afghanistan, and useless in a war with a country that has a modern air force and an integrated air defense system. They'd be shot out of the sky immediately.
  • the purpose of attackwatch.com [barackobama.com]

    But they forgot to leave a way to upload pictures of the targets to be terminated. Oops.

  • Between globalization and robots it appears the golden age of leisure* is closer than ever.

    * Where golden age of leisure = mass unemployment and food riots

  • There is no way that the military is going to permit autonomous combatant units. At least, not without having a stake put through its brain.

    For starters, the PR would be through the floor if even one of these things killed a civilian (though I guess with how callous the US has been towards civilian collateral casualties for the past ten years, that might not be a big deal.)

    The other main reason is that there's no way a manly man is ever going to give up on the idea of manly soldiers charging manly into battle

    • by Shihar ( 153932 )

      The US doesn't need autonomous killing machines. Sure, the US will develop them, but so long as the Americans are busy busting on sheep herders armed with AK-47s, they won't use them. You might get to the point where drones are doing everything but pull the trigger, but having a human in the loop approving all death and destruction is cheap and easy. You don't gain anything by having fully autonomous killing machines when you are fighting peasants with shitty guns.

      The US will develop the technology though

    • by jlar ( 584848 )

      "There is no way that the military is going to permit autonomous combatant units. At least, not without having a stake put through its brain."

      You are implicitly assuming that the USA will be fighting inferior enemies in the future and thus will be more concerned about bad PR than about coming out on top. A potential future conventional conflict with a heavily armed opponent capable of inflicting millions of casualties would change that (most likely China, but there are also other potential candidates). And in such

  • Examples (sketched as code after this list):

    IF a target is a unique type of vehicle that can be easily identified by target recognition software that _already_ does this for normal pilots AND said target is within a set of coordinates that are known to only contain hostile vehicles of that type, THEN kill it, otherwise seek human double-check and weapons release confirmation.

    If a target is in an area known to not contain friendlies and is detected firing a missile or weapon (like an AA gun for example), then kill it.

    If there are friendlies o
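
    For what it's worth, rules like the ones above are easy to write down as code; the hard part is trusting the classifier outputs feeding them. A rough sketch, with every name, label, and threshold a hypothetical illustration:

```python
from dataclasses import dataclass

KNOWN_HOSTILE_TYPES = {"self-propelled AA gun"}  # hypothetical label set

@dataclass
class Target:
    vehicle_type: str           # classifier label for the vehicle
    recognition_conf: float     # classifier confidence, 0..1
    in_hostile_only_zone: bool  # inside coordinates known to hold only hostiles
    is_firing: bool             # sensors see it launching a missile or firing
    friendlies_nearby: bool

def engagement_decision(t: Target) -> str:
    # Every rule and threshold here is a hypothetical illustration.
    if t.friendlies_nearby:
        return "HOLD: refer to human controller"
    if (t.in_hostile_only_zone and t.vehicle_type in KNOWN_HOSTILE_TYPES
            and t.recognition_conf > 0.95):
        return "ENGAGE"  # rule 1: recognized hostile vehicle, hostile-only area
    if t.in_hostile_only_zone and t.is_firing:
        return "ENGAGE"  # rule 2: active shooter in an area clear of friendlies
    return "HOLD: request human weapons-release confirmation"
```
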

  • by guspasho ( 941623 ) on Wednesday September 21, 2011 @12:39AM (#37464670)

    If ever there was an appropriate time for the "whatcouldpossiblygowrong" tag, this is it.

  • How soon can we send it into FATA in Pakistan? Time to just target the high-level Taliban/AQ. I am fine with using automation to do this. In fact, I think we should send these into Mexico as well once they're working decently. Lots of cartels there.
  • Does this (concern|scare|disgust) any of you?

    Why am I limited to these choices? Groupthink much?

  • We're quite likely to see systems that kill anybody who is shooting at friendly troops. The U.S. Army has had artillery radar systems [fas.org] for years which detect incoming shells and accurately return fire. There have been attempts to scale that down to man-portable size, but so far the systems have been too heavy.

    Sooner or later, probably sooner, someone will put something like that on a combat robot.
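
    The underlying math is straightforward: track the incoming shell at two points, then run its trajectory backward to ground level to locate the firing position. A toy back-calculation under drag-free, flat-earth assumptions (real counter-battery radars model drag, wind, and terrain):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def estimate_launch_point(p1, p2, dt):
    """Back-propagate a drag-free ballistic track to ground level.

    p1, p2: (x, y, z) shell positions in meters from two radar hits dt
    seconds apart (p1 first). Returns estimated (x, y) launch coordinates.
    Toy model: no air drag, flat terrain at z = 0.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    v = (p2 - p1) / dt        # mean velocity over the interval
    v[2] += G * dt / 2.0      # mean vz equals vz(t1) - g*dt/2, so add it back
    # Rewind: solve z1 - vz*s - (g/2)*s^2 = 0 for the time s since launch.
    s = (-v[2] + np.sqrt(v[2] ** 2 + 2 * G * p1[2])) / G
    return (p1[0] - v[0] * s, p1[1] - v[1] * s)

# Example: a shell tracked climbing through 400 m and 480 m, half a second apart.
print(estimate_launch_point((1000, 2000, 400), (1100, 2050, 480), 0.5))
```
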

  • The most dangerous thing about this is that when a glitch or bug or malware causes a plane to blow up a wedding, no one is responsible. No one ordered it, and no one can be punished for it.

  • I need to remove my "Gone January 20th, 2013" bumper sticker (the one with the Obama logo for the "o") from my bumper... http://www.amazon.com/gp/product/images/B00228KSPU/ref=dp_image_z_0?ie=UTF8&n=15684181&s=automotive [amazon.com]
  • When we take the risk out of war, it loses all meaning. You America-haters think we're too quick to go to war now; wait until there is no risk to our own people. No bodies coming off planes at Dover AFB, no funerals, no news reports about another young widow with children to raise without a father (or the other way around). If we remove the risk, then war becomes merely a cost item on the budget, and much easier to jump into. Take out the sci-fi stories of the war robots turning on their human masters (
  • What bothers me... (Score:4, Insightful)

    by Alioth ( 221270 ) <no@spam> on Wednesday September 21, 2011 @02:29AM (#37465160) Journal

    What bothers me is these things make war easier to wage. When Americans aren't coming home in coffins, it's a lot easier for the public and politicians to accept war, therefore we're more likely to start wars.

    If we're risking our own soldiers and pilots, at least we might think twice and look for other solutions before starting a war. However, once you've made war palatable to your own public, too often it becomes the first resort, especially amongst the hawkish (and the religious right versus non-Christian enemies).

  • Dr. Strangelove: Based on the findings of the report, my conclusion was that this idea was not a practical deterrent for reasons which at this moment must be all too obvious.

    President Merkin Muffley: General Turgidson, I find this very difficult to understand. I was under the impression that I was the only one in authority to order the use of nuclear weapons.

  • This sounds like a bad idea to me as far as fighting wars go.

    War is supposed to be up close, personal, and horrific. Letting machines handle the dirty work removes a large amount of the deterrence that should be inherent in pursuing a war. Knowing the horrors of war should be a big motivator in seeking alternatives to war.

    What's next? Just have computers simulate attacks, calculate damage and casualties, and then those on the casualty list report to a termination center?
