The Military / Robotics

Top US General Warns Against Rogue Killer Robots (thehill.com) 164

Long-time Slashdot reader Zorro quotes The Hill: The second-highest-ranking general in the U.S. military last Tuesday warned lawmakers against equipping the armed forces with autonomous weapons systems... Gen. Paul Selva warned lawmakers that the military should keep "the ethical rules of war in place lest we unleash on humanity a set of robots that we don't know how to control." "I don't think it's reasonable for us to put robots in charge of whether or not we take a human life," Selva told the committee.
There's already a Defense Department directive that requires humans in the decision-making process for lethal autonomous weapons systems. But it expires later this year...
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Monday July 24, 2017 @06:37AM (#54865757)

    If war was ethical, only leaders would fight.

    • by JackieBrown ( 987087 ) on Monday July 24, 2017 @08:25AM (#54866249)

      Then we would have a society ruled by only those capable of fighting.

      I know the Klingon rules of ascension sound great on paper...

    • Less a question of ethics than of what is being done in Russia and China... We don't want people fighting robots.
      • by Anonymous Coward on Monday July 24, 2017 @09:12AM (#54866529)

        It wouldn't be people fighting robots; it would possibly be people controlling robots fighting fully autonomous robots. Now, I'm sure people who don't agree with me will never be convinced, since dealings with ethics and morals are purely subjective, but my personal feeling is that the taking of a life should only be done by another person. Be it capital punishment or an act of war, a person should always be responsible for taking that action. The idea of automating murder sickens me, and I fear that death may be trivialized if it's automated. There should be real consequences to society for killing a person, and having a person involved will weigh on their mind, barring the occasional psychopath. And even if other countries decide they want to automate it away, I don't wish to live in that sort of society. Any action that involves killing a person is a choice of last resort; you should have to be willing to deal with the emotional harm of having a person do it if you decide it's the path that needs to happen.

        • by knightghost ( 861069 ) on Monday July 24, 2017 @09:20AM (#54866605)

          Too bad reality doesn't support morals. A human-controlled robot is simply too slow to win. I think it'll end up with humans defining strategic mission parameters and robots using programmed tactics that are adaptable within a framework.
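
          In code, that division of labor might look something like the minimal sketch below. This is a hypothetical illustration only; the field names and checks are invented, not drawn from any real system:

              from dataclasses import dataclass
              import time

              @dataclass(frozen=True)
              class MissionParameters:
                  """Set once by a human commander; immutable for the sortie."""
                  geofence: tuple      # (lat_min, lon_min, lat_max, lon_max)
                  expires_at: float    # hard stop, as unix time
                  weapons_free: bool   # whether tactics may use force at all

              def action_permitted(params, lat, lon, uses_force):
                  """Adaptive tactics propose an action; this framework holds the veto."""
                  lat_min, lon_min, lat_max, lon_max = params.geofence
                  inside = lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
                  in_time = time.time() < params.expires_at
                  force_ok = params.weapons_free or not uses_force
                  return inside and in_time and force_ok

          The human sets the envelope once; the robot adapts freely inside it and not at all outside it.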

        • "Any action that involves killing a person is a choice of last resort, you should have to be willing to deal with the emotional harm of having a person do it if you decide it's the path that needs to happen." Are you sure wars are fought like that now? When somebody launches a cruise missile from 100 miles away, do you think that's the same as pulling a trigger on a gun and looking at your victim in the eyes as you do so? What about nuclear weapons or air combat where you don't even see your target, just a
        • The idea of automating murder sickens me and I fear that death may be trivialized if it's automated.

          This has already happened. Nazi Germany, Jews, etc.

          Everyone seems to be forgetting the lessons learned during all of that and is seeking to rebuild it all again.

  • Time to load up on some Old Glory robot insurance.
    • by NettiWelho ( 1147351 ) on Monday July 24, 2017 @07:39AM (#54866047)
      Indeed, I'd personally be more worried about how they solve the problem of the people in power being able to simply order the robots to kill everyone, with the robots not going rogue at all but following those instructions to the letter.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Autonomous robots will drastically increase the danger of a rogue general; they'll obey orders no matter what.

      • They also increase the danger of a rogue government.

      • by number6x ( 626555 ) on Monday July 24, 2017 @10:19AM (#54867073)

        Land mines can be thought of as fully autonomous robots. Perhaps the simplest case of a 'robot'.

        Very simple predetermined command to follow: 'When your trigger is tripped, execute your explosion sequence.'

        Most nations have banned the use of land mines because of their uncontrolled, autonomous behavior. Once they are set, they stay set and will activate whether tripped by friend or foe.

        They will activate when tripped by the little child playing in the field years after the war is over.

        The problem the General recognizes in fully autonomous killer robots is the same problem encountered when land mines are used. The robots are just a more complex example.
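
        To make the "simplest case of a robot" point concrete, here is that one-rule trigger logic as a toy sketch. Purely illustrative (no real munition runs Python, and the names are invented):

            class LandMine:
                """A land mine reduced to its control loop: one rule, no judgment."""

                def __init__(self):
                    self.armed = False

                def arm(self):
                    # The last human decision this system will ever receive.
                    self.armed = True

                def on_trip(self, entity):
                    # 'entity' may be a soldier, a farmer, or a child years later;
                    # nothing in this rule can tell the difference.
                    if self.armed:
                        self.detonate()

                def detonate(self):
                    print("executing explosion sequence")

        Everything the General worries about is already visible in those few lines: the last human decision happens at arm(), and nothing downstream of it can discriminate.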

        • by Anonymous Coward

          US land mines, such as those in the demilitarized zone between NK and SK, are controlled. They can be activated, de-activated, and located remotely.
          The US is not a signatory of the Mine Ban Treaty because it could not secure an exception for the NK-SK border. The US does abide by the treaty's other requirements, such as destroying its stockpiles of land mines and not manufacturing, exporting, or deploying land mines anywhere else.

          Sort of like the US...

          • Bullshit. Those mines have killed hundreds of rural farmers and maimed thousands. "Controlled", ha.

            a war with NK will NOT start by a bunch of NK soldiers marching across the DMZ.

            • a war with NK will NOT start by a bunch of NK soldiers marching across the DMZ.

              Of course not. It's full of land mines... Would you be so confident in saying that if it weren't?

              North Korea doesn't actually want to destroy Seoul with all that artillery they currently have pointed at it. They want to own it instead. If they could just march across the DMZ to own Seoul, they would.

              • You're confused; a path through land mines can be cleared trivially in war.

                The land mines maim innocents; they should not exist. Claiming the Korean DMZ is magically different from the world's other DMZs, where the USA has already agreed they should not be used, is hypocrisy. There are plenty of other ways to maintain security in a DMZ.

  • Top US General concerned about future job security. Worries the human element will soon not be a requirement when it comes to warfare.

    This is a big deal in a country where War and Combat are glorified and have seeped into the facets of everyday life.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Imagine someone hacking those robots and turning them against your citizens or all humans. Don't you want a way to stop them?
      Don't forget all the recent hacks. Everything is hackable with enough determination and resources. And military robots sure are a very good target.

      • by ranton ( 36917 ) on Monday July 24, 2017 @07:24AM (#54865967)

        Imagine someone hacking those robots and turning them against your citizens or all humans. Don't you want a way to stop them?
        Don't forget all the recent hacks. Everything is hackable with enough determination and resources. And military robots sure are a very good target.

        Those are very real potential threats, but probably the most real threat is enemy nations having better military technology than us ("us" is relative). The mere existence of nuclear weapons is also an existential threat, but nowhere near as dangerous as only your enemies having nuclear weapons.

        The military doesn't have the luxury of holding back because of the worry about all the negative consequences of new military technology. If the technology can exist, someone will develop it. The best defense I can think of is developing it yourself so at least you can understand the true dangers and potentially build countermeasures against them.

        • by Nidi62 ( 1525137 ) on Monday July 24, 2017 @07:41AM (#54866059)

          At the very least it is inevitable that we will see autonomous support equipment. When the US first invaded Afghanistan, Special Forces troops regularly used mules to move equipment. It's not hard to see a future foot patrol using a multi-legged, load-bearing autonomous robot for carrying equipment, supplies, or wounded soldiers. If it is legged, it should be able to go over almost any terrain a soldier could. Autonomous drones for reconnaissance are also extremely likely, again especially in foot patrol/small unit situations.

          And really, once equipment like this is perfected, it should be relatively easy to develop automated targeting technology on the side and mate the two as necessary (necessary being when encountering someone else doing it). As you said, do it because someone else can and probably is. With that autonomous load-bearing robot I mentioned: build it with a mount for a machine gun and a slot for whatever hardware module contains the autonomous targeting software. There is nothing making you install them unless you absolutely have to. Of course, once you do, you've opened Pandora's box and there's no closing it again.

          • by mysidia ( 191772 ) on Monday July 24, 2017 @08:32AM (#54866291)

            And really, once equipment like this is perfected, it should be relatively easy to develop automated targeting technology on the side and mate the two as necessary

            The greatest threat is probably from stolen autonomous equipment getting into the hands of terrorists.

            • Re: (Score:3, Informative)

              by ranton ( 36917 )

              The greatest threat is probably from stolen autonomous equipment getting into the hands of terrorists.

              No, it really isn't. Without government scare mongering terrorists wouldn't be thought of as much of a threat at all. Your morning commute is more of a danger to you than terrorists.

            • by Nidi62 ( 1525137 )
            Unless those terrorists plan to then sell that technology to the highest bidder, there is no more danger than with them stealing any other piece of equipment. Even with all the equipment ISIL was able to loot/capture from Iraqi soldiers, and the millions of dollars they were getting when they controlled oil fields, we didn't see them producing their own Hummers, artillery, weapons, etc. Terrorists wouldn't get much out of autonomous equipment anyway, as getting them to the target location without being detected...
          • by Whibla ( 210729 )

            Will see?

            They already exist, albeit 'theoretically' they're currently only semi-autonomous (if requiring someone to turn them on counts as such).

            Wired for War [pwsinger.com] by P.W. Singer is an excellent book covering the subject, and I'd recommend that anyone interested in the subject read it. The sections describing the SWORDS robots and the like, which have already seen deployment, are very informative, and indicative of the way things are going. And this direction is not just being driven by the manufacturers, but it's...

        • by Anonymous Coward

          Over budget, over schedule... shit is rushed out.

          Happens all the same in private and military. Work the bugs out in v1.1 and hope most of them are non-lethal to friendlies.

        • by Anonymous Coward

          Are murderbots likely to be the best counter to other murderbots though?

          One would assume not, and rather that a focus on cyber- and electronic-warfare ought to be a more prudent defence against others fielding them.

          • by ranton ( 36917 )

            Are murderbots likely to be the best counter to other murderbots though?

            One would assume not, and rather that a focus on cyber- and electronic-warfare ought to be a more prudent defence against others fielding them.

            Which is why you need to develop your own murder bots so you can develop the electronic-warfare techniques to defend against them.

        • by mysidia ( 191772 )

          The best defense I can think of is developing it yourself so at least you can understand the true dangers and potentially build countermeasures against them.

          How about you stick with "safe limited mock-ups" of the technology and develop countermeasures directly, instead.

          • by ranton ( 36917 )

            How about you stick with "safe limited mock-ups" of the technology and develop countermeasures directly, instead.

            Have you ever gone from a mock-up to a production implementation without learning something new? I doubt you ever have on anything other than the most trivial of solutions.

            • by mysidia ( 191772 )

              Have you ever gone from a mock-up to a production implementation without learning something new?

              No... mock up not the countermeasures but the weapons you are building the countermeasures against, with human operators standing in for the autonomy.
              Countermeasures which are not RF-based and which work against remote-piloted weapons in a restricted sandbox should work fine against real-world autonomous weapons.

              The problem is if you put resources into developing the autonomous weapons themselves; the enemy is likely to conduct espionage and steal...

              • by ranton ( 36917 )

                No... mock up not the countermeasures but the weapons you are building the countermeasures against, with human operators standing in for the autonomy.

                That is what I thought you meant. But my point still stands: building countermeasures against real production weapons is much different from building countermeasures against mock-ups.

        • by monkeyxpress ( 4016725 ) on Monday July 24, 2017 @08:32AM (#54866287)

          The military doesn't have the luxury of holding back because of the worry about all the negative consequences of new military technology. If the technology can exist, someone will develop it. The best defense I can think of is developing it yourself so at least you can understand the true dangers and potentially build countermeasures against them.

          Yet we use this 'luxury' when it comes to many types of existing weapons. And what choice does humanity have? We are well beyond local tribes with spears and shields. The western minority powers can literally make everybody on the planet extinct if they want. If we must just accept that there is no way to build lasting peace, then we are simply counting down to our own extinction as every generation of smartphone gets better at ordering pizza and looking up trivia.

          The thing that scares me the most about these weapons, however, is that they remove the democratic element of war. To fight a war you need a powerful army, but also a loyal army. That same mass of armed civilians can turn against a ruler who loses popular appeal. This is why countries like North Korea must run massive propaganda campaigns, and why much of the key to the rise of fascism was its ability to use new forms of mass media. It is why a free press and education are seen as essential elements in the fight against a repeat of humanity's past atrocities.

          But once you have autonomous armies, you no longer need trained civilians. A government can indeed use that army to control citizens and ensure it remains in power against majority rule. The political implications of this should scare anyone - we have never really had such a threat before. For me this threat from within is far greater than the meaningless risk of open conflict between nuclear armed states.

          • by Dareth ( 47614 ) on Monday July 24, 2017 @09:33AM (#54866713)

            Remember that terminators that can kill people can serve tea as well. The rich and powerful will control these resources and not need that many other people. Every time I read an article on UBI - Universal Basic Income, I think it is more likely to get UBG - Universal Basic Genocide.

          • by ranton ( 36917 )

            Yet we use this 'luxury' [of holding back research] when it comes to many types of existing weapons [such as Chemical and Biological Weapons].

            The important distinction here is that we did develop these weapons. We didn't just try to hold back the technology, we banned its usage on the battlefield. But a large part of our ability to trust in a ban of such weapons is that many nations understand their usage if someone breaks these treaties. If someone started using them, and it gave them a significant edge on the battlefield, other nations could use them in retaliation if necessary. I doubt we would choose to do that but only because we have access...

      • It's ok. The Killbots will have a preset kill limit. Then we'll just send wave after wave of our own men against them until they hit the limit and shut down.

    • by Etcetera ( 14711 )

      Top US General concerned about future job security. Worries the human element will soon not be a requirement when it comes to warfare.

      This is a big deal in a country where War and Combat are glorified and have seeped into the facets of everyday life.

      Lol. If you think "War" and "Combat" have seeped into everyday life in some countries, just wait till you live in a world without at least some amount of Pax Americana...

  • --There are numerous movies and sci-fi stories reiterating the notion that making killbots is a BAD IDEA.

    • Re:Movies (Score:5, Insightful)

      by DNS-and-BIND ( 461968 ) on Monday July 24, 2017 @07:43AM (#54866065) Homepage
      I love the modern idea that works of fiction, specifically written to advance a particular point of view, are somehow indicative of how reality works. It's a movie, it's entertainment.
      • Re:Movies (Score:4, Insightful)

        by NettiWelho ( 1147351 ) on Monday July 24, 2017 @08:13AM (#54866189)

        I love the modern idea that works of fiction, specifically written to advance a particular point of view, are somehow indicative of how reality works. It's a movie, it's entertainment.

        Yeah, in reality the AI wouldn't be a rogue one but a good little German, following the orders to the letter when it exterminates the starving, rioting, unemployed serfs.

      • While the notion of a Terminator-style, self-aware system like Skynet is still firmly rooted in the realm of science fiction, the idea that we can make an autonomous system designed to kill humans isn't nearly so far-fetched, nor is the idea that some of them may go beyond their expected parameters. They'd be dumb killing machines, little more than a modern, mobile version of mines, but they'd be more than capable of killing people who happened to wander into their path until they ran out of ammo.

        They're...

      • I love the modern idea that works of fiction, specifically written to advance a particular point of view, are somehow indicative of how reality works. It's a movie, it's entertainment.

        Except for Star Trek. It's real; unlike Star Wars, which even has the fighters casting shadows in the vacuum of space.

        • Except for Star Trek. It's real; unlike Star Wars, which even has the fighters casting shadows in the vacuum of space.

          You mean like the moon casting a shadow across the earth during an eclipse, through the vacuum of space? Or the lunar lander creating shadows on the moon in a near-vacuum?

          If an opaque object appears between a point and a light source, it will cast a shadow. I understand light bends, and in space the sun could be less of a point source, so over distance shadows may appear less crisp, but you can still have shadows in a vacuum.

        • Why wouldn't an object cast a shadow in the vacuum of space?

      • I love the modern idea that works of fiction, specifically written to advance a particular point of view

        Fiction isn't "specifically written to advance a particular point of view"; it's made strictly to entertain. Part of that entertainment may be the consideration of a particular scenario.

        are somehow indicative of how reality works.

        Humans are not entertained very long by nonsense, so fiction has to have a logical sequence. Science fiction needs at least a loose grounding in science combined with a "what if" scenario that generally goes awry. The idea that such a scenario cannot go awry is to call science fiction illogical.

        It's a movie, it's entertainment.

        What happened to it being "...

      • I love the modern idea that works of fiction, specifically written to advance a particular point of view, are somehow indicative of how reality works. It's a movie, it's entertainment.

        It isn't a modern idea - just take the Bible and other sacred and, above all, ancient works of fiction, specifically written to advance a particular point of view. And people actually believed the world was run that way. What is modern is the notion that something like literature or theatre is exclusively meant for entertainment, and even today this is rarely true. Even "mere entertainment" represents a way to collectively reflect on aspects of reality; when people watch the never-ending drama and exaggerated...

      • Science fiction has proven to be a harbinger of the future. Humans haven't changed much in the last 4,000 years, so it's not hard to use fiction to predict what they will do with future powers, like being able to track every person with a cell phone. Life imitates art, and vice versa; they are not exclusive.
    • --There are numerous movies and sci-fi stories reiterating the notion that making killbots is a BAD IDEA.

      The obvious solution is to require killbots be designed with a preset kill limit - then they can easily be defeated just by sending wave after wave of men after them.

      RIGHT MEN?!

      • A well calculated move... straight out of Sun Tzu's ancient text, The Art of War. Or my own master work, Zapp Brannigan's Big Book of War.

  • Meh. (Score:2, Insightful)

    by Anonymous Coward

    Our bigger problem at the moment is killer generals (US and elsewhere).

    • by Anonymous Coward

      Imagine the worst combination. Killer robotic generals on a rampage.

    • Our bigger problem at the moment is killer generals (US and elsewhere).

      The average person you meet is more likely to kill you than the average robot you meet. At least for now.


  • We need to build our own autonomous weapons systems in order to defend ourselves from other nations attacking with autonomous weapons systems.

    Right?

    • by Anonymous Coward

      If you're a ST:TNG fan, think of the episode "Arsenal of Freedom".

      Autonomous weapons sold to both sides of a planet at war, both populations killed by those weapons, all that was left were the autonomous weapons.

  • Joshua, what are you doing?

  • Inevitable (Score:5, Insightful)

    by lazarus ( 2879 ) on Monday July 24, 2017 @07:29AM (#54866005) Journal

    "Killer robots" are going to be created. As it gets easier and easier to do with off-the-shelf and/or printed components it is inevitable. Once that happens what comes next will be a matter of cyber security and cyber warfare. The "winner" in any war that uses autonomous killing machines as combatants will be the side with the best electronic warfare systems.

    Gen. Paul Selva probably understands that this is currently not his government, and recent administrations either have not gotten the memo or are playing their cards very close to their chest. I suspect he is much more worried about creating efficient killing machines that get co-opted and controlled by his adversaries than some AI going rogue and asserting their position atop Earth's food chain.

    • Re:Inevitable (Score:5, Interesting)

      by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday July 24, 2017 @07:53AM (#54866113) Homepage Journal

      There already are sentry guns, so we already have killer robots. But note that they are stationary, which limits their potential to do harm. Making mobile killbots is a whole other thing.

      It's true that we cannot, at this time, make a network completely secure and still use it. It's just too complicated. Killbots have to be stupid. If they are autonomous, the only way to "make sure" nobody else is hacking them and using them against you is to have them sever their radio connection after accepting an order, and to not accept any further communications. And lo, the oldest form of killbot is the cruise missile.
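
      As a sketch of that "accept one order, then go deaf" discipline, assuming a shared-key signature scheme (the class, its names, and the key handling are invented for illustration, not any fielded system):

          import hmac, hashlib

          class OneShotReceiver:
              """Accepts exactly one authenticated order, then severs the radio link."""

              def __init__(self, shared_key):
                  self.shared_key = shared_key
                  self.listening = True

              def receive(self, order, signature):
                  if not self.listening:
                      return None  # link already severed; hijack attempts fall on deaf ears
                  expected = hmac.new(self.shared_key, order, hashlib.sha256).digest()
                  if not hmac.compare_digest(expected, signature):
                      return None  # forged order rejected; keep listening
                  self.listening = False  # go deaf: no recall, no retargeting, no takeover
                  return order

      That is the cruise-missile trade-off in miniature: once the link is cut the weapon can't be hijacked, but it can't be called off either.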

    • "Killer robots" are going to be created.

      Arguably, they already have been, depending on how you define a robot. Landmines could be considered robots and we've banned them because they kill civilians.

      The "winner" in any war that uses autonomous killing machines as combatants will be the side with the best electronic warfare systems.

      POPPYCOCK! The winner could simply be the side that exploits a mechanical or chemical weakness in the autonomous robots. Tanks thwarted with simple wire that tears up their treads are a good example of this concept.

      Gen. Paul Selva probably understands that this is currently not his government, and recent administrations either have not gotten the memo or are playing their cards very close to their chest. I suspect he is much more worried about creating efficient killing machines that get co-opted and controlled by his adversaries than some AI going rogue and asserting their position atop Earth's food chain.

      He'll come around, because, people, we are going to make the best, most luxurious killer robots the world has ever seen. I'm telling...

    • by Kjella ( 173770 )

      The US doesn't have any real existential threats (Canada? Mexico? Russian tanks rolling into Alaska?) short of a full WW3, and that kind of total war for survival would play by completely different rules. For every other kind of proxy/support war, like against IS etc., efficiency is not really the primary measure of success. The US wants to play the good guys, which means that they use great discretion in who, when and where they attack, because it's in urban areas, against adversaries in civilian clothing and w...

    • by Kiuas ( 1084567 )

      Gen. Paul Selva probably understands that this is currently not his government, and recent administrations either have not gotten the memo or are playing their cards very close to their chest. I suspect he is much more worried about creating efficient killing machines that get co-opted and controlled by his adversaries than some AI going rogue and asserting their position atop Earth's food chain.

      Way back a decade ago I used to play a lot of the (now unfortunately dead) Source engine mod Dystopia [wikipedia.org], which included...

    • I agree there's a problem: OTHER nations will make them whether we do or not, and therefore we are forced to pursue similar technology to compete and survive.

      Such bots will probably need a relatively simple "kill-switch" mechanism that is independent of the rest of the brain. Thus, if the main brain gets hacked or goes berserk, the independent kill-switch can be contacted to disable the entire thing. Because the kill-switch is (hopefully) a relatively simple mechanism, it's easier to prevent it being hacked...
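
      A toy sketch of that separation, with the kill-switch as a tiny latch that gates the actuators and shares nothing with the main brain (the names and the hash check are hypothetical, not any real design):

          import hashlib

          class KillSwitch:
              """Deliberately dumb: a few auditable lines between brain and actuators."""

              def __init__(self, disarm_code_hash):
                  self.disarm_code_hash = disarm_code_hash  # fixed at manufacture
                  self.enabled = True

              def disarm(self, code):
                  # Runs on its own hardware; a hacked main brain can't reach this state.
                  if hashlib.sha256(code.encode()).hexdigest() == self.disarm_code_hash:
                      self.enabled = False  # latches off, with no re-enable path

              def gate(self, command):
                  # Every actuator command from the main brain passes through here.
                  return command if self.enabled else None

      The point of the design is that the latch is small enough to audit exhaustively, which is what makes it harder to hack than the brain it supervises.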

  • It'll be some poor anonymous dirt farmer doing what he thinks is right, and he'll be the first kill by an autonomous robot soldier. If he's lucky, and the robot's frame rate is high enough, maybe we'll at least know what he looked like.

    The drones, the precision small-yield missiles, the level of technology, training and firepower that we equip our human soldiers with is staggering. They're already meat robots in so many cases, trained to think of the enemy only as the enemy and kill on command using their...

  • by Anonymous Coward

    Who do you believe when robots begin doing things that we were told they weren't supposed to be able to do? The people who deployed them? The ones who built them?

    Can you believe any of them didn't have ulterior motives, or were they just that naive or incompetent to trust lethal force to something other than humans? Things that can make decisions - good or bad, morally justifiable or not - in a fraction of the time humans do, and react accordingly? Have we all forgotten that war is "a strange game," and that "the only winning move is not to play"?

  • by Anonymous Coward

    In 1974 Grumman sold 80 F-14 Tomcats to Iran for $2bn, to stave off bankruptcy when the US stopped funding the contract.

    How long till autonomous killbots go up for sale to various nation states around the world for profit and then get turned against "us"?

    No one at the top is ever held accountable, except for the occasional scapegoat. No engineer in the middle is considered accountable either, because they were "just doing their jobs".

    • It's not like Grumman could have foreseen that our man in Iran would be ousted by those religious towelheads. Who could have expected the fourth biggest military on the planet to simply crumble to the revolt of "students"?

      But don't worry: if history teaches us anything, it's that all we need, if something like this happens, is some tinpot dictator to start a war with whoever buys the killer bots.

      And later when he's no longer convenient, we can simply dispose of him.

  • by Anonymous Coward

    Foolish biologicals.

    You will never see it coming.

  • The first rule of elites and villains: never create something which has the capability of destroying you. This man is warning against a creation that could destroy him and his kind. He's right to be alarmed. It's bad enough we have YouTube and Reddit spewing the real story to the masses; these kinds of weapons could be much worse.
  • by Anonymous Coward

    when it comes to killing innocent people (for political gains and self-interest). American young men are willing as ever to enlist to "get some" and to become a "hero" in Afghanistan or wherever. And, as we've seen, they have little consideration for innocent lives.

    Why would they suddenly think putting a gun in the hands of a robot would be a bad idea? A trick to make them seem considerate, I guess.

    • Why would they suddenly think putting a gun in the hands of a robot would be a bad idea? A trick to make them seem considerate, I guess.

      https://www.youtube.com/watch?v=VTnxP7e7-YA&list=PLcG9uojq3xLEFTCjX10mHyH_E4NSjcP6a

  • If the robots are autonomous, then the soldiers aren't equipped with them - they'd fight alongside them. Or probably several hundred feet below, if the robots are drones.

  • by mrsquid0 ( 1335303 ) on Monday July 24, 2017 @08:15AM (#54866205) Homepage

    The solution to the intelligent robot problem may be to do what the robot designers did in the Star Wars universe -- program personality disorders into the intelligent robots.

    • by Tablizer ( 95088 )

      [Copy] the Star Wars universe -- program personality disorders into the intelligent robots.

      So Trump is an experimental android. Explains a lot.

      C4PO: "I'm the best droid, believe me; I know 900 foxtillian languages and everyone knows I translate the best. And I can run the Death-Star better than Vader. I know death. That asthmatic toaster is a total loser! He wastes time yanking off with his light-saber; I'd use Yuuuge weapons to wipe out the enemy en masse, let me tell ya, not play with silly little sabers w...

  • by Baron_Yam ( 643147 ) on Monday July 24, 2017 @08:25AM (#54866245)

    First came the R/C devices, then the semi-autonomous devices with minimal ability to deviate from a pre-planned route. Then came the more-or-less fully autonomous devices, where you give one a map, a target, and a 'go' order. Right now, the machines return video or other data and a human gives the final OK.

    That's fine (for the US and allies) while they're the only ones who can deploy that level of tech in the field, but as everyone else catches up, it'll be the ones that take the humans out of the loop that respond faster and win the engagements. And the US won't sit by and watch as that happens; they'll remove their human oversight.

    The next step will be false flag ops, blaming the enemy's bad software. And, eventually, there will be a bad map update or a malicious instruction and you'll have a drone swarm committing genocide for you.

    This is inevitable, and rather than try and prevent it (which is futile) we ought to be worrying more about counter-strategies. Maybe we need to say that we can't be as free as we'd like to be, and drones have to go - that anything over a certain size (big enough to carry a dangerous payload a significant distance) will be shot down on sight unless it's a registered, transponder-carrying device.

    I can honestly see the day when densely populated areas are protected by automated anti-drone systems. It's just too easy to launch hundreds of moderate-sized devices at an urban center to sow chaos and fear.

    And just wait until the first self-driving car bomb...

    • What about self-driving cars? There have been many articles on the ethics and morals of self-driving cars and unavoidable accidents. Will cars make the decision on who to hit and who to avoid? Or does the "backup driver" get an alert:

      Please choose person to hit:
      1 Pedestrian - Elderly Lady
      2 Pedestrian - Female pushing Stroller

      3...2..1.. *Crash*

      • > There have been many articles on the ethics and morals of self driving cars and unavoidable accidents.

        Mostly silly whining, in my opinion.

        > Will cars make the decision on who to hit and who to avoid?

        If necessary, yes. But it's unlikely that a "Trolley Problem" will occur. And since there's good reason to suspect that self-driving cars will be much, much safer than human-driven ones, you'd probably treat them like seatbelts and airbags; yes, they occasionally cause harm, but statistically they save lives...

  • All you need to do is wait until the batteries run out.

    • by hughbar ( 579555 )
      Not sure. Like Roomba etc. they just go somewhere, hide out and plug themselves in to recharge. That would be logical and quite 'easy'.
      • Nothing that shutting off the grid wouldn't fix. Or a nice little EMP device.

      • Not sure. Like Roomba etc. they just go somewhere, hide out and plug themselves in to recharge ...

        And, like Roomba, they'd have a hard time going from hardwood to rugs to tile to carpet. They wouldn't be able to kill anyone under the couch. They'd bump into the cat and turn around to go seek victims in another room. If they were made by Samsung they'd need built-in fire-suppression systems because they'd constantly be fighting the urge to self-immolate. And forget stairs, old chap; we'll need one for every floor. The ever-helpful upstairs Killer Robot, that's Maude. Our main-level Killer Robot is Hu...

  • Not a new problem (Score:5, Interesting)

    by vtcodger ( 957785 ) on Monday July 24, 2017 @10:01AM (#54866931)

    Back in the 1960s, the USAF deployed a surface-to-air missile called the Bomarc ( https://en.wikipedia.org/wiki/... [wikipedia.org] ). The thing had a range of around 400 km and, conceptually, could be used to intercept flights of long-range bombers headed toward the US. The problem was that the Bomarc could carry a nuclear warhead. Fine if you want to take out a squadron of bombers someplace out over the Atlantic. But what if you wanted to call off an intercept for some reason? You can tell an F-106 to return to base. But putting a pilotless missile with a nuclear warhead on RTB was considered to be a non-optimum strategy.

    I'm not sure the issue was ever resolved. Fortunately or not, the threat switched from long-range bombers (which we probably could not have intercepted reliably anyway, because of jamming) to ICBMs, which we could not intercept because we lacked the technology.

    The Bomarcs were scrapped in the early 1970s.

  • Really, really need a filter for "AIs already explored, to death, by sci-fi authors, possibly from last century, and news only to people whose reading genres include titles such as Twilight (and no, I am not speaking about the pony princess from MLP:FIM)."

  • Brought to you by the guys who bombed the crap out of civilians in Vietnam, Afghanistan, Iraq, Syria, ...

  • I really don't see the difference between the "model soldier" who follows orders without question and a killer robot.

    In fact, I would expect our military leaders to be salivating at the prospect of such a thing.
