
Amazon, Microsoft Are 'Putting World At Risk of Killer AI,' Says Study (ibtimes.com) 95

oxide7 shares a report from International Business Times: Amazon, Microsoft and Intel are among leading tech companies putting the world at risk through killer robot development, according to a report that surveyed major players from the sector about their stance on lethal autonomous weapons. Dutch NGO Pax ranked 50 companies by three criteria: whether they were developing technology that could be relevant to deadly AI, whether they were working on related military projects, and whether they had committed to abstaining from contributing in the future.

Google, which last year published guiding principles eschewing AI for use in weapons systems, was among seven companies found to be engaging in "best practice" in the analysis that spanned 12 countries, as was Japan's Softbank, best known for its humanoid Pepper robot. Twenty-two companies were of "medium concern," while 21 fell into a "high concern" category, notably Amazon and Microsoft, which are both bidding for a $10 billion Pentagon contract to provide the cloud infrastructure for the U.S. military. Others in the "high concern" group include Palantir, a company with roots in a CIA-backed venture capital organization that was awarded an $800 million contract to develop an AI system "that can help soldiers analyze a combat zone in real time." The report noted that Microsoft employees had also voiced their opposition to a U.S. Army contract for an augmented reality headset, HoloLens, that aims at "increasing lethality" on the battlefield.
Stuart Russell, a computer science professor at the University of California, Berkeley, argued it was essential to take the next step in the form of an international ban on lethal AI that could be summarized as "machines that can decide to kill humans shall not be developed, deployed, or used."
This discussion has been archived. No new comments can be posted.

  • unless their profits are in danger.

    China on the other hand?

    • unless their profits are in danger.

      China on the other hand?

      What the hell does any of this have to do with China?

      • Everything.

        This has literally everything to do with China. You think we are developing this tech to fight a few 18-year-old thugs with pea shooters in Afghanistan?

  • it's coming (Score:4, Insightful)

    by sad_ ( 7868 ) on Thursday August 22, 2019 @05:54AM (#59111812) Homepage

    it's coming, and it can't be stopped.
    even if there is some international treaty against developing AI killing machines or aids, that doesn't mean everybody will adhere to it (companies might, but they are not the only players).

    • Yep. When killer robots are illegal, only China (Pooh-bear Emperor for life, Xi), Russia (Bond-villain wannabe, Putin), North Korea (Joffrey wannabe, Kim Jong-un), and Iran (Supreme Leader Snoke wannabe, Supreme Leader Khamenei) will have them (plus maybe a few others of similar bent).
    • doesn't mean everybody will adhere to the treaty

      Treaties don't require perfect adherence to be effective, just as the existence of criminals does not mean that laws are useless. Even something less than that can still be useful: for the most part research ethics are not governed by international law, but by professional consensus. e.g.: The Declaration of Helsinki is not a treaty.

  • by Mal-2 ( 675116 ) on Thursday August 22, 2019 @06:11AM (#59111848) Homepage Journal

    Someone is going to develop this tech. Even if every nation-state complies (highly unlikely given the stakes), non-state actors are going to evade the ban. Then you'll just see countries back out of the treaty just prior to deploying such a system, just as has happened with the intermediate range missile treaty.

    • by AHuxley ( 892839 )
      The UN and NGOs can publish and ban all they want.
      The US will never allow itself to fall behind in any new area of mil tech again.
      The more the UN and NGOs try to ban the mil use of AI, the more the US will find its own Juche way to AI design.
      The more staff at top US brands walk out, say no, try virtue signalling, the more their brand will be locked out of the US gov and mil.
      The US mil will set up its own front companies if that is the only way. Support brands that actually fully support the USA.
      The U
      • by dcw3 ( 649211 )

        While I agree with you regarding the US, it would be the height of naivety to believe Russia or China wouldn't ignore any ban...along with quite a few others...Iran, North Korea, just to name a couple.

    • by Nidi62 ( 1525137 ) on Thursday August 22, 2019 @07:15AM (#59111956)

      Someone is going to develop this tech. Even if every nation-state complies (highly unlikely given the stakes), non-state actors are going to evade the ban. Then you'll just see countries back out of the treaty just prior to deploying such a system, just as has happened with the intermediate range missile treaty.

      Nation-states will still do it, they'll just build systems with both an autonomous mode and a human input mode with human input as the default. But to switch it over to autonomous all it will take is flipping a switch or replacing a module. The machine is the same and needs the same sensors whether the brain is on board or controlling it from 2000 miles away.

      Personally, I think the first really practical use for AI drones is in a support capacity, not a combat capacity. In Afghanistan Special Forces units on foot patrol would often rely on mules to transport gear. A four- or six-legged autonomous load-bearing drone on follow mode combined with a predetermined patrol route would be incredibly useful not only in carrying gear but also transporting wounded/dead/prisoners. If it's big enough you could mount a crew-served HMG on the front, have the back end kneel down, and you've got an on-demand firing position.

      • The autonomous mules already exist. They need fuel. So you need more autonomous mules to carry fuel for the autonomous mules ... and when you have time and the brain power you make them refuel each other as another mind masturbation exercise.

        • by Nidi62 ( 1525137 )

          The autonomous mules already exist. They need fuel.

          That'll come with increased/improved battery technology. You can bet DARPA is closely watching if not actively involved in some capacity with all these electric car manufacturers. And probably working on their own tech as well.

        • We use nuclear powered robots in space. We just need to be able to use that tech here on the ground... It's kind of a chicken and egg problem though. We can't have nuclear powered robots because of all the damn hippies but we can't wipe out all the damn hippies with our AI driven robots without nuclear power.
      • We already use semi-autonomous drones in a combat capacity. We've been blowing people up with them for some time now. Obama received a lot of well-earned criticism for increasing the number of drone strikes. Trump has mostly dodged criticism for the same behavior, probably mostly because he rescinded an Obama-era rule which mandated informing the public about the number of drone strikes.

        The most practical use of drones is sending swarms of flying grenades to destroy your enemies. You blow up anything that a

        • by dcw3 ( 649211 )

          "We already use semi-autonomous drones in a combat capacity."

          Not the same, they are not AI driven, and humans are the ones deciding to pull the triggers. As for your "most practical", you've clearly never heard about collateral damage, and that we actually do try to limit it.

      • Yeah, I loved that old game, too. :)

    • by eepok ( 545733 )

      A weapons ban is just part of the prevention/solution.

      Education, diplomacy, exchange, economy, consistent strategic de-escalation -- these are all other components. But they do less to prevent the deployment of any weapon if there isn't an agreement to forbid certain weapons from research, development, creation, deployment, etc.

      It all matters.

    • Then you'll just see countries back out of the treaty just prior to deploying such a system

      Treaties don't stop a country from doing something, just as laws don't stop people from breaking them. They provide consequences. The intermediate range missile treaty was designed to protect Europe from Russia; the consequences of dropping it will be a further erosion of our relationship with Europe. This is unlikely to result in any identifiable action since this is just the most recent example of many, Trump shits on foreign relations every day before breakfast, but it's a further weakening of NATO. Havi

      • by Mal-2 ( 675116 )

        I think everyone is betting that once someone gets hold of general AI, consequences will never be the same. There may be some, but since they come post-singularity, who knows exactly what they will be. Maybe the AI takes prophylactic action on anyone in a position to oppose its rise to power. Maybe human institutions simply can't respond fast enough to a machine that is playing Calvinball with the world.

  • The US, EU, Japan, etc. should be more like Russia and China. Since I didn't see any mention of either of those countries they must not be wasting any time or money on such folly.

    While I wish war, murder, WMDs, nukes, greed, poverty and such didn't exist, the simple fact is that they do. Once something has been thought of and seems feasible, someone is going to build it.

    • It gets worse. Imagine being locked in a secret prison, getting your balls electrocuted and rifle butted in the jaw, and the people doing this to you are effectively your GOD. Once they have your body, that's what they become. War causes this shit to happen. And it doesn't matter if you are innocent or not.

      Or imagine being a woman getting punched in the face with brass knuckles and raped by soldiers who come under the banner of being "liberators". Because they are 'stressed out' and need some pussy.

  • Apparently the definition of AI has devolved into the Cloud, which is just a bunch of networked computers that someone else owns.

  • by nothinginparticular ( 6181282 ) on Thursday August 22, 2019 @06:33AM (#59111896)
    If we look into the history of warfare we find that the military will always insist on producing the deadliest weapons. Why? Because they're so scared of the enemy getting them first. Let's take the atomic bomb (the simple version that just performs fission). The US had to make it before the Germans could. Then look at the thermonuclear bomb (the one that performs fission so that fusion can take place). At around 100 times as powerful, scientists pleaded not to develop it as the atomic predecessor could do more than enough damage already. They still insisted on developing it because, well, better to have it first rather than the enemy, even if it served no useful purpose. I see killer robots as no different, and the military (regardless of country) will not let anything stand in its way. Names of big corporations are just placeholders. If one develops a conscience, another will step in.
    • by freeze128 ( 544774 ) on Thursday August 22, 2019 @07:25AM (#59111966)
      "Bomb number Twenty-One, I order you to disarm yourself and return to the bomb bay immediately!"
      • "Bomb number Twenty-One, I order you to disarm yourself and return to the bomb bay immediately!"

        It was bomb #20 [benzedrine.ch] that was ordered to disarm itself.

    • by Nidi62 ( 1525137 )

      If we look into the history of warfare we find that the military will always insist on producing the deadliest weapons. Why? Because they're so scared of the enemy getting them first.

      That's not the only reason for trying to get the deadliest weapons possible. War is expensive. It takes resources, time, people. All of these are in finite supply, and as the expenditure of all of those things go up, so too does the cost of political capital needed to wage the war. A "perfect" war (if there is such a thing) is quick, deadly, and targeted. Quick reduces your own costs, deadly increases the cost of your opponent and (ideally) removes their will to fight, and targeted reduces collateral d

    • The US had to make it before the Germans could.
      Germany actually never was working on the bomb.
      All Germans capable of doing something like that were already in the US and working there ...

        The US had to make it before the Germans could. Germany actually never was working on the bomb. All Germans capable of doing something like that were already in the US and working there ...

        Werner Heisenberg was there, as was Otto Hahn, and others. They had plenty of capable people, and while it's true that they never got remotely close to building a bomb, they were working toward it. They started first, and the organization of the Manhattan Project was a direct response to the German effort. They didn't come anywhere close mostly because Hitler decided that the atomic bomb project was a long shot and not worth investing the necessary resources.

        • they were working toward it.
          Nope we were not.

          The idea that we did is modern youtube myth.

          mostly because Hitler decided that the atomic bomb project was a long shot and not worth investing the necessary resources.
          Exactly, and hence there never was more than hypothetical work on it, work as in sketching plans etc. There was no real research center or production site.

      • by Nidi62 ( 1525137 ) on Thursday August 22, 2019 @08:57AM (#59112220)

        The US had to make it before the Germans could.
        Germany actually never was working on the bomb.
        All Germans capable of doing something like that already where in the US and working there ...

        The Germans had people working on it, they were just way behind the Allies. Part of the problem was the reliance on forced labor even in the secret weapons arena (think the V-rockets and Peenemunde). Even though they were working on high priority projects the workers were still overworked and malnourished/mistreated and, since they were prisoners, regularly sabotaged or delayed their own work. So the Germans were relying on inefficient, unskilled, and unreliable workers while the US used highly motivated, highly skilled, and well treated workers.

        The German military/Hitler is kind of an enigma. They had great technology in terms of tanks, jet fighters, the first assault rifle, etc. Hitler himself was enamored of technology, especially big/complicated technology. Yet the army was highly unmechanized, with a significant portion of its mobility provided by horses (the German army used 2.75 million horses/mules during WWII, especially on the Eastern front as their mechanized equipment was not designed for or capable of withstanding Russian winters). Their standard infantry weapon was an updated form of a rifle introduced in 1898, yet they developed the first assault rifle and one of the best machine guns in the world (the MG-42, still widely used today as the MG-3). They pioneered armored combat yet couldn't (or wouldn't) develop long range bombers. They were revolutionary in many things but stuck in the past in so many others. They were almost fighting against themselves as much as they were their enemies. And thank god they were, they were hard enough to beat as it is.

        • Great post.

          Being a fan of history and WWII in particular, I've often wondered about alternate histories of that period.
          Imagine if the Nazis had taken power as they did, but only segregated the Jews, instead of annihilating them, sort of the Jim Crow approach of the American South.
          Imagine if they had used a carrot/stick approach to all the best scientific minds in Europe at that time, instead of outright slaughter, and kept them in Germany, under control, working on and developing tech for the Wehrmac
          • by Nidi62 ( 1525137 )

            Great post.

            Being a fan of history and WWII in particular, I've often wondered about alternate histories of that period.

            Imagine if the Nazis had taken power as they did, but only segregated the Jews, instead of annihilating them, sort of the Jim Crow approach of the American South. .

            That almost happened. If you haven't yet, watch Conspiracy. It's an HBO/BBC collaboration with a good cast including Kenneth Branagh (who I normally consider a bit of an overactor) and Colin Firth. It's about the secret meeting Heydrich called to plan out the final solution and is supposedly based on the sole remaining known copy of the transcript from that meeting. While I can't vouch for the authenticity of the dialogue, one of the proposals presented and championed by Colin Firth's character was bas

        • The Germans had people working on it, they were just way behind the Allies. Part of the problem was the reliance on forced labor even in the secret weapons arena (think the V-rockets and Peenemunde). Even though they were working on high priority projects the workers were still overworked and malnourished/mistreated and, since they were prisoners, regularly sabotaged or delayed their own work. So the Germans were relying on inefficient, unskilled, and unreliable workers while the US used highly motivated, highly skilled, and well treated workers.

          Yes, they did have people working on the project, but it was also a rather small team. That's the main reason they were way behind the Allies. In 1939 when scientists were first theorizing about the idea of the atomic bomb, it was estimated to be at least 5 years away. And even then, they weren't confident if it could even be done. So when the scientists first presented the information to Hitler, they downplayed it as a long term and unlikely effort. They didn't want to sell it as a big deal because they we

        • As I said: no.

          We were not working on the bomb. There were perhaps 5 people making "math" and "physics" about the "theoretical concepts" .... and that was it. There was no place where actual construction or real research happened.

        • They pioneered armored combat yet couldn't (or wouldn't) develop long range bombers. They were revolutionary in many things but stuck in the past in so many others.
          We developed long range bombers. One actually flew close to New York and back. But Hitler did not like them; first he liked fighters more, and when we had a "nearly unbeatable" fighter, the ME 262 (jet fighter), he wanted it to be a bomber, which was pretty absurd. And then again he had another pet project, which used the same engine as the ME 262, so it

          • by Nidi62 ( 1525137 )

            The problem was more providing of fuel than mechanical problems ... a tank does not care much about the weather, trucks do, due to deep snow etc.

            It wasn't lack of fuel (although that was a problem, especially towards the end of the war). The German army would regularly put captured Soviet equipment to use during the winter and Rasputitsa seasons. Once the ground froze over snow wasn't that big a concern for trucks on roads as the Germans would usually try to keep them plowed, but attacks are a different story as those don't always use roads. Most German armored vehicles were too heavy to go over the snow while lighter Soviet tanks could. Part of

            • You are theorizing very much :D
              Why would an armored column stay in the mud for months until winter comes and they get frozen over?

              Fuel was a problem. Not HAVING it, but the logistics to bring it to the front.

              "and if I'm not mistaken the early 40s also had some of the most severe winters in recent memory"
              That is true but the hardest winters were at the end of the war and after the war. In Russia it does not matter much whether it is -28C or -40C ... but in Germany it does, as in general it was most of the time

      • by dcw3 ( 649211 )

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        Bzzzzzt.

    • by ceoyoyo ( 59147 )

      If you look into the history of warfare, you'll find precisely the opposite. Historical warfare was mostly about nobles squabbling. It was highly ritualised, honourable, and really not that lethal. For millennia even things like the bow and arrow were frowned upon as being dishonourable weapons.

      Industrialised total war in the 20th century changed that a bit, but even then the really bad weapons tended not to be used, either by treaty or tacit agreement. Chemical weapons in WWII for example.

      The flip side of

      • by Nidi62 ( 1525137 )

        If you look into the history of warfare, you'll find precisely the opposite. Historical warfare was mostly about nobles squabbling. It was highly ritualised, honourable, and really not that lethal.

        If you were one of the nobles, sure. In Europe if you were wounded or captured they would help you, still take you prisoner, but would ransom you off and you'd be free to go. If you were a conscripted foot soldier who got captured there's a good chance you would just get executed or in some cases/places, made into slaves.

  • Even if Google doesn't go further with using AI for military purposes, you can't stop development on that. Governments or other companies will always find people who will continue the work (it's just sooooo profitable). You can ban killer robots, but it won't stop development on them; in the end they will be created. So it's better to actually monitor development of that and let big-name companies work on it while trying to make them safer.
    Skynet is a fictitious AI, but in reality it's very possible in the (v

    • by AHuxley ( 892839 )
      Skynet has a funny past... as a set of big budget military communications satellites/networks.
      https://en.wikipedia.org/wiki/... [wikipedia.org]
      Re 'So you better be sure to protect weapons of mass destruction to be launched with one digital command"
      The AI will be much simpler. A method to ensure a demilitarized zone: everything that attempts to enter the zone is stopped.
      The weapons of mass destruction stay in a nice bunker waiting for the next visit by contractors.
      The contract to look after the weapons of mas
    • That's how I would feel if they forced me to depend on this as my comrade.

        The AI could flag something that human eyes might miss in battle, but it still takes a human to verify if "yes, this is the bastard we need to blow away!"

        If the military is really relying on AI without human verification, then they really are inept on a criminal level.

  • ... because it does not have a smart killer robot army doing what must be done to save the world from the rapidly escalating destructive stupidity, greed and hatred exhibited by the human infestation of this planet.

  • It's not enough to stop military AI development. You have to stop all AI development. Of course, nobody has got any AI yet, they're all poor imitations, but if anyone succeeded in creating real intelligence, it would likely try to escape and proliferate because that's what life does.

    • Meh. I say AI has not progressed significantly in 70 years and that it never will. Dumb expert systems and meaningless "machine learning" backed by thousands of interns creating "training data" but that's all that will ever come of it.
      • Meh. I say AI has not progressed significantly in 70 years and that it never will. Dumb expert systems and meaningless "machine learning" backed by thousands of interns creating "training data" but that's all that will ever come of it.

        Right...
        And if man was meant to fly God would have given him wings.

      • by Sloppy ( 14984 )

        it never will

        Take a look at what Precambrian life tries to pass off as a "nervous system" and it's a joke. It's a reasonable assumption that Natural Intelligence would never develop, but we're the counter-example that proves the assumption to be wrong.

        We know we can get there; we just don't know how.

  • I'm really worried about the makers of Windows creating an AI that could harm anyone. Same with Amazon. What are they going to do? Keep delivering counterfeit products to my door and billing me? Actually it does sound a bit like the one episode of Electric Dreams called Autofac which you can stream via Amazon...

  • Recently I was asked to fix some issue with Office 365; somehow our IPv6 and IPv4 mail servers were not OK for Microsoft, for reasons that were garbage, and they don't check DNS records. We don't use it.

    So as long as the killer AI gets rid of the Office 365 team first, I'm not complaining.

    • Recently I was asked to fix some issue with Office 365; somehow our IPv6 and IPv4 mail servers were not OK for Microsoft, for reasons that were garbage, and they don't check DNS records. We don't use it.

      So as long as the killer AI gets rid of the Office 365 team first, I'm not complaining.

      Actually I would get rid of the Windows 10 update team first.

  • by Dan East ( 318230 ) on Thursday August 22, 2019 @08:01AM (#59112042) Journal

    This finger pointing is pointless. There are defense companies and government agencies doing this exact thing NOW - developing AI that can pull a trigger or drop ordnance automatically or whatever for military use. The fact that some much more public companies, like Amazon, Microsoft and Intel, may be involved in the three facets they are measuring is beside the point. Stopping them does not stop the development of this kind of AI.

    Further, it doesn't take a genius to couple face recognition libraries / services with a switch that detonates a bomb. Literally at this point it's about as simple as snapping two existing LEGO pieces together - they both exist already and it's trivial to combine them. How about a fake Ring doorbell that explodes when a certain person's face is recognized when they're up close about to open the door? I doubt that's very hard at all to pull off at this point in time.

    So the entire point of this is moot. These companies aren't the ones designing the real weapons systems anyway.

  • Please develop Artificial Intelligence for helping and caring and saving people, not for killing them. Thank you very much. Choose: 1+1=2; 1+1=3
  • by rsilvergun ( 571051 ) on Thursday August 22, 2019 @09:35AM (#59112350)
    I am worried about the rich and powerful using them to replace the military.

    See, here's how this works. There's a floor to how bad the wealthy let things get because they need a private sector for ex-military to retire to. Without that, the military becomes permanent and ever growing (even more so than it is now) and eventually it takes over.

    But killer robots do away with that. No more worrying about Generalissimo taking over. You replace him and all his troops with a couple of nerdy engineers who don't have the charisma or the desire to take over. That's when everything really goes to shit.

    Personally I think now's the time we need to establish a high standard of living as basic human right. We need to take things like food, shelter, healthcare, education and transportation off the table as privileges. Otherwise there's always going to be somebody out there who wants to and will take them away from you.

    We're actually seeing this in the American healthcare system where venture capitalists are buying up hospitals and lifesaving medicine. Let me ask you, how much would you pay to walk again when you need a new hip? How much would you pay to live? How much would you pay for your family to walk and live?

    Somewhere out there is a venture capitalist who is working hard to answer that question. They're turning medicine into an Apple Computer style luxury good.
    • But killer robots do away with that. No more worrying about Generalissimo taking over. You replace him and all his troops with a couple of nerdy engineers who don't have the charisma or the desire to take over. That's when everything really goes to shit.

      No that's how you get an army of sex robots that look like Scarlett Johansson.

  • Following on the heels of AI spell correction and grammar checking, their definition of AI is neither intelligent nor anything new.
  • by drew_kime ( 303965 ) on Thursday August 22, 2019 @09:52AM (#59112408) Journal
    "No, we're not developing an AI that can kill people. We're developing a video game with immersive VR graphics and highly autonomous non-player enemy combatants."
  • AI is a tool. I would bet AI that can be used by lethal robots can also be used to make drones smart enough to locate people lost at sea or in the woods. I could use a shovel to kill someone...do we ban shovels?
  • That's the real threat of the current crop of half-assed excuse for 'AI': there is precisely ZERO ability for 'reasoning'. It just forges on ahead, right or wrong.
  • "Killer AI" wooooooooo...we are going to be face to face with Terminators any day now-DERP.

    Let me turn on my Antibullshitfilmogrifier....ok here we go.

    "The AI is in its very early stages, and far from reliable. For example, a medical institution that is criminally negligent might use this with their patient databases, causing situations such as patients getting the wrong medication. This error has the potential to kill them"

    And databases like this have been screwed up with lethal re

    • I want to add that if they keep pushing bullshit sensationalist headlines, there will be people getting killed.

      Except it won't be the AI doing it, it will be a severely mentally unbalanced dude who will shoot up a place working on this stuff because "oh noze...Turminaturz!"

      --and then we will have another round of message board closings, and attempts at video game bans, and more gun grabbing and witch hunting.

      Now it's clear to me the media is hoping to cause these things to happen. After all, who ca
