
AI Experts Boycott South Korean University Over 'Killer Robots' (bbc.com) 73

An anonymous reader shares a report: Leading AI experts have boycotted a South Korean university over a partnership with weapons manufacturer Hanwha Systems. More than 50 AI researchers from 30 countries signed a letter expressing concern about its plans to develop artificial intelligence for weapons. In response, the university said it would not be developing "autonomous lethal weapons." The boycott comes ahead of a UN meeting to discuss killer robots. Shin Sung-chul, president of the Korea Advanced Institute of Science and Technology (Kaist), said: "I reaffirm once again that Kaist will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control. Kaist is significantly aware of ethical concerns in the application of all technologies including artificial intelligence."
This discussion has been archived. No new comments can be posted.

  • by rossdee ( 243626 ) on Thursday April 05, 2018 @10:07AM (#56386453)

    Asimov rolls in his grave

    • by DrTJ ( 4014489 )

      Maybe it's not the first law they broke...

    • by haruchai ( 17472 )

      Asimov rolls in his grave

Reminds me of a first-season ST:TNG episode, "The Arsenal of Freedom"

    • Asimov rolls in his grave

      Raises the question of what these guys think we should do when the killbots show up (which they will; can't stop every place on earth from developing them).

Perhaps we should try to discuss 1940s science fiction with the killbots? That should work.

      • Raises the question of what these guys think we should do when the killbots show up (which they will; can't stop every place on earth from developing them).

        Make killbot killer bots?

      • Raises the question of what these guys think we should do when the killbots show up (which they will; can't stop every place on earth from developing them).

        Well, since killbots have a preset kill limit, we can send wave after wave of men at them until they reach their limit and shut down... It worked for Captain Brannigan!

    • by CODiNE ( 27417 )

I think the underlying point of the stories was that there's no iron-clad set of rules that COULD govern the behavior of robots without unexpected consequences. Even obviously benign and logical rules such as those had serious limitations. Besides the impossibility of programming such rules... the 3 rules don't and CAN'T exist.

      That said, yeah killer robots suck.

You know what will take out robots, right? Design one that will penetrate even the most robust Faraday cage. First law of war on the battlefield: blast an EMP over enemy lines, take out central command.
    • by ShanghaiBill ( 739463 ) on Thursday April 05, 2018 @10:34AM (#56386619)

      Design one that will penetrate even the most robust faraday cage.

      So when AQ deploys a killbot on Wall Street, are we going to self-nuke NYC?

      First law of war on the battlefield...

      Modern wars are not fought on "battlefields".

      blast an EMP over enemy lines. take out central command.

      There are no "lines" and there is no "central command".

      You play way too many video games.

Yep... an EMP to roll us all back to living like it's the 1800s again. If AI evolved to be that dangerous, then hell, who needs tech? We would need to put an end to it to preserve humanity.
Many modern weapons systems could be taken out with a massive EMP. If I were face to face with a bot like that and had a high-power EMP, I'd use it. I've built a small EMP capable of destroying old cell phones, tablets, and computers. I can imagine that, scaled up, it would work. No need for a nuclear EMP: just give me a high-voltage source, a large capacitor bank, and decent electrodes, and it could be done.
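For scale, the energy available from a capacitor bank like the one described above is just E = ½CV². A back-of-the-envelope sketch (the figures below are illustrative assumptions, not from the comment):

```python
def bank_energy_joules(capacitance_farads, voltage_volts):
    """Energy stored in a capacitor bank: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

# e.g. a hypothetical 2 mF bank charged to 5 kV stores 25 kJ.
print(bank_energy_joules(2e-3, 5e3))  # 25000.0
```

Whether that energy actually couples into a hardened target is another matter entirely; the formula only bounds what the bank can deliver.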
  • by DrTJ ( 4014489 ) on Thursday April 05, 2018 @10:16AM (#56386503)

    There are excellent computer vision algorithms (e.g. in the automotive industry) that can detect and track humans (even partially obscured) with great accuracy.
    There is excellent robot technology available (just look at Boston Dynamics, or your average drone manufacturer).
    And there is no lack of "aim and fire" technology from, e.g., the North American continent.

    It is not all that difficult to assemble these pieces into a nightmarish unit. You can already see (rather harmless airsoft-gun) prototypes of this on YouTube.

    The university does not need to "manufacture autonomous lethal weapons"; they just need to do some generic AI work and leave the weaponization to others who are probably more than willing to do it.

    This sounds like an arms race to me; if one army obtains this technology, will the others sit around and accept it? Heck, even if all armies collectively refrained from it, what would prevent your favorite terrorist organization from doing it? It's not THAT high-tech anymore.
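To illustrate the comment's point that the pieces are off the shelf, here is a minimal sketch of just the "aim" step: mapping a detected bounding box to pan/tilt angles under a pinhole-camera approximation. All names and camera parameters are hypothetical; the detection itself would come from any stock person detector.

```python
import math

# Hypothetical camera parameters (assumptions, not from the comment).
IMAGE_W, IMAGE_H = 640, 480      # pixels
HFOV_DEG, VFOV_DEG = 60.0, 45.0  # horizontal / vertical field of view

def aim_angles(bbox):
    """Map a detection bounding box (x, y, w, h) in pixels to pan/tilt degrees.

    Pinhole approximation: the box centre's offset from the image centre,
    normalised to [-1, 1], is scaled by the tangent of half the field of view.
    """
    x, y, w, h = bbox
    cx, cy = x + w / 2.0, y + h / 2.0
    nx = (cx - IMAGE_W / 2.0) / (IMAGE_W / 2.0)
    ny = (cy - IMAGE_H / 2.0) / (IMAGE_H / 2.0)
    pan = math.degrees(math.atan(nx * math.tan(math.radians(HFOV_DEG / 2.0))))
    tilt = math.degrees(math.atan(ny * math.tan(math.radians(VFOV_DEG / 2.0))))
    return pan, tilt

# A detection dead centre in the frame needs no correction.
print(aim_angles((288, 216, 64, 48)))  # (0.0, 0.0)
```

The point is not that this sketch is dangerous; it's that every individual step (detect, track, point) is elementary, which is exactly the arms-race worry.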

  • They won't be developing "autonomous lethal weapons." They'll merely be developing autonomous technology for a company that makes lethal weapons. Nothing to see here, move along.
  • ... then either they're incredibly naive or just stupid. Either way, I wonder if they're the sort of people who should be working on a paradigm-changing technology, since clearly their understanding of human nature, and of what it will do with powerful technology, is left severely wanting.

    • by Anonymous Coward

      If Western, relatively free countries don't do this stuff to keep ahead, the less ethical non-free countries will beat everyone to it.

      China is not going to stop because a bunch of eggheads whined about it.

      This is just a ploy by globalists and leftists to keep people in free countries from getting ahead.

      • by Viol8 ( 599362 )

        "If Western, relatively free countries don't do this stuff to keep ahead, the less ethical non-free countries will beat everyone to it. "

        And fuckwit CEOs in various corps virtually gave China the technology to do it by outsourcing manufacturing and effectively paying the Chinese to improve their technological abilities. All to save a bit of cash and increase shareholder value.

  • by cascadingstylesheet ( 140919 ) on Thursday April 05, 2018 @10:31AM (#56386599) Journal

    So, people just won't develop this military tech, because peace? Got it.

    No, we need to develop it, first, fastest, and best. And develop AI powered countermeasures too.

    • by MrKaos ( 858439 )

      So, people just won't develop this military tech, because peace? Got it.

      They say "To err is human; to really foul things up you need a computer." Can you imagine trying to debug that?!

      No, we need to develop it, first, fastest, and best. And develop AI powered countermeasures too.

      Off we go then, what could possibly go wrong?

      • Off we go then, what could possibly go wrong?

        It is going to be developed; the only person you can stop from developing it is yourself. Let me know how that works out for you.

        • by MrKaos ( 858439 )

          Off we go then, what could possibly go wrong?

          It is going to be developed;

          The fewer people involved, the more unreliable and failure-prone it will be, and so much the better. Let's not be naive and think that nothing will go wrong with doing it. It's a bad idea, so people of good sense and good will are right to slow the process down. Do we actually need AIs that can kill us? No. Do we need to think through the ramifications and prepare for them? I don't think that's unreasonable, so let's do that first. We could start here.

          I'd start with weapons that destroy themselves if they k

    • by nasch ( 598556 )

      We need to be ready for it, but that doesn't mean we need to develop it too. If a terrorist organization obtains killbots, having our own better killbots first won't help with that. We need anti-killbot technology. Maybe that's killbots too, but maybe it's not.

  • That's like almost a statistically relevant number in a population of 1,000. I guess since the world is 7 billion people though, these 50 don't really register. But wait, you say. Not all 7 billion are AI experts! Well, there are 10,000+ people that could be labeled that, so it's still not statistically relevant.
  • Since the dawn of AI, it has been funded, and still is, by the military at universities everywhere. The technology researched and further developed by larger corporations has been both lethal and non-lethal. Usually the university does the basic research and the company adds to it.
    So now some new AI hotheads are protesting developing AI (i.e. technology) that will be used in military weapons?
    Technology is neither good nor bad; it's how you apply it. Anyone can take the research done by a university and apply it to weapo

  • Naively thinking AI will not be used for creating killbots is like thinking you could create the Internet and not have it used for porn.
  • IS a "Killer Robot."

    It flies to its destination under its own guidance, then explodes.

    FAR too late to ban.

  • All make a big deal against it, but all nations are doing it. Russia, America, and especially China are pouring huge bucks into this.
  • Welcome to the war of the thinking machines. Let the Butlerian jihad begin.
    • ...after which mankind was controlled by two cabals of manipulative bossy bitches on permanent PMS. I'll take the killer AI instead, thanks.

  • Are busy designing Terminator series (which would speak Chinese, Russian, and not-heavily-accented English with a German flavor), everyone else should be snoozing?

    Sure makes sense to me for them to do it, and I cannot understand the umbrage?!
  • So, now we're going to start SJW bullshit over a scientific discipline that sci-fi writers have spun horror stories out of?

    Oy vey...

  • That said, I have 0% faith in anyone telling me they absolutely will not weaponize AI. Anyone saying that must think we're all idiots. The first and best use for AI is replacing humans in dangerous situations. Yeah, you can do that for miners and truck drivers, but the real application will be replacing people as soldiers waging war. Period. Full stop. Okay, maybe I should include law enforcement too, but these days war and police work are starting to blur together at least in th

"And remember: Evil will always prevail, because Good is dumb." -- Spaceballs

Working...