Selectable Ethics For Robotic Cars and the Possibility of a Robot Car Bomb

Rick Zeman writes: Wired has an interesting article on the possibility of selectable ethical choices in robotic autonomous cars. From the article: "The way this would work is one customer may set the car (which he paid for) to jealously value his life over all others; another user may prefer that the car values all lives the same and minimizes harm overall; yet another may want to minimize legal liability and costs for herself; and other settings are possible. Philosophically, this opens up an interesting debate about the oft-clashing ideas of morality vs. liability." Meanwhile, others are thinking about the potential large-scale damage a robot car could do.
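The "selectable ethics" idea in the summary amounts to letting each owner pick which cost the car minimizes when choosing among maneuvers. A minimal sketch of that idea follows; the mode names, cost fields, and numbers are all hypothetical illustrations, not part of any real autonomous-vehicle system.

```python
from enum import Enum

class EthicsMode(Enum):
    # Hypothetical settings matching the article's three examples.
    SELF_PROTECTIVE = "value the owner's life over all others"
    UTILITARIAN = "value all lives equally, minimize overall harm"
    MIN_LIABILITY = "minimize the owner's legal liability and costs"

# Which estimated cost each mode tells the planner to minimize.
COST_KEY = {
    EthicsMode.SELF_PROTECTIVE: "owner_risk",
    EthicsMode.UTILITARIAN: "total_harm",
    EthicsMode.MIN_LIABILITY: "liability",
}

def choose_maneuver(mode, candidates):
    """Pick the candidate maneuver minimizing the owner-selected cost."""
    return min(candidates, key=lambda c: c[COST_KEY[mode]])

# Two invented candidate maneuvers with made-up cost estimates.
maneuvers = [
    {"name": "swerve", "owner_risk": 0.9, "total_harm": 0.2, "liability": 0.7},
    {"name": "brake",  "owner_risk": 0.3, "total_harm": 0.5, "liability": 0.2},
]

print(choose_maneuver(EthicsMode.UTILITARIAN, maneuvers)["name"])      # swerve
print(choose_maneuver(EthicsMode.SELF_PROTECTIVE, maneuvers)["name"])  # brake
```

The point of the sketch is that the same situation yields different actions depending on a one-line owner preference, which is exactly what makes the morality-vs.-liability debate concrete.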

Lasrick writes: Patrick Lin writes about a recent FBI report that warns of the use of robot cars as terrorist and criminal threats, calling the use of weaponized robot cars "game changing." Lin explores the many ways in which robot cars could be exploited for nefarious purposes, including the fear that they could help terrorist organizations based in the Middle East carry out attacks on US soil. "And earlier this year, jihadists were calling for more car bombs in America. Thus, popular concerns about car bombs seem all too real." But Lin isn't too worried about these threats, and points out that there are far easier ways for terrorists to wreak havoc in the US.
  • Insurance rates (Score:3, Interesting)

    by olsmeister ( 1488789 ) on Monday August 18, 2014 @12:12PM (#47695865)
    I wonder whether your insurance company would demand to know how you have set your car, and adjust your rates accordingly?
  • MUCH easier. (Score:4, Interesting)

    by khasim ( 1285 ) on Monday August 18, 2014 @12:27PM (#47696009)

    From TFA:

    Do you remember that day when you lost your mind? You aimed your car at five random people down the road.

    WTF?!? That makes no sense.

    Thankfully, your autonomous car saved their lives by grabbing the wheel from you and swerving to the right.

    Again, WTF?!? Who would design a machine that would take control away from a person TO HIT AN OBSTACLE? That's a mess of legal responsibility.

    This scene, of course, is based on the infamous "trolley problem" that many folks are now talking about in AI ethics.

    No. No they are not. The only "many folks" who are talking about it are people who have no concept of what it takes to program a car.

    Or legal liability.

    It's a plausible scene, since even cars today have crash-avoidance features: some can brake by themselves to avoid collisions, and others can change lanes too.

    No, it is not "plausible". Not at all. You are speculating on a system that would be able to correctly identify ALL THE OBJECTS IN THE AREA, and that is never going to happen.

    Wired is being stupid in TFA.

  • by Rob Riggs ( 6418 ) on Monday August 18, 2014 @12:42PM (#47696189) Homepage Journal
    Just wait until the AI has to keep track of liability awards so that it can make the correct decision regarding minimizing liability. At some point you are going to have a stupid jury award and all the cars are just going to refuse to go anywhere because the AI's cost benefit analysis says "just stay in park".
  • Re:Insurance rates (Score:5, Interesting)

    by grahamsz ( 150076 ) on Monday August 18, 2014 @12:44PM (#47696221) Homepage Journal

    More likely that your insurance company would enforce the settings on your car and require that you pay them extra if you'd like the car to value your life over other lives.

    With fast networks it's even possible that the insurance companies could bid on outcomes as the accident was happening. Theoretically my insurer could throw my car into a ditch to avoid damage to a BMW coming the other way.
