Transportation Programming

Autonomous Car Ethics: If a Crash Is Unavoidable, What Does It Hit?

An anonymous reader writes "Patrick Lin of California Polytechnic State University explores one of the ethical problems autonomous car developers are going to have to solve: crash prioritization. He posits this scenario: suppose an autonomous car determines a crash is unavoidable, but has the option of swerving right into a small car with few safety features or swerving left into a heavier car that's more structurally sound. Do the people programming the car have it intentionally crash into the vehicle less likely to crumple? It might make more sense, and lead to fewer fatalities — but it sure wouldn't feel that way to the people in the car that got hit. He says, '[W]hile human drivers may be forgiven for making a poor split-second reaction – for instance, crashing into a Pinto that's prone to explode, instead of a more stable object – robot cars won't enjoy that freedom. Programmers have all the time in the world to get it right. It's the difference between premeditated murder and involuntary manslaughter.' We could somewhat randomize outcomes, but that would generate just as much trouble. Lin adds, 'The larger challenge, though, isn't thinking through ethical dilemmas. It's also about setting accurate expectations with users and the general public who might find themselves surprised in bad ways by autonomous cars. Whatever answer to an ethical dilemma the car industry might lean towards will not be satisfying to everyone.'"
  • Simple answer (Score:5, Insightful)

    by Anonymous Coward on Wednesday May 07, 2014 @05:28AM (#46937461)

    Slam the brakes on and don't swerve either way. It's by no means optimal, but as far as lawsuits are concerned, it's much easier to defend "the car simply tried to stop as soon as possible" than "the car chose to hit you because it didn't want to hit someone else".

  • Screw other people (Score:5, Insightful)

    by Cyfun ( 667564 ) on Wednesday May 07, 2014 @05:34AM (#46937487) Homepage

    Let's be honest. The job of YOUR car is to keep YOU safe, so the smaller car is probably the better bet as it will have less inertia and cause you less harm. Sure, the most important law of robotics is to protect human life... but if it's going to prioritize, it should probably start with its owner.

  • by Anonymous Coward on Wednesday May 07, 2014 @05:37AM (#46937495)

    The kids are playing on a fucking railtrack, for fuck's sake. If they can't get out of the way in time, then they deserve what they get.

  • Bad example (Score:3, Insightful)

    by Anonymous Coward on Wednesday May 07, 2014 @05:38AM (#46937501)

    Why do people always give such easy examples when asking this question?

    Of course you save the 300 people! There are probably a lot more innocent people than 3 in that group of 300... You'd have to be very stupid to save 3 over 300, or too lazy to think about it and just make a random decision.

    The question should be more like this:
    On one track there are 10 escaped criminals, and on the other is your wife with your son and another child on the way.

    That's a decision you might have to think about, but most people would easily save their own wife.
    In my opinion this shows most people are not ethical at all. So when someone asks you this question, they pose it extremely in favor of sacrificing the innocent, to make certain people will make the 'ethical' decision.

  • by Anonymous Coward on Wednesday May 07, 2014 @06:03AM (#46937591)

    I'd never have a car that did that. Me and mine are number one priority. All other priorities are secondary.

  • by bickerdyke ( 670000 ) on Wednesday May 07, 2014 @06:06AM (#46937599)

    But what if the driver of the other car, who survives because your car steered itself over the cliff, turns out to be the father of the next Hitler?

    A car will never have enough data to make a "right" decision in such a situation. Even the example from the intro is an invalid one, because for a morally sound decision you'd need to know how many passengers (and perhaps even WHICH passengers) are in those cars. Family of 5? Single guy with cancer anyway? And such an algorithm would mean assigning an individual value (monetary or any dimensionless number - no difference) to a human life. And then you've left the field of ethical behaviour quite a while ago.

    Live with imperfect decisions, as you will never be able to make the perfect one. So just stick to the usual heuristics: if you can't avoid both obstacles, avoid the one that's closer. Even if you hit the other one, you'll have a split second longer to brake. THAT might make the difference between life and death.
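
    A minimal Python sketch of that closest-obstacle heuristic, assuming a perception layer that reports each obstacle's side and distance; the field and function names are hypothetical, not any real vehicle API:

        # Illustrative only: steer away from the nearer obstacle, so the extra
        # distance to the remaining one buys a split second more braking time.
        def choose_evasion(obstacles):
            nearest = min(obstacles, key=lambda o: o["distance_m"])
            return "steer_right" if nearest["side"] == "left" else "steer_left"

        # Example: obstacle 12 m away on the left, 30 m away on the right.
        print(choose_evasion([{"side": "left", "distance_m": 12.0},
                              {"side": "right", "distance_m": 30.0}]))  # steer_right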

  • Re:Undefined (Score:5, Insightful)

    by Wootery ( 1087023 ) on Wednesday May 07, 2014 @06:12AM (#46937621)

    Congratulations, you've given me a great go-to example of a non-answer.

    Just leave that kind of behavior undefined.

    Programs are generally deterministic beasts, by nature. What are you trying to say?

  • by PhilHibbs ( 4537 ) <snarks@gmail.com> on Wednesday May 07, 2014 @06:14AM (#46937635) Journal

    Cars have to be designed with the interests of the road-using population in mind. If you want your car to disregard everyone else's interests in favour of your own, then you should not be allowed to use public roads as you are a dangerous sociopath.

  • by Wootery ( 1087023 ) on Wednesday May 07, 2014 @06:15AM (#46937643)

    So if me and a few of my friends jump out in front of your car, the car should do everything in its power to avoid hitting us, right? Including driving off a cliff-face?

    A car which can be persuaded to deliberately kill its passengers... that might be a problem.

  • by blackest_k ( 761565 ) on Wednesday May 07, 2014 @06:19AM (#46937653) Homepage Journal

    There are very few "accidents", just people taking stupid risks. Maintain a safe distance, i.e. enough manoeuvring room so you don't join an accident. Don't overtake when you can't see the end of the manoeuvre, e.g. going uphill or on a bend. Stop when necessary. Proceed with caution; sometimes you might want to turn off the radio, open a window and listen. Use your indicators. Drive within your lights, or as conditions allow. Don't be an asshole.

    Sometimes you will come across assholes on the road; it is best to give them a wide berth, even stop and pull over in order to get them out of your way. But don't dawdle: if you want or need to drive slowly, make opportunities for people to overtake.

    Bad planning and poor judgement are the most common causes of accidents, which is why schools have low speed limits around them, as kids can be stupid around roads.

    Be helpful. I remember one time I was filtering down the centre line on a motorbike (dispatch rider) past stationary traffic, and a taxi driver stuck his hand out. I braked, and a pushchair popped out from between the stationary traffic. Without that warning I could have killed a toddler; as it was, no harm was done, and I don't think the mother was ever aware of the danger.

    One thing about London traffic: professional drivers work the streets most of the day, and they are very road-aware. The most dangerous times are when schools start and when schools let out, followed by the rush hours when the non-professionals are on the road.

         

  • Re:Simple answer (Score:5, Insightful)

    by Wootery ( 1087023 ) on Wednesday May 07, 2014 @06:22AM (#46937663)

    You joke, but, like the "hit the best-protected car" policy, it would serve to punish the most safety-conscious, whilst still making some sense on short-term utilitarian grounds.

  • Time? (Score:5, Insightful)

    by Bazman ( 4849 ) on Wednesday May 07, 2014 @06:25AM (#46937673) Journal

    "Programmers have all the time in the world to get it right". HAHAHAHAHAHA.

    No, we have deadlines like everyone else. And even then we only have all the time in the CPU. Yeah, we can add more CPUs to the system, but that makes it more complex, and that makes it harder to hit that deadline. What kind of idiot made that statement?

     

  • Re:Bad example (Score:5, Insightful)

    by Ost99 ( 101831 ) on Wednesday May 07, 2014 @06:28AM (#46937689)

    Killing someone by inaction is also murder.
    The question then becomes, kill 3 or kill 300.

  • by HyperQuantum ( 1032422 ) on Wednesday May 07, 2014 @06:44AM (#46937753) Homepage

    Screw other people

    And this is what is wrong with the world.

    Let's turn the situation around: suppose you and your children are walking on the street. Will you still prefer the autonomous car to protect its single driver at all costs and kill you and your children instead? And then imagine how many autonomous cars will be on the road in the future, all with that same logic built in...

  • by Anonymous Coward on Wednesday May 07, 2014 @06:48AM (#46937763)

    Be helpful. I remember one time I was filtering down the centre line on a motorbike (dispatch rider) past stationary traffic, and a taxi driver stuck his hand out. I braked, and a pushchair popped out from between the stationary traffic. Without that warning I could have killed a toddler; as it was, no harm was done, and I don't think the mother was ever aware of the danger.

    And this is why lane straddling is illegal in most of the places I have driven! Traffic gave way to somebody, and you, being the pushy arrogant asshole overtaking illegally (the very thing you advocate against), were nearly the cause of a fatality on the road.

    *rolls eyes* If everyone else follows your advice, you won't have to. Is that what you are thinking?

  • by CastrTroy ( 595695 ) on Wednesday May 07, 2014 @07:00AM (#46937815)
    This makes a lot of sense. If we wanted to maximize safety, we wouldn't all be driving around in vehicles that weigh a couple thousand pounds. That's a lot of energy to get rid of in a short time in the event of an accident. Cars make sense for long trips or when you have a lot of stuff to carry, but going back and forth to work could be done in much smaller and lighter vehicles. You could easily build an enclosed recumbent bike with a small engine that would both get amazing gas mileage and be safe, if all the other vehicles on the road were similarly sized.
  • by CrimsonAvenger ( 580665 ) on Wednesday May 07, 2014 @07:13AM (#46937867)

    Should your car intervene, potentially killing you, for the good of society as a whole?

    No. Just, no.

    If your car "intervenes in an accident", then your car is programmed to cause an accident under certain conditions. Just no.

  • Re:Bad example (Score:5, Insightful)

    by zarr ( 724629 ) on Wednesday May 07, 2014 @07:21AM (#46937907)
    No, making the wrong choice makes you a murderer. At least 3 people are going to die no matter what you do. By not pulling that lever, you'll cause the death of another 297.
  • Re:Undefined (Score:5, Insightful)

    by MaskedSlacker ( 911878 ) on Wednesday May 07, 2014 @07:44AM (#46938019)

    Why not? The simplest method is to choose the collision with the lowest speed differential. In fact, this whole post is pointless. The self-driving car doesn't need to choose based on abstract concepts--choose the collision with the lowest speed differential. A lower speed differential means less energy transferred in the impact, which means less damage and fewer injuries. Moreover, this is trivial for the cars to determine at this stage already. They can already calculate relative speeds between themselves and other objects, so if not all of the objects can be avoided, the choice is obvious.
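
    A rough sketch of this lowest-speed-differential rule in Python; the data layout is invented for illustration, and a real perception stack would supply the relative velocities:

        import math

        # Illustrative only: among unavoidable collisions, pick the target with
        # the smallest closing speed (magnitude of the relative velocity).
        def closing_speed(ego_v, other_v):
            return math.hypot(ego_v[0] - other_v[0], ego_v[1] - other_v[1])

        def lowest_speed_differential(ego_v, options):
            return min(options, key=lambda o: closing_speed(ego_v, o["velocity"]))

        # Ego car heading +x at 20 m/s; one target oncoming, one moving the same way.
        options = [{"id": "small car", "velocity": (-8.0, 0.0)},
                   {"id": "heavy car", "velocity": (5.0, 0.0)}]
        print(lowest_speed_differential((20.0, 0.0), options)["id"])  # heavy car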

  • by Hodr ( 219920 ) on Wednesday May 07, 2014 @07:45AM (#46938027) Homepage

    When this is common, I will be first in line for the OBD-999 hack that shows I always have 5 children in the back of my car.

  • Physics first (Score:5, Insightful)

    by Antique Geekmeister ( 740220 ) on Wednesday May 07, 2014 @07:49AM (#46938047)

    While a complex guidance system may be designed from the top down with these sorts of questions in mind, a crashing vehicle is always a deadly weapon. Effort spent reducing the risk of the accident itself, by improving brakes, sensors, headlight effectiveness, and the crash resistance of the vehicle, is likely to be far more efficient and reliable than complex advance modeling or moral quandaries. The sophistication needed to evaluate the secondary effects of a crash is far, far beyond the capabilities of what must be a very reliable, extremely robust guidance system. Expanding its sophistication is likely to introduce far more _bugs_ into the system.

    This is a case where "Keep It Simple, Stupid" is vital. Reduce speed in a controlled fashion. Avoid pedestrians, if they can be detected, because they have no armor. Get off the road in a controlled fashion.

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday May 07, 2014 @07:59AM (#46938109) Homepage Journal

    Yeah, deaf kids shouldn't be playing on train tracks.

    Not only is that true, but deaf kids should be able to feel the train coming. Having spent much of my youth living next to some train tracks, putting coins on them (not in stacks, of course) and so on, you can definitely feel it before you can see it. Or, you know, feel it hitting you, then feel nothing.

  • Re:Bad example (Score:3, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday May 07, 2014 @08:00AM (#46938125) Homepage Journal

    No, making the wrong choice makes you a murderer. At least 3 people are going to die no matter what you do. By not pulling that lever, you'll cause the death of another 297.

    Uh no. By not pulling that lever, you'll fail to stop the death of 300, but you also won't cause the death of 3. In no scenario discussed is the lever-puller a murderer except if they decide the children should die, and pull the lever.

  • Re:A cat (Score:4, Insightful)

    by miknix ( 1047580 ) on Wednesday May 07, 2014 @08:00AM (#46938129) Homepage

    definitely, a cat, I hate them.

  • Re:Undefined (Score:5, Insightful)

    by Muad'Dave ( 255648 ) on Wednesday May 07, 2014 @08:08AM (#46938181) Homepage

    So you would have it choose to mow down the stationary infant in its stroller, as opposed to tapping a parking pickup truck backing up at 10 MPH?

    The problem with his original question is that he assumes the self-driving car has knowledge of the type, mass, and vulnerability of things around it. This might be the test case for the three laws of robotics - do not ever choose to hit an unprotected human (probably includes motorcyclists, bicyclists, and pedestrians). If you know (by a beacon or whatever) that a vehicle is completely autonomous and does not contain humans and has comparable delta-V, give that preference. If hitting a vehicle likely containing a human is inevitable, choose the lowest speed impact.
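
    The priority order described here could be expressed as a simple two-level ranking. A hedged sketch in Python, with all field names (unprotected_human, occupied, closing_speed) assumed for illustration rather than taken from any existing system:

        # Illustrative ranking per the comment above: never choose an unprotected
        # human, prefer a beaconed unoccupied autonomous vehicle, otherwise take
        # the lowest-speed impact. Field names are hypothetical.
        def target_cost(t):
            if t["unprotected_human"]:      # pedestrian, cyclist, motorcyclist
                tier = 2                    # worst tier: avoid at all costs
            elif not t["occupied"]:         # beacon says no humans on board
                tier = 0                    # best tier
            else:
                tier = 1                    # occupied vehicle: minimise impact speed
            return (tier, t["closing_speed"])

        def pick_target(unavoidable_targets):
            return min(unavoidable_targets, key=target_cost)

        targets = [
            {"id": "cyclist",       "unprotected_human": True,  "occupied": True,  "closing_speed": 6.0},
            {"id": "empty shuttle", "unprotected_human": False, "occupied": False, "closing_speed": 14.0},
            {"id": "sedan",         "unprotected_human": False, "occupied": True,  "closing_speed": 9.0},
        ]
        print(pick_target(targets)["id"])  # empty shuttle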

  • Re: Undefined (Score:4, Insightful)

    by mbone ( 558574 ) on Wednesday May 07, 2014 @08:17AM (#46938239)

    That is not possible. I can see it in the court case: "You had the capacity to choose, yet you chose not to choose, and my daughter is dead."

    But the simple reality is, that will happen anyway, no matter what decision is made. ("You chose to minimize the probability of X, and now my daughter is dead.")

    I don't, by the way, buy the "Programmers have all the time in the world to get it right" bit. Programmers will not be able to anticipate everything, and their software will not always be able to calculate everything in the few milliseconds or so you might have to make such decisions.

  • Re:Undefined (Score:4, Insightful)

    by MaskedSlacker ( 911878 ) on Wednesday May 07, 2014 @08:26AM (#46938317)

    Since when do trees move faster than children?

  • Re:Bad example (Score:1, Insightful)

    by Anonymous Coward on Wednesday May 07, 2014 @08:42AM (#46938419)

    Uh no. By not pulling that lever, you'll fail to stop the death of 300

    Semantics. You'll still have their blood on your hands.

    Unless, of course, you're a sociopath who can somehow justify it because 'I didn't actually make a positive choice' or something.

    A lie of omission is still a lie. Choosing the default is still a choice.

  • Re:Bad example (Score:2, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday May 07, 2014 @08:53AM (#46938539) Homepage Journal

    Semantics. You'll still have their blood on your hands.

    Well no, no I won't. Because I didn't cause the train crash to begin with.

    Idiots like you can say anything they want about inaction, but it's not the same as action no matter how you slice it.

    A lie of Omission is still a lie.

    A lie of omission depends on a statement which omits something. Not pulling a lever is not at all comparable; a statement is an action. Not pulling a lever is not an action.

    Choosing the default is still a choice.

    Sure. But in this case, it is not a choice which leads to murder, and no amount of arguing otherwise will make it so. If you want to argue that murder is more moral than watching those people die, go right ahead. You clearly have no fucking idea what you're on about, otherwise.

  • by Anonymous Coward on Wednesday May 07, 2014 @09:18AM (#46938765)

    Having worked on autonomous safety systems, we chose to reject the inane sophism in this thread and instead operate under the real constraints (time and money) to design a system that optimized safety within those constraints. See, if you go for the philosophical bullshit first, you will only create page views on a website. However, if you start designing and building shit, you build incremental layers of safety that improve the overall system safety. Our design objective was to create two orders of magnitude in risk reduction to the overall system, and that succeeded. Testing real hardware sure beats bullshitting in the breakroom.

  • Re:Bad example (Score:5, Insightful)

    by Nidi62 ( 1525137 ) on Wednesday May 07, 2014 @09:57AM (#46939197)
    But when you are completely able to take action and instead stay passive, you are still making an active choice to remain passive. It's the same as if you are on a sidewalk and you see the person next to you not paying attention and about to step into oncoming traffic: you can grab them and save them from getting hit, or you can stand there and watch them splatter a windshield. Sure, you won't get arrested for murder for letting them step out onto the street, but you certainly let them die. To me, what is moral is helping when you have the ability to help.
  • by JDG1980 ( 2438906 ) on Wednesday May 07, 2014 @11:17AM (#46940069)

    It's important to keep in mind that when such crashes happen, the programmers/manufacturers/insurance companies won't have to defend them to a committee of ivory-tower utilitarian philosophers. They're going to have to defend them to a jury made up of ordinary citizens, most of whom believe that strict utilitarian ethics is monstrous sociopathy (and probably an affront to their deontological religious beliefs as well). And of course, these jury members won't even realize that they are thinking in such terms.

    Thus, whatever the programming decisions are, they have to be explicable and defensible to ordinary citizens with no philosophical training. That's why I agree with several other commenters here that "slam on the brakes" is the most obvious out. It's a lot easier to defend the fact that the car physically couldn't stop in time than to defend a deliberate choice to cause one collision in order to avert a hypothetical worse crash. This is especially true since a well-designed autonomous car drives conservatively, and would only be faced with such a situation if someone else is doing something wrong, such as dashing out into traffic right in front of the vehicle at a high rate of speed without looking. In any other situation, the car would just stop before any crash with anything took place. If you absolutely can't avoid hitting something, slamming on the brakes makes it more likely that at least you hit the person who did something to bring it on themselves, rather than one who's completely innocent.
