Transportation AI

Who Is Liable When a Self-Driving Car Crashes? 937

innocent_white_lamb writes "Current laws make the driver of a car responsible for any mayhem caused by that vehicle. But what happens when there is no driver? This article argues that the dream of a self-driving car is futile, since the law requires that the driver be responsible for the operation of the vehicle. Therefore, even if a car is self-driving, you as the driver must stay alert and pay attention. No texting, no reading, no snoozing. So what's the point of a self-driving car if you can't relax or do something else while 'driving'?"
This discussion has been archived. No new comments can be posted.


  • Efficiency. (Score:5, Insightful)

    by lifewarped ( 833507 ) on Thursday January 09, 2014 @01:45PM (#45908079)
    A self-driving car would be less likely to rubberneck or cause the other problems that come from having a human driver. Cars could, in theory, go faster, and so on.
  • Safety (Score:5, Insightful)

    by adamstew ( 909658 ) on Thursday January 09, 2014 @01:45PM (#45908085)

    I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.

    Self-driving cars, I believe, have the ability to drastically reduce deaths from motor vehicle accidents, one of the leading causes of death in the USA.

  • Insurance (Score:5, Insightful)

    by mfwitten ( 1906728 ) on Thursday January 09, 2014 @01:46PM (#45908105)

    There's an industry that manages risk.

    Regulation (e.g., insurance) always develops spontaneously, because there is a market for reducing chaos.

  • laws change (Score:5, Insightful)

    by MozeeToby ( 1163751 ) on Thursday January 09, 2014 @01:47PM (#45908115)

    Current law not appropriate for future technology! News at 11!

  • Depends (Score:3, Insightful)

    by Murdoch5 ( 1563847 ) on Thursday January 09, 2014 @01:49PM (#45908149) Homepage
    If the car has a software issue and crashes, then the software developer is at fault. If the car has a hardware problem, then the hardware developer is at fault. If the car has a mechanical failure, then the mechanical engineer is at fault, and so on. Either develop the components/modules correctly in the first place or not at all. If modules/components have lifespans, then just lock the car from starting once those lifespans have been reached, and if you don't want to be left holding the torch when the shit hits the fan, then don't get involved from the get-go. Despite this modern system of passing the buck and never accepting ownership of the problem, someone caused the issue by not doing their job right to begin with, and they should have to rectify it.
  • Boring Drive (Score:5, Insightful)

    by ZombieBraintrust ( 1685608 ) on Thursday January 09, 2014 @01:52PM (#45908181)
    But with nothing to do behind the wheel 99% of the time, you're not going to be alert. You're going to be super bored. So when you're supposed to take over, you won't be prepared to do so.
  • Re:Safety (Score:4, Insightful)

    by gstoddart ( 321705 ) on Thursday January 09, 2014 @01:58PM (#45908301) Homepage

    I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do.

    No way that's gonna work.

    There's no way you can expect people to be alert and responsive if they only have to be on the ball for that small fraction of the time -- they'll have started reading the paper or doing plenty of other things.

    If I'm responsible for the operation of the vehicle, I'll bloody well drive myself and stay engaged the entire time, and I don't need your autonomous car.

    If I'm not responsible for the operation of the vehicle, I want to be in the back seat in one hell of a good safety cage, with no pretense whatsoever that I'm in control.

    You can't have the vehicle responsible most of the time and the ostensible operator responsible whenever that suddenly stops working; it defeats the purpose.

    Which, to me, is a fairly fundamental problem with self-driving cars. It's all or nothing. And if *all* the cars on the road aren't autonomous, then the autonomous ones are mostly a traffic hazard with no clear liability.

  • Re:Efficiency. (Score:5, Insightful)

    by amicusNYCL ( 1538833 ) on Thursday January 09, 2014 @02:01PM (#45908375)

    Think of all the problems it could solve, though. For example, oblivious drivers sitting shoulder to shoulder at the same speed and not letting anyone else pass. If the cars were autonomous, they could simply tell each other to move over. I would love to have that ability now. Lane speed could also be regulated: if you wanted your car to drive slower, it would stay in the right-hand lanes, and if your car was being passed on the right, it would keep moving over until no one was passing it on the right. It would be great if humans did that today; failing to do so is the cause of most of the slowdowns I see on the highways.

  • Re:Efficiency. (Score:1, Insightful)

    by Anonymous Coward on Thursday January 09, 2014 @02:04PM (#45908407)

    Not to mention that the government can hack your car to kill you, like they did to Michael Hastings -- and Hastings' car wasn't even self-driving.

    It is for this reason that I drive an older model with a manual transmission, manual door locks, and crank-operated windows. Government takes out my brakes? No problem, shift into first and engine-brake going 10 mph down the hill. Stuck accelerator? Put 'er in neutral. Get caught in a storm or drive into a lake? I can simply unlock the door or roll down my windows and swim out; no power components to seize up or go inactive. Starter or battery dead? Push-start the car. Save gas? Coast in neutral down large hills. It will take nothing short of a remote-controlled bomb or gunfire or a chase ram car to assassinate somebody driving an all-manual car.

    -- Ethanol-fueled

  • Re:Boring Drive (Score:3, Insightful)

    by MickyTheIdiot ( 1032226 ) on Thursday January 09, 2014 @02:09PM (#45908497) Homepage Journal

    Yep, you're right, but the problem is that people are so fucking stupid that if any non-autonomous drivers were on the road, a self-driving car would be pulling over constantly. How many times a week do people get too close to you on the highway or tailgate? How many times a week do you pull up to a four-way stop and some hillbilly can't comprehend what to do? The same things will happen to self-driving cars while there are still people driving their own machines.

  • Re:Efficiency. (Score:3, Insightful)

    by fyec ( 3404475 ) on Thursday January 09, 2014 @02:18PM (#45908627)
    I'm reminded of a quote from George Carlin: "Have you ever noticed that anybody driving slower than you is an idiot, and anyone going faster than you is a maniac?"
  • Re:Boring Drive (Score:5, Insightful)

    by danlip ( 737336 ) on Thursday January 09, 2014 @02:26PM (#45908777)

    Not to mention that people who have used self-driving cars all their lives will have 99% less driving experience. They will basically all be student drivers, but without a teacher in the car when something goes wrong.

  • Re:Efficiency. (Score:5, Insightful)

    by MindStalker ( 22827 ) <mindstalker@[ ]il.com ['gma' in gap]> on Thursday January 09, 2014 @02:27PM (#45908787) Journal

    Of course, in the real world the driver is almost never personally held liable. If I let my friend drive my car and he accidentally causes a crash, my insurance for my car will pay for the accident. I didn't cause the crash, but my car, which I insured, crashed, so ultimately my insurance pays for it and my rates go up. Whether the driver is my friend or an AI system is irrelevant.

  • Re:Efficiency. (Score:3, Insightful)

    by smooth wombat ( 796938 ) on Thursday January 09, 2014 @02:30PM (#45908845) Journal
    How is it efficient if you drive as fast as possible? Fuel mileage decreases once you hit about 50 mph. After that you're driving your costs higher.

    Here's a report showing the effect [nbcnews.com] and a chart [fueleconomy.gov] that illustrates it graphically.
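
    As a rough Python sketch of that cost claim (the gas price and mpg figures are assumptions chosen only to mimic the general shape of the linked fueleconomy.gov chart, not values taken from it):

        # Hypothetical fuel cost per 100 miles at different cruising speeds.
        # All numbers are assumptions for illustration, not measured data.
        GAS_PRICE = 3.50  # dollars per gallon (assumed)

        mpg_at_speed = {50: 34, 60: 31, 70: 27, 80: 23}  # assumed mpg at each speed

        for speed, mpg in sorted(mpg_at_speed.items()):
            cost = 100 / mpg * GAS_PRICE
            print(f"{speed} mph: {mpg} mpg -> ${cost:.2f} per 100 miles")
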
  • Re:Efficiency. (Score:4, Insightful)

    by fisted ( 2295862 ) on Thursday January 09, 2014 @02:38PM (#45908969)

    Well, I'll bite.
     

    Government takes out my brakes? No problem, shift into first and engine-brake going 10 mph down the hill.

    Good luck with that at any speed that would actually have the potential to kill you.

    Stuck accelerator? Put 'er in neutral.

    Or turn off the ignition.

    Get caught in a storm or drive into a lake? I can simply unlock the door or roll down my windows and swim out; no power components to seize up or go inactive.

    Car doors can usually be opened from the inside even when locked. The exception is rear doors with the child lock engaged.

    Starter or battery dead? Push-start the car.

    Yep. But not relevant to the point being discussed. It's about gov't being out to kill you, remember?

    Save gas? Coast in neutral down large hills.

    No, you're wasting gas that way, since you still need some fuel to keep the engine idling. Any non-ancient car will actually shut off fuel injection when you coast downhill in gear, so gravity temporarily becomes the 'fuel'.

    It will take nothing short of a remote-controlled bomb or gunfire or a chase ram car to assassinate somebody driving an all-manual car.

    And that is why your whole paranoia is even more ridiculous.
     
    Disclaimer: I drive a manual transmission too, but for none of the reasons you mention. My reasons are: a) simpler/more robust design (i.e., one less part that can fail), b) more control, c) avoiding ridicule.

  • Re: Efficiency. (Score:5, Insightful)

    by iamhassi ( 659463 ) on Thursday January 09, 2014 @02:41PM (#45909015) Journal
    You think the govt needs to hack the car to kill you? I'm more worried about the govt being able to track where every car is: once cars are driverless, the next step is to let them communicate with each other, and if they can talk, then the govt can see where every car is.

    As for who's responsible when a driverless car crashes, it will probably be the same as when a dog kills someone: the owner of the dog is responsible. Just because the owner wasn't at the wheel doesn't make them any less responsible. But just as we have learned to trust cruise control and drive-by-wire gas pedals not to suddenly accelerate, we will learn to trust driverless cars.
    But how will cops be able to spot drunk drivers if the car is doing the driving? And does it even matter if they're drunk if the car is driving them home?
  • by gnasher719 ( 869701 ) on Thursday January 09, 2014 @02:46PM (#45909093)
    There are two distinct things here. One is that you are officially the driver even if the car drives itself, and you are responsible. The other is that the whole point of a self-driving car is that riding in one with your eyes closed is safer than driving a conventional car with your eyes open. You are responsible, but nobody is going to say "you are responsible because you used a self-driving car without watching". They will say "you are responsible because your self-driving car caused the crash". Which will happen less often than if you drove yourself.

    Right now you have to (a) watch what you are doing and (b) pray that you don't have an accident. With a self-driving car you don't need to watch what you or the car are doing; you still have to pray that you don't have an accident.

    And the whole idea of taking control in unexpected situations is nonsense. In the very best case, you would have to (1) do something to take control away from the computer and (2) react to the problem. In situations where there is enough time for that, the computer can handle things just fine. And people may think they are good in unexpected situations, but they are not.
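
    To put rough numbers on that handover problem, here is a quick Python sketch (the speed and timing figures are illustrative assumptions, not measurements):

        # How much road is covered while a disengaged "driver" takes over?
        # Speed and timing figures are illustrative assumptions only.
        MPH_TO_MPS = 0.44704

        speed_mph = 70
        speed_mps = speed_mph * MPH_TO_MPS   # roughly 31 m/s

        grab_control_s = 1.5  # assumed: notice the alert and take the wheel
        assess_react_s = 1.5  # assumed: understand the scene and react

        handover_s = grab_control_s + assess_react_s
        distance_m = speed_mps * handover_s
        print(f"At {speed_mph} mph, a {handover_s:.1f} s handover covers "
              f"about {distance_m:.0f} m before the human does anything useful.")
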
  • Re:Efficiency. (Score:5, Insightful)

    by cdrudge ( 68377 ) on Thursday January 09, 2014 @02:50PM (#45909139) Homepage

    Efficiency can have multiple meanings. You're talking about maximizing mileage for the fuel used. What if we're talking about getting you from point A to point B as fast as possible, to efficiently minimize your travel time and maximize your time at the destination? Or what if the self-driving car is a taxi, dropping off one fare and picking up another, balancing fuel economy, fare rates, and fare availability to "efficiently" maximize revenue while minimizing idle time?

  • Re:Efficiency. (Score:5, Insightful)

    by silas_moeckel ( 234313 ) <silas.dsminc-corp@com> on Thursday January 09, 2014 @02:51PM (#45909155) Homepage

    Time efficiency versus cost efficiency. I cannot get more time, but I can get more money, so I value my time far more than money. By your charts, paying 33-50% more to get someplace twice as fast is well worth it. If your time is cheap but your money dear, stay in the slow lane.
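
    A back-of-the-envelope Python version of that trade-off (the speeds, mpg figures, gas price, and hourly value are all assumptions; where the break-even lands depends entirely on what you think your hour is worth):

        # Fuel cost vs. value of time for a 100-mile trip.
        # All inputs are assumptions for illustration only.
        GAS_PRICE = 3.50      # dollars per gallon (assumed)
        HOURLY_VALUE = 25.00  # assumed value of the driver's time, $/hour

        def trip_cost(speed_mph, mpg, miles=100.0):
            """Return (fuel dollars, time dollars, total) for the trip."""
            fuel = miles / mpg * GAS_PRICE
            time = miles / speed_mph * HOURLY_VALUE
            return fuel, time, fuel + time

        for speed, mpg in [(55, 33), (75, 25)]:  # assumed speed/mpg pairs
            fuel, time, total = trip_cost(speed, mpg)
            print(f"{speed} mph: fuel ${fuel:.2f} + time ${time:.2f} = ${total:.2f}")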

  • Re:Efficiency. (Score:3, Insightful)

    by davester666 ( 731373 ) on Thursday January 09, 2014 @03:07PM (#45909361) Journal

    Everyone believes they are a skilled driver with a properly maintained vehicle.

  • by adolf ( 21054 ) <flodadolf@gmail.com> on Thursday January 09, 2014 @04:30PM (#45910461) Journal

    I'm not buying a self-driving car until I can sit in the back seat.

    I'm not buying a self-driving car until I can sit in the back seat and drink a beer.
