Transportation AI

Who Is Liable When a Self-Driving Car Crashes? 937

innocent_white_lamb writes "Current laws make the driver of a car responsible for any mayhem caused by that vehicle. But what happens when there is no driver? This article argues that the dream of a self-driving car is futile, since the law requires that the driver be responsible for the operation of the vehicle. Therefore, even if a car is self-driving, you as the driver must stay alert and pay attention. No texting, no reading, no snoozing. So what's the point of a self-driving car if you can't relax or do something else while 'driving?'"
This discussion has been archived. No new comments can be posted.
  • Efficiency. (Score:5, Insightful)

    by lifewarped ( 833507 ) on Thursday January 09, 2014 @01:45PM (#45908079)
    A self-driving car would be less likely to rubberneck or cause the other problems that stem from human drivers. Cars could, in theory, go faster, etc.
    • Can't wait till they update all those dedicated bus/carpool lanes: self-driving cars, no speed limit, max safe speed determined by the cars themselves, with slower cars automatically pulling over to let faster cars pass. Hell, leave the buses in, as long as they stop obstructing the flow of traffic.

      • Re: (Score:3, Insightful)

        How is it efficient if you drive as fast as possible? Fuel mileage decreases once you hit about 50 mph. After that you're driving your costs higher.

        A report showing the effect [nbcnews.com] and a chart [fueleconomy.gov] which gives a graphical representation of this effect.
        • Re:Efficiency. (Score:5, Insightful)

          by cdrudge ( 68377 ) on Thursday January 09, 2014 @02:50PM (#45909139) Homepage

          Efficiency can have multiple meanings. You're talking about maximizing mileage for the fuel used. What if we're talking about getting you from point A to point B as fast as possible, to minimize your travel time and maximize your time at the destination? Or, if the self-driving car is a taxi, delivering one fare and picking up another while balancing fuel economy, fare rates, and fare availability, "efficiently" maximizing revenue while minimizing idle time.

        • Re:Efficiency. (Score:5, Interesting)

          by holmstar ( 1388267 ) on Thursday January 09, 2014 @02:50PM (#45909151)
          Self driving cars could also form trains, driving only a few feet apart, thereby greatly reducing wind resistance. A car train might be able to get the same fuel economy at 85 mph as a single car would achieve at 50.
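Whether drafting alone can buy that much is a physics question. Here is a back-of-the-envelope sketch, assuming aerodynamic drag dominates highway fuel burn and that the aero share of fuel per mile scales with the square of speed; the two speeds come from the comment above, everything else is an assumption:

```python
# Sanity check of the platooning claim (illustrative only; assumes the
# aero share of fuel burned per mile scales with Cd_effective * v^2).
v_solo, v_train = 50.0, 85.0  # mph, from the comment above

# Matching the solo car's economy at train speed requires:
#   Cd_train * v_train**2 == Cd_solo * v_solo**2
cd_reduction = 1.0 - (v_solo / v_train) ** 2
print(f"Drafting would need to cut effective drag by ~{cd_reduction:.0%}")
# -> ~65%; close-following platoon experiments have reported drag
#    reductions in roughly this range for tightly spaced trailing cars,
#    so the claim is aggressive but not absurd.
```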
        • Re:Efficiency. (Score:5, Insightful)

          by silas_moeckel ( 234313 ) <silas@@@dsminc-corp...com> on Thursday January 09, 2014 @02:51PM (#45909155) Homepage

          Time efficiency vs. cost. I cannot get more time, but I can get more money, thus I value my time far more than money. By your charts, paying 33-50% more to get someplace 2x as fast is well worth it. If your time is cheap but your money dear, stay in the slow lane.
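To make that tradeoff concrete, here is a tiny sketch with made-up numbers: the one-hour trip and $10 fuel bill are hypothetical, and the ~40% fuel penalty is just the midpoint of the 33-50% range above:

```python
# Hypothetical numbers: a 1-hour trip that halves in duration if you
# drive fast, at the cost of ~40% more fuel (mid-range of 33-50%).
trip_hours_slow = 1.0
fuel_cost_slow = 10.0                    # dollars, assumed
fuel_cost_fast = fuel_cost_slow * 1.40   # ~40% fuel penalty, assumed

hours_saved = trip_hours_slow / 2
extra_dollars = fuel_cost_fast - fuel_cost_slow
breakeven = extra_dollars / hours_saved
print(f"Going fast pays off if your time is worth > ${breakeven:.2f}/hr")
# -> $8.00/hr under these assumptions; if your time is cheap and your
#    money dear, the slow lane wins.
```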

        • Re:Efficiency. (Score:4, Informative)

          by cyn1c77 ( 928549 ) on Thursday January 09, 2014 @03:48PM (#45909931)

          How is it efficient if you drive as fast as possible? Fuel mileage decreases once you hit about 50 mph. After that you're driving your costs higher.

          A report showing the effect [nbcnews.com] and a chart [fueleconomy.gov] which gives a graphical representation of this effect.

          Time is money, friend.

    • by jythie ( 914043 )
      Though I wonder how long it would take before marketers started allowing for customized driving parameters.

      One of the major problems with current traffic flow is that it only takes a few aggressive drivers gaining minor advantages to slow everything down. There are enough people who, when presented with 'you can get there in 8 minutes but everyone else will take 12, or everyone including you can take 10 minutes, but for each person who chooses 8 minutes everyone, including them, will take one minute longer,' wi
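Read as a toy congestion game, the numbers in that comment sketch out like this; the payoffs are taken straight from the comment, nothing else is implied:

```python
# Toy model of the scenario above: cooperators have a 10-minute baseline,
# defectors an 8-minute baseline, and each defector adds 1 minute to
# everyone's trip (all numbers from the parent comment).
def trip_times(n_defectors):
    penalty = n_defectors             # +1 minute per aggressive driver
    return 8 + penalty, 10 + penalty  # (defector time, cooperator time)

for k in range(5):
    fast, slow = trip_times(k)
    print(f"{k} defectors: defectors take {fast} min, cooperators {slow} min")
# Defecting always saves the individual 2 minutes, but with 2+ defectors
# even the defectors are slower than the all-cooperate baseline of 10 min:
# the prisoner's-dilemma shape that per-car "aggression" settings recreate.
```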
      • Re:Efficiency. (Score:5, Insightful)

        by amicusNYCL ( 1538833 ) on Thursday January 09, 2014 @02:01PM (#45908375)

        Think of all the problems it could solve, though. For example, oblivious drivers riding shoulder to shoulder at the same speed and not letting anyone else pass. If the cars were autonomous, they could simply tell each other to move over. I would love to have that ability now. Lane speed could also be regulated. If you wanted your car to drive slower, it would stay in the farther-right lanes. If your car was being passed on the right, it would keep moving over until no one was passing it on the right. It would be great if humans did that today; failing to do so is the cause of most of the slowdown I see on the highways.

        • by csumpi ( 2258986 )
          Dude! Google Maps, after having collected millions and millions of data points, still can't figure out how to get me 5 miles to my destination on a sane route. Not just on my phone, but also not on Google's bajillion-dollar server farm. How the fugg would self-driving cars tackle problems like you're talking about, if we can't even calculate a route on a friggin map!
        • Yeah, that's the biggest impediment I see to traffic flow outside of traffic jams. Particularly bad are the drivers who feel the urge to speed up to pass a truck, but once they're in front of the truck they are no longer pressured and slow down, matching speeds with the truck which is now just behind them on the right. This results in a huge jam of cars behind them since they are now blocking the only passing lane around the truck.

          Germany solved that on the Autobahn by making a law requiring you to mov
      • Re:Efficiency. (Score:5, Interesting)

        by crakbone ( 860662 ) on Thursday January 09, 2014 @02:10PM (#45908511)
        Actually, I see the opposite. When I drive people around, they talk, work on their phones, or make calls. They don't usually tell me to go faster. In an autonomous car you would most likely see people start to do other, more important things than worry about the 0.25-second advantage they would gain by cutting off three cars.
      • Aggressive drivers aren't the only cause of traffic congestion. Traffic waves often begin when one or two drivers aren't paying attention and brake too late or harder than necessary, causing those behind to slow suddenly as well. Once this slowdown has occurred, it effectively reduces the carrying capacity of the road, and it will persist until traffic volume has dropped to match the reduced carrying capacity.
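A minimal queueing sketch of that mechanism, with all rates invented for illustration:

```python
# One over-braking event briefly cuts the road's carrying capacity; the
# queue it seeds then persists for as long as demand exceeds the reduced
# capacity (all rates below are assumptions).
capacity_free = 30.0    # cars/min through the stretch, free flow
capacity_jammed = 20.0  # cars/min while stop-and-go waves persist
inflow = 25.0           # cars/min arriving upstream

queue = 0.0
for minute in range(60):
    jammed = queue > 0 or minute == 5   # minute 5: someone brakes too hard
    capacity = capacity_jammed if jammed else capacity_free
    queue = max(0.0, queue + inflow - capacity)
    if minute in (4, 5, 6, 30, 59):
        print(f"t={minute:2d} min  queue={queue:6.1f} cars")
# The queue grows at inflow - capacity_jammed = 5 cars/min long after the
# original driver is gone; it only clears once inflow drops below 20.
```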
    • by mcgrew ( 92797 ) *

      True, self-driving cars will be safer, but that doesn't answer the question. At first you'll still need insurance. If one of them does cause a crash because of a mechanical malfunction, why would anything change? Automakers and mechanics are sued all the time for crashes caused by mechanical problems (which are actually rare, almost all car wrecks are human error). Example: Ford and Firestone for the SUV rollovers.

      I think eventually driving without insurance will be legalized when the human factor is remove

      • Re:Efficiency. (Score:5, Insightful)

        by MindStalker ( 22827 ) <mindstalker@@@gmail...com> on Thursday January 09, 2014 @02:27PM (#45908787) Journal

        Of course, in the real world the driver is almost never personally held liable. If I let my friend drive my car and he causes a crash by accident, my insurance for my car will pay for the accident. I didn't cause the crash, but my car, which I insured, crashed, so ultimately my insurance pays for it and my rates go up. Whether the driver is my friend or an AI system is irrelevant.

    • by Matheus ( 586080 )

      I think it doesn't really matter how safe the auto-drive is or how attentive the driver is when it comes to answering the OP's question. The legality will be (should be) exactly the same as when you're using cruise control. When a driver engages such a "tool" to "assist" their driving, they are not handing liability over to the "tool". If I engage cruise control and look away, and cruise control propels me into another vehicle that has stopped in front of me, I am still 100% liable for that accident (with a po

  • Safety (Score:5, Insightful)

    by adamstew ( 909658 ) on Thursday January 09, 2014 @01:45PM (#45908085)

    I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.

    Self-driving cars, I believe, have the ability to drastically reduce deaths caused by motor vehicle accidents...one of the leading causes of death in the USA.

    • Boring Drive (Score:5, Insightful)

      by ZombieBraintrust ( 1685608 ) on Thursday January 09, 2014 @01:52PM (#45908181)
      But with nothing to do behind the wheel 99% of the time, you're not going to be alert. You're going to be super bored. So when you're supposed to take over, you won't be prepared to do so.
      • Re:Boring Drive (Score:5, Informative)

        by SJHillman ( 1966756 ) on Thursday January 09, 2014 @02:02PM (#45908385)

        Cars should have a failsafe option when faced with a decision in dangerous circumstances. Something like "pull the fuck off the road without hitting shit then ask what to do". Sure, even a failsafe option can't account for everything, but it will probably still do a better job than your average human driver - alert or not.

        If we always waited until 100% of the issues are ironed out, then we still wouldn't even be using fire. Personally, once machine drivers are statistically safer than human drivers, I'm all for adopting them as our vehicular overlords.

        • Re: (Score:3, Insightful)

          Yep, you're right, but the problem is that people are so fucking stupid that if any non-autonomous drivers were on the road, it would be pulling over constantly. How many times a week do people get too close to you on the highway or tailgate? How many times a week do you pull up to a four-way stop and some hillbilly can't comprehend what to do? The same things will happen to self-driving cars while there are still people driving their own machines.

          • Re:Boring Drive (Score:4, Informative)

            by SJHillman ( 1966756 ) on Thursday January 09, 2014 @02:22PM (#45908701)

            Tailgating, speeding, failure to signal, etc are all behaviors that the current generation of self-driving car can already account for using the same tactics that a sane human driver would use. Back in August 2012, Google's team announced that they had passed the 300,000 autonomous mile mark on public roads. Accident-free.

      • Re:Boring Drive (Score:5, Insightful)

        by danlip ( 737336 ) on Thursday January 09, 2014 @02:26PM (#45908777)

        Not to mention that people who have been using self-driving cars all their life will have 99% less driving experience. They will basically all be student drivers, but without a teacher in the car when something goes wrong.

    • I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.

      The problem with that is, how much notice do you think a computer is going to give the operator when it comes across one of those situations it doesn't know how to handle? 5 minutes, probably not; 5 seconds*, maybe. That's not a lot of time to switch gears from "casually reading a book" to "OMFGABIRDISCOMINGTHROUGHTHEWINDSHIELD!"

      * I'm probably being quite generous.

      • A machine equipped with the full range of sensors available today will probably be able to detect, decide and alert the passengers to the threat faster than the average human driver would be able to detect and react to the same threat in the majority of situations.

          • Human brains are still better than computers at that type of pattern matching. Autonomous cars will require strong AI.

            • I forget which manufacturer it is, but they've begun to equip cars with IR sensors that can identify people, deer, etc. at a much greater distance than the car's headlights penetrate. A computer could act on that information immediately, and act differently based on whether it's a cyclist or a deer, whether it's moving parallel, towards, or away from the road, and other variables, but the best you can relay to a human in that time span is "SOMETHING AHEAD!". Google is heading towards the 500,000 autonomous mil

        • by icebike ( 68054 )

          A machine equipped with the full range of sensors available today will probably be able to detect, decide and alert the passengers to the threat faster than the average human driver would be able to detect and react to the same threat in the majority of situations.

          That doesn't mean the humans would be able to do anything about it. An alert human who was driving might be able to do so, but one chatting or reading a book won't be able to do anything.

          So you've added nothing to the conversation except a bunch of Rah Rah Rah, Hip Hip Hooray for automated cars, with vague promises of increased safety and unproven assurances.

          But you've done nothing to answer the question under discussion, which is what makes that totally bored and intentionally disengaged driver legally responsible

    • Re:Safety (Score:4, Insightful)

      by gstoddart ( 321705 ) on Thursday January 09, 2014 @01:58PM (#45908301) Homepage

      I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do.

      No way that's gonna work.

      There's no way you can expect people to be alert and responsive if they only have to be on the ball for that small fraction of the time -- they'll have started reading their paper or doing plenty of other things.

      If I'm responsible for the operation of the vehicle, I'll bloody well drive myself and be engaged the entire time, and I don't need your autonomous car.

      If I'm not responsible for the operation of the vehicle, I want to be in the back seat in one hell of a good safety cage with no pretense whatsoever that I'm in control.

      You can't have the vehicle responsible most of the time and the ostensible operator responsible whenever that suddenly stops working; it defeats the purpose.

      Which, to me, is kind of a fairly fundamental problem with self driving cars. It's all or nothing. And if *all* the cars on the road aren't autonomous, then the autonomous ones are mostly a traffic hazard with no clear liability.

      • And if *all* the cars on the road aren't autonomous, then the autonomous ones are mostly a traffic hazard with no clear liability.

        The Google self-driving car has already shown itself to be insanely good at avoiding crazy human drivers, even going as far as swerving out of the way of human drivers trying to ram it. The only way autonomous cars will be a traffic hazard to human drivers is if the production cars take a HUGE step down from the existing prototypes. That's just not going to happen.

        A little bit of that is here: http://www.technologyreview.com/news/520746/data-shows-googles-robot-cars-are-smoother-safer-dri [technologyreview.com]

      • If I'm responsible for the operation of the vehicle, I'll bloody well drive myself and be engaged the entire time, and I don't need your autonomous car.
        If I'm not responsible for the operation of the vehicle, I want to be in the back seat in one hell of a good safety cage with no pretense whatsoever that I'm in control.

        Imagine a slightly different situation. You are a rich bloke who hires a chauffeur to do your driving for you, and some law says that you are still legally responsible for any accidents caused by your car (in most cases you would be responsible anyway -- kind of: your insurance pays, and your premium goes up). You would just try to hire a good chauffeur and fire him if he drives dangerously, but you wouldn't be constantly watching him. And that chauffeur would be a professional where you are an amateur, and t

    • by alen ( 225700 )

      Most car deaths in the USA happen around the big holidays and on weekends -- times when people are drinking and driving, and probably driving late at night while tired.

      If the drunks buy the self-driving cars, then it will reduce deaths. But then, by law, they still have to stay alert to take over.

    • by icebike ( 68054 )

      I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do.

      I'm not allowed to run the train
      The whistle I can't blow
      I'm not allowed to say how far
      The railroad cars can go.
      I'm not allowed to shoot off steam,
      Nor even clang the bell
      But let the damn train jump the track
      And see who catches Hell!

      As the old poem suggests, and the article makes clear, there is no way a human can be awakened and handed an emergency when automation exceeds its limitations. This might work on a milling machine, when a tool runs out of raw materials, but it can't work at 70 mph with impending d

  • Insurance (Score:5, Insightful)

    by mfwitten ( 1906728 ) on Thursday January 09, 2014 @01:46PM (#45908105)

    There's an industry that manages risk.

    Regulation (e.g., insurance) always develops spontaneously, because there is a market for reducing chaos.

    • Re:Insurance (Score:5, Interesting)

      by jythie ( 914043 ) on Thursday January 09, 2014 @01:53PM (#45908205)
      *nod* I could see the liability resting on your insurance carrier, with premiums then being based on the model of car, version of software, or configuration.
      • Re:Insurance (Score:5, Interesting)

        by bill_mcgonigle ( 4333 ) * on Thursday January 09, 2014 @02:08PM (#45908481) Homepage Journal

        Right. It needs to be strictly civil liability - the government could really hose this up if they attach criminal penalties.

        The computer industry has set a terrible precedent here, which I hope is stopped. A person running an unpatched XP box in a botnet should be just as liable for the damage his PC does as a person riding in his car is for the damage his car does. Kaspersky should be selling combination AV/insurance packages.

        People wonder why Linux doesn't catch on despite being so much more secure than Windows. One of the factors is that Windows doesn't have to be as good, because liability is artificially limited by the government, and that's a direct subsidy. Absent that protection, either Windows would get better or it'd become too expensive to run.

      • Yup. And I'd bet that autonomous cars will get you a discount, like having a theft-deterrent device. Owners won't care too much that they are liable, since they were always liable and their insurance just went down. Behind the scenes I expect all sorts of juicy court battles, as insurers and manufacturers fight over things like manufacturing defects vs. improper sensor maintenance and the like -- but from the owner of the vehicle, I don't expect much resistance.

      • by vt0asta ( 16536 )

        Agreed. It's the same as if you were driving. However, there could be a safe-auto-driver discount on your insurance if you allow the vehicle to do the driving more than you do... and if there is an added fee for driving a vehicle with auto-drive, that too will dictate its adoption and incorporate liability costs. Further, there are these things called courts, and they've been known to settle grey areas like these. "What did the manufacturer know and when did they know it?" Also, as per usual... the life you sa

    • Indeed, this will solve itself over time:
      At first, drivers will probably have to stay alert and ready to take over (hands on the wheel, as required by that Merc that self-drives in traffic jams). When something of a baseline safety record for autonomous vehicles is established, you'll be allowed to go hands-off but may be required to take out additional car insurance if you use the self-driving feature. With an improving safety record, that extra insurance will drop in price over time until it'll
  • laws change (Score:5, Insightful)

    by MozeeToby ( 1163751 ) on Thursday January 09, 2014 @01:47PM (#45908115)

    Current law not appropriate for future technology! News at 11!

    • Current law not appropriate for future technology = poorly designed law. Such a law should be repealed immediately. The replacement should be technology-neutral. There are always flaws in every system; we cannot eliminate all flaws, but we can mitigate them.

      At some point, it would be better to assume the flaws, build in common structure for handling "no fault" accidents (technology failures) financially so that we remove the "get rich quick" aspects of tort litigation, and incorporate those costs into t

      • The problem is that laws can't be designed with future technology in mind as you never know where future technology will lead. Who could have envisioned, 50 years ago, that we would have cars that drove themselves? A law isn't poorly designed if some future technology isn't handled by it. In cases like that, the law needs to be updated, completely rewritten, or repealed (depending on the new situation). That's just the reality of laws and technology.

    • Hell, current laws aren't appropriate for many current technologies.

    • I was researching the appropriate statutes in the Combined Annotated Statutes of the Law of the State of (wherever) at the time the vehicle ran down six members of the State Supreme Court. I refer you to Evidence Photo #17, in which the rest of the car was full of lawbooks. Your Honor, this case should be considered pre-appealed, as it has already been presented to the Supreme Court, and I should be released on personal recognizance...

  • Talk about a crazy-assed prognostication! This is a ridiculously stupid question (cue the "even by slashdicetimmy standards" responses).

    You might as well ask what would happen if it turned out that the number of angels that can dance on a pin is finite.

  • Depends (Score:3, Insightful)

    by Murdoch5 ( 1563847 ) on Thursday January 09, 2014 @01:49PM (#45908149) Homepage
    If the car has a software issue and crashes, then the software developer is at fault. If the car has a hardware problem, then the hardware developer is at fault. If the car has a mechanical failure, then the mechanical engineer is at fault, and so on. Either develop the components/modules correctly in the first place or not at all. If modules/components have lifespans, then just lock the car from starting once those lifespans have been reached, and if you don't want to be left holding the torch when shit hits the fan, then don't get involved from the get-go. Despite this modern system of passing the buck and never accepting ownership of the problem, someone caused the issue by not doing there job right to begin with, and they should have to rectify it.
    • "If the car has a software issue and crashes then the software developer is at fault. If the car has a hardware problem then the hardware developer is at fault. If the car has a mechanical failure then then mechanical engineer is at fault and so on."

      "Fault" and "Liability" are not the same thing. You can be at fault without being liable, or liable without being at fault.

    • *their

      Also, not all failures are caused by "not doing there job right", especially when venturing into new territory. The Tacoma Narrows Bridge, a classic example of a disastrous engineering project, pushed the envelope and collapsed, but not because the engineers didn't do their job right. There hadn't been a bridge of that size with that design before, and aerodynamic concerns weren't taken into account. If that bridge hadn't collapsed and taught the lesson, some other bridge would have.

      You can never remo

  • If you are driving, you are responsible.

    A car that drives itself is responsible for itself.

    The one who pays in the event of an accident is the driver -- in this case, the car. In practice, the manufacturer would probably be liable.

    Manufacturers will probably get insurance for the car when driven autonomously. If self-driving cars are safer, this should be a lower insurance rate than you pay now. Additionally, self-driving cars will probably have sensor input that will prove/disprove fault.

  • Maybe it should be like govt-caused problems, where the taxpayers pay for all the problems.

  • by gstoddart ( 321705 ) on Thursday January 09, 2014 @01:53PM (#45908193) Homepage

    The manufacturer will have an EULA which absolves them from guilt.

    It won't be the people who sold it, because they'll also have a contract term which says they are absolved from guilt.

    So, it will come down to the owner, who will be entirely dependent on the quality of the product, as delivered by two entities who have already said "not us".

    So, if you privately buy an autonomous car and it crashes, you will likely be on the hook for it. If you merely hire one (as in a taxi), then I'm sure the people who rent them out will also absolve themselves from guilt in some strange way -- likely through arm's-length third parties who do the actual operation.

    This won't be so much "buyer beware" as "everyone else on the roadway beware", because you'll have a vehicle driving around that if it crashes, there's a long line of people who have already made sure their asses are covered.

    The lawyers for the companies making and selling these will have covered their asses before it ends up in the hands of anybody else.

  • A vehicle malfunction that causes an auto accident won't be attributed to the driver. When Toyota's gas pedals were getting stuck and causing deaths, the lawyers weren't going after drivers. There's no difference with autonomous vehicles. If the technology is found to be at fault, it will be the part manufacturer and the automaker who will be dragged into civil court.
    • There is pretty good data that most of those Toyota drivers were stomping on the wrong pedal. Same as Audi drivers 20 years ago.

      Same as 20 years ago, the lawyers don't care. They can find a sympathetic jury.

  • by i_ate_god ( 899684 ) on Thursday January 09, 2014 @01:55PM (#45908227)

    Just from memory:

    The Montreal Metro runs on autopilot, with someone in the cab for door management.

    The Vancouver SkyTrain doesn't even have a driver anywhere; it's all automated.

    Several airports (Orlando was the last one I went to), have automated trains/monorails to shuffle people between terminals.

    Most flights you take are done almost entirely on autopilot.

    So far, it seems that mass transit is increasingly automated. So why is non-mass transit any different?

    • by gstoddart ( 321705 ) on Thursday January 09, 2014 @02:05PM (#45908433) Homepage

      Except, being on rails provides distinct advantages in terms of things being on auto-pilot.

      There are far fewer degrees of freedom in terms of what can happen because, well, you're on frigging rails.

      You need to monitor your speed and your braking, but the turning is enforced by the rails unless you're going way too fast.

      So why is non-mass transit any different?

      Because cars aren't on rails?

      Planes are slightly different, because you can bet that the pilot is still ultimately responsible for the aircraft, and if it crashes due to pilot error, he's going to be the one hung out to dry. (Other than that, we mostly just hope/trust that pilots are professional, qualified, and able to do the job at hand)

      • Sure, being on a track makes autopiloting and "self-driving" easier, but the question the submitter proposes is already answered.

        We have self driving vehicles already, and amazingly, we know what to do when there's a crash.

        Hell, escalators break, and hurt people FFS. This isn't any different.

    • Maintenance.
      Many people will not maintain their car until something breaks, hoping that it won't happen at 150 km/h.

      You've also mostly chosen examples where failure is limited to the mass transit itself (except for planes, but they have pilots with a great incentive to aim for something soft). A failing automated car drifting into my lane is suddenly a much more complex liability case.

    • by Zocalo ( 252965 )
      Because all the systems you cite consist entirely of automated vehicles, in theory it should be entirely predictable what is going to be happening at any given time, which vehicle is where, and so on, leaving little to no margin of error in where the cause and any blame for an accident lay. Self-driving vehicles are almost certainly going to have to share the roads with other users, including cyclists and vehicles driven by humans, some of whom will be idiots, plus they have to deal with the v
    • by GuB-42 ( 2483988 )

      For planes, autopilot is easier. Obstacles in the air are very uncommon. You could cruise simply by going blindly from A to B in a straight line, and the chances of hitting anything would be very low. You just need relatively simple systems to reduce this risk to something insignificant. Take-off and landing are a bit trickier, but even these are more predictable than driving.
      Plus, you still have two highly trained pilots aided by air traffic controllers.

  • And the liability will shift to the manufacturer of the autonomous vehicle more so than the person riding in it and owning it.
  • by spinozaq ( 409589 ) on Thursday January 09, 2014 @01:58PM (#45908295)

    The change will happen slowly, organically, over time. A self-driving car will behave, statistically, as a very safe driver. Ownership of a self-driving car should bestow upon you lower insurance rates. If the current insurance companies balk at the idea, the private market will take over and "self-driving only" insurance companies will gladly take their place. Eventually, as a larger and larger share of vehicles become self-driving, the size of the insurance industry will shrink significantly.

    I see no reason to shift the liability burden away from the "driver". It may seem counterintuitive, but you are gaining economic advantage by using your self-driving car. For that advantage, you accept the risks and insure yourself against them. That said, operating a self-driving car will/should carry significantly less risk and liability than driving yourself around does now.

    That does not mean that the car makers are off the hook. Just like today, if a vehicle mechanically malfunctions in a way that the car maker is found responsible, the insurance company may attempt to subrogate the claim to them.

  • Who is liable if you have a crash in a taxi cab or a state-owned vehicle? The thing this article overlooked is that there is more than one business model for selling cars. Self-driving cars might flourish by allowing companies to provide a lower cost car service for those who either cannot or do not wish to drive themselves. Apps like Sidecar (http://www.side.cr/safety) and Lyft (http://www.lyft.me/safety) are already pointing in this direction and centrally controlled driverless car services could be a log

  • Just make the car white... and put a fruit symbol on it. Millions of people will buy it despite the fact it has no practical application.

  • In many areas, this is not regulated by law but by legal precedent; besides, laws can be changed and precedents evolve.
    Moreover, depending on the cause of the accident, this could easily fall under existing product liability laws, regulations, and precedents.

  • ...there has to be *somebody* who can be sued. It's the American Way.

  • Some sort of no-fault insurance that all driverless car owners would pay into that accepts responsibility for and pays out damages on accidents seems like the obvious solution here.

    If the cars are genuinely, significantly safer, then it would be cheaper than current insurance. And if there is an accident, the damages are covered, and there's no penalty to the owner.

    This doesn't seem like an intractable problem at all.

  • It's because of this conundrum that autonomous vehicles will only be novelty features on standard automobiles. It will be an auto-pilot or cruise control wherein the driver is still expected to take control in the case of an emergency that could not be measured by the car's sensors or accounted for by the car's algorithms.

    And that's not bad! It's just not as idyllic as some would prefer.

  • If you want to read, or nap or do anything other than pay attention to driving just use public transit. It's not always an option, but if you really just don't want to worry about driving it's the best choice. And it adds efficiency that even a self-driving car can't bring to commuting.
  • All one has to do is compare it to a situation involving another driverless vehicle: a car that rolls down a hill. If one owns a car parked on a hill and, for whatever reason, the car rolls down the hill, one is liable for any damages that result, as well as any fines or penalties.

    If one believes the car rolled down the hill because of a defect in the car, then one can attempt to hold the manufacturer of the car liable for the damages, penalties, fines, etc..

    If one can show that a third party did somethi
  • by gnasher719 ( 869701 ) on Thursday January 09, 2014 @02:46PM (#45909093)
    There are two distinct things. One is that you are officially the driver even if the car drives itself, and you are responsible. But the whole point of a self-driving car is that it is safer to ride in a self-driving car with your eyes closed than in a non-self-driving car with your eyes open. You are responsible, but nobody is going to say "you are responsible because you used a self-driving car without watching". They will say "you are responsible because your self-driving car caused the crash". Which will happen less often than if you drove yourself.

    Right now you have to (a) watch out what you are doing and (b) pray that you don't have an accident. With a self driving car you don't need to watch out what you or the car are doing; you still have to pray that you don't have an accident.

    And the whole idea of taking control in unexpected situations is nonsense. In the very best case, you would have to (1) do something to take control away from the computer and (2) react to the problem. In situations where there is enough time for that, the computer can handle things just fine. And people may think they are good in unexpected situations, but they are not.
  • by timholman ( 71886 ) on Thursday January 09, 2014 @03:40PM (#45909813)

    This meme of "self-driving cars will never work, because who gets sued?" keeps popping up, yet the idea of having liability insurance for personal possessions not under your direct control has been around for a long, long time. If someone visits your home and hurts himself while on your property, your homeowner's liability insurance covers you, even if you are not physically present. The insurance companies will learn to deal with self-driving vehicles, because there will be money to be made, and they will figure out a way to get into that market.

    In any case, self-driving cars are absolutely inevitable for one major reason: our aging population. Senior citizens are going to demand the freedom of personal transportation, and anyone in the U.S. who tries to tell them "no" is going to be fighting the AARP, which has some of the most powerful lobbyists in Washington. Furthermore, consider citizens who are blind, or deaf, or epileptic. Why shouldn't they have the right to personal transportation? This will become a mandate for individual rights enforced by the federal government.

    Meanwhile, people who claim self-driving cars will never work keep ignoring the elephant in the room: 35,000 fatalities, 2.2 million injuries, and a cost of $250B a year due to car crashes -- and that is just in the U.S. We slaughter each other right and left, and just shrug our shoulders. I'd much rather trust a computerized driving system, even if it has rare failures, because statistically I'll still be much, much safer on the road.

    Ultimately, this argument will all be moot. It reminds me very much of how some people railed against personal cell phones when they first began to appear. How did that work out? In thirty years, you'll have a whole generation of adults who have grown up without having spent 5 minutes of their lives behind the wheel. At that point self-driving cars win by default, because most people won't even know how to drive anymore. To them, knowing how to drive a car will be about as relevant as knowing how to saddle and ride a horse.
