Who Is Liable When a Self-Driving Car Crashes? 937
innocent_white_lamb writes "Current laws make the driver of a car responsible for any mayhem caused by that vehicle. But what happens when there is no driver? This article argues that the dream of a self-driving car is futile since the law requires that the driver be responsible for the operation of the vehicle. Therefore, even if a car is self-driving, you as the driver must stay alert and pay attention. No texting, no reading, no snoozing. So what's the point of a self-driving car if you can't relax or do something else while 'driving?'"
Efficiency. (Score:5, Insightful)
Re: (Score:3)
Can't wait till they update all those dedicated bus/carpool lanes: self-driving cars, no speed limit, max safe speed determined by the cars, and slower cars automatically pull over and let faster cars pass. Hell, leave the buses in, as long as they stop obstructing the flow of traffic.
Re: (Score:3, Insightful)
A report showing the effect [nbcnews.com] and a chart [fueleconomy.gov] which gives a graphical representation of this effect.
Re:Efficiency. (Score:5, Insightful)
Efficiency can have multiple meanings. You're talking about maximizing mileage for the fuel used. What if we're talking about getting you from point A to point B as fast as possible, to efficiently minimize your travel time and maximize your time at the destination? Or, if the self-driving car is a taxi, delivering one fare and picking up another, balancing fuel economy, fare rates, and fare availability to "efficiently" maximize revenue while minimizing idle time?
Re:Efficiency. (Score:5, Interesting)
Re:Efficiency. (Score:5, Insightful)
Time efficiency vs. cost. I cannot get more time, but I can get more money, so I value my time far more than money. By your charts, paying 33-50% more to get someplace twice as fast is well worth it. If your time is cheap but your money dear, stay in the slow lane.
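To put rough numbers on that trade-off (a back-of-the-envelope sketch; the speeds, MPG figures, and fuel price below are illustrative assumptions, not values from the linked chart):

# Illustrative comparison of trip time vs. fuel cost at two cruising speeds.
# All numbers (speeds, MPG, fuel price) are assumed purely for the example.
def trip(miles, mph, mpg, price_per_gallon=3.50):
    hours = miles / mph
    cost = miles / mpg * price_per_gallon
    return hours, cost

slow = trip(100, 50, 35)   # 50 mph at an assumed 35 MPG
fast = trip(100, 75, 25)   # 75 mph at an assumed 25 MPG

print(f"50 mph: {slow[0]:.2f} h, ${slow[1]:.2f}")
print(f"75 mph: {fast[0]:.2f} h, ${fast[1]:.2f}")
# With these assumptions the faster trip saves about 40 minutes and costs
# roughly 40% more in fuel, so "efficient" depends on what an hour is worth to you.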
Re:Efficiency. (Score:4, Informative)
How is it efficient if you drive as fast as possible? Fuel mileage decreases once you hit about 50 mph. After that you're driving your costs higher.
A report showing the effect [nbcnews.com] and a chart [fueleconomy.gov] which gives a graphical representation of this effect.
Time is money, friend.
Re:Efficiency. (Score:4, Informative)
The Focus at 75 mph (typical motorway speed in the UK) is running at a much higher RPM (probably 4000+ RPM; I can't say for sure, but my old Peugeot 306 ran about that) than the M3, which is practically idling at that speed. It's partly gearing: if you gave those hatchbacks an extra, taller top gear they could get great efficiency at real motorway speeds. It's also horsepower: if you generate more power per revolution (i.e. more torque), you don't need as many revolutions to maintain a speed. Obviously there's a balance, as increasing HP typically means decreasing MPG.
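A quick way to see the gearing effect in numbers (a sketch; the gear ratios and tire size below are invented for illustration and don't correspond to either actual car):

# Engine RPM at a given road speed, from the overall gear ratio and tire size.
# The ratios and tire diameter below are invented purely for illustration.
import math

def engine_rpm(speed_mph, overall_ratio, tire_diameter_in=25.0):
    inches_per_mile = 63360
    tire_circumference_miles = math.pi * tire_diameter_in / inches_per_mile
    wheel_rpm = speed_mph / 60 / tire_circumference_miles   # wheel revolutions per minute
    return wheel_rpm * overall_ratio

# overall_ratio = top-gear ratio x final-drive ratio (assumed values)
print(engine_rpm(75, overall_ratio=4.0))   # short-geared hatchback: ~4000 RPM
print(engine_rpm(75, overall_ratio=2.3))   # tall-geared top gear: ~2300 RPM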
Re:Efficiency. (Score:5, Funny)
Re:Efficiency. (Score:4, Insightful)
Well, I'll bite.
Government takes out my brakes? No problem, shift into first and engine-brake going 10 mph down the hill.
Good luck with that at any speed which would have the potential to kill you.
Stuck accelerator? Put 'er in neutral.
Or turn off the ignition.
Get caught in a storm or drive into a lake? I can simply unlock the door or roll down my windows and swim out; no power components to seize up or go inactive.
Car doors can usually be opened from the inside even when locked. An exception is back doors with stupid-child protection engaged.
Starter or battery dead? Push-start the car.
Yep. But not relevant to the point being discussed. It's about gov't being out to kill you, remember?
Save gas? Coast in neutral down large hills.
No, you're wasting gas that way, since you still need some to keep the engine idling. Non-ancient cars will actually shut off fuel injection when gravity temporarily becomes the 'fuel'.
It will take nothing short of a remote-controlled bomb or gunfire or a chase ram car to assassinate somebody driving an all-manual car.
And that is why your whole paranoia is even more ridiculous.
Disclaimer: I drive a manual transmission too, but for none of the reasons you mention. My reasons are: a) simpler/more robust design (i.e. one less part that can fail), b) more control, c) avoiding ridicule.
Re:Efficiency. (Score:5, Interesting)
An exception is back doors with stupid-child protection engaged.
Okay, I'm somewhat off-topic, but I've gotta ask what you have against this? I've personally had the experience of a former GF's 4 yr old opening a back door while I was cruising down the road.
Re:Efficiency. (Score:5, Funny)
As per common English operator precedence, the hyphen-operator binds tighter than the space operator.
see man 5 english for further information
Re: (Score:3)
You talk like that's a bad thing or something....?
Re: Efficiency. (Score:5, Insightful)
As for who's responsible when a driverless car crashes: it will probably be the same as when a dog kills someone, where the owner of the dog is responsible. Just because the owner wasn't at the wheel doesn't make them any less responsible, but just like we have learned to trust cruise control and drive-by-wire gas pedals not to suddenly accelerate, we will learn to trust driverless cars.
But how will cops be able to tell who's a drunk driver if the car is doing the driving? And does it even matter if they're drunk, if the car is driving them home?
Re: (Score:3)
One of the major problems with current traffic flow is that it only takes a few aggressive drivers gaining minor advantages to slow everything down. There are enough people who, when presented with "you can get there in 8 minutes while everyone else takes 12, or everyone including you can take 10 minutes, but for each person who chooses 8 minutes everyone, including them, will take one minute longer," will pick the 8 minutes.
Re:Efficiency. (Score:5, Insightful)
Think of all the problems it could solve, though. For example, oblivious drivers shoulder to shoulder going the same speed and not letting anyone else pass. If the cars were autonomous then they could simply tell each other to move over. I would love to have that ability now. Lane speed could also be regulated: if you wanted your car to drive slower, it would stay in the farther-right lanes. If your car was being passed on the right, it would keep moving over until no one is passing it on the right. It would be great if humans did that today; failing to do so is the cause of most of the slowdown I see on the highways.
Re: (Score:3)
Re: (Score:3)
Germany solved that on the Autobahn by making a law requiring you to move to the right lane unless you are overtaking.
Re: (Score:2)
Why would there be a need for passing?
Because all vehicles will not become automated overnight.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
everyone believes they are a skilled driver with a properly maintained vehicle
Re:Efficiency. (Score:5, Interesting)
Re: (Score:3)
Re: (Score:2)
True, self-driving cars will be safer, but that doesn't answer the question. At first you'll still need insurance. If one of them does cause a crash because of a mechanical malfunction, why would anything change? Automakers and mechanics are sued all the time for crashes caused by mechanical problems (which are actually rare; almost all car wrecks are human error). Example: Ford and Firestone for the SUV rollovers.
I think eventually driving without insurance will be legalized when the human factor is removed.
Re:Efficiency. (Score:5, Insightful)
Of course, in the real world the driver is almost never personally held liable. If I let my friend drive my car and he causes a crash by accident, my insurance for my car will pay for it. I didn't cause the crash, but my car, which I insured, crashed, so ultimately my insurance pays for it and my rates go up. Whether the driver is my friend or an AI system is irrelevant.
Re: (Score:3)
I think it doesn't really matter how safe the auto-drive is or how attentive the driver is when it comes to answering the OP's question. The legality will be (should be) exactly the same as when you're using cruise control. When a driver engages such a "tool" to "assist" their driving, they are not handing liability over to the "tool". If I engage cruise control and look away, and cruise control propels me into another vehicle that has stopped in front of me, I am still 100% liable for that accident (with a po
Re: (Score:3)
The car manufacturers will have to take out insurance against insurance companies.
It'll work itself out. The only thing *everybody* seems to agree on is that overall, driving will be much safer once we remove human inputs from the controls.
Also, I believe the switchover to driverless cars will be amazingly fast. People will be fighting each other in the showrooms when they figure out they can watch porn instead of driving.
Re: (Score:3)
It's not that a driverless car can "do better" under certain circumstances, it's what happens when it doesn't.
This statement gets a little under my skin because it implies that under some conditions an average human driver will be able to avoid an incident that the car can't. Humans will never, at least not without implants, be able to compute the variables faster than a computer. If a self-driving car can't avoid an impending incident, there is no way I will believe a human could. Any accidents that occur with self-driving cars, initially, I'm sure will be because of human drivers not doing what they should be doing.
Re:Efficiency. (Score:5, Interesting)
If a self-driving car can't avoid an impending incident there is no way I will believe a human could.
We're headed down the freeway. Up ahead I see some teenagers standing on an overpass holding something large and watching cars pass underneath. I recognize a potential dropped rock and change lanes to get away from it. Will the computer do that?
I'm almost home. I see the neighbor kid playing basketball in his driveway. He shoots. He misses. I know as soon as he misses that there is a good chance the ball will roll out into the street, and knowing how oblivious the neighbor kid is I can expect him to follow. Will the computer know this? In fact, I see the kid running towards the street, but he is hidden behind a parked van and will not actually be visible in the street until he's in the street directly in front of me. Will the car track him all the way from the upper end of his driveway?
I'm passing an intersection and there are two people standing on the corner. They are in a position where they might step into the crosswalk. Can the computer read those people's body language to predict that they will or won't step off the sidewalk in front of me?
There are any number of fuzzy-logic problems that the computer will never solve as fast and as correctly as a human does, simply because the data will be missing. Everyone who claims that the new robotic car overlords will be better and safer at doing everything for us is hopelessly naive.
Any accidents that occur with self-driving cars, initially, I'm sure will be because of human drivers not doing what they should be doing.
Even if the self-driving cars have accidents, it will be because the humans, who are not doing anything, have done it wrong. And the NTSB is correct in its blanket finding of "pilot error" on every airplane crash, right?
like a piano falling out of the sky landing directly on a car in traffic.
Yes, when I drive, that's exactly what I fear most. And based on your claims that a human couldn't keep an eye on a helicopter with a piano dangling on a thin wire underneath but a computer could ("If a self-driving car can't avoid an impending incident there is no way I will believe a human could") I for one welcome my new robotic masters.
Safety (Score:5, Insightful)
I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But, for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.
Self-driving cars, I believe, have the ability to drastically reduce deaths caused by motor vehicle accidents...one of the leading causes of death in the USA.
Boring Drive (Score:5, Insightful)
Re:Boring Drive (Score:5, Informative)
Cars should have a failsafe option when faced with a decision in dangerous circumstances. Something like "pull the fuck off the road without hitting shit then ask what to do". Sure, even a failsafe option can't account for everything, but it will probably still do a better job than your average human driver - alert or not.
If we always waited until 100% of the issues are ironed out, then we still wouldn't even be using fire. Personally, once machine drivers are statistically safer than human drivers, I'm all for adopting them as our vehicular overlords.
Re: (Score:3, Insightful)
Yep, you're right, but the problem is that people are so fucking stupid that if any non-autonomous drivers were on the road it would be pulling over constantly. How many times a week do people get too close to you on the highway or tailgate? How many times a week do you pull up to a four-way stop and some hillbilly can't comprehend what to do? The same things will happen to self-driving cars while there are still people driving their own machines.
Re:Boring Drive (Score:4, Informative)
Tailgating, speeding, failure to signal, etc are all behaviors that the current generation of self-driving car can already account for using the same tactics that a sane human driver would use. Back in August 2012, Google's team announced that they had passed the 300,000 autonomous mile mark on public roads. Accident-free.
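For a sense of what 300,000 accident-free miles does and doesn't prove (a rough sketch; the human crash rate used here is an assumed round number, not a cited statistic):

# Compare an assumed human crash rate against an observed accident-free run.
# human_miles_per_crash is a made-up round figure used only for illustration.
human_miles_per_crash = 500_000        # assumption: one minor crash per 500k miles
autonomous_miles_driven = 300_000      # Google's reported accident-free mileage

expected_human_crashes = autonomous_miles_driven / human_miles_per_crash
print(f"Crashes a 'typical' human would rack up over the same miles: {expected_human_crashes:.1f}")
# ~0.6 expected crashes: 300k clean miles is encouraging, but by itself it's
# not yet a statistically overwhelming sample.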
Re: (Score:3)
Re:Boring Drive (Score:5, Insightful)
Not to mention that people who have been using self-driving cars all their life will have 99% less driving experience. They will basically all be student drivers, but without a teacher in the car when something goes wrong.
Re: (Score:2)
I would think the point would be that machines, once properly programmed, can be the worlds safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But, for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.
The problem with that is, how much notice do you think a computer is going to give the operator when it comes across one of those situations it doesn't know how to handle? 5 minutes, probably not; 5 seconds*, maybe. That's not a lot of time to switch gears from "casually reading a book" to "OMFGABIRDISCOMINGTHROUGHTHEWINDSHIELD!"
* I'm probably being quite generous.
Re: (Score:2)
A machine equipped with the full range of sensors available today will probably be able to detect, decide and alert the passengers to the threat faster than the average human driver would be able to detect and react to the same threat in the majority of situations.
Re: (Score:2)
Human brains are still better than computers at that type of pattern matching. Autonomous cars will require strong AI.
Re: (Score:3)
I forget which manufacturer it is, but they've begun to equip cars with IR sensors that can identify people, deer, etc. at a much greater distance than the car's headlights penetrate. A computer could act on that information immediately, and act differently based on whether it's a cyclist or a deer, whether it's moving parallel to, towards, or away from the road, and other variables, but the best you can relay to a human in that time span is "SOMETHING AHEAD!". Google is heading towards the 500,000 autonomous mile mark.
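The reaction-time difference is easy to put in numbers (a sketch with assumed values; real sensor latencies, human reaction times, and braking rates vary widely):

# Total stopping distance = distance covered while reacting + braking distance.
# The reaction times and deceleration are assumed values for illustration only.
def stopping_distance_m(speed_kmh, reaction_s, decel_ms2=7.0):
    v = speed_kmh / 3.6                      # speed in m/s
    return v * reaction_s + v**2 / (2 * decel_ms2)

print(stopping_distance_m(100, reaction_s=1.5))   # attentive human: ~1.5 s assumed
print(stopping_distance_m(100, reaction_s=0.2))   # automated system: ~0.2 s assumed
# At 100 km/h the 1.3 s difference alone is worth about 36 m of travel,
# roughly nine car lengths, before braking even begins.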
Re: (Score:3)
A machine equipped with the full range of sensors available today will probably be able to detect, decide and alert the passengers to the threat faster than the average human driver would be able to detect and react to the same threat in the majority of situations.
That doesn't mean the humans would be able to do anything about it. An alert human who was driving might be able to do so, but one chatting or reading a book won't be able to do anything.
So you've added nothing to the conversation except a bunch of Rah Rah Rah, Hip Hip Hooray for automated cars, with vague promises of increased safety and unproven assurances.
But you've done nothing to answer the question under discussion: what makes that totally bored and intentionally disengaged driver legally responsible?
Re:Safety (Score:4, Insightful)
No way that's gonna work.
There's no way you can expect people to be alert and responsive if they only have to be on the ball for that small fraction of the time -- they'll have started reading their paper or doing plenty of other things.
If I'm responsible for the operation of the vehicle, I'll bloody well drive myself and be engaged for the entire time, and don't need your autonomous car.
If I'm not responsible for the operation of the vehicle, I want to be in the back seat in one hell of a good safety cage with no pretense whatsoever that I'm in control.
You can't have the vehicle responsible most of the time and the ostensible operator responsible whenever that suddenly stops working; it defeats the purpose.
Which, to me, is kind of a fairly fundamental problem with self driving cars. It's all or nothing. And if *all* the cars on the road aren't autonomous, then the autonomous ones are mostly a traffic hazard with no clear liability.
Re: (Score:3)
And if *all* the cars on the road aren't autonomous, then the autonomous ones are mostly a traffic hazard with no clear liability.
The Google self-driving car has already shown itself to be insanely good at avoiding crazy human drivers, even going as far as swerving out of the way of human drivers trying to ram it. The only way autonomous cars will be a traffic hazard to human drivers is if the production cars take a HUGE step down from the existing prototypes. That's just not going to happen.
A little bit of that is here http://www.technologyreview.com/news/520746/data-shows-googles-robot-cars-are-smoother-safer-dri [technologyreview.com]
Re: (Score:3)
If I'm responsible for the operation of the vehicle, I'll bloody well drive myself and be engaged for the entire time, and don't need your autonomous car.
If I'm not responsible for the operation of the vehicle, I want to be in the back seat in one hell of a good safety cage with no pretense whatsoever that I'm in control.
Imagine a slightly different situation. You are a rich bloke who hires a chauffeur to do his driving for him, and some law says that you are still legally responsible for any accidents caused by your car (in most cases you would be responsible anyway - kind of. Your insurance pays, and your premium goes up). You would just try to hire a good chauffeur and fire him if he drives dangerously, but you wouldn't be constantly watching him. And that chauffeur would be a professional where you are an amateur, and t
Re: (Score:2)
Most car deaths in the USA happen around the big holidays and weekends: times when people are drinking and driving, and probably driving late at night and tired.
If the drunks buy the self-driving cars, then it will reduce deaths. But then, by law, they still have to stay alert to take over.
Re: (Score:3)
I would think the point would be that machines, once properly programmed, can be the worlds safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do.
I'm not allowed to run the train
The whistle I can't blow
I'm not allowed to say how far
The railroad cars can go.
I'm not allowed to shoot off steam,
Nor even clang the bell
But let the damn train jump the track
And see who catches Hell!
As the old poem suggests, and the article makes clear, there is no way a human can be awakened and handed an emergency the moment automation exceeds its limitations. This might work on a milling machine, when a tool runs out of raw material, but it can't work at 70 mph with impending disaster.
Insurance (Score:5, Insightful)
There's an industry that manages risk.
Regulation (e.g., insurance) always develops spontaneously, because there is a market for reducing chaos.
Re:Insurance (Score:5, Interesting)
Re:Insurance (Score:5, Interesting)
Right. It needs to be strictly civil liability - the government could really hose this up if they attach criminal penalties.
The computer industry has set a terrible precedent here, which I hope is stopped. The person running an unpatched XP box in a botnet should be just as liable for the damage his PC does as the person riding in his car is for the damage his car does. Kaspersky should be selling combination AV/insurance packages.
People wonder why Linux doesn't catch on despite being so much more secure than Windows. One of the factors is that Windows doesn't have to be as good, because liability is artificially limited by the government, and that's a direct subsidy. Absent that protection, either Windows would get better or it'd become too expensive to run.
Re: (Score:2)
Yup. And I'd bet that autonomous cars will get you a discount, like having a theft-deterrent device. Owners won't care too much that they are liable, since they were always liable and their insurance just went down. Behind the scenes I expect all sorts of juicy court battles, as insurers and manufacturers fight over things like manufacturing defects vs. improper sensor maintenance and the like - but to the owner of the vehicle, I don't expect much resistance.
Re: (Score:3)
Agreed. It's the same as if you were driving. However, there could be a safe-auto-driver discount on your insurance if you allow the vehicle to do the driving more than you do...and if there is an added fee for driving a vehicle with auto-drive, that too will dictate its adoption and incorporate liability costs. Further, there are these things called courts, and they've been known to settle grey areas like these. "What did the manufacturer know and when did they know it?" Also, as per usual...the life you save may be your own.
Re:Insurance (Score:5, Funny)
when it's time to collect, they weasel out.
Too true. Clearly we need insurance insurance.
Re:Insurance (Score:5, Informative)
Re: (Score:3)
Determining your car's software version would be too much work for them and cut into profits.
Empirical data points to the opposite: http://fitguide.installernet.com/progressive/ [installernet.com]
Re: (Score:3)
At first, drivers will probably have to stay alert and ready to take over (hands on the wheel, as required by that Merc that self-drives in traffic jams). When something of a baseline safety record for autonomous vehicles is established, you'll be allowed to go hands-off but may be required to take out additional car insurance if you use the self-driving feature. With an improving safety record, that extra insurance will drop in price over time until it'll be negligible.
Re: (Score:3)
Only the government can, by definition, regulate
Not according to my definition of "regulation".
Insurance policies have limits ... (Score:3)
That's a distinction without a difference.
No. Insurance policies often have limits. If you have a $1M policy and you are found liable for a $5M judgement then your insurance pays $1M and you are personally responsible for $4M.
laws change (Score:5, Insightful)
Current law not appropriate for future technology! News at 11!
Re: (Score:2)
Current Law not appropriate for Future Technology = Poorly designed law. Such a law should be repealed immediately. Replacement should be technology neutral. There are always flaws in every system, we cannot eliminate all flaws, but we can mitigate against them.
At some point, it would be better to assume the flaws, build in common structure for handling "no fault" accidents (technology failures) financially so that we remove the "get rich quick" aspects of tort litigation, and incorporate those costs into t
Re: (Score:2)
The problem is that laws can't be designed with future technology in mind as you never know where future technology will lead. Who could have envisioned, 50 years ago, that we would have cars that drove themselves? A law isn't poorly designed if some future technology isn't handled by it. In cases like that, the law needs to be updated, completely rewritten, or repealed (depending on the new situation). That's just the reality of laws and technology.
Re: (Score:2)
Hell, current laws aren't appropriate for many current technologies.
but your honor! (Score:3)
I was researching the appropriate statutes in the Combined Annotated Statutes of the Law of the State of (wherever) at the time the vehicle ran down six members of the State Supreme Court. I refer you to Evidence Photo #17, in which the rest of the car was full of lawbooks. Your honor, this case should be considered pre-appealed, as it has already been presented to the Supreme Court, and I should be released on personal recognizance...
troll (Score:2)
Talk about a crazy-assed prognostication! This is a ridiculously stupid question (cue the "even by slashdicetimmy standards" responses).
you might as well ask what would happen if it turned out that the number of angels that can dance on a pin turned out to be finite.
Depends (Score:3, Insightful)
Re: (Score:3)
"If the car has a software issue and crashes then the software developer is at fault. If the car has a hardware problem then the hardware developer is at fault. If the car has a mechanical failure then then mechanical engineer is at fault and so on."
"Fault" and "Liability" are not the same thing. You can be at fault without being liable, or liable without being at fault.
Re: (Score:2)
*their
Also, not all failures are caused by "not doing there job right", especially when venturing into new territory. The Tacoma Narrows Bridge, a classic example of a disastrous engineering project, pushed the envelope and collapsed, but not because the engineers didn't do their job right. There hadn't been a bridge of that size with that design before, and aerodynamic concerns weren't taken into account. If that bridge hadn't collapsed and taught the lesson, some other bridge would have.
You can never remove all risk.
The driver is responsible. (Score:2)
A car that drives itself is responsible for itself.
The one who pays in the event of an accident is the driver; in this case, the car. In practice, the manufacturer would probably be liable.
Manufacturers will probably get insurance for the car when driven autonomously. If self-driving cars are safer, this should be a lower insurance rate than you pay now. Additionally, self-driving cars will probably have sensor input that will prove/disprove fault.
Got a solution... (Score:2)
Maybe it should be like govt caused problems, where the taxpayers pay for all problems.
Won't be the manufacturer ... (Score:5, Informative)
The manufacturer will have an EULA which absolves them from guilt.
It won't be the people who sold it, because they'll also have a contract term which says they are absolved from guilt.
So, it will come down to the owner, who will be entirely dependent on the quality of the product, as delivered by two entities who have already said "not us".
So, if you privately buy an autonomous car, and it crashes, you will likely be on the hook for it. If you merely hire them (as in a Taxi), then I'm sure the people who rent them will also absolve themselves from guilt in some strange way -- likely through arms length 3rd parties who do the actual operation.
This won't be so much "buyer beware" as "everyone else on the roadway beware", because you'll have a vehicle driving around that if it crashes, there's a long line of people who have already made sure their asses are covered.
The lawyers for the companies making and selling these will have covered their asses before it ends up in the hands of anybody else.
Isn't it kind of obvious? (Score:2)
Re: (Score:2)
Re: (Score:2)
There is pretty good data that most of those Toyota drivers were stomping the wrong pedal. Same as Audi drivers 20 years ago.
Same as 20 years ago, the lawyers don't care. They can find a sympathetic jury.
Automated vehicles already exist (Score:5, Interesting)
Just from memory:
The Montreal Metro is driven by autopilot, with someone in the cab for door management.
The Vancouver SkyTrain doesn't even have a driver anywhere; it's all automated.
Several airports (Orlando was the last one I went to) have automated trains/monorails to shuttle people between terminals.
Most flights you take are flown almost entirely on autopilot.
So far, it seems that mass transit is increasingly automated. So why is non-mass transit any different?
Re:Automated vehicles already exist (Score:4, Informative)
Except, being on rails provides distinct advantages in terms of things being on auto-pilot.
There are far fewer degrees of freedom in terms of what can happen, because, well, you're on frigging rails.
You need to monitor your speed and your braking, but the turning is enforced by the rails unless you're going way too fast.
Because cars aren't on rails?
Planes are slightly different, because you can bet that the pilot is still ultimately responsible for the aircraft, and if it crashes due to pilot error, he's going to be the one hung out to dry. (Other than that, we mostly just hope/trust that pilots are professional, qualified, and able to do the job at hand)
Re: (Score:2)
Sure, being on a track makes autopiloting and "self-driving" easier, but the question the submitter proposes is already answered.
We have self driving vehicles already, and amazingly, we know what to do when there's a crash.
Hell, escalators break, and hurt people FFS. This isn't any different.
Re: (Score:2)
Maintenance.
Many people will not maintain their car until something breaks, hoping that it won't be at 150 kph.
You've also mostly chosen examples where failure is limited to the mass transit itself (except for planes, but those have pilots with a great incentive to aim for something soft). A failing automated car drifting into my lane is suddenly a much more complex liability case.
Re: (Score:2)
Re: (Score:3)
For planes, autopilot is easier. Obstacles in the air are very uncommon. You could cruise simply by going blindly from A to B in a straight line and the chances of hitting anything would be very low. You just need relatively simple systems to reduce this risk to something insignificant. Takeoff and landing are a bit trickier, but even these are more predictable than driving.
Plus, you still have two highly trained pilots aided by air traffic controllers.
It's all about liability (Score:2)
I don't think much has to change.... (Score:5, Interesting)
The change will happen slowly, organically, over time. A self driving car will behave statistically as a very safe driver. Ownership of a self driving car should bestow upon you lower insurance rates. If the current insurance companies balk at the idea, the private market will take over and "self driving only" insurance companies will gladly take their place. Eventually, as more and more share of vehicles are self driving the size of the insurance industry will shrink significantly.
I see no reason to shift the liability burden away from the "driver". It may seem counterintuitive, but you are gaining economic advantage by using your self-driving car. For that advantage, you accept the risks and insure yourself against them. That said, operating a self-driving car will/should carry significantly less risk and liability than driving yourself around does now.
That does not mean that the car makers are off the hook. Just like today, if a vehicle mechanically malfunctions in a way that the car maker is found responsible, the insurance company may attempt to subrogate the claim to them.
There's a clear business model here (Score:2)
Who is liable if you have a crash in a taxi cab or a state-owned vehicle? The thing this article overlooks is that there is more than one business model for selling cars. Self-driving cars might flourish by allowing companies to provide a lower-cost car service for those who either cannot or do not wish to drive themselves. Apps like Sidecar (http://www.side.cr/safety) and Lyft (http://www.lyft.me/safety) are already pointing in this direction, and centrally controlled driverless car services could be a logical next step.
Just make the car white... (Score:2)
Just make the car white... and put a fruit symbol on it. Millions of people will buy it despite the fact it has no practical application.
Not an issue...also it's a product liability (Score:2)
In many areas, this is not regulated by law but by legal precedent. Besides, laws can be changed and precedents evolve.
Besides, depending on the cause of the accident, this could easily fall under existing product liability laws, regulations, and precedents.
Because, God knows... (Score:2)
...there has to be *somebody* who can be sued. It's the American Way.
no fault insurance (Score:2)
Some sort of no-fault insurance that all driverless car owners would pay into that accepts responsibility for and pays out damages on accidents seems like the obvious solution here.
If the cars are genuinely significantly safer, then it would be cheaper than current insurance. And if there is an accident, the damages are covered, and there's no penalty to the owner.
This doesn't seem like an intractable problem at all.
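As a toy model of how such a pooled no-fault premium might be priced (every rate and cost below is an invented assumption, not an actuarial figure):

# Toy pricing for a shared no-fault pool: premium = expected payout + overhead.
# Every number here is an assumption for illustration, not an actuarial figure.
def annual_premium(crashes_per_million_miles, avg_claim_cost, miles_per_year,
                   overhead=0.15):
    expected_claims = crashes_per_million_miles * miles_per_year / 1_000_000
    return expected_claims * avg_claim_cost * (1 + overhead)

human_like = annual_premium(4.0, 10_000, 12_000)   # assumed human-level crash rate
driverless = annual_premium(1.0, 10_000, 12_000)   # assumed 4x-safer autonomous rate
print(f"human-level risk: ${human_like:.0f}/yr")
print(f"driverless risk:  ${driverless:.0f}/yr")
# If driverless cars really are several times safer, the pooled premium falls
# in proportion, which is the whole appeal of the scheme described above.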
Novelty, Cruise Control (Score:2)
It's because of this conundrum that autonomous vehicles will only be novelty features on standard automobiles. It will be an auto-pilot or cruise control wherein the driver is still expected to take control in the case of an emergency that could not be measured by the car's sensors or accounted for by the car's algorithms.
And that's not bad! It's just not as idyllic as some would prefer.
Public Transit (Score:2)
The article misses the obvious (Score:2)
If one believes the car rolled down the hill because of a defect in the car, then one can attempt to hold the manufacturer of the car liable for the damages, penalties, fines, etc..
If one can show that a third party did something to cause the crash, then that third party can be held liable.
Submitter doesn't understand the problem (Score:5, Insightful)
Right now you have to (a) watch what you are doing and (b) pray that you don't have an accident. With a self-driving car you don't need to watch what you or the car are doing; you still have to pray that you don't have an accident.
And the whole idea of taking control in unexpected situations is nonsense. In the very best case, you would have to (1) do something to take control away from the computer and (2) react to the problem. In situations where there is enough time for that, the computer can handle things just fine. And people may think they are good in unexpected situations, but they are not.
More FUD (Score:3)
This meme of "self-driving cars will never work, because who gets sued?" keeps popping up, yet the idea of having liability insurance for personal possessions not under your direct control has been around for a long, long time. If someone visits your home and hurts himself while on your property, your homeowner's liability insurance covers you, even if you are not physically present. The insurance companies will learn to deal with self-driving vehicles, because there will be money to be made, and they will figure out a way to get into that market.
In any case, self-driving cars are absolutely inevitable for one major reason: our aging population. Senior citizens are going to demand the freedom of personal transportation, and anyone in the U.S. who tries to tell them "no" is going to be fighting the AARP, which has some of the most powerful lobbyists in Washington. Furthermore, consider citizens who are blind, or deaf, or epileptic. Why shouldn't they have the right to personal transportation? This will become a mandate for individual rights enforced by the federal government.
In any case, people who claim self-driving cars will never work keep ignoring the elephant in the room: 35,000 fatalities and 2.2 million injuries a year, and a cost of $250B due to car crashes - and that is just in the U.S. alone. We slaughter each other right and left, and just shrug our shoulders. I'd much rather trust a computerized driving system, even if it has rare failures, because statistically I'll still be much, much safer on the road.
Ultimately, this argument will all be moot. It reminds me very much of how some people railed against personal cell phones when they first began to appear. How did that work out? In thirty years, you'll have a whole generation of adults who have grown up without having spent 5 minutes of their lives behind the wheel. At that point self-driving cars win by default, because most people won't even know how to drive anymore. To them, knowing how to drive a car will be about as relevant as knowing how to saddle and ride a horse.
Re: (Score:3)
What? Simply wrong. You cause an accident, you pay. If your brakes fail, after you pay, you might have a civil case against your mechanic or car manufacturer.
Re: (Score:2)
So you sit in jail/prison after a software fault ends up running a kid over?
Re:Ever heard of mechanical failures? (Score:4, Insightful)
I'm not buying a self-driving car until I can sit in the back seat and drink a beer.
Re: (Score:2)
If your breaks break, does that mean they stay in one piece?
Re: (Score:2)
Didn't this bug also disproportionally hit women drivers from India, too?
Re: (Score:2)
Re: (Score:2)
Generally it is the rider. I have seen drunk riders pulled over in New Mexico, where horses are used when a driver has their license taken away for DUI. Seriously, I have seen this twice in Taos.
Then I guess they get charged with RUI.
Re: (Score:2)
Drivers are going to be alert for about two weeks, and then the novelty and thrill will have worn off, and they'll be like the guy who works at the amusement park running the roller-coaster. Yawn...
Whose insurance company ? (Score:2)
Insurance companies are not liable, they just pay the bill for the person or company who is liable.
Nope, I refuse universal -anything- (Score:3)