New Questions Raised about Tesla's 'Autopilot' Safety After Three Fatalities This Week (startribune.com) 162
The Associated Press looks at three new fatalities involving Teslas this week, saying the crashes have "increased scrutiny of the company's Autopilot driving system just months before CEO Elon Musk has planned to put fully self-driving cars on the streets."
Last Sunday, a Tesla Model S sedan left a freeway in Gardena, California, at a high speed, ran a red light and struck a Honda Civic, killing two people inside, police said.... Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, said it's likely that the Tesla in Sunday's California crash was operating on Autopilot, which has become confused in the past by lane lines. He speculated that the lane line was more visible for the exit ramp, so the car took the ramp because it looked like a freeway lane. He also suggested that the driver might not have been paying close attention. "No normal human being would not slow down in an exit lane," he said...
On the same day, a Tesla Model 3 hit a parked firetruck on an Indiana freeway, killing a passenger in the Tesla... In both cases, authorities have yet to determine whether Tesla's Autopilot system was being used... Many experts say they're not aware of fatal crashes involving similar driver-assist systems from General Motors, Mercedes and other automakers. GM monitors drivers with cameras and will shut down the driving system if they don't watch the road. "Tesla is nowhere close to that standard," Rajkumar said. He predicted more deaths involving Teslas if the National Highway Traffic Safety Administration fails to take action...
And on Dec. 7, yet another Model 3 struck a police cruiser on a Connecticut highway, though no one was hurt... [T]he driver told police that the car was operating on Autopilot, a Tesla system designed to keep a car in its lane and a safe distance from other vehicles.
Rajkumar doesn't live in Cali obviously (Score:4, Informative)
"No normal human being would not slow down in an exit lane"
Not only do they SPEED UP in exit lanes here in California to try to beat the light, they'll also go slow in the acceleration lane when getting on the freeway, cross over onto the shoulder and treat it as a right turn lane, and a lot of other asinine and dangerous shit.
Totally ignorant of human selfishness, I see.
Re: (Score:3, Insightful)
More importantly, he simply doesn't know whether autopilot was on. Instead of waiting to find out from the people who can and do check such things, he makes up an answer. The media writes about the made up answer, and consumers eagerly lap it up, spreading ignorance far and wide.
Re: (Score:2)
The best part is that there'll be no follow-up article correcting the details after they appear.
Re: (Score:2)
More importantly, he simply doesn't know whether autopilot was on. Instead of waiting to find out from the people who can and do check such things, he makes up an answer. The media writes about the made up answer, and consumers eagerly lap it up, spreading ignorance far and wide.
The best part is that there'll be no follow-up article correcting the details after they appear.
The NHTSA is investigating them, as they appear to be Autopilot-related. How else would you report that?
Re: (Score:2)
Is NHTSA another name for "Associated Press"?
Re: (Score:2)
Seriously. The guy who hit the police car? What sort of credibility does "it was the autopilot, officer, honest" have?
But the firetruck sounds a lot like the earlier crash into a crosswise truck. Both seem like views that would be very rare in the training data, which shows the real limits of machine learning: it has gaps that you simply cannot predict from successful behavior.
The human mind learns from examples, then abstracts and generalizes, so once you can demonstrate that you can recognize and avoid...
Re: (Score:2)
That was my first thought - if I'd hit a police car while in a vehicle with autopilot, my first thought (okay, second, after "I'm still alive!?!") would be "can I blame the car instead of taking the heat?"
Re: (Score:2)
Yep. Coming from the land of convicts on a 3,000-mile journey around Cali, I thought I'd entered the world of Mad Max, USA-style, on the freeways.
Time is money, and money is more important than life!
Secondary (Score:2)
The first example sounds like the driver being dumb. Even using my phone sitting in my cupholder, the GPS is usually accurate enough to know what lane I'm in. If Tesla's programmers are even half competent, the car would know pretty quickly via GPS that it's not on the freeway anymore.
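To make that concrete, here is a minimal map-matching sketch of the check being described: compare the GPS fix against a freeway centerline and flag when the car has drifted too far from it. The polyline, threshold, and names are all hypothetical, not anything from Tesla's actual software.

import math

# Freeway centerline as (x, y) points in meters (hypothetical local frame).
FREEWAY = [(0.0, 0.0), (500.0, 5.0), (1000.0, 0.0)]
OFF_FREEWAY_THRESHOLD_M = 15.0  # roughly a few lane widths

def point_segment_distance(p, a, b):
    """Perpendicular distance from point p to segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection of p onto the segment to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def on_freeway(gps_fix):
    """True if the fix is within the threshold of any centerline segment."""
    d = min(point_segment_distance(gps_fix, FREEWAY[i], FREEWAY[i + 1])
            for i in range(len(FREEWAY) - 1))
    return d <= OFF_FREEWAY_THRESHOLD_M

print(on_freeway((250.0, 4.0)))   # True: near the centerline
print(on_freeway((250.0, 60.0)))  # False: the car has left the freeway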
Re: (Score:2)
"No normal human being would *"
The ever appropriate answer to that is, "Wait, hold my beer."
The scientific answer is, Let's try it again to see what happens [xkcd.com]. Of course I've accelerated out an off-ramp. The annoying ones are the people who decelerate before they get to the off-ramp, slowing traffic on the freeway behind them.
Re: Republicans doesn't live in USA obviously (Score:3, Insightful)
Tesla's Autopilot system isn't perfect, and it never will be. But statistically it is safer than driving without it; the data shows this. Teslas can recognize red lights if they have the V3 computer, but they do not stop yet. They alert the driver and say "hey, we are about to cross a red light". Same thing with stop signs. Why doesn't Tesla apply the brakes yet? Not sure. It's not like they have nobody working on the problem. I've argued before that there are going to be deaths.
Re: (Score:3)
Two different problem sets:
1) How to make human driven cars safer
2) How to make auto-piloted cars safer
Even if #2 is safer than #1, it does not absolve the company making the technology behind #2 from making it much better and being accountable for passenger safety. Similarly, airlines are still responsible for the safety of their passengers even if air travel is much safer than car travel.
Re: (Score:2)
Commercial airline pilots use the autopilot all the time, but they also are constantly monitoring the system and are ready to take over. The Tesla problem is people read the notice that says they must be attentive and ready to take the wheel at any moment, then they defeat the hands-on-the-steering-wheel safety feature and fall asleep. You can't fix stupid.
Re: (Score:2)
The Tesla problem is people read the notice that says they must be attentive and ready to take the wheel at any moment, then they defeat the hands-on-the-steering-wheel safety feature and fall asleep.
Because otherwise there is no point in the Autopilot feature for cars.
The only reason why I would use such a feature is so that I would not have to drive the car and pay as much attention. If I have to pay attention, then I'd rather drive the car myself, because then I am less likely to get bored (roads are never perfectly straight and level, so having to frequently adjust the steering wheel keeps me engaged) compared to how bored I would be if I had to just watch the road without doing anything but be alert.
Re: (Score:2)
It is also different for airplanes insofar as the pilots are type-rated for their aircraft and its autopilot.
Re: (Score:2)
We *definitely* need better licensing requirements for drivers.
Re: Republicans doesn't live in USA obviously (Score:2)
beyond that, self-driving cars will make driving an obsolete skill that most of us won't use outside commercial or specialized contexts - within our lifetimes, and certainly within the lifetimes of those learning to drive today.
Re: (Score:3)
With the tech we put into cars like ABS, parking sensors, lane keeping assistance, radar guided cruise control, etc... we are the safest drivers. Ever.
No, we're not. The vehicles are safer because of all the crap which has had to be thrown in because people can't be bothered to drive in a safe manner.
ABS? Because people don't keep enough distance between themselves and the person in front of them or drive for the conditions of the road.
Parking sensors? Because people can't be bothered to learn to use the mirrors...
Re: (Score:2)
What? Technology in your car doesn't make you a better driver. It makes your car a better car. I expect today's drivers are the same idiots they always were, but there is a plausible argument that they're actually worse because their cars make up the difference. This story is (anecdotal) evidence for the latter.
Actually, so is your post.
Re: (Score:2)
No. It's because modern car design has made it very difficult to see the corners of the car and any obstructions near those corners.
Re: (Score:2)
Parking sensors are because humans don't have x-ray vision and mirrors don't provide 100% coverage and you can't look at all three mirrors and out the windows at the same time.
Re: (Score:2)
>Parking sensors are because humans don't have x-ray vision and mirrors don't provide 100% coverage and you can't look at all three mirrors and out the windows at the same time.
The fourth mirror on the back wall of my garage helps.
Re: (Score:2)
Of course there is. Lane keeping and radar cruise control are very useful. There's quite a difference between having to watch the road to make sure the car (or someone else) doesn't do something stupid, and having to do that plus brake, accelerate, lane keep and/or turn the cruise control on and off all the time.
The automatics are also there if you do get distracted. *Lots* of people rear-end the car in front of them because they looked away.
Re: (Score:2)
Their autopilot has the lowest incidents of accidents per mile driven, statistically.
False.
Also, when in an accident, their cars are structurally safer than any other car on the road given their weight classes.
Also false.
Tesla tried to claim they were the highest-rated car in safety tests, and an official statement was issued to say they were fucking lying.
Re: Republicans doesn't live in USA obviously (Score:2)
Sorry to tell you, but you saying "false" doesn't automatically mean it is.
Re: (Score:2)
Driver assist is ok, but full auto pilot is decades away from being usable. Even for driver assist, it needs to have a limit on the maximum speed and refuse to exceed it if no one is controlling the vehicle. People can't see the painted lines much of the time, so it's dumb to expect an auto to follow lanes. I.e., cap it at 55 mph on freeways, so you can use assist when traffic is bogged down, and 35 or 40 on residential roads. Still not perfectly safe, but it gives much more time for the human to take control...
Re: Republicans doesn't live in USA obviously (Score:5, Insightful)
Specifically:
1) Who exactly is "Raj Rajkumar", and why exactly are we supposed to care about him?
2) AP does not stop for stop lights. It's 100% the responsibility of the driver to stop for lights. So even if AP was on (thanks for your speculation, "engineering professor Raj Rajkumar").... and? That's like saying, "I was on cruise control, and the car drove off the road!" Yeah, because keeping you on the road is not cruise control's job.
The other two are just random accidents - so unless the standard is "Autopilot is expected to never have an accident", why exactly are we supposed to care about individual case reports? These aren't exactly rare cars anymore. I mean, a car in Indiana fatally hit a parked firetruck in December? Yeah, I found an article about it here [indystar.com]. Oh wait, that was a different case of a non-Tesla driving into a fire truck in December in Indiana and killing someone, because people drive into parked vehicles all the bloody time. So commonly that I can find the exact same accident with the exact same type of vehicle in the exact same situation in the exact same state with the exact same result in the exact same month. Why are we only supposed to care when it's a Tesla?
In the specific case of the police car, it's really obvious what happened, from the driver's description. He says it was on autopilot, he turned to the back to mess with his dog, and the car crashed into a police car and another vehicle. The idiot almost certainly bumped the wheel, taking it out of Autopilot, wherein it'll do what any other vehicle whose owner is messing with a dog in the back and not holding the wheel will do: barrel straight into whatever's ahead of it.
Because they barely exist on the road at all. Tesla has a huge number of AP vehicles on the road - nearly half a million Model 3s alone. Autopilot comes standard. SuperCruise has only been available as an option in the 2019 CT6 sedan, a vehicle with fewer than 8,000 total sales. It can also only be used on pre-mapped highways. And of course, the more annoying you make a system in terms of nagging drivers, the fewer people will use it, which makes designing nags for a system that can increase safety a balancing act.
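For scale, a back-of-the-envelope sketch of why raw incident counts tell you little here: with identical per-mile incident rates, the larger fleet racks up proportionally more incidents. Only the fleet sizes come from the comment above; the mileage and rate are assumptions.

MILES_PER_CAR_PER_YEAR = 12_000          # rough US average, assumed
INCIDENTS_PER_MILLION_MILES = 0.1        # hypothetical, same for both fleets

def expected_incidents(fleet_size):
    # Expected yearly incidents scale linearly with total fleet miles.
    fleet_miles = fleet_size * MILES_PER_CAR_PER_YEAR
    return fleet_miles / 1_000_000 * INCIDENTS_PER_MILLION_MILES

print(expected_incidents(500_000))  # Tesla-sized fleet: ~600 incidents/yr
print(expected_incidents(8_000))    # CT6-sized fleet:   ~9.6 incidents/yr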
Re: Republicans doesn't live in USA obviously (Score:2)
There are a few million Volvos on the road with Pilot Assist, which is a Level 2 autonomy system, just like Tesla Autopilot. Volvo does not oversell its capabilities though, and does not try things that are not safe at that level.
Re: Republicans doesn't live in USA obviously (Score:5, Informative)
Pilot Assist, the thing that - unlike Tesla - Volvo advertises as "self driving" [techtimes.com], despite being a far less capable system, as well as having a far lower Euro NCAP autonomy rating?
Here's the AAA comparison. [ttnews.com]
And they were testing an obsolete Tesla, with AP 1.
IIHS did the same sort of test.
They noted a dramatic improvement between an AP2 Model 3 and an AP1 Model S. The S90 was among the worst of the bunch. For example, in a hilly-terrain lane crossing test, which the Model 3 aced every time: "The S90 stayed in the lane in 9 of 17 runs and crossed the lane line in eight runs." [iihs.org]
This was a 2018 vehicle. There've been significant improvements since then.
Here's The Drive [thedrive.com] comparing various systems - again, only on lanekeeping, not the more advanced features that only Autopilot offers, like Navigate on Autopilot:
I'd also like a reference for there being "a few million" cars out there with Pilot Assist, please. It was first introduced as an option in the 2016 XC90, which usually sells under 100k/yr globally, and is now also available as an option on the V90 (~20k), S90 (~50k), and, starting with the 2018 model, the XC60 (~180k). There are probably only about a million vehicles on the road that even could have it, let alone what percentage actually have it. Do you have evidence suggesting otherwise? The key difference is that there's virtually zero reporting at all when a Volvo gets into an accident vs. when a Tesla does (where it ends up on the front pages of news sites around the world - including, right here, Slashdot), let alone whether they were using Pilot Assist (any more than if they were using TACC or whatnot).
Re: (Score:2)
I love how you link to your own post, one that already includes a response rebutting what you're posting.
Re: (Score:2)
For anyone reading this who actually cares: the "SAE level" argument brought up by my friendly local stalker was a discussion of the fact that the SAE does not define the "number of 9s" in terms of safety, which I feel to be a huge flaw; they just describe what the car "can" and "cannot" do. Tesla said that they were targeting being "feature complete" by the end of 2019, but not the level of reliability needed for gaining approval to be driven without a driver by the end of 2019 (they're hoping for late 20...)
Re: (Score:3)
Regardless of how good Tesla's software and/or lawyers are at other tasks, it seems to me that "reliably stop before hitting that solid object in front of me" is a pretty central requirement for any car that wants to be considered self-driving.
Having a car that occasionally gets confused by lane markers and takes the wrong freeway exit is a lot more acceptable than having a car that occasionally runs into other cars/firetrucks/people/whatever at full speed. I think they ought to prioritize working on improving that.
Re: (Score:2)
No, autopilot is NOT safer than a human driving.
There is no data to prove that claim. What Tesla released was debunked; they were not comparing like for like.
The fact that they have not released proper data and that insurance companies don't offer discounts for AP enabled cars (in fact they cost more to insure, in line with the increased value) tells us that at best it's about the same, probably worse.
Why Tesla doesn't apply the breaks? (Score:2)
Because it's not broken?
One thing autonomous cars will never learn (Score:3)
When to ignore the rules.
Re: (Score:2)
Then that's you ignoring the rules.
Re: (Score:2)
You can speed while in autopilot for example.
Only by a limited amount. I believe it is 5 mph over the limit.
This is okay on multi-lane highways, where I can just keep to the right in the slow lane.
But Autopilot is a hassle on single-lane-in-each-direction roads. Cars back up behind me, and under California law, if there are five or more cars behind me, I am legally required to pull off the road and let them pass. It is easier (and quicker) to just drive in manual mode.
Re: (Score:2)
5 mph over the programmed limit. Often the limit is wrong so you can either speed a lot or the car slows right down and people behind get frustrated.
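The rule being described is just a clamp on the driver's set speed, something like this sketch - the 5 mph offset is the figure quoted above; the function name and test values are assumptions:

OFFSET_MPH = 5  # allowed margin over the detected limit, per the thread

def allowed_set_speed(requested_mph, detected_limit_mph):
    # The driver's requested speed is capped at the detected limit + offset.
    # If the detected limit is wrong, the cap is wrong with it - which is
    # exactly the complaint in the parent comment.
    return min(requested_mph, detected_limit_mph + OFFSET_MPH)

print(allowed_set_speed(70, 55))  # 60: capped at 55 + 5
print(allowed_set_speed(70, 40))  # 45: a misread 40 limit drags you down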
Re: (Score:2)
That would imply that they know when to follow those rules.
They do not follow any of those rules.
It's a piece of software.
Re: (Score:2)
Whose rules are you talking about? The rules that govern the function of autonomous driving, or the road rules? Tesla's Autopilot already ignores some road rules in the name of safety - specifically, rules like not coming to a stop on a motorway. You seem to forget that self-driving cars are designed and "taught" by people. They will approach rules the way people teach them.
Did he blame it on MCAS? (Score:2)
Yes a normal human would go fast in an exit lane (Score:2)
No normal human being would not slow down in an exit lane,
This is simply not true. You see people exiting a freeway, not slowing down at all until they are close to a light.
Or what if they were drunk and trying to run a light that was turning red at the end of the exit ramp? Then I could easily see them *accelerating* in an exit lane, misjudging speed and ramming the car into something. That kind of thing happens all the time. That's one of the drawbacks of a car with really good performance; it's very...
Re: (Score:2)
I don't see that. Seems a coin toss in that case. Misunderstanding an exit lane has been a problem in the past, but OTOH I've twice seen close up a human driver crash through a red light at full speed, causing a serious accident. I'd bet on the firetruck being autopilot, given its history with stopped trucks. Of course, I'd bet the other way on the police car. Who's he kidding?
I'm sure all of these will be investigated fully.
Marketing Gimmick - that kills people (Score:5, Insightful)
Re: (Score:2)
For the same reasons that people use cruise control.
Re: (Score:2)
Because a marketing gimmick that kills people also saves people. I'm not arguing for Autopilot itself, but for any such self-driving feature. I've already been in a vehicle with similar features when an incident in front of me forced me to react. I didn't have my hands on the wheel at the time, but by the time I grabbed the wheel and put my foot where I thought the brake pedal was, the car was already slowing down rapidly. Not sure if I would have hit the car in front of me or not had that system not been there.
No normal human would... (Score:5, Insightful)
get into an accident. Yet normal humans seem to do enough stupid shit to kill 3300 people every day across the world. Basing the idea that the car was operating on Autopilot on the notion that a human wouldn't take an exit quickly or run a red light is among the dumbest thoughts ever put into writing.
I do wonder how many people see that T logo and just assume autopilot was the fault. I also wonder how many people try to blame autopilot for their own stupidity. Hitting a car on the side of a freeway? Well that happened to a delivery van in front of me on the A12 late November, and that old black smoke belching piece of shit certainly didn't have autopilot.
Re: (Score:2)
Let me tell you about this thing called "liability".
If you drive badly and kill two people you are liable for their deaths and will go to prison.
Tesla has created a system that encourages the driver to not pay attention and then does little to enforce its attention-paying rules. There are better systems out there; Tesla has had ample opportunity to improve theirs but hasn't.
Re: (Score:2)
Tesla has created a system that encourages the driver to not pay attention
Tesla didn't create that; they just packaged all the systems together and gave it a catchy name. Autopilot is not something magic, it's an amalgamation of safety features: "lane keeping assistance", "automatic emergency braking", "adaptive cruise control", "forward collision detection" and "side collision detection".
Claiming it's Tesla's liability for packaging actual safety features which have a history of reducing accidents, while at the same time encouraging drivers not to pay attention, is just silly...
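As a toy illustration of that "amalgamation" view, the packaged system can be pictured as independent assists whose outputs get merged each tick. Every class and signal name here is hypothetical, not Tesla's actual architecture:

from dataclasses import dataclass

@dataclass
class Actuation:
    steer: float = 0.0   # steering correction, radians
    brake: float = 0.0   # brake command, 0..1

class LaneKeeping:
    def step(self, sensors):
        # Nudge back toward the lane center, proportional to the offset.
        return Actuation(steer=-0.1 * sensors["lane_offset_m"])

class EmergencyBraking:
    def step(self, sensors):
        # Full brake if time-to-collision is critically low.
        return Actuation(brake=1.0 if sensors["ttc_s"] < 1.5 else 0.0)

class PackagedAssist:
    """The 'catchy name' wrapper: runs each feature and merges the outputs."""
    def __init__(self):
        self.features = [LaneKeeping(), EmergencyBraking()]

    def step(self, sensors):
        outputs = [f.step(sensors) for f in self.features]
        return Actuation(
            steer=sum(o.steer for o in outputs),
            brake=max(o.brake for o in outputs),  # strongest brake wins
        )

print(PackagedAssist().step({"lane_offset_m": 0.5, "ttc_s": 4.0}))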
FUD ! (Score:2)
FUD !
Dr Raj Rajkumar of CMU (Score:4, Informative)
He seems to be mostly a real-time OS guy, with work in scheduling and networking. His research seems to be on the networking aspects of autonomous vehicles, not autonomous driving itself. Disappointed he is engaging in speculation without any evidence to indicate the autopilot was engaged. His comments about autopilot behavior on lane markings, etc. suggest he has not done work on machine vision, scene detection, or autonomous driving algorithms.
simple solution for now (Score:2)
There is nothing scarier than watching idiots...
"Autopilot" is not SAE 5 (Score:2)
The driver needs to pay attention. If they do not, things like this happen. That said, this feature has probably saved a lot more people than it has killed. And that is the real criterion here, not whether driving automation kills people. Sure it does, and it will continue to do so (even at SAE 5), but does it kill a lot fewer people than human drivers? That is a definite yes, and it will remain one. Humans are _bad_ at driving, but most do not admit that, usually not even to themselves.
Re: (Score:2)
I agree with your points but would just add, I wish we had better data available.
I agree to that as well.
Re: (Score:2)
There is a well-known fact about advanced driver assistance like Tesla's Autopilot. When Google started their autonomous vehicle project, they saw that people acted irresponsibly, and they immediately shut it down in favor of going directly to Level 4/5. This was kind of well known in the autonomous vehicle industry.
Could Tesla have designed their "autopilot" so you have to pay attention and keep your hands on the steering wheel at all times? Yes, they could have, but Elon chose not to. Tesla could have used eye-tracking...
Re: (Score:2)
It really depends on how many people this thing has saved. It seems rather unlikely that it has killed anywhere near the number of people saved. If so, Musk made exactly the right decision, because pushing this even with the methods he used will have saved more people than it killed.
Things are far from as simple as you try to describe them.
As a new Tesla model 3 owner (Score:5, Interesting)
Just got a Tesla Model 3. No Full Self Driving - just the basic Autopilot.
When you first enable the "Smart-ish Cruise Control" and "Smart-ish Lane Keeper" you have to accept a EULA that says the features are in Beta, that the car WILL most likely crash into objects on the road, and that YOU have the obligation to pay attention and apply the brakes.
With the latest v10 software update you have the option to see on the screen what the car's computer thinks is around you in real time. Just a peek at that nightmarish visualization of morphing and twitching vehicles and phantom traffic cones convinced me I'd better pay attention to the road myself.
Driver assistance tools are nice. (Score:2)
But if you think Autopilot is ACTUALLY an autopilot, you're fucking stupid and shouldn't be allowed a car of ANY sort.
Nor the right to breed.
Re: (Score:3)
You seem to be conveniently ignoring the third case, where the driver stated to police that the car was in autopilot.
[Area man] Yes officer, I was not in control of the car when it struck you, it's all Elon's fault. I am totally absolved of all blame!
[Officer] How convenient.
Re: (Score:2)
Except that it wouldn't absolve all blame anyway, in fact legally it probably won't help at all.
AP is a driving aid, you have to pay attention. Clearly the guy wasn't paying attention.
Tesla's liability for building a system that makes it easy to not pay attention is a separate matter for another lawsuit.
Re: (Score:2)
And people involved in accidents never lie, right?
Re: (Score:2)
You seem to be conveniently ignoring the third case, where the driver stated to police that the car was in autopilot.
Which is more likely?
1. A random guy lied to the police to avoid blame.
2. Software that has functioned correctly over billions of miles suddenly exhibited a defect that has never been seen before.
Re: (Score:2)
Teslas on autopilot ram into stopped emergency response vehicles every year ... they've been fucking lucky not to have killed an emergency responder yet.
Car doppler radar systems in general, and Tesla autopilot in particular, ignoring non-moving vehicles - especially when suddenly revealed by the car in front switching lanes - is a common, predictable failure mode. Calling Tesla's driver assist "autopilot" is a defect; it being designed to encourage people to take their hands off the wheel is a defect ... Teslas hit...
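A sketch of that failure mode: automotive radar sees constant stationary returns (signs, bridges, guardrails), so trackers commonly discount targets whose computed ground speed is near zero - and a stopped truck suddenly revealed by a lane change looks exactly like that clutter. All numbers here are invented for illustration:

EGO_SPEED_MPS = 30.0       # our own speed, assumed
STATIONARY_EPS_MPS = 1.0   # ground speeds below this look like clutter

def keep_target(relative_speed_mps):
    """Doppler radar measures relative speed; ground speed = ego + relative.
    Targets whose ground speed is ~zero get filtered as stationary clutter."""
    ground_speed = EGO_SPEED_MPS + relative_speed_mps
    return abs(ground_speed) > STATIONARY_EPS_MPS

print(keep_target(-10.0))  # True: a car moving 20 m/s ahead is tracked
print(keep_target(-30.0))  # False: a stopped truck is filtered as clutter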
Re: (Score:2)
3: Software that has exhibited stupid fucking defects and killed people before does so again.
Re: (Score:2, Insightful)
wrong, Tesla shill-boy
autopilot needs to be banned, it's killing people with stupid choices
Re: (Score:2)
Actually, I think drivers need to be banned. They make MUCH stupider choices.
Autopilot's never stumbled out of a bar after drinking a fifth of vodka and gotten behind the wheel.
Re: (Score:3, Interesting)
the autopilots are duplicating drunk drivers' blunders very well, and with a body count
Comment removed (Score:5, Insightful)
Re: (Score:2)
Not really, no, because comparing safety by manufacturer is pointless. You'll understand why when you explain what exactly makes a Ford Mustang so unsafe in comparison.
Re: (Score:2)
Heh, bad example. Right now "Mustang" is the same as "all cars made by Ford".
Re: (Score:2)
and with body count
Take a look at This analysis [tesladeaths.com]. This shows about 5 Tesla deaths per billion VMT (vehicle miles traveled).
Compare that to the National Safety Council [google.com] numbers of around 12.5 deaths per billion VMT. Clearly Teslas are safer than the national average. The national figure also includes commercial drivers, who are typically safer than average by a factor of two.
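The arithmetic behind those rates is just deaths divided by billions of miles; the inputs below are chosen to reproduce the quoted figures, not independently verified:

def deaths_per_billion_vmt(deaths, miles):
    # Rate = fatalities per billion vehicle miles traveled.
    return deaths / (miles / 1e9)

# Illustrative inputs matching the rates quoted above (~5 vs. ~12.5).
tesla_rate = deaths_per_billion_vmt(deaths=50, miles=10e9)            # = 5.0
national_rate = deaths_per_billion_vmt(deaths=40_000, miles=3.2e12)   # = 12.5
print(tesla_rate, national_rate)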
Do the statistics you are comparing take into account the age and cost of the vehicles being compared? Tesla cars are on average going to be newer than other cars on the road (Tesla hasn't been around as long as many other car companies). Since Tesla cars are relatively expensive compared to all cars on the road, they are also going to have better safety features (outside of autopilot). For a fair statistical comparison of deaths per mile with regard to autopilot being a factor, I would want to see the statistics...
Re: (Score:2)
Of course those "stats" don't take that into account. They also don't take into account the fact that a deer jumping out in front of a car is more likely to jump out in front of a non-Tesla than a Tesla, by several orders of magnitude. Or that any 2 (or more) random cars involved in an incident are likely going to be non-Teslas, by several orders of magnitude. Even if you blame 1 car (or driver) for the incident, the OTHER cars and drivers, who are not at fault, are going to be inflating the non-Tesla statistics...
Re: (Score:2)
"This shows about 5 Tesla deaths per billion VMT (vehicle miles traveled). Compare that to the National Safety Council [google.com] numbers of around 12.5 deaths per billion VMTs. Clearly Teslas are safer than the national average"
That's not true. The fact that some manufacturers have much lower death rates than Tesla doesn't mean they are much safer cars, or that Tesla's autopilot causes a marked increase in accidents.
Re: (Score:2)
You are not comparing like for like.
Tesla AP is officially only for use on highways, the safest type of road. You are comparing to the average rate on all roads.
You are also comparing wealthy Tesla owners to every random driver. You are not comparing similar accidents either.
Insurance companies probably know. They don't offer discounts for AP cars. Tesla tried to get them to and they told them no. They must know something.
Re: (Score:2)
Take a look at This analysis [tesladeaths.com]. This shows about 5 Tesla deaths per billion VMT (vehicle miles traveled).
Compare that to the National Safety Council [google.com] numbers of around 12.5 deaths per billion VMTs. Clearly Teslas are safer than the national average.
Aside from the other criticisms that have been posted, this could simply mean that Teslas are good at protecting the driver in a crash. Which is a good thing, obviously, but as they stand the numbers don't show that the autopilot is the reason for the low deaths per mile.
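Put numerically: deaths per mile is the product of crash rate and deaths per crash, so the same fatality rate can come from very different safety stories. Invented numbers for illustration:

def deaths_per_billion_miles(crashes_per_billion_miles, deaths_per_crash):
    # Fatality rate decomposes into crash frequency x crash survivability.
    return crashes_per_billion_miles * deaths_per_crash

# Same fatality rate, opposite stories:
print(deaths_per_billion_miles(1000, 0.005))  # 5.0: crashes often, protects well
print(deaths_per_billion_miles(250, 0.020))   # 5.0: crashes rarely, protects less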
Re: (Score:2)
Tesla says to keep your hands on the wheel when using autopilot and gives you several warnings. If you run into something, you're not in control and therefore at fault.
Re: (Score:2)
What they say and what the system is designed for are at odds. Teslas are sold because you "can" take your hands off the wheel and eyes off the road for significant amounts of time. All the safety advantages of autopilot, as it exists currently, can be had from systems which only intervene when they perceive emergencies: lane drift warning/intervention, brake assist, etc. In an emergency in which autopilot can avoid a collision where the driver can't, it still would; in an emergency where autopilot is oblivious...
Re: (Score:2)
And proceeds to fucking drive into stationary trucks and walls, killing you, if you ignore those warnings.
Re: (Score:2)
Just like a regular car if you let go of the wheel...
Re: (Score:2)
They picked a random CMU prof to say he wasn't sure if autopilot was on. Top notch journalism right there. He must have been the first one to answer his phone.
Re: (Score:2)
But... it's not cars killing people, it's people killing people in these reports? And it looks like these happened in conditions where one would not, or could not, use the autopilot anyway.
I get the point that statistically a Tesla on autopilot may be safer than human-driven cars, but it still feels different. What if these Tesla accidents were caused by, say, a wheel falling off?
Re: (Score:2)
Rope knots by themselves are technology, and as far as I personally am concerned, useful and interesting technology. I work as a programmer these days, but a few times per year I get to use a knot or two, and usually it impresses those around me. My daughter's teachers were more impressed by the knots I used to secure a shade tarp than by my setting up their PCs with Ubuntu and LibreOffice... When I was single, I boated, camped, and even rappelled a few times. At that time, knowing even a dozen knots...
There is no evidence (Score:2)
There is evidence the drivers were not doing their duty; no Tesla is licensed to self-drive.
There is zero evidence yet the cars WERE IN SELF DRIVING MODE, you absolute turnip of a human.
Re: (Score:2)
There is zero evidence yet the cars WERE IN SELF DRIVING MODE, you absolute turnip of a human.
There is zero evidence yet the cars did NOT have "autopilot" on.
If the cars did NOT have "autopilot" on, Elon Musk would have tweeted about it. The fact that he hasn't come out and said "autopilot" was not on tells me it was.
Re: (Score:2)
What? Coke changed the label on vitamin water *in the US* to drop "vitamins + water = all you need", which implied that the drink only has vitamins and water in it (it's got sweeteners too). They're also going to stop saying it will keep you from getting eye diseases.
Re: (Score:2)
Hm... would you prefer that particular driver was cruising down the highway without the car keeping him in his lane and preventing him from plowing into the car in front of him?
Re: (Score:2)
Yes, because at that point the driver would drive instead of thinking the car will drive for them.
SOME people will always drive like morons.
Tesla ENCOURAGES people to do so, despite NOT BEING SAFE.
Re: (Score:2)
Would he? You're probably right. People driving while playing with their cell phones has never been an issue before.