Tesla Issues Strongest Statement Yet Blaming Driver For Deadly Autopilot Crash (abc7news.com) 467
Tesla has released its strongest statement yet blaming the driver of a Tesla Model X that crashed on Autopilot almost three weeks ago. The driver, Walter Huang, died March 23rd in Mountain View when his Model X, on Autopilot, crashed head-on into the safety barrier section of a divider that separates the carpool lane from the off-ramp to the left. Huang was an Apple engineer and former EA Games employee. ABC7News reports: Tesla confirmed its data shows Walter Huang was using Autopilot at the time of the crash, but that his hands were off the wheel for six seconds right before impact. Tesla sent Dan Noyes a statement Tuesday night that reads in part, "Autopilot requires the driver to be alert and have hands on the wheel... the crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road." The family's lawyer believes Tesla is blaming Huang to distract from the family's concerns about the car's Autopilot.
Here is the full statement from Tesla: "We are very sorry for the family's loss. According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so. The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang's drive that day. We empathize with Mr. Huang's family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive."
Sounds like a CYA distraction statement (Score:3, Insightful)
Re:Sounds like a CYA distraction statement (Score:5, Funny)
I'm just thrilled that these millionaires are doing the beta test for us. In a few years, they'll have most of the bugs worked out and the tech will be a commodity. They are true martyrs for the little man.
The lawsuit is likely doomed by family's own words (Score:5, Informative)
The family admits that the driver had had issues at that exact location. Why on earth would he use it there then? Why wasn't he paying attention near that spot? Why did he ignore the warnings? He was a programmer. He should have known.
Re: (Score:3)
The family admits that the driver had had issues at that exact location. Why on earth would he use it there then? Why wasn't he paying attention near that spot? Why did he ignore the warnings? He was a programmer. He should have known.
The car was on Autopilot... You know A-U-T-O-Pilot. The car should have driven itself whilst the attendant sat back watching movies on their phone.
That is the logic you can expect from end users. Warnings are just something to be ignored or, at the very worst, summarily dismissed. Autonomous cars have been sold to them as a magic bullet for their driving woes. The end user fully believes that their time having to pay even minimal attention to the road is at an end and that the car will automatically...
Re: The lawsuit is likely doomed by family's own w (Score:3)
No! You're the fucking asswipes that don't get it. It's got AUTOPILOT, not Chauffeur. When you get on an airplane, the pilots sit up front, don't sleep, and watch the skies, the instrumentation, and the aircraft's handling; the pilots are paying attention! That is how you operate with autopilot. You don't see the pilots both taking a nap or coming back to schmooze with the flight attendants.
Re: (Score:3)
Look, human beings suck at vigilance tasks. "This is almost always OK, detect the one time in an hour that it's not" -- no one can muster the attention. X-ray screeners use something called Threat Image Projection, which shows them pictures of bombs and guns and keeps them alert (it lets them know it's a test, but helps keep their mind in the "where's the gun in THIS one?" mode instead of "oh, look, another suitcase probably without a gun"). Even S&R dogs find trainers even in the middle...
Re: (Score:2)
then he should have been able to steer it away from the divider
Re: Sounds like a CYA distraction statement (Score:5, Insightful)
What good is it even if they say you need to keep your hands on the steering wheel? It doesn't sound very auto to me.
I turned on cruise control and it drove right into a stopped car. What good is cruise control if I have to manually slow down? It doesn't sound very "in control" to me.
Re: Sounds like a CYA distraction statement (Score:5, Insightful)
Autopilot on the other hand is supposed to keep you in the lane
No, it's not. It's supposed to do a whole bunch of things to assist you, but only if you're paying attention. It was never advertised as a "go to sleep and I'll drive for you" system, any more than cruise control was.
Re: (Score:3, Interesting)
The difference is that cruise control does exactly what is advertised: it maintains speed. Autopilot is advertised to stay in the lane and maintain speed like adaptive cruise does. In this case, it did not do what was advertised, and someone died. Tesla is trying to shape public opinion on this because, unlike the AZ Uber case, the victim was not some homeless person whose family probably settled for peanuts. This is likely to become a 7- or 8-figure payout, due to the earnings potential of an Apple engineer, if it goes to trial.
Re: Sounds like a CYA distraction statement (Score:2)
Autopilot is advertised to stay in the lane and maintain speed like adaptive cruise does.
You and the other nincompoops can keep repeating that as much as you like, but repetition does not make something true. Here's how Tesla advertises their newest "self driving" cars:
"Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver. "
That's an advertisement
Re: Sounds like a CYA distraction statement (Score:5, Insightful)
...at what we believe will be a probability of safety at least twice as good as the average human driver.
The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road,
So, it was an easily avoidable accident for a human driver... but we have an autopilot that couldn't avoid it, even though we claim it's twice as safe? Sounds like they are talking out of their asses from both ends here.
Re: Sounds like a CYA distraction statement (Score:5, Interesting)
Cruise control maintains your speed extremely well and doesn't ever fail catastrophically.
Actually it can and does. Cruise control in slippery conditions can put a car into a dangerous condition.
Here's one citation, I'm sure anyone can find more:
https://www.theglobeandmail.co... [theglobeandmail.com]
Early cruise control systems were sometimes quite dangerous, not always to the passengers, but they could cause damage to the engine or transmission. I remember cars having a hardwired switch on the dash to disable them, in addition to the software button on the steering wheel, because people learned not to trust them. They got "smarter," and today most will detect wheel slippage and not gun the engine if the car hits a slippery spot in the road.
Cruise control is especially dangerous with rear-wheel drive and powerful engines, as on a sports car or light truck. One wheel on a slick patch will cause the cruise control to open up the throttle and get the wheels spinning; when they finally find traction, the vehicle might no longer be pointed in the desired direction of travel, and the front wheels could still be on a slick surface, which can send the vehicle flying uncontrolled.
Cruise control is very safe, especially newer systems that integrate with traction control, but the claim that it never fails catastrophically is provably false.
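As an aside, the slip-detection behavior described above is simple to sketch. The following toy Python function is purely illustrative; the inputs, the 15% slip threshold, and the gain are assumptions, not any manufacturer's API:

```python
# Illustrative sketch of slip-aware cruise control (all names,
# thresholds, and gains are hypothetical, not a real vehicle API).

def cruise_step(target_speed, vehicle_speed, wheel_speeds, throttle):
    """One control tick: hold the set speed, but back off on wheel slip."""
    # Slip: a driven wheel spinning well above the vehicle's actual
    # ground speed, e.g. one tire on an icy patch.
    slipping = max(wheel_speeds) > vehicle_speed * 1.15

    if slipping:
        # Early systems kept opening the throttle here; integrating
        # with traction control means cutting power instead.
        return 0.0

    # Simple proportional correction toward the set speed.
    error = target_speed - vehicle_speed
    return max(0.0, min(1.0, throttle + 0.05 * error))

# Example: one wheel spinning at 90 km/h while the car does 60 km/h
# cuts the throttle rather than gunning the engine.
print(cruise_step(100.0, 60.0, [60.0, 60.0, 90.0, 61.0], 0.4))  # 0.0
```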
Re: (Score:3)
Cruise control maintains your speed extremely well and doesn't ever fail catastrophically.
In the same situation, cruise control would have sent the car straight into the obstacle without ever braking. In fact, that's exactly what it did. The only case where cruise control "doesn't fail catastrophically" is when the driver pays attention to the road and takes over when necessary. Precisely what would have saved this driver.
Re: (Score:2)
What good is cruise control if I have to manually slow down?
Because slowing down is the exact opposite of the purpose of cruise control? What dingbats modded this interesting?
Re: (Score:2)
Modern cruise controls slow down perfectly well. I rented a car in 2009 which did precisely that.
Re:Sounds like a CYA distraction statement (Score:5, Insightful)
Tesla should know better though. People are fucking idiots and the vehicle should not assume they'll act responsibly. If the AI system doesn't think it can manage things anymore and the user is not responding to input, it should throw the hazard lights on and make an emergency stop. Systems like this should always be able to fail gracefully. If this is a repeated problem, the system should disable the auto-pilot feature and refuse to let the driver use it. If they want it turned back on, they can write to Tesla and explain why they think that they should be allowed to be a colossal moron with a quarter million joules of kinetic energy.
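The escalation policy the parent proposes is straightforward to state as code. Here is a minimal sketch under made-up thresholds (five seconds of non-response, three emergency stops before lockout); it is an illustration of the idea, not Tesla's actual logic:

```python
# Sketch of the escalation policy proposed above. State names and
# thresholds are assumptions, not Tesla's implementation.
import enum

class Action(enum.Enum):
    CONTINUE = "continue"        # driver engaged, keep assisting
    WARN = "warn"                # visual + audible alerts
    EMERGENCY_STOP = "stop"      # hazard lights on, controlled stop
    LOCKOUT = "lockout"          # feature disabled pending review

def supervise(hands_on_wheel: bool, seconds_unresponsive: float,
              prior_stops: int, max_stops: int = 3) -> Action:
    """One decision tick of the fail-safe supervisor."""
    if prior_stops >= max_stops:
        # Repeat offender: refuse to engage until the driver
        # explains themselves to the manufacturer.
        return Action.LOCKOUT
    if hands_on_wheel:
        return Action.CONTINUE
    if seconds_unresponsive < 5.0:
        return Action.WARN
    return Action.EMERGENCY_STOP
```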
Re:Sounds like a CYA distraction statement (Score:5, Interesting)
I think it's important to be mindful with your terminology. Tesla's Autopilot is not a self-driving system. It is cruise control. Conflating terminology causes nothing but confusion and undue misconceptions about emerging technologies.
If people stop thinking about Autopilot as self-driving and start thinking about it as cruise control, it becomes immediately obvious that this is not a conversation about why the car did not dodge the obstacle, but rather a conversation about why the human looked away from the road while hurtling forward at great speed in a metal basket, long enough to travel at least 200 meters in distance.
Re: (Score:2)
Yep, and we see these people hung up on rocks all of the time. Kinda amusing actually.
Re:Sounds like a CYA distraction statement (Score:5, Insightful)
Perhaps they shouldn't call it autopilot? The term is clearly a marketing term that makes you think it is going to automatically pilot the car. Call it advanced cruise control, or lane assistance.
Also, if the car can detect that your hands are off the wheel, and it is not capable of guiding itself when your hands are off the wheel, then shouldn't it immediately warn you when you do so and come to a safe stop? At what point, when the car is moving and you are driving safely, would your hands ever be off the wheel?
Re:Sounds like a CYA distraction statement (Score:5, Informative)
The terminology issue is a red herring. The reality is that partial self-driving capabilities lull users into a false sense of security because they usually work well. The rare failures are often catastrophic precisely because people have gotten used to the technology working, and end up surprised when it doesn't.
That's what made this recent software update such a problem. It made major changes to the way autosteer works on (at least) AP2-based Tesla cars. One of the big changes was "wide lane" handling, which changed the way vehicles behaved when they encounter a wide lane, such as an exit lane. This has resulted in a number of unexpected behaviors, up to and including cars driving straight towards gore points.
I don't know whether that change was in any way a factor in the autosteer malfunction that led to Mr. Huang's death, because I have no way to know what firmware version that car was running. However, the fact that this major update was in the process of being rolled out to users at the time of the accident is suspicious.
To be fair, a lot of other driving situations got significantly better with that software update. However, Tesla AP's tendency to ignore solid white lines has been an ongoing problem that might well have been made worse by that update; if that is the case, then the problem needs to be corrected ASAP, and they probably should NOT have continued the rollout of that update. Either way, I'm not convinced that Tesla did enough to warn drivers that autosteer might behave very differently, and to be particularly alert after that update.
Also, I would add that, speaking as a Tesla owner, it bothers me to see the amount of spin they're spewing after this accident. I realize that they don't want to let their users get scared into not using AP, because on average, it does significantly reduce accidents. And if there are videos out there showing AP malfunctions that they feel are not genuine, they can and should comment. But they should really stop trying to convince the public that the driver was solely to blame, because IMO, that just isn't the case.
First, the fact remains that autosteer obviously DID malfunction, and that malfunction DID result in a fatality that would NOT have occurred if the vehicle had not been equipped with autosteer functionality (because no sane driver would have looked away from the road for 5+ seconds without that functionality).
Second, the situation was entirely predictable. For at least a decade, people have warned that humans are likely to zone out in partial self-driving situations, and that it isn't really possible to change that innate human tendency. Tesla ignored those warnings and pushed forward anyway, and someone died. They blamed the driver, and the crash investigators tentatively agreed, and they kept pushing forward. And then a second person died. And now a third. IIRC, product liability law hinges in large part on whether user errors are reasonably predictable, and no "I agree to pay attention" can change that fact, which means this is little more than a legal smokescreen, IMO.
Third, the fact also remains that Caltrans failed to reset the safety barrier that was designed to slow down a car before impacting the gore point, after the barrier was collapsed in a wreck nearly two weeks earlier. And the fact remains that had the barrier been reset properly (as is required by law), it is unlikely that Mr. Huang would have died.
In other words, there are three parties, any one of whom/which could have prevented the fatality, and the deceased driver was only one of those three. So it is entirely disingenuous to try to pin this on the driver in the court of public opinion. IMO, it really isn't a question of who is at fault; they all are. Rather, it's a question of wh...
Re: (Score:3)
Perhaps they shouldn't call it autopilot? The term is clearly a marketing term that makes you think it is going to automatically pilot the car. Call it advanced cruise control, or lane assistance.
The terminology issue is a red herring.
No, it isn't. People have a conception that "autopilot" for aircraft means the computer takes over for the pilot, who can then take a nap or wander around the cabin. We as a society have used the prefix "auto-" to mean that a computer handles the whole process.
Second, the situation was entirely predictable. For at least a decade, people have warned that humans are likely to zone out in partial self-driving situations ... and someone died.
No amount of PR can fix dumb people and their misinterpretations of reality, and Tesla has made it a big point to sell how "safe" the combination of technology they call "Autopilot" is, while not addressing any of its shortcomings. A car that can keep you in a...
Re: (Score:3)
Perhaps they shouldn't call it autopilot?
I really get tired of this argument. Do you even know where the term Autopilot comes from or what it means?
From Wikipedia emphasis mine:
An autopilot is a system used to control the trajectory of an aircraft without constant 'hands-on' control by a human operator being required. Autopilots do not replace human operators, but instead they assist them in controlling the aircraft. This allows them to focus on broader aspects of operations such as monitoring the trajectory, weather and systems.
The autopilot, in every fucking definition of the word, is an assist device for the pilot, not a replacement. It handles speed, heading, and in some cases altitude. It doesn't monitor other aircraft, it doesn't avoid collisions, it doesn't (endless list of pilot tasks).
Re: (Score:3)
I think it's important to be mindful with your terminology. Tesla's Autopilot is not a self-driving system. It is cruise control. Conflating terminology causes nothing but confusion and undue misconceptions about emerging technologies.
That's literally what autopilot is though. [wikipedia.org]
An autopilot is a system used to control the trajectory of an aircraft without constant 'hands-on' control by a human operator being required. Autopilots do not replace human operators, but instead they assist them in controlling the aircraft. This allows them to focus on broader aspects of operations such as monitoring the trajectory, weather and systems.[1]
No (sane) person thinks autopilot is supposed to take a plane through an obstacle course of other aircraft. It's merely there to assist a pilot in keeping a heading, altitude, etc.
Re: (Score:2)
Before "people stop thinking about Autopilot as self-driving and start thinking about it as cruise control", as you say, perhaps Tesla ought to describe it as you do instead of the way they describe it:
https://www.tesla.com/autopilot [tesla.com]
Re:Sounds like a CYA distraction statement (Score:5, Interesting)
More fundamental to me is the issue that said car should not drive straight into a wall at full speed without trying to slow down.
There is plenty of blame to go around -- victim, Tesla, Caltrans, for starters. Each of them screwed up on at least two levels. Tesla likely needs some way for drivers to flag a spot where Autopilot screwed up, so it can gather data and investigate, because the victim was aware of issues at this location and tried to raise them with Tesla on (apparently) multiple occasions, to no avail.
What blows my frigging mind, though, is that the car will drive into a stationary object with high-contrast safety striping without attempting to brake. Are they trying to determine approach speed based only on visual sensors that were blinded? Their "neural net" doesn't seem to be learning some important lessons quickly enough.
Re:Sounds like a CYA distraction statement (Score:4, Insightful)
In any other scenario I agree with you, but the single person to blame for the death in this case is the driver.
Not only did the driver know it was buggy, he apparently knew that the car steered towards THAT SPECIFIC DIVIDER, and even attempted to demonstrate it to his wife by her own admission.
If I do something that I know is going to get me killed, in a place that I know is going to get me killed, and ignore warnings telling me that what I'm doing is about to get me killed, then there are two possible explanations: attempted suicide, or a Darwin Award.
Re:Sounds like a CYA distraction statement (Score:5, Informative)
"Victim blame much?"
In this case it is perfectly justified, since the "victim" was an adult with a driving license who, when in charge of a vehicle, was legally REQUIRED to keep his eyes on the road ahead regardless of any self-driving capabilities of the car. Clearly he didn't, and he paid the price.
The guy was supposedly smart, a programmer, not some Joe Sixpack, and was certainly aware that Autopilot is not 100% reliable. The only person to blame here is the driver, through incorrect operation of the vehicle. If an airliner's autopilot made a mistake that the pilot had plenty of time to correct but didn't bother to, we'd blame the pilot, not the automation. Same here.
Re: (Score:2)
...when the AI self-driving system starts giving you warning messages about its inability to cope with the current road conditions that you should pay attention to it.
Where did it state that the AI was giving warnings about not being able to cope with the current road conditions? It mentioned giving warnings about not paying attention, not that it was having any particular troubles with the road.
Ability to self assess (Score:3)
If the AI system doesn't think it can manage things anymore and the user is not responding to input, it should throw the hazard lights on and make an emergency stop.
The first problem is at the "if".
Seems that in some cases, the "Autopilot" is completely persuaded that it is on the correct course.
It genuinely thinks that "straight ahead" is the 100% correct answer to the problem.
In that case it will never tell the driver, "Hey, I need help."
Again, it's an "autopilot" (see planes, boats, etc.): just a thing that automates some low-level work. The captain of the airplane/boat/Tesla should still keep focused and check that everything goes as it should (it's a "level 2" autonomous...
Re: (Score:3)
See Human Factors Engineering [wikipedia.org]
Re: (Score:2)
What good is it even if they say you need to keep your hands on the steering wheel?
Autopilot is in development, and is improving with every update. Progress requires testing. If you don't want to be a guinea pig, then don't engage Autopilot, or even better, don't buy a Tesla.
I own a Tesla, and while Autopilot isn't perfect, it is pretty good, and getting better. Nothing in life is risk free.
Re: (Score:2)
Autopilot was improving with every update, until Tesla had a falling out with the original developers of the system and they had to go it alone. Now it's been killing drivers.
Re: (Score:2)
What's interesting is that Huang had very specifically complained, multiple times, about the Autopilot swerving towards this area of the divider. So either he forgot about it and just happened to take his hands off the wheel six seconds before coming upon the divider, or he knew it was coming up and took his hands off purposefully in order to get into what he thought would be a minor accident and subsequently sue Tesla.
Blame game engaged! (Score:4, Insightful)
Tesla blames dead driver. Dead driver's family blames Tesla. Who is really at fault here?
I think the four-year-old girl is right: Why not both?
Re: (Score:2)
Because "perfect" is too high a standard.
"Better" is still preferable to "40% more accidents but at least it was at human hands!".
Re: (Score:2)
I'm not asking for perfect. If visibility was as good as Tesla says it was, why couldn't the car stay in its lane, and why did it steer into an obstruction?
Re: (Score:3, Informative)
All they said was that there were multiple warnings that day, but not specifically in the six seconds before the crash when he had his hands off the wheel. It's a classic misleading statement.
Your objection is wholly irrelevant. He had to sign a paper saying he understood that he had to remain alert with his hands on the wheel to even get the feature turned on, and he was reminded of that obligation several times during the trip. It completely and totally does not matter whether the vehicle warned him in the six seconds prior to the collision. He was warned repeatedly, and before he even used the feature once he agreed that he understood his responsibilities.
Tesla is actually less at fault here
Suicide by Autopilot (Score:5, Informative)
[Emphasis mine] Hands not on the wheel, on a clear day, with plenty of warnings to pay attention; it's like he purposely wanted to crash.
Re: (Score:2)
No, he had complained about the Autopilot swerving towards this exact area more than once before. That he just happened to take his hands off the wheel right where he knew the divider was coming up is rather suspect. Personally, I'd want his internet search history for the three weeks or so before the crash.
Re:Suicide by Autopilot (Score:5, Insightful)
More likely, he had a false impression that something really bad could never happen to him -- that bad luck is something that happens to other people. It's the same reason that people text while driving. They are confident in their own situation and their own ability to handle dangerous conditions, and sometimes they end up being wrong.
Re: (Score:2)
He was a geek. He was testing an edge case. He had told Tesla about his test results; they logged them as complaints. This time the barrier was damaged, and the Autopilot went straight in. He wasn't expecting that.
Re: (Score:2)
No, the barrier was damaged the entire time. That's why the autopilot was failing.
Re: (Score:2)
Also, the markings on the road were poor.
Auto-copilot would be more appropriate (Score:2)
Re: (Score:2)
It's nothing but smart cruise control and lane assist (that kills you).
Established car companies have both available. Not so much the "that kills you" part, but after enough miles, you can bet it has happened.
Re: (Score:2)
Not really. "Autopilot" has never meant fully autonomous computer control with no supervision from the pilot. In fact, the first Sperry autopilot made its debut at the Paris Air show in 1914, back when a "computer" was a person whose job was to do arithmetic by hand. It was a simple gyroscopic affair that enabled forward progress in a straight line and... well... nothing else. Rather, an autopilot is, and always has been, merely a tool to reduce the pilot's workload. It still requires preparation, programming...
Why it can't check driver alertness? (Score:5, Interesting)
Tesla should be issuing challenges that the driver must respond to correctly; if not, it should pull the car over and stop.
If an alert driver is a necessary requirement for safety, the system should check for alertness and stop the car safely if the driver is not alert. It is weaseling out if it allows the car to stay on Autopilot even after its request for a manual takeover is not honoured. But Tesla knows the appeal of Autopilot would be greatly reduced if it enforced alertness rules (see the sketch below).
This is why I did not order Autopilot when my Model 3 offer came through last Sunday. I am a great supporter of Tesla, but Autopilot is misnamed, and the promotion of its use is not right.
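A minimal sketch of the challenge-and-response idea above, assuming hypothetical callbacks (issue_challenge, got_response, pull_over) and made-up timings; it illustrates the policy, not any shipping system:

```python
# Sketch of a challenge-and-response alertness check (interfaces
# and timings are hypothetical, purely to illustrate the policy).
import random
import time

def alertness_loop(issue_challenge, got_response, pull_over,
                   interval_s=60.0, timeout_s=10.0):
    """Periodically demand driver input; stop safely if it never comes."""
    while True:
        # Randomize the timing so drivers can't game the check.
        time.sleep(interval_s + random.uniform(-10.0, 10.0))
        issue_challenge()          # e.g. "apply slight torque to the wheel"
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            if got_response():     # driver is alert, carry on
                break
            time.sleep(0.1)
        else:
            pull_over()            # no response: stop safely, disengage
            return
```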
Re: (Score:2)
Kind of like a dead man's switch on a train or a tram.
Re: (Score:3, Informative)
I own a Tesla. It bings at you if you're not holding the wheel, and WILL eventually shut off Autopilot, but bringing the car to a stop is inherently dangerous.
Ultimately, the driver of any vehicle is responsible for following the instructions for operating that vehicle.
Re: (Score:3)
Tesla's software is what steered the car into the barrier. Case closed.
CalDOT did not repair a crash barrier as required by law, causing a probably unnecessary fatality. Case closed.
If the driver is not attentive or interacting enough, the car could turn the driver-assistance features off. If Tesla didn't want to do that because it wanted to look high-tech and smart, the car could decelerate to the minimum of the posted speed limit or 40 mph -- most drivers will re-engage pretty quickly if they're...
several hundred feet of visibility ahead (Score:2)
"The crash happened on a clear day with several hundred feet of visibility ahead," ... which makes you wonder how the hell the computer missed a farking wall in the middle of the road.
Re: (Score:2)
Tesla has LIDAR now?
Re: (Score:2)
A quick google search shows you're making things up.
But that's the whole point. If a person can see the barrier, the computer should, too. If it can't see a fixed barrier on a freeway, it will kill people. That's not fit for the roads.
Re: (Score:3)
Teslas don't have lidar. You're thinking of Google cars.
Summary (Score:3, Insightful)
Tesla blames driver for using the Autopilot in exactly the way you'd expect 90% of Autopilot users to use it.
If hands-on is a requirement then... (Score:3)
Surely if Tesla demands that drivers keep their hands on the wheel at all times that the autopilot is engaged then they should have a sensor for this and disengage the autopilot whenever the driver releases the wheel -- as a safety measure.
The fact that they don't do this is a clear indication that they really do expect people to take their hands off the wheel and use Autopilot as if it were perfect. Stop passing the buck, Tesla!
Re: (Score:2)
Surely if Tesla demands that drivers keep their hands on the wheel at all times that the autopilot is engaged then they should have a sensor for this and disengage the autopilot whenever the driver releases the wheel -- as a safety measure.
Turning off autopilot when the driver releases the wheel AS A SAFETY MEASURE? If you were in a car in motion, which do you think is safer for the driver and others nearby: an automated co-pilot with a safety track record better than humans or an uncontrolled car in motion?
The sensors alert the driver to put their hands back on the wheel instead of turning the car into a large, fast, uncontrolled missile.
Safe degradation (Score:2)
If the autopilot is unsafe around barriers like that, it should refuse to operate around those barriers. If it can't be made to recognize those situations, it should not be used at all.
AutoPR? (Score:3)
Does anyone know if Tesla is using a bot to write their Press Releases as well?
The following:
The reason that other families are not on TV is because their loved ones are still alive.
Does not sound like something a human PR Professional would write.
In other news - Tesla autopilot can't handle 6 sec (Score:2)
I'm a pilot, and have a real autopilot (Score:5, Insightful)
I'm a pilot, been flying for 30 years, and I've flown with other pilots with varying skill and experience levels.
The most experienced pilot I've flown with never took his left hand off the control yoke. I watched him for hours while I was in the co-pilot and jump seats. He'd visit, configure radios, adjust power, but if his left hand ever came off that yoke it went right back on it as soon as the immediate task was done.
I'll drive with my Tesla's Autopilot the same way that gray-haired old pilot flew with an autopilot, and with any luck I'll live to be just as old.
Re: (Score:3)
I've got a CPL, and in all my training, one thing that always stood out is that I never fully trusted the autopilot.
There's a great video from 1997 called "Children of the Magenta" that seems to ring true with everything I hear about Tesla issues like this.
Youtube link:
https://www.youtube.com/watch?... [youtube.com]
Aircraft, cars, the lessons are the same.
Re: (Score:3)
The most experienced pilot I've flown with never took his left hand off the control yoke.
That's the exact work reduction autopilots are designed to enable.
Your so-called "most experienced pilot" (*cough* MORON) had just defeated the exact purpose of an autopilot.
This guy might as well fly with the autopilot off.
You are the kind of guy I do not want to be my captain or co-pilot.
What's the point of semi-autonomous driving? (Score:2)
Re: (Score:2)
To me it has to be either autonomous or not. If semi-autonomous driving requires you to be engaged and alert with both hands on the wheel, ready to take control at any time, then what's the point? How is it different from regular non-autonomous driving? Can anyone share their experience?
Two ways.
First, it has a better track record than your average human driver, so it can help avoid accidents that the human may not.
Second, like an aircraft autopilot, it can handle routine matters, but there are still times, during an emergency or an unusual situation, when it needs someone who can handle what it can't.
Think of it like this: both humans and Tesla's Autopilot have a high overlap in what they can handle. There are some things the autopilot handles better just due to reaction time and 360-degree vision. There...
Re: (Score:3)
it's statistically safer than non-autonomous driving.
In other words. (Score:2)
In other words:
"The victim was using cruise control. Our Tesla is not a self-driving car. Stop calling it that. Reliable self-driving cars do not exist."
Do. Not. Exist.
Re: (Score:2)
Why would someone think an "autopilot", a word already used to describe a device that makes a plane self-flying, would be self-driving?
I think Tesla should change the name until they are willing to stand behind it as self-driving.
Re: (Score:3)
In the movies the pilot flips a switch, a red light comes on, and he goes to the back of the plane to fight hijackers or have a smoke.
I think it's a poor choice of terminology, because there are so many misconceptions about what autopilot is for aircraft that it's difficult to shift the metaphor to a car.
Having to hold the wheel and pay attention is less useful than cruise control. At least with cruise control I can take my foot off the gas pedal.
What's the point? (Score:3)
Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel.
I already have to do that. What's the point of buying this autopilot-that-isn't-really-an-autopilot?
Re:Is it just me or is this just not an autopilot? (Score:5, Informative)
People die while driving to work. Using your argument, no one would ever get into a car.
Yes, Tesla's Autopilot isn't perfect, and its capabilities may be exaggerated, but I believe that, overall, drivers using Autopilot are less likely to get into an accident. Isn't that the real measure?
Tesla's crash rate dropped 40 percent after Autopilot was installed, Feds say [theverge.com]
Don't let the perfect be the enemy of the good.
Re: (Score:3)
Tesla offers, per their manual:
Autopilot Tech Package:
• Traffic-Aware Cruise Control
• Autosteer
• Auto Lane Change
• Autopark
• Auto High Beam
Why are Musk and the graph specifically referencing Autosteer? I never fully trust data when specific and unexpected words are used. The omission of the word Autopilot implies...
Re: (Score:2, Insightful)
If you call it autopilot, it should be an autopilot. If it's a warning system, call it something else. Words mean things.
Re: (Score:2)
No stats? Didn't you see the link I put in my post?
Re: (Score:3)
That is the point.
1.25 million people die on the road every year, worldwide (a number from 2013; I had it in my head due to unrelated research).
If Tesla has one millionth of the car market, expect one death per year. That would be statistical normality.
Re: Is it just me or is this just not an autopilot (Score:5, Insightful)
It seems to me that the only point of having an autopilot would be so that you could take your hands off the wheel and not pay attention to the road. This is sorta-kinda-an-almost-but-not-quite autopilot that works ok most of the time but has failure modes involving death and / or dismemberment. Who the hell would sell a half-assed, half-baked "feature" like this?
It seems to me that the only point of having cruise control would be so that you could take your feet off the pedals and not pay attention to your speed. This is sorta-kinda-an-almost-but-not-quite cruise control that works ok most of the time but has failure modes involving death and / or dismemberment. Who the hell would sell a half-assed, half-baked "feature" like this?
Re: Is it just me or is this just not an autopilo (Score:3)
On the other hand, you command the "autopilot" to take full control of the car without causing accidents, which is an undefined problem (and Tesla is very careful about not defining...
Re:Is it just me or is this just not an autopilot? (Score:4, Interesting)
Yeah, the name Autopilot is one of the things that kills me about what they are truly selling. Tesla's cars are at best a level two self-driving car, that is, hands-off only. You have to have eyes on, and you have to give continual input to the system. It is adaptive cruise control, lane keeping, and auto parking. It has some guidance from GPS and on-board software, but the production car that you buy is nowhere near this crap. [tesla.com] That video is clearly showing a level three car handling cleared intersections easily, something that would be suicide to try with the current level twos.
The Autopilot is anything but. I'm totally pro self-driving cars, but folks need to know what they are buying and not have a hyped-up product sold to them thinking it will do something it won't. Tesla cars are hands-off only, period, the end. You have to keep eyes up, no matter what. Additionally, you need to know the absolute limits of camera/radar combos (that's right, Teslas do not have LiDAR). Radar works by calculating differences between returns. If traffic is stopping up ahead and the car in front of you that your Tesla is tracking suddenly pulls out of the lane to expose a car at a complete stop, your Tesla is going to ram full speed into that stopped car if you don't do something. That's because the radar was tracking something, and now it's not there, so the machine needs to recalculate everything; if you're going highway speeds, you're going to end up dead before the car figures it out.
The cars need to see lines on the road. If the lines are iffy, you're going to end up dead. Traffic needs to follow a pace; it doesn't matter if it is stop-and-go, or if cars gracefully merge in and out of your lane, it just needs to follow a smooth flow, and you slowly build up a feel for what's gradual enough and what isn't. If you don't pay attention to that, you're going to end up dead. If you are coming up on a change in the road's shape, like where two highways split and you're in the lane closest to the split, you need to turn off Autopilot and handle it yourself. Most of these splits have really crappy indicators on the road that a split is happening, and if you don't take over, you are going to end up like that dude. Dead.
Now if you think that level two automation is a half-baked idea, that's cool. It sort of is, which is why everyone is aiming for that holy grail of level five. So perhaps sit on the sidelines till we get there? If what you are comparing it to is level five, you're right, this shit is beta-level crap on crap. If you're talking about actual level two automation, the Tesla and all the other cars that offer level two are pretty solid. But people need to understand what they are getting themselves into, and if that's not what you were expecting, then yeah, you shouldn't buy one. However, I also fault Tesla, since they post videos like the one I linked, and people buy their cars thinking that's what they are getting, which it isn't.
Re: Is it just me or is this just not an autopilot (Score:2)
I pointed out the problems with the name Autopilot at the time of a previous crash. If a geek like poor Mr. Huang can be fatally wrong about its limitations (assuming that geeks are less likely to blindly trust technology), what chance do mere mortals have?
Totally agree that Autopilot is broken if it ignores a lane divider. If I had a Tesla I'd keep my hands on the wheel, my feet over the pedals and my eyes on the road.
Then after 10,000 miles I'd think, "Hey, this Autopilot is pretty good," and trust it more...
Re: (Score:2)
You want to kill yourself, and let the car help?
Re: Tesla autopilot unable to autopilot (Score:3, Informative)
I own a Tesla S. I use Autopilot daily; it's awesome.
However, you do have to know what it is and is not capable of, and you do have to be attentive, because it can get itself into trouble (today, for example, it didn't want to let a bus into my lane; the bus came in anyway).
All that said, calling the thing Autopilot is what gets Tesla in trouble. It's more of a "co-pilot".
Re: Clear day (Score:2)
And yet your "AI" crashed and killed someone. What idiots. Enough with the AI BS. It ain't happening.
You're the only one stupid enough to call it an AI. Tesla has been very clear about their software's abilities and limitations.
Re: (Score:3)
It is very clearly an AI. Just not a good attempt, at least not yet.
Re: (Score:3)
Tesla's Autopilot doesn't use human eyes. If I understand it correctly, it has a monochrome camera and a forward-facing radar. If there is not enough monochromatic contrast between an object and its surroundings, the camera won't detect it, and if the object is at an angle and composition where it cannot return radar signals directly back to the car, the radar won't detect it.
How good the visibility is only affects the human, who is the one with the driver's license and the responsibility that goes with...
Re: Then they should stop calling it "AUTOPILOT" (Score:5, Informative)
All "AUTOPILOT" does is conjure up images of planes flying themselves while pilots LEAVE THE FUCKING COCKPIT to go to the bathroom.
No pilot would ever do that exactly because the autopilot is just a simple program which only controls speed and heading. In the sky, with very few aircraft around you, it would be much safer to leave the controls than it would be in a car, on a highway, and yet aircrew always make sure that there is at least one pilot monitoring the controls at all times. If you hear "autopilot" and think "well, no humans required!" then you are badly misinformed.
Re: (Score:2)
If you hear "autopilot" and think "well, no humans required!" then you are badly misinformed.
I do think autopilot means no human intervention required, and I would be badly misinformed, but I would also be representative of the majority of people. That last point is the most important one. It's the point that makes Tesla's branding effective.
We're talking about branding and marketing. The dictionary meaning or the "real" meaning is irrelevant. Only what the majority of potential buyers think is important. That these explanations about the "real" meaning are necessary strongly indicates that the...
Re: (Score:2)
Or go for a quickie. There was a pilot's program at my school, and I knew several people in the program who joined the mile high club. Autopilot was doing the flying.
Re: (Score:2)
Then something has been gained.
We need more autonomous car testing around Critical Mass rides! Particularly Uber, but all should participate, no matter how poorly funded and/or engineered. We should see at least a few pigeons driving heavy trucks.