Transportation Software Technology

Tesla Issues Strongest Statement Yet Blaming Driver For Deadly Autopilot Crash (abc7news.com) 467

Tesla has released its strongest statement yet blaming the driver of a Tesla Model X that crashed on Autopilot almost three weeks ago. The driver, Walter Huang, died March 23rd in Mountain View when his Model X on Autopilot crashed head-on into the safety barrier section of a divider that separates the carpool lane from the off-ramp to the left. Huang was an Apple engineer and former EA Games employee. ABC7News reports: Tesla confirmed its data shows Walter Huang was using Autopilot at the time of the crash, but that his hands were off the wheel for six seconds right before impact. Tesla sent Dan Noyes a statement Tuesday night that reads in part, "Autopilot requires the driver to be alert and have hands on the wheel... the crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road." The family's lawyer believes Tesla is blaming Huang to distract from the family's concern about the car's Autopilot.
Here is the full statement from Tesla: "We are very sorry for the family's loss. According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so. The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang's drive that day. We empathize with Mr. Huang's family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Wednesday April 11, 2018 @07:10PM (#56421111)
    If it was a clear day with several hundred feet of visibility, there is no reason for Autopilot to steer the vehicle into a concrete divider. What good is it even if they say you need to keep your hands on the steering wheel? It doesn't sound very auto to me.
    • by MightyYar ( 622222 ) on Wednesday April 11, 2018 @07:14PM (#56421131)

      I'm just thrilled that these millionaires are doing the beta test for us. In a few years, they'll have most of the bugs worked out and the tech will be a commodity. They are true martyrs for the little man.

      • by catchblue22 ( 1004569 ) on Thursday April 12, 2018 @12:05AM (#56422221) Homepage

        The family admits that the driver had had issues at that exact location. Why on earth would he use it there then? Why wasn't he paying attention near that spot? Why did he ignore the warnings? He was a programmer. He should have known.

        • by mjwx ( 966435 )

          The family admits that the driver had had issues at that exact location. Why on earth would he use it there then? Why wasn't he paying attention near that spot? Why did he ignore the warnings? He was a programmer. He should have known.

          The car was on Autopilot... You know A-U-T-O-Pilot. The car should have driven itself whilst the attendant sat back watching movies on their phone.

          That is the logic you can expect from end users. Warnings are just something to be ignored or at the very worst summarily dismissed. Autonomous cars are something that has been sold to them as a magic bullet for their driving woes. The end user fully believes that their time having to pay minimal attention to the road is at an end and that the car will automatically handle everything for them.

          • No! You're the fucking asswipes that don't get it. It's got AUTOPILOT, not Chauffeur. When you get on an airplane, the pilots sit in front, don't sleep, and watch the skies, the instrumentation, and the aircraft handling; the pilots are paying attention! That is how you operate with autopilot. You don't see the pilots both taking a nap or coming back to schmooze with the flight attendants.

            • by mlyle ( 148697 )

              Look, human beings suck at vigilance tasks. "This is almost always OK, detect the one time in an hour that it's not" -- no one can muster the attention. X-ray screeners use something called "Threat Image Projection," which shows them pictures of bombs and guns and keeps them alert (it lets them know it's a test, but helps keep their mind in the "where's the gun in THIS one?" mode instead of "oh, look, another suitcase probably without a gun"). Even S&R dogs find trainers even in the middle

    • then he should have been able to steer it away from the divider

    • by c6gunner ( 950153 ) on Wednesday April 11, 2018 @07:36PM (#56421277) Homepage

      What good is it even if they say you need to keep your hands on the steering wheel? It doesn't sound very auto to me.

      I turned on cruise control and it drove right into a stopped car. What good is cruise control if I have to manually slow down? It doesn't sound very "in control" to me.

      • Re: (Score:3, Insightful)

        by djinn6 ( 1868030 )
        Comparing it to cruise control is stupid. Cruise control maintains your speed extremely well and doesn't ever fail catastrophically. In hilly terrain it might go slightly too slow or too fast, but it doesn't put you in a dangerous situation. Autopilot on the other hand is supposed to keep you in the lane, but as this case demonstrates, it's actually not very good at it, and when it fails, you're in a life-and-death situation.
        • by c6gunner ( 950153 ) on Wednesday April 11, 2018 @08:18PM (#56421507) Homepage

          Autopilot on the other hand is supposed to keep you in the lane

          No, it's not. It's supposed to do a whole bunch of things to assist you, but only if you're paying attention. It was never advertised as a "go to sleep and I'll drive for you" system, any more than cruise control was.

          • Re: (Score:3, Interesting)

            by stabiesoft ( 733417 )

            The difference is cruise does exactly what is advertised. It maintains speed. Autopilot is advertised to stay in the lane and maintain speed like adaptive cruise does. In this case, it did not do what was advertised and someone died. Tesla is trying to shape public opinion on this because, unlike in AZ, it was not some homeless person whose family probably settled for peanuts. This is likely to become a 7 or 8 figure payout due to the earnings potential of an Apple engineer if it goes to trial.

            • Autopilot is advertised to stay in the lane and maintain speed like adaptive cruise does.

              You and the other nincompoops can keep repeating that as much as you like, but repetition does not make something true. Here's how Tesla advertises their newest "self driving" cars:

              "Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver. "

              That's an advertisement

              • by superdave80 ( 1226592 ) on Wednesday April 11, 2018 @09:57PM (#56421859)

                ...at what we believe will be a probability of safety at least twice as good as the average human driver.

                The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road,

                So, it was an easily avoidable accident for a human driver... but we have an autopilot that couldn't do it, even though we claim it's twice as safe? Sounds like they are talking out of their asses from both ends here.

        • by blindseer ( 891256 ) <blindseer@noSPAm.earthlink.net> on Thursday April 12, 2018 @12:29AM (#56422259)

          Cruise control maintains your speed extremely well and doesn't ever fail catastrophically.

          Actually it can and does. Cruise control in slippery conditions can put a car into a dangerous condition.

          Here's one citation, I'm sure anyone can find more:
          https://www.theglobeandmail.co... [theglobeandmail.com]

          Early cruise control systems were sometimes quite dangerous, not always to the passengers but could cause damage to the engine or transmission. I remember cars having a hardwired switch on the dash to disable them, in addition to the software button on the steering wheel, because people learned not to trust them. They got "smarter" and today most will detect wheel slippage and not gun the engine if it hits a slippery spot in the road.

          Cruise control is especially dangerous with rear wheel drive and powerful engines, like on a sports car or light truck. One wheel on a slick patch will cause the cruise control to open up the throttle and get the wheels spinning; when they finally find traction, the vehicle might no longer be pointed in the desired direction of travel, and the front wheels could still be on a slick surface, which can send the vehicle flying uncontrolled.

          Cruise control is very safe, especially newer systems that integrate with traction control, but a claim that they never fail catastrophically is provably false.

        • by fgouget ( 925644 )

          Cruise control maintains your speed extremely well and doesn't ever fail catastrophically.

          In the same situation cruise control would have sent the car straight into the obstacle without ever braking. In fact, that's exactly what it did. The only case where cruise control "doesn't fail catastrophically" is when the driver pays attention to the road and takes over when necessary. Precisely what would have saved this driver.

      • What good is cruise control if I have to manually slow down?

        Because slowing down is the exact opposite of the purpose of cruise control? What dingbats modded this interesting?

    • by alvinrod ( 889928 ) on Wednesday April 11, 2018 @07:42PM (#56421317)
      The moral of the story is that when the AI self-driving system starts giving you warning messages about its inability to cope with the current road conditions, you should pay attention to it.

      Tesla should know better though. People are fucking idiots and the vehicle should not assume they'll act responsibly. If the AI system doesn't think it can manage things anymore and the user is not responding to input, it should throw the hazard lights on and make an emergency stop. Systems like this should always be able to fail gracefully. If this is a repeated problem, the system should disable the auto-pilot feature and refuse to let the driver use it. If they want it turned back on, they can write to Tesla and explain why they think that they should be allowed to be a colossal moron with a quarter million joules of kinetic energy.
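
      A minimal sketch of the escalation this comment proposes, assuming hypothetical vehicle hooks (visual_alert, controlled_emergency_stop, disable_driver_assist, and so on) that stand in for whatever a real driver-assist stack exposes; the thresholds are invented for illustration and none of this reflects Tesla's actual logic.

      ```python
      # Hypothetical escalation policy for an inattentive driver, per the comment above.
      # Every method on `car` is an invented stand-in, not a real Tesla API.
      from dataclasses import dataclass

      @dataclass
      class AttentionFailsafe:
          warn_after_s: float = 6.0    # hands-off time before warnings start
          stop_after_s: float = 15.0   # hands-off time before an emergency stop
          max_strikes: int = 3         # ignored-warning incidents before lockout
          strikes: int = 0

          def tick(self, hands_off_s: float, car) -> None:
              """Called periodically while driver assistance is engaged."""
              if hands_off_s < self.warn_after_s:
                  return                           # driver seems engaged; nothing to do
              if hands_off_s < self.stop_after_s:
                  car.visual_alert()               # nag first, escalate later
                  car.audible_alert()
                  return
              # Warnings were ignored: fail gracefully instead of carrying on.
              car.hazard_lights_on()
              car.controlled_emergency_stop()
              self.strikes += 1
              if self.strikes >= self.max_strikes:
                  car.disable_driver_assist()      # stays off until the owner contacts the maker
      ```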
      • by lhunath ( 1280798 ) <lhunath@l y n d i r.com> on Wednesday April 11, 2018 @08:19PM (#56421513) Homepage

        I think it's important to be mindful with your terminology. Tesla's Autopilot is not a self-driving system. It is cruise control. Conflating terminology causes nothing but confusion and undue misconceptions about emerging technologies.

        If people stop thinking about Autopilot as self-driving and start thinking about it as cruise control, it becomes immediately obvious that this is not a conversation about why the car did not dodge the obstacle, but rather a conversation about why the human looked away from the road while hurtling forward at great speed in a metal basket, long enough to travel at least 200 meters in distance.

        • I know people with autopilots on boats, and when engaged, they'll go downstairs and only pop their heads out of the cabin every ten minutes or so. But otherwise they're watching movies or cooking or eating or using the toilet. Sometimes they'll even go to sleep if they're alone.
        • by ewibble ( 1655195 ) on Wednesday April 11, 2018 @09:06PM (#56421679)

          Perhaps they shouldn't call it autopilot? The term is clearly a marketing term that makes you think it is going to automatically pilot the car. Call it advanced cruise control, or lane assistance.

          Also, if the car can detect that your hands are off the wheel, and it is not capable of guiding itself when your hands are off the wheel, then shouldn't it immediately warn you when you do so and come to a safe stop? At what point, when the car is moving and you are driving safely, should your hands be off the wheel?

          • by dgatwood ( 11270 ) on Thursday April 12, 2018 @02:27AM (#56422545) Homepage Journal

            Perhaps they shouldn't call it autopilot? The term is clearly a marketing term that makes you think it is going to automatically pilot the car. Call it advanced cruise control, or lane assistance.

            The terminology issue is a red herring. The reality is that partial self-driving capabilities lull users into a false sense of security because they usually work well. The rare failures are often catastrophic precisely because people have gotten used to the technology working, and end up surprised when it doesn't.

            That's what made this recent software update such a problem. It made major changes to the way autosteer works on (at least) AP2-based Tesla cars. One of the big changes was "wide lane" handling, which changed the way vehicles behave when they encounter a wide lane, such as an exit lane. This has resulted in a number of unexpected behaviors, up to and including cars driving straight towards gore points.

            I don't know whether that change was in any way a factor in the autosteer malfunction that led to Mr. Huang's death, because I have no way to know what firmware version that car was running. However, the fact that this major update was in the process of being rolled out to users at the time of the accident is suspicious.

            To be fair, a lot of other driving situations got significantly better with that software update. However, Tesla AP's tendency to ignore solid white lines has been an ongoing problem that might well have been made worse by that update; if that is the case, then the problem needs to be corrected ASAP, and they probably should NOT have continued the rollout of that update. Either way, I'm not convinced that Tesla did enough to warn drivers that autosteer might behave very differently, and to be particularly alert after that update.

            Also, I would add that, speaking as a Tesla owner, it bothers me to see the amount of spin they're spewing after this accident. I realize that they don't want to let their users get scared into not using AP, because on average, it does significantly reduce accidents. And if there are videos out there showing AP malfunctions that they feel are not genuine, they can and should comment. But they should really stop trying to convince the public that the driver was solely to blame, because IMO, that just isn't the case.

            First, the fact remains that autosteer obviously DID malfunction, and that malfunction DID result in a fatality that would NOT have occurred if the vehicle had not been equipped with autosteer functionality (because no sane driver would have looked away from the road for 5+ seconds without that functionality).

            Second, the situation was entirely predictable. For at least a decade, people have warned that humans are likely to zone out in partial self-driving situations, and that it isn't really possible to change that innate human tendency. Tesla ignored those warnings and pushed forward anyway, and someone died. They blamed the driver, and the crash investigators tentatively agreed, and they kept pushing forward. And then a second person died. And now a third. IIRC, product liability law hinges in large part on whether user errors are reasonably predictable, and no "I agree to pay attention" can change that fact, which means this is little more than a legal smokescreen, IMO.

            Third, the fact also remains that Caltrans failed to reset the safety barrier that was designed to slow down a car before impacting the gore point, after the barrier was collapsed in a wreck nearly two weeks earlier. And the fact remains that had the barrier been reset properly (as is required by law), it is unlikely that Mr. Huang would have died.

            In other words, there are three parties, any one of whom/which could have prevented the fatality, and the deceased driver was only one of those three. So it is entirely disingenuous to try to pin this on the driver in the court of public opinion. IMO, it really isn't a question of who is at fault; they all are. Rather, it's a question of what share of the fault each of them bears.

            • Perhaps they shouldn't call it autopilot? The term is clearly a marketing term that makes you think it is going to automatically pilot the car. Call it advanced cruise control, or lane assistance.

              The terminology issue is a red herring.

              No, it isn't. People have a conception that "autopilot" for aircraft means the computer takes over for the pilot, who can then take a nap or wander around the cabin. We as a society have used the prefix "auto-" to mean that a computer handles the whole process.

              Second, the situation was entirely predictable. For at least a decade, people have warned that humans are likely to zone out in partial self-driving situations ... and someone died.

              No amount of PR can fix dumb people and their misinterpretations of reality, and Tesla has made it a big point to sell how "safe" the combination of technology they call "autopilot" is, while not addressing any shortcomings. A car that can keep you in a

          • by Keick ( 252453 )

            Perhaps they shouldn't call it autopilot?

            I really get tired of this argument. Do you even know where the term Autopilot comes from or what it means?

            From Wikipedia emphasis mine:

            An autopilot is a system used to control the trajectory of an aircraft without constant 'hands-on' control by a human operator being required. Autopilots do not replace human operators, but instead they assist them in controlling the aircraft. This allows them to focus on broader aspects of operations such as monitoring the trajectory, weather and systems.

            The autopilot, in every fucking definition of the word, is an assist device for the pilot, not a replacement. It handles speed, heading, and in some cases altitude. It doesn't monitor other aircraft, it doesn't avoid collisions, it doesn't (endless list of pilot tasks).

        • I think it's important to be mindful with your terminology. Tesla's Autopilot is not a self-driving system. It is cruise control. Conflating terminology causes nothing but confusion and undue misconceptions about emerging technologies.

          That's literally what autopilot is though. [wikipedia.org]

          An autopilot is a system used to control the trajectory of an aircraft without constant 'hands-on' control by a human operator being required. Autopilots do not replace human operators, but instead they assist them in controlling the aircraft. This allows them to focus on broader aspects of operations such as monitoring the trajectory, weather and systems.[1]

          No (sane) person thinks autopilot is supposed to take a plane through an obstacle course of other aircraft. It's merely to assist a pilot to keep a heading, altitude, etc.

        • Before "people stop thinking about Autopilot as self-driving and start thinking about it as cruise control", as you say, perhaps Tesla ought to describe it as you do instead of the way they describe it:

          https://www.tesla.com/autopilot [tesla.com]

          Full Self-Driving Hardware on All Cars

          All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.

      • by aaarrrgggh ( 9205 ) on Wednesday April 11, 2018 @08:23PM (#56421527)

        More fundamental to me is the issue that said car should not drive straight into a wall at full speed without trying to slow down.

        There is plenty of blame to go around -- victim, Tesla, Caltrans for starters. Each of them screwed up on at least two levels. Tesla likely needs some kind of way for drivers to flag a spot where the autopilot screwed up, so they can gather data and investigate, because the victim was aware of issues at this location and tried to address it with Tesla on (apparently) multiple occasions, to no avail.

        What blows my frigging mind though is that the car will drive into a stationary object with high-contrast safety striping without attempting to brake. Are they trying to determine approach speed based only on visual sensors that were blinded? Their "neural net" doesn't seem to be learning some important lessons quickly enough.

        • by thegarbz ( 1787294 ) on Thursday April 12, 2018 @01:45AM (#56422443)

          In any other scenario I agree with you, but the single person to blame for the death in this case is the driver.
          Not only did the driver know it was buggy, he apparently knew that the car steered towards THAT SPECIFIC DIVIDER, and even attempted to demonstrate it to his wife by her own admission.

          If I do something that I know is going to get me killed, in a place that is going to get me killed, and ignore warnings telling me that what I'm doing is about to get me killed, then there are two possible explanations for that: attempted suicide, or a Darwin award.

      • ...when the AI self-driving system starts giving you warning messages about its inability to cope with the current road conditions, you should pay attention to it.

        Where did it state that the AI was giving warnings about not being able to cope with the current road conditions? It mentioned giving warnings about not paying attention, not that it was having any particular troubles with the road.

      • If the AI system doesn't think it can manage things anymore and the user is not responding to input, it should throw the hazard lights on and make an emergency stop.

        The first problem is at the "if".
        Seems that in some cases, the "Autopilot" is completely persuaded that it is on the correct course.
        It genuinely thinks that "straight ahead" is the 100% correct answer to the problem.
        In that case it will never signal to the driver, "Hey, I need help".

        Again, it's an "autopilot" (see planes, boats, etc.), just a thing that automates some low-level work. The captain of the airplane/boat/Tesla should still keep focus and check that everything goes as it should (it's a "level 2" autonomous system).

    • Because the technology is in its infancy? To me it's the equivalent of saying "this medical student is able to treat patients as long as he/she is supervised by an experienced doctor." We don't conclude that to mean that the experienced doctor should sleep in a chair and everyone should fully trust the med student. Ready for use with supervision, and ready to be trusted, are very different things.
    • What good is it even if they say you need to keep your hands on the steering wheel?

      Autopilot is in development, and is improving with every update. Progress requires testing. If you don't want to be a guinea pig, then don't engage Autopilot, or even better, don't buy a Tesla.

      I own a Tesla, and while Autopilot isn't perfect, it is pretty good, and getting better. Nothing in life is risk free.

      • Autopilot was improving with every update, until Tesla had a falling out with the original developers of the system and they had to go it alone. Now it's been killing drivers.

    • What's interesting is that Huang had very specifically complained about the autopilot swerving towards this area of the divider multiple times before. So either he forgot about it and just happened to take his hands off the wheel 6 seconds before coming upon the divider, or knew it was coming up and took his hands off purposefully in order to get in what he thought would be a minor accident and subsequently sue Tesla.

  • by Entrope ( 68843 ) on Wednesday April 11, 2018 @07:15PM (#56421135) Homepage

    Tesla blames dead driver. Dead driver's family blames Tesla. Who is really at fault here?

    I think the four-year-old girl is right: Why not both?

    • Because "perfect" is too high a standard.

      "Better" is still preferable to "40% more accidents but at least it was at human hands!".

      • by Entrope ( 68843 )

        I'm not asking for perfect. If visibility was as good as Tesla says it was, why couldn't the car stay in its lane, and why did it steer into an obstruction?

  • Suicide by Autopilot (Score:5, Informative)

    by Anonymous Coward on Wednesday April 11, 2018 @07:18PM (#56421157)
    Very strange. Specifically that

    Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location

    [Emphasis mine] Hands not on the wheel, a clear day, and plenty of warnings to pay attention -- it's like he purposely wanted to crash.

    • I doubt that he wanted to crash. However, I suspect he wanted to text someone or check something on his phone (it'll only take like 10 seconds, after all) more than he wanted to not crash; apparently his ability to estimate the likelihood of this outcome was about as terrible as most people's.
      • No, he had complained about the autopilot swerving towards this exact area more than once before. The probability that he just happened to take his hands off the wheel when he would have known the divider was coming up is rather suspect. Personally I'd want his internet search history for the three weeks or so before the crash.

    • by Entrope ( 68843 ) on Wednesday April 11, 2018 @07:38PM (#56421283) Homepage

      More likely, he had a false impression that something really bad could never happen to him -- that bad luck is something that happens to other people. It's the same reason that people text while driving. They are confident in their own situation and their own ability to handle dangerous conditions, and sometimes they end up being wrong.

    • He was a geek. He was testing an edge case. He had told Tesla about his test results, they logged them as complaints. This time, the barrier was damaged, the autopilot went straight in. He wasn't expecting that.

  • Auto-copilot would be more appropriate for the way Tesla documents its functionality. As with any slashdot topic, naming can be the hardest part. Autopilot may sound better, but it's deceptive.
    • Or just call it "driver assist", but that wouldn't be flashy enough for those types of companies.
      • It's nothing but smart cruise control and lane assist (that kills you).

        Established car companies have both available. Not so much the 'that kills you', but after enough miles, you can bet it has.

    • Not really. "Autopilot" has never meant fully-autonomous computer control with no supervision from the pilot. In fact, the first Sperry autopilot's debut was at the Paris Air Show in 1914, back when a "computer" was a person whose job was to do arithmetic by hand. It was a simple gyroscopic affair that enabled forward progress in a straight line and... well... nothing else. Rather, an autopilot is, and always has been, merely a tool to reduce the pilot's workload. It still requires preparation, programming, and monitoring by the pilot.

  • by 140Mandak262Jamuna ( 970587 ) on Wednesday April 11, 2018 @07:36PM (#56421275) Journal
    We have known this for 100 years, since the days of steam locomotives: drivers, if they don't have to steer, miss signals and fall asleep. They invented a variety of dead-man switches to check for driver alertness. They do it even now in diesel and electric locomotives.

    Tesla should be issuing challenges, and the driver should respond correctly; if not, it should pull the car over and stop.

    If an alert driver is a necessary requirement for safety, the system should check for alertness and stop the car safely if the driver is not alert. It is weaseling out if it allows the car to stay on Autopilot even after its request for a manual takeover is not honoured. But it knows the appeal of Autopilot will be greatly reduced if it enforces alertness rules.

    This is why I did not order autopilot when my Model 3 offer came through last Sunday. I am a great supporter of Tesla but the auto pilot is misnamed, and promotion of its use is not correct.
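
    For comparison, here is a rough sketch of a locomotive-style alerter adapted to a car, along the lines this comment suggests. All vehicle hooks (issue_challenge, wait_for_acknowledgement, pull_over_and_stop, and so on) and all timings are invented for illustration; no shipping system is claimed to work this way.

    ```python
    # Hypothetical dead-man / alerter loop: challenge the driver periodically and,
    # absent a timely acknowledgement, pull over. All `car` methods are invented.
    import random
    import time

    CHALLENGE_INTERVAL_S = 60.0   # nominal time between challenges
    RESPONSE_DEADLINE_S = 5.0     # how long the driver has to acknowledge

    def alerter_loop(car) -> None:
        while car.driver_assist_engaged():
            # Jitter the interval so the check cannot be gamed with a timer.
            time.sleep(CHALLENGE_INTERVAL_S * random.uniform(0.8, 1.2))
            car.issue_challenge()            # chime plus dashboard prompt
            if not car.wait_for_acknowledgement(timeout_s=RESPONSE_DEADLINE_S):
                car.hazard_lights_on()       # treat the driver as incapacitated
                car.pull_over_and_stop()
                break
    ```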

    • Kind of like a dead man's switch on a train or a tram.

    • Re: (Score:3, Informative)

      by pezpunk ( 205653 )

      I own a Tesla. It bings at you if you're not holding the wheel, and WILL eventually shut off Autopilot, but bringing the car to a stop is inherently dangerous.

      ultimately, the driver of any vehicle is responsible for following the instructions for operating that vehicle.

  • "The crash happened on a clear day with several hundred feet of visibility ahead," ... which makes you wonder how the hell the computer missed a farking wall in the middle of the road.

  • Summary (Score:3, Insightful)

    by quantaman ( 517394 ) on Wednesday April 11, 2018 @07:38PM (#56421293)

    Tesla blames driver for using the Autopilot in exactly the way you'd expect 90% of Autopilot users to use it.

  • by NewtonsLaw ( 409638 ) on Wednesday April 11, 2018 @07:40PM (#56421299)

    Surely if Tesla demands that drivers keep their hands on the wheel at all times that the autopilot is engaged then they should have a sensor for this and disengage the autopilot whenever the driver releases the wheel -- as a safety measure.

    The fact that they don't do this is a clear indication that they really do expect people to take their hands off the wheel and use Autopilot as if it were perfect. Stop passing the buck, Tesla!

    • by Blue23 ( 197186 )

      Surely if Tesla demands that drivers keep their hands on the wheel at all times that the autopilot is engaged then they should have a sensor for this and disengage the autopilot whenever the driver releases the wheel -- as a safety measure.

      Turning off autopilot when the driver releases the wheel AS A SAFETY MEASURE? If you were in a car in motion, which do you think is safer for the driver and others nearby: an automated co-pilot with a safety track record better than humans or an uncontrolled car in motion?

      The sensors alert the driver to put their hands back on the wheel instead of turning the car into a large, fast, uncontrolled missile.

  • If the autopilot is unsafe around barriers like that, it should refuse to operate around those barriers. If it can't be made to recognize those situations, it should not be used at all.

  • by quantaman ( 517394 ) on Wednesday April 11, 2018 @07:40PM (#56421311)

    Does anyone know if Tesla is using a bot to write their Press Releases as well?

    The following:
    The reason that other families are not on TV is because their loved ones are still alive.

    Does not sound like something a human PR Professional would write.

  • In other news - Tesla Autopilot can't handle 6 seconds of autonomous driving on a clear day on a clear highway without causing a fatal wreck.
  • by DanDD ( 1857066 ) on Wednesday April 11, 2018 @07:44PM (#56421335)

    I'm a pilot, been flying for 30 years, and I've flown with other pilots with varying skill and experience levels.

    The most experienced pilot I've flown with never took his left hand off the control yoke. I watched him for hours while I was in the co-pilot and jump seats. He'd visit, configure radios, adjust power, but if his left hand ever came off that yoke it went right back on it as soon as the immediate task was done.

    I'll drive my Tesla on Autopilot the same way that gray-haired old pilot flew with an autopilot, and with any luck I'll live to be just as old.

    • by CRC'99 ( 96526 )

      I've got a CPL - and in all my training - one thing that always stood out is that I never fully trusted the autopilot.

      There's a great video that was done in 1997 called "Children of the Magenta" that seems to ring true with everything I hear about Tesla issues like this.

      Youtube link:
      https://www.youtube.com/watch?... [youtube.com]

      Aircraft, cars, the lessons are the same.

  • To me it has to be either autonomous or not. If semi-autonomous driving requires you to be engaged and alert with both hands on the wheel, ready to take control at any time, then what's the point? How is it different from regular non-autonomous driving? Can anyone share their experience?
    • by Blue23 ( 197186 )

      To me it has to be either autonomous or not. If semi-autonomous driving requires you to be engaged and alert with both hands on the wheel, ready to take control at any time, then what's the point? How is it different from regular non-autonomous driving? Can anyone share their experience?

      Two ways.

      First, it has a better track record than your average human driver, so it can help avoid accidents that the human may not.

      Second, like an aircraft autopilot, it can handle routine matters, but there are still times, during an emergency or an unusual situation, when it needs someone who can handle what it can't.

      Think like this. Both human and Tesla's autopilot have a high overlap in what they can handle. There are some things the autopilot handles better just due to reaction time and 360 vision. Ther

    • by pezpunk ( 205653 )

      It's statistically safer than non-autonomous driving.

  • In other words:

    "The victim was using cruise control. Our Tesla is not a self-driving car. Stop calling it that. Reliable self-driving cars do not exist."

    Do. Not. Exist.

    • "The victim was using cruise control. Our Tesla is not a self-driving car. Stop calling it that. Reliable self-driving cars do not exist."

      Why would someone think an "autopilot", a word already used to describe a device that makes a plane self-flying, would be self-driving?

      I think Tesla should change the name until they are willing to stand behind it as self-driving.

  • by superdave80 ( 1226592 ) on Wednesday April 11, 2018 @10:00PM (#56421867)

    Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel.

    I already have to do that. What's the point of buying this autopilot-that-isn't-really-an-autopilot?
