Transportation

Tesla Autopilot, Distracted Driving To Blame In Deadly 2018 Crash (theverge.com) 171

Slashdot readers TomGreenhaw and gollum123 are sharing the findings from a National Transportation Safety Board (NTSB) investigation into a fatal Tesla Model X crash that occurred in 2018 near Mountain View, California. The agency says Tesla's Autopilot system and the driver's distraction by a mobile device were two of the probable causes of the crash. The Verge reports: The safety board arrived at those probable causes after a nearly two-year investigation into the crash. NTSB investigators also named a number of contributing factors, including that the crash attenuator in front of the barrier was damaged and had not been repaired by California's transportation department, Caltrans, in a timely manner. Had the crash attenuator been replaced, NTSB investigators said Tuesday that the driver, Walter Huang, likely would have survived. The NTSB shared its findings at the end of a three-hour-long hearing on Tuesday. During the hearing, board members took issue with Tesla's approach to mitigating the misuse of Autopilot, the National Highway Traffic Safety Administration's lax approach to regulating partial automation technology, and Apple -- Huang's employer -- for not having a distracted driving policy. (Huang was playing the mobile game on a company-issued iPhone.) "In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss," NTSB chairman Robert Sumwalt said at the end of the hearing on Tuesday. "We urge Tesla to continue to work on improving their Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken where necessary. It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars."
This discussion has been archived. No new comments can be posted.

  • by sexconker ( 1179573 ) on Tuesday February 25, 2020 @08:34PM (#59766850)


    • by Kokuyo ( 549451 )

      I mean, I've weaned myself off wanting a Tesla, but I must say, if someone put his mobile home on cruise control and went to take a nap, we'd all go "What a complete moron."

      I'm not sure why it should be different with autopilot. The term autopilot is only used in movies to describe something that can make a pilot obsolete. Real-world aviation doesn't work that way either.

      I mean why does nobody sue Tesla because the cars don't go plaid when they activate ludicrous speed?

    • Re: (Score:2, Informative)

      by Rei ( 128717 )

      Hi, I'm Troy McClure! You may remember me from such films as....

      Wait, wrong thing :)

      The Slashdot summary is of course incomplete. The things the NTSB blamed were:

      * Tesla, for not doing more to prevent distracted driving and for the software not preventing the accident
      * The driver, for becoming overconfident to the point that he was playing a video game when the crash happened and never once applied the brakes or steered
      * Apple and other smartphone makers, for not locking out games, texts, etc. when

      • by DRJlaw ( 946416 )

        Really, though, the oddest thing to me was that they never once asked or seemed to care about literally the only question that matters, which is: "Does this save more lives than it harms?"

        Because they already answered that question in the most embarrassing way possible -- no, it does not [arstechnica.com]:

        When QCS director Randy Whitfield ran the numbers for these vehicles, he found that the rate of crashes per mile increased by 59 percent after Tesla enabled the Autosteer technology....

        NHTSA fought QCS' FOIA request after T

        • by Rei ( 128717 )

          1) This was about NTSB, not NHTSA

          2) QCS is a guy (Randy Whitfield) working out of his house, nitpicking and attempting to throw out the vast majority of the data because trivial details were incorrectly reported, leaving behind only the subset of data that inverted the conclusion. It's like citing Watts Up With That in a climate change discussion. It's amazing that anyone ever gives him the time of day, but when it comes to Tesla...

          That said, Whitfield sometimes works with plaintiffs' attorneys, so he knows

      • by DRJlaw ( 946416 )

        It was a weird hearing to watch. It felt rather like a kangaroo court, in that they'd rail against particular entities but without the entities there to defend themselves

        Tesla did defend itself, so much so that it decided to front-run the investigation [washingtonpost.com]. Now it has 90 days to respond to the NTSB report [theverge.com]; if it squanders that the way it squandered the previous comment period on driver-assistance safety systems, that will remain its own damn fault.

        • by Rei ( 128717 )

          Tesla's release of information was in response to the fact that the NTSB had already been making public claims about the case, yet was banning Tesla from doing likewise.

          I absolutely would have done the exact same thing Tesla did.

          Now it has 90 days to respond to the NTSB report

          You clearly weren't watching the hearing, so let me help you out: during the hearing the NTSB railed against pretty much everyone under the sun for not responding to their letters, not just Tesla. Now they're again demanding responses

  • wait (Score:4, Insightful)

    by xorbe ( 249648 ) on Tuesday February 25, 2020 @08:36PM (#59766866)

    Isn't this the guy who kept complaining to Tesla about the trouble spot around 101S / the left-side exit to 85, and finally smacked into the barrier with no hands on the wheel? The bad part was that the water barriers were missing because of a recent previous accident, so his crash was into the concrete, and it killed him. Lesson: don't take your hands off the wheel, especially at known problematic points. I thought this was painted as 100% driver error already.

    • Re:wait (Score:5, Informative)

      by uncqual ( 836337 ) on Tuesday February 25, 2020 @09:58PM (#59767126)

      I don't believe there have been water (or sand) barriers at that location for many years, if ever.

      There has been a (somewhat) energy-absorbing barrier there since at least 2011 [goo.gl] (and as I recall it was not unusual to see it in a state of disrepair), but it had been upgraded by November 2017 [goo.gl] and looks much the same in May 2019 [goo.gl].

      You are correct that at the time of the crash the barrier had not yet been reset/repaired by Caltrans -- it had been hit eleven days earlier by a Prius going 70 MPH (the driver apparently walked away from that one) -- probably in part because the CHP failed to notify Caltrans.

    • Re:wait (Score:4, Insightful)

      by MachineShedFred ( 621896 ) on Wednesday February 26, 2020 @03:47AM (#59767804) Journal

      Yes. The short version is this:
      1. he already knew that autopilot had an issue going through that section of road, because he had complained to Tesla Service about it, and then complained to friends about Tesla Service.
      2. there was no "crash attenuator" on that concrete divider because someone else had already destroyed it in a crash, and CalTrans hadn't bothered to replace it
      3. he used autopilot there anyway, even though he had issues with it before
      4. while using autopilot in a place where it obviously didn't work properly as evidenced by this guy's own previous experience, he thought it prudent to break out his iPhone and fuck around instead of paying attention. I wouldn't be surprised if the EMTs found the phone still running some god damn video game at the scene.

      Being as you have to agree to a big fat notice before being able to turn on autopilot that says you will stay attentive and be ready to take control of the vehicle, I'm finding it hard to see how this isn't 100% driver error. He drives any other car into that highway divider and there wouldn't be any story about it at all - for example, the crash that took out the "crash attenuator" before him - what make and model was that car? Nobody knows, because it wasn't a Tesla.

      • by DRJlaw ( 946416 )

        Examples of cognitive dissonance:

        1. he already knew that autopilot had an issue going through that section of road, because he had complained to Tesla Service about it, and then complained to friends about Tesla Service.

        and

        I'm finding it hard to see how this isn't 100% driver error.

        or

        for example, the crash that took out the "crash attenuator" before him - what make and model was that car? Nobody knows, because it wasn't a Tesla.

        and the reply posted 5 hours before to the same parent comment

        You are correct t

      • Being as you have to agree to a big fat notice before being able to turn on autopilot that says you will stay attentive and be ready to take control of the vehicle, I'm finding it hard to see how this isn't 100% driver error.

        The accident was 100% avoidable by the driver, but that doesn't mean the accident was 100% driver error. Similarly, just because an improved Tesla system could have prevented the accident doesn't mean it was 100% Tesla error. As accident investigation teams often correctly conclude, the accident was a combination of multiple factors, and addressing any one of them individually would have mitigated or completely avoided it. Yet that ability of each individual factor to avoid the accident never means that

  • Truth hurts (Score:5, Insightful)

    by quonset ( 4839537 ) on Tuesday February 25, 2020 @08:37PM (#59766870)

    Not only was the guy an Apple software developer, not only did he previously complain to his family that his Tesla acted erratically on that stretch of road, not only did he notify Tesla of this erratic behavior, he then went and drove that same stretch of road while playing a video game on his phone, using "auto pilot" rather than driving it himself.

    Tesla can certainly bear some of the blame for this crash, but the final responsibility falls on the driver who couldn't be bothered to drive his vehicle but instead relied on software he suspected, or had reason to suspect, was not up to the task of driving for him. From another story [marketwatch.com]:

    Just before the crash, the Tesla steered to the left into a paved area between the freeway travel lanes and an exit ramp, the NTSB said. It crashed into the end of the concrete barrier. The car’s forward collision warning system didn’t alert Huang, and its automatic emergency braking did not activate, the NTSB said.

    Huang did not brake, and there was no steering movement detected to avoid the crash, the board’s staff said.

    • He totally meant to be paying attention and yank the steering wheel at the last second, but he was totally beating his high score on Temple Run.

    • Clearly this was all Apple's fault for not telling their employees to not play games while driving. /s

      This whole incident is stupid. Distracted driving caused this crash.

      The driver knew (and apparently regularly complained) about the limitations of autopilot doing exactly what it did. So as far as he should have been concerned it behaved as it was designed to do.

      A corporate policy... Do we need a corporate policy on not murdering people, too? "The FBI said that the Green River Killer's employer had no policy

    • by AmiMoJo ( 196126 )

      As the report notes, part of the blame lies with the system for not checking whether the driver is paying attention - torque on the wheel is a meaningless proxy, and it didn't even detect any for the 7 seconds before the accident.

      The report recommends that Tesla stop allowing that to happen. Maybe reduce the maximum no-torque detection time to 1 second to start with, but really they need to just abandon that idea and switch to using attention-monitoring cameras instead. Cadillac, Nissan and Lexus all use cameras. As a bonus you can go h

      • There's been technology to track eyes for at least 20 years; the car should track the driver's eyes and nag you whenever they drop below the instrument cluster.
    • Autopilot gives you multiple warnings about keeping your hands on the fucking wheel. This is a Darwin Award for sure.

  • She was pointing at everybody in turn, saying "You get some blame! You get some blame! And you get some blame! Everybody gets some blame!"

  • by Lonng_Time_Lurker ( 6285236 ) on Tuesday February 25, 2020 @08:39PM (#59766884)

    Companies need to have policies that their employees must follow the law? Or what, otherwise they're implicitly condoning the behaviour?

    • by uncqual ( 836337 )

      ...and he was commuting in his personal car, not even driving on company business or in a company car.

      I wonder if the NTSB thinks that Apple is at fault in all distracted-driving accidents where the driver is using an iPhone because the idiot didn't follow the law about not using the phone hands-on while driving. If he had killed someone else, the driver would have been responsible for those damages (of course, Apple, Tesla, the CHP, and Caltrans would be sued also -- sue everyone in sight is wha

    • The NTSB also blames the phone manufacturer (I guess that's also Apple) for not somehow detecting that the person using the phone should have been driving:

      Electronic device manufacturers have the capability to lock out highly distracting functions of portable electronic devices when being used by an operator while driving, and such a feature should be installed as a default setting on all devices.

      Quite how phones are supposed to distinguish a driver from a passenger, I have no idea.

  • Huang was playing the mobile game on a company-issued iPhone.

    I'd call "Darwin Award", but those are just the participation trophies now. The Apple employment, though, adds a nice Jungian touch.

  • Level 5 (Score:4, Informative)

    by xeoron ( 639412 ) on Tuesday February 25, 2020 @08:47PM (#59766900) Homepage
    Waymo has said in interviews that studies have shown them people need full self-driving (level 5), not driver assist, or people will just not pay attention when they need to.
    • I agree that's what was said, but that's really silly. So, if you cannot have 100%, have 0%? Not a very wise philosophy. I agree with the problem of distracted driving, but the solution of just jumping to level 5 and skipping level 4 is unrealistic. The same could be said of cancer: we should not treat cancer because it requires a lot of chemotherapy with harsh side effects; we should just cure it instead. No. Level 5 driving is simply level 4 in more places. You HAVE to have level 4 if you are to get l
      • by presidenteloco ( 659168 ) on Tuesday February 25, 2020 @09:07PM (#59766968)
        The problem is:
        1) If it seems to be self-driving fine, it will lull you into not really or fully paying attention to the situation (the road, the context).

        2) A Google study found that it took multiple seconds (I can't remember how many exactly) for a driver who wasn't really paying attention to figure out what's up and execute an appropriate response, once warned that they should take over.
        Several seconds amounts to a football-field distance or so at highway speeds.
    • waymo has an agenda...

      anyway, autopilot works fine IF you know its limits and you monitor it properly.

      it's a benefit. you don't have to micro-poll 'are you centered?' and 'are you at the right following distance?'. it takes care of that and you just have to poll at 1/10 the rate or less - to watchdog the system, as it were.

      it's a net gain, really. those who have never experienced it need to get first-hand experience, or their comments are worthless on this subject.

      it's NOT an all-or-nothing thing. each bit that

      • Let's say you had two types of Teslas, one with the current autopilot and one which only assisted in steering when you got close to drifting across lanes or into another car and sounded an annoying alarm when it did (which you couldn't turn off). All else being equal, which do you think would cause fewer accidents and drive into fewer firetrucks?

        Steering assist is fine; steering automation is fucking insane ...

        Also let's say a Tesla on autopilot has a cop in front of the next stationary car it plows into, do you thin

        • by dgatwood ( 11270 )

          Let's say you had two types of Teslas, one with the current autopilot and one which only assisted in steering when you got close to drifting across lanes or into another car and sounded an annoying alarm when it did (which you couldn't turn off). All else being equal, which do you think would cause fewer accidents and drive into fewer firetrucks?

          The one where the computer does more will almost certainly have fewer accidents, statistically. When someone doesn't pay attention to the road or falls asleep at the wheel

    • by Jeremi ( 14640 )

      Waymo has said in interviews that studies have shown them people need full self-driving (level 5), not driver assist, or people will just not pay attention when they need to.

      The funny thing is, that's equally true of full manual driving (level 0) -- occasionally people stop paying attention when they need to be, and an accident results. Doesn't seem to stop anyone from driving, though.

    • by AvitarX ( 172628 )
      I doubt that's entirely true.

      Level 4 includes "fine for everything but severe weather" and "fine for everything, but geofenced."

      Both of those would be perfectly safe, I think.
    • I'm utterly surprised that a company still developing a product is spreading FUD in order to throw shade at a competitor that has something you can buy and use right now. That's basically like saying that cruise control shouldn't exist because people will just not pay attention to what speed they are going.

      Can we all just agree that the driver still has the responsibility to control the vehicle, regardless of what "level" of driver assistance tech there is? And it's very ironic that a Google property woul

  • Autonomous cars are close to worthless unless they are fully autonomous. The car makers want to keep pushing close-to-autonomous cars, but the human mind doesn't work that way. People already have enough trouble staying awake in a regular car. Expecting someone to stay focused on a cross-country trip, ready to take over from the computer at any moment as if they were actually driving, is stupid. Even airline pilots probably don't do that. Either a car is fully able to drive itself or it's not. I guess you co
    • Agreed (Score:5, Insightful)

      by presidenteloco ( 659168 ) on Tuesday February 25, 2020 @09:11PM (#59766990)
      A fundamental difference between a car at highway speeds and an airplane on autopilot is that when the airplane's autopilot suddenly hits some limit and disconnects with an audible warning, in the vast majority of situations it will be fine for the pilot to take several seconds to start understanding the situation and the cause of the disconnection. The sky is pretty empty.

      In a car on a busy road, those several seconds are often fatal, or at least all but guarantee some kind of accident.
      • The name autopilot is also used in boating, where you have to be alert most of the time. The problem is not the name. The problem is the drivers with rectal-cranial inversion. I don't mind if they kill themselves but they can kill others, and that's a problem.

    • Re: (Score:3, Insightful)

      Tesla shares blame as long as they continue to market the feature as "autopilot", full stop. There are no arguments to the contrary which make any sense.
    • ...spoken as someone who has likely NEVER tried the assist techs.

      do yourself a favor; stop writing about things you don't know and EDUCATE yourself.

      take a test drive, at least.

    • by AvitarX ( 172628 )
      I find cruise control to be a decent benefit for long drives.

      More comfort means better ability to focus means safer driving.

      I don't have anything that turns my wheel, and I actually like to manage my cruise speed rather than use the adaptive control, but I imagine the collision avoidance is safer than not having it (I sure hope I never find out, though).
  • Tesla is at fault because the guy was misusing the product? How dangerous would it have been for the driver if he had been playing his video game *without* any self-driving tech? I'd venture to guess it would have been much more dangerous, all other driving conditions being equal.
  • Not Tesla's fault. (Score:4, Insightful)

    by thedarb ( 181754 ) on Tuesday February 25, 2020 @09:22PM (#59767024)

    Stop blaming the manufacturer for the driver's own stupidity.

    • Make a product that fails safe.
    • by AmiMoJo ( 196126 )

      The report notes that while he does take some of the blame, there are other factors:

      - Autopilot lulls drivers into a false sense of security. The repeated warnings probably have the opposite to the intended effect, making the driver more eager to ignore and dismiss them. Much like security warnings on computers that users blindly click through.

      - Smartphone games are addictive and encourage users to pay attention to them at inappropriate times. Apple has implemented a feature that disables some phone functionality when driving but it is disabled by default.

      • - Autopilot lulls drivers into a false sense of security. The repeated warnings probably have the opposite to the intended effect, making the driver more eager to ignore and dismiss them. Much like security warnings on computers that users blindly click through.

        This is fundamentally a problem with humans; I would say it is more pertinent to develop independently driving cars (yearly there are more than 1 million traffic-related deaths worldwide).

        - Smartphone games are addictive and encourage users to pay attention to them at inappropriate times. Apple has implemented a feature that disables some phone functionality when driving but it is disabled by default.

        Tesla is not the correct entity to address addiction.

  • by darth_borehd ( 644166 ) on Tuesday February 25, 2020 @09:24PM (#59767030)

    The whole point of autopilot *IS* to let the driver be distracted. It is to let the driver be on his phone. If it crashed while on autopilot, that's a fault in the autopilot system. Blaming the driver for being distracted misses the whole point of automated vehicles in the first place.

    • by AmiMoJo ( 196126 )

      Technically it's a driver-assistance feature, like cruise control. You don't have to operate the steering wheel manually. Like cruise control, you are supposed to monitor it constantly.

      We seem to have found the point at which drivers stop paying attention. Cruise control still requires attention to keep the car in lane, but take the need to steer away and people start playing with their phones, wedging fruit into the wheel to defeat the hands-on detection, taking naps, etc.

  • And the driver not using it to stop the car before a crash.

  • To drive.

    Mobile device distraction is a plague on society. In Australia, if you so much as touch or hold a device with a screen (phone, tablet, etc.), even while stationary, it's a $450 fine and 3 demerit points. Double that in NSW over the holidays.

  • I test drove a red Model 3 last fall. The Tesla salesman rode along. As I was driving, I noticed that other vehicles at about 135 degrees (that's my right-rear quadrant) were not appearing in the diagram on the display panel, and I mentioned that to the salesman. He looked to his back right, then at the dash-mounted display, and opened his mouth to speak. I could tell from his expression and body language that he was winding up to explain to me why I was wrong. What he had seen took a moment to register

    • by ledow ( 319597 )

      From a legal standpoint, it only ever comes down to one thing.

      You're the driver, you're responsible. If you rely on a reversing camera, reversing sounder, parking sensors, lane-deviation monitor, blind-spot indicator, cruise control, automatic emergency braking, signpost-recognition, speed limiter, hell even a mirror...

      YOU are still responsible. Even if the thing fails, doesn't do its job, misses everything, doesn't show you something that's present, or even if it does show you.

      The driver is responsible.

  • and Apple -- Huang's employer -- for not having a distracted driving policy.

    WHAT? Am I alone in considering this to be the most ridiculous "contributing factor" to a crash EVER? That somehow his employer should have told him he should pay attention when driving? Crueller people than me might ask how stupid Americans must be if they have to be told by ANYONE that they shouldn't become distracted while driving... let alone the fact that out of all the possible people who could tell you that - e.g. your parents, siblings, kids, friends, teachers, government, newspapers, entertainers, new

  • As evidenced by the number of cars WITHOUT autopilot that have crashed at this site.

    Even in aircraft, there are "levels" of autopilots, from "keep it flying straight" to "follow a pre-programmed flight plan and land at a suitably-equipped airport and brake to a stop." But, even those auto-land-capable autopilots can't pick up ATC instructions and clearances. Yet.

    People are confused about what "autopilot" means because no one seems to want to correct them, until a crash occurs.
