Transportation / Software / Technology

There's Growing Evidence Tesla's Autopilot Handles Lane Dividers Poorly (arstechnica.com) 238

An anonymous reader writes: Within the past week, two Tesla crashes have been reported while Autopilot was engaged, and both involved a Tesla vehicle slamming into a highway divider. One of the crashes resulted in the death of Walter Huang, a Tesla customer with a Model X. The other crash resulted in minor injuries to the driver, thanks largely to a working highway safety barrier in front of the concrete divider. Ars Technica reports on the growing evidence that Tesla's Autopilot handles lane dividers poorly: "The September crash isn't the only evidence that has emerged that Tesla's Autopilot feature doesn't deal well with highway lane dividers. At least two people have uploaded videos to YouTube showing their Tesla vehicles steering toward concrete barriers. One driver grabbed the wheel to prevent a collision, while the other slammed on the brakes. Tesla argues that this issue doesn't necessarily mean that Autopilot is unsafe. 'Autopilot is intended for use only with a fully attentive driver,' a Tesla spokesperson told KGO-TV. Tesla argues that Autopilot can't prevent all accidents but that it makes accidents less likely. There's some data to back this up. A 2017 study by the National Highway Traffic Safety Administration (NHTSA) found that the rate of accidents dropped by 40 percent after the introduction of Autopilot. And Tesla argues that Autopilot-equipped Tesla cars have gone 320 million miles per fatality, much better than the 86 million miles for the average car. These figures don't necessarily settle the debate. That NHTSA figure doesn't break down the severity of crashes -- it's possible that Autopilot prevents relatively minor crashes but is less effective at preventing the most serious crashes. And as some Ars commenters have pointed out, luxury cars generally have fewer fatalities than the average vehicle. So it's possible that Tesla cars' low crash rates have more to do with the company's wealthy customer base than with its Autopilot technology. What we can say, at a minimum, is that there's little evidence that Autopilot makes Tesla drivers less safe. And we can expect Tesla to steadily improve the car's capabilities over time."
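For readers who want to sanity-check the quoted figures, here is a quick back-of-the-envelope conversion of "miles per fatality" into fatalities per 100 million miles, the denominator NHTSA normally uses for rate reporting. This is only an illustrative sketch built from the two numbers quoted above; it does not address the severity or luxury-fleet caveats the article raises.

```python
# Back-of-the-envelope conversion of the figures quoted in the summary.
tesla_miles_per_fatality = 320e6    # Tesla's claim for Autopilot-equipped cars
average_miles_per_fatality = 86e6   # quoted figure for the average car

# Express both as fatalities per 100 million vehicle miles traveled.
tesla_rate = 100e6 / tesla_miles_per_fatality       # ~0.31
average_rate = 100e6 / average_miles_per_fatality   # ~1.16

print(f"Tesla fleet:  {tesla_rate:.2f} fatalities per 100M miles")
print(f"Average car:  {average_rate:.2f} fatalities per 100M miles")
print(f"Ratio:        {average_rate / tesla_rate:.1f}x")  # ~3.7x in Tesla's favor
```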
Comments Filter:
  • wrong statistic (Score:5, Interesting)

    by phantomfive ( 622387 ) on Thursday April 05, 2018 @07:28PM (#56389907) Journal
    You don't want to know how many accidents there were in cars with autopilot, that doesn't matter. What you want to know is miles per accident *with autopilot engaged.* Using the other number is highly misleading.
    • I agree. Given that the overall incident rate is much lower, and assuming that the accident rate without Autopilot engaged is about the same as before, the incident rate with Autopilot engaged has to be much, much lower to bring the overall average down that much.
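To make the parent's arithmetic concrete, here is a small sketch of the blended-rate calculation. The Autopilot engagement share and the baseline rate are invented placeholder numbers, not Tesla data; only the 40 percent figure comes from the summary above.

```python
# Illustrative blended-rate arithmetic (engagement share and baseline are made up).
# Assume the crash rate on manually driven miles stays at 1.0 (arbitrary units)
# and the fleet-wide rate after Autopilot's introduction is 40% lower, i.e. 0.6.
manual_rate = 1.0    # assumed unchanged, per the parent comment's assumption
blended_rate = 0.6   # 40% lower overall, per the NHTSA figure quoted above
ap_share = 0.5       # hypothetical fraction of miles driven with Autopilot engaged

# blended_rate = ap_share * ap_rate + (1 - ap_share) * manual_rate
ap_rate = (blended_rate - (1 - ap_share) * manual_rate) / ap_share
print(f"Implied crash rate with Autopilot engaged: {ap_rate:.2f}")  # 0.20, i.e. 80% lower

# With a smaller engagement share (say 30%) the implied rate goes negative,
# meaning Autopilot engagement alone could not account for a 40% fleet-wide drop.
```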
      • Re:wrong statistic (Score:5, Insightful)

        by viperidaenz ( 2515578 ) on Thursday April 05, 2018 @08:24PM (#56390143)

        Given that all cars in the 320 million miles/fatality figure are modern, 5-star safety rated cars and the cars in the 86 million miles/fatality figure are average cars, you can't make the assumption that the correlation between incidents and fatalities is the same for both groups.

        If you want to compare Autopilot cars with non-autopilot cars, the cars you compare them to should also have the same standard safety features:
        automatic emergency braking
        a dozen or so air bags
        stability control
        ABS
        5 star impact rating
        front and side collision warnings

        All features that are available on other new vehicles.
        Then it's a fair comparison.

        • Re:wrong statistic (Score:4, Interesting)

          by dmpot ( 1708950 ) on Thursday April 05, 2018 @09:06PM (#56390329)

          Also, you should compare cars under similar driving conditions. Currently, autopilots refuse to function in difficult road conditions, while human drivers do.

            • Also, you should compare cars under similar driving conditions.

            So limit the comparison to just drivers in the San Francisco and Phoenix metropolitan areas?

        • by Jeremi ( 14640 )

          If you want a really good control group, how about comparing Teslas-purchased-with-the-autopilot-option against Teslas-purchased-without-the-autopilot-option?

          That's about as apples-to-apples as you're going to get, assuming there are enough of each on the road to make the comparison statistically significant (I think there probably are).
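If that Tesla-vs-Tesla comparison were ever run, the statistics would be straightforward. Below is a minimal sketch of one standard approach, a conditional exact test comparing two crash rates with different mileage exposures; all of the counts and mileages are invented placeholders, since no such per-option data has been published.

```python
# Sketch of comparing crash rates for Teslas bought with vs. without the
# Autopilot option. Every number below is an invented placeholder.
from scipy.stats import binomtest

with_ap_miles, with_ap_crashes = 5.0e9, 40        # hypothetical exposure and events
without_ap_miles, without_ap_crashes = 2.0e9, 25  # hypothetical exposure and events

# Conditional exact test: if the underlying rates are equal, the crashes in the
# "with option" group are binomial with p equal to that group's share of miles.
total_crashes = with_ap_crashes + without_ap_crashes
p_expected = with_ap_miles / (with_ap_miles + without_ap_miles)
result = binomtest(with_ap_crashes, total_crashes, p_expected)

print(f"with option:    {with_ap_crashes / with_ap_miles * 1e8:.2f} crashes per 100M miles")
print(f"without option: {without_ap_crashes / without_ap_miles * 1e8:.2f} crashes per 100M miles")
print(f"two-sided p-value: {result.pvalue:.3f}")
```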

      • Re:wrong statistic (Score:4, Insightful)

        by AmiMoJo ( 196126 ) on Friday April 06, 2018 @04:25AM (#56391383) Homepage Journal

        It's not a like-for-like comparison though.

        Tesla cars are expensive. You have to be well off to own one, which means you are much less likely to be taking risks like driving drunk or on drugs. Having spent all that money on a car, you are probably going to look after it and not take the same risks you would in a $1000 banger. You are also likely travelling on very different roads, better maintained and at less congested times of the day. Your car is likely to be well maintained.

        So comparing to the average, especially in the US where regulations are relatively lax, is misleading. A fair comparison would be with accident rates among luxury cars in a similar price bracket. Audi, Lexus, Mercedes.

        • by Uberbah ( 647458 )

          You have to be well off to own one, which means you are much less likely to be taking risks like driving drunk or on drugs.

          Lawyers do their best to make up for the non-drug-abusing professionals with money, though.

        • You've made a whole load of assumptions there. And you've missed out other factors that may not favour the wealthier driver, like what speed they are doing, and how likely they are to be using a cellphone whilst driving.

    • The 40 percent statistic is so abused it's not even funny. The study was never meant to show how much safer AP is; it was to try to determine if it was flawed. Two-thirds of the cars in the study never even had any pre-AP miles, and it doesn't account for other functions already in place such as Auto Steer.

      Compared to high-end sedans driving on highways in good conditions (where AP is supposed to be limited to), not rain, snow, fog, etc., the Tesla safety record is not better according to any data that is available.

      htt [bestride.com]
      • What we can say, at a minimum, is that there's little evidence that Autopilot makes Tesla drivers less safe.

        No, data as to relative safety doesn't appear to be available.

        It does 'seem' like when there is a Tesla highway death, the investigation shows AP was being used. So there might be a statistic that shows more Tesla highway deaths happen with AP on, but of course most drivers probably use it most of the time they are on highways, so the number of non-AP highway deaths might not be statistically significant yet.

        The necessary data is not easy to obtain unfortunately. So everyone is assuming.

    • You don't want to know how many accidents there were in cars with autopilot, that doesn't matter. What you want to know is miles per accident *with autopilot engaged.* Using the other number is highly misleading.

      It's one of several statistics you're interested in.

      Over time, frequent users of the autopilot may become worse drivers as their skills grow rusty (or they may get better as they become less complacent during their limited driving time).

      And even then, what kind of conditions does the auto-pilot work in vs regular driving, how does it affect other drivers on the road, how do pedestrians and other drivers adjust their behaviour when the cars start to become ubiquitous, etc, etc.

      Without overwhelming evidence o

    • You don't want to know how many accidents there were in cars with autopilot, that doesn't matter. What you want to know is miles per accident *with autopilot engaged.* Using the other number is highly misleading.

      Exactly. Furthermore, it's very likely that Tesla knows the number of miles and accidents with Autopilot on and off. So, they could trot out the comparative mean miles between accidents/fatalities if it were in their favor. That they don't implies that the comparative numbers are either similar or worse with Autopilot.

    • You don't want to know how many accidents there were in cars with autopilot, that doesn't matter. What you want to know is miles per accident *with autopilot engaged.* Using the other number is highly misleading.

      If it's going to veer into a lane divider, that's a problem. So what you want to know in this case is how many times Teslas with Autopilot engaged have successfully navigated past lane dividers.

    • Autopilot is different from a self-driving car. It is too bad that this hasn't been properly explained to the public.
      Autopilot is in essence a step up from cruise control: it is in general good at keeping you in your lane, at the speed limit, and not ramming into other cars. I would actually like this feature on my car when I am taking a long trip, my eyes are getting strained, and I am miles away from a place where I can safely pull over and rest. It can give me a few seconds to relax my body, refocus m

  • by zippo01 ( 688802 ) on Thursday April 05, 2018 @07:29PM (#56389911)
    What is the point to an autopilot if I have to be fully attentive and ready to take over? I would rather just drive than worry about missing something. This still has too much uncertainty for me to waist my time with it. Also dammit, if I'm going to die in a car, I want it to be my fault and freaking awesome. "He almost made it, if it hadn't been for that ...."
    • What is the point to an autopilot if I have to be fully attentive and ready to take over?

      Mainly, that is just unfortunate lawyer CYA language so they have an easy cop out for situations like this.

      "Oh, our ridiculously named system soiled the bed? That's YOUR fault."

      • Do you have low karma or are Tesla fanbois just modding down all of the non-believers?
        • Probably low karma. My karma is 'good', so I get +1 to start, but that's probably just because I don't post here very often.

    • by Anonymous Coward on Thursday April 05, 2018 @07:43PM (#56389965)

      If you have to pay attention then autopilot is less than useless. The fact it is in control makes it much more likely you will not be paying attention.

    • by ceoyoyo ( 59147 )

      I imagine it's kind of like having a spell checker. The spell checker isn't perfect (e.g. accepting "waist" instead of "waste"), so you have to be paying some attention when you write, but it will catch and prevent lots of your errors. The two of you working together can generally do a better job, more easily, than a person would alone.

    • by jaa101 ( 627731 )

      What is the point to an autopilot if I have to be fully attentive and ready to take over?

      So cruise control is also pointless?

      Also dammit if i'm going to die in a car, I want it to be my fault and freaking awesome.

      What's so special about cars? Do you ever travel by air? Do you have a pilot's licence? Air accidents tend to be much more awesome than car crashes.

    • What is the point to an autopilot if I have to be fully attentive and ready to take over?

      Kansas. No, really; ever had to drive across that shit??

    • by dgatwood ( 11270 )

      What is the point to an autopilot if I have to be fully attentive and ready to take over?

      Simple: If you fall asleep behind the wheel of a Tesla with Autosteer engaged, odds are good that you will survive the experience. If you fall asleep behind the wheel of a car without Autosteer, you probably won't.

      • >> If you fall asleep behind the wheel of a Tesla with Autosteer engaged, odds are good that you will survive the experience.

        This is why I want it. Falling asleep at the wheel is a problem for me. Between the two of us we ought to make one decent driver.

    • What is the point to an autopilot if I have to be fully attentive and ready to take over?

      The same point as an autopilot in a plane. Just because you didn't know what the term meant doesn't mean it has to perform differently from anything else called an autopilot in the world.

    • Fatigue. If you're driving a long distance, you will arrive less fatigued if your hands haven't spent hours doing micro-management of steering wheel position. Just as existing cruise control and adaptive cruise control gave that benefit to your right foot. And it's not just muscular fatigue, it's the mental fatigue of micromanaging.

      Autopilot lets you take one step back. You are acting as the manager of the drive, not the worker doing the driving.

    • What is the point to an autopilot if I have to be fully attentive and ready to take over?

      Because it's a little helper pilot, not an autopilot.

  • by RightwingNutjob ( 1302813 ) on Thursday April 05, 2018 @07:29PM (#56389913)
    The intentionally misnamed "autopilot" may reduce the likelihood of wandering out of your well-marked lane in clear conditions at highway speed, but every once in a while it'll drive you right into an obstacle. Reminds me of those "push this button and ten people with terminal cancer get cured but two other random people die from a meteor strike" questions taught in philosophy classes with the intent of humbling people who might otherwise believe they can quantify their way through every obstacle.
    • Re: (Score:2, Interesting)

      by Obfuscant ( 592200 )

      The intentionally misnamed "autopilot"

      I don't know why you think it is misnamed. It is named exactly the same way that aircraft autopilots are. Aircraft autopilots also require an attentive pilot ready to take over, because aircraft autopilots will happily fly the airplane into obstructions, or can fail in a large number of other ways. In fact, "can disable autopilot" is a standard pilot checklist item, and it can be done in half a dozen different ways.

      Seems like the Tesla "autopilot" is named just right.

      • by DatbeDank ( 4580343 ) on Thursday April 05, 2018 @08:17PM (#56390105)

        The intentionally misnamed "autopilot"

        I don't know why you think it is misnamed. It is named exactly the same way that aircraft autopilots are. Aircraft autopilots also require an attentive pilot ready to take over, because aircraft autopilots will happily fly the airplane into obstructions, or can fail in a large number of other ways. In fact, "can disable autopilot" is a standard pilot checklist item, and it can be done in half a dozen different ways.

        Seems like the Tesla "autopilot" is named just right.

        I take it you've never flown as a pilot before. No really, it's ok because most people aren't pilots :P

        My roadway is as big as the horizon. My fellow pilots in other planes are several hundred meters if not kilometers away.
        In my car, my fellow drivers are 1.5-2 meters away and my roadway is as big as the city planners decide to make it.

        On larger jets, they have systems that monitor you with transponders and much more. If you're aiming at the ground, the system will shout at you in a Skybus or Boeing jet.

        A few seconds of inattentiveness with autopilot on in a plane won't hurt anyone. Heck I read a book sometimes.
        A few seconds of inattentiveness in a passenger car (with or without Tesla Autopilot) will at best cause a crash or, at worst, kill you.

        Call it cruise control assist and save a few lives, or call it MuskSense if you want something sexy that achieves the same thing. Autopilot is just a terrible misnomer for what it really is.

        • Autopilot is just a terrible misnomer for what it really is.

          And yet it functions in the same way. Just because the reaction situation is slightly different doesn't change the function. Also, a few seconds of inattentiveness in a passenger car with Autopilot will not kill you. Only when Autopilot is not working does that come into play.

          *I was driving next to someone in traffic in Amsterdam who was asleep at the wheel and his Model S was coping just fine. I just hope he woke up before he missed his highway exit.

      • Seems like the Tesla "autopilot" is named just right.

        Fits your name.

    • The intentionally misnamed "autopilot"

      The only people who think autopilot is misnamed are those who have no idea how the autopilot of a plane works.

  • It should be noted that the vehicle that, just a week earlier, destroyed the safety barrier that would likely have saved the Tesla Model X driver was not on autopilot. I've had close calls around entrances to HOV lanes myself. They seem to be designed for throughput over safety: often the lane that continues actually has to turn a bit to avoid the divider, instead of requiring the HOV folks to move into a long entry lane that separates from the regular lane completely, well before the divider.
    • Re:To be fair... (Score:5, Informative)

      by Rei ( 128717 ) on Thursday April 05, 2018 @08:18PM (#56390119) Homepage

      And the driver who destroyed said barrier was not on autopilot. Normal human error.

      Check out the intersection on Google Maps and you can see what went wrong, both for the human, and for autopilot. The left line is quite distinct. The right line is rather worn. There is no visible crosshatching at all between them. Once a vehicle crosses the fading line, what looks like a "lane" forms around them, seemingly reinforcing that this is an acceptable place to drive. This happens only seconds before the barrier is hit, so there's not that much time to react to the situation. There are no overhead signs, just the road-level sign. In dense traffic, it's not visible until you're in the invalid "lane".

      Any driver, paying attention, will of course not do this. But human drivers' attentions lapse, and that's a mistake that humans can - and recently did - make.

      Concerning Autopilot, there's a big question as to what versions people are running. Walter Huang, at the very least, was almost certainly running the old AP. It's not clear what versions the YouTubers were running. There was a massive AP update that just started rolling out recently that makes a huge difference in quality. To the degree that I'm actually rather concerned about it. The more imperfect the system, the more attention you pay to it. I have worries that with the new system, it's gotten good enough that it's going to cause peoples' attention to lapse. Having to touch (with torque) the wheel at regular intervals helps, but I hope Tesla gets eye tracking in place soon.

      • Holy crap, why do people think lane markings will EVER be 100% accurate? That's a total fucking dream; and self-driving cars had better damn well work with every possibility.
      • If the Tesla Autopilot requires clearly marked lanes, it is somewhat useless. At least in my country, there are usually some streets where lanes are not marked at all, or the marking is worn away. And the street has 2 (sometimes 3) lanes in one direction.

  • Inherently unsafe (Score:2, Insightful)

    by Anonymous Coward

    It seems even worse than regular driving if you have to continuously be on the lookout to prevent the steering wheel from suddenly sending you into a wall.

    Maybe Tesla should focus on automatic braking, parallel parking, and things like that until using their Autopilot is no longer the same as playing Russian roulette. At some point, these accidents will damage their reputation badly...

    • You know, it's not randomly turning. It's not able to handle a highway exit/split. Which is perfectly expected for, I quote the manual, "auto lane keep".

      It works perfectly fine when driving down a marked highway. And from anecdotal experience works better than I could at night and in rain.

      And just like adaptive cruise control, it's a convenience feature meant to be used in the right conditions.

  • by greenwow ( 3635575 ) on Thursday April 05, 2018 @07:59PM (#56390037)

    What about motorcycles? I know BMW's Traffic Jam Assistant doesn't do well with them, since I had a BMW 750 rear-end me at about 10 MPH. The guy that hit me says it usually does a great job of going the correct speed in 0-30 MPH traffic here on I-5 in Seattle. I think it didn't see me, but instead saw the dump truck in front of me and then tried to drive through me.

  • Recalculate the miles per fatality after you remove all the other cars without the other safety systems the Tesla has.
    It would be interesting to see the miles per fatality when the cars counted all have stability control, crumple zones, a dozen or so airbags, side impact beams, etc.

    Unless you're comparing 5 star crash rating cars with other 5 star crash rating cars, it's not even a remotely fair comparison of Autopilot.

  • "Tesla cars have gone 320 million miles per fatality, much better than the 86 million miles for the average car. These figures don't necessarily settle the debate. That NHTSA figure doesn't break down the severity of crashes -- it's possible that Autopilot prevents relatively minor crashes but is less effective at preventing the most serious crashes."

    I find the submitter's skewed view of "minor crashes" a bit odd.

    The comparison is for fatalities per mile. I'd have a hard time expecting that there would be
    • There are a LOT of reasons the comparison is complete BS. IIHS uses "driver deaths" per registered vehicle year as the baseline because they know a passenger dying doesn't make a car more dangerous. Using the IIHS method, I can't see how Tesla AP would possibly come out near the top.

      http://www.iihs.org/iihs/sr/st... [iihs.org]
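As a rough illustration of why the choice of denominator matters, here is a sketch converting an IIHS-style "driver deaths per million registered vehicle years" figure to a per-mile basis. Both the annual mileage and the death rate below are assumptions picked for illustration, not published values.

```python
# Sketch: converting a per-vehicle-year death rate to a per-mile basis.
miles_per_vehicle_year = 12_000          # assumed average annual mileage
deaths_per_million_vehicle_years = 30    # hypothetical IIHS-style driver death rate

deaths_per_mile = deaths_per_million_vehicle_years / (1e6 * miles_per_vehicle_year)
miles_per_death = 1 / deaths_per_mile
print(f"{miles_per_death / 1e6:.0f} million miles per driver death")  # 400

# A fleet that is driven more miles per year looks worse per vehicle year even
# if its per-mile rate is identical, so the two baselines can rank cars differently.
```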
    • The figures being commonly quoted are heavily unreliable as well, as the oldest Tesla sold dates from 2008 (and there were only 2,500 Roadsters delivered, so that changes the figures as well), so when we are talking about Tesla mileage we are also talking about very modern cars, and also prestige cars which people take care of.

      The "average" car on the road must also take into account all the 20, 30, 40 year old beaters on the roads, and the declining road-worthiness of those older cars - not only do they ou

  • by WoodstockJeff ( 568111 ) on Thursday April 05, 2018 @08:33PM (#56390191) Homepage

    Statistics are only as valid as the data they're based on, and the assumptions made about the data that isn't there.

    Most transportation statistics are missing a LOT of base data. Things like "miles driven per year" are guesses.

    Except in the case of cars like the Tesla, where there is a black box collecting statistics. How does Tesla know that auto pilot was on or off? It's recorded. How many miles are driven with AP on or off is recorded.

    Many of the details are tossed out after an interval, but Tesla can collect a whole lot of data that other manufacturers cannot.

    Now, the particular problem with dividing lanes is probably tied to trying to stay between the lines when the lines are spreading out. If you don't stick to one line or the other, your target is what is in the middle, and it is going to hurt.
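The parent's "aim for the middle" failure mode is easy to see with a toy geometry example. This is purely illustrative and is not Tesla's actual lane-keeping algorithm; it just shows how a controller that targets the midpoint of the two detected lane edges drifts into the gore area once the right edge peels off toward an exit.

```python
# Toy geometry: a naive controller that steers toward the midpoint of the two
# detected lane edges. Not Tesla's algorithm, just an illustration of the
# "target is what is in the middle" failure mode described above.

def lane_edges(x_m: float) -> tuple[float, float]:
    """Lateral positions (m) of the left and right lane edges at distance x_m."""
    left = 0.0
    # The right edge starts diverging toward an exit ramp after 50 m.
    right = 3.7 if x_m < 50 else 3.7 + 0.1 * (x_m - 50)
    return left, right

def naive_target(x_m: float) -> float:
    """Midpoint between the detected edges -- where a naive lane-centerer aims."""
    left, right = lane_edges(x_m)
    return (left + right) / 2

for x in (0, 50, 100, 150):
    print(f"{x:3d} m ahead: target is {naive_target(x):.2f} m from the left line")
# The target walks steadily to the right into the widening gore area (where the
# barrier sits) unless the controller locks onto one lane edge instead.
```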

  • For stuffing yourself in a metal can, then propelling it at great speeds just barely passing other metal cans (within inches). Ludicrous, you folks are...
  • Autopilot steering directly into a concrete barrier at highway speeds in broad daylight is an enormous bug. Multiple sensors must have detected the barrier under those conditions, yet the onboard AI chose to drive into it. This looks like a great big hole in Tesla's software validation process and badly faulty software. A concrete barrier directly in front of the car is not some one-in-a-zillion anomalous corner case. Autopilot software architecture must be very badly flawed; even if the lane-detectio

  • This blithe dismissal of the fact that a machine is making decisions that kill people falls right in line with the need of industry to change the mass perception about machines killing people. And it will get better, right? Death by death. People are expendable apparently.
  • 'Autopilot is intended for use only with a fully attentive driver,'

    Said no logic ever.

    "You had one job!"

  • I wanted to watch the videos of the near misses, but YouTube starts with a one-minute commercial.

  • Maybe I'm old-fashioned, but how about people just drive while they're driving? Then again, I am one of the seemingly-few people who actually enjoys driving. So perhaps I am being unrealistic.

"When the going gets tough, the tough get empirical." -- Jon Carroll

Working...