AI Stats Transportation

Newest Tesla Autopilot Data Shows A 40% Drop in Crashes (bloomberg.com) 167

There's a surprise in the data from an investigation into Tesla safety by the U.S. National Highway Traffic Safety Administration. An anonymous reader quotes Bloomberg: [W]hile all Tesla vehicles come with the hardware necessary for Autopilot, you need a software upgrade that costs thousands of dollars to make it work. Since buyers can add Autopilot features after purchase, this provides a perfect before-and-after comparison. It turns out that, according to the data Tesla gave investigators, installing Autopilot prevents crashes -- by an astonishing 40 percent...

Now -- thanks to an investigation that initially hurt the company -- there is finally some real data, and it's good news for Tesla... As the software matures to match the new hardware, Musk said on Thursday via a Tweet, Tesla is targeting a 90 percent reduction in car crashes.

UPDATE (5/4/18): The NHTSA has now clarified that their study "did not assess the effectiveness of this technology."

UPDATE (2/16/19): The study's underlying data reveals serious flaws in the methodology that undermine its credibility, according to new analysis from a research and consulting firm.
Comments Filter:
  • by alvinrod ( 889928 ) on Saturday January 21, 2017 @04:42PM (#53712407)
    I think the technology is a good idea, but they've picked a terrible name for it. To someone who is uninformed, it makes it sound as though the feature enables automated driving for the vehicle, and while that may be the end goal, it's currently not at that level and may give a false sense of capability. They should refer to it as "Driver Assist" or something that doesn't leave anyone with a false impression of what it actually does.
    • Absolutely. And they should make it much more clear that drivers should keep their hands on the wheel at all times, which the name "autopilot" doesn't imply.

      • There is a mild rebuke if you don't hold the steering wheel.

        If you don't hold the steering wheel then after a time between 1 and 5 minutes (depending on situation) you get an audible and visual warning. Ignore the warning for 15 seconds and you get a "strike". 3 strikes within an hour and you're out: you won't be able to use autopilot until your next journey.
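
A minimal sketch (in Python) of the hands-off warning/"strike" logic described in the comment above. The timings and thresholds are taken from that comment, not from any Tesla documentation, and the single hands-off limit is an assumption (the comment says 1 to 5 minutes depending on situation).

```python
# Hypothetical sketch of the hands-off "strike" logic; constants are assumptions.
HANDS_OFF_LIMIT_S = 60       # comment says 1-5 minutes depending on situation
WARNING_GRACE_S = 15         # ignore the warning this long and you get a strike
STRIKES_FOR_LOCKOUT = 3      # three strikes within an hour locks out Autopilot
STRIKE_WINDOW_S = 3600       # rolling one-hour window

def hands_off_status(elapsed_s):
    """Classify how long the driver's hands have been off the wheel."""
    if elapsed_s < HANDS_OFF_LIMIT_S:
        return "ok"
    if elapsed_s < HANDS_OFF_LIMIT_S + WARNING_GRACE_S:
        return "warn"        # audible and visual warning
    return "strike"          # caller records one strike when this first triggers

class StrikeTracker:
    def __init__(self):
        self.strikes = []         # timestamps (seconds) of recent strikes
        self.locked_out = False   # cleared at the start of the next journey

    def record_strike(self, now):
        # keep only strikes inside the rolling one-hour window, then add this one
        self.strikes = [t for t in self.strikes if now - t <= STRIKE_WINDOW_S]
        self.strikes.append(now)
        if len(self.strikes) >= STRIKES_FOR_LOCKOUT:
            self.locked_out = True   # no Autopilot until the next journey

    def new_journey(self):
        self.strikes.clear()
        self.locked_out = False
```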

    • Agreed. If I bought a car with a feature called "autopilot", I would think I would be comfortable taking a snooze (or watching a Harry Potter movie) while the car did the driving for me. I am really surprised they haven't dropped the name "autopilot" as it is totally misleading and something that has been pointed out repeatedly. I suspect that there are egos involved in the decision not to change the name.

      Or, maybe, they think their software is close enough to achieve certification for totally taking over responsibility for driving the car that they can weather this storm and keep the name for the big roll out.

      • Agreed. If I bought a car with a feature called "autopilot", I would think I would be comfortable taking a snooze (or watching a Harry Potter movie) while the car did the driving for me. I am really surprised they haven't dropped the name "autopilot" as it is totally misleading and something that has been pointed out repeatedly. I suspect that there are egos involved in the decision not to change the name.

        Or, maybe, they think their software is close enough to achieve certification for totally taking over responsibility for driving the car that they can weather this storm and keep the name for the big roll out.

        I agree completely. I'm actually surprised that they haven't been sued for false advertising instead, just like the cell phone companies have for "unlimited". It should be called "Driver Assist" until the driver is allowed to nap. Yes, I know that a pilot isn't allowed to nap but that's not really the point. A pilot has had a lot more training AND is less likely to hit something if they do doze off AND usually has a second pilot as well.

        • by mysidia ( 191772 )

          I'm actually surprised that they haven't been sued for false advertising instead, just like the cell phone companies have for "unlimited". It should be called "Driver Assist" until the driver is allowed to nap.

          The brand name of a product such as AutoPilot is not a legal representation of what its capabilities are.

          Even if that were the case, the name is not inaccurate. Some members of the public have a misconception of what Auto Pilot means, and are of the false belief that an Auto Pilot refers to

      • If you were buying a Tesla you'd be so excited about the AutoPilot, you'd find out all about it before you got delivery.

        Heck you'd have to read or be taught at least something about it, because you wouldn't know how to activate it otherwise. There isn't a button marked autopilot.

    • by ShanghaiBill ( 739463 ) on Saturday January 21, 2017 @04:59PM (#53712479)

      To someone who is uninformed ...

      Since these "uninformed" people don't actually own a Tesla, it doesn't matter one iota what they think about the system. If you actually own a Tesla, the capabilities and limitations of the system are very very clear.

      Also, an autopilot on an aircraft doesn't completely fly the plane all by itself either. Pilots understand that. Do you think they should rename it so the passenger in seat 22C also understands?

      • by Calydor ( 739835 )

        So you are fine with only being informed about the capabilities of a product with a price tag of a car ... AFTER you buy the product?

        • So you are fine with only being informed about the capabilities of a product with a price tag of a car ... AFTER you buy the product?

          I didn't buy the car, my wife did. We both fully understood that she was not buying a full SDC for the following reasons:
          1. We can read.
          2. We can see.
          3. We can listen.
          4. We can think.
          People that can do none of these things are going to have a lot of problems in life beyond believing that their car can drive by itself, and it is unlikely many of them are going to be buying a Tesla.

        • by mysidia ( 191772 )

          So you are fine with only being informed about the capabilities of a product with a price tag of a car .

          The brand name Autopilot is Not automatically deceptive, just because some people have a false idea in their brain about what an autopilot refers to. Besides, it's not even proven that a substantial proportion of the population have this false idea. You haven't presented any scientific survey to prove it, so I would say that a few people on Slashdot are probably the only people who hold this mi

      • Re: (Score:2, Insightful)

        Since these "uninformed" people don't actually own a Tesla, it doesn't matter one iota what they think about the system. If you actually own a Tesla, the capabilities and limitations of the system are very very clear.

        And yet from almost the day that the "autopilot" feature became available, videos started circulating online of actual Tesla drivers doing stupid stuff like jumping into the passenger seat or back seat while letting the car "drive."

        Whether those idiots are representative of Tesla owners is beside the point. Clearly SOME idiots who actually have access to Teslas have done stupid stuff, and I don't think it's coincidence that this started when the feature named "autopilot" was released.

        Also, an autopilot on an aircraft doesn't completely fly the plane all by itself either. Pilots understand that. Do you think they should rename it so the passenger in seat 22C also understands?

        Nope. But TERRIBLE

        • So, these Darwin Award candidates are going to go into smart mode just by giving the system a different name?

          Stupid always wins.

        • by mysidia ( 191772 )

          And yet from almost the day that the "autopilot" feature became available, videos started circulating online of actual Tesla drivers doing stupid stuff like jumping into the passenger seat or back seat while letting the car "drive."

          That's not because of the name.... People do stupid stuff period. E.g. People do stupid stuff with LaneAssist [roadandtrack.com] too, and AutoPilot is not in the name.

          What makes the Tesla a little bit different is not the name, so much as the fact that the Car doesn't instantly disengage the

    • I have an idea. Why not inform the customers that their view of what an autopilot does is completely wrong? If people liken it to a plane, ask them if they would be happy flying in a plane without a pilot, and then point them at their car.

      People are applying unrealistic expectations based on incorrect preconceptions because they don't understand a technology with the same name. Help them understand, don't just change the name.

      • Because if they come up with a better name, all problems are solved. They don't have to go around with complicated explanations telling people about their misconceptions. Or maybe you think it's easy to go around telling people they have misconceptions about what words mean?

        Rename it, it's just easier.
    • The name "autopilot" isn't confusing at all if you think of it as an analogue of the autopilot in a commercial airplane. The airplane autopilot isn't fully autonomous either; for example, it can't take off on its own. Even at cruising altitude it requires the full attention of the pilots: http://www.cnbc.com/2015/03/26... [cnbc.com]
      • The name "autopilot" isn't confusing at all if you think of it as an analogue of the autopilot in a commercial airplane

        Yeah, it's not confusing, all I have to do is think about it for a bit, then read the article you linked to, and everything will be clarified.

        As a software developer, I can tell you that all my users are totally willing to put that kind of effort into understanding. They really try to understand. I'm not being sarcastic here at all.

        • Well, every word has its origin, and the origin of the word "autopilot" is in the aviation industry. So, really, I can't understand why people began to think "autopilot" is equivalent to something like "robo-pilot" or whatever else they imagine.
          • Uh.......how much about the aviation industry do you really expect people to understand before they drive a car? Are you trolling or do you really, actually believe that the average person knows that much about avionics?
        • Indeed, because new words we discover in the English language instantly make sense and there's no point in a dictionary or Wikipedia even existing.

          You touched on something very interesting. Knowledge, intelligence, and general inquisitiveness are dead. You just proved it by saying we need to dumb down something so people can understand it rather than explaining to people what they are getting wrong.

            Knowledge, intelligence, and general inquisitiveness are dead. You just proved it by saying we need to dumb down something so people can understand it rather than explaining to people what they are getting wrong.

            Welcome to the consumer economy.

    • by zr ( 19885 )

      name is fine. won't take too long before autopilot does exactly what the name implies.

      however. we can't count on marketing to deliver information.

      time to start including basic autopilot skills in the driver ed & license exam.

    • Names...

      This reminds me of Spanish-French AI character in Space Quest 6 that would pop up when Roger Wilco pressed the autopilot button.

      His name was "Manuel Auxveride". :)

    • Right, like how the name "automobile" makes people think the car does everything automatically. Oh, wait.
    • I don't think the name really has anything to do with it. The big worry about this technology is liability, and there is the idea out there that manufacturers are trying to be careful about advertising what the car can and cannot do.

      Tesla I'm sure makes it abundantly clear that the car needs driver attendance. But if you sit there long enough and the car continually makes good decisions, you are gradually going to become complacent and maybe start to think that it really can do more than you thought. Th

    • Not really. Everyone knows autopilot in aircraft basically just flies the plane in a straight line. Nobody expects it to take off or land for the pilot. In that sense, Tesla's autopilot actually does more than its name implies.

  • Could it be, with the crash history of "autopilot", people are now using it more as it was intended? As DRIVER ASSIST, rather than turning it on and dozing off behind the wheel?

    I mean, THAT couldn't affect numbers at ALL, right?

    • The 40% reduction is across all Teslas that have Autopilot installed. Right from the first day. And it covers all miles, whether Autopilot was engaged or not. (Several of the safety features of Autopilot are always on.)

      So to answer your question, a change in driver behaviour after hearing about Autopilot crashes could not possibly have affected the statistic, no.

      • The 40% reduction is across all Teslas that have Autopilot installed. Right from the first day. And it covers all miles, whether Autopilot was engaged or not. (Several of the safety features of Autopilot are always on.)

        So to answer your question, a change in driver behaviour after hearing about Autopilot crashes could not possibly have affected the statistic, no.

        Yes, it absolutely could have. Drivers are now more aware of the limits of Autopilot and are using it more responsibly. The sample set 'before' installation includes all driving before the rash of media attention. We may have seen a similar drop in the rate with no installation. The only way to truly compare would be large numbers of each for the same timeframe.

        And we don't have the raw numbers so we don't even know if they are statistically significant.
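
To make the parent's last point concrete, here is a sketch of the kind of significance check one could run if the raw numbers were available: a standard two-sample Poisson rate comparison using a normal approximation. The crash counts and mileages below are invented placeholders, not Tesla's or NHTSA's figures.

```python
# Sketch of a two-sample Poisson rate comparison (normal approximation).
# The counts and mileages are invented placeholders, NOT the real data.
import math

def rate_drop_z(crashes_before, miles_before, crashes_after, miles_after):
    """z-statistic for the difference between two crash rates per mile."""
    r1 = crashes_before / miles_before
    r2 = crashes_after / miles_after
    pooled = (crashes_before + crashes_after) / (miles_before + miles_after)
    se = math.sqrt(pooled / miles_before + pooled / miles_after)
    return r1, r2, (r1 - r2) / se

# Hypothetical example: 1.3 vs 0.8 crashes per million miles, 50M miles per group.
r1, r2, z = rate_drop_z(65, 50e6, 40, 50e6)
print(f"before={r1*1e6:.2f}/Mmi  after={r2*1e6:.2f}/Mmi  z={z:.2f}")
# |z| > ~2 is conventionally "significant"; with only a few dozen crashes per
# group the result can easily be inconclusive.
```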

  • A crash rate of 1.3 per million miles and 130 million miles of data means that there have been about 170 crashes.

    I'm guessing accidents range from minor fender-benders (although with cars of today, a "minor" fender bender costs $2k+) to the fatal accident.

    I would like to know where this 40% reduction takes place in the accident spectrum? Does this mean that there are many fewer fender-benders or fewer accidents which resulted in personal injuries?

    If it's at the lower end of the range then big whoopdie
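
For reference, the back-of-the-envelope arithmetic behind the estimate above (rate and mileage as stated in that comment):

```python
# ~1.3 crashes per million miles over ~130 million miles of data
crash_rate_per_million_miles = 1.3
total_million_miles = 130
print(crash_rate_per_million_miles * total_million_miles)  # 169.0, i.e. "about 170 crashes"
```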

  • One would expect that. Even a bad computer program with a dozen eyes is likely to be better than a bag of meat with only two.

    I'm more concerned about the long-term secondary effects. Do drivers who get used to this technology become dependent on it, and thus have higher accident rates when driving rental cars that lack this technology?

    Additionally, I'm less than convinced by the use of a single number here. To be meaningful, you need at least two numbers: the number of crashes avoided because of software intervention and the number of crashes caused by driver inattention.

    • Re: (Score:2, Troll)

      by mykepredko ( 40154 )

      Why would you assume "a bad computer program with a dozen eyes is likely to be better than a bag of meat with only two"?

      I'm not up on state of the art on computer image/object recognition but the experience I have from about 10 years ago leads me to believe that there are still challenges to be solved, especially when it comes to recognizing movements and intentions. As a driver, some of the cues I rely on include turning indicators, wheel positions, other driver/pedestrian/cyclist eye contact as well as s

      • by Namarrgon ( 105036 ) on Saturday January 21, 2017 @05:34PM (#53712625) Homepage

        Probably because the bag of meat's eyes are too often turned elsewhere.

      • Why would you assume "a bad computer program with a dozen eyes is likely to be better than a bag of meat with only two"?

        Because it's deterministic. Because it's mass upgradable. It may not be physically better right this moment, but it is conceptually and philosophically far better than the situation we have now.

      • I'm not up on state of the art on computer image/object recognition but the experience I have from about 10 years ago leads me to believe that...

        Others have already responded to your other points, I just want to point out that experience from 10 years ago tells you basically nothing about the state of the art today. Deep learning methods have enabled dramatic progress on exactly the class of pattern matching problems that includes computer vision.

        Personally, I still think that LIDAR is inherently superior to video cameras for this task, but Tesla's numbers are impressive, and prove that while their system may not be all that it should be, it's al

      • by dgatwood ( 11270 )

        I'm not saying it's impossible to come up with software that allows a car to autonomously drive itself better than a human. I just challenge the assertion that a computer with multiple cameras is likely superior to a human.

        I say that for several reasons:

        • Human vision is inherently focused on a single thing at a time. They teach you to move your eyes around and scan for things that might be problems, but the reality is that we're very limited in our ability to do so. Computers don't hav
      • I'm not up on state of the art on computer image/object recognition but the experience I have from about 10 years ago leads me to believe that there are still challenges to be solved, especially when it comes to recognizing movements and intentions.

        Neural networks have come a LONG way in ten years, due in large part to the exponential growth in processing power of GPUs. Neural nets can perform as well as or better than humans in a variety of image recognition tasks. For example, neural nets have been trained to give the prognosis for cancer patients based on images of tumors. The networks were trained on thousands of known images of previous cancer patients along with medical histories. When new images were passed through the network, the prognosis, incl
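
As a rough illustration of the kind of supervised pipeline described above, here is a minimal sketch using PyTorch (an assumed framework choice): a small CNN trained on labelled images. Random tensors stand in for the real dataset, so this only shows the shape of the approach, not a working diagnostic or driving system.

```python
# Minimal supervised image-classification sketch; placeholder data, not a real model.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 2),                   # e.g. "good prognosis" vs "poor prognosis"
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):                 # real systems train on thousands of labelled images
    images = torch.randn(8, 3, 64, 64)  # placeholder batch of 64x64 RGB images
    labels = torch.randint(0, 2, (8,))  # placeholder labels
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```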

    • by west ( 39918 )

      To be meaningful, you need at least two numbers: the number of crashes avoided because of software intervention and the number of crashes caused by driver inattention.

      I think that two numbers would be deceptive because almost no-one is capable of acknowledging their inattention. If you found out that 50% of accidents are caused by inattention, but the autopilot is a 20% *worse* driver than someone paying attention, you *know* that everyone would flee from AutoPilot on the assumption they won't be part of the 50% failing to pay attention.

      One of the primary problems is that humans (in general) are incapable of acknowledging the weaknesses that cause accidents, thus

      • by dgatwood ( 11270 )

        I think that two numbers would be deceptive because almost no-one is capable of acknowledging their inattention.

        They don't have to. With as much data collection as the Tesla systems do, assuming they collect the same data with autopilot disabled, too, it should be possible to do a post-mortem (so to speak) on a random sampling of accidents and determine whether a reasonable person should have noticed the stopped car in front of them (for example) or not and whether the driver failed to react in a timely fashion.

    • Re: (Score:3, Insightful)

      by tempo36 ( 2382592 )

      I'm anti-antibiotic and modern medical intervention because I think knowing that they're available just makes people careless and sloppy when they travel in areas where those interventions aren't available. I would much rather a few more people die because we don't use antibiotics at all than for people to become reliant on them and just become careless and unfit.

      I agree that there could be secondary effects, but the more logical conclusion from my standpoint is that if the technology definitively improves

      • by dgatwood ( 11270 )

        I'm anti-antibiotic and modern medical intervention because I think knowing that they're available just makes people careless and sloppy when they travel in areas where those interventions aren't available. I would much rather a few more people die because we don't use antibiotics at all than for people to become reliant on them and just become careless and unfit.

        First, antibiotics are available nearly anywhere in the world you might go. By contrast, these sorts of autopilot features are available on a tin

        • Early studies strongly suggested that partial self-driving solutions did more harm than good, which is why I think we should wait to make self-driving technology available until it can truly take the place of the human driver, rather than introducing a solution that only works part of the time and can lead to false confidence the rest of the time.

          I have the same feeling about it. Make the human do more, keep them attentive, and let the car correct when an accident is about to happen (e.g. slam on the brakes when a sudden obstacle appears, or help keep traction when a drastic turn is made).

          I could be wrong, and I'd like to be wrong, but my gut says we'd be better off waiting a few more years for a more complete solution, rather than deploying a partial solution more broadly.

          Simple counter argument: deploying even partial solutions at a rather large scale, like Tesla is doing, provides a great source of real world data and experience for the software developers. Sure things are bound to go wrong sometimes in the autopilot in its curren

    • Someone else who sees one bad driver out of a thousand and lets them ruin their whole day. Suddenly human drivers can't do anything right. The fact is that I have been on the road with many human drivers that drive just fine.
      • by dgatwood ( 11270 )

        My opinion has nothing to do with bad drivers. Everybody gets tired. Everybody gets distracted. Anyone who says otherwise is kidding him/herself.

        Besides, more than 70% of all drivers eat while driving, and that's responsible (according to one study) for about 80% of all crashes. When I say humans suck as drivers, I mean that the overwhelming majority of human drivers (if not all) suck at driving at least some of the time. The only reason we don't have orders of magnitude more wrecks than we do is that

        • Who cares how many people eat while driving? What matters is how many of those that eat get into accidents they otherwise wouldn't have. Split second decision making is not often required because humans anticipate danger fairly well, even if they are eating at the wheel. By relying on AI we are relying on a situation that will rely entirely on split second judgement because that is all AI will be able to do. Right now I am not confident sensors are even robust enough to sense a situation which requires
    • One would expect that. Even a bad computer program with a dozen eyes is likely to be better than a bag of meat with only two.

      I'm more concerned about the long-term secondary effects. Do drivers who get used to this technology become dependent on it, and thus have higher accident rates when driving rental cars that lack this technology?

      That's a concern, but my bigger worry is the seat belt effect: that in response to the perception of better safety, people start to take more risks.

      That's also the major reservation with this data set. These are all users relatively new to the auto-pilot. I know if you installed an auto-pilot in my car I'd be pretty damn paranoid for the first few months and my accident rate would plummet too. I'm not certain whether they're measuring the safety benefit of the auto-pilot or just their own drivers being extra careful.

  • I can't completely wrap my head around how they could do this study and how the statistics were calculated. The article states this is a before/after test, so presumably it involves comparing the crash rate before Autopilot with that after. But it is somewhat difficult to do because did they really include drivers that crashed (before getting into AutoPilot) to see if they would crash again (after AutoPilot)? I would imagine many people might even stop driving after a crash or at least have gotten a new c
    • by Calydor ( 739835 )

      It's called a control group.

      They literally have the same car - except for AutoPilot being enabled.

      They are completely randomly selected across the nation based on one criterion: buying a new car.

      Then they compare crash rates per mile driven for the two groups: The ones with and the ones without AutoPilot.

      You don't have to check if the guy that crashed without AutoPilot also crashes with it.

      • No, because it specifically says: "[W]hile all Tesla vehicles come with the hardware necessary for Autopilot, you need a software upgrade that costs thousands of dollars to make it work. Since buyers can add Autopilot features after purchase, this provides a perfect before-and-after comparison."
        There is absolutely no reason to think that the group voluntarily electing to spend a lot of money on AutoPilot has the same overall risk as those who don't. So you can't compare "Drivers who bought AutoPilot afte
        • They compared crashes with autopilot on to crashes with autopilot off. Same car and driver. Car crashes... was autopilot on or off?

          • But people drive in way more dangerous circumstances with autopilot off. That isn't seriously how they measured it, is it?
          • To me Tesla lost a great opportunity to make it a 100% reduction: just turn autopilot off immediately before any crash, then you would never have autopilot on... profit /sarcasm
          • No they didn't. The statistics cover Teslas without autopilot vs Teslas with autopilot. The second group including all miles, whether or not autopilot was actually engaged. This is because:

            a) Some of the safety features of autopilot are on all the time even when the full autopilot isn't engaged.

            b) If they did what you suggest, they'd be comparing mostly urban travel vs mostly highway travel. As autopilot is only really designed for the highway at this stage.
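
A toy calculation of the road-mix confound described above. The per-road crash rates and mileage shares are invented; the point is only that two groups with identical per-road risk can show very different overall rates if one group's miles are mostly highway.

```python
# Toy illustration of confounding by road type; all numbers are invented.
URBAN_RATE, HIGHWAY_RATE = 2.0, 0.5   # hypothetical crashes per million miles

def blended_rate(urban_share):
    return urban_share * URBAN_RATE + (1 - urban_share) * HIGHWAY_RATE

ap_engaged_rate = blended_rate(urban_share=0.1)   # Autopilot miles: mostly highway
manual_rate     = blended_rate(urban_share=0.7)   # manual miles: mostly urban

print(f"Autopilot-engaged miles: {ap_engaged_rate:.2f}/Mmi")  # 0.65
print(f"Manual miles:            {manual_rate:.2f}/Mmi")      # 1.55
# A naive comparison shows a big "safety benefit" even though the per-road
# risk is identical in both groups.
```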

            • by mspohr ( 589790 )

              Autopilot can be active at any time. It's especially good in heavy traffic. Works well on two lane roads as well as highways.
              They compared accidents driving with AP ON vs AP OFF.

              • Autopilot can be active at any time. It's especially good in heavy traffic. Works well on two lane roads as well as highways.

                I said it was only really designed for highways. Not that it can't do other roads.

                They compared accidents driving with AP ON vs AP OFF.

                You are completely wrong. Consult the report. They compared cars without AP installed vs those with it installed. Using the data for cars with AP installed, regardless of whether it was active or not.

          • They compared crashes with autopilot on to crashes with autopilot off. Same car and driver. Car crashes... was autopilot on or off?

            And the autopilot is only supposed to be on when driving on highways where the rate of crashes is 10x smaller than on average. So does that mean it crashes 4x more often than humans on highways?

            • by mspohr ( 589790 )

              You can turn the autopilot on at any time. Divided highway or narrow two lane roads. It's especially good in traffic.

  • >> installing Autopilot prevents crashes -- by an astonishing 40 percent...you need a software upgrade that costs thousands of dollars to make it work

    This kind of reads like: "So... do you want the software that shuts off your fuel valve during a crash, or the one that opens it fully on impact? It's a $5K option... your choice, really."

    If it's really just a software option, doesn't this sound a lot like the VW software "option" that cheated on emissions tests?
    • No, it's a fancy tech feature that you're paying for (or not) which is optional. That's like complaining that one ski jacket has a built in avalanche transponder (some do) and another model by the same manufacturer doesn't and that somehow the manufacturer is liable for your injury when you choose to not purchase the additional safety feature.

      At some point if the NTSB mandates autopilot then it will be on all cars, but until then it's just a feature just like any number of others. Seats that reduce whiplash

    • That's what bugs me about this. Corporations are on their way to determining who lives and who dies based on how much one can pay. Not passively like having a better airbag, but actively.
      • Hasn't that been the case in the US for a long, long time already? It was somewhat solved a few years ago, but since yesterday it's back to how it was.

        ER staff: "Sorry, this patient can't pay for his treatment, please take him out again."

        • This compounds the problem. Now you're hoping your automaker thinks you're important enough and your family is important enough to look out for you and your price point. When will we have to pay more money if we want to drive at highway speed?
  • Just a reminder,

    Expect media outlets whose owners will benefit financially from Tesla's success to report this in a positive light and pimp it hard.

    Expect media outlets whose owners will benefit financially from Tesla's failure to report this in a negative light, bury it, or begin advertising sponsored competitor's autopilot as being superior.

    Does anyone have a working link to the actual report? It was supposed to be at static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF
  • when I got creamed out on my road bike, assuming it can detect pedestrians. I was carrying the damn thing across a cross walk (with the little green walky man no less) and somebody ran the light. They clipped the wheel of my bike or I wouldn't be here right now. The lady stopped. It was broad daylight but somehow she didn't "see" me. Folks tune out when driving. That light almost never turns red so it didn't occur to her to stop. Put another way they're driving with their lizard brains. I'd rather they do it
    • My new car is supposed to see pedestrians and brake to avoid hitting them. I haven't actually tested this yet, due to a lack of experimental subjects.

  • installing Autopilot prevents crashes -- by an astonishing 40 percent...

    I think everybody knew that a solid autopilot system was going to be better than human drivers on average. Why is the author astonished by this?
