Transportation

Elon Musk: Tesla's Autopilot Software Could Save Half a Million Lives Every Year (fortune.com) 265

An anonymous reader writes: In the wake of a deadly crash involving a Model S that was driving with its Autopilot software turned on, Tesla CEO Elon Musk issued a few interesting remarks on the technology to Fortune. Notably, the publication recently ran a piece attempting to portray Tesla in a bad light by noting that Musk sold more than $2 billion worth of Tesla stock just 11 days after the aforementioned May 2016 accident, and all the while shareholders were kept in the dark until recently. Musk wrote: "Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public."
This discussion has been archived. No new comments can be posted.

  • except..... (Score:2, Insightful)

    by ganjadude ( 952775 )
    when it doesn't http://money.cnn.com/2016/06/3... [cnn.com]
  • by Anonymous Coward on Tuesday July 05, 2016 @05:28PM (#52452029)

    Pilot-aid would be better and might have saved an extra life.

    • by creimer ( 824291 )

      Rolaid would be better and might have saved an extra life.

      FTFY

    • Pilot-aid would be better and might have saved an extra life.

      How about drunk-aid? Which begs the question, at what point is autonomous driving good enough to allow drunks back behind the wheel, and whose to blame if there is an accident?

      • "Who is" to blame.... excuse me.
      • I'd say at the point when they can legally get into the passenger's seat instead. And do so. So long as you need a human driver to be able to override the autopilot, they should be legally competent to drive.

        Once you cross the line from "autopilot" (aka simple-situation driving aid) to a fully autonomous system such as Google's self-driving cars are pursuing, then you are simply a passenger, and it shouldn't matter if you're too drunk to stand, any more than it would if you hired a taxi in that state. At t

    • by mea_culpa ( 145339 ) on Wednesday July 06, 2016 @04:50AM (#52454841)

      Autopilot is actually an accurate name for it.

      Autopilot was primarily invented for aircraft and even today, autopilot will still happily fly an aircraft into terrain without human interaction if you let it. There have been numerous CFIT fatal crashes of aircraft with over 9000 deaths. Each of these incidents brought more knowledge of how to improve technology to help prevent future occurrences (I expect the same to happen with autonomous vehicle technology). Autopilot was never intended to replace the human pilot or alleviate the responsibility of the human pilot to maintain constant situational awareness. Likewise, autopilot in the Tesla was never intended to alleviate the driver of the responsibility to maintain continuous situational awareness. The driver actually has to agree to this when using it.

      I think Hollywood may have warped people's perception of what autopilot actually is and its limitations.

  • by Anonymous Coward on Tuesday July 05, 2016 @05:29PM (#52452043)

    The Tesla autopilot running under ideal conditions (with a human backup) and a human driver under all conditions are not equivalent, and we cannot directly compare their failure rates. Beware of naive statistics.

  • by Anonymous Coward

    This would only be true under ideal circumstances in which everyone and everything worked flawlessly in tandem, and that just isn't the real world any more than the opposite statement. Suck it up, take responsibility, and be a man, Elon. Do the right thing and suspend public trials of this tech until it's truly ready.

    • by Nethemas the Great ( 909900 ) on Tuesday July 05, 2016 @06:38PM (#52452547)
      Tesla claims you still need to pay attention to the road around you. Vehicle autopilot is no different than aircraft autopilot. It is a workload reduction device, and does not replace the driver/pilot. Interestingly, this confusion about where the autopilot's tasks begin and end is not limited to cars; even aircraft pilots apparently goof this up. [cnbc.com] In the case of this Tesla crash the dumb*** was watching Harry Potter on his portable. The driver fracked up by not paying attention. It's no different than if the flight crew left the flight deck. Stuff happens and someone needs to be able to take over for the autopilot when it does. This driver's Tesla--which he nicknamed Tessy--saved him from another potentially serious accident, which he documented in his blog. Unfortunately that episode must have given him undue confidence in the autopilot system.
    • Re: (Score:3, Informative)

      by beelsebob ( 529313 )

      Why would he suspend trials - the current version is already safer than a human. The US average death rate when driving on a freeway is 1.08 deaths per 100,000,000 miles. Tesla autopilot's current death rate is 0.769 per 100,000,000 miles.

      • by Mr D from 63 ( 3395377 ) on Tuesday July 05, 2016 @07:05PM (#52452735)

        The US average death rate when driving on a freeway is 1.08 deaths per 100,000,000 miles. Tesla autopilot's current death rate is 0.769 per 100,000,000 miles.

        Come back with deaths per mile of people driving high end, less than 10 year old vehicles, and exclude miles driven in snow, ice or other treacherous conditions and also eliminate passenger deaths. That's just for starters.

      • That's the death rate in America for all roads.

        Freeways/divided highways are about 1/4th that.

        • by Cyberax ( 705495 )
          Autopilot works on regular roads and it actually automatically limits you to the posted speed limit +5mph (in AP mode).
      • With a sample of 1 death, the two are statistically equivalent. Maybe after we see ~10-20 Tesla deaths with autopilot we can properly compare, but for now...

        • Not to mention that the numbers wouldn't look nearly as good if there were four people in the car. Anyone using these statistics at this point is an idiot, and Musk is counting on the idiocy of the press when spouting his statistics.
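[Editor's note: the small-sample point in the thread above can be made concrete. With only one observed Autopilot fatality, the exact Poisson 95% confidence interval is enormous. A quick sketch, using the mileage and rate figures quoted in this thread; the interval bounds for a single observed event are standard table values:]

```python
# One Autopilot fatality in roughly 130M miles, vs. a quoted US average
# of ~1.08 deaths per 100M miles. The exact Poisson 95% CI for the mean
# count when k=1 is observed is about [0.025, 5.57] events.
TESLA_MILES = 130e6
HUMAN_RATE = 1.08e-8          # deaths per mile, figure quoted upthread

ci_low, ci_high = 0.0253, 5.572   # exact Poisson 95% CI for k=1
rate_low = ci_low / TESLA_MILES   # ~0.02 deaths per 100M miles
rate_high = ci_high / TESLA_MILES # ~4.3 deaths per 100M miles

# The human average sits comfortably inside the interval:
print(rate_low < HUMAN_RATE < rate_high)  # True
```

In other words, a single fatality in ~130M miles is statistically consistent with anything from roughly 50x safer than the human average to roughly 4x worse.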
  • by fluffernutter ( 1411889 ) on Tuesday July 05, 2016 @05:32PM (#52452067)
    How many years will it take for the automated car to be affordable for the common person? I can't afford a car with even the most minimal of automation right now save for standard cruise control. Did anyone catch the article on how the average family can't afford most cars as it is? Most people don't even see the point of buying a new car, much less an automated new car. Saving lives with automation is a pipe dream until there is a plan to make these something that everyone can buy which isn't going to happen any time soon. Stop making it an excuse to kill people with experimentation.
    • Re: (Score:3, Informative)

      How many years will it take for the automated car to be affordable for the common person?

      The marginal cost is very low. It is mostly software, which has a marginal cost of zero. Then there are a few sensors. Hi-res cameras cost less than $5 each (which is why they are in $20 cell phones). Radar units used in adaptive cruise control are less than $1000, and dropping in price. If your car already has ACC (as many do) then the additional cost for full self-driving is minimal. It is likely that you will save more on insurance than the extra cost for hardware.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        The marginal cost is very low. It is mostly software, which has a marginal cost of zero.

        So what? The cost of producing the FIRST copy of the software is sky-fucking-high. And every time you change your hardware platform, you have to spend a vast amount to certify, test, bug-fix, and fully integrate your software with that new, revised hardware. This isn't a fart app that "oh well, if it crashes now and then, no big deal." The requirements for mission-critical realtime software systems on which lives dep

        • > The requirements for mission-critical realtime software systems on which lives depend are (and *should be*) incredibly high.

          Why? As I see it they should follow the same criteria as pharmaceuticals: at least as safe and effective as the current standard. Which in the case of driving systems is heavily peppered with drunks, idiots, distracted soccer moms, etc.

      • Now cost out the robotic brake and steering systems.

        • Should be cheap enough - scale models of the systems are already mature in the high-end RC car market and mechanical actuators aren't exactly expensive. In fact many new cars already have such systems in place as safety aids - all that's missing is the judgment to take over full-time instead of just in emergencies where it appears the driver won't avoid a crash on their own.

      • Yes and I'm sure the cost to make automatic windshield wipers and adaptive headlights is very low as well, yet they only make their way into a $60K+ car.
    • How many years will it take for the automated car to be affordable for the common person?

      With services like Uber, it's not going to take very long. In the case of the "average family", Uber is already cheaper for some trips than the bus.

      And let's do the math, a full time Uber car typically does 180,000 miles every three years. But a fully automated Uber car could work around the clock and should do much more than that -- thereby requiring more frequent replacement of the car.

      • Except Uber isn't a replacement for a car until they can predict when you're going to step out your front door and be in your driveway by the time you get there.
    • by Cyberax ( 705495 )
      Tesla uses a Mobileye camera and regular ultrasonic sensors. Their total cost is within a couple hundred dollars, with a little more for effectors (steering wheel motor, electric brakes). I'm pretty sure other car vendors will introduce comparable systems within the next couple of years.
      • Yes and they will of course extend this technology at cost to their customers. They wouldn't do anything like put the tech in a luxury car and mark it up 100x would they? That's not what is supposed to happen in capitalism is it?
        • by Cyberax ( 705495 )
          Yes, they would. Car makers first put technology into expensive cars and then after a couple of years they are forced to put it into cheap/midrange cars by their competition. Just look at adaptive cruise control - it used to be an option for high-end cars but it's now available as an option from pretty much all car brands.
    • Collision avoidance technology is becoming more widespread over time.

      What used to be available only on high-end Volvos and Mercedes has now trickled down, and even the smaller and cheaper VW Up! has a LIDAR used for "City Safety" (= automatic braking to avoid collisions with pedestrians and other vehicles at in-city speeds) as a standard option.

      And that is the car currently available as the cheapest option of the fleet of some car-sharing companies.

      It *is* getting affordable.

      (Well for a certain cate

  • by quantaman ( 517394 ) on Tuesday July 05, 2016 @05:39PM (#52452117)

    He continued, “Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”

    Are these projections from peer-reviewed research published in a proper journal? Are these projections based on public Tesla claims? Is this Elon Musk pulling numbers out of his trunk?

    Considering these are real lives of actual people at stake I hope Tesla did some serious research before selling these to the public.
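[Editor's note: Musk's arithmetic can at least be checked against the figures quoted elsewhere in this thread. A back-of-the-envelope sketch, taking Tesla's own quoted rates at face value, which is itself a generous assumption:]

```python
# If Autopilot's quoted rate (0.769 deaths per 100M miles) simply replaced
# the quoted US highway rate (1.08 per 100M miles) everywhere, the implied
# reduction is nowhere near the 50% Musk's "half a million lives" assumes.
WORLD_DEATHS = 1_000_000      # order-of-magnitude annual road deaths
HUMAN_RATE = 1.08             # deaths per 100M miles, quoted upthread
AUTOPILOT_RATE = 0.769        # Tesla's quoted figure, same units

reduction = 1 - AUTOPILOT_RATE / HUMAN_RATE   # ~29%, not 50%
print(round(WORLD_DEATHS * reduction))        # ~288k lives, not 500k
```

Even granting Tesla's numbers, the implied reduction is about 29%; the 500k figure requires assuming Autopilot halves the fatality rate on every road type, in every country, in every condition.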

  • by istartedi ( 132515 ) on Tuesday July 05, 2016 @05:39PM (#52452119) Journal

    That level of wealth alone would save far more lives. Anybody who could afford a Tesla could also afford clean drinking water, air conditioning, medicine, proper nutrition, etc. Musk is just taking in one figure and ignoring the fact that so much of the world is driving a run-down beater that doesn't have anti-lock brakes, or they're just driving motor scooters which are far more dangerous, or they're driving nothing at all and hauling water from toxic wells because of POVERTY. How about Musk buy a helmet for every 3rd world motor-scooter rider, then get back to us on this?

    • by Pinkoir ( 666130 )
      So if a technology doesn't work right now for everybody we should abandon it? This might be the wrong website for such a position. Teslas with autodrive are expensive now but the same has been true of every life-saving technological improvement at some point. In the late 70s early 80s would you have said "These ABS systems only benefit the rich. Fukkit and redirect the money to better drinking water"? That technology saves a ton of lives now. It's not Elon Musk's job to provide and legislate universal
      • You're putting words in my mouth. I didn't say he should abandon the technology--only that his reasoning is specious.

      • Technology allows us to test these things in simulators and on closed courses before releasing them to the public. Of course that would cost more, but that shouldn't be a problem for Elon since he is so altruistic.
  • by protest_boy ( 305632 ) on Tuesday July 05, 2016 @05:41PM (#52452135)

    I think there's a huge unrealized danger to these quasi-autonomous cars because people will treat them like a fully autonomous car and do things they shouldn't (e.g. read the newspaper, watch a movie, doze off, etc.). Right now the drivers of these expensive Tesla cars are not representative of the larger driving public. If we put this technology into 100% of the cars on the road, I predict the number of accidents due to imperfect AI will rise significantly because of driver inattention. It may still prove to be an improvement over human control, but I doubt the number of lives saved will be what Musk claims.

    Give me a car that will take me to work while I nap in the backseat. I have no interest in being on the road filled with semi-autonomous cars.

    • by MightyYar ( 622222 ) on Tuesday July 05, 2016 @06:35PM (#52452515)

      Give me a car that will take me to work while I nap in the backseat. I have no interest in being on the road filled with semi-autonomous cars.

      Sorry, but history seems to indicate that humans don't adopt technology this way. We just brashly try shit way before it is remotely safe to do so, and then regulation follows when necessary. Hell, seat belts weren't standard on ANY car until 1958, and the very first seat belt law anywhere in the world didn't happen until 1970. Automobiles have historically been death traps, with a continuum to relative safety now. This will probably continue going forward, until our descendants view our relative death traps as we do the Model T. Automation will almost certainly make cars safer, but it won't be a binary operation. It will be a long slog through imperfect implementations.

      • So by your own words, now we know better. Musk knows better or he wouldn't be so defensive about this whole thing. Only idiots repeat mistakes made in the past.
        • "We"? No. We are exactly the same species as our parents and grandparents. We'll keep right on killing each other in new and technically interesting ways. Musk can point to his 1 fatality in 134 million miles track record as being better than the 1 in 94 million human average - fair or not. He's selling cars, after all. By the time regulators catch up, they'll be requiring autopilot because the accident rate will be consistently better than humans. We'll keep right on being human.

    • by labnet ( 457441 )

      Totally agree with you protest_boy.
      I drove a Tesla for a week with the Auto Pilot function and found it dangerous.
      Because it relies on white lines for steering, as soon as conditions become non-ideal, it gives up.
      Volvo said semi autonomous driving was a bad idea, and I agree with them.
      You often have only hundreds of milliseconds to take corrective action, and a half-assed autonomous driving system (Tesla has no LIDAR) will leave your concentration lapsing for extended periods, making you a risk on the road.

  • One Idiot Dies (Score:2, Insightful)

    by Anonymous Coward

    ...and half of people think it's the fault of the tool he was (mis)using.

    Frankly Tesla's autopilot isn't all that special. It's just a combination of lane keeping, adaptive cruise control, and brake assist that pretty much is available from any car company.

    There will never be an automated system of any kind that won't cause deaths.

  • Just because there have been half the number of deaths per kilometre when your auto-pilot is activated doesn't mean you can save half the people who died in car accidents.

    It's only used on highways. That's a road with no pedestrians to kill.
    It keeps you in your lane and attempts to keep a safe following distance from the car in front of you (unless a truck pulls out in front of you, in which case it keeps driving at it at full speed and kills you)
    In Britain 60% of deaths occur on country roads, not highways, where street

    • by Luthair ( 847766 )

      It's actually highways with a median, which is even more restrictive in terms of unexpected scenarios.

      Listening to general car podcasts I've heard a few stories from reviewers that the autopilot will suddenly bail, say mid-corner, which doesn't sound particularly safe to me.

  • by JustAnotherOldGuy ( 4145623 ) on Tuesday July 05, 2016 @05:56PM (#52452269)

    I love the idea of self-driving cars, but I think it's going to be hard to get it to work acceptably "as is", that is, the way the problem is being approached.

    Much of the decision-making must remain "onboard" the car but I think self-driving vehicles will be vastly improved with some feedback and control signals from the road or a locale-specific traffic guidance computer.

    In other words, in addition to its own decision-making software, the vehicle should also be receiving some sort of signals or guidance info from the road in the area it's currently passing through.

    Sort of like an air-traffic control system, where responsibility for air traffic is handed off from control center to control center as the plane makes its way from point A to point B. The difference is that this guidance should be completely automated, and be an adjunct to what the car does, not its primary means of navigation. I see this primarily as speed and road condition management info.

    I know, I know- what about hackers? Yeah, that'll be an issue for sure, but it can be mitigated by the use of some solid encryption routines and boundary-monitoring, i.e. making sure that a Bad Guy(tm) doesn't hack the controller and tell all the vehicles in its area to speed up to 100mph or whatever. Or tell half to speed up and half to come to a dead stop. Some things shouldn't be able to be overridden, such as max speed and collision avoidance.

    In short, I think autonomous vehicles would be better (and probably safer) if they not only thought for themselves, but also were receiving some sanity-checking and guidance info specific to the road or area they travel on.
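[Editor's note: on the encryption point above, here is a minimal sketch of what an authenticated roadside advisory might look like, using HMAC so a car can reject forged guidance messages. Everything here is hypothetical (the message fields, the shared-key scheme, the speed bound); real V2X systems use certificate-based signatures, and key distribution and replay protection are the genuinely hard parts, not shown:]

```python
import hmac, hashlib, json

# Hypothetical shared key between a roadside controller and a vehicle.
SHARED_KEY = b"demo-key-not-for-production"

def sign_advisory(advisory: dict) -> dict:
    """Controller side: attach an HMAC tag over a canonical serialization."""
    payload = json.dumps(advisory, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": advisory, "tag": tag}

def verify_advisory(msg: dict, max_mph: int = 80) -> bool:
    """Vehicle side: reject forged messages AND out-of-bounds values."""
    payload = json.dumps(msg["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, msg["tag"]):
        return False  # forged or corrupted in transit
    # boundary check: never accept an advisory outside sane limits,
    # even one with a valid tag (defense in depth)
    return 0 < msg["payload"]["speed_mph"] <= max_mph

msg = sign_advisory({"segment": "I-80-W-417", "speed_mph": 55})
print(verify_advisory(msg))  # True
```

Note the two-layer check: cryptographic authenticity first, then the boundary-monitoring the comment describes, so even a compromised controller can't push vehicles past hard limits.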

    • AFAIK, many new cars already do this, and they have even more rudimentary "AI" than the Teslas.
      Lazy sourcing: https://www.google.com/search?... [google.com]

    • Self-driving cars are Agile!*

      *Move fast and break things
    • by bartle ( 447377 )

      Autonomous vehicles will never take off if the prerequisite is to first create a centralized traffic control system. What I think we'll see instead is autonomous vehicles taking cues from human driven vehicles via the V2V (vehicle to vehicle) communication system that will roll out in the next few years.

      Tesla's vehicle already kinda does this. They used the car's GPS units to build a map of highway lanes, based on how the humans drove them, and then feed this map into their autonomous system. One can easi

      • The idea of a car that connects to anything outside the car scares me on its own. How exactly have they prevented any kind of hacking?
  • by King_TJ ( 85913 ) on Tuesday July 05, 2016 @06:05PM (#52452327) Journal

    The truth of the matter is, Tesla pretty much HAS to come out swinging, defending its self-driving technology, or else it's easily "game over" before it even really gets started for them. Somebody had to release the tech for the general public to use first, and Tesla took the chance. (The other car manufacturers have been far more conservative with things, offering only "emergency braking / collision detection and avoidance" or just parallel parking assist.... individual components that would make up a "self driving car".)

    That said? I agree with the folks here saying his stats are way off the mark and unrealistic. Since you can't even use his technology right now when not on a highway, it's not even an option for saving any lives in collisions that happen on smaller roads.

    I think it was Mercedes or maybe Audi who commented that the Tesla system uses cameras and computer AI to determine if something is in the car's way. Their system used radar in conjunction with cameras, which sounds superior to me.

    • Tesla doesn't HAVE to come out swinging. That's just Elon's way. He's a huge whiner. It never seems to matter whether someone's criticism of his product is valid or not; he just whines away like a little baby. Embarrassing, really.

  • Doing the math (Score:5, Informative)

    by kamitchell ( 1090511 ) on Tuesday July 05, 2016 @06:08PM (#52452341)

    It's all marketing hype and mere armchair statistics.

    Fortune doesn't know how to do the math, I don't know how to do the math, Musk doesn't know how to do the math, but perhaps a few readers of this comment could do the math.

    It would take 275 million miles of autonomous driving to have any confidence at all that an autonomous car is safer than a human driver.

    Ars Technica reported on it [arstechnica.com], and if you want to see the math, the RAND corporation, who are kind of experts at the math, have a detailed report [rand.org] available, which explains the math.

    Basically, while the marketing engine can claim that autonomous driving is safer, it's not even possible to have any proof of it within any reasonable level of statistical confidence.

    I mean, sure, we try to make driving safer, and assisted driving may help, but please, let's be realistic about where we're at.
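[Editor's note: the RAND figure cited above follows from the statistical "rule of three": observe zero failures in n trials and the 95% upper confidence bound on the failure rate is about 3/n. A minimal sketch, using ~1.09 deaths per 100M miles as the human benchmark:]

```python
# To claim with 95% confidence that an autonomous fleet beats the human
# fatality rate, you need enough fatality-free miles that 3/n falls below
# that rate -- i.e. n >= 3 / rate.
HUMAN_RATE = 1.09e-8              # deaths per mile, benchmark figure

miles_needed = 3 / HUMAN_RATE
print(f"{miles_needed / 1e6:.0f}M miles")  # ~275M miles
```

And that's the best case: a single fatality during the trial pushes the required mileage far higher, since the bound for one observed failure is roughly 4.74/n instead of 3/n.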

    • I'd say that needs to be 275 million miles of dedicated autonomous driving. You can't cherry-pick just the highways and have the driver take control if there is a detour or something. A safe trip should be a safe trip uninterrupted, from beginning to end.
    • Can we at least get some data on the number of people who drown [thenationa...awyers.org] in their cars vs. floating Teslas [arstechnica.com]?

  • An anonymous reader writes:

    Yeah, someone whose last name rhymes with "tusk".

    "Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available.

    Holy shit. And if they took a train instead, it could save approximately ONE MILLION LIVES.

    Seriously, I understand that in an age of Martin Shkreli and Star of David dogwhistles, subtlety

  • Powerful tools in untrained/stupid hands.

    That's the problem every time, be it with computers, nuclear fission, cars or whatnot.

    Look what people are doing with the Tesla "Autopilot". Pure and utter reckless fooling around. No wonder people die.
    From all we know it's pretty certain the man was watching a f*cking DVD while being the responsible handler of an automobile.
    That alone should cost you your driver's licence for a lifetime!

    I'm glad he only killed his own stupid self and not somebody else. That would've b
