Transportation

Elon Musk Says Tesla's New Autopilot Features Would Have Prevented Recent Death (fortune.com)

An anonymous reader writes: Tesla Motors Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system, Autopilot, with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May. Musk said the update, which will be available within a week or two through an "over-the-air" software update, would rely foremost on radar to give Tesla's electric luxury cars a better sense of what is around them and when to brake. The new restrictions in Autopilot 8.0 are a nod to widespread concerns that the system lulled users into a false sense of security through its "hands-off" driving capability. The updated system will now temporarily prevent drivers from using the system if they do not respond to audible warnings to take back control of the car. Musk said it was "very likely" the improved Autopilot would have prevented the death of Joshua Brown, whose car sped into the trailer of a truck crossing a highway, but he cautioned that the update "doesn't mean perfect safety."
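A rough sketch of the lockout rule the summary describes, for the curious. Everything below is invented for illustration (the thresholds, names, and structure); Tesla has not published these details, and the one-minute figure comes from the comments further down:

    # Sketch of the Autopilot 8.0 lockout behavior described above.
    # All thresholds are assumptions; Tesla has not published exact values.

    HANDS_OFF_LIMIT_S = 60   # assumed: warn after ~1 minute hands-off
    WARNING_GRACE_S   = 10   # assumed: time allowed to respond to the warning

    class AutosteerGuard:
        def __init__(self):
            self.locked_out = False
            self.hands_off_since = None
            self.warning_since = None

        def tick(self, now, hands_on_wheel, parked):
            """Return the system state for this control-loop iteration."""
            if parked:                      # lockout clears once the car is stopped
                self.locked_out = False
            if self.locked_out:
                return "unavailable"        # driver ignored warnings this drive
            if hands_on_wheel:
                self.hands_off_since = None
                self.warning_since = None
                return "engaged"
            if self.hands_off_since is None:
                self.hands_off_since = now
            if now - self.hands_off_since < HANDS_OFF_LIMIT_S:
                return "engaged"
            if self.warning_since is None:
                self.warning_since = now    # start the audible warning
            if now - self.warning_since < WARNING_GRACE_S:
                return "audible_warning"
            self.locked_out = True          # unanswered warning: disable until parked
            return "unavailable"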


  • by Anonymous Coward

    So he admits from his own mouth that the previous technology is a killer?

    • Re: (Score:1, Insightful)

      Exactly. Lawyers are going to love that comment.
      • by K. S. Kyosuke ( 729550 ) on Monday September 12, 2016 @09:28AM (#52870303)
        The notion that newer car technology is safer than older car technology will completely blow their minds!
        • by Junta ( 36770 ) on Monday September 12, 2016 @09:53AM (#52870473)

          This is less like adding airbags and more like airbags that explode with shrapnel.

          The former is an improvement seen as the natural evolution of things. The latter is a safety problem, where a feature is likely to do more harm than good and comes with liability issues.

          Here, Tesla may be found to have been irresponsible by calling the feature 'autopilot', a name whose connotations led users to watch DVDs instead of driving. Additionally, 'beta testing' highly dangerous functionality is not something car companies generally get to do. When it comes to anything resembling autonomous vehicle operation, you can point at *any* other company and see how extremely careful and conservative they are being. Even with the similar 'lane assist' technologies already in production vehicles, competitors *already* were far stricter about monitoring driver attention than Tesla was.

          So here we have Tesla being more aggressive about how 'automatic' things are, taking fewer safety measures than the rest of the market takes for equivalent features today, and being less conservative than the truly autonomous efforts Tesla purports this technology to match.

          I know there is a desire to bow down and really kiss up to Tesla, but they need to be held to the same standards as their competitors. They are not holy saviors of our society. They aren't even the only electric car company. Their cars are certainly not the most accessible or affordable. They have done nothing to earn having a blind eye turned to their mistakes.

            • The airbag comparison fails because an exploding airbag isn't preventable by your behavior in the car, whereas the recent death was Darwinian in nature. The explosion would make the manufacturer liable, but the late driver was most certainly violating the ways in which the feature was supposed to be used, such as by paying attention to the road. At best, one could argue that the feature shouldn't be offered at all, but then you have a chicken-and-egg problem of sorts. As to the "autopilot" label, see below.
            • by Junta ( 36770 )

              This may be true, but competitors were all stricter about lane assist and more reluctant to associate the word 'auto' with these technologies in any way. Google has allowed no one but trained professional testers to use its system, and has publicly stated its opinion that a system that's 90% there is more dangerous than one that isn't there at all, because user expectations are problematic.

              Again, people always say that those who know anything about piloting aircraft know that autopilot is far more…

          • By the way, there's one more thing... Apparently, the current average vehicular death rate in the US hovers around one death per one hundred million miles driven. All Teslas together have recently crossed the one-billion-mile mark. That would imply ten deaths if Teslas were average. Yet we have only one death linked to Autopilot usage (no matter how flawed said usage was). That doesn't sound very bad for Tesla, unless there are nine more deaths associated with it.
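            Taking the quoted figures at face value (they are the poster's numbers, not independently verified), the arithmetic checks out:

                # Sanity check of the parent's figures, as quoted above.
                us_rate_deaths_per_mile = 1 / 100_000_000   # ~1 death per 100M miles (parent's figure)
                tesla_fleet_miles       = 1_000_000_000     # ~1B total Tesla miles (parent's figure)

                expected_if_average = tesla_fleet_miles * us_rate_deaths_per_mile
                print(expected_if_average)   # 10.0 expected deaths vs. the 1 Autopilot-linked death cited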
            • by elgaard ( 81259 )

              You are comparing apples and oranges here.

              There have been other fatal Tesla accidents that were not caused by Autopilot. And plenty of the fatal accidents in average cars were not caused by bad driving (i.e., you cannot assume that Autopilot would have prevented them).

          • by DrXym ( 126579 )
            If they're allowing the user to take their hands off the wheel for up to a minute, then they're still not taking safety seriously. Maybe the car is better than it was before, but it'd be better still with an attentive driver instead of one playing with their web browser because the car lets them.
            • by imgod2u ( 812837 )

              Current cars allow you to take your hands off the wheel for far longer than a minute. You may not live that long, depending on the road you're driving, but it's an option.

              I fail to see why it's somehow required that an assist feature make you do something a car without the assist feature doesn't make you do.

              • How long till someone comes up with a fake 'hand' that tricks the system into believing you maintain control of the car?

                • by Junta ( 36770 )

                  A soda can worked for some Mercedes S-Class. However, that is far more blatantly obvious as a user doing unreasonable stuff to bypass safety mechanisms, so the vendor can reasonably be considered less responsible when users have to get that 'imaginative' to be jackasses.

                • by DrXym ( 126579 )
                  I'm sure there are people who spoof their car's seat-belt sensors too. You can only do so much to prevent determined idiocy. That doesn't mean you shouldn't enforce safety at all for the majority of people who aren't idiots.
              • by Junta ( 36770 )

                Simple. In the unaided case, it's unbelievably obvious what will happen immediately. There's no perception that the car will save you. Required driver's ed covers pretty well what happens if you aren't paying attention.

                The problem is that assist technologies can let you get away with inattention for extended periods, the majority of the time. So the car has to nag the user to remind them that it is *not* safe to withdraw human attention. It's mitigating risk and doing a deceptively good job at it…

              • by DrXym ( 126579 )
                I should think the problem is extremely obvious. A normal car will crash if you take your hands off the wheel. A car with autopilot will continue to steer itself, and will appear to do so quite well for the most part. A consequence of that is that drivers WILL become inattentive and WILL do things other than paying attention to the road, hazards, other vehicles, etc. This is entirely foreseeable and obvious.

                And for the most part (that part where the car seems to be doing okay) maybe it doesn't matter if a…

          • I know there is a desire to bow down and really kiss up to Tesla, but they need to be held to the same standards as their competitors.

            Such as the sky-high moral standards of GM https://en.wikipedia.org/wiki/General_Motors_ignition_switch_recalls/ [wikipedia.org]?

            • by Junta ( 36770 )

              GM was crucified in the media for that. No one took GM's side, and that is how the reaction *should* be.

              With Tesla, tons of folks are white-knighting for a luxury car brand. It's insane.

          • by AmiMoJo ( 196126 )

            The most irresponsible thing was ignoring the vast amount of research into human attention spans. People simply can't concentrate on doing nothing for hours on end, ready to take over control of the vehicle at a moment's notice.

            People will get complacent, take a nap, watch a DVD or just zone out. It's human nature. NASA even warned them about it.

    • So with a self-driving car, what happens with the payouts when the software is at fault?

    • Re: (Score:2, Informative)

      by Anonymous Coward

      No, all he said is that he expects that the new software will keep a few more Darwin award candidates in the gene pool. There still is such a thing as personal responsibility. Even in the USA.

    • by Anubis IV ( 1279820 ) on Monday September 12, 2016 @10:24AM (#52870693)

      So he admits from his own mouth that the previous technology is a killer?

      That's faulty logic. If a hospital upgrades to some just-released equipment and is able to save more people as a result, that doesn't mean it killed the people who could have been saved had that equipment been available earlier. The state of the art is constantly getting better. Admitting that the newer stuff is safer than the older stuff doesn't mean that the older stuff was killing people. Quite the contrary: in many cases that older stuff saved a number of lives that would have been lost had people relied on the alternatives otherwise available at the time. Saying that the newer stuff is safer still just means that we have something even better now.

      Of course, I say all of this to point out the fault in your logic, not to suggest that I think Autopilot is ready for primetime already. Because it's not.

      • by Kjella ( 173770 )

        It's easy enough if it's an improvement across the board, like a new vaccine that's more effective with no new side effects. Statistically, though, the new way will often end up killing different people, which makes it hard to swallow at an individual level. For example, say you have an injury with a 5% chance of dying without surgery, but you're in a dirty field hospital where there's a 2% chance that infection kills you if you operate, and there's very little correlation between one and the other. Are your relatives going to be happy that you got a t…
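        Putting numbers on that trade-off, using the comment's 5% and 2% figures and assuming (my assumption, for illustration) that surgery otherwise fixes the injury:

            # The parent's point in numbers: the safer option still kills *different* people.
            p_die_untreated = 0.05   # injury kills 5% without surgery (parent's figure)
            p_die_infection = 0.02   # infection kills 2% of surgical patients (parent's figure)

            # With the two risks roughly uncorrelated, about 2% of patients now die who
            # would likely have survived untreated, even though overall mortality
            # improves from 5% to 2%. Statistically better, individually hard to swallow.
            net_saved = p_die_untreated - p_die_infection
            print(f"{net_saved:.0%} net reduction in deaths")  # prints "3% net reduction in deaths"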

  • by Anonymous Coward

    ...better than educating generation after generation of human drivers and relying on them to have their faculties intact every time they're behind the wheel.

  • by fustakrakich ( 1673220 ) on Monday September 12, 2016 @09:10AM (#52870215) Journal

    What does that mean? Will the vehicle quickly slow to a stop? Will it veer off a cliff or into a building or "let go of the wheel" and start swerving to scare the driver into grabbing hold? The statement doesn't make much sense.

    • by Luthair ( 847766 )
      Does it matter? The crash won't be Autopilot's fault, since it wasn't in control anymore!
      • Well, the thing should at least scream over the speakers, *Hey dumbass! You're all gonna die if you don't grab the wheel and hit the brakes!* to get the guy's attention. A little flashing light and chime won't cut it. More than one airliner has crashed because the pilot didn't know the autopilot disengaged [wikipedia.org].

        • And not just in English, but in Mandarin too!

        • Then you get people suing because the alarm was too disruptive and scared them into a wreck....

          The only way to win this battle is to literally sign a waiver for some new class of driver's license, basically stating that you understand and are aware of all the risks and best handling practices when owning/operating a car with an autopilot system.

          • Then you get people suing because the alarm was too disruptive and scared them into a wreck....

            To prevent that, you first broadcast a 'trigger word' warning.
            (over the car speakers, in a very calm voice) "Hey, bro...I'm about to yell at you quite loudly, because there is a very scary moment coming up, and I don't know how to handle it. We'll need you to go ahead and take the wh..." (crunch)
          • Not too comforting to the station wagon full of nuns in the next lane over. The problem with things like this is that it has to be all or nothing: either it works or it doesn't. On the other hand, auto-parking isn't such a bad idea. At least that could get you within walking distance of the curb without playing bumper cars.

            • Not any better than a driver's license either, though; anyone can steal a pair of keys and just drive a car today without any knowledge of how to drive. The point was that the driver would be liable and not the autopilot, or at least that the norm would be to assume driver error should they actually be using the feature during a wreck.

              I guess to me this isn't any different from someone using cruise control: you should still be ready to use the pedal at all times, no matter what.

        • by Anonymous Coward

          I'd prefer the computer to say "Take the wheel or I swear I'll drive this car into a tree!" and/or "Don't make me pull this car over!" and "Don't make me come back there!"

          Add in an occasional dad joke (like replying to "I'm hungry" with "Hi Hungry, can I just call you Hun?"), plus answers to "Are we there yet?" such as "We'll get there when we get there and not a minute sooner!", and you've got the perfect autopilot we all recall from childhood family road trips.

        • "Terrain, Terrain, Terrain. Pull Up. Terrain. Terrain. Terrain. Pull Up."

          That's what planes do.

          No sense getting personal.

    • What does that mean? [...] The statement doesn't make much sense.

      It makes sense to anyone who actually cares about this issue, as defined by "has been following it closely enough to know about it." To someone like you, who is only here to talk shit and doesn't know any of the particulars of the issue, it might not make sense. But to everyone else, who knows that the system already shuts itself off (with a warning) if drivers have their hands off the wheel for too long, this is not a shocking or confusing piece of news.

      Maybe you should go back to the beginning, and read u…

      • by Anonymous Coward

        Well! Happy Monday to you too! Why are you being such a dick? Haven't had your coffee or beat the wife and kid yet? Where's your priorities? First things first, man!

      • and the safety briefing drivers are required to receive before Tesla will turn it on.

        Is it a safety briefing or a EULA?

    • I'm guessing it gives the warning, and if you don't take heed, it won't work the next time you try to engage it (but will continue to work until you do retake control this time).

      So you'll just need to find a convenient roundabout so you can dive out and leave the car running until you're ready to hop back in again the next day.

    • My car has an almost-self-driving capability when the lane-departure assist is activated. But when driving on the freeway in heavy snow last winter, as soon as the optical system couldn't see the road because of the snow building up, all the automation shut down, with lots of visual and auditory warnings to let me know it was shutting down.

      I imagine this is the same sort of thing: auditory and visual warnings to let the driver know the system is switching to fully manual operation.

    • What does that mean?

      Same thing it means now: slowly bring the car to a stop. This facility is already in place.

  • by Anonymous Coward

    I was getting worried there for awhile when I hadn't seen an Elon Musk story for like 9 hours. I was beginning to fear the worst, but I think I'm OK now.

    Phew!

  • by ledow ( 319597 )

    According to the BBC article:

    http://www.bbc.co.uk/news/tech... [bbc.co.uk]

    "Overhead signs or bridges can also be misinterpreted if the road dips. To combat this, Tesla cars are going to be used to 'learn' about the road. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar," Mr Musk wrote.
    "The car computer will then silently compare when it would have braked to the driver action and upload…

    • by Anonymous Coward

      They're not whitelisting any/all objects in a particular location.
      They are whitelisting a specific radar signature in a specific location.
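      A minimal sketch of what such a geocoded whitelist could look like. The grid resolution, signature hashing, function names, and thresholds here are guesses for illustration, not Tesla's actual design:

          # Hypothetical geocoded radar whitelist, per the parent's description:
          # a radar return is only ignored at the specific spot where the fleet
          # has repeatedly seen that signature and drivers did not brake.

          whitelist = {}   # (lat_bin, lon_bin, signature) -> benign sighting count

          def grid_key(lat, lon, signature):
              # ~1e-4 degrees is roughly 10 m: coarse enough to absorb some GPS jitter
              return (round(lat, 4), round(lon, 4), signature)

          def record_benign_sighting(lat, lon, signature):
              """Fleet 'shadow mode': car would have braked, but the driver did not."""
              k = grid_key(lat, lon, signature)
              whitelist[k] = whitelist.get(k, 0) + 1

          def should_brake(lat, lon, signature, min_sightings=100):
              """Brake unless this signature at this spot is well attested as harmless."""
              return whitelist.get(grid_key(lat, lon, signature), 0) < min_sightings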

      • So now you can have GPS accuracy issues and DB update issues that can lead to a big mess.

        • I'd suggest not flying, or taking a modern boat or even a bus, if you're worried about stuff like this.

          Just walk alongside the freeway - you'll be fine.

  • Elon Musk implies next autopilot version would have prevented death(s) caused by current autopilot version.
  • What would the world do without him? Now he's claiming his rocket blew the fuck up because it was hit by a UFO.

  • -Fixed bug where the car would crash into things if the sun was too bright. It looks like that Bill Gates car industry joke is coming true.
  • Elon spoke frankly and with candor about the technology. Most other manufacturers don't do that; they hide behind bland statements and corporate spokesmen. They have learnt the hard way that being frank and open leads to lawsuits, so they hide behind these bland, useless press releases. That makes them look cagey and shifty, and when something actually does come out, it is seen a lot more harshly than it deserves. Let's see what happens with this crash.
  • The driver can still take their hands off the wheel for up to a minute. During that time they could be turned around fetching something out of the back, eating a sandwich, playing with Twitter - anything except actually paying attention to the vehicle they are inside: a 2-ton vehicle hurtling down the road at 70mph.

    So yeah, maybe the new software makes the car better at not crashing into trucks. It sure as hell isn't better than if the car AND the driver were both attentive to the road. Humans are excell…

    • by DrXym ( 126579 )
      Correction - 1 ton
      • by fnj ( 64210 )

        Where do you find a car weighing 1000 kg nowadays? A Mini was 600-700 kg in 1960. A VW Bug was under 1000 kg. Today a Mini is 1200-1400 kg, a VW Golf is 1300-1400 kg, a Prius is 1400 kg, a Tesla Model S is 2000-2200 kg, a Ford Explorer is 2000-2200 kg, a Fiat 500 is 1100 kg, and a Yaris is 1100 kg.

        Cars nowadays, even tiny ones, are bricks.

  • by fluffernutter ( 1411889 ) on Monday September 12, 2016 @10:31AM (#52870755)
    When I first read about this comment, all that ran through my mind was the voice of Goofy saying, "Heeeyuk, well I guess THAT didn't work, let's try something different!"
  • Next week: "If only they'd stayed on HEAD, we wouldn't be having these problems. Also, your kernel is old and you're not running the latest version of systemd. Why don't you just hook into our Jenkins server at http://carautopilot.github.io/ [github.io] so you can get the latest nightly before you head out on the road each morning?"

  • by superdave80 ( 1226592 ) on Monday September 12, 2016 @12:57PM (#52872221)
    When I found out the cause of the fatal wreck (the sky was cloudy white, and so was the trailer!), I couldn't believe it. You would think that the FIRST thing you do with an autopilot program is make sure it can see properly in front of the vehicle. Or combine it with radar. Or... something. But an autopilot that runs full speed into a giant truck/trailer without even realizing it is there is a complete and utter failure. What would happen with a blue trailer that was a similar color to a blue sky? Or an empty flatbed trailer? Would it run into those as well?
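    The radar-first change described in the summary amounts to exactly that: don't let one blinded sensor veto braking. A toy version of such a fusion rule (the confidence values, threshold, and the rule itself are illustrative assumptions, not Tesla's):

        # Toy fusion rule: treat radar as a first-class sensor instead of
        # requiring the camera to confirm. A white trailer against a white sky
        # can zero out camera confidence while the radar return stays strong.

        def obstacle_in_path(camera_conf, radar_conf, threshold=0.7):
            # both confidences normalized to [0, 1]; threshold is illustrative
            return camera_conf >= threshold or radar_conf >= threshold

        print(obstacle_in_path(camera_conf=0.05, radar_conf=0.9))  # True: radar alone triggers braking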
    • You would think that the FIRST thing you do with an autopilot program is to make sure it can see properly in front of the vehicle.

      Define "see." You're assuming that the human would have made a better decision. Given how many accidents happen on a daily basis with humans in control, I don't think that's a conclusion you can draw. But there's one amazing thing here: iteration.

      When a human has an accident, you can't prevent it from happening again. If you run a red light and t-bone someone else, the entire world can't learn from it. Yet here we have a case where this accident likely won't happen in the future. I will happily tolerate many deaths…

      • You're assuming that the human would have made a better decision.

        I think most humans would NOT have kept driving at full speed into a giant tractor trailer. But that's just a guess.

        Now, don't think that I am against self-driving cars; my post specifically mentioned Tesla's Autopilot. In the long term, I think self-driving cars will prevent many, many more deaths than they might cause. My issue is that the very first thing any self-driving car should be able to do is know if something is blocking its path. Literally the first thing it should 'learn' to do. Given t…

        • I think most humans would have NOT kept driving at full speed into a giant tractor trailer. But that's just a guess.

          And yet it happens on a daily basis, without anything as stupid as the trailer being the same colour as the sky.

          Really? Why would you think that?

          Re-read my post.

          If my computer calculates 1+1=5 and I figure out why and correct it to 1+1=2, then the problem is resolved. That doesn't mean 2+2 won't equal 5 in the future, but the 1+1 situation has been corrected.

          Now compare that to a human. It's hard enough teaching one person something; it's not possible to correct it for everyone.

          Thanks for offering to Beta Test the Tesla autopilot system

          If I had a Tesla, you bet your arse I would be using the autopilot sy…
