Transportation AI

New Questions Raised about Tesla's 'Autopilot' Safety After Three Fatalities This Week (startribune.com) 162

The Associated Press looks at three new fatalities involving Teslas this week, saying the crashes have "increased scrutiny of the company's Autopilot driving system just months before CEO Elon Musk has planned to put fully self-driving cars on the streets." Last Sunday, a Tesla Model S sedan left a freeway in Gardena, California, at a high speed, ran a red light and struck a Honda Civic, killing two people inside, police said.... Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, said it's likely that the Tesla in Sunday's California crash was operating on Autopilot, which has become confused in the past by lane lines. He speculated that the lane line was more visible for the exit ramp, so the car took the ramp because it looked like a freeway lane. He also suggested that the driver might not have been paying close attention. "No normal human being would not slow down in an exit lane," he said...

On the same day, a Tesla Model 3 hit a parked firetruck on an Indiana freeway, killing a passenger in the Tesla... In both cases, authorities have yet to determine whether Tesla's Autopilot system was being used... Many experts say they're not aware of fatal crashes involving similar driver-assist systems from General Motors, Mercedes and other automakers. GM monitors drivers with cameras and will shut down the driving system if they don't watch the road. "Tesla is nowhere close to that standard," Rajkumar said. He predicted more deaths involving Teslas if the National Highway Traffic Safety Administration fails to take action...

And on Dec. 7, yet another Model 3 struck a police cruiser on a Connecticut highway, though no one was hurt... [T]he driver told police that the car was operating on Autopilot, a Tesla system designed to keep a car in its lane and a safe distance from other vehicles.

Comments Filter:
  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Sunday January 05, 2020 @02:37PM (#59589472) Homepage Journal

    "No normal human being would not slow down in an exit lane"

    Not only do they SPEED UP in exit lanes here in California to try to beat the light, they'll also go slow in the acceleration lane when getting on the freeway, cross over onto the shoulder and treat it as a right turn lane, and a lot of other asinine and dangerous shit.

    Totally ignorant of human selfishness, I see.

    • Re: (Score:3, Insightful)

      by Gavagai80 ( 1275204 )

      More importantly, he simply doesn't know whether autopilot was on. Instead of waiting to find out from the people who can and do check such things, he makes up an answer. The media writes about the made up answer, and consumers eagerly lap it up, spreading ignorance far and wide.

      • The best part is that there'll be no followup article correcting the details after they appear.

        • More importantly, he simply doesn't know whether autopilot was on. Instead of waiting to find out from the people who can and do check such things, he makes up an answer. The media writes about the made up answer, and consumers eagerly lap it up, spreading ignorance far and wide.

          The best part is that there'll be no followup article correcting the details after they appear.

          The NHTSA is investigating them as they appear to be autopilot-related. How else would you report that?

      • by lgw ( 121541 )

        Seriously. The guy who hit the police car? What sort of credibility does "it was the autopilot, officer, honest" have?

        But the firetruck sounds a lot like the earlier crash into a truck crosswise. Both seem like views that would be very rare in the training data, which shows the real limits of machine learning: it has gaps that you simply cannot predict from successful behavior.

        The human mind learns from examples, then abstracts and generalizes, so once you can demonstrate that you can recognize and avoid

        • Seriously. The guy who hit the police car? What sort of credibility does "it was the autopilot, officer, honest" have?

          That was my first thought - if I'd hit a police car while in a vehicle with autopilot, my first thought (okay, second after "I'm still alive!?!") would be "can I blame the car instead of taking the heat?"

      • by qubezz ( 520511 )
        We need a comment, let's call somebody at CMU, he's on the list of people who will give a quote without us having to mention the pay-per-quote "research" company...
    • by labnet ( 457441 )

      Yep. Coming from the land of convicts for a 3,000-mile journey around Cali, I thought I’d entered the world of Mad Max, USA-style, on the freeways.
      Time is money, and money is more important than life!

    • The first example sounds like the driver being dumb. Even using my phone sitting in my cupholder, the GPS is usually accurate enough to know what lane I'm in. If Tesla's programmers are even half competent, the car would know pretty quickly via GPS that it's not on the freeway anymore.
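
      A minimal sketch of that GPS check, with a made-up freeway centerline and threshold (nothing here reflects Tesla's actual code):

          # Hypothetical sketch: flag when a GPS fix drifts away from a known
          # freeway centerline. Coordinates and threshold are invented.
          import math

          def haversine_m(lat1, lon1, lat2, lon2):
              # Great-circle distance in meters between two lat/lon points.
              r = 6371000.0
              p1, p2 = math.radians(lat1), math.radians(lat2)
              dp = math.radians(lat2 - lat1)
              dl = math.radians(lon2 - lon1)
              a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
              return 2 * r * math.asin(math.sqrt(a))

          def off_freeway(position, centerline, threshold_m=30.0):
              # True if the fix is farther than threshold_m from every sampled
              # centerline point, i.e. the car has probably left the freeway.
              lat, lon = position
              return all(haversine_m(lat, lon, clat, clon) > threshold_m
                         for clat, clon in centerline)

          # Made-up points roughly near Gardena, CA:
          freeway = [(33.8855, -118.3090), (33.8860, -118.3102), (33.8866, -118.3115)]
          print(off_freeway((33.8856, -118.3092), freeway))  # False: still on the freeway
          print(off_freeway((33.8890, -118.3050), freeway))  # True: likely off on a ramp

      Consumer GPS is only accurate to a few meters, so a check like this can say "off the freeway" but not reliably which lane you are in.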

    • "No normal human being would *"

      The ever appropriate answer to that is, "Wait, hold my beer."

      The scientific answer is, Let's try it again to see what happens [xkcd.com]. Of course I've accelerated out an off-ramp. The annoying ones are the people who decelerate before they get to the offramp, slowing traffic on the freeway behind them.

  • by Dirk Becher ( 1061828 ) on Sunday January 05, 2020 @02:49PM (#59589494)

    When to ignore the rules.

    • That's not true. The road rules are not hard-set in the cars. You can speed while in Autopilot, for example.
      • Then that's you ignoring the rules.

      • You can speed while in Autopilot, for example.

        Only by a limited amount. I believe it is 5 mph over the limit.

        This is okay on multi-lane highways, where I can just keep to the right in the slow lane.

        But Autopilot is a hassle on single-lane-in-each-direction roads. Cars back up behind me, and under California law, if there are five or more cars behind me, I am legally required to pull off the road and let them pass. It is easier (and quicker) to just drive in manual mode.

        • by AmiMoJo ( 196126 )

          5 mph over the programmed limit. Often the limit is wrong so you can either speed a lot or the car slows right down and people behind get frustrated.
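
          A sketch of how that offset cap behaves, taking the 5 mph figure from the comment above (the number and the clamping logic are assumptions, not Tesla's code):

              # Hypothetical sketch: clamp the requested cruise speed to the
              # mapped speed limit plus a fixed offset (5 mph, per the thread).
              OFFSET_MPH = 5

              def allowed_set_speed(requested_mph, mapped_limit_mph):
                  # The ceiling tracks the *mapped* limit, so a stale map entry
                  # shifts the whole allowed range, which is the frustration above.
                  return min(requested_mph, mapped_limit_mph + OFFSET_MPH)

              print(allowed_set_speed(70, 55))  # 60: capped at 55 + 5
              print(allowed_set_speed(70, 45))  # 50: a wrong map limit slows everyone down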

    • by gTsiros ( 205624 )

      that would imply that they know when to follow those rules.

      they do not follow any of those rules.

      it's a piece of software.

    • Whose rules are you talking about? The rules that govern the function of autonomous driving, or the road rules? Tesla's Autopilot already ignores some road rules in the name of safety. Specifically, rules like not coming to a stop on a motorway. You seem to forget that self-driving cars are designed and "taught" by people. They will approach rules the way people teach them.

  • No, not the Boeing MCAS. I think he's blaming it on the "My Car Ain't Safe" system.
  • No normal human being would not slow down in an exit lane,

    This is simply not true. You see people exiting a freeway, not slowing down at all until they are close to a light.

    Or what if they were drunk and trying to run a light that was turning red at the end of the exit ramp? Then I could easily see them *accelerating* in an exit lane, misjudging speed and ramming the car into something. That kind of thing happens all the time. That's one of the drawbacks of a car with really good performance: it's very

    • by lgw ( 121541 )

      I don't see that. Seems a coin toss in that case. Misunderstanding an exit lane has been a problem in the past, but OTOH I've twice seen close up a human driver crash through a red light at full speed, causing a serious accident. I'd bet on the firetruck being autopilot, given its history with stopped trucks. Of course, I'd bet the other way on the police car. Who's he kidding?

      I'm sure all of these will be investigated fully.

  • by RossCWilliams ( 5513152 ) on Sunday January 05, 2020 @03:43PM (#59589660)
    Why would you want autopilot if you still have to pay close attention and anticipate when it won't work as expected? This is just a marketing gimmick if used as directed.
    • by MobyDisk ( 75490 )

      For the same reasons that people use cruise control.

    • Because a marketing gimmick that kills people also saves people. I'm not talking about Autopilot specifically, but about any such self-driving feature. I've already been in a vehicle with similar such features when an incident in front of me forced me to react. I didn't have my hands on the wheel at the time, but by the time I grabbed the wheel and put my foot to where I thought the brake pedal was, the car was already slowing down rapidly. Not sure if I would have hit the car in front of me or not had that system not be

  • by thegarbz ( 1787294 ) on Sunday January 05, 2020 @03:48PM (#59589676)

    get into an accident. Yet normal humans seem to do enough stupid shit to kill 3300 people every day across the world. Claiming the car must have been on Autopilot because a human wouldn't take an exit quickly or run a red light is among the dumbest thoughts ever put into writing.

    I do wonder how many people see that T logo and just assume autopilot was the fault. I also wonder how many people try to blame autopilot for their own stupidity. Hitting a car on the side of a freeway? Well that happened to a delivery van in front of me on the A12 late November, and that old black smoke belching piece of shit certainly didn't have autopilot.
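
    A quick sanity check on that 3,300-a-day figure, against the WHO's estimate of roughly 1.35 million road deaths per year worldwide:

        # 3,300 deaths per day works out to the same order of magnitude
        # as the WHO's ~1.35 million road deaths per year.
        print(3300 * 365)        # 1,204,500 per year implied by the comment
        print(1_350_000 / 365)   # ~3,700 per day implied by the WHO figure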

    • by AmiMoJo ( 196126 )

      Let me tell you about this thing called "liability".

      If you drive badly and kill two people you are liable for their deaths and will go to prison.

      Tesla has created a system that encourages the driver to not pay attention and then does little to enforce its attention-paying rules. There are better systems out there; Tesla had ample opportunity to improve theirs but didn't.

      • Tesla has created a system that encourages the driver to not pay attention

        Tesla didn't create that, they just packaged all the systems together and gave it a catchy name. Autopilot is not something magic; it's an amalgamation of safety features: "lane keeping assistance", "automatic emergency braking", "adaptive cruise control", "forward collision detection" and "side collision detection".

        Claiming it's Tesla's liability for packaging actual safety features, which have a history of reducing accidents, while at the same time encouraging drivers not to pay attention is just silly. Cl
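
        Read that way, "Autopilot" is just those assists bundled behind one switch. A toy sketch of the packaging (component names are generic, not Tesla's internals):

            # Toy sketch: "Autopilot" as a bundle of ordinary driver-assist
            # features behind one name. Names are generic, not Tesla's code.
            FEATURES = {
                "lane_keeping_assistance": True,
                "automatic_emergency_braking": True,
                "adaptive_cruise_control": True,
                "forward_collision_detection": True,
                "side_collision_detection": True,
            }

            def autopilot_available():
                # The catchy name is just every underlying assist reporting ready.
                return all(FEATURES.values())

            print(autopilot_available())  # True when all bundled assists are up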

  • FUD!

  • by 140Mandak262Jamuna ( 970587 ) on Sunday January 05, 2020 @04:15PM (#59589744) Journal
    The research interests of Prof. Rajkumar [cmu.edu], according to his self-curated page at CMU.

    He seems to be mostly a real-time OS guy, with scheduling and networking work. His research seems to be on the networking aspects of autonomous vehicles, not autonomous driving itself. Disappointed he is engaging in speculation without any evidence to indicate that Autopilot was engaged. He commented about Autopilot's behavior on lane markings etc., but he does not seem to have done work on machine vision, scene detection, or autonomous driving algorithms.

  • Require a dead-man-switch action for all autopilots and disallow hands-free driving, period. If the driver takes both hands off the wheel, or places only one hand on the wheel at the 6 o'clock position, the autopilot slows the car down and issues a visual and verbal warning. Humans are not ready for hands-free driving and perhaps never will be, because unlike airplane pilots there is little to no time to correct while driving on winding roads or bumper to bumper on a freeway.

    There is nothing scarier than watching idio
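
    A minimal sketch of that dead-man behaviour (the sensor model, thresholds, and escalation steps are all invented for illustration):

        # Hypothetical sketch of the proposed dead-man supervision: a sensor
        # reports how many hands are properly on the wheel, and anything less
        # than two escalates to a warning and then a slowdown. Thresholds are
        # made up.
        WARN_AFTER_S = 2.0
        SLOW_AFTER_S = 5.0

        def supervise(hands_on_wheel, seconds_noncompliant):
            # Returns the action the autopilot should take this control cycle.
            if hands_on_wheel >= 2:
                return "normal"
            if seconds_noncompliant < WARN_AFTER_S:
                return "normal"  # brief adjustments are tolerated
            if seconds_noncompliant < SLOW_AFTER_S:
                return "visual_and_verbal_warning"
            return "slow_vehicle"  # reduce speed until both hands return

        print(supervise(2, 10.0))  # normal
        print(supervise(1, 3.0))   # visual_and_verbal_warning
        print(supervise(0, 6.0))   # slow_vehicle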

  • The driver needs to pay attention. If they do not, things like this happen. That said, this feature has probably saved a lot more people than it has killed. And that is the real criterion here, not whether driving automation kills people. Sure it does, and it will continue to do so (even at SAE 5), but does it kill far fewer people than human drivers? (A rough rate comparison is sketched further down the thread.) And that is a definite yes and will remain one. Humans are _bad_ at driving, but most do not admit that, usually not even to themselves.

    • I agree with your points but would just add, I wish we had better data available.
      • by gweihir ( 88907 )

        I agree with your points but would just add, I wish we had better data available.

        I agree with that as well.
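
        Concretely, the yardstick would be deaths per 100 million vehicle miles. The US human-driving baseline of roughly 1.1 is NHTSA's published figure; the Autopilot inputs below are placeholders, exactly the data the thread wishes were public:

            # Deaths per 100 million miles is the usual yardstick. The human
            # baseline (~1.1 for the US, per NHTSA) is real; the Autopilot
            # inputs are placeholders, not actual data.
            HUMAN_RATE = 1.1  # deaths per 100M vehicle miles, US average

            def rate_per_100m_miles(deaths, miles):
                return deaths / (miles / 100_000_000)

            autopilot_deaths = 3              # placeholder
            autopilot_miles = 1_000_000_000   # placeholder

            r = rate_per_100m_miles(autopilot_deaths, autopilot_miles)
            print(f"{r:.2f} vs human {HUMAN_RATE}")  # 0.30 vs 1.1 with made-up inputs

        Even with real numbers the comparison is confounded: Autopilot miles skew toward highways, which are safer per mile than average driving.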

    • by f00zbll ( 526151 )

      There is a well-known fact about advanced driver assists like Tesla's Autopilot. When Google started their autonomous vehicle project, they saw that people acted irresponsibly, and they immediately shut it down in favor of going directly to Level 4/5. This was kind of well known in the autonomous vehicle industry.

      Could Tesla have designed their "autopilot" so you have to pay attention and keep your hands on the steering wheel at all times? Yes, they could have, but Elon chose not to. Tesla could have used eye-tracking

      • by gweihir ( 88907 )

        It really depends on how many people this thing has saved. It seems rather unlikely that it has killed anywhere near the number of people saved. If so, Musk made exactly the right decision, because pushing this even with the methods he used will have saved more people than it killed.

        Things are far from as simple as you try to describe them.

  • by psergiu ( 67614 ) on Monday January 06, 2020 @12:11AM (#59591018)

    Just got a Tesla Model 3. No Full Self Driving - just the basic Autopilot.
    When you first enable the "Smart-ish Cruise Control" and "Smart-ish Lane Keeper" you have to accept a EULA that says the features are in Beta, the car WILL most likely crash into any objects on the road, and YOU have the obligation to pay attention and apply the brakes.
    With the latest v10 software update you have the option to see on the screen what the car's computer thinks is around you in real time. Just a peek at that nightmarish visualization of morphing and twitching vehicles and phantom traffic cones convinced me I'd better pay attention to the road myself.

  • But if you think Autopilot is ACTUALLY an autopilot, you're fucking stupid and shouldn't be allowed a car of ANY sort.
    Nor the right to breed.
