Transportation | AI | Government | United States

Uber Halts Self-Driving Car Tests in Arizona After Friday Night Collision (businessinsider.com) 227

"Given that the Uber vehicle has flipped onto its side it looks to be a high speed crash," writes TechCrunch, though Business Insider reports that no one was seriously injured. An anonymous reader quotes their report: A self-driving Uber car was involved in an accident on Friday night in Tempe, Arizona, in one of the most serious incidents to date involving the growing fleet of autonomous vehicles being tested on U.S. roads... Uber has halted its self-driving-car pilot in Arizona and is investigating what caused the incident... A Tempe police spokesperson told Bloomberg that the Uber was not at fault in the accident and was hit by another car which failed to yield. Still, the collision will likely to turn up the temperature on the heated debate about the safety of self-driving cars.
  • by Bruha ( 412869 ) on Saturday March 25, 2017 @03:43PM (#54109199) Homepage Journal

    Conversations would be different if the Uber car was at fault, but not all accidents can be avoided.

    • by geekmux ( 1040042 ) on Saturday March 25, 2017 @03:49PM (#54109229)

      Conversations would be different if the Uber car was at fault, but not all accidents can be avoided.

      Conversations will be different when an autonomous car is at fault due to a hack, since security is rarely prioritized over everything else.

    • by Derec01 ( 1668942 ) on Saturday March 25, 2017 @03:51PM (#54109237)

      I appreciate the point that, statistically, this *will* happen as some accidents are unavoidable. You're absolutely correct and we should look at the bigger picture.

      However, I'm skeptical of reports where the self-driving car is not at fault because the other car "failed to yield". Being legally in the right doesn't necessarily mean the car was driving well or defensively, and these are the particular situations where a human might have been clued in to the other driver's behavior and avoided it entirely.

      • On the other hand, self driving cars don't get mad at other drivers making a mistake and try to get back at them, causing all kinds of dangerous situations.

        • On the other hand, self driving cars don't get mad at other drivers making a mistake and try to get back at them, causing all kinds of dangerous situations.

          Oh, yeah? Says who...? An autonomous vehicle might be programmed to drive "aggressively" to get through traffic jams faster. They'll give the feature some innocuous title like "affirmative driving".

          What you'll end up with is autonomous vehicles playing "chicken" with each other. An autonomous vehicle will not win any races by driving cautiously.

          Anyway, the point is moot, because Über is not at fault in the same way that Über is not a taxi company. Über is a newfangled economy company...

          • by religionofpeas ( 4511805 ) on Saturday March 25, 2017 @04:43PM (#54109493)

            If everybody drives aggressively, traffic jams will get worse. When there are sufficient self-driving vehicles, they'll probably come up with some communication protocol so they can synchronize their strategies and achieve optimal road use, benefiting everybody.
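To make that concrete, here is a minimal sketch of what such a coordination scheme might look like, assuming vehicles can broadcast an intent message. Everything here - the message fields, the names, the merge rule - is hypothetical, not any real V2V standard:

```python
# Hypothetical sketch of a cooperative merge: each vehicle broadcasts
# its ETA at a shared merge point, and every vehicle derives the same
# first-come-first-served ordering locally, so nobody gains anything
# by driving aggressively.
from dataclasses import dataclass

@dataclass(frozen=True)
class MergeIntent:
    vehicle_id: str
    eta_s: float  # estimated arrival at the merge point, seconds from now

def merge_order(intents: list[MergeIntent]) -> list[str]:
    """Deterministic ordering every vehicle can compute on its own:
    earliest ETA first, vehicle_id as the tie-breaker."""
    return [i.vehicle_id for i in sorted(intents, key=lambda i: (i.eta_s, i.vehicle_id))]

broadcasts = [MergeIntent("car_a", 4.2), MergeIntent("car_b", 3.1), MergeIntent("car_c", 3.1)]
print(merge_order(broadcasts))  # ['car_b', 'car_c', 'car_a']
```

The design point is that the ordering is deterministic and computable by everyone from the same broadcasts, which is exactly what removes the incentive to play "chicken".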

      • by fluffernutter ( 1411889 ) on Saturday March 25, 2017 @04:10PM (#54109311)
        The only question that matters is whether a human would have avoided the accident. If one easily could have, then this accident was caused by the self-driving system. It doesn't matter what side of the law Uber was on.
        • by mjwx ( 966435 )

          The only question that matters is whether a human would have avoided the accident. If one easily could have, then this accident was caused by the self-driving system. It doesn't matter what side of the law Uber was on.

          A good driver absolutely would have avoided that collision.

          The problem with computers is that they don't take into account that people will break the rules and do stupid things; a defensive driver assumes someone will do the dumbest thing possible. Drivers who are about to pull out in front of you give a lot of cues to their behaviour: everything from the way they're looking at you to rocking back and forth to revving and creeping. A good driver learns to pick up on these cues.

          The thing is, AI isn't...

      • and these are the particular situations where a human might have been clued in to the other driver's behavior and avoided it entirely.

        Based on what I've seen of other self-driving cars, this is something that computers will very quickly be better at than humans as well. In aggregate, humans are frigging horrible at situational awareness on the road and even worse at driving defensively (tip: defensive driving is the opposite of being a tailgating jackass).
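For what it's worth, the behavioural cues mentioned upthread are not inherently unprogrammable; they can be encoded as inputs to a risk estimate. A toy sketch with entirely invented weights and thresholds (a real perception stack would learn these from data rather than hand-code them):

```python
# Toy illustration: folding observable "driver cues" (creeping forward,
# indecisive steering) into a single pull-out risk score. The weights
# and thresholds are invented for illustration only.
def yield_risk(creep_speed_mps: float, heading_jitter_deg: float,
               gap_time_s: float) -> float:
    """Return a 0..1 estimate that the other car will pull out.
    More creep, more jitter, and a tighter gap all raise the risk."""
    risk = 0.0
    risk += min(creep_speed_mps / 2.0, 1.0) * 0.5       # rolling toward the lane
    risk += min(heading_jitter_deg / 10.0, 1.0) * 0.2   # wobbling/indecision
    risk += max(0.0, (3.0 - gap_time_s) / 3.0) * 0.3    # barely-adequate gap
    return min(risk, 1.0)

# A car creeping at 1 m/s, wobbling 6 degrees, with a 2-second gap:
print(f"risk={yield_risk(1.0, 6.0, 2.0):.2f}")  # risk=0.47 -> ease off preemptively
```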

    • Conversations would be different if the Uber car was at fault, but not all accidents can be avoided.

      But there are also accidents that one could have avoided even when not at fault. This could very well be one of those cases. I have avoided accidents where another driver has not properly yielded more than a few times. It's a matter of not trusting the other driver to do the right thing. It's the kind of thing that is very hard to program into an automated system.

      • It's about fear of being in an accident, and the inconvenience that it brings regardless of whether it is your fault or not. When someone cuts you off, most humans won't say "oh well" and hit them; they will still try to avoid an accident. How do you give AI a fear of this?
      • It's the kind of thing that is very hard to program into an automated system.
        It is absolutely not hard to program into an automated system, facepalm.
        Avoid collisions, easy.

        • It is absolutely not hard to program into an automated system, facepalm. Avoid collisions, easy.

          So easy, even Uber can do it.... oh wait a minute. Maybe avoiding collisions is easier when they are anticipated.
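Both sides of this exchange have a point: the reaction is a few lines of code, while the anticipation is the hard part. A one-dimensional toy sketch of the easy half, with made-up numbers:

```python
# One-dimensional toy of the "easy" part: brake when the predicted
# time-to-collision with a crossing car drops below a safety margin.
# What this sketch ignores - predicting the other car's path at all -
# is the genuinely hard part.
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing, so no collision on this course
    return gap_m / closing_speed_mps

def should_brake(gap_m: float, closing_speed_mps: float, margin_s: float = 2.0) -> bool:
    return time_to_collision(gap_m, closing_speed_mps) < margin_s

print(should_brake(gap_m=25.0, closing_speed_mps=15.0))  # True: 1.67 s < 2 s
print(should_brake(gap_m=60.0, closing_speed_mps=15.0))  # False: 4.0 s
```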

          • Re: (Score:2, Insightful)

            A newcomer like Uber cannot do it.
            Why newcomers are allowed to test their bullshit on real roads when Audi, Toyota, BMW, Mercedes, etc. have had self-driving cars for decades is beyond me.

    • Conversations would be different if the Uber car was at fault, but not all accidents can be avoided.

      There are accidents that nobody could avoid, there are accidents that I cannot avoid, and accidents that could be avoided.

      Obviously, self-driving cars will initially have to be clever enough to cause accidents only very, very rarely. Once that is achieved, they will also try to avoid avoidable accidents where someone else is at fault.

    • The other car was a Tesla in autonomous mode whose driver was watching a Disney movie.

  • by StevenMaurer ( 115071 ) on Saturday March 25, 2017 @03:48PM (#54109225) Homepage

    I am reminded that when cars were first invented, there were laws put in place mandating that someone walk ahead of any self-propelled vehicle waving a red flag [wikipedia.org], for fear of scaring horses and making people uncomfortable.

    I'm sure that in one hundred years this sort of reaction - blaming the software for an inattentive driver failing to yield - will be seen in exactly the same way.

    • I hear ya, man! If there are two things I trust in life, it's large corporations and software. When you combine those two things, it's like chocolate and peanut butter!
    • So people should just go around crashing into people who cut them off then? 'Too bad, you made a mistake, I have no obligation to prevent an accident.' That's bullshit.
      • So people should just go around crashing into people who cut them off then?

        Note that the Uber car did NOT crash into the car that cut them off. The car doing the "cutting off" ran into the Uber car (I'm assuming it hit the Uber car on its side, since TFA refers to the Uber car being knocked over on its side).

        Now, if the human driver of the other vehicle had decided that the Uber car had "cut him off" and crashed into the Uber car on purpose, that would fit your description nicely.

        Alas, the Uber car had right of way.

        • I'm not arguing the legal side of this; if the human driver was at fault, they were at fault. That does not necessarily mean the accident couldn't have been avoided by the person NOT at fault.
    • by Gordo_1 ( 256312 )

      The most hilarious part of that wiki entry is the Virginia proposal requiring drivers to rapidly disassemble their car and hide the parts behind bushes at the first sign of livestock. It would have become law had the Governor not vetoed it.

      It fascinates me that we haven't really progressed at all as a species in 120 years. People will be up in arms at the first sign of autonomous vehicles crashing, even if and when they're literally proven to be, say, 100x safer than humans. You will have websites popping up with...

      • The problem is, statistics don't matter if an automated car kills someone in a situation where a human wouldn't have. One day, if they are 100x safer, I would hope they would also be safe in all situations where a human would be.
        • by lgw ( 121541 )

          Who cares? If the net result is fewer deaths, then that's the net result. Your fear of the new and your Luddite instincts don't factor in.

    • I am reminded that when cars were first invented, there were laws put in place mandating that someone walk ahead of any self-propelled vehicle waving a red flag, for fear of scaring horses and making people uncomfortable.

      Not automobiles as we know them.

      Steam-powered road tractors, mammoth agricultural tractors, and heavy construction equipment, ca. 1860-1896. Think township or county roads that were dirt or gravel tracks barely more than a single lane wide. Now do you know why you needed a flagman?

    • by Solandri ( 704621 ) on Saturday March 25, 2017 @07:10PM (#54110105)
      Most people try to pin the blame for an accident on a single cause. Most liability laws are based on this same (erroneous) concept.

      Airline accident investigations are really good at demonstrating how an entire chain of events led up to the accident. And that any single factor happening differently could've prevented the accident. e.g. The Concorde crash was caused by (1) debris on the runway from a faulty repair on a previous plane, (2) failure of the Concorde's tires when it struck the debris, (3) failure of the undercarriage to withstand tire debris striking it from a blowout at take-off speed, (4) the manufacturer not making any procedures or provisions to recover from a double engine failure on a single side because it was considered so unlikely. Any one of these things doesn't happen and the Concorde doesn't crash.

      Safety systems layer multiple accident-avoidance measures on top of each other. This redundancy means that only when all of those measures fail is there an accident. Consequently, even if the self-driving car was not legally at fault, that it was involved in an accident still points to a possible problem. e.g. If I'm approaching an intersection and I have a green light, I don't just blindly pass through because the law says I have the right of way. I take a quick glance to the left and right to make sure nobody is going to run their red light, or that there aren't emergency vehicles approaching which might run the red light, or that there's nobody in the crosswalk parallel to me who might suddenly enter into my lane (cyclist falls over, dog or child runs out of crosswalk, etc).

      So even if the autonomous car wasn't legally at fault, that's not the same thing as saying it did nothing wrong. There may still be lessons to learn, safety systems which were supposed to work but didn't, ways to improve the autonomous car to prevent similar accidents in the future.
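The layered-defense argument has a simple arithmetic core: if the layers fail independently, the probability of an accident is the product of the individual failure probabilities. A quick illustration with invented numbers (real failures, as the Concorde chain shows, are often correlated, so independence is an idealization):

```python
# Swiss-cheese arithmetic: with independent layers, an accident requires
# every layer to fail at once, so the probabilities multiply.
# The layer probabilities below are invented for illustration.
from math import prod

layer_failure_probs = [0.01, 0.05, 0.1]  # e.g. perception, planning, safety driver
p_accident = prod(layer_failure_probs)
print(p_accident)  # ~5e-05: three mediocre layers beat one very good one
```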
    • by igny ( 716218 )
      Re: blaming the software for...

      I would not blame the software for anything but inevitable bugs in the software. Just think: this software is in its testing stage. Wouldn't you agree that it may have bugs? As far as conspiracy theories go, I would rather blame the police for covering up a corporate mishap here.
    • I am reminded that when cars were first invented, there were laws put in place mandating that someone walk ahead of any self-propelled vehicle waving a red flag [wikipedia.org], for fear of scaring horses and making people uncomfortable.

      I'm sure that in one hundred years this sort of reaction - blaming the software for an inattentive driver failing to yield - will be seen in exactly the same way.

      The two situations are not comparable.

      When the automobile was invented, it wouldn't have taken more than a handful of real-world experiments to determine that red flag laws were unnecessary.

      But demonstrating that current self-driving car technology is as safe as a human driver is a much tougher challenge, and I'm not convinced that it's a challenge they're taking seriously.

  • Think about it this way. If someone cuts me off in traffic and I run into them because I'm not watching, wouldn't the accident technically be my fault? An accident is your fault unless you have done everything within your power to prevent it and it still occurred. Perhaps the AI was not programmed to deal with this situation; that would be the same as a human driver not watching the road, because there would have been a disconnect between the sensors and the AI.
    • Firstly, an autonomous car is not driven by a (strong) AI; barely half of the algorithms even count as weak AI.
      Secondly, it is of course programmed to avoid crashes/collisions at all costs.

      What else? Why people believe otherwise is beyond me. Even if no one is injured, the hassle with the insurance companies to get the damage to the cars sorted out is something no one wants.

      • If automated cars are programmed to avoid accidents at all costs, how did the Tesla run into the trailer? How did a Google car turn into a bus because there was a sandbag in its lane? Automated cars can only be programmed to avoid accidents at all costs if the people programming them can preconceive all accidents.
  • "Fuck you California, we are pioneers, we'll go to Arizona where they welcome pioneers!"

    * THHHUNK! *

    "Ahhh, arrow in my back, arrow in my back! Help!"

  • Just like with Google Glass, people can't stand progress. I was wondering how long it would be until people started intentionally crashing into autonomous cars.

    • by JustNiz ( 692889 )

      Congratulations, you've won the prize for the most clueless thing I've read today.

  • by linuxwrangler ( 582055 ) on Saturday March 25, 2017 @05:59PM (#54109805)

    "Given that the Uber vehicle has flipped onto its side it looks to be a high speed crash, which suggests a pretty serious incident..."

    In one past life I learned accident investigation and in another extricated victims, both dead and alive, from vehicle collisions. I have to call malarkey on the "high-speed" claim.

    Cars can tip over at very low speed. I've seen at least two such crashes within two blocks of my house. In one, a driver ran a stop sign and clipped a small SUV which tipped over onto the opposite sidewalk. The entire accident scene covered, perhaps, 30 feet edge to edge.

    In the other, a driver drifted into the parking lane, sideswiping a parked car such that the door panels hooked, which caused the car to rotate and then roll.

    The "high-speed" car in both cases was traveling 20-30mph.

    Though the provided photo does not show a large surrounding area, neither car looks crushed - just some body-panel denting - the debris is right next to the car rather than scattered down the roadway, and "nobody was seriously injured."

    Nothing about this suggests high-speed.

    • Aren't SUVs with their high centre of mass a known rollover hazard? I very much remember the "reindeer test" videos of about a decade ago: basically making a very sudden, sharp turn at fairly high speeds to avoid a reindeer, causing most SUVs to roll over.

      • by mjwx ( 966435 )

        Aren't SUVs with their high centre of mass a known rollover hazard? I very much remember the "reindeer test" videos of about a decade ago: basically making a very sudden, sharp turn at fairly high speeds to avoid a reindeer, causing most SUVs to roll over.

        Yes, the high risk of rollovers in SUVs has been known for years. However, because the *NCAP programs don't bother with rollover tests (or rear-end tests; it's not like nose-to-tail collisions are the most common type or anything), this risk is ignored by manufacturers, who are making a lot of money selling jacked-up hatchbacks.
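The rollover tendency described here has a standard back-of-the-envelope measure, the static stability factor (SSF): track width divided by twice the center-of-gravity height. A rigid vehicle begins to tip when lateral acceleration exceeds roughly SSF * g; suspension and tire deflection lower that threshold in practice. A quick sketch with typical but made-up dimensions:

```python
# Static stability factor: SSF = track / (2 * h_cg). A rigid vehicle
# starts to tip when lateral acceleration exceeds about SSF * g.
# The dimensions below are typical but invented for illustration.
G = 9.81  # m/s^2

def ssf(track_m: float, cg_height_m: float) -> float:
    return track_m / (2.0 * cg_height_m)

sedan = ssf(track_m=1.55, cg_height_m=0.50)  # ~1.55
suv = ssf(track_m=1.60, cg_height_m=0.75)    # ~1.07
print(f"sedan tips near {sedan * G:.1f} m/s^2, SUV near {suv * G:.1f} m/s^2")
```

A lower SSF is why the same evasive swerve that a hatchback shrugs off can roll the "jacked-up" version of the same platform.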

  • In most states, if the vehicle was moving and not legally stopped or parked, then it is partially at fault. In states without no-fault insurance laws, no matter how negligent the other driver is, both vehicles will be blamed.
