Police Chief: Uber Self-Driving Car 'Likely' Not At Fault In Fatal Crash (arstechnica.com) 527

An anonymous reader quotes a report from Ars Technica: The chief of the Tempe Police has told the San Francisco Chronicle that Uber is likely not responsible for the Sunday evening crash that killed 49-year-old pedestrian Elaine Herzberg. "I suspect preliminarily it appears that the Uber would likely not be at fault in this accident," said Chief Sylvia Moir. Herzberg was "pushing a bicycle laden with plastic shopping bags," according to the Chronicle's Carolyn Said, when she "abruptly walked from a center median into a lane of traffic." After viewing video captured by the Uber vehicle, Moir concluded that "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." Moir added that "it is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available." The police said that the vehicle was traveling 38 miles per hour in a 35 mile-per-hour zone, according to the Chronicle -- though a Google Street View shot of the roadway taken last July shows a speed limit of 45 miles per hour along that stretch of road.
  • by Joe_Dragon ( 2206452 ) on Tuesday March 20, 2018 @04:55PM (#56293353)

    Why does it look like a sidewalk?

    • by guruevi ( 827432 )

      It doesn't, it looks like a median, it's not a safe place to be as a bicyclist laden with anything.

      • by gatfirls ( 1315141 ) on Tuesday March 20, 2018 @05:14PM (#56293487)

        Sure looks like a sidewalk to me.

        https://www.google.com/maps/@3... [google.com]

        Maybe it's just for design, since it doesn't make any sense otherwise. If you move around on Street View, they put up signs telling people not to use it, so something like this has probably happened before.

        https://www.google.com/maps/@3... [google.com]

        • That piece of sidewalk on the median really doesn't make any sense; my guess as to why it's there is that it's some kind of remnant from the way that area used to be. The reason I don't think it's a turning point for emergency vehicles is that it's clearly a sidewalk, and turning around at the "X" would require a vehicle with a very short turning circle. However, seeing how there are no zebra crossings on either side, only somebody who is completely careless or really needs to take that shortcut is going to use it.
          • I just noticed the path is also lighted. Like with a light in the middle of it. And it's next to a park.

            I'm starting to think the local funeral home designed that area.

        • Looks to me like some 'designer' got what they wanted. Nothing more.
    • Not just the median. (Score:5, Informative)

      by Ecuador ( 740021 ) on Tuesday March 20, 2018 @05:18PM (#56293525) Homepage

      Interesting series of tweets: https://twitter.com/EricPaulDe... [twitter.com]
      The median looks like it has fancy, inviting paths, but it also warns you not to use them. And the actual crossing is kind of daunting...
      It is a rather bad design, but it does look dangerous in any case, so if I wanted to cross that way I would exercise extreme caution...

      • by dgatwood ( 11270 )

        Twitter actually ran an ad about how chicken gets to your table, on these tweets about someone getting killed while crossing the road. I reported it as "I don't like this ad", because there's no "This is highly inappropriate in this context" option. Lovely.

    • It's so much not a sidewalk that they have signs to tell you not to use it: https://imgur.com/a/KyxTK [imgur.com]
    • by Mal-2 ( 675116 )

      It doesn't. Those paths are as wide as two traffic lanes. They look like they're there so that maintenance crews can get around, and also to be able to get their vehicles out of the road.

  • by rickb928 ( 945187 ) on Tuesday March 20, 2018 @05:09PM (#56293433) Homepage Journal

    There will be a thorough investigation of the vehicle, the programming, all of the data and details. Even if it is decided that the victim acted imprudently, such accidents always (at least around here, unless it was the police involved) are fully investigated, and the driver is rarely exonerated from all blame, just the proximate causal fault.

    Now, for you ignats who see class discrimination in the description that the victim was pushing a bicycle laden with shopping bags, a word: the police are the upper caste in these situations. Corporations will be prosecuted more often than police officers, and more often than reputable members of the community, i.e., government. Or favored citizens. This is not new.

    There was more than one factor leading to this tragedy. If the end result is a change in how these vehicles monitor their surroundings so they have more time to analyze and react, excellent. And if the result is a recognition that even self-driving vehicles are unable to avoid some accidents, just as even skilled and careful human drivers are, well, then we've learned that self-driving does not equal infallible. That's important, and useful, information.

    • by alvinrod ( 889928 ) on Tuesday March 20, 2018 @05:43PM (#56293723)

      There was more than one factor leading to this tragedy. If the end result is a change in how these vehicles monitor their surroundings so they have more time to analyze and react, excellent. And if the result is a recognition that even self-driving vehicles are unable to avoid some accidents, just as even skilled and careful human drivers are, well, then we've learned that self-driving does not equal infallible. That's important, and useful, information.

      Who is expecting self-driven vehicles to be infallible in all conditions? No matter how quickly they can react to sensor data indicating an emergency, they're still bound by the laws of physics and may not be capable of avoiding collision with something that suddenly enters their field of observation. I suspect that this incident will help engineers design a better autonomous vehicle, but as with any new safety feature we create, nature has a way of designing better idiots as well. If someone were to jump out (or be pushed) in front of a vehicle traveling at some speed, there's always a limit to how much that vehicle can deviate from its current trajectory, and anyone who falls inside of that window is going to be hit. The only thing that can be done about that is to engineer vehicles that can come to a stop within a shorter window.

      • by Ichijo ( 607641 )

        No matter how quickly they can react to sensor data indicating an emergency, they're still bound by the laws of physics and may not be capable of avoiding collision with something that suddenly enters their field of observation.

        Why can't autonomous cars avoid overdriving their headlights [driversed.com]?

      • Re: (Score:3, Insightful)

        by green1 ( 322787 )

        Who is expecting self-driven vehicles to be infallible in all conditions?

        Quite a few posters in this thread, and every other thread that has existed on self driving cars on this, or any other, forum.

        they're still bound by the laws of physics

        Heresy! That is NOT a popular opinion around here!

        In all seriousness, I'm sick of all the people who think self driving cars will avoid all collisions, and even more sick of those who think that if they don't we should just give up on them.

        Self driving cars have the potential to, one day in the future, eliminate almost all preventable collisions. But that is not the same as all collisions.

    • then we've learned that self-driving does not equal infallible

      What will we learn next? That water is wet?

      No one with a brain has ever claimed that self-driving cars are or ever could be infallible. Nor do they need to be. They only need to be better than the average human, which is a low bar. And, then, over time they'll get better and better as the algorithms and sensors are refined. In a few decades the NTSB will be combing through car wrecks with something akin to the same scrutiny they apply now to plane crashes because the wrecks will be so rare.

      But they'll o

  • Entitled pedestrians (Score:5, Interesting)

    by DeplorableCodeMonkey ( 4828467 ) on Tuesday March 20, 2018 @05:09PM (#56293435)

    In my community, we have great sidewalks, many crosswalks, and all that's needed to create a safe and walkable community. What do the pedestrians still do, you ask?

    Walk out into traffic if it's more convenient. If a car hits them after the driver has taken reasonable measures to stop, the pedestrian ought to be liable for all of the damage caused, including to the vehicle and the driver's therapy if required.

    My wife knew someone who killed a pedestrian who just walked out into traffic like this without thinking. Totally unavoidable. The "victim" was the driver, not the pedestrian because the driver was obeying the law and some stranger decided "fuck the traffic laws" and made her party to an accidental vehicular homicide.

  • by Anonymous Coward

    The police chief needs to get some facts straight about how the technology of autonomous vehicles works. LiDAR illumination comes from LASERs. On the VEHICLE.

    It's unclear which "shadows" Chief Moir is talking about. Streetlights are only one illumination source at play here.

    • by sl3xd ( 111641 )

      The police chief is using non-technical, human terms, rather than quibbling over semantics of idioms.

      "Shadows" can simply mean "obscured from view," and is a common American English idiom.

      It could also mean "shadows cast by the headlights" or "shadows cast by the LIDAR beam".

      If you're in the shadows of a car's headlights, it's a sure bet the driver can't see you.

  • Humans and AI. (Score:4, Insightful)

    by Izuzan ( 2620111 ) on Tuesday March 20, 2018 @05:16PM (#56293503)

    Humans can adjust to changing situations; they can also read body language. Most people slow down when they see someone on the side of the road looking like they are going to step out. An AI can't read that sort of thing. It can only react to basic things presented to it.

    • by Kjella ( 173770 )

      Humans can adjust to changing situations; they can also read body language. Most people slow down when they see someone on the side of the road looking like they are going to step out. An AI can't read that sort of thing. It can only react to basic things presented to it.

      My impression is that they detect and react to the actual physical posture and motion. But they can't read the person and tell if he appears drunk, high, mentally challenged or in some other way odd and likely to do odd things. It's a bit like the difference between a dog on a leash and a street dog with no leash, to a human they pose very different risks. But without programming in a ton of "human" logic they'll look just the same to a computer.

    • An AI can't read that sort of thing.

      Why not?

    • by djinn6 ( 1868030 )
      Without the video itself we won't be able to tell for sure. However, from the Street View imagery, there are a number of places she could have emerged from that would've kept her hidden behind bushes or trees. And I don't think she stopped at the side of the road at all. If she had, she would have had enough time to see the car, which clearly was not slowing down for her.
    • I've seen enough road-fail YouTube videos to have quite a bit of anecdotal evidence that people do not slow down, and react extremely poorly when a person does step out.

      Also, the police chief, who has seen the video, made it clear that the person "stepped out of the shadows" and that it "would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven)."

      Seems like people are hell bent on the belief that autonomous cars can't be better than the amazingly faulty human.

      • by ceoyoyo ( 59147 )

        "My brain is magic" is such a common (and strong) reaction whenever anyone mentions automation or AI that it almost seems like some kind of instinct.

        Maybe this has all happened before....

  • Sensors (Score:4, Interesting)

    by Volda ( 1113105 ) on Tuesday March 20, 2018 @05:16PM (#56293509)
    Doesn't the car have sensors that could have detected the person and her bike with bags? Don't get me wrong, it appears that the pedestrian was in the wrong, but something should have been detected and the car should have done something to try to avoid the accident. Maybe these cars are not smart enough yet. Though once fully certified, I would expect them to be better at driving than people. Something to work towards, I guess.
    • Re: (Score:2, Insightful)

      Sure it does. It supposedly is watching in all directions all the time and supposedly has a reaction time better than a human. But there is no 'mind' in there; it's a 'pseudo-intelligence', it can't think, it doesn't know the difference between a living being and an inanimate object -- because it has no capacity to think.
      • Re:Sensors (Score:5, Insightful)

        by ColaMan ( 37550 ) on Tuesday March 20, 2018 @05:52PM (#56293791) Journal

        Doesn't matter a damn if it has a reaction time better than a human if someone steps out onto the road 20 feet in front of the car and you've got half a second to judge and react.

        There are basic physical numbers at play here: the mass of the vehicle, the ability of the braking system to scrub off speed, the condition of the tyres, the road surface, etc. In those kinds of short-distance collisions, a computer will be able to reduce the speed of the car by a few mph more than a person could, and that's it.

        The only saving grace that a person has is the ability to read body language and judge that someone might step out onto the road. And even then that usually only results in a foot off the accelerator, and not yet placed on the brake.
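        ColaMan's "basic physical numbers" point can be sketched with rough arithmetic. The reaction time and braking deceleration below are illustrative assumptions, not measured values for any real vehicle or for this crash:

```python
# Rough stopping-distance sketch for the "20 feet in front of the car"
# scenario above. 0.5 s reaction and 0.7 g of braking are assumed,
# generous values, not measurements.
def stopping_distance_ft(speed_mph, reaction_s=0.5, decel_g=0.7):
    """Total distance (feet) to stop: distance covered during the
    reaction delay plus braking distance v^2 / (2a)."""
    v = speed_mph * 5280 / 3600          # mph -> feet per second
    a = decel_g * 32.174                 # deceleration, ft/s^2
    return v * reaction_s + v * v / (2 * a)

for mph in (25, 35, 38, 45):
    print(f"{mph:2d} mph -> ~{stopping_distance_ft(mph):.0f} ft to stop")
```

        Under these assumptions, even a near-instant half-second reaction still leaves roughly 97 ft of total stopping distance at 38 mph, so nothing, human or computer, stops in 20 ft.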

         

        • The word I think you're looking for is 'intuition', and these machines, aside from having no actual cognitive ability and no actual awareness, are completely and totally incapable of anything even resembling 'intuition'.
        • But what if a human driver would have seen that person while they were still 150 yards down the road? A human slows down ahead of time, monitors the situation, and doesn't have to react in the last 20 feet. It is very unlikely that the view of the human was totally obscured; though much MORE likely in the last 10 yards. When does the car start paying attention?
    • According to the summary, the pedestrian stepped so suddenly into the path of the car that no one and nothing could have prevented the crash.

    • by Mal-2 ( 675116 )

      Sensors aren't omniscient. They can't see around obstructions any better than humans can, although some things that are obstructions to humans aren't to sensors. Still, if someone hides behind a parked car and then jumps into traffic, no sensors on the vehicle are going to spot them. The only hope is that they get a warning from a vehicle ahead, which saw the person as it passed.

  • by Morky ( 577776 ) on Tuesday March 20, 2018 @05:49PM (#56293781)
    Another key takeaway is that this scenario can now be analyzed and applied to millions of future situations. I just wish all the various autonomous driving companies were sharing their work.
  • by viperidaenz ( 2515578 ) on Tuesday March 20, 2018 @06:25PM (#56293989)

    What reason did the Uber car have for going 38 in a 35 zone?
    Surely the speed limit was lowered from 45 to 35 for a reason, probably for safety reasons.
    Can the car not read road signs? It doesn't have the excuse of "I was watching the road, not my speedo" for a minor speeding offence. Did Uber fail to update the map data when the speed limits changed?

    The risk of death when hit by a car traveling below 30 mph is relatively low. It increases rapidly as speed increases:
    9% chance of death at 30 mph.
    50% chance of death at 40 mph.
    Approaching 100% fatal over 50 mph.

    There's a reasonable chance the woman, who may well have been in the wrong, would still be alive if the car was traveling at or below the 35mph limit.

    source: https://nacto.org/docs/usdg/re... [nacto.org]

    There's another study that showed a reduction in speed of 5 km/h would result in 30% fewer deaths. That happens to be about how much the Uber car was over the limit.
    http://humantransport.org/side... [humantransport.org]
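    The arithmetic behind that last point can be sketched. Since fatality risk climbs steeply with impact speed, and kinetic energy scales with the square of speed, a small difference in initial speed produces a larger difference at the moment of impact when braking room is fixed. The 0.7 g deceleration and 40 ft of braking room below are illustrative assumptions, not figures from the actual crash:

```python
import math

# If braking starts with d feet of room, impact speed is
# sqrt(v0^2 - 2*a*d), or zero if the car stops in time.
def impact_speed_mph(start_mph, braking_room_ft, decel_g=0.7):
    v0 = start_mph * 5280 / 3600         # mph -> ft/s
    a = decel_g * 32.174                 # assumed deceleration, ft/s^2
    v2 = v0 * v0 - 2 * a * braking_room_ft
    return math.sqrt(v2) * 3600 / 5280 if v2 > 0 else 0.0

for v0 in (35, 38):
    print(f"braking from {v0} mph with 40 ft of room -> "
          f"impact at ~{impact_speed_mph(v0, 40):.0f} mph")
```

    Under these assumptions, starting 3 mph faster means hitting roughly 5 mph faster, which on the fatality curves above is not a trivial difference.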

    • Variance (Score:5, Interesting)

      by rsilvergun ( 571051 ) on Tuesday March 20, 2018 @06:48PM (#56294099)
      Is usually around 5mph. It's difficult to keep a car at a rock solid 35mph, even for a computer. Changes in elevation can quickly alter your speed and religiously adjusting for it isn't even always the safest thing to do.

      One of the hard lessons I learned when driving is that if you slow down too much, aggressive or stupid drivers will take that as a signal to go. My first accident was a t-bone where a girl hit me because she was trying to make a left into a busy road. I saw her start to move and put on my brakes. She saw me coming and did the same, but then saw me braking and decided this somehow meant I was going to come to a complete stop in the middle of a busy street (the only option that would have stopped the accident by then). If I had not braked, she wouldn't have gone, and the accident wouldn't have happened.

      What I'm saying is there's such a thing as too much caution. Now, maybe if we can get the meatbags off the road that won't be true anymore.
      • by Ichijo ( 607641 )

        Imagine 4 scenarios:

        1. You braked and she saw it as a cue to proceed. End result: a low-energy collision.

        2. You braked and she braked. End result: no collision.

        3. You didn't brake so she did. End result: no collision.

        4. You didn't brake and neither did she. End result: a high-energy collision.

        Whether you braked or not, there was a possibility of no collision, so we can cross out options 2 and 3, leaving you to choose between a low-energy collision (option 1) or a high-energy one (option 4). I think you made the right call.

    • This is ~50 yards from where the accident happened.

      https://www.google.com/maps/@3... [google.com]

      Maybe the car was decelerating after identifying a newly posted 35mph sign?

      But I am sure those "what ifs," along with that same data, will net the surviving family a nice chunk of change in litigation. It's not the person walking into oncoming traffic that's the problem here; it's the ED-209 killer car traveling marginally outside of the posted limit.

  • defensive driving (Score:4, Insightful)

    by fluffernutter ( 1411889 ) on Tuesday March 20, 2018 @07:30PM (#56294331)
    How far do these cars look ahead? In defensive driving, they teach you to look WAY up the road. From 150 yards back, you are more likely to see people running onto the road than from 5 yards back; it may just be a flash of them seen between vehicles up ahead. Are these cars properly watching as they approach?

    They should have to submit high-definition video from the moment the car starts to when it stops, from the perspective of a driver. If the person the car hits becomes visible at any time and the AI doesn't show any reaction, "she ran out in front of the car" isn't good enough if you're only paying attention 20 feet before the intersection.

    I know some people don't notice these things, but a lot of people do, and it prevents accidents. I would rather have autonomous cars be modeled after defensive driving techniques, and I am concerned that they are not.
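    The "look WAY up the road" idea can be sketched as a time-to-collision rule. Everything here is hypothetical, the function name, the 6-second threshold, and the scenario are illustrative, not taken from any real autonomous-driving stack:

```python
# Hypothetical sketch of a defensive-driving look-ahead rule: start
# easing off early when a tracked object's time-to-collision (TTC)
# drops below a comfort threshold, instead of braking hard at 20 feet.
def should_preemptively_slow(distance_ft, closing_speed_fps,
                             ttc_threshold_s=6.0):
    """True if the object would be reached within the TTC threshold."""
    if closing_speed_fps <= 0:           # object is not closing on us
        return False
    return distance_ft / closing_speed_fps < ttc_threshold_s

# A pedestrian 450 ft (150 yards) ahead while closing at ~56 ft/s
# (about 38 mph): TTC is ~8 s, so no action yet; by 300 ft the TTC
# is ~5.4 s, so the rule says ease off now.
print(should_preemptively_slow(450, 56))   # False
print(should_preemptively_slow(300, 56))   # True
```

    The point of such a rule is exactly the parent's: the reaction budget is created hundreds of feet out, not in the last 20.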
  • by stabiesoft ( 733417 ) on Tuesday March 20, 2018 @08:09PM (#56294467) Homepage

    If it is clearly the woman's fault, then produce the video for us all to see. Please blur the impact, though. I just want to see for myself how much time passed between the woman entering the lane and the impact. Simple: where is the video?
