
Uber Test Vehicles Involved In 37 Crashes Before Fatal Self-Driving Incident (reuters.com)

Uber's autonomous test vehicles were involved in 37 crashes in the 18 months before a fatal March 2018 self-driving car accident, the National Transportation Safety Board (NTSB) said on Tuesday. Reuters reports: The board said that between September 2016 and March 2018 there were 37 crashes involving Uber vehicles operating in autonomous mode, including 33 in which another vehicle struck the test vehicles. In one incident, the test vehicle struck a bent bicycle-lane bollard that partially occupied the test vehicle's lane of travel. In another, the operator took control to avoid a rapidly approaching oncoming vehicle that had entered its lane of travel; the operator steered away and struck a parked car. The NTSB will hold a probable-cause hearing on the crash Nov. 19. A spokeswoman for Uber's self-driving car division said the company has "adopted critical program improvements to further prioritize safety. We deeply value the thoroughness of the NTSB's investigation into the crash and look forward to reviewing their recommendations."

Bloomberg is also reporting that Uber's self-driving test car wasn't programmed to recognize and react to jaywalkers. The report said "the system design did not include a consideration for jaywalking pedestrians." [Elaine Herzberg, the 49-year-old pedestrian who was struck by one of Uber's self-driving cars] was crossing the road outside of a crosswalk.
  • Corrected Headline (Score:5, Insightful)

    by MikeDataLink ( 536925 ) on Tuesday November 05, 2019 @09:07PM (#59385394) Homepage Journal

    Uber Test Vehicles Involved In 3 Crashes Before Fatal Self-Driving Incident, also Ubers were hit 33 times by other drivers.

    • Re: (Score:2, Troll)

      So? If it's being hit with greater frequency than a human, then it's doing something very wrong!
      • So? If it's being hit with greater frequency than a human, then it's doing something very wrong!

        Cool story bro. But that's not what the article says.

        • The article repeatedly describes all 37 incidents as crashes, gives no indication I can see that any of them didn't amount to a crash, and says the Uber vehicles involved were in autonomous mode, which suggests they were test vehicles... what about the headline is inconsistent?

      • According to the NYT, the average accident rate is 6.7 per million miles. Uber had driven over 3 million miles: 6.7 × 3 = 20.1 expected accidents vs. the 37 reported.
        https://www.nytimes.com/2006/0... [nytimes.com]

        But how thorough are the statistics in the Times vs. Uber's? I doubt the NYT figure would include hitting a bicycle-lane cone.
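The parent's back-of-the-envelope numbers can be sanity-checked with a quick script (the rate and mileage figures are the commenter's, not verified statistics):

```python
# Commenter's figures: NYT-cited rate of 6.7 accidents per million miles,
# applied to roughly 3 million autonomous test miles.
rate_per_million_miles = 6.7
uber_test_miles = 3_000_000

expected = rate_per_million_miles * uber_test_miles / 1_000_000
print(round(expected, 1))       # expected accidents at the NYT rate
print(round(37 / expected, 2))  # Uber's 37 crashes relative to that expectation
```

On these figures, Uber's 37 crashes come out at roughly 1.8× the expected count, which is why the thread argues over whether the two statistics count the same kinds of incidents.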

        • By "thorough" do you mean localized? Rates in Arizona might be higher than average.
          • Rates in warm states are 2/3 lower than rates in icy states.
          • By thorough I mean: how many extremely minor "accidents" are the NYT stats including, versus Uber's stats, which come from an extremely thorough analysis of every single second of driving, with bug reports, lidar and video proof?

            Is 6.7/million just what is worth reporting to your insurance for damage above a $500 deductible, or does it include parallel parking and bumping the car behind you? Because I don't think people consider a bump while parking an accident worth remembering or reporting to a survey, but uber would know…

      • Why the fuck is this modded troll??
    • by bloodhawk ( 813939 ) on Tuesday November 05, 2019 @11:52PM (#59385738)
      My mother has been hit by other drivers in her car many times above normal too. Though if you go for a drive with her you will quickly realize why. She is erratic, sometimes overly cautious to the point of dangerous (stops on a roundabout or where she has right of way) and does last minute changes of direction. It scares the crap out of me being in a car when she drives as it feels like she is just begging other people to hit her. So just because you were being hit doesn't mean you aren't a major part of the problem
      • My mother has been hit by other drivers in her car many times above normal too

        The nice thing about self driving cars is that eventually they can code that bad behavior to stop. Even better still, once they've coded the bad behavior to stop for one car, every other car stops doing that as well. I don't think we can reprogram your mom to stop her bad behavior and definitely someone getting stopped for DUI hasn't instantly stopped everyone else from getting a DUI. So I get that bad behavior on the road can lead to problems, however, I would argue that at least with the computers we…

        • The nice thing about self driving cars is that eventually they can code that bad behavior to stop. Even better still, once they've coded the bad behavior to stop for one car, every other car stops doing that as well.

          Conversely, when a massive real-time update (and you know they're coming, if they aren't already a thing) with defects/bugs goes out, EVERY car starts exhibiting said defects... all at once. I am not looking forward to the day a "minor update" for the satellite radio option I'm not even using knocks out the GPS while I'm on the freeway. I know, I'm crazy... Slashdot doesn't have a headline or five a day of terrible software defects...

          • Well, if that were the case, then we would have already heard about that kind of issue. However, the reality of it is that the systems that control the user-facing functions are completely separate from the self-driving system. Typically, the user-facing things that you are looking at are driven by some form of SoC, while the actual self-driving is done by a system that uses some sort of ASIC. Additionally, there's typically an election system put in place to monitor the overall health of the system…

            • Yep, greed and cheapness. At one point or another they'll create a system with a single point of failure and it'll dive the car into the ground.

      • It scares the crap out of me being in a car when she drives as it feels like she is just begging other people to hit her. So just because you were being hit doesn't mean you aren't a major part of the problem

        Absolutely correct. And with Teslas we're able to review the telemetry and confirm when that's been the case. Can you point me to instances where that's been the conclusion? Because I haven't seen them.

      • I'll call your mother and raise you a mother-in-law...

        Lived in the mountains. Drove tanks (large sedans, back when they were made out of metal). Two accidents at the bottom of the same steep hill. Finally another accident at the bottom of "Dead Man's Curve". ...and she finally learned from her experiences: she quit driving.

    • Because of the Ubers driving unexpectedly and doing unpredictable things.

    • If so-called SDCs are so much better than humans then why couldn't they avoid those other 33 collisions that allegedly weren't their fault, hmm?
    • also Ubers were hit 33 times by other drivers.

      Most people realise quite quickly that you only really start to learn how to drive after you've taken your test. Driving isn't just about doing things correctly yourself, it's about realising that other people do stupid things all the time.

      A self-driving car needs to work the same way (unless all humans are banned from driving). So although you seem to be implying that the Uber car wasn't at fault for the 33 times another vehicle crashed into it, whilst that is technically true, it is far more than a human…

    • "recognize and react to jawalkers."

      Yes, it has long been known that the only solution to Jawas is an assault by an Imperial Walker regiment . . .

      hawk

  • by chuckugly ( 2030942 ) on Tuesday November 05, 2019 @09:09PM (#59385398)

    If I read that correctly, out of 37 crashes, 33 were other cars hitting the robocar, one was a human operator taking control to avoid a deviant human driver, one was hitting a plastic lane divider that was already broken and in the robot-car's lane, and a couple are not recounted in the summary.

    I'm not an Uber fan but that doesn't look like a terrible record to me.

    • by fluffernutter ( 1411889 ) on Tuesday November 05, 2019 @09:16PM (#59385410)
      Again.. if it is being hit with the same frequency as a human, then fine, this is non-news. But if it's driving in a jerky and unpredictable manner and as a result people are running into it, then it's not human enough to be on a road with humans.
      • by DarkOx ( 621550 )

        yes it will be interesting to see what all the NTSB ultimately has to say.

        Typically we fault a driver who rear-ends another driver; while the lead driver should check their mirror before braking hard, usually the greater sin is considered to be following so near YOU can't stop or avoid a collision if the lead car slows suddenly or stops short.

        However, let's imagine two cars on a four-lane highway (two lanes in each direction). There are no other vehicles on the roadway nearby and conditions are good. There is a…

        • We fault the human driver behind, because the assumption is that all drivers have a vested interest in not getting hit, including the driver in front. We can assume that the driver is making their best effort. With automated driving there is no such assumption because there is no emotion, just a computer following logic. Furthermore, the programmers propose to be able to program for all situations, thus they are able to intuit them beforehand and plan a response. Then the response has to be appropriate…
        • Maybe you don't realize this, but you should be able to slam on your brakes AT ANY TIME on the road, and drivers behind you should be driving safely enough (with enough distance in front of them) that they are able to slow down in time. If they don't, you need to trade in your license in favor of having other people drive you around. Most people don't leave enough distance in front of their car.
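The following-distance point can be made concrete with a rough stopping-distance estimate. This is a sketch with hypothetical values (1.5 s reaction time, dry-pavement friction coefficient of 0.7), not a traffic-engineering reference:

```python
# Stopping distance = reaction distance + braking distance v^2 / (2*mu*g).
def stopping_distance_m(speed_kmh, reaction_s=1.5, mu=0.7, g=9.81):
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * reaction_s + v ** 2 / (2 * mu * g)

print(round(stopping_distance_m(50), 1))    # city speed: roughly 35 m
print(round(stopping_distance_m(100), 1))   # highway speed: roughly 98 m
```

Anyone following closer than that at those speeds cannot stop in time if the lead car brakes hard, which is the commenter's point.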
      • by Hodr ( 219920 )

        Maybe people crash into it because they are distracted by the car with no driver?

      • Again.. if it is being hit with the same frequency as a human, then fine, this is non-news. But if it's driving in a jerky and unpredictable manner and as a result people are running into it, then it's not human enough to be on a road with humans.

        Are they being hit more often? Seems those numbers should be easy to calculate.

    • The issue here is that the uber self-driving cars are not reacting in a proper defensive manner to occurrences outside the scope of traffic law. So, for example, an uber hit a jaywalking pedestrian, that is not something a competent human driver would do. In another instance the driver had to take over because the uber was not going to react properly to an oncoming vehicle in order to minimize the damage. The uber could not react to a bent road sign that any human driver would easily see and avoid hitting…

      • Actually, hitting pedestrians, whether jaywalking or not, is illegal.

        A cyclist here in London who hit a woman crossing the street against a red pedestrian light, looking at her phone rather than at the traffic, was ordered to pay half of her medical bill. She was at fault as well, but hitting her meant the cyclist was also at fault.
        • by leptons ( 891340 )
          "Jaywalking" is only a thing when a pedestrian crosses outside a crosswalk but still within close proximity to a crosswalk. The person who died when struck by the Uber autonomous vehicle was not jaywalking; they were nowhere near a crosswalk, and that's not "jaywalking", that's simply crossing a road on foot.
      • by sphealey ( 2855 )

        - - - - - So, for example, an uber hit a jaywalking pedestrian, that is not something a competent human driver would do. - - - -

        Agree with your overall point. However, it is important to note that the design of the roadways in that area made it almost impossible for people not in automobiles to cross legally, and that the responsible highway department was aware there was a substantial population of people who commuted by foot and bicycle in that area. The designation of "jaywalker" is being used by Uber…

  • You need true neural network AI to correctly identify the nearly infinite objects and situations that can happen when driving. And even then, thinking that can match humans' 16+ years of seeing and identifying all objects and situations and their context is a stretch. We've hit the limit of what image recognition and 3D LIDAR scanning can do and, surprise, it's not good enough. We knew this going in. Self-driving cars are never going to be good enough at our current technology level because…
    • You will never achieve "true" AI with a neural network. That isn't how brains work. The fact that they are named "neural networks" at all is disingenuous and a bit slimy.

    • WRONG. Self-driving cars, along with image (pattern) recognition technology, are still rapidly improving. We don't know, at this point, what the limits of current deep learning pattern recognition technology are. I expect true Level V self-driving cars that are clearly superior to the average human driver to be available within 5 years.
      • "WRONG. Autonomous deathbots, along with image (pattern) recognition technology, are still rapidly improving. We don't know, at this point, what the limits of current deep learning pattern recognition technology are. I expect true Level V autonomous deathbots that are clearly superior to the average human killer to be available within 5 years."

        FTFY

      • by sphealey ( 2855 )

        With all due respect I have been reading that exact same comment for at least 3 years. So we should expect the superior autonomous vehicles in 2022? And they will be able to drive autonomously not only in Palo Alto but also in dense Pittsburgh city streets during a snowfall?

    • by DarkOx ( 621550 )

      yes and no. You also have to accept that driving as implemented by humans does not cope well with exceptions to what usually happens either. We accept a certain frequency of failures.

      For example, I live in the country. There are lots of long straight-ish stretches of road with posted speeds of 55. Sometimes the corn/soy beans/tall hay grow right up to the edge of the road. Let's say it's bright and sunny out, clear skies and 70 degrees with the sun directly overhead. The road is also dry and free of debris.

      • You're talking about the wrong thing. Humans do what humans do, and driving at 55 by the corn is part of that. The automated car is going to do whatever Uber wants it to do. If it gets into more accidents than humans by doing what it does, then it's not working properly.
      • Now you might argue that if the corn is right up against the road conditions are actually not ideal and I should not have been doing 55 but creeping along at 15 because gosh anything could jump out

        Never out-drive your stopping distance.

        Dummy.

    • You really don't.

      You don't even need a neural network except to save costs on sensors.

      What you need is a system that can map the 3-dimensional space around it (lots of ways to do that without any AI at all),
      And then make NOT HITTING ANYTHING a primary override. Also doable without any AI.

      Then you can develop an AI to try to navigate the environment effectively without activating the collision-avoidance override. But you never put human life on the line trusting an unpredictable and poorly understood AI system…
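A minimal sketch of the layered design this comment describes, with every name and number hypothetical: a dumb, non-AI collision-avoidance layer vetoes whatever the planner proposes whenever the mapped 3-D space shows an obstacle inside braking distance:

```python
# Hypothetical safety layer: the planner (AI or otherwise) proposes a command,
# and a simple rule-based override replaces it with a full stop if the nearest
# mapped obstacle is within braking distance. No learning involved.
def safe_command(planner_cmd, nearest_obstacle_m, braking_distance_m):
    if nearest_obstacle_m <= braking_distance_m:
        return {"throttle": 0.0, "brake": 1.0}   # hard override
    return planner_cmd                            # otherwise trust the planner

# Obstacle at 8 m with a 15 m braking distance -> the override brakes
# regardless of what the planner wanted.
cmd = safe_command({"throttle": 0.4, "brake": 0.0},
                   nearest_obstacle_m=8.0, braking_distance_m=15.0)
```

The design choice here is that the override is simple enough to verify exhaustively, while the hard-to-verify AI only ever operates inside the envelope the override permits.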

    • by fermion ( 181285 )
      Cars, like all things, are dangerous. They are in fact the number one cause of death for young people.

      Nothing we do is absolutely safe. That is not the metric. The metric is whether, on balance, it is safer than the alternative and whether there are compelling reasons for the trade-offs.

      One problem with self-driving cars is that they are not programmed to behave as humans expect. I had one car on intelligent cruise control, and as it changed speed zones it braked way too fast. If a car had been close behind, I would have been…

  • by Dereck1701 ( 1922824 ) on Tuesday November 05, 2019 @09:58PM (#59385510)

    It is a sad tendency in our current public discourse to pull some numbers out of context and claim that makes a designated thing good/bad. How many vehicles were involved in the tests (dozens? hundreds? thousands?), how many miles/hours did they drive, and how do those statistics compare to human-controlled vehicles? If I just blurt out that 30k+ people die and there are 6 million crashes per year at the hands of human drivers, that sounds pretty bad. But when you throw in that we drive ~3 trillion miles per year, the statistics become much less distressing (~500k miles per accident and ~100 million miles per death; the average driver travels ~15k miles per year). The measure of self-driving should be whether it is better than what we have, not whether it is perfect. If we were focused on perfecting each aspect of our lives before moving on to the next, we'd still be living in thatched-roof huts trying to perfect the mud we slapped on the walls.
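The comment's arithmetic checks out; a quick script using its round-number figures (the commenter's approximations, not official statistics):

```python
# Commenter's approximations for US driving.
miles_per_year = 3e12       # ~3 trillion vehicle-miles per year
crashes_per_year = 6e6      # ~6 million crashes per year
deaths_per_year = 30_000    # ~30k road deaths per year

print(miles_per_year / crashes_per_year)  # ~500,000 miles per crash
print(miles_per_year / deaths_per_year)   # ~100 million miles per death
```

These per-mile baselines are what any self-driving fleet's record would need to be compared against.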

    • The measure of self-driving should be whether it is better than what we have, not whether it is perfect.

      Give this man a medal.

    • I agree with you. The question to ask is whether a human would have gotten into accidents in these same 33 situations. If a human would have gotten into fewer accidents, then the self-driving isn't working.
    • The article didn't make any claims about Uber's self-driving vehicles being good or bad. You jumped to that conclusion on your own.

      • His opinion, conclusion and primary point is "It is a sad tendency in our current public discourse". I don't see that he came to any other conclusion.

  • Jaywalking... (Score:4, Insightful)

    by BytePusher ( 209961 ) on Tuesday November 05, 2019 @10:06PM (#59385526) Homepage
    So, Uber's vehicle was programmed to mow down people crossing the road outside of a crosswalk. I cross outside a crosswalk every day. Yet somehow we believe these companies need LESS regulation when developing inherently dangerous technology? Why wasn't Uber required to put each revision of their vehicle's software through some kind of test environment for common road hazards and exceptions?
  • by ukoda ( 537183 ) on Tuesday November 05, 2019 @10:10PM (#59385542) Homepage
    Not designing it to handle jaywalkers in the USA sounds like a lawsuit waiting to happen. However, outside the USA there is no such thing as a jaywalker in most countries, just pedestrians; instead of making crossing the road an offense, we use a thing called common sense. NB: Crossing a motorway is an offense in most countries.

    Any vehicle operating worldwide, including self driving vehicles, needs to know how to sensibly and safely deal with pedestrians on the road.
    • by MobyDisk ( 75490 )

      That's because the US sidewalks are terrible. It's better out west than in the east.

      • by ukoda ( 537183 )
        I was referring to the fact that it is actually illegal to cross the road in the USA except at a crossing in some places. I wasn't even aware that it was a crime until after many trips to the USA, as that law doesn't exist in any of the other countries I have been in. I still haven't worked out how you tell which roads it applies to, which makes visiting the USA a bit annoying when you haven't rented a car.
          • Jaywalking as such is not illegal in Germany. Matter of fact, we don't even have a word for it. Pedestrians are supposed to use crosswalks at intersections and should not interrupt the traffic flow, but other than that...

        • I still haven't worked out how you tell which roads it applies to, which makes visiting the USA a bit annoying when you haven't rented a car.

          Typically the rule is that if the street you are crossing is controlled by a traffic light at both ends (the nearest intersections to the pedestrian), then you should cross at a pedestrian crossing. However, so few drivers honor the crosswalk in this country that it is no surprise that very few pedestrians honor the jaywalking rules either. In my neighborhood it can be much safer to cross in the middle of the road than at an intersection, and I even had an Uber driver almost hit me 3 times in 30 seconds while…

          • by ukoda ( 537183 )
            Thanks, now I understand the logic. Others had failed to explain where it applied but what you say makes sense.
    • Wow. You haven't seen much of the world.

      • by ukoda ( 537183 )
        New Zealand, Australia, Cook Islands, Fiji, China, Hong Kong, Japan, Singapore, South Korea, Taiwan, Belgium, France, Netherlands, Switzerland, Ukraine, UK and USA.

        Is that 'much'?
        • So you watch Survivor.
          • by ukoda ( 537183 )
            No, I tend to avoid artificial-conflict shows, but ironically my time in the Cook Islands was 3 months working on a film in the island chain of Aitutaki in the early 1980s. The same island we used for most of our shooting was later used for the first season of Survivor that I was aware of.
    • Not designing it to handle jaywalkers is bad because, quite simply, a person can be anywhere on the surface of the Earth at any time and you don't want to hit them. You shouldn't even care about the scenario of jaywalking, you should just be concerned that there is a person you don't want to hit. Likewise, physically a car can drive on a sidewalk, on a lawn, etc. You don't want to hit the car so your driving methodology needs to plan for anything.
    • Comment removed based on user account deletion
  • For people pointing out that the Slashdot title is somewhat clickbait-ish, please consider the article titles that Bloomberg and Reuters selected for this story:

    • Self-Driving Uber in Crash Wasn’t Programmed to Spot Jaywalkers
    • In review of fatal Arizona crash, U.S. agency says Uber software had flaws

    Neither of these is favorable towards Uber. The statement that Uber self-driving cars "were involved in 37 crashes over the prior 18 months" comes straight out of the Reuters article. It is not clear…

  • Science experiments on you.

  • Too many people here have the mindset of, "if it's illegal, they shouldn't have to plan for it". They are operating a 4000 lb object in the physical world, one that causes great financial damage or injuries or death if there is a mistake by anyone. With this kind of thinking they never would have put seat belts or airbags in cars, because if you need one, someone is doing something illegal.
  • Tesla accidents are WELL known. But I have to wonder if others, esp. Waymo, are having accidents that they cover up like Uber did.
    • If they're not, then they have the testing parameters set ridiculously small to ever work in the real world or to have results that are very meaningful towards such.
  • Jesus Christ. Why not just program it to brake when there is anything in its path that is not moving parallel to it at the same or greater speed?

  • This goes beyond Uber's test. In my town (and I'm sure others) there have been several tragic car-vs.-pedestrian accidents where the pedestrian was jaywalking at night on a poorly lit stretch of road. I've nearly hit such pedestrians wearing black on a dark road. No one wants to do anything about it on either front: pedestrians won't walk 150 ft to the crosswalk, proposed engineering protections to discourage jaywalking have been shut down by constituents, and the police are afraid to enforce jaywalking…

  • Quote: "Bloomberg is also reporting that Uber's self-driving test car wasn't programmed to recognize and react to jaywalkers. The report said "the system design did not include a consideration for jaywalking pedestrians." [Elaine Herzberg, the 49-year-old pedestrian who was struck by one of Uber's self-driving cars] was crossing the road outside of a crosswalk."

    Jaywalking in my country/location (Western Australia) is defined as a pedestrian crossing at traffic lights against a red 'don't walk' signal. This is…
