Transportation AI Government

What Caused Uber's Fatal 2018 Crash? NTSB Reveals Its Findings (forbes.com) 82

This week America's National Transportation Safety Board presented its findings on the fatal 2018 crash in which an Uber test robocar struck a pedestrian in Arizona. Forbes reports: The NTSB's final determination of probable cause put primary blame on the safety driver's inattention. Contributory causes were Uber's lack of safety culture, poor monitoring of safety drivers, and lack of countermeasures for automation complacency. They put tertiary blame on the pedestrian's impaired crossing of the road, and the lack of good regulations at the Arizona and Federal levels... When it comes to human fault, the report noted that [pedestrian] Herzberg had a "high concentration of methamphetamine" (more than 10 times the medicinal dose) in her blood, which would alter her perception. She also had some marijuana residue. She did not look to her right at the oncoming vehicle until 1 second before the crash.

There was also confirmation that the safety driver had indeed pulled out a cell phone and was streaming a TV show on it, looking down at it 34% of the time during her driving session, with a full 5 second "glance" from 6 to 1 seconds prior to the impact. While Uber recorded videos of safety drivers, they never reviewed those of this driver to learn that she was violating the policy against cell phone use. She had received no reprimands, and driven this stretch of road 73 times before... Had the vehicle operator been attentive, she would likely have had sufficient time to detect and react to the crossing pedestrian to avoid the crash or mitigate the impact. The vehicle operator's prolonged visual distraction, a typical effect of automation complacency, led to her failure to detect the pedestrian in time to avoid the collision. The Uber Advanced Technologies Group did not adequately recognize the risk of automation complacency and develop effective countermeasures to control the risk of vehicle operator disengagement, which contributed to the crash... The detrimental effect of the company's ineffective oversight was exacerbated by its decision to remove the second vehicle operator during testing of the automated driving system...
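As a rough back-of-the-envelope check of that "sufficient time" finding (purely illustrative Python; the speed, reaction time, and deceleration figures are assumptions, not values taken from the report):

MPH_TO_MPS = 0.44704
speed = 40 * MPH_TO_MPS          # assumed test speed of ~40 mph (~17.9 m/s)
glance = 5.0                     # length of the distracted "glance" reported, in seconds
reaction = 1.5                   # assumed perception-reaction time of an attentive driver
decel = 7.0                      # assumed hard-braking deceleration in m/s^2 (~0.7 g)

distance_during_glance = speed * glance                        # ~89 m covered while looking down
stopping_distance = speed * reaction + speed**2 / (2 * decel)  # ~50 m to perceive, brake, and stop

print(round(distance_during_glance), round(stopping_distance))

Under these assumed numbers the car covers roughly 89 meters during the five-second glance, while an attentive driver would have needed only about 50 meters to stop, which is consistent with the finding that attentiveness alone would likely have avoided or at least mitigated the impact.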

Most notably, they do not attribute the crash to failures of the technology. This is a correct cause ruling -- all tested vehicles, while generally better than Uber's, have flaws that would lead to a crash with a negligent safety driver, and to blame those flaws would be to blame the idea of testing this way at all.

Forbes also notes the report criticizes Arizona's "shortcomings" in safeguarding the public because of the state's lack of a safety-focused application-approval process for automated driving system testing.

The article adds that today Uber "is only doing very limited testing -- just a one mile loop around their HQ limited to 25 miles per hour."
  • by Mal-2 ( 675116 ) on Saturday November 23, 2019 @03:01PM (#59446486) Homepage Journal

    One safety driver will become inattentive. It's just not the nature of biological brains to stay on high alert without a clear and present danger, and I think it will be determined (or maybe has been already) that nobody can be trusted to do it properly for more than a couple hours a day. When you actually drive, you make accommodations for current conditions, and this is enough engagement to keep your mind on task -- sometimes, at least. But when you don't even touch the wheel for hours at a time, there's no engagement, and without engagement, the only pressure to stay vigilant is that imposed from outside (the job requirements).

    I put the blame squarely on the system that assumed a safety driver would be worth a shit over prolonged stretches of time. We're just not built for that.

    • One safety driver will become inattentive.

      Ya, but who will watch out for the second safety driver?
      I suspect the only effective solution will be Uber drivers all the way down [wikipedia.org].

    • by green1 ( 322787 )

      I'm not sure I agree exactly. I think it's more about the human's perception of the vehicle's ability: if the driver thinks the vehicle is capable of full, unattended self-driving, it's natural that they behave accordingly. But the mere fact that a "safety driver" is required shows that the vehicle is not considered capable of that.

      I have a car right now which will drive for hours on end on the highway without me touching the steering wheel or taking any other action. And yet I remain fully attentive,

      • if the driver thinks the vehicle is capable of full, unattended, self driving, it's natural that they behave accordingly.

        But the Volvo was capable of full, unattended stopping when
        encountering an obstacle, and came that way from the factory.
        But the geniuses at Uber decided to turn that off for no particular reason.

    • by sjames ( 1099 )

      Trains are equipped with a deadman brake. The operator must keep it engaged or the brakes are applied. In addition, many have a second device where the operator must periodically acknowledge a signal or the brakes are applied. The former covers cases where the operator is incapacitated, the latter is to force the operator to remain somewhat alert.
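      A minimal sketch of that alerter idea (illustrative only; the interval, window, and callback names are assumptions, not any real train system's logic):

      import time

      ALERT_INTERVAL = 30.0   # assumed: prompt the operator every 30 seconds
      RESPONSE_WINDOW = 5.0   # assumed: seconds allowed to acknowledge the prompt

      def run_alerter(ack_pressed, sound_alarm, apply_brakes):
          # ack_pressed() is polled for the acknowledgement button;
          # sound_alarm() and apply_brakes() are callbacks into the train's systems
          while True:
              time.sleep(ALERT_INTERVAL)
              sound_alarm()
              deadline = time.monotonic() + RESPONSE_WINDOW
              while time.monotonic() < deadline:
                  if ack_pressed():      # acknowledged in time; keep running
                      break
                  time.sleep(0.1)
              else:                      # no acknowledgement: treat the operator as incapacitated
                  apply_brakes()
                  return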

      • by green1 ( 322787 )

        Simple deadman's switches get taped, weighted, or whatever other bypass is easily at hand. Periodic signals are hit on demand, without any increase in attentiveness. The former are almost always bypassed quickly by operators, the latter do the bare minimum to prove the driver is not asleep or dead, and no more.

        The only way to really make people pay attention is for them to have an accurate understanding of the risks and capabilities of the vehicle. The real true cause of this incident is simply a misalignme

        • " and the vehicle's abilities."

          Wasn't there a story recently about Uber disabling some component on that car that would have enabled automatic braking? Here it is [techcrunch.com]

          • by green1 ( 322787 )

            Addressed in other places, but yes, the AEB system in the Volvo cars used the same radar frequency as the front radar that Uber installed; instead of fixing their radar to not interfere, they just disabled the one on the Volvo. There's an indication that they've since fixed the problem and re-enabled the Volvo one.

        • by sjames ( 1099 )

          In the case of consumers, yes. But since this is a test, professional drivers should be used and any bypassing of attentiveness devices is grounds for termination.

          No system is circumvention proof. The best you can do is hire people professional enough not to do it and make it hard enough to keep them honest.

          • by green1 ( 322787 )

            Well, this driver was not attentive and I'm sure she got fired. See, the system works and no changes are required. [/sarcasm]

            Nagging, monitoring, etc., never works; it never has, and it never will. The only real solution is to make people want to follow the directions. The best way to do that is to make them realize that the car isn't self-driving and that they may die if they aren't paying attention.

            Human factors can't be ignored by saying "just hire professionals"; professionals are still human. I've talke

            • by sjames ( 1099 )

              That's why you also need the attentiveness monitor.

              It doesn't matter how many times you tell someone the car isn't self driving, if it seems to be self driving long enough, they'll become complacent. Just look at the idiots who set the Tesla "autopilot" then climbed into the back seat.

              Otherwise, we must ban all cars immediately; people can't be trusted to drive them.

              • by green1 ( 322787 )

                Nobody has ever invented a functioning attentiveness monitor that can be mounted in a car.

                As for autopilot, that's a marketing failure, not a technical one. I have a version of it, modified such that it never monitors for hands on the wheel. I routinely travel for multiple hours at a time on the highway without touching any controls. I'm no less attentive than I am if fully manually driving, and actually more so as my brain doesn't need to focus on the minutia of driving. The reason I'm attentive is that I'

                • I agree, Super Cruise comes the closest, and even that would allow you to daydream with your eyes open and looking forward.
          • "The best you can do is hire people professional enough"

            If you want to hire professionals you need to pay a professional wage. Uber pays their safety drivers a shit wage, and therefore they get shit work. No surprise at all.

        • But they thought the car was self-driving (which the fact that you need a "safety driver" proves it is not)

          Actually, I'd argue that needing a safety driver doesn't prove that the car isn't self driving. You'd need to examine WHY the car needs a safety driver, and more than that, why the driver themselves thinks they're needed.
          I can think of a few:
          1. The easiest: "The law/insurance requires it." I.e., the driver thinks that it is fully self-driving; their butt is only there for legal coverage.
          2. The car still becomes stuck in select situations, they're there to navigate through the weird stuff. Going down a mos

          • by green1 ( 322787 )

            How about "this is beta software and may kill you if you aren't paying attention". Tell the safety driver that, and I'm sure they'll pay attention. Most likely they were told "The law/insurance requires it" and the results were predictable.

          • by mvdwege ( 243851 )

            The car still becomes stuck in select situations, they're there to navigate through the weird stuff. Going down a mostly empty road isn't weird enough.

            Surely that's proof that the car isn't self-driving?

            • By that standard, humans aren't capable of driving.

              Think of it like a learning driver - good in most situations, but you still need a fully licensed one there just in case.

              Or how some people do stuff like drive into standing water and get stuck.

        • I believe any software will have bugs, and if you are running it a billion times a day [with, say, 100m vehicles - similar to the 2010 Toyota crisis: unintended acceleration - no, surely not the floor mat sticking], you will see failures (crash/death). I guess it's inherent in any decision-making system (wet ones like brains too). So as long as self-driving cars lead to fewer accidents than human drivers, as a society/group, we humans should take that path. I guess it's a price we need to pay for the extra comfort/
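          Rough arithmetic behind that scale argument (the numbers are illustrative assumptions, not measured rates):

          decisions_per_day = 1e9   # assumed fleet-wide volume of safety-critical decisions
          failure_rate = 1e-8       # assumed per-decision probability of a dangerous failure
          print(decisions_per_day * failure_rate)   # 10.0 -> about ten dangerous failures per day

          Even a one-in-a-hundred-million failure rate produces a steady stream of incidents at that volume, which is why the comparison that matters is against the (also nonzero) human failure rate.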
    • One safety driver will become inattentive. It's just not the nature of biological brains to stay on high alert without a clear and present danger,

      It's not in the nature of biological brains to be able to watch TV and the road at the same time. They weren't even fucking trying to do their job.

      • by lgw ( 121541 )

        Exactly. Is the argument "it can't be perfect, so don't even try"? That's a very poor excuse not to do better. In the immortal words of Paul Simon about Chernobyl: "I can't run but I can walk much faster than this."

        Don't let the perfect be the enemy of the good. Uber's system could in principle have been pretty good, had anyone in the chain from driver to CEO given a shit about it.

        • by Mal-2 ( 675116 )

          The argument is "a human safety driver will never be perfect, so use something more appropriate or don't test in situations where you can hit anyone". If there is no suitable substitute for the ineffective safety driver, then work on that first. If that means not testing on public roads, so be it. If that means setting up a simulation town with people who are paid to be there and test the reactions of autonomous vehicles, so be it. Nobody said it was going to be cheap or easy.

          • by lgw ( 121541 )

            "It's gonna steam engine come steam engine time." Doesn't matter how you feel about progress, progress will progress. These will be tested on public roads, since they have to be, tautologically so. No matter what you do, when you first start driving them on public roads you're testing.

            So, the only reasonable question is "what's practical". And what's practical is a safety driver who gets fired if they're on their phone when they're supposed to be driving. There's a world of difference between "not lase

            • And what's practical is a safety driver who gets fired if they're on their phone when they're supposed to be driving.

              And what makes you think she wasn't fired?
              That lady is still dead.

              • by lgw ( 121541 )

                Are you being deliberately dense? Or do you actually think this was the very first time evah the safety driver was blatantly not paying attention on the job? Of course it wasn't. If they had fired the safety driver the first time it happened, this particular crash would have been avoided.

                But to do that, Uber management would actually have to care about safety, which clearly they don't.

    • This is a solved problem in the security space. Being a security guard is typically boring and people quickly become inattentive. Depending on the situation, you need to have either counter-incentives, or random challenges, or both.

      For an example of a counter-incentive, you could pay someone to catch safety drivers not paying attention. Give them video feeds into the driver's seats and let them flip between them at minimum wage with a bonus anytime they manage to catch a driver on camera not paying attentio
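      A sketch of that random spot-check scheme (the feed IDs and helper functions here are hypothetical, purely to illustrate the idea):

      import random
      import time

      def spot_check(feed_ids, fetch_clip, review_clip, checks_per_hour=30):
          # Show the reviewer short clips from randomly chosen driver feeds,
          # so no driver can predict when they are being watched.
          interval = 3600 / checks_per_hour
          while True:
              feed = random.choice(feed_ids)         # unpredictable, uniform sampling
              clip = fetch_clip(feed, seconds=20)    # hypothetical helper: grab a short clip
              if not review_clip(clip):              # reviewer flags inattention -> bonus + follow-up
                  print("flag driver on feed", feed)
              time.sleep(interval)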

    • I would agree it's hard to keep continual focus in a self-driving car for more than a couple of hours, for the reasons you give.

      This might seem controversial, but would it be an improvement if the safety driver was allowed, or even encouraged, to watch a TV show projected semi-transparently on part of the windshield? Then at least she would constantly be looking in the direction of the road in front of her, and hopefully a reflex would kick in if an obstacle suddenly appeared.

    • by PPH ( 736903 )

      One safety driver will become inattentive.

      I'm assuming that the safety driver is a feature of the R&D program, the goal being to develop self-driving vehicles to the point that no driver will be needed. Otherwise Uber is wasting its money if, in the end, it will still be paying a person to be in the car.

      So the solution is: Give the safety driver some duty involving observing the operation of the test vehicle and/or surrounding environment. Press a button every time you see a potential obstruction in the vehicle's path. Collect some sort of data invo

      • TFS says the car basically drives around the block a gazillion times for hours on end. Imagine driving not an average city street, but an average NASCAR track, at night, alone, round and round, and you'll see it becomes a test of human endurance. Even a comparably uber-boring (no pun intended) job like the airport shuttle bus driver (back and forth the half-mile between the plane and terminal, all day long) has more variety, let alone train drivers, etc. An accident was bound to happen sooner or later, task
      • Give the safety driver some duty involving observing the operation of the test vehicle and/or surrounding environment.

        In The Amazing Race, sometimes they put up signs along the way and require the driver to memorize them.

  • The Volvo has its own emergency braking system, but it had been disabled. I had read earlier that this was to provide a "better ride", but it now appears it was because the Uber and Volvo systems used the same radar frequencies, which has apparently now been corrected. From TFA:

    More detail was revealed about the disabling of the Volvo standard automatic emergency braking system that comes with the SUV. The Volvo system had its own radar on the same frequency as Uber’s radar and thus could not be used at the same time. Later, they were able to re-tune one of the radars so that both systems can be active.

    This is also discussed elsewhere [businessinsider.com].

  • by cnaumann ( 466328 ) on Saturday November 23, 2019 @03:29PM (#59446522)

    Stoned pedestrian steps into traffic and is run over.

    How is that not the primary cause of the accident?

    • Because the driver had plenty of time to react and did nothing.

      Because the car itself had plenty of time to react and did nothing.

      • by green1 ( 322787 )

        In a sane world, those would be the secondary and tertiary factors, not the primary.

        That said, we don't live in a sane world, so a stoned person stepping into traffic shouldn't be to blame if someone who was otherwise in the right could have taken extreme measures to avoid the person who actually made the mistake.

        • by Dog-Cow ( 21281 )

          If you think hitting the brakes is an extreme measure, you should never be allowed outside, never mind behind the wheel of a motor vehicle.

      • by djinn6 ( 1868030 )

        It's on the driver, not the car. The car's automation was being tested, so you would expect it to fail occasionally. Since there's no way to go from "not sure if it will fail" to "very sure it won't fail" without any testing (or any failures), there needs to be a second line of defense, which is the safety driver.

        The NTSB report rightfully places the blame on her, and also on Uber for not making sure their drivers are paying attention on a regular basis.
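        One way to make the "very sure it won't fail" point concrete is the statistical rule of three: after n failure-free trials, the one-sided 95% upper confidence bound on the failure rate is roughly 3/n. A tiny sketch (the trial counts are illustrative):

        def rule_of_three_upper_bound(n_trials):
            # 95% upper bound on failure probability after n failure-free, independent trials
            return 3.0 / n_trials

        for n in (1_000, 100_000, 10_000_000):
            print(n, "failure-free trials -> failure rate below ~", rule_of_three_upper_bound(n))

        Getting to "very sure it won't fail" therefore takes enormous amounts of incident-free testing, which is exactly why a second line of defense is needed in the meantime.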

        • Blame is not 100% driver/car, 0% pedestrian, though.
          In every accident, there are multiple sources of blame. Yes, the driver and car share the majority of blame. But some amount of blame must be assigned to the pedestrian.

          I just looked at the video to see if the pedestrian was in the crosswalk. She was not in a crosswalk; she was not even in an intersection. She was just cutting across four lanes of traffic in the dark, with no streetlights (not that streetlights would have helped in this case). In that case

      • So, "It wasn't my fault because someone else should have saved me from the consequences of my own mistake?"

        This culture of everyone being a victim has got to stop.

        Granted, this particular meth-head doesn't have to worry about dealing with the consequences of her actions anymore. Maybe if she had just gotten badly and chronically injured she might have gotten that golden opportunity to experience some sweet regret.

        If I had a chance to save you from your own mistake, and I either didn't pull it off or just plai

        • If I had a chance to save you from your own mistake, and I either didn't pull it off or just plain missed the opportunity entirely, (or simply chose not to) that does NOT make it my fault.

          Since the "I" in the case was a machine developed by Uber,
          Uber is still definitely partially responsible.

    • Re: (Score:1, Flamebait)

      by quonset ( 4839537 )

      How is that not the primary cause of the accident?

      Because drug users are never to blame for anything involving them. They are the victims who have no personal responsibility for anything. Nothing they do is ever wrong, it's everyone around them who is at fault.

      You have to remember, drug users are smarter than all the doctors and experts who keep reminding people of the danger of drugs, and the hundreds of billions of taxpayer dollars we spend every year reviving drug users from overdoses, for their repeat

      • by Dog-Cow ( 21281 )

        No, it's because the NTSB wasn't determining legal fault. Once the pedestrian was in the road, the situation is set up for an accident. The NTSB is tasked with figuring out why the accident actually occurred, and not how the setup came to be.

    • How is that not the primary cause of the accident?

      It depends where you are driving.

      If you are driving in the country, you are vigilant for a deer running across the road.

      If you are driving in the city, you are vigilant for a meth-head running across the road.

      If you are driving in a developed country, and see a ball roll into the road, you brake because you know some kids will run into the road chasing the ball.

      If you are driving in a third world country, and see a chicken run into the road, you brake because you know some kids will run into the road ch

    • by Gimric ( 110667 )

      For the same reason that if a vehicle in front of you on the freeway comes to a complete stop, you can't just plow into them. When you are in charge of two tons of steel travelling at high speed, you are responsible for how you operate it.

    • by mvdwege ( 243851 )

      Because pedestrians suddenly turning up in front of you is a real and common dangerous situation, and controlling 2 tons of steel that can kill them puts the onus on the driver to make sure that they drive in such a way that they can stop if that happens, you sociopathic asshole.

      • puts the onus on the driver

        And who was the "driver"?
        At that instant, it was a machine developed by Uber,
        which thought putting a minimum-wage young woman alone in the
        driver's seat would compensate for disabling the car's
        auto-braking system.

    • Stoned pedestrian steps into traffic and is run over. How is that not the primary cause of the accident?

      Because we live in a society where responsibility lies on the operator of a multi-tonne death machine. Today it's a stoner, they're a stoner right? We shouldn't feel bad for stoners. Tomorrow it's a guy on the phone, that's just Darwinism right? We shouldn't feel bad for Darwinism. The day after that it's someone coming out of a blind corner, but they shouldn't have been at a blind corner, and we don't feel bad for stupid people right? The day after it's a child chasing a ball they dropped, but it's just a child right?

    • Because the USA has _extremely_ car-centric laws which are a direct result of GM/Chrysler/Ford lobbying in the 1930s. Some of those laws have bled over into other countries but thankfully most have not.

      Those laws are so over the top that most drivers and pedestrians simply ignore them for practical purposes, and drivers don't ignore pedestrians on the road (if only because running someone over gets expensive in panel work).

      Uber compounded things by releasing doctored video with the gamma wound way down to mak

  • Except for that innocent kitten sitting on the grassy knoll.
  • That's funny because 25 is a tad fast for the loop around HQ.
  • by Vinegar Joe ( 998110 ) on Saturday November 23, 2019 @04:30PM (#59446684)

    Darwinian selection in action.

    "Herzberg had a "high concentration of methamphetamine" (more than 10 times the medicinal dose) in her blood which would alter her perception. She also had some marijuana residue. She did not look to her right at the oncoming vehicle until 1 second before the crash."

    • I dodge druggies all the time. It's possible to drive by these walking disasters and not get them killed.
    • by mvdwege ( 243851 )

      Ah yes, being stoned should carry a death penalty.

      It's not as if non-stoned pedestrians never turn up in front of a car, of course.

    • Darwinian selection in action.

      A perfect example of an attitude which puts the pedestrian death rate per capita of the USA 8x higher than countries where drivers are legally liable for the safety of pedestrians. "I have a car, LOOK AT THE SIZE OF MY BALLS YOU WORTHLESS MEATBAGS!!"

  • by sphealey ( 2855 ) on Saturday November 23, 2019 @04:39PM (#59446716)

    I have more trust in the NTSB than just about any organization, but this report was unfortunately incomplete and deficient. There was absolutely zero analysis of the design of the highway, the associated sidewalks (and lack thereof) and the bike paths, and the composition and typical living activities of the local population. Phoenix in general and Tempe in particular have been designed under the theory that all residents will own and operate an automobile and that all locomotion from place to place will occur within an automobile; as a result there are zero, or less than zero (actively hostile), design features serving pedestrians. Yet I have seen estimates of the non-driving population in the area around the accident zone as high as 40%. Lack of design to accommodate the non-car'd, and design patterns that are actively hostile to non-car-driving human beings, are clearly a contributing factor, yet they were not mentioned in any of the final reports.

    Then there is what was very close to a tongue bath for Uber in the summary report, which was very disturbing given that Uber has taken no responsibility for the crash of its research vehicle and the death of an innocent citizen.

    Uber and the Silicon Valley "move fast and break things" world were fortunate - for themselves, not for society - that the human citizen who died was a person with no next of kin who were interested in looking out for her legacy, because if she had such (e.g. a loving spouse who was a trial lawyer) the entire self-driving car grift could have come crashing down with years of damning testimony and billions of dollars of damages.

    • She was one of the deplorables. Nobody cares about them. If they all died tomorrow there would be great rejoicing throughout the world.
    • by Dog-Cow ( 21281 )

      Are you stupid on purpose, or are you genetically-deficient? The NTSB is tasked with figuring out if an accident could have been avoided. It doesn't matter why the pedestrian was in the road; it only matters whether the car could have avoided hitting her.

    • because if she had such (e.g. a loving spouse who was a trial lawyer) the entire self-driving car grift could have come crashing down with years of damning testimony and billions of dollars of damages.

      I doubt that.

      They would have just paid him off.

      No one is worth billions of dollars, except for a select few.

      • They would have just paid him off.

        And that is exactly what Uber did, immediately after the accident.
        Of course, the terms were held under NDA.

    • "Yet I have seen estimates of the non-driving population in the area around the accident zone as high as 40%."

      This is one of the very points that was made in the New York City case that had them pay out over $2 million for consistently failing in their duty of care to provide a safe environment for residents.

      The fact that US laws are so loaded against pedestrians, the poor and RESIDENTS of these areas shows up clearly in that the family involved spent 8 YEARS getting this through the courts and paid out far mor

    • "Then there is what was very close to a tongue bath for Uber in the summary report, when Uber has taken no responsibility for the crash of its research vehicle and death of an innocent citizen, which was very disturbing."

      Uber did far worse than that. They released doctored video purporting to be webcam footage from the car; the gamma had been turned down to make the forward imaging extremely dark.

      Drivers using the same route/intersection noticed this and posted THEIR webcam footage of the

  • ...the people who came up with the idea of and allowed testing of and were allowed to proceed testing this way at all.

    ftfy.

    <sarcasm>God, no, we wouldn't want that to happen, would we?</sarcasm>

  • This was a test of the car, with a safety driver present as babysitter. Since there could not have been a failure of the safety driver without a failure of the car, attributing fault to only the human is a textbook-worthy example of fractured reasoning. Don't tell me the car isn't an agent to whom blame can be sensibly assigned; the AI of the car is presumably capable of making decisions about its environment in such a way as to safely operate within it, or it shouldn't have been there in the first place. It

    • I'm not sure I understand your logic here. The car was a test unit, which you acknowledge; ultimately the "babysitter" was actually the driver, and was responsible for ensuring the vehicle operated safely at all times. NTSB put fault on the driver because she was watching television until 1 second before impact, and if she'd been watching the road -- as she was being paid to do -- she could have totally avoided the crash. The footage showed the distance to the pedestrian and the velocity; it was a simply ex
  • by djinn6 ( 1868030 ) on Sunday November 24, 2019 @02:00AM (#59447810)

    These stories should really link to the actual report [ntsb.gov].

    • by ebvwfbw ( 864834 )

      They should have a link to the actual report.
      I find it crazy that they didn't find the meth head to be the primary probable cause of the accident. Seems obvious. No meth head crossing illegally, no accident. That accident could have happened with a regular driver. Even a professional police or truck driver.

      • No meth head crossing illegally, no accident.

        Damn straight.
        No meth head, or no Uber, or no highway, or no cars, or
        no people then there would never be a problem like this.

        • by ebvwfbw ( 864834 )

          No meth head crossing illegally, no accident.

          Damn straight.
          No meth head, or no Uber, or no highway, or no cars, or
          no people then there would never be a problem like this.

          That's really unfair. Are you familiar with this accident? Sounds like you're not. There are things that are preventable. The point I'm making is I don't think technology had anything to do with this accident. It would have happened with anyone at the wheel. It was her bad judgement. Ever work with meth heads?
          https://www.rehabs.com/explore... [rehabs.com] . Not exactly people with the best of judgement. Not even good judgement. In fact they make bad decisions all the time.

          Let's not make life too foolproof. Too many f

      • The meth head illegally crossing is a non-issue. I wouldn't want my self-driving car hitting a deer either.

  • It has to be tested on meth-addicted jaywalking 100 meter Olympian record holders.
