What Caused Uber's Fatal 2018 Crash? NTSB Reveals Its Findings (forbes.com)
This week America's National Transportation Safety Board presented its findings on the fatal 2018 crash in which an Uber test robocar struck a pedestrian in Arizona. Forbes reports:
The NTSB's final determination of probable cause put primary blame on the safety driver's inattention. Contributory causes were Uber's lack of safety culture, poor monitoring of safety drivers, and lack of countermeasures for automation complacency. They put tertiary blame on the pedestrian's impaired crossing of the road, and the lack of good regulations at the Arizona and Federal levels... When it comes to human fault, the report noted that [pedestrian] Herzberg had a "high concentration of methamphetamine" (more than 10 times the medicinal dose) in her blood which would alter her perception. She also had some marijuana residue. She did not look to her right at the oncoming vehicle until 1 second before the crash.
There was also confirmation that the safety driver had indeed pulled out a cell phone and was streaming a TV show on it, looking down at it 34% of the time during her driving session, with a full 5-second "glance" from 6 to 1 seconds prior to the impact. While Uber recorded videos of safety drivers, they never reviewed this driver's footage, and so never learned that she was violating the policy against cell phone use. She had received no reprimands, and had driven this stretch of road 73 times before... Had the vehicle operator been attentive, she would likely have had sufficient time to detect and react to the crossing pedestrian to avoid the crash or mitigate the impact. The vehicle operator's prolonged visual distraction, a typical effect of automation complacency, led to her failure to detect the pedestrian in time to avoid the collision. The Uber Advanced Technologies Group did not adequately recognize the risk of automation complacency and develop effective countermeasures to control the risk of vehicle operator disengagement, which contributed to the crash... The detrimental effect of the company's ineffective oversight was exacerbated by its decision to remove the second vehicle operator during testing of the automated driving system...
Most notably, they do not list the technology's failures among the causes of the crash. This is a correct cause ruling -- all tested vehicles, while generally better than Uber's, have flaws that would lead to a crash with a negligent safety driver, and to blame those flaws would be to blame the very idea of testing this way at all.
Forbes also notes the report criticizes Arizona's "shortcomings" in safeguarding the public because of the state's lack of a safety-focused application-approval process for automated driving system testing.
The article adds that today Uber "is only doing very limited testing -- just a one mile loop around their HQ limited to 25 miles per hour."
There's no way to win with one person in the car. (Score:5, Insightful)
One safety driver will become inattentive. It's just not the nature of biological brains to stay on high alert without a clear and present danger, and I think it will be determined (or maybe has been already) that nobody can be trusted to do it properly for more than a couple hours a day. When you actually drive, you make accommodations for current conditions, and this is enough engagement to keep your mind on task -- sometimes, at least. But when you don't even touch the wheel for hours at a time, there's no engagement, and without engagement, the only pressure to stay vigilant is that imposed from outside (the job requirements).
I put the blame squarely on the system that assumed a safety driver would be worth a shit over prolonged stretches of time. We're just not built for that.
Re: (Score:3)
One safety driver will become inattentive.
Ya, but who will watch out for the second safety driver?
I suspect the only effective solution will be Uber drivers all the way down [wikipedia.org].
Re: (Score:2)
I'm not sure I agree exactly. I think it's more about the human's perception of the vehicle's ability: if the driver thinks the vehicle is capable of full, unattended self-driving, it's natural that they behave accordingly. But the mere fact that a "safety driver" is required shows that the vehicle is not considered capable of that.
I have a car right now which will drive for hours on end on the highway without me touching the steering wheel or taking any other action. And yet I remain fully attentive,
Re: (Score:2)
if the driver thinks the vehicle is capable of full, unattended, self driving, it's natural that they behave accordingly.
But the Volvo was capable of full, unattended stopping when encountering an obstacle, and came that way from the factory. But the geniuses at Uber decided to turn that off for no particular reason.
Re: (Score:2)
Trains are equipped with a deadman brake. The operator must keep it engaged or the brakes are applied. In addition, many have a second device where the operator must periodically acknowledge a signal or the brakes are applied. The former covers cases where the operator is incapacitated, the latter is to force the operator to remain somewhat alert.
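As a rough illustration, the second kind of device is essentially a watchdog timer: prompt the operator periodically and stop the vehicle if no acknowledgment arrives in time. A minimal sketch in Python follows; the names, timings, and the apply_brakes callback are all made up for illustration, and this is not modeled on any specific railway's (or Uber's) actual system.

import threading
import time

ACK_WINDOW_SECONDS = 25       # time the operator has to acknowledge a prompt
PROMPT_INTERVAL_SECONDS = 60  # how often a prompt is issued while moving

class Alerter:
    """Hypothetical vigilance device: periodic prompt, brakes on timeout."""

    def __init__(self, apply_brakes):
        self._apply_brakes = apply_brakes  # callback into the brake system
        self._ack = threading.Event()

    def acknowledge(self):
        # Called when the operator presses the acknowledge button.
        self._ack.set()

    def run(self, is_moving):
        # Main vigilance loop; is_moving is a callable reporting motion.
        while True:
            time.sleep(PROMPT_INTERVAL_SECONDS)
            if not is_moving():
                continue
            self._ack.clear()
            print("ALERTER: acknowledge within %d seconds" % ACK_WINDOW_SECONDS)
            if not self._ack.wait(timeout=ACK_WINDOW_SECONDS):
                # No acknowledgment: assume the operator is incapacitated or
                # inattentive, and stop the vehicle.
                self._apply_brakes()

As the reply below points out, though, a device like this only proves the operator is awake and responding, not that they are actually watching the road.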
Re: (Score:2)
Simple deadman's switches get taped, weighted, or whatever other bypass is easily at hand. Periodic signals are hit on demand, without any increase in attentiveness. The former are almost always bypassed quickly by operators, the latter do the bare minimum to prove the driver is not asleep or dead, and no more.
The only way to really make people pay attention is for them to have an accurate understanding of the risks and capabilities of the vehicle. The real true cause of this incident is simply a misalignme
Re: (Score:2)
" and the vehicle's abilities."
Wasn't there a story recently about Uber disabling some component on that car that would have enabled automatic braking? Here it is [techcrunch.com]
Re: (Score:2)
Addressed in other places, but yes: the AEB system in the Volvo used the same radar frequency as the front radar that Uber installed. Instead of fixing their radar to not interfere, they just disabled the Volvo's. There's an indication that they've since fixed the problem and re-enabled the Volvo system.
Re: (Score:2)
In the case of consumers, yes. But since this is a test, professional drivers should be used and any bypassing of attentiveness devices is grounds for termination.
No system is circumvention proof. The best you can do is hire people professional enough not to do it and make it hard enough to keep them honest.
Re: (Score:2)
Well, this driver was not attentive and I'm sure she got fired. See, the system works and no changes are required. [/sarcasm]
Nagging, monitoring, etc. never work; they never have, and they never will. The only real solution is to make people want to follow the directions. The best way to do that is to make them realize that the car isn't self-driving and that they may die if they aren't paying attention.
Human factors can't be ignored by saying "just hire professionals"; professionals are still human. I've talke
Re: (Score:2)
That's why you also need the attentiveness monitor..
It doesn't matter how many times you tell someone the car isn't self driving, if it seems to be self driving long enough, they'll become complacent. Just look at the idiots who set the Tesla "autopilot" then climbed into the back seat.
Otherwise, we must ban all cars immediately; people can't be trusted to drive them.
Re: (Score:2)
Nobody has ever invented a functioning attentiveness monitor that can be mounted in a car.
As for autopilot, that's a marketing failure, not a technical one. I have a version of it, modified such that it never monitors for hands on the wheel. I routinely travel for multiple hours at a time on the highway without touching any controls. I'm no less attentive than I am if fully manually driving, and actually more so as my brain doesn't need to focus on the minutia of driving. The reason I'm attentive is that I'
Re: There's no way to win with one person in the c (Score:1)
"The best you can do is hire people professional enough"
If you want to hire professionals you need to pay a professional wage. Uber pays their safety drivers a shit wage, and therefore they get shit work. No surprise at all.
Fact doesn't actually follow (Score:1)
But they thought the car was self driving (which the fact you need a "safety driver" proves it is not)
Actually, I'd argue that needing a safety driver doesn't prove that the car isn't self driving. You'd need to examine WHY the car needs a safety driver, and more than that, why the driver themselves thinks they're needed.
I can think of a few:
1. The easiest, "The law/insurance requires it". I.e., the driver thinks that it is fully self-driving; their butt is only there for legal coverage.
2. The car still becomes stuck in select situations, they're there to navigate through the weird stuff. Going down a mos
Re: (Score:2)
How about "this is beta software and may kill you if you aren't paying attention". Tell the safety driver that, and I'm sure they'll pay attention. Most likely they were told "The law/insurance requires it" and the results were predictable.
Re: (Score:2)
Surely that's proof that the car isn't self-driving?
Re: (Score:1)
By that standard, humans aren't capable of driving.
Think of it like a learning driver - good in most situations, but you still need a fully licensed one there just in case.
Or how some people do stuff like drive into standing water and get stuck.
Re: (Score:2)
One safety driver will become inattentive. It's just not the nature of biological brains to stay on high alert without a clear and present danger,
It's not in the nature of biological brains to be able to watch TV and the road at the same time. They weren't even fucking trying to do their job.
Re: (Score:2)
Exactly. Is the argument "it can't be perfect, so don't even try"? That's a very poor excuse not to do better. In the immortal words of Paul Simon about Chernobyl: "I can't run but I can walk much faster than this."
Don't let the perfect be the enemy of the good. Uber's system could in principle have been pretty good, had anyone in the chain from driver to CEO given a shit about it.
Re: (Score:2)
The argument is "a human safety driver will never be perfect, so use something more appropriate or don't test in situations where you can hit anyone". If there is no suitable substitute for the ineffective safety driver, then work on that first. If that means not testing on public roads, so be it. If that means setting up a simulation town with people who are paid to be there and test the reactions of autonomous vehicles, so be it. Nobody said it was going to be cheap or easy.
Re: (Score:2)
"It's gonna steam engine come steam engine time." Doesn't matter how you feel about progress, progress will progress. These will be tested on public roads, since they have to be, tautologically so. No matter what you do, when you first start driving them on public roads you're testing.
So, the only reasonable question is "what's practical". And what's practical is a safety driver who gets fired if they're on their phone when they're supposed to be driving. There's a world of difference between "not lase
Re: (Score:2)
And what's practical is a safety driver who gets fired if they're on their phone when they're supposed to be driving.
And what makes you think she wasn't fired?
That lady is still dead.
Re: (Score:2)
Are you being deliberately dense? Or do you actually think this was the very first time evah the safety driver was blatantly not paying attention on the job? Of course it wasn't. If they had fired the safety driver the first time it happened, this particular crash would have been avoided.
But to do that, Uber management would actually have to care about safety, which clearly they don't.
Re: (Score:3)
This is a solved problem in the security space. Being a security guard is typically boring and people quickly become inattentive. Depending on the situation, you need to have either counter-incentives, or random challenges, or both.
For an example of a counter-incentive, you could pay someone to catch safety drivers not paying attention. Give them video feeds into the driver's seats and let them flip between them at minimum wage with a bonus anytime they manage to catch a driver on camera not paying attentio
Re: (Score:2)
I would agree it's hard to keep continual focus in a self-driving car for more than a couple of hours, for the reasons you give.
This might seem controversial, but would it be an improvement if the safety driver was allowed, or even encouraged, to watch a TV show projected semi-transparently on part of the windshield? Then at least she would constantly be looking in the direction of the road in front of her, and hopefully a reflex would kick in if an obstacle suddenly appeared.
Re: (Score:2)
One safety driver will become inattentive.
I'm assuming that the safety driver is a feature of the R&D program, the goal being to develop self-driving vehicles to the point that no driver will be needed. Otherwise Uber is wasting their money if, in the end, they will still be paying a person to be in the car.
So the solution is: Give the safety driver some duty involving observing the operation of the test vehicle and/or surrounding environment. Press a button every time you see a potential obstruction in the vehicle's path. Collect some sort of data invo
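As a rough sketch of that button-pressing idea (purely hypothetical, not anything Uber actually built), the test program could log the operator's button presses alongside the perception system's own detections and later measure how often the human flagged what the vehicle saw:

import time
from dataclasses import dataclass, field

@dataclass
class EngagementLog:
    """Hypothetical log pairing operator button presses with system detections."""
    operator_marks: list = field(default_factory=list)      # timestamps of button presses
    system_detections: list = field(default_factory=list)   # timestamps of perception events

    def operator_pressed(self):
        self.operator_marks.append(time.time())

    def system_detected(self):
        self.system_detections.append(time.time())

    def attentiveness_score(self, window=3.0):
        # Fraction of system detections the operator also flagged within
        # `window` seconds -- a crude proxy for attention, not a real metric.
        if not self.system_detections:
            return 1.0
        matched = sum(
            1 for d in self.system_detections
            if any(abs(m - d) <= window for m in self.operator_marks)
        )
        return matched / len(self.system_detections)

A persistently low score over a shift could then flag an operator for review long before anything goes wrong.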
Re: (Score:2)
Give the safety driver some duty involving observing the operation of the test vehicle and/or surrounding environment.
In The Amazing Race, sometimes they put up signs along the way and require the driver to memorize them.
Car's emergency braking system was disabled. (Score:2)
More detail was revealed about the disabling of the Volvo standard automatic emergency braking system that comes with the SUV. The Volvo system had its own radar on the same frequency as Uber’s radar and thus could not be used at the same time. Later, they were able to re-tune one of the radars so that both systems can be active.
This is also discussed elsewhere [businessinsider.com].
Tertiary blame... (Score:3)
Stoned pedestrian steps into traffic and is run over.
How is that not the primary cause of the accident?
Re: (Score:2)
Because the driver had plenty of time to react and did nothing.
Because the car itself had plenty of time to react and did nothing.
Re: (Score:2)
In a sane world, those would be the secondary and tertiary factors, not the primary.
That said, we don't live in a sane world, so a stoned person stepping into traffic shouldn't be to blame if someone who was otherwise in the right could have taken extreme measures to avoid the person who actually made the mistake.
Re: (Score:2)
If you think hitting the brakes is an extreme measure, you should never be allowed outside, never mind behind the wheel of a motor vehicle.
Re: (Score:2)
It's on the driver, not the car. The car's automation was being tested, so you would expect it to fail occasionally. Since there's no way to go from "not sure if it will fail" to "very sure it won't fail" without any testing (or any failures), there needs to be a second line of defense, which is the safety driver.
The NTSB report rightfully places the blame on her, and also on Uber for not making sure their drivers are paying attention on a regular basis.
Re: (Score:2)
Blame is not 100% driver/car, 0% pedestrian, though.
In every accident, there are multiple sources of blame. Yes, the driver and car share the majority of blame. But some amount of blame must be assigned to the pedestrian.
I just looked at the video to see if the pedestrian was in the crosswalk. She was not in a crosswalk, she was not even in an intersection. She was just cutting across four lanes of traffic in the dark, with no streetlights (not that streetlights would have helped in this case). In that case
culture of everyone being a victim (Score:2)
so, "It wasn't my fault because someone else should have saved me from the consequences my own mistake?"
This culture of everyone being a victim has got to stop.
Granted, this particular meth-head doesn't have to worry about dealing with the consequences of her actions anymore. Maybe if she had just gotten badly and chronically injured she might have gotten that golden opportunity to experience some sweet regret.
If I had a chance to save you from your own mistake, and I either didn't pull it off or just plai
Re: (Score:2)
If I had a chance to save you from your own mistake, and I either didn't pull it off or just plain missed the opportunity entirely, (or simply chose not to) that does NOT make it my fault.
Since the "I" in the case was a machine developed by Uber,
Uber is still definitely partially responsible.
Re: (Score:1, Flamebait)
How is that not the primary cause of the accident?
Because drug users are never to blame for anything involving them. They are the victims who have no personal responsibility for anything. Nothing they do is ever wrong, it's everyone around them who is at fault.
You have to remember, drug users are smarter than all the doctors and experts who keep reminding people of the danger of drugs, and the hundreds of billions of taxpayer dollars we spend every year reviving drug users from overdoses, for their repeat
Re: (Score:2)
No, it's because the NTSB wasn't determining legal fault. Once the pedestrian was in the road, the situation is set up for an accident. The NTSB is tasked with figuring out why the accident actually occurred, and not how the setup came to be.
Re: (Score:3)
How is that not the primary cause of the accident?
It depends where you are driving.
If you are driving in the country, you are vigilant for a deer running across the road.
If you are driving in the city, you are vigilant for a meth-head running across the road.
If you are driving in a developed country, and see a ball roll into the road, you brake because you know some kids will run into the road chasing the ball.
If you are driving in a third world country, and see a chicken run into the road, you brake because you know some kids will run into the road ch
Re: (Score:2)
In most jurisdictions that only applies where they are legally crossing the road.
Re: (Score:3)
In most jurisdictions that only applies where they are legally crossing the road.
Actually, no. In most jurisdictions, if you can avoid an accident, then you have to avoid it. And if you read the article, that's exactly the case in the USA, which is why the driver not paying attention was the main factor.
You are not allowed to hurt pedestrians. Whether they do something legal or illegal doesn't matter.
Re: (Score:2)
Is there actually anywhere where "pedestrians have right of way period"? Even if there is, the laws of physics don't care about your feelings. Step out from behind an obstacle into oncoming traffic, giving the driver no time to react, and physics will take its toll.
But this isn't about that. This is about a case where the driver might reasonably have stopped, which is exactly the distinction made in most jurisdictions, not your childish oversimplified mistake.
Re: (Score:2)
> Is there actually anywhere where "pedestrians have right of way period"?
Most parts of the world, apart from a few ex-USA colonies.
If you "drive like an american" (as in following US car-centric laws) in Europe, you WILL find yourself up on criminal charges pretty quickly.
Pedestrians have _ABSOLUTE_ right of way on the roadway and in general may cross anywhere they damned well please, or even stand in the middle of the road if they choose to do so. It is up to motorists to see and avoid them.
In particula
Re: (Score:2)
How nice to have a law that it's physically impossible to obey. But I suppose that's normal in Eurostan.
Re: (Score:2)
How nice to have a law that it's physically impossible to obey. But I suppose that's normal in Eurostan.
Slowing down and stopping for a pedestrian that you can see is physically impossible to do? I mean, I know that's not what you meant, but I'm confused about which part of the GP's post you object to. I don't think he, or European law, is talking about someone hidden behind a parked car who jumps right into traffic without warning. Just that if there's a pedestrian in the street, you have to wait for them, no matter what.
Maybe we're getting confused about what "right of way" really means. In both American an
Re: (Score:1)
For the same reason that if a vehicle in front of you on the freeway comes to a complete stop, you can't just plow into them. When you are in charge of two tons of steel travelling at high speed, you are responsible for how you operate it.
Re: (Score:2)
Because pedestrians suddenly turning up in front of you is a real and common dangerous situation, and controlling 2 tons of steel that can kill them puts the onus on the driver to make sure that they drive in such a way that they can stop if that happens, you sociopathic asshole.
Re: (Score:2)
puts the onus on the driver
And who was the "driver"?
At that instant, it was a machine developed by Uber, who thought putting a minimum-wage young woman alone in the driver's seat would compensate for disabling the car's auto-braking system.
Re: (Score:1)
Yes, I thought that was obvious? Of course Uber is to blame.
Re: (Score:2)
Stoned pedestrian steps into traffic and is run over. How is that not the primary cause of the accident?
Because we live in a society where responsibility lies with the operator of a multi-tonne death machine. Today it's a stoner, and they're a stoner, right? We shouldn't feel bad for stoners. Tomorrow it's a guy on the phone, and that's just Darwinism, right? We shouldn't feel bad about Darwinism. The day after that it's someone stepping out from a blind corner, but they shouldn't have been at a blind corner, and we don't feel bad for stupid people, right? The day after that it's a child chasing a ball they dropped, but it's just a child, right?
Re: (Score:2)
Because the USA has _extremely_ car-centric laws which are a direct result of GM/Chrysler/Ford lobbying in the 1930s. Some of those laws have bled over into other countries but thankfully most have not.
Those laws are so over the top that most drivers and pedestrians simply ignore them for practical purposes, and drivers don't ignore pedestrians on the road (if only because running someone over gets expensive in panel work).
Uber compounded things by releasing doctored video with the gamma wound way down to mak
So everyone and everything failed. Got it. (Score:2)
A perfect example of (Score:3)
Darwinian selection in action.
"Herzberg had a "high concentration of methamphetamine" (more than 10 times the medicinal dose) in her blood which would alter her perception. She also had some marijuana residue. She did not look to her right at the oncoming vehicle until 1 second before the crash."
Re: (Score:2)
Ah yes, being stoned should carry a death penalty.
It's not as if non-stoned pedestrians never turn up in front of a car, of course.
Re: (Score:2)
Darwinian selection in action.
A perfect example of an attitude that puts the per-capita pedestrian death rate of the USA 8x higher than in countries where drivers are legally liable for the safety of pedestrians. "I have a car, LOOK AT THE SIZE OF MY BALLS YOU WORTHLESS MEATBAGS!!"
No observations on the design of the highway (Score:5, Insightful)
I have more trust in the NTSB than just about any organization, but this report was unfortunately incomplete and deficient. There was absolutely zero analysis of the design of the highway, the associated sidewalks (and lack thereof) and the bike paths, and the composition and typical living activities of the local population. Phoenix in general and Tempe in particular have been designed under the theory that all residents will own and operate an automobile and that all locomotion from place to place will occur within an automobile; as a result there are zero, or even less than zero (actively hostile), design features serving pedestrians. Yet I have seen estimates of the non-driving population in the area around the accident zone as high as 40%. Lack of design to accommodate the non-car'd, and design patterns that are actively hostile to non-car-driving human beings, are clearly a contributing factor yet were not mentioned in any of the final reports.
Then there is the summary report's near tongue bath for Uber, which was very disturbing given that Uber has taken no responsibility for the crash of its research vehicle and the death of an innocent citizen.
Uber and the Silicon Valley "move fast and break things" world were fortunate - for themselves, not for society - that the human citizen who died was a person with no next of kin who were interested in looking out for her legacy, because if she had such (e.g. a loving spouse who was a trial lawyer) the entire self-driving car grift could have come crashing down with years of damning testimony and billions of dollars of damages.
Re: (Score:2)
Are you stupid on purpose, or are you genetically-deficient? The NTSB is tasked with figuring out if an accident could have been avoided. It doesn't matter why the pedestrian was in the road; it only matters whether the car could have avoided hitting her.
Re: (Score:2)
because if she had such (e.g. a loving spouse who was a trial lawyer) the entire self-driving car grift could have come crashing down with years of damning testimony and billions of dollars of damages.
I doubt that.
They would have just paid him off.
No one is worth billions of dollars, except for a select few.
Re: (Score:2)
They would have just paid him off.
And that is exactly what Uber did, immediately after the accident.
Of course, the terms were held under NDA.
Re: (Score:2)
"Yet I have seen estimates of the non-driving population in the area around the accident zone as high as 40%."
This is one of the very points that was made in the New York City case that had them pay out over $2 million for consistently failing in their duty of care to provide a safe environment for residents.
The fact that US laws are so loaded against pedestrians, the poor and RESIDENTS of these areas shows up clearly in that the family involved spent 8 YEARS getting this through the courts and paid out far mor
Re: (Score:2)
"Then there is what was very close to a tongue bath for Uber in the summary report, when Uber has taken no responsibility for the crash of its research vehicle and death of an innocent citizen, which was very disturbing."
Uber did far worse than that. They released doctored video purporting to be webcam footage from the car - it had been doctored with the gamma turned down to make the forward imaging extremely dark.
Drivers using the same route/intersection noticed this and posted THEIR webcam footage of the
...and to blame those flaws would be to blame... (Score:1)
...the people who came up with the idea of testing this way, those who allowed the testing, and those who were allowed to proceed with testing this way at all.
ftfy.
<sarcasm>God, no, we wouldn't want that to happen, would we?</sarcasm>
Blame everything except the 'self-driving' car... (Score:1)
This was a test of the car, with a safety driver present as baby sitter. Since there could not have been a failure of the safety driver without a failure of the car, attributing fault to only the human is a textbook-worthy example of fractured reasoning. Don't tell me the car isn't an agent to whom blame can be sensibly assigned; the AI of the car is presumably capable of making decisions about its environment in such a way to safely operate within it, or it shouldn't have been there in the first place. It
Link to actual report (Score:4)
These stories should really link to the actual report [ntsb.gov].
Re: (Score:1)
They should have a link to the actual report.
I find it crazy that they didn't name the meth head as the primary probable cause of the accident. Seems obvious. No meth head crossing illegally, no accident. That accident could have happened with a regular driver, or even with a professional police or truck driver.
Re: (Score:2)
No meth head crossing illegally, no accident.
Damn straight.
If there were no meth head, or no Uber, or no highway, or no cars, or no people, then there would never be a problem like this.
Re: (Score:1)
No meth head crossing illegally, no accident.
Damn straight.
If there were no meth head, or no Uber, or no highway, or no cars, or no people, then there would never be a problem like this.
That's really unfair. Are you familiar with this accident? It sounds like you're not. There are things that are preventable. The point I'm making is that I don't think technology had anything to do with this accident. It would have happened no matter who was at the wheel. It was her bad judgement. Ever work with meth heads?
https://www.rehabs.com/explore... [rehabs.com]. Not exactly people with the best of judgement. Not even good judgement. In fact they make bad decisions all the time.
Let's not make life too foolproof. Too many f
Re: (Score:1)
The meth head illegally crossing is a non-issue. I wouldn't want my self-driving car hitting a deer either.
We need new tech (Score:2)
It has to be tested on meth-addicted, jaywalking, 100-meter Olympic record holders.