Uber Driver Was Streaming Hulu Just Before Fatal Self-Driving Car Crash, Police Say (arstechnica.com)
An anonymous reader quotes a report from Ars Technica: Tempe, Arizona, police have released a massive report on the fatal Uber vehicle crash that killed pedestrian Elaine Herzberg in March. The report provides more evidence that driver Rafaela Vasquez was distracted in the seconds before the crash. "This crash would not have occurred if Vasquez would have been monitoring the vehicle and roadway conditions and was not distracted,'' the report concludes. Police obtained records from Hulu suggesting that Vasquez was watching "The Voice," a singing talent competition that airs on NBC, just before the crash. Hulu's records showed she began watching the program at 9:16pm. Streaming of the show ended at 9:59pm, which "coincides with the approximate time of the collision," according to the police report.
I remember a lot of people defending Uber (Score:1)
Re: (Score:3)
How's the crow taste?
Err, your post makes no sense whatsoever. The people who were defending Uber's self-driving car were blaming human error from the beginning...
Re: (Score:1)
To be fair, if the woman hadn't been jaywalking on a dimly lit street at night in front of oncoming traffic, the accident also wouldn't have happened. There were two people making poor decisions, their paths crossed, and one of them died because of it. It sucks.
Re:I remember a lot of people defending Uber (Score:5, Insightful)
Re: (Score:1)
Human eye works way better
It shouldn't. Silicon devices should be much more sensitive than human eyes. Someone cheaped out?
Re:I remember a lot of people defending Uber (Score:4, Insightful)
You have no clue how good the human eye is and how poor a digital replica is, do you?
Re: (Score:2)
Re: (Score:2)
You have no clue how good the human eye is and how poor a digital replica is, do you?
The GP is narrow minded. CCDs are definitely far more sensitive than the human eye but they suffer greatly in the way the resulting image is processed. All the sensitivity in the world doesn't help if you clip the highlights, compress the result to 8bit, and display it on a shitty monitor with a 200:1 contrast ratio.
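To make the clipping point concrete, here is a toy sketch (the function name and luminance figures are made up for illustration; this is not any real camera pipeline): naively scaling a high-dynamic-range scene into 8 bits sends dim detail straight to black.

```python
def to_8bit(luminance_cd_m2, scene_max_cd_m2):
    """Naively scale a linear luminance into a single 8-bit pixel value."""
    return min(255, int(255 * luminance_cd_m2 / scene_max_cd_m2))

# A dim pedestrian (~2 cd/m^2) in a scene whose exposure is anchored to
# bright highlights (~5000 cd/m^2) quantizes to pure black:
print(to_8bit(2, 5000))      # 0 -- the detail is gone before any display
print(to_8bit(5000, 5000))   # 255
```

Real pipelines apply a tone curve rather than a straight linear scale, but the highlight/shadow trade-off described above remains.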
Re: (Score:2)
Re: (Score:2)
No, they are limitations of the display. Your eyes adjust every point dynamically in real time. You can do that in software too, and the result looks like shit. There's a reason everything around you looks black when you take a photo into the sunset: the alternative looks like garbage.
Also, you want to capture realtime video in HDR with almost no compression? Good luck with your technology. Your $200 dashcam suddenly isn't $200 anymore.
Re: (Score:2)
No they are limitations of display.
But there's no "display" in self-driving car's "brain". Only an FP framebuffer which doesn't have these limitations.
Also you want to capture realtime video in HDR with almost no compression?
You're *not* trying to store it so it's irrelevant what a dashcam can or cannot do. (Your brain is not trying to store it either after all.)
Re: (Score:2)
Oh you're talking about the car navigation system not the feed. Right. Well that makes everything you said irrelevant since the car navigates using LIDAR and doesn't care how light or dark it is.
You're *not* trying to store it so it's irrelevant what a dashcam can or cannot do. (Your brain is not trying to store it either after all.)
Side note: Your brain definitely stores it. Your vision is actually quite horrible. What you see is made up of an assessment of a lot of "stored frames" each individually quite horrible. But our brain is great at building a visual world out of a continuous crappy feed.
Re: (Score:2)
Did the Uber car use LIDAR? It had the module installed, it may have even been on, but the software didn't give a fucking shit either way.
Neither did the "driver". Neither did Uber.
Re: (Score:2)
Re: (Score:2)
You're an idiot and a blind Uber defender
I'm saying that Uber uses shit cameras and therefore I'm an Uber defender? You should have your head checked.
Re: (Score:2)
That's not how any of this works.
Digital telescopes at high sensitivities require perfect stillness for long periods of time and isolation from all other light to get a fuzzy dot to filter.
And such telescopes don't exactly fit in the human skull.
Have you ever used a film camera? How did the shot come out compared to what you saw in real life?
Re: (Score:2)
Increasing gain doesn't do shit other than up the noise floor.
Guess which car didn't use LIDAR, or didn't react to the results of anything?
Guess which car and camera we're talking about. You can go and watch the video. If your human eye can't see tons better than that camera you shouldn't have a license.
Further, go look up other videos of that area. It's NOT dimly lit. This was at best a crappy camera and crappy software with an inattentive "driver" and a complacent Uber. In actuality, I believe it's a
Re: (Score:2)
It shouldn't. Silicon devices should be much more sensitive than human eyes. Someone cheaped out?
Yes they are, and the result is that we take this awesome footage, throw 99% of the data away, and cram it into an 8bpp representation on a display with a woeful contrast ratio.
The wonder of the human eye is not that it's more sensitive than silicon, but that it is more selective: we are able to see phenomenal amounts of dynamic range that not only cannot be captured by silicon, but also cannot be displayed properly as a result.
Either way, I guarantee the road did not appear anywhere near
Re: (Score:2)
What about senior citizens? Their eyes don't work so well in the dark either.
Re: (Score:3, Informative)
Re:I remember a lot of people defending Uber (Score:4, Informative)
Furthermore, this is an area with an average of 1.25 miles between marked crosswalks. Are you saying you would have made the half-mile hike to the next crossing?
If you bothered to check, you would have seen that the place where it happened was about 300 feet from a crosswalk. [google.com]
Re: (Score:2)
I think the point was more that Uber's openly stated method of business is to deliberately break laws it finds inconvenient.
Re: I remember a lot of people defending Uber (Score:1)
Re: (Score:2)
"The fact that the driver was watching Hulu while working suggests that she knew she was not being monitored and that her primary role was a warm body in the drivers seat as safety theater."
And I'd bet her hourly wage will confirm that.
If they'd been really serious about her being a safety driver for an autonomous car on public roads, they would have paid her a lot more.
Re: (Score:2)
Companies using a safety driver to test automated driving need to program their system to drop into manual mode at random intervals no more than an hour apart.
If the safety driver has to regularly be alert and take over, they'll pay attention. Otherwise, it's almost impossible to convince a normal human to focus for day after day of sitting there and not having to do anything just in case there is a failure which by then they won't be expecting. If their "normal" is that they expect to have to take over wit
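The random-takeover idea above can be sketched in a few lines (a hypothetical scheduler, not anything Uber actually ships; the gap bounds are assumptions):

```python
import random

def takeover_times(shift_min, min_gap_min=10, max_gap_min=60, seed=None):
    """Sample manual-takeover times so no gap ever exceeds max_gap_min."""
    rng = random.Random(seed)
    times, t = [], 0.0
    while True:
        t += rng.uniform(min_gap_min, max_gap_min)  # next surprise drill
        if t >= shift_min:
            return times
        times.append(t)

# An 8-hour shift gets a handful of unpredictably spaced drills:
drills = takeover_times(8 * 60, seed=42)
```

Because the driver never knows when the next drill lands, "normal" becomes expecting to take over, which is the whole point of the comment above.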
Re: (Score:2)
You could use a running commentary, but then you also need to QA the commentary at random brief intervals to ensure they aren't just talking. That takes additional resources. I still think having the driver actually periodically do what they are supposed to be there for would be ideal, but a running commentary would at least ensure they are paying attention, even if it wouldn't get them used to taking over control on short notice.
Obviously in either case, you'd want to start the driver on a closed track (an
Re:I remember a lot of people defending Uber (Score:4, Informative)
To be fair, if the woman hadn't been jaywalking on a dimly lit street at night in front of oncoming traffic, the accident also wouldn't have happened. There were two people making poor decisions, their paths crossed, and one of them died because of it. It sucks.
The sensors detected her just fine; the software just decided, "Ehhh, fuck it -- I am not stopping."
Source: https://arstechnica.com/tech-p... [arstechnica.com]
Re: (Score:2)
The sensors detected her just fine; the software just decided, "Ehhh, fuck it -- I am not stopping."
That's obviously an extreme software failure -- the crash might have scratched the paint, thus damaging the car.
Those lazy software people should be fired immediately and the responsible managers should hire new ones -- preferably with 20 years of experience in a field that's only been around for 5.
They still should have had 2 people (Score:2)
Re:They still should have had 2 people (Score:5, Informative)
As I recall, the vehicle had code that could stop automatically, but it was disabled. It also had code to warn the driver, but it was also disabled. Whoever decided that having both disabled should not be a fatal error should be fired, because if the driver knows the car can handle or warn, and expects the car to handle or warn, and is wrong, you get situations like this.
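The "fatal error" idea is simple to express: refuse to operate when every mitigation path is off. A minimal sketch with hypothetical names (this is not Uber's actual software):

```python
def validate_safety_config(auto_brake_enabled, driver_alert_enabled):
    """Refuse to let the vehicle move with no emergency mitigation path."""
    if not (auto_brake_enabled or driver_alert_enabled):
        raise RuntimeError("auto-brake AND driver alert both disabled")
    return True
```

Either system alone passes the check; disabling both should halt the vehicle before it ever moves, exactly because the human will otherwise assume the car has it covered.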
Re: (Score:3)
Fired? What about being charged with negligent homicide?
Re: (Score:2)
The NTSB report found that Uber's software "determined that an emergency braking maneuver was needed" 1.3 seconds before the crash. Unfortunately, the vehicle wasn't programmed to actually perform emergency braking procedures---nor was it programmed to alert the safety driver.
You disagree with my opinion that it was negligent homicide, or did you mean to reply to someone else?
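For scale, some back-of-envelope braking physics (the ~40 mph travel speed and 0.8 g deceleration here are assumptions for illustration, not figures from the report) show that even 1.3 seconds of braking would have cut the impact speed dramatically:

```python
G = 9.81            # gravity, m/s^2
MPH_PER_MS = 2.237  # mph per m/s

def impact_speed_mph(initial_mph, brake_time_s, decel_g=0.8):
    """Speed remaining after brake_time_s of steady braking on dry pavement."""
    remaining_ms = initial_mph / MPH_PER_MS - decel_g * G * brake_time_s
    return max(0.0, remaining_ms * MPH_PER_MS)

# 1.3 s of hard braking from ~40 mph leaves roughly 17 mph at impact --
# not a full stop, but a very different collision.
print(round(impact_speed_mph(40, 1.3)))
```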
Re:They still should have had 2 people (Score:5, Insightful)
in the car. If nothing else it decreases the odds. They'd both have to be watching Hulu to mess up.
It is a sad comment on society of epic proportions if companies need to hire two people to police each other from cell phone addiction.
Re: (Score:3)
Re: (Score:2)
Good idea. Next month's Slashdot headline: "Drivers in latest fatal self-driving car crash were having sex at the time..."
Investigators say they fucked up.
...and they are now fucked.
Re: (Score:2)
The investigators announced "The fuckers fucked up while fucking and are now fucked."
Re: (Score:2)
"How's the crow taste?"
An employee watched TV instead of doing her fucking job.
Your Dyslexia is acting up again.
Re: (Score:2)
Can anyone else remember a point to self-driving cars other than being able to do other things while the car drives? I sure can't.
Because it's safer? Well, apparently not in this case.
Umm... why am I paying extra for self-driving cars again?
Re:I remember a lot of people defending Uber (Score:4, Interesting)
In a very foreseeable way. If Uber couldn't figure out that they shouldn't allow unsupervised employees to carry a small entertainment device into a situation where rare but impactful actions/attention were required, I put more blame on Uber.
After all, an employee having an accident is one thing. An employee consistently making choices without consequences for a while, and those choices causing the accident, is a failure of supervision.
Re: I remember a lot of people defending Uber (Score:2, Interesting)
Until someone puts out a law saying companies can force employees to turn in / turn off their cell phones. Then the same people will be crying foul about giving employers that power.
Re: I remember a lot of people defending Uber (Score:5, Informative)
Re: (Score:2)
Re: (Score:3)
It's already legal for companies to do so (turn off/do not bring).
Re: (Score:3)
Until someone puts out a law saying companies can force employees to turn in / turn off their cell phones.
Ah, but as Uber repeatedly states, their drivers are NOT their employees.
Re: I remember a lot of people defending Uber (Score:4, Insightful)
The drivers testing the vehicles are employees. :P
Re: (Score:2)
Re: (Score:2)
If United had a pilot don a blindfold, and had no copilot to take over, I would also blame them. However, "be a daredevil while landing" and "when bored, turn on a TV show" are very different situations. It is foreseeable that bored people will watch TV.
Re: (Score:2)
Well, in about the same way that United Airlines shouldn't allow their pilots to don blindfolds while attempting to land the airplane.
Instrument landings happen all the time; that's when the *airplane* is effectively blindfolded by poor weather. But there's no way the pilots can read the instrumentation when blindfolded themselves; that would be nonsense.
Re: (Score:2)
If United Airlines did have its pilots wear blindfolds, it would be Alitalia.
Re: (Score:3, Insightful)
Correct - testing autopilot includes the part where we see the effects of a distracted driver if something goes wrong with the automation... automation whose very existence practically begs the human driver to ignore the road. Similar to texting while driving - drivers SHOULD pay attention to the road rather than text, but we know many will text anyway. Deciding if autopilot is safe enough for mass production must account for the fact that human drivers won't pay attention to the road, as this accident revealed.
Re: (Score:2)
What now? (Score:2)
Do we ban Uber, Hulu, cars or pedestrians?
Re: (Score:2)
Do we ban Uber, Hulu, cars or pedestrians?
Autonomous cars are OK . . .
. . . we just need to get rid of the loose nuts behind the steering wheels . . .
Re: (Score:3)
Except that this autonomous car ran over a pedestrian because the loose nut behind the wheel was not paying attention.
If there had been no people in the car, this still would have happened. This is why no cars are licensed to drive autonomously. As long as cars require a driver to monitor they are going to be more dangerous, as the "driver" is going to get bored and not pay attention.
Re: (Score:2)
Exactly, we need to perfect the safety features in a way that requires the driver to be driving/attending fully like normal, then we can consider integrating the safety features into a truly driverless, attentionless car. But this nonsense about making it easier for drivers to not pay attention to the road, yet not have the safety features that allow true inattention, is just, well, nonsense!
Re: (Score:2)
"Level 5" should be called "Autonomy" and everything below that "Not Autonomy".
Cars that are above "level 1" but below "level 5" should be
tested only on test tracks by qualified test drivers.
Re: (Score:2)
It has already been documented that the sensors provided data about the obstacle to the control module in plenty of time to stop the car or change lanes. Even if no action were taken by the primary control system, the backup sensor provided 1.5 seconds of warning, which is enough for 18 ABS actuation cycles and probably 10-15 mph of speed scrub-off. The obstacle happened to be a human - who was killed - so the car was not badly damaged or the occupant injured. If it had been a piece of machinery that fell
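The scrub-off figure above checks out on a back-of-envelope basis (the deceleration values here are assumptions, not measured data):

```python
G = 9.81  # gravity, m/s^2

def scrub_off_mph(brake_time_s, decel_g):
    """Speed shed by brake_time_s of braking at decel_g, converted to mph."""
    return decel_g * G * brake_time_s * 2.237

# Even modest partial braking over the 1.5 s of warning sheds real speed:
low  = scrub_off_mph(1.5, 0.30)   # ~9.9 mph
high = scrub_off_mph(1.5, 0.45)   # ~14.8 mph
```

So 10-15 mph of scrub-off corresponds to only about 0.3-0.45 g, well under what a panic stop on dry pavement can deliver.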
Re: (Score:2)
No, that doesn't follow from anything I wrote. Nice try though.
None of the above (Score:5, Insightful)
Re:None of the above (Score:5, Insightful)
It's worth pointing out that this type of response by drivers is predictable. Not necessarily watching TV but zoning out in one way or another. You'll see Tesla trot out the excuse every time as well: "This system requires constant monitoring by the driver, it's not really fully self-driving, and the crash was the driver's fault for not paying attention when they should have."
But: equal--or even more--blame has to go to the designers of the system and testing protocol for not taking this obvious and well known fact about human behavior into consideration when designing their system and their testing protocol.
It's a simple fact of human behavior that once the system looks like it's working OK for a few dozen to a few hundred miles, you assume it's OK and you start to tune out.
In reality, drivers average between 90 million (auto v. auto fatalities) and 480 million (auto v. pedestrian fatalities) miles between fatal collisions. So a system that can manage to go a few dozen or a few hundred miles without anything disastrous happening is still many orders of magnitude less capable than even the worst human driver. But once the automated system has driven a certain route a few times successfully, just about any human "monitor" is going to start to have confidence in the system and tune out.
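The orders-of-magnitude point can be made with a one-line Poisson model (assuming fatal crashes arrive at a constant per-mile rate; the 90-million-mile figure is the one quoted above):

```python
import math

def p_fatality_free(miles, miles_per_fatality=90e6):
    """Chance of covering `miles` with no fatal crash at the human rate."""
    return math.exp(-miles / miles_per_fatality)

# A few hundred clean miles is statistically meaningless...
print(p_fatality_free(500))    # ~0.999994
# ...you need mileage on the order of the rate itself to learn anything:
print(p_fatality_free(90e6))   # ~0.37
```

In other words, a human-grade driver would almost never crash in a short demo either, so a clean demo tells you nothing about whether the system is human-grade.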
There are many ways around this issue, and companies shouldn't be allowed to test self-driving systems out on the public roads without using some or all of them:
* Far more extensive testing can be done using simulators etc. before going live on public roads. They should be testing many billions of miles in this type of environment first. Some companies are putting more emphasis on this now (e.g., Nvidia). All should be required to do this or something similar.
* Far more testing should be done on tracks & other non-public locations before testing proceeds on public roads.
* Systems should not be allowed to be tested on the public roads until they have proven they are actually capable.
* If systems do require human "safety drivers" as a backup then various monitoring systems and protocols must be in place to ensure that the humans are actually doing the work. You can't just hire random people at $12/hour, give them 3 hours of training, and hope. That is guaranteed failure.
* Companies doing this type of testing need to be 100% responsible for anything that goes wrong. The fact that some employee wasn't doing something 100% right is no excuse. The companies need to have enough of a safety culture, safety system, and safety protocol in place that they know whether or not any individual tester is doing what they should or not.
* Most of all, these safety-critical systems must be engineered in an environment of safety-critical engineering. Not the "move fast and break things" bullshit software development culture that is currently so pervasive.
"Move fast and break things" might be a great philosophy for developing a cell phone app, but operating a motor vehicle is a safety critical system operating in an environment with very high risk of serious injury and death. The systems and the testing must be designed to take this seriously from top to bottom.
FWIW Uber's corporate culture is like the polar opposite of this from top to bottom.
Congress is trying to pass a bill to allow nationwide testing of self-driving vehicles that is laughably lacking in any type of oversight. More here:
http://saferoads.org/2018/06/1... [saferoads.org]
https://www.cnbc.com/2018/06/1... [cnbc.com]
Re: (Score:2)
It's worth pointing out that this type of response by drivers is predictable.
No, it's not. Gaze wandering, boredom, not paying full attention - that much is predictable, as it is with any boring job.
On the other hand pretty much every job out there if you're caught sitting down watching TV when you're supposed to be on the clock, expect to be disciplined. There's "not being attentive" and then there's "not being there mentally at all". This is the latter.
Re: (Score:2)
There's "not being attentive" and then there's "not being there mentally at all". This is the latter.
Then why put a fucking television in an AD test car?
Re: (Score:1)
The latter is mind-numbingly boring
... which is why they call it a JOB and not a FUN.
Countless millions of us suffer daily with jobs we would rather not be doing. However we do not feel so entitled to ignore what we should be doing when public safety is at stake.
Re: (Score:2)
Unlike the previous generation who feels entitled to someone else's music, movie or tv show without paying for it, right?
and 95% of the time they aren't the older 50+ drivers, they are the 20-somethings and 30-somethings.
Don't know where you live, but around me it's closer to 40-60. Yes, the majority are the "younger" generation, but there are plenty of midlife folks yakking on their phones w
Re: (Score:2)
Think of the Children (Score:1)
Hopefully this gets "the voice" taken off the air
Hulu? (Score:2)
Re: (Score:2)
Stupid thing is, the "commercial free" Hulu service is about as much as Netflix...... and still shows commercials!
Re: (Score:2)
So what you are saying is that commercial-free Hulu does not show commercials on any shows except for the shows where it does show them?
Re: (Score:2)
There are a few apparently due to streaming agreements where they must show a commercial before the show. There indeed are some, but they are the vast, vast minority.
Re: (Score:2)
Not everything is on Netflix!
Shocked (Score:2)
I'm absolutely shocked that an employee whose job was "be vigilant for hours and react in seconds" let her mind wander and decided she could probably watch a whole episode of The Voice without any negative consequences. I mean, there are people who watch TV while they are actively driving.
Re: (Score:2)
let her mind wander
I'm sorry, but your sarcasm is completely lost in this case. There's a very big frigging difference between having your mind wander and kicking back to watch a frigging TV show.
Re: (Score:2)
There's a very big frigging difference between having your mind wander and kicking back to watch a frigging TV show.
Get back to your cheeseburger picnic, Randy!
Re: (Score:3)
If that was a pop culture reference I don't get it. If it was an insult I don't get it either.
Re: (Score:2)
There's a big difference between those two actions. There's a continuum between "space out for 1 second", "space out for 30 seconds" and "fuck it, I'll watch a TV show."
Comment removed (Score:4, Insightful)
Re: (Score:3)
Isn't "The Voice" a singing competition? It's not impossible to envisage someone streaming that with no intention of watching the video.
Why was the driver looking down then?
Does this matter in Uber's case? (Score:4, Insightful)
In this case she might have saved a life by doing her job and paying attention, but the end goal assumes nobody is sitting behind that wheel. This is still a major failure of Uber's software.
Re: (Score:2)
Re: (Score:2)
To me, this is no different than highly automated systems such as trains that still require humans to monitor them.
Yes, highly trained and evaluated humans, not some minimum-wage kid who has no understanding of the system.
Re: (Score:2)
There are several problems here. First they disabled the braking for an experimental system without at least some audible warning system. Second, they had a human in the car to monitor the system who would have likely caught the problem had she been paying attention, so their policies, procedures, and quality control are lacking. Finally, the auto braking was disabled because of too many false positives. This tells me that the system wasn't nearly ready for road testing, certainly without more oversight
Re: (Score:2)
My understanding is that the car had a warning system... which was also disabled. And "driver" isn't quite the right word... let's go with "designated monitor." I have to wonder whether the designated monitor had been informed that the warning system and the braking system were both disabled.
Everyone is at fault here (Score:5, Interesting)
The person should have been doing her job. At the same time, Uber hires people telling them it's a self-driving vehicle, removes the second driver per car to reduce costs, and then tests with safety features disabled because, "Hey, it's okay. We have a human in case something goes wrong."
Fuck everything about this. Uber is equally at fault here. Sure she could have prevented the accident if she had been doing her god damn job. Uber could have prevented the accident if they didn't recklessly disable their own lidar and auto-brake algorithms to test their (failed) computer vision system AT NIGHT!
This girl made a mistake, one that will haunt her for the rest of her life. A girl on a bicycle is dead. There is plenty of blame to go around. But at a minimum, given Uber's track record, they should not be allowed to put these pieces of shit on the road.
Tesla has had a car crash into a truck and another into a barrier with their lane assist (they should be forced to rename it from "Autopilot"; it's not a fucking autopilot). These systems give people a false sense of security and make people less aware, less active drivers. We are a good 15 years minimum from true autonomous vehicles, and it's a fucking hard problem space.
Even with how expensive it is to expand rail, we could probably expand rail at a fraction of the price of self driving tech. Singapore and London already have self driving trains. Let's make transportation better for everyone in America first and catch up to the rest of the world before we work on complicated stuff that's only good for its cool factor:
https://penguindreams.org/blog/self-driving-cars-will-not-solve-the-transportation-problem/
Re: (Score:2)
"Road Follower" would probably be accurate, but maybe not as marketable.
The problem with rail (Score:2)
Re: (Score:2)
Except in a very few densely packed urban areas buses and light rail are just not useful paradigms.
For example where I work people come from as close as 1 mile away to 40 or 50 miles away. They live in every direction. Some in very rural areas. Some in suburban level density.
What kind of light rail or bus system is going to be useful for them?
Even in urban areas if you take public transportation you are trading time for using it. When I worked in Chicago it took me 90 minutes to get to work by public transp
Re: (Score:2)
Largely agree. It’s an EXPERIMENT and so you cannot know what might go wrong. Their method of “testing” suggests they believe they have tons of data showing the thing is already very very safe. Which they cannot know.
If this was a drug trial, it would be like giving the first experimental injection to 100 people. Nobody does that. You give it to maybe TWO people and wait for unexpected reactions.
They’re apparently very cavalier about their tests. Nothing to do with the person in the
Re: (Score:2)
America used to have more passenger rail in the 1940s than Europe has NOW. In less than 100 years, roads expanded and pulled our cities apart into unwalkable spaces.
It can contract back in less time. Seattle just pumped $50 billion into ST3 and it's going to make a massive impact by 2023. Their rail construction is always ~6 months ahead of schedule too. Florida has high-speed rail and will hit Orlando before 2030. California is struggling, but they'd better not let that program die.
Felon (Score:2, Funny)
I "Watch" youtube on my phone while driving (Score:2)
The question is whether she also fiddled with the display in the car (as she was instructed to by Uber so that they didn't have to pay for a second driver/passenger to keep track of interesting driving events for the engineers to review). That'll come up in a court case.
But here's a much, much better question, why
Let's blame everyone but the jay-walker. (Score:2)
How about we blame the woman who jaywalked out into the middle of a dimly lit street at 10 PM? Noooo, let's not blame that stupid behavior; we should focus only on the driver and the car. If she had walked, or ridden, the extra bit to get to a crosswalk she'd likely be alive.
If this hadn't been an Uber car it never would have made headlines. People are distracted by all sorts of things while driving, and no system is going to be able to prevent all accidents, especially when people dart out into the midd
Re: (Score:2)
How about we blame the woman who jay-walked out into the middle of a dimly-lit street at 10 PM?
Because it wasn't dimly lit.
That's an incorrect impression given by the malfunctioning dash cam.
Every person who has photographed that stretch of road late at night (with an ordinary cellphone camera)
showed a very brightly lit and open piece of highway.
Uber is entirely responsible for properly vetting and training its test drivers, and it didn't.
Re: (Score:2)
Doesn't matter. Accidents happen all the time. People in regular cars are distracted all the time. The fact is that she walked out into the middle of the street and got hit. Don't want to get hit? Try using a crosswalk.
But yeah, blame Uber. Because all non-Uber drivers and vehicles never hit jaywalkers. Ever.
Re: (Score:2)
Dimly lit plays no part in this. The woman with the bicycle tested positive for a cornucopia of drugs. She was jaywalking, almost certainly saw the car, and probably expected the driver would stop for her, because people generally don't run you over just because you're a douche bag who steps out right in front of their car in the middle of the block.
She had no idea that the person who was being paid to monitor the car had decided she'd rather be watching Hulu.
There are no innocent victims here.
Re: (Score:3)
Re:Is it inconceivable that they were just listeni (Score:5, Informative)
Re: (Score:2)
In this role Uber is paying minimum wage and the qualifications are a pulse and a driver's license - this doesn't exactly attract people who wouldn't fit in as extras in Idiocracy.
My thoughts exactly.
You're having someone drive a prototype vehicle where you know focus is going to be an issue. You probably don't need someone with a 4-year degree, but you need to make sure they're responsible and have good work habits.
Instead Uber found the cheapest body they could throw in the driver's seat.
Sure she was negligent for watching Hulu instead of controlling the car, but so was Uber for hiring a test-driver who couldn't be reasonably expected to pay attention while the car was driving.
Re: (Score:2)
What's even more damning to Uber is that they previously found out that the car detected the pedestrian something like 6 seconds before the crash, but wasn't configured to autobrake or potentially to even give a warning that it sensed that condition.
If you don't even have autobraking worked out, why would you be testing an autonomous car anywhere but a private track? You'd need to be an immoral company who thinks that laws are an inconvenience to make that sort of a decision.
Re: (Score:2)
You obviously have information the rest of us lack. All I've seen is the very dodgy camera feed from the car which seems at odds with what other cameras could see in the same place at the same time of night, and a LIDAR scan that picked the victim up 6 seconds before the collision. So where's your source?