Transportation | The Internet | Entertainment

Uber Driver Was Streaming Hulu Just Before Fatal Self-Driving Car Crash, Says Police (arstechnica.com) 184

An anonymous reader quotes a report from Ars Technica: Tempe, Arizona, police have released a massive report on the fatal Uber vehicle crash that killed pedestrian Elaine Herzberg in March. The report provides more evidence that driver Rafaela Vasquez was distracted in the seconds before the crash. "This crash would not have occurred if Vasquez would have been monitoring the vehicle and roadway conditions and was not distracted," the report concludes. Police obtained records from Hulu suggesting that Vasquez was watching "The Voice," a singing talent competition that airs on NBC, just before the crash. Hulu's records showed she began watching the program at 9:16pm. Streaming of the show ended at 9:59pm, which "coincides with the approximate time of the collision," according to the police report.
This discussion has been archived. No new comments can be posted.

    • How's the crow taste?

      Errr, your post makes no sense whatsoever. The people who were defending Uber's self-driving car were blaming human error from the beginning...

        • To be fair, if the woman hadn't been jaywalking on a dimly lit street at night in front of oncoming traffic, the accident also wouldn't have happened. There were two people making poor decisions, their paths crossed, and one of them died because of it. It sucks.

        • by kiviQr ( 3443687 ) on Friday June 22, 2018 @02:59PM (#56830322)
          Note the "dimly lit street": what you have seen is footage from a camera that does not perform well in low-light conditions. The human eye works way better - as long as you focus it on the road...
          • The human eye works way better

            It shouldn't. Silicon devices should be much more sensitive than human eyes. Someone cheaped out?

            • by sexconker ( 1179573 ) on Friday June 22, 2018 @03:32PM (#56830578)

              You have no clue how good the human eye is and how poor a digital replica is, do you?

              • You have no clue how good the human eye is and how poor a digital replica is, do you?

                The GP is narrow-minded. CCDs are definitely far more sensitive than the human eye, but they suffer greatly in the way the resulting image is processed. All the sensitivity in the world doesn't help if you clip the highlights, compress the result to 8-bit, and display it on a shitty monitor with a 200:1 contrast ratio.
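
                To make that concrete, here's a minimal Python/NumPy sketch of the quantization point. The radiance values are invented for illustration; nothing here comes from any actual camera pipeline.

                import numpy as np

                # Simulated linear scene radiance, from deep shadow up to a
                # streetlight, spanning roughly 17 stops. Values are invented.
                scene = np.array([0.001, 0.01, 0.1, 1.0, 10.0, 100.0])

                # Naive pipeline: scale so the brightest value hits full scale,
                # then quantize to 8 bits, as a cheap camera might.
                codes = np.round(scene / scene.max() * 255).astype(np.uint8)
                print(codes)  # the three shadow values all collapse to code 0:
                              # the "pedestrian in shadow" detail is simply gone.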

                • But there's no technical need to do any of those in a night time camera. Those are artifacts of cheap designs, not limitations on the light capturing technology.
                  • No, they are limitations of the display. Your eyes adjust every point dynamically in realtime. You can do that in software too, and the result looks like shit. There's a reason that when you take a photo into the sunset everything around you looks black: the alternative looks like garbage.

                    Also, you want to capture realtime video in HDR with almost no compression? Good luck with your technology. Your $200 dashcam suddenly isn't.

                    • No they are limitations of display.

                      But there's no "display" in a self-driving car's "brain", only a floating-point framebuffer, which doesn't have these limitations.

                      Also you want to capture realtime video in HDR with almost no compression?

                      You're *not* trying to store it, so it's irrelevant what a dashcam can or cannot do. (Your brain is not trying to store it either, after all.)

                    • Oh, you're talking about the car navigation system, not the feed. Right. Well, that makes everything you said irrelevant, since the car navigates using LIDAR and doesn't care how light or dark it is.

                      You're *not* trying to store it, so it's irrelevant what a dashcam can or cannot do. (Your brain is not trying to store it either, after all.)

                      Side note: Your brain definitely stores it. Your vision is actually quite horrible. What you see is made up of an assessment of a lot of "stored frames", each individually quite horrible. But our brain is great at building a visual world out of a continuous crappy feed.

                    • Did the Uber car use LIDAR? It had the module installed, and it may even have been on, but the software didn't give a fucking shit either way.
                      Neither did the "driver". Neither did Uber.

            • It shouldn't. Silicon devices should be much more sensitive than human eyes. Someone cheaped out?

              Yes they are, and the result is that we take this awesome footage, throw 99% of the data away, and cram it into an 8bpp representation on a display with a woeful contrast ratio.

              The wonder of the human eye is not that it's more sensitive than silicon, but rather that it is more selective, and as such we are able to see phenomenal amounts of dynamic range that can not only not be captured by silicon, but also not displayed properly as a result.

              Either way, I guarantee the road did not appear anywhere near

          • What about senior citizens? Their eyes don't work so well in the dark either.

        • Re: (Score:3, Informative)

          by Anonymous Coward
          Except it wasn't a dimly lit area; Uber's own diagnostics attested to this. She was spotted with plenty of time to come to a full stop if need be. Furthermore, this is an area with an average of 1.25 miles between marked crosswalks. Are you saying you would have made the half-mile hike to the next crossing?
        • The safety driver watching Hulu is more than just a poor decision on the driver's part. It represents a complete failure of the safety culture at Uber. What were the hiring requirements and training for safety drivers? There was in-cab recording of the safety driver. Was it streamed to a monitoring system to ensure the drivers were doing their job? Was it reviewed by safety supervisors? Was there any ongoing analysis done to determine how effective the safety drivers were? Did the safety drivers have regular
          • "The fact that the driver was watching Hulu while working suggests that she knew she was not being monitored and that her primary role was a warm body in the drivers seat as safety theater."

            And I'd bet her hourly wage will confirm that.

            If they'd been really serious about her being a safety driver for an autonomous car on public roads, then they would have paid her a lot more.

            • Companies using a safety driver to test automated driving need to program their system to drop into manual mode at random intervals no more than an hour apart.

              If the safety driver has to regularly be alert and take over, they'll pay attention. Otherwise, it's almost impossible to convince a normal human to focus day after day, sitting there and not having to do anything, just in case there is a failure which by then they won't be expecting. If their "normal" is that they expect to have to take over within the hour, they'll stay alert.
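
              For the curious, a minimal Python sketch of that scheduling idea; the interval bounds and function name are invented for illustration, not any company's actual system.

              import random

              MIN_GAP_S = 300    # at least a few minutes between drills (assumed)
              MAX_GAP_S = 3600   # never more than an hour apart, per the proposal

              def drill_times(shift_length_s: float) -> list:
                  """Schedule surprise manual-takeover drills at random intervals."""
                  t, drills = 0.0, []
                  while True:
                      t += random.uniform(MIN_GAP_S, MAX_GAP_S)
                      if t >= shift_length_s:
                          return drills
                      drills.append(t)

              # Example: drill times (in seconds) across an 8-hour shift.
              print(drill_times(8 * 3600))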

        • by r_naked ( 150044 ) on Friday June 22, 2018 @03:30PM (#56830564) Homepage

          To be fair, if the woman hadn't been jaywalking on a dimly lit street at night in front of oncoming traffic, the accident also wouldn't have happened. There were two people making poor decisions, their paths crossed, and one of them died because of it. It sucks.

          The sensors detected her just fine; the software just decided: "Ehhh, fuck it -- I am not stopping."

          Source: https://arstechnica.com/tech-p... [arstechnica.com]

            • The sensors detected her just fine; the software just decided: "Ehhh, fuck it -- I am not stopping."

              That's obviously an extreme software failure -- the crash might have scratched the paint, thus damaging the car.

              Those lazy software people should be fired immediately and the responsible managers should hire new ones -- preferably with 20 years of experience in a field that's only been around for 5.

    • in the car. If nothing else, it decreases the odds. They'd both have to be watching Hulu to mess up. Safety is about reducing risk, not eliminating it. Also, Uber still disabled a ton of safety features they shouldn't have so they could get better data.
      • by suutar ( 1860506 ) on Friday June 22, 2018 @03:18PM (#56830454)

        As I recall, the vehicle had code that could stop automatically, but it was disabled. It also had code to warn the driver, but it was also disabled. Whoever decided that having both disabled should not be a fatal error should be fired, because if the driver knows the car can handle or warn, and expects the car to handle or warn, and is wrong, you get situations like this.
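
        That invariant is easy to state in code. Here's a sketch with hypothetical config flags; this is not Uber's actual software.

        from dataclasses import dataclass

        @dataclass
        class SafetyConfig:
            auto_braking_enabled: bool   # hypothetical flag
            driver_alerts_enabled: bool  # hypothetical flag

        def validate(cfg: SafetyConfig) -> None:
            """Refuse to start a run if every mitigation path is disabled."""
            if not (cfg.auto_braking_enabled or cfg.driver_alerts_enabled):
                raise RuntimeError("Fatal config: automatic braking AND driver "
                                   "alerts are both disabled; no mitigation path.")

        # The configuration described above should have died right here:
        try:
            validate(SafetyConfig(auto_braking_enabled=False,
                                  driver_alerts_enabled=False))
        except RuntimeError as e:
            print(e)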

        • by ChoGGi ( 522069 )

          Fired? What about being charged with negligent homicide?

      • by magarity ( 164372 ) on Friday June 22, 2018 @04:15PM (#56830858)

        in the car. If nothing else, it decreases the odds. They'd both have to be watching Hulu to mess up.

        It is a sad comment on society of epic proportions if companies need to hire two people to police each other from cell phone addiction.

      • Good idea. Next month's Slashdot headline: "Drivers in latest fatal self-driving car crash were having sex at the time..."
    • "How's the crow taste?"

      An employee watched TV instead of doing her fucking job.

      Your dyslexia is acting up again.

    • by dohzer ( 867770 )

      Can anyone else remember a point to self-driving cars other than being able to do other things while the car drives? I sure can't.
      Because it's safer? Well, apparently not in this case.
      Umm... Why am I paying extra for self-driving cars again?

  • Do we ban Uber, Hulu, cars or pedestrians?

    • Do we ban Uber, Hulu, cars or pedestrians?

      Autonomous cars are OK . . .

      . . . we just need to get rid of the loose nuts behind the steering wheels . . .

      • Except that this autonomous car ran over a pedestrian because the loose nut behind the wheel was not paying attention.

        If there had been no people in the car, this still would have happened. This is why no cars are licensed to drive autonomously. As long as cars require a driver to monitor they are going to be more dangerous, as the "driver" is going to get bored and not pay attention.

        • Exactly. We need to perfect the safety features in a way that requires the driver to be driving and attending fully like normal; then we can consider integrating the safety features into a truly driverless, attentionless car. But this nonsense about making it easier for drivers to not pay attention to the road, yet not having the safety features that allow true inattention, is just, well, nonsense!

          • This whole idea of having ordinary drivers alpha-test various "levels" of autonomy is total crap.

            "Level 5" should be called "Autonomy" and everything below that "Not Autonomy".
            Cars that are above "level 1" but below "level 5" should be tested only on test tracks by qualified test drivers.
        • by sphealey ( 2855 )

          It has already been documented that the sensors provided data about the obstacle to the control module in plenty of time to stop the car or change lanes. Even if no action were taken by the primary control system, the backup sensor provided 1.5 seconds of warning, which is enough for 18 ABS actuation cycles and probably 10-15 mph of speed scrub-off. The obstacle happened to be a human - who was killed - so the car was not badly damaged or the occupant injured. If it had been a piece of machinery that fell
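
          A back-of-the-envelope check of those figures; the deceleration values below are assumptions, not numbers from the police report.

          G = 9.81            # gravity, m/s^2
          MPS_TO_MPH = 2.237  # m/s to mph

          warning_s = 1.5     # seconds of warning claimed above

          # Assumed effective decelerations, from cautious to near-maximum braking.
          for braking_g in (0.30, 0.45, 0.70):
              delta_v_mph = braking_g * G * warning_s * MPS_TO_MPH
              print(f"{braking_g:.2f} g over {warning_s} s scrubs ~{delta_v_mph:.0f} mph")

          # ~10 mph at 0.30 g and ~15 mph at 0.45 g, consistent with the 10-15 mph
          # figure; 18 ABS cycles in 1.5 s is a 12 Hz actuation rate, plausible
          # for ABS hardware.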

    • None of the above (Score:5, Insightful)

      by PraiseBob ( 1923958 ) on Friday June 22, 2018 @02:48PM (#56830236)
      How about we arrest the driver for watching TV while they were supposed to be operating a multi-ton piece of machinery?
      • by flug ( 589009 ) on Friday June 22, 2018 @03:46PM (#56830692)

        It's worth pointing out that this type of response by drivers is predictable. Not necessarily watching TV but zoning out in one way or another. You'll see Tesla trot out the excuse every time as well: "This system requires constant monitoring by the driver, it's not really fully self-driving, and the crash was the driver's fault for not paying attention when they should have."

        But: equal--or even more--blame has to go to the designers of the system and testing protocol for not taking this obvious and well known fact about human behavior into consideration when designing their system and their testing protocol.

        It's a simple fact of human behavior that once the system looks like it's working OK for a few dozen to a few hundred miles, you assume it's OK and you start to tune out.

        In reality, drivers average between 90 million (auto v. auto fatalities) and 480 million (auto v. pedestrian fatalities) miles between fatal collisions. So a system that can manage to go a few dozen or a few hundred miles without anything disastrous happening is still many orders of magnitude less capable than even the worst human driver. But once the automated system has driven a certain route a few times successfully, just about any human "monitor" is going to start to have confidence in the system and tune out.
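
        Rough arithmetic on that gap, using the figures quoted above:

        # Miles between fatal collisions for human drivers (figure quoted above).
        human_miles_per_fatality = 90e6

        # Generously credit the automated system with a clean few hundred miles.
        observed_clean_miles = 300

        ratio = human_miles_per_fatality / observed_clean_miles
        print(f"{ratio:.0e}")  # -> 3e+05: five-plus orders of magnitude short
                               # of even the human fatality baseline.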

        There are many ways around this issue, and companies shouldn't be allowed to test self-driving systems out on the public roads without using some or all of them:

          * Far more extensive testing can be done using simulators etc. before going live on public roads. They should be testing many billions of miles in this type of environment first. Some companies are putting more emphasis on this now (e.g., Nvidia). All should be required to do this or something similar.

          * Far more testing should be done on tracks & other non-public locations before testing proceeds on public roads.

          * Systems should not be allowed to be tested on the public roads until they have proven they are actually capable.

          * If systems do require human "safety drivers" as a backup then various monitoring systems and protocols must be in place to ensure that the humans are actually doing the work. You can't just hire random people at $12/hour, give them 3 hours of training, and hope. That is guaranteed failure.

          * Companies doing this type of testing need to be 100% responsible for anything that goes wrong. The fact that some employee wasn't doing something 100% right is no excuse. The companies need to have enough of a safety culture, safety system, and safety protocol in place that they know whether or not any individual tester is doing what they should.

          * Most of all, these safety-critical systems must be engineered in an environment of safety-critical engineering. Not the "move fast and break things" bullshit software development culture that is currently so pervasive.

        "Move fast and break things" might be a great philosophy for developing a cell phone app, but operating a motor vehicle is a safety critical system operating in an environment with very high risk of serious injury and death. The systems and the testing must be designed to take this seriously from top to bottom.

        FWIW Uber's corporate culture is like the polar opposite of this from top to bottom.

        Congress is trying to pass a bill to allow nationwide testing of self-driving vehicles that is laughably lacking in any type of oversight. More here:

        http://saferoads.org/2018/06/1... [saferoads.org]

        https://www.cnbc.com/2018/06/1... [cnbc.com]

        • It's worth pointing out that this type of response by drivers is predictable.

          No it's not. Gaze wandering. Boredom. Not paying full attention. All of that is predictable, as it is when doing any boring job.

          On the other hand, in pretty much every job out there, if you're caught sitting down watching TV when you're supposed to be on the clock, expect to be disciplined. There's "not being attentive" and then there's "not being there mentally at all". This is the latter.

          • There's "not being attentive" and then there's "not being there mentally at all". This is the latter.

            Then why put a fucking television in an AD test car?

    • by kiviQr ( 3443687 )
      Clearly it was "The Voice"'s fault!
  • by Anonymous Coward

    Hopefully this gets "The Voice" taken off the air

  • Couldn't she afford Netflix?
    • Stupid thing is, the "commercial free" Hulu service costs about as much as Netflix... and still shows commercials!

    • by antdude ( 79039 )

      Not everything is on Netflix!

  • I'm absolutely shocked that an employee whose job is "be vigilant for hours and react in seconds" let their mind wander and decided they could probably watch a whole episode of The Voice without any negative consequences. I mean, there are people who watch TV while they are actively driving.

      let their mind wander

      I'm sorry, but your sarcasm is completely lost in this case. There's a very big frigging difference between having your mind wander and kicking back and watching a frigging TV show.

      • There's a very big frigging difference between having your mind wander and kicking back and watching a frigging TV show.

        Get back to your cheeseburger picnic, Randy!

        • If that was a pop culture reference I don't get it. If it was an insult I don't get it either.

        • There's a big difference between those two actions. There's a continuum between "space out for 1 second", "space out for 30 seconds" and "fuck it, I'll watch a TV show."

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Friday June 22, 2018 @02:44PM (#56830200)
    Comment removed based on user account deletion
    • Isn't "The Voice" a singing competition? It's not impossible to envisage someone streaming that with no intention of watching the video.

      Why was the driver looking down then?

  • by qzzpjs ( 1224510 ) on Friday June 22, 2018 @02:46PM (#56830224)

    In this case she may have saved a life by doing her job and paying attention, but the end goal assumes that nobody is sitting behind that wheel. This is still a major fail for Uber's software.

    • by Nidi62 ( 1525137 )
      To me, this is no different than highly automated systems such as trains that still require humans to monitor them. You still hear about trains going too fast and derailing, and it turns out the person monitoring the controls was asleep, or texting, or whatever. Very often, if they survive, those operators get charged at the very least with negligence and manslaughter. I would expect and assume that will be the case with this too. It's not a fail on Uber's part because the technology is not ready for indepen
      • To me, this is no different than highly automated systems such as trains that still require humans to monitor them.

        Yes, highly trained and evaluated humans, not some minimum-wage kid who has no understanding of the system.

    • There are several problems here. First, they disabled the braking for an experimental system without at least some audible warning system. Second, they had a human in the car to monitor the system who would likely have caught the problem had she been paying attention, so their policies, procedures, and quality control are lacking. Finally, the auto braking was disabled because of too many false positives. This tells me that the system wasn't nearly ready for road testing, certainly without more oversight

      • by suutar ( 1860506 )

        My understanding is that the car had a warning system... which was also disabled. The... "driver" isn't quite the right word... let's go with "designated monitor". I have to wonder if the designated monitor had been informed that the warning system and braking system were both disabled.

  • by SumDog ( 466607 ) on Friday June 22, 2018 @02:49PM (#56830256) Homepage Journal

    The person should have been doing her job. At the same time, Uber hires people telling them it's a self-driving vehicle, removes the 2-drivers-per-car setup to reduce costs, and then tests disabling safety features because, "Hey, it's okay. We have a human in case something goes wrong."

    Fuck everything about this. Uber is equally at fault here. Sure, she could have prevented the accident if she had been doing her goddamn job. Uber could have prevented the accident if they hadn't recklessly disabled their own lidar and auto-brake algorithms to test their (failed) computer vision system AT NIGHT!

    The driver made a mistake, one that will haunt her for the rest of her life. A woman with a bicycle is dead. There is plenty of blame to go around. But at a minimum, given Uber's track record, they should not be allowed to put these pieces of shit on the road.

    Tesla has had a car crash into a truck and another into a barrier with their lane assist (they should be forced to rename it from "Autopilot"; it's not fucking autopilot). These systems give people a false sense of security and make people less aware, less active drivers. We are a good 15 years minimum from true autonomous vehicles, and it's a fucking hard problem space.

    Even with how expensive it is to expand rail, we could probably expand rail at a fraction of the price of self-driving tech. Singapore and London already have self-driving trains. Let's make transportation better for everyone in America first and catch up to the rest of the world before we work on complicated stuff that's only good for its cool factor:

    https://penguindreams.org/blog/self-driving-cars-will-not-solve-the-transportation-problem/

    • by c ( 8461 )

      they should be forced to rename it from "Autopilot"; it's not fucking autopilot

      "Road Follower" would probably be accurate, but maybe not as marketable.

    • is Americans really, really (and I mean really) hate paying for anything that benefits somebody else. You have it crammed into your skull from day 1 that if you're doing that then you're a sucker. A fool. A "cuck". Whatever. It's a narrative pushed by our ruling class so they can avoid paying for the commons and it's worked for hundreds of years...
    • by Bongo ( 13261 )

      Largely agree. It’s an EXPERIMENT and so you cannot know what might go wrong. Their method of “testing” suggests they believe they have tons of data showing the thing is already very very safe. Which they cannot know.

      If this was a drug trial, it would be like giving the first experimental injection to 100 people. Nobody does that. You give it to maybe TWO people and wait for unexpected reactions.

      They’re apparently very cavalier about their tests. Nothing to do with the person in the

  • Felon (Score:2, Funny)

    This person was also a convicted felon (armed robbery). What a country!
  • all the time. I start it up when I get in the car and it autoplays while I drive. It's entirely possible that's what she was doing. In that case it's no different than running the radio.

    The question is, did she also fiddle with the display in the car (like she was instructed to do by Uber so that they didn't have to pay for a second driver/passenger to keep track of interesting driving events for the engineers to review)? That'll come up in a court case.

    But here's a much, much better question, why
  • How about we blame the woman who jaywalked out into the middle of a dimly lit street at 10 PM? Noooo, let's not blame that stupid behavior; we should focus only on the driver and the car. If she had walked, or ridden, the extra bit to get to a crosswalk she'd likely be alive.

    If this hadn't been an Uber car it never would have made headlines. People are distracted by all sorts of things while driving, and no system is going to be able to prevent all accidents, especially when people dart out into the midd

    • How about we blame the woman who jay-walked out into the middle of a dimly-lit street at 10 PM?

      Because it wasn't dimly lit.

      That's an incorrect impression given by the malfunctioning dash cam. Everyone who has photographed that stretch of road late at night (with an ordinary cellphone camera) has captured a very brightly lit and open piece of highway.

      Uber is entirely responsible for properly vetting and training their test drivers, and they didn't.

      • Doesn't matter. Accidents happen all the time. People in regular cars are distracted all the time. The fact is that she walked out into the middle of the street and got hit. Don't want to get hit? Try using a crosswalk.

        But yeah, blame Uber. Because all non-Uber drivers and vehicles never hit jaywalkers. Ever.

      • Dimly lit has no part in this. The woman with the bicycle tested positive for a cornucopia of drugs. She was jaywalking, almost certainly saw the car, and probably expected the driver would stop for her, because people generally don't run you over just because you're a douche bag who steps out right in front of their car in the middle of the block.

        She had no idea that the person who was being paid to monitor the car had decided she'd rather be watching Hulu.

        There are no innocent victims here.
