Transportation Technology

Safety Driver in Fatal Arizona Uber Self-Driving Car Crash Charged With Homicide (reuters.com) 137

The back-up safety driver behind the wheel of a self-driving Uber test vehicle that struck and killed a woman in Tempe, Arizona, in 2018 was charged with negligent homicide, prosecutors said. From a report: Rafael Vasquez, age 46, who is also known as Rafaela, pleaded not guilty on Tuesday after being charged on Aug. 27 in the death of Elaine Herzberg, court records show. She was released pending a trial set for February 2021. Herzberg died after she was struck while walking a bicycle across a street at night. The crash, the first recorded pedestrian death involving a self-driving vehicle, prompted significant safety concerns about the nascent autonomous vehicle industry. A Tempe police report said Vasquez was repeatedly looking down instead of keeping her eyes on the road. Prosecutors said in March 2019 that Uber was not criminally liable in the crash.
  • by magzteel ( 5013587 ) on Wednesday September 16, 2020 @06:10PM (#60512698)

    Uber made a series of development decisions that contributed to the crash's cause, the NTSB said. The software in the modified Volvo XC90 did not properly identify Herzberg as a pedestrian and did not address "operators' automation complacency." Uber deactivated the automatic emergency braking systems in the Volvo XC90 vehicle and precluded the use of immediate emergency braking, relying instead on the back-up driver.

    Sounds to me like Uber is just as responsible for this accident

    • Re: (Score:3, Insightful)

      by magarity ( 164372 )

      Umm, as bad as Uber is in general, in this case they went to the trouble and expense of hiring a human who was supposed to be paying attention ready to hit the brakes in no small part because all that other stuff was disabled. That their employee was playing with her cell phone instead of working isn't really the company's fault.

      • by magzteel ( 5013587 ) on Wednesday September 16, 2020 @06:21PM (#60512738)

        Umm, as bad as Uber is in general, in this case they went to the trouble and expense of hiring a human who was supposed to be paying attention ready to hit the brakes in no small part because all that other stuff was disabled. That their employee was playing with her cell phone instead of working isn't really the company's fault.

        True, but they disabled the safety features in their product which could have prevented the accident.

        I'd also be interested to know why they hired this driver, what her qualifications were to be a safety test driver, what training she was provided, why the car did not monitor the driver's attentiveness and disable itself, and why Uber now tests with two employees in the front seat and monitors safety employees more strictly.


        • by zenlessyank ( 748553 ) on Wednesday September 16, 2020 @06:33PM (#60512782)

          Also, why was testing done on the general public? I know they have test facilities with whole towns etc. for this kind of testing, which would put the hook back on Uber.

          Uber is also on the hook for letting the safety driver have a phone to begin with during working hours.

          • Also, why was testing done on the general public?

            At some point, testing on public roads is reasonable.

            The fact that they had automatic braking turned off indicates that Uber was not at that point.

            • Maybe it is a feature to run over people who are in the street. Make America Smart Again or some shit like that. Sometimes people have to take responsibility for their own actions.

              • Re: (Score:2, Insightful)

                by saloomy ( 2817221 )
                If a UPS driver runs someone over, UPS is not at fault. They hired her because she was licensed to operate a motor vehicle. The fact that the motor vehicle didn't monitor her, or the pedestrian, doesn't matter because one could reasonably expect that most motor vehicles do not monitor the driver anyway.

                Uber is not liable, just because it has deep pockets. She is because she failed to do her job properly. She had care, custody, and control of the vehicle in motion. Simple as that.
                • Nothing is simple.

                • one could reasonably expect that most motor vehicles do not monitor the driver anyway

                  Ahh, I must have misunderstood. I didn't realise Uber was testing a car that was just like most motor vehicles on the road - I was under the impression they were testing a vehicle that was capable of autonomous driving!

                  Silly me. Thank you for clearing that up.

                  • They were testing some functionality. If it was reliable and complete, they wouldn't be hiring a "safety driver".

                    Don't feel silly. Think about it: if you were a manager at Uber and you were told to make sure it's safe. It operates like a car, so hire a driver with a good record!
                    • by cusco ( 717999 )

                      They only have a safety driver because the state won't allow them to test without one.

                • by cusco ( 717999 )

                  Have you seen the accident video? I'm a decent driver without an accident for over 20 years and I seriously doubt I could have missed that extremely stupid woman.

                  • by uncqual ( 836337 ) on Wednesday September 16, 2020 @11:25PM (#60513818)

                    The accident video from the car apparently didn't give a very good view of what the driver would see because of the lighting conditions and the camera's sensitivity (or lack thereof). The next day, a driver drove through the area at night and posted a video on YouTube that showed much better lighting than what the clip from the self-driving car suggested.

                    • by cusco ( 717999 ) <brian.bixby@gmail.com> on Thursday September 17, 2020 @12:21AM (#60513960)

                      showed much better lighting

                      No, it showed a video which used a different type of camera which functioned better in low light than whatever camera Uber had used. I've never driven that particular street at night so can't speak as to what actual lighting conditions are, but I know that if I drive it with five different cameras we'd have five different videos which give five different impressions of lighting conditions.

                      A coworker who has driven that stretch of highway said that the pedestrian was an absolute idiot to attempt to cross there at any time of day, much less at night.

                • by uncqual ( 836337 )

                  Although, UPS would end up on the hook for financial liability (but not criminal liability, assuming company management hadn't done something like knowingly sending out unsafe trucks that led to the death).

                  If an employee is doing their job, even poorly, the employer is generally liable for the results of the employee's actions. If the employee causes injury to someone when doing something completely outside the scope of their job, that is not authorized by the employer, and that the employer has prohibited, the employer may escape liability.

        • why the car did not monitor the driver's attentiveness and disable itself

          What kind of a screwed-up world do we live in where it's now a requirement to design an extra automated system to watch the human being that's hired to watch the first automated system?

          That human being screwed up. End of story.
          It's not that hard to sit in a car and hit the brakes if it looks like something might go wrong. The level of competence required is staggeringly low.

          Would you blame a railroad company if a train hit a person because the conductor was busy on instagram?

          why Uber now tests with two employees in the front seat and monitors safety employees more strictly

          Perhaps because it's now been shown that the world is going to try and blame it on them when their employee screwed up the one thing they had to do?

          • by cusco ( 717999 )

            I take it you haven't seen the accident video; it's unlikely that any driver could have avoided hitting that idiot no matter how attentive they were.

            • I take it you haven't seen the accident video; it's unlikely that any driver could have avoided hitting that idiot no matter how attentive they were.

              Is this really a case of not being able to avoid hitting the pedestrian despite being attentive and trying or is it a case of playing with a cell phone while hitting the pedestrian?

            • Car speed: 43 mph.

              Time before impact when the bicycle is first seen on the dashcam: 2 seconds.

              Ergo, distance from car to bicycle when the dashcam picks it up: about 40 yards.

              This just shows you should not put a dashcam in charge of driving as they have terrible vision.

              Anyone driving who cannot see the broad side of a bicycle at 40 yards needs better headlights or better vision.
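
              A quick sanity check on that arithmetic (a minimal sketch; the 2-second figure is the poster's estimate from the dashcam, not an official number):

                  # How far does a car doing 43 mph travel in 2 seconds?
                  speed_fts = 43 * 5280 / 3600    # 43 mph in feet per second (~63 ft/s)
                  distance_ft = speed_fts * 2     # feet covered in 2 seconds (~126 ft)
                  print(distance_ft / 3)          # ~42 yards, close to the 40-yard figure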

            • by Cederic ( 9623 )

              I take it you haven't seen the footage of the street in question and how easily someone at night could see and avoid a pedestrian, let alone a pedestrian with a fucking bicycle.

              If you think that woman could not have been avoided then you need to stop driving and sell your car, because you're not safe on the road.

              • by cusco ( 717999 )

                I've seen the footage of that street at night when taken with a camera that has far better low light performance than my eyes do, yes.

                I have not personally driven that street at night, but my coworker who used to drive that route regularly said the woman was a frelling idiot to cross there no matter the time of day. I'll take his word over that of a random YouTuber with a good camera.

          • why the car did not monitor the driver's attentiveness and disable itself

            What kind of a screwed-up world do we live in where it's now a requirement to design an extra automated system to watch the human being that's hired to watch the first automated system?

            That human being screwed up. End of story.
            It's not that hard to sit in a car and hit the brakes if it looks like something might go wrong. The level of competence required is staggeringly low.

            Would you blame a railroad company if a train hit a person because the conductor was busy on instagram?

            why Uber now tests with two employees in the front seat and monitors safety employees more strictly

            Perhaps because it's now been shown that the world is going to try and blame it on them when their employee screwed up the one thing they had to do?

            This was a test program, not a production system. As such it was incumbent on Uber to take all reasonable precautions to prevent accidents like this. That would include hiring qualified testers, training them thoroughly on the test protocol, monitoring the test and the testers, and implementing automated safety systems to the extent possible. To my knowledge they didn't do the first three, and they disabled the automated safeties.

            The backup driver may have been incompetent, but Uber hired him and turned off the automated safeties.

        • True, but they disabled the safety features in their product which could have prevented the accident.

          No. They disabled the safety features in their product which *may* have prevented the accident while being guaranteed to interfere with the data collection for making an updated safety system.

          Simply saying "but they disabled" is asinine. The questions are to what extent this risk was assessed, what the role of the safety driver was, and what the requirement for the vehicle was (my car doesn't even have such a feature; am I operating illegally or negligently right now?).

      • by vux984 ( 928602 )

        1) What was the employee's training like? I bet it wasn't anywhere near as intense as it would need to be for the task they were given.

        2) Even if they hired someone and told them to do it, if you can't realistically expect them to do it, you share responsibility for it not being done.

        I'm in the middle of teaching my kids to drive; a big part of my job as co-driver is to be a 2nd set of eyes, watching for anything amiss and catching them in time. And when they were starting out it was REALLY easy to be a 2nd set of eyes.

        • The PHBs were paying attention. The driver was the insurance policy, nothing more. Their job was to get blamed. And at probably 10 bucks an hour, it was really cheap insurance; probably less per hour than gas, oil, and tire wear. The PHBs probably got a bonus, as the insurance policy was effective. This is what happens when you let a sociopathic company decide when it is safe to let self-driving vehicles on the road.
        • The N600XL crash: change the crew when the system fails.
          Now, in the Uber case, how much training was given on when to take over / hit the kill switch?

        • by cusco ( 717999 )

          You've just described the job of almost every security guard on the planet as well.

        • by Cederic ( 9623 )

          They were recording her and every other backup driver. If they'd done the slightest due diligence they'd have been well aware that neither this driver, nor likely ANY OTHER DRIVER they hired was perpetually at the ready to react to an event like this fast enough.

          Worse, they explicitly gave them a role that required full attention and then pretty much guaranteed that it wouldn't be given.

          Humans just don't focus on things when there's nothing to do. No 'driver' ever was going to be providing the level of attention required.

          I'm surprised Uber aren't being held corporately responsible, irrespective of whether the human in the car committed a criminal offence.

      • This is the new America. Pay someone 10 bucks an hour to take the blame, while the million-dollar CEO gets a bonus for covering Uber's butt. It really is just sick.
      • Didn't the safety driver need to look at live reports while driving?

      • by rtb61 ( 674572 )

        Yes it is, the company is fully liable for the actions of their employee. That the accident occurred proved a series of things. The company did not recruit the right person to do the job and the company did not train them properly to ensure they would carry out the job safely. As a result a person died. The company is fully civilly liable and the employee is criminally liable.

        Whilst the person walking failed to take due care, the traffic law is: you are NOT allowed to proceed in a motor vehicle unless it is safe to do so.

    • The software in the modified Volvo XC90 did not properly identify Herzberg as a pedestrian

      Did it identify her as anything? Like, maybe you wouldn't wanna slam into any large object?

    • Re: (Score:3, Insightful)

      by Cajun Hell ( 725246 )

      Uber deactivated the automatic emergency braking systems in the Volvo XC90 vehicle and precluded the use of immediate emergency braking, relying instead on the back-up driver.

      Your own words persuade me that it's probably not Uber's fault. Relying on a human driver is the current best practice, used by most of the cars on the road today.

      I'm much more with you on not-addressing-complacency (it's a damn good point), but geez, it was a test, exactly the conditions where you'd expect the driver to be vigilant and serious.

      • by magzteel ( 5013587 ) on Wednesday September 16, 2020 @06:39PM (#60512800)

        Uber deactivated the automatic emergency braking systems in the Volvo XC90 vehicle and precluded the use of immediate emergency braking, relying instead on the back-up driver.

        Your own words persuade me that it's probably not Uber's fault. Relying on a human driver is the current best practice, used by most of the cars on the road today.

        I'm much more with you on not-addressing-complacency (it's a damn good point), but geez, it was a test, exactly the conditions where you'd expect the driver to be vigilant and serious. If it were a production car driven by a customer, your argument would probably tip me to your side. But this is sort of like Neil Armstrong jacking off instead of worrying about how much fuel is left in the lander. I expect Neil to pay attention and I expect Uber employees doing tests to pay attention, more so than a casual user. It's sort of like how you're less complacent than normal when you type a command after "sudo."

        Neil was carefully selected from thousands of the best candidates available and given intensive training for years. Every step of his mission was monitored by hundreds of people in mission control, along with his two crewmates.

        Rafael was some nobody they hired to sit in the car while it drove itself for hours.

        • "Some nobody" gets paid to do a job... and, instead of doing their job, plays around on their phone or whatever. How is it not "some nobody"'s fault that they didn't do the job they agreed [and were paid] to do?

          • "Some nobody" gets paid to do a job... and, instead of doing their job, plays around on their phone or whatever. How is it not "some nobody"'s fault that they didn't do the job they agreed [and were paid] to do?

            It depends on the job, the qualifications of the person they hired to do it, the training they provided, and the safety measures in place. I'm sure all of these questions will be asked in court.

            • Uber is responsible for making hiring decisions. But using the cell phone to that degree while driving is illegal, so this should be a relatively simple case anyway. It's illegal to dick with your phone while driving (which is what the safety driver is legally doing at this point) specifically because of inattention leading to collisions, including between vehicles and pedestrians. If they have already decided that it's not the pedestrian's fault, then the bulk of the blame has to fall on the driver.

              Frankly I don't think pedestrians should get free passes, though. If there is a proper place to cross and they choose not to use it, then they should be held at fault. If they are mentally incompetent to be out on their own, then whoever let them out unaccompanied should share the blame, up to and including whoever prevented funding for same, etc.

              • Uber is responsible for making hiring decisions. But using the cell phone to that degree while driving is illegal, so this should be a relatively simple case anyway. It's illegal to dick with your phone while driving (which is what the safety driver is legally doing at this point) specifically because of inattention leading to collisions, including between vehicles and pedestrians. If they have already decided that it's not the pedestrian's fault, then the bulk of the blame has to fall on the driver.

                Frankly I don't think pedestrians should get free passes, though. If there is a proper place to cross and they choose not to use it, then they should be held at fault. If they are mentally incompetent to be out on their own, then whoever let them out unaccompanied should share the blame, up to and including whoever prevented funding for same, etc.

                Look, I'm sure the backup driver has some blame. We are in a very grey space where we say the car is self-driving, but the person is responsible for the outcome. What is the point of a self-driving car if you must be the shadow driver, ready to leap into action in a millisecond? It makes no sense.

                Uber already accepted liability and paid the pedestrian's family a big sum.
                It just sounds like the rich company can pay its way out of this, while the poor employee gets jail time.

                • It's explicitly not a self driving car yet. That's why it has a safety driver.

                  • It's explicitly not a self driving car yet. That's why it has a safety driver.

                    It explicitly WAS an autonomous vehicle, so says every article you can find:
                    https://www.google.com/search?... [google.com]

                    Here's just one:
                    https://www.npr.org/2019/03/06... [npr.org]

                    Elaine Herzberg, 49, was walking a bicycle across the road at night when she was fatally struck by a Volvo SUV outfitted with an Uber self-driving system in March 2018. The car had a human operator behind the wheel but was in computer control mode at the time of the crash.

                    In the six seconds before impact, the self-driving system classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle.

        • Hmm, so Uber's at least partly liable, because they hired carelessly? I guess I can maybe buy that.

          "I gave my gun to Clyde. It's not my fault he shot someone."

          "But Clyde's a chimpanzee!"

          • Hmm, so Uber's at least partly liable, because they hired carelessly? I guess I can maybe buy that.

            "I gave my gun to Clyde. It's not my fault he shot someone."

            "But Clyde's a chimpanzee!"

            Absolutely true.
            If your local daycare hires a pedophile they can't pretend they have no liability when she molests a dozen kids.

          • by Cederic ( 9623 )

            erm. Clyde is an orangutan.

            He's also probably a BLM supporter, given his attitude to the police.

      • Relying on a human driver is the current best practice

        Waymo and Tesla will automatically apply the brakes if a hazard is detected.

        Relying on a human to do so is certainly not a "best practice".

        • Relying on a human driver is the current best practice

          Waymo and Tesla will automatically apply the brakes if a hazard is detected.

          Uber disabled that safety feature.

      • by AmiMoJo ( 196126 )

        If you rely on something you better consider its failure modes.

        You rely on tyres so you have sensors that detect when they are deflated. You rely on brakes so you have anti-lock tech that kicks in when they don't work as expected.

        But for some reason they didn't bother with any driver monitoring system, despite the driver being a critical part of the vehicle's safety systems.

    • +1. Humans are well known to be terrible at the task required. Hiring some mouth breather to babysit a system that is 99.9% right is a recipe for disaster the other 0.1% of the time. Getting the night guard not to nap, check his phone, or otherwise fidget is virtually impossible.

      Uber and/or the state that approves their testing is at fault here. The company was being cavalier, and someone died. Either they broke the rules to test dangerous software and they are at fault, or the state let them proceed with a dangerous program and shares the blame.

  • by Jarwulf ( 530523 ) on Wednesday September 16, 2020 @06:13PM (#60512706)
    The idea behind backup drivers for robot cars is ass-backwards. You can't expect people to remain attentive for hours on end, poised and ready to take over at a moment's notice from an 'autonomous' car. The human brain doesn't work that way. With nothing to do, the mind will naturally begin to wander. And ironically, the normal motion of a car is well suited to drifting off to sleep. Maybe it could work for an error that leaves enough time to recover, but not for something instantaneous. Either the car is autonomous or it's not, and the driver should be in control.
    • by Hentes ( 2461350 )

      Driving instructors must be superhuman.

      • Driving instructors must be superhuman.

        Maybe not superhuman, but they do have to be licensed to be a driving instructor in Arizona.
        Here's the application https://apps.azdot.gov/files/m... [azdot.gov]
        Among other things you need 100 hours of combined classroom and in-car training.

        I wonder what training Rafael Vasquez got?

      • by Cederic ( 9623 )

        They have a number of advantages over an Uber safety driver. For instance, they're babysitting an actual human who will know what a pedestrian looks like and know that they shouldn't hit one. They'll also be interacting with that human, discussing the control of the vehicle but also how to drive it safely, highlighting potential issues verbally and providing instructions like "Watch for that person with the bike. Slow down. Brake!"

    • Either the car is autonomous or it's not, and the driver should be in control.

      In order to develop autonomous cars, they have to be tested.

      In order to test them, they need backup drivers.

      An autonomous car sold to the public should not require a human to pay attention.

      But that is not what this was.

  • by quonset ( 4839537 ) on Wednesday September 16, 2020 @06:28PM (#60512762)

    Create a vehicle which automatically sees what's in front of it and reacts accordingly so the person behind the wheel doesn't have to pay attention.

    Oh wait.

  • HMMM... (Score:5, Insightful)

    by WolfgangVL ( 3494585 ) on Wednesday September 16, 2020 @06:38PM (#60512796)

    Everybody at fault here. Appeal. Uber should not be able to just walk away from this. Pretty sorry article for such a complicated issue.

    Yes, the safety driver was negligent, criminally so. Same goes for Uber. Uber had the ability to monitor this driver, and the driver was working in an official capacity. I'm guessing she was a min-wage contractor with no proper training and very little pride in her job.

    "Safety driver" was likely told that she was just there to check a box, and weeks into her position she began to believe it.

    Fun fact: it's common practice to rotate soldiers out of tower guard at regular intervals, as it's a known fact that keeping humans too long in extremely monotonous positions leads to complacency, and complacency kills.

    If the driver is really expected to assume all liability here, then "Autonomous Car Safety Driver" just became the easiest six-figure career I've heard of in a long time, and ought to be renamed "Car Company Patsy."

    We all know these companies are going to do everything they can to shuffle off liability when these self driving cars cause damage. I hope this precedent is challenged.

    I also know exactly dick-all about this sort of law so there's that.

    • Uber should not be able to just walk away from this.

      Uber has already paid a big settlement to the victim's family.

      Since she was homeless and her family had more-or-less abandoned her, they likely were quite happy to get that settlement check. I doubt if they were in mourning for long.

    • Uber should not be able to just walk away from this.

      When did Uber walk away?

      They were forced to suspend their global program.
      They were sued and settled claims from the daughter of the victim.
      They were sued and settled claims from the mother of the victim.
      They were determined by the courts not to have acted criminally.

      So sorry that your personal desires for punishment weren't met. Maybe consider joining the legal profession if you want to second-guess the courts?

  • You realize that if you look down in your car for a few seconds for whatever reason and hit and kill somebody in the process, you could be similarly charged, if proof could be found.

    • And if you did look down at your phone for the same number of seconds, you should be charged.

      But in this case, so should Uber for creating this obvious situation.

  • by ugen ( 93902 ) on Wednesday September 16, 2020 @06:49PM (#60512828)

    The reason Uber hired those "safety drivers" is precisely to have a low-ranking peon to take the fall in a situation just like this one. So, clearly, the system worked as designed and Uber's minimum-wage investment paid off.

    On a serious note, there is, unfortunately, no hope for Uber management to be prosecuted for homicide, as they should be. I can only hope that the jury in the driver's trial will see through this ploy and make their decision accordingly.

    In the meantime, those of you who are considering a "self-driving" car, take note: regardless of whatever egregious mistakes the manufacturer makes, you will be the one to take the blame.

    • I would love one, but flatly will not buy one until the legal liability is solely and completely on the manufacturer. It should be presumed that I will sleep the entire ride, and therefore cannot participate in the car's control at a moment's notice.

      Of course, I would also expect to be liable if I used the driving controls.

    • Take the blame = if your e-taxi hits someone before it even gets to you, you still have to do the time.

    • Uber was under a lot of pressure to be seen catching up to Google. They weren't ready for this.

    • The reason Uber hired those "safety drivers" is precisely to have a low ranking peon to take a fall in a situation just like this one.

      Don't be stupid. The reason Uber hired safety drivers is because they were required to by the government as part of the program.

      On a serious note, there is, unfortunately, no hope for Uber management to be prosecuted for homicide, as they should be.

      Indeed the courts have decided that in this case the bar for criminal homicide hasn't been met, so I'm not sure why you think you know better than the judge looking into this very matter. Uber instead had to pay out to multiple other parties in civil cases.

  • I expect the courtroom arguments to look like Leonard Nimoy in The Outer Limits [imdb.com]. Twice [imdb.com]
  • by eepok ( 545733 ) on Wednesday September 16, 2020 @07:15PM (#60512900) Homepage

    I attended a conference in 2019 where the NTSB spoke about their investigation. Key points:

    - Tempe, AZ, 10 p.m.
    - 2017 Volvo XC-90
    - Crash occurred at a cross-pattern underpass
    - The vehicle was going 43 mph when the victim was first detected
    - The victim was walking a bike across the street midblock

    - 6 seconds before impact, the vehicle detected the pedestrian, then marked her as unknown, then marked her as a vehicle, then marked her as a bike.
    - 3 seconds before impact, the vehicle began braking.
    - 1 second before impact, the driver reacted.

    My commentary:

    1. Uber's software couldn't figure out what the "obstruction" was but continued to plow forward. One would assume that pedestrian, vehicle, bike, and "unknown" would all be categorized under "DO NOT HIT" regardless of final categorization. For THREE SECONDS, the vehicle knew there was an obstruction and didn't brake. Massive fault on Uber here.

    2. The vehicle started braking 3 seconds before impact. Any modern vehicle (a Volvo, no less) can go from 43 mph to 0 in less than 3 seconds if the brake is mashed (a quick check is sketched after this comment). It wasn't. Uber failed here.

    3. If you watch the video posted by the Tempe Police (https://twitter.com/tempepolice/status/976585098542833664?s=21), you'll notice that the vehicle was still doing 43 mph at the moment of collision, which means the application of the brakes must have been REALLY light. It's quite likely that even if the driver had been paying attention to the road the collision would still have happened, though potentially with less deadly consequences.

    4. The only reason the driver is being charged with anything is because there is video evidence of the person being on her phone. Pedestrians and bicyclists are killed on the road every year and very frequently nothing happens to the driver because there's no witness aside from the driver him/herself. The driver says, "She just came out of nowhere" and the cops say, "Ya... non-drivers. Losers."

    5. If Uber leaves the driver out to dry, then they're pretty much setting back the desire for autonomous vehicles by decades. Who's going to want an AV if the company driving the vehicle will hold YOU responsible when THEIR driving fails?

    6. No, your Tesla wouldn't necessarily have performed better. The next discussion was on Tesla's high susceptibility to drive-out collisions. (Williston FL 2015 Tesla Model S; May 7, 2016)
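
    The stopping claim in point 2 is easy to check. A back-of-envelope sketch in Python (the 0.8 g deceleration is an assumed typical value for hard braking on dry pavement, not a figure from the NTSB presentation):

        # Can a modern car stop from 43 mph in under 3 seconds?
        speed_ms = 43 * 0.44704                  # 43 mph in metres per second (~19.2 m/s)
        decel = 0.8 * 9.81                       # assumed hard-braking deceleration (m/s^2)
        stop_time = speed_ms / decel             # ~2.4 s, under the 3 s available
        stop_dist = speed_ms ** 2 / (2 * decel)  # ~23.5 m stopping distance
        print(stop_time, stop_dist)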

    • 5. If Uber leaves the driver out to dry, then they're pretty much setting back the desire for autonomous vehicles by decades. Who's going to want an AV if the company driving the vehicle will hold YOU responsible when THEIR driving fails?

      We so often hear, "corporations are just groups of people, so they have the same rights as people," but what's often left out is that they (should) have the same responsibilities and status in society.

      In this case, she is Uber, and Uber as a whole should be held criminally liable if anyone is. This is a fucking testing program. You don't put your test drivers in impossible situations. Given human attention span and the lack of feedback, as the driver wasn't supplying input or receiving stimulus, they can hardly be expected to stay alert.

      • In this case, she is Uber, and Uber as a whole should be held criminally liable if anyone is.

        Indeed. And She was dragged in front of the court, and She was found not to be criminally liable under the legal code, based on the systems and checks She put in place, by a judge who knows the law. She did, however, end up getting dragged into multiple civil suits.

    • Massive fault on Uber here.

      The only test I've witnessed that didn't find a fault was a bad test. That's why you test. Not having faith in the equipment you test is why you put in place additional barriers such as human oversight. That isn't a fault of Uber. Now Uber may have been at fault for how well the secondary barriers worked, but ultimately I can't stress this enough: Finding a fault during testing is the whole frigging purpose of testing.

  • by thesjaakspoiler ( 4782965 ) on Wednesday September 16, 2020 @07:28PM (#60512940)

    Did anybody see the video footage?
    Crossing a street in complete darkness expecting that cars will brake for you?
    Any driver would have hit this idiot.

    • As someone who has driven over the exact road where this happened more than a few times, pedestrians like this are entirely too common. And they love to wear all black for some reason. The roads are well-lit, but the fact that people are moving shadows makes them hard to see because they get washed out by all the other lights. You have to watch for other headlights getting eclipsed by something to see where people are and to watch the sides of the road. But yeah, it's not exactly easy, either.

      I can, and

    • An attentive driver would have swerved to avoid the pedestrian. A "safety driver" with his eyes glued to his smartphone, on the other hand...
    • 1. Go look at the dashcam footage frame by frame and work out the speed of the 49-year-old pushing a bicycle. You can see the lane markings and you know how big a bicycle is.

      2. Go look at a map of the crash site and see how far the bicycle travelled across the road before the crash.

      3. Using the data from above, work out how many seconds before impact the pedestrian started to cross the road.

      4. Using the information from 3, and the official impact speed of 43 mph, work out how far away the Uber car was when the pedestrian started to cross (a worked version is sketched below).
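
      Putting those steps together as a worked example (the walking speed and crossing distance are illustrative assumptions, not measured values):

          # Steps 1-4 with placeholder inputs
          walk_speed = 1.4                      # m/s, assumed pace pushing a bike (step 1)
          crossing_dist = 10.5                  # metres crossed before impact, assumed (step 2)
          t_cross = crossing_dist / walk_speed  # step 3: ~7.5 seconds before impact
          car_speed_ms = 43 * 0.44704           # official impact speed of 43 mph, in m/s
          print(car_speed_ms * t_cross)         # step 4: car was ~144 m (~158 yards) away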

    • by Cederic ( 9623 )

      The video footage is a shitty dashcam incapable of capturing the light that was available.

      It's like pulling on a welding mask and asking people how you're meant to ski.

      The car detected the pedestrian six seconds before impact. A human would have had at least that much time to see her.

      People cross that road all the time. They don't get hit. They don't get killed. It's not a dangerous road, it was a dangerous vehicle.

  • It's terrible what happened, but in this case the backup driver should probably not be criminally charged. First, the pedestrian was illegally crossing .. how is that not taken into account? Who crosses the road seeing a car coming? While there was enough time to brake I am not sure there was enough time to avoid the accident entirely. Maybe the impact speed could have been reduced at best? Second, for the Uber test driver, it is a totally new thing for humans to be in a self driving car .. for all we know

    • by Cederic ( 9623 )

      First, the pedestrian was illegally crossing .. how is that not taken into account?

      Because it does not matter. She could have been dancing naked with a porcupine, that still doesn't justify killing her. You don't get to just ram and kill people because you think they might have possibly been breaking the law. That's quite rightly not legal.

      Who crosses the road seeing a car coming?

      Everybody. They make a judgement based on the road, traffic levels, visibility, the speed of the oncoming vehicle, the time they need to cross.

      Plus of course you're assuming she saw the car at the point she started to cross. Maybe she did, maybe she didn't.

  • by Joe_Dragon ( 2206452 ) on Wednesday September 16, 2020 @08:11PM (#60513102)

    Need discovery on all work rules / emails / guides / etc for the safety driver.
    Also need ALL LOGS / ALL SOURCE CODE

  • by fluffernutter ( 1411889 ) on Wednesday September 16, 2020 @08:18PM (#60513140)
    UBER TEST 1

    Purpose: To explore whether humans are capable of maintaining the vigilance required for safe automated driving.

    Result: Hard no.
  • The dead person walked into a street without looking both ways. The Volvo had its headlights on. It's a hella lot easier to see a car with its headlights on than a person in dark clothes with no lights on the same road.

    Not to say the person hired to "drive" the car isn't at fault, but the dead idiot contributed at least 50% to their death.
    • Y'all need to look at the dashcam footage. The pedestrian had nearly completed crossing a 4-lane road when the Uber car hit her.
      The Uber car was doing 43 mph, so when the pedestrian decided to cross the road, the Uber may have been over 300 yards away and not directly visible.

  • There's your legal precedent, and it's at least as bad as I thought it might be. You have an SDC? You get in a wreck? Your fault.
    So much for SDCs. Never thought they'd work in the first place and precedents like this are the nail in the coffin.
    • by Cederic ( 9623 )

      This does not set precedent for fully automated vehicles. This merely acknowledges that a partially automated vehicle remains under the control of a responsible adult, and if, through negligence of that adult, someone dies, that adult may be liable.

      • I don't find that argument relevant since I do not believe there will be any such thing as 'fully autonomous cars' due to the technology being utter crap that has no ability to 'think', not at all.
        • by Cederic ( 9623 )

          So you're claiming that precedent has been set for something that hasn't happened and will never happen?

          I was trying to be nice but frankly you're too fucking stupid to help.

  • An important takeaway here is that the law applies to people, not things or AI. Because of that, the lawyers had to find a person to blame, not the device. Given the NTSB investigation, I would say that the prosecutors are pointing the finger at the wrong person, unless of course there was a law that clearly stated that, as the backup to the AI, they were responsible. The prosecutors should have been pointing the finger at the designers of the software given the NTSB results. But this also illustrates the problem of applying laws written for human drivers when the actual driving is done by software.
