Transportation

Human Driver Could Have Avoided Fatal Uber Crash, Experts Say (bloomberg.com) 408

An anonymous reader shares a report: The pedestrian killed Sunday by a self-driving Uber SUV had crossed at least one open lane of road before being hit, according to a video of the crash that raises new questions about autonomous-vehicle technology. Forensic crash analysts who reviewed the video said a human driver could have responded more quickly to the situation, potentially saving the life of the victim, 49-year-old Elaine Herzberg. Other experts said Uber's self-driving sensors should have detected the pedestrian as she walked a bicycle across the open road at 10 p.m., despite the dark conditions. Herzberg's death is the first major test of a nascent autonomous vehicle industry that has presented the technology as safer than humans, who often get distracted while driving. For human driving in the U.S., there's roughly one death every 86 million miles, while autonomous vehicles have driven no more than 15 to 20 million miles in the country so far, according to Morgan Stanley analysts. "As an ever greater number of autonomous vehicles drive an ever greater number of miles, investors must contemplate a legal and ethical landscape that may be difficult to predict," the analysts wrote in a research note following the Sunday collision. "The stock market is likely too aggressive on the pace of adoption."
This discussion has been archived. No new comments can be posted.

  • by ZorinLynx ( 31751 ) on Thursday March 22, 2018 @04:36PM (#56308443) Homepage

    Based on the video I saw, she was practically invisible until she entered the car's headlight beams. The road was poorly lit, and she had dark clothing, no reflectors on the bike and no lights.

    I don't see how I could have stopped or swerved in time to avoid her in that brief window.

    Believe me, I don't care for self-driving cars at all, but I have to remain unbiased here because I know I would have hit her in the same situation.

    Be safe out there, people. Put lights on your bike or yourself when you're out there on the road at night.

    • I agree; I'm not sure an attentive human driver would necessarily have avoided that collision, but the real question is why none of the autonomous vehicle's sensors picked up that there was an obstacle in the road.
      • by TexasDiaz ( 4256139 ) on Thursday March 22, 2018 @05:04PM (#56308657)
        When you live in rural North Georgia or rural New Hampshire for as many years as I have, there's one rule you learn - don't swerve to avoid obstacles in the road (namely deer). You're often far more likely to kill yourself trying to avoid the deer than you would be if you just hit it. So I'm trained: if something jumps out at me, I'm gonna get ready to hit it. A human driver may have been able to swerve out of the way if they were alert and ready to perform the necessary maneuver, but I'll be goddamned if I'm gonna do that. I'll do everything in my power to not kill something, but I won't kill myself trying.
        • Agreed. Braking is the right thing to do, not swerving. I know most people won't do it, but that's one of many reasons I put a big brake kit on my car. It isn't even a sports car! Just a V6 Accord coupe. If I brake down to half or a quarter of my speed before I hit someone, they might survive, but I am not swerving.

        • by xevioso ( 598654 )

          And I think this is actually kind of the crux of the problem here... we are now at the spot where we are going to expect vehicles to decide: either hit that person, or swerve or brake and injure the driver or someone else. Those sorts of split-second decisions people rarely have to make, but they *do* occasionally have to make them.

          I'm of the opinion that we use applied philosophy... that is, give a person the option of two different types of software in their car, and they have to decide which one they wan

          • we are now at the spot where we are going to expect vehicles to decide,
            either hit that person, or swerve or brake and injure the driver or someone else

            No need to make a person-oriented decision.
            If the car detects an obstacle, whether a child, an adult, a dog or a traffic cone,
            the car just has to solve one problem: is there a free lane that I can swerve into safely?
            If so, do it.
            If not, hit the brakes as fast and as hard as it can.
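
            In code terms that rule is tiny. A rough sketch in plain Python - the two boolean inputs are stand-ins for whatever perception and planning would actually provide, not any real AV stack's API:

              # The swerve-or-brake rule above, reduced to its decision.
              def choose_maneuver(adjacent_lane_free: bool, lane_change_safe: bool) -> str:
                  if adjacent_lane_free and lane_change_safe:
                      return "swerve"        # a clear lane exists and we can reach it safely
                  return "brake_hard"        # otherwise shed speed as fast as possible

              print(choose_maneuver(True, False))   # -> "brake_hard"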

        • by rthille ( 8526 )

          Yeah, I came around a corner to a momma deer in the middle of my lane. I was going too fast to tighten the radius (which would have also brought me into the currently unoccupied, but not necessarily for long, oncoming lane), so I widened the radius onto the wide shoulder.

          Where the baby deer was illuminated by my lights. At that point it was either try to swerve back to momma, or go off the road, down the embankment into the oak trees. Where the rest of the deer were.

          The baby deer didn't make it, but I'm pretty su

          • by jeremyp ( 130771 )

            If you had hit the mother deer, the baby deer would likely also have died later. Also, an adult deer coming through your windscreen is probably not survivable.

        • by Cederic ( 9623 )

          Maybe you should buy a car that can do more than go in a straight fucking line.

    • Re: (Score:3, Insightful)

      by quantaman ( 517394 )

      Based on the video I saw, she was practically invisible until she entered the car's headlight beams. The road was poorly lit, and she had dark clothing, no reflectors on the bike and no lights.

      I don't see how I could have stopped or swerved in time to avoid her in that brief window.

      Believe me, I don't care for self-driving cars at all, but I have to remain unbiased here because I know I would have hit her in the same situation.

      Be safe out there, people. Put lights on your bike or yourself when you're out there on the road at night.

      Then you should drive slower because I would have avoided her.

      When I drive at night I drive at an appropriate speed so I can stop in time if my headlights detect something on the road ahead of me. And if my headlights and eyes were as terrible as the crappy video we've been shown I would have been driving very slowly indeed.

      • by ChrisMaple ( 607946 ) on Thursday March 22, 2018 @05:24PM (#56308785)

        Speed limits are set according to fixed rules, based on carefully examined statistics and the theoretical capabilities of cars and drivers. A self-driving car would be obeying the speed limit (well, this is Uber, maybe not). A human driver would assume that driving at the posted limit was safe in all but the most severe conditions (dense fog, heavy snow, icy roads, nighttime).

        A cyclist crossing the road on foot, wearing dark clothing, should have been able to see the approaching headlights from hundreds of yards away. This seems like a case of extremely bad judgement on her part.

        • This also took place in the PHX metro; a car going the speed limit will get shot at and/or run off the road.

        • Obeying the speed limit is useless on the IL Tollway.

          Late at night with low traffic and really good lighting you can fly. No one does 55; even the cops do 75-80 in the 55 zones.

        • Speed limits are set according to fixed rules, based on carefully examined statistics and the theoretical capabilities of cars and drivers.

          In all countries other than the USA, they are often set by policy and manipulated politically.

          A human driver would assume that driving at the posted limit was safe in all but the most severe conditions (dense fog, heavy snow, icy roads, nighttime).

          Maybe in the USA. In many other countries we are told to "drive to conditions". The speed limit has never been a defense against an accident; breaching it, however, has always been a contributory factor. Take for instance the European right-of-way rules: literally give way to anything coming from your right. In an average built-up area with a speed limit of 50km/h you are never able to get that fast despite tha

      • I agree with you. In poorly lit areas I often go ten under. If the road is well lit I'll go faster, but it's all based on conditions. I don't speed in a blizzard like some of the maniacs I live around.

      • Re: (Score:3, Insightful)

        Comment removed based on user account deletion
        • Speaking from experience:

          A.
          I drive quite a lot on vacations/weekends, often at night, sometimes in fog.
          - The human supervisor *should* have turned on the high beams. It seems to me that only the low beams were active, reducing the visibility range. (This might have affected the camera part of the sensors.) The supervisor is supposed to supervise the self-driving car and thus should be able to see, in order to anticipate and compensate for bugs, instead of relying on the whole thing to wor

    • That's video (Score:2, Informative)

      by Anonymous Coward

      ....she was practically invisible until she entered the car's headlight beams

      Human vision is MUCH more sensitive than cameras. What looks dark in the video wouldn't be so bad to a human. That's why they use all those lights when shooting video.

      So, it wasn't as dark as it appears.

      • by K. S. Kyosuke ( 729550 ) on Thursday March 22, 2018 @04:59PM (#56308625)
        Yes, that's why night vision devices use extracted human eyeballs instead of optoelectronic components.
        • Night vision has amplification; this camera did not.

          What the car should have had was infra-red, and if it didn't then I can't see how you can suggest it's fit to use at night in any conditions.

          If it did have infra-red, she would have been a massive bright spot on a black background moving across the car's path, and it reacted by ... doing nothing.

          • Long-wave infrared would be very useful for cars but it's probably also not all that cheap. I'm not sure she'd leave a "massive bright spot" on ordinary short-wave infrared imaging sensors.
            • Maybe; the BMW version seems to be about $1000. But I'd pay that for infrared around here pretty happily - for kangaroos rather than people or deer - and it would easily pay for itself.

          • Apparently it did have a lidar system for obstacle detection, so the darkness is irrelevant.

            Also, the nice thing with lidar is that unlike infrared, it doesn't rely on the object's heat signature for detection.

        • Cameras vary by model. I doubt that Uber was using high end equipment.
        • The gross failure to use adequate sensors should be immediately recognized in any real or forensic engineering review.
      • Often those dark shapes at night can be seen when you look straight at them, but they are not at all easy to see with peripheral vision.

    • I would not have. I have been in situations like that many times. The low beams reach far enough that you have time to avoid stationary objects in the road, or things moving slowly onto the road.

    • by Luthair ( 847766 ) on Thursday March 22, 2018 @04:49PM (#56308541)

      It's hard to be certain; the video is rather low quality and cameras typically struggle to capture images at night. Even in the low quality video I saw on the BBC's site you can see white shoes moving in the shadow, which makes me suspect the person was more visible than the video would have you believe.

      Perhaps more concerning - the released video shows that the person supposedly monitoring the car spent an awful lot of time looking down, not ahead and out the window.

    • by AlanBDee ( 2261976 ) on Thursday March 22, 2018 @04:54PM (#56308581)

      But consider this. Next time you're a passenger at night on a poorly lit road, take out your phone and record the road. I'll wager that your real eyes can see better in the low light than your phone or the camera attached to the Uber car.

      Morally, I think the woman is at fault for crossing the road in the dark without looking for oncoming cars and without any kind of light. But the autonomous car's other sensors should have picked her up anyway. To be successful, autonomous cars need to be significantly better than the average driver; they need to be better than the best drivers out there.

      I am biased - I can't wait for autonomous cars to come to market. But even I have to admit that it should have seen her coming with plenty of time to spare.

      • To be successful, automated cars will need to make out things much smaller than adult women pushing bikes.
      • Re: (Score:3, Informative)

        by AmiMoJo ( 196126 )

        It's the sensor failures that really worry me. Radar should have seen her, the lidar should have seen her. The cameras should have seen her - most autonomous cars use cameras with some IR vision capability so they can see at night.

        The cameras on my Nissan Leaf have better night vision than the one in the video, which makes me think it's not the one the system uses.

    • I am amazed at how many jaywalkers I see wearing dark clothes at night crossing a four lane road that I often drive on.

    • From what I saw of the video, it was only about 1 1/2 seconds between when the pedestrian first became visible and when the car hit her. Admittedly I was doing the old "1 one thousand, 2 one thousand" thing, but I don't think I'm that far off. I recall seeing reports years ago that human reaction time to any sudden situation is about 2 seconds. I do think that if the safety driver had been looking more at the road and less down at whatever was in her lap, she would have shown that look of shock and surprise a lit
      • by bws111 ( 1216812 )

        A two-second reaction time? The average reaction time to a visual stimulus is about a quarter of a second.

    • I've seen the video. I think I would have hit her, but I also think I would have slammed on my brakes before I did. I might have hit her at 20 instead of 40. She might be alive.
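
      Rough numbers for that (my own assumptions, not figures from the investigation: ~40 mph initial speed and about 0.8 g of hard braking):

        # How much speed one second of hard braking sheds, starting from 40 mph.
        # 0.8 g is an assumed deceleration for a hard stop on dry asphalt.
        MPH_TO_MS = 0.44704
        G = 9.81

        def speed_after_braking_mph(initial_mph, braking_time_s, decel_g=0.8):
            v0 = initial_mph * MPH_TO_MS
            return max(0.0, v0 - decel_g * G * braking_time_s) / MPH_TO_MS

        print(round(speed_after_braking_mph(40, 1.0)))   # ~22 mph after one second on the brakes
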
    • Maybe if the woman who was supposed to be watching what was going on hadn't been half asleep, we'd know the answer. I think in a situation like that, a person who wanted to be more aware would have been. This is exactly why autopilot is dangerous. Everyone wants to blame it on it being dark, but the car shouldn't care; it's not relying on visible light.

    • by kiviQr ( 3443687 )
      Most people would have reacted differently - they would either have slowed down to adjust to driving conditions or used the high beams. Note that the person behind the wheel was checking something on the dashboard? phone? - she unfortunately wasn't paying attention. Regarding the car - they did a very bad job. The car should have recognized the person (LIDAR, infrared, radar, ultrasound, etc.); it looks like it uses only camera video to make decisions. It doesn't look like the brakes got applied. I have seen way better non-autonomous cars that rea
    • Take with a pinch of salt: low light, high contrast and over compressed. How much do you really expect to see in such a video?

      More interesting is what the human driver was looking down at instead of having their eyes on the road. If they were looking at a screen then they had also compromised their night vision.

    • A robot car should have supernormal vision, redundant detection capabilities and faster reaction times, not excuses...
    • by LordKronos ( 470910 ) on Thursday March 22, 2018 @05:57PM (#56309127)

      Based on the video I saw, she was practically invisible until she entered the car's headlight beams. The road was poorly lit, and she had dark clothing, no reflectors on the bike and no lights.

      I don't see how I could have stopped or swerved in time to avoid her in that brief window.

      Then I suggest you try driving by looking out the windshield, and not at a crappy video of what's in front of you. I say that not as a joke. People keep judging this situation by the video we see, but the video quality is pretty much crap. I guarantee the human eye would capture much better detail (both in terms of resolution and shadow detail) than what we see in that video. The video is absolute crap, so please don't say what you couldn't have done based on it.

    • Dashcams tend to expose for the light and make things in darkness less visible than they are to the human eye.

      But even ignoring that, the pedestrian was already in the traffic lane when the headlights reached them - they didn't step into the headlights out of darkness from the side. So a human could have stopped in time, assuming they were driving at a safe speed and hence didn't have their stopping distance out past their view distance. If you can't stop in that situation, you are driving too fast for the conditions.
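
      To put rough numbers on stopping distance versus view distance (assuming a 1.5 s perception-reaction time and about 0.7 g of braking - generic figures, not data from this crash):

        # Total stopping distance = distance covered while reacting + braking distance.
        MPH_TO_MS = 0.44704
        G = 9.81

        def stopping_distance_m(speed_mph, reaction_s=1.5, decel_g=0.7):
            v = speed_mph * MPH_TO_MS
            return v * reaction_s + v * v / (2 * decel_g * G)

        for mph in (25, 35, 45):
            print(mph, round(stopping_distance_m(mph)))
        # ~26 m at 25 mph, ~41 m at 35 mph, ~60 m at 45 mph. Low beams typically
        # light up something like 50 m of road, so at 45 mph your stopping distance
        # is already out past what you can see.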

    • Based on the video I saw, she was practically invisible until she entered the car's headlight beams.

      I think you're seeing the limitations of the camera. There were two streetlights nearby, and if the headlights on the car were anywhere near focused properly, a human driver would have easily seen her.

      The road was poorly lit, and she had dark clothing, no reflectors on the bike and no lights.

      In most of the US, large animals, like deer, commonly enter the roadway. Deer do not wear reflectors or have LED lights.

  • by reve_etrange ( 2377702 ) on Thursday March 22, 2018 @04:46PM (#56308507)

    raises new questions about autonomous-vehicle technology.

    No, it raises further questions about Uber's poor, perhaps criminally negligent, implementation. In the last year Uber's had more, and more serious, accidents than I think every other driverless program combined. Google/Waymo has been testing in San Francisco - not Tempe - for years with nothing comparable.

    • Yeah. It seems Uber's implementation is basically a level 2 AI marketed and tested as a level 3, and they are just hoping their safety drivers can keep the scam working until they've gotten some more investor money to burn in their corporate dumpster fire.

      • Being a safety driver has to be one of the most boring jobs. It seems to me it would promote sleeping, texting, etc.

        They are supposed to have their hands poised over the wheel and be aware of everything going on around them at all times while the vehicle is in motion.
        Which is completely unrealistic for the individuals that would take that job.
        When the passenger in the Tesla was killed after the vehicle mistook the white side of a semi trailer for the sky, Tesla said all drivers using Autopilot were su
        • I was actually thinking they should have a person ask them questions about the road ahead every ten minutes or so. That may keep them more involved.
  • by foxalopex ( 522681 ) on Thursday March 22, 2018 @04:48PM (#56308529)

    I actually watched the set of videos and there's two major things to note:

    First of all, the safety or backup driver appeared to be distracted. Although in all fairness, if you're supposed to sit there for hours on end without taking an active role in driving, this is probably going to happen. This is why Google believes in an all-or-nothing approach; half-baked systems are going to get people killed. While this wouldn't have saved the cyclist from being hurt, quick reflexes may have kept it from being fatal.

    Second, LIDAR works by projecting a super-high-speed panning laser that maps out the 3D spatial environment, letting the computer produce a 3D model of the surroundings. This should NOT be affected by the dark! Unless Uber decided not to use LiDAR, which would be a dangerous move. If they're using LiDAR, the only explanation is that the AI image recognition system failed to recognize the cyclist, which is weird considering that a moving object that BIG should register as a collision threat. Google has noted that in their own self-driving program the computer can sometimes panic over a flying piece of newspaper, because it looks like an object heavy enough to threaten the car, while a normal driver wouldn't.
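
    To illustrate why darkness shouldn't matter to LiDAR, here is a toy version of the geometry check - a gross simplification of a real perception stack, with invented numbers:

      # Toy lidar check: any returns inside the lane corridor ahead of the car are a
      # potential obstacle. The input is pure 3D geometry, so ambient light is
      # irrelevant. Real systems add clustering, classification and tracking on top.
      def returns_in_path(points, half_width=1.8, max_range=60.0, min_height=0.3):
          # points: (x, y, z) in metres, car frame: x forward, y left, z up
          return [p for p in points
                  if 0.0 < p[0] < max_range        # ahead of us
                  and abs(p[1]) < half_width       # inside our lane corridor
                  and p[2] > min_height]           # above road-surface clutter

      scan = [(25.1, 0.4, 1.1), (25.0, 0.6, 1.4), (80.0, 5.0, 0.1)]   # fake frame
      print(returns_in_path(scan))   # a person with a bike ~25 m out shows up day or night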

    • by Luthair ( 847766 )

      First of all, the safety or backup driver appeared to be distracted. Although in all fairness, if you're supposed to sit there for hours on end without taking an active role in driving, this is probably going to happen. This is why Google believes in an all-or-nothing approach; half-baked systems are going to get people killed. While this wouldn't have saved the cyclist from being hurt, quick reflexes may have kept it from being fatal.

      To me, randomly driving around seems like poor test methodology - they should only allow the AI to drive to test specific scenarios, and the time should be limited such that a human can reasonably supervise it. Otherwise they should have a human drive a vehicle collecting sensor data which can then be used in simulations. Unlike a person, an algorithm doesn't know the difference between a simulation and real life.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Uber do use LIDAR. Looks like there was a hardware or software issue. The car was also breaking the speed limit at the time.

    • The driver looks like they have consumed many rolls but are not at all active

    • LIDAR (and radar, and sonar) is one of those things which sounds great when you consider the car in isolation, but isn't so great once you have multiple cars on the road all using it. I'm already noticing the problem with sonar-based parking sensors. Despite using CHIRP (a sonar pulse whose frequency varies over time), there's still enough random overlap from nearby cars that the parking sensor will occasionally trigger due to other cars which also have parking sensors. Usually it's at a red light
    • by kiviQr ( 3443687 ) on Thursday March 22, 2018 @05:47PM (#56309037)
      I wonder if they removed LIDAR after Waymo lawsuit?
      • by mjwx ( 966435 )

        I wonder if they removed LIDAR after Waymo lawsuit?

        I doubt it; LIDAR is likely to be someone else's tech bought off the shelf. Not sure about Uber, but Alphabet (Google) uses Velodyne units, which I've used for aerial terrain survey. Phenomenally accurate units, except if it's raining, snowing or there's cloud in the way.

  • Seriously. Pay attention when crossing the road, especially at night.
    • Seriously. Pay attention when driving a car, especially at night.

      • The stakes are a lot higher if you're on foot though. Even if there is a crosswalk I'm not going to play chicken with a car.
      • by Kjella ( 173770 )

        Seriously. Pay attention when crossing the road, especially at night.

        Seriously. Pay attention when driving a car, especially at night.

        Unfortunately, paying attention as a driver is no immunity from getting hit as a pedestrian. It doesn't matter if I have the right of way; I'll be the one injured, crippled or dead. And by far most of the adult population has a driver's license, so when you're scraping the bottom of that barrel there are some pretty terrible drivers out there. Looking out for yourself is simple self-preservation, no matter how much the rules say you shouldn't have to.

  • by petes_PoV ( 912422 ) on Thursday March 22, 2018 @04:54PM (#56308575)
    I watched the video and it looked to me like (in the visible spectrum at least) she literally appeared out of the shadows only a second or two before she was hit.

    But the other question is why she didn't see the oncoming vehicle. It had its lights on, and even coming around a bend, the light it threw onto the roadway would have been visible long before the car itself appeared.

    Even if one party in a collision is not at fault, that doesn't mean they couldn't have avoided it.

  • by Prien715 ( 251944 ) <agnosticpope.gmail@com> on Thursday March 22, 2018 @04:55PM (#56308587) Journal

    Kudos to everyone in the last story who pointed out that LiDAR should have been able to see the pedestrian and that the crash was totally avoidable. Comments were also more accurate than the news in the recent Intel and AMD (non-)story about security.

    The fact that the highly moderated comments are more accurate than almost any news outlet is why I keep coming back. That, and I'm *still* looking for Natalie Portman's brand of hot grits.

    • This being /. I'm really struggling to decide if you are being serious.
    • These stories are just PR aimed at dummies. Liability lawyers and engineers should roll their eyes at the excuses.
      Must be something rigged in the "enabling legislation" if the companies get away with it.
  • A video camera's dynamic range is much less than that of a human eye, meaning we can't judge what a human might or could have seen based on this video.

    That said, I'm not necessarily in agreement that a human could have avoided a collision in a similar scenario, BUT the sensors, in my view, should have noticed the pedestrian. If not, that should be considered a fixable flaw.

  • Extremely boring (Score:5, Insightful)

    by 140Mandak262Jamuna ( 970587 ) on Thursday March 22, 2018 @05:28PM (#56308843) Journal
    I mentioned it earlier. The idea that a human being in the driving seat would be alert enough to override the autonomous mode is really, really stupid.

    If there is no need to steer, your attention wanders and it is impossible to stay alert. This was discovered almost 100 years ago on the railroads. The engineer had the exacting task of watching for grades and monitoring speed, especially in those days with weak steam locomotives that responded very slowly. Still, they would get bored and fall asleep. So they invented the dead man's treadle: the engineer must keep it pressed, or the locomotive will stop. Even now there are various techniques to check on and keep engineers alert on the railroads.

    With that much history, it is stupid for autonomous cars to just leave the driver sitting there. They should have active devices that do challenge and response to make sure the human operator stays alert. Otherwise it is a waste to put a human being there.
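
    A minimal sketch of what challenge-and-response could look like - the intervals, timeout and escalation behaviour are all assumptions for illustration, not anything any operator actually runs:

      # Dead man's treadle for the software age: prompt the safety driver at random
      # intervals; if they don't acknowledge in time, treat them as inattentive and
      # escalate (slow down, pull over, disengage). The callbacks are placeholders.
      import random
      import time

      def attention_loop(prompt, wait_for_ack, escalate,
                         min_gap_s=120, max_gap_s=300, ack_timeout_s=5):
          while True:
              time.sleep(random.uniform(min_gap_s, max_gap_s))
              prompt()                              # chime, light, or spoken question
              if not wait_for_ack(ack_timeout_s):   # e.g. wheel touch or button press
                  escalate()
                  return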

  • If it can be determined that the car *should* have seen her, then what was going on at the time such that the car didn't see her?

    It's a freakin' computer... you can go through its logs and track what it saw and what it didn't see, and figure out based on the logic in the code why it didn't respond to the pedestrian appropriately.

    Figure that out, and then add it to the repertoire of situations that the car knows about, to at least make it safer in the future.
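
    The kind of question that log review would answer looks something like this - the record format is invented for illustration, real AV logs are far richer:

      # When did perception first flag an object in the planned path, and how long
      # until a brake command followed (if ever)?
      def detection_to_brake_gap(records):
          # records: dicts like {"t": 12.3, "event": "detection", "in_path": True}
          t_detect = next((r["t"] for r in records
                           if r["event"] == "detection" and r.get("in_path")), None)
          t_brake = next((r["t"] for r in records if r["event"] == "brake"), None)
          if t_detect is None:
              return "perception never flagged the object"
          if t_brake is None:
              return "flagged at t=%.2fs, but no brake command was ever issued" % t_detect
          return "flagged at t=%.2fs, brake at t=%.2fs" % (t_detect, t_brake)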

  • More or less. Continue scoffing at me and shouting me down if it makes you feel better, SDC fanboys, IDGAF.
  • There are bound to be situations in which a human would react better than an autonomous system. That's not news. The real question is whether there are more per-capita accidents involving human drivers that could have been prevented by an autonomous vehicle, or vice versa. We will likely never get to the point where autonomous vehicles never make a mistake that humans wouldn't. However, when we get to the point where they make FEWER fatal or potentially fatal mistakes than the average human, that's the cutoff
  • So, something in another lane crosses into yours,

    Let's look at it from the computer's perspective.

    You can be driving in your lane and have stationary traffic in the next lane (eg. a turning lane). This is not a problem, they are not in your lane.

    At the extremes of your sensing range, you see an object in that lane that is not moving towards you. In this case, at that distance a person pushing a bicycle across the lane is - generally - not really approaching you, not if you look at lidar. This is not a probl
