Transportation Businesses

Uber Shutting Down Self-Driving Operations In Arizona After Fatal Crash (azcentral.com) 67

An anonymous reader quotes a report from The Arizona Republic: Uber is shutting down its self-driving car tests in Arizona, where one of the cars was involved in a fatal crash with a pedestrian in March, the company said Wednesday. The company notified about 300 Arizona workers in the self-driving program that they were being terminated just before 9 a.m. Wednesday. The shutdown should take several weeks. Test drivers for the autonomous cars have not worked since the accident in Tempe, but Uber said they continued to be paid. The company's self-driving trucks have also been shelved since the accident. Uber plans to restart testing self-driving cars in Pittsburgh once federal investigators conclude their inquiry into the Tempe crash. The company also said it is having discussions with California leaders to restart testing.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Ugh, I was working in the self-driving unit and I just got canned. I thought it would work!
    • I wanna be a test napper in a car.

      • by TWX ( 665546 )

        You really don't. Too much facing the sun so you can't really get off to sleep, and they're monitoring you on video the whole time anyway.

        The most boring days I've ever had were workdays without anything to do but sit there. Being the "driver" in a self-driving car without being allowed to do anything would be hell.

      • The headline should be "Uber Shutting Down Killbot Operations in Arizona Due to Low Effectiveness".

        Or maybe the Pentagon threw some money at them and they've taken the program black. /Alex Jones
    • So what is the future of Uber? Waymo is ahead on the tech, and Lyft is partnering with other SDC developers. So if Uber pulls back, they are going to get crushed on price when competing SDC rideshares are available. They are bleeding cash even at current prices ($11B burn rate last year). So how do they survive?

      • The amount others waste trying to build SDCs isn't really relevant to the ultimate demise of Uber.

        They don't survive, but first there is chump's money to collect.

      • by AvitarX ( 172628 )

        It depends on how this affects investment. Since they aren't really making money anyway, this should at least reduce their burn rate significantly.

        If a driver makes $12/hour (I recall reading that's roughly what they net after accounting for other costs), it'll take at least two years for a self-driving car to recoup the cost of a full-time driver (though a self-driving car could in theory replace two or more drivers). That's assuming around a $20k markup for the car, which I suspect is low (rough numbers are sketched below).

        Sure, a company with a reliable self-driving base fl
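
A quick back-of-the-envelope check of the payback math above, sketched in Python. The $12/hour driver cost and the ~$20k hardware markup are the figures from the comment; the 2,000 hours/year of utilization and the $10k/year monitoring-and-maintenance overhead are illustrative assumptions, not known Uber numbers.

# Back-of-the-envelope payback estimate for replacing a paid driver with
# self-driving hardware. Wage and markup are the comment's figures;
# utilization and overhead are illustrative assumptions.

DRIVER_WAGE_PER_HOUR = 12.0   # net cost of a human driver (from the comment)
HARDWARE_MARKUP = 20_000.0    # extra cost of the SDC sensor/compute package (from the comment)
HOURS_PER_YEAR = 2_000.0      # one full-time driver: assume 40 h/week, 50 weeks
OVERHEAD_PER_YEAR = 10_000.0  # remote monitoring, extra maintenance, etc. (pure assumption)

def payback_years(drivers_replaced: float = 1.0) -> float:
    """Years until the hardware markup pays for itself."""
    savings_per_year = DRIVER_WAGE_PER_HOUR * HOURS_PER_YEAR * drivers_replaced
    net_savings = savings_per_year - OVERHEAD_PER_YEAR
    return float("inf") if net_savings <= 0 else HARDWARE_MARKUP / net_savings

if __name__ == "__main__":
    for n in (1, 2, 3):
        print(f"replacing {n} driver(s): payback in {payback_years(n):.1f} years")

With no overhead at all, a $20k markup pays back in under a year at that wage ($20,000 / $24,000 ≈ 0.8 years), so the two-year figure above implicitly assumes either a larger markup or significant running costs.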

  • by TWX ( 665546 ) on Wednesday May 23, 2018 @03:57PM (#56661632)

    After intentionally disabling safety systems and then running the cars on public roads, leading to the fatal collision, they should consider themselves lucky if any jurisdiction is willing to let them run again.

    If I were a mayor or town manager I'd ban them. If the state overruled me, I'd request that my police department ensure that their vehicles do not pose a danger to the public, which would probably mean being pulled over all of the time and inspected for any violations by the commercial enforcement team. I doubt that the person behind the wheel has the ability to prove that safety systems are enabled, so that might mean a lot of vehicles get stopped, fail to prove safety, and get towed back to the shop with a fix-it ticket.

    • This isn't your Facebook feed; you can't just post that kind of stuff without some sort of proof to back up that claim.

      • by Anonymous Coward
        • Of course they would need to "disable" one system to enable another. Phrasing sure can show bias. I am sure the 737 I am flying on today has had 'intentionally disabled safety systems' somewhere along the way. Should I be concerned? If I leave out the context, it sounds as if there are no safety systems left at all.

          Do you have an article showing how disabling those systems in favor of others caused this accident?

          • by TWX ( 665546 )

            And I read somewhere that not only did they disable the OEM system, but they disabled their own system too.

      • by Kjella ( 173770 ) on Wednesday May 23, 2018 @04:56PM (#56661994) Homepage

        This isn't your Facebook feed; you can't just post that kind of stuff without some sort of proof to back up that claim.

        I can't quite remember where I read it, but that car model has collision detection in its default configuration and would normally have performed an emergency brake when a collision was imminent. All the "smart" features were disabled in order to run Uber's SDC software, though from what I understand this is standard practice, so you don't have competing/conflicting automated systems (a toy illustration follows below). Nobody made a big deal of that part; it's just part of the explanation of how the car could plow into that pedestrian without reacting at all: they disabled a primitive system that worked and replaced it with a sophisticated system that didn't, one that actually performed worse than the car does out of the box.

        • Thanks for a more unbiased account. An AC above posted some links and from what I can gather all it proves is that they are saying "it wasn't our systems that failed!". Hardly what the parent was trying to imply. Yes it would be a serious mess if two systems tried to take over car controls.

          • They said rather more than "it wasn't our system that failed"; it was more along the lines of "our system would have applied the brakes one second before impact."
            That's much better than plowing through the person at 30+ mph, and that's despite the video footage being terrible, much worse than what the OEM camera would have seen.
            At 30 mph the XC90 can come to a complete stop in about 15 m, which is roughly one second of travel at that speed (worked out in the sketch below).

            Intel Corp.’s Mobileye, which makes chips and sensors used in collision-avoidance systems and is a supplier to Aptiv, said Monday that it tested its own software after the crash by playing a video of the Uber incident on a television monitor. Mobileye said it was able to detect Herzberg one second before impact in its internal tests, despite the poor second-hand quality of the video relative to a direct connection to cameras equipped to the car.
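
The 30 mph numbers above are easy to sanity-check. A minimal sketch in Python, taking the ~15 m stopping distance and the one-second detection window from the comments above and assuming constant deceleration (an idealization):

import math

MPH_TO_MS = 0.44704
v0 = 30 * MPH_TO_MS                        # initial speed in m/s (the comment's 30 mph figure)
stopping_distance = 15.0                   # m, from the comment
decel = v0 ** 2 / (2 * stopping_distance)  # ~6 m/s^2, assuming constant braking

# Pedestrian detected one second before (unbraked) impact, i.e. v0 * 1 s away.
gap = v0 * 1.0                             # ~13.4 m
v_impact = math.sqrt(max(v0 ** 2 - 2 * decel * gap, 0.0))

print(f"implied deceleration: {decel:.1f} m/s^2")
print(f"impact speed if braking starts 1 s out: {v_impact / MPH_TO_MS:.0f} mph")

Under those assumptions, braking that begins only one second before impact cuts the impact speed from 30 mph to roughly 10 mph; it does not stop the car, since a full stop from 30 mph takes a bit over two seconds.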

        • though from what I understand this is standard practice

          So you're criticising a company for the standard practice of not screwing up their algorithms by running a second set on top of the first? Gotcha. Thanks for clarifying. I thought you had something relevant to say about the topic.
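
For readers wondering what "competing/conflicting automated systems" means in practice, here is a deliberately toy sketch in Python. This is not Uber's or Volvo's actual architecture; the controller logic and the arbitration rule are invented purely for illustration of two independent controllers that each want to command the brakes and steering.

from dataclasses import dataclass

@dataclass
class Command:
    brake: float  # 0.0 (none) .. 1.0 (full)
    steer: float  # -1.0 (full left) .. 1.0 (full right)

def oem_aeb(obstacle_distance_m: float) -> Command:
    """Simplified factory emergency-braking logic: brake hard when close."""
    return Command(brake=1.0 if obstacle_distance_m < 20 else 0.0, steer=0.0)

def sdc_stack(obstacle_distance_m: float) -> Command:
    """Simplified SDC planner: it may prefer to swerve rather than brake hard."""
    if obstacle_distance_m < 20:
        return Command(brake=0.3, steer=0.5)
    return Command(brake=0.0, steer=0.0)

def arbitrate(primary: Command, fallback: Command) -> Command:
    """Single authority: take the primary's steering, but never request less
    braking than the fallback system asks for."""
    return Command(brake=max(primary.brake, fallback.brake), steer=primary.steer)

if __name__ == "__main__":
    d = 15.0  # metres to obstacle
    oem, sdc = oem_aeb(d), sdc_stack(d)
    print("OEM wants:", oem, "| SDC wants:", sdc)  # conflicting requests
    print("arbitrated command:", arbitrate(sdc, oem))

Without some arbiter like this, the two systems fight over the same actuators, which is why the usual practice is to hand exactly one system authority at a time, the "standard practice" discussed above.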

    • by stephanruby ( 542433 ) on Wednesday May 23, 2018 @07:25PM (#56662780)

      If I were a mayor or town manager I'd ban them.

      That's what San Francisco did before the Arizona accident. It specifically banned Uber back in December because its self-driving car ran a red light. See video [youtube.com] (wait for the 10-second mark).

      After that, Uber was allowed to test in California, just not in San Francisco. After the Arizona accident, Arizona, California, and one other state pulled Uber's permits to test cars on their public streets.

      Now Uber can only test on its own private track with fake pedestrians and fake bicyclists, and I really doubt it will ever be allowed to test its cars on public roads in California again. Thankfully, there are 50 other self-driving car companies in the US. In the San Francisco Bay Area I now see seven different types of self-driving cars, which seem to be multiplying in number; I just don't see the Uber ones anymore.

  • Must not harm humans...Must not harm humans...Must not harm humans...Must *carrier interrupted* Harm humans.
  • You'd think with all the tech they stole from Waymo/Google, they would've been better at this.

    Without autonomous cars, I wonder how Über plans to survive long-term. It seems unlikely investors will be willing to keep throwing millions and millions of dollars at them.

  • by mewsenews ( 251487 ) on Wednesday May 23, 2018 @04:15PM (#56661750) Homepage
    Is their business model going to be kill someone in a criminally negligent fashion, pull up stakes and move to a different state? Will they run out of states or fix their technology first?
  • Stick to union busting.

  • by TeknoHog ( 164938 ) on Wednesday May 23, 2018 @04:40PM (#56661882) Homepage Journal

    Ah, the good old days, when the progress of science wasn't hindered by a few statistical accidents. The age of discoveries. The space race, when you could at least pretend mankind had its aim at the stars, even if it was mostly political bickering between superpowers.

    Now it's all about safety and well-being for everyone, no child left behind. If there's any of that sci-fi tech we used to dream of still around, we might as well put ourselves in a stasis chamber and stay comfortably numb for the rest of eternity.

    • by Anonymous Coward

      Kind of ignoring the fact that the public at large is being conscripted into this experiment, aren't you, Sparky? It ain't just the guy testing the "autopilot" who's at risk. So's your sister and her kid.

    • by itsdapead ( 734413 ) on Wednesday May 23, 2018 @05:38PM (#56662260)

      Except that one of the much-vaunted benefits of self-driving cars is that they're going to be safer than human drivers...

      Anyway, this wasn't a freak accident: the safety driver was watching their phone instead of the road, and, if you believe the video they released, the car was driving too fast for the visibility conditions (the alternative is not to believe the video...).

      Even in the good ol' 60s, if Apollo 11 had landed on a civilian's head because the Heroic Astronauts were busy Tweeting, Questions Would Have Been Asked (like, "what the hell is Tweeting?").

    • by novakyu ( 636495 ) <novakyu@novakyu.net> on Wednesday May 23, 2018 @07:02PM (#56662704) Homepage

      Yeah, the good old days, when scientists like Marie Curie killed themselves through exposure to dangerous radiation rather than killing the public (I mean, yes, there are the "radium girls [wikipedia.org]," but Curie discovered polonium, not radium).

      P.S. To be actually serious, it's a good thing for autonomous cars that unscrupulous companies like Uber will be driven out of the business (if not "out of business" altogether). There are much more competent, ethical, and less-profit-crazy companies out there. They are the future of technology, not companies like Uber.

    • when the progress of science wasn't hindered by a few statistical accidents.

      If you're going to move 100% of your profits to Ireland, then to the Netherlands, and on to the Cayman Islands to avoid paying taxes, and otherwise not give back to society, don't expect society to tolerate you externalizing your research costs.

    • when the progress of science wasn't hindered by a few statistical accidents. The age of discoveries. The space race, when you could at least pretend mankind had its aim at the stars, even if it was mostly about political bickering between superpowers.

      Now it's all about safety and well-being for everyone, no child left behind. If there's any of that sci-fi tech around we used to dream of, we might as well put ourselves in the stasis chamber and be comfortably numb for the rest of eternity.

      Well, the usual ratio to consider is "risk"/"reward".

      I can see how self-driving taxis benefit a taxi company. I'm not seeing how they benefit me, or civilization in general.

    • There's a big difference between "a few statistical accidents" and "engaging in reckless behavior", which is effectively what Uber has been doing. Their incident rate is known to be several orders of magnitude worse than their competitors', yet they've been testing their vehicles on public streets nonetheless. The reason they were in Arizona to begin with is that they got kicked out of California after running red lights and operating without a license. While Goo

    • Yes, I remember the days when corporations were allowed to all but openly kill in pursuit of profits. Very few people remember them as the "good old days".

    • by AmiMoJo ( 196126 )

      Volunteering for a dangerous mission to Mars is fine, but random cyclists on public roads didn't volunteer to be part of Uber's SDC testing.

      There has to be a balance between progress and killing members of the public.

  • To burn the records and wipe the hard drive?
  • Why would a taxi company even try to develop self-driving cars? Why not wait for the pros to get it right and just purchase the hardware/software from them? Here's a list of companies in the self-driving car sector: GM, Google, Daimler-Bosch, Ford, Volkswagen Group, BMW-Intel-FCA, Aptiv, Renault-Nissan-Mitsubishi Alliance, Tesla, Toyota, Audi, Nissan, Peugeot, ........

    • Because those companies aren't going to sell the cars... not to Uber. They'll just make a fucking app themselves. Uber's only value is the install base, and that's primarily only valuable in the driver community. (They'll then make an interoperable app they all own and split the market nicely between themselves.)

  • The next time a couple of their cars are involved in an accident, they'll stop making and selling cars.
    • by sphealey ( 2855 )

      They will cooperate with the DOT and the NTSB to thoroughly investigate the accident and improve their systems.

      Although I'm pretty sure the Big 3, along with BMW, Toyota, etc., will have tested the "suddenly appearing obstacle" scenario, among others, before they put their semi-autonomous vehicles on public roads. Sometimes you can't replace 120 years of accumulated knowledge with "disruption".

      • I believe one way the NTSB and other agencies could usefully contribute is to publish a reference set of scenario "test cases", based on real-world accidents, for self-driving car researchers and manufacturers (a sketch of what such a record might look like follows below).

        There is no reason someone else should die because their car thought a trailer crossing a roadway was a road sign, didn't recognize a stopped fire truck on the highway, or followed the wrong lane line into a traffic divider.
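
Purely as an illustration of what such a published reference set might look like: a machine-readable record per incident would already be useful. Everything below, the Scenario type, its field names, and the identifiers, is hypothetical; only the three example incidents come from the comment above.

from dataclasses import dataclass, field

@dataclass
class Scenario:
    scenario_id: str
    description: str
    hazards: list[str] = field(default_factory=list)
    expected_behavior: str = ""

# Hypothetical reference scenarios drawn from the real-world incidents
# mentioned in the comment above.
REFERENCE_SCENARIOS = [
    Scenario(
        "trailer-crossing",
        "Tractor-trailer crossing the roadway ahead, easily misclassified as an overhead sign",
        hazards=["crossing vehicle", "misclassification"],
        expected_behavior="brake to a stop before the trailer's path",
    ),
    Scenario(
        "stopped-firetruck",
        "Fire truck stopped in a travel lane on the highway",
        hazards=["stationary vehicle in lane"],
        expected_behavior="detect the stationary obstacle and stop or change lanes",
    ),
    Scenario(
        "gore-point-divider",
        "Faded lane markings leading into a concrete traffic divider",
        hazards=["ambiguous lane lines", "fixed barrier"],
        expected_behavior="follow the correct lane line and avoid the barrier",
    ),
]

A published list like this would let every manufacturer replay the same incidents in simulation and report results against a common baseline.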

  • Back when the original crash was news, I saw the video on YouTube.

    It was apparently taken by a dashcam. The person walking a bike across the road was obscured until they were nearly in the lane directly ahead of the car. It wasn't clear until I looked closely, but the obscuring object seemed to be the (very out-of-focus) driver's-side windshield wiper.

    If the camera had been mounted somewhere else (like on the front side of the rear-view mirror) the pedestrian would have been clearly visible.

    Which made me wonder: Is that camera just for recording a view from the driver's seat? Or is it what the auto-driving software is using for vision?

    • by Anonymous Coward

      The video you saw was also very dark, because Uber edited it to turn down the brightness and contrast to deflect blame. That street was and is actually very well lit.

      Also, there is no legal requirement to cross at a crosswalk in that area, even if there were any crosswalks nearby. This isn't an interstate highway or anything like that, just a county road.

    • Which made me wonder: Is that camera just for recording a view from the driver's seat? Or is it what the auto-driving software is using for vision.

      A lot of people wonder. Sensible people don't let that conspiracy-theory crap out of the "stupid ideas" section of their brain. Look at the feed: if that were what was being used to drive, the car wouldn't have hit the pedestrian; it would have run off the road and wrapped itself around a tree.

      Now, in the meantime, does it seem likely that Uber would load a metric shitton of gear onto the roof racks of their cars and then use a grainy cabin cam to do the driving?

      If the former, why the HELL would you log, for your engineering analysis, what a separate camera sees but not what the CAR sees?

      What makes you think in any way shape or form that the
