Transportation

Uber Could Resume Testing of Its Self-Driving Vehicles this Summer (bizjournals.com) 56

Uber could resume its testing of self-driving vehicles this summer after a long pause on the program following a fatal crash in March. From a report: The possible restart of Uber's self-driving testing comes after the company conducted an internal safety review that led to 16 recommended improvements, according to a source familiar with the matter. Those changes would all be implemented before Uber returns its vehicles to the road. The recommendations "include developing emergency braking features to help minimize collisions if the main self-driving system fails," as first reported in The Information.

  • by Ol Olsoc ( 1175323 ) on Friday June 29, 2018 @10:39AM (#56865694)
    Remember when people were strutting around expounding on how safe these self-driving vehicles were?

    Guber driverless vehicles should be required to have a flashing light to let pedestrians know that, as in all things, Guber doesn't follow the rules. You've been warned, citizens.

    • So, keeping Uber off the roads should reduce pedestrian fatalities by, oh, 0.07% this year, eh?

      Based on the number of people killed by cars with drivers in 2017, of course. No statistics available for the whole country for 2018 yet....

      • So, keeping Uber off the roads should reduce pedestrian fatalities by, oh, 0.07% this year, eh?

        Based on the number of people killed by cars with drivers in 2017, of course. No statistics available for the whole country for 2018 yet....

        I'll repeat: remember when people were saying how safe these self-driving cars were? After all, the software is not as fallible as a human.

        I wonder what the pedestrian fatalities per mile driven were, compared to the unsafe human-driven cars?

        Regardless, the point stands: driverless cars indeed also kill pedestrians, so sorry, they are dangerous as well. Or does getting killed only count when a human is behind the wheel?

        • by dgatwood ( 11270 )

          I'll repeat: remember when people were saying how safe these self-driving cars were? After all, the software is not as fallible as a human.

          Nope. I remember folks saying that about Waymo/Google, but not Uber. Uber has always been a company that takes shortcuts and ignores rules/laws. Anybody expecting safe self-driving tech from them should have realized their mistake long before the first accident.

          Self-driving tech will eventually be much safer than humans, because it will be able to learn from mistakes on

          • Uber has always been a company that takes shortcuts and ignores rules/laws

            Based on the previous articles, it sounds like they completely skipped MIL/SIL/HIL (model-, software-, and hardware-in-the-loop) testing. Every issue they had should have been caught by ISO 26262 requirements and traceability.
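
            For readers unfamiliar with the jargon: software-in-the-loop testing means running the driving software's decision logic against simulated scenarios and checking its output automatically, long before a vehicle touches a public road. A minimal sketch of the idea in Python follows; every name and threshold here (Scenario, plan_braking, the 6 m/s^2 deceleration) is made up for illustration and is not anyone's actual stack.

            # Minimal software-in-the-loop (SIL) sketch: braking decision logic is
            # exercised against a simulated pedestrian scenario instead of real sensors.
            import unittest
            from dataclasses import dataclass

            @dataclass
            class Scenario:
                ego_speed_mps: float        # vehicle speed, metres per second
                obstacle_distance_m: float  # distance to the detected obstacle, metres
                obstacle_confirmed: bool    # perception has confirmed the obstacle

            def plan_braking(s: Scenario, max_decel_mps2: float = 6.0) -> str:
                """Brake when the obstacle is within 1.5x the distance needed to stop."""
                stopping_distance_m = s.ego_speed_mps ** 2 / (2 * max_decel_mps2)
                if s.obstacle_confirmed and s.obstacle_distance_m <= 1.5 * stopping_distance_m:
                    return "BRAKE"
                return "CRUISE"

            class PedestrianScenarioTests(unittest.TestCase):
                def test_nearby_pedestrian_triggers_braking(self):
                    # ~17.9 m/s is roughly 40 mph; stopping distance is ~27 m at 6 m/s^2.
                    s = Scenario(ego_speed_mps=17.9, obstacle_distance_m=25.0,
                                 obstacle_confirmed=True)
                    self.assertEqual(plan_braking(s), "BRAKE")

                def test_distant_obstacle_does_not_brake(self):
                    s = Scenario(ego_speed_mps=17.9, obstacle_distance_m=200.0,
                                 obstacle_confirmed=True)
                    self.assertEqual(plan_braking(s), "CRUISE")

            if __name__ == "__main__":
                unittest.main()

            Traceability in the ISO 26262 sense would then tie each such test back to a written safety requirement, so a disabled or failing braking path cannot slip through unnoticed.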

      • Extrapolate those fatalities to the number of human-driven cars on the road and your argument falls apart.
        • Or better yet, by miles driven. And really, you should control for the traffic, road, and weather conditions in which the SDCs are operating.
    • Just because Uber messed it up does not disprove the safety of self-driving cars. Uber tries to do everything on the cheap to boost their profits. Meanwhile, Google's cars would have spotted that pedestrian with the tech they were using four years ago.

      Self-driving cars can be safe, just not the el-cheapo versions Uber is trying to use.
      • by AvitarX ( 172628 )

        Uber isn't boosting any profits.

        They are trying to minimize burn, perhaps, and to have PR wins to get more funding.

        Nothing about Uber is about making money currently.

        Their hopes are either:
        1) drive traditional cabs out of business, then charge as much as they do, but not have the overhead of maintaining their own vehicles and the logistics that includes
        2) replace drivers with automation, allowing for much lower cost of service during less peak times (they'll still need extra cars during the busy times)

        or pe

    • More lies spouted off by the Uber PR machine.

      Uber also suspended its testing in San Francisco and Pittsburgh following the crash,

      Lie #1: Uber was permanently banned from testing in San Francisco in December several months before the crash. This is the reason why [theverge.com]. See the video for yourself. At the time, Uber placed the entire blame on the driver saying that he was the only one driving the car and that the system was disengaged, but internal documents later obtained by the New York Times show the direct opposite of that claim. That's lie #2.

      And please note, at that time Travis Kalanick was

    • Remember when people were strutting around expounding on how safe these self-driving vehicles were?

      No. Not at all. Because no one ever did. What they did say was that with consistency, repeatability, not wavering, not sleeping, and the ability to continuously improve, they *will be* far safer than human drivers can ever hope to be. But absolutely no one has made a universal claim like you just have, not even really silly people.

      • Remember when people were strutting around expounding on how safe these self-driving vehicles were?

        No. Not at all. Because no one ever did. What they did say was that with consistency, repeatability, not wavering, not sleeping, and the ability to continuously improve, they *will be* far safer than human drivers can ever hope to be. But absolutely no one has made a universal claim like you just have, not even really silly people.

        Here's some reading material for the "no one ever said that" crowd:

        Some of the words: "Driverless cars are designed to have almost a superhuman-like ability to recognize the world around them. This is because they use loads of sensors to gather tons of data about their environment so that they can seamlessly operate in a constantly changing environment."

        "The more data we feed it the more vocabulary it has and the more it can recognize what a pedestrian is. And we do the same thing with bicyclists, cars, t

        • "Superhuman", "Infinite ability" just non-committal words, and no doubt.

          Yeah, great words about the ability to learn, which is exactly what I said. Now go back and read what you quoted to me. I'll highlight some for you:
          The more data we feed it

          This one gives AV lovers hard-ons. The headline? "Google's Self-Driving Cars Are Ridiculously Safe" https://bigthink.com/ideafeed/ [bigthink.com]...

          So are you saying it's wrong? How many accidents have Google's self-driving cars caused in all that time? The answer is 1 in over 8 million kilometers of driving. That is a ridiculously AWESOME safety record, far better than even some of the best drivers out there.

          If you want more, you can google it

          Nope, thanks. I'll keep letting you reaffirm what I said. Got any more examples o

          • Sorry muchacho, through all of this wordsmithing back and forth, my point still stands.

            You merely don't accept words like "superhuman ability" and "infinite ability", claims that anyone against it is so stupid they don't realize it and are therefore wrong (AKA Dunning-Kruger), or "ridiculously safe" as examples of what my original point was. My original point was that people are expounding on just how safe AV technology is. Hyperbole and insults. Well okay, some people don't see such words as anything b

            • Sorry muchacho, through all of this wordsmithing back and forth, my point still stands.

              If you think so. Sure. But based on your inability to read, I can see why you think so.

              • Sorry muchacho, through all of this wordsmithing back and forth, my point still stands.

                If you think so. Sure. But based on your inability to read, I can see why you think so.

                You have the ability to choose what you will be. So for some reason you chose to be an asshole. I think you can do much better than that, but as you keep saying, I'm quite wrong.

            • Crap, I hate double replying, but yeah, I missed it the first time around:

              And you still haven't told me who you want the car to purposefully kill in the event of an unavoidable accident.

              The question doesn't actually exist outside of any stupid theoretical mind. A car will be programmed to take the safest course of action given that it couldn't possibly know enough variables to make the decision. That course of action is to attempt to stop in a controlled way. Maybe one day in 2218 when technology is fed perfect information about friction and the value to society of everyone around it and in it, then we can re-ask the qu

            • And you still haven't told me who you want the car to purposefully kill in the event of an unavoidable accident.

              Nobody.
              If the car just hits the brakes when it senses an obstacle ahead, then nobody can sue the manufacturer, since that will always be a legally acceptable choice.

              • And you still haven't told me who you want the car to purposefully kill in the event of an unavoidable accident.

                Nobody.

                If the car just hits the brakes when it senses an obstacle ahead, then nobody can sue the manufacturer, since that will always be a legally acceptable choice.

                Amusing. What if slamming on the brakes will get you killed by the guy tailgating you at 80 mph?

                Humans must make these choices; just slamming on the brakes is only one of many options. I have occasionally avoided accidents by speeding up, so hitting the brakes hard might be the worst possible decision.

                I gave you one link already from the people who might be writing the software or involved with self-driving cars; here's one from the auto industry: https://www.inverse.com/articl... [inverse.com]

                You might want to ca

        • There will be situations where the autonomy must deliberately kill people.

          Like what?

          Robocars only need one rule: if there is an obstacle ahead (any obstacle) and there isn't time to safely drive around it, then hit the brakes and hope for the best.
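
          That "one rule" can be written down almost literally. A toy sketch under the commenter's assumptions, in Python, with made-up names and thresholds (Perception, choose_action, the 2-second lane-change margin) and no claim about any real vehicle's control stack:

          # Toy version of the rule above: if an obstacle is ahead and there is not
          # enough time to steer safely around it, brake and hope for the best.
          from dataclasses import dataclass

          @dataclass
          class Perception:
              obstacle_ahead: bool
              distance_m: float          # distance to the obstacle in metres
              speed_mps: float           # own speed in metres per second
              adjacent_lane_clear: bool  # is there room to drive around it?

          def choose_action(p: Perception, lane_change_time_s: float = 2.0) -> str:
              if not p.obstacle_ahead:
                  return "CONTINUE"
              time_to_obstacle_s = p.distance_m / max(p.speed_mps, 0.1)
              # Steer around only if an adjacent lane is clear and there is enough time.
              if p.adjacent_lane_clear and time_to_obstacle_s > lane_change_time_s:
                  return "CHANGE_LANE"
              # Otherwise the default action: controlled hard braking.
              return "EMERGENCY_BRAKE"

          # Obstacle 20 m ahead at 15 m/s with no clear lane -> "EMERGENCY_BRAKE"
          print(choose_action(Perception(True, 20.0, 15.0, False)))

          The replies below argue about exactly the cases this sketch ignores: the tailgater behind you, the option of accelerating, and scenarios where every choice harms someone.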

          • There will be situations where the autonomy must deliberately kill people.

            Like what? Robocars only need one rule: if there is an obstacle ahead (any obstacle) and there isn't time to safely drive around it, then hit the brakes and hope for the best.

            Way overly simplified. There is an ethical dilemma that pops up from time to time in driving: the unavoidable accident where there is no good decision, where each choice will lead to the death of people. Humans caught in such a dilemma have to make these decisions, and slamming on the brakes might be the worst possible one. Maybe hitting the two bicyclists coming the other way will kill fewer people than running into the 10 bicyclists in your lane as you round the blind corner. Maybe driving

  • by Anonymous Coward

    Yup, they can test all they want at the Bonneville Salt Flats in Utah.

  • All safety drivers will now be 1099'ers and need to take full liability.

  • by Luthair ( 847766 ) on Friday June 29, 2018 @10:49AM (#56865756)
    Seems good. /rubberstamp
    • Seems good. /rubberstamp

      the company conducted an internal safety review that led to 16 recommended improvements

      Hopefully, one of those 16 is requiring the safety driver to pay attention and not watch pop TV shows while monitoring the car.

      • Obviously, improvements to government regulation should be called for as well. Expecting companies to "self-regulate" is an extreme conflict of interest.
      • Agreed.

        And hopefully one of those 16 is also having the obstacle detection system turned on, which it wasn't during the fatal crash.

  • Optimistically, you are in 4th place in self-driving tech, behind Waymo, Cruise, and Tesla (and really no one is close to Waymo, with 7M miles logged and 80k vehicles on order). This accident showed how far behind they really are, and the details of their legal fight show how Waymo was right all along about Levandowski and the methods he wanted to take. He wanted to push the envelope and get vehicles on the road when they wanted to be conservative and test as much as possible. Now that $500M for

  • by sjbe ( 173966 )

    Uber could resume its testing of self-driving vehicles this summer after a long pause

    Don't care, as long as they do it somewhere else where I don't literally get crushed by their arrogance and stupidity and greed.
