Transportation Technology

Uber's Self-Driving Cars Were Struggling Before Arizona Crash (nytimes.com) 284

An anonymous reader quotes a report from The New York Times: Uber's robotic vehicle project was not living up to expectations months before a self-driving car operated by the company struck and killed a woman in Tempe, Ariz. The cars were having trouble driving through construction zones and next to tall vehicles, like big rigs. And Uber's human drivers had to intervene far more frequently than the drivers of competing autonomous car projects. Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per "intervention" in Arizona (Warning: source may be paywalled; alternative source), according to 100 pages of company documents obtained by The New York Times and two people familiar with the company's operations in the Phoenix area but not permitted to speak publicly about it. Yet Uber's test drivers were being asked to do more -- going on solo runs where they had previously worked in pairs. There was also pressure to live up to a goal of offering a driverless car service by the end of the year, and to impress top executives.

  • by hey! ( 33014 ) on Saturday March 24, 2018 @09:03AM (#56318439) Homepage Journal

    Why does that sound so familiar?

    Oh, wait. I'm a software developer.

  • by Teckla ( 630646 ) on Saturday March 24, 2018 @09:03AM (#56318441)

    Self-driving cars are mostly hype. They're only really self-driving on very good, very clean, very well-mapped roads. Take them out of perfect conditions, and they fail miserably.

    That being said, the technology is still cool, even though it has a long, long way to go. A lot of the technology could eventually be incorporated into normal everyday cars to help human drivers avoid accidents.

    But the hype, at this point, is kind of out of control.

    • by bill_mcgonigle ( 4333 ) * on Saturday March 24, 2018 @09:33AM (#56318549) Homepage Journal

      The problem with the hype is that high early expectations of perfection will drive fear, which may lead to regulations that ultimately cause more deaths. Give it ten years and this stuff will probably outperform human drivers, but watch one kid chase a ball out in front of a robot car and, say, Utah will ban the technology.

      • by Anonymous Coward

        You apparently don't work in the industry. I do. The "hype" is to attract investors. It's purely for money. The actual timelines I've seen from the companies developing the sensors don't even suggest autonomous driving is possible in 10 years. There aren't any sensors that can detect common road obstacles, and there are no known solutions. A child in the road? Can't see it. This is by their own admission. Remember the time when a car crashed head on into a semi-truck? Can't see them either -- they admitted that, too.

        • So Waymo being able to drive 5,600 miles without human intervention is just a lie then?

          • Re: (Score:3, Insightful)

            80% of driving is easy; 20% of it is full of a million edge cases like this one. Furthermore, edge cases like this are easily avoidable if you have a vested interest in choosing where and when to drive. Waymo could easily rack up 5,600 miles and never hit an edge case. We only learned about this one because of a bad test driver, and it gave us a window into how bad the situation really is.
            • What about this makes it an edge case? I appreciate that most of driving (on a per-mile basis) is keeping in lane, following the speed limit, and not running into other cars, and that this part is relatively trivial to manage. Detecting traffic signs/signals is the next increment, and defensive driving follows.

              But to say that a pedestrian at an unmarked crossing is an "edge case", I think, belies what driving is.

              I will accept a beach ball blowing across a highway as an edge case though.

    • by monkeyxpress ( 4016725 ) on Saturday March 24, 2018 @10:35AM (#56318743)

      Self-driving cars are mostly hype. They're only really self-driving on very good, very clean, very well-mapped roads. Take them out of perfect conditions, and they fail miserably.

      But even a car that could only drive under such conditions would be extremely useful. Take, for example, public buses. They drive around the exact same route every day, and in many places the route is upgraded with special lanes and signalling infrastructure to make their job easier. There is no reason why you couldn't start by replacing such bus routes in cities with moderate weather conditions. Over time, a combination of road infrastructure improvements (special lanes, intersection redesigns, beacons, etc.) and the tech getting better could easily expand out to cover the majority of vehicle uses in a city. Again, we do this for bicycles and buses, so why is it impossible to imagine it would make sense to do some road works to cater to driverless cars?

      Another example is motorway driving. Motorways are already an extremely controlled and regular environment. It would be great to have a driverless truck that can go door to door, but there is no reason why we can't start with depots built off the side of motorways where local human drivers pick up and drop off long-haul trailers. As for weather conditions, guides in the roadway and other navigation infrastructure could be added if these problems cannot be dealt with using lidar and cameras.

      Yes, I agree that a car you can just dump into an unknown urban environment is a long, long way off. But I don't understand the fascination with meeting this goal before driverless vehicles can be useful to us.

    • Self-driving cars are mostly hype. They're only really self-driving on very good, very clean, very well-mapped roads. Take them out of perfect conditions, and they fail miserably.

      I have a solution to this. Let's develop some technology and put it through an iterative improvement process combined with years of trials in the real world in increasingly complex scenarios. It's a shame no one has thought of doing this.

      By the way, hype has two possible meanings:
      a) It's publicised a lot, to which I say: so? That's kind of the point of any new technology that many groups are working on.
      b) Its benefits are exaggerated: no, they're not.

      So yes, self-driving cars are currently hyped up. That is normal.

  • Nice company (Score:2, Offtopic)

    by tsa ( 15680 )

    Really nice company, Uber. Their 19th-century attitude towards their employees surely will make them do their very best.

    • Re: Nice company (Score:3, Interesting)

      Their mutual rating system actually does do just that. Get used to the gig economy - your 19th century factory model is going away.

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        What? The gig economy IS the 19th-century factory model. Back then, workers assembled at the factory gates and the bosses pointed out who got to work that day. No job security, bad wages, no insurance.

        The gig economy is just a scam trying to fool people into believing that they are "freelancers" or "consultants" when they are in fact making slave wages working more hours than is legally allowed. A freelancer or consultant can set their own wage and negotiate on it. In the gig economy you have to accept whatever wage you are offered.

  • by aaarrrgggh ( 9205 ) on Saturday March 24, 2018 @09:19AM (#56318485)

    Ok, so forcing them to liquidate might be extreme, but clearly there is some kind of regulatory framework missing here!

    I hope the victim has some relatives who want to get rich, though.

    • The regulatory framework is there... Arizona made regulations allowing them to do this. Clearly they didn't think deeply about the problem; they just trusted tech companies.
    • by novakyu ( 636495 )

      Two words: tort liability.

  • by BeerCat ( 685972 ) on Saturday March 24, 2018 @09:23AM (#56318497) Homepage

    Clearly, not all autonomous vehicles are the same.

    It's much like cameras - a cheap one and an expensive one will both offer "autofocus" and a "zoom lens".

    The cheap one will have 3 or 4 focus settings, while the expensive one will be continuous. The cheap one will have 2 or 3 zoom settings, while the expensive one will, again, be continuous.

    So, Uber's cars look to be at the "what is the minimum that can make a car steer itself" end of the scale, while the Google ones are at the "have we missed anything off the long list of things that will help a car steer itself" end.

    • The problem is:
      1) It is damn expensive to do it right, and
      2) There is no standard method to prove if someone is doing it right before allowing them on public roads
      • 1) This is Silicon Valley. They have lots of money to pursue difficult technology like this.
        2) There is a standard measure already: average distance between interventions by the human monitor. Waymo is clearly ahead here, at nearly 5,600 miles to Uber's 13. When that average distance reaches some agreed, very high value, self-driving will be ready for general use.
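        As a minimal sketch of how that metric works, assuming hypothetical log data (the field names and numbers below are invented for illustration; real disengagement reports, such as the California DMV filings, have their own formats):

          # Python sketch: computing "miles per intervention" from test-drive logs.
          # All data here is made up for illustration.
          drive_logs = [
              {"miles": 312.4, "interventions": 2},
              {"miles": 145.9, "interventions": 0},
              {"miles": 508.1, "interventions": 1},
          ]

          total_miles = sum(log["miles"] for log in drive_logs)
          total_interventions = sum(log["interventions"] for log in drive_logs)

          # Guard against a fleet that logged no interventions at all.
          if total_interventions:
              print(f"{total_miles / total_interventions:.1f} miles per intervention")
          else:
              print(f"no interventions in {total_miles:.1f} miles")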

        • 1) But they have to make something marketable. So why didn't Uber have a LIDAR array?
          2) That's a poor measure. We need to know exactly how many and what type of obstacles the cars are dealing with in those miles. The biggest problem is, self-driving is easy 80% of the time. It only gets hard for the 20% of edge cases like this one. It doesn't really matter how many successful miles there are if we don't know how well they deal with difficult edge cases.
        • Another thing I should have added for 2): Uber *was* making its requirement for interventions, and someone died. So obviously the measurement is bull.
    • by mikael ( 484 )

      What they are doing at the moment is like teaching someone to drive a car, where the instructor takes over when something goes wrong. What they also need to be doing is acting like a flight school, where pilots are trained to handle things when they go wrong. Otherwise, there are always going to be high-profile accidents in the news, like the Viola Group accident:

      https://www.telegraph.co.uk/ne... [telegraph.co.uk]

      I've seen these kinds of bridges in Norway. One end of the road at the bridge becomes a solid concrete wall.

  • Contrary to popular belief, the term jaywalking does not derive from the shape of the letter "J" (referencing the path a jaywalker might travel when crossing a road). Rather, it comes from the fact that "jay" used to be a generic term for someone who was an idiot, dull, rube, unsophisticated, poor, or simpleton. More precisely, it was once a common term for "country bumpkins" or "hicks", usually seen incorrectly as inherently stupid by "city" folk.

  • The smartest AI we have right now, not counting IBM's Watson, is about as intelligent as a brain-damaged cockroach -- and that is an actual scientific assessment from real scientists; I think Time posted that article. So they're supposed to know a truck from a building in full context? Really? Does anyone expect a computer to be on par with a human brain on this one? That's ridiculous. We're decades away from a working self-driving car.
  • Why would a company want to get into the self-driving car angle of the rideshare market?

    We see the reports of the incredibly low profits the drivers currently make, and Uber wants to buy a bunch of cars as well? The one thing that drives down profits?

  • Uber lost $4.5 bn last year on revenue of $7.5 bn. They were rated next-to-last in a large survey of autonomous driving (AD) technology and strategy.
    They are the ones with the largest stakes, as their current business model is less than solid, and they really need to bring AD to the streets ASAP in order to survive.

    This accident shows how far they are from that - this accident was not a difficult case from any perspective, sensory included (the video shows dashcam footage, without any kind of HDR functionality).

    • Uber lost $4.5 bn last year

      That's misleading as hell. Uber is profitable, with shitloads of cash flow; they chose to spend that money. Are their investors stupid? Perhaps, but that's another debate.

      • by DrTJ ( 4014489 )

        I don't think that is particularly misleading. It just tells us that it is ridiculously expensive to develop AD, and that Uber needs to keep doing that for quite some time yet before they can start to make money from it. Will the investors sign up en masse for a long period? I doubt it.

      • Yes, they spent that money subsidizing the cost of the rides (especially the drivers). We call that a "cost" to their operations, which reduces profit.

        They are not profitable. They will never be profitable at current ride rates unless they eliminate their biggest expense, which is drivers.
  • Human drivers will kill about 100 people in the USA today; worldwide it will be close to 2,000. Driving also consumes the valuable resource of 4.5 million workers in the USA alone.

    Uber might be an easy company for some people to hate (and there are even some people with a financial incentive to hate them), but other than the Slashdot hate for cryptocurrencies, this is excessive. They also might have the worst self-driving car on the road, but someone has to be the worst*. We need self-driving vehicles as soon as possible.
    • You're assuming that the only way to learn and improve the systems is to allow situations like the one that just happened. But there are ways to reduce risk. For example, keep the auto-braking system from Volvo engaged; if it is triggered and the AV didn't respond, you now have a learning moment that didn't require a death.
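      A rough sketch of that "learning moment" idea, assuming an invented logging interface (the Frame fields and the disagreement rule below are hypothetical, not Volvo's or Uber's actual systems):

        # Python sketch: run a proven automatic emergency braking (AEB) system in
        # shadow mode next to the autonomous stack and log every frame where AEB
        # would brake but the AV did not. All names and data are invented.
        from dataclasses import dataclass

        @dataclass
        class Frame:
            timestamp: float
            aeb_would_brake: bool  # verdict from the production AEB system
            av_is_braking: bool    # what the autonomous stack actually commanded

        def find_disagreements(frames):
            # Each returned frame is a near-miss to review offline --
            # a lesson learned without requiring a collision.
            return [f for f in frames if f.aeb_would_brake and not f.av_is_braking]

        frames = [
            Frame(10.0, False, False),
            Frame(10.1, True, True),   # both systems agree: brake
            Frame(10.2, True, False),  # AEB saw an obstacle the AV ignored
        ]

        for near_miss in find_disagreements(frames):
            print(f"review t={near_miss.timestamp}: AEB fired, AV did not brake")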
    • We need self-driving vehicles only IF they ever make a dent in the fatality rate. So far I see no evidence of that, and I'm not prepared to give them leniency on killing people now based on a mere promise; I don't think anyone should. All the evidence right now is that automation isn't coming anywhere close to human safety, which is really quite safe if you put it in the perspective of the miles humans drive every day. Being overweight is probably more dangerous, and people volunteer for that.
  • Cars are not allowed to do that. Period. People are blind, kids fail to pay attention, roads are icy, people have strokes and heart attacks crossing the street, etc. SDVs *must* accommodate them.

    SDVs will never take humans out of the loop, as all software is written by humans.
