Transportation

A Self-Driving Uber Car Went the Wrong Way On a One-Way Street in Pittsburgh (qz.com) 254

An anonymous reader writes: Uber driver Nathan Stachelek was pulled off to the side of the road when he saw the self-driving car turn the wrong way. It was the night of Sept. 26 and the car he had spotted, one of the autonomous Ford Fusions that Uber is testing in Pittsburgh, Pennsylvania, was heading through the city's Oakland neighborhood, just steps from the center of campus for the University of Pittsburgh. Stachelek watched the car turn off Bates Street and onto Atwood, a one-way road, going in the wrong direction. From a distance he couldn't tell whether the car was driving itself, or its human operator had made a mistake. Stachelek took out his phone in time to shoot a brief video of Uber's vehicle backing up and driving away, then uploaded it to Facebook. "Driverless car went down a one way the wrong way," he wrote. "Driver had to turn car around."
This discussion has been archived. No new comments can be posted.

  • In all fairness (Score:5, Interesting)

    by Anonymous Coward on Tuesday October 04, 2016 @02:44PM (#53012591)

    In all fairness, I've done the same in Pittsburgh. Was visiting, not familiar with the city and you guys do love your one way roads. Luckily I figured it out pretty damn quick.

    • Re:In all fairness (Score:4, Insightful)

      by neoritter ( 3021561 ) on Tuesday October 04, 2016 @02:47PM (#53012611)

      In all fairness, for self-driving cars to live up to the claims that proponents are making, they can't do this.

      • Re:In all fairness (Score:5, Insightful)

        by NatasRevol ( 731260 ) on Tuesday October 04, 2016 @03:01PM (#53012719) Journal

        I don't think anyone anywhere is claiming that self-driving cars will be perfect.

        That's just stupid to expect.

        Lowering the 100,000+ deaths per year in the world due to humans driving is the actual goal.

        • What they are claiming is that they are ready to be on public roadways, which they aren't if they are still doing things like this.
          • They're ready, once they're as safe as humans on the road. Which happened several years ago.

            • Mathematically, "as safe as" = "as dangerous as".

              Marketingmentally they're quite different.

            • Really? So the Uber car was able to figure out on its own that it was going the wrong way and get out of the situation like a human would? Or was it totally oblivious to its own mistake? The reaction to the mistake is more important than the fact that the mistake was made.
              • So, you're saying there are no human accidents on one-way roads? Or are you just being specious for no reason?

                • I'm saying the chance is far less likely. It is far less surprising that one out of a million humans would make this mistake, because of the sheer number of drivers. On the other hand, there are only 20? 30? automated Uber cars on the road; if they are as safe, then accidents for them should be statistically negligible if not impossible. This is not even taking into account all the extra limitations that have been placed on Uber cars that are not on human drivers. Not sure how you could say automation is a
                  • Except accident rates are measured per mile driven. And automated cars have repeatedly been shown to be much safer per mile than almost any grouping of humans.

                    • Cite your source. I don't even know of any automated car that works in all the situations a human works in, so there is no way to compare. At the very least, you would have to omit human accidents that happened in tricky situations that autonomy wouldn't attempt. For example, if autonomy will not pass other vehicles in the oncoming traffic lane, then omit any human accident that happened while passing in the oncoming lane.
                    • http://www.techtimes.com/artic... [techtimes.com]

                      "After the adjustments were made, the Virginia Tech study estimates that human-driven vehicles find themselves in 4.2 crashes per million miles, as opposed to self-driving cars that find themselves in 3.2 crashes per million miles."

                      So automated driving was, in late 2015, already ((4.2 − 3.2) / 4.2) ≈ 24% safer.
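For anyone checking the arithmetic, here is the relative reduction worked out from the two rates quoted from the study above (a quick Python sketch; the only inputs are those two numbers):

```python
# Crash rates quoted above (crashes per million miles,
# Virginia Tech study as reported by Tech Times).
human_rate = 4.2          # human-driven vehicles
self_driving_rate = 3.2   # self-driving vehicles

# Relative risk reduction: (human - self-driving) / human
reduction = (human_rate - self_driving_rate) / human_rate
print(f"{reduction:.1%}")  # prints "23.8%"
```

Note the parentheses matter: 4.2 − 3.2/4.2 would give a very different (and meaningless) number.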

                    • Ah yes, I'm sure a study backed by Google would be accurate. The fact is that there are still humans in the Google cars, so of course they are prevented from getting into serious accidents. That is happening because the human takes over, not because of the AI. All that aside, the article backs up my point a few sentences later: "self-driving cars are still not widespread enough to check if the safety technologies included in these vehicles actually hold up against the myriad of real-life situations that can be experi
                    • Now that I have read the article again, this study doesn't do what I said at all. All they did was add a certain number of accidents that would be under-reported by people. There was no omission of the types of human accidents that wouldn't apply to autonomy. They are comparing apples to oranges and they know it, which is why they included the line about there not being enough real-world statistics.
        • by tomhath ( 637240 )

          Lowering the 100,000+ deaths per year in the world due to humans driving is the actual goal.

          Making a profit on self-driving car technology is the goal. Anything else is a byproduct.

        • The problem is that the people who DO die should not be doing so because a machine fucked up.
        • we're kind of an all or nothing people. I don't know if I'd call it an endearing trait but it's certainly one of ours.
      • The claims are to be better than people - I don't see anyone claiming perfection.
        • by CODiNE ( 27417 )

          I turned the wrong way onto a one-way street driving through Oakland, CA. I was looking for parking at the same time, and the streets were confusing for an out-of-towner. Considering all the tourists in San Francisco, it probably happens fairly often. I'd trust most self-driving cars more than myself in that town, especially at night.

          I'll just let other people Beta test them however. :-)

      • In all fairness, for self-driving cars to live up to the claims that proponents are making, they can't do this.

        If Google Maps isn't sending drivers the wrong way down that street, I doubt very much that the car's software would make that mistake.

        Since the car had a driver in it, I'd be willing to bet that the vehicle was under human control. But even if it wasn't, the software will be fixed, and no Google car will ever make that mistake again, whereas you can be quite certain that human drivers will c

      • In all fairness, for self-driving cars to live up to the claims of their proponents, they just have to do this less frequently than people do it.
        • But they also have to be as good at getting out of the situation on their own. I think this is the biggest concern: that the car will just stay confused and not correct itself by turning around without hitting anything and recalculating the route.
      • In all fairness, for self-driving cars to live up to the claims that proponents are making, they can't do this.

        In all fairness, the proponents aren't claiming that SDCs are perfect, just better than HDCs.

        Also, whatever bug or DB error that caused this specific problem has probably already been fixed. SDCs will improve. HDCs will not. You can't fix bugs in wetware.

      • Not until all cars are self driving, at least. Then most traffic laws would be a bit redundant I would think; let the cars figure out the quickest safe route through coordination with servers and with other cars without any need for further restrictions.
      • by GuB-42 ( 2483988 )

        From a distance he couldn't tell whether the car was driving itself, or its human operator had made a mistake.

        It is not even certain that the car was self-driving.

    • by goombah99 ( 560566 ) on Tuesday October 04, 2016 @02:52PM (#53012649)

      This summer in Manhattan, between Battery Park and Greenwich Village, Google Maps told me to turn the wrong way onto a one-way street, a major road that has always been one-way. Apple Maps on my wife's phone got it right. If Google can mess up that spectacularly in the most well-characterized city in the world, this is not surprising.

      • by reanjr ( 588767 )

        Self-driving cars absolutely must not rely on map data or GPS to operate properly. A self-driving car must be able to read road signs and traffic conditions at least as well as a human in order to claim the title self-driving. Not to say a self-driving car might not still make the same mistake, but if it makes the mistake 100% of the time due to incorrect map data, it is not a self-driving car. It's a particularly advanced rail system.

      • It used to tell me to get on a freeway going the wrong way. I tried informing them and years later they hadn't fixed it.

    • Re:In all fairness (Score:4, Insightful)

      by Anonymous Coward on Tuesday October 04, 2016 @03:00PM (#53012713)

      Murphy's Law of Computing #8:
      To screw up is human, to screw up royally requires a computer.

      The issue has never been one self-driving car screwing up vs. one driver screwing up; the machine will eventually beat the human there (and arguably it already has). The issue is that one mistake in a map update, or some defect in the algorithm, could mean cars full of people driving over the edge of an incomplete bridge for hours on end. That has always been the reason for my reluctance to embrace self-driving cars enthusiastically.

      • Murphy's Law of Computing #8:
        To screw up is human, to screw up royally requires a computer.

        Corollary: Humans can screw up bigger, faster, and better with computers. Also see: Epic Fail.

    • by SeaFox ( 739806 )

      In all fairness, I've done the same in Pittsburgh. Was visiting, not familiar with the city and you guys do love your one way roads. Luckily I figured it out pretty damn quick.

      We forgive you because you don't have a GPS embedded in your head that constantly tells you where you are and has the direction of all roads mapped.
      Add to that, this is a test limited to a single municipality. It's not a case of "oh, well, the GPS map was out of date because we won't be aware of construction being done three states away immediately." This is a relatively small test bed, and Uber should be watching like a hawk for local issues to update the test vehicles.

    • by AK Marc ( 707885 )
      I've done so "legally". The side-street I approached the one-way from didn't have any markings that the street was one-way. So, with no way to know the road layout, turning the wrong way onto it was technically legal. I did not disobey any traffic marking. Once I determined it was a one-way street, and I was going the wrong way, I turned around. No law was broken.

      It's not illegal to go the wrong way on a one-way street. It's illegal to disobey road markings.
  • A quick check shows Google, Apple & Bing maps all know Atwood is a one-way street.
    I'm not even a resident of Pittsburgh or a student at CMU and I could figure that out.
    • Hey, it worked fine on the machine in my office too... Must have been a hardware problem, no way the software is to blame! (Said many programmers to me in the past.)

  • As expected, the car was undamaged and the only collateral damage was a few kids and kittens crushed in the process

  • by dcooper_db9 ( 1044858 ) on Tuesday October 04, 2016 @03:07PM (#53012753)

    he couldn't tell whether the car was driving itself, or its human operator had made a mistake

  • Zoning needed (Score:5, Insightful)

    by GeekWithAKnife ( 2717871 ) on Tuesday October 04, 2016 @03:12PM (#53012807)

    We need regulating bodies and driverless car makers to agree on standards and zoning.

    A driverless car has sensors, not eyes and spatial awareness. It has GPS and map data, not a sense of direction.

    If the data fed to the car says it can turn into oncoming traffic (and there are no vehicles nearby, so the sensors don't alert some wannabe AI), it will turn. A human who makes the same error will very quickly notice they are going the wrong way without needing other cars as a cue: they might notice that (most) parked cars face a certain direction, or spot signs like "no entry" and the corresponding road markings.

    Car AI cannot yet read these properly. Forget reading them in time, or when it's raining and the sign is slightly eroded or placed at an odd angle.

    A human can spot a branch hanging from power lines, dangling in the wind; a sensor designed to avoid collisions with other cars cannot.

    I'm certain that driverless cars will get much better and will very quickly be safer than a human driver despite these and other faults BUT to make it all so much safer we need approved zones. Like zoning for congestion or weight/height limits.

    Car manufacturers will know that in these specific zones/highways they can expect a rather predictable set of road conditions. A human can drive the car out of some odd city intersection with angry, aggressive drivers at rush hour, then switch to autopilot for that boring and predictable 100-mile highway journey. (Or not, if you like that sort of driving.)

    When a driverless car can navigate from A to B across a busy city in India, it might be ready to do away with zoning, but until then zoning is simply necessary, and I believe it's just a matter of time until it happens.
    • John Zimmer from Lyft describes an evolution in his Medium article [medium.com] that would address the issues you raise.
    • by Ichijo ( 607641 )

      A driverless car has sensors, not eyes and spatial awareness.

      Substituting "cameras" for "eyes," why can't a driverless car have all of the above?

    • We should really make driverless the default and keep the distracted, poor sighted, slow reacting humans out of traffic in congested areas. If you want to make things work smoothly, that's the better solution.

    • People drive the wrong way on freeways and freeway entrance / exit ramps regularly. You are likely to die when you do it, but it does happen.

  • From a distance he couldn't tell whether the car was driving itself, or its human operator had made a mistake. Stachelek took out his phone in time to shoot a brief video of Uber's vehicle backing up and driving away, then uploaded it to Facebook. "Driverless car went down a one way the wrong way," he wrote. "Driver had to turn car around."

    Well, was it driverless or did it have a driver? If it had a driver, was the driver in control? Which would make it just a funny looking car and a confused human operator?

    Verdict: meh.

  • I got a ride from a friend one time, and she went the wrong way down a one-way, too. It nearly got us killed, because there was no stop sign at the intersection coming from the wrong direction, but cross traffic did not stop. It happens. The only difference: with Uber, you can correct the software. With a human driver, you're constantly fighting stupid.
    • Pretend that there weren't human drivers in the Uber car. Do you think the autonomy would have been able to deal with the situation safely once realizing the mistake, as your friend apparently did?
  • We all have... I would also add that, in this day and age, a modest mapping device installed on squad cars
    in metro areas could record data that the city's map makers are unable to maintain. Very high leverage in rural areas.

    As the Waze application has demonstrated, mapping and traffic feedback is darn easy.

    Waze might have a class of users ("city+state roads, police") whose reports of map errors,
    accidents, and obstructions carry +10 reliability points.

    Facts like this today are just data. The community can hel
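    That reliability-points idea could be sketched roughly like this (the user classes, point values, and threshold are all made up for illustration; this is not how Waze actually works):

```python
# Hypothetical sketch: weight map-error reports by reporter class and
# accept a report once enough reliability points have accumulated.
RELIABILITY_POINTS = {
    "police": 10,        # trusted "city+state roads, police" class
    "road_crew": 10,
    "regular_user": 1,
}
ACCEPT_THRESHOLD = 10    # points needed before the map is updated

def report_score(reporter_classes):
    """Total reliability points across a set of reports of the same issue."""
    return sum(RELIABILITY_POINTS.get(c, 0) for c in reporter_classes)

def accepted(reporter_classes):
    return report_score(reporter_classes) >= ACCEPT_THRESHOLD

print(accepted(["police"]))            # prints "True": one trusted report suffices
print(accepted(["regular_user"] * 3))  # prints "False": 3 points is below threshold
```

    One squad-car report would clear the threshold immediately, while ordinary user reports would have to accumulate before the map changes.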

    • by PPH ( 736903 )

      a modest mapping device installed on squad cars

      Every route to the donut shop accurately mapped.

      • a modest mapping device installed on squad cars

        Every route to the donut shop accurately mapped.

        As long as the map is correct.. It is a start. ;-)
        I have nothing against donuts BTW.

  • Must be a slow news day, I guess.

  • by dohzer ( 867770 ) on Tuesday October 04, 2016 @04:43PM (#53013541)

    Oh good, they've already learned to take shortcuts!
