AI Transportation Technology

Waymo Self-driving Cars Are Having Problems Turning Around Corners (siliconangle.com) 245

Alphabet's Waymo has long been regarded as the leader in autonomous vehicle development and technology, but all might not be well at the company, according to a report published Tuesday. From a report: The Information quoted a number of unnamed Waymo insiders who claim the vehicles being used in the Arizona ride-hailing test have numerous problems. The test, which launched in November, is meant to be converted into a full commercial service later this year. The report claimed that the autonomous Chrysler Pacifica minivans struggle to handle a number of driving tasks and even go so far as to annoy the human drivers around them. Top among the problems is an apparent issue with turning left. "The Waymo vans have trouble with many unprotected left turns and with merging into heavy traffic in the Phoenix area, especially on highways," the report noted. "Sometimes, the vans don't understand basic road features, such as metered red and green lights that regulate the pace of cars merging onto freeways." If having problems turning left weren't bad enough, they also apparently have problems turning right on occasion. One woman claimed that she almost hit a Waymo vehicle as it suddenly stopped while trying to make a right turn.
  • by Anonymous Coward

    I really cannot fathom these monstrosities on the same road as myself. Time and time again it has been proven that a human driver is needed to intervene to keep these 'auto'mobiles in check. Yet, each one of these companies claims they'll be fully autonomous within a year or so, and each time it gets delayed again and again. If there's any indicator of where we're at, just take a look at the newly released chat with the former Tesla worker. When will Silicon Valley and its ilk stop spreading false hopes and fl

    • Re: (Score:3, Insightful)

      I'm actually surprised about the progress that has been made already. I'm pretty optimistic about this stuff, but 5 years ago I would not have predicted that we would have come as far as we have, not just with experimental vehicles but with semi-autonomous tech that is already available on high-end consumer vehicles. Sure, there are false hopes and unrealistic expectations being raised, that's what Silicon Valley is all about after all, but this is not 5 decades out either. I think automakers like Tesla (i
    • I really cannot fathom these monstrosities on the same road as myself.

      I really cannot fathom the monstrosities known as incompetent drivers on the same road as myself, not knowing how to handle left turns in intersections (the left turn always yields unless there is a protected left signal!) and not staying in their lane and not knowing how to signal at a roundabout, but there they are, and they mostly don't crash.

      Many more will die in the same way as in the Tempe, AZ accident.

      That same collision could reasonably have happened with a human driver. Many more die every day due to the failings of human drivers. You're nowhere near a point.

      • Human drivers manage well over 460K miles without an accident. Considering Waymo vehicles can't make a turn without possibly causing an accident, what are they up to now? 12 miles?
        • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday August 29, 2018 @09:45AM (#57217340) Homepage Journal

          Considering Waymo vehicles can't make a turn without possibly causing an accident what are they up to now?

          You can't cause an accident by stopping in the middle of a turn at an intersection. You can, however, cause an accident by following too closely. People coming to an abrupt stop should be an expected action. An animal or a human could run out in front of their car, or a bag could just blow out in front of them too quickly for them to see what it is. Something could fall off of the car in front of them.

          I don't want to let Waymo off the hook here completely, it's still ridiculous behavior. But blaming them for a collision with someone behind them is even more ridiculous if their vehicle isn't in reverse at the time. And it's still more ridiculous when no collision in fact occurred.

          • The onus is on Waymo to drive in a predictable fashion here. The fact that humans sometimes follow too closely is a consideration when driving. I *always* know who is behind me and how close they are so that I know how to handle the situation if I must stop quickly. Is the driver that is following too closely at fault in a specific accident? Sure. But should Waymo identify this as a factor in proper defensive driving and have their cars do it as well? Absolutely. As more time goes on, the responsibil
            • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday August 29, 2018 @10:03AM (#57217536) Homepage Journal

              The onus is on Waymo to drive in a predictable fashion here.

              No, it really isn't. Unless Waymo is trying to cause a collision, the onus is on the following driver to watch out for irrational behavior. Humans are often irrational. Often, when following one driver, I see them do literally a dozen different irrational things in the space of a couple of blocks. They speed up, they slow down, they drift lanes, they start a lane change and then stop it for no reason... If I assumed they would be driving rationally and predictably, I would hit them. I don't, so I don't. The law is quite clear that if I run up their ass, I'm at fault.

              Is the driver that is following too closely at fault in a specific accident? Sure. But should Waymo identify this as a factor in proper defensive driving and have their cars do it as well? Absolutely.

              Yes, of course they should. And they will, because unnecessary stops are undesirable for a variety of reasons. What I take objection to is all the people who want to let the following driver off the hook because the Waymo car shouldn't have stopped. Human drivers do things they aren't supposed to do all damned day, and the rest of us are expected to account for that. Why would other drivers' inattention or poor practice suddenly become acceptable because they are behind an AV?

                • Safe driving requires BOTH parties to be considerate. Does Google realize right now this very moment that drivers are irrational? If so, then the car should be designed to work with irrational drivers, period, full stop. It doesn't matter if the driver is following too close, and thus caused an accident. In these situations, the human driver in front usually slows down to make the situation safe again, or pulls over and lets the person pass. An automated car should be even more capable of doing so. Your attitude that a driver should only be concerned about safety in terms of whether they are liable for the accident or not really turns my stomach to tell you the truth.
                • Your attitude that a driver should only be concerned about safety in terms of whether they are liable for the accident or not really turns my stomach to tell you the truth.

                  In a perfect world, everyone would be aware of everything happening around them at all times. Guess where we don't live? That's why the law makes it the following driver's responsibility not to hit the driver in front of them. The driver in front is responsible for not hitting whatever is ahead of them, and the driver behind is responsible for not hitting them. My attitude is that the law has already taken this argument into account, and apportioned responsibility fairly in this case. Your attitude that

              • The onus is on Waymo to drive in a predictable fashion here.

                No, it really isn't.

                If they want to brag about how much they improve safety on the roads, it sure as hell is. If introducing a "safer" driver to the road ecosystem causes an increase in accidents because it doesn't behave the way "bad" human drivers expect it to, it isn't actually safer, no matter how rational or law-abiding it may be.

                • If they want to brag about how much they improve safety on the roads, it sure as hell is. If introducing a "safer" driver to the road ecosystem causes an increase in accidents because it doesn't behave the way "bad" human drivers expect it to, it isn't actually safer, no matter how rational or law-abiding it may be.

                  Your scare quotes are, as usual, inappropriate. If it doesn't hit things as often as humans, it is a safer driver. If the vehicles behind it hit it, they are the problem. They shouldn't be following that close. A vehicle can come to a hard stop for a variety of reasons, only some of which are control-related. Not being prepared for those eventualities makes the follower the problem driver, period.

                  Should Waymo limit spurious hard stops? Yes. Is it the person behind them with the responsibility to stop before

      • by TWX ( 665546 )

        I really cannot fathom the monstrosities known as incompetent drivers on the same road as myself, not knowing how to handle left turns in intersections (the left turn always yields unless there is a protected left signal!)

        You're wrong about that. Even at metered intersections with a control-light, those seeking to turn left must yield to any oncoming traffic that enters the intersection, regardless of one's own light, and regardless of that oncoming traffic's light.

        It didn't make sense to me either, until I realized that first, the prime duty is to avoid a collision, regardless of things like right-of-way, and second, that more than one party can be cited in an accident. If you turn left even when the oncoming driver runs a red light, both of you will be cited. Him for running the red light and failure to control, and you for failure to yield.

        • You're wrong about that. Even at metered intersections with a control-light, those seeking to turn left must yield to any oncoming traffic that enters the intersection, regardless of one's own light, and regardless of that oncoming traffic's light.

          Actually, those seeking to enter any intersection from any point must yield to any oncoming traffic that enters the intersection, whenever possible, for the reason you stated (avoidance of accidents.) However, I was only talking about who has the right of way.

          If you turn left even when the oncoming driver runs a red light, both of you will be cited. Him for running the red light and failure to control, and you for failure to yield.

          Eh, maybe. You might or might not be cited for that. What really matters, though, is conviction, not just citation.

  • Not a problem (Score:5, Interesting)

    by PopeRatzo ( 965947 ) on Wednesday August 29, 2018 @01:52AM (#57215460) Journal

    When I was in high school, my sister had a friend who was deathly afraid of turning left from one busy street to another. She just didn't get the whole, "inch out until the light turns yellow, and you're sure oncoming traffic is gonna stop, and then complete your turn" thing. So, swear to god, she used to make three right hand turns instead. She drove her father's old '70 Buick Electra 225 4-door and that thing was like an aircraft carrier. But it had the first electric seats I ever saw and had the bucket seats instead of a bench in the front, which I thought was cool.

    In summary, as long as you can make a sufficient number of right-hand turns, you can get away without hanging a Louie.

    • Re:Not a problem (Score:5, Interesting)

      by Chrisq ( 894406 ) on Wednesday August 29, 2018 @02:40AM (#57215572)
      My mum used to avoid right turns (we drive on the left in the UK so those are the ones where you have to cross the oncoming traffic). It got to the stage where she had worked out a long route that would take her to the local supermarket and back without turning right once. Eventually this was the only trip she'd drive herself.
    • by Memnos ( 937795 )

      "Deuce and a Quarter". I drove one of those back in high school. We called it the White Whale. Huge.

    • by sphealey ( 2855 )

      = = = In summary, as long as you can make a sufficient number of right-hand turns, you can get away without hanging a Louie. = = =

      That works in the Midwest US (where I learned to drive), but not so well in Pittsburgh. Make a right instead of a left there and it could be an hour until you get back to your starting point, which is also true for much of the northeast region of North America.

    • by Megane ( 129182 )
      "Two wrongs don't make a right, but three rights make a left."
    • "In summary, as long as you can make a sufficient number of right-hand turns .."

      Generally true although it may not work in Boston.

      Legend has it that notorious whackjob and FBI director J. Edgar Hoover actually instructed his drivers never to make left turns. One of Hoover's biographies is entitled "No Left Turns".

    • In summary, as long as you can make a sufficient number of right-hand turns, you can get away without hanging a Louie.

      Indeed you can, and often should.

      I specifically address those folks who think they can turn left from an establishment out onto the road when it's bumper to bumper rush hour traffic. Just turn right and go around the block.

  • by vtcodger ( 957785 ) on Wednesday August 29, 2018 @02:03AM (#57215480)

    OK ... So we have several problems

      First, the Waymo software is likely a bit buggy. No surprise there. It'll take several years to work through that. Wait till they encounter some of the blinking red and yellow arrows recently installed on traffic signals around here. I don't have the slightest idea what they really mean. Neither does anyone else. Neither, I'll bet, will Waymo. On top of which, at some times on some days, the sensors trying to read the signals will be looking directly into the sun.

    Second, the Waymo cars try to drive safely and legally, whereas human drivers generally try to drive as quickly as possible without being delayed by accidents or police traffic stops.

    Third, I expect, is that autonomous vehicles in general are likely going to have trouble with some forms of bad weather -- especially heavy snow, which humans who like to stay out of ditches handle by driving quite slowly and keeping moving. This is likely not going to be apparent in testing in Sunnyvale or Phoenix.

    • Never mind driving in bad weather; it is painful enough clearing your normal vehicle of snow in the morning. Who is going to want to climb on top in a snow storm and pick every chunk of ice off the lenses and then use lens cleaner on them?
      • Never mind driving in bad weather; it is painful enough clearing your normal vehicle of snow in the morning. Who is going to want to climb on top in a snow storm and pick every chunk of ice off the lenses and then use lens cleaner on them?

        And then keep them clean and clear while driving! In heavy snow it can be hard enough to keep the windshield clear with a heater and wipers.

    • Wait till they encounter some of the blinking red and yellow arrows recently installed on traffic signals around here. I don't have the slightest idea what they really mean. Neither does anyone else.

      Flashing yellow arrows are becoming quite common around here. Other than one woman who wrote an editorial about them, no one has had any problem figuring out that they mean "turn left when safe to do so, yielding to oncoming traffic". They typically replace solid red arrows, and are a delightful improvement.

      We don't have the flashing red arrows, but applying a bit of logic, I'd be really surprised if they meant anything other than "Stop, then you may turn left when it is safe to do so."

  • that with more debugging, they're hoping to turn the corner on the problems.
  • Walt Disney's dream (Score:3, Interesting)

    by eminencja ( 1368047 ) on Wednesday August 29, 2018 @02:27AM (#57215554)
    It is not going to happen on regular roads as we know them. Instead some big corporation is going to build a new city (possibly around a new campus) where regular cars will be banned and all traffic will be autonomous and roads will be smart as well with sensors, broadcasts, and what not. It will be so much simpler (for the AI) and so much more convenient for the humans. And once the benefits are obvious, other cities will follow suit. Building a city from scratch was Walt Disney's dream, btw.
    • Instead some big corporation is going to build a new city (possibly around a new campus) where regular cars will be banned and all traffic will be autonomous and roads will be smart as well with sensors, broadcasts, and what not.

      To make the roads really safe . . . you'll need to ban human passengers, as well. As long as there are still humans in the autonomous cars, they'll find a way to cause the car to crash.

      "Nothing can be made foolproof, because fools are so ingenious."

    • by jeti ( 105266 )
      If you're building new infrastructure anyway, it would be much easier to use elevated light rails. I really liked the concept of the Taxi2000 system [prtconsulting.com].
      • If you're building new infrastructure anyway, it would be much easier to use elevated light rails. I really liked the concept of the Taxi2000 system.

        If you're building elevated light rails, it makes dramatically more sense to hang the vehicle from the rail and use an even more lightweight rail than to make a rail strong enough to not only support the vehicle, but also keep it upright. The hanging rail consumes 1/4 or less of the resources of a fat monorail like that.

    • by PPH ( 736903 )

      I think I've seen this city [cnn.com].

  • Stopping suddenly (Score:5, Insightful)

    by 93 Escort Wagon ( 326346 ) on Wednesday August 29, 2018 @03:05AM (#57215620)

    ”One woman claimed that she almost hit a Waymo vehicle as it suddenly stopped while trying to make a right turn.”

    If you almost hit someone because they stopped suddenly... that’s on you, not the other driver.

    Don’t drive like an idiot.

    • by havana9 ( 101033 )
      If I have to do an emergency stop because the driver in front of me behaves erratically and I can stop safely, I was driving correctly and was actually respecting the safety distance.
      In Italy there is a fine, for traffic obstruction, if a driver stops suddenly or even drives too slowly without reason.
      • If I have to do an emergency stop because the driver in front of me behaves erratically and I can stop safely, I was driving correctly and was actually respecting the safety distance.

        In Italy there is a fine, for traffic obstruction, if a driver stops suddenly or even drives too slowly without reason.

        I could be wrong, but I feel like "suddenly stopped" in the article means unexpectedly stopped, rather than brake-check stopped. While stopping a car for no apparent reason on a normal road is a bad thing to do, it is still up to the car behind to not run them over.

        For example, every day lately on the freeway all of the traffic stops for no apparent reason. Probably ten minutes ago, someone merged in and caused the cars behind to slam on the brakes. Ten minutes later, everyone is still stopping at the same

      • In Italy there is a fine, for traffic obstruction, if a driver stops suddenly or even drives too slowly without reason.

        We have that, too, but in general the driver who hits another driver is responsible for the collision unless you have very good evidence that the driver who was braking intended to cause an accident. It's just too hard to determine who is at fault, so we have laws which make the following driver responsible for maintaining safe space.

    • Legally, the tailgater is at fault, agreed. But part of the point of autonomous cars is to drive with humans, and they are failing at that if they are doing unexpected things.
  • Errr.... (Score:3, Insightful)

    by thegarbz ( 1787294 ) on Wednesday August 29, 2018 @03:56AM (#57215726)

    One woman claimed that she almost hit a Waymo vehicle as it suddenly stopped

    Then don't tailgate. Idiot.

    • Re: (Score:3, Insightful)

      One woman claimed that she almost hit a Waymo vehicle as it suddenly stopped

      Then don't tailgate. Idiot.

      Unexpected, erratic behavior is dangerous.

      Yes, we should all defensively drive and all that. But surely you aren't claiming that we should all be able to erratically stop for no reason whenever we want on any public road, and reasonably expect that this is not going to result in increased accidents.

      • by PPH ( 736903 )

        Unexpected, erratic behavior is dangerous.

        Come visit my town. Old people.

      • But surely you aren't claiming that we should all be able to erratically stop for no reason whenever we want on any public road

        Yes, I am claiming this 100%, and the law will claim it too. If you hit the car in front for whatever reason, even if that car just randomly slams on their brakes, you're at fault because you were driving too close to react to a change in conditions.

        Unexpected things are dangerous. Today it's a self-driving car. Tomorrow it's someone having a seizure; the day after, it's some kid who runs out on the road.

    • by DRJlaw ( 946416 )

      One woman claimed that she almost hit a Waymo vehicle as it suddenly stopped

      Then don't tailgate. Idiot.

      She didn't hit it, ergo she wasn't tailgating.

      Pull that move yourself with a police car behind you. You won't get hit, but you'll probably get a talking-to and just maybe a ticket yourself [bellinghamherald.com].

      • A ticket? I got ARRESTED because I slowed down at a light and the car behind the car behind me ran into the car behind me, knocking it into me. Turns out the car that caused the accident had 6 kids not properly seatbelted who got injured, and no insurance, so the cops thought if they blamed the accident on me, my insurance would cover their medical costs. It didn't work, the DA threw the case out, and my insurance refused to pay anything. Yeah, I'm pretty sure "the driver's manual says you are required to lea
        • A ticket? I got ARRESTED because i slowed down at a light and the car behind the car behind me ran into the car behind me, knocking it into me.

          Bullshit.

      • She didn't hit it, ergo she wasn't tailgating.

        So what you're saying is that it's a non-event that should never have been written down as words. But it was written down, so everyone involved is an idiot.

    • If you aren't going to let idiots drive, the western economies are going to experience a huge readjustment.

      • I didn't say don't let idiots drive. I said idiots shouldn't complain. If I nearly hit a car because it suddenly stopped then I need to do some inner reflection rather than bitching at the media.

    • The word 'almost' means she wasn't tailgating, idiot. What it means is that these cars are doing things that are scary for other drivers.
      • The word 'almost' means she wasn't tailgating, idiot.

        The words are on the page meaning she's bitching to the media about it, ergo she's an idiot who doesn't realise how in the wrong she is, fluffernutter*.

        What it means is that these cars are doing things that are scary for other drivers.

        If people are scared by a car stopping in front of them then they are idiots who shouldn't be driving.

        *I was going to call you idiot back but given your past posting just saying your name is just as effective.

  • by SmilingBoy ( 686281 ) on Wednesday August 29, 2018 @04:30AM (#57215838)

    I'd like to hear Waymo's side of the story, as I could imagine that the vehicle may have stopped during a right turn because it detected a hazard that was real (maybe a child running towards the road) or not real (a paper bag flying towards the road). I also find the wording "the vans don't understand basic road features, such as metered red and green lights that regulate the pace of cars merging onto freeways" strange. Surely metered lights are not a basic road feature but something quite rare. I'm not saying that Waymo should not be able to handle those (surely they should!) but it does not seem to be a major failure either.

    • Surely metered lights are not a basic road feature but something quite rare.

      Until recently they were less rare in the USA than roundabouts, although for some reason roundabouts are being installed all over the damned place now. I think the theory is that they reduce fatalities, but they definitely don't reduce accidents.

      • That's true where I live. There has been an uptick of accidents at roundabouts, but a sharp decrease in fatal accidents.

        At intersections there's the potential for head-on and side collisions with the passenger compartment. At roundabouts, about the only possible collision is a side-to-side impact between a hood and a trunk. That's fantastic from a human safety standpoint.

        What I don't get is how there are more accidents. You look left. When nobody is coming, you go. How fucking hard is that? It's an order of magnitude less complicated than navigating a four way stop.

        • What I don't get is how there are more accidents. You look left. When nobody is coming, you go. How fucking hard is that? It's an order of magnitude less complicated than navigating a four way stop.

          It's hard because they are used in idiotic locations, and because people don't know how to signal for roundabouts. Say I'm on a highway, it doesn't matter if it's the 1 coming North into Fort Bragg or the 20 going West out of Upper Lake, but those are both places where roundabouts have recently replaced other traffic control features at intersections. Either way, traffic is going 40+ MPH when it comes to the roundabout, at which point it is expected to slow to 15 MPH. If it does, which it usually doesn't, i

    • Not rare; in Oregon and Washington almost every freeway entrance has a stop light restricting access onto the freeway. And they work fine for human drivers, who realize only one car is allowed to go through each time they turn green.
  • "Waymo Self-driving Cars Are Having Problems Turning Around Corners "

    According to TFA, it should be "Waymo Self-driving Cars Are More Risk-Averse than Human Drivers."

    • If they were risk-averse, they would understand that the human drivers behind them have a certain reaction time, and it is stressful for them to have a vehicle in front of them stop suddenly for any reason.
  • Those "I am not a robot" links Google provides to websites ask humans to pick out cars, crossings, road signs etc. to prove they're not a computer. If a computer can't even pick out a road sign or other road related things in a still image then why does anyone expect a self driving car to?

    It's simply evidence that the hype for this tech is well in excess of reality.

    • That's a really good point. That's from a still image too; driving requires picking these out from thousands of images a second.
  • Boing Boing had a link to the reporter who originally broke the story which actually has useful information - https://twitter.com/amir/statu... [twitter.com]
  • They're having waymo problems than they expected, then? Now I understand where the name "waymo" comes from.
  • Waymo reported that they drove something like 37,000 miles without human intervention in November 2017. How does this add up, now knowing that they can't even navigate a normal intersection properly?
  • "The Waymo vans have trouble with many unprotected left turns"

    I've said it before and I'll say it again. If they're having trouble making unprotected left turns, perhaps you should stop having Google Maps direct them to make unprotected left turns? And then PLEASE PROVIDE THAT UPDATE TO ME AS WELL!

    Google Maps seems just fine for trips of moderate distance, but I've lost track of the number of times on shorter trips it has directed me to take side streets and then suggested that I make a left turn at an
    • by PPH ( 736903 )

      If they're having trouble making unprotected left turns,

      then perhaps they shouldn't have a driver's license.

      Unprotected left turns are common in most places. In fact, left turn signals are usually installed only where traffic demand justifies the cost. So as mass transit and punitive city center tolls reduce traffic, expect to see fewer of them.
