Google Self-Driving Car Might Have Caused First Crash In Autonomous Mode (roboticstrends.com) 410

An anonymous reader writes: While driving in autonomous mode, a Google self-driving car was involved in an accident with a public bus in California on Valentine's Day, according to an accident report filed with the California DMV. The accident report, signed by Chris Urmson, says the Google self-driving car was trying to get around some sandbags on a street when its left front struck the bus' right side. The car was going 2 mph, while the bus was going 15 mph. Google said its car's safety driver thought the bus would yield. No injuries were reported. If it's determined the Google self-driving car was at fault, it would be the first time one of its cars caused an accident while in autonomous mode.
  • by NotDrWho ( 3543773 ) on Monday February 29, 2016 @05:00PM (#51610945)

    Ow my neck!!

  • Machine Learning (Score:2, Insightful)

    >> Google said its car's safety driver thought the bus would yield.

    So Google is teaching their cars to drive like normal Californians: expect that the other guy will yield.

  • by presidenteloco ( 659168 ) on Monday February 29, 2016 @05:07PM (#51610987)

    in some jurisdictions, cars have to yield right of way to buses in general.
    Buses certainly have right of weight.

    Also, what's with the aggressive, obnoxious tactic of sneaking around cars in the same lane? Did someone program that, or did the software learn it?

    • by hey! ( 33014 ) on Monday February 29, 2016 @05:21PM (#51611085) Homepage Journal

      That may be true in some jurisdictions, but what's true in all jurisdictions is that right of way isn't a license to get into an accident that you can avoid. If the Google car really was traveling at just 2 mph, then you have to wonder whether the bus driver could have avoided the accident.

      In any case it's clear that if the safety driver had been driving the accident still would have happened; he judged that the bus would yield, but it didn't.

      • At 2 mph I wouldn't be surprised if the bus driver thought the Google car was stationary.
      • That may be true in some jurisdictions, but what's true in all jurisdictions is that right of way isn't a license to get into an accident that you can avoid. If the Google car really was traveling at just 2 mph, then you have to wonder whether the bus driver could have avoided the accident.

        In any case it's clear that if the safety driver had been driving the accident still would have happened; he judged that the bus would yield, but it didn't.

        I do not believe that most jurisdictions require you to take action to avoid someone else hitting you. That could result in far more dangerous circumstances. And if the Google car was going 2 mph, then the correct action is for the Google car to stop for the sandbags rather than jump in front of a bus. Besides, I thought the whole point of autonomous cars is that they're supposed to be safer? It is clear that the car merged into the bus.

        • Re: (Score:2, Interesting)

          by Obfuscant ( 592200 )

          And if the Google car was going 2 mph, then the correct action is for the Google car to stop for the sandbags rather than jump in front of a bus.

          From the description in the summary, it sounds like the car ran into the side of the bus. It didn't jump in front of it, it sideswiped it as it tried to go around sandbags in its lane. Assuming that the bus was in its own lane, the car had to leave the lane it was in to do that.

          Every discussion about safe driving I've seen in this forum has had the "safe" drivers claiming that the only safe thing to do is stop when faced with an impediment to traffic, not to try swerving around it. And the autonomous discu

          • And if the Google car was going 2 mph, then the correct action is for the Google car to stop for the sandbags rather than jump in front of a bus.

            From the description in the summary, it sounds like the car ran into the side of the bus. It didn't jump in front of it, it sideswiped it as it tried to go around sandbags in its lane. Assuming that the bus was in its own lane, the car had to leave the lane it was in to do that.

            See, that's the thing - they were in the same lane. The AV was in the right side of the lane preparing to turn, the bus was behind and starting to pass the AV on the left side of the lane. The AV saw sandbags in its way, so slowed and moved over - and the bus did not yield. You cannot pass another vehicle within the same lane in nearly any situation (the one exception I know of is lane splitting on a motorcycle in California). The bus was in the wrong - it was passing another vehicle in the same lane.

            • the bus was behind and starting to pass the AV on the left side

              To hit the side of a bus, it has to already be passing you.

              The bus was in the wrong - it was passing another vehicle in the same lane.

              The whole purpose of double-wide lanes is so that people making right turns don't impede people not making right turns. You don't need that extra space for any other purpose. If you can't go past someone making a right hand turn, then the whole reason for the lane is defeated.

              And that ignores the question, did the car not see the bus or did the extra-smart computer just assume that the human would yield, as did the extra-smart human driver of said

        • I do not believe that most jurisdictions require you to take action to avoid someone else hitting you.

          In California you actually are required to do so if it can be done safely.

      • The good part is that the Google car has a whole lot of telemetry data waiting to be analyzed to figure out what really happened.

  • by Your.Master ( 1088569 ) on Monday February 29, 2016 @05:13PM (#51611029)

    I'm a big believer in autonomous cars, but when I see

    Google said its car's safety driver thought the bus would yield.

    it makes me wonder how many crashes we would have had in autonomous mode, if there weren't an attentive driver who was fully aware he was sitting in an experimental vehicle.

    Even if the first rounds of autonomous cars still require a driver for override (for legal reasons if nothing else), it seems like the number of autonomous crashes that likely would have happened is the number that has to be driven way down to be comparable to, or less than, the number with human drivers*; it's not really the number of autonomous crashes overall that is important.

    Also makes me wonder whether any of the manual mode crashes were initiated in autonomous mode and the manual override driver just couldn't recover the situation.

    *whether average human drivers or above-average human drivers or even below-average human drivers are the standard is up for debate.

    • by fisted ( 2295862 )

      it makes me wonder how many crashes we would have had in autonomous mode, if there weren't an attentive driver who was fully aware he was sitting in an experimental vehicle.

      What's your point? Yes, it isn't ready yet. That is why they have a safety driver there in the first place.

      • by Junta ( 36770 )

        Well, the point would be that the state of public discourse is based around the assumption that the cars have never caused an accident because they wouldn't. Stories were written all the time about how there were no known instances of an accident where the autonomous system was at fault. That dialog could be disingenuous if the safety drivers intervene often. For example, one accident that was caused by a 'safety driver' was when the car gave up trying to make a left turn and the human had to try and mes

      • by nukenerd ( 172703 ) on Monday February 29, 2016 @07:01PM (#51611665)

        it makes me wonder how many crashes we would have had in autonomous mode, if there weren't an attentive driver

        What's your point?

        The point is that the safety driver's presence and power to intervene means that we cannot rely on the accident rate statistics racked up so far.

    • by rtb61 ( 674572 )

      As long as the programmers wear the penalties for what their programs do, including custodial sentences, autonomous vehicles are fine. If they expect the same old, same old bullshit of "you used the program, so it's your fault, because the EULA says (contrary to all the marketing) that the program is shite and you are an idiot to use it," well, no: autonomous vehicles should be banned. Sorry, but there is no way in hell I want to share a road with software with the typical EULA no wa

      • Some companies have already declared that they'll assume liability for their autonomous vehicles. [jalopnik.com] They do this knowing full well that autonomous vehicles are going to be an order of magnitude or two safer than human drivers, mainly because human drivers, on average, are pretty terrible drivers.

        Quite frankly, I'm much more concerned about sharing the road with other humans who get distracted, don't pay attention, or drive impaired or recklessly around me, and I'm very much looking forward to the day when vehicula

        • by rtb61 ( 674572 )

          Dude, we are talking typical lying jackass corporations; let's see it in writing, because before then it is just empty promises. I did read through the various claims and articles and they are just empty claims, subject to government regulatory testing and government vehicle licensing, with a government-implemented supportive framework (immediate out: not our fault, the government's fault; never ever forget lobbyists hard at work privatising profits and socialising losses). Here is betting they will immediat

    • by ColdSam ( 884768 )
      Sure, all of what you say is possible; we don't have all the data. It's ALSO possible that crashes wouldn't have occurred if the self-driving car had been left to correct itself.

      However, what data we do have suggests that the combination of AV and scrupulous test driver is better than the average driver. We also know that every time the driver has to take the wheel, it will make the next generation of AV that much better.
    • Every time the operator takes the wheel, Google uses the captured sensor data to write another test case.
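    The "takeover becomes a test case" loop described above can be sketched as follows. Every name here (Disengagement, TestCase, and the conversion function) is invented for illustration; Google's actual tooling is not public.

```python
from dataclasses import dataclass

# Hypothetical sketch of the loop the comment describes: a logged human
# takeover (disengagement) is converted into a replayable regression scenario
# for the next software version. All names are invented for illustration.

@dataclass
class Disengagement:
    timestamp: float          # seconds into the drive when the operator took over
    sensor_snapshot: dict     # recorded sensor frames around the event

@dataclass
class TestCase:
    name: str
    scenario: dict            # replayable scenario extracted from the log
    expected: str             # what the next software version must do instead

def disengagement_to_test_case(event: Disengagement, case_id: int) -> TestCase:
    """Turn a logged human takeover into a replayable regression scenario."""
    return TestCase(
        name=f"disengagement_{case_id:04d}",
        scenario=event.sensor_snapshot,
        expected="no takeover required",
    )

event = Disengagement(timestamp=812.4, sensor_snapshot={"lidar": "...", "camera": "..."})
case = disengagement_to_test_case(event, case_id=7)
print(case.name)  # disengagement_0007
```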
  • by __aaclcg7560 ( 824291 ) on Monday February 29, 2016 @05:20PM (#51611077)
    Neither school buses nor soccer moms ever yield in traffic.
    • Change that to TRANSIT buses and soccer moms and I'm with you in Atlanta.

      OTOH the school buses in my burb are actually pretty considerate.

  • by avandesande ( 143899 ) on Monday February 29, 2016 @05:27PM (#51611137) Journal
    1st rule of defensive driving: never expect another driver to do anything.

    I won't pull into traffic or turn onto a street if another driver would need to slow or brake to avoid hitting me.
    Never sit in a median to turn with the front or back of the car sticking out.
    I actually speed up a bit before turning to maximize the distance between myself and the driver behind, and turn shallow. This is a bit hard to explain, but you angle into the turn and actually do most of your slowing when you are already in the turn.
    Many others, but I probably don't even think about them.
    • by jittles ( 1613415 ) on Monday February 29, 2016 @06:08PM (#51611381)

      1st rule of defensive driving- never expect another driver to do anything. I won't yield into traffic or turn into a street if another driver will need to slow or brake not to hit me. Never sit in a median to turn with front or back of car sticking out. I actually speed up a bit before turning to maximize distance between myself and driver behind and turn shallow. This is a bit hard to explain but you angle into the turn and actually do most of your slowing when you are already in the turn. Many others but I probably don't even think about them.

      You should probably take a driving safety course. Speeding up or slowing down in a turn requires traction, and the total traction available in a turn is a fixed budget (one that varies with conditions) shared between turning and accelerating or braking. With four wheels, this may not be a huge problem in favorable conditions. With two wheels, this can be very dangerous. I hate when people do exactly what you describe while I'm on my motorcycle. I do not want to touch my brakes in a turn unless it's an emergency. I try to maintain a constant speed. Even if I coast to slow down, I slow down much more slowly than a car when turning.
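      The "fixed amount of traction" point is the classic friction-circle model from vehicle dynamics; here is a minimal sketch, with the friction coefficient as an assumed number:

```python
import math

# Friction-circle sketch (hypothetical numbers): the tires' total grip is a
# shared budget, so longitudinal (braking/accelerating) and lateral (turning)
# acceleration must together stay inside mu * g.

MU = 0.7          # assumed tire-road friction coefficient (dry asphalt ~0.7-0.9)
G = 9.81          # gravitational acceleration, m/s^2

def within_traction_budget(a_long: float, a_lat: float, mu: float = MU) -> bool:
    """True if the combined acceleration demand stays inside the friction circle."""
    return math.hypot(a_long, a_lat) <= mu * G

def max_braking_while_turning(speed_mps: float, radius_m: float, mu: float = MU) -> float:
    """Braking capacity left over after lateral grip is spent on the turn.

    Lateral acceleration in a steady turn is v^2 / r; whatever grip remains
    (by Pythagoras) can be used to slow down. Returns 0 if the turn alone
    already exhausts the budget.
    """
    a_lat = speed_mps ** 2 / radius_m
    budget = mu * G
    if a_lat >= budget:
        return 0.0
    return math.sqrt(budget ** 2 - a_lat ** 2)

# Gentle maneuver: combined demand ~3.6 m/s^2, well inside the ~6.87 m/s^2 budget.
print(within_traction_budget(a_long=2.0, a_lat=3.0))  # True
# A 15 m/s turn on a 35 m radius uses most of the grip, leaving little to brake with.
print(round(max_braking_while_turning(15.0, 35.0), 2))
```

      The motorcycle point falls out of the same model with a smaller effective budget: the closer the turn pushes you to the circle's edge, the less braking you can add without exceeding it.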

    • by Ichijo ( 607641 )

      I actually speed up a bit before turning to maximize distance between myself and driver behind and turn shallow. This is a bit hard to explain but you angle into the turn and actually do most of your slowing when you are already in the turn

      That's illegal in California. You're supposed to make the turn from as close as practicable to the right-hand curb or edge of the roadway [ca.gov].

  • by avandesande ( 143899 ) on Monday February 29, 2016 @05:34PM (#51611177) Journal
    Could somebody please come up with a fitting car analogy?
  • by Lucas123 ( 935744 ) on Monday February 29, 2016 @05:34PM (#51611179) Homepage
    Google is now saying they were following the "spirit of the road" when the crash happened [computerworld.com] and that they've reviewed the incident, as well as thousands of variations on it, in a driving simulator and made refinements to its AV software.
    • by Anonymous Coward on Monday February 29, 2016 @05:58PM (#51611327)

      And therein lies the rub. No Google car will make the same mistake again. Likely no other autonomous car will make the same mistake again. And thus by having a minor fender bender during beta testing, we prevent hundreds of collisions in the future. No matter how many times a human did the same thing, more humans would continue to make the same mistake, until we could come up with a law to prevent it, i.e. always yield to buses no matter what.

  • In the same lane? (Score:5, Insightful)

    by singularity ( 2031 ) <.nowalmart. .at. .gmail.com.> on Monday February 29, 2016 @05:57PM (#51611317) Homepage Journal

    So based on numerous descriptions I have read, the Google car was in a very wide lane and moved to the right side of the lane to make a right turn. It saw some sandbags blocking the very right side of the lane, so it tried to move back to the middle of the lane. A bus, coming up from behind in the same lane, did not yield to the Google car and there was contact.

    I think it is important to note that all of this happened in the same "lane".

    While the Google car could have possibly avoided the accident, I am not sure it is to blame. It seems to me that the bus was attempting to pass a car ahead of it in the same lane.
    The blame seems about 80% on the city for not properly marking the lanes, about 15% on the bus for not yielding to a car ahead of it in its own lane, and about 5% on the Google car for not stopping for the bus who was trying to barge its way through.

    • 'Caused' (Score:2, Flamebait)

      by Martin S. ( 98249 )

      Agreed. Reading the report, it is very obvious the bus actually caused the accident by attempting an inappropriate overtake of the Lexus.

      The Lexus 'caused' the accident only insomuch as it did not avoid it.

    • So based on numerous descriptions I have read, the Google car was in a very wide lane and moved to the right side of the lane to make a right turn. It saw some sandbags blocking the very right side of the lane, so it tried to move back to the middle of the lane.

      I don't think there is any such thing as a wide lane where cars are allowed to go side by side. Whether it is marked as such by paint on the road or not, the Google car had moved into a right turn lane. It is common for parking spots and bike lanes and such to turn into a right turn lane near a corner. So marked with paint or not, the Google car seems to have been trying to move from a right turn lane to a traffic lane. Perhaps the lack of paint contributed to the error, the software failed to recognize the

  • by SuperKendall ( 25149 ) on Monday February 29, 2016 @06:10PM (#51611391)

    The accident was because the car saw sandbags on the right and in an overabundance of caution decided to move a whole lane over, into a bus.

    Well, that's as good as many human drivers, who I have seen swerve from the lane they are in at seemingly nothing without a glance, and absolutely why you do not linger in someone's blind spot.

    An open question though is how it saw sandbags and not a BUS...

  • by superdave80 ( 1226592 ) on Monday February 29, 2016 @06:16PM (#51611415)

    Google said its car's safety driver thought the bus would yield.

    BWAHAHAHAHA!!! Has this guy ever driven in SF before? A bus YIELDING to another car? In your dreams. I drove through SF for years, and buses didn't give a crap who was around them. When they pulled off to pick-up/drop-off passengers, they would intentionally park at an angle to keep the lane blocked so they could more easily start back into the lane. Even if they didn't block the lane, if they wanted to get moving in that lane, they just go. They know that they are bigger than the cars, so they know the car will slow down or move out of the way. If a lane became more narrow than they liked due to parked cars on the side of the road, they would just take up two lanes. If you are next to it in the lane that they now want to occupy? You better move the fuck over. They would run red lights at will. Watch out if you are trying to cross at an intersection with a traffic light and a bus is coming through. Without a doubt, bus drivers in SF were the worst.

  • by Ichijo ( 607641 ) on Monday February 29, 2016 @06:20PM (#51611433) Journal

    Most of the time Google's AVs drive in the middle of a lane but "when you're teeing up a right-hand turn in a lane wide enough to handle two streams of traffic, annoyed traffic stacks up behind you.

    "So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane.

    The law says a right-hand turn shall be made as close as practicable to the right-hand curb or edge of the roadway [ca.gov]. So Google's self-driving cars have been making their right turns illegally until just recently.

    I expected better from Google.

  • Probably looks like someone forgot to set their parking brake.
  • by morethanapapercert ( 749527 ) on Monday February 29, 2016 @07:42PM (#51611895) Homepage
    In previous posts about autonomous cars, I raised the question of how these vehicles handle the highly variable and difficult-to-anticipate changes in routing caused by construction. I worked for several years in road construction and can tell you that an appalling number of humans get confused by having to change lanes in response to a flagman or pylons/barrels, ignoring any existing lane, curb, and sign markings.

    In this case; having read the article (I know, I know...) it seems that the car programming is overly optimistic about predicting the behaviour of vehicles overtaking it. It seems possible that the programming includes implicit assumptions of the likely stopping distance and reaction times it should expect from other vehicles as well. In other words; it "thought" it had sufficient space and time to perform the manoeuvre because it "assumed" a bus would behave and react the way a car might.

    I have two thoughts, each in defence of one of the vehicles in this collision:

    1) Even the safety driver expected the bus to yield, and from what I can glean from the article, legally the bus should have yielded. So this was a mistake that even the majority of human drivers might have made in the same situation.

    2) Others in this thread have posted criticisms of bus drivers in their city or in general. Many of the annoying behaviours they mention, though, are pretty understandable from the bus driver's POV. You can't just suddenly hit the brakes if a smaller vehicle or pedestrian darts in front of you. Not only do you have a hell of a lot of momentum (highly variable, depending on passenger load), you also have to make velocity changes as gradual as you can. Your passengers aren't buckled up, you might have a fair number of them standing, with any number of knapsacks, briefcases, skateboards, etc. that become flying hazards when you come to a stop too suddenly. You have to ease to the left a fair bit when making a right turn because you have a much larger turning radius than most other vehicles. You have to drive straddling lines sometimes because if you stayed tight to the right, you are going to crunch someone, hop the curb, or both. On the other hand, if you do stick to the left as much as you can, lots of people are going to pull what Torontonians call a "cabby pass," where the cab illegally passes a bus or streetcar on the right so as to get out from behind it. If buses don't use their rear end to block the traffic lane, quite often they'll never get back out, because no one wants to stop at the bus's back corner and let the bus back in. (I have a relative who is a TTC bus driver, and he has passed along some training and daily work anecdotes.)

  • by BrendaEM ( 871664 ) on Monday February 29, 2016 @09:20PM (#51612331) Homepage

    It never had to take a driving test like you did.
    It will come defended by one of the largest companies the world has ever seen.
    It will put thousands of people out of a job.
    It's not likely to see a woman in a black fur coat.
    It can't decide whether a child or an adult dies.
    It can't see at 400 Hz like your eyes can.
    [Yes, we have persistence of about 16-24 Hz, like you thought you did, but we can see a new object enter the scene at about 400 Hz. Try it with an Arduino if you don't believe me. You can plainly see the difference between 60 and 120 Hz in monitors.]
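
    For scale, the single-frame display times behind those refresh rates are simple arithmetic; whether the eye can catch a one-frame event at a given rate is the commenter's claim, which this sketch does not verify:

```python
# Frame period (how long a single frame is on screen) at various refresh rates.
# Plain arithmetic only: period = 1 / refresh rate.

def frame_period_ms(refresh_hz: float) -> float:
    """Duration of one frame, in milliseconds, at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (24, 60, 120, 400):
    print(f"{hz:3d} Hz -> {frame_period_ms(hz):.2f} ms per frame")
```

    At 400 Hz a one-frame flash lasts only 2.5 ms, versus about 16.7 ms at 60 Hz, which is the gap the Arduino experiment would be probing.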
