


Google's Self Driving Car Crashes

datapharmer writes "We've all read previous stories on Slashdot about Google's driverless car, and some have even pondered whether a crash would bring the end of robotic cars. For better or for worse, we will all find out soon, as the inevitable has occurred. The question remains: who is to blame? A Google spokesperson told Business Insider that 'Safety is our top priority. One of our goals is to prevent fender-benders like this one, which occurred while a person was manually driving the car.'"


Comments Filter:
  • by Dice ( 109560 ) on Friday August 05, 2011 @07:11PM (#37001958)

    The car crashed while being driven by a person.

    According to a Google spokesperson. If I were in that car, and it crashed while the software was driving, I would claim that I had been driving it too. Any public crash that could be blamed on the software would put the project in serious jeopardy.

  • by KingSkippus ( 799657 ) on Friday August 05, 2011 @07:19PM (#37002080) Homepage Journal

    I've posted this before and I'll post it again.

    Robot cars don't have to be 100% reliable. As long as they're more reliable than the jerks who normally scare the bejesus out of me by cutting across three lanes of traffic, driving 90 MPH, weaving in and out, running red lights, etc., then I'm all for a robot car-driven society. I'm willing to put up with the computer glitches that, on very rare occasions, cause crashes if I don't have to put up with the human glitches that call themselves licensed drivers.

  • by Riceballsan ( 816702 ) on Friday August 05, 2011 @07:29PM (#37002202)
    Good for you, but unfortunately that only means you are more sane than the lawmakers, the lobbyists, etc. The problem is that if there is a single fatality, or even a minor accident, a large group will rise up screaming about how unsafe the cars are, and they will be disallowed from driving on public roads, even if the average rate of accidents and fatalities is 1/16th of the human rate. Most laws can be stopped by focusing on the 1% of the time something is worse and completely ignoring the 99% of the time it was better.
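    The 1/16th figure above can be made concrete with a quick back-of-the-envelope calculation. Note the crash rate used here is hypothetical, chosen only to illustrate the scale of the difference, not drawn from any real dataset:

    ```python
    # Hypothetical comparison: how many crashes to expect from human drivers
    # vs. robot cars that crash at 1/16th the human rate, over the same mileage.
    human_crash_rate = 4.2                      # assumed crashes per million miles (illustrative)
    robot_crash_rate = human_crash_rate / 16    # the 1/16th claim from the comment

    miles = 100_000_000                         # miles driven in the comparison

    human_crashes = human_crash_rate * miles / 1_000_000
    robot_crashes = robot_crash_rate * miles / 1_000_000

    print(f"Human drivers: {human_crashes:.0f} expected crashes")   # 420
    print(f"Robot drivers: {robot_crashes:.2f} expected crashes")   # 26.25
    ```

    The point the comment makes is that public debate tends to fixate on the 26 robot crashes rather than the roughly 394 crashes avoided.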
  • by tibman ( 623933 ) on Friday August 05, 2011 @07:38PM (#37002274) Homepage

    I did some quick research.

    According to California officials, there are no laws that would bar Google from testing such models, as long as there's a human behind the wheel who would be responsible should something go wrong.

    Taken from here: http://jalopnik.com/5661240/are-googles-driverless-cars-legal which was linked in the article from the summary.

    However, I would say that there is a difference between operating the car and manually driving the car. The Google spokesperson used the phrase "manually driving."

  • by Anthony Mouse ( 1927662 ) on Saturday August 06, 2011 @12:15AM (#37004082)

    The other thing to consider is who is at fault for the collision. There are situations where, no matter who you are, you cannot avoid a collision through no fault of your own. Example: you're driving in a construction zone with a car to your left and a construction barrier to your right. A deer jumps over the barrier and lands two feet in front of your car. You only get to choose whether you hit the deer, the barrier, or the car to your left; there is no choice that avoids a collision. If a self-driving car is put in that situation, it has the same alternatives, and we shouldn't be at all surprised when some similar situation ultimately occurs.

"The eleventh commandment was `Thou Shalt Compute' or `Thou Shalt Not Compute' -- I forget which." -- Epigrams in Programming, ACM SIGPLAN Sept. 1982