
Google's Self Driving Car Crashes

datapharmer writes "We've all read previous stories on Slashdot about Google's driverless car, and some have even pondered whether a crash would bring the end of robotic cars. For better or for worse, we will all find out soon, as the inevitable has occurred. The question remains: who is to blame? A Google spokesperson told Business Insider that 'Safety is our top priority. One of our goals is to prevent fender-benders like this one, which occurred while a person was manually driving the car.'"
This discussion has been archived. No new comments can be posted.

  • Johnny Cab (Score:5, Funny)

    by Anonymous Coward on Friday August 05, 2011 @06:02PM (#37001880)
    "The door opened, you got in!"
  • by ELitwin ( 1631305 ) on Friday August 05, 2011 @06:07PM (#37001920)
    The car crashed while being driven by a person.

    Nothing to see here - move along please.
    • by WrongSizeGlass ( 838941 ) on Friday August 05, 2011 @06:08PM (#37001934)

      The car crashed while being driven by a person.

      Maybe he was looking at the GPS and not paying attention to the road.

    • But...the robots....:( So disappointed.
    • by Dice ( 109560 ) on Friday August 05, 2011 @06:11PM (#37001958)

      The car crashed while being driven by a person.

      According to a Google spokesperson. If I were in that car, and it crashed while the software was driving, I would claim that I had been driving it too. Any public crash that could be blamed on the software would put the project in serious jeopardy.

      • by 101010_or_0x2A ( 1001372 ) on Friday August 05, 2011 @06:13PM (#37001978)
        9/11 was an inside job because man never landed on the moon.
        • Laugh all you want, but this is not such a weird assumption absent more facts about the incident.

          And you already have proof that Google did the logical thing for a company and downplayed the magnitude of the event. They said 2 cars were involved when it was really 5.

          Then again, it's what any smart company would do when facing a similar problem. It remains to be seen whether they can handle the many variables of mixed human and robot traffic.

      • OTOH if you lied and the cops found out you had lied then I would think that could put the project in even more serious jeopardy.

      • by Fuzzums ( 250400 )

        Yeah. And if later on anybody would find out it actually WAS the software instead of the human driver...

    • Are you kidding? This is the epitome of news for nerds. It's amazing. The Google car has crashed! Who cares about the details? It's all over. Even Google can't make the perfect car. Why do they allow a manual override in the first place? OMG.

      Actually the real Oh My God moment was that I just clicked on that waste of a link.

    • Let's talk about something a bit more relevant.
      Basically, a small issue is that bugs will occur. If the car's AI actually crashed the car, isn't that actually a really good thing? I mean, the bug would otherwise have remained present.
      And since it crashed, they can figure out WHY it crashed, which means they can fix the bug.
      The same thing applies to everything: while doing R&D you actually want a few of your products to break badly, so you can fix the fault that caused it.

    • How do we know that the following sequence didn't happen:

      The car was in automatic drive.
      A problem occurred and it appeared that a crash was about to occur.
      The driver took control of the vehicle.
      There was not enough time to avoid the crash, and the crash occurred.

      Google can truthfully say that at the time of the crash the car was in manual control but the crash was still caused by the computer.
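
      The distinction the parent is drawing can be sketched in code: the control mode logged at the moment of impact can honestly read "manual" even though the chain of events began under autonomous control. The event-log format and timestamps below are purely invented for illustration; nothing here reflects Google's actual telemetry.

```python
# Hypothetical drive log: (seconds_into_drive, event). "autonomous" and
# "manual" events mark control-mode changes; other events are annotations.

def mode_at(events, t):
    """Return the control mode in effect at time t."""
    mode = None
    for ts, event in events:
        if ts > t:
            break
        if event in ("autonomous", "manual"):
            mode = event
    return mode

# t=0 autonomous engaged, t=41.2 fault detected, t=41.8 driver takes over,
# t=42.5 impact -- mode at impact is "manual", yet the fault predates it.
log = [
    (0.0, "autonomous"),
    (41.2, "fault_detected"),
    (41.8, "manual"),
    (42.5, "impact"),
]

print(mode_at(log, 42.5))   # prints: manual
```

A statement like "the car was under manual control at the time of the crash" is therefore consistent with the fault having occurred earlier, under autonomous control.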

      • Because Google makes a distinction between the car being driven autonomously, which is the scenario you describe, and the car being driven manually, in which case it is driven just like any other Prius. Until more facts come in, what is the point of this scare-mongering conspiracy theory? What is YOUR agenda?
        • We have the same point. There are some people who are taking the statement by Google that the car was being driven manually when the accident occurred at face value. I was just putting forward an alternate scenario in which the basic statement was true but may not tell the whole story. There is not enough information in the articles to make an informed judgment on what happened.

          Even when the car is being driven by computer there is always someone in the driver's seat in case something goes wrong. In my scen

    • Yes, but it is Prius-on-Prius violence. The only thing better would have been if the drivers had gotten into an altercation as well. Seriously, I'm always shocked when a Prius driver turns out to be anything other than a self-obsessed jackass; I swear they take special classes to teach them how to drive in an incompetent and self-obsessed manner.

    • by bonch ( 38532 ) *

      That's what the Google spokesman says. A crash would be very bad PR for Google's pointless self-driving car project.

  • by Bovius ( 1243040 ) on Friday August 05, 2011 @06:07PM (#37001924)

    Relevant quote: "...occurred while a person was manually driving the car."

    Headline should be: "Human damages Google car by operating it with his own slow, meaty appendages"

  • by Anonymous Coward

    Why, Apple, Microsoft, and Yahoo! And maybe Oracle too!

  • Why would one crash bring an end to robotic cars? Crashes can be expected while they are still developing this car.
    • Because we have a need to *blame* someone when something goes wrong. If a robotic car makes a mistake, crashes, and kills someone, who goes to jail? The owner, who submitted it for thorough testing before allowing it to drive on the road? The manufacturer, who did the same and also performed thousands of hours of independent testing? One of the dozens of engineers or hundreds of programmers who worked on it? A person is hypothetically dead, and they wouldn't have hypothetically died if not for this robotic car.
      • Who do we get to punish!?

        The corporation has to pay. And, when all is said and done, if their behavior was especially egregious they'll pay a lot. That's just the way it is. And yes, it does take time and money. If it were any other way, nobody would ever be an engineer, nobody would ever build anything of consequence, because going to jail for doing your job is just not a worthwhile risk for most people.

      • who goes to jail?

        How about no one? Why must someone go to jail for what would probably be perceived by most to be an unfortunate occurrence?

  • Anyway, legal liability is a big holdup for autonomous cars. At least at the start, the only way to have them may be auto-drive-only roads, and even then there will need to be some kind of no-fault scheme, or someone guaranteeing that all repair costs will be covered, or dedicated auto-drive insurance. Also, the cops and courts will need someone to take the fall if any laws are broken.

  • by KingSkippus ( 799657 ) on Friday August 05, 2011 @06:19PM (#37002080) Homepage Journal

    I've posted this before and I'll post it again.

    Robot cars don't have to be 100% reliable. As long as they're more reliable than the jerks who normally scare the bejesus out of me by cutting across three lanes of traffic, driving 90 MPH, weaving in and out, running red lights, etc., then I'm all for a robot car-driven society. I'm willing to put up with the computer glitches that, on very rare occasions, cause crashes if I don't have to put up with the human glitches that call themselves licensed drivers.

    • by Riceballsan ( 816702 ) on Friday August 05, 2011 @06:29PM (#37002202)
      Good for you, but unfortunately that only means you are saner than a lawmaker, the lobbyists, etc. The problem is that if there is a single fatality, or even minor accidents, a large group will rise up screaming about how unsafe the cars are, and they will be disallowed from driving on public roads, even if the average rate of accidents and fatalities is 1/16th of human rates. Most laws can be stopped by focusing on the 1% of the time something is worse and completely ignoring the 99% of the time it was better.
      • by sl149q ( 1537343 )

        Mostly the scare campaigns will be generated by people with other agendas. Think teamsters wanting to protect jobs for drivers. There are a *lot* of people who stand to lose their living once self-driving cars start to be deployed.

        You can see prototype scare campaigns of this sort anywhere that has contemplated driverless mass transit systems.

        I suspect that in some jurisdictions (where unions have political pull) we will see laws enacted that require a human "driver" be available to override the controls.

    • by artor3 ( 1344997 )

      It doesn't matter that you're okay with it. The media will jump on it to create a scare so that they can get more advertising revenue. Their victims will get scared and demand their congressmen ban self-driving cars. History has shown that politicians who try to reason with raving, scared citizens end up having short careers.

      There will never be self-driving cars. Not in our lifetimes. Technology allows them, but society doesn't.

      • by xigxag ( 167441 )

        Strongly disagree.

        First of all, we already have automatic braking systems, cruise control, electronic stability control and other computer assisted driving methods. And they can fail. The argument you are making would lead us to conclude that a couple of ABS failures would lead to banning the technology, but that hasn't happened. The computer is taking over the automobile in stages, and people will have time to become accustomed to each incremental step.

        Second, people become accustomed to automated transport

    • by jamesh ( 87723 )

      cutting across three lanes of traffic, driving 90 MPH, weaving in and out, running red lights, etc

      If you want that behavior download the @r53h0L3 patch...

      cause crashes if I don't have to put up with the human glitches that call themselves licensed drivers.

      I wonder how much of that sort of driving they have put the googlemobile through? Being a tester would be a whole lot of fun... send the googlemobile down a freeway and everyone else gets to cut it off, etc., and see how it responds.

    • Robot cars *do* have to be 100% reliable, because the automakers will bear culpability for crashes caused by an autopilot, and their much deeper pockets will result in lawsuits filed for damages several orders of magnitude higher than what Joe Sixpack faces when he hits someone. That risk of liability will keep car autopilots off the roads for the foreseeable future, even when the technology appears to have matured.

      • Robot cars *do* have to be 100% reliable

        Well then, I doubt that there will ever be robot cars. I don't believe that it's possible for humans to make something as complex as that to be 100% reliable.

        • I think you have to make some sort of Fight Club-ish equation involving probability of accidents, damages and overall revenue.

    • But what if a child is killed by these robot cars? If it's not a perfect solution (which, of course, exists, and human drivers are perfect), then it's a bad solution! Think of the children!

    • by mcrbids ( 148650 )

      I sooooo wish this were true! The problem is in concentrated wealth.

      If I (a "little people") crash a car, the most anybody could get out of me would be my life savings, which (at 40) adds up to a few hundred thousand. Enough for an ambulance chaser and a douchebag to make my life suck, but not enough to bring out the big guns.

      But when the "driver" of a car is a software company with millions of installs, any crash at all is enough for said ambulance chaser and douchebag to go for the jugular for millions. A

  • Does this mean that the car works better without humans? The "ever-so-small" sample size study says yes!
  • ah the computer will take care of it, I have rear view tv why should I bother turning my hea..bump

    I like safety, but I can't expect humans to do anything right as a whole. A great example would be a coworker of mine: he was focused so hard on his little TV screen that he didn't notice me standing inches from the side of his car as he backed out. I knocked on his window, thoroughly scaring him, and pointed to my eyes.

  • by adisakp ( 705706 ) on Friday August 05, 2011 @06:36PM (#37002256) Journal
    FTA: Google's Prius struck another Prius, which then struck the Honda Accord that her brother was driving. That Accord then struck another Honda Accord, and the second Accord hit a separate, non-Google-owned Prius.
  • I mean here the car is, a brain the size of a planet, and all we are asking it to do is to drive us around. I think it was attempted suicide.
  • by Marc_Hawke ( 130338 ) on Friday August 05, 2011 @06:50PM (#37002394)

    The author of the Business Insider article seems to think that a 'driverless car' killed his mother or something. Every sentence is a scathing attack on the audacity of Google to even be running these tests. He also never once entertains the idea that this might have been a normal fender-bender between normally driven vehicles. He just assumes Google's responses are bald-faced lies and implies that what really happened is that the computer decided to try to kill everyone else on the road.

    What I don't get is why he hates the car so much. I thought these cars were an exciting new technology. Why would he go out of his way to demonize it?

  • After reading this article and seeing the pictures, I'm buying a Prius!

    Striking a car with enough force to trigger a four-car chain reaction suggests the Google car was moving at a decent clip

    It caused all that carnage and I can't even see a scratch on the Google Prius or the Prius in front of it!

    • by jrumney ( 197329 )
      A more likely scenario based on the damage in that photo is that the four car chain reaction happened in front of the Google car, with the Google car unable to stop in time to avoid joining the pile up from the back (whether manual or auto driven, though in auto mode you'd hope that it would keep sufficient following distance to stop safely in these circumstances).
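
      The following-distance point above comes down to standard stopping-distance kinematics: d = v·t_react + v²/(2a). A minimal sketch follows; the reaction-time and deceleration values are illustrative assumptions, not figures from the article.

```python
# Stopping distance = reaction distance + braking distance.
# 7.0 m/s^2 is a typical assumed deceleration on dry pavement.

def stopping_distance_m(speed_kmh, reaction_s, decel_ms2):
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

# A sensor-plus-computer "reaction time" can be far shorter than a
# human's ~1.5 s, so an autonomous car could hold a shorter yet still
# safe following distance.
human = stopping_distance_m(50, 1.5, 7.0)    # ~1.5 s human reaction
robot = stopping_distance_m(50, 0.2, 7.0)    # ~0.2 s assumed sensor latency
print(round(human, 1), round(robot, 1))      # prints: 34.6 16.6
```

Either way, at 50 km/h the braking distance alone is about 14 m, which is why a car (auto or manual) that is following too closely cannot avoid joining a pile-up.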
  • Anyone who doesn't think we're going to see crashes with a new (semi)autonomous driving system is delusional or being obtuse. If one crash becomes some sensational national news story, one has to wonder why.

  • by Jeremi ( 14640 ) on Friday August 05, 2011 @07:52PM (#37002998) Homepage

    There's an inherent conflict between the prime directive of the Google auto-driving software ("drive safely"), and the prime directive of the Toyota firmware ("drive safely until the human isn't paying attention, then accelerate to top speed for as long as possible").

    It was only a matter of time before the Toyota side of the car's character came to the fore. ;^)

  • by Joe_Dragon ( 2206452 ) on Friday August 05, 2011 @07:59PM (#37003070)

    car crashes you!

  • Commercial aircraft are largely automated fly-by-wire systems. Every so often, there's a crash caused in part by sensor malfunction. Does the NTSB and FAA prohibit use of autopilot as a result?

    Humans crash cars on the road and kill each other all the time. So that means we should outlaw human-controlled driving mechanisms, of course.

    Some men are sexual predators and have abused children. That means we have to physically quarantine all men from all children, right?

    If your standard for progress is perfect

    • by artor3 ( 1344997 )

      Rationality doesn't matter. The media will conduct a scare campaign to drive up their ratings. Most people, and thus most voters, get their news entirely through the media. They will be kept outraged and afraid, as always, and self-driving cars will be banned.

  • The article neglects to mention the google-car's previous DUI. Influence of.....?
  • Although we like to think all accidents are preventable (and in theory, they are), that theory changes a bit when you claim that all accidents are preventable when only one driver is attempting to prevent them. Now, I'm sure this happened under a typical, well controlled situation (stopped cars in the middle of the street, for instance), something that happens quite regularly on any drive, and therefore a very typical obstacle. However, consider that there has to be SOME condition for which lesser of evil

  • Would it be accurate to say that only in California could a 5 car fender-bender involve three Priuses?
    • Would it be accurate to say that only in California could a 5 car fender-bender involve three Priuses?

      Yep.. and luckily nothing of value was lost.

  • The car barely touched the other car; you can tell by the picture, there's no visible damage. Most likely it just touched the bumper. It probably happens a couple of times a week. They are debugging, and that's good, so let them work in peace. How many car crashes are there every day around the world? And how many barely-touched-you incidents like this one? All they had to do was exchange insurance information. Instead, we could see a cop at the scene. Why? Probably because the other driver acted like an asshole, o

  • And the Wright brothers crashed planes...
    The advent and adoption of the self driving car will prove to be the single most life saving accomplishment of this century. If the google car went rogue and ran over a group of school children and the steering column punctured straight through the torso of the meatbag driver, I would still champion the development of this project. The technology to achieve the goal of self driving cities and highways has already existed for years. Adequate support and test

  • /., please find a good editor.

  • Both the HAL 9000 and SkyNet had perfect operational records right up until they, um, started having issues.

    Maybe having a glitch early on is a good sign.

  • Why shouldn't a car driven by one crash as well? w00t!

    Also worth noting: Thirty thousand people a year die in auto crashes. Could Google's robots do much worse?
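
    A quick back-of-envelope check on that figure: the 30,000 deaths/year is from the comment; the ~3 trillion annual US vehicle-miles figure is an approximate public statistic from around this era, used here only to put the number on a per-mile basis.

```python
# Convert annual US road deaths into a per-100-million-mile rate,
# the unit commonly used for traffic-safety statistics.
deaths_per_year = 30_000
miles_per_year = 3_000_000_000_000       # approximate US vehicle miles traveled

fatalities_per_100m_miles = deaths_per_year / (miles_per_year / 100_000_000)
print(fatalities_per_100m_miles)         # prints: 1.0
```

Roughly one death per 100 million miles driven is the human baseline a robot car would have to beat.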

  • There's not enough info available about this yet.

    I'd expect Google's driverless cars to have not only the Velodyne laser scanner and the vision system, but a dumb anti-collision radar system as a backup. We had one of those (an Eaton VORAD) on our DARPA Grand Challenge vehicle, just in case the more advanced systems failed. So did most of the Grand Challenge teams, including Stanford. You can buy that as a factory option on some cars now.

    So if they rear-ended another car, I'd suspect either manual dri

  • holy crap, now i've seen and read it all..
