Google's Self-Driving Car Crashes 244
datapharmer writes "We've all read previous stories on Slashdot about Google's driverless car, and some have even pondered if a crash would bring the end to robotic cars. For better or for worse, we will all find out soon, as the inevitable has occurred. The question remains: who is to blame? A Google spokesperson told Business Insider that 'Safety is our top priority. One of our goals is to prevent fender-benders like this one, which occurred while a person was manually driving the car.'"
Johnny Cab (Score:5, Funny)
Re:Johnny Cab (Score:5, Funny)
Johnnycab: The fare is 18 credits, please.
[Quaid gets out]
Douglas Quaid: Sue me, dickhead!
[cab tries to run him down, crashes, and explodes]
Johnnycab: We hope you enjoyed the ride!
Summary is sensationalistic (Score:5, Informative)
Nothing to see here - move along please.
Re:Summary is sensationalistic (Score:5, Funny)
The car crashed while being driven by a person.
Maybe he was looking at the GPS and not paying attention to the road.
Re:Summary is sensationalistic (Score:4, Funny)
The article has an update, which says the incident caused a chain collision among 3 Priuses (Preii?) and an Accord.
So it appears that only one car was affected.
Re: (Score:2)
To me it sounds like the only thing dangerous here is driving a Prius.
ob XKCD (Score:2)
Re: (Score:2)
The other Prius's driver looked like Sarah Connor.
Re: (Score:2)
Re: (Score:2)
Re:Summary is sensationalistic (Score:5, Insightful)
The car crashed while being driven by a person.
According to a Google spokesperson. If I were in that car, and it crashed while the software was driving, I would claim that I had been driving it too. Any public crash that could be blamed on the software would put the project in serious jeopardy.
Re:Summary is sensationalistic (Score:4, Funny)
Re: (Score:2)
Laugh all you want, but this is not such a weird assumption without more facts about the subject.
And you already have proof that Google did the logical thing for a company and downplayed the magnitude of the event. They said 2 cars were involved when it was really 5.
Then again, it's what all smart companies will do when facing similar problems. It remains to be seen whether they're able to handle the many variables of mixed human and robot traffic.
Re: (Score:2)
Maybe it was just a shitty design.
Re: (Score:3)
Did you watch some other towers collapse than I did? It doesn't look anything like a controlled implosion. For one thing, controlled implosions always "pancake" from the bottom, with the main mass of the building squashing the lowest floors first (see here [youtube.com], or any of a million other videos), while the WTC pancaked from around the point of impact, about 3/4 of the way up.
Secondly, the towers didn't come
Re: (Score:2)
OTOH, if you lied and the cops found out you had lied, then I would think that could put the project in even more serious jeopardy.
Re:Summary is sensationalistic (Score:4, Insightful)
I did some quick research.
According to California officials, there are no laws that would bar Google from testing such models, as long as there's a human behind the wheel who would be responsible should something go wrong.
Taken from here: http://jalopnik.com/5661240/are-googles-driverless-cars-legal [jalopnik.com], which was linked from the article in the summary.
However, I would say that there is a difference between operating the car and manually driving the car. The Google spokesperson used the phrase "manually driving."
Re: (Score:2)
There's probably no provision in the law explicitly forbidding you to release sharks with fricken' lasers at the municipal swimming pool, but that doesn't mean it's legal. In most jurisdictions there's a basic provision in the law saying the operator of a vehicle must do so in a safe manner.
I suspect it's not any different in principle from engaging cruise control. In some situations that would be unsafe, therefore illegal. In practice, nobody knows how safe a new software system like this is, so there cou
Re: (Score:2)
According to California officials, there are no laws that would bar Google from testing such models, as long as there's a human behind the wheel who would be responsible should something go wrong.
Exactly what happened here. The car crashed, the human was held responsible. This says nothing about who or what was controlling the car at the time.
I have designed automatic control systems since I graduated from electronics engineering college in 1979. I hate most of the automatic features in cars today; they will never do until we have human-equivalent artificial intelligence.
I have had a few near-misses caused entirely by anti-lock brake algorithms that aren't good enough.
The last one was when the
Re: (Score:2)
And forgive me, but I wonder: how many near-misses have you had that would have been actual accidents but for the algorithms that react faster than you? I don't suppose you even noticed any...
Re: (Score:3)
The last one was when the ABS released the brakes because I hit a pothole when I was braking. One wheel locked because it was over a hole and the system came to the conclusion that the car was on a slippery surface. Another near crash I had due to ABS was when I stepped on the brakes just as the pavement dropped down in a ramp. The system apparently interpreted the sudden downwards acceleration as a bigger than normal deceleration and unlocked the brakes.
In both cases the ABS probably reacted correctly and may have saved you from a less controllable situation. ABS works on each wheel separately, so in the cases you experienced, brakes would still be applied on at least two wheels.
It makes sense to release the brake over a pothole as the wheel locks and will provide less traction once it hits the normal road surface again. It probably also makes sense to release the brakes on the wheel opposite the one that hit the pothole so asymmetric braking forces (whi
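To put the per-wheel behavior described above in concrete terms, here is a minimal, purely illustrative sketch of the kind of slip check an ABS controller might run independently for each wheel. The threshold, function names, and structure are assumptions for illustration, not taken from any real ABS implementation.

    # Illustrative per-wheel ABS slip check (assumed threshold, not a real system).
    SLIP_RELEASE_THRESHOLD = 0.2  # release pressure above roughly 20% slip

    def slip_ratio(vehicle_speed, wheel_speed):
        """Fraction by which the wheel turns slower than the car moves."""
        if vehicle_speed <= 0:
            return 0.0
        return (vehicle_speed - wheel_speed) / vehicle_speed

    def apply_brake_to_wheel(vehicle_speed, wheel_speed, braking_requested):
        """Return True if brake pressure should be applied to this wheel."""
        if not braking_requested:
            return False
        # A wheel hanging over a pothole stops spinning while the car keeps
        # moving, so it reports a high slip ratio and gets its pressure
        # released; the other wheels keep braking normally.
        return slip_ratio(vehicle_speed, wheel_speed) < SLIP_RELEASE_THRESHOLD

    # The pothole case the parent describes: one wheel near zero, car still fast.
    print(apply_brake_to_wheel(20.0, 0.5, True))   # False: release this wheel
    print(apply_brake_to_wheel(20.0, 18.0, True))  # True: keep braking

In the parent's scenario, only the wheel over the hole would trip the slip check, which matches the point that braking force remains on the other wheels.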
Re: (Score:2)
Yeah. And if anybody later found out it actually WAS the software instead of the human driver...
Re: (Score:2)
It would be trivial for Google to verify the logs, sure, but I doubt very much that the Mountain View Police Department would have an easy time with it.
They don't have to. All they need is to hire a credible computer expert to testify as to what the data actually says. If the subpoena includes Google's data analysis tools, it becomes even more trivial.
PS- "credible" means someone who doesn't spread conspiracy theories online.
Re: (Score:3, Insightful)
The other thing to consider is who is at fault for the collision. There are situations where, it doesn't matter who you are, you can't avoid a collision through no fault of your own. Example: You're driving in a construction zone with a car to your left and a construction barrier to your right. A deer jumps over the barrier and lands two feet in front of your car. You only get to choose whether you hit the deer, the barrier or the car to your left. There is no choice that avoids a collision. If a self-drivi
Re: (Score:3)
Example: You're driving in a construction zone with a car to your left and a construction barrier to your right. A deer jumps over the barrier and lands two feet in front of your car. You only get to choose whether you hit the deer, the barrier or the car to your left. There is no choice that avoids a collision. If a self-driving car is put in that situation, it has the same alternatives, and we shouldn't be at all surprised when some similar situation ultimately occurs.
Or when the two drivers' roles are reversed and the other one swerves into you to avoid the deer. Neither situation would be your fault, but in your original scenario you would be considered at fault because you're the one that caused the collision.
The huge difference is that an automated car would be able to see that deer coming and initiate a correction before it's even visible to the driver. It would take a human driver close to a second after the deer is visible before he even computes that there's a de
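To put that roughly one-second figure in perspective, here is a back-of-the-envelope sketch of how far a car travels before braking even begins; the 65 mph speed and 1.0 s reaction time are illustrative assumptions, not numbers from the article.

    # Distance covered during human reaction time (assumed speed and delay).
    mph_to_mps = 0.44704
    speed_mps = 65 * mph_to_mps      # about 29 m/s
    reaction_time_s = 1.0            # recognize the deer, decide, move foot
    reaction_distance_m = speed_mps * reaction_time_s
    print(round(reaction_distance_m, 1))  # ~29.1 m travelled before any braking

A system that reacts within, say, 100 ms would give up less than 3 m over the same interval, which is the gap the parent comment is pointing at.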
Re: (Score:2)
Are you kidding? This is the epiphany of news for nerds. It's amazing. The Google car has crashed! Who cares about the details. It's all over. Even Google can't make the perfect car. Why do they allow a manual override in the first place. OMG.
Actually the real Oh My God moment was that I just clicked on that waste of a link.
Re: (Score:3)
The word you want is epitome, not epiphany.
Re:Summary is sensationalistic (Score:4, Funny)
Ah yes, the epiphany of epitome.
Re: (Score:2)
Lets talk about something a bit more relevant.
Basically, a small issue is that bugs will occur. If the car's AI actually crashed the car, ain't it actually a really good thing? I mean, the bug would otherwise have been present.
And since it crashed, they can figure out WHY it crashed, and that means they can fix the bug.
And the same thing applies to everything: While doing R&D you actually want a few of your products to break badly, so you can fix the fault that caused it.
Re: (Score:3)
How do we know that the following didn't happen:
The car was in automatic drive.
A problem occurred and it appeared that a crash was about to occur.
The driver took control of the vehicle.
There was not enough time to avoid the crash and the crash occurred.
Google can truthfully say that at the time of the crash the car was in manual control but the crash was still caused by the computer.
Re: (Score:2)
Re: (Score:2)
We have the same point. There are some people who are taking the statement by Google that the car was being driven manually when the accident occurred at face value. I was just putting forward an alternate scenario in which the basic statement was true but may not tell the whole story. There is not enough information in the articles to make an informed judgment on what happened.
Even when the car is being driven by computer there is always someone in the driver's seat in case something goes wrong. In my scen
Re: (Score:2)
Yes, but it is Prius-on-Prius violence. The only thing better would have been if the drivers had gotten into an altercation as well. Seriously, I'm always shocked when a Prius driver turns out to be anything other than a self-obsessed jackass; I swear they take special classes to teach them how to drive in an incompetent and self-obsessed manner.
Re: (Score:2)
That's what the Google spokesman says. A crash would be very bad PR for Google's pointless self-driving car project.
Re:This is Slashdot. (Score:5, Funny)
Man Crashes Car? That's no story. CAR CRASHES MAN!!! Now *that's* a story.
Re: (Score:2)
Except in Soviet Russia...
Re: (Score:2)
True, but in this case, this non-story is just a chance to let everybody bring in their Three Laws jokes.
Wildly misleading headline (Score:5, Informative)
Relevant quote: "...occurred while a person was manually driving the car."
Headline should be: "Human damages Google car by operating it with his own slow, meaty appendages"
Re:Wildly misleading headline (Score:5, Interesting)
(a) it was indeed the human's fault
(b) the robot effed up first, then the human took over and attempted (unsuccessfully) to recover
Re: (Score:2)
Who is to blame? (Score:2, Funny)
Why, Apple, Microsoft, and Yahoo!, and maybe Oracle too!
Re: (Score:2)
Built Upon Failures (Score:2)
Re: (Score:2)
Re: (Score:2)
Who do we get to punish!?
The corporation has to pay. And, when all is said and done, if their behavior was especially egregious they'll pay a lot. That's just the way it is. And yes, it does take time and money. If it were any other way, nobody would ever be an engineer, nobody would ever build anything of consequence, because going to jail for doing your job is just not a worthwhile risk for most people.
Re: (Score:2)
who goes to jail?
How about no one? Why must someone go to jail for what would probably be perceived by most to be an unfortunate occurrence?
Re: (Score:2)
Both. Mistakes happen (no matter who made them happen). It doesn't mean that someone always needs to go to jail.
Re: (Score:2)
Re: (Score:2)
Concorde was killed off for many other reasons unrelated to the crash; most critically, it was a money and fuel sponge.
Nuclear, though, I agree. Apparently coal and its hundreds to thousands of deaths are OK because we've had it since man first sent a child into a mine shaft to play in the dirt. Nuclear, though? GAAH! MUTANT THREE-EYED FISH!!!!
Re: (Score:3)
Concorde was killed off for many other reasons unrelated to the crash; most critically, it was a money and fuel sponge.
Nuclear, though, I agree. Apparently coal and its hundreds to thousands of deaths are OK because we've had it since man first sent a child into a mine shaft to play in the dirt. Nuclear, though? GAAH! MUTANT THREE-EYED FISH!!!!
Yes, and what's tragironic about that is that many coal fields are naturally radioactive, and we (as in "pretty much everyone on the planet") have been breathing thorium dust for over a century now. Thorium that would have been better off staying in the ground. The unfortunate reality is that some number of people die every year just from that particular aspect of our use of coal for power. Well-designed nuclear power facilities (and no, I don't mean obsolescent junk like what lit off in Japan recently, and
Re: (Score:2)
Has anyone here ever read Heinlein's Starship Troopers (no comment about the movie)? There's a scene where they discuss a human colony on this planet that's exactly like earth, only with far less ionizing radiation. The discussion goes that in a few thousand years, the humans that settle on that world will be evolutionarily impaired due to lack of mutations.
That is, unless they purposefully irradiate themselves.
Re: (Score:2)
The colony will be perfectly fine. Most genetic mutations are spontaneous, caused by defects in the molecular transcription processes. And even among induced mutations, there's plenty of chemical or biological agents in the environment that do more damage than ionizing radiation.
Re: (Score:2)
The proximate cause of Concorde actually being cancelled was the maintenance company deciding it didn't want to do it any more and putting in a stupidly high quote for the renewal.
Google has deep pockets, get a good lawyer (Score:2)
Anyway, legal liability is a big holdup for autonomous cars. At least at the start, the only way to have them will be auto-drive-only roads, and even then there will need to be some kind of no-fault system, or someone guaranteeing that all repair costs will be covered, or dedicated auto-drive insurance. Also, the cops and courts will need someone to take the fall if any laws are broken.
100% reliability not needed (Score:5, Insightful)
I've posted this before and I'll post it again.
Robot cars don't have to be 100% reliable. As long as they're more reliable than the jerks who normally scare the bejesus out of me by cutting across three lanes of traffic, driving 90 MPH, weaving in and out, running red lights, etc., then I'm all for a robot car-driven society. I'm willing to put up with the computer glitches that, on very rare occasions, cause crashes if I don't have to put up with the human glitches that call themselves licensed drivers.
Re:100% reliability not needed (Score:4, Insightful)
Re: (Score:2)
Mostly the scare campaigns will be generated by people with other agendas... Think teamsters wanting to protect jobs for drivers. There are a *lot* of people who stand to lose their living once self-driving cars start to be deployed.
You can see prototype scare campaigns of this sort anywhere that has contemplated driverless mass transit systems.
I suspect that in some jurisdictions (where unions have political pull) we will see laws enacted that require a human "driver" be available to override the controls
Re: (Score:2)
It doesn't matter that you're okay with it. The media will jump on it to create a scare, so that they can get more advertising revenue. Their victims will get scared and demand their congressmen ban self-driving cars. History has shown that politicians who try to reason with raving, scared citizens end up having short careers.
There will never be self-driving cars. Not in our lifetimes. Technology allows them, but society doesn't.
Re: (Score:2)
Strongly disagree.
First of all, we already have automatic braking systems, cruise control, electronic stability control and other computer assisted driving methods. And they can fail. The argument you are making would lead us to conclude that a couple of ABS failures would lead to banning the technology, but that hasn't happened. The computer is taking over the automobile in stages, and people will have time to become accustomed to each incremental step.
Second, people become accustomed to automated transp
Re: (Score:2)
cutting across three lanes of traffic, driving 90 MPH, weaving in and out, running red lights, etc
If you want that behavior download the @r53h0L3 patch...
cause crashes if I don't have to put up with the human glitches that call themselves licensed drivers.
I wonder how much of that sort of driving they have put the googlemobile through? Being a tester would be a whole lot of fun... set the googlemobile down a freeway and everyone else gets to cut it off etc and see how it responds.
Re: (Score:3)
I wonder how much of that sort of driving they have put the googlemobile through? Being a tester would be a whole lot of fun... set the googlemobile down a freeway and everyone else gets to cut it off etc and see how it responds.
It'd make sense to do this in a simulator... a good deal cheaper... actually, come to think of it, I'd be vastly surprised if Google hasn't already done this.
Scene 2: A bunch of Google techs crowded around their broken googlemobile, scratching their heads and muttering "strange... it worked perfectly in the simulator"
Re: (Score:2)
Robot cars *do* have to be 100% reliable, because the automakers will bear culpability for crashes caused by an autopilot, and their much deeper pockets will result in lawsuits filed for damages several orders of magnitude higher than what Joe Sixpack faces when he hits someone. That risk of liability will keep car autopilots off the roads for the foreseeable future, even when the technology appears to have matured.
Re: (Score:2)
Robot cars *do* have to be 100% reliable
Well then, I doubt that there will ever be robot cars. I don't believe that it's possible for humans to make something as complex as that to be 100% reliable.
Re: (Score:2)
I think you have to make some sort of Fight Club-ish equation involving probability of accidents, damages and overall revenue.
Re: (Score:2)
Re: (Score:2)
But what if a child is killed by these robot cars? If it's not a perfect solution (which, of course, exists, and human drivers are perfect), then it's a bad solution! Think of the children!
Re: (Score:2)
I sooooo wish this were true! The problem is in concentrated wealth.
If I (a "little people") crash a car, the most anybody could get out of me would be my life savings, which (at 40) adds up to a few hundred thousand. Enough for an ambulance chaser and a douchebag to make my life suck, but not enough to bring out the big guns.
But when the "driver" of a car is a software company with millions of installs, any crash at all is enough for said ambulance chaser and douchebag to go for the jugular for millions. A
Re:100% reliability not needed (Score:4, Funny)
you say this now, but wait until the smartcar you're in gets caught in an infinite loop!
So you get lost around the Apple campus. What's the big deal?
Re: (Score:2)
I imagine it more as a bell curve, scaled on an accumulation of tickets and accidents over time. Nine out of ten drivers will get at least one speeding ticket in their lifetime and be involved in 3-5 accidents serious enough to be reported. As I have not had either yet in 15 years of driving, I can safely consider myself an above-average driver.
so the machine have finally taken over (Score:2)
It makes you too comfortable (Score:2)
ah, the computer will take care of it, I have a rear-view TV, why should I bother turning my hea... bump
I like safety, but I can't expect humans to do anything right as a whole. A great example would be a coworker of mine: he was focused so hard on his little TV screen that he didn't notice me standing inches away from the side of his car as he backed out. I knocked on his window, thoroughly scaring him, and pointed to my eyes.
Does Everyone on CA own a Prius or Accord? (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
Puget Sound too. Gotta love my '93 Saturn with 35MPG though.
Re: (Score:2)
Another question: How many of those Accords were hybrids? :)
Re:Does Everyone on CA own a Prius or Accord? (Score:4, Funny)
That would explain that cloud of smug that has been collecting over California.
Re: (Score:2)
Just chiming in to confirm a sibling post: Yes, in the Bay Area.
Re: (Score:2)
Maybe that's why it crashed? Calculating the possibility of that fried its circuits.
I think it was depressed. (Score:2)
Why so antagonistic? (Score:3)
The author of the Business Insider article seems to think that a 'driverless car' killed his mother or something. Every sentence was a scathing attack on the audacity of Google to even be running these tests. He also never once entertains the idea that this might have been a normal fender-bender between normally driven vehicles. He just assumes Google's responses are bald-faced lies and implies what really happened is that the computer decided to try to kill everyone else on the road.
What I don't get is why he hates the car so much. I thought these cars were an exciting new technology. Why would he go out of his way to demonize it?
Rugged Prius (Score:2)
After reading this article and seeing the pictures, I'm buying a Prius!
Striking a car with enough force to trigger a four-car chain reaction suggests the Google car was moving at a decent clip
It caused all that carnage and I can't even see a scratch on the Google Prius or the Prius in front of it!
Re: (Score:2)
New autonomous test cars crash? NO WAI! (Score:2)
Anyone who doesn't think we're going to see crashes with a new (semi)autonomous driving system is delusional or being obtuse. If one crash becomes some sensational national news story, one has to wonder why.
You can't really blame Google (Score:3)
There's an inherent conflict between the prime directive of the Google auto-driving software ("drive safely"), and the prime directive of the Toyota firmware ("drive safely until the human isn't paying attention, then accelerate to top speed for as long as possible").
It was only a matter of time before the Toyota side of the car's character came to the fore. ;^)
In soviet russia (Score:3)
car crashes you!
logic fail (Score:2)
Commercial aircraft are largely automated fly-by-wire systems. Every so often, there's a crash caused in part by sensor malfunction. Does the NTSB and FAA prohibit use of autopilot as a result?
Humans crash cars on the road and kill each other all the time. So that means we should outlaw human-controlled driving mechanisms, of course.
Some men are sexual predators and have abused children. That means we have to physically quarantine all men from all children, right?
If your standard for progress is perfect
Re: (Score:2)
Rationality doesn't matter. The media will conduct a scare campaign to drive up their ratings. Most people, and thus most voters, get their news entirely through the media. They will be kept outraged and afraid, as always, and self-driving cars will be banned.
Second Offense (Score:2)
Re: (Score:2)
Must have been the Ethanol blend of the fuel.
Maybe it crashed on purpose.... (Score:2)
Although we like to think all accidents are preventable (and in theory, they are), that theory changes a bit when you claim that all accidents are preventable when only one driver is attempting to prevent them. Now, I'm sure this happened under a typical, well controlled situation (stopped cars in the middle of the street, for instance), something that happens quite regularly on any drive, and therefore a very typical obstacle. However, consider that there has to be SOME condition for which lesser of evil
Only in California... (Score:2)
Re: (Score:2)
Would it be accurate to say that only in California could a 5 car fender-bender involve three Priuses?
Yep.. and luckily nothing of value was lost.
News: It wasn't a crash ... (Score:2)
The car barely touched the other car; you can tell by the picture that there's no visible damage. Most likely it just touched the bumper. It probably happens a couple of times a week. They are debugging, that's good, let them work in peace. How many car crashes are there every day around the world? And how many barely-touched-you incidents like this one? All they had to do was exchange insurance information. Instead, we could see a cop at the scene. Why? Probably because the other driver acted like an asshole, o
so what. (Score:2)
And the Wright brothers crashed planes...
The advent and adoption of the self-driving car will prove to be the single most life-saving accomplishment of this century. If the Google car went rogue and ran over a group of school children and the steering column punctured straight through the torso of the meatbag driver, I would still champion the development of this project. The technology to achieve the goal of self-driving cities and highways has already existed for years. Adequate support and test
Fuck you, Timothy (Score:2)
/., please find a good editor.
Look on the bright side (Score:2)
Both the HAL 9000 and SkyNet had perfect operational records right up until they, um, started having issues.
Maybe having a glitch early on is a good sign.
Computers crash all the time. (Score:2)
Why shouldn't a car driven by one crash as well? w00t!
Also worth noting: Thirty thousand people a year die in auto crashes. Could Google's robots do much worse?
Don't they have a radar, too? (Score:2)
There's not enough info available about this yet.
I'd expect Google's driverless cars to have not only the Velodyne laser scanner and the vision system, but a dumb anti-collision radar system as a backup. We had one of those (an Eaton VORAD) on our DARPA Grand Challenge vehicle, just in case the more advanced systems failed. So did most of the Grand Challenge teams, including Stanford. You can buy that as a factory option on some cars now.
So if they rear-ended another car, I'd suspect either manual dri
wow (Score:2)
Re: (Score:2)
BTW, would I have to list my car/computer as a driver on my insurance policy? Does it get a license, and can it accumulate points and get suspended? Maybe the points can go directly to the developer's license... If Google is working on AI and a human really did crash the car, I hope that person has a really good attorney...
Doesn't work that way. I'm no lawyer, but I am a software developer, and I work on some fairly mission-critical stuff. So yes, I did consult an attorney regarding my own personal liability. What it comes down to (in the U.S. at least) is that the company takes on that liability. Unless, of course, you do something criminal like sabotage a control program or something ... but your employer assumes the normal costs and risks of doing business. If you are just doing your job you generally can't be held liable