Google Lobbies Nevada To Allow Self-Driving Cars
b0bby writes "The NY Times reports that Google is quietly lobbying for legislation that would make Nevada the first state in which self-driving cars could be legally operated on public roads. 'The two bills, which have received little attention outside Nevada's capitol, are being introduced less than a year after the giant search engine company acknowledged that it was developing cars that could be safely driven without human intervention.'"
Not yet. (Score:5, Insightful)
I would not feel safe with self driving cars on the road...yet.
Google's still a private company, and their word alone that these cars are safe does not a satisfied citizen make. Let these cars be thoroughly tested by both a government entity and a private third party before they are allowed on the road.
Furthermore, we all know that a program still in beta testing has its bugs. Even if the bugs were worked out so that a car "experienced a bug" only once every 100,000 miles, given the number of vehicles presently on the road and how much they are driven every day, that would still be too many "crashes" for society to find acceptable.
Re:Not yet. (Score:5, Insightful)
It's the combination of self driven and idiot driven ones that scares me most.
Re:Not yet. (Score:4, Insightful)
It might be possible if we could demonstrate ten years with semi-automatic driving. Have a computer in control most of the time with a human as backup. But I frankly don't believe that a self driving car can come close to dealing with all the corner cases involved in driving on public roads.
1 bug / 100,000 miles - I'll take that (Score:5, Insightful)
Re:Not yet. (Score:4, Insightful)
Humans can drive.
...badly.
Re:Not yet. (Score:5, Insightful)
Society is going to be the problem here anyway. People are going to freak out at cases where the driving AI is responsible for a fatal accident. A quick search shows that 33,808 people died in road accidents in the US in 2009. And that's apparently a 60-year low. This still translates to some 92 traffic fatalities per day. But society accepts that... whereas I'm sure they would freak out if a full transition to self-driving cars happened, with the driver AI being responsible for 1 fatality per day. Fatality numbers could go down by almost two orders of magnitude, but people would feel less safe on the road because of "killer cars" out there.
I feel this is a big problem overall - people are willing to accept human-controlled systems where the human factor regularly leads to accidents, injuries, and deaths, but if that system can be automated with a much lower accident rate, society still would not feel it is safe.
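The arithmetic above is easy to verify; here's a quick sanity check (the 1-fatality-per-day figure is the hypothetical from the comment, not real data):

```python
# Sanity check on the fatality figures quoted above.
annual_fatalities = 33808            # US road deaths in 2009 (figure cited above)
per_day = annual_fatalities / 365
print(round(per_day, 1))             # roughly 92.6 per day, matching the "some 92" claim

# Hypothetical all-self-driving fleet causing 1 fatality per day:
improvement = per_day / 1
print(round(improvement))            # ~93x fewer -- i.e. almost two orders of magnitude
```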
Re:Not yet. (Score:3, Insightful)
The funny thing is that they can't be thoroughly tested if they aren't actually *allowed* on the road.
Yes, in the same way that you can't really test a plane until it has its first flight with passengers aboard, or a bridge until you unleash rush hour traffic on it.
Backseat drivers (Score:3, Insightful)
I know that Bruce Schneier has said that human beings tend to overestimate risks when we feel that we are not in control and underestimate risks when we feel that we are in control. That's why people tend to feel more anxious in the passenger seat.
I think it is this innate sensibility that will be the biggest obstacle to self-driven cars, and will remain after the technological problems are solved.
Re:Not yet. (Score:3, Insightful)
People attack what they fear. People fear what they don't understand. The idiot is something they understand completely.
The same happened when steam- and kerosene-powered cars were first introduced. There was a strong movement to ban them, FUD about accidents they cause, etc.
On the other hand, a computer-driven car will be equipped with cameras and a black box. In the event of an accident it will be trivial to see whom to assign the blame to.