AI Google Transportation Technology

California Legalizes Self Driving Cars

Hugh Pickens writes "The Seattle PI reports that California has become the third state to explicitly legalize driverless vehicles, setting the stage for computers to take the wheel along the state's highways and roads ... 'Today we're looking at science fiction becoming tomorrow's reality,' said Gov. Brown. 'This self-driving car is another step forward in this long march of California pioneering the future and leading not just the country, but the whole world.' The law immediately allows for testing of the vehicles on public roadways, so long as properly licensed drivers are seated at the wheel and able to take over. It also lays out a roadmap for manufacturers to seek permits from the DMV to build and sell driverless cars to consumers. Bryant Walker Smith, a fellow at Stanford's Center for Automotive Research, points to a statistical basis for safety that the DMV might consider as it begins to develop standards: 'Google's cars would need to drive themselves (by themselves) more than 725,000 representative miles without incident for us to say with 99 percent confidence that they crash less frequently than conventional cars. If we look only at fatal crashes, this minimum skyrockets to 300 million miles. To my knowledge, Google has yet to reach these milestones.'"
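
For context, Smith's figures are consistent with a simple zero-failure calculation: if crashes arrive at a constant per-mile rate, then driving m miles with no incident at all rules out the human baseline rate at 99 percent confidence once e^(-rate x m) drops below 0.01, i.e. once m exceeds ln(100) divided by the baseline rate. The short Python sketch below reproduces that arithmetic; the baseline rates used (roughly one police-reported crash per 160,000 miles and one fatal crash per 65 million miles) are illustrative assumptions for this sketch, not figures taken from the article.

    from math import log

    def miles_for_confidence(baseline_crashes_per_mile, confidence=0.99):
        # P(zero crashes in m miles | baseline rate) = exp(-rate * m).
        # Require that probability to fall below 1 - confidence before
        # concluding the car crashes less often than the human baseline.
        return log(1.0 / (1.0 - confidence)) / baseline_crashes_per_mile

    # Assumed human baselines (rough, for illustration only):
    any_crash_rate = 1 / 160_000        # ~1 police-reported crash per 160k miles
    fatal_crash_rate = 1 / 65_000_000   # ~1 fatal crash per 65M miles

    print(f"{miles_for_confidence(any_crash_rate):,.0f} miles")    # ~737,000
    print(f"{miles_for_confidence(fatal_crash_rate):,.0f} miles")  # ~299,000,000
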
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Wednesday September 26, 2012 @12:41PM (#41465457)

    A human won't pass that test 100% of the time either, so I'm not sure what your point is about 100%. It's all statistics.

  • by h4rr4r ( 612664 ) on Wednesday September 26, 2012 @12:44PM (#41465487)

    Why would a self-driving car ever drive off a cliff?
    Clearly it would rank the available options and pick the lowest-cost one: the cheapest collision, in that case.

    Human drivers allow fatalities every day. The question is not whether it is better than some hypothetical human driver, but whether it is better than the drivers we have right now.

    Five years ago the tech to do this was not cheap enough; now it is. This is called progress, not being irresponsible. What is irresponsible is suggesting that the average person continue to drive automobiles when we have a better solution at hand.

  • by cyberchondriac ( 456626 ) on Wednesday September 26, 2012 @12:50PM (#41465551) Journal
    My first gut instinct is, this is bad, bad, bad... but then I think of the stupid beatch in the Hyundai that blew by me at 85 mph, then cut into my lane, making me slam on my brakes while driving to work this morning... so maybe it's not so bad.
  • by rich_hudds ( 1360617 ) on Wednesday September 26, 2012 @12:51PM (#41465563)
    I think you're entirely wrong.

    A much more likely scenario is that the self driving cars prove statistically to be safer than human driven cars.

    At that point expect legislation to ban humans from driving.

    Imagine trying to defend yourself in court if you've caused a fatal accident.

    'Why did you turn off the computer when you know it is proven to be safer?'
  • by h4rr4r ( 612664 ) on Wednesday September 26, 2012 @12:54PM (#41465621)

    Driving is enjoyable?
    Since when?

    Sure, a race track is enjoyable and twisty deserted roads can be fun, but 99% of driving is mind-numbing boredom.

  • by CanHasDIY ( 1672858 ) on Wednesday September 26, 2012 @12:58PM (#41465675) Homepage Journal

    'Why did you turn off the computer when you know it is proven to be safer?'

    "Because my brain operates at a frequency modern computers cannot even begin to match, and it cannot be hacked."

  • by Altanar ( 56809 ) on Wednesday September 26, 2012 @12:59PM (#41465695)
    As far as I'm concerned, letting humans drive is putting trust in the other human drivers around me, and frankly, I don't trust them at all. I'd feel much safer if manual driving was illegal.
  • by h4rr4r ( 612664 ) on Wednesday September 26, 2012 @01:00PM (#41465713)

    1. Your reaction time is absolute crap.
    2. Advertisers disagree with your notion that human brains cannot be hacked.

  • by h4rr4r ( 612664 ) on Wednesday September 26, 2012 @01:21PM (#41465975)

    No, the question is: is the car a better driver than me when I am sleep-deprived, upset at my wife, and in a hurry to get home?

    The computer will always drive the same; humans are not that reliable.

  • by 0123456 ( 636235 ) on Wednesday September 26, 2012 @01:22PM (#41465995)

    I believe on public roads you do need a human available to take over for legal reasons.

    And that worked so well for AF447.

    Aviation autopilots should have proven by now that relying on a human to take over when the situation is so bad the autopilot can't handle it is a recipe for disaster. Besides, what's the point of a 'driverless car' if I have to be continually ready to take over at a millisecond's notice?

    Car: 'Warning, warning, kid just jumped out into the road, you are in control.'
    Driver: 'WTF? I just hit a kid and smeared their insides all over my windshield.'
    Car manufacturer: 'Not our fault, driver was in control; human error.'

  • by Matimus ( 598096 ) <mccredie@g[ ]l.com ['mai' in gap]> on Wednesday September 26, 2012 @01:34PM (#41466169)
    I have known a few terrible drivers in my life. Despite their friends (and occasionally strangers) telling them they were terrible drivers, despite multiple collisions in which vehicles were totaled, and even collisions with pedestrians, they still believed they were good drivers. Individuals may not be the best judges of whether or not they can drive better than a machine.

    It will be interesting to see how this plays out. How the public perceives it. How it is marketed. How it is handled by insurance companies.

  • by doom ( 14564 ) <doom@kzsu.stanford.edu> on Wednesday September 26, 2012 @01:43PM (#41466315) Homepage Journal
    I don't think there's any question that automated cars can beat human beings at safety, nor is there any question that they can reduce pollution just by driving more evenly (not to mention by drafting each other, "tailgating" to form car-trains).

    The trouble with them is that they'll take the sting out of long commutes. You already have people who think it's a good idea to spend four hours a day driving for the sake of cheaper real estate. What if they up it to six hours a day when they don't have to stare at the road?

    Note: cutting a per-mile problem (pollution, car deaths) in half does no good if you double the miles driven; the total harm is per-mile harm times miles.

  • by mellon ( 7048 ) on Wednesday September 26, 2012 @01:45PM (#41466339) Homepage

    I think that the way it will play out is that as self-driving cars become a real and viable option, the penalties for bad driving will go up—drive drunk once, and you lose your license permanently, because why not—you can just use a self-driving car. Driver's tests will get harder, because why not—if you fail, you can just use a self-driving car. It will start with really egregious behavior, because voters won't feel threatened by it in sufficient numbers to cause a problem. Over time, the standards for human drivers will go up; at some point driving your own car will be about as common as flying your own airplane. We'll also probably stop giving licenses or learners' permits to teenagers, because they don't have the vote, and their parents would prefer to avoid a teenage testosterone tragedy.

    Of course, a really spectacular failure on the part of a self-driving car could put that whole scenario off by a generation.

  • by flimflammer ( 956759 ) on Wednesday September 26, 2012 @02:06PM (#41466629)

    Are you an objective source for deciding whether or not you're a better driver than the machine?

  • by drerwk ( 695572 ) on Wednesday September 26, 2012 @02:23PM (#41466837) Homepage

    The system just needs a rapid manual override and a little common sense from the driver.

    See the results of the http://en.wikipedia.org/wiki/Air_France_Flight_447 [wikipedia.org] AF447 flight for the odds of this working. As a one-time private pilot, I am totally baffled as to how a professional pilot could hold a plane in a stall from 35,000 ft to the ground. I think there were several issues, including human factors in the design of the interfaces, but I really think that these guys got used to being along for the ride and that it was not conceivable to them that the plane had decided to stop flying itself.

    After a week of having an auto-car drive me to work every day, I cannot imagine I'd be ready within half a second to suddenly take over for the computer and expect a good result.

  • by SolitaryMan ( 538416 ) on Wednesday September 26, 2012 @03:18PM (#41467475) Homepage Journal

    I wonder how insurance companies are going to handle this. My self-driving car hit your self-driving car. Who's going to pay? Yeah, my car is at fault, but I wasn't at the wheel, and I don't even have a license. What then? What if a collision is due to a bug in the software?

    I'm afraid the legal obstacles this project faces are more serious than the technical ones.
