Self-Driving Cars Should Be Legal Because They Pass Safety Tests, Argues Google (theverge.com)

An anonymous reader quotes an article on The Verge: Chris Urmson, director of Google's self-driving car project, has sent a letter to US Transportation Secretary Anthony Foxx today with a plan for selling autonomous vehicles that have no steering wheels or pedals. The plan appears to be pretty straightforward: Urmson argues that if a self-driving car can pass standardized federal safety tests, it should be road-legal. Urmson adds that regulators could 'set conditions that limit use based on safety concerns.'

  • by Anonymous Coward on Saturday March 19, 2016 @01:45PM (#51732127)

    About three years ago I accidentally let my license expire and thus had to re-take the driving component of the exam.

    I am somewhat convinced you could pass it with a non-autonomous vehicle having no steering wheel or pedals.

    • I failed my driving test on a critical error. ... They gave me the license anyway and told me to watch out a bit when I'm driving or I may end up hurting someone. Quite frankly, driving tests are a joke, designed to be passable by the lowest common denominator of society.

      • by jez9999 ( 618189 ) on Saturday March 19, 2016 @07:13PM (#51734105) Homepage Journal

        Not in the UK. Maybe Google should try having one of their cars pass the test over here.

      • It is not the test's deficiencies but the complete lack of tests...for old people. Really old people. People in their 80s and 90s.

        I saw an old guy at a red light suddenly bolt into the intersection where traffic moves at 50mph. No reason, lots of witnesses, he hit someone of course. Just too old to be driving.

        In Canada my grandmother had to retake her test every so many years once she reached 65. The US doesn't seem to have that standard. I'm not sure why.

        • Keep in mind that driver licensing in both Canada and the US is done at the province/state level, not the national level. Requirements vary widely.
        • by mjwx ( 966435 )

          It is not the test's deficiencies but the complete lack of tests...for old people. Really old people. People in their 80s and 90s.

          I saw an old guy at a red light suddenly bolt into the intersection where traffic moves at 50mph. No reason, lots of witnesses, he hit someone of course. Just too old to be driving.

          In Canada my grandmother had to retake her test every so many years once she reached 65. The US doesn't seem to have that standard. I'm not sure why.

          Because old people vote and complain more.

  • by Anonymous Coward on Saturday March 19, 2016 @01:46PM (#51732129)

    We can't write a 100% working OS for a phone. Please trust our software with your life.

    • by guruevi ( 827432 ) on Saturday March 19, 2016 @02:10PM (#51732297)

      There are plenty of computers in use (a lot of the better ones are running Linux or an RTOS and hell, even Windows NT/CE/XP) that people trust their lives to implicitly on a daily basis, in a lot more delicate situations than driving a car. Commercial planes do most of the flying fully autonomously, most of both your debt and savings is managed by fully automated systems, and any machine in a hospital parses a lot more data than a few dozen sensors and requires much more precision.

      • by epyT-R ( 613989 )

        Flight is actually a much simpler problem to solve for an AI than ground travel. Planes don't typically have to avoid unexpected obstacles because their vectors are carefully monitored and controlled by pilots in the air and controllers on the ground. So while the speeds and distances are much greater, the path to the destination is much simpler (even if elliptical).

        Machines are quicker, yes, but a lot dumber and lack situational awareness. A medical machine monitoring vitals can notice changes a lot more quickly

        • Machines are quicker, yes, but a lot dumber and lack situational awareness

          On the other hand, machines are capable of watching dozens of different sensors and cameras at once, in all directions around the car, with much higher precision, and without getting distracted or sleepy. What they are lacking right now is human-like interpretation of what they see, but that's a field that is rapidly improving.

        • Look up TCAS sometime. The planes have sensors to detect each other. If the TCAS system detects a possible collision situation, the planes determine, all by themselves, the correct course of action, and then relay that information to the pilot. Commands like CLIMB or DESCEND or STAY LEVEL. In this situation, the pilot has absolutely no say in the matter. They are required to obey the computer because in the past, pilots ignoring this input have caused planes to crash into each other in mid-air because the pilot thought he knew better. The TCAS commands even override Air Traffic Control commands. How's that for trusting your life to a computer?
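
          For a rough sense of how that kind of coordination can work, here is a minimal, illustrative sketch in Python. It is not the real TCAS II logic: the time-to-collision threshold, the altitude tie-break, and the function names are made-up stand-ins, and the actual Mode S coordination between the two aircraft is omitted.

          # Illustrative sketch only; NOT the actual TCAS II algorithm.
          from dataclasses import dataclass

          @dataclass
          class Aircraft:
              ident: str
              altitude_ft: float  # current altitude in feet

          def resolution_advisories(own: Aircraft, intruder: Aircraft,
                                    time_to_cpa_s: float,
                                    threat_threshold_s: float = 25.0):
              """Return (own_advisory, intruder_advisory) if a threat is detected, else None.

              time_to_cpa_s is the estimated time to the closest point of approach.
              Real TCAS units coordinate over a data link so the two advisories are
              always complementary; here we simply pick by relative altitude.
              """
              if time_to_cpa_s > threat_threshold_s:
                  return None  # no immediate threat, keep monitoring

              # Complementary advisories: the higher aircraft climbs, the lower one descends.
              if own.altitude_ft >= intruder.altitude_ft:
                  return ("CLIMB", "DESCEND")
              return ("DESCEND", "CLIMB")

          # Two aircraft estimated to be 20 seconds from their closest point of approach.
          print(resolution_advisories(Aircraft("AC1", 31000), Aircraft("AC2", 30800),
                                      time_to_cpa_s=20.0))  # -> ('CLIMB', 'DESCEND')
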

      • There are plenty of computers in use (a lot of the better ones are running Linux or an RTOS and hell, even Windows NT/CE/XP) that people trust their lives to implicitly on a daily basis, in a lot more delicate situations than driving a car. Commercial planes do most of the flying fully autonomously, most of both your debt and savings is managed by fully automated systems, and any machine in a hospital parses a lot more data than a few dozen sensors and requires much more precision.

        Driving is a far more difficult problem than autoland, autopilot, and auto-takeoff on an airplane.

        So if one vendor's software passes a driving test let it also share all the driver's license "points" accumulated by all the autonomous vehicles. So if it makes too many mistakes or gets into too many accidents it looses its license. Again, not an individual car, all cars running the vendor's software.
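
        As a toy illustration of that fleet-wide "points" idea (purely hypothetical bookkeeping, not anything Google or any regulator has proposed), the core of it is a single shared demerit ledger per software vendor: every incident by any car running that vendor's software counts against the same license, and crossing a threshold grounds the whole fleet. The threshold and point values below are invented for the example.

        # Toy sketch of pooled demerit points per software vendor; values are made up.
        from collections import defaultdict

        SUSPENSION_THRESHOLD = 12          # hypothetical points before the license is pulled
        vendor_points = defaultdict(int)   # vendor name -> accumulated points

        def record_incident(vendor: str, points: int) -> bool:
            """Add demerit points to the vendor's shared license.

            Returns True once the vendor's license should be suspended, which
            would apply to every vehicle running that vendor's software.
            """
            vendor_points[vendor] += points
            return vendor_points[vendor] >= SUSPENSION_THRESHOLD

        # Incidents reported by different cars, all running the same vendor's software.
        for car_id, points in [("car-001", 3), ("car-042", 4), ("car-107", 6)]:
            if record_incident("ExampleAV", points):
                print(f"Software license suspended after incident by {car_id}")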

        • pilots land and take off manually.

          • by Pikoro ( 844299 )

            Unless the weather isn't conducive to VFR, in which case, guess what? The autopilot lands the plane. In situations where the pilot cannot be trusted to land due to poor visibility or other issues, the autopilot is king. I've taken an A320 class D flight simulator from ATL to JFK without touching the controls except to move the throttles from TOGA to the climb position, set the flaps, and set the landing gear.

            • by KGIII ( 973947 )

              An automobile not only has many, many more variables but it has variables that apply to it as well as variables that apply to other vehicles and terrain types. Call me back when your AV can get me to my home, in a snowstorm, without having been there before. Then, it needs to be safer than I am.

              My home is in NW Maine, near the edge of cellular coverage, but there are actually two towers to work with. It's on the side of a mountain, the driveway is paved, and it's about a half-mile long. It's steep but not too

        • by kuzb ( 724081 ) on Saturday March 19, 2016 @03:04PM (#51732657)

          "So if it makes too many mistakes or gets into too many accidents it looses its license."

          You now have one point on your spelling license.

  • by epyT-R ( 613989 ) on Saturday March 19, 2016 @01:46PM (#51732131)

    ...and lots of them have later been proven to be unsafe anyway. The law cannot account for everything.

    • by wonkey_monkey ( 2592601 ) on Saturday March 19, 2016 @02:08PM (#51732275) Homepage

      The same goes for drivers.

      • by epyT-R ( 613989 )

        That's true. The difference is that a half-attentive human is still far more situationally aware than a computer.

    • Re: (Score:3, Interesting)

      by Dutch Gun ( 899105 )

      This shows what a horrible idea it was for Google to remove the standard driver controls from their car design.

      First, it gives absolutely no backup when the inevitable failure occurs and the car doesn't know WTF to do. For example how exactly are you supposed to direct the car to a specific parking spot inside a garage?

      Second, it was stupid simply from a regulatory point of view. Yeah, no kidding regulators are not going to be thrilled about letting version 1.0 of an autonomous vehicle on the road without

      • by slashping ( 2674483 ) on Saturday March 19, 2016 @02:19PM (#51732351)

        My bet is that Google is going to have to backpedal on this

        They would, but unfortunately, the backpedal has already been taken out.

      • by Kjella ( 173770 )

        This shows what a horrible idea it was for Google to remove the standard driver controls from their car design. First, it gives absolutely no backup when the inevitable failure occurs and the car doesn't know WTF to do. For example how exactly are you supposed to direct the car to a specific parking spot inside a garage?

        Just like a human driver, Google's car is free to take directions from the passengers but still be the one legally responsible. I'm sure they have a plan B, but it's obvious the car would have a ton more value if it didn't require a licensed and capable driver. And in principle humans don't have a backup either: if the driver is incapacitated, well, call an ambulance and a tow truck. Yes, that might mean the passengers are shit out of luck. But if they don't have a license or are drunk or whatever they would be anyway,

        • For example how exactly are you supposed to direct the car to a specific parking spot inside a garage?

          Ok, google, park next to the elevator/blue sedan/in spot 14A/etc...

          • by Kjella ( 173770 )

            Ok, google, park next to the elevator/blue sedan/in spot 14A/etc...

            A relatively simple touch screen should do the trick, if it's not the passenger's responsibility to check the mirrors and such. You just say where you want to go, and the car works out whether it's safe to do so.

          • Ok, google, park next to the elevator/blue sedan/in spot 14A/etc...

            "Now parking next to Ellen's gator/flew command/in pot for teen gay"

            Wait, no... aaaaaaah!

          • For example how exactly are you supposed to direct the car to a specific parking spot..?

            Ok, google, park next to the elevator/blue sedan/in spot 14A/etc...

            I like to park away from other cars (less likely to get dinged) and away from the elevator. I expect the Google car would be programmed to park as close as possible to the elevator anyway, which is what most people want and exactly what I don't.

            At work I park in the remotest area of the car park because I like quiet and privacy when I go and sit in it at lunch break, which I often do. I even wash it there sometimes. There are a few other like-minded people who do the same, spaced about. And the slots are not num

            • I expect the Google car would be programmed to park as close as possible to the elevator anyway, which is what most people want and exactly what I don't.

              I expect them to let you say where you want your car parked, either through voice commands or just tapping on the location on the screen.

            • Why would it even need to be parked near the building at all? The car could just drop you off and continue on to the municipal garage which has been specially marked to accommodate autonomous vehicles. For short visits, it could just orbit the block until you're ready to leave (probably this doesn't scale, though).

        • Oh, no doubt I agree that cars will eventually not even require a licensed driver. Heck, I'm not even really debating the issue of whether or not their car *should* be designated roadworthy. I just think it was wildly optimistic of Google to think that their version 1.0 driverless cars could look like this, given the practical and regulatory hurdles they'll be facing.

          Every other autonomous car manufacturer is taking a much saner approach of integrating these systems so that th

        • it's obvious the car would have a ton more value if it didn't require a licensed and capable driver.

          Absolutely. Children, drunks, the blind and the elderly could all benefit from personal transport. Also, human drivers are one of the biggest sources of unpredictability for autonomous cars. The fewer drivers there are, the better the cars will communicate and work together.

          Not to mention that if the cars drive well in normal conditions, the driver is unlikely to be paying enough attention to "take charge" in an emergency

        • I saw a video where an incapacitated driver was moving slowly down a highway, and a police officer smashed the passenger-side window, jumped in, and stopped the car. How does this work with a car with no controls?
      • How about sudden, complete loss of system power? If my car stalls out at speed, I've still got partial (albeit limited) control. What the fuck's the [steering-wheel-free] Google "car" going to do? Sure, it'll come to a stop (engineering a fail-safe mechanical backup is easy)... but since it won't have any steering, there's no telling where.

        Since any fool, even the non-technical shitheads jerking off over the prospect of not having to ever drive again because they suck so badly at it (we know who you are) sh

        • How about a sudden, complete loss of steering wheel? Of course, you can come up with extreme scenarios where the computer will fail. The car doesn't have to be perfect. If it's 10 times better than your average human driver, it's good enough.
          • How about a sudden, complete loss of steering wheel?

            Let's ignore the fact, for a moment, that you just demonstrated your unsuitability to even be participating in this conversation and pretend that what you actually asked was...

            How about a sudden, complete loss of steering ability?

            ...in which case, my reply would be a sarcastic and altogether-assholish "Well then, it's a good thing self-driving cars will be immune from tie-rod/turnbuckle and other forms of mechanical failure."

            Heard a new one recently; I believe it applies here: Don't bring piss to a shit fight. ;)

      • For example how exactly are you supposed to direct the car to a specific parking spot inside a garage?

        Yep I'm sure the best and brightest minds in car automation haven't given this a moment's thought. What idiots!

        • For example how exactly are you supposed to direct the car to a specific parking spot inside a garage?

          Yep I'm sure the best and brightest minds in car automation haven't given this a moment's thought. What idiots!

          The real question is... why do you care where your network-connected self-driving car parks? Let the damned thing figure it out by itself. When you need it, you don't go find it; you grab your phone and tell it to come to you.

      • I can't imagine what leads Google to think they can actually solve EVERY problem an autonomous car will run into with the very first version. Where exactly does that extraordinary self-confidence (hubris?) come from?

        Their hubris comes from your lack of research. Maybe look into what Google have built, how long they have been at it, and how many iterations they have gone through in their car design before you claim they are solving "EVERY" problem in the "very first version".

        Traditional car companies know that they can only get away with this on their concept cars - not their production models.

        Traditional car companies think the most advanced device in their car is an iPhone dock. It has always taken small start-ups or independent inventors and engineering firms to create a significant change in the car industry. If you left it up to car companies you wouldn't have seatbelts, c

      • For example how exactly are you supposed to direct the car to a specific parking spot inside a garage?

        Why would you want to? Just get out and let the car figure out where to park. Or maybe tell it to go home. When you're ready to go, just use your phone to tell the car where to pick you up.

        I can't imagine what leads Google to think they can actually solve EVERY problem an autonomous car will run into with the very first version.

        This is very, very far from the first version.

        Where exactly does that extraordinary self-confidence (hubris?) come from?

        Several years and ~1.5 million miles of real-world testing.

        That is, designers felt the damn thing didn't look futuristic enough if it still had a steering wheel and pedals.

        No, actually. The people working on it have explained at length why they took this step, and it derived from actual experience and testing. There are two problems with putting controls in a self-driving car. First, it

  • Monkey? (Score:5, Insightful)

    by Carewolf ( 581105 ) on Saturday March 19, 2016 @02:03PM (#51732229) Homepage

    Any ordinary car driven by a raging retarded monkey would pass the safety tests as well.

    BECAUSE THE SAFETY TESTS ON CARS DON'T TEST DRIVING OR COGNITIVE SKILLS!!!!

  • No steering or pedals? Talk about snatching defeat from the jaws of victory!

  • Urmson argues that if a self-driving car can pass standardized federal safety tests, it should be road-legal.

    Umm, yes, seems reasonable...

    Sorry, is there any actual story here? It's practically tautological. Of course there should be some kind of safety test for self-driving cars before they're allowed on the roads for any reason other than testing. Was anyone expecting anything else?

    • Of course there should be some kind of safety test for self-driving cars before they're allowed on the roads for any reason other than testing

      You're missing the point. Google argues that passing the safety tests should be good enough.

      • As in only the standard safety tests that all cars currently go through, and nothing else? One of the articles could be read that way, but neither is crystal clear.

        • Urmson adds that regulators could "set conditions that limit use based on safety concerns,"

          I think that means they are open to the idea that there could be additional tests or limitations, as long as they can remove the steering wheel and pedals.

        • As in only the standard safety tests that all cars currently go through, and nothing else?

          Google's claim is disingenuous BS.

          Current car construction regulations were written with the unspoken assumption that there would be a competent driver who can react to failures, for example by applying the handbrake if the footbrake fails. Of course a Google car can be programmed to apply a secondary brake if the primary one fails, but for that to occur automatically is not a requirement in the present construction regulations (in the UK, anyway).

          What a self-driving car needs to meet is the sum of the

  • Robot cars will take over the taxi driver profession, including Uber, and if the price of a ride comes down low enough it will put a big dent in the bus and subway customer base.
    • Depends on the location, I'd think. I know a bunch of people who are far richer than you need to be in order to drive a car in NYC, but who still take the subway because self-driving or not, a car during rush hour in NYC is going nowhere.

  • Would I feel OK sitting in one of those while everything around me moves and I cannot act at all, just sitting there passively?

    I think I would worry myself to death.

    Cruise control, fine; maybe even automatic steering. But the steering wheel disappearing, seats swiveling 90 degrees away from the driving direction towards each other (as was written about the BMW "experiment"), and semis being overtaken by software passing right before my eyes - no mf. way ...

  • by HuskyDog ( 143220 ) on Saturday March 19, 2016 @02:51PM (#51732563) Homepage
    Now, I can just about grasp that a self-driving car can be constructed that will navigate on the road, but that is not all that a car has to do. Let's look at a couple of examples:

    1) Suppose I live on a small farm or ranch and you are coming to visit me in your car. I might say "When you get here, come up the drive, turn left at the old tractor and park behind the barn next to the chickens". With a conventional car this should be easy, but what if you have one of these Google cars with no controls? Presumably it will find my address and arrive at the end of the drive. Given that there are no manual controls, how would you tell it the bit about the tractor and chickens? Will you just be able to type that in and it will be clever enough to follow those instructions?

    2) What about parking at work? I work on a big site with several car parks. How will I describe to the car which one I want to park in? They don't have separate Zip codes.
    • It could show you the surroundings on a screen, and you tap where you want to go.
    • Will you just be able to type that in and it will be clever enough to follow those instructions?

      Yes. And then it will kick your butt at a game of Go.

  • Should every self-driving car pass the same test humans do?

    What will happen if everyone has self-driving cars, and our GPS system goes out?

    Who will be the first one to die in a driverless car?

    So you don't enjoy driving?

    How do motorcycles fit into a world of self-driving cars?

    How many people will be made unemployed by self-driving cars, buses, and trucks?

  • ... until somebody loads up a few car bombs and sends them to town.

  • If I get hit by an automated car, I sure hope my insurance agency sues Google. Furthermore, if I have an automated car and it gets into an accident, it's not my fault, so I'm not paying insurance for it; Google can pay the bill.
  • We have our suicide hill --- on the map a direct and plausible route south, but with a steep S curve which even the locals occasionally fail to navigate.
    Come Labor Day, the VFW and our rural volunteer firemen have beer tents set up in the small town parks, with temporary parking on the grass, while minor road work elsewhere has traffic being routed off onto the shoulder. In theory, every hazard should be properly signed and flagged and posted to the web. In practice, it doesn't always work out that
  • They try to cover a representative subsample of every possible situation. They're based on the premise that if you're cognizant enough to figure out what to do in those situations, you're cognizant enough to figure out the common sense thing to do in other situations which aren't being tested.

    A machine built to only pass the situations being tested should automatically fail. The actual "rules" of the road are the State driving code, which is usually several hundred pages long and probably filled with sit
  • I'm never going to allow one of these things to drive me, as I will not let Google make life-and-death choices for me. Let me illustrate with a story that happened to me a couple of years ago in Morocco. I was driving from Marrakesh to Ouarzazate (beautiful trip, btw) on a narrow road when a truck coming at high speed from the opposite direction invaded my lane. I quickly thought of two ways to escape it:

    1 - Swerve to the right, crashing into the mountain. Likely to destroy my car and cause me minor injurie
