Transportation Google

California DMV Told Google Cars Still Need Steering Wheels 506

cartechboy writes: Google showed us what it believes is the car of the future: it drives itself, it has no gas or brake pedal, and there's no steering wheel. But that last part might be an issue. Back in May, California's Department of Motor Vehicles published safety guidelines aimed at manufacturers of self-driving vehicles. After seeing Google's self-driving car vision, the California DMV has told the company it needs to add all those controls back in their traditional locations so that occupants can take "immediate physical control" of the vehicle if necessary. Don't for a second think this is a major setback for Google; the prototypes unveiled weren't even close to production-ready. While the DMV may loosen some of these restrictions in the future as we all become more comfortable with the idea of self-driving vehicles, there's no question when it comes down to the safety of those on the road.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Not surprising (Score:5, Interesting)

    by gurps_npc ( 621217 ) on Tuesday August 26, 2014 @12:16PM (#47758135) Homepage
    California is playing it safe. It will take a while for us to trust the software enough to remove the steering wheel.

    In fact, it would not surprise me at all if the brake itself is NEVER removed. I can easily foresee situations where these vehicles are used to transport unwilling people, or simply malfunction, and the occupant will always want the ability to stop the device.

    But I can see the steering wheel and accelerator going away completely; we don't want untrained people to have the ability to make things worse.

    • Re:Not surprising (Score:5, Interesting)

      by TWX ( 665546 ) on Tuesday August 26, 2014 @12:22PM (#47758225)
      It's going to depend on who's allowed to use a self-driving car and under what conditions, and even which seats are allowed to be occupied.

      I can see a tiered system where licensed drivers with a normal operator's permit are always allowed to occupy the driver's seat in a vehicle with full-control capability. I could see a special class of license for those who once held normal operator's permits but voluntarily gave them up (elderly, poor vision, etc.), so that they could basically pull the vehicle over in a crisis. There could also be a special class for learner's-permit holders that allows them to occupy that seat. Everyone else would be required to occupy any other seat; only when all other seats are full would there have to be conditions allowing the driver's seat to be occupied with the controls disabled.
      • Re:Not surprising (Score:5, Interesting)

        by ShanghaiBill ( 739463 ) on Tuesday August 26, 2014 @01:14PM (#47758795)

        It's going to depend on who's allowed to use a self-driving car and under what conditions, and even which seats are allowed to be occupied.

        Under California law, a licensed driver must be seated in the "driver's seat" and must be paying attention (no yakking on the cell phone). These requirements won't be permanent, but at least for the first few years of SDCs, that is how it will be. Once a safety record is established and the public is more comfortable with the technology, the restrictions will be relaxed. In a decade or so, cars will likely be able to drive with no people on board, or even transport children with no adult in the car.

        • Re: (Score:3, Insightful)

          Then what's the point? I want to sleep!
          • Re: (Score:3, Insightful)

            Comment removed based on user account deletion
            • Re:Not surprising (Score:4, Informative)

              by k3vlar ( 979024 ) on Tuesday August 26, 2014 @03:30PM (#47760067)
              No. Google cars don't drive on a "virtual track".

              They reference map data for general route planning, and then defer to onboard cameras and other sensors which feed into image-recognition systems to drive in the same way as a human (by paying attention to the environment, and not blindly following GPS).

              These cars use cameras, lasers and radar to look for lines on the road (or other markings), road signs, cyclists, pedestrians, other vehicles, etc., and use this to build a live 3-dimensional map of the surrounding area. The software builds a stack of triggers, sorts them according to priority, and then reacts, by turning, braking or accelerating.

              Source: https://www.youtube.com/watch?... [youtube.com]
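The perceive, prioritize, react loop the parent comment describes could be sketched roughly like this (a toy illustration with hypothetical trigger names and priorities; the real control stack is vastly more complex):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass(order=True)
class Trigger:
    priority: int                     # lower value = more urgent
    name: str = field(compare=False)
    react: Callable[[], str] = field(compare=False)

def plan_reaction(triggers: list[Trigger]) -> str:
    """Pick the most urgent perceived trigger and react to it."""
    if not triggers:
        return "cruise"               # nothing detected: keep following the route
    most_urgent = min(triggers)       # dataclass ordering compares priority only
    return most_urgent.react()

# Hypothetical sensor-fusion output for one planning tick:
perceived = [
    Trigger(5, "lane-marking drift", lambda: "steer-correct"),
    Trigger(1, "pedestrian ahead",   lambda: "brake"),
    Trigger(3, "slow car in lane",   lambda: "decelerate"),
]
print(plan_reaction(perceived))       # -> brake
```

The point is only that the live 3-D map reduces, each tick, to a prioritized set of events the planner sorts and acts on.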
      • Short term (Score:5, Insightful)

        by ZombieBraintrust ( 1685608 ) on Tuesday August 26, 2014 @01:22PM (#47758887)
        That is very short-term thinking. Fifteen years after driverless cars are released, you're going to have a whole generation of people who never learned to drive a car, plus people who got a license and then forgot everything because they haven't touched the wheel in ten years. There is not going to be anyone qualified to drive a car; you will just have millions of amateurs with the skill level of a 16-year-old.
        • Re:Short term (Score:4, Insightful)

          by TWX ( 665546 ) on Tuesday August 26, 2014 @01:31PM (#47758997)
          Self-driving cars will be luxury items to start with. Not only won't they be in the price range of your financially strapped new driver, they won't even be in the price range of most consumers. New drivers will end up with used cars. I expect it'll be 30+ years before self-driving cars are even close to half the cars on the road, if not longer, as a lot of people won't give up the ability to drive themselves, won't want to replace a self-driving car with an autonomous-only car, or can't afford the extra cost of such a car.
        • Re: (Score:2, Insightful)

          by Anonymous Coward

          That is very short-term thinking. Fifteen years after driverless cars are released, you're going to have a whole generation of people who never learned to drive a car, plus people who got a license and then forgot everything because they haven't touched the wheel in ten years. There is not going to be anyone qualified to drive a car; you will just have millions of amateurs with the skill level of a 16-year-old.

          So somehow everyone who was alive before they hit the market is either going to forget how to drive or suddenly die by this 15-year mark you've chosen? You think, with the popularity of programs like Top Gear, that there won't be millions around who still prefer to drive their own cars?

        • by CanHasDIY ( 1672858 ) on Tuesday August 26, 2014 @02:31PM (#47759579) Homepage Journal

          That is very short-term thinking. Fifteen years after driverless cars are released, you're going to have a whole generation of people who never learned to drive a car, plus people who got a license and then forgot everything because they haven't touched the wheel in ten years. There is not going to be anyone qualified to drive a car; you will just have millions of amateurs with the skill level of a 16-year-old.

          The futurist in me envisions this occurring about 5 years after the bloodless, peaceful transition to an economy where robots do all the work.

          The futurist in me is an exceedingly sardonic asshole, if you haven't picked that up already.

    • Re:Not surprising (Score:5, Interesting)

      by jellomizer ( 103300 ) on Tuesday August 26, 2014 @12:25PM (#47758279)

      I agree, having a manual break should be required as a bare minimum.
      Even if the software is perfect, if there is an unexpected power outage you will need a manual break to stop a car that may just be aimlessly coasting.

      • Re:Not surprising (Score:5, Insightful)

        by TWX ( 665546 ) on Tuesday August 26, 2014 @12:30PM (#47758331)
        As Toyota demonstrated, that manual brake needs to be damn-near hardware-level too, or at least allow for an emergency override that bypasses the computer entirely if the main 'stop the car now' brake fails to work properly.

        It would be terrifying to be in a self-driving runaway car without any controls whatsoever.
        • Re: (Score:3, Interesting)

          by Xoltri ( 1052470 )
          The cause of the Toyota problem was people hitting the accelerator instead of the brake. http://www.caranddriver.com/fe... [caranddriver.com] So if you take away the accelerator pedal I think we're good to go.
          • Re:Not surprising (Score:5, Informative)

            by TWX ( 665546 ) on Tuesday August 26, 2014 @01:10PM (#47758749)
            No, it wasn't [sddt.com], at least not in all cases. There were definite computer-control problems that led to the computer getting stuck in a mode where it kept the throttle applied and ignored the brake, shut-off, and gear-selector inputs: since the throttle was logically applied, those other inputs had to be erroneous.
            • Re:Not surprising (Score:4, Informative)

              by chis101 ( 754167 ) on Tuesday August 26, 2014 @01:56PM (#47759259)
              I remember reading through that analysis a few months back. It showed that a single flipped bit could cause the unintended acceleration. What was not shown was how that bit could possibly be flipped. So it is far from proven that it was a software error.
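To make the flipped-bit scenario concrete: in a toy model where throttle position is a single byte, one flipped bit is the difference between a released pedal and roughly half throttle (illustrative only, not Toyota's actual memory layout):

```python
throttle_cmd = 0b0000_0000           # 0 of 255: pedal fully released
corrupted = throttle_cmd ^ (1 << 7)  # a single bit flip in the high bit (e.g. from EMI)
print(throttle_cmd, corrupted)       # 0 128 -> suddenly ~50% throttle commanded
```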
              • by TWX ( 665546 )
                There was an early fuel-injection system, around 1958, on some Mopars, manufactured by Bendix and adapted from an aircraft fuel-injection design. It delivered more power and more efficient fuel delivery than carburetors ever could. Unfortunately, they realized too late that it was poorly shielded against EMI, and if an old prewar car with a magneto pulled up next to it, the fancy new fuel-injected car would stumble and shut off.

                Cars are electrically incredibly dirty. Alternators generate electri
      • Re:Not surprising (Score:5, Interesting)

        by rossdee ( 243626 ) on Tuesday August 26, 2014 @12:48PM (#47758499)

        "I agree, having a manual break should be required as a bare minimum."

        A manual brake would be even more useful, along with a kill switch for the engine.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Depends on what you call "Safe"

      This is merely politically safe.

      Realistically, these things are going to be packed to the gills with dozens of sensors covering thousands of metrics, all logged every second. I'd be willing to bet that data will show that the vast majority of accidents happen just after the driver takes control, and are a direct result of driver actions.

      What I don't get is why bother with a traditional seating arrangement once you no longer have to drive? Fuck being upright,

      • I want to lounge back in comfort, read the news, catch up on email, etc.

        And we'll be able to, eventually. These are just the very first set of rules for the very first automated cars; you can't go from Simpsons to Jetsons overnight.

        • I get your idea, but we're going from simpsons to flintstones. So requiring a steering wheel is not exactly a step in the right direction.

    • In fact, it would not surprise at all if the brake itself is NEVER removed.

      That's the current situation with driverless trains/subs:
      No cab at the front of the train, but you still have "emergency brake" levers everywhere.

    • by bigpat ( 158134 )
      I think California is playing it wrong and unsafe. I agree there needs to be a big red button on cars which brings the vehicle to a safe stop much like there is on passenger trains, but this move by California seems more like something pushed for by entrenched vested interests and not driven by safety considerations. Lives will be saved when we allow cars to go pick up people that can't drive, don't have licenses or don't want to drive themselves. The implication of this move is that a human driver is go
      • agree there needs to be a big red button on cars which brings the vehicle to a safe stop much like there is on passenger trains

        Passenger trains are on tracks. Cars aren't. I'm going to need more than just an emergency brake. I'm going to need a steering wheel too (and preferably an accelerator).

        In short, I don't (and will never) trust any software fully. I've worked with too many programmers.

    • Re:Not surprising (Score:5, Insightful)

      by pavon ( 30274 ) on Tuesday August 26, 2014 @12:49PM (#47758513)

      They may never be removed. Everyone focuses on the split-second decision scenario when discussing this issue, and there I agree that humans will cause more problems than they solve. But there are many more situations where a manual override is needed and beneficial. What happens when the car runs out of gas/charge and you need to push it to the side of the road, out of traffic? Or the computer is malfunctioning somehow (software bug, squirrel chewed halfway through a wire, dead battery/alternator)? Or when I need to move the car somewhere off-road that the AI refuses to recognize as a valid driving path? There are plenty of not-so-time-critical scenarios where some sort of manual override is needed, and those aren't going to go away even when we trust the software to do all the driving. Once we admit that the controls don't have to be intuitive for split-second reactions, they don't have to retain the traditional layout, nor be designed for comfortable everyday use, but some sort of steering, brake control, and neutral gear select will always be needed.

      • Like I'm going to even be looking out the windshield. If I and my partner are in an autonomous vehicle, odds are pretty good neither one of us will be paying attention to anything but each other, if you catch my drift.

      • What happens when the car runs out of gas/charge and you need to push it to the side of the road out of traffic.

        What about the car driving to the side of the road out of traffic with the last bit of kinetic energy available? People might be stupid enough to drive until the tank is absolutely empty and be stuck, a driverless car wouldn't. And then there are driverless Diesel cars which most definitely won't run until the tank is empty, because that kind of thing is _expensive_.

    • One of the things that bugs me about so many high-tech devices is the lack of an "off" switch (and in the case of a vehicle, substitute "stop"). On ye olde personal computers, IBM put a big red paddle-switch that summarily deprived the electronics of electricity. Flip that, and it was OFF. (Even the clock.) These days, it's a button (and pretty soon just a contact-sensitive control spot) that asks the system to... not shut off, exactly, but to put itself into a low-power state in which it looks as if it w

    • I'd like to see self-driving cars have some kind of fail-safe mechanical brake (where power is required to hold the friction surfaces apart) and a big red button that cuts all power.

    • We need to start pushing for formal regulations with regard to what the cars will do when a collision between vehicles is inevitable. Should your car drive off a bridge, killing you, if it means saving a school bus full of kids? Probably. But I'd like to know how such failure modes are defined.
    • by Rich0 ( 548339 )

      In fact, it would not surprise at all if the brake itself is NEVER removed.

      How is having a brake a safety feature when there is interleaved traffic crossing an intersection? If a car were to rapidly stop, it would get broadsided by another car expecting it to not stop.

      I could see the brake sticking around until it is illegal to manually drive a car, however.

    • I side with the DMV on this one; I don't ever want to see vehicles on public roads that have no manual controls whatsoever, it's just a bad idea.
    • the occupant will always want the ability to stop the device.

      I'm afraid I can't let you do that, Dave.

  • Personally I'd like to get a car that has controls similar to a jet fighter - or even more basic if it's all drive-by-wire anyway. Gimme a throttle lever in one hand, and a twist stick for proportional steering in the other - or combine them. More display room and less clutter of a wheel.

    • by TWX ( 665546 )
      Chrysler experimented with a stick approach in the sixties; it really didn't work very well. A steering wheel capable of multiple revolutions allows fine-grained control over steering, and the same goes for long-travel analog brake and throttle pedals.

      Think back to playing with cheap radio-controlled cars: it was difficult to navigate tight courses because the cars couldn't steer accurately enough, and if they were really cheaply made, open-wheel types, breaking a control arm at a front
    • Saab did that once. It was universally panned as a terrible idea.

      I'm betting there's just some things you wouldn't be able to do with that joystick, like controlling a skid in the snow.

      Me, I'll stick with the old fashioned steering wheel. I know it works.

      Your jet fighter controls? Not so much.

      • by itzly ( 3699663 )
        So why don't jet fighters use a wheel ?
          Ummm, because they move in three dimensions and have pitch and yaw, whereas a steering wheel does "left and right"? Jets also have a couple of pedals for additional control surfaces. And buttons, and levers, and dials ... oh, and operators with hugely more training.

          I just don't think a joystick is a good input mechanism for a car.

          You, however, are free to buy whatever kind of car strikes your fancy .. including an older Saab with a fly by wire joystick if you like.

          But, really, if it was better for a car

    • Ejection seat, LOL.

  • Of course (Score:5, Interesting)

    by Meneth ( 872868 ) on Tuesday August 26, 2014 @12:21PM (#47758213)
    Have they not seen "I, Robot" (2004)? Of course you need a manual override.
  • .. So that real horses can take "immediate physical traction" of the vehicle if necessary.
    • by HornWumpus ( 783565 ) on Tuesday August 26, 2014 @12:25PM (#47758267)

      Early cars were required to have a harness attachment point. Which was actually sane at the time. So is this.

    • So that real horses can take "immediate physical traction" of the vehicle if necessary.

      You have no idea how punishing the roads were in the early days of the automobile, how often cars broke down or became hopelessly mired in mud or snow. In rural states, the horse was still in the towing business as late as 1940.

      • So that real horses can take "immediate physical traction" of the vehicle if necessary.

        You have no idea how punishing the roads were in the early days of the automobile, how often cars broke down or became hopelessly mired in mud or snow. In rural states, the horse was still in the towing business as late as 1940.

        In 1919, Lt. Col. Eisenhower, yes the later Supreme Allied Commander of WWII and 1950s President of the US, led a convoy of 24 vehicles from the east coast to the west coast. Nine vehicles were lost, and 21 men were injured and unable to continue.

    • .. So that real horses can take "immediate physical traction" of the vehicle if necessary.

      Joking aside, early cars broke down frequently and the horse was a very common towing option. In those early days people didn't necessarily drive themselves; many paid their mechanic to act as their driver. Those who drove themselves were probably hobbyist mechanics.

  • by brunes69 ( 86786 ) <slashdot@keir[ ]ad.org ['ste' in gap]> on Tuesday August 26, 2014 @12:22PM (#47758229)

    Any car that allows the driver to take "immediate physical control" makes the roads less safe for everyone. The safest roads will be when ALL cars are autonomous. Having humans in the mix will just erase the gains that autonomous cars provide. Can a human wirelessly communicate with a car five miles ahead to learn of a road condition and adjust its speed in tandem with all the other cars in between to mitigate any and all danger in advance? Can a human react in sub-millisecond time to avoid obstacles thrown in their way? No and no.

    • by TWX ( 665546 ) on Tuesday August 26, 2014 @12:36PM (#47758385)
      Autonomous cars need to prove that they're capable of being safer than operator-driven cars. Right now they haven't done so, and until there's data, autonomous cars will need to be manually operable.

      I expect to drive myself around for the next 30 years or more; I doubt self-driving cars in the price range that I can justify paying will come out any time soon.
    • The safest roads will be when ALL cars are autonomous.

      Agreed, but having only autonomous cars on the road will not happen for decades. First there will need to be a viable autonomous car, which has not happened yet and may not for up to 20 years. Then there will need to be at least ten years of testing. Then all manual cars will need to age off the road, which will not happen for decades, as people will want to keep classic cars on the road. Notice that there are cars built in the '30s that are still on the road. So your utopia of all autonomous cars will

    • by Cabriel ( 803429 )

      And what about a situation the car isn't programmed to deal with? Such as narrowly avoiding an accident that blocks the road in front of it? How does a driverless car deal with that? Just sit there, helping to block traffic? What about when an officer on the road is directing traffic? What about when something else is blocking the lane, like road construction where the workers direct traffic into the lane travelling in the opposite direction?

      Yes, human error is most likely to cause

    • Not only that, but an autonomous car that isn't good enough to drive itself without the person having controls probably isn't good enough to be on the road at all.

      The car should either have controls for a human and expect the human to be operating them, or have no human controls and do all the driving itself. Having the car do all the driving for weeks or months on end, lulling the person into a false sense of security, and then one day expect the driver to take over the controls at some rand
    • Can a human wirelessly communicate with a car 5 miles ahead to know of a road condition and adjust its speed in tandem with all the other cars in between to mitigate any and all danger in advance?

      Do not assume that the source of wireless coordination is always 100% trustworthy.
      The coordination information might be of hostile origin, e.g. some idiot with a hacked emitter that systematically asks all the other cars to slow down and move aside to let him through. In theory such a function has practical uses (ambulances, for example); in practice it WILL get abused (an idiot wanting to arrive faster, or a criminal trying to escape through heavy traffic).

      Can a human react in sub-millisecond time to avoid obstacles thrown in their way.

      Yup, that's what I consider as the main

      • by Rich0 ( 548339 )

        The coordination between cars will have to have a default-safe mode of operation. That is, the car won't go someplace until it is sure it is safe to go there, and it will always have a safe exit-to-stop route that is coordinated with all other nearby vehicles.

        So, suppose my car is parked on the side of the road. It gets a reservation that allows it to be in a box that moves down the road at a certain rate for a certain distance, with an exit to a stopped position available at every point along that route.
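The "moving box" reservation idea above can be made concrete with a one-dimensional toy model (entirely hypothetical names and numbers; a real protocol would handle lanes, clock skew, and trust):

```python
from dataclasses import dataclass

@dataclass
class Reservation:
    start_pos: float   # metres along the road at t = 0
    speed: float       # metres/second the box moves forward
    length: float      # box length, including safety margin

    def span(self, t: float) -> tuple[float, float]:
        """Road interval this box occupies at time t."""
        front = self.start_pos + self.speed * t
        return (front, front + self.length)

def conflict(a: Reservation, b: Reservation, horizon: float, step: float = 0.1) -> bool:
    """True if the two boxes ever overlap within the planning horizon."""
    t = 0.0
    while t <= horizon:
        a_lo, a_hi = a.span(t)
        b_lo, b_hi = b.span(t)
        if a_lo < b_hi and b_lo < a_hi:   # open-interval overlap test
            return True
        t += step
    return False

parked   = Reservation(start_pos=0.0,   speed=0.0,  length=5.0)
incoming = Reservation(start_pos=-50.0, speed=10.0, length=5.0)
print(conflict(parked, incoming, horizon=10.0))  # the boxes eventually overlap
```

A coordinator would only grant a new reservation after checking it against every existing one this way, which is what makes the scheme default-safe: no box, no movement.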

    • by Kjella ( 173770 )

      Might the computer totally fail to realize that the bridge is about to give out, or the building about to collapse, or an avalanche about to hit, or a dam about to burst, or that you're driving right into a rioting mob, or some other disastrous event? Even assuming the car will never, ever throw the controls to me and expect me to take over doesn't exclude the possibility that I'll want to take immediate physical control to avoid some kind of danger that goes above and beyond a computer's understanding of traffic

    • The only 'gains' that will come from people not having control of where they're going in vehicles will be for oppressive totalitarian police states that have complete control over all movements of citizens, and for criminal organizations to steal cars and/or kidnap the passengers. That, and in a small way, funeral homes, who will get business from traffic deaths caused by software malfunctions. Fuck that noise. Enjoy being treated like so much cattle to be moved around at someone else's whim, I refuse to go
  • Consider how many people die on the roads every year in the United States alone; the biggest factor is human error.

    Then, consider how many people die every year due to firearms in the United States alone and look how CA reacts and tries to limit access to firearms, through laws and technology. Basically, "remove the problem."

    Shouldn't CA be pushing hard for driverless vehicles? Removing human error from the equation would save countless lives.

    Think of the children.
    • by hondo77 ( 324058 )

      Shouldn't CA be pushing hard for driverless vehicles? Removing human error from the equation would save countless lives.

      Consider all the bug-free software that is written, especially for first-generation devices. Oh, wait...

    • by TWX ( 665546 )
      The purpose of a firearm is to shoot. The purpose of a car is to convey people or contents over distance, not to crash or to run over someone. That they happen to crash or run over people is something to be solved, and this is heading in that direction.
  • by Jason Levine ( 196982 ) on Tuesday August 26, 2014 @12:32PM (#47758351) Homepage

    I agree that an automated car will need a steering wheel in the immediate future. Once their track record has been proven and people are comfortable with them, however, cars will gradually lose manual controls. We'll likely be regaling our grandkids with stories of hundreds of non-automated cars screaming down the highway, piloted by fallible humans. Of course, they'll just roll their eyes at us, make an "uphill both ways in the snow" comment, and tell their RobotCar to take them to the mall.

    • by bigpat ( 158134 )
      Compared with the track record for human drivers which is proven to be completely unsafe?
      • What, 100% of driver-operated cars are guaranteed to crash?

        As enamored as you are of the technology, dial back the hyperbole. It doesn't do the cause any good.

        It's called "paying your dues". No one gets away without it. You prove, by extended experience over a long period of time, that the new technology is superior to the old. After a couple of generations (of people, not technology), it's accepted and the shackles of the old can safely go away.

  • Star Trek (Score:5, Funny)

    by Barlo_Mung_42 ( 411228 ) on Tuesday August 26, 2014 @12:34PM (#47758367) Homepage

    If there's one lesson I learned from Star Trek it's that you always, ALWAYS, include a manual override.

  • by gstoddart ( 321705 ) on Tuesday August 26, 2014 @12:34PM (#47758371) Homepage

    California DMV has told the company it needs to add all those things back to their traditional locations so that occupants can take "immediate physical control" of the vehicle if necessary

    The transition time from the computer giving up to the user having to take control is always going to mean this is impossible.

    If you're reading the newspaper, you are not going to be able to transition to operating the vehicle in the event the computer gives up and says it's all up to you.

    I've been saying for a while that a driverless car needs to be 100% hands-off for the people in the car, or it serves no value at all other than as a gimmick.

    I will believe driverless cars are ready for prime time when I can stumble out of a pub, crawl into the back seat and tell the car to take me home. Anything less than that is a giant failure of automation waiting to happen, and a convenient way of dodging liability by pretending that users are expected to be in control of the car even while the AI is driving.

    As long as there is a pretense of handing back to the driver in the event of an emergency, this is a glorified cruise control, and I'll bloody well drive myself.

    If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.

    • by Nkwe ( 604125 )

      As long as there is a pretense of handing back to the driver in the event of an emergency, this is a glorified cruise control, and I'll bloody well drive myself.

      If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.

      I would agree if the human is expected to be able to take over at any time. But what if automation were to the point that, if the computer found conditions too complicated, it would pull over and stop the vehicle? Once stopped by the computer, manual controls could be used to drive in those "complicated" situations. You could have the option to interrupt the "safe stop" process and assume control if you felt comfortable doing so, but if the logic included an unattended safe stop, would it b

      • Again, your issues become "what are these unknown complicated conditions and how can we know them in advance or that the default solution will work?", "who holds liability?", and "do you realistically expect to hand off control to a human who isn't paying attention and have that work out?"

        If the answer to any of these questions is "don't know, but how hard could it be?", or "that's for the courts to decide", or "what could possibly go wrong?" ... well then, in my opinion, your fancy driverless cars aren't re

    • "If you're reading the newspaper, you are not going to be able to transition to operating the vehicle in the event the computer gives up and says it's all up to you."

      I don't think you understand the topic of conversation here. We're not talking about situations in which the computer says, "Excuse me, Dave, but I'm not sure what to do here. Could you please drive for me?" We're talking about situations in which Dave says, "WTF! You're heading for a cliff!" and chooses to take control. Maybe it takes him s

      • * "there would be no significant delay"

      • I don't think you understand the topic of conversation here. We're not talking about situations in which the computer says, "Excuse me, Dave, but I'm not sure what to do here. Could you please drive for me?"

        Oh, really now? I don't see anything which explicitly says your interpretation OR my interpretation,

        California DMV has told the company it needs to add all those things back to their traditional locations so that occupants can take "immediate physical control" of the vehicle if necessary

        So, either I'm m

    • by TheSync ( 5291 )

      The transition time from the computer giving up to the user having to take control is always going to mean this is impossible.

      I can think of several recent airplane crashes that occurred because pilots tried to take back control from the auto-pilot or auto-landing system without full situational awareness.

    • If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.

      Have a look at how the collision avoidance systems already on the streets today work:
      - the car will sound an alarm signalling a probable impending collision and asking the driver to intervene.
      - the car will nevertheless also autonomously start to slow down and eventually brake to a stop.

      The system is designed in such a way that, although human override is possible, the car will also try autonomously to follow the best course of action unless overridden. You could take control and do something different.
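      That warn-then-brake-unless-overridden behavior can be sketched roughly like this. The thresholds and names are made up for illustration and don't come from any real system:

```python
# Rough sketch of the collision-avoidance behavior described above.
# Time-to-collision thresholds are illustrative, not from any real system.

WARN_TTC_S = 2.5    # sound the alarm below this time-to-collision (seconds)
BRAKE_TTC_S = 1.2   # start autonomous braking below this

def avoidance_action(time_to_collision_s, driver_override):
    """Return the system's action for one control cycle."""
    if driver_override:
        return "defer-to-driver"   # human input wins
    if time_to_collision_s < BRAKE_TTC_S:
        return "brake"             # autonomous braking to a stop
    if time_to_collision_s < WARN_TTC_S:
        return "warn"              # alarm, ask the driver to intervene
    return "cruise"
```

      Note the ordering: the override check comes first, so the human can always preempt the autonomous braking, exactly as with the systems on the road today.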

  • by troll -1 ( 956834 ) on Tuesday August 26, 2014 @12:45PM (#47758449)
    Is this requirement based on science or an irrational fear of computers?
    • I would say it's based on a rational fear of computers and automation, and a reasoned understanding that they have failure modes and won't be perfect in all situations.

      The problem is that the transition time from being essentially cargo to the one operating the vehicle is going to be where most of the failures occur.

      So, it's all fine and lovely to say "tag, you're it", but the human reaction time to re-engage with what the vehicle is doing, what is happening around you, and what needs to be done about it is significant.

  • 90% of accidents (or more, depending on the study) are due to human error. So the DMV's insistence on putting humans back into the driver's seat is actually counterproductive. "there's no question when it comes down to the safety of those on the road." ... the question is: are the other humans on the road more or less safe with the Google vehicle operators able to override the computer?

    While I'm not interested in being an early adopter of this or most automotive technologies, there are lots of questions when it comes to safety.

  • by HockeyPuck ( 141947 ) on Tuesday August 26, 2014 @01:06PM (#47758695)

    If a driverless car has no manual means of steering, and if it broke down and you had to push it, how could you control it?

    • Re: (Score:2, Funny)

      by Ksevio ( 865461 )
      Tow truck.
    • If a driverless car has no manual means of steering, and if it broke down and you had to push it, how could you control it?

      If a car with an automatic gearbox breaks down, how do you push it?

  • The laws will be rewritten once this gets closer to being a real thing. Google can continue to do what it wants on its test tracks.
  • I want a human to be able to take control of whatever automated device acting as my conveyance. Train, plane, automobile-- it doesn't matter.

    If I'm liable for the machinery and the lives carried by the machinery, I want to be able to determine the maximum speed, the maximum rate of acceleration (when not in an emergency), the route, and be able to take full control as necessary. It's not that I distrust computers... it's the squishy meat bags affecting and affected by the computers I don't trust. Humans program the computers, after all.

  • I wonder if you could legally take your driver's license test in a self-driving vehicle. You'd still have to have your hands on the wheel, and check your mirrors before (the car) changing lanes, but I don't know if there are any rules that would actually prohibit your not being in control of the vehicle.

    Mind you, they'll have to teach the thing to parallel park...

  • If they're insistent there's a way for an occupant to take "immediate physical control", why do they allow current cars on the road?

    I'm not sure about steering, but certainly for acceleration and braking there's no way for drivers to take physical control of a modern automobile. Anything we do with those pedals on the floor sends a signal to a computer. The computer then decides what actions to take--open the throttle or apply the brakes.

    There's a person initiating those functions, but the person does not have immediate physical control.
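    The pedal-as-request point above looks roughly like this as code. This is a toy model, not any real ECU firmware, and all the names and limits are invented for illustration:

```python
# Toy model of drive-by-wire: the pedal produces a request, and a
# controller decides what the actuator actually does.
# All names and limits here are illustrative assumptions.

def throttle_command(pedal_position, traction_ok, rev_limit_hit):
    """Map a 0.0-1.0 pedal position to a 0.0-1.0 throttle opening."""
    request = max(0.0, min(1.0, pedal_position))  # clamp the driver's input
    if rev_limit_hit:
        return 0.0           # the computer overrides the driver entirely
    if not traction_ok:
        request *= 0.5       # traction control cuts the request
    return request
```

    Even in this toy version, the driver's input is only one vote: the controller can attenuate or zero it out, which is exactly why "physical control" of a modern pedal is already indirect.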

  • by wcrowe ( 94389 ) on Tuesday August 26, 2014 @01:40PM (#47759091)

    I like the idea of a self-driving car, but I still don't understand how the self-driving car finds a parking space, or gets eased into place in the garage for maintenance. How does it find its way around an unexpected hazard, like a downed limb, or washed-out area of the road? How does the self-driving car know that the road is flooded or otherwise undriveable? How does it know that the power is out at an intersection that normally has traffic lights?

  • If a passenger in a driverless car needs to get the car to do some non-programmed maneuvers, they should hit a button, and let a trained driver back at HQ take control of the car and do the maneuvers.
  • by taustin ( 171655 ) on Tuesday August 26, 2014 @01:45PM (#47759161) Homepage Journal

    That Google thinks their self driving cars are ready for the open road isn't the issue. The issue is that they think they are ready to go straight from traditional cars to cars with no ability for the human passenger to take control if the new, unproven technology fails. That, by itself, convinces me that Google's judgment is flawed, and cannot be trusted. Were I making this decision, I wouldn't let Google's cars on public roads at all until they show some evidence that they understand why this is a bad idea.

  • by DrXym ( 126579 ) on Tuesday August 26, 2014 @05:43PM (#47761041)
    Anyone who thinks self-driving cars are likely to be capable of driving on open roads in all circumstances by themselves in the foreseeable future is living in cloud cuckoo land. There MUST be a conscious, unimpaired human being able to take over when the need arises, because the need will arise.
