
Would You Need a License To Drive a Self-Driving Car?

agent elevator writes: Not as strange a question as it seems, writes Mark Harris at IEEE Spectrum: "Self-driving cars promise a future where you can watch television, sip cocktails, or snooze all the way home. But what happens when something goes wrong? Today's drivers have not been taught how to cope with runaway acceleration, unexpected braking, or a car that wants to steer into a wall." The California DMV is considering something similar to its requirements for robocar test-driver training. Hallie Siegel points out this article arguing that we need to be careful about how many rules we make for self-driving cars before they become common. Governments and lawmakers across the world are debating how best to regulate autonomous cars, both for testing and for operation. Robocar expert Brad Templeton argues that there is a danger that regulations will be drafted long before the first commercial deployments of the technology take shape.
  • by xxxJonBoyxxx ( 565205 ) on Wednesday March 04, 2015 @10:59PM (#49185645)

    If "yes," then it's not self-driving.

    • by zarthrag ( 650912 ) on Wednesday March 04, 2015 @11:05PM (#49185679)

      Exactly this. To elaborate: Self-driving cars should be the legal equivalent to sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one means you're responsible/liable for fuel & maintenance - and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation. (Otherwise, things such as self-valet and timed pick-ups won't happen)

      • by Shakrai ( 717556 ) on Wednesday March 04, 2015 @11:16PM (#49185741) Journal

        It should be down to the manufacturer to ensure safe, autonomous operation.

        Thus guaranteeing that it never happens, at least in the litigious society known as the United States of America.

        Aerospace is held to a far higher standard than automotive ever will be, with modern planes able to fly themselves from takeoff to landing, but we still expect qualified pilots to sit in the front seat and keep an eye on things. An autonomous automobile may well have more variables to contend with than an airliner's autopilot. Children don't tend to dart out in front of airliners, the physics of air travel don't change drastically with weather conditions, and airplanes are built with more redundancy than automobiles.

        Even if you can account for such things, how will your autonomous vehicle handle malfunctioning sensors? Aerospace has been working at this for decades and still hasn't figured it all out [wikipedia.org].

        • by kylemonger ( 686302 ) on Thursday March 05, 2015 @12:23AM (#49186087)

          Forget about sensors for a moment: We don't deal with malfunctioning PEOPLE right now. Drunks, old people, and visually impaired people routinely climb behind the wheel every day. We are already running over darting children, cyclists and pretty much anything else with the temerity to set foot, hoof or paw on the road. Old people ramming cars into crowds because they can't tell the brake from the accelerator are just the cost of doing business in a free society.

          A self-driving system doesn't have to be perfect, it just has to be better than what we have now when we scale it up. Given that you can give a driving AI the equivalent of millions of miles of road experience in all conditions, I doubt that AIs will drive worse than human beings for much longer.

          The insurance companies will need to be convinced for sure, but they will be when self-driving systems demonstrate their superiority.

          • by Jane Q. Public ( 1010737 ) on Thursday March 05, 2015 @12:48AM (#49186191)

            We don't deal with malfunctioning PEOPLE right now. Drunks, old people, and visually impaired people routinely climb behind the wheel every day.

            We don't deal with these problems, because we have bad laws. We have bad laws because politicians want to please lobbyists, and don't want to seem "soft" on crime or negligence. As a result, they pass laws that are too strict (DUI laws being a classic example: studies show the majority of people are NOT significantly impaired at 0.08%).

            When unreasonable laws are passed which victimize pretty much "innocent" people, people lose respect for the law. Not just DUI laws but also marijuana laws (former, or at least on their way out) are great examples.

            A self-driving system doesn't have to be perfect, it just has to be better than what we have now when we scale it up.

            Nope. Based on past advances in automobiles (ABS, airbags, power steering, computer throttle control), what will happen is that they will get released, and they will have some major screwups (or public perception of screwups, anyway), and there will be a flurry of very heavy-duty lawsuits, and it will go away for a while. Then they'll come back in new and improved form. Then there will be a couple more lawsuits, and some recalls. Sales will go down a bit and improve again. And it will gradually smooth out. Probably.

            It's a bit like the "ringing" effect in some kinds of oscillators.

            • I would like to see your studies.
              In France, the DUI limit is 0.05%. My anecdotal experience is that this threshold does not seem too low: I certainly do not have the same reflexes or spatial awareness when I am close to this threshold. And I do not think this is a corner case.
            • "We have bad laws because politicians want to please lobbyists,"

              I'm pleased to note that autonomous auto manufacturers won't stoop to employing lobbyists.

            • I have never seen a single study to show that 0.08% is "too strict". In fact, it is extremely lenient by most other countries' standards. A quick perusal showed this:
              http://trid.trb.org/view.aspx?... [trb.org]

              It concluded impairment begins with any deviation, and almost all people are significantly impaired by 0.08% (lending credence to the idea that the line is too lenient, not too strict).
              If you have a study that actually shows what you purport, I'm sure people would love to see it.

        • by wvmarle ( 1070040 ) on Thursday March 05, 2015 @04:33AM (#49186919)

          Even if you can account for such things, how will your autonomous vehicle handle malfunctioning sensors? Aerospace has been working at this for decades and still hasn't figured it all out [wikipedia.org].

          Detecting a malfunction in a sensor is hard, really hard. You'll need more than one sensor, preferably of different types, to realise there's an error at all, and then you have to decide which of the contradictory sensor readings is the correct one. And since sensors naturally return slightly different results anyway, you'll have to account for that as well.

          So let's say we solved this. Then you know there's a problem. For an autonomous car it's simple: it could decide to continue (minor problem), or stop (e.g. a tyre blow-out or other major problem that makes it unable to continue, or simply "I don't know how to handle this situation, so I pull over to the side of the road and stop to have my human overlords sort it out"). In the second scenario an automated call to the repair service could be included, so the human(s) in the car can continue to sleep while it's being fixed and afterwards be sent on their way again. (A rough sketch of this decision logic follows below.)
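
          In rough Python, the vote-then-decide logic described above might look like the following minimal sketch. It assumes simple median voting over redundant sensors; every name and threshold here is illustrative, not any real vendor's system.

            from statistics import median

            AGREE_TOLERANCE = 0.05  # readings within 5% of the median count as agreeing (assumed)

            def vote(readings):
                """Return the median consensus and any readings that disagree with it."""
                m = median(readings)
                band = AGREE_TOLERANCE * max(abs(m), 1e-9)
                return m, [r for r in readings if abs(r - m) > band]

            def decide(readings):
                """Continue if the sensors agree; pull over if there is no trustworthy consensus."""
                _, outliers = vote(readings)
                if not outliers:
                    return "CONTINUE"
                if len(readings) - len(outliers) > len(readings) / 2:
                    return "CONTINUE_AND_FLAG_FOR_SERVICE"  # a clear majority still agrees
                return "PULL_OVER_AND_CALL_FOR_HELP"        # let the human overlords sort it out

            print(decide([10.1, 10.2, 10.15]))  # CONTINUE
            print(decide([10.1, 10.2, 55.0]))   # CONTINUE_AND_FLAG_FOR_SERVICE
            print(decide([10.1, 33.0, 55.0]))   # PULL_OVER_AND_CALL_FOR_HELP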

          An airplane doesn't have this fail safe stop option, and needs to have human overlords present at all times to take control if something happens the programmers didn't foresee.

          • An airplane doesn't have this fail safe stop option, and needs to have human overlords present at all times to take control if something happens the programmers didn't foresee.

            Even then, there are arguments for removing the human pilots today because they actually cause around half the accidents.

          • by Lumpy ( 12016 )

            "Detecting a malfunction in a sensor is hard, really hard. "

            It depends. You have a known range the sensor will read, and you have a known rate of change. For example, the sensor in my BMW that measures steering angle will go from 10 to 65525; it can read from 0 to 65535, but the physical limits of the mounting will not allow it, which is fine. The computer system also knows that it is 100% impossible to have a rate of change of more than + or - 3500 per second, so if any rate of change is higher than that, the reading gets flagged as a sensor fault.
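
            In code, that range-and-rate plausibility check could look like this minimal Python sketch. The limits mirror the figures in the comment above; the function name and the dt parameter are illustrative assumptions.

              PHYSICAL_MIN, PHYSICAL_MAX = 10, 65525  # mounting limits within the sensor's 0..65535 scale
              MAX_RATE_PER_SECOND = 3500              # fastest physically possible rate of change

              def plausible(reading, previous, dt):
                  """Reject readings outside the physical range or changing impossibly fast."""
                  if not PHYSICAL_MIN <= reading <= PHYSICAL_MAX:
                      return False  # outside the mechanical range: sensor fault
                  if previous is not None and abs(reading - previous) / dt > MAX_RATE_PER_SECOND:
                      return False  # impossible rate of change: sensor fault
                  return True

              print(plausible(32000, 31900, 0.1))  # True: in range, rate = 1000/s
              print(plausible(65530, 32000, 0.1))  # False: beyond the mounting's physical limits
              print(plausible(33000, 32000, 0.1))  # False: rate = 10000/s exceeds 3500/s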

          • You wrote, "Detecting a malfunction in a sensor is hard, really hard."

            Actually it is quite simple to do. If you get anything more than the cheapest of sensors, they continually diagnose themselves and report back the diagnosis. There are failures that cause the sensor to freeze up and stop reporting. If it keeps sending the same data, it's easy to detect that the value has stopped changing. If it stops sending any data at all, it's easy to see a step change that should not have occurred; and you can also use redundant sensors.
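
            As a minimal Python sketch of those two checks (a stuck value and a silent sensor); the thresholds and class name are assumptions for illustration only.

              import time

              STUCK_LIMIT = 50       # identical consecutive readings before declaring "stuck" (assumed)
              SILENCE_LIMIT_S = 0.5  # max seconds with no data before declaring "silent" (assumed)

              class SensorWatchdog:
                  def __init__(self):
                      self.last_value = None
                      self.repeat_count = 0
                      self.last_seen = time.monotonic()

                  def on_reading(self, value):
                      # A healthy sensor jitters slightly; a frozen one repeats itself exactly.
                      self.repeat_count = self.repeat_count + 1 if value == self.last_value else 0
                      self.last_value = value
                      self.last_seen = time.monotonic()
                      return "STUCK" if self.repeat_count >= STUCK_LIMIT else "OK"

                  def check_alive(self):
                      # Called periodically: catches a sensor that stops sending data at all.
                      return "SILENT" if time.monotonic() - self.last_seen > SILENCE_LIMIT_S else "OK"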

      • by HiThereImBob ( 3935253 ) on Wednesday March 04, 2015 @11:56PM (#49185965)

        Self-driving cars should be the legal equivalent to sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one means you're responsible/liable for fuel & maintenance - and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation. (Otherwise, things such as self-valet and timed pick-ups won't happen)

        Let's be realistic. Self-driving cars are coming, but it is going to be a gradual transition. We've already seen the beginning of it with adaptive cruise control and self-parking. These features will continue to be refined while new ones are added, but we almost certainly face years (decades?) of gradual transition where our cars are some weird hodgepodge of self driving and user operated. The laws governing this won't be nearly as straightforward as you suggest.

        • Let's be realistic. Self-driving cars are coming, but it is going to be a gradual transition. We've already seen the beginning of it with adaptive cruise control and self-parking. These features will continue to be refined while new ones are added, but we almost certainly face years (decades?) of gradual transition where our cars are some weird hodgepodge of self driving and user operated.

          What's funny is this whole bit about "Where is my flying car?" when realistically it won't happen in any quantity until personal flying is almost completely automated. And I don't see that happening until we CAN, at least, make reliable automated cars.

          The collision-avoidance problem, in some ways, is multiplied in the air. At least on the ground you have specific lanes with traffic control devices on them (lights, etc.).

          • The collision-avoidance problem, in some ways, is multiplied in the air. At least on the ground you have specific lanes with traffic control devices on them (lights, etc.).

            Just the opposite. Consider that we developed drones long before we developed a self-driving car. You can program specific lanes for flying, they're used all the time by commercial aircraft, but by the same token there's a lot less static clutter, margins are greater (no worrying about whether the kid on the side of the road will dart out), etc...

            There are reasons why we developed self-piloting planes decades before we developed self-driving cars.

        • by fuzzyfuzzyfungus ( 1223518 ) on Thursday March 05, 2015 @01:05AM (#49186261) Journal
          Your suggestion about the pace of the transition is highly plausible; but that needn't imply much regulatory complexity: At present, a car with a licensed operator can have a variety of convenience features that involve a level of automation and (very) bounded autonomy without any variation in what type of license you need for what level of features. There may be some hassle on the vendor's end, in convincing the relevant feds that their intended new feature isn't an automated accident generator; but there's nothing on the driver side.

          Given that operator handoff is most likely to happen either under relatively hairy conditions, or when some system failure has left the automated systems unable to cope, there isn't an obvious incentive to relax the (already not terribly demanding, at least in the US) requirements placed on licensed drivers until 'self-driving' actually does mean 'self-driving'. If it means 'sometimes self-driving, except the hard parts', that may require less operator effort; but not obviously less operator knowledge (if anything, given that drivers usually get somewhat safer with experience, at least until they hit the point where each additional year stops making them less young and stupid and starts making them more old and inept, I'd be particularly worried about the likely performance of somebody whose vehicle is sophisticated enough to coddle him most of the time, then screams and hands him the wheel when the situation is already halfway lost).

          I have no doubt that the laws (or at least the liability litigation and insurance-related contracts, even if carried out under existing law) for damage and death caused by partially-automated vehicles will be an epic nightmare of horrendous proportions; but on the operator licensing end, "If you might have to drive it, you need a driver's license; if you won't have to drive it, you don't." really covers a lot of territory. There might be some incremental adjustments, mostly to the format of the test (say, allowing use of a rear-view camera in addition to mirrors and over-the-shoulder during tests of parking); but not too much need to complicate things.
          • It's worth noting that there is one piece of automation in cars already that does give a different kind of driving license in a lot of places: automatic gear change. If you get a driving license in a car that has an automatic transmission then you can't drive manual cars with it, though the converse is allowed.
          • by wvmarle ( 1070040 ) on Thursday March 05, 2015 @04:44AM (#49186953)

            Given that operator handoff is most likely to happen either under relatively hairy conditions, or when some system failure has left the automated systems unable to cope,

            Euhm... let me get this right... you expect cars to drive automatically, except when it gets difficult or something else unexpected happens it suddenly gives back control to the driver. That's what you mean, right?

            Bad idea. Very bad idea. The driver is probably reading the paper, or dozing off, or otherwise simply not paying attention to the road, as the car is doing the driving and he has nothing to do. He's not supposed to do anything about driving, as the car is in full automatic driving mode. Suddenly demanding his attention, then expecting him to handle a difficult situation instantly, is asking for accidents. Far more than if the driver had been in control all along, possibly seeing the situation coming and so having much more time to react.

            To allow the driver to fully hand off control to the car, the car should be able to handle it all. The driver-assist functions available on certain cars nowadays are a great start towards full control by the car: for now the car will intervene in certain emergency situations; once that's all settled, we can think about handing off control of the rest of the ride as well. For fully automatic drive, the car should not rely on human intervention, ever.

      • Self-driving cars should be the legal equivalent to sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one means you're responsible/liable for fuel & maintenance - and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation.

        The autonomous car is safe only within its operational limits --- but how many drivers will be willing to let a car or its manufacturer decide when it is safe to take to the roads?

        How many will risk being stranded if automated systems begin shutting down because they are confused and overwhelmed by bad weather, outdated maps, or other unforeseen circumstances?

        • How many will risk being stranded if automated systems begin shutting down because they are confused and overwhelmed by bad weather, outdated maps, or other unforeseen circumstances?

          Probably the same number who are willing to try horseless carriages that might get overwhelmed by bad weather, outdated maps, or other unforeseen circumstances.

        • by Aighearach ( 97333 ) on Thursday March 05, 2015 @12:58AM (#49186227)

          By that theory, nobody ever drives anywhere, because there could be an unexpected road closure. I go lots of places where there is only one road, and if it is closed (which happens) then you can either try the next day, or drive an extra 250 miles. I've never once heard of it as a reason people don't go to those places. Even a doctor isn't going to stay in town and never go to the beach on a day off because of some small percent chance the road would be closed.

          If the car is leased with a service agreement (likely for early versions) then you probably just call roadside assistance if it strands you, and they send a tow truck, same as AAA.

          Gosh, nobody would even play golf, because of the lightning risk.

        • Self-driving cars should be the legal equivalent to sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one means you're responsible/liable for fuel & maintenance - and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation.

          The autonomous car is safe only within its operational limits --- but how many drivers will be willing to let a car or its manufacturer decide when it is safe to take to the roads?

          How many will risk being stranded if automated systems begin shutting down because they are confused and overwhelmed by bad weather, outdated maps, or other unforeseen circumstances?

          How many drivers? Probably not that many, unless the safety envelope is very wide indeed. However, if the autonomous car is, in fact, autonomous, the question is also how many non-drivers would like to have access to the road, most of the time, without becoming drivers. Especially if they don't have a choice (visual or other incapacity that precludes driving, alcohol violation, too young, etc.) or their use case is relatively miserable driving. (If you are going to have a shit commute in heavy traffic, do you really want to be the one doing the driving?)

      • You should know better than to make false assertions when we have plenty of evidence countering your assertion that technology will ever be this good. Since the 1960s we have been automating space travel and airlines, and still need pilots and astronauts because when the shit hits the proverbial fan humans are required to intervene. Sometimes to correct problems with the technology, and sometimes to bypass it and fly by hand.

        Drones require people to pilot them too, so don't try to go down a bad path.


        • Since the 1960s we have been automating space travel and airlines, and still need pilots and astronauts because when the shit hits the proverbial fan humans are required to intervene.

          We have pilots to make passengers feel good. We have astronauts because we can't make a robot as dextrous as a human yet.

        • by Jeremi ( 14640 ) on Thursday March 05, 2015 @02:05AM (#49186471) Homepage

          Nothing, and that is absolutely nothing, has ever been made by man which has been perfect.

          A self-driving car does not have to be perfect. It just has to be better than the alternative.

          With motor vehicles already being the number one killer in the US annually, we want human intervention early and often.

          Isn't the fact that motor vehicles are already the number one killer in the US annually actually an argument for automated cars?

          As stated above, a half a century has not perfected "self driving" anything else.

          Five centuries of work before that never perfected heavier-than-air flying machines either, until one year, presto, all the necessary preconditions were finally met and airplanes became a reality. There's nothing linear about progress.

    • If "yes", it's worse than self-driving.
      • I just realized that comment made no sense. Excuse me. I should have said:

        If "yes," it's worse than traditional cars. Even if you're stuck in a traffic jam, if you have to pay attention (in case something goes wrong with your car), there is no benefit.
        • You don't have to be licensed so that you can pay attention "in case something goes wrong," though you'll probably be expected to push the car out of the roadway if physically able.

          The reason you have to be licensed is that if the car malfunctions and creates an insurance claim, there is lots of existing legal precedent related to insurance liability that means the insurance company will require a licensed driver, until the laws are changed by people not scared of self-driving cars. That will take up to 50 years.

    • A driver's license is not really entirely about driving, which is why some jurisdictions refer to them as operator's licenses.

      To operate a motor vehicle, you're showing competence in the vehicle's operation. For a normal car, that means mostly the in-motion controls and law knowledge, but there's also a section of most tests where you're required to demonstrate mastery of the machine and the ability to keep it in good condition, by demonstrating indicator lights, completing a knowledge test, passing emissions inspections, and so on.

    • I know, right? Like, how can you drive your license around if you're not driving. Oh wait, but it says the car will be driving. Wait, I don't even drive my license as it is, I drive my car!

      If it doesn't drive, I'll agree it isn't self-driving. But if it isn't licensed, then I can only agree it is not self-licensing.

      It is a bit of a "no-brainer" that at first a licensed driver will be required, for the purpose of integrating normally into the existing insurance law and regulation. Only after they're common will that change.

  • TL-DR (Score:2, Insightful)

    by wbr1 ( 2538558 )
    As with any new technology, society and by proxy the government will struggle with its place and rules in said society. There will be oversteps, misses, and oversights.

    There will be detractors, luddites, and evangelists, sociopaths and attention whores all vying for a moment in the sun.

    Welcome to the human race. I'll go get my popcorn.

  • by mjwx ( 966435 ) on Wednesday March 04, 2015 @11:08PM (#49185703)
    Do pilots still need licenses in the age of autopilot? Well yes because machines aren't infallible.

    For a long time, an autonomous car will not be driverless. People need to get over this notion that next year a car will drive itself and you'll sit in the back with a Martini and the paper. That probably won't happen in our lifetimes.

    Initially, fully autonomous modes will only be permitted on certain roads (think limited access roads like highways, freeways and autobahns). This will last years, as engineers are even more conservative than lawmakers. The next step is likely to be special lanes on A roads. It will be a long time before autonomous cars are good enough to operate on a B road or suburban street.

    Ultimately, because the law requires someone to be responsible for the operation of the machine it means a qualified operator will need to be at the controls whilst in operation. Same with a lot of other automated systems (such as long distance trains).
    • by sl149q ( 1537343 )

      Unless you are over 80 it is going to happen in your lifetime.

      Fully autonomous vehicles will be driving on all (except possibly rural) roads in the near future.

      • by arth1 ( 260657 )

        Your dad said the same thing about flying cars.

      • Unless you are over 80 it is going to happen in your lifetime.

        Why are you so certain? The technology we have today is certainly not fully autonomous, and the stuff Google is working on isn't leading towards that. Do you have a reasoning behind your estimate, or is it merely a guess?

        • The google stuff is fully autonomous already. They have steering wheels for 2 reasons: they're required to by the State of California, and they're prototypes and they need to be able to steer them around with some of the equipment turned off.

          A consumer model doesn't need to be able to still be operated after a malfunction. It can just shut down and call a tow truck.

      • You have no idea how long I'll live, you insensitive clod! :P

    • Do pilots still need licenses in the age of autopilot? Well yes because machines aren't infallible.

      Not quite. It's "yes" because most people would be unable to get over their fear of flying in an entirely autonomous plane, not because we need heroic pilots to override the computer when things go wrong.

      Consider that about half of all aviation accidents are traced to pilot error. The percentage of crashes caused by autopilot error is zero.

      • by mjwx ( 966435 )

        Do pilots still need licenses in the age of autopilot? Well yes because machines aren't infallible.

        Not quite. It's "yes" because most people would be unable to get over their fear of flying in an entirely autonomous plane, not because we need heroic pilots to override the computer when things go wrong.

        Consider that about half of all aviation accidents are traced to pilot error. The percentage of crashes caused by autopilot error is zero.

        No,

        Pilots are still there because autopilots can fail.

        http://www.dailymail.co.uk/new... [dailymail.co.uk]

        You didn't hear about this 4 years ago because no-one died... Thanks to some quick thinking by the "error prone" lumps of meat in the cockpit.

    • People need to get over this notion that next year a car will drive itself and you'll sit in the back with a Martini and the paper. That probably won't happen in our lifetimes

      It'll happen during the next decade. Bet against Dr. Moore at your own peril.

      (granted, the government will lag 20 years behind the technology, so we'll still have drunk drivers killing people when the autopilots would have been safer)

      • by mjwx ( 966435 )

        People need to get over this notion that next year a car will drive itself and you'll sit in the back with a Martini and the paper. That probably won't happen in our lifetimes

        It'll happen during the next decade. Bet against Dr. Moore at your own peril.

        (granted, the government will lag 20 years behind the technology, so we'll still have drunk drivers killing people when the autopilots would have been safer)

        The concentration of transistors has nothing to do with this.

        You rely on a bad interpretation of Moore's Law at your own peril.

        The technology will be adopted slowly because any mistake will kill the technology. When your laptop crashes due to a production fault, you might lose a little bit of work that you'll have to redo; when a car crashes due to a production fault, there's a good chance people will die. So ordinarily cautious and conservative car companies will be even more cautious and conservative.

    • I agree that adoption will be gradual. The first generation of "self-driving" cars will probably have "smart cruise-control" and "self-parking" modes, but the driver will still be expected to be at the wheel and ready to take control if needed. Next, the vehicle will be smart enough to take you from start to destination by itself, but only in good weather and relatively common driving circumstances. Eventually, engineers will probably figure out how to make these systems so smart and reliable that we can hand over control entirely.

    • Do pilots still need licenses in the age of autopilot? Well yes because machines aren't infallible.

      This is a terrible analogy. First, autopilot for a plane cannot taxi the aircraft, so it is not feature complete. Secondly, the consequences of mechanical failure in a car are far less severe, and you can probably solve most of the ones which do not themselves involve the engine dying by having a kill switch and a steering wheel: all you have to do is yank the switch and steer the now rapidly braking car out of trouble. A kill switch on an aircraft is a somewhat less viable option, which is why you need a pilot.

  • Should all car drivers be accomplished horse riders? Well yes obviously! You never know when your car will break down, run out of gas, etc. and you'll need to hitch up a horse to get you home.

    I think that it's pretty clear that within a few tens of years the car with a driver will be the anomaly. The economic advantages in large areas of transportation (trucking, taxis, delivery, etc.) are so huge that the technology will be adopted, and the transition to home vehicles is inevitable because the cost is minimal.
    • The cost for the array of sensors is far from minimal at the moment. Maintenance on them will add up too, you have new complicated pricey parts. The majority of people are probably driving cars worth $5K or less. Cheap low maintenance human-driven vehicles will be the norm for the foreseeable future, outside of wealthy suburbs.

      • The cost for the array of sensors is far from minimal at the moment. Maintenance on them will add up too, you have new complicated pricey parts. The majority of people are probably driving cars worth $5K or less. Cheap low maintenance human-driven vehicles will be the norm for the foreseeable future, outside of wealthy suburbs.

        Driverless cars will probably be introduced as a taxi-like service. That way the cost will be spread out over a large customer base. At some point most young couples will decide not to get a second car because the autonomous service will take one of the spouses to work and back. Then, with a generation or two of families having only one vehicle, new young couples will start by bypassing owning a car in the first place. Or at that point, they will have become economical enough that the one car the family does own will be autonomous.

  • by Chas ( 5144 ) on Wednesday March 04, 2015 @11:26PM (#49185805) Homepage Journal

    And I stopped listening right there.

    Only fucking MORONS want this sort of thing.

    When you're in a piece of heavy machinery, like a car, even if you're NOT driving it, you DON'T want to be impaired in case of an emergency.

    So, drinking in a self-driving car is pretty much out. And for many of the reasons this dipshit talked about. MALFUNCTIONS.

    Before you bring up bus and rail transport, keep in mind there are people actually driving those. And, in the case of long distance trains, crews full of people. All better trained at running the transportation than you are.

    • Before you bring up bus and rail transport, keep in mind there are people actually driving those.

      Many subway trains are now automated. Expect all other vehicles to follow eventually.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      "Only fucking MORONS want this sort of thing.

      When you're in a piece of heavy machinery, like a car, even if you're NOT driving it, you DON'T want to be impaired in case of an emergency."

      Wha?? How's this any different than taking a cab home?

      Am I putting myself in horrible danger when I take a cab home after a night of drinking? After all, I don't want to be impaired in case of emergency. Friends don't let friends take a cab?? -rolls eyes-

      Blah... you people worry too much.

    • by tmosley ( 996283 )
      Drivers can have strokes and heart attacks. Why don't you feel like you need to be prepared to take over the wheel of any bus that you get on? Sandra Bullock did!
    • by afgam28 ( 48611 )

      That seems a bit extreme. I don't know about you but if I drink a small amount of alcohol it won't impair my ability to react to an emergency in any meaningful way.

    • by AK Marc ( 707885 ) on Thursday March 05, 2015 @04:04AM (#49186827)
      So you never ride drunk in a cab in case the cabbie has a heart attack? For every argument about this, substitute chauffeured car for self-driven car and run it through that filter first.

      You'll find that people hold self-driven cars to a much higher standard than we have today. I think people just fear change.
  • Insurance (Score:4, Interesting)

    by sound+vision ( 884283 ) on Wednesday March 04, 2015 @11:26PM (#49185807) Journal
    This also makes me think, will you need insurance for a self-driving car? If two self-driving cars are involved in a collision, who is responsible for the damages? You could say the manufacturer is responsible - but what if it's a collision between a self-driving car and a human-driven car? Or, will manufacturers be willing to take on the burden of providing insurance for each car they sell?
    • Re:Insurance (Score:4, Insightful)

      by tmosley ( 996283 ) on Wednesday March 04, 2015 @11:54PM (#49185955)
      Fault is determined the same way it is today. You can bet your ass that self driving cars will have black boxes full of data on speed, location, orientation, etc. If one causes a wreck, that would probably be the end for that manufacturer from a publicity standpoint. As such, you can bet that any model that hits the road is going to be DAMN safe.
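
      As a rough illustration of such a black box, here is a minimal Python sketch of a fixed-size telemetry ring buffer. The field names, sample rate and 30-second window are assumptions, not any manufacturer's spec.

        from collections import deque
        from dataclasses import dataclass

        SAMPLE_HZ = 10       # assumed recording rate
        WINDOW_SECONDS = 30  # assumed retention window

        @dataclass
        class Telemetry:
            timestamp: float
            speed_mps: float
            lat: float
            lon: float
            heading_deg: float

        class BlackBox:
            def __init__(self):
                # A bounded deque: old samples fall off automatically as new ones arrive.
                self.buffer = deque(maxlen=SAMPLE_HZ * WINDOW_SECONDS)

            def record(self, sample: Telemetry):
                self.buffer.append(sample)

            def dump(self):
                # What investigators (or insurers) would read out after a wreck.
                return list(self.buffer)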
    • If two self-driving cars are involved in a collision, who is responsible for the damages?

      If the cars are owned by individuals and not a taxi service, it'll probably come down to whether they've kept the software up to date. If one person's car is up to date with the latest patches, and the other person hasn't updated in the last three years, and their car has had an update which would have avoided the accident, the person who didn't maintain their vehicle's software will be liable.

  • So let me get this right. You are in a 'driverless car', yet your job is to painstakingly hover over the controls trying to second-guess the AI every single second you are in the vehicle?!???!! Good luck with that, because if the AI fucks up you have a second or two tops to stop yourself from becoming road paste. It sounds like a massive copout from manufacturers wanting to sell autonomous features before the technology is mature enough to realistically insure and assure customer safety.
  • or interchanges. If the cars are well guided and coordinated you could have full-speed ground-level crossings where the cars just space out enough to weave past each other. It would be terrifying at first, but people would get used to it.

  • by tlambert ( 566799 ) on Wednesday March 04, 2015 @11:40PM (#49185889)

    Why don't we just incorporate them?

    Then they'll legally be people, and they can get their own driver's licenses!

  • When the automobile debuted, the UK passed the infamous Locomotive Acts (otherwise known as the Red Flag Law) [wikipedia.org], requiring someone to walk in front of a "horseless carriage" waving a red flag.

    Requiring a license for a self-driving car is the modern red flag to avoid spooking the lawyers.
    • When the automobile debuted, the UK passed the infamous Locomotive Acts (otherwise known as the Red Flag Law), requiring someone to walk in front of a "horseless carriage" waving a red flag.

      The first Locomotive Acts were passed in the 1860s.

      Forget the "horseless carriage." We are talking about road trains, huge and heavyweight steam-powered agricultural tractors, balers, threshers, bulldozers, steam shovels and the like.

      To this day flagmen and escort vehicles serve the same purpose.

  • But it won't be called a "driver's licence", it will be called a government-issued photo ID.
    You need one now to cash a check, vote, etc.

    (I would have used the term "state issued ID" but SCOTUS still has to decide what the term 'state' means.)

    Note to grammar nazis - would have is pronounced would of

  • The state needs the ability to track the movement of the populace. If they had unlicensed cars on the road, people would be able to move about freely, without being surveilled. Imagine the safety implications there...

    Also, they would lose an avenue for much needed recurring revenue, and something to hold over the head of criminals.

    It's almost like you think you live in a free country?

  • by jader3rd ( 2222716 ) on Thursday March 05, 2015 @12:48AM (#49186193)
    You would need a 'driver' for a driver-less car as much as you need a horse for a horse-less carriage.
  • I think the key to making cars that are really 'self driving' will be to have the on-board systems backstopped by a call center rather than anyone sitting in the vehicle itself. Autonomous aircraft are really designed with a computer to handle the routine flying and then pass things off to a remote pilot for the interesting bits. An autonomous car could handle the freeway and major streets by itself quite well but might need to call up a licensed operator to negotiate a parking garage or a work zone.

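    A minimal Python sketch of that escalation logic, with a made-up confidence score and threshold purely for illustration:

      CONFIDENCE_THRESHOLD = 0.95  # assumed cut-off for "the car can handle this alone"

      def next_action(situation_confidence, remote_operator_available):
          """Drive autonomously when confident; otherwise escalate or stop."""
          if situation_confidence >= CONFIDENCE_THRESHOLD:
              return "DRIVE_AUTONOMOUSLY"
          if remote_operator_available:
              return "HAND_OFF_TO_REMOTE_OPERATOR"  # e.g. a parking garage or work zone
          return "PULL_OVER_AND_WAIT"               # fail safe when no help is reachable

      print(next_action(0.99, True))   # routine freeway driving
      print(next_action(0.60, True))   # confusing work zone -> remote operator
      print(next_action(0.60, False))  # no link -> stop safely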
  • by Karmashock ( 2415832 ) on Thursday March 05, 2015 @12:58AM (#49186229)

    Let's say my self-driving car runs someone over... who is liable?

  • I think a more interesting question is: will guardians be allowed to put children in cars alone?
    ( Jimmy, your parents just called and said they would be late, I'm going to call you a robotaxi. )
