Transportation AI

Kids With Wheels: Should the Unlicensed Be Allowed To 'Drive' Autonomous Cars? 437

Hallie Siegel (2973169) writes "From the Open Roboethics Research Initiative: Earlier this month, when we asked people for their general thoughts on autonomous cars, we found that one of the main perceived advantages is that those who are not licensed to drive will be able to get places more conveniently. Some results from our reader poll: about half of the participants (52%) said that children under the legal driving age should not be able to ride in driverless cars alone, 38% believe that children should be able to ride in driverless cars alone, and the remaining 10% think that children should be able to do so only with proven technology and specific training."
This discussion has been archived. No new comments can be posted.

  • no (Score:2, Insightful)

    no. the idea of an autonomous vehicle with no possible driver to override it is just plain stupid.
    • Re:no (Score:5, Insightful)

      by Rich0 ( 548339 ) on Monday May 26, 2014 @06:37PM (#47095747) Homepage

      no. the idea of an autonomous vehicle with no possible driver to override it is just plain stupid.

      The idea of a manually-operated vehicle with no possibility of a more accurate automated system overriding it is just plain stupid.

      It all comes down to risk. Obviously today autonomous vehicles aren't ready to take over completely. However, they will steadily improve, and it seems unlikely that human drivers will improve at all. At some point the risk of a computer causing an accident will drop below the risk of a person causing one, and at that point it becomes safer to just not let people interfere with the operation of the vehicle.

      Would you consider it wise to give passengers in an airliner the ability to take over in case the pilot makes a mistake? Such a feature is far more likely to cause a disaster than avert one. Once cars get to the point where they can be operated more safely than aircraft (which are already safer than cars are today) then taking control of a car in a crisis will just be getting in the way of the proven driver: the machine.

      • Re:no (Score:5, Insightful)

        by sl149q ( 1537343 ) on Monday May 26, 2014 @06:51PM (#47095815)

Exactly. By the time this question is germane it will be the equivalent of "would you let your kid ride in a taxi without you?".

The long term direction is a far safer driving experience, based solely on removing human drivers from all cars. Allowing them to "override" the automated systems just makes them far more dangerous than cars today, where at least the norm is drivers who are somewhat used to driving. Letting people who rarely if ever drive override the automation is just a disaster waiting to happen.

        • Re:no (Score:5, Insightful)

          by immaterial ( 1520413 ) on Monday May 26, 2014 @07:32PM (#47096011)
          We already allow kids alone on the streets on foot and on bicycles at parental discretion. As you say, a proper automated vehicle will be safer than a car piloted by an adult human, so it will be far, far safer than a bicycle piloted by a child. I don't see how there's even a question.
          • by msauve ( 701917 )
            "a proper automated vehicle will be safer than a car piloted by an adult human"

            I'll see your straw man, and raise you a spark.

Your proposition is unproven. It may occur at some time in the future, but that remains to be seen.
            • Re:no (Score:4, Funny)

              by immaterial ( 1520413 ) on Monday May 26, 2014 @08:52PM (#47096429)
              Did you think we were discussing all the automated cars on the road right at this moment?
This is more of a No True Scotsman issue. I think it's a fair statement that an automated vehicle which is not safer than a car piloted by an adult human is not a 'proper' automated vehicle.

              The vehicle really only has to be safer than an average human driver but we'll probably have to make it safer than any conceivable human driver before it will be widely accepted.
        • Re:no (Score:5, Interesting)

          by rtb61 ( 674572 ) on Tuesday May 27, 2014 @12:32AM (#47097361) Homepage

The catch with that: taxi drivers are largely kid-proof and computers are not. Much like elevators, devices that interact within public space are very difficult to make child-proof. Even something as simple as a swing is rather difficult to make child-proof, and is something that needs to be used with adult supervision. Let alone the most obvious danger: hacking of the service to facilitate remote-control abduction of children. Children require adult supervision; that is their nature. They are learning to be human beings and will make many mistakes. Adult supervision reduces the number of mistakes children will make, and the greater the risk, the greater the need for adult supervision. A simple hack of an automated car, yet a very disruptive one especially in rush hour, would be for the child to instruct the vehicle to drive round and round a roundabout, actively denying other vehicles access, yet completely legally.

          • Children require adult supervision

            Yes, to a point.

            When they're old enough to be left at home, to use public transport, to cycle around the neighborhood - then they're old enough to ride in an automated car without adult supervision.

            Until then, kids should not be left alone - full stop.

      • by Kaenneth ( 82978 ) on Monday May 26, 2014 @06:57PM (#47095845) Journal

        Well, you COULD give every passenger a virtual control stick on a display panel on their back seat, and let democracy fly the plane.

        It worked for Twitch Plays Pokémon.

        • by sumdumass ( 711423 ) on Monday May 26, 2014 @09:17PM (#47096511) Journal

Do we have passenger jets in which the pilots cannot override the autopilot?

          I mean that is the real comparison here. If anyone can override the automated systems, then that person or some person needs to be qualified and present during the operation.

Before we go completely autonomous with cars, it should be safe to have autonomous lawnmowers. If the thought of a machine with spinning blades roaming around by itself doesn't sit well, cars without the ability to override the autopilot shouldn't either.

          • Before we go completely autonomous with cars, it should be safe to have autonomous lawnmowers.

            Behold! [husqvarna.com] The [robomow.com] future! [lawnbotts.com]

Yes, in some Airbus aircraft, the pilot cannot exceed thresholds set up by the autopilot.
Actually, that's an urban myth - in Normal Law the pilot cannot exceed certain thresholds as you say, but it's a simple button press to put the aircraft in Alternate Law where they can. Boeing aircraft from the 777 onward are essentially the same.

      • Re:no (Score:5, Interesting)

        by ADRA ( 37398 ) on Monday May 26, 2014 @07:01PM (#47095863)

In my city (Vancouver), trains are basically run autonomously under normal circumstances unless there's an interruption, in which case staff at HQ can manually take control of the vehicles. This is at least somewhat oversimplified, as they run on almost entirely isolated railways without many outside risk factors, but a highly advanced car with little more than a GPS (with auto-nav), a stop pedal, and an OnStar-like communications terminal for emergency stop responses and rescue situations could eventually become a valid and functional road driving system for cities. Even a 'manually driven' option for truly rural areas not covered by the grid could be an option that 'turns off' when entering managed city roads.

        I don't see why we couldn't 'have faith' in central city command and control centers which are paid for by road taxpayers to help manage and mitigate risk to public safety. Do you think the added taxes in supporting this would be more or less than the amount lost to accidents/life lost/insurance of a non-managed roadway?

Oh, well, nice dream, but I don't see it happening any time soon. Here's hoping it happens before I die and..fdsfzzzzzzzzzzzzz

      • by epyT-R ( 613989 )

Granting sufficient contextual awareness to free roaming vehicles is too intractable. The computer is clueless about compound cause-effect situations where prevention is better than reaction. You have it backwards. If the automated system can handle 90% of the situational dynamics of driving, you want the human to be able to override it when it's clearly about to get something wrong. I am being 'extremely' generous with that 90% btw.. It's unlikely we'll get that far. Computers are faster, but humans have far better intellectual contextual awareness...

        • Re:no (Score:4, Insightful)

          by immaterial ( 1520413 ) on Monday May 26, 2014 @08:18PM (#47096279)
          This is absurd. There is no 'fixing' the human. Driving was already incredibly risky before cellphones (humans are 'proven' to drive terribly, I mean really? Google's automated cars already have a far better record than the average human and the technology is still in its infancy). Humans do not have 360 degree vision or the mental capacity to focus specific attention on dozens of details and separately moving trajectories simultaneously - even if they ARE paying 100% of their attention to the road (which is obviously grossly optimistic).

          What if the computer can't tell the difference between a bag and a rock? Then it assumes the highest-risk possibility and takes the appropriate mitigation action with reflexes so quick that it has probably begun before the meatbag in the car even takes note of the bag.

          What happens when the perfect driver is checking his side view or rear view mirror right at the moment the rock rolls into the lane in front of him?

          Automated systems are never going to be perfect, but I see no reason they can't be far, far more safe than a system guided by a human being.
        • by Rich0 ( 548339 )

          Granting sufficient contextual awareness to free roaming vehicles is too intractable.

That's a pretty bold statement. Human brains can do it, and they're made out of matter. Why wouldn't a computer that is also made out of matter be able to do the same thing?

          I didn't say that it would happen tomorrow. However, at some point computers will surpass humans in driving skill - it seems inevitable to me. It is just a matter of when.

        • Re:no (Score:5, Insightful)

          by ultranova ( 717540 ) on Monday May 26, 2014 @09:22PM (#47096537)

          The computer is clueless about compound cause-effect situations where prevention is better than reaction.

So are humans. Every single workplace safety program starts and ends with "stop and think about what you're about to do before doing it". Our higher functions operate at the timespan of minutes, not fractions of a second. This is also the reason we have traffic laws: they turn driving from an activity requiring judgement into a mechanical exercise. When that fails, accidents follow.

          You have it backwards. If the automated system can handle 90% of the situational dynamics of driving, you want the human to be able to override it when it's clearly about to get something wrong.

Apart from negating the entire reason one might want to get an automated vehicle in the first place, it's also not physically possible to stay alert and pay attention to a system - in this case traffic - that you aren't actively participating in. This means that you have no idea when the computer is about to do something wrong, much less what to do about it.

          computers are faster, but humans have far better intellectual contextual awareness.

          Humans have next to no intellectual contextual awareness in realtime situations. Various levels of automation drive your body, most social situations, and even activities usually considered intellectual, like math or programming. "Intellectual contextual awareness" is what you use to pick a career, and often not even then.

          (is that a plastic bag or a big rock in the road?)

          Is there a rock beneath the bag? You can't know. You can, however, guess there isn't and adjust your estimates about any future bags containing rocks should this one be harmless. That happens all the time, and is one of the numerous ways in which human rationality tends to break down.

          and have an interest in survival, which makes up for their less consistent behavior.

          No, it doesn't. Your survival instinct manifests as a bunch of reflexes, which do little to help (shielding yourself with your arms) or even need to be worked around (ABS brakes). It doesn't stop people from speeding or ignoring the road in favour of their cellphone, whereas a computer that's told to obey the speed limit will obey the speed limit.

          'Safe' to you might be a computer controlled everything robot that makes gross, heuristic assumptions about the reality around it.. 'Safe' to me is a mechanically cabled accelerator, spring-loaded to drop to idle RPM if it breaks,

          This being a good example of a gross, heuristic assumption. Your "safe" accelerator can be defeated by a cable jam, metal fatigue in the spring, or even a simple bit of sticky dirt on the cabin floor.

          and a human with superior situational awareness capability who cares about his survival behind the wheel.

          And this calling for something that doesn't exist.

          Lets fix the human rather than hobble and distract him with more uselessly complex machines.

          And how would you go about "fixing" humans?

      • Would you consider it wise to give passengers in an airliner the ability to take over in case the pilot makes a mistake?

False analogy. Very few passengers would know how to fly an airliner. For the foreseeable future, however, most adults could drive a car if they needed to take over, which brings us back to the question of whether that ability should be a requirement.

        In any case, while I do not know much about what these automated cars are capable of, surely some human control is going to be needed. It will not be good enough to say to my car "Go to the office" because sometimes I just need the car park (where I prefer

    • Re:no (Score:5, Insightful)

      by aristotle-dude ( 626586 ) on Monday May 26, 2014 @06:37PM (#47095751)

      no. the idea of an autonomous vehicle with no possible driver to override it is just plain stupid.

      The idea of requiring a driver's license to "ride" in an autonomous car is stupid. What's the point if you need to be able to drive?

      • Re:no (Score:4, Insightful)

        by rolfwind ( 528248 ) on Monday May 26, 2014 @07:08PM (#47095885)

        I think the concern is twofold.

As of yet, autonomous vehicles are unproven. It would be nice to have a driver at the wheel just in case. This might not be for emergencies, since a person would be reading or whatever and it's dangerous to give them the wheel unprepared and unaware. But we can presume that the computer might just get confused (let's say at a construction site) and come to a stop and say "Please, human, guide me here until I can take over again." That's legitimate, because the first generation of autonomous vehicles is certainly not going to be perfect.

Second, we don't want kids having free access to autonomous vehicles. 10 year old Johnny is riding in a car with no parents and just cannot resist the urge to take over the controls. 9 year old Amanda just met a really cool adult online that promises her if she goes to this one address, she's getting all the toys she wants.

        So maybe not a driver's license, since blind people should have access to this technology after the bugs are worked out, but there should be some regulation.

        • by Belial6 ( 794905 )

          10 year old Johnny is riding in a car with no parents and just cannot resist the urge to take over the controls.

          There is no reason this should even be possible.

          9 year old Amanda just met a really cool adult online that promises her if she goes to this one address, she's getting all the toys she wants.

          So, exactly the same situation we have today?

Neither of these makes any sense when it comes to regulations concerning autonomous cars.

So require a key for override; that way the 9 year old can go to grandma's house but not into a tree. If you need to pay attention and be prepared to intervene at any second, then how is this not strictly worse than the current system? It's less convenient than a cab and more expensive than a bus. Pretty useless. And people multi-task with phones and laptops now, so what makes you think they'll pay any attention at all when cars drive themselves 99.99% fine on their own?
First, unaccompanied children riding in these cars would have to wait until the cars have proven themselves. I wouldn't support children riding alone until it's reached the point where you're dropping the steering wheel entirely.

          As for #2, it's easily solved by placing the car into a mode where it only has limited destinations. Worst case, you should readily have records of where the car went and can use that to find the 'cool adult'.
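A minimal sketch of the kind of parent-approved destination mode and trip log described above; every name here (ChildProfile, request_trip) is hypothetical, not any real autonomous-car API:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class ChildProfile:
        name: str
        allowed_destinations: set                    # pre-approved by a parent or guardian
        trip_log: list = field(default_factory=list)

    def request_trip(profile: ChildProfile, destination: str) -> bool:
        """Allow a trip only to a parent-approved destination, and log every
        request so there is a record of where the car went (or tried to go)."""
        approved = destination in profile.allowed_destinations
        profile.trip_log.append((datetime.now(), destination, approved))
        return approved

    # Example: Amanda may ride to school or grandma's house, nowhere else.
    amanda = ChildProfile("Amanda", {"school", "grandma's house"})
    print(request_trip(amanda, "grandma's house"))   # True
    print(request_trip(amanda, "123 Stranger St"))   # False, but logged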

    • Re:no (Score:5, Insightful)

      by ZahrGnosis ( 66741 ) on Monday May 26, 2014 @06:39PM (#47095757) Homepage

      Agreed. If there WERE fully autonomous vehicles (like computer controlled trams in airports are now), it shouldn't matter who drives them. If we get to the point where we trust automobiles to be completely devoid of manual control and override then what difference does it make who's inside?

Until then, no... as long as there are controls or overrides with which someone can cause dangerous scenarios, you should have a license. Maybe we can have a different conversation about an "emergency stop" or changing destinations or minor route corrections, but the way the cars are built now allows for pretty complete driving responsibilities, and they should require similar if not identical rules for the drivers.

    • by hey! ( 33014 )

      I'd modify this answer to be: not yet.

At this point we don't understand what the impact of many driverless cars will be. It makes sense *for now* to require that a licensed human driver be ready to take over the vehicle in case the robotic control begins to conduct the vehicle incorrectly.

Later, as we gain more experience with autonomous vehicles and the systems become both more sophisticated and more proven, we'll reach a point where we have hard data that proves having a human driver handy doesn't statistically

    • by msauve ( 701917 )
      The simple solution is to require very significant liability insurance. Or, buy a horse.
So it's better to have the child driven by a sleepy, irritated or texting parent or, worse, a barely legal sibling? Why not just think of an autonomous car as a trackless train that can take passengers to designated places without worrying about the precise turn-by-turn navigation? This way, the child would have as much control over the car as a train driver. Put a "brake" or "force stop" button that will park the car in the nearest safe location.

      Also, the car AI should already have a built-in restriction ag

It seems pretty clear that there is going to be a transition period where autonomous vehicles will absolutely need to have the ability to let the driver take over for situations like:

      1) Driving in places where you usually are not supposed to due to road work or an accident.
      2) Driving in places that no map data is available for yet.
      3) Getting a vehicle onto a lift at the repair shop for servicing.
4) Pulling a trailer; this adds an entirely new level of difficulty that I suspect autonomous car makers will not ta

It seems pretty clear that there is going to be a transition period where autonomous vehicles will absolutely need to have the ability to let the driver take over for situations like:

        2) Driving in places that no map data is available for yet.

        You're assuming the car is going by map data alone, and not by video analysis? How quaint.

    • Comment removed based on user account deletion
    • On the contrary, I find the idea of an autonomous vehicle with someone who thinks they're clever enough to override it just plain stupid.

    • by mrmeval ( 662166 )

It would get rid of public transportation, be cheaper in the long run and, if zero manual control is allowed, safer.

  • Well, of course. (Score:5, Insightful)

    by CrimsonAvenger ( 580665 ) on Monday May 26, 2014 @06:34PM (#47095727)

    It's not like the guy sitting in the seat is the actual "driver" of an autonomous car.

    And it's not like anyone is actually required to sit in that seat.

Note that if an "autonomous car" requires someone to sit in the driver's seat and pay attention, you might as well not bother making it autonomous. If I have to pay as much attention as if I were the real driver, I might as well drive it myself, since the act of driving at least helps me keep my attention on the traffic.

    • Re:Well, of course. (Score:5, Informative)

      by msobkow ( 48369 ) on Monday May 26, 2014 @06:49PM (#47095807) Homepage Journal

      Oh, by all means, let's have a crying six year old be the sole occupant of a car when it gets in an accident...

      • by immaterial ( 1520413 ) on Monday May 26, 2014 @07:24PM (#47095967)
        How old does a kid have to be before they can walk to school on their own? How would it be any different in an autonomous car? Leave it up to the parents to determine the independence/maturity level of their own children.
        • How old does a kid have to be before they can walk to school on their own? How would it be any different in an autonomous car?

          The difference is how far a kid can go in an autonomous car vs walking under their own power.
          Even a bike doesn't change the situation all that much, since cars are still several times faster than a child's top speed.

          • How old does a kid have to be before they can walk to school on their own? How would it be any different in an autonomous car?

            The difference is how far a kid can go in an autonomous car vs walking under their own power. Even a bike doesn't change the situation all that much, since cars are still several times faster than a child's top speed.

I guess it depends on how much cab fare you give the kid. Or in the case of the autonomous car, how far you let the car take them. I'm assuming there are some safeguards in place so that a thief can't just hop in your autonomous car and say "take me to Denver." And that these same safeguards would keep your kids from straying too far from home without permission. Parents can always be incompetent, of course, but that doesn't require a self-driving car to cause serious problems.

      • Let's get kids off the lawns of America, and out on the road behind the wheel where they belong!

      • Oh, by all means, let's have a crying six year old be the sole occupant of a car when it gets in an accident...

Frankly, a crying six year old being the sole occupant of a car involved in an accident is a lot better than a six year old plus a badly injured parent.

What a great argument. Something potentially going wrong is a great reason to ban that thing outright. That's why they banned six year olds from walking around - they could potentially fall down and severely harm themselves in the process. LEARN WHEN PRINCIPLES MATTER. You need statistics, not principles, to win your argument.
    • It's not like the guy sitting in the seat is the actual "driver" of an autonomous car.

      He is the one programming the destination. The one who ultimately decides whether the run is within the car's operational parameters. I don't want to see a young child or an impaired adult making those decisions.

      The geek tends to assume that the autonomous car will have complete and accurate situational awareness. That it can plan ahead.

      I have my doubts.

      I learned to drive on country back roads ---- learning to sweep my eyes right and left watching out for traffic approaching a blind crossroads long befor

      • He is the one programming the destination. The one who ultimately decides whether the run is within the car's operational parameters.

Which, in practice, means checking that there actually is a road where the map says it is, and that there's enough fuel in the tank. The latter is trivial, and the former is an absolute requirement for any kind of self-driving car.

        I don't want to see a young child or an impaired adult making those decisions.

        You haven't given any compelling reason why they would need to.

        I learned to d

  • Trains? (Score:5, Insightful)

    by Mr D from 63 ( 3395377 ) on Monday May 26, 2014 @06:35PM (#47095737)
    Should kids be allowed to ride trains/metros all by themselves? Same answer.
  • Robotic chauffeur (Score:5, Interesting)

    by Jamu ( 852752 ) on Monday May 26, 2014 @06:36PM (#47095745)
    If the autonomous car is reliable there should be no need for a drivers' license, for the same reason I wouldn't be required to have one if driven by a chauffeur.
    • If the autonomous car is reliable there should be no need for a drivers' license, for the same reason I wouldn't be required to have one if driven by a chauffeur.

      It's even clearer than that, once you consider the bureaucratic and legal implications of it all.

      Do you seriously think any manufacturer or government would let a child ride in such an "autonomous car" if it weren't "reliable"? I'm sure before that's the case, any "semi-autonomous car" or whatever will carry strong warnings that it can only be operated by a licensed driver -- and if you don't follow that and let your kid ride in it alone, the company will claim they are not liable. Further, the parents

  • by quantaman ( 517394 ) on Monday May 26, 2014 @06:48PM (#47095803)

    Is there a scenario in which the unlicensed will be required to operate the vehicle themselves?

    If yes, the unlicensed can't drive.

    If no, they can.

For a partially autonomous car that requires occasional driving, an unlicensed user obviously can't use it.

For a fully autonomous car there should never be a necessity to drive, since an autopilot failure will require a graceful breakdown mode regardless. Even if there's a manual drive mode, an unlicensed user won't be allowed to use it and the car will essentially be broken down on the road.

    The only time it comes up is with a partially autonomous car requiring occasional non-driving guidance. Then it's simply a question of whether you design an alternate certification process for the unlicensed and it really depends on the degree of user interaction required.
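That decision rule is simple enough to sketch; the three autonomy categories and the idea of an alternate certification are from the comment above, while the code and names are purely hypothetical:

    from enum import Enum

    class Autonomy(Enum):
        PARTIAL_DRIVING = 1    # may require occasional real driving
        PARTIAL_GUIDANCE = 2   # may require occasional non-driving guidance
        FULL = 3               # never requires a human to drive

    def may_use(autonomy: Autonomy, licensed: bool, alt_certified: bool = False) -> bool:
        """If the user might be required to drive, only the licensed qualify;
        a fully autonomous car needs no license at all; the middle case could
        accept a lighter, alternate certification instead of a full license."""
        if autonomy is Autonomy.PARTIAL_DRIVING:
            return licensed
        if autonomy is Autonomy.FULL:
            return True
        return licensed or alt_certified

    # An unlicensed child could ride a fully autonomous car,
    # but not one that may hand the wheel back.
    print(may_use(Autonomy.FULL, licensed=False))             # True
    print(may_use(Autonomy.PARTIAL_DRIVING, licensed=False))  # False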

    • Is there a scenario in which the unlicensed will be required to operate the vehicle themselves?

      If yes, the vehicle is NOT autonomous.

      • Is there a scenario in which the unlicensed will be required to operate the vehicle themselves?

        If yes, the vehicle is NOT autonomous.

        Like a cab with a driver that had a stroke and collapsed. With a legally blind fare in back.

  • by russotto ( 537200 ) on Monday May 26, 2014 @06:50PM (#47095813) Journal

For every Frankenstein that pre-emptive handwringing stops, you'll kill a million improvements which would make the world a better place.

  • by kenwd0elq ( 985465 ) <kenwd0elq@engineer.com> on Monday May 26, 2014 @07:00PM (#47095853)

    The government would jail you for leaving your child at home alone. If your autonomous vehicle is as safe as being at home, then the government should also prevent children from operating such a vehicle. Perhaps the child could be allowed to ride alone only if a parent or guardian programmed the destination....

    Or perhaps we need to go back to the 1970's and allow children as much freedom and autonomy as I had when I was eight or ten, when my mother would tell me "Go out and play, and be back before dark."

    • by Belial6 ( 794905 )
      To be fair, leaving your 8 year old at home alone will get you arrested in many places, while sending your kid outside alone and locking the doors while you leave is perfectly legal.
  • by Sasayaki ( 1096761 ) on Monday May 26, 2014 @07:16PM (#47095915)

    The reimagined Battlestar Galactica copped a lot of (somewhat) deserved flak for its filler episodes, but my favourite episode of the entire series is also one of the more blatant filler episodes ("Scar").

    In particular, I loved the scene where it is revealed that Cylon raider-ships also reincarnate, just as their fleshy biological counterparts do. Sharon even spells it out for the characters.

    Starbuck: Raiders reincarnate?
    Sharon: Makes sense, doesn't it? It takes months for you to train a nugget into an effective Viper pilot. And then they get killed and then you lose your experience, their knowledge, their skill sets. It's gone forever. So, if you could bring them back and put them in a brand new body, wouldn't you do it? 'Cause death then becomes a learning experience.

    This is why, I believe, the future will eventually belong to automated drivers. The initial ones are already very good, but there will be holes. There will be headlines like "automated car drives headlong into school, killing 10 of the world's cutest orphans". Human drivers have similar issues and events like that are almost everyday occurrences all around the world. The problem is, as Sharon pointed out, when those drivers die their experience is lost. With an automated system, the skill set improves. Someone discovers that, for example, hey, if a drunk passenger opens the door to a self-driving car at low speed and falls out the system doesn't realise they're gone and blindly drives away.

    So the system improves. The car's internal systems track passengers, and if one exits the car, the vehicle will double back and pick them up. Or contact emergency services if the speed is high enough, and form a roadblock so that this person isn't hit again. Or simply lock the doors to begin with. Or any number of more sane actions. The point is: the accident becomes a learning experience. With a human driver, we spend months training people to become drivers. Then one day they make a stupid mistake -- one other drivers have learnt to avoid, but not this driver -- and become a red smear. Their skill set, their experience and training, is lost.

With automated systems, every mistake is an opportunity to grow. I personally believe that automated driving systems are already better than humans, but this massive evolutionary benefit (directly learning from the mistakes of other drivers as though they were that other driver) ensures that they will continue to improve, whereas human lifespans are finite and so ours will not.

  • by penguinoid ( 724646 ) on Monday May 26, 2014 @07:22PM (#47095957) Homepage Journal

First the driverless cars need to be ridden by members of the general public who can take over if necessary. When driverless cars prove to be trustworthy, then it'll make little difference who the "driver" is. All I know is that taxi drivers are going to go the way of the buggy whip makers.

    • by Belial6 ( 794905 )
Humans trying to take over in an "emergency" will end up about as successful as if you started throwing baseballs at random people and yelling "think fast" just before they hit their heads. The idea of a human taking over is a fantasy at best.
      • by Imrik ( 148191 )

        In an emergency you're right, but there are other reasons a person might need to take over. For example, if there's road construction or an accident blocking part of the road and the autopilot doesn't know how to handle it.

    • by dbc ( 135354 )

Go collect some data. Like the Google cars, for instance. Zero autonomous accidents. Zero. The only accidents they have had are when a human driver is at the controls.

Anyone who responded positively to that idea should be neutered immediately.

  • by szemeredy ( 672540 ) on Monday May 26, 2014 @07:29PM (#47095999) Homepage

    There are three broad topics that I feel need to be addressed before allowing minors to ride around unaccompanied in automated vehicles:

    Liability: Who is responsible for the safety of an unaccompanied minor in the event of an accident or vehicle malfunction, especially if the vehicle is a long distance from home? More importantly, who will be willing to accept that kind of liability and at what cost?

    Capacity: Is there enough room on our roads and in our parking lots to accommodate children riding around in their own personal vehicles? Will the efficiencies of automated vehicle traffic be enough to overcome an overall increase in vehicle traffic? How much will associated expansion projects cost? Can we afford to pay for them?

    Energy: Can we afford the increase in energy consumption associated with increasing vehicle traffic at a time when the capacity of available energy reserves is questionable and energy policy is all over the place?

    • by Imrik ( 148191 )

      For liability the answer is fairly simple, require them to be fully insured against malfunction. The price would probably be fairly high to begin with but as they prove themselves better than human drivers it would go down.

As far as capacity, you have a point for parking, but not for total traffic. Currently, parents will drive both to and from wherever their child goes; with a fully autonomous car, the car would only go where the child does. The same argument applies for energy.

  • by Jim Sadler ( 3430529 ) on Monday May 26, 2014 @07:34PM (#47096029)
An autonomous car should not allow human input. It should come to a stop if the controls fail and remain stopped until help arrives. This is perfect for getting kids to school and picking them up from school as well. We might even be able to eliminate school bus drivers.
    • An autonomous car should not allow human input. It should come to a stop if the controls fail and remain stopped until help arrives

      IF it is safe to stop here.

      IF help arrives in time AND IF the kids remain in the car until it does.
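A rough sketch of that "stop and wait for help" behaviour, including the caveat above about whether the current spot is safe to stop in; the Vehicle class and every method on it are hypothetical stand-ins, not any real vehicle interface:

    class Vehicle:
        """Tiny stand-in for a vehicle's control interface."""
        def __init__(self, position: str, safe_to_stop: bool):
            self.position = position
            self.safe_to_stop = safe_to_stop
            self.stopped = False

        def creep_to_nearest_safe_location(self):
            # In reality: a reduced-speed, minimal-risk manoeuvre to the shoulder.
            self.position = "nearest shoulder"
            self.safe_to_stop = True

        def notify_emergency_services(self):
            print(f"Help requested at: {self.position}")

    def handle_control_failure(v: Vehicle):
        """Stop only where it is safe; otherwise limp to a safe spot first,
        then call for help and remain stopped until it arrives."""
        if not v.safe_to_stop:                 # e.g. mid-intersection or rail crossing
            v.creep_to_nearest_safe_location()
        v.stopped = True                       # remain stopped until help arrives
        v.notify_emergency_services()

    handle_control_failure(Vehicle("rail crossing", safe_to_stop=False))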

If you're going to go that far, why not just implement autonomous (mini) buses? If you do not control it, it's not yours.

A common thing that people claim is that this will reduce accidents caused by drunk drivers... I would trust my son or daughter to drive more than I would trust myself to drive drunk. And I would trust an autonomous car to drive us better than either of us.

    Now what really concerns me is what if the computer of a driverless car is under the influence of ethanol? :P

  • Out of curiosity, when you pack a bunch of these on the road, will their laser systems ever confuse one another? How much power are they using anyway? I have a hard time thinking they can get away with just a few mW. Any fear of blinding pedestrians?

    I have just so many low level questions.

  • Absolutely not! (Score:4, Insightful)

    by kheldan ( 1460303 ) on Monday May 26, 2014 @07:58PM (#47096145) Journal
    "Should the Unlicensed Be Allowed To 'Drive' Autonomous Cars?" Hell, no! Not any more than non-pilots be allowed to operate aircraft! It'll be decades, if ever, that so-called 'autonomous' cars are actually reliable and tested enough to be trusted to have no qualified driver at the controls, and even then if I had anything to say about it that will still never happen. People should always be properly trained, tested, licensed, and checked periodically for competency if they are to operate any sort of motor vehicle. It's bad enough out on the roads as it is, the last thing we need are people who have no idea how to drive, or more to the point, what to do in an emergency situation.
  • by Snotnose ( 212196 ) on Monday May 26, 2014 @08:01PM (#47096163)
    1) If you need to take control it's probably going to be Right Effin Now!!!! If the car is driving itself what are the odds you're paying any attention to the road?
    2) If the car has been driving you around for a couple years with no intervention from you, how good a driver do you think you'll be in an emergency?
    • by Imrik ( 148191 )

      What if it isn't an emergency, just a situation the autopilot can't handle, road construction and accidents come to mind.

  • by romanval ( 556418 ) on Monday May 26, 2014 @08:02PM (#47096175)

    In the next few decades there'll be plenty of elderly that need to get around: They're a huge part of active society, yet for simple physical reasons (eye-sight or limb coordination issues) many of them can no longer drive, and a lot of them are homeowners that live in the suburbs, far away from public transportation. I'd say that's a much bigger market, especially in the next 30 or 40 years.

    • I agree with this completely. I know the impact of not being able to drive on the elderly from how it affected my parents. Now that I've turned 60 I can see how it is likely to affect my wife and myself in ten years or so. I am certainly hoping that easier to drive and ultimately autonomous vehicles are going to lead to improved quality of life in the future.

  • I'm a parent of four precocious kids in a small college town in the mountains of NC.

We have taxpayer-funded public transportation here. Kids 12 and up are allowed to ride the bus alone (to go to the library, etc.). Would I let my 7-year-old if he were allowed? No. Would I let him go with his 12-year-old brother, who has a way to stay in touch with me? Probably.

    I don't think the issue here is automotive safety. A fully-automated car should be safe enough for kids to ride in by themselves, or it shouldn't be on the road. I think the bigger concern is, when is it okay to let your kids out in public without supervision? 72% of the people who said flat out "no" did so because they have the impression that parents should be attached to their children at the hip, or because there was no option for, say, 15 and up. Maybe kids should be able to earn the freedom of being out without their parents with good grades above a certain age, etc. The survey sucked. There should have been an option for unlicensed adolescents but not younger children, etc. Parental consent and discretion should be part of the equation as well. We're the ones responsible for our kids, and with that responsibility should come some discretion on our part.

    On a side note, I think autonomous cars will reduce the need for us to go out for non-social things. I mean, aside from losing the ability to pick the best produce, I certainly wouldn't mind telling my car to make a run to the grocery store for me. For me, shopping is just time I'd rather spend with my family.

  • I was thinking the same thing about the blind and the blind drunk, but the problem is at the start and end points. The car may not know how to get out of a parking garage (scan for exit signs?), and it probably won't know how to find a parking spot in congested metro areas (heuristic search?) so at some point you're going to have to take over.

I like how, when unlicensed people are mentioned, it's automatically assumed they're children. There are adults who don't have a driver's license, whether by choice or not. If the car is fully autonomous, then I would hope that unlicensed people could use it.
  • When I was about 12, I would have been a fine driver, but not all kids 12 years old have the capacity to do it, so they picked 16.

At first, cars will require manual override. Maybe 20-30 years in, when the manual override is no longer needed, we can talk about younger kids using them, but at first, due to the manual override, kids should have a driver's license.
  • Was the year 2000 so long ago?

    http://youtu.be/GYSfncB4peU?t=... [youtu.be]

  • I know autonomous cars will be "oh so safe". At the moment I'm just as worried about what these things will make people do to people.

[OPENING OVERTURE [youtube.com]]

Your driver liability insurance policy has come up for review. We have recently been acquired by AAAA, the quadruple-A company -- the "Autonomous AAA of the future" -- and what that means for you as a member is: it has never been easier to upgrade to an a-car! Financing is available! [link] Due to increasing pressure in the political, legal and underwriters' arenas, we regret to inform you that the cost of your driver policy will be rising this quarter in order to begin collection of fees for the Federal National Driver Insurance Pool, and rising at a steady rate thereafter. It will continue to rise over time despite your [good to excellent] driving record. Now that the Autonomous Vehicle Safety Act is law, and blanket liability accident investigation procedures have been approved by Congress, the legal liability of autonomous vehicles is capped nationwide. While this grants the manufacturers freedom from risk of direct criminal penalty and potentially unlimited civil liability, it places human drivers in a difficult position. Most a-car accidents will, of historical necessity rather than actual circumstances, be "no-fault". Since human drivers and any victims claiming injury from them are still obliged to use traditional law enforcement and legal means of redress -- and the cost of these continues to rise -- underwriters are pressuring insurance companies to drop human drivers altogether. We do not intend to do this, but we can no longer provide policies for extended periods. Your new maximum policy period is now [one month]. Thank you for insuring with AAAA.

    [INTERMISSION [youtube.com]]

    Meanwhile...

Dear editor: DRIVERS cause accidents. A-CARS prevent them. That's what the billboard says -- and if the Howard County Referendum passes this September, manually operated cars will soon be a thing of the past here. What started as a discussion at a hearing after last year's tragic accident grew into a full-blown heated debate, and to think it all started with the parents who provide their children with a-cars pinning the blame squarely on other people's children. But then, after co-opting the national campaign with its slick literature and canned answers for everything -- NOW the fault is with human drivers themselves. And then in an astounding feat of lunacy they claim that it's only fair to place the blame on everybody. Not just the drunk, the aged or infirm, the inexperienced, the distracted or the just plain stupid. But no one's stupid in their book, we're just behind the times is all. They are like the drum majorettes of some utopian humanist parade. I say, SAVE US from these rich hippies, their weird toys and their broken ideals. Now I know a lot of these people, even like some of 'em, but aside from this national 'sideline the humans' campaign they've pushed at us (and WHO is paying for those TV spots, I wonder), let's not forget that this debate started around kids. Kids who need to learn to drive as surely as they need to learn to push a pen and spell their name. It's like swimming: who would discourage their own children from practicing swimming, to become expert swimmers, because water is dangerous?? Every kid will need to drive some day, or suffer harm or hardship by not knowing how. These a-car parents even forbid their kids from riding in cars being driven by folks they've grown up with, trusted for years. At the parent conferences we even sit on opposite sides of the table, we can barely be civilized even, because this crap has gotten so deep. Well I say they are making a big mistake and don't seem to get it. It's not just that everyone who cannot afford these a-toys will be walking or begging rides on a-buses or buses wi

  • by roc97007 ( 608802 ) on Tuesday May 27, 2014 @12:08AM (#47097271) Journal

    Ya know, they're either autonomous or they're not. If they're truly autonomous, I should be able to train my dog to get inside and hit the "home" button and it should be just as legal and appropriate as if it were an elevator. If they still need an adult behind the wheel, they're not what I would call autonomous.
