Transportation AI

Kids With Wheels: Should the Unlicensed Be Allowed To 'Drive' Autonomous Cars? 437

Hallie Siegel (2973169) writes "From the Open Roboethics Research Initiative: Earlier this month, when we asked people for their general thoughts on autonomous cars, we found that one of the main perceived advantages of autonomous cars is that those who are not licensed to drive will be able to get places more conveniently. Some results from our reader poll: about half of the participants (52%) said that children under the legal driving age should not be able to ride driverless cars, 38% believe that children should be able to ride driverless cars alone, and the remaining 10% think that children should be able to ride autonomous cars only given proven technology and specific training."
  • no (Score:2, Insightful)

    by atomicthumbs ( 824207 ) <atomicthumbs @ g m> on Monday May 26, 2014 @07:32PM (#47095723) Homepage
    no. the idea of an autonomous vehicle with no possible driver to override it is just plain stupid.
  • Well, of course. (Score:5, Insightful)

    by CrimsonAvenger ( 580665 ) on Monday May 26, 2014 @07:34PM (#47095727)

    It's not like the guy sitting in the seat is the actual "driver" of an autonomous car.

    And it's not like anyone is actually required to sit in that seat.

    Note that if an "autonomous car" requires someone to sit in the driver's seat and pay attention, you might as well not bother making it autonomous. If I have to pay as much attention as I would as the real driver, I might as well drive it myself, since the act of driving at least helps me keep my attention on the traffic.

  • Trains? (Score:5, Insightful)

    by Mr D from 63 ( 3395377 ) on Monday May 26, 2014 @07:35PM (#47095737)
    Should kids be allowed to ride trains/metros all by themselves? Same answer.
  • Re:no (Score:5, Insightful)

    by Rich0 ( 548339 ) on Monday May 26, 2014 @07:37PM (#47095747) Homepage

    no. the idea of an autonomous vehicle with no possible driver to override it is just plain stupid.

    The idea of a manually-operated vehicle with no possibility of a more accurate automated system overriding it is just plain stupid.

    It all comes down to risk. Obviously today autonomous vehicles aren't ready to take over completely. However, they will steadily improve, and it seems unlikely that human drivers will improve at all. At some point the risk of a computer causing an accident will drop below the risk of a person causing one, and at that point it becomes safer to just not let people interfere with the operation of the vehicle.

    Would you consider it wise to give passengers in an airliner the ability to take over in case the pilot makes a mistake? Such a feature is far more likely to cause a disaster than avert one. Once cars get to the point where they can be operated more safely than aircraft (which are already safer than cars are today) then taking control of a car in a crisis will just be getting in the way of the proven driver: the machine.

  • Re:no (Score:5, Insightful)

    by aristotle-dude ( 626586 ) on Monday May 26, 2014 @07:37PM (#47095751)

    no. the idea of an autonomous vehicle with no possible driver to override it is just plain stupid.

    The idea of requiring a driver's license to "ride" in an autonomous car is stupid. What's the point if you need to be able to drive?

  • Re:Trains? (Score:5, Insightful)

    by aristotle-dude ( 626586 ) on Monday May 26, 2014 @07:39PM (#47095755)

    Should kids be allowed to ride trains/metros all by themselves? Same answer.

    Trains in Vancouver are driverless and have been that way since their introduction in 1986. Oh the humanity.

  • Re:no (Score:5, Insightful)

    by ZahrGnosis ( 66741 ) on Monday May 26, 2014 @07:39PM (#47095757) Homepage

    Agreed. If there WERE fully autonomous vehicles (like computer controlled trams in airports are now), it shouldn't matter who drives them. If we get to the point where we trust automobiles to be completely devoid of manual control and override then what difference does it make who's inside?

    Until then, no... as long as there are controls or overrides through which someone can cause dangerous scenarios, you should need a license. Maybe we can have a different conversation about an "emergency stop" or changing destinations or minor route corrections, but the way the cars are built now allows for pretty complete driving responsibilities, and they should require similar if not identical rules for the drivers.

  • by quantaman ( 517394 ) on Monday May 26, 2014 @07:48PM (#47095803)

    Is there a scenario in which the unlicensed will be required to operate the vehicle themselves?

    If yes, the unlicensed can't drive.

    If no, they can.

    A partially autonomous car that requires occasional driving obviously can't be used by an unlicensed user.

    For a fully autonomous car there should never be a necessity to drive, since an autopilot failure will require a graceful breakdown mode regardless. Even if there's a manual drive mode, an unlicensed user won't be allowed to use it, and the car will essentially be broken down on the road.

    The only time it comes up is with a partially autonomous car requiring occasional non-driving guidance. Then it's simply a question of whether you design an alternate certification process for the unlicensed and it really depends on the degree of user interaction required.
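    The branching described above can be sketched as a toy decision helper. This is only an illustration of the comment's logic; every name here (the `Autonomy` levels, `may_ride_alone`, `guidance_certified`) is hypothetical, not part of any real system:

    ```python
    from enum import Enum, auto

    class Autonomy(Enum):
        PARTIAL_DRIVING = auto()   # occasional real driving may be required
        PARTIAL_GUIDANCE = auto()  # occasional non-driving guidance required
        FULL = auto()              # never requires a human to drive

    def may_ride_alone(autonomy: Autonomy, licensed: bool,
                       guidance_certified: bool = False) -> bool:
        """Toy rule: the unlicensed ride alone unless the car may demand driving."""
        if autonomy is Autonomy.PARTIAL_DRIVING:
            return licensed                       # might have to take the wheel
        if autonomy is Autonomy.PARTIAL_GUIDANCE:
            return licensed or guidance_certified # alternate certification path
        return True                               # fully autonomous: anyone may ride
    ```

    The interesting case is the middle one: whether `guidance_certified` exists at all is exactly the alternate-certification question the comment raises.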

  • by russotto ( 537200 ) on Monday May 26, 2014 @07:50PM (#47095813) Journal

    For every Frankenstein that pre-emptive handwringing stops, you'll kill a million improvements which would have made the world a better place.

  • Re:no (Score:5, Insightful)

    by sl149q ( 1537343 ) on Monday May 26, 2014 @07:51PM (#47095815)

    Exactly. By the time this question is germane it will be the equivalent of "would you let your kid ride in a taxi without you?".

    The long-term direction is a far safer driving experience based solely on removing human drivers from all cars. Allowing them to "override" the automated systems just makes those cars far more dangerous than cars today, where at least the norm is drivers who are somewhat used to driving. Letting people who rarely if ever drive override the system is just a disaster waiting to happen.

  • by kenwd0elq ( 985465 ) <> on Monday May 26, 2014 @08:00PM (#47095853)

    The government would jail you for leaving your child at home alone. So even if your autonomous vehicle is as safe as being at home, the government would likewise prevent children from riding alone in such a vehicle. Perhaps the child could be allowed to ride alone only if a parent or guardian programmed the destination....

    Or perhaps we need to go back to the 1970's and allow children as much freedom and autonomy as I had when I was eight or ten, when my mother would tell me "Go out and play, and be back before dark."

  • Re:no (Score:4, Insightful)

    by rolfwind ( 528248 ) on Monday May 26, 2014 @08:08PM (#47095885)

    I think the concern is twofold.

    As of yet, autonomous vehicles are unproven. It would be nice to have a driver at the wheel just in case. This might not be for emergencies, since a person would be reading or whatever, and it's dangerous to hand him the wheel unprepared and unaware. But we can presume that the computer might just get confused (let's say at a construction site), come to a stop, and say "Please, human, guide me here until I can take over again." That's legitimate, because the first generation of autonomous vehicles is certainly not going to be perfect.
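    The stop-then-ask hand-back described above can be sketched as a tiny state machine. This is purely illustrative; the states and the `next_mode` policy are made up for this sketch, and the key assumption (per the comment) is that control is never dumped onto a human while the car is still moving:

    ```python
    from enum import Enum, auto

    class Mode(Enum):
        AUTONOMOUS = auto()
        SAFE_STOP = auto()     # pulled over safely, waiting for a human
        HUMAN_GUIDED = auto()

    def next_mode(mode: Mode, confused: bool,
                  human_ready: bool, scene_clear: bool) -> Mode:
        """Toy hand-back policy: stop first, then ask the human for help."""
        if mode is Mode.AUTONOMOUS and confused:
            return Mode.SAFE_STOP        # never hand over at speed
        if mode is Mode.SAFE_STOP and human_ready:
            return Mode.HUMAN_GUIDED     # human guides through the confusion
        if mode is Mode.HUMAN_GUIDED and scene_clear:
            return Mode.AUTONOMOUS       # computer takes over again
        return mode
    ```

    The design point is the forced `SAFE_STOP` intermediate state: there is no transition from `AUTONOMOUS` directly to `HUMAN_GUIDED`, which encodes "don't give an unprepared reader the wheel."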

    Second, we don't want kids having free access to autonomous vehicles. 10-year-old Johnny is riding in a car with no parents and just cannot resist the urge to take over the controls. 9-year-old Amanda just met a really cool adult online who promises her all the toys she wants if she goes to this one address.

    So maybe not a driver's license, since blind people should have access to this technology after the bugs are worked out, but there should be some regulation.

  • by Sasayaki ( 1096761 ) on Monday May 26, 2014 @08:16PM (#47095915)

    The reimagined Battlestar Galactica copped a lot of (somewhat) deserved flak for its filler episodes, but my favourite episode of the entire series is also one of the more blatant filler episodes ("Scar").

    In particular, I loved the scene where it is revealed that Cylon raider-ships also reincarnate, just as their fleshy biological counterparts do. Sharon even spells it out for the characters.

    Starbuck: Raiders reincarnate?
    Sharon: Makes sense, doesn't it? It takes months for you to train a nugget into an effective Viper pilot. And then they get killed and then you lose your experience, their knowledge, their skill sets. It's gone forever. So, if you could bring them back and put them in a brand new body, wouldn't you do it? 'Cause death then becomes a learning experience.

    This is why, I believe, the future will eventually belong to automated drivers. The initial ones are already very good, but there will be holes. There will be headlines like "automated car drives headlong into school, killing 10 of the world's cutest orphans". Human drivers have similar issues and events like that are almost everyday occurrences all around the world. The problem is, as Sharon pointed out, when those drivers die their experience is lost. With an automated system, the skill set improves. Someone discovers that, for example, hey, if a drunk passenger opens the door to a self-driving car at low speed and falls out the system doesn't realise they're gone and blindly drives away.

    So the system improves. The car's internal systems track passengers, and if one exits the car, the vehicle will double back and pick them up. Or contact emergency services if the speed is high enough, and form a roadblock so that this person isn't hit again. Or simply lock the doors to begin with. Or any number of more sane actions. The point is: the accident becomes a learning experience. With a human driver, we spend months training people to become drivers. Then one day they make a stupid mistake -- one other drivers have learnt to avoid, but not this driver -- and become a red smear. Their skill set, their experience and training, is lost.

    With automated systems, every mistake is an opportunity to grow. I personally believe that automated driving systems are already better than humans, but this massive evolutionary benefit (directly learning from the mistakes of other drivers as though they were one's own) ensures that they will continue to improve, whereas human lifespans are finite and so ours will not.

  • by immaterial ( 1520413 ) on Monday May 26, 2014 @08:24PM (#47095967)
    How old does a kid have to be before they can walk to school on their own? How would it be any different in an autonomous car? Leave it up to the parents to determine the independence/maturity level of their own children.
  • by szemeredy ( 672540 ) on Monday May 26, 2014 @08:29PM (#47095999) Homepage

    There are three broad topics that I feel need to be addressed before allowing minors to ride around unaccompanied in automated vehicles:

    Liability: Who is responsible for the safety of an unaccompanied minor in the event of an accident or vehicle malfunction, especially if the vehicle is a long distance from home? More importantly, who will be willing to accept that kind of liability and at what cost?

    Capacity: Is there enough room on our roads and in our parking lots to accommodate children riding around in their own personal vehicles? Will the efficiencies of automated vehicle traffic be enough to overcome an overall increase in vehicle traffic? How much will associated expansion projects cost? Can we afford to pay for them?

    Energy: Can we afford the increase in energy consumption associated with increasing vehicle traffic at a time when the capacity of available energy reserves is questionable and energy policy is all over the place?

  • Re:no (Score:5, Insightful)

    by immaterial ( 1520413 ) on Monday May 26, 2014 @08:32PM (#47096011)
    We already allow kids alone on the streets on foot and on bicycles at parental discretion. As you say, a proper automated vehicle will be safer than a car piloted by an adult human, so it will be far, far safer than a bicycle piloted by a child. I don't see how there's even a question.
  • Absolutely not! (Score:4, Insightful)

    by kheldan ( 1460303 ) on Monday May 26, 2014 @08:58PM (#47096145) Journal
    "Should the Unlicensed Be Allowed To 'Drive' Autonomous Cars?" Hell, no! No more than non-pilots should be allowed to operate aircraft! It'll be decades, if ever, before so-called 'autonomous' cars are actually reliable and tested enough to be trusted with no qualified driver at the controls, and even then, if I had anything to say about it, that would still never happen. People should always be properly trained, tested, licensed, and checked periodically for competency if they are to operate any sort of motor vehicle. It's bad enough out on the roads as it is; the last thing we need is people who have no idea how to drive, or more to the point, what to do in an emergency situation.
  • by romanval ( 556418 ) on Monday May 26, 2014 @09:02PM (#47096175)

    In the next few decades there'll be plenty of elderly people who need to get around: they're a huge part of active society, yet for simple physical reasons (eyesight or limb-coordination issues) many of them can no longer drive, and a lot of them are homeowners who live in the suburbs, far from public transportation. I'd say that's a much bigger market, especially over the next 30 or 40 years.

  • I'm a parent of four precocious kids in a small college town in the mountains of NC.

    We have taxpayer-funded public transportation here. Kids 12 and up are allowed to ride the bus alone (to go to the library, etc.). Would I let my 7-year-old if he were allowed? No. Would I let him go with his 12-year-old brother, who has a way to stay in touch with me? Probably.

    I don't think the issue here is automotive safety. A fully automated car should be safe enough for kids to ride in by themselves, or it shouldn't be on the road. I think the bigger concern is: when is it okay to let your kids out in public without supervision? 72% of the people who said flat-out "no" did so because they have the impression that parents should be attached to their children at the hip, or because there was no option for, say, 15 and up. Maybe kids above a certain age should be able to earn the freedom of being out without their parents with good grades, etc. The survey sucked; there should have been an option for unlicensed adolescents but not younger children. Parental consent and discretion should be part of the equation as well. We're the ones responsible for our kids, and with that responsibility should come some discretion on our part.

    On a side note, I think autonomous cars will reduce the need for us to go out for non-social things. I mean, aside from losing the ability to pick the best produce, I certainly wouldn't mind telling my car to make a run to the grocery store for me. For me, shopping is just time I'd rather spend with my family.

  • Re:no (Score:4, Insightful)

    by immaterial ( 1520413 ) on Monday May 26, 2014 @09:18PM (#47096279)
    This is absurd. There is no 'fixing' the human. Driving was already incredibly risky before cellphones (humans are 'proven' to drive terribly, I mean really? Google's automated cars already have a far better record than the average human and the technology is still in its infancy). Humans do not have 360 degree vision or the mental capacity to focus specific attention on dozens of details and separately moving trajectories simultaneously - even if they ARE paying 100% of their attention to the road (which is obviously grossly optimistic).

    What if the computer can't tell the difference between a bag and a rock? Then it assumes the highest-risk possibility and takes the appropriate mitigation action with reflexes so quick that it has probably begun before the meatbag in the car even takes note of the bag.

    What happens when the perfect driver is checking his side view or rear view mirror right at the moment the rock rolls into the lane in front of him?

    Automated systems are never going to be perfect, but I see no reason they can't be far, far more safe than a system guided by a human being.
  • Re:no (Score:5, Insightful)

    by Rich0 ( 548339 ) on Monday May 26, 2014 @10:03PM (#47096465) Homepage

    So, just so I am clear... When the autonomous vehicle runs someone over because it failed to "see" the person, the CEO of the company making the vehicle as well as the developers go to jail for manslaughter, right? Then I'm fine with it.

    Sure, as long as when a human runs somebody over we send their parent and every driving instructor they ever had to jail for manslaughter as well.

    A CEO who comes up with a car that saves the lives of the 40k people who die every year in the US from car accidents and then fails to save the life of a few odd people is a hero in my book.

  • Re:no (Score:5, Insightful)

    by ultranova ( 717540 ) on Monday May 26, 2014 @10:22PM (#47096537)

    The computer is clueless about compound cause-effect situations where prevention is better than reaction.

    So are humans. Every single workplace safety program starts and ends with "stop and think about what you're about to do before doing it". Our higher functions operate at the timespan of minutes, not fractions of a second. This is also the reason we have traffic laws: they turn driving from an activity requiring judgement into a mechanical exercise. When that fails, accidents follow.

    You have it backwards. If the automated system can handle 90% of the situational dynamics of driving, you want the human to be able to override it when it's clearly about to get something wrong.

    Apart from negating the entire reason one might want an automated vehicle in the first place, it's also not physically possible to stay alert and pay attention to a system - in this case traffic - that you aren't actively participating in. This means that you have no idea when the computer is about to do something wrong, much less what to do about it.

    computers are faster, but humans have far better intellectual contextual awareness.

    Humans have next to no intellectual contextual awareness in realtime situations. Various levels of automation drive your body, most social situations, and even activities usually considered intellectual, like math or programming. "Intellectual contextual awareness" is what you use to pick a career, and often not even then.

    (is that a plastic bag or a big rock in the road?)

    Is there a rock beneath the bag? You can't know. You can, however, guess there isn't and adjust your estimates about any future bags containing rocks should this one be harmless. That happens all the time, and is one of the numerous ways in which human rationality tends to break down.

    and have an interest in survival, which makes up for their less consistent behavior.

    No, it doesn't. Your survival instinct manifests as a bunch of reflexes, which do little to help (shielding yourself with your arms) or even need to be worked around (ABS brakes). It doesn't stop people from speeding or ignoring the road in favour of their cellphone, whereas a computer that's told to obey the speed limit will obey the speed limit.

    'Safe' to you might be a computer controlled everything robot that makes gross, heuristic assumptions about the reality around it.. 'Safe' to me is a mechanically cabled accelerator, spring-loaded to drop to idle RPM if it breaks,

    This being a good example of a gross, heuristic assumption. Your "safe" accelerator can be defeated by a cable jam, metal fatigue in the spring, or even a simple bit of sticky dirt on the cabin floor.

    and a human with superior situational awareness capability who cares about his survival behind the wheel.

    And this calling for something that doesn't exist.

    Lets fix the human rather than hobble and distract him with more uselessly complex machines.

    And how would you go about "fixing" humans?

  • Re:no (Score:4, Insightful)

    by TWX ( 665546 ) on Monday May 26, 2014 @11:53PM (#47096979)
    Road construction.

    Potholes filled with brackish water.

    Diverting around a dangerous situation like a fire, downed power lines, or a police response.

    Following detours.

    Following police or other responders' manual directions.

    Intentionally blocking the road to other traffic (i.e., to protect an injured bystander lying in the road).

    Even with those reasons to allow for manual operation by a licensed driver, I would still allow license-less occupants to use an automated car in the same fashion as a sedan service, assuming certain conditions are met. Those conditions could be legal restrictions requiring a combination of age and owner consent, or a form of state-issued ID that allows the occupant to state destinations for the car. There could also be a class of operator for those who used to be licensed drivers but are no longer generally capable of driving at speed on normal roadways; they could perhaps manually operate the vehicle in a limited capacity in an emergency or when automated operation is not possible, with restrictions on speed and with automated assistance to supplement the operator's own limitations.

    States have a form of state-issued ID that is issued when the individual either does not qualify for a driver's license or does not want one. States also have multiple classes of vehicle-operator permitting, ranging from permits for minors and new drivers with time-of-use restrictions or caps on the number of passengers, to motorcycle licenses, to higher-GVWR/GCWR or special-purpose licensing for things like hazardous chemicals or high occupancy. It would not be unreasonable to add a new kind of endorsement for those considered old enough to instruct autonomous vehicles what to do, with a combination of a minimum age (something like twelve years) and parental consent.

    There would still need to be some means for the car to either abort a trip if road conditions couldn't be handled in an automated fashion, or to allow the occupant to provide additional direction in some situations. There would also need to be a way for the car to reject destinations, or to restrict travel to only certain destinations based on parental or owner input, and for the vehicle to enforce limits on the number of occupants and handle behavioral issues like failure to wear seatbelts or to remain seated. Those checks could be as simple as measuring the extended length of the seatbelt (i.e., calibrated to a minimum length when buckled, so one can't buckle it first and then sit on it) and knowing whether the seat is occupied when the trip starts, then aborting the trip if the occupant gets off of the seat.
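    Those two checks could be sketched as a couple of predicates. This is only a sketch of the idea in the comment; the function names and the 60 cm threshold are invented for illustration, and a real system would calibrate the belt-length threshold per seat:

    ```python
    def seatbelt_properly_worn(extended_cm: float, min_worn_cm: float = 60.0) -> bool:
        # A belt buckled behind an occupant (or sat on) extends less webbing
        # than one worn normally, so a calibrated minimum length catches it.
        return extended_cm >= min_worn_cm

    def should_abort_trip(seat_occupied_at_start: bool, seat_occupied_now: bool,
                          belt_extended_cm: float) -> bool:
        # Abort if a seat that was occupied at departure is now empty,
        # or if its belt is no longer being worn properly.
        return seat_occupied_at_start and (
            not seat_occupied_now
            or not seatbelt_properly_worn(belt_extended_cm))
    ```

    The belt-length trick is the interesting part: it needs no extra sensors beyond knowing how much webbing is out, which is why the comment calls it "as simple as detecting the length of the seatbelt."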

    My in-laws could benefit greatly from an autonomous vehicle. They stopped driving due to vision problems and now have to rely on a dial-a-ride service. I'm sure that if an autonomous vehicle existed and was within their means that they'd buy it so long as it could convey groceries and other small to medium sized parcels in addition to at least two occupants.

    I could see families with adolescent children benefiting. Even in two-parent families, it can be difficult when more than one child has an activity to attend while the parents still want to cook dinner or handle other responsibilities, so I could see a parent using an autonomous vehicle to help pick up children from events like that.

    If they can make the cars function completely driverlessly then I don't see any reason why they can't make them function with occupants that can't operate them manually.
