AI Google Transportation

Why Self-Driving Cars Should Never Be Fully Autonomous (roboticstrends.com) 397

An anonymous reader writes: David Mindell, an MIT professor, says self-driving cars should never be fully autonomous. "There's an idea that progress in robotics leads to full autonomy. That may be a valuable idea to guide research but when automated and autonomous systems get into the real world, that's not the direction they head. We need to rethink the notion of progress, not as progress toward full autonomy, but as progress toward trusted, transparent, reliable, safe autonomy that is fully interactive: The car does what I want it to do, and only when I want it to do it." Mindell writes, "Google's utopian autonomy is a more brittle, less functional solution than a rich, human-centered automation."
  • by hawguy ( 1600213 ) on Tuesday October 13, 2015 @01:32PM (#50718967)

    safe autonomy that is fully interactive: The car does what I want it to do, and only when I want it to do it."

    Sure, if I own the car it should do only what I want it to when I want it to, but why should I own a car at all? I use a car only a few times a month, driving maybe 5000 miles/year total. Why should I spend $30,000 on a depreciating asset and devote 200 sq ft of space towards housing it?

    I want to call a car and have it come when I want it, take me where I want to go, then go away until I need it again.

    • Re: (Score:3, Insightful)

      by taustin ( 171655 )

      It's called a "rental car," and it's not a new concept.

      And like rental cars today, you'll have the exact same responsibilities as you do with your own car. If car owners are responsible for making sure their own cars operate safely, you'll be responsible for making certain the rental does.

      Perhaps what you really want is a "taxi," which has a trained, professional driver who knowingly accepts that responsibility (and control) for himself. Or maybe an Uber driver who may or may not know his ass from the glove

      • > you'll be responsible for making certain the rental does.

        Unless it's a Volvo as they've already come out and said they are going to accept responsibility and liability for their autonomous vehicles when they launch them. So have fun paying $60/day for your "rental" and we'll have fun paying $200/month to our Volvo auto-car cooperative.

        • by taustin ( 171655 )

          What Volvo says in their press release and what the law says are often completely unrelated to one another. (And they're talking about a very limited liability.)

      • by Guspaz ( 556486 ) on Tuesday October 13, 2015 @01:56PM (#50719179)

        No, what I want is an automated taxi. In the case of autonomous cars, the car manufacturer will accept responsibility for any accident caused by the autonomous car, and the insurance of the automated taxi company will take care of the rest.

      • When autonomous vehicles are less likely to kill me than ones with human drivers, I will prefer the safer driver that also doesn't require a salary.
        • by taustin ( 171655 )

          That's the point: Will anyone alive today live long enough to see autonomous vehicles that are better drivers than people? The technology is nowhere near existing yet.

          • by lgw ( 121541 )

            That's not a very high bar. I expect we'll be there software-wise in 10 years or so (with cars following a few years later) in well-mapped city areas. Rural America is a whole different topic.

          • by amRadioHed ( 463061 ) on Tuesday October 13, 2015 @03:48PM (#50720145)

            We're probably already there; in general, human drivers suck.

            The current problem with Google's cars is not safety; it's the limited range in which they can operate. Over time the range will increase without needing to make compromises on safety.

            • by rockmuelle ( 575982 ) on Tuesday October 13, 2015 @04:45PM (#50720637)

              "We're probably already there, in general human drivers suck."

              Data, please? People make this claim all the time, but given that there are over a billion trips a day in the US and only around 120 fatalities, I'd say human drivers pretty much have this thing down. The fact that people can make it around in their cars in myriad weather conditions, successfully navigate unfamiliar terrain, and quickly respond to sudden changes in circumstances (kid darting out in front of them) speaks volumes to how good human drivers are.

              I watched a Google self-driving car cross an intersection this weekend (in Austin). It was moving very cautiously and then slowed down to a walking pace on the other side of the intersection, leaving a trail of human-driven cars stuck in the intersection while it decided to turn down a side street.

              The "human drivers suck" crowd sounds very much like the "there's a thug with a gun around every corner" crowd. Some people seem to enjoy thinking the world is more dangerous than it really is.

              -Chris

              Some sources:
              https://www.rita.dot.gov/bts/s... [dot.gov]
              http://www.nytimes.com/2004/03... [nytimes.com]

              • by radish ( 98371 ) on Tuesday October 13, 2015 @06:17PM (#50721559) Homepage

                Data, please? People make this claim all the time, but given that there are over a billion trips a day in the US and only around 120 fatalities, I'd say human drivers pretty much have this thing down. The fact that people can make it around in their cars in myriad weather conditions, successfully navigate unfamiliar terrain, and quickly respond to sudden changes in circumstances (kid darting out in front of them) speaks volumes to how good human drivers are.

                So I'm going to try. Putting aside fatalities, as the Google cars have not been involved in any, there were approx 5.5m traffic accidents in the US in 2010 [wikipedia.org]. Taking your number of 1b trips per day, we get a figure of ~66k miles per accident. According to Google [medium.com] they have been involved in 11 accidents over 1.7 million miles which is ~154k miles/accident. Now this is a combination of fully automatic and driver assisted miles, so the comparison isn't exact, but it's pretty safe to assume the computer is at least as good as your average driver. And maybe twice as good.
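
                Here's that comparison as a quick back-of-the-envelope Python sketch, using the numbers above (caveat: my "human" figure divides annual trips, not miles, by accidents, since the ~1 billion/day number counts trips, so treat it as a rough proxy only):

                    # Back-of-the-envelope sketch of the accident-rate comparison above.
                    human_accidents_per_year = 5.5e6        # approx. US traffic accidents, 2010
                    human_trips_per_year = 1e9 * 365        # ~1 billion trips per day
                    human_trips_per_accident = human_trips_per_year / human_accidents_per_year

                    google_miles = 1.7e6                    # autonomous + driver-assisted miles
                    google_accidents = 11
                    google_miles_per_accident = google_miles / google_accidents

                    print(f"Human drivers: ~{human_trips_per_accident:,.0f} trips per accident")   # ~66,000
                    print(f"Google cars:   ~{google_miles_per_accident:,.0f} miles per accident")  # ~155,000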

                I watched a Google self-driving car cross an intersection this weekend (in Austin). It was moving very cautiously and then slowed down to a walking pace on the other side of the intersection, leaving a trail of human-driven cars stuck in the intersection while it decided to turn down a side street.

                That may be evidence of it driving badly. Or it may be evidence of it driving well, because it was responding to a potential danger that the human drivers didn't see or didn't care about. Remember - if we're saying we want the computers to drive better than humans we have to accept that they will at times drive differently.

              • by Maxo-Texas ( 864189 ) on Wednesday October 14, 2015 @04:40AM (#50724119)

                It sounds cautious.

                Now let's review my friend's experience with human drivers tonight.

                Tailgating. Holding down the horn.

                The lanes to the left and right of her were open.

                Finally, zooming around her on the left, cutting in front of her within a few feet, and braking hard. Which was stupid, because so many assholes have done that trick now that when someone passes me in anger, I'm already slowing down. She knows to do the same thing.

                A google car will NEVER do that.

                Let's review my accidents with humans.

                Rear ended from behind while sitting at a red light (30mph) (he was ticketed)
                Rear ended from behind while sitting at a red light (5mph) (minor damage- more from my bike rack to her grille).
                Sideswiped by an 18 wheeler that changed lanes into me without signaling (he was ticketed)
                Front ended when the truck in front of me put it into reverse at a red light and GUNNED backwards into me. (minor damage to my front bumper).
                Rear ended from behind while sitting at a red light (30+mph - car totaled) (she was ticketed)
                The person that rammed me in a parking lot when I wasn't even there and drove off without leaving info.

                How about my friend who was nearly killed recently?
                Rear ended from behind while sitting in the HOV lane with stopped traffic (50+ mph- evidence the human didn't even brake until he was 20' behind doing over 50mph).

                NONE of these accidents would have happened had a google car been driving.

                If nothing else- I'd like cars to start slowing down automatically when they detect a collision is about to occur. And prevent a driver from flooring it into another car ahead or behind them when both are stopped.

                I, for one, am looking forward to our autonomous car overlords.

          • Will anyone alive today live long enough to see autonomous vehicles that are better drivers than people?

            Autonomous vehicles already have a better safety record than people.

      • Perhaps what you really want is a "taxi," which has a trained, professional driver who knowingly accepts that responsibility (and control) for himself.

        Ha ha ha. Amongst the worst drivers on the road are Taxi drivers.

    • Actually the argument is bogus from the start. Even if you own the car, that doesn't mean it should be permitted to run over pedestrians just because you want it to.

    • I'm guessing that if you don't want to own a car, this self-driving "car" thing they are selling might not be right for you.

    • Sure, if I own the car it should do only what I want it to when I want it to, but why should I own a car at all? I use a car only a few times a month, driving maybe 5000 miles/year total.

      Let me guess, you live somewhere on the East Coast or Chicago? Or one of the few other places with public transportation? Out here in the rest of the country we tend to drive a LOT more. I routinely rack up 30-40,000 miles each year. Not because I love driving so much but because work is 20 miles each way and you cannot get anywhere else without driving there. Public transportation for all practical purposes doesn't exist where I live. The infrastructure and population density simply doesn't exist for car rentals to be economically viable and self-driving cars will not change that fact.

      • by Hylandr ( 813770 )

        > work is 20 miles each way

        If you're lucky.

      • The infrastructure and population density simply doesn't exist for car rentals to be economically viable and self-driving cars will not change that fact.

        I don't know where you live. But anywhere that's populated enough for a taxi service is populated enough for an autonomous taxi service. And because of the lack of a need to pay for drivers, plus the central planning of an Uber type system, many places that can't support a taxi service may be able to support autonomous taxis too.

        It may not be right for you. At least not now. But it may be for others.

    • Why should I spend $30,000 on a depreciating asset and devote 200 sq ft of space towards housing it?

      I don't know, why? You could buy a $1000 car and park it on the street.

    • by Solandri ( 704621 ) on Tuesday October 13, 2015 @05:04PM (#50720831)

      Sure, if I own the car it should do only what I want it to when I want it to, but why should I own a car at all? I use a car only a few times a month, driving maybe 5000 miles/year total. Why should I spend $30,000 on a depreciating asset and devote 200 sq ft of space towards housing it?

      I want to call a car and have it come when I want it, take me where I want to go, then go away until I need it again.

      This is purely a cost-benefit analysis, and what's true for you will not be true for others.

      If you purchase the car new for $30,000 (smart people don't, but we're talking the typical car buyer here), use it for 5 years, and sell it for $15,000, it depreciates an average of $3000/yr.

      The average car in the U.S. is driven 12,000 miles/yr. Gas for 12,000 miles/yr, at 25 MPG and $3/gal, works out to $1440/yr.

      Maintenance and insurance is around $2500/yr (mine and maybe yours is a lot less, but we're talking the typical, median driver here with an accident or two on his record).

      Total cost of ownership is then $6,940/yr, or $0.578/mi, which is almost exactly the IRS reimbursement figure of $0.575/mi, so we're on pretty solid footing here. If you use the car on 300 days out of the year, that's $23.13/day (quick arithmetic sketch at the end of this comment).

      Can you do everything you usually do in a typical day of driving the car for $23.13? If you live in a city with good public transportation, the answer is probably yes (ignoring the cost of time you have to wait for said public transportation). If you have to rely on taxis, the answer is probably no. And if you live outside the city the answer is almost certainly no.

      There's also the tragedy of the commons to worry about. I just got back from taking my dog for a walk at the beach, and there's wet sand all over the back seat. Do you really want to get the autonomous car right after I've used it to ferry my dog around? In a taxi, the driver's presence discourages you from abusing the shared asset. You won't get that with an autonomous car unless the car service puts cameras inside them that are always recording, which would count as a huge negative for a lot of people in the buy-vs-rent argument.
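
      Here's the cost arithmetic above as a small Python sketch, so you can plug in your own numbers (the figures and the 300-driving-days assumption are just the ones used above, not gospel):

          # Rough total-cost-of-ownership sketch using the figures above; tweak as needed.
          depreciation_per_year = (30_000 - 15_000) / 5               # buy new, sell after 5 years -> $3,000/yr
          miles_per_year = 12_000
          fuel_per_year = miles_per_year / 25 * 3.00                  # 25 MPG at $3/gal -> $1,440/yr
          maintenance_and_insurance = 2_500                           # typical driver, accident or two on record

          total_per_year = depreciation_per_year + fuel_per_year + maintenance_and_insurance
          print(f"Per year: ${total_per_year:,.0f}")                          # ~$6,940
          print(f"Per mile: ${total_per_year / miles_per_year:.3f}")          # ~$0.578
          print(f"Per day:  ${total_per_year / 300:.2f} (300 driving days)")  # ~$23.13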

  • Why does this guy care? What about private roads? Private cars and private property rights? Who gives a shit?

  • by Daetrin ( 576516 ) on Tuesday October 13, 2015 @01:35PM (#50718991)
    There are already systems that will warn you if you're drifting out of your lane, and systems that will warn you/apply brakes if you're in danger of collision. And of course systems that will plot a route for you and give you step-by-step directions to your destination have been around for quite a while at this point.

    If the goal isn't full autonomy then it doesn't really seem like we need to do much more research and development. How boring will it be to be "driving" a car that can do 99% of the driving by itself but insists on you paying attention (at least intermittently) to do the remaining 1%, instead of kicking back and enjoying your time doing something else?

    (And note that anything less than full automation will provide little benefit to the biggest commercial interest, long distance trucking. Having to pay a person to ride along and babysit the automation doesn't save anything over just making that person drive in the first place.)
    • by Luthair ( 847766 )
      I'd think that boredom would be the biggest problem: past a threshold of driver assist, the user begins to rely heavily on it and their attention to the task at hand (i.e. driving) decreases, at which point we need full automation.
      • This has been an issue for airline pilots. Auto-pilot is great at saving fuel but it makes the cruising portion of the flight so boring that pilots stop paying attention.

    • Having to pay a person to ride along and babysit the automation doesn't save anything over just making that person drive in the first place.)

      It saves on gas mileage and accidents at least, and that is big bucks in the trucking industry. You could likely also pay the "driver" much less as they might not need a CDL, though likely the unions would force the laws to lag behind the times to support their members.

  • by phantomfive ( 622387 ) on Tuesday October 13, 2015 @01:35PM (#50718993) Journal
    Now please relax, and watch as we show you this ad. A sedative will be provided shortly.
  • Not sure I agree (Score:5, Interesting)

    by ranton ( 36917 ) on Tuesday October 13, 2015 @01:36PM (#50718999)

    “The notion of ceding control of something as fundamental to life as driving to a big, opaque corporation - people are not comfortable with that,” -- David Mindell

    I'm not sure I agree with that. Sounds similar to someone 150 years ago saying "The notion of ceding control of something as fundamental to life as growing and hunting the food to feed my family to a big, opaque corporation - people are not comfortable with that".

    People get comfortable with a great number of things if you make their life significantly better even while asking them to give up a little control.

  • In a related story, the new toaster from MindellCo was announced yesterday. It's just like a regular toaster, except when you push down the lever, it asks you if you're sure before it starts toasting, and then asks again every ten seconds.
  • I'm not convinced. (Score:5, Insightful)

    by jeffb (2.718) ( 1189693 ) on Tuesday October 13, 2015 @01:44PM (#50719083)

    From the article:

    “[Full automation is] just proven to be a loser of an approach in a lot of other domains,” Mindell says. “I’m not arguing this from first principles. There are 40 years’ worth of examples.”

    For how many of those 40 years have today's sensors, computing hardware, and AI been available?

    It's possible that fully automated driving will turn out to be hard like commercial fusion power, or like commercial space travel. I think it's more likely, though, that it will turn out to be hard like speech recognition or cheap, lightweight flying drones -- each popularly regarded to be "a few years away" for decades, until suddenly it was here, courtesy of a few research advances and a great deal of exponential improvement in computer hardware.

    • “I’m not arguing this from first principles. There are 40 years’ worth of examples.”

      "...which I picked myself!"

      Yeah, come on, automation has proven to be a winner in many domains as well. And I suspect the author would take care of those by quibbling over the definition of "full."

  • Not Right Away (Score:5, Insightful)

    by Jason Levine ( 196982 ) on Tuesday October 13, 2015 @01:44PM (#50719089) Homepage

    I don't think we'll have self-driving cars right away. Instead, we'll have cars with "Enhanced Cruise Control." You get into a lane on a highway, hit Enhanced Cruise Control, and your car will stay in that lane (turning left or right as needed) keeping to the speed you set but slowing down if needed (e.g. if the car in front of you brakes). For long car trips, this would mean that a bulk of your trip would be automated. You'd still need a driver there to take control once you wanted to leave the highway and you might not be able to use this during bad weather (just like you wouldn't put cruise control on during a snowstorm), but it would be one step towards autonomous cars.

    As the software gets more refined and the edge cases are dealt with better, the car will be able to handle more driving situations. For example, "automatically stop at red lights" or "keep going straight unless the driver indicates otherwise." Eventually, cars driving themselves will be the norm and human drivers will be the exception.
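
    Just to illustrate the idea, here's a toy sketch of the kind of per-tick logic such an "Enhanced Cruise Control" might run. Everything here is made up and hugely simplified (names, thresholds, gains), not any real system's code:

        # Toy sketch of a hypothetical "Enhanced Cruise Control" decision loop.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Sensors:
            set_speed: float                  # driver-selected cruise speed (mph)
            current_speed: float              # current speed (mph)
            gap_to_lead_car: Optional[float]  # feet to car ahead, None if lane is clear
            lane_offset: float                # feet off lane center (+ = drifting right)

        MIN_SAFE_GAP_FT = 150.0

        def control_step(s: Sensors) -> dict:
            """One tick: hold the set speed, back off from a close lead car, recenter in lane."""
            if s.gap_to_lead_car is not None and s.gap_to_lead_car < MIN_SAFE_GAP_FT:
                target_speed = s.current_speed * 0.9   # too close: slow down gently
            else:
                target_speed = s.set_speed             # lane clear: resume set speed
            throttle = max(-1.0, min(1.0, (target_speed - s.current_speed) / 10.0))
            steering = -0.05 * s.lane_offset           # nudge back toward lane center
            return {"throttle": throttle, "steering": steering}

        # Example: car ahead is close, so the throttle command goes negative (coast/brake).
        print(control_step(Sensors(set_speed=65, current_speed=67, gap_to_lead_car=120, lane_offset=0.5)))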

    • As the software gets more refined and the edge cases are dealt with better, the car will be able to handle more driving situations. For example, "automatically stop at red lights"

      What could possibly go wrong when a driver gets into a different car than normal?

      or "keep going straight unless the driver indicates otherwise."

      Don't they sorta do that already? ;)

    • Don't we already have that now?
      • By "keep going straight", I meant more "stay on the road in the current lane" than "generally go straight ahead." The latter we have today (assuming your car is properly aligned and you don't veer onto the sidewalk). The former would be a step towards self-driving cars.

        • By "keep going straight", I meant more "stay on the road in the current lane" than "generally go straight ahead." The latter we have today (assuming your car is properly aligned and you don't veer onto the sidewalk). The former would be a step towards self-driving cars.

          There are several cars on the market that already do lane following. Have been for a few years now. So far they all try to require you to at least keep your hands on the wheel, in an effort to try to make you pay attention, but it turns out that at least some are pretty easy to fool: http://www.roadandtrack.com/ca... [roadandtrack.com]

          So, yes, we already have this, though we try not to.

  • by gurps_npc ( 621217 ) on Tuesday October 13, 2015 @01:46PM (#50719107) Homepage
    This statement indicates the real problem with the author's logic: "Its cars must identify all nearby objects correctly, need perfectly updated mapping systems, and must avoid all software glitches."

    Incorrect in principle and practice. It's like the angry bear vs. the two people. You don't have to be faster than the bear, just faster than the other human being. The cars don't need to be perfect - they just need to beat a human mind that is NOT an expert. If the car by itself can do better than a human without the AI, then it is sufficiently good to replace the current model, which is a human without the AI.

    The idea that we need to achieve the maximum possible result of human+AI ignores the current situation's inherent problems of poor drivers, the elderly, drunk drivers, children, etc. etc. etc.

    Parents of teenagers, children of the elderly, alcoholics and their loved ones ALL are VERY comfortable with the idea of having the car drive, not the person. They will provide the demand and market. Once their demand is met, then simple continuing research will eventually make EVERYBODY comfortable with letting the AI drive. If you are OK with the AI driving your teenager, your grandma, and your drunk cousin Joe, knowing they might be in the car next to you, then you will be OK with letting it drive you.

    His comparison of other modes of transportation such as space, submarine and airplane, is also flawed.

    The main reason we never automated those is that their need for accuracy was much much higher than we have for automobiles and up until recently, computers have not had the real ability to beat a trained, expert human. But in cars, they don't have to beat an expert, just a licensed and impaired human - drunk, young, elderly for example.

    A better comparison is to look at welding. Originally people welded. Then robots came along and were better. Automated welding has taken over a large proportion of welding, and we don't have humans overriding the robots. Why? Because the robots are better at it than humans in most cases.

    • by 0123456 ( 636235 )

      The main reason we never automated those is that their need for accuracy was much much higher than we have for automobiles and up until recently, computers have not had the real ability to beat a trained, expert human.

      Uh, automating aircraft is vastly simpler than automating cars, because nuns rarely run out into the middle of the road at 40,000 feet.

      We haven't done it yet, because then everyone dies when the sensors fail. Whereas, with current autopilots, the pilots only sometimes fly the plane into the sea when the sensors fail.

    • His comparison of other modes of transportation such as space, submarine and airplane, is also flawed.

      Hmm, it sounds like he doesn't know much about space travel and air travel. Planes can now take off, fly, and land all on their own. There are fully automated spacecraft now as well, in fact the shuttle was widely considered to be impossible to manually land as it is pretty much a flying brick.

      • by 0123456 ( 636235 )

        There are fully automated spacecraft now as well, in fact the shuttle was widely considered to be impossible to manually land as it is pretty much a flying brick.

        Most, if not all, shuttle landings were flown manually. I believe at least one was flown manually from orbit to touchdown, though that mostly just involved following the pointers the computer displayed on the screen.

        • You are correct, I was wrong; it appears that the initial reentry is computer controlled, but after the air braking portion it is under manual control. However, there are still many automated spacecraft, such as the cargo haulers and the new X-37B, that are completely computer controlled. Also, apparently, after Columbia, a remote control option was instituted, but not used, in case the orbiter needed to be brought back without a crew due to risk of death.

      • The Soviet space shuttle, the Buran, was fully automated from launch through landing, even having to make a second attempt at the landing.

    • by elrous0 ( 869638 )

      You don't have to be faster than the bear, just faster than the other human being. The cars don't need to be perfect - they just need to beat a human mind that is NOT an expert. If the car by itself can do better than a human without the AI, then it is sufficiently good to replace the current model, which is a human without the AI.

      I'm not keen on dying in a car accident as the result of a software glitch--because no manual override was included, because ON AVERAGE the software does much better than humans. Autonomous AI driving will be great 99% of the time, I'm sure. But in that 1%, I sure as shit want a manual override available. If I die in a videogame because the software glitched, I at least get a respawn. Real life is somewhat less forgiving.

      • I'm not keen on dying in a car accident as the result of a software glitch--because no manual override was included, because ON AVERAGE the software does much better than humans.

        But you are keen on dying in a car accident as a result of driver fatigue, distraction, lack of skill or physical glitch (say, the guy in the oncoming lane has a heart attack), because ON AVERAGE human drivers do much worse than the software?

        It's all a question of odds, and not choosing the option that maximizes the odds is stupid.

    • If the car by itself can do better than a human without the AI, then it is sufficiently good to replace the current model, which is a human without the AI.

      In a world without lawsuits, that would be true.

  • I think the technology being implemented in cars today is not really the most useful technology. Self driving cars are pretty cool from a 'sci-fi' perspective, but inherently dangerous. Personally I don't want any software driving my car! Non-moving computers can be buggy enough.

    I think what should have been installed decades ago are safety systems such as proximity and speed limit sensors. These types of devices would alert the driver to potentially hazardous situations and allow them to avoid an accident.
    • by geeper ( 883542 )

      Personally I don't want any software driving my car! Non-moving computers can be buggy enough.

      Do you ever ride on a plane? Guess what...

      • Planes are not the same as cars. You don't normally have "idiots" flying around with no respect for the rules or other planes; or pilots coming out of the bar completely wasted, then flying a commercial airliner.

        Nor do you have pedestrians, animals, or very many other obstacles in the vast expanse of air above the earth.

        I know that planes will run on auto-pilot and the pilot can take manual control of the plane, but commercial aviation and civilian motor vehicles are two entirely different beasts.
    • Computer: You are about to crash into another car in 10 milliseconds, should I avoid it, or do you want to do it?
      Driver:...
      Computer: You are about to crash into another car in 9.99 milliseconds, should I avoid it, or do you want to do it?
      Driver:...
      Computer: You are about to crash into another car in 9.98 milliseconds, should I avoid it, or do you want to do it?
      Driver:...W ........h.........

      ...

      Computer: You are about to crash into another car in 0.01 milliseconds, should I avoid it, or do you want t
    • I think what should have been installed decades ago are safety systems such as proximity and speed limit sensors. These types of devices would alert the driver to potentially hazardous situations and allow them to avoid an accident.

      These things exist? They are extremely common in fact?

      there really isn't much - if any - high tech safety features (other than ABS brakes, etc).

      You mean other than traction stabilization systems, and backup cameras, and rear object detection systems, and collision avoidance systems?
  • by cellocgw ( 617879 ) <cellocgw.gmail@com> on Tuesday October 13, 2015 @02:01PM (#50719217) Journal

    Wait: TV sets which self-align vertical and horizontal hold.

    Subway systems which run completely automatically (granted with all sorts of staff there to pull the Panic switch and/or make travellers feel more 'safe')

    elevators which automatically go to requested floors.

    Heck, IP packets which automatically make it to their intended destination.

    I think this guy has no idea where and how software is running automated control systems.

  • One, it is unclear from having read the article exactly what level of automation the author is railing against. There is a huge amount of experimentation on the part of the major players in this race and likely several levels of automation will arrive nearly simultaneously. The best approach will tend to win out in the market. It's not like it will be suddenly all totally automated cars and an “OMG we made the wrong choice” scenario.

    Likely we will evolve into full automation as more and more cars become automated. Eventually it will reach a tipping point where the government needs/wants all the human drivers off the roads for safety and efficiency -- when this happens, driving laws and automated enforcement of every minor offense will force humans to cede control to automation or else be fined into the poor house. The biggest challenge to fully automated cars will be dealing with unpredictable humans. The mixed environment for the next 2-3 decades will be quite challenging for all involved.

    • by 0123456 ( 636235 )

      The mixed environment for the next 2-3 decades will be quite challenging for all involved.

      In 2-3 decades, hardly anyone will travel on Earth. Why waste time transporting your body hundreds of miles when you can just rent a drone body at your destination? Travel will only be required over distances too large for remote control.

      This is why the whole 'self-driving car' thing is attacking the wrong problem. It's like someone in 1900 trying to figure out how to clean up all the horse crap that will be clogging up our cities by the year 2000, when everyone will be able to afford a horse.

      • In 2-3 decades, hardly anyone will travel on Earth.

        I've heard claims that no one needs to travel today, that sound and video today provide all the experience you need. I suspect that your claim, like theirs, will fall on its face and people will be traveling just as much.

        Why waste time transporting your body hundreds of miles when you can just rent a drone body at your destination?

        Because the real world is very, very different from a pair of screens right up in your eyes and speakers on your ears. Assuming

  • Why Self-Driving Cars Should Never Be Fully Autonomous

    I think the author meant "selectively autonomous." "Never fully autonomous" implies there are things it shouldn't do by itself, rather than the author's wish that "the car does what I want it to do, and only when I want it to do it." What if you want the car to be fully autonomous?

    What does he think "fully autonomous" cars will do? Force you to go to work when you actually want to skip and go to the beach?

  • He uses the examples of planes and how humans are constantly correcting human errors. Okay, full automation would not have the human errors in the first place. Also the system would be aware that under no circumstance should a highly perilous course be taken. Actually the article more makes a point for why planes should be fully automated as most of the plane crashes have been human error. That being said, humans are still better at landing planes smoothly, but that will probably change over time.

    As for cars, he says most car companies are trying to enhance driver control instead of replace it.
    • He uses the examples of planes and how humans are constantly correcting human errors. Okay, full automation would not have the human errors in the first place.

      No, it would have its own set of unique errors. Maybe fewer of them or maybe more of them. But there will be errors of some sort. Failed sensors, interference, logic errors, defective hardware, etc.

      As for cars, he says most car companies are trying to enhance driver control instead of replace it.

      That's because the full autonomy problem is too big. You have to break it up into bite sized pieces and solve those. Trying to eat the entire elephant in one bite simply isn't possible.

      A computer does not get tired, it can look in more directions and pay attention to them all at the same time, it does not take drugs, it does not get angry.

      It also is inflexible, completely literal and sometimes challenging to communicate with. I think the problem of instructing

  • by argStyopa ( 232550 ) on Tuesday October 13, 2015 @02:29PM (#50719475) Journal

    I'd be happy to have a car "automated enough" to drive by itself on the boring, shitty parts of driving where you're likely to fall asleep - dull, EASY stretches of highway. That's pretty much the 80% target the automated cars are at today. I wouldn't even mind if it drove half speed, since I could be reading/working/sleeping while sitting there.

    If we wait until AI has mastered the complicated, cluttered, HARD bits of driving in cities, construction zones, neighborhoods, etc., it'll never happen.

    It's just a problem of having appropriate systems to awake/alert the driver before the AI disengages, and/or fallbacks like pullouts where cars can go when their driver ISN'T responding.

  • by sjbe ( 173966 ) on Tuesday October 13, 2015 @02:40PM (#50719575)

    I think one of the biggest problems with autonomous vehicles is directing them where you want to go. Let's say you are in a crowded parking lot and you want the car to park in the 3rd spot, 4 rows over. How do you instruct the vehicle efficiently to do that without taking control of the steering yourself? That's not an easy thing to articulate clearly. Worse, how do you tell it where to go when you don't clearly know the final destination yourself? Sometimes you don't have an address or the destination is very large like an airport.

    I think autonomous vehicles might do well on major roads but I think the problem of giving specific instructions is going to be a LOT harder than many people think.

    • by tnk1 ( 899206 )

      I think that in most cases, you'd turn over manual control or at least selective control for parking or other decisions that would be made at slow speeds. Especially in the beginning.

      Alternately, vehicles might simply do a drop off for the user and be programmed to enter a specific assigned parking spot which might well be a mile away, but within relatively close distance for pickups. You wouldn't want to walk that distance, but you could call the vehicle over from a spot that far away.

  • Autonomous self-driving!

    Think about that…

    Cars. Why cars?

    Cars provide self-sustaining R&D funding capital along the path to autonomous robotics. Cars are the highest and best embodiment for human-robotic technology development. Developing cars enables companies like GOOG, AAPL and Tesla-like competitors to affordably grease the runway for the autonomous economy future. Future progress promises autonomous action without the detrimental reliance upon an irrational unreliable primacy of man of mo

  • by Gliscameria ( 2759171 ) on Tuesday October 13, 2015 @03:14PM (#50719873)
    With any of this, there needs to be a fully manual override. Software can get hacked or just freak out, and the passenger needs to always have the option to take control of the vehicle, especially in a large-scale event like a major bug or hack. I don't see a world where law enforcement isn't going to want a backdoor that shuts down the vehicle or runs some program to pull it over or even drive it to a certain place. This hat is itchy...
