Transportation Google

How Autonomous Cars' Safety Features Clash With Normal Driving

An anonymous reader writes: Google's autonomous cars have a very good safety record so far — the accidents they've been involved in weren't the software's fault. But that doesn't mean the cars are blending seamlessly into traffic. A NY Times article explains how doing the safest thing sometimes means doing something entirely unexpected to real human drivers — which itself can lead to dangerous situations. "One Google car, in a test in 2009, couldn't get through a four-way stop because its sensors kept waiting for other (human) drivers to stop completely and let it go. The human drivers kept inching forward, looking for the advantage — paralyzing Google's robot." There are also situations in which the software's behavior may be so incomprehensible to human passengers that they end up turning it off. "In one maneuver, it swerved sharply in a residential neighborhood to avoid a car that was poorly parked, so much so that the Google sensors couldn't tell if it might pull into traffic."
This discussion has been archived. No new comments can be posted.

  • Best solution: (Score:3, Insightful)

    by Anonymous Coward on Wednesday September 02, 2015 @07:45AM (#50442621)

    Ban human drivers.

    • Re:Best solution: (Score:5, Insightful)

      by Anonymous Coward on Wednesday September 02, 2015 @09:35AM (#50443171)

      No surprise this is modded up as insightful. I'm betting many Slashdotters are horrible technologists who assume the world needs to bend to technology. Simply put, that's not the case. If these cars can't handle driving around humans, they are not ready for consumption. The fact that they can't properly work with and adapt to humans on the road means that these cars are unsafe. They may be "safe" by the letter of the law, but they are not safe if they are causing or instigating traffic accidents. It seems it's blind luck that these cars haven't been the clear-cut cause of an accident yet.

      • Re:Best solution: (Score:5, Insightful)

        by TWX ( 665546 ) on Wednesday September 02, 2015 @10:00AM (#50443357)
        More to the point, autonomous cars are currently not the cheaper technology. Any bill that attempted to force conventional vehicles off the road would be stillborn; there are far too many automotive enthusiasts who have already made inroads in the other direction (i.e., looser emissions-testing rules on cars with collectors' insurance) for it to happen. There would also be pushback from those who simply cannot afford new cars, and from advocacy groups for them; one can buy a running car for less than $1000 on the used market, and it will take a decade, if not longer, for there to even be a chance of a used autonomous vehicle being that cheap.

        There have been lots of discussions about attempting to change driver behavior. Those are also nonstarters. People are not going to change how they drive until conditions in the field force them to do so. Hell, we still have idiots driving below the speed limit in the left lane on busy freeways, where they're actually posing a safety hazard and where the law actually states that one can be cited for failing to yield and being passed on the right. Most people probably don't even know the rules for what counts as stopping (i.e., remaining still for two seconds where I live) and have no interest in learning, and the police don't seem inclined to enforce them either, so this simply won't change.

        The cars are going to have to learn how to adapt to these conditions.
        • How did we get horses and buggies off the road?
    • I too enjoy taking shortcuts instead of fixing things properly.
    • And deer. And snow too. Also road construction.

  • Poor example (Score:5, Insightful)

    by fred911 ( 83970 ) on Wednesday September 02, 2015 @07:46AM (#50442623) Journal

    "One Google car, in a test in 2009,..."

    One would think that in 6 years some improvements would have been made. Do we have a more current example?

    • It would be good to see what progress is being made, but it will always be a challenge to have these control systems anticipate what human drivers intend to do.
      • Re:Poor example (Score:5, Insightful)

        by PolygamousRanchKid ( 1290638 ) on Wednesday September 02, 2015 @08:47AM (#50442905)

        it will always be a challenge to have these control systems anticipate what human drivers intend to do.

        This is complicated by the fact that some human drivers do not even know themselves what they intend to do. So how should a computer control system be able to anticipate what a human driver intends to do, when the human drivers don't even know themselves?

        I really don't think it is that many . . . maybe only 1% of all human drivers. However, one clueless driver can confuse and tie up 99 drivers who know where they want to go and can communicate it to other drivers.

        It's like being on an escalator at the airport or train station. Two folks don't know where they are going, so they stop dead in their tracks at the end of the escalator, blocking the path for all the other folks on the escalator. An accordion effect ensues, with all the folks on the escalator getting squished together. The two people doing the blocking are totally oblivious to this. Their field of vision ends at their own noses. They are entirely wrapped up in themselves, and can't even conceive that there are other living beings around them.

        This is what happens on the road as well. The driver of the car parked halfway into the street is just not capable of thinking that other drivers might be confused by this. Is the car really parked? Or is the driver trying to park? Or maybe trying to drive away . . . ? At any rate, some drivers need to be taught that it is terribly important to anticipate how others might interpret their actions.

        • I see this kind of asshole every day on the street. They walk as if they were the only living beings on the sidewalk or in the mall, completely oblivious to what happens around them. I wonder what kind of serious defect makes these people act like that.
        • Re:Poor example (Score:5, Interesting)

          by jbengt ( 874751 ) on Wednesday September 02, 2015 @10:07AM (#50443399)

          It's like being on an escalator at the airport or train station. Two folks don't know where they are going, so they stop dead in their tracks at the end of the escalator, blocking the path for all the other folks on the escalator.

          That is a well-known problem in architectural design. Give people a clear path to exit and to clear escalators and the like, and place directional signs where people have space to stop and read them. There's an art to finding ways to entice people away from stairs and escalators after they exit them.

    • Re:Poor example (Score:5, Interesting)

      by wstrucke ( 876891 ) on Wednesday September 02, 2015 @08:28AM (#50442821)

      "One Google car, in a test in 2009,..."

      One would think that in 6 years some improvements would have been made. Do we have a more current example?

      It mentions further down in the article that that particular example has already been corrected.

      ... For instance, at four-way stops, the program lets the car inch forward, as the rest of us might, asserting its turn while looking for signs that it is being allowed to go.

    • Re:Poor example (Score:5, Interesting)

      by Braedley ( 887013 ) on Wednesday September 02, 2015 @08:44AM (#50442895)
      As it turns out, we do [washingtonpost.com]. A Google self-driving car and a cyclist on a fixed-gear bike met at a four-way stop. The cyclist was doing a track stand (staying upright on the pedals, sometimes pedaling backwards and forwards a small amount) instead of balancing on a foot. This caused the Google car to think the cyclist was going to enter the intersection after the car had started moving, so it stopped to "wait" for the cyclist, who by this point had "stopped", which the car took to mean that he (the cyclist) was waiting for the car to go (which was actually the case). So the car would start moving again, until the cyclist made his next forward motion to balance himself.
      • So the earlier example with a car doing much the same has been corrected, and now they have data showing that a bike can do something similar at an intersection. I imagine it will be pretty trivial to produce code that lets the car progress through the intersection slowly while watching the bike to make sure it stays within a box (see the sketch below).

        One thing's for sure: a car that refuses to go until the cyclist stops moving or takes their turn is hardly creating a dangerous situation.

        It does raise another interesting point, though.
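
        A minimal sketch of that "stays within a box" check (the function name, tracker interface, and 0.5 m threshold are all invented for illustration, nothing from Google's actual system): commit to crossing as long as the cyclist's tracked position never leaves a small box around the point where they stopped, so track-stand wobbles don't reset the decision.

        # Hypothetical "stay within a box" check: keep inching through the
        # intersection as long as the cyclist's tracked position stays near
        # the spot where both parties stopped.
        def cyclist_stayed_put(track, box_radius_m=0.5):
            """track: list of (x, y) positions, oldest first."""
            x0, y0 = track[0]  # where the cyclist was when the car decided to go
            return all(abs(x - x0) <= box_radius_m and abs(y - y0) <= box_radius_m
                       for x, y in track[1:])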

        • by jbengt ( 874751 )
          The roundabout they built to replace a four-way stop near my house may let you slow down to 5 mph rather than stopping, and it eliminated the odd drunk driver blowing through the stop sign, but it is more difficult to maneuver than stop signs and has also resulted in more crashes.
        • If the roads are very similar in traffic volume, use a roundabout.

          I've seen places where they put in roundabouts at existing road intersections, and the problem is that the roundabouts, despite taking up about 20 times more space than a normal intersection, were still far too small for traffic to go around safely.


    • Re:Poor example (Score:5, Informative)

      by AmiMoJo ( 196126 ) on Wednesday September 02, 2015 @08:59AM (#50442959) Homepage Journal

      The NYT hates new car tech, especially EVs and robots. It seems to be in the pocket of some big vested interests (oil, presumably, and maybe other auto manufacturers who are falling behind). Remember the infamous Tesla Model S review by that Broder guy, where he did everything in his power to make it fail: exceeding the speed limit, slow-cooking himself with the heater, etc.

      This is just another hit-piece against autonomous cars. It might even be out to trash Tesla again, since they are introducing autopilot.

    • Do we have a more current example?

      That would kinda mess up the story now, wouldn't it?

      This is just a techno-FUD story for people who can't stand Google or self-driving vehicles to point to and yell, "See, SEE? I told you these things will never work!"

      That they have to point to six-year-old data is sort of telling.

  • In other news (Score:5, Insightful)

    by Anonymous Coward on Wednesday September 02, 2015 @07:53AM (#50442657)

    Millions of people on the road today deserve to have their licenses taken from them because they can't follow simple rules like signaling, not parking halfway out into the street, and leaving enough room to brake in case the car in front of you brakes.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Millions of people on the road today deserve to have their licenses taken from them because they can't follow simple rules like signaling, not parking halfway out into the street, and leaving enough room to brake in case the car in front of you brakes.

      Due to this - I am a pedestrian - I'd much rather have self-driving cars.

    • Re:In other news (Score:5, Insightful)

      by Lumpy ( 12016 ) on Wednesday September 02, 2015 @08:12AM (#50442737) Homepage

      Or speeding in residential areas. Those people are the scum of the earth.

      • by Mr D from 63 ( 3395377 ) on Wednesday September 02, 2015 @08:17AM (#50442763)
        Left lane laggards are even lower.
        • We just passed a law basically saying "keep right, except to pass". Judging by comments on news articles about the law, this concept is utterly incomprehensible to a large number of drivers. Any number of folks seemed to think that as long as they were doing the speed limit, this rule didn't apply to them, and that it was OK to stay in the left lane all day.

          • This is already law in the state where I live (Washington), and has been since before I got my license. It is *almost* never enforced.

            With that said, there was a time at a past employer when somebody posted to a large social (voluntary-inclusion, not-work-related) internal mailing list asking for advice on how to get out of a ticket he'd gotten for holding up traffic in the left lane. The typical response ran something like this:

            Well, did you get the cop's name/badge number? Because I'd like to buy him a beer.

    • This. The problem is not the people who follow the rules; the problem is the idiots who refuse to follow them.
      • While they are indeed a problem, possibly the largest one, there are two problems with calling that THE problem: 1. Driving safely is often, but not always, about following the rules. 2. Sometimes the rules are written in ways that happen to work against safe driving in order to increase local revenue.
  • by Anonymous Coward

    From what I have read from people who have actually interacted with an autonomous car, they are very pokey and slow, and tend to pause while trying to figure out what to do.
    In fact, many times it's the required human driver who has to intervene to help the car out of a jam. I think the more we try to mix these auto-driven vehicles with human ones, the more we will experience the growing pains of this technology.

    • I see this autonomy being helpful as an autopilot-like control. I may be the one in control as I enter the highway, get up to speed, and merge, but once I'm in a lane, I may switch over to the autonomous system.

  • culture dependent (Score:5, Interesting)

    by Chrisq ( 894406 ) on Wednesday September 02, 2015 @08:06AM (#50442711)

    “They have to learn to be aggressive in the right amount, and the right amount depends on the culture.”

    Very true. When holidaying in Texas I quickly found out that stopping for a red light that had just changed would upset drivers behind me. The lights had a much longer amber time, so a whole lot of people who would have had to brake for the lights in the UK would go through.

  • Not normal driving. (Score:4, Informative)

    by Lumpy ( 12016 ) on Wednesday September 02, 2015 @08:11AM (#50442733) Homepage

    They are confused by BAD driving. People in general really, really suck at driving, and a computer will have problems with that.

    • by gstoddart ( 321705 ) on Wednesday September 02, 2015 @08:50AM (#50442919) Homepage

      Which is always going to be the problem ... because as long as there are human drivers on the road, there will always be cases in which the computer utterly fails.

      And any technology future which is predicated on suddenly replacing all drivers with autonomous cars is complete crap and will never actually happen. Because nobody is going to pay for it.

      It's the corner cases which will always cause these things to go wrong. And, I'm sorry, but the driver with his right turn signal on who swoops across two lanes and turns left ... or the ones who think they can use the oncoming lane because there's something in their lane ... or who randomly brake because they can see a cat a half mile away ... or cyclists who do crazy and random shit ... or any number of crazy things you can see on a daily basis ... all of these things will create situations in which the autonomous car utterly fails to do the right thing.

      As much as people think it will mostly work most of the time, if these things require the driver to constantly monitor them or to swoop in when the system decides it doesn't know what to do, then the utility of the autonomous car pretty much vanishes.

      I just don't see this technology ever becoming widespread or used in the real world, other than by companies trying to prove how awesome it is. Because it's just going to have too many cases which simply don't work, and the occupants will have to be ready to take the controls.

      In which case you might as well be driving and actively engaged in the process, instead of zoned out and not paying attention. Because human reaction time is greatly diminished when you're reading the newspaper and suddenly have to take evasive action.

  • by ledow ( 319597 ) on Wednesday September 02, 2015 @08:13AM (#50442741) Homepage

    If the programming makes it jerk the steering away from a stationary hazard rather than, say, detect it earlier and slow down as it approaches, then it's not suitably programmed for coexistence with unexpected stationary hazards (Not even anything to do with human presence! What if that was a cardboard box and it swerved heavily in case that box "pulled out"?).

    If it can't make its way through a junction where the drivers are following the rules, that's bad programming. If it can't make its way through a junction where other drivers don't come to a complete halt for it, it's not fit to be on the road with other drivers.

    If you want a car to co-exist on the road, it has to be treated as a learner driver. If a learner driver swerved at a non-hazard, they would fail. If a learner driver refused to make progress at a junction because the masses didn't open up before it, they would fail. So should an automated car.

    Unless - and this is important - you are saying that automated cars should only operate on automated roads where such hazards should never be possible, and that they are deliberately NOT programmed to take account of such things. Which, in itself, is expensive (separate roads with separate rules with no human drivers), stupid (that's otherwise known as a "train line", and because they can't do anything about it, it will hurt more when it does happen), and dangerous (because what happens if a cardboard box blows onto the automated road? etc.).

    Program to take account of these things, or don't plan on driving on the road. The safety record is exemplary, but equally there are only a handful of them and the eyes of the world are on them, there are still humans behind the wheel, and even by miles travelled each one is probably dwarfed by a single long-distance driver over the course of a year - and it's not hard to find a long-distance driver who's not had an accident for years.

    If you're going to be on the roads, then you need to be able to take account of all these things, the same as any learner driver. Sure, you didn't hurt anyone by swerving or not pulling out, but equally - in the wording of my first driving test failure - you have "failed to make adequate progress" while driving.

    A car sitting on a driveway would have an even better safety record but, in real life, it's still bog-useless compared to a human. Similarly for any automated vehicle that just stops at a junction because it can't pull out, or swerves out of the way of a non-hazard (and potentially weighs up collision with a non-hazard vs. collision with a small child and gets it wrong).

    • by Cassini2 ( 956052 ) on Wednesday September 02, 2015 @08:33AM (#50442835)

      If it can't make its way through a junction where the drivers are following the rules, that's bad programming. If it can't make its way through a junction where other drivers don't come to a complete halt for it, it's not fit to be on the road with other drivers.

      The problem is that people don't follow rules. We follow approximations of the rules. For instance, my driver's handbook described the correct way to yield at a four-way stop as "yield to the person on the right." For a computer, that's an obvious deadlock situation, or worse - an obvious mistake. If four cars are parked at a four-way stop and each car yields to the car on its right, then (a) a situation could occur where no one goes anywhere, and (b) if the individual cars only pay attention to the person on the right, they could hit an oncoming car turning left, or the car on the left turning left. People process the "yield to the person on the right" rule into something much more complex.

      People use a number of complex behaviours at four-way stops. Firstly, the wave of a hand or the nod of a head to indicate that you yield to the other driver is an important signal. Secondly, in my jurisdiction, 90% of four-way stops are handled on a first-come, first-served basis. Lastly - and this is the bit I don't understand - people often yield to the person on the left. The actual system for navigating a four-way stop is much more complex than an initial computer implementation might be.
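
      As a toy model of that arbitration - first-come, first-served, with "yield to the right" only as a tiebreaker - consider the sketch below. The tuple format, tie window, and approach numbering are all invented for illustration; note that a four-way tie still produces no winner, which is exactly the deadlock described above.

      # Toy four-way-stop arbitration. Approaches are numbered 0..3 clockwise
      # (N, E, S, W), so (approach - 1) % 4 is the approach on a car's right.
      def next_to_go(arrivals, tie_window_s=0.5):
          """arrivals: list of (car_id, arrival_time_s, approach) for waiting cars."""
          earliest = min(t for _, t, _ in arrivals)
          tied = [a for a in arrivals if a[1] - earliest <= tie_window_s]
          if len(tied) == 1:
              return tied[0][0]        # unambiguous first-come, first-served
          occupied = {a[2] for a in tied}
          for car_id, _, approach in tied:
              if (approach - 1) % 4 not in occupied:
                  return car_id        # nobody waiting on this car's right
          return None                  # four cars tied: the naive rule deadlocks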

    • by Xyrus ( 755017 ) on Wednesday September 02, 2015 @08:39AM (#50442867) Journal

      Computers follow rules. Humans (a.k.a. every other asshole on the road) do not.

      This is a no-win situation. If you program a car to drive safely and follow the rules, then it won't be safe on roads because of all the assholes who don't. If you program the car to behave more like an asshole (a human driver), then it won't be safe since there's a good chance it will make the wrong call. If you program the car to account for assholes but still drive safely, then it will basically choke in situations like a four-way stop in southern California, where every other asshole will just muscle or roll their way through the stop.

      The long pole in the tent isn't developing an AI capable of driving. It's developing an AI that can deal with assholes.

      • by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday September 02, 2015 @09:00AM (#50442971) Journal

        It's not a no-win situation. It just means that self-driving cars have to know when to break the rules. They can and should behave like the best of human drivers.

        If you program the car to account for assholes but still drive safely, then it will basically choke in situations like a four-way stop in southern California, where every other asshole will just muscle or roll their way through the stop.

        The current programming of the car handles that situation. Less aggressively than a human would, but aggressively enough to assert its intention to go, and go.

      • There's a much easier solution that has already been implemented. In the short term, the human driver can take over in these situations. In the medium term, we should have more automated traffic enforcement (here comes my first -1 moderation). You could have self-driving police cars. If the car has to do something aggressive to avoid a crash, ticket the human driver. That takes everything arbitrary out of the system. That, and automated speed enforcement. In the long term, human drivers will be the exception.
        • by PPH ( 736903 )

          In the short-term, the human driver can take over in these situations.

          But the human 'driver', freed of the need to keep tabs on traffic, is probably doing something else.

          Yesterday, I was coming home through the daily traffic jam. Heading eastbound, I saw a van stopped in the westbound lane, backing up traffic. I figured it had broken down or something until I passed it. The driver had his nose in his phone, busily texting away (or playing Angry Birds). He probably figured that he'd get something else done while the line wasn't moving, and failed to notice that it had started again.

      • by fisted ( 2295862 )

        Here's an elaborate algorithm for autonomous cars to solve the oh-so-huge problem at a 4-stop intersection:

        0. Are we in a 4-stop intersection with three other cars?
                Yes: Goto step 1
                No: We're finished.
        1. Are some of the cars human-driven?
                Yes: Yield to them. Goto step 0.
                No: Work out go-order by communicating with the other three autonomous cars. We're finished.
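
        The "work out go-order" step is where all the real work hides, of course, but even a naive version is straightforward if every car runs the same deterministic rule on the same broadcast data (a sketch; the message format is invented):

        # Each autonomous car broadcasts (arrival_time, vehicle_id); every car
        # sorts the same list the same way, so they all agree on the order.
        def go_order(broadcasts):
            """broadcasts: list of (arrival_time_s, vehicle_id) tuples."""
            return [vid for _, vid in sorted(broadcasts)]  # earliest first; ID breaks ties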

      • Put the AI on an M1A2, problem solved :-)
      • Re: (Score:2, Informative)

        by Anonymous Coward

        That's not really the take-away from the article. The issue is that even with this limited amount of real-world usage, the cars are running into scenarios that the programmers didn't anticipate, and the software handles those scenarios poorly as a result - more poorly than humans would. That is the issue. That is the limitation of the self-driving car. There will *always* be unanticipated events when driving - more so when there are more automated cars on the roads. The automated cars will handle those situations poorly.

    • by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday September 02, 2015 @08:58AM (#50442951) Journal

      Program to take account of these things, or don't plan on driving on the road.

      Duh.

      Technology in development is imperfect. Big surprise. These issues are why Google hasn't yet started selling them to the public. None of them are insurmountable, but it takes a lot of time and effort to build sophisticated systems.

      What if that was a cardboard box and it swerved heavily in case that box "pulled out"?

      The cars can easily distinguish between a cardboard box and a vehicle. Determining whether or not the vehicle has a driver in the seat and might move... that's often impossible. Likely the reason the car swerved sharply rather than braking earlier is that the badly-parked car was obscured by other obstacles.

      If it can't make its way through a junction where the drivers are following the rules, that's bad programming.

      Six-year-old programming, note. The article mentions that the current version of the software inches forward to establish intent to move.

      and potentially weighs up collision with a non-hazard vs. collision with a small child and gets it wrong

      Google cars recognize pedestrians (of all sizes) and regularly notice them even when no human could. I'm sure the car would choose to hit another vehicle over a pedestrian or cyclist.

      Really, your whole comment is a mixture of outdated information buttressed by invalid assumptions and layered over with a veneer of blindingly obvious conclusions.

    • by swb ( 14022 ) on Wednesday September 02, 2015 @10:31AM (#50443601)

      Every time I've heard an expert (usually a college professor with a background in computer science, robotics, or automation) discuss existing self-driving cars (the Google car is almost always mentioned as an example), they describe self-driving cars as something more highly programmed and rule-bound than actually autonomous.

      They rely less on machine vision and more on extremely detailed, high-resolution saved maps, rather than driving the road they see in front of them. Sensors are used to detect hazards, but more for avoidance than for self-guided navigation decisions.

  • What about speeding? Even more so on under-posted highways / interstates / toll roads?

    Using the center of the road as an extended turn lane, even when not marked as one?

    Rolling stops when no other cars are in the way?

    • by H_Fisher ( 808597 ) <h_v_fisher AT yahoo DOT com> on Wednesday September 02, 2015 @08:26AM (#50442797)
      This. All the studies I've seen boasting about the enormous time advantages of self-driving cars ignore the fact that most human drivers tend to cruise 5 to 15 MPH over the posted speed limit on many interstates and highways. I can't imagine a self-driving car being designed to operate above the posted speed limit in self-driving mode. Unless a second set of roads or a second set of rules is created for autonomous vehicles, you're going to have a difficult time convincing people of the advantage of being slower than everyone else on your morning commute.
      • Re: (Score:2, Informative)

        by Anonymous Coward

        Actually, autonomous cars are programmed to exceed the speed limit by up to 10 mph. This is done because Google deems it safer than driving at the speed limit and being slower than the other cars on the road.

        http://gizmodo.com/googles-autonomous-car-is-programmed-to-speed-because-i-1624025227 [gizmodo.com]
        http://www.bbc.com/news/technology-28851996 [bbc.com]
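
        The policy those articles describe presumably amounts to something like this (a guess at the logic, not Google's actual code; the 10 mph margin comes from the linked stories):

        # Follow the prevailing traffic flow, capped at the limit plus a margin.
        def target_speed(speed_limit_mph, flow_speed_mph, max_over_mph=10.0):
            return min(flow_speed_mph, speed_limit_mph + max_over_mph)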

        • So is Google going to pay my speeding ticket when a cop pulls over my autonomous automobile for speeding?
          • So is Google going to pay my speeding ticket when a cop pulls over my autonomous automobile for speeding?

            Almost certainly. Though they will bring in several well-respected highway-safety engineers to testify that following the flow of traffic is significantly safer than following the posted speed limit. Enough jurisdictions will lose money arguing these cases that there won't be money to be made by writing the tickets. Absent both the financial and safety benefits, the police will stop issuing the citations.

          • Most certainly not. Eventually there will be a class-action suit, though, and a firmware upgrade will allow you to force the car to strictly obey the limits. That will be about the same time that self-driving cars reach saturation, and the sheer number of them keeping speed to within 0.01% of the posted limit will ensure that nobody can speed. Municipalities will then all complain of the lost revenue.
      • by belthize ( 990217 ) on Wednesday September 02, 2015 @08:49AM (#50442915)

        If all the cars were autonomous, morning commute times could be cut to a half or a third without changing the speed limit, since rush-hour-style rubber-band stop-and-go traffic would be a thing of the past.
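
        A back-of-the-envelope version of that claim (all numbers here are assumptions, not measurements): lane throughput is roughly speed divided by the road space each car occupies, so cutting the reaction-time headway from a human ~1.5 s to a machine ~0.6 s roughly doubles capacity even at the same speed.

        # Rough lane capacity: vehicles/hour = speed / (headway gap + car length).
        def vehicles_per_hour(speed_mps, headway_s, car_len_m=4.5):
            return 3600.0 * speed_mps / (speed_mps * headway_s + car_len_m)

        human = vehicles_per_hour(25.0, 1.5)  # ~2,100 vehicles/hour
        auto = vehicles_per_hour(25.0, 0.6)   # ~4,600 vehicles/hour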

        • And even if morning commute times were longer, who cares? It wouldn't be commute time - you would start working the second you got in the car. Who cares if it takes two hours each way if you have your computer open, doing work?
      • Even if the car were programmed to follow the rules exactly, how much time would you actually lose on your daily commute? Really? Especially when weighed against the fact that you can spend the drive to work reading, working, making calls, or whatever.
      • by Gavagai80 ( 1275204 ) on Wednesday September 02, 2015 @02:26PM (#50445617) Homepage

        A slow commute isn't such an issue if you can spend it relaxing or working instead of driving... and even speeding by 15 MPH only saves a few minutes on a commute.

    • by krray ( 605395 )

      And that's a ticket for each one of those offenses [if caught]. I've been ticketed in the last 5 years for each and every one of those. Annoying? Yes. Unavoidable? Don't use a Google car. :)

  • with ABS, airbags, and selectable 4WD. Window cranks, an inert key, and levers on the seat. Cheaper, less to break, and I think I can still manage.
