Transportation AI

The Humans Crashing Into Driverless Cars are Exposing a Key Flaw (bloomberg.com) 748

schwit1 sends in a story from Bloomberg pointing out that rigid adherence to traffic laws and overcautious programming have caused self-driving cars to rack up a crash rate twice that of the average human driver. "This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well. As the accidents have piled up — all minor scrape-ups for now — the arguments among programmers at places like Google and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?" While the autonomous vehicles aren't at fault in these crashes, their relative unpredictability on the road is nonetheless leading to more accidents than expected.
  • Unison (Score:5, Insightful)

    by Tukz ( 664339 ) on Friday December 18, 2015 @09:02AM (#51142755) Journal

    I have always thought that for automated vehicles to be a reality, ALL traffic has to be automated.
    It takes something close to a true A.I. to be able to adjust to the random nature of human driving.

    • Re:Unison (Score:5, Funny)

      by U2xhc2hkb3QgU3Vja3M ( 4212163 ) on Friday December 18, 2015 @09:07AM (#51142787)
    • Re:Unison (Score:5, Interesting)

      by Vrekais ( 1889284 ) on Friday December 18, 2015 @09:14AM (#51142841)

      I've thought this too, and I think it'll be a city that does it first. City traffic is the worst affected by the start-stop of signalling, and a perfected driverless system wouldn't need signalling, as the cars could flow efficiently (assuming the system is aware of every other car's location).

      We have a few cities here in the UK that are becoming completely pedestrian/mass-transit only: you get to a point on the city boundary where you have to park, and a bus takes you in the rest of the way. I think a city might pilot an "auto mode only" area at some point.

      The law abiding nature of them reminds me of another dilemma I've wondered about. Suppose the car is about to crash and has become sophisticated enough to know that maneuver X would result in 5 pedestrian deaths but maneuver Y kills only the driver.
          Do they make it kill the driver?
          How do you sell something that is programmed to kill you if certain circumstances are met?

  • This should go both ways. People will need to adapt to the way automated vehicles drive (this would be helped by labeling them so they are easy to spot). Then automated vehicles should be given a set of exceptions to the rules, and this would need to be legal, so they can override the regulations when the regulations are likely to create trouble.

    • Re: (Score:2, Interesting)

      by Dcnjoe60 ( 682885 )

      This should go both ways. People will need to adapt to the way automated vehicles drive (this would be helped by labeling them so they are easy to spot). Then automated vehicles should be given a set of exceptions to the rules, and this would need to be legal, so they can override the regulations when the regulations are likely to create trouble.

      If your goal is to give privilege to the wealthy who will be able to afford autonomous vehicles, then this would certainly do it. The rich, riding in these new vehicles, will get special rules for operating a vehicle in traffic compared to the rest of us. Then, in addition, if there is an accident where a regular car hits an autonomous vehicle, it will be the regular driver's fault, because the autonomous vehicle wasn't breaking the law in what it was doing.

      A better and more practical solution would be

    • Re:Adaptation (Score:5, Insightful)

      by matbury ( 3458347 ) on Friday December 18, 2015 @10:04AM (#51143257) Homepage

      Agreed. Human drivers tend to have this sense of entitlement and exceptionalism so that they believe that they can break the law but anyone else who does it is a dangerous idiot. Automated cars are a new type of vehicle that these drivers have to negotiate. Expect a learning curve with a spike in accidents while drivers get used to it. I remember the same thing happening in Barcelona when they re-introduced trams. There was a sudden, dramatic spike in accidents, some of them fatal, while drivers learned what they could and couldn't "get away with." The answer is to give time for lawless drivers to learn about automated cars and adjust their law-breaking appropriately so that they don't get involved in as many accidents. And while we're at it, how about automated cars recording everything that happens around them that can be presented as evidence in court?

  • by AntronArgaiv ( 4043705 ) on Friday December 18, 2015 @09:03AM (#51142761)

    People expect Caddys to drive slow and do weird things, because Uncle Harry is driving. Same for Priuses, because it's either Aunt Marge or some granola-head hippy doing his "hyper-miling" thing. Problem solved :-)

    Either that or put a sticker on the back: "This car rigorously obeys all traffic laws"

  • It might be safer in some situations, but you won't be able to buy an autonomous car that's programmed to break the law. The DoT will just never allow it, and I can't blame them.

  • by AthanasiusKircher ( 1333179 ) on Friday December 18, 2015 @09:07AM (#51142791)

    I'm sure there will be AI defenders who will question the assertion about a crash rate "double" that of average humans. But it doesn't matter. The point is that human drivers are idiots and drive in all sorts of unpredictable ways. They also tend to hate other drivers who operate in demonstrably safer ways (e.g., allowing plenty of space in front of them, not accelerating wildly just to stop 100 feet ahead in stop-and-go traffic, not zooming past a slower lane in a merge situation, but instead attempting a "zipper merge" at the same speed as the slower lane, etc). Of course, a lot of the less safe human behaviors also tend to be the reason for traffic snarls in the first place, but you'll have a hard time convincing most drivers of that, since they want to drive as if they are on a racetrack and somehow think that weaving back and forth to get into that tiny gap you've left in front for safety is going to allow them to get home so much faster (even if it's only 2 seconds earlier).

    I imagine the biggest problem with having AI cars obey traffic laws strictly is not the accidents -- rather that it's going to lead to human road rage, which often leads humans to be even more irrational and drive in even less safe ways. Thus, while AI cars are still a minority on the roads, I'm not sure it will lead to a net improvement in accident statistics -- just as a "slow driver" on a highway can block up traffic, cause other drivers to drive unsafely around them, and ultimately lead to the potential for more accidents, even if that slow driver thinks they are being "safe" by driving the speed limit or a little below.

    • Re: (Score:3, Interesting)

      by sinij ( 911942 )

      The point is that human drivers are idiots and drive in all sorts of unpredictable ways.

      All of this is true, yet the accident rate of these idiotic humans is half that of the rigidly rule-abiding robots. Perhaps driving like an idiot in all sorts of unpredictable ways is the right approach to reducing accidents in a system that is presently dominated by idiots driving in all sorts of unpredictable ways?

      • by BasilBrush ( 643681 ) on Friday December 18, 2015 @09:18AM (#51142861)

        Or... it's simply a learning curve. For both AI and human drivers.

        Besides, all of the robot crashes have been minor fender benders. It may be worth living with double the rate of those if the serious crashes that injure people are perhaps halved.

        • by sinij ( 911942 )
          We could put big soft bumpers on AI cars and paint them bright orange. Perhaps a white Lexus wasn't such a good platform choice?

          Still, if it is necessary for humans to change and adapt to make autonomous driving a possibility, then it is a clear indication that the AI is not up to the task of driving by itself.
        • by PvtVoid ( 1252388 ) on Friday December 18, 2015 @09:33AM (#51142997)

          Besides, all of the robot crashes have been minor fender benders. It may be worth living with double the rate of those if the serious crashes that injure people are perhaps halved.

          Bingo. The relevant question is "What is the crash rate involving injury?" If that is lower, then it is reasonable to accept a higher rate of non-injury crashes. Trading injury for property damage is a good deal, since increased property damage can be handled by increased insurance costs on non-automated drivers.

        • by Junta ( 36770 ) on Friday December 18, 2015 @10:24AM (#51143399)

          all of the robot crashes have been minor fender benders

          Note that, for example, Google's cars are never going faster than 25 mph. So it's a little disingenuous to say they only get into fender benders when a human in the same situation would likely not have anything more than a minor fender bender either, even if they were very bad at driving.

      • All of this is true, yet the accident rate of these idiotic humans is half that of the rigidly rule-abiding robots. Perhaps driving like an idiot in all sorts of unpredictable ways is the right approach to reducing accidents in a system that is presently dominated by idiots driving in all sorts of unpredictable ways?

        Maybe, maybe not. It depends on how you count accidents. Do you want to reduce accidents in general, or do you want to reduce serious accidents that cause serious or fatal injuries?

        It's unclear from the limited data what effect the robotic rules are having. If "driving like an idiot in all sorts of unpredictable ways" causes 50 accidents with 25 serious ones and 10 fatalities, but driving in a more rigid rule-based way results in 100 fender-benders but no serious accidents and no fatalities, I think mo
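A hypothetical way to make that trade-off concrete, using the made-up counts from the comment above plus assumed severity weights (the weights are invented for illustration, not actuarial data):

```python
# Weighted harm comparison. The accident counts are the comment's
# hypothetical numbers; the severity weights are purely illustrative.

def harm_score(fender_benders, serious, fatalities,
               w_fender=1, w_serious=50, w_fatality=1000):
    """Crude aggregate harm: each accident class times its assumed weight."""
    return (fender_benders * w_fender
            + serious * w_serious
            + fatalities * w_fatality)

# "50 accidents with 25 serious ones and 10 fatalities"
unpredictable = harm_score(fender_benders=25, serious=25, fatalities=10)
# "100 fender-benders but no serious accidents and no fatalities"
rigid = harm_score(fender_benders=100, serious=0, fatalities=0)

print(unpredictable, rigid)  # → 11275 100
```

Under any remotely plausible weighting, twice as many minor crashes is dwarfed by even a handful of serious ones; the debate is really about what the weights should be.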

        • by sinij ( 911942 )
          You are over-simplifying the problem.

          If you want to eliminate accidents the simplest way is to ban all driving. We don't do that, because what we really are after is reduction of accidents for any given throughput of transportation system. It is difficult to measure throughput, so we approximate it by miles driven.

          Every adherent of the "rigid rule-based" approach ignores the death toll caused by the increased traffic that results from rigidly following all rules. It is easy to measure direct fatalities caused by traffic
          • by Alioth ( 221270 ) <no@spam> on Friday December 18, 2015 @09:49AM (#51143145) Journal

            "Rigid rule based approach" would probably result in many fewer snarl-ups.

            Let's imagine 2 extremes: the M6 southbound near Manchester as it is today. Traffic is very heavy, and impatient drivers tend to bunch up. An impatient driver cuts from one lane to the next because the next lane is moving 1mph quicker, forcing their way into the remaining space in lane 3 causing someone to brake, and it causes a chain reaction - all the close following cars with too little distance start braking progressively harder and harder until the entire motorway stops (or worse, someone gets rear-ended). You now have a self-sustaining traffic jam with no discernible reason (from the air you just see a standing wave of stopped traffic with no obvious cause) until the evening when finally fewer vehicles are arriving at the back of the jam than are leaving from the front.

            The other extreme is the same entire motorway is populated by rigidly rule following automated cars. They will all be following a safe distance. No one will cut across a lane because the other one is going 1 mph faster. Traffic flows freely all day long despite the density.
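The shockwave the parent describes can be illustrated with a toy car-following sketch (all parameters and the model itself are invented for illustration; this is nowhere near a validated traffic model):

```python
# Toy single-lane model: gaps[i] is the headway from car i to the car ahead.
# A virtual lead car brakes for a few ticks. While the car ahead of car i is
# braking and car i is not, car i's gap shrinks; if it drops below a safety
# threshold, car i is forced to brake too, and the wave propagates backward.

def braking_wave(initial_gap, n_cars=10, closure=5.0,
                 threshold=10.0, brake_ticks=3, horizon=50):
    """Return how many of the n_cars are forced to brake."""
    gaps = [float(initial_gap)] * n_cars
    timers = [0] * n_cars        # remaining braking ticks for each car
    leader_timer = brake_ticks   # the virtual car out front brakes first
    forced = [False] * n_cars
    for _ in range(horizon):
        new_timers = list(timers)
        for i in range(n_cars):
            ahead_braking = (leader_timer > 0) if i == 0 else (timers[i - 1] > 0)
            if ahead_braking and timers[i] == 0:
                gaps[i] -= closure            # gap closes while only the car ahead slows
                if gaps[i] < threshold:
                    new_timers[i] = brake_ticks   # forced to brake as well
                    forced[i] = True
            elif timers[i] > 0:
                new_timers[i] = timers[i] - 1     # braking spell winds down
        timers = new_timers
        leader_timer = max(0, leader_timer - 1)
    return sum(forced)

# Tailgating spacing: one tap of the brakes cascades down the whole column.
print(braking_wave(initial_gap=12))   # → 10
# Generous "rigid rule" spacing: the perturbation is absorbed before it spreads.
print(braking_wave(initial_gap=60))   # → 0
```

The point of the sketch is only the qualitative contrast: with short headways a single braking event recruits every following car, which is exactly the standing-wave jam described above.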

      • by PvtVoid ( 1252388 ) on Friday December 18, 2015 @09:23AM (#51142911)

        All of this is true, yet the accident rate of these idiotic humans is half that of the rigidly rule-abiding robots.

        No. The accident rate of the idiot humans is twice as high with robot cars as it is with other idiot humans.

        • Re: (Score:3, Funny)

          by sinij ( 911942 )
          When you define the problem space this way I can only see two solutions:
          1. Ban rigidly-abiding robots from the roads
          2. Ban idiotic humans from the roads

          I hope you can see that 1. is by far the more economical and feasible solution, especially considering we have a shortage of rigidly-abiding robots and an over-supply of idiotic humans.
      • The point is that human drivers are idiots and drive in all sorts of unpredictable ways.

        All of this is true, yet the accident rate of these idiotic humans is half that of the rigidly rule-abiding robots. Perhaps driving like an idiot in all sorts of unpredictable ways is the right approach to reducing accidents in a system that is presently dominated by idiots driving in all sorts of unpredictable ways?

        Actually, that's not what the article stated. The overall accident rate for autonomous vehicles is lower than that for human-driven vehicles. However, the likelihood of your vehicle being hit by a vehicle driven by a human is twice as high if you drive an autonomous vehicle.

        The other issue, which has not been tested yet, is how autonomous vehicles will fare against different manufacturers who have developed their own AI. It is much more predictable to program all AI cars to obey the traffic laws than to have each man

    • I imagine the biggest problem with having AI cars obey traffic laws strictly is not the accidents -- rather that it's going to lead to human road rage, which often leads humans to be even more irrational and drive in even less safe ways.

      This is a real problem with bad traffic laws, and in a way I'll be glad if having a "perfectly law-abiding" driver demonstrate it unambiguously makes the point.

      We have a related problem in my home city (Cambridge, UK) at the moment. There are many cyclists here and congestion is also increasing with new housing developments nearby. The local council have responded by lowering the speed limits on almost all roads in the city to 20mph instead of the normal 30mph.

      On the face of it, this looks like a reasonable

  • by Anonymous Coward on Friday December 18, 2015 @09:08AM (#51142799)

    There was a study a few years ago about traffic in cities. It found that if all drivers kept strictly to the rules, most cities would grind to complete gridlock.
    People need to break rules to clear junctions, to pass cars that are stuck, and even to force priority so that lanes entering a junction aren't starved.

    I travel by bus to and from work in Amsterdam; it is quite a long trip which includes traffic jams in the inner city. The bus driver often needs to break the rules to be able to pass cars and force priority at junctions, because they are often blocked. Cars are backing up, cars are trying to make room.

    • All of this just highlights how primitive current AI really is. I have a lot of experience dealing with drivers who behave in random, unpredictable ways. A self-driving car doesn't. Current AI is a long, long way from being able to handle all the situations that a human driver encounters every day.

    • There was a study a few years ago about traffic in cities. They found that if all the drivers kept to rules that most cities would halt into complete grid lock. ... The bus driver needs to often break the rules to be able to pass cars

      That depends on the rules. It is illegal for a bus to pass a car where you are?! As for buses, there is a fortunate tendency in the UK to give way to them; otherwise they would never be able to pull away from any bus stop in cities. Perhaps that is what you mean by "breaking the rules"?

      My experience is the opposite. Every day I am in a long traffic jam of cars waiting to pull out at a "T" junction onto another road which has priority. This is all in a city suburb with 30mph speed limit. The traffic

  • WCPGW? There's a part of me that thinks, "We point lasers at planes!"
  • Get humans out of driving. Humans tend to break rules and that disqualifies them from being good drivers.
  • by Ed Tice ( 3732157 ) on Friday December 18, 2015 @09:11AM (#51142821)
    This is nothing more than a sophisticated form of the "everybody else is doing it" argument that you get from small children. If the rules aren't working, the solution is to either enforce the rules better or to change the rules. Having everybody ignore the rules and not change them is the worst possible outcome. It creates a situation where things simply can't get better. Nobody can know the real effect of properly enforced rules, so there's no data that can be used for improvement of the rules. What we need is better enforcement for human drivers. It's almost inexcusable that neither cars (nor trains) have automatic speed control systems that prevent exceeding the limit. Invariably somebody will point out the fantastical corner case where accelerating and swerving makes sense, but those can be easily solved.
    • Indeed. If more accidents are prevented, or at least their seriousness reduced, by enforcing the speed limit, then it's worth the occasional fantastical corner-case crash.

    • by serano ( 544693 ) *
      I see it more as the cars failing to respond appropriately to surrounding conditions. The driving style of surrounding drivers is an essential condition that should be factored in.
    • Having everybody ignore the rules and not change them is the worst possible outcome.

      And yet, it's what we have.

      Driving is a complex mix of the statutory, the habitual and the negotiated. Getting a car to obey the first is the easy part. Good luck with the other two. Especially the third. When a driver flashes his lights at you at a junction, what do you do?

      The real world is messy and the corner cases kill you. If you're fortunate, you work in an industry where that's metaphor. If you're unlucky you work in an industry where that's a literal truth.

    • How do you enforce something 100% without going for a Big Brother solution? There are only so many cops.

    • Believe it or not, "everybody does it", or, to call it by its official term, "customary law", is part of the Anglo-Saxon legal system.

      Drafting new rules is non-trivial, I guess, as they have to allow cars to do what the humans do, while still being understandable by humans, so that the humans stay legal and know what to expect from cars.

    • The problem is far more complicated than you realize.

        It has already been shown that if everyone followed existing laws perfectly, traffic would grind to a complete stop. So obviously you need to change the rules, right? But, trying to change the rules to accommodate every possible situation will simply result in a mess that's even worse than what it is right now, because those "fantastical corner cases" are much more common than you think.

      • It has already been shown that if everyone followed existing laws perfectly, traffic would grind to a complete stop.

        Oh, come on. Who showed this? Where?

      This is nothing more than a sophisticated form of the "everybody else is doing it" argument that you get from small children. If the rules aren't working, the solution is to either enforce the rules better or to change the rules. Having everybody ignore the rules and not change them is the worst possible outcome. It creates a situation where things simply can't get better. Nobody can know the real effect of properly enforced rules, so there's no data that can be used for improvement of the rules. What we need is better enforcement for human drivers. It's almost inexcusable that neither cars (nor trains) have automatic speed control systems that prevent exceeding the limit. Invariably somebody will point out the fantastical corner case where accelerating and swerving makes sense, but those can be easily solved.

      You, sir, are part of the problem and not the solution. For one thing, it is perfectly reasonable and acceptable to exceed the speed limit in order to safely merge into traffic. If you end up directly next to a car and need to merge then you have two options. One is to speed up and one is to slow down. If you're already going the speed limit then the safest option is not to slow down. You can see in front of you and next to you much more clearly than behind you. So why would you stick to a strict inter

  • Not an Infraction (Score:3, Interesting)

    by wisnoskij ( 1206448 ) on Friday December 18, 2015 @09:13AM (#51142835) Homepage

    Not only is it not an infraction to drive in such a way as to save lives and prevent accidents; when saving a life or preventing an accident requires you to exceed the suggested speed, or to swerve into the left lane (even when the divider is solid), you are actually required to do so. The entire point of cars having a maximum speed several times the maximum suggested speed is that you are supposed to speed in many situations to save lives.

    • There isn't any situation I can think of where speeding up will end up saving lives, nor is that why cars can go faster than posted speed limits, nor does anybody teach swerving into the left-hand (oncoming) lane to avoid an accident. That's just beyond brain dead: "let me trade this rear-end collision for a head-on collision, all day long!"
      • You obviously have not taken a motorcycle training course. They will teach you these strategies.

        For example, when coming to a stop you never just downshift straight to first; you downshift progressively as you slow down. This allows you to quickly start moving again if you are about to be rear-ended.

        If you are traveling quickly and someone slams on their brakes, the correct answer is often not to panic stop, but to use the bike's agility to quickly move out of the way. I've had a situation once where I was at a li

      • by Voyager529 ( 1363959 ) <voyager529@[ ]oo.com ['yah' in gap]> on Friday December 18, 2015 @10:15AM (#51143347)

        There isn't any situation I can think of where speeding up will end up saving lives

        Getting out of the way. In most cases, a car can accelerate out of the way of another car faster than its brakes can overcome inertia. Even if it can't, I'd much rather have an accident where I get T-boned in the trunk than in one of the doors.

        nor is that why cars can go faster than posted speed limits

        Well, a car that maxed out at 55 mph would be laughable in states where 65 and 75 are common. Trying to decide upon a national speed limit would be ridiculous, as it doesn't account for population density or geography/topography. Sure, this argues well for having cars max out at 90, rather than 120 or 140 (higher in some of the high-end / exotic models), but a car that maxed out at 75 would be more desirable than one that maxes out at 55, and then we end up with interstate commerce hell...

        nor does anybody teach swerving into the left hand (on-coming) lane to avoid an accident.

        This particular lesson was covered as follows: "do whatever the hell it takes to avoid an accident". If that includes swerving into the left lane? so be it. Here's a for-instance: residential area, two lane road, a driver isn't looking too closely while backing out of the driveway. Do you retain your lane, or swerve into the left lane to avoid hitting his car in the rear wheel well area? Same for hitting a deer, fallen tree, road construction, idiot texting instead of looking at the road, a situation where the lights aren't synchronized and thus the left side is clear and the person in front of you stopped short...

        That's just beyond brain dead

        No, assuming that we're only talking about driving in a straight line on a highway, as if it is the only possible scenario where driving skills come into play, is beyond brain dead.

        "let me trade this rear end collision with a head on collision, all day long!".

        I am certain that the GP wasn't referring to crossing the divider when there was oncoming traffic. To more fully phrase it with the included context, his/her statement was this: "At present, Google cars treat the divider line as sacrosanct, and will not cross it under any circumstances. However, there are edge cases when driving where the best way of avoiding an accident is to cross the line. Humans know that avoiding an accident is more important than staying in the lane; most humans would look at another human sideways if an accident took place because the driver adhered to lane markers rather than self preservation. This is expected of humans, but not of Google cars."

  • It's difficult to say without having the telemetry from every collision.

    If the one incident I've seen the data from is the typical case, then hell no. The driver rear-ended a Google car stopped at a stoplight without even slowing down.

    On the other hand, if we're talking about merging, then that's possibly no. I don't drive big truck anymore, but let's face it: four wheel drivers just can't figure it out.

    Merging is a bit of a difficult case because you can't merge if you haven't matched the speed of traffi

  • by PvtVoid ( 1252388 ) on Friday December 18, 2015 @09:19AM (#51142865)

    From TFA: "They’re usually hit from behind in slow-speed crashes".

    If this is in fact the dominant accident mode, I would suggest that this is not such a big deal and will, over the long term, be self-correcting as the insurance rates for idiot non-automated drivers shoot up because they can't get it through their thick skulls not to tailgate other vehicles.

    • by starless ( 60879 ) on Friday December 18, 2015 @09:31AM (#51142977)

      From TFA: "They’re usually hit from behind in slow-speed crashes".

      If this is in fact the dominant accident mode, I would suggest that this is not such a big deal and will, over the long term, be self-correcting as the insurance rates for idiot non-automated drivers shoot up because they can't get it through their thick skulls not to tailgate other vehicles.

      So, what's happening that makes tailgaters hit the driverless cars more often than driven cars?
      Are the google cars suddenly slamming on the brakes in a way that humans don't generally do?

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. (Ensure accidents do not occur even if the car has to violate laws to do it.)
    2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. (Otherwise, obey all traffic laws.)
    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. (Back up the user's self-driving preferences to the cloud! OK maybe this one do
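The priority ordering above can be sketched as a rule cascade (purely illustrative; the maneuver names and scoring fields are invented for this example, not any vendor's API):

```python
# Illustrative only: rank candidate maneuvers by the comment's three
# priorities in order: (1) minimize harm to humans, (2) obey traffic law,
# (3) preserve the vehicle. All names and values here are hypothetical.

def choose_maneuver(candidates):
    """candidates: list of dicts with keys
    'name', 'expected_injuries', 'breaks_law' (bool), 'vehicle_damage'."""
    return min(
        candidates,
        key=lambda m: (
            m["expected_injuries"],   # First Law: human harm dominates
            m["breaks_law"],          # Second Law: then legality (False < True)
            m["vehicle_damage"],      # Third Law: then self-preservation
        ),
    )

options = [
    {"name": "hold_lane",    "expected_injuries": 1, "breaks_law": False, "vehicle_damage": 0.9},
    {"name": "cross_double", "expected_injuries": 0, "breaks_law": True,  "vehicle_damage": 0.1},
    {"name": "hard_brake",   "expected_injuries": 0, "breaks_law": False, "vehicle_damage": 0.4},
]
print(choose_maneuver(options)["name"])  # → hard_brake
```

Because the key is a lexicographic tuple, a legal maneuver with zero expected injuries always beats an illegal one, and the car only breaks the law when that is the sole way to avoid harming a human.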
  • When you meet a driver who will risk a collision rather than tolerate traffic infractions in others, what do you tell him? That he should stop driving by the rules? Hardly. No matter what you think about it, he's in the right and safe (from prosecution) as long as it's not him violating the rules. The same goes for driverless cars. Once you teach them to break the rules, the moment one of them gets into an accident while in violation, all driverless cars will be in trouble. Another positive result of thi
  • by Anonymous Coward

    Don't make self-driving cars shitty drivers just because everyone else is one.

  • 1. Write all the code in SPARK, or at least Ada or another language with a similar safety record (e.g. Haskell, perhaps Rust). It can be a formally sound and static subset of C++, if it must be, but not just any C or C++.

    2. Use only deterministic code, no dynamic memory allocation, fluffy A.I. heuristics or machine learning. I don't care how hard it is. I want real engineering based on real physics with provable security margins, guaranteed response times.

    3. Formally validate the code in a theorem prover, as

  • If an autonomous car can completely avoid an accident by taking corrective action that keeps its behaviour within the law, then it should do so.

    But there will always be occasions that occur out of the ordinary. Take the obvious example of someone stepping out into the road - if you do nothing, then you are certainly going to crash into the person (and likely kill them).

    Slamming on the brakes might cause a car behind to run into you, and it may not even be possible to stop in time.

    Swerving may be the only option

  • It is also the same problem with automated law enforcement such as speed traps and red-light cameras.
    Normally, cops are supposed to use some judgment before handing out tickets; machines don't. A typical example is that you are supposed to make way for emergency vehicles, which sometimes involves breaking the usual rules. For example, if you cross a red light just enough to make way for an ambulance, a human cop won't ticket you, as you did what you had to. A camera doesn't care. And it starts becoming a problem

  • their relative unpredictability on the road is nonetheless leading to more accidents than expected.

    Isn't it the opposite? Aren't self-driving cars strictly predictable? Isn't it the fault of those who don't follow the rules? They should keep being like that, and people who don't follow the laws should be banned from driving, simple as that.

  • Confession (Score:5, Funny)

    by PopeRatzo ( 965947 ) on Friday December 18, 2015 @09:49AM (#51143149) Journal

    I'll admit it: I drive like an old person. Any day I can piss off a millennial in his BMW is a good day.

  • by NReitzel ( 77941 ) on Friday December 18, 2015 @10:10AM (#51143295) Homepage

    The solution to this "problem" is quite obvious.

    Highways need to be -all- autonomous vehicles. No manual control above 50 km/h (about 30 mph).

  • by U2xhc2hkb3QgU3Vja3M ( 4212163 ) on Friday December 18, 2015 @10:10AM (#51143303)

    To summarise the summary: people are a problem.

  • by bistromath007 ( 1253428 ) on Friday December 18, 2015 @10:54AM (#51143563)
    It sounds more like a flaw with traffic laws.
