Self-Driving Cars In California: 4 Out of 48 Have Accidents, None Their Fault 408

An anonymous reader writes: The Associated Press reports that 48 self-driving cars have been navigating the roads of California since the state began issuing permits last year. Of those, only four have been in accidents, and none of the accidents were the fault of the autonomous driving technology. Seven different companies have tested autonomous cars on California's roads, but Google, which is responsible for almost half of them, was involved in three of the four accidents — the other one happened to a car from Delphi Automotive. All four of the accidents happened at speeds of under 10 mph, and human drivers were in control during two of them. The Delphi accident happened when another car broadsided it while its human driver was waiting to make a left turn.

The AP pieced together its report from the DMV and people who saw the accident reports. But critics note that there aren't direct channels to find this information. Since one of the chief selling points of autonomous cars is their relative safety over cars piloted by humans, the lack of official transparency is troubling. "Google, which has 23 Lexus SUVs, would not discuss its three accidents in detail." Instead, the company affirmed its cars' accidents were "a handful of minor fender-benders, light damage, no injuries, so far caused by human error and inattention."
  • by bradley13 ( 1118935 ) on Monday May 11, 2015 @08:20AM (#49663429) Homepage

    I expect the numbers haven't been publicized, because they are still too limited to have any significance, and also because the cars have been running under fairly tightly controlled conditions.

    When there are a few hundred cars running in all kinds of weather and traffic conditions, with millions of miles driven - if the numbers are still good, you can bet they will be plastered all over the internet.

    • Yes, the data set is way too small to draw any conclusions, and key info, like you say, is not included, such as miles driven (only partially supplied), and where. And what are the rules for driving these vehicles autonomously vs. manually? Are humans taking over where there are more complicated or risky conditions, such as a crowded parking lot?
      • by HangingChad ( 677530 ) on Monday May 11, 2015 @12:38PM (#49666175) Homepage

        Yes, the data set is way too small to draw any conclusions,

        Not necessarily. Pick a pool of 48 cars at random and compare the accident rates. You also have to compare them by the accident rate per hour behind the wheel.

        This gets at the whole idea that self-driving cars have to meet some lofty standard of perfection to become the optimum choice. To replace people behind the wheel self-driving cars only need to be +1 better than human drivers.

        Self driving cars can't drive in the rain. Oh, really? Take a drive around Seattle in the rain; you'll discover human drivers suck in the rain, too. And that's in the rain capital of the world, where you'd expect people to be used to driving in the rain, and they still suck (I lived there for 10 years, so don't bother trying to deny it).

        The biggest obstacle to self-driving cars isn't rain or snow; it's something called illusory superiority: the vanity of humans who think they're better drivers than they really are.

    • Re: (Score:3, Insightful)

      by Andy Dodd ( 701 )

      There's also: "Since one of the chief selling points of autonomous cars is their relative safety over cars piloted by humans, the lack of official transparency is troubling."

      No it is NOT a selling point, because NO ONE is selling these cars yet. It is EXPECTED to be a selling point once development is complete - WHICH IT IS NOT.

      That said, it would be interesting to hear the details of Google's two autonomous accidents.

      Also, the headline is misleading... While a car may be capable of self-driving, if a hum

      • by Junta ( 36770 )

        While a car may be capable of self-driving, if a human is in control when an accident occurs, then the car was not a self-driving one as far as the accident goes.

        Well, it is interesting insofar as it shows when the companies think they still need human operators. Not the crash itself so much as the portion of the time that is human versus autonomous.

      • by mwvdlee ( 775178 ) on Monday May 11, 2015 @09:07AM (#49663815) Homepage

        No it is NOT a selling point, because NO ONE is selling these cars yet.

        You may not be able to buy them yet, but they're certainly already selling you on the concept of it.
        The fact that we're talking about it here demonstrates that the marketing department for these cars is already in full swing.

      • by BasilBrush ( 643681 ) on Monday May 11, 2015 @09:33AM (#49664029)

        They say inattentiveness was the problem. I expect the drivers were wearing Google Glass at the time.

    • by goombah99 ( 560566 ) on Monday May 11, 2015 @08:47AM (#49663639)

      One can be "in the right" and still not have done the right thing. For example, if the light is green, I'm in the right not to slow down for the intersection. But that doesn't mean I shouldn't take precautions to check if someone is coming the other way. If I had, I might have avoided the accident that was not assigned to my "fault".

      On the other hand it's also possible that google cars will be better drivers than the average person. One might hope they use different CPUs for the texting and the driving.

      • by tsqr ( 808554 ) on Monday May 11, 2015 @09:02AM (#49663771)

        One can be "in the right" and still not have done the right thing.

        Pretty much what I was thinking. Back when the Earth was a molten mass and I was taking Driver Education in high school, there was a lot of emphasis on "defensive driving"; in other words, expect the other guy to do the wrong thing, and be ready for it. When you have a mix of self-driving and human-operated cars on the road, the self-driving ones better have some extremely conservative defensive driving skills.

      • by tlhIngan ( 30335 )

        One can be "in the right" and still not have done the right thing. For example, if the light is green, I'm in the right not to slow down for the intersection. But that doesn't mean I shouldn't take precautions to check if someone is coming the other way. If I had, I might have avoided the accident that was not assigned to my "fault".

        Depending on where you are, even if you had the green, you can be assigned partial fault if you hit the idiot running the red (turning right on red, while legal, is technically ru

    • by Alumoi ( 1321661 )

      I expect the numbers haven't been publicized, because they are still too limited to have any significance, and also because the cars have been running under fairly tightly controlled conditions.

      So, under controlled conditions, they were still involved in accidents. I wonder what's going to happen when they let them loose. You know, under normal conditions.

    • Re: (Score:3, Insightful)

      I suspect that "none are at fault" is probably true. But what is often left unsaid is that the cars, while acting legally, were doing something unexpected.

      My great aunt had four car accidents in two years. None were her fault, yet they all kind of were. She was doing things in unexpected ways that were completely legal, but not ordinary. People expect certain patterns, and when someone is outside those patterns, it causes accidents. Not the fault, but rather the cause.

  • by Chrisq ( 894406 ) on Monday May 11, 2015 @08:20AM (#49663433)

    4 Out of 48 Have Accidents, None Their Fault

    I think that compares well to the average Californian.

    • by Anonymous Coward

      California is a "no fault" state -- so every accident is always nobody's fault.

    • No-fault is about taking money away from lawyers, who used to litigate each and every auto accident as a lawsuit in court before the insurers would pay. Eventually the insurers decided that they spent more on lawyers than accident payments, and they had no reason to do so.

      If you want to go back to the way things were, you are welcome to spend lots of time and money in court for trivial things, and see how you like it. I will provide you with expert witness testimony for $7.50/minute plus expenses. The lawye

  • Or is it normal that one out of twelve cars is involved in an accident each year? And by calling it "only" the submitter suggests that the regular accident rate is much higher than that.

    • To be fair, 2 of the accidents happened while under human control. That suggests that yes, the computers are at least as good as the humans... That said, the sample size is tiny, and critical info like miles driven is missing, so who knows.

      • by mwvdlee ( 775178 )

        It also means that the cars aren't driving autonomously at all times.
        To me this implies that there simply isn't any comparable data yet.

      • To be fair, 2 of the accidents happened while under human control. That suggests that yes, the computers are at least as good as the humans... That said, the sample size is tiny, and critical info like miles driven is missing, so who knows.

        And maybe the humans are required to take over in crowded parking lots and other places where these kinds of fender benders take place.

        So maybe this is telling us that Google should consider hiring older, more experienced drivers for these cars......

    • by thaylin ( 555395 )

      You are not considering the mileage driven. These cars are on the road for 100k+ miles a year, so consider that 4 out of 720 cars were in an accident.

      • Re: (Score:2, Troll)

        by alphatel ( 1450715 ) *

        You are not considering the mileage driven. These cars are on the road for 100k+ miles a year, so consider that 4 out of 720 cars were in an accident.

        I don't find these stats promising.
        Being from a family that drives 50k miles per year per driver, I can tell you that we all take vehicle safety very seriously. We do not get into accidents; we do not get broadsided or hit pedestrians or bicyclists or even stop signs.
        The two incidents I can recall in over 10 years are once my uncle got hit from behind at a full stop at a red light, and the other time some loony attacked my mother's van with a baseball bat while she was driving down a street in broad daylight. Bo

      • by arth1 ( 260657 )

        You are not considering the speed they're going at and which roads they are going on. It's easy to avoid accidents when going sub-25 speeds on a predefined subset of roads, whether you're human or not.

        Until we see some data on how autonomous cars do on all kinds of roads and driving speeds and conditions, I don't think we should extol their safety. Going 55 mph over a hilltop on a country road, or avoiding a deer is a bit different. Or a busy bumper-to-bumper city street where no-one will let you over

  • Non-Paywalled Link (Score:5, Informative)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday May 11, 2015 @08:23AM (#49663467) Homepage Journal

    Perhaps you would like to read the story somewhere other than the NYT [seattletimes.com] because paywalling is stupid and offensive, even if you know how to bypass it. Thanks, Seattle Times [seattletimes.com], for just showing me the flipping article.

    Here's a news flash: You can get the same AP newswire article anywhere. Yet people still link the NYT. That's poor internet etiquette given that they paywall.

  • Editorializing... (Score:3, Insightful)

    by Junta ( 36770 ) on Monday May 11, 2015 @08:25AM (#49663485)

    '48 self-driving cars have been navigating the roads... Of those, only four have been in accidents'

    I know that the bigger point is that zero (known) incidents can be traced to the software making a 'mistake' (though even if the other driver is 'at fault', it's hard to say if a human would have done better at avoidance). The thing that strikes me, though, is the editorial bias here. *Only* 4 out of 48... that's nearly 10%. That's far, far above the percentage for the general population. It's perfectly likely that this is simply a fluke of the small sample size, but implying that 4 out of 48 is a very promising incident rate is pretty silly.

    • by tibit ( 1762298 )

      Everything depends on where you are. There are places, even in the U.S., where incident rates are a couple times over the national average.

    • Re:Editorializing... (Score:4, Informative)

      by gurps_npc ( 621217 ) on Monday May 11, 2015 @08:59AM (#49663737) Homepage
      You missed a rather significant point in the article. Two of those accidents happened when a human WAS in control of the car (which was how they know it wasn't the car's fault), so NO, a human would not have done better at avoidance.

      The fact that of the 4 accidents that happened, none of them were the car's fault is more significant than the 10% rate.

      When any specific human has 4 accidents driving cars, on average exactly 50% of them were caused by that specific human. If I drove long enough to have 4 accidents and none of them were my fault, that would be significant evidence that I am a far superior driver to the average human.

      • by LWATCDR ( 28044 )

        I mostly agree. The question I have is whether the accidents that happened while the car was in control were easily avoidable by a human but not by a program. The end result may be more fender benders but fewer lives lost. Still a good trade-off.
        In the end, the sample is too small to jump for joy or shriek in horror.

      • by Kjella ( 173770 )

        You missed a rather significant point in the article. Two of those accidents happened when a human WAS in control of the car (which was how they know it wasn't the car's fault), so NO, a human would not have done better at avoidance. The fact that of the 4 accidents that happened, none of them were the car's fault is more significant than the 10% rate.

        I don't see how two of them should be meaningfully counted under any circumstances. They could just have it drive itself out of the parking lot and let a human do the rest, the autonomous system would never be at fault. If the car's not driving, it's just a plain old ordinary human-operated car. You don't count the miles, you don't count the accidents.

        When any specific human has 4 accidents driving cars, on average exactly 50% of them were caused by that specific human.

        Actually only about 90% of accidents are attributed to driver error, the rest is mechanical failure like a tire blowing out or environmental like a tree falli

      • You missed a rather significant point in the article. Two of those accidents happened when a human WAS in control of the car (which was how they know it wasn't the car's fault)

        Yes, but what precisely do they mean by "a human was in control"? Do they mean the human was actually driving at that point normally? Or do they mean that some sort of situation occurred while the AI was driving, the human driver took over rather quickly to resolve the situation (either because the AI alerted the human, the human knew he/she needed to take over in such a situation, or the human overrode the AI because of an impending problem), and the human driver wasn't able to correct things before an a

    I don't think the percentage of cars in accidents is really that helpful without knowing more about average drive time and conditions. According to the numbers I found, there are around 200 million drivers in the US, and those drivers are involved in 10 million accidents a year. That's roughly 5% of drivers involved in accidents each year, about half the rate of the automated cars. However, you have to bear in mind that the average driver in the US drives only 13,000 miles a year which really isn't tha
    • When you have test cars that are being tested as much as possible and on the road as much as possible, the average incident rate will be several times higher than the rate of an average car that sits in the driveway most of the day.
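
The per-mile comparison this subthread keeps circling around can be put into a quick back-of-envelope calculation. This is only a sketch, using the rough figures quoted by commenters (200 million US drivers, 10 million accidents a year, 13,000 miles per driver per year, 48 test cars with 4 accidents, and one commenter's estimate of 100k+ miles per test car per year); none of these are official statistics.

```python
# Back-of-envelope accident-rate comparison using figures quoted in the thread.
# Assumptions (all from commenters, not official statistics):
#   - 200 million US drivers, 10 million accidents/year, 13,000 miles/driver/year
#   - 48 test cars, 4 accidents, ~100,000 miles/car/year (a commenter's estimate)

human_miles = 200e6 * 13_000      # total miles driven by humans per year
human_rate = 10e6 / human_miles   # accidents per mile, human drivers

fleet_miles = 48 * 100_000        # estimated annual miles for the test fleet
fleet_rate = 4 / fleet_miles      # accidents per mile, test cars

print(f"Humans:     {human_rate * 1e6:.2f} accidents per million miles")
print(f"Test fleet: {fleet_rate * 1e6:.2f} accidents per million miles")
```

On these assumed numbers the test fleet's per-mile rate comes out well below the human average, which is why the per-mile framing matters more than the raw 4-out-of-48 figure.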

  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Monday May 11, 2015 @08:27AM (#49663501)
    Comment removed based on user account deletion
      It will happen. Cars get smarter, but people remain as stupid as they are now. Someone will stick a baby in their self-driving car; the car will stop before reaching grandma because it runs out of power and the nearest charging station is out of commission. It is a hot day, and the baby will die. The parents will of course sue. And from there on, our cars will come with a warning sticker not to let infants or mentally incompetent persons ride unattended.
      • Or the car refuses to leave the driveway because it detects an unaccompanied minor. Or it drives straight to the police. Or it phones the police when it becomes stranded.

        I'm pretty sure that they're accounting for human stupidity. You pretty much have to these days.
        'Car cannot drive, trunk is open'
        'Car cannot drive, human sticking out of window'
        'Car cannot drive, ...'

    • by MrL0G1C ( 867445 )

      I used to walk to school when I was a kid. Which is sadder: not allowing your kids to walk to school, or fearing that you can't stick them in a car and send them a mile or two?

  • While I get that the autonomous system wasn't at fault, the real question is, "Were the accidents something a human driver could have avoided?" I'm a professional delivery driver and one of the things that's constantly drilled into us is to essentially watch out for the stupid people. Sometimes you have to yield right-of-way because it's clear the other driver isn't going to. Do autonomous cars know that?
  • magenta line (Score:4, Insightful)

    by fche ( 36607 ) on Monday May 11, 2015 @08:34AM (#49663561)

    "a handful of minor fender-benders, light damage, no injuries, so far caused by human error and inattention"

    In case any of those were done by human co-drivers in automated vehicles, this does not exonerate the automation from some share of responsibility. For example, if the presence or habitual use of the automation makes it more likely for the co-driver to become inattentive, it's partly to blame.

  • But would a human driver have been able to avoid the accident? On more than one occasion I've escaped a fender bender that would not have been my fault.

  • by Luthair ( 847766 ) on Monday May 11, 2015 @08:36AM (#49663583)

    The autonomous system may not have been at fault, but one wonders whether some of the accidents would have been avoidable by a fleshy driver.

    • If you read the article, you would have realized that at least two of the accidents occurred when a human was driving the car, which is why we know the autonomous system was not at fault.
    • by Kjella ( 173770 )

      The autonomous system may not have been at fault, but one wonders whether some of the accidents would have been avoidable by a fleshy driver.

      In theory or as in a representative sample of the driving population? I'm guessing it's pretty hard to get a good answer to what we would do. At any rate, my prediction is that we'd do better with one less fleshy driver instead of one more.

  • by sunking2 ( 521698 ) on Monday May 11, 2015 @08:36AM (#49663585)
    That the vast majority of human drivers around them were able to avoid accidents despite the presence of dangerous automated cars.
  • by Thud457 ( 234763 ) on Monday May 11, 2015 @08:43AM (#49663615) Homepage Journal

    American components, Russian components, they're all made in Taiwan.

  • Eye contact (Score:2, Interesting)

    by Anonymous Coward

    How does one make eye contact with an autonomous vehicle at an intersection, or when merging lanes? Human drivers will have to learn a separate protocol.

    • by tomhath ( 637240 )
      Just do what they do in Italy...lean on the horn and step on the gas pedal. It's up to the other car to get out of your way.
  • by sinij ( 911942 ) on Monday May 11, 2015 @09:09AM (#49663837)
    The big issue with AI-controlled cars in human-dominated traffic is that AI doesn't react the same way people do. Sure, all-AI traffic would likely be more efficient and less prone to accidents, but we are nowhere near that. Instead we have AI that is hard for humans to predict.

    For example, a huge puddle on the road: most humans would unwisely drive through it. What would the AI do? No idea, and I wouldn't want to be driving behind it when that happens. What about a hobo at the end of the offramp begging for change? Would the AI freak out about a pedestrian on the road? No idea, and I wouldn't want to be driving behind it to see what happens.
  • There is a difference between following traffic laws and not being at fault, and failing to avoid an accident that an alert human driver would have avoided.
