Humans To Blame For Most Self-Driving Car Crashes In California, Study Finds (axios.com)

cartechboy writes: Turns out computers are better drivers than humans after all. Axios compiled crash reports and found that the vast majority of crashes in California involving self-driving cars were not caused by the autonomous vehicles themselves. Of the 54 incidents involving the 55 companies holding self-driving permits in California, only one crash could be blamed on a self-driving car in autonomous mode. Six crashes occurred while the self-driving cars were in conventional driving mode, and the majority of the accidents were blamed on other drivers or pedestrians. Maybe self-driving cars aren't such a bad thing after all; it's humans that are the problem.
This discussion has been archived. No new comments can be posted.


  • It can only be attributable to human error.
    https://www.imdb.com/title/tt0062622/quotes/qt0396920 [imdb.com]

  • Human drivers know that other road users make mistakes. As long as the majority of drivers are still human, the real question here is not whether a self-driving car accident has to be blamed on another driver or a pedestrian, but whether a human driver could and would have avoided the accident in question.

    • by Rei ( 128717 )

      Agreed. There's also the issue of whether unusual or unexpected behavior by the self-driving car makes other road users more likely to hit it.

      • I like how Australia handles this. Learners' cars are marked, as are teenagers', so you can anticipate the type of likely stupidity. It's hilarious the wide berth learners get; almost as unpredictable as roos. Add Elderly marking to the mix, everyone wins.

        • Re: (Score:3, Funny)

          by Anonymous Coward

          I like how Australia handles this. Learners' cars are marked, as are teenagers', so you can anticipate the type of likely stupidity. It's hilarious the wide berth learners get; almost as unpredictable as roos. Add Elderly marking to the mix, everyone wins.

          We have elderly marking here in the US. They are all required to drive a Buick with a handicap marker somewhere, usually on the plate itself.

  • Given perfect weather and the absence of traffic, animals or pedestrians, lane tracking software is still hard. Not all roads are well marked

    I'm a futurist and a big fan of the idea of autonomous vehicles

    I'm also a programmer who has been writing code since the 70s

    The current tech seems to be 90+ percent working. The last few percentage points and edge cases are where the deeper problem lies

    • by Tablizer ( 95088 )

      lane tracking software is still hard. Not all roads are well marked

      On unfamiliar roads, I often have that problem also. (I'm a human, by the way.) CA roads are still recovering from the Great Recession, so I often have to guess around faded lines.

      If there are cars in front of me, I simply follow them, hoping they know from prior experience on the same road. If not, I keep an eye on the cars around me for cues. If there are no cars around me, then guessing wrong has minimal risk anyhow.

      As much as I rely on

      • There are a lot of practical little heuristics like this that humans use. Bots could use them too, but it may take years to include and tune them.

        And just when they are done tuning for every situation involving another car on the road, there will be another accident with a car that was for some reason half on a sidewalk, and then they will need to start over again. There are so many possible edge cases I cannot fathom how it could ever work.

        • If you entertain the thought that Human-equivalent AI will someday be implemented (I do), you could expect it to be able to drive a car just as well as a human. You could probably even expect it to drive as well as the best human driver. With cars having the equivalent of the human sensors, and then some more (radar, lidar, vehicle-to-vehicle communication, ...) , I think that they have a good chance of becoming a decent improvement in traffic.

          But given the state of AI today, in my view the software in auto

          • by MrL0G1C ( 867445 )

            Are they really using 'AI' for self-driving, or are they just using a ton of fuzzy logic? I'm assuming fuzzy logic, it's more predictable and more tweakable. Actually, fuzzy logic for the decision making, AI for the recognition of objects in images.

            • Yup, fuzzy logic. My impression is that there is a rule that provides 95% of the decision and the 'neural network' buffers the other 5%. AI is nowhere near thinking like a human.
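The rule-plus-blend idea in the comment above can be sketched in a few lines. This is a toy illustration, not any vendor's actual control code; the membership thresholds and the min() combination rule are assumptions chosen purely for readability.

```python
# Toy sketch of a fuzzy-logic braking rule: a hand-written rule drives
# the decision, with membership functions ("close", "fast") blending
# smoothly instead of a hard 1/0 threshold. All numbers are made up.

def membership_close(distance_m, near=10.0, far=40.0):
    """Degree (0..1) to which the obstacle counts as 'close'."""
    if distance_m <= near:
        return 1.0
    if distance_m >= far:
        return 0.0
    return (far - distance_m) / (far - near)

def membership_fast(speed_mps, slow=5.0, fast=25.0):
    """Degree (0..1) to which the car counts as 'fast'."""
    if speed_mps <= slow:
        return 0.0
    if speed_mps >= fast:
        return 1.0
    return (speed_mps - slow) / (fast - slow)

def brake_command(distance_m, speed_mps):
    """Rule: IF obstacle is close AND car is fast THEN brake.
    Fuzzy AND taken as min(); output is a brake fraction 0..1."""
    return min(membership_close(distance_m), membership_fast(speed_mps))

print(brake_command(50.0, 30.0))  # far away: 0.0, no braking
print(brake_command(8.0, 30.0))   # close and fast: 1.0, full brake
print(brake_command(25.0, 15.0))  # in between: partial braking (0.5)
```

The appeal the comment hints at: every output is traceable to a named rule and threshold, which makes the behavior far easier to tweak than a trained network's weights.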
      • by dgatwood ( 11270 )

        lane tracking software is still hard. Not all roads are well marked

        On unfamiliar roads, I often have that problem also. (I'm a human, by the way.) CA roads are still recovering from the Great Recession, so I often have to guess around faded lines.

        If there are cars in front of me, I simply follow them, hoping they know from prior experience on the same road. If not, I keep an eye on the cars around me for cues. If there are no cars around me, then guessing wrong has minimal risk anyhow.

        As much as I rely on that algorithm myself, the thought of a bot having a similar algorithm bothers me. But faded is faded.

        I'm somehow reminded of a joke.

        A teenage driver was driving for the first time in the winter. His dad told him, "If you ever get caught in a snowstorm, just wait for a snowplow to come by, and follow it until it gets onto a major road."

        Well, sure enough, the kid got stuck in a storm, so he started following a snowplow. After about half an hour, the snowplow driver stopped and got out of his truck.

        "Why are you following me?" the man asked the young driver.

        "My dad said that if I ever got stuck in a snowstor

      • Just last week I was driving on some fresh asphalt with those little flag-type reflectors in the middle waiting for proper botts' dots or what have you to be applied... given the area, actually, I suspect it's a rumble strip center line. And the guy in front of me could not manage to interpolate those dots into a line at all. He did okay (or at least average) on the sections before and after, but he had real trouble where the line wasn't clearly marked out for him even though the reflectors are highly visib

      • "I wonder how much the bot drivers use info from prior visits to the same road, versus using a generic algorithm each time"
        The Google cars are heavily reliant on prerecorded, highly detailed 3D maps. Tesla tries to "just do it" with... some success.
      • A robot could also measure objects (lamp posts, buildings, signs, etc) on the side of the road, and use a detailed map to figure out exactly where it is. That way you could drive even if you're the first on a completely snow covered road.
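The landmark idea above can be sketched as a least-squares fit: given mapped landmark positions and measured ranges, search for the position that best explains the measurements. Everything here (the coordinates, the ranges, the brute-force grid search) is a made-up illustration; a production system would run a Kalman or particle filter over lidar point clouds instead.

```python
# Hypothetical sketch of landmark-based localization: the car measures
# distances to mapped objects (lamp posts, signs) and finds the map
# position that best explains those measurements, no lane markings needed.
import math

landmarks = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # known map positions
true_pos = (3.0, 4.0)
measured = [math.dist(true_pos, lm) for lm in landmarks]  # lidar-style ranges

def localize(landmarks, ranges, step=0.1, span=20.0):
    """Grid search for the (x, y) minimizing squared range residuals."""
    best, best_err = None, float("inf")
    n = int(span / step)
    for i in range(n + 1):
        for j in range(n + 1):
            x, y = i * step, j * step
            err = sum((math.dist((x, y), lm) - r) ** 2
                      for lm, r in zip(landmarks, ranges))
            if err < best_err:
                best, best_err = (x, y), err
    return best

print(localize(landmarks, measured))  # recovers roughly (3.0, 4.0)
```

With three non-collinear landmarks the squared-error surface has a single minimum at the true position, which is why this works even with the lane paint buried under snow.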

    • And as in most problems in life, the final edge cases take the most time to resolve. Driving will take almost 100% accuracy to be realistic. It isn't about being as good or better than humans, it is just because of the serious consequences that occur when a 4500 pound machine makes a mistake in traffic.
      • It isn't about being as good or better than humans, it is just because of the serious consequences that occur when a 4500 pound machine makes a mistake in traffic.

        The consequences are not less serious when a human makes a mistake driving that 4500 pound machine.

    • There's a reason they test in Arizona and California.

      I wanna see these things take icy turns at reasonable speed, and avoid skids better than humans and recover from them better than humans.

      We will still have lawyer problems with dollar signs in their eyes as they sue for accidents, claiming facetiously they are improving the quality when in fact they may be delaying mass roll out, leading to tens of thousands of extra deaths per year, for years or decades.

      Imagine 100% roll out, and deaths drop from 35,000

      • I wanna see these things take icy turns at reasonable speed, and avoid skids better than humans and recover from them better than humans.

        Most of that technology is in every car sold since what, 2010? They've all got ABS and ESP. If they've got AWD as well, then all they have to do is keep to reasonable speeds and the underlying platform will do most of it. That's how most humans handle those conditions, at least where the roads aren't being cared for. In my experience, icy roads get treated somehow. Thankfully, in my region they use volcanic cinders rather than salt. This is hard on the tires, and driving on the loose cinders can be a bit sl

        • You have never been on a road with ice ruts. On a road with ice ruts, it no longer matters where the lane was before the ruts were there, because missing the ruts is dangerous. Ice ruts can throw the car sideways at walking speed if you don't align the car with them. Furthermore, they won't line up with the lanes around corners either, so any car that is driving by map had better be willing to forgo that map and drive in the ruts or there will be enormous problems.
      • I wanna see these things take icy turns at reasonable speed, and avoid skids better than humans and recover from them better than humans.

        Low level vehicle control on icy roads is a fairly easy problem to solve for computers.

        • Have you ever actually driven in real winter conditions, with snow plows in the road that don't scrape the road 'smooth' because they don't want to destroy their blades every kilometer?
    • Given perfect weather and the absence of traffic, animals or pedestrians, lane tracking software is still hard. Not all roads are well marked
      No, it is not. Markings are only used as guidelines. There are plenty of cues lane tracking can use. Here is a list of about 20 algorithms: http://airccse.org/journal/jcs... [airccse.org]
      I guess if you Google them individually you can find YouTube videos that show how the algorithms work.

      Camera-based lane tracking only fails in deep snow. But usually the sides get marked with sticks then.
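One small building block of lane tracking can be shown without any vision library: once candidate lane-edge pixels have been extracted, the boundary is just a line fit. The points below are fabricated; a real pipeline would feed in edge-detected, perspective-corrected image points.

```python
# Minimal sketch of one lane-tracking step: fitting a straight lane
# boundary to candidate edge points from a camera image. The points
# here are invented for illustration.

def fit_line(points):
    """Ordinary least squares y = m*x + b over (x, y) lane-edge points."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - m * sx) / n
    return m, b

# Edge points along a boundary with slope ~0.5, offset ~2 (plus jitter).
pts = [(0, 2.1), (2, 3.0), (4, 3.9), (6, 5.1), (8, 6.0)]
m, b = fit_line(pts)
print(m, b)  # slope near 0.5, intercept near 2.0
```

The hard part, of course, is upstream: deciding which pixels are lane-edge candidates at all when the paint is faded, which is exactly the complaint in this thread.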

  • Good Driver (Score:3, Insightful)

    by u19925 ( 613350 ) on Friday August 31, 2018 @05:23PM (#57234270)

    A good driver is not just supposed to prevent at-fault accidents but should also do his best to prevent accidents when the other party is at fault. If you replace all good drivers with a self-driving system that merely claims to have no at-fault accidents, you are going to have a lot more accidents than you have today. Remember, there is no reward for preventing accidents, so nobody tracks them, and we don't know how many are prevented daily.

    Once I was on a divided road (divided by a 2-foot concrete wall), driving in the right lane. A car took a left turn and entered the wrong way to the left of me, thinking it was a one-lane undivided road and the 2-foot divider was a barrier to some private property. It was not a danger to me at all, and I could have just ignored it and let it have an accident, but I honked hard, stopped the car, opened the window and alerted the driver. He backed up and moved to the other side of the divider. A self-driving car would have just ignored this car. I can easily narrate a dozen such incidents, and a few more incidents where I was at fault.

    We need self-driving cars that not only avoid at-fault accidents but also have a no-fault accident rate lower than average.

    • you don't have to pay to fix your car. I've been in several accidents where I wasn't at fault and never once gotten away clean. Insurance will only pay to fix a new car, but they don't pay the lost resale value that an on-record accident causes. You can hire a lawyer, but they take so much you break even or lose. And if your car is totaled they pay the dealer invoice price, e.g. what the dealer would pay if you took it in for a trade-in. That's usually 3/4 of what the car is worth. And yes, your rates are going
    • Exactly, humans drive to prevent accidents, period. Autonomous cars cannot drive merely to prevent liability. It won't work. Human drivers are part of a social system where everyone is watching out and compensating for mistakes. Autonomous cars need to be a part of this system, which means they need to anticipate human mistakes.
    • by hey! ( 33014 )

      Well, why would you preferentially replace the *good* drivers first?

      I don't think it follows from the 98% human fault rate that robotic drivers don't try to prevent accidents. Who would want to ride in a car which didn't drive defensively? But I suspect robots aren't quite as good as humans at dealing with other humans' behavioral flexibility, which is a nice way of saying "unpredictability".

      That flexibility is sometimes good, sometimes bad. In a world of robotic drivers, no car would stop to honk at anot

      • by rjr162 ( 69736 )

        "no car would stop to honk at another car for entering the highway the wrong way; but then that other car wouldn't be doing that."

        Replace wouldn't with shouldn't in that but then the other car part.

        My car shouldn't pop out of park in certain situations, but was recalled because it could.
        Cars shouldn't lose power on the freeway, but they do.
        Traffic lights shouldn't quit working, but they do.
        Self driving cars shouldn't make mistakes, but they will because they're just like any other object that can encoun

      • Dunning Kruger. The good drivers... the ones who drive defensively and, for example, cruise down a road where they know the lights are timed at the speed limit and hit every green... will be the first to understand: "Yeah, a computer can probably do this better than I can.", and trade in their old cars for self-driving models. It's the bad drivers... the ones who weave in and out of the traffic pattern, take every turn too fast, jackrabbit off every green, and slam on the brake at every red... who will neve

    1. AV Moving in autonomous mode: 38 accidents, 37 attributed to human error.
    2. AV Moving in manual mode: 19 accidents, 13 attributed to human error.
    3. AV stopped in autonomous mode: 24 accidents, 100% of which attributed to human error.
    4. AV stopped in manual mode: 7 accidents, 100% attributed to human error.

    What strikes me is the raw number of accidents are higher in autonomous mode. Maybe the vehicles spend the majority of their time in autonomous mode. The data need to be normalized in accidents per mile driven.
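The normalization the parent asks for is simple arithmetic once you assume a mileage split. The accident counts below come from the list above; the mileage figures are invented purely to show how the comparison can flip.

```python
# Per-mile normalization of the accident counts listed above.
# Counts: from the comment. Mileage split: a made-up assumption.
accidents_autonomous = 38 + 24   # moving + stopped, autonomous mode
accidents_manual = 19 + 7        # moving + stopped, manual mode

miles_autonomous = 2_000_000     # hypothetical: most test miles are autonomous
miles_manual = 250_000           # hypothetical

rate_auto = accidents_autonomous / miles_autonomous * 1_000_000
rate_manual = accidents_manual / miles_manual * 1_000_000
print(f"autonomous: {rate_auto:.0f} accidents per million miles")
print(f"manual:     {rate_manual:.0f} accidents per million miles")
```

With these made-up miles, autonomous mode works out to about 31 accidents per million miles versus about 104 in manual mode, despite the higher raw count, which is exactly why per-mile rates matter.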

    • AV stopped in autonomous mode: 24 accidents, 100% of which attributed to human error.

      AV stopped in manual mode: 7 accidents, 100% attributed to human error.

      Unless I misunderstand something, what those numbers tell me is that the sample size is too small to provide meaningful data.

  • by renegade600 ( 204461 ) on Friday August 31, 2018 @05:39PM (#57234340)

    IMO, the only way self-driving cars will be safe is if all cars are self-driving. They need to be able to talk with each other in order to be safe. Humans are so illogical, there's no way to have an algorithm to predict what they are going to do at any given moment.

    • The fact that autonomous cars are logical is the problem. 1 or 0, brake on or off. Humans aren't used to such extremes in driving.
    • IMO, the only way self-driving cars will be safe is if all cars are self-driving.

      Nope. Even then they will not be 100% safe all the time in every situation.

      A more reasonable standard is whether they are safer than human drivers, and they already are.

      • No they aren't. Setting aside the fact that humans drive far more safe miles than unattended robot cars, the robot cars don't even drive in nearly enough conditions to support that conclusion.
  • Humans programmed the car and humans driving near it had an accident with the vehicle. Either you blame the bad driver or the bad programmer, but a human is to blame. Even if it was some sort of self learning system that self reprogrammed, the initial seed was from a human. But, I guess this is probably not what they meant.
  • If there wasn't anyone else on the road.

  • Maybe self-driving cars aren't such a bad thing after all, it's humans that are the problem.

    Are you sure that California isn't the problem?

  • Because robot cars are unpredictable. A 16 year old just driving away with a new license is more predictable than they are. Humans are not accustomed to driving with robots.
  • That's why dafuck we're developing self-driving cars! I remember being in a classroom in the mid/late 60s when we were discussing those new-fangled Government Regulations requiring seat belts be installed on all passenger cars sold in America (not passed until 1968) . Our teacher pointed out that "you can make cars as safe as you want, but until they do something about that loose nut behind the wheel, there's going to be lots of deaths in cars." A week later, a good friend died in a car crash. Fast forward
  • To live in a robotic world where everything I have so enjoyed over my lifetime is run by a machine.

    I want to feel the acceleration of a car when I want to feel it. I want to take a curve at the limits of the machine.

    I want to be free of my robotic overlords.

    Fortunately I'm just old enough that those who foolishly believe robotic cars, robotic airplanes, robotic sex, robotic ass wipers, and all other things robotic will make their life miraculously wonderful will not be able to dictate their living hell upo

    • I want to feel the acceleration of a car when I want to feel it. I want to take a curve at the limits of the machine.

      I enjoy all that stuff too, but it's a lot safer for everyone if it happens on a track. Where there is sufficient demand, there can be municipally-operated tracks, so they don't have to be expensive.

  • If robot car makers want to test in public roads, they should bear the burden of proving that any accident would also have occurred if a human was at the wheel. This means, if a robot car slams on the brakes because it is a robot, they should take ten humans and put them in the same situation and see how many slam on the brakes just as suddenly. If no one slams on the brakes, then the autonomous car company should be responsible. In the end it will be good for everyone, because robot cars are never going
    • And yes I realize I am going to a special place in Slashdot hell for this; but I am tired of seeing technology companies give humans the short end of the stick with this.
  • Humans *are* to blame for all self driving car crashes in California (and everywhere else). Humans are creating the software and hardware to enable self driving cars. Humans are building the cars (or building the machines to build the cars). Humans are driving the manually operated cars that are colliding with the human designed and engineered automated cars. No matter how good or bad the technology is, humans are to blame.

    Perhaps the headline should have been "Human drivers are to blame for most accidents"
  • ... is perfectly obvious.

    Signed,
    SkyNet.

  • by Karmashock ( 2415832 ) on Friday August 31, 2018 @08:41PM (#57235106)

    ... even if you technically are not to blame. Stopping suddenly for example and causing people to pile into you... technically is typically the fault of the person that rear ended you. But "you" did cause it. If you had not driven in a way that was surprising and unpredictable to other drivers then it wouldn't have happened.

    Now the law will say that you should maintain enough distance that even if people do that there shouldn't be an accident.

    But if the streets are crowded... high traffic... high congestion... that is often not viable.

    Now what they'll then say is "go slower"... the problem is that if everyone does that the traffic becomes even worse.

    What people learn in busy cities is that there is a "way" to operate on the road that has more to do with Chinese bicycles than it does with California road laws. The idea is that everyone follows a code of conduct on the road... "vibe"... a pattern... and if everyone does it... then we have TRUST... and that trust means that we can drive faster and with less space between cars than the law would like. But it is generally very safe so long as people are aware of and hold the pattern.

    When a given individual on the road doesn't follow the pattern... this system becomes unsafe. I notice this all the time on the streets of the busy city in which I live.

    You just get a sense that things are "off" on the road... people are not moving predictably. Maybe it is me... maybe it is them... doesn't matter. I get off the road immediately. I literally park and go for a walk or something. And often I find that there are shattered car parts all over the street when I get back. Why? Because the accident I could sense coming... because people weren't following the pattern caused an accident.

    So... was the AI responsible for the accident? Yes. Legally? Perhaps not. But legality has very little to do with how actually driving on an actual street works. Driving computers have been dealing with this for a while.

    It is a very annoying situation when the police give people tickets for this... according to DMV rules... the way people drive on the streets is generally illegal. It is however how we've basically always driven and continue to drive. If you wanted to... you could probably haul half the drivers in for violating the law.

    You'd have a riot on your hands and the politicians would probably be forced to actually have the law reflect how we actually drive. But they could do it... for a minute.

    Long and short... Cali driving laws are more of a rough guideline and less of the letter of conduct.

  • just not adept enough at staying clear of these perfect machines as they go about their business.

    Just my 2 cents ;)
  • Far too many trolls here will not care about facts. They will simply continue to scream that Tesla and other car companies are killing people, while disregarding this.
    • Far too many trolls here will not care about facts.

      That's so funny, Windy. I nearly fell off my chair.

      this post [slashdot.org] explains who doesn't care about facts.

      You have yet to show a single lie, yet you claim it all the time. You also like to claim any random AC is me; it's probably you. You are dishonest enough to pull that shit.

      I often point out your lies [slashdot.org] and lies [slashdot.org] and more lies [slashdot.org] and more lies [slashdot.org] and even more lies [slashdot.org] and lies [slashdot.org] and lies [slashdot.org]. When you aren't lying, you are just making shit up [slashdot.org] that is in no way believable, and lying.

  • "Hey, Hal, fix it so there's no more crashes involving humans and robot-driven vehicles."
    "No worries, Dave."
    *kills all humans

  • ... will be insurance premiums.

    Once it becomes widely known that the vast majority of accidents are caused by human error, then insurers will push up the cost of "human" insurance.

    We will then enter a period of claims containing "punitive" damages: "well, why wasn't the car computer controlled?" and insurance rates for people will climb even more. And as rates of vehicle accidents become lower, due to there being more AVs on the roads, the public's tolerance for accidents as being "acts of god" will dimi

  • In my country, most urban roads are three cars wide in total, or 'almost' four, with parking and random deliveries (legal or illegal) on both sides. To make any progress, you play 'chicken' with oncoming traffic of varying widths, in the middle, and hope for politeness with a cheery wave. Autonomous driving is impossible! What really would reduce long-distance traffic would be an autonomous system for distributing freight, driverless for each container. Can I copyright my new words 'railroad' and 'railhead' and 'marshalling
  • Machines are right, humans are wrong and responsible for their own deaths and injuries. I submit that these crashes occur because these machines do not act according to the real human rules of the road. Nobody actually studies the "social and psychological environment of driving," in my opinion. People do a very complex dance when driving that goes by complex human perceptions, rules, and expectations. It is a daily conspiracy to break the law with the purpose of getting to one's destination as quickly and
    • Machines are not humans and never will be.

      Then they will never reduce accidents or improve driving safety, and we should treat them as a novelty, not as a savior.

  • Self driving car companies either want to reduce all accidents or they don't. They either want to integrate with human driving or they don't. Humans occasionally do stupid things in traffic, we can all agree on this. If self driving cars are going to reduce accidents, they are going to have to compensate for the stupid things that humans do, just like we all do every time we go out on the road. I believe the worst outcome is coming true, self driving car companies are not feeling any responsibility to t
