AI Transportation

Self-Driving Cars Would Only Prevent a Third of America's Crashes, Study Finds (reuters.com) 219

An anonymous reader quotes Reuters: Self-driving cars, long touted by developers as a way to eliminate road deaths, could likely only prevent a third of all U.S. road crashes, according to a study released on Thursday. The Insurance Institute for Highway Safety (IIHS), a research group financed by U.S. insurers, found the remaining crashes were caused by mistakes that self-driving systems are not equipped to handle any better than human drivers.

Partners for Automated Vehicle Education, a consortium of self-driving companies and researchers, said in a statement on Thursday the study wrongly assumed that automated cars could only prevent crashes caused by perception errors and incapacitation. Some 72% of crashes were avoidable, based on the study's calculations, if accidents caused by speeding and violation of traffic laws were included, the consortium said...

[N]ot all human mistakes can be eliminated by camera, radar and other sensor-based technology, according to the IIHS analysis of more than 5,000 representative police-reported crashes nationwide. Most crashes were due to more complex errors, such as making wrong assumptions about other road users' actions, driving too fast or too slow for road conditions, or making incorrect evasive maneuvers. Many crashes resulted from multiple mistakes. "Our goal was to show that if you don't deal with those issues, self-driving cars won't deliver massive safety benefits," said Jessica Cicchino, IIHS vice president for research and a coauthor of the study.

  • Uh... (Score:5, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Sunday June 07, 2020 @11:39AM (#60156136) Homepage Journal

    "Most crashes were due to more complex errors, such as making wrong assumptions about other road users' actions, driving too fast or too slow for road conditions, or making incorrect evasive maneuvers."

    You can solve literally all of those problems with self-driving cars by simply making them leave an adequate cushion, and not drive too fast for conditions. "Driving too slow for road conditions" is really code for "other vehicles moving too fast and/or failing to yield". If you leave an adequate cushion then you don't have to make evasive maneuvers in the first place. If the other cars are also self driving and they don't speed and they let you in, then your speed can't cause an accident.
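To make the "adequate cushion" point concrete, here is a minimal Python sketch of one way a following gap could be sized so the car can always stop; the reaction time and deceleration numbers are assumptions for illustration only, not figures from the study or from any particular vehicle.

```python
# Minimal sketch of the "adequate cushion" idea: choose a following gap large
# enough to stop even if traffic ahead stops abruptly. The reaction time and
# braking deceleration are illustrative assumptions.

def safe_following_gap_m(speed_mps: float,
                         reaction_time_s: float = 1.0,
                         max_decel_mps2: float = 6.0) -> float:
    """Distance (metres) needed to come to a full stop from speed_mps."""
    reaction_distance = speed_mps * reaction_time_s
    braking_distance = speed_mps ** 2 / (2.0 * max_decel_mps2)
    return reaction_distance + braking_distance

if __name__ == "__main__":
    for kph in (50, 100, 130):
        mps = kph / 3.6
        print(f"{kph} km/h -> keep at least {safe_following_gap_m(mps):.0f} m")
```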

    • Re: Uh... (Score:4, Insightful)

      by Viol8 ( 599362 ) on Sunday June 07, 2020 @11:50AM (#60156190) Homepage

      Contrary to what you may believe, driving too slow can be dangerous, which is why many countries have a minimum speed limit on their highways, and here in the UK mopeds and tractors are banned from motorways.

      • My math looks interesting: "2/3 < 1". Detroit, you listening?
      • Re: (Score:3, Insightful)

        by drinkypoo ( 153816 )

        You can only "cause" an accident by going too slow if someone else is going too fast for conditions. You should never outdrive your vision and braking ability. Whether the hazard you're avoiding is that something fell off of a truck or that someone is driving 15 in a 65, this will prevent a collision.

      • Ok, but
        1) self driving cars won't drive too slow, eliminating much of the problem

        2) driving too slow doesn't directly cause crashes. It causes crashes indirectly, through people not paying attention and not being prepared for such a slow vehicle, and by causing congestion, which puts everyone closer together and makes them a bit more aggressive in trying to get around it. When non-self-driving cars drive too slowly, other self-driving cars around them can deal with both of these indirect causes.

        So I fail to see how sel

        • by kenh ( 9056 )
          Self-driving cars will drive as programmed (obviously), so why assume that manufacturers will design cars to always run at the maximum allowed speed limit? Decisions made by individual drivers are based on many variables and factors, including "running late" for an appointment or "showing off". I assume the committee that develops the self-driving algorithm will choose to drive at a speed determined by several factors, not just the posted limit (congestion, weather, and road condition are but a few).

          Self-driving car

          • I wouldn't worry about it, since so-called 'self-driving cars' are never going to be available to the general public and will never be widely accepted enough to be profitable, because they're never going to be adequate enough to be truly safe on public roads under all conditions. The so-called 'AI' used is at best half-assed, can't 'think', has no ability to 'reason', and therefore will never be fully up to the task.
            • I drive 80% of the time on Autopilot, or I did when commuting was still a thing. I would be willing to pay someone to remotely drive my car the other 20% if such a thing were an option. You're right that full autonomy won't happen for a long time, but full autonomy isn't necessary; we only need adequate autonomy combined with effective remote piloting to enable autonomous taxis and autonomous delivery vehicles, which would have a huge impact on the economy.

      • It's not that driving "too slow" is dangerous in itself; if you just had a road full of tractors there would likely not be any serious accidents. The problem is that when you have a massive differential in speeds between vehicles, it results in more lane changes, passing, etc., which is where the danger comes from. If people are traveling at the same speed, the vehicles don't change position relative to each other all that much. If there's someone who's driving much slower (even if they're speeding themselves!) that
        • All cars should drive like humans, especially automated ones. If the car is making any choice that a human wouldn't make, then it is a dangerous addition to a road with 99.9% human drivers. Therefore, driving at an insanely slow speed may not be illegal, but it is a predictable danger that automated-car designers should avoid at all costs.
      • You mean driving slower than everyone else expected you to drive.

        If we reduced the standard speed limits within cities to 20mph and actually enforced it, then it would save many lives and avoid many serious injuries. Actually, even just enforcing the 30mph limit would be a start.

    • If all cars were self-driving, then ideally all cars would be connected to some common grid. Therefore there should be far fewer wrong assumptions about other drivers' intentions, and then fewer accidents.
      • You don't even need V2V/V2I in order to avoid such crashes. You do it the same way a conscientious human does it. You don't speed up and pass on the outside when going past an onramp where you can't see whether anyone is about to merge, for example. Communication with other vehicles or with infrastructure would help, but it's not necessary. The vehicles only have to follow the rules of the road.

        Plus, you can't trust any vehicle that requires communication to self-drive anyway, because that only means that i

      • We will never even have 10% of all cars be 'self driving' because the technology is woefully inadequate, always will be until we have real 'AI' and not the half-assed garbage they keep trotting out in the media, and people are highly unlikely to accept anything they can't control themselves anyway.
    • "Driving too slow for road conditions" = the 55 BS.
      Just Try to do 55 on the IL toll way to see that in action

      • I don't want to drive 55. But if my car drives itself at 55 and I can play with my phone then I'm fine, I'll eat the extra few minutes in the name of safety and fuel efficiency. Over a long trip you might lose an hour, but you'll feel less fatigued when you get there.

    • by fermion ( 181285 )
      Read between the lines. It is insurance companies. Even though you buy a car that is verifiably going to be involved in fewer accidents, we can’t lower your insurance rates.
  • Still... (Score:5, Insightful)

    by smi.james.th ( 1706780 ) on Sunday June 07, 2020 @11:41AM (#60156138)
    A third is kind of a lot though?
    • Re:Still... (Score:5, Insightful)

      by kenh ( 9056 ) on Sunday June 07, 2020 @12:24PM (#60156312) Homepage Journal
      Over 35,000 Americans die in car crashes per year [policyadvice.net]; anything that could prevent a third of those, roughly 30 deaths per day, is a good thing and should be celebrated. I don't understand how the "only one-third" framing in the title seems to dismiss the achievement. Imagine a drug that could eliminate 1/3rd of fatal heart attacks, or a policy that was documented to prevent 1/3rd of fatal gun deaths each year. Would you still dismiss them as "only" eliminating 1/3rd of deaths?
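For what it's worth, the back-of-the-envelope arithmetic behind that per-day figure, using the numbers cited above:

```python
# Rough arithmetic using the figures cited in the comment above.
annual_road_deaths = 35_000      # cited US annual road deaths
prevented_fraction = 1 / 3       # the study's "only a third"
per_day = annual_road_deaths * prevented_fraction / 365
print(f"~{per_day:.0f} deaths potentially avoided per day")  # about 32
```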
      • ..., or a policy that was documented to prevent 1/3rd of fatal gun deaths each year. Would you still dismiss them as "only" eliminating 1/3rd of deaths?

        I take your point, but I can't resist pointing out that we already have demonstrated and documented policies that would do a whole lot better than reducing gun fatalities by 1/3. In this country that would not be considered an achievement. Instead it would be, and has been, summarily and vehemently rejected multiple times. Guess what "this country" refers to.

        Why? Sorry, but I'm not going to go there now. There is no point; it has been done too many times already.

        • Like making guns illegal... works well in Chicago... or are you talking about some different policy? What would that be?
        • Reforms in driver education, training, and testing would solve most of the problems with human drivers. Weed out the people who cannot be fully competent. Bring back Driver Ed/Driver Training in high schools, for starters. Private driving schools need to be held to higher standards, not just teach people the bare minimum to pass the current tests. Immigrant adults would be required to start from scratch as if they'd never driven before. More thorough testing overall would prevent adult drivers with bad habits
      • Apples and oranges.
        Also you're ignoring the deaths caused by a totally inadequate machine that fucks up and gets people killed -- and it will happen, guaranteed, because the technology is inadequate, and always will be because it's a technological cul-de-sac.
    • I think that the "only" was related to the idea that developers have long touted by developers as a way to eliminate road deaths.

      33% is a LOT of savings, but it's nowhere near eliminating anything.

      • Exactly. Also, they can't predict how many more deaths will be caused by woefully inadequate self-driving cars, in accidents that a human could have avoided, but that the machine can't, since it's incapable of what we refer to as 'thinking' and 'reasoning'.
    • Do you really want to cede control over whether you live or die to an unreliable machine you can't control? If it starts to fuck up you have no chance or ability whatsoever to take manual control of the vehicle and try to save yourself? Nightmarish. Hellish. Screaming in utter terror as you see your messy, painful death coming. No thanks.
      • ..or even worse: it's about to kill someone else, and you can't stop it from happening? You have to live with that image in your head the rest of your life? The family and friends of the deceased, blaming you for it, regardless of you having no control over the vehicle? Again: no thanks.
    • Sounds like it might not be worth it. What monetary value have we placed on a human life? I want to say 3 million, but that sounds high.
  • by jacks smirking reven ( 909048 ) on Sunday June 07, 2020 @11:42AM (#60156142)

    A story that is immediately outed as functionally useless in its own summary. Of course you have to account for all possibilities of accidents to eliminate all those accidents.

    IIHS does some good work, but I imagine self-driving cars will reshape the way auto insurance works as a whole, which is probably something they are worried about.

    • The title is misleading, but the research is not useless. It's an analysis of the cost-benefit for achieving certain levels of functionality and adoption.

      For example, "if we have cars that are automated but have no way to sense the level of traction and slow down, that will leave X many accidents on the table"

      Or, "if we have self-driving cars, but initially only a few, so drunk drivers are still blowing through red lights at about the same rate, X many self-driving cars will still be hit by them."

      Or

  • not massive? (Score:5, Informative)

    by dirk ( 87083 ) <dirk@one.net> on Sunday June 07, 2020 @11:43AM (#60156152) Homepage

    "Our goal was to show that if you don't deal with those issues, self-driving cars won't deliver massive safety benefits"

    So lowering accidents by 33% wouldn't be a massive benefit? Really? I can't think of any other change that ever delivered anything like those benefits. I guess they are trying to make sure they can keep overcharging people for insurance since not every crash will be eliminated. In other words, this is crap.

    • According to the CDC, over 32,000 people die in car accidents a year, so this would "only" save over 8,000 lives a year and prevent "only" 500,000 car-related injuries a year (in theory, yes yes, statistics and all). A drop in the bucket..... /s

      https://www.cdc.gov/vitalsigns... [cdc.gov]

    • Re:not massive? (Score:4, Insightful)

      by nukenerd ( 172703 ) on Sunday June 07, 2020 @12:20PM (#60156294)
      They are talking about the USA, where the driving standards are terrible: https://www.theguardian.com/co... [theguardian.com]
      In that link, an American describes his test in the USA:

      I had to take a very short multiple choice theory test. Having not studied and never driven, I passed easily. Then I took a practical test that consisted of a 15-minute amble through a flat rural area. I performed poorly, and at the end of my test the examiner turned to me and said, "You really don't know what you're doin', do ya?" And he passed me.

      It would make less difference in the UK for example, as the driving standards are higher and also because UK roads will be a greater challenge for SD tech. The human driver accident rate in the UK is far lower than in the USA despite the roads being less wide, less straight, and having more pedestrians and cyclists.

      I can't think of any other change that ever delivered anything like those [33%] benefits.

      The introduction of the driving test in the UK had a massive benefit. Other things that come to mind are seat belts, safety glass, and crash helmets for motorcyclists.

  • by PPH ( 736903 )

    Most crashes were due to more complex errors, such as making wrong assumptions about other road users' actions, driving too fast or too slow for road conditions, or making incorrect evasive maneuvers.

    You can fix the road conditions problem by having telemetry between vehicles and ground stations broadcasting information. This also solves the 'other drivers intentions' problem and the evasive maneuvers. Vehicles agree on which way to swerve, much like TCAS [wikipedia.org]. But none of these systems can handle pedestrians or bicycles unless we figure out a way to hang transponders on every bum bike and hobo wandering in the street.
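As a toy illustration of the TCAS-like coordination described above, here is a minimal Python sketch in which two vehicles on a collision course pick complementary swerve directions without any central controller. The message fields and the ID-based tie-break rule are invented for illustration; no real V2V standard is implied.

```python
# Toy, TCAS-inspired sketch: both vehicles run the same deterministic rule on
# the same pair of messages, so they always choose opposite directions.
# All names and fields here are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass(frozen=True)
class ConflictMsg:
    vehicle_id: int            # transponder-style unique identifier
    time_to_conflict_s: float  # estimated time until paths intersect

def resolve_swerve(own: ConflictMsg, other: ConflictMsg) -> str:
    """Return 'left' or 'right'; the two parties always pick opposite sides."""
    return "left" if own.vehicle_id < other.vehicle_id else "right"

if __name__ == "__main__":
    a = ConflictMsg(vehicle_id=17, time_to_conflict_s=2.4)
    b = ConflictMsg(vehicle_id=42, time_to_conflict_s=2.4)
    print("Vehicle 17 swerves", resolve_swerve(a, b))  # left
    print("Vehicle 42 swerves", resolve_swerve(b, a))  # right
```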

    • You can fix the road conditions problem by having telemetry between vehicles and ground stations broadcasting information.

      The generic term for this technology is Vehicular Ad-hoc Networks (VANETs) [wikipedia.org].

      • I find it amusing that anyone promoting that technology ignores the fact (FACT, mind you) that it would be hacked, and there would be massive multi-car accidents because of that, among other hijinks, like having entire freeways grind to a halt because some hacker thought it would be funny.
        • by PPH ( 736903 )

          because some hacker thought it would be funny

          Thousands of virtual protesters suddenly block the freeway. It's not like people haven't thought of doing something like this [wired.com] already.

    • "none of these systems can handle pedestrians or bicycles unless we figure out a way to hang transponders on every bum bike and hobo wandering in the street."

      Of course they can. The vehicles which detect such obstacles can communicate their existence to others. Then those other vehicles can alter their behavior such that they behave more cautiously around them. Further, if they are detecting illegal and unsafe behavior, they can report it to the infrastructure, which can report it to law enforcement. A hobo

      • by PPH ( 736903 )

        The vehicles which detect such obstacles can communicate their existence to others.

        Not if traffic is light and the obstacle appears after the previous vehicle passes. Obstacles can materialize in a matter of seconds. Think of a bicycle shooting through a stop sign and across an intersection.

        A hobo wandering on the street is jaywalking,

        No longer enforced in Seattle. We've had some pretty serious protests over video surveillance and privacy. I suspect that if you automatically uploaded a clip (or any other data) of such a traffic infraction to law enforcement, you would be cited.

    • My Tesla already handles incorrect behavior by human drivers. I was once in an HOV lane going about 40 miles/hr faster than the regular traffic lane, driving on Autopilot. A car pulled into the HOV lane right in front of me. The Tesla slammed on the brakes and drove right onto the shoulder to avoid the accident; I would not have been able to do this in time. (No idea what would have happened if there was no shoulder.)

  • by Bimkins ( 242641 ) on Sunday June 07, 2020 @11:45AM (#60156168)

    Remove ‘only’ from the headline, and the result would sound rather great! There were over 36000 traffic fatalities in the US in 2018. If we ‘only’ saved one third of those, it would be an awesome improvement.

    • They said "only" because some people are claiming that SD tech will nearly eliminate crashes. Only a few days ago someone was claiming that the recent crash in Taiwan of a Tesla on Autopilot straight into an overturned lorry in broad daylight, was only an "edge case".
    • We can now eliminate 100% of all traffic-related fatalities!
      All we require of everyone is that you give up any and all control of the vehicle you're being transported in!
      You're all okay with that, right?

      • I'm sure many won't be... but I for one would be. I'm ok giving away my right to drive drunk... because in the end I'd rather be able to walk down the road and not get killed by a drunk driver. So same goes for driving manually.
  • by Riceballsan ( 816702 ) on Sunday June 07, 2020 @11:55AM (#60156192)
    Misunderstanding other humans' intentions... but that literally becomes less of a problem the fewer human drivers there are. AIs can communicate with each other, always knowing the intention of every other AI. When we get the costs down and make them standard, it sounds like those "unavoidables" become avoidable. It's like saying a self-driving car won't protect you from a drunk driver... but if the drunk driver had a self-driving car, he wouldn't really be a problem. At least that's my understanding.
    • Scheduling trains on a railway with fixed routes, no U-turns, no sudden pulling out, no bicycles, no pedestrians, or the 101 other things you get on the roads is hard enough. The idea that we're anywhere close to having software that could successfully control cars in a crowded city like Delhi or Rome is just farcical.

      And something self drive advocates always forget - what about motorbikes? They'd have to be banned from the roads to make the self driving "dream" (nightmare more like) become a reality.

      • Why can't motorbikes become self-driving when the tech advances? Now, I do agree, places like India are pretty non-viable. Hell, from all I've seen there, I have no idea how any human can drive in those cities.
      • Scheduling trains on a railway with fixed routes, no U-turns, no sudden pulling out, no bicycles, no pedestrians, or the 101 other things you get on the roads is hard enough.

        It is? Since when? Are you stuck back in the 1600s or something?

        And something self drive advocates always forget - what about motorbikes? They'd have to be banned from the roads to make the self driving "dream" (nightmare more like) become a reality.

        It's entirely possible to create a self-driving motorcycle too, but really, who gives a crap about motorcycles? Anywhere outside of Asia they make up a teeny tiny percentage of vehicles on the road, and they're just a blip in the accident statistics. No clue how you came to the conclusion that they would have to be banned, but then again I don't understand why you made up 90% of the stuff in your comment.

      • Yeah, any kind of centralized control of all cars is never going to happen; it's computationally intractable. The best cars are likely to do is broadcast their intentions to nearby cars, things like "I am about to change lanes" or "I am about to accelerate" (see the sketch below).

        Still, I really like the idea of cars being able to get the timing perfectly going through intersections to not need traffic lights. That is great science fiction.
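A minimal sketch of what the intention broadcast mentioned above might look like; the message fields, the port, and the use of a plain UDP broadcast are assumptions for illustration only, not any real V2V protocol (DSRC and C-V2X define their own formats).

```python
# Hypothetical "I am about to change lanes" broadcast to nearby cars.
# Field names, port, and transport are invented for illustration.
import json
import socket
import time

def broadcast_intent(action: str, lane_from: int, lane_to: int,
                     port: int = 37020) -> None:
    msg = {
        "ts": time.time(),   # when the intent was announced
        "action": action,    # e.g. "lane_change", "hard_brake"
        "lane_from": lane_from,
        "lane_to": lane_to,
    }
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(json.dumps(msg).encode(), ("255.255.255.255", port))

if __name__ == "__main__":
    broadcast_intent("lane_change", lane_from=2, lane_to=1)
```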
    • From an engineering point of view, you are correct. However, once you inject the stupidity of non-technical people into the mix, things go sideways pretty quickly. Let's say that tomorrow everyone had a self-driving car and every other vehicle type on the road (truck, bus, etc.) was also self-driving. It wouldn't take very long for the loss of revenue to various entities to add up and the complaining about it to take hold. What would likely happen is that you'll have to pay a fee to travel at cer

  • A third is still a ton of accidents saved, injuries and deaths avoided.

  • Even if every individual autonomous vehicle operates very well, we will run into synchronization problems where masses of them do not operate well together. Cars today are more reliable than ever, but they still fail. Autonomous vehicles, especially mass-produced ones, will fail; parts will fail. Problems will happen, incompatibilities will happen, especially when there is a high density of autonomous vehicles.

    I also don't think that autonomous vehicles are a panacea for traffic. You can only fit so man

    • You can all but eliminate the equipment failure problem the way they do in Germany, with intensive vehicle inspections. They check for things like rusting suspension arms, loose wheel bearings, contaminated brake fluid... Granted, this comes with additional cost, but you can expect it to be part of the future for more nations, if not all.

      As for traffic, you're entirely correct. Self driving vehicles may actually increase traffic, both by giving more people access to automobile transportation, and by vehicle

      • That doesn't really account for the things you see at scale with mass production. Things like infant mortality, manufacturing issues that only show up with extended vibration, etc. Does an inspection account for the MTBF of internal components? How does your older model with the slow processor handle the amount of data the new models with faster processors are sending? There are lots of scenarios that we will only learn about by practice. There is no panacea here.

        • "That doesn't really account for the things you see at scale with mass production. Things like infant mortality, manufacturing issues that only show up with extended vibration, etc."

          It doesn't account for every instance, but it does account for most of them. The limitation is the manufacturer's level of responsibility when it comes to carrying out recalls. Design defects are supposed to be addressed that way. And I'm sure they do check whether your vehicle's recalls were addressed during inspection.

        • by kqs ( 1038910 )

          Sure, you can never solve this problem 100%, but solving it 99% is pretty damn good. Mix inspections with better instrumentation and part tracking, and the car will tell you most of the time when something is close to failure. And if a particular batch of parts fails more often than it should, then we can avoid even more failures.

          Humans can do this; commercial airplanes almost never fail (despite being far more complex than cars) because we have very detailed (human) inspecting and tracking. We could ext

          • Unfortunately in my experience 90% of the time when your instrumentation is telling you that a monitored component is failing, it's actually the instrumentation that's failing.

            • That's not a big problem when it comes to avoiding deaths, though. If you pull over because the car says it's about to fail and it isn't, the only loss is some time. If a component fails at speed because there is no monitoring, the penalty is much worse.

          • Commercial planes are not a great comparison; they are relatively far apart in the sky and don't often hit each other. There are lots of mechanical failures every year, it's just that most of them don't bring the plane down, and in those cases it's usually pilot skill that saves the plane.

  • What the report is actually saying is that self-driving cars would only be able to avoid 1/3 of accidents caused by unpredictable behaviors of other human-driven cars. Why make it misleading? Well, clickbait or maybe something deeper.

    financed by U.S. insurers

    Now, if they found that self-driving cars would prevent nearly all accidents, do you think insurance companies would be happy? I don't think, "you could eviscerate our industry" makes for a good public report. What if they just spin it?... Aaaannd here we are.

    • Some of the major safety organizations in the United States:

      UL does electrical safety, and especially avoiding fires caused by electrical problems (Stores won't sell anything that isn't UL listed, UL rated, etc).

      The National Fire Protection Association created the fire code, which greatly reduced death, injury, and damage from fire, and continues to do so.

      IIHS is the preeminent organization for automobile safety, with ratings and standards that more accurately reflect real-world conditions than the NHTSA t

  • by petes_PoV ( 912422 ) on Sunday June 07, 2020 @12:15PM (#60156272)
    Even a 33% reduction sounds pretty good, especially when you consider that the technology is only going to get better from that figure going forward.

    However self-drive has many other advantages. One is indemnity. The person in the vehicle is not at fault if an accident happens. That lifts many restrictions on people in the car. Apart from the obvious ones regarding states of intoxication, it also means that individuals who would otherwise be unable to drive can become mobile.
    That would include people with disabilities as well as minors, or even the elderly who simply don't feel confident to drive any longer.

    In addition I can even see situations where you would send a car out to the store to pick up an order. Or for the store to send their vehicle to a customer's address when their stuff is ready.

    Self-drive, even at the levels touted, will be a game-changer... when it works!

  • That headline should read:

    Self driving Cars Would CURRENTLY Solve a THIRD of All Crashes!!!

    A third less lives lost, a third less damage done, a whole third!

  • No mention of DUI: alcohol, drugs or a cell phone
    No mention that not everyone has the requisite skills to operate a motor vehicle, night or day, in crowded traffic, in all weather conditions. Yet we hand a DL to anyone that can pass a basic driving test and never test them again.

  • About 1.35 million people die every year in car crashes. Only a few make headlines... When a human makes a mistake we are far more tolerant, so much so that we accept death as a consequence. Sure, we take comfort in punishing humans, but we also forgive them for being human, for having faults and for making mistakes.

    But we are never going to accept self-driving cars on a large scale into our lives as anything other than a curiosity, unless it pays money directly into our pockets. That's why it takes one

    • "You'll hate they day you thought this was a good idea, and then you hate yourself, because you cannot accept their deaths and all you can do, that is left to do, is to tolerate it."

      You can make the same argument for motorcars in general. But if the self driving cars mean one third less deaths, you can go ahead and hate yourself one third less.

      • You can make the same argument for motorcars in general. But if the self driving cars mean one third less deaths, you can go ahead and hate yourself one third less.

        No. When you kill someone then it doesn't matter if you do it with a car or a gun or bare hands. It's you who is doing it and you will be held responsible.

        With self-driving cars, however, we are trying to remove any responsibility from ourselves for killing others. We want to allow machines to kill humans. If this is what you really want, then you might as well call the software of your self-driving cars "SkyNet 1.0".

        • "When you kill someone then it doesn't matter if you do it with a car or a gun or bare hands. It's you who is doing it and you will be held responsible.
          With self-driving cars, however, we are trying to remove any responsibility from ourselves for killing others. We want to allow machines to kill humans."

          Liability is very much at the root of problems for the self-driving vehicle industry, alongside the technical ones. Who is going to be responsible for deaths, and to what degree? I can tell you that I for one

          • But having said all of that, if the result is actually a large reduction in deaths, you are clearly ahead.

            If it were that easy to jump to such a conclusion, then why are we still using weapons? Why haven't we given up on them? Clearly, weapons kill a lot. The answer is that we don't really care that much about the deaths, we even enjoy watching them in movies, but we do care about why someone died. So when we allow a machine to kill and there is nobody to blame, then we've lost some of our humanity.

            Or as the saying goes, "When nobody is to blame, everyone is to blame."

  • Does anyone think it's odd that a study funded by insurance companies finds that 2/3 of crashes would still happen, thereby justifying their continued existence? It's possible it's valid, but it would be good to see a third party validate the findings (apologies if they did and I missed it).
    • Does anyone think it's odd that a study funded by insurance companies ...

      Yes, I'm afraid so. They make money when there are no crashes and they pay when there are. They've become very good at knowing the factors involved and analysing accidents for the sake of their profits. If anyone is going to profit from accident-free cars, it's car makers and insurers.

  • [F]ound the remaining crashes were caused by mistakes that self-driving systems are not equipped to handle any better than human drivers.

    Except that where humans can't learn from their mistakes, these systems do, and as technology progresses they will also be able to handle a lot of those situations. Yes, you will never be able to prevent all deaths (directly related to human drivers), but I'll bet it will be reduced by at least 90-95%.
    But let's not forget who did the study: insurance companies. It's not in their interest to tell us that using self-driving cars is much safer, because then they would not be able to sell such expensive insurance policies.

      Except that where humans can't learn from their mistakes, these systems do, and as technology progresses they will also be able to handle a lot of those situations.

      The problem is that you hold machines to a very high standard. You believe they could possibly be near perfect, perhaps even perfect? ...

      It's this very belief of yours that, when an accident does happen, will make you judge the machine in the harshest way.

      You'll have to allow a machine to occasionally kill someone and look away when it does. Are you prepared to do this?

      • Nope, I don't think machines can be near perfect, but I certainly believe they can be much, MUCH better at driving than we humans can. The problem in most cases is that a human driver just loses focus for a moment (due to being distracted by whatever, these days their f-ing mobile), and that's something that actually can't happen in a self-driving car (except when it's defective). And yes, it will happen that a machine will kill someone, but that will happen way, WAAAAAAY less than with human drivers. At lea
        • So what you're saying is that you want to be relieved from your responsibility and let a machine take it. In short, you're saying "It wasn't me, it was the car!"

          Or are you going to take the responsibility for when your self-driving car kills somebody?

  • Humans as a whole don't improve (individuals do); AI learns new cases and improves for everyone. Think about driving in snow: some people are great at it, the rest not so much. AI can learn it for everyone.
  • Why even bother? Pack it up everyone, we're going home.
  • Because we've seen proof that self-driving isn't anything even CLOSE to perfect.
