Transportation AI Businesses Google United States Technology

German Auto Firms Face Roadblock In Testing Driverless Car Software 177

An anonymous reader writes: As nations compete to build the first operational autonomous car, German auto manufacturers fear that current domestic laws limit their efforts to test the appropriate software for self-driving vehicles on public roads. German carmakers are concerned that these roadblocks are allowing U.S. competitors, such as Google, to race ahead in their development of software designed to react effectively when placed in real-life traffic scenarios. Car software developers are particularly struggling to deal with the ethical challenges often raised on the road. For example when faced with the decision to crash into a pedestrian or another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in the situation. 'Technologically we can do fully automated self-driving, but the ethical framework is missing,' said Volkswagen CEO Martin Winterkorn.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by gstoddart ( 321705 ) on Thursday March 26, 2015 @02:08PM (#49348161) Homepage

    So, disregarding how the self-driving car decided who it is best to kill in any given situation, for me the biggest problem with self-driving cars is legal liability.

    If Google wants to sell autonomous cars, Google should be liable for anything the damned thing does.

    And none of this cop-out where, if the computer doesn't know what to do, it just hands control back to the human -- because that's pretty much guaranteed to fail, since the human won't be able to make the context switch in time (if at all).

    As far as I'm concerned, the autonomous car has to be 100% hands off by the user at all times, and the company who makes the damned thing is 100% responsible for what it does.

    Why the hell would someone have to pay for insurance for something they have no control over?

    • So what happens when the brake assist in a modern car causes an older car to rear-end it on the highway? Who is liable, as it was technology that caused the accident, not either driver?

      These things are already dealt with in modern countries, but let's pretend that all the years of driving-related liability rulings never happened.

      • Same thing that happens when a modern car with brake assist rear ends an old car with better brakes and traction.

        If your car has shitty brakes you leave extra room. Good drivers realize that 'shitty brakes' is always relative.

        If you're driving a crazy high performance car you moderate your brake use to avoid being rear ended.

        • If your car has shitty brakes you leave extra room. Good drivers realize that 'shitty brakes' is always relative.

          Sounds to me like the solution to the problem in question - a computer could periodically recompute the envelope of possible scenarios and never drive into points in the phase space from which it can't recover without hitting someone or something.
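
          A minimal sketch of that "recoverable envelope" idea, purely illustrative: flat road, a single obstacle, and made-up deceleration and latency numbers. A real planner would also model friction, sensor noise and other road users.

          def stopping_distance_m(speed_mps, decel_mps2=6.0, latency_s=0.2):
              """Travel during control latency plus braking from speed to rest."""
              return speed_mps * latency_s + speed_mps ** 2 / (2.0 * decel_mps2)

          def recoverable(speed_mps, gap_to_obstacle_m, margin_m=2.0):
              """True if the car can still come to rest without contact."""
              return stopping_distance_m(speed_mps) <= gap_to_obstacle_m - margin_m

          def plan_speed(desired_mps, gap_to_obstacle_m, step_mps=0.5):
              """Back the target speed off until the state is recoverable again."""
              v = desired_mps
              while v > 0 and not recoverable(v, gap_to_obstacle_m):
                  v -= step_mps
              return max(v, 0.0)

          print(plan_speed(desired_mps=30.0, gap_to_obstacle_m=60.0))  # 25.0 m/s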

        • by mjwx ( 966435 )

          Same thing that happens when a modern car with brake assist rear ends an old car with better brakes and traction.

          If your car has shitty brakes you leave extra room. Good drivers realize that 'shitty brakes' is always relative.

          If you're driving a crazy high performance car you moderate your brake use to avoid being rear ended.

          This.

          You also don't have to be driving a crazy high performance car to get good braking. Just get some performance pads, rotors, good tyres and maybe some braided brake lines and you can make a Toyota Corolla stop like a sports car. Your 0-100 time will still be crap but 100-0 will be amazing. You don't even need to fit six piston callipers.

          You've got to understand your car. Sadly this is something most people never learn. They get in it every day but don't understand where the edge of the envelope is. W

      • Except they're not the same.

        If you have a human driving, you usually know who to blame.

        If you have a computer driving, the people who made the computer sure as hell aren't going to take liability.

        But you can bet your ass some sleazy lawyer will put it into the EULA that by driving in a Google car you assume all liability.

        If they're going to make autonomous cars, they pretty much need to be 100% autonomous, with the humans effectively in the back seat with no controls.

        At present, there simply ARE no liabilit

        • If you have a human driving, you usually know who to blame.

          Which, to me, is a horrible way of looking at things. If that were the only criterion, we could easily end up with ten times more car deaths simply because we're more comfortable with putting the blame on people, even at the expense of lives.

            • Welcome to a world with lawyers and liability laws. Someone is always to blame.

            And, as I said, you can bet your ass Google et al are going to try to make sure it's you and not them.

            • by sl149q ( 1537343 )

              And that is why we have insurance.

              Why do you assume that insurance would not be available? And if insurance is available why is this an issue?

              Sure if there are multiple vehicles and multiple insurance companies they might have a proxy battle to determine the answers to these questions.... but that is how issues like this have been sorted out for the last couple hundred years and will continue to be sorted out. Once you have precedents the insurance companies just adjust their rates accordingly.

      • by suutar ( 1860506 )

        So, falling back to first principles... the following car should be prepared for the lead car to do pretty much anything, including drop a bumper (which will come to a stop far faster than either car can brake). If you're not leaving enough room to avoid hitting a fallen bumper, you're too close. Follower at fault, next case.

        • A dropped bumper will tend to slide along the ground and will likely go off the side of the road before the approaching car gets anywhere near it (the coefficient of friction of steel or plastic vs rubber will show you that -- rough numbers sketched below). However, when a tire falls off the lead car and the disc brake is digging into the road surface instead, that will stop rapidly.

          However, that was just an example. There are others for current "autonomous" features in cars. How about the speed adjusting cruise control (when it malfuncti
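
          For what it's worth, the friction argument is easy to put rough numbers on. A back-of-envelope sketch only: the friction coefficients below are illustrative guesses, and only the formula d = v^2 / (2 * mu * g) for something sliding or braking to rest is standard physics.

          G = 9.81  # m/s^2

          def slide_to_rest_m(speed_mps, mu):
              """Distance for something decelerating purely by friction mu."""
              return speed_mps ** 2 / (2.0 * mu * G)

          v = 30.0  # ~108 km/h
          for label, mu in [("plastic/steel bumper skidding", 0.4),
                            ("tyre braking on dry asphalt", 0.8)]:
              print(f"{label:30s} mu={mu:.1f}  ~{slide_to_rest_m(v, mu):.0f} m to stop")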

          • by bws111 ( 1216812 )

            Every single one of your examples is just bad driving, and has nothing to do with autonomous features in cars.

            Cruise control malfunctioning is no different than the car in front slowing. It is your job to notice and react. If you 'don't have time' then you are tailgating.

            4 wheel drive? What does that have to do with acceleration?

            ABS - yes, that is what it does. If you are driving in snow, leave more room. Not that complicated.

            Traction control - yes, that is why it has an OFF switch

            • Did you miss the first two words on the cruise control line? Speed Adjusting Cruise Control is the new hot thing; it adjusts the speed of the vehicle automatically to match the vehicle in front. If it were to malfunction you may not know in time to react.

              4 wheel drive ONLY affects acceleration, what else would it have control over?

              You are missing the point, all of these systems (besides 4WD...) are being integrated into an auto driving car, so we have to look at the failure modes to understand where the liab

              • by bws111 ( 1216812 )

                It does not matter if the speed control is speed adjusting, it is still YOUR responsibility to maintain safe distance.

                4 wheel drive does not affect acceleration, it affects traction. The only way to blame 4WD for unintended acceleration is if you were planning on spinning your wheels, which again is just shitty driving.

                And, no, I am not missing the point, you are. The 'failure modes' you listed, whether or not some technology was involved, are failures of the DRIVER, and that is where the liability reside

                • by suutar ( 1860506 )

                  For cases where the only difference is who was making the decisions, I'd say liability should be with the manufacturer. Other cases are similar to existing with-driver cases: parts failure (manufacturer), skipped maintenance (owner), poorly performed maintenance (shop that did the work).

                  In the end, like with FAA investigations, everything boils down to pilot error and equipment failure, and in many cases "not dealing with equipment failure properly" is considered pilot error. The only question is who's the

        • Dunno how it is in the US, but here in the Netherlands you are obligated to prevent parts from dropping off your car. Also, the quite thorough yearly tests check for such cases.
          The first car would be liable.
          In practice it is impossible to find liability in such a case. If I drive behind a car that looks like it's going to lose parts I'll keep an appropriate distance. It doesn't happen all that often.

      • by bws111 ( 1216812 )

        Huh? It certainly is the fault of the driver in the older car.

        • Not always. If I slam on my brakes (ABS, electronic brake distribution, traction control, etc) and someone rear ends me, I could be at fault for overbraking even if it was needed to prevent me hitting someone/something.

          • by bws111 ( 1216812 )

            Where? Everywhere I have been the driver in back is always at fault in rear-end collisions.

      • FYI ABS brakes were available in Europe for years before the USA *for just that reason*
    • Re: (Score:2, Funny)

      by afidel ( 530433 )

      Why the hell would someone have to pay for insurance for something they have no control over?

      Says every parent of a teenager since cars became widespread.

    • by TWX ( 665546 )
      There's always a degree of responsibility and liability if one owns something. I own a ladder. If someone uses that ladder at my house with my permission and the ladder breaks and injures them, even if it has not been abused, my homeowners' insurance policy is probably going to end up having to pick up the tab.

      I fully expect that insurance for completely autonomous cars will be less expensive, once self-driving cars are proven. To prove them, I expect large fleets sponsored by the manufacturer or syst
      • I fully expect that insurance for completely autonomous cars will be less expensive, once self-driving cars are proven.

        Again, why would I pay liability insurance to cover the actions taken by a computer?

        The only viable business model for fully autonomous cars I can see is essentially as taxis.

        The notion that we're all going to trade in our cars and let the computer do all the driving is laughable -- too many people like driving, and there's decades worth of cars out there. The notion that we'd buy a self d

        • by itzly ( 3699663 )

          Again, why would I pay liability insurance to cover the actions taken by a computer?

          Because that's the most pragmatic solution. If you don't like it, don't get a self driving car (and probably pay even more insurance).

          • by sl149q ( 1537343 )

            Exactly. Do you think you ARE NOT paying insurance when you are in the back of a taxi? It's just that the fare reflects the operating cost. And part of the operating cost is the insurance. And you really want the driver to HAVE insurance. So you pay the fare, which pays the insurance.

            How is that different from buying insurance to cover the self driving car you buy or lease or rent to get you from point a to point b. The insurance is there to make sure that any parties injured in a collision (including yoursel

      • by khasim ( 1285 )

        To prove them, I expect large fleets sponsored by the manufacturer or systems integrator will drive many thousands of hours per-car to establish a baseline, similarly to how an MTBF is established for devices, and that rate of collision or other liability-causing event will factor into the insurance companies' rates for those cars.

        I think it will be even easier.

        The autonomous cars will be packed with sensors that record EVERYTHING.

        If there is an accident then the insurance companies will know which car has
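
        Purely hypothetical numbers, but the "fleet baseline" idea quoted above is easy to sketch: treat liability-causing events as a rate over logged fleet miles, with the classic rule of three giving a rough upper bound when nothing has gone wrong yet.

        def observed_rate(events, miles):
            """Liability-causing events per million miles seen in the test fleet."""
            return events / miles * 1e6

        def rule_of_three_upper(miles):
            """With zero events over `miles`, the true rate is below roughly
            3/miles at ~95% confidence (the 'rule of three')."""
            return 3.0 / miles * 1e6

        print(observed_rate(events=2, miles=1_200_000))  # ~1.7 per million miles
        print(rule_of_three_upper(miles=1_200_000))      # ~2.5 per million miles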

        • by TWX ( 665546 )
          Cars are very complex machines that can have loads of things go amiss with them without rendering them undrivable, and can have loads of other things go wrong while in operation. You're correct that it's safe to assume everything will be recorded, but I expect equipment failures will plague first-generation autonomous cars once they're old and the tolerances have loosened up. Steering, tires, brakes, suspension alignment, all things that will lead the computer astray as it's attempting to self-drive.
          • by dave420 ( 699308 )
            The computer is in a very good position to know when those components develop faults, as it knows with far greater precision when it takes more effort/time to perform some function (turning, braking, accelerating, etc.). Monitoring the other components is also trivial, and a computer can do a much better job than a human.
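
            A toy illustration of that point, not any manufacturer's actual diagnostic -- the class name, window size and threshold are invented: keep a running ratio of achieved deceleration to brake command and flag the brakes when it drifts well below the car's own healthy baseline.

            from collections import deque

            class BrakeHealthMonitor:
                def __init__(self, window=500, degraded_fraction=0.7):
                    self.samples = deque(maxlen=window)  # recent decel-per-command ratios
                    self.baseline = None                 # long-run "healthy" ratio
                    self.degraded_fraction = degraded_fraction

                def record(self, brake_command, achieved_decel_mps2):
                    if brake_command > 0.05:             # ignore idle/noise samples
                        self.samples.append(achieved_decel_mps2 / brake_command)

                def commission(self):
                    """Call once after a known-good break-in period."""
                    self.baseline = sum(self.samples) / len(self.samples)

                def degraded(self):
                    if self.baseline is None or len(self.samples) < 50:
                        return False
                    recent = sum(self.samples) / len(self.samples)
                    return recent < self.degraded_fraction * self.baseline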
    • for me the biggest problem with self-driving cars is legal liability.

      Why is this a problem? Several states already allow self driving cars on the road (although with a driver in the seat for now). The liability issue is already resolved. The party responsible is the insurance company. Duh.

      The only thing that changes with SDCs, is that the insurance will likely be much cheaper. Not just because accidents will go down, but also because the camera, gps, and sensor data will make it very clear what happened, eliminating disputes over the facts, so legal costs will be much l

      • by bws111 ( 1216812 )

        You seem highly confused as to what insurance is. Here is a clue: the 'party responsible' is NEVER the insurance company. The 'party responsible' is YOU, the insurance company is just providing the cash for you.

        It's cute that you think cameras, gps, and sensor data will make it very clear what happened or eliminate disputes.

        • The 'party responsible' is YOU, the insurance company is just providing the cash for you.

          As long as the insurance company is paying, why should I care who is "responsible"?

          It's cute that you think cameras, gps, and sensor data will make it very clear what happened or eliminate disputes.

          In many countries, insurance companies offer discounts for anyone using a dash cam. Why? Because cameras reduce disputes, thus lowering legal costs.

          • by bws111 ( 1216812 )

            Seriously? First, your liability does not end where your coverage does. If you are under-insured, it is you who is responsible. Second, the insurance company does not pay out of the goodness of their hearts, they pay because you pay them to. And if you have a claim, you will pay more.

            The lower insurance rates with dash cams are more about fraud detection than dispute resolution.

    • by itzly ( 3699663 )

      The point is moot. Either the owner pays the insurance, or the owner pays Google, and then Google will pay for the insurance.

      As long as insurance companies are willing to provide the insurance, the finer liability issue isn't important.

    • for me the biggest problem with self-driving cars is legal liability.

      This is already covered. Brakes fail, tires blow out, mechanical failures happen. They kill people. It's been something that has happened plenty and gone through the courts many times. Precedent has been set.

    • by mjwx ( 966435 )

      So, disregarding how the self-driving car decided who it is best to kill in any given situation

      This old chestnut needs to die.

      Rules for this already exist, it's just that human drivers don't follow them. An autonomous car will be programmed to take the course that causes the least damage and is the most legal. So they would choose a rear end crash over a right angle crash because a rear ender presents the lowest risk of casualties. If you think that veering out of your lane to avoid a rear end crash is
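
      A sketch of what "least damage and most legal" could look like as code -- the structure and every number here are invented for illustration; a real system would estimate risk from physics and sensing, not a hand-written table.

      from dataclasses import dataclass

      @dataclass
      class Maneuver:
          name: str
          casualty_risk: float  # estimated probability of serious injury, 0..1
          illegal: bool         # breaks a traffic rule (leave lane, cross line, ...)

      def choose(maneuvers):
          # Primary key: expected harm. Tie-breaker: prefer the legal option.
          return min(maneuvers, key=lambda m: (m.casualty_risk, m.illegal))

      options = [
          Maneuver("brake hard, accept rear-end hit", casualty_risk=0.05, illegal=False),
          Maneuver("swerve across lanes, risk right-angle crash", casualty_risk=0.30, illegal=True),
      ]
      print(choose(options).name)  # -> brake hard, accept rear-end hit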

  • Humans are unable to make moral decisions in a few milliseconds. They would either freeze for at least one second and hit the next car or pedestrian, depending on which comes first. If they have more time, they would try to avoid collision with the human and hit the car, because you cannot really see other people in there and you do not know how many persons are in there. Also people in the car are better protected. So the safest thing is to hit the car. But beside that, people know when approaching a truck trail

    • Humans are unable to make moral decisions in a few milliseconds. They would either freeze for at least one second and hit the next car or pedestrian, depending on which comes first. If they have more time, they would try to avoid collision with the human and hit the car, because you cannot really see other people in there and you do not know how many persons are in there. Also people in the car are better protected. So the safest thing is to hit the car. But beside that, people know when approaching a truck trailer and they cannot stop, they should aim for the wheels and not the section in the middle. However, most people are unable to implement that, so why should cars be able to do these things?

      You have hit on one of the key reasons why trying to implement human reasoning in an emergency is a problem; especially since it's usually a subconscious reaction to avoid hitting the bigger, scarier thing. You can train people to make calm decisions in an emergency situation but that takes a lot of simulator time and practice; something most drivers sorely lack before getting a license. If you wanted to follow the human reasoning it would simply be "CRAAAP.... AVOID HITTING THE BIG THING...DAMN... A PEDESTRIAN ... OH WE

    • by Livius ( 318358 )

      I would go further and say that those who believe that ordinary people think rationally and/or ethically in the split second of crisis in an imminent collision are possibly sociopaths and should not be allowed near critical software that will make these kinds of decisions.

  • As opposed to all the laws and regulations making driverless cars difficult to test in the US? Google has to pay someone to sit in the front seat so they can take over from the computer (that can make better decisions faster than a human).

    What regulations concern them so much (I didn't see any listed in the article), and how do they differ from the US regulations (as if the US were some lawless state..)?

    • by prefec2 ( 875483 )

      The thing is quite simple. They have tested such cars in Germany. Therefore, it is possible to do so. They can also test their shiny new thing in German traffic, as long as a human backup is sitting in the car. And testing it in a German or other European inner city should be challenging enough for now. However, they want to show how innovative they are and not get surprised by the Japanese again, as it was with the hybrid. So actually this is mostly advertisement.

  • The anti-driverless-car crowd always loves to bring up the situation that they think the human handles well, but the computer does poorly - the given example of hitting a pedestrian vs a family in a car. This of course ignores the fact that 99% of the time humans have no idea if the car has a family in it, or a single neo-Nazi.

    But the self-driving cars ARE capable of hitting the brakes quicker and more reliably (avoiding skidding) than a normal human would

    Think about it if it were the other way around - what if humans were crappy about deciding to hit the pedestrian but computers had incredibly slow reflexes and took ten times as long to decide to hit the brake? Given that example we would laugh and say no way would we let anyone with slow reflexes drive a car.

    But we already do that - we let human reflexes drive a car (even if they have had one drink 30 minutes ago, slowing them down). The question is not, and never has been, will computers be perfect drivers. Instead the question is: will they do better than humans in most situations?

    And that is something that we likely can do within the next couple of years, if we can't already do that.

    So stop being obstructionist idiots bringing up rare, never-seen-in-the-real-world situations, and talk about what actually happens.

    • I somewhat agree... but the problem is not 'is a driverless car better than a human', it's: who do we sue when something goes wrong?

      In the proposed hypothetical, whoever gets hit is going to be suing someone. Who do they sue? The owner of the car (even if they weren't in the car at the time?), the "passenger", or the company that makes the vehicle?

      I tend to think that it will be the owner - and the owner will need to have insurance to cover whatever the autonomous car could do. There is no way a company like G

    • by prefec2 ( 875483 )

      This is totally true. No human ever was able to make a moral determination and act on it in an accident situation. Besides that, the pedestrian versus the family in the car is quite simple: the family is protected by their car, the pedestrian is not. It would be a different thing choosing between one pedestrian or a group of pedestrians. Most humans would freeze and hit whoever comes first. End of story. The car could try to reduce victims.

    • As soon as any autonomous car advocates start talking about 'what actually happens' the conversation can start in earnest.

      But for now, all we have is Google's marketing BS and some DARPA challenges that paint a much less rosy picture.

      • by Qzukk ( 229616 )

        As soon as any autonomous car advocates start talking about 'what actually happens'

        Why yes! Just the other day a baby stroller magically appeared 2 feet in front of me while I was doing 90mph on the local autobahn, forcing me to make a snap decision between creamed baby or ramming the President's car which was carrying a gaggle of pregnant neurosurgeons to a peace conference that just happened to be in the other 5 lanes of the freeway and the shoulders and the sidewalks and the ditches, all at the same time

    • by mjwx ( 966435 )

      The anti-driverless-car crowd always loves to bring up the situation

      The pro-driverless-car crowd always loves to ignore the fact that the autonomous car won't be driverless for decades. A human will still be required to oversee it and, in case of a failure, take control of the vehicle.

      The big problem with this is that people will be taking manual control because the autonomous car will abide by the rules that human drivers like to ignore, like keeping a safe distance, not driving in the passing lane, keeping to the

  • For example when faced with the decision to crash into a pedestrian or another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in the situation. 'Technologically we can do fully automated self-driving, but the ethical framework is missing,' said Volkswagen CEO Martin Winterkorn.

    We learn how to drive in driving school and not how to crash. In a situation like the above, most people will hit the next thing in front of them, regardless of what or who it is.
    The faster the car goes, the less likely anyone is to avoid an obstacle.

    If anything, a machine could improve things by being able to react in time.
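
    Rough numbers behind both points -- reaction distance grows linearly with speed and braking distance quadratically; the deceleration and reaction times below are illustrative, not measured values.

    def total_stop_m(speed_kmh, reaction_s, decel_mps2=7.0):
        """Reaction travel plus braking distance on dry pavement (assumed)."""
        v = speed_kmh / 3.6
        return v * reaction_s + v ** 2 / (2 * decel_mps2)

    for speed in (30, 50, 100):
        human = total_stop_m(speed, reaction_s=1.2)    # typical human reaction time
        machine = total_stop_m(speed, reaction_s=0.2)  # sensing + compute latency
        print(f"{speed:>3} km/h: human ~{human:.0f} m, machine ~{machine:.0f} m")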

    • We learn how to drive in driving school and not how to crash. In a situation like the above, most people will hit the next thing in front of them, regardless of what or who it is.

      I remember stats showing drivers going into a tree at high speed are more likely to aim (unconsciously) for the passenger side - drivers have a higher chance of survival than front passengers.

      >> same moral reasoning a human would in the situation

      This is some hilarious shit right there. There are no higher mental functions involved in a crash; it's all instinct, pre-learned behaviour and reflex.

  • While they are trying to change the laws back home they might as well do their development and testing in the United States. We currently have fewer restrictions here.

    I agree with gstoddart about autonomous cars being able to be 100% hands off by the user at all times for normal driving regimes. If the companies that make them do it right then they should not be afraid of being 100% responsible when the vehicle is in autonomous mode. Some computer modeling of autonomous vehicles has shown a major drop in

  • 'Technologically we can do fully automated self-driving, but the ethical framework is missing

    Ethically we can allow fully automated self-driving, but the Technological framework is kinda missing

  • Why this obsession with moral reasoning on the part of the car? If self-driving cars are in 10x fewer accidents than human-driven cars, why the requirement to act morally in the few accidents they do have? And it isn't as if the morality is completely missing, it is implicit in not trying to hit objects, be they human or otherwise. Sure, try to detect which objects are human and avoid them at great cost, but deciding which human to hit in highly unlikely situations seems unneeded and pe

    • Instinctively this is what we humans do already -- try not to hit anything, but save ourselves as a first priority. In the few near misses (near hits) I've had, I never find myself counting the number of occupants in the other car as I make my driving decisions.

      It's a horribly stupid breakpoint in any case. The truth is that if you swerve to avoid a school bus and run over an old person on the sidewalk, they're just going to do you for running over the old person because you weren't supposed to be on the sidewalk. Meanwhile, you weren't supposed to be outdriving your vision or your brakes, so you wouldn't have had to dodge the school bus anyway. The self-driving car won't outdrive its vision or brakes, so it won't have to swerve. If someone jumps in front of it, i

  • by SuperKendall ( 25149 ) on Thursday March 26, 2015 @02:58PM (#49348741)

    The summary doesn't really explain why that dilemma is harder for German companies to solve than American companies.

    For Americans, the answer is: always hit the pedestrian(s). What the hell was anyone doing outside of a car?

  • by Tom ( 822 ) on Thursday March 26, 2015 @03:13PM (#49348881) Homepage Journal

    For example when faced with the decision to crash into a pedestrian or another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in the situation

    Or maybe it would follow better moral reasoning. Ours is not perfect, it's just whatever evolution came up with that gave us the best species survival rates. That doesn't mean it's really the most ethical solution.
    For example, in a post-feminist society - let's assume for argument's sake that gender discrimination has been overcome - wouldn't we also do away with "women and children first", which is a suitable survival approach for a species fighting for survival on the African prairie, but hardly for a dominant species that is already overpopulated?

  • by flopsquad ( 3518045 ) on Thursday March 26, 2015 @04:39PM (#49349729)
    1) Cars are not technologically at a point where they have omnipresent awareness of the constituents of every vehicle around them and the locations of every pedestrian (add in crowded street-facing cafés, structural members for buildings, and everything else you could possibly think of). Neither, for that matter, are people.

    2) The most brilliant philosophers still disagree over the ethics of choosing who dies when someone's gotta go. See also the Trolley Problem, most other ethical dilemmas, and generally the eternal struggle between various consequentialist and deontological systems of ethics.

    3) This precise scenario is highly contrived and seems (1st approximation) to be vanishingly rare.

    Given the above, maybe the question shouldn't be if a robot can make a perfect (or any) ethical decision. Maybe for now it should just be if the robot can do better than a human at not killing anyone at all in these sorts of situations. Maybe "I did my best to protect my owner from death and just happen to average out safer for everyone" will have to be ok for now.
    • The whole ethics framework debate is a straw man (computer) argument. It's patently obvious people don't make those split-second judgement calls. The real reason the Germans are on sound moral grounds is that autonomous cars are nowhere near commercial prime time, even on sunny-day, clear-traffic, straight highways. Dirty sensors, unpolished code with bugs, proper reliable extraction of features, sensor failures, intelligent prediction of object locations, prediction and proper avoiding of road hazards, and many mor
      • Not sure if the "you" in your post was me or the Googles of the world making self-driving cars. If it's me, I'll just point out that I never proposed that handwringing over decisional ethics was the one thing holding SDCs back.

        My point was that questions like the one in TFS are masturbation. The question ought to be: are we at a point where they're safer (in aggregate) than humans, driving in real-world conditions? You and I both agree that currently the answer is no. For optics and liability reasons, t
        • I didn't mean to imply you specifically, but people in general.
          I pretty strongly object to testing in real-life situations when the populace has an expectation of safety. The Google car is perhaps the most advanced in the world, yet it is not able to function safely in city driving. Google themselves admit it's not ready or it would be rolled out as a product. It's a far cry from teams of engineers, programmers, and scientists fussing over every last detail, planning routes where only expected problems (if any) are in ideal si
  • Let's say they manage to program the car so that it can calculate which course of action will cause the least injuries/fatalities. Now you get into a situation where the only two options available are a.) evade some obstacle on the road, but thereby hit a group of five pedestrians, quite possibly severely injuring or killing them or b.) hit the obstacle, quite possibly killing the driver (you). You are alone in your car.

    Now, would you drive such a car which sometimes can decide, with cold, pure Vulcan logic

  • by aepervius ( 535155 ) on Thursday March 26, 2015 @05:32PM (#49350145)
    Those made-up examples with the family van and the walker are laughable. Firstly, in my experience you do not have time while in an accident to think that far, as in checking to see whether the family van is full of children. You steer, brake, and do as much as you can to avoid the walker. And the van. Secondly, do you really think there is even a contest? The walker will take the full hit of the kinetic energy in his body. The van will absorb part of it in its structure. What sort of SICK FUCK would hit the walker rather than the van because there are children in the van? There is no photo finish: you steer to avoid everything, and if you cannot, you try to hit the target which will take the least physical damage from your damn car *if you can think that fast*. Does not matter if the walker is a 99-year-old guy and the van is full to the brim with puppies and babies. I expect any automated car to make the same damn calculation: between hitting a car and a walker, go for the car. Hit the one with the most protection.
    • by Delgul ( 515042 )

      Hmm... Call me a sick fuck then, because rather than killing myself, my wife and my children by driving into that oncoming 2-ton truck, I would choose to hit the pedestrian every day of the week! And to add a bit of fuel to the discussion, I (and I suspect there are many that agree with me) would only ever buy a car that makes decisions that take ME into account first and THEN starts looking at minimizing collateral damage. But, like you said, I'm a sick fuck for thinking this apparently...

  • "For example when faced with the decision to crash into a pedestrian or another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in the situation."

    No, a self driving car shouldn't get into that situation in the first place. The right thing to do here is to anticipate events and slow down. Self driving cars have a huge advantage here, in that they don't get tired or lose attention over time.
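
    One concrete way to read "don't get into that situation": cap speed so the car can always stop within the road it can currently see. A hedged sketch with placeholder deceleration and latency values, not anyone's real parameters.

    import math

    def max_speed_for_sight(sight_m, decel_mps2=6.0, latency_s=0.2):
        """Largest v with v*latency + v^2/(2a) <= sight_m (positive quadratic root)."""
        a = 1.0 / (2.0 * decel_mps2)
        disc = latency_s ** 2 + 4.0 * a * sight_m
        return (-latency_s + math.sqrt(disc)) / (2.0 * a)

    for sight in (20, 50, 120):  # metres of clear, sensed road ahead
        print(f"{sight:>3} m visible -> at most {max_speed_for_sight(sight) * 3.6:.0f} km/h")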

  • For example when faced with the decision to crash into a pedestrian or another vehicle carrying a family

    Um, there is no question here, for several reasons.

    First, if the situation is so immediate that your only two options are to hit a vehicle or hit a person, it's highly unlikely you have time to peer into the other vehicle and count its occupants.

    Second, most vehicles on the road today have lots of safety features; if they are being used (seat belts fastened, airbags not disabled, etc), most crashes are highly survivable; most pedestrian-vehicle crashes are far, far less so for the pedestrian (excepting very low speed nudged someone i
