AI Transportation

Are Major Legal Changes Needed for the Driverless Car Era? (bbc.co.uk) 110

Long-time Slashdot reader Hope Thelps brings news about the future of self-driving cars. "The law commissions of England and Wales and of Scotland (statutory bodies which keep the laws in those countries under review) are recommending a shift in accident liability away from 'drivers' when autonomous cars become a reality."

The BBC reports: Human drivers should not be legally accountable for road safety in the era of autonomous cars, a report says. In these cars, the driver should be redefined as a "user-in-charge", with very different legal responsibilities, according to the law commissions for England and Wales, and Scotland. If anything goes wrong, the company behind the driving system would be responsible, rather than the driver....

In the interim, carmakers must be extremely clear about the difference between self-drive and driver-assist features. There should be no sliding scale of driverless capabilities — a car is either autonomous or not....

Transport Minister Trudy Harrison said the government would "fully consider" the recommendations. The Scottish and Welsh governments will also decide whether to introduce legislation.

The BBC also summarized some of the report's other recommendations:
  • Data to understand fault and liability following a collision must be accessible
  • Sanctions for carmakers who fail to reveal how their systems work

This discussion has been archived. No new comments can be posted.

Are Major Legal Changes Needed for the Driverless Car Era?

Comments Filter:
  • Your driverless car screws up, hits someone, and kills them. Who is going to jail for manslaughter?
    • When you describe a manslaughter event, we might try to answer you.

      Someone getting killed != manslaughter.

      Maybe try thinking about this before posting, yeah?
      • You're being an obtuse pedant. Dial down your autism knob.

        The legal definition of "manslaughter" is

        'the crime of killing a human being without malice aforethought, or otherwise in circumstances not amounting to murder.'

        A driverless car killing someone can fit that definition, and a court of law might well find that it always does.

        • by dryeo ( 100693 )

          Isn't manslaughter usually a crime that requires intent to injure but not kill, like punching someone and accidentally killing them?
          I think the crime would be something like criminal negligence causing death, which is roughly as serious as manslaughter.

        • by Zemran ( 3101 )
          A car cannot be expected to act with malice aforethought, but you also omit intention, which needs to be present, and a car is equally incapable of intention. So a car cannot commit manslaughter but a human can. The human may also be liable for a whole raft of other crimes that result in a death, which in this instance would most commonly include negligence.
      • Re:Who is at fault? (Score:4, Informative)

        by hey! ( 33014 ) on Sunday January 30, 2022 @12:59PM (#62221359) Homepage Journal

        I think what the poster is talking about is what is called "criminally negligent homicide" in many jurisdictions. That's a better term in any case because it makes clear what we have to clarify: what are the duties of the manufacturer of the system and the operator of the system to protect others?

        It's pretty clear that in a world of self-driving cars *some* instances of systems killing people would be unfortunate accidents nobody could have anticipated. Some foreseeable accidents may be matters for civil litigation. But I expect that we'll eventually require a certain minimum standard of safety performance, and failing to meet those standards will make any resulting deaths criminal matters. But those standards don't exist yet, so we can't answer the poster's question yet.

        • by Zemran ( 3101 )
          Given that "homicide" is a US word, not many other jurisdictions. Most countries only refer to murder where the intent was to kill and manslaughter where in the commision of an illegal act someone is killed. So if you meant to rob but someone dies it is manslaughter but what if you only meant to sit in you car on while it drives you to work? You are not doing anything illegal and therefore murder or manslaugher are not relevant. If you should be paying more attention then it is still the lack of attenti
    • Perhaps no one? (Score:5, Interesting)

      by Okian Warrior ( 537106 ) on Sunday January 30, 2022 @12:04PM (#62221199) Homepage Journal

      Your driverless car screws up, hits someone, and kills them. Who is going to jail for manslaughter?

      It's not clear that anyone should go to jail for manslaughter.

      Tesla has a ton of statistics that show autopilot (ie assisted) driving has about 1/9 the accident rate of human driving. We can argue the details, the fact that Tesla doesn't show its data or that assisted driving is only used in simple conditions, but in general it seems reasonable to expect that autonomous driving will be safer than human driving by a wide margin.

      Whenever an aircraft comes down, the NTSB sifts through the circumstances to determine the root cause, and makes recommendations to prevent future accidents of the same type. Many of those accidents result in changes to various design features of the aircraft, instrument software, or procedures.

      It seems reasonable that when a driverless car hits someone, no one will go to jail for manslaughter, and some agency will analyze the stored vehicle data and make software recommendations that will be sent out via OTA updates.

      As a society, we can justify this if autonomous driving is sufficiently safer than human driving such that the total number of deaths is much lower from the start. Then autonomous driving will become safer over time.

      Also, insurance agencies would be on-board with this: fewer payouts under autonomous driving.

      I'm reminded of the US electrical code, which has undergone a bunch of revisions over the last several decades. The original vacuum cleaner with 2-prong plug could have a short in the motor, electrifying the metal case, and would kill the user when they brush up against a radiator. Three-prong plugs largely eliminated that problem, and a host of changes, tech advances (GFIs, breakers), and minor "best practices" have made your house electrical system remarkably safe.

      I get it, having an autonomous vehicle kill someone sounds really scaaary, but when you think about it for a few moments you realize that accepting that risk actually makes everyone safer.

      The important bit in the previous sentence: think about it for a few moments.

      • Re:Perhaps no one? (Score:4, Insightful)

        by phantomfive ( 622387 ) on Sunday January 30, 2022 @12:18PM (#62221239) Journal

        Tesla has a ton of statistics that show autopilot (ie assisted) driving has about 1/9 the accident rate of human driving.

        It's a clear example of lying with statistics.

        We can argue the details

        Details matter quite a bit.

        • Tesla has a ton of statistics that show autopilot (ie assisted) driving has about 1/9 the accident rate of human driving.

          It's a clear example of lying with statistics.

          I'm sure you have a ton of clear and convincing evidence to be making such categorical statements, so please present it. Or GTFO.

      • The problem is that there's a root cause for every car crash - the driver, car, road, etc. We're willing to accept that humans are fallible and even call car crashes "accidents" even though that's far from the truth. As soon as the car is under autonomous control we're no longer willing to accept that fallibility - the automation system must be perfect. Autonomous driving has the ability to make the roads safer, but people are not rational. A legal change that enshrines the fallibility of autonomous dr
        • Who is this "we" that are not able to accept that fallibility?

          It can't be "people", because there's plenty of people eager to buy and abuse the current, unreliable systems in ways that are clearly warned are unsafe. And plenty of people willing to sell them.

          I think most people are perfectly aware that life is full of risk, they just mostly choose not to think about it and be shocked and appalled when it inevitably bites them in the ass.

          Plus, the media is desperate to make low-budget "news" out of whatever

        • People have died in vehicle accidents because of mechanical failures for over a century. So what you describe is nothing new.

          The manufacturer will compensate the victim if a design flaw causes the accident.

          No one will go to jail unless negligence can be proven. Nobody expects SDVs to be perfect. They have already killed people. HDVs kill many more.

      • I'm sure that insurance companies would love to pass on the liability burden to the manufacturers, but will manufacturers accept it?
      • by bws111 ( 1216812 )

        Wow, that is amazing! You mean a system that can only be used under ideal conditions, AND has a human monitoring it who will (hopefully) take action before an accident occurs, has a better record than humans?

        And even if you ignore those 'details', what they are comparing themselves to are not 'average' drivers, they are comparing to the worst drivers. How does Tesla compare when you include driving in all conditions and exclude beginning drivers, drunks, and texters? Because if I am going to let my car do the driving I want it to be as good as or better than ME, not the worst drivers out there.

      • Your driverless car screws up, hits someone, and kills them. Who is going to jail for manslaughter?

        It's not clear that anyone should go to jail for manslaughter.

        Tesla has a ton of statistics that show autopilot (ie assisted) driving has about 1/9 the accident rate of human driving. We can argue the details, the fact that Tesla doesn't show its data or that assisted driving is only used in simple conditions, but in general it seems reasonable to expect that autonomous driving will be safer than human driving by a wide margin.

        We absolutely can't expect this. Common sense suggests that autopilot will be disproportionately used in low-risk environments. It could very well be that the human accident rate would be similar or lower. It's also quite plausible that many of the "human driven" accidents were due to the autopilot putting the vehicle in a dangerous situation that the human then unsuccessfully tried to correct.

        Third-party researchers need to see the full data; Elon Musk simply doesn't have the credibility for us to take his w

      • You do not get credit for all the times you did not kill someone.

        This is a complex area and IANAL.

        But generally (and there are exceptions, and some things do not seem to be common sense) the idea is that if you do something in a way that could be foreseen by a reasonable person to end in a fatality, you (the person in control - whoever that is) are possibly guilty of causing death by dangerous driving.

        So far the person in control is the driver.

        IMNSHO, the only place autonomous driving will ever be allowed is i

      • "Tesla has a ton of statistics that show autopilot (ie assisted) driving has about 1/9 the accident rate of human driving, if you don't correct for confounding factors such as that tesla AP will only work in good weather and is not used where most accidents per mile occur"

        And if they are comparing to the general accident rate then you need to account for the fact that the average Tesla driver is middle class, middle aged and driving on highways, not a couple of drunk teenage rednecks hooning around the back

      • by Zemran ( 3101 )
        We are talking about the UK, not the US, where companies are liable for the crimes they commit and are prosecuted. The UK is far less corrupt, although not perfect. You say that the Tesla is responsible for far fewer deaths as if that means it is OK for a Tesla to kill. That is a very weird attitude. If a Tesla kills a person there should be an investigation, and if it is shown that there was insufficient care in a process during the design then there should be a legal charge. If they do their job correctl
  • Pilots in command (Score:4, Insightful)

    by E-Lad ( 1262 ) on Sunday January 30, 2022 @11:49AM (#62221171)

    With aircraft, there is always a Pilot In Command. The PIC (and, really, the entirety of the flight crew) are still responsible for everything an aircraft does, even if it's on autopilot. The PIC must still pay attention and monitor both systems and the situation at all times. If the autopilot being active is not appropriate for the situation or is operating the aircraft unsafely, it's still the PIC's responsibility to assume control and (attempt to) correct the situation. You just cannot assume that the autopilot will always do the right thing.

    I really don't know why this concept hasn't been applied to driverless cars. The infrastructure for truly driverless cars does not exist on all of the roads, nor is it standardized, even though the mechanisms might be, so attempting to completely absolve the quasi-non-driver of responsibility for the operation of the vehicle is wondrously dumb. This is especially true since driving environments will be mixed with drivered and driverless vehicles for quite some time and the same hazards that exist today will exist for just as long.

    • by ColdBoot ( 89397 )

      spot on. can't believe they only gave you a 2 for this

      • Re: (Score:2, Insightful)

        by Joce640k ( 829181 )

        spot on. can't believe they only gave you a 2 for this

        It's a 3 now, but still... the idea of making the manufacturers liable is the most stupid idea in the history of humanity. It's basically an open invitation for ambulance-chasing lawyers to destroy what should be a huge advance in humanity.

        The only thing that needs to be demonstrated is that robodrivers are statistically better than human drivers. That's it.

        Human drivers are out there right now causing all sorts of carnage. All we need to do is improve on that. There can be several hundred fatal accidents p

        • The only thing that needs to be demonstrated is that robodrivers are statistically better than human drivers. That's it.

          I don't think that should be the case; it should be that they followed proper safety standards, for example properly tested the software. Even if the chances of killing someone are less than with a human driver.

          Think of it this way: after self-driving cars are common, would it be acceptable for a manufacturer to release a knowingly substandard system that is twice as likely to kill someone as the other automated driving systems, but still less likely to kill than a human driver? Should they be able to evade liability?

          Or

          • Think of it this way: after self-driving cars are common, would it be acceptable for a manufacturer to release a knowingly substandard system that is twice as likely to kill someone as the other automated driving systems, but still less likely to kill than a human driver?

            The answer is "yes", but it's a good point.

            Unlike humans, where new idiots are being given licenses every day, self-driving cars will be a continually improving thing. When one car receives an update to make it smarter, they all get smarter. All at once.

            In this new world order it's ridiculous to think that every manufacturer will be at the exact same level WRT their software and it's just as ridiculous to think that some of them should be more liable than others.

            I'm pretty sure most manufacturers have a vir

        • Yeah, no corporation has ever tried to cut costs &/or increase profits by cutting corners on health & safety, then trying to cover up, deny, &/or blame the public for the consequences, have they?
        • by dryeo ( 100693 )

          The only thing that needs to be demonstrated is that robodrivers are statistically better than human drivers. That's it.

          You need to consider circumstances: they're already safer in some circumstances, such as a quiet, well-marked highway. What about a rural road with no markings and crappy mapping? In the snow? And did the manufacturer know about the shortcomings and not mention them?
          Otherwise it's like the guy up the page claiming that Teslas are 1/9th as likely to be in a crash, without any information on circumstances for either the Teslas or the non-Teslas.
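
          The confounding argument in this subthread is essentially Simpson's paradox. A minimal sketch with entirely made-up numbers (nothing below is real Tesla or regulator data) of how a pooled "1/9th the accident rate" comparison can arise even when the automated system is worse on every road type:

          # Hypothetical crash rates per million miles, by road type and driver.
          crash_rate = {
              "highway": {"autopilot": 0.5, "human": 0.4},  # humans slightly better
              "city":    {"autopilot": 4.0, "human": 3.0},  # humans better here too
          }

          # Hypothetical mileage mix: Autopilot mostly logs easy highway miles.
          miles = {
              "autopilot": {"highway": 9_000_000, "city": 1_000_000},
              "human":     {"highway": 2_000_000, "city": 8_000_000},
          }

          def pooled_rate(driver):
              crashes = sum(crash_rate[road][driver] * miles[driver][road] / 1e6
                            for road in crash_rate)
              return crashes / (sum(miles[driver].values()) / 1e6)

          print(f"autopilot: {pooled_rate('autopilot'):.2f} crashes per million miles")  # ~0.85
          print(f"human:     {pooled_rate('human'):.2f} crashes per million miles")      # ~2.48
          # Pooling makes Autopilot look roughly 3x safer purely because of where it
          # is used, even though it is worse than the humans on both road types here.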

    • Re:Pilots in command (Score:4, Informative)

      by BloomFilter ( 1295691 ) on Sunday January 30, 2022 @12:05PM (#62221203)
      Exactly right! As is obvious to anyone who has "driven" a self-driving car in busy city traffic where people walk, or in rainy weather, cars CANNOT safely drive themselves anytime soon - despite what Elon says.
    • Re:Pilots in command (Score:5, Interesting)

      by Freischutz ( 4776131 ) on Sunday January 30, 2022 @12:15PM (#62221227)

      With aircraft, there is always a Pilot In Command. The PIC (and, really, the entirety of the flight crew) are still responsible for everything an aircraft does, even if it's on autopilot. The PIC must still pay attention and monitor both systems and the situation at all times. If the autopilot being active is not appropriate for the situation or is operating the aircraft unsafely, it's still the PIC's responsibility to assume control and (attempt to) correct the situation. You just cannot assume that the autopilot will always do the right thing.

      I really don't know why this concept hasn't been applied to driverless cars. The infrastructure for truly driverless cars does not exist on all of the roads, nor is it standardized, even though the mechanisms might be, so attempting to completely absolve the quasi-non-driver of responsibility for the operation of the vehicle is wondrously dumb. This is especially true since driving environments will be mixed with drivered and driverless vehicles for quite some time and the same hazards that exist today will exist for just as long.

      If you are the DIC (Driver in Command) and have to sit there clutching the steering wheel with your foot on the brake, ready to intervene at a fraction of a second's notice in case the AI messes up, because you are legally liable for what happens when an AI you didn't write does mess up and kill somebody, what is your motivation to buy an AI-driven car? You might as well save your money, skip the AI feature and drive the car yourself. The whole point of having a self-driven car is snoozing in the back seat while the thing drives you from LA to New York. As for why this concept hasn't been applied to AI-driven cars, that's simple. In an aircraft you have miles of separation and you are being monitored by ATC. You have plenty of warning when a collision or any other incident is imminent. If your co-pilot doesn't warn you, the ACAS will; if the ACAS fails you, the ATCO will warn you. In a car you have none of that: separation is on the order of tens of meters at best, usually less than that, and when you are burning down the highway at 150 kph your reaction time is measured in two or three seconds at best, usually less than a second. Quite frankly, if car makers can't guarantee an accident rate as good as or better than manually driven cars in all environments and under all conditions (in the UK it is 1 in 366 for every 1600 kilometres driven), they are shit out of luck. AI-driven cars will have to beat those odds unless we are talking about trucks and buses driving on separate dedicated lanes.

      • by tlhIngan ( 30335 )

        If you are the DIC (Driver in Command) and have to sit there clutching the steering wheel with your foot on the brake, ready to intervene at a fraction of a second's notice in case the AI messes up, because you are legally liable for what happens when an AI you didn't write does mess up and kill somebody, what is your motivation to buy an AI-driven car? You might as well save your money, skip the AI feature and drive the car yourself. The whole point of having a self-driven car is snoozing in the back seat whi

        • by nasch ( 598556 )

          Anything short of Level 5 autonomous driving requires a driver at the wheel.

          Not quite true, depending on what you mean by "at the wheel". At level 4 the driver need not be prepared to take over driving.

          The very best technology out there right now is at level 3.

          Who has level 3?

          But there is tremendous value in Level 2 autonomy

          It's also quite dangerous, because it's asking humans to do something they're terrible at, and will generally do either poorly or not at all.

          We aren't getting Level 5 anytime soon, so the dream of LA to New York in the back seat will likely be had by paying an Uber driver or other thing.

          Level 4 seems pretty plausible though, and that would be really nice for long trips.
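
          For reference, the level numbers being traded in this thread come from the SAE J3016 scale. A rough paraphrase (my wording, not the official definitions) of who has to watch the road at each level:

          # Rough paraphrase of the SAE J3016 driving-automation levels; the one-line
          # summaries are approximations for this discussion, not the official text.
          SAE_LEVELS = {
              0: "No automation: the human does all the driving.",
              1: "Driver assistance: one function automated (e.g. adaptive cruise); the human drives.",
              2: "Partial automation: steering and speed automated; the human must supervise at all times.",
              3: "Conditional automation: the system drives in limited conditions; the human must take over on request.",
              4: "High automation: the system drives itself within its operating domain; no takeover expected there.",
              5: "Full automation: the system drives anywhere a human driver could.",
          }

          for level, summary in sorted(SAE_LEVELS.items()):
              print(f"Level {level}: {summary}")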

      • by mjwx ( 966435 )

        With aircraft, there is always a Pilot In Command. The PIC (and, really, the entirety of the flight crew) are still responsible for everything an aircraft does, even if it's on autopilot. The PIC must still pay attention and monitor both systems and the situation at all times. If the autopilot being active is not appropriate for the situation or is operating the aircraft unsafely, it's still the PIC's responsibility to assume control and (attempt to) correct the situation. You just cannot assume that the autopilot will always do the right thing.

        I really don't know why this concept hasn't been applied to driverless cars. The infrastructure for truly driverless cars does not exist on all of the roads nor is it standardized, even though the mechanisms might be, so attempting to completely absolve the quasi-non-driver from responsibility is for the operation of the vehicle is wondrously dumb. This is especially true since driving environments will be mixed with drivered and driverless vehicles for quite some time and the same hazards that exist today will exist for just as long.

        If you are the DIC (Driver in Command) and have to sit there clutching the steering wheel with your foot on the brake, ready to intervene at a fraction of a second's notice in case the AI messes up, because you are legally liable for what happens when an AI you didn't write does mess up and kill somebody, what is your motivation to buy an AI-driven car? You might as well save your money, skip the AI feature and drive the car yourself. The whole point of having a self-driven car is snoozing in the back seat while the thing drives you from LA to New York. As for why this concept hasn't been applied to AI-driven cars, that's simple. In an aircraft you have miles of separation and you are being monitored by ATC. You have plenty of warning when a collision or any other incident is imminent. If your co-pilot doesn't warn you, the ACAS will; if the ACAS fails you, the ATCO will warn you. In a car you have none of that: separation is on the order of tens of meters at best, usually less than that, and when you are burning down the highway at 150 kph your reaction time is measured in two or three seconds at best, usually less than a second. Quite frankly, if car makers can't guarantee an accident rate as good as or better than manually driven cars in all environments and under all conditions (in the UK it is 1 in 366 for every 1600 kilometres driven), they are shit out of luck. AI-driven cars will have to beat those odds unless we are talking about trucks and buses driving on separate dedicated lanes.

        First of all, DICs is a great description of Tesla drivers. Secondly, if the software screws up, then it's not the driver's fault _IF_ it can be demonstrated that the driver could not have reasonably taken action to avoid it. Ultimately you are responsible for the vehicle, but not for design faults. Hence we don't blame the pilots for the two 737 Max crashes; those pilots did everything they could to stop the planes from crashing due to a design fault. However I doubt driverless cars will be here in my lif

    • by burtosis ( 1124179 ) on Sunday January 30, 2022 @12:16PM (#62221231)
      First off, there are no obstacles in the air during the time autopilot is engaged, and teams of people with multimillion-dollar equipment track and warn planes if they get too close. This is why autopilot has existed for decades and driverless cars still do not exist.

      Second, in a commercial plane you have a team in the cockpit. Even professional pilots aren’t able to stare off into the blue for a thousand hours on end and then respond 100% reliably in a split second to an emergency. Humans simply can’t do that by themselves; it requires redundancy. Worse, driving manually all the time is usually not even enough to reflexively do the right thing in a split second, and letting the computer drive is simply going to further reduce those skills, to the point where it’s just passing the buck on liability. It’s insane, like some kind of sick combination of an ADD test and Russian roulette.

      Third, the ability of the computer to drive needs to be independent of other vehicles controlled by humans. The funny thing is people see computers doing math far faster and with virtually no mistakes and mistakenly think this means computer-controlled driving is perfect and beyond human ability. Despite having ridiculous sensor advantages far, far exceeding those of any human or indeed any creature ever, the lowest-ability humans vastly outperform current systems with two crappy stereo cameras, two low-bandwidth microphones, a low-resolution six-axis gyro/accelerometer, and haptic feedback from the Rube Goldberg interface they are required to use. It’s clear to anyone in the field that not only are we still decades away from driverless cars but that our approaches need a massive overhaul, because we can’t even replicate what simple organisms can do with orders of magnitude less processing power.
      • The human brain is about 6 petaflops. Tesla's onboard computer is about 10 teraflops.

        • As much as we all would like to assume it, the worst human drivers are not simple organisms.
        • by djinn6 ( 1868030 )

          Source?

          It takes me a minute or two to perform a single floating point operation. That sounds like something on the order of 10 milliflops rather than petaflops.

          • It’s based on the number of neurons and the perceived dependencies between them and the processing power required to simulate those neurons. However it is clear that there are functions and dependencies left out of the models, and further that the simulation of them is terribly computationally inefficient. Combine the inefficiency in simulation with the larger brains in nature and it’s clear we won’t be matching them for decades, perhaps centuries. Additionally, you perform highly compl
      • This is why autopilot has existed for decades and driverless cars still do not exist.

        Driverless cars do exist. Waymo is operating a completely self-driving taxi service in Phoenix. No "safety drivers"; there is no one at the wheel. They still have limitations, but they do operate autonomously, and safely.

        • Yes, in a single city, in an extremely limited area that is massively pre-mapped, with no rain, no snow, and they move slowly. Any small change that’s unexpected and they cannot handle it at all [theverge.com]. It’s so cost-ineffective and unpopular that there are no plans to expand it, doubly so when a slightly dirty sensor completely shuts it down, or worse.
      • by AmiMoJo ( 196126 )

        In fact aircraft do have anti-collision systems that are independent of ground control. It's quite clever, when two aircraft detect that they might cross paths they agree between themselves that one will go up and the other will go down.
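
        A toy sketch of that "one goes up, one goes down" coordination. This is only an illustration of the idea; it is not the actual TCAS II resolution-advisory algorithm, and the tie-break on aircraft ID is invented for the example:

        # Toy model of coordinated vertical resolution advisories between two
        # aircraft on a collision course. NOT the real TCAS II logic.
        def resolution_advisories(alt_a_ft, alt_b_ft, id_a, id_b):
            """Return (advisory for A, advisory for B), each 'CLIMB' or 'DESCEND'."""
            if alt_a_ft > alt_b_ft:
                return "CLIMB", "DESCEND"    # keep the higher aircraft higher
            if alt_a_ft < alt_b_ft:
                return "DESCEND", "CLIMB"
            # Same altitude: both sides must pick complementary manoeuvres, so
            # break the tie on an identifier both can see (an invented rule).
            return ("CLIMB", "DESCEND") if id_a < id_b else ("DESCEND", "CLIMB")

        print(resolution_advisories(34_000, 33_600, "A1", "B2"))  # ('CLIMB', 'DESCEND')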

    • Re:Pilots in command (Score:5, Interesting)

      by Baconsmoke ( 6186954 ) on Sunday January 30, 2022 @12:19PM (#62221245)
      So, I'm not disagreeing with you outright, but I think there are some possible key differences here. A pilot always has the ability to re-engage with controlling their aircraft. We don't want to confuse Tesla's current (or near-future) autopilot with where vehicles might end up 10-20+ years down the road. A truly autonomous vehicle may never have the ability for a person in the vehicle to re-engage with control. People could become so completely accustomed to not driving their vehicle that the ability for them to safely monitor the vehicle will be lost as a skill. Stop and think about human psychology for a moment. The first time you get in an autonomous vehicle you're going to be watching like a hawk. By the 1000th time, you really aren't, and you'll have subconsciously trained yourself to not pay attention anymore. Now think about the 2nd or 3rd generation of people who have known nothing else but autonomous driving. If that's the case then I could foresee that AI/computers will become far better and safer drivers and the human element will be completely removed. Once that happens, then it would be correct to remove human liability from the people in the vehicle. Yes, there's a lot of conjecture in my comment, but I'd be willing to wager that my estimation of how humans will interact with their vehicles is close to the mark.
      • More importantly, a plane at cruising altitude generally has some time between something going wrong and crashing into the ground. That time is much, much longer than human reaction times. A human can be startled back to awareness by an alarm, figure out the problem, and hopefully react appropriately. Pilots also get quite a bit of training in how to react.

        In a lot of car situations, you are seconds away from a crash. I'm not sure there's enough time to pull a human into the loop.
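
        A quick back-of-the-envelope on how little time that can be, with purely illustrative numbers:

        # Back-of-the-envelope: time to impact at motorway speed vs. human reaction time.
        # All numbers are illustrative assumptions, not measurements.
        speed_kph = 110                       # typical motorway speed
        gap_m = 40                            # distance at which a hazard becomes visible
        reaction_s = 1.5                      # commonly cited perception-reaction time

        speed_ms = speed_kph * 1000 / 3600    # about 30.6 m/s
        time_to_impact_s = gap_m / speed_ms   # about 1.3 s

        print(f"{time_to_impact_s:.1f} s to impact vs ~{reaction_s} s just to react")
        # Little or no margin is left for a disengaged 'user-in-charge' to take over.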

    • An airplane has one pilot and dozens of passengers. It's a mass transit system. Cars are generally meant to transport individuals or, at most, small groups of individuals.

      You can't have tens of millions of people all responsible legally for the actions of the driverless car they bought. It would completely defeat the purpose of driverless cars which is to be able to get in your car and not have to pay attention to the road.

      Honestly we're just going to see car companies buy laws to indemnify consumers and
    • by Immerman ( 2627577 ) on Sunday January 30, 2022 @12:28PM (#62221265)

      >I really don't know why this concept hasn't been applied to driverless cars.

      Because that would completely defeat the point of autonomous cars?

      An aircraft (or boat) autopilot isn't (traditionally) anything like a self-flying plane - generally speaking it maintains heading and altitude regardless of anything that might be in the way. The closest car equivalent is probably cruise control, maybe with a light splash of lane-keeping assistance.

      With such an autopilot you *obviously* need someone maintaining regular attention - but obstacles are so few and far between, and generally visible far in the distance, that it doesn't have to be continuous close attention, so pilots can usually safely focus on other things so long as they just keep a watchful eye on plane and sky.

      On the road, you can't get away with that. Obstacles are dense, plentiful, and can be hidden from view until seconds before a collision. So you need to have a responsible party maintaining a continuous high level of attention. The only way a human driver can realistically do that is if they're actually in control of the vehicle so that the attention is necessary. We're just really not built for maintaining focused attention without a feedback loop.

      So, the only way a self driving car actually makes sense is if you can safely stop paying attention to the road entirely - essentially handing over control of the vehicle to a robot chauffeur. And if you're doing that, then just like a human chauffeur, it only makes sense that the one actually doing the driving must take legal responsibility. Only with an autonomous robot, the one actually doing the driving is whoever programmed the robot and sold it claiming it was fit for purpose.

      Any other interpretation is an invitation to companies to sell grossly inadequate self-driving cars, as they are legally shielded from any responsibility for the collisions those cars will inevitably cause.

      • by E-Lad ( 1262 )

        I think you've explained why truly driverless cars in today's driving environments, where the operator is absolved from any fault in accidents, are not possible.

        • Why?

          So long as the "person" driving the car (the company selling the autonomous driving system) *is* held fully legally responsible, I see no problem.

          Basically, if you put legal liability on a major corporation with their deep pockets, I think a combination of ambulance chasers, unsympathetic juries, and a requirement to make a complete sensor and software log of the event promptly available to the prosecutors should do just fine to enforce adequate safety.

          Now, that would mean that almost certainly nobody w

    • Are Major Legal Changes Needed for the Driverless Car Era?

      No. Major technical advancements are needed for the Driverless Car Era.

      Driverless car technology is turning out to be our latest pinnacle of mount stupid. Like fusion technology, which is just 20 years away and always will be, driverless tech is going to be deployed "any time now" and always will be.

      So yes, the whole idea of a "Driver" in charge is a good point, but, to be honest, premature. The technology itself just isn't there and I don't think it will be for another actual generation and then only wi

      • by AmiMoJo ( 196126 )

        Waymo has level 4 working, fully autonomous cars with no human driver, but only in limited areas. Their progress is steady and the technology looks like it will work just fine.

        Most importantly they aren't basing it on "AI" nonsense, trying to train an adversarial network to drive. They use algorithms, and actually understand how their system operates.

        • Waymo has level 4 working, fully autonomous cars with no human driver, but only in limited areas.

          And... you just made my exact point.

          Their progress is steady and

          Everyone's progress is steady. Until, for example, you stop looking at their weasel worded press releases and look at their actual SEC filings and realize they have had nowhere near the level of engagement with regulatory authorities that they should have given the state of readiness they want you to believe they are at.

          ...the technology looks like it will work just fine.

          Sure it does. In, as you say, "limited areas" which are carefully groomed not to have anything which will confuse their AI. This is akin to programmer

    • With aircraft, the autopilot is a way to make the trip easier during the "boring" parts of the flight, but the pilot could hand fly the plane all the way if he wanted to do that. It probably is currently possible to build "pilotless" planes (not remote control drones but actual autopilot-only planes), but I doubt a lot of people would want to fly in an airplane that is controlled by computer only with no human to oversee it.

      However, some people dream about "driverless" cars that have no steering wheel and a

    • by kbg ( 241421 )

      There really is a complete difference between a plane autopilot and a car autopilot. You don't expect humans to jump in front of your plane while it's flying, and maneuvering in almost completely empty space is a lot simpler than maneuvering in changing road conditions on the ground.

      And this is why complete autopilot for a car is never going to work. Are you really willing to go to jail for manslaughter because there was a software bug in your car? Now you might say that you should always be watching the road when using

    • by sjames ( 1099 )

      Actually, that principle *IS* being applied for now.

      The question is what is the way forward as self driving systems actually become capable of driving autonomously as opposed to the sometimes advanced driver assist technology in play today.

      Who gets the summons if one of the experimental autonomous Ubers causes an accident? In those, there is no human in the car with access to the controls. Do we even allow that sort of thing going forward?

    • Comment removed based on user account deletion
    • Ok, now apply your concept to a car driving without anyone inside (no PIC onboard). You scheduled your car to pick up your wife from her work at 5pm. On the way to her workplace the car kills someone. Oh, and the car's tires are worn just below the minimum (they were legal at the start of the journey), and the car had a flat and your son replaced one of the tires with a temporary doughnut tire. Your parent is the legal owner of the car, and the car is 15 years old and no longer under warranty. Who is respon

    • by bgarcia ( 33222 )

      With aircraft, there is always a Pilot In Command.

      Allow me to introduce you to these new contraptions called "drones". Many of them will fly pilotless until they arrive at an area of interest.

  • by unfortunateson ( 527551 ) on Sunday January 30, 2022 @11:55AM (#62221179) Journal

    a bill helping solo owner-operators of long-haul trucks finance upgrades to driverless, support open (vs corporate) driverless truck stops, etc.

    Without that, trucking will overwhelmingly be taken over by big corporations that will price out the solo owner-operator (because of access to capital), and lock them out of autonomous refueling, servicing and recharge (for when the electric trucks come).

    Solo truckers are a big piece of the small businesses in this country, and will be pretty much wiped out by automation without this.

    • Because they will undercut the smaller guys on pricing and we don't enforce antitrust laws in any way shape or form. The market consolidation isn't going to stop no matter what we do, we need to start thinking about how to reorganize things to work around it.

      Capitalism is breaking down like a car that hasn't had an oil change in 5 years. We need to do the necessary maintenance or we need to accept that we're going to become a fascist oligarchy with the trappings of freedom but not actual freedom.
    • Why should autonomous ownership be incentivized towards solo owner/operators ? (And to clarify - this is an honest question, not an attack or rhetorical question) It seems like it would be most efficient for big companies to own many trucks, that way the overhead is minimized and cost of trucking is minimized. As long as there is some competition (e.g. some number of large autonomous trucking operators, probably > 3) this seems much more efficient than 100,000s or millions of solo owner operators? Cert
      • by psergiu ( 67614 )

        1) Because, at least in the US, the same thing will happen when there are just a few major competitors: price fixing. See ISPs and mobile telephony in the US vs. countries with real competition.
        2) With only a few major trucking companies, it only takes a few "woke" people at the top for them to refuse to carry livestock (because we all should be vegetarians to save the planet) or ammunition (because guns are bad, mmkay) or to blacklist a particular company for any reason. Then what is your recourse?

    • a bill helping solo owner-operators of long-haul trucks finance upgrades to driverless, support open (vs corporate) driverless truck stops, etc.

      Without that, trucking will overwhelmingly be taken over by big corporations that will price out the solo owner-operator (because of access to capital), and lock them out of autonomous refueling, servicing and recharge (for when the electric trucks come).

      Solo truckers are a big piece of the small businesses in this country, and will be pretty much wiped out by automation without this.

      I'm not certain that solo owner-operator truckers make a lot of sense post-automation.

      Right now, they make sense because the major input for trucking is the hours of the driver and there's not a lot of extra benefit to be brought by a big office staff.

      But once there's automation? What are these solo operators going to be doing all day while their truck is out driving itself down the highway? I feel bad that they'll have to change careers, but I don't see a big need for truck drivers once the trucks drive th

    • by mjwx ( 966435 )

      a bill helping solo owner-operators of long-haul trucks finance upgrades to driverless, support open (vs corporate) driverless truck stops, etc.

      Without that, trucking will overwhelmingly be taken over by big corporations that will price out the solo owner-operator (because of access to capital), and lock them out of autonomous refueling, servicing and recharge (for when the electric trucks come).

      Solo truckers are a big piece of the small businesses in this country, and will be pretty much wiped out by automation without this.

      Erm, this has been and gone sunshine. Major corporations are already controlling most of the long haul logistics. This is why the EU has mandated maximum driving hours and other controls to ensure that drivers are not being overworked to meet deadlines.

      However drivers are the last people to be pushed out by automation for one simple reason. Liability. When stopping gracefully means just stopping in the middle of the road or at the very best pulling over and staying there until someone gets out there to f

  • In the interim, carmakers must be extremely clear about the difference between self-drive and driver-assist features. There should be no sliding scale of driverless capabilities — a car is either autonomous or not....

    That doesn't make sense. If there's no huge, complicated and extremely fuzzy legal scale of responsibility, of such labyrinthine complexity that it allows manufacturers' legal SWAT teams to easily blame the 'user-in-charge' for every single AI fuck-up, it's the manufacturers who'll end up being legally and financially liable for AI fuck-ups?!? Can't have that now, can we?

  • At least in the US, I'm guessing that even if you have an autonomous car, you'll still be guilty of DUI if you're in it and the car is driving itself. This is a country where possession of car keys anywhere near your car while you're inebriated is sufficient to convict, even if you're just getting your bag and coat out of the backseat before taking a cab home.
    • You don't get a DUI for being in a taxi, bus, or subway train if you're drunk, so why would you get a DUI for being in a self-driving car? Some states call it OUI, "Operating Under the Influence"; if you aren't operating the car, there's no OUI.
      • You don't get a DUI for being in a taxi, bus, or subway train if you're drunk,

        You do if you're the operator.
        • Technically, you wouldn't get the DUI for being in the vehicle, you would get the DUI for operating the vehicle.
          • and having a phone app or even summoning a robo-taxi can = operating / having control, and that = DUI

            • and having a phone app or even summoning a robo-taxi can = operating / having control, and that = DUI

              Is using your phone to call for a taxi considered DUI? If so, you live in a pretty bad legal system.

              • and a ROBO TAXI, not a TAXI with a driver,
                as in a taxi the DRIVER IS in control.

                But under the LAW it can be seen that access to the APP = in control.

  • The summary quotes, "Sanctions for carmakers who fail to reveal how their systems work."

    The car makers do not know how the systems work. This is a serious problem with modern AI. It is impossible to explain how or why they work or why they sometimes don't work.

    There is no need for a fund to enable independent truck drivers to upgrade or worry about displacing millions of truck drivers. There will be no fully autonomous road vehicles outside of highly controlled trials as long as there are human piloted vehicles on the road. It will not happen in my lifetime and probably not yours either. The press/media is just having another moral panic hysteria moment unsupported by science.

    • Congrats, you're part of a slowly but inexorably growing group of people who are realizing that so-called SDCs are just junk that will never work as advertised.
      The problem is the brain of a housefly is orders of magnitude more capable than any so-called 'AI' they keep trotting out, and for one simple reason: they cannot think, because we don't even understand how cognition works in a naturally-evolved biological brain, and any neuroscientist will confirm that. All the 'training data' in the world isn't eno
      • >it's not going to magically 'wake up' if you add enough hardware and software to it.

        Well, it *might*. As you point out yourself we have no idea how cognition works, and enough self-organizing complexity *might* be all it takes.

        However, even in the unlikely event that that is true, there's no particular reason to think we're anywhere *remotely* close to reaching "enough", especially not in the relatively tiny "brains" of a self-driving car.

        • Exactly. As someone pointed out a long time ago now, an amoeba has more cognitive ability than the best (so-called) 'AI'.
      • Congrats, you're part of a slowly but inexorably growing group of people who are realizing that so-called SDCs are just junk that will never work as advertised. The problem is the brain of a housefly is orders of magnitude more capable than any so-called 'AI' they keep trotting out, and for one simple reason: they cannot think, because we don't even understand how cognition works in a naturally-evolved biological brain, and any neuroscientist will confirm that. All the 'training data' in the world isn't enough, it's not Mycroft from The Moon Is A Harsh Mistress, it's not going to magically 'wake up' if you add enough hardware and software to it.

        I don't care if it "wakes up" or "can think". It doesn't take thinking and cognition to run simple image recognition and steering, and the bar isn't "zero mistakes", it's "less mistakes than a hairless monkey", so pretty low. And we already seem to be there, except for some corner cases (like driving in snow), but handling those corner cases doesn't require "waking up", it just takes collecting more data and spending some more man-hours on that. And steady progress is being made, with no reason to suppose i

        • the bar isn't "zero mistakes", it's "less mistakes than a hairless monkey"

          Government safety standards and insurance companies will decide that, not you.

    • This is a serious problem with modern AI. It is impossible to explain how or why they work or why they sometimes don't work.

      Can you explain how or why human brains work, or why they sometimes don't work?

  • Musk's self-driving cars have already killed one person [reuters.com] and had multiple instances [theguardian.com] of slamming into parked emergency vehicles who had their lights on. There's even a case against the driver of a Tesla who was using self-driving and killed two people [yahoo.com] as a result.

    If anyone thinks Musk's ego will let politicians write laws to hold his company liable for injuries and deaths resulting from his self-driving cars, they haven't seen how much money he'll bribe them with.

  • Sanctions for carmakers who fail to reveal how their systems work

    That's the thing: they're all already on record saying they have no idea how the so-called 'AI' (inappropriately named) generates the output it generates; it's literally a Black Box to them.
    Meanwhile, earlier this week there was a story posted here about how Waymo doesn't want any of their safety (or more like 'lack of safety', LOL) data revealed to anyone. Gee, I wonder why that is? Is it because SDCs are all smoke and mirrors and they're deathtraps?
    In an era where people are falling for grand troll-memes l

  • The car-makers will simply get the insurance that drivers or car owners have had to carry so far, and that is it. If a car-maker is large enough, they can found their own insurer for this. It will be cheaper overall, since self-driving will have significantly lower accident costs initially and a lot lower long-term. At least where sane insurance is in place today.

    For reference, the typical coverage for a car in, for example, Germany is "unlimited" and the mandatory minimum goes up to 7.5M Euro for damage to people.

  • No, everybody who is a road user or footpath user can trust implicitly all autonomous vehicles, no matter who the manufacturer is. Basically just slap a sticker on it saying AI and you're good to go.

  • Technological progress is increasingly impeded by people hell-bent on sucking liability money out of it before it has a chance to take root.
    Imagine if the internet in the early 90s was subjected to the kind of regulation that exists today and that which is being proposed.
    Imagine if the legislation and regulation surrounding cars today existed when Henry Ford was building Model Ts by the millions.

  • Self-driving cars have a different responsibility structure than human-guided vehicles. Who's at fault when one has an accident with another? Will the AI-powered vehicle be given preference? I say if they are to drive autonomously, and that's the only way self-driving cars have any value, they should ride on their own roads, similar to the cars-only lanes on the New Jersey Turnpike where trucks are not allowed.
  • In the future, if a car is completely autonomous, with some people predicting no steering wheel or option for manual driving, then the liability should be with the manufacturer, just as you would not prosecute a passenger in a cable car if it crashed. While it is still "driver assist" and people may need to take control, they still have responsibility.
  • But why is nobody talking about carless drivers?

    I think I'd support an initiative to allow self-driving cars on special roadways that the manufacturers and owners of self-driving cars fund themselves. I don't agree with using a public resource such as roads in a way that is hostile to pedestrians, cyclists, and old-fashioned person-driven vehicles. We already have trouble sharing the road between those three; add in a machine that can't even argue with human beings at the side of the road and it'll be chaos.

  • Vint Cerf, when I saw him at work during a conference six or eight years ago, was saying that Google found it was "better" to have NO PEDALS AND NO STEERING WHEEL in an autonomous car.

  • Of course changes are necessary. The first fundamental Law of Robotic Cars is that the company making it is never liable for anything. Didn't you read the EULA?
