
Society Will Accept a Death Caused By a Robotaxi, Waymo Co-CEO Says (sfgate.com)

At TechCrunch Disrupt 2025, Waymo co-CEO Tekedra Mawakana said society will ultimately accept a fatal robotaxi crash as part of the broader tradeoff for safer roads overall. TechCrunch reports: The topic of a fatal robotaxi crash came up during Mawakana's interview with Kirsten Korosec, TechCrunch's transportation editor, during the first day of the outlet's annual Disrupt conference in San Francisco. Korosec asked Mawakana about Waymo's ambitions and got answer after answer about the company's all-consuming focus on safety. The most interesting part of the interview arrived when Korosec brought up a thought experiment. What if self-driving vehicles like Waymo and others reduce the number of traffic fatalities in the United States, but a self-driving vehicle does eventually cause a fatal crash, Korosec pondered. Or as she put it to the executive: "Will society accept that? Will society accept a death potentially caused by a robot?"

"I think that society will," Mawakana answered, slowly, before positioning the question as an industrywide issue. "I think the challenge for us is making sure that society has a high enough bar on safety that companies are held to." She said that companies should be transparent about their records by publishing data about how many crashes they're involved in, and she pointed to the "hub" of safety information on Waymo's website. Self-driving cars will dramatically reduce crashes, Mawakana said, but not by 100%: "We have to be in this open and honest dialogue about the fact that we know it's not perfection."

Circling back to the idea of a fatal crash, she said, "We really worry as a company about those days. You know, we don't say 'whether.' We say 'when.' And we plan for them." Korosec followed up, asking if there had been safety issues that prompted Waymo to "pump the brakes" on its expansion plans throughout the years. The co-CEO said the company pulls back and retests "all the time," pointing to challenges with blocking emergency vehicles as an example. "We need to make sure that the performance is backing what we're saying we're doing," she said. [...] "If you are not being transparent, then it is my view that you are not doing what is necessary in order to actually earn the right to make the roads safer," Mawakana said.

  • by abulafia ( 7826 ) on Tuesday October 28, 2025 @10:15PM (#65757388)
    What she's really saying is they have a crisis PR firm on retainer, wargamed several different scenarios, and have detailed plans for when it happens.

    Because she's right, it will.

    And because it would be incompetence for them not to foresee and plan for it.

    The event that'll be really interesting is when some remote driver snaps and goes on a remote rampage. That'll freak out people in multiple different ways all at once.

    • Re: (Score:3, Informative)

      by Tony Isaac ( 1301187 )

      We will *actually like* the steep *reduction* in fatalities that occur as robotaxis become more common. Some studies already show robotaxis being safer than human drivers. Even if those are arguable now, as robotaxis get better, they will very much outperform humans on safety. There is nothing to dislike about that.

      • by Z00L00K ( 682162 )

        However, for cases where there are accidents and the robotaxi is found at fault, someone still has to take the penalty.

        • Of course, the liability issues will have to be worked out through court cases and appropriate laws. Automated vehicles besides robotaxis aren't new; there are automated bus and train systems everywhere. Liability is a thing for them too.

        • by Kisai ( 213879 )

          The reality is that robotaxis will be held 100% responsible for all accidents unless one is caused by someone inside the car tampering with it, which is likely how every investigation will go.

      • Some studies already show robotaxis being safer than human drivers.

        You should have said:

        Some studies already show robotaxis being safer than the average human driver.

        There is absolutely no way a current robotaxi is better than a LOT of human drivers out there. Humans can predict, robots cannot. Sure, the robotaxi has a far superior response time, but with the prediction that humans can do, there is less of a need for response times. There are situations where response times do not matter because if you didn't deal with the problem before it became a recognizable problem to the robot, it is already too late to do anything useful; although applying the

    • by Kisai ( 213879 )

      Ford Pinto gambit.

      "We think society will accept the small number of deaths for our shitty design"

      or the second amendment gambit

      "We think society will accept the small number of (daily) gun deaths, for our shitty design"

      Like, I get it, regulations are written in blood, but only in America is there this aversion to rejecting shitty designs until there is a literal swimming pool of blood.

  • by Cajun Hell ( 725246 ) on Tuesday October 28, 2025 @10:16PM (#65757394) Homepage Journal

    They're right.

    Over a century ago, people accepted deaths caused by human drivers. If you reduce fatalities, many people will still appreciate that, even if you don't get it down to zero. Everything else is like that. What doesn't have a nonzero death rate?

    • by dfghjk ( 711126 )

      Why do you believe Waymo will reduce death rate at all? And what detrimental effects will that have on traffic? We can fully eliminate traffic deaths by eliminating traffic, does Waymo reduce rates like that and if not, why bother?

      We know Waymo doesn't give a shit about improving safety, they are interested in getting rich.

      • Capitalizing on convenience is how capitalism works. If you have a better system, why isn't it more popular?
        • Because the wealthy want to keep the current system in place to stay wealthy, of course. Any better system requires us to rebalance the distribution of wealth, which is instantly called communism, and we have allowed the word communism to mean something so bad we can never try it.
      • by Cajun Hell ( 725246 ) on Tuesday October 28, 2025 @11:54PM (#65757550) Homepage Journal

        Why do you believe Waymo will reduce death rate at all?

        Because, unlike most humans, they want to and are trying.

        And what detrimental effects will that have on traffic?

        I don't know, but if you or I were trying to solve the problem, I'm sure we would have plenty of opinions about which convenience-vs-safety tradeoffs are the right ones.

        We can fully eliminate traffic deaths by eliminating traffic

        And that convenience-vs-safety tradeoff would have very few advocates, I suspect. Do you think it's a good one?

        We know Waymo doesn't give a shit about improving safety, they are interested in getting rich.

        Perhaps you or I should hang out with some insurance company nerds, and see what changes they advocate for liability law, to make those two things (safety & getting rich) correlate. I wouldn't be surprised if you already have some ideas, even without the insurance nerds.

      • by Mr. Barky ( 152560 ) on Wednesday October 29, 2025 @02:41AM (#65757742)

        We know Waymo doesn't give a shit about improving safety, they are interested in getting rich.

        This is true... somewhat. But a high accident rate will cause them to get less rich. If their accident rate is too high, they will have bad publicity. The reality is that their accident rate has to be substantially less than the rate for human drivers - otherwise bad publicity will become overwhelming (which will in turn reduce profits). This makes their quest for profits aligned with improving safety, so they do and will give a shit - at least until they get a monopoly, at which point all bets are off.

        • Which is exactly why we must penalize these companies as heavily as we penalize humans. Because if we don't, then we are not using the one mechanism that will force companies to be better. They got into a business where humans have set the bar for a long time. They need to either perform at or better than human safety in every single instance or be penalized into oblivion. They should actually face the risk of going bankrupt if they fail their promises to society.
        • by Sloppy ( 14984 )

          a high accident rate will cause them to get less rich

          I'm reminded of a scene from one of my favorite movies:

          [ED-209 kills someone]

          Dick Jones: "I'm sure it's only a glitch. A temporary setback."

          The Old Man: "You call this a glitch?! We're scheduled to begin construction in six months. Your "temporary setback" could cost us fifty million dollars in interest payments alone!"

    • by opakapaka ( 1965658 ) on Tuesday October 28, 2025 @11:15PM (#65757478)
      People were tricked into accepting automobile deaths as just the cost of doing business, and the entire business of traffic engineering is dedicated to explaining away those deaths as accidents. Wes Marshall PhD PE published Killed by a Traffic Engineer in 2024 which explains a lot of this. Think about it: if there is a death on an escalator, it results in huge inquiries, lawsuits, and extensive investigation to ensure it never happens again. This does not happen with automobile crashes in most cases, as the driver is usually blamed rather than the engineering which makes most roads unsafe at any speed.
      • by Rei ( 128717 ) on Wednesday October 29, 2025 @04:47AM (#65757884) Homepage

        Agreed. Forget about escalators, think about a much closer analogy - aircraft. *Every single crash* gets an investigation and recommendations on how to avoid that type of crash from happening again. Recommendations that are, for the most part, acted on.

        But we just accept car crashes as, "Meh, it happens". It's not that we don't have safety regulations for road design and car design. It's just that we've heavily normalized a high rate of crashes and don't treat preventing them with any sense of urgency, even though they're a top killer of young people.

      • by Tom ( 822 )

        Unlike cars, escalators (or airplanes, as one commenter used as another example) are not driven by amateurs. They are either automated or operated by professionals.

        The trade-off we had to make was between allowing only licensed chauffeurs to drive and allowing everyone to drive after a short intro course that teaches you only the basics and very little about SAFE driving. Oh, and you get qualified for life. No further tests, requirements for courses, experience, etc.

        Are roads safe? Fuck no, not by a huge m

    • The perfect vs good argument is the pragmatic one for moral hazards like this. IMHO the best scenario as self-driving vehicles become mainstream technology is probably a culture like air travel: when there is some kind of accident, the priority is to learn from it and determine how to avoid the same problem happening again, and everyone takes the procedures and checks that have been established that way very seriously. That is necessarily going to require the active support of governments and regulators as

    • by rsilvergun ( 571051 ) on Wednesday October 29, 2025 @12:10AM (#65757568)
      Car companies forced it on us. Seriously, no joke: look at the history of car companies. They basically got us all to pay to build the roads needed to use their product while destroying public transportation, making us completely dependent on their product.

      We all grew up playing with toy cars and surrounded by cars as the most normal thing imaginable, but they're one of the most bizarre and aberrant things humanity has ever come up with if you can actually step outside our society and look at them objectively as an outsider. We are literally all traveling in individual multi-ton vehicles that use enormous amounts of energy and fuel and cause a vast number of health, political and social issues that we wallpaper over with externalized costs.

      And God help you if you point any of this out.
      • True dat.

      • by Viol8 ( 599362 ) on Wednesday October 29, 2025 @04:55AM (#65757890) Homepage

        "They basically got us all to pay to build the roads needed to use their product while destroying public transportation making us completely dependent on their product."

        Here in Europe, people simply found them more convenient for certain use cases than public transport (or walking!) and the building of roads followed the uptake of cars, not the other way around. However, good public transport still exists in most countries too. It's not a zero-sum game.

        Also, people like yourself ignore human nature - most people prefer having their own private vehicle just as they prefer living in their own private accommodation rather than sharing rooms with strangers, even though the latter would be far more space and energy efficient.

      • Most of us see it the opposite way. The car freed us from having to rely on the whims of a horse, government, or railroad.
    • You know what, the last time I got into a fender bender no one accepted that I caused the accident.
  • by gurps_npc ( 621217 ) on Tuesday October 28, 2025 @10:24PM (#65757402) Homepage

    The article's quotes are strange, but she is not wrong.

    It is obvious that robot drivers will not be perfect. They will at some point make a decision that will result in human death.

    But if they reduce total deaths no one will be particularly concerned. I certainly won't. And if they do not tremendously reduce total deaths they will not be allowed on the road.

    The only reason they are being tested is that humans are horrible drivers and kill many people all the time.

    • by dfghjk ( 711126 )

      "But if they reduce total deaths no one will be particularly concerned."

      Bullshit, I will be. No one accepts that computers occasionally take money out of your bank account, and everyone would be concerned if that happened. Losing a buck is nothing compared to losing a life, so can you explain your casual sociopathy?

      "I certainly won't."
      Yes, that sociopathy.

      "And if they do not tremendously reduce total deaths they will not be allowed on the road."
      Bullshit, they are already on the road. How ignorant are you?

      • I just have to laugh when people are down on humans about their driving skills. Billions of cars on the road every day and if one human causes an accident we never stop hearing about it. This is definitely where road rage comes from.
      • Bullshit, I will be. No one accepts that computers occasionally take money out of your bank account, and everyone would be concerned if that happened.

        Are you so confident that there are absolutely zero bugs in your bank's computer systems that could cause it to incorrectly take money out of your account? If so I'd say that's pretty naive based on what we see in the real world [time.com].

        The people quoted in the article were obviously very concerned, but I'd bet that many are still BoA and Zelle users today. We collectively accept the risks because of 1) the convenience that electronic banking offers and 2) the understanding that alternative systems, like a room

        • I have never had a bank make an error on any account I've had. Ever. In approximately 40 years of having at least one bank account.

          That's a pretty good track record.

          • Me neither, though I've had bank accounts for quite a bit less time than you. I've also never been killed by a self-driving car (or human-driven car, for that matter). Such is the challenge of weighing the risk of low-frequency events, but we implicitly do it all the time.
    • It's not the public, but insurance companies that matter. At least in sane societies that do not have the notion of punitive damages.
      Once it is clear that robo-taxis, or self-driving cars, are safer than human-driven vehicles, insurance companies will take note and lower premiums for self-driving cars even if the operators are held responsible for whatever mayhem they are causing. In such a scenario, it is likely that your insurance premium will go up by a fair bit, if you insist on having a steering whe
    • Well, that, and when people are driving your taxis, they're going to expect things like wages, bathroom breaks, meal breaks, reasonable shifts, tolerable working conditions...

  • I for one would be very upset if a robot car killed me, YMMV.

    • Just make sure you always have a note ready saying "It wasn't an accident. I knew too much, and the premeditation came directly out of the CEO's mouth."
    • You would not be upset, or anything else, dead. And dying from a self-driving car vs a speeding distracted driver really doesn't matter. If the general numbers go down, even if nonzero and with complications, then society will absorb it. Every single advance in civilization follows this path. Electricity, Petroleum, Pasteurization, Power tools, Skyscrapers, etc. In fact, deftly navigating the new dangers of a technological advance is considered a sign of ability by the young generation. Over time, acceptin
  • by madbrain ( 11432 ) on Tuesday October 28, 2025 @10:31PM (#65757406) Homepage Journal

    Of course technology will never be perfect. The question is who'll be held responsible for death and injuries resulting from product defects, and how they'll pay to compensate for them.

    • It'll be no muddier, and perhaps somewhat less muddy, than today. Assigning blame is a messy business in auto accidents, and often you're starting with "he said, she said", a couple of damaged cars, and no witnesses.

      At least there will be boatloads of telemetry to tell part of the story.

      • by dfghjk ( 711126 )

        "At least there will be boatloads of telemetry to tell part of the story."

        To tell the pro-self driving part. We're already seeing the manipulation, even if an occasional DOGE effort is needed to destroy evidence.

        • by Petersko ( 564140 ) on Tuesday October 28, 2025 @11:30PM (#65757502)

          That's not the nature of telemetry. There's no log message that says, "Ball in roadway, brake applied". That's why analysis of the raw data should probably become a specialty service from independent providers for legal stuff. You can create obvious gaps to hide things, but creating whole fake streams of data to imply different real world behaviour of the vehicle, and having it remain consistent with the physical crash evidence is one hell of an ask.

          I think you're much more likely to see planned fakes than you are falsified accident data. Think of the emissions scandals with VW and others. Faked test results, shady test practices. Up-front malfeasance. That's far easier than trying to rewrite the history of a single event and its raw data.

          • In the age of real-time audio and video deepfakes, telemetry deep fake doesn't seem like such a huge stretch, especially if there are no witnesses, human or machine, to contradict the data. Of course, we are still talking about conspiracy theories here, but technologically, it seems feasible.

    • by dfghjk ( 711126 )

      We have an expectation that technology IS perfect when lives are at stake. That's why medical technology is expensive, and it's why mission critical systems are redundant. But with lethal driving technology, it's OK to use neural networks when we don't even know how they work. People have lost their minds with this crazy apologetic.

      "The question is who'll be held responsible for death and injuries resulting from product defects, and how they'll pay to compensate for them."
      Not the companies responsible, t

      • Medical technology is far from perfect. You only need to read the warning labels on drugs to know that. Some are many pages long. Some of those side effects include death, a surprising number of times. So yes, medical tech can certainly be lethal. And the manufacturers are often held responsible in civil cases. Sometimes criminal cases also, if they covered something up.

    • And when a robo truck wipes out a school bus, who will do hard time?

    • IMHO the only sensible answer is to separate responsibility in the sense that a tragedy happened and someone has to try to help the survivors as best they can, from responsibility in the sense that someone behaved inappropriately and that resulted in an avoidable tragedy happening in the first place.

      It is inevitable that technology like this will result in harm to human beings sooner or later. Maybe one day we'll evolve a system that really is close to 100% safe, but I don't expect to see that in my lifetime

      • The concept of intent matters from a criminal perspective - whether someone should be charged with a crime. Intent is going to be hard to prove without a lot of transparency on the part of the manufacturers about their development and testing protocols.

        Proving intent will depend heavily on what kind of regulatory framework we end up with. It is likely going to vary by locale. And enforcement will be at the discretion of prosecutors as well.

        I expect most autonomous vehicle accidents aren't going to be the re

        • Just my personal opinion, but given the track record in this particular industry, I think there should be demonstrable intent by decision-makers to follow good practices, not merely a lack of evidence of intent to circumvent or cut corners. This is expected in other regulated industries, compliance failures are a big deal, and for good reason. I see no reason why similar standards could not be imposed on those developing and operating autonomous vehicles, and every reason they should be given the inherent r

          • I agree with you. But even with the best intent, regulations, and full compliance, problems will still arise. Nothing as complex as a vehicle can ever be perfect, hardware and software wise. Unforeseen and unpredictable behavior will still occur, same way it does with medical devices and drugs.

            • For sure. And that's why I think it's important to distinguish harm caused despite good intentions and reasonable practices being followed from harm caused because someone did not follow reasonable practices or actively chose to cut corners.

  • by Petersko ( 564140 ) on Tuesday October 28, 2025 @10:41PM (#65757420)

    Her point is nuanced, carefully considered, realistic, honest, and well put. Damn shame. People want none of that.

    What people want for real is blame and retribution. They'll put up any number of lives to the sacrificial altar of transportation just so long as they can nail somebody to the cross when something goes wrong. And automated cars don't provide that, so it'll always feel unsatisfying and questionable, no matter how good it gets from a safety perspective.

    • Advances still need to prove themselves safer. Deep investigation of each incident has vastly improved air & sea travel, for example. So while blame is the catalyst, the money is actually moving to pay for the deep statistics to be gathered. Lawyers want details, and the advance needs to die by debt if it's the wrong direction.
    • by dfghjk ( 711126 )

      "Her point is nuanced, carefully considered, realistic, honest, and well put. Damn shame. People want none of that."

      Her point is self-serving and dishonest. And yeah, I want none of that.

      And self-driving is already causing deaths. Those deaths are because of greed and gross incompetence, coupled with dishonesty about how much the technology improves safety. But rubes like you are easily lied to.

      "They'll put up any number of lives to the sacrificial altar of transportation just so long as they can nail so

      • "By they, do you mean self-serving CEOs? Talk about a sacrificial altar!"

        NO. I mean people in general. We have collectively decided that the thousands of lives it costs every year for the current system to function is an acceptable price. We tolerate it because of the somewhat illusory promise of accountability.

        "Sure will kill some people but that's a good thing!"

        You said that. Nobody else did.

    • by gweihir ( 88907 )

      What people want for real is blame and retribution. They'll put up any number of lives to the sacrificial altar of transportation just so long as they can nail somebody to the cross when something goes wrong. And automatic cars don't provide that, so it'll always feel unsatisfying and questionable, no matter how good it gets from a safety perspective.

      This will go on for a while. But not long. Because even with the pathetic level of car insurance you need in the US, insurers will not have it: premiums for robotic cars will be orders of magnitude lower, and then the hate- and revenge-fueled primitives will simply see themselves ignored.

    • by Luthair ( 847766 )
      She's wrong. People aren't going to accept robotaxis that make mistakes a normal human wouldn't.
    • by RobinH ( 124750 )
      This is insightful. We had a rather notorious crash [wikipedia.org] up here in Canada a few years ago when a transport truck didn't stop at a stop sign, and hit a coach bus carrying a teenage hockey team, and a lot of kids died. It was really tragic, and there was an outpouring of grief across the country. But what really got me is the way people treated the driver, as if he was the anti-christ himself. People don't seem to connect the dots... if anyone fails to stop at a stop sign, and nobody is hurt, then nobody care
  • by Joe_Dragon ( 2206452 ) on Tuesday October 28, 2025 @11:01PM (#65757456)

    Will they use EULAs to keep stuff out of court?

  • Will the owner of the car, who has little control over the code, be the one out of pocket fighting it out in court?

  • billionaires have had it with us peons making decisions. You can have your moral panics and your Thanksgiving rants. Everything else is theirs.
  • Until a loved one dies.

  • Society accepts about 45k deaths from avoidable car crashes per year in the US. Robotic cars will have far lower numbers per distance travelled than human drivers. It just takes a bit of getting used to the idea, but the naysayers really have nothing. Or rather, a lot less than nothing.

    • Maybe this will be an area where the US simply gets left behind because of the pro-car and litigious culture that seems to dominate discussions there.

      Reading online discussions about driving -- admittedly a hazardous pastime if you want any facts to inform a debate -- you routinely see people from the US casually defending practices that are literally illegal and socially shunned in much of the world because they're so obviously dangerous. Combine that with the insanely oversized vehicles that a lot of driv

    • No they don't. People who cause those crashes generally go to jail, provided they survive.

  • "I think that society will," Mawakana answered, slowly, before positioning the question as an industrywide issue. "I think the challenge for us is making sure that society has a high enough bar on safety that companies are held to." She said that companies should be transparent about their records by publishing data about how many crashes they're involved in, and she pointed to the "hub" of safety information on Waymo's website. Self-driving cars will dramatically reduce crashes, Mawakana said, but not by 100%: "We have to be in this open and honest dialogue about the fact that we know it's not perfection."

    You see this quite regularly: someone pipes in out of the blue with the "X is not 100%" thing, completely unprompted, without anyone saying or offering any statement that would be responsive to such a fundamentally worthless point.

    It is a mistake to judge self-driving vendors on aggregate accident rates or grade on a curve by how they compare to human drivers. If a self-driving car with a safety record 100x better than an average human occasionally recognizes a person in the middle of the road and speeds

  • But Luigi Mangione proved that society will accept the death of abusive CEOs.
  • by fluffernutter ( 1411889 ) on Wednesday October 29, 2025 @03:07AM (#65757776)
    So a kid jumped out into the road and got hit by an automated vehicle.

    I would expect the accident to be investigated like any accident is today. Either the automated vehicle has performed to the standards we hold humans to, or it hasn't. As a human, should I have seen the kid before he ran through the row of parked cars or not? If a human should have, so should the automated vehicle. If a human would be penalized, so should the vehicle company.
  • Waymo and their ilk are trying to replace public transport out of their own for-profit self-interest. When their cars kill occupants, or pedestrians, or other road users, or generally become a nuisance, driving slowly or unpredictably or violating laws, then why should anyone accept it?

    They might claim safety, but they have an obvious conflict of interest in being truthful. While I trust Waymo more for transparency than Tesla, the reality is all of these things should be held to regulatory scrutiny and indep

  • If society should accept a death from automated cars, then they should also accept deaths from vaccinations. We are being asked to believe that these cars will someday be better overall; OK, say I buy that. Then everyone should be vaccinated by the same logic, because we know vaccination serves a common good like these cars.

    Anyone who is asking people to possibly die to develop these cars should also be willing to get vaccinated.
    • If society should accept a death from automated cars, then they should also accept deaths from vaccinations.

      The same people who are against vaccinations are the primary people who say you can pry the wheel of their dino burner from their cold dead hands.

  • I'm quite confident her tune would change were it someone in her family that was killed. Certainly would be a wonderful irony if she herself were struck.

  • No, I won't. When robot cars start running people over, I am going to destroy every robotaxi's ability to drive that I can get my hands on.
