
Bad Week for Unoccupied Waymo Cars: One Hit in Fatal Collision, One Vandalized by Mob (nbcbayarea.com)

For the first time in America, an empty self-driving car has been involved in a fatal collision. But it was "hit from behind by a speeding car that was going about 98 miles per hour," a local news site reports, citing comments from Waymo. ("Two other victims were taken to the hospital with life-threatening injuries. A dog also died in the crash, according to the San Francisco Fire Department.")

Waymo's self-driving car "is not being blamed," notes NBC Bay Area. Instead the Waymo car was one of six vehicles "struck when a fast-moving vehicle slammed into a line of cars stopped at a traffic light..." The National Highway Traffic Safety Administration requires self-driving car companies, like Waymo, to report each time their vehicles are involved in an accident, regardless of whether the autonomous vehicle was at fault. According to NHTSA, which began collecting such data in July 2021, Waymo's driverless vehicles have been involved in about 30 different collisions resulting in some type of injury. Waymo, however, has noted that nearly all those crashes, like Sunday's collision, were the fault of other cars driven by humans. While NHTSA's crash data doesn't note whether self-driving vehicles may have been to blame, Waymo has previously noted that it only expects to pay out insurance liability claims for two previous collisions involving its driverless vehicles that resulted in injuries.

In December, Waymo touted the findings of its latest safety analysis, which determined its fleet of driverless cars continues to outperform human drivers across major safety metrics. The report, authored by Waymo and its partners at the Swiss Reinsurance Company, reviewed insurance claim data to explore how often human drivers and autonomous vehicles are found to be liable in car collisions. According to the study, Waymo's self-driving vehicles faced about 90% fewer insurance claims relating to property damage and bodily injuries compared to human drivers... The company's fleet of autonomous vehicles has traveled more than 33 million miles and has provided more than five million rides across San Francisco, Los Angeles, Phoenix and Austin...

In California, there are more than 30 companies currently permitted by the DMV to test driverless cars on the open road. While most are still required to have safety drivers sitting in the front seat who can take over when needed, Waymo remains the only fleet of robotaxis in California to move past the state's testing phase to, now, regularly offer paid rides to passengers.

Their article adds that while Sunday's collision marks the first fatal crash involving a driverless car, "it was nearly seven years ago when another autonomous vehicle was involved in a deadly collision with a pedestrian in Tempe, Arizona, though that self-driving car had a human safety driver behind the wheel. The accident, which occurred in March 2018, involved an autonomous car from Uber, which sold off its self-driving division two years later to a competitor."

In other news, an unoccupied Waymo vehicle was attacked by a mob in Los Angeles last night, according to local news reports. "Video footage of the incident appears to show the vehicle being stripped of its door, windows shattered, and its Jaguar emblems removed. The license plate was also damaged, and the extent of the vandalism required the vehicle to be towed from the scene."

The Los Angeles Times reminds its readers that "Last year, a crowd in San Francisco's Chinatown surrounded a Waymo car, vandalized it and then set it ablaze..."


Comments Filter:
  • But I don't see how this particular news item is a "Waymo" story or informs us about the topic at all, really.

    • by rta ( 559125 )

      Like with EVs and AI there are technical issues, but there are also many social, legal, perception, etc. issues around acceptance.

      The first one is a reminder that just because a driverless car is involved in an accident, it doesn't mean it's at fault. Yeah, it's not so deep, but sadly it's relevant to cars in general: all sorts of urban planners blame all pedestrian and bike accidents on cars and then use the numbers to pass lower speed limits, parking bans, "traffic calming" and the whole raft of anti-car techniques.

      An

      • Like with EVs and AI there are technical issues, but there are also many social, legal, perception, etc issues to acceptance. The first one is a reminder that just because a driverless car is involved in an accident it doesn't mean it's at fault.

        That is literally the second paragraph of the summary.

    • by Firethorn ( 177587 ) on Sunday January 26, 2025 @06:14PM (#65120515) Homepage Journal

      Basically, it's "any accident involving a Waymo is reportable news".

      Reading the article, the 98 mph vehicle was a Tesla driven by a 66-year-old man. It crashed into the STOPPED and empty Waymo vehicle with enough force to ping-pong the Waymo around. This resulted in the death of Romanenko, who was not in either of the mentioned vehicles; five other cars of unknown make were also hit.
      The driver of the Tesla was uninjured enough to be hauled directly to jail. Wish it had been the opposite - early reports are that Romanko was a pretty nice guy. Two other people were taken to the hospital with life-threatening injuries, and a dog was killed.

      In the same report they mention that vandals wrecked a Waymo enough that it had to be towed off, but nobody was hurt. No word if video of the perps was obtained that would be sufficient for identification.

      • So what you're saying is Tesla kills again [yahoo.com].

        • “Most of these vehicles received excellent safety ratings, performing well in crash tests at the IIHS and NHTSA, so it’s not a vehicle design issue,” said the company's executive analyst Karl Brauer.

          “The models on this list likely reflect a combination of driver behaviour and driving conditions, leading to increased crashes and fatalities.”

          As in this case it seems the driver of the Tesla was driving it like he stole it, so I can't exactly argue with this bit blaming the drivers of Teslas more than the Teslas themselves.

          Maybe Tesla self-driving software is actually better... than Tesla drivers? ;)

          But even though Tesla is the deadliest brand, apparently, it doesn't produce the deadliest cars: Those would be the Hyundai Venue SUV, Chevy Corvette and Mitsubishi Mirage hatchback.

          Looking up more information, [carscoops.com]

          Tesla barely beats out Kia at 5.6 fatal accident

      • early reports are that Romanko was a pretty nice guy

        Not sure why you think this is relevant. Don't do the "think of the children" bullshit. He was a bystander; how good a person he was is irrelevant, and no one is going to come out and say "oh, that guy who just died was an arsehole!"

        Heck, "early reports" on most school shooting perps are that they were pretty nice guys.

        Leave this emotional rubbish out of posts. It's not relevant.

  • by awwshit ( 6214476 ) on Sunday January 26, 2025 @05:28PM (#65120431)

    > an unoccupied Waymo vehicle was attacked by a mob in Los Angeles last night

    I suspect this is a part of any future that involves 'autonomous' machines. Just a taste of what humans can and will do.

    • by Cyberax ( 705495 )

      I suspect this is a part of any future that involves 'autonomous' machines. Just a taste of what humans can and will do.

      Nah. This is just the regular urban edgy bros showing off. Once self-driving cars become more widespread and people start to realize that they completely solve the public transit issues, these super-edgy oh-so-anarchist bros will be dealt with swiftly.

      • > realize that they completely solve the public transit issues

        Have you been to LA? Sometimes it is not about drivers or accidents or human caused things. Sometimes it is just the number of vehicles that want to share the same space at the same time. When the road is full the road is full and it doesn't matter if there is a human or a machine in control of each vehicle.

      • by q_e_t ( 5104099 )
        SDCs themselves don't solve public transport issues, as the ratio of energy per passenger for a given distance is unfavourable. They can be part of a public transport infrastructure if linked to mass transport vehicles. E.g., the SDC taxi talks to the bus network and takes you to a bus stop thirty seconds before the bus arrives, allowing you to catch it, then another picks you up at the other end, or provides you with a transfer between routes, amplifying the utility of the bus network. Add in train, light ra
    • I've lived in the SF Bay area for most of my life. The far left there is oddly reactionary, and they have a long history of violent and direct lashing out. They scream at techie commuter buses and vomit on them. They put death threats against techies on bumper stickers, etc. They damage or disrupt self-driving cars. They light buildings on fire when the buildings contain new apartments. They gather those little lime scooters and throw them in the bay or in lakes. They shit on things. They destroy constructi

  • by Ed Tice ( 3732157 ) on Sunday January 26, 2025 @06:16PM (#65120519)
    It's no surprise that the Waymo vehicles are safer than human drivers because Waymo has taken a much different approach to their deployment. Everybody is targeting L5. Some have attempted to get there via some sort of iterative process covering random miles with human safety drivers. Waymo, instead, focused on getting an actual L4 system working in a limited geography. This allowed them to get much more meaningful feedback and continue to expand the area of safe operation. They don't expand the L4 area until they have enough safety data to predict with high confidence that the vehicles will be able to handle the expanded geography.

    Meanwhile you have other systems out there running perpetual "beta" deployments that aren't really getting any closer to being something meaningfully useful or safe.

    • It's no surprise that the Waymo vehicles are safer than human drivers because Waymo has taken a much different approach to their deployment. Everybody is targeting L5. Some have attempted to get there via some sort of iterative process covering random miles with human safety drivers. Waymo, instead, focused on getting an actual L4 system working in a limited geography. This allowed them to get much more meaningful feedback and continue to expand the area of safe operation. They don't expand the L4 area until they have enough safety data to predict with high confidence that the vehicles will be able to handle the expanded geography.

      Meanwhile you have other systems out there running perpetual "beta" deployments that aren't really getting any closer to being something meaningfully useful or safe.

      How they are “measuring” safety:

      While NHTSA's crash data doesn't note whether self-driving vehicles may have been to blame, Waymo has previously noted that it only expects to pay out insurance liability claims for two previous collisions involving its driverless vehicles that resulted in injuries.

      The first half of that statement raises the question of why the NHTSA is choosing NOT to specifically and properly measure autonomous drivers.

      The second half of that statement is more a testament to how bad/corrupt the US legal system is. By lawyers, for lawyers.

      • You're failing to understand the difference between what the NHTSA looks at and what it publishes. The NHTSA investigates all self-driving incidents. It knows what's happening. It just doesn't publish that detail in its aggregated dataset.

  • I mean, they act like these cars haven't done anything to anybody until this time, when someone died and heck, it wasn't even their fault. I'm pretty sure there was an AI car that saw a person in the road, collided with them, then stopped seeing the person and kept driving because it didn't detect the person under the car.

    So many of the articles are paywalled, but I did find Cruise Didn't Tell Anyone That A Woman Was Dragged 20 Feet After Being Pushed Into Robotaxi [jalopnik.com] and Woman dragged by... [sfstandard.com]

    • You're complaining about Cruise. Cruise is dead. They had their license pulled in no small part due to the exact incident you're complaining about.

      Now that Cruise is gone, the level of newsworthy driverless car badness has dropped from accelerating into pedestrians down to things like getting into honking matches at 4am and spending 5 minutes driving around in circles.

  • Waymo's self-driving vehicles faced about 90% fewer insurance claims relating to property damage and bodily injuries compared to human drivers

    What is wrong with writers these days? Do they think that only big numbers grab attention, so they have to invert the sense so that big == small? Do they not understand that "only 10% of the claims of human drivers" is not only more correct but also clearer?

    • The one that gets me is when they write something like "10x less". There's not enough information in that statement to do the math if you take it as it's written. Yes, I KNOW they're trying to write "1/10th", but for fuck's sake, if you're writing copy for a living you ought to be able to do it properly.
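
As a quick illustration of the arithmetic this sub-thread is arguing about (a minimal sketch with made-up, normalized numbers; nothing below comes from the article, and the variable names are purely illustrative):

    # "90% fewer claims": if human drivers generate 100 claims over some exposure,
    # the reduced rate is 100 * (100 - 90) / 100 = 10 claims, i.e.
    # "only 10% of the claims of human drivers".
    human_claims = 100                                 # illustrative baseline, not a real figure
    waymo_claims = human_claims * (100 - 90) // 100    # "90% fewer" -> 10
    print(waymo_claims)                                # 10

    # "10x less" taken literally would be 100 - 10 * 100 = -900, which is meaningless;
    # the intended reading is one tenth: 100 / 10 = 10 -- the same figure.
    print(human_claims - 10 * human_claims)            # -900
    print(human_claims // 10)                          # 10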

  • Waymo's self-driving car "is not being blamed," notes NBC Bay Area.

    Hide the part about no blame and trumpet the part about association, mixed with innuendo about blame. No untruths in the story, but with the same effect as a direct lie.

  • While NHTSA's crash data doesn't note whether self-driving vehicles may have been to blame..

    And why the fuck is that set up like that?!? Autonomous solutions ARE the new driver on the road, no matter how much bragging they want to do about a gazillion miles of collected “experience”. This is like dropping the legal driving age to 13 and then simply lumping all the fatalities they cause in with everyone else's. Wrong. You measure the FNG on the damn road. We need to know the accurate impact.

    ..Waymo has previously noted that it only expects to pay out insurance liability claims for two previous collisions involving its driverless vehicles that resulted in injuries.

    That is a measure of how bad the legal system is. That is NOT a measure of how bad Autonomous is at drivin

  • In other news, an unoccupied Waymo vehicle was attacked by a mob in Los Angeles last night, according to local news reports. "Video footage of the incident appears to show the vehicle being stripped of its door, windows shattered, and its Jaguar emblems removed. The license plate was also damaged, and the extent of the vandalism required the vehicle to be towed from the scene."

    I say we blame the car!

    (Like we do when yet another truck, er, spontaneously drives into a Christmas market.)
