Bad Week for Unoccupied Waymo Cars: One Hit in Fatal Collision, One Vandalized by Mob (nbcbayarea.com)
For the first time in America, an empty self-driving car has been involved in a fatal collision. But it was "hit from behind by a speeding car that was going about 98 miles per hour," a local news site reports, citing comments from Waymo. ("Two other victims were taken to the hospital with life-threatening injuries. A dog also died in the crash, according to the San Francisco Fire Department.")
Waymo's self-driving car "is not being blamed," notes NBC Bay Area. Instead the Waymo car was one of six vehicles "struck when a fast-moving vehicle slammed into a line of cars stopped at a traffic light..." The National Highway Traffic Safety Administration requires self-driving car companies, like Waymo, to report each time their vehicles are involved in an accident, regardless of whether the autonomous vehicle was at fault. According to NHTSA, which began collecting such data in July 2021, Waymo's driverless vehicles have been involved in about 30 different collisions resulting in some type of injury. Waymo, however, has noted that nearly all those crashes, like Sunday's collision, were the fault of other cars driven by humans. While NHTSA's crash data doesn't note whether self-driving vehicles may have been to blame, Waymo has previously noted that it only expects to pay out insurance liability claims for two previous collisions involving its driverless vehicles that resulted in injuries.
In December, Waymo touted the findings of its latest safety analysis, which determined its fleet of driverless cars continues to outperform human drivers across major safety metrics. The report, authored by Waymo and its partners at the Swiss Reinsurance Company, reviewed insurance claim data to explore how often human drivers and autonomous vehicles are found to be liable in car collisions. According to the study, Waymo's self-driving vehicles faced about 90% fewer insurance claims relating to property damage and bodily injuries compared to human drivers... The company's fleet of autonomous vehicles has traveled more than 33 million miles and provided more than five million rides across San Francisco, Los Angeles, Phoenix and Austin...
In California, more than 30 companies are currently permitted by the DMV to test driverless cars on the open road. While most are still required to have safety drivers sitting in the front seat who can take over when needed, Waymo remains the only fleet of robotaxis in California to have moved past the state's testing phase and now regularly offer paid rides to passengers.
The NBC Bay Area article adds that while Sunday's collision marks the first fatal crash involving a driverless car, "it was nearly seven years ago when another autonomous vehicle was involved in a deadly collision with a pedestrian in Tempe, Arizona, though that self-driving car had a human safety driver behind the wheel. The accident, which occurred in March 2018, involved an autonomous car from Uber, which sold off its self-driving division two years later to a competitor."
In other news, an unoccupied Waymo vehicle was attacked by a mob in Los Angeles last night, according to local news reports. "Video footage of the incident appears to show the vehicle being stripped of its door, windows shattered, and its Jaguar emblems removed. The license plate was also damaged, and the extent of the vandalism required the vehicle to be towed from the scene."
The Los Angeles Times reminds its readers that "Last year, a crowd in San Francisco's Chinatown surrounded a Waymo car, vandalized it and then set it ablaze..."
Re: (Score:2)
Africa? Nobody wants that kind of latency from a driver. It would be like driving drunk.
Re:They're not really driverless (Score:4, Informative)
About every 5 miles, a real person has to take control remotely and drive by wire.
You're confusing Cruise and Waymo. The former has basically given up and abandoned the market, and it was Cruise that reported requiring manual intervention every 5 miles. Waymo, on the other hand, publishes no hard numbers, but by all accounts it is several orders of magnitude better than that.
There are 5,000 Waymos travelling the streets of SF. If each of them needed to stop in traffic for half a minute every couple of miles to get a human intervention, it would be in the news. They don't, and the interventions aren't remote operation of the vehicle either; the human is making decisions for the autonomous system. Actually taking over the vehicle is considered an autonomous disengagement, and Waymo does publish figures for that: currently a human needs to remotely take control of the vehicle once every 17,000 miles.
You misunderstood (Score:2)
The point I was making is that right now Google is hyper-fixated on safety, so they are letting those human drivers do a lot of driving.
Once this is a business, people are going to start tracking how much time human drivers actually spend in control of the vehicles and pushing to have them spend less time doing it.
Re:You misunderstood (Score:4, Informative)
They don't stop in traffic. What happens is they don't know what to do, so they delegate control to a person who drives the car remotely. The car never stops; it just seamlessly hands control over to a human being in a call center.
First, Waymo does NOT do that [reddit.com].
Second, what you're describing is not even physically possible without some kind of direct radio link. No cellular network is even close to being fast enough or reliable enough for real-time control of a motor vehicle. The latency is way, way too high, and the bandwidth is way too low. And if the network dropped a second of video, someone would likely die. There is absolutely no realistic way to do that safely. Period.
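To put rough numbers on the latency point (a back-of-the-envelope sketch; the speeds and round-trip times below are assumed for illustration, not measured):

```python
# Sketch: how far a car travels "blind" during one network round trip.
# The latency figures are illustrative assumptions, not measurements.

MPH_TO_MPS = 0.44704  # 1 mph = 0.44704 m/s

def blind_distance_m(speed_mph: float, latency_s: float) -> float:
    """Meters traveled before a remote command could even take effect."""
    return speed_mph * MPH_TO_MPS * latency_s

for speed_mph in (25, 45, 65):
    for latency_ms in (100, 250, 500):
        d = blind_distance_m(speed_mph, latency_ms / 1000)
        print(f"{speed_mph} mph with a {latency_ms} ms round trip -> {d:4.1f} m uncontrolled")
```

At 65 mph, even a 250 ms round trip means over 7 meters of road covered before a remote correction could land.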
The grandparent post described exactly what happens when a human takes control over a Waymo car [waymo.com]. The car asks the remote operator what path to take, they specify a series of points relative to the current location, and the car drives that path autonomously. Then, they tell it to resume autonomous driving, and it continues.
The only way a human can truly take complete control of a Waymo car is to physically get into a vehicle, drive out to wherever the Waymo car is, open the driver's door, and get behind the wheel. That might happen on rare occasions, but it obviously isn't common, or else you'd hear about it.
In ALL other circumstances, the car is driving with its own decision-making. The human is just drawing points on a map to override the car's path planning decisions, and telling it "Yes, it is safe to do this" whenever the car stops and asks for confirmation.
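As a toy illustration of that division of labor (all class and method names here are hypothetical, not Waymo's actual interface), the remote assistant only answers questions and supplies path hints, while the vehicle keeps sensing, planning and driving on its own:

```python
# Toy model of remote "fleet response" guidance: the human answers questions
# and draws waypoints; the car's own planner does all the actual driving.
# Every name below is hypothetical.
from dataclasses import dataclass

@dataclass
class Waypoint:
    x_m: float  # offset from the car's current position, in meters
    y_m: float

class RobotaxiPlanner:
    def ask_remote(self, question: str) -> None:
        print(f"car -> assistant: {question}")

    def apply_hints(self, hints: list[Waypoint]) -> None:
        # Hints constrain route selection; perception and control stay onboard.
        print(f"replanning autonomously through {len(hints)} suggested points")

    def confirm(self, safe: bool) -> None:
        print("proceeding" if safe else "holding position")

planner = RobotaxiPlanner()
planner.ask_remote("Lane blocked by a double-parked truck; which way around?")
planner.apply_hints([Waypoint(2.5, 5.0), Waypoint(0.0, 20.0)])  # human draws points
planner.confirm(safe=True)  # "yes, it is safe to do this"
```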
The point I was making is that right now Google is hyper-fixated on safety, so they are letting those human drivers do a lot of driving.
Literally nothing in your post is true. Waymo isn't even part of Google anymore, and hasn't been for almost a decade. It's a separate company under Alphabet. And again, having a human who is not in the car "do a lot of the driving" in a way that didn't cause constant accidents would require changing the laws of physics.
Once this is a business, people are going to start tracking how much time human drivers actually spend in control of the vehicles and pushing to have them spend less time doing it.
ROFL. Waymo has been selling rides commercially for about five years now. Where have you been? And they do track it. The amount of time that human drivers are in direct control of the vehicles is precisely zero except when they are testing something with a safety driver in the car.
At that point you're going to see the limits of self-driving cars. Instead of a driver taking over every 5 miles, you'll be looking at a driver taking over every 10, 20 or even 50 miles. But it's entirely possible, even likely, that the technology won't be any better and we'll just have more accidents.
As others have already told you, Waymo had only one intervention per 17,000 miles way back in 2023. Even in 2016, Waymo had just one intervention per 5,000 miles. In 2015, it was one intervention every 1,250 miles. I couldn't find data from 2013 or 2014, but I'm pretty sure they exceeded your once-per-50-miles mark long before California started making them report disengagements back in 2013.
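Simple arithmetic on the figures quoted above shows the trend:

```python
# Miles between interventions, per the figures quoted in this thread.
miles_per_intervention = {2015: 1_250, 2016: 5_000, 2023: 17_000}

years = sorted(miles_per_intervention)
for a, b in zip(years, years[1:]):
    factor = miles_per_intervention[b] / miles_per_intervention[a]
    print(f"{a} -> {b}: {factor:.1f}x more miles per intervention")

# Even the 2015 figure beats "once per 50 miles" by a factor of 25.
print(miles_per_intervention[2015] / 50)
```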
Re: (Score:2)
Waymo isn't even part of Google anymore, and hasn't been for almost a decade.
I'm on board with what else you're saying, but here you're making a very stupid distinction without a difference. Google renaming its parent company to Alphabet Inc doesn't make it any less Google. You may even note that Google and Alphabet Inc have the same CEO. Waymo is definitely a part of Google for all intents and purposes.
Re: (Score:2)
They don't stop in traffic. What happens is they don't know what to do, so they delegate control to a person who drives the car remotely.
False, they stop. They then display a message to the rider telling them that they are getting help and will be on their way soon. They connect to a remote control centre where a person is presented with the challenge, has to analyse the situation, and then issues a corrective order. They do *NOT* drive the car remotely. There's nothing seamless about this; it takes 30+ seconds. Remote control takes well over a minute in the very rare cases where it happens.
The point I was making is that right now Google is hyper-fixated on safety, so they are letting those human drivers do a lot of driving.
And the point you are making is completely and utter
Re: (Score:2)
Seamlessly handing over control without stopping, without the agent in the call center already knowing what's going on, what's around the car, etc.?
And yet there are so few accidents? Clearly these agents are superheroes.
Re: (Score:3)
Any proof of that? Or is that another one from the Library of Alexandria that exists up your ass?
I'm ambivalent about self-driving cars ... (Score:2)
But I don't see how this particular news item is a "Waymo" story or informs us about the topic at all, really.
Re: (Score:3)
As with EVs and AI, there are technical issues, but there are also many social, legal, perception, etc. issues to acceptance.
The first one is a reminder that just because a driverless car is involved in an accident, it doesn't mean it's at fault. Yeah, it's not so deep, but sadly it's relevant to cars in general: all sorts of urban planners blame all pedestrian and bike accidents on cars and then use the numbers to pass speed limits, parking bans, "traffic calming" and the whole raft of anti-car techniques.
An
Re: (Score:2)
As with EVs and AI, there are technical issues, but there are also many social, legal, perception, etc. issues to acceptance. The first one is a reminder that just because a driverless car is involved in an accident, it doesn't mean it's at fault.
That is literally the second paragraph of the summary.
Re:I'm ambivalent about self-driving cars ... (Score:4, Informative)
Basically, it's "any accident involving a Waymo is reportable news".
Reading the article, the 98 mph vehicle was a Tesla being driven by a 66-year-old man. It crashed into the STOPPED and empty Waymo vehicle with enough force to ping-pong the Waymo around. This resulted in the death of Romanenko, who was not in either of the mentioned vehicles; five other cars of unknown make were also hit.
The driver of the Tesla was uninjured enough to be hauled directly to jail. Wish it had been the opposite - early reports are that Romanenko was a pretty nice guy. Two other people were taken to the hospital with life-threatening injuries, and a dog was killed.
In the same report they mention that vandals wrecked a Waymo badly enough that it had to be towed off, but nobody was hurt. No word on whether video of the perps sufficient for identification was obtained.
Re: (Score:2)
So what you're saying is Tesla kills again [yahoo.com].
Bad stats. Tesla drivers, not Tesla? (Score:2)
“Most of these vehicles received excellent safety ratings, performing well in crash tests at the IIHS and NHTSA, so it’s not a vehicle design issue,” said the company's executive analyst Karl Brauer.
“The models on this list likely reflect a combination of driver behaviour and driving conditions, leading to increased crashes and fatalities.”
In this case it seems the driver of the Tesla was driving it like he stole it, so I can't exactly argue with this bit blaming the drivers of Teslas rather than the Teslas themselves.
Maybe Tesla self driving software is actually better... Than Tesla drivers? ;)
But even though Tesla is the deadliest brand, apparently, it doesn't produce the deadliest cars: Those would be the Hyundai Venue SUV, Chevy Corvette and Mitsubishi Mirage hatchback.
Looking up more information, [carscoops.com]
Tesla barely beats out Kia at 5.6 fatal accident
Re: (Score:2)
early reports are that Romanenko was a pretty nice guy
Not sure why you think this is relevant. Don't do the "think of the children" bullshit. He was a bystander; how good a person he was is irrelevant, and no one is going to come out and say "oh, that guy who just died was an arsehole!"
Heck "early reports" of most school shooting perps are they were pretty nice guys.
Leave this emotional rubbish out of posts. It's not relevant.
robots are in trouble (Score:4, Insightful)
> an unoccupied Waymo vehicle was attacked by a mob in Los Angeles last night
I suspect this is a part of any future that involves 'autonomous' machines. Just a taste of what humans can and will do.
Re: (Score:2)
I suspect this is a part of any future that involves 'autonomous' machines. Just a taste of what humans can and will do.
Nah. This is just the regular urban edgy bros showing off. Once self-driving cars become more widespread and people start to realize that they completely solve the public transit issues, these super-edgy oh-so-anarchist bros will be dealt with swiftly.
Re: (Score:2)
> realize that they completely solve the public transit issues
Have you been to LA? Sometimes it is not about drivers or accidents or human caused things. Sometimes it is just the number of vehicles that want to share the same space at the same time. When the road is full the road is full and it doesn't matter if there is a human or a machine in control of each vehicle.
Re: (Score:3)
I've lived in the SF Bay area for most of my life. The far left there is oddly reactionary, and they have a long history of violent and direct lashing out. They scream at techie commuter buses and vomit on them. They put death threats against techies on bumper stickers, etc. They damage or disrupt self-driving cars. They light buildings on fire when the buildings contain new apartments. They gather those little lime scooters and throw them in the bay or in lakes. They shit on things. They destroy constructi
Waymo safer due to actual process and evidence. (Score:4, Insightful)
Meanwhile you have other systems out there running perpetual "beta" deployments that aren't really getting any closer to being something meaningfully useful or safe.
Re: (Score:2)
It's no surprise that the Waymo vehicles are safer than human drivers because Waymo has taken a much different approach to their deployment. Everybody is targeting L5. Some have attempted to get there via some sort of iterative process covering random miles with human safety drivers. Waymo, instead, focused on getting an actual L4 system working in a limited geography. This allowed them to get much more meaningful feedback and continue to expand the area of safe operation. They don't expand the L4 area until they have enough safety data to predict with high confidence that the vehicles will be able to handle the expanded geography.
Meanwhile you have other systems out there running perpetual "beta" deployments that aren't really getting any closer to being something meaningfully useful or safe.
How they are “measuring” safety:
While NHTSA's crash data doesn't note whether self-driving vehicles may have been to blame, Waymo has previously noted that it only expects to pay out insurance liability claims for two previous collisions involving its driverless vehicles that resulted in injuries.
The first half of that statement demands to know why the NHTSA is choosing NOT to specifically and properly measure autonomous drivers.
The second half of that statement is more a testament to how bad/corrupt the US legal system is. By lawyers, for lawyers.
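For what it's worth, the measurement being asked for is easy enough to sketch (the incident and mileage numbers below are hypothetical, and the bound is a crude Poisson normal approximation, not anyone's official methodology): count at-fault incidents, divide by miles driven, and report an upper confidence bound so small fleets can't hide behind small samples.

```python
import math

# Sketch: at-fault incident rate with a crude one-sided upper bound.
# All input numbers are hypothetical, purely to show the calculation.

def upper_rate_per_million(incidents: int, miles: float, z: float = 1.645) -> float:
    """Approximate one-sided 95% upper bound on incidents per million miles,
    treating the incident count as Poisson (normal approximation)."""
    upper_count = incidents + z * math.sqrt(max(incidents, 1))
    return upper_count / miles * 1_000_000

fleet = upper_rate_per_million(incidents=2, miles=33_000_000)        # hypothetical
humans = upper_rate_per_million(incidents=500, miles=1_000_000_000)  # hypothetical

print(f"fleet at-fault upper bound: {fleet:.3f} per million miles")
print(f"human at-fault upper bound: {humans:.3f} per million miles")
```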
Re: (Score:2)
You're failing to understand the difference between what the NHTSA looks at and what they publish. The NHTSA investigates all self-driving incidents. They know what's happening. They just don't publish this detail in their aggregated dataset.
Not the first time (Score:1)
I mean, they act like these cars haven't done anything to anybody until this time when someone died, and heck, it wasn't even their fault. I'm pretty sure there was an AI car that saw a person in the road, collided with them, then stopped seeing the person in the road and kept driving because it didn't detect the person under the car.
So many of the articles are paywalled, but I did find Cruise Didn't Tell Anyone That A Woman Was Dragged 20 Feet After Being Pushed Into Robotaxi [jalopnik.com] and Woman dragged by... [sfstandard.com]
Re: (Score:2)
You're complaining about Cruise. Cruise is dead. They had their license pulled in no small part due to the exact incident you're complaining about.
Now that Cruise is gone, the level of newsworthy driverless car badness has dropped from accelerating into pedestrians down to things like getting into honking matches at 4am and spending 5 minutes driving around in circles.
Re: (Score:2)
I'm waiting for the inevitable comedy skit where a woman gets into a robotaxi and the thing spends the next five minutes pulling high-G donuts in a snowy parking lot.
Re: (Score:2)
No one really wants this 'self driving car' nonsense in the first place. It's not good at all, it never will be good, and it's just causing more problems than it claims to solve.
A lot of people want self-driving cars, actually.
I want one so I don't have to give up large parts of my day to commuting, and can instead work while I drive. (And no, public transit doesn't solve that. I would spend considerably more time walking to and from the nearest public transit stops on both ends than I would spend driving my car.)
I also want one for my mom in the short-to-medium term and for myself in the long term, because whether in one year or fifty, at some point, every single one of us will
More 10x-less grammar garbage (Score:2)
Waymo's self-driving vehicles faced about 90% fewer insurance claims relating to property damage and bodily injuries compared to human drivers
What is wrong with writers these days? Do they think that only big numbers grab attention, so they have to invert the sense so that big == small? Do they not understand that "only 10% of the claims of human drivers" is not only more correct but also clearer?
Re: (Score:2)
The one that gets me is when they write something like "10x less". There's not enough information in that statement to do the math if you take it as it's written. Yes, I KNOW they're trying to write "1/10th", but for fuck's sake, if you're writing copy for a living you ought to be able to do it properly.
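Trivial arithmetic makes the point:

```python
human_claims = 100.0  # arbitrary baseline

# "90% fewer" and "10% of" are the same, well-defined statement:
print(human_claims * (1 - 0.90))  # 10.0
print(human_claims * 0.10)        # 10.0

# "10x less" only works if you silently read it as "1/10th":
print(human_claims / 10)                 # 10.0   -- what the writer meant
print(human_claims - 10 * human_claims)  # -900.0 -- what the words literally say
```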
Re: (Score:2)
Unfortunately, 10x less Americans actually understand fractions.
This is how you do propaganda (Score:2)
Waymo's self-driving car "is not being blamed," notes NBC Bay Area.
Hide the part about no blame and trumpet the part about association, mixed with innuendo about blame. No untruths in the story, but with the same effect as a direct lie.
Bad Way of Measuring Autonomy. (Score:2)
While NHTSA's crash data doesn't note whether self-driving vehicles may have been to blame...
And why the fuck is it set up like that?!? Autonomous solutions ARE the new driver on the road, no matter how much bragging they want to do about a gazillion miles of collected “experience”. This is like dropping the legal driving age to 13 and then simply lumping the fatalities caused in with everyone else's. Wrong. You measure the FNG on the damn road. We need to know the accurate impact.
..Waymo has previously noted that it only expects to pay out insurance liability claims for two previous collisions involving its driverless vehicles that resulted in injuries.
That is a measure of how bad the legal system is. That is NOT a measure of how bad Autonomous is at drivin
Blame the car! (Score:1)
In other news, an unoccupied Waymo vehicle was attacked by a mob in Los Angeles last night, according to local news reports. "Video footage of the incident appears to show the vehicle being stripped of its door, windows shattered, and its Jaguar emblems removed. The license plate was also damaged, and the extent of the vandalism required the vehicle to be towed from the scene."
I say we blame the car!
(Like we do when yet another truck, er, spontaneously drives into a Christmas market.)