Waymo Issues Software and Mapping Recall After Robotaxi Crashes Into a Telephone Pole (theverge.com)
Waymo is issuing a voluntary software recall after one of its driverless vehicles collided with a telephone pole in Phoenix, Arizona, last month, the company said. The vehicle was damaged, but no passengers or bystanders were hurt in the incident. From a report: The company is filing the recall with the National Highway Traffic Safety Administration (NHTSA) after completing a software update to 672 vehicles -- the total number of driverless-capable vehicles in Waymo's fleet. The update corrects an error in the software that "assigned a low damage score" to the telephone pole, and updates its map to account for the hard road edge in the alleyway that was not previously included. This is Waymo's second recall ever, after two minor collisions prompted a recall of 444 vehicles last February. And it comes at a time of increased regulatory scrutiny of the driverless vehicle industry, in which federal investigators are probing almost all the major companies operating autonomous vehicles in the US.
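Waymo hasn't published the code involved, but the failure mode the recall describes, a "damage score" too low to trigger avoidance combined with a hard road edge missing from the map, can be sketched in miniature. Everything below (names, scores, the threshold) is invented for illustration:

```python
# Toy illustration (not Waymo's code): how a per-object "damage score"
# below an avoidance threshold, combined with a map missing a hard road
# edge, could let a planner proceed into a pole. All names, scores, and
# thresholds here are invented.
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str
    damage_score: float  # planner's estimate of harm from contact, 0..1

AVOID_THRESHOLD = 0.5    # hypothetical cutoff for hard avoidance

def must_avoid(obstacle: Obstacle, mapped_hard_edge: bool) -> bool:
    """Avoid if the object scores as damaging OR the map marks a hard edge."""
    return obstacle.damage_score >= AVOID_THRESHOLD or mapped_hard_edge

# The reported bug in miniature: the pole scored too low, and the alley's
# hard road edge was absent from the map, so neither check fired.
pole = Obstacle(kind="telephone_pole", damage_score=0.2)   # erroneously low
print(must_avoid(pole, mapped_hard_edge=False))            # False -> proceeds

# The recall fixes both inputs: score the pole correctly, map the edge.
print(must_avoid(Obstacle("telephone_pole", 0.9), mapped_hard_edge=True))  # True
```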
manual updates are needed to the map for changes to (Score:3)
Manual updates are needed to the map for changes to the roads?
So the cars can get lost if a road changes and the map data does not have the newest change in it?
Re: manual updates are needed to the map for changes (Score:2)
You only think it's stupid because you don't know or don't care about safety.
OTA updates for your phone might cause you to miss an appointment or phone call. OTA updates for your car might cause you to hit an obstacle or pedestrian. It is reasonable that they come with a higher level of scrutiny.
Re:manual updates are needed to the map for changes (Score:5, Interesting)
Manual updates are needed to the map for changes to the roads?
So the cars can get lost if a road changes and the map data does not have the newest change in it?
Looks like the car was driving in "an alley that was lined on both sides by wooden telephone poles [that] were not up on a curb but level with the road and surrounded with longitudinal yellow striping to define the viable path for vehicles." Even though Waymo is trying to update their maps, this is effectively an off-road, off-map scenario. More importantly, the fact that the AV has to depend on maps to stay in the right place means that the perception software failed. Of course, this scenario, an unlined alley with oddly marked telephone poles sitting directly on the road surface, is rare. It is practically impossible to include all such corner cases in training data.
These corner cases are the reason that Level-5 autonomy won't be available for decades. The probability distribution for corner cases has an extremely long tail, not only for perception but also for planning. AVs have been able to handle the 99% for a while, but the remaining fraction of a percent makes Level-5 cars impractical. That's why Level 2+ and Level 3 will be the target for the next few decades.
This is also the reason why Musk's stock-pumping PR about robotaxis has no chance of success in the time frame needed to maintain TSLA prices.
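The long-tail claim above can be made concrete with a toy calculation. It assumes a Zipf-like (1/rank) frequency distribution over scenario types, which is purely an illustrative assumption, not real AV data:

```python
# Toy long-tail model (assumed Zipf-like 1/rank frequencies; illustrative
# only, not real AV data). With this distribution, covering 99% of driving
# exposure requires handling the vast majority of distinct scenario types.
N = 1_000_000                          # hypothetical count of scenario types
weights = [1.0 / r for r in range(1, N + 1)]
total = sum(weights)

covered, needed = 0.0, 0
for w in weights:
    covered += w / total
    needed += 1
    if covered >= 0.99:
        break

print(f"Scenario types needed for 99% of exposure: {needed:,} of {N:,}")
# Prints roughly 870,000: the last 1% of exposure is scattered across
# the remaining ~130,000 rare corner cases.
```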
The Autonomous Fan. (Score:3)
And no one is going to ask what the autonomous vehicle was doing in that weird back alley?
Wait, don’t tell me. Someone told the car K.I.T.T. from Knight Rider was down the block doing voice impressions, and it just HAD to take a shortcut and hurry...
Re: (Score:2)
Who cares why it was there? Unless it is a road illegal for any member of the public to travel on, you need to assume in the training that it will be travelled on.
And 10 years from now, when the autonomous driverless cab decides to route you and your family through the shittiest part of town, where they hate tourists and those “newfangled” cars, all because that shady back alley was not closed that week due to another homicide, and that route was 17 feet shorter than the freeway?
At least pretend to understand why you should care. It’s not illegal for a Ferrari to take the off-road dirt path either. It’s just not smart.
Re: (Score:2)
These corner cases are the reason that Level-5 autonomy won't be available for decades.
If there are corner cases, the algorithm or processes are insufficient for dealing with the real world. I would not trust it. A human has no "corner cases". The closest you can get to that with humans are psychological issues such as "fear" or "uncertainty".
Uh... what? Waymo = Disney ride? (Score:3, Insightful)
So the vehicle crashed because it wasn't preprogrammed in advance to know about that one particular spot?
I thought these things were run by some AI that analyzed the world around it and made decisions, not a Disney World style track. This is disappointing; I believed they had a good AI for real-world driving.
This means they'd have to keep up with every change to every place they want to go? Construction, underground cable installs, weather (rain, snow, etc.), vandalism, other car wrecks, and so on can all change features and conditions. There's no way they can keep fixing their maps at that level of detail.
Re: (Score:2)
Yeah, I've been reading on this site regularly about how much more advanced Waymo is than Tesla, but if it turns out that 99% of that is simply that Waymo put drastically more effort into mapping the areas where it operates, it may be that while Waymo's cars are more capable in their geofenced areas, Tesla's version is better in the general sense.
And if we're going to introduce self-driving to the entire USA, much less the entire world (more or less), we need a version that works in the general case.
Re: (Score:2)
it may be that while Waymo's cars are more capable in their geofenced areas, Tesla's version is better in the general sense.
They've always talked about them using "HD Maps" or whatever. That right there makes it pretty much unusable outside of cities, because companies will never pay the expense to HD Map bumfuck nowhere.
Re:Uh... what? Waymo = Disney ride? (Score:4, Informative)
HDMaps + LIDAR *is* more advanced than what Tesla is doing. And that's reflected in the number of people Waymo has not killed, and reflected in the way that Waymo has licenses for fully self driving vehicles, while Tesla is being sued for calling their car "Fully Self Driving" when it is both legally and technically not.
Waymo's HD maps augment their driving system for weird and unexpected situations. If you have a look at the incident, the road is fucking weird. It looks like a road with telecom poles growing out of it. No curb, just a yellow line saying "don't hit me".
The funny thing is if you jump on youtube and do a search for FSD beta crash, the literal first video is a Tesla doing the same thing - except instead of a pole it was a bollard in the road.
Re: (Score:3)
HDMaps + LIDAR *is* more advanced than what Tesla is doing. And that's reflected in the number of people Waymo has not killed, and reflected in the way that Waymo has licenses for fully self driving vehicles, while Tesla is being sued for calling their car "Fully Self Driving" when it is both legally and technically not.
That's really not a fair comparison. Most of those occurred before the FSD beta feature set even became available. I think there has been a single FSD beta fatality so far. And bear in mind that Waymo still doesn't support any highway driving, which is where drivers are most likely to get killed in an accident. So you're comparing driver-assistance miles driven mostly on freeways with self-driving miles driven mostly in cities, and getting predictably different results.
Waymo's HD maps augment their driving system for weird and unexpected situations. If you have a look at the incident, the road is fucking weird. It looks like a road with telecom poles growing out of it. No curb, just a yellow line saying "don't hit me".
It's also a LIDAR return that shou
Re: (Score:2)
HDMaps + LIDAR *is* more advanced than what Tesla is doing. And that's reflected in the number of people Waymo has not killed, and reflected in the way that Waymo has licenses for fully self driving vehicles, while Tesla is being sued for calling their car "Fully Self Driving" when it is both legally and technically not.
Waymo's HD maps augment their driving system for weird and unexpected situations. If you have a look at the incident, the road is fucking weird. It looks like a road with telecom poles growing out of it. No curb, just a yellow line saying "don't hit me".
The funny thing is if you jump on youtube and do a search for FSD beta crash, the literal first video is a Tesla doing the same thing - except instead of a pole it was a bollard in the road.
Is Google still using the HDL-64 units? I did some aerial surveying with some of those and they were brilliant (unless it was cloudy; lidar and water don't mix, strangely enough).
The problem isn't with detection, it's with decision making. All a LIDAR or any other form of detection can do is tell you something is there. It doesn't tell you what it is... That takes time and compute power, so autonomous cars are programmed just to stop and wait for a human to determine what it is when they detect something...
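A minimal sketch of that stop-and-wait fallback, with all names and the confidence threshold hypothetical:

```python
# Minimal sketch of the "stop and wait for a human" fallback described
# above. All names and the confidence threshold are hypothetical.
from typing import Optional

CONFIDENCE_FLOOR = 0.8   # assumed: below this, the class label is unusable

def decide(obstacle_in_path: bool, label: Optional[str], confidence: float) -> str:
    """Detection says whether something is there; classification says what.
    When the 'what' is missing or unreliable, the safe fallback is to stop."""
    if not obstacle_in_path:
        return "proceed"
    if label is not None and confidence >= CONFIDENCE_FLOOR:
        return f"plan around {label}"
    return "stop and request remote assistance"

print(decide(True, None, 0.0))              # lidar return, no idea what it is
print(decide(True, "traffic_cone", 0.95))   # confident class -> plan around it
```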
Re: (Score:2)
Not really. The world is a pretty finite place relative to current processing abilities - and the world's roads, more so.
For example, a taxi service in the US could state flatly "we won't go on any street not on google maps," and it wouldn't hurt their business at all.
In any case this story isn't about that. The cars obviously are supposed to see
Re: (Score:3)
The world may be finite, but it keeps changing. And the only way to find out about those changes is to go look at them. What you saw yesterday may not be true today. (Well, it *usually* is, but usually isn't sufficient.)
Maps should be "reference materials" that you look up to find out how to get from here to there. But they shouldn't be expected to tell you about the box springs in the road.
Re: (Score:2)
For example, a taxi service in the US could state flatly "we won't go on any street not on google maps," and it wouldn't hurt their business at all.
So it can't handle some parking lots / strip malls?
May or may not work at airports?
May or may not try to use a loading dock at some buildings?
Re: (Score:2)
Here's their info on taking a Waymo to the Phoenix airport. They do indicate specific pickup/dropoff locations, so I guess they don't let it pull over just anywhere at the airport:
https://support.google.com/way... [google.com]
Re: (Score:2)
And are they going to map each airport, and how often will they update that map data?
Re: (Score:2)
You have to take into consideration that
- Waymo started in 2009; Tesla shipped Autopilot in 2015.
- Tesla drives on highways, which is the easy part, while Waymo drives inside cities, which is the hard part.
- Tesla has had hundreds of crashes and dozens of deaths. Waymo has had a few minor crashes, and it has killed one dog that ran under the car.
- This case does not tell us that Waymo puts drastically more effort into mapping. It could as well mean that mapping is automatic, but this particular telephone pole ha
Re: (Score:2)
Some more differences:
1. There are way more FSD Teslas on the road than Waymo vehicles. As somebody else noted, Tesla FSD actually has a lower accident rate per mile. So more accidents, but only because of more miles. Also, highway accidents tend to be much more deadly.
2. FSD Teslas are in the hands of customers. All of Waymo's vehicles are directly owned by Waymo.
3. Not running into a telephone pole should be a basic thing for any self driving car.
Re: (Score:2)
Yeah, I've been reading on this site regularly about how much more advanced Waymo is than Tesla, but if it turns out that 99% of that is simply that Waymo put drastically more effort into mapping the areas where it operates, it may be that while Waymo's cars are more capable in their geofenced areas, Tesla's version is better in the general sense.
Well, depending on how you look at it, that may well be the case. FSD beta has an accident rate of 0.31 per million miles [notateslaapp.com] (city driving only) versus 0.41 per million miles [waymo.com] for Waymo.
Of course, that's also not a fair comparison, because with FSD beta, there's a person behind the wheel ready to intervene when it does something utterly bananas. And the intervention rate is still orders of magnitude too high to truly consider it to be self-driving.
Without turning both systems loose in the real world under si
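The supervision confound described above can be made concrete with back-of-the-envelope arithmetic. The two rates are the ones cited in this thread; the intervention figures are invented purely to show the direction of the adjustment:

```python
# Back-of-the-envelope sketch of the supervision confound. The two rates
# are the ones cited above; the intervention figures are invented purely
# to show the direction of the adjustment.
fsd_rate = 0.31       # FSD beta accidents per million miles (cited above)
waymo_rate = 0.41     # Waymo accidents per million miles (cited above)

# Hypothetical: an intervention every 10,000 miles, with 1 in 50
# interventions actually preventing a crash.
interventions_per_m_miles = 1_000_000 / 10_000                  # = 100
crashes_prevented_per_m_miles = interventions_per_m_miles / 50  # = 2.0

unsupervised_fsd = fsd_rate + crashes_prevented_per_m_miles
print(f"Adjusted FSD rate: {unsupervised_fsd:.2f} vs Waymo: {waymo_rate:.2f}")
# Under these invented numbers, the supervised system's headline advantage
# vanishes once the human safety driver is removed.
```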
they can take the Disney ride we are not responsib (Score:2)
They can take the Disney-ride "we are not responsible" part and add it to the EULA, and also say the rider/renter is on the hook for any damage to the car, tolls, tickets, and claims made.
Re: (Score:2)
HD maps are used to map out obstacles on non-standard roads. There's nothing normal about the area where the Waymo crashed. It looks like the poles are in the actual road. Really bizarre. Incidentally, the Waymo did notice the pole in the end and braked. The crash happened at ... 8mph.
There's nothing "Disney World style track" about this either, unless you consider an entire major city a Disney World style track. HD maps are generated in real time on the fly, and sometimes edited when the software makes a mistake.
Software companies (Score:5, Insightful)
Re: (Score:2)
(Uncommon sense, +5)
spirit airlines passes the lower cost of using be (Score:2)
Spirit Airlines passes the lower cost of using beta software on to passengers.
Or you can be on an airline that has higher costs due to buying the fully tested software that costs more.
Re: (Score:1, Interesting)
Neither should average humans. Human drivers are too often morons.
The bots only need to be slightly better than people, not perfect.
Re: (Score:1)
If those are the actual practical tradeoffs, then YES.
Re: (Score:1)
I'm not following you. It would be great if bot-cars were super-safe, but we don't yet have super-safe tech.
Re: (Score:1)
Sorry, I'm not following. Early jets had notable problems, and with experience they were ironed out. You can't get experience until you get experience.
Let me restate this all: I'm okay with bot-cars being introduced when they reach a rate of approximately 20% better than human drivers. That's my vote. If it's "wrong", too bad; I'll live in my Wrongville and you live in your Pedanticville.
Re: (Score:1)
Become a formal widespread commercial service.
Re: (Score:2)
That's not the way the legal system works. And, given the examples of unregulated companies, it's not the way it should work. If it works that way they'll cut corners.
Re: (Score:1)
If we have too many restrictions, we'll never get bot-cars. I'm getting up there in age, so I'd like a bot-car to get around on my own when I can no longer drive.
> If it works that way they'll cut corners.
No, they'd still be liable, just like a bad human driver. The safer they are, the fewer lawsuits they have to pay out, so it's not like there are zero incentives to be safe.
Re: (Score:2)
You'd think so, wouldn't you? But in reality, humans are (wrongly) trusted to drive cars, and automation is not. Even if autonomous vehicles are, statistically, responsible for half as many accidents as humans(*), all it takes is one high-profile accident in which "a human wouldn't have done that" and you'll convince the majority of the populace that autonomous vehicles are unsafe. Because that's the same sort of thought process that leads
Re:Software companies (Score:5, Interesting)
I've done work that affected public safety. I tested until I was certain it was solid work, then had a colleague try to make it fail. She always did, the first couple of times. Once I got past her, it went to an entire test team who spent as long as it took to test every function under every conceivable circumstance.
My original work has been ported to new languages and used on at least two continents over the last decade or so, and to the best of my knowledge it has never failed in a way that caused a safety hazard. ...But my work was in no way as potentially dangerous as a self driving system for a massive chunk of metal moving at significant velocities in proximity to people.
Re: (Score:1)
Don't tell me, this is after you were let go by Boeing.
Re: (Score:2)
I worked in public safety, not aerospace.
Re: (Score:3)
should not be doing anything related to safety.
Why not? Even with this crash, Waymo's accident record far exceeds that of any other company or human driver. Take a step back and think a bit before you knee-jerk your way to conclusions.
If you do something in the engineering world, you have to prove that it's safe, and usually have someone be certified if you're going to potentially kill people.
This wouldn't have killed anyone. Not at 8mph. Waymo have driven collectively millions of miles and have yet to kill anyone. Yet you're jumping straight to conclusions.
The attitudes of most software devs is iterative design and that doesn't work when you have people involved because if you have a bug, you kill someone.
The attitude of most Slashdotters is to ignorantly talk out of their arse. Yet somehow you still get modded up for it. Show Waymo the respect moderators show
Re: (Score:2)
Why are humans allowed to crash into telephone poles? Humans do it all the time, and there's no investigation into the safety of humans driving? There is no need to prove self-driving cars are "safe", they are already safer than humans and that's good enough. I don't think we should bother to make them much safer because you guys say 40,000 deaths per year just in the USA is an acceptable number right? A mere 40,000 people dying shouldn't bother anybody.
Re: (Score:2)
FWIW, it's not clear to me that they are safer under all conditions. But then I also think a lot of people shouldn't be driving. (Including me. And I take that seriously enough that I pulled my license. So you can guess that I REALLY want a self-driving car.)
OTOH, I don't think relying on maps is the right approach. Local conditions are always subject to unannounced changes. It's necessary to observe the local conditions and act on that...with maps as a guide to the more global situation.
Re: (Score:2)
I disagree.
Let's say that human-operated systems would kill 42,000 people per year (the US death toll), and let's say that autonomous systems would kill 700 per year (estimated based on Tesla, as Waymo has no fatalities). You now have two options:
A) Kill 42000 people
B) Kill 700 people
You are suggesting that we should pick option A.
As a wise man once said... (Score:2)
It's amazing (Score:5, Insightful)
That they work as well as they do. Nearly 700 autonomous vehicles and only a couple crashes?
Re: (Score:2)
40,000 people dying per year just in the USA... 1 million worldwide... is acceptable to you? How many times have humans crashed into telephone poles and actual people? https://www.youtube.com/result... [youtube.com]
When will people realize that human driving won't ever work as intended? The sooner everyone realizes that, the better off everyone will be.
Should have taken a Johnnycab. (Score:1)
detailed mapping (Score:2)
Note that these AVs require very detailed mapping of the physical world. That has always been the case, since day 1. The question is how much mapping they can do themselves, and how much must be done offline.
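Waymo's actual map schema is proprietary, but a toy record, with every field invented, at least shows the level of detail "very detailed mapping" implies:

```python
# Toy illustration of what one HD-map element might record (every field
# invented; real AV map schemas are proprietary and far richer).
from dataclasses import dataclass

@dataclass
class MapElement:
    element_id: str
    kind: str                                # e.g. "hard_road_edge", "pole"
    polyline: list[tuple[float, float]]      # (lat, lon) vertices
    drivable: bool = False                   # may the planner occupy it?
    last_surveyed: str = ""                  # staleness matters: roads change

# The missing datum in this incident, per the recall description: the
# alley's hard road edge, which the software update adds to the map.
alley_edge = MapElement(
    element_id="phx-alley-0042",             # made-up ID
    kind="hard_road_edge",
    polyline=[(33.4484, -112.0740), (33.4485, -112.0741)],
    drivable=False,
    last_surveyed="2024-06",
)
print(alley_edge.kind, "drivable:", alley_edge.drivable)
```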
Humans crashing into telephone poles (Score:3)
Why no investigation and banning of humans from driving cars? https://www.youtube.com/result... [youtube.com]
I don't get it, not rocket science (Score:3)
Detecting a telephone pole with lidar is at least two-decade-old tech that doesn't require image recognition. How did they flunk that?
What if it were a skinny stationary person wearing brown clothes?
Stop calling an OTA update a recall (Score:2)