Transportation Software

Waymo Issues Software and Mapping Recall After Robotaxi Crashes Into a Telephone Pole (theverge.com)

Waymo is issuing a voluntary software recall after one of its driverless vehicles collided with a telephone pole in Phoenix, Arizona, last month, the company said. The vehicle was damaged, but no passengers or bystanders were hurt in the incident. From a report: The company is filing the recall with the National Highway Traffic Safety Administration (NHTSA) after completing a software update to 672 vehicles -- the total number of driverless-capable vehicles in Waymo's fleet. The update corrects an error in the software that "assigned a low damage score" to the telephone pole, and updates its map to account for the hard road edge in the alleyway that was not previously included. This is Waymo's second recall ever, after two minor collisions prompted a recall of 444 vehicles last February. And it comes at a time of increased regulatory scrutiny of the driverless vehicle industry, in which federal investigators are probing almost all the major companies operating autonomous vehicles in the US.
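
The failure chain described here -- perception saw the pole, but the planner scored hitting it as cheap -- can be pictured with a toy gate like the one below. This is a hypothetical Python sketch for illustration only; the object types, scores, and threshold are invented and are not Waymo's actual software.

    # Hypothetical sketch of a "damage score" gate in a collision-avoidance
    # planner. Names, scores, and the threshold are invented for illustration.
    DAMAGE_SCORES = {
        "pedestrian": 1.00,
        "vehicle": 0.90,
        "telephone_pole": 0.85,  # per the recall, this was erroneously low
        "plastic_bag": 0.05,
    }
    AVOIDANCE_THRESHOLD = 0.50  # plan around anything scoring at least this

    def must_avoid(object_type: str) -> bool:
        """Treat unknown objects as maximally dangerous (score 1.0)."""
        return DAMAGE_SCORES.get(object_type, 1.0) >= AVOIDANCE_THRESHOLD

    # With the reported bug, imagine DAMAGE_SCORES["telephone_pole"] = 0.10:
    # must_avoid("telephone_pole") becomes False and the planner treats the
    # pole like soft debris it may drive through.
    print(must_avoid("telephone_pole"))  # True with the corrected score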
  • Manual updates are needed to the map for changes to the roads?

    So the cars can get lost if a road changes and the map data does not have the newest change in it?

    • OTA Updates are also considered recalls (Stupid - I know...)
      • You only think it's stupid because you don't know or don't care about safety.

        OTA updates for your phone might cause you to miss an appointment or phone call. OTA updates for your car might cause you to hit an obstacle or pedestrian. It is reasonable that they come with a higher level of scrutiny.

    • by larryjoe ( 135075 ) on Wednesday June 12, 2024 @11:02AM (#64543813)

      Manual updates are needed to the map for changes to the roads?

      So the cars can get lost if a road changes and the map data does not have the newest change in it?

      Looks like the car was driving in "an alley that was lined on both sides by wooden telephone poles [that] were not up on a curb but level with the road and surrounded with longitudinal yellow striping to define the viable path for vehicles." Even though Waymo is trying to update their maps, this is effectively an off-road, off-map scenario. More importantly, the fact that the AV has to depend on maps to stay in the right place means that the perception software failed. Of course, this scenario of an unlined alley with oddly marked telephone poles sitting directly on the road surface is rare. It is absolutely impossible to include all such corner cases in training data.

      These corner cases are the reason that Level-5 autonomy won't be available for decades. The probability distribution for corner cases has an extremely long tail, not only for perception but also for planning. AVs have been able to handle the 99% for a while, but the remaining fraction of a percent makes Level-5 cars impractical. That's why Level 2+ and Level 3 will be the target for the next few decades.

      This is also the reason why Musk's stock-pumping PR about robotaxis has no chance of success in the time frame needed to maintain TSLA prices.
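
      To put rough numbers on the long-tail point above: even a system that handles 99.99% of situations produces failures at an alarming clip once it runs at fleet scale. A back-of-the-envelope Python sketch, with every figure assumed rather than measured:

          # Back-of-the-envelope: why "handles 99% of situations" is nowhere
          # near enough at fleet scale. All numbers are assumptions.
          situations_per_mile = 10           # assumed decisions per mile
          fleet_miles_per_year = 50_000_000  # assumed annual fleet mileage

          for handled in (0.99, 0.9999, 0.999999):
              unhandled = situations_per_mile * fleet_miles_per_year * (1 - handled)
              print(f"handles {handled:.4%} -> ~{unhandled:,.0f} unhandled events/year")

          # handles 99.0000% -> ~5,000,000 unhandled events/year
          # handles 99.9900% -> ~50,000 unhandled events/year
          # handles 99.9999% -> ~500 unhandled events/year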

      • And no one is going to ask what the autonomous vehicle was doing in that weird back alley?

        Wait, don’t tell me. Someone told the car K.I.T.T. from Knight Rider was down the block doing voice impressions, and it just HAD to take a shortcut and hurry...

        • Who cares why it was there? Unless it is a road illegal for any member of the public to travel on, you need to assume in the training that it will be travelled on.
          • Who cares why it was there? Unless it is a road illegal for any member of the public to travel on, you need to assume in the training that it will be travelled on.

            And 10 years from now when the autonomous driverless cab decides to route you and your family through the shittiest part of town where they hate tourists and those “newfangled” cars, all because that shady back alley was not closed that week due to another homicide, and that route was 17 feet shorter than the freeway?

            At least pretend to understand why you should care. It’s not illegal for a Ferrari to take the offroad dirt path either. It’s just not smart.

      • These corner cases are the reason that Level-5 autonomy won't be available for decades.

        If there are corner cases, the algorithm or processes are insufficient for dealing with the Real World. I would not trust it. A human has no "corner cases". The closest you can get to that with humans are psychological issues such as "fear" or "uncertainty".

  • by iAmWaySmarterThanYou ( 10095012 ) on Wednesday June 12, 2024 @10:15AM (#64543597)

    So the vehicle crashed because it wasn't preprogrammed in advance to know about that one particular spot?

    I thought these things were run by some AI that analyzed the world around it and made decisions, not a Disney World style track. This is disappointing, I believed they had a good AI for real world driving.

    This means they'd have to keep up with every change to every place they want to go? Construction, underground cable installs, weather like rain and snow, vandalism, other car wrecks, and so on can all change features and conditions. There's no way they can keep fixing their maps at that level of detail.

    • Yeah, I've been reading about how much more advanced Waymo is over Tesla on this site on a regular basis, but if it turns out that 99% of that is simply that Waymo put drastically more effort into mapping the areas where it operates, it may be a case where Waymo cars are more capable in their geofenced areas while Tesla's version is better in the general sense.

      And if we're going to introduce self-driving to the entire USA, much less the entire world (more or less), we need a version that works in the general sense.

      • may be a case where Waymo cars are more capable in their geofenced areas while Tesla's version is better in the general sense.

        They've always talked about them using "HD Maps" or whatever. That right there makes it pretty much unusable outside of cities, because companies will never pay the expense to HD Map bumfuck nowhere.

      • by thegarbz ( 1787294 ) on Wednesday June 12, 2024 @12:09PM (#64543987)

        HD Maps + LIDAR *is* more advanced than what Tesla is doing. And that's reflected in the number of people Waymo has not killed, and in the way that Waymo has licenses for fully self-driving vehicles, while Tesla is being sued for calling their car "Full Self-Driving" when it is both legally and technically not.

        Waymo's HD maps augment their driving system for weird and unexpected situations. If you have a look at the incident, the road is fucking weird. It looks like a road with telecom poles growing out of it. No curb, just a yellow line saying "don't hit me".

        The funny thing is if you jump on youtube and do a search for FSD beta crash, the literal first video is a Tesla doing the same thing - except instead of a pole it was a bollard in the road.

        • by dgatwood ( 11270 )

          HD Maps + LIDAR *is* more advanced than what Tesla is doing. And that's reflected in the number of people Waymo has not killed, and in the way that Waymo has licenses for fully self-driving vehicles, while Tesla is being sued for calling their car "Full Self-Driving" when it is both legally and technically not.

          That's really not a fair comparison. Most of those occurred before the FSD beta feature set even became available. I think there has been a single FSD beta fatality so far. And bear in mind that Waymo still doesn't support any highway driving, which is where drivers are most likely to get killed in an accident. So you're comparing driver-assistance miles driven mostly on freeways with self-driving miles driven mostly in cities, and getting predictably different results.

          Waymo's HD maps augment their driving system for weird and unexpected situations. If you have a look at the incident, the road is fucking weird. It looks like a road with telecom poles growing out of it. No curb, just a yellow line saying "don't hit me".

          It's also a LIDAR return that shou

        • by mjwx ( 966435 )

          HD Maps + LIDAR *is* more advanced than what Tesla is doing. And that's reflected in the number of people Waymo has not killed, and in the way that Waymo has licenses for fully self-driving vehicles, while Tesla is being sued for calling their car "Full Self-Driving" when it is both legally and technically not.

          Waymo's HD maps augment their driving system for weird and unexpected situations. If you have a look at the incident, the road is fucking weird. It looks like a road with telecom poles growing out of it. No curb, just a yellow line saying "don't hit me".

          The funny thing is if you jump on youtube and do a search for FSD beta crash, the literal first video is a Tesla doing the same thing - except instead of a pole it was a bollard in the road.

          Is Google still using the HDL-64 units? I did some aerial surveying with some of those and they were brilliant (unless it was cloudy; lidar and water don't mix, strangely enough).

          The problem isn't with detection, it's with decision making. All a LIDAR or any other form of detection can do is tell you something is there. It doesn't tell you what it is... That takes time and compute power, so autonomous cars are programmed just to stop and wait for a human to determine what it is when they detect something...
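
          That stop-and-wait fallback needs no classification at all, which is easy to sketch: flag any cluster of returns inside the planned corridor and brake. A toy Python illustration; the geometry and thresholds are invented:

              # Toy "detect, don't classify" gate: if enough lidar returns fall
              # inside the planned driving corridor, stop. Numbers invented.
              CORRIDOR_HALF_WIDTH = 1.2  # metres either side of the path
              STOP_RANGE = 15.0          # metres ahead worth braking for
              MIN_POINTS = 5             # ignore stray returns (sensor noise)

              def should_stop(points):
                  """points: (x, y) lidar returns in vehicle frame, x forward."""
                  hits = [(x, y) for (x, y) in points
                          if 0.0 < x < STOP_RANGE and abs(y) < CORRIDOR_HALF_WIDTH]
                  return len(hits) >= MIN_POINTS

              # A pole 8 m ahead returns a tight column of hits; the car stops
              # without ever knowing it is a pole -- classification (or a
              # remote human) comes later.
              print(should_stop([(8.0, -0.3)] * 12))  # True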

      • And if we're going to introduce self-driving to the entire USA, much less the entire world (more or less), we need a version that works in the general sense.

        Not really. The world is a pretty finite place relative to current processing abilities - and the world's roads, more so.

        For example, a taxi service in the US could state flatly "we won't go on any street not on google maps," and it wouldn't hurt their business at all.

        In any case this story isn't about that. The cars obviously are supposed to see

        • by HiThere ( 15173 )

          The world may be finite, but it keeps changing. And the only way to find out about those changes is to go look at them. What you saw yesterday may not be true today. (Well, it *usually* is, but usually isn't sufficient.)

          Maps should be "reference materials" that you look up to find out how to get from here to there. But they shouldn't be expected to tell you about the box springs in the road.

        • For example, a taxi service in the US could state flatly "we won't go on any street not on google maps," and it wouldn't hurt their business at all.

          So it can't handle some parking lots / strip malls?
          May or may not work at airports?
          May or may not try to use a loading dock at some buildings?

          • Just speculation on my part, but I would be surprised if a Waymo will venture into unknown territory. But an airport would certainly be an example of a well-mapped area, at least the public areas.

            Here's their info on taking a Waymo to the Phoenix airport. They do indicate specific pickup/dropoff locations so I guess they don't let it pull over just anywhere at the airport:

            https://support.google.com/way... [google.com]

            • But it's only because they knew people would go to the airport, so they took the time to map the airport accurately. They won't be able to do that for 'the entire world'.
      • by dvice ( 6309704 )

        You have to take into consideration that:
        - Waymo started in 2009; Tesla got Autopilot in 2015.
        - Tesla is driving on highways, which is the easy part, while Waymo is driving inside cities, which is the hard part.
        - Tesla has had hundreds of crashes and dozens of deaths. Waymo has had a few minor crashes, and it has killed one dog that ran under the car.
        - This case does not tell us that Waymo puts drastically more effort into mapping. It could as well mean that mapping is automatic, but this particular telephone pole hadn't been marked yet.

        • So how many other poles are there in America (or even Phoenix) that haven't been "marked" yet? There will need to be an accident at every single one of those spots before they can fix it?
        • Some more differences:
          1. There are way more FSD Teslas on the road than Waymo vehicles. As somebody else noted, Tesla FSD actually has a lower accident rate per mile, so more accidents, but only because of more miles (see the rate arithmetic sketched after this list). Also, highway accidents tend to be much more deadly.
          2. FSD Teslas are in the hands of customers. All of Waymo's vehicles are directly owned by Waymo.
          3. Not running into a telephone pole should be a basic thing for any self driving car.
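
          The rate-versus-count arithmetic in point 1 is worth making explicit. A toy Python calculation; the per-million-mile rates echo figures cited elsewhere in this thread, while the mileage figures are invented:

              # Rate vs. raw count, with invented mileage: a bigger fleet at a
              # lower per-mile rate can still produce far more total incidents.
              fleets = {
                  "lower rate, huge mileage":   (0.31, 500),  # acc/M mi, M mi/yr
                  "higher rate, small mileage": (0.41, 20),
              }
              for name, (rate, million_miles) in fleets.items():
                  print(f"{name}: ~{rate * million_miles:.0f} incidents/year")

              # lower rate, huge mileage: ~155 incidents/year
              # higher rate, small mileage: ~8 incidents/year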

      • by dgatwood ( 11270 )

        Yeah, I've been reading about how much more advanced Waymo is over Tesla on this site on a regular basis, but if it turns out that 99% of that is simply that Waymo put drastically more effort into mapping the areas where it operates, it may be a case where Waymo cars are more capable in their geofenced areas while Tesla's version is better in the general sense.

        Well, depending on how you look for it, that may well be the case. FSD beta has an accident rate of 0.31 per million miles [notateslaapp.com] (city driving only) versus 0.41 per million miles [waymo.com] for Waymo.

        Of course, that's also not a fair comparison, because with FSD beta, there's a person behind the wheel ready to intervene when it does something utterly bananas. And the intervention rate is still orders of magnitude too high to truly consider it to be self-driving.

        Without turning both systems loose in the real world under similar conditions, there's no way to know for sure.

    • Where I live, snow-clearing equipment is out frequently in the winter. They leave piles of snow in the roads when they are working, and it is your fault if you hit them. There are ice ruts that can turn a car sideways in a second that need to be navigated. It's also disappointing that they cannot recognise an animal in the road and anticipate another animal, or anticipate a child running after a ball in the road.
    • They can take the Disney-ride "we are not responsible" part and add it to the EULA, and also say the rider/renter is on the hook for any damage to the car, tolls, tickets, and claims made.

    • HD maps are used to map out obstacles in non-standard roads. There's nothing normal about the area where Waymo crashed. It looks like the poles are in the actual road. Really bizarre. Incidentally the Waymo did notice the pole in the end and braked. The crash happened at ... 8mph.

      There's nothing "Disney world style track" about this either. Unless you consider an entire major city a Disney world style track. HD maps are generated in real time on the fly, and sometimes edited when the software makes a mistak

  • Software companies (Score:5, Insightful)

    by wakeboarder ( 2695839 ) on Wednesday June 12, 2024 @10:19AM (#64543613)
    should not be doing anything related to safety. If you do something in the engineering world, you have to prove that it's safe, and usually someone has to be certified if you're going to potentially kill people. The attitude of most software devs is iterative design, and that doesn't work when you have people involved, because if you have a bug, you kill someone.
    • (Uncommon sense, +5)

    • Re: (Score:1, Interesting)

      by Tablizer ( 95088 )

      Software companies should not be doing anything related to safety

      Neither should average humans. Human drivers are too often morons.

      The bots only need to be slightly better than people, not perfect.

      • That's like saying a toaster or an oven needs to be only slightly safer than cooking over an open fire.
        • by Tablizer ( 95088 )

          If those are the actual practical tradeoffs, then YES.

          • You think it would be practical for an oven to set fire to slightly fewer houses than open fires?
            • by Tablizer ( 95088 )

              I'm not following you. It would be great if bot-cars were super-safe, but we don't yet have super-safe tech.

              • And flying tech wouldn't BE super safe if they hadn't insisted on it being super safe before the public was exposed to it.
                • by Tablizer ( 95088 )

                  Sorry, I'm not following. Early jets had notable problems, and with experience they were ironed out. You can't get experience until you get experience.

                  Let me restate this all: I'm okay with bot-cars being introduced when they reach a rate of approximately 20% better than human drivers. That's my vote. If it's "wrong", too bad, I'll live in my Wrongville and you live your Pedanticville.

      • by HiThere ( 15173 )

        That's not the way the legal system works. And, given the examples of unregulated companies, it's not the way it should work. If it works that way they'll cut corners.

        • by Tablizer ( 95088 )

          If we have too many restrictions, we'll never get bot-cars. I'm getting up there in age, so would like a bot car to get around on my own when I can no longer drive.

          > If it works that way they'll cut corners.

          No, they'd still be liable, just like a bad human driver. The safer they are, the fewer lawsuits they have to pay out, so it's not like there are zero incentives to be safe.

      • The bots only need to be slightly better than people, not perfect.

        You'd think so, wouldn't you? But in reality, humans are (wrongly) trusted to drive cars, and automation is not. Even if autonomous vehicles are, statistically, responsible for half as many accidents as humans(*), all it takes is one high-profile accident in which "a human wouldn't have done that" and you'll convince the majority of the populace that autonomous vehicles are unsafe. Because that's the same sort of thought process that leads

    • by Baron_Yam ( 643147 ) on Wednesday June 12, 2024 @10:59AM (#64543805)

      I've done work that affected public safety. I tested until I was certain it was solid work, then had a colleague try to make it fail. She always did, the first couple of times. Once I got past her, it went to an entire test team who spent as long as it took to test every function under every conceivable circumstance.

      My original work has been ported to new languages and used on at least two continents over the last decade or so, and to the best of my knowledge it has never failed in a way that caused a safety hazard. ...But my work was in no way as potentially dangerous as a self driving system for a massive chunk of metal moving at significant velocities in proximity to people.

    • should not be doing anything related to safety.

      Why not? Even with this crash, Waymo's safety record far exceeds that of any other company or of human drivers. Take a step back and think a bit before you knee-jerk your way to conclusions.

      If you do something in the engineering world, you have to prove that it's safe and usually have someone be certified if your going to potentially kill people.

      This wouldn't have killed anyone. Not at 8 mph. Waymo's vehicles have collectively driven millions of miles and have yet to kill anyone. Yet you're jumping straight to conclusions.

      The attitudes of most software devs is iterative design and that doesn't work when you have people involved because if you have a bug, you kill someone.

      The attitude of most Slashdotters is to ignorantly talk out of their arse. Yet somehow you still get modded up for it. Show Waymo the respect moderators show

  • Why are humans allowed to crash into telephone poles? Humans do it all the time, and there's no investigation into the safety of humans driving. There is no need to prove self-driving cars are "safe"; they are already safer than humans, and that's good enough. I don't think we should bother to make them much safer, because you guys say 40,000 deaths per year just in the USA is an acceptable number, right? A mere 40,000 people dying shouldn't bother anybody.

      • by HiThere ( 15173 )

        FWIW, it's not clear to me that they are safer under all conditions. But then I also think a lot of people shouldn't be driving. (Including me. And I take that seriously enough that I pulled my license. So you can guess that I REALLY want a self-driving car.)

        OTOH, I don't think relying on maps is the right approach. Local conditions are always subject to unannounced changes. It's necessary to observe the local conditions and act on that...with maps as a guide to the more global situation.

      • Well way more people died of covid and people didn't care about that, so...
    • by dvice ( 6309704 )

      I disagree.
      Let's say that human-operated systems kill 42,000 people per year (the US death toll) and that autonomous systems would kill 700 per year (estimated from Tesla, as Waymo has no fatalities). You now have two options:
      A) Kill 42000 people
      B) Kill 700 people

      You are suggesting that we should pick option A.

      • Yeah, that is a good argument for Waymo. Right now, if we scaled up their operations, we'd be killing hundreds or thousands of people. The cars need to work out of the box. No more beta testing and killing people. Most people that are not drunk can avoid telephone poles.
      • Except you need to compare to people killed in Phoenix (and specifically where Waymo drives) by humans, not the number of people in America. Also, you have to consider that 2/3 of traffic fatalities happen where there is ice and snow. So out of the 42,000 that die in America, only 14,000 are being addressed by current Waymo cars, because they cannot drive in ice and snow. And then figure out how many of those 14,000 are actually in the places where Waymo drives and they are not preventing many. May
  • "That's one lousy pilot."
  • It's amazing (Score:5, Insightful)

    by darth_borehd ( 644166 ) on Wednesday June 12, 2024 @10:53AM (#64543779)

    That they work as well as they do. Nearly 700 autonomous vehicles and only a couple crashes?

    • They basically did a super accurate pre-scan of all the streets they drive on. I'm surprised they would hit anything stationary.
  • Unless they were trying to run from killers.
  • Note that these AVs require very detailed mapping of the physical world. That has always been the case, since day 1. The question is how much mapping they can do themselves, and how much must be done offline.

  • by backslashdot ( 95548 ) on Wednesday June 12, 2024 @12:17PM (#64544015)

    Why no investigation and banning of humans from driving cars? https://www.youtube.com/result... [youtube.com]

    • Because just because one human hit a telephone pole doesn't mean all humans will. And generally someone in that bad of an accident will be investigated.
  • by Tablizer ( 95088 ) on Wednesday June 12, 2024 @01:27PM (#64544235) Journal

    Detecting a telephone pole with lidar is at least two-decade-old tech that doesn't require image recognition. How did they flunk that?

    What if it were a skinny stationary person wearing brown clothes?

  • It's so ridiculous to call an OTA update a recall, as no car is actually being recalled. But I get it, putting 'recall' in the title sounds much more exciting. Keep "recall" for actual recalls, when a hardware problem needs to be solved at the service station.
