
Waymo Recalls and Updates Robotaxi Software After Two Cars Crashed (techcrunch.com)

Sean O'Kane reports via TechCrunch: Waymo is voluntarily recalling the software that powers its robotaxi fleet after two vehicles crashed into the same towed pickup truck in Phoenix, Arizona, in December. It's the company's first recall. Waymo chief safety officer Mauricio Pena described the crashes as "minor" in a blog post, and said neither vehicle was carrying passengers at the time. There were no injuries. He also said Waymo's ride-hailing service -- which is live in Phoenix, San Francisco, Los Angeles, and Austin -- "is not and has not been interrupted by this update." The company declined to share video of the crashes with TechCrunch.

Waymo said it developed, tested, and validated a fix to the software that it started deploying to its fleet on December 20. All of its robotaxis received that software update by January 12. "This voluntary recall reflects how seriously we take our responsibility to safely deploy our technology and to transparently communicate with the public," Pena wrote.

The crashes that prompted the recall both happened on December 11. Pena wrote that one of Waymo's vehicles came upon a backward-facing pickup truck being "improperly towed." The truck was "persistently angled across a center turn lane and a traffic lane." Pena said the robotaxi "incorrectly predicted the future motion of the towed vehicle" because of this mismatch between the orientation of the tow truck and the pickup, and made contact. The company told TechCrunch this caused minor damage to the front left bumper. The tow truck did not stop, though, according to Pena, and just a few minutes later another Waymo robotaxi made contact with the same pickup truck being towed. The company told TechCrunch this caused minor damage to the front left bumper and a sensor. (The tow truck stopped after the second crash.)
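To make the failure mode concrete: a predictor that extrapolates a vehicle's path along its body heading will read a backward-towed pickup as oncoming traffic, while one that falls back to the tracked velocity will follow the tow truck's actual path. The sketch below is purely illustrative -- Waymo has not published its prediction code, and every name and threshold here is invented.

```python
import math

# Hypothetical tracked state for an observed vehicle; the fields are
# invented for illustration and do not reflect Waymo's software.
class TrackedVehicle:
    def __init__(self, x, y, heading_rad, vx, vy):
        self.x, self.y = x, y        # position (m)
        self.heading = heading_rad   # body orientation (rad)
        self.vx, self.vy = vx, vy    # measured velocity (m/s)

def predict_position(v, dt):
    """Predict position dt seconds ahead, preferring measured velocity
    over body heading when the two strongly disagree (for example, a
    vehicle being towed backward)."""
    speed = math.hypot(v.vx, v.vy)
    if speed < 0.5:
        return v.x, v.y  # effectively stationary
    motion_dir = math.atan2(v.vy, v.vx)
    # Smallest angle between where the body points and where it moves.
    mismatch = abs((motion_dir - v.heading + math.pi) % (2 * math.pi) - math.pi)
    if mismatch > math.pi / 2:
        # Towed backward (or reversing): trust the measured velocity.
        return v.x + v.vx * dt, v.y + v.vy * dt
    # Normal case: heading and motion agree, so either model works.
    return (v.x + speed * math.cos(v.heading) * dt,
            v.y + speed * math.sin(v.heading) * dt)

# Backward-towed pickup: body faces +x, but it moves in -x at 10 m/s.
pickup = TrackedVehicle(x=0, y=0, heading_rad=0.0, vx=-10.0, vy=0.0)
print(predict_position(pickup, dt=2.0))  # (-20.0, 0.0), not (20.0, 0.0)
```

With the pickup facing the robotaxi but moving away with the tow truck, the heading/velocity mismatch is close to 180 degrees, so the sketch extrapolates along the measured motion instead of predicting a head-on approach.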
Last week, a Waymo autonomous vehicle was vandalized and burned by a crowd of people in San Francisco. Meanwhile, rival operator Cruise suspended its operations while it investigates an incident in which one of its robotaxis ran over a pedestrian who had been hit by another vehicle driven by a human.
  • OTA updates? (Score:4, Insightful)

    by ShanghaiBill ( 739463 ) on Tuesday February 13, 2024 @07:10PM (#64237566)

    Why do the robotaxis have to be "recalled"?

    Do they really have no OTA (over the air) updates?

    My Tesla does automatic OTA updates while it sits in my garage.

  • Two minor bumps? (Score:2, Insightful)

    by quonset ( 4839537 )

That's what's newsworthy? Not the guy who was burned alive in his Tesla, which crashed while driving with full self-driving enabled [electrek.co]? The software which, according to the passenger, swerved several times on the road?

    These two companies, Waymo and Cruise, pull their vehicles for minor incidents while Teslas keep killing people because Musk doesn't care how many have to die to prove he's wrong.

    • by thegarbz ( 1787294 ) on Tuesday February 13, 2024 @07:50PM (#64237628)

      That's what's newsworthy?

No. What is newsworthy is that a company with actual full self-driving capabilities demonstrates it can systematically update its fleet to improve safety in one go, rendering self-driving vehicles an example of the universal improvement that human drivers are not capable of...

Not the guy who was burned alive in his Tesla, which crashed while driving with full self-driving enabled [electrek.co]?

... you know, like the human driver who thought that drink driving, while using experimental software that requires an alert driver to pay attention and that had been proven unreliable only hours before, was a clever life choice.

      Teslas keep killing people because Musk doesn't care how many have to die to prove he's wrong.

Waymo and Cruise have self-driving cars. Tesla has a glorified driver assist, and it even warns you that it is such when you enable it. Equating the two is asinine. That said, in this case Tesla too will have improved things by taking one more drink-driving, inattentive idiot off the road.

Waymo and Cruise have self-driving cars. Tesla has a glorified driver assist, and it even warns you that it is such when you enable it. Equating the two is asinine. That said, in this case Tesla too will have improved things by taking one more drink-driving, inattentive idiot off the road.

        Not in this case. The guy was using full self-driving, not autopilot. He was an employee of Tesla and, according to his wife, regularly used it.

The reason the distinction is important is Tesla keeps saying no one has died while using full self-driving.

If you are testing beta software and give it double control over your life, that is your responsibility and no one else's.
Double control:
a) it is beta
b) you are drunk

Self-driving under circumstances where the driver still has to pay attention, plus being drunk, simply does not mix!

        • The guy was using full self-driving, not autopilot.

Tesla does not offer a certified full self-driving option. He was using an experimental beta product, and FSD warns you that it is such and requires you to keep your hands on the wheel at all times or it deactivates.

          The reason the distinction is important is Tesla keeps saying no one has died while using full self-driving

Can you point to a recent example of such a claim? As far as I recall, the last time they said this was prior to 2018, when the NHTSA started investigating their crashes. Since then the claim has been watered down to "it is safer and we have fewer crashes per mile driven".

          Tesla has knowingly released software which can do unexpected things at any moment and endanger those around or in the vehicle.

And warns the user that it may do exactly that.

      • by AmiMoJo ( 196126 )

Blaming the driver for misusing safety features only works when the misuse is the result of exceptional stupidity. When it's just the normal way that humans tend to get lulled into a false sense of security, we call it a design flaw and try to fix it.

The frequency with which we see people not carefully monitoring autopilot suggests that it is in the latter category. The fact that defeat devices that let you take your hand off the wheel went into mass production is proof that it is unsuitable for use by many drivers.

And bumping into a car on the back of a tow truck that, apparently, was being towed at an angle that left it partially in the next lane.

The last story was a bike rider who ran a stop sign.

Now look, the car has got to "realize" it's about to hit something when the radar pings show it getting closer, and overrule its "logic" based on the direction a car is facing. Also, this has got to be such a rare occurrence -- a car being towed backward. This is typical of repos, where they just yank the car any way they can, but
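The "overrule its logic" idea the commenter is gesturing at resembles a last-resort collision check: whatever the predictor believes about an object, if range measurements show it closing fast, brake. A minimal sketch, with invented names and thresholds rather than any vendor's actual API:

```python
def should_emergency_brake(range_m, closing_speed_ms,
                           max_decel_ms2=6.0, margin_s=0.5):
    """Sanity check that ignores what the object looks like: if the
    estimated time to collision is shorter than the time needed to
    stop (plus a margin), brake now."""
    if closing_speed_ms <= 0:
        return False  # gap is steady or growing
    time_to_collision = range_m / closing_speed_ms
    time_to_stop = closing_speed_ms / max_decel_ms2
    return time_to_collision < time_to_stop + margin_s

# A 12 m gap closing at 8 m/s: TTC = 1.5 s, stopping takes ~1.33 s.
print(should_emergency_brake(12.0, 8.0))  # True -> brake regardless of
                                          # the object's orientation
```

Because the check uses only range and closing speed, a backward-facing towed pickup cannot confuse it the way it confused the motion predictor.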

Is our acceptable error rate for self-driving really zero accidents? It's a little crazy.

Isn't that the goal? To take humans out of the equation so there aren't accidents? There's someone on here who regularly announces they can't wait for self-driving vehicles so humans won't keep having all the car accidents and killing people. In fact, they want all human-driven vehicles banned. Keep an eye out for them.

Is our acceptable error rate for self-driving really zero accidents? It's a little crazy.

Isn't that the goal? To take humans out of the equation so there aren't accidents? There's someone on here who regularly announces they can't wait for self-driving vehicles so humans won't keep having all the car accidents and killing people. In fact, they want all human-driven vehicles banned. Keep an eye out for them.

Taking humans out of the equation does not mean zero accidents; that is the definition of an accident. The goal is to have fewer injuries and fatalities than human drivers, and to keep making that number as small as possible (a rough comparison of the rates follows below).

The question I raised is why some humans hate self-driving cars so much, and why they seem to expect zero accidents from them. Why don't they see the fault on both sides of the accident, as in these last two cases?

          In my opinion, this all stems from an inability to em
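For what it's worth, the "fewer injuries and fatalities" goal above is a rate comparison, not a zero-accident claim. A toy calculation with entirely made-up numbers, just to show the shape of the argument:

```python
# Hypothetical figures only; real rates vary widely by study and by
# how "crash" and "mile" are counted.
human_crashes = 6_000_000            # reported crashes
human_miles = 3_000_000_000_000      # vehicle-miles driven

av_crashes = 150
av_miles = 100_000_000

human_rate = human_crashes / human_miles * 1_000_000  # per million miles
av_rate = av_crashes / av_miles * 1_000_000

print(f"human: {human_rate:.2f} crashes per million miles")  # 2.00
print(f"AV:    {av_rate:.2f} crashes per million miles")     # 1.50
```

On numbers like these the fleet is safer per mile even though its crash count is not zero, which is the comparison the parent comment is making.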

When someone programs a car ahead of time, everything in that car and everything the car does is intentional and far from an accident. Bugs are bad enough when they cause a blue screen, but when they can kill, it's a bit different.
No, what is a little crazy is putting an automated device that is perfectly capable of achieving zero accidents on the road without taking the effort to get it there first (because $). They are literally deciding who lives and who dies. Do they protect the passengers or the pedestrian? Everyone working on those cars is making that decision, because everything in that car is preprogrammed. Should they be allowed to?
No, what is a little crazy is putting an automated device that is perfectly capable of achieving zero accidents on the road without taking the effort to get it there first (because $). They are literally deciding who lives and who dies. Do they protect the passengers or the pedestrian? Everyone working on those cars is making that decision, because everything in that car is preprogrammed. Should they be allowed to?

You know nothing, Jon Snow.

You realize that all rear-wheel-drive vehicles are towed "backwards," and that this is not really that rare an occurrence? Regardless, if the car thinks the towed vehicle is coming at it and decides to drive into it head-on because it failed to predict where it was going, that is a huge problem.
      • It's common to tow rear-wheel drive vehicles backward. It avoids any potential damage to the transmission. Trucks, a significant segment of the vehicle market, are notoriously RWD.
This is what neutral is for; you know, that "N" letter on your transmission.

Common does not mean safe. To safely tow a vehicle, you want the weight over the rear axle of the tow truck for stability. Also, you want the front (steering) wheels on the dolly for better control in turns. Rear-engine vehicles are an exception to this. For city towing, load it front-facing, but if you're getting on the highway I'd recommend using a flatbed.

Look, obviously people do it, and the self-driving cars should be able to figure

Modern electronic transmissions don't have a mechanical linkage. You can shift into "N" all you like; it's just a switch. Driving a transmission "backward" does not ensure proper lubricant flow. You are driving the torque converter in reverse at best, and it's not clear what, if anything, is getting lubricated in the final stages of the transmission. It's a really bad idea to tow a rear-wheel-drive vehicle with the wheels on the ground and connected to the transmission. I hereby order all tow truck drivers
          • by hawk ( 1151 )

>This is what neutral is for; you know, that "N" letter on your transmission.

That simply isn't how an automatic transmission with a torque converter (or even a mere fluid coupler) works.

            With a manual, there is no problem towing a vehicle in neutral. Putting it in neutral means that, while the driveshaft spins, it isn't connected to anything.

With an automatic, however, the torque converter still gets spun by the driveshaft--the transmission in neutral is on the *other* side of the torque converter from the engine.

    • Two Waymo cars hit the *same* towed vehicle? That is something to be discussed. Did any humans hit the same vehicle?

      If both cars were confused and dove out of the way because it looked like it was coming towards them, that would be one thing. But both hit it? That's a good time to tell people "we're going to fix that".
      • by dgatwood ( 11270 )

        Two Waymo cars hit the *same* towed vehicle? That is something to be discussed. Did any humans hit the same vehicle?

        In other news, the owner of that vehicle is now officially the unluckiest person in the world. We've asked the owner to provide a hundred sets of lottery numbers every drawing so that we know what numbers not to pick.

Of course two cars hit it; they are running the same software. This is not surprising, it is expected. If there were 15 Waymo cars trailing the tow truck, they all should have hit it; if they didn't, I would go looking for a bug in the software.
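The determinism point can be made literal: two cars running identical software over essentially the same scene will make the identical mistake. A trivial illustration with a made-up stand-in for a real predictor:

```python
# Two instances of the same deterministic rule, given the same input,
# produce the same (wrong) answer -- a fleet shares its bugs as well
# as its fixes.
def classify(body_heading_deg):
    # Buggy assumption: a vehicle moves the way its body points.
    return "oncoming" if abs(body_heading_deg) > 90 else "same-direction"

car_a = classify(180)  # backward-towed pickup ahead
car_b = classify(180)  # second robotaxi, minutes later
assert car_a == car_b == "oncoming"  # identical misclassification
```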

      • by hawk ( 1151 )

        nah, those unreliable squishes hit three *different* vehicles.

        No consistency at all, yet they're allowed to drive! :)

        hawk

    • Yes, it is news.
      This is the robot car running into another vehicle because the software could not predict where the other vehicle was headed.
      This is the robot car ignoring sensory input that it was about to collide with another vehicle.
Pretty newsworthy for an industry premised on robots having better awareness and safer driving habits than humans, especially now that we are many years and millions of robot-driven miles into development.

    • You know what's not newsworthy? The hundreds of people who are burned alive in their ICE vehicles every day around the world. EVs have better safety records than standard cars.
Driverless cars should never have been let on the road. From their inception as DARPA-contest desert-roaming kill vehicles, to experimentation in cities crowded with pedestrians and cyclists--they should never have been green-lit in the first place.
  • and just a few minutes later another Waymo robotaxi made contact with the same pickup truck being towed

It would have been more notable if they had crashed into the pickup *simultaneously* -- bonus points if the two approach vectors were reflections around a line through the centre of mass!

  • by isomeme ( 177414 ) <cdberry@gmail.com> on Wednesday February 14, 2024 @11:49AM (#64239052) Journal

    Hey, at least the bug is reproducible!

This news is basically instructions on how to go trolling for Waymo robots. They just can't resist this new bait. Infomercials to come soon.

    • by hawk ( 1151 )

      "chumming" is generally considered unsporting, and often illegal.

      Like leaving bags of cash around to take lawyers out of season . . .

      hawk, retired lawyer
