Transportation AI

'Confused' Waymos Stopped in Intersections During San Francisco Power Outage (cnbc.com)

"On Saturday, videos shared widely on social media showed Waymo vehicles stopped mid-intersection with hazard lights flashing, forcing other cars to maneuver around them," reports the San Francisco Chronicle.

The Independent notes that "Without working traffic lights, the driverless cars were seemingly left confused, with many halting in their tracks and causing major traffic jams. Local riders and pedestrians shared photos and videos of the vehicles stuck at intersections with long lines of drivers piling up behind them..." In some instances, several Waymos were piled up in front of a single intersection. "6 Waymos parked at a broken traffic light blocking the roads. Seems like they were not trained for a power outage," another social media user wrote.
More from CNBC: San Francisco resident Matt Schoolfield said he saw at least three Waymo autonomous vehicles stopped in traffic Saturday around 9:45 p.m. local time, including one he photographed near Arguello Boulevard and Geary Street. "They were just stopping in the middle of the street," Schoolfield said.

The power outages began around 1:09 p.m. Saturday and peaked roughly two hours later, affecting about 130,000 customers, according to Pacific Gas and Electric. As of Sunday morning, about 21,000 customers remained without power, mainly in the Presidio, the Richmond District, Golden Gate Park and parts of downtown San Francisco. PG&E said the outage was caused by a fire at a substation that resulted in "significant and extensive" damage, and said it could not yet provide a precise timeline for full restoration...

Amid the disruption, Tesla CEO Elon Musk posted on X: "Tesla Robotaxis were unaffected by the SF power outage." Unlike Waymo, Tesla does not operate a driverless robotaxi service in San Francisco. Tesla's local ride-hailing service uses vehicles equipped with "FSD (Supervised)," a premium driver assistance system. The service requires a human driver behind the wheel at all times...

The Waymo pause in San Francisco indicates cities are not yet ready for highly automated vehicles to inundate their streets, said Bryan Reimer, a research scientist at the MIT Center for Transportation and co-author of "How to Make AI Useful." "Something in the design and development of this technology was missed that clearly illustrates it was not the robust solution many would like to believe it is," he said. [He recommends "human backup systems in place around highly automated systems, including robotaxis."] State and city regulators will need to consider what the maximum penetration of highly automated vehicles should be in their region, Reimer added, and AV developers should be held responsible for "chaos gridlock," just as human drivers would be held responsible for how they drive during a blackout.

Waymo did not say when its service would resume and did not specify whether collisions involving its vehicles had occurred during the blackout.

  • by Joe_Dragon ( 2206452 ) on Sunday December 21, 2025 @05:38PM (#65873463)

    needs to work with no network as well!

    • by usedtobestine ( 7476084 ) on Sunday December 21, 2025 @05:47PM (#65873473)

      You'd think that and no working traffic signals would have been the first two tests after power-on.

    • by Berkyjay ( 1225604 ) on Sunday December 21, 2025 @06:09PM (#65873507)

      They legally are obligated to work on a network, as they should be.

      https://www.dmv.ca.gov/portal/... [ca.gov]

      • Re: (Score:2, Insightful)

        by rogersc ( 622395 )
        So being autonomous is all a big hoax. The cars cannot run unless they are connected to a communications network.
        • And having the traffic lights out is a very obvious edge case.
        • Well, it's always been a big hoax. Waymos are technically able to run without connection or guidance within the area they have been mapped for... at least that's what I've seen Google claim. But these regulations are enforced by the state of California for good reason. They ensure a level of safety and accountability.

          • But they obviously should not just stop in traffic that will already be congested enough from the lack of lights. A human would at least shoulder check and pull over safely.
            • by fahrbot-bot ( 874524 ) on Sunday December 21, 2025 @07:02PM (#65873615)

              A human would at least shoulder check and pull over safely.

              Unfortunately, Waymo hasn't yet installed shoulders on their vehicles. :-)

            • That is specified in the regulations.

              • by fluffernutter ( 1411889 ) on Sunday December 21, 2025 @07:22PM (#65873653)
                So you are saying California did this to themselves.
                • Re: (Score:2, Insightful)

                  by Berkyjay ( 1225604 )

                  Did what? Make sure that cars with no human drivers are safe and the owners held to a level of accountability? Yes, yes we did.

                  • by uncqual ( 836337 ) on Sunday December 21, 2025 @10:18PM (#65873849)

                    Accountability?

                    California seems to be far from making sure about that. In California it's still not clear who gets a ticket in case of a moving violation and who gets points on their record when autonomous cars violate the law and who pays the fines and fees - so nobody does.

                    Should all Waymos lose their license to operate when, across all of them, they accumulate too many points on their record? After all, they are all basically running the same software, just as an individual human runs on a single brain - and it is the negligent human brain that the DMV wants to get off the road.

                    The fact that Waymo cars, both individually and collectively, may drive more miles in California per year than the "typical" driver seems irrelevant. A driver who accumulates 12 points in a year while driving only 1,000 miles suffers the same restrictions as a driver who accumulates 12 points in a year while driving 30,000 miles.

                    The California legislature still has much to work out on this and, apparently, they really don't care to address the issue.

                    • In California it's still not clear who gets a ticket in case of a moving violation and who gets points on their record when autonomous cars violate the law and who pays the fines and fees - so nobody does.

                      You actually have that backwards. It *IS* clear who gets the ticket. The problem is that the way the law is written the ticket would be issued to a party that isn't involved in the moving violation, i.e. the non-existent driver.

                      This is a case of too much clarity, rather than not enough. If there was not enough clarity, there would be sufficient wiggle room to issue the ticket to someone. But as with virtually all laws in the world (save for the few stupid ones that give the police unchecked power),

                    • by mjwx ( 966435 )

                      Accountability?

                      California seems to be far from making sure about that. In California it's still not clear who gets a ticket in case of a moving violation and who gets points on their record when autonomous cars violate the law and who pays the fines and fees - so nobody does.

                      Should all Waymos lose their license to operate when, across all of them, they accumulate too many points on their record? After all, they are all basically running the same software, just as an individual human runs on a single brain - and it is the negligent human brain that the DMV wants to get off the road.

                      The fact that Waymo cars, both individually and collectively, may drive more miles in California per year than the "typical" driver seems irrelevant. A driver who accumulates 12 points in a year while driving only 1,000 miles suffers the same restrictions as a driver who accumulates 12 points in a year while driving 30,000 miles.

                      The California legislature still has much to work out on this and, apparently, they really don't care to address the issue.

                      This is what happens when a government becomes beholden to businesses.

                      California at least still has to give the impression that it cares about or represents the people. Had this happened in a state less powerful than California, or a more corrupt one, like, say, Kentucky, the courts would have already ruled that the autonomous cars and their owners are completely free of any liability; in fact, they'll issue you a ticket for getting run over by one.

                    • In California it's still not clear who gets a ticket in case of a moving violation and who gets points on their record when autonomous cars violate the law and who pays the fines and fees - so nobody does.

                      Are those mechanisms relevant or useful for regulating autonomous vehicles? It seems to me that you're applying a system designed to incentivize and manage the behavior of individual human drivers to an entirely different context. That doesn't make sense.

                      What does? Well, pretty much what California is doing. There's a regulatory agency tasked with defining rules for licensing self-driving systems to operate on state roads. Failure to comply with regulatory requirements, or evidence of failure to beha

                    • Ok but then you haven't taken it to the next logical step-- If there is no real driver, who then becomes liable? Liability doesn't just 'go away' because a company says it does.
                    • I think the whole notion of applying a behavior-management program designed for individual drivers to a company operating a fleet of robot drivers makes no sense. It's a different situation, and calls for different regulatory strategies. I'm not saying there shouldn't be regulation of autonomous vehicles, just that it should be tailored to address that problem, rather than applying a solution designed for a different problem.

                      And, frankly, California's strategy seems like a good one. They're allowing sy

                    • by uncqual ( 836337 )

                      The point is that in California there currently appears to be NO penalty or state-wide mechanism for addressing traffic violations by a robotaxi. Police apparently have little choice but to just let them go on their way without any action (at least that is what police are doing).

                      Perhaps it would be inappropriate to apply the current standard for human drivers to a robotaxi. Perhaps a robotaxi should be subject to higher standards as any failure to follow traffic rules is by design (it's software!) than due

                  • How many accidents did these stopped cars cause? Is Waymo taking responsibility for those?
          • Relying on a network connection for normal operations isn't a good plan. Latency and reliability are both huge problems. The network is probably just for telemetry and overrides.
            • The network is probably just for telemetry and overrides

              No, it is mandated by California regulations.

        • Re: (Score:2, Insightful)

          by thegarbz ( 1787294 )

          So being autonomous is all a big hoax. The cars cannot run unless they are connected to a communications network.

          Sure, if you redefine autonomous in a narrow way to suit your narrative, then it is a joke. On the flip side, a vehicle which drives itself without human interaction (which is what Waymo does) very much is autonomous according to every other person on the planet, even if it is required to have a remote takeover ability.

          In other news, autonomy doesn't exist because all machines have an emergency stop switch. Does that sum up your bafflingly stupid argument?

      • They legally are obligated to work on a network, as they should be.

        https://www.dmv.ca.gov/portal/... [ca.gov]

        What part of that document says they have to be networked? I skimmed it and didn't see anything like that. I found some stuff about remote operators, but those appear to be optional.

        • by ObliviousGnat ( 6346278 ) on Sunday December 21, 2025 @08:43PM (#65873759)

          Section 227.32 on page 11 says the autonomous vehicle test driver is mandatory. Earlier it says there should be a communications link between the driver and the vehicle, but it doesn't say it must go through a "network."

          • Where does it say that stopping in traffic is an acceptable remediation if the car loses connection? In my mind this makes Waymo more culpable, because they had no way to maintain the mandated connection in the event of a power failure.
          • Section 227.32 on page 11 says the autonomous vehicle test driver is mandatory. Earlier it says there should be a communications link between the driver and the vehicle, but it doesn't say it must go through a "network."

            Thanks. I guess this requirement goes away when the system graduates out of "test" mode?

        • The requirement that there be a way to 'take over' the vehicle in case of a problem literally requires network access for a remote 'driver' to take over in case of a problem involving a 'driverless' vehicle.

          How can a remote driver take over a vehicle's controls if there is no network?

          • The requirement that there be a way to 'take over' the vehicle in case of a problem literally requires network access for a remote 'driver' to take over in case of a problem involving a 'driverless' vehicle.

            How can a remote driver take over a vehicle's controls if there is no network?

            I was looking for a reference. Luckily, ObliviousGnat was actually helpful.

    • by Kisai ( 213879 ) on Monday December 22, 2025 @04:28AM (#65874119)

      I think Driverless cars are never going to work without buy-in from a country first to impose a standard protocol. It definitely will not be the US. It will likely be Japan, Korea, Taiwan or Singapore. Basically any small landmass country where it can be rolled out and made mandatory.

      Step 1: Define and implement three protocols
      a) C2C (car to car P2P protocol) that tells all cars travelling in the same direction to drive close to each other and with enough space to all simultaneously brake.
      b) T2C (Traffic control to Car) that relays the current traffic lights at each stop, as well as transponders on all Stop and Yield signs near intersections in order to consider if a traffic control situation has changed at that sign, plus allowing cars to stop exactly on the stop line and not 6' into it
      c) S2C (Sight to Car) This is the internal car system that uses GPS/lidar/ultrasonic sensors to relay the car's coordinates and speed back to the T2C and C2C systems without requiring a cellular or wireless network. Basically you can get in the lead car of a motorcade and drive the entire motorcade using one driver. Rules are set up in advance for vehicles to remain within sight and formation, or return to formation through intersections and stops.

      Basically C2C is for unrelated cars, T2C is for the city to control the traffic flow when possible and S2C for one car in a group to control multiple cars (so like a president or king's car has traffic priority and such)
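      Hypothetically, such a C2C broadcast and its spacing rule could be sketched like this (the message fields, names, and numbers are invented for illustration; no such protocol standard exists):

```python
from dataclasses import dataclass

# Hypothetical C2C beacon: each car periodically broadcasts where it is
# and how fast it is going, so a follower can keep a full stopping
# distance behind the car ahead.
@dataclass
class C2CBeacon:
    car_id: str
    lane: int
    speed_mps: float   # current speed in metres/second
    position_m: float  # distance along the road segment

def safe_gap_m(speed_mps: float, decel_mps2: float = 6.0,
               latency_s: float = 0.1) -> float:
    """Stopping distance from speed_mps (v^2 / 2a) plus the distance
    covered during communication/processing latency (v * t)."""
    return speed_mps * latency_s + speed_mps ** 2 / (2 * decel_mps2)

def spacing_ok(leader: C2CBeacon, follower: C2CBeacon) -> bool:
    """True if the follower could brake to a stop without hitting the
    leader, even if the leader stopped instantly."""
    gap = leader.position_m - follower.position_m
    return gap >= safe_gap_m(follower.speed_mps)
```

      At 20 m/s (about 72 km/h) with 6 m/s^2 braking, the required gap works out to roughly 35 m, so a follower 40 m back passes the check while one 30 m back does not.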

      Step 2
      Require all new cars to have this feature turned on by default, and not to take control away from the driver except for braking.
      Require all existing cars to have transponders that support both, connected to the CANBUS II network, or they will not be considered roadworthy.

      Step 3
      Once all vehicles are able to be told when it's safe to stop by either traffic control or other (human-driven) cars, then require the default to be "always on" even when humans are in the driver's seat. The driver always has the option to force an automated car to stop, but otherwise it's treated as an "emergency stop" (full brakes applied) rather than "car is confused".

      Never take away the driver's ability to brake, even if the car is 100% automated, and even if it's a robotaxi. The last thing you want is robotaxis becoming robo-kidnappers.

      • also add

        Step 3.5: Change DUI laws so that having an E-stop button, or access to an app or screen to set a destination, does not make a rider/passenger "in control" (even if they have the keys to the car on them) when in auto-drive mode.

        Step 3.6: Traffic tickets / points / parking tickets / photo tickets. (These cannot work like rental-car fees; they go to a pool that is paid into by the car and software manufacturers.)

        Step 4 (NO ongoing FEES allowed / Vehicles must at least 8-10 years of any needed updates for auto driv

        • by dgatwood ( 11270 )

          Step 4 (NO ongoing FEES allowed / Vehicles must at least 8-10 years of any needed updates for auto drive mode covered as part of the basic price)

          I'm okay with drivers having to pay for cellular communications for updates or pay to have Wi-Fi access within range of their cars. But otherwise, yes, map updates should be free forever. If that means the government has to pay to maintain the servers that provide the map updates, fine.

          Cars can't not have an forced auto drive to dealer mode in theme

          What does that mean?

          Cars can not disable auto drive mode for doing stuff like getting an oil change at jiffy lube / changing an battery / etc and 3rd party shops must have access to tools at fair rates.

          Doing something like that would already violate any number of state and federal laws, at least in the United States.

      • by dgatwood ( 11270 )

        I think Driverless cars are never going to work without buy-in from a country first to impose a standard protocol. It definitely will not be the US. It will likely be Japan, Korea, Taiwan or Singapore. Basically any small landmass country where it can be rolled out and made mandatory.

        Won't work. Shouldn't try. There are fundamental security reasons why trusting data from outside the car is unsafe.

        a) C2C (car to car P2P protocol) that tells all cars travelling in the same direction to drive close to each other and with enough space to all simultaneously brake.

        Assuming LED tail lights (nanoseconds from dark to light), camera latency is one frame (which should ideally be 1/60th of a second, but in practice, is usually closer to 1/30th of a second) plus processing time of maybe another 1/60th of a second, for a total of < 100 ms.
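        That budget can be checked with a few lines (the frame interval and processing figures below are the assumed values from this comment, not measurements):

```python
# Assumed camera-path figures from the estimate above.
FRAME_S = 1 / 30       # worst-case camera frame interval
PROCESSING_S = 1 / 60  # assumed perception processing time

def detection_latency_s() -> float:
    """Total time from the brake lights changing to the follower noticing."""
    return FRAME_S + PROCESSING_S

def distance_during_latency_m(speed_mps: float) -> float:
    """Distance travelled before the change is even detected."""
    return speed_mps * detection_latency_s()
```

        That comes to 50 ms, comfortably under the 100 ms bound, during which a car at 13.4 m/s (30 mph) covers roughly two-thirds of a metre.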

        Time to transmit that signal over a radio, do the public key crypto to verify that it was actually sent by a car and isn

  • Typical AI issue (Score:4, Insightful)

    by gurps_npc ( 621217 ) on Sunday December 21, 2025 @05:42PM (#65873467) Homepage

    This is how I know AIs are over-educated morons. They know the things they are taught, but cannot deal with new issues.

    The total inability to do anything they were not expressly trained to do makes them totally unsuited to replace a human - except for Congressmen. :D

    • Re:Typical AI issue (Score:5, Interesting)

      by allo ( 1728082 ) on Sunday December 21, 2025 @06:03PM (#65873499)

      I'd say it's a safety feature. If the vehicle uses remote data to drive safely, the safe fallback is to stop when there is no remote data. They are just not fully autonomous. The big question is whether we want cars that act fully on their own.
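      That fallback policy amounts to a tiny decision function (a hypothetical sketch, not Waymo's actual logic; the action names are invented):

```python
# Fail-safe fallback: drive normally only when the situation is
# understood; otherwise ask a remote operator, and if no link is
# available, stop with hazards on as the last resort.
def choose_action(situation_understood: bool, remote_link_up: bool) -> str:
    if situation_understood:
        return "drive"                # normal autonomous operation
    if remote_link_up:
        return "ask_remote_operator"  # pause and request guidance
    return "stop_with_hazards"        # safe but disruptive last resort
```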

      • by dgatwood ( 11270 )

        I'd say it's a safety feature. If the vehicle uses remote data to drive safely, the safe fallback is to stop when there is no remote data. They are just not fully autonomous. The big question is whether we want cars that act fully on their own.

        Everybody here is assuming that the cellular network went down completely, and that the cars couldn't communicate. While possible, I would assume that Waymo uses multiple cellular providers to ensure reliable service, particularly given how spotty service on any individual provider can be in SF. If they don't, I bet they do next week. :-D

        I'm also pretty sure they don't use remote data to drive safely at all. From the various articles I've read, all true safety-related data, including map data and driving

        • I think the issue is very simple, the Waymo car drives based on its internal mapping, and the mapping says there is a stop light at a given intersection, and when Waymo encounters the missing stop light, it wasn't programmed to fall back and act as if it was a 4-way stop.
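          The missing fallback would be something like this (a hypothetical sketch; California law treats a dark signal as an all-way stop, and the state names here are invented):

```python
# Map a traffic light's observed state to a driving behavior.  A "dark"
# (unpowered) signal falls back to all-way-stop rules rather than
# leaving the car with no defined action.
def signal_behavior(light_state: str) -> str:
    behaviors = {
        "green": "proceed",
        "yellow": "prepare_to_stop",
        "red": "stop_and_wait",
    }
    # Unknown or dark signal: treat the intersection as an all-way stop.
    return behaviors.get(light_state, "all_way_stop")
```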

          Remember, when stop lights go out, humans start driving erratically, with some thinking 4-way stops mean you yield to the driver on your right, some think it's the driver on the left, and others think it's whoever gets to their stop line fi

          • by dgatwood ( 11270 )

            I think the issue is very simple, the Waymo car drives based on its internal mapping, and the mapping says there is a stop light at a given intersection, and when Waymo encounters the missing stop light, it wasn't programmed to fall back and act as if it was a 4-way stop.

            Various news sites also reported that some people observed Waymo cars treating them as 4-way stops. So it probably isn't as simple as not being programmed to be able to fall back, but rather some combination of multiple factors, including the nonfunctional lights, that in combination spooked the cars.

            Either way, though, requiring human verification of an outage once per traffic light is probably the right thing to do, if only to ensure that the non-detection isn't a bug. Multiply times a lot of cars at a

        • by allo ( 1728082 )

          > The problem, I suspect, is that they are designed to fail safe. Specifically, when they encounter a situation that is substantially unexpected, they stop and reach out to operators to ask how to resolve the unexpected situation.

          That's what I tried to say. And for whoever is responsible, that is the best solution to avoid being blamed for dangerous situations. On the other hand, it is massively inconvenient, not only for passengers but also for other drivers. But the question is, how much do we a

      • Stopping in the middle of the street is not a safety feature. Stopping in the emergency lane or parking next to a curb is.

        I'd be okay with an autonomous car being able to scout for a safe stopping point within 50ft of its current position. Apparently, that was not part of the design requirements for production deployment into real traffic, and even I find that really surprising.
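        The "scout within 50 ft" idea could be as simple as this (a hypothetical sketch; the curb-map representation is invented):

```python
# Given candidate curb spots ahead as (distance_ft, is_clear) pairs,
# pull over at the nearest clear one within range, else report none.
def pick_pullover(curb_spots, max_range_ft: float = 50.0):
    candidates = [d for d, clear in curb_spots
                  if clear and d <= max_range_ft]
    return min(candidates) if candidates else None
```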

        • by dgatwood ( 11270 )

          Stopping in the middle of the street is not a safety feature. Stopping in the emergency lane or parking next to a curb is.

          Apparently you have never driven in San Francisco. These aren't freeways, and these aren't suburbs. They're mostly dense urban streets a la Manhattan. You either have a continuous row of cars parked along one side or both, or you'll have driving lanes. There are no shoulders. If you get very lucky, there might be one spot free, but not enough spots for four cars at a single intersection — probably not at any time, day or night, realistically.

          So in situations where stopping in the middle of the st

      • Stopping is fine. Stopping in the middle of an intersection is not. A power outage is likely to require a response of emergency services which will not be able to get to their intended destination.
        • by allo ( 1728082 )

          The problem is, you need to do something that can be done reliably ("low AI") and predictably (others are not too surprised by it and can plan their own maneuvers ahead).

          If the thing (seemingly) can't work as usual in this situation, it needs a safe fallback. And when it seems that any complicated maneuvers are out of question, you need to stop (and in the best case let a human handle the vehicle). What would be the alternative? Keep driving straightforward no matter what? Do some things that were deemed to

    • Then how did Tesla work fine? Waymo actually uses less AI than Tesla.

      • by dgatwood ( 11270 )

        Then how did Tesla work fine? Waymo actually uses less AI than Tesla.

        Tesla FSD beta relies on a human operator in the car. I don't know what it does when a light is out. It either treats it as a red light or as a green light. If the former, it relies on the human driver to take over to get it going again. If the latter, it relies on the human driver to avoid a fatal collision. Either way, it relies on a human driver in the car.

        Tesla's robotaxis also have a human safety driver. And still reportedly crash 12.5x more often than human drivers [commondreams.org]. So my guess would

  • There was a system that didn't depend on remote systems to run correctly. Perhaps until we work out what that could be we put a human behind each wheel. Call them taxi drivers perhaps?
  • by bobby ( 109046 ) on Sunday December 21, 2025 @06:31PM (#65873543)

    Sounds like a Waymo vehicle could not pass a standard driving test. As in, "what is the rule when you come upon an intersection where the traffic lights are completely out?"

    • by ArchieBunker ( 132337 ) on Sunday December 21, 2025 @06:35PM (#65873549)

      Nobody stops for non working lights anymore.

      • by bobby ( 109046 )

        They barely stop for working ones.

      • Nobody stops for non working lights anymore.

        This is absurd, of course they stop. Anyone who fails to stop is violating the law.

        • by bobby ( 109046 )

          You're all correct. In my area (East Coast USA, major city suburbs), where there's a non-working traffic light they just blow through. Same if it's flashing red, which means STOP, but people fly through at the full speed limit or more. Cops are rarely around. But there are more and more traffic light cameras, so I'm not sure, but maybe they're receiving tickets.

        • Anyone who fails to stop is violating the law.

          So does anyone who exceeds the posted speed limit. What's your point? When has the law ever been important for driving a vehicle?

          • So does anyone who exceeds the posted speed limit. What's your point?

            My point is that it is illegal and for good reason to prevent property damage, injury and death.

            When has the law ever been important for driving a vehicle?

            Since the invention of driving tests, licensing, and enforcement. The concept you are trying to promulgate, that rules don't matter because not everyone follows them and therefore anything can be justified, is inherently unfalsifiable gibberish.

        • Nobody stops for non working lights anymore.

          This is absurd, of course they stop. Anyone who fails to stop is violating the law.

          Dude, I wish I had mod points! This is the best laugh I've had all day.

    • To be fair, if this were actually something that people were regularly quizzed on, you'd find that nearly everyone would fail the driving test.

      There's another part of a driving test that covers what to do in an unsafe situation; you know the answer to that is to stop, right? How many humans have you ever seen do that?

      Let's not hold autonomous cars to standards we don't hold ourselves to. Driving tests were a basic measure for safety, and Waymo's record has definitively shown that it is far safer not only

      • To be fair if this were actually something that people would be regularly quizzed on you'll find that nearly everyone would fail the driving test.

        Let's not hold autonomous cars to standards we don't hold ourselves to.

        Why should we do any such thing?

        Driving tests were a basic measure for safety, and Waymo's record has definitively shown that it is far safer not only than an average human, but safer than a careful one too.

        This is like saying women commit far fewer murders than men, therefore when they do commit a murder it should be ignored.

    • Same as when it's working, "Ignore it."
  • There is a huge and fragile constraint system in any driving automation. Almost no one trusts AI enough to take those constraints out (except that company which ran self-driving trucks without safety drivers for a week or so before putting them back; a truck can't just stop as a failsafe on the highway, so they'd have to err on the side of disaster a lot more).

    Only remote operators can push Waymos, slowly, through these constraints. They were clearly overwhelmed in this case.

    Something similar will happen if cellular is out fo

  • If anyone else had their entire fleet of vehicles stalled out and blocking traffic for hours in the middle of the city they would be given hefty fines for obstructing traffic and told to take their cars elsewhere.

    But not Waymo, who for some reason are exempt from the rules that everyone else is expected to follow.

  • by BrendaEM ( 871664 ) on Sunday December 21, 2025 @07:12PM (#65873627) Homepage
    In case of a war or disaster, we cannot have those vehicles littering the streets.
    • by CAIMLAS ( 41445 )

      Yep. The default activity it should perform when there's no control is to turn emergency lights on and pull over/park on the shoulder - at a minimum.

    • by flink ( 18449 )

      Yeah, like say, during a big earthquake. It would really suck to have these clogging up the streets while emergency vehicles are trying to get through. Good thing California doesn't need to worry about that...

      It's a total farce that these things are allowed to operate on public roads with no human behind the wheel.

      • If you asked me yesterday regarding earthquakes I would have said "Don't be ridiculous! Do you think you're smarter than them? Waymo has obviously run many simulations and developed a plan of what their cars should do."

        Now I'm thinking they probably haven't even thought about it. Scary.

    • In case of a war or disaster, we cannot have those vehicles littering the streets.

      I see you've never experienced a war or disaster. Hint: vehicles operated by humans litter the streets as well. There's a reason when you see true chaos unfolding that the news is running images of people walking and carrying stuff, and it's nothing to do with too many Waymos.

    • You mean, like when a hurricane or flood goes through a city, and (human-driven) cars litter the streets? Having steering wheels and gas pedals does not make these cars drivable. Maybe your definition of disaster is different from mine, or maybe you haven't actually seen a real disaster.

    • As long as cars have human-capable controls, humans are liable for the safety of the operation of the vehicle. As we see with Tesla and GM and other companies with "full-self-driving" modes, the driver is expected to remain in control, but at the same time not in control. If a crash happens, it's always the driver's fault.

      In other words, a self-driving car with human controls, isn't self-driving. I'd like a fully self-driving car. I have no interest in one that only pretends to be self-driving, but leaves m

  • In a city where the power goes out regularly... just how do they fail to correctly deal with basic traffic laws?

    This isn't just "lights aren't working, can't negotiate" - it's a failure to account for basic road conditions. What if there's an intersection with no visible signage (damaged or obscured stop sign)? That's not safe at all.

  • Who could have ever thought to test for the condition that a traffic light is out? I mean, it's unthinkable. Waymo should be suspended from public roads until they prove they fixed it.
  • Sharing this with Sarah Connor. We now know that all we have to do to stop the Terminator takeover is to turn off the traffic lights.
  • This debacle suggests that Waymo relies on extremely detailed mapping of, among other things, the exact coordinates of traffic lights. Because the traffic lights were not providing the required signals, the Waymos were apparently unable to proceed. This indicates much less flexibility than has been touted.

    • by dgatwood ( 11270 )

      This debacle suggests that Waymo relies on extremely detailed mapping of, among other things, the exact coordinates of traffic lights. Because the traffic lights were not providing the required signals, the Waymos were apparently unable to proceed. This indicates much less flexibility than has been touted.

      It doesn't necessarily indicate that. You're speculating.

      It is entirely possible that Waymo Driver truly doesn't know how to handle that edge case. That's probably not the sort of thing that you'd have a lot of training data for, after all. However, I can think of at least five other possible explanations that are also plausible.

      Option 1: In the interest of safety, they required the cars to phone home to report a traffic light down and confirm before proceeding, but because so many lights went down all a
