Transportation

Waymo is Having a Hard Time Stopping For School Buses (theverge.com) 134

Waymo's robotaxis have racked up at least 24 safety violations involving school buses in Austin since the start of the 2025 school year, and a voluntary software recall the company issued in December after a federal investigation has not fixed the problem.

Austin Independent School District initially reported at least 19 incidents of Waymo vehicles failing to stop for buses during loading and unloading -- illegal in all 50 states -- prompting NHTSA to open a probe. At least four more violations have occurred since the software update, including a January 19th incident where a robotaxi drove past a bus as children waited to cross the street and the stop arm was extended.

Waymo also acknowledged that one of its vehicles struck a child outside a Santa Monica elementary school on January 23rd, causing minor injuries. Austin ISD has asked Waymo to stop operating near schools during bus hours until the issue is resolved. Waymo refused. Three federal investigations have been opened in three months.
This discussion has been archived. No new comments can be posted.

  • I would expect "stopping for school busses" would be an obvious and easy situation.
    • Re: Kind of weird (Score:5, Insightful)

      by Mr. Dollar Ton ( 5495648 ) on Saturday February 07, 2026 @01:06AM (#65974042)

      it is obvious if you understand the concept of driving instead of mimicking it statistically with some probability.

      a simple difference that the "AI" proponents and "investors" can't seem to grasp and acknowledge.

      • Re: Kind of weird (Score:4, Interesting)

        by phantomfive ( 622387 ) on Saturday February 07, 2026 @01:23AM (#65974060) Journal
        Someone should have thought (and I'm sure they did), "we need to train this to handle school buses and cone zones," and then included those scenarios in the training data.

        There are definitely side cases that are difficult to predict for self-driving vehicles; this isn't one of them.
        • Re: Kind of weird (Score:5, Insightful)

          by Mr. Dollar Ton ( 5495648 ) on Saturday February 07, 2026 @01:36AM (#65974072)

          I'm sure someone thought of it. What's obvious from the failures is that model training isn't a substitute for understanding, which the model is lacking. So it will always have a nonzero chance to fuck up an obvious situation, which is mostly what we deal with.

          Of course you'll have people arguing it isn't different with people on account of the outcome (people are slower, get tired, etc.), but the fundamental difference is the understanding, and the model doesn't have it.

          Hence Agrdaaeelbal instead of America.

          • What's obvious from the failures is that model training isn't a substitute for understanding, which the model is lacking.

            Even if the model doesn't have understanding, it can be trained to stop for school buses.

          • Even though your post is rightfully modded up to 5, the fact of the matter is that many people also lack understanding, and the world would be a safer place if the human factor were taken out of driving entirely.

            I do disagree that there's a larger non zero chance to duck up an obvious situation for a machine than for a human driver. Perhaps the chance is larger for non obvious cases, but that will get fixed.

            Also, you seem to assume that self driving is largely straight out of a model, like LLMs hallucin

            • Re: Kind of weird (Score:4, Interesting)

              by Mr. Dollar Ton ( 5495648 ) on Saturday February 07, 2026 @04:19AM (#65974188)

              fact of the matter is that many people also lack understanding,

              Yes, and many of those who do understand ignore the rules. Hence there is responsibility to face if one is guilty of such behavior.

              I do disagree that there's a larger non zero chance to duck up an obvious situation for a machine than for a human driver.

              Which isn't something I'm saying above. It is a mixture of factors; understanding the rules, however, is a cause of most of these "uncanny" problems.

              Also, you seem to assume that self driving is largely straight out of a model,

              Apparently not, it seems it may just be a case of Filipino drivers simply not knowing the US driving rules in detail :)

              https://www.newsweek.com/waymo... [newsweek.com]

            • Even though your post is rightfully modded up to 5, the fact of the matter is that many people also lack understanding, and the world would be a safer place if the human factor were taken out of driving entirely.

              My company is looking for a person like you to console grieving parents. We need someone telling them that "At least your kid wasn't killed by a human - so count your blessings since AI is much safer than flawed humans."

              But seriously, your complete disregard of the fact that human wetware has bad consequences when struck by heavy vehicles shows that you perhaps lack understanding of the psychology of humans. I suspect you might be pretty misanthropic.

              Or as Comedian Gallagher once noted, "Drive safe on the w

              • Hey Ol, I have no idea what tone you read in my message, but I considered completely different outcomes, without any fatalities, like what's possible on driverless rail that's been in use for ages. If you twist it to envision people getting killed regardless, that's your imagination.
                • Hey Ol, I have no idea what tone you read in my message, but I considered completely different outcomes, without any fatalities, like what's possible on driverless rail that's been in use for ages. If you twist it to envision people getting killed regardless, that's your imagination.

                  If you think I am imagining something - perhaps you have never ever had to deliver accidental death information to people. It is even worse when you tell them their child has been killed.

                  You can tell them what you like, but saying that driverless is always safer, that it will be better once no humans are driving, is tone deaf when their offspring or significant other is dead after being hit by the safer vehicle.

                  I'm not even disagreeing about relative safety, just that you might consider the thinking an

                  • No advance is entirely good or entirely bad. It's always a mix. You hope the good outweighs the bad on average. If driverless cars save 100,000 lives but also kill 1,000 others -- not a subset, but people who would not have died -- that's still a huge win, even though it may not look like it from a certain direction. It's the same story with vaccines. There are a few adverse reactions, but we accept the risk because far more lives are saved.
                    • No advance is entirely good or entirely bad. It's always a mix. You hope the good outweighs the bad on average. If driverless cars save 100,000 lives but also kill 1,000 others -- not a subset, but people who would not have died -- that's still a huge win, even though it may not look like it from a certain direction. It's the same story with vaccines. There are a few adverse reactions, but we accept the risk because far more lives are saved.

                      It isn't a matter of statistics. Here's the thing - if you could save 500 people by killing your wife or SO, would you do it? One person dead, 500 saved.

                      This is assuming you loved her. So it means accepting the death of a person you loved more than anything else in the world. Statistically, that would be a good tradeoff. 500 living people, against one who you loved more than anything.

                      Maybe you would. After all the good outweighs the bad not just on average, but quantifiably so.

                      And tha

                    • You aren't wrong. Most people would make a different choice in the trolley problem when it's the "someone you know" variant. But of course, you can't predict WHO it's going to be a priori.
                  • Hold your horses, please. Pretty please? I'm not saying you're imagining things, so you can drop the line about all the things I'm apparently not aware of. I have no idea why, from the remark I made and the comments before, you came up with having to tell parents they lost a child.

                    This is Slashdot, light banter around technical stuff, we're not solving any problems here.

                    You apparently had serious cases in your mind, which is in your head. That doesn't mean I'm claiming you're imagining things, it just wasn'

      • The primary control mechanism of a car should be an old-fashioned algorithm, not AI. The AI should tell the algorithm "likely school bus identified" and the alg should then ask the AI if the bus has flashing lights or a pop-out stop sign (via a probability score). If so, stop and wait.
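
        Something like this, as a minimal sketch -- the SchoolBusDetection shape, the field names, and the 0.5 thresholds are all invented for illustration, not anything Waymo publishes:

        // Hypothetical perception output: the AI side only reports what it thinks it
        // sees, each with a confidence score. It never decides what the car should do.
        interface SchoolBusDetection {
          busProbability: number;            // P(object is a school bus)
          flashingLightsProbability: number; // P(red lights are flashing)
          stopArmProbability: number;        // P(the stop arm is extended)
        }

        // Deterministic rule layer: plain old code, no learned weights. If perception
        // is even moderately sure there is a bus with an active stop signal, the
        // conservative answer is to stop and wait.
        function mustStopForBus(d: SchoolBusDetection): boolean {
          const likelyBus = d.busProbability >= 0.5;
          const stopSignalActive =
            d.flashingLightsProbability >= 0.5 || d.stopArmProbability >= 0.5;
          return likelyBus && stopSignalActive;
        }

        // Example: a fairly confident detection, so the rule layer says stop and wait.
        console.log(mustStopForBus({
          busProbability: 0.92,
          flashingLightsProbability: 0.81,
          stopArmProbability: 0.64,
        })); // true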

      • by gweihir ( 88907 )

        Statistical-type AI is not used in self-driving cars. Too slow, too unreliable, too resource intensive. Well, maybe Tesla does it, but they are only faking having the tech for it.

        Other than that, I agree.

        • Re: Kind of weird (Score:4, Informative)

          by Mr. Dollar Ton ( 5495648 ) on Sunday February 08, 2026 @12:07AM (#65975500)

          Statistical-type AI is not used in self-driving cars.

          Please. Their system is full of it, actually.

          Waymo cars use trained models in all stages of operation. They use something called RangeNet (a CNN) to turn lidar point clouds into detected vehicles. They use another layer of CNNs and transformers for 360-degree observation, called SurroundView. They use a GNN to build a representation of this as a simplified map of the dynamic situation on the road, called VectorNet. They use an MoE model to guess where the cars around them will go. They use a Gemini derivative to read the road signs. Their actual driving module is an RNN.

          I'm sure I'm forgetting a few, and I don't even know them all.
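
          To make the staging concrete, here's a toy sketch of how a model-per-stage pipeline composes -- every type and function below is invented for illustration, none of it is Waymo's actual code or API:

          // Invented types and trivial stubs; the point is only the staging, not the internals.
          type DetectedObject = { kind: string; x: number; y: number };
          type SceneGraph = { objects: DetectedObject[] };
          type Trajectory = { x: number; y: number }[];
          type DrivingCommand = { steer: number; throttle: number; brake: number };

          // Each stage stands in for a separately trained model (detection, scene
          // building, behavior prediction, planning). Each consumes the previous
          // stage's output, so a miss early on -- say, an unrecognized stop arm --
          // can propagate all the way to the controls.
          const detectObjects = (lidarPoints: number[]): DetectedObject[] =>
            lidarPoints.length > 0 ? [{ kind: "school_bus", x: 12, y: 0 }] : [];
          const buildScene = (objects: DetectedObject[]): SceneGraph => ({ objects });
          const predictMotion = (scene: SceneGraph): Trajectory[] =>
            scene.objects.map((o) => [{ x: o.x, y: o.y }]);
          const plan = (scene: SceneGraph, _futures: Trajectory[]): DrivingCommand =>
            scene.objects.some((o) => o.kind === "school_bus")
              ? { steer: 0, throttle: 0, brake: 1 }   // stop for the bus
              : { steer: 0, throttle: 0.3, brake: 0 };

          const scene = buildScene(detectObjects([1, 2, 3]));
          console.log(plan(scene, predictMotion(scene))); // { steer: 0, throttle: 0, brake: 1 }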

          • by gweihir ( 88907 )

            That application is properly called "machine learning" and not "AI", because it does not try to fake being intelligent.

            • Statistical-type AI is not used in self-driving cars.

              All the models that Waymo lists as part of their car-driving software that I've mentioned above are "statistical-type" - data-trained neural networks of one kind or another - so you're wrong, and such models are very much in use in self-driving cars.

              Moreover, one of the models is specifically described as a Gemini variant, which is exactly what you'd choose to call an "AI".

              Not that it is a big deal, but I've got a bad headache and am in nitpicking mode.

    • Actually, it is not exactly the same in all 50 states for multi-lane roads. In the linked article it states that the Waymo vehicle was filmed breezing through the opposite lane of traffic. The laws vary on opposing traffic depending on the state.

      For a multi-lane road with only a turn lane separating the opposing traffic, Texas law requires opposing traffic to stop. Texas school bus laws [liggettlawgroup.com]

      But for Washington state, Missouri, South Carolina, and a few others, a turn lane is enough separation to allow opposing

      • Sounds like self-driving certification needs to be done on a per-state basis, even if that means additional training and software to comply with state laws, potentially updated each year.

    • Re:Kind of weird (Score:4, Insightful)

      by martin-boundary ( 547041 ) on Saturday February 07, 2026 @03:31AM (#65974154)

      The robotaxis are a single driver.

      I would expect that if a single driver racked up 25 safety violations involving school buses, their driver's license would be suspended.

      • Agreed. If I drove 25 different cars and got infractions in each, the government would be rather peeved with me.

      • by gweihir ( 88907 )

        The robotaxis are a single driver.

        No. They are a multi-instance driver. You know, same as humans, essentially. Probably should outlaw human driving with all the accidents and violations a typical human driver commits, though. Homo sapiens is not a competent self-driving platform in general.

        The bottom line is that since humans do not scale, no "distance traveled" is factored into their incompetency scores. For multi-instance drivers it needs to be. In the end, what counts is accidents caused in proportion, not absolute numbers.

        • No. They are one model. That's literally the whole point and the source of economic scalability claims.

          If there was a different set of behaviours across the taxis (like humans are different) then the costs for training/modelling/generating/supervising each individual taxi "driver" would blow up. When you have one model, you can update all instances simultaneously and constrain all instances to exactly the same standard.

          Do not confuse the randomness of the environment (eg which road is being driven) and

          • by gweihir ( 88907 )

            You think a human driving a car is a "truly independent decision maker"? Talk about deep delusion.
            You are in good company though. Most human drivers are pretty bad. Most human drivers _think_ they are pretty good.

            All I see here is a complete risk management failure done to prop up a myth. That does not make anybody safer. Congratulations, you are aiding and abetting traffic-kills done by humans. Not a good look in any way.

      • The robotaxis are a single driver.

        I would expect that if a single driver racked up 25 safety violations involving school buses, their driver's license would be suspended.

        While I agree with you, the problem is that they aren't treated as a single driver in the eyes of the law.

        Also, while I agree with you, I would expect any single driver that accumulates over 200 million miles on just urban and suburban roads to rack up 25 safety violations. Actually, I expect far more. The single-driver methodology breaks down when you ignore distance travelled. The real question for safety is what the violation rate per km is.
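
        Back-of-the-envelope, with made-up but order-of-magnitude numbers (neither figure below is an official statistic):

        // Illustrative only: both inputs are assumptions, not reported data.
        const fleetViolations = 24;        // school-bus stop violations reported so far
        const fleetMiles = 100_000_000;    // assumed driverless miles over the same period

        const humanAnnualMiles = 13_000;   // rough miles a typical US driver covers per year

        // Violations per million miles, and how many a single human would need over
        // a 50-year driving career to match that rate.
        const ratePerMillionMiles = (fleetViolations / fleetMiles) * 1_000_000;
        const careerMiles = humanAnnualMiles * 50;
        const equivalentCareerViolations = ratePerMillionMiles * (careerMiles / 1_000_000);

        console.log(ratePerMillionMiles.toFixed(3));        // 0.240 per million miles
        console.log(equivalentCareerViolations.toFixed(2)); // 0.16 over a whole career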

        And the real legal question is, when will the government reformulate road rules to start reflecti

    • Well, it is if you think of the driving software as a deterministic machine, as we are used to. If you have a toy truck, you can make it go where you want, but try that with a cat. The driving software is much closer to the cat than to the toy truck.

    • by gweihir ( 88907 )

      Same. I expect it will need to be a coded-in exception though and some "manager" probably did not want to spend extra for it. Or the engineers working on this have no clue how the real world works. This way they now get irrational hysterics in the press. They could have avoided that. Yes, the actual risks would not have changed, but many people cannot do risk management at all (as this story nicely shows) and need to have their irrationalities catered to if you want to be successful in a non-expert market.

    • Probably the school buses' own fault for throwing a stop sign out the side of the bus, instead of actually doing something a computer would understand, like V2X infrastructure or something. Pretending the future doesn't exist doesn't make it go away. Be machine-friendly in your designs.

  • context (Score:4, Insightful)

    by phantomfive ( 622387 ) on Saturday February 07, 2026 @01:04AM (#65974040) Journal

    Austin ISD has asked Waymo to stop operating near schools during bus hours until the issue is resolved. Waymo refused.

    I would like to see the context behind why Waymo refused this request, prima facie it seems like a reasonable request.

    • I'd say it's time to get the lawyers involved.

    • Re: context (Score:4, Funny)

      by Mr. Dollar Ton ( 5495648 ) on Saturday February 07, 2026 @01:11AM (#65974050)

      They obviously need more statistics to retrain the failing models. Knocking down a few kids in the process isn't a large cost, the investors will cover the damages.

    • I'm curious whether Waymo's refusal could affect their likely future liability, for example escalating from something like negligent manslaughter to a more serious charge. And then we get to the question of who is actually liable when the machines they control damage things.
      • They might have had a legal right to refuse (I don't know). But if something like that comes up in a future civil injury case, it could be used to suggest a pattern of irresponsible behavior and lack of concern.

    • Doesn't sound very reasonable to me for a vehicle with a safety record so impeccable it's never caused so much as a serious injury. If anything we should ban SUVs near schools. They fatally injure kids all the time.

      • The point is they didn't follow orders of the TX safety board. (If I'm interpreting that correctly.) Whether they were "logically justified" in not following orders is moot. It's TX's roads and they get final say. If Waymo gets arrogant like that, they need a time-out and a fine. If they do it a second time, boot them out of TX.

        • Except those "orders" are not "orders". The Austin Independent School District has zero authority over public roads.

          Stop posting on Slashdot. Will you comply with my authority-less "order"? If not why not? As you formulate your answer in your head you will see how silly of a point you were just trying to make. Remember if you reply based on your justification a Slashdot administrator (different department to me, no relation to me, after all I have zero authority on Slashdot) should justifiably ban you right

      • I have a safety record so impeccable I've never caused so much as a serious injury, and yet I still follow traffic laws.
        • I have a safety record so impeccable I've never caused so much as a serious injury, and yet I still follow traffic laws.

          Except you don't. You have a quite average safety record based on your miles driven and near misses. Come back and compare yourself to the big boys when you have 4 orders of magnitude more driving miles under your belt.

          • And Waymo lies with statistics (and you believe them blindly). For example, they avoid difficult intersections, and don't tell you how often a human takes over. So in that sense, I am better than Waymo.

            It doesn't matter. Waymo has to obey the law.
    • by tlhIngan ( 30335 )

      I would like to see the context behind why Waymo refused this request, prima facie it seems like a reasonable request.

      I'm willing to bet the hours are also a peak period for Waymo. School buses picking up or dropping off kids would generally be around the same time as when kids are dropped off or picked up from school, and I'm willing to bet, given the poor state of pedestrian travel networks in the US, that Waymo makes a lot of money picking kids up and dropping them off at school.

  • ...failing to stop for buses during loading and unloading -- illegal in all 50 states

    Well, maybe illegal or not, depending on the circumstances and the state (a rough sketch of this decision logic follows the list):

    • Not necessary to stop if the bus is in a loading zone that is completely off the road and crossing the road is not permitted.
    • Not necessary to stop in the opposite direction if the road has 4 or more lanes.
    • Not necessary to stop in the opposite direction if the road has a barrier between travel directions.
    • Not necessary to stop in the opposite direction if the road has a median.
    • Not necessary to stop in the opposite direction
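
    A rough sketch of how those exceptions could be encoded, using only the rules spelled out above -- the field names are invented, and this is not a complete or verified statement of any state's law:

    // Illustrative encoding of the listed exceptions; not legal advice.
    interface RoadContext {
      sameDirectionAsBus: boolean;      // following the bus vs. facing it
      busInOffRoadLoadingZone: boolean;
      crossingPermittedAtZone: boolean;
      laneCount: number;
      dividedByBarrier: boolean;
      hasMedian: boolean;
    }

    function mustStop(ctx: RoadContext): boolean {
      // Off-road loading zone where crossing the road is not permitted: no stop required.
      if (ctx.busInOffRoadLoadingZone && !ctx.crossingPermittedAtZone) return false;

      // Traffic travelling in the same direction as the bus stops under every rule above.
      if (ctx.sameDirectionAsBus) return true;

      // Opposite-direction exceptions: wide roads, physical barriers, medians.
      if (ctx.laneCount >= 4 || ctx.dividedByBarrier || ctx.hasMedian) return false;

      // Default: stop for the bus.
      return true;
    }

    // Example: oncoming traffic on a two-lane undivided road still has to stop.
    console.log(mustStop({
      sameDirectionAsBus: false,
      busInOffRoadLoadingZone: false,
      crossingPermittedAtZone: true,
      laneCount: 2,
      dividedByBarrier: false,
      hasMedian: false,
    })); // true
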
    • by rta ( 559125 )

      I've always found these rules poorly calibrated and overly conservative.
      basically based on moral panic about kids rather than logic about traffic laws and sharing the road fairly.

      but there are no school buses where I currently live, so haven't been annoyed by it for a while.
      (tbh idk WHY there aren't buses or how kids get to school here. )

      • Let me guess .... You don't have kids? Only a non payment would call protecting kids on their way to school a "moral panic". This is exactly a place where they should be protected. I know Americans are confused due to the prevalence of shooting galleries in schools.

        • Only a non-parent... Stupid phone

        • by rta ( 559125 )

          the origin of these laws appears to be pre-war rural areas where the bus created an ad hoc crosswalk for kids to cross to and from the bus on roads that didn't have crosswalks.
          that makes some sense.

          having the thing deploy automatically every time the door opens as these crawl through suburbia where everyone's already waiting at the bus stop is overkill.

          • Well where I am, with every single accident that happens they add a safeguard to prevent it from happening again. But then I live in a place where we care about others and not shootem-up-yehaw USA.

        • Yeah, it is net-widening and overcriminalisation, not moral panic.

          Moral panic is just for the auto cars

          • Well fortunately there is still compassion left in law making.

            Again, as someone without kids, you don't have the life experience to make an educated choice.

    • If Waymo is operating in a state, they can follow the laws of the state. That's the easy part of self-driving.
    • Our traffic laws really do need to be standardized at the national level

      That will be great, then we can have supreme court judge manipulation and national elections based on whether right turn on red should be legal or not. Give me passing on the right or give me death! Make America great again, get rid of the roundabouts! Speed up to get through yellow lights is change we can believe in!

      • by dgatwood ( 11270 )

        Our traffic laws really do need to be standardized at the national level

        That will be great, then we can have supreme court judge manipulation and national elections based on whether right turn on red should be legal or not. Give me passing on the right or give me death! Make America great again, get rid of the roundabouts! Speed up to get through yellow lights is change we can believe in!

        There's something pretty messed up about the ignorance of the law being no excuse when the U.S. legal system is such a nightmarishly complex mess. For the most part, we all pretty much assume that if we're not doing something obviously wrong, we'll be okay, and that's usually roughly good enough, but traffic is a big exception.

        Whether right turns on red are allowed or banned (and whether signs are posted saying so), whether u-turns are allowed or denied by default, whether lane splitting by motorcycles is

        • Wait until you find out that cities can make their own traffic laws, too.
          • by dgatwood ( 11270 )

            Wait until you find out that cities can make their own traffic laws, too.

            Which is why I said, "This is doubly true when policies vary from city to city."

      • Roundabouts were an American invention to make vehicle movement consistent (i.e., all vehicles moving in the same direction) and thus safer. The British give-way rule made all intersections safer again. Roundabouts are used where there are a large number of exits, or where vehicle density is very uneven over a workday and vehicles would otherwise be unnecessarily stopped by traffic lights during most of the day.
        • Oh, you did it now. Now I'm reaching for the big rhetorical guns. If you like roundabouts, you are literally Hitler! Remember, you made me do it.
          • by Entrope ( 68843 )

            No, no, Hitler escaped to Argentina, not Australia. Seriously, have you ever driven in Australia? They are absolutely mad for roundabouts. But at least they are also scrupulous about signaling for them.

            Diverging diamond interchanges and their bastard offspring, though? I think those are the devil's own work.

  • by sjames ( 1099 ) on Saturday February 07, 2026 @02:09AM (#65974102) Homepage Journal

    Same problem in the Atlanta area.

    Also an incident where a Waymo got confused on the interstate, which it is forbidden to even be on.

    • by Ogive17 ( 691899 )
      Must be a million Waymos on the interstate in Atlanta because I see weird shit every single time I drive on an interstate or highway here.
  • I can't believe someone hasn't thrown themselves off the fender or hood of a waymo and sued. I can't see a judge or jury finding in Waymo's favor...

    • I can't believe someone hasn't thrown themselves off the fender or hood of a waymo and sued. I can't see a judge or jury finding in Waymo's favor...

      "Throwing themselves" is probably not a good thing to do with a vehicle that is recording your actions from all sorts of angles. If you can't see a judge finding in Waymo's favour when you literally described an act of fraud on behalf of the pedestrian then I don't know what to tell you.

      • by Archfeld ( 6757 )

        umm wow...
        Loosen your grip and let some blood flow...it was a joke,
        perhaps not a good one but sheesh...

        • Oh hahahaha you made a joke. Sorry buddy it's 2026. Slashdot is full of complete idiots who don't understand basic application of laws (just read this thread). If you want to not be misunderstood then give us a smiley to help get your point across. The default state of any point made online is to assume the person is an idiot rather than a genius making a joke. ;-) Man I wish we were back in the 90s.

  • -the child mode was set to "kill", see you just set it to "do not kill" and you're all good here now.
  • I would suspect that the Waymo would just weave in and out of traffic while accelerating and laying on the horn with a loudspeaker screaming "Move over and get off my lawn!"
  • Oh No (Score:4, Funny)

    by UncleWilly ( 1128141 ) <UncleWilly07.gmail@com> on Saturday February 07, 2026 @07:14AM (#65974300)

    These robot cars are WAY MOre dangerous than I suspected!

  • The problem has been solved. Waymo computer vision algorithms were incorrectly identifying school buses as orange canaries. Waymo is not sure how the bird identification module was given high priority.

  • We don't have enough info to actually judge what happened in all those 25 situations or the situation where it hit a kid. Last I heard about a Waymo taxi hitting a kid was when a kid ran into the road from behind a large vehicle and the taxi actually did a better job of trying to stop than most human drivers would have. Most people reading this article think of the situation where the bus is parked on the right side of the road, had all its lights on and the stop sign extended, and the Waymo taxi driving on
  • ...is get rid of the school busses - How much are we paying for those things anyway?

  • Does the rider, if any, need to stay on site?
    Does the rider / renter / owner risk jail time?
    Does Waymo try to get out of paying anything, or say the car is owned by some local subcontractor and you need to sue them?

  • I would have expected this case to be hardcoded in because of the mindless hysterics to be expected in the press. Apparently the engineers working on this need to get out more.

  • We need to rethink how school bus stops work and their purposes. The idea of having a vehicle that stops in the middle of the road, blocking all traffic, so kids can board and de-board was a fitting solution at the time it was deployed, in the early 20th century. It was a very different world back then: slower speeds, lower traffic volumes, fewer schools, fewer multi-lane roads and fewer kids. School buses mostly stopped on quiet rural streets in those days, not the same clogged streets we have today. In tod
