Waymo is Having a Hard Time Stopping For School Buses (theverge.com) 134
Waymo's robotaxis have racked up at least 24 safety violations involving school buses in Austin since the start of the 2025 school year, and a voluntary software recall the company issued in December after a federal investigation has not fixed the problem.
Austin Independent School District initially reported at least 19 incidents of Waymo vehicles failing to stop for buses during loading and unloading -- illegal in all 50 states -- prompting NHTSA to open a probe. At least four more violations have occurred since the software update, including a January 19th incident where a robotaxi drove past a bus as children waited to cross the street and the stop arm was extended.
Waymo also acknowledged that one of its vehicles struck a child outside a Santa Monica elementary school on January 23rd, causing minor injuries. Austin ISD has asked Waymo to stop operating near schools during bus hours until the issue is resolved. Waymo refused. Three federal investigations have been opened in three months.
Kind of weird (Score:2)
Re: Kind of weird (Score:5, Insightful)
it is obvious if you understand the concept of driving instead of mimicking it statistically with some probability.
a simple difference that the "AI" proponents and "investors" can't seem to grasp and acknowledge.
Re: Kind of weird (Score:4, Interesting)
There are definitely side cases that are difficult to predict for self-driving vehicles; this isn't one of them.
Re: Kind of weird (Score:5, Insightful)
I'm sure someone thought of it. What's obvious from the failures is that model training isn't a substitute for understanding, which the model is lacking. So it will always have a nonzero chance to fuck up an obvious situation, which is what we mostly deal with.
Of course you'll have people arguing it isn't different with people on the account of the outcome (people are slower, get tired, etc) but the fundamental difference is the understanding, and the model doesn't have it.
Hence Agrdaaeelbal instead of America.
Re: (Score:2)
What's obvious from the failures is that model training isn't a substitute for understanding, which the model is lacking.
Even if the model doesn't have understanding, it can be trained to stop for school buses.
Re: Kind of weird (Score:2)
the model inputs (road conditions, perceived bus color, etc) are always random.
Re: (Score:2)
Yes, they could, but at some point it will get way more expensive than a driver just because of all that hardware, while the original idea was, I presume, to replace the drivers for less. No idea how much is invested up to today and how the costs compare to the results achieved. I would guess they are still way more expensive than a driver.
Re: Kind of weird (Score:2)
Drivers are really expensive. They cost more than the vehicle in short order if it's a car, but also over the lifetime of the vehicle even if it's a bus.
Re: Kind of weird (Score:4, Insightful)
If it's on a computer, it's fully deterministic (unless someone installed a hardware RNG).
If you believe that computers are fully deterministic, I have a PC I want to sell you...
Theoretical computers may be fully deterministic, but physical computational machines are made of electrical signals running on rare earth semiconductors, and with AI we have complex statistical chaotic interactions on top.
Any small unpredictable perturbation at any layer may swing the whole system in a whole new direction. Hardly what we'd call deterministic (unless you believe the whole universe is deterministic, in which case the word loses all differentiation power).
Re: (Score:3)
Re: Kind of weird (Score:5, Interesting)
Computers are deterministic, software based on statistical models is not.
My dissertation on Data Requirements to Train Neural Network Controllers for Use in Process Industries which dates to 1997 proved that.
The short version is the weights between the layers are non-linear, therefore the failure mode is non-linear as well. In other words you don't know what it's going to do.
Apparently 25 years of work has failed to solve the problem or we wouldn't be seeing all these hallucinations.
Re: Kind of weird (Score:3)
Re: (Score:3)
Computers are deterministic, software based on statistical models is not.
You're conflating non-deterministic with chaotic. You can have a perfectly deterministic system that exhibits chaotic behavior, meaning that the tiniest difference in input often produces a wildly different output.
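The deterministic-but-chaotic distinction can be made concrete with a toy example. The logistic map below is a minimal sketch of a fully deterministic system with chaotic behavior; it has nothing to do with Waymo's actual software, it just illustrates the terminology:

```python
# Deterministic chaos with the logistic map x -> r*x*(1-x).
# No randomness anywhere, yet two starting points that differ by
# one part in a billion end up in completely different places.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map `steps` times from x0."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)

assert a == logistic_trajectory(0.2)  # deterministic: reruns always agree
assert a != b                         # chaotic: tiny input change, different output
```

Rerunning with the same input always gives the same answer, which is the "deterministic" part; the sensitivity to the input is the "chaotic" part.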
Apparently 25 years of work has failed to solve the problem or we wouldn't be seeing all these hallucinations.
I think hallucinations may just turn out to be an essential feature of sufficiently-rich models. People hallucinate, too. Mostly we're better at identifying our hallucinations. Mostly.
Re: (Score:2)
Apparently 25 years of work has failed to solve the problem or we wouldn't be seeing all these hallucinations.
Except that is incorrect. Given the same set of inputs you will get the same hallucination on the output, even on AI models. Hallucinations are an algorithmic problem. The issue is that these AI models are not your basic neural networks from the 90s. They have, among other things, randomness built into their generation (for example, if you ask an AI to draw you a picture you'll get a different output each time, but if you control the seed of the input you will get the same output every time).
Likewise for communica
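The seed point above can be illustrated with ordinary pseudo-random generation. This is a generic Python sketch, not anything from an actual image or text model:

```python
import random

def sample_outputs(seed, n=5):
    """Generate n pseudo-random draws from a fixed seed."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n)]

# Same seed -> identical output every time, even though the
# process looks "random" to an observer who can't see the seed.
assert sample_outputs(42) == sample_outputs(42)

# Different seed -> different output, which is where the apparent
# randomness of generative models' outputs comes from.
assert sample_outputs(42) != sample_outputs(43)
```

The underlying computation is deterministic; varying the seed between runs is what makes repeated generations differ.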
Re: (Score:2)
I do disagree that there's a larger non zero chance to duck up an obvious situation for a machine than for a human driver. Perhaps the chance is larger for non obvious cases, but that will get fixed.
Also, you seem to assume that self driving is largely straight out of a model, like LLMs hallucin
Re: Kind of weird (Score:4, Interesting)
fact of the matter is that many people also lack understanding,
Yes, and many among those who do understand, ignore the rules. Hence there is responsibility to face if one is guilty of such behavior.
I do disagree that there's a larger non zero chance to duck up an obvious situation for a machine than for a human driver.
Which isn't something I'm saying above. It is a mixture of factors, understanding the rules, however, is a cause for most of these "uncanny" problems.
Also, you seem to assume that self driving is largely straight out of a model,
Apparently not, it seems that it may just be a case of Filipino drivers simply not knowing the US driving rules in detail :)
https://www.newsweek.com/waymo... [newsweek.com]
Re: (Score:2)
Re: (Score:2)
Even though your post is rightfully modded up to 5, the fact of the matter is that many people also lack understanding, and the world would be a safer place if the human factor were taken out of driving entirely.
My company is looking for a person like you to console grieving parents. We need someone telling them that "At least your kid wasn't killed by a human - so count your blessings since AI is much safer than flawed humans."
But seriously, your complete disregard of the fact that human wetware has bad consequences when struck by heavy vehicles shows that you perhaps lack understanding of psychology of humans. I suspect you might be pretty misanthropic,
Or as Comedian Gallagher once noted, "Drive safe on the w
Re: (Score:2)
Re: (Score:2)
Hey Ol, I have no idea what tone you read in my message, but I considered completely different outcomes, without any fatalities, like what's possible on driverless rail that's been in use for ages. If you twist it to envision people getting killed regardless, that's your imagination.
If you think I am imagining something - perhaps you have never ever had to deliver accidental death information to people. It is even worse when you tell them their child has been killed.
You can tell them what you like, but saying that driverless is always safer is tone deaf, that it will be better once no humans are driving, when their offspring or significant other is dead after being hit by the safer vehicle.
I'm not even disagreeing about relative safety, just that you might consider the thinking an
Re: Kind of weird (Score:2)
Re: (Score:2)
No advance is entirely good or entirely bad. It's always a mix. You hope the good outweighs the bad on average. If driverless cars save 100,000 lives but also kill 1,000 others -- not a subset, but people who would not have died -- that's still a huge win, even though it may not look like it from a certain direction. It's the same story with vaccines. There are a few adverse reactions, but we accept the risk because far more lives are saved.
It isn't a matter of statistics. Here's the thing - if you could save 500 people by killing your wife or SO, would you do it? One person dead, 500 saved.
This is assuming you loved her. So it means accepting the death of a person you loved more than anything else in the world. Statistically, that would be a good tradeoff. 500 living people, against one who you loved more than anything.
Maybe you would. After all the good outweighs the bad not just on average, but quantifiably so.
And tha
Re: Kind of weird (Score:2)
Re: (Score:2)
This is Slashdot, light banter around technical stuff, we're not solving any problems here.
You apparently had serious cases in your mind, which is in your head. That doesn't mean I'm claiming you're imagining things, it just wasn'
Re: Kind of weird (Score:2)
That should be the main point here .. that it isn't even self driving.
Re: Kind of weird (Score:2)
When not to use AI (Score:3)
The primary control mechanism of a car should be an old-fashioned algorithm, not AI. The AI should tell the algorithm "likely school bus identified" and the alg should then ask the AI if the bus has flashing lights or a pop-out stop sign (via a probability score). If so, stop and wait.
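One way to read that proposal is as a deterministic rule layer gating probabilistic perception. Here is a minimal sketch; the function names, score inputs, and thresholds are all hypothetical illustrations, and none of this reflects Waymo's actual stack:

```python
# Sketch of the comment's proposal: a plain rule-based controller
# that treats the AI perception layer as an advisory oracle that
# reports probability scores. All names/thresholds are hypothetical.

BUS_THRESHOLD = 0.5          # "likely school bus identified"
STOP_SIGNAL_THRESHOLD = 0.3  # deliberately low: err on the side of stopping

def should_stop(p_school_bus, p_flashing_lights, p_stop_arm):
    """Deterministic rule: if perception reports a likely school bus
    with any plausible stop signal, stop and wait."""
    if p_school_bus < BUS_THRESHOLD:
        return False
    return (p_flashing_lights >= STOP_SIGNAL_THRESHOLD
            or p_stop_arm >= STOP_SIGNAL_THRESHOLD)

assert should_stop(0.9, 0.1, 0.8) is True   # stop arm clearly extended
assert should_stop(0.9, 0.0, 0.0) is False  # bus present, no stop signal
assert should_stop(0.2, 0.9, 0.9) is False  # no bus detected at all
```

The design choice here is that the final stop/go decision is auditable old-fashioned logic; the model only supplies scores, never the decision itself.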
Re: (Score:2)
Statistical-type AI is not used in self-driving cars. Too slow, too unreliable, too resource intensive. Well, maybe Tesla does it, but they are only faking having the tech for it.
Other than that, I agree.
Re: Kind of weird (Score:4, Informative)
Statistical-type AI is not used in self-driving cars.
Please. Their system is full of it, actually.
Waymo cars use trained models in all stages of operation. They use something called RangeNet (a CNN) to turn lidar point clouds into cars. They use another layer of CNNs and transformers for 360-degree observation, called SurroundView. They use a GNN to build a representation of this as a simplified map of the dynamic situation on the road, called VectorNet. They use an MoE model to guess where the cars around will go. They use a Gemini derivative to read the road signs. Their actual driving module is an RNN.
I'm sure I'm forgetting a few and I don't even know all.
Re: (Score:2)
That application is properly called "machine learning" and not "AI", because it does not try to fake being intelligent.
Re: (Score:2)
Statistical-type AI is not used in self-driving cars.
All the models that waymo lists as part of their car-driving software that I've mentioned above are "statistical-type" - data-trained neural networks of one kind of another, so you're wrong and such models are very much in use in self-driving cars.
Moreover, one of the models is specifically described as a Gemini variant, which is exactly what you'd choose to call an "AI".
Not that it is a big deal, but I've got a bad headache and am in nitpicking mode.
Re: (Score:2)
Seriously, get a clue. This is not the current hype AI at work.
Re: (Score:2)
Sure, Jan :)
Re: (Score:3)
Actually, it is not exactly the same in all 50 states for multi-lane roads. In the linked article it states that the Waymo vehicle was filmed breezing through the opposite lane of traffic. The laws vary on opposing traffic depending on the state.
For a multi-lane road with only a turn lane separating the opposing traffic, Texas law requires opposing traffic to stop. Texas school bus laws [liggettlawgroup.com]
But for Washington state, Missouri, South Carolina, and a few others, a turn lane is enough separation to allow opposing
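The state-by-state variation described above could be encoded as a simple lookup. This is a hypothetical illustration built only from the comment's claims; check the actual state statutes before relying on any of it:

```python
# Hypothetical per-state rule table for the question the comment
# raises: must OPPOSING traffic stop for a school bus when only a
# center turn lane separates the two directions?
# Illustrative only -- not legal advice, not a complete list.

OPPOSING_MUST_STOP_ACROSS_TURN_LANE = {
    "TX": True,   # Texas: a turn lane alone is not enough separation
    "WA": False,  # Washington: a turn lane counts as separation
    "MO": False,
    "SC": False,
}

def opposing_traffic_must_stop(state, physically_divided):
    """A physical divider (median/barrier) generally exempts opposing
    traffic; otherwise fall back to the per-state rule, defaulting
    to the stricter behavior when the state is unknown."""
    if physically_divided:
        return False
    return OPPOSING_MUST_STOP_ACROSS_TURN_LANE.get(state, True)

assert opposing_traffic_must_stop("TX", physically_divided=False) is True
assert opposing_traffic_must_stop("WA", physically_divided=False) is False
```

Defaulting unknown states to "must stop" is the conservative choice a certification process would presumably want.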
Re: Kind of weird (Score:3)
Sounds like self driving certification needs to be done on a per state basis. Even if that means some additional training and software is required to comply with state laws and potentially updated each year.
Re: Kind of weird (Score:2)
The Waymo vehicle knows zero of the traffic laws. It's just doing what it looks like it has done before. It doesn't understand anything including what it did before, so it can't do a sanity check. Sanity isn't even a concept which applies.
Re: (Score:2)
Is that really how they work? I was arguing the other day that there was too much sensory data to process to give an accurate assessment of the environment in real time with the power and processing available in a car at a viable cost. Someone told me that analysing the lidar/camera data and building a model of the world is not what is going on (and that I am a dick for assuming it was the approach used).
Re: (Score:2)
As far as I know they aren't learning in realtime, but they are still learning systems. I sure hope they are at least trying to failsafe around it as well, but clearly they are not doing a great job if they are. I admit I'm making some assumptions based on a combination of what they've said and how it's going, but I think they are reasonable.
Re: (Score:3)
Yikes! I am not sure that the way to produce a better driver is to watch how normal drivers behave, but at a lower spatial resolution and not seeing all the cues they are responding to.
Thanks for the reply
Re: (Score:2)
Well, they clearly should have coded a special case for school busses. I bet they regret not having done that now. Whether it would have done anything for actual safety is a different question. But the hysterics in the press were entirely predictable.
Re:Kind of weird (Score:4, Insightful)
The robotaxis are a single driver.
I would expect if a single driver racked up 25 safety violations involving school buses that their drivers license should be suspended.
Re: Kind of weird (Score:3)
Agreed. If I drove 25 different cars and got infractions in each, the government is going to be rather peeved with me.
Re: (Score:2)
The robotaxis are a single driver.
No. They are a multi-instance driver. You know, same as humans essentially. Probably should outlaw human driving with all the accidents and violations a typical human driver does though. Homo Sapiens is not a competent self driving platform in general.
The bottom line is that since humans do not scale, no "distance traveled" is factored into their incompetency scores. For multi-instance drivers it needs to be. In the end, what counts is accidents caused in proportion, not absolute numbers.
Re: (Score:2)
No. They are one model. That's literally the whole point and the source of economic scalability claims.
If there was a different set of behaviours across the taxis (like humans are different) then the costs for training/modelling/generating/supervising each individual taxi "driver" would blow up. When you have one model, you can update all instances simultaneously and constrain all instances to exactly the same standard.
Do not confuse the randomness of the environment (eg which road is being driven) and
Re: (Score:2)
You think a human driving a car is a "truly independent decision maker"? Talk about deep delusion.
You are in good company though. Most human drivers are pretty bad. Most human drivers _think_ they are pretty good.
All I see here is a complete risk management failure done to prop up a myth. That does not make anybody safer. Congratulations, you are aiding and abetting traffic-kills done by humans. Not a good look in any way.
Re: (Score:2)
The robotaxis are a single driver.
I would expect if a single driver racked up 25 safety violations involving school buses that their drivers license should be suspended.
While I agree with you, the problem is that they aren't a single driver in the eyes of the law.
Also while I agree with you I would expect any single driver that accumulates over 200 million miles on just urban and suburban roads to rack up 25 safety violations. Actually I expect far more. The single driver methodology breaks down when you ignore distance travelled. The real question for safety is what is the violation rate per km.
And the real legal question is, when will the government reformulate road rules to start reflecti
Re: (Score:2)
Well, it is if you think of the driving software as a deterministic machine, as we are used to. If you have a toy truck, you can make it go where you want, but try that with a cat. The driving software is quite nearer the cat situation than the toy truck.
Re: (Score:2)
Same. I expect it will need to be a coded-in exception though and some "manager" probably did not want to spend extra for it. Or the engineers working on this have no clue how the real world works. This way they now get irrational hysterics in the press. They could have avoided that. Yes, the actual risks would not have changed, but many people cannot do risk management at all (as this story nicely shows) and need to have their irrationalities catered to if you want to be successful in a non-expert market.
Re: Kind of weird (Score:2)
It's probably the school buses' problem for throwing a stop sign off the side of the bus, instead of actually doing something a computer would understand, like V2X infrastructure or something. Pretending the future doesn't exist doesn't make it go away. Be machine friendly in your designs.
context (Score:4, Insightful)
Austin ISD has asked Waymo to stop operating near schools during bus hours until the issue is resolved. Waymo refused.
I would like to see the context behind why Waymo refused this request, prima facie it seems like a reasonable request.
Re: (Score:2)
I'd say it's time to get the lawyers involved.
Re: context (Score:4, Funny)
They obviously need more statistics to retrain the failing models. Knocking down a few kids in the process isn't a large cost, the investors will cover the damages.
evolution (Score:3)
By killing the slow unobservant kids Waymo is improving the human race, ready for the war against AI.
Re: evolution (Score:2)
yeah, we'll have to evolve to deal with the "AI" bullshit. I guess we're too dumb to do it from first principles.
Re: context (Score:3)
Re: context (Score:2)
They might have had a legal right to refuse (I don't know). But if something like that comes up in a future civil injury case, it could be used to suggest a pattern of irresponsible behavior and lack of concern.
Re: (Score:2)
Doesn't sound very reasonable to me for a vehicle with a safety record so impeccable it's never caused so much as a serious injury. If anything we should ban SUVs near schools. They fatally injure kids all the time.
We little people get fired, plutocrats a frown (Score:2)
The point is they didn't follow orders of the TX safety board. (If interpreting that correctly.) Whether they were "logically justified" in not following orders is moot. It's TX's roads and they get final say. If Waymo gets arrogant like that, they need a time-out and a fine. If they do it a second time, bootem out of TX.
Re: (Score:2)
Except those "orders" are not "orders". The Austin Independent School District has zero authority over public roads.
Stop posting on Slashdot. Will you comply with my authority-less "order"? If not why not? As you formulate your answer in your head you will see how silly of a point you were just trying to make. Remember if you reply based on your justification a Slashdot administrator (different department to me, no relation to me, after all I have zero authority on Slashdot) should justifiably ban you right
Re: (Score:2)
Re: (Score:2)
I have a safety record so impeccable I've never caused so much as a serious injury, and yet I still follow traffic laws.
Except you don't. You have quite an average safety record based on your miles driven and near misses. Come back and compare yourself to the big boys when you have 4 orders of magnitude more driving miles under your belt.
Re: (Score:2)
It doesn't matter. Waymo has to obey the law.
Re: (Score:2)
I'm willing to bet the hours are also a peak period for Waymo. School buses picking up or dropping off kids would generally be around the same time as when kids are dropped off or picked up from school, and I'm willing to bet given the poor state of pedestrian travel networks in the US, that Waymo makes a lot of money picking up and dropping kids off to school.
All 50 states... but differently (Score:2)
...failing to stop for buses during loading and unloading -- illegal in all 50 states
Well, maybe illegal or not, depending on the circumstances and depending on the state:
Re: (Score:2)
I've always found these rules poorly calibrated and overly conservative.
basically based on moral panic about kids rather than logic about traffic laws and sharing the road fairly.
but there are no school buses where I currently live, so haven't been annoyed by it for a while.
(tbh idk WHY there aren't buses or how kids get to school here. )
Re: All 50 states... but differently (Score:2)
Let me guess .... You don't have kids? Only a non payment would call protecting kids on their way to school a "moral panic". This is exactly a place where they should be protected. I know Americans are confused due to the prevalence of shooting galleries in schools.
Re: All 50 states... but differently (Score:2)
Only a non-parent... Stupid phone
Re: (Score:2)
the origin of these laws appears to be pre-war rural areas where the bus created an ad hoc crosswalk for kids to cross to and from the bus on roads that didn't have crosswalks.
that makes some sense.
having the thing deploy automatically every time the door opens as these crawl through suburbia where everyone's already waiting at the bus stop is overkill.
Re: (Score:2)
Well where I am, with every single accident that happens they add a safeguard to prevent it from happening again. But then I live in a place where we care about others and not shootem-up-yehaw USA.
Re: All 50 states... but differently (Score:2)
Yeah, it is net-widening and overcriminalisation not moral panic.
Moral panic is just for the auto cars
Re: (Score:2)
Well fortunately there is still compassion left in law making.
Again as someone without kids you have not the life experience to make an educated choice.
Re: (Score:2)
Re: (Score:2)
Our traffic laws really do need to be standardized at the national level
That will be great, then we can have supreme court judge manipulation and national elections based on whether right turn on red should be legal or not. Give me passing on the right or give me death! Make America great again, get rid of the roundabouts! Speed up to get through yellow lights is change we can believe in!
Re: (Score:2)
Our traffic laws really do need to be standardized at the national level
That will be great, then we can have supreme court judge manipulation and national elections based on whether right turn on red should be legal or not. Give me passing on the right or give me death! Make America great again, get rid of the roundabouts! Speed up to get through yellow lights is change we can believe in!
There's something pretty messed up about the ignorance of the law being no excuse when the U.S. legal system is such a nightmarishly complex mess. For the most part, we all pretty much assume that if we're not doing something obviously wrong, we'll be okay, and that's usually roughly good enough, but traffic is a big exception.
Whether right turns on red are allowed or banned (and whether signs are posted saying so), whether u-turns are allowed or denied by default, whether lane splitting by motorcycles is
Re: (Score:2)
Re: (Score:2)
Wait until you find out that cities can make their own traffic laws, too.
Which is why I said, "This is doubly true when policies vary from city to city."
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
No, no, Hitler escaped to Argentina, not Australia. Seriously, have you ever driven in Australia? They are absolutely mad for roundabouts. But at least they are also scrupulous about signaling for them.
Diverging diamond interchanges and their bastard offspring, though? I think those are the devil's own work.
Re: (Score:2)
The default without power is to treat any intersection as an all-way stop, which is safe and probably not too inefficient given the design. Probably a traffic cop would stand on the side that both directions of traffic come in, so they can see the lines of traffic and vice versa?
They do seem to work well as intersections, but they're a real surprise if a driver is not
Not just Austin (Score:3)
Same problem in the Atlanta area.
Also an incident where a Waymo got confused on the interstate, which it is forbidden to even be on.
Re: (Score:2)
Sounds like a payout waiting... (Score:2)
I can't believe someone hasn't thrown themselves off the fender or hood of a waymo and sued. I can't see a judge or jury finding in Waymo's favor...
Re: (Score:2)
I can't believe someone hasn't thrown themselves off the fender or hood of a waymo and sued. I can't see a judge or jury finding in Waymo's favor...
"Throwing themselves" is probably not a good thing to do with a vehicle that is recording your actions from all sorts of angles. If you can't see a judge finding in Waymo's favour when you literally described an act of fraud on behalf of the pedestrian then I don't know what to tell you.
Re: (Score:2)
umm wow...
Loosen your grip and let some blood flow...it was a joke,
perhaps not a good one but sheesh...
Re: (Score:2)
Oh hahahaha you made a joke. Sorry buddy it's 2026. Slashdot is full of complete idiots who don't understand basic application of laws (just read this thread). If you want to not be misunderstood then give us a smiley to help get your point across. The default state of any point made online is to assume the person is an idiot rather than a genius making a joke. ;-) Man I wish we were back in the 90s.
WEeelll here's your problem- (Score:2, Informative)
Death Race 2000 (Score:3)
Wasn't it 70 points for children under 12 and 100 points for the elderly?
Re: Death Race 2000 (Score:3)
Lol I remember that old lady with the walker sacrificing herself for the driver.
Re: WEeelll here's your problem- (Score:3)
Regarding children, I have my car set to "stun".
I would suspect (Score:2)
Oh No (Score:4, Funny)
These robot cars are WAY MOre dangerous than I suspected!
Problem solved (Score:2)
The problem has been solved. Waymo computer vision algorithms were incorrectly identifying school buses as orange canaries. Waymo is not sure how the bird identification module was given high priority.
Not enough info to judge (Score:2)
Well the smart thing to do (Score:2)
...is get rid of the school busses - How much we paying for those things anyway?
and after one runs over and kills an kid? (Score:2)
does the rider if any need stay on site?
Does the rider / renter / owner risk jail time?
Does waymo try to get out of paying out any thing? or say the car is owned by some local sub contractor and you need to sue them?
That is surprising (Score:2)
I would have expected this case to be hardcoded in because of the mindless hysterics to be expected in the press. Apparently the engineers working on this need to get out more.
School bus stops are lame (Score:2)
We need to rethink how school bus stops work and their purposes. The idea of having a vehicle that stops in the middle of the road, blocking all traffic, so kids can board and de-board was a fitting solution at the time it was deployed, in the early 20th century. It was a very different world back then: slower speeds, lower traffic volumes, fewer schools, fewer multi-lane roads and fewer kids. School buses mostly stopped on quiet rural streets in those days, not the same clogged streets we have today. In tod
Re: Oh right February, time for the... (Score:3)
bimonthly Waymo doesn't stop for school busses story. Although this time posted by msmash instead of BeauHD for a change. I suspect we're getting a dupe of this tomorrow.
Re: (Score:2, Informative)
People who are discussing something totally non-political and then want to bring Trump into the discussion are people who have VERY bad cases of TDS and need to be institutionalized ASAP.