Transportation

Waymo Hits a Dog In San Francisco, Reigniting Safety Debate (latimes.com)

A Waymo robotaxi struck a small unleashed dog in San Francisco -- just weeks after another Waymo killed a beloved neighborhood cat. The dog's condition is unknown. The Los Angeles Times reports: The incident occurred near the intersection of Scott and Eddy streets and drew a small crowd, according to social media posts. A person claiming to be one of the passengers posted about the accident on Reddit. "Our Waymo just ran over a dog," the passenger wrote. "Kids saw the whole thing." The passenger described the dog as between 20 and 30 pounds and wrote that their family was traveling back home after a holiday tree lighting event. The National Highway Traffic Safety Administration has recorded Waymo taxis as being involved in at least 14 animal collisions since 2021.

"Unfortunately, a Waymo vehicle made contact with a small, unleashed dog in the roadway," a company spokesperson said. "We are dedicated to learning from this situation and how we show up for our community as we continue improving road safety in the cities we serve." The spokesperson added that Waymo vehicles have a much lower rate of injury-causing collisions than human drivers. Human drivers run into millions of animals while driving each year.

"I'm not sure a human driver would have avoided the dog either, though I do know that a human would have responded differently to a 'bump' followed by a car full of screaming people," the Waymo passenger wrote on Reddit. One person who commented on the discussion said that Waymo vehicles should be held to a higher standard than human drivers, because the autonomous taxis are supposed to improve road safety. "The whole point of this is because Waymo isn't supposed to make those mistakes," the person wrote on Reddit.



  • by dada21 ( 163177 ) <adam.dada@gmail.com> on Tuesday December 02, 2025 @10:09PM (#65831379) Homepage Journal

    And?

    • Same as when a drunk falls onto subway tracks: it's capitalist society's fault.

      • This. The fact that this incident is making headlines shows just how fucking safe these robots are. If you want a fair comparison, compare them to the rate of human drivers hitting dogs, etc., on a miles-per-incident basis.
      • by tragedy ( 27079 )

        Anyone falling into subway tracks is the fault of someone besides the person who fell in, because it is insanely idiotic in this day and age from a safety point of view for there to be no wall between the train and the platform. It's not a difficult concept. You have a wall or barrier with gates that open up once the train arrives. As for capitalism, subways are generally public utilities, though they may be run by private operators and contractors. In general it is not about capitalism, though it may have

        • by Valgrus Thunderaxe ( 8769977 ) on Wednesday December 03, 2025 @04:04AM (#65831749)
          Anyone falling into subway tracks is the fault of someone besides the person who fell in, because it is insanely idiotic in this day and age from a safety point of view for there to be no wall between the train and the platform.

          Sorry, what's idiotic is this point of view. Taken to its logical conclusion, we should have walls around the beach because someone might walk into the ocean and drown.
          • Are there no signs at the beach indicating swim at your own risk? Is there a sign on the car letting everyone know they are in the vicinity of the car at their own risk?
            • Signs at the beach stating "Swim at your own risk" are for the lawyers when someone drowns and sues the city. No reasonable person is going to see these signs and change their behavior.
          • by tragedy ( 27079 )

            Sorry, what's idiotic is this point of view. Taken to its logical conclusion, we should have walls around the beach because someone might walk into the ocean and drown.

            Wouldn't your argument, taken to its logical conclusion, be even more idiotic since we would not have railings on stairs, pilot lights in furnaces to prevent houses from exploding, bumpers on cars, speed limits, lane markers (heck, just one giant lane for traffic in both directions), emergency services like police, fire, and ambulance, building regulations for building skyscrapers, etc., etc., etc. There is a middle ground between reasonable safety and building deathtraps. Also, notably, beaches are (mostl

          Taken to its logical conclusion, we should have walls around the beach because someone might walk into the ocean and drown.

            We often do have fences around the beach, especially in areas frequented by a lot of people. They can slow down a child enough for an attentive parent to notice and maybe catch up before they enter the surf or fall off a cliff.

    • Re: (Score:2, Informative)

      I dunno, but you got down-modded for saying the obvious truth.

      • by tragedy ( 27079 ) on Wednesday December 03, 2025 @04:13AM (#65831769)

        It's not really that obvious. It comes down to the question of the capabilities of the AI. The real question is whether it simply failed to notice the dog, or whether it noticed the dog and didn't even attempt to stop. If the latter, did it not stop because it explicitly recognized the dog as a dog, or because it simply failed to recognize it as human? For example, did it not stop because the dog was too low to the ground or too small? Even if we ignore the ethics of running over dogs (which affects humans in the vehicle and outside it, even if you take the position that the dog itself has no intrinsic value), there is a real problem if the vehicle has trouble recognizing, for example, small humans. Will the vehicle run over a toddler who runs into the road because they're too short? If a baby crawls into the road and just sits there, does it just run the baby over? How about a ball that bounces into the road? Does the vehicle ignore the ball because it recognizes it specifically and doesn't care about hitting it, or because it does not recognize it but, since it doesn't match specific criteria, deems it OK to hit? Most human drivers understand the maxim "after a rolling/bouncing ball comes a running child", but does the AI?

        So, the whole point of concern here is not just the life of one dog; it is the actual safety capabilities of the AIs driving the cars and how they will react in various situations. I hope we all remember the case of Elaine Herzberg. Now, she was clearly not being careful herself, but the Uber vehicle in that case, after a lot of initial speculation, turned out to basically be a murder machine. In other words, while the initial story was that the system did not see her, it later turned out that it basically did see her, but just was not programmed to avoid colliding with her. The car itself had collision avoidance that would have stopped, but the self-driving system overrode it without properly replicating its function.

        These sorts of things are of concern for self-driving. Yes, this may put the self-driving system under more scrutiny than a human being, but in the case of self-driving cars, you actually can review exactly what the car saw and understood and what actions it was capable of taking and (hopefully) why it behaved the way it did. That is important to understand for this sort of technology going forward. Brushing it off because "and?" is nonsensical.

        • by N1AK ( 864906 )
          And?

          I'm all in favour of investigations into the cause of the incident, and the points you raise about it are mostly valid, but it's a single example of an animal in a road being killed by an automobile; this is happening constantly with human drivers and isn't even close to being considered news. If the car did something especially concerning, or there was a concerning statistical trend about animal fatalities and self-driving cars, then fine, but short of that this kind of exceptional treatment of
          • by tragedy ( 27079 )

            this is happening constantly with human drivers and isn't even close to being considered news

            Sure, but the point is that we know the reasons this happens constantly with human drivers, and they mostly have to do with factors like inattention, slow reflexes, lapses in situational awareness, intentional cruelty, etc. All of those are things that should not happen with AI drivers. It is not unreasonable to expect what is essentially perfection from a computer-based system. Now, there are reasons other than those, the main one being that the collision was simply unavoidable, which are certainly excuses for an AI sys

        • The real question is if it simply failed to notice the dog or if it noticed the dog and didn't even attempt to stop.

          Also, why it didn't attempt to stop (if it didn't). If it didn't attempt to stop because it correctly determined that attempting to stop would risk causing a more serious accident with other vehicles on the road, that's not only good, it's better than the vast majority of human drivers.

          • by tragedy ( 27079 )

            If it didn't attempt to stop because it correctly determined that attempting to stop would risk causing a more serious accident with other vehicles on the road, that's not only good, it's better than the vast majority of human drivers.

            I am looking at a few ways of parsing that sentence, but it really sounds like you're saying that it is correct to assume that attempting to stop for an animal in the road will risk causing a more serious accident? Maybe that's not what you're saying, but if it is, that would be very problematic. Coming to an emergency stop to avoid hitting a large object in the road should be the safest option in the vast majority of circumstances. The main exception is when the car behind is driving extremely unsafely, an

    • by Xarius ( 691264 )

      Are you talking about the dog, or the car?

  • wait, what? (Score:4, Insightful)

    by ddtmm ( 549094 ) on Tuesday December 02, 2025 @10:25PM (#65831399)

    "The whole point of this is because Waymo isn't supposed to make those mistakes," the person wrote on Reddit.

    I don't think any system can ever be expected to never make a mistake. As long as they make way fewer mistakes than humans, I'm all for it. The expectation of perfection is unrealistic.

    • Right, but what about the mistake that is affecting you? If a person makes a mistake they will be held responsible. So, same should go for companies as well.
      Good for them if they make fewer mistakes; they are going to save money and pay less.
      • by ddtmm ( 549094 )
        Who says the company isn't held responsible?
        • Re:wait, what? (Score:4, Insightful)

          by rsilvergun ( 571051 ) on Tuesday December 02, 2025 @10:50PM (#65831427)
          Last I checked, the local government said so. They have indemnified Waymo in every market where they have launched their taxi service. I don't think that would hold up if they killed a human being, at least not one that isn't homeless, but so far it's held up for the more minor stuff that's happened.

          Basically, Waymo cannot be cited for traffic violations, and killing a pet is just a traffic violation. The most they could be held responsible for would be the value of the pet, which is usually under $100.
          • by djinn6 ( 1868030 )

            The most they could be held responsible for would be the value of the pet which is usually under $100.

            That's the same for a human driver too. And you have to prove they were negligent, i.e. speeding, running a stop sign etc.

            • as it stands Waymo won't even be held responsible for that. Those sorts of penalties are waived for self driving cars in the cities they operate.

              They have to be because they commit traffic infractions so often that the cops would be pulling them over and ticketing them constantly.

              Rules for thee but not for me.
          That's not true. If there was intent, then they are responsible for the medical bills of the dog. And "we don't know what it will do" is not a defence. Designing a car to drive itself and then not knowing what will happen is criminal negligence.
      • Re:wait, what? (Score:5, Insightful)

        by ihadafivedigituid ( 8391795 ) on Tuesday December 02, 2025 @10:48PM (#65831423)
        This is on the dog owner since they were breaking the city's strict six-foot leash law. Seems to me they owe Waymo for damage to the car due to their obvious negligence.
        • by evanh ( 627108 )

          As the jaywalking example implies, even a human driver is at fault if the jaywalker is actually hit by the car. So, while the jaywalker can be fined for being on the road, it's still a far worse offence if a car hits that jaywalker.

          That's a ridiculous oversimplification. It's true that vehicles can't simply mow down jay-walking pedestrians, as there is a reasonable expectation that the driver avoids any contact... but if a pedestrian unexpectedly ventures into traffic, the driver is unlikely to face any legal issues unless under the influence.
        • by evanh ( 627108 )

          If it could be demonstrated the dog was attempting suicide then leniency might be considered towards the driver's failure to avoid the accident.

    • Yup, still better than humans.

      "Driver hits unleashed dog that darted into street" is just a Tuesday, but "autonomous vehicle hits unleashed dog that darted into street" is a headline because it is so rare.

      • It's a bit different in SoCal. Here "Los Angeles man shot in front of a liquor store" is a Tuesday.

      • by sjames ( 1099 )

        "Driver hits unleashed dog that darted into street" is just a Tuesday, but "autonomous vehicle hits^w makes contact with unleashed dog that darted into street" is a headline because it is so rare.

        I'm not so sure there have even been enough incidents to decide if Waymos are more or less likely to run over an animal, but I *STRENUOUSLY* object to trying to soften it with 'made contact with' in the press release.

    • Re:wait, what? (Score:4, Informative)

      by evanh ( 627108 ) on Tuesday December 02, 2025 @10:37PM (#65831415)

      When it's a mechanised public service it's never okay to make mistakes. Same as for air accidents.

      • by PPH ( 736903 )

        Hey Sully! Why didn't you dodge those geese?

      • by djinn6 ( 1868030 )

        Planes hit birds all the time. In the vast majority of cases, it kills the bird but causes minimal damage to the plane.

        If you're not okay with that then you need to stop flying.

        • by tragedy ( 27079 )

          The point is not about the lives of animals. Not that they have no value, but we clearly value human lives more. Self-driving is by no means advanced enough or well understood enough by even the engineers building the systems for anyone to ignore things like this yet. It is important to understand precisely why the car hit the dog. It may simply be a case of it being an unavoidable collision, but we don't actually know that. There are serious questions to be answered like if the vehicle knew it was about to

            I'd just like to know what the car does if it could have saved the dog but had to slam on the brakes hard to do it. As a human, I'm watching my rear-view mirror, I already know how close the person behind me is, and I'll slam on my brakes as hard as I can to save the animal, since generally there is no one tailgating, and if they hit me it is their fault anyway. I'm concerned an automated car would not stop in that situation because it would make the ride 'unpleasant' for occupants. The life of an animal shou
            • by djinn6 ( 1868030 )

              and if they hit me it is their fault anyway

              Being hit from behind could cause permanent injuries. It's fine for you to take a personal risk to save an animal. It's not fine for the self-driving car to make that decision for its occupants.

              • by tragedy ( 27079 )

                Speaking as someone who has, you know, driven cars, I would like to point out that if you have decent situational awareness, you can brake to avoid something in front of you while still watching the car behind you and accelerating out of danger or taking other evasive maneuvers if necessary if they are not going to stop soon enough. I've had to do both before in my life. Once I was trying to take a turn and some vintage car (like pre-WWII vintage, not just from the 70's or something) with terrible braking w

          • by djinn6 ( 1868030 )

            I don't think it matters to us what actually caused it. I'm sure their engineers have access to all the information necessary to debug this. My guess is it darted out at such a speed that no human could've avoided it either.

            And while this debate continues, those cardboard boxes containing small children are being run over by drunk or inattentive humans on a regular basis.

            • by tragedy ( 27079 )

              I don't think it matters to us what actually caused it. I'm sure their engineers have access to all the information necessary to debug this. My guess is it darted out at such a speed that no human could've avoided it either.

              That might be a good guess, but the point is that it is just a guess. We don't have enough information to conclude that.

              And while this debate continues, those cardboard boxes containing small children are being run over by drunk or inattentive humans on a regular basis.

              Does it follow that it is somehow OK? Or that there is no responsibility to drivers to avoid being reckless?

              • by djinn6 ( 1868030 )

                Does it follow that it is somehow OK?

                Yes. Anything better than the status quo is not just OK but good, because the status quo is 40,000 dead humans per year in the USA. Once we've replaced the status quo, then we can talk about what further improvements could be made.

                If you were rational and you had the opportunity to magically turn all cars into self-driving cars by sacrificing a few pets, you would do it. If rationality is too pedestrian for you and you only care about feelings, then you need to educate yourself with stories of people whose lives w

          Pretty clear self-driving still makes some significant mistakes. TMZ initially reported it, and it was picked up by the national news. https://www.tmz.com/2025/12/01... [tmz.com] Pretty crazy the AI decided to drive through what was an active crime scene, with a passenger aboard. There are other stories about passengers being held captive while the car drives around in circles. https://www.cnn.com/2025/01/07... [cnn.com] And I thought another circled around a McD's parking lot.
  • by ihadafivedigituid ( 8391795 ) on Tuesday December 02, 2025 @10:45PM (#65831417)
    San Francisco has a strict six-foot leash law for dogs, right? How is this Waymo's fault? Are they supposed to defy physics when someone's mutt jumps into traffic?
    • Unleashed dog runs into street and get hit: not the driver's fault.

      Driver completely ignores that and fails to stop: very much the driver's fault.

      So yeah, Waymo wasn't the only problem here, but they were definitely a problem.

  • Expectations (Score:3, Informative)

    by RitchCraft ( 6454710 ) on Tuesday December 02, 2025 @10:58PM (#65831443)

    People expecting automated driverless vehicles to do better than humans is hilarious. Last year one of my dogs got out and entered the street in front of my house. I was able to wave my arms at the approaching driver, motioning him to stop, which he did. Got my dog, fixed the fence, life goes on. Imagine waving your hands at a robot, motioning it to stop. Will it? Probably not. Dead dog. Automated vehicles are an utterly stupid idea.

    • by Luckyo ( 1726890 )

      By the numbers, they do between seven and ten times better than human driven cars.

      Waymo publishes their safety numbers. Google it. It's insane how much safer those cars are compared to human driven ones.

    At least robot cars will learn from this situation and all get updated accordingly to handle it better next time. Human drivers don't.
    • I was able to wave my arms at the driver approaching motioning him to stop, which he did.

      There are areas I drive through that if you stood on the side of the street and waved your hands, I would stomp on the gas and get out as fast as possible. You could be breakdancing on the sidewalk and I wouldn't stop.

      My neighborhood? Sure. Your neighborhood? It depends.

    • They are an utterly stupid idea... why? We already have lots of machines that would be dangerous if things go wrong. What about this machine is so different that we can't improve it to be relatively safe and reliable? If we really needed a means to 'warn' automated cars we possibly could. But in your example, you shouldn't need to 'warn' the car to avoid hitting the dog. It should ideally have that capability to detect and avoid it without human intervention. Which is sort of the whole point.
  • "The whole point of this is because Waymo isn't supposed to make those mistakes,"

    There is no single "whole point" in such a complex issue, but I would like to tell this person that part of the argument for automated vehicles is that they may make fewer mistakes. Perfection shouldn't be a condition for improvement.

  • by Applehu Akbar ( 2968043 ) on Tuesday December 02, 2025 @11:00PM (#65831447)

    ...In all those millions of driving miles? That's a far better record than human drivers in any single small town.

      100,000 dogs get killed each year by human drivers (sometimes through their own owners' stupidity, like having them unsecured in the tray) https://www.petscare.com/news/... [petscare.com]

      So far Waymo has killed zero dogs, and one cat. And this dog, which didn't even die, was off-leash, so the owner should be sued for damaging the car.

    • Many millions of those miles are on roads that never have animals on them.
      • Many millions of those miles are on roads that never have animals on them.

        Until last month, Waymo only allowed their cars to drive on city streets, with no freeway driving. Even now, freeway usage is limited, only for selected riders (I'm not sure what the selection criteria are).

        So, basically all of Waymo's millions of miles are on streets that often have animals on them.

  • by timholman ( 71886 ) on Tuesday December 02, 2025 @11:18PM (#65831471)

    "I'm not sure a human driver would have avoided the dog either, though I do know that a human would have responded differently to a 'bump' followed by a car full of screaming people," the Waymo passenger wrote on Reddit.

    I can tell you exactly how many human drivers would respond in a situation like this, because I've seen it happen and have heard about it enough times: the driver would have accelerated away from the incident at high speed.

    Lots of people are jerks. And others don't want to take the risk of confronting an angry (possibly armed) person who blames the driver for running over their pet.

    The dog was unleashed. The legal fault lies with the owner. This was an unfortunate accident, but it is hardly Waymo's fault.

    • Owner's fault. I won't wreck my car to save a dog either.

      I've run over a couple that I likely saved by straddling them. One clearly lived as it was running for home after the tumbling trip under the truck. I hope it learned a valuable lesson.

      The last fatality left me with the following choices. Hit the dog, Slam on the brakes and get rear-ended, drive into the ditch which made up the median, or swerve into the right lane and hope the car in that lane just behind me could manage to miss me. The dog took the hit.

      • by tragedy ( 27079 )

        The last fatality left me with the following choices. Hit the dog, Slam on the brakes and get rear-ended, drive into the ditch which made up the median, or swerve into the right lane and hope the car in that lane just behind me could manage to miss me. The dog took the hit.

        I mean, you have to see the multiple problems with that example, right? The fact that you could not slam on the brakes or dodge without being rear-ended or crashing suggests something very, very wrong with how the people around you were driving and possibly with how you were driving. Having an issue with swerving into another lane in an emergency maneuver without being hit? Same issue.

        Sure, human limitations might make that difficult for you. As a human, you have limited situational awareness. An AI should not.

        I find it hypocritical. They tout how safe these cars are and then compare them to drunk and tired drivers to prove their point.
    I can tell you exactly how many human drivers would respond in a situation like this, because I've seen it happen and have heard about it enough times: the driver would have accelerated away from the incident at high speed.

      They would have done that after slamming on the brakes in a vain attempt to avoid hitting the dog, possibly losing control of their vehicle, and possibly causing a collision with other cars or objects. If their reaction failed to cause a serious accident, then maybe they'd have sped away.

  • Conclusions (Score:5, Insightful)

    by quantaman ( 517394 ) on Wednesday December 03, 2025 @01:10AM (#65831581)

    The one conclusion we can draw from this is the folks drawing conclusions are exposing nothing but their own beliefs.

    All we know is the dog was unleashed and the Waymo hit it.

    We don't know if the dog shot out from under a parked car, making it literally impossible to avoid, or if it was sitting in the middle of the road and the Waymo ran straight over it.

    All the folks trying to assign blame one way or another are doing so completely prematurely.

    • by RobinH ( 124750 )
      Isn't "assigning blame prematurely" the precise reason we handed everyone a megaphone by giving them a phone in their pocket with access to social media? I was wondering what everybody was thinking, and now I get to hear them scream it into the void.
    And I think the problem is we will never know. Waymo will be allowed to keep the video it surely has secret, and it will never even be seen by Caltrans or NHTSA. Transparency is what should be happening, not closed-door meetings at Waymo. All these self-driving programs are granted way too much leniency in this area. GM got caught basically because there was third-party video of their human-dragging scene. And it did not turn out well. As I recall, GM tried to bury it. And these large corps are usually very good at
  • I can't believe how angry Americans are at other Americans. Then they use that anger to justify a trillion-dollar company killing animals. It doesn't matter whether Waymo is safer than humans or not; it should also not kill anything that a human wouldn't. We need to ask ourselves whether a human would have truly seen this dog and slammed on the brakes, and not see it through the anger and disdain that apparently all Americans have for everyone and everything else.
    • Not judging Waymo and have no idea of the actual situation under discussion, but the reality is that, in a great many situations, you should run over the dog to avoid the potential of a larger problem. In the extreme, you could imagine someone slamming on their brakes at 75mph on a freeway causing a huge pileup and many fatalities just because a dog jumps out from behind a pillar or something else on the side of the road.
  • "I'm not sure a human driver would have avoided the dog either, though I do know that a human would have responded differently to a 'bump' followed by a car full of screaming people." Yes, a human would have reacted differently, but if you are willing to get into a driverless car, then you have to expect that. It's almost like we should just drop the bridges surrounding that mentally deficient city as if it were a leper colony and let the problem work itself out.
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday December 03, 2025 @07:14AM (#65832005)

    If a bot-car can be held up by a gangster simply stepping in front of it in a situation where you'd actually rather plow through, how safe is that car for the people in it? Granted, cars are more dangerous for their environment most of the time, but sometimes it's the environment that is more dangerous for the car and the people in it.

    Also driving through a rising river that is a meter deep may usually be a bad idea, but if the flood is rising it might be well worth the risk and a very reasonable decision to attempt it. You want to be able to actively make your car do that in those situations.

    • by 0123456 ( 636235 )

      People have been pointing out this problem for decades. No-one is going to want to take responsibility for building a car which will intentionally run over people, so all it will take to rob passengers in future is to stand in front of the car while your friends rob them.

      This is particularly bad for self-driving trucks because they carry a lot of valuable stuff that can be looted South African-style.

      Now, sure, there'll be video of the robbery and maybe you could put some trackers in the cargo to try to catc

  • by John Allsup ( 987 ) on Wednesday December 03, 2025 @09:54AM (#65832237) Homepage Journal

    Imagine a headline: "Car driven by human hits dog, igniting safety concerns over allowing humans to drive cars."

    It's silly. You'd laugh. It is equally silly to talk of 'self driving car hits dog' in the same way.
    The question that matters is whether or not a self-driving car is less likely to hit a dog than a human driven car.
    Improving standards of self-driving car software and hardware is in the same bucket as improving driver discipline.
    And there are many drivers with poor discipline who are more likely to hit a dog than a self-driving car.

  • Not like the humans who die at a rate of 40,000 people per year in the United States because of cars that they themselves are driving. No, that's perfectly safe.
  • "Human drivers run into millions of animals while driving each year." What an idiotic statement.

    In 2023, US drivers drove about 3.2 teramiles. Waymo claims about 100 megamiles driven so far. Waymo does not drive high-speed-limit country roads at night, or pet-laden suburban neighborhoods. These statistics are not comparable.

    Meanwhile, so Waymo hit a dog. So what? Keep your dog on a leash. Waymo hit a cat. So what? Don't let your cat run free, crapping on other people's property and killing birds.

  • Wham-o is a dangerous experiment, allowed by corrupt politicians.
