Transportation AI

Trapped! Inside a Self-Driving Car During an Anti-Robot Attack (seattletimes.com) 139

A man crossing the street one San Francisco night spotted a self-driving car — and decided to confront its passenger, 37-year-old tech worker Doug Fulop. The New York Times reports the man yelled that "he wanted to kill Fulop and the other two passengers for giving money to a robot." A taxi driver would have simply driven away. But Fulop's vehicle had no driver — it was a self-driving Waymo... Self-driving cars are designed to stop moving if a person is nearby. People can take advantage of that function to harass and threaten their passengers.... It was unsettling to be trapped inside a Waymo during an attack, Fulop said. "If he had kept hammering on one window instead of alternating, I'm sure he would have eventually broken through," he said. The attacker did not appear to be on drugs or otherwise impaired, but seemed to be overtaken by extreme anger at the self-driving car, Fulop said.

It did not seem safe to get out and run, he added, since the man was trying to open the locked doors and said he wanted to kill the passengers. They called 911 and Waymo's support line, Fulop said. Waymo told them that it would not manually direct the car away if someone was standing nearby, and that the passengers would be OK with the doors locked. The car's software does not allow riders to jump into the driver's seat and take over during an incident. The attack lasted around six minutes. By then, bystanders had begun cheering on the man, Fulop said. That distracted the man, who moved far enough away from the car that it could finally drive away...

Fulop said he had stopped using Waymo for a time after the January attack and would avoid the service at night unless the company changed its policy of not intervening when a hostile person threatened riders. "As passengers, we deserve more safety than that if someone is trying to attack us," he said. "This can't be the policy to be trapped there."

The article recalls other incidents — including a 2024 video showing three women screaming as their autonomous taxi is spray-painted by vandals. And technology author/speaker Anders Sorman-Nilsson says that in Los Angeles, five men on e-bikes surrounded his Waymo and forced it to stop. The author felt safe inside the vehicle, according to the Times, which adds "He felt reassured knowing that Waymo's many exterior cameras were recording the men. After around five minutes, he said, they gave up and rode away."
  • by Joe_Dragon ( 2206452 ) on Sunday March 22, 2026 @07:02PM (#66054956)

    could someone do that to trap a car on railroad tracks?

    • Probably. I know that at least last year and before, it would not take routes that cross train tracks.

    • Yes, it's called a rail blockade. Happens all the time.

      https://www.news18.com/news/in... [news18.com]

      https://www.sasktoday.ca/highl... [sasktoday.ca]

      https://financialpost.com/news... [financialpost.com]

      • by KiloByte ( 825081 ) on Monday March 23, 2026 @08:05AM (#66055796)

        That's a blockade done against human drivers, who (usually) know how to drive off the railway track, and the blockaders are only protesting rather than actively trying to murder. They stop cars from passing but don't trap them on the tracks.

        What GP suggests is that if people simply stand there, the self-driving car's software will stop it on the track, without aggressively trying to escape.

        Here in Poland we have a campaign teaching people how to get out of a railway crossing if you get stuck. A bunch of differently-smart humans didn't even contemplate driving through the bar gate, and in some cases didn't even evacuate the car. The bars are designed to break easily when forced by a car, but somehow, in a stressful situation, drivers regard them as sacrosanct. As Waymo cars behave that way in just about every potentially dangerous situation, I'm afraid they'll do the same on a railroad crossing as well.

        • The bars are designed to break easily when forced by a car, but somehow, in a stressful situation, drivers regard them as sacrosanct.

          They are following the rules. The rules say that the arms are an impassable barrier. Sure, the arms CAN be broken, but people who follow the rules will not even try. That is why Fascism can so easily slip into your country without any resistance.

    • Yes, it's possible that murderers are able to murder. Why would they choose that method over a lot of other methods?

      • by gweihir ( 88907 )

        Simple: This way it becomes a scary story, and some bogus anti-self-driving narrative can be pushed. Obviously, these are all rare, irrelevant events and do not even begin to get remotely close to the number of people killed by human drivers.

      • by tragedy ( 27079 )

        Yes, it's possible that murderers are able to murder. Why would they choose that method over a lot of other methods?

        A lot of the people who might want to murder people randomly tend not to be the brightest/sanest bulbs. I have personally met a number of people who were convinced that, due to the Castle Doctrine, they could invite people to their homes and murder them with impunity. None of them even lived in Castle Doctrine states. The same sort of person might have the same ideas about blocking self-driving cars this way. From their perspective, they would be completely legally in the clear. After all, they're just standing there.

    • by gweihir ( 88907 )

      That is why you can always open the doors. Basic safety engineering. Now, whether the passengers would be smart enough to get out is a different question.

      • Yeah, get out and face the probably armed idiot who jumped on the car. Smart move.

        What if it is a real terminator?

        https://www.youtube.com/watch?... [youtube.com]

        • by gweihir ( 88907 )

          Try to keep up. We were talking about an FSD car standing on railroad tracks.

        • Goalpost moving.

          Before, it was some crazed lunatic who wants to stop a self-driving car on train tracks to murder the passengers, while the passengers apparently just sit there and wait for the oncoming train to turn them into several hundred pounds of ground chuck mixed with twisted plastic, glass, and steel.

          Now, because someone pointed out the most obvious reaction to such an action (get out of the car), you have suddenly made the crazed murderer armed.

          If they were armed and intent on murder, why bother with the train tracks at all?

    • If a murderer wants to murder, there are many ways to accomplish that.

      What you are suggesting is akin to "murder, but on the internet" patents of the 90s.

      Nobody would do this, because there's a dozen easier ways to murder someone if so inclined.

  • Was anyone arrested? (Score:4, Interesting)

    by myth24601 ( 893486 ) on Sunday March 22, 2026 @07:04PM (#66054960)

    The story says they called the police. Did they ever come? Was the attacker arrested? Should we be on the lookout? Don't these things have tons of cameras? It should be easy to find the guy.

    • Did they ever come?

      Given that the story said there was an official police report, I'm going to go with yes; also, if they didn't come, that would certainly have been the headline. Beyond that, there's a lot of "no comment."

      Should we be on the look out?

      Yes, the world is full of crazy people. You should be on the lookout whether this one particular person has been arrested or not.

  • Edge cases (Score:5, Insightful)

    by Baron_Yam ( 643147 ) on Sunday March 22, 2026 @07:11PM (#66054972)

    Edge cases are why human override is required until the computer is as smart and flexible as a human being.

    If I have good reason to believe someone is trying to get into my car to kill me, I am going to try to get away, and I am completely morally fine with that meaning I run them over if that is my only viable option.

    The robotaxi would effectively sacrifice me so as to protect my attacker.

    • Re: (Score:3, Informative)

      I would like to point out that the rider signed a fuckton of waivers to his legal rights, and the attacker...did not.

      • Re:Edge cases (Score:5, Insightful)

        by Baron_Yam ( 643147 ) on Sunday March 22, 2026 @07:35PM (#66055020)

        I think you'll find you lose significant legal protections when you make credible death threats and appear to be interested in following through immediately.

        Self defense can be a valid defense for your target killing you, and you don't have to sign anything.

        • Re: (Score:3, Insightful)

          I was not trying to say riders do not have a right to defend themselves from attackers, if that was how you read it. I am suggesting that Waymo's contracts may be so full of waivers that Waymo would get into more trouble by allowing the car to be used to drive over attackers than by literally allowing the murder to take place.

          If this seems unlikely, my basis for this line of thinking is Disney attempting to use, what was it, a Disney+ trial subscription EULA as part of their justification that they were not liable in a wrongful-death lawsuit.

    • Re: (Score:2, Informative)

      by sarren1901 ( 5415506 )

      It's San Francisco, of course the criminals have all the rights. Especially compared to a tech bro. What a weird city.

      • There's also a tech bro hierarchy. OpenAI apparently had that one tech bro killed with no consequences.
    • Edge cases are why human override is required until the computer is as smart and flexible as a human being.

      Yet this edge case has demonstrated that this wasn't needed. Police exist for a reason (even if they are borderline non-functional in the USA). But I find it funny when people comment on a non-story about how they need to address edge cases. How many people have died in a Waymo?

      • Waymo might have done the right thing here. If the car had sped away and accidentally run over the person, that would be a very different story.

        The guy in the car was annoyed for six minutes and then it was over. I have some sympathy but not really.
        • Annoyed? He was threatened with credible violence. As the article notes, if the attacker had concentrated on one window, he could have broken into the vehicle. The Waymo support person should absolutely prioritize the safety of the customer over the safety of the attacker.

          • Re: (Score:3, Interesting)

            by thegarbz ( 1787294 )

            The article is making a baseless assumption. Glass side windows (especially passenger ones) are not laminated. If the windows didn't shatter, there's no reason to think they would have shattered if the focus was on just one of them, over and over again.

            Yes, the person was threatened, but again, the risk postulated in the article was not credible.

            safety of the customer over the safety of the attacker.

            Who said the attacker? A car speeding away out of control isn't about the attacker; it's about another bystander. We're making a mountain of assumptions here. Maybe speeding away would have made things worse.

            • I thought I was crazy when I read that bit about "if he would have worked on just one window"

              That's not how auto glass works. The windscreens (front and rear) are laminated, so it would take some work to make a hole that you can then grip to rip out the sealants and adhesives holding the whole windshield in place. The side windows are just safety glass. They'll break, and they'll break into a shitload of little cubes with sharp corners. But you have to be able to deliver enough force to shatter the glass in the first place.

    • by bjoast ( 1310293 )
      People probably made the same argument for the horse when the car was invented: "A horse requires no fuel. What if someone attacks you and the car runs out of fuel?"
      • People probably made the same argument for the horse when the car was invented: "A horse requires no fuel. What if someone attacks you and the car runs out of fuel?"

        I would be very surprised if someone made that argument when the car was invented. You're really stretching for this one.

      • Because if someone wanted to shoot you that badly they wouldn't just shoot the horse first?

    • by gweihir ( 88907 )

      These edge cases basically never happen. The sensationalist way this is reported already demonstrates this nicely. The whole "story" is a non-event.

  • ... rather than a bug in the humans?

    It's the feral humans involved that needed to be taken offline for maintenance ...

    • by toutankh ( 1544253 ) on Sunday March 22, 2026 @07:32PM (#66055012)

      Because if it weren't a self driving car, there wouldn't be a story to begin with. Human drivers have no issue dealing with this situation.

    • Because humans. Some of them are pure scum, so you need to account for that.
    • Because there are both. We need to do a better job of tracking down people like this and making it clear that this is not acceptable behavior, since functioning societies rely in large part on trusting people not to be assholes. But there's also a bug in the car here, since human bad behavior is probably never going to end: in particular, there should be a manual driving override option so a human can get out of a seriously dangerous scenario. Unfortunately, elsewhere on the internet, there are already some
    • by gweihir ( 88907 )

      Indeed. But institutions for the mentally ill cost money, so the US does not do those anymore. Far cheaper to put those people on the streets, and as an added bonus they create fear and make doing politics easier. Funny how basically all of the civilized world does this differently.

  • John Connor? Just some out-of-work cabbie?

  • When seconds count (Score:5, Insightful)

    by Whatever Fits ( 262060 ) on Sunday March 22, 2026 @08:01PM (#66055058) Homepage Journal
    When seconds count, the police are only minutes away. Remember that the police are not in charge of your safety, you are. If you don't feel comfortable then you need to do something to make that situation better. Depending upon the situation that might be "don't do it" or maybe get your CCW. Those are two of your many choices. Your safety really depends on you.
    • by evanh ( 627108 )

      It's actually a security concern, not a safety one. The car was in fact conforming to good safety practices.

      • by Viol8 ( 599362 )

        And if he'd poured fuel over the car and lit it? Still only a security concern?

        • by evanh ( 627108 )

          Yes, of course. Anything deliberate falls under security. Anyone can jump off a bridge if they choose to. The safety barriers only serve as protection against accidents.

  • by hdyoung ( 5182939 ) on Sunday March 22, 2026 @08:33PM (#66055118)
    so he attacks the... wait for it.... the humans.

    (removes eyeglasses and rubs eyes)

    Are we sure that the robots are the problem here?
    • by gweihir ( 88907 )

      I am sure the person making the threats here is the problem. And that this person is not in a closed mental institution. But the US thinks it is cheaper having these people out on the street ...

    • by vbdasc ( 146051 ) on Monday March 23, 2026 @03:33AM (#66055522)

      robots are soulless things without free will. It's pointless to attack them, because they won't learn. The human traitors aiding and abetting them, on the other hand... are attackable.

      I'm not necessarily thinking like that, just speculating what the attacker could have thought.

    • by tsqr ( 808554 )

      so he attacks the... wait for it.... the humans.

      (removes eyeglasses and rubs eyes)

      Are we sure that the robots are the problem here?

      The attacker didn't express anger at robots. He expressed anger at the car's occupants for funding robots. But the problem isn't the robots or the human passengers; the problem is the attacker.

  • *RKBA not valid in California...
    which is why I retired in Tucson.
    • by Dantoo ( 176555 ) on Sunday March 22, 2026 @11:16PM (#66055288)

      The three large urban cities with the lowest homicide rates in 2023 were San Jose, Anaheim, and San Diego, all in California, each with fewer than 7 homicides per 100K population.

      Tucson, Arizona, had a reported violent crime rate of 47.74 per 100k, and dropped to 54 homicides overall (about 5 per 100k) in 2025, reportedly a big improvement. So it seems Tucson has actually commenced a trend toward safety. A single incident could blow that out, though, with a 1.1M population.
      Best to stay home until it reaches something like London in the UK, at 1.1 per 100k.

    • Re: (Score:2, Insightful)

      by thegarbz ( 1787294 )

      which is why I retired in Tucson.

      Because you're just itching to kill someone needlessly? In this scenario, precisely zero people were injured or killed. Having a gun would very likely have changed that.

  • by Wolfling1 ( 1808594 ) on Sunday March 22, 2026 @09:30PM (#66055182) Journal
    So, someone's moved on to anger. I mostly see denial and bargaining. The anger is there, but normal people have some checks and balances on anger-driven behaviours. Apparently, this person did not. I was surprised to read that the crowd was cheering on the attacker. For most of the western world, threatening murder is a crime. I guess the rule of law really has become optional in the USA.
    • by rea1l1 ( 903073 )

      The rule of law is optional everywhere. There are a variety of specific requirements for a prosecution to occur that are usually very straightforward to evade if one is cognizant of one's own activities. Law is mostly a big stick intended to scare the little people who do dumb things on the open public stage. Let's not pretend the "rule of law" is some morally superior code - it's the rules the wealthy apply to the poor to keep themselves wealthy and enable tranquility among the working class, while the wealthy

  • by adfraggs ( 4718383 ) on Sunday March 22, 2026 @10:51PM (#66055270)

    K.I.T.T. would have known how to handle this situation

  • by gweihir ( 88907 ) on Sunday March 22, 2026 @11:23PM (#66055298)

    That is basically the story here. The self-driving car part does not matter. The fix? Start to care for the mentally ill again, even if that costs money. Putting them out on the streets is a very bad idea.

    • I think the major point is that self-driving car service passengers are pretty much stuck, and in potential danger, if a random someone comes up to them while they are in the car.

      Unlike in a traditional vehicle, where even if there is no driver, you can at least try to get into the driver's seat and drive off (assuming the car is running).

  • Maybe if the rider told Waymo support that they plan to unleash deadly force in self-defense if the attacker manages to break through the window, or brandishes a weapon, or tries to set the car on fire, or takes whatever other threatening action, then the passengers would fear for their lives and plan on using whatever is necessary to neutralize the threat. Decline to answer whether or not the passengers have guns; maybe they do, maybe they don't. Perhaps then some manager would decide it's worth using an override because the alternative is worse.
  • Remember?
    1. Red means STOP
    2. Yellow means SLOW
    3. Green means GO

    Oh, it is...my bad

  • this looks like the EVs are going to be forced to decide in advance on the trolley problem: "Do I save my passenger by injuring pedestrians, or almost certainly get my passenger killed?"

    This is going to have to be coded in software. And if you think you can just avoid coding it, the choice will be made for you, BY THE CODE. It's a binary decision; you can't opt out of making the decision, because INACTION is one of your options.

    So eventually legislation is going to have to get on the books to answer it.
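
    To make that concrete, here's a toy sketch in Python (hypothetical names and thresholds, nothing like a real planner) of why "not deciding" is still a coded decision:

        import enum

        class Action(enum.Enum):
            SWERVE = "leave the lane, risking pedestrians"
            BRAKE = "brake hard and stay in lane"

        def choose(risk_to_passenger: float, risk_to_pedestrians: float) -> Action:
            # Even "do nothing special" has to be written down as a branch.
            # Deleting this function doesn't remove the choice; it just
            # hard-codes BRAKE (inaction) as the answer in every case.
            if risk_to_passenger > risk_to_pedestrians:
                return Action.SWERVE
            return Action.BRAKE

    Whatever threshold or weighting goes into that comparison is exactly the thing legislation would have to pin down.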

    • Cars stop faster than trolleys, and if they talk to each other, every car in the area can slam on its brakes at the same time. That would reduce the number of circumstances where that decision would have to be made.
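
      A sketch of that idea (a made-up message format, not any real V2V standard such as DSRC or C-V2X):

        import math
        from dataclasses import dataclass

        @dataclass
        class BrakeAlert:
            sender_id: str
            lat: float
            lon: float

        def should_brake(alert: BrakeAlert, my_lat: float, my_lon: float) -> bool:
            # Crude flat-earth distance in meters; good enough at city scale.
            dx = (alert.lon - my_lon) * 111_320 * math.cos(math.radians(my_lat))
            dy = (alert.lat - my_lat) * 111_320
            # Every car within 200 m of the broadcast brakes too, shrinking
            # the set of situations where a trolley-style choice ever arises.
            return math.hypot(dx, dy) < 200.0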
    • They're going to outsource the decision making to a trained AI, then they'll say they can't precisely predict the choices it will make in advance, but that they will be in alignment with the training data.

      For me, the real question is "is it statistically better than humans?", and wherever the answer is "yes", we have to live with the machines killing people once in a while, because they'll kill fewer people than people would.
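
      A back-of-the-envelope version of that test (the rates below are assumptions for illustration, not real Waymo or NHTSA figures):

        # Expected deaths over the same exposure, under two assumed rates.
        human_fatalities_per_100m_miles = 1.3  # assumed, roughly US order of magnitude
        robot_fatalities_per_100m_miles = 0.5  # assumed, purely illustrative
        miles = 1_000_000_000                  # one billion miles of driving

        human_deaths = human_fatalities_per_100m_miles * miles / 100_000_000
        robot_deaths = robot_fatalities_per_100m_miles * miles / 100_000_000
        print(f"humans: {human_deaths:.0f}, robots: {robot_deaths:.0f}")  # humans: 13, robots: 5

      Under those assumed rates, the machines come out ahead by eight deaths per billion miles.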

  • My guess is that there would be a very different outcome if someone tried to do something like this in Texas.
  • When someone is threatening you, you should call the police, not the car company.

    When the car company gets a call about someone threatening someone in their car, they need to call the police themselves.

    Or do we have to wait till someone stops the car, disassembles it, then proceeds to shoot the occupants before people realize:

    When you are threatened, you call the cops.

  • They talk a big bunch of shit about personal freedom, but let anyone not march in lockstep with those stupid fucktards and you then see their real fucking colors:

    JUST AS UNSTABLE AND FUCKED UP IN THE HEAD AS ANY OLD NAZI THEY'D CARE TO NAME.

    Fall into the ocean, california, or secede. Either way, get the fuck away from here.

  • Always encouraging to know that the external cameras will help find the perps who murdered you.
