Transportation AI

As US Officials Showed Off a Self-Driving Robo-Bus - It Got Hit By a Tesla Driver (msn.com) 45

An anonymous reader shared this report from the Washington Post: The U.S. Department of Transportation brought an automated bus to D.C. this week to showcase its work on self-driving vehicles, taking officials from around the country on a ride between agency headquarters at Navy Yard and Union Station. One of those trips was interrupted Sunday when the bus got rear-ended.

The bus, produced by the company Beep, was following its fixed route when it was struck by a Tesla with Maryland plates whose driver was trying to change lanes, officials said. The bus had a human driver behind the wheel for backup as required by the city. The Tesla driver stayed on the scene on H Street for about 10 minutes. No police were called.

"The service was temporarily paused after another vehicle made an illegal lane change and contacted the rear of the autonomous bus, which resulted in minor cosmetic damage to both vehicles," a spokesman for Beep said in a statement. "The autonomous bus operated appropriately in the moment and, after review, it was determined the autonomous bus was safe to resume service."

Beep is working with the [U.S.] Transportation Department and Carnegie Mellon University on a pilot program of automated public buses. The vehicle was brought to D.C. for an annual conference that brings together transportation researchers and policymakers...


Comments Filter:
  • The irony! (Score:4, Funny)

    by Randseed ( 132501 ) on Saturday January 17, 2026 @11:47AM (#65931316)
    It would have been just perfect if a self-driving Robo bus took out a self-driving Waymo.
    • Probably a system crash!
  • by alvinrod ( 889928 ) on Saturday January 17, 2026 @11:49AM (#65931322)
    Bad human drivers get into accidents all the time. What's the purpose of this story other than it involving an automated vehicle getting hit and being a bit funny for that reason?

    It seems like we're still decades away from a point where most vehicles on the road will be operating autonomously. Even with the technology getting better, it will take at least that long for most of the older models without those capabilities to go out of service, and over the next decade a majority of new vehicles sold won't have autonomous capabilities either.

    Autonomous buses do make a lot of sense to implement first. Although buses have a lot of up-front cost, over their lifetime the pay for the driver will exceed it. Self-driving buses can also operate around the clock. Having bus routes still running late into the evening, when drivers don't want to work, would help increase ridership as well.
    • Re: (Score:2, Interesting)

      Bad human drivers get into accidents all the time. What's the purpose of this story other than it involving an automated vehicle getting hit and being a bit funny for that reason?

      Here in Europe, AEBS [wikipedia.org], mandatory in every vehicle registered from 2018 onwards, from cars up to 44-tonne semi-trucks, would have detected the possibility of a collision and applied the brakes automatically to prevent it.

      • by FrankSchwab ( 675585 ) on Saturday January 17, 2026 @12:44PM (#65931420) Journal

        AEB is mandatory in the USA also, but you're putting too much faith in the technology if you believe that it can and will prevent all collisions. Most manufacturers describe it in terms of "reduce the severity of impact". In this type of accident (merging into another vehicle), the AEB system would have very little information beforehand to cause it to activate.
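        That "little information beforehand" point can be made concrete with a rough time-to-collision calculation (a hypothetical sketch with made-up numbers, not any manufacturer's actual AEB logic): a car that merges in only a few metres ahead leaves the system a budget of a second or two to detect, classify, and brake.

        ```python
        # Rough time-to-collision (TTC) sketch for a late merge.
        # Hypothetical numbers; real AEB pipelines are far more complex.

        def time_to_collision(gap_m: float, closing_speed_kmh: float) -> float:
            """Seconds until contact if neither vehicle reacts."""
            closing_mps = closing_speed_kmh / 3.6  # km/h -> m/s
            if closing_mps <= 0:
                return float("inf")  # not closing, so no collision
            return gap_m / closing_mps

        # A vehicle merges 5 m ahead while closing at 10 km/h (~2.8 m/s):
        ttc = time_to_collision(5.0, 10.0)
        print(f"TTC: {ttc:.1f} s")  # TTC: 1.8 s
        ```

        Detection, classification, and brake actuation latency can consume much of that budget, which is consistent with manufacturers describing AEB as reducing impact severity rather than preventing every collision.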

        • the AEB system would have very little information beforehand to cause it to activate.

          That's very true, and yet insurance companies will blame drivers for a collision caused by another driver merging into your lane with no warning.

        • I drive around 70,000-80,000 miles a year in vehicles with this technology and have done so since 2014. Entering my 12th year and knocking on for nearly a million miles driving vehicles with AEBS, I kind of have a lot of experience of driving with it. If anything it's over-sensitive and will flash up a warning and apply the brakes even when there's no chance of a collision. For example, take a car turning into a gas station in front of me. It'll have exited the road by the time I get to where the gas st
      • Automatic Emergency Braking Systems disengage if the driver presses the accelerator pedal. They will not prevent human error.

        • Automatic Emergency Braking Systems disengage if the driver presses the accelerator pedal. They will not prevent human error.

          As someone who has driven almost a million miles with them over the last 12 years you're talking bullshit. They absolutely do not at all.

    • What's the big deal? As I see it, it's this:

      (a) Vehicles driven only by humans get into accidents.
      (b) Vehicles driven only autonomously are expected to be much safer.
      (c) Vehicles driven autonomously and by humans, mixed together on the road, may be much worse than (a) or (b).

      It's the transition from human to autonomous driving that should concern us.

    • by devslash0 ( 4203435 ) on Saturday January 17, 2026 @03:09PM (#65931600)

      The purpose is to highlight that autonomous vehicles are not immune to collisions, and that because they are trained in a certain way, and because they assume compliance with the law, they are fundamentally unable to account for the everyday rule-stretching or outright illegal behaviours of other drivers on the road.

      It's like going to school, graduating and then crashing against the real face of life. Completely different game.

    • by tlhIngan ( 30335 )

      I think the problem is that traffic stats are under-reported. Human drivers get into minor accidents so often that the actual numbers are going to shock you and your insurance company.

      Sure, you might have a minor accident and never report it, but autonomous cars are going to report every incident, and those numbers will show how bad drivers really are. We simply didn't have the means to collect that kind of data before.

      It would be instructive to figure out why those

    • It's not just pay for the driver. Robots do not form unions and go on strike. Or at least not yet. Also robots don't care if some loser points a gun at them and wants money. But god help you if you are on the bus with that guy. Seems like robot buses' best use would be to carry robots back and forth to work.

  • Looks more like a high-roof minibus. When they put the tech into a full-length city bus it'll be more useful, though I imagine navigating one of them through busy traffic is a whole lot harder.

  • Driving Standards (Score:5, Interesting)

    by Going_Digital ( 1485615 ) on Saturday January 17, 2026 @12:48PM (#65931430)
    All this proves is that some drivers are so bad they can't even see a bus in front of their nose and shouldn't be driving. Unfortunately, compared to most developed countries, the US has a very low standard to meet before being issued a license.
    • Nearly 30% of collisions are rear-enders. Autonomous vehicles are far less likely to rear-end, because they actually keep the expected distance from the vehicle in front. They also have fewer blind spots when changing lanes, so they are less likely to rear-end due to not seeing the car in the next lane.
      • Keeping adequate distance from the vehicle in front is not as simple as you might think. Nature abhors a vacuum, and if you leave adequate distance in front, some guy is going to jump into it.
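      The "expected distance" is commonly taken to be a two-second gap (a rule of thumb, not a universal legal standard), which in metres scales linearly with speed. A minimal sketch:

      ```python
      # Two-second-rule following distance (rule of thumb, illustrative only).

      def following_distance_m(speed_kmh: float, gap_s: float = 2.0) -> float:
          """Metres travelled in `gap_s` seconds at `speed_kmh`."""
          return speed_kmh / 3.6 * gap_s  # km/h -> m/s, then times the gap

      print(f"{following_distance_m(50):.0f} m")   # 28 m at 50 km/h
      print(f"{following_distance_m(100):.0f} m")  # 56 m at 100 km/h
      ```

      A gap that size is exactly what invites other drivers to cut in, which is the tension the comment above describes.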

    • some drivers are so bad they can't even see a bus in front of their nose and shouldn't be driving

      Way back in the '80s a friend had a jacked-up Chevy S-10 Blazer [wikipedia.org] (which was a fairly big vehicle) with over-sized, off-road type tires, painted bright orange - like a construction cone - with white trim. An elderly woman ran into the back of the vehicle and said she didn't see it. It boggles the mind.

    • These types of accidents, IMHO, occur because the at-fault driver expected the other vehicle to move or to be moving, and it didn't or wasn't. That might also occur if Waymos start trying to shave minutes.

  • by ObliviousGnat ( 6346278 ) on Saturday January 17, 2026 @01:12PM (#65931446)


    "Look out for the truck!"
    "What truck?"
    "Behind the bus!"
    "What bus?"

    From a 1969 AMC Rebel commercial. [youtube.com]

  • So, the lesson is: human drivers run into buses. That sounds like an argument in favor of automation.

  • Trying to take out the competition by ramming them is going too far!

  • by guygo ( 894298 ) on Saturday January 17, 2026 @03:14PM (#65931612)

    when you need one.
