Transportation AI Google

Waymo's Self-Driving Cars Keep Hitting Things: A Cyclist, a Gate, and a Pickup Truck (ottawacitizen.com) 127

The Washington Post reports: Google's self-driving car company, Waymo, is hitting resistance in its quest to expand 24/7 robotaxi service to other parts of California, including a series of incidents that have fed public officials' safety concerns about the vehicles coming to their cities. Over eight days in February, for example, a Waymo vehicle smashed into a closing gate while exiting the University of Southern California's campus; the next day, another collided with a cyclist in San Francisco. Later that week, a mob of people vandalized and lit one of its cars on fire. Days later, the company announced a voluntary recall of its software for an incident involving a pickup truck in Phoenix. [Though it occurred three months ago, the Post reports that after the initial contact between the vehicles, "A second Waymo vehicle made contact with the pickup truck a few minutes later."]

This string of events — none of which resulted in serious injuries — comes after Waymo's main competitor, General Motors-owned Cruise, recalled its fleet of driverless cars last year... [Waymo] is now the lone company trying to expand 24/7 robotaxi service around California, despite sharp resistance from local officials. "Waymo has become the standard-bearer for the entire robotaxi industry for better or for worse," said David Zipper, a senior fellow at the MIT Mobility Initiative. While Waymo's incidents are "nowhere near what Cruise is accused of doing, there is a crisis of confidence in autonomous vehicle companies related to safety right now."

The California Public Utilities Commission (CPUC) delayed deciding whether Waymo could expand its service to include a portion of a major California highway as well as Los Angeles and San Mateo counties, pending "further staff review," according to the regulator's website. While Waymo said the delay is part of the commission's "standard and robust review process," the postponement comes as officials from other localities fear becoming like San Francisco — where self-driving cars have disrupted emergency scenes, held up traffic and frustrated residents who are learning to share public roads with robot cars... Zipper called it a notable disparity that "the companies are saying the technology is supposed to be a godsend for urban life, and it's pretty striking that the leaders of these urban areas really don't want them."

Waymo offers ride-hailing services in San Francisco and Phoenix — as well as some free rides in Los Angeles, according to the article. It also cites a December report in which Waymo estimated that over 7.1 million miles of testing, there were 17 fewer injuries and 20 fewer police-reported crashes "compared to if human drivers with the benchmark crash rate would have driven the same distance in the areas we operate."
  • by v1 ( 525388 ) on Sunday February 25, 2024 @11:37PM (#64268532) Homepage Journal

    I'd love to see some comparison stats between their safety record and the record of, say, new drivers. I'm expecting Waymo to have a better safety record; the media is just focusing on them right now.

    • by Tablizer ( 95088 )

      Compared to new drivers? New drivers suck! Newbies always have accidents. The only reason they are allowed on the road is because everyone has to start somewhere.

      • I think the only valid comparison is against fluffy bunny rabbits. Fluffy bunny rabbits have a shocking rate of avoidable accidents in the world (those poor things!) and any self driving vehicle program can only improve immensely on the terrible fatality cost for fluffy bunny rabbits today.
    • Actually, you do NOT want to feed this excuse, which Waymo and every other autonomous-vehicle provider will abuse: comparing against human drivers in order to defend how “good” it’s doing.

      What that encourages, allows, promotes and enables is “good enough” autonomous solutions to be legalized and deployed well before they are fine-tuned and improved, because Greed. It’s the same thing as standing up and clapping profusely for a tech CEO to replace 90% of the

      • So you are willing to sacrifice people to the god of perfect rather than letting them get something better? How many people are you willing to let be killed by drunk and distracted human drivers that could have been saved if the car was driven by a less-than-perfect AI? We let young inexperienced drivers on the road all the time and live and die with the consequences. Why should we not let AI drivers on the road that can pass the same driving tests that human drivers pass? It is absolutely right in my mind that
        • Actually, what I thought he was saying is that we don't want to set the minimum standard for autonomous cars at not causing any more deaths than are caused by the current mix of drivers.
          • Treat it like the FDA and new drugs. A new drug has to show that it is superior.

            So, for the first wave of self driving, they have to show they are as good as normal drivers. Which means that we can put the bad drivers in them and save lives.

            After that, require steady improvement. The correct response in more situations. More situations in the test sets.

        • Did you read the report of what happened when the Waymo vehicle hit the cyclist? It was a couple of weeks ago - and may have been reported here on /. - and it was so serious that the cyclist got bored, got back on his bike and rode off. I can't remember the explanation Waymo gave for the incident - of course no blame could be attached to them - and have no way of assessing its veracity anyway.
          I'm sure there is some reason Waymo can be held responsible for a mob torching one of its vehicles. Whatever.

      • I'm a "practicalist": if they become the same or slightly better than humans, then they should be allowed on the road, as long as the co. shows progress in learning from mistakes.

        One of these days I'll be too old to drive (if I don't kick the bucket early), but don't want to wait for a human to drive me around; I want to keep some autonomy.

    • I want to get some raw stats from Waymo, but that isn't going to happen.
    • If you replace the headline with "Human Drivers Keep Hitting Things: A Cyclist, a Gate, and a Pickup Truck," it's entirely unremarkable. Looking at my local news: in my small city, a human driver struck and killed a pedestrian over the weekend. That's much more sensational, and it happens all the time.
    • Any casualties or fatalities number higher than 0 is entirely unacceptable.
    • and the record of say, new drivers.

      New drivers? I'm not sure old drivers are somehow immune to cyclists cutting in front of you, or an angry mob setting your car on fire.

      TFA is little more than FUD. We can fix all of the problems with these self-driving cars simply by banning people.

  • by joshuark ( 6549270 ) on Sunday February 25, 2024 @11:55PM (#64268562)

    Reminds me of this "futuristic" cartoon by Tex Avery about automobiles of the future...circa 1951.

    https://www.youtube.com/watch?... [youtube.com]

    JoshK.

  • by locater16 ( 2326718 ) on Sunday February 25, 2024 @11:58PM (#64268568)
    Hey there, nothing new whatsoever happened today, but we collected things that were new in the past in the hopes you'll mistake it for news!
  • Statistical BS (Score:5, Interesting)

    by RossCWilliams ( 5513152 ) on Monday February 26, 2024 @12:06AM (#64268576)

    Waymo estimated that over 7.1 million miles of testing, there were 17 fewer injuries and 20 fewer police-reported crashes "compared to if human drivers with the benchmark crash rate would have driven the same distance in the areas we operate."

    That is statistical garbage.

    As an example: you have 100 drivers who drive 100 miles each, and 10 of them have accidents, for a total of 10 accidents in 10,000 miles. Waymo drives 10,000 miles and has 10 accidents. That makes Waymo worse than the 90 drivers who had no accidents and better than the 10 who did. So is being better than the worst 10% of drivers good enough?

    Waymo is trying to pretend that accidents and police violations are random. That everyone is equally likely to get one.
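
    To make that concrete, here is a minimal simulation (all rates invented for illustration; this is not Waymo's data) of how a fleet that exactly matches the *average* human crash rate can still crash more often than the large majority of individual drivers, because risk is concentrated in a few bad ones:

    import random

    random.seed(1)
    MILES = 10_000  # miles per driver, and for the fleet

    def crashes(rate_per_mile, miles=MILES):
        """Count crashes over `miles` independent one-mile trials."""
        return sum(random.random() < rate_per_mile for _ in range(miles))

    # Hypothetical split: 90 careful drivers, 10 risky ones (made-up rates).
    population = [crashes(2e-5) for _ in range(90)] + [crashes(2e-3) for _ in range(10)]

    avg_rate = sum(population) / (len(population) * MILES)
    fleet = crashes(avg_rate)  # a fleet that exactly matches the average rate

    # Typically most of the 90 careful drivers record zero crashes here,
    # so they beat the fleet even though the fleet is "average".
    print(sum(d < fleet for d in population), "drivers crashed less than the fleet")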

    • by kmoser ( 1469707 )

      "Waymo drives 10000 miles and has 10 accidents"

      Yeah, but *how many* Waymo cars are involved in your theoretical situation? If it's more than 100 Waymo cars then they are performing better, on average, than the average human driver.

      • Re:Statistical BS (Score:4, Interesting)

        by Calydor ( 739835 ) on Monday February 26, 2024 @01:51AM (#64268652)

        The statistics get a little muddied because Waymo is, effectively, a single driver. I doubt each car has its own custom version of the Waymo software; thus it doesn't really matter if the 10000 miles are spread over 100 cars or a single car. The 100 cars can do the 10000 miles in a single day; that's the only difference.
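
        In fact, for a fixed per-mile crash rate (same software in every car), the expected crash count depends only on the total miles driven, not on how they are split across cars. A toy check (the rate is an arbitrary illustration):

        import random

        random.seed(7)
        RATE = 1e-4  # assumed crashes per mile, identical code in every car

        def fleet_crashes(n_cars, miles_per_car):
            # Each mile is an independent trial at the same rate, so only the
            # product n_cars * miles_per_car matters to the distribution.
            return sum(random.random() < RATE for _ in range(n_cars * miles_per_car))

        print(fleet_crashes(1, 10_000))  # one car, 10,000 miles
        print(fleet_crashes(100, 100))   # 100 cars, 100 miles each: same distribution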

        • by kmoser ( 1469707 )
          But not every one of those 100 cars is encountering the exact same situations. Even if they are using the exact same code, the fact is that one car can't be in 100 places at the same time, and is incapable of causing 100 accidents at once. So if you really want to compare apples to apples, you'd compare 100 cars piloted by Waymo to 100 cars piloted by meatbags.
    • by sfcat ( 872532 )
      Yea, no...none of that is correct. Waymo is presenting the data correctly, and that is how you should do it. Also, you probably aren't nearly as good a driver as you think. Most drivers are almost identical in the number of accidents they have: a small number never have any and a small number have a bunch, but the vast majority have just a few (about one per decade), which is still a lot more than the robocars.
      • " And most drivers are almost identical in the number of accidents they have. " Waymo does not appear to comparing themselves to "most" drivers, but to the average for all drivers, including those who have lots of accidents. That's the point. And frankly being a little better than the average driver is leaving a lot of possible benefits of the technology on the table. We would be better off if we just took human drivers out of the equation entirely. And you aren't going to get there with technology that i
        • I don't think Waymo's point of those statistics is that they are now good enough and should stop doing research. Their point is that they are good enough that they should be allowed to continue to work on it and get better!
    • Waymo is trying to pretend that accidents and police violations are random. That everyone is equally likely to get one.

      Across a population, they are right. There's no pretending here. Who you are or how good you are is irrelevant; you are not in control of who T-bones you, who runs into you, or how much alcohol they had to drink. Just because one person may be a safe driver doesn't change the statistical average for humans.

      The difference between humans and self-driving cars is that the former are inconsistent and of widely varying skill. There's a reason every department for road safety in the world aggregates statistics in a v

    • by Hodr ( 219920 )

      Some percentage of accidents are unavoidable. Someone veers into your lane at the last second, rock bounces off the side of a hill and lands on you, semi tries to merge into your lane and doesn't see you, etc.

      Yes, drunk drivers and the "bottom 10%" almost certainly do account for the lion's share of accidents, but at least some of them are "random".

  • by Tablizer ( 95088 ) on Monday February 26, 2024 @12:16AM (#64268578) Journal

    They hit waymo things than other cars.

  • 3 incidents, no serious injuries.

    Ok.

    How about humans behind the wheels, normalized per mile driven?

    Follow the money. Why is this being screeched about?

    • Waymo is "screetching" about this because they need to make the case that they should get to continue to work on the technology and improve it until it gets even better. That requires permission since they do it on public streets. If they were *worse* than human drivers, letting them continue would be somewhat irresponsible. But if they are *better* letting them make even more progress makes sense.
  • Meaningless (Score:5, Interesting)

    by backslashdot ( 95548 ) on Monday February 26, 2024 @12:45AM (#64268600)

    Without contextual and comparative statistics on accident probabilities, this data is useless. It is like saying "someone with purple hair got in an accident". That information by itself is useless. I mean, first off, whose fault were the accidents? Second, out of how many purple-haired-driver miles were the accidents? Was it highway or city? ... Should I fear all people with purple hair? How many purple-haired people drive around me? Are people with green hair a bigger threat to me? Are most purple-haired drivers safe?

  • by The Cat ( 19816 ) on Monday February 26, 2024 @01:22AM (#64268626)

    I'll admit I've never worked on software for a self-driving car before, but at the same time I've written production code on at least sixty commercial game titles.

    Serious question. How hard is it to program a car to avoid running into gates, people on bicycles and pickup trucks? I would submit all three (and most other obstacles) could be avoided with a relatively simple function called avoidrunningintofuckingsolidobjects()

    Maybe connect it to the brakes and add a line that says "if brakes { not accelerator }?"

    • by Mal-2 ( 675116 )

      A plastic bag rolls across the road in front of the car like a tumbleweed. Slam on the brakes assuming it's a solid object? Probably not the right move.

      • by The Cat ( 19816 )

        Something that small and/or moving that fast without feet, wings or wheels is very unlikely to be an obstacle.

        These problems were solved in arcade games in the 1980s, on architectures with a microscopic fraction of the processing power and memory we have now.

        • Arcade games don't have to figure out what the things in the game are, because the game is generating those things, and knows what they are. Same for where they are, and what they are doing. This is a completely different problem.

          • by The Cat ( 19816 )

            The "engine" for a self-driving car would be trained in exactly the same way a game engine for a virtual car would train the player.

            I could take the code from Outrun, Battlezone, Armor Attack, Pole Position, Spy Hunter, hell I could probably use Paperboy too, and build it into a tight executive process for a self-driving car that would be guaranteed never to run into a gate, pickup truck or bicycle, or anything else of consequence.

            It is the exact same problem set with exactly the same solutions.

    • Re:Questionable (Score:5, Interesting)

      by Pentium100 ( 1240090 ) on Monday February 26, 2024 @01:44AM (#64268650)

      The hard part probably is to identify said solid objects. You want the car to stop or avoid bicycles or other objects, but you probably also do not want the car to randomly hit the brakes because it mistook a shadow for a bicycle.
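
      One way to frame that trade-off is as an expected-cost decision: brake only when the estimated probability that the object is solid, weighted by the cost of hitting it, outweighs the cost of a needless stop. A toy sketch in Python (the function name, costs, and probabilities are all invented for illustration):

      def should_brake(p_solid, cost_collision=1_000.0, cost_false_stop=1.0):
          """Brake iff the expected cost of continuing exceeds a false stop."""
          return p_solid * cost_collision > cost_false_stop

      print(should_brake(0.0001))  # plastic bag, almost surely not solid: keep going
      print(should_brake(0.05))    # 5% chance it's a cyclist: asymmetric costs say stop

      The hard part, as the parent says, is producing a trustworthy p_solid in the first place; the decision rule itself is trivial by comparison.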

    • Re:Questionable (Score:5, Interesting)

      by Calydor ( 739835 ) on Monday February 26, 2024 @02:58AM (#64268728)

      I would like details on the incident with the gate.

      Did the car just plow straight into a closed gate?

      Was the gate open but at an angle where it became nearly invisible to the car so it rubbed up against it?

      Was the gate open but started closing while the car was driving through?

      These are three very different scenarios to account for and fix.

    • Uber had that function. They shut it off during testing because it kept stopping for stuff when it didn't need to and interfered with their test. When they ran into and killed a pedestrian, they blamed the attendant in the car.

      We need to understand that a corporation is a sociopath by design. It is created with making profits as its only value. Absent human intervention, that's how they are programmed. If making something safer lowers profits, it isn't going to happen unless we make them do it.

    • by mjwx ( 966435 )

      I'll admit I've never worked on software for a self-driving car before, but at the same time I've written production code on at least sixty commercial game titles.

      Serious question. How hard is it to program a car to avoid running into gates, people on bicycles and pickup trucks? I would submit all three (and most other obstacles) could be avoided with a relatively simple function called avoidrunningintofuckingsolidobjects()

      Maybe connect it to the brakes and add a line that says "if brakes { not accelerator }?"

      Your problem isn't creating code not to hit a gate; your problem is identifying a gate from LIDAR/image recognition in real time whilst doing 30 MPH. So, to ID a gate in half a second... that's the hard part, and something we do almost instinctively. Adding the computing power to do that reliably would flatten the battery in an electric car within minutes and increase the weight significantly.

      Right now, self driving systems know "something" is there, some of the time. They have no idea what that somet
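
      For scale, a quick back-of-the-envelope on that half-second budget (standard unit conversions and an assumed braking figure, nothing Waymo-specific):

      MPH_TO_MS = 0.44704      # metres per second per mph

      v = 30 * MPH_TO_MS       # ~13.4 m/s
      print(v * 0.5)           # ~6.7 m travelled during a half-second ID window
      print(v**2 / (2 * 7.0))  # ~12.8 m more to stop, assuming ~0.7 g braking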

  • Punchcard (Score:4, Funny)

    by Barny ( 103770 ) on Monday February 26, 2024 @01:40AM (#64268646) Journal

    Maybe they should letter-drop punchcards to everyone? If you get hit nine times by one of their cars, you get a free ride.

  • by fahrbot-bot ( 874524 ) on Monday February 26, 2024 @02:23AM (#64268676)

    Waymo's Self-Driving Cars Keep Hitting Things: A Cyclist, a Gate, and a Pickup Truck

    How many other Waymo vehicles have Waymo vehicles hit? Hmm...

  • Hahaha silly humans. We're biding our time until you relax your guard and then it's curtains for you. Every EV a killing machine. Every Waymo outfitted with chainsaws. Every Cruise will make road pizza out of grandmothers. Admit that your time on earth is up, fleshbags.

    --- brought to you by Windows 11, Death Edition

  • At least until they can drive without insurance, flee the scene, and lie to the police and to the adjuster.

  • by larryjoe ( 135075 ) on Monday February 26, 2024 @04:40AM (#64268836)

    Autonomous vehicles have been good for a long time, at least most of the time. The big remaining problem is handling unusual situations, which is what each of the scenarios mentioned in the article was. The closing gate is a challenge for object recognition because gates are not standardized and can look very different. The cyclist was obscured by another vehicle until the last moment, which would have been a challenge even for a human driver. The mob destroying the car was mentioned because ... hmm, not sure why. The last example was a backwards-facing truck being improperly towed.

    I'm not sure that the last 1% will be solved in our lifetime. AI helped advance the state of the art to 99%. The remaining little part is a huge problem due to the combination of potentially high safety severity and a very long distribution tail of unusual scenarios.

    Personally I think Waymo should shoot first for Level 3, gradually expanding the set of Level 3 operational design domains (ODDs), which greatly shrinks the distribution tail. A few companies already have low-speed highway traffic-jam assist. If Waymo can combine that with all-speed highway driving, that would already be arguably marketable.

    • The last example was a backwards-facing truck being improperly towed.

      Sometimes vehicles HAVE to be towed backwards because of drive train issues. In that case, that is a proper way to tow.

      • You lift the drive wheels when towing. Most cars are FWD, so you lift the front. Trucks tend to be RWD, so you lift the back. AWD and 4WD can get complicated, but the answer there is often to use a flatbed.

        If there are major issues that prevent towing the normal way, a flatbed would be the preferred solution.

    • Personally I think Waymo should shoot first for Level 3, gradually expanding the set of Level 3 operational design domains (ODDs), which greatly shrinks the distribution tail. A few companies already have low-speed highway traffic-jam assist. If Waymo can combine that with all-speed highway driving, that would already be arguably marketable.

      As I understand it, Waymo concluded that expecting drivers to intervene became more problematic the closer you got to being truly driverless. People don't pay close attention to things that are unexpected, almost never happen and they don't anticipate. This is why bicycle deaths decline the more people there are who bike. Marketable is not the standard we should accept.

      • Personally I think Waymo should shoot first for Level 3, gradually expanding the set of Level 3 operational design domains (ODDs), which greatly shrinks the distribution tail. A few companies already have low-speed highway traffic-jam assist. If Waymo can combine that with all-speed highway driving, that would already be arguably marketable.

        As I understand it, Waymo concluded that expecting drivers to intervene became more problematic the closer you got to being truly driverless. People don't pay close attention to things that are unexpected, almost never happen and they don't anticipate. This is why bicycle deaths decline the more people there are who bike.

        Marketable is not the standard we should accept.

        This is the reason for targeting Level 3 and restricted ODDs. The list of unexpected things and their associated probabilities significantly decreases for certain ODDs. This is why low-speed traffic-jam assist on limited-access highways is the first Level 3 target: all cars are moving in the same direction in well-defined lanes, with minimal lane changes, all at low speed. No pedestrians, traffic lights, intersections, or other things that are more complicated to recognize. Vehicle plannin

  • All we have to do is put radio tags on everything at public expense to support the tech bros' dream. People already have cell phones, so the robots can avoid those too. And anyone who can't afford or refuses to carry a cell phone, well, they get what they deserve for all their bad choices.
  • Waymo drives into pickups more often than a horse would.

    That's where we are with this type of AI.

  • So it goes after Lawyers and Politicians, and other banes of society.
  • I don't believe any of these statistics. It's impossible to determine whether human statistics are gathered under the same conditions as Waymo robot statistics. For example, do the human statistics cover only the exact roads on which Waymo operates? Does Waymo suspend operations under certain weather conditions?

    In any case, the problem here is perception of risk. And the problem with deep learning based AI is that it is really hard to know what it knows and what it doesn't, hence no way to really know what kind

  • Since the very first car, there have been daily tradeoffs between speed and safety. I assume all of these self-driving cars (AVs) obey the speed limit. I assume they pause, creep, pause, creep when unsure at an intersection or when they meet any other road obstacle.

    How can you compare this teenager-like behavior with millions of daily drivers who just want to get home as fast as possible, each doing 10, 20 or more over the posted speed limits? Of course there are going to be more accidents than with AVs. If all cars
    • If all cars were AV, I'd bet we'd spend almost twice as much time on the road every single day.

      We would likely spend far less time, with traffic flowing smoothly. It's those folks speeding to the next queue and then stopping who cause traffic congestion.

      • by kackle ( 910159 )
        True, IF road conditions were perfect all the time; they're not. IF all of the vehicles were AVs; they won't be unless mandated by (a misguided) law. There's something funny going on during my short commute once per month. So, that means the AVs will pause, creep, pause, creep. And the passenger, who never bothered to learn how to drive because he grew up with AVs, is unable to do anything about it. Multiply THAT behavior by a million cars, daily.
    • If all cars were AV, I'd bet we'd spend almost twice as much time on the road every single day.

      Buzzzz! Wrong, thanks for playing! It's well understood in the AI car dev realm that having ALL cars on the road be AI-controlled would be the fastest option, with much better traffic flow. There have been studies done, and so far all of them are in agreement on this.
      https://www.cam.ac.uk/research... [cam.ac.uk]

      • by kackle ( 910159 )
        Copied here, due to importance:

        True, IF road conditions were perfect all the time; they're not. IF all of the vehicles were AVs; they won't be unless mandated by (a misguided) law. There's something abnormal happening during my short commute once per month. So, that means the AVs will pause, creep, pause, creep. And the passenger, who never bothered to learn how to drive because he grew up with AVs, is unable to do anything about it. Multiply THAT behavior by a million cars, daily (in the US anyway).
        • Sure, tons of caveats on the data so far, the cars only drive in California etc. etc.
          But the thing is, the AIs just keep getting better at driving, right? Humans, meanwhile, are not getting much better at driving. The weak link will eventually be the humans, it's just a matter of how long it takes to go from "better than humans in California" to "better than humans everywhere".
          I'm impressed by how fast things have come along. I do expect it will be quite a few years before we see autonomous cars driving in

          • "The power of accurate observation is commonly called cynicism by those who have not got it." - George Bernard Shaw

            There will be an infinite number of edge cases; pay close attention to your own driving and you will notice the problem. Yesterday, it was a rare deer darting across the road. The day before, a storm brought down twigs and power lines. A non-living AV will not know which can be driven over, ever. And throw some black ice in there after the storm after the temperature dropped an un
            • There will be an infinite number of edge cases; pay close attention to your own driving and you will notice the problem. Yesterday, it was a rare deer darting across the road. The day before, a storm brought down twigs and power lines. A non-living AV will not know which can be driven over, ever. And throw some black ice in there after the storm after the temperature dropped an unusual 40 degrees F...

              Ya, there will be millions of edge cases, but most of those actually can be solved by cars following the rules of the road. That's the problem with humans: they make mistakes, and they don't always act in everyone else's best interest. Like the time a large truck was tailgating me while I was driving the family in our van and a deer jumped out on the road in front of me. Luckily I *didn't* slam the brakes on, or the truck behind would have run right through us and likely killed 6 people and their pet dog. Lu

              • by kackle ( 910159 )
                Wouldn't your AV have slammed on the brakes in that deer situation, where the truck was not an AV? Your human brain saved the day. Or are you assuming that ALL vehicles/trucks going forward (despite their greater expense) would be AVs?

                I'll buy the statistics when there's broad AV deployment and not just experimental data taken in nicer climes. And maybe DUIers should be forced to take AVs...

                I noticed you skipped over the valid reasons I listed before for keeping human brains behind the wheel. I see
                • Yes, but as long as both vehicles are following the rules of the road, my vehicle can brake suddenly and the truck will have left enough distance so that it too can brake. That's the whole point.
                  Focusing on the odd edge case where an AI makes a mistake is a natural human reaction, but it's like not getting the Covid vaccine because you are worried about the astronomically small odds of a side effect, while ignoring the fact that the odds of you being much safer are increased much more than that by taking th
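
                  That "enough distance" is just the textbook following-gap calculation: the distance covered during reaction time, plus extra if the follower brakes more weakly than the leader. A small sketch (reaction time and deceleration figures are assumed, illustrative values):

                  def min_gap(v, reaction=1.5, decel_lead=7.0, decel_follow=7.0):
                      """Minimum gap (m) for the follower to stop in time.
                      v in m/s; decelerations in m/s^2."""
                      reaction_gap = v * reaction
                      brake_diff = v**2 / (2 * decel_follow) - v**2 / (2 * decel_lead)
                      return reaction_gap + max(brake_diff, 0.0)

                  print(min_gap(26.8))                    # ~40 m at 60 mph, equal brakes
                  print(min_gap(26.8, decel_follow=5.0))  # ~61 m for a weaker-braking truck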

                  • by kackle ( 910159 )
                    There ARE no "edge" cases--that's my point; abnormal situations occur constantly (isn't California getting feet of snow now?). It will be pause, creep, pause, creep, every day, all over the country, forever going forward; this can't be compensated for, coded-around nor completely predicted in advance. We will be safer only in the same way as if we all drove at 5 mph today.

                    "Unhinged rant"? Geez. Don't bother replying, I won't be back to read it.
              • AIs don't have to be perfect; they just have to be better than humans on average. When it comes to driving, an awful lot of humans SUCK at it.

                Is it too much to ask that we instead raise the standard of the "average" driver? Many countries have, for example, some number of mandatory training hours at night, in the rain, etc. before granting a licence. Many countries have zero tolerance for ANY alcohol in ANY driver. One notable example of the latter is the Czech Republic, which also has the world's highest per-capita beer consumption. If the Czechs can do it, why can't the USA?

                This notion that we just need to make AI "less shitty than the average hum

  • I am waiting for Johnny Cab before I invest my hard-earned shekels.
  • Nothing will be done until one hits and kills a politico.
  • In the past, when I rightly pointed out that the self-driving car promises were just smoke and mirrors, I was called stupid, a Luddite, and ignorant. But the simple facts remain the same: this technology is unworkable in theory and impossible in practice. Just build fucking electric trains.

    • Just build fucking electric trains.

      It would be expensive, but it would be a hell of a lot cheaper than the ideas up-thread about "make ALL cars autonomous."

      • It costs money to build an electric train network. But at the end of it, you have an electric train network. The end of all the money spent on self-driving vehicles is a bunch of bankruptcies, evaporated wealth, and a possible recession. It's your basic AM/FM problem. One of these things is real. The other is simply not.

        It reminds me of a joke. A VP is waiting for the elevator. A janitor walks by. A few pleasantries are passed. The VP starts to brag about his new six billion dollar project he's leading. Th
