Transportation

Feds Probe Waymo Driverless Cars Hitting Parked Cars, Drifting Into Traffic (arstechnica.com) 57

An anonymous reader quotes a report from Ars Technica: Crashing into parked cars, drifting over into oncoming traffic, intruding into construction zones -- all this "unexpected behavior" from Waymo's self-driving vehicles may be violating traffic laws, the US National Highway Traffic Safety Administration (NHTSA) said (PDF) Monday. To better understand Waymo's potential safety risks, NHTSA's Office of Defects Investigation (ODI) is now looking into 22 incident reports involving cars equipped with Waymo's fifth-generation automated driving system. Seventeen incidents involved collisions, but none involved injuries.

Some of the reports came directly from Waymo, while others "were identified based on publicly available reports," NHTSA said. The reports document single-party crashes into "stationary and semi-stationary objects such as gates and chains" as well as instances in which Waymo cars "appeared to disobey traffic safety control devices." The ODI plans to compare notes between incidents to decide if Waymo cars pose a safety risk or require updates to prevent malfunctioning. There is already evidence from the ODI's initial evaluation showing that Waymo's automated driving systems (ADS) were either "engaged throughout the incident" or abruptly "disengaged in the moments just before an incident occurred," NHTSA said. The probe is the first step before NHTSA can issue a potential recall, Reuters reported.
A Waymo spokesperson said the company currently serves "over 50,000 weekly trips for our riders in some of the most challenging and complex environments." When a collision occurs, Waymo reviews each case and continually updates the ADS software to enhance performance.

"We are proud of our performance and safety record over tens of millions of autonomous miles driven, as well as our demonstrated commitment to safety transparency," Waymo's spokesperson said, confirming that Waymo would "continue to work" with the ODI to enhance ADS safety.

Comments Filter:
  • by quonset ( 4839537 ) on Tuesday May 14, 2024 @05:25PM (#64472333)

    Crashing into parked cars, drifting over into oncoming traffic, intruding into construction zones

    This is exactly how humans drive. This is just Waymo catching up to reality.

    • by geekmux ( 1040042 ) on Tuesday May 14, 2024 @05:34PM (#64472351)

      Crashing into parked cars, drifting over into oncoming traffic, intruding into construction zones

      This is exactly how humans drive. This is just Waymo catching up to reality.

      True, but there’s one glaring difference:

      all this "unexpected behavior" from Waymo's self-driving vehicles may be violating traffic laws

      If a human driver pulled that shit, exactly no one would be talking about how they “may” have violated traffic laws. The human driver would be facing several traffic infractions, points on license, insurance penalties, and possibly civil or criminal charges.

      Now explain how in the hell we make that fair in the future for victims.

      • Well, the renter of the car may be on the hook,
        and Waymo can say that when someone requests a ride, they are renting the car, and per the EULA they need to cover any damage, cleaning costs, and tickets while the car is under their hire.

      • Not really. Members of my family have been in a few crashes over the years, none too serious, and I don't think anybody was ever cited on either side. However my wife did get sued once (unsuccessfully), and my son got what he says was a pretty sizable settlement for being hit by a car when he was cycling. But I don't think they were cited or anything like that either.
      • by brunes69 ( 86786 )

        What makes you think Waymo did not pay for all of the damages?

        • What makes you think Waymo did not pay for all of the damages?

          What makes you assume that once harm becomes an acceptable “norm” they’ll continue to agree to do so easily? Think suing one of these huge corporations is going to get easier in the future, or do you think you’ll suddenly find you waived your right to sue when you opened a driverless door and agreed to the EULA you didn’t read?

          All autonomous cars have to do in order to gain full legal support and compliance, is prove they kill less than humans do. If 30,000 deaths a year is basica

          • That's not how this works. That's not how any of this works.

          • by N1AK ( 864906 )
            Even if you're right on everything else, then barring particularly unlikely edge cases, reducing traffic related deaths by 66% IS a huge win and should be viewed as such.

            Plenty of people currently get injured or killed in traffic accidents where the perpetrator leaves the scene and isn't found, has no insurance and no assets, leaving the victim reliant on their own insurance; or it happens in circumstances where it isn't possible to determine responsibility with enough confidence. I find it very unlikely that
            • Even if you're right on everything else, then barring particularly unlikely edge cases, reducing traffic related deaths by 66% IS a huge win and should be viewed as such.

              I wonder if you’ll be that dismissive when it’s a mega-corp you stand zero chance of suing successfully, killing a loved one with a software glitch or bug they knew about and did nothing to correct, because profits.

              I’ve just described the kinds of abuses that have happened for decades. I’m not sure why people are so damn trusting about how companies will own and manage a product like autonomous vehicles. As if their tactics of buying laws and manipulating lawmakers will suddenly ce

      • The human driver would be facing several traffic infractions

        Humans yakking on their cell phones drift out of their lane all the time.
        They are very rarely cited for it.

      • by batkiwi ( 137781 )

        If a human driver pulled that shit, exactly no one would be talking about how they “may” have violated traffic laws. The human driver would be facing several traffic infractions, points on license, insurance penalties, and possibly civil or criminal charges.

        Human drivers pull that shit all the time, and they do not face infractions/fines/points/etc because they aren't recorded 24/7 and don't have an obligation to self report every incident.

        Do you think every human who has drifted into oncoming traffic/hit a parked car/gone into a construction zone has called up the police after to self report?

        • by Brain-Fu ( 1274756 ) on Tuesday May 14, 2024 @10:22PM (#64472825) Homepage Journal

          don't have an obligation to self report every incident.

          Yes we do. It's a legal obligation and if you don't self-report then you are guilty of hit-and-run, which comes with worse penalties.

          Of course that doesn't stop some people from trying, when they think they can get away with it. But whether they get away with it or not, the legal obligation to report a collision is still there.

          • I'm sorry, I must have missed something. If I happen to be adjusting the radio and drift into the other (oncoming) lane and there is no accident, am I expected to call the police and report myself for drifting into the other lane? I don't actually do this in the real world, but hypothetically, because the article refers to drifting into the oncoming lane with no accident occurring (among other things). If the same example occurs and I drift into a construction zone, and there is no accident and
        • If a human driver pulled that shit, exactly no one would be talking about how they “may” have violated traffic laws. The human driver would be facing several traffic infractions, points on license, insurance penalties, and possibly civil or criminal charges.

          Human drivers pull that shit all the time, and they do not face infractions/fines/points/etc because they aren't recorded 24/7 and don't have an obligation to self report every incident.

          Do you think every human who has drifted into oncoming traffic/hit a parked car/gone into a construction zone has called up the police after to self report?

          Do you expect every company to bend over and simply pay out when their robotic driving force screws up, especially when driving safety will be the de facto standard by which driverless companies brag they are the better choice vs. the competition? Hilarious that you assume corporations “self-report” their crimes so much better than individuals do (cough, Big Tobacco, wheeze). They’re gonna fuck people over with nothing but a damn EULA at the end of the day. You’ll likely waive all right

      • by torrija ( 993870 )

        Make it so that if a driverless car gets a fine, all driverless cars of the same type get the same fine. They are in fact the same automated driver.

        But you are right, first they need to get that fine.

      • Now explain how in the hell we make that fair in the future for victims.

        Since this is an American topic, let me answer as our American government will ultimately decide:

        Waymo is a corporation. Therefore the burden should fall to the taxpayer to pay all penalties for any infraction the corporation creates that may have a negative impact on ordinary citizens. Also, if we could maybe hand them a few extra billion while we're at it just to make up for the fact that we allow all these distractions from the utopian vision of the driverless car as the perfect solution to imperfect hum

      • If a human driver pulled that shit, exactly no one would be talking about how they “may” have violated traffic laws. The human driver would be facing several traffic infractions, points on license, insurance penalties, and possibly civil or criminal charges.

        Drifting over center lines and intruding into construction zones would result in absolutely no penalty whatsoever for a human driver unless they had done so in view of the police. I see humans do this often while I'm driving, nothing happens. There isn't anyone reviewing footage or analyzing every mistake from the data log for human drivers, so there aren't any penalties. Even if you are seen by the police, unless you live somewhere like Virginia, they frequently don't care and won't pull you over, especia

      • by mjwx ( 966435 )

        Crashing into parked cars, drifting over into oncoming traffic, intruding into construction zones

        This is exactly how humans drive. This is just Waymo catching up to reality.

        True, but there’s one glaring difference:

        all this "unexpected behavior" from Waymo's self-driving vehicles may be violating traffic laws

        If a human driver pulled that shit, exactly no one would be talking about how they “may” have violated traffic laws. The human driver would be facing several traffic infractions, points on license, insurance penalties, and possibly civil or criminal charges.

        Now explain how in the hell we make that fair in the future for victims.

        Here in the UK, if you hit a parked car you'd be lucky to walk away without a "driving without due care and attention" charge (unsafe driving, for the Americans playing along at home), which carries a minimum of 3 points but may come with a disqualification (you lose your right to drive). And that is the minimum; you're just as likely to be hit with a Dangerous Driving charge, which almost certainly means a disqualification.

        Also your insurance will tear you a new one, when you try to insure your next car... And

    • The problem is we need fewer people to crash, not more Waymos that do.

      • Okay, so, I guess your answer is to reward Waymo (for example) for every fatality they cause, because in the end that reduces the number of people?
    • by evanh ( 627108 )

      Humans are individuals, the same can't be said of bots.

      • Humans are individuals, the same can't be said of bots.

        Au contraire, mon frère. Despite claims that each bot is the same, they are not. There are still minute differences in how the software is laid down, in the computer chips, in the motherboard, in the wiring, and so on. Each bot is an individual, albeit not a living one. If every bot were identical, you would only need to create one to perfection and then replicate it.

      • That's exactly the advantage of these robotic cars: if one car happens upon a situation that can be dealt with through a software update, ALL cars can deal with it afterwards. Humans must each learn it individually, and may even forget it in the long run.
    • Perhaps it's learning from its peers.
    • Those parked cars really had it coming.
    • It's what you get when you train driving AI on real data. At least the model is accurate this time.

  • by MpVpRb ( 1423381 ) on Tuesday May 14, 2024 @05:45PM (#64472359)

    The tech was released before it was ready

    Unfortunately, there is no way to perfect the tech on test tracks or closed roads. It needs to be tested in the real world

    • by Mr. Dollar Ton ( 5495648 ) on Tuesday May 14, 2024 @05:49PM (#64472379)

      What you mean to say is

      Unfortunately, there is no cheap enough way for the Waymos to perfect the tech on test tracks or closed roads. It is cheaper for them to do testing in the real world

      so that's what they do.

      • That's just not the case; you simply cannot think up every single situation on a test track that happens on public roads. Humans and environments are way more unpredictable than any test road can provide. But I'm all for a public database with all accidents/incidents which manufacturers can access and use for testing purposes, and even a requirement to pass a certain set of situations before a system is allowed on public roads. And those requirements are continually, so every x months the s
        • That's just not the case, you just cannot think up every single situation on a test track that happens on the public roads.

          Thinking is what you don't do, so this should be perfect for you.

          Instead you get private access to some streets, whether that means building your own fake town or (more probably) using one of the many abandoned ones and tarting it up. Then you pay test drivers equipped with safety gear to drive like idiots, this is easy, just recruit them from among the general population and hire a lot of young men with primer Hondas and young women with Altimas.

    • by timeOday ( 582209 ) on Tuesday May 14, 2024 @06:31PM (#64472475)
      I think Waymo has done a great job. 20 million miles driven [waymo.com]. As of late last year the stats [waymo.com] they reported were 85% reduction in injury-causing crash rates, and 57% reduction in police-reported crash rates.

      And if somebody doubts the statistics, a degree of that is healthy, but read my second link before taking uneducated guesses at what you assume they're doing wrong.

  • by ihavesaxwithcollies ( 10441708 ) on Tuesday May 14, 2024 @05:47PM (#64472371)
    We have contacted Boeing on how to deal with dissidents.
  • by Miles_O'Toole ( 5152533 ) on Tuesday May 14, 2024 @05:53PM (#64472391)

    There's billions upon billions of dollars to be made by kicking humans out of the driver's seat of commercial vehicles. As long as the body count can be kept to a manageable level, there won't be any consequences for driverless cars wrecking the property and the lives of people too poor to keep a long court battle going.

    • There's billions upon billions of dollars to be made by kicking humans out of the driver's seat of commercial vehicles. As long as the body count can be kept to a manageable level, there won't be any consequences for driverless cars wrecking the property and the lives of people too poor to keep a long court battle going.

      Commercial vehicles? The goal is to get EVERYONE out of the driver's seat. The continuous push, one I see repeated by a shocking number of people, is that human drivers = shit. No human behind the wheel is ever safe. Ever. ALL of us suck, are continuously distracted, and *WILL* cause accidents. Always.

      I honestly think by the rhetoric that the overall vision is to have no individual vehicles anymore. Just "as needed" rentals that you call when you need to go somewhere, of course for a constant monthly f

      • From what I have seen of traffic accident stats, humans are pretty good at driving if you put it into the perspective of how many miles are driven.
        • From what I have seen of traffic accident stats, humans are pretty good at driving if you put it into the perspective of how many miles are driven.

          Overall, this is probably true. The problem is that, since 9/11 especially, people have been running as fast as they can towards safety and security at all costs. Protect us from ourselves. I remember thinking early on it was a hype cycle that would die. But clearly, it's not. People will give up *EVERYTHING* in the pursuit of supposed safety. Even if it's all theater.

      • You're wrong. Fluffernutter is right. You can look up the numbers if you like.

  • Imagine the cacophony of opprobrium that would be taking place here if this story had Tesla/Musk FSD as its subject.

    Waymo, Tesla, Cruise, Zoox, or Pony -- with automation in control we are going to see a lot more bent metal over the coming years, and arguably we will never stop seeing it. Some products will be better than others, but that is the best you can hope for. The incidents in this little article will mean little over the long sweep of history.

    As for the ultimate winner? My money is on Tesla

    • As for the ultimate winner? My money is on Tesla and that is even after having my own intervention events on my free trial over the last month. Everyone else has gone down the route of "need more LIDARs!" and so they have gone to war with that bucket on their foot.

      What I'm wondering here is how you came to this completely backwards conclusion.

      A mix of sensors gives the vehicle the best chance of correctly identifying an object. There is really no debate about this.

      Teslas make embarrassing and sometimes fatal mistakes that nobody else makes specifically because they lack these other sensors. Cameras can be fooled in the same ways our eyes are. The difference is, we have a brain behind our eyes. Tesla doesn't seem to have any concept of [dis]continuity. If something se

      • A mix of sensors gives the vehicle the best chance of correctly identifying an object. There is really no debate about this.

        The work product of a sensor system in this application is an output stream of vectors to be processed. Sometimes called "a cloud of points." The data in the vectors could include anything that the sensor system and feedback loop is capable of producing. Position, delta-position, color, temperature, whatever.

        The consumer of that work product takes that cloud of points and constructs objects out of it. It may or may not be good at this. If you get failures you have to first determine where the CoP was
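        As a very rough, illustrative sketch of the idea described above (the function names and the naive distance-threshold clustering are invented for this example; real perception stacks are far more involved and this is nobody's actual pipeline), turning a cloud of points into candidate objects might look like:

```python
# Illustrative only: group a "cloud of points" into candidate objects by
# greedy distance-threshold clustering, then summarize each cluster with
# an axis-aligned bounding box. All names here are invented.
from math import dist

def cluster_points(points, threshold=1.0):
    """Greedy clustering: a point joins the first cluster containing
    any point within `threshold`, otherwise it starts a new cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(dist(p, q) <= threshold for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def bounding_box(cluster):
    """Axis-aligned box (min_x, min_y, max_x, max_y) around one cluster."""
    xs = [x for x, _ in cluster]
    ys = [y for _, y in cluster]
    return (min(xs), min(ys), max(xs), max(ys))

# Two well-separated clumps of points -> two candidate "objects".
cloud = [(0.0, 0.0), (0.5, 0.2), (5.0, 5.0), (5.3, 4.8)]
objects = [bounding_box(c) for c in cluster_points(cloud)]
```

        The toy version makes the commenter's point concrete: the quality of the "objects" the consumer sees depends entirely on how well the grouping step interprets the raw vectors it was handed.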

        • That argumentative response offers no benefit to the discourse.

          The Tesla driving through a tractor trailer's trailer is an example: since the camera was blinded by the sun, it thought the road was clear. The types of incidents, and the greater frequency of their system's incidents despite being a lower-level system than others, point to the same conclusion for anyone paying even passing attention to headlines in the space.

          Suggesting every input device provides a point cloud ignores that how they were sourced doe

        • There is really no question whether you are making things up while accusing me of same. If you had been paying attention to discussions here on this subject you might be up to speed, though you probably wouldn't. Read the sibling to this comment for concrete supporting examples you seem to be ignorant of.

  • Need to compare apples to apples...incidents per distance driven.

  • So, it turns out Cruise was a ruse, and Waymo is lame-o.
