Transportation

Human Drivers Keep Rear-Ending Waymos (arstechnica.com) 170

Waymo's driverless cars have a much lower crash rate than human drivers, with fewer than one injury-causing crash per million miles driven, compared to an estimated 64 crashes by human drivers over the same distance. As Ars Technica's Timothy B. Lee notes, a significant portion of Waymo's most severe crashes involved human drivers rear-ending the Waymo vehicles. From the report: Twenty injuries might sound like a lot, but Waymo's driverless cars have traveled more than 22 million miles. So driverless Waymo taxis have been involved in fewer than one injury-causing crash for every million miles of driving -- a much better rate than a typical human driver. Last week Waymo released a new website to help the public put statistics like this in perspective. Waymo estimates that typical drivers in San Francisco and Phoenix -- Waymo's two biggest markets -- would have caused 64 crashes over those 22 million miles. So Waymo vehicles get into injury-causing crashes less than one-third as often, per mile, as human-driven vehicles.

Waymo claims an even more dramatic improvement for crashes serious enough to trigger an airbag. Driverless Waymos have experienced just five crashes like that, and Waymo estimates that typical human drivers in Phoenix and San Francisco would have experienced 31 airbag crashes over 22 million miles. That implies driverless Waymos are one-sixth as likely as human drivers to experience this type of crash. The new data comes at a critical time for Waymo, which is rapidly scaling up its robotaxi service. A year ago, Waymo was providing 10,000 rides per week. Last month, Waymo announced it was providing 100,000 rides per week. We can expect more growth in the coming months.
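
To make those ratios concrete, here is a quick back-of-the-envelope check in Python. It uses only the figures quoted above (20 injury-causing crashes and 5 airbag crashes over 22 million miles, against Waymo's human-driver benchmarks of 64 and 31) and ignores whatever adjustments Waymo's own benchmark methodology makes.

```python
# Back-of-the-envelope check of the crash rates quoted above.
# Inputs are the figures from the summary, not Waymo's raw data.

MILES = 22_000_000  # driverless miles cited in the article

waymo_injury, human_injury = 20, 64   # injury-causing crashes: Waymo vs. human benchmark
waymo_airbag, human_airbag = 5, 31    # airbag-deployment crashes: Waymo vs. human benchmark

def per_million(crashes):
    return crashes / (MILES / 1_000_000)

print(f"Waymo injury crashes per million miles: {per_million(waymo_injury):.2f}")    # ~0.91
print(f"Human benchmark per million miles:      {per_million(human_injury):.2f}")    # ~2.91
print(f"Injury-crash ratio (Waymo/human):       {waymo_injury / human_injury:.2f}")  # ~0.31, under one-third
print(f"Airbag-crash ratio (Waymo/human):       {waymo_airbag / human_airbag:.2f}")  # ~0.16, about one-sixth
```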

So it really matters whether Waymo is making our roads safer or more dangerous. And all the evidence so far suggests that it's making them safer. It's not just the small number of crashes Waymo vehicles experience -- it's also the nature of those crashes. Out of the 23 most serious Waymo crashes, 16 involved a human driver rear-ending a Waymo. Three others involved a human-driven car running a red light before hitting a Waymo. There were no serious crashes where a Waymo ran a red light, rear-ended another car, or engaged in other clear-cut misbehavior.

Human Drivers Keep Rear-Ending Waymos

Comments Filter:
  • by whoever57 ( 658626 ) on Wednesday September 11, 2024 @05:22PM (#64781451) Journal

    "That car looks strange -- I can't see a driver -- I'd better get closer so I can see what is going on."

    • Waymo cars have learned how to stoop and squat.
      • Waymo cars have learned how to stoop and squat.

        You mean swoop and squat. And no, I don't think Waymo cars have learned that. How would that get into their neural nets as a reward?

        [I didn't see a <sarcasm> tag, so I'm reading your post prima facie.]

    • Re:Obvious cause (Score:5, Interesting)

      by Local ID10T ( 790134 ) <ID10T.L.USER@gmail.com> on Wednesday September 11, 2024 @05:37PM (#64781491) Homepage

      Waymo vehicles are coded to follow the law as written. Humans anticipate that the car will not follow the law (because they would not follow the law as written) and act accordingly - crashing into the Waymo vehicle.

      Tesla is paying people specifically to train its autopilot to ignore laws so that it drives more like a human would.

      In the short term, this will cause the Teslas to "fit it" better with human drivers, but will not provide the long-term safety improvements that the Waymo method will.

      As it stands, the Waymo vehicles are already safer than human drivers - but only in the limited areas they are trained for. They are far from a fully capable, go-anywhere-under-any-circumstances, self-driving vehicle, but they are expanding the regions they cover as they master the existing regions. It is a slow-but-safe method of progress.

      • ..."fit in" better with human drivers...

      • train it on the song I can't drive 55!

      • by geekmux ( 1040042 ) on Wednesday September 11, 2024 @05:57PM (#64781533)

        To bring up specifically one type of crash (rear ending), tends to imply a couple of possibilities for the cause. One of which is determining just how safely the driverless car is slowing and stopping. I’d like more detail on that aspect before we start believing every autonomous humblebrag about how much safer they are. When you have no meatsack detected in the car at all, does it still drive like it’s protecting a meatsack inside, or does it drive differently (“expeditiously picking up the next rider with maximum efficiency” in marketing-speak)?

        I would like to assume human error is the reason for the excessive rear-ending. Let’s see some proof of that.

        • by Kaenneth ( 82978 )

          Assuming blame in accidents has a 50/50 chance of being on either driver,

          If Waymo's vehicles are getting into 1/3rd as many crashes, that accounts for all the crashes their driver would have caused, PLUS avoiding 1/3 of crashes where the other driver is at fault; meaning the system is slowing/stopping even safer, and should not be accused of unsafe sudden stops.

          There are other factors, I think Waymo doesn't even try to drive in snow, etc. If Waymos only drive in the safest situations, their record can be better just from that.

          • Assuming blame in accidents has a 50/50 chance of being on either driver,

            If Waymo's vehicles are getting into 1/3rd as many crashes, that accounts for all the crashes their driver would have caused, PLUS avoiding 1/3 of crashes where the other driver is at fault; meaning the system is slowing/stopping even safer, and should not be accused of unsafe sudden stops.

            It assumes the entire system as a whole is safer, but does not grant you the ability to merely dismiss any single aspect of it. We may find a glitch where the driverless car blips the brakes before every right-turn signal is engaged, causing accidents that get smothered and covered with blanket statistics. Wrong way to go about this. You analyze every aspect of crashes. To avoid them altogether.

            There are other factors, I think Waymo doesn't even try to drive in snow, etc. If Waymos only drive in the safest situations, their record can be better just from that.

            That would be a biased limited record, not a better one. Apples to watermelons. I’ve lived in Alaska.
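
            A minimal sketch of the 50/50-fault argument a couple of comments up, with purely illustrative numbers; the even fault split is that commenter's assumption, not something the published data establishes.

            ```python
            # Sketch of the "50/50 fault" argument. Illustrative numbers only.
            benchmark = 64   # human-benchmark crashes over the same mileage (from the summary)
            waymo = 20       # crashes Waymo vehicles were actually involved in

            # Assumption: in the benchmark, fault splits evenly between the two cars.
            their_fault = benchmark / 2   # ~32 crashes would be the other driver's fault

            # A driver who causes nothing but avoids nothing would still be hit ~32 times.
            # Waymo's 20 is below that, so under this assumption it also avoids roughly
            # a third of the crashes that would have been the other party's fault.
            avoided = their_fault - waymo
            print(avoided, avoided / their_fault)   # 12.0 0.375
            ```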

        • by Pascoea ( 968200 )
          If I were a betting man, I'd say that it's these cars' propensity and ability to follow the letter of the law. The Waymo car detects a yellow light and thinks "Oh, yellow light. I can safely stop for that" where just about every human on the planet thinks "Oh, yellow light. I can make that before it turns red." Those two reactions are incompatible when the one in front is the computer driver.
          • by geekmux ( 1040042 ) on Wednesday September 11, 2024 @06:33PM (#64781661)

            If I were a betting man, I'd say that it's these cars' propensity and ability to follow the letter of the law. The Waymo car detects a yellow light and thinks "Oh, yellow light. I can safely stop for that" where just about every human on the planet thinks "Oh, yellow light. I can make that before it turns red." Those two reactions are incompatible when the one in front is the computer driver.

            This is especially deadly if the driverless car stopping at every yellow light, does not account for the object directly behind them. To clarify, I’m not talking about the arrogant human in a perpetual hurry that can stop at the yellow light and doesn’t for selfish reasons.

            I’m talking about real situations, with Waymo assuming that 18-wheeler directly behind their stop-at-all-yellow programming can actually avoid the accident incoming at a physics rate of speed, limited only by DOT load regulations and an honest loadmaster.

            My car is equipped with Brembo multi-piston brakes on 15” discs. Just because I can stop on a dime doesn’t mean the 30,000 pounds behind me easily can. Wonder how Waymo accounts for this and runs yellow lights when it should.

            • by Pascoea ( 968200 )
              I mean, sure, of course there are scenarios where the safest route is to blow the yellow light when you could and should safely stop. (I would argue that most human drivers don't have the reaction time to assess the situation to that level of detail in the required amount of time, but I digress.) I'm referring to the combination of a computer that knows that it can safely and comfortably stop for a light that turned yellow vs the perpetually in a hurry human driver behind them that knows they can make it i
              • Agreed. There are those scenarios. I’ve trained myself over many years to glance in the mirror approaching an intersection, just so I’m aware a bit more aware of why I might not stop at a yellow. Thankfully pretty rare, but would have been a bad day for me a couple of times with the wrong decision.

                It’ll be interesting seeing how this develops. No doubt the machine can react faster and generally be safer than humans driving. Reality never fails to be creative though. Murphys law and a

              • Generally the following automobile should also have plenty of time to stop, except that very often they aren't paying any attention at all. So I give them more leeway if they let me, but some insist on driving too close. I've even had cars that were half a block behind me fail to slow down until the last minute, like they were watching their phone or something.

                Also making it into the intersection when it is red is an infraction in many places, yellow is not a sign to hurry up, it's a sign that yo

            • by SvnLyrBrto ( 62138 ) on Wednesday September 11, 2024 @06:55PM (#64781739)

              Not that it helps the car that gets rear-ended in these scenarios; but it bears reminding that if you *can't* stop safely before the intersection and before a red when the light turns yellow; then you are driving too fast for safety in the first place and any resulting accident is entirely your own fault, whether you rear-end someone who did stop, blow into the intersection at a red and T-bone or get T-boned, or if you luck out and nothing happens. Licensed CDL drivers know this. It's part of the training.

              Besides, Waymos drive very conservatively (slowly) in the first place. I would expect that if you can't safely brake to a stop behind one in whatever circumstance; in addition to driving unsafely, you've not kept your own vehicle properly maintained either. And that, as before, is entirely on you; not the Waymo.

            • I’m talking about real situations, with Waymo assuming that 18-wheeler directly behind their stop-at-all-yellow programming can actually avoid the accident incoming at a physics rate of speed, limited only by DOT load regulations and an honest loadmaster.

              Does that not imply that the 18-wheeler is travelling at an unsafe speed, or an unsafe following distance?

              When you're operating a vehicle that may well kill the driver in front of you if you rear-end them, you especially need to be able to avoid a collision even if they slam on their brakes. We're hypothesizing about it being caused by an autonomous vehicle stopping in a scenario where you wouldn't predict them to do so, but it could just as easily happen to a human driver if a pedestrian steps into traffic, a tree branch falls into the road, etc.

              • I’m talking about real situations, with Waymo assuming that 18-wheeler directly behind their stop-at-all-yellow programming can actually avoid the accident incoming at a physics rate of speed, limited only by DOT load regulations and an honest loadmaster.

                Does that not imply that the 18-wheeler is travelling at an unsafe speed, or an unsafe following distance?

                When you're operating a vehicle that may well kill the driver in front of you if you rear-end them, you especially need to be able to avoid a collision even if they slam on their brakes. We're hypothesizing about it being caused by an autonomous vehicle stopping in a scenario where you wouldn't predict them to do so, but it could just as easily happen to a human driver if a pedestrian steps into traffic, a tree branch falls into the road, etc.

                Quite frankly with the legal load limits on large haulers, none would exceed 20MPH ever if they had to ensure stopping within the distance allotted. The lines painted at intersections and general driving rules basically allow drivers to stop on a dime if they choose. It ain’t smart, but it’s legal.

                50MPH to zero within a short distance, becomes a matter of physics above a certain weight regardless of your stopping tech. This is why some truck drivers absolutely refuse to carry certain loads (like lar

            • I've had cars honk loudly at me for just slowing and having the brake lights on because of a very light tap, and they weren't even tailgating. Some people just seem to think that stopping is optional. Around here, if the light turns red before you leave the intersection, it is an infraction, even if it was green when you entered, but most humans still seem to think that as long as it's not red when you enter that it's ok.

              It also depends where you drive. I do NOT like to drive in San Francisco, there are

            • My car is equipped with Brembo multi-piston brakes on 15” discs. Just because I can stop on a dime doesn’t mean the 30,000 pounds behind me easily can.

              One problem with your silly analogy is that the 30,000-pound vehicles are typically not driven by tailgating morons. Just because they have a longer stopping distance doesn't mean they follow as closely as you do with your amazing brakes.

              As for your yellow light situation, the rear view mirror is for changing lanes and reversing. What is going on behind you is *never* part of the decision tree of whether to stop at a yellow light or not.

          • The rear-enders were looking at their phone. It's that simple. AI has nothing to do with it.

            "Distracted driving is the main contributor for rear-end accidents. The NHTSA stated that driving while distracted contributed to 60% of rear-end collisions. The reason why this is so common is because distracted drivers often fail to notice stopped or slowing vehicles in front of them, causing a rear-end collision."
            ( https://www.mccoyandsparks.com... [mccoyandsparks.com] )

          • by RobinH ( 124750 )
            More importantly the time between detecting the yellow light and applying the brakes is going to be measured in the tens of milliseconds for the waymo, and in the hundreds of milliseconds for the human to register the light, and then another fraction of a second to get their foot off the accelerator onto the brake. Your car can move a long distance in that period.
            • by ObliviousGnat ( 6346278 ) on Wednesday September 11, 2024 @08:02PM (#64781921)

              Yellow lights are timed such that if you are going the speed limit, you will have at least 1 second to decide whether to brake or proceed through the intersection.

              But as you increase your speed, you have less and less time to decide and react. If you are going fast enough and in the wrong place (called the "dilemma zone" [ssti.us]) when the light turns yellow, you will have negative seconds to decide, in other words you will run the red light unless you accelerate.

              So if you are speeding, you should slow down at signaled intersections in order to avoid finding yourself in the dilemma zone.
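
              A rough sketch of the dilemma-zone idea, using textbook stop-or-clear kinematics; the yellow duration, reaction time, deceleration, and intersection width below are illustrative assumptions, not values from any signal-timing standard.

              ```python
              # Stop-or-clear check when the light turns yellow. Illustrative parameters.
              V = 20.0       # speed, m/s (~45 mph)
              YELLOW = 4.0   # yellow duration, s
              REACT = 1.0    # perception-reaction time, s
              DECEL = 3.4    # comfortable braking deceleration, m/s^2
              WIDTH = 20.0   # intersection width to clear, m

              stop_dist = V * REACT + V**2 / (2 * DECEL)   # distance needed to stop (~79 m)
              clear_dist = V * YELLOW                      # distance coverable during yellow (80 m)

              for d in (40, 60, 70, 80, 100):              # distance to the stop line, m
                  can_stop = d >= stop_dist
                  can_clear = d + WIDTH <= clear_dist
                  tag = "" if (can_stop or can_clear) else "  <- dilemma zone"
                  print(f"{d:3d} m out: stop={can_stop} clear={can_clear}{tag}")
              ```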

              • Re: (Score:3, Insightful)

                by RobinH ( 124750 )
                That's not really the point. While you're taking your 1 second to decide, the waymo in front of you just hit the brakes in 25 milliseconds, calculating that it could stop with super-human reflexes. And you then have to stop, and react very fast. So the problem isn't speeding, necessarily, it's following too closely. You can get away with following a human driver closer because you both have similar reaction times, but you need to give waymo cars more distance.
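
                Putting rough numbers on that: the extra gap you need behind a car that reacts faster than you is roughly your speed times the difference in reaction times. The reaction times below are illustrative assumptions.

                ```python
                # Extra following gap needed because the lead car starts braking sooner.
                # Braking distances roughly cancel if both cars decelerate similarly.
                def extra_gap_m(speed_mps, my_react_s=1.0, lead_react_s=0.025):
                    return speed_mps * (my_react_s - lead_react_s)

                for mph in (25, 35, 45):
                    mps = mph * 0.44704
                    print(f"{mph} mph -> ~{extra_gap_m(mps):.0f} m of extra gap")
                ```
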
        • Rear-ending is almost always the fault of the driver in the rear. Yeah, I realize sometimes human drivers will cut someone off and then immediately slam on their brakes, but I doubt the robotaxis are programmed to merge in such an unsafe manner.

        • tends to imply a couple of possibilities for the cause. One of which is determining just how safely the driverless car is slowing and stopping.

          Actually there is only one single cause for rear-endings: the car behind following too closely to safely stop if something unexpected occurs.

          This is legally defined by the way. It doesn't matter if you just for shits and giggles slam on the brakes for no reason, if the guy behind you hits you, he is at fault in the eyes of the law and insurance for driving "dangerously" close.

          Now sure everyone actually does it. Few people leave sufficient stopping distance, and let's face it everyone with adaptive cruise co

        • I once went through a red traffic light. It turned yellow, I had plenty of time to stop so I slowed down, then I realised the car behind me was _not_ going to stop. I decided that going past a light that had just changed was still safer than someone ramming my car from behind.
      • It's like when I drive the speed limit on the interstate and some moron gets behind me and starts having a shit fit, flashing his lights, then swings around and in front of me, passing about six inches from my bumper, causing me to brake, thus causing the next idiot behind me to do the same thing, and on and on. This causes a caterpillar effect, slowing all the traffic down. It's one of the reasons you get slowdown on the highway and then suddenly you get to where the perceived slowdown is and there's nothing there.

        • It's like when I drive the speed limit on the interstate

          You didn't state if you were in the left lane. It's pivotal to assessing the response.

      • by AmiMoJo ( 196126 )

        Tesla's method is not viable for full self driving, because Tesla would become liable for all the traffic violations that the car makes.

      • Waymo vehicles are coded to follow the law as written. Humans anticipate that the car will not follow the law (because they would not follow the law as written) and act accordingly - crashing into the Waymo vehicle.

        Tesla is paying people specifically to train its autopilot to ignore laws so that it drives more like a human would.

        In the short term, this will cause the Teslas to "fit it" better with human drivers, but will not provide the long-term safety improvements that the Waymo method will.

        As it stands, the Waymo vehicles are already safer than human drivers - but only in the limited areas they are trained for. They are far from a fully capable, go-anywhere-under-any-circumstances, self-driving vehicle, but they are expanding the regions they cover as they master the existing regions. It is a slow-but-safe method of progress.

        So it would seem like the obvious plan should be to use the Tesla method in the short term, and only once you have some awesome "like a really good human" driving and mass adoption, pivot to "even safer, but only when not mixed with lots of human drivers" behavior.

    • Re: (Score:3, Insightful)

      by guruevi ( 827432 )

      More like, this is a city where everyone runs 1/2 a car length away at 55mph and the Waymo simply slams its brakes whenever it detects an error.

      Brake checking a car is illegal in most places.

      • Do you even know what brake checking means?

      • Re:Obvious cause (Score:4, Insightful)

        by ArchieBunker ( 132337 ) on Wednesday September 11, 2024 @06:07PM (#64781571)

        Brake checking a car is illegal in most places.

        So is not leaving enough distance to safely stop your vehicle.

      • You seem to be assuming the Waymo is more likely to be rear-ended than a human driver, which the story does not say.
      • Brake checking a car is illegal in most places.

        If you're riding so close to their bumper that a collision was absolutely unavoidable, you're going to get a ticket too. Two wrongs don't make a right.

        On the highway, I've once been behind a semi-truck that had its engine seize, and I was able to safely stop without a collision. The vehicle behind me, however, had to ditch into the breakdown lane. Guess who was leaving a safe following distance, and who wasn't?

      • Minimum assured clear distance is the law in my area. It basically means that you as the following driver are responsible for being able to safely stop if the car in front of you stops. Even if the clown in front of you pulls the 'net meme "brake check" move.

        The brake-check chump can still be cited under the broad "unsafe operation of a motor vehicle", but for all these Waymo rear-ends it's the human at fault.

    • Re:Obvious cause (Score:4, Interesting)

      by ArchieBunker ( 132337 ) on Wednesday September 11, 2024 @06:06PM (#64781565)

      I did contract work for a self driving car company. They had the same problem with getting rear ended because the cars drive conservatively and are more likely to stop out of caution. It got so bad that during test runs a chase car would follow behind.

      It all comes down to people tailgating or looking at their phones instead of the road.

      • Did you also have a chase-chase car to protect your chase car? Or was the chase car rear-ended less for mysterious reasons?

    • Is tailgating. (Score:4, Insightful)

      by Fly Swatter ( 30498 ) on Wednesday September 11, 2024 @06:06PM (#64781567) Homepage
      Most drivers that are experiencing a tailgate situation start slowing earlier than usual so that the dumb-ass behind them has plenty of warning to slow down. Waymo would be wise to do the same with their programming.

      Most drivers follow too closely. Of course when you follow a nice safe distance that safety zone will be filled by another car if it's a double lane road.
      • If someone is tailgating me I slow down until the distance they're following me at is a safe distance for that speed. It's very effective.

    • There is actually nothing in the summary to suggest that waymo cars are more likely to be rear-ended than human driven cars.
      • There is. Mathematically, for every rear-ended car there is exactly one rear-ending car. So for equal rear-endings they would have had to drive into 16 other cars.
    • by dbialac ( 320955 )

      "That car looks strange -- I can't see a driver -- I'd better get closer so I can see what is going on."

      Seems like they're set up to drive like a CRV or RAV4 driver. Simple driver classification:

      • People who don't know how to drive (kids, people who have never pursued a license, etc.)
      • Student drivers
      • People who know how to operate a motor vehicle, but implement this knowledge poorly. Drivers of CRV, RAV4, more and more Outback, and drivers of gold or beige cars. All are likely to cause an accident but get away with it even though they did something stupid. Always give them a wide berth.
      • Asshole drivers. Peop
  • This doesn't surprise me in the least. First, human drivers are terrible, so the robots don't even have to be *that* good to do better than us ;)

    But second, they've been testing these things for many years now (I can confirm that I've seen them driving around Mountain View for years before they went to SF). Furthermore, the government of San Francisco is not the kind that would let a corporation safety test on their citizens. I strongly suspect their internal data showed a much better-than-human accident

  • by smokinpork ( 658882 ) on Wednesday September 11, 2024 @05:40PM (#64781501)
    I can't wait till I no longer need to park in the city.
  • by aitikin ( 909209 ) on Wednesday September 11, 2024 @05:41PM (#64781507)

    The general driving public is not a fair sample set to compare to. Give me the comparison of them to JUST cabbies (the people they're supposed to replace) and we'll see what happens.

  • by penguinoid ( 724646 ) on Wednesday September 11, 2024 @05:42PM (#64781509) Homepage Journal

    You need to check your brakes waymo often.

    • by m00sh ( 2538182 )

      You need to check your brakes waymo often.

      I would brake check tail-gaters if I didn't care about my car, or the headache that comes after the crash. I friggin hate those idiots.

      I guess Waymo software can brake check and let Waymo staff take care of the BS while it takes a break.

      • by cusco ( 717999 )

        If you ever actually brake check a tailgater and have an accident absolutely do **NOT** say the words "brake check" to either the cops or the insurance company. You won't like what happens afterwards if you do.

  • by PPH ( 736903 ) on Wednesday September 11, 2024 @06:02PM (#64781545)

    They will die of old age getting stuck behind one.

  • by John.Banister ( 1291556 ) * on Wednesday September 11, 2024 @06:05PM (#64781559) Homepage
    I'm wondering if Waymos could have a display on back for a QR code, so that if I have left home with my vehicle, need to get it back home, but don't consider myself in good condition to drive, I could set my car to "follow the Waymo" mode where I hire a Waymo to drive from my location to my home and it displays a QR code on the back which I set my car to follow. I'm sure the Waymo people wouldn't want the liability of assuming control of my car directly, but if they provide a "follow me" service, their vehicle can still use its own hardware to inform its decisions, and my vehicle could be certified to perform the significantly easier task of following it. The Waymo vehicle could even tell my vehicle in advance when it stops or turns, and find my vehicle a close to home parking spot where the autopark feature could take over. If the service became popular, Waymo could even have vehicles with no room for passengers just for leading cars that have automation with a less certified level of independence.
  • Not 22 million miles (Score:5, Interesting)

    by BetterSense ( 1398915 ) on Wednesday September 11, 2024 @06:11PM (#64781585)
    They say 22 million miles as if it's 22 million random miles, and as if it's legitimate to use that metric to compare Waymo accident rates with auto accident rates.

    What they really do is they have their cars drive around the same loop of blocks in Phoenix 1 million times, and say it's "22 million miles" of driving. It's not wrong, but you can only compare accident statistics against human drivers driving the same route at the same times in the same weather.

    Furthermore, the only thing that matters is how reliable the latest version of software and hardware is. Has it been 22 million miles with NO changes to the software or hardware? More likely, it's 21.99 million miles with previous versions, and 0.01 million miles with the latest patch that's going to bug out and kill me or somebody.
    • Re: (Score:2, Informative)

      by timeOday ( 582209 )
      As a matter of fact they are comparing themselves against human stats on the same types of roads in the same areas:

      This table shows how many fewer crashes Waymo had (regardless of who was at fault) compared to human drivers with the benchmark crash rate if they were to drive the same distance in the areas we operate.

      The reductions are shown combined and separately for Phoenix and San Francisco. Results have been rounded to the nearest whole number.

      The comparisons in Los Angeles and Austin are not shown

      • So are they comparing "Waymo on the same loop of blocks in Phoenix 1 million times" vs humans throughout Phoenix? I don't know much about this, but your and OPs claims are compatible depending on how the comparison is done.

        • Waymo doesn't even have fixed route loops like a bus. It has coverage areas. Like a taxi. For example it will re-route around traffic just like google maps would direct a human driver to do.

          It also doesn't cover just some limited number of blocks, it covers 315 square miles of the Phoenix metro:

          https://www.forbes.com/sites/b... [forbes.com]

    • by flink ( 18449 )

      What they really do is they have their cars drive around the same loop of blocks in Phoenix 1 million times, and say it's "22 million miles" of driving.

      Yeah, get a few million miles driving around a city in the northeast during the winter and I'll be impressed.

  • And all the evidence so far suggests that it's making them safer. It's not just the small number of crashes Waymo vehicles experience -- it's also the nature of those crashes. Out of the 23 most serious Waymo crashes, 16 involved a human driver rear-ending a Waymo

    Years ago I was at a red light. The light turned green, I started driving in my manual car, took my foot off the clutch a bit too quickly, stalled, and got rear-ended.

    Legally, the accident was 100% the other driver's fault (they weren't paying close attention).

    • I've seen bumper stickers with a tongue-in-cheek warning that the car has a manual transmission and might roll backwards or stall.

      Hell, one of my friends has been driving stick for years and last time we went to go grab some car parts he stalled out and we had a good laugh about it. It happens.

    • Aside from the occasional weird bugs that lead to things like them mobbing that one street next to the Presidio a while back; they're actually very predictable. Just assume that they will follow the driver's handbook and street & traffic signs & signals to the letter.

      • Aside from the occasional weird bugs that lead to things like them mobbing that one street next to the Presidio a while back; they're actually very predictable. Just assume that they will follow the driver's handbook and street & traffic signs & signals to the letter.

        I admittedly don't have a ton of experience with them, but the one I saw in SF a few months ago was jerkily creeping through a left hand turn, most definitely not typical human driver behaviour.

        But I think the real issue is that they're necessarily more careful than humans and their default safe behaviour is to stop. Now, in most situations that is the safest behaviour, but it does mean they'll end up unexpectedly slowing or stopping a lot more than humans, and get rear ended as a result.

      • by Lehk228 ( 705449 )
        The problem is a large number of drivers are unaware of what the driver's manual says
    • one of the more important rules of driving is to be predictable

      I've noticed this is also a problem with drivers from other areas of the world, or myself driving there. The unspoken rules are just a bit different.

  • He's been on ars for years and clearly has an agenda and some hate on this topic.

    While the things he says may not be outright lies, often he picks and chooses what he shares and what he hides to give clear preference to his favorites.

  • Sure sounds like it to me.
    IDGAF. Still want nothing whatsoever to do with self-driving cars, and would just as soon they were taken off the road permanently. I sincerely believe they cover up problems with them so they can keep selling the idea.
    • by Entrope ( 68843 )

      And is Slashdot getting paid to lie about articles? From TFS:

      [Waymo cars cause] fewer than one injury-causing crash per million miles driven, compared to an estimated 64 crashes by human drivers over the same distance.

      From the quoted part of TFS:

      [Human drivers] would have caused 64 crashes over those 22 million miles. So Waymo vehicles get into injury-causing crashes less than one-third as often, per mile, as human-driven vehicles.

      There's kind of a Big Fucking Difference between 64 injuries per million miles and 64 injuries per 22 million miles, but apparently that is too subtle a distinction for BeauHD.

      • Can only speak for myself. I see lots of dangerous holes in the technology that will sooner or later lead to lots of serious problems, yet there are groups out there that want to force human driver out of their cars and into SDCs, whether they're safe or not.
    • by XaXXon ( 202882 )

      Tim lee has been shilling for waymo for at least 5 years. I was so happy when ars dropped him but I guess they still bring him back for stuff.

      All their car coverage is just shilling.

      They launder gifts from the auto industry into cash and then claim they aren't paid for. You have to be really careful what you read on ars these days. If it's not Beth it's probably not good.

  • The obvious question is why humans keep rear-ending Waymo cars. Just knowing that such collisions occur doesn't necessarily impart blame to either humans or Waymo. However, from my observations, Waymo cars tend to drive conservatively. The early Google cars drove conservatively to an extreme, e.g., five mph under the speed limit or ten mph slower than everyone else, or waiting until a really long gap when making a left turn. These are all legal maneuvers but ones that can increase the probability of collisions.

  • by Togden ( 4914473 ) on Wednesday September 11, 2024 @07:20PM (#64781811)
    I generally drive exactly by the rules and when I was still driving to work daily I would get rear ended every now and then. Of the 3 times I can recall this happening, only one time did I find any sympathy for the other driver. First time the guy was on the phone and the second time the other driver actually tried to blame me for not pulling away fast enough in traffic, which was unbelievable. The third time I'd had to brake harshly and unexpectedly because another car had made a very unpredictable and illegal manoeuvre and the lady who hit me was just slower to react. She did also apologise.
    • by cusco ( 717999 )

      The last time I got rear-ended the other driver got out of her car and proceeded to berate me for not running the stop sign like she expected. Still shake my head over that one.

  • Oh what a surprise. You build the robots so they're timid and never cause accidents, then you run them for millions of miles and find people run into the back of them.

    Why? Just because it's normal for that many miles travelled? because they surprise more assertive drivers and stop when they thought they would go? or because people are too distracted by their presence they forget they're driving and drive into them. .... All these questions and more might very well be addressed in the article, but I'll never

    • Why do you hate paragraphs so much slashdot?

      Sentences are much easier to follow when you break them up into different lines.

      Why, Slashdot, Why, can't I just have single line breaks in my comments?

  • Since the self-driving cars are new vehicles, safety comparisons should only be done against new cars of similar cost / specifications. (which would likely include automatic emergency braking, blind-spot monitoring, etc...)

    Comparing against the existing fleet is only fair if the self-driving systems are meant to be retrofit onto the existing fleet.

  • This article is misleading. Here's why:

    Waymo cars are essentially taxis. Comparing taxis to all other drivers on the road, e.g. high-school kids, frat boys, alcoholics, retirees who are semi-conscious because of the medications they're on, etc., etc., will give a false impression of road safety.

    In the same way that university research hospitals, which have higher death rates, are not giving poorer care, they're just dealing with the most difficult & serious cases to treat. Put those patients in a
  • How can the 64 crashes per million miles make sense? A driver who does 10,000 miles a year would do approximately half a million miles over their driving career. So that means they would average 32 crashes over their driving career?? That can't make sense. What am I missing?

    • by hipp5 ( 1635263 )

      How can the 64 crashes per million miles make sense? A driver who does 10,000 miles a year would do approximately half a million miles over their driving career. So that means they would average 32 crashes over their driving career?? That can't make sense. What am I missing?

      Answering my own question, the first part of the Slashdot summary is misleading: "with fewer than one injury-causing crash per million miles driven, compared to an estimated 64 crashes by human drivers over the same distance", when really it is "Waymo estimates that typical drivers in San Francisco and Phoenix would have caused 64 crashes over those 22 million miles."

      However, that's still 3 per million miles, or 1.5 over my estimated driver's career of 500,000 miles. That still seems really high.

      • I'm sure they used average human driver rather than median human driver. I often wonder, but never try to find data, whether human drivers with accidents follow the 20% principle
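
      For what it's worth, a quick check of the arithmetic in this subthread, taking the 64-crash benchmark and the rough 500,000-mile driving career at face value (both are the commenters' figures, not official statistics):

      ```python
      # Quick check of the subthread's arithmetic. Inputs are the commenters' rough figures.
      benchmark_crashes = 64
      benchmark_miles = 22_000_000
      career_miles = 500_000   # ~10,000 miles/year over a long driving career

      rate_per_million = benchmark_crashes / (benchmark_miles / 1_000_000)
      career_crashes = rate_per_million * (career_miles / 1_000_000)

      print(f"{rate_per_million:.1f} injury crashes per million miles")          # ~2.9
      print(f"~{career_crashes:.1f} injury crashes over {career_miles:,} miles")  # ~1.5
      ```
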
  • by e3m4n ( 947977 ) on Thursday September 12, 2024 @08:59AM (#64782615)

    “Waymo Takes It In The Rear!”
