Transportation

Waymo Says Its Driverless Cars Are Better Than Humans At Avoiding Crashes (teslarati.com) 104

An anonymous reader quotes a report from Teslarati: Waymo Driver is already reducing severe crashes and enhancing the safety of vulnerable road users. According to a new research paper set for publication in the journal Traffic Injury Prevention, Waymo Driver outperformed human drivers on safety, particularly for vulnerable road users (VRUs). Over 56.7 million miles, Waymo Driver achieved a 92% reduction in pedestrian injury crashes compared to human drivers. It also saw 82% fewer injury crashes with cyclists and 82% fewer injury crashes with motorcyclists. Waymo Driver also slashed injury-involving intersection crashes, a leading cause of severe road harm among human drivers, by 96%, and had 85% fewer crashes with suspected serious or worse injuries. "It's encouraging to see real-world data showing Waymo outperforming human drivers when it comes to safety. Fewer crashes and fewer injuries -- especially for people walking and biking -- is exactly the kind of progress we want to see from autonomous vehicles," said Jonathan Adkins, Chief Executive Officer at Governors Highway Safety Association.
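
For readers who want to sanity-check how a figure like the 92% reduction is derived, here is a minimal sketch of the underlying arithmetic: crashes are normalized to a rate per million miles and compared against a human-driver benchmark. Only the 56.7 million mile figure comes from the summary above; the crash count and benchmark rate below are hypothetical values chosen purely to illustrate the calculation, not numbers from the Waymo paper.

    # Minimal sketch: percentage reduction in crash rate vs. a human benchmark.
    # Only the 56.7 million ADS miles comes from the summary above; the crash
    # count and the benchmark rate are hypothetical, for illustration only.

    def crashes_per_million_miles(crashes: int, miles: float) -> float:
        """Incidents per million vehicle miles traveled."""
        return crashes / (miles / 1_000_000)

    ads_miles = 56_700_000            # Waymo Driver mileage cited above
    ads_pedestrian_crashes = 2        # hypothetical count
    human_benchmark_rate = 0.44       # hypothetical, per million miles

    ads_rate = crashes_per_million_miles(ads_pedestrian_crashes, ads_miles)
    reduction = 1 - ads_rate / human_benchmark_rate
    print(f"ADS rate: {ads_rate:.3f} crashes per million miles")
    print(f"Reduction vs. benchmark: {reduction:.0%}")  # ~92% with these inputs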


Comments Filter:
  • "driverless" (Score:3, Insightful)

    by Pinky's Brain ( 1158667 ) on Friday May 02, 2025 @09:10AM (#65346879)

    Though not really relevant for operation, the army of remote controllers pushing them through their fail safe behaviour (stopping) is essential. They are less driven, not driverless. Not autonomous enough to be left alone for any significant amount of time, which is a problem in a cellphone outage.

    • Not relevant for safety I meant.

      • by bagofbeans ( 567926 ) on Friday May 02, 2025 @10:35AM (#65347113)

        If I run you over and you die, I get prosecuted for something or other, probably custodial.

        If a driverless car kills you, who risks incarceration?

        • If I run you over and you die, I get prosecuted for something or other, probably custodial.

          Quite possibly, but it depends. If you were drunk/distracted/careless then the case against you is strong. On the other hand, if your brakes suddenly failed, then the question pivots to who is responsible for the brake failure. You for not servicing your car? Or the car or brake manufacturer for some kind of defect? And if I was the one who was being reckless, and gave you no chance to stop, then you might not get prosecuted at all.

          If a driverless car kills you, who risks incarceration?

          Just as above, it depends. If a defect in the car or its software can be dem

          • If a driverless car kills someone, this is by definition a defect unless it is deemed legally acceptable for driverless cars to kill people in certain circumstances.

            No-one is in jail as a result of dieselgate.

            • Uh, same circumstances as humans, mostly. It's not a defect that autonomous cars are still beholden to the laws of physics... If someone jumps out or swerves unexpectedly, the car can react faster but there's still a nonzero minimum stopping distance and constraints on how quickly and to where it can evade.
          • People at a company are very rarely held responsible; so rarely, in fact, that it's called the corporate veil for a reason.

            • by PCM2 ( 4486 )

              But companies can be held liable for defective products. This type of case would proceed as a wrongful death suit in civil court, for monetary damages.

              • by jvkjvk ( 102057 )

                The Op was about someone being held responsible.

                Having the company pay monetary damages doesn't qualify as "someone" unless you are a politician.

        • > If I run you over and you die, I get prosecuted for something or other, probably custodial.

          Not unless it's deliberate, no. If it's because you did something especially dumb then it might, emphasis on might, end up in court, but you're very unlikely to end up in prison.

          Here's an article on the appallingly low prosecution rate of people who forget that their vehicle is a weapon, or worse, use that fact intentionally: https://www.startribune.com/in... [startribune.com]

        • by dvice ( 6309704 )

          If a washing machine kills you, who do you blame?

        • A side of the trolley problem that hadn't yet been considered.

        • by Jeremi ( 14640 )

          If a driverless car kills you, who risks incarceration?

          I'm not sure. What typically happens after someone gets crushed by a malfunctioning elevator?

    • Though not really relevant for operation, the army of remote controllers pushing them through their fail safe behaviour (stopping) is essential. They are less driven, not driverless. Not autonomous enough to be left alone for any significant amount of time, which is a problem in a cellphone outage.

      Is this true? I would think that a guy with a joystick and a button in a room somewhere steering a car by looking at a monitor would be a disaster waiting to happen. Even having that guy staring at the screen for hours on end for that one incident that happens in a split second seems like a bad thing. Also, if this were slightly practical, the dollar cost of that guy would be horrible. Why save on the cost of the driver just to pay for a guy in a room?

      • by ceoyoyo ( 59147 )

        the army of remote controllers pushing them through their fail safe behaviour (stopping)

        The guys in the room aren't intervening in split seconds during crashes. They're intervening when the car goes "shit, I don't like this, I'm going to stop."

        Which reminds me of the first thing my father told me when he was teaching me to drive: "if you don't feel comfortable, you can always pull over and stop." If more human drivers had that fail safe behaviour maybe they'd be closer to Waymo's safety record.

    • by PCM2 ( 4486 )

      Though not really relevant for operation, the army of remote controllers pushing them through their fail safe behaviour (stopping) is essential.

      What army is that? There is a team that can "provide information" to the car, but there is no remote control. The actual driving is totally autonomous. In a worst-case scenario, an operator can instruct a car to pull over and wait for assistance. More info here. [reddit.com]

      • They're called "fleet response agents" now. The minutia of driving is still automated, but the remote controllers can set a path and push it through a subset of restrictions. If it needs help, it's no longer autonomous.

        • by PCM2 ( 4486 )

          They're called "fleet response agents" now. The minutia of driving is still automated, but the remote controllers can set a path and push it through a subset of restrictions. If it needs help, it's no longer autonomous.

          A lot of subway trains are essentially autonomous, yet they still have operators in the event of some kind of failure state.

          I buy Waymo's assertion that the actual driving is fully autonomous. Remember, you will never go to a dealership and buy your own Waymo. The whole project is designed as a service, not the future of private car ownership.

  • by Anonymous Coward

    It's a low bar.

  • forgot to mention facial recognition and behavioural analysis because it's part of the TOS if you enter our vehicle?
    <see smashdot article that I'm too lazy to lookup>
    Think of the children !!
  • by Revek ( 133289 ) on Friday May 02, 2025 @09:18AM (#65346909)
    Lately, I've been missing George Carlin daily. People. Just say people or pedestrians. You know, all those other road users who don't drive cars. I just want to slap the soulless people who come up with euphemisms like that to avoid acknowledging other people's humanity, so it doesn't look so bad to them when they kill someone.
  • by nickovs ( 115935 ) on Friday May 02, 2025 @09:18AM (#65346913)
    While I don't dispute the numbers, it would be interesting to see the comparison not to all drivers but to a cohort of humans who drive the same sort of routes with the same sort of regularity, which would be drivers for Uber, Lyft and other ride-share services. I don't know if those drivers have higher rates of pedestrian and cyclist collisions (because they spend their lives driving urban streets) or lower rates (because they know the streets and intersections better). Either way, a more focused comparison would be informative.
    • by SandorZoo ( 2318398 ) on Friday May 02, 2025 @10:06AM (#65347033)
      The paper did discuss this in the intro, and said the rates were "similar in magnitude", but Uber/Lyft may be under-reported:

      [T]he Waymo crash rate (reported as part of the NHTSA SGO [National Highway Traffic Safety Administration - Standing General Order]) was found to be similar in magnitude to self-reported human transportation network company (TNC) crashes. It’s unclear what definition of a crash is used for the self-reported TNC crash data, and whether that TNC crash definition is well matched to the ADS [Automated Driving Systems] crashes reported as part of the NHTSA SGO. That is, there is an unknown amount of underreporting in the TNC crash data, while the ADS data from the SGO includes any amount of property damage with little to no underreporting. TNC drivers may have incentives to not report low severity collisions, as reported collisions may lead to deactivation from the platform.

      • The surge pricing model also encourages more rideshare use in bad weather that Waymo doesn't operate in as much, and they're still a lot more environment-limited than rideshares in terms of road conditions. So I wouldn't put much stock in the "may be underreported" excuse.
  • Need I say more (Score:3, Insightful)

    by fluffernutter ( 1411889 ) on Friday May 02, 2025 @09:23AM (#65346925)
    At avoiding crashes .... where and when they OPERATE. Not in the middle of blowing snow. Not on rutted, icy roads. Not in rain. Not in fog. Not if there is a drop on the camera. Not if the area isn't fully scanned. Not if the area has changed since being fully scanned.
    • by N1AK ( 864906 )
      I'm not sure of the relevance of your post, given that no one is claiming they are better at avoiding crashes in circumstances other than those they have been operating in. It just seems like you are pointing out the obvious and redundant.

      Even if Waymo can only operate reliably in a very limited area the scope to reduce injuries is still very considerable. Picking a big US city at random I think Houston has something like 1,100 pedestrians injured and 100 killed per annum. Reducing accidents to Waymo's lev
      • Because over 2/3 of accidents happen in the conditions I mentioned. That's where we really need them.
        • by kqs ( 1038910 )

          "It's not 100% perfect in all conditions, so it's useless"? Some people just want to whine. These numbers are very good. Snow and ice will always be sucky, but most other issues can be worked on. I mean, do all of the Waymo cars disable themselves when there is rain in California? I doubt it.

          Besides, these cars don't need to beat ice and snow. They just need to beat human drivers in ice and snow. And that's a far, far lower bar.

          • Until one gets into an accident that seems ridiculous for a human to have caused. People will tend to prefer that they at least die by a mistake that they made rather than a mistake that a machine made because some developer failed to consider that situation.
        • Re: Need I say more (Score:4, Informative)

          by Sique ( 173459 ) on Friday May 02, 2025 @11:25AM (#65347229) Homepage
          They don't. Houston has barely any snow storms or icy roads, and not much rain. And still, it has 1,100 pedestrian injuries and 100 pedestrian deaths in traffic accidents each year.
        • Because over 2/3 of accidents happen in the conditions I mentioned. That's where we really need them.

          Yup, "Our cars are much safer, except under conditions where they aren't."

          I still wonder about liability.

        • I forget the exact distance, but most accidents happen near your home as well. The best way to reduce your risk is to buy a home in a different state, never visit that state, and rent the place you actually live at. That's the optimal method to keep yourself safe, at least according to all the stats.

          • That just happens because people are more comfortable near their homes and let down their guard. Move to a different state and that just becomes the area you are more comfortable in. So not a solution.
      • I'm not sure of the relevance of your post, given that no one is claiming they are better at avoiding crashes in circumstances other than those they have been operating in. It just seems like you are pointing out the obvious and redundant.

        It apparently isn't so obvious, given that those points might just be ignored. Getting from point A to point B in real life does not happen in perfect weather all the time, so you can bet that putting out numbers that make them look like they are sooo safe is scammish.

    • They matched for road type, vehicle type and location, but not for weather. So yes, the human-driver data may include miles done in weather conditions the driverless cars would refuse to operate in. But the study used data from California, Arizona, and Texas. How much bad weather do those states have? (I genuinely have no idea).
      • by PCM2 ( 4486 )

        California, Arizona, and Texas. How much bad weather do those states have?

        California has just about any type of bad weather you want, from scorching desert heat to snowstorms and hail ... so Waymo doesn't operate in those areas.

    • by AvitarX ( 172628 )

      But, the fact that they're better at identifying and avoiding people and bicycles is super relevant since they've presumably been driving in bicycle and people dense areas.

      It's the first results that actually say anything I've seen from one of these companies in a while (much less likely to hit a person or bicycle).

    • by eepok ( 545733 )

      This is the key point that they omit because this is an investment marketing piece masked as groundbreaking news.

      People drive in all conditions, make changes to a route (en route), and drive in brand new places without petabytes of prepared data stating how a vehicle should operate in an area. Waymo cannot do that. Waymo does NOT adapt quickly by any stretch of the word "quickly".

      And that's OK!! Waymo's taking it slow and correct. They're moving at the speed that autonomous driving NEEDS to move.

      But don't cherry pick one aspect of driving and imply that that one aspect makes them better than human drivers.

      • by PCM2 ( 4486 )

        But don't cherry pick one aspect of driving and imply that that one aspect makes them better than human drivers.

        OK, but scientific studies don't typically take the entire world into consideration. They tend to focus on one specific area of research. I have anecdotal evidence that Waymo cars also corner exceptionally well, accelerate and brake efficiently, and do a great job of observing traffic rules and laws ... but this study didn't look at those things.

    • At avoiding crashes .... where and when they OPERATE. Not in the middle of blowing snow. Not on rutted, icy roads. Not in rain. Not in fog. Not if there is a drop on the camera. Not if the area isn't fully scanned. Not if the area has changed since being fully scanned.

      Interesting point. My Jeep has driving assist: pedestrian emergency braking, lane-change warnings, anti-collision emergency braking, and anti-tailgating cruise control.

      And yes, some of it gets turned off at times in rain or dust - we're in the middle of oak pollen season here.

      But I'm still the one driving. I have no issues having them on my car, and yes, they are a help. I wonder if Waymo has done a study comparing modern vehicles with modern bells and whistles versus their cars?

      • That has been an issue with any stats these companies have put out. They have 100% modern vehicles, but their comparison data includes people driving 1970 pickups with huge blind spots.
  • by FeelGood314 ( 2516288 ) on Friday May 02, 2025 @09:23AM (#65346927)
    Now if we could just remove the farm and construction safety exemption for SUVs that will never be used for construction or farming we could save even more lives. For example a car or a mini van must hit the average person, cyclist or child low enough to have them fall on the hood and not take excessive damage to the vital organs. Most SUVs have hoods above my waist and I'm 5'11".
    • Now if we could just remove the farm and construction safety exemption for SUVs that will never be used for construction or farming we could save even more lives. For example a car or a mini van must hit the average person, cyclist or child low enough to have them fall on the hood and not take excessive damage to the vital organs. Most SUVs have hoods above my waist and I'm 5'11".

      I've never heard of this "farm and construction safety exemption for SUVs" and Google doesn't return one. Where did you get this from?

      • by ceoyoyo ( 59147 )

        https://www.vox.com/future-per... [vox.com]

        OP seems to be incorrect. There is an exemption from fuel economy rules for vehicles that might be used on farms, and commercial tax benefits for large vehicles. US safety requirements don't seem to consider anybody outside the vehicle at all, regardless of vehicle type.

    • To be completely fair most jacked up trucks are absolutely flipping useless for farm / construction work. You want to ruin a pickup? Jack it up and put all that extra stress on the ball joints, axles, et al., and in general make the truck worse.

      As my friend puts it, there are trucks you use for work... and most of those are built like the ones from the 1970s/1980s with very little bling. Bling trucks are "sissy trucks" that are useless to get stuff done because you're too afraid to scratch something or it's
    • by Tablizer ( 95088 )

      Seems you are suggesting a bottom wedge or cow-catcher. [wikipedia.org]

      Most SUVs have hoods above my waist and I'm 5'11".

      But this is 'Murica! We waste to show how powerful we are. Jesus Rambo Christ likes big SUV's so we can spread Jesushood via intimidation like the old days. We showed those half-naked Aztecs and their loser calendar who's boss!

  • by sinij ( 911942 ) on Friday May 02, 2025 @09:29AM (#65346949)
    World domination according to SV vulture capitalist:

    Step 1: Fund development of a product with a high barrier to entry.
    Step 2: Sell it at a loss, ensuring no competition.
    Step 3: Destroy all alternatives, ideally by legislation.
    Step 4: Enshittify to recoup the billions spent in Steps 1-3.

    Self-driving is now at Step 2, moving to Step 3.
    • Step 3: Destroy all alternatives, ideally by legislation.

      In a good election cycle, I'd expect Step 3 to be the buzzsaw that stops the whole plan, in the opposite sense: legislation or legal action to ensure the market is not dominated by a monopolist.

      But right now? Excuse me, I'm going to make popcorn.

  • What we need is a third party to decide, instead of a company that has a vested interest in reporting numbers that make them look good.

    • by ClickOnThis ( 137803 ) on Friday May 02, 2025 @10:36AM (#65347115) Journal

      Waymo is publishing this in the Traffic Injury Prevention Journal. [tandfonline.com] The journal is peer-reviewed, and (I assume) publishes reader comments. As imperfect as peer-reviewed publication is, I still think it's the best method we have right now for disseminating new scientific and engineering results.

      The whole point of Waymo publishing its findings is to begin a process that lets third parties "decide" what to make of Waymo's technology. Other studies will follow, including ones from third parties, as self-driving technology spreads.

      • You have trust in corporate America? Let's get the NTSB to decide.

        • This is a report from a manufacturer on the safety record of their technology. It doesn't "decide" anything. It's a scientific/engineering paper, not a piece of legislation or a publication/opinion/ruling from the NTSB.

          And no, I don't trust corporate America, at least not without verification. I expect such verification will follow, after this paper is published. That's how science works.

  • So... (Score:1, Interesting)

    by Anonymous Coward

    The other day I boarded a Waymo and asked it to take me to the airport. A short while after we departed for the ten mile route, I felt an uneasiness in my bowels. Being that it was a Waymo, I promptly dropped my trousers and proceeded to use the back seat as my own personal latrine. It took a few minutes, but soon there was shit all over the back seat of this thing, with more than a little thiamine-rich urine. Much to my surprise, the "AI" decided that this was too much and redirected my Waymo cab into a co

  • A bot car that can analyze the surrounding 350m of traffic 50x per second is (waaay) better than humans at reacting to dangerous situations.
    This shouldn't be surprising to anyone paying attention in the last 10 years.

    Now I'd like to know when I need a special permit to drive a human-controlled motorbike rather than being forced to use a bot-car to take me from A to B. Or when manual driving will be prohibited for humans on public roads. Not sure how fast that could happen, but I expect it to be sooner than

  • Does it repeatedly do dumb shit and cause damage and deaths for no good reason? Including repeatedly obstructing emergency services.

    The way I see it is if the "AI" can't handle learning on the fly then it'll never do the job of self-driving ... for a regular consumer car at least.

    • by dvice ( 6309704 )

      > Does it repeatedly do dumb shit and cause damage and deaths for no good reason? Including repeated obstructing of emergency services.

      Yes, humans do that all the time. That is why they are trying to fix the problem with driverless cars. Waymo has, AFAIK, caused only one death. The victim was a dog that ran from behind a parked car into the side of the Waymo, making it nearly impossible to avoid.

      > The way I see it

      AI is already doing the job and better than humans, so your view is already wrong.

  • When there is an emergency, Waymo will be clueless. What we need instead is a whole bunch of ordinary people working to leave billionaires penniless.
  • by doubledown00 ( 2767069 ) on Friday May 02, 2025 @10:35AM (#65347109)

    I don't believe you, Waymo.

  • I wonder if they indirectly cause more crashes because they drive so slug-slow and regular drivers need to execute risky maneuvers to get around them

    • No, they do not drive too slow. They drive at average speed around San Francisco and the Bay Area. You just don't like technology or something. Are you OK with 40,000 humans killing each other in traffic accidents in the USA every year? (Nearly one million worldwide.) Self-driving is the only way forward. Even if it's slow and they're annoying, we need the tech to develop. The tech is already safer than any human. We need to ban humans from driving ASAP.

    • I wonder if they indirectly cause more crashes because they drive so slug-slow and regular drivers need to execute risky maneuvers to get around them

      They actually obey traffic laws like the concept of a speed limit. Do you mean like the human driver idiots on the road who tailgate me, then sling around me, coming six inches from my bumper, passing me, then slinging back in front of me, again passing six inches from my bumper, forcing me to slow down to regain following distance so some other dipshit behind me can do the same thing?

      • If everyone drove the speed limit, stop-and-go traffic jams would mostly disappear. The behavior you describe is the reason we have traffic jams.

        Whether Waymo is already better than human drivers or not requires a lot more evidence than this one study. But the bar to being safer than most drivers is pretty low. Humans have slow reflexes, short attention spans, are emotionally unstable, have terrible risk assessment, are selfish and easily distracted. Take all the human drivers off the road and there is l

    • by Jeremi ( 14640 )

      regular drivers need to execute risky maneuvers

      "need to"? Or "are impatient and decide to"?

  • Because about every 5-10 miles a human being has to stop and make a correction.

    That's the dirty little secret of Waymo. I guess it's still cheaper than an actual driver if they can get the hardware costs down. It does concern me, though: if Waymo is ever expected to turn a profit, I wouldn't be surprised to see them demanding less human interaction even if the hardware and software aren't ready for it yet.

    I never did like the idea that people are forced to opt in to a live test on public roads. Never m
    • I never did like the idea that people are forced to opt in to a live test on public roads.

      As far as I know people weren't asked to opt in to sharing the road with several ton motor vehicles traveling at inherently unsafe speeds driven by anyone with minimal skills or training. And yes, we ought to get to a point where they don't have to.

    • by PCM2 ( 4486 )

      Because about every 5-10 miles a human being has to stop and make a correction.

      You're thinking of Teslas. There's a sign on the Waymo steering wheel warning passengers never to touch the controls.

  • by ZombieCatInABox ( 5665338 ) on Friday May 02, 2025 @11:26AM (#65347233)

    I have a friend blind in one eye.

    She doesn't have lidar or radar sensors all around her face, or multiple cameras, or 5G location software to tell her where all other vehicles are ten miles all around, or GPS.

    All she has is one eye. And a head that turns. And maybe a couple of accelerometers in her ears. And yet she's a good driver. An excellent driver. Heck, a better driver than I'll ever be. And she can drive equally well at night, in heavy rain, during a snowstorm, on slippery dirt roads, or when her car is completely covered in mud (she drove the Dempster Highway all the way to Tuktoyaktuk).

    And she's never done dumb shit like wedging herself under a blue-colored tank truck because her software confused the lower line of the tank with the horizon.

    Food for thought.

    • by dvice ( 6309704 )

      You are comparing your friend to Tesla. This article is about Waymo.

    • by Jeremi ( 14640 )

      You're comparing a best-case human driver(*) against a worst-case automated driver(**); so it's not surprising that the comparison favors the human.

      A contrived example like that doesn't tell us much about the real world, though, since the real world will be dominated by average-ability drivers.

      (*) yes, "one-eyed but still awesome" is an optimal case, by definition

      (**) Tesla's self-driving approach is very much of the "move fast and break things" school, with a splash of "let's use the cheapest possible sens

  • by Berkyjay ( 1225604 ) on Friday May 02, 2025 @11:53AM (#65347311)

    What the fuck are these numbers and where are they getting them? They imply that Waymo is still injuring people and getting into accidents.

  • They're absolutely safer, but it is because they drive more timidly than humans do. So timid, in fact, that they have no problem stopping in the middle of traffic, requiring human remote drivers to nudge them along.

  • by DesertNomad ( 885798 ) on Friday May 02, 2025 @12:31PM (#65347489)

    Waymo's been operating on the streets of Phoenix for at least the past couple years now. While I've used the service a couple times, it's generally easier for me to just drive myself. (Maybe they've improved the routing between my house and the airport, the last time I used it to go to/from the airport the routing was so circuitous that it was irritating.) Also, it's really not much cheaper than the human-operated vehicles, but the Jaguars are clean, quiet, and the music is really cool. And no surly, inept, chatty, or smelly driver with 5 Little Trees hanging from the rear view mirror.

    Anyway, Waymo AFAIK is a fully autonomous vehicle, with all life/safety computing/information on board. Latency is critical, and wireless links are unreliable. I've done work with some significant "autonomous" vehicle projects (I'm the comms engineer) and anything that needs wireless connectivity to perform life/safety functions just won't work safely. I also have quite a bit of work experience with positive train control, a generally autonomous function, which was finally forced onto the US passenger and Class 1 freight railroads after a particularly disastrous head-on collision in California in 2008. While there are tons of wireless links in order to make it work, NONE are life/safety critical. All information necessary to manage the train (in case the engineer keels over or is really distracted) is on board the locomotive. Informational messages transmitted wirelessly from wayside signals etc. are beaconed every 6 seconds or so, so there's generally plenty of time for a train to collect information on what's happening miles ahead. If the on-board computer doesn't get the necessary information, it forces the train to slow down safely and ultimately stop if that information remains unavailable.
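
    The fail-safe pattern described above (keep all safety decisions on board, and degrade to slowing and then stopping when wayside information stops arriving) can be sketched roughly as follows. This is only an illustration of the idea under assumed thresholds; the class name, the staleness tolerance, and the three-step degradation are made up for the sketch, not actual PTC code.

        # Sketch of a beacon-timeout fail-safe, assuming a roughly 6-second
        # wayside beacon interval as described above. Names, thresholds, and
        # the degradation steps are illustrative assumptions, not PTC code.
        import time

        BEACON_INTERVAL_S = 6.0
        STALE_AFTER_S = 3 * BEACON_INTERVAL_S   # hypothetical tolerance

        class OnBoardSupervisor:
            def __init__(self) -> None:
                self.last_beacon = time.monotonic()

            def on_beacon(self, wayside_state: dict) -> None:
                """Record a wayside informational message and refresh its timestamp."""
                self.last_beacon = time.monotonic()
                # ... merge wayside_state into the on-board model here ...

            def control_action(self) -> str:
                """Decide on board, from on-board state only; no live link required."""
                age = time.monotonic() - self.last_beacon
                if age < STALE_AFTER_S:
                    return "proceed"    # information is fresh enough
                if age < 2 * STALE_AFTER_S:
                    return "slow"       # degrade gracefully first
                return "stop"           # ultimately brake to a safe stop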

    As I said, Waymo's been here for a while. I sometimes (at a distance) follow a Waymo vehicle to see how it adapts to situations. I find its behavior very human-like, not super aggressive, but determined to do what it needs to do. It performs maneuvers that I'd do myself, though it has way more ability to "see" the situation in real time while my puny human brain and eyes can only pay attention to one thing at a time %^). And if it's thwarted in its mission (let's say it's gotta make a left turn from an uncontrolled intersection or driveway onto a busy street, but the traffic just won't let up) it will make a right turn, then figure out how to get going the correct direction. This is all done on-board.

    When a Waymo gets into a situation it can't get itself out of, or it breaks down, or whatever, it apparently phones home for help. The latency on the cellular link is horrific for remote driving; it really can't be done safely. The delay is enough that a human driver with equivalent reaction times would be considered somewhere between DUI and extreme DUI.
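
    As a rough back-of-the-envelope check on that latency point, the sketch below computes how far a car travels before a remote operator's correction could even take effect. The city speed and the round-trip latency values are assumptions chosen for illustration, not measurements of Waymo's link.

        # Distance covered during an assumed remote-control round trip.
        # Speed and latency values are illustrative assumptions only.
        def blind_distance_m(speed_mph: float, round_trip_latency_s: float) -> float:
            return speed_mph * 0.44704 * round_trip_latency_s   # mph -> m/s, then * seconds

        for latency_s in (0.25, 0.5, 1.0):          # assumed cellular round trips
            d = blind_distance_m(35, latency_s)      # assumed 35 mph city speed
            print(f"{latency_s * 1000:.0f} ms -> {d:.1f} m traveled before any correction")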

    • by PCM2 ( 4486 )

      Maybe they've improved the routing between my house and the airport, the last time I used it to go to/from the airport the routing was so circuitous that it was irritating.

      In San Francisco, getting to the airport generally means taking the freeway. Although Waymo has permission to test its vehicles on the freeway, I don't think it's begun doing so. There is probably a way to get to the airport without the freeway, but it would take a very long time and it would mean going outside of Waymo's service area. Effectively, going to the airport is not a use case for Waymo in San Francisco.

  • And companies ~ always ~ use accurate and truthful statements in research when trying to sell a product.
    see: A Frank Statement: https://en.wikipedia.org/wiki/... [wikipedia.org]
  • Not sure what kind of experimental controls are possible to get such statistics.

    What would really interest me is a comparison between waymo and good drivers. Presumably, there is some distribution of accident rate among drivers themselves. Part of our problem is that we allow a lot of bad drivers on the roads.

  • Human drivers are notoriously bad and unaware of it. Hence, any self-driving mechanism that makes it onto the streets needs to be significantly better than human drivers. Oh, and look, it is.

  • So why not lay some more tracks?

    • We don't even have self-driving trains except in small closed loops. The only reason they're doing cars first is because laying track involves buying land rather than using publicly funded streets.

  • -company selling the product
