Waymo Pulls Back the Curtain On 6.1 Million Miles of Self-Driving Car Data (theverge.com)

An anonymous reader quotes a report from The Verge: In its first report on its autonomous vehicle operations in Phoenix, Arizona, Waymo said that it was involved in 18 crashes and 29 near-miss collisions during 2019 and the first nine months of 2020. These crashes included rear-enders, vehicle swipes, and even one incident when a Waymo vehicle was T-boned at an intersection by another car at nearly 40 mph. The company said that no one was seriously injured and "nearly all" of the collisions were the fault of the other driver. The report is the deepest dive yet into the real-life operations of the world's leading autonomous vehicle company, which recently began offering rides in its fully driverless vehicles to the general public. ... [I]n this paper, and another also published today, the company is showing its work. Waymo says its intention is to build public trust in automated vehicle technology, but these papers also serve as a challenge to other AV competitors.

The two papers take different approaches. The first maps out Waymo's multilayered approach to safety, which includes three layers:
  • Hardware, including the vehicle itself, the sensor suite, the steering and braking systems, and the computing platform;
  • The automated driving system's behavioral layer, such as avoiding collisions with other cars, successfully completing fully autonomous rides, and adhering to the rules of the road;
  • Operations, like fleet operations, risk management, and a field safety program to resolve potential safety issues.

The second paper is meatier, with detailed information on the company's self-driving operations in Phoenix, including the number of miles driven and the number of "contact events" Waymo's vehicles have had with other road users. This is the first time that Waymo has ever publicly disclosed mileage and crash data from its autonomous vehicle testing operation in Phoenix. Between January and December 2019, Waymo's vehicles with trained safety drivers drove 6.1 million miles. In addition, from January 2019 through September 2020, its fully driverless vehicles drove 65,000 miles. Taken together, the company says this represents "over 500 years of driving for the average licensed US driver," citing a 2017 survey of travel trends by the Federal Highway Administration.
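As a rough sanity check on the "over 500 years" claim, here is a minimal sketch using only the mileage figures quoted above (the implied annual mileage is derived, not stated in the report):

```python
# Sanity-check the "over 500 years of driving" claim.
safety_driver_miles = 6_100_000    # miles with trained safety drivers (2019)
driverless_miles = 65_000          # fully driverless miles (Jan 2019 - Sep 2020)
total_miles = safety_driver_miles + driverless_miles

years_claimed = 500
implied_annual_miles = total_miles / years_claimed
print(f"Implied miles per driver-year: {implied_annual_miles:,.0f}")
# -> about 12,330 miles/year, in the neighborhood of commonly cited
#    US per-driver averages, so the claim is at least self-consistent.
```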
"This is a major milestone, we think, in transparency," said Matthew Schwall, head of field safety at Waymo, in a briefing with reporters Wednesday. Waymo claims this is the first time that any autonomous vehicle company has released a detailed overview of its safety methodologies, including vehicle crash data, when not required by a government entity. "Our goal here is to kickstart a renewed industry dialogue in terms of how safety is assessed for these technologies," Schwall said.
Comments:
  • Are the Waymo stats better than an equal amount of human drivers?

    • Yes. They are better than human drivers, by a margin that is too large to even call a margin.

      • Not really. Average meat sack is 165K miles between accidents. Google is running 6.1M/18, or about 339K, so a little better than 2X (see the arithmetic sketch at the end of this sub-thread). But we don't know if the Google miles were easier miles than meat-sack miles. I've driven numerous miles on snow and ice w/o accident, and I would characterize those as much trickier than driving in Phoenix. I used to live in Tucson. Easy miles, easy miles.
        • There are 2/3 more accidents where it snows, so even though you were trying to be sarcastic, you were actually right. Also, you should be comparing to a meat sack driving a brand new $30K+ vehicle, not... whatever a meat sack may drive without modern safety equipment.
          • Yes, my miles in snow and ice were in a RWD car with no ABS, and it certainly was not a car in the $30K range of today. It was a beater. I went to college in a cold snow/ice climate, and there was this one small incline where I used to swear at people who did not roll the stop sign. What were they thinking? So much easier on ice if you keep rolling even a little to maintain traction. And pump the brakes, since there was no ABS to do it for you. People today are so spoiled with AWD, much better tire compounds for cold, etc.
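        A quick sketch of the miles-per-crash arithmetic above, in Python; the 165K miles-between-crashes human baseline is the commenter's figure, not an official statistic:

        ```python
        # Back-of-the-envelope miles-per-crash comparison.
        waymo_miles = 6_100_000        # miles with trained safety drivers (2019)
        waymo_crashes = 18             # "contact events" reported by Waymo

        human_miles_per_crash = 165_000    # commenter's figure for an average driver

        waymo_miles_per_crash = waymo_miles / waymo_crashes
        print(f"Waymo: 1 crash per {waymo_miles_per_crash:,.0f} miles")
        ratio = waymo_miles_per_crash / human_miles_per_crash
        print(f"Ratio vs. human baseline: {ratio:.1f}x")
        # -> ~338,889 miles per crash, about 2.1x the human baseline,
        #    before adjusting for how easy Phoenix miles are.
        ```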
      • by fermion ( 181285 )
        It is hard to compare. In real conditions the accident rate is 1000 times worse than the Waymo data; however, those human numbers necessarily come from real-world conditions, which Waymo's miles are not.
    • by AmiMoJo ( 196126 )

      Probably because it's very difficult to do. They would need human drivers in similar vehicles operating on similar routes at similar times. The pool of people doing that is probably too small to draw meaningful conclusions from.

      • Maybe with the data captured during a crash, they could recreate the situation on a VR driving system and compare the reactions of the average driver.
      • Probably because it's very difficult to do. They would need human drivers in similar vehicles operating on similar routes at similar times. The pool of people doing that is probably too small to draw meaningful conclusions from.

        Seems like you could get a pretty fair approximation by just looking at accident statistics in Phoenix. I don't know if Waymo vehicles operate at night; if not, restrict it to daylight statistics. I'll bet the Arizona DOT has pretty good estimates on driver miles in Phoenix, and they obviously have very good numbers on collisions.

        However, given that this represents 500 years' worth of miles for a normal driver, and a normal driver is unlikely to drive for more than 50 years, you can get a rough approximation.

    • by s_p_oneil ( 795792 ) on Friday October 30, 2020 @05:52PM (#60667324) Homepage

      IMO the answer is obvious.

      FTA:
      Taken together, the company says this represents “over 500 years of driving for the average licensed US driver,” citing a 2017 survey of travel trends by the Federal Highway Administration. ... Eighteen of these events occurred in real life... “Nearly all” of these collisions were the fault of a human driver or pedestrian, Waymo says, and none resulted in any “severe or life-threatening injuries.”

      Summary:
      So it has 500 years of driving experience with 18 minor incidents (nearly all the fault of the other driver) and 0 severe injuries. And that count of 18 includes minor "bumps" with no damage. I've been bumped by other cars behind me not paying attention about half that many times, and I've barely been driving for 30 years. The main caveat is that so far it has still been under limited conditions. IIRC, it doesn't do as well in rainy conditions, which is why most of the tests are done in very dry parts of the US. They also have strictly limited the range because it's more difficult to be 100% certain that the maps are perfect over larger areas.

      • "So it has 500 years of driving experience"

        Not 500 years of the average human's driving. The Waymo is geofenced, and has all kinds of other limitations on its driving conditions that don't apply to human drivers.

      • I replied too fast. You do mention that the Waymo's driving conditions are not comparable to a human driver's.

      • The main caveat is that so far it has still been under limited conditions. IIRC, it doesn't do as well in rainy conditions, which is why most of the tests are done in very dry parts of the US. They also have strictly limited the range because it's more difficult to be 100% certain that the maps are perfect over larger areas.

        Exactly. Also, at least when Google was testing cars around Mountain View, the driving behavior was extremely conservative. Those cars drove at or under the speed limit and wouldn't change lanes without a huge gap. There were always a few cars stuck behind the Google car, trying to find a gap in the next lane to pass it.

        From the Waymo report [googleapis.com], "To date, Waymo has compiled over 20 million self-driving miles on public roads operating in over 25 cities, including 74,000 driverless miles." Tesla a…

        • If that many people are having to pass the car then it's not driving human enough to be safe.
          • True, but that was much earlier on in their road tests. They had to build up confidence before speeding it up.

            If you watch the video linked in TFA, it seems to drive like an average Florida driver. That would be annoyingly slow to an Atlanta driver like me, but I don't think anyone really wants automated cars that drive like Atlanta drivers. ;-) I had a co-worker get pulled over in North Carolina, and the cop said "I knew you were from Atlanta when I saw you hit 75 on the entrance ramp" (for a stretch of highway…

          • by ljw1004 ( 764174 )

            If that many people are having to pass the car then it's not driving human enough to be safe.

            Conjecture: the humans aren't driving computer-like enough to be safe.

        • "Also, at least when Google was testing cars around Mountain View, the driving behavior was extremely conservative."

          That makes perfect sense. In Google's case, Google was 100% liable for anything at all going wrong. In Tesla's case, they passed the buck on to the drivers by saying "While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car". They basically made their customers beta testers for them. With that in mind, w…

          • Waymo and Tesla are solving the self-driving problem from opposite ends of the spectrum. Tesla is trying to figure out how to drive cars in all conditions, including a lot of dangerous ones. Waymo is driving cars slowly in idealized conditions, and gradually branching out into slightly more difficult ones.

            It's going to be a decade or two before Waymo's cars can handle what Tesla's can handle now. They're not even driving them in places where it rains. Tesla is navigating snow-covered roads. Given what Tesla is doing…

    • The number of incidents for the Waymo vehicles would be within the margin of error for reporting accidents of human drivers.
    • They have a chapter about comparisons to human performance at the end of the second paper, "4.2 Aggregate Safety Performance".

      Basically, there's not enough data about humans for low-severity crashes, since those go mostly unreported, and they didn't run enough miles for their more severe crash cases to be statistically significant.

      What Waymo is actually doing here is establishing a safety standard for AVs. The first paper is also important. If those testing methodologies and metrics are adopted at industry level…
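      To put rough numbers on the severe-crash point: at the ~1.1 fatalities per 100 million vehicle-miles often cited for US drivers (an outside figure, not from the paper), 6.1 million miles would be expected to produce well under one fatal crash, so zero observed fatalities is uninformative. A minimal sketch, assuming crashes follow a Poisson process:

      ```python
      import math

      miles = 6_100_000
      human_fatal_rate = 1.1 / 100_000_000   # approx. US fatalities per vehicle-mile

      expected = miles * human_fatal_rate
      print(f"Expected fatal crashes at the human rate: {expected:.3f}")   # ~0.067

      # Probability of observing zero fatal crashes even if Waymo were
      # exactly as safe as the average human driver:
      p_zero = math.exp(-expected)
      print(f"P(zero | human-level safety) = {p_zero:.1%}")   # ~93.5%
      ```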

  • " "nearly all" of the collisions were the fault of the other driver."

    Unexpected and non-normal movement of the Waymo vehicle may be the reason for a collision even if the other driver is at fault. Sudden unnecessary braking should be Waymo's fault, not a normal driver's fault. Slower-than-normal driving should also not be acceptable from non-human drivers.
    Waymo not avoiding a collision may also be an issue even if the other vehicle is technically at fault. If I have to swerve to avoid another vehicle or object…

    • https://www.theverge.com/2020/... [theverge.com]

      This picture seems to indicate Waymo slowed down way too much and way too early for a traffic light. Electric vehicles and regenerative-braking vehicles do this kind of dangerous driving too.

      • Are you talking about the picture labeled figure 3, event b? You're crazy if you think it was slowing down too early. Furthermore, you can see way back there that the rear car had already slowed down to a matching speed, but then in all that additional distance only slowed down an additional 3 mph. Plus, if you remove the Waymo vehicle from the scene, the rear car was doing 25 mph when it was about 4 car lengths from the car already stopped at the light. If the Waymo vehicle weren't there, the driver would've just plowed into the stopped car.
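        For what it's worth, a rough stopping-distance check of that scenario; the 0.7g braking, 1-second reaction time, and 4.5 m car length are generic assumptions, not figures from the report:

        ```python
        # Could the rear car, doing 25 mph, stop in ~4 car lengths?
        v = 25 * 0.44704                  # 25 mph in m/s (~11.2 m/s)
        reaction_time = 1.0               # s, typical alert-driver assumption
        decel = 0.7 * 9.81                # m/s^2, hard braking on dry pavement

        reaction_dist = v * reaction_time          # ~11.2 m
        braking_dist = v ** 2 / (2 * decel)        # ~9.1 m
        gap = 4 * 4.5                              # 4 car lengths at ~4.5 m each

        print(f"needed: {reaction_dist + braking_dist:.1f} m, available: {gap:.1f} m")
        # -> ~20.3 m needed vs ~18.0 m available: even a hard stop from 25 mph
        #    at 4 car lengths is marginal, consistent with the parent's point.
        ```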

    • If you've ever been involved in a vehicular altercation with a law enforcement officer, you probably have an idea how the future is going to rule on the puny human driver being at fault vs. the unerring computer guidance system.

      • Every self-driving car is going to have a precise record of not only its own telemetry, but also the video/lidar evidence of other vehicles' positions and actions around it. It's probably going to be fairly easy to determine fault in most cases simply by examining the records.

        And yes, the default suspicion is going to fall on the human driver with very human foibles and failings. Once self-driving cars have a clearly superior safety record, that's going to be inevitable. Hopefully it doesn't unfairly prejudice the…

    • Unexpected and non-normal movement of Waymo vehicle may be the reason for the collision even if other driver is at fault.

      Nope. The reason for the collision is the other driver didn't react to changed conditions. Fault and reason are the same thing. If the guy in front of you slams on the brakes without any warning or notice and you rear-end him, you're at fault; the reason for the accident is that you were driving too close to react to a change. If the guy to the left of you swerves but never leaves his lane and you react and hit another car, you're at fault; the reason for the accident is you got spooked and were unable to control…

      • The average USA-based driver...

        Apples to oranges.

        The average USA-based driver doesn't drive slowly around Phoenix. That is all that Waymo is doing.

        They're not driving in rain, in the snow, on mountain roads, through rush hour in Atlanta, or during a hurricane evacuation in Florida. They are not going on new roads, or to new places. They're operating in a well-mapped, limited area at low speeds in idealized conditions.

        Unless you have driving data on similar drivers in Phoenix, you don't have a comparison to make.

    • Maybe the who-is-at-fault rules need to be changed when one of the drivers is non-human?

      Nope. It's your responsibility to take your cue from the driver, or lack thereof. I watch drivers' heads (and when possible, faces) to decide what they're going to do based on where they're looking. If I don't see one, I assume the vehicle could do anything, because the driver is rooting around in the footwell for a CD or vibrator or whatever, or because they can't see over the dashboard. The same logic would lead me to give a self-driving vehicle a wide berth.

      As for rear-ending a vehicle, it's always your responsibility…

  • The only stats that I have are for me.
    I've had a driver's license and been driving for 15 years now, and I have owned a car for 13.5 years. I've driven a total of 180,000 km, or 115,000 miles.
    I've been in 4 crashes:
    • 2 were the other driver's fault - once I was rear-ended while fully stopped and waiting at a red light, and once I had the right of way but the other driver didn't respect it. Both were quite serious and took my car out of commission for a while.
    • 2 were my fault - I slightly rear-ended a couple of cars on different occasions…
    • Did you stick to Waymo training routes only for the entire time you were driving?
      • In Phoenix, where there are 230 days of a lack of what the rest of us call road conditions. Phoenix average road speed is something like 7 mbps slower than the speed limit because of the number of traffic lights. Roads without much in the way of potholes. Do the same study in Pittsburgh, NOLA, Red Bank NJ, and Seattle, and get back to me.
        • Give it time. As more data is gathered and the technology matures, the scope of self-driving will be expanded, and you can then shift your goalposts farther away. Right now Waymo is doing what no robot has done before and many said can't be done at all; isn't that enough?
        • Phoenix average road speed is something like 7 mbps slower than the speed limit

          mbps? I don't think we need commentary from chatbots on the information superhighway

      • The limited scope of testing conditions so far is not really relevant. The point is that, under any conditions, when you look at two-party collisions, an average driver is expected to cause about as many accidents as he suffers through no fault of his own. This statistic would be skewed in the respective direction for a really bad or a really good driver. Exhibit A: Waymo's results.
        • The limited scope of testing conditions means that they are only testing 20% of real-world driving. Tell me the next time you make 20% of a game and it becomes commercially successful.
  • "pay no attention to that man behind the curtain"

  • On what kind of roads, under what weather conditions, under what traffic conditions, at what speed limits, and so on, and so on, and so on?
    If it's all under 'near ideal' conditions, not 'average, everyday conditions' that the vast majority of drivers experience, then it's cherry-picking.
    Still, none of this negates the fact that the so-called 'AI' has no actual reasoning ability, can only rely on its 'training data', can still mistake objects for something else, and still needs a 'remote human operator' to…
    • Also, none of this will ever get around the fact that if you have no control over the vehicle, you will never really feel safe.

      I frequently ride in vehicles I have no control over and mostly feel safe.

      • Funny, I don't ride with certain drivers because I feel unsafe with them. I meet them wherever instead.
      • Re:Conditions?

        by laird ( 2705 ) <lairdp@gm a i l.com> on Friday October 30, 2020 @07:46PM (#60667558) Journal

        It's true that people "feel unsafe" in situations where they don't feel like they are in control, even though they're far safer than in other similar situations. For example, it's not uncommon for people to fear flying and not to fear driving, even though they are 750x more likely to die per mile driven than flown. So for some people it's not about actual safety; it's about feeling like they don't control the situation.

        So even though autonomous vehicles are expected (when they're ready) to be able to reduce collisions and deaths by perhaps 90%, some people will still fear them and choose to take 10x the risk and drive manually. Luckily for them, they'll still be safer, since at least the autonomous vehicles won't hit them, and will be more able to respond to their erratic driving. Though it's unfortunate that they'll be putting themselves and everyone around them at unnecessary risk.

        • Though it's unfortunate that they'll be putting themselves and everyone around them at unnecessary risk.

          Yes, yes, and people like you will try to 'drive shame' us into giving up control of our own safety and lives, over to some half-assed software. Nope, nope, nope, not going to happen. Enjoy your death machine.

      • ...driven by another human being, whom you can at least talk to or, at worst, *yell at*, and who, by the way, has an actual brain capable of actual reasoning. Not some hardware and crappy over-hyped software.
  • I literally just had a run-in with one of their vans today here in SF. It swerved into my lane, forcing me to hit the brakes. Seems like it was avoiding a large truck that was parked well away from the curb and sticking out into its lane. Still not sure if the van was human-controlled or self-driving.

  • by McGruber ( 1417641 ) on Friday October 30, 2020 @09:07PM (#60667752)

    18 crashes in 6.1 million miles traveled is a rate of about 1 crash per 338,889 miles.

    That is an awful safety performance compared to the crash rates of experienced Commercial Drivers License (CDL) holders, many of whom drive more than a million miles without a single crash.

    • That is an awful safety performance compared to the crash rates of experienced Commercial Drivers License (CDL) holders, many of whom drive more than a million miles without a single crash.

      On city streets, where Waymo operates? It's easy for an over-the-road truck driver to rack up lots and lots of accident-free miles, because freeway miles are very safe. The numbers for city bus drivers would be a better comparison. I spent a couple minutes googling and couldn't find those numbers, though. I could find fatality numbers, but Waymo's lack of fatalities prevents comparison.

      • Comparing Waymo self-driving to CDL holders will never, ever give you a fair comparison, period. Why? Because you only need a CDL to operate heavy trucks. You don't need a CDL to commercially operate a car or a pickup truck. A CDL is required only for drivers of commercial motor vehicles with a GVWR or GCWR of 26,001 lbs or more, being operated for the purposes of commerce. I looked into this quite a bit because I am [occasionally] working on a bus-to-RV conversion with a GVWR of 31,200 lb.

        You can't meaningfully…

    • The world doesn't care about "many of whom". The world cares about averages, and the average vehicle in America will crash once every 168,000 miles according to the stats.

      Your CDL holders example is a very poor comparison of two very different things. CDL vehicles are not cars on a road; they aren't taxis on a road. They are slow-moving heavy beasts with little acceleration, frequently driven long-haul over long distances in constant, non-varying conditions, with very little chance of dangerous interactions…

  • In Phoenix, Arizona, probably on the same roads, in hot weather. Guess we don't have the same definition of average.
  • The two major competitors to Waymo (Tesla and Uber) have literally had people die on their "test" platforms. So from purely a numbers perspective, they're doing well. Also, unlike Tesla, Waymo works on residential roads and parking, something you're not supposed to use Tesla's system for.

    • So you trust Tesla, which operates on all roads in all weather with millions of vehicles in use, less than Waymo, which operates 600 cars in Phoenix? Unless you also live in Phoenix, that's probably misplaced trust.

      Waymo is an academic exercise, and Tesla is actual real-world experience. People are going to have Teslas carry them up slippery mountain roads this winter by the tens of thousands, if not hundreds of thousands. Waymo is going to continue to operate somewhere flat and dry at speeds under the speed limit…
