Waymo Pulls Back the Curtain On 6.1 Million Miles of Self-Driving Car Data (theverge.com) 57
An anonymous reader quotes a report from The Verge: In its first report on its autonomous vehicle operations in Phoenix, Arizona, Waymo said that it was involved in 18 crashes and 29 near-miss collisions during 2019 and the first nine months of 2020. These crashes included rear-enders, vehicle swipes, and even one incident when a Waymo vehicle was T-boned at an intersection by another car at nearly 40 mph. The company said that no one was seriously injured and "nearly all" of the collisions were the fault of the other driver. The report is the deepest dive yet into the real-life operations of the world's leading autonomous vehicle company, which recently began offering rides in its fully driverless vehicles to the general public. ... [I]n this paper, and another also published today, the company is showing its work. Waymo says its intention is to build public trust in automated vehicle technology, but these papers also serve as a challenge to other AV competitors.
The two papers take different approaches. The first maps out Waymo's multilayered approach to safety. It includes three layers: Hardware, including the vehicle itself, the sensor suite, the steering and braking systems, and the computing platform; the automated driving system's behavioral layer, such as avoiding collisions with other cars, successfully completing fully autonomous rides, and adhering to the rules of the road; and operations, like fleet operations, risk management, and a field safety program to resolve potential safety issues.
The second paper is meatier, with detailed information on the company's self-driving operations in Phoenix, including the number of miles driven and the number of "contact events" Waymo's vehicles have had with other road users. This is the first time that Waymo has ever publicly disclosed mileage and crash data from its autonomous vehicle testing operation in Phoenix. Between January and December 2019, Waymo's vehicles with trained safety drivers drove 6.1 million miles. In addition, from January 2019 through September 2020, its fully driverless vehicles drove 65,000 miles. Taken together, the company says this represents "over 500 years of driving for the average licensed US driver," citing a 2017 survey of travel trends by the Federal Highway Administration. "This is a major milestone, we think, in transparency," said Matthew Schwall, head of field safety at Waymo, in a briefing with reporters Wednesday. Waymo claims this is the first time that any autonomous vehicle company has released a detailed overview of its safety methodologies, including vehicle crash data, when not required by a government entity. "Our goal here is to kickstart a renewed industry dialogue in terms of how safety is assessed for these technologies," Schwall said.
Why don't they compare this with 'real' drivers? (Score:2)
Are the waymo stats better than and equal amount of human drivers?
Re: Why don't they compare this with 'real' driver (Score:1)
Yes. They are better than human drivers, by a margin that is too large to even call a margin.
Re: (Score:3)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Probably because it's very difficult to do. They would need human drivers in similar vehicles operating on similar routes at similar times. The pool of people doing that is probably too small to draw meaningful conclusions from.
Re: Why don't they compare this with 'real' driver (Score:2)
Re: (Score:3)
Probably because it's very difficult to do. They would need human drivers in similar vehicles operating on similar routes at similar times. The pool of people doing that is probably too small to draw meaningful conclusions from.
Seems like you could get a pretty fair approximation by just looking at accident statistics in Phoenix. I don't know if Waymo vehicles operate at night; if not, restrict it to daylight statistics. I'll bet the Arizona DOT has pretty good estimates on driver miles in Phoenix, and they obviously have very good numbers on collisions.
However, given that this represents 500 years worth of miles for a normal driver, and a normal driver is unlikely to drive for more than 50 years, you can get a rough approximatio
Re: (Score:2)
Re: (Score:2)
Good luck keeping that statistical anomaly!
Re: (Score:2)
Hello, nice to meet you. zero.
You've driven for 50 years?
Re:Why don't they compare this with 'real' drivers (Score:4, Informative)
IMO the answer is obvious.
FTA: ... Eighteen of these events occurred in real life... “Nearly all” of these collisions were the fault of a human driver or pedestrian, Waymo says, and none resulted in any “severe or life-threatening injuries.”
Taken together, the company says this represents “over 500 years of driving for the average licensed US driver,” citing a 2017 survey of travel trends by the Federal Highway Administration.
Summary:
So it has 500 years of driving experience with 18 minor incidents (nearly all the fault of the other driver) and 0 severe injuries. And that 18 counts minor "bumps" with no damage. I've been bumped by other cars behind me not paying attention about half that many times, and I've barely been driving for 30 years. The main caveat is that so far it has still been under limited conditions. IIRC, it doesn't do as well in rainy conditions, which is why most of the tests are done in very dry parts of the US. They also have strictly limited the range because it's more difficult to be 100% certain that the maps are perfect over larger areas.
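The "500 years" figure can be sanity-checked against the mileage numbers in the summary. A minimal sketch, backing out the implied average annual mileage (the FHWA survey figure itself isn't quoted in the summary, so we solve for it rather than assume it):

```python
# Sanity check of Waymo's "over 500 years of driving" claim, using only
# the mileage figures given in the summary.
safety_driver_miles = 6_100_000   # Jan-Dec 2019, with safety drivers
driverless_miles = 65_000         # Jan 2019 - Sep 2020, fully driverless
total_miles = safety_driver_miles + driverless_miles

# If total_miles equals "over 500 years" of average driving, the implied
# average annual mileage is:
implied_annual_miles = total_miles / 500
print(f"Implied average: {implied_annual_miles:,.0f} miles/year")  # 12,330
```

An implied ~12,000 miles per year is in the right ballpark for a US driver, so the claim is at least internally consistent.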
Re: (Score:2)
"So it has 500 years of driving experience"
Not 500 years of the average human's driving. The Waymo is geofenced, and has all kinds of other limitations on its driving conditions that don't apply to human drivers.
Re: (Score:2)
I replied too fast. You do mention the waymo's driving conditions are not comparable to a human driver.
Re: (Score:2)
The main caveat is that so far it has still been under limited conditions. IIRC, it doesn't do as well in rainy conditions, which is why most of the tests are done in very dry parts of the US. They also have strictly limited the range because it's more difficult to be 100% certain that the maps are perfect over larger areas.
Exactly. Also, at least when Google was testing cars around Mountain View, the driving behavior was extremely conservative. Those cars drove at or under the speed limit and wouldn't change lanes without a huge gap. There were always a few cars stuck behind the Google car, trying to find a gap in the next lane to pass the Google car.
From the Waymo report [googleapis.com], "To date, Waymo has compiled over 20 million self-driving miles on public
roads operating in over 25 cities, including 74,000 driverless miles." Tesla a
Re: (Score:2)
Re: (Score:2)
True, but that was much earlier on in their road tests. They had to build up confidence before speeding it up.
If you watch the video linked in TFA, it seems to drive like an average Florida driver. That would be annoyingly slow to an Atlanta driver like me, but I don't think anyone really wants automated cars that drive like Atlanta drivers. ;-) I had a co-worker get pulled over in North Carolina, and the cop said "I knew you were from Atlanta when I saw you hit 75 on the entrance ramp" (for a stretch of hi
Re: (Score:3)
If that many people are having to pass the car then it's not driving human enough to be safe.
Conjecture: the humans aren't driving computer-like enough to be safe.
Re: (Score:2)
"Also, at least when Google was testing cars around Mountain View, the driving behavior was extremely conservative."
That makes perfect sense. In Google's case, Google was 100% liable for anything at all going wrong. In Tesla's case, they passed the buck on to the drivers by saying "While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car". They basically made their customers beta testers for them. With that in mind, w
Re: (Score:3)
Waymo and Tesla are solving the self-driving problem from opposite ends of the spectrum. Tesla is trying to figure out how to drive cars in all conditions, including a lot of dangerous ones. Waymo is driving cars slowly in idealized conditions, and gradually branching out into slightly more difficult ones.
It's going to be a decade or two before Waymo's cars can handle what Tesla's can handle now. They're not even driving them in places where it rains. Tesla is navigating snow-covered roads. Given what Tesla is do
Re: (Score:1)
Re: (Score:1)
They have a chapter about comparisons to human performance at the end of the second paper, "4.2 Aggregate Safety Performance".
Basically - there's not enough data about humans for low severity crashes, since those go mostly unreported; and they didn't run enough miles for their more severe crash cases to be statistically significant.
What Waymo is actually doing here is to establish a safety standard for AVs. The first paper is also important. If those testing methodologies and metrics are adopted at industry
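The statistical-significance point above can be illustrated with a rough Poisson sketch: even if Waymo's severe-crash rate exactly matched a human baseline, zero observed severe crashes in 6.1 million miles wouldn't be surprising. The human rate below is a placeholder assumption for illustration, not a figure from the papers:

```python
import math

# Illustrative only: why zero severe crashes in 6.1M miles says little.
# Assume (hypothetically) humans average 1 severe crash per 3M miles.
assumed_human_rate = 1 / 3_000_000   # severe crashes per mile (placeholder)
waymo_miles = 6_100_000

expected = assumed_human_rate * waymo_miles   # expected events at that rate
p_zero = math.exp(-expected)                  # Poisson probability of 0 events
print(f"Expected severe crashes at human rate: {expected:.1f}")  # 2.0
print(f"Chance of observing zero anyway: {p_zero:.0%}")          # 13%
```

With only ~2 expected events, observing zero happens about one time in eight by chance alone, so the sample can't distinguish the two rates.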
Who is at fault? (Score:2)
" "nearly all" of the collisions were the fault of the other driver."
Unexpected and non-normal movement of the Waymo vehicle may be the reason for the collision even if the other driver is at fault. Sudden unnecessary braking should be Waymo's fault, not the normal driver's fault. Slower-than-normal driving should also not be acceptable from non-human drivers.
Waymo not avoiding a collision may also be an issue even if other vehicle is technically at fault. If I have to swerve to avoid another vehicle or obje
Re: (Score:1)
https://www.theverge.com/2020/... [theverge.com]
This picture seems to indicate Waymo slowed down way too much and way too early for a traffic light. Electric and regenerative-braking vehicles do this kind of dangerous driving too.
Re: (Score:3)
Are you talking about the picture labeled figure 3, event b? You're crazy if you think it was slowing down too early. Furthermore, you can see way back there that the rear car had already slowed down to a matching speed, but then in all that additional distance only slowed down an additional 3 mph. Plus, if you remove the Waymo vehicle from the scene, the rear car was doing 25 mph when it was about 4 car lengths from the car already stopped at the light. If the Waymo vehicle weren't there, the driver would've just plo
Re: (Score:2)
If you've ever been involved in a vehicular altercation with a law enforcement officer, you probably have an idea how the future is going to rule on the puny human driver being at fault vs. the unerring computer guidance system.
Re: (Score:2)
Every self-driving car is going to have a precise record not only of its own telemetry, but the video/lidar evidence of other vehicles' positions and actions around it. It's probably going to be fairly easy to determine fault in most cases simply by examining the records.
And yes, the default suspicion is going to fall on the human driver with very human foibles and failings. Once self-driving cars have a clearly superior safety record, that's going to be inevitable. Hopefully it doesn't unfairly prejudice t
Re: (Score:2)
Re: (Score:2)
Unexpected and non-normal movement of Waymo vehicle may be the reason for the collision even if other driver is at fault.
Nope. The reason for the collision is the other driver didn't react to changed conditions. Fault and reason are the same thing. If the guy in front of you, without any warning or notice, slams on the brakes and you rear-end him, you're at fault; the reason for the accident is that you were driving too close to react to a change. If the guy to the left of you swerves but never leaves his lane and you react and hit another car, you're at fault; the reason for the accident is you got spooked and were unable to cont
Re: (Score:2)
The average USA based driver...
Apples to oranges.
The average USA based driver doesn't drive slowly around Phoenix. That is all that Waymo is doing.
They're not driving in rain, in the snow, on mountain roads, through rush hour in Atlanta, or during a hurricane evacuation in Florida. They are not going on new roads, or to new places. They're operating in a well-mapped, limited area at low speeds in idealized conditions.
Unless you have driving data on similar drivers in Phoenix, you don't have a comparison to make.
Re: (Score:3)
Maybe who is at fault rules need to be changed when one of the drivers is non-human?
Nope. It's your responsibility to cue from the driver, or lack thereof. I watch drivers' heads (and when possible, faces) to decide what they're going to do based on where they're looking. If I don't see one, I assume the vehicle could do anything, because the driver is rooting around in the footwell for a CD or vibrator or whatever, or because they can't see over the dashboard. The same logic would lead me to give a self-driving vehicle a wide berth.
As for rear-ending a vehicle, it's always your responsibi
Those are surprisingly good numbers (Score:2)
I've had a driver's license for 15 years now and have owned a car for 13.5 years. I've driven a total of 180,000 km, or 115,000 miles.
I've been in 4 crashes:
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Phonix average road speed is something like 7 mbps slower than the speed limit
mbps? I don't think we need commentary from chatbots on the information superhighway
Re: (Score:2)
Re: (Score:2)
We all learned from Dorothy (Score:2)
"pay no attention to that man behind the curtain"
Conditions? (Score:1)
If it's all under 'near ideal' conditions, not 'average, everyday conditions' that the vast majority of drivers experience, then it's cherry-picking.
Still, none of this negates the fact that the so-called 'AI' has no actual reasoning ability, can only rely on its 'training data', can still mistake objects for something else, and still needs a 'remote human operator' t
Re: (Score:2)
Also none of this will ever get around the fact that if you have no control over the vehicle you will never really feel safe
I frequently ride in vehicles I have no control over and mostly feel safe.
Re: (Score:2)
Re:Conditions? (Score:5, Interesting)
It's true that people "feel unsafe" in situations where they don't feel like they are in control, even though they're far safer than in other similar situations. For example, it's not uncommon for people to fear flying, and not to fear driving, even though they are 750x more likely to die per mile driven than flown. So for some people it's not about actual safety, it's about feeling like you don't control the situation.
So even though autonomous vehicles are expected (when they're ready) to be able to reduce collision and deaths by perhaps 90%, some people will still fear them and choose to take 10x the risk and drive manually. Luckily for them, they'll still be safer, since at least the autonomous vehicles won't hit them, and will be more able to respond to their erratic driving. Though it's unfortunate that they'll be putting themselves and everyone around them at unnecessary risk.
Re: (Score:2)
Though it's unfortunate that they'll be putting themselves and everyone around them at unnecessary risk.
Yes, yes, and people like you will try to 'drive shame' us into giving up control of our own safety and lives, over to some half-assed software. Nope, nope, nope, not going to happen. Enjoy your death machine.
Re: (Score:2)
That illustrates my point nicely, thanks.
Re: (Score:2)
This is weird (Score:2)
I literally just had a run-in with one of their vans today here in SF. It swerved into my lane, forcing me to hit the brakes. It seems like it was avoiding a large truck that was parked well away from the curb and sticking out into its lane. Still not sure if the van was human controlled or self-driving.
Waymo is an awful driver compared to CDL holders (Score:3)
18 crashes in 6.1 million miles traveled is a rate of 1 crash per 338,888 miles traveled.
That is an awful safety performance compared to the crash rates of experienced Commercial Drivers License (CDL) holders, many of whom drive more than a million miles without a single crash.
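A quick back-of-the-envelope version of the rate above, compared against the ~168,000-mile average-vehicle figure cited elsewhere in this thread (an oft-quoted estimate, not an official statistic):

```python
# Back-of-the-envelope crash-rate comparison, using the figures from
# this thread. The 168,000-mile average-vehicle number is an assumption.
waymo_miles = 6_100_000
waymo_crashes = 18

miles_per_crash = waymo_miles / waymo_crashes
print(f"Waymo: one crash per {miles_per_crash:,.0f} miles")  # 338,889

avg_vehicle_miles_per_crash = 168_000  # assumed average, per the thread
ratio = miles_per_crash / avg_vehicle_miles_per_crash
print(f"About {ratio:.1f}x more miles between crashes than average")  # 2.0x
```

By that (rough) baseline Waymo looks about twice as good as the average vehicle, not worse; the CDL comparison hinges on whether highway-heavy trucking miles are a fair benchmark for city driving.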
Re: (Score:3)
That is an awful safety performance compared to the crash rates of experienced Commercial Drivers License (CDL) holders, many of whom drive more than a million miles without a single crash.
On city streets, where Waymo operates? It's easy for an over-the-road truck driver to rack up lots and lots of accident-free miles, because freeway miles are very safe. The numbers for city bus drivers would be a better comparison. I spent a couple minutes googling and couldn't find those numbers, though. I could find fatality numbers, but Waymo's lack of fatalities prevents comparison.
Re: (Score:2)
Comparing Waymo self-driving to CDL holders will never, ever give you a fair comparison, period. Why? Because you only need a CDL to operate heavy trucks. You don't need a CDL to commercially operate a car, or a pickup truck. A CDL is required only for drivers of commercial motor vehicles with a GVWR or GCWR of 26,001 lbs or more, being operated for the purposes of commerce. I looked into this quite a bit because I am [occasionally] working on a bus to RV conversion with a GVWR of 31,200 lb.
You can't meanin
Re: (Score:2)
The world doesn't care about "many of whom". The world cares about averages, and the average vehicle in America will crash once every 168,000 miles according to the stats.
Your CDL holders example is a very poor comparison of two very different things. CDL vehicles are not cars on a road, they aren't taxis on a road. They are slow moving heavy beasts with little acceleration frequently driven long haul over long distances in constant and non-varying conditions with very little chance of dangerous interactions
500 years? (Score:1)
I trust Waymo more than their competitors (Score:2)
The two major competitors to Waymo (Tesla and Uber) have literally had people die on their "test" platforms. So from purely a numbers perspective, they're doing well. Also unlike Tesla, Waymo works on residential roads and parking, something that you're not supposed to use Tesla's system on.
Re: (Score:2)
So you trust Tesla, which operates on all roads in all weather with millions of vehicles in use, less than Waymo, which operates 600 cars in Phoenix? Unless you also live in Phoenix, that's probably misplaced trust.
Waymo is an academic exercise, and Tesla is actual real-world experience. People are going to have Teslas carry them up slippery mountain roads this winter by the tens of thousands, if not hundreds of thousands. Waymo is going to continue to operate somewhere flat and dry at speeds under the speed