Human Drivers Keep Rear-Ending Waymos (arstechnica.com) 171
Waymo's driverless cars have a much lower crash rate than human drivers, with fewer than one injury-causing crash per million miles driven, compared to an estimated 64 crashes by human drivers over the same distance. As Ars Technica's Timothy B. Lee notes, a significant portion of Waymo's most severe crashes involved human drivers rear-ending the Waymo vehicles. From the report: Twenty injuries might sound like a lot, but Waymo's driverless cars have traveled more than 22 million miles. So driverless Waymo taxis have been involved in fewer than one injury-causing crash for every million miles of driving -- a much better rate than a typical human driver. Last week Waymo released a new website to help the public put statistics like this in perspective. Waymo estimates that typical drivers in San Francisco and Phoenix -- Waymo's two biggest markets -- would have caused 64 crashes over those 22 million miles. So Waymo vehicles get into injury-causing crashes less than one-third as often, per mile, as human-driven vehicles.
Waymo claims an even more dramatic improvement for crashes serious enough to trigger an airbag. Driverless Waymos have experienced just five crashes like that, and Waymo estimates that typical human drivers in Phoenix and San Francisco would have experienced 31 airbag crashes over 22 million miles. That implies driverless Waymos are one-sixth as likely as human drivers to experience this type of crash. The new data comes at a critical time for Waymo, which is rapidly scaling up its robotaxi service. A year ago, Waymo was providing 10,000 rides per week. Last month, Waymo announced it was providing 100,000 rides per week. We can expect more growth in the coming months.
So it really matters whether Waymo is making our roads safer or more dangerous. And all the evidence so far suggests that it's making them safer. It's not just the small number of crashes Waymo vehicles experience -- it's also the nature of those crashes. Out of the 23 most serious Waymo crashes, 16 involved a human driver rear-ending a Waymo. Three others involved a human-driven car running a red light before hitting a Waymo. There were no serious crashes where a Waymo ran a red light, rear-ended another car, or engaged in other clear-cut misbehavior.
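The ratios quoted above can be checked directly from the summary's own figures (22 million miles, 20 injury crashes, 5 airbag crashes, and Waymo's human-driver estimates of 64 and 31). A minimal back-of-the-envelope sketch in Python; nothing here is new data:

    # Figures taken directly from the summary above.
    waymo_miles = 22e6           # miles driven by driverless Waymos
    injury_crashes = 20          # injury-causing crashes
    airbag_crashes = 5           # crashes that triggered an airbag
    human_injury_est = 64        # Waymo's estimate for human drivers, same miles
    human_airbag_est = 31

    print(injury_crashes / (waymo_miles / 1e6))    # ~0.91 per million miles
    print(human_injury_est / (waymo_miles / 1e6))  # ~2.91 per million miles
    print(injury_crashes / human_injury_est)       # ~0.31 -> "less than one-third"
    print(airbag_crashes / human_airbag_est)       # ~0.16 -> "about one-sixth"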
Obvious cause (Score:5, Funny)
"That car looks strange -- I can't see a driver -- I'd better get closer so I can see what is going on."
Re: (Score:2)
Re: (Score:2)
Waymo cars have learned how to stoop and squat.
You mean swoop and squat. And no, I don't think Waymo cars have learned that. How would that get into their neural nets as a reward?
[I didn't see a <sarcasm> tag, so I'm reading your post prima facie.]
Re:Obvious cause (Score:5, Interesting)
Waymo vehicles are coded to follow the law as written. Humans anticipate that the car will not follow the law (because they would not follow the law as written) and act accordingly, crashing into the Waymo vehicle.
Tesla is paying people specifically to train its autopilot to ignore laws so that it drives more like a human would.
In the short term, this will cause the Teslas to "fit it" better with human drivers, but will not provide the long-term safety improvements that the Waymo method will.
As it stands, the Waymo vehicles are already safer than human drivers, but only in the limited areas they are trained for. They are far from a fully capable, go-anywhere-under-any-circumstances, self-driving vehicle, but they are expanding the regions they cover as they master the existing regions. It is a slow-but-safe method of progress.
Re: (Score:2)
..."fit in" better with human drivers...
train it on the song I can't drive 55! (Score:3)
train it on the song I can't drive 55!
Obvious cause, isn’t it? (Score:4, Insightful)
To bring up one type of crash specifically (rear-ending) tends to imply a couple of possibilities for the cause. One of which is determining just how safely the driverless car is slowing and stopping. I’d like more detail on that aspect before we start believing every autonomous humblebrag about how much safer they are. When you have no meatsack detected in the car at all, does it still drive like it’s protecting a meatsack inside, or does it drive differently (“expeditiously picking up the next rider with maximum efficiency” in marketing-speak)?
I would like to assume human error is the reason for the excessive rear-ending. Let’s see some proof of that.
Re: (Score:2)
Assuming blame in accidents has a 50/50 chance of being on either driver,
If Waymo's vehicles are getting into 1/3rd as many crashes, that accounts for all the crashes their driver would have caused, PLUS avoiding 1/3 of crashes where the other driver is at fault; meaning the system is slowing/stopping even more safely, and should not be accused of unsafe sudden stops.
There are other factors; I think Waymo doesn't even try to drive in snow, etc. If Waymos only drive in the safest situations, their record can be better just from that.
Re: (Score:2)
Assuming blame in accidents has a 50/50 chance of being on either driver,
If Waymo's vehicles are getting into 1/3rd as many crashes, that accounts for all the crashes their driver would have caused, PLUS avoiding 1/3 of crashes where the other driver is at fault; meaning the system is slowing/stopping even more safely, and should not be accused of unsafe sudden stops.
It assumes the entire system as a whole is safer, but does not grant you the ability to merely dismiss any single aspect of it. We may find a glitch where the driverless car blips the brakes before every right-turn signal is engaged, causing accidents that get smothered and covered with blanket statistics. Wrong way to go about this. You analyze every aspect of crashes to avoid them altogether.
There are other factors; I think Waymo doesn't even try to drive in snow, etc. If Waymos only drive in the safest situations, their record can be better just from that.
That would be a biased, limited record, not a better one. Apples to watermelons. I’ve lived in Alaska.
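For what it's worth, the arithmetic in the 50/50 argument above is internally consistent. A minimal sketch, assuming (as the poster does) a 50/50 at-fault split and an overall Waymo rate of one-third the human rate; these are the poster's assumptions, not measured values:

    # Assumptions from the comment above, not measured values.
    human_rate = 1.0                # normalize the human crash rate to 1
    own_fault = 0.5 * human_rate    # crashes the driver causes
    other_fault = 0.5 * human_rate  # crashes the other driver causes
    waymo_rate = human_rate / 3

    # If the Waymo causes none of its own crashes, everything left is
    # other-driver-fault. What fraction of those does it also avoid?
    avoided = 1 - waymo_rate / other_fault
    print(round(avoided, 2))        # 0.33 -- the "1/3" claimed above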
Re: (Score:2)
If you've ever been in Seattle on one of the few days it snows you might well believe that a third of all crashes occur on those days. If you actually know how to drive in snow it's worth finding an intersection at the bottom of a hill and just watch (from a safe distance) these idiots playing 'vehicle billiards'.
Re: (Score:2)
Snowed in Seattle a few weeks ago. Pavement warm to my bare feet while ice fell from the sky, bizarre.
Re: (Score:3)
Letter of the law is deadly. (Score:5, Interesting)
If I were a betting man, I'd say that it's these cars' propensity and ability to follow the letter of the law. The Waymo car detects a yellow light and thinks "Oh, yellow light. I can safely stop for that" where just about every human on the planet thinks "Oh, yellow light. I can make that before it turns red." Those two reactions are incompatible when the one in front is the computer driver.
This is especially deadly if the driverless car, stopping at every yellow light, does not account for the object directly behind it. To clarify, I’m not talking about the arrogant human in a perpetual hurry who can stop at the yellow light and doesn’t for selfish reasons.
I’m talking about real situations, with Waymo assuming that 18-wheeler directly behind their stop-at-all-yellow programming can actually avoid the accident incoming at a physics rate of speed, limited only by DOT load regulations and an honest loadmaster.
My car is equipped with Brembo multi-piston brakes on 15” discs. Just because I can stop on a dime doesn’t mean the 30,000 pounds behind me easily can. Wonder how Waymo accounts for this and runs yellow lights when it should.
Re: (Score:2)
Re: (Score:2)
Agreed. There are those scenarios. I’ve trained myself over many years to glance in the mirror approaching an intersection, just so I’m a bit more aware of why I might not stop at a yellow. Thankfully pretty rare, but would have been a bad day for me a couple of times with the wrong decision.
It’ll be interesting seeing how this develops. No doubt the machine can react faster and generally be safer than humans driving. Reality never fails to be creative though. Murphy’s law and a
Re: (Score:2)
The more common situation is the car behind you has decided that if they ride your ass they'll have just enough time to pass the intersection just before it turns red. So someone who wasn't tailgating before is suddenly very close and possibly accelerating. Combine that with someone ahead who had considered crossing instead deciding that if they slam on the brakes, they have just enough time to stop before the intersection. I've seen plenty of near-collisions of this sort.
Re: (Score:2)
Generally the following automobile should also have plenty of time to stop, except that very often they aren't paying any attention at all. So I give them more leeway if they let me, but some insist on driving too close. I've even had cars that were half a block behind me fail to slow down until the last minute, like they were watching their phone or something.
Also, making it into the intersection when it is red is an infraction in many places; yellow is not a sign to hurry up, it's a sign that yo
Re:Letter of the law is deadly. (Score:5, Insightful)
Not that it helps the car that gets rear-ended in these scenarios; but it bears reminding that if you *can't* stop safely before the intersection when the light turns yellow, then you are driving too fast for safety in the first place, and any resulting accident is entirely your own fault, whether you rear-end someone who did stop, blow into the intersection at a red and T-bone or get T-boned, or if you luck out and nothing happens. Licensed CDL drivers know this. It's part of the training.
Besides, Waymos drive very conservatively (slowly) in the first place. I would expect that if you can't safely brake to a stop behind one in whatever circumstance, then in addition to driving unsafely, you've not kept your own vehicle properly maintained either. And that, as before, is entirely on you; not the Waymo.
Re: (Score:3)
I’m talking about real situations, with Waymo assuming that 18-wheeler directly behind their stop-at-all-yellow programming can actually avoid the accident incoming at a physics rate of speed, limited only by DOT load regulations and an honest loadmaster.
Does that not imply that the 18-wheeler is travelling at an unsafe speed, or an unsafe following distance?
When you're operating a vehicle that may well kill the driver in front of you if you rear-end them, you especially need to be able to avoid a collision even if they slam on their brakes. We're hypothesizing about it being caused by an autonomous vehicle stopping in a scenario where you wouldn't predict them to do so, but it could just as easily happen to a human driver if a pedestrian steps into traffic, a tree branch falls into the road, etc.
Re: (Score:2)
I’m talking about real situations, with Waymo assuming that 18-wheeler directly behind their stop-at-all-yellow programming can actually avoid the accident incoming at a physics rate of speed, limited only by DOT load regulations and an honest loadmaster.
Does that not imply that the 18-wheeler is travelling at an unsafe speed, or an unsafe following distance?
When you're operating a vehicle that may well kill the driver in front of you if you rear-end them, you especially need to be able to avoid a collision even if they slam on their brakes. We're hypothesizing about it being caused by an autonomous vehicle stopping in a scenario where you wouldn't predict them to do so, but it could just as easily happen to a human driver if a pedestrian steps into traffic, a tree branch falls into the road, etc.
Quite frankly, with the legal load limits on large haulers, none would exceed 20 MPH ever if they had to ensure stopping within the distance allotted. The lines painted at intersections and general driving rules basically allow drivers to stop on a dime if they choose. It ain’t smart, but it’s legal.
50 MPH to zero within a short distance becomes a matter of physics above a certain weight, regardless of your stopping tech. This is why some truck drivers absolutely refuse to carry certain loads (like lar
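To put rough numbers on the physics in this sub-thread: in the idealized model, mass cancels out of the stopping-distance formula; heavy vehicles stop longer because their achievable deceleration is lower (brakes, tires, load). A sketch with assumed, illustrative deceleration limits (roughly 0.8 g for a car with good brakes, 0.4 g for a loaded semi), not measured figures:

    G = 9.81              # m/s^2
    MPH_TO_MS = 0.44704

    def stopping_distance(speed_mph, decel_g, reaction_s=1.5):
        """Reaction distance plus braking distance, in meters."""
        v = speed_mph * MPH_TO_MS
        return v * reaction_s + v**2 / (2 * decel_g * G)

    # Assumed deceleration limits, for illustration only:
    print(f"car (~0.8 g) from 50 mph:         {stopping_distance(50, 0.8):.0f} m")  # ~65 m
    print(f"loaded semi (~0.4 g) from 50 mph: {stopping_distance(50, 0.4):.0f} m")  # ~97 m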
Re: (Score:3)
I've had cars honk loudly at me for just slowing and having the brake lights on because of a very light tap, and they weren't even tailgating. Some people just seem to think that stopping is optional. Around here, if the light turns red before you leave the intersection, it is an infraction, even if it was green when you entered, but most humans still seem to think that as long as it's not red when you enter that it's ok.
It also depends where you drive. I do NOT like to drive in San Francisco, there are
Re: (Score:2)
My car is equipped with Brembo multi-piston brakes on 15” discs. Just because I can stop on a dime doesn’t mean the 30,000 pounds behind me easily can.
One thing for your silly analogy is that the 30,000-pound vehicles are typically not driven by tailgating morons. Just because they have a longer stopping distance doesn't mean they follow as closely as you do with your amazing brakes.
As for your yellow light situation, the rear view mirror is for changing lanes and reversing. What is going on behind you is *never* part of the decision tree of whether to stop at a yellow light or not.
Nope. Mobile phones. (Score:3)
The rear-enders were looking at their phone. It's that simple. AI has nothing to do with it.
"Distracted driving is the main contributor for rear-end accidents. The NHTSA stated that driving while distracted contributed to 60% of rear-end collisions. The reason why this is so common is because distracted drivers often fail to notice stopped or slowing vehicles in front of them, causing a rear-end collision."
( https://www.mccoyandsparks.com... [mccoyandsparks.com] )
Re: (Score:2)
Re:Obvious cause, isn’t it? (Score:4, Informative)
Yellow lights are timed such that if you are going the speed limit, you will have at least 1 second to decide whether to brake or proceed through the intersection.
But as you increase your speed, you have less and less time to decide and react. If you are going fast enough and in the wrong place (called the "dilemma zone" [ssti.us]) when the light turns yellow, you will have negative seconds to decide, in other words you will run the red light unless you accelerate.
So if you are speeding, you should slow down at signaled intersections in order to avoid finding yourself in the dilemma zone.
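A sketch of that geometry, assuming a permissive yellow (it is legal to enter on yellow, with an all-red interval handling intersection clearance). The yellow duration, reaction time, and braking rate below are illustrative assumptions, not any jurisdiction's standards:

    MPH_TO_MS = 0.44704

    def dilemma_zone(speed_mph, yellow_s=4.0, reaction_s=1.0, decel=3.0):
        """Distances (m) from the stop line: closest point at which you can
        still stop, and farthest point from which you can still enter on
        yellow at constant speed."""
        v = speed_mph * MPH_TO_MS
        min_stop = v * reaction_s + v**2 / (2 * decel)
        max_clear = v * yellow_s
        return min_stop, max_clear

    for mph in (35, 45, 55):
        stop, clear = dilemma_zone(mph)
        if stop > clear:
            print(f"{mph} mph: dilemma zone between {clear:.0f} and {stop:.0f} m")
        else:
            print(f"{mph} mph: no dilemma zone")

With these assumed numbers the zone only appears above 35 mph, which is the parent's point: the faster you go, the wider the band of road from which you can neither stop nor legally clear.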
Re: (Score:3, Insightful)
Re:Obvious cause, isn’t it? (Score:5, Insightful)
What do you do when a human driver unexpectedly slams on their brakes because a pedestrian stepped into traffic or a tree branch fell into the road?
If you can't safely stop when the car in front of you brakes as hard as possible, whether or not you have the same opportunity to react to the road situation that prompted them to do so, you're either following too close or are travelling too fast.
Re: (Score:2)
Rear-ending is almost always the fault of the driver in the rear. Yeah, I realize sometimes human drivers will cut someone off and then immediately slam on their brakes, but I doubt the robotaxis are programmed to merge in such an unsafe manner.
Re: (Score:3)
tends to imply a couple of possibilities for the cause. One of which is determining just how safely the driverless car is slowing and stopping.
Actually there is only one single cause for rear-endings: the car behind following too closely to safely stop if something unexpected occurs.
This is legally defined, by the way. It doesn't matter if you slam on the brakes for no reason, just for shits and giggles; if the guy behind you hits you, he is at fault in the eyes of the law and insurance for driving "dangerously" close.
Now, sure, everyone actually does it. Few people leave sufficient stopping distance, and let's face it, everyone with adaptive cruise co
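The "legally defined" gap can be made concrete. A sketch of the minimum following distance when the car ahead can out-brake you; the braking rates are assumed illustrative values, and the result lands close to the familiar two-second rule:

    MPH_TO_MS = 0.44704

    def min_safe_gap(speed_mph, reaction_s=1.5, a_follow=6.0, a_lead=7.5):
        """Minimum gap (m) for the follower to stop short of a lead car
        that brakes harder (a_lead > a_follow), both from the same speed."""
        v = speed_mph * MPH_TO_MS
        return v * reaction_s + v**2 / (2 * a_follow) - v**2 / (2 * a_lead)

    for mph in (30, 45, 60):
        gap = min_safe_gap(mph)
        print(f"{mph} mph: at least {gap:.0f} m (~{gap / (mph * MPH_TO_MS):.1f} s)")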
Re: Obvious cause, isn’t it? (Score:2)
Re: (Score:2)
It's like when I drive the speed limit on the interstate and some moron gets behind me and starts having a shit fit, flashing his lights, then swings around and in front of me, passing about six inches from my bumper, causing me to brake, thus causing the next idiot behind me to do the same thing, and on and on. This causes a caterpillar effect, slowing all the traffic down. It's one of the reasons you get slowdown on the highway and then suddenly you get to where the perceived slowdown is and there's nothi
Re: (Score:3)
It's like when I drive the speed limit on the interstate
You didn't state if you were in the left lane. It's pivotal to assessing the response.
Re: (Score:2)
Tesla's method is not viable for full self-driving, because Tesla would become liable for all the traffic violations that the car makes.
Re: (Score:2)
Waymo vehicles are coded to follow the law as written. Humans anticipate that the car will not follow the law (because they would not follow the law as written) and act accordingly, crashing into the Waymo vehicle.
Tesla is paying people specifically to train its autopilot to ignore laws so that it drives more like a human would.
In the short term, this will cause the Teslas to "fit it" better with human drivers, but will not provide the long-term safety improvements that the Waymo method will.
As it stands, the Waymo vehicles are already safer than human drivers, but only in the limited areas they are trained for. They are far from a fully capable, go-anywhere-under-any-circumstances, self-driving vehicle, but they are expanding the regions they cover as they master the existing regions. It is a slow-but-safe method of progress.
So it would seem like the obvious plan should be to use the Tesla method in the short term, and only once you have some awesome, "like a really good human" driving and mass adoption, pivot to "even safer, but only when not mixed with lots of human drivers" behavior.
Re: (Score:3, Insightful)
More like, this is a city where everyone runs 1/2 a car length apart at 55 mph and the Waymo simply slams on its brakes whenever it detects an error.
Brake checking a car is illegal in most places.
Re: (Score:2)
Do you even know what brake checking means?
Re: (Score:2)
Does that count if you're stopping to avoid debris or dangerous conditions?
Re: (Score:2)
You assume it's "for no reason". There is no information clearly stating that.
Re:Obvious cause (Score:4, Insightful)
Brake checking a car is illegal in most places.
So is not leaving enough distance to safely stop your vehicle.
Re: (Score:2)
Re: (Score:2)
Brake checking a car is illegal in most places.
If you're riding so close to their bumper that a collision was absolutely unavoidable, you're going to get a ticket too. Two wrongs don't make a right.
On the highway, I've once been behind a semi-truck that had its engine seize, and I was able to safely stop without a collision. The vehicle behind me, however, had to ditch into the breakdown lane. Guess who was leaving a safe following distance, and who wasn't?
Re: Obvious cause (Score:2)
Minimum assured clear distance is the law in my area. It basically means that you as the following driver are responsible for being able to safely stop if the car in front of you stops. Even if the clown in front of you pulls the 'net meme "brake check" move.
The brake-check chump can still be cited under the broad "unsafe operation of a motor vehicle," but for all these Waymo rear-endings it's the human at fault.
Re:Obvious cause (Score:4, Interesting)
I did contract work for a self-driving car company. They had the same problem with getting rear-ended because the cars drive conservatively and are more likely to stop out of caution. It got so bad that during test runs a chase car would follow behind.
It all comes down to people tailgating or looking at their phones instead of the road.
Re: (Score:2)
Did you also have a chase-chase car to protect your chase car? Or was the chase car rear-ended less for mysterious reasons?
Is tailgating. (Score:4, Insightful)
Most drivers follow too closely. Of course, when you follow at a nice safe distance, that safety zone will be filled by another car if it's a double-lane road.
Re: (Score:2)
If someone is tailgating me I slow down until the distance they're following me at is a safe distance for that speed. It's very effective.
Re: (Score:2)
Re: Obvious cause (Score:2)
Re: (Score:3)
"That car looks strange -- I can't see a driver -- I'd better get closer so I can see what is going on."
Seems like they're set up to drive like a CRV or RAV4 driver. Simple driver classification:
Not Surprising At All (Score:2)
This doesn't surprise me in the least. First, human drivers are terrible, so the robots don't even have to be *that* good to do better than us ;)
But second, they've been testing these things for many years now (I can confirm that I've seen them driving around Mountain View for years before they went to SF). Furthermore, the government of San Francisco is not the kind that would let a corporation safety test on their citizens. I strongly suspect their internal data showed a much better-than-human accident
Parking (Score:3)
Bad sample set (Score:3)
The general driving public is not a fair sample set to compare to. Give me the comparison of them to JUST cabbies (the people they're supposed to replace) and we'll see what happens.
Re:Bad sample set (Score:5, Informative)
Taxis Are Safer Than Private Cars: Over every 1 million miles driven, there are 4.6 cab crashes, 3.7 livery car crashes, and 6.7 crashes with private cars. In spite of this data, most people think that private cars are safer than taxis, but that perception is not completely misguided.
https://www.helpinginjured.com... [helpinginjured.com]
Re: (Score:2)
Brake check! (Score:4, Funny)
You need to check your brakes waymo often.
Re: (Score:2)
You need to check your brakes waymo often.
I would brake check tail-gaters if I didn't care about my car, or the headache that comes after the crash. I friggin hate those idiots.
I guess Waymo software can brake check and let Waymo staff take care of the BS while it takes a break.
Re: (Score:2)
If you ever actually brake check a tailgater and have an accident absolutely do **NOT** say the words "brake check" to either the cops or the insurance company. You won't like what happens afterwards if you do.
Waymos will kill people (Score:3)
They will die of old age getting stuck behind one.
Follow the Waymo (Score:3)
Not 22 million miles (Score:5, Interesting)
What they really do is they have their cars drive around the same loop of blocks in Phoenix 1 million times, and say it's "22 million miles" of driving. It's not wrong, but you can only compare accident statistics against human drivers driving the same route at the same times in the same weather.
Furthermore, the only thing that matters is how reliable the latest version of software and hardware is. Has it been 22 million miles with NO changes to the software or hardware? More likely, it's 21.99 million miles with previous versions, and 0.01 million miles with the latest patch that's going to bug out and kill me or somebody.
Re: (Score:2, Informative)
Re: (Score:2)
So are they comparing "Waymo on the same loop of blocks in Phoenix 1 million times" vs humans throughout Phoenix? I don't know much about this, but your and OP's claims are compatible depending on how the comparison is done.
Re: (Score:2)
It also doesn't cover just some limited number of blocks; it covers 315 square miles of the Phoenix metro:
https://www.forbes.com/sites/b... [forbes.com]
Re: (Score:3)
What they really do is they have their cars drive around the same loop of blocks in Phoenix 1 million times, and say it's "22 million miles" of driving.
Yeah, get a few million miles driving around a city in the northeast during the winter and I'll be impressed.
This is expected, and not a good thing (Score:2)
And all the evidence so far suggests that it's making them safer. It's not just the small number of crashes Waymo vehicles experience -- it's also the nature of those crashes. Out of the 23 most serious Waymo crashes, 16 involved a human driver rear-ending a Waymo
Years ago I was at a red light. The light turned green, I started driving in my manual car, took my foot off the clutch a bit too quickly, stalled, and got rear-ended.
Legally, the accident was 100% the other driver's fault (they weren't paying clos
Re: (Score:2)
I've seen bumper stickers with a tongue-in-cheek warning that the car has a manual transmission and might roll backwards or stall.
Hell, one of my friends has been driving stick for years and last time we went to go grab some car parts he stalled out and we had a good laugh about it. It happens.
Re: (Score:2)
Aside from the occasional weird bugs that lead to things like them mobbing that one street next to the Presidio a while back; they're actually very predictable. Just assume that they will follow the driver's handbook and street & traffic signs & signals to the letter.
Re: (Score:2)
Aside from the occasional weird bugs that lead to things like them mobbing that one street next to the Presidio a while back; they're actually very predictable. Just assume that they will follow the driver's handbook and street & traffic signs & signals to the letter.
I admittedly don't have a ton of experience with them, but the one I saw in SF a few months ago was jerkily creeping through a left-hand turn, most definitely not typical human driver behaviour.
But I think the real issue is that they're necessarily more careful than humans and their default safe behaviour is to stop. Now, in most situations that is the safest behaviour, but it does mean they'll end up unexpectedly slowing or stopping a lot more than humans, and get rear ended as a result.
Re: (Score:2)
Re: (Score:2)
one of the more important rules of driving is to be predictable
I've noticed this is also a problem with drivers from other areas of the world, or myself driving there. The unspoken rules are just a bit different.
Careful with any post by Tim Lee (Score:2)
He's been on ars for years and clearly has an agenda and some hate on this topic.
While the things he says may not be outright lies, often he picks and chooses what he shares and what he hides to give clear preference to his favorites.
Is Ars Technica getting paid to shill Waymo? (Score:2)
IDGAF. Still want nothing whatsoever to do with self-driving cars, and would just as soon they were taken off the road permanently. I sincerely believe they cover up problems with them so they can keep selling the idea.
Re: (Score:2)
And is Slashdot getting paid to lie about articles? From TFS:
[Waymo cars cause] fewer than one injury-causing crash per million miles driven, compared to an estimated 64 crashes by human drivers over the same distance.
From the quoted part of TFS:
[Human drivers] would have caused 64 crashes over those 22 million miles. So Waymo vehicles get into injury-causing crashes less than one-third as often, per mile, as human-driven vehicles.
There's kind of a Big Fucking Difference between 64 injuries per million miles and 64 injuries per 22 million miles, but apparently that is too subtle a distinction for BeauHD.
Re: (Score:2)
Re: (Score:2)
Tim Lee has been shilling for Waymo for at least 5 years. I was so happy when Ars dropped him but I guess they still bring him back for stuff.
All their car coverage is just shilling.
They launder gifts from the auto industry into cash and then claim the coverage isn't paid for. You have to be really careful what you read on Ars these days. If it's not Beth it's probably not good.
Is Waymo driving too conservatively? (Score:2)
The obvious question is why humans keep rear-ending Waymo cars. Just knowing that such collisions occur doesn't necessarily impart blame to either humans or Waymo. However, from my observations, Waymo cars tend to drive conservatively. The early Google cars drove conservatively to an extreme, e.g., five mph under the speed limit or ten mph slower than everyone else, or waiting until a really long gap when making a left turn. These are all legal maneuvers but ones that can increase the probability of col
Same happened to me. (Score:3)
Re: (Score:2)
The last time I got rear-ended the other driver got out of her car and proceeded to berate me for not running the stop sign like she expected. Still shake my head over that one.
Surprised, anyone? (Score:2)
Oh what a surprise. You build the robots so they're timid and never cause accidents, then you run them for millions of miles and find people run into the back of them.
Why? Just because it's normal for that many miles travelled? Because they surprise more assertive drivers and stop when they thought they would go? Or because people are so distracted by their presence that they forget they're driving and drive into them? ... All these questions and more might very well be addressed in the article, but I'll never
Why does slashdot hate paragraphs? (Score:2)
Why do you hate paragraphs so much slashdot?
Sentences are much easier to follow when you break them up into different lines.
Why, Slashdot, Why, can't I just have single line breaks in my comments?
Are humans in similar cars? (Score:2)
Since the self-driving cars are new vehicles, safety comparisons should only be done against new cars of similar cost / specifications. (which would likely include automatic emergency braking, blind-spot monitoring, etc...)
Comparing against the existing fleet is only fair if the self-driving systems are meant to be retrofit onto the existing fleet.
Not a fair comparison (Score:2)
Waymo cars are essentially taxis. Comparing taxis to all other drivers on the road, e.g. high-school kids, frat boys, alcoholics, retirees who are semi-conscious because of the medications they're on, etc., etc., will give a false impression of road safety.
In the same way, university research hospitals, which have higher death rates, are not giving poorer care; they're just dealing with the most difficult & serious cases to treat. Put those patients in a
Can't Wrap My Head Around the Stat (Score:2)
How can the 64 crashes per million miles make sense? A driver who does 10,000 miles a year would do approximately half a million miles over their driving career. So that means they would average 32 crashes over their driving career?? That can't make sense. What am I missing?
Re: (Score:2)
How can the 64 crashes per million miles make sense? A driver who does 10,000 miles a year would do approximately half a million miles over their driving career. So that means they would average 32 crashes over their driving career?? That can't make sense. What am I missing?
Answering my own question, the first part of the Slashdot summary is misleading: "with fewer than one injury-causing crash per million miles driven, compared to an estimated 64 crashes by human drivers over the same distance", when really it is "Waymo estimates that typical drivers in San Francisco and Phoenix would have caused 64 crashes over those 22 million miles."
However, that's still 3 per million miles, or 1.5 over my estimated driver's career of 500,000 miles. That still seems really high.
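Both readings check out numerically (the 500,000-mile career is the poster's own estimate):

    career_miles = 500_000    # ~10,000 miles/year over a driving lifetime
    # Misleading summary reading: 64 injury crashes per 1 million miles
    print(64 / 1e6 * career_miles)     # 32.0 per career -- absurd
    # Article reading: 64 injury crashes per 22 million miles
    print(64 / 22e6 * career_miles)    # ~1.45 per career -- the ~1.5 above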
Re: (Score:2)
A funnier headline (Score:3)
“Waymo Takes It In The Rear!”
Re: (Score:2)
Re:because (Score:4, Informative)
they drive like an autistic dr. spock, not how actual humans drive
I hope you realize there's a difference between Dr. Spock [wikipedia.org] and Mr. Spock. [wikipedia.org]
Re:because (Score:4, Insightful)
not how actual humans drive
Like assholes?
Judgemental language aside, human drivers exhibit a "body language" in how they drive. For example, if somebody is planning to change lanes, they'll often drift a bit as they check over their shoulder. If somebody is getting ready to stop in a city/downtown kind of environment, you'll often notice their speed reduce as their attention shifts and they scan for a place to stop.
My personal theory is that part of the reason that everybody thinks every other locale's drivers are assholes is because different areas have different driving body language, and people misinterpret it. In rural areas, people following closer than 1 second per 10 mph are seen as asshole tailgaters. In places with more crowded freeways, you're seen as the asshole for not using that space efficiently if you tried to keep that following distance (plus you'd constantly have people cutting in front of you until you were doing 30 in a 65). There are plenty of other unspoken things like usage and duration of turn signals, when and how violently brakes are applied, etc.
Programming in this kind of body language would make the Waymos safer around the other human drivers, but since we're talking Silicon Valley software developers, half of whom probably don't even drive and all of whom are probably on the spectrum, I'd be surprised if they're even aware of it.
Re: (Score:2)
The larger problem is, as you say, it's regional. What region should be programmed in?
Re:because (Score:4, Insightful)
All of them. Traffic laws and rules for signage etc are also regional, and the car needs to be programmed for all of them.
For example, in Scotland you will sometimes see a (70) sign in situations where you would see a ( / ) sign in England. I'm pretty sure there are no (70) signs in England.
The reason for this is that ( / ) means the National Speed Limit for that particular type of road applies. A sign with a number in it means a local speed limit of that number of mph applies.
In England, every type of road has a national speed limit, which is never more than 70mph, hence you never see a local speed limit of (70).
In Scotland, there is no national speed limit for motorways, so every stretch of motorway has a local speed limit, mostly (70), lower in some urban areas. The Scottish government doesn't have the power to set national speed limits, only the Westminster Government. The Westminster Government wants to leave decisions on motorway speed limits in Scotland to the Scottish Government. So that's why this situation exists.
Re: (Score:2)
Thanks for using that example. I was in Scotland this summer, and had wondered about the '70' signs.
Re: (Score:3)
The larger problem is, as you say, it's regional. What region should be programmed in?
The region it's operating in. Waymo cars are already locked in to their specific region. They see "hyper accurate map" as just another one of the cars sensors, i.e. as it is currently, if you moved one away from the heavily mapped parts of the bay area it was currently designed to operate in, its functionality and reliability would be severely degraded.
Re: (Score:3)
Now everybody hollers if they put them anywhere but school zones. Because everybody needs to be able to break the law in their deadly weapons all the time.
Hardly. People would do a whole lot less "hollering" if there wasn't well documented abuse of red light cameras as revenue generators, e.g. the duration of the yellow being shortened (sometimes to an illegal/unsafe level) shortly after the cameras were installed. They should also be required to have a wide angle shot (or better yet, video clip showing a few seconds before/after) of the intersection and surrounding traffic to give context, and have an actual human looking at it and deciding "yes, this is ac
Re: (Score:2)
The difference in locales matches my experiences.
For many years I lived in one big city and then moved to another one about 500 miles away. In the first city when you were merging onto the freeway, cars already on the freeway mostly just ignored you (the merging car) - they didn't try to stop you, but they also didn't "help" you by slowing down to create a gap for you. In the second city drivers already on the freeway were much, much more likely to try to "help" a merging car.
In the first city, I was used t
Re: (Score:2)
not how actual humans drive
Like assholes?
*looks at the smartphone glued to damn near every driver's hand*
You misspelled addicts.
Re: (Score:2)
Not everyone drives like a cop.
Re: (Score:3)
I think in the not-so-long run these things are going to wipe out a LOT of things. People will use a robotaxi for virtually every trip. Even to the deep countryside as it will be easy and cheap to rent a taxi to just stay with you until you go back.
This will wipe out private car ownership. Only collectors and people with jobs that require a specialized vehicle will own one. A robotaxi is more convenient and can get closer than you can park a private car. It will also wipe out lots of stuff that relies on private ca
Re: (Score:2)
There are a **LOT** of jobs that are going away in the next decade or two, and you're right, very few people are talking about it. There are a lot of people in the world who can't do anything more complex than walk a security guard patrol, wash dishes, or sweep floors, and when those jobs are gone there aren't any no-skills no-education jobs coming to replace them. When the farmer bought a tractor the farm hand could go work at the factory. When the factory got a robot to screw on lug nuts the worker cou
Re: (Score:2)
In up-to-date equipment (with modern safety features...), since they are not retrofitting the self-driving onto old vehicles.