Uber Halts Self-Driving Car Tests in Arizona After Friday Night Collision (businessinsider.com)
"Given that the Uber vehicle has flipped onto its side it looks to be a high speed crash," writes TechCrunch, though Business Insider reports that no one was seriously injured. An anonymous reader quotes their report:
A self-driving Uber car was involved in an accident on Friday night in Tempe, Arizona, in one of the most serious incidents to date involving the growing fleet of autonomous vehicles being tested on U.S. roads... Uber has halted its self-driving-car pilot in Arizona and is investigating what caused the incident... A Tempe police spokesperson told Bloomberg that the Uber was not at fault in the accident and was hit by another car which failed to yield. Still, the collision will likely turn up the temperature on the heated debate about the safety of self-driving cars.
Not all wrecks can be avoided (Score:4, Insightful)
Conversations would be different if the Uber car was at fault, but not all accidents can be avoided.
Re:Not all wrecks can be avoided (Score:4, Interesting)
Conversations would be different if the Uber car was at fault, but not all accidents can be avoided.
Conversations will be different when the autonomous car is at fault due to a hack, as prioritizing security over everything else is usually avoided.
Re:Not all wrecks can be avoided (Score:5, Insightful)
I appreciate the point that, statistically, this *will* happen as some accidents are unavoidable. You're absolutely correct and we should look at the bigger picture.
However, I'm skeptical of reports where the self-driving car is not at fault because the other car "failed to yield". Being legally in the right doesn't necessarily mean the car was driving well or defensively, and these are the particular situations where a human might have been clued in to the other driver's behavior and avoided it entirely.
Re: (Score:3)
On the other hand, self driving cars don't get mad at other drivers making a mistake and try to get back at them, causing all kinds of dangerous situations.
Re: (Score:2)
On the other hand, self driving cars don't get mad at other drivers making a mistake and try to get back at them, causing all kinds of dangerous situations.
Oh, yeah? Says who . . . ? An autonomous vehicle might be programmed to drive "aggressively" to get through traffic jams faster. They'll give the feature some innocuous title like, "affirmative driving".
What you'll end up with is autonomous vehicles playing "chicken" with each other. An autonomous vehicle will not win any races by driving cautiously.
Anyway, the point is moot, because Über is not at fault in the same way that Über is not a taxi company. Über is a newfangled economy comp
Re:Not all wrecks can be avoided (Score:5, Insightful)
If everybody drives aggressively, traffic jams will get worse. When there are sufficient self-driving vehicles, they'll probably come up with some communication protocol so they can synchronize their strategies and achieve optimal road use, benefiting everybody.
Re: Not all wrecks can be avoided (Score:2)
Self-driving cars are usually not being programmed by their target audience. The car's developers can scale back the "selfishness" that exploits other vehicles driving in a cooperative manner. Their incentive is to make life good for their customers overall, not for their greediest customers at the expense of the rest.
Re: (Score:2)
"You going to sell a passively driving SDC to THAT consumer with claptrap about 'the greater good'? They'll get in their big gas guzzling SUV, take a big swig of beer, give you the finger and run you over just for suggesting it."
Ultimately, the self-driving cars will all be owned by fleet operators whose concern for personal ego will be overridden by their corporate lawyers.
Re: (Score:3)
The only question that matters is whether a human would have avoided the accident. If they could have easily, then this accident was caused by self-driving. It doesn't matter what side of the law Uber was on.
A good driver absolutely would have avoided that collision.
The problem with computers is that they don't take into account that people will break the rules and do stupid things; a defensive driver assumes someone will do the dumbest thing possible. Drivers who are about to pull out in front of you give a lot of cues to their behaviour, everything from the way they're looking at you to rocking back and forth to revving and creeping. A good driver learns to pick up on these cues.
The thing is, AI isn't s
Re: (Score:2)
It's also well established that there is a level of competence that is expected, along with reasonable reactions, etc., and if you're determined not to have met it in a particular situation, you'll be at fault.
Every self-driving car is FAR more aware of its surroundings than a sizable majority of the humans driving, myself included.
Re: (Score:3)
Wrong, a self-driving car is not *aware* of anything at all; it's software doing its function. There is also a big difference between what a car scans (some might erroneously call this being 'aware of') and what the car then recognises. If you watch Google car videos, you'll see that what the car actually recognises is simply defined as some moving cubes, some static items, etc.; sometimes these things are highlighted as being further categorised, sometimes they aren't.
Autonomous vehicles have got a long long way
Re: (Score:3)
I think you're massively over-estimating the capabilities of self-driving cars. I do not mean souls at all; when I say I do not mean souls, that means I do not mean souls. You're the one talking about souls, not me. If you look at the definition of aware, it says nothing about souls. I'm not religious and I'm not coming at this from any kind of spiritual angle.
These cars do not have situational awareness, they don't know 1% of what your average adult knows about the world around them and as such can't make t
Re: (Score:2)
Assuming you like pizza, when you say you like pizza, do you mean the lowest common denominator pizza that may have human excrement for a topping or do you mean your general perception of pizza? It's a fairly simple concept.
That analogy only works if you say that the pizza has to be better in every way: better pie, better crust, better sauce, better cheese, better ham... I expect a self-driving car to meticulously obey the rules of the road, be extremely consistent in its driving and have superior reaction time. But to analyze all aspects of the human condition and flag all signs that another driver or pedestrian may not be inclined to follow the rules better than a human sounds unlikely. But if overall it has fewer accidents a
Re: (Score:2)
Having gone through hell with one of the sickest insurance corporations, Allianz, those cunts had not the slightest qualm in wanting to halve my claim by saying I failed to take evasive action when the other person turned through a red light right in front of me. Those arseholes at Allianz even put that shit in writing to halve the claim, so fuck off with your 'idiotic' claim. Always this bullshit with corporations: never their fault, always everyone else's fault.
PS avoid Allianz like the plague they will alw
Re: (Score:2)
and these are the particular situations where a human might have been clued in to the other driver's behavior and avoided it entirely.
Based on what I've seen of other self driving cars this is something that computers will very quickly be better at than humans as well. In aggregate, humans are frigging horrible at situational awareness on the road and even worse at driving defensively (tip: Defensive driving is the opposite of being a tailgating jackass).
Re: (Score:2)
Conversations would be different if the Uber car was at fault, but not all accidents can be avoided.
But there are also accidents that one could have avoided even if one were not at fault. This could very well be one of those cases. I have avoided accidents where another driver has not properly yielded more than a few times. It's a matter of not trusting the other driver to do the right thing. It's the kind of thing that is very hard to program into an automated system.
Re: (Score:2)
It's the kind of thing that is very hard to program into an automated system.
It is absolutely not hard to program into an automated system, facepalm.
Avoid collisions, easy.
Re: (Score:3)
It is absolutely not hard to program into an automated system, facepalm. Avoid collisions, easy.
So easy, even Uber can do it.... oh wait a minute. Maybe avoiding collisions is easier when they are anticipated.
Re: (Score:2, Insightful)
A newcomer like Uber cannot do it.
Why newcomers get permission to test their bullshit on real roads when Audi, Toyota, BMW, Mercedes, etc. have had self-driving car programs for decades is beyond me.
Re: (Score:3)
Conversations would be different if the uber car was at fault but not all accidents can be avoided.
There are accidents that nobody could avoid, there are accidents that I cannot avoid, and accidents that could be avoided.
Obviously self-driving cars will initially have to be clever enough to cause accidents only very, very rarely. At some point, when this is achieved, they will also try to avoid avoidable accidents where someone else is at fault.
Yeah, but (Score:3)
The other car was a Tesla in autonomous mode whose driver was watching a Disney movie.
The self-driving car is blamed for human error (Score:5, Insightful)
I am reminded that when cars were first invented, there were laws put in place mandating that someone walk ahead of any self-propelled vehicle waving a red flag [wikipedia.org], for fear of scaring horses and making people uncomfortable.
I'm sure that in one hundred years this sort of reaction - blaming the software for an inattentive driver failing to yield - will be seen in exactly the same way.
Re: (Score:3)
Note that the Uber car did NOT crash into the car that cut them off. The car doing the "cutting off" ran into the Uber car (I'm assuming it hit the Uber car on its side, since TFA refers to the Uber car being knocked over on its side).
Now, if the human driver of the other vehicle decided that the Uber car had "cut him off" and crashed into the Uber car on purpose, that would fit your description nicely.
Alas, the Uber car had righ
Re: (Score:2)
The most hilarious part of that wiki entry is the Virginia proposal requiring drivers to rapidly disassemble their car and hide the parts behind bushes at the first sign of livestock. Would have become law if not vetoed by the Governor.
It fascinates me that we haven't really progressed at all as a species in 120 years. People will be up in arms at the first sign of autonomous vehicles crashing, even if and when they're literally proven to be say 100x safer than humans. You will have websites popping up with
Re: (Score:3)
Who cares? If the net result is fewer deaths, then that's the net result. Your fear of the new and Luddite instincts don't factor in.
Rear-view mirror. (Score:3)
I am reminded that when cars were first invented, there were laws put in place mandating that someone walk ahead of any self-propelled vehicle waving a red flag, for fear of scaring horses and making people uncomfortable.
Not automobiles as we know them.
Steam powered road tractors, mammoth agricultural tractors and heavy construction equipment. Ca. 1860-1896. Think township or county roads that were dirt or gravel tracks barely more than a single lane wide. Now do you know why you needed a flag man?
Most accidents have multiple causes (Score:4, Insightful)
Airline accident investigations are really good at demonstrating how an entire chain of events led up to the accident. And that any single factor happening differently could've prevented the accident. e.g. The Concorde crash was caused by (1) debris on the runway from a faulty repair on a previous plane, (2) failure of the Concorde's tires when it struck the debris, (3) failure of the undercarriage to withstand tire debris striking it from a blowout at take-off speed, (4) the manufacturer not making any procedures or provisions to recover from a double engine failure on a single side because it was considered so unlikely. Any one of these things doesn't happen and the Concorde doesn't crash.
Safety systems layer multiple accident-avoidance measures on top of each other. This redundancy means that only when all of those measures fail is there an accident. Consequently, even if the self-driving car was not legally at fault, that it was involved in an accident still points to a possible problem. e.g. If I'm approaching an intersection and I have a green light, I don't just blindly pass through because the law says I have the right of way. I take a quick glance to the left and right to make sure nobody is going to run their red light, or that there aren't emergency vehicles approaching which might run the red light, or that there's nobody in the crosswalk parallel to me who might suddenly enter into my lane (cyclist falls over, dog or child runs out of crosswalk, etc).
So even if the autonomous car wasn't legally at fault, that's not the same thing as saying it did nothing wrong. There may still be lessons to learn, safety systems which were supposed to work but didn't, ways to improve the autonomous car to prevent similar accidents in the future.
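The "layered safety" idea above can be sketched in code: legal right of way is just one layer, and every independent check must pass before the vehicle proceeds. This is a toy illustration only; the function and its inputs are hypothetical, not taken from any real autonomous-driving stack.

```python
# Toy "defense in depth" check for entering an intersection.
# All names here (safe_to_enter, has_green, etc.) are hypothetical.

def safe_to_enter(has_green: bool,
                  cross_traffic_clear: bool,
                  no_emergency_vehicles: bool,
                  crosswalk_clear: bool) -> bool:
    """Legal right of way is necessary but not sufficient:
    every independent safety layer must also pass."""
    layers = [has_green, cross_traffic_clear,
              no_emergency_vehicles, crosswalk_clear]
    return all(layers)

# A green light alone does not clear the car to proceed:
print(safe_to_enter(True, False, True, True))  # False: cross traffic present
print(safe_to_enter(True, True, True, True))   # True: all layers pass
```

An accident then requires every one of these layers to fail at once, which is exactly why involvement in a crash, even without legal fault, is worth investigating.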
Re: (Score:2)
I would not blame software for anything but inevitable bugs in the software. Just think, this software is in its testing stage, wouldn't you agree that it may have bugs? As far as conspiracy theories go, I would rather blame the police for covering up a corporate mishap here.
Re: (Score:2)
I am reminded that when cars were first invented, there were laws put in place mandating that someone walk ahead of any self-propelled vehicle waving a red flag [wikipedia.org], for fear of scaring horses and making people uncomfortable.
I'm sure that in one hundred years this sort of reaction - blaming the software for an inattentive driver failing to yield - will be seen in exactly the same way.
The two situations are not comparable.
When the automobile was invented it wouldn't take more than a handful of real world experiments to determine that red flag laws were unnecessary.
But demonstrating that current self-driving car technology is as safe as a human driver is a much tougher challenge, and I'm not convinced that it's a challenge they're taking seriously.
Re: (Score:2)
Firstly, an autonomous car is not driven by a (strong) AI; barely half of the algorithms even count as weak AI.
Secondly, it is of course programmed to avoid crashes/collisions at all costs.
What else? Why people believe otherwise is beyond me. Even if no one is injured, the hassle with the insurance companies to get the damage to the cars sorted out is something no one wants to have.
Pioneers (Score:2)
"Fuck you California, we are pioneers, we'll go to Arizona where they welcome pioneers!"
* THHHUNK! *
"Ahhh, arrow in my back, arrow in my back! Help!"
Finally (Score:2)
Just like with Google Glass, people can't stand progress. I was wondering how long until people started intentionally crashing into autonomous cars.
Re: (Score:2)
Congratulations, you've won the prize for being the most clueless thing I've read today.
Tipped over does not imply speed (Score:5, Informative)
"Given that the Uber vehicle has flipped onto its side it looks to be a high speed crash, which suggests a pretty serious incident..."
In one past life I learned accident investigation and in another extricated victims, both dead and alive, from vehicle collisions. I have to call malarkey on the "high-speed" claim.
Cars can tip over at very low speed. I've seen at least two such crashes within two blocks of my house. In one, a driver ran a stop sign and clipped a small SUV which tipped over onto the opposite sidewalk. The entire accident scene covered, perhaps, 30 feet edge to edge.
In the other, a driver drifted into the parking lane sideswiping a parked car such that the door-panels hooked which caused the car to rotate then roll.
The "high-speed" car in both cases was traveling 20-30mph.
Though the provided photo does not show a large surrounding area, neither car looks crushed - just some body-panel denting and debris is right next to the car vs. scattered down the roadway and "nobody was seriously injured."
Nothing about this suggests high-speed.
Re: (Score:2)
Aren't SUVs with their high centre of mass a known rollover hazard? I very much remember the "moose test" videos of about a decade ago: basically making a very sudden, sharp turn at fairly high speed to avoid a moose, causing most SUVs to roll over.
Re: (Score:3)
Aren't SUVs with their high centre of mass a known rollover hazard? I very much remember the "moose test" videos of about a decade ago: basically making a very sudden, sharp turn at fairly high speed to avoid a moose, causing most SUVs to roll over.
Yes, the high risk of rollovers in SUVs has been known for years. However, because the *NCAP programs don't bother with rollover tests (or rear-end tests; it's not like nose-to-tail collisions are the most common type or anything), this risk is ignored by manufacturers, who are making a lot of money selling jacked-up hatchbacks.
Re: (Score:3, Funny)
Don't be silly...Uber isn't raping anybody. Uber is just trying to innovate here.
Right now only the passenger can give the driver oral sex, but what if the driver wants to give the passenger oral sex too? Thus you need self driving capability. Just think about it: You no longer need to jack off before work, instead somebody else does it for you on the way to work while you check your email. This saves a lot of time off of your busy day by doing three things at once.
Isn't the gig economy wonderful?
Re: (Score:3)
OMG.
Why is still happening? Why won't Uber stop raping their female employees?!?
They should let them write code for the self-driving cars instead. That way, just like in real life, female drivers would cause accidents but it's the male driver that has to swerve or veer to avoid a collision that would be at fault.
Re:So backwards... (Score:4, Insightful)
The human made a mistake yes, but the self-driving car crashed into him. So now the question is whether a human would have done better in that situation.
It's a given there will be some instances where a human driver might have done better than a self driving auto. In the same vein, the possibility also exists that the human driver may have done worse in the identical situation.
If driver-less autos can perform appreciably better than humans do over a large enough sample size, they should then be considered a safe alternative... the only question is how much better they need to perform.
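The "appreciably better over a large enough sample size" test boils down to comparing exposure-normalized crash rates. A minimal sketch, with entirely made-up numbers and a hypothetical improvement threshold:

```python
# Toy comparison of crash rates per million miles driven.
# All figures are invented for illustration; a real evaluation needs
# large, comparable samples of human and autonomous driving.

def rate_per_million_miles(crashes: int, miles: float) -> float:
    """Crashes normalized by exposure (miles driven)."""
    return crashes / (miles / 1e6)

human_rate = rate_per_million_miles(crashes=200, miles=50e6)  # 4.0 per M miles
av_rate = rate_per_million_miles(crashes=30, miles=15e6)      # 2.0 per M miles

# "How much better they need to perform" is a policy choice;
# here we arbitrarily require a 2x improvement over the human baseline.
required_improvement = 2.0
acceptable = human_rate / av_rate >= required_improvement
print(acceptable)  # True under these made-up numbers
```

The point of normalizing by miles is that raw crash counts mean nothing when the two fleets drive vastly different distances under different conditions.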
Re:So backwards... (Score:5, Insightful)
But. If we concede that any human life = any other human life, and the widespread use of driver-less vehicles prevents X accidents and Y highway fatalities over the same number of driven miles, then it has prevented more accidents and saved more human lives than it has cost.
If we set the bar at ZERO accidents a human could've avoided, well, that is an impossibly high standard; and self-driving vehicles should be shelved right now.
Re: (Score:2)
they can't be making mistakes that most humans would not have made
For sure, they will make mistakes, and they will make mistakes that some humans would not have made in the same situation. On the other hand, they'll avoid other mistakes that humans often make. The only thing that makes sense is to compare overall statistics, not zoom in on particular cases.
If you compare male and female drivers, or old and young drivers, I'm sure you'll also see different patterns of mistakes. That doesn't mean that one group should be allowed to drive, while the other group shouldn't.
Re:So backwards... (Score:5, Insightful)
Automated cars just need to become as good as an average human who is paying attention, awake, and sober. That will make them better than 80% of the cars on the road. Good enough for mass use at that point.
Don't let the perfect be the enemy of the good.
Re: (Score:2)
If humans were good at avoiding accidents, we would not have so many accidents in the first place.
And "avoiding an accident" implies: there is an accident about to happen, and only by luck/precaution/whatever can the other involved party avoid it. When all cars are autonomous, accidents will be thinkable only in the most obscure situations; I cannot even imagine one right now (obviously software/hardware failure is an option).
Re: (Score:2)
They see properly at night. ... The article, btw, is about "seeing better", not about "not seeing properly."
Just not the American fake-news autonomous cars.
Re:So backwards... (Score:5, Insightful)
I concede that companies should not profit from products that might kill people. That is all that I concede.
You might as well just say that companies shouldn't be allowed to sell anything. Read the safety labels on anything you buy these days.
Re:So backwards... (Score:4, Insightful)
Well, now you're making a very different argument than the original "companies should not profit from products that might kill people." But I'll bite anyway. There are plenty of products that, though used correctly, can under some circumstances cause injury or death.
A very obvious one is medication. There are many medications that can have serious side effects, including death, when taken exactly as prescribed. We continue to use them because the benefits outweigh the risks.
You mentioned chainsaws. It is true that the majority of chainsaw accidents happen because of operator error. However, that doesn't mean that all of them do. The only way to completely eliminate the possibility of harm is to not use a chainsaw. But again, we continue to use them because the benefits are big enough.
There does need to be a standard for how safe autonomous vehicles need to be before we allow them on the roads. But setting that standard at "they need to never cause a death" is not only unrealistic, it is totally inconsistent with how our society deals with other potentially dangerous products.
Re: (Score:2)
You can use many medications as directed and die. Doctors have been complaining about Tylenol usage being a leading cause of liver failure for years now. People die during routine dental operations. There is risk all around you. You can walk outside and get hit by a bus. A bus driver can fall asleep at the wheel and drive his bus into your living room. Right now. Life kills.
Re: (Score:2)
Because there are medications out there that occasionally kill you even if you use them exactly as prescribed and they are prescribed exactly as recommended and approved. We use them because they don't kill that often. Even Ibuprofen can very rarely cause a life threatening reaction with lasting consequences.
I have heard of people injured by a chainsaw when they hit something embedded in a tree.
Re: (Score:2)
So you'd be OK with a company if it sells such products at cost or at a loss (say while they are still working out the software bugs that might kill people)? What does profit have to do with it?
Re: (Score:2)
Sure, but at the same time you'll be really relieved if you screw up and an autonomous vehicle acts with inhumanly fast reflexes and saves you and your passengers from a collision.
Re: (Score:2)
So you're incapable of screwing up? No chance a medical problem might surprise you? Simply can't happen?
Re: (Score:2)
Since the Uber cars have not been at fault in an accident, the best you can hope for is an equally good record unless you can show that the Uber car should have been able to avoid the accident. I don't think there's enough information out there to even guess about that.
Re: (Score:2)
Automated cars will improve over time. You'll go the other way. The crossover point is only a matter of time.
Also, you're a terrible driver - you just suck at it totally. Well, that's a safe assumption, given you're a human who thinks he's good at driving.
Re: (Score:2)
The problem here is that of course you only hear about that one incident where a human driver could have prevented an accident where the robot failed. You do not hear about the 100 incidents where the robot avoided an accident where the human might not have, which makes sense, as nothing happened in the other 100 cases. In the same line, you don't hear about the thousands of flights that perform without a hitch every day, but whenever there is an accident with an airplane you hear about it. You probably hear m
Re: (Score:2)
You're ignoring the other side of the equation - what about the accidents that the autonomous cars don't get in that a human would have?
Wake me up when it happens.
Re: (Score:2)
Why? As long as the car knows the limits of its abilities, it's a tier above human drivers. That's Volvo's goal: not to be able to work in every possible condition, but to pull safely off the road when it can't. That's enough to allow the human to stop paying attention.
Re: (Score:2)
If driver-less autos can perform appreciably better than humans do over a large enough sample size, they should then be considered a safe alternative..
Appreciably safer than the average driver might not be safer than some drivers.
Re: (Score:2)
Correct, but we're not stopping the average driver, so why should we stop a better-than-average one?
Re: So backwards... (Score:2)
While a valid question, what is missing here is the big picture: how many mistakes does a human make on average, vs. how many mistakes an autonomous car makes? Think seatbelts: while everyone agrees they save lives, no one disputes that in some situations (e.g. the car is submerged) they kill people. And yet most people agree that seatbelts are good and it's better to use them.
Re: (Score:2)
Autonomous vehicles will always have to deal with surprises. If not human drivers screwing up, kids and animals running into the road, falling trees, sinkholes, rockslides, flooding, etc.
Re: (Score:2)
I agree that this was not the Uber car's fault (after all, that was the official finding). However, failure to avoid an accident never makes it your fault if it was the other driver that violated the rules of the road.