Uber Vehicle Saw But Ignored Woman It Struck, Report Says (engadget.com) 323
gollum123 writes: Uber has reportedly discovered that the fatal crash involving one of its prototype self-driving cars was probably caused by software faultily set up to ignore objects in the road, sources told The Information. Specifically, the system was set up to ignore objects it should have attended to; Herzberg seems to have been detected but dismissed as a false positive.
Oops! We left it in murder mode. (Score:5, Funny)
Re: Oops! We left it in murder mode. (Score:2, Insightful)
Not a bug, a feature! Cleaning up the streets one (homeless) cyclist at a time.
Functions as designed.
Re: Oops! We left it in murder mode. (Score:5, Insightful)
Actually, there are a number of videos that people took from their cars on the following nights that show the area as well lit, and it seems unlikely a moderately attentive human would have hit her. The video released from the car camera does not appear to be representative of what a human would have perceived. Notice that the "safety" driver obviously could see her easily when she glanced up from whatever she was distracted by. I am, of course, assuming that the street lighting hadn't suffered a massive failure that night and been restored the next few nights.
Comment removed (Score:5, Insightful)
Re: (Score:2)
Surely even Uber can program a vehicle to hunt down a target even if he, she, or it isn't in the road. I'm guessing that capability is scheduled for a future software upgrade.
Re:Oops! We left it in murder mode. (Score:5, Funny)
Re: (Score:3)
That will be her epitaph: "False Positive".
So who is to blame? (Score:4, Interesting)
Re:So who is to blame? (Score:5, Interesting)
Probably the person playing on their phone as it was their job to override decisions made by buggy software.
Also the video Uber released is highly altered. I drive on that street frequently and it is very well lit.
Re: (Score:2)
Re: (Score:2)
Wrong. At the speed the car was moving, the operator should have had the vehicle applying brakes about three car lengths before it entered the shadow of the bridge (so as not to put the passengers through the windshield). If the operator had punched the brake pedal through the floor as the hood was entering the shadow, the hit would not have been fatal, and a full stop was possible. It is a very wide bridge, several lanes of highway.
Re:So who is to blame? (Score:4, Insightful)
Re: (Score:2)
"But the safety operator was supposed to figure this all out in less than 3 car lengths."
They had a lot more than 3 car lengths to realize the car wasn't slowing down.
Re: (Score:3)
Exactly. The bridge is 16 lane widths, and crossed at almost a 45-degree angle at that. It is very, very wide. Heck, at 45 mph, which is the allowed speed, I need 3 lane widths (effectively 4.24 at that angle) to bring the car to a complete stop.
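The geometry in the parent checks out: crossing lanes at 45 degrees stretches the path by a factor of √2, so 3 lane widths become about 4.24. The claimed stopping distance, though, looks optimistic under textbook assumptions. A rough sanity check (all numbers hypothetical: 3.6 m lane width, 0.7 g braking, 1 s reaction time):

```python
import math

MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mph, reaction_s=1.0, decel_g=0.7):
    """Rough stopping distance: reaction-time travel plus braking from v to 0."""
    v = speed_mph * MPH_TO_MPS          # speed in m/s
    a = decel_g * 9.81                  # deceleration in m/s^2
    return v * reaction_s + v * v / (2 * a)

lane_width_m = 3.6                      # hypothetical lane width

# Crossing at 45 degrees stretches each lane width by sqrt(2).
print(3 * math.sqrt(2))                 # ≈ 4.24 effective lane widths

d = stopping_distance_m(45)
print(d, d / lane_width_m)              # about 50 m, i.e. roughly 14 lane widths
```

Under these assumptions a 45 mph stop takes closer to 14 lane widths than 3, which if anything strengthens the point about how much warning the very wide bridge gave.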
Re: (Score:2)
Refresh my memory... was the next lane over clear? Braking is the wrong action at this distance; if a full stop is in doubt, you should swerve into the open lane instead.
Re:So who is to blame? (Score:5, Insightful)
I could see the car recognizing a potential hazard well in advance of a need to take action - that info should be given to the safety driver. If they in turn take action before the autopilot would have, perhaps an algorithm needs tweaking. And, if the driver sees a potential hazard first, they should be able to provide feedback on that, too, so they can figure out why the human is doing a better job.
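The feedback loop proposed above can be sketched minimally (all names and numbers here are hypothetical, not anything Uber actually runs): log every potential hazard with when the human and the planner each reacted, then flag cases where the human acted first, or alone, as candidates for algorithm review.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HazardEvent:
    """One potential hazard, with the time (seconds) each party reacted."""
    hazard_id: str
    human_brake_t: Optional[float]      # None if the safety driver never intervened
    planned_brake_t: Optional[float]    # None if the planner never intended to act

def review_candidates(events: List[HazardEvent]) -> List[str]:
    """Flag hazards where the safety driver acted before (or instead of) the planner."""
    flagged = []
    for e in events:
        if e.human_brake_t is None:
            continue                    # autonomy-only event, nothing to compare
        if e.planned_brake_t is None or e.human_brake_t < e.planned_brake_t:
            flagged.append(e.hazard_id)
    return flagged

events = [
    HazardEvent("pedestrian-17", human_brake_t=2.1, planned_brake_t=3.0),
    HazardEvent("plastic-bag-4", human_brake_t=None, planned_brake_t=None),
    HazardEvent("cyclist-9", human_brake_t=1.4, planned_brake_t=None),
]
print(review_candidates(events))        # ['pedestrian-17', 'cyclist-9']
```

The interesting output is exactly the flagged list: every entry is a case where the human out-performed the autopilot, which is the data you'd mine to figure out why.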
Re: (Score:2)
Re:So who is to blame? (Score:4, Insightful)
Re: (Score:3)
Any "ghosting" should be done by the automation system, with the driver in primary control.
Re: (Score:3)
If the purpose is to test, you can't disconnect it before it makes a mistake.
But video shows the 'safety operator' was not watching the road. Which is understandable. Try watching a CNC machine with your hand hovering over the e-stop button, see how long you last. I guarantee you, it won't be an 8 hour shift of alert watching. Then again, he was in the car. You'd likely watch a lot better if your skull was in reach of the tool.
Testing autodrive cars on the road is hubris. Build an artificial test environment
Re: (Score:2, Interesting)
If the purpose is to test, you can't disconnect it before it makes a mistake.
Yes you can. Hitting the brakes when the car doesn't expect it would cause it to evaluate what's going on. Did it identify an obstacle? Did it identify a possible not-obstacle? Did it expect to need to react, but not judge the condition as resolved yet (someone's up the road 250 feet, and you've got plenty of distance at 50 feet, so you wait to see if they move and possibly slow down at 150 feet to prepare--condition is resolving into a single probable outcome as you approach)?
The car can even report
Re: (Score:2)
In this case, it's academic. The operator wasn't watching the road.
If you touch the brakes every time there is a possible obstacle, you would be on them _all_day_ (in city traffic or side streets). You're back to 'no additional information'.
As always, you have to be careful not to train to artifacts. But artifacts exist in both virtual worlds and real world sensors. An advantage of virtual testing is you can isolate subsystems (or not). e.g. You don't need to render the scene and feed that to the camera/i
Re:So who is to blame? (Score:4, Interesting)
When I drive and see a possible danger (kid playing with a ball by the side of the road, dog wandering around near the edge of the road without a leash, motorcycle rider looking the wrong way on a side street as I approach, etc.) I always take my foot off the accelerator and cover the brake, ready to instantly respond if something stupid happens. It's called defensive driving, and is how everyone should drive.
I never really picked up this habit, though, until I had been riding motorbikes for a while, when you absolutely have to drive defensively if you want to survive commuting in London or Tokyo traffic. (I used to ride in both.)
The safety driver should have been doing this, and it would not be any impediment to testing the autonomy of the car - it can still do the driving, but the safety driver would have had time to react appropriately.
The safety driver completely failed in her duty, possibly due to lack of training, but if you're getting paid to be a safety driver then you should do your job instead of buggering around with your phone.
Re:So who is to blame? (Score:4, Interesting)
Try watching a CNC machine with your hand hovering over the e-stop button, see how long you last. I guarantee you, it won't be an 8 hour shift of alert watching.
I'd probably go 12 hours or more. I love watching those things work.
Re: (Score:2)
Exactly,
Being that this is test technology, it was his job to override the car when it made a bad decision. Given that Uber's self-driving cars are years behind other makers such as Google, the safety driver should have been much more vigilant.
Re: (Score:2)
I'm not convinced that particular spot was "well lit". She was obviously coming out of a shadow, and there seemed to be only the one light in the area with no overlap.
Re: (Score:3)
The road there is about as well lit as you would expect a big road to be, it's not as dark as the video implies. Granted, most of the time I'm down there I'm there for a concert and they might have additional lights on, but the video was obviously darker than what a person would see. At the place where she was crossing, the driver should have been able to see her crossing the road for several hundred feet at least.
Re:So who is to blame? (Score:4, Interesting)
> Probably the person playing on their phone as it was their job to override decisions made by buggy software.
I dunno, looking at the video of the crash, the victim crossed the road outside of a crosswalk and wasn't even LOOKING in the direction of potential traffic. I'd assign the lion's share of the blame to the person who literally walked into the path of a brightly lit car without noticing.
Re: (Score:2)
Also the video Uber released is highly altered. I drive on that street frequently and it is very well lit.
That doesn't mean the video was altered. It's just a shitty camera.
Re: (Score:2)
Did you see the entire video? The street is very well lit all the way through, except at the point where the woman was crossing.
Re: (Score:3, Insightful)
I've seen plenty of videos. The street is very well lit in all of them except Uber's video.
Interestingly, the camera facing the human "driver" is crisp and clear, using your standard "night vision" mode.
Either way, the Uber video is doctored.
Uber and people who authorized this experiment (Score:5, Insightful)
This is a clinical trial. The FDA has long, long experience in conducting clinical trials. Now one can argue whether the FDA's caution is too much, but even in the worst case everyone would agree they have a well established process for assuring something is safe and effective before you release it onto the public.
Uber is conducting experiments on the public.
If this were a new drug or treatment or medical procedure they would be shut down.
This is actually far worse than that because most new drugs or treatments have clear lineages from prior ones that give us high expectations of what the outcome will be.
The argument that something has to be allowed prematurely because in the long run it will save lives is a failed argument for medicine.
In this case there is nothing to support the claim that this will save lives in the long run. Sure, one could imagine that it would. But I don't think that's very well established. And if this were a drug study people would have spent the time and money to establish that.
The claim that they have conducted 5 million miles (or whatever) of testing is rubbish. Those are not statistically valid tests. Were execs dashing in front of the cars going 50 miles per hour in any of those tests? I assure you that did not happen.
Moreover, we already have evidence from those tests that driver re-acquisitions do happen frequently, and there is a substantial lag in the handover due to human inattention. The fact that they only had one driver in it says Uber is negligent.
Re: (Score:3)
Re: (Score:2)
> This is testing on people who have not been forewarned and have given zero consent
Have you given your consent to the guy down the street having his first epileptic seizure while driving past your kids playing?
Framing self driving car tests in drug trial language is useless.
Re: (Score:2)
So are all the other "autopiloting" car manufacturers. And drive-by-computer, aka no mechanical links to the brakes, steering, and throttle, is also robotic, which will bite us some day.
Skin in the game (Score:5, Informative)
Excellent point re execs. I read that in England sometime in the middle ages bridge engineers were required after the construction to sleep for two weeks under the bridge -- with their families.
Re: (Score:2)
>Uber is conducting experiments on the public.
At some point you have to test tech like this in the real world.
>If this were a new drug or treatment or medical procedure they would be shut down.
Would it? If this "new drug" had the potential even with a couple of side effects to replace or supplant a known drug that was already killing 40,000 people and maiming hundreds of thousands in the US alone per year?
Re: (Score:2)
It's not as if movie studios don't have "fake" real towns built on their premises in California. Why weren't these used to test the car with individuals who consented to be stand-ins for cyclists, pedestrians, and other drivers?
Re: (Score:2)
If this were a new drug or treatment or medical procedure they would be shut down.
I'm not sure how you can draw an analogy there. In a clinical drug trial, the drug doesn't go out and kill someone not part of the trial.
Re: (Score:2)
(Should) We (have) execs dashing in front of the cars going 50 miles per hours in any of those tests?
This is a really cool idea. It would "drive" home the point on system safety. Think how much more thought there'd be about safe operation when the execs gotta put their lives on the line for the work of their minions.
Re:Uber and people who authorized this experiment (Score:5, Informative)
If this were a new drug or treatment or medical procedure they would be shut down.
Uber self-driving tests have been (mostly) shut down.
Uber makes it sound like they suspended their testing operations voluntarily, but the fact is they lost their testing permits in Arizona, California, and one other state.
And if there is any testing going now with Uber, it's only happening now in computer simulations, or in mocked up urban environments with fake pedestrians and bicyclists.
Re: (Score:2)
The head of QA. How many of tests on a track with mannequins did they do? Probably should have been hundreds or more.
start at the top UBER VP / CEO needs to (Score:2)
start at the top: the Uber VP / CEO needs to go to court and take the fall for the fully outsourced map. Or the next one can just outsource things so much that no one person is responsible and it takes a year or more just to fully understand the outsourced mapping.
Re: (Score:2)
Nobody, since the case was settled out of court.
Re: (Score:2)
Re: (Score:2)
Relax, dude. You are discontent with the general public's reaction to this death and are putting the weight of that injustice squarely on one person's shoulders. The software developer is not making the public act the way they are. You don't know the guy.
If I was the software developer that made this mistake, I would already be tortured with guilt over having killed a person. Your assumption that this person must be punished because he's clearly obviously laughing about it like the rest of them are is
Re: (Score:2, Interesting)
Punishment is not justice, and justice does not right a wrong. Don't be so eager to swing an axe just because you yourself feel bad about this situation. That is just selfish. Instead, let's figure out the right way to react to the injustice that took place here, barring any prejudice.
I actually proposed a Constitutional Amendment. Working on the language.
The purpose of law being to establish Justice and insure domestic Tranquility, the execution of law against an offense shall be to redress and rehabilitate.
To this purpose, and to the purpose of a fair and speedy trial, none shall be confined against his will except as necessary for the security of the public, and such confinement shall to the greatest extent achievable respect the dignity of the confined as human beings and ensure their individual needs are met and rights protected; and no bail shall be required except where other means are insufficient to the same purpose; and civil damages shall not be imposed in excess of those necessary to redress.
Re: (Score:2)
It's a software guy alright.
It's you! We've pinned the whole deal on whatever code we can prove you wrote and abandoned. Bet you didn't expect that code to end up in an 'AI'!
Re: (Score:2)
Re: So who is to blame? (Score:2)
The woman was found to be at fault (Score:2)
No one, because there was no vehicular manslaughter.
The woman was found to be at fault [azcentral.com] for not checking that the road was clear before stepping out of the shadows to cross illegally. Something that she could have easily done since it was dark and the vehicle's headlights were on.
A large median at the site of the crash has signs warning people not to cross mid-block and to use the crosswalk to the north at Curry Road instead.
Re: (Score:2)
They outsourced that part to India.
Re: (Score:2)
They outsourced that part to India.
Nah! The Uber thought the woman was its wife, and thus was programmed to ignore her.
Re: So who is to blame? (Score:2, Insightful)
No one is guilty of vehicular manslaughter. This is an accident due to bad design. You don't jail the engineers or architects who design a building that fails in an earthquake. You don't arrest the airline execs when a plane component fails. The only difference here is that software failed. Learn from the mistake, don't do it again. Sheesh.
Re: (Score:2, Funny)
That sounds an awful lot like something someone guilty of vehicular manslaughter would say.
Re: So who is to blame? (Score:5, Insightful)
Of course you jail the engineers or architects, when they are criminally negligent.
There is little doubt that the Uber CEO, Dara Khosrowshahi, was criminally negligent and aware that this kind of accident was not only possible but likely.
Re: So who is to blame? (Score:5, Insightful)
No one is guilty of vehicular manslaughter. This is an accident due to bad design. You don't jail the engineers or architects who design a building that fails in an earthquake.
You do if it wasn't designed to code [gizmodo.com].
You don't arrest the airline execs when a plane component fails.
Maybe not the exec, but certainly the maintenance engineer [nytimes.com] who committed fraud that resulted in the death of people.
So here - who's to blame? Who decided to live-test an experimental system that can operate with the safety disengaged?
Re: (Score:2)
How do you propose to validate a pipelined set of neural nets' training? This isn't just a message loop and a giant case statement.
I'd start by setting up a transparent, adversarial system in a virtual world. Let the public earn money by providing data sets that trigger bad behavior in the 'AI'. Good fun.
Re: (Score:2)
If a virtual Pepe/Pedobear walking across the road makes a car crash, they've done us all a service.
Re: (Score:2)
serious work done making sure that all phases of the workflows creating systems "that have the potential to cause human casualty or death" are secure and error free.
Well some companies [japantoday.com] are indeed building tracks to begin neural net training live. Additionally, there's been enough failures and near misses from other car companies to begin edge testing as well.
Additionally, map makers are now refocusing on a new emerging market of maps for self driving cars. [latimes.com] These maps differ from the typical on-line map in that they need typical pattern usage of a given intersection or piece of road that initial algorithms create too many edge cases for. Good example might be the 65/ [goo.gl]
Re: (Score:2)
A simple, stupid forward-radar system should have been able to detect the bike, if not the person.
Oh good. (Score:4, Interesting)
Then it's an easy fix. Just move the "sensitivity" slider a little to the left.
Actually, it's kind of terrifying that all that stands between life and death is a sensitivity setting.
Re: (Score:3)
Then it's an easy fix. Just move the "sensitivity" slider a little to the left.
Actually, it's kind of terrifying that all that stands between life and death is a sensitivity setting.
Absolutely.
I hope they use material design so the settings are all hard to see.
And I really hope it's totally ambiguous whether you have to click Save, or if the changes to the Sensitivity slider will just save automatically, just because you touched them or something.
Re: (Score:2, Interesting)
Then it's an easy fix. Just move the "sensitivity" slider a little to the left.
Actually, it's kind of terrifying that all that stands between life and death is a sensitivity setting.
It is not the setting that is the problem. The problem is so-called AIs with less intelligence than a cockroach being put behind the wheel of cars.
Re: (Score:3)
That is generally true about the world.
If you are a driver and you are distracted or not fully focused on the world around you, your sensitivity setting is just off too. The biggest reason motorcycles get into accidents is that automobile drivers fail to see them, just because they may not be expecting a motorcycle, so their eyes are on the lookout for fast-moving objects that fill up at least 2/3 of the lane. This fact that we fail to comprehend things that we don't expect is how magicians trick us to s
Re: (Score:2)
is because automobile drivers fail to see them, just because they may not be expecting a motorcycle,
The reason I didn't see them is because they were coming up between two lanes of moving traffic. You're right, I am not set to expect that.
does the autonomous sensitivity need to be changed (Score:2)
Does the autonomous sensitivity need to be changed all the time? Time of day? Weather? Urban vs. rural?
Re: (Score:2)
Re: (Score:2)
Just move the "sensitivity" slider a little to the left.
Actually, it's kind of terrifying that all that stands between life and death is a sensitivity setting.
There is no "correct" setting for sensitivity because the software is broken. Where it is right now is both "incorrectly classifying a safe situation as dangerous" AS WELL AS "incorrectly classifying a dangerous situation as safe" (probabilities apply).
Which way do you want the slider to move? It's already too far from the correct position for both classifications.
Re: False Positives and False Negatives (Score:2)
By the way this was a false negative (detection), not false positive.
A false negative means it decided there was nothing there i.e. that the data did not indicate an object. But that was false.
A false positive would mean the system decided something was in front of the car, when there wasn't anything or anything significant anyway.
The problem is when you set the parameters to lower the rate of false negatives (A good thing if you are considering being a pedestrian), then the rate of false positives goes up.
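The tradeoff the parent describes can be shown with a toy detector (purely illustrative numbers, not Uber's system): a single confidence threshold, where tightening it trades false positives for false negatives and vice versa.

```python
def classify(scores, labels, threshold):
    """Count false positives and false negatives at a given confidence threshold.

    scores: detector confidence per frame; labels: True if an object really is there.
    """
    fp = sum(1 for s, real in zip(scores, labels) if s >= threshold and not real)
    fn = sum(1 for s, real in zip(scores, labels) if s < threshold and real)
    return fp, fn

# Illustrative scores: real objects tend to score high, clutter scores low,
# but the distributions overlap -- that overlap is the whole problem.
scores = [0.95, 0.80, 0.55, 0.40, 0.30, 0.20]
labels = [True, True, True, False, False, False]

print(classify(scores, labels, 0.60))   # strict: (0 FP, 1 FN) -- misses a real object
print(classify(scores, labels, 0.35))   # lenient: (1 FP, 0 FN) -- phantom braking
```

No threshold on these overlapping distributions gives (0, 0), which is the point: moving the "slider" only chooses which kind of error you get, and for a pedestrian detector a false negative is the catastrophic one.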
Too large! (Score:5, Interesting)
I understand that programmatically telling a blowing plastic bag from a child's toy is difficult.
But she (and her bike) were clearly large enough to damage the vehicle. Even if the code saw her as debris, the car should have avoided it.
I think the code had to have dismissed her as lens flare or something similar.
Re: (Score:2, Insightful)
Re: (Score:2)
I think the code had to have dismissed her as lens flare or something similar.
Damn you Michael Bay!
Uber cuts corners (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
I think a big part of the problem is unwillingness to spend on a really really good sensor array. Being confused by stationary objects is only a thing because the computer needs to make guesses with choppy data that is not actually reliable. So, instead of making the occupants seasick with lots of popping on the brakes for no apparent reason, they try to teach the car to ignore some signals.
Re:Uber cuts corners (Score:5, Insightful)
Roads are dirty places, the better your sensor array, the more signals it will have to see and decide, hopefully correctly, to ignore.
Re: (Score:2)
Let us know when one of those is available. Hell, forget the affordable part.
Re: (Score:2)
Uber's entire business model is based on cutting corners (not paying employees as employees, not following local taxi laws/regulations, etc.). I wasn't at all surprised to hear that one of their self-driving test cars killed somebody. I immediately assumed that it was the result of yet another corner that they cut.
And I think your assumption is pretty valid. I believe there will be some attorneys that agree with me as well. That wrongful death suit is going to be very expensive and damaging to Uber.
Re: (Score:2)
That wrongful death suit is going to be very expensive and damaging to Uber.
You think so? Because from what I see, Uber settled confidentially with the woman's family within 11 days after the accident.
Re: (Score:2)
Maybe that's a fair immediate assumption, but did you view the video and, if so, did it cause you to reevaluate your initial assessment?
I started by assuming that the technology was still in its infancy and hence crap[1], then I saw the video and realized that not only is the technology crap, but it's literally a person jumping out from a shadow at the last possible moment on a large thoroughfare nowhere near a crosswalk.
[1] Not even a judgment on Uber TBQH, could have been Tesla or GM or Toyota. I've seen enoug
Re:Uber cuts corners (Score:5, Insightful)
Maybe that's a fair immediate assumption, but did you view the video and, if so, did it cause you to reevaluate your initial assessment?
I started by assuming that the technology was still in its infancy and hence crap[1], then I saw the video and realized that not only is the technology crap, but it's literally a person jumping out from a shadow at the last possible moment on a large thoroughfare nowhere near a crosswalk.
[1] Not even a judgment on Uber TBQH, could have been Tesla or GM or Toyota. I've seen enough technologies come up to realize that the cutting edge is riddled with snakes. By the time it's thoroughly ironed out, it's also super boring.
I did view the video.
And like most people I came to the conclusion that Uber was either using ridiculously bad cameras or the video was altered. This impression was only compounded when third-party videos came out showing that the road in question was actually quite well lit.
Either way Uber was still fully to blame for the collision, the tech was obviously not ready for testing on live roads, especially not with a single driver who was prone to being distracted. Authorizing that test is damn well close to negligent homicide.
Re:Uber cuts corners (Score:4, Insightful)
... it's literally a person jumping out from a shadow at the last possible moment on a large thoroughfare nowhere near a crosswalk.
If by "jumping out from a shadow" you mean "slowly crossing the street", and by "at the last possible moment" you mean "and had nearly crossed all three lanes", ignoring that there's plenty of evidence that the released video did not even vaguely show the actual level of light in the location.
criminal case let's uber ceo in tent city jail for (Score:2)
criminal case! Let's see the Uber CEO in tent city jail for some time.
Also, with a criminal case you can't hide behind EULAs or a big list of subcontractors.
In autopilot software (airplanes / FAA) testing (Score:2)
In autopilot software (airplanes / FAA) this would be tuned in testing / code review before it made it to real use.
Re: (Score:2)
Excepting the battery charger software.
My guess (Score:2)
Most of the code came from the Kalanick era (Score:2)
In all likelihood the AI did detect the woman, but then decided she wasn't attractive enough to harass and switched to "ignore" mode.
How much timewas allowed for compliance? (Score:2)
TFA doesn't mention how much time, if any, was allowed for compliance. Compliance errors were even documented in an old movie.
https://www.youtube.com/watch?... [youtube.com]
I'm still more worried about the driver (Score:3)
I still want to know why nobody seems to care that the driver wasn't looking at the road. The software bug is secondary.
Misleading title (Score:2)
Uber software did not "see but ignore woman".
Uber software processed some pixel data and erroneously concluded that there was no significant solid object right in front of the car.
Simple as that.
To imply that the software "saw" a person there but ignored the person is pejorative, sensationalist language, designed to troll.
QA by real life trials? (Score:2)
So... this is a QA cycle at the expense of 'users'? Did they not test it in a more controlled environment? I hope they (all of them) will now.
Re:The War Between Man and Machine (Score:5, Funny)
Yeah, really. How do they know the vehicle "ignored" the woman? Maybe it just acted like it didn't recognize it needed to take action when it really was targeting her.
Re:Should you name your self driving car? (Score:4, Informative)
Should you name your self driving car? If so, are Christine and Kitt off limits?
ITYM Karr (Knight Automated Roaming Robot) - that was the evil one. Kitt (Knight Industries Two Thousand) was the good one.
(I feel very old now)
Re: (Score:2)
Re: (Score:2)
I can't wait for them to decide enough is enough and they ban these from public use. We need REAL AI not this half-assed pseudo-intelligence 'machine learning' crap. What we really need is better driver education, training, and testing, and stricter penalties for bad drivers, up to and including revoking their driving
Re: (Score:3)
If people follow at appropriate distance for the speed, no crash will happen.
Re:Not so simple... (Score:2)
If people...
That's a big "if", especially when you look at how many crashes are caused by drivers texting and/or drunk.
It's like saying people should assume drivers will jam on their brakes because pedestrians have the right-of-way.
Re: (Score:2)
Many cyclists are assholes, future Darwin Award winners, but you can't just run them over. She was in the road long enough to cross a lane and a half.
Re: (Score:2)
I for one hail our new corporatocratic [wikipedia.org] overlords.