Inside Google's Self-Driving Car Test Center (medium.com)
An anonymous reader writes: Steven Levy reports on his trip to the facility where Google tests its autonomous vehicles (here's a map). The company apparently has a four-week program to certify people to not-drive these cars, and they gave Levy an abbreviated version of it. "The most valuable tool the test team has for making sure things are running smoothly is the laptop on the co-driver's lap. Using an interface called x_view, the laptop shows the world as the car sees it, a wireframe representation of the area that depicts all the objects around the car: pedestrians, trees, road signs, other cars, motorcycles—basically everything picked up by the car's radar and laser sensors.
X_view also shows how the car is planning to deal with conditions, mainly through a series of grid-like "fences" that depict when the car intends to stop, cautiously yield, or proceed past a hazard. It also displays the car's path. If the co-driver sees a discrepancy between x_view and the real world, that's reason to disengage. ... At the end of the shift, the entire log is sent off to an independent triage team, which runs simulations to see what would have happened had the car continued autonomously. In fact, even though Google's cars have autonomously driven more than 1.3 million miles—routinely logging 10,000 to 15,000 more every week—they have been tested many times more in software, where it's possible to model 3 million miles of driving in a single day."
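Purely as an illustration of the "fence" idea described above, here is a minimal sketch in Python. None of these names (Fence, Action, plan_fences) come from Google's x_view, and the hazard categories are invented:

```python
# Hypothetical sketch of the "fence" planning annotations described above.
# None of these names come from Google's x_view; they are illustrative only.
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    STOP = "stop"        # hard fence: the car will not cross this line
    YIELD = "yield"      # soft fence: proceed only if the way is clear
    PROCEED = "proceed"  # no restriction past this point

@dataclass
class Fence:
    distance_m: float    # distance ahead along the planned path
    action: Action
    reason: str          # e.g. "pedestrian", "stop_sign"

def plan_fences(hazards):
    """Map detected hazards to grid-like fences along the planned path."""
    fences = []
    for distance_m, kind in hazards:
        if kind in ("pedestrian", "red_light", "stop_sign"):
            fences.append(Fence(distance_m, Action.STOP, kind))
        elif kind in ("cyclist", "merging_car"):
            fences.append(Fence(distance_m, Action.YIELD, kind))
        else:
            fences.append(Fence(distance_m, Action.PROCEED, kind))
    return sorted(fences, key=lambda f: f.distance_m)

# The co-driver's display would render these ordered fences over the
# wireframe scene; a mismatch with the real world is reason to disengage.
print(plan_fences([(40.0, "pedestrian"), (15.0, "cyclist")]))
```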
and what about the prison / jail time when the car (Score:2)
and what about the prison / jail time when the car runs through the farmers market?
Re:Million Dollar Payout (Score:5, Insightful)
Google will be paying some accident victim millions of dollars in the future. It is inevitable.
... which will be covered by their insurance company ... the same insurance companies that are already paying millions to accident victims. The only thing that will change with SDCs is that they will be paying a lot less.
SDCs don't need to be perfect. They just need to be better than human drivers. That is not a high bar.
Re:Million Dollar Payout (Score:4, Informative)
I will be surprised if Google's insurance company ever has to pay a claim. The cars will have so much data on the accident that it should be trivial to show that the car was obeying all the laws and that any accident was either impossible to avoid or the fault of someone else.
Google itself might pay in the case of an accident that was not their fault BUT has a PR issue attached to it.
But, overall, I think that the insurance companies will love the autonomous cars that they're insuring. It's free money for them.
Re: (Score:2)
Again, it's naive to think that the car's driving software will be infallible, but most accidents are caused by the fallibility of human drivers. All Google needs to do to save thousands of lives and millions of dollars is be a little better than people, a goal that should not be hard to reach.
Re: (Score:2)
Airbus killed 20-30 people
So software killed a few dozen. Pilot error has killed thousands.
spoiler alert! (Score:3)
it's not bigger on the inside. :(
Is it fair? (Score:2)
For those of you who haven't seen the University of Michigan report [govtech.com]
Re: (Score:3)
Who exactly is Google conning? So far, the only ones losing money on self-driving cars are the companies like Google that are researching them.
Re: (Score:2)
Indeed, Google is conning itself. They've said their goal is autonomous cars by 2020, but they still haven't actually said how they hope to achieve this; primarily, will it be with or without constant, extensive mapping?
And now they've disclosed that the only reason the cars didn't crash may be that humans took control of the cars whenever they made mistakes.
Re: (Score:3)
The problem with that report is that it only covers reported accidents.
ANY accident that involves an autonomous car gets reported.
A similar accident between two humans may not be reported.
And, finally, the report even states that NONE of the accidents were the fault of the autonomous cars. They were ALL the fault of the human drivers. So, yes, the autonomous cars are better drivers than the humans in those instances.
Re: (Score:3)
1) The first sentence of the report says "after correcting for underreported accidents".
2) Most of the mileage of autonomous cars is at 25 mph on residential streets, or on open, non-congested highway under perfect driving conditions.
So I'm not sure it's comparing apples to apples. I haven't been able to find a non-paywalled version; if anyone has one, I'd like to read it, but I'm not paying.
Re: (Score:2)
From the abstract:
So they are admitting that their "higher" rate may not be correct. Which is what I said.
Secondly, yes, it is "apples to apples", because the human drivers involved in the accidents with the autonomous cars were all driving under the exact same conditions.
Re: (Score:2)
Don't use terms you do not understand.
You claimed that the autonomous cars were WORSE drivers because they had MORE accidents. I said that 100% of the accidents had to be reported for autonomous cars, but not for human drivers, so there is a question of whether the comparison is accurate.
You claimed that the paper said that they had adjusted for that.
I quoted the summary saying that those researchers admitted that they were NOT sure that the adjustment was correct.
And then I pointed
Re: (Score:3)
Or let's try this a different way, with a thought experiment. Think of both sides as two humans: Autonomous Alice and Bob.
Alice drives less than Bob. And Alice only drives under perfect conditions in a limited area. Bob drives everywhere in all conditions.
Bob does not report every accident he has to his insurance company. But Alice does. The insurance company sees that, on average, Alice reports more accidents than Bob. And the insurance company tries to adjust for Bob's under-reporting.
But every single accident Alice has is on the record; the adjustment for Bob's accidents is only an estimate.
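To make the arithmetic concrete, here is a small sketch with invented numbers (these are not figures from the Michigan report):

```python
# Invented numbers to illustrate the under-reporting problem; these are
# not data from the Michigan report.
alice_miles = 100_000          # Alice (autonomous): every accident reported
alice_reported = 2

bob_miles = 1_000_000          # Bob (human): only some accidents reported
bob_reported = 10
assumed_reporting_rate = 0.6   # the adjustment the researchers must guess

alice_rate = alice_reported / alice_miles * 1_000_000
bob_raw_rate = bob_reported / bob_miles * 1_000_000
bob_adjusted = bob_raw_rate / assumed_reporting_rate

print(f"Alice: {alice_rate:.1f} accidents per million miles (exact)")
print(f"Bob:   {bob_raw_rate:.1f} raw, {bob_adjusted:.1f} adjusted (estimate)")
# Whether Alice looks better or worse than Bob hinges entirely on the
# assumed reporting rate, which is exactly the number being disputed.
```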
Google (Score:2)
Driverless car = Everyone else
Re: (Score:2)
Google cars, the ones that go into actual traffic, are self-driving, not driverless, because there is still a driver in the car in case of emergency.
Passenger airplanes nowadays are pretty much self-flying, but they are not pilotless.
Redundancy (Score:2)
In fact, even though Google's cars have autonomously driven more than 1.3 million miles—routinely logging 10,000 to 15,000 more every week—they have been tested many times more in software, where it's possible to model 3 million miles of driving in a single day."
That would be three million miles over the same few miles of well-understood track and roads. The real world is much more varied than that.
Google cars involved in crashes (Score:3)
Read the details here [yahoo.com] which outline the results of the report.
Re: (Score:2)
I wouldn't say it failed. As the article explains, it does not mean the sensors failed to detect something; it means the sensors detected a glitch/malfunction/blockage and alerted the driver to take over. Presumably that means the test models lack redundancy and the ability to "vote out" malfunctioning gear. Those are relatively trivial to fix, since they are purely technical issues and not about the car's understanding of its surroundings. In fact, over the test period they show about a 7x improvement.
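A minimal sketch of the kind of 2-out-of-3 voting that would let the car "vote out" a malfunctioning sensor; the sensor names, tolerance, and vote function here are all invented for illustration:

```python
# Hypothetical 2-out-of-3 voter over redundant range sensors; the names
# and tolerance are invented for illustration.
from statistics import median

def vote(readings, tolerance=0.5):
    """Return the agreed reading and flag sensors that disagree with it.

    readings: list of (sensor_id, range_m); any reading farther than
    `tolerance` meters from the median is voted out as faulty.
    """
    agreed = median(r for _, r in readings)
    faulty = [sid for sid, r in readings if abs(r - agreed) > tolerance]
    return agreed, faulty

value, faulty = vote([("lidar_a", 12.1), ("lidar_b", 12.2), ("radar", 37.9)])
print(value, faulty)  # 12.2 ['radar'] -> keep driving, flag the radar
# With only a single sensor, the same glitch would force a handoff to
# the human driver, which matches the disengagements in the report.
```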
Re: (Score:2)
That's a rather massive presumption and could be completely wrong; for instance, the failure could be on
Ob (Score:2)
Why would a car test center need to drive?
really, really slow and/or dangerous (Score:2)
In TFA the car has to be taken off auto-drive because it comes to a construction area and slows down so much it's barely moving.
These things are not going to work in this form, and pushing them into the market will be a disaster.
Car AI is much, much improved, and I can see groups of electric semi-trucks following one lead driver on an interstate, but that's about it.