Philosophical Differences In Autonomous Car Tech
An anonymous reader writes: The Guardian has an in-depth article on the status of self-driving car development at BMW. The technology can handle the autobahn just fine, for the most part. But the article highlights philosophical differences in how various companies are developing self-driving tech. European and Asian car manufacturers are fine working on it piece-by-piece. The car will drive itself when it can, but they expect drivers to always be monitoring the situation and ready to take control. Google's tests have taught it otherwise — even after being told it's a prototype, new drivers immediately place a lot more trust in the car than they should. They turn their attention away and stop looking at the road for much longer than is safe. This makes Google think autonomous cars need an all-or-nothing approach. Conversely, BMW feels that incremental progress is the only way to go. They also expect cars to start carrying "black boxes" that will help crash investigators figure out exactly what caused an accident.
In related news, Google is bringing on John Krafcik as the CEO of its self-driving car project. He has worked in product development for Ford, he was the CEO of Hyundai North America, and most recently he was president of Truecar.
Autonomous "Driving" needs to be truly driverless (Score:5, Insightful)
Expecting a driver to take control in a failure scenario is not a solution.
I would never trust a car that could require me to take control in an emergency. At the very least, the autonomous driver should get the car to a safe stop before requiring a human to take over.
It doesn't necessarily need to be an all or nothing (Score:4, Insightful)
Expecting a driver to take control in a failure scenario is not a solution.
I would never trust a car that could require me to take control in an emergency. At the very least, the autonomous driver should get the car to a safe stop before requiring a human to take over.
I agree that "expecting a driver to take control in a failure scenario is not a solution," and stopping is an acceptable solution, but it doesn't necessarily need to be all or nothing. A better piecemeal solution would be to have it only engage on known safe highways. It would still be extremely useful in trucks, RVs, and regular cars if it only engaged on predesignated roads or interstates. The trucking industry already has depots at both ends of Kansas where trucks double or triple up before taking the long straight stretch across Kansas to minimize drivers. I see no reason why driverless cars couldn't do the same where you could only engage autopilot on certain known safe highways with good shoulders to do emergency stops.
You could also do the same with weather. If it detects rain starting then it gives a 60 second warning and pulls over to the side of the road.
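To make that concrete, here is a minimal sketch of the engagement rule being proposed. Everything in it is hypothetical: the route whitelist, the "car" object, and its methods are illustrative stand-ins, not any real vendor's API.

    import time

    # Hypothetical whitelist of pre-surveyed highway segments with good shoulders.
    APPROVED_ROUTES = {"I-70_KS_westbound", "I-70_KS_eastbound"}

    def autopilot_may_engage(route_id, rain_detected):
        # Engage only on predesignated roads, and only in clear weather.
        return route_id in APPROVED_ROUTES and not rain_detected

    def on_rain_detected(car):
        # The 60-second warning described above, then a shoulder stop.
        car.warn_driver("Rain detected: take over within 60 seconds")
        deadline = time.monotonic() + 60
        while time.monotonic() < deadline:
            if car.driver_has_taken_over():
                return
            time.sleep(0.1)
        car.pull_over_and_stop()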
Re: (Score:2)
If it detects rain starting then it gives a 60 second warning and pulls over to the side of the road.
Imagine you have a road full of driverless cars, and it starts to rain. Suddenly the shoulders get jammed as every single car tries to pull over.
I see no reason why driverless cars couldn't do the same where you could only engage autopilot on certain known safe highways with good shoulders to do emergency stops.
There are none. Any road has potential to have construction at any time, for example, without warning. This particular problem could be fixed with legislation requiring all construction projects to be entered into a database in advance, but do you really want your cars driving to be affected by a remote database? Seems like a security issue.
Re: (Score:2)
There are none. Any road has potential to have construction at any time, for example, without warning. This particular problem could be fixed with legislation requiring all construction projects to be entered into a database in advance, but do you really want your cars driving to be affected by a remote database? Seems like a security issue.
I expect that the first adopters of this will be big rigs with plenty of money and as such, I could see them even constructing special pull off areas and yes, even requiring all construction on the road to be documented. Toll roads which are privately owned would be a reasonable place to start.
also more driver assist, similar to what we have now (Score:5, Insightful)
> better piecemeal solution would be to have it only engage on known safe highways.
Also, what we're already seeing is more and more driver assist. My 2012 Dodge has an option for "smart" cruise control where it slows down if you're getting too close to the car in front. Most cars these days have traction control, where the computer automatically brakes the wheels independently in order to turn the car in the direction the steering wheel is pointed. Do we already have systems that will nudge the steering a bit when you start to drift out of your lane? If not, that could be added. Not overriding a clear steering input from the driver, just a slight torque so that the existing self-centering action of the steering wheel follows the lines which mark the lanes (a rough sketch of that idea follows below). In other words, with today's cars, if you let go of the steering it'll tend to go straight ahead. Maybe with tomorrow's cars, if you let go of the wheel they'll TEND to follow the lane.
On my 2012, the headlights automatically turn on and off as needed.
I could see more and more of that stuff being added, stuff where the computer ensures that the car does what the driver wants/expects it to do. Eventually, you slowly get to the point where "what the driver expects" is defined by the destination they select in the GPS.
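For what it's worth, the "slight torque" idea could be as simple as a capped proportional controller that always yields to the driver. A toy sketch; the gains and thresholds are made-up numbers, not values from any real electric power steering system:

    def lane_keep_torque(lane_offset_m, driver_torque_nm,
                         gain=0.8, max_assist_nm=2.0, override_nm=1.5):
        # lane_offset_m: lateral offset from lane center (positive = drifted right)
        # driver_torque_nm: torque the driver is currently applying to the wheel
        if abs(driver_torque_nm) > override_nm:
            return 0.0  # a clear steering input from the driver always wins
        assist = -gain * lane_offset_m  # nudge back toward the lane center
        # Cap the assist so the car merely TENDS to follow the lane.
        return max(-max_assist_nm, min(max_assist_nm, assist))

    # Drifted 0.3 m right with hands off: lane_keep_torque(0.3, 0.0) -> -0.24 Nm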
Re: (Score:3)
Expecting a driver to take control in a failure scenario is not a solution.
Yes it is. It's just a solution that can't be used in America, where no one feels any need whatsoever to have any personal responsibility for the situation they are in. You're just demanding to be pampered and ignoring the fact that in a failure situation THE CAR CAN'T BE TRUSTED. That's what a failure means in this context. You'll die in a car accident even with Google's car, because you'll be bitching about how it's not working right when it drives off a cliff instead of just putting your foot on the damn brake.
Re:Autonomous "Driving" needs to be truly driverless (Score:5, Insightful)
And I'd absolutely disagree.
The first step for autonomous driving would make sense to be implemented ONLY on long-stretch highway drives, with strong signals many minutes before arriving at the exit, and a "pull over and stop" system for drivers that don't respond/wake up.
To suggest that driverless cars have to be able to cope with every conceivable situation is totally unreasonable. Hell, HUMANS can't cope with "every conceivable situation", really.
Re: (Score:2)
At the very least, the autonomous driver should get the car to a safe stop before requiring a human to take over.
Depending on the emergency, that may not be possible to do. Rather, if you're going to go that route, it should interactively transfer control to the driver while continuing to attempt the safest maneuvers it can.
Re:Autonomous "Driving" needs to be truly driverless (Score:5, Insightful)
Expecting a driver to take control in a failure scenario is not a solution.
Why not? That is the design philosophy for airliners made in the past 30 years and the pilots who operate them. Sit and babysit the machine for 99.99% of the time; then jump in ready to go for the 0.01% of the time the situation is beyond the programming of the software. (In which case the software 1) does the wrong thing, or 2) just shuts down while displaying a message to the pilots to let them know that, suddenly, THEY are flying the plane.)
The problem with this is that when an airline pilot is forced to take control, they probably have MINUTES before any real issue will arise.
They are asking car drivers to take over when there are possible issues within SECONDS (possibly less).
Re: (Score:2)
.... The problem with this is that when an airline pilot is forced to take control, they probably have MINUTES before any real issue will arise. They are asking car drivers to take over when there are possible issues within SECONDS (possibly less).
Not necessarily, especially if it involves fire, structural damage, rapid decompression or engine failure. Maybe the chain of events unfolds over minutes or hours, but there are times where a correct decision needs to be made quickly, e.g. turn back, land straight ahead, divert, eject, etc.
Re:Autonomous "Driving" needs to be truly driverless (Score:5, Insightful)
The bigger issue is that pilots spend hundreds of hours practicing those emergency maneuvers. Car drivers not so much. Pilots have strict rules about how many hours they can fly. Automobile drivers, not so much.
If we held automobile and truck drivers to the standard we hold even private pilots, there would likely be many fewer accidents. But we don't.
Re: (Score:3)
If we held automobile and truck drivers to the standard we hold even private pilots, there would likely be many fewer drivers.
FTFY. Not that fewer drivers would be a bad thing.
Re: (Score:2)
The main problem with this idea is: how do people get around if they're not proficient enough to get a driver's license under a new regime where drivers have to be as skilled as pilots? We don't have public transit in this country which is actually feasible for much of the population. A lot of people don't even live anywhere near public transit routes. So what are you going to do, take away their livelihood and make them move at their own expense? The results would be catastrophic with all the people go
Re: (Score:2)
Agreed, it would require a serious restructuring of transportation in the US, something that couldn't happen quickly or cheaply.
At this point we're almost certainly better off just waiting for self-driving cars to become practical, then we can dramatically raise the requirements to get a license -- or just ban manual driving on public roads entirely.
Re: (Score:2)
The bigger issue is that pilots spend hundreds of hours practicing those emergency maneuvers.
... and even then there are a number of accidents caused by autopilot handoff to pilots that were not fully aware of the situation, leading to poor decisions. Such as Air France 447 [wikipedia.org].
Re:Autonomous "Driving" needs to be truly driverless (Score:5, Insightful)
People aren't good at driving and make bad decisions in emergency situations as it is. Now you want the person to have 99.99% less experience behind the wheel, and yet be capable of doing the right thing when a tricky situation is suddenly thrown at them with half a second notice? Are you sure you don't see the problem with this?
Re: (Score:2)
Not to mention that the AI driving the car can probably "see" in the infrared and in all directions simultaneously so would likely have noticed the elk in the woods running towards the road long before a human driver and wouldn't have to brake hard to barely avoid a collision.
Re: (Score:2)
Why not? That is the design philosophy for airliners made in the past 30 years and the pilots who operate them. Sit and babysit the machine for 99.99% of the time; then jump in ready to go for the 0.01% of the time the situation is beyond the programming of the software. (In which case the software 1) does the wrong thing, or 2) just shuts down while displaying a message to the pilots to let them know that, suddenly, THEY are flying the plane.)
A couple of reasons. First, the barrier to flying a plane, both for a human and for a computer, is significantly higher. Specialized training far beyond that needed to drive a car is necessary to fly a plane, and not simply because the machine is more complicated, but because the conditions that the machine could encounter are much more varied and the way the operator reacts is much more important. Second, the number of people that could fall victim to an error is greater and the amount of harm that could be
Re: (Score:2)
You forgot third, planes are in the air. They're not on the road, twelve inches from another car, and able to collide with that car within a tiny fraction of a second with only a small amount of incorrect steering input. If it takes a couple of seconds for the pilot to take over from the autopilot, it's scary, but nothing bad will happen... takeoffs and landings aside, of course.
Re: (Score:2)
In most situations an emergency routine to slow the car to a halt, possibly pulling over, is a preferred solution to handing control to an ever less skilled driver.
If most cars are robot, and most of the remainder have robot avoidance, this should go well the vast majority of the time.
Re: (Score:2)
The problem here is that "pulling over" is frequently not an option. Where do you pull over? Many roads do not have any shoulders on them. The rural roads near where I live are winding, narrow and have no shoulders at all, and traffic moves pretty fast on them. They also have a lot of wrecks; I came up on one last week, the road was completely blocked with a dozen fire trucks and cop cars, and I had to turn around and find another route around the mess. I guess if you could program the car to pull into s
Re:Autonomous "Driving" needs to be truly driverless (Score:5, Insightful)
The airliner scenario is only superficially similar.
At cruising altitude, a plane typically has minutes before it crashes to the ground. For example, from the time its problem began to the point it hit the ocean, Air France flight AF447's pilots had 3 minutes and 30 seconds to try and save the plane. There are typically few, if any, other planes in its airspace to worry about, so pilots can do things like take out their operating manuals and run through procedures to attempt to rescue the situation without worrying about hitting the kerb, another plane, etc. If my self-driving car is going to give me 3 minutes before the actual crash, then fine. Otherwise, it is less than useless to give control to a driver who likely doesn't have the correct situational awareness (and who might even have fallen asleep).
Even if the driver had not been sleeping, a driver's awareness is reduced because he doesn't have to process what is happening around him all the time the way he does when he is driving. So, for example, if the problem is that he is about to crash, unless he was hyper-vigilant, he is the worst person in the world to drop into the driving seat, so to speak.
Re: (Score:2)
Airline pilots != Joe Sixpack.
An airline pilot has a lot more time (in general) to move a plane, and they have had thousands of hours of training, and many more in the cockpit.
A driver, assuming he/she isn't drunk, stoned, texting, KO-ed on K2, or watching a movie, is not going to have the reaction time to realize the autonomous system just went TU, put down their tablet, set their beer down, and actually get through a dangerous situation.
IMHO, Google's philosophy is the best here. Treat autonomy as all or nothing.
Re: (Score:2)
Why not? That is the design philosophy for airliners made in the past 30 years and the pilots who operate them.
And, quite often, the pilot then crashes the plane, even though they have minutes to figure out the problem.
This design philosophy has been disastrous in aviation, where there are far fewer 'self-driving' vehicles and the people monitoring the computers are much better trained and have much more time to respond. It's not going to work at all with cars.
Re: (Score:2)
This is a particular problem with small, non-commercial planes. Modern airframes are very safe, and the cats majority of crashes are "controlled flight into terrain". Something goes wrong with the plane, something that's not an immediate risk, plenty of time to sort it out, but the pilot gets so distracted he forgets to fly the plane.
This isn't a material problem with US airliners, mostly because there are two pilots, so one can pay full attention to flying the plane while the other works on whatever went wrong.
Re: (Score:2)
s/cats/vast/
That was an odd auto-correct.
Re: (Score:2)
Go look objectively at the statistics for plane crashes since commercial flights began, then come back and say that with a straight face.
There are multiple cases of airliners following the 'it's OK, we don't have to be able to handle all conditions, we'll just hand control back to the pilot if we can't figure out what's going wrong' philosophy, and of the pilots crashing a perfectly good plane when it happens.
Which part of this is hard to understand? The pilots have been taken out of the loop, have no idea of what's going on, and no idea how to get out of it.
Re: (Score:2)
The part that you don't understand is that despite this horrible design decision, aircraft are much safer than they have ever been. Perfect, no. Safe, yes.
Re: (Score:2)
I think the argument is "by taking the pilots mostly out of the loop except in emergencies, we have greatly increased aircraft safety. By taking them out entirely, we'll increase it some more."
I suspect that is probably right; for every time that a pilot saved the day we can probably find several times that pilot error was the proximate cause (or the root cause) of the crash. But I'm happy to see evidence otherwise, and I realize that the low rate of crashes means that we don't have a large dataset to mine.
Re: (Score:2)
Why not? That is the design philosophy for airliners made in the past 30 years and the pilots who operate them. Sit and babysit the machine for 99.99% of the time; then jump in ready to go for the 0.01% of the time the situation is beyond the programming of the software. (In which case the software 1) does the wrong thing, or 2) just shuts down while displaying a message to the pilots to let them know that, suddenly, THEY are flying the plane.)
Because that approach has killed people in the past. vakuona mentioned a key example, flight AF447 whose autopilot, as I understand it, bailed out on its pilots once it had dumped them in a cluster of thunderstorms at high altitude, blind to everything including airspeed, and ready to stall at even the slightest deviation in pitch outside a narrow range.
Re: (Score:2)
>>Expecting a driver to take control in a failure scenario is not a solution. Why not? That is the design philosophy for airliners made in the past 30 years and the pilots who operate them. Sit and babysit the machine for 99.99% of the time; then jump in ready to go for the 0.01% of the time the situation is beyond the programming of the software. (In which case the software 1) does the wrong thing, or 2) just shuts down while displaying a message to the pilots to let them know that, suddenly, THEY are flying the plane.)
Problem is that for the Airbus planes where that is typical, the pilot also has to convince the computer to relinquish control by entering various codes, because the committee that designed it trusts the computer over the pilot.
That's not to say it doesn't happen too on Boeing aircraft where the pilot has the first right to control (even in fly-by-wire systems); but it's much easier for the pilot to take over to resolve the emergency.
(No, I'm
Re: (Score:2)
But even that doesn't always work all that well.
The Asiana 214 crash [al.com] in San Francisco in 2013 has been blamed to a large extent on an over-reliance on automation.
Stuck in traffic (Score:5, Interesting)
At first, I want autonomous driving for when I'm stuck in traffic. It should be able to handle that situation safely. Let's have that and then move on from there.
Re: (Score:2)
That would be awesome. Especially when nearly all cars have some rudimentary autonomous capability as well as the ability to communicate with each other. Then, when the light turns green, the entire fleet can move forward as one, rather than starting up with a traveling wave and wasting gobs of time. Intelligent intersections and cars will make urban travel far more efficient than the horrorshow one finds in some cities.
LA freeways are a good use-case for autonomous driving: generally slow-varying traffic.
Fahrvergnügen (Score:2)
There are philosophical differences in how people are developing self driving cars because there are philosophical differences in how & why people drive.
Some people want to 'skip the boring parts'. Uber & Google are trying to replace cars for people that really don't want a car. They don't want maintenance, a car payment, or to drive at all. They just want a magic transporter to get from A to B, scheduled from their phone. When I'm stuck in traffic or need to get home from the bar, I want to press auto and fall asleep.
Re: (Score:2)
They don't want maintenance, a car payment, to drive.
Sounds like a lot of computer users. But whenever a company makes a non-repairable cell phone, the internet has a meltdown. Two years ago this happened when a car company was going to produce a car with a sealed engine compartment.
Re: (Score:2)
At first, I want autonomous driving for when I'm stuck in traffic. It should be able to handle that situation safely.
Agreed...this seems like a situation where many of the out-of-band variables are constrained, especially in bumper-to-bumper type traffic. Cruising "free form" down a widely varying array of roads and intersections with all sorts of random traffic situations seems much more difficult to manage safely and consistently.
Here you go... (Score:2)
http://www.wired.co.uk/news/ar... [wired.co.uk]
Re: (Score:2)
Some may come pretty close. I drive a 2007 Volvo S80 with adaptive cruise control -- a radar panel keeps track of what's in front of you, and slows down to keep distance with the car in front and speeds up to the cruise set point if the car up front speeds up.
My model is good down to 20 mph, and at about 15 mph it will cut out. It will slow way down but not quite come to a complete stop (I've tested this exiting a freeway ramp). A rough sketch of the follow-distance logic follows below.
The last time I was at the dealership reading the BS about newer models, it sounds like
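The follow-distance behavior described above boils down to a few lines of logic. This is a guess at the behavior, not Volvo's actual algorithm; the 15 mph cutout comes from the description above, while the two-second time gap is invented:

    def acc_target_speed(set_speed_mph, own_speed_mph, lead_gap_s,
                         desired_gap_s=2.0, cutout_mph=15.0):
        # Returns the speed to command, or None when the system cuts out
        # at low speed and hands the car back to the driver.
        if own_speed_mph < cutout_mph:
            return None
        if lead_gap_s is None:  # radar sees no car ahead: cruise at the set point
            return set_speed_mph
        if lead_gap_s < desired_gap_s:  # too close: shed speed proportionally
            return own_speed_mph * (lead_gap_s / desired_gap_s)
        return set_speed_mph

    # One second behind the lead car at 65 mph: acc_target_speed(70, 65, 1.0) -> 32.5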
I think "well defined" piece-by-piece would be OK (Score:3)
I think that Google is correct in that you cannot expect a driver to be fully alert on a long trip all the time. On the other hand, a car that could handle the autobahn but not other roads (which could have pedestrians, horses, or even marching bands!) would be fine as long as it gave a countdown of warnings as it approached the exit, and the driver knew that they would have to take over once they reached the slip road.
Even here there should be some sort of fail-safe behaviour: if the driver does not acknowledge taking control, the car should park up, and possibly phone the emergency services (at some point someone will have a heart attack while being driven in an autonomous car).
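That countdown-then-fail-safe behaviour is essentially a small state machine. A sketch of one way it could work; the states, the tick-per-warning structure, and the emergency-call hook are all invented for illustration:

    from enum import Enum, auto

    class Mode(Enum):
        AUTONOMOUS = auto()   # self-driving on the approved stretch
        COUNTDOWN = auto()    # warning the driver ahead of the exit
        DRIVER = auto()       # driver acknowledged and took over
        PARKED = auto()       # fail-safe: parked up, help summoned

    def phone_emergency_services():
        print("No response from driver; parked up and calling for help.")

    def tick(mode, warnings_left, exit_near, driver_ack):
        # One step of the handoff logic; returns (new_mode, warnings_left).
        if mode is Mode.AUTONOMOUS and exit_near:
            return Mode.COUNTDOWN, warnings_left
        if mode is Mode.COUNTDOWN:
            if driver_ack:
                return Mode.DRIVER, warnings_left
            if warnings_left == 0:
                phone_emergency_services()  # the heart-attack case above
                return Mode.PARKED, 0
            return Mode.COUNTDOWN, warnings_left - 1  # issue the next warning
        return mode, warnings_left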
Re:I think "well defined" piece-by-piece would be OK (Score:5, Interesting)
at some point someone will have a heart attack while being driven in an autonomous car
There's a macabre thought.
So I'm driving to see the grandkids. I have a heart attack en route and die. And the car dutifully delivers my dead body to the grandkids.
Eww...
But 65 years ago... (Score:2)
Re: (Score:3)
I recall seeing an old illustration of a father and son playing chess while the car drives them to their destination on the freeway.
I recall seeing something similar in Popular Mechanics . . . except that the car was flying, not driving.
I guess our technological development took a wrong turn at Albuquerque somewhere.
Re: (Score:2)
I recall seeing an old illustration of a father and son playing chess while the car drives them to their destination on the freeway.
I recall seeing something similar in Popular Mechanics . . . except that the car was flying, not driving.
I guess our technological development took a wrong turn at Albuquerque somewhere.
Not a wrong turn. It was just bad prognostication. It's very hard to predict the difficulty of problems you don't yet thoroughly understand.
Re: (Score:2)
With some things (like flying cars), maybe. With other things (like space stations and moon bases), no.
The future depicted in 2001: A Space Odyssey was almost completely plausible (even in 2001, much less 2015), except for the bit about HAL being self-aware and all that. Videophones, rotating space stations with gravity, and a Moon base could all have been done by 2001 if there had been the will and the budget for it. We even had videoconferencing by then, but thanks to poor regulation there was no standard.
Re: (Score:2)
With some things (like flying cars), maybe. With other things (like space stations and moon bases), no.
Sure, some problems are easier to understand. Well, unless you include the larger social context, which is why 2001 didn't happen :-)
Re: (Score:2)
On Youtube, there's a video from the 50s about how you'll soon be able to get on the highway and your self-driving car will take you to your destination. Of course, they had 'road traffic control' towers controlling the cars, not computers.
Not going to happen (Score:2, Interesting)
Self-driving cars won't happen for quite some time in my estimation. With today's roads there are simply too many factors that the car won't know how to handle.
If roads were retro-fitted with some sort of standardized guide-wire or other tracking/placement beacons then I think it would be more likely to work, but between the many variations in roadways, weird intersections, roundabouts, ramps, etc etc, PLUS factoring in all the out-of-band stuff like pokey drivers, bicyclists, motorcycles, etc etc, I just s
Re:Not going to happen (Score:4, Insightful)
If you want to see where the state of the art is at (at Google, anyway), have a look at this video [youtube.com]. The first part is high-level info about Google's approach to the problem; the actual demonstration of the software starts at the 7m50s mark, and an example of the car dealing with something unexpected is at the 11m00 mark.
Note that in none of the examples has the roadway been pre-fitted with any kind of tracking or placement system, and that these aren't imaginary scenarios; rather these cars have been on the roadways, in real life, for years already. If the problems were really as unsolvable as you suggest, I'd expect we'd have read about a number of hilarious and/or tragic mishaps by now.
Re: (Score:2)
If the problems were really as unsolvable as you suggest, I'd expect we'd have read about a number of hilarious and/or tragic mishaps by now.
Yeah, and if overclocking our chips was dangerous, we wouldn't have been running them at 3GHz in the pre-release benchmarks (oops, we picked the chips that were capable of running at that speed, not the ones we'd sell to you for $200, but don't tell anyone).
How many of those cars were being driven in sunny weather by Google employees who were paying full attention to what's going on around them? And how many of those cars were being 'driven' by soccer moms talking on their cellphones in between yelling at the kids?
Re:Not going to happen (Score:5, Insightful)
That's not exactly correct... the Google cars have incredibly precise maps of the roads they're on, not just the route, but maps of the actual surface of the road (e.g., where the potholes are). That level of detail available to the onboard computers is pretty much the same as having sensors on the road. It requires an incredible amount of prep work. Of course, map updates could be handled by sensors on other cars constantly providing real time information. It's a cool approach, but only practical when you have that level of detail available.
Google, et al, are showing very controlled research projects. Even though they're testing in the real world, they're still highly controlled experiments.
Sure, many of the problems are resolvable using this approach, but what we don't know is what new problems will evolve once there are more than a handful of self driving cars on the road. More research will help identify these, but anyone who's done real science or engineering knows that what works at small scale rarely scales as you would hope/expect.
-Chris
Re: (Score:2)
Also, no current automated car (including Google's) works at all when the roads are covered by snow, and large numbers of people live in areas that have at least some snow during the year. Purely automated cars are not practical in most of the US and Canada until they can handle that scenario.
The gradualist approach will prevail (Score:3)
Business and practical considerations mean that BMW's and other automakers' gradual approach to automation will prevail. A gradual approach allows automakers to recoup their investment immediately, and allows them to fine-tune their technology as each aspect of automation is rolled out. It is also important to note that regulatory agencies will react and set the rules for new autonomous vehicles based on the technology that is on the road, so those carmakers rolling out the technology first - those with a gradual approach - will have a greater input on the regulatory nature of that technology. The risk for Google is that, as the other automakers end up defining that regulatory environment, Google's technology will be obsolete from a regulatory standpoint by the time it is rolled out.
As much as I like Google's approach from an engineering standpoint, the truth is Google is already being left behind in autonomous car technology. Other automakers are already introducing various aspects of self-driving - from automatic emergency braking, to lane assist, to adaptive cruise control - so that by the time Google has its self-driving vehicles ready for the market, the major automakers will already have a road-tested, established and entrenched set of technology they're working with.
Re: (Score:2)
As much as I like Google's approach from an engineering standpoint, the truth is Google is already being left behind in autonomous car technology. Other automakers are already introducing various aspects of self-driving - from automatic emergency braking, to lane assist, to adaptive cruise control
You failed to describe anywhere Google is actually being left behind. Their systems obviously do all of those things.
Sometimes completely self driving (Score:3)
We need completely self-driving cars. But we don't need them to drive everywhere. Handling suburban, city, freeway, and highway driving all at once is hard to program. If we just focused on 100% freeway driving, I think that would be much easier. We could have a self-driving car that drives on the freeway autonomously, but gives ample warning before an exit ramp, where it expects the user to take over at the first stop.
Recommended suggestion (Score:2)
It appears obvious that human nature will put too much trust in the car. So, let's not let all humans operate self-driving vehicles for now. Let's say we instead begin with a very limited license that can only be obtained by specially-trained drivers familiar with expectations for device operation and manual override. Find a fleet of taxi drivers in a municipality, for example, or perhaps some transport vehicles that just bus passengers between an airport and hotels. Beta test car operations to d
Re: (Score:2)
Expecting new infrastructure (at least in the USA) is even more unrealistic than waiting for autonomous cars to become fully self sufficient.
Fundamental difference in trust between industries (Score:2)
Automation Paradox (Score:2)
Over the summer, 99% Invisible [99percentinvisible.org] and NPR's Planet Money [npr.org] put out several podcasts ([1 [npr.org]], [2 [99percentinvisible.org]], [3 [99percentinvisible.org]]) on the automation paradox, and the Google car is front and center. So is Air France Flight 447 [wikipedia.org], which shows what happens when automation fails and humans can't properly respond.
Re: (Score:2)
Automatic transmissions are an example of what is to come.
Theoretically, automatic-transmission drivers should be safer. They have one less thing to worry about, and all the 'enthusiastic', Stig-ish drivers have real transmissions.
In reality if you want to identify a terrible driver you ask the question 'can you drive stick'. If the answer is no they are guaranteed to be rolling hazards. They simply pay less attention.
Re: (Score:2)
As the 99% Invisible article points out, the net effect of automation is better overall safety. The knowledge that is lost because people no longer know the details of how to operate a manual elevator, or airplane, or car, is more than made up for by the relative reliability of the automation systems that replaced the manual processes. Yes, it might be true that people no longer know the finer points of making horseshoes, but who cares?
Yes, in the case of airline or train accidents, people can die.
Aviation industry does it incrementally (Score:2)
And requires constant monitoring from the human in charge.
The difference is ... (Score:2)
In Europe, they don't have any problem with telling incompetent drivers that it's time to hand over the license and start taking the bus. Not so in the USA, where punching the wrong pedal and ramming the Cadillac through the coffee shop is not sufficient reason to hand over the license.
Re: (Score:2)
No. My German Aunt is still, sort of, driving.
She has been a hazard for at least 20 years. Stops in the middle of a left turn to argue with her sister about where they are going.
Took years and many tries to get her license in the first place. As bad a driver as any American I've ever met.
Re: (Score:2)
These are the kinds of people who still have 12:00 flashing on their VCR. That's how dumb and foolproof the technology needs to be. Otherwise it's just an expensive gismo.
Distracted driving does not need Google for proof. (Score:2)
"Google's tests have taught it otherwise — even after being told it's a prototype, new drivers immediately place a lot more trust in the car than they should. They turn their attention away and stop looking at the road for much longer than is safe."
Statistically, someone will likely die today as a result of distracted driving, so I'm not sure how this isn't already true today. It sadly is.
On a side note, what exactly did Google expect? People want to text, email, surf, sleep, eat, put on make-up, do just about any damn thing except actually pay attention and drive when behind the wheel today. Of course the consumer is looking towards automation for them to be able to legally engage in just about any activity other than paying attention to the road,
Re: (Score:2)
I think that is exactly what Google expected, and now they have the data points to prove it. Hence they are advocating the all-in approach. Of course, the ultimate goal is that all bad/drunk/distracted drivers are removed from the roads because no humans are driving. Once that happens, perhaps riding in an automobile would be statistically safer than, say, working out.
Statistically speaking, I suppose you envision such a system to be 100% secure and impervious to hacking or corruption as well.
Today, a networked computer is insecure. Assuming anything otherwise is and has been a costly mistake.
Tie that networked computer to the object hurtling your mass of flesh and bones down a freeway at 80 MPH, and that same system becomes not just insecure, but downright deadly.
Re:Black Boxes??? (Score:5, Insightful)
Basically every car since 2010 stores at least 30 seconds of logging data before (and if possible after) any collision which triggers the airbag. That includes the complete engine state and all the inputs like accelerator position and brake light switch status. What they're talking about now is just standardizing what is already there.
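That kind of logging is naturally a ring buffer that gets frozen when the airbag fires. A toy sketch of the idea; the sample rate and the shape of the state records are assumptions, not any manufacturer's actual format:

    from collections import deque

    SAMPLE_HZ = 10   # assumed sampling rate
    WINDOW_S = 30    # the pre-crash window described above

    class CrashRecorder:
        def __init__(self):
            # Oldest samples fall off the back automatically.
            self.ring = deque(maxlen=SAMPLE_HZ * WINDOW_S)
            self.frozen = None

        def sample(self, state):
            # Fed continuously with engine state, accelerator position,
            # brake light switch status, etc.
            if self.frozen is None:
                self.ring.append(state)

        def on_airbag_deploy(self):
            # Freeze the buffer so investigators can read the window back.
            self.frozen = list(self.ring)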
Re: (Score:3)
Basically every car since 2010 stores at least 30 seconds of logging data before (and if possible after) any collision which triggers the airbag. That includes the complete engine state and all the inputs like accelerator position and brake light switch status. What they're talking about now is just standardizing what is already there.
True, but what's there is not presently officially there. The manufacturers will use the data but they won't release it; and they have typically actively denied its existence. So TFA is also about making sure a black box is officially there, with known capabilities, for actual investigative use.
Re: (Score:2)
True, but what's there is not presently officially there. The manufacturers will use the data but they won't release it; and they have typically actively denied its existence.
Yeah, NHTSA is done taking that kind of guff. They're planning to double (or maybe triple) their staff shortly. The automakers are going to find it harder and harder to hide things, and NHTSA is going to get more and more involved in their code audits and whatnot.
Re: (Score:2)
True, but what's there is not presently officially there. The manufacturers will use the data but they won't release it; and they have typically actively denied its existence.
Yeah, NHTSA is done taking that kind of guff. They're planning to double (or maybe triple) their staff shortly. The automakers are going to find it harder and harder to hide things, and NHTSA is going to get more and more involved in their code audits and whatnot.
They presently don't want to release the unofficial black box data, claiming all kinds of things - including that the information in there (if there is one) is only for their internal use to help make things safer. Further, they fight it (and deny the existence) because they don't want the liability that could come with it, since the data could potentially show that the fault was theirs.
So if forced to have one (which will probably happen) I'd imagine that manufacturers will probably end up with two - one official and one for their own internal use.
Re: (Score:2)
Well, then..thankfully for me, my car is older than 2010...but still I need to research further to make sure.
Re: (Score:2)
Well, then..thankfully for me, my car is older than 2010...but still I need to research further to make sure.
If your car has airbags, then it likely does this kind of logging. And frankly, the kind of complexity they had to add in 1996 to support OBD-II means that most vehicles from then on are capable of storing that kind of data. They will do that if any fault is detected, too, not just if an airbag is fired.
Re:Black Boxes??? (Score:5, Insightful)
What makes you think that you have some right of privacy in a crash? It is basically a crime scene. All a black box in a car does is stop some lying bad driver getting away with quite possibly murder.
If you want to drive around on public roads like a crazed lunatic then I have a right to be able to prove in the event of an accident that it's your fucking fault.
After nearly being killed last month by some stupid fucker who decided to overtake when there was no space, which led to me screeching to an emergency stop and being missed by the moron by less space than I care to remember, I have ordered a camera system to fit, so at least the fucker would be facing charges for dangerous driving now.
Re: (Score:2)
Actually, yes. Between myself and my lawyer. And I should not have my property testifying against me.
Re:Black Boxes??? (Score:4, Insightful)
So, that blood on your knife is private information and not subject to search warrant, etc.?
Re: (Score:3)
Really I just don't want to be tracked any more!!! Fitbit to see how I live, black boxes to see my driving habits, using my cell phone to track my movements.
Enough...FUCKING ENOUGH!!
Your angst is a bit late. That telemetry beacon has left the barn. It was pretty much inevitable that as soon as people could create systems that tracked everyone and everything, they would. Thus is the way of man. (Notice I said 'man'; if women ran the world, it might well have been different.) The best we can do is limit the damage, and use the rule of law to keep incursions out of the areas that really need to be private.
And goodluckwiththat.
Re: (Score:2)
And how does all of this tracking make you feel?
It's not tracking. It just isn't. (And I suspect you know that, else you'd have not posted as an AC...it's not like being anti-tracking will get you modded down on Slashdot, after all...)
It's like a flight recorder, so that data on the state of the car in the last moments immediately before a crash are available for analysis. I'm fine with this personally, since it's something that's fair and objective.
If, for example, some guy cuts me off and then slams on his brakes suddenly...causing me to hit him...u
Re: (Score:2)
But I should be able to choose if I "want" a flight..err...driving recorder black box type machine installed in my car. And as for the OnStar, Uconnect....next car I get if it has this, it will be disconnected, and rendered unable to be accessed externally, at least wirelessly.
Re: (Score:2)
But I should be able to choose if I "want" a flight..err...driving recorder black box type machine installed in my car.
Not if you're going to use your car on a shared, publicly-owned road. I think black box recording should be mandatory on the public roads.
Now, if you're talking about a car only used on roads you own, have at it.
Re: (Score:2)
This shared open roads argument is getting ridiculous. What's next? Mandatory silencing of radio/music when on the "shared roads"? Reporting of distracting kids in the car or yelling at each other? I mean, you think this is going to a ridiculous level, but who a few years ago, would have suspected a fucking black box tracking all your driving? Insurances with gadgets (
Re:Black Boxes??? (Score:5, Insightful)
We've had it just fine on the shared publicly owned roads for decades now, without having this type of intrusive, electronic surveillance and got along perfectly good.
I don't know what you call "perfectly good", but over 30,000 people killed a year in cars doesn't meet my definition of perfectly good.
Well, you just get kids used to surveillance and they then accept it for normal and "good".
I don't support surveillance, which I think of as the ability to be monitored in real-time. I support reporting, where you plug into a box that can't be accessed without physical, interior access to the car.
Screw it, I'm gonna buy an old 60's-70's muscle car, with no computer and no tracking..and hell, if old enough, no fucking emissions bullshit.
We're coming for those, too. I expect that at some point, things like Interstate highways will be restricted for automatic driving only.
You don't like it, build your own roads. The public roads are for public use, and we can & have constantly redefined how they can be used.
Re: (Score:2)
Oh, it absolutely is tracking. If I am not in control of something I own divulging information, without me being able to allow or deny it, then it is tracking, even if it's just a speed history. People's things should not be able to be used against them (or even for them) without them allowing it to happen.
Re: (Score:2, Interesting)
The law about challenging speeding ticket could even be altered to require that a defendant provide a copy of the "black box" data (or the digital camera footage). Then my dad's technique wouldn't work any longer. Or, the police could just do a standard data transfer at the time when the car gets pulled over.
I seriously question whether police departments and local municipalities will even allow self-driving cars on roads. They threaten to completely ruin their source of funding: tickets. Why would some
Conflict of interest (Score:2)
I seriously question whether police departments and local municipalities will even allow self-driving cars on roads. They threaten to completely ruin their source of funding: tickets.
Good! I've always found it reprehensible that governments actually base budgets on the number of people they can catch breaking the law. I don't have a problem with using fines as punishment but the government entity issuing the fine should not be the beneficiary. Huge conflict of interest there.
Re:Well, duh (Score:5, Insightful)
There's nothing that could ever satisfy that test. A kid who runs into the road is going to get hit by a car in ordinary circumstances. If he's incredibly lucky or the driver is heroically responsive and saves the day, the kid won't get hit. No system, human or otherwise, can ever be created to solve this situation satisfactorily. The best hope is probably automatic braking.
Re: (Score:2)
If trained airline pilots can crash into the sea two minutes after their 'self-driving plane' hands control back to them, a driver has no chance of missing that kid when he runs out into the road and their 'self-driving car' hands control back to them to avoid legal liability.
FWIW, as I understand it Google's position on legal liability is that it's on the maker of the driving system, so the car wouldn't hand control back for liability reasons. In that particular situation, the self-driving car is almost certainly going to be better able to avoid hitting the kid than a human driver could, even without the handoff delay. That doesn't invalidate your point, though. The example is a bad one, but the general notion that the car cannot rely on the driver to quickly handle things it can't handle itself still stands.