How Autonomous Cars' Safety Features Clash With Normal Driving 451
An anonymous reader writes: Google's autonomous cars have a very good safety record so far — the accidents they've been involved in weren't the software's fault. But that doesn't mean the cars are blending seamlessly into traffic. A NY Times article explains how doing the safest thing sometimes means doing something entirely unexpected to real, human drivers — which itself can lead to dangerous situations. "One Google car, in a test in 2009, couldn't get through a four-way stop because its sensors kept waiting for other (human) drivers to stop completely and let it go. The human drivers kept inching forward, looking for the advantage — paralyzing Google's robot." There are also situations in which the software's behavior may be so incomprehensible to human passengers that they end up turning it off. "In one maneuver, it swerved sharply in a residential neighborhood to avoid a car that was poorly parked, so much so that the Google sensors couldn't tell if it might pull into traffic."
Best solution: (Score:3, Insightful)
Ban human drivers.
Re:Best solution: (Score:5, Insightful)
No surprise this got modded up insightful. I'm betting many slashdotters are horrible technologists who assume the world needs to bend to technology. Simply put, that's not the case. If these cars can't handle driving around humans, they are not ready for consumption. The fact that they can't properly work with and adapt to humans on the road means that these cars are unsafe. They may be "safe" by the letter of the law, but they are not safe if they are causing or instigating traffic accidents. It seems to be blind luck that these cars haven't been the clear-cut cause of an accident yet.
Re:Best solution: (Score:5, Insightful)
There have been lots of discussions on attempting to change driver behavior. Those are also nonstarters. People are not going to change how they drive until conditions in the field force them to do so. Hell, we still have idiots driving below the speed limit in the left lane on busy freeways, where they're actually posing a safety hazard and where the law actually states that one can be cited for failing to yield and being passed on the right. Most people probably don't even know the rules for what's defined as stopping (i.e., remaining still for two seconds where I live) and have no interest in bothering to learn, and the police don't seem inclined to enforce it either, so this simply won't change.
The cars are going to have to learn how to adapt to these conditions.
Re: (Score:3)
Re: (Score:3)
Re: (Score:2)
And deer. (Score:2)
And deer. And snow too. Also road construction.
Re:Best solution: (Score:4, Informative)
Poor example (Score:5, Insightful)
"One Google car, in a test in 2009,..."
One would think that in 6 years some improvements would have been made. Do we have a more current example?
Re: (Score:2)
Re:Poor example (Score:5, Insightful)
It will always be a challenge to have these control systems anticipate what human drivers intend to do.
This is complicated by the fact that some human drivers do not even know, themselves, what they intend to do. So how should a computer control system be able to anticipate what a human driver intends to do, when the human drivers don't even know themselves?
I really don't think it is that many... maybe only 1% of all human drivers. However, one clueless driver can confuse and tie up 99 drivers who know where they want to go and can communicate it to other drivers.
It's like being on an escalator at the airport or train station. Two folks don't know where they are going. So they stop dead in their tracks at the end of the escalator, blocking the path for all the other folks on the escalator. An accordion effect ensues, with all the folks on the escalator getting squished together. The two people doing the blocking are totally oblivious to this fact. Their field of vision ends at their own noses. They are entirely absorbed in themselves, and can't even conceive that there are other living beings around them.
This is what happens on the road, as well. The driver of the car parked halfway into the street is just not capable of realizing that other drivers might be confused by this. Is the car really parked? Or is the driver trying to park? Or maybe trying to drive away...? At any rate, some drivers need to be taught that it is terribly important to anticipate how others might interpret their actions.
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Wait, are you claiming to live in a place where school speed zones have the ability to be deactivated, and yet you must obey the lower limit even when they are deactivated? Do I have that right?
What, then, does it mean for the zone to be "deactivated" in your country?
Here in the USA, our signs say "25 MPH WHEN CHILDREN PRESENT" or "25 MPH WHEN LIGHT FLASHING". If children aren't present, or the light isn't flashing, then the limit does not apply. People who drive slowly through those zones anyway are c
Re: (Score:3)
"And here, travelling more than 15kmph under the speed limit is ticketable too."
Yeah, technically that can happen in the United States, but it is very rare. And the problem is that speed limits themselves are stupidly low, so if the highway says 55 MPH then a person would have to be going 40 on a damn highway where everyone else is going 85 to get a ticket. The result is that we have these few drivers who are creating very unsafe highway conditions but never get ticketed for it.
What I want is simply e
Re: (Score:3)
Not necessarily useless.
In the school zone, let's say 80% of the kids need to cross the road. They then spread out from there, and just the next road over maybe only 20% of them need to cross, if they aren't staying in the school zone to wait for a bus at which point perhaps they won't be crossing the road at all.
The school zone says that a LOT of children are likely to be present, and extra attention and lower speed becomes required.
Re:Poor example (Score:5, Interesting)
That is a well known problem in architectural design. Give a clear path for people to exit and clear escalators and the like, and place directional signs where people have space to stop to read them. There's an art to finding a way to entice people away from stairs and escalators after they exit them.
Re: (Score:3)
I lived in Boston for a couple years. A few weeks before I moved out I was driving on a surface street and the car in front of me signaled his upcoming turn. I swear this is true; in my brain, my inner monologue went like this:
"What the hell is that weird flashing orange light on that car?"
"Is it a hazard light, but the other bulb is burned out?"
"Maybe I should slow down, something could be seriously wrong."
"You idiot, that's a turn signal."
"Oh, gosh, you're right. I haven't seen one of those in a long time."
Re:Poor example (Score:5, Interesting)
"One Google car, in a test in 2009,..."
One would think that in 6 years some improvements would have been made. Do we have a more current example?
It mentions further down in the article that that particular example has already been corrected.
... For instance, at four-way stops, the program lets the car inch forward, as the rest of us might, asserting its turn while looking for signs that it is being allowed to go.
Re:Poor example (Score:5, Interesting)
Re: (Score:2)
So the earlier example with a car doing much the same has been corrected. Now they have data that shows a bike can do something similar at an intersection. I imagine it will be pretty trivial to produce code that lets the car progress through the intersection slowly, while watching the bike to make sure it stays within a box.
One thing's for sure, a car that refuses to go until the cyclist stops moving or takes their turn is hardly creating a dangerous situation.
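That "watch the bike to make sure it stays within a box" check really is only a few lines. A minimal sketch, with hypothetical names and geometry (nothing from Google's actual stack):

```python
# Sketch: creep through the intersection only while a tracked cyclist's
# observed positions all stay inside a fixed bounding box.
def cyclist_contained(track, box):
    """True if every observed (x, y) position lies inside
    box = (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = box
    return all(xmin <= x <= xmax and ymin <= y <= ymax for x, y in track)

def creep_decision(track, box):
    """Creep forward while the cyclist holds position; otherwise yield."""
    return "creep" if cyclist_contained(track, box) else "yield"
```

A track-standing cyclist wobbling near the origin stays inside a small box and the car keeps creeping; the moment a position falls outside, the decision flips to yield.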
It does raise another interesting point though
Re: (Score:3)
Re: (Score:2)
If the roads are very similar in traffic volume, use a roundabout.
I've seen places where they put in roundabouts at existing road intersections, and the problem is that the roundabouts, despite taking about 20 times more space than a normal intersection, were still far too small for traffic to go around safely.
Re: (Score:3)
Putting in a roundabout uses a lot more space for intersections that rarely have more than 2 cars meet. It also is a lot more costly than just adding 4 signs.
Uh, you do realize that, in situations like that in Europe, they just paint a circle on the road and put up roundabout signs, right?
Re:Poor example (Score:5, Funny)
It's not a traffic circle without a million dollar island complete with shrubbery.
Re: (Score:3)
Arizona is replacing its 4-way stops and light-use intersections with "modern roundabouts" that do not use any more land than is already dedicated. This saves energy and keeps residential neighborhoods quieter.
Re: (Score:3)
Interesting, thank you. Here's more information. [azdot.gov]
Re: (Score:3)
Re:Poor example (Score:4, Insightful)
To be fair, if some idiot cyclist was going back and forth at an intersection, I would hesitate to drive as well. Of course, as a human I would quickly lose patience and just start driving on the assumption that he would stop.
Well, I haven't really met up with a cyclist doing a track stand. In 99.9% of cases, the cyclist just blows through the intersection. The other 1 out of 1,000 times, the cyclist will do circles or figure eights.
Either of those cases will likely confuse the software. It certainly confuses regular drivers, and pisses them off.
Re: (Score:3)
Well, I haven't really met up with a cyclist doing a track stand. In 99.9% of cases, the cyclist just blows through the intersection. The other 1 out of 1,000 times, the cyclist will do circles or figure eights.
Either of those cases will likely confuse the software. It certainly confuses regular drivers, and pisses them off.
You forgot about the pedestrians in the crosswalk that the cyclist came within 6 inches of hitting. It pisses them off, too.
Re:Poor example (Score:5, Insightful)
Re: (Score:3)
It is still not a big deal. Especially when using clipless pedals it is far safer to clip out and stand on one leg than trying to keep balance, because if anything goes wrong you'll fall on the side and probably won't be able to clip out during the fall.
Re: (Score:3)
And he's not "going back and forth" but rather "rocking back and forth" slightly to help stay upright.
I'm not sure about the US, but here in Europe the rules of the road clearly say that cyclists have to put down their foot on the road at a stop sign, or else it is considered as a rolling stop. So even if the guy is talented enough to stop with his feet in the air, it would still be against the rules.
Re:Poor example (Score:5, Informative)
The NYT hates new car tech, especially EVs and robots. It seems to be in the pocket of some big vested interests (oil presumably, maybe other auto manufacturers who are falling behind). Remember the infamous Tesla Model S review by that Broder guy, where he did everything in his power to make it fail, exceeding the speed limit and slow-cooking himself with the heater, etc.
This is just another hit-piece against autonomous cars. It might even be out to trash Tesla again, since they are introducing autopilot.
Re: (Score:2)
Do we have a more current example?
That would kinda mess up the story now wouldn't it?
This is just a techno FUD story for people who can't stand Google or self driving vehicles to point to and yell "See, SEE? I told you these things will never work!"
That they have to point to 6-year-old data is sort of telling.
In other news (Score:5, Insightful)
Millions of people on the road today deserve to have their license taken from them because they can't follow simple rules like signaling, not parking halfway out into the street and leaving enough room to brake in case the car in front of you brakes.
Re: (Score:2, Interesting)
Millions of people on the road today deserve to have their license taken from them because they can't follow simple rules like signaling, not parking halfway out into the street and leaving enough room to brake in case the car in front of you brakes.
Due to this - I am a pedestrian - I'd much rather have self-driving cars.
Re: In other news (Score:2, Funny)
I'd think, as a pedestrian, you'd prefer to have self-walking shoes.
Re: (Score:3)
"I'd think, as a pedestrian, you'd prefer to have self-walking shoes."
Ask, and ye shall receive:
http://fortune.com/2015/08/07/... [fortune.com]
Re:In other news (Score:5, Insightful)
Or speeding in residential areas. Those people are the scum of the earth.
Re:In other news (Score:5)
Re: (Score:2)
We just passed a law basically saying "keep right, except to pass". Judging by comments on news articles about the law, this concept is utterly incomprehensible for a large number of drivers. Any number of folks seemed to think that as long as they were doing the speed limit, this rule didn't apply to them, and that it was ok to stay in the left lane all day.
Re: (Score:3)
This is already law in the state I live in (Washington), and has been since before I got my license. It is *almost* never enforced.
With that said, there was a time at a past employer where somebody posted mail to a large social (voluntary inclusion, not-work-related) internal mailing list asking for advice on how to get out of a ticket he'd gotten for holding up traffic in the left lane. The typical response ran something like this:
Re:In other news (Score:5, Informative)
Not your job to prevent others from speeding. In fact, in most cases the law is written forcing you to yield the left lane to the speeder, as they are traveling faster than you.
It is more dangerous for you to go slowly in the left lane and force the speeders to pass you on the right than it is for you to just get the hell over.
Re: (Score:3)
Not sure where you live, but in my state, the left lane is for passing. If you linger there while not passing or turning, you are technically violating the law. Here's a map: http://jalopnik.com/5501615/le... [jalopnik.com]
Many other iterations of the law specify that you should not block the "normal flow of traffic", specifically distinct from the "speed limit".
Speeding in a residential area can be more dangerous, but you're still often in the wrong if you're doing exactly the speed limit in the left lane.
Re: (Score:3)
Not my job to accommodate them.
True, your job may be to be a perfect asshole and impede others. But legally, and common sense-wise, and just as common courtesy, get over and drive properly.
Re: (Score:3)
No, somebody who doesn't enable you to break the law even harder than before isn't lower than somebody who endangers the lives of children. Entitled assholes like you are actually the cause of most problems.
Nice strawman. You assume the person being blocked is breaking the law. That is not necessarily the case. The person being blocked may be traveling at or under the speed limit and legitimately trying to pass traffic going even slower. Meanwhile, the person doing the blocking is absolutely for certain breaking the law because it is illegal to drive in the passing lane unless you are passing somebody. If somebody is able to pass you on the right, then you are not supposed to be in that lane.
Re: (Score:3)
Entitled assholes like you are actually the cause of most problems.
Sounds like you are one of those idiots who does not understand why passing on the left and keeping right helps everyone, and who feels 'entitled' to slow down others and impede traffic by lagging in the left lane.
Re: (Score:2)
Re: (Score:2)
Very pokey and slow (Score:2, Interesting)
From what I have read from people who have actually interacted with an autonomous car, they are very pokey and slow, and tend to pause while trying to figure out what to do.
In fact, many times it's the required human driver who has to intervene to help the car out of a jam. I think the more we try to mix these auto-driven vehicles with human ones, the more we will experience the growing pains of this technology.
Re: (Score:2)
I see this autonomy being helpful, like an autopilot control. I may be the one in control as I enter the highway, get to speed and merge, but once I'm in a lane, I may switch over to the autonomous system.
culture dependent (Score:5, Interesting)
“They have to learn to be aggressive in the right amount, and the right amount depends on the culture.”
Very true. When holidaying in Texas, I quickly found out that stopping for a red light that had just turned would upset drivers behind me. The lights had a much longer amber time, so a whole lot of people who would have had to brake for the lights in the UK would go through.
Re:culture dependent (Score:5, Interesting)
Depends on the location. In my country, a yellow light means you can legally drive out of an intersection (for example, if you are turning left), but not into it, unless you need to brake suddenly, in which case you can go.
Now, "brake suddenly" is a subjective thing. If I see the light some distance away and do not need to slam on the brakes to stop, then I stop. It may cause an inattentive driver to hit me from behind (happened recently when I stopped to allow a pedestrian to cross as required by law).
However, it is also the law to leave a safe distance between you and the car in front so you can stop without hitting it if it suddenly stops. If you hit another car from behind, you will almost always be found guilty (pretty much the only hope for you is for the other driver to be drunk - drunk drivers are always held at fault for an accident even if they did not cause it - this is done to discourage people from driving drunk).
Re:culture dependent (Score:5, Informative)
The other get-out for hitting the car in front is if that driver pulled into your safe braking distance.
So if some idiot pulls out of a junction without looking and you go into the back of them as a result, it is not your fault. Another is someone overtaking, pulling in, and then slamming the brakes on (that last one is often done as part of an insurance fraud).
There are a whole bunch of others as well, though they can be hard to prove if you don't have a dashboard camera.
Re: (Score:3)
Yes, but because they are very difficult to prove without a dashcam, if you do not have a dashcam, you'd better hope the other guy is drunk or is honest and accepts the responsibility.
Re:culture dependent (Score:5, Informative)
Amber means stop if you can do so safely:
https://www.gov.uk/government/... [www.gov.uk]
It does not mean try to squeeze through because you think you have time. I know people who've had tickets for running the amber, despite being across the line before it went red.
Re: (Score:2)
In the USA, it means nothing more than the light is about to change to red (CVC 21452 [ca.gov]).
Re: (Score:2)
Yes, you are supposed to stop safely when the light goes amber, unless it is unsafe or impossible to do so before the stop line.
And this is one of the areas where autonomous vehicles will struggle, just as with the speed limit. Because it is very hard to justify that an autonomous vehicle should speed or run red lights, like most people do. It is the difference between a decision on policy and a decision on an individual case.
I for one am curious how this is going to be solved. I think it will also reveal that foll
Re: (Score:2)
PS: Unsafe is usually interpreted to mean using an acceleration that may not be available to all vehicles. Obviously a sports car on special tyres can stop a lot faster than an old banger or a truck, and therefore you should never use maximum deceleration (emergency stop).
Whether the driver behind you is paying attention is not your problem. Anybody not paying attention at a traffic light should not be on the road.
Safe driving (Score:2)
Unsafe is usually interpreted to mean using an acceleration that may not be available to all vehicles.
No it isn't. Unsafe is operation of a vehicle in a manner that violates duty of care [wikipedia.org]. You can safely accelerate faster in a Corvette in many circumstances than the maximum acceleration of a Nissan Versa. Whether it is unsafe will depend on factors including road conditions, visibility, nearby traffic, nearby pedestrians, driver skill, vehicle capability, etc. There is no requirement to only accelerate as slowly as the slowest vehicle.
Obviously a sports car on special tyres can stop a lot faster than an old banger or a truck, and therefore you should never use maximum deceleration (emergency stop).
You use emergency stops in an emergency. It is the responsibility o
Re: (Score:2)
Depends entirely on the location. In many (the UK, and Texas for example, which seem relevant to the example), the rule is that you must stop if it's safe to do so. If it's not safe, you may cautiously proceed through the junction.
So in this case, it seems like it was completely safe to stop, and therefore, yes, it effectively was a stop light.
Re: culture dependent (Score:5, Funny)
I'm color blind, you insensitive clod. It's all fucking middle grey to me.
Not normal driving. (Score:4, Informative)
They are confused by BAD driving. People in general really really suck at driving and a computer will have problems with that.
Re:Not normal driving. (Score:5, Informative)
Which is always going to be the problem ... because as long as there are human drivers on the road, there will always be cases in which the computer utterly fails.
And any technology future which is predicated on suddenly replacing all drivers with autonomous cars is complete crap and will never actually happen. Because nobody is going to pay for it.
It's the corner cases which will always cause these things to go wrong. And, I'm sorry, but the driver with his right turn signal on who swoops across two lanes and turns left ... or the ones who think they can use the oncoming lane because there's something in their lane ... or who randomly brake because they can see a cat a half mile away ... or cyclists who do crazy and random shit ... or any number of crazy things you can see on a daily basis ... all of these things will create situations in which the autonomous car utterly fails to do the right thing.
As much as people think it will mostly work most of the time, if these things require the driver to constantly monitor it or have to swoop in when the system decides it doesn't know what to do, then the utility of the autonomous car pretty much vanishes.
I just don't see this technology ever becoming widespread or used in the real world, other than by companies trying to prove how awesome it is. Because it's just going to have too many cases which simply don't work, and the occupants will have to be ready to take the controls.
In which case you might as well be driving and actively engaged in the process instead of zoned out and not paying attention. Because human reaction time is greatly diminished when you're reading the newspaper and suddenly have to take evasive action.
Programmed behaviour is programmed behaviour. (Score:5, Interesting)
If the programming makes it jerk the steering away from a stationary hazard rather than, say, detect it earlier and slow down as it approaches, then it's not suitably programmed for coexistence with unexpected stationary hazards (Not even anything to do with human presence! What if that was a cardboard box and it swerved heavily in case that box "pulled out"?).
If it can't make its way through a junction where the drivers are following the rules, that's bad programming. If it can't make its way through a junction where other drivers don't come to a complete halt for it, it's not fit to be on the road with other drivers.
If you want a car to co-exist on the road, it has to be treated as a learner driver. If a learner driver swerved at a non-hazard, they would fail. If a learner driver refused to make progress at a junction because the masses didn't open up before it, they would fail. So should an automated car.
Unless - and this is important - you are saying that automated cars should only operate on automated roads where such hazards should never be possible and they are deliberately NOT programmed to take account of such things. Which, in itself, is expensive (separate roads with separate rules with no human drivers), stupid (that's otherwise known as a "train line", and because they can't do anything about it it will hurt more when it does happen), and dangerous (because what happens if a cardboard box blows over the automated road? etc.).
Program to take account of these things, or don't plan on driving on the road. The safety record is exemplary, but equally there are only a handful of them and the eyes of the world are on them, and there are still humans behind the wheel, and even by miles travelled each one is probably dwarfed by a single long-distance driver over the course of a year - and it's not hard to find a long-distance driver who's not had an accident for years.
If you're going to be on the roads, then you need to be able to take account of all these things, the same as any learner driver. Sure, you didn't hurt anyone by swerving or not pulling out, but equally - in the wording of my first driving test failure - you have "failed to make adequate progress" while driving.
A car sitting on a driveway would have an even better safety record but, in real life, it's still bog-useless compared to a human. Similarly for any automated vehicle that just stops at a junction because it can't pull out, or swerves out of the way of a non-hazard (and potentially weighs up collision with non-hazard vs collision with small child and gets it wrong).
Re:Programmed behaviour is programmed behaviour. (Score:5, Interesting)
The problem is that people don't follow rules. We follow approximations of the rules. For instance, my driver's handbook described the correct way to deal with yielding at a four-way stop as "yield to the person on the right." For a computer, that's an obvious deadlock situation, or worse - an obvious mistake. If four cars are parked at a four way stop, and each car yields to the car on the right, then (a) a situation could occur where no one goes anywhere, and (b) if the individual cars only pay attention to the person on the right, then they could hit an on-coming car turning left, or the car on the left turning left. People process the "yield to the person on the right" rule into something much more complex.
People use a number of complex behaviours at four-way stops. Firstly, the wave of the hand, or the nod of the head to indicate that you yield to the other driver is an important signal. Secondly, in my jurisdiction, 90% of the four way stops are done on a first-come first-served basis. Lastly, and this is the bit I don't understand, often people yield to the person on the left. The actual system of navigating a four-way stop is much more complex than what an initial computer implementation might be.
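The deadlock in the naive "yield to the person on the right" rule is easy to demonstrate. A toy model (entirely my own, not any real controller): each car may go only after the car to its right has cleared the intersection.

```python
def run_intersection(n_cars=4, max_steps=10):
    """Naive 'yield to the right' at a 4-way stop: car i may go only
    once its right-hand neighbour, car (i + 1) % n_cars, has gone.
    Returns which cars managed to clear the intersection."""
    gone = [False] * n_cars
    for _ in range(max_steps):
        moved = False
        for i in range(n_cars):
            right = (i + 1) % n_cars
            if not gone[i] and gone[right]:
                gone[i] = True
                moved = True
        if not moved:  # nobody could move this round: deadlock
            break
    return gone

# With four cars all waiting, no car's right neighbour has gone,
# so no car ever moves. A first-come-first-served tie-break (or the
# human hand-wave) is what actually breaks the cycle.
```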
Re:Programmed behaviour is programmed behaviour. (Score:5, Insightful)
Computers follow rules. Humans (a.k.a every other asshole on the road) do not.
This is a no win situation. If you program a car to drive safely and follow rules, then it won't be safe on roads because of all the assholes who don't. If you program the car to behave more like an asshole ( a human driver), then it won't be safe since there's a good chance it will make the wrong call. If you program the car to just account for assholes but still drive safely, then it will basically choke in situations like a four way stop in southern California where every other asshole will just muscle or roll their way through the stop.
The long pole in the tent isn't developing an AI capable of driving. It's developing an AI that can deal with assholes.
Re:Programmed behaviour is programmed behaviour. (Score:4, Insightful)
It's not a no-win situation. It just means that self-driving cars have to know when to break the rules. They can and should behave like the best of human drivers.
If you program the car to just account for assholes but still drive safely, then it will basically choke in situations like a four way stop in southern California where every other asshole will just muscle or roll their way through the stop.
The current programming of the car handles that situation. Less aggressively than a human would, but aggressively enough to assert its intention to go, and go.
Re:Programmed behaviour is programmed behaviour. (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
In the short-term, the human driver can take over in these situations.
But the human 'driver', freed of the need to keep tabs on traffic is probably doing something else.
Yesterday, I was coming home through the daily traffic jam. Heading eastbound, I saw a van stopped in the westbound lane, backing up traffic. I figured it had broken down or something until I passed it. The driver had his nose in his phone, busily texting away (or playing Angry Birds). He probably figured that he'd get something else done while the line wasn't moving and failed to notice that it had started a
Re: (Score:2)
Here's an elaborate algorithm for autonomous cars to solve the oh-so-huge problem at a 4-stop intersection:
0. Are we in a 4-stop intersection with three other cars?
Yes: Goto step 1
No: We're finished.
1. Are some of the cars human-driven?
Yes: Yield to them. Goto step 0.
No: Work out go-order by communicating with the other three autonomous cars. We're finished.
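Tongue-in-cheek as the steps above are, they do translate into straight-line code. A sketch with the same simplifications (the yield and negotiation behaviours are left as stub return values, and the names are mine):

```python
def four_way_stop_step(other_cars):
    """Decide our autonomous car's action at a 4-way stop.
    other_cars: list of "human" / "auto" labels for the other vehicles."""
    if len(other_cars) < 3:
        return "proceed"    # step 0, no-branch: not a full 4-way standoff
    if any(c == "human" for c in other_cars):
        return "yield"      # step 1, yes-branch: wait for the humans
    return "negotiate"      # step 1, no-branch: work out a go-order V2V
```

Of course, the hard part the thread is arguing about is hidden inside "yield": deciding when a human driver has actually ceded the intersection.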
Re: (Score:2)
Re: (Score:2, Informative)
That's not really the take-away from the article. The issue is that even with this limited amount of real-world usage, the cars are running into scenarios that the programmers didn't anticipate and the software handles the scenario poorly as a result - more poorly than humans would. That is the issue. That is the limitation of the self-driving car. There will *always* be unanticipated events when driving - more so when there are more automated cars on the roads. The automated cars will handle those sit
Re:Programmed behaviour is programmed behaviour. (Score:4, Insightful)
Program to take account of these things, or don't plan on driving on the road.
Duh.
Technology in development is imperfect. Big surprise. These issues are why Google hasn't yet started selling them to the public. None of them are insurmountable, but it takes a lot of time and effort to build sophisticated systems.
What if that was a cardboard box and it swerved heavily in case that box "pulled out"?
The cars can easily distinguish between a cardboard box and a vehicle. Determining whether or not the vehicle has a driver in the seat and might move... that's often impossible. Likely the reason that the car swerved sharply rather than braking earlier is because the badly-parked car was obscured by other obstacles.
If it can't make its way through a junction where the drivers are following the rules, that's bad programming.
Six year-old programming, note. The article mentions that the current version of the software inches forward to establish intent to move.
and potentially weighs up collision with non-hazard vs collision with small child and gets it wrong
Google cars recognize pedestrians (of all sizes) and regularly notice them even when no human could. I'm sure the car would choose to hit another vehicle over a pedestrian or cyclist.
Really, your whole comment is a mixture of outdated information buttressed by invalid assumptions and layered over with a veneer of blindingly obvious conclusions.
Re:Programmed behaviour is programmed behaviour. (Score:4, Informative)
Every time I've heard an expert (usually a college professor with a background in computer science, robotics, or automation) discuss existing self-driving cars (the Google car is almost always mentioned as an example), the experts always describe self-driving cars as something more highly programmed and rule-bound than actually autonomous.
They rely more on extremely detailed, high-resolution saved maps than on machine vision of the road in front of them. Sensors are used to detect hazards, but for avoidance rather than for self-guided navigation decisions.
What about speeding / using the center of the road (Score:2)
What about speeding? Especially on under-posted highways / interstates / toll roads?
Using the center of the road as an extended turn lane, even where it's not marked as one?
Rolling stops when no other cars are in the way?
Re:What about speeding / using the center of the (Score:4, Insightful)
Re: (Score:2, Informative)
Actually, autonomous cars are programmed to exceed the speed limit by up to 10 mph. This is done because Google deems it safer than driving at the speed limit and being slower than the other cars on the road.
http://gizmodo.com/googles-autonomous-car-is-programmed-to-speed-because-i-1624025227 [gizmodo.com]
http://www.bbc.com/news/technology-28851996 [bbc.com]
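The policy those articles describe boils down to a simple clamp: match the flow of traffic, but never exceed the posted limit by more than a fixed margin. A one-line sketch (my own, not Google's code; the 10 mph margin is the figure reported, everything else is invented):

```python
def choose_target_speed(posted_limit_mph, traffic_flow_mph, margin_mph=10):
    """Follow the flow of traffic, capped at limit + margin.
    If traffic is slower than the limit, just match traffic."""
    return min(traffic_flow_mph, posted_limit_mph + margin_mph)
```

So in a 65 zone with traffic flowing at 72, the car would do 72; with traffic at 80, it would cap out at 75.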
Re: What about speeding / useing the center of the (Score:3)
Re: (Score:3)
So is Google going to pay my speeding ticket when a cop pulls over my autonomous automobile for speeding?
Almost certainly. Though they will bring in several well-respected highway safety engineers to testify that following the flow of traffic is significantly safer than following the posted speed limit. Enough jurisdictions will lose money arguing these cases that there won't be money to be made by writing the tickets. Absent both the financial and safety benefits the police will stop issuing the citations.
Re: (Score:3)
Re:What about speeding / using the center of the (Score:4, Insightful)
If all the cars were autonomous, morning commute times could be cut to a half or a third without changing the speed limit, since rush-hour rubber-band stop-and-go traffic would be a thing of the past.
Re: (Score:2)
Re: (Score:3)
Re:What about speeding / using the center of the (Score:4, Insightful)
A slow commute isn't such an issue if you can spend it relaxing or working instead of driving... and even speeding by 15 MPH only saves a few minutes on a commute.
Re: (Score:2)
And that's a ticket for each one of those offenses [if caught]. I've been ticketed in the last 5 years for each and every one of those. Annoying? Yes. Unavoidable? Don't use a Google car. :)
Please give me an otherwise full-manual car (Score:2, Insightful)
Would You Buy a Car That’s Programmed to Kill You? (Score:2)
Would You Buy a Car That’s Programmed to Kill You? [youtube.com]
Re:Actually, it IS the software's fault (Score:5, Insightful)
The article summary isn't very good. If the software is programmed in a way that causes a car to behave in a way that's dangerous, it IS the software's fault.
That's trivial but true.
It becomes interesting when the software has the car behaving in a way that is SAFE, but unexpected.
Re: (Score:2)
It is the programmer's fault (Score:2)
The article summary isn't very good. If the software is programmed in a way that causes a car to behave in a way that's dangerous, it IS the software's fault.
No, it is the programmer's fault. Software is an amoral set of machine instructions written by a human. Saying it is the software's fault is like blaming a press for cutting off someone's hand. The actual fault is either user error or faulty machine design. The machine is just doing what it was told so blaming software is misplaced. The fact that the problem of autonomous driving has a lot of difficult and dangerous corner cases is irrelevant.
Software is just a set of instructions given by a human so i
Re: (Score:2)
Re: (Score:2)
While I agree that there's bias in the reporting here, in all cases the driver of the "autonomous" car has not been cited as at fault. Thus they are certainly within the definition to say the software wasn't at fault. I am with you in one respect: all of the data should be made public on every incident, and on any close calls where the actions of any outside agent (the driver of the "autonomous" car or one in a nearby car) prevented an incident. Unfortunately it's not in Google's interest to do so, and there's no requirement that they do.
Re: (Score:2)
+1 Correct.
Re: (Score:2)
When I'm having an off day mentally, I will consciously decide to drive less aggressively. I will stay in the slow lane, leave a little extra room in front, and look three times before changing lanes. Far from perfect, but trying to do better.
Re: (Score:2)
Re: (Score:2)
Four way stops are the safest intersection. And much cheaper than traffic lights. They are only 'retarded' if you don't care about pedestrian safety.
The state of Washington [wa.gov], and the Mythbusters [treehugger.com], would tell you that roundabouts are safer (for cars and pedestrians), cheaper to build, and more economical for drivers than either a 4-way stop or a light-controlled intersection. There seem to be multiple other studies with similar results; a search for "safety of roundabouts vs. 4-way stop" brings up pages of them.
Re: (Score:2)
OK. How are Google cars at handling roundabouts?