German Auto Firms Face Roadblock In Testing Driverless Car Software
An anonymous reader writes: As nations compete to build the first operational autonomous car, German auto manufacturers fear that current domestic laws limit their efforts to test self-driving software on public roads. German carmakers are concerned that these roadblocks are allowing U.S. competitors, such as Google, to race ahead in developing software that can react effectively in real-life traffic scenarios. Car software developers are particularly struggling with the ethical challenges the road often raises. For example, when faced with the decision to crash into a pedestrian or into another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in that situation. 'Technologically we can do fully automated self-driving, but the ethical framework is missing,' said Volkswagen CEO Martin Winterkorn.
Biggest issue is still liability (Score:4, Insightful)
So, disregarding how the self-driving car decides whom it is best to kill in any given situation, for me the biggest problem with self-driving cars is legal liability.
If Google wants to sell autonomous cars, Google should be liable for anything the damned thing does.
And none of this cop-out where, if the computer doesn't know what to do, it just hands control back to the human -- because that's pretty much guaranteed to fail, since the human won't be able to make the context switch in time (if at all).
As far as I'm concerned, the autonomous car has to be 100% hands off by the user at all times, and the company who makes the damned thing is 100% responsible for what it does.
Why the hell would someone have to pay for insurance on something whose actions they don't control?
Re: (Score:2)
So what happens when the brake assist in a modern car causes an older car to rear-end it on the highway? Who is liable, since it was technology that caused the accident, not either driver?
These things are already dealt with in modern countries, but let's pretend that all the years of driving-related liability rulings never happened.
Re: (Score:3)
Same thing that happens when a modern car with brake assist rear ends an old car with better brakes and traction.
If your car has shitty brakes you leave extra room. Good drivers realize that 'shitty brakes' is always relative.
If you're driving a crazy high performance car you moderate your brake use to avoid being rear ended.
Re: (Score:2)
If your car has shitty brakes you leave extra room. Good drivers realize that 'shitty brakes' is always relative.
Sounds to me like the solution to the problem in question -- a computer could continually recompute the envelope of possible scenarios and never drive into a point in the phase space from which it can't recover without hitting someone or something.
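A minimal sketch of that envelope check, assuming constant-deceleration physics and hypothetical sensor inputs (current speed, gap to the nearest obstacle); the reaction time and deceleration figures are placeholder assumptions, not vendor numbers:

    # Sketch: is the current speed recoverable given the gap ahead?
    def stopping_distance(speed_mps, reaction_s=0.1, decel_mps2=7.0):
        # reaction travel plus constant-deceleration braking distance
        return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

    def speed_is_recoverable(speed_mps, gap_m, margin_m=2.0):
        # True if the car could come to a full stop before the obstacle
        return stopping_distance(speed_mps) + margin_m <= gap_m

    # At 30 m/s (~108 km/h) with a 60 m gap:
    # stopping_distance(30) = 3.0 + 64.3 = 67.3 m -> not recoverable, slow down.

A planner running this check every control tick would simply refuse any speed for which the answer is False, which is the 'never enter an unrecoverable point' rule stated above.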
Re: (Score:3)
That computer would stop on the on ramp of SF bay area highways and refuse to move.
Re: (Score:2)
That computer would stop on the on ramp of SF bay area highways and refuse to move.
Why? Too steep to brake? Too many tailgaters?
Re: (Score:3)
Same thing that happens when a modern car with brake assist rear ends an old car with better brakes and traction.
If your car has shitty brakes you leave extra room. Good drivers realize that 'shitty brakes' is always relative.
If you're driving a crazy high performance car you moderate your brake use to avoid being rear ended.
This.
You also don't have to be driving a crazy high performance car to get good braking. Just get some performance pads, rotors, good tyres and maybe some braided brake lines, and you can make a Toyota Corolla stop like a sports car. Your 0-100 time will still be crap but 100-0 will be amazing. You don't even need to fit six piston callipers.
You've got to understand your car. Sadly this is something most people never learn. They get in it every day but don't understand where the edge of the envelope is. W
Re: (Score:2)
Except they're not the same.
If you have a human driving, you usually know who to blame.
If you have a computer driving, the people who made the computer sure as hell aren't going to take liability.
But you can bet your ass some sleazy lawyer will put it into the EULA that by driving in a Google car you assume all liability.
If they're going to make autonomous cars, they pretty much need to be 100% autonomous, with the humans effectively in the back seat with no controls.
At present, there simply ARE no liabilit
Re: (Score:2)
If you have a human driving, you usually know who to blame.
Which, to me, is a horrible way of looking at things. If that were the only criterion, we could easily end up with ten times more car deaths simply because we're more comfortable with putting blame on people, even at the expense of lives.
Re: (Score:2)
Welcome to a world with lawyers and liability laws. Someone is always to blame.
And, as I said, you can bet your ass Google et al are going to try to make sure it's you and not them.
Re: (Score:2)
And that is why we have insurance.
Why do you assume that insurance would not be available? And if insurance is available, why is this an issue?
Sure, if there are multiple vehicles and multiple insurance companies, they might have a proxy battle to determine the answers to these questions... but that is how issues like this have been sorted out for the last couple hundred years, and how they will continue to be sorted out. Once you have precedents, the insurance companies just adjust their rates accordingly.
Re: (Score:2)
So, falling back to first principles... the following car should be prepared for the lead car to do pretty much anything, including dropping a bumper (which will come to a stop far faster than either car can brake). If you're not leaving enough room to avoid hitting a fallen bumper, you're too close. Follower at fault, next case.
Re: (Score:2)
A dropped bumper will tend to slide along the ground and will likely go off the side of the road before the approaching car gets anywhere near it (the coefficient of friction of steel or plastic vs. rubber will show you that). However, when a tire falls off the lead car and a disc brake is instead digging into the road surface, that will stop rapidly.
However, that was just an example. There are others for current "autonomous" features in cars. How about the speed-adjusting cruise control (when it malfuncti
Re: (Score:2)
Every single one of your examples is just bad driving, and has nothing to do with autonomous features in cars.
Cruise control malfunctioning is no different than the car in front slowing. It is your job to notice and react. If you 'don't have time' then you are tailgating.
4 wheel drive? What does that have to do with acceleration?
ABS - yes, that is what is does. If you are driving in snow, leave more room. Not that complicated.
Traction control - yes, that is why it has an OFF switch
Re: (Score:2)
Did you miss the first two words on the cruise control line? Speed-adjusting cruise control is the new hot thing; it adjusts the speed of the vehicle automatically to match the vehicle in front. If it were to malfunction you may not know in time to react.
4 wheel drive ONLY affects acceleration; what else would it have control over?
You are missing the point: all of these systems (besides 4WD...) are being integrated into a self-driving car, so we have to look at the failure modes to understand where the liab
Re: (Score:2)
It does not matter if the cruise control is speed-adjusting, it is still YOUR responsibility to maintain a safe distance.
4 wheel drive does not affect acceleration, it affects traction. The only way to blame 4WD for unintended acceleration is if you were planning on spinning your wheels, which again is just shitty driving.
And, no, I am not missing the point, you are. The 'failure modes' you listed, whether or not some technology was involved, are failures of the DRIVER, and that is where the liability reside
Re: (Score:2)
For cases where the only difference is who was making the decisions, I'd say liability should be with the manufacturer. Other cases are similar to existing with-driver cases: parts failure (manufacturer), skipped maintenance (owner), poorly performed maintenance (shop that did the work).
In the end, like with FAA investigations, everything boils down to pilot error and equipment failure, and in many cases "not dealing with equipment failure properly" is considered pilot error. The only question is who's the
Re: (Score:2)
Dunno how it is in the US, but here in the Netherlands you are obligated to prevent parts from dropping off your car. Also, the quite thorough yearly tests check for such cases.
The first car would be liable.
In practice it is impossible to find liability in such a case. If I drive behind a car that looks like it's going to lose parts, I'll keep an appropriate distance. It doesn't happen all that often.
Re: (Score:2)
Huh? It certainly is the fault of the driver in the older car.
Re: (Score:2)
Not always. If I slam on my brakes (ABS, electronic brake distribution, traction control, etc.) and someone rear-ends me, I could be at fault for overbraking even if it was needed to prevent me hitting someone/something.
Re: (Score:2)
Where? Everywhere I have been the driver in back is always at fault in rear-end collisions.
Re: (Score:2)
Insurance fraud is not an accident, it is a crime. Big difference.
Also, I should have mentioned rear-end accidents caused by the lead vehicle 'stopping short'. That is always the fault of the following vehicle for not leaving enough space. There are cases where the lead car may be at fault, such as merging in too close to following cars, but that is different from just stopping short, which is what was originally discussed (brake assist causes car to stop quicker than following car without assist).
Also,
Re: (Score:2, Funny)
Why the hell would someone have to pay for insurance on something whose actions they don't control?
Says every parent of a teenager since cars became widespread.
Re: (Score:2)
I fully expect that insurance for completely autonomous cars will be less expensive, once self-driving cars are proven. To prove them, I expect large fleets sponsored by the manufacturer or syst
Re: (Score:2)
Again, why would I pay liability insurance to cover the actions taken by a computer?
The only viable business model for fully autonomous cars I can see is essentially as taxis.
The notion that we're all going to trade in our cars and let the computer do all the driving is laughable -- too many people like driving, and there are decades' worth of cars out there. The notion that we'd buy a self d
Re: (Score:2)
Again, why would I pay liability insurance to cover the actions taken by a computer?
Because that's the most pragmatic solution. If you don't like it, don't get a self driving car (and probably pay even more insurance).
Re: (Score:2)
Exactly. Do you think you ARE NOT paying insurance when you are in the back of a taxi? It's just that the fare reflects the operating cost. And part of the operating cost is the insurance. And you really want the driver to HAVE insurance. So you pay the fare, which pays the insurance.
How is that different from buying insurance to cover the self driving car you buy or lease or rent to get you from point a to point b. The insurance is there to make sure that any parties injured in a collision (including yoursel
Re: (Score:2)
I think it will be even easier.
The autonomous cars will be packed with sensors that record EVERYTHING.
If there is an accident then the insurance companies will know which car has
Re: (Score:2)
for me the biggest problem with self-driving cars is legal liability.
Why is this a problem? Several states already allow self driving cars on the road (although with a driver in the seat for now). The liability issue is already resolved. The party responsible is the insurance company. Duh.
The only thing that changes with SDCs is that the insurance will likely be much cheaper. Not just because accidents will go down, but also because the camera, GPS, and sensor data will make it very clear what happened, eliminating disputes over the facts, so legal costs will be much l
Re: (Score:2)
You seem highly confused as to what insurance is. Here is a clue: the 'party responsible' is NEVER the insurance company. The 'party responsible' is YOU, the insurance company is just providing the cash for you.
It's cute that you think cameras, GPS, and sensor data will make it very clear what happened or eliminate disputes.
Re: (Score:2)
The 'party responsible' is YOU, the insurance company is just providing the cash for you.
As long as the insurance company is paying, why should I care who is "responsible"?
It's cute that you think cameras, GPS, and sensor data will make it very clear what happened or eliminate disputes.
In many countries, insurance companies offer discounts for anyone using a dash cam. Why? Because cameras reduce disputes, thus lowering legal costs.
Re: (Score:2)
Seriously? First, your liability does not end where your coverage does. If you are underinsured, it is you who is responsible. Second, the insurance company does not pay out of the goodness of their hearts; they pay because you pay them to. And if you have a claim, you will pay more.
The lower insurance rates with dash cams are more about fraud detection than dispute resolution.
Re: (Score:2)
The point is moot. Either the owner pays the insurance, or the owner pays Google, and then Google will pay for the insurance.
As long as insurance companies are willing to provide the insurance, the finer liability issue isn't important.
Re: (Score:2)
for me the biggest problem with self-driving cars is legal liability.
This is already covered. Brakes fail, tires blow out, mechanical failures happen. They kill people. It's something that has happened plenty and gone through the courts many times. Precedent has been set.
Re: (Score:2)
This old chestnut needs to die.
Rules for this already exist; it's just that human drivers don't follow them. An autonomous car will be programmed to take the course that causes the least damage and is the most legal (a sketch of that idea follows below). So they would choose a rear-end crash over a right-angle crash, because a rear-ender presents the lowest risk of casualties. If you think that veering out of your lane to avoid a rear end crash is
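One toy way to read 'least damage and most legal' is as cost minimization over a handful of candidate maneuvers. This is purely an illustrative sketch; the maneuvers, risk numbers, and legality penalty are all made up, not anything a real planner ships with:

    # Hypothetical least-damage, most-legal maneuver selection.
    candidates = [
        {"name": "brake_straight", "casualty_risk": 0.10, "illegal": False},
        {"name": "swerve_left",    "casualty_risk": 0.30, "illegal": True},
        {"name": "swerve_right",   "casualty_risk": 0.25, "illegal": False},
    ]

    def cost(c, illegal_penalty=0.5):
        # casualty risk dominates; illegal maneuvers pay a fixed surcharge
        return c["casualty_risk"] + (illegal_penalty if c["illegal"] else 0.0)

    best = min(candidates, key=cost)   # -> brake_straight

With these numbers the car brakes in its own lane and accepts the rear-end risk, which matches the parent's rear-end-over-right-angle preference.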
This is no moral decision (Score:2)
Humans are unable to make moral decisions in a few milliseconds. They would either freeze for at least one second and hit the next car or pedestrian, depending on which comes first. If they have more time, they would try to avoid collision with the human and hit the car, because you cannot really see other people in there and you do not know how many persons are in there. Also, people in the car are better protected. So the safest thing is to hit the car. But besides that, people know when approaching a truck trail
Re: (Score:3)
Humans are unable to make moral decisions in a few milliseconds. They would either freeze for at least one second and hit the next car or pedestrian, depending on which comes first. If they have more time, they would try to avoid collision with the human and hit the car, because you cannot really see other people in there and you do not know how many persons are in there. Also, people in the car are better protected. So the safest thing is to hit the car. But besides that, people know when approaching a truck trailer and they cannot stop, they should aim for the wheels and not the section in the middle. However, most people are unable to implement that, so why should cars be able to do these things?
You have hit on one of the key reasons why trying to implement human reasoning in an emergency is problematic: it's usually a subconscious reaction to avoid hitting the bigger, scarier thing. You can train people to make calm decisions in an emergency situation, but that takes a lot of simulator time and practice; something most drivers sorely lack before getting a license. If you wanted to follow the human reasoning it would simply be "CRAAAP.... AVOID HITTING THE BIG THING...DAMN... A PEDESTRIAN ... OH WE
Re: (Score:2)
I would go further and say that those who believe ordinary people think rationally and/or ethically in the split second of crisis before an imminent collision are possibly sociopaths and should not be allowed near critical software that will make these kinds of decisions.
Regulations? (Score:2)
As opposed to all the laws and regulations making driverless cars difficult to test in the US? Google has to pay someone to sit in the front seat so they can take over from the computer (which can make better decisions faster than a human).
Which regulations concern them so much (I didn't see any listed in the article), and how do they differ from the US regulations (as if the US were in some lawless state...)?
Re: (Score:2)
The thing is quite simple. They have tested such cars in Germany; therefore, it is possible to do so. They can also test their shiny new thing in German traffic, as long as a human backup is sitting in the car. And testing it in a German or other European inner city should be challenging enough for now. However, they want to show how innovative they are and not get surprised by the Japanese again, as they were with the hybrid. So actually this is mostly advertisement.
Love how they avoid the things humans CAN NOT DO (Score:3)
But self-driving cars ARE capable of hitting the brakes quicker and more reliably (avoiding skidding) than a normal human would.
Think about it if it were the other way around -- what if humans were crappy about deciding whom to hit, but computers had incredibly slow reflexes and took ten times as long to decide to hit the brake? Given that example we would laugh and say no way would we let anyone with slow reflexes drive a car. (The sketch below puts rough numbers on the reflex gap.)
But we already do that -- we let human reflexes drive a car (even if they have had one drink 30 minutes ago, slowing them down). The question is not and never has been will computers be perfect drivers. Instead the question is will they do
And that is something that we likely can do within the next couple of years, if we can't already do that.
So stop being obstructionist idiots bringing up the rare/never-seen-in-the-real-world situations, and talk about what actually happens.
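The reflex gap referenced above, made concrete. The 1.5 s human perception-reaction figure is a common textbook estimate and the 0.1 s machine figure is a placeholder assumption; only the ratio matters here:

    # Reaction-travel distance before braking even begins, at 100 km/h.
    speed_mps = 27.8  # 100 km/h
    for label, reaction_s in [("human", 1.5), ("computer", 0.1)]:
        print(label, round(speed_mps * reaction_s, 1), "m of travel before braking")
    # human: 41.7 m, computer: 2.8 m -- roughly a 39 m head start for the machine.

That ~39 m is the difference between a near miss and a fatality long before any 'ethical framework' question comes up.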
Re: (Score:2)
I somewhat agree... but the problem is not "is a driverless car better than a human"; it's: whom do we sue when something goes wrong?
In the proposed hypothetical, whoever gets hit is going to be suing someone. Whom do they sue? The owner of the car (even if they weren't in the car at the time?), the "passenger", or the company that makes the vehicle?
I tend to think that it will be the owner -- and the owner will need to have insurance to cover whatever the autonomous car could do. There is no way a company like G
Re: (Score:2)
This is totally true. No human ever was able to make a moral determination and act on it in an accident situation. And besides, the pedestrian-versus-family-in-the-car case is quite simple: the family is protected by their car; the pedestrian is not. It would be a different thing between one pedestrian and a group of pedestrians. Most humans would freeze and hit whoever comes first. End of story. The car could at least try to reduce the number of victims.
Re: (Score:2)
As soon as any autonomous car advocates start talking about 'what actually happens' the conversation can start in earnest.
But for now, all we have is Google's marketing BS and some DARPA challenges that paint a much less rosy picture.
Re: (Score:2)
Why yes! Just the other day a baby stroller magically appeared 2 feet in front of me while I was doing 90mph on the local autobahn, forcing me to make a snap decision between creamed baby or ramming the President's car which was carrying a gaggle of pregnant neurosurgeons to a peace conference that just happened to be in the other 5 lanes of the freeway and the shoulders and the sidewalks and the ditches, all at the same time
Re: (Score:2)
Do you have a point? Besides the one on your head?
Re: (Score:2)
The pro-driverless-car crowd always loves to ignore the fact that the autonomous car won't be driverless for decades. A human will still be required to oversee it and, in case of a failure, take control of the vehicle.
The big problem with this is that people will be taking manual control because the autonomous car will abide by the rules that human drivers like to ignore, like keeping a safe distance, not driving in the passing lane, keeping to the
Bullshit (Score:2)
For example, when faced with the decision to crash into a pedestrian or into another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in that situation. 'Technologically we can do fully automated self-driving, but the ethical framework is missing,' said Volkswagen CEO Martin Winterkorn.
We learn how to drive in driving school, not how to crash. In a situation like the above, most people will hit the next thing in front of them, regardless of what or who it is.
The faster the car goes, the less likely anyone is to avoid an obstacle.
If anything, a machine could improve things by being able to react in time.
Re: (Score:2)
We learn how to drive in driving school, not how to crash. In a situation like the above, most people will hit the next thing in front of them, regardless of what or who it is.
I remember stats showing drivers going into a tree at high speed are more likely to aim (unconsciously) for the passenger side - drivers have higher chance of survival than front passengers.
>> same moral reasoning a human would in the situation
This is some hilarious shit right there. There are no higher mental functions involved in a crash; it's all instinct, pre-learned behaviour, and reflex.
The Germans just need to do development in the US (Score:2)
While they are trying to change the laws back home they might as well do their development and testing in the United States. We currently have fewer restrictions here.
I agree with gstoddart about autonomous cars needing to be 100% hands off by the user at all times for normal driving regimes. If the companies that make them do it right, then they should not be afraid of being 100% responsible when the vehicle is in autonomous mode. Some computer modeling of autonomous vehicles has shown a major drop in
But (Score:2)
'Technologically we can do fully automated self-driving, but the ethical framework is missing
Ethically we can allow fully automated self-driving, but the Technological framework is kinda missing
Morality Framework UNNEEDED (Score:2)
Why this obsession with moral reasoning on the part of the car? If self-driving cars are in 10x fewer accidents than human-driven cars, why the requirement to act morally in the few accidents they do have? And it isn't as if the morality is completely missing; it is implicit in trying not to hit objects, be they human or otherwise. Sure, try to detect which objects are human and avoid them at great cost, but deciding which human to hit in highly unlikely situations seems unneeded and pe
Re: (Score:2)
Instinctively this is what we humans do already -- try not to hit anything, but save ourselves as a first priority. In the few near misses (near hits) I've had, I never found myself counting the number of occupants in the other car as I made my driving decisions.
It's a horribly stupid breakpoint in any case. The truth is that if you swerve to avoid a school bus and run over an old person on the sidewalk, they're just going to do you for running over the old person, because you weren't supposed to be on the sidewalk. Meanwhile, you weren't supposed to be outdriving your vision or your brakes, so you wouldn't have had to dodge the school bus anyway. The self-driving car won't outdrive its vision or brakes, so it won't have to swerve. If someone jumps in front of it, i
Re: (Score:2)
Ahhh, but you are looking at the one situation in isolation. The moral thing to do is for everyone to hand over the driving to the machines, as that will save the greatest number of lives in the long run. By being unwilling to hand the decision to a machine, you are choosing to kill a greater number of humans in practice on average -- just so you can exercise the moral decision in some outlier. If self-driving cars were only as good as, or even possibly just a little better than, us at driving, I might side w
Re: (Score:2)
While you might like to make the decision, in most cases there simply isn't time in the fraction of a second you have available during a crash.
The issue of moral decisions with self-driving cars arises because computers, first of all, will have far more situational awareness. They will have been monitoring and possibly making worst-case projections showing the possibility of an accident for a long time (for computers, seconds is a very long time...).
So even once the probability of collision reaches 100%, ther
Why American companies have it easier (Score:3)
The summary doesn't really explain why that dilemma is harder for German companies to solve than American companies.
For Americans, the answer is: always hit the pedestrian(s). What the hell was anyone doing outside of a car?
ethics (Score:3)
For example, when faced with the decision to crash into a pedestrian or into another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in that situation
Or maybe it would follow better moral reasoning. Ours is not perfect, it's just whatever evolution came up with that gave us the best species survival rates. That doesn't mean it's really the most ethical solution.
For example, in a post-feminist society -- let's assume for argument's sake that gender discrimination has been overcome -- wouldn't we also do away with "women and children first"? That is a suitable survival approach for a species fighting for survival on the African prairie, but hardly for a dominant species that is already overpopulated.
Re: (Score:2)
Correct. Rational evaluation will lead you to this result.
We're not rational, because our genes (which drive our evolution and thus our minds) don't give a fuck if we survive or not, only if they survive. To them, your kid is more valuable than you are.
Manufactured controversy (Score:3)
2) The most brilliant philosophers still disagree over the ethics of choosing who dies when someone's gotta go. See also the Trolley Problem, most other ethical dilemmas, and generally the eternal struggle between various consequentialist and deontological systems of ethics.
3) This precise scenario is highly contrived and seems (1st approximation) to be vanishingly rare.
Given the above, maybe the question shouldn't be if a robot can make a perfect (or any) ethical decision. Maybe for now it should just be if the robot can do better than a human at not killing anyone at all in these sorts of situations. Maybe "I did my best to protect my owner from death and just happen to average out safer for everyone" will have to be ok for now.
Manufactured straw computer controversy (Score:2)
Re: (Score:2)
My point was that questions like the one in TFS are masturbation. The question ought to be: are we at a point where they're safer (in aggregate) than humans, driving in real-world conditions? You and I both agree that currently the answer is no. For optics and liability reasons, t
Re: (Score:2)
I pretty strongly object to testing in real-life situations where the populace has an expectation of safety. The Google car is perhaps the most advanced in the world, yet it is not able to function safely in city driving. Google themselves admit it's not ready, or it would be rolled out as a product. It's a far cry from teams of engineers, programmers, and scientists fussing over every last detail, planning routes where only expected problems (if any) are in ideal si
Ethics are an interesting dilemma (Score:2)
Let's say they manage to program the car so that it can calculate which course of action will cause the fewest injuries/fatalities. Now you get into a situation where the only two options available are a.) evade some obstacle on the road, but thereby hit a group of five pedestrians, quite possibly severely injuring or killing them, or b.) hit the obstacle, quite possibly killing the driver (you). You are alone in your car.
Now, would you drive such a car, which sometimes can decide, with cold, pure Vulcan logic
That ethical challenge is nil (Score:5, Interesting)
Re: (Score:3)
Hmm... Call me a sick fuck then, because rather than killing myself, my wife and my children by driving into that oncoming 2 ton truck, I would choose to hit the pedestrian every day of the week! And to add a bit of fuel to the discussion, I (and I suspect there are many that agree with me) would only ever buy a car that makes decisions that take ME into account first and THEN start looking at minimizing collateral damage. But, like you said, I'm a sick fuck for thinking this, apparently...
Wrong logic (Score:2)
"For example when faced with the decision to crash into a pedestrian or another vehicle carrying a family, it would be a challenge for a self-driving car to follow the same moral reasoning a human would in the situation."
No, a self driving car shouldn't get into that situation in the first place. The right thing to do here is to anticipate events and slow down. Self driving cars have a huge advantage here, in that they don't get tired or lose attention over time.
Wait what question? (Score:2)
For example, when faced with the decision to crash into a pedestrian or into another vehicle carrying a family
Um, there is no question here, for several reasons.
First, if the situation is so immediate that your only two options are to hit a vehicle or hit a person, it's highly unlikely you have time to peer into the other vehicle and count its occupants.
Second, most vehicles on the road today have lots of safety features; if they are being used (seat belts fastened, airbags not disabled, etc.), most crashes are highly survivable; most pedestrian-vehicle crashes are far, far less so for the pedestrian (excepting very low speed nudged someone i
Re: (Score:3)
First, it is outright cruel to say that. Second, this is only fear-mongering by the car developers. They could test their cars in the US without any trouble, and they have done the same in Germany. Yes, they want to be allowed to put them on the road right now to show technology leadership, since they have been embarrassed by Japanese car manufacturers over the hybrid thing.
Re: (Score:2)
That's a deep answer. This is the funniest article of the day for me as a flying robot/drone developer.
While the Germans are flying drones [flickr.com] all over the place, selling them, and have a regulatory framework, they are complaining they cannot build/sell autonomous cars... and calling out other countries, mainly the US, as beating them at that game.
But in the US, it's the complete reverse... or bizarro situation. While Google and Uber are building autonomous cars, and getting them approved for use [google.com], drones on the
Re: (Score:2)
I think it's more likely we'll ban human drivers. Just this morning I counted over 16 silver/grey/blue-grey vehicles driving in pouring rain and light fog without headlights on. On average a computer driver today is probably better than a human, and they'll just get better as time moves on whereas human improvements are a bit slower to happen.
Re:Not concerned (Score:5, Insightful)
Even low-occupancy transit like taxis will do away with drivers -- it will remove the human element as a risk to the passenger, and it will mean that the cab companies make more money: they're not simply renting cabs to drivers for a flat rate, they're collecting all of the revenue for the cab's use, and they only have to operate as many cabs as they have service demand for at any given moment, so there's less unnecessary wear and tear on the cars, since drivers aren't speculatively taking cabs out.
Sure, there will be plenty of human drivers out there, but there's going to be a whole lot of automation because it will simply be much more cost-effective in many circumstances.
Re: (Score:2)
That will become a union thing (in both cases). The trains and trucks will still be required to have a warm butt in the seat; the butt will just be totally ignoring the windshield instead of only partially ignoring it.
Re:Not concerned (Score:5, Informative)
The trucking industry would absolutely love to do away with hundreds of thousands of long-haul drivers.
At least in America, the drivers are the trucking industry. When you see an 18 wheeler on the freeway, the chances are very high that the truck is owned by the guy driving it.
Re:Not concerned (Score:4, Informative)
Who pays the trucker? The owner of the merchandise, or of the trailer.
Don't forget, lots and lots of large retailers maintain their own over-the-road fleets: Sears/Kmart, Walmart, Target, Costco, Kroger, Safeway, Autozone, and that's only a drop in the bucket. They could all retrofit to an automated tractor, or at least move to a model where a pilot car or truck escorts a caravan of autonomous trucks following behind.
Re: (Score:2)
Most large companies outsource their transport to JB Hunt, Schneider, etc. Sure, the big letters say 'WalMart', but in smaller, DOT minimum sized font, it often has another name.
Re: (Score:2)
Most large companies outsource their transport to JB Hunt, Schneider, etc. Sure, the big letters say 'WalMart', but in smaller, DOT minimum sized font, it often has another name.
So what you're saying is that there are actually fewer transport companies than there appear to be? Because that's an argument in favor of self-driving trucks, once again.
Re: (Score:2)
Because that's an argument in favor of self-driving trucks, once again.
Nobody is arguing about whether SDTs are coming, but about who will drive the change. The "trucking industry" is unlikely to be an agent of change. They are entrenched incumbents who will fight, lobby, and bribe to stop automation. Progress is more likely to be driven by customers such as WalMart, or entirely new transport companies. They will be lobbying and bribing in the opposite direction.
Re: (Score:2)
The "trucking industry" is unlikely to be an agent of change. They are entrenched incumbents who will fight, lobby, and bribe to stop automation.
Again, why would they do that? The only members of the "trucking industry" who stand to lose if trucks go automated are truckers themselves, but they can't possibly out-lobby trucking companies.
Progress is more likely to be driven by customers such as WalMart, or entirely new transport companies.
No. Wal-Mart will just contract with whoever can move the trucks most cheaply. But they're not going to do the legwork themselves. They'll just contract whoever has the self-driving trucks, after they do the lobbying.
Wal-Mart doesn't give a shit how cheap trucking is, only that they get it at the lowest possible co
Re: (Score:2)
Again, why would they do that? The only members of the "trucking industry" who stand to lose if trucks go automated are truckers themselves, but they can't possibly out-lobby trucking companies.
Again, the drivers are the trucking companies.
Re: (Score:2)
Large companies like doing business with other large companies. If JB Hunt, Swift, Allied, or any of a slew of larg
Re: (Score:2)
Again, the drivers are the trucking companies.
A romantic notion, but incorrect [inboundlogistics.com]. Private fleets account for 80% of trucks [nptc.org] and over 50% of OTR tonnage shipped.
Re: (Score:3)
I should actually correct myself slightly: Wal-Mart (and others) have some in house drivers and some outsourced.
BTW, in discussions of the transport industry, don't get distracted by, or lied to by, the companies. Some drivers think they are owner-operators, when in practice they aren't. They will lease/buy a truck from, as an example, Schneider (all of the bigs do this). As part of the lease terms, they can only accept loads from Schneider. It should be obvious that the 'owner' is an employee who has assumed much
Re: (Score:2)
I expect lower operating costs could come from simply not having to operate as many trucks. If man-operated trucks usually operate only 5/12 of the day, it's conceivable that autonomous trucks could operate much closer to the full day, less maintenance, refuelling, and load/unload or hookup/unhook time, and those latter tasks might cou
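Back-of-the-envelope on the parent's 5/12 figure, with an assumed 2 h/day of downtime for fueling and loading (that downtime number is a guess, not an industry statistic):

    # How many autonomous trucks replace a manned fleet, roughly?
    manned_hours = 24 * 5 / 12        # 10 h/day a human-driven truck moves
    autonomous_hours = 24 - 2         # ~22 h/day if only fuel/load stops remain
    fleet_ratio = manned_hours / autonomous_hours
    print(round(fleet_ratio, 2))      # ~0.45 -> under half as many trucks needed

Even if the real downtime is double the guess, the fleet still shrinks by roughly half, which is where the operating-cost argument comes from.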
Re: (Score:2)
The trucking industry would absolutely love to do away with hundreds of thousands of long-haul drivers.
At least in America, the drivers are the trucking industry. When you see an 18 wheeler on the freeway, the chances are very high that the truck is owned by the guy driving it.
It's the same in Australia.
Truckers have a very powerful union, and a lot of them are owner/drivers. This is changing, of course, but it's going to be a slow and laborious process. However it will be a very long time before we have fully autonomous trucks.
Re: (Score:2)
I expect that some human drivers will remain for odd jobs, like we
Re: (Score:2)
I can us
Re: (Score:2)
Self-driving vans will be deployed by UPS / FedEx / et al. simply so that they can have the driver become a full-time package sorter and deliverer. On some routes in a busy downtown area there may even be multiple people in the van, getting dropped off and picked up by the self-driving van, which then does not need to park.
You could also see people moving from an empty truck to a newly arrived full one, the empty one heading back to the depot to fill up, the full one having driven itself out from the depot.
Pe
Re: (Score:2)
ahem [wikipedia.org]
Re: (Score:3)
It's even easier than that.
Do YOU want to be the person dragged into court because YOU wrote the program that INTENTIONALLY HIT AND KILLED someone?
No? Then write the code to be 100% neutral. The code will ONLY attempt to stop the vehicle as fast as possible.
If pedestrians are within X meters of the car then the car should slow to Y. If they get closer then the car should stop.
But the code should NEVER have the option "hit object X".
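A minimal sketch of that neutral policy: no target selection, just speed caps keyed to pedestrian proximity. X, Y, and the thresholds below are hypothetical placeholders standing in for the parent's variables:

    # Neutral policy: never choose what to hit; only cap speed and stop.
    def max_safe_speed_mps(nearest_pedestrian_m):
        if nearest_pedestrian_m < 5.0:
            return 0.0           # pedestrian too close: stop
        if nearest_pedestrian_m < 20.0:
            return 8.3           # pedestrian nearby: slow to ~30 km/h
        return float("inf")      # no pedestrian constraint; legal limit applies

Note there is no "hit object X" branch anywhere; the only actuation this policy ever requests is braking.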
Re: (Score:2)
That would be "suicide".
And the sensor logs of the car should be able to show that it was suicide.
But more to the point, how would that situation be any different in a faster-reacting-autonomous-car than in a human-controlled-car?
Or are you postulating a world where there are no cars because someone might try to commit suicide by jumping in front of one?
Re: (Score:2)
It's postulated that the teenagers of tomorrow may have a new fun game that involves playing chicken with autonomous cars on the local freeway. Just walk across and watch all the cars veer away and avoid hitting you.
Avoiding them is no different from avoiding other moving targets such as dogs, deer, moose, etc. Anything that might damage the vehicle or occupants needs to be avoided, with the message (moving obstacle on road) being broadcast to the rear for following cars and to the front for approaching (in the
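What such a broadcast might look like, very loosely in the spirit of existing V2V hazard warnings (DSRC/ETSI DENM); every field name here is invented for illustration, not taken from any real message spec:

    # Hypothetical vehicle-to-vehicle hazard broadcast.
    import json, time

    hazard_msg = {
        "type": "MOVING_OBSTACLE_ON_ROAD",
        "position": {"lat": 52.5200, "lon": 13.4050},  # placeholder coordinates
        "lane": 2,
        "heading_deg": 90,
        "timestamp": time.time(),
    }
    payload = json.dumps(hazard_msg)  # sent rearward and forward over the V2V radio

Following cars that receive this can start slowing before their own sensors can even see the obstacle, which is the whole point of the relay the parent describes.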
Re: (Score:2)
Which of those steps covers engaging the braking system?
And how does the license plate "determine what would happen if you hit it"?
Why weren't the cameras and sensors on already if the car was operating autonomously?
Re: (Score:2)
256GB of flash is just over $100 right now. Storage is not a problem. Even AIRCRAFT do not have a problem with storage, and they have a LOT more data to store.
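Rough math behind that claim, assuming a single camera stream compressed at ~5 Mbps (a typical dash-cam-quality H.264 bitrate; the exact figure is an assumption):

    # How long does 256 GB last for one ~5 Mbps video stream?
    gb_per_hour = 5e6 / 8 / 1e9 * 3600   # ~2.25 GB per hour
    hours_on_256gb = 256 / gb_per_hour   # ~114 hours of continuous video
    print(round(gb_per_hour, 2), round(hours_on_256gb))

Even several camera streams plus GPS and CAN-bus logs fit comfortably if the recorder keeps a rolling window of the last few hours, which is all an accident investigation needs.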
Re: (Score:2)
Self preservation should always triumph. Why? That is what people do anyway.
That is obviously wrong. Minimising total damage should always triumph. If you intentionally drive into a pedestrian, you should and will go to jail.