Would You Need a License To Drive a Self-Driving Car?
agent elevator writes: Not as strange a question as it seems, writes Mark Harris at IEEE Spectrum: "Self-driving cars promise a future where you can watch television, sip cocktails, or snooze all the way home. But what happens when something goes wrong? Today's drivers have not been taught how to cope with runaway acceleration, unexpected braking, or a car that wants to steer into a wall." The California DMV is considering consumer-training requirements similar to those it already imposes on robocar test drivers. Hallie Siegel points out this article arguing that we need to be careful about how many rules we make for self-driving cars before they become common. Governments and lawmakers across the world are debating how best to regulate autonomous cars, both for testing and for operation. Robocar expert Brad Templeton argues that there is a danger that regulations will be drafted long before the shape of the first commercial deployments of the technology becomes clear.
If "yes," then it's not self-driving (Score:5, Insightful)
If "yes," then it's not self-driving.
Re:If "yes," then it's not self-driving (Score:5, Interesting)
Simply this. To elaborate: self-driving cars should be the legal equivalent of sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one should mean you're responsible/liable for fuel & maintenance, and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation. (Otherwise, things such as self-valet and timed pick-ups won't happen.)
Comment removed (Score:5, Insightful)
Re:If "yes," then it's not self-driving (Score:5, Insightful)
Forget about sensors for a moment: We don't deal with malfunctioning PEOPLE right now. Drunks, old people, and visually impaired people routinely climb behind the wheel every day. We are already running over darting children, cyclists, and pretty much anything else with the temerity to set foot, hoof, or paw on the road. Old people ramming cars into crowds because they can't tell the brake from the accelerator are just the cost of doing business in a free society.
A self-driving system doesn't have to be perfect, it just has to be better than what we have now when we scale it up. Given that you can give a driving AI the equivalent of millions of miles road experience in all conditions, I doubt that AI's will drive worse than human beings for much longer.
The insurance companies will need to be convinced for sure, but they will be when self-driving systems demonstrate their superiority.
Re:If "yes," then it's not self-driving (Score:4, Interesting)
We don't deal with malfunctioning PEOPLE right now. Drunks, old people, and visually impaired people routinely climb behind the wheel every day.
We don't deal with these problems, because we have bad laws. We have bad laws because politicians want to please lobbyists, and don't want to seem "soft" on crime or negligence. As a result, they pass laws that are too strict (DUI laws being a classic example: studies show the majority of people are NOT significantly impaired at 0.08%).
When unreasonable laws are passed that victimize essentially "innocent" people, people lose respect for the law. DUI laws are one example; marijuana laws (now repealed in some places, and heading that way in others) are another.
A self-driving system doesn't have to be perfect, it just has to be better than what we have now when we scale it up.
Nope. Based on past advances in automobiles (ABS, airbags, power steering, computer throttle control), what will happen is that they will get released, and they will have some major screwups (or public perception of screwups anyway), and there will be a flurry of very heavy-duty lawsuits, and it will go away for a while. Then they'll come back in new and improved form. Then there will be a couple of more lawsuits, and some recalls. Sales will go down a bit and improve again. And it will gradually smooth out. Probably.
It's a bit like the "ringing" effect in some kinds of oscillators.
Re: (Score:3)
In France, the DUI limit is 0.05%. My anecdotal experience is that this threshold does not seem too low: I certainly do not have the same reflexes or spatial awareness when I am close to this threshold. And I do not think this is a corner case.
Re: (Score:3)
Re: (Score:3)
"We have bad laws because politicians want to please lobbyists,"
I'm pleased to note that autonomous auto manufacturers won't stoop to employing lobbyists.
Re: (Score:3)
I have never seen a single study showing that 0.08% is "too strict". In fact, it is extremely lenient by most other countries' standards. A quick perusal turned up this:
http://trid.trb.org/view.aspx?... [trb.org]
It concluded impairment begins with any deviation, and almost all people are significantly impaired by 0.08% (lending credence to the idea that the line is too lenient, not too strict).
If you have a study that actually shows what you purport, I'm sure people would love to see it.
Re:If "yes," then it's not self-driving (Score:5, Insightful)
Even if you can account for such things, how will your autonomous vehicle handle malfunctioning sensors? Aerospace has been working at this for decades and still hasn't figured it all out [wikipedia.org].
Detecting a malfunction in a sensor is hard, really hard. You'll need more than one sensor, preferably of different types, to realise there's an error, and then you have to decide which of the contradictory sensor results is the correct one. And since sensors naturally return slightly different results anyway, you'll have to account for that as well.
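The cross-checking idea can be sketched in a few lines. This is an illustrative toy, not any real automotive implementation; the function name, readings, and tolerance below are invented:

```python
import statistics

def fuse_redundant(readings, tolerance):
    """Median-vote across redundant sensor readings.

    The median of the readings is used as the fused estimate, and any
    sensor deviating from it by more than `tolerance` is flagged as
    suspect. With three or more sensors, a single faulty unit cannot
    drag the median far from the true value.
    """
    median = statistics.median(readings)
    suspects = [i for i, r in enumerate(readings)
                if abs(r - median) > tolerance]
    return median, suspects

# Three speed sensors; normal noise keeps them within 0.5 of each other.
value, suspects = fuse_redundant([99.8, 100.1, 57.3], tolerance=0.5)
# The outlier (index 2) is flagged; the median still tracks the true value.
```

The tolerance is exactly the "slightly different results" problem: set it too tight and healthy sensors get flagged, too loose and a drifting sensor goes unnoticed.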
So let's say we solved this. Then you know there's a problem. For an autonomous car it's simple: it could decide to continue (minor problem), or stop (e.g. tyre blow-out or other major problem that makes it unable to continue, or simply "I don't know how to handle this situation, so I pull over to the side of the road and stop to have my human overlords sort it out"). In the second scenario an automated call to the repair service could be included, so the human(s) in the car can continue to sleep while it's being fixed and after that be sent on their way again.
An airplane doesn't have this fail safe stop option, and needs to have human overlords present at all times to take control if something happens the programmers didn't foresee.
Re: (Score:3)
An airplane doesn't have this fail safe stop option, and needs to have human overlords present at all times to take control if something happens the programmers didn't foresee.
Even then, there are arguments for removing the human pilots today, because they actually cause around half the accidents.
Re: (Score:3)
"Detecting a malfunction in a sensor is hard, really hard. "
It depends. You have a known range the sensor will read and a known rate of change. For example, the sensor in my BMW that measures steering angle will read from 10 to 65525; it can read from 0 to 65535, but the physical limits of the mounting will not allow it, which is fine. The computer system also knows that it is 100% impossible to have a rate of change of more than plus or minus 3500 per second, so any rate of change higher than that can be flagged as a sensor fault.
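The range-plus-rate check described here is straightforward to sketch. The numeric limits below are the ones quoted in the comment; the function name and structure are illustrative, not any real BMW implementation:

```python
def plausibility_check(prev, curr, dt,
                       lo=10, hi=65525, max_rate=3500.0):
    """Flag implausible steering-angle readings.

    `lo`/`hi` are the physically reachable raw counts (the mounting
    prevents the full 0-65535 range being reached) and `max_rate` is
    the maximum plausible change per second.
    """
    if not (lo <= curr <= hi):
        return "out of range"
    if abs(curr - prev) / dt > max_rate:
        return "implausible rate of change"
    return "ok"

print(plausibility_check(30000, 30100, dt=0.1))  # 1000/s: plausible
print(plausibility_check(30000, 31000, dt=0.1))  # 10000/s: flagged
```

Note that this only catches gross faults; a sensor drifting slowly within its legal range passes both tests, which is why redundant sensors are still needed.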
Re: (Score:3)
You wrote, "Detecting a malfunction in a sensor is hard, really hard."
Actually, it is quite simple to do. If you get anything more than the cheapest of sensors, they continually run self-diagnostics and report the results. Some failures cause the sensor to freeze up and stop updating: if it keeps sending the same data, it's easy to detect that the value has stopped changing, and if it stops sending any data at all, it's easy to see a step change that should not have occurred. You can also use redundant sensors.
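The stuck-value case is similarly easy to sketch: watch a sliding window of samples and flag the sensor when the readings stop jittering. The class name, window size, and threshold below are invented for illustration:

```python
from collections import deque

class StuckSensorDetector:
    """Flag a sensor whose reading has stopped changing.

    A healthy analog sensor always jitters slightly, so a full window
    of (near-)identical samples suggests a frozen output.
    """
    def __init__(self, window=50, epsilon=1e-6):
        self.samples = deque(maxlen=window)
        self.epsilon = epsilon

    def update(self, reading):
        self.samples.append(reading)
        if len(self.samples) < self.samples.maxlen:
            return False  # not enough history to judge yet
        spread = max(self.samples) - min(self.samples)
        return spread < self.epsilon  # True means "looks stuck"

det = StuckSensorDetector(window=5)
for r in [1.0, 1.01, 0.99, 1.02, 1.0]:
    stuck = det.update(r)   # healthy jitter: not flagged
for r in [2.0] * 5:
    stuck = det.update(r)   # five identical samples: flagged
```

A total dropout (no data at all) is even easier: a watchdog timer on the message bus catches it without any statistics.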
Re:If "yes," then it's not self-driving (Score:5, Interesting)
Therefore, you either keep the abstraction simplified and require the pilot to do a bit more work, or re-instate the flight engineer.
Last year I worked on a stacker/reclaimer (i.e., the big wheeled bucket loaders that either scoop up huge piles of coal or stack coal into piles), which has an operator out in a cabin doing the driving. The same system was being sold to two different customers. The first customer wanted an automatic mode that would guide the machine around the piles of coal and scoop/deliver so as to manage the material field optimally, while the operator sat back and basically watched.
The second customer basically said "I don't want no damn stinkin' automatic mode, because if I'm payin' for an operator to sit out there, he better be working"
Re:If "yes," then it's not self-driving (Score:5, Insightful)
I'd anticipate that autonomous vehicles would be able to react a lot quicker and of course they wouldn't be distracted (driver distraction being the number one cause of accidents). In the case of an emergency, I wouldn't want the vehicle to be relying on the human to wake up, figure out what was going on and then take appropriate action.
Not completely self-driving (Score:3)
Actually, this would be a problem. The USAF is currently struggling with some of this: they automated their drones so much that operators don't have enough to do to keep proper attention on the drone in case something does happen. They're actually considering removing some of the automation...
I don't disagree that this is the most likely current situation, but it's going to be virtually impossible to keep the driver from doing other things as you remove more responsibility and control from them.
Re: (Score:3)
Well I'd hope that the car on autopilot would have slowed to a halt until I was past whether or not the driver was on the phone. That raises another interesting possibility, people deliberately driving aggressively to make cars on autopilot get out of their way...
Re:If "yes," then it's not self-driving (Score:5, Insightful)
Self-driving cars should be the legal equivalent to sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one means you're responsible/liable for fuel & maintenance - and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation. (Otherwise, things such as self-valet and timed pick-ups won't happen)
Let's be realistic. Self-driving cars are coming, but it is going to be a gradual transition. We've already seen the beginning of it with adaptive cruise control and self-parking. These features will continue to be refined while new ones are added, but we almost certainly face years (decades?) of gradual transition where our cars are some weird hodgepodge of self driving and user operated. The laws governing this won't be nearly as straightforward as you suggest.
Re: (Score:2)
Let's be realistic. Self-driving cars are coming, but it is going to be a gradual transition. We've already seen the beginning of it with adaptive cruise control and self-parking. These features will continue to be refined while new ones are added, but we almost certainly face years (decades?) of gradual transition where our cars are some weird hodgepodge of self driving and user operated.
What's funny is this whole bit about "Where is my flying car?" when realistically it won't happen in any quantity until personal flying is almost completely automated. And I don't see that happening until we CAN, at least, make reliable automated cars.
The collision-avoidance problem, in some ways, is multiplied in the air. At least on the ground you have specific lanes with traffic control devices on them (lights, etc.).
Re: (Score:3)
The collision-avoidance problem, in some ways, is multiplied in the air. At least on the ground you have specific lanes with traffic control devices on them (lights, etc.).
Just the opposite. Consider that we developed drones long before we developed a self-driving car. You can program specific lanes for flying; they're used all the time by commercial aircraft. But by the same token there's a lot less static clutter, margins are greater (no worrying about whether the kid on the side of the road will dart out), etc...
There are reasons why we developed self-piloting planes decades before we developed self-driving cars.
Re:If "yes," then it's not self-driving (Score:4, Insightful)
Given that operator handoff is most likely to happen either under relatively hairy conditions, or when some system failure has left the automated systems unable to cope, there isn't an obvious incentive to relax the (already not terribly demanding, at least in the US) requirements placed on licensed drivers until 'self-driving' actually does mean 'self-driving'. If it means 'sometimes self-driving, except the hard parts', that may require less operator effort, but not obviously less operator knowledge (if anything, given that drivers usually get somewhat safer with experience, at least until they hit the point where each additional year stops making them less young and stupid and starts making them more old and inept, I'd be particularly worried about the likely performance of somebody whose vehicle is sophisticated enough to coddle him most of the time, then screams and hands him the wheel when the situation is already halfway lost.)
I have no doubt that the laws (or at least the liability litigation and insurance-related contracts, even if carried out under existing law) for damage and death caused by partially-automated vehicles will be an epic nightmare of horrendous proportions; but on the operator licensing end, "If you might have to drive it, you need a driver's license; if you won't have to drive it, you don't." really covers a lot of territory. There might be some incremental adjustments, mostly to the format of the test (say, allowing use of a rear-view camera in addition to mirrors and over-the-shoulder during tests of parking), but not too much need to complicate things.
Re: (Score:3)
Re:If "yes," then it's not self-driving (Score:5, Insightful)
Given that operator handoff is most likely to happen either under relatively hairy conditions, or when some system failure has left the automated systems unable to cope,
Euhm... let me get this right... you expect cars to drive automatically, except when it gets difficult or something else unexpected happens it suddenly gives back control to the driver. That's what you mean, right?
Bad idea. Very bad idea. The driver is probably reading the paper, or is dozing off, or otherwise simply not paying attention to the road, as the car is doing the driving and he has nothing to do. He's not supposed to do anything about driving, as the car is in full automatic driving mode. Suddenly asking for attention, then expecting the driver to handle a difficult situation instantly, is asking for accidents. Many more than when the driver was in control already, and possibly sees the situation coming, so anyway has much more time to react.
To allow the driver to fully hand off control to the car, the car should be able to handle it all. The driver-assist functions available on certain cars nowadays are a great start towards full control by the car: now the car will intervene in certain emergency situations; when that's all settled, we can think about handing over control of the rest of the ride as well. For fully automatic driving, the car should not rely on human intervention, ever.
Re: (Score:3)
Disagree 100%. I think handoff will be in quiet, planned circumstances. More like an airplane autopilot than a dumb cruise control that has a high chance of spinning out in hydroplane situations, or would happily ram you into the back of the car in front or run you off the road if you stopped paying attention long enough.
Common sense says a handoff during dangerous situations is quite pointless. Say you are driving along a straight road, expecting nothing evil, either driving yourself or with the car driving. Totally unexpectedly, a moose jumps into the road. If you are driving yourself, reasonably concentrated, you will have a hard time handling this correctly. A self-driving car may handle it better (depending on the situation there might still be an accident, because some accidents are unavoidable).
However, what is absolutely guaranteed...
Re: (Score:2)
Self-driving cars should be the legal equivalent to sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one means you're responsible/liable for fuel & maintenance - and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation.
The autonomous car is safe only within its operational limits --- but how many drivers will be willing to let a car or its manufacturer decide when it is safe to take to the roads?
How many will risk being stranded if automated systems begin shutting down because they are confused and overwhelmed by bad weather, outdated maps, or other unforeseen circumstances?
Re: (Score:2)
How many will risk being stranded if automated systems begin shutting down because they are confused and overwhelmed by bad weather, outdated maps, or other unforeseen circumstances?
Probably the same number who are willing to try horseless carriages that might get overwhelmed by bad weather, outdated maps, or other unforeseen circumstances.
Re:If "yes," then it's not self-driving (Score:4, Interesting)
By that theory, nobody ever drives anywhere, because there could be an unexpected road closure. I go lots of places where there is only one road, and if it is closed (which happens) then you can either try the next day, or drive an extra 250 miles. I've never once heard of it as a reason people don't go to those places. Even a doctor isn't going to stay in town and never go to the beach on a day off because of some small percent chance the road would be closed.
If the car is leased with a service agreement (likely for early versions) then you probably just call roadside assistance if it strands you, and they send a tow truck, same as AAA.
Gosh, nobody would even play golf, because of the lightning risk.
Re: (Score:2)
Self-driving cars should be the legal equivalent to sitting in the back of a taxi. Even from an insurance/liability standpoint, owning one means you're responsible/liable for fuel & maintenance - and that's about it. It should be down to the manufacturer to ensure safe, autonomous operation.
The autonomous car is safe only within its operational limits --- but how many drivers will be willing to let a car or its manufacturer decide when it is safe to take to the roads?
How many will risk being stranded if automated systems begin shutting down because they are confused and overwhelmed by bad weather, outdated maps, or other unforeseen circumstances?
How many drivers? Probably not that many, unless the safety envelope is very wide indeed. However, if the autonomous car is, in fact, autonomous, the question is also how many non-drivers would like to have access to the road, most of the time, without becoming drivers. Especially if they don't have a choice (visual or other incapacity that precludes driving, alcohol violation, too young, etc.) or their use case is relatively miserable driving (if you are going to have a shit commute in heavy traffic, do you
Really? Come on now, you should know better. (Score:2)
You should know better than to make such assertions; we have plenty of evidence against the idea that technology will ever be this good. Since the 1960s we have been automating space travel and airliners, and we still need pilots and astronauts, because when the shit hits the proverbial fan humans are required to intervene. Sometimes to correct problems with the technology, and sometimes to bypass it and fly by hand.
Drones require people to pilot them too, so don't try to go down a bad path.
Re: (Score:3)
Since the 1960s we have been automating space travel and airlines, and still need pilots and astronauts because when the shit hits the proverbial fan humans are required to intervene.
We have pilots to make passengers feel good. We have astronauts because we can't make a robot as dextrous as a human yet.
Re: (Score:3)
What I wanted to show by bringing up this example is that in current airplane design, there are circumstances in which automation is known to fail (in this case, unreliable/defective sensors). In these circumstances, the systems are designed to give control back to the pilot. The rationale for this is quite clear. It could be argued that fully working automated systems are safer and more reliable than humans. However, automate
Re:Really? Come on now, you should know better. (Score:5, Insightful)
Nothing, and that means absolutely nothing, ever made by man has been perfect.
A self-driving car does not have to be perfect. It just has to be better than the alternative.
With motor vehicles already being the number one killer in the US annually, we want human intervention early and often.
Isn't the fact that motor vehicles are already the number one killer in the US annually actually an argument for automated cars?
As stated above, a half a century has not perfected "self driving" anything else.
Five centuries of work before that never perfected heavier-than-air flying machines either, until one year, presto, all the necessary preconditions were finally met and airplanes became a reality. There's nothing linear about progress.
Re: (Score:2)
Re: (Score:2)
If "yes," it's worse than traditional cars. Even if you're stuck in a traffic jam, if you have to pay attention (in case something goes wrong with your car), there is no benefit.
Re: (Score:3)
You don't have to be licensed so that you can pay attention "in case something goes wrong," though you'll probably be expected to push the car out of the roadway if physically able.
The reason you have to be licensed is that if the car malfunctions and creates an insurance claim, there is lots of existing legal precedent related to insurance liability that means the insurance company will require a licensed driver, until the laws are changed by people not scared of self-driving cars. That will take up to 50 years.
Re: (Score:2)
A driver's license is not really entirely about driving, which is why some jurisdictions refer to them as operator's licenses.
To operate a motor vehicle, you're showing competence in the vehicle's operation. For a normal car, that means mostly the in-motion controls and knowledge of the law, but there's also a section of most tests where you're required to demonstrate mastery of the machine and the ability to keep it in good condition: demonstrating the indicator lights, completing a knowledge test, passing emissions
Re: (Score:2)
I know, right? Like, how can you drive your license around if you're not driving. Oh wait, but it says the car will be driving. Wait, I don't even drive my license as it is, I drive my car!
If it doesn't drive, I'll agree it isn't self-driving. But if it isn't licensed, then I can only agree it is not self-licensing.
It is a bit of a "no-brainer" that at first a licensed driver will be required, for the purpose of integrating normally into the existing insurance law and regulation. Only after they're common w
TL-DR (Score:2, Insightful)
There will be detractors, luddites, and evangelists, sociopaths and attention whores all vying for a moment in the sun.
Welcome to the human race. I'll go get my popcorn.
Do pilots still need licenses? (Score:5, Insightful)
For a long time, an autonomous car will not be driverless. People need to get over this notion that next year a car will drive itself and you'll sit in the back with a Martini and the paper. That probably won't happen in our lifetimes.
Initially, fully autonomous modes will only be permitted on certain roads (think limited access roads like highways, freeways and autobahns). This will last years as engineers are even more conservative than law makers. The next step is likely to be special lanes on A roads. It will be a long time before autonomous cars are good enough to operate on a B road or suburban street.
Ultimately, because the law requires someone to be responsible for the operation of the machine it means a qualified operator will need to be at the controls whilst in operation. Same with a lot of other automated systems (such as long distance trains).
Re: (Score:3)
Unless you are over 80 it is going to happen in your lifetime.
Fully autonomous vehicles will be driving on all roads (except possibly rural ones) in the near future.
Re: (Score:2)
Your dad said the same thing about flying cars.
Re: (Score:2)
Sorry but chess is a well defined game with simple rules. Playing chess needs excellent memory and fast computation. Computers are very good at that. Driving a vehicle is very different and requires much higher and different intelligence.
Re: (Score:3)
Every time a computer gets good at a task once thought to be outside of the realm of AI
Who ever said that?
Chess computers win by computing all the possible outcomes for a large number of moves ahead. The only limiting factor is computer speed. As computers get faster they can follow very similar algorithms and get much better. At base, chess is not hard if you can calculate far enough into the future. People cannot calculate far enough and therefore use other methods to win. By the way, we have had chess computers for 50 years.
The problem with real life is that it is not constrained by simple rules like chess.
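The brute-force look-ahead described above is essentially minimax search. Here is a toy sketch over a hand-made two-ply game tree (not real chess; the leaf scores are invented for illustration):

```python
def minimax(node, maximizing):
    """Exhaustive look-ahead over a game tree.

    Leaves are numeric position evaluations; internal nodes are lists
    of child positions. This is the 'compute all outcomes N moves
    ahead' strategy; real chess engines add alpha-beta pruning and
    heuristics because the full tree is astronomically large.
    """
    if isinstance(node, (int, float)):  # leaf: static evaluation
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Two plies: our three candidate moves, then the opponent's best reply.
tree = [[3, 12], [2, 4], [14, 1]]
best = minimax(tree, maximizing=True)  # -> 3
```

The opponent minimizes on its turn, so the branch starting with 14 is worth only 1 to us; picking the branch whose worst case is best yields 3.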
Re: (Score:3)
The problem with real life is that it is not constrained by simple rules like chess. There are too many variables and too many situations that are non-deterministic. For example, if you see a person standing near the curb, what do you think they might do? The prediction is based on many things: age, gender, which way they are facing, what they are doing, etc. They might just stand there or they might dart out into traffic. It is very hard for a computer to make predictions. The same goes for other vehicles.
Chess is easier. However, in a game of chess, you have an opponent who actively tries to beat you. In a car, other drivers don't actively try to hit you. That person at the curb doesn't wait for an opportunity to throw himself in front of your car and get killed.
Re:Do pilots still need licenses? (Score:4, Insightful)
Sorry but chess is a well defined game with simple rules. Playing chess needs excellent memory and fast computation. Computers are very good at that. Driving a vehicle is very different and requires much higher and different intelligence.
Every time a computer gets good at a task once thought to be outside of the realm of AI, people simply rationalize how it wasn't that hard in the first place. Soon people will be saying how self-driving cars weren't that hard to create for whatever reason. Then something else will be "impossible" for a computer to do.
You know the funny part about your story here?
It's actually believable...in a universe devoid of lawyers.
We live in a world today where litigation is the main limiting factor with technology, not the technology itself.
After watching the birth of patent trolls as well as companies spending millions fighting for inventions like "rounded corners", I would have thought this would have been more obvious. It will be when vendors are sitting around for a decade waiting to deploy their inventions while we argue who should be sued when it fucks up.
Re: (Score:2)
Unless you are over 80 it is going to happen in your lifetime.
Why are you so certain? The technology we have today is certainly not fully autonomous, and the stuff Google is working on isn't leading towards that. Do you have a reasoning behind your estimate, or is it merely a guess?
Re: (Score:2)
The Google stuff is fully autonomous already. They have steering wheels for two reasons: they're required by the State of California, and the cars are prototypes that need to be steerable with some of the equipment turned off.
A consumer model doesn't need to be able to still be operated after a malfunction. It can just shut down and call a tow truck.
Re: (Score:3)
Re: (Score:2)
You have no idea how long I'll live, you insensitive clod! :P
Re: (Score:3)
Not quite. It's "yes" because most people would be unable to get over their fear of flying in an entirely autonomous plane, not because we need heroic pilots to override the computer when things go wrong.
Consider that about half of all aviation accidents are traced to pilot error. The percentage of crashes caused by autopilot error is zero.
Re: (Score:2)
Not quite. It's "yes" because most people would be unable to get over their fear of flying in an entirely autonomous plane, not because we need heroic pilots to override the computer when things go wrong.
Consider that about half of all aviation accidents are traced to pilot error. The percentage of crashes caused by autopilot error is zero.
No,
Pilots are still there because autopilots can fail.
http://www.dailymail.co.uk/new... [dailymail.co.uk]
You didn't hear about this 4 years ago because no-one died... Thanks to some quick thinking by the "error prone" lumps of meat in the cockpit.
Re:Do pilots still need licenses? (Score:5, Insightful)
Bullshit. Watch Air Crash investigations, there are enough autopilot caused incidents to show why pilots are needed.
You might think that, but you'd be wrong...
Pilot error is the single largest cause of all aircraft crashes...
The reality is, if you simply accepted that every time the computer messed up, everyone would die, you would have FEWER deaths than you would with the current system.
Because no one wants to hear that and can't emotionally accept any deaths as "known", we have what we have.
Source? 15 years of aviation experience, thousands of hours of flight time in everything from helicopters to corporate jets, Certified Flight Instructor in both airplanes and helicopters. I know better than most people on Slashdot about this subject. Just because you watched something on The History Channel doesn't make you an expert.
Re: (Score:2)
People need to get over this notion that next year a car will drive itself and you'll sit in the back with a Martini and the paper. That probably won't happen in our lifetimes
It'll happen during the next decade. Bet against Dr. Moore at your own peril.
(granted, the government will lag 20 years behind the technology, so we'll still have drunk drivers killing people when the autopilots would have been safer)
Re: (Score:2)
People need to get over this notion that next year a car will drive itself and you'll sit in the back with a Martini and the paper. That probably won't happen in our lifetimes
It'll happen during the next decade. Bet against Dr. Moore at your own peril.
(granted, the government will lag 20 years behind the technology, so we'll still have drunk drivers killing people when the autopilots would have been safer)
The concentration of transistors has nothing to do with this.
You rely on a bad interpretation of Moore's Law at your own peril.
The technology will be adopted slowly because any mistake will kill the technology. When your laptop crashes due to a production fault, you might lose a little bit of work that you'll have to redo; when a car crashes due to a production fault, there's a good chance people will die. So ordinarily cautious and conservative car companies will be even more cautious and conservative.
Re: (Score:2)
The problem with the design of current cars is that a deadly defect is built in; a steering wheel allowing direct, manual human control!
This design flaw causes the vast majority of all automobile accidents.
Re: (Score:2)
I agree that adoption will be gradual. The first generation of "self-driving" cars will probably have "smart cruise-control" and "self-parking" modes, but the driver will still be expected to be at the wheel and ready to take control if needed. Next, the vehicle will be smart enough to take you from start to destination by itself, but only in good weather and relatively common driving circumstances. Eventually, engineers will probably figure out how to make these systems so smart and reliable that we can
Bad Analogy (Score:3)
Do pilots still need licenses in the age of autopilot? Well, yes, because machines aren't infallible.
This is a terrible analogy. First, an aircraft autopilot cannot taxi the plane, so it is not feature-complete. Second, the consequences of mechanical failure in a car are far less severe, and you can probably handle most failures that don't involve the engine dying with a kill switch and a steering wheel: just yank the switch and steer the now rapidly braking car out of trouble. A kill switch on an aircraft is a somewhat less viable option, which is why you need a pilot.
Should all car drivers be able to ride a horse? (Score:2)
I think it's pretty clear that within a few decades the car with a driver will be the anomaly. The economic advantages in large areas of transportation (trucking, taxis, delivery, etc.) are so huge that the technology will be adopted, and the transition to home vehicles is inevitable because the cost i
Re: (Score:2)
The cost for the array of sensors is far from minimal at the moment. Maintenance on them will add up too, you have new complicated pricey parts. The majority of people are probably driving cars worth $5K or less. Cheap low maintenance human-driven vehicles will be the norm for the foreseeable future, outside of wealthy suburbs.
Re: (Score:2)
The cost for the array of sensors is far from minimal at the moment. Maintenance on them will add up too, you have new complicated pricey parts. The majority of people are probably driving cars worth $5K or less. Cheap low maintenance human-driven vehicles will be the norm for the foreseeable future, outside of wealthy suburbs.
Driverless cars will probably be introduced as a taxi-like service. That way the cost will be spread out over a large customer base. At some point most young couples will decide not to get a second car because the autonomous service will take one of the spouses to work and back. Then, with a generation or two of families having only one vehicle, new young couples will start by bypassing owning a car in the first place. Or at that point, they will have become economical enough, the one car the family does own w
"Promise a future where we can sip cocktails" (Score:3)
And I stopped listening right there.
Only fucking MORONS want this sort of thing.
When you're in a piece of heavy machinery, like a car, even if you're NOT driving it, you DON'T want to be impaired in case of an emergency.
So, drinking in a self-driving car is pretty much out. And for many of the reasons this dipshit talked about. MALFUNCTIONS.
Before you bring up bus and rail transport, keep in mind that there are people actually driving those. And, in the case of long-distance trains, crews full of people, all better trained at running the transportation than you are.
Re: (Score:2)
Before you bring up bus and rail transport, keep in mind that there are people actually driving those.
Many subway trains are now automated. Expect all other vehicles to follow eventually.
Re: (Score:2)
Subways run on a track. And that track, moreover, is enclosed away from anything else that could run onto it.
Re: (Score:2, Insightful)
"Only fucking MORONS want this sort of thing.
When you're in a piece of heavy machinery, like a car, even if you're NOT driving it, you DON'T want to be impaired in case of an emergency."
Wha?? How is this any different from taking a cab home?
Am I putting myself in horrible danger when I take a cab home after a night of drinking? After all, I don't want to be impaired in case of emergency. Friends don't let friends take a cab?? -rolls eyes-
Blah... you people worry too much.
Re: (Score:2)
Re: (Score:2)
That seems a bit extreme. I don't know about you but if I drink a small amount of alcohol it won't impair my ability to react to an emergency in any meaningful way.
Re:"Promise a future where we can sip cocktails" (Score:4, Interesting)
You'll find that people hold self-driven cars to a much higher standard than we have today. I think people just fear change.
Insurance (Score:4, Interesting)
Re:Insurance (Score:4, Insightful)
Re: (Score:2)
If two self-driving cars are involved in a collision, who is responsible for the damages?
If the cars are owned by individuals and not a taxi service, it will probably come down to whether they've kept the software up to date. If one person's car has the latest patches, the other person hasn't updated in three years, and a missed update would have avoided the accident, the person who didn't maintain their vehicle's software will be liable.
Completely stupid (Score:2)
The real question is do we need traffic lights (Score:2)
or interchanges. If the cars are well guided and coordinated, you could have full-speed ground-level crossings where the cars just space themselves out enough to weave past each other. It would be terrifying at first, but people would get used to it.
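The "weave past each other" idea above is essentially slot reservation: each approaching car asks the intersection for the earliest crossing window that doesn't overlap anyone else's, instead of waiting for a light. A minimal first-come-first-served sketch of that scheme — the class name and the 2-second crossing time are purely illustrative assumptions, not any real system's API:

```python
CROSSING_TIME = 2.0  # seconds a car occupies the conflict zone (assumed)

class IntersectionManager:
    """Grants each approaching car the earliest crossing slot that
    does not overlap an already-reserved one."""

    def __init__(self):
        self.reservations = []  # list of (start, end) times

    def request_slot(self, arrival_time):
        start = arrival_time
        for (s, e) in sorted(self.reservations):
            if start + CROSSING_TIME <= s:
                break              # fits entirely before this reservation
            start = max(start, e)  # otherwise wait until it frees up
        self.reservations.append((start, start + CROSSING_TIME))
        return start               # time the car may enter the crossing

mgr = IntersectionManager()
print(mgr.request_slot(0.0))  # first car crosses immediately: 0.0
print(mgr.request_slot(1.0))  # conflicts with (0, 2), delayed to 2.0
print(mgr.request_slot(5.0))  # free gap, crosses on arrival: 5.0
```

First-come-first-served is the simplest policy; a real coordinator would also have to handle multiple conflict zones per intersection, cars that can't hit their granted slot, and a fallback when a non-communicating vehicle (or pedestrian) shows up.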
Why don't we just incorporate them? (Score:5, Funny)
Why don't we just incorporate them?
Then they'll legally be people, and they can get their own driver's licenses!
Re: (Score:2)
you dear sir,
just won a internetz!
Thank you for making my day.
(wish I had mod points)
Don't spook the horses! (Score:2)
Requiring a license for a self-driving car is the modern red flag to avoid spooking the lawyers.
Wide Load. (Score:2)
When the automobile debuted, the UK passed the infamous Locomotive Acts (otherwise known as the Red Flag Law), requiring someone to walk in front of a "horseless carriage" waving a red flag.
The first Locomotive Acts were passed in the 1860s.
Forget the "horseless carriage." We are talking about road trains: huge, heavyweight, steam-powered agricultural tractors, balers, threshers, bulldozers, steam shovels, and the like.
To this day flagmen and escort vehicles serve the same purpose.
Everyone will still need a licence (Score:2)
But it won't be called a driver's licence; it will be called a government-issued photo ID.
You need one now to cash a check, vote, etc.
(I would have used the term "state issued ID", but SCOTUS still has to decide what the term 'state' means.)
Note to grammar nazis - would have is pronounced would of
Of course (Score:2)
The state needs the ability to track the movement of the populace. If they had unlicensed cars on the road, people would be able to move about freely, without being surveilled. Imagine the safety implications there...
Also, they would lose an avenue for much needed recurring revenue, and something to hold over the head of criminals.
It's almost like you think you live in a free country?
A horse for a horseless carriage (Score:5, Insightful)
Self driving does not have to mean self reliant (Score:2)
Some
The license isn't the issue... the insurance is (Score:3)
Let's say my self-driving car runs someone over... who is liable?
Children (Score:2)
I think a more interesting question is: will guardians be allowed to put children in cars alone?
( Jimmy, your parents just called and said they would be late. I'm going to call you a robotaxi. )
Re: (Score:2)
An interesting problem. You might have an individual who was out drinking heavily, with the expectation that the car can drive him home. But if the car comes across something it can't handle, the car owner would be in no condition to take over control.
Re: (Score:2)
But if the car comes across something it can't handle, the car owner would be in no condition to take over control.
At that point the car says, "Sir, would you mind if I hand control over to a licensed remote driver? An inebriated silence will be taken as acceptance." Then the car will do the equivalent of today's OnStar system and have a professional take over.
Re: (Score:2)
All you have to do is make sure that the rate of such incidents and the accidents they cause or can't avoid is less than what a human driver, even a somewhat below average one would experience. Let part of the purchase price of the vehicle go toward insurance. If the manufacturer can get the accident rate down, then they will make more money. The incentives are right, just let t
Re: (Score:2)
Nonsense, the insurance would never get shifted onto the manufacturer, because maintenance happens after that, and is part of the accident risk. The vehicle owner will be required to pay for ongoing insurance, and will be required to be licensed (for the purpose of purchasing insurance!) until the car insurance requirements get shifted so that non-licensed drivers can buy insurance on self-driving cars.
But I like the taxi analogy. And it keeps working too; just like in that situation, the most likely thing
Re: (Score:2)
The computer, having detected the loss of brakes, will not ask the magical human to save it, it will just activate the emergency brake and flashers, maintain control of the vehicle, stop in the lane, and call roadside assistance.
Then the car will ask you to do a manual chore: pushing the car off the road. And there is absolutely nothing about the brakes going out that makes the human suddenly better at controlling the steering wheel, avoiding accidents, or obeying traffic laws.
Humans are too stupid to opera
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
IMHO the invention of the mouse "by apple" spawned one of the darkest ages of humanity: noob computing. The mouse enabled people to (ab-)use computers without having the least bit of insight -- enabling stupidities like facebook, X11, or slashdot beta.
And Anonymous Cowards posting drivel in Slashdot comments.
Re: (Score:2)
Re: (Score:2)
I have trouble staying focused (and awake) on long trips of over an hour outside of city limits. Driving the 2+ hours between my college town and parents' house when I was at school was awful. Driving the 8 hours from home to my relatives' house in the next state is murder.
Re: (Score:2)
I can't wait for everyone to have one. Not a big fan of driving myself anyway, but I'm sick of the everyday accident on the way to work, or traffic slowed down because ONE jackass is screwing around on his phone and doing 50 in the 65 mph zone.
I swear I see that every day. If people can't be bothered to actually drive their cars, and that's a demonstrable fact for some, fine, give me (and them) autonomous cars.
Re: (Score:2)
Re: (Score:2)
So, it is your opinion that government may declare anything that's not explicitly enumerated in the Bill of Rights to be a privilege?
Walking on a street? Cooking a barbeque? Having children — or an abortion, as the case may be? Oh, wait [umkc.edu]!..
Of course, you may be onto something, because even something that is [wikipedia.org] explicitly enumerated as a right is routinely treated as a mere privilege nation-wide... Point is, of course, it s
Re: (Score:2)
States have the right to regulate commerce within their state, and Congress the right to regulate interstate commerce. Driving a vehicle falls well within the regulatory bounds of commerce, and can easily be argued within the government's domain of regulation, and not a unilateral right of the citizen.
So, I'll ask again. Do you have anything
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Driving on the road isn't the problem, it's driving on the road and not hitting the deer that just ran into it, or avoiding the knucklehead who just swerved into your lane because he's drunk.
If it was just a problem of navigating between A and B while staying inside the painted lines, it'd be a much easier problem.
Re: (Score:3)
... car drives license. License drives you.