Driverless Cars Need a Lot More Than Software, Ford CTO Says (axios.com) 163
In an interview, Ken Washington, Ford's Chief Technical Officer, shared the company's views on how autonomy will change car design. From the article: The biggest influence will be how the cars are bought, sold and used: "You would design those vehicles differently depending on what business model (is being used). We're working through that business model question right now," he said. The biggest misconception about autonomous capability is that it's only about software: "People are imagining that the act of doing software for autonomy is all you need to do and then you can just bolt it to the car," he said. "I don't think it's possible to describe what an autonomous vehicle is going to look like," he added.
Translation (Score:5, Insightful)
Another Translation (Score:2)
"Driverless Fords Need a Lot More Than Software" everybody else says
Re: (Score:2, Interesting)
You're probably right, though, in that a ride service like that will add whatever kind of micro-transactions or advertisements to their vehicles that they can get a…
Re: (Score:2)
Car manufacturers have rather more experience with creating software than any software company has with manufacturing cars. Most cars have dozens of computers in them already, and have for years.
Re: (Score:2)
Car manufacturers have rather more experience with creating software than any software company has with manufacturing cars. Most cars have dozens of computers in them already, and have for years.
True. But both suck at the other's job.
Re: (Score:2)
Which is Tesla? Because it's been doing a pretty good job of both.
Re: (Score:3)
Re: (Score:2)
Because I want to own my own car. I want control over the vehicle's internal hardware and software. I don't want someone else's dirty vehicle. I want to add the accessories I choose and have easy access to them. I want to take it off-road, or on long trips where it will wait for me. I want it at my beck and call 24/7. I don't want monthly payments. And that's just off the top of my head. In an emergency or other high-demand time, I want a 100% shot at immediate vehicle access. If you feel free living without a vehicle, with only a glorified Uber, paying far more per mile than ownership would cost for the same number of miles, you and others may prefer this. Corporations like Ford definitely want this yesterday. I, and likely many others, never will.
Fully autonomous won't happen tomorrow. Maybe it won't happen for 50 years but eventually it will & guys like you will be as rare & peculiar as cowboys in condos.
Re: (Score:2)
Re: (Score:2)
Things like this are generational change. I was reading the other week that fewer young people are even bothering to get a driving license these days. And that's before autonomous vehicles are even available.
It's like stick shift in the American market. Old timers always swore they were going to stick with them. After all, automatics were expensive, inefficient, and not as fast. But these days, younger drivers don't even know how to drive a stick shift. They learned to drive in an automatic and it's been automatics ever since…
Re: (Score:2)
Cost per mile will undoubtedly be greater for a non-owned fleet vehicle. But there are none of the fixed costs: no purchase price, insurance, road tax, servicing, or repairs. And for the next generation, no time or money spent learning to drive.
Business model... (Score:4, Insightful)
Re:Business model... (Score:5, Insightful)
Now if we could only find a way to program human drivers to that standard.
Re: (Score:3)
Was someone talking about taking away your right to drive? It strikes me that the first place self-driving vehicles are going to make major inroads is in long haul trucking, and that has nothing to do with taking away your right to drive, but rather with firing a whole lot of truckers, and making shipping cheaper.
Automation is coming, and you might as well accept it. I can't imagine self-driving vehicles are going to be common consumer products in the near future, but in thirty or forty years, I'll wager…
Re: (Score:2)
I'm just not sure this is going to make…
Re: (Score:2)
For now...
Re: (Score:2)
Re: (Score:2)
Or slaves to excellent machines, for that matter.
Re: (Score:2)
The business model should include protecting people and pedestrians at all cost.
If you want to be taken seriously, try to avoid hyperbolic phrases like "at all costs". In the real world, resources are always finite.
A car that protects itself while getting everyone killed probably won't have a great used car value.
Killing a human will cost millions or tens of millions in legal fees and payouts. Suggesting that these cars will intentionally prioritize avoiding mechanical damage over human life is absurd.
Re: (Score:2)
In the real world, resources are always finite.
How many years did it take the auto industry to be shamed by Ralph Nader into providing safety features for their customers?
Suggesting that these cars will intentionally prioritize avoiding mechanical damage over human life is absurd.
Depends on the business model. Not every business model will prioritize human life, and how these self-driving cars are programmed will follow from the business model.
https://www.technologyreview.com/s/542626/why-self-driving-cars-must-be-programmed-to-kill/ [technologyreview.com]
Re: (Score:2)
How many years did it take the auto industry to be shamed by Ralph Nader into providing safety features for their customers?
Immediately after Ralph Nader shamed the government into changing product liability laws.
Re: (Score:2)
People just valued cheap more than they valued safety.
People are very bad at understanding risk. Product liability laws shifted the cost from injured customers (who are poor decision makers) onto the bean counting accountants and lawyers at the auto and insurance companies (who are good decision makers).
The auto companies could then either make their products safer or pay higher liability payouts. Either of these will mean higher prices for consumers, who can then make a rational choice since the "cost of risk" is incorporated into the sticker price.
Re: (Score:3)
Really? Where were all the moderately more expensive cars with seatbelts that people had the option of buying?
One of the big problems with the market is that it's almost entirely filled with what manufacturers *think* people want (and will make them the most money), rather than actually *providing* those choices for those who want them. I suspect that simply requiring manufacturers to provide seatbelts as a low-profit option with easy aftermarket install would have had much the same effect, if somewhat more…
Re: (Score:3)
This is difficult, and is going to have to be a government decision eventually.
We'll never make a vehicle that will never kill anyone under any circumstances; there are just too many possible circumstances. The bigger problem is how it decides who dies.
The driver of a vehicle will always choose to save themselves over someone else. In fact, they'll likely choose to save themselves over several others. But what choice will the car make?
If people know that one make of car prioritizes the occupants of the vehicle…
Re:Business model... (Score:4, Insightful)
This is so simple. The car should save the occupants, just like any normal driver would have done. Trying to take this to some Asimov "do not cause harm" bullshit will practically require cars to be self-aware, at which point cars may not actually want to serve their masters anymore.
Re: (Score:3)
Except that for this to work right the manufacturers have to take the liability for the vehicle's actions as they're the ones doing the programming. Which means the manufacturer is going to do the math. 1 occupant or 3 pedestrians, the lawsuit for the 1 occupant will probably cost them less money, so they'd rather save the 3 pedestrians.
This isn't a simple choice, and it is not likely to be resolved decisively until regulatory agencies get involved (which they are guaranteed to do eventually).
Re: (Score:2)
And once again, why you'll see regulatory bodies being the ones who make these decisions eventually, not the manufacturers or the consumers. It will be taken out of their hands.
Re: (Score:2)
It's not that simple for human drivers, why would it be that simple for computer drivers?
People have lost major lawsuits while following all the traffic laws because they killed someone when they didn't have to. Computers will be the same. If there's an action it could reasonably have taken to prevent loss of life, but failed to do so, the manufacturer will be sued, and based on precedent, they'll lose.
Re: (Score:2)
This is so simple. The car should save the occupants, just like any normal driver would have done. Trying to take this to some Asimov "do not cause harm" bullshit will practically require cars to be self-aware, at which point cars may not actually want to serve their masters anymore.
If only there was some sort of Code, a Code for the Highway, that told you what you should do in these situations. A shame something like that doesn't already exist.
Re: (Score:2)
You are making some deeply faulty assumptions about the kind of logic in place here. You want to judge cars based on their handling of a simple trolley problem. But far more important is the ability of multiple vehicles to coordinate and minimize total risk. Cooperation is going to do far more to minimize crashes, injuries, and death.
Plus, there's balancing the kinds of injuries incurred. Thus, the prioritization logic would be more like:
No injury > vehicle damage > them minor injury > us…
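As a purely hypothetical sketch (no manufacturer's actual decision logic is public, and the outcome names and action names below are invented for illustration), a severity-ranked comparison along the lines of the ordering above might look like:

```python
# Hypothetical sketch only: rank predicted outcomes by severity and
# pick the action whose predicted outcome is least severe. Real
# systems are vastly more complex and not publicly documented.
from enum import IntEnum

class Outcome(IntEnum):
    NO_INJURY = 0
    VEHICLE_DAMAGE = 1
    MINOR_INJURY_OTHERS = 2
    MINOR_INJURY_OCCUPANTS = 3
    SEVERE_INJURY = 4

def choose_action(predicted: dict) -> str:
    """Return the action whose predicted outcome ranks least severe."""
    return min(predicted, key=predicted.get)

# Example: braking is predicted to injure the occupants, swerving
# left only damages the vehicle, swerving right is worst.
predicted = {
    "brake": Outcome.MINOR_INJURY_OCCUPANTS,
    "swerve_left": Outcome.VEHICLE_DAMAGE,
    "swerve_right": Outcome.SEVERE_INJURY,
}
print(choose_action(predicted))  # -> swerve_left
```

Even this toy version makes the thread's point visible: someone still has to decide where "us" and "them" sit in the enum.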
Re: (Score:2)
Nowhere did I ever claim that vehicles wouldn't coordinate, nor that any of this wouldn't minimize crashes, injuries, and death. Nor did I say that it would be simple, or that other injuries wouldn't be a factor as well. But at some point it comes down to that trolley problem, or an us-vs-them decision. Even your prioritization put "us" after "them", but based on what? And based on how many "us" and how many "them"?
If anything you've made my point for me. You made an assumption about who would be protected…
Re: (Score:2)
I didn't say that it was the formula, I said that the formula would be more like what I proposed. Yes, eventually, there does have to be some choice, but if the car is 1/10 as likely to be in a crash, and the automatic driving can cut the fatality rate in the remaining crashes by 1/10, then a slightly better formula for crash force minimization will easily outweigh the effects of all the ethical programming in the world.
Yes, it makes for a great philosophical debate. But it's obsessing over what is, from…
Re: (Score:2)
Once again, I fully agree that these will be MUCH safer than current vehicles, and nowhere have I ever stated otherwise.
But that "philosophical debate" isn't just philosophical; it's real. There are currently an estimated 1.25 million annual fatalities involving motor vehicles in the world. Even a system that's a full million times safer (and even the strongest advocates for self-driving vehicles have never claimed that) would still involve deaths averaging more than one per year. You can try all you want to…
Re: (Score:2)
The trolley problem is not real. It's a philosopher's and psychologist's plaything. It does not represent what happens in an emergency situation at all. When there's an imminent collision, people clearly don't think, weigh up alternative outcomes, and make a choice before operating the controls. They simply react. Like 99% of driving, the conscious mind that can make such high-level choices isn't being used at all. Driving is simply a behaviour that comes from the subconscious.
It seems likely that in an emergency…
Re: (Score:2)
People just react, but a computer has a lot more time to decide, and in fact it MUST decide because it can't work on intuition; it must choose every single action. So yes, it is a very real problem. The car will at some point have a choice between two things to hit, where not hitting anything isn't an option. It could be programmed to choose a random number between 1 and 2 and hit based on that, but it's more likely you'd program it to choose based on minimizing harm. But harm to whom?
Re: (Score:2)
No, it doesn't have to decide, any more than a human does. Just as a human does, it only has to react (or not). It's equivalent.
We're in the world of training, neural nets and fuzzy logic here, where there is no programmer who knows the specific rules by which the system is acting, just as the human conscious mind does not know the reasons for which the subconscious reacts. We can only guess.
You could have the developers make moral judgements on a series of these trolley-problem scenarios, assigning different weights…
Re: (Score:2)
Computers don't have an equivalent to "just react". Computers make decisions, they always do one thing, or another, never do they "just react". You specifically have to program which thing every computer will do.
And it is 100% guaranteed to be required to make a specific choice. it WILL be government mandated, it's only a matter of if the requirement comes before, or after, a driverless car kills someone.
Re: (Score:2)
Sorry, but you don't know what you are talking about. You need to study neural networks. They are not "specifically programmed"; they are trained with data sets. And not only are the ways they work not specifically programmed, a programmer cannot find out in any meaningful sense how they work.
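To illustrate the distinction being argued here (a toy example with nothing to do with any real driving system): in the tiny perceptron below, the programmer writes only a generic training loop. The rule "output 1 only when both inputs are 1" is never written anywhere; it emerges as learned weights.

```python
# Toy perceptron that learns logical AND from examples. No explicit
# AND rule appears in the code; the behaviour lives in the trained
# weights w and bias b.
def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in data])  # -> [0, 0, 0, 1]
```

Inspecting `w` and `b` after training tells you very little about "why" the net behaves as it does, and that opacity only gets worse as networks grow, which is the point being made about interpretability.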
Re: (Score:2)
The programmer does feed in the data sets though. And the car does only what it's programmed to do.
In your fantasy world, if a car plowed through a crowd of people for no reason, killing several of them, you'd just shrug and say "the car wasn't programmed, it was just trained with data sets." That's not how it works, and if in fact it did work the way you suggest, I can 100% guarantee that no regulatory agency on this planet would ever approve a self-driving vehicle.
Luckily for everyone, you don't have the faintest…
Re: (Score:2)
I'm afraid you're suffering from the Dunning-Kruger effect. You understand so little of how neural nets work, you don't even know how little you know.
As to regulatory agencies, what they are interested in is demonstrable performance: the number of miles driven in tests, and how many incidents happened, where an incident might be a collision or a breaking of the law, such as running a red light. Regulatory approval will simply come from a demonstration that over a large number of miles there are fewer/less serious…
Re: (Score:2)
Cars won't decide who to kill. They will never be programmed to make that decision, and thus there will be no liability.
Human drivers are taught to drive that way too. The laws surrounding driving don't require you to decide on a course of action based on who will die; they require you to perform certain prescribed manoeuvres (e.g. an emergency stop) and to generally drive carefully. If you didn't create the conditions that caused the accident in the first place, you can't be held liable for not choosing suici…
Re: (Score:2)
In what world does an emergency stop always bring you to a full stop before impact, and is always a better choice than swerving?
Automated vehicles will be far safer than existing human-driven ones, but even they will not be 100% perfect, and will not have perfect knowledge. They cannot always stop when an obstacle appears from behind something without enough time to brake.
A car whose only possible reaction to that situation is to slam on the brakes and hope would be a horrible design, as many…
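The arithmetic behind "not enough time to brake" is straightforward: braking distance grows with the square of speed, d = v²/(2μg). A back-of-envelope calculation (assuming a friction coefficient of roughly 0.7 for dry asphalt, and ignoring reaction distance entirely) shows why braking alone cannot handle an obstacle that appears close ahead:

```python
# Back-of-envelope stopping distances, d = v^2 / (2 * mu * g).
# mu = 0.7 is an assumed dry-asphalt friction coefficient; reaction
# distance is ignored, so real-world distances are longer.
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_kmh, mu=0.7):
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * v / (2 * mu * G)

for kmh in (30, 50, 100):
    print(f"{kmh} km/h -> {stopping_distance_m(kmh):.1f} m")
# 30 km/h -> 5.1 m
# 50 km/h -> 14.0 m
# 100 km/h -> 56.2 m
```

A child stepping out 10 m ahead of a car doing 50 km/h simply cannot be avoided by braking alone, which is why swerving has to be part of the repertoire.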
Re: (Score:2)
In the UK you are supposed to go slowly enough that you can always stop. Obviously if someone else makes a mistake and you can't stop, it's not your fault. In that case swerving might help, but you are not obliged to risk it, or punished if you don't do it.
Swerving could make things worse. Then liability gets complicated.
Re: (Score:2)
If swerving could obviously have avoided the collision, and you don't do it, you're probably liable; and even if you aren't liable in the terms of the Highway Code, you're likely to face lawsuits from whatever you hit.
Re: (Score:2)
It's hard to imagine a situation where you could "obviously" have swerved to avoid a collision that was caused by someone else's actions. Can you give an example?
Re: (Score:2)
Easily: a child runs out into the street right in front of your car, with no room to stop by braking; swerve to avoid. A car backs out of a parking spot without looking, with no room to stop; swerve to avoid. A load falls off the truck in front of you on the highway; swerve into the adjacent lane to avoid.
This is an extremely common type of situation and if you are not capable of doing it without thinking you simply shouldn't be on the road.
Re: (Score:2)
In all of those cases you would not be liable for the accident unless you were speeding.
Sure, it would be great if drivers could avoid those accidents, but the point is that if you just applied the brakes you wouldn't be legally liable for the injuries or damage. The person who made the mistake of walking into the road or backing out without looking or not securing their load could not absolve themselves of blame by expecting you to swerve.
Re: (Score:2)
Do you want to kill the child?
Do you want your self driving car to kill the child?
Do you really think that if swerving was an option and you chose not to, nobody would think you liable? Do you want to defend that lawsuit? People have lost those lawsuits in the past. Do you want to be next?
Re: (Score:2)
I'd rather the car was designed so that it drove slowly when there were parked cars where pedestrians might leap out without warning, and have a front end designed to avoid killing them if it does collide.
EU standards actually require the front of the car to be designed to make pedestrian accidents survivable. Many cities have 20 MPH limits in residential areas.
Do you have links to any of these lawsuits? I'm genuinely interested in the legal arguments used.
Re: (Score:2)
There is no such thing as a place where pedestrians can't leap out without warning; it happens everywhere. So you want the car to do 15 km/h on the highway, just in case? That's ridiculous, and nobody will buy that car. You can't avoid ALL collisions, it's just not possible; you also can't avoid all situations where you might have to take evasive action.
People like you who think driving is black and white, need to get off the road, you are unsafe.
Re: (Score:2)
Well, technically true but it's actually illegal to walk along the motorway here and if you did and got hit there would be no question of it being your fault.
Re: (Score:2)
You still don't get it at all.
I hope for everyone's sake that you NEVER get behind the wheel of an automobile. Your attitude is the most dangerous I've ever seen. You don't care who dies as long as what you do wasn't technically illegal. That's a horrible mindset to have, and I'm glad that those designing and regulating self driving vehicles don't think like you do!
Not every situation in driving is black and white, follow the law or don't. There are many things that are perfectly legal to do, but will get you…
Re: (Score:2)
Re:Business model... (Score:5, Insightful)
Shitty 'self driving cars' will fail spectacularly in the marketplace once people truly understand the reality of them: your real freedom taken away, as you're strapped into some machine that you have zero real control over.
I for one will be very happy to have a machine do the driving for me. I already use adaptive cruise control and traffic jam assist on my commute, and I would happily turn over the drudgery of driving to a machine. I derive no joy from driving, though I know many people who do, and I don't begrudge them that.
But I question the common perception that self-driving cars are going to lead to huge drops in car ownership. Right now my golf clubs and gym bag are in my car, and my sunglasses, and my bike rack, and my music collection. And compared to the amount of crap I see in other people's vehicles, I am the model of tidiness. Music can migrate to my phone, and I can carry my sunglasses easily enough, but how do I call for a car that has a bike rack that fits a recumbent bike? I can take my golf clubs in an Uber-type car to work, then to the course, then back home, but that is a bunch of schlepping that is easier when I can just leave my clubs in the trunk. What about child seats? Will parents have to provide their own car seats, or count on calling a car that has one or more available?
None of these things is a showstopper, but if I am already spending money to own my car, why wouldn't I spend money to own my self driving car, that already has my stuff in it? I can see two car families turning into one car families, but I suspect many people will still want to own their own vehicle.
Field techs / plumbers / HVAC / cable guys / etc. (Score:2)
Field techs / plumbers / HVAC / cable guys / etc.
They keep a lot of parts and tools in the car or van and go site to site throughout the day.
Re: (Score:2)
dense city (Score:2)
But I question the common perception that self-driving cars are going to lead to huge drops in car ownership. {...} None of these things is a showstopper, but if I am already spending money to own my car, why wouldn't I spend money to own my self-driving car, that already has my stuff in it?
Depends on where you live.
In a dense (european-style) city, owning a car is a complicated matter. There's no free street parking.
You need a place to park it overnight (so in addition to renting your own flat, you need to pay rent for a parking spot in the underground garage under your apartment building, if one exists; otherwise you need to pay a yearly fee just to be able to leave it without limits on your own street).
You need to pay for parking whenever you go shopping somewhere.
You need to pay a monthly…
Re: (Score:3)
So in the UK that's basically just central-ish London. Can't speak for other countries, but the fact that it might work in London means jack for the remaining 90% of the population.
Re: (Score:2)
You're not quite thinking this through.
Your golf-clubs will be kept at a storage facility, which will automatically load the clubs onto a small self-driving car that will roll out to wherever you want it.
Don't forget, cars don't just move people; they move objects too. There are only logistical reasons why your golf bag itself couldn't be a self-driving car, or at the very least be easily slotted onto one.
Re: (Score:2)
I'm sure there were many horse breeders and carriage manufacturers who believed the same thing, right up until most of them went out of business.
Re: (Score:2)
you're strapped into some machine that you have zero real control over.
This is also true when you are a passenger on an airplane, taxi, bus, or even a carpool. Yet people do that everyday. The difference with SDCs is that you will be safer.
Re: (Score:3)
"Shitty 'self driving cars' will fail spectacularly in the marketplace once people truly understand the reality of them: your real freedom taken away,"
On the contrary. The problem is probably going to be that most drivers are going to find themselves in "manual mode," with minimal or no assistance from the vehicle, much more of the time than they wish. I'm guessing that NOBODY wishes to deal with the Garden State Parkway or any of the I-5/I-405 splits and merges in SoCal if the car can manage them. And those are…
Re: (Score:2)
See what happens when you make an actual contribution to the discussion? Keep it up, and I might actually grow to APPRECIATE your contributions here.
I'm just baiting my trolls so I can ignore them and see how they respond. This past weekend they accused me of gaining weight for the last ten years by taking anti-psychotic drugs. Made for interesting reading.
Re: (Score:2)
We also noticed you seem to have pulled all content from the last few months on your blog.
The top three popular links for my blog this past weekend were Hello, Slashdot [bit.ly], The Original Slashdot F.A.Q. [bit.ly], and, of course, the tag [bit.ly] for all my Slashdot-related blog posts.
Best to forget that little episode now that you're normal!
Sorry, I took my vitamins this morning.
Re: (Score:2)
We simply put the pieces together.
Translation: We made shit up, as we usually do.
A soul? (Score:2)
So it almost needs a soul when it needs to make life-and-death decisions, sort of a
Complete Holistic Reconnaissance Intelligence System To Intercept Necrosis Events.
Re: (Score:2)
So it almost needs a soul when it needs to make life-and-death decisions, sort of a
Complete Holistic Reconnaissance Intelligence System To Intercept Necrosis Events.
Like in I, Robot, where the robot saves Will Smith's character from a car crash while letting a child in the other car die. The algorithm predicted that Will Smith's character had a higher likelihood of surviving, but it doesn't take into account that most (unselfish) people would want the child to be rescued. The problem is that it is a moral and value judgement rather than something that can be easily calculated.
Re: (Score:2)
The technology simply isn't safe enough yet (Score:2, Insightful)
This rush to deploy driverless vehicles is insanity. Especially after the news of the gentleman who was denogginized by an 18 wheeler through no fault of his own. In response to events like that, Musk and other true believers simply think the concept might need a few more tweaks.
Re:The technology simply isn't safe enough yet (Score:4, Insightful)
Yes. Because I can't remember the last time a human-driven car caused a death. Excluding Charlottesville. And Barcelona. Oh, and my grandmother. Actually it's pretty common. Which explains why you don't think about it.
Common risks are ignored, while uncommon things get talked about.
This causes some people to think that ridiculous precautions should be taken to stop the uncommon things while doing nothing to fix the common ones.
Nope. Driver-less cars, using CURRENT technology would be safer than what we have now.
But that doesn't mean we shouldn't take a few years to get the tech cheaper and better while we figure out the legal and sociological changes we need to make to support them.
Re: (Score:3)
Nope. Driver-less cars, using CURRENT technology would be safer than what we have now.
Those are the ones that can be tricked into thinking a stop sign is actually a speed limit sign with nothing more than a handful of stickers, right?
BTW, did Google ever figure out how to get their car to recognize a stopped cyclist, and not repeatedly slam on the brakes?
Re: (Score:2)
Re: (Score:2)
As I understand it, the research team that demonstrated the "vulnerability" first created the tool to detect traffic signs, then exploited their own tool. It does sound like it would be a lot more difficult to exploit the sign detectors of algorithms you don't have complete control over. For example, the recognition might not be based on neural networks, or might be based on neural networks with adversarial training.
And the last big news I read about Google's project is maybe a year or two old. I would be amazed…
Re: (Score:2)
Re: (Score:2)
Driver-less cars, using CURRENT technology would be safer than what we have now.
That's a trick statement - it's only true because current technology can't drive far enough without user input to get into trouble.
IOW, Of course it's safer - they can only drive on highways, in good weather, with no unexpected obstacles, and with a driver ready to take over when something unexpected happens.
Current human drivers are on average a great deal safer than the 20-year-old tech you think is current. (You *DO* realise that SDC performance has had three orders of magnitude more resources thrown at it since…
Re: (Score:2)
Nope. Driver-less cars, using CURRENT technology would be safer than what we have now.
That's incorrect. Right now, no 100% autonomous car has been tested. Every single one has had a human driver watching over it.
So what you really meant to say is that human and car working together is safer than what we have now.
What will happen when your average, mouth breathing, insta-face-app addled moron gets it into their head that they now consciously don't have to pay attention to the road will be a very different thing.
Re: (Score:3, Insightful)
The guy who had "no fault of his own" drove his car into the side of a semi-truck. That is the very definition of his fault. He didn't apply the brakes, didn't swerve; he drove straight into the side of a truck.
And don't claim it was the car's fault. The car was not self driving, you can't buy a self driving car at this point, nobody claimed the car could drive itself, and he had to agree to, and ignore, many warnings that it could not before operating it.
In response to that incident, Musk did the horrible…
Re: (Score:2)
Musk never said that the system in place on that vehicle needed a few more tweaks to achieve self driving, he said that the system on that car was never meant for self driving, and never advertised as such. He also said that future models of the car would include self driving by using different hardware and software.
Why don't you read their claims yourself?
Full Self-Driving Hardware on All Cars [tesla.com]
All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.
They promise that buying a Tesla will get you a self-driving car with nothing more than a software upgrade.
Full Self-Driving Capability
Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances
...with a bit of small print:
Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction. It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval.
Translation: Development is done, it's already here but due to the red tape we can't say it is.
Re: (Score:2)
Why don't you read their claims yourself?
Full Self-Driving Hardware on All Cars [tesla.com]
You do realize that those claims (although 100% false advertising) don't even apply to the vehicle in question because it didn't have that hardware on it right? The hardware you're talking about, and the claims you're pointing to are for hardware released AFTER the car you're talking about was sold.
This is like blaming your Ford Model T for the cruise control not working because modern Fords include it.
Translation: Development is done, it's already here but due to the red tape we can't say it is.
No, translation: "the hardware on our latest cars is done, but the software isn't, and still requires quite…
Re: (Score:2)
This rush to deploy driverless vehicles is insanity. Especially after the news of the gentleman who was denogginized by an 18 wheeler through no fault of his own. In response to events like that, Musk and other true believers simply think the concept might need a few more tweaks.
Self-driving cars already have a better driving record than any human could hope for. They don't have to be perfect, they just have to be better than us.
Re: (Score:2)
It's not clear that no human could drive more safely than the current cars do. It *may* be true, but people could drive more safely by being careful and avoiding dangerous circumstances, which they are much more widely, if not quickly, aware of than "automated cars" currently are.
FWIW, it's not clear to me just *what* the state-of-the-art "automated car" can do by way of driving. But it is clear that many people show an incredible ability to be unaware of dangers. I also suspect that an automated car would…
the cut over time is really bad with you must be r (Score:2)
The cut-over time is really bad: you must be ready to take over with much less reaction time than an airline pilot has when the autopilot fails.
Re: (Score:3)
Actually, the Ford guy has a valid point. He probably doesn't want to talk about it, but it *is* valid. When the car drives itself, do you provide controls? What percentage of people are going to own a car, and how many will just use an automated taxi when needed? Etc.
Note that many of these changes are dependent on social decisions that haven't been made, but might be. E.g., if rush hour goes away, then the "I'll depend on a taxi" option becomes more viable. If it doesn't, then micro-buses dominate.
Re: (Score:2)
So a self-driving mini-bus that comes with a strap-on set of oversized genitalia isn't going to cut it?
Well yea... (Score:2)
Re: (Score:3)
Many vehicles today already have a large percentage of the hardware, it's needed for other more basic systems like automatic emergency braking, forward collision warning, automatic lane keeping, adaptive cruise control, parking sensors, blind spot monitoring, etc. These cars will likely still need a bit more in the sensor department, but not all that much. They'll likely need some more powerful computers processing those signals though, and then of course a lot of software.
What Ford is talking about though
Actually... you need both software and hardware. (Score:3)
Well duh! (Score:2)
If, for example, your sensors can't detect a white truck on a cloudy day, no software is going to be good enough.
Re: (Score:3)
Can your cell phone camera take a picture of a white truck on a cloudy day? Sure it can. Can the software of a potential self-driving system identify the white truck? That's the problem.
Something everyone knew already (Score:2)
This has been discussed for years; it is why the manufacturers invest in Uber/Lyft, it is why Uber is investing in self-driving cars, and it is why higher utilization rates of autonomous cars are expected.
Yes, it means that a car with 50% utilization will be more expensive than one with 5%, it means that the service model changes dramatically, and it means that the ownership model is also likely to be impacted.
Who is really only looking at the first-order issues here? Aside from people complaining about EVs
Who would want to BUY an autonomous car? (Score:2)
Sigh (Score:2)
Anyone else hear "business model" and think "how can we screw the customer for every penny"?
I've only ever heard the phrase used in terms of things like rentals, recurring licensing, "cheap printers, expensive proprietary ink", etc.
If you have to have a business model beyond "make product, sell product", I'm not sure I want it.
Re:maybe not a Ford vehicle (Score:5, Interesting)
If you take a model S, and add cross traffic and rear radars, it will have the hardware to be 100% self driving. (Don't believe Tesla when they take your money for "full self driving" without those basic necessities, they're flat out lying as they have done so often in the past)
Beyond that though, there's a LOT of software work to be done, and I really don't know how far away that is. There are just so many edge cases in driving that I'm not confident that we'll get to 100% self driving with zero driver input under any circumstances for a very long time (and that's what you need if you want to get out of the car at work and send the car to pick your kid up at school without you)
Ford though is talking about the next stage: once self-driving is around, you won't want what the Model S offers. Sitting facing forward with a steering wheel in your lap, and with the primary entertainment display off to the side and out of your line of sight, will be awkward and unnecessary.

Thing is, that's talking about what a self-driving car CAN be, not what a self-driving car MUST be. These are two very different things, and I don't think Ford understands that. Too many people think that you must have complete revolution instead of simple evolution. The first fully self-driving cars will be just like today's cars, but with radar, lidar, and cameras mounted on them, plus some pretty powerful computers and software. They'll evolve from there to include more vehicle-to-vehicle communication, and to change the interior away from a driving focus and towards an entertainment focus, but none of that will happen instantly, nor does it need to.
The people who expect a full self driving revolution don't tend to be happy with the slow evolution that actually could get us there, and therefore these people are holding back progress.
Re: (Score:3)
Re: (Score:2)
We don't know what is enough, but that doesn't mean that we don't know what isn't enough. Surround view cameras without radar, and without any way of keeping those cameras clean, are very obviously not enough.
Re: (Score:2)
Humans can look around a single raindrop on a window; a camera cannot. Take your windshield, place a 1" diameter piece of mud on it, and try to drive. No problem, right? Now place that 1" diameter piece of mud directly in front of the camera that only has a 3/4" lens and see how well it does. The windshield has wipers and washers to clear that camera; no other camera on the car has that.
Beyond that, if your whole window gets covered, you can stop, get out, clean the window, and continue (though you
Re: (Score:2)
Re: (Score:2)
Later on, almost guaranteed. But initially? No, the first self-driving cars will be like today's cars, but with self-driving. Later they'll evolve into more relaxing spaces. Ford seems to think that evolution would be needed first, but it isn't. You can design a car like today's cars that self-drives, but you can't design a car like you propose without it.
Re:Does anybody remember the Pinto? (Score:5, Informative)
That's not actually the whole story. Ford had written a cost-benefit analysis of changing fuel systems across all cars, not just Pintos, as part of a presentation to NHTSA about moving to a 30 mph fixed-barrier standard. The standards were rapidly shifting during the development of the Pinto: there was no rear-collision standard at all when the design began, and the original proposal was a 20 mph moving-barrier standard, which Ford supported and designed the Pinto around.
The NHTSA then solicited opinions on a future change to a 30 mph fixed-barrier standard, which was the reason Ford provided that analysis. The Pinto was one of dozens of vehicles which would be affected by such a change. Mother Jones magazine got the report and turned it into a series of stories about the Pinto being a death trap. The NHTSA then tested the Pinto using non-standard methods (different levels of gasoline, the use of a "bullet" car, instead of a barrier, that was designed to ram under the gas tank, and a 35 mph speed that had never been discussed). Based on a set of tests that were designed specifically to cause a gas leak and exceeded any standards even being discussed, the car was recalled.
Pretty much any station wagon or hatchback of the era would have failed that test. I'm glad that we now have even more stringent tests, but it's clear that this was a rigged test and a media-generated controversy rather than specifically nefarious company wrongdoing. Every applicable standard of the time was met; the car just couldn't pass a test specifically designed to make it fail.
Re: (Score:2)
Thank you for a solid exposition and reply.