Transportation

Driverless Cars Need a Lot More Than Software, Ford CTO Says (axios.com) 163

In an interview, Ken Washington, Ford's Chief Technical Officer, shared the company's views on how autonomy will change car design. From an article: The biggest influence will be how the cars are bought, sold and used: "You would design those vehicles differently depending on what business model (is being used). We're working through that business model question right now," he said. The biggest misconception about autonomous capability is that it's only about software: "People are imagining that the act of doing software for autonomy is all you need to do and then you can just bolt it to the car," he said. "I don't think it's possible to describe what an autonomous vehicle is going to look like," he added.
  • Translation (Score:5, Insightful)

    by burtosis ( 1124179 ) on Monday August 21, 2017 @01:04PM (#55058019)
We are sick and tired of selling value at this price point. We can't easily track where you go in real time, can't divert you to areas we want you to go, can't subject you to in-vehicle ads, and after only a few years you are off a payment plan. We are going to fix this for you, and likely make it illegal to return to the old model of ownership and privacy rights.
    • "Driverless Fords Need a Lot More Than Software" everybody else says

    • Re: (Score:2, Interesting)

      by atrex ( 4811433 )
One thing he could be referring to is the idea of an automated fleet of vehicles, effectively self-driving taxis, that shuttle people around on demand and on schedules and eliminate any significant need for car ownership in metropolitan areas. And do so without a significant monthly cost, i.e. only $50 a month for 50 hours of travel time or such.

You're probably right though in that a ride service like that will add whatever kind of micro-transactions or advertisements to their vehicles that they can get away with.
  • Business model... (Score:4, Insightful)

    by __aaclcg7560 ( 824291 ) on Monday August 21, 2017 @01:08PM (#55058033)
    The business model should include protecting people and pedestrians at all cost. A car that protects itself while getting everyone killed probably won't have a great used car value.
    • by MightyMartian ( 840721 ) on Monday August 21, 2017 @01:13PM (#55058063) Journal

      Now if we could only find a way to program human drivers to that standard.

    • The business model should include protecting people and pedestrians at all cost.

      If you want to be taken seriously, try to avoid hyperbolic phrases like "at all costs". In the real world, resources are always finite.

      A car that protects itself while getting everyone killed probably won't have a great used car value.

      Killing a human will cost millions or tens of millions in legal fees and payouts. Suggesting that these cars will intentionally prioritize avoiding mechanical damage over human life is absurd.

      • In the real world, resources are always finite.

        How many years did it take the auto industry to be shamed by Ralph Nader into providing safety features for their customers?

        Suggesting that these cars will intentionally prioritize avoiding mechanical damage over human life is absurd.

Depends on the business model. Not every business model will prioritize human life, and the business model is what will determine how these self-driving cars are programmed.

        https://www.technologyreview.com/s/542626/why-self-driving-cars-must-be-programmed-to-kill/ [technologyreview.com]

        • How many years did it take the auto industry to be shamed by Ralph Nader into providing safety features for their customers?

          Immediately after Ralph Nader shamed the government into changing product liability laws.

    • by green1 ( 322787 )

      This is difficult, and is going to have to be a government decision eventually.
We'll never make a vehicle that will never kill anyone under any circumstance; there are just too many possible circumstances. The bigger problem is how it decides who dies.

      The driver of a vehicle will always choose to save themselves over someone else. In fact, they'll likely choose to save themselves over several others. But what choice will the car make?

      If people know that one make of car prioritizes the occupants of the vehic

      • by swilver ( 617741 ) on Monday August 21, 2017 @03:05PM (#55058991)

This is so simple. The car should save the occupants, just like any normal driver would have done. Trying to take this to some Asimov "do not cause harm" bullshit will practically require cars to be self-aware, at which point cars may not actually want to serve their masters anymore.

        • by green1 ( 322787 )

Except that, for this to work right, the manufacturers have to take the liability for the vehicle's actions, as they're the ones doing the programming. Which means the manufacturer is going to do the math: one occupant or three pedestrians, the lawsuit for the one occupant will probably cost them less money, so they'd rather save the three pedestrians.

This isn't a simple choice, and is not likely to be resolved decisively until regulatory agencies get involved (which they are guaranteed to do eventually).

        • by mjwx ( 966435 )

This is so simple. The car should save the occupants, just like any normal driver would have done. Trying to take this to some Asimov "do not cause harm" bullshit will practically require cars to be self-aware, at which point cars may not actually want to serve their masters anymore.

          If only there was some sort of Code, a Code for the Highway, that told you what you should do in these situations. A shame something like that doesn't already exist.

You are making some deeply faulty assumptions about the kind of logic in place here. You want to judge cars based on their handling of a simple trolley problem. But far more important is the ability of multiple vehicles to coordinate and minimize total risk. Cooperation is going to do far more to minimize crashes, injuries, and deaths.

Plus, there's balancing the kinds of injuries incurred. Thus, the logic of prioritizing would rank outcomes more like:
no injury > vehicle damage > minor injury (them) > minor injury (us)
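A minimal Python sketch of the kind of expected-cost ranking described above; the manoeuvre list, probabilities and weights are all invented purely for illustration:

from dataclasses import dataclass

# Larger number = worse outcome. The ordering mirrors the ranking above:
# no injury < vehicle damage < minor injury (them) < minor injury (us) < serious injury.
COSTS = {
    "none": 0,
    "vehicle_damage": 1,
    "minor_injury_others": 10,
    "minor_injury_occupants": 20,
    "serious_injury": 1000,
}

@dataclass
class Manoeuvre:
    name: str
    outcome_probs: dict  # predicted probability of each outcome if this manoeuvre is taken

def expected_cost(m: Manoeuvre) -> float:
    return sum(COSTS[outcome] * p for outcome, p in m.outcome_probs.items())

options = [
    Manoeuvre("brake hard", {"none": 0.6, "vehicle_damage": 0.3, "minor_injury_occupants": 0.1}),
    Manoeuvre("swerve left", {"none": 0.8, "vehicle_damage": 0.1, "minor_injury_others": 0.1}),
    Manoeuvre("hold course", {"none": 0.2, "serious_injury": 0.8}),
]

print(min(options, key=expected_cost).name)  # picks the manoeuvre with the lowest expected cost

The disagreement in the replies below is about where those weights come from, not about the mechanics of picking a minimum.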

        • by green1 ( 322787 )

Nowhere did I ever claim that vehicles wouldn't coordinate, nor that any of these wouldn't minimize crashes, injuries and death. Nor did I say that it would be simple, or that other injuries wouldn't be a factor as well. But at some point it comes down to that trolley problem, or an us vs. them decision. Even your prioritization put "us" after "them", but based on what? And based on how many "us" and how many "them"?

          If anything you've made my point for me. You made an assumption about who would be protected

I didn't say that it was the formula; I said that the formula would be more like what I proposed. Yes, eventually, there does have to be some choice, but if the car is 1/10 as likely to be in a crash, and the automatic driving can cut the fatality rate in the remaining crashes by 1/10, then a slightly better formula for crash force minimization will easily outweigh the effects of all the ethical programming in the world.

            Yes, it makes for a great philosophical debate. But it's obsessing over what is, fro
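A back-of-the-envelope version of the arithmetic in the comment above, with made-up baseline figures purely for scale:

baseline_crashes_per_year = 6_000_000   # hypothetical human-driven fleet
deaths_per_crash = 0.006                # hypothetical fatality rate per crash

crash_factor = 1 / 10      # "1/10 as likely to be in a crash"
fatality_factor = 1 / 10   # fatality rate in the remaining crashes cut by 1/10

baseline_deaths = baseline_crashes_per_year * deaths_per_crash
automated_deaths = baseline_deaths * crash_factor * fatality_factor
print(baseline_deaths, automated_deaths)   # 36000.0 360.0 under these invented numbers, a 100x reduction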

            • by green1 ( 322787 )

              Once again, I fully agree that these will be MUCH safer than current vehicles, and nowhere have I ever stated otherwise.

              but that "philosophical debate" isn't just philosophical, it's real. There are currently an estimated 1.25 million annual fatalities involving motor vehicles in the world. Even a system that's a full million times safer (and even the strongest advocates for self driving vehicles have never claimed that) would still involve deaths averaging more than one per year. You can try all you want t

The trolley problem is not real. It's a philosophers' and psychologists' plaything. It does not represent what happens in an emergency situation at all. When there's an imminent collision, people clearly don't think and weigh up alternative outcomes and make a choice before operating the controls. They simply react. Like 99% of driving, the conscious mind that can make such high-level choices isn't being used at all. Driving is simply a behaviour that comes from the subconscious.

It seems likely that in an eme

                • by green1 ( 322787 )

People just react, but a computer has a lot more time to decide, and in fact it MUST decide because it can't work on intuition; it must choose every single action. So yes, it is a very real problem. The car will at some point have a choice between two things to hit, where not hitting anything isn't an option. It could be programmed to choose a random number between 1 and 2 and hit based on that, but it's more likely you'd program it to choose based on minimizing harm. But harm to whom?

No, it doesn't have to decide, any more than a human does. Just as a human does, it only has to react (or not). It's equivalent.

We're in the world of training, neural nets and fuzzy logic here, where there is no programmer who knows the specific rules by which the system is acting, just as the human conscious mind does not know the reasons for which the subconscious reacts. We can only guess.

                    You could have the developers make moral judgement on a series of these trolley problem scenarios, assigning different weig

                    • by green1 ( 322787 )

Computers don't have an equivalent to "just react". Computers make decisions; they always do one thing or another, never do they "just react". You specifically have to program which thing every computer will do.

And it is 100% guaranteed to be required to make a specific choice. It WILL be government mandated; it's only a matter of whether the requirement comes before, or after, a driverless car kills someone.

Sorry, but you don't know what you are talking about. You need to study neural networks. They are not "specifically programmed"; they are trained with data sets. And not only are the ways they work not specifically programmed, a programmer cannot find out in any meaningful sense how they do work.
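A minimal toy sketch of that "trained, not programmed" point: the same few lines of code, fed two different data sets, end up computing two different functions, and no line of the source spells out either rule. This uses only numpy and has nothing to do with any real driving system.

import numpy as np

rng = np.random.default_rng(1)

def train_tiny_net(X, y, hidden=8, epochs=5000, lr=0.5):
    """Two-layer network fit by gradient descent; no decision rules are coded in."""
    W1 = rng.normal(0, 1.0, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1.0, (hidden, 1)); b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # predicted probability
        d2 = (p - y[:, None]) / len(y)             # gradient of cross-entropy wrt output logits
        d1 = (d2 @ W2.T) * (1.0 - h ** 2)
        W2 -= lr * h.T @ d2;  b2 -= lr * d2.sum()
        W1 -= lr * X.T @ d1;  b1 -= lr * d1.sum(axis=0)
    return W1, b1, W2, b2

def predict(X, W1, b1, W2, b2):
    return (np.tanh(X @ W1 + b1) @ W2 + b2 > 0).astype(int).ravel()

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_and = np.array([0, 0, 0, 1])   # data set 1
y_xor = np.array([0, 1, 1, 0])   # data set 2

print(predict(X, *train_tiny_net(X, y_and)))   # typically learns AND-like behaviour
print(predict(X, *train_tiny_net(X, y_xor)))   # same code, different data: XOR-like behaviour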

                    • by green1 ( 322787 )

                      The programmer does feed in the data sets though. And the car does only what it's programmed to do.

In your fantasy world, if a car plowed through a crowd of people for no reason, killing several of them, you'd just shrug and say "the car wasn't programmed, it was just trained with data sets." That's not how it works, and if in fact it did work the way you suggest, I can 100% guarantee that no regulatory agency on this planet would ever approve a self-driving vehicle.

                      Luckily for everyone, you don't have the fain

I'm afraid you're suffering from the Dunning-Kruger effect. You understand so little of how neural nets work, you don't even know how little you know.

As to regulatory agencies, what they are interested in is demonstrable performance: the number of miles driven in tests, and how many incidents happened, where an incident might be a collision or a breaking of the law, such as running a red light. Regulatory approval will simply come from a demonstration that over a large number of miles there are fewer/less seriou

      • by AmiMoJo ( 196126 )

        Cars won't decide who to kill. They will never be programmed to make that decision, and thus there will be no liability.

Human drivers are taught to drive that way too. The laws surrounding driving don't require you to decide on a course of action based on who will die; they require you to perform certain prescribed manoeuvres (e.g. an emergency stop) and to generally drive carefully. If you didn't create the conditions that caused the accident in the first place, you can't be held liable for not choosing suicide.

        • by green1 ( 322787 )

          In what world does an emergency stop always bring you to a full stop before impact, and is always a better choice than swerving?

Automated vehicles will be far safer than existing human-driven ones, but even they will not be 100% perfect, and will not have perfect knowledge. They cannot prevent an obstacle from appearing from behind something with too little time left to stop.

A car whose only possible reaction to that situation is to slam on the brakes and hope would be a horrible design, as many

          • by AmiMoJo ( 196126 )

In the UK you are supposed to go slowly enough that you can always stop. Obviously if someone else makes a mistake and you can't stop, it's not your fault. In that case swerving might help, but you are not obliged to risk it, or punished if you don't do it.

            Swerving could make things worse. Then liability gets complicated.

            • by green1 ( 322787 )

If swerving could obviously have avoided the collision, and you don't do it, you're probably liable; and even if you aren't liable in terms of the highway code, you're likely exposed to lawsuits from whoever you hit.

              • by AmiMoJo ( 196126 )

                It's hard to imagine a situation where you could "obviously" have swerved to avoid a collision that was caused by someone else's actions. Can you give an example?

                • by green1 ( 322787 )

Easily: a child runs out into the street right in front of your car, no room to stop by braking, swerve to avoid. A car backs out of a parking spot without looking, no room to stop, swerve to avoid. A load falls off the truck in front of you on the highway, swerve to the adjacent lane to avoid.

This is an extremely common type of situation, and if you are not capable of doing it without thinking, you simply shouldn't be on the road.

                  • by AmiMoJo ( 196126 )

                    In all of those cases you would not be liable for the accident unless you were speeding.

                    Sure, it would be great if drivers could avoid those accidents, but the point is that if you just applied the brakes you wouldn't be legally liable for the injuries or damage. The person who made the mistake of walking into the road or backing out without looking or not securing their load could not absolve themselves of blame by expecting you to swerve.

                    • by green1 ( 322787 )

                      Do you want to kill the child?

                      Do you want your self driving car to kill the child?

Do you really think that if swerving was an option and you chose not to take it, nobody would think you liable? Do you want to defend that lawsuit? People have lost those lawsuits in the past. Do you want to be next?

                    • by AmiMoJo ( 196126 )

I'd rather the car was designed so that it drove slowly when there were parked cars where pedestrians might leap out without warning, and had a front end designed to avoid killing them if it does collide.

                      EU standards actually require the front of the car to be designed to make pedestrian accidents survivable. Many cities have 20 MPH limits in residential areas.

                      Do you have links to any of these lawsuits? I'm genuinely interested in the legal arguments used.

                    • by green1 ( 322787 )

There is no such thing as a place where pedestrians can't leap out without warning; it happens everywhere. So you want the car to do 15 km/h on the highway, just in case? That's ridiculous, and nobody will buy that car. You can't avoid ALL collisions, it's just not possible, and you also can't avoid all situations where you might have to take evasive action.

People like you, who think driving is black and white, need to get off the road; you are unsafe.

                    • by AmiMoJo ( 196126 )

Well, technically true, but it's actually illegal to walk along the motorway here, and if you did and got hit there would be no question of it being your fault.

                    • by green1 ( 322787 )

                      You still don't get it at all.

                      I hope for everyone's sake that you NEVER get behind the wheel of an automobile. Your attitude is the most dangerous I've ever seen. You don't care who dies as long as what you do wasn't technically illegal. That's a horrible mindset to have, and I'm glad that those designing and regulating self driving vehicles don't think like you do!

Not every situation in driving is a black-and-white "follow the law or don't". There are many things that are perfectly legal to do, but will get y

    • The only safe car is a parked car.
  • So it almost needs a soul when it needs to make life and death decisions, sort of a
    Complete holistic reconnaissance intelligence system to intercept necrosis events.

    • So it almost needs a soul when it needs to make life and death decisions, sort of a
      Complete holistic reconnaissance intelligence system to intercept necrosis events.

Like in I, Robot, where the robot saves Will Smith's character from a car crash while letting a child in the other car die. The algorithm predicted that Will Smith's character had a higher likelihood of surviving. But it doesn't take into account that most (unselfish) people would want the child to be rescued. The problem is that it is a moral and value judgement rather than something that can be easily calculated.

  • by Anonymous Coward

    This rush to deploy driverless vehicles is insanity. Especially after the news of the gentleman who was denogginized by an 18 wheeler through no fault of his own. In response to events like that, Musk and other true believers simply think the concept might need a few more tweaks.

    • by gurps_npc ( 621217 ) on Monday August 21, 2017 @01:35PM (#55058183) Homepage

Yes. Because I can't remember the last time a human-driven car caused a death. Excluding Charlottesville. And Barcelona. Oh, and my grandmother. Actually it's pretty common. Which explains why you don't think about it.

      Common risks are ignored, while uncommon things get talked about.

      This causes some people to think that ridiculous precautions should be taken to stop the uncommon things while doing nothing to fix the common ones.

      Nope. Driver-less cars, using CURRENT technology would be safer than what we have now.

      But that doesn't mean we shouldn't take a few years to get the tech cheaper and better while we figure out the legal and sociological changes we need to make to support them.

      • Nope. Driver-less cars, using CURRENT technology would be safer than what we have now.

        Those are the ones that can be tricked into thinking a stop sign is actually a speed limit sign with nothing more than a handful of stickers, right?

        BTW, did Google ever figure out how to get their car to recognize a stopped cyclist, and not repeatedly slam on the brakes?

Mod parent up. Also, driverless cars run most of their miles on freeways and all in fair weather. A major cause of accidents is bad weather, which driverless cars don't run in; another is alcohol, which autonomous cars won't be good at avoiding since drunk drivers don't follow rules. I can't even find a study that compares apples to apples, as driverless cars don't run under the same conditions as piloted ones and very few studies separate out weather or road type in accidents. Comparing pre-planned sa
        • by flux ( 5274 )

As I understand it, the research team that demonstrated the "vulnerability" first created the tool to detect traffic signs, then exploited their own tool. It does sound like it would be a lot more difficult to exploit the sign detectors of algorithms you don't have complete control over. For example, the recognition might not be based on neural networks, or might be based on neural networks with adversarial training.

And the last big news I read about Google's project is maybe a year or two old. I would be amazed
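A toy numeric illustration of flux's point (entirely synthetic, not any real sign detector): a perturbation crafted with full knowledge of a model's weights moves its output far more than a random perturbation of the same size, which is part of why attacking a detector you don't control is harder.

import numpy as np

rng = np.random.default_rng(0)

# Fake "sign features": class 0 = stop sign, class 1 = speed-limit sign.
X = np.vstack([rng.normal(-1.0, 1.0, (200, 20)),
               rng.normal(+1.0, 1.0, (200, 20))])
y = np.array([0] * 200 + [1] * 200)

def train(X, y, epochs=300, lr=0.1):
    """Plain logistic regression fit by gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

w, b = train(X, y)
logit = lambda x: float(x @ w + b)   # > 0 reads as "speed limit", < 0 as "stop"

stop_sign = X[0]                     # one clean "stop sign" example
eps = 1.5                            # perturbation budget (deliberately large for a toy)

white_box = stop_sign + eps * np.sign(w)                           # crafted knowing the model
random_pert = stop_sign + eps * rng.choice([-1.0, 1.0], size=20)   # same size, no inside knowledge

print("clean:     %+.2f" % logit(stop_sign))     # strongly negative
print("white box: %+.2f" % logit(white_box))     # pushed hard toward the other class
print("random:    %+.2f" % logit(random_pert))   # barely moves by comparison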

        • My GPS has the local speed limits in it. Would not be hard to add stop sign placements as well to provide a secondary check against such problems.
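A rough sketch of that secondary check; the function names, fields, and the "prefer the safer reading" policy are all invented for illustration:

import math
from dataclasses import dataclass

@dataclass
class MapSign:
    kind: str    # e.g. "stop", "speed_limit"
    lat: float
    lon: float

def nearby_map_sign(signs, lat, lon, radius_m=30.0):
    """Return the mapped sign closest to (lat, lon) within radius_m, or None."""
    def dist_m(s):
        dx = (s.lon - lon) * 111_320 * math.cos(math.radians(lat))
        dy = (s.lat - lat) * 110_540
        return math.hypot(dx, dy)
    candidates = [s for s in signs if dist_m(s) <= radius_m]
    return min(candidates, key=dist_m) if candidates else None

def reconcile(detected_kind, map_sign):
    """Trust the camera only when the map agrees; otherwise prefer the safer reading."""
    if map_sign is None or detected_kind == map_sign.kind:
        return detected_kind
    return "stop" if "stop" in (detected_kind, map_sign.kind) else map_sign.kind

signs = [MapSign("stop", 51.5007, -0.1246)]
print(reconcile("speed_limit", nearby_map_sign(signs, 51.5007, -0.1246)))   # -> stop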
      • Driver-less cars, using CURRENT technology would be safer than what we have now.

        That's a trick statement - it's only true because current technology can't drive far enough without user input to get into trouble.

IOW, of course it's safer - they can only drive on highways, in good weather, with no unexpected obstacles, and with a driver ready to take over when something unexpected happens.

Current human drivers are on average a great deal safer than the 20-year-old tech you think is current (you *DO* realise that SDC performance has three orders of magnitude more resources thrown at it si

      • by mjwx ( 966435 )

        Nope. Driver-less cars, using CURRENT technology would be safer than what we have now.

        That's incorrect. Right now, no 100% autonomous car has been tested. Every single one has had a human driver watching over it.

        So what you really meant to say is that human and car working together is safer than what we have now.

What will happen when your average mouth-breathing, insta-face-app-addled moron gets it into their head that they no longer have to consciously pay attention to the road will be a very different thing.

    • Re: (Score:3, Insightful)

      by green1 ( 322787 )

The guy who had "no fault of his own" drove his car into the side of a semi-truck. That is the very definition of his fault. He didn't apply the brakes, didn't swerve, he drove straight into the side of a truck.

And don't claim it was the car's fault. The car was not self-driving, you can't buy a self-driving car at this point, nobody claimed the car could drive itself, and he had to acknowledge, and then ignore, many warnings that it could not before operating it.

      In response to that incident, Musk did the horribl

      • by Kjella ( 173770 )

Musk never said that the system in place on that vehicle needed a few more tweaks to achieve self-driving; he said that the system on that car was never meant for self-driving, and never advertised as such. He also said that future models of the car would include self-driving by using different hardware and software.

        Why don't you read their claims yourself?

        Full Self-Driving Hardware on All Cars [tesla.com]

        All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.

        They promise that buying a Tesla will get you a self-driving car with nothing more than a software upgrade.

        Full Self-Driving Capability
        Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances

        ...with a bit of small print:

        Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction. It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval.

        Translation: Development is done, it's already here but due to the red tape we can't say it is.

        • by green1 ( 322787 )

          Why don't you read their claims yourself?

          Full Self-Driving Hardware on All Cars [tesla.com]

You do realize that those claims (although 100% false advertising) don't even apply to the vehicle in question because it didn't have that hardware on it, right? The hardware you're talking about, and the claims you're pointing to, are for hardware released AFTER the car you're talking about was sold.

          This is like blaming your Ford Model T for the cruise control not working because modern Fords include it.

          Translation: Development is done, it's already here but due to the red tape we can't say it is.

No, translation: "the hardware on our latest cars is done, but the software isn't, and still requires quit

    • This rush to deploy driverless vehicles is insanity. Especially after the news of the gentleman who was denogginized by an 18 wheeler through no fault of his own. In response to events like that, Musk and other true believers simply think the concept might need a few more tweaks.

      Self-driving cars already have a better driving record than any human could hope for. They don't have to be perfect, they just have to be better than us.

      • by HiThere ( 15173 )

It's not clear that no human could drive more safely than the current cars do. It *may* be true, but people could drive more safely by being careful and avoiding dangerous circumstances, of which they are much more widely, if not quickly, aware than "automated cars" currently are.

        FWIW, it's not clear to me just *what* the state of the art "automated car" can do by way of driving. But it is clear that many people show an incredible ability to be unaware of dangers. I also suspect that an automated car wou

  • They need hardware too. Duh.
    • by green1 ( 322787 )

Many vehicles today already have a large percentage of the hardware; it's needed for other, more basic systems like automatic emergency braking, forward collision warning, automatic lane keeping, adaptive cruise control, parking sensors, blind spot monitoring, etc. These cars will likely still need a bit more in the sensor department, but not all that much. They'll likely need some more powerful computers processing those signals, though, and then of course a lot of software.

      What Ford is talking about though

  • That lot more than software is hardware.
  • If, for example, your sensors can't detect a white truck on a cloudy day, no software is going to be good enough.

    • I think you have some confused idea of what a "sensor" is, and what the software needs to do.

      Can your cell phone camera take a picture of a white truck on a cloudy day? Sure it can. Can the software of a potential self-driving system identify the white truck? That's the problem.
  • This has been discussed for years; it is why the manufacturers invest in Uber/Lyft, it is why Uber is investing in self-driving cars, and it is why higher utilization rates of autonomous cars are expected.

    Yes, it means that a car with 50% utilization will be more expensive than one with 5%, it means that the service model changes dramatically, and it means that the ownership model is also likely to be impacted.

Who is really only looking at the first-order issues here? Aside from people complaining about EVs
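Rough numbers for the utilization point above (all figures invented; only the shape of the comparison matters): a heavily used vehicle costs more to build and maintain, but the cost is spread over far more miles.

HOURS_PER_YEAR = 24 * 365

def cost_per_mile(purchase_price, annual_costs, utilization, avg_mph=25, years=5):
    miles = HOURS_PER_YEAR * utilization * avg_mph * years
    return (purchase_price + annual_costs * years) / miles

private_car = cost_per_mile(purchase_price=30_000, annual_costs=1_000, utilization=0.05)
robo_taxi = cost_per_mile(purchase_price=90_000, annual_costs=5_000, utilization=0.50)

print(f"private car (5% utilization): ${private_car:.2f}/mile")
print(f"robo-taxi (50% utilization):  ${robo_taxi:.2f}/mile")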

Apart from Uber and one or two others? Come to that, will Ford, GM, VW et al still exist as separate car giants in 30 years' time? They'll all be sub-contracting to the people who will be hiring out individual transport to take you from place to place in a vehicle that you hire by the hour. Very, very few people will own their own cars. Still fewer will drive them - except for those "quaint" early-2000s models - on special tracks to which they will be transported safely (on specially designed flatbeds) by - yep,
  • by ledow ( 319597 )

    Anyone else hear "business model" and think "how can we screw the customer for every penny"?

I've only ever heard the phrase used in terms of things like rentals, recurring licensing, "cheap printers, expensive proprietary ink", etc.

    If you have to have a business model beyond "make product, sell product", I'm not sure I want it.
