Transportation AI Businesses

Ford Just Invested $1 Billion In Self-Driving Cars (usatoday.com) 113

An anonymous reader quotes USA Today: Ford Motor is betting $1 billion on the world's self-driving car future. The Detroit automaker announced Friday that it would allocate that sum over five years to a new autonomous car startup called Argo AI, which is headquartered in Pittsburgh, Pa., and will have offices in Michigan and California. Ford's financial outlay is part of a continuing investment strategy anchored to transforming the car and truck seller into a mobility company with a hand in ride-hailing, ride-sharing and even bicycle rentals.
Lucas123 writes: Argo AI founders, CEO Bryan Salesky and COO Peter Rander, are alumni of Carnegie Mellon's National Robotics Engineering Center and former leaders of the self-driving car teams at Google and Uber, respectively. Argo AI's team will include roboticists and engineers from inside and outside of Ford working to develop a new software platform for Ford's fully autonomous vehicle, expected in 2021. Ford said it could also license the software to other carmakers.
  • Uber is dead (Score:2, Interesting)

    by Anonymous Coward

    And this is why. Several companies are set to invade their space.

    • They'll wait to destroy Uber until they make as much progress as possible at killing taxi companies.

  • It's great to see where this technology is going. You've got so many players now in self-driving vehicles, as well as in hybrid and electric vehicles. I'm happy to see Ford jump in on this with both feet even if they are a little late to the game. Riding in a vehicle will be a much safer experience when humans aren't the ones driving.
    • by johanw ( 1001493 )

      It will be a lot slower too. Manufacturers will put in huge safety limits to prevent liability - imagine what will happen in some countries when bicycle riders figure out that the car will stop anyway if they just continue, whether they have priority or not. In The Netherlands they already do that with human drivers...

      Further, I don't see this happening anytime soon. It may work in the US, where streets are wide, people drive slow and orderly and traffic is adapted for cars. Have you ever driven a car in Pari

      • A lot of people think that insurance companies will be happy to cover the flaws of self driving cars because overall they will be safer, so the automated car companies will be left with zero liability for everything. There, I just typed that with a straight face.
        • And why wouldn't that be true?
          • Because when someone's family member dies because they were riding a bicycle and the automated car didn't see them because of a gap in the sensors or what not, the insurance companies won't accept covering the payment to the family. They will sue the car company.
            • Why? Currently they don't sue a driver for not seeing the bicycle either. Insurance companies don't care about individual cases, they deal with statistics. If self-driving cars cause fewer accidents per year, they'll make more money insuring them.
              • Because we're not talking about a flaw in a human, we're talking about a flaw in a piece of technology that is supposed to be designed to be safe. Insurance companies cover human error, not technical error. If a car's accelerator pedal fails and sticks on and someone dies, insurance companies don't pay for that; the car company does, for faulty design, and issues a huge recall. The same will be true for automated driving, it is just a far more complicated technical error.
                • A rocket is a piece of technology, with a fairly high failure rate, and you can get insurance for that too. For the insurance company it's a simple calculation: if the premium covers the expected pay out, they'll cover it. The only thing they really hate is systemic errors, but these can be reduced by thorough testing, and by quick feedback from accident analysis. Because every detail can be logged, technical errors can be fixed and those fixes distributed to the rest of the fleet before they occur in large
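                  A rough sketch of that expected-value calculation, with made-up numbers purely for illustration:

                    # Illustrative insurer math: cover the risk if the premium exceeds
                    # the expected payout plus overhead. All figures are assumptions.
                    claims_per_car_year = 0.04      # assumed claim frequency for self-driving cars
                    avg_payout = 12_000.0           # assumed average claim cost, in dollars
                    loading_factor = 1.25           # overhead and profit margin

                    expected_payout = claims_per_car_year * avg_payout
                    break_even_premium = expected_payout * loading_factor
                    print(f"Expected payout per car-year: ${expected_payout:,.0f}")
                    print(f"Premium needed to insure profitably: ${break_even_premium:,.0f}")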
                • Because we're not talking about a flaw in a human, we're talking about a flaw in a piece of technology that is supposed to be designed to be safe.

                  Tesla Autopilot has already killed people, and other driver-assistance technology has as well. The insurance companies paid up. So what you are saying has not been true so far.

                  Insurance companies cover human error, not technical error.

                  Hogwash. Insurance companies pay according to their contract. Standard auto insurance includes coverage for mechanical failure.

                  If a car's accelerator pedal fails and sticks on and someone dies, insurance companies don't pay for that; the car company does

                  Car companies carry insurance to cover those liabilities. So an insurance company (maybe even the same insurance company) still pays ... and that is only if the manufacturer is found liable. If the fail

                  • This is just a problem in itself. We all know costs filter down to the clients, and other insurance holders should not be asked to bear the burden of failed AI. Did the insurance agencies at least have the good sense to assign the driver 0% responsibility for these accidents? Because otherwise someone's premiums are going up for something that is beyond their control.
                • It seems that most people on slashdot like to talk about things they speculate on as though they are authorities on the topic.

                  Insurance companies are in business to make money.

                  If people choose one insurance company over another because of price and restrictions, the rules of the first company will adapt to attract new customers.

                  Insurance companies almost certainly will try to find a way to profit more from self-driving cars but when there's more profit to be made, one company will offer better terms than ano
                  • The question is, will they profit more by holding the providers of AI responsible for their technical issues and weaknesses, or will the clients of said insurance company be all contributing to clean up after these flaws? I think there must be a wave of court cases just under the radar that are minor things, where Autopilot ran over someone's pet or clipped a kid on a bike or didn't quite drive around another car. The money that the insurance company uses to 'fix' these things has to come from somewhere,
        • Self-driving cars are covered in sensors. It's no longer "your word against his" when it comes to insurance claims.

          If automated cars are seen to have significant faults which result in them causing crashes, software updates can rectify that across the fleet quickly - with humans that means extra individual training which most won't do.

          Bear in mind that most human drivers are barely competent to actually pilot a machine and tend to be easily distracted - automatons will be paying 100% attention 100% of the t

          • Software updates can only be pushed out if these cars are all connected 100% of the time, which is a huge security concern in itself. Furthermore, automation doesn't pay attention in all directions 100% of the time. The sensors only have a certain view radius and they seem to want to limit the number of them used. If sensors saw everything then Autopilot wouldn't be able to run into a trailer, which has happened at least two times now. Sure they increased the radar after the accident and they say ther
            • Heavy fog/snow == system slows down to safe speeds - which most humans do not do, and is why we end up with massive fogbound pileups.

              The Joshua Brown trailer incident was specifically because the system wasn't programmed to encounter clotheslining events - and given that Joshua _always_ recorded his trips, there's still the nagging question of what happened to his dashcam - it's never been found. (IE: there are a lot of unanswered questions about the crash and the conduct of the trucker)

              The Chinese Tesla inc

              • Yes, you're one of those "down with human drivers" people. You likely have a much lower bar for "safety" than I do. If AI is only as good as an average driver then it won't save any lives, because it will just get into as many accidents on average, and half the people that use it will actually be *less* safe than they would be without it. No, it has to be pretty much perfect in all circumstances.
      • I have driven in many European cities and I think the point you're missing is related to the attitude behind the driving.

        Mediterranean-bordering countries are a special exception to... well, every rule ever made. Ask them, they believe it too.

        If you're talking about Amsterdam, self driving cars will never work, but I give it less than 5 years before cars are simply illegal within the main city. They just passed that rule in Oslo, Norway. After June, it will be illegal to drive within the inside ring of the c
        • "I have never encountered more mean spirited drivers anywhere in the world. I honestly think the drivers in Paris believe that they are on earth purely to punish each other."

          On the other hand, the Parisian public transportation system is very good and gets you to most places faster than you could drive.

          Paris has a major pollution problem. It's surprising that cars aren't banned within the old city wall ring road already.

  • Ford is late to the game, but still expects others to license their tech.
    • Once automatic driving becomes required - when the safety advantages are recognised - there will be a need for small manufacturers to offer it. And they won't be able to start from scratch....

      • by johanw ( 1001493 )

        Start talking about requiring it when it works at all in less than ideal conditions. Further, what makes you think it will ever be forbidden to drive manually? I think revoking the right to bear arms in the US will be easier.

        • Further, what makes you think it will ever be forbidden to drive manually? I think revoking the right to bear arms in the US will be easier.

          The right to drive and the right to bear arms are not really comparable. Gun owners tend to be geographically concentrated in rural states where they have disproportionate political power. It is also not clear that many "gun control" proposals would actually lead to less gun violence.

          The "right to drive" is more like the "right to smoke". Smokers are geographically dispersed, so have little political power, smoking is clearly dangerous, and restrictions on smoking have been effective in reducing harm. M

          • For the most part it won't take legislation to force people out of the driver's seat.

            Once the stats come in showing that robots are safer than meatsacks, the differential in insurance premiums will take care of the rest.

            At some point further down the track (as robots become ubiquitous) you can expect the requirements for actually holding a driving license to become _much_ tougher, and you'll also see a requirement for XYZ number of hours/year to keep the license, just like aviation licensing.

            The _vast_ majority

      • People are going to have to actually be able to, you know, afford it, before it becomes required.
        • Over the life of a car, that's a big gain. Similarly your medical insurance will be lower if you don't drive yourself.

          • I don't understand your first sentence. How is there a big gain? Very few people will be able to afford to purchase an automated car for themselves. If they only use driving services they sacrifice their personal freedom and get tracked everywhere they go. Either way there is a huge loss. Your second sentence I will grant you on the day that automated driving becomes that safe for 90% of the population and insurance companies actually become altruistic and lower insurance premiums.
            • That's the issue we're disagreeing on. My expectation is that it won't add more than $10,000 overall, and that's not dissimilar to the price of insurance over a 10-year period for a lot of people. Given that insurance will no longer be necessary for a lot of people - if they aren't planning to leave the areas that are fully automated - then this will pay for the upgrade

              Of course this assumes that people will persist in having their own cars, when it is likely that the shift will enable people to ren
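              For what it's worth, the comparison being made is roughly this (the $10,000 figure is the estimate above; the premium is an assumption):

                # Hypothetical comparison of an autonomy option price against the
                # insurance it would make unnecessary. Both inputs are assumptions.
                autonomy_option_price = 10_000.0    # estimated added cost of self-driving hardware
                annual_insurance_premium = 1_000.0  # assumed premium a human driver would otherwise pay
                years_of_ownership = 10

                insurance_saved = annual_insurance_premium * years_of_ownership
                print(f"Insurance avoided over {years_of_ownership} years: ${insurance_saved:,.0f}")
                print("Pays for itself" if insurance_saved >= autonomy_option_price else "Does not pay for itself")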

              • Even if there is a $10K poor boy package, how well will it protect my family? Will the four-sensor package be as safe as the fifty-sensor package? How will they tap dance when it becomes evident that they sent all the experienced pilots to fly around wealthy people while the poor economy seats are all manned by bush pilots with 100 hours in the air? The software may be the same, but will the budget packages *actually* be able to see everything happening around the car with no gaps?
                • The best reason for driverless cars is to reduce the massive death toll on the roads. The primary question is not whether the package will 'protect my family', but whether it will be sufficient to ensure a substantial cut in road deaths. Given that the sensors will have a better view of the road than a human driver has atm, the only question is whether the software will be good enough to convert that data into safer driving than we get atm. This seems achievable.

  • The technology behind self-driving cars has come up in a number of episodes of the O'Reilly Data Show hosted by Ben Lorica. Ben knows his stuff well enough to perform this role, but to my taste, he's pretty softball most of the time; his show is more of a polite survey than a contest of minds.

    Here is one link I could quickly find:

    * The technology behind self-driving vehicles [oreilly.com]

    The guest is Shaoshan Liu, "co-founder of PerceptIn and previously the senior architect (autonomous driving) at Baidu USA".

    As I recal

    • A big question on my mind: if there is a difference in the ability of an expensive self-driving car to drive safely as opposed to a cheaper car, how will that work? That would be somewhat akin to having economy seats on an airplane with a pilot with 100 hours of flying experience, and putting the pilots with 2000 hours of experience in planes with more expensive seats. Doesn't really seem very fair to me. I realize more expensive cars may have better passive protection for passengers today, but we're
    • As I recall it, Liu says that the instrument package for a fully autonomous self-driving car—in the not-too-distant past—cost around $100,000 and required 3000 W

      "I think there is a world market for maybe five computers." -- Thomas Watson, president of IBM, 1943

    • You can get another order of magnitude by moving those algorithms to ASICs and FPGAs. 300 watts is within reasonable bounds for a vehicle electrical bus. I would be willing to bet that the prototype algorithms are mainly running on less power-efficient but more programmer-friendly GPUs and CPUs.

      The $100k figure is mostly driven by the LIDAR. There are cheap LIDARs for this.

      A $10k price tag plus a $1k a year license fee is achievable, and this would make self driving taxis readily feasible. Most people pr
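      As a rough illustration of why those numbers would make self-driving taxis feasible (utilization and the wage comparison below are illustrative assumptions, not figures from the discussion):

        # Back-of-the-envelope robo-taxi cost model. The $10k hardware and
        # $1k/year license come from above; everything else is an assumption.
        hardware_cost = 10_000.0        # self-driving sensor/compute package
        license_fee_per_year = 1_000.0  # annual software license
        service_years = 5               # assumed hardware amortization period
        miles_per_year = 50_000         # assumed taxi utilization

        autonomy_cost_per_mile = (hardware_cost / service_years + license_fee_per_year) / miles_per_year
        driver_cost_per_mile = 15.0 * 2_000 / miles_per_year  # assumed wage: $15/hr, 2,000 hrs/yr

        print(f"Autonomy hardware + license: ${autonomy_cost_per_mile:.3f}/mile")
        print(f"Human driver wages:          ${driver_cost_per_mile:.3f}/mile")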

  • All those companies spending so much money on a technology only a few people really want.

    • Not so sure about "only a few people really want".

      I admit I won't be using it, but that's mostly because I don't buy new cars. If it could be retrofitted to older cars without too much trouble, I'd put it on my car right now. Otherwise, ten or so years after it's available on everything, I'll have it on my car.

      And by then, I expect to be either dead or no longer driving....

    • There are a lot of dreamers. They dream that a self driving car will be just as cheap as a manual car is today. They dream that a self driving car will get them places as quickly as if they drive themselves. They dream that, if they hail a self driving car it will be like having their own and cheaper and more convenient than a taxi. They dream that the safety and convenience in a car they buy will be equal to that of a wealthy person. You remind them that they will have to fit into the economy somehow,
      • There are a lot of dreamers. They dream that a self driving car will be just as cheap as a manual car is today.

        When mass produced, it is unlikely that self driving cars will be much more expensive.

        They dream that a self driving car will get them places as quickly as if they drive themselves.

        Why wouldn't it? And with ultrafast lanes for self driving cars, not having to stop at intersections (or even red lights) when they are clear, and other benefits too dangerous to let human drivers have, they may get there faster.

        They dream that, if they hail a self driving car it will be like having their own and cheaper and more convenient than a taxi.

        It will certainly be cheaper to rent a self driving car than owning a second car that only sees occasional use, in a lot of cases. It's like renting a car without the suckiest part of car rental:

    • $1 billion in 5 years isn't all that much, though.
  • I'd really rather they invested the billion dollars in making electric cars instead. They are chasing the wrong technology.

  • Thought you might want to check out the daily/weekly newsletter from CognitionX with over 4K subscribers. We covered this story and many more. We provide you with the latest and greatest news related to AI. Subscribe to stay up-to-date: http://cognitionx.com/news-bri... [cognitionx.com]
