Transportation

Ford CEO Says the Company 'Overestimated' Self-Driving Cars (engadget.com) 223

Ford CEO Jim Hackett scaled back hopes about the company's plans for self-driving cars this week, admitting that the first vehicles will have limits. From a report: "We overestimated the arrival of autonomous vehicles," said Hackett, who once headed the company's autonomous vehicle division, at a Detroit Economic Club event on Tuesday. While Ford still plans on launching its self-driving car fleet in 2021, Hackett added that "its applications will be narrow, what we call geo-fenced, because the problem is so complex." Hackett's announcement comes nearly six months after Ford's CEO of autonomous vehicles, Sherif Marakby, detailed plans for the company's self-driving car service in a Medium post. The company has invested over $4 billion in the technology's development through 2023, including over $1 billion in Argo AI, an artificial intelligence company that is creating a virtual driver system. Ford is currently testing its self-driving vehicles in Miami, Washington, D.C., and Detroit.
  • by Anonymous Coward on Thursday April 11, 2019 @12:05PM (#58422082)

Pretty sure the people creating the self-driving technology are overselling its capabilities to car makers. Not to mention that the real-world accidents involving Teslas and the Uber incident have to place some fear of liability in car makers these days. Even having a human behind the steering wheel doesn't mean they can recover when the technology has a brain fart.

  • Not surprised (Score:4, Insightful)

    by Anonymous Coward on Thursday April 11, 2019 @12:08PM (#58422096)

Many, including me, have said it is not going to be that easy to make a fully automatic car. These dreams of most newly sold cars being fully automatic within 5 or even 10 years are not realistic.

I really don't want to be babysitting some semi-automatic car, so I won't touch the tech until it is completely and fully automatic, so I can sleep in the car while it's driving.

    • Comment removed based on user account deletion
      • And then there's Canada.
      • Road quality varies (Score:5, Informative)

        by sjbe ( 173966 ) on Thursday April 11, 2019 @12:51PM (#58422406)

        In the US the most consistent roads are the interstates, but if you've driven in enough states, you know that the interstates aren't exactly uniform from place to place.

That's putting it mildly. I can tell when I cross the border from Ohio into Michigan with my eyes closed. Michigan's roads suck and are badly underfunded (only Georgia spends less per capita [mlive.com]). They need to raise taxes, but the Republicans control the legislature and break out in hives when they hear the words "raise taxes".

        We need to see how self-driving cars handle construction zones, rain, snow, and fog on interstates first.

I think they will figure it out, but it's just going to take a lot longer than many people (including Elon Musk) are predicting. I figure even in the best case we are at least 15 years away from a truly self-driving car that could be sold to the public, and that is probably wildly optimistic. I think the technology will make its way into use fairly steadily, and already has, but full autonomy is quite a ways off yet. It's a worthy goal; it's just going to take a while, because it's not an easy problem to solve.

      • by jwhyche ( 6192 ) on Thursday April 11, 2019 @02:49PM (#58423186) Homepage

This is the exact argument that I've been making since I first heard about self-driving cars. The technology isn't there. Using markers on the roads will not work because they are not consistent. GPS isn't accurate enough, and computer maps are not 100% accurate.

With today's technology we could make self-driving cars, but it would require billions to overhaul the existing infrastructure to make it happen.

    • by fluffernutter ( 1411889 ) on Thursday April 11, 2019 @12:23PM (#58422216)
      I'm shocked that Ford actually cares about people enough not to subject the public to half-baked technology like the other automakers.
The cars are autonomous, not "automatic". Perhaps you are overestimating what you bring to the table versus autonomous vehicles.

    • by gweihir ( 88907 )

The thing most people do not understand is that the relevant research has been going on for more than 30 years. I remember a fellow student doing their CS graduation thesis on how to get a car through a two-lane left turn autonomously about 30 years ago. It may well take another 30 years to get there.

    • by Sark666 ( 756464 )

Well, it looks like Waymo is way ahead of everyone else, so if Google gives an update that they were also overly optimistic, I would take that as the indicator of where the industry is at.

      http://infographic.statista.co... [statista.com]

  • by Drethon ( 1445051 ) on Thursday April 11, 2019 @12:08PM (#58422100)

While I think self-driving cars are in fact another level of complexity entirely, this kind of makes me think of NASA/ULA vs. SpaceX. How much of the problem is complexity that genuinely needs to be solved, and how much is the sheer inertia of a company that has been going in one direction for a very long time?

    • by flippy ( 62353 ) on Thursday April 11, 2019 @12:24PM (#58422224) Homepage
      The complexity of autonomous vehicles is immense, especially since the general public and regulators are expecting them to be better at making decisions and safer than human drivers. I'd be willing to say that it's orders of magnitude bigger than the difference between reusable and non-reusable rockets.
      • by byteherder ( 722785 ) on Thursday April 11, 2019 @12:57PM (#58422454)

        The complexity of autonomous vehicles is immense, especially since the general public and regulators are expecting them to be better at making decisions and safer than human drivers. I'd be willing to say that it's orders of magnitude bigger than the difference between reusable and non-reusable rockets.

        The difference between reusable and non-reusable rockets is physics and engineering.

        The difference between human drivers and self-driving cars is hundreds of millions of lines of code that all have to work perfectly.

        Trust me, physics is easier.

        • by flippy ( 62353 )
          Yep!
        • And you don't write the code directly, you have to write a program that writes the code. And then they all have to work 100%.
        • It's more than that -- it's about intelligent interaction not just with the road but with other "brains" so to speak where the rules and modes of interaction are infinite -- vs. the extremely limited rules of interaction with other "brains" as in Go and chess. I guess they'll have to limit the problem to make it closer to Go/chess.

          • It's more than that -- it's about intelligent interaction not just with the road but with other "brains" so to speak where the rules and modes of interaction are infinite

            Intelligent Design?
            ha, ha, hahahahah

I'm pretty sure I could code a computer to pull out in front of me while doing worthless stuff on a cell phone in way under a million lines of code.

          • Could you code a computer to avoid a child running in the road?
Could you code a computer to stay in its lane during a blizzard when the lane lines are obscured?
            Could you code a computer to steer out of a skid on black ice?
            Could you code a computer to drive in pea soup fog when the sensors are blinded?
      • by Kjella ( 173770 )

        The complexity of autonomous vehicles is immense, especially since the general public and regulators are expecting them to be better at making decisions and safer than human drivers. I'd be willing to say that it's orders of magnitude bigger than the difference between reusable and non-reusable rockets.

It depends on how much driving is about being smart and how much it is about being consistent. While I'm pulling the numbers out of my ass, I really can't think of many situations where some extraordinary foresight on my part prevented a crash. But I know many situations where I fucked up and could have caused a collision, and a few where I did. Some 87% of the adult population had a driver's license at its peak, and the remainder mostly felt it was too expensive or hadn't gotten around to it, something like 5%

        • by flippy ( 62353 )

          As I said in a previous response, it's more about perception and acceptance than reality.

          If an autonomous vehicle has a 1 incident in 10k miles driven rate, and the average human driver has that exact same rate, the autonomous system is going to get way more press, and knee-jerk reactions will occur. That's just the nature of the press and human nature.

          The general public doesn't seem to care about comparing incident rates between autonomous vehicles and human-controlled vehicles.

  • No Biggy (Score:4, Insightful)

    by Anonymous Coward on Thursday April 11, 2019 @12:12PM (#58422124)

    "We overestimated the arrival of autonomous vehicles"

    That's fine. We didn't believe you anyway.

First press release: Don't buy a Tesla. We're about to release a self-driving car done right. Second press release: Don't buy a Tesla. It's not actually possible to produce a self-driving car.
    • Are you insinuating that a Tesla is anywhere close to a self-driving car?
      • Are you insinuating that a Tesla is anywhere close to a self-driving car?

        No, he seems to be insinuating that people might think that Tesla is close to a self-driving car and so not buy a Ford....

        The "not buy a Ford" being the critical part.

  • No kidding! (Score:5, Insightful)

    by flippy ( 62353 ) on Thursday April 11, 2019 @12:20PM (#58422186) Homepage
    Is anyone besides the CEOs themselves surprised that the corporate-types vastly underestimated the complexity of a problem they didn't truly understand in the first place?
    • Re:No kidding! (Score:5, Interesting)

      by Anonymous Coward on Thursday April 11, 2019 @12:35PM (#58422286)

      No kidding. Anyone who is both an experienced driver and experienced programmer could see that car companies, including Tesla, were grossly overselling autonomous vehicles. The fact that anyone was seriously talking about fully autonomous vehicles and projecting a timeline for their availability was a bad joke.

As I keep pointing out, think of all the situations you've encountered on the road in the last year that no computer could handle without a vast increase in the capability of embedded AI. Weather, emergency vehicles, detours, mismarked roads, etc. all make fully autonomous vehicles a fantasy, at least in 2019 and the near future.

      • by dgatwood ( 11270 )

As I keep pointing out, think of all the situations you've encountered on the road in the last year that no computer could handle without a vast increase in the capability of embedded AI. Weather, emergency vehicles, detours, mismarked roads, etc. all make fully autonomous vehicles a fantasy, at least in 2019 and the near future.

        Weather? Maybe. It kind of depends on the nature of the weather, the type of self-driving tech, what they're doing to de-noise the data, etc. But it causes problems for human driv

        • Detours? On major roads in most states, this has been a solved problem for many years. The relevant road agency posts a closure notice on their website, some bot scrapes that website and converts the road closure data into a form suitable for algorithmic routing, and the routing algorithm in the navigation system or app guides you around it.

          Wrong scale of "Detour". The ones you describe are easy.

          The hard ones are the ones where a guy with a stop/slow sign directs you to drive on the wrong side of the road. Or sometimes he's just using his hands instead of a sign.

          Those are very easy for a human to navigate. We're easily able to suspend the rules and drive the wrong way because we understand what's going on. That's not true of an autonomous vehicle.
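For the "easy" case described above, here is a minimal sketch of the rerouting step: a plain Dijkstra search over a toy road graph that skips segments flagged in an already-scraped closure feed. The graph, node names, and closure format are invented for illustration; a real navigation stack would do this over map data at an entirely different scale.

```python
import heapq

def shortest_path(graph, start, goal, closed=frozenset()):
    """Dijkstra over a road graph, skipping closed segments.

    graph:  {node: [(neighbor, miles), ...]}
    closed: set of (a, b) segment tuples taken from a closure feed.
    """
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:  # rebuild the route by walking predecessors
            path = [node]
            while path[-1] in prev:
                path.append(prev[path[-1]])
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, miles in graph.get(node, []):
            if (node, nbr) in closed or (nbr, node) in closed:
                continue  # segment reported closed; route around it
            nd = d + miles
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(pq, (nd, nbr))
    return float("inf"), []

roads = {"A": [("B", 1.0), ("C", 3.0)], "B": [("D", 1.0)], "C": [("D", 1.0)]}
print(shortest_path(roads, "A", "D"))                # (2.0, ['A', 'B', 'D'])
print(shortest_path(roads, "A", "D", {("B", "D")}))  # (4.0, ['A', 'C', 'D'])
```

The hard case in the reply above (a flagger waving you into the oncoming lane) lives entirely outside this kind of graph search, which is exactly the point.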

          • by dgatwood ( 11270 )

            The hard ones are the ones where a guy with a stop/slow sign directs you to drive on the wrong side of the road. Or sometimes he's just using his hands instead of a sign.

            That's really not a detour, though. The word "detour" (from French détour: n. a turn or other change of direction) is typically defined as taking a different route that is significantly longer than the normal route (whether because of a road closure, heavy traffic, or just rerouting through town because of a Big Mac attack). If you

            • That's really not a detour, though.

              I've seen plenty of "Detour" signs that involved driving the wrong way down a street.

              Also, pedantic adherence to one definition of a word does not make a problem go away.

              Mainly, the software just has to know how to go to the correct side of the center stripe on the other side as soon as it is possible to do so

Except the software also has to know when not to do this. That's as hard as teaching it when to do this.

              Your example is also only the simplest way a construction zone could be laid out. It might not be a full lane shift. It might involve crossing a lot of lines that are now perpendicular to your direction of travel. It might involve

              • by dgatwood ( 11270 )

                Yes, in theory, construction zones can be a problem, but in practice, the problem is mostly political, not technological. Most of the problems you describe are actually caused by the various transit agencies not doing the necessary preparation for construction. In an era of self-driving cars, proper construction signage will likely be mandatory to avoid creating situations that require human intervention.

For short-term lane shifts, safety cones are unambiguous and easily followed. Transit agencies a

So who pays for that? Hopefully the cost of repairing the roads doesn't go up, because many cities are struggling as it is. The US interstate system is crumbling and would take 60 years to fix even if significant funds were allocated now (how's that for a national emergency?). In my area, construction workers won't even take the reduced-speed-limit sign down when no one is there.
        • If pulling over is so easy, why does a Tesla keep driving even when it knows the driver is not responding as required for being in the car?
          • by dgatwood ( 11270 )

            If pulling over is so easy, why does a Tesla keep driving even when it knows the driver is not responding as required for being in the car?

            If Tesla AP were anywhere approaching full autonomy, pulling over and stopping would be relatively easy to add — far easier than driving on residential streets with no lane markings and pedestrian dangers. However, the Tesla AutoPilot feature isn't anywhere near self-driving yet, and their AP computers are running at the limits of their hardware capabilities even

            • Apparently I pushed a button. :-)
              • by dgatwood ( 11270 )

                Not at all. I just figured most people who don't drive Teslas have no real concept of just how far away from full self driving Tesla's implementation currently is, and therefore don't realize that it can barely even stay between the lane lines consistently, much less pull off the road. :-)

        • For police cars, they'll eventually do something like this:

          * Police officer wants car in front of him to pull over. Hits button signaling this intent.

          * Police car computer reads the license plate (or otherwise determines the car's unique ID), creates a "pull over" request, and signs it with a certificate derived from the police department's certificate. It then gets communicated to the car through various means... probably multiple, including somewhere on a network that the car itself polls every few second
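A minimal sketch of what such a signed request could look like, using a shared-secret HMAC from the Python standard library as a stand-in for the certificate-derived signatures described above. The key, field names, and freshness window are all invented for illustration; a real deployment would use per-department PKI certificates, not a shared secret.

```python
import hmac, hashlib, json, time

# Hypothetical shared key; stands in for a department-issued certificate.
DEPARTMENT_KEY = b"example-department-key"

def make_pullover_request(plate: str) -> dict:
    """Police-side: build and sign a 'pull over' request for one car."""
    body = {"plate": plate, "action": "pull_over", "ts": int(time.time())}
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(DEPARTMENT_KEY, payload, hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def verify_pullover_request(req: dict, max_age_s: int = 30) -> bool:
    """Car-side: accept only authentic, recent requests."""
    payload = json.dumps(req["body"], sort_keys=True).encode()
    expected = hmac.new(DEPARTMENT_KEY, payload, hashlib.sha256).hexdigest()
    fresh = time.time() - req["body"]["ts"] <= max_age_s
    return hmac.compare_digest(expected, req["sig"]) and fresh

req = make_pullover_request("ABC1234")
assert verify_pullover_request(req)
req["body"]["plate"] = "XYZ9999"   # tampering invalidates the signature
assert not verify_pullover_request(req)
```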

I'll just add this as someone who lives in an area where these things are currently in the wild and sees them all the time: the Cruise and Waymo cars, etc., are still only about 50 percent successful at navigating a basic four-way stop, especially with the ubiquitous "California stop" habits of the other human drivers. Bicyclists, pedestrians, double-parked UPS trucks. There is such a long way to go on this.

    • Re:No kidding! (Score:5, Interesting)

      by gurps_npc ( 621217 ) on Thursday April 11, 2019 @12:36PM (#58422290) Homepage

To me the question isn't why they are underestimating the problem, but why they are concentrating on self-driving cars.

Obviously the first applications should be self-driving buses, long-haul 18-wheelers, garbage trucks, etc.

Things where slower speeds are acceptable, whose routes are mostly pre-planned, and where a company is paying someone to drive rather than a person driving themselves.

      • Re: (Score:3, Insightful)

        by v1s10nary ( 5867496 )

Obviously the first applications should be self-driving buses, long-haul 18-wheelers, garbage trucks, etc.

Those would indeed be the most logical applications of self-driving vehicles... but they would also cause the biggest economic disruptions. Bus/truck drivers and garbage men are notorious for aggressive union activity; imagine the outcry if they feel their professions are threatened by autonomy.

        It would be the "NYC taxi drivers vs. Uber" [dryve.com] situation on a much larger scale.

      • by OzPeter ( 195038 )

To me the question isn't why they are underestimating the problem, but why they are concentrating on self-driving cars.

Obviously the first applications should be self-driving buses, long-haul 18-wheelers, garbage trucks, etc.

Things where slower speeds are acceptable, whose routes are mostly pre-planned, and where a company is paying someone to drive rather than a person driving themselves.

But people are working on autonomous trucks [wikipedia.org] and also trying other things such as platoon driving [wikipedia.org]. Also check out Self-Driving Trucks: Are Truck Drivers Out Of A Job? [atbs.com]. However, I think these efforts are underreported because self-driving cars are seen as "sexy" and trucks aren't.

      • "Obviously the first applications should be self driving Buses, Long Haul 18 wheelers, Garbage Trucks, etc. etc."
        Those vehicles have so much more mass than a passenger car so they'll do so much more damage and kill so many more people when (not if, when) they fuck up. A 'self driving bus' full of people when it fucks up will kill dozens.
        Also the market for those is much smaller. Less profit to be made.
If you can create a long-haul tractor-trailer that can drive with little or no human intervention, major shipping companies will likely shovel cash in your direction. A quick Google search suggests that the average semi is driven about 45,000 miles per year. If the driver is earning $0.40 per mile, that's a savings of $18,000 per year per truck. FedEx has 20,000 semi trucks in its fleet; Walmart has 6,000. As an aside, I read that the US had a shortage of 50,000 drivers in 2017.

          The market might be smalle
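The parent comment's arithmetic, written out (the mileage, per-mile pay, and fleet sizes are the comment's own unverified figures):

```python
miles_per_year = 45_000
pay_per_mile = 0.40
savings_per_truck = miles_per_year * pay_per_mile   # $18,000 per truck per year

for fleet, trucks in [("FedEx", 20_000), ("Walmart", 6_000)]:
    print(f"{fleet}: ${savings_per_truck * trucks:,.0f} per year")
# FedEx: $360,000,000 per year
# Walmart: $108,000,000 per year
```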

      • why are they concentrating on self driving cars

        There are "inefficiencies" to rape and pillage.

      • by ledow ( 319597 )

        Golf carts.
        Shopping trolleys.
        Fairground rides.

All kinds of things operate in geo-fenced areas and can be made to run in a very constrained environment, where computer vision is easy, speeds are low, decisions are few, obstacles are few, and the end result of a mistake is a bruised ankle, not a dead kid.

        But they ALL skipped such stages.

Obviously the first applications should be self-driving buses, long-haul 18-wheelers, garbage trucks, etc.

Ford is a minor player in heavy trucks. The biggest trucks they make are Class 7: hotel buses, airport shuttles, dump trucks, box trucks, school buses (I don't think anyone is still making a Ford or GMC school bus, but I could be wrong), etc. The big players in heavy trucks in the US are Freightliner (Mercedes), Peterbilt and Kenworth (both owned by Paccar), International-Navistar, and Volvo and Mack (both part of the Volvo Group). Ford worked with what they had, which is to say minivans (Transit). Autonomous mini

      • For buses, 18-wheelers, and garbage trucks, the driver is probably the cheapest part of the daily operating expenses ANYWAY.

Automation is definitely coming to 18-wheelers, but it'll mostly be as a safety feature and a way to reduce the training and skill necessary to safely drive one. Today, driving an 18-wheeler is HARD. If they can lower the bar to the point where anybody qualified to drive a 20-foot U-Haul truck could safely drive an 18-wheeler across the country, that alone would be a huge cost savings

    • by Tablizer ( 95088 )

      Is anyone besides the CEOs themselves surprised that the corporate-types vastly underestimated the complexity of a problem they didn't truly understand in the first place?

It's really hard to estimate future progress on this type of thing. If anyone were good at it, they'd be as rich as Warren Buffett from their tech stock picks. (Warren tends to avoid tech stocks.)

Progress was being made rather quickly. If one extrapolated the progress curve based on the early pace, it's not unreasonable to conclude that pr

    • by gweihir ( 88907 )

      No. And since this is an IT problem, even less so.

  • by fluffernutter ( 1411889 ) on Thursday April 11, 2019 @12:21PM (#58422198)
Geo-fenced... in other words, it will run around a pre-defined circuit in a walled compound. (A minimal sketch of a geofence check follows this thread.)
    • No, that means "retirement community" and "corporate parks".

      But, walled in a sense.

      • "retirement community" was actually what I was looking for.
They have walled communities that aren't age-restricted in the Santa Barbara area, but an autonomous vehicle could easily crash through those walls, IMHO. I think they're called "planned communities" and, like retirement communities, they have restrictions on tons of stuff, so you could easily do something in a more desert-like or island-like area. Then anyone they mow down has "agreed" to the risk, other than the kids sacrificed on the altar of technology when they visit the relatives.
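Since "geo-fenced" here just means the car refuses to operate outside a pre-drawn boundary, here is a minimal sketch of the core check: a standard ray-casting point-in-polygon test. The service-area polygon and coordinates are invented for illustration; real systems would work with geodetic coordinates and far more elaborate operational-design-domain rules.

```python
def in_geofence(point, polygon):
    """Ray casting: count how often a ray from `point` crosses the fence."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:       # crossing lies to the right of the point
                inside = not inside
    return inside

service_area = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
print(in_geofence((2.0, 2.0), service_area))  # True: trip allowed
print(in_geofence((5.0, 2.0), service_area))  # False: outside the fence, refuse
```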

  • by Joe_Dragon ( 2206452 ) on Thursday April 11, 2019 @12:36PM (#58422294)

Tesla has oversold Autopilot and people have died

    • by mspohr ( 589790 )

Tesla releases a quarterly safety report on its cars: about one accident for every 2.87 million miles driven with Autopilot engaged. By contrast, for the general public it's about one accident per 436,000 miles.
So, much better but not perfect.
As long as it's better than driving without assistance, it's a win.
      https://www.tesla.com/VehicleS... [tesla.com]

    • by pezpunk ( 205653 )

      Tesla vehicles in Q4 2018 experienced accidents at a rate of 1 accident per 2.87 million miles driven with Autopilot engaged, and one accident every 1.76 million miles without Autopilot. The average car in America experiences 1 crash every 436,000 miles (these stats do not take fault into account). Effectively, a Tesla is 4 times less likely than the average car to get into an accident, and that number jumps to 6 times less likely when using Autopilot (which is constantly improving).

      https://www.teslarati.co [teslarati.com]
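Taking the two comments' figures at face value, the ratios work out as follows (the miles-per-accident numbers are the ones quoted above from Tesla's Q4 2018 safety report and the cited US average; they are not independently verified here, and they ignore differences in fault, road mix, and fleet age):

```python
miles_per_accident_ap = 2_870_000     # Tesla, Autopilot engaged
miles_per_accident_no_ap = 1_760_000  # Tesla, Autopilot off
miles_per_accident_us = 436_000       # quoted US average

print(f"Tesla w/o Autopilot vs US average: {miles_per_accident_no_ap / miles_per_accident_us:.1f}x")  # 4.0x
print(f"Tesla with Autopilot vs US average: {miles_per_accident_ap / miles_per_accident_us:.1f}x")    # 6.6x
```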

    • by bigpat ( 158134 )

Tesla has oversold Autopilot and people have died

Again, nobody gives a shit about the tens of thousands of people dying from today's defective driving technology, but one person dies from an autonomous vehicle, the specific problem gets fixed, and suddenly the problems are intractable?

      Whole planes full of people fell out of the sky in what is supposedly the safest industry because a sensor failed and Boeing is taking a few weeks to fix the problem and yet we haven't grounded all aircraft even when the sky is literally falling.

      Tesla and to an even greater

Uber killed someone and got the cops to try to pin part of the blame on the poorly trained safety driver, whose tasks included taking her eyes off the road to look at logs.

The Boeing 737 MAX 8 is what happens when a sensor messes up, and that could have been much worse.
Now imagine lots of cars that can fail in the same way, with people who have way less training acting as backups.

  • by WillAffleckUW ( 858324 ) on Thursday April 11, 2019 @12:50PM (#58422398) Homepage Journal

Internal engineers say this (2023) is pie in the sky, and it's more like 2035 for development plus 5 more years of extensive real-world testing.

    But that's reality.

    Face it, you're more likely to have working commercial (non-military) safe fusion reactors before you see self-driving cars.

    Let alone their security implications.

  • Expected (Score:5, Insightful)

    by MBGMorden ( 803437 ) on Thursday April 11, 2019 @12:58PM (#58422460)

    I've pretty much expected this for a while.

Don't get me wrong: the progress that has been made in this field is incredibly impressive, and I have no doubt that eventually we'll get there, but I find it laughable when people with kids who are 8-9 years old declare that their kids will never have to learn how to drive.

    Autonomous driving is a VERY complex problem, and while it may be 90% solved, that last 10% will likely take us decades to perfect. I wouldn't expect fully autonomous cars to be the norm for probably 40-50 more years.

Heck, just look at the situation Boeing is in right now. Aviation is arguably a much easier task to automate, because there are fewer other vehicles around (and the ones that are around typically have transponders announcing their location) and the environment is much more structured as to procedures, yet they've had multiple planes crash due to faulty sensors and autopilot-related functions.

    Electric cars - sure, they'll be the norm in 10-15 years. Autonomous though? It'll be a while.

    • Autonomous driving is a VERY complex problem, and while it may be 90% solved, that last 10% will likely take us decades to perfect. I wouldn't expect fully autonomous cars to be the norm for probably 40-50 more years.

Heck, just look at the situation Boeing is in right now. Aviation is arguably a much easier task to automate, because there are fewer other vehicles around (and the ones that are around typically have transponders announcing their location) and the environment is much more structured as to procedures, yet they've had multiple planes crash due to faulty sensors and autopilot-related functions.

      Electric cars - sure, they'll be the norm in 10-15 years. Autonomous though? It'll be a while.

Based on how airplane autopilot has developed, I think the above is a much more realistic view of what self-driving cars will look like.
The autopilot was invented in 1914 and has been in airplanes in one form or another ever since. Autopilots can handle a lot of the day-to-day flying between point A and point B, but none of them do takeoffs. The most advanced ones can do landings to a degree, but they have problems once you get into low visibility. As a general rule, landings are still done by p

  • Told you so. :-) (Score:4, Insightful)

    by Rick Schumann ( 4662797 ) on Thursday April 11, 2019 @01:03PM (#58422498) Journal

1. Invest billions in what execs think is 'just another new product R&D cycle'.
2. Get sold the idea that 'deep learning algorithms' are enough, just keep throwing more and more data at it!
3. Discover you can't get it over the finish line because NO ONE has any fucking idea how a brain 'thinks'.
4. Legal department's analysis shows the risk/benefit ratio is so high that you'd be nuts to actually launch this hot mess.

    Like I've been saying all along.
The approach of the current crop of so-called, inaccurately named 'AI' is all wrong. You need a fully thinking brain behind the wheel, not just some shitty half-assed 'learning algorithm' that doesn't know the difference between a living being and a lamppost, or a real stop sign from one printed on the back of a T-shirt, or that a stop sign with some graffiti or a sticker on it is still a stop sign.
    If your so-called 'self driving car' needs to pull over in the middle of a trip and 'phone home' so a remote HUMAN operator can 'guide it through' whatever it is that's making it vapor-lock on you, then it's not suitable 'technology' for public roads. Period.

  • by eepok ( 545733 ) on Thursday April 11, 2019 @01:04PM (#58422500) Homepage

People like to say "everyone's stupid" and thus "if AVs are 10% better than human drivers, it will be worth it". But once you talk to the actual developers and technologists and ignore the futurists, they'll tell you that getting a machine to make decisions even half as well as humans is really, really, really difficult.

We don't give "stupid" people enough credit. The innate base intelligence of someone who doesn't regularly kill others on the road is very difficult to emulate, however we rank them against the more intelligent members of the species. They don't just follow lines in the road: they adjust to lighting conditions and curvature in the road, and they know how people will swerve in advance of potholes. They can quickly decide if someone or something is about to go into the road, and they even make subjective judgments about how another vehicle will move on a freeway based on minute observations over the last 30 seconds.

    From a purely decision-based analysis, driving an automobile is extremely complex... still too complex for a computer to measure and judge appropriately to be autonomous. We'll get there... but it won't be quick, cheap, or easy.

    • by bigpat ( 158134 )

      We don't need human level AI to drive a damn car. People are taking too long to react because they are thinking. Thinking takes time and we haven't evolved to drive cars so our thinking takes too long. We can create AI that doesn't think so much about the problems and can therefore react more quickly than human drivers.

This is a situation where replacing thoughtful drivers with dumb drivers is exactly what you want in order to save lives.

  • by Sqreater ( 895148 ) on Thursday April 11, 2019 @01:35PM (#58422690)
    Fordy co. CEO James Mackit scaled back hopes about the company's self-dancing bots this week, admitting that the first breakbots will have limits.
    From a report:
    “We overestimated the arrival of break-dancing bots," said Mackit, who once headed the company's dancebot division, at a Detroit Economics Club event on Tuesday. While Fordy still plans on launching its new self-dancing bot in 2021, Mackit added that "its dance selection will be narrow, what we call geo-fenced, because the problem is so complex." Mackit's announcement comes nearly six months after its CEO of autonomous dancebots, Sherry Makersmark, detailed plans for the company's new self-dancing bot in a Mediums post. The company has invested over $40 billion in the technology's development through 2023, including over $10 billion in Farrago AI, an artificial intelligence company that is creating a virtual dancer system. Fordy is currently testing its self-dancing bots in Miami, Washington, D.C. and Detroit. Fatalities have been minimal.
  • by argStyopa ( 232550 ) on Thursday April 11, 2019 @01:40PM (#58422712) Journal

Do they build their long-term plans from, what, skimming the covers of the always-overoptimistic Popular Mechanics?

    No sober person who didn't have a dog in the hunt believed that autonomous vehicles were anywhere NEAR close to implementation. We still haven't solved fundamental problems with vision and processing on perfectly clear, dry days on sunny, empty California streets (where pretty nearly your dog could drive safely), to say NOTHING of the major effects of weather, night, redundancy, and the never-insignificant-in-American-contexts: LIABILITY.

    If you think one company is going to seriously put truly autonomous vehicles on the road before they're essentially held-harmless if/when it runs over a kid, you don't understand corporations.

Dear Mr. Hackett: I fear your automotive start-up will never succeed unless you are able to excite 'investors' into pouring money into your scheme with outlandish self-driving claims. This whole truth thing will spell doom for your small firm.

Well, good for Ford. GM and others are plunging headlong into this, creating a race of lemmings. Perhaps Ford looked at this and decided to "pump the brakes" (haha) on the technology. It takes leadership to go against the herd and ask questions like that.

    Now "leadership" doesn't necessarily equate to a correct assessment. He could be mis-evaluating the state of the technology and be flat-ass wrong. Or he could have saved the company billions of lost dollars trying to keep up with the other she
