
Mobileye Says Tesla Was Dropped Because of Safety Concerns (218 comments)

An anonymous reader writes: On Wednesday, Mobileye revealed that it ended its relationship with Tesla because "it was pushing the envelope in terms of safety." Mobileye's CTO and co-founder Amnon Shashua told Reuters that the electric vehicle maker was using his company's machine vision sensor system in applications for which it had not been designed. "No matter how you spin it, (Autopilot) is not designed for that. It is a driver assistance system and not a driverless system," Shashua said. In a statement to Reuters, Tesla said that it has "continuously educated customers on the use of the features, reminding them that they're responsible to keep their hands on the wheel and remain alert and present when using Autopilot" and that the system has never been described as autonomous or self-driving. (This statement appears to be at odds with statements made by Musk at shareholder meetings.) It is also emerging that the crash which cost Joshua Brown his life in May of this year was unlikely to have been the first such fatal crash involving Tesla's Autopilot. In January of this year in China, a Tesla ploughed into the back of a stationary truck at speed, killing the driver.
  • Unreasonable (Score:5, Interesting)

    by AmiMoJo ( 196126 ) on Thursday September 15, 2016 @09:46AM (#52892953) Homepage Journal

    Asking customers to remain alert while the car drives itself for hours on end is unreasonable. Psychologists know that; NASA warned them about it. Human beings simply can't concentrate for that amount of time with nothing to do.

    • by Nutria ( 679911 )

      Exactly. If you call something autopilot, then people expect it to be an... autopilot.

      • Re:Unreasonable (Score:5, Interesting)

        by ShanghaiBill ( 739463 ) on Thursday September 15, 2016 @10:16AM (#52893219)

        Exactly. If you call something autopilot, then people expect it to be an... autopilot.

        ... unless they are actually a Tesla owner. I use Autopilot, and Tesla repeatedly and emphatically makes the capabilities of the system and the responsibility of the driver very clear.

        • Re: (Score:3, Interesting)

          by Nutria ( 679911 )

          If the driver has to keep his hands on the wheel, and pay attention, then... it's not an autopilot.

          (Not that I'm shocked or anything by deceptive marketing practices.)

        • by Octorian ( 14086 )

          Except everyone who casually reads tech news, only vaguely paying attention to headlines written by tech writers, has a completely mistaken impression of what it is and does.

          Seriously, I've seen everyone from random friends to strangers on the street assume the car could basically drive itself. (Yes, even before they released the feature.)

          The capabilities of the system, and the responsibilities of the driver, are quite clear... if you actually drive the car or read past the headlines. Unfortunately, most

          • by Nutria ( 679911 )

            Except everyone who casually reads tech news, only vaguely paying attention to headlines written by tech writers, has a completely mistaken impression of what it is and does.

            It only takes 97% of the 90K Tesla drivers doing the right thing for there to be thousands of Tesla drivers not doing the right thing (3% of 90,000 is 2,700).

            Hilarity ensues.

            • It only takes 97% of the 90K Tesla drivers doing the right thing for there to be thousands of Tesla drivers not doing the right thing.

              Stupidity and ignorance are two different things. The idiots posting YouTube videos [youtube.com] of driverless Teslas, filmed from the back seat, are fully aware that their behavior is reckless. They were not "tricked" by the name of the software.

              • by Nutria ( 679911 )

                are fully aware that their behavior is reckless. They were not "tricked" by the name of the software.

                Yet how many are thinking, "telling us to keep our hands on the wheel at all times is just lawyer CYA"?

        • They can make it as clear as daylight, but it's human nature to start doing the exact opposite after a period of time. It's human nature not to pay attention when you're not doing anything. Read the NASA opinion on this; it explains it well.

      • by rhazz ( 2853871 )
        And here I was hoping to see a Tesla conversation that didn't devolve to an argument about the definition of the word "autopilot". Every fucking thread. Get over it.
      • Re:Unreasonable (Score:4, Informative)

        by bobbied ( 2522392 ) on Thursday September 15, 2016 @11:38AM (#52893863)

        Exactly. If you call something autopilot, then people expect it to be an... autopilot.

        The problem, though, is that actual "autopilots" are in airplanes, and they range widely in complexity and features.

        Some are simple two-axis affairs that can maintain heading and altitude, sort of, as long as your DG doesn't drift and the altimeter works. Some are fully automatic, land-in-a-fog-bank-worthy-of-a-mystery-novel affairs that do literally everything but talk on the radios from departure to arrival, with little more than a few button pushes. Most fall between the extremes.

        Using an autopilot in an airplane requires that the pilots be fully aware of the automation's limitations and monitor the flight's progress. Its purpose is twofold: 1. to lower the pilot's workload and increase safety at critical phases of flight, by automating mundane tasks like controlling altitude, heading, and speed; and 2. to increase efficiency by keeping the aircraft operating in the most efficient way possible.

        Tesla's "autopilot" is something totally different. It's not about efficiency, and it's not about safety; it's about convenience. Though they call it an autopilot, it most certainly isn't one. It's built for a totally different reason.

        • by Nutria ( 679911 )

          It's not about efficiency, and it's not about safety; it's about convenience. Though they call it an autopilot, it most certainly isn't one. It's built for a totally different reason.

          Exactly.

      • Exactly. If you call something autopilot, then people expect it to be an... autopilot.

        I know. Tesla's system isn't an autopilot. It's far better than that. Autopilots as we traditionally know them can't cope with anything; they can only maintain heading and altitude, and they drop back to pilot control every time someone looks at them funny. They'll happily fly into a storm or into a mountain.

        I propose we name the Tesla system the drunk chauffeur: much better than an autopilot, but it may still get you killed.

    • Re:Unreasonable (Score:5, Interesting)

      by Dare nMc ( 468959 ) on Thursday September 15, 2016 @10:32AM (#52893339)

      >Asking customers to remain alert while the car drives itself for hours on end is unreasonable.

      It is also unthinkable to have your "backup" evaluate the performance of the autopilot by watching only the output. I have hundreds of hours logged in autonomous vehicles, but I would review the data afterward and see everything the diagnostics logged for the week: every GPS signal lost or drifted, and so on. So I have never completely trusted them. Yet all of the operators, even when told by engineers that they were running a beta release with big untested changes, would spend all of their time on their phones, never knowing when a backup or a sensor had misread something. You cannot evaluate the system's maturity from the output alone: "well, it stopped the other 5 times someone stepped in front of the vehicle, so why would I worry about walking in front of one?" If you don't know that the cameras failed to maintain road-edge monitoring 15 times in the last mile, and that 20 times in that same mile another sensor was the only thing keeping you on the road, then the failures only show up in the output once those two events overlap, which they eventually will.
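
      As a rough illustration of that last point, here is a minimal sketch with hypothetical per-mile failure counts, assuming the two failure modes are independent:

        # Minimal sketch: why two individually-masked failures eventually coincide.
        # All numbers are hypothetical; failure modes are assumed independent.

        segments_per_mile = 1000   # discretize each mile into short segments
        camera_failures = 15       # segments/mile where the camera loses the road edge
        backup_critical = 20       # segments/mile where the backup sensor alone saves you

        p_camera = camera_failures / segments_per_mile
        p_backup_needed = backup_critical / segments_per_mile

        # Expected joint failures per mile under independence:
        joint_per_mile = segments_per_mile * p_camera * p_backup_needed
        print(f"Expected joint failures per mile: {joint_per_mile:.2f}")
        print(f"Roughly one visible failure every {1 / joint_per_mile:.0f} miles")

      With those numbers the overlap shows up about once every three miles, even though neither failure mode is ever visible on its own.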

    • I warned them about that. But then I worked for NASA. I even said similar things in the past and was labeled as a troll. I don't care and rarely comment on /. anymore.
    • Whenever I think of Tesla drivers complaining about the Autopilot, I think of this [imdb.com] (sorry, couldn't find the actual clip)
    • Asking customers to remain alert while the car drives itself for hours on end is unreasonable.

      Asking drivers to remain alert while they drive for hours on end is unreasonable. That's why we have rest stops, and why everyone and your mom (literally!) will tell you to pull over and take a break occasionally.

      Perhaps Autopilot will, in the future, require that drivers do the same, and offer to drive them to a rest stop.

      Human beings simply can't concentrate for that amount of time with nothing to do.

      Except when you're in traffic with a bunch of fuckheads and your life is being threatened every few minutes (or seconds, as is more likely around say the Bay Area... or any big city in Texas, or lots of other places) driving is already well below that threshold for anyone who has any actual business driving. Hence the need for more automated driving features...

      • Except when you're in traffic with a bunch of fuckheads and your life is being threatened every few minutes (or seconds, as is more likely around say the Bay Area... or any big city in Texas, or lots of other places) driving is already well below that threshold for anyone who has any actual business driving. Hence the need for more automated driving features...

        The idea that driving is too dangerous so machines should do it instead is a nice philosophical concept.

        In the real world details matter, implementation quality matters, technological capabilities matter and philosophy is in fact worthless.

        Technologies like AEB have a proven track record of significantly improving safety. Others, such as LDW/LKS, have yet to make the case, or have even been shown to be a liability for safety in the aggregate.

        The question at hand is: does feature 'x' provide a benefit, or is it in fact a

        • The idea that driving is too dangerous so machines should do it instead is a nice philosophical concept.
          In the real world details matter, implementation quality matters, technological capabilities matter and philosophy is in fact worthless.

          Yes, this is why I am opposed to additional proliferation of roadways, and support instead revising transportation to use a combination of PRT and pathways suitable for cyclists, pedestrians, riders of horses, et cetera. We have had the technology to have vehicles steer themselves since the 1800s and it is called rail. There are many reasons why cars are a stupid way to get around. Roads are crap, tires are crap, people are great drivers except when they aren't, and making a self-driving car that can litera

  • Well... (Score:5, Interesting)

    by John Smith ( 4340437 ) on Thursday September 15, 2016 @09:50AM (#52892995)
    Tesla has about 2 fatalities per 100 million miles. South Carolina, the worst state in the US, has 1.65 traffic fatalities per 100 million miles; Massachusetts has 0.57. Clearly, self-driving cars have a long way to go.
    • Re:Well... (Score:5, Insightful)

      by mbeckman ( 645148 ) on Thursday September 15, 2016 @10:29AM (#52893307)
      Statistically insignificant. Tesla stats will only matter when tens of thousands, if not millions, of trips have been made under autopilot. Then compare accident rates. If Tesla turns out to be safer, it won't be because of AI, because we have no idea how humans drive in the first place. It will be because of image processing and predictive algorithms, combined with pre-ordained decision trees. And there may well be major unforeseen consequences, such as cascading failures and catastrophic feedback loop interactions between vehicles.
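
      As a back-of-the-envelope illustration of why two fatalities are statistically weak evidence, here is a minimal sketch of the uncertainty around the figures quoted in this thread, assuming fatalities follow a Poisson process:

        # Exact 95% Poisson confidence interval around "2 fatalities in 100M miles".
        from scipy.stats import chi2

        observed = 2        # fatalities quoted upthread
        miles = 100e6       # Autopilot miles quoted upthread

        lower = chi2.ppf(0.025, 2 * observed) / 2          # exact Poisson lower bound
        upper = chi2.ppf(0.975, 2 * (observed + 1)) / 2    # exact Poisson upper bound

        scale = 1e8 / miles  # convert counts to a per-100M-mile rate
        print(f"point estimate: {observed * scale:.2f} per 100M miles")
        print(f"95% CI: {lower * scale:.2f} to {upper * scale:.2f} per 100M miles")

      The interval runs from roughly 0.24 to 7.2 fatalities per 100M miles, spanning every state figure quoted above, which is the point: the sample is far too small to rank Autopilot against human drivers.
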
      • Statistically insignificant. Tesla stats will only matter when tens of thousands, if not millions, of trips have been made under autopilot.

        The problem with waiting around for better data is that you're asking consumers to be the guinea pigs for an untested and potentially dangerous device. Given Tesla's overreaching marketing and lack of transparency, most people would not trust the company with their lives, and those who do, do so at their peril. A safer (albeit slower) approach would be for Tesla to demonstrate safety through public testing data.

        • > The problem with waiting around for better data is that you're asking consumers to be the guinea pigs for an untested and potentially dangerous device.

          Counterpoint: You HAVE seen the roads being filled with potentially dangerous and unqualified meatbags driving multi-ton objects at lethal speeds, yes?

          An autopilot car has never crashed because it was doing its makeup, talking on a cellphone, had too many at the local bar, fell asleep, argued with its wife and jerked the wheel around to punctuate a sal

          • Tying the steering wheel with a rope and putting a brick on the gas pedal never got into an accident for any of those reasons either.
            • Except that the rope and brick method isn't intended to be an autopilot. More of a kinetic payload strike, so the fault would lie with the meatbag who sent the brick car on its course.

              • My point is that there is fault for all kinds of reasons. Just because it is a computer AI reason and not a human reason doesn't make it any less at fault.
        • You make a good point. Quite possibly driverless cars cannot be adequately tested without putting many humans at risk. We have to decide whether that risk, and the inevitable deaths that will result, is outweighed by the benefit of convenience and not having to drive ourselves. There is no guarantee that driverless cars will end up being safer than driving ourselves. It may be a wash, or it could be far more deadly. Nobody knows, because there is zero data on accidents in an environment where many or
      • Statistically insignificant. Tesla stats will only matter when tens of thousands, if not millions, of trips have been made under autopilot. Then compare accident rates. If Tesla turns out to be safer, it won't be because of AI, because we have no idea how humans drive in the first place. It will be because of image processing and predictive algorithms, combined with pre-ordained decision trees. And there may well be major unforeseen consequences, such as cascading failures and catastrophic feedback loop interactions between vehicles.

        Apples to oranges: almost no one uses Autopilot in the most adverse conditions, such as icy roads, where many if not most driving fatalities occur.

      • Statistically insignificant. Tesla stats will only matter when tens of thousands, if not millions, of trips have been made under autopilot. Then compare accident rates.

        How few trips do you imagine have been made under Autopilot so far? The 100 million mile mark was passed back in May [theverge.com]. If there had been only 10k trips, each would have averaged 10,000 miles (100M/10k = 10k). Since a Tesla can only go about 300 miles on a charge (best case), there must have been at least 333,333 trips. In actuality, of course, the number is much, much higher than that. Odds are beyond good that with 100M miles, there have been more than a million trips.
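
        The arithmetic, as a quick sketch using the figures above:

          # Lower bound on Autopilot trips from total miles and best-case range.
          autopilot_miles = 100e6   # miles on Autopilot as of May (per the linked article)
          best_case_range = 300     # maximum miles per charge

          min_trips = autopilot_miles / best_case_range
          print(f"Minimum possible trips: {min_trips:,.0f}")   # 333,333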

    • by I4ko ( 695382 )
      ??? 57>>>>1.65..
    • by Rei ( 128717 )

      What's the fatality rate in South Carolina for luxury sports sedans?
      How do South Carolina's collision rates compare to the places where Teslas actually are (or are you suggesting that all Teslas are in South Carolina)?

    • It may not indicate what you think. It's possible that Tesla drivers are just bigger d-bags and worse drivers than the average joe blow. It might have nothing to do with autopilot at all. In my part of the country, it's generally assumed that if you drive a fancy car, you have to drive it like a complete asshat.
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Thursday September 15, 2016 @09:51AM (#52893009)
    Comment removed based on user account deletion
    • by Anonymous Coward

      Musk is a Silicon Valley software guy. They have no concept of making things and they are used to an industry where you can sell people defective crap, have them find the bugs and then sell them a new version that fixes the crap that shouldn't have been shipped in the first place. Also SV people have this knack for over-hype that leads people to have expectations that the product cannot deliver.

      AND, Musk has Space X and Tesla going.

      Even if he were a competent manager, he is stretched too thin.

      • Take a look at those 'falcon wing' doors. No sensible engineer thinks those are a good idea. They're a nightmare production problem and an even bigger nightmare maintenance headache. They are there for visual and 'wow' impact.
    • by BostonPilot ( 671668 ) on Thursday September 15, 2016 @10:14AM (#52893193) Homepage

      Expecting Tesla to survive the avalanche of product liability suits that are coming is crazy. Musk appears oblivious to the problem. This is not a PR issue. There are numerous chinks in Tesla's armor that will be pried open and exploited by plaintiff lawyers. The company is toast.

      I'm glad you mentioned this. Just this weekend I was telling another pilot that I don't understand the strategy. The goal of Tesla was to bring electric vehicles to the masses. How are they going to do that when they get sued into oblivion? A conservative approach would have been to offer assist technologies similar to what their competition (other luxury brands) was offering. Instead, Elon has acted like it's Autopilot that's selling Tesla cars. I think people like Autopilot, but they would buy the car even if it had a much less aggressive auto-drive system, because the real value is in the electrification of the car, not the autopilot system.

      It's not all that dissimilar to his falcon wing door misstep, except that falcon wing doors did not present an ongoing risk of expensive lawsuits.

      So far the accidents have been such that the Tesla driver was the one who got hurt. What happens when a Tesla hits another car and kills everybody inside? How is Tesla going to avoid the liability? Yeah, sure, the driver should have been paying attention, but at least in the US Tesla will still get named in the lawsuit, and when they lose guess who is going to have to pony up the majority of the settlement? Hint: it won't be the driver.

      The good news is that Elon may have already jump-started the electric car industry and even if Tesla gets sued out of existence we may have enough momentum for the other car companies to keep moving forward.

      • Comment removed based on user account deletion
        • I think he may actually have an inkling of the fact that Tesla is doomed already and that's why the Mobileye announcement. Typically, if they thought they could weather this, they would join at the hip and offer a common defense and probably announce more cooperative deals.

          Tesla is a niche player in the automotive industry, and no supplier wants to be married to a niche player. So... no.

          Mobileye is now blaming Tesla for its implementation, to absolve themselves of liability.

          Which is what they would do as an independent supplier no matter what.

          • Comment removed based on user account deletion
            • Tesla is a niche player in the automotive industry, and no supplier wants to be married to a niche player. So... no.

              They might not throw Tesla under the bus if the business was valuable enough. Apparently it isn't.

              Yes, very good, that's what I said. On one hand, they've got Tesla, which is selling a tiny handful of cars compared to the other hand... potentially, everyone else. They might even wind up supplying Tesla again someday, since this kind of dialogue is SOP for the industry. Nobody wants to be at fault for anything, predictably.

      • You have to wonder how many millions they sunk into this autopilot system and if it even 'sold' a single car. What a terrible business choice.
    • by ripvlan ( 2609033 ) on Thursday September 15, 2016 @11:44AM (#52893913)

      I'll agree with the suggestion, although I think Musk has enough influence to survive and make it go away. If real problems began, he'd probably leave to pursue other opportunities and new management would right the ship.

      As a person who works in a regulated environment: you can't make claims about something that aren't proven. The product must be specifically designed and tested for these *uses*. Read the back of a Tylenol/aspirin bottle: "This product is intended for the temporary relief of pain caused by... (etc.)" It doesn't cure cancer. If a salesperson tried to hint that maybe it did, they'd be strung up and fined (the drug industry has many examples of this).

      However, Tesla apparently isn't regulated in this space. They can hint and suggest. They can say, "It is so good that most of the time it works as an autopilot self-driving system... but don't try it at home." It wasn't specifically designed to do this, so they shouldn't be able to hint at it. Customers don't understand what this means: the darn thing works most of the time, and they get used to it working.

      Since the autopilot is designed to assist the driver, the computer should monitor the driver and verify they are paying attention, or pull the car over. Or it should NOT take over the wheel for indefinite periods of time. Consumers get used to this "not an approved use" behavior and begin to trust it, even making up their own uses ("hey look, I can take a nap").

  • It is very possible that this whole mess really is all Tesla's fault, but I also can't help but wonder if Mobileye just threw them under the bus to protect their own reputation.
    • Any computer vision system that does not use LIDAR as its primary sensor does not belong in a vehicle capable of causing harm. Each death that has occurred in autonomous vehicles thus far is due to inadequate sensors being used.

      I am baffled that someone really thought it was a good idea to install these on production vehicles knowing these limitations.
      I haven't looked, but I'd imagine that even Mobileye intended for these to be installed in vehicles.

  • Mobileye doesn't want the liability exposure in that market, whether or not their product actually fulfills the role that Tesla is using it for.
  • by BenJeremy ( 181303 ) on Thursday September 15, 2016 @09:57AM (#52893069)

    Sour grapes from a former vendor. Mobileye would sell cameras to blind people if they could. Vendors are not leading any auto program in the industry... 2nd and 3rd tier vendors are even worse, and require constant attention, or they will deliver poor quality and unsafe products.

    More likely they raised their prices and Tesla balked at the price and moved to another vendor.

    • Sour grapes from a former vendor. Mobileye would sell cameras to blind people if they could. Vendors are not leading any auto program in the industry... 2nd and 3rd tier vendors are even worse, and require constant attention, or they will deliver poor quality and unsafe products.

      More likely they raised their prices and Tesla balked at the price and moved to another vendor.

      It was actually space aliens, because that seems like the most likely reason to me.

  • The problem with Mobileye's view is that no matter what you call it, people will treat it like the car drives itself. Mobileye's CTO said, "No matter how you spin it, (Autopilot) is not designed for that. It is a driver assistance system and not a driverless system." They'd like to differentiate the two, but the line is very blurry, and fading more every day. Mobileye's disclaimer is no more indemnifying than Tesla's, "continuously educated customers on the use of the features, reminding them that they're responsible to keep their hands on the wheel and remain alert and present when using Autopilot."
    • Not really. Mobileye's product is not just hardware but software. Sadly, the software is crap, which is why Tesla used the hardware but never used ME's software.
    • reminding them that they're responsible to keep their hands on the wheel

      If Tesla were serious about that, they would put a dead-man switch on the steering wheel.
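
      A dead-man switch here could be as simple as a hands-on-wheel timeout. A minimal sketch of the idea (the thresholds and function names are hypothetical, not Tesla's actual behavior):

        # Hypothetical dead-man-switch policy: escalate from a warning to a
        # controlled stop when the wheel's torque sensor reports no driver input.

        WARN_AFTER = 10.0        # seconds without detected hands (hypothetical)
        DISENGAGE_AFTER = 25.0   # seconds before slowing to a stop (hypothetical)

        def deadman_action(hands_on_wheel: bool, idle_seconds: float) -> str:
            """Return the control action for the current hands-off duration."""
            if hands_on_wheel:
                return "ok"             # driver input detected; caller resets the timer
            if idle_seconds >= DISENGAGE_AFTER:
                return "slow_and_stop"  # hand control back safely
            if idle_seconds >= WARN_AFTER:
                return "warn"           # chime and flash the display
            return "ok"

        print(deadman_action(False, 12.0))  # 12 s hands-off -> "warn"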

  • by bagboy ( 630125 ) <neo@nOSpam.arctic.net> on Thursday September 15, 2016 @10:04AM (#52893129)
    because there isn't much to run into in the air, and flights are required to file a flight plan so they have clear airspace. Even then, you always have a pilot at the ready. And this has been around for decades. Letting a computer be in full control of your life on the ground at high speeds is foolish.
  • We don't even understand how humans make split-second decisions while driving, let alone know how to replicate that decision-making in software. So programming a computer to do this is a completely random act. This is not AI, and anyone who says it is, is now an accomplice to manslaughter.
    • by AuMatar ( 183847 )

      We don't need to, because the program doesn't need to make the decision in the same way; it just needs to come to a correct outcome. Basically it needs to be able to process the images/radar info/other input and come to a decision as to whether it's about to hit anything and, if so, what to do about it. That is something we're becoming capable of doing (and improved image recognition will push it along). But the path taken to get there can be completely divergent from how humans think.
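
      For example, here is a minimal sketch of that "correct outcome" framing: a simple time-to-collision check over fused sensor readings (the threshold and values are hypothetical):

        # Rough time-to-collision (TTC) decision: no model of human reasoning,
        # just distance and closing speed from the sensor stack.

        BRAKE_TTC = 2.0   # seconds; hypothetical braking threshold

        def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
            """Seconds until impact if nothing changes; inf if not closing."""
            if closing_speed_mps <= 0:
                return float("inf")
            return distance_m / closing_speed_mps

        def decide(distance_m: float, closing_speed_mps: float) -> str:
            ttc = time_to_collision(distance_m, closing_speed_mps)
            return "brake" if ttc < BRAKE_TTC else "continue"

        print(decide(30.0, 20.0))  # 1.5 s to impact -> "brake"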

      • Car AIs think in the same way that a web server thinks about sending out a page to a requesting computer.
      • How do you tell the difference between a blowing plastic bag and a bouncing kid's ball on a side street?

        Highways are the easy part of the automated driving problem.

  • by Steve1952 ( 651150 ) on Thursday September 15, 2016 @10:26AM (#52893289)
    I find the timing interesting. After the Tesla crash, Mobileye admitted that their system can't distinguish cars or trucks entering the main road from a side road. They then said it would take several years to implement this functionality. Then they "dropped" Tesla. It looks to me more as if the Mobileye product had a hidden defect. If Mobileye had publicised this problem in advance of the crash, it is likely that Tesla and the other car manufacturers considering Mobileye would have had a better understanding of its limitations, and could have adjusted their plans accordingly.
    • Yeah, there's never, ever been a case where a company has used a component for a purpose it wasn't designed for, or exaggerated the capabilities of a component it used. More likely, Tesla picked Mobileye because it was closest to being able to do what they intended, and they assumed they could provide the missing smarts in code. Mobileye is now understandably worried about liability.
  • Seriously, it is obvious that Mobileye could not do that. Otherwise, Tesla would not have put in all this work on software to go much, much further than what Mobileye's software did. Tesla only used them for hardware, not their crappy software.
  • Prepare for the onrush of /. Musk worshipers defending Teslas and everything Musk does and says...

    • by fnj ( 64210 )

      Will they balance the mountain of mouth-breathing ignorant knee-jerk haters?

  • How hard did Elon fuck this guy's company that he comes out with this publicly?

    My guess is this company said "you can't really do what you're trying to do with our stuff", Elon said 'make it work or else' and then implemented the 'or else' when they either failed or declined.

  • by cloud.pt ( 3412475 ) on Thursday September 15, 2016 @11:40AM (#52893879)

    One of the few things I will take for granted from Elon is his vision that if EVERY car on the road follows SOLIDLY PROGRAMMED RULES (and the sensors, of course, do not all catastrophically fail, frequently), you will see a drastic decrease in car accidents, maybe even their statistical elimination. Everybody has this misconception that automated "piloting", whatever its form, will eventually create harm, either by outright failure or by being so right that it eventually decides the "crew" is itself a harm. Fact of the matter is, everybody is just afraid of acknowledging their own imperfection, and of losing their jobs and their economy, because the definition of automation is exactly that: replacing people with a better, cheaper, and easier process. We have robots flying millions of miles to other planets without much issue. Yet the main reason we don't send humans on first missions of anything is not that humans are worse; it's that they are a liability to lose, in a complexity of ways that cannot be controlled at all, and public opinion is very powerful at killing any idea it presumes wrong.

    I believe Elon is damn right that it is necessary to take risks in driving automation, and the holy grail in that field is to move the human brain and human action 100% out of the equation, for the simplest reason of them all: the driver, unlike the computer, does not always have his safety as the first priority, whether by willfully doing something else or by distraction. We're talking big car companies here, not a service provider for a still-small car producer. Small companies cannot fathom handling such liability; oftentimes they don't even have the financial or legal capacity to burden themselves with an established legal defense. Ultimately the driver is liable in 99% of the litigation over accidents TODAY, because HE IS MAKING ALL DECISIONS IN REAL-TIME. Drivers don't stand a chance, really. Judges rarely side with the driver in litigation "against a car", and when they do, it usually makes it to national television.

    Elon has been risking it with both Tesla and SpaceX because he knows he has, to some extent, the money (or the ability to direct others' money) to put toward something bold. This is not courage, as Apple likes to call it; it's calculated risk assessment with a very high return and a smaller-than-usual probability of success. Nobody wants that kind of bet unless they're either truly altruistic or in the business of not having a standardized existence in this world. And guess what, that is just fine by me, and I won't blame him for trying to be great.

  • Rename it ... (Score:4, Insightful)

    by Monoman ( 8745 ) on Thursday September 15, 2016 @11:47AM (#52893929) Homepage

    Rename it to something like Copilot or Driver Assist. They can say what they want about how Autopilot should be used but the name suggests otherwise.

  • I find it really bizarre that Tesla uses the logic that people are imperfect drivers, so we need automated driving... and then expects those same people to be even more perfect at staying vigilant at the wheel while there is nothing to do. Yes, people are not perfect, so design a system that is foolproof or leave it alone.
