Software / Transportation / Technology

Elon Musk Rolled Out Autopilot Despite Engineers' Safety Concerns, Says Report (theverge.com) 195

An anonymous reader quotes a report from The Verge: When Elon Musk announced last fall that all of Tesla's cars would be capable of "full autonomy," engineers working on the suite of self-driving features, known as Autopilot, did not believe the system was ready to safely control a car, according to the Wall Street Journal. The WSJ report sheds more light on the tension between the Autopilot team and Musk. CNN previously reported in July that Musk "brushed aside certain concerns as negligible compared to Autopilot's overall lifesaving potential," and that employees who worked on Autopilot "struggled" to make the same reconciliation.

A major cause of this conflict has apparently been the way Musk chose to market Autopilot. The decision to refer to Autopilot as a "full self-driving" solution -- language that makes multiple appearances on the company's website, especially during the process of ordering a car -- was the spark for multiple departures, including that of Sterling Anderson, who was in charge of the Autopilot team during last year's announcement. Anderson left the company two months later and was hit with a lawsuit from Tesla alleging breach of contract, employee poaching, and theft of data related to Autopilot, though the suit was eventually settled. A year before that, shortly before the original rollout, a lead engineer had warned the company that Autopilot wasn't ready to be released. Evan Nakano, the senior system design and architecture engineer at the time, wrote that development of Autopilot was based on "reckless decision making that has potentially put customer lives at risk," according to documents obtained by the WSJ.

  • The current hardware doesn't have side-facing cameras (or lidar or radar) on the front sides of the car. There are cameras in the front side fenders (in the logo), but they face backwards. They need side cameras close to the front of the car and as high up as possible, because before proceeding onwards from a stop sign you need to see what's coming at you from the left or right side. The camera mounted in the middle post of the windows doesn't have an adequate view.

    • There are side-facing cameras in the B pillars. That is only a few inches further back than a human driver's eyes, and at approximately the same height.

      • That's not good enough. We want it to be better than what a human driver would be able to work with. 40% of fatal accidents are at stop signs -- mostly side impact. That is, tens of thousands of people are killed in side impact collisions every year. Reference: https://safety.fhwa.dot.gov/in... [dot.gov]

        Anything to eliminate that would be good. If a collision is imminent, the early warning may help the car decide to speed up or brake so that the passenger compartment stays safe. Having a camera in the B pillar may help with that.
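
        The occlusion point above is just a sight-triangle calculation. Here is a minimal sketch (all dimensions invented, not Tesla's actual geometry) of how the visible stretch of cross street shrinks as the camera moves rearward from the nose:

```python
# Sight-triangle sketch: how far down a cross street a camera can see past
# a corner obstruction, as a function of how far back it is mounted.
# All dimensions are invented for illustration.

def visible_cross_distance(setback_m, corner_ahead_m, corner_side_m, lane_ahead_m):
    """Distance down the cross street visible past the obstruction corner.

    setback_m      -- camera distance behind the car's nose
    corner_ahead_m -- obstruction corner, metres ahead of the nose
    corner_side_m  -- obstruction corner, metres to the side of the camera axis
    lane_ahead_m   -- near cross-traffic lane, metres ahead of the nose
    """
    # Extend the ray from the camera through the corner out to the cross
    # lane; similar triangles give its lateral reach there.
    return corner_side_m * (lane_ahead_m + setback_m) / (corner_ahead_m + setback_m)

# A parked car blocks the view: its corner is 1 m ahead of the nose and
# 2 m to the side; the near cross lane is 4 m ahead of the nose.
for label, setback in [("front-bumper camera", 0.3), ("B-pillar camera", 2.0)]:
    d = visible_cross_distance(setback, 1.0, 2.0, 4.0)
    print(f"{label:>19}: sees {d:.1f} m down the cross street")

# front-bumper camera: sees 6.6 m down the cross street
#     B-pillar camera: sees 4.0 m down the cross street
```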

  • Autopilot! (Score:2, Troll)

    by sexconker ( 1179573 )

    You know, a country with money's a little like a mule with a spinning wheel. No one knows how he got it and danged if it knows how to use it.

    Heh-heh, mule.

    The name's Musk, Elon Musk. And I come before you good people tonight with an idea. Probably the greatest... Aw, it's not for you. It's more of a China idea.

    Now, wait just a minute. We're twice as smart as the people of China. Just tell us your idea and we'll give you subsidies for it.

    All right. I'll tell you what I'll do. I'll show you my idea. I give yo

  • by RhettLivingston ( 544140 ) on Thursday August 24, 2017 @07:02PM (#55079097) Journal

    I've been in engineering organizations releasing new products that had life saving or threatening potential. It is always an agonizing, scary hard call as to when you've passed the threshold of risk.

    There is a bell curve with a peak, and you rarely hit the peak. If you make the call too late, you cost the lives of those you might have saved; too soon, and you cost the lives of those who might have saved themselves.

    Even if you hit the peak perfectly, you'll always be able to truthfully argue that some people are being saved who would have died and some are dying who would have lived. The peak is a point of balance between the two -- not a perfect elimination.
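
    A minimal sketch of that bell curve, with an entirely invented maturity curve and invented rates, makes the trade-off concrete: ship too early and the immature system's harm dominates; ship too late and you forfeit most of the window in which it could save anyone.

```python
# Toy model of the release-timing curve. All numbers are invented.
import math

HORIZON_YRS = 10.0   # window over which we tally the effect
SAVE_RATE   = 100.0  # lives/year saved by a fully mature system
HARM_RATE   = 300.0  # lives/year lost to a fully immature system

def maturity(t):
    """Fraction of the defects worked out after t years of development."""
    return 1.0 - math.exp(-t / 2.0)

def net_lives(release_t):
    """Net lives saved over the horizon if we ship at release_t."""
    m = maturity(release_t)
    rate = SAVE_RATE * m - HARM_RATE * (1.0 - m)  # lives/year once shipped
    return (HORIZON_YRS - release_t) * rate

best = max((t / 10.0 for t in range(0, 101)), key=net_lives)
print(f"peak of the curve: ship at ~{best:.1f} years for {net_lives(best):+.0f} lives")
# -> peak of the curve: ship at ~5.2 years for +337 lives
```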

    I can remember many times hating my bosses when they released a product that I didn't feel was ready. As an engineer, I have to be over-focused on the problems, and I stand no chance of seeing when I am perfectly perched on that probability peak. They had to pry the projects from my hands to get them out the door. I actually begged in tears once. But in retrospect, I can't think of any case where my bosses weren't right to release the product I was concerned about.

    What we need to force progress is for attorneys to get smart and start figuring out how to file more effective suits for lack of progress toward autonomy. How many are dying today because we don't have it? We need to focus hard on that.

    • As an engineer,...

      You are obviously not an engineer in the sense I'm familiar with in the UK, i.e. chartered and, at a minimum, a member of your professional institute: mech, civil, electrical, whatever; they've all got their own professional body. Except software "engineers", of course.

      I've been in engineering organizations releasing new products that had life saving or threatening potential. It is always an agonizing, scary hard call as to when you've passed the threshold of risk.

      The fact that you agonize

      • by RhettLivingston ( 544140 ) on Friday August 25, 2017 @01:18AM (#55080665) Journal

        Having the license in this country is often career-ending, much like having a PhD: it can make it very difficult to get a job. I've been in corporations that had thousands of engineers and never met anyone I knew to have it. I think license-holders tend to be concentrated in certain structural and mechanical areas, and in most civil and architectural engineering; the electrical, aeronautical, and computer engineering professions have much less of this.

        Regardless, there is no such thing as a vehicle on the road today that does not make some safety compromise. Not one single vehicle uses the best-known safety mechanism for every single aspect of the car. No one could buy it if they did, and it wouldn't meet other necessary criteria if every compromise was made in the safety direction. Our government often has to force the matter by making regulations like the ones coming down the pipe soon to require all vehicles to have automatic braking technology. This is tech that has been available for a while, but many engineers must be signing off on vehicles that are killing people; otherwise, the government wouldn't have to be stepping in.

        Everything engineered makes these compromises. For example, every building might be designed to handle a 500-year quake, but what happens if a 5,000-year quake comes along?
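
        The "N-year quake" framing hides how likely such an event actually is over a building's life. A quick sketch using the standard return-period arithmetic (assuming a 50-year design life):

```python
# A "500-year quake" is a return period, not a guarantee: the chance of
# seeing at least one such event during a building's life is substantial.

def exceedance(return_period_yrs, lifetime_yrs):
    """P(at least one event of the given return period within the lifetime)."""
    return 1.0 - (1.0 - 1.0 / return_period_yrs) ** lifetime_yrs

for period in (500, 5000):
    print(f"{period}-year quake over a 50-year life: {exceedance(period, 50):.1%}")

# 500-year quake over a 50-year life: 9.5%
# 5000-year quake over a 50-year life: 1.0%
```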

        Airbags are an interesting example. Even the best airbag systems kill some people who would not have died without airbags. But they save many more that would have. So, you accept the compromise. Seatbelts made the same trade-off many years ago and still do; yet we have them, and are even required by law in most places to wear them.

        With the autonomous vehicle question, the tech is ready to deploy when it will save more people than it will kill, measured against human drivers (all of them, not just the competent ones). To wait any longer would be to kill the people it might have saved. Of course, determining when that point arrives is nearly impossible. Either the hard call gets made, or the vehicles never will be, because engineers will never be able to say of any product that there is no situation in which it is flawed -- often a situation in which the consumer is misusing it.
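
        That break-even test is simple arithmetic once you pick a baseline. The US human fatality rate was roughly 1.2 deaths per 100 million vehicle miles around this time; the fleet mileage and AV rates below are invented for illustration:

```python
# Break-even test from the comment above: the system is a net win once its
# fatality rate per mile drops below the all-driver human baseline.

HUMAN_RATE = 1.2e-8          # deaths per mile, all human drivers (approx. US)
FLEET_MILES_PER_YEAR = 5e9   # hypothetical autonomous fleet mileage

def expected_deaths(rate_per_mile, miles):
    return rate_per_mile * miles

human = expected_deaths(HUMAN_RATE, FLEET_MILES_PER_YEAR)
for av_rate in (2.0e-8, 1.2e-8, 0.6e-8):  # worse, equal, twice as good
    av = expected_deaths(av_rate, FLEET_MILES_PER_YEAR)
    print(f"AV at {av_rate:.1e}/mile: {av:.0f} vs {human:.0f} deaths"
          f" -> net {human - av:+.0f} lives/year")

# AV at 2.0e-08/mile: 100 vs 60 deaths -> net -40 lives/year
# AV at 1.2e-08/mile: 60 vs 60 deaths -> net +0 lives/year
# AV at 6.0e-09/mile: 30 vs 60 deaths -> net +30 lives/year
```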

        Realistically, we do wait longer than the point of net balance because the public does not understand statistically-based decisions very well. When it is your family member who died because the tech failed, you want to blame the tech without looking at the whole picture. We often don't even know when our family member died because the tech that could have saved them was held back by over-engineering.

        Often, these hard decisions are the reason for regulation -- not to protect the public, but to give companies protection in deploying something that a big-picture organization like the government has determined will be a net benefit to the public while being a detriment to some individuals. The engineers then have the excuse of having met the regulation. It seems to sit better with our minds.

        Absent specific regulations and tests to target (which is the ideal situation in a free society), the business leaders are usually the ones who make the tough calls.

        • Our government often has to force the matter by making regulations like the ones coming down the pipe soon to require all vehicles to have automatic braking technology.

          Too bad that's a garbage example, since the automakers voluntarily chose to implement AEB by that time, without being forced. The example you want is seatbelts. And actually, many government safety mandates are crap. The rears of vehicles are creeping upwards to meet rear-impact crash test requirements; that's a natural process. But the fronts of vehicles are being mandated to specific dimensions in the name of passenger safety. Instead of having test requirements to meet, the government is forcing specific

        • by Whibla ( 210729 )

          A well-thought-out and reasoned response to an inflammatory post.

          Airbags are an interesting example. Even the best airbag systems kill some people who would not have died without airbags. But they save many more that would have. So, you accept the compromise.

          Perfect example!

          Realistically, we do wait longer than the point of net balance because the public does not understand statistically-based decisions very well. When it is your family member who died because the tech failed, you want to blame the tech without looking at the whole picture. We often don't even know when our family member died because the tech that could have saved them was held back by over-engineering.

          Often, these hard decisions are the reason for regulation -- not to protect the public, but to give companies protection in deploying something that a big-picture organization like the government has determined will be a net benefit to the public while being a detriment to some individuals. The engineers then have the excuse of having met the regulation. It seems to sit better with our minds.

          Absent specific regulations and tests to target (which is the ideal situation in a free society), the business leaders are usually the ones who make the tough calls.

          And insightful!

          Just wanted you to know, your efforts were appreciated. :-)

      • Whenever someone starts getting sanctimonious about safety, I ask them whether they fit the best possible tyres to their car for the next journey. If not, then they are prepared to sacrifice safety for cost or convenience.

      • by AmiMoJo ( 196126 )

        In practice, what happens is that the engineer certifies only very limited use cases in controlled environments, and the management/sales people push it further. Then, a few years later in court, the engineer produces their documentation to show that they didn't support it being used that way and tried to warn people of the impending disaster.

  • Licensed engineers may be needed for autodrive software, or something like it.
    The FAA does code audits on autopilot software.

    • Licensed engineers may be needed for autodrive software, or something like it. The FAA does code audits on autopilot software.

      It's not exactly a code audit; it's more that the FAA certifies code to a certain level of robustness. For commercial airline software to get certified, the developers generally have to prove that every line of code has been covered by tests, and that every branch has been both taken and not taken. There are even higher levels of certification (usually for the OS), where the code must be symbolically expressed and mathematically proved to be correct. Not an inexpensive undertaking.
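
      As a toy illustration of the "every branch taken and not taken" requirement, here is what branch coverage means in ordinary terms, using Python's coverage.py as a stand-in for an avionics toolchain; the clamp function and its limits are invented:

```python
# Each `if` must be observed both taken and not taken for branch coverage.

def clamp_airspeed(knots, floor=120, ceiling=350):
    """Clamp a commanded airspeed to a safe envelope (invented limits)."""
    if knots < floor:        # branch 1
        return floor
    if knots > ceiling:      # branch 2
        return ceiling
    return knots

# One test per branch outcome; run with: coverage run --branch -m pytest
def test_below_floor():
    assert clamp_airspeed(80) == 120    # branch 1 taken

def test_above_ceiling():
    assert clamp_airspeed(400) == 350   # branch 1 not taken, branch 2 taken

def test_in_envelope():
    assert clamp_airspeed(250) == 250   # both branches not taken
```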

  • This is a prime example of what I'm talking about when I say so-called 'self-driving cars' are being rushed to market.
    NONE of them are really ready, and they won't be for quite some time -- if ever.
    Meanwhile, people really don't want them anyway. [cnbc.com]
    • Cherry pick data much?

      "However, just over 70 percent would ride in a car that was partially autonomous. Gartner defined partially autonomous vehicles as those that could drive autonomously, but allow a driver to retake control of the car if needed."

      That describes the Tesla Autopilot.
    • The article you cite says people don't want autonomous cars because they don't trust them. As the technology improves, trust will also improve.
  • Electric cars cool, self-driving cars bad.

    If Musk really wants to "save the planet", drop the self-driving crap already. It makes the car more expensive, so fewer people can afford one, meaning they keep their old polluting car or even buy a brand-new polluting car.

    • Electric cars cool, self-driving cars bad.

      If Musk really wants to "save the planet", drop the self-driving crap already. It makes the car more expensive, so fewer people can afford one, meaning they keep their old polluting car or even buy a brand-new polluting car.

      I disagree: self-driving cars have an even better ability to "save the planet" than electric cars. But I would say "Electric cars cool, self-driving cars cooler, but currently unrealistic." That said, unrealistic is not in Elon's lexicon.

      • I disagree, self-driving cars have an even better ability to "save the planet" than electric cars.

        They really don't, because they're still cars, and cars still suck. PRT (personal rapid transit) would be dramatically superior, and it would even permit the existing automakers to continue to exist (just as AVs will), but big changes are scary.
