Transportation Technology

NTSB Cites Tesla To Make the Case For Stricter Autonomous Driving Regulation (engadget.com)

An anonymous reader quotes a report from Engadget: The National Transportation Safety Board (NTSB) is calling on its sister agency to implement stricter regulation related to automated vehicle technology. In a letter it sent to the National Highway Traffic Safety Administration (NHTSA) at the start of February, the NTSB says the regulator "must act" to "develop a strong safety foundation." What's notable about the document is that NTSB chair Robert Sumwalt frequently cites Tesla in a negative light to support his department's suggestions. The automaker is referenced 16 times across the letter's 15 pages.

For instance, in one section, Sumwalt writes of NHTSA's "continued failure" to implement regulations that would prevent driver-assist systems like Autopilot from operating beyond their intended use. "Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV control system's limitations," Sumwalt writes. "For example, Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability. By releasing the system, Tesla is testing on public roads a highly automated AV technology but with limited oversight or reporting requirements."

  • Pros and cons (Score:5, Interesting)

    by enriquevagu ( 1026480 ) on Friday March 12, 2021 @04:49PM (#61152304)

    On the one hand, Tesla is making very fast progress by releasing this system to the public.

    On the other hand, the NTSB is absolutely and completely right...

    • Re:Pros and cons (Score:4, Interesting)

      by quonset ( 4839537 ) on Friday March 12, 2021 @04:51PM (#61152318)

      On the one hand, Tesla is making very fast progress by releasing this system to the public.

      On the other hand, the NTSB is absolutely and completely right...

      What Tesla is doing is akin to what Microsoft does with its updates except with more serious ramifications. "Some of you may die, but that's a sacrifice I'm willing to make."

      • Worse, Tesla's marketing makes it sound like the system has more capabilities than it actually has. "Fully self driving?" No.

      • Re:Pros and cons (Score:5, Informative)

        by nbvb ( 32836 ) on Friday March 12, 2021 @05:04PM (#61152346) Journal

        Number of people injured by Tesla's FSD package being tested? Exactly zero.

          • Re: (Score:2, Informative)

            by dgatwood ( 11270 )

            Exactly zero. "Full self driving" is generally defined as "all driving features have been released". City driving has not yet been released, and there are several other missing pieces, such as hand gesture recognition, that will be required before it can be called FSD. Therefore none of those deaths involved anything even approaching full self driving.

            • by Anonymous Coward
              There have been a total of 6 incidents in the last eight years where Autopilot was verified to be in use, with millions of Teslas on the road. But of course, they also tell you to always watch the road and be ready to take over at any time.
          • Re:Pros and cons (Score:4, Insightful)

            by jbengt ( 874751 ) on Friday March 12, 2021 @06:15PM (#61152536)
            You linked a list of 144 fatal crashes involving Teslas, with 15 claimed to have involved Autopilot and six of those confirmed by Tesla as having had Autopilot on.

            Of the 15 claimed to involve Autopilot, one is described as "Driver mistakenly believes Autopilot is on".
            Of the 6 verified as involving Autopilot, one is described as "Sleeping Tesla driver kills motorcyclist".

            I'm not a fan of Tesla's Autopilot hyperbole, but as others often point out, we need more extensive actual accident statistics comparing Autopilot-assisted and un-assisted driving.
            • we need more extensive actual accident statistics comparing Autopilot-assisted and un-assisted driving

              Isn't that precisely what the NTSB is requesting? From TFS:

              Tesla is testing on public roads a highly automated AV technology but with limited oversight or reporting requirements.

              You'd think, with Tesla being 1,000% connected to all of its cars' sensors, data would be the last thing the government would have a difficult time obtaining from them.

              • No. What the NTSB wants is for NHTSA to set up an environment of controlled experiments to gauge the efficacy of self-driving systems, like a car track with obstacles or something to evaluate real-world conditions. Not something that puts untested software in control of cars on public roads to generate data, which has serious consequences if it's wrong.
            • I estimate the safety stats won't be much different from those of a modern car with proximity sensors.
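              On the "we need statistics" point above: the comparison itself is simple rate math once someone publishes trustworthy mileage denominators; getting the data is the hard part. A minimal sketch in Python, with entirely made-up placeholder counts (not real Tesla or NHTSA figures):

                  # Hypothetical illustration only: crash counts and mileage below
                  # are invented placeholders, not real Tesla/NHTSA data.
                  import math

                  def rate_per_million_miles(crashes, miles):
                      """Point estimate plus a rough 95% Poisson interval."""
                      rate = crashes / miles * 1e6
                      half_width = 1.96 * math.sqrt(crashes) / miles * 1e6  # crude approximation
                      return rate, max(rate - half_width, 0.0), rate + half_width

                  # Placeholder inputs -- replace with audited figures.
                  assisted = rate_per_million_miles(crashes=6, miles=3_000_000_000)
                  unassisted = rate_per_million_miles(crashes=500, miles=100_000_000_000)

                  for label, (r, lo, hi) in (("assisted", assisted), ("unassisted", unassisted)):
                      print(f"{label}: {r:.4f} per 1M miles (95% CI ~ {lo:.4f}-{hi:.4f})")

              The point of carrying the interval is that with single-digit event counts the error bars dwarf the point estimate, which is exactly why anecdotes cut both ways here.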
        • ... are you sure?

        • ... gotta wonder how many lives it's saved too. In any case, that's not really the point, which is that they should be regulated (before they injure people, not after!) and not be free to just do what they want.

          • That would matter to me only if I could choose whether I wanted to have my life saved or lost ahead of time.
            • by dwater ( 72834 )

              I'd imagine it might matter to you in a more societal sense, and perhaps your health and insurance premiums too.

                • Well, no. Unless they are significantly better, I don't believe it is worth it in a societal sense if people are being held responsible for accidents they didn't cause.
        • It looks like some of this is because Tesla isn't voluntarily doing things that the NTSB wants:

          Tesla (the manufacturer of the Williston crash vehicle) continued to permit AV operation outside the ODD . . . Tesla advised the NTSB that it believes that “ODD limits are not applicable for Level 2 driver assist systems, such as Autopilot, because the driver determines the acceptable operating environment.” In March 2019, because of Tesla’s lack of appropriate safeguards and NHTSA’s inaction, another fatal crash occurred in Delray Beach, Florida, under circumstances very similar to the Williston crash.

          If we wait long enough, Tesla's current FSD package will be involved in a fatal accident. I suppose the relevant question is how many autonomous miles are driven per injury compared to what one would expect of the typical Tesla owner demographic.

          But even assuming a net benefit to the public, the primary beneficiaries are the companies allowed to sell the feature on cars. So regulation and even mandatory information a

      • by Anonymous Coward

        On the one hand, Tesla is making very fast progress by releasing this system to the public.

        On the other hand, the NTSB is absolutely and completely right...

        What Tesla is doing is akin to what Microsoft does with its updates except with more serious ramifications. "Some of you may die, but that's a sacrifice I'm willing to make."

        Hey, the motto here in the Valley is move fast and break things -- don't start crying a river when humans end up being those things. ¯\_(ツ)_/¯

    • They are right, but in my experience, filing reports to regulating agencies is orthogonal to safety.

    • by cmarkn ( 31706 )
      The NTSB is not even on the same plane as right.
      What is necessary is systems that are just slightly better than human drivers. Those would save thousands of lives a year. Maybe they wouldn't save everybody, but isn't some better than none?
      Perfect is the enemy of good -- Voltaire
      • Nope. If I am paying for a system to do the work for me it had better be a lot better than me. And I had better not have to pay for insurance based on what damage I could do with the vehicle because I can't do *any* damage with the vehicle.
    • Re: (Score:3, Insightful)

      by locater16 ( 2326718 )
      Theoretically right. In practical terms, though, looking at actual statistics, it's not like Tesla's Autopilot is any worse than human error right now. It might actually be safer.

      source
    • by AmiMoJo ( 196126 )

      Tesla hasn't made that much progress. For example, their auto-parking is inferior to other manufacturers', e.g. Mobileye's.

      For full self driving Waymo is way out in front. I don't think Tesla will ever get there. Their idea of building up to it with just cameras is not going to work.

  • "We can promise that these vehicles will tested as rigorously as the automated systems on any airliner."

  • If they're going to cite Tesla, they should be aware that Tesla monitors every action of its self-driving software and continuously reports those actions, including car state, camera data, etc. That is how Tesla is building a giant database of situations for its neural net chips.
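    Nobody outside Tesla knows the actual schema or transport, but the kind of fleet-telemetry record described above could look roughly like this; every field name in the sketch is a guess for illustration, not Tesla's real API:

        # Purely illustrative guess at a fleet-telemetry event record;
        # Tesla's real schema, field names, and upload protocol are not public.
        import json
        import time
        from dataclasses import dataclass, asdict, field

        @dataclass
        class TelemetryEvent:
            vehicle_id: str      # anonymized fleet identifier (assumed)
            timestamp: float     # unix epoch seconds
            event: str           # e.g. "disengagement", "hard_brake"
            speed_mps: float     # vehicle speed at the event
            autopilot_engaged: bool
            camera_clip_refs: list = field(default_factory=list)  # pointers to clips, not raw video

        event = TelemetryEvent(
            vehicle_id="fleet-00042",
            timestamp=time.time(),
            event="disengagement",
            speed_mps=27.3,
            autopilot_engaged=True,
            camera_clip_refs=["clip://front/000123"],
        )
        print(json.dumps(asdict(event)))  # shape of a hypothetical upload payload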

  • Any needed software updates / repairs must be free for at least 6-10 years for cars.
    Any needed map updates must be free for the same time frame.
    If a car needs to be online, the car owner cannot be billed for any cell data overage or roaming fees.

  • The fact is, Tesla could simply make users acknowledge an on-screen agreement that they understand enabling the "full self driving" features means they're responsible for any accidents or errant behavior, so they need to keep a watchful eye on it at all times and take control as needed.

    The NTSB's responsibility for vehicle safety should only extend to ensuring the vehicles are constructed so they can perform the expected functions (steering, braking, accelerating, signaling turns, proper headlight illumination and so forth). When it crosses over to the realm of letting drivers opt for "autopilot" type driver assistance systems, I think it's reasonable to let the drivers use their own judgement there. Otherwise, I'm not sure why the NTSB doesn't start dictating whether or not new teenage drivers are allowed to drive vehicles on public roads, or act as the regulatory body telling older people when their reflexes are deemed too slow to safely operate one anymore?

    • Re: (Score:3, Insightful)

      by Cmdln Daco ( 1183119 )

      Certainly. And the vehicle should just sit there immobile until every other driver, pedestrian, etc. that the vehicle could possibly encounter also "clicks through" that screen.

    • When it crosses over to the realm of letting drivers opt for "autopilot" type driver assistance systems, I think it's reasonable to let the drivers use their own judgement there.

      The average user has no basis for developing judgement on whether to use autopilot in a particular situation or not.

    • by jbengt ( 874751 ) on Friday March 12, 2021 @06:35PM (#61152582)

      The NTSB's responsibility for vehicle safety should only extend to ensuring the vehicles are constructed so they can perform the expected functions (steering, braking, accelerating, signaling turns, proper headlight illumination and so forth). When it crosses over to the realm of letting drivers opt for "autopilot" type driver assistance systems, I think it's reasonable to let the drivers use their own judgement there. Otherwise, I'm not sure why the NTSB doesn't start dictating whether or not new teenage drivers are allowed to drive vehicles on public roads, or act as the regulatory body telling older people when their reflexes are deemed too slow to safely operate one anymore?

      The NTSB [ntsb.gov] is not a regulatory body and has no authority to do the things you suggest they should or shouldn't do. Their legislative mandates are:
      - Maintaining congressionally mandated independence and objectivity;
      - Conducting objective, precise accident investigations and safety studies;
      - Performing fair and objective airman and mariner certification appeals;
      - Advocating and promoting safety recommendations;
      - Assisting victims of transportation accidents and their families.

  • Aww, that's too bad. I was looking forward to the constant, slower traffic as it tries to distinguish a child's runaway ball from a tumbleweed, a twig from a downed power line (or even a 12-inch diameter tree branch).

    Not to mention the bloodbath...one wrong bit at speed and kablewy! (I had to look that word up; I had never typed it!)

    People keep saying these will be better than human drivers. Spoiler: These cars will never be perfect, and even though it's been assumed they'll be better than humans, that remains to be seen; and I know what my wager would be.
    • even though it's been assumed they'll be better than humans, that remains to be seen; and I know what my wager would be.

      Waymo has driven 6.5 million miles and been responsible for 0 accidents. I don't think it's an assumption to say that it's proven better than a human.
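      To put rough numbers on that: at a ballpark human rate of ~2 crashes per million miles (an assumption; published figures vary widely with how a "crash" is counted), zero events in 6.5 million miles would be very surprising by chance. A back-of-envelope Poisson check in Python:

          # Back-of-envelope only: the human crash rate here is an assumed
          # ballpark, not an official statistic.
          import math

          human_rate_per_mile = 2 / 1_000_000
          waymo_miles = 6_500_000
          expected = human_rate_per_mile * waymo_miles  # ~13 crashes expected at the human rate
          p_zero = math.exp(-expected)                  # Poisson P(X = 0)
          print(f"expected: {expected:.1f}, P(zero crashes) ~ {p_zero:.1e}")

      The caveat is comparability: Waymo's miles are geofenced and weather-limited, so the human baseline for those same miles is probably lower than the fleet-wide average.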

  • The side-facing camera on the B-pillar is inadequate when it comes to seeing cross traffic from the right when crossing a one-way street at a stop sign. The passenger side B-pillar camera requires the car to jut too far into the intersection to see whether traffic is coming. It has to move forward at least 3 feet more than a human driver needs to lean forward to see traffic. It's not safe. Furthermore, if they added front cross-traffic cameras to the bumpers or front fenders, the car wouldn't have to creep into the intersection at all.

    • by crow ( 16139 )

      You're probably right, and having the same in the rear would likely be helpful for backing out of parking spaces. Many people have said the same thing. However, I'm withholding my judgement until I see exactly how the new FSD software behaves in practice.

    • by vix86 ( 592763 )

      I'm tossing my agreement into the hat on this one. The ideal location would be a fish-eye lens camera in the rear view mirror housing along with the other front-facing cameras. It'd be cool if they could put it up near the headlights so you could see much further ahead than the driver, but I'd worry about issues with keeping the camera clear in less than ideal conditions.

  • They are eager to create more regulation, but it's not clear what exactly they would be regulating.
  • To me the answer seems obvious: any autonomous-driving manufacturer should be comfortable taking all legal and financial liability for that car. If they aren't able to do that, then they aren't comfortable enough with it, and it is too dangerous to put a person in.
    • I do not disagree entirely, but I suspect that there will always be risk and the choices of humans will remain a key catalyst. If the companies assume the risk, they will end up valuing the risk and building it into the sticker price. Fine, but that means I end up paying for the moron who looks out his window, observes that conditions are unsafe to drive, but presumes the computer will "figure it out" and hops in anyway.
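      The "built into the sticker price" point is just expected-value pricing. A toy calculation with invented numbers, only to show the mechanics:

          # Toy expected-liability pricing; every number here is invented.
          crash_prob_per_year = 0.002   # assumed annual chance the maker is liable
          avg_payout = 250_000.0        # assumed average claim, USD
          ownership_years = 10

          risk_premium = crash_prob_per_year * avg_payout * ownership_years
          print(f"per-car risk premium: ${risk_premium:,.0f}")  # $5,000 with these inputs

      Everyone pays that premium, including the careful owner who would never hop in during a blizzard, which is exactly the parent's complaint.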
  • If any citizen wants to drive a car, then in every state they have to pass a proficiency test and obtain a license. So the first question would be to ask why we don’t yet have some form of national proficiency test for any whole or partial autonomous mode for a vehicle.

    The second question, based on the OP at least, would be to ask what issue Sumwalt has with Tesla. There is no shortage of companies out there today that are working on autonomous vehicles. Just on the face of it, we could be forgiven for asking why Tesla alone gets singled out.
    • Agree it would be appropriate to cite others attempting the same thing. But it is not unreasonable to observe that Tesla is unique in the scale and nature of what they are attempting (not to mention, vocal), so it does not surprise me that they are singled out in this way.
  • As far as I can tell, the only thing stopping them from going live with whatever they have is FUD, which sounds like a tempting situation, but doesn't necessarily mean they aren't fucked if it crashes.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...