
US Agency Orders Automated Vehicle Makers To Report Crashes (apnews.com)

The U.S. government's highway safety agency has ordered automakers to report any crashes involving fully autonomous vehicles or partially automated driver assist systems. The Associated Press reports: The move Tuesday by the National Highway Traffic Safety Administration indicates the agency is taking a tougher stance on automated vehicle safety than in the past. It has been reluctant to issue any regulations of the new technology for fear of hampering adoption of the potentially life-saving systems. The order requires vehicle and equipment manufacturers and companies that operate the vehicles to report crashes on public roads involving fully autonomous vehicles, or those in which driver assist systems were operating immediately before or during a crash.

"By mandating crash reporting, the agency will have access to critical data that will help quickly identify safety issues that could emerge in these automated systems," NHTSA Acting Administrator Steven Cliff said in a statement. The agency says it will look for potential safety defects, and the information could cause it to send out a crash investigation team or open a defect investigation. Companies have to report crashes involving fully autonomous or partially automated vehicles within one day of learning about them, if they involve a hospital-treated injury, a death, air bag deployment, pedestrians or bicyclists, or were serious enough for a vehicle to be towed away. Other crashes involving vehicles equipped with the systems involving injury or property damage have to be reported every month. The requirement does not apply to consumers who own vehicles or auto dealers. NHTSA says in a statement that the data can show if there are common patterns in crashes involving the systems.

  • by rmdingler ( 1955220 ) on Tuesday June 29, 2021 @09:10PM (#61535688) Journal

    This is one of those things you hear about that you can't believe isn't already implemented.

    What a kind nod to autonomous vehicle development... until you kill X amount of people, we'll not be recording the statistics.

    The price of progress?

    • Just blame the contracted safety driver with a homicide charge and hope they get a good plea bargain, so you don't have your logs, source code, docs, and build notes showing up in open court.

    • by rtb61 ( 674572 )

      What is happening is that vehicle autopilot is no longer considered stupid-proof and will likely be banned until vehicles are fully automatic and do not require any driver intervention. Vehicle autopilot invites and tempts drivers to not pay any attention to what their vehicle is doing, which is really quite stupid at the current stage of development. So they will likely go with an all-or-nothing approach, as being more stupid-proof.

      • by larwe ( 858929 ) on Wednesday June 30, 2021 @03:11AM (#61536316)

        will likely be banned until vehicles are fully automatic and do not require any driver intervention

        My belief is that this state will come to pass approximately never (never being either "until all roads are smartcar-enabled, all non-smartcars are banned, and it's the road, not the car, that controls the traffic" or "until the hard AI problem is solved" - both of which sound like a hard never to me). The closest fielded approximation we have to autonomous vehicles is modern aircraft autopilots. An aircraft spends a tiny amount of each journey in a high-stress environment navigating gates and runways and tall obstacles and departure/approach traffic patterns, and most of its time navigating from waypoint to waypoint quite freely in 3-space with a good deal of separation between it and other traffic; plenty of time to detect imminent collisions and avoid them through minor course changes. While almost all of that can technically be automated (and is), the vast majority of accidents happen during those takeoff and landing endpoints, even with a minimum of two highly trained, regularly tested operators handling the systems. An autonomous car spends its entire time in the equivalent of the takeoff/landing phase - there are limited degrees of freedom to move to avoid accidents (ground cars are 2D, and can't move laterally in an arbitrary fashion; a steering circle is only so tight), there are obstacles, including moving obstacles, everywhere - with only seconds or less of time to detect, identify, and avoid. And it's being operated by a single driver who may have qualified to drive 30 years ago and has not been recertified since.

        The 'lifesaving' statistics from autonomous features are to a certain degree based on the assumption that drivers will remain at least as attentive with the feature on as they were with the feature off - which is demonstrably not true. Again, my expectation is that the outcome of further study here will be much stronger regulation about having the car detect driver inattentiveness - the car needs to be _driven_ as if it was dumb, not merely _monitored_ by the driver.

        I am astonished that this data collection wasn't required as a condition of certifying any of these features/systems for operation on the highway. Adding these systems is effectively taking a percentage of the "license the driver" testing away and putting that onus on the car.

      • by Anonymous Coward
        Here's a pro-tip for you, old son:

        We won't ever have fully automated driving, because the useless, fake excuse for 'AI' that everyone's marketing departments keep trotting out and lying about will NEVER, NEVER EVER be anywhere near as good as a real human driver.

        Please, people, stop believing the hype and nonsense, and wake the hell up: self-driving cars are a meme, and falling for it will get you killed.

    • This is one of those things you hear about that you can't believe isn't already implemented.

      What a kind nod to autonomous vehicle development... until you kill X amount of people, we'll not be recording the statistics.

      The price of progress?

      Given that we humans can currently obtain some legal recourse against a horrible action (e.g. drunk/drugged driving causing death), we probably should be far more concerned about civil and criminal legal recourse being quietly obliterated with automation. If greed has its way, you'll be able to sue an autonomous machine provider about as easily as the US Government.

      And that's before you get to the corrupt finger pointing (car's fault, network's fault, network provider's fault, GPS fault, sun sp

      • by Pimpy ( 143938 )

        And this is exactly why things like event non-repudiation and dynamic risk assessment are some of the biggest operational issues facing autonomous vehicles intending to operate on public roads. At SAE level 3, the risk of automation complacency is high, while the vehicle is only providing driving assistance and requires the driver to maintain situational awareness. At level 4, environmental monitoring and risk analysis shifts to the vehicle, and at level 5, the driver may not even have a mechanism to interv

        • I have nothing more to add here, except an appreciation for your detailed input on the matter. Interesting to know how this is developing, and I agree with your concerns regarding liability. Thanks for the feedback.

          Actually, I do have one thought: those who take public transit today are merely riders on a bus, not in control of the vehicle. The bus crashes and kills a dozen riders, whether because of a negligent driver, a physical malfunction, or third-party involvement.

          What rights do you have to sue today?

          • by Pimpy ( 143938 )

            While I do work in dynamic risk modelling and adaptive systems for connected and autonomous vehicles, and therefore work closely with both insurance companies and regulators, as well as automotive OEMs, I am by no means an expert on insurance. That being said, in your bus scenario, I would imagine the following: If the driver is engaging in negligent behaviour (e.g. talking on their phone) and crashes, you would sue them personally, and their personal liability insurance would cover this with some cap (typi

    • by Aczlan ( 636310 )

      This is one of those things you hear about that you can't believe isn't already implemented.

      What a kind nod to autonomous vehicle development... until you kill X amount of people, we'll not be recording the statistics.

      The price of progress?

      Sort of like getting a new light installed at an intersection: no traffic study until there have been a certain number of fatal wrecks at the intersection...

      Aaron Z

    • It's not a problem until you have enough vehicles on the road to be a hazard to many people. Too much regulation too early would likely have killed the industry, which will eventually go on to save lives — or at least that's the thinking. Remember, the goal is literally zero road fatalities, although IIRC the target date is only "someday".

  • by MagicMike ( 7992 ) on Tuesday June 29, 2021 @09:30PM (#61535730) Homepage

    I'm a sometimes vulnerable road user (motorcycles, scooters, bicycles), but I also drive cars plenty so everyone calm down.

    I think it's clear that driving is disproportionately dangerous though, and most collisions are preventable and caused by us lame meat sacks doing one or more stupid things in combination.

    Imagine if that sort of behavior was tolerated in air safety!

    Now imagine the opposite: if air safety standards (all mishaps reported, investigated, root-cause-analyzed, remediated) were applied to auto safety!

    I could go out on public roads on two wheels without fearing for my life. Buses full of passengers wouldn't crash killing all passengers.

    Bring on rigorous reporting and let's up our game here, it is about time.

    • by larwe ( 858929 )

      Now imagine the opposite: if air safety standards (all mishaps reported, investigated, root-cause-analyzed, remediated) were applied to auto safety!

      I assume this is galloping sarcasm. From a quick lackadaisical Googling, there are six million reported car accidents a year in the US (there are lower bounds on what gets reported, and tbh the unreported fender-benders are just as important to the task of improving autonomous driving software as the more expensive, reportable accidents). The best number I can find for the number of US aircraft accidents is "1,302 accidents for US-registered aircraft in 2019". Aircraft accidents are investigated by the NTS

      • by MagicMike ( 7992 )

        No sarcasm at all. You seem to be afflicted by that "it looks hard, we shouldn't try" disease, what a shame.

        It appears there is an agency (from TFA) starting to do it.

        I imagine that once real safety standards are applied, the number of accidents goes down rapidly (probably a log curve), such that your defeatist "oh my god there's so many" pearl-clutching is addressed.

        • by larwe ( 858929 )

          No sarcasm at all. You seem to be afflicted by that "it looks hard, we shouldn't try" disease, what a shame.

          I'm afflicted by the "you clearly have no idea what imposing a mandatory investigation process - which will only grow over time, and is already unworkable from day 0 - would entail" disease. Air travel and ground travel are fundamentally different, starting with how both the vehicles and the operators are certified. The NTSB's budget is around $100MM right now. You may be cool with making that $100B (and that's a gross underestimate; growing a regulatory body by three orders of magnitude probably involves five orders of magni

          • by MagicMike ( 7992 )

            I dunno, in the general sense that a government is tasked with protecting life, liberty, and the pursuit of happiness, I see it as a core goal and easily worth the expenditure for the whole "life" thing vs. ... I dunno, the death involved with the current military budget. Would be quite happy with a re-allocation there.

            • by larwe ( 858929 )

              Think about this for a moment. A driver misjudges while parallel parking and clips an adjacent car or mailbox. "Oh well". Two drivers slightly misjudge a turn and scrape a side mirror off. "Oh well". Two aircraft touch wingtips - Interviews with ground control, pilot(s), review of posted procedures, review of radio logs, review of training records, two months of analysis, written report approved by several levels of bureaucracy. Dozens of people involved. And this doesn't even get into the fact that there a

    • I could go out on public roads on two wheels without fearing for my life.

      Sure, because most people wouldn't be capable (mentally or financially) of passing the driving test (or its annual checks), so 90% of people would be using public transport.

  • by dknj ( 441802 ) on Tuesday June 29, 2021 @10:28PM (#61535866) Journal

    "Consumers and dealers are not subject to these rules"

    If your vehicle didn't phone home before, all future models will always report back to the manufacturer. Think about it... cruise control is an autonomous feature. Lane assist is an autonomous feature. Once this regulation goes into effect, do not buy a car.

    PROTIP: Ford vehicles without 4G connectivity (read: in-car wifi) are not permanently connected, but they are always recording your car's stats. If you replace the headunit [rhinoradios.com], they no longer spy on you.

    -dk

    • must buy an data plan and maybe roaming? in boarder zones you don't want to be fringe roaming near the boarder and have to pay as high as $2.05/MB.

      • must buy an data plan and maybe roaming? in boarder zones you don't want to be fringe roaming near the boarder and have to pay as high as $2.05/MB.

        Hopefully by the time autonomous networks take hold, we will also have a global internet service available anywhere, so there won't be any "fringe roaming".

      • in boarder zones

        Border zones. Don't they teach people to spell these days?

        Yeah, yeah, I know. Get off my lawn!

    • by ledow ( 319597 )

      So you'll be walking from 2030/2035, will you?

      At that point it will be difficult to keep any ICE car working, new ICE car sales will be banned in almost every developed country (some earlier than others), the industry's focus will be on electric cars and car parts, and nobody will really care about your gas guzzler but you (P.S. take it from me: Prepare for huge fuel taxes in line with other countries when that happens).

      I do happen to drive a Ford without any talk-home capability, but as you say, it's stil

      • by dknj ( 441802 )

        I like how you talk smugly, as if you actually knew what you were talking about.

        If your Ford is 2005 or newer, you have an APIM, which is the "secondary headunit" that controls all of your car's comfort features (it communicates with your dashboard, your actual radio, your HVAC, and your mood lighting if you have it, and it provides the nice chime when you open your doors if you have the upgraded model). This APIM is connected to your GPS and OBD-II and reads data such as occupancy sensors, seatbelts, gps location, s

  • So basically this concerns all vehicles equipped with cruise control, correct?

    • by Monoman ( 8745 )

      Cruise control - maybe. That's just the driver locking in the speed. Cruise control with automatic speed adjustments to avoid running into slower vehicles would, I think, be included. Vehicles with tech to keep cars in the lane will probably be included too.

  • by nicolaiplum ( 169077 ) on Wednesday June 30, 2021 @03:47AM (#61536370)

    This is aimed at manufacturers doing testing and not at consumers who own vehicles already. So if you're running your private fleet of cars around to test them, like Waymo and others, you're subject to this.

    If you're uploading beta software to cars owned by consumers, like Tesla does, then you're not subject to this.

    That's great for Elon Musk, isn't it? He's not going to have to report most of his company's Autopilot crashes, but he can point to the terrible statistics of his competitor Waymo, which would have to report everything (for the medium term, at least).


      by geekmux ( 1040042 )

      That's great for Elon Musk, isn't it? He's not going to have to report most of his company's Autopilot crashes, but he can point to the terrible statistics of his competitor Waymo, which would have to report everything (for the medium term, at least).

      Pretty sure Elon isn't getting away with damn near any autonomous crash, since they still manage to make headline news every time.

      Gotta love those stock short whores. They'll never give up.

      • by dargaud ( 518470 )

        Gotta love those stock short whores. They'll never give up.

        Don't stock shorts have a time limit on them?

        • Greed is timeless.

          I'm certain they can find a way to justify refreshing short positions to keep the anti-hype up, sponsored by the competition.


            by Anonymous Coward
            Or, you know, maybe Musk isn't Jesus Christ after all and is just another greedy businessman with an expensive PR campaign?
  • My insurance company wants me to install a gadget that will record my driving, and for that will give me a discount. I politely refused but am still concerned that any of the current data stored by my engine control module could be misinterpreted should I have an accident.

    Until a standard is agreed upon for what data is collected, a standard that could prove innocence or guilt, or at least provide real assistance to the outcome, I hesitate to embrace the tech. We all know a lawyer that could take the minisc

    • by Pimpy ( 143938 )

      The "standard" isn't the problem, it's the lack of transparency over data collection, use, and processing, none of which a standardized data model does anything about. In OBD-II terms, there are many standardardized parameters that are collected for usage-based insurance, but these tend to be more about *how* you are driving the car (e.g. engine RPMs, vehicle speed, fuel consumption, engine temp, DTC codes, etc.). The OBD-II standard itself contains only limited instrumentation data, but some of the dongles

  • Yes, I'm sure autonomous systems will sometimes fail, resulting in near-misses, crashes and even the occasional fatality.

    However, the incidents per mile driven for computer-driven vehicles are going to be a (Small!) fraction of the rate for those vehicles under meat-bag control.

    As it stands, this will create the same issue we see with EV fires, on a much wider scale -- ~500 ICE vehicle fires in the US per day, and no one cares. One Tesla burns (with no injuries) and it's international news.

    It might be useful

  • The next few years are going to be very damaging for companies developing self-driving technology. Mostly (but not entirely) because people are stupid.

    With the increases in processing power, software capability and other innovations, it will not be much longer (a few years, and likely less than a decade) before self-driving tech is at the point where you can leave your house and have your car take you to a destination across town, or across the country, on roads that the vehicle has no prior knowledge of
