Software | Transportation | Technology

'Operational Limitations' In Tesla Model S Played a 'Major Role' In Autopilot Crash, Says NTSB (reuters.com) 210

Mr D from 63 shares a report from Reuters: The chairman of the U.S. National Transportation Safety Board (NTSB) said on Tuesday that "operational limitations" in the Tesla Model S played a "major role" in a May 2016 crash that killed a driver using the vehicle's semi-autonomous "Autopilot" system. Reuters reported on Monday that the NTSB is expected to find that the system was a contributing factor because it allows drivers to avoid steering or watching the road for lengthy periods of time. The NTSB is also expected to find that Tesla Inc could have taken additional steps to prevent the system's misuse, and will fault the driver for not paying attention. "Today's automation systems augment, rather than replace, human drivers. Drivers must always be prepared to take the wheel or apply the brakes," NTSB Chairman Robert Sumwalt said. The system could not reliably detect cross traffic and "did little to constrain the use of Autopilot to roadways for which it was designed," the board said. Monitoring driver attention by measuring the driver's touching of the steering wheel "was a poor surrogate for monitored driving engagement." At a public hearing Tuesday on the crash involving driver Joshua Brown, the NTSB said the truck driver and the Tesla driver "had at least 10 seconds to observe and respond to each other."
This discussion has been archived. No new comments can be posted.


  • by Marc_Hawke ( 130338 ) on Tuesday September 12, 2017 @06:12PM (#55184455)

    """Monitoring driver attention by measuring the driver's touching of the steering wheel "was a poor surrogate for monitored driving engagement." """

    How would you monitor their engagement? Eye tracking? Manual corrections to the car's path/speed?

What happens when people ignore the "please grab the wheel" prompt? Does the car pull over and park? Is that what it should do?

    • by PolygamousRanchKid ( 1290638 ) on Tuesday September 12, 2017 @06:36PM (#55184607)

      How would you monitor their engagement? Eye tracking? Manual corrections to the car's path/speed?

      Well, with any automobile . . . the biggest mechanical danger is . . . "The Loose Nut Behind the Wheel" . . .

      "Autopilots" are probably something that most "normal" drivers should not be using anyway. Hey, driving is a privilege, and not a right.

Hey, take someone who can't read the traffic signs and is abysmally clueless as to traffic laws . . . no wonder stuff like this will happen more often. We'll just have to wait and see how the American lawyers will deal with this. They could kill self-driving cars . . . but then again . . . it would be more lucrative for them to milk the industry.

      More and the weather at eleven . . .

    • Re: (Score:2, Insightful)

      by Anonymous Coward

Autopilot for cars is a stupid idea altogether. For airplanes, with very little traffic interaction, it is OK, but for automobiles it reeks.
There are millions of decisions made during driving in urban traffic; judgment calls are constantly being made. For example, if oncoming traffic were to suddenly cross the center line and you had to decide, in less than a second, which object to crash into, the autopilot would fail. However, many drivers are so uneducated and mis-educated that a shitty autopilot would be better.
      The US

Autopilot for cars is a stupid idea altogether. For airplanes, with very little traffic interaction, it is OK, but for automobiles it reeks.

        Autopilot for (large commercial) planes typically includes a Collision Avoidance component as well.

      • The US population has _extremely_ weak drivers education, I do not think 0.0001% would pass a west European driver test.

        I think you may have a broken zero key, it got stuck. That's only ~300 people. There are probably many more military fighter aircraft pilots in the US than this, and you're basically saying quite a few of them (most?) are incompetent and inattentive drivers of much slower land vehicles.

    • It means you can behave like an idiot and ignore all safety warnings and common sense, have an accident, and you (or your next of kin) still get to sue the car manufacturer.

      If anything is going to stop self driving cars, it's moronic "victims" looking for a payday.
      • by Mr D from 63 ( 3395377 ) on Tuesday September 12, 2017 @07:05PM (#55184803)
        No, it means that human factors play a role in safety with these systems. The study has nothing to do with legal culpability.
    • """Monitoring driver attention by measuring the driver's touching of the steering wheel "was a poor surrogate for monitored driving engagement." """

      How would you monitor their engagement?

Automakers are working on attention-monitoring tools, as they realize this is an important human-factors issue with partially autonomous driving systems:

      http://www.loebermotors.com/bl... [loebermotors.com]

      https://electrek.co/2017/08/01... [electrek.co]

      http://www.sciencedirect.com/s... [sciencedirect.com]

    • by Ichijo ( 607641 ) on Tuesday September 12, 2017 @07:04PM (#55184797) Journal

      How would you monitor their engagement?

      There may not yet be an effective way to monitor driver engagement. This doesn't invalidate the NTSB's conclusion.

      • by rtb61 ( 674572 )

Like, duh, too fucking easy: just like trains, a 'dead man switch' https://en.wikipedia.org/wiki/... [wikipedia.org]. Keep it pressed in with your left foot and the vehicle continues to operate; ease pressure and an alarm sounds and the vehicle slows to a stop, pulling over as near as practicable to the shoulder. I have driven long distances, and the conclusions are erroneous; the reality is that not having to continually focus means you do not tire as quickly, and you can relax whilst continuing to drive. Those who fall asleep in
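
A minimal sketch of the dead-man escalation described above; the pedal and vehicle hooks (read_pedal, sound_alarm, begin_controlled_stop) are hypothetical and the thresholds are invented, not any real system's values:

import time

PRESSURE_THRESHOLD = 0.2   # assumed fraction of full pedal travel
GRACE_SECONDS = 2.0        # how long the pedal may be released before alarming

def dead_man_loop(read_pedal, sound_alarm, begin_controlled_stop):
    # The three arguments are hypothetical vehicle-interface callbacks;
    # only the escalation logic is shown here.
    released_since = None
    while True:
        if read_pedal() >= PRESSURE_THRESHOLD:
            released_since = None              # pedal held: normal operation
        elif released_since is None:
            released_since = time.monotonic()  # pedal just released: start timer
        elif time.monotonic() - released_since > GRACE_SECONDS:
            sound_alarm()                      # warn the driver...
            begin_controlled_stop()            # ...and slow to a stop, pulling over
            return
        time.sleep(0.05)                       # poll at 20 Hz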

      • by AmiMoJo ( 196126 )

Other manufacturers are using gaze-tracking cameras. If the driver closes their eyes or isn't looking at the surrounding road for more than a few seconds (I think 15 is the industry standard), the car alerts them. If that fails to get their attention the car is supposed to stop, but whether it pulls over or just comes to a halt in the fast lane depends on the manufacturer.
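
A rough sketch of that escalation, using the 15-second figure mentioned above; the camera and vehicle callbacks are assumptions, not any manufacturer's API:

import time

EYES_OFF_LIMIT = 15.0   # seconds of eyes-off-road before alerting (figure cited above)
ALERT_TIMEOUT = 10.0    # unanswered-alert window before stopping (invented)

def monitor_gaze(eyes_on_road, alert_driver, controlled_stop):
    # eyes_on_road() -> bool would come from a gaze-tracking camera;
    # alert_driver and controlled_stop are hypothetical vehicle callbacks.
    eyes_off_since = None
    alerted_at = None
    while True:
        now = time.monotonic()
        if eyes_on_road():
            eyes_off_since = alerted_at = None      # driver engaged: reset timers
        else:
            if eyes_off_since is None:
                eyes_off_since = now
            if alerted_at is None and now - eyes_off_since > EYES_OFF_LIMIT:
                alert_driver()                      # escalate: visual/audible alert
                alerted_at = now
            elif alerted_at is not None and now - alerted_at > ALERT_TIMEOUT:
                controlled_stop()                   # pull over or halt (varies by maker)
                return
        time.sleep(0.1)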

    • by quantaman ( 517394 ) on Tuesday September 12, 2017 @07:09PM (#55184833)

      """Monitoring driver attention by measuring the driver's touching of the steering wheel "was a poor surrogate for monitored driving engagement." """

      How would you monitor their engagement? Eye tracking? Manual corrections to the car's path/speed?

What happens when people ignore the "please grab the wheel" prompt? Does the car pull over and park? Is that what it should do?

      It means that it's really hard to make a partially self-driving car that is safe.

People have two modes: driving and not-driving. If the car isn't safe while you're not-driving, then the car isn't safe.

      • The problem is that the current "autopilot" is pretty much like being a driving instructor.

        The student (or in this case, autopilot) drives, but you have to be always looking out for his mistakes and prepared to take control in an instant.

Also, in this case the "student" is high or something because, while he usually does not make the mistakes other students do (forgetting a turn signal, missing a red traffic light), he may, once in a while, not notice a huge 18-wheeler right in front of him.

      • by mjwx ( 966435 )

People have two modes: driving and not-driving.

        Use of the latter mode does not depend on whether the car has any autonomous capabilities.

    • by BasilBrush ( 643681 ) on Tuesday September 12, 2017 @07:33PM (#55184945)

Yes, eye tracking is the obvious way. And the Tesla Model 3 has a camera in the rear-view mirror area that faces back towards the inside of the car. AFAIK it's not used yet, but its obvious use case is monitoring driver attention. They could deliver that in a future software update.

    • Easy. Just have the auto pilot unexpectedly swerve off the road from time to time. That will keep the driver alert!

    • How would you monitor their engagement? Eye tracking? Manual corrections to the car's path/speed?

Yes, yes, and actually more than that. Volvo, for example, uses facial recognition. This technology is in active use across a few models from a few different manufacturers.

What happens when people ignore the "please grab the wheel" prompt? Does the car pull over and park? Is that what it should do?

      Yes. That's exactly what it should do. It should pull over, park, and alert the authorities that the occupant has become incapacitated.

      The system is already snooping on the driver full-time, so privacy is not a valid objection to this plan. And the driver has a responsibility to other drivers, so lack of control is not a valid objection to this plan, either. It will, in fact, save lives. People who don't like it should avoid AVs.

  • by hawguy ( 1600213 ) on Tuesday September 12, 2017 @06:19PM (#55184501)

    As autonomous cars get better and better, we'll see more and more accidents attributed to driver inattention -- the better the car is at driving, the less the human is going to pay attention to the car or the road, and by the time the car tells the driver "Oh hey, I don't know how to handle this situation, you take over!", the driver won't have enough situational awareness to get out of the situation.

    Though the flip side is that as the cars get better at driving, the overall accident rate will decrease.

The same problem already exists with airplane pilots [cnn.com], and it can be even worse when the autopilot compensates for some gradually building condition (like icing); by the time it gives up control to the pilot, the plane may already be in a bad state and the pilot has little time to figure out why.

    • Pay-to-debug (Score:4, Interesting)

      by Roger W Moore ( 538166 ) on Tuesday September 12, 2017 @06:33PM (#55184595) Journal

      As autonomous cars get better and better, we'll see more and more accidents attributed to driver inattention

      Not if we can have a system which is better at driving than a human. In fact, other than the "cool factor" I'm not sure I see the point of a semi-autonomous system which requires me to watch the road all the time since it is no different from driving myself and potentially a lot more annoying. Frankly, it sounds more like paying to debug a final system which will drive itself.

      • Re:Pay-to-debug (Score:4, Interesting)

        by slack_justyb ( 862874 ) on Tuesday September 12, 2017 @06:54PM (#55184725)

On a trip to a conference in a friend's car, I actually got to try one of these out; these systems are a godsend in stop-and-go traffic. I did not test the thing on the open road.

      • Re:Pay-to-debug (Score:5, Insightful)

        by Moof123 ( 1292134 ) on Tuesday September 12, 2017 @07:07PM (#55184815)

        Once we are on the other side of the autonomous creepy valley, sure things will be great and perfect and all that jazz. So far that is a mythical future you can't buy yet.

In the meantime the current crop of systems, and the ones planned for the next several years, all augment the driver rather than replace them. As such driving will be inherently BORING as hell while you are a quasi-passenger in your own car. Bored humans check phones, read email, nod off, watch movies, and do other dumb stuff when they are still legally required to be in command and control of the vehicle. Google even admitted their employees were sleeping and working on laptops on the way home in their very beta test fleet (not that they fired them as they should have).

So the upshot is that these autonomous systems will create a whole new class of collision caused by zombified drivers, even while potentially lowering the overall rate with their mostly good collision-avoidance systems. Drivers will be out of practice and situationally unaware when HAL throws up its hands and gives back control, or when it fails to respond to trucks in the roadway, construction, black ice, pot holes, etc., etc.

        So until your car is fully licensed to drive itself, these semi-autonomous systems need to be designed to keep the licensed driver on the ball and paying attention. Of course that takes away 90% of the sex appeal, so I expect the legal/ethical envelope to keep being pushed and more of these new types of crashes to keep occurring.

        • Re:Pay-to-debug (Score:5, Insightful)

          by BasilBrush ( 643681 ) on Tuesday September 12, 2017 @07:22PM (#55184887)

          They'll learn less from them by firing them than by devising methods to stop them doing these things. After all, they can't fire the eventual customers that will also do these things.

        • Driving long distances has always been pretty damned boring, and today's drivers are already distracting themselves to death using their smartphones with no assistance from semi-autonomous cars. And I'd posit that future drivers can't really do any worse than current drivers do with black ice - which is to say, typically losing complete control of the vehicle. So, I think that ship has long sailed, and I'm not really sure there's any turning it around.

          It's going to be a rough few years of transition, mos

        • In the meantime the current crop of systems, and the ones planned for the next several years, all augment the driver rather than replace them. As such driving will be inherently BORING as hell while you are a quasi-passenger in your own car.

This is why the smartest position is to permit autonomy only at low speeds that are unlikely to kill people, on roads, like highways, where the vehicles are unlikely to encounter pedestrians. The new A8, for example, offers low-speed autonomy which will be useful in traffic jams — the most boring part of driving. It's automating away the most tedious part completely, but not taking responsibility for what you do at high speeds.

A system can be "better than a human" a large percentage of the time and still cause crashes. Look to aviation for the answer: crashes have become almost a thing of the past in most developed countries with high standards of maintenance, because we have figured out how best to get humans and machines to work together. The automation takes over most of the grunt work, and when it runs into problems it essentially makes the humans the "backup system".

        Problem with cars is nobody wants to spend a lot of
      • by swb ( 14022 )

        My 2007 Volvo has the crudest form of semi-autonomy, distance-sensing cruise control, and I have to tell you it's the one must-have feature I would look for in another car.

My system is pretty ancient technology-wise, but it works in traffic down to 20 MPH and I use it all the time. Obviously I'm still driving and have to pay attention, but it really does make driving a lot simpler, especially in slow-and-go traffic.

        I'm less impressed with the blind-spot system -- I get an orange light by the mirrors if the

BLIS is weak mostly because rain and stuff can mess it up. But it did prevent me from running over a motorcycle when I was driving my mom's XC home from the dealer when BLIS was still new. On the interstate a few years back I watched an SUV slowly come out of its lane and gently sideswipe the car next to it. I see cars leave their lane accidentally all the time, especially during turns or long straightaways when people are on their phones. Lane control is going to be super helpful in cars.
    • You have no evidence to back up your claim. It is entirely possible that the overall accident rate will increase because people will start to rely on autonomous driving systems in situations where they are not safe or use them in ways that are unsafe. Many people already drive in surprisingly unsafe ways.
The same problem already exists with airplane pilots, and it can be even worse when the autopilot compensates for some gradually building condition (like icing); by the time it gives up control to the pilot, the plane may already be in a bad state and the pilot has little time to figure out why.

      That's a pretty easy problem to solve, though. When the systems kick in, alert the driver. Some vehicles flash the ABS light when ABS engages. Put that sort of information on a HUD and let the operator know what is happening and why, so that they're not caught off guard.
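
A sketch of that idea: route each intervention event to a short driver-facing explanation rather than compensating silently. The event names and the HUD hook are invented for illustration:

# Hypothetical intervention events mapped to short driver-facing explanations.
INTERVENTION_MESSAGES = {
    "ABS_ENGAGED": "ABS active: reduced traction while braking",
    "TRACTION_CONTROL": "Traction control active: wheel slip detected",
    "ICING_COMPENSATION": "Compensating for ice buildup: stay alert",
}

def on_intervention(event, hud_display):
    # hud_display is an assumed HUD interface; unknown events are still
    # surfaced so the operator is never caught off guard by silent compensation.
    message = INTERVENTION_MESSAGES.get(event, "Assistance system active: " + event)
    hud_display(message)

# Example: on_intervention("ABS_ENGAGED", print)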

Since when does a machine have to monitor its operator to ensure he/she is "engaged"? Ridiculous.
  • by ArchieBunker ( 132337 ) on Tuesday September 12, 2017 @06:38PM (#55184623)

    Didn't Tesla say this guy had to acknowledge like seven warnings before the crash?

    • by Pinky's Brain ( 1158667 ) on Tuesday September 12, 2017 @06:44PM (#55184665)

      They allowed the system to be active with seven warnings and over the speed limit.

      • Re: (Score:2, Funny)

        by Anonymous Coward

        They allowed the system to be active with seven warnings and over the speed limit.

        So it's somehow Tesla's fault that the driver was an idiot? The software is nowhere near a state where it can refuse to defer to a meatbag. The driver has to be assumed to be in control. The software can notify the driver of sub-optimal conditions; it shouldn't be blamed for the idiot behind the wheel ignoring repeated warnings.

    • You read all the TOS before clicking "Accept", right?

    • Re:This is dumb (Score:4, Informative)

      by Richard_at_work ( 517087 ) on Tuesday September 12, 2017 @11:32PM (#55185861)

The issue is the interval at which the Tesla system gives warnings - the industry-standard recommended timespan between inattentive-driver prompts is 15 seconds, while in this case there were no inattentive-driver prompts from Autopilot in the two minutes leading up to the accident.

      This is the issue when you call something "Autopilot" and give it to a consumer base that is used to being spoonfed fictitious understandings of such systems from superficial TV shows - they are led to believe it does something that it most certainly does not.

Yes, Tesla put all sorts of warnings in their manuals about this, but there's absolutely no requirement to read those manuals before jumping into the car, hitting the highway and engaging the system. That's where the disconnect between theory and reality occurs - in theory, everyone reads the manual and understands the intimate details of the vehicle before setting off, while in practice people jump into new cars all the time and try things out.

Who here has been the person sat in their rental car for 30 minutes reading the manual before driving off for the first time? I bet the number of people who respond affirmatively to that question is .... low.

That's the issue Tesla need to solve.

      • by AmiMoJo ( 196126 )

Tesla need to do much more than that. Just having your hand on the wheel is meaningless - there are videos on YouTube of people sleeping with one hand on the wheel as their Tesla drives on Autopilot.

        Since they can't retrofit gaze tracking cameras to every existing car, it's not clear what they can do to make them safe.

They really seem to be selling this way before it is ready. They had a big setback with the AP2 hardware, and yet they are already selling people full self-driving capability with a promised software

  • Well duh. (Score:5, Interesting)

    by Gravis Zero ( 934156 ) on Tuesday September 12, 2017 @06:45PM (#55184673)

    The NTSB is also expected to find that Tesla Inc could have taken additional steps to prevent the system's misuse

    Of course they could have taken additional steps to prevent the system's misuse before the crash because that's exactly what they did right after the crash.

    • Re:Well duh. (Score:5, Interesting)

      by srw ( 38421 ) on Tuesday September 12, 2017 @06:57PM (#55184749) Homepage
      But they want it misused.

      This is the absolute best R&D that money can't buy. Idiots let the autopilot drive for them, even when they're not supposed to, and Tesla gets amazingly good real-life data and very little of the blame when things go wrong. (none, if they play it just right)
One can always take an "additional" step. The question is what was in place before the crash, and it wasn't nothing. As part of the investigation, Tesla showed that the driver actively ignored the warnings that were already in place.

      If you try to fix stupid with technology it will be a race to the bottom.

      • by AmiMoJo ( 196126 )

        There were no warnings in the two minutes leading up to the crash. Even if there had been, their driver attentiveness detection system is flawed - just having your hand on the wheel doesn't mean you are paying attention. You can easily have one hand on the wheel and the other on your phone, or sleep in that position.

        Look at this guy: https://youtu.be/sXls4cdEv7c [youtu.be]

        • There were no warnings in the two minutes leading up to the crash. Even if there had been, their driver attentiveness detection system is flawed

And? The system repeatedly warned him over a period of half an hour and he ignored every warning (see report below). Would another warning in the last 2 minutes have saved him? You can't fix stupid with technology.

          You can easily have one hand on the wheel and the other on your phone, or sleep in that position.

          And it's amazing how many of those cases also cause major accidents in cars without any driver assistance technology.

For the vast majority of the trip, the AUTOPILOT HANDS ON STATE remained at HANDS REQUIRED NOT DETECTED. Seven times during the course of the trip, the AUTOPILOT HANDS ON STATE transitioned to VISUAL WARNING. During six of these times, the AUTOPILOT HANDS ON STATE transitioned further to CHIME 1 before briefly transitioning to HANDS REQUIRED DETECTED for 1 to 3 seconds. During the course of the trip, approximately 37 minutes 16 seconds passed during which the Autopilot system was actively controlling the automobile in both lane assist and adaptive cruise control. During this period, the AUTOPILOT HANDS ON STATE was in HANDS REQUIRED DETECTED for 25 seconds. For the remainder of this period, the AUTOPILOT HANDS ON STATE was in HANDS REQUIRED NOT DETECTED, or in one of the visual or aural warning states.
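
For readers following the log excerpt, here is a rough model of the warning ladder it describes, using the state names from the report; the timing thresholds are invented for illustration, not Tesla's actual values:

from enum import Enum, auto

class HandsOnState(Enum):
    HANDS_REQUIRED_DETECTED = auto()      # torque sensed on the wheel
    HANDS_REQUIRED_NOT_DETECTED = auto()  # no torque, no warning yet
    VISUAL_WARNING = auto()               # dash prompt to hold the wheel
    CHIME_1 = auto()                      # first audible escalation

# Invented thresholds: seconds without detected hands before each escalation.
VISUAL_AFTER = 60.0
CHIME_AFTER = 75.0

def next_state(hands_detected, seconds_hands_off):
    if hands_detected:
        return HandsOnState.HANDS_REQUIRED_DETECTED
    if seconds_hands_off > CHIME_AFTER:
        return HandsOnState.CHIME_1
    if seconds_hands_off > VISUAL_AFTER:
        return HandsOnState.VISUAL_WARNING
    return HandsOnState.HANDS_REQUIRED_NOT_DETECTED

# e.g. next_state(False, 80.0) -> HandsOnState.CHIME_1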

          • by AmiMoJo ( 196126 )

            The report is pretty damning. Over a period of 37 minutes he had his hands on the wheel for 25 seconds, and the car didn't stop or force him to pay attention.

            If all you need to defeat their attention test is to touch the wheel for 25 seconds out of every 37 minutes, that's not going to stop anyone playing with their phone or taking a nap.

      • by Cederic ( 9623 )

When you publicise a product as 'autopilot' and imply that it doesn't require full attention, then fail to implement basic attention monitoring capabilities, and you sell your cars to idiots, you pretty much guarantee something is going to go wrong.

Shit, my five-year-old car doesn't even have lane assistance or dynamic cruise control, but it still monitors me and alerts me if it thinks I'm not paying attention to the road.

        • then fail to implement basic attention monitoring capabilities

          You mean like the detection systems that warned the driver over and over again to put his hands on the wheel?

          and you sell your cars to idiots, you pretty much guarantee something is going to go wrong.

          Wait are we still talking about autopilot here? Because last time I got rear ended was by some idiot playing with his phone. At least with an autopilot he may have stood a chance.

Shit, my five-year-old car doesn't even have lane assistance or dynamic cruise control, but it still monitors me and alerts me if it thinks I'm not paying attention to the road.

Yeah, it's almost like you didn't read the report that showed the driver ignored repeated visual and audible warnings his Tesla gave him.

          • Wait are we still talking about autopilot here? Because last time I got rear ended was by some idiot playing with his phone. At least with an autopilot he may have stood a chance.

            Just count yourself lucky you're not a truck driver that has to cross a street!

          • by Cederic ( 9623 )

            Sorry, why are you ranting at me? It's the NTSB that said Tesla's monitoring systems were inadequate.

            Bitch all you like but Tesla have been found to have unsafe technology in their cars. Tesla and safety: fail.

            • Sorry, why are you ranting at me?

I'm not. I don't even know you. What I'm doing is addressing your obviously incorrect statements, given that Tesla very much has the monitoring systems you claim it doesn't.

              Bitch all you like but Tesla have been found to have unsafe technology in their cars. Tesla and safety: fail.

Actually, so far the only thing that has been said is that the Autopilot design contributed to this crash.
Meanwhile the NHTSA, investigating the same crash, said that while the Autopilot did contribute to this crash there was nothing wrong with the design of the technology, and on top of it all the data shows that there's a 40% crash reduction r

  • by bobbied ( 2522392 ) on Tuesday September 12, 2017 @06:58PM (#55184761)

This is BAD for Tesla. The NTSB basically found fault in the "auto pilot" system's user interface AND its technical capability. I am NOT surprised by this.

Automatic driving of cars and trucks needs to be thoroughly thought through. Not just the technology required to keep the car on the road, sensing what's going on around it and dealing appropriately with this dynamic environment, but also the complex human-factors considerations. Tesla may have the first part working fairly well within the given limits of their sensors, but the second part of this problem hasn't been designed very well.

Human factors engineering has only recently been a consideration for *real* autopilots (those in aircraft) and flight automation systems. And it has become clear that all the automation in aircraft has given us great efficiency and smooth operation at the cost of inexperienced pilots with poor flying skills who don't recognize when something is gravely wrong until it is too late. They trust the automation, because it just works, at least until it doesn't, and something really bad happens that was easily preventable. The folks over at the NTSB are very familiar with this issue, because there have been a number of notable commercial aircraft crashes where this was a contributing factor: the automation failed to do what the pilot expected, and a crash happened in a perfectly flyable aircraft.

Tesla has a serious level of risk with this feature. It may be whiz-bang cool, and Musk may love calling it an "autopilot", but the legal liability is huge unless they can keep people from crashing while it's on. The NTSB's statements here do NOT bode well for Tesla's legal liability, and all the EULAs in the world won't stop the lawsuits when crashes happen.

    • And the autopilot in planes saves lives.

The cases where aircraft autopilot doesn't do what it's supposed to do have resulted in far fewer accidents than pilot error has when the pilot is flying the aircraft full time.

      No, it doesn't yet handle the edge cases as well. But it handles the common cases far better than a human.

That is actually somewhat debatable in the industry. Not so much the autopilot for long flights, but the autothrottle for landing. Pilots set the wrong setting and then do not monitor airspeed until disaster strikes. It has led to quite a few crashes, almost all with Asian pilots.

      • Why do so many people insist on comparing the Tesla auto-pilot feature to an aircraft auto-pilot system? Their similarities end at the name.
        • ... because Tesla chose to use a name which is synonymous with a specific thing already in existence, namely aviation autopilots.

      • The autopilot systems in aviation are operated by heavily trained and experienced pilots who operate under very strict rules - I don't think a single driver here on Slashdot would agree to driving anywhere under "sterile cockpit" rules, where commercial pilots are forbidden from engaging in non-essential conversation and interactions during critical phases of the flight.

        The difference between pilots (in both commercial and general aviation areas) using an autopilot system and a general motor-vehicle driver

        • It's not nearly as big a difference as you think. And I don't engage in non-essential conversations during critical phases of driving. The reality is that most of driving is boring, it's the same thing you've seen thousands of times outside the windows, your brain turns off and you drive practically by muscle memory once you've been doing it long enough. If I'm sitting in nearly stopped traffic, I don't want to be paying attention to what's going on outside, and with current technology I don't need to.

          If

      • by mjwx ( 966435 )

        And the autopilot in planes saves lives.

        Incorrect.
Autopilot combined with a pilot saves lives. Dependence on autopilot has demonstrated the opposite effect, as evidenced by Asiana Flight 214, and it absolutely will not save lives when the pilot orders it to fly into a mountain (Germanwings Flight 9525).

The main reason that Google's self-driving car has had only one at-fault accident is that it had a professional driver who was paying attention. The next biggest reason is that it has only been tested in sunny California. I'd like to see it on

      • And the autopilot in planes saves lives.

        Says who exactly? Can you quote something that proves that statement?

Autopilots are there to increase efficiency and the comfort of the passengers. They are sometimes there to reduce workload so the cockpit crew size can be reduced, but they are not required safety equipment for commercial flying, unless the automation was the basis for reducing crew requirements during type certification.

The advent of autopilots has actually had a negative impact on pilots' flying skills, which impacts safety. Stick and

  • by markdavis ( 642305 ) on Tuesday September 12, 2017 @07:13PM (#55184853)

You can't have it both ways. Really, you can't. Either the car is driving or the person is driving. Expecting that a person will let the car drive AND ALSO stay 100% ready to take over is just not reality. If you are not the one in control, then your mind will not focus on it. Driving is boring enough as it is; expecting someone to babysit a semi-autonomous car is way beyond what we can expect people to do.

Just as an example, 9 years ago when I got my fully loaded Infiniti G37S with the technology package, it was one of the first vehicles to have laser-controlled intelligent cruise control. It can match the speed of the car in front and actively adjust, even brake if necessary. And just that ONE feature of driving assistance sounded like it would be very useful. OMG no. I tried many times to use it and found that just fully automated speed control was enough to disengage me from being an active driver. I could not adapt to it and ultimately decided I would never use it again. It was simply unsafe! Regular cruise control? No problem: I have to pay attention, and I bump the speed up and down manually with the thumb control and take other action when necessary. But as soon as that was taken away from me, it became nearly impossible to stay attentive, even though I still had to steer!

    Now, maybe different brains work differently and some people can handle semi-automation, but I know I can't. So don't even TRY to give me a car that can sorta drive itself and expect ME to be the ultimate failsafe... that just isn't going to happen. And I expect I am far, FAR from alone in this.

When I code, when I am typing function names or variable names, I am super careful about spelling and watch every letter. Somehow, as soon as I start a literal string or a comment block, my brain instantly switches off attention and I make typos.

      I could easily see how taking away speed control is enough to let your mind wander.

      • by AmiMoJo ( 196126 )

        I carefully name all my variables so that I can type no more than 3 characters and use tab to auto-complete.

        If the IDE doesn't have auto-complete, I just use 3 letter variable names.

        I find this avoids many common spelling errors.

    • by jezwel ( 2451108 )

      Now, maybe different brains work differently and some people can handle semi-automation, but I know I can't. So don't even TRY to give me a car that can sorta drive itself and expect ME to be the ultimate failsafe... that just isn't going to happen. And I expect I am far, FAR from alone in this.

My partner's new car has this. She uses it all the time, and I do as well when I'm driving.
Once I figured out how to adjust the distance to the car ahead and set the max cruise speed, it was super simple to use.

Neither of us has had any trouble with boredom whilst using it.

There are obviously groups of people for whom this does work, and quite well.

    • by Cyberax ( 705495 )
      I drive Tesla and I love the self-driving. It definitely makes me a safer driver - I have enough attention to actually monitor surroundings while the car takes care of staying in the lane. It probably saved me from a couple of deer strikes already. I have no problem staying alert and I can actually stay alert for much longer with self-driving during road trips.
    • by mjwx ( 966435 )

      You can't have it both ways. Really, you can't. Either the car is driving or the person is driving. Expecting that a person will let the car drive AND ALSO stay 100% ready to take over is just not reality.

The fact that this is emphatically not true is the main reason the Google autonomous car has had only one at-fault accident.

It's not that we are incapable of monitoring an autonomous system; we are capable. In fact, there are many careers that mainly consist of watching an autonomous machine doing its thing and stepping in when things go wrong.

The problem is that most people won't. Not because they can't, but because they're lazy, and marketing has sold them the impression that autonomous cars will do everything for them.

  • I fully believe that autonomous driving is possible in 99% of the cases. Autonomous vehicles are not a new thing, research and experiments [wikipedia.org] have been conducted in this domain at least since the 1980s. Self-driving cars have been running on German and Italian roads since the mid-1990s.

    However, this technology in consumer cars is another story. It is between expensive and very expensive right now, and will likely continue to be for some time. This is also a complex system that needs to be monitored and maintai

    • by fisted ( 2295862 )

      Self-driving cars have been running on German and Italian roads since the mid-1990s.

Do you mean that one instance in 1995 when a car went from Munich to Copenhagen with a mean distance between human interventions of 5.6 miles?
If so, how is that "cars [plural] have been running ... since the mid-90s [as if it's an ongoing thing since then]"?
If not, mind sharing a source for your claim?

The driver of the Tesla had already shown irresponsible behaviour on YouTube when using his Tesla, so it was all the driver's fault, not Tesla's. You cannot do anything against stupidity.
This is human nature 101. Put a broken feature called "autopilot" into a car, and all it takes is for the car to screw up at a moment when the human is distracted to produce a catastrophic failure. Tesla even encouraged inattentiveness by allowing the human to take his/her hands off the wheel for extended periods of time. Even the modified requirements are too lax.

    None of this should have come as a surprise to Tesla or anyone who thought about this for a moment. Driver boredom and inattentiveness is an obvious cons

  • look for that condition and take action to get the bearing changing by reducing speed...

    It's not rocket science...
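
The parent appears to be describing the mariner's "constant bearing, decreasing range" rule: if another vehicle stays at the same bearing from you while the distance closes, you are on a collision course, and changing speed makes the bearing start to drift. A toy check under that assumption, with positions in a flat local frame:

import math

def bearing_and_range(own, other):
    # own/other are (x, y) positions in meters in a shared local frame
    dx, dy = other[0] - own[0], other[1] - own[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)

def collision_course(own_then, other_then, own_now, other_now,
                     bearing_tol=math.radians(1.0)):
    # Constant bearing plus decreasing range between two samples => danger.
    b0, r0 = bearing_and_range(own_then, other_then)
    b1, r1 = bearing_and_range(own_now, other_now)
    drift = math.atan2(math.sin(b1 - b0), math.cos(b1 - b0))  # wrap to [-pi, pi]
    return abs(drift) < bearing_tol and r1 < r0

# Two vehicles converging at a constant 45-degree bearing: flagged as a threat.
print(collision_course((0, 0), (100, 100), (10, 10), (90, 90)))  # True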
