Transportation

Tesla Autopilot Safety Defeat Device Gets a Cease-and-Desist From NHTSA (autoblog.com)

schwit1 writes: The National Highway Traffic Safety Administration (NHTSA) is cracking down on a device designed to trick Tesla's semi-autonomous Autopilot feature into thinking a driver is paying attention, in order to extend the amount of time it will operate without anyone touching the steering wheel. NHTSA announced on Tuesday that it has sent a cease-and-desist letter to the makers of Autopilot Buddy and has given the company until June 29 to end sales and distribution of the $199 product.

The device is a two-piece weighted hoop with magnets that wraps around a steering wheel spoke and registers with the car's sensors as a hand on the wheel. Autopilot is programmed to disengage after a short period of time if the driver is not touching the wheel and ignores a series of alerts to take control.

This discussion has been archived. No new comments can be posted.

Tesla Autopilot Safety Defeat Device Gets a Cease-and-Desist From NHTSA

Comments Filter:
  • ...companion cube.

    • by nazsco ( 695026 )

      This sounds like the Companion Cube's misdirection.

      There's no mention of that device being involved in the current high-profile investigation. It probably wasn't present, but they are doing this now to divert attention.

  • They should be able to sell it, as long as they're willing to pay for the damages in any accident associated with its use. Bet it would be pulled very quickly...
    • Re:Liability... (Score:4, Insightful)

      by Frosty Piss ( 770223 ) * on Saturday June 23, 2018 @05:47PM (#56835370)

      They should be able to sell it, as long as they're willing to pay for the damages in any accident associated with its use.

      The people this device is involved in killing might disagree with you.

      • Re:Liability... (Score:4, Interesting)

        by dgatwood ( 11270 ) on Saturday June 23, 2018 @06:46PM (#56835548) Homepage Journal

        This product didn't kill anyone. It can only do one of two things:

        • Keep the Tesla from falsely nagging someone who does have both hands on the wheel.
        • Keep the Tesla from uselessly nagging someone who isn't going to pay attention anyway, and at best will just tap the wheel when the nags happen.

        Neither of these has any meaningful effect on driver or vehicle safety. The odds against a device like this causing a fatal accident are astronomical, because for the car's autosteer to shut down, the driver has to be so completely oblivious that he/she fails to respond to three nags WITH SOUND within a one-hour period. This is a relatively rare occurrence, short of someone dying behind the wheel....

        More importantly, any claim of reduced safety relies on the assumption that the nags somehow make the car safer, when in my experience the precise opposite is true. Most of the time, the nag system takes an insanely long time to detect that the driver doesn't have his/her hands on the wheel, yet it constantly nags at highly inappropriate times (such as during acceleration) when the driver *does* have both hands on the wheel.

        As best I can tell, the main purpose of the nags seems to be to make the autosteer feature more annoying than driving by hand so that folks will spend more money for the self-driving package when it finally comes out. The nags have gotten so annoying that I'm finding myself using autosteer less and less frequently as the nag rate increases. In other words, assuming autosteer really is improving safety, then statistically speaking, the nags are making the car LESS safe, not more.

        Worse, because of the way Tesla detects hands on the wheel (by measuring the torque your hands apply against the autosteer), the nags are actually more frequent when gripping the wheel tightly with two hands than when loosely hanging one hand on one side of the wheel. So the nags actively encourage drivers to do the exact opposite of what the system claims to be promoting. Again, the nags make the car LESS safe.
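        The torque heuristic described above can be sketched in a few lines of Python (a minimal illustration only; the function name and threshold value are hypothetical, not Tesla's actual implementation):

```python
# Hypothetical sketch of torque-based hands-on-wheel detection.
# The 0.3 N*m threshold is illustrative, not Tesla's real number.

def hands_detected(torque_nm: float, threshold_nm: float = 0.3) -> bool:
    """Report 'hands on wheel' when the net steering torque
    applied against the autosteer exceeds a threshold."""
    return abs(torque_nm) > threshold_nm

# A relaxed hand hanging on one side applies a constant gravity torque,
# so it clears the threshold and suppresses the nag:
assert hands_detected(0.8)

# Two hands gripping firmly and symmetrically can cancel each other out,
# producing near-zero net torque, so the system sees "no hands":
assert not hands_detected(0.05)
```

        Under this model, a constant one-handed gravity torque registers while a firm, symmetric two-handed grip can net out to nearly zero, which is exactly the inversion being complained about here.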

        So I don't know what NHTSA is smoking, but I'd like some of that. Obviously nobody involved in that C&D has ever actually driven a Tesla, or else they would not have sent it. The nags should die in a fire. They make the vehicle less safe, and any technology that can be used to render them harmless makes Tesla vehicles safer to drive, not less safe.

        • Re:Liability... (Score:5, Insightful)

          by CaptainDork ( 3678879 ) on Saturday June 23, 2018 @06:53PM (#56835574)

          Been there.

          Texaco refinery, Port Arthur, Texas.

          The operators stuffed red rags into the alarm horns and, sure enough, 8 people died on a unit where instruments showed there was sufficient time to get out of harm's way had the sound not been muffled.

          I remember my dad pulling the wire of the "ding, ding," of the lap belt warning.

          People take batteries out of smoke detectors.

          I think the answer is for the goddam artificial intelligence to be fucking intelligent.

          Until then, don't beta test the goddam thing in production.

          • Re:Liability... (Score:4, Insightful)

            by ShanghaiBill ( 739463 ) on Saturday June 23, 2018 @09:02PM (#56835964)

            I think the answer is for the goddam artificial intelligence to be fucking intelligent.

            Until then, don't beta test the goddam thing in production.

            Get some perspective. 3,000 people a day die in human-caused traffic accidents worldwide. If, by rolling out Autopilot and collecting real-world data, they bring forward the transition to SDCs by even a single day, they will have saved a thousand lives for every one lost in beta testing.

            This is the same as The Trolley Problem [wikipedia.org], except instead of throwing the switch to save five by sacrificing one, we save thousands, or perhaps tens or hundreds of thousands.

            The needs of the many outweigh the needs of the few.

        • by Megol ( 3135005 )

          If you want to kill or maim yourself, that's your right. But you have no right to throw dice with other people's well-being, which is what you are supporting here.

          • "If you want to kill or maim yourself, that's your right. But you have no right to throw dice with other people's well-being, which is what you are supporting here."

            People accidentally kill other people in road accidents every day. What you are proposing is that no one be allowed to do anything, ever, because humans are fragile and there is always a risk they might die.

            Driving equals risking the well-being of others, whether you do it correctly or not.

        • In other words, assuming autosteer really is improving safety, then statistically speaking, the nags are making the car LESS safe, not more.

          Or you can just leave your hands on the wheel and pay attention while the car does the heavy lifting.

          • by dgatwood ( 11270 )

            Or you can just leave your hands on the wheel and pay attention while the car does the heavy lifting.

            And when you do that, the nags still happen about once every minute or two, and they still distract me from my driving. Did you even read my post before you replied?

            Worse, because of the way Tesla detects hands on the wheel — by measuring the torque provided by your hands against the autosteer, the nags are actually more frequent when gripping the wheel tightly with two hands than when loosely hanging

        • by rojash ( 2567409 )
          The way you put it reminds me of driving with one's wife in tow. The nag keeps you annoyed and alert, which is a good thing; it keeps prodding you. Getting a gadget that defeats your safety is ridiculous, and NHTSA is more interested in people's safety than in your comfort when you drive. It's their job; whether or not they have driven a Tesla before banning anti-safety devices is their prerogative.
          • by dgatwood ( 11270 )

            The nag will keep you annoyed and alert

            Having a warning message flash on your dashboard does not keep you alert. It distracts you from the road.

    • by PolygamousRanchKid ( 1290638 ) on Saturday June 23, 2018 @05:48PM (#56835376)

      Actually, we need a "Cease-and-Desist" order for drivers who refuse to pay attention to the road, despite the explicit instructions from Tesla.

      Once again, the most dangerous part of an automobile is "The Loose Nut Behind the Wheel".

      • Re: (Score:1, Redundant)

        Why pay attention? Tesla calls it an "autopilot". "Auto" = "automatic", so it's all automatic. Just like an automatic transmission, where you don't need to shift because the car does it for you, with an autopilot you don't need to pilot the car; it does it for you.
        • Re: (Score:2, Troll)

          by war4peace ( 1628283 )

          So let's fire all airplane pilots. 'cause planes have autopilot, y'know...

        • Tesla calls it an "autopilot". "Auto=automatic". So its all automatic.

          That changes nothing because all cars are "auto"mobiles. And in Germany, the "auto"mobiles drive on the "Auto"bahn.

          So everyone already believes cars don't need to be driven ... or perhaps people aren't as stupid as you think they are.

          Number of Tesla drivers who have claimed an accident wasn't their fault because they thought they didn't need to drive the car: 0.

          • by Anonymous Coward

            "Auto" = "self". "Automobile" means "self-propelled", not "self-guided".

      • Actually, we need a "Cease-and-Desist" order for drivers who refuse to pay attention to the road, despite the explicit instructions from Tesla.

        This, because it will be hard to C&D oranges [youtube.com]. Actually police should just mail this guy a speeding ticket for fun, since not only was he driving recklessly, he was also clearly speeding by 6mph.

      • by HiThere ( 15173 )

        This is a human-factors design problem, and Tesla made a bad design decision. People have a lot of trouble focusing on a task where all they do is pay attention and never act. What the correct decision is may be arguable, but it will involve people acting whenever they're supposed to be paying attention.

    • by Knuckles ( 8964 )

      Seems easier and more effective to ban it instead of waiting for it to kill people, followed by the protracted lawsuits.

    • Unless they're judgment-proof, in which case, it might take a while to get the device pulled.

  • Natural Selection (Score:4, Insightful)

    by alvinrod ( 889928 ) on Saturday June 23, 2018 @05:49PM (#56835382)
    Try to make something foolproof and the universe will make a better fool.
    • by bondsbw ( 888959 )

      I would rather not be on the same road as this device, thanks.

    • by lsllll ( 830002 ) on Saturday June 23, 2018 @06:38PM (#56835512)

      Try to make something foolproof and the universe will make a better fool.

      Damn, am I reminded of that on a daily basis ... Can you tell I'm a programmer?

    • by AmiMoJo ( 196126 )

      The really worrying thing is that there are lots of people on the various Tesla forums complaining that the new Autopilot update makes it "unusable" for them. It now checks much more often for hands on the wheel, about every 30 seconds or so. It used to let you go 15 minutes or more without hands on.

      Seems like quite a lot of people were really using it in an unsafe way before, and are now angry at Tesla even though its checks are still a lot laxer than every other manufacturer's.

      Scary stuff.

  • by Leuf ( 918654 ) on Saturday June 23, 2018 @05:57PM (#56835412)
    Because anyone that would pay $200 for a small magnet in a piece of plastic is too stupid to be trusted to drive themselves.
    • by E-Lad ( 1262 )

      Just hope that they don't demonstrate their stupidity by careening into you / anyone.

    • by AmiMoJo ( 196126 )

      You can actually just wedge a bit of fruit in the wheel, something like an orange. The weight is enough to make it think you are applying torque to the wheel.

    • by ledow ( 319597 )

      Does your car have a spoiler?

      I've seen people pay WAY more than that to fit an "after-market spoiler" to a car that's not even capable of generating the kind of airflow that would produce such an effect, nor of putting any aerodynamic effect to use.

      Don't even get me started on twin-exhausts and all kinds of other shite.

      I agree, those people shouldn't be allowed to drive themselves, just through sheer stupidity and misunderstanding of how their car works, but I would posit that "owning a Tesla" is a m

  • This makes sense to me. Communities (cities or states or whatever) seem to think they have the ability to determine whether self-driving vehicles are allowed on their streets. (This is not something I had thought about until the last couple of years.) Given that, the same governments should be allowed to determine whether the sale of a device that turns a vehicle into a self-driving vehicle is allowed.
  • They should be halting sales of any device's safety devices that can be defeated by this device. Is that enough devices?
    • Comment removed based on user account deletion
      • by HiThere ( 15173 )

        I'm sure someone could come up with some other use for it. But its primary purpose is specifically to defeat a device intended to protect the lives of people, many of whom are not the user. (If it only endangered the user of the device, I'd be OK with it.)

  • by fluffernutter ( 1411889 ) on Saturday June 23, 2018 @06:45PM (#56835540)
    This seems completely ironic to me. All this device does is suppress the warning to put the driver's hands on the wheel. It does not make Autopilot safer or any less safe. Nor is there any way to determine whether a person is using Autopilot properly without one of these devices. Prohibiting these products seems like a band-aid solution when the real problem is that Autopilot is so easily misused in the first place.
    • by dgatwood ( 11270 )

      The real problem is that instead of warning people to pay attention to the road around them, Tesla felt it necessary to treat their customers like children with a useless nag that is annoying as heck even when users are using the product precisely as intended. As a result, folks have come up with creative ways to work around the lack of a "Stop nagging me already" switch in the settings. If they ban this, folks will come up with something else. It won't stop until Tesla cars either have true FSD capabil

      • Re:Ironic (Score:5, Insightful)

        by fluffernutter ( 1411889 ) on Saturday June 23, 2018 @07:15PM (#56835654)
        The thing that is flawed is that Tesla doesn't seem willing to acknowledge common human traits. A car requires a human to interact with it properly in order not to kill or injure anyone. Humans have flaws, yet Tesla seems to think it can pick the ones it should feel liable for, even though these human flaws are well known and completely predictable. Tesla acknowledges that technology can augment a human to make them a better driver, yet fails to acknowledge that its technology brings out the flaw of poor reaction times when the driver is not completely engaged with the driving.
        • by Corbets ( 169101 )

          Being willing to acknowledge it is one thing.

          Believing that they should take responsibility for it, and take responsibility away from adults who have been licensed by the freaking state to drive, is another thing.

          • Is the state testing attention span as part of their driver licensing? If a person loses focus after, say, an hour of doing nothing, are they denying licenses?
      • What happened to SongCue?

        • by dgatwood ( 11270 )

          I got too busy with my actual job. I wouldn't want to touch it with a ten meter pole now, since it's all fairly close to the metal Xlib from almost two decades back.

  • If you watch the YouTube video [youtube.com], it's just a weight held by magnets. I could wrap rubber bands around the steering wheel and accomplish the same thing. Might as well send a cease-and-desist to YouTube for showing how to circumvent the system. And if I tell my friends how to do it, will I get a cease-and-desist letter too?

  • What if it just pulled over to the side of the road and stopped instead? The driver might be asleep or dead. Just disengaging could cause an accident.

    • Tesla's autopilot isn't capable of doing that, and a fully self-driving car is capable of taking its passenger someplace cleverer than the side of the road.

  • It's bad enough that "autopilot" lets the driver take their hands off the wheel for up to 30 seconds. But using a device to defeat even that limit is just plain stupid.
  • The sooner the people who think it's safe to not pay attention while driving are taken off the road the better.

    If that happens because their Tesla Autopilot drives out of the lane, speeds up, and crashes into a safety barrier while still accelerating, so be it.

"All the people are so happy now, their heads are caving in. I'm glad they are a snowman with protective rubber skin" -- They Might Be Giants
