Transportation

Consumer Reports Shows Tesla Autopilot Works With No One In the Driver's Seat (arstechnica.com) 288

Rei_is_a_dumbass shares a report from Ars Technica: Last Saturday, two men died when a Tesla Model S crashed into a tree in a residential neighborhood. Authorities said they found no one in the driver's seat -- one man was in the front passenger seat, while the other was in the back. That led to speculation that the car might have been under the control of Tesla's Autopilot driver-assistance system at the time of the crash. Elon Musk has tweeted that "data logs recovered so far show Autopilot was not enabled." Tesla defenders also insisted that Autopilot couldn't have been active because the technology doesn't operate unless someone is in the driver's seat. Consumer Reports decided to test this latter claim by seeing if it could get Autopilot to activate without anyone in the driver's seat. It turned out not to be very difficult.

Sitting in the driver's seat, Consumer Reports' Jake Fisher enabled Autopilot and then used the speed dial on the steering wheel to bring the car to a stop. He then placed a weighted chain on the steering wheel (to simulate pressure from a driver's hands) and hopped into the passenger seat. From there, he could reach over and increase the speed using the speed dial. Autopilot won't function unless the driver's seatbelt is buckled, but it was also easy to defeat this check by threading the seatbelt behind the driver. "In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn't tell if there was a driver there at all," Fisher wrote in a post on the Consumer Reports website.
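To make the test concrete: as described, Autopilot's driver checks reduce to two interlocks, a buckled seatbelt and resistance on the steering wheel, and CR satisfied both without a driver. Below is a minimal sketch of that logic; the class, field names, and torque threshold are illustrative assumptions, not Tesla's actual code.

```python
# Hypothetical reconstruction of the interlocks the CR test implies.
# All names and thresholds are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class CabinSensors:
    belt_buckled: bool     # latch switch only: can't tell if anyone is wearing it
    wheel_torque: float    # resistance at the wheel; a hung weight produces it too
    seat_occupied: bool    # known to the car, but (per CR) not consulted here

def autopilot_may_stay_engaged(s: CabinSensors) -> bool:
    """Autopilot stays active if the belt reads buckled and the wheel
    registers 'hands'. Note that seat_occupied is never checked."""
    hands_on_wheel = s.wheel_torque > 0.2  # a weighted chain clears this bar
    return s.belt_buckled and hands_on_wheel

# CR's bypass: belt threaded behind the empty seat, chain hung on the wheel.
bypassed = CabinSensors(belt_buckled=True, wheel_torque=0.5, seat_occupied=False)
assert autopilot_may_stay_engaged(bypassed)  # passes with nobody driving
```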

  • So? (Score:5, Insightful)

    by Frosty Piss ( 770223 ) * on Thursday April 22, 2021 @09:09PM (#61303120)

    Did they find a weighted chain in the car?

    • Re: So? (Score:5, Insightful)

      by klipclop ( 6724090 ) on Thursday April 22, 2021 @09:30PM (#61303178)
      You can set your cruise control in a regular car and (probably) get out of the driver's seat and let it go driverless until it hits something too.
      • Re: So? (Score:5, Funny)

        by Waffle Iron ( 339739 ) on Thursday April 22, 2021 @09:41PM (#61303208)

        Things get worse than that.

        On my 1967 Chevy Impala, I was able to light the JATO rockets without anybody at all being in the car.

        It was a good thing, too, because I doubt that anyone could have survived that crash.

      • Even in a car without cruise control, you can put a brick on the accelerator. Autopilot is a rather unconvincing scapegoat here.
        • Even in a car without cruise control, you can put a brick on the accelerator.

          Difference is that Musk isn't egging people on to do it.

          • How is Musk egging people on to bypass Tesla's safety controls?

            All official Tesla media, and even tweets from Musk, explain the limitations of Full Self-Driving and what it can currently do. And when you enable the feature, it tells you again that you need to stay engaged and ready to take control.

            What is really happening is just a lot of talk from Tesla fanboys who exaggerate how good the product is, just as Tesla haters exaggerate how bad it is.

          • by Brannon ( 221550 ) on Friday April 23, 2021 @01:19PM (#61305860)
            Elon Musk is involved with several different technology revolutions. People who have trouble adapting dislike him because they blame him for changing the world faster than they can cope. Then there are all the insecure people in tech fields who are just straight-up jealous that Elon Musk gets so much attention. They tell themselves that what they really don't like is "all the hype"--but the truth is that they feel insecure and self-conscious about their own level of success when they compare themselves to Musk. The only way to cope is to try to invalidate Musk's success by saying that "it's just hype", "he's just a snake oil salesman", "he's taking credit for other people's work", etc, etc.

            So there are large populations that just want Elon Musk to be a failure--and that dynamic leads to the clickbait mills generating content to feed that crowd. It's not that unusual. Pretty much the same people disliked Steve Jobs for the similar reasons--and pretty much every truly successful person has to deal with haters. It's a bit worse in the tech field in part because there are a lot of truly insecure people who follow tech closely.
      • by Junta ( 36770 )

        Yes, and this is a problem. There have been accidents, particularly in the era of internet videos, where people put their car into gear to idle forward, or set cruise control, and then get out of the driver's seat ("ghost riding").

        The more capable cruise control gets at making this "almost work, but not quite", the more important it is to have safeties around it. It's not like this is a huge ask if the system is already expected to recognize random pedestrians outside the vehicle and react appropriately,

    • Some of us who drive in icy weather keep tire chains in the car.

    • Re: So? (Score:2, Insightful)

      by jobslave ( 6255040 )
      No shit. I can trick hundreds of safety systems by rigging them in completely unintended ways, which totally proves that it was intentional and the human's fault, not the tech's. Someone was sitting in the driver's seat, and, like in thousands of drunk-driving accidents a year, the useless waste of air, resources and skin got out of the driver's seat and moved to the passenger seat after they realized they'd had an accident.
    • Maybe Mr. T was there. Was Mr. T there, fool?
    • by AmiMoJo ( 196126 )

      You can use a piece of fruit, a bean bag, a tool bought specially from Amazon for that exact task... Basically anything that can be wedged into the wheel. Or just lean over and rest your hand on the wheel.

      • Re:So? (Score:5, Insightful)

        by DarkOx ( 621550 ) on Friday April 23, 2021 @07:56AM (#61304426) Journal

        And exactly none of that matters. The question is what the control is designed to prevent. The answer is that it's intended to detect a lapse in driver attention.

        It is not designed to prevent a driver from deliberately using the product in a reckless or unsafe fashion. If the control were intended to prevent malicious tampering we might say it's ineffective, but it's not designed to do that; it's designed to determine whether I have dozed off, spent a little too long looking at my phone, etc. Even if wedging a banana in the wheel spokes is enough to thwart it, it's no longer the case that the safety control does not work; it's that the driver has made a conscious choice to disable it. At that point the responsibility shifts.

        I have a riding mower. I usually siphon whatever gas is left in the tank before I put it away at the end of the season, and I like to make sure the carburetor float bowl and lines are also empty of fuel. There is a safety sensor in the seat that prevents the engine from running when nobody is sitting on it. I defeat it by setting a cinder block or something else heavy on the seat while I let the engine burn off the remaining fuel (cutting deck disengaged, of course). This does not mean the safety system does not work. Conceivably the mower could slip into forward motion and run into or over something. I would not blame the manufacturer at that point; I am, after all, operating the product in an unintended fashion, having effectively disabled a safety system. It's now my responsibility to ensure I am monitoring it appropriately and nothing is in harm's way.

        Tesla's obligation should be to prevent accidents associated with ordinary use, i.e., you fall asleep at the wheel. It's not fair to expect them to deal with your deciding to intentionally fool the vehicle into engaging Autopilot without a driver present.

        • Tesla's obligation should be to prevent accidents associated with ordinary use, i.e., you fall asleep at the wheel. It's not fair to expect them to deal with your deciding to intentionally fool the vehicle into engaging Autopilot without a driver present.

          If you can fool the system without a driver present, then how could it possibly detect a driver that fell asleep?

  • Logs need to come from the state and not the manufacturer, as the manufacturer may be covering up issues that make what they are manufacturing unsafe.

    • by xalqor ( 6762950 ) on Thursday April 22, 2021 @09:24PM (#61303160)

      Send all those car logs to the state? Location, speed, audio, video, sensors... Thanks, but I'll be opting out of that abomination of a warrantless search.

      If the car has a "black box" device that stores only important events in a circular log file, and the state can get it with a warrant, it might work.

      • Agreed entirely.
        Modern cars do contain recent driving data in the ECU, and it can be pulled by the PD with a warrant.
        This has been used to prove criminal behavior after an accident.

        I too am not entirely comfortable with Tesla being the gatekeeper of that information. It should be retrievable from the car itself, rather than relying on Tesla's interpretation of the data uploaded to its servers.
        • This has been used to prove criminal behavior after an accident.

          No, it's been used to prove criminal behavior after a crash. If someone is driving criminally, and crashes, it's no accident. Language is important.

      • I may be wrong about this, but I don't think they would need a warrant to pull that data after an accident. And I'm fine with that, they're investigating what may be a crime scene. I sure as hell wouldn't be fine with a constant stream of data being sent to some bureaucrat.

        I wonder if an insurance company could require it as a condition of coverage.

  • Wait. (Score:5, Insightful)

    by mrsam ( 12205 ) on Thursday April 22, 2021 @09:15PM (#61303134) Homepage

    Ok, so someone's dead set on winning the Darwin Award, and it's the car's fault?

    I'm just trying to understand the argument here. I don't own a Tesla, there's only a slim chance I ever will, and I have no horse in this race. Even if the car can somehow be fooled into driving without a driver in the seat, why is that the car's fault? I could understand the problem if Teslas were prone to taking off by themselves while parked, or stopped, or whatever. But, as described, one has to intentionally and willingly go out of their way -- supposedly -- to defeat the existing safety measures which, as described, seem to be quite adequate.

    Then, if someone wins the Darwin Award anyway, it's their prize to keep, not the car's.

    • Re:Wait. (Score:5, Insightful)

      by The Evil Atheist ( 2484676 ) on Thursday April 22, 2021 @09:28PM (#61303170)
      It should be treated like drunk driving, or driving without a seatbelt. Everything is in place to discourage you from doing it, but ultimately you make the final decision whether you're going to obey. At which point, it's your bloody fault when anything happens.
      • It's an experimental system, and valuable intellectual property for Tesla if it becomes robust and reliable. Simply allowing every idiot to slander their beta self-driving system would be a mistake for now. Once these things are standard tech, maybe just let people be stupid, but probably not, because it means more death.
      • Re:Wait. (Score:4, Insightful)

        by AmiMoJo ( 196126 ) on Friday April 23, 2021 @02:30AM (#61303890) Homepage Journal

        Well, that's the problem: there are many things they could put in place to discourage it but didn't. For example, checking the weight on the driver's seat, which at least in Europe is mandatory for seatbelt warnings. The Model Y that they tested has an internal camera too, but it doesn't seem to be monitoring for the presence of a driver.

        The other issue is expectations. If someone puts a brick on the accelerator in a normal car, they are expecting to die; they know it will soon crash. Tesla is selling "full self driving", and there are numerous gushing videos on YouTube about how wonderful and reliable it is, using carefully selected footage where it didn't screw up for 5 minutes on easy roads. The most likely explanation in this case is that the owner was trying to show off full self driving to their friend, vastly over-estimating the car's capabilities.

        • there are many things they could put in place to discourage it but didn't. For example, checking the weight on the driver's seat, which at least in Europe is mandatory for seatbelt warnings. [...] vastly over-estimating the car's capabilities.

          The things they already put in place are enough, regardless of the advertising. All of them are decent enough reminders that this is still a manned vehicle. They shouldn't need further measures to prevent Autopilot from being activated. Even seatbelt warnings ultimately rely on people not being complete selfish dicks. Even a weight-triggered seatbelt warning is nothing more than a sign saying "we can't stop you from being an idiot, but use your fucking common sense."

          Personally, I thin

        • Re:Wait. (Score:4, Insightful)

          by gmack ( 197796 ) <gmack@innerfiCHEETAHre.net minus cat> on Friday April 23, 2021 @06:07AM (#61304182) Homepage Journal

          The most likely explanation in this case is that the owner was trying to show off full self driving to their friend, vastly over-estimating the car's capabilities.

          The flaw in your reasoning is that the car did not have Full Self-Driving installed.

      • Interesting proposition, but with drunk driving the decision is being shifted from the possibly intoxicated driver to the machine, at least for drunk driving offenders https://www.nj.com/news/2019/1... [nj.com]
    • Re:Wait. (Score:5, Insightful)

      by Octorian ( 14086 ) on Thursday April 22, 2021 @10:11PM (#61303334) Homepage

      Ok, so someone's dead set on winning the Darwin Award, and it's the car's fault?

      Because it's a Tesla, people are obligated to blame the car. It doesn't matter that it's something that, had it happened in any other make of car, wouldn't even be a newsworthy story.

    • Re:Wait. (Score:5, Insightful)

      by Anonymous Coward on Thursday April 22, 2021 @10:39PM (#61303428)

      I think you've missed the point. The car, while flawed in some ways, is not really what people have a problem with. What people have a problem with is the company that makes the car recklessly describing something that's not a real self-driving system as "self-driving". Repeatedly. In many forms of media, and with carefully set up demonstrations to make what limited automatic guidance systems exist look much better than they actually are.

      The man driving the RV who turns on cruise control and goes to make a pot of coffee while thundering down a highway at 55 is rightly called a fool and must own his mistake wholly. Incautious people who hear "self-driving" and take it at face value doing something like this are also fools, but they are only part owners in the foolishness.

      • Re: (Score:3, Informative)

        by aaarrrgggh ( 9205 )

        Just to be honest about terms, they use “Autopilot” and “Enhanced Autopilot” for the offerings today. They sell “Full Self Driving,” but it is not currently available to drivers.

        Autopilot works almost exactly like in a plane. It will happily keep going unless something goes wrong, at which point it disengages and the pilot must take over.

        They have promised full self-driving, or fully autonomous driving, for a very long time. They have thus far failed to deliver, or to be m

        • Re:Wait. (Score:5, Insightful)

          by The MAZZTer ( 911996 ) <megazztNO@SPAMgmail.com> on Friday April 23, 2021 @02:23AM (#61303876) Homepage
          I think the real problem is that people think of the Hollywood "autopilot", which is essentially full self-driving. While it may meet the dictionary definition, it's a bad name. That said, most of the complaints I see are clearly not made in good faith, as they could apply to other cars as well. For example, you can easily get any car to drive without someone in the driver's seat by weighing down the gas pedal. As a bonus, it will probably crash sooner than a Tesla, since at least the Tesla will try to avoid obstacles.
          • by Musical_Joe ( 1565075 ) on Friday April 23, 2021 @06:14AM (#61304192)

            I always try to see both sides of an argument, but here I'm struggling. Tesla call their system "autopilot". As the grandparent post points out, it works in a very similar way to the autopilot on a plane, which we must presume it is named after. Are people so stupid as to honestly believe that "autopilot" in ANY circumstance, but for this example specifically on a plane, means "no human intervention required at any point at all, ever"? Are they so stupid that they think the job of a pilot, a (generally) well-paid, respected job that requires years of training to achieve, is simply to press a button labelled "on" so that the plane takes off, flies to its destination, and lands of its own accord, whereupon everyone gets out and the pilot presses the "off" button?

            You say "Hollywood", but I can't think of a Hollywood film (set in the present day) where the pilot's job is purely to switch on the autopilot, so I don't think you can even blame Hollywood for this idiocy. Just think about it for a moment: how utterly ridiculous a proposition this is. Yet people are saying "autopilot implies the car needs no human intervention whatsoever" as if this preposterous belief were somehow not indicative of a mental deficiency so extreme one should be hospitalised. $100k a year for pressing a button twice. God knows what these idiots think a co-pilot does. Maybe his job is to point out which is the "on" button and which is the "off" button in case the main pilot is blind?

            Yet even here on Slashdot there are people arguing that "autopilot" implies no human interaction. Occam's razor says to me that no one past the age of a toddler could truly think such a stupid thought, so actually people must just be saying it because they think Elon Musk is a tit. Which he may well be, of course...

            • Re:Wait. (Score:4, Informative)

              by Ed Tice ( 3732157 ) on Friday April 23, 2021 @06:36AM (#61304246)
              Yes, people believe that, as they rightly should. Although early aviation "auto-pilot" systems were pretty limited, modern aircraft are very capable. We still use pilots even though, for the most part, the planes can fly themselves. The US military operates pilotless planes (they are remote controlled, of course). Instrument landing systems are so good that the planes can land themselves. Now that commercial aircraft all have GPS, there is constant chatter about maybe eliminating pilots. But no airplane manufacturer talks about "full self flying" (that I know of).

              Given that the flight systems on modern aircraft are more capable than what's on a Tesla, and those are still called "auto-pilot," it's reasonable to think that a Tesla has similar levels of capability. I've heard plenty of people (without a technical background) talking about how they wish they could buy a Tesla so that they could take a nap while "driving" their kids to school.

              That's not entirely unreasonable. There was an incident a few years ago where some pilots fell asleep in the cockpit and flew hundreds of miles past their destination airport. They only woke up when a passenger noticed and asked a flight attendant who called the cockpit. All they got was a letter of reprimand since it wasn't that dangerous. So yeah it's not a logical leap to say that if airplane pilots can put on the auto-pilot and fall asleep, so could a car driver with similar equipment.

              The Tesla defenders here always talk about what "auto-pilot" used to mean in the 1960s. Lots of words used to have different meanings. Elevator is not a brand name anymore. And auto-pilot is used (by those not in aviation) to refer to the full range of automation available on the latest commercial jetliners. Sorry but in modern language, "auto-pilot," "full self driving," and "You can climb into the back seat and fall asleep, the car will do everything" are synonymous.

    • It's not an argument, it's just somebody showing it's easy to bypass the safety features on an experimental self-driving car, and Tesla should probably improve that, if only for the sake of PR... oh, and maybe human lives... but DEFINITELY PR!!
    • The issue is not that the guy died. The issues here are (1) Tesla is lying, and (2) autopilot is not ready.

    • Re:Wait. (Score:5, Insightful)

      by Actually, I do RTFA ( 1058596 ) on Thursday April 22, 2021 @11:52PM (#61303602)

      The point of the story isn't that it's the car's fault. It's that Musk came out and said it was impossible because that's not how the car was designed. Just like I reject bug reports with "that's not how the software was designed" even if it's showing up in production.

      If the issue was someone abusing a Tesla, and Tesla's reaction was "that's retarded, don't do that" it would be one thing. But Tesla's reaction was "you cannot do that."

      To use a non-car analogy (oh, how the wheel has turned): if a gun is used to kill a police officer, I'm not going to hold the gun manufacturer guilty (feel free to spout off about a very vocal, very small minority saying something different to attempt to derail the conversation). If the gun manufacturer says "that's impossible, our guns are too smart to shoot police officers," I'm going to hold them responsible, or at least mock them. But if they said "our police-detecting code in our gun cannot recognize deputies, but it's not supposed to be pointed at people, that's just a failsafe," I wouldn't. Make sense?

    • Re:Wait. (Score:5, Interesting)

      by tlhIngan ( 30335 ) <slashdot&worf,net> on Friday April 23, 2021 @02:20AM (#61303868)

      The argument is not "Can you use Autopilot without a driver?"; it's "Is Tesla's driver-monitoring system any good?" Or, "Can you fake a driver paying attention?"

      That's the main result of the test. Driver-monitoring systems in other cars use a seat sensor and a driver-facing camera to detect the driver and see if he's paying attention.

      Tesla's uses a weight-detection system on the steering wheel, and that's it.

      That's the purpose of the test. The test doesn't answer the question "why was there no driver in the vehicle" - Tesla and the NTSB can fight that one out. All Consumer Reports has shown is that Tesla's autonomous driving, which requires driver attention, doesn't actually check whether the driver is paying attention, or even have the ability to check. It's just looking for a weight on the steering wheel.

      Whether it's a serious safety problem will be determined by the NTSB and the like. The NTSB could find the system flawed in that way but still adequate, making the CR test meaningless, or it could decide the check is insufficient and force Tesla to fix the issue somehow.
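A rough sketch of the contrast the comment above draws, with invented sensor names and a made-up fusion rule, purely to show why a single signal is easier to fake than several:

```python
# Invented names and thresholds; illustrates the parent comment's point only.
def wheel_only_attentive(wheel_torque: float) -> bool:
    # The single check described above: resistance on the steering wheel.
    return wheel_torque > 0.2

def multi_signal_attentive(wheel_torque: float, seat_occupied: bool,
                           eyes_on_road: bool) -> bool:
    # Seat sensor + driver-facing camera + wheel, as in some other makes.
    return wheel_torque > 0.2 and seat_occupied and eyes_on_road

# A weighted chain defeats the single check but not the other two signals.
print(wheel_only_attentive(0.5))                  # True  (fooled)
print(multi_signal_attentive(0.5, False, False))  # False (empty seat caught)
```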

  • Good job (Score:4, Insightful)

    by knoworiginality ( 1361077 ) on Thursday April 22, 2021 @09:16PM (#61303136)
    Good job proving that stupid people can misuse technology to do stupid and dangerous things. Excellent investigative reporting there, guys. Please keep your toasters out of the bathroom.
    • Re:Good job (Score:5, Insightful)

      by The Evil Atheist ( 2484676 ) on Thursday April 22, 2021 @09:31PM (#61303182)
      It's still important to understand how easy something is to bypass, and whether it matches with manufacturer claims.
      • Re: (Score:2, Insightful)

        It is trivially easy to jump in front of a train. So should we install idiot-proof, impossible-to-bypass systems to prevent it?

        We start with the assumption that people are not suicidal maniacs. If some Tesla or non-Tesla driver is indeed a suicidal maniac, the problem is with the maniac, not with the engineer who designed some reasonable system that works for most normal people.

        • Re:Good job (Score:5, Insightful)

          by mrclevesque ( 1413593 ) on Thursday April 22, 2021 @10:18PM (#61303366)

          It's still important to understand how easy something is to bypass, and whether it matches with manufacturer claims.

          • by Firethorn ( 177587 ) on Thursday April 22, 2021 @10:36PM (#61303418) Homepage Journal

            Personally, I think that as a general standard, if something is harder to bypass than, say, opening a combination lock, then you've met the "deterrent" level.

            Going by what Consumer Reports said, while they called it "easy", we're still looking at a multi-step process, requiring some additional equipment (the weighted chain), that might not be immediately obvious. I mean, they have to place the chain to fool the car into thinking there are hands on the wheel, buckle the seatbelt despite there being nobody in the seat, maybe place a weight on the seat if occupancy is checked, etc.

            The more detection systems you put in there, the more expensive it gets for the additional equipment and programming, and the more fragile it tends to get. For example, imagine that we put a face-detection system in there to detect whether there's a driver in the seat - now what happens when it comes out that the face detector is "racist", in that it isn't as good at detecting, say, Asian faces and, going by all the reports I've seen, absolutely sucks at seeing the faces of black people?

            Things like riding lawnmowers merely detect a certain minimum weight in the seat for their safety systems for a reason.

            That said, I know I'm annoyed by how little weight it takes to set off the "fasten seatbelt" indicator for the passenger side. I mean, my tablet has been known to set the thing off, and that's light enough to one-hand all over the place, much less my backpack with a laptop - still light enough to easily swing into the passenger seat, but I do have to be careful or the bloody car gets insistent that the non-existent passenger buckle up. Which leads to things like permanently bypassed sensors.

          • Great.

            I'm not saying Tesla is at fault here - I'm saying it's useful to know how a system works. Without this test, for example, we wouldn't know that bypassing it required only these things. And who knows, maybe someone else will take it from there and discover even easier ways of tricking Autopilot. Either we find out that there's a base level of effort required to bypass Autopilot's checks, or we find out how easy it is to fool. This information is still GOOD TO KNOW to judge future incidents and maybe
          • This is literally the same comment as this [slashdot.org]

            To which I'm going to reply the same thing I already did [slashdot.org]:

            I'm actually pissing away 4 mod points that I've already spent in this discussion to reply to this.

            It's still important to understand how easy something is to bypass, and whether it matches with manufacturer claims.

            Well, technically, according to the reports, it's more difficult to bypass the Tesla than it is to install Linux on a PC that comes preinstalled with Windows.

            Please, do name something that's intended to be used by humans and is *more* difficult to bypass than the Tesla autopilot. Most things I know of (microwaves, X-ray machines, heavy machinery, etc.) have an interlock of some kind, which is essentially a simple switch that you can fake-lock using nothing but a piece of chewing gum; or a two-hand switch (e.g. for metal sheet folders), which you can bypass by putting your backpack on the other-hand switch. Seat belt warning? Just plug the seat belt in and route the strap behind the seat instead of across your front. Motorcycle foot interlock? It's usually a small switch below the engine block; use a plastic tie or a shoelace. Elevator door? Move your knee and don't block the light beam; also, depending on the model, sometimes there's a physical button hidden somewhere in the door frame, which you can keep pressed using your finger. Two-hand lever hydraulic press? Use your shoelaces. Two hands and a pedal? Shoelaces and a brick.

            Or duct tape.

            Now that I think of it, in fact I haven't come across *any* usage-security mechanism that I couldn't have bypassed using duct tape and a ballpoint pen. And I've seen my fair share, including but not limited to particle accelerators, high-energy X-ray sources, cryostats, lasers, industrial manufacturing, the chemical industry, various R&D facilities, hydroelectric power facilities... you name it.

            Actually, Tesla's is rather difficult, because you need to come up with the "dial the speed down to zero but don't deactivate the autopilot" bit. Not everyone has the brains for it, given that it took several days and a Consumer Reports test to point it out.

            Hope this helps. You're welcome.

      • I'm actually pissing away 4 mod points that I've already spent in this discussion to reply to this.

        It's still important to understand how easy something is to bypass, and whether it matches with manufacturer claims.

        Well, technically, according to the reports, it's more difficult to bypass the Tesla than it is to install Linux on a PC that comes preinstalled with Windows.

        Please, do name something that's intended to be used by humans and is *more* difficult to bypass than the Tesla autopilot. Most things I know of (microwaves, X-ray machines, heavy machinery, etc.) have an interlock of some kind, which is essentially a simple

    • Definitely keep the toasters out of the bathroom... But GFI still exists for a reason.
      It needs to be reasonably difficult for people to win Darwin Awards. The state has a reasonable interest in people staying alive.
    • Yeah, agreed - this is some dumb shit. "Well, ackshually, if you purposefully try real hard and explicitly attempt to defeat the system, you can put yourself in a dangerous situation!". Yeah, thanks you dumb fucks.

      I'm not up Tesla's ass by any means; in fact, I think they'll be defunct when real carmakers start cranking out better, cheaper EVs. However, it's a little bizarre to take your TDS (Tesla Derangement Syndrome, natch) to this level. No shit it can work with no one in the driver's seat if you intentio

  • Threat model (Score:5, Insightful)

    by xalqor ( 6762950 ) on Thursday April 22, 2021 @09:16PM (#61303138)

    Is the safety feature (not allowing Autopilot without a driver present) supposed to protect against silly mistakes like "oh, it's not meant for that", or is the driver considered a malicious actor here?

    I'm not personally ready to try any of this autopilot stuff, but it seems to me that if a driver takes deliberate steps to disable or circumvent a safety feature that works fine under normal conditions, the manufacturer should not be liable for what happens next.

    • by jaa101 ( 627731 )

      What about a case where someone loads a full-auto-drive car with a car bomb? We have to face up to the fact that this technology will have serious criminal and terrorist applications. Bypassing any safety features is always going to be easier than creating your own auto-drive system.

      • Well, the first problem is that "most" land-based targets are protected against car bombs these days; you'll find them hard to get to with a vehicle, due to things like big rocks and concrete planters in your path.

        Second, if you can make a car bomb these days that you can remote drive into something, you can probably make a drone to do the same thing, which avoids the blockers. Or you can go the mythbusters route and just set up the car to be remote controlled the old fashioned way.

        It only needs to work on

  • Nothing. Are technology and/or the companies that make it responsible for people going out of their way to get into dangerous situations? Maybe all the cell phone makers should be held responsible for all the people who die while trying to take amazing selfies?

  • What next (Score:4, Insightful)

    by AlanObject ( 3603453 ) on Thursday April 22, 2021 @09:24PM (#61303162)

    It really sounds like an effort to find some way to blame Tesla for something.

    They say it "wasn't very hard" to defeat the safety. Does that mean that they are saying is if it was "very hard" to defeat the safety and the Darwin-award guy went and did it anyway then it wouldn't have been Tesla's fault?

    What BS. Next they will blame Tesla because the in-cabin camera didn't detect that there was nobody there. If it did then they would write sober screeds about privacy violation.

    • by Octorian ( 14086 )

      What BS. Next they will blame Tesla because the in-cabin camera didn't detect that there was nobody there.

      And if Tesla did use the in-cabin camera for that, they'd then move the goalposts and complain about how they were easily able to fool it with an inflatable dummy.

      • The inflatable dummy is the autopilot.
      • Re:What next (Score:5, Informative)

        by Ed Tice ( 3732157 ) on Friday April 23, 2021 @06:40AM (#61304250)
        Uh, no. The goalpost was set by Tesla with the public claim that it was impossible. If Tesla makes another public claim along the lines of "The most realistic dummy in the world won't fool our full self-driving auto-pilot," that would move the goalposts. When evaluating the claims of a speaker, it's the one making the claim who sets the goalposts, which Tesla did very foolishly here.
    • It really sounds like an effort to find some way to blame Tesla for something.

      Well, Tesla is making claims about things being "impossible". And people are testing those claims.

      One claim is that Autopilot can't engage on an unmarked road. And then someone showed it doing that.

      There was another claim that it is impossible for Autopilot to drive the car without a driver in the driver's seat. This was testing that claim, and finding it to be false.

      That doesn't mean Tesla is at fault for that particular accident. It means they're making false claims.

  • by godrik ( 1287354 ) on Thursday April 22, 2021 @09:42PM (#61303210)

    I mean, if you need to go through so many steps to trick the car into self-driving mode without a driver, is it really the manufacturer's fault?

    • Yes, this sounds more like "If you take uranium out of the ground and refine it and put it in a perfect hollow sphere and surround it with perfectly shaped explosives and perfectly timed detonators, you get the first stage of a nuclear bomb... So clearly, the ground is the murderer!"

      But then... how do you explain nobody being in the driver's seat and the car being at high speed? Got any explanations that aren't even less plausible?
      Because you know that Sherlock Holmes quote... :)

      • how do you explain nobody being in the driver's seat and the car being at high speed? Got any explanations that aren't even less plausible?

        Well, sitting in the back seat while poking the accelerator with a stick should work on just about any car produced in the last 80 years or so.

      • But then... how do you explain nobody being in the driver's seat and the car being at high speed? Got any explanations that aren't even less plausible?
        Because you know that Sherlock Holmes quote... :)

        I have several, but there's not enough public information to settle on one or another.

        One would be that the driver *was* in the driver's seat, got out and went away, leaving the other two to burn.

        Another would be that the doors were jammed owing to the accident, leaving the other two climbing through the car to reach another door, until they burned.

        Another one would be that one of them didn't have their seatbelt on and was thrown to another seat. Just search for "car accident filmed from inside" on

    • No, this isn't Tesla's fault - idiots being idiots got themselves killed.

      What this is, however, is a warning to people who make loud claims about "something not being possible" in defence of their baby, only to see most of those claims disproven within the next week (it's possible to engage Autopilot without being in the driver's seat by circumventing most of the basic checks, and it's also possible to get Autopilot to engage on a road on which Tesla says it shouldn't be possible to engage it...).

      I'm going to

      • No, this isn't Tesla's fault - idiots being idiots got themselves killed.

        Personally, I care less about "nobody in the driver's seat" and more about "crashed into a tree at a high enough speed to be fatal". That takes work with a Tesla, from what I've heard. Matter of fact, it should be nearly impossible with the collision-avoidance systems I've heard they have. Unless the idiots managed to Darwin themselves by turning those off as well, somehow.

        Of course, if they were playing shenanigans with driving the car with nobody in the driver's seat, odds are they didn't have their seatbelts

      • by ghoul ( 157158 )
        Tesla sells to customers who can be fooled by fake media. SpaceX sells to governments and the military, who are smarter than the average customer. Hence Tesla-related fake news needs to be defended against more aggressively.
    • by nnull ( 1148259 )

      You can do the same thing with the VW Touareg. You put a clip or a water bottle on the steering wheel and now you have auto-drive. There are even YouTube videos of people abusing this.

      Why don't they pick on VW about this trick that has been around for the last couple years already?

  • Yeah...and? (Score:4, Insightful)

    by erp_consultant ( 2614861 ) on Thursday April 22, 2021 @09:44PM (#61303216)

    It's a nice party trick, I suppose, tricking the Tesla into thinking someone is in the driver's seat to get the autopilot to engage. But what sort of idiot would put their life in danger to do this? The kind of idiot with a huge ego who wants to impress their friend with dangerous stunts. They ran into a tree, but it could just as easily have been a minivan with three kids and their mom inside.

    The driver, so to speak, has full responsibility for this accident. The auto-pilot system is not fully autonomous and they knew this. That's why the system shuts off if you don't touch the wheel.

    The media of course is focusing on the huge fire in the aftermath but the reality is it was just a couple of dumb rich fucks acting recklessly.

    • by nnull ( 1148259 )

      You mean like this:

      https://www.youtube.com/watch?... [youtube.com]

    • by Tom ( 822 )

      But what sort of idiot would put their life in danger to do this? The kind of idiot with a huge ego who wants to impress their friend with dangerous stunts.

      Yes. I wonder if there was a GoPro or a mobile phone found in the wreckage which has the whole thing on video. Because they probably wanted to upload it to TikTok or whatever the "cool" kids die for today.

  • None of the high-rises in the USA are designed to prevent people from jumping out of windows. There is usually a door, but it can be easily opened.

    Get your intern to test the hypothesis and report.

    Note to self: ask daughter not to apply for any internships in Consumer Reports' architecture-testing division.

    • > None of the high-rises in the USA are designed to prevent people from jumping out of windows

      Many are. Roof access is limited, there is a fence around the roof, and the upper windows open a very limited amount.

      • Many are. Roof access is limited, there is a fence around the roof, and the upper windows open a very limited amount.

        Yeah, but going by CR's definition of "easy", bringing a lockpick to open the door or a heavy hammer to break the window open wider would still count as "easy".

    • by ghoul ( 157158 )

      Note to self: ask daughter not to apply for any internships in Consumer Reports' architecture-testing division.

      In all fairness, that would depend on how irritating a teenager your daughter is.

  • by Malays2 bowman ( 6656916 ) on Thursday April 22, 2021 @10:09PM (#61303318)

    Yes, I knew it: they gamed the system and decided to win that Darwin Award.

    Tesla did everything they could to stop the driver from pulling a stunt like this, and while their driverless AI is good, it's far from perfect. I never heard Tesla state otherwise.

      The only reason this is news is that it involves a Tesla. Had somebody done something like this using weights/chains etc. in a regular car, it would've been forgotten about already.

    • I want to mention that safety interlocks like this have been around for many decades. There were cars that would refuse to start if the seatbelt wasn't buckled. So the drivers of those cars simply buckled the seatbelt before sitting down, and then sat on it.

        There will always be people who will bypass these kinds of safety systems.

      • So the drivers of those cars simply buckled the seatbelt before sitting down, and then sat on it.

        And they're the reason why, for a few years, we had those cars where the shoulder belt would open/close automatically, but you still had to fasten a lap belt to be truly safe. Because they figured that the people who'd simply buckle the belt behind them would leave the auto-belt alone.

        That said, I knew of such morons who, at that point, would either cut the belt and put the clip into the buckle, or just buy a 2nd clip and put that in.

  • So Many Problems (Score:3, Insightful)

    by lazarus ( 2879 ) on Thursday April 22, 2021 @11:53PM (#61303604) Journal

    Let's start with the tweet by Musk quoted above, which was cherry-picked from the entire tweet, which said:

    "Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.
    Moreover, standard Autopilot would require lane lines to turn on, which this street did not have."

    So many people commenting on this story seem to be confusing Autopilot with Full Self Driving (FSD). The car didn't have FSD, so it wasn't autonomously driving itself around (setting aside the creativity of Consumer Reports and their clever use of an iron chain). Autopilot is glorified cruise control. In fact, the only thing it has on just about every other modern adaptive cruise control on the market is that it will also steer the car to keep you in your lane (if you keep your hands on the wheel). If, as Musk said, your road has lane lines. Which this road didn't. Note that you CAN'T TURN IT ON without lane lines, and they crashed "a few hundred yards down the road".

    So Autopilot was quite likely not on (the logs don't show it on, and it wouldn't have worked very well if it was). And the car didn't have FSD. So one of two things happened here:

    1. The owner started out driving the car from the front seat, got up to speed, tried to turn Autopilot on on a road with no lane lines (which likely failed to activate), and then jumped in the back seat to show his friend how great his car was. At that point the car likely started screaming bloody murder at him (you have to experience this to understand how jarring that fucking sound is). With Autopilot NOT ON, the car, which would have been slowing down just through regenerative braking, flew off the road, hit a tree, and burst into flames (probably while the stupid owner was trying to get back into the front seat).

    2. The owner mashed the go pedal, lost control (it is truly impressive acceleration), smashed into a tree, and ended up in the back seat as a result of the violence of the collision (have you seen the pictures?). Was he wearing his seatbelt? "Yes, officer, we found the charred remains of the seatbelt wrapped around his left ankle which was located three blocks away."

    I think #1 is more likely, but #2 is totally not out of the question. I appreciate Constable Mark Herman's 35 years of service, but until a coroner makes the assessment of the remains, I don't think we can discount it.

    As far as CR is concerned... I'm pretty sure I could convince my dumb-as-dirt F150 to drive itself into a tree in exactly the same manner. Get up to speed, set the cruise control, and then, like, hang out in the passenger seat while the car drives off the road and into a tree.

  • by hoofie ( 201045 ) <mickey@[ ]se.com ['mou' in gap]> on Friday April 23, 2021 @12:47AM (#61303692)

    It's surely comparable to taking a dangerous piece of machinery, making the effort to bypass all the guards and interlocks, and then sticking your head in it.

    Stupid? Yes.
    Fault of the manufacturer? No.

  • "but it also couldn't tell if there was a driver there at all"

    All that engineering in a Tesla, and yet they overlooked the value of a simple weight sensor in the driver's seat? How long have we been disabling/enabling critical safety features like passenger airbags with that old design?

    Yes, I understand weight sensors may have been purposely omitted for other features (perhaps self-parking mode), but any mode that enables the car to move without a driver detected should be limited to parking speed unless all other (Elon-claimed) safety factors are met (detected

  • by Tom ( 822 ) on Friday April 23, 2021 @04:40AM (#61304042) Homepage Journal

    As we all know, you can't idiot-proof anything because as soon as you do, the world invents a better idiot.

    So yeah, it can be defeated. So can anything else they put in place. But you have to do it INTENTIONALLY and fully aware that you are circumventing restrictions in order to make the system do something it shouldn't be doing.

  • by nospam007 ( 722110 ) * on Friday April 23, 2021 @05:16AM (#61304090)

    Just like my bank.

    You just have to make a hole in the three-foot concrete wall, disable the alarms, disconnect the phone lines and cameras, steal the hard disks, open the safe with a plasma lance, and steal all the money.

    No security at all, those bastards.

  • by nagora ( 177841 ) on Friday April 23, 2021 @06:25AM (#61304230)

    It seems bizarre to me that a car filled with cameras and "AI" to determine where to drive and what's on the road doesn't have a basic camera pointing at the driver's seat to determine if there's anyone there. 2021 on the outside; 1930 on the inside.

  • by sjames ( 1099 ) on Friday April 23, 2021 @04:46PM (#61306666) Homepage Journal

    Obligatory XKCD [xkcd.com]

    And a look at How it works [xkcd.com]
