Consumer Reports Shows Tesla Autopilot Works With No One In the Driver's Seat (arstechnica.com) 288
Rei_is_a_dumbass shares a report from Ars Technica: Last Saturday, two men died when a Tesla Model S crashed into a tree in a residential neighborhood. Authorities said they found no one in the driver's seat -- one man was in the front passenger seat, while the other was in the back. That led to speculation that the car might have been under the control of Tesla's Autopilot driver-assistance system at the time of the crash. Elon Musk has tweeted that "data logs recovered so far show Autopilot was not enabled." Tesla defenders also insisted that Autopilot couldn't have been active because the technology doesn't operate unless someone is in the driver's seat. Consumer Reports decided to test this latter claim by seeing if it could get Autopilot to activate without anyone in the driver's seat. It turned out not to be very difficult.
Sitting in the driver's seat, Consumer Reports' Jake Fisher enabled Autopilot and then used the speed dial on the steering wheel to bring the car to a stop. He then placed a weighted chain on the steering wheel (to simulate pressure from a driver's hands) and hopped into the passenger seat. From there, he could reach over and increase the speed using the speed dial. Autopilot won't function unless the driver's seatbelt is buckled, but it was also easy to defeat this check by threading the seatbelt behind the driver. "In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn't tell if there was a driver there at all," Fisher wrote in a post on the Consumer Reports website.
So? (Score:5, Insightful)
Did they find a weighted chain in the car?
Re: So? (Score:5, Insightful)
Re: So? (Score:5, Funny)
Things get worse than that.
On my 1967 Chevy Impala, I was able to light the JATO rockets without anybody at all being in the car.
It was a good thing, too, because I doubt that anyone could have survived that crash.
Re: So? (Score:2)
Jamie, is that you!?
Re: (Score:2)
Re: (Score:3)
Even in a car without cruise control, you can put a brick on the accelerator.
Difference is that Musk isn't egging people on to do it.
Re: (Score:3)
How is Musk egging people on to bypass Tesla's safety controls?
All official Tesla media, and even tweets from Musk, explain the limitations of Full Self-Driving and what it can currently do. When you enable the feature, it tells you again that you need to stay engaged and ready to take control.
What is really happening is a lot of talk from Tesla fanboys who exaggerate how good the product is, just as Tesla haters exaggerate how bad it is.
The difference: insecure people dislike Elon Musk (Score:4, Insightful)
So there are large populations that just want Elon Musk to be a failure -- and that dynamic leads the clickbait mills to generate content feeding that crowd. It's not that unusual. Pretty much the same people disliked Steve Jobs for similar reasons -- and pretty much every truly successful person has to deal with haters. It's a bit worse in the tech field, in part because a lot of truly insecure people follow tech closely.
Re: (Score:3)
Yes, and this is a problem. There have been accidents, particularly in the era of internet videos, where people put their car into gear and idle forward, or set cruise control, and then get out of the driver's seat ("ghost riding").
The more capable cruise control gets at making this 'almost work, but not quite', the more important it is to have safeties around it. It's not like this is a huge ask, if the car is already expected to recognize random pedestrians outside the vehicle and somehow react appropriately,
Re: (Score:2)
Some of us who drive in icy weather keep tire chains in the car.
Re: So? (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2)
You can use a piece of fruit, a bean bag, a tool bought specially from Amazon for that exact task... Basically anything that can be wedged into the wheel. Or just lean over and rest your hand on the wheel.
Re:So? (Score:5, Insightful)
And exactly none of that matters. The question is what the control is designed to prevent. The answer: it's intended to detect a lapse in driver attention.
It is not designed to prevent a driver from deliberately using the product in a reckless or unsafe fashion. If the control were intended to prevent malicious tampering, we might say it's ineffective, but it's not designed to do that; it's designed to determine if I have dozed off, have spent a little too long looking at my phone, etc. Even if wedging a banana in the wheel spokes is enough to thwart it, it's no longer a case of the safety control not working; the driver has made a conscious choice to disable it. At that point the responsibility shifts.
I have a riding mower. At the end of the season, I usually siphon out whatever gas is left in the tank before I put it away, and I like to make sure the carburetor float bowl and lines are also empty of fuel. There is a safety sensor in the seat that prevents the engine from running when nobody is sitting on it. I defeat it by setting a cinder block or something else heavy on the seat while I let the engine burn off the remaining fuel (cutting deck disengaged, of course). That doesn't mean the safety system doesn't work. Conceivably the mower could slip into forward motion and run into or over something. I would not blame the manufacturer at that point; I am, after all, operating the product in an unintended fashion, having effectively disabled a safety system. It's now my responsibility to monitor it appropriately and make sure nothing is in harm's way.
Tesla's obligation should be to prevent accidents associated with ordinary use, i.e., you fall asleep at the wheel. It's not fair to expect them to deal with you deciding to intentionally fool the vehicle into engaging Autopilot without a driver present.
Re: (Score:3)
Tesla's obligation should be to prevent accidents associated with ordinary use, i.e., you fall asleep at the wheel. It's not fair to expect them to deal with you deciding to intentionally fool the vehicle into engaging Autopilot without a driver present.
If you can fool the system without a driver present, then how could it possibly detect a driver that fell asleep?
Re: So? (Score:3, Interesting)
Re: (Score:2)
That sounds really dangerous. Somebody could get hurt.
Uh, I kind of doubt any simulation replicating the actual disaster, where two passengers lost their lives, would be necessary.
You are testing for a lack of road markings. Do it in the damn desert with a pseudo-road and an empty car.
Re: So? (Score:4, Informative)
You are testing for a lack of road markings. Do it in the damn desert with a pseudo-road and an empty car.
We already know it works without road markings:
https://www.youtube.com/watch?... [youtube.com]
Of course the guy says it's impressive, which it kind of is, until it loses track of the road and plows into a tree
Re: (Score:3)
Four more astronauts this morning, and on "pre-owned" hardware. Eat it, Luddites.
logs need to come from the state and not manufactu (Score:2, Insightful)
Logs need to come from the state and not the manufacturer, as the manufacturer may be covering up issues that make what they are manufacturing unsafe.
Re:logs need to come from the state and not manufa (Score:5, Insightful)
Send all those car logs to the state? Location, speed, audio, video, sensors... Thanks but I'll be opting out of that abomination of warrantless search.
If the car has a "black box" device that stores only important events in a circular log file, and the state can get it with a warrant, it might work.
Re: (Score:3)
Modern cars do contain recent driving data in the ECU, and it can be pulled by the PD with a warrant.
This has been used to prove criminal behavior after an accident.
I too am not entirely comfortable with Tesla being the gatekeeper to that information. It should be retrievable from the car itself, rather than as Tesla's interpretation of the data uploaded to its servers.
Re: (Score:2)
No, it's been used to prove criminal behavior after a crash. If someone is driving criminally, and crashes, it's no accident. Language is important.
Re: (Score:3)
I wonder if an insurance company could require it as a condition of coverage.
Wait. (Score:5, Insightful)
Ok, so someone's dead set on winning the Darwin Award, and it's the car's fault?
I'm just trying to understand the argument here. I don't own a Tesla, there's only a slim chance I ever will, and I have no horse in this race. Even if the car can somehow be fooled into driving without a driver in the seat, why is that the car's fault? I could understand the problem if Teslas were prone to taking off by themselves while parked, or stopped, or whatever. But, as described, one has to intentionally and willingly go out of their way -- supposedly -- to defeat the existing safety measures, which, as described, seem to be quite adequate.
Then, if someone wins the Darwin Award anyway, it's their prize to keep, not the car's.
Re:Wait. (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Re:Wait. (Score:4, Insightful)
Well that's the problem, there are many things they could put in place to discourage it but didn't. For example checking the weight on the driver's seat, which at least in Europe is mandatory for seatbelt warnings. The Model Y that they tested has an internal camera too, but it doesn't seem to be monitoring for the presence of a driver.
The other issue is expectations. If someone puts a brick on the accelerator in a normal car, they are expecting to die; they know it will soon crash. Tesla is selling "full self driving," and there are numerous gushing videos on YouTube about how wonderful and reliable it is, using carefully selected footage where it didn't screw up for 5 minutes on easy roads. The most likely explanation in this case is that the owner was trying to show off full self driving to their friend, vastly over-estimating the car's capabilities.
Re: (Score:2)
there are many things they could put in place to discourage it but didn't. For example checking the weight on the driver's seat, which at least in Europe is mandatory for seatbelt warnings.
.
.
.
vastly over-estimating the car's capabilities.
The things they already put in place are enough, regardless of the advertising. All the measures they put in place are decent enough reminders that this is still a manned vehicle. They shouldn't need further measures to prevent Autopilot being activated. Even seatbelt warnings ultimately rely on people not being complete selfish dicks. Even a weight-based seatbelt warning is nothing more than a sign saying "we can't stop you from being an idiot, but use your fucking common sense."
Personally, I thin
Re: (Score:2)
Using random members of the public for this kind of safety critical task is a terrible idea.
We are testing this literally every hour, assuming that every driver is completely capable of any safety-critical task. The high death toll is proof it doesn't work. Emergency Autopilot features do save lives, and should not require anything to activate more
Re:Wait. (Score:4, Insightful)
The most likely explanation in this case is that the owner was trying to show off full self driving to their friend, vastly over-estimating the car's capabilities.
The flaw in your reasoning is that the car did not have self driving installed.
Re: (Score:2)
Re:Wait. (Score:5, Insightful)
Ok, so someone's dead set on winning the Darwin Award, and it's the car's fault?
Because it's Tesla, people are obligated to blame the car. It doesn't matter that it's something that, if it happened in any other make of car, wouldn't even be a newsworthy story.
Re: (Score:3)
Comment removed (Score:4, Insightful)
Re:Wait. (Score:5, Insightful)
I think you've missed the point. The car, while flawed in some ways, is not really what people have a problem with. What people have a problem with is the company that makes the car recklessly describing something that's not a real self-driving system as "self-driving". Repeatedly. In many forms of media, and with carefully set up demonstrations to make what limited automatic guidance systems exist look much better than they actually are.
The man driving the RV who turns on cruise control and goes to make a pot of coffee while thundering down a highway at 55 is rightly called a fool and must own his mistake wholly. Incautious people who hear "self-driving" and take it at face value doing something like this are also fools, but they are only part owners in the foolishness.
Re: (Score:3, Informative)
Just to be honest on terms, they use “AutoPilot” and “Enhanced AutoPilot” for the offerings today. They sell “Full Self Driving,” but it is not currently available to drivers.
Autopilot works almost exactly like in a plane. It will happily keep going unless something goes wrong, at which point it disengages and the pilot must take over.
They have promised full self driving, or fully autonomous driving for a very long time. They have thus-far failed to deliver, or to be m
Re:Wait. (Score:5, Insightful)
Re:Wait. (Score:4)
I always try to see both sides of an argument, but here I'm struggling. Tesla calls their system "autopilot". As the grandparent post points out, it works in a very similar way to autopilot on a plane -- which we must presume it is named after. Are people so stupid as to honestly believe that "autopilot" in ANY circumstance -- but for this example, specifically on a plane -- means "no human intervention required at any point at all, ever"? Are they so stupid that they think the job of a pilot -- a (generally) well-paid, respected job that requires years of training -- is simply to press a button labelled "on" so that the plane takes off, flies to its destination, and lands of its own accord, after which everyone gets out and the pilot presses the "off" button?
You say "Hollywood", but I can't think of a Hollywood film (set in the present day) where the pilot's job is purely to switch on the autopilot -- so I don't think you can even blame Hollywood for this idiocy. Just think about it for a moment; how utterly ridiculous a proposition this is -- yet people are saying "autopilot implies the car needs no human intervention whatsoever" as if this preposterous belief is somehow not indicative of mental retardation at such an extreme level one should be hospitalised. $100k a year for pressing a button twice. God knows what these idiots think a co-pilot does. Maybe his job is to point out which is the "on" button and which is the "off" button in case the main pilot is blind?
Yet even here on Slashdot there are people arguing that "autopilot" implies no human interaction. Occam's razor says to me that no one past the age of a toddler could truly think such a stupid thought, so actually people must just be saying it because they think Elon Musk is a tit. Which he may well be, of course...
Re:Wait. (Score:4, Informative)
Given that the flight systems on modern aircraft are more capable than what's on a Tesla, and those are still called "auto-pilot," it's reasonable to think that a Tesla vehicle has similar levels of capability. I've heard plenty of people (without a technical background) talking about how they wish they could buy a Tesla so that they could take a nap while "driving" their kids to school.
That's not entirely unreasonable. There was an incident a few years ago where some pilots fell asleep in the cockpit and flew hundreds of miles past their destination airport. They only woke up when a passenger noticed and asked a flight attendant who called the cockpit. All they got was a letter of reprimand since it wasn't that dangerous. So yeah it's not a logical leap to say that if airplane pilots can put on the auto-pilot and fall asleep, so could a car driver with similar equipment.
The Tesla defenders here always talk about what "auto-pilot" used to mean in the 1960s. Lots of words used to have different meanings. Elevator is not a brand name anymore. And auto-pilot is used (by those not in aviation) to refer to the full range of automation available on the latest commercial jetliners. Sorry but in modern language, "auto-pilot," "full self driving," and "You can climb into the back seat and fall asleep, the car will do everything" are synonymous.
Re: (Score:2)
Re: (Score:2)
The issue is not that the guy died. The issues here are (1) Tesla is lying, and (2) autopilot is not ready.
Re:Wait. (Score:5, Insightful)
The point of the story isn't that it's the car's fault. It's that Musk came out and said it was impossible because that's not how the car was designed. Just like I reject bug reports with "that's not how the software was designed" even if it's showing up in production.
If the issue was someone abusing a Tesla, and Tesla's reaction was "that's retarded, don't do that" it would be one thing. But Tesla's reaction was "you cannot do that."
To use a non-car analogy (oh, how the wheel has turned): if a gun is used to kill a police officer, I'm not going to hold the gun manufacturer guilty (feel free to spout off about a very vocal, very small minority saying something different to attempt to derail the conversation). If the gun manufacturer says "that's impossible, our guns are too smart to shoot police officers," I'm going to hold them responsible, or at least mock them. But if they said "our police-detecting code in our gun cannot recognize deputies, but it's not supposed to be pointed at people, that's just a failsafe," I wouldn't. Make sense?
Re:Wait. (Score:5, Interesting)
The argument is not "Can you use autopilot without a driver", it's "Is Tesla's driver monitoring system any good". Or "Can you fake a driver paying attention"
That's the main result of the test. Other driver monitoring systems in other cars use a seat sensor, and a driver monitoring camera to detect the driver and see if he's paying attention.
Tesla's is using a weight detection system on the steering wheel and that's it.
That's the purpose of the test. The test doesn't answer the question "why was there no driver in the vehicle" -- Tesla and the NTSB can fight that one out. All Consumer Reports has shown is that Tesla's autonomous driving, which requires driver attention, doesn't actually check whether the driver is paying attention, nor does it have the ability to. It's just looking for a weight on the steering wheel.
Whether or not it's a serious safety problem will be determined by the NTSB and the like. The NTSB could find the system flawed in that way but still adequate, making the CR test meaningless, or it could decide the check is insufficient and force Tesla to fix the issue somehow.
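To make the contrast concrete: nobody outside Tesla knows the real logic, but the difference CR is describing can be sketched as simple signal fusion. Everything below -- the sensor names, the thresholds, the 25 kg cutoff -- is invented for illustration, not any real vehicle API:

```python
from dataclasses import dataclass

@dataclass
class CabinSensors:
    # Every field here is a hypothetical illustration.
    seat_weight_kg: float    # load cell under the driver's seat
    wheel_torque_nm: float   # torque applied to the steering wheel
    camera_sees_face: bool   # in-cabin driver-monitoring camera
    belt_buckled: bool

def wheel_only_check(s: CabinSensors) -> bool:
    """Roughly what CR defeated: any sustained torque on the wheel passes."""
    return s.belt_buckled and s.wheel_torque_nm > 0.1

def multi_signal_check(s: CabinSensors) -> bool:
    """What some rival systems reportedly do: require seat, camera, and wheel."""
    return (s.belt_buckled
            and s.seat_weight_kg > 25.0   # an adult, not a backpack
            and s.camera_sees_face
            and s.wheel_torque_nm > 0.1)

# CR's weighted-chain scenario: passes the first check, fails the second.
chain = CabinSensors(seat_weight_kg=0.0, wheel_torque_nm=0.5,
                     camera_sees_face=False, belt_buckled=True)
assert wheel_only_check(chain) and not multi_signal_check(chain)
```

The weighted chain satisfies the wheel-only check indefinitely; a seat sensor or cabin camera would catch it immediately.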
Re: (Score:2)
"Cars don't kill people, *people* kill people!"
We need to ban people things.
There, fixed it for you. :-)
Good job (Score:4, Insightful)
Re:Good job (Score:5, Insightful)
Re: (Score:2, Insightful)
We start with the assumption that people are not suicidal maniacs. If some Tesla or non-Tesla driver is indeed a suicidal maniac, the problem is with the maniac, not with the engineer who designed a reasonable system that works for most normal people.
Re:Good job (Score:5, Insightful)
It's still important to understand how easy something is to bypass, and whether it matches with manufacturer claims.
Some Equipment Required (Score:5, Interesting)
Personally, I think that as a general standard, if something is harder to bypass than, say, opening a combination lock, then you've met the "deterrent" level.
Going by what Consumer Reports said, while they called it "easy", we're still looking at a multiple-step process, requiring some additional equipment (the weighted chain), that might not be immediately obvious. I mean, you have to place the chain to fool the car into thinking there are hands on the wheel, buckle the seatbelt (despite there being nobody in the seat), maybe place a weight on the seat so it detects one, etc...
The more detection systems you put in, not only does it get more expensive for the additional equipment and programming, it also tends to get more fragile. For example, imagine that we put a face-detection system in there to detect whether there's a driver in the seat -- now what happens when it comes out that the face detector is "racist," in that it isn't as good at detecting, say, Asian faces, and, to go by all the reports I've seen, absolutely sucks at seeing the faces of black people?
Things like riding lawnmowers merely detect a certain weight(or more) in the seat for their safety for a reason.
That said, I know I'm annoyed by how little weight it takes to set off the "fasten seatbelt" indicator for the passenger side; I mean, my tablet has been known to set the thing off, and that's light enough to one-hand all over the place, much less my backpack with laptop -- still light enough to easily swing into the passenger seat, but I do have to be careful or the bloody car gets insistent that the non-existent passenger buckle up. Which leads to things like permanently bypassed sensors.
Re:Some Equipment Required (Score:5, Informative)
You can literally buy autopilot/full self driving defeat devices on Amazon.
https://www.amazon.com/s?k=tes... [amazon.com]
It couldn't get much easier to fool that system.
Re: (Score:2)
I'm not saying Tesla is at fault here - I'm saying it's useful to know how a system works. Without this test, for example, we wouldn't know that it only required these things to bypass. And who knows, maybe someone else would take it from there and discover even easier ways of tricking Autopilot. Either we find out that there's a base level of effort required to bypass Autopilot checks, or we find out how easy it is to fool. This information is still GOOD TO KNOW to judge future incidents and maybe
Re: (Score:2)
This is literally the same comment as this [slashdot.org]
To which I'm going to reply the same thing I already did [slashdot.org]:
I'm actually pissing away 4 mod points that I've already spent in this discussion to reply to this.
It's still important to understand how easy something is to bypass, and whether it matches with manufacturer claims.
Well, technically, according to the reports, it's more difficult to bypass the Tesla than it is to install Linux on a PC that comes preinstalled with Windows.
Please, do name something that's intended to be used by humans and is *more* difficult to bypass than the Tesla autopilot. Most things I know of (microwaves, X-ray machines, heavy machinery etc) have an interlock of some kind, which essentially is a simple switch that you can fake-lock using nothing but a piece of chewing gum; or a two-hand switch (e.g. for metal sheet folders) which you can bypass by putting your backpack on the other-hand switch. Seat belt warning? Just plug the seat belt in, route the strap at the back of the seat instead of your front. Motorcycle foot interlock? It's usually a small switch below the engine block, use a plastic tie or a shoelace. Elevator door? Move your knee, don't block the light beam; also, depending on the model, sometimes there's a physical button hidden somewhere in the door frame, which you can keep pressed using your finger. Two-hands lever hydraulic press? Use your shoe laces. Two-hands and a pedal? Shoe laces and a brick.
Or duct tape.
Now that I think of it, in fact I haven't come across *any* usage security mechanism that I couldn't have bypassed using duct tape and a ball pen. And I've seen my fair share, including but not limited to particle accelerators, high-energy x-ray sources, cryostats, lasers, industrial manufacturing, the chemical industry, various R&D facilities, hydroelectric power facilities... you name it.
Actually, Tesla's is rather difficult, because you need to come up with the "dial the speed down to zero but don't deactivate the autopilot" bit. Not everyone has the brains for it, given that it took several days and a Consumer Reports test to point that out.
Hope this helps. You're welcome.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
It needs to be reasonably difficult for people to win Darwin Awards. The state has a reasonable interest in people staying alive.
Re: (Score:2)
Yeah, agreed - this is some dumb shit. "Well, ackshually, if you purposefully try real hard and explicitly attempt to defeat the system, you can put yourself in a dangerous situation!". Yeah, thanks you dumb fucks.
I'm not up Tesla's ass by any means; in fact, I think they'll be defunct when real carmakers start cranking out better, cheaper EVs. However, it's a little bizarre to take your TDS (Tesla Derangement Syndrome, natch) to this level. No shit it can work with no one in the driver's seat if you intentio
Threat model (Score:5, Insightful)
Is the safety feature (not allowing autopilot without a driver present) supposed to protect against silly mistakes like "oh it's not meant for that" or is the driver considered as a malicious actor here?
I'm not personally ready to try any of this autopilot stuff, but it seems to me if a driver takes deliberate steps to disable or circumvent a safety feature that works fine under normal conditions, that the manufacturer should not be liable for what happens next.
Re: (Score:3)
What about a case where someone loads a full-auto-drive car with a car bomb? We have to face up to the fact that this technology will have serious criminal and terrorist applications. Bypassing any safety features is always going to be easier than creating your own auto-drive system.
Might as well use a drone (Score:2)
Well, the first problem is that "most" land-based targets are protected against car bombs these days; you'll find them hard to get to with a vehicle, due to things like big rocks and concrete planters in your path.
Second, if you can make a car bomb these days that you can remote drive into something, you can probably make a drone to do the same thing, which avoids the blockers. Or you can go the mythbusters route and just set up the car to be remote controlled the old fashioned way.
It only needs to work on
Re: (Score:2)
You might want to brush up on your explosive stuff:
1. OKC wasn't a "car bomb", it was a "truck bomb". He stuffed a Ryder truck full of explosives.
2. OKC is the main reason why you can't get that close to buildings anymore. The truck was, if you look at the photos, right up against the building. Inverse cube law -- it doesn't take that much distance to massively weaken a blast (though actual real-life stuff can get very complicated).
3. You don't need to know "many", just "one" will suffice. Note: Anythin
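An aside on point 2's "inverse cube" claim: blast effects are conventionally compared through Hopkinson-Cranz scaling, and in the near field peak overpressure does fall off roughly with the cube of the scaled distance (the proportionality below is a textbook approximation, not an exact law):

$$Z = \frac{R}{W^{1/3}}, \qquad \Delta p_{\text{peak}} \sim \frac{1}{Z^{3}} \quad \text{(near field)}$$

where R is the standoff distance and W the charge mass -- so even modest extra standoff buys a disproportionate reduction in blast loading.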
What is hack proof? (Score:2)
Nothing. Are technology and/or the companies that make it responsible for people going out of their way to get into dangerous situations? Maybe all the cell phone makers should be responsible for all the people who die while trying to get amazing selfies?
What next (Score:4, Insightful)
It really sounds like an effort to find some way to blame Tesla for something.
They say it "wasn't very hard" to defeat the safety. Does that mean that they are saying is if it was "very hard" to defeat the safety and the Darwin-award guy went and did it anyway then it wouldn't have been Tesla's fault?
What BS. Next they will blame Tesla because the in-cabin camera didn't detect that there was nobody there. If it did then they would write sober screeds about privacy violation.
Re: (Score:2)
What BS. Next they will blame Tesla because the in-cabin camera didn't detect that there was nobody there.
And if Tesla did use the in-cabin camera for that, they'd then move the goalpost and complain how they were easily able to fool it with an inflatable dummy.
Re: (Score:2)
Re:What next (Score:5, Informative)
Re: (Score:3)
It really sounds like an effort to find some way to blame Tesla for something.
Well, Tesla is making claims about things being "impossible". And people are testing those claims.
One claim is that Autopilot can't engage on an unmarked road. And then someone showed it doing that.
There was another claim that it is impossible for Autopilot to drive the car without a driver in the driver's seat. This was testing that claim, and finding it to be false.
That doesn't mean Tesla is at-fault for that particular accident. It means they're making false claims.
Let's be fair here, this is not Tesla's fault (Score:4, Insightful)
I mean, if you need to go through so many steps to trick the car into self driving mode without a driver, is it really the manufacturer's fault?
Re: Let's be fair here, this is not Tesla's fault (Score:2)
Yes, this sounds more like "If you take uranium out of the ground and refine it and put it in a perfect hollow sphere and surround it with perfectly shaped explosives and perfectly timed detonators, you get the first stage of a nuclear bomb... so clearly, the ground is the murderer!"
But then ... how do you explain nobody being in the driver's seat, and the car being at high speed? Got any explanation that isn't even less plausible? :)
Because you know that Sherlock Holmes quote...
Re: (Score:2)
how do you explain nobody being in the driver's seat, and the car being at high speed? Got any explanation that isn't even less plausible?
Well, sitting in the back seat while poking the accelerator with a stick should work on just about any car produced in the last 80 years or so.
Re: (Score:2)
But then ... how do you explain nobody being in the driver's seat, and the car being at high speed? Got any explanation that isn't even less plausible? :)
Because you know that Sherlock Holmes quote...
I have several, but there's not enough public information to settle on one or the other.
One would be that the driver *was* in the driver's seat, got out and went away, leaving the other two to burn.
Another would be that the doors were jammed owing to the accident, leaving the other two climbing through the car to reach another door, until they burned.
Another one would be that one of them didn't have his seatbelt on and was thrown into another seat. Just search for "car accident filmed from inside" on
Re: (Score:2)
No, this isn't Teslas fault - idiots being idiots got themselves killed.
What this is, however, is a warning to people who make loud claims about "something not being possible" in defence of their baby, only to see most of those claims disproven within the next week (it's possible to engage Autopilot without being in the driver's seat by circumventing most of the basic checks, and it's also possible to get Autopilot to engage on a road where Tesla says it shouldn't be possible to engage...).
I'm going to
Re: (Score:2)
No, this isn't Teslas fault - idiots being idiots got themselves killed.
Personally, I care less about "nobody in the driver's seat" and more about "crashed into a tree at a high enough speed to be fatal." That takes work with a Tesla, from what I've heard. In fact, it should be nearly impossible with the collision-avoidance systems I've heard they have. Unless the idiots managed to Darwin themselves by turning that off as well, somehow.
Of course, if they were playing shenanigans with driving the car with nobody in the driver's seat, odds are they didn't have their seatbelts
Re: (Score:2)
Re: (Score:2)
You can do the same thing with the VW Touareg. You put a clip or a water bottle on the steering wheel and now you have auto-drive. There are even YouTube videos of people abusing this.
Why don't they pick on VW about this trick, which has been around for the last couple of years already?
Yeah...and? (Score:4, Insightful)
It's a nice party trick, I suppose, tricking the Tesla into thinking someone is in the driver's seat to get the autopilot to engage. But what sort of idiot would put their life in danger to do this? The kind of idiot with a huge ego who wants to impress their friend with dangerous stunts. They ran into a tree, but it could just as easily have been a minivan with three kids and their mom inside.
The driver, so to speak, has full responsibility for this accident. The autopilot system is not fully autonomous, and they knew this. That's why the system shuts off if you don't touch the wheel.
The media of course is focusing on the huge fire in the aftermath but the reality is it was just a couple of dumb rich fucks acting recklessly.
Re: (Score:2)
You mean like this:
https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
But what sort of idiot would put their lives in danger do this? The kind of idiot with a huge ego that wants to impress their friend with dangerous stunts.
Yes. I wonder if there was a GoPro or a mobile phone found in the wreckage which has the whole thing on video. Because they probably wanted to upload it to TikTok or whatever the "cool" kids die for today.
Consumer reports! quick!! report this danger!!! (Score:2)
Get your intern to test the hypothesis and report.
Note to self: ask daughter not to apply for any internships in Consumer Reports' architecture testing division.
Re: (Score:2)
> None of the high rises in USA are designed to prevent people from jumping off windows
Many are. Roof access is limited, there is a fence around the roof, and the upper windows open a very limited amount.
CR level testing (Score:2)
Many are. Roof access is limited, there is a fence around the roof, and the upper windows open a very limited amount.
Yeah, but going by CR's definition of "easy," bringing a lockpick or a heavy hammer to open the door or force the window open further still counts as "easy."
Re: (Score:2)
Note to self: ask daughter not to apply for any internships in Consumer Reports' architecture testing division.
In all fairness that would depend on how irritating of a teenager your daughter is.
I already mentioned this before (Score:3)
Yes, I knew they gamed the system, and decided to win that Darwin Award.
Tesla did everything they could to stop the driver from pulling a stunt like this, and while their driverless AI is good, it's far from perfect. I never heard Tesla state otherwise.
The only reason this is news is that it involves a Tesla. Had somebody done something like this using weights/chains etc. in a regular car, it would've been forgotten about already.
Re: I already mentioned this before (Score:2)
I want to mention that safety interlocks like this have been around for many decades. There were cars that would refuse to start if the seatbelt wasn't buckled. So the drivers of those cars simply buckled the seatbelt before sitting down and sat on it.
There will always be people who bypass these kinds of safety systems.
Re: (Score:2)
So the drivers of those cars simply buckled the seatbelt before sitting down and sat on it.
And they're the reason why, for a few years, we had those cars where the shoulder belt would open/close automatically, but you still had to fasten a lap belt to be truly safe. Because they figured that the people who'd simply buckle the belt behind them would leave the auto-belt alone.
That said, I knew of such morons who, at that point, would either cut the belt and put the clip into the buckle, or just buy a 2nd clip and put that in.
So Many Problems (Score:3, Insightful)
Let's start with the Tweet by Musk quoted above which was cherry picked from the entire tweet which said:
"Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.
Moreover, standard Autopilot would require lane lines to turn on, which this street did not have."
So many people commenting on this story seem to be confusing AutoPilot with Full Self Driving (FSD). The car didn't have FSD, so it wasn't autonomously driving itself around (setting aside the creativity of Consumer Reports and their clever use of an iron chain). AutoPilot is a glorified cruise control. In fact, the only thing that it has on just about every other modern adaptive cruise control on the market is that it will also steer the car to keep you in your lane (if you keep your hands on the wheel). If, as Musk said, your road has lane lines. Which this road didn't. Note that you CAN'T TURN IT ON without lane lines and they crashed "a few hundred yards down the road".
So Autopilot was quite likely not on (the logs don't show it on, and it wouldn't have worked very well if it was). And the car didn't have FSD. So one of two things happened here:
1. The owner started out driving the car in the front seat, got up to speed, tried to set Autopilot on a road with no lane lines (which likely failed to activate), and then jumped in the back seat to show his friend how great his car was. At that point the car likely started screaming bloody murder at him (you have to experience this to understand how jarring that fucking sound is). With Autopilot NOT ON, the car, which would have been slowing down just through regenerative braking, flew off the road, hit a tree, and burst into flames (probably while the stupid owner was trying to get back into the front seat).
2. The owner mashed the go pedal, lost control (it is truly impressive acceleration), smashed into a tree, and ended up in the back seat as a result of the violence of the collision (have you seen the pictures?). Was he wearing his seatbelt? "Yes, officer, we found the charred remains of the seatbelt wrapped around his left ankle which was located three blocks away."
I think #1 is more likely, but #2 is totally not out of the question. I appreciate Constable Mark Herman's 35 years of service, but until a coroner makes the assessment of the remains, I don't think we can discount it.
As far as CR is concerned... I'm pretty sure I could convince my dumb-as-dirt F150 to drive itself into a tree in exactly the same manner. Get up to speed, set the cruise control, and then, like, hang out in the passenger seat while the car drives off the road and into a tree.
Darwin Award Winner (Score:4, Insightful)
It's surely comparable to taking a dangerous piece of machinery, going to the effort of bypassing all the guards and interlocks, and then sticking your head in it.
Stupid? Yes.
Fault of the manufacturer? No.
Decades-old Fix, overlooked..? (Score:2)
"but it also couldn't tell if there was a driver there at all"
All that engineering in a Tesla, and yet they overlooked the value of a simple weight sensor in the driver's seat? How long have we been disabling/enabling critical safety features like passenger airbags with that old design?
Yes, I understand weight sensors may have been purposely left out for other features (perhaps self-parking mode), but any mode that enables the car to move without a driver detected should be limited to parking speed unless all other (Elon-claimed) safety factors are met (detected
idiot-proofing (Score:3)
As we all know, you can't idiot-proof anything because as soon as you do, the world invents a better idiot.
So yeah, it can be defeated. So can anything else they put in place. But you have to do it INTENTIONALLY and fully aware that you are circumventing restrictions in order to make the system do something it shouldn't be doing.
It's a shame (Score:3)
Just like my bank.
You just have to make a hole in the 3-foot concrete wall, disable the alarms, disconnect the phone lines and cameras, steal the hard disks, open the safe with a plasma lance, and steal all the money.
No security at all, those bastards.
It seems bizarre (Score:3)
It seems bizarre to me that a car filled with cameras and "AI" to determine where to drive and what's on the road doesn't have a basic camera pointing at the driver's seat to determine if there's anyone there. 2021 on the outside; 1930 on the inside.
Obligatory XKCD (Score:3)
Obligatory XKCD [xkcd.com]
And a look at How it works [xkcd.com]
Re: (Score:2)
Re: (Score:2)
Obligatory South Park [youtube.com] reference?
Re: (Score:2)
It's certainly possible. A little steering-wheel torque is needed occasionally to "remind" the car that your hands are on the wheel when it is in Autopilot. But a little more torque will make Autopilot disengage.
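That two-threshold behavior is just a band check on measured torque. A minimal sketch, with both threshold values invented for illustration (Tesla's real numbers aren't public):

```python
# Hypothetical thresholds in newton-metres; real values are not public.
HANDS_ON_MIN = 0.2   # below this, torque is indistinguishable from noise
DISENGAGE_MAX = 3.0  # above this, the driver is assumed to be steering

def classify_wheel_torque(torque_nm: float) -> str:
    t = abs(torque_nm)
    if t > DISENGAGE_MAX:
        return "disengage"       # driver override: hand control back
    if t >= HANDS_ON_MIN:
        return "hands_detected"  # resets the "apply steering" nag timer
    return "no_hands"            # nag escalates, eventually slowing the car

# A weighted chain applies a constant small torque, forever reading as
# "hands_detected" -- which is exactly what CR exploited.
assert classify_wheel_torque(0.5) == "hands_detected"
```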
That reminds me (Score:2)
Your post reminded me of the SawStop safety system [sawstop.com] for table saws, which, at the cost of the blade and a replaceable cartridge, will stop the saw before it does more than nick a finger.
In cases where you're cutting wet or green wood (you shouldn't be cutting wet wood, and most people have no need to cut extremely green wood), or for other issues, the system does have a bypass switch.
Of course, at that point the saw will as happily cut fingers and hands off as any other table saw.
Re: (Score:2)
Re: (Score:3)
No, the correct headline would be "Musk lies about safety systems of Teslas again".
Musk claimed this sort of bypass was not possible, and this is CR showing it's not only possible, but quite trivial.
Re: (Score:3)
When Tesla (via Musk) claims this sort of bypass is not possible, it becomes Tesla's flaw. Mostly in that they lied, not that they need to fix it.