Tesla Model 3 Drives Straight Into Overturned Truck In What Seems To Be Autopilot Failure (jalopnik.com) 322
A viral video making the rounds on social media shows a Tesla Model 3 smacking into the roof of an overturned truck trailer. The crash took place on Taiwan's National Highway 1 and appears to be "caused by the Tesla's Autopilot system not detecting the large rectangular object right in front of it, in broad daylight and clear weather," reports Jalopnik. From the report: There's video of the wreck, and you can see the Tesla drive right into the truck, with only what looks like a solitary attempt at braking just before impact. For any human driver paying even the slightest bit of attention, this accident is almost an impossibility, assuming the driver had the gift of sight and functional brakes.
Tesla's Autopilot relies primarily on cameras, and previous wrecks have suggested that situations like this one, where a large, light-colored, immobile object sits on the road on a bright day, can be hard for the system to distinguish. In general, immobile objects are challenging for automatic emergency braking systems and autonomous systems: if you use radar returns to trigger braking for immobile objects, cars tend to generate far more false positives and unintended stops than is safe or desirable.
News reports from Taiwanese outlets, clumsily translated by machine, do seem to suggest that the driver, a 53-year-old man named Huang, had Autopilot activated: "The Fourth Highway Police Brigade said that driving Tesla was a 53-year-old man named Huang, who claimed to have turned on the vehicle assist system at the time. It was thought that the vehicle would detect an obstacle and slow down or stop, but the car still moved at a fixed speed, so when the brakes were to be applied at the last moment, it would be too late to cause a disaster." Thankfully, nobody was seriously hurt in the accident. The takeaway is that regardless of whether Autopilot was working or not, the driver should always be paying attention and ready to step in, especially since no Tesla or any currently available car is fully autonomous.
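To make the false-positive trade-off described above concrete, here is a minimal, purely illustrative sketch of how a radar-triggered braking system can end up ignoring stationary objects. Every name and threshold is an assumption for illustration, not Tesla's actual logic.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the reflection, in meters
    closing_speed_mps: float  # how fast the gap is shrinking, in m/s

def should_consider_braking(ret: RadarReturn, ego_speed_mps: float,
                            stationary_tolerance_mps: float = 1.0) -> bool:
    # Toy rule: a return closing at roughly the ego speed is stationary in the
    # world frame. Overpasses, signs, and manhole covers all look like this, so
    # a naive system drops such returns to avoid constant phantom braking,
    # and in doing so also drops a truck lying across the lane.
    is_stationary = abs(ret.closing_speed_mps - ego_speed_mps) < stationary_tolerance_mps
    return not is_stationary  # only react to moving traffic

# A stationary trailer 60 m ahead gets filtered out just like an overpass would be:
print(should_consider_braking(RadarReturn(60.0, 30.5), ego_speed_mps=30.5))  # False
# A slower-moving car ahead still triggers consideration:
print(should_consider_braking(RadarReturn(60.0, 12.0), ego_speed_mps=30.5))  # True

The trade-off is exactly the one the summary mentions: react to every stationary return and you brake for bridges; filter them out and you can sail into a parked obstacle.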
"Autopilot" is a bad name (Score:5, Insightful)
Even if told otherwise, people will still believe that "autopilot" means "I don't need to do any piloting!" and let the car fully drive itself.
Even a simple name change, like "Drive Assist" or anything similar most probably would have prevented some of these avoidable accidents (some of which have been deadly)
Re: (Score:2, Insightful)
Autopilot has been used on airplanes forever, and it has never completely flown the airplane; pilot intervention is required. Fault of the ignorant driver for not understanding the technology or assuming "autopilot" means "autonomous". Fault of Tesla for overstating its capabilities without adequate disclaimers.
Re:"Autopilot" is a bad name (Score:5, Interesting)
Fault of the ignorant driver for not understanding the technology or assuming "autopilot" means "autonomous". Fault of Tesla for overstating its capabilities without adequate disclaimers.
Agreed - I think the biggest factor is that, unlike airplane pilots, there's no user training required for Tesla's "autopilot", so buyers are left with the impression of the functionality they got from the media (mostly TV/movies): that planes just fly themselves.
I feel like that should be a requirement, perhaps with a new license level (just like for driving a big rig for example), for operators of vehicles with any autonomous functions, and not simply be left to the user manual.
Re:"Autopilot" is a bad name (Score:5, Informative)
And so what? The number of accidents Autopilot helps prevent far outnumbers the number it causes. This is statistically proven by the number of miles driven per accident.
Re: "Autopilot" is a bad name (Score:5, Funny)
Re: "Autopilot" is a bad name (Score:5, Insightful)
The comforting thing is, this will be yet another extremely unlikely edge case that Tesla will teach their cars all around the world to solve for. This will keep happening, but it will be more and more rare to see these funny quirks happen.
Re: (Score:2)
Tesla will teach their cars all around the world to solve for. And what about when we get to the point where fixing one issue makes another come up, or makes the car not take action in a case where it used to?
Re: "Autopilot" is a bad name (Score:5, Informative)
If they are statistically safer than human drivers, it is still a win.
Re: "Autopilot" is a bad name (Score:5, Informative)
Re: (Score:3)
I don't. I don't find that I drive better (sunshade down or not) in glare conditions than AP. And we're a very glare-rich country (low sun angles, little to no night, at this time of year).
Maybe you do, though.
Re:"Autopilot" is a bad name (Score:4)
It tells you when you turn it on that you need to keep your hands on the wheel and pay attention. EVERY SINGLE TIME, in the driver's screen in an X at least. Autopilot is not "chauffeur". We use words because they have meanings. We can't assume people don't know what it means. Really, they are just taking risks.
This type of design is the very worst possible. If a driver isn't actively paying attention to the driving task, their mind will wander, and it will take time for the system to regain the driver's attention in order to pay attention to what needs to be done. It really needs to be all or nothing.
I use an autopilot on my sailboat on a regular basis, it's relatively similar to what you'd find on an airplane, though in the case of the boat it just keeps the boat going in a straight line. For the boat, this frees me up to manage other systems, such as the sails, and keep better watch. But even then, I find my attention wandering if I'm not explicitly focusing on the task at hand. Fortunately, when your max speed is 6kts, you have a lot more reaction time.
Re: "Autopilot" is a bad name (Score:3)
This type of design is the very worst possible. If a driver isn't actively paying attention to the driving task, their mind will wander, and it will take time for the system to regain the driver's attention in order to pay attention to what needs to be done. It really needs to be all or nothing.
So on the one hand I have your assertion that "it needs to be all or nothing". And on the other hand I have data showing that the current "something" is still statistically safer than nothing, and will keep getting safer until we get to "all".
Which one do you think I'm gonna go with?
Re: (Score:2)
The funny thing is that a sailboat autopilot can be an amazingly simple device, with no electronics whatsoever--just a windvane and a tiller mechanism--and it does its job surprisingly well. Modern airplane autopilots are electronic, but relatively simple devices that simply have to trim pitch and roll to maintain a course and altitude, and they do so with great precision. On the other hand, an automotive 'autopilot' requires an AI smarter than most human drivers (yup it's already there) and still fails at
Re:"Autopilot" is a bad name (Score:4, Insightful)
Is that true? I thought that even with its well publicized mistakes, the Tesla autopilot was still safer than humans overall. Car autopilots don't need to be accident free to be safer than humans. And they'll have different failure modes than humans so the accidents may be more notable.
airplane pilots have to know what it can & can't (Score:2)
Airplane pilots have to know what it can and can't do, and they are there to take over in case things mess up. Also, in airplanes you in most cases have time to look over the issue at hand, versus very little time in a car (look at the Uber death).
Re: (Score:2)
Except statements like "Musk previously estimated that by the middle of 2020, Tesla’s autonomous system will have improved to the point where drivers will not have to pay attention to the road." sow confusion and lead idiots to make assumptions about the tech that they should not be making.
Well, we still have about four weeks left to the middle, so there's still time. :-D
But in all seriousness, there is a major Autopilot rewrite reportedly in progress to do 3D labeling based on comparing the images from multiple cameras. It seems likely that such code would have noticed this sort of obstruction even without any particular model training. It wouldn't have been able to identify it, but it would have known that there was a large obstruction in the road, and one would hope that the path-finding
Re: (Score:3)
"... without adequate disclaimers"
The first time you start up Autopilot, it makes you read through and accept a giant infosheet about its limits. Every single time you start autopilot after that, it pops up a message telling you to keep your hands on the wheel and pay attention to the road. The manual section on Autopilot is one page of disclaimers after the next, several pages long. Tesla staff give disclaimers (and watch you to make sure you're not abusing it) during test drives. Even the website where
Re: (Score:2)
Well we trust Average Joe to pay attention in non-autopiloted cars even though we know he won't.
An autopilot that's 95% as good as humans that pays attention to the road 100% of the time sounds better than a person that's 100% as good as the average human (and half of drivers are below average), but only pays attention 90% of the time.
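A back-of-the-envelope version of that comparison, using the parent's hypothetical 95%/100% and 100%/90% figures (a toy model, not real crash data):

def mishandled_fraction(skill: float, attention: float) -> float:
    # Toy model: while attentive, hazards are mishandled at rate (1 - skill);
    # while inattentive, assume every hazard is mishandled.
    return attention * (1.0 - skill) + (1.0 - attention)

autopilot = mishandled_fraction(skill=0.95, attention=1.00)
human = mishandled_fraction(skill=1.00, attention=0.90)

print(f"autopilot mishandles {autopilot:.0%} of hazards")  # 5%
print(f"human mishandles {human:.0%} of hazards")          # 10%

Under those made-up numbers, the always-attentive but slightly worse system comes out ahead, which is the parent's point.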
Re: (Score:2)
You have FSD? Really? Musk is on-record as saying it will be here by the end of the year (it's not here yet), so probably Q3/Q4 of next year. How did you get FSD already?
Oh, you mean you PAID for a feature that SHOULD be delivered at sometime in the future, but does not exist yet, but you're going to crow about buying literal vaporware at this point because you're just such a techie guy and chicks dig claims of FSD (just like they LOVE big wings and tons of stickers on a Honda Civic).
Re: "Autopilot" is a bad name (Score:2, Flamebait)
Re: (Score:2)
Indeed. People are generally stupid and generally incapable of reading the manual. At the same time, the more stupid they are, the more in control they think they are (the so-called "Dunning-Kruger Effect"). Hence technology for general use must make it exceptionally hard to hurt yourself with it. Sure, the driver here is probably 100% responsible legally, but ethically it is more a 50:50 thing.
Re:"Autopilot" is a bad name (Score:4, Interesting)
Even a simple name change, like "Drive Assist" or anything similar most probably would have prevented some of these avoidable accidents (some of which have been deadly)
Citation needed.
What would drivers do when they enable a "Drive Assist" function and the car steers itself along the road? Most would stop paying attention.
Requiring the driver to keep paying attention when there is nothing to do is already a lost cause.
Human beings are not wired to keep staying on standby and paying attention. People already have trouble keeping attention even when they are actively driving; e.g., there are long tunnels with different wall patterns, and long roads that are intentionally made to curve left and right, for the purpose of keeping the driver from zoning out due to lack of change in scenery.
While I am optimistic that autonomous driving will one day take over and result in safer roads, doing it halfway will not work.
The right place to begin deployment is long-haul transportation, where *no* driver would be present and the software can be made to always err on the side of caution. Having the truck stop for any suspected obstacle, with someone overriding from a remote camera, is no big deal for a truck not in a hurry.
Using it on consumer cars with drivers in the front seat with little understanding of how AI could fail is a recipe for disaster.
Autonomous driving on consumer cars should be 100% autonomous with no option for manual driving. That way, the seats can be designed to be much safer for the passengers (such as backward facing) without the constraint of having someone sitting close behind a big, hard wheel ready to crush them on impact.
Re: (Score:3, Insightful)
I have an Audi with such driver assist functions. It's nowhere near as good as Tesla's, so I do not trust it and am always ready to take the wheel. Tesla's problem is that they got too good, but not good enough to be 100% reliable.
Re: (Score:3)
long roads that are intentionally made to curve left and right, for the purpose of keeping the driver from zoning out due to lack of change in scenery
Man, I guess you've never been to West Texas. The only curvature you can see in the roads comes from the Earth.
Re: (Score:2)
Well that's quite the oxymoron!
Re: "Autopilot" is a bad name (Score:2)
If you think that's an oxymoron then you do not understand what one of those words means ...
Re: (Score:3)
Something cannot be consciously avoided if it was never expected. Therefore, avoidable means expected.
Accidental means something that was unexpected. [thefreedictionary.com]
Thus, "avoidable accident" means "expected thing that was unexpected." How is this not an oxymoron?
Re: "Autopilot" is a bad name (Score:4, Insightful)
Something cannot be consciously avoided if it was never expected
Sure it can. If you point a gun at your foot and pull the trigger thinking that it's unloaded, you may be quite surprised when you accidentally blow your foot off. Yet you still could have avoided the accident by not pointing fucking guns at your feet, regardless of whether or not you think they are loaded.
Re: (Score:3)
Autopilots in planes can take over almost immediately after takeoff (once you clear 1000 feet) - fly you to your destination, and even land. In what way is the current Tesla solution like an Autopilot in a plane?
Airplane autopilots will disconnect if the wings (or a critical instrument [wikipedia.org]) ices up and the autopilot can no longer control the plane, or if the plane has a serious bird strike, or the engine fails, etc -- the pilot needs to be able to take over at any time. During good conditions, the Tesla autopilot can drive you all the way to your destination, but if something unusual happens, you better be ready to take over.
Re: (Score:2)
Autopilots also give you a rather loud and hard-to-ignore warning when they disconnect; does Tesla's Autopilot do that?
If a driver isn't touching the steering wheel, the Tesla autopilot warns the driver several times over several minutes with increasingly noticeable audible/visual alerts to take the wheel, until finally it will come to a stop if the driver refuses to take control. I don't know how it warns the driver that he needs to take over suddenly, like if it can no longer see the edge of the road; I assume there's some alert (my non-Tesla will beep annoyingly if Radar Cruise Control cancels itself or it loses si
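Roughly, the escalation described above behaves like a simple threshold ladder. The stages and timings below are invented for illustration and are not Tesla's actual values.

ESCALATION = [
    (15, "visual banner: apply slight force to the steering wheel"),
    (30, "audible chime added"),
    (45, "louder repeated chime, flashing display"),
    (60, "slow down, hazard lights on, come to a stop"),
]

def alert_for(seconds_hands_off: float) -> str:
    # Walk the ladder and return the most severe stage that has been reached.
    action = "no alert"
    for threshold, stage in ESCALATION:
        if seconds_hands_off >= threshold:
            action = stage
    return action

for t in (10, 20, 50, 70):
    print(t, "->", alert_for(t))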
Re: (Score:3)
Autopilots in planes can take over almost immediately after takeoff (once you clear 1000 feet) - fly you to your destination, and even land. In what way is the current Tesla solution like an Autopilot in a plane?
Airplane autopilots will disconnect if the wings (or a critical instrument [wikipedia.org]) ices up and the autopilot can no longer control the plane, or if the plane has a serious bird strike, or the engine fails, etc -- the pilot needs to be able to take over at any time. During good conditions, the Tesla autopilot can drive you all the way to your destination, but if something unusual happens, you better be ready to take over.
And in almost all planes, autopilot will obliviously fly you right into another plane or unexpected terrain (like a radio tower that's not in its database). Even if the plane is equipped with TCAS and it's blaring at you to take evasive action, in most planes TCAS guidance needs to be performed by the pilot, the autopilot won't do it.
Re:"Autopilot" is a bad name (Score:5, Informative)
Even if told otherwise, people will still believe that "autopilot" means "I don't need to do any piloting!" and let the car fully drive itself.
Even a simple name change, like "Drive Assist" or anything similar most probably would have prevented some of these avoidable accidents (some of which have been deadly)
I don't buy it. I don't think a name change would make any difference at all.
When Google first started experimenting with self-driving cars, they let some employees -- engineers who fully understood the deficiencies of the early system and *knew* they had to stay alert -- use them as daily vehicles, but with cameras inside to monitor usage, specifically to see how people interacted with the system. The designers of this test expected that they were looking for small changes in the amount of time the driver looked away from the road, or other subtle clues. What they found instead was that after a few hours of the vehicle performing pretty okay in common, easy driving scenarios (e.g. freeway traffic) even people who thoroughly understood the limitations tended to stop paying attention for long periods of time. What's more, they didn't even seem to realize how long they had stopped paying attention. When later shown the video of their own behavior, they were shocked and surprised at how irresponsible they'd been.
The bottom line is that "driver assist" systems that enable the driver to reduce how much attention they pay without fairly immediate negative feedback will cause driver inattention. This observation prompted Google (now Waymo) to decide that nothing less than level 4 (complete autonomy under specified conditions) is safe.
In practice I think Tesla's numbers have proven Google wrong about that, not because drivers do pay attention to what the self-driving car is doing but because the safety bar is so ludicrously low -- human drivers are so awful -- that on balance a level 3 system can actually be about as safe as a human driver. It'll screw up regularly, and occasionally horrifically, but so do humans. Humans tend to fail in different ways, screwing up by falling asleep, or getting distracted, or being under the influence, etc., but on balance the numbers are close to the same, and maybe even favor the Tesla system.
Still, Waymo's approach is that full level 4 is the only way to go, and they're operating fully-autonomous (no "safety driver") taxis in Phoenix right now. Of course Phoenix has no snow; little rain; broad, well-marked roads and (relatively) light and non-aggressive traffic, so the ability to operate autonomously there means little about the ability to operate in worse conditions. That's why Waymo is also testing in Michigan; they'll get there eventually.
Re:"Autopilot" is a bad name (Score:4, Interesting)
-- human drivers are so awful --
This cannot be stated enough. Most of the comments here seem to be along the lines of autopilot enabling drivers not to pay attention. That doesn't seem to stop every moron driving down the highway texting or reading the news or whatever.
One day I avoided what I consider a sandwich of inattentive stupidity, caused by something not quite as bad as in this footage. Driving on the right side of the highway, the truck in front of me for no reason drifted slightly off the road and hit a service vehicle parked on the shoulder. I was following way too close to brake in time (because I'm an awful driver), so brakes + abs + traction control + yanking the wheel got me around the outside of the truck (luckily the guy next to me was paying attention when I nearly swerved into him). The guy 10m *behind me* looks to have not even attempted to brake; he hit the truck at full speed.
We are all morons just waiting to get into stupid accidents, and while this is a news story because it *may* have been an autopilot fail, this stuff happens to human drivers constantly. We don't hear about it for the same reason that a shooting in Chicago isn't newsworthy on the national level. It's normalised to the point of being boring and may end up as a footnote on some news show.
Re:"Autopilot" is a bad name (Score:4, Interesting)
People are stupid no matter what. When I lived in the prairies we were buying a camper for the family and out in front of the dealership when we pulled in, there was a mostly destroyed motorhome over by the service bays. My dad asked one of the guys about it as we walked around the lot and it turns out some older guy bought it and misunderstood what "Cruise Control" meant, turned it on, and went in the back to go make a sandwich. Well, the road curved left and the motorhome went straight into a wheat field then flipped over when one tire hit a softer patch of soil. The guy lived, and was outraged that his fancy cruise controlled motorhome didn't drive itself when the cruise was activated.
This was in the early 80s. Dumb people have misunderstood driver assist systems since their invention. 50 years from now someone's going to go ballistic when their auto-driving car can't read their mind and stop at an ice cream shop when they suddenly pass it and realize they want ice cream.
Auto-pillock would be better (Score:2)
Re:"Autopilot" is a bad name (Score:5, Funny)
A giant object in the middle of the road is an edge case?
Re: (Score:3)
A giant object in the middle of the road is an edge case?
Yes. An "edge case" means something that rarely happens and is far outside of the norm.
The top of the truck is not visible in the video, but was facing the Tesla. Presumably it was light and uniform in color. Most likely the vision system mistook it for an area of the sky.
Obviously, Tesla needs to include some overturned trucks in the training data. Perhaps they should also rethink their "camera-only" navigation system. Waymo's system uses Lidar and almost certainly would have avoided this accident.
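A rough sketch of why lidar sidesteps the appearance problem: geometry alone flags something tall sitting in the lane, regardless of its color or texture. The point format and thresholds below are assumptions for illustration, not Waymo's or anyone else's actual stack.

import numpy as np

def obstacle_in_corridor(points: np.ndarray,
                         corridor_half_width: float = 1.5,  # m, roughly half a lane
                         max_lookahead: float = 80.0,       # m
                         min_height: float = 0.5) -> bool:  # m above the road surface
    # points: N x 3 array in the vehicle frame (x forward, y left, z up), meters.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    in_corridor = (x > 0) & (x < max_lookahead) & (np.abs(y) < corridor_half_width)
    return bool(np.any(in_corridor & (z > min_height)))

# A white trailer roof 50 m ahead returns points no matter how sky-like it looks:
truck_roof = np.array([[50.0, 0.2, 1.2], [50.5, -0.4, 2.0], [51.0, 0.0, 0.9]])
print(obstacle_in_corridor(truck_roof))  # True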
Re: (Score:3)
You're saying that the video image was overexposed, or the cameras didn't have sufficient dynamic range to be able to distinguish between truck and sky, or the software only supports up to 256 levels (8 stops) of dynamic range and the rest is clipped.
Whatever it is, they should have fixed it by now. So what's the holdup?
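A toy demonstration of that clipping idea, with invented radiance numbers and a plain 8-bit quantization:

import numpy as np

# If exposure is set so the dark road reads correctly, a bright sky and a sunlit
# white roof can both saturate to 255, leaving no edge between them in the image.
scene_radiance = {"road": 0.05, "sky": 0.90, "white_truck_roof": 0.85}
exposure_gain = 300.0  # chosen so the road is well exposed

for name, radiance in scene_radiance.items():
    pixel = int(np.clip(radiance * exposure_gain, 0, 255))  # 8-bit sensor output
    print(f"{name:16s} -> {pixel}")
# road             -> 15
# sky              -> 255
# white_truck_roof -> 255  (indistinguishable from the sky after clipping)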
Re:"Autopilot" is a bad name (Score:4, Interesting)
Tesla should also observe the surrounding cars.
They are all slowing down and moving to the right.
That is a strong hint that there is a problem up ahead.
Re: (Score:2)
Probably the brakes got applied far too late by the Model 3 when the radar system started detecting something getting too close.
In the video, there is a burst of smoke when the car is about 30 meters from the truck (right after it passes the truck driver). This must be from applying the brakes hard. But the car does not appear to slow down at all. Weird.
Human brains have the same 'bug' (Score:5, Interesting)
Human brains are also bad at seeing things they aren't expecting, and it's surprisingly common for drivers to run into trucks pulled straight across a road, in a type of accident referred to as a 'side under-run'. So "not detecting the large rectangular object right in front of it, in broad daylight and clear weather" is actually pretty common for human drivers, because light-colored trucks look a lot like the sky to a human eye or to an AI vision system, and drivers on long boring drives tend to just follow the lane markings and (for example) drive straight into trucks that are pulled across a road. The Tesla does have an advantage over a human, with the radar, but it sure looks like it didn't respond in time - perhaps slowing down at the end, though it's hard to tell from the video. I'd wonder if perhaps the material the top of the truck was made of suppressed radar reflections? Or perhaps the combination of factors was so unique that it wasn't trained to recognize it? That's a challenge with neural networks: unless they're trained on a pattern, they won't recognize it, so things that happen very rarely don't become a trained pattern.
Re: Human brains have the same 'bug' (Score:2, Interesting)
Until cars are actually semi-intelligent in the real sense, and not in the faux ML sense of pre-canned training for various already-known situations, the cars will keep driving into things, because they're not intelligent in any way and can't think their way out like people can.
Re: Human brains have the same 'bug' (Score:4, Interesting)
Re: Human brains have the same 'bug' (Score:3)
ML cars can be trained forever, humans cannot (Score:2)
And that's the problem with ML cars. You can't possibly pre-train them for every situation.
But over time they learn, and the important thing is that they remember forever once they are trained on some new problem, because software updates go out to all cars.
When will human drivers stop driving into trucks and other stationary obstacles? Never, because there keep being new human drivers that also have to be trained, each one independently, always having to re-learn as much as they can in a very short lifetime.
Re: (Score:3)
Of course it works like that. If your model is inadequate, you improve it in the next iteration.
Moreover, when you find a scenario that your model doesn't handle, you add that scenario to your test set so that any future update that causes a regression in that scenario (or in any of the millions of other scenarios you accumulate as you gradually build up your test set) is caught early. Even better, as multiple self-driving systems enter the market, we should enact legislation to standardize these test sets, and require every accident to be investigated, a root cause determined, and a new test added to the test set.
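A minimal sketch of what such a scenario regression suite could look like (pytest-style; the scenario files and the simulator stub are hypothetical placeholders, not any vendor's real tooling):

import pytest

# Every investigated incident becomes a recorded scenario that future software
# builds must pass. The stub below fakes the simulator so the sketch runs.
SCENARIOS = [
    "cut_in_at_low_speed.json",
    "pedestrian_crossing_at_night.json",
    "overturned_trailer_blocking_lane.json",  # added after an incident like this one
]

def run_in_simulator(scenario_file: str) -> dict:
    # Placeholder: a real implementation would replay the scenario against the
    # current driving stack in closed-loop simulation and return metrics.
    return {"collision": False, "min_time_to_collision_s": 2.5}

@pytest.mark.parametrize("scenario", SCENARIOS)
def test_scenario_is_handled(scenario):
    result = run_in_simulator(scenario)
    assert result["collision"] is False
    assert result["min_time_to_collision_s"] > 1.0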
Re: (Score:2)
This is a complete fail. Let's push the idiocy and have AI driving everything and watch a school bus full of kids pile into a gas tanker... Ya, that will be great on CNN. Complete proof this isn't ready for prime time.
Re: Human brains have the same 'bug' (Score:3)
bla, bla, bla.... the ... car ... drove .. into ... a .. parked ... truck!!!
Uhuh. "Parked". On its side, across two lanes of a highway.
The fact that a human can "park" a truck in such a manner tends to suggest we aren't ready for prime time either. I've certainly never seen auto-pilot "park" a Tesla like that.
Re: (Score:3)
In the States, nearly all semi-truck trailers like this are covered with metal. In Germany, I noticed a number of trailers that had just cloth on the sides. I wonder if, say, a white canvas would let the radar through while the camera did not know what to make of it?
Re: (Score:2)
They worked as intended: reduce the impact to a survivable speed. The driver walked away, no need to go to the hospital.
The crash avoidance systems apparently were not engaged (along with autopilot not being engaged).
Re: (Score:2)
That is what a driver, driving the car manually, has to do to avoid hitting something.
Automatic Emergency Braking is only for slowing the car to a survivable speed.
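Some rough stopping-distance arithmetic shows why a late detection can, at best, shed speed rather than stop the car. Standard kinematics; the deceleration and delay figures are assumptions.

# Distance needed to stop from highway speed: reaction distance plus v^2 / (2a).
v = 110 / 3.6   # 110 km/h in m/s, about 30.6 m/s
a = 8.0         # m/s^2, hard braking on dry asphalt (assumed)
delay = 1.0     # s, combined detection and brake ramp-up (assumed)

stopping_distance = v * delay + v ** 2 / (2 * a)
print(f"about {stopping_distance:.0f} m needed to stop")  # ~89 m

# If the obstacle is only recognized 40 m out, the car is still moving fast at impact:
remaining = 40.0 - v * delay
v_impact = (v ** 2 - 2 * a * remaining) ** 0.5 if remaining > 0 else v
print(f"impact speed about {v_impact * 3.6:.0f} km/h")  # roughly 100 km/h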
The difference... (Score:2)
Re: (Score:2)
It's too bad it can't push a Lidar hardware update though.
What was the driver doing at the time? (Score:2)
Isn't the driver supposed to pay attention even when the autopilot thing is on? They should have seen the truck and braked.
Re:What was the driver doing at the time? (Score:4)
One more thing... (Score:4, Informative)
Re: (Score:2)
The accident would have been avoided if either crash avoidance or Autopilot had been enabled. This was just a distracted human driving a car.
LIDAR or don't bother (Score:5, Insightful)
They need to seriously consider LIDAR or don't bother anymore. The CEO's insistence that LIDAR is too expensive is foolish, dangerous, and will end this endeavor eventually.
Re: (Score:2)
Right. Because the, what, handful of these deaths compares so negatively against other manufacturers? Not by the stats I’ve seen - but perhaps you’re judging solely based on the headlines that have been presented to you by your chosen news sources...
Re: (Score:3)
"Especially when it comes to things people don't know about, kind of like LIDAR being nothing more than a system that tracks object depth and that autopilot software is just as capable of fucking up a depth map as it is a ... well depth map. Unless you work for Tesla how do you know why the system failed?"
Because Tesla's system is well-known to be vulnerable to this kind of failure. It's happened repeatedly. It's a pretty safe assumption that the problem was that it didn't understand what it was seeing.
Any
It's Agile... (Score:2)
Fortunately, humans never hit stopped vehicles (Score:2)
As this example clearly demonstrates, cars never hit vehicles stopped on the *side* of a freeway at such speeds that the car practically disintegrates.
https://6abc.com/car-crash-vid... [6abc.com]
Every time this happens, all I can think of (Score:4, Funny)
Is this scene from Anchorman 2
https://youtu.be/LUEDVMOMY24?t... [youtu.be]
English grammar / meaning (Score:2)
It was thought that the vehicle would detect an obstacle and slow down or stop, but the car still moved at a fixed speed, so when the brakes were to be applied at the last moment, it would be too late to cause a disaster."
-- too late to not cause a disaster
-- too late to prevent a disaster
LIDAR (Score:4, Insightful)
So I would say that this is definitive proof for all Tesla fanboys on this thread...
https://tech.slashdot.org/stor... [slashdot.org]
with the top rated comment...
Considering that Tesla has recently demonstrated that they can now map their surroundings with near LIDAR-like precision using just their cameras and radar, ... those who repeatedly slagged off Volvo for using LIDAR, boasting about how fantastic the system was on the Tesla, were totally and utterly wrong.
VOLVO has already lost on cost. https://cleantechnica.com/2020... [cleantechnica.com]
The more sensors you have on a fully autonomous vehicle the better. They all have weaknesses, you need all of them feeding the computer to minimise the chance of a failure. Especially a failure as complete as the one demonstrated in that video.
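The parent's redundancy argument in rough numbers: if the sensors miss an obstacle (approximately) independently, the chance that all of them miss it shrinks multiplicatively. The miss rates below are invented purely for illustration.

# Toy redundancy arithmetic with made-up, assumed-independent miss rates.
miss = {"camera": 0.05, "radar": 0.10, "lidar": 0.02}

def combined_miss(*sensors: str) -> float:
    p = 1.0
    for s in sensors:
        p *= miss[s]
    return p

print(f"camera only:            {combined_miss('camera'):.4f}")                    # 0.0500
print(f"camera + radar:         {combined_miss('camera', 'radar'):.4f}")           # 0.0050
print(f"camera + radar + lidar: {combined_miss('camera', 'radar', 'lidar'):.6f}")  # 0.000100

Real sensor failures are of course correlated (sun glare can hurt more than one modality at once), so the true benefit is smaller, but the direction of the argument holds.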
Interesting. The twitter guy is a tslaq type. (Score:2, Interesting)
Re:Interesting. The twitter guy is a tslaq type. (Score:4, Interesting)
Yeah... analysis of the video on one of the Tesla forums indicated that the car had to be manually driven (with crash avoidance disabled). The translation of the account of the incident is a bit rough, which seems more like a TSLAQ story...
no failsafe (Score:2)
Re: (Score:3)
I had the same question, and this Wired article [wired.com] answers some of it.
Intuitively, I had thought that if I were building an auto Autopilot, the one thing I would nail down and make bulletproof is the failsafe that would keep the car from running into something.
It turns out that it isn't that simple. The Tesla manual warns that its system cannot detect all stationary objects, particularly at freeway speeds. And it turns out that Volvo's system has the same shortcoming even with its lidar support.
Slow (Score:3)
Shouldn't the title have been "Tesla Travelling at Full Speed Narrowly Misses Truck Driver"? Why didn't it even slow a little for the person standing on the road? What kind of systems are we allowing on our roads!?
Typical /. herpderp (Score:5, Interesting)
The original article is quite clear that ****Autopilot Was NOT Enabled****.
The crash mitigation system is not designed to come to a complete stop from 70mph. That would be too dangerous (it could start a pile-up for a false-positive).
SMH the comments on yahoo were more sensible.
Ridiculous failure (Score:3)
If Tesla's auto-pilot system can't detect a massive object blocking your entire lane and bring you safely to a full stop before colliding with it, then it isn't ready for use on our highways.
Re:Just rename the feature (Score:4, Funny)
Rename the feature from Autopilot to "Driver Assist"
They should probably rename it to "Full Speed Ahead".