Transportation

Tesla Model 3 Drives Straight Into Overturned Truck In What Seems To Be Autopilot Failure (jalopnik.com) 322

A viral video making the rounds on social media shows a Tesla Model 3 smacking into the roof of an overturned truck trailer. The crash took place on Taiwan's National Highway 1 and appears to be "caused by the Tesla's Autopilot system not detecting the large rectangular object right in front of it, in broad daylight and clear weather," reports Jalopnik. From the report: There's video of the wreck, and you can see the Tesla drives right into the truck, with only what looks like a solitary attempt at braking just before impact. For any human driver paying even the slightest bit of attention, this accident is almost an impossibility, assuming the driver had the gift of sight and functional brakes.

Tesla's Autopilot relies primarily on cameras, and previous wrecks have suggested that situations like this, a large, light-colored, immobile object on the road on a bright day, can be hard for the system to distinguish. In general, immobile objects are challenging for automatic emergency braking and autonomous systems: if you use radar returns to trigger braking for immobile objects, cars tend to produce far more false positives and unintended stops than is safe or desirable.
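To make that trade-off concrete, here is a minimal, hypothetical sketch (not Tesla's or any vendor's actual logic; thresholds are invented for illustration) of how a radar-based braking filter that suppresses stationary returns at highway speed ends up ignoring exactly the kind of obstacle seen in this crash:

def is_braking_candidate(range_m, closing_speed_mps, ego_speed_mps):
    """Decide whether one radar return should trigger emergency braking.

    A return whose closing speed roughly equals the car's own speed is a
    stationary object (sign gantry, bridge, parked car, overturned trailer).
    Many systems suppress these above a speed threshold to avoid constant
    false alarms from overhead and roadside clutter.
    """
    STATIONARY_TOLERANCE_MPS = 1.0        # how close to "not moving" counts as stationary
    SUPPRESS_STATIONARY_ABOVE_MPS = 20.0  # ~72 km/h; hypothetical cutoff

    is_stationary = abs(closing_speed_mps - ego_speed_mps) < STATIONARY_TOLERANCE_MPS
    if is_stationary and ego_speed_mps > SUPPRESS_STATIONARY_ABOVE_MPS:
        # An overturned trailer lands in this branch and is filtered out.
        return False
    return range_m < 60.0  # otherwise, brake when the object gets close

# At ~110 km/h (30.6 m/s), a stationary trailer 50 m ahead is ignored:
print(is_braking_candidate(50.0, 30.6, 30.6))   # False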

News reports from Taiwanese outlets, clumsily translated by machine, do seem to suggest that the driver, a 53-year-old man named Huang, had Autopilot activated: "The Fourth Highway Police Brigade said that driving Tesla was a 53-year-old man named Huang, who claimed to have turned on the vehicle assist system at the time. It was thought that the vehicle would detect an obstacle and slow down or stop, but the car still moved at a fixed speed, so when the brakes were to be applied at the last moment, it would be too late to cause a disaster."
Thankfully, nobody was seriously hurt in the accident. The takeaway is that, regardless of whether Autopilot was working or not, the driver should always be paying attention and ready to step in, especially since no Tesla, or any currently available car, is fully autonomous.
  • by Sebby ( 238625 ) on Monday June 01, 2020 @09:39PM (#60133446)

    Even if told otherwise, people will still believe that "autopilot" means "I don't need to do any piloting!" and let the car fully drive itself.

    Even a simple name change, like "Drive Assist" or anything similar, most probably would have prevented some of these avoidable accidents (some of which have been deadly).

    • Re: (Score:2, Insightful)

      by iotaborg ( 167569 )

      Autopilot has been used on airplanes forever, and it has never completely flown the airplane; pilot intervention is required. Fault of the ignorant driver for not understanding the technology or assuming "autopilot" means "autonomous". Fault of Tesla for overstating its capabilities without adequate disclaimers.

      • by Sebby ( 238625 ) on Monday June 01, 2020 @09:53PM (#60133502)

        Fault of the ignorant driver for not understanding the technology or assuming "autopilot" means "autonomous". Fault of Tesla for overstating its capabilities without adequate disclaimers.

        Agreed - I think the biggest factor is that, unlike airplane pilots, there's no user training required for Tesla's "autopilot", so buyers are left with the impression they got from the media (mostly TV/movies) that the planes just fly themselves.

        I feel like such training should be a requirement for operators of vehicles with any autonomous functions, perhaps with a new license level (just like for driving a big rig, for example), and not simply be left to the user manual.

        • by saloomy ( 2817221 ) on Monday June 01, 2020 @09:59PM (#60133530)
          It tells you when you turn it on that you need to keep your hands on the wheel and pay attention. EVERY SINGLE TIME, on the driver's screen, in an X at least. Autopilot is not "chauffeur". We use words because they have meanings. We can't assume people don't know what it means. Really, they are just taking risks.

          And so what? The number of accidents Autopilot helps prevent far exceeds the number it causes. This is statistically proven by the number of miles driven per accident.
          • by Your Father ( 6755166 ) on Monday June 01, 2020 @10:06PM (#60133546)
            Well, obviously we need captchas that focus on overturned trucks. Get ready to prove you are human with upside down trucks!
          • by Strider- ( 39683 ) on Monday June 01, 2020 @10:34PM (#60133674)

            It tells you when you turn it on that you need to keep your hands on the wheel, and pay attention. EVERY SINGLE TIME, in the drivers screen in an X at least. Autopilot is not "chauffeur". We use words because they have meanings. We can't assume people don't know what it means. Really, they are just taking risks.

            This type of design is the very worst possible. If a driver isn't actively paying attention to the driving task, their mind will wander, and it takes time for the system to regain the driver's attention before they can respond to what needs to be done. It really needs to be all or nothing.

            I use an autopilot on my sailboat on a regular basis; it's relatively similar to what you'd find on an airplane, though in the case of the boat it just keeps the boat going in a straight line. For the boat, this frees me up to manage other systems, such as the sails, and keep a better watch. But even then, I find my attention wandering if I'm not explicitly focusing on the task at hand. Fortunately, when your max speed is 6 knots, you have a lot more reaction time.

            • This type of design is the very worst possible. If a driver isn't actively paying attention to the driving task, their mind will wander, and it will take time for the system to regain the driver's attention in order to pay attention to what needs to be done. It really needs to be all or nothing.

              So on the one hand I have your assertion that "it needs to be all or nothing". And on the other hand I have data showing that the current "something" is still statistically safer than nothing, and will keep becoming safer until we get to "all".

              Which one do you think I'm gonna go with?

            • The funny thing is that a sailboat autopilot can be an amazingly simple device, with no electronics whatsoever (just a windvane and a tiller mechanism), and it does its job surprisingly well. Modern airplane autopilots are electronic, but they are relatively simple devices that only have to trim pitch and roll to maintain a course and altitude, and they do so with great precision. On the other hand, an automotive 'autopilot' requires an AI smarter than most human drivers (yup it's already there) and still fails at

              • by hawguy ( 1600213 ) on Tuesday June 02, 2020 @01:46AM (#60134250)

                still fails at its task enough times that it really can't be depended on to do what its name implies.

                Is that true? I thought that even with its well-publicized mistakes, the Tesla autopilot was still safer than humans overall. Car autopilots don't need to be accident-free to be safer than humans. And they'll have different failure modes than humans, so the accidents may be more notable.

                During Q3, we registered one accident for every 4.34 million miles driven in which drivers had Autopilot engaged. This compares to the national average of one accident for every 0.5 million miles based on NHTSA’s most recent US data.
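Taking those quoted figures at face value, the ratio works out to roughly nine to one (a back-of-the-envelope illustration only; Autopilot miles skew heavily toward easy highway driving, so this is not a like-for-like comparison of driver skill):

autopilot_miles_per_accident = 4.34e6   # from the Tesla quote above
us_average_miles_per_accident = 0.5e6   # NHTSA figure cited in the same quote

ratio = autopilot_miles_per_accident / us_average_miles_per_accident
print(f"{ratio:.1f}x more miles per accident with Autopilot engaged")   # ~8.7x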

          • Airplane pilots know what it can and can't do, and they are there to take over in case things go wrong. Also, in airplanes you in most cases have time to look over the issue at hand, versus very little time in a car (look at the Uber death).

      • Autoland is a thing [flightdeckfriend.com]. Auto-takeoff doesn't happen - but I bet that's more a function of the pilots' union than a technical block. And of course, "autopilot" in normal flight will follow a course you set in [faa.gov], including glide-slope descents. So once you're above 1000 feet - a commercial airplane autopilot could take care of the rest of the flight.
      • by Rei ( 128717 )

        "... without adequate disclaimers"

        The first time you start up Autopilot, it makes you read through and accept a giant infosheet about its limits. Every single time you start autopilot after that, it pops up a message telling you to keep your hands on the wheel and pay attention to the road. The manual section on Autopilot is one page of disclaimers after the next, several pages long. Tesla staff give disclaimers (and watch you to make sure you're not abusing it) during test drives. Even the website where

    • Which leads to the followup question: Do you trust Average Joe (not a trained professional pilot) to pay close attention to an autopiloted system instead of zoning out and enjoying the music from the stereo? And what value does this "car autopilot" system bring to the table if it still requires the driver to do piloting, only in a more boring manner? At least with plane autopilot systems, pilots can relax because there is lots of reaction time at 30,000 ft compared to an overcrowded highway.
      • by hawguy ( 1600213 )

        Do you trust Average Joe (not a trained professional pilot) to pay close attention to an autopiloted system instead of zoning out and enjoying the music from the stereo?

        Well we trust Average Joe to pay attention in non-autopiloted cars even though we know he won't.

        An autopilot that's 95% as good as a human but pays attention to the road 100% of the time sounds better than a person who's 100% as good as the average human (and half of drivers are below average) but only pays attention 90% of the time.
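The parent's point can be put into a toy expected-risk calculation (the percentages are the hypothetical ones from the comment, plus an invented penalty for driving while not paying attention; this illustrates the argument, it is not real accident data):

BASELINE_RISK = 1.0   # accident risk of an average, attentive human driver

def expected_risk(skill_vs_human, attention_fraction, inattentive_penalty=10.0):
    """Attentive time at normal risk, inattentive time at a much higher risk."""
    attentive = attention_fraction * (BASELINE_RISK / skill_vs_human)
    inattentive = (1.0 - attention_fraction) * BASELINE_RISK * inattentive_penalty
    return attentive + inattentive

print(expected_risk(0.95, 1.00))   # "95% as good, attentive 100% of the time": ~1.05
print(expected_risk(1.00, 0.90))   # "average human, attentive 90% of the time": ~1.90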

    • by gweihir ( 88907 )

      Indeed. People are generally stupid and generally incapable of reading the manual. At the same time, the more stupid they are, the more in control they think they are (the so-called "Dunning-Kruger Effect"). Hence technology for general use must make it exceptionally hard to hurt yourself with it. Sure, the driver here is probably 100% responsible legally, but ethically it is more a 50:50 thing.

    • by khchung ( 462899 ) on Monday June 01, 2020 @10:45PM (#60133708) Journal

      Even a simple name change, like "Drive Assist" or anything similar most probably would have prevented some of these avoidable accidents (some of which have been deadly)

      Citation needed.

      What would drivers do when they enable a "Drive Assist" function and the car steers itself along the road? Most would stop paying attention.

      Requiring the driver to keep paying attention when there is nothing to do is already a lost cause.

      Human beings are not wired to stay on standby and keep paying attention. People already have trouble keeping their attention up even when they are actively driving; e.g., there are long tunnels with varying wall patterns, and long roads intentionally made to curve left and right, purely to keep the driver from zoning out due to lack of change in scenery.

      While I am optimistic that autonomous driving will one day take over and result in safer roads, doing it halfway will not work.

      The right place to begin deployment is long-haul transportation where *no* driver would be present, and the software can be made to always err on the side of caution. Having the truck stop on any suspected obstacle and having someone override from a remote camera is no big deal for a truck not in a hurry.

      Using it on consumer cars with drivers in the front seat with little understanding of how AI could fail is a recipe for disaster.

      Autonomous driving on consumer cars should be 100% autonomous with no option for manual driving. That way, the seats can be designed to be much safer for the passengers (such as backward facing) without the constraint of having someone sitting closely behind a big hard wheel ready to crush them on impact.

      • Re: (Score:3, Insightful)

        by iotaborg ( 167569 )

        I have an Audi with such driver assist functions. It's nowhere near as good as Tesla's, thus I do not trust it, and am always ready to take the wheel. Tesla's problem is that they got too good, but not good enough to be 100% reliable.

      • by rho ( 6063 )

        long roads that intentionally made to curve left and right, for the purpose of keeping the driver from zoning out due to lack of change in scenery

        Man, I guess you've never been to West Texas. The only curvature you can see in the roads comes from the Earth.

    • People are not computers. They adapt. After a while, people get used to the convenience provided by "driver assist", and tend to let their attention slip.
    • avoidable accidents

      Well that's quite the oxymoron!

    • Re: (Score:2, Insightful)

      "Autopilot" is the most fitting name, the problem being that most people aren't pilots to comprehend it.
      • Autopilots in planes can take over almost immediately after takeoff (once you clear 1000 feet) - fly you to your destination, and even land. In what way is the current Tesla solution like an Autopilot in a plane?
        • by hawguy ( 1600213 )

          Autopilots in planes can take over almost immediately after takeoff (once you clear 1000 feet) - fly you to your destination, and even land. In what way is the current Tesla solution like an Autopilot in a plane?

          Airplane autopilots will disconnect if the wings (or a critical instrument [wikipedia.org]) ices up and the autopilot can no longer control the plane, or if the plane has a serious bird strike, or the engine fails, etc -- the pilot needs to be able to take over at any time. During good conditions, the Tesla autopilot can drive you all the way to your destination, but if something unusual happens, you better be ready to take over.

          • Autopilots also give you a rather loud and hard-to-ignore warning when they disconnect; does Tesla's Autopilot do that?
            • by hawguy ( 1600213 )

              Autopilots also give you a rather loud and hard-to-ignore warning when they disconnect; does Tesla's Autopilot do that?

              If a driver isn't touching the steering wheel, the Tesla autopilot warns the driver several times over several minutes with increasingly noticeable audible/visual alerts to take the wheel, until finally it will come to a stop if the driver refuses to take control. I don't know how it warns the driver that he needs to take over suddenly, like if it can no longer see the edge of the road, I assume there's some alert (my non-Tesla will beep annoyingly if Radar Cruise Control cancels itself or it loses si
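The escalation described above can be pictured as a simple state machine. The stages and timings below are invented for illustration and are not Tesla's actual values:

import enum

class Alert(enum.Enum):
    NONE = 0
    VISUAL = 1        # flashing banner on the driver's screen
    AUDIBLE = 2       # chime plus banner
    URGENT = 3        # continuous tone
    SLOW_TO_STOP = 4  # hazards on, car brings itself to a stop

def alert_level(seconds_hands_off):
    if seconds_hands_off < 30:
        return Alert.NONE
    if seconds_hands_off < 60:
        return Alert.VISUAL
    if seconds_hands_off < 90:
        return Alert.AUDIBLE
    if seconds_hands_off < 120:
        return Alert.URGENT
    return Alert.SLOW_TO_STOP

print([alert_level(t).name for t in (10, 45, 75, 100, 150)])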

          • by hawguy ( 1600213 )

            Autopilots in planes can take over almost immediately after takeoff (once you clear 1000 feet) - fly you to your destination, and even land. In what way is the current Tesla solution like an Autopilot in a plane?

            Airplane autopilots will disconnect if the wings (or a critical instrument [wikipedia.org]) ices up and the autopilot can no longer control the plane, or if the plane has a serious bird strike, or the engine fails, etc -- the pilot needs to be able to take over at any time. During good conditions, the Tesla autopilot can drive you all the way to your destination, but if something unusual happens, you better be ready to take over.

             And in almost all planes, the autopilot will obliviously fly you right into another plane or unexpected terrain (like a radio tower that's not in its database). Even if the plane is equipped with TCAS and it's blaring at you to take evasive action, in most planes TCAS guidance needs to be flown by the pilot; the autopilot won't do it.

    • by swillden ( 191260 ) <shawn-ds@willden.org> on Monday June 01, 2020 @11:36PM (#60133870) Journal

      Even if told otherwise, people will still believe that "autopilot" means "I don't need to do any piloting!" and let the car fully drive itself.

      Even a simple name change, like "Drive Assist" or anything similar most probably would have prevented some of these avoidable accidents (some of which have been deadly)

      I don't buy it. I don't think a name change would make any difference at all.

      When Google first started experimenting with self-driving cars, they let some employees -- engineers who fully understood the deficiencies of the early system and *knew* they had to stay alert -- use them as daily vehicles, but with cameras inside to monitor usage, specifically to see how people interacted with the system. The designers of this test expected that they were looking for small changes in the amount of time the driver looked away from the road, or other subtle clues. What they found instead was that after a few hours of the vehicle performing pretty okay in common, easy driving scenarios (e.g. freeway traffic) even people who thoroughly understood the limitations tended to stop paying attention for long periods of time. What's more, they didn't even seem to realize how long they had stopped paying attention. When later shown the video of their own behavior, they were shocked and surprised at how irresponsible they'd been.

      The bottom line is that "driver assist" systems that enable the driver to reduce how much attention they pay without fairly immediate negative feedback will cause driver inattention. This observation prompted Google (now Waymo) to decide that nothing less than level 4 (complete autonomy under specified conditions) is safe.

      In practice I think Tesla's numbers have proven Google wrong about that, not because drivers do pay attention to what the self-driving car is doing but because the safety bar is so ludicrously low -- human drivers are so awful -- that on balance a level 3 system can actually be about as safe as a human driver. It'll screw up regularly, and occasionally horrifically, but so do humans. Humans tend to fail in different ways, screwing up by falling asleep, or getting distracted, or being under the influence, etc., but on balance the numbers are close to the same, and maybe even favor the Tesla system.

      Still, Waymo's approach is that full level 4 is the only way to go, and they're operating fully-autonomous (no "safety driver") taxis in Phoenix right now. Of course Phoenix has no snow; little rain; broad, well-marked roads and (relatively) light and non-aggressive traffic, so the ability to operate autonomously there means little about the ability to operate in worse conditions. That's why Waymo is also testing in Michigan; they'll get there eventually.

      • by thegarbz ( 1787294 ) on Tuesday June 02, 2020 @03:51AM (#60134570)

        -- human drivers are so awful --

        This cannot be stated enough. Most of the comments here seem to be along the lines of autopilot enabling drivers not to pay attention. That doesn't seem to stop every moron driving down the highway texting or reading the news or whatever.

        I one day avoided what I consider a sandwich of inattentive stupidity, caused by something not quite as bad as in this footage. Driving on the right side of the highway, the truck in front of me for no reason drifted slightly off the road and hit a service vehicle parked on the shoulder. I was following way too close to brake in time (because I'm an awful driver), so brakes + ABS + traction control + yanking the wheel got me around the outside of the truck (luckily the guy next to me was paying attention when I nearly swerved into him). The guy 10m *behind me* looks to have not even attempted to brake; he hit the truck at full speed.

        We are all morons just waiting to get into stupid accidents, and while this is a news story because it *may* have been an autopilot fail, this stuff happens to human drivers constantly. We don't hear about it for the same reason that a shooting in Chicago isn't newsworthy on the national level. It's normalised to the point of being boring and may end up as a footnote on some news show.

    • by barc0001 ( 173002 ) on Monday June 01, 2020 @11:39PM (#60133882)

      People are stupid no matter what. When I lived in the prairies we were buying a camper for the family, and in front of the dealership when we pulled in there was a mostly destroyed motorhome over by the service bays. My dad asked one of the guys about it as we walked around the lot, and it turns out some older guy had bought it and misunderstood what "Cruise Control" meant, turned it on, and went in the back to go make a sandwich. Well, the road curved left and the motorhome went straight into a wheat field, then flipped over when one tire hit a softer patch of soil. The guy lived, and was outraged that his fancy cruise-controlled motorhome didn't drive itself when the cruise was activated.

      This was in the early 80s. Dumb people have misunderstood driver assist systems since their invention. 50 years from now someone's going to go ballistic when their auto-driving car can't read their mind and stop at an ice cream shop when they suddenly pass it and realize they want ice cream.

    • As anyone familiar with the auto-pilot in the X-series space trading games will probably attest, auto-pillock would be a far more accurate name for this feature. Normally it would just be content with scraping off half your shields on the docking port, but every so often, for no discernible reason, it would just fly you at full speed into the side of the station.
    • by DrXym ( 126579 )
      Yes, obviously. But Tesla likes to use hype and bullshit to sell this technology, and calling it "drive assist" or "advanced lane keeping" or whatever doesn't sound as misleadingly sexy as "autopilot". And then people die. And then you can guarantee you'll hear someone say "uhnuh, go and look up what an autopilot in a plane is" as if some dry definition cancels out the commonly understood one.
  • by laird ( 2705 ) <lairdp@gmail.TWAINcom minus author> on Monday June 01, 2020 @09:43PM (#60133466) Journal

    Human brains are also bad at seeing things they aren't expecting, and it's surprisingly common for drivers to run into trucks pulled straight across a road, in a type of accident referred to as a 'side under-run'. So "not detecting the large rectangular object right in front of it, in broad daylight and clear weather" is actually pretty common for human drivers, because light-colored trucks look a lot like the sky to a human eye or to an AI vision system, and drivers on long boring drives tend to just follow the lane markings and (for example) drive straight into trucks that are pulled across a road. The Tesla does have an advantage over a human, with the radar, but it sure looks like it didn't respond in time - perhaps slowing down at the end, though it's hard to tell from the video. I'd wonder if perhaps the material the top of the truck was made of suppressed radar reflections? Or perhaps the combination of factors was so unique that it wasn't trained to recognize it? That's a challenge with neural networks: unless they're trained on a pattern, they won't recognize it, so things that happen very rarely don't become a trained pattern.

    • And that's the problem with ML cars. You can't possibly pre-train them for every situation whereas a person can figure out what to do when they encounter something new.

      Until cars are actually semi-intelligent in the real sense, and not in the faux ML sense of pre-canned training for various already-known situations, the cars will keep driving into things because they're not intelligent in any way and can't think their way out like people can.
      • by saloomy ( 2817221 ) on Monday June 01, 2020 @10:22PM (#60133624)
        No. These cars improve on human safety. You're making a bad assumption that drivers are perfect, and they aren't. Not even close. By the latest statistics, these cars are 12x safer than human-only vehicles.
      • And that's the problem with ML cars. You can't possibly pre-train them for every situation

        But over time they learn, and the important thing is that they remember forever once they are trained on some new problem, because software updates go out to all cars.

        When will human drivers stop driving into trucks and other stationary obstacles? Never because there keep being new human drivers that also have to be trained, each one independently, always having to re-learn as much as they can in a very short lifet

      • by kackle ( 910159 )
        Thank you. These cars will be forever befuddled by things a child could comprehend and deal with. Unusual situations on the road are not unusual when there are millions of cars on the roads every single day.
    • bla, bla, bla.... the ... car ... drove .. into ... a .. parked ... truck!!! YIKES!!!
      This is a complete fail. Let's push the idiocy and have AI driving everything and watch a schoolbus full of kids pile into a gas tanker... Ya, that will be great on CNN. Complete proof this isn't ready for prime time.
      • Re: (Score:2, Informative)

        by saloomy ( 2817221 )
        Because a human has never driven into a parked or overturned vehicle? This is just another extremely rare edge case that will now be handled. Don't advocate for throwing out the baby with the bathwater. The technology is actively saving lives, as it's statistically less likely to crash than a human alone.
        • Doesn't matter what a human has or has not done. This needs to stand on its own. Comparing AI to what a human can do negates any benefits of the vaporware promise of AI.
        • So - in the case of the school bus into a tanker, the school bus driver is at fault, and if they survive, will pay the cost via a trial. In this case - which programmer or executive is charged with at least reckless driving or at most vehicular manslaughter?
      • bla, bla, bla.... the ... car ... drove .. into ... a .. parked ... truck!!!

        Uhuh. "Parked". On it's side, across two lanes of a highway.

        The fact that a human can "park" a truck in such a manner tends to suggest we aren't ready for prime time either. I've certainly never seen auto-pilot "park" a Tesla like that.

    • What I am interested in is that AP should not even figure into this. Teslas have radar/cameras, and they are used not just for driving but for the safety features too. It should have at least slammed on the brakes just from the emergency braking system.

      In the States, nearly all semi-truck trailers like this are covered with metal. In Germany, I noticed a number of trailers that had just cloth on the sides. I wonder if having, say, a white canvas would allow radar through, while the camera did not know what to make of it?
      • They worked as intended: reduce impact to a survivable speed. The driver walked away; no need to go to the hospital.

        The crash avoidance systems apparently were not engaged (along with autopilot not being engaged).

  • The difference between a self-driving car and a human driving is that Tesla can push out an update and make sure none of its cars ever make this particular mistake again.
    • It's too bad it can't push a Lidar hardware update though.

      • And yet, other carmakers are not saying that LIDAR will not be needed. Radar combined with cameras does the trick.
    • by Luthair ( 847766 )
      Which might break it in many other scenarios, or maybe the sensors on the cars aren't even adequate to detect this particular scenario.
  • Isn't the driver supposed to pay attention even when the autopilot thing is on? They should have seen the truck and braked.

  • One more thing... (Score:4, Informative)

    by kurkosdr ( 2378710 ) on Monday June 01, 2020 @10:23PM (#60133630)
    This accident would have been avoided if the vehicle's "autopilot" was backed by a proper LIDAR system (laser-based ranging, not to be confused with radar). Instead Tesla went with cheaper sensors (basically cameras and ultrasound), which means the vehicle does NOT see a 3D mesh of its surround environment but instead has to piece it together from cameras and ultrasound, with some AI guesswork involved. Tesla hoped their Magic Artificial Intelligence Software(tm) would somehow always guess correctly and make up for the shortfall, but as every programmer knows, there is always that one special case. (A sketch of what a direct 3D view buys you follows this sub-thread.)
    • *surround environment = surrounding environment
    • The accident would have been avoided if either crash avoidance or Autopilot had been enabled. This was just a distracted human driving a car.
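To illustrate the difference the LIDAR comment above is pointing at: with a direct 3D point cloud, "is there something solid in my lane?" becomes a purely geometric test that needs no object recognition, whereas a camera-only system must first infer depth and then classify what it sees. A minimal sketch under those assumptions (hypothetical data and thresholds):

# Each point is (x_forward_m, y_left_m, z_up_m) relative to the car.
points = [
    (45.0, 0.3, 1.2),   # returns from an overturned trailer's roof
    (46.1, -0.6, 1.5),
    (80.0, 6.0, 5.0),   # an overhead sign well above the drive corridor
]

def obstacle_in_corridor(points, lane_half_width=1.8, max_height=2.5, horizon=70.0):
    """Geometric check: any point inside the box the car is about to drive through."""
    return any(
        0.0 < x < horizon and abs(y) < lane_half_width and 0.2 < z < max_height
        for x, y, z in points
    )

print(obstacle_in_corridor(points))   # True: something solid is in the lane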

  • by kriston ( 7886 ) on Monday June 01, 2020 @10:24PM (#60133636) Homepage Journal

    They need to seriously consider LIDAR or don't bother anymore. The CEO's insistence that LIDAR is too expensive is foolish, dangerous, and will end this endeavor eventually.

    • by Corbets ( 169101 )

      Right. Because the, what, handful of these deaths compares so negatively against other manufacturers? Not by the stats I've seen - but perhaps you're judging solely based on the headlines that have been presented to you by your chosen news sources...

  • They only add features that they really know you will need
  • As this example clearly demonstrates, cars never hit vehicles stopped on the *side* of a freeway at such speeds that the car practically disintegrates.

    https://6abc.com/car-crash-vid... [6abc.com]

  • by bobstreo ( 1320787 ) on Monday June 01, 2020 @10:56PM (#60133744)

    Is this the scene from Anchorman 2?

    https://youtu.be/LUEDVMOMY24?t... [youtu.be]

  • It was thought that the vehicle would detect an obstacle and slow down or stop, but the car still moved at a fixed speed, so when the brakes were to be applied at the last moment, it would be too late to cause a disaster."

    -- too late to not cause a disaster
    -- too late to prevent a disaster

  • LIDAR (Score:4, Insightful)

    by Darkling-MHCN ( 222524 ) on Monday June 01, 2020 @11:05PM (#60133774)

    So I would say that this is definitive proof that all the Tesla fanboys on this thread...

    https://tech.slashdot.org/stor... [slashdot.org]

    with the top rated comment...

              Considering that Tesla has recently demonstrated that they can now map their surroundings with near LIDAR-like precision using just their cameras and radar, VOLVO has already lost on cost. https://cleantechnica.com/2020 [cleantechnica.com]... [cleantechnica.com]

    ...who repeatedly slagged off Volvo for using LIDAR, boasting about how fantastic the system on the Tesla was... were totally and utterly wrong.

    The more sensors you have on a fully autonomous vehicle, the better. They all have weaknesses; you need all of them feeding the computer to minimise the chance of a failure. Especially a failure as complete as the one demonstrated in that video.
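A crude illustration of the "more sensors, fed together" point: if each sensor can independently veto "path clear", a blind spot in one modality (a camera washed out by a bright sky, a radar filter ignoring stationary returns) no longer decides the outcome on its own. Sketch only, with made-up inputs:

def path_clear(camera_sees_obstacle, radar_sees_obstacle, lidar_sees_obstacle):
    """Conservative fusion: any one sensor reporting an obstacle blocks 'clear'."""
    return not (camera_sees_obstacle or radar_sees_obstacle or lidar_sees_obstacle)

# Camera fooled by a white trailer against a bright sky, radar return suppressed
# as stationary clutter, but the LIDAR geometry still catches it:
print(path_clear(camera_sees_obstacle=False,
                 radar_sees_obstacle=False,
                 lidar_sees_obstacle=True))   # False: do not proceed at speed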

  • Kind of interesting, in that this Twitter guy, rooster, is a TSLAQ guy. As such, I have to wonder how honest he is being.
  • It seems like crash avoidance was solely dependent on AI. It mistook the truck for an overhead sign or billboard. Why wouldn't they use radar to physically detect something in front of the car as a failsafe?
    • I had the same question, and this Wired article [wired.com] answers some of it.

      Intuitively, I had thought that if I were building an auto Autopilot, the one thing that I would nail down and make bulletproof is the failsafe that would keep the car from running into something.

      It turns out that it isn't that simple. The Tesla manual warns that its system cannot detect all stationary objects, particularly at freeway speeds. And it turns out that Volvo's system has the same shortcoming even with its lidar suppo

  • by dohzer ( 867770 ) on Tuesday June 02, 2020 @01:53AM (#60134284)

    Shouldn't the title have been "Tesla Travelling at Full Speed Narrowly Misses Truck Driver"? Why didn't it even slow a little for the person standing on the road? What kind of systems are we allowing on our roads!?

  • Typical /. herpderp (Score:5, Interesting)

    by danskal ( 878841 ) on Tuesday June 02, 2020 @08:15AM (#60135036)

    The original article is quite clear that ****Autopilot Was NOT Enabled****.

    The crash mitigation system is not designed to come to a complete stop from 70mph. That would be too dangerous (it could start a pile-up on a false positive).

    SMH, the comments on Yahoo were more sensible.
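Some rough kinematics behind the point that a mitigation system braking late cannot be expected to fully stop from highway speed (assumed numbers: ~0.8 g peak braking, which real systems and road surfaces may not reach):

g = 9.81                 # m/s^2
decel = 0.8 * g          # assumed peak automatic-braking deceleration
v0 = 70 * 0.44704        # 70 mph in m/s (~31.3 m/s)

full_stop_distance = v0 ** 2 / (2 * decel)               # ~62 m needed to stop
speed_after_30m = (v0 ** 2 - 2 * decel * 30.0) ** 0.5    # if braking begins 30 m out

print(f"full stop from 70 mph needs ~{full_stop_distance:.0f} m")
print(f"braking over the last 30 m still leaves ~{speed_after_30m / 0.44704:.0f} mph")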

  • by jschultz410 ( 583092 ) on Tuesday June 02, 2020 @10:48AM (#60135556)

    If Tesla's auto-pilot system can't detect a massive object blocking your entire lane and bring you safely to a full stop before colliding with it, then it isn't ready for use on our highways.
