Transportation

A Tesla on Autopilot Crashed Into a Parked Police Car (fortune.com) 265

An anonymous reader quotes Fortune: A Tesla vehicle in Autopilot mode collided with a parked police cruiser in California, authorities said. The Tesla sedan was driving outbound when it struck a parked Laguna Beach police car, the Laguna Beach police department said Tuesday. According to police, the driver of the Tesla sustained minor injuries. The police cruiser was empty of officers at the time of the crash. Laguna Police Sgt. Jim Cota told the Los Angeles Times the police car "is totaled."
The police sergeant also told the Times that it was the same area where a Tesla crashed into a semi-truck last year, adding "Why do these vehicles keep doing that? We're just lucky that people aren't getting injured."

"Tesla has always been clear that Autopilot doesn't make the car impervious to all accidents," Tesla responded in a statement, "and before a driver can use Autopilot, they must accept a dialogue box which states that 'Autopilot is designed for use on highways that have a center divider and clear lane markings.'"

Record producer Zedd also responded to the news by sharing on Twitter what he calls "the other side": I once fell asleep driving home late at night on the highway (w/ autopilot on) and got woken up by it beeping + turning off music to wake me up. Would have prob been dead without it... I didn't touch the steering wheel for a couple minutes and then it turned off the music and started beeping. Elon Musk responded to the tweet, "Glad you're ok!"
  • Please stop (Score:3, Insightful)

    by Anonymous Coward on Saturday June 02, 2018 @05:41PM (#56717490)

    Drop the autopilot name and call it drive assist. The first implies it drives by itself, the second clearly means you still need to be at least holding the steering wheel.

    • But but but the marketing department said....
      • Re:Please stop (Score:5, Informative)

        by ShanghaiBill ( 739463 ) on Saturday June 02, 2018 @07:14PM (#56717906)

        All the people complaining about the name "Autopilot" being misleading have one thing in common: They don't own or drive Teslas.

        Tesla makes it extremely clear, when you buy the car, and every time you drive it, that Autopilot doesn't fully control the car, and the driver needs to stay alert and be ready to take control at any time.

        There are plenty of problems with Autopilot, but being "misleading" is not one of them.

        • First of all, they don't know who you might lend the car to. Secondly, they also train humans not to get into accidents in normal cars. Yet, shocker, they do. If humans were infallible, autopilot wouldn't be required in the first place. Tesla is almost asking that humans suddenly only err in ways that work with their tech. It just doesn't work that way.
        • All the people complaining about the name "Autopilot" being misleading have one thing in common: They don't own or drive Teslas.

          Let me guess... You didn't do a study to get the data to be able to make this claim? Am I right?

    • Re: Please stop (Score:3, Insightful)

      by reanjr ( 588767 )

      But that's how autopilot works on planes. Why should we redefine what autopilot means on cars?

      • Re: Please stop (Score:5, Insightful)

        by fisted ( 2295862 ) on Saturday June 02, 2018 @07:00PM (#56717836)

        Since when do you need to hold the "steering wheel" in a plane that's on autopilot?

        • Since when do you need to hold the "steering wheel" in a plane that's on autopilot?

          Whenever you have an unexpected situation ... same as the car.

          • Yet, you know that there are important differences in the way it works.
            Monitoring an airplane autopilot is a very passive activity which does not require a lot of alertness (except in an autoland situation, which does not last so long).
            An airplane autopilot can disconnect if it encounters a situation it cannot cope with, and at that time the pilot has to become very active. However, the analogy between the Tesla autopilot and the airplane autopilot ends here:
            - There are visual and audible warnings
          • Re: Please stop (Score:4, Insightful)

            by AmiMoJo ( 196126 ) on Sunday June 03, 2018 @03:00AM (#56719020) Homepage Journal

            The difference being that in a plane an alarm goes off and you have several seconds, maybe minutes to figure out what the problem is and do something about it. You also have a human co-pilot checking your work and taking turns watching over what is happening.

            If Autopilot were like that (level 3 autonomy: the driver doesn't need to pay attention and has 30 seconds to take over when it needs to disengage) it would be fine. As it is, you have to be constantly alert and ready to take over in a fraction of a second.

            That's an unrealistic burden to place on the driver; human beings are not good at that.

          • Since when do you need to hold the "steering wheel" in a plane that's on autopilot?

            Whenever you have an unexpected situation ... same as the car.

            AP requires you to keep your hands on the wheel at all times, not just when unexpected situations arise. By then it could be too late.

      • Re: Please stop (Score:5, Informative)

        by Anonymous Coward on Saturday June 02, 2018 @07:14PM (#56717902)

        You clearly don't know anything about how autopilots work on planes. The autopilot in most airplanes will make no attempt at all to avoid obstacles. You tell it where to go and depending on how sophisticated the autopilot is, you may be able to specify the altitude and climb and descent rates. In some cases they can track satellite or ground based navigation systems. If I tell my autopilot to descend to 500 feet and point it at a mountain, my airplane will crash into that mountain. You have to tell it exactly what to do. Some airplane autopilots can only maintain a specific heading and don't even maintain altitude for you. If there is a crosswind you have to adjust the autopilot to stay on the correct course. Very few airplanes have autopilots that can handle all phases of flight. In most cases you have to take off and land yourself. Autopilot does not mean that the airplane intelligently flies itself to your destination.
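        A minimal illustrative sketch of that point, in Python (the function and numbers are invented for illustration, not any real avionics interface): a basic heading-hold autopilot nulls heading error and does nothing else. No terrain awareness, no obstacle avoidance.

        ```python
        # Minimal heading-hold "autopilot" sketch (illustrative only).
        # It steers toward the commanded heading and knows nothing about
        # terrain or obstacles ahead of the aircraft.

        def heading_hold(commanded_deg, actual_deg, gain=0.5, max_bank_deg=25.0):
            """Return a bank-angle command steering toward the commanded heading."""
            # Wrap the error into [-180, 180] so the shorter turn is taken.
            error = (commanded_deg - actual_deg + 180.0) % 360.0 - 180.0
            bank = gain * error
            return max(-max_bank_deg, min(max_bank_deg, bank))

        # Point it at a mountain and it will hold that heading all the way in.
        print(heading_hold(270.0, 265.0))  # 2.5 degrees of bank, gentle correction
        print(heading_hold(90.0, 270.0))   # -25.0 degrees, clamped hard turn
        ```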

        • Indeed, the Tesla "Autopilot" is an order of magnitude more intelligent than a normal plane autopilot. And both can kill for the same reason.

          There have been a number of crashes, like the Asiana one at SFO, where pilots have set the autopilot (actually the autothrottle, for nitpickers) to the wrong mode. The pilots then do not monitor basic things like airspeed, until the plane falls out of the sky. They would probably be better off with no auto-anything.

        • You clearly don't know anything about how autopilots work on planes.

          The thing is, operating a car doesn't require hundreds of simulator hours and proper certification.

        • by mjwx ( 966435 )

          You clearly don't know anything about how autopilots work on planes. The autopilot in most airplanes will make no attempt at all to avoid obstacles. You tell it where to go and depending on how sophisticated the autopilot is, you may be able to specify the altitude and climb and descent rates. In some cases they can track satellite or ground based navigation systems. If I tell my autopilot to descend to 500 feet and point it at a mountain, my airplane will crash into that mountain.

          This. As evidenced by the Germanwings flight: a homicidal/suicidal pilot tells the plane to fly right into a mountain... the plane flies right into the mountain. Autopilot systems require pilots to remain aware and ready to take control at a moment's notice.

          The difference between a plane and a car is that in a plane, a moment's notice likely means you have 20-30 seconds minimum to fix the problem; a moment's notice in a car means 2-3 seconds max to avoid a gruesome crash (a rough sense of the distances involved is sketched below).

          Pilots aren't lazy, when autopilot is on they're
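          To put those 2-3 seconds in perspective, a rough back-of-envelope sketch (the 65 mph figure is an assumption chosen for illustration):

          ```python
          # Distance covered during a driver's take-over window at highway
          # speed. 65 mph and the 2-3 second window are illustrative figures.

          MPH_TO_MPS = 0.44704
          speed = 65 * MPH_TO_MPS  # about 29 m/s

          for seconds in (2, 3):
              print(f"{seconds} s at 65 mph = {speed * seconds:.0f} m")
          # 2 s at 65 mph = 58 m
          # 3 s at 65 mph = 87 m
          ```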

      • Because the people driving cars do not have the same level of training as the people piloting planes.

        There's more to a name than the activity level. The name is about communication, and the general public simply does not have the training to make those distinctions. Thus, since the *audience* is qualitatively different, using the same name is a mistake.

      • Because pilots have way more rigorous training than drivers who can't be bothered to read the manual?

      • Because basically if you're stupid and barely conscious as most people are, you don't get to drive a plane.

        You do get to drive a car though.
    • by Corbets ( 169101 )

      Do we really have to have this same discussion on every Tesla article?

      For crying out loud, my Lucky Charms don’t show any evidence of luck either, but nobody is complaining about those being misused and misnamed.

  • by iggymanz ( 596061 ) on Saturday June 02, 2018 @05:45PM (#56717508)

    if autopilots aren't advanced enough to sense road boundaries, the center lane line, and parking spot markers (which this road had), then they aren't advanced enough to drive a car

    • by BasilBrush ( 643681 ) on Saturday June 02, 2018 @06:04PM (#56717588)

      First there was cruise control. It maintained speed.
      Then there was adaptive cruise control. It maintained speed and distance from the car in front.
      Now there's autopilot. It maintains speed, distance from the car in front, follows lanes and assists with overtaking.

      At no stage did anyone say you could take your focus off the road with any of them. And there's no need to revert to older technology because some people are stupid.

      • First there was cruise control. It maintained speed. Then there was adaptive cruise control. It maintained speed and distance from the car in front. Now there's autopilot. It maintains speed, distance from the car in front, follows lanes and assists with overtaking.

        At no stage did anyone say you could take your focus off the road with any of them. And there's no need to revert to older technology because some people are stupid.

        But only at the third stage could one take their focus off the road.

      • by Actually, I do RTFA ( 1058596 ) on Saturday June 02, 2018 @06:21PM (#56717656)

        there's no need to revert to older technology because some people are stupid.

        On the contrary. If there are enough people who cannot understand how to use a technology, it should be held back. You fix that by educating them, not by getting them to agree to disclaimers and then blaming them.

        Some people will obviously misuse technology. But if the number misusing it is high enough, there is either a problem with the technology or with how it is being marketed/explained to people.

    • by jtgd ( 807477 )
      I've been in situations where *I* couldn't determine with certainty where the lanes were or where the edge of the road was. I wonder if autopilot could do any better. At least it is looking in all directions at other cars and could conceivably know to drive centered between them... if it were programmed to handle loss of lane lines with a plan B.
  • Well (Score:5, Funny)

    by burtosis ( 1124179 ) on Saturday June 02, 2018 @05:49PM (#56717522)
    At least it stopped automatically for the cops.
  • This happened over a week ago!
  • The autopilot refused to take a field sobriety test.

  • by AlanObject ( 3603453 ) on Saturday June 02, 2018 @05:51PM (#56717536)

    An airplane's autopilot can crash the plane: by flying into the side of a mountain, running out of fuel, running into another plane, or flying into weather conditions the plane can't handle. All possible and even likely if the human pilot does not take responsibility.

    And the collection of devices is still called "Autopilot", and has been for more than half a century. Nobody claims that the respective manufacturers have oversold their product and/or delivered a defective product.

    I mention this because I pointed out this obvious fact when this story popped up on a popular liberal political blog. I was roundly denounced for "blaming the victim," then referred to breitbart.com, where apparently that is considered acceptable. Stupid me for expecting better.

    So I guess the noisy media circus that goes on any time Tesla is mentioned isn't going to abate anytime soon.

    • Fortunately for pilots, there are no police cars or concrete dividers in the sky. Unfortunately for Tesla drivers, they are plentiful on the ground.
    • by Known Nutter ( 988758 ) on Saturday June 02, 2018 @06:10PM (#56717612)
      "Autopilot" is a poor name for Tesla's driver-assist technology because most people associate the word autopilot with "totally autonomous" and aren't bothered by the nuances of the technology as applied to aircraft. To most folks, autopilot means exactly what it sounds like, and it's pretty clear that there are a number of Tesla drivers treating it as such.

      The comparison to an aircraft's autopilot, while technically correct, is irrelevant to the discussion. A remarkably minuscule percentage of the human population will ever see a cockpit, let alone operate the controls. Autopilot means George Jetson era autopilot and that's that.
      • by larryjoe ( 135075 ) on Saturday June 02, 2018 @06:35PM (#56717710)

        "Autopilot" is a poor name for Tesla's driver-assist technology because most people associate the word autopilot with "totally autonomous" and aren't bothered by the nuances of the technology as applied to aircraft. To most folks, autopilot means exactly what it sounds like, and it's pretty clear that there are a number of Tesla drivers treating it as such.

        The comparison to an aircraft's autopilot, while technically correct, is irrelevant to the discussion. A remarkably minuscule percentage of the human population will ever see a cockpit, let alone operate the controls. Autopilot means George Jetson era autopilot and that's that.

        Actually Autopilot is the perfect name for the technology, at least from a marketing perspective. It implies hands-off driving to those that haven't been trained to fly airplanes, i.e., just about everyone, and therefore drives sales. At the same time plausible deniability exists because there is some logical explanation that makes sense to some set of individuals.

    • by mrclevesque ( 1413593 ) on Saturday June 02, 2018 @06:14PM (#56717630)

      You can let a plane or a boat follow a heading and most of the time everything is fine. Autopilot simply keeps you from drifting off course due to winds or currents.

      What Tesla is selling clearly isn't autopilot in anything like that sense. They're using "autopilot" as a high-tech marketing term and letting people believe the car can do things it really can't. To make matters worse, Tesla is also letting "drivers" not pay attention to what's going on for long periods of time, reinforcing the idea that the car can do more than it really can. So as far as this specific crash is concerned, it likely wouldn't have happened if not for Tesla's negligence.

    • The autopilot in a plane also requires all pilots to be trained and aware of not just its abilities but also its well defined limitations. Are you suggesting Tesla drivers should all require a separate training and certification process to be able to use autopilot?
  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Saturday June 02, 2018 @05:51PM (#56717540) Homepage Journal

    Fords have killed tens of people today, as they do every day. On any typical day, more than 100 people die in the U.S. in auto accidents while riding in brands other than Tesla. In contrast, a handful of people have died in Teslas.

    While the NTSB is interested in batteries and self-driving systems, their announcements of investigations create a false impression that Teslas are more dangerous than other vehicles. The opposite is most likely the case, since a self-driving system, properly used, has the collision-avoiding attention of the driver and of a computer too.

    So, why so much bad news about Tesla?

    Tesla is also the most shorted stock at present, with short positions covering more than a quarter of all outstanding shares and perhaps as much as one third. That means a great many investors are desperate to see Tesla's stock reach a much lower price soon, or they'll be forced to buy it at its present price in order to fulfill their short positions, potentially bankrupting many of them and sending some out of the windows of Wall Street skyscrapers. These investors are desperately seeding, feeding, and writing negative stories about Tesla in the hope of depressing the stock price. Musk recently taunted them by buying another $10 million in stock, making it even more likely that there won't be enough stock in the market to cover short positions. If that's the case, short-sellers could end up in debt for thousands of dollars per shorted share -- as the price balloons until enough stockholders are persuaded to sell. Will short-sellers do anything to give Tesla bad press? You bet.

    And of course there's the interest of the gasoline industry, which will go out of business given the proliferation of fully-electric vehicles that are actually good enough to compete with gasoline ones, a position that only Tesla holds so far. Entrenched automotive manufacturers also have every reason to seed and feed bad press while they fail to build their own battery manufacturing plants. Before Tesla, one could see the obvious activities of these powers in seeding bad news about the Prius.

    Then there's the fact that Tesla does not advertise. Given the queue of Model 3 reservations, Tesla already has all of the sales they need for their next three years of their factory's production, before they might have any economic reason to advertise. This can't be comfortable for the press, and no doubt makes them more willing to carry stories seeded by those who would harm Tesla.

    • by wonkey_monkey ( 2592601 ) on Saturday June 02, 2018 @05:53PM (#56717550) Homepage

      Fords have killed tens of people today, as they do every day. On any typical day, more than 100 people die in the U.S. in auto accidents while riding in brands other than Tesla. In contrast, a handful of people have died in Teslas.

      There are presumably a lot more Fords on the road than Teslas.

    • by fluffernutter ( 1411889 ) on Saturday June 02, 2018 @05:55PM (#56717554)
      If Tesla autopilot actually drove 3.22 trillion miles a year in the US in all conditions and all places like humans do, how many accidents would that extrapolate to? Anyone care to make a guess?
      • Keep in mind, California is a relatively safe state to drive in. https://www.safewise.com/blog/... [safewise.com]
        • Unless CA drivers drive shorter distances. It really should be broken down by miles driven rather than number of drivers. The top state, Massachusetts, has about a quarter the rate of the bottom state, North Dakota. But North Dakota is all spread out, while Massachusetts is dense.

          • by dgatwood ( 11270 )

            Here you go. In 2016 [slashdot.org]:

            • California: 1.07 deaths per 100 million miles
            • U.S. average 1.18 deaths per 100 million miles

            So California has about 9.3% fewer deaths per mile than the country as a whole (the arithmetic is worked below). This is mostly because the average speed on the freeway is measured in inches per second, but still....

            Notably, however, California's fatality rate has been skyrocketing lately (mostly involving collisions with pedestrians or bicycles). It's now at 1.07, compared with a low of only 0.86 back in 2009 — an
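            For the record, the percentage falls straight out of the two figures quoted above:

            ```python
            # Relative difference between California's and the national
            # fatality rate, using the 2016 per-100-million-mile figures.

            california = 1.07
            us_average = 1.18

            fewer = (us_average - california) / us_average
            print(f"California: {fewer:.1%} fewer deaths per mile")  # 9.3%
            ```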

      • That's the problem, everyone is guessing. Stop guessing. If you care, get the data.

        • You're right, it's a problem that we have to guess and it's a problem that the data isn't there. The problem is, no one posting here has the resources to go out and do in-depth traffic accident analysis across all accidents in the US. This should be the job of the governing bodies that are allowing this tech on the road in the first place.
      • by Rei ( 128717 ) on Saturday June 02, 2018 @07:58PM (#56718042) Homepage

        Teslas have driven over 7.2 billion miles [electrek.co]. Given that by far most of that has been accumulated since the addition of AP hardware (in October 2016 they were only at 3.5B), and from the Q1 conference call we know that over 1/3 of Tesla miles are on AP, we can extrapolate to maybe around 2 billion AP miles (give or take large margins of error; that's the best we can do until the first AP statistics report comes out). At the normal US vehicle fatality rate of 1 per 86 million miles driven, 23 people should have died on AP (were Teslas only of average safety, which they're not).

        Adjust up or down by your personal assumptions. We should have actual data to work with in a month or two.
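        For clarity, the extrapolation above spelled out (the inputs are the comment's own rough estimates, not published Autopilot statistics):

        ```python
        # Back-of-envelope extrapolation from the estimates above; these
        # are rough inputs, not measured Autopilot data.

        ap_miles = 2.0e9          # estimated cumulative Autopilot miles
        us_fatal_rate = 1 / 86e6  # US average: one death per 86 million miles

        expected_deaths = ap_miles * us_fatal_rate
        print(f"Expected at the US average rate: {expected_deaths:.0f}")  # ~23
        ```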

        • I think it is more plausible that 1/3 of the miles of Teslas THAT HAVE AUTOPILOT INSTALLED were driven on autopilot. Not 1/3 of all miles on ALL Teslas on the road. I am guessing a large majority of Teslas have no autopilot "installed." Earlier models don't offer it at all. And for those that do have it offered, many won't want or trust it, and many more won't pay the big extra cost for it ($5,000 to $10,000).

          Besides, such statistics about fatality rates per mile don't take into consideration the TYPES

        • That's 7.2 billion miles IN THE WORLD. Not 7.2 billion miles in the US.
    • by AmiMoJo ( 196126 )

      The reason Tesla is interesting is that they are the only manufacturer with a large number of level 2 autonomous vehicles out there. They were the first, and their system is a lot more lax than others in terms of enforcing driver attention.

    • by Burdell ( 228580 ) on Saturday June 02, 2018 @07:54PM (#56718026)

      Really? A Ford killed somebody today, by itself? I doubt that; that would result in investigations and recalls.

      If Tesla requires the person behind the wheel (not the "driver", since they're relying on the car to do that) to accept a message saying the so-called Autopilot software is designed only for certain situations, why is the software willing to drive in other situations? My car has lane-keeping and adaptive cruise control, but when I go outside their design ranges, they beep at me and disengage.

  • by clovis ( 4684 ) on Saturday June 02, 2018 @06:01PM (#56717576)

    "Autopilot is designed for use on highways that have a center divider and clear lane markings."

    The GPS knows where the car is, and just about all mapping software knows what kind of road you're on.
    So how hard would it be to have the Tesla's computer not even turn on Autopilot when the car isn't on a road with a center divider and clear lane markings? Or better yet, have Autopilot run only on roads that have been certified "not screwed up".

    In this case he was on Laguna Canyon Road, which has a median in some places and a middle "suicide lane" in others, and varies from one to two lanes from place to place.
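    The gating described above is conceptually simple. A minimal sketch, assuming a map lookup that can classify the current road (the names and attributes here are hypothetical, not Tesla's actual implementation):

    ```python
    # Hypothetical sketch of geofencing driver-assist engagement by road
    # type. `Road` and its attributes are invented for illustration.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Road:
        divided: bool        # has a center divider
        lane_markings: bool  # clear lane markings on record
        certified: bool      # road audited as "not screwed up"

    def may_engage_autopilot(road: Optional[Road]) -> bool:
        # Fail safe: no map match (or no GPS fix) means no engagement.
        if road is None:
            return False
        return road.divided and road.lane_markings and road.certified

    # A road that is undivided in places and never audited: deny.
    print(may_engage_autopilot(Road(False, True, False)))  # False
    ```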

    • "Autopilot is designed for use on highways that have a center divider and clear lane markings."

      The GPS knows where the car is, and just about all mapping software knows what kind of road you're on. So how hard would it be to have the Tesla's computer not even turn on Autopilot when the car isn't on a road with a center divider and clear lane markings? Or better yet, have Autopilot run only on roads that have been certified "not screwed up".

      In this case he was on Laguna Canyon Road, which has a median in some places and a middle "suicide lane" in others, and varies from one to two lanes from place to place.

      An interesting idea. But such geo-permissive gating would require a lot of management, and it might be viewed more as a workaround than a solution.

    • by Anonymous Coward on Saturday June 02, 2018 @06:49PM (#56717768)

      GPS is very unreliable. Sometimes you can't get signal from all the satellites you need for an accurate and precise location. I wouldn't trust my life to it. Even with enough signal, I have seen receivers be drastically wrong. I have had 3 GPS units sitting right next to one another report coordinates 100 meters apart, even while each claimed to be receiving enough signal to be accurate to within 3 feet.

      • by ColaMan ( 37550 )

        Precision vs accuracy strikes again!
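        The distinction in one toy example (coordinates invented): a receiver can report a tight cluster of fixes (high precision) that sits a long way from the true position (low accuracy).

        ```python
        # Precision vs. accuracy with made-up GPS fixes, in meters in a
        # local frame. The fixes agree with each other to ~1 m (precise)
        # but sit ~100 m from the truth (inaccurate).

        import math

        truth = (0.0, 0.0)
        fixes = [(99.0, 12.0), (101.0, 10.5), (100.2, 11.3)]  # invented

        mean = (sum(x for x, _ in fixes) / len(fixes),
                sum(y for _, y in fixes) / len(fixes))

        spread = max(math.dist(f, mean) for f in fixes)  # precision: ~1.3 m
        error = math.dist(mean, truth)                   # accuracy: ~100.7 m

        print(f"spread {spread:.1f} m, error {error:.1f} m")
        ```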

      • by AmiMoJo ( 196126 )

        So make it fail-safe, i.e. if there is no GPS signal you can't engage autopilot. That's what GM does with Supercruise. The car has to be certain you are on a supported road before you can turn it on.

      So how hard would it be to have the Tesla's computer not even turn on Autopilot when the car isn't on a road with a center divider and clear lane markings

      It will hand off to the driver if it does not detect clear lane markings; I've tested this myself. The FUD question is: who can say for certain that their AI performs equally well in all situations? No one. So far, though, it's been humans behaving badly.

      A system like this needs good statistics, none of which have been published (and possibly don't exist).

      • by Rei ( 128717 )

        The first quarterly report on Autopilot safety is due out some time in early Q2.

  • by Pinky's Brain ( 1158667 ) on Saturday June 02, 2018 @06:04PM (#56717586)

    It's going to kill a cop or first responder in some completely moronic way soon ... don't own Tesla stock.

  • by postmortem ( 906676 ) on Saturday June 02, 2018 @06:14PM (#56717626) Journal

    You know, the ones that worked on safety systems in the aerospace industry for the last several decades, using technologies that aren't cool in the 21st century. Guess what you get with toddler engineers.

    I bet that if they used the ol' guys just as consultants, the number of catastrophic failures would be near zero. Just like it is in aerospace, if you set aside the human factor.

  • Simple solution (Score:5, Interesting)

    by quonset ( 4839537 ) on Saturday June 02, 2018 @06:32PM (#56717696)

    Since Musk can send an OTA update to adjust brake usage, he can send a command to disable Autopilot on all Teslas in the wild.

    With that problem taken care of, he and his engineers can spend their time reworking and/or improving their "auto pilot" so it doesn't run into parked vehicles. Or anything else.

    • Sure... but that's not the purpose of autopilot. You seem to be under the impression that autopilot means auto-driving, which it clearly does not.

    • by AmiMoJo ( 196126 )

      It might be unfixable.

      Autopilot uses a minimal set of sensors: front radar, cameras and ultrasonic sensors.

      The front radar isn't good at avoiding stationary objects. Like all such radars, it has to filter out a lot of stationary returns to keep the car from constantly slamming on the brakes over reflections from signs, walls, roadworks, bumps in the road, plastic bags, etc. So tuning it to recognize stopped vehicles with an acceptably low false-positive rate is tricky.

      The cameras are currently not
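      To make the filtering problem concrete: automotive radar primarily measures relative (Doppler) velocity, so anything closing at exactly the ego speed has roughly zero absolute velocity and looks just like the signs and bridges the tracker must ignore. A toy sketch (the threshold is invented):

      ```python
      # Toy illustration of why radar trackers drop stationary returns.
      # A parked car closes at exactly ego speed, so its absolute velocity
      # is ~zero: on velocity alone it is indistinguishable from clutter.

      EGO_SPEED = 29.0  # m/s, about 65 mph

      def keep_target(relative_velocity: float, threshold: float = 2.0) -> bool:
          """Keep targets that are actually moving; drop stationary clutter."""
          absolute_velocity = EGO_SPEED + relative_velocity  # closing is negative
          return abs(absolute_velocity) > threshold

      print(keep_target(-20.0))  # slower lead car: kept (True)
      print(keep_target(-29.0))  # parked police car: dropped (False)
      ```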

  • by Locutus ( 9039 ) on Saturday June 02, 2018 @07:35PM (#56717958)
    In Autopilot or not, the driver is responsible for what the car does, since the Tesla Autopilot is not a level 5 autonomous control unit. So far I've only seen statements from the police to the press saying that the driver _said_ it was in Autopilot. Either way, the driver drove into the back of the patrol car, end of story. But apparently it is not, and this is just more of the negative press machine against Tesla and nothing more. Could be funded by Delco-Remy or any of the other antique auto industry players.

    LoB
    • The problem with this is that it is hard to believe that Musk is so ignorant about people that he is totally oblivious to the fact that some get distracted.
      • by sjames ( 1099 )

        The thing is, they get distracted when driving fully manual cars too.

      • by Locutus ( 9039 )
        And it is STILL their fault when they crash the car.

        The guy who told the cop it was the Autopilot's fault he crashed into the cruiser might just as well have said it was McDonald's fault because he was taking the lid off his coffee when he crashed. He was in the driver's seat; it's called the driver's seat because that is where all the controls for the vehicle are, and those controls are, well, you know, for controlling the car.

        We are in a very sad state of affairs when the driver of a car can blame something else for crashing and the Police and press blame that thing.
        • We are in a very sad state of affairs when the driver of a car can blame something else for crashing and the Police and press blame that thing.

          Yet Tesla wants not only to push that envelope, but to persist with it after many accidents demonstrating what everyone else already knew: that humans enticed with this will become dangerous. Even the best drivers out there lapse in attention without the stressors that come with actually controlling the vehicle.

  • I bet the cop got pretty pissed when he figured out there was no one he could run in.
  • Tesla responded in a statement, "and before a driver can use Autopilot, they must accept a dialogue box which states that 'Autopilot is designed for use on highways that have a center divider and clear lane markings.'"

    If only the vehicle had some kind of fancy GPS + Computer Vision system that could detect when it was being used in a situation for which it was not designed and either refuse to work, or at least give the user a stern warning.


  • Fix that whole "can't avoid running into stationary objects" thing already. That's the most basic requirement for non-stationary objects with any form of steering.
  • The evidence for what many of us have believed is finally coming out. This technology isn't what it was promised to be and can't replace humans behind the wheel. The evidence points to the fact that, because the technology isn't smart enough, you still need a human in the seat monitoring what the car is doing, so the human can take over instantly when the system meets a situation where it finally decides it doesn't know what to do. This was proven in the NTSB's crash analysis of the Phoenix fatality, where the system f
  • by flug ( 589009 ) on Sunday June 03, 2018 @01:05PM (#56720822)

    This is a known characteristic of the Tesla "autopilot". I wouldn't even call it a "defect" per se, as it is simply operating as it is designed to work.

    It won't pick up stationary objects, particularly if there is another vehicle in front of the Autopilot vehicle going about the same speed, and then that vehicle moves aside with the stationary object right in the middle of the lane.

    This is one reason of many why the Tesla system requires constant supervision by the human driver.

    Of course, the reason this whole type of system is a really bad idea is that it works great 99.9% of the time, lulling the human driver into a false sense of security and safety. So the human driver tunes out. Then 2,000 miles later (or whatever) the "Autopilot" encounters a situation it can't handle and you slam into the back of a firetruck or whatever.

    And no, I'm not making this up:

    https://www.wired.com/story/te... [wired.com]

    http://www.newsweek.com/tesla-... [newsweek.com]

    https://www.teslarati.com/tesl... [teslarati.com]
