Tesla Is Nerfing Autopilot In Europe To Comply With New Regulations (cnet.com) 110

Tesla is rolling out a new software update to its Model S and Model X vehicles in Europe that will reduce the capabilities of its Autopilot features in order to comply with regulations. Electrek reports: Tesla has to adapt its driver-assist system to different regulations in different markets. The result is a much more limited Autopilot in Europe compared to what Tesla makes available in its vehicles in North America. The automaker already reduced the capability of the Model 3 in Europe, but now it has started warning Model S and Model X owners that they are also going to be affected after a new update.

We are talking about fairly significant changes including a requirement to apply pressure to the steering wheel every 15 seconds when using Autopilot. [Here is a list of changes coming to Autopilot, as noted in an email sent from Tesla:]

- Auto Lane Change will be restricted for use on divided roads with two or more lanes of traffic in either direction.
- Once Auto Lane Change is activated and the indicator is turned on, your car will wait a minimum of one and a half seconds before starting the lane change and will wait up to five seconds before cancelling if the lane change has not been able to start.
- The limit of how far the steering wheel can turn while using Autosteer is reduced and can affect your car's ability to maneuver curves or stay within the lane, requiring you to take action.
- Summon will require that you be within six meters of your car's location to operate.
- You'll receive a reminder to hold the steering wheel if it does not detect your hands on the wheel for 15 seconds.

This discussion has been archived. No new comments can be posted.

  • by timeOday ( 582209 ) on Wednesday December 18, 2019 @08:34PM (#59534604)
    Seldom do you get to conduct mass real-world "experiments" like disabling these assistive features. Based on the safety stats collected with autopilot so far, my guess is disabling it will cost lives.
    • Tesla will not get the credit for 99 accidents it averted, but it will get the blame for the 1 accident that a "perfect" human driver could have avoided.

      Well, at least we will be able to compare accident rates with and without driver-assist systems in the USA, Europe and China, and see what we can figure out, in a few years.

      • I think you need to work on your counting skills. Here are two separate fatal accidents in Florida (https://qz.com/1621235/autopilot-was-engaged-in-a-2019-tesla-model-3-crash-in-florida/) and this one in SJ (https://www.mercurynews.com/2018/03/26/driver-identified-in-fatal-tesla-crash-on-highway-101/), and I recall a few others but don't feel like looking for a citation. I'm sure many manufacturers could claim accident prevention with emergency braking systems. The difference is those systems do not encourage people
        • by timeOday ( 582209 ) on Wednesday December 18, 2019 @09:00PM (#59534692)
          Counting to 2 is irrelevant.

          Here are some relevant stats:

          "In the fourth quarter of 2018, Tesla reported one accident for every 2.91 million miles driven with Autopilot engaged. The first Tesla safety report of 2019 shows that rate increasing slightly, to one accident every 2.87 million miles.

          However, both of those figures are better than the statistics for Teslas without Autopilot engaged: one accident for every 1.76 million miles driven in Q4 2018 and one every 1.58 million miles driven in Q1 2019.

          Teslas with or without Autopilot engaged also appear substantially safer than the average car, based on the new Tesla data. According to the National Highway Traffic Safety Administration's most recent data, there's an auto crash every 436,000 miles driven in the United States."

          I say relevant, not conclusive, since it's not a controlled experiment, e.g. the conditions under which autopilot was used are probably different than those in which it wasn't used, and so on. Still, there's no strong evidence that it's dangerous either.
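The miles-per-accident figures quoted above can be put on a common footing with a quick sketch. This just converts the quoted numbers into accidents per million miles; all figures are taken verbatim from the post and are not independently verified.

```python
# Back-of-the-envelope comparison of the quoted Tesla safety-report figures.
# These numbers come from the parent post and are not independently verified.
miles_per_accident = {
    "Autopilot engaged, Q4 2018": 2.91e6,
    "Autopilot engaged, Q1 2019": 2.87e6,
    "Autopilot off, Q4 2018": 1.76e6,
    "Autopilot off, Q1 2019": 1.58e6,
    "US average (NHTSA)": 436e3,
}

baseline = miles_per_accident["US average (NHTSA)"]
for label, miles in miles_per_accident.items():
    rate = 1e6 / miles        # accidents per million miles
    ratio = miles / baseline  # multiple of the US-average distance between accidents
    print(f"{label:28s} {rate:.2f} accidents/M miles ({ratio:.1f}x US average)")
```

As the thread itself notes, this says nothing about causation: Autopilot miles are mostly highway miles, so the populations being compared are different.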

          • I think it's very interesting to view both the Tesla accident and fatality rates. While the accident rate may be somewhat better (i.e., somewhere around 2x) with Autopilot, the fatality rate [wikipedia.org] appears to be similar with or without Autopilot, at around 250-300 million miles per fatality for Teslas. Why is this? Part of Autopilot's capabilities are due to ADAS systems that are very common in many new cars. While data for the accident rate with and without ADAS is limited, many studies suggest that ADAS sign

            • by Kokuyo ( 549451 )

              It is conceivable that "unavoidable" accidents tend to also be the ones ending in a fatality.

              If a truck runs a red light and side impacts in a Tesla at 80km/h, I doubt very much that autopilot would have any better a chance than a human driver.

              Again, I said it is conceivable not that I have any proof whatsoever for this.

              • by zazzel ( 98233 )

                I would assume that the accident scenarios accounting for 90% of (currently!) unavoidable accidents with trucks are "driver tired or fallen asleep" and "at the end of a traffic jam", and do not involve red lights.

                At least for German Autobahns, this is the major scenario involving trucks, also because I heard that truckers tend to disable certain safety features (like LKA and ACC), as they find them annoying.

                • by Kokuyo ( 549451 )

                  Well, it was one example and I think it made the point. A sleepy truck driver creating a big ass accordion at the end of a traffic jam generates quite an impressive statistical point underscoring what I said, though :D.

              • It is conceivable that "unavoidable" accidents tend to also be the ones ending in a fatality.

                If a truck runs a red light and side impacts in a Tesla at 80km/h, I doubt very much that autopilot would have any better a chance than a human driver.

                Again, I said it is conceivable not that I have any proof whatsoever for this.

                It's possible, but since there are not that many such fatal accidents [tesladeaths.com], we can see that the Tesla accidents were mostly due to the perception layer of the software:

                2016 China: Didn't recognize back of truck.
                2016 Florida: Didn't recognize side of truck crossing the lane.
                2018 California: Didn't recognize diverging lanes on highway.
                2019 Florida: Didn't recognize side of truck crossing intersection.

                These were all accidents that most attentive human drivers would have easily avoided.

          • by zazzel ( 98233 ) on Thursday December 19, 2019 @03:48AM (#59536132)

            I do not see how *any* of the mandated changes should increase accident rates:
            * Disabling auto lane changes on roads with oncoming traffic (e.g. a road like a Bundesstraße or Landstraße in Germany) is actually a bright idea: you hardly have any opportunity to pass other cars anyway (especially trucks), the oncoming traffic is usually fast (80-130km/h - some people DO speed!), and the roads are narrow and sometimes curvy, with many opportunities for waiting traffic from adjacent small roads / properties to suddenly pull onto the road.

            * Waiting until auto lane change activates is also really necessary, as the Tesla's sensors have trouble reliably detecting cars from behind in the passing lane - please remember that we have Autobahns without a speed limit, and every now and then you might be passed by cars at 250km/h (or 155 mph).

            * The limited angle of the steering wheel is not an issue in real world scenarios (I drive a BMW with lane keep assist and - limited - auto lane change).

            And how limiting a feature like summon (find some youtube videos!) should increase accident rates leaves me puzzled.

            • * Disabling auto lane changes on roads with oncoming traffic (e.g. a road like a Bundesstraße or Landstraße in Germany) is actually a bright idea: you hardly have any opportunity to pass other cars anyway (especially trucks), the oncoming traffic is usually fast (80-130km/h - some people DO speed!), and the roads are narrow and sometimes curvy, with many opportunities for waiting traffic from adjacent small roads / properties to suddenly pull onto the road.

              It is also disabled on roads with two lanes in every direction but no divider between the two. Text says "divided roads".

              * Waiting until auto lane change activates is also really necessary, as the Tesla's sensors have trouble reliably detecting cars from behind in the passing lane - please remember that we have Autobahns without a speed limit, and every now and then you might be passed by cars at 250km/h (or 155 mph).

              The Tesla is constantly monitoring traffic in adjacent lanes (and displaying it on the screen), not only before a lane change. Whether or not it waits 1.5 seconds before initiating the actual lane change makes zero difference in whether or not it will see traffic. It is annoying, though, especially in combination with the requirement for applying pressure to the wheel. If you don't apply

          • by AmiMoJo ( 196126 )

            We've been over this, the stats are meaningless. Autopilot gets used on the safest roads the most - highways. The numbers are also skewed by the demographics of Tesla owners who can afford autopilot.

            • The function that Tesla calls "Autopilot" is standard on all their cars.

              You are probably thinking of the "Full Self Driving" capability that some owners have prepaid for, but hasn't been delivered yet.

          • Are these numbers actually comparative? I.e., are the manual numbers constrained to the scenarios where drivers use Autopilot?

          • "Teslas with or without Autopilot engaged also appear substantially safer than the average car"

            Due to owners being above average drivers I'd wager. Majority of Tesla owners are just a little step above the masses. You don't see the below 100 IQ crowd driving Teslas. You don't see the drunk construction worker driving Teslas. You don't see wild driving 16 year olds driving Teslas.

            This is something I learned in a psychology class many years ago. A car brand often attracts a certain type of driver.

            By this meas

        • I recall a few others but don't feel like looking for a citation.

          You just proved the point. You could provide citations for two accidents, and I concede there could be a few more.

          According to the stats provided by timeOday [slashdot.org] there must be hundreds of accidents avoided by the autopilot. But, I can't find them or cite them. This was my original point. Thanks for playing.

          • I think you missed the part in my post about forward collision braking doing as much as autopilot without causing other accidents. I've yet to see a report of a car with auto braking slicing someone in half under a semi, or hitting a firetruck, or a police cruiser, or running into a barrier. And because Tesla is the only company that tracks you like a dog, we don't know how many people were saved by auto-brake activations in other cars. But then those companies do not have the next messiah at the helm. Enjoy your cult, I know you are. And p
            • Hitting stationary vehicles is a known documented limitation of using radar. Radar detects moving objects. It is difficult for radar to detect the difference between street furniture and a stationary vehicle.

              The major defect in the semi crash is the poor US safety standards that allow trailers to not have side impact protection. Had European-mandated trailer side impact protection been present, then the airbags and other safety systems on the Tesla would have had a better chance of being deployed. However, t

        • Both Florida accidents involved roads that were wide, but NOT limited-access freeways.

          Making autopilot work safely on a road like a freeway is relatively easy. Strictly speaking, we could have had autopilot on freeways 25+ years ago IF they'd had:

          1) tracking wires embedded under the center of each lane, with additional wires to signal meta information (eg, more lanes to the left or right, upcoming merge or exit)

          2) robust barcode tags painted into the pavement to provide redundancy & extra meta info

          3) ph

        • You just made his point....

      • Well, you fuck one goat...

    • by dgatwood ( 11270 ) on Wednesday December 18, 2019 @08:52PM (#59534670) Homepage Journal

      Seldom do you get to conduct mass real-world "experiments" like disabling these assistive features. Based on the safety stats collected with autopilot so far, my guess is disabling it will cost lives.

      I'm 100% certain that this change will:

      - The limit of how far the steering wheel can turn while using Autosteer is reduced and can affect your car's ability to maneuver curves or stay within the lane, requiring you to take action.

      Translation: "When you hit a tight offramp, the EU requires us to crash into the guard rail unless you intervene." The odds of it warning you far enough in advance to be safe are approximately zero.

      Whatever politician thought this was a good idea should probably not be allowed to vote, drive, or, really, leave the rubber room again in the future without adult supervision.

      • Re: (Score:2, Insightful)

        Whatever politician thought this was a good idea should probably not be allowed to vote, drive, or, really, leave the rubber room again in the future without adult supervision.

        The politician did not think, at all.

        The legacy auto makers who "convinced" the politicians to enact this law, know it is a bad idea.

        • Re: (Score:2, Insightful)

          by mbkennel ( 97636 )
          Maybe the point is to cause Tesla-blamable accidents?
        • by Tom ( 822 )

          The politician did not think, at all.

          It's called conforming to the job description.

          There's a reason "politician" is now high on the list of most despised professions. Above strippers and prostitutes.

          • Above strippers and prostitutes.

            What fucked-up reality do you live in?? The thinking world detests preachers, politicians, carsalesmen... and now MBA's. The only ones bothered by strippers and prostitutes are closet-homo evangelicals, angry lesbians and jaded, broke-ass johns.

            • by Tom ( 822 )

              What fucked-up reality do you live in??

              I wonder that myself sometimes, but I merely quoted a survey, not my personal opinion. I actually have three sex workers (that I know of) among my acquaintances; they're just regular people outside work, like the rest of us. I agree with you that preachers should be extremely high on the list - but sadly, in a general survey they still don't make the top. For some unfathomable reason.

        • by AmiMoJo ( 196126 )

          Actually it's to prevent autopilot murdering you by performing a sudden violent maneuver. Remember those videos of it suddenly veering into oncoming traffic?

          This keeps it within human reaction times and is fine for its intended purpose - highways.

          • This keeps it within human reaction times and is fine for its intended purpose - highways.

            That was the intent, indeed. But to prevent the car suddenly veering into oncoming traffic, they shouldn't limit the lateral acceleration but rather the rate of change of that acceleration. There's absolutely no reason why the car should not be allowed to gently steer into a turn and gently increase the turn to whatever is needed to stay on the road. And it's very hard to argue that going off road (which it is now legally required to do) is safer than steering more than 3 m/s^2. The rule makers completely s
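The point the parent makes (limit lateral jerk, i.e. the rate of change of lateral acceleration, rather than capping the acceleration itself) can be sketched as a simple rate limiter. The limit and loop period below are illustrative assumptions, not Tesla's or the EU's actual values:

```python
# Sketch: limit the *rate of change* of lateral acceleration (jerk)
# instead of capping the acceleration itself.
# All numbers here are illustrative assumptions, not real regulatory limits.
MAX_LAT_JERK = 2.0   # m/s^3, assumed comfort/safety limit
DT = 0.05            # s, assumed control loop period

def next_lateral_accel(current: float, demanded: float) -> float:
    """Ramp lateral acceleration toward the demanded value, never changing
    it faster than MAX_LAT_JERK. A hard cap on acceleration (the approach
    the parent criticises) would instead clamp `demanded` outright and be
    unable to follow a curve that needs more."""
    step = MAX_LAT_JERK * DT
    if demanded > current + step:
        return current + step
    if demanded < current - step:
        return current - step
    return demanded

# Gently ramping into a curve demanding 4 m/s^2: no sudden swerve, but the
# car is still allowed to reach the acceleration the curve requires.
a = 0.0
for _ in range(50):          # 2.5 s of control ticks
    a = next_lateral_accel(a, 4.0)
print(a)  # prints 4.0
```

With a jerk limit, the car eases into the turn; with a hard acceleration cap, any curve demanding more than the cap forces a handover.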

      • Whatever politician thought this was a good idea should probably...

        Be put in the back seat of a Tesla about to exit the highway on a curved ramp...

      • Will a Tesla ever slow down if the upcoming curve is beyond its control capacities? Humans (non-idiotic ones, anyway) don't drive at the same speed into a curve, instead they slow down if it looks like a sharp bend or they find that they can't turn the steering wheel enough.

        Does a Tesla do that, or does it just go off the outside of the curve at the same speed it was going at on the straight road?

        • by motd2k ( 1675286 )
          It slows down in preparation for the bend. The EU directive compliance on steering angle is scarier: it'll be happily taking a bend at a reasonable speed when it just aborts with a 'Take Control' warning mid-apex. The 15 seconds rule is also irrespective of speed; 15 seconds in a traffic jam is a lot more annoying than 15 seconds at 70mph. Neither would be a problem if the car could detect your hands on the wheel, but it can't... it works on torque applied, so you have to learn to lightly countersteer just
      • by AmiMoJo ( 196126 )

        We generally avoid tight turns off fast roads in Europe, and where they exist they probably exceed the ability of autopilot to handle them anyway.

        And if you do crash, then autopilot failed to ensure you were paying attention. 15 seconds is an awfully long time just to start nagging, and the car knows a tricky turn is coming up, so it should be checking and going nuts if the driver isn't ready to take over.

        • You've never been to rural areas of any Mediterranean or Eastern Bloc countries then. I'm not even sure how human drivers manage in some of those places, but they do, often at high speed; and in places like Spain and Italy, going slowly through because you're from further north will get you overtaken - in a blind curve.

      • If the autopilot is meant for highway use only then it should be turned off when you are using an offramp anyways. It's not an autopilot. It's an assisted drive feature for parts of your journey. It's not going to drive you from point A to point B without you doing anything.

      • The odds of it warning you far enough in advance to be safe are approximately zero.

        Given tight offramps have warning signs, and most GPS systems like TomTom or those built into car dashboards have no problem warning the user either, I find your drama to be quite Shakespearean, specifically Macbeth: full of sound and fury, signifying nothing.

    • by Luthair ( 847766 )
      You mean the bogus statistics?
    • by Cederic ( 9623 )

      Why would forcing the car not to perform dangerous manoeuvres cost lives?

      If the car can't negotiate a corner without taking it too quickly then it shouldn't be pretending it can drive.

      If the car can't change lane safely then it shouldn't be pretending that it can drive.

      There's nothing unsafe in these regulations. Nothing.

  • by alvinrod ( 889928 ) on Wednesday December 18, 2019 @08:41PM (#59534626)

    You'll receive a reminder to hold the steering wheel if it does not detect your hands on the wheel for 15 seconds.

    Does it eventually pull over and turn off if you fail to comply, or does it just keep issuing reminders? Seems like a great movie scene for an EMS [wikipedia.org] crew to pull up to a wreck where the Tesla is reminding the now-deceased driver (who still has some earbuds in, with a tablet playing a movie connected to them, just so the audience understands what happened) to please put his hands back on the wheel. The driver can be an asshole that the audience doesn't feel much remorse for, so it doesn't seem like an outright slam against Tesla.

    • I think it just keeps saying at increasing volume levels "If you don't put your hands on the wheel right now, I swear to God I'll turn this car around!"
      • It will give very annoying beeps, with a large red steering wheel and warning messages on the screen. It then starts to slow down and eventually pulls over, but you can actually override it using the accelerator (which is not considered to be "taking over" as long as you don't apply pressure to the steering wheel). The beeping continues at high volume incessantly.

        (Don't ask me how I know ;-) )

        Once you finally do take over, autopilot is disabled until you select "Park". So if you want to use autopilot again, y

    • Re:Too polite (Score:5, Informative)

      by 140Mandak262Jamuna ( 970587 ) on Wednesday December 18, 2019 @09:19PM (#59534768) Journal
      Tesla will gradually reduce speed and bring the car to rest. (Not sure if it will make lane changes to get to the shoulder.) There are tons of videos of people deliberately ignoring the warning to see what happens. With cameras rolling, so that they can post it on YouTube.

      Even in the famous footage of the sleeping driver in CA, you can see the Tesla had already slowed down considerably compared to the rest of the traffic.

      More important was what was NOT reported: in both cases of footage showing people falling asleep at the wheel, there were no accidents, fender benders, or anything more serious involving a Tesla on those stretches of road at that time.

      People fall asleep at the wheel all the time. Quite unfortunate, but it happens a lot. Most drivers without such a system end up causing an accident; the Tesla Autopilot will keep the car safe, in lane, and with separation. Most likely those drivers woke up with a start, realized what had happened, and got a jolt of adrenaline that kept them awake for the next 36 hours. They got to live.
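The escalation described in this thread (nag, then slow the car to rest) can be sketched as a simple per-tick decision function. The thresholds and the speed bleed-off rate below are invented for illustration; they are not Tesla's documented values:

```python
# Hypothetical sketch of a hands-off-wheel escalation policy.
# Thresholds and deceleration rate are assumptions, not Tesla's real numbers.
def autopilot_response(seconds_without_hands: float, speed_kmh: float):
    """Return (action, new_speed_kmh) for one monitoring tick."""
    if seconds_without_hands < 15:
        return ("normal", speed_kmh)
    if seconds_without_hands < 30:
        # Visual and audible nags, but no intervention yet.
        return ("warning", speed_kmh)
    # Driver unresponsive: bleed off speed until the car is at rest.
    return ("slowing to a stop", max(0.0, speed_kmh - 5.0))

print(autopilot_response(10, 100.0))   # ('normal', 100.0)
print(autopilot_response(20, 100.0))   # ('warning', 100.0)
print(autopilot_response(40, 3.0))     # ('slowing to a stop', 0.0)
```

The real system presumably also handles lane position and traffic behind while decelerating; this only captures the state progression the posters describe.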

  • by BitterOak ( 537666 ) on Wednesday December 18, 2019 @08:43PM (#59534638)
    With all these features being disabled by a software update, I can see a big industry emerging in "car hacks": software to "jailbreak" your car and re-enable features that were disabled to comply with local regulations.
    • Indeed. You will only be able to pay for the "patches" with bitcoin, however. Or trust the "free" patches that were all in a big torrent.

    • by King_TJ ( 85913 )

      I welcome it, should this happen. I've run across a couple of web sites before, in other countries, offering Tesla hacking services. It's definitely a thing that exists already in a very limited way. A big problem is that Teslas are normally in constant communication with back-end servers. Tesla collects a lot of random driving data to try to improve the AI for the auto-pilot and even for things as simple as making automatic windshield wipers that use the camera input to decide if it's raining and how ha

  • by deek ( 22697 ) on Wednesday December 18, 2019 @09:00PM (#59534694) Homepage Journal

    Which, in Europe, will be within six meters of where you are. Amazingly useless, but possibly a good source of amusement.

    Next thing Tesla needs to do is teach the car how to fetch sticks.

    • by Tom ( 822 )

      Without the six metre rule, I could park the Tesla in my garage and have it pick me up at the front door. With it, not. City dwellers don't get that, but in the countryside, six metres is essentially removing that feature. What's left is that you can exit in front of the garage and it drives in or out by itself. That's useful in a very small city parking garage with narrow spaces, but you probably made your own garage not so tiny that you need that.

      • by Cederic ( 9623 )

        My garage was built in 1970, which is before I was born. I didn't make it tiny, cars grew big.

        My current car does fit in my garage. There just isn't room to open the doors or even climb out of a window to get out of it.

        • You must not be in the US. Early-1970s American cars were ENORMOUS. The AMC Pacer was popularly thought of as a "compact" car, even though it was basically the size of a Cadillac with truncated hood & trunk. My dad had a (2-seat) Corvair that I think had a bigger footprint than a 2018 Mustang GT... and it was considered "tiny". Even "subcompact" Toyota hatchbacks from the late 1970s were fairly huge by modern standards... they just seemed tiny compared to an average Cadillac, Lincoln, or Buick.

  • Good.

  • From Wikipedia: In the United Kingdom, the Locomotive Acts were a policy requiring self-propelled vehicles to be led by a pedestrian waving a red flag or carrying a lantern to warn bystanders of the vehicle's approach.
    • by 140Mandak262Jamuna ( 970587 ) on Wednesday December 18, 2019 @09:41PM (#59534836) Journal
      When the Locomotive Acts were in effect not a single Briton died or even was injured by an automobile. The law was saving lives.

      Just look at the accident rate and the deaths caused by automobiles, once those laws were foolishly and dangerously revoked at the behest of the auto industry.

      • When the Locomotive Acts were in effect not a single Briton died or even was injured by an automobile. The law was saving lives.

        Just look at the accident rate and the deaths caused by automobiles, once those laws were foolishly and dangerously revoked at the behest of the auto industry.

        That act was put in place at the behest of the horse-drawn carriage industry, trying to kill the automobile industry at birth.

        • I know, I know.

          I tried speaking with a tongue in cheek and ended up chewing my own tongue instead. I still have not mastered the art. Sorry for the confusion.

  • Tesla closed at an all-time high price of $393. This is higher than the peak created by the surge following the "Funding secured" tweet.

    Tesla shorts were up 5.1 billion dollars between 1/1/19 and 5/31/19. Then, when the stock was at $179, a five-year low, they shorted more. The number of shares shorted shot up from 25 million to 45 million. Now, they are down 1.9 billion year to date.

    It works out to about a $7 billion loss in the last six months.

    The bankruptcy theory is dead. Demand cliff theory is dead. Tesla k

    • Q4 will determine it. Q3 was just accounting tricks.
      • I acknowledged that. The only theory left is total outright fraud.

        Even the most ardent bears, who still maintain a $200 price target, concede Tesla's lead in battery technology and its cash position, and acknowledge its basic operations. Even if there is accounting fraud, the company is likely to be worth $20 billion and will survive to build BEVs in some form. The stock will become zero, of course, but the brand and the tech will survive even accounting fraud.

    • Tucker [wikipedia.org]

  • by Tom ( 822 )

    And there goes my main reason for thinking about a Tesla next time a new car is due (I live in Europe).

    I drive the same road to work every day, and 90% of the distance is Autobahn or a fairly straight rural highway that would pose no problem at all for an autopilot. I dream of the day that I can get into my car, drive the two small roads to reach the highway, then kick back, open my notebook and get some work done while the autopilot brings me into the city, where I drive the last few km to the office

    • by Espen ( 96293 )

      Sounds like you are dreaming of a train

      • Or a taxi.
      • by Tom ( 822 )

        I would, in fact, love to take the train, and have done so most of my life.

        But I've now moved, and the new place doesn't have public transport that could compete with a car, even considering worse-than-usual traffic. I'd take the train if that were an option.

    • Autopilot is just an advanced cruise control. You should never kick back or take your eyes off the road. You are still driving!

  • There is nothing new here. It has been this way for a few months already. Now Tesla is simply making it explicit with an email, instead of burying it in "release notes".
  • Good (Score:5, Informative)

    by mccalli ( 323026 ) on Thursday December 19, 2019 @04:27AM (#59536224) Homepage
    I have a non-autopilot Tesla, but have had loan cars quite a few times whilst mine was in for service. Each time I've used Autopilot (on the roads in and around the west of London) I've found it simply isn't ready. It steers too close to other cars on the left, it has no concept of leaving space for motorbikes (of which there are a lot round that area), and the idea that it should turn immediately I flip a switch rather puts the lie to the word 'indicator'.

    The 15 seconds rule is already present (I think, might be 30) and is no great hardship. Honestly, every time I've used it I've eventually just turned it off and used the speed control aspect of it but kept the steering to myself. Ironically, I've found it's most useful in almost-but-not-quite-stopped traffic jams, of which again there are rather a lot just to the west of London. It's less tiring than trying to concentrate and match speed in those circumstances. In all of the other situations... it's just not ready for those roads.
    • Re:Good (Score:5, Insightful)

      by nicolaiplum ( 169077 ) on Thursday December 19, 2019 @07:02AM (#59536502)

      The idea that a car changing lane without pausing between indicating and moving is good is ... hugely self-entitled and aggressive. UK driving standards (also probably other European standards) require the driver to observe, indicate, and then manoeuvre - with an interval between these actions. In the UK this is summarised as "Mirror, Signal, Manoeuvre". Moving while indicating is driving like an arsehole and causes accidents. Belgians and Californians drive like that, drivers in countries with low accident rates do not.

      The EU isn't "nerfing" the Tesla. The EU is civilising it.

      • “... your car will wait a minimum of one and a half seconds before starting the lane change ...” Too many use that switch as a turning “decorator” (“I like those flashing lights, they make my car even nicer than the pools of light in front of my fog lights and rallycross LED headlights”)
  • ... they mean making their autopilot safer.
  • I love Tesla and most everything they are trying to do with their cars. I've not bought one yet, but I keep getting the itch. But the European demands make sense to me. No need to encourage people to fall asleep driving and kill somebody. We aren't living in the future yet... it's a long slow trudge to get cars with autopilot functioning near perfectly. Maybe the bosses at Tesla aren't OK with slow and steady, but that's why there are governments to make rules to try to protect the innocent.
