Transportation

Tesla Now Has 160,000 Customers Running Its Full Self Driving Beta (theverge.com)

One piece of news from Tesla's AI Day presentation on Friday that was overshadowed by the company's humanoid "Optimus" robot and Dojo supercomputer concerned the improvements to Tesla's Full Self Driving software. According to Autopilot director Ashok Elluswamy, "there are now 160,000 customers running the beta software, compared to 2,000 from this time last year," reports The Verge. From the report: In total, Tesla says there have been 35 software releases of FSD. In a Q&A at the end of the presentation, Musk made another prediction -- he's made a few before -- that the technology would be ready for a worldwide rollout by the end of this year, but acknowledged the regulatory and testing hurdles that remain before that happens. Afterward, Tesla's tech lead for Autopilot motion planning, Paril Jain, showed how FSD has improved in specific interactions and can make "human-like" decisions. For example, when a Tesla makes a left turn into an intersection, it can choose a trajectory that doesn't make close calls with obstacles like people crossing the street.

It's known that every Tesla can provide datasets to build the models that FSD uses, and according to Tesla's engineering manager Phil Duan, Tesla will now start building and processing detailed 3D structures from that data. The presenters said the cars are also improving decision-making in different environmental situations, like night, fog, and rain. Tesla trains the company's AI software on its supercomputer, then feeds the results to customers' vehicles via over-the-air software updates. To do this, it processes video feeds from Tesla's fleet of over 1 million camera-equipped vehicles on the road today and has a simulator built in Unreal Engine that is used to improve Autopilot.


Comments Filter:
  • by TheMiddleRoad ( 1153113 ) on Sunday October 02, 2022 @09:22PM (#62931845)

    I don't have the FSD Beta, but I do have beta visualizations, to show what the software sees. Well, it sees jack shit. Two cars ahead is a maybe. Three cars ahead? Never. Is that a truck sliding back and forth like it's on roller skates? Yes. Did that human just disappear when they walked in front of that car? Yes.

    Terrifying.

    Tesla's software is such a fucking hack job, it will never work well unless they restart from scratch. It's fucking embarrassing.

    • by backslashdot ( 95548 ) on Sunday October 02, 2022 @09:31PM (#62931857)

      It does need additional and higher-resolution cameras. For one thing, it needs side-facing cameras further up front so the FSD mode can see down intersections without having to creep forward. The B-pillar cameras are about 2 or 3 feet behind where a human's head would be when looking for cross-traffic. I don't like that there are situations where FSD has to jerkily reverse back because it jutted too far into the intersection to see what cross-traffic is coming towards it.

      • Re: (Score:3, Insightful)

        by drinkypoo ( 153816 )

        It needs LIDAR so it doesn't have to constantly guess at depth, at which it is shit. Humans often get it wrong too, but not that wrong, and not that often.

        • For some reason Tesla thinks that seeing what a human can see should be sufficient for an AI. So no LIDAR - cameras are sufficient.

          Now I never understood why more information would not be better. I can only assume that Tesla does not want the additional cost of new sensors to be standard equipment on their cars. Part of their strategy is to utilize existing vehicles to collect the data required to train the self-driving AI. This requires that their vehicles include all the sensors, which would be expensive.

          • LIDAR isn't better. The frame rate is very low, and it is low resolution. At 30 fps, a car going at 60 mph moves about one yard between frames, which affects the parallax as the laser is scanning. LIDAR also used to have issues when many LIDARs were active in the same area. They still haven't fully solved that. The main advantage of LIDAR was that it used to provide better 3D reconstruction when the car was moving slowly, but now 2D computational imaging has gotten really good at that (extracting stereo information from a 2D single or double frame image -- just like your eyes).
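
            A quick back-of-the-envelope check of that one-yard figure (Python; the speed and frame rate are just the numbers from the comment above):

                # Distance a car travels between successive frames at 30 fps and 60 mph.
                speed_mps = 60 * 1609.344 / 3600      # 60 mph in meters/second (~26.8)
                frame_gap_s = 1 / 30                  # time between frames at 30 fps
                step = speed_mps * frame_gap_s
                print(f"{step:.2f} m (~{step / 0.9144:.2f} yd) per frame")
                # -> 0.89 m (~0.98 yd), i.e. roughly one yard, as claimed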

            • 2D computational imaging has gotten really good at that (extracting stereo information from a 2D single or double frame image -- just like your eyes).

              The cameras on a Tesla are much further apart than human eyes so give much more depth information.

              Plus there's multiple cameras.
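
              For what it's worth, textbook stereo geometry supports the baseline point: depth uncertainty scales roughly as Z^2 * e / (f * B), so widening the baseline B shrinks the error. A toy calculation (the focal length, baselines, and matching error are hypothetical, not Tesla's actual camera specs):

                  # Approximate stereo depth error: dZ ~ Z^2 * e / (f * B)
                  # Z = target distance (m), f = focal length (pixels),
                  # B = baseline between cameras (m), e = disparity matching error (pixels).
                  def depth_error(Z_m, f_px=1000.0, B_m=0.065, e_px=0.5):
                      return Z_m ** 2 * e_px / (f_px * B_m)

                  print(f"{depth_error(50, B_m=0.065):.1f} m")  # eye-like 6.5 cm baseline: ~19 m
                  print(f"{depth_error(50, B_m=0.30):.1f} m")   # wider 30 cm baseline:     ~4 m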

              • The cameras on a Tesla are much further apart than human eyes so give much more depth information.
                Plus there's multiple cameras.

                And yet Tesla still cannot estimate depth reliably, and is still more accident-prone than a human driver, and oh yeah they don't actually even offer full self driving (SAE 4 or 5 automation) but they still call it full self driving. It's a scam from stem to stern, and you have been scammed. They are absolutely always going to have the worst preventable accident rates in the industry as long as they refuse to use LIDAR.

                People fuck up estimating depth with their eyes and their brains all the time, and yet are

              • The front-facing cameras are all designed for different angles; it doesn't just have two high-resolution wide-angle cameras for stereoscopic vision.

              • 2D computational imaging has gotten really good at that (extracting stereo information from a 2D single or double frame image -- just like your eyes).

                The cameras on a Tesla are much further apart than human eyes so give much more depth information.

                Plus there's multiple cameras.

                The problem isn't that the cameras don't provide the same quality and amount of information as human eyes.

                The problem is that computer vision isn't nearly as advanced as the human brain at interpreting that information.

                Maybe it will get there in 5 years, maybe 20, and maybe never. But it's not there now.

                LIDAR compensates for some of those shortcomings in CV, which is why everyone else uses it. That combo still might not be good enough for "Full Self Driving" but it's better than cameras alone.

                But Musk has a

                • by ceoyoyo ( 59147 )

                  Stereo video processing is much better than your brain at generating 3D information. It might not be as good at acting on that information, but that's not a problem that LIDAR can solve.

      • It needs wide stereoscopy, one camera in each corner of the windshield. Reconstructing a 3D scene from a single viewpoint just makes life hard for no good reason.

      • by mjwx ( 966435 )

        It does need additional and higher-resolution cameras. For one thing, it needs side-facing cameras further up front so the FSD mode can see down intersections without having to creep forward. The B-pillar cameras are about 2 or 3 feet behind where a human's head would be when looking for cross-traffic. I don't like that there are situations where FSD has to jerkily reverse back because it jutted too far into the intersection to see what cross-traffic is coming towards it.

        Your first big problem is image processing and remote sensing. It's easy to tell from camera and other sensor data (e.g. radar or lidar) that something is there... Determining what "something" is becomes a lot harder, as you've got a lot of parameters to match to get a reasonably positive match on something -- e.g. how to tell that a bollard isn't a child.

        The second big problem is predictive analysis. Once you've identified your object, you then need to track its course and its potential to intersect with yours.
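
        That prediction step can be sketched very simply. A minimal constant-velocity illustration in Python (a toy, not any vendor's actual planner):

            # Time and distance of closest approach between us and a tracked object,
            # assuming both hold constant velocity. 2D, meters and seconds.
            def closest_approach(rel_pos, rel_vel):
                px, py = rel_pos                      # object's position relative to us
                vx, vy = rel_vel                      # object's velocity relative to us
                v2 = vx * vx + vy * vy
                t = 0.0 if v2 == 0 else max(0.0, -(px * vx + py * vy) / v2)
                dx, dy = px + vx * t, py + vy * t
                return t, (dx * dx + dy * dy) ** 0.5  # when, and how close

            t, d = closest_approach((20.0, -5.0), (-4.0, 1.0))
            print(f"closest approach in {t:.1f} s at {d:.1f} m")  # here: 5.0 s, 0.0 m -- collision course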

    • This kind of situational awareness is great; however, I think it is less of an issue having Tesla FSD than drivers who are playing games and watching videos on their phone while driving. They have no situational awareness and sometimes a negative reaction time. I totally disagree about it being a hack job. They approach the problem with an aggressive nature but also a professionalism IMO. In clear conditions cameras are probably better than other solutions with their AI but LIDAR has the same issue as does
      • This kind of situational awareness is great; however, I think it is less of an issue having Tesla FSD than drivers who are playing games and watching videos on their phone while driving. They have no situational awareness and sometimes a negative reaction time.

        I guarantee you that the average consumer, when presented with the phrase "Full Self Driving", will expect the software to be fully self driving, not requiring human intervention. They're going to expect to be able to play games or watch videos.

      • This kind of situational awareness is great; however, I think it is less of an issue having Tesla FSD than drivers who are playing games and watching videos on their phone while driving.

        An issue that FSD probably compounds.

        They approach the problem with an aggressive nature but also a professionalism IMO. In clear conditions cameras are probably better than other solutions with their AI but LIDAR has the same issue as does radar (targets being partially or fully obscured).

        Every video I've seen indicated the opposite. Their range is terrible and things regularly warp in and out of existence.

        Aside from that I suspect that they have a target tracking solution that keeps targets for the time that they think it may be appropriate even if they do not display them.

        So when a visible vehicle vanishes from Tesla's HUD you think the explanation is not that the Tesla has lost track of it, but that it's still tracking it but has decided to hide it from the user for some reason?

        Humans have the same considerations: a pedestrian or other target, once spotted, may be obscured but remains in your mind. The goal here is to be 4x safer than the average driver for full release. I think they are probably doing a great job on a difficult problem.

        I've never seen any evidence that Tesla's AI is remotely as safe as an average driver. There's occasional hand-wavy "accidents per mile" compa

        • by dgatwood ( 11270 )

          Aside from that I suspect that they have a target tracking solution that keeps targets for the time that they think it may be appropriate even if they do not display them.

          So when a visible vehicle vanishes from Tesla's HUD you think the explanation is not that the Tesla has lost track of it, but that it's still tracking it but has decided to hide it from the user for some reason?

          I'm pretty sure the visualizations intentionally don't show anything past a certain distance threshold.

          Also, I have no idea if they visualize things that can't be seen, but whose position they are inferring based on remembering that something is on the other side of that truck or whatever.

          So maybe.

          • Except things flicker in and out of existence without changing distance.

            I'd go with Occam's razor here. If the Tesla HUD shows vehicles winking in and out of existence it's because the Tesla doesn't have a great idea where they are.

            • by dgatwood ( 11270 )


              Probably true. That doesn't necessarily mean that it doesn't suspect that they exist, though — just that it is less certain than whatever confidence threshold is required for showing them. Whether it will react to something below that threshold or not is a separate question, and I have no idea. :-D
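
              As a toy illustration of that kind of gating (the thresholds, fields, and decay are entirely hypothetical -- this is not Tesla's actual logic):

                  from dataclasses import dataclass

                  @dataclass
                  class Track:
                      x: float
                      y: float
                      confidence: float   # 0..1, decays while the object goes unobserved

                  DISPLAY_AT = 0.7        # draw on the HUD only above this
                  PLAN_AT = 0.3           # but still plan around anything above this

                  def hud_tracks(tracks):
                      return [t for t in tracks if t.confidence >= DISPLAY_AT]

                  def planning_tracks(tracks):
                      return [t for t in tracks if t.confidence >= PLAN_AT]

              A track whose confidence hovers around the display threshold would wink in and out of the HUD exactly as described, while still being considered by the planner.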

                • Agreed. As to whether it's taking into account things below the threshold, if it is, that would include both things that exist and are missing from the HUD and things that don't exist and are missing from the HUD.

        • I mean they have 160k drivers using FSD now, so they should at least be able to do some A/B comparisons, maybe even some with random accept/decline into the beta program. Are those numbers anywhere?
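
          If those numbers were published, the comparison itself would be simple. A minimal sketch with made-up counts (a real study would also have to control for road type, weather, and driver self-selection):

              from math import sqrt

              # Hypothetical cohort data: accident counts and miles driven.
              acc_fsd, miles_fsd = 120, 50_000_000   # FSD-engaged miles
              acc_ctl, miles_ctl = 180, 60_000_000   # comparable non-FSD miles

              r_fsd, r_ctl = acc_fsd / miles_fsd, acc_ctl / miles_ctl
              # z-test on the difference of Poisson rates (normal approximation).
              se = sqrt(acc_fsd / miles_fsd**2 + acc_ctl / miles_ctl**2)
              z = (r_fsd - r_ctl) / se
              print(f"{r_fsd*1e6:.1f} vs {r_ctl*1e6:.1f} accidents per million miles, z = {z:.2f}")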

          Just because you have the beta doesn't mean you actually engage it on city streets, or very often. If you engage Autopilot on the highway, you get the old Autopilot stack with the old visualizations -- only if the car sees that you are not on a divided highway do you get the "FSD" software.

          It's way better than it was - far less jerky and less random dynamiting of the brakes for no reason, but I only ever use it in light traffic conditions with clear lines of sight and nice wide roads, if at all. It needs

    • Tesla needs to be sued for having such a misleading name for their software...

      • by AmiMoJo ( 196126 )

        They need to be sued for testing their alpha quality crapware on the public, with untrained safety drivers (Tesla owners) who are barely vetted for suitability.

    • by danskal ( 878841 )

      This is insightful? w.t.f.

      troll: "I haven't got Windows 11, but Windows 3.11 really sucks, herp derp"

      Slashdot: woah yeah, insightful!!

    • by DrXym ( 126579 )
      Not surprising at all. I'd actually be terrified to let a car drive itself having watched some of the videos where it makes dangerous mistakes and acts in unpredictable ways in traffic and non-optimal situations.
    • Near where I live there is a bridge that is around three car lengths long and only one lane. People look across to see if a car is coming and proceed or wait. I guess Tesla will take a long time to get to that situation.
    • Until Tesla says I can take a nap or read a book while the car is 'driving itself', I don't see the point in paying for this system, let alone allowing its use on public roads.
    • Tesla AI does a better job of driving seeing 2 cars ahead than humans do seeing 10 cars ahead.

      Part of the problem is humans see too much. We see everything and focus on weird rather than known dangers.

      Not saying that seeing 3 or 4 cars ahead is not a good idea, but you are literally the guy focusing on the wrong thing.

      The best AI will probably see 4+ cars ahead. But I would rather take one that sees 2 cars ahead with perfect electronic speed reflexes than the guy fresh off the boat from England who just go
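
      The reflex point is easy to put numbers on. A rough sketch (the reaction times are textbook ballpark figures, not measurements of any particular system):

          # Distance traveled during the reaction delay, before braking even starts.
          speed_mps = 60 * 1609.344 / 3600   # 60 mph in meters/second (~26.8)
          for label, t_react in (("human, ~1.5 s", 1.5), ("computer, ~0.1 s", 0.1)):
              print(f"{label}: {speed_mps * t_react:.1f} m before the brakes engage")
          # human, ~1.5 s: 40.2 m; computer, ~0.1 s: 2.7 m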

  • "Full self driving" would mean SAE-4 or SAE-5. They have SAE-2 and maybe have SAE-3 with this update. They are lying by misdirection. That lie _will_ kill people. But that cretin at the top of Tesla must apparently bolster his ego, no matter what.

  • Meh... (Score:5, Informative)

    by Ritz_Just_Ritz ( 883997 ) on Sunday October 02, 2022 @09:30PM (#62931855)

    I've been part of the beta for about 6 months or so. It works OK on well-marked highways. It works a lot better than I'd expect, but still nowhere near autonomous, on side streets. It does a remarkable job of reacting to street signs and traffic lights. However, I wouldn't feel comfortable looking away from the road even for a couple of seconds. It rarely does anything super crazy, but you do get the occasional phantom panic stop, and it tends to get confused by things like roads widening out for an exit lane.

    Overall, I've seen steady improvement and I like that it's a bit more functional than the "adaptive cruise" and "just stay in this lane" features of other cars I've driven. I doubt we'll see it being anywhere near safe enough for fully autonomous driving anytime soon, but it is still a handy feature as long as you're still paying attention to the road.

    Best,

    • What you are describing is still the Autopilot stack. I just use Autopilot, and there are very few times where I don't understand what triggers an action -- even if it is wrong. The highway phantom braking/slowing is largely gone, but that was a huge issue. We are a very long way from Robotaxis... which is amazingly sad to me.

    • by AmiMoJo ( 196126 )

      What is the legal situation with phantom panic stops where you are?

      In the UK the general rule is that if you hit someone from behind you are probably at fault, but there are exceptions. One of those is if they brake hard without a genuine reason.

      There used to be a popular scam where five or six people would get in a car, wait until someone was behind them on a roundabout, and randomly brake hard. Blame the person who hit them from behind, big insurance claim for multiple injuries etc.

      • This is how it works everywhere in the USA that I'm aware of. In California and some other states you have the added feature of it always being illegal to hit a pedestrian in a crosswalk, unless you have the light AND could not reasonably have seen them.

  • by rickyb ( 898092 ) on Sunday October 02, 2022 @09:37PM (#62931861)
    I've been running the beta for 6 months or so. While I've seen some marginal improvements handling edge cases, I still need to intervene several times on a routine 8-mile route I take several times per week (and I report the mistakes each time to help "train" their model). In particular, FSD wants to get into turning lanes when it shouldn't and vice-versa. It won't be ready until I would feel comfortable telling my non-techie parents to use it. Using that metric, it's nowhere close.
  • One thing I see very few people talking about directly is what a vast data moat Tesla has built up now, with so many drivers and driver takeovers having been recorded...

    That is many orders of magnitude more than any other self driving car company can claim in terms of real world data, and the mess that is real world driving.

    Might it take a while to really get there? Yes. But Tesla is in a unique position to get to real self-driving first, both because of the vast training data they are building up, but a

  • It was 170K a week ago, but there's been some attrition.

  • "For example, when a Tesla makes a left turn into an intersection, it can choose a trajectory that doesn't make close calls with obstacles like people crossing the street."

    In an arrowless yield-on-green left turn intersection, pedestrians often have the walk signal and are crossing. The Tesla is NOT supposed to be making a turn under that circumstance.

    • In fact, in at least some states you're explicitly prohibited from even crossing the limit line until any pedestrians have fully crossed, curb to curb... nobody waits that long, though.

    • >> "For example, when a Tesla makes a left turn into an intersection, it can choose a trajectory that doesn't make close calls with obstacles like people crossing the street."

      And it only took them 35 software releases to decide that making close calls with "obstacles" (people) was a bad idea!

  • by kmoser ( 1469707 ) on Sunday October 02, 2022 @11:26PM (#62932013)
    What happens when a "Death Race 2000" bug slips into the latest release that causes Teslas to aim for, rather than avoid, vehicles and pedestrians? What happens when a malicious actor hacks the OTA update to insert their own such patch? All that hard work by Tesla engineers goes out the window when thousands of Teslas suddenly become killing (and/or suicide) machines.

    A few thousand cars operating in a carefully controlled test environment is one thing. Hundreds of thousands of such cars being operated by beta testers are a disaster waiting to happen. Just look at all the thousands of people affected when a bug brings down Google, Facebook, or AWS for a few hours. Those companies dedicate engineering resources at a similar order of magnitude as Tesla. Now imagine, instead of losing access to Google Docs or their favorite websites, those people get killed by a runaway car.

    Happy trails!
    • ...the same thing that happens when somebody decides they're fine to drive home after an evening at the bar? Or sees the person that their spouse cheated with and gives in to rage? Or has a medical emergency behind the wheel? Or when somebody decides they can handle the weather conditions, but can't?
      • by bws111 ( 1216812 )

        Except those things don't happen to every human driver at the same time, unlike what the OP is talking about.

    • by GoTeam ( 5042081 )
      Isn't this like asking what happens if a group of terrorists strap bombs to hundreds of cars in a hundred different cities?
      • by kmoser ( 1469707 )
        Practically speaking, terrorists are limited to approximately one bomb per terrorist per vehicle, and the number of terrorists who could coordinate that would be several orders of magnitude smaller than the hundreds of thousands or even millions of Tesla vehicles. Also, to get 1,000 terrorists setting up bombs, you need at least 1,000 terrorists--a plan likely to fail when even one of them turns out to be an informant. To set up hundreds of thousands of Teslas to turn on pedestrians, you need just a small h
    • What happens when a malicious actor hacks the OTA update to insert their own such patch?

      Pretty sure that when it comes to terrorist activity that involves compromising the whole software delivery chain, including cryptographic signing keys and any independent technical safeguards, the death-to-effort ratio is still much in favor of simply stealing some unsecured fertilizer from a farm.

      What's easier to protect - a manufacturing and delivery process fully under your control, or each of billions of pounds of oxidizer spread throughout the country?
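
      For context on what "compromising the signing keys" would actually require: with a standard signed-update scheme, changing even one byte of the image makes verification fail. A minimal sketch using Ed25519 via the Python cryptography library (illustrative only -- this is not Tesla's actual update mechanism):

          from cryptography.exceptions import InvalidSignature
          from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

          vendor_key = Ed25519PrivateKey.generate()   # private key never leaves the vendor
          car_pubkey = vendor_key.public_key()        # public half is baked into the car

          image = b"...firmware bytes..."
          sig = vendor_key.sign(image)                # produced at build time

          def accept_update(image: bytes, sig: bytes) -> bool:
              try:
                  car_pubkey.verify(sig, image)       # raises if anything was altered
                  return True
              except InvalidSignature:
                  return False

          assert accept_update(image, sig)
          assert not accept_update(image + b"x", sig)   # one changed byte -> rejected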

      Just look at all the thousands of people affected when a bug brings down Google, Facebook, or AWS for a few hours. Those companies dedicate engineering resources at a similar order of magnitude as Tesla.

      They dedicate resources according to risk and literally

      • by kmoser ( 1469707 )

        Pretty sure that when it comes to terrorist activity that involves compromising the whole software delivery chain, including cryptographic signing keys and any independent technical safeguards, the death-to-effort ratio is still much in favor of simply stealing some unsecured fertilizer from a farm.

        You're comparing apples (Teslas?) to oranges (fertilizer?). Malicious actors with the skills to compromise a software delivery chain, with the potential payout of having hundreds of thousands of killing machines at their command, have little to no interest in stealing a few thousand pounds of fertilizer which, even if used in the most damaging way possible, might kill a couple of orders of magnitude fewer people. If anything, they have another such branch working on stealing fertilizer, while the geeks s

  • Last I checked, it's unethical to do human experimentation.
  • It's not clear why I would pay for the FSD software and then give Tesla the data to train that software for free.
    • by GoTeam ( 5042081 )

      It's not clear why I would pay for the FSD software and then give Tesla the data to train that software for free.

      Maybe you're just a generous person? :)

  • Terrifying (Score:5, Insightful)

    by lordlod ( 458156 ) on Monday October 03, 2022 @06:52AM (#62932669)

    How many pedestrians and other cars are unwittingly taking part in this live beta test?

    How can the regulators allow 160,000 of these cars to drive around, running software that even the manufacturer says isn't ready, with no warning signs on the cars, no formal training for the supervising drivers, and seemingly no oversight?

  • by fluffernutter ( 1411889 ) on Monday October 03, 2022 @10:05AM (#62933077)
    I think people don't realize that we have never given anyone permission to get into a car accident. It is just something that happens.

    But these companies are designing self-driving systems. They have every opportunity to test before it gets in an accident. They tell us it is safe. If they are looking for permission for it to be imperfect, they will never get it from me.
  • I've been trying to qualify to receive their self-driving beta for months now, yet they still haven't offered it to me.

    While I guess it's impressive they've upped the number of recipients to 160,000, I feel like that still means it's just a lottery whether you got it or not, once you tapped the button to request to participate and proved you can drive safely enough... and one that means you probably have something like a 1 in 8 chance of receiving it after all that.

    To the people all concerned there are this many cars run

  • Meanwhile, Musk is proposing his own ridiculous peace deal for Russia to stop trying to wipe out Ukraine [twitter.com].

    On topic because it's another example of terrible judgment on Musk's part, just like 160k people in a "full self driving" beta that is neither full self driving nor a sane idea for a beta.

  • Is Tesla the only company that compels third parties to participate in its AI beta test, whether they want to be involved or not? Because there doesn't seem to be much in the way of proper studies confirming that the technology is safe, nor on who it prioritizes between the driver and pedestrians. There are lives at risk.
