Transportation

Elon Musk Says Tesla's Full Self-Driving Subscription Arrives In Early 2021 (engadget.com) 88

Yesterday, Elon Musk told Twitter followers that Tesla's Full Self-Driving subscription rollout will arrive "early next year." Engadget reports: In theory, you could add the autonomous (currently semi-autonomous) features without a steep up-front cost in a matter of months. You might not want to plan your schedule around that timetable. Tesla previously hoped to offer a Full Self-Driving subscription by the end of 2020, and that's clearly not happening.

Whenever the monthly plan arrives, it could be key to boosting adoption. If you lease your Tesla, you might not have to pay as much to use Full Self-Driving for the useful life of your EV. It could also give you an opportunity to try the features as long as you like without committing to a full purchase. It's safe to say the usual $10,000 price (as of this writing) is daunting if you're not completely sold on the technology.


Comments Filter:
  • One word. (Score:2, Insightful)

    by MrNaz ( 730548 )

    Subscription.

    • It does make some sense. Self-driving isn't something that a car can do alone - it depends on a constant stream of map data, real-time information, and constant updates to the software. More of a service than a one-off purchase.

      • Just about every product can be subscriptionized with that rationale. Cars themselves need a constant stream of servicing, but we don't let ourselves be stuck with a contract with the manufacturer.

        Why are we ok with our cars being turned into subscription services but we react with horror when MS turns Windows into one?

        Is it because we have some kind of principle defining the two? Or is it because we love one corporate founder and hate the other?

        • I get the impression that the subscription model, and maybe the steep price of the self-driving system too, might have something to do with Tesla's need to buy insurance to cover accidents caused by the system.

    • At first I was a big fan of Elon and what he wanted to do with his companies, until he introduced subscriptions where they don't belong. It gets worse: other manufacturers are copying his subscription fetish.
  • by misnohmer ( 1636461 ) on Monday December 21, 2020 @08:37PM (#60855522)

    Elon has promised that many times. He got me once in 2016, when he said it was coming by the end of 2016. Then he promised a coast-to-coast demo by the end of 2017, then I lost track of the promises; I remember "3 months maybe, 6 months for sure" a couple of years ago. Then in 2019 Tesla neutered the definition of Full Self Driving to be pretty much what Enhanced Auto Pilot used to be from 2016 to 2019. Perhaps he's finally going to have a beta release of EAP as he sold it in 2016, though I suspect it won't be available for cars sold in 2016 (it will require later hardware). Wake me up when I can summon my Teslas from New York to L.A. like Elon promised, and Tesla takes responsibility for any accidents caused by FSD. Heck, let's just say summon anywhere within a single charge's range, and give Elon a pass on automated charging stations with the "Tesla snake" he showed 4 or 5 years ago.

    • by Luthair ( 847766 ) on Monday December 21, 2020 @08:55PM (#60855560)
      Is there even a regulatory framework for it to be a thing at this time?
    • by AmiMoJo ( 196126 )

      I made a list of all the dubious claims Musk has made about Full Self Driving. It needs updating for his latest comments.

      **10th October 2014:** Bloomberg interview: "And so I think from the point at which true autonomous driving is possible, which I now think is probably the five or six year timeframe..."
      "I think we'll be able to achieve true autonomous driving, where you could literally get in the car, go to sleep and wake up at your destination,"

      **December 2015:** "We're going to end up with com

    • First, let's see that coast-to-coast demo.

    • This announcement does not concern when self-driving will be ready; it's just that instead of paying for the FSD package up front, you will be able to rent it. The example given was somebody leasing a Tesla.

  • DRM. Check.
    Cultists. Check.
    Now all he needs is a gig economy "platform", and the evil capitalist toolkit is complete. ;)

  • And the much prophesied cattle cars have arrived!

  • by Anonymous Coward

    As someone who has worked on self-driving technology, something that became clear to me early on is that LIDAR was not going to work at scale. LIDAR cannot see very far; that's why you rarely see the larger prototypes with the big hardware on highways. LIDARs do not understand flying plastic bags, so much so that these 'objects' were disabled by the Uber self-driving car team so that the car would not stop in the middle of the road all the time.

    Many of the self driving car demos are remotely assisted by huma

    • LIDAR is OK for neighborhood driving, but not for anything with substantial speed, because of the range issue you mentioned... I still think some aspects of LIDAR might end up in final self-driving cars, but cameras will surely be the primary technology at the core.

      • by AmiMoJo ( 196126 )

        Lidar has far greater useful range than cameras. Waymo's hardware works out to 300m, which is about 10 seconds of driving at highway speeds. For longer range than that you need radar.

        The issue with cameras is that although you can have one with a super long lens and high resolution it's not very practical. You need them to be able to see things close to the car as well, so end up either having loads of them or using short range, wide angle types. The more cameras you have the more images you have to process
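
        As a quick sanity check of the "300m is about 10 seconds" figure above, here is a back-of-the-envelope sketch; the highway speed and the camera range used for comparison are assumptions for illustration, not numbers from the thread.

```python
# Rough look-ahead time at highway speed for a given sensor range.
# Assumed speed: 110 km/h (~68 mph). The 300 m lidar range is the
# figure quoted above; the 150 m camera range is purely hypothetical.
speed_m_per_s = 110 * 1000 / 3600   # about 30.6 m/s

for sensor, range_m in [("lidar", 300), ("camera (hypothetical)", 150)]:
    print(f"{sensor}: {range_m / speed_m_per_s:.1f} s of look-ahead")
# lidar: 9.8 s of look-ahead
# camera (hypothetical): 4.9 s of look-ahead
```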

        • Lidar has far greater useful range than cameras.

          Only for massive units. Waymo's LIDAR is like having a full roof carrier [google.com] and massive side bulges. It's fine for what Waymo is doing, where they rely on tons of extra pre-recorded and analyzed information about routes, but it's not practical for consumer passenger vehicles, where people would not put up with that bulk.

          LIDAR also doesn't give you any of the visual clues that cameras do, zero sense of color for example which is pretty key for understanding real-world dr

          • by AmiMoJo ( 196126 )

            Colour is only useful at short range where humans can see it.

            • Colour is only useful at short range where humans can see it.

              At long range color can mean the difference between detecting a deer and a decorative sculpture someone has placed by a mailbox...

              It's also a primary communication mechanism of road signage. If there's something you can't quite make out the shape of but can see significant amounts of orange in, you slow down for it ahead of time.

              Color at long range is super important to determining which objects are potential hazards. Shape alone is way too easily obscured by o

              • by AmiMoJo ( 196126 )

                Signs don't work that way in Europe or Japan; they are all red with different symbols.

                What if it's a sculpture of a deer? Fast roads are separated by barriers, and on slower roads you don't need 300m of detection range.

                • Signs don't work that way in Europe or Japan, they are all red with different symbols.

                  Yes, all RED. As in, it's important to be able to note when something is red to know for sure it's a traffic sign. Which means you need to be able to see color way ahead of time.

                  Pretty sure other countries also still use the traditional orange cones for construction, at least that is what I have seen...

                  Also how is LIDAR supposed to read those symbols on signs? It cannot. Remember all it sees is shapes.

                  Fast roads are se

                  • by AmiMoJo ( 196126 )

                    Why does the camera need to read road signs at much further distance than humans can?

                    • Why does the camera need to read road signs at much further distance than humans can?

                      Going to make me break out the 640k quote eh?

                      Information obtained as soon as possible is useful in ways you obviously cannot imagine.

                      For instance, in early choice making for alternate routes. That is but one example of billions.

                    • Surely the earlier you can know something, the better. I think the argument here is that cameras needn't outperform the admittedly sophisticated dual-camera combo of the human meat-lens. It needs to match it. If it can eventually beat it, so much the better, but where the advantage starts to sway even today is with the computer's ability to not stop paying attention to 360 degrees of variables. Some of this ability is inherent to it being a machine, some will come with ongoing advancement
                    • Surely the earlier you can know something, the better.

                      I agree with that, but the problem is LIDAR is not really seeing nearly as much as cameras can.

                      They technically reach further, but are getting less information, especially than a pair of stereo cameras can produce.

                      I think the argument here is that cameras needn't outperform the admittedly sophisticated dual-camera combo of the human meat-lens. It needs to match it.

                      I think cameras probably already match most human vision (just from the standpoint

              • by robi5 ( 1261542 )

                FWIW European traffic has different sign shapes precisely to help the driver discern their import from far away. For example, warning signs are all triangular. There's only small variation across countries https://en.wikipedia.org/wiki/... [wikipedia.org]

                Also, the amount of red ink is significant, which is why a stop sign, already distinct via its unique octagonal shape, is all red. Meanwhile, green and blue are mostly informational.

    • Re: (Score:2, Interesting)

      I understand everything you're saying, and some other people understand too. But some companies, realizing when they were 95% done that it just wasn't going to 'wake up' and be as cognitive and adaptive as a human brain, just forged on ahead regardless, and then everyone else jumped on the SDC bandwagon too, just so they wouldn't be left behind if it suddenly, magically started working as intended. It's not going to happen. Not with the technology being used now. All the SDC fanboys are ignoring, as you sa
      • Progress never happens overnight. What you are seeing is the progress in motion. And it is progressing. When Elon Musk says one day they will flip a switch and the fleet will wake up, he is referring to the end point, not the progress there. It's sort of like installing an OS onto a hard drive. It "instantly" works, but that is not counting all the hours of coding and testing to get the OS into that condition to begin with.
        • Musk has a poor history of promising delivery dates though. Self-driving cars will happen one day - but based on Musk's past claims, I wouldn't put much stock in his timetable claims.

      • We'd be better off with better driver education/driver training and stricter, higher testing standards to keep shitty drivers off the roads. Furthermore, how about some psychological testing of drivers to weed out the ones who will 'behave' just long enough to get their license, then go out and drive like it's live-action Mario Kart?

        Yeah, you're perfect. We get it.

        In the real world we know that Teslas will be better than the average human.

      • by robi5 ( 1261542 )

        Correct. The current tech makes incredibly dumb decisions. Like suddenly braking in the middle of an open road, with no obstacles or even shadows, water patches or road discoloration in sight. On the highway, when followed by some other high speed car. There's a long way from that to the ability to tell from a patch of pixels in the road area if the road itself is just differently colored; or it has the hallmarks of an oil spill; or if it's a human, an animal, a tire, or a paper bag. We humans use a ton of

  • Honestly, how does this guy not get sued by regulators for his naming practices?

    • by quenda ( 644621 )

      Honestly, how does this guy not get sued by regulators for his naming practices?

      Why do you ask, X Æ A-12?

  • Probably better to stay off the road once this rolls out, at least until all the Teslas have impacted bridge supports, each other, centre medians, etc. At that point we will also be safe from sanctimonious Tesla drivers boring us with their religious bullshit.
    • Or get hit by one and hit the jackpot. How fast will they pay out?

    • I don't personally think this is as much of an issue as others seem to think it is.

      The evidence so far is that you're significantly safer with an AI driver than a human one. But anything less than complete safety might not be good enough for some.

      There is a reason they still have pilots on 747 jumbo jets, despite the fact that the planes have been capable of fully automatic flight from takeoff to landing for decades. Perception.

      • by Geek On The Hill ( 7534494 ) on Tuesday December 22, 2020 @12:23AM (#60855900)

        Drunks rarely stagger in front of 747's while they're in flight.

        Drivers of 747's don't pull over to the side of a cloud because their kid needs to pee.

        747's don't stop short because a deer runs across the sky.

        It's a rare thing for the driver of a 747 to back up in the sky because they missed their exit.

        Drivers of 747's typically don't run stop signs, tailgate, cut each other off, or fight over parking spaces.

        It's pretty unusual for a newspaper, black plastic bag, or stray umbrella to be blown in front of a 747 at altitude.

        Few pilots will admit it, but the real reason why airplanes had autopilots generations ago is because flying is easier than driving. I'm one of the exceptions. I fly (though not 747's) and I drive. It takes a lot of study to master the knowledge part of flying -- most of which will never actually be used in the air -- but flying itself is easier than driving.

        Even landing is easy once it dawns on the pilot (hopefully while he or she is still a student) that landing is just flying to a point on the ground. That's why there have been actual cases when people who'd never piloted an aircraft in their lives were able to successfully be "talked down" to safe landings when the choice was to do so or die. It's simply not that hard to land an airplane. Complete novices who were scared for their lives have done it.

        Driving... That's another story. I learned to drive in New York City. I also drove a cab in Manhattan for a while when I got out of the service, and drove a truck hauling hazmat to put myself through college. Wanna know about some of the stupid shit drivers and pedestrians do? How much time do you have? I have stories.

        What it really comes down to is that flying is predictable. Most flights are so uneventful that both pilots could (and sometimes do) fall asleep without anyone noticing unless they overfly a waypoint. Driving, not so much.

        You also have someone watching over the situation when you're in the air. Even if you are flying VFR and haven't requested flight following, you're still on someone's radar (quite literally). They may not talk to you unless you do something exceptionally stupid or they have reason to believe you're in danger (or creating it), but they're watching.

        In short, there's really no comparison between self-driving cars and planes on autopilot. Driving is much more difficult, requires much better reflexes, and is done in a much less-predictable context. That's why rudimentary aircraft autopilots were already in use during WW1, but we still don't have self-driving cars.

        • That still doesn't change the fact that self-driving cars crash at significantly lower rates than human-driven cars.

          • by ishmaelflood ( 643277 ) on Tuesday December 22, 2020 @05:04AM (#60856184)

            I haven't seen a study that demonstrates this when accounting for confounding factors. Teslas on Autopilot are mostly nearly new, driven by middle-aged, middle-class people, on freeways, in good conditions. The crash rates for non-Autopilot cars with the same confounding factors are... not published.
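
            To make the confounding point concrete, here is a toy illustration with entirely invented numbers (a sketch only, nothing here comes from real crash data): aggregate rates can favour Autopilot even if it is worse on every individual road type, simply because Autopilot miles skew toward freeways.

```python
# Toy example of confounding in crash-rate comparisons.
# All numbers are invented purely for illustration.

# (millions of miles, crashes), broken down by road type
data = {
    "autopilot": {"freeway": (400, 80), "city": (20, 12)},
    "manual":    {"freeway": (100, 15), "city": (300, 150)},
}

for mode, by_road in data.items():
    miles = sum(m for m, _ in by_road.values())
    crashes = sum(c for _, c in by_road.values())
    print(f"{mode}: {crashes / miles:.2f} crashes per million miles overall")
    for road, (m, c) in by_road.items():
        print(f"  {road}: {c / m:.2f}")

# autopilot: 0.22 overall (freeway 0.20, city 0.60)
# manual:    0.41 overall (freeway 0.15, city 0.50)
# Autopilot looks about twice as safe in aggregate even though manual
# driving has the lower rate on each road type; the mix of miles does
# all the work, which is exactly the confounding problem described above.
```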

          • by sphealey ( 2855 )

            The vast majority of documented semi-autonomous vehicle testing has been either in the Mediterranean climate regions of California or the arid regions of Arizona. I don't hear much about Tesla testing "FSD" on the South Side of Chicago a day or two after a 2 ft snowstorm. I do know that General Motors has tested semi-autonomous systems on its usual test routes in Michigan, and perhaps it is not a coincidence that GM significantly slowed its planned release of such technology ~3 years ago.

            • The only place I was ever scared to drive was on I-85 thru Atlanta trying to connect to I-20. It was raining in a construction zone and I was going 90 MPH trying not to die with people passing me on either side like I wasn't moving at all. Finally bailed out to an alternate route. When Tesla can drive in these conditions I will be impressed.

      • Perception is based on coverage, not statistics. People being hit by normal, human-driven cars don't even make the local news. But you can be sure that, for the first few years, every single death due to a self-driving car will get national coverage. That will cause a public perception that self-driving cars are reckless and dangerous, and politicians will react to that and regulate accordingly.

        Same reason people are afraid of terrorism in the US, even though more Americans have been killed by lightning str

    • by ELCouz ( 1338259 )
      This is a subscription-based model. It is in their best interest to keep the subscriber *ahem* driver alive.
    • by Jeremi ( 14640 )

      Naw, Darwin will only be present in Apple's self-driving car, and that won't be until 2024 at the earliest.

  • Public health (Score:5, Insightful)

    by endoboy ( 560088 ) on Monday December 21, 2020 @10:25PM (#60855728)

    How is it that Tesla can unleash a "Full Self Driving" algorithm on the public without regulatory review of any kind? I see a tort bonanza the first (inevitable) time they maim/kill a pedestrian or bicyclist.

    • by ribuck ( 943217 )

      Tesla is not claiming that FSD is Autonomous FSD, so FSD doesn't mean what the average Joe thinks it means.

      • Tesla is not claiming that FSD is Autonomous FSD, so FSD doesn't mean what the average Joe thinks it means.

        So just like "autopilot" doesn't mean what the average Joe thinks it means, now "full self-driving" doesn't mean what the average Joe thinks it means.

        Everyone (the average Joes, I guess) who does not own a Tesla is too stupid to grok the concepts of "autopilot" and "full self-driving". Certainly the problem does not lie with whiz-bang marketing departments, but instead with the average Joes being too goddamn stupid. Guess I'll go lick the windows now.

    • by AmiMoJo ( 196126 )

      At the moment it's in "beta" and the owner of the vehicle is supposedly responsible for monitoring it, and thus for any accidents that occur as a result.

      This is problematic in many ways. For a start, these are untrained members of the public, not qualified safety drivers who know what kind of erratic behaviour to expect and how to handle it. In particular, they don't know where the limits are, or the point at which they should take over, and judging by the videos people are posting they often leave it way too la

  • I'm planning on getting a Model Y at the end of Feb when the lease on my Leaf runs out. I was going to get the full self driving add on ($8k on the Model Y), but looked at what you get with it, and I just wouldn't use any of it other than to play with as a toy. Summon, Auto Lane Change and Autopark are cute, but even if they actually work reliably, they aren't really all that useful. The end-to-end navigation is what people think of, but the basic autopilot gets you 90% of that with the active cruise con

    • The end-to-end navigation is what people think of, but the basic autopilot gets you 90% of that with the active cruise control and Autosteer.

      Lane-keeping and tailgating-avoidance are not 90% of driving. They're more like 10%.

      • by vanyel ( 28049 )

        That's not what I said, but I'm not going to argue - let's just say it's what I want out of autopilot until it's good enough that I can read or go to sleep using it, no matter the weather or traffic conditions, which I'm not expecting in my lifetime (being an old fart).

  • by Jeremi ( 14640 ) on Tuesday December 22, 2020 @01:23AM (#60855962) Homepage

    The thing about software (even really cool software like Tesla's FSD codebase) is that the marginal cost of production is $0 -- once the developers get it sufficiently debugged and working, additional copies of the resulting program cost Tesla literally nothing more than a call to fwrite() to produce.

    That means that as soon as Tesla feels the first whiff of competition in the FSD market (which they likely are feeling already, given the presence of Waymo and others), they will be able to knock the price of their FSD package down considerably, until eventually the price reflects only the cost of the required hardware (which you paid for anyway, when you bought the car).

    Therefore, unless you are really rich and have money to burn for the privilege of doing rather risky beta-testing for free, you're better off not buying (or leasing) FSD for a few years.

    In the best-case scenario, FSD will be perfected, and it will be widely used across a large number of vehicles, and the price will come down dramatically -- at which point you can buy it on the cheap and save lots of money.

    In the worst-case scenario, FSD will run into unexpected major problems, and never quite live up to its promise, in which case you avoided buying a lemon, and saved even more money.

    Either way (as well as in any in-between scenario), you win by holding off until FSD has been commoditized.

    • by AmiMoJo ( 196126 )

      The original spec for Full Self Driving as sold was that it could drive you to work automatically, then go off and park itself, and finally pick you up again for the return journey. It was supposed to be able to drive itself across the entire continental United States, so e.g. you could take a flight to your destination and a day later the car catches up with you.

      These outlandish claims are a long, long way from being possible. Current Tesla vehicles don't even have self-cleaning sensors (apart from the one

  • by fluffernutter ( 1411889 ) on Tuesday December 22, 2020 @08:02AM (#60856346)
    As I have feared, automated driving is becoming more about "how far can we push liability?" than "how safe can we keep people?" It seems that Tesla is willing to roll this out as long as they feel they won't get sued, rather than because they actually have a safe product.
  • Full Self-Driving. Actually full self driving. I can't believe it's not driving itself. Truly autonomous self driving on regulated car parks only. Self driving double plus star shot.

    Don't call it self driving until it can drive me home from a bar without me risking a DUI.

  • Whether FSD is a step beyond Autopilot depends on whether Tesla is willing to publicly declare the system to be at least Level 3. In the past, Tesla relied on innuendos to suggest that Autopilot was at least Level 3, while still publicly and legally stating that it was only Level 2 to avoid lawsuits and fraud allegations.

    If Tesla claims Level 3, then that's huge news and a truly significant advance. Otherwise, it's an incremental addition of more Level 2 features. Such advances are still welcome and pote
