Transportation Software Technology

Tesla's Full Self-Driving Beta Rolls Out To Rave Reviews

Rei writes: "Oh, it's going dude!" "But there's no lanes -- there's cars on the side of the road!" "DUDE, it's navigating through it ALL, bro!" ... "It paused to look DUDE!" "There's NO LANES! Elon, you madman!"

Such was one of the many reaction videos to come out overnight as Tesla released a major upgrade of Autopilot to a limited public beta. Complete with a new LIDAR-like visualization system, the car now provides a detailed display of how it perceives the world as it dodges parked cars, takes unprotected left turns with cross traffic and pedestrians, etc. No issues have been reported thus far, although one driver aborted after a roundabout due to a panicked passenger who didn't like how it exited into a lane with parked cars ahead (though the screen showed that it planned to change lanes to go around them).

The new version is the result of a long-running "4D" rewrite at Tesla, to overcome local maxima in earlier versions. Instead of processing each camera individually as a static series of frames, the neural net now processes a unified stitch of inputs over time, allowing context to persist between cameras and between frames. This in particular enhances parallax processing of depth input, both in terms of parallax between camera viewpoints and parallax between vehicle positions. If the public beta goes well, Elon is hopeful that Tesla will be able to roll out the new version broadly by the end of the year.
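As a rough illustration of what "a unified stitch of inputs over time" could mean in code, here is a minimal, hypothetical sketch (this is not Tesla's architecture; the camera count, module shapes, and layer choices are assumptions for illustration): per-camera features are stitched into one vector per frame, and a recurrent layer carries context across frames so that information can persist between cameras and between timesteps.

```python
# Hypothetical sketch of multi-camera, multi-frame fusion (not Tesla's code).
# Idea: extract features per camera, stitch them into one tensor per frame,
# then let a recurrent layer carry context across frames.
import torch
import torch.nn as nn

class FusedPerception(nn.Module):
    def __init__(self, num_cameras=8, feat_dim=128, hidden_dim=256):
        super().__init__()
        # Shared per-camera feature extractor (stand-in for a real backbone).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
        # Temporal module: context persists from frame to frame.
        self.temporal = nn.GRU(num_cameras * feat_dim, hidden_dim, batch_first=True)

    def forward(self, frames):
        # frames: (batch, time, cameras, 3, H, W)
        b, t, c, ch, h, w = frames.shape
        feats = self.backbone(frames.reshape(b * t * c, ch, h, w))
        stitched = feats.reshape(b, t, c * feats.shape[-1])  # stitch cameras per frame
        out, _ = self.temporal(stitched)                      # fuse across time
        return out                                            # (batch, time, hidden_dim)

# Toy usage: 2 clips, 6 frames, 8 cameras of 96x96 RGB.
model = FusedPerception()
print(model(torch.randn(2, 6, 8, 3, 96, 96)).shape)  # torch.Size([2, 6, 256])
```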
This discussion has been archived. No new comments can be posted.

  • by Rei ( 128717 ) on Thursday October 22, 2020 @06:31PM (#60637806) Homepage

    They definitely don't want people getting complacent ;)

    Full Self-Driving (Beta)

    Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.

    When Full Self-Driving is enabled your vehicle will make lane changes off highway, select forks to follow your navigation route, navigate around other vehicles and objects, and make left and right turns. Use Full Self-Driving in limited Beta only if you will pay constant attention to the road, and be prepared to act immediately, especially around blind corners, crossing intersections, and in narrow driving situations.

    Time will tell how reliable it is. But these videos are awesome, and I can't wait to get the update (even though it'll be Europe-crippled).

    • by Rei ( 128717 ) on Thursday October 22, 2020 @06:46PM (#60637842) Homepage

      A few [twitter.com] more [twitter.com] videos, beyond those linked in the above Electrek link [electrek.co].

      One of the owners plans to post a drone shot of the car doing a long city drive tomorrow.

    • Re: (Score:2, Insightful)

      by Moof123 ( 1292134 )

      "Do not become complacent."
      I thought Uber did a thorough job showing that people WILL become complacent, even when you pay them not to.

      Why are we beta testing on live roads with real humans again? How is this all legal exactly?

      • by Rei ( 128717 ) on Thursday October 22, 2020 @07:08PM (#60637924) Homepage

        What exactly is your proposal for maturing self-driving technology if not for using it on actual roads?

1.35 million people die per year in car crashes. If you delay the advent of autonomy by just one day, you've caused the death of 4000 people. I fully agree that autonomy is something that should be pursued, and sooner or later, you have to put it on the road, and no, it will not be flawless. Which is why you first do heavy internal testing, then tiny public betas, then ever-growing betas as the results come in, assuming that the data supports that they're safer than a human driver.
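(A quick back-of-the-envelope check of that figure, under the stated assumption of 1.35 million deaths per year spread evenly over the year:)

```python
# Rough arithmetic behind the "4000 people per day" claim.
annual_road_deaths = 1_350_000
print(round(annual_road_deaths / 365))  # 3699 per day, i.e. roughly 4000
```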

        What you don't do is just, "It can't be fully birthed at maturity, so we should just not do it".

Again, I don't think things will be to the point where you can just take humans entirely out of the loop soon. But I support the development of this technology regardless. Humans in cars kill people every year, and eventually, with sufficient development, the overwhelming majority of those deaths can be eliminated.

        • +1 Insightful

        • by vux984 ( 928602 )

          "What exactly is your proposal for maturing self-driving technology if not for using it on actual roads?"

Maybe Tesla owns an abandoned piece of Detroit or some other ghost town where it's been paying pedestrians to crosswalk, jaywalk, dogwalk, and cycle, and hiring people to drive around aggressively, naturally, over-cautiously, open doors from parked cars, set up obstacles from trash bags to an old mattress to a shredded tire, to plywood sheets, to an 18-wheeler truck blocking 4 lanes of the road, erect construction pylons in typical ways to simulate construction, and maybe it has run these tests for the last year in daylight, dawn, dusk, midnight, moonlight, in fog, in wind, in rain, in snow... maybe they did do all that. Did they? Where? What exactly were the conditions of this heavy testing?

          • by robbak ( 775424 ) on Thursday October 22, 2020 @09:17PM (#60638220) Homepage

            For a while, when cars have been driving on the street, this software has been running in the background - not actually driving the car, but analyzing the scene and deciding what it would do. They are releasing it to beta after they have seen it making good decisions in the real world for some time.
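A minimal sketch of what that background ("shadow mode") evaluation could look like (hypothetical; the function names, action fields, and thresholds are assumptions, not Tesla's implementation): the software plans on live sensor data but never actuates, and frames where its plan diverges sharply from what the human actually did get flagged for review.

```python
# Hypothetical "shadow mode" loop: the software plans but never actuates.
from dataclasses import dataclass

@dataclass
class Action:
    steer: float   # radians
    accel: float   # m/s^2

def plan(sensor_frame) -> Action:
    # Stand-in for the real planner.
    return Action(steer=0.01, accel=0.2)

def shadow_evaluate(log, steer_tol=0.05, accel_tol=0.5):
    """Compare planned vs. human actions; return frames worth reviewing."""
    disagreements = []
    for frame, human in log:                     # (sensor_frame, Action) pairs
        planned = plan(frame)
        if (abs(planned.steer - human.steer) > steer_tol
                or abs(planned.accel - human.accel) > accel_tol):
            disagreements.append((frame, planned, human))
    return disagreements

# Toy usage with two recorded frames.
drive_log = [("frame_0", Action(0.0, 0.3)), ("frame_1", Action(0.4, -2.0))]
print(len(shadow_evaluate(drive_log)))  # 1 -> the hard-braking turn disagrees
```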

            • They also run massive simulators so that they can set up virtual scenarios for the cars to navigate.

              • by vux984 ( 928602 )

And that's great. But again, you can't simply equate passing in a simulator with it being ready to be let loose on the streets unsupervised. Why NOT take a more cautious approach with fewer risks? Like geofencing it to an area they've simulated heavily, and limiting it to daylight hours on clear days.

                This reeks of publicity stunt, which is pretty much Textbook Musk.

                • by Kumiorava ( 95318 ) on Friday October 23, 2020 @01:03AM (#60638700)

We are letting uneducated drivers onto the streets every day to learn how to drive; nobody requires humans to be fully trained pros. Why do you feel that a properly trained AI couldn't drive much better than an average human driver? The events happening out there are not that strange: see an obstacle, avoid an obstacle. More complex cases are also extremely complex for most humans, who react in suboptimal ways to those events as well.

                  • We don't place enough restrictions on new drivers. Licenses should be graduated, with new license holders only permitted to operate subcompacts (with exceptions for people who are unable to do so, like wheelchair users.)

            • by vux984 ( 928602 )

              "They are releasing it to beta after they have seen it making good decisions in the real world for some time."

I'm aware of this, but it's not the same as actually driving.

That's "training the AI". It's hard, and some would say meaningless to then test it on its own training data.

              "They are releasing it to beta after they have seen it making good decisions in the real world for some time."

It's pretty hard to assess that, because they weren't driving. So all you know is how different the car would have done vs w

              • and some would say meaningless to then test it on its own training data.

                AlphaZero wants to know if those people would like to play a nice game of Chess.

                I have no doubt, whatsoever, that any even semi-trained NN will *quickly* come to outperform human drivers in instances and ways human drivers didn't even realize they sucked at.

I'm aware of this, but it's not the same as actually driving.

That's "training the AI". It's hard, and some would say meaningless to then test it on its own training data.

That's not how control systems work. Training on visible data and then adjusting the response based on what the simulated response would be is almost the same as actually driving, because the output is completely deterministic. Modeling a system's actions based on an understanding of how a car reacts to an input, and comparing it to human decision making, is actually not that different from putting 16-year-old you in the driver's seat of a learner car. You and the system both very much learn based on what is
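The "completely deterministic" point is easy to illustrate: given the same state and the same control input, a vehicle model always returns the same next state, which is what makes replaying logged inputs meaningful. A minimal sketch using a kinematic bicycle model (an illustrative stand-in; real vehicle dynamics models are richer):

```python
# Deterministic vehicle response: same state + same input => same next state.
import math

def bicycle_step(x, y, heading, speed, steer, accel, dt=0.1, wheelbase=2.9):
    """One step of a kinematic bicycle model (illustrative, not Tesla's)."""
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += speed / wheelbase * math.tan(steer) * dt
    speed += accel * dt
    return x, y, heading, speed

state = (0.0, 0.0, 0.0, 10.0)          # x, y, heading, speed (m/s)
print(bicycle_step(*state, steer=0.05, accel=0.0))
print(bicycle_step(*state, steer=0.05, accel=0.0))  # identical output
```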

Maybe Tesla owns an abandoned piece of Detroit or some other ghost town where it's been paying pedestrians to crosswalk, jaywalk, dogwalk, and cycle, and hiring people to drive around aggressively, naturally, over-cautiously, open doors from parked cars, set up obstacles from trash bags to an old mattress to a shredded tire, to plywood sheets, to an 18-wheeler truck blocking 4 lanes of the road, erect construction pylons in typical ways to simulate construction, and maybe it has run these tests for the last year in daylight, dawn, dusk, midnight, moonlight, in fog, in wind, in rain, in snow... maybe they did do all that.

            Did they? Where? What exactly were the conditions of this heavy testing?

It's almost as if you've never seen a screenshot of Grand Theft Auto...

Your proposal is a great way to teach a car how to drive in a lab. But even you, as a learner driver, at some point progressed out of the shopping center carpark and onto an actual road.

            Did they? Where? What exactly were the conditions of this heavy testing?

            I don't know man, why not get the detailed design documentation of every product you've ever touched? I'm sure companies are more than happy to open up all their R&D for any schmo who asks. /sarcasm

            Taking humans out of the active control loop to the point that they aren't actually driving, inevitably reduces the attention they are paying to the driving.

            What a wonderful end goal. I can't wait for the day where I don't need to so much as look out the window when I'm in the

        • by AmiMoJo ( 196126 )

          Waymo seems to have managed to do it without any fatal accidents. In fact the only accident they ever had was when the car ran into the side of a bus at 2 MPH after it failed to anticipate that the bus would ignore traffic rules.

          Their system is Level 4. No safety driver, limited operation area.

I've lost count of the number of Tesla Autopilot fatalities, but Waymo has been proven right. People can't be trusted with a Level 2 system: the closer it gets to being perfect, the less attention they will pay to it.

        • by flink ( 18449 )

          What exactly is your proposal for maturing self-driving technology if not for using it on actual roads?

Have 1000s of people drive around in cars instrumented exactly the way the car will be. Record all the telemetry. Feed the telemetry to the AI for training. Record more scenarios. Release the feature as a beta to trained professional drivers only when the AI can navigate all the recorded scenarios perfectly without having to hand control back to the driver while the car is in motion. Doing so while stopped at an intersection or safely pulled over is probably OK. Once it's soaked with professionals for
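A minimal sketch of the release gate described above (hypothetical names and pass criterion, not an actual Tesla or regulatory process): replay every recorded scenario through the candidate planner and only ship if none of them forces control back to the driver while the car is moving.

```python
# Hypothetical replay gate: the build ships only if every recorded scenario
# is handled without a moving handback to the driver.
def run_scenario(planner, scenario):
    """Replay one recorded scenario; return True if no moving disengagement."""
    for frame in scenario["frames"]:
        decision = planner(frame)
        if decision == "handback" and frame["speed"] > 0.0:
            return False
    return True

def release_gate(planner, scenarios):
    return all(run_scenario(planner, s) for s in scenarios)

# Toy data: one easy scenario, one that forces a handback at speed.
toy_planner = lambda f: "handback" if f.get("construction") else "drive"
scenarios = [
    {"frames": [{"speed": 12.0}, {"speed": 13.0}]},
    {"frames": [{"speed": 8.0, "construction": True}]},
]
print(release_gate(toy_planner, scenarios))  # False -> don't release
```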

      • by jiriw ( 444695 ) on Thursday October 22, 2020 @10:50PM (#60638444) Homepage

        Why are we beta testing on live roads with real humans again? How is this all legal exactly?

Why do 16-year-olds (17/18 in other countries) have to practice driving on live roads with real humans to get a driver's license again? How is this all legal exactly?

1) Because there's no substitute for experience, both for humans and for these kinds of AI. 2) It's legal because we make it so. And we make it legal because reasons. Reasons being various, ranging from convenience to eliminating the largest cause of deaths in car crashes today: human drivers.

    • by dgatwood ( 11270 )

      Time will tell how reliable it is. But these videos are awesome, and I can't wait to get the update (even though it'll be Europe-crippled).

      Likewise. There are some highways that I drive where the current code (even with HW3) drives by braille, and used to try to hump the center barrier on CA-17. I have a similar problem with a road a block from my house, where I have a disengagement every time I drive through with AP turned on. I'm really looking forward to seeing if this improves that misbehavior.

      • There are some highways that I drive where the current code (even with HW3) drives by braille, and used to try to hump the center barrier on CA-17.

        Where? Does it do it in that dip on the NB side where humans are most likely to rub up against the jersey barriers, or somewhere else? I used to love driving the 17 back before it got super clogged and they dropped the limit to 55 and festooned it with CHP. 17 is not at all a hard road to drive if you know how to drive, except for all the dildos who don't who are in your way. People who think it is scary should try CA-175 between Hopland and Lakeport sometime.

  • ... not going to be happy with increased numbers of folks sleeping at the wheel.
  • by timholman ( 71886 ) on Thursday October 22, 2020 @06:48PM (#60637850)

It's unpossible! There's no way this story could be true. I have been assured by numerous Slashdot experts that it will be decades before self-driving vehicles come to market. I mean, if the brilliant minds posting on Slashdot can't figure out how to create an autonomous vehicle, then surely it can't be done, right?

    But in all seriousness, kudos to Musk and Tesla's engineers on this. It's a great step forward, and will ultimately transform the world for the better.

    • by Luthair ( 847766 ) on Thursday October 22, 2020 @06:52PM (#60637872)
Except this is still Level 2 and you need to be ready to take over at any moment. More balanced coverage - https://jalopnik.com/tesla-beg... [jalopnik.com]
      • Re: (Score:3, Informative)

        by Rei ( 128717 )

        Jalopnik, a site for "balanced coverage" of Tesla?

Here, let's take a look at Jalopnik's most recent headlines [jalopnik.com] on Tesla. They are, in order:

        ----
        Tesla Begins Deploying Full Self-Driving Beta To Select Customers But It Is In No Way 'Full Self-Driving' (your article)
        Look At This Idiot In The Passenger Seat While His Tesla 'Drives' On Autopilot
        Tesla Owner Who Butt-Dialed A $4,000 Upgrade Hasn't Gotten A Refund So Far And Tesla's Handling This All Wrong
        Tesla Is Bending On Price Yet Again
        Elon's Vegas Loop Runs

        • by AmiMoJo ( 196126 )

          Are they wrong though?

          When Tesla started selling "Full Self Driving" back in 2016 they made some very specific promises about its performance.

          - Drive you to your destination with no interaction
          - Drop you off and go park itself, then pick you up again later
          - Can drive from coast to coast by itself, self-cleaning and recharging as necessary

          More recently

          - Will earn you money as a robot taxi while you aren't using it by the end of 2020

          Does this "full self driving" beta meet any of those goals? Well it says you

I don't find that balanced at all. Half or more of the article is a rant about Tesla's marketing wank of calling it "Full Self Driving", pretending there are large numbers of Tesla drivers who've somehow ignored all of the numerous warnings telling people they need to pay attention at all times and be ready to intervene, but that the one thing that would finally convince them to actually do that would be to call the feature something like "Limited Autonomous Driving".

        The rest is speculation. Whet i
      • There I just saved you from having to read jalopnik or any further posts by Luthair.
      • Jalopnik article balanced?

        "What this is saying is that the system, for all it can do, is just a Level 2 driver-assist system, just like every other semi-autonomous system on the market."

        How is it "just like every other semi-autonomous system on the market. ??" What other car can you buy with a system that can automatically make turns in city driving and stop for lights and stop signs .. all while avoiding pedestrian and other things. The fact that they could say Tesla has done nothing beyond cruise control

    • by Jeremi ( 14640 ) on Thursday October 22, 2020 @09:38PM (#60638260) Homepage

      I have been assured by numerous Slashdot experts that it will be decades before self-driving vehicles come to market.

      I think the Slashdot experts were operating under the assumption that self-driving cars would not be allowed to be sold or driven on public roads until after they were proven safe (for some reasonable definition of "safe").

      They didn't count on Elon coming along and saying "f*ck it, we're doing it live", and having enough reputational capital that nobody tries to stop him. Now we'll find out over the next year or so if the new FSD software is safe or not.

      • by ceoyoyo ( 59147 )

        Elon, purposely or not, chose the only really viable path towards something like this. Google et al. treated self driving as a monolith. It's got to be really, really good, or it doesn't go at all. Tesla deployed some limited features, upgraded them, upgraded them again, and again.

        People don't like big change. But if you make small enough incremental ones you can slip it in while they're preoccupied showing their friends the latest trick.

      • by AmiMoJo ( 196126 )

        Tesla say it's not safe, you must be monitoring it constantly and ready to take over in a fraction of a second.

        Worse still they might alter the behaviour one day so even if it took that corner safely yesterday there's no guarantee today.

        As people get complacent again the number of accidents will increase.

      • I think the Slashdot experts were operating under the assumption that self-driving cars would not be allowed to be sold or driven on public roads until after they were proven safe (for some reasonable definition of "safe").

        It doesn't matter what slashdot experts think. What matters is how lawyers and the courts will handle the first cases of an accident caused by this system.

        Logically, objectively, it makes sense to replace human drivers with autonomous cars if they can lower the fatality rate. But em

  • by galabar ( 518411 ) on Thursday October 22, 2020 @08:06PM (#60638046)
I'm not a Luddite. However, I understand the work companies like Google and Cruise are doing, and it is difficult.
  • by ZipK ( 1051658 ) on Thursday October 22, 2020 @08:19PM (#60638078)
    Who's legally responsible when an auto-driving car gets in an accident? The "driver" or Tesla?
It will still be the driver. Tesla will have put a lot of work into the legal terms of agreement for using this technology. This isn't the usual click "Agree" when signing up for a new online service; the participants will have had to actively register for the beta and confirm beyond doubt that they have read and accepted the terms. The Autopilot will have to do something pretty drastically wrong, like actively ignoring driver input, in order to be blamed for anything.

    • by Jeremi ( 14640 )

      Who's legally responsible when an auto-driving car gets in an accident? The "driver" or Tesla?

      That's the sort of question that only really gets answered after the accident (and the lawsuit) occurs. But I'll hazard a guess: everyone involved who has money, will get sued.

    • Comment removed based on user account deletion
      • You are correct, this is not what you consider full self driving.

        But you are incorrect when you say it's not what anyone is asking for. There are a lot of people who are asking for it. Yes, they want even more, but they want this, now. And it's one step closer to what you want, which is full self driving from beginning to end where the car may not even have a steering wheel or pedals anymore. You can't get there without going here first.

    • Who's legally responsible when an auto-driving car gets in an accident? The "driver" or Tesla?

      Don't worry. Tesla has a huge team of lawyers to make sure it's you, ZipK. Even if it happens 1000 miles away.

    • Who's legally responsible when an auto-driving car gets in an accident? The "driver" or Tesla?

Although it's not really the same case exactly, when Musk was asked that question about the hobo-taxi mode (at the self-driving car presentation a few years ago), he said that Tesla would be responsible for accidents, not the car owner.

      • by AmiMoJo ( 196126 )

        I wonder how it will work in practice. If the car has an accident while operating as a taxi will Tesla immediately fix it for you, regardless of who they think is at fault? Or will you have to wait while they sort out liability?

And who pays for cleaning up the vomit and skidmarks on the back seat? If Tesla is taking a cut of the earnings, it seems like they should be partly responsible.

      • by ledow ( 319597 )

        And what he says makes no difference until the law changes to allow that.

        Because if I break the law but, say, Tesla says they'll take the rap, the courts won't necessarily consider that valid at all. They'll still fine me, put points on my licence, throw me in jail, etc. and Tesla can't "buy me out" of that responsibility. Any more than your employer could buy you out of jail for committing, say, insider trading.

        Until the LAW says that, Musk's assertions are mere speculation, even with the best of intenti

    • People asked these same questions about cruise control, anti-lock brakes, automatic hi-beams, etc., etc.

      In general people tend to overestimate how difficult it will be for the law to adapt to new technology. This isn't rocket science, guys.
    • by ledow ( 319597 )

      Driver.

Who may sue Tesla, but the driver is 100% responsible.

You'll know self-driving is a real thing when the car companies take on all the burden, legally. Whether voluntarily (which you can't really do; you can't excuse the driver's actions as a third party) or by legislation.

At that point, you "own" a fancy self-driving taxi, and you'll see profits plunge as the car company suddenly becomes responsible for its cars' actions, and Tesla has to insure every one of its cars on the road against every minor bump.

What an amazingly American question to ask. No question about the technology or its effect on society, just straight to liability and suing.

  • by Nocturrne ( 912399 ) on Thursday October 22, 2020 @09:47PM (#60638282)

    Safety really can't improve much until all vehicles are aware of each other and have some level of autonomous braking and accident avoidance - it's coming though. These are interesting times.

    • by Joce640k ( 829181 ) on Thursday October 22, 2020 @10:35PM (#60638400) Homepage

      Don't you think it's weird that humans can drive cars without any of that?

      • Next you're going to tell us that humans don't have LIDAR.
      • by ceoyoyo ( 59147 )

        Most of the humans I know find it pretty difficult to drive without being aware of where the other cars are.

        I know a few that don't have autonomous braking and accident avoidance though. Honestly, you get tired of screaming pretty quick.

        • Most of the humans I know find it pretty difficult to drive without being aware of where the other cars are.

Yet they manage to gather that information using only two forward-facing cameras and three small HUDs for rear view.

          • by ceoyoyo ( 59147 )

            Yeah, it's weird hey? It's as if you can simulate several cameras by mounting one in a turret.

I've seen an elderly person hit a parked car and continue driving as if nothing had happened. When they stopped further up the road I got out and asked them why they didn't stop, and also why they didn't react to me trying to signal them down with my high beams. They said they didn't have time to use their rear view mirror as they were focusing on the road. I reported it to the police but I bet they still have their license.

          A huge percentage of the population has terrible vision/balance/attention and takin

        • Most of the humans I know find it pretty difficult to drive without being aware of where the other cars are.

You know shit humans who need their driver's licenses revoked. You should absolutely be able to drive a car safely without knowing where cars are, with the exception of the one immediately in front of you and those to your front sides with their indicators on. If you have to perform a maneuver that requires information about areas other than in front of you, people generally turn their head to gain that additional information, and do so incredibly safely without ever having to communicate with another driver beyond

          • by ceoyoyo ( 59147 )

            Lol. So you agree, it would be difficult to drive without knowing where the other cars are?

      • aware of each other and have some level of autonomous braking and accident avoidance

        I certainly am aware of other people on (or near) the road. It took training, and a brain that evolution had prepped for that, but I'm aware. It's called "situational awareness" and most humans have it naturally. The autonomous braking and accident avoidance is also built-in. Which is why the passengers in my car are so often hitting phantom brake pedals, I guess.

      • by GuB-42 ( 2483988 )

        Humans don't drive cars without any of that.

        We have turn signals, brake lights, horns and flashing headlights and reverse signals that serve no other purpose than communication. There is also a whole lot of "nonverbal" communication going on too. For example, you can position yourself in a certain way to indicate your intent to pass or to yield. You can also use exaggerated movement to indicate that you took a deliberate action instead of just adjusting yourself. Plenty of things. This, by the way, is a rea

You can't build the systems to rely on that, but of course once V2X is common it will add quite an impressive safety factor on top. Think of it this way: V2X cannot tell you that there are no hazards behind a blind turn, but sometimes it can tell you there is one, and that is useful.
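To make that asymmetry concrete, here is a tiny hypothetical sketch (the message fields and thresholds are made up, not a real V2X standard schema): a broadcast hazard near the planned route can be surfaced even when it is not yet visible, but an empty message list proves nothing.

```python
# Hypothetical V2X consumer: a broadcast hazard can extend awareness around a
# blind corner, but absence of messages never proves the corner is clear.
from dataclasses import dataclass

@dataclass
class HazardMessage:          # assumed fields, not a real V2X message format
    lat: float
    lon: float
    kind: str                 # e.g. "stalled_vehicle", "cyclist"

def hazards_near(messages, route_points, radius_deg=0.0005):
    """Return broadcast hazards lying close to any point of the planned route."""
    near = []
    for m in messages:
        for (lat, lon) in route_points:
            if abs(m.lat - lat) < radius_deg and abs(m.lon - lon) < radius_deg:
                near.append(m)
                break
    return near

route = [(37.7740, -122.4190), (37.7745, -122.4195)]   # made-up waypoints
msgs = [HazardMessage(37.77452, -122.41948, "stalled_vehicle")]
print(hazards_near(msgs, route))   # warns even though the hazard is unseen
```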
    • by AmiMoJo ( 196126 )

      Safety has improved a lot with better sensors in cars.

      For example modern vehicles have blind spot sensors, rear cross traffic sensors (for backing out of parking spaces), front radar for collision mitigation and following at a safe distance etc.

      From next model year Volvo will be fitting lidar, mostly for safety, e.g. detecting pedestrians.

      There is still plenty of scope for improvement.

    • by radja ( 58949 )

That time will never come. Self-driving bicycles? Not gonna happen.

Let's not be coy about it: it exists.

  • I wonder how many people that kills every year.

  • As a techie, I am awed by what they are able to do with "autonomous driving".

    However as an experienced driver and someone with 40+ years experience of being a human, I don't see autonomous cars in the near future as being able to drive us everywhere. Special roads with limited random stuff that can happen? Probably. Your local neighbourhood?

Just think about the situations you have to process as a driver. A poor driver will react to whatever is right in front of them, and as a passenger you can often recognize them by how late they respond to situations. A good driver will be thinking: "kids on both sides of the road, someone might be about to run out, a car is coming, one of us will have to stop and the other go slow, who will it be...". Or, "that garbage truck has its reversing lights on, it is probably about to back out onto the main road" - and we respond _early_ and adapt. We can make decisions like "that tree has broken and is hanging over the road, a classic 'widow maker' as bushcrafters would call it, no way am I driving under that". Or, you are driving fast in the left lane of the motorway and there is a car running very close behind a truck, and something about its driving pattern makes you think "that one might suddenly decide to shift lanes, and it might be some 90-year-old driver who won't even signal or look in the mirror - better slow down a bit to give me time to react". Or... there are just _so_ many examples you can immediately come up with.

The thing is, driving doesn't have to work just most of the time and in most situations; it has to work 100% of the time, in _all_ situations. There is no way any Tesla artificial-intelligence shiny thingie - probably not even one running on the earth's largest supercomputer, and definitely not some chip running on a few measly watts inside the car - is going to handle that. No way.

    Don't get me wrong. I am _really_ excited about the tech. And I see a role for autonomous cars, in designated areas which have been adapted to work with current tech level. But something that just takes you to work wherever and whenever without you as a driver, everywhere? No way.

    • by Compuser ( 14899 )

      So for the record, I am not a Tesla fan, do not own a Tesla car and have never owned Tesla stock. But I do think self driving cars are here to stay.

      >Just think about the situations you have to process as a driver.

That's what they said about all the possible positions in chess, and then, when that barrier fell, about all the positions in Go. Companies now sell autocomplete for IDEs based on neural networks anticipating your next line of code. Anticipating possible human behavior on the road is in that same class of di

  • by thegarbz ( 1787294 ) on Friday October 23, 2020 @02:59AM (#60638884)

    It's a technology people think is perpetually 50 years away when it is in fact very close.

    Unlike fusion which everyone thinks is very close but in fact perpetually 50 years away.

  • Complete with a new LIDAR-like visualization system

    The only way in which it's similar to LIDAR is that it uses light. Literally everything else is different. Most notably, it can't sense distance, it has to infer it; it uses mostly ambient light, night time aside; and it uses wholly different frequencies.
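For readers unfamiliar with how distance is inferred rather than sensed: parallax-based depth boils down to triangulation. A minimal sketch (illustrative numbers only; this is not Tesla's pipeline):

```python
# Depth from parallax: triangulate distance from the pixel shift (disparity)
# of the same point seen from two viewpoints (two cameras, or one camera at
# two positions as the car moves).
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Z = f * B / d -- classic stereo triangulation."""
    if disparity_px <= 0:
        raise ValueError("point must shift between viewpoints")
    return focal_px * baseline_m / disparity_px

# Toy numbers: 1000 px focal length, 0.3 m baseline, 15 px of parallax.
print(depth_from_disparity(1000, 0.3, 15))  # 20.0 metres (inferred, not measured)
```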
