Transportation AI

Tesla's Elon Musk Promises Full Self-Driving Autopilot Beta In 'A Month Or So' (cnet.com) 94

"I think we'll hopefully release a private beta of Autopilot — the full self-driving version of autopilot — in, I think a month or so?" CEO Elon Musk said this week at Tesla's annual shareholder meeting/Battery Day event. "And then people will really understand the magnitude of the change," said Musk adding, "It's profound. You'll see what it's like, it's amazing."

CNET reports that attendees then showed their approval "by honking the horns of their safety bubbles." "It's kind of hard for people to judge the progress of Autopilot," Musk told a crowd of shareholders present at the event, each socially distanced in their own Tesla Model 3, drive-in style. "I'm driving a bleeding edge, alpha build of Autopilot, so I sort of have insight into what is going on."

Musk went on to explain how Tesla's engineers recently had to overhaul major parts of Autopilot, including a rethinking of how the system sees the world. "We had to do a fundamental rewrite of the entire Autopilot software stack... We're now labeling 3D video, which is hugely different from when we were previously labeling single 2D images," Musk explained, referring to the way the Autopilot software identifies the objects its eight cameras see and decides how to react to them. "We're now labeling entire video segments, taking all cameras simultaneously and labeling that. The sophistication of the neural net of the car and the overall logic of the car is improved dramatically."
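To make the labeling change concrete, here is a minimal Python sketch of the difference. Every type and field name here is invented for illustration; Tesla's actual data format is not public. Per-image 2D labeling attaches an independent label list to each still frame from each camera, while 3D video labeling attaches one time-indexed label set, in a shared vehicle-centric 3D frame, to an entire multi-camera segment.

    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    @dataclass
    class LabeledObject:
        track_id: int    # same physical object across time and across cameras
        category: str    # e.g. "vehicle", "pedestrian", "lane_line"
        position_m: Tuple[float, float, float]  # 3D position in the car's frame

    # 2D style: one independent label list per still image, per camera.
    # 3D video style: one label list per timestep for the whole segment,
    # shared by every camera that saw the scene at that instant.
    @dataclass
    class LabeledSegment:
        camera_clips: Dict[str, List[bytes]]  # camera name -> time-synced frames
        labels: List[List[LabeledObject]]     # one shared label set per timestep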

  • by fahrbot-bot ( 874524 ) on Saturday September 26, 2020 @04:48PM (#60546648)

    Tesla's Elon Musk Promises Full Self-Driving Autopilot Beta In 'A Month Or So'

    Heard that before. Guessing the timeline will be like Trump's ACA replacement. Just sayin' ...

    • by Joce640k ( 829181 ) on Saturday September 26, 2020 @04:50PM (#60546656) Homepage

      Keep shorting their stock, you're sure to win eventually...

      • by nonameid ( 7285982 ) on Saturday September 26, 2020 @05:14PM (#60546694)
        As a Tesla fan scheduled to pick up my second Tesla on Tuesday: stop. This is the kind of stuff that makes us reasonable fans look like cult members. They are great cars. My Model 3 is the best car I've ever owned by a wide margin. There's nothing wrong with legit concerns though. Musk saying we'd be able to summon the car from across the country and it would drive itself to you completely autonomously by the end of 2017 should not be swept under the rug. Many people including myself spent thousands on that and will likely never get anywhere near what was promised for the life of their car. I know I'm not going to, since I'm trading in for the Y.
        • I'm in the same boat. It would be nice if we could at least transfer the AP pre-purchase to a new vehicle. If this doesn't happen, I'll consider suing Tesla for the cost of the Autopilot once my car hits 5 years.
          • by saloomy ( 2817221 ) on Saturday September 26, 2020 @07:01PM (#60546850)
            Some of the features were delivered, though. Navigate on Autopilot was an FSD feature. I have an X, and I love the car. It's getting better and better. It now figures out which lane at a stop light you are behind, and indicates when it turns green. It's easy to see how Tesla can take the next step soon and enable turns on city streets. Not all turns will probably be supported, but the progress should be steady. I do think they could change the licensing model, but that's neither here nor there.
            • by Cyberax ( 705495 ) on Saturday September 26, 2020 @07:31PM (#60546870)
              I have a 2015 Tesla Model S and a 2017 Tesla Model 3. Right now there's no functional difference between the autopilots in the two. Heck, the 2015 is better in some regards - it's much more assertive during lane changes. My 2017 model has more useless AP bling (like bad automatic lane changes), but nothing of any real use.

              I'm fine with what I got for 2015 (although I hoped for a much better summoning experience). But for my 2017 model I was promised a full self-drive capability and right now it's not even close.
            • by dgatwood ( 11270 )

              Navigate on Autopilot was an FSD feature.

              Depends on when you bought your car. Prior to 2019, it was actually an Enhanced Autopilot feature, not an FSD feature. It became an FSD feature in 2019, when they split up the EAP features, moving some of them up to FSD and the rest down into the base AP configuration.

              The only thing pre-2019 purchasers have gotten so far from their FSD purchase is the HW3 upgrade, if that.

            • by AmiMoJo ( 196126 )

              The original Full Self Driving they were selling back in 2016 included the following features:

              - Summon the car from the other side of the country; it will drive thousands of miles by itself, charging and cleaning itself as needed, to your location.

              - Drive you to work while you read the paper.

              - Go off and find a parking space by itself, then come back to collect you later.

              More recently Musk added:

              - Robotaxi service, your car will earn you money while you are not using it and become an appreciating asset, worth m

          • by ShanghaiBill ( 739463 ) on Saturday September 26, 2020 @07:24PM (#60546866)

            My spouse bought her Model-S back in 2015, so we are already past the 5-year point.

            She paid extra for Autopilot, but never got FSD. However, she still got her money's worth. Even normal Autopilot is a big win for highway driving.

            • by Cyberax ( 705495 )
              I too have a 2015 Model S. It's still awesome and the AP helps a lot on the highway. But my 2017 Model 3 was sold with a promise of full self-driving AP (and it says so on the purchase contract). It's now past 60,000 miles, and by 2022 it's going to be past 100k. At that point I would consider it to be past most of the car's usable lifetime, so I will try to get my FSD either refunded with interest or ported to a new car.
        • Well, you know what they (should) say about AI with human lives on the line: the last 1% of the development takes 99% of the time.

          • Alas, that's not how it works. Either the model will work or they will have to rewrite and retrain it again. So it's more like: to 80%, then back to 0%, then to 85%, then to 0%, rinse, repeat.

            • True, but AI development is a lot more than training - it's also all the learning done by the design team in developing the training regime and underlying neural architecture.

              • Re: (Score:3, Insightful)

                by Maxo-Texas ( 864189 )

                It's also a problem gaining acceptance that it's not perfect - just much better than most human drivers.

                • If it can make a mistake that I wouldn't make, then I can't be expected to be held liable for insurance on it. It would be fine if its 'mistakes' were the last bit that neither a human nor a computer could prevent, but it may make a mistake that no human would make. That's what concerns me. They haven't made them human enough.
          • There is no last 1% because there is no state of completion. Cars with no AI never got "done" being safer, and cars with it never will either. There isn't going to be any grand unveiling that changes everything all at once. Musk's constant promises far exceed the truth, yet the rollout of AI into cars and the level of self-driving continues to advance, with Tesla at the forefront.
            • >There is no last 1% because there is no state of completion
              Neither is there for virtually any software - but there's still a finish line of "good enough to ship".

              Unfortunately for Tesla, a full self-driving autopilot won't be able to pull the *very* common software dodge of "not warranted as suitable for any purpose, including those for which it was marketed"

            • by Ed Avis ( 5917 )
              The state of completion is when you can let the car drive itself and have a nap, as safely as driving it yourself.
              • That's not as much of a threshold as it sounds like because it's so situationally dependent. Even once AI cars are safer than human drivers for 99.9% of all trips, they will still require intervention for a few special circumstances. At least that's my bet.
              • by BranMan ( 29917 )

                I don't know about you, but I do not drive safely while I'm napping.

      • Eventually win? Personally, I would not short or long Tesla - too much risk in either direction. However, the shorts have made a fortune on Tesla. You do realise they don't keep their short positions permanently open?
  • by Anachronous Coward ( 6177134 ) on Saturday September 26, 2020 @04:51PM (#60546660)
    Doesn't exactly sound like a "promise."
    • by geekmux ( 1040042 ) on Saturday September 26, 2020 @05:22PM (#60546708)

      Doesn't exactly sound like a "promise."

      (Consumers reacting to a CEO statement from a PRIVATE company): "Wow, that sounds pretty cool."

      (Consumers reacting to a CEO statement from a PUBLIC company): "Yeah whatever bitch. Where's my money?"

      Sometimes I wonder just how much greed taints people's reactions.

      Self-driving cars have been in our dreams since flying cars. Don't suppose you could put aside the pointless vitriol and just admire the progress, or would you prefer to keep worrying about your loved ones being killed by a fucking social media junkie who just had to answer that text message?

      Be patient. Intelligent humans are trying to solve for Mass Ignorance. That ain't fuckin' easy. Humans are really good at it.

    • Re: (Score:3, Interesting)

      by AmiMoJo ( 196126 )

      He didn't define "full self driving". Originally it was level 5, go anywhere without a driver, charge itself etc. At other times he has suggested that at first it will just be level 3.

      This seems like an admission that the robotaxi service won't launch this year as promised.

      • by MrL0G1C ( 867445 )

        Those in-between levels should be banned: either the driver should drive the car or the computer should drive the car; otherwise the human will fall asleep, text, watch videos, etc. It's already happened, and people have already died from not understanding the in-between levels.

        And self-driving cars should be tested to show themselves proficient at every single aspect of the (UK) Highway Code. Or else they don't pass the test and they don't get to drive until fixed.

        • by aberglas ( 991072 ) on Saturday September 26, 2020 @06:46PM (#60546818)

          Nope.

          They just have to be safer than existing human drivers. Perfection is the enemy of the good.

          Also, some situations differ from others. I'd say driving down freeways in good weather they could be 100%.

          But yes, the in between can be very dangerous.

          • by eepok ( 545733 ) on Saturday September 26, 2020 @10:59PM (#60547072) Homepage

            They need to be damn close to perfect if they're to be trusted and insured.

            The programming company should become the driver. If the programming company accepts liability, people will trust it. Currently that liability is distributed among millions upon millions of drivers. Even if US deaths dropped by 90%, that's still 4,000 deaths per year (from roughly 40,000 today) centralized on a couple of companies.

            If the programming company defers liability to the non-driving human, people will simply not trust the vehicle.

            These are the non-programming issues that still aren't sorted, and they can destroy companies and sub-industries if they're not sorted out before level 5 AVs hit the road.

            • Re: (Score:2, Insightful)

              The programming company should become the driver. If the programming company accepts liability, people will trust it.

              That shell company in the Caymans? Not sure many will trust it.

          • Re: (Score:3, Insightful)

            by MrL0G1C ( 867445 )

            If a car can't follow the highway code then it is questionable as to whether it can drive safely and in a manner predictable to other drivers, riders, cyclists and pedestrians.

            The standard of safety shouldn't simply be human drivers, self-driving vehicles should be as safe as patient drivers who are not tired and not under the influence of drink or drugs.

          • by Gimric ( 110667 )

            Nope.

            They just have to be safer than existing human drivers. Perfection is the enemy of the good.

            Also, some situations differ from others. I'd say driving down freeways in good weather they could be 100%.

            But yes, the in between can be very dangerous.

            No, that's not how legal liability works. The courts look at each accident in isolation to work out who was in the wrong. The fact that a driver is, statistically speaking, safer than average doesn't matter when it comes to legal liability.

            Societies will have to decide how to handle liability for autonomous software. Imagine your loved one is killed by an autonomous vehicle. Do you just shrug your shoulders and say "Shit happens"?

          • by AmiMoJo ( 196126 )

            But that's not "full self driving", that's just level 2 again. Level 2 requires a human driver ready to take over, if say the weather changes or there is some other event that the car can't handle. Tesla specifically sold level 5, they made that extremely clear so must now deliver it or have broken their contract.

            I imagine investors will be rather upset when the promised robotaxi revenues fail to materialize and someone else beats them to it, e.g. Waymo.

            There are already cars with better level 2 than Tesla,

            • by dgatwood ( 11270 )

              I imagine investors will be rather upset when the promised robotaxi revenues fail to materialize and someone else beats them to it, e.g. Waymo.

              Waymo was already running minimally supervised robotaxis six months before Musk announced that plan, and fully driverless robotaxis almost a year ago. Don't you think that ship has already sailed? I mean okay, maybe not the revenues, but that's more a matter of scale rather than readiness.

        Those in-between levels should be banned

          Having used the current Autopilot for a while I disagree, but even ignoring that, I don't think you can argue that level 4 isn't perfectly safe. Level 4 is complete autonomy -- driver doesn't have to be paying attention, or be awake, or even at the wheel -- but not in all conditions/environments. A level 4 vehicle has to be able to recognize when it's entering a situation it can't handle and if the driver can't be called to come take over, it has to be able to bring itself to a safe stop out of traffic.


          • And honestly, even a freeway-only level 4 AI would be a *huge* boon to a lot of people by turning much of a frustrating commute into unstructured "me time".

            I just really hope they come up with a new name for full self driving mode - "autopilot" is already confusing enough people without calling the version that *can* drive on its own the same thing.

          • I'm not holding my breath, but I also wouldn't be shocked if Tesla is only a few months away from a freeway-driving-only level 4.

            I remember a way old TV show - think it was a pilot, maybe for a show that didn’t get picked up - where the main character was a LA private detective in the somewhat near future (think this was the 80s, so “near future” was probably right about now). I remember a scene where the dude was cruising down the freeway, dictating a letter or maybe in a call, leaning back with his feet up near the left-side view mirror because the car was doing everything. Then at one point the car announced

            • Freeway-only autonomy would solve a LOT of problems.

              Heh. It will be very convenient, and I'll be happy to have it, but I expect it to create a lot of problems. People will commute farther, and more. People will probably fly less, too, moving more medium-distance transit to the highways. I think I would drive instead of fly when I go to California for work: I'd get in my car at about 9 PM, lay the seat back and wake up in the bay area in time to get a shower and have breakfast, then start my day (we need fully-automated charging stations to completely realiz

              • Potentially, but I don't think we'll see huge changes in carrying capacity until the human drivers are mostly or entirely removed.

                No argument there. I was thinking about the situation once pretty much all vehicles have the technology. And, even then, there would have to be an actual requirement for its use if people want to see the biggest benefits - which I imagine will be hard-to-impossible to legislate.

    • by HiThere ( 15173 )

      It *can't* be a promise. Even if it were technically available, laws would need to be changed.

    • I think I'll wait until all vehicles can communicate with each other to share the position, speed, and planned route of every vehicle in my vicinity.
  • Comment removed based on user account deletion
    • Police can tell people to stop and pull over right now. Try not doing it, and see what happens.
      • Police can tell people to stop and pull over right now. Try not doing it, and see what happens.

        Lots of bad things. [google.com]

      • Police can tell people to stop and pull over right now. Try not doing it, and see what happens.

        Facts. I highly doubt Tesla is teaching their autopilot how to avoid a PIT maneuver.

        And that $300 police cruiser bumper is going to destroy a $30,000 Tesla bumper. You'd have to be stupid and rich.

  • Unsettling... (Score:5, Insightful)

    by ZombieCatInABox ( 5665338 ) on Saturday September 26, 2020 @05:07PM (#60546684)

    There's something unsettling about the words "self-driving" and "beta" together in the same sentence...

    • Re:Unsettling... (Score:4, Interesting)

      by 93 Escort Wagon ( 326346 ) on Saturday September 26, 2020 @05:25PM (#60546712)

      Well, at least from the news reports... It seems like when a Tesla autopilot fails, it’s the Tesla driver who pays the price. So you could argue any danger is “opt in”.

      • It is, though. I own two Teslas and although I love LOVE the cars, including Autopilot, I feel like it's always been beta. You always have to keep an eye on it, because although it drives great most of the time, it gets confused by many-lane highways and exits with odd markings.

      • More like, every time a Tesla autopilot fails, the company says it's the driver's fault since they didn't babysit the car at all times. Typical Silicon Valley mentality. If the car doesn't drive itself properly, you're driving it wrong.

        But then, I'm not part of Tesla's audience. I don't want a car to turn on the headlights or windshield wipers automatically, let alone drive itself.

    • He did say "bleeding edge" :-)

      I have a fully equipped Model 3 and Autopilot really is improving rapidly.
      • I have a fully equipped Model 3 and Autopilot really is improving rapidly.

        Same here. It's good. But it really does need a re-write to improve its awareness. I only trust it not to make poor decisions on relatively clear highways.

    • There's something unsettling about the words "self-driving" and "beta" together in the same sentence...

      Have you ever met a 17-year-old? They are able to drive legally in all states, as far as I am aware....

      A 17-year-old is VERY much a beta human, and we let all of them drive today - by the millions. They simply have somewhat higher insurance rates.

      Imagine five years after full self driving is released. What will be more expensive to insure... the FSD car, or a 17 year old? Or a *60* year old, for that ma

    • by antdude ( 79039 )

      Even after betas, I don't feel comfortable with this feature. It's not as stable as KITT or KARR. :P

    • It amazes me that our government actually allows these things on the road with other people.

      There are so many points of failure with the hardware, software, image recognition as well as our completely unpredictable road markings and characteristics.

    • Everything goes to beta eventually. Even if they call it something else.
    • Just make the self-driving beta wear the "student driver" warning

  • by Bourdain ( 683477 ) on Saturday September 26, 2020 @06:53PM (#60546826)

    Wow

    That sounds scary to me - don't get me wrong, I love the idea of a self-driving car, and I know Teslas are, in many respects, excellent cars, but at least Autopilot has generally worked pretty well

    The fact that much of the underlying code was rewritten for this version scares me that new unforeseen issues will arise when it's released into the real world

    I don't like having QA done by the end user - that's like putting Windows 10 on the freeway

    • They have a lot of data already - millions of kilometers' worth of driving. They probably already tested the new software against that, so in a way, people have already tested it even before it is released.
      • I hope the data they have is fully testable though

        I mean, I suppose they would theoretically need the actual video from the cameras on those cars for millions of miles, as opposed to what the cameras interpreted in those miles, since the new software interprets things differently

        I doubt they are retaining all of the video from those cars?

        That said, this is all really just BS as I don't know the details of their internal processes, they probably have some reasonable process to test this against at least some r

    • Each Tesla has two AI chips [cnet.com].

      It was my understanding that Tesla was able to load new algorithms into one of them and have it "play along" while a driver was operating the car. They could even do this while Autopilot is not engaged (since the new alg plays along without adjusting any physical inputs of the car).

      So it's entirely possible for them to test the software in existing cars without endangering anybody. They know the new algorithm beats the old one when it causes fewer and avoids more crashes as it's playing along (a rough sketch of that comparison loop is below).
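      A minimal sketch of that "play along" loop, assuming made-up policy and command types (this is not Tesla's actual code or API):

        from dataclasses import dataclass
        from typing import Callable, List, Tuple

        @dataclass
        class Action:
            steering: float  # normalized steering command, -1..1
            braking: float   # normalized brake command, 0..1

        def shadow_step(frame, active: Callable[..., Action],
                        shadow: Callable[..., Action],
                        log: List[Tuple[object, Action, Action]]) -> Action:
            actual = active(frame)    # this policy actually drives the car
            proposed = shadow(frame)  # the candidate only "plays along"
            # Log large disagreements (thresholds invented here); those
            # clips become the interesting evaluation data.
            if (abs(proposed.steering - actual.steering) > 0.1
                    or abs(proposed.braking - actual.braking) > 0.2):
                log.append((frame, actual, proposed))
            return actual  # only the active command reaches the actuators

        # Toy usage: the shadow policy wants to brake harder than the active one.
        disagreements = []
        shadow_step("frame-0",
                    active=lambda f: Action(steering=0.0, braking=0.0),
                    shadow=lambda f: Action(steering=0.0, braking=0.5),
                    log=disagreements)
        print(len(disagreements))  # 1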

  • My guess is this beta will look a lot different from what most people are expecting. You won't be able to hop in your car at home and pull up in a parking spot at your destination. There will probably be a number of minor things that still aren't accounted for as well. "Full self driving in a month or so" could simply mean that AP can now attempt to make turns at lights and stop & go through stop signs.

    This is similar to the AP investor day stuff he talked about. He said that "AP will be feature complet

  • Right (Score:5, Interesting)

    by ruddk ( 5153113 ) on Saturday September 26, 2020 @07:24PM (#60546864)

    It really has to be a massive improvement. Today Autopilot (just using cruise control) gets scared and brakes when it sees a car parked by the side of the road (outside the lines), and in a lot of places it feels annoying, so I just switch it off. It works for keeping distance to the car in front of me on the freeway, but its behaviour doesn't really inspire me to pay $8,000 for the upgrade to full Autopilot; I get a smoother ride just controlling the speed myself.

    It does seem to work well when a faster car drives past and pulls into the right lane too close in front of me, where it does not brake to increase the distance. Well, I don't drive much on freeways, so it isn't a huge issue for me.

    • That kind of thing seems like it would need car-to-car communication to really fix. As things are, the system can't tell if that car in the shoulder has the motor on, ready to dart out in front of you, or if it's disabled and in park. I slow down at least a bit in that scenario anyway. As of a couple years ago, it's actually state law in Texas to slow down by 20 MPH when passing a vehicle on the shoulder. (But only if it's a police vehicle. Blue lives matter, I suppose.)

      You'd need to start removing "normal"

  • Even more idiot Tesla owners will die.

  • I think what they are doing is great, but I feel like the side-facing cameras on the B-pillars are too far back to detect front cross-traffic early. When pulling into an intersection, especially in many urban areas, the car would have to jut out too far into the intersection to get a proper view of cross traffic. They need an extra pair of cameras placed further forward... maybe just behind the headlights or in that groove in the front bumper under where the front license plate is. Rear cross tra

    • People are doing fine with just 2 low-resolution vision devices.

      • Actually, eyes are much higher resolution, with 100x more dynamic range, and have better placement than the 8 cameras on a Tesla when it comes to seeing forward cross-traffic. Especially when you are at the intersection of a one-way road with traffic coming from the right. The B-pillar on the right is much further back than your eyes are when you're sitting in the driver's seat.

  • How can anyone be allowed to say this in the first place?
  • approval then anyone can run an autopilot.
  • Have they solved the issue that the Autopilot just doesn't seem to see white trucks?
  • In other words, nothing will happen within an unknown timeframe.

  • "I'm driving a bleeding edge, alpha build of Autopilot, so I sort of have insight into what is going on."

    Does that mean Musk is actually driving around in a Tesla with an autopilot like that? Or just testing it on a closed course? The difference in required intestinal fortitude is massive...

  • by AlanObject ( 3603453 ) on Sunday September 27, 2020 @10:40AM (#60547972)

    So this article has about 80 comments so far, and although there are many opinions on how "safe" Tesla FSD might be, none of them mention what Tesla's data actually shows.

    Their shadow mode tells them pretty accurately just how many accidents their FSD would have been involved in had it been in control. From this they can also likely tell how serious those accidents would have been and how many of them would have caused injury or even a fatality.

    Real-world data, real-world conditions. All the "aha, I got you" arguments about wipers, LIDAR, 3D object modeling, AI, Musk on weed, yadda yadda are pretty much useless against hard data. They know. Musk knows.

    I can understand why they don't publish that data. But sooner or later they will release this software for end-user use because their data shows them that the risk is acceptable. No doubt some would argue that the FSD risk is better than a human driver risk. Personally I don't argue that because it requires too many assumptions.

    What I do believe is that also sooner or later if they release FSD that data will be forced out into the open. There will be some accident, a lawsuit, and the plaintiffs will argue for discovery that the data must be turned over. That will be appealed and the defendants will lose. Then Tesla is going to have a "some of you are going to die, but that's a sacrifice I'm willing to make" type of moment.

    This isn't a legal jeopardy situation unique to Tesla or this situation. There are other FSD competitors out there, one with a fatality already on the books. All vehicles have potentially fatal engineering compromises like this. Just not as visible or dramatic.

    • The problem with real-world data is that there is no way to know whether its coverage is adequate. It's entirely possible that somewhere there is a road where every single Tesla that drives it goes straight over a cliff. Even if that means only 5 out of 50 billion trips go over, it's going to make big news (a rough bound on what clean data can prove is sketched below).
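      For a sense of scale on that worry, the standard statistical "rule of three" says that observing zero failures in N independent trials only bounds the true failure rate below roughly 3/N at 95% confidence. A tiny sketch with illustrative trip counts:

        def rule_of_three_upper_bound(clean_trips: int) -> float:
            """95% upper confidence bound on per-trip failure probability
            after observing zero failures in clean_trips trips."""
            return 3.0 / clean_trips

        # Ten million incident-free trips only bound the failure rate to
        # about 3e-7 per trip -- not nearly enough data to rule out a
        # failure mode as rare as 5 in 50 billion (1e-10 per trip).
        print(rule_of_three_upper_bound(10_000_000))  # 3e-07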

  • Comment removed based on user account deletion
  • by Rick Schumann ( 4662797 ) on Sunday September 27, 2020 @01:41PM (#60548446) Journal
    That's what's in the future of any so-called 'driverless car'.
  • As someone who has now owned 3 different (used) Teslas... a Model S, a Model X and now a Model 3 Performance? I've had the chance to use AP1, AP2, and the latest/greatest FSD software and hardware.

    I feel like, really, AP1 (Intel's Mobileye, which Tesla just licensed back then) was an amazingly good, capable product given what it had to work with. Tesla spent a lot of time and money just trying to re-invent a parallel system when they decided they wanted to end the Intel licensing deal and do it themselves. An

"Confound these ancestors.... They've stolen our best ideas!" - Ben Jonson

Working...