Transportation Software

Tesla Will Open Controversial FSD Beta Software To Owners With a Good Driving Record (techcrunch.com)

Tesla CEO Elon Musk said the company will use personal driving data to determine whether owners who have paid for its controversial "Full Self-Driving" software can access the latest beta version that promises more automated driving functions. TechCrunch reports: Musk tweeted late Thursday night that the FSD Beta v10.0.1 software update, which has already been pushed out to a group of select owners, will become more widely available starting September 24. Owners who have paid for FSD, which currently costs $10,000, will be offered access to the beta software through a "beta request button." Drivers who request the beta will be asked for permission to let Tesla assess their driving behavior using its insurance calculator, Musk wrote in a tweet. "If driving behavior is good for seven days, beta access will be granted," Musk wrote.

The latest FSD Beta is supposed to automate driving on highways and city streets. However, this is still a Level 2 driver assistance system that requires the driver to pay attention, keep their hands on the wheel and be ready to take control at all times. Recently posted videos of owners' experiences with this beta software paint a mixed picture of its capability. In some videos, the vehicles handle city driving; in many others, drivers are seen taking control due to missed turns, hugging the curb, failing to creep forward and, in one case, veering off suddenly toward pedestrians.

This discussion has been archived. No new comments can be posted.

  • by Anonymouse Cowtard ( 6211666 ) on Friday September 17, 2021 @08:10PM (#61806103) Homepage

    "veering off suddenly toward pedestrians"

    I see no potential problem with this. Are they unarmed protesters? It's fine.

  • So basically: move fast and break things?

  • Automated driving needs good drivers rather than training?

    Good manual driving does not = good "ready to take over" driving.

    • Needs a learner's permit.

    • by Xenx ( 2211586 )

      Good manual driving does not = good "ready to take over" driving.

      I would assume good drivers pay better attention to their driving than bad drivers, and would thus be more likely to take over when needed.

      • Drive a car: you gotta know how.

        Drive a self-driving car: you better know how to drive well.

        The narrative is wearing thinner and thinner...

        • by Xenx ( 2211586 )
          This is a beta feature for an incomplete product, and they've been under more scrutiny. They're trying to either APPEAR to improve their testing, or improve safety in their testing... possibly both.
      • There are good drivers who are conscientious, there are good drivers who are skilled, and there are the best drivers, who are both. Frankly, only that last group should get to participate in this beta. That is, they should have to demonstrate that they not only have good habits, but also good reaction times. They're expected to be ready to take over if there's a mistake, while operating at highway speeds.

        • by Xenx ( 2211586 )
          The point is that selecting from only "good drivers" is going to be better than selecting from both groups. You're not picking them for being good; you're excluding the rest for being poorer drivers.
          • My point is that's not good enough, because you're only testing for drivers who are good at driving, not drivers who are good at being babysitters for beta software.

            • by Xenx ( 2211586 )
              Your argument is literally (not figuratively) worthless here. My point is that people who are less responsible drivers aren't going to be more responsible if the car is driving for them. It doesn't matter, in this context, how well each individual good driver is going to handle monitoring. They've already proven to be more responsible on average. They are going to be a better test group than the bad drivers, as long as safety is the concern.
              • What I am curious about is how they actually determine "good." I have frequently found myself at odds with my Tesla over a particular decision, swerving onto the shoulder to avoid debris or a pothole only to be chastised by the computer for departing the lane. My guess is that the best drivers have a slightly higher rate of incidents where the computer thinks they are bad than average drivers do. Presumably there is balance at scale, but I still wonder.
                • by Xenx ( 2211586 )
                  I've found some information on what the Tesla Insurance Calculator checks, which is what they're basing it on, but nothing concrete. It also sounds like they weigh drivers against fleet averages. A rough sketch of how such a score might be computed follows the list.

                  The number of times ABS is activated.
                  Hours driven
                  Number of times autopilot is disabled due to ignored alerts
                  Number of forward collision warnings
                  Amount of time spent at an unsafe following distance
                  Rapid acceleration and hard braking
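
                  Purely to illustrate the idea, here is a minimal Python sketch of how a score like that could be weighed against fleet averages. All factor names, weights, and fleet numbers below are invented assumptions, not Tesla's actual formula.

                    # Hypothetical sketch only -- the factor names, weights and
                    # fleet averages are made up for illustration; this is not
                    # Tesla's real formula.
                    FLEET_AVG = {
                        "forward_collision_warnings_per_1k_mi": 2.0,
                        "hard_braking_events_per_1k_mi": 3.0,
                        "abs_activations_per_1k_mi": 1.0,
                        "forced_autopilot_disables_per_1k_mi": 0.2,
                        "pct_time_unsafe_following": 0.10,
                    }
                    WEIGHT = {
                        "forward_collision_warnings_per_1k_mi": 0.30,
                        "hard_braking_events_per_1k_mi": 0.25,
                        "abs_activations_per_1k_mi": 0.15,
                        "forced_autopilot_disables_per_1k_mi": 0.20,
                        "pct_time_unsafe_following": 0.10,
                    }

                    def safety_score(driver):
                        """0-100; 100 = at or better than fleet average on every factor."""
                        penalty = 0.0
                        for factor, weight in WEIGHT.items():
                            # A ratio above 1.0 means this driver is worse than the fleet.
                            ratio = driver.get(factor, FLEET_AVG[factor]) / FLEET_AVG[factor]
                            penalty += weight * max(0.0, ratio - 1.0)
                        return max(0.0, 100.0 * (1.0 - min(penalty, 1.0)))

                    # A driver who brakes hard twice as often as the fleet average:
                    print(safety_score({"hard_braking_events_per_1k_mi": 6.0}))  # 75.0
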
                  • So you buy a car that can do 0-60 in 1.99 seconds and can drive itself, but then it turns out you have to pick one?!

  • Tesla drivers with hellacious driving records would benefit more. But better drivers would provide better training.
  • So, you have to have your hands on the wheel, be paying attention, and be ready to assume control if the software does something dumb?
    It seems to me that's a lot more work than just driving the car in the first place.

    Sitting there constantly on the alert for my car suddenly turning into a suicide/murder bot seems draining to me.

    I would NOT bother.

  • Wouldn't the BAD drivers need this more? If I'm not driving, why does my driving ability matter? Of course you still have to pay extra attention while the car drives you around, thus making the whole autodrive thing fairly worthless if you are already a good driver...
    • by egr ( 932620 )
      Might be a strategy to filter out negative reviewers or to avoid bad stats on accident reports
    • by tlhIngan ( 30335 )

      Wouldn't the BAD drivers need this more? If I'm not driving, why does my driving ability matter? Of course you still have to pay extra attention while the car drives you around, thus making the whole autodrive thing fairly worthless if you are already a good driver...

      Probably because Tesla's self-driving isn't very good? It's bad enough that it's being looked at by regulators because it has a very nasty habit of running into parked emergency vehicles.

      What makes it even stranger is that Teslas are known for quite good ADAS

      • That's the nature of trained AI. Right now there is so little of it on the road that the government hasn't regulated it, but eventually it will be regulated. Then regulators will want to be able to tell manufacturers to make changes, and what changes to make, and it will be like... OK, three years later we can do that. Wait, we have to stop until we can do it?! How do we train on the new rules if we have to follow the new rules before we're on the road?!

        It's really short-sighted. If they'd been developing an expert

      • Autopilot mostly relied on radar to detect traffic ahead, and to standard radar a stationary car looks no different from the stationary road; it's not an imaging sensor. A toy sketch of that filtering problem is below.
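
        To make that concrete, here is a minimal Python sketch, assuming a classic doppler clutter filter (the function and numbers are hypothetical, not Tesla's code). Returns whose closing speed matches the car's own speed are stationary relative to the ground, and discarding them as clutter throws away a stopped vehicle along with the road and overpasses.

          def filter_stationary(returns, own_speed, threshold=1.0):
              """Keep only radar returns that are moving relative to the ground.

              returns: list of (range_m, closing_speed_mps) tuples.
              A stationary object closes at exactly our own speed, so its
              ground speed works out to ~0 and it is discarded as clutter.
              """
              kept = []
              for range_m, closing_speed in returns:
                  ground_speed = closing_speed - own_speed
                  if abs(ground_speed) > threshold:
                      kept.append((range_m, closing_speed))
              return kept

          # Ego car at 30 m/s. The 80 m return closes at exactly 30 m/s --
          # it could be an overpass, a sign... or a parked emergency vehicle.
          echoes = [(80.0, 30.0), (50.0, 25.0)]
          print(filter_stationary(echoes, own_speed=30.0))  # [(50.0, 25.0)]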

    • It's beta software, so the chance of malfunction is elevated... a good driver would be needed to correct the car's action if it attempts to do something dangerous.

    • Wouldn't the BAD drivers need this more?

      Sure, if it worked right.

    • by quantaman ( 517394 ) on Saturday September 18, 2021 @11:22AM (#61808091)

      Wouldn't the BAD drivers need this more? If I'm not driving, why does my driving ability matter? Of course you still have to pay extra attention while the car drives you around, thus making the whole autodrive thing fairly worthless if you are already a good driver...

      Because it's not intended for testing or data gathering, it's meant for PR: both to appease Tesla owners who paid for it and to satisfy Musk's need to be the one to deliver self-driving.

      Musk has been releasing "self-driving" since 2014, and five years from now he'll probably be releasing "Complete Self Driving" that still isn't ready for unsupervised operation.

      For that reason, they're confining the beta to "good drivers" who are hopefully responsible enough to constantly supervise it and not cause an accident.

  • Insurance "good" = short trips bad, and other iffy stuff with the trackers:
    https://clubthrifty.com/allsta... [clubthrifty.com]
    https://www.usnews.com/insuran... [usnews.com]

    https://clearsurance.com/blog/... [clearsurance.com]
    https://www.directline.com/car... [directline.com]

  • I can't wait till all beta software gets this level of "controversy". Next we'll be seeing articles about how the new Android beta could potentially accidentally launch a nuke under the right circumstances. Get those ad views, guys, get those ad views.
    • No, this is different; it's almost certainly life and death. Self-driving cars will cost some lives and save some lives -- certainly a net reduction in deaths in the long term, and hopefully in the short term... but anyway, self-driving is the most widespread deployment of safety-critical yet bleeding-edge technology we may see in our lifetimes.
    • An unusual post; subtle but really dumb...
  • I see all the jokes and the complaints about this. But really, you're testing a product that's still very much a beta, here. Elon's essentially saying, "This thing is going to screw up occasionally while you have it on self-driving mode. I need people who want to play with this technology but who are going to babysit it real carefully so we don't wind up with more crashes in the news, calling the entire project into question."

    So yeah, he's testing if you're capable of driving responsibly and carefully for

    • The problem with looking at this as a good beta test is the possible collateral damage (property and life) to those who happen to be on the road with the Tesla drivers participating in this test.
    • "Self-driving" was called into question a while ago.
    • I see all the jokes and the complaints about this. But really, you're testing a product that's still very much a beta, here. Elon's essentially saying, "This thing is going to screw up occasionally while you have it on self-driving mode. I need people who want to play with this technology but who are going to babysit it real carefully so we don't wind up with more crashes in the news, calling the entire project into question."

      Yeah, that's absolute BS.

      A beta is meant for when you've tested it in-house, fixed the major issues you know about, and now want a subset of the user base to help you shake out additional bugs.

      It is NOT meant for when it's full of serious known bugs. That would be "Early Access", just like all of those half-finished games that get released on Steam.

      The difference is that when the Steam game crashes, no one dies. And some of those Early Access Steam games might actually get finished in the next few years.

      The frustration is really with the exaggerated promises Tesla put out over the years, leading up to this.

      Lets be c

      • Modern Computer Vision alone simply isn't good enough to form the basis of an FSD system, no matter how much data you collect or how many cameras you have. They're going to need a breakthrough on the order of the original AlexNet applying CNNs and Deep Learning, and those kinds of breakthroughs are really hard to predict.

        For quite some time I had the same impression about FSD based on vision only. I was rather perplexed when Tesla announced they were ditching radar in their design and switching to vision only. Seriously WTF. To me it felt a lot like Boeing "deciding" to implement the MCAS to rely on a single fallible AoA sensor.

        Then I started reading more and looking at the explanatory videos on YouTube about why they did this. At the lay reading level it actually does make sense.

        In the process of this type of auto

        • Modern Computer Vision alone simply isn't good enough to form the basis of an FSD system, no matter how much data you collect or how many cameras you have. They're going to need a breakthrough on the order of the original AlexNet applying CNNs and Deep Learning, and those kinds of breakthroughs are really hard to predict.

          For quite some time I had the same impression about FSD based on vision only. I was rather perplexed when Tesla announced they were ditching radar in their design and switching to vision only. Seriously WTF. To me it felt a lot like Boeing "deciding" to implement the MCAS to rely on a single fallible AoA sensor.

          Then I started reading more and looking at the explanatory videos on YouTube about why they did this. At the lay reading level it actually does make sense.

          I think that's the problem: Musk is that lay person. It makes sense to him, so he's pushing it on the devs, but CV just isn't advanced enough to reliably make use of that data.

          Just look at any video on YouTube showing the HUD; in fact I grabbed one from a Tesla fanboy showing off FSD [youtube.com]. Forget the weird erratic driving and the bragging about "ZERO Interventions" even though the Tesla blew a stop sign; instead, watch the HUD.

          The Tesla can't see more than one car ahead in its lane, it has trouble with groups of pedestrians, cars occasionally warp in and out of existence, etc, etc. Heck, that "fog" the Tesla shows all over most of the HUD seems to be the Tesla indicating it has no idea what's going on there.

          • instead, watch the HUD.

            The Tesla can't see more than one car ahead in its lane, it has trouble with groups of pedestrians, cars occasionally warp in and out of existence, etc, etc. Heck, that "fog" the Tesla shows all over most of the HUD seems to be the Tesla indicating it has no idea what's going on there.

            I wouldn't assume that the panel display (which is not a HUD) is a fully up-to-date rendering of what is in the system's 4D. That display is a non-critical function and I would fully expect updating it to be assigned lower priority than other functions. The pixels you see there are solely painted for your entertainment and have nothing to do with the driving function.

            On that video -- the missed stop-sign is as clear a case as I have ever seen that the driver found a corner case where the decision algorithm didn't do what I would have done. But it did sense the stop sign -- it just mapped it into the same intersection as the prior stop sign which was about 2 car-lengths ago. I have seen human drivers roll through the same stop sign situation any number of times. The start was totally ungraceful for the situation but it did nothing dangerous.

            • instead, watch the HUD.

              The Tesla can't see more than one car ahead in its lane, it has trouble with groups of pedestrians, cars occasionally warp in and out of existence, etc, etc. Heck, that "fog" the Tesla shows all over most of the HUD seems to be the Tesla indicating it has no idea what's going on there.

              I wouldn't assume that the panel display (which is not a HUD) is a fully up-to-date rendering of what is in the system's 4D. That display is a non-critical function and I would fully expect updating it to be assigned lower priority than other functions. The pixels you see there are solely painted for your entertainment and have nothing to do with the driving function.

              So you think the panel is hiding and misplacing vehicles and pedestrians it knows about just for the heck of it?

              There's probably low-probability stuff in the "fog" that it's making decisions on, and other things that are kept out of the display to avoid clutter, but if the Tesla saw those vehicles it would show those vehicles.

              I think the evidence is quite incontrovertible: the Tesla cannot see a lot of what's on the road.

              On that video -- the missed stop-sign is as clear a case as I have ever seen that the driver found a corner case where the decision algorithm didn't do what I would have done. But it did sense the stop sign -- it just mapped it into the same intersection as the prior stop sign which was about 2 car-lengths ago. I have seen human drivers roll through the same stop sign situation any number of times. The start was totally ungraceful for the situation but it did nothing dangerous.

              It was driving on the wrong side of the road. Besides, this isn't some cherrypicked example.

  • Hey, you want me to drive your Tesla for 7 days?

  • There are so many things wrong with this.

    FSD is Level 5; aren't they closer to Level 2?

    Who names a beta 10.0.1? That's bug-fix territory, or in the case of Windows, the first working release.

    v10: look, they're twice Level 5!
    • by kmoser ( 1469707 )
      Software that is both "full self driving" and "beta" is an oxymoron. Being beta practically guarantees bugs that void the "full self driving" claim.
    • Outright misleading. Marketing 'Assisted' as 'Full' should be against the law. Charging money for it is even worse. In marketing terms it's up there with 'fatally injured', 'slightly pregnant', 'whiter than white', 'healthy cigarettes', 'Democratic Peoples' regimes and 'complete write-off, but the radio still works'. As George Costanza said: 'Jerry, just remember. It's not a lie... if you believe it.'
  • The "good" drivers are the responsible or the skilful ones.

    In other words, the group that's most likely to make sure FSD remains OFF.

    What's the point?

  • When the car is FSD, what does the owner's ability to drive matter here?
  • What are they opening? They're shipping closed source code. Not even giving that, really; just the result of it. That's far from opening anything.
  • Face it, China will have nothing on big corporate America when it comes to rights violations as we move into the future.

    The US population seems so much happier to accept corporate overlords than state ones. Does that actually make sense? No, no it does not.
  • Tesla is clearly terrified that their "self driving" can go disastrously wrong, so they are trying to mitigate the risk by choosing the drivers most likely to intervene before it plows into a group of nuns or something.
  • "The latest FSD Beta is supposed to automate driving on highways and city streets. However, this is still a Level 2 driver assistance system that requires the driver to pay attention, have their hands on the wheel and take control at all times."

    Wait, what does the "F" in "FSD" stand for again?

    I have this feeling that we will see 100% IPv6 adoption before any vendor meets the "Full" definition of "FSD." Not sure why the marketing is so utterly wrong in these early stages of automation.

    • One could make the argument that because it is in public testing they don't know what level it's capable of. Frankly, though, that would be very silly: if they don't know what it's capable of, they have no business handing it to the public.

      A more reasonable argument might be that they are being required to take these steps as part of the certification process.

      I don't think that vision is good enough.

  • Can we really pay attention for more than 20 minutes this way?

    I was asked to do something similar in a job. It was very, very difficult without anything for your hands to do. This is why security guards have to do the rounds. You need psychological strategies that Tesla isn't providing.

  • So, not a single Tesla Model S Plaid owner will qualify for the "Safe Driving Behavior" qualification, for obvious reasons. Poor Plaid drivers, smh.
