Tesla Will Open Controversial FSD Beta Software To Owners With a Good Driving Record (techcrunch.com) 61
Tesla CEO Elon Musk said the company will use personal driving data to determine whether owners who have paid for its controversial "Full Self-Driving" software can access the latest beta version that promises more automated driving functions. TechCrunch reports: Musk tweeted late Thursday night that the FSD Beta v10.0.1 software update, which has already been pushed out to a group of select owners, will become more widely available starting September 24. Owners who have paid for FSD, which currently costs $10,000, will be offered access to the beta software through a "beta request button." Drivers who request the beta will be asked to grant Tesla permission to assess their driving behavior using its insurance calculator, Musk wrote in a tweet. "If driving behavior is good for seven days, beta access will be granted," Musk wrote.
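As a purely illustrative sketch of the gate Musk describes (the score threshold, daily granularity, and function name below are assumptions, not Tesla's actual implementation), the seven-day check amounts to something like:

```python
# Hypothetical sketch of the 7-day "good driving" gate described in the summary.
# The threshold, daily-score inputs, and names are assumptions, not Tesla's code.
from typing import Sequence

SCORE_THRESHOLD = 80   # assumed minimum daily safety score (0-100)
REQUIRED_DAYS = 7      # "If driving behavior is good for seven days..."

def eligible_for_fsd_beta(daily_scores: Sequence[float]) -> bool:
    """Return True if the most recent seven daily scores all clear the assumed bar."""
    if len(daily_scores) < REQUIRED_DAYS:
        return False  # not enough driving history yet
    recent = daily_scores[-REQUIRED_DAYS:]
    return all(score >= SCORE_THRESHOLD for score in recent)

# Example: one sub-threshold day inside the window blocks access.
print(eligible_for_fsd_beta([92, 95, 88, 97, 90, 85, 79]))  # False
print(eligible_for_fsd_beta([92, 95, 88, 97, 90, 85, 91]))  # True
```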
The latest FSD Beta is supposed to automate driving on highways and city streets. However, this is still a Level 2 driver assistance system that requires the driver to pay attention, have their hands on the wheel and take control at all times. Recent videos posted showing owners' experiences with this beta software provide a mixed picture of its capability. In some videos, the vehicles handle city driving; in many others, drivers are seen taking control due to missed turns, being too close to the curb, failure to creep forward and, in one case, veering off suddenly toward pedestrians.
works as advertised (Score:3, Funny)
"veering off suddenly toward pedestrians"
I see no potential problem with this. Are they unarmed protesters? It's fine.
Re: (Score:2)
*visions of the movie Maximum Overdrive, only sub with Tesla, Volt, etc*
pretty standard (Score:2)
So basically: move fast and break things?
Re: pretty standard (Score:2)
Basically desperately keeping the charade going that FSD is ever going to ship.
Re: (Score:2)
automated driving needs good driving vs training? (Score:2)
automated driving needs good driving vs training?
Good manual driving does not = good take-over-when-needed driving
Re: (Score:2)
Need a learner's permit.
Re: (Score:2)
Good manual driving does not = good take-over-when-needed driving
I would assume good drivers pay better attention to their driving than bad drivers, and would thus be more likely to take over when needed.
Re: automated driving needs good driving vs traini (Score:1)
Drive a self-driving car: you better know how to drive well.
The narrative is wearing thinner and thinner...
Re: (Score:2)
Re: (Score:2)
There are good drivers who are conscientious, there are good drivers who are skilled, and there are the best drivers, who are both. Frankly, only that last group should get to participate in this beta. That is, they should have to demonstrate that they not only have good habits, but also good reaction times. They're being expected to take over if there's a mistake, while operating at highway speeds.
Re: (Score:2)
Re: (Score:2)
My point is that's not good enough, because you're only testing for drivers who are good at driving, not drivers who are good at being babysitters for beta software.
Re: (Score:2)
Re: automated driving needs good driving vs traini (Score:1)
Re: (Score:2)
The number of times ABS is activated.
Hours driven
Number of times autopilot is disabled due to ignored alerts
Number of forward collision warnings
Amount of time spent at an unsafe following distance
Rapid acceleration and hard braking
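Taking the metrics listed above purely as an illustration, here is a rough sketch of how such telemetry might be folded into a single driving score. The weights, normalization, and field names are invented for the example; this is not Tesla's actual Safety Score formula.

```python
# Illustrative only: invented weights and field names, not Tesla's formula.
from dataclasses import dataclass

@dataclass
class TripTelemetry:
    miles: float
    abs_activations: int
    forced_autopilot_disengagements: int   # Autopilot disabled due to ignored alerts
    forward_collision_warnings: int
    unsafe_following_seconds: float
    hard_brake_events: int
    rapid_accel_events: int

# Assumed penalty weights per event, applied per 1,000 miles driven.
WEIGHTS = {
    "abs_activations": 2.0,
    "forced_autopilot_disengagements": 5.0,
    "forward_collision_warnings": 3.0,
    "hard_brake_events": 1.5,
    "rapid_accel_events": 1.0,
}

def safety_score(t: TripTelemetry) -> float:
    """Start at 100 and subtract weighted, mileage-normalized penalties."""
    if t.miles <= 0:
        return 0.0
    per_1k = 1000.0 / t.miles
    penalty = sum(WEIGHTS[name] * getattr(t, name) * per_1k for name in WEIGHTS)
    # Unsafe following time penalized per hour spent too close.
    penalty += 4.0 * (t.unsafe_following_seconds / 3600.0) * per_1k
    return max(0.0, 100.0 - penalty)

print(round(safety_score(TripTelemetry(
    miles=850, abs_activations=1, forced_autopilot_disengagements=0,
    forward_collision_warnings=2, unsafe_following_seconds=600,
    hard_brake_events=3, rapid_accel_events=5)), 1))  # about 78.6 under these invented weights
```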
Re: (Score:1)
So you buy a car that can do 0-60 in 1.99 seconds and can drive itself, but then it turns out you have to pick one?!
Data mining? (Score:2)
Re: (Score:2)
better drivers would provide better training.
If you don't want your AI to have to think too hard.
I don't get it. (Score:2)
So, you have to have your hands on the wheel, be paying attention, and be ready to assume control if the software does something dumb?
It seems to me that's a lot more work than just driving the car in the first place.
Sitting there constantly on the alert for my car suddenly turning into a suicide/murder bot seems draining to me.
I would NOT bother.
Re:I don't get it. (Score:4, Insightful)
pay to be a beta tester! (Score:3)
pay to be a beta tester!
Wrong target group (Score:2)
Re: (Score:2)
Re: (Score:2)
Probably because Tesla's self-driving isn't very good? It's bad enough that it's being investigated because it has a very nasty habit of running into parked emergency vehicles.
What makes it even stranger is that Teslas are known for quite good ADA
Re: (Score:2)
That's the nature of trained AI. The thing is, right now there is so little of it on the road that the government hasn't regulated it, but eventually it will be regulated. Then regulators will want to be able to tell Tesla to make changes, and what changes to make, and it will be like... OK, 3 years later we can do that. Wait, we have to stop until we can do it?! How do we train on the new rules if we have to follow the new rules before we're back on the road?!
It's really short-sighted. If they'd been developing an expert
Re: Wrong target group (Score:2)
Autopilot mostly operated off radar for detecting oncoming traffic, and to standard radar a static car looks no different from static road. It's not imaging radar.
Re: (Score:2)
It's beta software, so the chance of malfunction is elevated; a good driver would be needed to correct the car's action if it attempts to do something dangerous.
Re: Wrong target group (Score:1)
It's beta software
What's your excuse??
Re: Wrong target group (Score:1)
Wouldn't the BAD drivers need this more?
Sure, if it worked right.
Re:Wrong target group (Score:4, Informative)
Wouldn't the BAD drivers need this more? If I'm not driving, why does my driving ability matter? Of course you still have to pay extra attention while the car drives you around, thus making the whole autodrive thing fairly worthless if you are already a good driver...
Because it's not intended for testing or data gathering, it's meant for PR. Both to appease Tesla owners who paid for it and to satisfy his need to be the one to do self driving.
Musk has been releasing "self-driving" since 2014, and 5 years from now he'll probably be releasing "Complete Self Driving" that still isn't ready for unsupervised operation.
For that reason, they're confining the beta to "good drivers" who are hopefully responsible enough to constantly supervise it and not cause an accident.
insurance good = short trip bad (Score:2)
insurance good = short trip bad and other iffy stuff with the trackers
https://clubthrifty.com/allsta... [clubthrifty.com]
https://www.usnews.com/insuran... [usnews.com]
https://clearsurance.com/blog/... [clearsurance.com]
https://www.directline.com/car... [directline.com]
Please click!!! (Score:2)
Re: (Score:2)
Re: Please click!!! (Score:1)
This actually makes sense to me ... (Score:2, Insightful)
I see all the jokes and the complaints about this. But really, you're testing a product that's still very much a beta, here. Elon's essentially saying, "This thing is going to screw up occasionally while you have it on self-driving mode. I need people who want to play with this technology but who are going to babysit it real carefully so we don't wind up with more crashes in the news, calling the entire project into question."
So yeah, he's testing if you're capable of driving responsibly and carefully for
Re: (Score:3)
Re: This actually makes sense to me ... (Score:1)
Re: (Score:2)
I see all the jokes and the complaints about this. But really, you're testing a product that's still very much a beta, here. Elon's essentially saying, "This thing is going to screw up occasionally while you have it on self-driving mode. I need people who want to play with this technology but who are going to babysit it real carefully so we don't wind up with more crashes in the news, calling the entire project into question."
Yeah, that's absolute BS.
A beta is meant for when you've tested it in-house, fixed the major issues you know about, and now want a subset of the user base to help you shake out additional bugs.
It is NOT meant for when it's full of serious known bugs. That would be "Early Access", just like all of those half-finished games that get released on Steam.
The difference is when the Steam game crashes no one dies. And some of those Early Access Steam games might actually get finished in the next few years.
The frustration is really with the exaggerated promises Tesla put out over the years, leading up to this.
Lets be c
Re: (Score:2)
Modern computer vision alone simply isn't good enough to form the basis of an FSD system, no matter how much data you collect or how many cameras you have. They're going to need a breakthrough on the order of the original AlexNet applying CNNs and deep learning, and those kinds of breakthroughs are really hard to predict.
For quite some time I had the same impression about FSD based on vision only. I was rather perplexed when Tesla announced they were ditching radar in their design and switching to vision only. Seriously WTF. To me it felt a lot like Boeing "deciding" to implement the MCAS to rely on a single fallible AoA sensor.
Then I started reading more and looking at the explanatory videos on YouTube about why they did this. At the lay reading level it actually does make sense.
In the process of this type of auto
Re: (Score:2)
Modern computer vision alone simply isn't good enough to form the basis of an FSD system, no matter how much data you collect or how many cameras you have. They're going to need a breakthrough on the order of the original AlexNet applying CNNs and deep learning, and those kinds of breakthroughs are really hard to predict.
For quite some time I had the same impression about FSD based on vision only. I was rather perplexed when Tesla announced they were ditching radar in their design and switching to vision only. Seriously WTF. To me it felt a lot like Boeing "deciding" to implement the MCAS to rely on a single fallible AoA sensor.
Then I started reading more and looking at the explanatory videos on YouTube about why they did this. At the lay reading level it actually does make sense.
I think that's the problem, Musk is that lay person, it makes sense to him so he's pushing it on the devs, but CV just isn't advanced enough to reliably make use of that data.
Just look at any video on youtube showing the HUD, in fact I grabbed one from a Tesla fanboy showing off FSD [youtube.com]. Forget the weird erratic driving and them bragging about "ZERO Interventions" even though the Tesla blew a stop sign, instead watch the HUD.
The Tesla can't see more than one car ahead in its lane, it has trouble with groups of
Re: (Score:2)
instead watch the HUD.
The Tesla can't see more than one car ahead in its lane, it has trouble with groups of pedestrians, cars occasionally warp in and out of existence, etc, etc. Heck, that "fog" the Tesla shows all over most of the HUD seems to be the Tesla indicating it has no idea what's going on there.
I wouldn't assume that the panel display (which is not a HUD) is a fully up-to-date rendering of what is in the system's 4D. That display is a non-critical function and I would fully expect updating it to be assigned lower priority than other functions. The pixels you see there are solely painted for your entertainment and have nothing to do with the driving function.
On that video -- the missed stop-sign is as clear a case as I have ever seen that the driver found a corner case where the decision algor
Re: (Score:2)
instead watch the HUD.
The Tesla can't see more than one car ahead in its lane, it has trouble with groups of pedestrians, cars occasionally warp in and out of existence, etc, etc. Heck, that "fog" the Tesla shows all over most of the HUD seems to be the Tesla indicating it has no idea what's going on there.
I wouldn't assume that the panel display (which is not a HUD) is a fully up-to-date rendering of what is in the system's 4D. That display is a non-critical function and I would fully expect updating it to be assigned lower priority than other functions. The pixels you see there are solely painted for your entertainment and have nothing to do with the driving function.
So you think the panel is hiding and misplacing vehicles and pedestrians it knows about just for the heck of it?
There's probably low-probability stuff in the "fog" it's making decisions on, and other things that are kept out of the display to avoid clutter, but if the Tesla saw those vehicles it would show them.
I think the evidence is quite incontrovertible, the Tesla cannot see a lot of what's on the road.
On that video -- the missed stop-sign is as clear a case as I have ever seen that the driver found a corner case where the decision algorithm didn't do what I would have done. But it did sense the stop sign -- it just mapped it into the same intersection as the prior stop sign which was about 2 car-lengths ago. I have seen human drivers roll through the same stop sign situation any number of times. The start was totally ungraceful for the situation but it did nothing dangerous.
It was driving on the wrong side of the road. Besides, this isn't some cherrypicked example.
Soon to see on craigslist (Score:2)
Hey, you want me to drive your Tesla for 7 days?
'FSD Beta v10.0.1 software update' (Score:2)
FSD implies Level 5; aren't they closer to Level 2?
Who names a beta 10.0.1? That's bug-fix territory, or in the case of Windows, the first working release.
v10: look, they're at twice Level 5!
Re: (Score:2)
Re: (Score:2)
Drivers with Good Driving Records... (Score:2)
Are the responsible or the skilful ones.
In other words, the group that's most likely to make sure FSD remains OFF.
What's the point?
Re: Drivers with Good Driving Records... (Score:1)
What's the point?
They're out of better ideas.
nonsense (Score:1)
Opening? (Score:2)
Re: Opening? (Score:1)
If this was China, people would be upset... (Score:1)
The US population seems so much happier to accept corporate overlords than state ones. Does that actually make sense? No, no it does not.
Damage limitation (Score:2)
"F"...SD? (Score:2)
"The latest FSD Beta is supposed to automate driving on highways and city streets. However, this is still a Level 2 driver assistance system that requires the driver to pay attention, have their hands on the wheel and take control at all times."
Wait, what does the "F" in "FSD" stand for again?
I have this feeling that we will see 100% IPv6 adoption before any vendor meets the Full definition of "FSD". Not sure why the marketing is so utterly wrong in these early stages of automation.
Re: (Score:2)
One could make the argument that because it is in public testing they don't know what level it's capable of, although frankly that would be very silly, because if they don't know what it's capable of, they have no business handing it to the public.
A more reasonable argument might be that they are being required to take these steps as part of the certification process.
I don't think that vision is good enough.
What coping strategies are Tesla giving here? (Score:1)
Can we really pay attention for more than 20 minutes this way?
I was asked to do something similar in a job. It was very, very difficult without giving your hands anything to do. This is why security guards have to do the rounds.
You need psychological strategies that Tesla isn't providing.
Tesla Plaid Owners All Unable To Get FSD (Score:1)
So, not a single Tesla Model-S Plaid owner will meet the "Safe Driving Behavior" qualification, for obvious reasons. Poor Plaid drivers, smh.