Tesla's 'Full Self-Driving' Beta Called 'Laughably Bad and Potentially Dangerous' (roadandtrack.com) 232
Car and Driver magazine has over a million readers. This month they called Tesla's "full self driving" beta "laughably bad and potentially dangerous."
schwit1 shares their report on a 13-minute video posted to YouTube of a Model 3 with FSD Beta 8.2 "fumbling its way around Oakland." Quite quickly, the video moves from "embarrassing mistakes" to "extremely risky, potentially harmful driving." In autonomous mode, the Tesla breaks a variety of traffic laws, starting with a last-minute attempt to cross a hard line and execute an illegal lane change. It then attempts to make a left turn next to another car, only to give up midway through the intersection and disengage. It goes on to take another turn far too wide, landing it in the oncoming lane and requiring driver intervention. Shortly thereafter, it crosses into the oncoming lane again on a straight stretch of road with bikers and oncoming traffic. It then drunkenly stumbles through an intersection and once again requires driver intervention to make it through. While making an unprotected left after a stop sign, it slows down before the turn and chills in the pathway of oncoming cars that have to brake to avoid hitting it...
The Tesla attempts to make a right turn at a red light where that's prohibited, once again nearly breaking the law and requiring the driver to actively prevent it from doing something. It randomly stops in the middle of the road, proceeds straight through a turn-only lane, stops behind a parked car, and eventually almost slams into a curb while making a turn. After holding up traffic to creep around a stopped car, it confidently drives directly into the oncoming lane before realizing its mistake and disengaging. Another traffic violation on the books — and yet another moment where the befuddled car just gives up and leaves it to the human driver to sort out the mess...
Then comes another near collision. This time, the Tesla arrives at an intersection where it has a stop sign and cross traffic doesn't. It proceeds with two cars incoming, the first car narrowly passing the car's front bumper and the trailing car braking to avoid T-boning the Model 3. It is absolutely unbelievable and indefensible that the driver, who is supposed to be monitoring the car to ensure safe operation, did not intervene there. It's even wilder that this software is available to the public. But that isn't the end of the video. To round it out, the Model 3 nearly slams into a Camry that has the right of way while trying to negotiate a kink in the road. Once it gets through that intersection, it drives straight for a fence and nearly plows directly into it.
Both of these incidents required driver intervention to avoid.
Their conclusion? "Tesla's software clearly does a decent job of identifying cars, stop signs, pedestrians, bikes, traffic lights, and other basic obstacles. Yet to think this constitutes anything close to 'full self-driving' is ludicrous."
millions of miles of data! (Score:5, Funny)
Re: (Score:2, Interesting)
Re: (Score:2)
but they have millions of miles of data!
This gives me grave concern for the skill level of the average Tesla driver.
Maybe they're still sampling data from Musk's Roadster?
Re: (Score:2)
but they have millions of miles of data!
This gives me grave concern for the skill level of the average Tesla driver.
Maybe they're still sampling data from Musk's Roadster?
Correction. This is the skill level of the average driver. And no, I'm not kidding, because when reading this I was thinking I see most of these actions on a weekly basis. In fact, yesterday I encountered someone driving 15 miles below the speed limit (on a one-lane road), someone making a right turn from the left lane, and someone running a red light. This doesn't include the people I see on a regular basis who don't know how to make turns across traffic. Instead of following a curve [wikihow.com] (part 2, step 4), they go d
I want to move there. (Score:2)
In my experience, these are relatively rare with human drivers, {...} I see that kind of error perhaps once in 10 hours of driving. That may sound much,
Please, could you DM me the address where you live? I would like to move into your neighborhood.
That's some impressively good-driving humans you happen to have around you!
(Said as someone who does most of his driving in CH and DE, which have a reputation for calmer/saner drivers. Don't get me started about when I need to drive in south-FR, IT, CZ, ...)
Re:millions of miles of data! (Score:4, Informative)
I actually watched the video before posting. It was slow, over-cautious and fumbling but at no point did it do anything downright dangerous.
Most of the complaints were, "That's closer than I would have got", but hey, it's a computer, it knows exactly where the corners of the car are and it can calculate distances more accurately than you.
It's a beta.
Re:millions of miles of data! (Score:5, Insightful)
You don't think it was dangerous to make a left turn across oncoming traffic, forcing another car to brake in order to avoid a collision? Or to cross a double yellow line, again with oncoming traffic, in order to pass bicyclists? Or to continue driving forward towards a metal gate when a right turn was mandatory?
Re: (Score:3)
Did you miss the part where it drove through the intersection almost getting t-boned by oncoming traffic?
"It's a beta" (Score:3)
Re: (Score:3)
I actually watched the video before posting. It was slow, over-cautious and fumbling but at no point did it do anything downright dangerous.
Most of the complaints were, "That's closer than I would have got", but hey, it's a computer, it knows exactly where the corners of the car are and it can calculate distances more accurately than you.
It's a beta.
I also watched the video and you have a faaaar more generous interpretation than I.
It broke multiple traffic laws: wrong lane, straddling lanes, literally driving on the wrong side of the road.
It had at least one incident where it went too slow during a left turn and could have been t-boned, and another where it had a stop sign and cut off a car in a free-flow lane.
Otherwise it was generally overcautious, forcing the operator to take over because the car was stopped in the middle of the road for no real re
Re:millions of miles of data! (Score:5, Insightful)
There's a point in the video where the car overtook some cyclists. The people in the car complained that it pulled out too far, "almost" into oncoming traffic. Leaving aside the debate about what "almost" means (we don't have exact distance measurements, only the opinion of a human), what should the car have done? Driven much closer to the cyclists?
Fact: The car safely overtook the cyclists and gave them more space than the humans would have done.
I fail to see how that's a bad thing. At that moment in time the car was a better driver than the humans inside it.
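A rough back-of-envelope sketch illustrates the geometry here (all widths are assumptions for illustration, not measurements from the video): on a narrow lane, giving a cyclist the legal minimum clearance that several US states require can force a car toward or across the center line.

```python
# Toy geometry for passing a cyclist. All numbers are assumptions
# chosen for illustration, not measurements from the video.
LANE_WIDTH_FT = 10.0      # assumed narrow urban lane
CAR_WIDTH_FT = 6.0        # typical sedan, mirrors included
BIKE_OFFSET_FT = 2.0      # cyclist riding 2 ft from the right lane edge
MIN_CLEARANCE_FT = 3.0    # legal minimum passing distance in many US states

# The car's left edge must reach at least this far from the right lane edge:
required_left_edge_ft = BIKE_OFFSET_FT + MIN_CLEARANCE_FT + CAR_WIDTH_FT
crosses_center_line = required_left_edge_ft > LANE_WIDTH_FT

print(required_left_edge_ft)   # 11.0
print(crosses_center_line)     # True: a legal pass leaves the lane
```

Under these assumed numbers, a legal 3-foot pass simply does not fit inside a 10-foot lane, so swinging wide is the expected behavior rather than an error.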
Re:millions of miles of data! (Score:4)
And this is the core of the matter: the autopilot clearly knows how to drive, at least in principle. But it drives differently from a human driver, it may surprise other traffic participants, and it may also misread intentions. Having both humans and robots on the road is going to be a source of conflict until most drivers are robots.
That being said, some of the other moves in the video are a bit questionable. Never unsafe, I agree, but it does seem to get confused if there is a longer sequence of events to plan ahead. Especially the left turns seem to need some work. Humans are pretty good at those, especially with experience. Computers will get there, but maybe they are more comfortable with simple situations for now.
Re:millions of miles of data! (Score:4, Informative)
I wonder who would have paid for damage to the car, in case of hitting the curb?
The driver, stop wondering. The buck stops with the guy in the seat until such a time as they are legally no longer required to be in the seat.
Re: millions of miles of data! (Score:2)
Not ready for use, but still pretty cool (Score:5, Insightful)
Look it's in no way ready for use, even in that very uncongested city driving. But it's still pretty cool.
I hope they get it right by the time I can no longer self drive.
Re:Not ready for use, but still pretty cool (Score:4, Interesting)
I don't think they have enough in the way of sensor input. Waymo's tests featuring minivans with rather unattractive spinning stuff on the bumpers and fenders and the stuff put up on the roof isn't exactly eyecatching, but it's there for a reason.
The kind of self-driving car you would need when you can't self-drive would probably look more like a van or RV. Without humans behind the wheel, it would make more sense to focus on the quality of the ride and the interior design, and I predict as close to livingroom-on-wheels as the aerodynamics will allow.
Re: (Score:3, Interesting)
I don't think they have enough in the way of sensor input.
Humans do it with a crappy 6 axis gyro/accelerometer and buggy stereo cams and actuate the controls all through a Rube Goldberg collection of linkages. We need better algorithms more than we need better sensors.
Re: Not ready for use, but still pretty cool (Score:2)
That only took a few billion years of evolution to get to the point where thousands of people still die on the roads every year due to human error.
Re:Not ready for use, but still pretty cool (Score:4, Insightful)
That's a short-sighted answer. Sure, it's possible to do without other sensors, but that doesn't mean other sensors wouldn't make it much easier. Bats can fly without sonar, but they do better with it. You can find humans in the dark with your eyes, but an infrared sensor does it better. Musk initially didn't do lidar because it was expensive, but he may be paying the price for it now.
Re: Not ready for use, but still pretty cool (Score:5, Insightful)
Re: (Score:2)
"But to build a forest, you must first learn how to build a single tree." - Me
Re: (Score:3)
I don’t necessarily disagree with the point you’re trying to make, but I don’t think your particular argument holds much water, given that it’s extremely common for kids as young as 10-12 to be driving trucks and tractors every day on farms. It isn’t so much that they need 16-18 years of training, so much as that they’re:
A) Literally too small to physically work the controls until they reach a certain age
B) As a generalization for children their age, still too irresponsib
Re: Not ready for use, but still pretty cool (Score:4, Interesting)
Then again, comparing a human eyeball to a camera is also preposterous.
A human's field of view has only a small portion that's actually in focus - about 5 degrees (foveal). The rest is to some degree blurry. And we can't see as we move our eyes from one spot to another. (saccadic suppression).
Now if this sounds contrary to your experience, that's because your brain is filling in the missing data with what it expects to see.
So we have a computer, which is dumber than a human, attached to a bunch of sensors which are far better than a human at seeing things.
Which is an interesting comparison.
Re: Not ready for use, but still pretty cool (Score:2)
Re:Not ready for use, but still pretty cool (Score:5, Interesting)
Humans have several advantages though.
Those cameras can move and point in different directions. They are self cleaning too, and can cope with extreme amounts of dynamic range (e.g. facing the sun), as well as making use of movable shades when necessary.
The biggest advantage is the human brain, which has been developed over millions of years to rapidly process limited sensory input into an internal 3D representation of the world. It's also very good at handling unexpected data and recognizing objects in a variety of lighting conditions, even when partially occluded.
Getting machines to that point is beyond our capability at the moment, especially in a mobile device. Supercomputers can't manage it, let alone what can be installed in a car.
That's why Waymo and most others use better sensors, particularly lidar, to gather data that greatly simplifies the processing needed.
Re: (Score:3)
Humans have several advantages though.
Those cameras can move and point in different directions. They are self cleaning too, and can cope with extreme amounts of dynamic range (e.g. facing the sun), as well as making use of movable shades when necessary.
The biggest advantage is the human brain, which has been developed over millions of years to rapidly process limited sensory input into an internal 3D representation of the world. It's also very good at handling unexpected data and recognizing objects in a variety of lighting conditions, even when partially occluded
Not to mention we've designed roads, vehicles, and traffic laws around the capabilities of the human brain.
For instance, we're really good at deciding which distant object are vehicles and which are buildings but bad at doing the calculus to figure out safe passing distances. Hence we allow roads to be surrounded by visually diverse buildings but put traffic lights and stop signs at intersections.
Re: (Score:3)
Re: (Score:2)
Re:Not ready for use, but still pretty cool (Score:4, Interesting)
Humans do it with a crappy 6 axis gyro/accelerometer and buggy stereo cams
Actually we do it with a pair of crappy 3 axis gyros and a pair of buggy stereo cams, but we do it with a big special brain that produces the other data we need. And it also has a lot of other sensor input, such as the butt dyno.
It doesn't matter what sensors humans use, though. That's totally irrelevant. What matters is what sensors computers need to do the job. Also, remember that the goal is to do a better job, not a worse one.
Re:Not ready for use, but still pretty cool (Score:4, Insightful)
I'm not sure how you come to that conclusion. Right there in the summary, it says:
Tesla's software clearly does a decent job of identifying cars, stop signs, pedestrians, bikes, traffic lights, and other basic obstacles.
The Tesla seemingly does a fine job at identifying everything around it. That implies the sensors are just fine. It's the general navigation and driving algorithms that don't seem to be up to par.
Re: (Score:3)
The Tesla seemingly does a fine job at identifying everything around it.
No it doesn't. The bit you quoted says it does a decent job of identifying a limited set of things commonly found on or near roads, and if you watch the videos even that is glitchy.
Tesla is a very, very long way from having a generic vision system that can understand the world like a human does. They are banking on not having to go that far to make their system work. "Everything" is nowhere near accurate.
Re: (Score:2)
Watch the actual video.
Huh?
Are you asking slashdot readers to read all the summary and view the article before posting?
Re: (Score:2)
Since we see posts about the wrong topic from time to time, I'd say expecting people to read summaries and articles before posting is extremely optimistic.
Re: (Score:2)
Look at the last minute or so of the video. If the car understood what the gate closing the street was, why did it drive towards the gate rather than making a right turn to follow the road?
Earlier, if it understood what a double yellow line down the middle of a road was, why did it cross that line? "I wanted to pass bicyclists" is not an excuse for breaking that law.
Re: (Score:3)
Which places? As far as I can tell, California law [ca.gov] prohibits crossing (unbroken) double yellow lines except to turn left "at an intersection or into or out of a driveway", when making an otherwise valid U-turn, or when temporary signs move the traffic lanes. None of those applied to this case of the Tesla crossing the center lines.
Re: (Score:2)
While humans are drivi
Re: (Score:2)
Likewise, when the speed limit changes, it does not do a gradual speed change, but rather rapidly changes speed not giving the drivers behind time to react.
Your comment made me think about one of the highways heading into Astoria, Oregon. The speed limit is 55 - but, every so often, there's a small town. At the edge of these towns the speed limit suddenly drops to 25.
That'd be fun.
Re: (Score:2)
Absolutely. If you watch the video that the article comments upon, you have to be impressed. The drivers talk like it's an issue to cross a painted island to avoid bicyclists, when the law in many states specifically states a vehicle must stay at least 3 feet from bicyclists. There's a number of times when you can see the system acting with major caution when it senses humans. One time in the video they didn't know why the vehicle stopped in the road, when there was a bicyclist approaching on a sidewalk or
Re: (Score:3)
I have auto windows. One quit working about four years ago. I am *not* paying $250 to have someone fix a window crank.
Right there with you. Replacement regulator assemblies aren't very expensive, and even when doing it the first time, replacing a regulator shouldn't take more than half an hour or so. The real problem is that nowadays, power windows use lightweight, flimsy Bowden cable units instead of the heavier, more durable mechanisms they used to. You can run a stainless steel cable over a nylon pul
Expectations vs Reality (Score:2, Interesting)
Honestly, who was expecting otherwise? That's why it's called "beta". Also, I am supremely surprised these videos are legally allowed to be shared. I would have concluded a confidentiality agreement or the like would have been included in the EULA for signing up for the beta. The fact is people are stupid and their expectations never match reality. By showing these videos, you only give ammunition to the folks who wish to do things like ban self-driving cars which in my mind would be like banning the internet in the 80s.
Re: (Score:3)
Anyone with enough money and a compatible tesla can "sign up " for the beta, so your EULA requirement barring free speech would pretty much be trashed by any judge when challenged. I'm more amazed, or rather flabbergasted, that any joe schmoe would need to be responsible for that kind of atrocious driving. They make comments about the suburbs vs city driving, and I think the city dynamics really threw it for a loop. But you can already see the future coming where federal roadway guidelines will reduce man
Re: (Score:2)
Tesla minimizes their liability by calling it beta and blaming any accidents on the driver's failure to prevent them.
Re: (Score:2)
To make a not-a-car comparison since this is already about one, would someone run a company on beta software?
Re: (Score:2)
You and I have a very different point of view of what is "fun".
Re: (Score:2)
EVERYONE was expecting otherwise. Betas are for testing, not general consumption.
I am astounded that you are defending the crapfest that is "self driving". If you said this is expected in pre-production cars, you would be spot on. Released into the wild, this is an absolute sh!tshow and SHOULD be roundly condemned.
Re: (Score:2)
Beta testing a game is different from beta testing something that can kill you. They allowed "untrained" people, who don't know what to expect, to put themselves in situations that may kill them. I can't see how this is (or could be) ever a good thing. Sure, have the Tesla testers use this in all sorts of places, but the general public? That is just negligent. I suppose that is what you would expect from a tech company: get the users to test the program, and damn the consequences.
I would expect the roll-out to t
Re:Hello, this is testing (Score:5, Informative)
Breaking it down into 2 points. "It can't kill you" is not correct. It didn't kill the person because they were paying attention (as they should have been) AND the oncoming car was also paying attention. This is very much NOT the same as "it can't kill you". That it DIDN'T doesn't mean it can't. If you seriously think people driving cars are always paying attention, I will point you to the crash stats, which make that statement a filthy lie. Or to put it another way: the fact that it hasn't killed someone yet doesn't mean it can't. The evidence is clear that if the driver is not paying attention and another driver is also looking away for a second, someone will die. That should never happen, because we all pay attention, all the time, and there are not billions of interactions every day, so we are fine
People are not trashing the amazing work that has been done. They are trashing that a product has been released that is in no way ready. It IS great work, it just is not ready for the general public.
Is it better than a person? Probably, in the majority of cases. This is NOT good enough for general adoption. The next solution to a problem has to be better than the previous solution, which auto-pilot has not managed to clear (yet)
Re: (Score:2)
it's never doing anything so outrageous people cannot react.
The issue is you never really know what it's going to do until it does it. If you disengage it before it does something that would cause harm, you don't know if it was actually going to, either. Right now FSD isn't good enough that you will get complacent about using it, but there will be a point where people are and they end up getting into accidents because of it.
Re: (Score:2)
From the video:
AI: "I've put you in the middle of the lane of opposing traffic. Now I disengage and leave it to you."
I'm pretty sure that is a situation that can lower your life expectancy.
Wasn't there something in Asimov's stories about a robot dropping a human from a height with the intention of catching her, thus not endangering her, but as soon as it let go, it thought "nah, it's the human's problem"?
Re: (Score:2)
Wasn't there something in Asimov's stories about a robot dropping a human from a height with the intention of catching her, thus not endangering her, but as soon as it let go, it thought "nah, it's the human's problem"?
I'm pretty sure not. It might be from a parody of Asimov.
Re: (Score:2)
That's the point I am making. It cannot kill you.
It absolutely can. Yes the driver can intervene but that requires a driver that's able to realise something is going amiss, adapt and react within less than a second or two at best to a situation that needed to be accounted for several seconds prior to that. Many drivers simply don't possess the skill level. Shit I drive 100,000 miles a year, have been accident free for over 2 million miles and I'd struggle with such short notice. The chances of a successful intervention are massively reduced the closer the
Re: (Score:2)
requires a driver that's able to realise something is going amiss, adapt and react within less than a second or two at best to a situation that needed to be accounted for several seconds prior to that. Many drivers simply don't possess the skill level.
Forget seconds. Airplanes have gotten into trouble when the autopilot suddenly gave up for no apparent reason or simply behaved differently from what the pilots were expecting, and they have dozens of seconds if not minutes to react. Not to mention pilots are trained to handle it, while drivers are not.
Re:Expectations vs Reality (Score:4)
Problem is, those stupid people are marketing and selling their expectations for $10K a pop, and putting this shit on the public roadways.
Re: (Score:2)
A rare commodity is being sold at a price point many upper- and middle-class buyers can afford, for a service that has taken years of investment. The price point actually sounds like a loss until they sell enough. It's probably primarily to better understand the returns of the service than to make a profit.
Blame your legislators for letting it on the roadways or you know the people turning on the beta software in very difficult driving circumstances when they know it's a beta service. As for the people making the product
Re: (Score:3)
Scams aren't rare.
Re:Expectations vs Reality (Score:5, Insightful)
"Honestly, who was expecting otherwise? That's why it's called "beta". "
No. For this level of incompetence you want the backup driver to be a full-time, skilled driver employed either by Tesla or a third-party testing organization--not the average, unskilled driver.
Re:Expectations vs Reality (Score:5, Insightful)
There is "beta" and then there is "pure hype". I still have a 2015 Tesla which was supposed to "find me anywhere on private property". What it actually does, 6 years later, is drive up to 40 feet forwards or backwards while I hold the dead-man switch, making sure it doesn't cause any damage (for which I have to accept responsibility when enabling the summon feature). Oh, and that car was also supposed to have 691hp, but after trying everything, including their CTO writing a blog about how "EV horsepower is special and different", Tesla finally admitted that the car can only produce 463hp on its best day. Their excuse: "well, the motors are 691hp capable, but not the battery or the power delivery system we sold you". Do you think the car will EVER find me anywhere on private property or develop 691hp, with Tesla covering the necessary upgrade costs since I paid for the feature 6 years ago? Or do you think they will refund me any money? Of course not. Just like my car has 691hp motors, all those suckers who paid for Full Self Driving since 2016 have Full Self Driving *capable* cars, except for the sensors, redundant components, and computing power required to do so - but hey, the windshield, roof, tires, and even seats are totally Full Self Driving capable! You can gloat to your friends, "I have a car with Full Self Driving capable floor mats!".
Btw, we also have a 2018 Tesla in the household, and its highway autopilot is actually worse than the 2015 version - it's twitchy, it brakes for no reason, and none of us in the house ever use it, even though I do occasionally use the old one, which works well as adaptive cruise control with lane keeping.
Elon has dreams, and he found that he can sell those dreams for thousands of dollars, and once the buyers realize they've been sold nothing but hype, he finds more buyers. On the bright side, he does use that money to keep on trying to build the dream; it's just that the people who pay for it don't realize they are paying for development of the dream for other buyers in a distant future. So, people who paid for Full Self Driving in 2016 paid so that someone in 2032 might be able to buy a Tesla with actual Level 4 or 5 autonomy. Of course, that money also helped make Elon one of the richest men on earth, but that's just a side effect.
Re: (Score:2)
So you bought a product that did not deliver on basic claims like HP output. Felt cheated and turned around and bought another one?
That is the sort of consumer behavior that enables companies like Tesla to continue to hype. The fact that you bought another tells me its actually at least in your view a nice product compared to the alternatives.
It should be saleable without over promising and under delivering. However the buying public is rewarding the hype-and-vaporware-as-marketing strategy.
Re: (Score:2)
I did stop buying hype features after 2015, buying only base EV's as there was no comparable product, but never paid for another "coming soon" feature since. Then the Model 3 and eventually Model Y flood came, Tesla service went from stellar to crap, driven by corporate profit squeeze, so stopped buying Teslas completely after 2018. Notice that Tesla stopped worrying about existing customers, with their primary focus on new customer acquisition - more first time buyers who have not experienced the sales pit
Re:Expectations vs Reality (Score:5, Interesting)
By showing these videos, you only give ammunition to the folks who wish to do things like ban self-driving cars which in my mind would be like banning the internet in the 80s.
That is true.
However, so is the inverse.
By not showing these videos, you let lies about how good the technology is perpetuate.
I would have concluded a confidentiality agreement or the like would have been included in the EULA for signing up for the beta.
I'm of a mixed mind about it. I'm leaning toward the rights of the public, though, purely because these are being beta tested in public, meaning anyone could take video of this, or suffer the consequences of its beta status.
If only operated in a private setting, I'd argue for enforceability of NDA.
Re: (Score:2)
Honestly, who was expecting otherwise? That's why it's called "beta".
I've been around the block a bit in the tech sector, about 35 years. This isn't a BETA, this isn't even close to a BETA. And you don't release safety-critical software with life-ending consequences if it fails to the public in the state this is in. This is at best an ALPHA version that should be limited to a restricted number of testers.
Goalposts have shifted. (Score:5, Insightful)
When I first heard of fully automated driving, I thought of how nice it would be to not have to pay attention for 16 hours between St. Louis and Denver. Today, the bar for "self driving" is in environments where even I almost break the law and almost run into people.
I don't trust hardly anyone in the city driving. I appreciate this being called out before we live in urban areas where there isn't a human to blame. I would happily sit in a Tesla driving down a highway in Utah, but I would never think it would be a good idea to do the same thing in New York City.
--
It always seems impossible until it's done.- Nelson Mandela
Re: (Score:2)
Yeah. Idiots like Musk changed them by claiming that as their goal.
Re: (Score:3)
I don't trust hardly anyone in the city driving.
Does that mean that you do trust the majority of city drivers?!
If the Tesla couldn't stay on the road (Score:4, Insightful)
So It Perfectly Replicates American Driving Habits (Score:4, Funny)
Stop signs in California (Score:4, Insightful)
California has utterly stupid stop signs.
Stop signs may be at a 4-way stop, or a two-way stop, where one direction has a stop sign, but the perpendicular traffic doesn't. In California, when you come up to a stop sign, there is no indication of which type of junction you have come to.
Perhaps you may see the reverse side of the stop signs for the other traffic, but what if you can't? Does that mean that they don't exist, or just that you can't see them?
These types of junctions are a hazard to human drivers as well as to self-driving cars. In fact, self-driving cars may have an advantage, if the mapping data can include information on the existence or lack of stop signs for other traffic.
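That last idea can be sketched in code. This is a purely hypothetical illustration (none of these names come from any real mapping vendor's API): if map data records which approaches to an intersection are stop-controlled, the car can know whether cross traffic will stop without ever seeing the backs of the other signs.

```python
# Hypothetical sketch of per-approach stop-sign metadata in an HD map.
# All names and the data layout are illustrative assumptions, not a real API.
from dataclasses import dataclass, field

@dataclass
class Intersection:
    node_id: str
    # approach heading in degrees -> True if that approach has a stop sign
    stop_controlled: dict = field(default_factory=dict)

    def is_all_way(self) -> bool:
        """True if every approach is stop-controlled (a 4-way stop)."""
        return all(self.stop_controlled.values())

    def cross_traffic_stops(self, my_heading: int) -> bool:
        """True only if every *other* approach also has a stop sign."""
        return all(stops for heading, stops in self.stop_controlled.items()
                   if heading != my_heading)

# A two-way stop: north/south approaches stop, east/west traffic does not.
x = Intersection("node-123", {0: True, 180: True, 90: False, 270: False})
print(x.is_all_way())            # False
print(x.cross_traffic_stops(0))  # False -> must yield to cross traffic
```

With that lookup, the planner knows ahead of time that the perpendicular cars won't stop, exactly the situation where the FSD video shows the Model 3 pulling out anyway.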
Re:Stop signs in California (Score:4, Interesting)
Re: (Score:2)
Perhaps you may see the reverse side of the stop signs for the other traffic, but what if you can't?
That's exactly why stop signs have their unique octagonal shape: So you can tell if other directions also have a stop sign, even from the back side. I don't recall seeing any intersection where the stop signs from other inlets could not be seen at all.
The little "all way" or "two way" hints below stop signs at many intersections are a nice-to-have that speeds up decision making, but they're not totally essential.
Re: (Score:2)
Stop signs are sub-optimal relative to 'Give Way' signs at most junctions but having driven in California their use of them does not create a hazard.
Even in San Francisco with its utterly fucking stupid stop signs every hundred fucking yards. It's a pain in the arse and it's shitty but it's not a hazard.
Re: (Score:2)
Many four-way stops do put a small rectangle underneath with text "All way." Treati
Re: (Score:2)
The answer there is to get rid of that bizarre 4-way stop thing. It's insane, and America is the only place in the world with it, so there's clearly no need for it.
They made a mistake training the ML system... (Score:2)
Why not ? (Score:2)
Beta testing (Score:5, Insightful)
In the post-Google world, "beta" means nothing.
Re: (Score:2)
Beta these days means public release with no liability claims. This is Star Citizen level quality here. This should not be available to the general public.
Obligatory Xkcd (Score:3)
This time, the Tesla arrives at an intersection where it has a stop sign and cross traffic doesn't. It proceeds with two cars incoming, the first car narrowly passing the car's front bumper and the trailing car braking to avoid T-boning the Model 3. It is absolutely unbelievable and indefensible that the driver, who is supposed to be monitoring the car to ensure safe operation, did not intervene there.
Self Driving [xkcd.com]
(Perhaps the driver just hadn't completed the "registration" before reaching the intersection...)
Auto pilot driving zones (Score:2)
Re: (Score:2)
Re: (Score:2)
Our current infrastructure isn't even optimized for human drivers in most places really. Pedestrians would be crossing over/under streets not on them if they were optimized for driving.
Drives better than my mom.. (Score:3)
Re: (Score:2)
Re: (Score:3)
This is my thought as well. I don't own any TSLA, not in the market for a car, but would really, really like self driving to come to fruition. But I have seen a lot of the practice cars around, whether Tesla, Cruise, Waymo, Uber, etc. and they all get hung up at relatively trivial situations for humans that are complex in theory.
Four-way stops where drivers are going out of turn, or trying to wave people through. Double-parked delivery trucks, clueless pedestrians and the real fact that most of us have a
Found the problem! (Score:2)
It was driver error. They forgot to take it out of Ludicrous Mode.
Oakland is Tutorial Level (Score:2)
Wide streets laid out on a grid? Imagine this thing in London - carnage!
We're never going to have full self-driving judging by this video; we've barely got to the level of driving like a drunk.
How about Europe? (Score:2)
This is super wide straight roads on a grid with little traffic and neat intersections with traffic lights.
I wonder how it will perform in Europe, which is a lot more irregular.
Well yeah (Score:2)
Maybe if there is an alert, attentive human to override the dumbass car then perhaps this would be okay but Musk has a perpetual pr
Let's not jump the gun here (Score:2)
After all, it is a test Tesla
It then drunkenly stumbles (Score:2)
i'm honestly baffled (Score:2)
this is not self-driving
as long as it requires you to have your hands on the wheel and ready to take over with no warning (a 'ding' and giving up at the same time is not a warning) it is not "Selfdriving"
it is "lol let me see you get out of the dangerous situations i'll get you in"
how can nobody see that this is STUPIDLY DANGEROUS and ABSURD?
Not only does the driver have to drive-without-driving, they also have to reverse-engineer the autopilot's intention from its actions, evaluate IN REALTIME whether said
The cars don't talk to each other yet (Score:2)
Really? (Score:2)
Test Driving Raised Flags (Score:2)
Re: (Score:2)
Re: (Score:2)
Yeah, how about Tesla get their training data without breaking multiple traffic laws and endangering many people.
This is not the way. This is criminally irresponsible.
Re: (Score:2)
You must have watched a different video. I saw a car driven like there was a drunk 12-year-old behind the wheel. The headline was literally correct - I did actually laugh at this load of junk failing to navigate an extremely simple road system on a clear day with lots of room and visibility.
Re: (Score:2)
The reason you can't see the incoming cars in that last bit is that the video's camera is facing out the front windshield. The people in the car see, and discuss, cases like that well before those cars have to brake. You should consider that your "more reasoned" take is really just motivated reasoning.
Re: (Score:2)
Re: (Score:2)
I can't tell you what it would have done in that situation, and I don't want to pay for FSD, as my experience with Autopilot didn't give me confidence that the $8,000 software package is worth it.
But with Autopilot it will start to brake for cyclists coming from a side road if it looks like they are not stopping. It clearly identifies pedestrians and cyclists, and when running Autopilot I can see it reacts to cars that aren't shown on the screen, so it sees more that they decide
Re: We don't have the infrastructure (Score:2)