'I'm Not Drunk, It's My Car.' Tesla's 'Full Self-Driving' Gets Mixed Reviews (cnn.com) 175
CNN describes the reactions posted online by six beta testers of Tesla's "full self-driving" software, saying they "appear to be both delighted and alarmed by what they've experienced so far."
"Turn left. Come on. What are you doing?" said one frustrated Tesla owner as his car appeared slow to change lanes during a trip he posted on YouTube last week. "I swear I'm not drunk you guys, I'm not drunk, it's my car...."
CNN Business reviewed hours of footage and found early impressions of the software are a mixed bag. At times the testers are impressed with the "full self-driving" technology, in other cases they say it's overly cautious. The videos also show unsafe situations that appear to result from the car not understanding traffic well enough.
Brandon McGowen, one of the beta testers, has posted videos online in which his Tesla nearly drives off the road or into a median. He's not the only driver who claims to have experienced trouble while testing the software. Beta testers Zeb Hallock, James Locke, Rafael Santoni, Kim Paquette, and a YouTuber who goes by "Tesla Raj," have highlighted concerns. In videos reviewed by CNN Business, Teslas appear to blow through red lights, stop well short of intersections, miss turns, nearly rear-end a parked car, make a turn from the wrong lane, and speed over speed bumps...
Tesla has warned current drivers to pay extra attention to the road, and keep their hands on the wheel. "Do not become complacent," Tesla warned the drivers in a message displayed when they installed the software, which CNN viewed in multiple videos posted by people testing the software. "It may do the wrong thing at the worst time...."
The cars...have shown a pattern of coming to a full stop when entering a roundabout, even when no car is blocking their path. Videos show that full self-driving often slows for speed humps, but won't necessarily slow down for speed bumps. In at least one case, the Tesla "full self-driving" software appeared to confuse a one-way street for a two-way street, according to the video.
Paquette estimated in her Talking Tesla interview that her Tesla might be as good a driver as her, if she'd had "maybe three bourbons."
Hard to see where we go from here... (Score:5, Insightful)
Sensors: Easy to Test (Score:2)
It's easy for Tesla to see if the sensors are adequate. Ignoring the sonar and radar, which are mostly for short range (sonar) and maintaining following distance (radar), they are relying on cameras. They can take the feeds from all the cameras and have a trained driver determine if it is safe to do things like proceed at a stop sign, using only what they see in the cameras. If a human can do it, then the cameras are sufficient. If a human can't do it, but a human driver in the car could, then the cameras are not.
Re: Sensors: Easy to Test (Score:2)
That is not relevant. The computer is not a human, so what a human can do with the sensor input doesn't tell us anything useful. The computer is better at watching multiple cameras, but not as good at determining context.
Re: (Score:2)
I'm not saying a human could do it in real-time integrating all the camera data, but a human could easily determine if there's enough information. The theory is that the AI can eventually get good enough at determining context, but the questions about sensors has usually been whether they provide sufficient data for that to even be theoretically possible.
Re: (Score:2)
Re:Hard to see where we go from here... (Score:4, Informative)
No, Brandon doesn't seem to be up to the task [twitter.com] (read the thread). He spends every drive in a state of panic. Literally none of the other beta drivers react like he does. A lot of people, including other FSD beta drivers, are mad at him for feeding FUD like this with his panicky overreactions.
FSD Beta is not perfect, and nobody expects it to be perfect at this stage. But it is bloody amazing, and all of the FSD beta drivers (including panicky Brandon) agree that it's improving rapidly between revisions. Which anyone can see [youtube.com] for [youtube.com] themselves [twitter.com] by [youtube.com] just [youtube.com] watching [youtube.com] them [youtube.com] - unbiased random samplings, rather than selectively picking out the bad (it even handles gravel roads covered in leaves like a champ [youtube.com]). Or by chatting with the FSD beta drivers themselves, as I do regularly. Seriously, do so - they're almost all on Twitter; easily found [twitter.com]. All of them will acknowledge that FSD beta isn't perfect, but they're really impressed by it and what it bodes for the future, and how fast it's improving. Funny then how CNN didn't bother interviewing them.
But of course, I know how this is going to go. Lots of people still think that standard AP doesn't see stationary vehicles, even though it has for the past several years, just because once upon a time it didn't. People aren't going to chat with the actual drivers, none of whom except Brandon drive around in a panic. People are going to take Brandon's constant state of panic in his videos, or the occasional random clip, and use that to paint a static picture of FSD, and overplay the frequency of interventions and their severity (after watching a lot of videos closely, including the car's displayed detections and plans, about half of the interventions were entirely unnecessary, and most of the rest had to do with not wanting to drive like a grandma or not wanting to miss a turn). Which should be obvious.
Thankfully, the world will move on without them.
Re: (Score:2)
Re: (Score:2)
I agree with aarrggh when he says "It seems like we are a very long way from being able to use it like a taxi. The current sensors don’t seem to be up for the task, along with nearly everything else."
My impression is that it's not only the sensors. The kinds of mistakes it's making seem to be related to something more fundamental than things that can be tuned or corrected at a beta stage. It's an impression. And the system is impressive, but it still looks like it can fail just as catastrophically to re
Re: (Score:3)
Honestly, I think the best testing population would be certified driving instructors. But that's irrelevant, as pretty much everyone but Brandon seems to be doing just fine supervising it without spending every drive hyperventilating.
Re: (Score:2)
Re: (Score:2)
Things like poorly marked speed bumps (and potholes) are not well detected by the system today
The obvious way to handle them is to maintain a database. Then, unless the car is the first SDC to encounter a brand new speed bump, it will already know it is there.
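A minimal sketch of that database idea, in Python. The coordinates, the 30 m lookahead, and the function names are all illustrative assumptions, not anything Tesla has described:

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two lat/lon points.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

# Bumps reported by earlier cars that detected (or bottomed out on) them.
known_bumps = [(37.7793, -122.4192), (37.7801, -122.4150)]

def bump_ahead(car_lat, car_lon, radius_m=30):
    # True if any previously mapped bump lies within radius_m of the car.
    return any(haversine_m(car_lat, car_lon, lat, lon) <= radius_m
               for lat, lon in known_bumps)

if bump_ahead(37.7794, -122.4191):
    print("Known speed bump nearby - slow down")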
Re: (Score:3)
It does [youtube.com] detect [twitter.com] speed [youtu.be] bumps [youtu.be].
Why is it that people never realize when they're being fed a hand-selected series of negatives in order to portray a more sensationalist narrative? Yes, of course there exist videos of it missing speed bumps (just like humans sometimes do). No, that is not the general case. And it will become increasingly rare over time. This is the first time it's out in public hands, catching edge cases from a vastly larger dataset than internal testing could ever have hoped for.
Re: Sensors aere fine (Score:2)
Re: (Score:2)
No, you can't drive over a speed bump at full speed without harm. You can damage the suspension, bottom out the car and scrape the undercarriage (which can break a variety of things), and harm the people in the car (they can hit their heads on the roof or bang against any other surface of the car). It's one of those things where you'll do it a dozen times with no harm, then you'll roll snake eyes once and it will fuck everything up. And large potholes are worse.
Re: (Score:3, Informative)
It's irrelevant because FSD does slow for speed bumps [slashdot.org] - a couple hand-selected misses notwithstanding. Humans also occasionally miss speed bumps.
Re: Sensors aere fine (Score:2)
Not sure what your point is.
Re: (Score:2)
Re: (Score:2)
A quick search of YouTube does nothing to show whether humans are better or worse; the number of people driving is much larger than the number of self-driving cars out there. Do people make mistakes? Absolutely. Do self-driving cars make mistakes? Sure. Which is better? We don't know, and we won't find out until there are a lot more of them on the road in everyday situations.
Re: (Score:2)
"It always amazes me how all people claim that are far better drivers than computers when a quick check on Youtube will show people does things you would not think any intelligent human being of doing."
The lack of logic in your statement is stunning. You're basing your assessment of human driving upon Youtube videos as evidence? WTF?!? Would you expect videos of people driving normally?
Now, just to be clear, I'm not arguing that Teslas aren't better than the majority of human drivers. And that's mostly
Re:Sensors aere fine (Score:5, Insightful)
An alpha version shouldn't be on public roads, let alone something they charge for. And it's not even dealing with winter blizzard conditions.
Re: (Score:2)
An alpha version shouldn't be on public roads,
Why? If drivers are told to pay careful attention while it is on - why not?
Cruise control can just as easily lead to disaster; should no car have that? Or lane-departure avoidance, which ships in a number of modern cars now? In some of them the quality is probably around alpha at best.
The simple truth is that cars have shipped for a long time with what are essentially alpha features that lots of miles have smoothed out. Self driving is no different.
We don't need luddites
Re:Sensors aere fine (Score:5, Insightful)
First off- humans don't work that way. They don't maintain full attention when not engaged. That's just CYA bullshit.
Secondly- the car doesn't just put its driver at risk, which might be an acceptable risk. It puts everyone else on the road at risk, and they didn't agree to be human guinea pigs.
Cruise control is totally different. It's not marketed as "Full self driving". People using it are still actively involved in driving. And truthfully, I'm NOT sure that cruise control is a good thing- I'm a better driver with it off than on. And it should only be used in special circumstances where sudden starts/stops are unlikely, which isn't how people are using this.
As for blizzards- you believe a thing Musk says? That pretty much invalidates anything you say, he's the world's biggest bullshit artist.
Re: (Score:3, Insightful)
They sold it to a bunch of random people who paid for it months ago and aren't trained testers. They aren't going to be paying attention. You're either delusional or just lying if you're saying they will.
And I accept the normal risks of the road. You don't have the right to increase those for me by putting beta software on a bunch of random cars that can completely take them over. Keep that shit on private tracks or with professional drivers until it's ACTUALLY fully self-driving.
Re: (Score:2)
Do you know how many new drivers there are on the road at any given time? You already share the road with barely-trained, inexperienced drivers.
I assume you bring this up as a problem to be solved, not a desirable situation to be copied.
Re: (Score:2)
Cruise control can just as easily lead to disaster, should no car have that?
Cruise control has been around long enough, and in roughly the same state of technology, that drivers know its limitations. Self-drive is new, and even most Tesla owners are not fully aware of the rapidly changing boundaries of applicability.
We don't need luddites jumping at every shadow to slow humanity to a crawl.
In Democrat-infested states there are whole legislatures that will take the first stupid-driver accident as an excuse to ban self-drive forever. And unlike nuclear power, they can't just import the advantages of self-drive from states that do adopt it.
Re: (Score:3)
then it's not "full self-driving."
STUDENT DRIVER (Score:2, Insightful)
Re: Sensors aere fine (Score:2)
You don't have to qualify that with the word "blizzard": the extremely hazardous condition where a clear, sunny, dry road is suddenly glare ice and you've lost 90%+ traction is a) not uncommon and b) nearly undetectable even to experienced drivers, the only cue being that some road-surface specular highlights look slightly shinier than they "should be".
I know I'm supposed to worship on Slashdot at the Church of Unbridled Technological Optimism, but we're at least ten years away from actual unattended driving.
Re: (Score:2)
Eh, they're not using LIDAR due to the expense, whereas every other company is using it. So I'm not so sure they aren't missing something. But yeah, they don't need better cameras, but better software. The real question is how long that will take. They're betting on years, but I don't expect to see it this decade.
Re: Sensors aere fine (Score:4, Insightful)
One of the given examples is running over speed bumps without slowing because it thinks it's a line, not a bump. The sensors absolutely are the problem. You wouldn't make that mistake with LIDAR. And that is only one example, there are plenty more.
Doing FSD without LIDAR isn't just harder, it's stupid and unnecessary, and endangers lives. For what? More profit? Fuck that.
Re:Sensors aere fine (Score:4)
Because of how many people are using and improving it, we are about a year away from it becoming as good as most human drivers.
That's the most optimistic thing I've heard all year.
Re: (Score:2)
We're five years away. Like we were five years ago, and like we will be in five.
Feeding more data into a deep learning model gives better results up to a point, and then you need orders of magnitude more data to make a dent in it. Deep learning does not scale well at all.
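As a rough illustration of that "orders of magnitude" point: deep-learning error often falls off roughly as a power law in dataset size, error ≈ c·N^(-alpha). The exponent below is an assumption picked purely for illustration, not a measured value for any driving model.

# Illustrative only: alpha = 0.1 is an assumed scaling exponent.
alpha = 0.1

def data_multiplier_to_halve_error(alpha):
    # Solve (N2/N1)**(-alpha) = 0.5  =>  N2/N1 = 2**(1/alpha)
    return 2 ** (1 / alpha)

print(f"With alpha={alpha}, halving the error takes ~{data_multiplier_to_halve_error(alpha):,.0f}x more data")
# With alpha=0.1 that's ~1,024x - three orders of magnitude for a single halving.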
Re: (Score:2)
You have no idea how fast the technology can improve with enough data.
But this depends on the human drivers not being total idiots during the beta, like that hired safety driver who was playing a video game while the car plowed into a pedestrian.
Re: (Score:2)
we are about a year away from it becoming as good as most human drivers
We absolutely are NOT a year away. That's such an unimaginably uninformed and naive comment. Tesla's "self-driving" is all smoke and mirrors, and it's going to kill even more people than it already has. If we had a functional governmental regulatory body, Tesla would be sued to high heaven and forced to remove "autopilot" and "self-driving" from their products.
Is it my car? (Score:2)
Or am I just drunk?
Re: (Score:2)
You're probably drunk.
Re: (Score:2)
You bought the E85 version of the Tesla.
Re:Is it my car? (Score:5, Funny)
Or am I just drunk?
Police: "Please step out of your car, we need to do a field sobriety test."
Driver: "OK, what do I need to do?"
Police: "Nothing, the field sobriety test is for your car."
Re:Is it my car? (Score:5, Funny)
Tesla failed the sobriety test. When asked to recite the alphabet, it instead played Trans-Siberian Orchestra and started waving its doors around.
Re: (Score:2)
You kid, but the real question is what regulatory tests has this system passed? Does it have to pass state and federal tests? Do all states have tests? Does the govt have federal self-driving tests?
Re: (Score:2)
Nothing yet. That responsibility is still shouldered by the human behind the wheel, who is still required to have a valid driver's license, and who will also be held responsible should the self-driving system cause a crash.
Re: (Score:2)
her Tesla might be as good a driver as her, if she'd had "maybe three bourbons"
I fully support the idea of subdividing the level 4 / 5 autonomy categories by this metric. "Tesla now offer a 3 bourbon level 5 self-driving car, while Mercedes is about to release a 1 bourbon level 4"
Cops stop you on the road at midnight (Score:5, Funny)
Cop: Sir, what are you doing?
Guy: I'm just driving around, is that illegal?
Cop: Sir, are you drunk?
Guy: Of course not, it's the self-driving of my Tesla it's still in beta.
Cop: What Tesla, sir? You are half naked and were running in the neighborhood before we arrested you.
Guy: You'll have to speak up, I'm wearing a towel.
Re: (Score:2)
The neckbeard's dream of a wild night on the surface.
I'm not going to buy the full pdf, sorry.
More.. (Score:2)
Cop: That appears to be a '82 Honda Civic sir, not a Tesla.
Guy: Well that explains why I had a hell of a time stuffing the batteries in the tank.
Cop: Well sir as this is Florida, I'm going to let you off with a warning.
What's the problem? (Score:5, Insightful)
Teslas appear to blow through red lights, stop well short of intersections, miss turns, nearly rear-end a parked car, make a turn from the wrong lane, and speed over speed bumps...
I see most of this every day. Today a pickup truck made a left turn on red in front of me as my light was green; routinely, people stop half to a full car length behind the white line, people cut over from the right lane into the left to make a turn, and speed bumps are driven over almost at speed.
In other words, the Tesla is driving just like people do. Success!
Re: (Score:3)
You jest, but it's not that far from the truth.
Tesla doesn't have to be perfect. "Better than the average human" is a really, really low bar.
Re: (Score:3)
Re:What's the problem? (Score:5, Insightful)
No, "better than the average human" is actually an astonishingly high bar to cross.
Yes, there are a lot of crashes, many fatalities, around the world every year. However, humans also put *shitton* of miles in.
Just the other day there was a slashdot article where Waymo posted details about their self-driving data: 6.1 million miles over 21 months. The USA as a whole puts that in every **minute**. And in those miles there were 47 collisions, averaging a little under 130,000 miles between crashes. Humans in the USA average close to 500,000 miles between crashes, and about 90 million miles between fatal crashes. Waymo has a long way to go.
Additionally, those figures already include all the drunks, morons, sleepy drivers, and shitbox operators. If we had better designed cities and better public transportation, that would make the bar to cross even higher, and provide more economical and equitable transportation than self driving cars ever will.
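For anyone who wants to check the arithmetic, a quick Python sketch using the figures as quoted above (not independently verified):

waymo_miles = 6_100_000        # over 21 months
waymo_collisions = 47
human_miles_per_crash = 500_000
human_miles_per_fatal = 90_000_000

waymo_miles_per_collision = waymo_miles / waymo_collisions
print(f"Waymo: ~{waymo_miles_per_collision:,.0f} miles per collision")        # ~129,787
print(f"Humans: ~{human_miles_per_crash:,} miles per crash, "
      f"~{human_miles_per_fatal:,} per fatal crash")
print(f"Humans go ~{human_miles_per_crash / waymo_miles_per_collision:.1f}x "
      f"farther between crashes")                                             # ~3.9x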
Re: (Score:2)
Just the other day there was a slashdot article, Waymo posted details about their self driving data. 6.1 Million miles, over 21 months.
And how many more miles in simulators? Turns out it's tens of billions.
https://www.ecosia.org/search?... [ecosia.org]
Re: (Score:2)
There's nothing inherently wrong with simulators, but if they have tens of billions of miles in simulators and are still reporting crashes four times more often than the average person in real life, it does raise the question: is the simulator beneficial, or are they not applying what they learn from the simulator to real driving?
Re:What's the problem? (Score:4)
Tesla doesn't have to be perfect. "Better than the average human" is a really, really low bar.
The average driver gets into one auto accident every 17.9 years [forbes.com]. If a Tesla (or any self-driving car) can match that level of reliability, it will be quite an accomplishment.
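A back-of-the-envelope conversion of that figure into miles between accidents; the 13,500 miles/year average is an assumption (a commonly cited US figure), not something from the linked article.

years_per_accident = 17.9
miles_per_year = 13_500          # assumed US average annual mileage

miles_per_accident = years_per_accident * miles_per_year
print(f"~{miles_per_accident:,.0f} miles between accidents")   # ~241,650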
Re: (Score:2)
Tesla claims that its statistics show that accidents per million miles are fewer in cars with Autopilot.
Re: (Score:3)
Yeah, because the company constantly warns that you need to babysit the system at all times, and if you do get into an accident, it was clearly and entirely user error. Statistics from marketing departments are always BS.
Re: (Score:2)
Waymo has already exceeded the ability of the average driver, although only in a limited area (Level 4 autonomy): https://tech.slashdot.org/stor... [slashdot.org]
Re: (Score:2)
I'd take that with a pinch of salt. A simple deterrent to reporting a minor collision or near-miss where the driver has to take control would be to make the driver stop immediately and file a very long and detailed report... with an Etch A Sketch. It's in Waymo's interests to make self-driving look good and make reporting awkward, but with plausible deniability.
Re: (Score:2)
A lot of that is because the existing self-driving systems get to do all the "easy" driving. Meanwhile, human drivers have to be able to handle driving in any situation. If you had two human drivers, and one of them only drove in clear weather, on good-condition roads, on routes they were familiar with, and the other had to handle all the bad weather, poorly marked roads, construction zones, detours, unfamiliar areas, and so on, it would only be natural that the first driver would have fewer accidents.
Re: (Score:2)
Except that's not how legal liability works. A human driver doesn't get a pass just because they are generally a good driver, and better than the average driver.
There are serious legal issues to be addressed before this technology goes into use.
Re: (Score:2)
Tesla has to be way better than the average human because Tesla is liable for all the accidents its full self driving system will have. If it has 1 million self driving cars out there and it's only as good as the average human it's going to be covering a million people's worth of liability insurance, a million people's worth of injury pay-outs, a million people's worth of repairs etc.
Also note that the kind of driving you describe may be common in the US but in Europe and Japan the driving test is apparentl
Re: (Score:2)
Yes, and seriously, if you've done any work on autonomous driving at all, you know that that's how everyone expects it to go: Step a) autonomous cars are driving like shit. Step b) they improve slightly and now drive like an average driver - people consider it unacceptable. Step c) they improve slightly and now drive like a reasonably good driver - people find it unacceptable. Step d) they improve slightly and now drive better than most human drivers - first reviews say that autonomous driving may be conside
Re: (Score:2)
So far, every autonomous-driving advocate I've heard has responded to step b) with "our software is shit, so lobby to adapt the infrastructure to it."
Idiots (Score:2, Insightful)
I don't trust it (Score:2)
From what I have seen using the "autopilot" feature, I am not about to buy the Full Self Driving feature for $9,400 (the price around here). I believe there was a half-price sale in September on the upgrade, but there were a lot of other things I'd rather buy. :D On the freeway the lane keeping works and the speed sort of works, but I get a much smoother ride by driving myself.
Re: (Score:2)
Nobody needs a fully semi-automatic car.
This is going to get somebody hurt or killed (Score:2)
Gaining experience. (Score:4, Insightful)
It's clear that the self-driving is unfinished but since this is a "beta test" I'm assuming every time someone suddenly takes over that it's noted in the uploaded data. The result could very well be these beta testers are providing the feedback information needed to improve the neural network. It's disappointing that it's not everything people wanted it to be. Let's hope they are taking a sane approach which will rapidly improve the self-driving neural networks.
That's exactly what Tesla is doing (Score:3)
I'm assuming every time someone suddenly takes over that it's noted in the uploaded data.
That's exactly what Tesla said they do in the Full Self Driving presentation they gave a year or two ago - any driver intervention is noted and used as a point to update the system. So the more people are using this, the faster it will improve - especially if they are paying close attention and making corrections when needed.
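A hedged sketch of what that intervention-triggered feedback loop might look like; the field names and upload path are purely illustrative, since Tesla hasn't published its actual schema:

import json, time

def record_disengagement(reason, sensor_snapshot, planned_action, driver_action):
    # Package the moment a driver overrides the system so it can be reviewed
    # later and, if useful, fed back into training.
    event = {
        "timestamp": time.time(),
        "reason": reason,                    # e.g. "steering override", "brake press"
        "sensor_snapshot": sensor_snapshot,  # camera/radar frames around the event
        "planned_action": planned_action,    # what the system intended to do
        "driver_action": driver_action,      # what the human did instead
    }
    return json.dumps(event)                 # queued for upload when the car is online

payload = record_disengagement(
    reason="steering override",
    sensor_snapshot={"frames": "(elided)"},
    planned_action="continue straight",
    driver_action="swerve right to avoid median",
)
print(payload[:80], "...")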
Re: (Score:3)
The issue is that they are getting untrained members of the public to test this on public roads. It's clearly not ready for that and they should be using trained safety drivers with very careful monitoring to make sure they are paying attention.
All they are doing is trying to shift liability for accidents onto fools who signed up to be crash test dummies in their beta programme.
Re: (Score:2)
All they are doing is trying to shift liability for accidents
I doubt it's about liability and more about development/testing time. It's not the most ethical way of doing things but if they are updating it weekly based on feedback then it's definitely the fastest.
Actually Brandon McGowen's videos praise the FSD. (Score:2)
Re:Actually Brandon McGowen's videos praise the FS (Score:4, Insightful)
> 90% of the time the FSD does great and the commenters agree.
So if I take an autopilot Tesla to the store (it takes me?), there's a 90% chance I'll get there safely? So 10% of the time it wrecks. Gonna have to do several orders of magnitude better than that before I put my life in its hands.
Re:Actually Brandon McGowen's videos praise the FS (Score:4)
Gonna have to do several orders of magnitude better than that before I put my life in its hands.
Yes, that is the goal, and it is the massive real-world usage-testing and data-collection that will (if all goes according to plan) allow Tesla to achieve that goal.
Re: (Score:2)
Re: (Score:2)
It isn't released to the public yet, it's still being developed. It's only available to people who want to help in developing and testing it. You are looking at a half built house and complaining it won't have a roof.
Re: Actually Brandon McGowen's videos praise the F (Score:2)
He said public road moron.
Re: (Score:2)
I said "not released to the public" you stupid idiot. It's for people who signed up as testers. Nothing the car does can't be overriden by the driver. Last I checked non-FSD cars require driver attention too .. so not sure what the problem is that test software requires driver attention .. driving requires driver attention anyway.
A mixture of delight and alarm? (Score:3, Insightful)
Re: (Score:2)
Sounds like my Dad teaching me how to drive.
I assume you're describing things from his point of view?
Re: (Score:2)
Re: (Score:2)
Indeed. 40 years ago for me, and dad has been dead for 25.
"Dammit, keep your foot on the gas in the corners!"
So much fun learning to drive in a Porsche (356 to learn, then the 911 when I actually got my license - when dad died and they became mine I sold the 911, still have the 356)
Not enough information - hysteria over logic (Score:5, Insightful)
Speaking for myself only: "It didn't do what I wanted! It's broken!" is a natural but useless reaction.
What I want to know is what specifically failed.
Of course I am not going to get that but I would at least like to know in each fail case what happened: A) The 4-D model that the system constructed did not match reality, or B) The model was correct but the driver program chose the wrong thing to do. There is actually a C) case which is that the model was correct, the program was correct, but the hardware was not fast enough to execute the programmed logic in real time.
You can't know how far off the product is from being release grade until you know at least this level of detail.
Examples of the case-A type of failure are the most worrisome. This is where the argument about whether the car's sensors are adequate comes in. If the 4-D model turns out to be correct, then the sensor suite is up to the job. If it wasn't, then the sensors might be the problem, but it could also mean that the data-ingestion subsystem is insufficient.
Examples of case-B I would think would be easier to address because simulations of the failure mode could be constructed and re-run until the logic was right. I have to wonder what regressions they might encounter. That would be a job all by itself.
However we don't get these kinds of observations for these anecdotal incidents. In my view that makes these discussions pointless.
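To make the A/B/C split concrete, here is a minimal triage sketch, assuming (hypothetically) that each intervention comes with a log of what the car perceived, what it planned, and how long the planning cycle took:

from enum import Enum

class FailureCase(Enum):
    A_PERCEPTION = "4-D model did not match reality"
    B_PLANNING = "model was right, the chosen action was wrong"
    C_LATENCY = "model and plan were right, hardware too slow to act in time"

def classify(model_matched_reality, plan_was_correct, cycle_time_s, budget_s=0.1):
    if not model_matched_reality:
        return FailureCase.A_PERCEPTION
    if not plan_was_correct:
        return FailureCase.B_PLANNING
    if cycle_time_s > budget_s:
        return FailureCase.C_LATENCY
    return None   # no system fault identified

# Example: the car saw the red light but planned to roll through it anyway.
print(classify(model_matched_reality=True, plan_was_correct=False, cycle_time_s=0.05))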
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
What I want to know is what specifically failed.
That's the problem with AI. Even a postmortem is hard to analyse. And it can fail unexpectedly in any situation; it's not like a normal program that will fail in extreme cases (huge inputs, etc.). Proofs of concept have shown that changing a single pixel in an image can invert the results...
Re: (Score:2)
This comment is too rational for slashdot.
the law needs work now, and a bad outcome is a DUI in an autocar (Score:3)
The law needs to be worked on now, and setting laws too soon can lead to bad outcomes later. For example, you can get a DUI in an autonomous car EVEN if the only control is an E-stop button, since the courts can rule that having the E-stop button = being in control. Or just having the e-taxi app on your phone.
Re: (Score:2)
Re: (Score:3, Insightful)
If you drive one drunk, you run a much greater chance of running someone over or being involved in a crash than you do when sober.
But also a much greater chance of running someone over if you drive drunk without that system.
Let's wait until the technology is actually good enough...
It is. It's not perfect, but it's foolish to let the perfect be the enemy of the good.
Tesla has a lot of flaws, but when you look at the real-world performance of this system, it's safer than your average human. Tesla won't get better if it's not running massive data-collection programs in the real world, in real-world driving conditions, with real drivers. That's what they're doing.
The alternative is Waymo
Re: (Score:3)
Oooh, poor baby. Show us on the doll where Elon touched you.
How do you know a Tesla is in full auto mode? (Score:4, Funny)
It uses its turn signals...
Re:How do you know a Tesla is in full auto mode? (Score:4, Funny)
Perhaps there should be some universally understood, externally visible indicator that a car is under software control, as opposed to under primate control (or feline, if your cat is named Toonces). I propose a strip of red LEDs across the front of the car that light up in a sweeping pattern, left to right and back again. Perhaps a humming noise to go with it. And when you engage software control the car should say "BY YOUR COMMAND IMPERIOUS LEADER".
Next level beta testing (Score:4, Interesting)
Honestly, I have no idea what motivates some people. Hey, help work out some bugs. Worst case, you get injured or die.
Also, I love how all the car people are constantly "pay attention, pay attention" while selling this feature as a way, well to do the complete opposite.
I'm fine with safety features designed to overcome inattention (collision avoidance, lane drift) as well to give you a bit of a rest when safe (low speed rush hour mode). Sit back and @#$%@% around on your phone because fines are just fines mode, err, not so much.
And, in 2020, we figured out just how much commuting can be avoided by letting people work from home. I wonder if the demand for this will fall off after people realize that traffic doesn't have to be a nightmare for everybody, given flexible work and public transportation.
Call an Uber. (Score:3)
Re: (Score:2)
Agree that it's very important. Fewer accidents mean less trauma.
Below data is from before the self driving update that's currently being rolled out.
Average number of accidents:
US Average: 1 accident per 479k miles driven.
Tesla without autopilot and without active safety features: 1 accident per 1560k miles driven.
Tesla without autopilot engaged: 1 accident per 2270k miles driven.
Tesla with autopilot engaged: 1 accident per 4530k miles driven.
I think 2270k miles is the important line. It's street driving, when autopilot isn't generally engaged and wh
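Converting those per-miles figures to accidents per million miles makes the comparison easier to read (figures as quoted above, not verified):

figures = {
    "US average": 479_000,
    "Tesla, no autopilot, no active safety": 1_560_000,
    "Tesla, autopilot not engaged": 2_270_000,
    "Tesla, autopilot engaged": 4_530_000,
}
for label, miles_per_accident in figures.items():
    print(f"{label}: {1_000_000 / miles_per_accident:.2f} accidents per million miles")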
Re: (Score:2)
"Full Self Driving" (Score:4, Informative)
I feel like there's a lesson here and Tesla has failed to learn it...
Remember a few years back, when Tesla "Autopilot" first became a thing, and several people were killed when they took the name of that feature at face value, assuming it was actually a fully automatic driving mode for the car? Then when the driver slept on his commute to work the car slammed into a median at highway speed, unceremoniously ending him?
Now, Tesla releases a package called "Full Self-Driving?" Another "automatic" AI-powered driving mode that needs to be constantly monitored by the driver in a manner that makes it seem just as dubious a claim as "Autopilot" was? And the marketing buzzline is actually "Full Self-Driving?" Is anyone else raising their eyebrows over this?
It sounds to me like Tesla must want to get sued for wrongful death. Or maybe grievous bodily harm. Hopefully casualties are limited to Tesla customers who are too dumb to read the instruction manual. Based on initial reports, it seems the potential for harm to innocent bystanders is still quite significant.
Calling this feature "Full Self-Driving" seems downright irresponsible. I wonder what they will call future iterations of this technology. "Actual Full Self-Driving For Real?" No really, we mean it this time! It won't hit parked cars or pedestrians, we promise!
What does the software license look like? (Score:2)
Is Tesla putting their money where their mouth is, and accepting legal liability for any errors their software makes? Or is it the usual software license where they disclaim any liability whatsoever? Because normal software licenses tell you not to use it for life and death situations, and says that you can't sue even if it blows up.
"Bow! Kneel! Yield!", mocked Lex Luthor (Score:2)
The cars...have shown a pattern of coming to a full stop when entering a roundabout
So it behaves exactly like humans in America with roundabouts.
Y'all need a trial by fire in Europe where people shoot right into car-sized gaps while moving at speed. Yielding is for impossible fits, nothing more.
My Prediction (Score:2)
Using this "Full Self-Driving" feature on a daily basis will give you an anxiety disorder or make an existing one worse.