Transportation AI

'I'm Not Drunk, It's My Car.' Tesla's 'Full Self-Driving' Gets Mixed Reviews (cnn.com) 175

CNN describes the reactions posted online by six beta testers of Tesla's "full self-driving" software, saying they "appear to be both delighted and alarmed by what they've experienced so far." "Turn left. Come on. What are you doing?" said one frustrated Tesla owner as his car appeared slow to change lanes during a trip he posted on YouTube last week. "I swear I'm not drunk you guys, I'm not drunk, it's my car...."

CNN Business reviewed hours of footage and found early impressions of the software are a mixed bag. At times the testers are impressed with the "full self-driving" technology; in other cases they say it's overly cautious. The videos also show unsafe situations that appear to result from the car not understanding traffic well enough.

Brandon McGowen, one of the beta testers, has posted videos online in which his Tesla nearly drives off the road or into a median. He's not the only driver who claims to have experienced trouble while testing the software. Beta testers Zeb Hallock, James Locke, Rafael Santoni, Kim Paquette, and a YouTuber who goes by "Tesla Raj," have highlighted concerns. In videos reviewed by CNN Business, Teslas appear to blow through red lights, stop well short of intersections, miss turns, nearly rear-end a parked car, make a turn from the wrong lane, and speed over speed bumps...

Tesla has warned current drivers to pay extra attention to the road, and keep their hands on the wheel. "Do not become complacent," Tesla warned the drivers in a message displayed when they installed the software, which CNN viewed in multiple videos posted by people testing the software. "It may do the wrong thing at the worst time...."

The cars...have shown a pattern of coming to a full stop when entering a roundabout, even when no car is blocking their path. Videos show that full self-driving often slows for speed humps, but won't necessarily slow down for speed bumps. In at least one case, the Tesla "full self-driving" software appeared to confuse a one-way street for a two-way street, according to the video.

Paquette estimated in her Talking Tesla interview that her Tesla might be as good a driver as she is, if she'd had "maybe three bourbons."

  • by aaarrrgggh ( 9205 ) on Saturday October 31, 2020 @09:41PM (#60670522)
    It seems like we are a very long way from being able to use it like a taxi. The current sensors don’t seem to be up for the task, along with nearly everything else. Really hope I am wrong, because I would really like to have my car come pick me up sometimes...
    • It's easy for Tesla to see if the sensors are adequate. Ignoring the sonar and radar, which are mostly for short range (sonar) and maintaining following distance (radar), they are relying on cameras. They can take the feeds from all the cameras and have a trained driver determine whether it is safe to do things like proceed at a stop sign, using only what they see in the cameras. If a human can do it, then the cameras are sufficient. If a human can't do it, but a human driver in the car could, then the cameras are insufficient.

      • That is not relevant. The computer is not a human, so what a human can do with the sensor input doesn't tell us anything useful. The computer is better at watching multiple cameras, but not as good at determining context.

        • by crow ( 16139 )

          I'm not saying a human could do it in real time, integrating all the camera data, but a human could easily determine if there's enough information. The theory is that the AI can eventually get good enough at determining context, but the question about the sensors has usually been whether they provide sufficient data for that to even be theoretically possible.

      • My experience is that the car does not have as long a range of perspective as a human at highway speeds, both for judging closing time properly and for predicting interactions with other vehicles. (My car is not set to mad-max mode though, which might overcome that limitation at the cost of driving more aggressively than the local norms.) Aside from traffic cones (which the car is better at spotting than I am, even if they are irrelevant), side vision seems completely buggy. It might be something that the physics
    • by Rei ( 128717 ) on Sunday November 01, 2020 @09:47AM (#60671590) Homepage

      The current sensors don’t seem to be up for the task

      No, Brandon doesn't seem to be up to the task [twitter.com] (read the thread). He spends every drive in a state of panic. Literally none of the other beta drivers react like he does. A lot of people, including other FSD beta drivers, are mad at him for feeding FUD like this with his panicky overreactions.

      FSD Beta is not perfect, and nobody expects it to be perfect at this stage. But it is bloody amazing, and all of the FSD beta drivers (including panicky Brandon) agree that it's improving rapidly between revisions. Which anyone can see [youtube.com] for [youtube.com] themselves [twitter.com] by [youtube.com] just [youtube.com] watching [youtube.com] them [youtube.com] - unbiased random samplings, rather than selectively picking out the bad (it even handles gravel roads covered in leaves like a champ [youtube.com]). Or by chatting with the FSD beta drivers themselves, as I do regularly. Seriously, do so - they're almost all on Twitter; easily found [twitter.com]. All of them will acknowledge that FSD beta isn't perfect, but they're really impressed by it, by what it portends, and by how fast it's improving. Funny, then, how CNN didn't bother interviewing them.

      But of course, I know how this is going to go. Lots of people still think that standard AP doesn't see stationary vehicles, even though it has for the past several years, just because once upon a time it didn't. People aren't going to chat with the actual drivers, none of whom, except Brandon, drive around in a panic. People are going to take Brandon's constant state of panic in his videos, or the occasional random clip, and use that to paint a static picture of FSD, overplaying the frequency and severity of interventions (after watching a lot of videos closely, including the car's displayed detections and plans, I'd say about half of the interventions were entirely unnecessary, and most of the rest had to do with not wanting to drive like a grandma or not wanting to miss a turn). Which should be obvious.

      Thankfully, the world will move on without them.

      • I personally don't think the sensors are up to the task, after the first thousand or so miles I have driven with my Tesla. A forward range of 200m at best will lead to an uncomfortable ride (although it is likely adequate on California highways). The trash-can and traffic-cone detection give you a pretty good idea of what the system is capable of, which is very cool, but the detection/rendering of cars to the side (in the general-release software) shows more of a bias towards identifying an obstruction
      • I agree with aaarrrgggh when he says "It seems like we are a very long way from being able to use it like a taxi. The current sensors don't seem to be up for the task, along with nearly everything else."

        My impression is that it's not only the sensors, either. The kinds of mistakes it's making seem to be related to something more fundamental than things that can be tuned or corrected at the beta stage. It's an impression. And the system is impressive, but it still looks like it can fail just as catastrophically to re

  • Or am I just drunk?

  • by DontBeAMoran ( 4843879 ) on Saturday October 31, 2020 @09:50PM (#60670546)

    Cop: Sir, what are you doing?
    Guy: I'm just driving around, is that illegal?
    Cop: Sir, are you drunk?
    Guy: Of course not, it's my Tesla's self-driving, it's still in beta.
    Cop: What Tesla, sir? You are half naked and were running in the neighborhood before we arrested you.
    Guy: You'll have to speak up, I'm wearing a towel.
         

    • The neckbeard's dream of a wild night on the surface.

      I'm not going to buy the full pdf, sorry.

    • Cop: That appears to be an '82 Honda Civic, sir, not a Tesla.
      Guy: Well, that explains why I had a hell of a time stuffing the batteries in the tank.
      Cop: Well, sir, as this is Florida, I'm going to let you off with a warning.

  • by quonset ( 4839537 ) on Saturday October 31, 2020 @09:51PM (#60670548)

    Teslas appear to blow through red lights, stop well short of intersections, miss turns, nearly rear-end a parked car, make a turn from the wrong lane, and speed over speed bumps...

    I see most of this every day. Today a pickup truck made a left turn on red in front of me while my light was green; routinely, people stop half to a full car length behind the white line, cut over from the right lane into the left to make a turn, and drive over speed bumps at nearly full speed.

    In other words, the Tesla is driving just like people do. Success!

    • You jest, but it's not that far from the truth.

      Tesla doesn't have to be perfect. "Better than the average human" is a really, really low bar.

      • The reality is that all self-driving cars MUST be significantly better than humans, or they will never be accepted. There are so many legal and liability minefields that anything less than being equal to the BEST drivers is a failure.
      • by Whatsisname ( 891214 ) on Sunday November 01, 2020 @12:42AM (#60670824) Homepage

        No, "better than the average human" is actually an astonishingly high bar to cross.

        Yes, there are a lot of crashes, and many fatalities, around the world every year. However, humans also put a *shitton* of miles in.

        Just the other day there was a Slashdot article in which Waymo posted details about their self-driving data: 6.1 million miles over 21 months. The USA as a whole puts that in every **minute**. And in those miles there were 47 collisions, which averages out to a little under 130,000 miles between crashes. Humans in the USA average close to 500,000 miles between crashes, and about 90 million miles between fatal crashes. Waymo has a long way to go. (A quick check of the arithmetic is sketched below.)

        Additionally, those figures already include all the drunks, morons, sleepy drivers, and shitbox operators. If we had better designed cities and better public transportation, that would make the bar to cross even higher, and provide more economical and equitable transportation than self driving cars ever will.
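        As a sanity check, here is the arithmetic from this comment in a few lines of Python. The only input not quoted above is the US annual vehicle-miles total, which is my own assumed round figure (~3.2 trillion):

          # Back-of-the-envelope check of the figures above. The US annual
          # total (~3.2 trillion vehicle-miles) is an assumed round number;
          # everything else comes from the comment.
          waymo_miles = 6.1e6                # miles driven over 21 months
          waymo_collisions = 47
          human_miles_per_crash = 500_000    # rough US average, as quoted

          waymo_miles_per_crash = waymo_miles / waymo_collisions
          print(f"Waymo: ~{waymo_miles_per_crash:,.0f} miles between collisions")  # ~129,787

          ratio = human_miles_per_crash / waymo_miles_per_crash
          print(f"Humans crash roughly {ratio:.1f}x less often")                   # ~3.9x

          us_annual_miles = 3.2e12
          print(f"US fleet: ~{us_annual_miles / (365 * 24 * 60) / 1e6:.1f}M miles per minute")  # ~6.1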

        • Just the other day there was a slashdot article, Waymo posted details about their self driving data. 6.1 Million miles, over 21 months.

          And how many more miles in simulators? Turns out it's tens of billions.

          https://www.ecosia.org/search?... [ecosia.org]

          • by N1AK ( 864906 )

            And how many more miles in simulators? Turns out it's tens of billions.

            There's nothing inherently wrong with simulators, but if they have tens of billions of miles in simulators and are still reporting crashes roughly four times more often than the average person in real life, it raises the question: is the simulator actually beneficial, or are they failing to apply what they learn from it to real driving?

      • by Jeremi ( 14640 ) on Sunday November 01, 2020 @01:19AM (#60670886) Homepage

        Tesla doesn't have to be perfect. "Better than the average human" is a really, really low bar.

        The average driver gets into one auto accident every 17.9 years [forbes.com]. If a Tesla (or any self-driving car) can match that level of reliability, it will be quite an accomplishment.
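        For scale, a rough conversion of that interval into miles, assuming a hypothetical ~13,500 miles driven per year (a commonly cited US average; the linked article gives only the 17.9-year figure):

          # Hypothetical conversion of "one accident every 17.9 years" to miles.
          # The 13,500 miles/year figure is an assumption, not from the article.
          years_per_accident = 17.9
          miles_per_year = 13_500
          print(f"~{years_per_accident * miles_per_year:,.0f} miles per accident")  # ~241,650

        Note that this differs from the ~479,000-mile figure quoted elsewhere in the thread; definitions of "accident" (insurance claims vs. reported crashes) vary, so such comparisons are loose.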

        • The average driver gets into one auto accident every 17.9 years. If a Tesla (or any self-driving car) can match that level of reliability, it will be quite an accomplishment.

          Tesla claims that its statistics show that accidents per million miles are fewer in cars with Autopilot.

          • Yeah, because the company constantly warns that you need to babysit the system at all times, and if you do get into an accident, it was clearly and entirely user error. Statistics from marketing departments are always BS.

        • by AmiMoJo ( 196126 )

          Waymo has already exceeded the ability of the average driver, although only in a limited area (Level 4 autonomy): https://tech.slashdot.org/stor... [slashdot.org]

          • by MrL0G1C ( 867445 )

            I'd take that with a pinch of salt. A simple deterrent to reporting a minor collision or near-miss where the driver has to take control would be to make the driver stop immediately and file a very long and detailed report... with an Etch A Sketch. It's in Waymo's interest to make self-driving look good and to make reporting awkward, with plausible deniability.

          • A lot of that is because the existing self-driving systems get to do all the "easy" driving. Meanwhile, human drivers have to be able to handle driving in any situation. If you had two human drivers, and one of them only drove in clear weather, on roads in good condition, on routes they were familiar with, while the other had to handle all the bad weather, poorly marked roads, construction zones, detours, unfamiliar areas, and so on, it would only be natural that the first driver would have fewer accidents.

        • by Gimric ( 110667 )

          Except that's not how legal liability works. A human driver doesn't get a pass just because they are generally a good driver, and better than the average driver.

          There are serious legal issues to be addressed before this technology goes into use.

      • by AmiMoJo ( 196126 )

        Tesla has to be way better than the average human because Tesla is liable for all the accidents its full self driving system will have. If it has 1 million self driving cars out there and it's only as good as the average human it's going to be covering a million people's worth of liability insurance, a million people's worth of injury pay-outs, a million people's worth of repairs etc.

        Also note that the kind of driving you describe may be common in the US, but in Europe and Japan the driving test is apparently much stricter.

    • by Tom ( 822 )

      Yes, and seriously, if you've done any work on autonomous driving at all, you know that this is how everyone expects it to go: Step a) autonomous cars drive like shit. Step b) they improve slightly and now drive like an average driver; people consider it unacceptable. Step c) they improve slightly and now drive like a reasonably good driver; people find it unacceptable. Step d) they improve slightly and now drive better than most human drivers; first reviews say that autonomous driving may be considered acceptable.

      • by mvdwege ( 243851 )

        So far, every autonomous-driving advocate I've heard has gone straight from step b) to "because our software is shit, lobby to adapt the infrastructure to it."

  • Idiots (Score:2, Insightful)

    People are idiots for accepting Tesla's deal. Basically, they are putting themselves on the line to 'intervene' with the 'intelligence' in time, or else accept all the ramifications when it screws up. That is a bad deal. Driving is expensive; you don't want to take risks with your record.
  • From what I have seen using the "autopilot" feature, I am not about to buy the Full Self-Driving feature for $9,400 (the price around here). I believe there was a half-price sale on the upgrade in September, but there were a lot of other things I'd rather buy. :D On the freeway the lane keeping works and the speed control sort of works, but I get a much smoother ride by driving myself.

  • But the data they collect will make it all worth it.
  • by Gravis Zero ( 934156 ) on Saturday October 31, 2020 @10:04PM (#60670582)

    It's clear that the self-driving is unfinished, but since this is a "beta test" I'm assuming that every time someone suddenly takes over, it's noted in the uploaded data. The result could very well be that these beta testers are providing the feedback needed to improve the neural network. It's disappointing that it's not everything people wanted it to be. Let's hope they are taking a sane approach that will rapidly improve the self-driving neural networks.

    • I'm assuming every time someone suddenly takes over that it's noted in the uploaded data.

      That's exactly what Tesla said they do in the Full Self Driving presentation they gave a year or two ago - any driver intervention is noted and used as a point to update the system. So the more people are using this, the faster it will improve - especially if they are paying close attention and making corrections when needed.
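      For illustration only, a minimal sketch of what such intervention-triggered logging could look like. Tesla has not published its telemetry schema, so every name and field below is hypothetical; the point is just the feedback loop described above: the driver takes over, the disagreement is recorded, and the event becomes a labeled training example.

        # Hypothetical sketch of intervention logging; not Tesla's actual schema.
        from dataclasses import dataclass, field
        import time

        @dataclass
        class InterventionEvent:
            timestamp: float
            trigger: str           # e.g. "steering_override", "brake_press"
            planned_action: str    # what the system intended to do
            driver_action: str     # what the driver actually did
            camera_frames: list = field(default_factory=list)  # recent frames

        upload_queue = []

        def record_intervention(trigger, planned, actual, frames=None):
            """Capture the planner/driver disagreement for later upload."""
            upload_queue.append(InterventionEvent(
                timestamp=time.time(),
                trigger=trigger,
                planned_action=planned,
                driver_action=actual,
                camera_frames=frames or [],
            ))

        record_intervention("steering_override", "continue_straight", "swerve_left")
        print(len(upload_queue))  # 1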

    • by AmiMoJo ( 196126 )

      The issue is that they are getting untrained members of the public to test this on public roads. It's clearly not ready for that and they should be using trained safety drivers with very careful monitoring to make sure they are paying attention.

      All they are doing is trying to shift liability for accidents onto fools who signed up to be crash test dummies in their beta programme.

      • All they are doing is trying to shift liability for accidents

        I doubt it's about liability; it's more about development/testing time. It's not the most ethical way of doing things, but if they are updating it weekly based on feedback, then it's definitely the fastest.

  • If you actually watch Brandon McGowen's videos, the FSD mostly does a great job, and the vast majority of people in the comments agree. It does occasionally have problems, but it'd be a lot weirder if it didn't. It's not like those videos are just one problem after another; 90% of the time the FSD does great.
  • by EmoryM ( 2726097 ) on Saturday October 31, 2020 @10:15PM (#60670598)
    Sounds like my Dad teaching me how to drive.
    • Sounds like my Dad teaching me how to drive.

      I assume you're describing things from his point of view?

    • Deee-Lite was on the radio and the alarm was going off?
    • Indeed. 40 years ago for me, and dad has been dead for 25.

      "Dammit, keep your foot on the gas in the corners!"

      So much fun learning to drive in a Porsche (356 to learn, then the 911 when I actually got my license - when dad died and they became mine I sold the 911, still have the 356)

  • by AlanObject ( 3603453 ) on Saturday October 31, 2020 @10:46PM (#60670652)

    Speaking for myself only: "It didn't do what I wanted! It's broken!" is a natural but useless reaction.

    What I want to know is what specifically failed.

    Of course I am not going to get that, but I would at least like to know, in each failure case, what happened: A) the 4-D model that the system constructed did not match reality, or B) the model was correct but the driver program chose the wrong thing to do. There is actually a case C), which is that the model was correct and the program was correct, but the hardware was not fast enough to execute the programmed logic in real time.

    You can't know how far off the product is from being release grade until you know at least this level of detail.

    Examples of the case-A type of failure are the most worrisome. This is where the argument about whether the car's sensors are adequate comes in. If the 4-D model turns out to be correct, then the sensor suite is up to the job. If it wasn't, then the sensors might be the problem, but it could also mean that the data-ingestion subsystem is insufficient.

    Examples of case B, I would think, would be easier to address, because simulations of the failure mode could be constructed and re-run until the logic was right. I have to wonder what regressions they might encounter; that would be a job all by itself.

    However we don't get these kinds of observations for these anecdotal incidents. In my view that makes these discussions pointless.
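    The A/B/C taxonomy above is concrete enough to sketch as a triage routine. A rough Python illustration follows; the boolean inputs are labels a post-incident review would have to assign by hand, and nothing here reflects Tesla's actual tooling.

      # Sketch of the A/B/C failure taxonomy; inputs are hypothetical
      # post-incident labels, not anything the car actually exposes.
      from enum import Enum

      class FailureClass(Enum):
          A_PERCEPTION = "4-D model did not match reality (sensors/ingestion)"
          B_PLANNING = "model was right, but the wrong action was chosen"
          C_LATENCY = "model and plan were right, hardware missed the deadline"
          NONE = "no failure"

      def triage(model_matched_reality: bool,
                 decision_was_correct: bool,
                 deadline_met: bool) -> FailureClass:
          if not model_matched_reality:
              return FailureClass.A_PERCEPTION
          if not decision_was_correct:
              return FailureClass.B_PLANNING
          if not deadline_met:
              return FailureClass.C_LATENCY
          return FailureClass.NONE

      # e.g. confusing a one-way street for a two-way street is a case-A failure:
      print(triage(False, True, True))  # FailureClass.A_PERCEPTION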

    • So, what would a Tesla do in a blizzard? Just sit there and not move at all thinking that it is enclosed in styrofoam? This is why most Minnesotans laugh at the idea of a Tesla.
    • by dargaud ( 518470 )

      What I want to know is what specifically failed.

      That's the problem with AI. Even a postmortem is hard to analyse. It can fail unexpectedly in any situation; it's not like a normal program that fails in extreme cases (huge inputs, etc.). Proofs of concept have shown that changing a single pixel in an image can invert the result...
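      A toy illustration of the single-pixel point, in the spirit of the published one-pixel-attack results. The "classifier" below is a deliberately fragile stand-in (a brightness threshold), not a real perception network; it only shows the mechanics of searching for one pixel whose change flips the output.

        # Toy one-pixel "attack" on a stand-in classifier. Real attacks
        # target neural networks; this only illustrates the mechanics.
        import numpy as np

        def toy_classifier(img):
            return int(img.mean() > 0.5)   # fragile stand-in for a real model

        rng = np.random.default_rng(0)
        img = rng.uniform(0.45, 0.55, size=(4, 4))  # deliberately near the boundary
        base = toy_classifier(img)

        def find_flip(image, base_label):
            for val in (0.0, 1.0):
                for i in range(4):
                    for j in range(4):
                        perturbed = image.copy()
                        perturbed[i, j] = val   # change exactly one pixel
                        if toy_classifier(perturbed) != base_label:
                            return i, j, val
            return None

        print("one-pixel flip:", find_flip(img, base))  # almost certainly finds one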

    • This comment is too rational for slashdot.

  • by Joe_Dragon ( 2206452 ) on Saturday October 31, 2020 @10:52PM (#60670660)

    The law needs to be worked on now, and setting laws too soon can lead to bad outcomes later. For example, you can get a DUI in an autonomous car EVEN if the only control is an E-stop button, since courts can rule that the E-stop button counts as being in control, or that just having the e-taxi app on your phone does.

    • Comment removed based on user account deletion
      • Re: (Score:3, Insightful)

        by apoc.famine ( 621563 )

        If you drive one drunk, you run a much greater chance of running someone over or being involved in a crash than you do when sober.

        But also a much greater chance of running someone over if you drive drunk without that system.

        Let's wait until the technology is actually good enough...

        It is. It's not perfect, but it's foolish to let the perfect be the enemy of the good.

        Tesla has a lot of flaws, but when you look at the real-world performance of this system, it's safer than your average human. Tesla won't get better if it's not running massive data-collection programs in the real world, in real-world driving conditions, with real drivers. That's what they're doing.

        The alternative is Waymo

  • by cliffjumper222 ( 229876 ) on Saturday October 31, 2020 @11:07PM (#60670686)

    It uses its turn signals...

    • by sheramil ( 921315 ) on Sunday November 01, 2020 @03:30AM (#60670994)

      Perhaps there should be some universally understood, externally visible indicator that a car is under software control, as opposed to under primate control (or feline, if your cat is named Toonces). I propose a strip of red LEDs across the front of the car that light up in a sweeping pattern, left to right and back again. Perhaps a humming noise to go with it. And when you engage software control the car should say "BY YOUR COMMAND IMPERIOUS LEADER".

  • by ndykman ( 659315 ) on Saturday October 31, 2020 @11:41PM (#60670730)

    Honestly, I have no idea what motivates some people. Hey, help work out some bugs. Worst case, you get injured or die.

    Also, I love how all the car people are constantly saying "pay attention, pay attention" while selling this feature as a way to do, well, the complete opposite.

    I'm fine with safety features designed to overcome inattention (collision avoidance, lane drift) as well to give you a bit of a rest when safe (low speed rush hour mode). Sit back and @#$%@% around on your phone because fines are just fines mode, err, not so much.

    And, in 2020, we figured out just how much commuting can be avoided by letting people work at home. I wonder if the demand for this will fall off after people realize that traffic doesn't have to be a nightmare for everybody, given flexible work and public transportation.

  • by zawarski ( 1381571 ) on Sunday November 01, 2020 @01:09AM (#60670872)
    This is about as necessary as going to Mars.
    • Agree that it's very important. Fewer accidents mean less trauma.

      Below data is from before the self driving update that's currently being rolled out.

      Average accident rates:

      US average: 1 accident per 479,000 miles driven.
      Tesla without Autopilot and without active safety features: 1 per 1,560,000 miles.
      Tesla without Autopilot engaged: 1 per 2,270,000 miles.
      Tesla with Autopilot engaged: 1 per 4,530,000 miles.

      I think the 2,270,000-mile line is the important one. It's street driving, when Autopilot isn't generally engaged and wh
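      Converting those miles-per-accident figures into accidents per million miles makes them easier to compare. The inputs are the numbers quoted above; keep in mind, per the earlier comments, that Autopilot miles skew toward easy highway driving:

        # Accidents per million miles, from the figures quoted above.
        rates = {
            "US average": 479_000,
            "Tesla, no Autopilot, no active safety": 1_560_000,
            "Tesla, Autopilot not engaged": 2_270_000,
            "Tesla, Autopilot engaged": 4_530_000,
        }
        for label, miles_per_accident in rates.items():
            print(f"{label}: {1e6 / miles_per_accident:.2f} per million miles")
        # US average ~2.09 vs. Autopilot engaged ~0.22, a confounded comparison.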

  • Comment removed based on user account deletion
  • "Full Self Driving" (Score:4, Informative)

    by Roger Wilcox ( 776904 ) on Sunday November 01, 2020 @01:46AM (#60670918)

    I feel like there's a lesson here and Tesla has failed to learn it...

    Remember a few years back, when Tesla "Autopilot" first became a thing, and several people were killed when they took the name of that feature at face value, assuming it was actually a fully automatic driving mode for the car? When one driver slept on his commute to work, the car slammed into a median at highway speed, unceremoniously ending him.

    Now Tesla releases a package called "Full Self-Driving"? Another "automatic" AI-powered driving mode that needs to be constantly monitored by the driver, which makes it seem just as dubious a claim as "Autopilot" was? And the marketing buzzline is actually "Full Self-Driving"? Is anyone else raising their eyebrows over this?

    It sounds to me like Tesla must want to get sued for wrongful death, or maybe grievous bodily harm. Hopefully casualties will be limited to Tesla customers who are too dumb to read the instruction manual, but based on initial reports, the potential for harm to innocent bystanders is still quite significant.

    Tesla considers it to be beta software and says it's not intended for fully autonomous operation. Drivers are expected to keep their eyes on the road and hands on the wheel at all times.

    Calling this feature "Full Self-Driving" seems downright irresponsible. I wonder what they will call future iterations of this technology. "Actual Full Self-Driving For Real"? No really, we mean it this time! It won't hit parked cars or pedestrians, we promise!

  • Is Tesla putting their money where their mouth is and accepting legal liability for any errors their software makes? Or is it the usual software license where they disclaim any liability whatsoever? Because normal software licenses tell you not to use the software for life-and-death situations, and say that you can't sue even if it blows up.

  • The cars...have shown a pattern of coming to a full stop when entering a roundabout

    So it behaves exactly like American drivers do at roundabouts.

    Y'all need a trial by fire in Europe where people shoot right into car-sized gaps while moving at speed. Yielding is for impossible fits, nothing more.

  • Using this "Full Self-Driving" feature on a daily basis will give you an anxiety disorder or make an existing one worse.
