Transportation AI

Tesla's New 'Smart Summon' Feature Reportedly Crashes a Car Into a Garage (jalopnik.com) 131

Tesla owners who paid for "full self-driving capability" received a software update this week with a Smart Summon feature. In private parking lots, and always within line of sight, the Tesla will magically make its way to an awaiting owner. "Smart Summon can be stopped at any point by the owner removing their finger from a button on the phone app, at which point the car stops immediately..." reports Jalopnik.

But their article cites some critical tweets -- including one Twitter user who complained their Tesla "went forward and ran into the side of garage... Be forewarned... Enhanced summon isn't safe or production ready."

Jalopnik writes: Again, impressive tech, but I can get any 15-year-old with a learner's permit to ram a car into the side of a garage for a lot less money. I mean, it's cool that advanced AI can now drive into the side of a garage, I guess...

On the plus side, sure, it's great for impressing people and not getting wet in the rain or having to walk to your car, possibly with a bunch of heavy crap, but at the same time, when has it ever been okay to attempt to be "in control" of your car from potentially across a parking lot? There are plenty of cases where Smart Summon has worked just fine. And yes, people do stupid shit in parking lots every day. Tesla does specify that it's a Beta release, which is fine for most software, but does it make sense when that software is driving a full-sized car in a public space?

This is a tricky one. I'm pretty sure we'll see more Smart Summon issues and fender-benders because the world is messy and confusing.

The article also questions whether the Tesla will notice when it's driving the wrong way down a one-way parking lot lane -- since it appears to be doing just that in the test lot where Tesla filmed the Smart Summon introductory video.
  • by complete loony ( 663508 ) <Jeremy.Lakeman@nOSpaM.gmail.com> on Sunday September 29, 2019 @07:48PM (#59251200)
    This isn't the first time that Tesla has released a new feature for beta testing. If it isn't safe enough for everyone, when enabled all the time, then it isn't safe for anyone.
    • by sinij ( 911942 ) on Sunday September 29, 2019 @08:07PM (#59251236)
      Tesla is like most IT shops out of Silicon Valley: releasing half-baked products and patching bugs on live systems is a religion. I think they call it Agile or some such.
    • by TrekkieGod ( 627867 ) on Sunday September 29, 2019 @08:08PM (#59251238) Homepage Journal

      By that definition, cruise control should not have been implemented in cars until adaptive cruise control became ready. Accidents have happened because people enabled cruise control, turned their attention elsewhere, and didn't see the car in front of them braking.

      It's perfectly ok to enable this feature and put the onus on the owner to know when it needs to turn off because it's about to hit something. Cruise control has done that for decades, and it makes no difference if you're inside the car or out. If you're not in a position where you can clearly see what the car is doing and maintain control, then don't use it.

      • by sjames ( 1099 )

        I'm guessing the reaction time of the control for this is significantly longer than it takes to hit the brake in a car w/ cruise control. I'm certain that the 'driver's' ability to see potential problems is much greater when they're actually sitting in the car.

        • I'm guessing the reaction time of the control for this is significantly longer than it takes to hit the brake in a car w/ cruise control.

          Because you've trained with all your past driving experience until you reached this level, you react fast with your brake pedal.
          It's the first time you're using a remote control on your full-size car (as opposed to the RC models you had as a kid), so you're a lot less used to reacting in that situation.

          I'm certain that the 'driver's' ability to see potential problems is much greater when they're actually sitting in the car.

          Actually, no. In general you have a better view of the immediate surroundings for avoiding a fender bender *from the outside* than from the inside, where you can't even see the area right next to the car.

          • by sjames ( 1099 )

            In part. But the control mechanism is likely to be intrinsically slower. First, your phone is not at all a real-time system. So you take your finger off of the button. Eventually, the touch screen notices. Even later, the event system gets the notification from the kernel driver, decides it belongs to the app and posts the event. Eventually, the app gets its time slice and starts processing events. The release may not be the first event in the queue (if you have a pulse, your finger isn't dead still on the screen).
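            None of us can measure that chain from the outside, but a back-of-envelope sketch shows why the stages add up. Every number below is an illustrative assumption, not a measurement:

```python
# Rough latency budget for "finger leaves button" -> "car begins to stop".
# All stage timings are assumptions for illustration, not measured values.
stages_ms = {
    "touch controller scan interval":    16,   # assume a 60 Hz digitizer
    "kernel driver + event dispatch":     5,
    "app event queue / main loop":       30,   # app may be mid-render
    "phone -> server -> car round trip": 200,  # cellular, varies widely
    "car control loop picks it up":      20,
    "brakes begin to act":               50,
}

total_ms = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:35s} ~{ms:3d} ms")
print(f"{'total (order of magnitude)':35s} ~{total_ms:3d} ms")

# At Smart Summon's walking pace (~1.5 m/s), ~320 ms is roughly half a
# metre of travel before the stop command even reaches the brakes.
print(f"distance covered: ~{1.5 * total_ms / 1000:.2f} m")
```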

      • By that definition, cruise control should not have been implemented in cars until adaptive cruise control became ready. Accidents have happened because people enabled cruise control, turned their attention elsewhere, and didn't see the car in front of them braking.

        Holding a fixed speed doesn't relieve a person of the act of steering, either. This "summon" feature is completely hands-off on everything. That's the difference - there IS no way for a human to step in and take control. In fact, you watch it from a non-normal angle (which can mess with depth perception and visibility of other obstacles). You just hope that it hasn't built up enough speed to keep sliding forward even when you take your finger off the app.

        • by TrekkieGod ( 627867 ) on Sunday September 29, 2019 @09:05PM (#59251332) Homepage Journal

          This "summon" feature is completely hands-off on everything. That's the difference - there IS no way for a human to step in and take control.

          The human has to keep their finger on the summon button the entire time, as you yourself pointed out. They're in control. If the car is steering in a direction it shouldn't be, you take your finger off.

          You just hope that it hasn't built up enough speed to keep sliding forward even when you take your finger off the app.

          If Tesla allows the car to build up enough speed that it doesn't immediately stop when you take your finger off, I would agree it's on Tesla. The videos I watched in the article show the car moving pretty slow the entire time, though.

          • Comment removed based on user account deletion
            • It's useful in that it will let you get your car out of a space even when you can't get in the driver's door because some jackwagon has parked his dually next to it. Or to bring your car to the covered area in front of the hotel while it's raining. Of course it's useful! If it doesn't crash, that is.

              Tesla needs lidar. Refusing to add it is dumb.

            • You've made an assumption about how this works that's wrong. That is why you're confused.

              That said, it's a stupid gimmick even if you understand how it works.

      • by taiwanjohn ( 103839 ) on Sunday September 29, 2019 @08:50PM (#59251300)

        From what I've heard, the smart summon feature only works as long as you keep your finger on the "Come to me" button in your phone app. If you take your finger off the button, the car immediately stops. So the user would have to be pretty stupid to let this happen.
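        Nobody outside Tesla knows how that hold-to-run control is actually implemented, but the behaviour described matches the classic dead-man's-switch pattern. A minimal sketch, assuming the phone streams heartbeats while the button is held and the car stops on timeout (the class name, method names, and timeout here are all invented for illustration):

```python
# Minimal dead-man's-switch (hold-to-run) sketch -- NOT Tesla's protocol.
import time

HEARTBEAT_TIMEOUT_S = 0.25  # assumed: stop if no heartbeat for 250 ms

class SummonController:
    def __init__(self):
        self.last_heartbeat = 0.0
        self.moving = False

    def on_heartbeat(self):
        """Phone reports the button is still held down."""
        self.last_heartbeat = time.monotonic()
        self.moving = True

    def tick(self):
        """Runs at a fixed rate in the car's control loop."""
        if self.moving and time.monotonic() - self.last_heartbeat > HEARTBEAT_TIMEOUT_S:
            self.moving = False
            self.stop()

    def stop(self):
        print("heartbeat lost -> commanding immediate stop")

ctrl = SummonController()
ctrl.on_heartbeat()   # finger on button
time.sleep(0.3)       # finger lifted -- or the network dropped
ctrl.tick()           # -> commanding immediate stop
```

        A nice property of the timeout form is that a dropped network connection has the same effect as lifting your finger: the car stops.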

        • Lusers are stupid (Score:5, Insightful)

          by Latent Heat ( 558884 ) on Sunday September 29, 2019 @11:06PM (#59251532)

          Yup, the software failed because a luser did something stupid. This is in no way the fault of the software developer.

        • by AmiMoJo ( 196126 )

          Consider how much effort has gone into increasing the driver's ability to see obstacles and dangers around the car. Three mirrors, maybe some cameras, a nice large windscreen and more windows at the side. Some cars tilt the wing mirrors down for you, and some have ultrasonic sensors too.

          Even with all that, accidents happen.

          Now the same driver is expected to have the same level of situational awareness and visibility when stood a couple of hundred metres away, possibly with cars between them and their own vehicle.

          • The top speed of a Tesla in smart summon mode is roughly a brisk walk. The obstacle in this case was a garage. Frankly, I'm skeptical of the whole story. I'll be curious to see if Tesla Inc. has more data from the car's sensor suite that can shed more light on this situation.

        • by AmiMoJo ( 196126 )

          Smart Summon gives you a "map" of the area: https://teslamotorsclub.com/tm... [teslamotorsclub.com]

          Unfortunately it's worse than useless; it's actually misleading. The "map" is just what the ultrasonics can see, and as we all know, those things are not that reliable. They don't see objects close to the ground or too far above it.

          The phone screen should just say "look at the car, you idiot!", not give people information that is probably wrong anyway and which they will try to rely on rather than walking to where they can see the car.

      • by Waccoon ( 1186667 ) on Sunday September 29, 2019 @11:14PM (#59251544)

        By that definition, cruise control should not have been implemented in cars until adaptive cruise control became ready. Accidents have happened because people enabled cruise control, turned their attention elsewhere, and didn't see the car in front of them braking.

        Cruise control was advertised as exactly what it is -- it holds the speed of the vehicle until you turn it off. It does one thing and makes no real decisions on its own, making it very predictable. Autopilot has been advertised as an intelligent system capable of complex decisions, and is practically a lifesaving technology that makes the car think better than you do, and as a bonus will wash your socks, too. It's hard to supervise a system like that, even with your finger on the killswitch.

        Marketing has gotten out of control, which is why this dogfood is allowed on the market with so little testing.

        • by AmiMoJo ( 196126 ) on Monday September 30, 2019 @03:01AM (#59251860) Homepage Journal

          Right, and this isn't the only day 1 crash. This guy crashed into another car in a parking lot:

          https://twitter.com/DavidFe838... [twitter.com]

          He's asking who is responsible, him or Tesla. Obviously it's him, but the fact that he even has to ask tells you all you need to know: people think it's a "smart" self driving feature and that Tesla will prevent accidents, when in fact they are driving the car and fully responsible for anything that happens as a result.

          • by samwichse ( 1056268 ) on Monday September 30, 2019 @09:38AM (#59252460)

            Wait, how was the Tesla supposed to prevent that from happening?

            His car had the right of way and the backing car just backed right out into it. Clearly the fault of the other driver.

            • by AmiMoJo ( 196126 )

              I tend to agree, but I'm also sure the other driver will argue that the Tesla should have stopped when it saw him start to move (as a human I would have) and go for 50/50 liability.

              • by bigpat ( 158134 )

                I tend to agree, but I'm also sure the other driver will argue that the Tesla should have stopped when it saw him start to move (as a human I would have) and go for 50/50 liability.

                Are you watching the same video I am? The problem was that the Tesla did stop for one car backing out, and then the other car backed up and hit it from the side. I've seen human drivers do this dozens of times in parking lots over the years.

                Stopping and then getting hit by someone backing out of a parking space... clearly the other driver's fault.

                • by AmiMoJo ( 196126 )

                  Maybe not... The Twitter post is the first video, and then he followed it up with a second video of the same incident, but from the dashcam.

                  On the dashcam one you can see that the Tesla pulls out and at the same time another car starts reversing too. It's not the one near the camera in the first video, and in the first video you only see it move for a moment before the guy starts running.

      • By that definition, cruise control should not have been implemented in cars until adaptive cruise control became ready.

        The big difference is that with cruise control there are a few practical seconds in which the driver has a good view of what is happening and can take over in plenty of time. With autonomous driving where the "driver" is outside of the car, the driver's view is not what most people are used to when driving, the time window for assuming control is very small, and it's not clear what assuming control even means. Simply hard-braking the car in the middle of the parking lot is not necessarily the safe thing to do.

        • The driver can't be held responsible.. ever.. if they aren't in the car controlling it.
          • The driver can't be held responsible.. ever.. if they aren't in the car controlling it.

            A person can definitely be responsible for a car accident even if they aren't behind the wheel. For example, if a person fails to secure a car, leaves the car, and the car rolls or shifts into gear and hits something, that person is liable. As another example, a drunk driver who isn't directly controlling the car is liable, as is potentially the person giving him too much to drink and the person who gives him the keys to the car. As another example, a parent who knowingly allows a minor or unfit person to drive is liable as well.

            • by dgatwood ( 11270 )

              A person can definitely be responsible for a car accident even if they aren't behind the wheel. For example, if a person fails to secure a car, leaves the car, and the car rolls or shifts into gear and hits something, that person is liable. As another example, a drunk driver who isn't directly controlling the car is liable, as is potentially the person giving him too much to drink and the person who gives him the keys to the car. As another example, a parent who knowingly allows a minor or unfit person to drive is liable as well.

              • I imagine the liability for the Tesla summon feature is similar to driving a car that malfunctions. The fact that I didn't design or manufacture the car or that the malfunction is not directly my fault is irrelevant. I made the decision to use the car, so I'm liable for the consequences of any damage resulting from that decision. I can sue the manufacturer if I wish, but the direct liability is still mine.

                • by dgatwood ( 11270 )

                  For a strict two-party lawsuit, that's certainly true. However, implicit in that is the assumption that the victim does not realize the problem and name the manufacturer as a codefendant up front. If that happens, a judge or jury would have to determine a percentage of fault, and it's anybody's guess how that would go.

      • by bigpat ( 158134 )

        Heck, by that definition all cars should be taken off the road. I have witnessed a non-Tesla crash in the last week, and I have seen dozens of non-Tesla crashes over my lifetime due to drivers' mistakes and/or inclement weather... heck, the last time I rode a horse it ran me into a tree limb.

        Trains and planes are relatively safe, but still have their share of deaths.

        We need a better threshold or criteria than 100% safe, especially when we are talking about collision avoidance technology that has the potential to save lives.

    • And when someone dies, who will do the time?

      • Re: (Score:2, Interesting)

        And when someone dies, who will do the time?

        Nobody. Jail time is for people who are physically dangerous, not merely incompetent.

        Tesla, like all other car manufacturers, has already killed people due to defects.

        If we waited for perfection, nothing would ever happen.

        • OK, but what're your thoughts on the Uber "self driving" car that ran over that bicyclist? Where is the line for you?
          • by ShanghaiBill ( 739463 ) on Sunday September 29, 2019 @11:10PM (#59251534)

            OK, but what're your thoughts on the Uber "self driving" car that ran over that bicyclist? Where is the line for you?

            It was a pedestrian (she was pushing the bicycle, not riding it), and the self driving software had braking disabled.

            So a human was responsible for braking. The human was distracted (looking at her cellphone) and didn't brake in time.

            The obvious lesson is that we need to get more autonomous cars on the road so we can get unreliable humans out of the loop.

            Autonomous cars don't have to be perfect, they just have to be an improvement over humans. That is not a high bar.

            • That is not a high bar.

              SDC advocates like to say that, but the truth is, better than humans is an astonishingly high bar to clear. Yes, a lot of people get injured or killed in motor vehicle crashes, but we as a society also drive an absolute shitload, far more than is sensible.

              Humans have one of the most sophisticated visual processing systems in existence, coupled with the capability to improvise and make sense of new situations. We humans are actually pretty goddamn good.

                Humans have one of the most sophisticated visual processing systems in existence

                That doesn't help if the human is looking at his cellphone.

                We humans are actually pretty goddamn good.

                ... when we are paying attention.

                Even the best human needs about 1.5 seconds between seeing a problem and applying the brakes.

                At 60 MPH, that is 132 feet, before the brake starts to have an effect.

                A computer can respond in 10 ms. At 60 MPH, that is 11 inches.

                In most traffic situations, response time is far more important than deep reasoning.
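                The arithmetic above checks out; here's a quick verification, using the speeds and reaction times from the comment:

```python
# Distance covered during reaction time, per the figures above.
FPS_PER_MPH = 5280 / 3600  # 1 mph = ~1.467 ft/s

def reaction_distance_ft(speed_mph, reaction_s):
    return speed_mph * FPS_PER_MPH * reaction_s

print(reaction_distance_ft(60, 1.5))         # human, 1.5 s    -> 132.0 ft
print(reaction_distance_ft(60, 0.010) * 12)  # computer, 10 ms -> ~10.6 in
```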

                • Sure, so when all humans start suddenly closing their eyes at the wheel, self-driving will look good. Is that the case you're going for?
                  • https://www.cdc.gov/features/d... [cdc.gov]

                    CDC estimates 6000 fatal crashes from falling asleep at the wheel
                    4% of surveyed adults had fallen asleep at the wheel in the last 30 days

                    Humans suddenly closing their eyes at the wheel.

                    NHTSA: 3166 fatal crashes from distracted driving in 2017

                    Humans not even looking where they're going like running toddlers.

              • Humans drive roughly 3 trillion miles a year in the US, about 450K miles per accident.
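                As a sanity check on that figure, using rough public ballpark numbers (both inputs are assumptions, not data from this thread):

```python
# Ballpark check of "miles per accident" in the US.
annual_vmt_miles = 3.2e12          # ~3.2 trillion vehicle-miles per year (rough)
reported_crashes_per_year = 6.7e6  # ~6.7 million police-reported crashes (rough)

print(f"~{annual_vmt_miles / reported_crashes_per_year:,.0f} miles per reported crash")
# -> roughly 480,000 miles per crash, the same order as the 450K figure above
```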
            • Hey thanks for responding. I realized after posting that my inquiry sounded aggressive, but I intended it honestly. Thanks
            • by tazan ( 652775 )
              The problem with that logic is that it's never going to happen. All the advancements that make self-driving cars possible can also be added to a normal car to improve the driving of humans.
          • Failings on Uber's side:

            1. Speeding
            2. Driver not watching the road
            3. Uber had disabled Volvo's emergency braking system
            4. Uber's vehicle failed to take avoiding action

            Failings of the pedestrian:

            1. Not crossing the road at a designated place, i.e. jaywalking
            2. Not monitoring the oncoming vehicle as she crossed the road lanes; she got about 90% of the way across before impact.

            Had any of these points not occurred, the accident would have been unlikely to happen.

            The accident was not a single point of failure.

    • Re: (Score:2, Insightful)

      by hambone142 ( 2551854 )

      It blows me away that governments actually allow self driving cars with unproven software to be on the roads with the public.

      We're so concerned about drunk driving yet we have zero proof that cars like this (and others) are compatible with all of our roadways and traffic conditions.

      It's rather irresponsible to allow them to mix with the public.

      • by ShanghaiBill ( 739463 ) on Sunday September 29, 2019 @09:07PM (#59251334)

        It blows me away that governments actually allow self driving cars with unproven software to be on the roads with the public.

        That's nothing. They also allow humans to drive, despite over 100 deaths per day in America.

        Motor vehicle fatality rate in US by year [wikipedia.org]

        • This is the reality we should measure against. If AI causes fewer deaths, even if not perfect, we should use it.

          The only downside is lawsuits slowing introduction of safer tech (assuming it is safer.)

          • by sinij ( 911942 )

            This is the reality we should measure against. If AI causes fewer deaths, even if not perfect, we should use it.

            I have a concern with "should use it". An option to use it on public roads if your car has it is not the same "should use it" as mandatory equipment on all new cars by 2025.

          • Yeah, but you're forgetting the just-world hypothesis. If a human driver causes an accident and kills themselves, well, they clearly deserved it.

            All AI crashes will be tragedies.

        • Humans' driving abilities are tested; they have to show knowledge of road laws and demonstrate an ability to control the car to a certified state official before being permitted to drive on the roads. Has Tesla's latest software received its learner's permit yet?

    • Because you took the time to look into the alleged report of the car crashing into the side of the garage, and despite a near-unanimous opinion that it was fake, you went on to blame Tesla anyway.

    • If it isn't safe enough for everyone, when enabled all the time, then it isn't safe for anyone.

      I know. We need to abolish all cars.


  • No video = fake (Score:5, Informative)

    by Anonymous Coward on Sunday September 29, 2019 @08:05PM (#59251230)

    How does fake shit like this get on the front page?
    The guy posted a single picture of a car. Where is the video? You do realize that the car has video running while driving, and the USB drive always has the last hour from 3 cameras (maybe 4 cameras with the latest software version). Any reasonable Tesla owner would save the last clip or eject the drive and copy out the TeslaCam folder.

    • Exactly. Tesla has black-box logging up the wazoo. Far from proving guilt, the vast majority of the time it is the only thing saving the company from massive numbers of fraudulent lawsuits.

      In this case, let's wait and see before possibly serving someone's special interests.

  • Fake? (Score:5, Insightful)

    by Artem S. Tashkinov ( 764309 ) on Sunday September 29, 2019 @08:29PM (#59251270) Homepage
    The referenced Twitter post looks like a complete fake and a lie. "Car went forward and ran into the side of garage" - now look at the picture [twimg.com]. Until we see a video of the event there's nothing to really talk about.
    • Unverified tweets from random people are now headline news. Get with the times, please. It's just editors doing a smart summon of clicks.

    • by AmiMoJo ( 196126 )

      Here's a video of a different Smart Summon crash, this time in a parking lot: https://twitter.com/DavidFe838... [twitter.com]

      If you scroll down there is a dashcam video from the Tesla's point of view too. I also saw a forum post where someone had scraped their wing mirror on the wall using Summon.

      See how the guy on Twitter is asking who is responsible for that accident, him or Tesla? The marketing is confusing, and the app doesn't clearly warn users that they are still driving the car and need to act that way, not relying on the car to avoid accidents for them.

      • AmiMoJo pointed out:

        Here's a video of a different Smart Summon crash, this time in a parking lot: https://twitter.com/DavidFe838... [twitter.com]

        What are you trying to say here?

        Both videos make it absolutely clear that the human driver in the other car pulled out of his parking space and ran into the Tesla's quarter panel. If the Tesla owner had been behind the wheel and fully situation-aware, the crash still would have happened exactly as it did in driverless Summon mode.

        You can't blame software for human error on the part of the driver who caused the crash - unless you're referring to the Instagram feed the idiot who drove into the Tesla was probably looking at.

        • by AmiMoJo ( 196126 )

          Don't you think that the Tesla could have avoided that accident? From the dashcam view it seems like there were a few seconds where it could have reacted by stopping or using the horn. I certainly would have if I had been driving.

          In the UK incidents like that are usually 50/50 liability. Either party could have reasonably avoided the accident so both end up being liable. Insurance companies love 50/50 because it keeps claims costs down.

          • by dgatwood ( 11270 )

            Don't you think that the Tesla could have avoided that accident? From the dashcam view it seems like there were a few seconds where it could have reacted by stopping or using the horn. I certainly would have if I had been driving.

            This. Smart Horn should be the very next feature they add to Smart Summon.

      • Looks just like any other parking lot crash when humans are in the car.

  • A thinking person would realize Smart Summon wasn't the cause of this. Yes, Smart Summon is new and in beta, but the functionality is far from new. Smart Summon relies on many other Tesla features, such as the cameras, the radar, and the 360-degree ultrasonic sensors. All of these features have been in use for years and have successfully avoided an unknown number of incidents. All Smart Summon does is utilize these features to navigate a parking lot or a driveway. My guess is that this guy put his car in drive himself.
    • A thinking person would realize that Tesla already has problems with all those technologies you listed, such as running into a stopped fire engine, and not blindly assume that said features are problem-free in a parking lot.

  • Summon (Score:5, Funny)

    by PPH ( 736903 ) on Sunday September 29, 2019 @08:34PM (#59251276)

    Candles, a pentagram and a goat should be involved somehow.

  • Too soon. (Score:4, Insightful)

    by bjwest ( 14070 ) on Sunday September 29, 2019 @08:47PM (#59251296)
    We're not even to the point where we can trust them not to crash on a freeway; there's no way we're at the point where we can have them maneuvering around confined spaces and congested roads on their own. I think we're five to ten years out from that level yet.
    • I guess we'd better not let humans behind the wheel; we can't trust them either. Way too many accidents are caused by humans.

    • Going at a snail's pace in a parking lot, yielding to everything that moves, is probably safer than driving at freeway speeds and maybe not recognizing a barrier.

    • by AmiMoJo ( 196126 )

      The only reason they are doing this in parking lots and private garages is that it avoids the legal problems of beta testing on public roads.

      Tesla seem desperate to get "full self driving" working. Not only have they already been selling it since 2016 and are looking at lawsuits if they don't deliver soon, but they seem to be pinning their hopes on a robotaxi service being a major source of income by next year.

      That's right, Musk announced 2020 for full self-driving robotaxis. Apparently Tesla owners will be able to add their own cars to the fleet.

  • by AlanObject ( 3603453 ) on Sunday September 29, 2019 @10:15PM (#59251468)

    Speaking as a Tesla fanboi I really don't get this.

    It would seem to me that before you release any software that moves the car you absolutely have to get the collision avoidance layer working first.

    The car is moving slowly and could stop within a few centimeters. The car is loaded with sensors in every direction. Proximity sensors are the easiest sensors to implement. It is not like they have to identify the object -- just that it is there. There should be some daemon/task somewhere that says "hey we are about to hit something" so we STOP NOW. And we don't go that way again until that thing isn't there anymore!
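    The watchdog being described here is easy to sketch. The sensor interface and threshold below are hypothetical, and this is the general idea rather than anything Tesla actually ships:

```python
# Sketch of the "STOP NOW" proximity daemon described above.
# Sensor API and threshold are hypothetical; the point is that the check
# runs independently of path planning and can always veto motion.
STOP_DISTANCE_M = 0.3   # assumed minimum clearance before forcing a stop

def proximity_watchdog(read_min_clearance_m, command_stop, blocked_directions):
    """One iteration of a high-priority loop.

    read_min_clearance_m() -> (distance_m, direction) of the nearest echo
    command_stop()         -> overrides the planner and halts the car
    blocked_directions     -> set the planner must route around until clear
    """
    distance_m, direction = read_min_clearance_m()
    if distance_m < STOP_DISTANCE_M:
        command_stop()                     # stop first...
        blocked_directions.add(direction)  # ...and don't go that way again

# Toy usage with stubbed sensors:
blocked = set()
proximity_watchdog(lambda: (0.1, "front-left"), lambda: print("STOP NOW"), blocked)
print(blocked)  # {'front-left'}
```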

    There isn't some task that does this? Or if there is, was it shut off to give CPU cycles to the Summon feature? Or, if there is, is it just not bulletproof?

    I'm ordering a Model 3 within the next month or two. This feature isn't a deal breaker for me since I don't have to use it. But it still is disappointing that they would handle a tech rollout like this.

    • by novakyu ( 636495 )

      Speaking as someone who intensely dislikes Elon Musk, I do have to go with the AC who guessed this might be fake [slashdot.org]. How did the dent get between the front wheel and the front door?

    • by AmiMoJo ( 196126 )

      Tesla can't do collision avoidance; it's impossible with their current hardware.

      Their cameras don't point down far enough to see obstacles very close to the car. For that they have ultrasonic sensors, but as anyone who has used them knows, they are not that reliable. They also can't see things like kerbs, because those are too low.

      The car needs to get close to other objects. When it parks or comes out of a space it will be close to adjacent vehicles, and close to the wall in a garage. Many parking structures have exactly those kinds of low, close obstacles.

  • And cars are sometimes depressed.

  • I remember seeing this kind of thing in 'Knight Rider' in the 1980s. The car could be summoned by talking to it through a wristwatch.

    That was a kids' show, though; reality is always so much more awkward.

    • Not having bad language or nudity doesn't make something a "kids show". Knight Rider is suitable for all ages.
      • The absence of bad language in anything involving cars makes a show too unrealistic to be a serious show.

  • Summon (and presumably auto-park) are going to have obvious problems in car parks that have any of these things: other people, other cars, one-way systems, multi-level or underground sections where GPS is unavailable, barriers, dead ends, roundabouts, stop/yield signs, angled parking spaces, kerbs, surface works, taped-off areas/cones/bollards, pedestrian crossings, etc.

    No wonder Tesla are weaseling by saying Summon requires you to be able to see the car to use it. Presumably that's so you can frantically release the button when it heads for something.

  • "The article also questions whether the Tesla will notice when it's driving the wrong way down a one-way parking lot lane -- since it appears to be doing just that in the test lot where Tesla filmed the Smart Summon introductory video."

    So... it drives just like half the jerks at the grocery. Nothing changes.

  • by King_TJ ( 85913 ) on Monday September 30, 2019 @08:45AM (#59252308) Journal

    None of this surprises me a bit; it's par for the course with these vehicles.

    I've owned a Model S and now a Model X, and highly recommend them to anyone interested in an electric car with good range and performance, a good charging network for road trips, and lots of nice computerized features.

    But all things "autopilot" on these vehicles have never really worked better than "most of the time, within certain parameters". For example? Every time I get caught in a big rainstorm, the car's ability to steer itself within a lane turns off, because it stops seeing the lines on the road ahead. The older Autopilot version 1 in my Tesla S (the MobilEye stuff, now owned by Intel) used to try to read speed limit signs with OCR, but would regularly misinterpret a "Route 79" sign near my house as a 79 MPH speed limit (when it's really a 35 MPH zone there).

    Even before this new "smart summon", the basic forward/backward summon with the key fob would occasionally mess up when backing into or out of my garage. One of the sensors would mistakenly decide it needed to turn the wheels to avoid a non-existent obstacle, and if I didn't stop it from moving, it was going to run into the corner of my garage instead of continuing straight in or out.

    To be fair? I don't really see any of the competitors offering anything comparable that works a lot better..... But Tesla owners SHOULD realize that all of these autopilot features are only to be used under careful guidance and when you can immediately cancel them as needed. I'm glad Tesla pushes the envelope this way, because I think they'd never get the feedback from many owners regularly trying to use them in all sorts of conditions if they just tested in-house and refused to release something until it was deemed perfect. But yeah - if you don't want to risk your own car, essentially helping them test and further develop this stuff? You're better off buying the car without the autopilot and saving yourself $6-7K. You can order ANY Tesla without it enabled in the software, even though the hardware will still be physically present on the car.

    • If they should be used under such careful guidance, why add a feature that lets the car move without ensuring the driver is in the one place the car is designed to be safely operated from: the driver's seat?
    • Every time I get caught in a big rainstorm, the car's ability to steer itself within a lane turns off because it stops seeing the lines on the road ahead.

      Every human in my state has that problem too. My state is unusually good at road construction and maintenance, but they do have one major problem. They're still using a paint formula from the 1970s to paint their lane markers. They all vanish in the rain, no matter who (or what) is driving.

  • Tesla orders more rope for the 'people who want to hang themselves' window.
  • How many people bother with obeying one-way markings in parking lots? I know one place where it seems that fewer than half the cars are going in the marked direction. If ignoring the markings makes the drive shorter or quicker, people ignore markings.
  • Named after the Lone Ranger's horse.

    Figured we'd get there. Whistle and the car comes to you.

  • I hope the insurance company cranks up the premiums for Tesla owners to cover this liability.

  • I thought this feature was for parking lots.... Was he just so excited he couldn't wait to use it and took the car outside to do this?

    Otherwise, what's next? Summoning it to the kitchen?

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...