AI Transportation

Consumer Reports: Tesla's New Automatic Lane-Changing Is Much Worse Than a Human Driver (consumerreports.org) 115

"Tesla's updated Navigate on Autopilot software now lets some drivers choose whether the car can automatically change lanes without his or her input," writes Consumer Reports -- before complaining that the feature "doesn't work very well and could create safety risks for drivers."

An anonymous reader quotes their report: In practice, we found that the new Navigate on Autopilot lane-changing feature lagged far behind a human driver's skills. The feature cut off cars without leaving enough space, and even passed other cars in ways that violate state laws, according to several law enforcement representatives CR interviewed for this report. As a result, the driver often had to prevent the system from making poor decisions. "The system's role should be to help the driver, but the way this technology is deployed, it's the other way around," says Jake Fisher, Consumer Reports' senior director of auto testing. "It's incredibly nearsighted. It doesn't appear to react to brake lights or turn signals, it can't anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it...."

Multiple testers reported that the Tesla often changed lanes in ways that a safe human driver wouldn't -- cutting too closely in front of other cars, and passing on the right. An area of particular concern is Tesla's claim that the vehicle's three rearward-facing cameras can detect fast-approaching objects from the rear better than the average driver can. Our testers found the opposite to be true in practice. "The system has trouble responding to vehicles that approach quickly from behind," Fisher says. "Because of this, the system will often cut off a vehicle that is going at a much faster speed, since it doesn't seem to sense the oncoming car until it's relatively close."

Fisher says merging into traffic is another problem. "It is reluctant to merge in heavy traffic, but when it does, it often immediately applies the brakes to create space behind the follow car," he says, "and this can be a rude surprise to the vehicle you cut off... This isn't a convenience at all. Monitoring the system is much harder than just changing lanes yourself."

In the article David Friedman, vice president of advocacy at Consumer Reports, complains that Tesla "is showing what not to do on the path toward self-driving cars: release increasingly automated driving systems that aren't vetted properly."
  • Don't worry (Score:4, Interesting)

    by AmiMoJo ( 196126 ) on Saturday May 25, 2019 @04:36PM (#58654574) Homepage Journal

    Next year Tesla will be launching it's robotaxi service, I'm sure all the bugs will be fixed by then!

    Or were they planning on having it drive like a typical taxi driver?

    • by Anonymous Coward

      Oh no, are we starting to smell common sense in the use of things called autopilot? Let me guess. Lane changing involves monitoring standard vehicle behavior so as to merge smoothly. How strange. What? Some 19-line Python/cucumber/R script can't get the job done?

    • by Anonymous Coward

      hopefully it will launch an auto-apostrophe service for people like you who can't tell its from it is

    • by Anonymous Coward

      Tesla will be in chapter 7 by this time next year

      Demand is way down. The mythical $35K Model 3 is dead and gone. Existing customers are furious about quality control issues and repair backlogs that often take your vehicle out of service for MONTHS.

      Tesla is a shit show, and the tweeter in chief has no one to blame but himself.

      Bankruptcy secured (takes extra long drag on joint)

      • My model 3 is working perfectly, and is a pleasure to drive.
        I know 3 other people who have one and they are happy too.
        And about once every 40 days or so there's a software upgrade and the car has a few more features, a bit more power, and the autopilot is a little bit better.
        I am not a car person but it's great.
        In the six months that I've had it now, the autopilot has gone from marginal to hundreds of miles without the need to intervene, which is cool.

    • Or were they planning on having it drive like a typical taxi driver?

      And whoever posted this was duping like a typical Slashdot editor.

    • Next year Tesla will be launching it's robotaxi service, I'm sure all the bugs will be fixed by then!

      I bet it'll be called ElonCab.

  • by Anonymous Coward
    lol
  • by JoeyRox ( 2711699 ) on Saturday May 25, 2019 @04:45PM (#58654606)
    The feature cut off cars without leaving enough space, and even passed other cars in ways that violate state laws, according to several law enforcement representatives CR interviewed for this report.

    That sounds on par with human skills to me.
    • by Anonymous Coward

      > That sounds on par with human skills to me.

      The description made me say "It drives like an asshole." It cuts into gaps that are too small, then brake-checks the car it cut off. That's not an average driver. That's not the "follow every law exactly" driver we usually worry about when self driving pops up for discussion. That's the driver that treats traffic as zero sum and makes it worse.

    • It's supposed to be a safer human, not a normal one. Definitely not a normal one.

      • by Anonymous Coward

        It doesn't shoot people, or honk uncontrollably. Does that count?

      • by pnutjam ( 523990 )
        Most people don't realize that the road can actually stretch, if you pull into a space too small. It makes a "honk" sound when the road stretches.
    • by Luckyo ( 1726890 )

      That might actually be a problem. Whatever AI they're training, they're probably training it in urban environments where being aggressive is required to get anywhere.

      It's like those "twitter AIs" that were trained on twitter users' responses. They got just as bad as your average twitter mob member. So it's not that AI is bad at learning. It's that the AI learned from bad actors: human drivers in urban environments.

  • by Anonymous Coward

    Where is Rei to explain how patch #2376 will fix these issues and the follow-up patch will bring full self driving?

    • by Anonymous Coward

      Jesus guys, let her rest in peace. Bad enough she'd be a victim of the faulty self-driving system in her Tesla, worse that you'd continue to make fun of her weeks afterwards.

  • Future Conflict (Score:3, Interesting)

    by Jim Sadler ( 5963822 ) on Saturday May 25, 2019 @05:01PM (#58654648)
    Automated cars and trucks will become much more able than human drivers, and that could create chaos. For example, suppose you want to shift from the middle lane to the fast lane and your vehicle knows it can do it with a full six inches of clearance, and all of a sudden you have a car only six inches in front of you. It could get to the point at which you know that the computers are super trustworthy, but can a human mind ride in such a vehicle without wearing a blindfold? Seeing heavy objects at speed coming so close to you will stir up some very primitive emotions. Also imagine the effects that this sort of thing would have on a motorcyclist. The fear of getting wiped out may be overwhelming. I am all for autonomous cars and trucks, but they will have strange effects that very few people are taking into account.
    • by 93 Escort Wagon ( 326346 ) on Saturday May 25, 2019 @05:47PM (#58654848)

      For example, suppose you want shift from the middle lane to the fast lane and your vehicle knows it can do it with a full six inches of clearance and all of a sudden you have a car only six inches in front of you.

      That would be a very badly programmed vehicle, given it's only considering a scenario where nothing goes wrong. Six inches isn't even remotely enough stopping distance, even for a computer-controlled car.

      • It's alarmingly common in real traffic. Also, the "fast lane" may not be travelling fast at all, or its speed may be very close to that of the middle lane. Drivers underestimate the relative speed of the vehicles in the fast lane quite frequently, especially when the vehicle in the "fast lane" is in the blind spot of the vehicle in the middle lane. The ideal of anticipating potential risks and leaving safety margins to accommodate them is a goal of many driving courses, but violated in actual traffic by man

      • Six inches isn't even remotely enough stopping distance, even for a computer-controlled car.

        The term you're looking for is following distance.

        • Re: (Score:2, Informative)

          by Anonymous Coward

          There is nothing wrong with saying stopping distance here, since we're talking about the danger of slamming into the back of another vehicle. You do want your stopping distance to be shorter than your following distance by at least a smidge, unless you like brain damage.
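
          To put rough numbers on the stopping-distance vs. following-distance point above, here is a minimal back-of-the-envelope sketch in Python; the speed, reaction times, and deceleration are illustrative assumptions, not figures from the article or this thread:

          # Rough check: distance covered while coming to a stop at highway speed,
          # compared with the ~6 inch gap floated elsewhere in this discussion.
          def stopping_distance_m(speed_mps, reaction_s, decel_mps2):
              """Reaction-time travel plus braking distance to a full stop."""
              return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

          MPH_TO_MPS = 0.44704
          speed = 70 * MPH_TO_MPS  # ~31.3 m/s

          # Assumed values: an attentive human (~1.5 s reaction) vs. a computer (~0.2 s),
          # both braking at ~7 m/s^2 on dry pavement.
          print(f"human:      {stopping_distance_m(speed, 1.5, 7.0):6.1f} m")  # ~116.9 m
          print(f"computer:   {stopping_distance_m(speed, 0.2, 7.0):6.1f} m")  # ~76.2 m
          print(f"6 inch gap: {6 * 0.0254:6.2f} m")                            # 0.15 m

          # Even if the car ahead brakes just as hard, the follower still travels its
          # reaction-time distance (several metres even at 0.2 s) before slowing at all,
          # so a 0.15 m gap leaves essentially no margin.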

    • Re:Future Conflict (Score:5, Interesting)

      by Dutch Gun ( 899105 ) on Saturday May 25, 2019 @05:49PM (#58654856)

      I think companies will simply not program vehicles to do things like you suggested. In fact, there are already counter-examples of companies making deliberate concessions to natural human tendencies. As one example, engineers at Waymo noticed that people tend to get somewhat nervous about being next to very large vehicles, and instinctively want to put a bit more room between them. They've actually adjusted their algorithms to account for this by giving large vehicles slightly more room than is strictly necessary, simply to put their passengers more at ease (a rough sketch of that sort of padding heuristic follows at the end of this comment).

      The same will undoubtedly hold true for the psychology of other drivers. It seems highly unlikely that an autonomous vehicle would (deliberately) be programmed to push significantly past human tolerances for safety, precisely because of what you pointed out (not to mention it would probably be illegal). Any instances of Tesla vehicles doing this are undoubtedly the result of nascent and still imperfect technology.

      Perhaps this will change as humans naturally grow to trust vehicles more than human drivers, but I'd wager the first generation or two of autonomous vehicles are going to be fairly conservative drivers.
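
      Purely to illustrate the kind of adjustment described above, here is a tiny hypothetical Python sketch of a comfort-padding heuristic; the function name, threshold, and numbers are invented for illustration and are not Waymo's actual code:

      # Hypothetical comfort heuristic: hold a little extra lateral clearance
      # next to large vehicles, beyond what the planner strictly needs.
      def lateral_clearance_m(base_m, neighbour_width_m,
                              large_threshold_m=2.4, comfort_pad_m=0.3):
          """Return the lateral clearance to hold toward a neighbouring vehicle."""
          pad = comfort_pad_m if neighbour_width_m >= large_threshold_m else 0.0
          return base_m + pad

      print(lateral_clearance_m(1.0, 1.8))  # ordinary car (~1.8 m wide) -> 1.0
      print(lateral_clearance_m(1.0, 2.5))  # semi trailer (~2.5 m wide) -> 1.3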

      • But what will they do when a car comes careening half on the sidewalk towards them? It's pretty easy to anticipate a contrived situation like the one you mention, but can they truly take responsibility for a life in all situations?
      • by pnutjam ( 523990 )
        I don't know. I've seen a lot of articles about people "attacking" self-driving cars. If you read the incidents, it sounds more like people attacking cars that behave like jerks, not leaving enough room at crosswalks or looking like they are trying to run over pedestrians.
    • Automated cars and trucks will become much more able then human drivers and that could create chaos. For example, suppose you want shift from the middle lane to the fast lane and your vehicle knows it can do it with a full six inches of clearance and all of a sudden you have a car only six inches in front of you. It could get to the point at which you know that the computers are super trustworthy but can a human mind be in such a vehicle without wearing a blind fold? Seeing heavy objects at speed coming so close to you will cause some very primitive emotions to take place. Also imagine the effects that this sort of thing would have on a motorcyclist. The fear of getting wiped out may be overwhelming. I am all for autonomous cars and trucks but it will have strange effects that very few people are taking into account.

      You're getting lost in the details. Things like this get factored into product decisions. If an update causes people to be uncomfortable, it's going to get rolled back. As long as the passenger is buying the car, I think we're safe.

  • What, not enough click-baiting the last time around?

    It was a silly article that didn't make sense then, and it makes no more sense today. OMG it won't change lanes for me exactly the way I want, every single time! So turn the wheel and change lanes exactly as you would if autopilot didn't exist, and autopilot shuts off when you do.

    This is bitching about handling edge cases that you can instantly override and handle yourself any time you want.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      It's actually complaining about a setting the vast majority of Tesla owners leave off, because as much as we love our cars, most of us do not fully trust them and still watch them.

      The feature in question is lane change without notification.

      Overall, the difference between Tesla's advanced cruise control and everyone else's is that Tesla's works. It keeps you centered in your lane. It has had issues with lanes splitting and merging, but it has gotten pretty good about lane merging.

      Having used the 2018 Toyota, GM Supe

  • by fahrbot-bot ( 874524 ) on Saturday May 25, 2019 @05:16PM (#58654706)

    Multiple testers reported that the Tesla often changed lanes in ways that a safe human driver wouldn't -- ... and passing on the right.

    From Passing on the Right [allenandallen.com]

    In most states, it is legal to pass on the right under certain conditions.
    In Virginia, a driver is allowed to pass another vehicle on the right:

    1: When the other vehicle is making or is about to make a left turn, and its driver has given the required signal;
    2: On a roadway that is wide enough for two or more lines of moving vehicles in each direction, as long as the roadway is free from obstructions (such as parked vehicles);
    3: On a one-way street that is wide enough for two or more lines of moving vehicles, as long as the roadway is free from obstructions.

    Drivers are not allowed to pass on the right unless they can do so safely. Also, drivers are not allowed to pass on the shoulder of the highway or off the pavement (i.e., off the main traveled portion of the roadway) unless there is a lawfully placed sign that specifically permits passing in these areas.

    As far as "cutting too closely in front of other cars", I once squeezed my way into a line of traffic right in front of a Police car. He pulled me over and yammered about cutting in too closely. I replied, "perhaps you were following too closely" (which he definitely was). He let me go, but wasn't super happy about it.

    • As far as "cutting too closely in front of other cars", I once squeezed my way into a line of traffic right in front of a Police car. He pulled me over and yammered about cutting in too closely. I replied, "perhaps you were following too closely" (which he definitely was). He let me go, but wasn't super happy about it.

      The way it's supposed to work is that you signal to pull into a line of traffic. The car behind you sees you signaling, and pauses to create a gap large enough for you to pull in. Then you

      • by stikves ( 127823 )

        I see more and more people actively trying to prevent me from changing lanes.

        As soon as they see my signal, or sense that I am about to change lanes, they accelerate, even if that means running a red light (happened today), or following the next vehicle too closely, increasing the potential for a rear-end collision.

        When that happens, if it is not safe to pass, I just wait. Usually their lane is slower (exits), and I can merge in a few more cars forward of them.

        Or if there is room, and if it is safe, I drive

      • by ledow ( 319597 )

        You indicate your intention.

        If people *decide* to give way to you, that's up to them. Sure, it's rude, but they aren't required to do so.

        In the same way, when you're pulling out into a road, you'll have a stop/give-way section. In America they are more often Stops because you guys can't be trusted to properly give way? I dunno. In Europe, they are more often give-ways. There's a line, you can cross it any time you like, you don't have to stop for it... but you have to give the OTHER TRAFFIC priority. Yes,

    • I think you're putting "passing on the right" - and the legality thereof - above the human testers' judgment of whether it was safe to do.

      Legal and safe are by no means the same things.

  • SNAKE OIL salesman extraordinaire. Seriously, he may literally be in the running with Steve Jobs here.
    • SNAKE OIL salesman extraordinaire.

      Funny thing, then, how much of his "oil" is bringing revolutionary upheaval to some of the most important industries.

      You could have just said "self-driving sucks" without revealing your cognitive weaknesses...

        • Major vehicle manufacturers have not even felt threatened enough to try EVs yet; they're watching parts fall off of Teslas, and they know that the electric motor is the easy part and they have the hard part covered. If you mean that he is making headway for self driving, christ, I don't give Waymo much credit, but at least they have driven 2.2 percent of the distance that a human does without an accident.

        You're just a FAN BOI aren't you?
  • /. drinking game (Score:4, Interesting)

    by seoras ( 147590 ) on Saturday May 25, 2019 @05:46PM (#58654834)

    Take a shot every time there is a Slashdot story bashing, or encouraging rants about, Tesla or Apple.
    A decade ago it was Microsoft who got the shit kicked out of them on a daily basis here. (Remember the Bill Gates "Borg" icon?)
    How times have changed.

    • That's because Microsoft is an evil corporation that harmed us all. Tesla, however, is likely receiving negative coverage because one of the admins has shorted the stock or is being paid by someone who has. At least Jon Katz is gone.
  • "In practice, we found that the new Navigate on Autopilot lane-changing feature lagged far behind a human driver's skills."

    A theoretical, perfect human indeed.

    "The feature cut off cars without leaving enough space, and even passed other cars in ways that violate state laws,"

    Wow, so IOW it already works like real humans?

    • Wow, so IOW it works already like real humans?

      As I've said for a long time, it's just not a winning message to brag that your automated driving system at its best does just as well as a human driver at its worst.

  • They didn't take Virginia drivers into consideration. Given the complete disregard for turn signals, headlights, and the perpendicular for random lane changes by VA drivers, I find it hard to fathom a machine doing worse.

  • Seriously .... (Score:4, Insightful)

    by King_TJ ( 85913 ) on Saturday May 25, 2019 @07:16PM (#58655158) Journal

    I think this is a fair critique, but at the same time? Anyone who actually owns a Tesla for a period of time should be pretty used to the idea that its "autopilot" is a driver assistance "toy" that "usually works pretty well" but needs your occasional guidance and control.

    Elon Musk is a dreamer, at the end of the day. It's a nice combination to pair with being an engineer, and it makes him shoot for lofty goals that many others would never bother with. So I'm cool with all of that. But I also realize you can't just quote some tweet of his as gospel about what the company will be releasing next. Even when all he's excited about is adding new Atari games you can play on the car's screen, he's had to go back on that due to copyright issues in at least one case! So often, he weaves a big story about what he envisions doing -- only for it to be scaled back to about HALF of whatever he was saying initially.

    Almost everyone who buys a Tesla winds up really liking the vehicle. That's what matters most. But I think you won't even want one if you're not kind of a gadget-freak or techie type. And that really should be the kind of individual who just likes to play with new ideas. If Tesla said, "We could give you automatic lane change capabilities today but they're not really that great at predicting cars coming up really fast in the lane you want to change into, and ... it sometimes brakes to match speeds with a car in front without much regard for the car behind. OR... you could wait another year while we keep it from you, until we've tested it more?" You KNOW the majority would ask for the feature NOW, and say they'll just "be cautious when using it".

    I've always felt like my Model S is kind of primitive in its ability to be aware of its surroundings. But even the most basic ability to sense distance from a curb, or to try to show you the vehicles it sees around you in all directions is FAR more than any other vehicle can do that I ever owned. So yeah, I use ALL autopilot features as "cool, gee-whiz stuff" that I enable only when I'm confident it can probably navigate it well, and with a foot ready to hit the brakes any time I get a bit uncomfortable with something it's starting to do.

  • The fact is this will improve over time, and the entire point is to have every car carry this technology. It will trickle down even if it isn't perfect, and once everyone has it, it will be almost flawless. Why do we have to be so critical of new tech all the time? These are the baby years of AI driving, and there have been 50000 doomsday articles on it and how bad it sucks or that Tesla isn't using Lidar and it's going to fail (I like Lidar, but come on, the camera tech is working great, jesus). The fact is if all
    • by ledow ( 319597 )

      There is no evidence whatsoever that AI automatically improves over time.

      And this is not AI. It's mostly heuristics (i.e. human programmed rules).

      Go look at *any* AI/neural net project that involves human-like decisions (things like chess computers etc. are vastly biased in favour of brute-force processing).

      It does a fair enough job, learns fast, and then *immediately* plateaus, as its knowledge is not notably increased just by exposing it to millions more of such situations. You can train it initia

  • Seriously? What planet does Consumer Reports test their cars on?

  • It doesn't appear to react to brake lights or turn signals,

    cutting too closely in front of other cars, and passing on the right.

    will often cut off a vehicle that is going at a much faster speed,

    Sounds like what I see many human drivers doing every single day. At least the Tesla auto drive won't flip me off when I have to slam my brakes and blow my horn.
