Transportation Software Technology

Consumer Reports: Latest Autopilot 'Far Less Competent Than a Human' (arstechnica.com) 196

An anonymous reader quotes a report from Ars Technica: In recent weeks, Tesla has been pushing out a new version of Autopilot with automatic lane-change capabilities to Model 3s -- including one owned by Consumer Reports. So the group dispatched several drivers to highways around the group's car-testing center in Connecticut to test the feature. The results weren't good. The "latest version of Tesla's automatic lane-changing feature is far less competent than a human driver," Consumer Reports declares.

CR found that the Model 3's rear cameras didn't seem able to see very far behind the vehicle. Autopilot has forward-facing radar to help detect vehicles ahead of the car and measure their speed, but it lacks rear-facing radar that would give the car advance warning of vehicles approaching quickly from the rear. The result: CR found that the Model 3 tended to cut off cars that were approaching rapidly from behind. The vehicle also violated some Connecticut driving laws, the testers found. "Several CR testers observed Navigate on Autopilot initiate a pass on the right on a two-lane divided highway," writes CR's Keith Barry. "We checked with a law enforcement official who confirmed this is considered an 'improper pass' in Connecticut and could result in a ticket." The vehicle also failed to move back over to the right lane after completing a pass as required by state law, CR reports.
Ultimately, driving with Autopilot's automatic lane-changing feature is "much harder than just changing lanes yourself," says CR's Jake Fisher. "Using the system is like monitoring a kid behind the wheel for the very first time. As any parent knows, it's far more convenient and less stressful to simply drive yourself."
  • Tesla is not really an "auto-driving" company. Auto-driving is a side venture for them, so they probably cannot compete on bot quality with dedicated auto-drive companies such as Waymo. And even Waymo is behind its release plans.

    Unless Tesla has geniuses that come up with home-run breakthroughs, they will probably eventually end up buying bot tech from the dedicated companies.

  • Consumer Reports: Latest Autopilot 'Far Less Competent Than a Human'

    Proponents of driverless cars have blown the incompetence of human drivers completely out of proportion. I'm not going to pretend all humans are good drivers, they are not, but on the whole it is amazing just how well traffic works and how limited the casualties are given the sheer volume of traffic.

    • by Rick Schumann ( 4662797 ) on Wednesday May 22, 2019 @07:54PM (#58639222) Journal
      Yep. The self-driving car fans sound more or less unhinged. If you had just come to this planet and were talking to one of them about ground vehicle traffic, they'd make it sound to you like it's Mad Max out there, like morgues everywhere are filling up daily with traffic deaths.
      Meanwhile the 'technology' is half-baked at best, and has only gotten as far as it has on marketing-department and media hype.
      • by Kokuyo ( 549451 )

        Uh, have I missed something? Looking at traffic deaths in any country I'd have to say the only reason morgues aren't piling the bodies in hallways is because we have a substantial number of them and they work pretty efficiently.
        Now granted, self-driving cars aren't going to arrive in the next decade, I think, but have you seen the amount of image manipulation we can do automatically and on the fly? Think back to the nineties and tell me you wouldn't have believed that achievable in your lifetime and yet her

        • You fail to take into account the fact that people drive 3.3 billion miles per day in the US and manage 400K+ miles without so much as a fender bender. If you put that into perspective with deaths, the deaths are quite small. No one would drive if the risk weren't.
        • You're over-reacting to numbers that seem big but that aren't in context. Without even looking I can safely say I can come up with any number of things that people die of every single day that would not only make traffic-related deaths seem insignificant, but that would be much more horrifying to you. Meanwhile the traffic deaths that so-called 'self driving cars' will be responsible for (due to their inherent inability to actually think, a quality that we do not understand the workings of in biological bra
      • by AmiMoJo ( 196126 )

        To be fair it's more like the Tesla fans... They seem to believe Musk's claims of full autonomy next year, despite the fact that he previously predicted it back in 2017.

      • by N1AK ( 864906 )
        37,133 deaths in a year (2017 USA) isn't exactly inconsequential, and I imagine you'd be as pleased by being broadly dismissed on the basis that automatic car cynics "make it sound like driving a car is 100.000% safe and that bringing in automatic cars will lead to the extermination of mankind" as self-driving car 'fans' are by your lazy and inaccurate summary of their views.

        Find me evidence that people are arguing widely for fully self-driving cars to be on the road while their safety standard isn't at
        • Americans drive 3.22 TRILLION miles a year and go 450K+ miles without a fender bender.
        • You're assuming that MARKETING AND MEDIA HYPE don't exist, and they DO. These machines are half-assed at best, can't THINK, and are not up to the task -- and no amount of 'machine learning' will fix that, because it has no ability to THINK. Read the above: https://slashdot.org/comments.... [slashdot.org] neuroscientists will tell you I'm right because that's where I get my information from. 'Deep learning algorithms' are only a tiny little part of cognition and we don't even begin to understand how that works -- and that
    • To be fair, it sounds like an issue with the sensors rather than a competency issue. It's not that the AI doesn't know how to handle cars passing at speed, it simply doesn't see them in time. Where are the rear facing cameras mounted anyway? You'd expect them on the tips of the wing mirrors, so that the system can see past other cars in heavy traffic.

      Humans have similar problems in this situation, by the way, though probably not to the same degree. It can be pretty hard judging the speed of a car com
      • The sensors that may work (360 degree lidar, 360 degree radar, 360 degree cameras) will be $50K per car for some time. That's the problem.
  • almost (Score:5, Funny)

    by Anonymous Coward on Wednesday May 22, 2019 @07:07PM (#58639008)

    Musk says 2020 for full autonomy and robotaxis, so all he has to do is make it change lanes, not cut people off, allow hands-off driving, work off the highway, not run into barriers, not go under semi trucks, and not explode. Then all he needs is to handle fog, rain, sleet, snow, slush and smoke.

    20 months... can't wait. Gonna be awesome.

  • by ugen ( 93902 ) on Wednesday May 22, 2019 @07:27PM (#58639084)

    Therein lies the dilemma for self driving cars. You and I, humans, can (and will) occasionally break various traffic laws and, mostly, get away with it. Yet a large corporation can't, neither legally nor practically speaking, program its vehicles to break traffic laws the way its users do. There is a substantial difference in corporate and individual responsibility. (And that's A-ok with me - given the amount of concentrated power corporations have, they should be held to a much higher standard.)

    In any case, I predict that self driving cars, if they are ever competent enough, would have to drive like the proverbial granny, following the letter of every traffic law - and that's just not very pleasant or acceptable to human users.

    • In any case, I predict that self driving cars, if they are ever competent enough, would have to drive like the proverbial granny, following the letter of every traffic law - and that's just not very pleasant or acceptable to human users.
      Yep. That's the reality of the situation, and guess what? People aren't going to tolerate that. Of course that's just one of the most minor problems that will cause SDCs to fail, there are far worse problems they have and will continue to have because the entire approach t
      • Laws are going to need to be rewritten more carefully to avoid any reliance upon common sense.

        That will annoy police that love booking people for obscure violations!

    • by mellon ( 7048 )

      Mine does drive like a granny. I have to goose the gas pedal whenever someone is exiting, because it won't pass them until they are entirely out of the lane, at which point I'm doing 30mph on the highway.

      This report from Consumer Reports doesn't match my experience, though. My experience is that when there's a car behind me and I want to change lanes, the Model 3 will slow down to get behind that car before changing lanes, which is not at all what I want, but is also not at all what they are describing

    • You and I, humans, can (and will) occasionally break various traffic laws and, mostly, get away with it.

      But do you need to? Or are you just impatient and creating additional risk to yourself and those around you.

    • by guruevi ( 827432 )

      Not just that, but the laws are often contradictory. You need to stop for school buses and move out of the way of emergency vehicles, but the traffic laws on the books don't always include exceptions for that. So you can be stopped in-lane on a two-lane divided highway, which is illegal, or you may need to drive into oncoming traffic, run a red light, etc.

    • by sad_ ( 7868 )

      "In any case, I predict that self driving cars, if they are ever competent enough, would have to drive like the proverbial granny"

      i don't know, there was a post on /. a few days ago about allowing you to set the type of driving the system would assume, and there would be something like a 'douchebag' mode.

    • The correct answer is to allow consumer updates and blame the meat bags that flash the "ludicrous mode autopilot" when they eventually crash. Legally you're 100% in the clear since your EULA explicitly forbids it.
    • In any case, I predict that self driving cars, if they are ever competent enough, would have to drive like the proverbial granny, following the letter of every traffic law - and that's just not very pleasant or acceptable to human users.

      I'd find it pleasant and acceptable.

      If someone is driving me, I want to be able to ignore the commute and do something productive. Like send a work email or drink a beer. Driving like a granny allows for that, while driving like a taxi driver does not.

    • Tesla is implementing aggressiveness settings in their system, at the driver's choice and risk (among humans, not FSD machines).

    • Uber was founded on breaking established labor laws. At best they'll pay a fine and a bribe^X contribute to campaigns to get the laws changed to exempt self driving cars.
      • You're right, they'll drive like grannies. It saves on gas and you can sync yourself to the lights so you don't have to stop at reds all the time. It's actually going to be a problem from an energy consumption perspective. We'll consume too little. Ask yourself what even a 5% drop in gas consumption would do to the finances of the middle east...
  • I wonder if this will drive the stock below $200 tomorrow... It's already down 45% from 6 months ago...
    • 8pm close was $190.65, will be interesting to see what the 4-7am volume is like.

      But $180 is where the bloodiest battle will be fought, both technically on the chart and as the level of Elon's impending margin call (shades of Valeant).

  • by Dereck1701 ( 1922824 ) on Wednesday May 22, 2019 @09:29PM (#58639540)

    I wonder what their definition of a "human driver" is? The average stressed, tired, unskilled & distracted driver? Or an alert, well-rested, well-trained professional driver? A lot of the "issues" that the article seems to focus on are fairly common for many drivers (passing on the right, not recognizing a fast-approaching car from behind, failure to follow random/arbitrary local laws). They seem to want the vehicle to behave perfectly in all situations, which is impossible. The bar should be "is it safer than the average driver", not "safer than a professional driver on his best day right after getting a driving-course refresher, a nap and a back massage". Have they proven the base level of safety? It sounds like the jury is still out on that, but it seems to be close enough that no one can definitively say one way or another.

  • by speedlaw ( 878924 ) on Wednesday May 22, 2019 @09:38PM (#58639576) Homepage
    I'm sure BMW, MB, GM and others have tossed money at this. They also know that the sensors will get dirty...the driver will be drunk...the weather will turn to black. Road stripes will be unreadable. This is why we don't see auto-drive from the majors. Tesla assumes a best case scenario that makes the majors just laugh.
    • by TheSync ( 5291 )

      All the majors have radar-adaptive cruise control though. I use it every day in my urban commute.

    • Audi have announced (and postponed) Level 3 autonomy for the A8. It is touted as "automatic traffic jam driving", and only works for speeds of up to 60km/h, but it handles lane changes as well as everything else, and requires no supervision when in use.
  • And lane changing is probably one of the simpler autonomous actions that computers must solve. Yet we see Consumer Reports say the system falls foul of state law.

    Another example of the issues autonomous cars would face - schoolbuses are governed by all kinds of nuanced and different rules about when it is necessary to stop, or illegal to pass or overtake them. Sometimes also you must stop for an oncoming schoolbus but not always. Every US state is different. Every country is different. Imagine the nightma

  • According to Kurzweil's predictions, supercomputers able to emulate human intelligence should be mainstream by now. Yet we can't even manage to have an autopilot that operates on par with a human driver.

    Perhaps all those AI proponents have been a bit too optimistic? Is it possible they used a neural network to generate their predictions, instead of common sense?

  • Consumer Reports words this as if Tesla is currently selling the system as better than human-level. It's not, and nobody claims it to be. Additionally, overtaking on the right side is called 'undertaking', and it is illegal in certain regions, legal in others, and merely bad behavior in others. There is a variable in the system which tells the AP whether it's allowed or not. In any case, currently the system asks the driver to confirm whatever it does, so what's the fuss about? I guess someone is just trying to
    • This is all because Musk is a salesman with no morals and he can come very close to a lie and make people believe the lie. Take the name Autopilot for example; Musk knows full well that many people will think it through as far as "an autopilot keeps me safe in the air and the pilot never has to intervene, so Autopilot will keep me safe on the ground and I will not have to intervene". We have discussed ad nauseam the comparison between the two, but let's just call out the fact that there IS a
      • Autopilot is really good and improving. I can imagine traveling across 2 states without any intervention; but that doesn't mean it doesn't need supervision. And often in such a trip a few interventions might be needed. Certainly not a 1000 as you suggest.

        It's not full autonomy and Tesla is clear about that. Maybe some people can not understand there is something between white and black, which is a lot better than not having it?

        For a company without marketing budget it is incredible it can cope with so muc

        • It seems a lot to spend for the capability, to the extent that I can't understand why people buy it, much as with iPhones; it seems to be some other fancy they are serving by buying it, because that can't possibly be good enough to make it worthwhile. Myself, I love adaptive cruise control, but I don't find myself pining for the steering too. Even if I did, the lines are covered or missing 50% of the time where I am.
  • In Canada it is ALWAYS the law that if there is more than one lane, then slower traffic keeps to the right at ALL times. In other words, unless you are passing or turning soon you keep to the right. You won't get a ticket for not doing it, but you're a moronic ass if you don't.
  • Constant left-lane turtles. Driving on I-95 is like driving an obstacle course. Of course, you HAVE to pass on the right even though you know it's wrong and illegal. Maybe the Tesla actually drives like a CT driver does. We don't give it enough credit.
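Several comments in the thread trade fatality statistics (37,133 US road deaths in 2017; roughly 3.2 trillion vehicle-miles driven per year). A quick back-of-the-envelope check of the implied rate, using the commenters' own figures:

```python
# Sanity check of the fatality figures quoted above (2017 US data as
# cited by commenters: 37,133 deaths; ~3.22 trillion vehicle-miles).
deaths_2017 = 37_133
vehicle_miles = 3.22e12  # commenter's figure for annual US vehicle-miles

deaths_per_100m_miles = deaths_2017 / vehicle_miles * 1e8
miles_per_death = vehicle_miles / deaths_2017

print(f"{deaths_per_100m_miles:.2f} deaths per 100M vehicle-miles")  # ~1.15
print(f"{miles_per_death / 1e6:.0f} million miles per fatality")     # ~87
```

So both sides of the argument are working from the same arithmetic: roughly one death per 87 million miles driven, which is simultaneously a very low per-mile rate and a large absolute death toll.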
