Transportation

Is a $1000 Aftermarket Add-On As Capable As Tesla's Autopilot and Cadillac's Super Cruise? (caranddriver.com)

Car and Driver tested a $1,000 aftermarket autonomous driving add-on from Comma.ai against the best factory systems currently on the market. Slashdot reader schwit1 shares the report: If the self-driving car is the promised land, then today's ever-proliferating driver-assist features are the desert. Diminished claims and "it's harder than we thought" mea culpas from self-driving's loudest advocates suggest we'll be wandering here for many years to come. At least the technology is meandering in the right direction, though. Thanks to recent software updates, the most sophisticated systems -- Cadillac's Super Cruise and Tesla's Autopilot -- are more capable today than they were initially. This report on those systems includes a lesser-known third player. For $998, upstart Comma.ai sells an aftermarket dash cam and wiring harness that taps into and overrides the factory-installed assistance systems in many Honda and Toyota models as well as some Chrysler, Kia, and Lexus vehicles, among others. When activated, Comma.ai's Openpilot software assumes control over the steering, brakes, and throttle, and it reduces the frequent reminders to keep your hands on the wheel. As you might imagine, automakers do not endorse this hack.

Any one of these systems could confidently track the center of a lane for hours with minimal driver input on reasonably straight highways. Although no automaker admits that infotainment is part of its system's machine learning, right after we went hands-free, Hinder's "Get Stoned" started playing through the Cadillac's speakers. We ignored that suggestion and threw the three systems at the toughest highway kinks, interchanges, and two-lane roads surrounding our Ann Arbor home base until either they or we flinched. There was some of each.
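To make that division of labor concrete, here is a minimal, hypothetical sketch of the kind of lane-centering step such a system runs on every camera frame. All class and parameter names are invented for illustration; this is not Comma.ai's actual Openpilot code.

    # Hypothetical lane-centering step, for illustration only; none of
    # these names come from Comma.ai's actual Openpilot codebase.
    from dataclasses import dataclass

    @dataclass
    class LaneEstimate:
        center_offset_m: float  # lateral offset from lane center, meters (+ = right)
        curvature: float        # estimated road curvature, 1/meters

    def steering_command(est: LaneEstimate, speed_mps: float,
                         kp: float = 0.08, kc: float = 1.0) -> float:
        """Proportional correction toward lane center, plus a
        feed-forward term for the road's curvature."""
        feedback = -kp * est.center_offset_m
        feedforward = kc * est.curvature * speed_mps
        return feedback + feedforward  # normalized steering torque request

    # Per frame: perceive -> estimate the lane -> command the actuators,
    # while a separate driver-monitoring check can disengage the system.
    print(steering_command(LaneEstimate(0.3, 0.002), speed_mps=30.0))  # ~0.036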

Cadillac Super Cruise
Highs: Locked-on-its-lane control, handles the difficult maneuvers with aplomb.
Lows: Works only on mapped limited-access highways, steering control not as confident at night, very little information shown to the driver.
Verdict: A capable and conservative commuting ally.

Tesla Autopilot
Highs: Best user interface, most versatile, extremely capable.
Lows: Dramatic steering inputs when it makes an occasional mistake, no more hands-free capability.
Verdict: One of the best, but can it really evolve all the way to self-driving?

Comma.ai
Highs: Capable steering, brake, and throttle control.
Lows: A too-large and unadjustable gap from cars ahead, slows substantially for curves, flashes unnecessary warnings.
Verdict: If this is what's possible with a single camera, perhaps the hardware required for self-driving won't be as extensive as expected.

This discussion has been archived. No new comments can be posted.
  • by nehumanuscrede ( 624750 ) on Tuesday February 11, 2020 @10:12PM (#59718030)

    If you sell the vehicle and the next owner still gets to use said tech, then it's ahead of Tesla already . . . . :|

    https://www.zdnet.com/article/... [zdnet.com]

    • Dontcha worry, the most restrictive license possible will win out.
    • Re:For starters (Score:5, Informative)

      by 140Mandak262Jamuna ( 970587 ) on Wednesday February 12, 2020 @10:11AM (#59719336) Journal
That car was a warranty return related to a manufacturing defect: the screen developed yellow fringing. Tesla replaced the car for the original owner and sold the lemon-law buyback at auction.

      Tesla's standard practice has always been to remove all software features added by the original owner, and sell the basic car. Tesla says it always discloses that fact. Tesla claims that allows the car to be listed for the lowest possible price, and people who don't want those features need not pay for those features. Tesla claims the audit was done before the car was sold.

The dealer did not disclose that it was a lemon-law buyback. The dealer claims the FSD feature was working when he bought the car at auction, and when he sold it to the customer he said the missing feature was simply a glitch that Tesla would fix. The dealer showed the original Monroney sticker to the buyer.

To avoid bad press, Tesla could buy the buyer's right to sue the used-car dealer for selling cars without full disclosure for, say, $8,000, and sell the buyer FSD for the same $8,000. Now Tesla has a happy customer with FSD. And it can go after the used-car dealer to recover the money, set the record straight, and serve as a warning to other used-car dealers playing fast and loose with Tesla. This is what I would do, but I am not the one running things at Tesla.

      • by MobyDisk ( 75490 )

        To avoid bad press, Tesla could

        That's a great approach. I was thinking Tesla should just give them the feature, but this is even smarter. I didn't know you could buy/sell the right to sue.

  • by Rick Schumann ( 4662797 ) on Tuesday February 11, 2020 @10:22PM (#59718056) Journal
So-called 'self-driving cars' are still more hype than reality, mainly because there is so much money invested in R&D from when they all thought it was going to be just another design cycle. Then they discovered that not even being able to define what 'cognition/thinking/reasoning' is, let alone write software that can do what a human brain does, kind of brings the entire thing to a screeching halt. 'Deep learning algorithms' are not cutting it: they can't make intuitive leaps like a biological brain can, and just piling on more and more so-called 'training data' isn't cutting it either. Furthermore, all the prototypes they keep trotting out to us in the media are festooned with dozens of sophisticated sensors, not just one camera sitting on your dash, so if you actually shell out $1000 for this thing, you're likely going to end up getting wrecked in the very literal sense of the word.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

So-called 'self-driving cars' are still more hype than reality, mainly because there is so much money invested in R&D from when they all thought it was going to be just another design cycle. Then they discovered that not even being able to define what 'cognition/thinking/reasoning' is, let alone write software that can do what a human brain does, kind of brings the entire thing to a screeching halt. 'Deep learning algorithms' are not cutting it: they can't make intuitive leaps like a biological brain can, and just piling on more and more so-called 'training data' isn't cutting it either. Furthermore, all the prototypes they keep trotting out to us in the media are festooned with dozens of sophisticated sensors, not just one camera sitting on your dash, so if you actually shell out $1000 for this thing, you're likely going to end up getting wrecked in the very literal sense of the word.

Exactly, and what you're seeing now is the low-hanging fruit being used to drain easily impressed, overly affluent, arrogant money managers. This is the monorail salesman all over again. Look, we don't even have self-driving golf carts yet.

      There are so many problems we could be solving in this regard, just better driver training would cut accident and insurance rates dramatically. Just look at comparable rates between countries with differing training standards.

      Countries like Sweden, Denmark and Switzerland ha

      • Re: (Score:3, Insightful)

So, why don't we just improve driver training and make the test as hard as it is in the safest countries?

        This is precisely what I've been saying for quite some time now.
Back in the '80s we had driver's ed/driver training in high schools. All that got cut at some point, and I'm pretty sure if you looked at the statistics you'd see that sometime after that accident rates started rising.
        Also, over however many years we've had an influx of adult drivers from other countries who aren't even as good at driving as we are, and I think that's contributed to the problem as well. I'm not against (legal!) immigration, but

        • and I'm pretty sure if you looked at the statistics you'd see that sometime after that accident rates started rising.

Fatality rates for driving have been consistently dropping for decades, not rising. This could be due to safer cars, but I couldn't find any chart showing whether non-fatal accidents have increased or decreased.

          Also, over however many years we've had an influx of adult drivers from other countries who aren't even as good at driving as we are, and I think that's contributed to the problem as well.

          Sheesh, why don't you just yell out, "Those darn Asian drivers!" and be done with it?

      • You can't compare accident rates by population. It isn't a meaningful statistic. You need to compare accident rates over total miles driven.
        • by dryeo ( 100693 )

That's meaningless too, as driving a thousand miles on a straight divided highway is different than driving a hundred miles in heavy rush-hour traffic or on a skinny winding mountain road.

          • It's more meaningful than comparing it per capita. After all, car ownership/ridership per capita differs per country, state, region, and even city, too.

The Wikipedia article https://en.wikipedia.org/wiki/List_of_countries_by_traffic-related_death_rate [wikipedia.org] provides both figures. The US is still almost twice as bad as Denmark and more than twice as bad as Sweden and England in fatalities per km. Mexico is much worse.

The question is why? Vehicle safety features and maintenance are one aspect. Road design and maintenance are another. I've heard that England has very strict tests for driver's licenses; it makes sense that those tests and the training needed to pass t
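A toy example (all numbers invented) of why the choice of denominator matters when comparing countries:

          # Invented numbers showing how per-capita and per-distance
          # fatality rates can rank two countries differently.
          countries = {
              # name: (population, vehicle_km_per_year, deaths_per_year)
              "A": (10_000_000, 40_000_000_000, 250),  # drives a lot
              "B": (10_000_000, 10_000_000_000, 150),  # drives little
          }
          for name, (pop, vkm, deaths) in countries.items():
              per_100k_people = deaths * 100_000 / pop
              per_billion_km = deaths * 1_000_000_000 / vkm
              print(name, per_100k_people, per_billion_km)
          # A: 2.5 per 100k people, 6.25 per billion km
          # B: 1.5 per 100k people, 15.0 per billion km -- "safer" per
          # capita, yet worse per km actually driven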

So, why don't we just improve driver training and make the test as hard as it is in the safest countries?

Clearly the reason is there are no vast sums to be made by saving average lives. It's not glamorous or exciting to make driving tests harder. Most people are just too lazy to bother upgrading their skills when they can just bury their heads in the sand and pretend technology will solve all their problems.

        You're overthinking this.

        The reason we can't make driver testing more restrictive is because, in most of the country, you literally can't function and survive without a car.

        Countries like Sweden, Denmark, and Switzerland? They all tend to have a combination of good city design and public transportation that makes having a car a luxury.

    • Utter FUD. First, the bar to be "better than the average driver" is depressingly low. We are a ways off, but not that far.

      Second, if you used this as basically a "better lane keep assist" then it's fine. I've considered installing one in my Pacifica primarily for longer highway trips. You don't set it to "drive to CA" and take a nap. You use it like a much enhanced cruise control.

      Plenty of basic lane keep assist implementations work fine with a camera or two.

      It's not self driving - it's just a better lane k

      • I'm all for 'driver assist' technologies. What I'm dead set against is being replaced by half-assed machines that really can't do the job and will fail at it at a critical moment. No way in hell I'm strapping myself into something that has no controls for a human other than maybe a big red 'E-Stop' button, that it may or may not acknowledge before something disastrous happens. Also I don't know anyone of any age who wants to get into such a vehicle either, so it's not like it's just me.
        They finally crack t
        • by rtb61 ( 674572 )

I am deeply disturbed that they did not even bother to check or rate system security. Auto driving is a life-or-death feature, and they did not mention system security even once. As if it's completely unimportant that a failure of system security will actually BSOD you; it might not be an actual blue screen, but when it drives you off a cliff, into oncoming traffic, or into a train, it will still very much kill you. Not one consideration for hack resistance, not even a little bit. I find that strangely

        • Who in their right mind is threatening to replace human drivers with cars that have no manual controls? A ton more money, R&D and experience has gone into automating flight in aircraft, to the point that Tesla's system called "Autopilot" borrows a term from aircraft that's been in use since at least the 80s.

          We've been doing automation with airplanes for much longer, and arguably airplanes are a LOT easier to automate (because there's basically no risk of running into "traffic" -- vehicles piloted by oth

      • by flink ( 18449 )

        Letting a machine assume control to the extent that the human has nothing to do most of the time and then asking the human to step in with a fraction of a second to react on the rare occasion that the automation can't cope with the input it is receiving is the worst of both worlds. The person behind the wheel will not have the same situational awareness that someone who was maintaining manual control would have and will have a much higher error rate. Everyone claims that they are the exception and maintai

      • It's easy for an AI to drive better than the average driver 99.999% of the time. The other 0.001% of the time, you die horribly.

      • by AmiMoJo ( 196126 )

        First, the bar to be "better than the average driver" is depressingly low.

Also too low for Tesla. Say it's 10x better than a human and they have 1 million vehicles self-driving. They need the equivalent of 100,000 humans' insurance coverage.

        But it's worse than that because they need commercial insurance which is a lot more expensive than private insurance, not least because they are potentially liable for far more than an individual would be in a serious accident.
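Spelling that arithmetic out with the parent's own assumptions (the premium figure is my own rough guess):

        fleet = 1_000_000             # self-driving Teslas (parent's assumption)
        improvement = 10              # 10x safer than a human (parent's assumption)
        human_premium = 1_500         # rough US private premium, $/year (my guess)

        equivalent_drivers = fleet / improvement          # 100,000.0
        naive_floor = equivalent_drivers * human_premium
        print(f"${naive_floor:,.0f}/year")                # $150,000,000/year
        # ...and commercial cover is priced well above private cover,
        # so the real exposure would exceed this naive floor.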

    • The people in power are insane for letting people buy this.

That there are fucknuts on the road who will take any excuse to drive unsafely is a known factor. That we give them systems we know for a fact can't evade stopped cars on the highway and then let them take their hands off the wheel and do whatever ... that's not just on them. Our regulators and news media aren't working any more, and that they let Tesla and now others get away with this shit is embarrassing.

      The emperor has no clothes, but until a ca

    • by dargaud ( 518470 )

      just piling on more and more so-called 'training data' isn't cutting it either.

      It isn't surprising. Looking at insurance accident stats you can see that it takes humans ***10 years*** to learn to drive safely. Accidents happen fairly often in the first 10 years, and less after that. On average, obviously. So you'd expect AI to take a while to master the same set of skills...

I agree with this. You only need to listen to Elon Musk ranting on about AI to see that these folks are deluded. They have also fallen for the exponential doubling problem, where they believe we just need a few more cycles of the magical doubling and we'll have more computing power than the human brain, blah blah blah. The reality is that 'Moore's law' was always just an observation of engineering development (mainly, that linearly improving feature size generates a square-law return in number of transistor

You're insane if you buy this expecting it to be a self-driving car. It's NOT. It's NOT advertised as such, either. Comma.ai has the *eventual goal* of getting to self-driving cars, but is under no delusion that the current state of the art is close to that. It's a development kit.

      In terms of actual utility, an ALERT driver ready to take over in a split-second can use OpenPilot and Comma's hardware to alleviate workload in ordinary conditions. Choosing to engage your driver assistance in abnormal condition

  • It's all BS. (Score:4, Insightful)

    by msauve ( 701917 ) on Tuesday February 11, 2020 @10:27PM (#59718072)
Driving in between the white stripes on a sunny California day is simple. Driving on a Canadian road during a snowstorm, not so much. Tesla's engineers are from California, so, well...
    • Re: (Score:2, Insightful)

      by Dan East ( 318230 )

      So are you advocating that cars should not have cruise control, because people don't use cruise control when driving in snowstorms in Canada?

      • by msauve ( 701917 )
        Are you seriously posting here, not knowing the difference between cruise control and self driving cars?
      • by ceoyoyo ( 59147 )

        Actually, using cruise control during a snowstorm is a good way to end up in the ditch.

        However, there's also no reason a lane keeping program couldn't be taught to estimate where the lane is just as well as a person can. Probably better. It just can't use the conveniently painted lines, because you can't see them.

        • And the GP doesn't know that Tesla actually has this, and does a rather decent job staying in the lane even when the markings aren't visible if reports are to be believed.

          • by ceoyoyo ( 59147 )

            That would be pretty cool to see. I'll have to try and book a test drive when it's snowing sometime. Another great safety feature would be a car that looks at the road, realizes it has no idea where it is, and refuses to drive more than 10 km/h because it knows the human driver doesn't either.

Telling where the lane -- or even the -road- -- is, when it's under a 6-to-12-inch blanket of fresh snow, is a difficult task, even for a person. I wouldn't expect a limited AI to be up to the task for years more.

          • by ceoyoyo ( 59147 )

            I wouldn't be surprised if it was just as good as a person. People are terrible at it.

            Someone else commented that Tesla did train their autopilot on snow covered roads.

      • by jrumney ( 197329 )

        Also:
        Cars shouldn't have traction control, because traction control is useless on ice.
        Cars shouldn't have airbags, because airbags are dangerous when a baby carseat is used on the front passenger seat.
        Cars shouldn't have wheels, because sleds would be better on ice.

      • by AmiMoJo ( 196126 )

        The problem is that Tesla have been selling "full self driving" since 2016 on the basis that the car will be able to drive from coast to coast by itself and take your children to school for you.

        If the car can't cope with a very wide variety of difficult situations then they can't deliver what they already sold. And the current cars don't even have wipers for all the cameras, which suggests they are kinda clueless still.

    • by SuperKendall ( 25149 ) on Tuesday February 11, 2020 @11:52PM (#59718300)

Driving in between the white stripes on a sunny California day is simple. Driving on a Canadian road during a snowstorm, not so much. Tesla's engineers are from California, so, well...

      If you watch the long presentation Musk did on self deriving, you'll find that Tesla's system builds up an internal model that understands where the road is based on everything around, not just the lines... they specifically mentioned a snowy road where the system knew where the road was under the snow - the same way a human driver would.

      Tesla also has data from ALL OVER THE WORLD now, not just CA, that they use for training in a variety of scenarios.

      Which is why Tesla will be the ones to actually deliver robust real-world self driving in all conditions, not limited areas.

      • If you watch the long presentation Musk did on self deriving

        That's part of Tesla's soon-to-be-announced AutoCalculus system from what I understand.

      • by ceoyoyo ( 59147 )

        They also showed it estimating where the road went around a corner... where it couldn't see.

        Even without full self driving, lane keeping on a snowy road would be a great safety feature. Despite all the Slashdot posts claiming the contrary, humans are terrible at it.

They also showed it estimating where the road went around a corner... where it couldn't see.

          That's where it estimated the road MIGHT be, just as any human would around a blind curve. You turn until the road stops turning...

As the vehicle proceeds it uses new data to understand where the road actually is, then adjusts the steering. Understand yet?? Have you ever driven a car??

Sorry you cannot understand technical videos, but that's a problem on your end, not Tesla's.

          I really hope you were being sarcasti

          • by ceoyoyo ( 59147 )

            Hi Kendall! I see you're still your charming self! And your understanding of English is pretty much as it's always been too! Have a good day!!!!!

      • you'll find that Tesla's system builds up an internal model that understands where the road is based on everything around, not just the lines

Yep, it worked well for one guy I saw on YouTube whose car was preparing to slam into a wall on the side of the road because... well, I'm not sure why. I guess we'd have to ask the car's 'internal model'.

      • Tesla also has data from ALL OVER THE WORLD now, not just CA, that they use for training in a variety of scenarios.

        Which is why Tesla will be the ones to actually deliver robust real-world self driving in all conditions, not limited areas.

Which is also the strategy that George Hotz has repeatedly written about [medium.com]: they started as a cloud-stored dash cam, with the explicit intent of leveraging all the uploaded content to have a giant mass of data to train their models.

They also intend to do similar large-scale data gathering for their current driver-assistance models, recording meticulously whenever the driver needed to override the system and take over control.

This is also the counterintuitive reason why LIDAR-less tools (like Tesla and Comma.ai)

      • by sbaker ( 47485 )

That's why Tesla have a test facility in Delta Junction, Alaska.

        https://cdn.carbuzz.com/galler... [carbuzz.com]

        https://www.cnet.com/roadshow/... [cnet.com]

        Yeah...what were you saying about ignorance...THAT.

    • Re:It's all BS. (Score:5, Insightful)

      by Dixie_Flatline ( 5077 ) <<moc.liamg> <ta> <hog.naj.tnecniv>> on Wednesday February 12, 2020 @07:24AM (#59719050) Homepage

Driving on roads where you can't see the lane lines in the winter is arguably easier. At that point, your only goal is to not hit anything else, and as long as you're vaguely consistent about it, nobody cares, because they have the same problems. Just follow the taillights in front of you and stay on the side where you're going to turn next. You have to be mindful of the people around you on all sides, but staying inside a boundary that is little more than an agreed-upon convention isn't your concern anymore.

    • by Firedog ( 230345 )

      I come from a snowy part of the world. Even there, the roads are clear about 90% of the time. So autopilot would still be useful.

  • "If this is what's possible with a single camera..."

If they want to see what's possible with a single camera, they should test an older Tesla with the AP1 system, based on a MobilEye camera, along with radar and sonar. For most highway driving, it's just as good as the latest Tesla system, apart from lane changes and taking exits. The camera even reads speed-limit signs (something the newer Tesla systems use a database for instead).

From the sound of it, the Comma.ai system is not as good as the Tesla AP1

  • They all sound dangerous to me, but this is the way we are headed.

The one that keeps a larger gap between you and the car ahead sounds a bit safer, and I am surprised they complained about that. Although I guess if the gap is too big (say, a safe defensive-driving-sized gap) then you will always be a victim of 'Mr. I Can Parallel Park at Highway Speed' jumping in front of you.

    Oh and that silly Sonata 'smaht pahk' feature just begs for someone to get blocked in by auto parked cars on each side and rage w
    • by sbaker ( 47485 )

      On a Tesla it's adjustable between 1 and 6 car lengths - but what is a "safe" distance changes when the car has sub-millisecond reaction times and a radar system that bounces radar waves off of the road under the car in front so it can see TWO cars in front. It often starts to brake when the car TWO in front of you starts to slow down and the guy immediately in front of you hasn't even noticed yet.

      It kinda freaks you out a bit the first few times it does it...it's like "Why did this stupid car just slam o

      • Several states strongly suggest a three second gap between cars. Fewer than half of all drivers maintain such a gap. Since Car & Driver doesn't give actual numbers for their claim of too much space, we're left guessing. I suspect the C&D testers were unhappy with the autodrive actually leaving an adequate margin.
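For scale, here is what those gaps and reaction times work out to at highway speed (the speed and reaction-time figures are illustrative assumptions, not C&D's measurements):

        speed_mps = 70 * 0.44704              # 70 mph in m/s, ~31.3
        three_second_gap = 3 * speed_mps      # ~94 m, the suggested gap
        human_reaction = 1.5 * speed_mps      # ~47 m lost to a ~1.5 s human reaction
        machine_reaction = 0.05 * speed_mps   # ~1.6 m at an assumed 50 ms latency
        print(round(three_second_gap), round(human_reaction), round(machine_reaction))
        # 94 47 2 -- short machine reaction times are the argument for a
        # smaller gap, but the 3 s figure also buys margin for everything else.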
  • As per our local Tesla shill, we were going to get full self drive level 5 autonomy from Tesla by the end of 2019. Checking my calendar, I notice 2019 has ended. Therefore we have full level 5 self driving cars already. Mine is out there right now making me money as a taxi.
    • Bro, do you even CrossFit? ONE MILLION Tesla robotaxis in 2020! Proof: https://bigthink.com/technolog... [bigthink.com]
       
      Then, first manned mission to Mars in 2024. But first: shoot some more satellites into orbit.

      • Y'all are so behind the curve and beyond the horizon, all you zombies. I already bought ticket #999,997 on the Rocketship "Elon the First".

        Leaving the green hills of Earth to be the man from Mars. And my children will leave Red Planet one day on the glory road to be true citizens of the Galaxy.

        Arriving there January 1st, 2050 and starting with my potato farm and moonshine distillery upon arrival. Call me the Farmer in the sky.

        The time for the stars is today, not Friday!

Too large a gap? And it's assisted driving, not self-driving. How many upgraded car sensor-pack levels has Tesla already sold as upgradeable to fully autonomous? Like, they did not know for sure they would be able to do it with the sensors in the car, but went ahead and sold it with such a promise anyway, because consumer protection is dead.

  • by Arzaboa ( 2804779 ) on Tuesday February 11, 2020 @11:01PM (#59718176)

    I'm so excited that we just put the testing of automated vehicles into the hands of a couple coders in between jobs and the open source community.

Pros: Break everything, moving very fast.
Cons: What could go wrong?!
    Verdict: Don't look now, we're coming for you!

    --
    Never be afraid to trust an unknown future to a known god. - Corrie Ten Boom

    • The railroads can't make it work right. There's a clue.
      • by sbaker ( 47485 )

Maybe because it takes multiple minutes to stop a 2-mile-long coal train - and two minutes ahead of you is too far to resolve with cameras or radar?

Everything on the track is equipped with the system, and every part of the trackage is mapped. All the current maps are downloaded into the train before leaving the terminal. Everything should be tracked in the office, but they can't do it 100%. There is no 2-minute issue; it goes beyond that.
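Rough kinematics behind the "two minutes ahead" point (the speed and stop time are assumptions, not railroad data):

          speed_mps = 50 * 0.44704      # 50 mph freight train in m/s, ~22.4
          stop_time_s = 120             # assume ~2 minutes of braking to a stop
          # With roughly constant deceleration, distance covered is v*t/2.
          stop_distance_m = speed_mps * stop_time_s / 2
          print(round(stop_distance_m), "m =", round(stop_distance_m / 1609.34, 2), "miles")
          # ~1341 m, ~0.83 miles -- far beyond what onboard cameras or radar
          # usefully resolve, hence track-side signalling and full route maps.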
    • Cracks me up people gloom and doom about this stuff, meanwhile some asshole is doing 103mph on the freeway snorting cocaine off his hand, a 90 year old man is driving the wrong way up the off ramp, and some woman is weaving back and forth 6 feet in her lane while she's smacking lipstick on her ugly mug in the mirror.

      But I should be worried about poorly developed software...fucking right.

      • some asshole is doing 103mph on the freeway snorting cocaine off his hand

        A) That's not *his* hand.

        B) That's not a hand.

Those are rare but admittedly very dangerous cases. But what if the automated system has a similar issue? Now everyone is doing 103 on the highway, everyone is driving the wrong way, and everyone is swerving side to side in their lane. See the problem?
      • I've been driving for several decades. Have yet to see anything like those situations. I turned on autopilot on my model 3 JUST ONCE and it immediately started doing some heart stopping stupid shit. I'll keep my trust in humans for now, thanks.
    • The problem isn't open-source. Open-source coders are among the best in the world. See the LAMP stack for an example. See the Linux kernel updates for another.

But they're out of jobs? Sure! Some of them don't fit the "establishment" criteria for hiring. So they work on open-source projects.

You like the "establishment"? Established closed-source companies kill hundreds of people (Boeing 737 MAX). Also they can't get Starliner to sync its clock, something NTP fixed in the 1980s.

Lockheed can't get the F-35

Ya but can it honk the horn and flip off the asshole in front of you? I thought not.
These systems will not drive the last mile to completely autonomous cars. If you want to see the state of the art, you need to look at Waymo. Self-driving trucks running between transit points outside of town will be on the roads sooner than self-driving cars. They don't have to learn every driving situation; they only have to know one stretch of road.
  • >A too-large and unadjustable gap from cars ahead

    Yeah, that is latency biting you in the ass. This thing sounds half-assed and dangerous.

Aftermarket EULA says deaths = you're at risk of both big lawsuits and maybe even hard time.

  • ... that saying, "as good as Tesla's Autopilot" is a good comparison.

    Tesla’s Autopilot in Spotlight in New NTSB Reports on Crashes

    https://www.bloomberg.com/news... [bloomberg.com]

  • Obviously.

    That said, there is a wide gulf between reality and the hype that goes with these lane assist / departure systems. They are basically advanced driver assistance systems and should be marketed as such and not "auto pilot", "self-driving" or some other intentionally misleading bullshit.

  • by DogDude ( 805747 ) on Wednesday February 12, 2020 @08:25AM (#59719158)
    The ONLY reason for these sorts of things is because people are addicted to their phones. Full stop.

    Putting a regular car in cruise control and paying attention to the world outside of the windows really isn't that fucking difficult.

    1. Put down the phone.
    2. Drive your car.
    • by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday February 12, 2020 @11:42AM (#59719636) Journal

      The ONLY reason for these sorts of things is because people are addicted to their phones. Full stop. Putting a regular car in cruise control and paying attention to the world outside of the windows really isn't that fucking difficult. 1. Put down the phone. 2. Drive your car.

      I don't want to. I have better things to do. I, for one, will be very happy when I can be driven places by my car, without any need for me to stop doing what I'd really rather be doing, and without the cost of paying a human driver.

    • Yeah.

Autopilot systems are generally overrated. Computers should take away the monotony. That's what got the 737 MAX in trouble - their systems were doing more than taking away the monotony.

      Keeping in the lane on a highway, not hitting the car in front of you is probably all drivers need. Maybe lane switching on the highway as a bonus. But really how important is it for your car to do all the driving?

I had the impression that Boeing had a lot of problems that built up to the disasters.
- They pushed the aerodynamics too hard. It was too easy to get the plane into an unrecoverable condition.
- The automatic controls did the wrong thing in edge cases.
- It took too long to disengage the automatic controls.
- It was difficult or impossible to manually override the automatic controls when they were engaged.
- Boeing denied that substantial retraining was necessary to fly the 737 MAX safely.
    • Oh joy, another argument from the position of "it's always been that way, so why should it change now?"

      You must be one of the "late adopters" who gets dragged into new things kicking and screaming. By chance are you running Windows 98, MacOS 9, or Red Hat Enterprise Linux 5 on your computer right now? Android 2.2 on your phone?

  • by Darren Hiebert ( 626456 ) on Wednesday February 12, 2020 @09:34AM (#59719258) Homepage

    So they compare three systems and rate Tesla with "Verdict: One of the best," without using the word "best" for either of the other two. What the hell does that mean? Sounds cowardly to me.

And then, to add insult to injury, the verdict continues, "...but can it really evolve all the way to self-driving?" This is an idiotic question in what is supposed to be a "verdict" about its capabilities. Merely trying to raise doubt about "one of the best" that they were too cowardly to call "the best". There were no similar questions about either of the other two.

    This is sufficient for me to disregard this "review" as motivated by factors other than determining which was the superior system.

    • by sbaker ( 47485 )

      That note puzzled me too.

      "One" of the best implies that there are others...and there really aren't.

Waymo, Uber, Apple, etc. don't have production cars on the roads, and between the three of them have only a few million miles of data to train their AIs.

      Tesla have a million cars on the roads - each collecting 1000 miles a month. That means that Tesla collect more data in one DAY than the other AI self driving systems have ever collected in their entire history.

      AI systems NEED lots of data to train on. It's n
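Checking the parent's fleet-data claim against its own figures (the "few million miles" total for everyone else is rounded up generously here):

      fleet = 1_000_000                # cars on the road (parent's figure)
      miles_per_car_per_month = 1_000  # parent's figure
      fleet_miles_per_day = fleet * miles_per_car_per_month / 30
      print(f"{fleet_miles_per_day:,.0f} miles/day")   # 33,333,333 miles/day
      others_total_ever = 10_000_000   # "a few million miles", rounded up
      print(fleet_miles_per_day > others_total_ever)   # True -- one day beats it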

  • As someone who has owned two different Teslas now (with the original Intel MobilEye system first, and Tesla's own AP 2 system now), plus tried a few of the other driving assistant features out on Hyundai and other rental cars?

    I feel like we're at the point where everything is "2 steps forward, and one back". The more they try to add additional functionality to get closer to "full self driving" capability, the more issues come up with some of the existing functionality.

    For example? The old "AP 1" Tesla syst

    • by ledow ( 319597 )

      The software plateaued. AI always plateaus. Dangerously, it always builds quickly just enough to look "cool and interesting" at first to the untrained eye, only to then never significantly improve.

      I look at these products and now - they're not going to get any better for a long time. You can bolt on more sensors, you can add in more processing, you can tack on more backup systems but the system itself will plateau and right at a point that people who deal with stuff could predict.

      Even your OCR example...

Does anyone know what happens when the camera is blinded driving into the sun? Meat sacks like me tend to move our heads to shield our eyes with the visor, but what is a fixed-mount camera going to do?
  • Without some kind of side/rear camera system, changing lanes is impossible. You absolutely need to be able to see what (if anything) is coming from behind on the adjacent lane. So unless the vehicle has pretty much 360 degree camera coverage - it's either a death trap - or it can't overtake or merge in traffic.

The Cadillac Super Cruise relies on extremely high-resolution maps - like better than 10 cm resolution. It only works on 200,000 miles of US highways - there are 4 million miles of roads...so this i
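Taking the parent's own figures, that coverage works out to:

    mapped, total = 200_000, 4_000_000   # mapped highway miles vs all US road miles
    print(f"{mapped / total:.1%} of US roads")   # 5.0% of US roads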
