
Tesla's 'Full Self-Driving' Beta Called 'Laughably Bad and Potentially Dangerous' (roadandtrack.com) 232

Car and Driver magazine has over a million readers. This month they called Tesla's "full self driving" beta "laughably bad and potentially dangerous."

schwit1 shares their report on a 13-minute video posted to YouTube of a Model 3 with FSD Beta 8.2 "fumbling its way around Oakland." Quite quickly, the video moves from "embarrassing mistakes" to "extremely risky, potentially harmful driving." In autonomous mode, the Tesla breaks a variety of traffic laws, starting with a last-minute attempt to cross a hard line and execute an illegal lane change. It then attempts to make a left turn next to another car, only to give up midway through the intersection and disengage. It goes on to take another turn far too wide, landing it in the oncoming lane and requiring driver intervention. Shortly thereafter, it crosses into the oncoming lane again on a straight stretch of road with bikers and oncoming traffic. It then drunkenly stumbles through an intersection and once again requires driver intervention to make it through. While making an unprotected left after a stop sign, it slows down before the turn and chills in the pathway of oncoming cars that have to brake to avoid hitting it...

The Tesla attempts to make a right turn at a red light where that's prohibited, once again nearly breaking the law and requiring the driver to actively prevent it from doing something. It randomly stops in the middle of the road, proceeds straight through a turn-only lane, stops behind a parked car, and eventually almost slams into a curb while making a turn. After holding up traffic to creep around a stopped car, it confidently drives directly into the oncoming lane before realizing its mistake and disengaging. Another traffic violation on the books — and yet another moment where the befuddled car just gives up and leaves it to the human driver to sort out the mess...

Then comes another near collision. This time, the Tesla arrives at an intersection where it has a stop sign and cross traffic doesn't. It proceeds with two cars incoming, the first car narrowly passing the car's front bumper and the trailing car braking to avoid T-boning the Model 3. It is absolutely unbelievable and indefensible that the driver, who is supposed to be monitoring the car to ensure safe operation, did not intervene there. It's even wilder that this software is available to the public. But that isn't the end of the video. To round it out, the Model 3 nearly slams into a Camry that has the right of way while trying to negotiate a kink in the road. Once it gets through that intersection, it drives straight for a fence and nearly plows directly into it.

Both of these incidents required driver intervention to avoid.

Their conclusion? "Tesla's software clearly does a decent job of identifying cars, stop signs, pedestrians, bikes, traffic lights, and other basic obstacles. Yet to think this constitutes anything close to 'full self-driving' is ludicrous."
  • by fluffernutter ( 1411889 ) on Saturday March 27, 2021 @11:46PM (#61207224)
    but they have millions of miles of data!
    • Re: (Score:2, Interesting)

    • by TWX ( 665546 )

      but they have millions of miles of data!

      This gives me grave concern for the skill level of the average Tesla driver.

      Maybe they're still sampling data from Musk's Roadster?

      • by Anonymous Coward

        but they have millions of miles of data!

        This gives me grave concern for the skill level of the average Tesla driver.

        Maybe they're still sampling data from Musk's Roadster?

        Correction. This is the skill level of the average driver. And no, I'm not kidding, because when reading this I realized I see most of these actions on a weekly basis. In fact, yesterday I encountered someone driving 15 mph below the speed limit (on a one-lane road), someone making a right turn from the left lane, and someone running a red light. This doesn't include the people who, on a regular basis, don't know how to make turns across traffic. Instead of following a curve [wikihow.com] (part 2, step 4), they go d

    • by Joce640k ( 829181 ) on Sunday March 28, 2021 @03:27AM (#61207608) Homepage

      I actually watched the video before posting. It was slow, over-cautious and fumbling but at no point did it do anything downright dangerous.

      Most of the complaints were, "That's closer than I would have got", but hey, it's a computer, it knows exactly where the corners of the car are and it can calculate distances more accurately than you.

      It's a beta.

      • by Entrope ( 68843 ) on Sunday March 28, 2021 @06:23AM (#61207858) Homepage

        You don't think it was dangerous to make a left turn across oncoming traffic, forcing another car to brake in order to avoid a collision? Or to cross a double yellow line, again with oncoming traffic, in order to pass bicyclists? Or to continue driving forward towards a metal gate when a right turn was mandatory?

      • by Zxern ( 766543 )

        Did you miss the part where it drove through the intersection almost getting t-boned by oncoming traffic?

      • "It's a beta" is okay if you're releasing a video player or a calculator app or a video game. It's not okay if you're releasing a car.
      • I actually watched the video before posting. It was slow, over-cautious and fumbling but at no point did it do anything downright dangerous.

        Most of the complaints were, "That's closer than I would have got", but hey, it's a computer, it knows exactly where the corners of the car are and it can calculate distances more accurately than you.

        It's a beta.

        I also watched the video and you have a faaaar more generous interpretation than I.

        It broke multiple traffic laws: wrong lane, straddling lanes, literally driving on the wrong side of the road.

        It had at least one incident where it went too slow during a left turn and could have been t-boned, another time when it had a stop sign and cut off a car in a free flow lane.

        Otherwise it was generally overcautious, forcing the operator to take over because the car was stopped in the middle of the road for no real re

    • Rei, you better get in here people are saying bad things about Tesla!
  • by magzteel ( 5013587 ) on Saturday March 27, 2021 @11:51PM (#61207236)

    Look, it's in no way ready for use, even in that very uncongested city driving. But it's still pretty cool.
    I hope they get it right by the time I can no longer drive myself.

    • by TWX ( 665546 ) on Sunday March 28, 2021 @12:00AM (#61207256)

      I don't think they have enough in the way of sensor input. Waymo's test minivans, with rather unattractive spinning stuff on the bumpers and fenders and gear mounted on the roof, aren't exactly eye-catching, but it's all there for a reason.

      The kind of self-driving car you would need when you can't self-drive would probably look more like a van or RV. Without humans behind the wheel, it would make more sense to focus on the quality of the ride and the interior design, and I predict something as close to a living-room-on-wheels as the aerodynamics will allow.

      • Re: (Score:3, Interesting)

        by burtosis ( 1124179 )

        I don't think they have enough in the way of sensor input.

        Humans do it with a crappy 6 axis gyro/accelerometer and buggy stereo cams and actuate the controls all through a Rube Goldberg collection of linkages. We need better algorithms more than we need better sensors.

        • That only took a few billion years of evolution to get to the point where thousands of people still die on the roads every year due to human error.

        • by AuMatar ( 183847 ) on Sunday March 28, 2021 @12:44AM (#61207336)

          That's a short-sighted answer. Sure, it's possible to do without other sensors, but that doesn't mean other sensors wouldn't make it much easier. Bats can fly without sonar, but they do better with it. You can find humans in the dark with your eyes, but an infrared sensor does it better. Musk initially didn't use lidar because it was expensive, but he may be paying the price for it now.

        • by feranick ( 858651 ) on Sunday March 28, 2021 @01:03AM (#61207362)
          It's not just algorithms. It's the complexity and efficiency of the CPU. Comparing the CPU to the brain is preposterous. Let alone that training a human takes years (legally, 16-18) before they can drive, and it's not something that can be done by cutting corners. There is a reason children cannot drive: the overall training on ethical choices. Focusing on sensors and algorithms is looking at a tree and missing the complexity of the forest.
          • "But to build a forest, you must first learn how to build a single tree." - Me

          • I don’t necessarily disagree with the point you’re trying to make, but I don’t think your particular argument holds much water, given that it’s extremely common for kids as young as 10-12 to be driving trucks and tractors every day on farms. It isn’t so much that they need 16-18 years of training, so much as that they’re:
            A) Literally too small to physically work the controls until they reach a certain age
            B) As a generalization for children their age, still too irresponsib

          • by dasunt ( 249686 ) on Sunday March 28, 2021 @09:08AM (#61208336)

            It's not just algorithms. It's the complexity and efficiency of the CPU. Comparing the CPU to the brain is preposterous. Let alone that training a human takes years (legally, 16-18) before they can drive, and it's not something that can be done by cutting corners.

            Then again, comparing a human eyeball to a camera is also preposterous.

            A human's field of view has only a small portion that's actually in focus, about 5 degrees (the fovea). The rest is to some degree blurry. And we can't see while we move our eyes from one spot to another (saccadic suppression).

            Now if this sounds contrary to your experience, that's because your brain is filling in the missing data with what it expects to see.

            So we have a computer, which is dumber than a human, attached to a bunch of sensors which are far better than a human at seeing things.

            Which is an interesting comparison.

        • Humans have one hell of a control system, and that stereo camera is on a super whizzbang gimbal system, not just an array of poorly located monocular cameras. Tesla's biggest shortcoming is that Elon prioritized appearance over good sensor placement, and they put the cameras in places which allow easy occlusion of the view by simple shit like garbage cans, trees and cars. The default behavior is to creep into traffic, blindly, to try and get a peep of what's coming. That is an insta-fail.
        • by AmiMoJo ( 196126 ) on Sunday March 28, 2021 @04:48AM (#61207706) Homepage Journal

          Humans have several advantages though.

          Those cameras can move and point in different directions. They are self cleaning too, and can cope with extreme amounts of dynamic range (e.g. facing the sun), as well as making use of movable shades when necessary.

          The biggest advantage is the human brain, which has been developed over millions of years to rapidly process limited sensory input into an internal 3D representation of the world. It's also very good at handling unexpected data and recognizing objects in a variety of lighting conditions, even when partially occluded.

          Getting machines to that point is beyond our capability at the moment, especially in a mobile device. Supercomputers can't manage it, let alone what can be installed in a car.

          That's why Waymo and most others use better sensors, particularly lidar, to gather data that greatly simplifies the processing needed.

          • Humans have several advantages though.

            Those cameras can move and point in different directions. They are self cleaning too, and can cope with extreme amounts of dynamic range (e.g. facing the sun), as well as making use of movable shades when necessary.

            The biggest advantage is the human brain, which has been developed over millions of years to rapidly process limited sensory input into an internal 3D representation of the world. It's also very good at handling unexpected data and recognizing objects in a variety of lighting conditions, even when partially occluded

            Not to mention we've designed roads, vehicles, and traffic laws around the capabilities of the human brain.

            For instance, we're really good at deciding which distant objects are vehicles and which are buildings, but bad at doing the calculus to figure out safe passing distances. Hence we allow roads to be surrounded by visually diverse buildings but put traffic lights and stop signs at intersections.

        • Yes and all of the infrastructure is designed around the mechanisms that humans actually have. Also the human stereo cams can rotate about 120 degrees in either direction and are augmented by mirrors that give a full 360 degree view. Also the human image processing algorithms are extremely good at not just *identifying* objects but *predicting* their behavior. Also they can be patched/upgraded instantaneously.
        • Yeah, but humans have brains with more power than supercomputers, along with massive adaptive learning abilities and a self-teaching AI that the tech sector is decades away from being able to replicate.
        • Humans do it with a crappy 6 axis gyro/accelerometer and buggy stereo cams

          Actually we do it with a pair of crappy 3 axis gyros and a pair of buggy stereo cams, but we do it with a big special brain that produces the other data we need. And it also has a lot of other sensor input, such as the butt dyno.

          It doesn't matter what sensors humans use, though. That's totally irrelevant. What matters is what sensors computers need to do the job. Also, remember that the goal is to do a better job, not a worse one.

      • by Dutch Gun ( 899105 ) on Sunday March 28, 2021 @01:05AM (#61207366)

        I'm not sure how you come to that conclusion. Right there in the summary, it says:

        Tesla's software clearly does a decent job of identifying cars, stop signs, pedestrians, bikes, traffic lights, and other basic obstacles.

        The Tesla seemingly does a fine job at identifying everything around it. That implies the sensors are just fine. It's the general navigation and driving algorithms that don't seem to be up to par.

        • by AmiMoJo ( 196126 )

          The Tesla seemingly does a fine job at identifying everything around it.

          No it doesn't. The bit you quoted says it does a decent job of identifying a limited set of things commonly found on or near roads, and if you watch the videos even that is glitchy.

          Tesla is a very, very long way from having a generic vision system that can understand the world like a human does. They are banking on not having to go that far to make their system work. "Everything" is nowhere near accurate.

    • by fermion ( 181285 )
      I've driven cars with active cruise control. It's nice because it keeps to the lane, adjusts speed, and avoids collisions. What I notice is that it drives like a computer, which is sometimes dangerous. If a car in front of you is turning right, it treats it as a stopped car and brakes even though there is no danger of collision. Likewise, when the speed limit changes, it does not make a gradual speed change, but rather rapidly changes speed without giving the drivers behind time to react.

      While humans are drivi

      • Likewise, when the speed limit changes, it does not do a gradual speed change, but rather rapidly changes speed not giving the drivers behind time to react.

        Your comment made me think about one of the highways heading into Astoria, Oregon. The speed limit is 55 - but, every so often, there's a small town. At the edge of these towns the speed limit suddenly drops to 25.

        That'd be fun.

    • by fred911 ( 83970 )

      Absolutely. If you watch the video the article comments on, you have to be impressed. The drivers talk like it's an issue to cross a painted island to avoid bicyclists, when the law in many states specifically says a vehicle must stay at least 3 feet from bicyclists. There are a number of times when you can see the system acting with major caution when it senses humans. One time in the video they didn't know why the vehicle stopped in the road, when there was a bicyclist approaching on a sidewalk or

  • Honestly, who was expecting otherwise? That's why it's called "beta". Also, I am supremely surprised these videos are legally allowed to be shared. I would have assumed a confidentiality agreement or the like would have been included in the EULA for signing up for the beta. The fact is people are stupid and their expectations never match reality. By showing these videos, you only give ammunition to the folks who wish to do things like ban self-driving cars, which in my mind would be like banning the inte

    • by dknj ( 441802 )

      Anyone with enough money and a compatible Tesla can "sign up" for the beta, so your EULA requirement barring free speech would pretty much be trashed by any judge when challenged. I'm more amazed, or rather flabbergasted, that any Joe Schmoe would be expected to be responsible for that kind of atrocious driving. They make comments about the suburbs vs city driving, and I think the city dynamics really threw it for a loop. But you can already see the future coming where federal roadway guidelines will reduce man

      • by AmiMoJo ( 196126 )

        Tesla minimizes their liability by calling it beta and blaming any accidents on the driver's failure to prevent them.

      • Woo this is gonna be a fun decade.

        You and I have a very different point of view of what is "fun".

    • EVERYONE was expecting otherwise. Betas are for testing, not general consumption.

      I am astounded that you are defending the crapfest that is "self driving". If you said this is expected in pre-production cars, you would be spot on. Released into the wild, this is an absolute sh!tshow and SHOULD be roundly condemned.

    • by msauve ( 701917 ) on Sunday March 28, 2021 @01:44AM (#61207436)
      >The fact is people are stupid and their expectations never match reality.

      Problem is, those stupid people are marketing and selling their expectations for $10K a pop, and putting this shit on the public roadways.
      • A rare commodity is being sold at a price point many upper- and middle-class buyers can afford, for a service that has taken years of investment. The price point actually sounds like a loss until they sell enough. It's probably there more to understand the returns of the service than to make a profit.

        Blame your legislators for letting it on the roadways, or, you know, the people turning on the beta software in very difficult driving circumstances when they know it's a beta service. As for the people making the product

    • by PastTense ( 150947 ) on Sunday March 28, 2021 @02:16AM (#61207484)

      "Honestly, who was expecting otherwise? That's why it's called "beta". "

      No. For this level of incompetence you want the backup driver to be a full-time, skilled driver employed either by Tesla or a third-party testing organization, not the average, unskilled driver.

    • by misnohmer ( 1636461 ) on Sunday March 28, 2021 @03:06AM (#61207568)

      There is "beta" and then there is "pure hype". I still have a 2015 Tesla which was supposed to "find me anywhere on private property". What it actually does, 6 years later, is drive up to 40 feet forwards or backwards while I hold the dead-man switch, making sure it doesn't cause any damage (for which I have to accept responsibility when enabling the summon feature). Oh, and that car was also supposed to have 691 hp, but after trying everything, including their CTO writing a blog about how "EV horsepower is special and different", Tesla finally admitted that the car can only produce 463 hp on its best day. Their excuse: "well, the motors are 691 hp capable, but not the battery or the power delivery system we sold you". Do you think the car will EVER find me anywhere on private property or develop 691 hp, with Tesla covering the necessary upgrade costs since I paid for the feature 6 years ago? Or do you think they will refund me any money? Of course not. Just like my car has 691 hp motors, all those suckers who paid for Full Self Driving since 2016 have Full Self Driving *capable* cars, except for the sensors, redundant components, and computing power required to do so. But hey, the windshield, roof, tires, and even seats are totally Full Self Driving capable! You can gloat to your friends: "I have a car with Full Self Driving capable floor mats!"

      Btw, we also have a 2018 Tesla in the household, and its highway autopilot is actually worse than the 2015 version: it's twitchy, it brakes for no reason, and none of us in the house ever use it, even though I do occasionally use the old one, which works well as adaptive cruise control with lane keeping.

      Elon has dreams, and he found that he can sell those dreams for thousands of dollars, and once the buyers realize they've been sold nothing but hype, he finds more buyers. On the bright side, he does use that money to keep trying to build the dream; it's just that the people who pay for it don't realize they are paying for development of the dream for other buyers in a distant future. So, people who paid for Full Self Driving in 2016 paid so that someone in 2032 might be able to buy a Tesla with actual Level 4 or 5 autonomy. Of course, that money also helped make Elon one of the richest men on earth, but that's just a side effect.

      • by DarkOx ( 621550 )

        So you bought a product that did not deliver on basic claims like HP output, felt cheated, and turned around and bought another one?

        That is the sort of consumer behavior that enables companies like Tesla to continue to hype. The fact that you bought another tells me it's actually, at least in your view, a nice product compared to the alternatives.

        It should be saleable without over-promising and under-delivering. However, the buying public is rewarding the hype-and-vaporware-as-marketing strategy.

        • I did stop buying hype features after 2015, buying only base EVs as there was no comparable product, and never paid for another "coming soon" feature since. Then the Model 3 and eventually Model Y flood came, Tesla service went from stellar to crap, driven by corporate profit squeeze, so I stopped buying Teslas completely after 2018. Notice that Tesla stopped worrying about existing customers, with their primary focus on new customer acquisition: more first-time buyers who have not experienced the sales pit

    • by DamnOregonian ( 963763 ) on Sunday March 28, 2021 @03:26AM (#61207606)

      By showing these videos, you only give ammunition to the folks who wish to do things like ban self-driving cars which in my mind would be like banning the internet in the 80s.

      That is true.
      However, so is the inverse.
      By not showing these videos, you let lies about how good the technology is perpetuate.

      I would have assumed a confidentiality agreement or the like would have been included in the EULA for signing up for the beta.

      I'm of a mixed mind about it. I'm leaning toward the rights of the public though, purely because these are being beta tested in public, meaning anyone could take video of this, or become a consequence of its beta status.

      If only operated in a private setting, I'd argue for enforceability of NDA.

    • Honestly, who was expecting otherwise? That's why it's called "beta".

      I've been around the block a bit in the tech sector, about 35 years. This isn't a BETA, this isn't even close to a BETA. And you don't release safety-critical software, with life-ending consequences if it fails, to the public in the state this is in. This is at best an ALPHA version that should be limited to a restricted number of testers.

  • by Arzaboa ( 2804779 ) on Sunday March 28, 2021 @12:07AM (#61207268)

    When I first heard of fully automated driving, I thought of how nice it would be to not have to pay attention for 16 hours between St. Louis and Denver. Today, the bar for "self driving" is in environments where even I almost break the law and almost run into people.

    I hardly trust anyone driving in the city. I appreciate this being called out before we live in urban areas where there isn't a human to blame. I would happily sit in a Tesla driving down a highway in Utah, but I would never think it a good idea to do the same thing in New York City.

    --
    It always seems impossible until it's done.- Nelson Mandela

  • by spazmonkey ( 920425 ) on Sunday March 28, 2021 @12:10AM (#61207278)
    It was obviously the city's fault for building the roads in the wrong place. I bet they didn't even -ask- Elon where to put them. Probably haters trying to short TSLA stock.
  • by tgeek ( 941867 ) on Sunday March 28, 2021 @01:04AM (#61207364)
    Congratulations Tesla for your (dubious) achievement!
  • by whoever57 ( 658626 ) on Sunday March 28, 2021 @01:09AM (#61207372) Journal

    California has utterly stupid stop signs.

    Stop signs may be at a four-way stop, or at a two-way stop where one direction has a stop sign but the perpendicular traffic doesn't. In California, when you come up to a stop sign, there is no indication of which type of junction you have come to.

    Perhaps you may see the reverse side of the stop signs for the other traffic, but what if you can't? Does that mean they don't exist, or just that you can't see them?

    These types of junctions are a hazard to human drivers as well as to self-driving cars. In fact, self-driving cars may have an advantage, if the mapping data can include information on the existence or lack of stop signs for other traffic.

    • by tgeek ( 941867 ) on Sunday March 28, 2021 @02:26AM (#61207502)
      AFAIK, stop signs in California are no different from any other US stop signs. I suppose it has a lot to do with how you were trained. My old driver's-ed instructor 40 years ago used to emphasize that the purpose of the stop sign was not just to make you stop the vehicle, but to give you time to ensure that it WAS safe to enter the intersection, not just that it SHOULD be safe if other drivers were acting properly. I remember thinking at the time he was being kinda anal about that . . . but 40+ years later I gotta admit he wasn't wrong. (He also used to say the coroner could write "But he had the right-of-way" on your death certificate if you should ever fail to avoid an accident where it wasn't your fault; but other than that bit of dark humor he was a pretty good instructor.)
    • Perhaps you may see the reverse side of the stop signs for the other traffic, but what if you can't?

      That's exactly why stop signs have their unique octagonal shape: so you can tell whether other directions also have a stop sign, even from the back side. I don't recall ever seeing an intersection where the stop signs for the other approaches could not be seen at all.

      The little "all way" or "two way" hints below stop signs at many intersections are a nice-to-have that speeds up decision making, but they're not totally essential.

    • by Cederic ( 9623 )

      Stop signs are sub-optimal relative to 'Give Way' signs at most junctions but having driven in California their use of them does not create a hazard.

      Even in San Francisco with its utterly fucking stupid stop signs every hundred fucking yards. It's a pain in the arse and it's shitty but it's not a hazard.

    • This shouldn't be a hazard for human drivers. If you approach a stop sign, you stop. Assume it's a two-way stop. If you can proceed without inhibiting cross-traffic, do so. If not, as the cross-traffic approaches it will either just continue, and you can go once the intersection is clear, or the operators of those vehicles will stop, and then you know it's a four-way stop and you can proceed. That's not hard.

      Many four-way stops do put a small rectangle underneath with text "All way." Treati
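      The decision rule described in that comment can be sketched as a tiny function (a purely hypothetical illustration of the reasoning, not anything from a real driving stack; the function and parameter names are invented):

```python
# Hypothetical sketch of the stop-sign rule described above: always stop,
# assume a two-way stop, and use the cross traffic's behavior to decide.
# Not real autonomous-driving code; names are illustrative only.

def proceed_after_stop(cross_traffic_approaching: bool,
                       cross_traffic_stopped: bool) -> str:
    """Decide what to do after stopping at a stop sign of unknown type.

    - No cross traffic: go.
    - Cross traffic stops too: it behaves like a four-way stop, go in turn.
    - Cross traffic keeps moving: treat it as a two-way stop and wait
      for the intersection to clear.
    """
    if not cross_traffic_approaching:
        return "go"
    if cross_traffic_stopped:
        return "go"      # effectively a four-way stop: proceed in turn
    return "wait"        # two-way stop: yield until clear
```

      The point of the sketch is that the ambiguity the parent comment complains about (two-way vs. four-way) resolves itself by observing the other vehicles, with waiting as the safe default.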

    • The answer there is to get rid of that bizarre 4-way stop thing. It's insane, and America is practically the only place in the world with it, so there's clearly no need for it.

  • ...it was trained to maximize the amount of fines collected by the car owner!
  • Simply fill a car park with dozens of Teslas and watch the fun begin
  • Beta testing (Score:5, Insightful)

    by peppepz ( 1311345 ) on Sunday March 28, 2021 @02:39AM (#61207522)
    Back when I was young, beta testing meant that a product was feature-complete and with no bugs that were known by the developers, so it would enter a phase of testing by a wider public.
    In the post-Google world, "beta" means nothing.
    • by Zxern ( 766543 )

      Beta these days means, public release with no liability claims. This is Star Citizen level quality here. This should not be available to the general public.

  • by fahrbot-bot ( 874524 ) on Sunday March 28, 2021 @02:57AM (#61207554)

    This time, the Tesla arrives at an intersection where it has a stop sign and cross traffic doesn't. It proceeds with two cars incoming, the first car narrowly passing the car's front bumper and the trailing car braking to avoid T-boning the Model 3. It is absolutely unbelievable and indefensible that the driver, who is supposed to be monitoring the car to ensure safe operation, did not intervene there.

    Self Driving [xkcd.com]

    (Perhaps the driver just hadn't completed the "registration" before reaching the intersection...)

  • Autopilot anywhere is a tough task, so start easier with autopilot zones that other drivers know about, like bus lanes. Low- or no-pedestrian areas, well-mapped terrain, consistent routes, monitoring.
    • I've been saying this for longer that Tesla has existed. Our current infrastructure is optimized for human drivers. If you want machines to drive, build infrastructure for it. There are self-driving subways and self-driving buses in operation. Even self-driving snow plows in the mountains. They all operate safely because the infrastructure was upgraded. Trying to get a computer to use human-like sensors is silly.
      • by Zxern ( 766543 )

        Our current infrastructure isn't even optimized for human drivers in most places really. Pedestrians would be crossing over/under streets not on them if they were optimized for driving.

  • by John Trumpian ( 6529466 ) on Sunday March 28, 2021 @04:46AM (#61207704)
    And that is a compliment !
    • What I got out of the video was that Oakland drivers suck and computers may never fully mimic a human driver responsibly. Yeah, it made mistakes, but so did all the human drivers. Honking for waiting more than 500 ms? Double parking everywhere? Driving in a city requires driving like an asshole sometimes (making aggressive turns, taking chances, speeding, etc...). I struggle to see how self-driving would ever work in a place like Boston, where lines mean nothing and signals are suggestions. Tesla
      • This is my thought as well. I don't own any TSLA, not in the market for a car, but would really, really like self driving to come to fruition. But I have seen a lot of the practice cars around, whether Tesla, Cruise, Waymo, Uber, etc. and they all get hung up at relatively trivial situations for humans that are complex in theory.

        Four-way stops where drivers are going out of turn, or trying to wave people through. Double-parked delivery trucks, clueless pedestrians, and the real fact that most of us have a

  • "...to think this constitutes anything close to 'full self-driving' is ludicrous."

    It was driver error. They forgot to take it out of Ludicrous Mode.
  • Wide streets laid out on a grid? Imagine this thing in London - carnage!

    We're never going to have full self-driving judging by this video; we've barely got to the level of driving like a drunk.

  • This is super wide straight roads on a grid with little traffic and neat intersections with traffic lights.

    I wonder how it will perform in Europe, which is a lot more irregular.

  • This should hardly be a surprise to anybody who isn't drinking the kool-aid. The rules of the road would be hard enough to reliably implement, let alone the dynamic conditions that occur on them: breakdowns, floods, road works, lights that are out, diversions, blind corners, cars backing out, emergency vehicles, pedestrians crossing, flying debris, cops directing traffic, etc.

    Maybe if there is an alert, attentive human to override the dumbass car then perhaps this would be okay but Musk has a perpetual pr

  • After all, it is a test Tesla

  • I like the colorful language and all, but what does this actually mean for a car?
  • this is not self-driving

    as long as it requires you to have your hands on the wheel and be ready to take over with no warning (a 'ding' and giving up at the same time is not a warning), it is not "self-driving"

    it is "lol let me see you get out of the dangerous situations i'll get you in"

    how can nobody see that this is STUPIDLY DANGEROUS and ABSURD?

    Not only does the driver have to drive-without-driving, they also have to reverse-engineer the autopilot's intention from its actions, evaluate IN REALTIME whether said

  • Safe use of automated vehicles requires all vehicles to communicate. An individual car with an automated driving system is capable of interpreting a street sign or determining where other objects are around it -- but that does nothing to express the intent of the other vehicle. All vehicles need to express not only their position but also their 'next hop' and their destination. Vehicles need to express routing information and broadcast it to other vehicles around them so that a consensus judgement can be
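    The intent-broadcast idea in that comment might look something like the following minimal message format (a hypothetical sketch only; the field names and the JSON encoding are assumptions, not SAE J2735 or any real V2X protocol):

```python
# Hypothetical V2V message illustrating the comment's idea: broadcast not
# just position, but the vehicle's next maneuver ("next hop") and destination
# so surrounding vehicles can reason about intent. Purely a sketch.
import json
from dataclasses import dataclass, asdict

@dataclass
class IntentMessage:
    vehicle_id: str
    position: tuple     # (latitude, longitude)
    next_hop: str       # imminent maneuver, e.g. "left_turn@Oak_and_5th"
    destination: str    # coarse routing goal, e.g. "I-80 West"

    def encode(self) -> str:
        """Serialize for broadcast; nearby vehicles decode and merge."""
        return json.dumps(asdict(self))

# A nearby vehicle decoding a received broadcast:
msg = IntentMessage("car-42", (37.80, -122.27),
                    "left_turn@Oak_and_5th", "I-80 West")
decoded = json.loads(msg.encode())
```

    Whether such consensus-based coordination is workable is exactly the open question the comment raises; the sketch only shows the kind of routing information that would need to be on the wire.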
  • I understand that the Tesla is an entirely electric vehicle, and doesn't use gasoline, but do they really officially refer to the accelerator as the "GO PEDAL"?
  • I did a test drive at the local dealership recently in Colorado. While the technology was quite cool on the highway, I would not rely on it to get me anywhere for at least another 2-3 years, even in light traffic and normal daylight driving conditions with no unusual behaviors from nearby drivers. It makes some very questionable choices, and actively put me in dangerous situations. I was curious to see how it would behave in certain circumstances and kept my hand hovering on the wheel, ready to jump in at any seco
