
Tesla's Newest Autopilot Navigation Now Handles Traffic Lights and Stop Signs (electrek.co) 110

"Tesla has released a new, highly anticipated Traffic Light and Stop Sign Control feature," reports Electrek: As we reported last month, Tesla has started to push an Autopilot update with the actual ability to handle intersections to some drivers in its "early access fleet," a group of owners who beta test new software update from Tesla. We even got to see a quick demo video. Later, we also took a look at the manual for the new feature that explains in detail how it works as well as its limitations.

Now Tesla is starting to push the new feature to the wider fleet in the U.S. as part of a new 2020.12.6 software update... Some owners have already started testing the feature.

Electrek argues that while the update may be of limited use, it's "more about improving Autopilot to help Tesla achieve full self-driving..."

"Drivers using this feature will basically train Autopilot to cross intersections."
Comments Filter:
  • You mean... (Score:1, Flamebait)

    Tesla put it on the road without the ability to recognize traffic lights and stop signs?

    • Does your car have that functionality?

      • You mean the functionality that's required for proper self-driving?? I expect better from Tesla - and anyone attempting to shill for them.
      • Every car I have driven has that functionality. The driver.
    • The system as it stands is intended for highway driving right now. This is work to move the system in a more general direction.
    • Re: (Score:2, Interesting)

      by AmiMoJo ( 196126 )

      It still can't reliably recognize traffic lights. This is just a driver assistance feature: sometimes it works and sometimes it doesn't. You have to be paying attention to the lights yourself and checking to see whether it is applying the brakes when they are red.

      I expect we will see an increase in red light running accidents with Teslas as people fall into a false sense of security.

      • Detecting the existence of a signalized intersection isn't difficult outside a blizzard; it's determining the exact light status for the correct lane that's the challenge. The default behavior should be to just stop at the intersection until the human wakes up or the machine is 100% certain.

      • I expect we will see an increase in red light running accidents

        If you approach FAST enough the Red light will magically shift to Green and so they'd be the ones at fault. The on-board camera will show that as well. The Tesla's supposed to have a really, really superb acceleration curve, right?

        • I expect we will see an increase in red light running accidents

          If you approach FAST enough the Red light will magically shift to Green and so they'd be the ones at fault. The on-board camera will show that as well. The Tesla's supposed to have a really, really superb acceleration curve, right?

          Superb acceleration, but sadly for your hypothesis, not a high top speed.

      • >"I expect we will see an increase in red light running accidents with Teslas as people fall into a false sense of security."

        I can't wait to see how Teslas vs. humans deal with complex 4-way stop signs that require subtle judgements, anticipation, and car-to-car signaling. Similarly with complex dual, short on/off ramps.

      • by Rei ( 128717 )

        You could RTFA.

        When the vehicle approaches a traffic light, it registers the colour of the light, but it will always stop for it. It alerts you well in advance so that you can override that with the accelerator; otherwise it'll start gradually slowing down for it. It makes you confirm before driving through an intersection, which is in every way imaginable a safer scenario than anything else, including no recognition of lights at all. More to the point, if you're approaching the light from a distance and it's

        • Re: (Score:1, Troll)

          by AmiMoJo ( 196126 )

          Huh. So they nerfed it. Worrying for investors who gave them money with the promise of having full self driving robot taxis this year.

          I guess this is just to make their customers do free training work then. Compare what the customer does to what the car thinks and try to improve it.

      • sometimes it works and sometimes it doesn't.

        In what way is unreliable traffic control device recognition a "driver assistance feature"? Many things are still useful when they don't work sometimes. But some things have to work perfectly or not at all. Mr Musk and his customers appear to be unable to tell the difference. I frankly couldn't care less if Tesla's vehicles occasionally choose to kill their users. Darwinian evolution and all that. If you ask me, the human race could do with a bit of upgradin

      • by stevew ( 4845 )

        Agreed - got the update yesterday and took it for a local drive.

        It stops at EVERY major intersection. If you tap the accelerator briefly or the right stalk on the steering wheel, it will continue instead of slowing to a stop. If it is behind another car, it will function as always, using the radar to see the car in front moving and decide to continue again.

        It has rough edges. Saw a YouTube demonstration where two stop signs were in sight at the same time a short distance apart. The system saw the 2

      • >It still can't reliably recognize traffic lights. This is just a driver assistance feature, sometimes it works and sometimes it doesn't. You have to be paying attention to the lights yourself and checking to see if it is applying the brake when they are red.

        A few times I've been behind some oversized truck and could not see the lights. But the camera up higher could, so I could see the lights on the screen. I like that.

  • So another unusable feature for Tesla owners in Europe and elsewhere.
    • What makes you think this is actually usable in America? After the first 20 accidents caused by this thing, then a complete denial by their gloriously arrogant leader, and finally the NHTSA stepping in, this feature will be revoked with an OTA update.
    • And useless in America, too.
    • by ebvwfbw ( 864834 )

      LOL, I consider this a good thing for people outside of the US. Anyone in the US is part of a test group with 3 ton vehicles moving around. Maybe they work, maybe they'll kill you. I didn't sign up to be in this test group.

  • The article's summary declares a beta version is being tested by drivers who train the software to move closer to the goal of an autonomous vehicle. Or, said another way, autonomous drivers train the beta version by testing it. This yields parameters for human tolerances and preferences for deceleration.

    What's the saying? It's not the fall, but the sudden deceleration?

    Necesselery speaking, said Dr. McNally, every stop along your radish is an incremental reduction to the carrot of your carriage.
    • The article's summary declares a beta version is being tested by drivers who train the software to move closer to the goal of an autonomous vehicle.

      The summary said it's being moved OUT of beta, into general release.

      What's the saying? It's not the fall, but the sudden deceleration?

      Go into any car. Gradually slow down as you approach a red light. Pay attention to how that feels.

      Now, approach that same red light but SLAM down on the brake as hard as you can from full speed (do not do this with anyone else be
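
      To put rough numbers on that "gradual vs. sudden" point (made-up figures, nothing from the article): the average deceleration needed to stop from speed v within distance d is v²/(2d), so the same red light can mean a gentle 0.2 g or a brutal 2 g depending on when you start braking.

          # Back-of-the-envelope comparison of gradual vs. sudden deceleration.
          # Figures are made up for illustration; a = v**2 / (2*d) is the average
          # deceleration needed to stop from speed v within distance d.

          G = 9.81   # m/s^2, standard gravity
          v = 20.0   # m/s, roughly 72 km/h (45 mph)

          for label, d in [("gradual stop over 100 m", 100.0),
                           ("panic stop over 10 m", 10.0)]:
              a = v ** 2 / (2 * d)   # average deceleration in m/s^2
              print(f"{label}: {a:.1f} m/s^2 (about {a / G:.1f} g)")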

      • Go into any car.

        ~#25149

        Are you typically this patronizing? I'm humming Gary Numan's Cars while reading your response and phoning a divorce attorney.

        ...from aggressive to timid.
        ~A super Ken doll worth loving

        Why a car, motor mouth? Why not any vehicle? Or Neal Stephenson's deliverator? Waymo/Alphabet speculated much?

        What I addressed by vegetable and a reference to an atlas is the component of human behavior I learned to appreciate from an article decades ago about the perceived time saved attempting to outpace traffic signals.

        I've done Musk's team a favor and directed them to the near

  • Here, where I live, it is legal to make a free left turn against a red light when turning onto a one-way street. Does this Tesla feature know how to do that?
    • I don't know, and I would guess not, but logical rules like that are really easy to program (as sketched just after this sub-thread), vs. recognizing the traffic light in the first place, which is incredibly difficult.
    • by stevew ( 4845 )

      The autopilot doesn't make turns other than lane changes on city streets under any circumstance yet.
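
    On the point above that the jurisdiction rule is the easy part, here is a hypothetical sketch of how such a rule might be encoded (made-up table, nothing to do with Tesla's implementation, and it leaves out the genuinely hard part: perception):

        # Hypothetical sketch of the "easy" half: encoding a jurisdiction rule such
        # as "left turn on red onto a one-way street is legal here". The hard half,
        # actually recognizing the light and the lane, is not shown. The table is a
        # made-up subset, not legal advice.

        ALLOWS_LEFT_ON_RED_ONTO_ONE_WAY = {"WA": True, "OR": True, "NYC": False}

        def left_on_red_permitted(jurisdiction: str, target_is_one_way: bool) -> bool:
            """True if a left turn on a red light onto a one-way street is allowed."""
            return target_is_one_way and ALLOWS_LEFT_ON_RED_ONTO_ONE_WAY.get(jurisdiction, False)

        print(left_on_red_permitted("WA", target_is_one_way=True))    # True
        print(left_on_red_permitted("NYC", target_is_one_way=True))   # False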

  • Tesla's auto-navigation features are an interesting case study in risk.

    On one hand, it is a risk to enable technology that is not yet fully functional, in that it does not yet handle every imaginable scenario. Yet they have found a way to release the technology gradually--agile style.

    On the other hand, other automakers are going about developing self-driving technology the old-fashioned, waterfall way, and nobody will be delivering any time soon.

    Tesla has had some accidents related to their self-driving tec

    • On the other hand, other automakers are going about developing self-driving technology the old-fashioned, waterfall way, and nobody will be delivering any time soon.

      Have you read or watched any of the Munro reports? (The company that tears down cars and then sells their secrets to other manufacturers...) They said Tesla is 10 years ahead of any other major automaker in self-driving, and 5-7 years ahead in every area of manufacturing, design, batteries, etc.

    • Less risky? It should be self evident that putting alpha level software in charge of a 4000+ pound vehicle is much more risky to human life than getting it right on a private course. So far, releasing alpha level software has proven very healthy for Musk's wallet, however.

      Which risk was your concern? Human safety or Musk's financial future?
      • Less risky? It should be self evident that putting alpha level software in charge of a 4000+ pound vehicle is much more risky to human life than getting it right on a private course.

        Have you ever driven a Tesla? One with autopilot? I didn't think so.

        My suggestion: zip your flap trap. Go spend a couple of hours driving one, and get back to us.

      • All software, when it is first released, goes through an alpha phase. That's true whether it is released through a waterfall big-bang approach, or through an incremental agile approach. When the big automakers finally get their self-driving technology out there, it too will be an alpha release. But the risk level is always lower for an incremental release than for a very large release. Tesla's software may be alpha or beta now, but it's going through a lot of real-world testing that the other automakers don

      • A closed course will miss every edge case. The only way we'll be able to validate self driving car systems is on the road.

        Tesla has this for its Automatic Emergency Braking system. They've pushed numerous updates and it's now the best rated AEB from any manufacturer. It got there thanks to over the air updates and uploaded training data.

        You could correctly make the argument that Tesla could have put numerous cameras and developed an AI inference chip for free. But that's not generally how the market works

        • It's up to the testers to simulate every edge case ON a closed course. If that's too hard, then you have no business doing self driving.
          • It's up to the testers to simulate every edge case ON a closed course. If that's too hard, then you have no business doing self driving.

            By that logic no teenager should be allowed on the road until they've been successfully driving accident free on a closed course for 10 years.

            Autopilot is already 10X safer than humans:
            https://electrek.co/2019/10/23... [electrek.co]

            • That's nonsense. A teenager is not a brand new technology. Teaching people to drive is a long solved problem. Teaching a computer to drive has never been done.
  • It automatically stops unless you press the gas to go through a green light. This is essentially undoing your driving muscle memory. When drivers have it disabled, they will forget to press the brake after enough training time.
  • Dead Lights (Score:4, Insightful)

    by kackle ( 910159 ) on Sunday April 26, 2020 @11:39AM (#59992646)
    And what if the lights are dead due to a power failure; does it just cruise, full speed, through the intersection?

    Personally, I'm sick of this technological joke already: Gambling with people's lives while the oh-so-smart engineers think they can predict/compensate for the infinite number of possibilities found on the roads. When tech fails us, and it does every month (pay attention, you'll notice it), we just accept the annoyance and/or reboot the "thing". But these toys will slow traffic repeatedly (when befuddled by anything unusual) and even kill some people. In the name of what, exactly? Don't tell me it's laziness. Don't tell me it's money.
    • And what if the lights are dead due to a power failure; does it just cruise, full speed, through the intersection?

      Go and observe a real intersection when it is out; most human drivers do that today.

      Only a few treat it as a four-way stop; the problem is that may not actually be what the law says [theglobeandmail.com], or even common convention. Even if it's a good idea.

      But why would a Tesla not be able to make the correct choice, depending on your definition of correct? Just because there are no lights does not mean the syst

    • If it's burnt out it comes to a stop.
      If it's red it comes to a stop.
      If it's yellow it comes to a stop.
      If it's green it comes to a stop.
      If it's abducted by aliens but in the map from a previous car seeing a light there it comes to a stop.
      If you brake it comes to a stop.
      If you disengage Ap it comes to a stop.

      It only doesn't come to a stop if you positively confirm manually that it's safe.
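
      Pulling those cases together, a minimal sketch of that "stop unless confirmed" policy (hypothetical names, based only on the behaviour described in this thread, not on Tesla's actual code):

          # Minimal sketch of the "stop unless confirmed" policy summarized above.
          # Names are hypothetical; this mirrors the behaviour described in these
          # comments, not Tesla's code. Note the light colour isn't even an input:
          # in this release it doesn't change the decision.

          def should_stop_at_intersection(driver_confirmed: bool,
                                          autopilot_engaged: bool,
                                          driver_braking: bool) -> bool:
              """Return True if the car should slow to a stop at the intersection."""
              if driver_braking or not autopilot_engaged:
                  return True                 # driver has taken over; the car stops
              if driver_confirmed:
                  return False                # accelerator tap / stalk confirm: proceed
              return True                     # red, green, unlit, or map-only: stop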

    • Re: (Score:3, Informative)

      And what if the lights are dead due to a power failure; does it just cruise, full speed, through the intersection

      You could read the fucking article. But I know... That's asking far too much on slashdot.

      The car stops at stoplights that are not lit. In this release, it is up to the driver to tell the car to proceed, even if it is green.

    • "does it just cruise, full speed, through the intersection?"

      Under the Principle of Least Surprise, they should, since that is what most human drivers do.

  • "Did you hear about the new horseless buggies? They got some kinda engine inside, makes the thing go by itself, don't need no horse pulling it."

    "Well I declare, that's purely unnatural! I'll bet it doesn't work very well, and if it does, it'll mean the end of civilization!"

    A hundred years or so later -- the engines work a lot better, but civilization not so much.

     

    • "Did you hear about the new horseless buggies? They got some kinda engine inside, makes the thing go by itself, don't need no horse pulling it."

      "Well I declare, that's purely unnatural! I'll bet it doesn't work very well, and if it does, it'll mean the end of civilization!"

      A hundred years or so later -- the engines work a lot better, but civilization not so much.

      You, I like.
      https://www.penguinrandomhouse... [penguinrandomhouse.com]

      The topic of this thread is better understood by framing the relationship between how roads are laid out and the autonomous vehicle, rather than between the driver and the vehicle. I won't cite the country, but road designs that involve access channels to businesses developing along main roads have been, and are, in stages of development in developing economies.

  • *In perfectly clear weather, and only if the sun isn't behind or near the sign or stoplight (otherwise the cameras get blinded), and only if the sign or light is facing exactly the right way and isn't storm-damaged or aged or temporarily off due to a power outage. Oh, and if a bird doesn't fly by or land on it. Basically, as long as your entire city is in flawlessly brand-new, pristine condition to the point of being considered laboratory conditions, it works great.

    Seriously, look up the ACTUAL unbiased test
