Transportation

Elon Musk Hints at Tesla's Secret Project 'Dojo' Making the Difference in Race To Full Self-Driving (electrek.co) 122

Elon Musk set an aggressive deadline for Tesla to achieve full self-driving capability, but the electric automaker might have an ace up its sleeve that mostly went under the radar: project 'Dojo.' Electrek: Over the weekend, Musk hinted that it could make the difference. During Tesla's Autonomy Day earlier this year, Musk and other Tesla executives gave presentations about what the company is doing to try to achieve full self-driving capability by the end of next year. While most people were focused on the unveiling of Tesla's new HW3 'Full Self-Driving Computer,' which was being explained for the first time and is now installed in all new Tesla vehicles, there was a brief mention of another computer, the Dojo computer, which Tesla is working on and which could be a game-changer.

Last weekend, Musk was asked about the secret project and while the CEO didn't reveal anything new, he did hint that it could make the difference. During Autonomy Day, Musk briefly mentioned the project 'Dojo': "We do have a major program at Tesla which we don't have enough time to talk about today called "Dojo." That's a super powerful training computer. The goal of Dojo will be to be able to take in vast amounts of data and train at a video level and do unsupervised massive training of vast amounts of video with the Dojo program -- or Dojo computer."

This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    Project Dojo is training the Tesla marketing department to redefine full self-driving!

  • . . . a much more difficult task would be to ratchet it up one more notch to an ultra powerful training computer . . . maybe it could train humans to drive.

    But I'll admit that would be quite a challenge.

  • by olau ( 314197 ) on Monday August 05, 2019 @11:36AM (#59043976) Homepage

    There was recently a talk by Andrej Karpathy [slideslive.com], who is in charge of the self-driving project at Tesla, in which he discusses some of the problems with tackling such a complex machine-learning problem with several people working on the same neural net in today's dev tools.

    Time to train from scratch was definitely a main constraint, with people sometimes hacking the build just to avoid having to wait, like messing with file timestamps to avoid a full recompile. It sounded like full retraining took days, but the shortcuts produced results that could not be reproduced.
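The retraining bottleneck described in the talk is essentially a caching problem: if nothing about a run changed, reuse the old weights instead of waiting days. A minimal sketch of that idea in Python (the cache directory, fingerprint scheme, and `train_fn` hook are hypothetical illustrations, not Tesla's actual tooling):

```python
import hashlib
import json
import os
import pickle

CACHE_DIR = "weight_cache"  # hypothetical local cache of trained weights

def config_fingerprint(train_config: dict, dataset_version: str) -> str:
    """Hash the training config plus dataset version, so an identical
    run can be recognized without messing with file timestamps."""
    payload = json.dumps({"config": train_config, "data": dataset_version},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def train_or_load(train_config: dict, dataset_version: str, train_fn):
    """Run the (potentially days-long) train_fn only when no cached
    weights exist for this exact configuration and data."""
    key = config_fingerprint(train_config, dataset_version)
    path = os.path.join(CACHE_DIR, key + ".pkl")
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)  # cache hit: skip retraining entirely
    weights = train_fn(train_config)
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "wb") as f:
        pickle.dump(weights, f)
    return weights
```

Unlike the timestamp hack, a content-based fingerprint makes a reused result reproducible by construction: the cached weights are returned only when config and data match exactly.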

    New programming paradigms, new hardware. Just like the first 3D graphics card back in the Quake days.

  • by SuperKendall ( 25149 ) on Monday August 05, 2019 @11:38AM (#59044004)

    Dojo might be a really nice fancy purpose built computer for training, but the real secret to Tesla's eventual success was revealed plainly at that same event.

    It's the fact that Tesla can use video fragments from any Tesla anywhere for training. They can ask the entire network of cars to collect data for specific edge cases they may have discovered, then have a large training set of real-world cases with real-world drivers handling strange situations to learn how to react.

    People here scoff at this becoming a reality but the truth is we are really, really close to full self driving at a technical level, then we just need to worry about the regulatory side of things...
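The fleet-query idea can be pictured as shipping a small trigger to every car: upload a clip only when an on-board detector fires on the edge case of interest. A toy sketch under that assumption (the `Clip` shape and `detector` interface are invented for illustration; Tesla's actual campaign mechanism is not public):

```python
from dataclasses import dataclass

@dataclass
class Clip:
    """A short snippet: camera frames plus the driver's control inputs."""
    frames: list
    steering: list
    braking: list

def make_campaign_filter(detector, threshold: float = 0.8):
    """Build a predicate a car could evaluate locally: upload the clip
    only if the on-board detector scores any frame above the threshold."""
    def should_upload(clip: Clip) -> bool:
        return max(detector(frame) for frame in clip.frames) >= threshold
    return should_upload
```

A fleet-side campaign would then collect only the clips where `should_upload` fired, keeping uplink traffic proportional to how rare the edge case actually is.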

    • Re: (Score:1, Troll)

      Yeah, the key to intelligence is just to throw data at computers and say "this is bad" or "this is good". Welcome to 2019 cutting edge research.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      How many "edge cases" are there? A moose might be easy, but how about a bag lady with a shopping cart? How will the system BE SURE it correctly identifies what it perceives and acts accordingly? A hole in the road, or just a puddle? The difference can be overwhelming. What new edge cases will upgrades create?

      Driver assist: yes, possible.
      But the car should slow down and stop when it recognizes something that needs manual intervention. It can't do that on highways, and it might be too annoying for practical use on smaller roads.

      So, over-hyped and

      • A moose might be easy, but how about a baglady with a shopping cart?

        Once you have enough cars on the road you can query for just about anything.

        The case given in the presentation (which anyone who wants to talk about this intelligently must watch) was cars with bikes on the back. At first the system thought there was a bike on the road; then, after querying the fleet for examples of encounters with cars and bikes (either separate or together), the system could figure out when a bike was part of a

        • by HiThere ( 15173 )

          Actually, they *did* do the easy case first. Driver assist on a freeway is the easy case. The problem is, you can't get people to pay attention quickly enough when it shifts from easy to problematic.

        • by Wheely ( 2500 )

          As a Tesla driver I can assure you that all this does not happen. Not at all.

    • Wow, any privacy issues in this? Any security issues? How many rogue owners would it take to make the Tesla hive mind learn really weird stuff?
      • Wow, any privacy issues in this?

        Not really: the data is all anonymized (they don't know whose car the data came from). After all, the data captures everything around the car, not the car itself (just the driver inputs and the outside-facing cameras).

        How many rogue owners to make the Tesla hive mind learn really weird stuff?

        You have no idea if the data you are generating is being used, so any attempt would mean "driving weird" all the time. Not to mention that truly aberrant behavior would probably be

        • Plus, why would a group of owners want to make their cars worse?

          Hey, let's go spend $50k on a car in order to fuck with a company I just gave $50k to, in a way that they could easily detect and remove from the data set.

      • by Anonymous Coward

        Did you sign a consent form to have your car provide your intellectual property (the driving reactions you take, as well as the video you recorded during your drive, all of which should be your copyright as the owner and/or operator of the vehicle)?

        The current sickness of the information age that everyone is ignoring is the tacit allowance of someone taking ownership of intellectual property that doesn't really belong to them and using it to make a profit, while not offering the same consideration back to so

    • People here scoff at this becoming a reality but the truth is we are really, really close to full self driving at a technical level, then we just need to worry about the regulatory side of things...

      "Close"? If you mean within 5 years, I doubt we'll see it in cars I can buy. If you mean within 10-20 years, yeah, maybe, at least for some conditions and use cases. I think it's going to happen, but I'm rather more pessimistic on the time frame than Elon is, and maybe you are. Within my remaining expected lifespan, yes, I think it will actually happen. I just think it's going to take another decade at least (probably longer) to work out the kinks to the point where we can even really start seriously considering

      • For example when I drive to work how do I tell my future car where to park without me taking the wheel?

        The car shows all the available options on the big screen, and then you just put your finger on the one you want?

      • For example when I drive to work how do I tell my future car where to park without me taking the wheel?

        Well, a Tesla can already park itself, both parallel parking and backing into a space. So really it just needs to self-drive around until it finds a space, and then use the software already written.

        You're asking the wrong questions. Why do you give a damn where the car parks, if it can drop you off at the door and come to you when you are ready to go? Does it matter if you are parked in spot #23 or spot #52?

        • by Anonymous Coward

          Have you been to a mall? An airport? A city street infested with skyscrapers full of people?

          Can you even begin to imagine the clusterfuck of gridlock that will occur when every single person tries to have their car pick them up at the front door?

    • by Kjella ( 173770 )

      It's the fact that Tesla can use video fragments from any Tesla anywhere for training. They can ask the entire network of cars to collect data for specific edge cases they may have discovered, then have a large training set of real-world cases with real-world drivers handling strange situations to learn how to react.

      Isn't the hard part deciphering when the edge case is happening, not the reaction? I mean, humans are capable of a lot of things, but in a car it's mostly appropriate speed, distance, and use of a few signals. For example, say the edge case is a big wild animal (bear, moose, bison, lion, elephant, etc.): the reaction is 99% just slow down or stop and wait for it to clear the road. But there's nothing telling the car that's what just happened; it can't know this was the solution to a BWA problem unless it's detected the BWA itself, which means it'll have to be running video analysis all the time. It also won't have the person's perception of the situation: if the sensors see a person but the driver does nothing until he abruptly jams the brakes, that's probably when he saw the person. That means it's a poor idea for the AI to mimic a human, because it was actually the human being sub-optimal. I suppose there's some useful data in there, but it's hardly magic.

      • It's the fact that Tesla can use video fragments from any Tesla anywhere for training. They can ask the entire network of cars to collect data for specific edge cases they may have discovered, then have a large training set of real-world cases with real-world drivers handling strange situations to learn how to react.

        Isn't the hard part deciphering when the edge case is happening, not the reaction? I mean, humans are capable of a lot of things, but in a car it's mostly appropriate speed, distance, and use of a few signals. For example, say the edge case is a big wild animal (bear, moose, bison, lion, elephant, etc.): the reaction is 99% just slow down or stop and wait for it to clear the road. But there's nothing telling the car that's what just happened; it can't know this was the solution to a BWA problem unless it's detected the BWA itself, which means it'll have to be running video analysis all the time. It also won't have the person's perception of the situation: if the sensors see a person but the driver does nothing until he abruptly jams the brakes, that's probably when he saw the person. That means it's a poor idea for the AI to mimic a human, because it was actually the human being sub-optimal. I suppose there's some useful data in there, but it's hardly magic.

        The edge cases can be found by searching data sets for anomalous or extreme sensor readings. Humans can then review them to determine whether they are what they are looking for.

        In simple terms: you show the computer how to identify them, how to react, and how not to react. If all you have are poor reactions, then you explain this to the computer.
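That mining-plus-human-review loop might look something like this in miniature: scan a speed log for hard-braking episodes and flag them as candidate clips for a reviewer to label (the threshold, sample rate, and log format are assumptions for illustration, not anything Tesla has published):

```python
def flag_hard_braking(speed_log, decel_threshold=6.0, dt=0.1):
    """Flag sample indices where deceleration (m/s^2) exceeds a
    hard-braking threshold; each flagged index marks a candidate
    edge-case clip for a human reviewer.

    speed_log: speeds in m/s sampled every dt seconds.
    """
    flagged = []
    for i in range(1, len(speed_log)):
        decel = (speed_log[i - 1] - speed_log[i]) / dt
        if decel >= decel_threshold:
            flagged.append(i)
    return flagged
```

The flagged indices are exactly the "anomalous or extreme sensor readings" of the comment above; the human review step then decides whether each clip really shows the edge case or just sub-optimal driving.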

      • by ceoyoyo ( 59147 )

        Teslas have radar. Search for episodes where the car slows down (or doesn't) for some object that is poorly recognized by the camera systems.
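A toy version of that radar/camera disagreement search (the event record layout and confidence threshold are made up for illustration):

```python
def disagreement_episodes(events, conf_threshold=0.5):
    """Return indices where radar reports an object ahead but the
    camera classifier is unsure -- exactly the clips worth pulling
    back for labeling and retraining the vision net."""
    return [i for i, e in enumerate(events)
            if e["radar_object"] and e["camera_confidence"] < conf_threshold]
```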

    • by Wheely ( 2500 )

      Nowhere near it, in my view.

      As a Model X owner (which I love, by the way), I see no evidence that we are even remotely close to full self-driving, or that this neural net Tesla keeps talking about is anything more than a large database that doesn't work very well. For the last two years, auto-pilot still slams on the brakes at exactly the same point on my way home from work, still disengages over a motorway bridge I drive on, still thinks a particular wall is a bus, and still dives into the middle of large on ra

  • I doubt throwing a big computer at the problem will work. The issues are:
    - Lack of data on edge cases, which are by their definition very rare. You can't train a machine learning algorithm if you don't have (enough) data to train it on.
    - On top of that, there is the lack of predictability of our social fabric. Predicting how other road users will behave basically means you have to have almost consciousness level AI.
    - Even if you could model the existing social fabric, you can't predict how it will change once humans start attempting to predict the behavior of self-driving cars, since they don't meaningfully exist yet.

    • We all learned growing up that due to Moore's Law the solution to everything in the future will be to throw more CPU at it.

      - On top of that, there is the lack of predictability of our social fabric. Predicting how other road users will behave basically means you have to have almost consciousness-level AI.
      - Even if you could model the existing social fabric, you can't predict how it will change once humans start attempting to predict the behavior of self-driving cars, since they don't meaningfully exist yet.

      These behaviors you keep mentioning are dictated by a set of written rules of the road which we all follow.

      Don't need consciousness level AI to read a rulebook.

      • Don't need consciousness level AI to read a rulebook.

        But what if the rulebook doesn't apply? A big fallen tree blocks the road: do you drive on the sidewalk, or across a solid line to get through, or do you stop and wait for someone to remove the tree?

          • But what if the rulebook doesn't apply? A big fallen tree blocks the road: do you drive on the sidewalk, or across a solid line to get through, or do you stop and wait for someone to remove the tree?

          You turn into someone's driveway, turn around, and go back the way you came and take an alternate route. You know, like humans do. With some fumbling around and false starts, no doubt. You know, like humans do.

          • You turn into someone's driveway, turn around, and go back the way you came and take an alternate route. You know, like humans do.

            Often, humans just drive around the obstacle, even if it means breaking some rules.

          • In most places that turning around is illegal, even more so if there are signs posted declaring it trespassing.
            So now your car has ignored the rulebook and broken the law.
            • So it can turn around in the street right in front of the tree with a three-point turn? It's not like there is oncoming traffic; the road is fully blocked.

              Or are you going to say that's illegal too?

              • In most states that is totally legal, provided you stay on the public road. There are some exceptions, like a 'no u-turn' sign being posted, and in most places the road has to be marked that passing is allowed, so no double yellow.
                But here let's say it is a passing zone, there are no signs preventing it, and you do not cross onto private property. Since it is safe to do so, you would be legal to turn. Now comes the part of having to get a self-driving car to follow that and all the different laws that ex
        • How about what any driver would already know how to do - turn around and bypass the problem on another route.

        • These are the worst possible questions. Because humans also can't make these decisions "correctly".

          "Do I sit and hope someone will come along? Do I call someone? Do I try to offroad and risk damaging my car?"

          There is no correct answer if it's the only road without a 10 hour detour. It might be faster to find a chainsaw. It might be better to go home and give up. It might be better to park and rent a car on the other side.

          But this all assumes that humanity has been extinguished from earth and no longer e

    • lack of data on edge cases, which are by their definition very rare. You can't train a machine learning algorithm if you don't have (enough) data to train it on.

      When the odds are one in a million seconds (roughly once per year of driving) and you have a million cars on the road, you are observing that edge case somewhere in the LTE-connected fleet about once every second. That means you could collect roughly an hour of such footage every day.

      For comparison, Waymo just announced one of the most expansive data sets available for academic study, with 16.7 hours of footage. So in a month Tesla could have twice that... just of one specific one-in-a-million
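The back-of-the-envelope above is easy to parameterize. A small helper, with fleet size, event rate, driving hours, and clip length all as explicit assumptions rather than known Tesla figures:

```python
def daily_edge_case_footage_hours(fleet_size, events_per_driving_second,
                                  driving_hours_per_day, clip_seconds):
    """Expected hours of edge-case footage captured per day, assuming
    events occur at a fixed rate per second of driving and each event
    yields one fixed-length clip. All four inputs are assumptions."""
    driving_seconds = fleet_size * driving_hours_per_day * 3600
    events_per_day = driving_seconds * events_per_driving_second
    return events_per_day * clip_seconds / 3600
```

With a million cars each driven an hour a day, a one-in-a-million-seconds event, and 10-second clips, this works out to about 10 hours of footage per day, and the figure scales linearly with each assumption.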

      • A man at the side of the road waves you to pull over. You recognise that he is a cop. You pull over.

        A man at the side of the road waves you to pull over. You suspect that he is a gangbanger. You don't pull over.

        So, Tesla has to teach their cars the difference between a cop and a gangbanger.

  • Elon is a really amazing guy. He said in April Tesla will have a full robotaxi network available in 2020. Sounds like all they need to do is get this Dojo training computer working and then download it into the HW3. The future is going to be amazing (on Mars).

  • There is as much learning AI in a Tesla as there is water vapour in Amazon Cloud.
  • Hai sensei Musk-san, sen sen no sen.

  • I thought the main problem holding back full autonomous driving was the expense of LIDAR.
    • Tesla doesn't use LIDAR, though; Waymo and others do. Teslas simply use cameras and image processing. Good enough for humans...

  • ...so unless Musk's people have a serious leg up on every (so-called) AI researcher on the planet, not to mention every neuroscientist on the planet, they'll never have 'full self-driving'.
  • Sorry, but fully self-driving can't happen without LIDAR, which he is opposed to.
