
Were HW3-gen NNs running passively since July?

Discussion in 'Autopilot & Autonomous/FSD' started by strangecosmos2, Dec 28, 2019.

  1. strangecosmos2

    strangecosmos2 Koopa Troopa

    Joined:
    Nov 24, 2019
    Messages:
    177
    Location:
    New Donk City
    #1 strangecosmos2, Dec 28, 2019
    Last edited: Dec 28, 2019
    u/topper3418 on Reddit pointed out this excerpt from Tesla's Q2 update letter, which was published on July 24, 2019:

    “We are making progress towards stopping at stop signs and traffic lights. This feature is currently operating in “shadow mode” in the fleet, which compares our software algorithm to real-world driver behavior across tens of millions of instances around the world.”
    Now, this doesn't necessarily imply that the software running passively (in “shadow mode”) included the bigger, more computationally intensive neural networks developed for HW3. But it would make sense for Tesla to deploy them passively as soon as possible to test and train them, especially since Karpathy said in October 2018 (nine months before this update letter) that he was excited to deploy the new NNs.

    Prior to the Q2 update letter in July, the last update I'm aware of was from Elon on Twitter in April:

    “The Tesla Full Self-Driving Computer now in production is at about 5% compute load for these tasks [i.e. Navigate on Autopilot] or 10% with full fail-over redundancy”​

    Elon also tweeted that the compute load on HW2.5 was “~80%”.

    I only just realized that Elon said “for these tasks”, i.e. the features that were active in customers' cars at the time. That doesn't include anything that might have been running in shadow mode.

    So — pending further evidence — my hunch is that the HW3-gen NNs have been running passively in HW3 cars since at least July. I would guess since HW3 began entering production cars in March/April. So, ~6-9 months already, rather than just the last week in which the FSD Visualization Preview got pushed.

    u/keco185 on Reddit suggested that Tesla has been using human driving behaviour to train red light and stop sign detection. I think this makes sense, since if a human Tesla driver stops when the Autopilot planner isn't expecting it, that could be used to signal a false negative for a red light or stop sign. Conversely, if the human goes when the planner isn't expecting it, that could signal a false positive for a red light or stop sign. These “surprises” or “disagreements” could be used to curate examples to be hand-labelled for training (and also for testing). Aurora has described using a similar approach.
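    A minimal sketch of that curation signal, assuming it works as described above (the function and label names are hypothetical, not from any actual Tesla code):

```python
def classify_disagreement(human_stopped: bool, planner_expected_stop: bool):
    """Compare what the human driver did at an intersection to what the
    Autopilot planner expected. A disagreement hints at a perception error;
    agreement means there is nothing worth uploading."""
    if human_stopped and not planner_expected_stop:
        # Human stopped but the planner saw no reason to: the perception
        # stack may have missed a stop sign or red light.
        return "possible_false_negative"
    if not human_stopped and planner_expected_stop:
        # Planner wanted to stop but the human drove through: the stack
        # may have hallucinated a stop sign or red light.
        return "possible_false_positive"
    return None  # agreement: nothing to curate

# Events flagged this way could be queued for hand-labelling, e.g.:
events = [
    {"human_stopped": True,  "planner_expected_stop": False},
    {"human_stopped": False, "planner_expected_stop": False},
]
to_label = [e for e in events if classify_disagreement(**e)]
```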

    On r/SelfDrivingCars, u/brandonlive also speculated that Tesla may be using maps to detect false negatives for stop signs and traffic lights. I think that's a brilliant idea.
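    A sketch of what that map cross-check might look like (the function name, coordinate scheme, and search radius are all my assumptions, not anything confirmed about Tesla's pipeline):

```python
import math

def map_flags_missed_detections(detected, mapped, radius_m=30.0):
    """Cross-check perception output against map data. `detected` and
    `mapped` are lists of (x, y) stop-sign/light positions in metres.
    Any mapped sign or light with no detection nearby is a candidate
    false negative worth uploading for review."""
    return [(mx, my) for mx, my in mapped
            if not any(math.hypot(mx - dx, my - dy) <= radius_m
                       for dx, dy in detected)]
```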

    Other active learning techniques — like NN ensemble disagreement — could be used as well.
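    For instance, ensemble disagreement could be computed as something like this (a sketch only; the names and threshold are invented):

```python
import statistics

def ensemble_disagreement(probs):
    """Spread of per-model probabilities (e.g. 'stop sign present') across
    an ensemble. A high spread means the models disagree, which makes the
    frame a good candidate for hand-labelling."""
    return statistics.pstdev(probs)

def select_for_labelling(frames, threshold=0.15):
    # frames: list of (frame_id, [p_model_1, p_model_2, ...])
    return [fid for fid, probs in frames
            if ensemble_disagreement(probs) > threshold]
```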

    Karpathy's most recent talk about the new, HW3-gen NNs for anyone who missed it:

     
    • Disagree x 1
  2. tyson

    tyson Member

    Joined:
    Nov 13, 2016
    Messages:
    478
    Location:
    IA
    Browse @greentheonly on Twitter.... Stop light and stop sign detection/display has been around for a long while now..... Not just on AP HW3 either
     
    • Helpful x 1
  3. strangecosmos2

    strangecosmos2 Koopa Troopa

    Joined:
    Nov 24, 2019
    Messages:
    177
    Location:
    New Donk City
    True, the Q2 update letter may be referring to a smaller, HW2-compatible version of stop sign and traffic light detection.
     
    • Disagree x 1
  4. tyson

    tyson Member

    Joined:
    Nov 13, 2016
    Messages:
    478
    Location:
    IA
    Could be. He has deep access to the Autopilot computer so if it's in production code he will see it and usually report it if it's noteworthy. Lots of good info on there about "shadow mode" too.
     
  5. strangecosmos2

    strangecosmos2 Koopa Troopa

    Joined:
    Nov 24, 2019
    Messages:
    177
    Location:
    New Donk City
    I don’t think green has access to a car with a rooted HW3 computer yet, though.
     
    • Disagree x 1
  6. tyson

    tyson Member

    Joined:
    Nov 13, 2016
    Messages:
    478
    Location:
    IA
    Yeah, that may be... Though I wouldn't rule it out. I know he has a HW3 unit
     
  7. tyson

    tyson Member

    Joined:
    Nov 13, 2016
    Messages:
    478
    Location:
    IA
    One that he retrofitted to an MCU1 car too.. something Tesla has yet to start doing
     
  8. strangecosmos2

    strangecosmos2 Koopa Troopa

    Joined:
    Nov 24, 2019
    Messages:
    177
    Location:
    New Donk City
    @greentheonly, December 23:

    “Don't have root access on my hw3 unit”
     
    • Disagree x 1
  9. tyson

    tyson Member

    Joined:
    Nov 13, 2016
    Messages:
    478
    Location:
    IA
    Yeah, not arguing whether or not he has root. Merely pointing out that he is one of the only real-world people who has the level of access to answer the questions you're asking, and he is a good source of information.
     
    • Like x 1
  10. Mardak

    Mardak Member

    Joined:
    Oct 13, 2018
    Messages:
    521
    Location:
    USA
    I believe this is just running the same HW2/2.5 behavior "as-is" on the new hardware. From Autonomy Day, Bannon showed a slide with HW2.5 capable of handling 110 frames per second, while the FSD computer has a 21x increase to 2300 frames per second.

    Doing some frame counting of @verygreen's videos on YouTube shows the video updating at 30 frames per second, while the overlays for lines and boxes update less frequently for non-main cameras. Notably, it looks like the main camera overlay can update every frame while the other overlays update every 4th frame. 30 main-camera frames + 7 cameras * 8 frames = 86 frames per second. And 86 / 110 = 78% compute load for HW2.5, while 86 / 2300 = 4%: both numbers pretty close to the 80% and 5% figures.

    One guess is that Autopilot will soon use the full frame rate from all cameras on the FSD computer, as that would be 30 * 8 = 240 frames per second, so just barely over 10% load (and 20% with redundancy). This will be especially important for reacting to quickly moving cross traffic, since handling only 8 frames per second leaves a 125ms gap between frames, while 30fps leaves a 33ms gap. Converting that to distance, assuming cross traffic moving at 60mph: a 33ms gap = 3 feet, while a 125ms gap = 11 feet.
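    The arithmetic above can be checked directly (the frame budgets are the Autonomy Day figures as quoted; everything else follows from them):

```python
# Frame budgets quoted from Autonomy Day.
HW25_FPS_BUDGET = 110   # frames/sec HW2.5 can process
FSD_FPS_BUDGET = 2300   # frames/sec the FSD (HW3) computer can process

# Observed overlay update rates: main camera at 30 fps, 7 others at ~8 fps.
frames_used = 30 + 7 * 8                   # = 86 frames/sec
hw25_load = frames_used / HW25_FPS_BUDGET  # ~0.78, close to the "~80%" figure
fsd_load = frames_used / FSD_FPS_BUDGET    # ~0.04, close to the "5%" figure

# Latency-to-distance for cross traffic at 60 mph (88 ft/s):
ft_per_sec = 88
gap_at_8fps = ft_per_sec * (1 / 8)    # ~11 ft travelled between frames
gap_at_30fps = ft_per_sec * (1 / 30)  # ~3 ft travelled between frames
```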
     
    • Informative x 1
  11. croman

    croman Active Member

    Joined:
    Nov 21, 2016
    Messages:
    4,605
    Location:
    Chicago, IL
    Simple. Tesla misleads. There is no real shadow mode. Anyone who believes it should also keep leaving cookies and milk on the 24th.
     
  12. strangecosmos2

    strangecosmos2 Koopa Troopa

    Joined:
    Nov 24, 2019
    Messages:
    177
    Location:
    New Donk City
    #12 strangecosmos2, Dec 28, 2019
    Last edited: Dec 28, 2019
    1) Isn't verygreen using HW2 (or HW2.5) in the videos?

    2) What I'm suggesting is that both generations of NNs are running on HW3. If verygreen is using HW3, he may be capturing visualizations from the HW2-gen NNs.

    I think shadow mode is most likely real. The reasons I think so are: 1) Tesla says so, 2) other companies like Aurora also use shadow mode by another name, and 3) I've never seen any strong evidence to the contrary.

    As far as I can tell, green's argument that shadow mode doesn't exist is simply that Teslas don't record and upload sensor data or perception NN data on 100% of the miles they drive. That seems like a non sequitur to me. That's not how Elon or anyone else at Tesla has described shadow mode, as far as I'm aware. Who said shadow mode means Teslas are recording video or perception NN outputs all the time and uploading all of them at the end of each day?

    In my understanding, the term "shadow mode" is used in two senses: 1) a broad sense in which software (including NNs) runs passively on cars for testing and/or training purposes and 2) a narrow sense in which when a human driver takes a trajectory that the Autopilot planner computes a low probability for, this is a signal that can be used to trigger a data upload. (Or something along these lines; I'm just describing how Aurora does it and assuming Tesla does the same thing.)

    If the Autopilot planner and the human driver agree ~99.9% of the time, there is no need to upload that data. The only data that should be uploaded is some carefully curated subset of the ~0.1% of the time that they disagree. And that is consistent with how Elon has described shadow mode.
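    The narrow-sense trigger could be sketched like this (my reading of the Aurora-style approach applied to Tesla; the function name and threshold are hypothetical):

```python
def should_upload(planner_prob_of_human_trajectory, threshold=0.001):
    """The planner assigns a probability to the trajectory the human
    actually drove. Only a rare, low-probability disagreement triggers
    a snapshot upload; the overwhelming majority of miles upload nothing."""
    return planner_prob_of_human_trajectory < threshold
```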
     
    • Like x 1
    • Disagree x 1
  13. Mardak

    Mardak Member

    Joined:
    Oct 13, 2018
    Messages:
    521
    Location:
    USA
    I know, and I'm suggesting the 80% and 5% load numbers match up with the same NN getting the same inputs on both old and new hardware. The actual frame math is more nuanced, with various optimizations to crop or differently prioritize various cameras, but overall 80% of HW2.5's 110 fps budget is close to 5% of the FSD computer's 2300 fps budget.

    It would probably be safer to load a different NN onto the B node than to have driving-critical operation compete on the same node anyway, and the recent finding from verygreen might be a test run of this. Similar to adding cone visualizations as a test deployment in preparation for showing more visualizations like traffic lights, starting to use node B with even the same NN could be a test before running a different NN. Although I believe verygreen hasn't noticed an additional NN in the software so far, and I guess average owners could also notice by monitoring whether software update sizes increase significantly from roughly 500MB.
     
  14. croman

    croman Active Member

    Joined:
    Nov 21, 2016
    Messages:
    4,605
    Location:
    Chicago, IL
    You are just making stuff up without a real basis in reality. Tesla doesn't do what you suppose.
     
    • Disagree x 2
  15. EVNow

    EVNow Well-Known Member

    Joined:
    Sep 5, 2009
    Messages:
    9,152
    Location:
    Seattle, WA
    You are just making stuff up without a real basis in reality.
     
    • Disagree x 1
  16. croman

    croman Active Member

    Joined:
    Nov 21, 2016
    Messages:
    4,605
    Location:
    Chicago, IL
    #16 croman, Dec 29, 2019
    Last edited by a moderator: Dec 30, 2019
    The code is reality.
     
    • Disagree x 1
  17. EVNow

    EVNow Well-Known Member

    Joined:
    Sep 5, 2009
    Messages:
    9,152
    Location:
    Seattle, WA
    #17 EVNow, Dec 29, 2019
    Last edited by a moderator: Dec 30, 2019
    And you have the code?

    Seems dangerous to assume one hacker knows everything and misses nothing (even though Soylent has said clearly that verygreen misses a lot).

    I have a clear idea of what we know for a fact, what we can infer, and what is speculation. You seem to have difficulty distinguishing among those.

    Fact: Tesla says they run some things in shadow mode.

    Fact: verygreen says he hasn't seen any "shadow mode" but has seen evidence of various triggers.

    What is your inference, and what is your speculation?

    ps: I've spent enough years of my life explaining s/w architecture to people who still don't get it! So hacking is like the parable of the blind men trying to figure out what they are touching while touching an elephant.
     
    • Disagree x 1
  18. croman

    croman Active Member

    Joined:
    Nov 21, 2016
    Messages:
    4,605
    Location:
    Chicago, IL
    There is no evidence for shadow mode. As everyone says it's just Tesla's word and they are proven liars, particularly Elon.

    Green isn't God but I'm not the one operating on faith. His track record speaks better than Tesla and he does see the code and has the smarts and skills to properly interpret it. He also has ethics and morals unlike Tesla.

    But ultimately I'm not the one making assertions without basis. Shadow mode has yet to be proven, and it would have to be visible to anyone with root. All who have root access, particularly to the APE, are in agreement, except the true believers who don't need something as pesky as evidence to push their unfounded beliefs.
     
  19. loquitur

    loquitur Member

    Joined:
    Oct 11, 2018
    Messages:
    49
    Location:
    San Francisco
    What's with the all-or-none "shadow mode" thing? It could be restricted to development-team Teslas (in software, one always eats one's own dog food first), or to employee cars for a broader roll-out, etc. This is not binary.
     
  20. strangecosmos2

    strangecosmos2 Koopa Troopa

    Joined:
    Nov 24, 2019
    Messages:
    177
    Location:
    New Donk City
    I believe green's claim is nothing more than that HW2/3 Teslas aren't recording video or object detections all the time and uploading all of them.

    I think continuous recording is what green interprets as “shadow mode”, but I don't interpret it that way.

    I interpret shadow mode as being just another set of event-based triggers of the sort that green has confirmed exist. In the context of shadow mode, events that trigger snapshots are likely disagreements between the trajectory outputted by the Autopilot planner and the car's trajectory while being fully human-driven.
     
    • Disagree x 1
