
Fleet learning with AP 2.0: changing brains

Discussion in 'Autonomous Vehicles' started by Cobbler, Feb 27, 2017.

  1. Cobbler

    Cobbler Member

    Joined:
    Sep 22, 2015
    Messages:
    85
    Location:
    België
    Over the past weeks, I've been learning more and more about the concept of AP 2.0 and how the 'AI' in the car will try to teach itself how to drive based on the car's sensor inputs.

    The NVIDIA system, based on DAVE-2, was able to successfully steer a car after being trained end-to-end on recorded human driving:
    How Our Deep Learning Tech Taught a Car to Drive | NVIDIA Blog
    The model trained that way works for that setup only. If it were transferred to a different setup with a slightly different camera, steering behaviour, ... the system would have to recalibrate/relearn before it worked properly again.
    It's like saying: my brain has been optimised to work with my body's characteristics. If my brain were transplanted into a different body with worse eyes or weaker muscles, I would have trouble controlling everything, and it would take a learning process to adapt before the desired control of the body is possible.
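
    To make that concrete, here is a minimal sketch (in PyTorch, not NVIDIA's actual code) of the kind of end-to-end network the DAVE-2/PilotNet work describes: a small CNN that maps one camera frame directly to a steering value. The layer sizes only roughly follow NVIDIA's published PilotNet description and are purely illustrative; the real system also needs training on recorded human driving, data augmentation, etc.

    Code:
    import torch
    import torch.nn as nn

    class PilotNetSketch(nn.Module):
        """Rough PilotNet-style CNN: one camera frame in, one steering value out."""

        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                # Input: 3 x 66 x 200 (cropped/resized camera frame, as in the paper)
                nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
                nn.Conv2d(64, 64, kernel_size=3), nn.ReLU(),
            )
            self.regressor = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * 1 * 18, 100), nn.ReLU(),
                nn.Linear(100, 50), nn.ReLU(),
                nn.Linear(50, 10), nn.ReLU(),
                nn.Linear(10, 1),  # predicted steering command
            )

        def forward(self, x):
            return self.regressor(self.features(x))

    # Quick shape check with a dummy frame standing in for a camera image.
    model = PilotNetSketch()
    steering = model(torch.zeros(1, 3, 66, 200))
    print(steering.shape)  # torch.Size([1, 1])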

    Now to Tesla and the AP2.0 setup
    I assume there are slight differences between the individual cars they are building:
    - camera setup: slightly different mounting positions and angles
    - steering behaviour (wheel size?)
    - deterioration after a certain period of use, so that the car behaves differently after a while
    (caused by dirty sensors, mechanical wear, ...)

    Is it possible that each car is going to develop a unique dataset to successfully autosteer, one that would be difficult to interchange with other cars?
    Or is the learning system taking a different approach?
     
  2. JeffK

    JeffK Well-Known Member

    Joined:
    Apr 27, 2016
    Messages:
    5,368
    Location:
    Indianapolis
    Most systems that work with neural networks have a camera calibration step which produces values to normalize the input to a standard, therefore allowing you to use the same model on all vehicles (in a region of the world). The calibration values might be unique, but nothing else has to be.
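
    A rough sketch of what that can look like in practice (illustrative numbers only, using OpenCV; this is not Tesla's code): each car stores its own intrinsics/distortion from a one-time calibration, every frame gets warped into a canonical camera view, and a single shared network consumes the canonical frames.

    Code:
    import numpy as np
    import cv2

    # Per-vehicle calibration values (made-up numbers; produced once per car
    # by a standard calibration procedure).
    camera_matrix = np.array([[980.0, 0.0, 640.0],
                              [0.0, 980.0, 360.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.array([-0.32, 0.11, 0.001, -0.0005, 0.0])

    # The canonical intrinsics every car is normalized to, so one shared
    # model can be trained and deployed fleet-wide.
    canonical_matrix = np.array([[1000.0, 0.0, 640.0],
                                 [0.0, 1000.0, 360.0],
                                 [0.0, 0.0, 1.0]])

    def normalize_frame(frame_bgr):
        """Map this car's raw camera frame into the fleet-standard view."""
        return cv2.undistort(frame_bgr, camera_matrix, dist_coeffs,
                             newCameraMatrix=canonical_matrix)

    # Dummy 720p frame standing in for a real capture; the result is what
    # the shared neural network would actually see.
    standardized = normalize_frame(np.zeros((720, 1280, 3), dtype=np.uint8))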
     
    • Like x 2
  3. lunitiks

    lunitiks ˭ ˭ ʽʽʽʽʽʽʽʽʽ ʭ ʼʼʼʼʼʼʼʼʼ ˭ ˭

    Joined:
    Nov 19, 2016
    Messages:
    1,732
    Location:
    Prawn Island, VC
    Unique camera calibration values on S, X and 3 of course.

    The PX2 registers your current speed, wheel rotation, steering wheel angle, etc. through CAN/Ethernet data.
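
    As a rough illustration only (the frame IDs and scale factors below are invented; real Tesla CAN layouts aren't public), reading that kind of vehicle state off a CAN bus with the python-can library looks something like this:

    Code:
    import can  # python-can

    # Hypothetical arbitration IDs and scalings, purely placeholders.
    SPEED_ID = 0x155
    STEERING_ID = 0x0E5

    def decode(msg):
        """Turn a raw CAN frame into a (name, value) pair if we recognize it."""
        if msg.arbitration_id == SPEED_ID:
            raw = int.from_bytes(msg.data[0:2], "big")
            return "speed_kph", raw * 0.01          # assumed scale factor
        if msg.arbitration_id == STEERING_ID:
            raw = int.from_bytes(msg.data[0:2], "big", signed=True)
            return "steering_deg", raw * 0.1        # assumed scale factor
        return None

    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    for msg in bus:  # blocks, yielding frames as they arrive
        decoded = decode(msg)
        if decoded:
            print(decoded)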
     
    • Informative x 1
  4. malcolm

    malcolm Active Member

    Joined:
    Nov 12, 2006
    Messages:
    2,604
    Ah, what you need is an Eymorg Controller. Your brain won't even notice the difference.

     
