
BBC Autopilot video

Discussion in 'Model S' started by electronhauler, Jun 14, 2018.

  1. E-Ryc

    E-Ryc Member

    Joined:
    Jun 6, 2018
    Messages:
    154
    Location:
    Prague, CZ (EU)
    I'd say this also shows a weakness of neural-net-based driving: it's difficult to enforce "corner case behavior". For example, if the car ahead of you moves out of the way and a "noise" cloud appears on radar moving very quickly towards you, something is fishy, so do something (e.g. the same thing the car ahead did). I know there can be false positives, but RELIABLE obstacle avoidance is necessary before even thinking about FSD. At least that's how we did it in our (model-scale) robotics projects.
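
    To make the idea concrete, here is a toy, hypothetical sketch of such a hard-coded corner-case rule layered over a learned planner. Every name, field, and threshold below is invented for illustration; none of it comes from any real Autopilot code.

    ```python
    from dataclasses import dataclass

    CLOSING_SPEED_LIMIT = 15.0   # m/s; faster-approaching radar returns are suspect
    LANE_HALF_WIDTH = 1.5        # m; rough "in our lane" test

    @dataclass
    class RadarReturn:
        closing_speed: float     # m/s; positive = approaching us
        lateral_offset: float    # m; relative to lane center

    def corner_case_override(returns, lead_just_swerved, swerve_direction, planner_action):
        """If the lead car just swerved away and a fast-closing return sits in
        our lane, mirror the lead car's evasive move instead of trusting the NN."""
        for r in returns:
            if (r.closing_speed > CLOSING_SPEED_LIMIT
                    and abs(r.lateral_offset) < LANE_HALF_WIDTH
                    and lead_just_swerved):
                return {"steer": swerve_direction, "brake": True}
        return planner_action    # no rule fired; defer to the learned planner
    ```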
     
  2. R.S

    R.S Active Member

    Joined:
    Mar 8, 2015
    Messages:
    1,196
    Location:
    Munich, Bavaria, Germany
    That shouldn't be a problem for neural net training. Companies like Google, Tesla, or GM run millions of simulations just like that to train their neural nets for vehicles/people suddenly entering the road and such. The AI learns pretty quickly that it's better to steer away from an object, if possible.

    As long as the sensors detect it, an AI can be trained to avoid it. In real life, though, you need to be really certain that what your sensors tell you is actually true. If your car suddenly changes lanes and brakes just because it saw a shadow on the road and the vehicle in front happened to change lanes at the same moment, you can't put that in a consumer vehicle.

    And since shadow braking still happens, I don't think shadow lane changing plus braking is a good idea.
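
    A minimal sketch of that "be really certain first" idea: require the obstacle to persist across several consecutive sensor frames before acting on it, so a one-frame artifact like a shadow never triggers an evasive maneuver. The class name and frame threshold are assumptions for illustration.

    ```python
    CONFIRM_FRAMES = 5  # consecutive detections needed before we act

    class ObstacleConfirmer:
        """Debounces detections so a one-frame artifact (e.g. a shadow on
        the road) never triggers a sudden lane change plus braking."""
        def __init__(self):
            self.hits = 0

        def update(self, detected: bool) -> bool:
            # Reset on any miss; confirm only after an unbroken run of hits.
            self.hits = self.hits + 1 if detected else 0
            return self.hits >= CONFIRM_FRAMES
    ```

    The price of this certainty is a few frames of added reaction time, which is exactly the tension between avoiding phantom braking and reacting quickly enough to a real obstacle.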
     
  3. E-Ryc

    E-Ryc Member

    Joined:
    Jun 6, 2018
    Messages:
    154
    Location:
    Prague, CZ (EU)
    I'm not an NN expert (at all), but my understanding is that you have no guarantee what an NN does in a situation that differs even slightly from the training/testing data (based on some unrelated input). E.g. it normally stops on red (tested), but if there is someone in a blue jacket waiting by the crossing (an untested situation), it suddenly decides not to stop. Or the behavior may vary from version to version.

    So I'd expect some low-level (not NN based) "reflexes" (if wall, then stop), with the intelligence running on top of them.
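
    A toy sketch of that layered design, assuming a deterministic reflex layer that runs after (and can veto) the neural-net policy. The sensor key and the 5 m stopping threshold are invented for illustration.

    ```python
    REFLEX_STOP_DISTANCE = 5.0  # m; hard floor, independent of the NN

    def decide(nn_policy, sensors):
        action = nn_policy(sensors)  # the "intelligence" running on top
        if sensors["forward_range_m"] < REFLEX_STOP_DISTANCE:
            # Low-level reflex: "if wall then stop", regardless of what the
            # network proposed in this possibly-untrained situation.
            action = {"throttle": 0.0, "brake": 1.0, "steer": action["steer"]}
        return action
    ```

    The point of putting the reflex after the policy is that its guarantee holds no matter how the network behaves on inputs it was never trained on.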
     
