FSD Predictions for 2020

So what are your FSD predictions for 2020?

I think, at a minimum, we will see traffic light and stop sign response and basic "city-level AP" in 2020. I also think Tesla will do the AP3 hardware upgrades this year and focus almost exclusively on AP3 features.
Yes - 2020 is all about City NOA.

BTW, I was hoping the city NOA would be released in Q1. But I now think it might be Q2, with EAP release being in Q1.

Also, it is possible city NOA will be released in stages - stop sign stops first, then traffic light stops, then right turns, and finally protected left turns.
 
BTW, I was hoping the city NOA would be released in Q1. But I now think it might be Q2, with EAP release being in Q1.

City NOA could still be released at the end of Q1. It just depends on how well EA testing goes.

Also, it is possible city NOA will be released in stages - stop sign stops first, then traffic light stops, then right turns, and finally protected left turns.

It's possible. But I think Tesla will release traffic light and stop sign response together. The features are bundled together on the website, plus we know from the visualizations that Tesla already has both traffic light and stop sign detection enabled.

But the website does separate traffic light/stop sign response and "automatic city driving". So it is possible that Tesla will release traffic light and stop sign response first and then later release "turning at intersections" once they've fully validated that traffic light/stop sign response is reliable. I think that would make sense.
 
It's possible. But I think Tesla will release traffic light and stop sign response together. The features are bundled together on the website, plus we know from the visualizations that Tesla already has both traffic light and stop sign detection enabled.

But the website does separate traffic light/stop sign response and "automatic city driving". So it is possible that Tesla will release traffic light and stop sign response first and then later release "turning at intersections" once they've fully validated that traffic light/stop sign response is reliable. I think that would make sense.
The order of feature release will depend more on EAP testing - and even before that, on the error rate they are seeing. Of course, all the features themselves might not be done in one go.

In general I'd think they will release the easy / less risky parts first.
 
You know what I mean
You are saying - if shown a clear picture, humans can always tell?

But that is not what we are talking about - it's about what they do in real-life driving. I think Tesla will be pretty good in ideal conditions too. Edge conditions with occluded lights, sun at the back, weird angles, a confusingly high number of traffic lights, etc. will pose problems - both for Tesla and for humans.

One thing humans do better is to look at what the rest of the traffic is doing and draw conclusions when conditions are not ideal. For example, if another car in front stops and goes, they will assume a stop sign, look carefully, and infer the sign even if it is occluded. That is difficult for the car and takes a lot of training. What Tesla will do better is not get distracted.
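
To make that concrete, here is a minimal sketch of what such lead-vehicle inference could look like - all names, thresholds, and structure here are hypothetical, not anything we know about Tesla's actual stack:

```python
def infer_occluded_stop_sign(lead_speeds: list[float],
                             sign_detected: bool,
                             stop_speed: float = 0.3,
                             min_stop_samples: int = 5) -> bool:
    """Guess that a stop sign exists when the lead car does a clear
    stop-and-go even though vision found no sign (e.g. it is occluded).
    Speeds are in m/s, oldest sample first; thresholds are made up."""
    if sign_detected:
        return True                      # vision already found the sign
    stopped = sum(1 for v in lead_speeds if v < stop_speed)
    moving_again = bool(lead_speeds) and lead_speeds[-1] > 2.0
    return stopped >= min_stop_samples and moving_again

# Lead car slows, sits still for several samples, then pulls away:
speeds = [8.0, 4.0, 1.0, 0.1, 0.1, 0.1, 0.1, 0.1, 3.0, 6.0]
print(infer_occluded_stop_sign(speeds, sign_detected=False))  # True
```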

PS: Another thing we haven't touched on here is map-based stop/traffic signs. From what we have seen (from @verygreen), Tesla uses maps to look for stop/traffic signs. That is obviously error-prone - Tesla may need to either restrict city NOA to areas with good, up-to-date maps or move away from maps for this.
 
Humans have essentially perfect light *recognition* if they're paying attention.
Are you ignoring this - or are you saying that in all these conditions humans are perfect?

Edge conditions with occluded lights, sun at the back, weird angles, a confusingly high number of traffic lights, etc. will pose problems - both for Tesla and for humans.
PS: We aren't even talking yet about people with less-than-perfect vision.
 
Due to error or deliberate rule-breaking, human drivers run red lights between 0.02% and 0.6% of the time.

So Tesla needs an error rate of something like 0.01%. That's 1 in 10,000. That means you need to test on at least 100,000 red lights before you can say with any statistical confidence that the red-light detector is more accurate than a human.

Plus you need to test on green lights to know the false positive rate.
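
As a sanity check on those numbers, here is a quick back-of-the-envelope calculation (a sketch using the standard Wilson score interval, not any official test methodology):

```python
import math

def wilson_upper(failures: int, trials: int, z: float = 1.96) -> float:
    """95% upper confidence bound on the true failure rate, given
    `failures` observed in `trials` (Wilson score interval)."""
    p = failures / trials
    denom = 1 + z * z / trials
    centre = p + z * z / (2 * trials)
    margin = z * math.sqrt(p * (1 - p) / trials + z * z / (4 * trials * trials))
    return (centre + margin) / denom

# Best human rate cited above: 0.02% (2e-4). Even with ZERO missed
# red lights observed, the upper bound only drops below that once the
# sample is large enough (roughly the "rule of three": ~3.8/N here).
for n in (10_000, 100_000, 1_000_000):
    print(f"{n:>9,} lights, 0 misses -> upper bound {wilson_upper(0, n):.4%}")
# 10,000 lights is inconclusive (~0.04%); ~100,000 gets you below 0.02%.
```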
 
Are you ignoring this - or are you saying that in all these conditions humans are perfect?

Edge conditions with occluded lights, sun at the back, weird angles, a confusingly high number of traffic lights, etc. will pose problems - both for Tesla and for humans.
PS: We aren't even talking yet about people with less-than-perfect vision.

Sure, that's why I said essentially perfect.

As for impaired vision and similar issues, there are laws for that. Again, sure, not everyone follows the laws, but Tesla should be able to meet the light-recognition ability of a typical law-abiding human. I think light recognition is very obvious to this type of human. Again, I say recognition, not interpretation.
 
Sure, that's why I said essentially perfect.
You are not putting any boundaries around what you mean by perfect. So we can leave it here.

Due to error or deliberate rule-breaking, human drivers run red lights between 0.02% and 0.6% of the time.
That is interesting. So four nines is where they need to be - 99.99% accurate. I'm guessing, without looking it up, that those lights are at typical 4-way intersections without elevation/curvature issues.
 
You are not putting any boundaries around what you mean by perfect. So we can leave it here.

True, this is a situation of "we'll know it when we see it."

I think we can all agree that Tesla is under more media scrutiny than most other companies. It won't be good to see videos of Teslas failing to recognize lights that are "obvious" to humans once they roll out automatic stopping. I only mention this because I already saw a video where the new FSD visualizations didn't recognize a green light, so the failure rate in the videos I've watched is around 1 in 40 lights.
 
Humans have essentially perfect light *recognition* if they're paying attention.

The Tesla is always "paying attention" and should be able to recognize the light with at least the same accuracy.

I'm talking about recognition. Yes, humans sometimes purposely run lights or aren't paying attention.
Hmmm. Maybe Tesla ought to allow drivers to program how to run red lights and stop signs. Selectable, such as "run 50% of red lights and stop signs."
 
the new FSD visualizations didn't recognize a green light, so the failure rate in the videos I've watched is around 1 in 40 lights
Do you remember if it incorrectly showed it as red, showed no color, or didn't even notice the traffic light? In any case, the visualization of multiple traffic lights does reveal that Tesla is attempting to track all the ones facing the car. And if one light is green while another is "no color" (and the driver is entering the intersection), that is probably easy, automatically labeled training data that Tesla can hopefully use to improve recognition quickly.
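
Purely as speculation about how that auto-labeling could work, a sketch (the dict keys and the rule itself are invented for illustration, not Tesla's pipeline):

```python
def autolabel_unknown_lights(lights: list[dict], driver_entered: bool) -> list[dict]:
    """Invented auto-labeling rule: if the driver entered the intersection
    while at least one facing light was confidently green, label any
    'unknown' facing lights as green too - parallel lights facing the
    same approach almost always show the same aspect."""
    any_green = any(l["state"] == "green" for l in lights)
    if driver_entered and any_green:
        for light in lights:
            if light["state"] == "unknown":
                light["state"] = "green"
                light["source"] = "auto"   # mark as machine-labeled
    return lights

frame = [{"state": "green"}, {"state": "unknown"}]
print(autolabel_unknown_lights(frame, driver_entered=True))
# [{'state': 'green'}, {'state': 'green', 'source': 'auto'}]
```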
 
Do you remember if it incorrectly showed it as red, showed no color, or didn't even notice the traffic light? In any case, the visualization of multiple traffic lights does reveal that Tesla is attempting to track all the ones facing the car. And if one light is green while another is "no color" (and the driver is entering the intersection), that is probably easy, automatically labeled training data that Tesla can hopefully use to improve recognition quickly.

Doesn't recognize the green light at 1:33. Let us know if you figure out why.
 
Doesn't recognize the green light at 1:33. Let us know if you figure out why.
It doesn’t visualize a green light. But exactly what is visualized and what is seen might be different things. For example, if the probabilities for red/yellow/green are:

R: 0.1
Y: 0.0
G: 0.9

will it visualize it as green? Or did they set the cutoff to 0.99? As for deciding when to drive, is it purely based on the traffic light, or also on other vehicles, pedestrians, time since the light turned green, and a lot of other factors? We don’t know - it is hidden in the neural network. The traffic light output and its feature layers, fed from the hydra net into the RNN, are just one of many inputs.
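
To make the cutoff question concrete, the display decision might reduce to something like this toy function (the threshold values are hypothetical - we don't know what Tesla actually uses):

```python
def displayed_state(probs: dict[str, float], cutoff: float = 0.9) -> str:
    """Render a colour only when the network is confident enough;
    otherwise draw the light with no colour."""
    state, p = max(probs.items(), key=lambda kv: kv[1])
    return state if p >= cutoff else "no color"

probs = {"red": 0.1, "yellow": 0.0, "green": 0.9}
print(displayed_state(probs, cutoff=0.9))    # 'green'
print(displayed_state(probs, cutoff=0.99))   # 'no color' - as in the video?
```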

But this is a good example of data that Tesla might decide to collect: situations where FSD is uncertain and the driver decides to drive. They could just auto-label the clip with whatever the driver did.
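
A sketch of what such a data-collection trigger might look like (the thresholds, and the idea that uncertainty is measured as a middle probability band, are assumptions for illustration):

```python
def should_upload_clip(green_prob: float, driver_proceeded: bool,
                       low: float = 0.3, high: float = 0.7):
    """Hypothetical trigger: when the network is uncertain (probability in
    a middle band) but the driver's action disambiguates the scene, flag
    the clip for upload and auto-label it from what the driver did."""
    if low < green_prob < high:
        label = "green" if driver_proceeded else "not_green"
        return True, label
    return False, None

print(should_upload_clip(0.55, driver_proceeded=True))   # (True, 'green')
print(should_upload_clip(0.95, driver_proceeded=True))   # (False, None)
```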
 