Theory: Autopilot 1.0 hardware can be trained to avoid stopped objects

Here is a theory I have - perhaps AP 1.0 can be trained to avoid stopped objects in the road, such as the delivery truck sitting in the lane in the Norway incident - and, of course, the fatality of July 1.

Remember that Mobileye's EyeQ3 comes with a trained data set from Mobileye, which breaks computer vision down into sub-task groups such as:
  • Path planning
  • Object detection
  • Edge detection
  • Free space detection
  • Pedestrian detection
Mobileye has stated publicly that it has a rapidly growing employee base that does nothing but "annotate" images all day for the neural networks to train on - marking images with category labels such as "person." The networks are then trained to recognize those categories (a toy sketch of that training loop follows below).
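To make the annotate-then-train idea concrete, here is a minimal, hypothetical sketch in PyTorch. It is emphatically not Mobileye's pipeline: the label set, the tiny network, and the fake batch are all stand-ins for illustration only.

```python
import torch
import torch.nn as nn

# Hypothetical label set - the real EyeQ category list is proprietary.
LABELS = ["person", "vehicle", "free_space", "stopped_truck"]

# Toy stand-in for a vision network (nothing like the real thing).
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 128),
    nn.ReLU(),
    nn.Linear(128, len(LABELS)),
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Fake batch standing in for annotated camera frames:
# 8 RGB crops at 64x64, each with an annotator-assigned label.
images = torch.randn(8, 3, 64, 64)
targets = torch.randint(0, len(LABELS), (8,))

# One training step: the network is nudged toward predicting
# whatever category the human annotators assigned.
optimizer.zero_grad()
loss = loss_fn(model(images), targets)
loss.backward()
optimizer.step()
```

The point is just that adding a new category like "stopped_truck" is an annotation-and-retraining problem, not necessarily a hardware problem.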

Mobileye has never publicly stated that EyeQ3 is a closed box - for all we know Mobileye is giving Tesla improved versions of its firmware as time goes on and they get more and more images annotated.

Lastly we know that Mobileye's executives have stated that EyeQ3 is a very "unstressed" piece of hardware running at only 5-10% of its computing capacity.

On top of this, Tesla is writing its own sensor-fusion software and is engaged in its own fleet-learning project.

It seems plausible that at this very moment Mobileye is annotating images with labels such as "stopped truck sitting in highway" and "large semi trailer spanned across highway" - and that sometime soon these new components of image detection could be loaded into firmware on all current Autopilot-equipped Teslas.

Mobileye has stated that "turn across path" detection is coming in 2018, but I did not read anything that specified whether EyeQ3 is powerful enough to handle it - maybe it will now be given higher priority.
 
I don't see why you couldn't train the system, with the hardware we have in our cars right now, to say (a rough sketch of this rule in code follows below):

Apply some degree of braking and sound an alarm if both of the following occur:

1 - The radar signature says we are crossing under an overpass,

BUT

2 - The image-recognition algorithm detects, with very high confidence, a semi trailer across the highway.
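Here is that rule as a hedged sketch. Every name in it (RadarReading, VisionReading, the 0.95 threshold) is a hypothetical illustration, not Tesla's or Mobileye's actual API.

```python
from dataclasses import dataclass

@dataclass
class RadarReading:
    looks_like_overpass: bool   # stationary return that radar would normally ignore

@dataclass
class VisionReading:
    trailer_confidence: float   # confidence for "semi trailer across highway"

TRAILER_THRESHOLD = 0.95        # assumed confidence cutoff, chosen arbitrarily

def should_brake_and_alarm(radar: RadarReading, vision: VisionReading) -> bool:
    # Radar alone dismisses the return as an overpass, BUT vision
    # strongly believes a trailer spans the lane - trust vision.
    return radar.looks_like_overpass and vision.trailer_confidence >= TRAILER_THRESHOLD

# Example: radar says "overpass", vision is 97% sure it's a trailer -> alert.
print(should_brake_and_alarm(RadarReading(True), VisionReading(0.97)))  # True
```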

It seems plausible that nobody at Mobileye or Tesla thought to train the networks on the specific corner case of a light-colored tractor trailer crossing the car's path against a light sky. But it also seems plausible that the networks can now be trained to recognize such rare scenarios, if Mobileye and Tesla put forth the effort to annotate such images and train the networks on them.

In this way you could perhaps overcome the shortcoming of the existing radar hardware simply by increasing the intelligence of the image recognition algorithms.
 
An interesting part of the fatal crash investigation will be to find out whether Autopilot warned the driver at all. The two most likely answers are: he wasn't warned at all, or he was warned too late.

The European driver in the stopped-truck crash was warned with audible tones. So, presumably, with image detection, Tesla can distinguish stopped vehicles from road signs.

However, they're still not confident enough in the system to automatically apply the brakes. This almost certainly means they're getting too many false positives (as Musk himself has alluded to). They know this, of course, because they're constantly running algorithms in the background: when the car encounters something the system (secretly) believes should be a braking event, and the driver DOESN'T brake, that case gets transmitted back to Tesla for analysis and learning. They know exactly how many of these occur. Then they change their algorithm and deploy it again (a speculative sketch of this loop is below).
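A speculative sketch of that "shadow mode" loop, purely for illustration - the real telemetry pipeline is not public, and every name below is made up:

```python
# Collect candidate false positives: the background algorithm wanted to
# brake, but the human driver did not. These are the cases worth studying.
upload_queue: list[dict] = []

def shadow_mode_step(system_wants_brake: bool, driver_braked: bool, snapshot: dict) -> None:
    if system_wants_brake and not driver_braked:
        upload_queue.append(snapshot)   # send back to Tesla for analysis

# Example: the shadow algorithm flags a braking event the driver ignored.
shadow_mode_step(
    system_wants_brake=True,
    driver_braked=False,
    snapshot={"speed_mph": 65, "radar": "overpass-like", "vision": "trailer?"},
)
print(len(upload_queue))  # 1 case queued for fleet learning
```

Counting how often that queue fills up is exactly the false-positive measurement that would tell Tesla whether a new algorithm is safe to deploy with real braking enabled.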

But I wonder whether limitations of either the camera or the processing power will prevent automatic braking from being achieved in this Autopilot generation.

Of course, I believe we're less than six months away from Autopilot 2.0, and possibly even less than that for shipping hardware, so the issue will be mitigated soon.

But, back again to the fatal crash -- I wonder if the lack of camera recognition resulted in no warning for the driver whatsoever.
 
AP 1 just doesn't have enough data to work with. A stopped vehicle or a crossing vehicle looks to the system like too many other things. Trying to detect stopped cars or crossing traffic would probably cause a lot of false alerts.