Welcome to Tesla Motors Club

Tesla replacing ultrasonic sensors with Tesla Vision


diplomat33

Average guy who loves autonomous vehicles
Tesla announced today that they are transitioning away from ultrasonic sensors and replacing them with Tesla Vision:

Today, we are taking the next step in Tesla Vision by removing ultrasonic sensors (USS) from Model 3 and Model Y. We will continue this rollout with Model 3 and Model Y, globally, over the next few months, followed by Model S and Model X in 2023.

Along with the removal of USS, we have simultaneously launched our vision-based occupancy network – currently used in Full Self-Driving (FSD) Beta – to replace the inputs generated by USS. With today’s software, this approach gives Autopilot high-definition spatial positioning, longer range visibility and ability to identify and differentiate between objects. As with many Tesla features, our occupancy network will continue to improve rapidly over time.

For a short period of time during this transition, Tesla Vision vehicles that are not equipped with USS will be delivered with some features temporarily limited or inactive, including:
  • Park Assist: alerts you of surrounding objects when the vehicle is traveling <5 mph.
  • Autopark: automatically maneuvers into parallel or perpendicular parking spaces.
  • Summon: manually moves your vehicle forward or in reverse via the Tesla app.
  • Smart Summon: navigates your vehicle to your location or location of your choice via the Tesla app.
In the near future, once these features achieve performance parity to today’s vehicles, they will be restored via a series of over-the-air software updates. All other available Autopilot, Enhanced Autopilot and Full Self-Driving capability features will be active at delivery, depending on order configuration.


[Attached screenshot of Tesla's announcement: screenshot-www.tesla.com-2022.10.07-15_56_58.png]
 
I don’t see how this will be possible especially in front of the bumper without adding extra cameras. Even then “measuring” distance via AI algorithm from a camera feed is not going to be as precise as ultrasonic.

Tesla Vision TACC is still not as smooth as other cars with radar based adaptive cruise…
OpenPilot is totally vision based and is smoother than all of the above.
 
Maybe they'll also be adding more cameras and switching to AP4 hardware as well? 🤔
 
...but just like us humans, there will be NO WAY to sense what is near the front bumper. This means you could pull up close to a concrete bollard/bumper and park, then both you and your car forget it's there, and you drive forward.......:eek:
It remains to be seen whether that is true. Teslas have a relatively short, sloped front, so only a small vision shadow is created in front of the car. As one pulls forward into a parking space, there might be enough object persistence to give full coverage. I'm sure some people here are feverishly calculating what that vision dead zone might be for each model. Also, if radar is put back in all cars, there might not be a front dead zone at all.
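For anyone doing that dead-zone math: it's just similar triangles. A sketch of the estimate, with all heights and distances being illustrative guesses rather than actual Tesla camera specs:

```python
def ground_blind_distance(cam_height_m, hood_edge_height_m, hood_edge_dist_m):
    """Distance past the hood edge at which a sight line from the camera,
    grazing the hood edge, finally reaches the ground (similar triangles)."""
    if cam_height_m <= hood_edge_height_m:
        raise ValueError("camera must sit above the hood edge")
    # The sight line drops (cam_height - hood_height) over hood_edge_dist,
    # then must drop a further hood_edge_height to hit the ground.
    return hood_edge_dist_m * hood_edge_height_m / (cam_height_m - hood_edge_height_m)

# Hypothetical figures: camera 1.2 m up, hood edge 0.8 m up and 1.5 m ahead.
print(round(ground_blind_distance(1.2, 0.8, 1.5), 2))  # 3.0 m of ground hidden past the hood
```

A longer, flatter hood (higher edge, farther out) grows that blind patch fast, which is why the short, sloped Tesla front matters here.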

As to the back view, the combination of the two repeater cameras and the rear camera seems to give virtually complete coverage, except just off ground level. Again, object persistence might eliminate any gaps.
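The object-persistence idea is simple in principle: once an obstacle drops out of the cameras' view, keep its last known position and update it with odometry as the car moves. A toy sketch of that bookkeeping (the class and numbers are hypothetical, not Tesla's implementation):

```python
class ObstacleMemory:
    """Remember obstacles after they leave the camera's field of view."""

    def __init__(self):
        self.obstacles = {}  # object id -> distance from bumper, in meters

    def observe(self, obj_id, distance_m):
        # Camera currently sees the object: refresh its stored distance.
        self.obstacles[obj_id] = distance_m

    def advance(self, delta_m):
        # Vehicle moved delta_m toward the obstacles: every remembered
        # object gets closer, including ones now hidden in the blind zone.
        for obj_id in self.obstacles:
            self.obstacles[obj_id] -= delta_m

    def nearest(self):
        return min(self.obstacles.values(), default=None)

mem = ObstacleMemory()
mem.observe("curb", 4.0)   # curb seen 4 m away while still visible
mem.advance(2.5)           # roll 2.5 m; curb is now below the camera's view
print(mem.nearest())       # 1.5
```

The obvious weakness, as posters above note, is that this only works for objects the cameras saw at some point; something already in the blind zone when you get in the car is never in memory.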

Lastly, maybe this is a hint that Tesla is finally going to give us a nicely stitched bird's eye view. We can always hope.