Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla replacing ultrasonic sensors with Tesla Vision

A little virtual tennis ball shows up on the screen and then alarms "STOP" when the ball hits the front of the Tesla, just like the ultras did when you got to 12 inches. 😄
I use the floor mounted Stop Sign
 
Some sample clips from the new cams in this thread:
Someone posted footage from the older cams earlier in the thread too. The new cams see a little more of the side of the car and less of the mirror, for example. I can't tell whether they see farther out to the side, but I wouldn't be surprised if there's some cropping happening in the video output.
 
4) Ultrasonic capabilities will be replaced with the new vision-only neural nets, which (for simplicity) break down objects into blocks. At close range, these blocks are pretty small (possibly down to 1cm).
5) The new visualizations on the screen when you're parking will show distances to objects by using the new Occupancy Network.
These two items are pure speculation on your (and Elon's) part. There's simply no indication that the occupancy network renders objects down to the resolution you suggest. Perhaps that is why some people are "chicken littl'ing" this. And yes, it's very similar to the removal of radar, which brought back the phantom-braking problem in Navigate on Autopilot that had been all but solved.
 
These two items are pure speculation on your (and Elon's) part. There's simply no indication that the occupancy network renders objects down to the resolution you suggest. Perhaps that is why some people are "chicken littl'ing" this. And yes, it's very similar to the removal of radar, which brought back the phantom-braking problem in Navigate on Autopilot that had been all but solved.
Re 4) from AI Day 2022:
So how we solved: there are three big steps. The first step is high-precision trajectory and structure recovery by multi-camera visual inertial odometry. Here all the features, including ground surface, are inferred from videos by neural networks, then tracked and reconstructed in the vector space. The typical drift rate of this trajectory in car is like 1.3 centimeter per meter and 0.45 milliradian per meter, which is pretty decent considering its compact compute requirement. Then the recovery service and raw details are also used as a strong guidance for the later manual verification step. This is also enabled in every FSD vehicle, so we get pre-processed trajectories and structures along with the trip data.
A drift of 1.3 cm per meter implies the system's resolution is at least that fine.
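For a sense of what that 1.3 cm/m figure means at parking distances, here's a back-of-envelope sketch. The linear-accumulation model is my assumption, not something stated in the talk:

```python
# Rough implication of the quoted 1.3 cm/m odometry drift for a short
# parking maneuver, assuming error simply accumulates with distance traveled.
DRIFT_CM_PER_M = 1.3

def position_err_cm(meters_traveled: float) -> float:
    return DRIFT_CM_PER_M * meters_traveled

for d in (1, 5, 10):
    print(f"{d} m traveled -> ~{position_err_cm(d):.1f} cm accumulated error")
```

Over the last meter or two of a parking approach, that stays in the low single centimeters, i.e., in the same ballpark as USS accuracy.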

Re 5)
Once the occupancy network has a high-resolution voxel cloud, representing vehicle clearance is easy.
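To illustrate why clearance is easy once you have a voxel cloud, here is a minimal sketch. The grid size, voxel resolution, obstacle position, and bumper location are all made-up numbers, not anything from Tesla:

```python
import numpy as np

# Hypothetical 2D occupancy grid: 10 m x 10 m at 10 cm voxels (assumed numbers).
RES = 0.10                          # meters per voxel
grid = np.zeros((100, 100), dtype=bool)
grid[60:63, 45:55] = True           # an obstacle roughly 6 m from the origin

# Assume the car's front bumper sits at y = 5.0 m.
bumper_y = 5.0
occ_y, occ_x = np.nonzero(grid)     # coordinates of occupied voxels
ahead = occ_y * RES > bumper_y      # keep only voxels in front of the bumper
clearance = (occ_y[ahead] * RES - bumper_y).min()
print(f"clearance: {clearance:.2f} m")
```

Clearance is just the minimum distance from the vehicle envelope to any occupied voxel; no object classification is needed at all.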
 
No idea where you are quoting from, but someone must have missed all their grammar classes. :) It is painful to read.
Otherwise, if I understand it correctly, it should be OK with static objects in the blind spots, given the USS accuracy of >12" (30 cm).
On the other hand, I am very skeptical of the 0.45 milliradian per meter. That is a resolution of 0.45 millimeter at one meter distance. I'm not sure that is possible with the current cameras. If anyone has the specs, we can calculate their resolution.
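The per-pixel calculation is straightforward if you plug in assumed specs. The FOV and pixel counts below are illustrative guesses (AP cameras are commonly reported around 1280 pixels wide), not confirmed Tesla numbers:

```python
import math

# Back-of-envelope angular resolution per pixel for a few assumed camera
# configurations. All FOV values and pixel widths here are guesses.
def mrad_per_pixel(fov_deg: float, h_pixels: int) -> float:
    return math.radians(fov_deg) / h_pixels * 1000.0   # milliradians/pixel

for name, fov, px in [("narrow (assumed 35 deg)", 35, 1280),
                      ("main (assumed 50 deg)", 50, 1280),
                      ("fisheye (assumed 120 deg)", 120, 1280)]:
    print(f"{name}: {mrad_per_pixel(fov, px):.2f} mrad/pixel")
```

Even the narrow-camera assumption works out to roughly 0.48 mrad per pixel, so hitting 0.45 mrad from a single frame would require sub-pixel localization or aggregation across many frames.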
 
Hope someone with a new 3/Y posts here soon. I'd like to know what the display shows when parking. Does it show any warnings like "Proximity Detection not Available" or any distance information at all? Also, any beeps, or just silence?
 
No idea where you are quoting from, but someone must have missed all their grammar classes. :) It is painful to read.
Otherwise, if I understand it correctly, it should be OK with static objects in the blind spots, given the USS accuracy of >12" (30 cm).
On the other hand, I am very skeptical of the 0.45 milliradian per meter. That is a resolution of 0.45 millimeter at one meter distance. I'm not sure that is possible with the current cameras. If anyone has the specs, we can calculate their resolution.
Auto-transcript from YouTube. I'd recommend listening to verify; I was focused on the centimeter part. If milliradians per meter is the right unit, then the angular accuracy gets worse the further out something is: basically 0.45 mm × distance², giving 45 mm of error at 10 meters. That could be correct for long-baseline observations (a visual version of SAR), not a single image.
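The two readings of "0.45 milliradian per meter" can be written out numerically. The 0.45 figure is from the transcript; the growth models and unit bookkeeping are my interpretation:

```python
# Two interpretations of "0.45 mrad per meter" (mrad * m works out to mm).
def fixed_angle_err_mm(d_m: float, mrad: float = 0.45) -> float:
    # One-shot angular accuracy: lateral error grows linearly with distance.
    return mrad * d_m               # 0.45 mrad * d -> mm of lateral error

def drift_err_mm(d_m: float, mrad_per_m: float = 0.45) -> float:
    # Heading drift accumulating per meter traveled: error grows ~quadratically.
    return mrad_per_m * d_m * d_m   # 0.45 mrad/m * d^2 -> mm

print(fixed_angle_err_mm(10))       # linear reading at 10 m
print(drift_err_mm(10))             # per-meter-drift reading at 10 m
```

The linear reading gives 4.5 mm at 10 m; the per-meter-drift reading gives 45 mm at 10 m, matching the estimate above.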

USS can detect objects closer than 12 inches, but it doesn't give a range there. There is a minimum distance, though. Accuracy in the 12–36 inch range should be pretty good.
 
Hope someone with a new 3/Y posts here soon. I'd like to know what the display shows when parking. Does it show any warnings like "Proximity Detection not Available" or any distance information at all? Also, any beeps, or just silence?
I saw some comments on Reddit saying that it literally shows nothing: no sounds or anything.
 
It bears repeating, since people continually miss it: what the visualizations show now is irrelevant to what the occupancy network will show. The occupancy network is a new system that shows blocks of occupied space, instead of taking a fixed database of objects and assigning the closest possible match. It also has object permanence and handles objects straddling two camera views gracefully (something the previous system doesn't do).

Previous posts that covered this:
Tesla replacing ultrasonic sensors with Tesla Vision
Tesla replacing ultrasonic sensors with Tesla Vision
A Look at Tesla's Occupancy Networks
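The distinction being made above can be sketched in a few lines. Everything here (object names, scores, thresholds) is invented for illustration, not Tesla's actual pipeline:

```python
# Toy contrast between "nearest known object" rendering and occupancy-style
# rendering. All names and numbers below are made up.
def old_style(detection_scores: dict) -> str:
    # Old approach: force every detection into the closest database entry,
    # even when nothing fits well (a house can come back as "motorcyclist").
    return max(detection_scores, key=detection_scores.get)

def occupancy_style(voxel_occupied_prob: float, threshold: float = 0.5) -> str:
    # Occupancy approach: only decide whether the space is filled.
    return "occupied block" if voxel_occupied_prob > threshold else "free space"

# A house-shaped blob that matches nothing in the database well:
scores = {"car": 0.2, "truck": 0.25, "motorcyclist": 0.3, "pedestrian": 0.1}
print(old_style(scores))        # confidently wrong label
print(occupancy_style(0.95))    # no label needed, just "something is there"
```

The failure mode people report (a house rendered as a motorcyclist) is exactly the `old_style` behavior; the occupancy approach never has to pick a wrong label because it doesn't pick labels at all.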
I was in my driveway with the back of the car 3 feet from the house, and it was my first drive of the day. How does your comment explain a ghost motorcyclist behind me?
 
The best match the NN came up with for your house was a motorcyclist.

Now they use lumpy shapes to represent unknown things.
My HOUSE was replaced by a motorcycle? Now I know why the AI guy quit earlier this year. Elon has a Ukraine peace plan, he counseled Kanye West about insulting Jews, he spoke/did not speak with Putin, he's going broke with Starlink in Ukraine, he's gonna fix Twitter, which has basically become a sewer. He's 50, which means he was born in 1972. Just checked: Apartheid ended in 1994, so I think he learned a thing or two from massa whitey. btw I'm white
 
I was in my driveway with the back of the car 3 feet from the house, and it was my first drive of the day. How does your comment explain a ghost motorcyclist behind me?
As another poster mentioned, the current system just looks up the most similar object in its database (in this case a motorcyclist) and assigns that. So it's not unusual at all for it to falsely identify an area as a vehicle (for example, I have cabinets in front of my car when I pull in, and it shows a semi truck in front of me).

The occupancy network instead has only the task of determining whether a given area of space is occupied or not (it shows blocks). At most it'll label the blocks a different color, but it won't try to put a predetermined object model there.

Anyway, read the links I posted. The occupancy network is an entirely new system that does not work like the previous one, so what you observe in the old system is irrelevant to how the new system will work.
 