
Tesla replacing ultrasonic sensors with Tesla Vision

hahaha this guy has no idea what he's talking about. I turned it off when he claimed vision is "dramatically" superior to vision + radar in every way.
I held on for a little longer, to where he said "if people can do it with their eyes, the car will be able to do it". Well, I can't do it with my eyes, which is exactly the reason the sensors were added in the first place.

I know some people drove cars without sensors back in the day. I did. The main difference is that those cars had a very different hood design that made it much easier to judge the distance to a wall or object. The modern sloped hood design makes that much harder, so comparing with old cars that had no sensors doesn't fly.
 
Tesla only analyses what it sees at the present moment in time and forgets what it saw as soon as the object disappears from the camera's view
The Occupancy Network presented at AI Day specifically mentions using video context to predict things that are occluded. Here's a screenshot where there are two cones towards the right. One is visible from the front, but the closer one is probably in range of the ultrasonic sensors yet out of view of the fisheye and right pillar cameras (and the right repeater, as it's still ahead of the vehicle), and all these cones stay visualized without disappearing:

occupancy memory.jpg


(From top left to top right: left repeater, left pillar, fisheye, right pillar, right repeater [mostly cropped])

I suppose Tesla could potentially be making predictions based on the cast shadow visible from the right pillar camera, but most likely it's from memory. (Although making accurate predictions from shadows alone would be some pretty neat superhuman capability.)
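
To make the memory idea concrete, here's a toy sketch (purely illustrative, nothing to do with Tesla's actual occupancy network): grid cells keep a decaying occupancy confidence after an object falls out of every camera's view, instead of being dropped the moment it disappears.

```python
# Toy sketch of "occupancy with memory" (illustrative only, not Tesla's
# occupancy network): cells stay marked after the object leaves the view,
# with a confidence that decays over time instead of dropping to zero.

class OccupancyMemory:
    def __init__(self, decay=0.95):
        self.cells = {}      # (x, y) grid index -> occupancy confidence 0..1
        self.decay = decay   # per-frame decay while a cell is unobserved

    def update(self, observed):
        """observed: dict mapping currently visible cells to fresh confidences."""
        # Decay everything we are NOT currently seeing, rather than deleting it.
        for cell in list(self.cells):
            if cell not in observed:
                self.cells[cell] *= self.decay
        # Overwrite cells we can see with the fresh measurement.
        self.cells.update(observed)

    def occupied(self, cell, threshold=0.5):
        return self.cells.get(cell, 0.0) >= threshold

grid = OccupancyMemory()
grid.update({(5, 2): 0.9})          # cone visible off to the right
for _ in range(10):
    grid.update({})                 # cone slips out of every camera's view
print(grid.occupied((5, 2)))        # still True: 0.9 * 0.95**10 is about 0.54
```

The point is just that persistence is a modeling choice: nothing forces the system to forget an object the instant the cameras lose it.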
 
TV can see objects. TV can calculate the distance to objects. TV has limited visibility due to the location of the cameras. So if you are approaching a parking curb and TV can see the curb (it's several feet in front of you when you start parking), then when the curb gets closer to the car and the cameras can't pick it up anymore, the computer still knows exactly where it was when it last saw it, and knows exactly how the car is moving. It can therefore estimate where the curb will be as you roll towards it.
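
Roughly, the idea in code (a minimal sketch with made-up names and numbers, not Tesla's implementation): remember the curb's position when it was last seen, dead-reckon the car's pose from odometry, and keep estimating the curb's position relative to the car after the cameras lose it.

```python
import math

# Minimal dead-reckoning sketch (illustrative only): the curb's position is
# remembered in a fixed "world" frame when last seen, and the car's pose is
# updated from odometry, so the curb's distance can still be estimated after
# the cameras lose sight of it below the hood line.

class ParkingMemory:
    def __init__(self):
        self.x = 0.0        # car position in world frame (m)
        self.y = 0.0
        self.heading = 0.0  # radians
        self.curb = None    # last-seen curb position in world frame

    def remember_curb(self, forward_m, left_m):
        """Store the curb seen at (forward, left) meters relative to the car."""
        wx = self.x + forward_m * math.cos(self.heading) - left_m * math.sin(self.heading)
        wy = self.y + forward_m * math.sin(self.heading) + left_m * math.cos(self.heading)
        self.curb = (wx, wy)

    def move(self, distance_m, yaw_change_rad):
        """Dead-reckon the car's pose from odometry (distance rolled, heading change)."""
        self.heading += yaw_change_rad
        self.x += distance_m * math.cos(self.heading)
        self.y += distance_m * math.sin(self.heading)

    def curb_ahead_m(self):
        """Estimated distance from the car to the remembered curb, along the car's forward axis."""
        if self.curb is None:
            return None
        dx, dy = self.curb[0] - self.x, self.curb[1] - self.y
        return dx * math.cos(self.heading) + dy * math.sin(self.heading)

mem = ParkingMemory()
mem.remember_curb(forward_m=2.0, left_m=0.0)  # curb seen 2 m ahead
mem.move(distance_m=1.5, yaw_change_rad=0.0)  # roll forward 1.5 m; curb now hidden by the hood
print(f"Estimated curb distance: {mem.curb_ahead_m():.2f} m")  # ~0.50 m
```

The obvious caveat is that the estimate is only as good as the last camera measurement plus the odometry, which is exactly the "works until it doesn't" part.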


Which works great, all the way up until it doesn't.

1665169334749.png
 
Autofocus was mature before lidar was present. And autofocus works by phase detection or maximizing contrast, not by measuring distance. They can compute distance after the fact by knowing where the lens ended up and trigonometry, but to be clear that's deriving distance from focus, and not deriving focus from distance.
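
For what it's worth, the "distance after the fact" part is just the thin-lens relation: knowing the focal length and where the lens ended up (the image distance), you can solve for the subject distance. A quick sketch with made-up numbers:

```python
# Deriving subject distance from where the lens ended up (thin-lens relation),
# i.e. distance from focus rather than focus from distance. Numbers are made up.

def subject_distance(focal_length_mm, image_distance_mm):
    """Solve 1/f = 1/d_object + 1/d_image for d_object."""
    return (focal_length_mm * image_distance_mm) / (image_distance_mm - focal_length_mm)

# A 50 mm lens that ended up focused with the sensor 51 mm behind the lens:
print(subject_distance(50.0, 51.0) / 1000.0, "m")  # 2.55 m
```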
That was definitely true 5 years ago. There have been some changes since then that are really neat! I recommend looking into it.
 
Looks like somebody updated the sensor coverage hero image at the top of the Autopilot page:
2022 sensors.jpg


But… I don't think the image actually represents the camera coverage correctly. E.g., the repeater cameras' views should point only towards the back, while still covering about 60° towards the sides.

And for reference, here's how it looked in 2016:
2016 sensors.jpg
 
I think a lot of people discussing this whole topic are missing that Tesla is talking about tech they are rolling out in FSD Beta. I suggest reading about the occupancy networks mentioned in the announcement:

What you see generally right now with most Teslas without FSD Beta is not representative of what they plan to push out.
Have you actually driven in a Tesla with FSD? Mine has it. Here is an image of the FSD "bird's eye view" in my garage:
IMG_0214-s.jpg


According to the view, I can just drive straight out of my garage without hitting the car next to me.

Here is the reality:

IMG_0218-s.jpg


This is after years of FSD development. It can't even correctly identify stuff in plain view, let alone occluded objects.
 
which is so generic it can apply to nearly everything.
Allow me to rephrase:
Not everything that dips below the hood line is predictable.

The occupancy network is great in that it can estimate trajectories without needing to identify objects.
But in some cases you NEED to identify objects to understand what could happen when the object is out of sight.
If you try to estimate an occluded chicken's position with a Kalman filter, you're going to have a bad time.
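
To make the chicken example concrete, here's a toy sketch (illustrative only, not anyone's actual tracker): a constant-velocity Kalman-style predictor just extrapolates the last observed state while its covariance inflates every step, which works for a coasting shopping cart and tells you nothing useful about an animal that can reverse direction at any moment.

```python
import numpy as np

# Toy 1-D constant-velocity predictor (illustrative only): once the chicken is
# occluded, all we can do is extrapolate the last state and grow the uncertainty.

x = np.array([0.0, 0.3])            # state: [position m, velocity m/s] at last sighting
P = np.diag([0.05, 0.02])           # state covariance at last sighting
F = np.array([[1.0, 0.1],           # constant-velocity transition, dt = 0.1 s
              [0.0, 1.0]])
Q = np.diag([0.01, 0.05])           # process noise: how much the chicken can surprise us

for step in range(20):              # 2 seconds behind the hood line, no measurements
    x = F @ x                       # predict: assume it keeps walking in a straight line
    P = F @ P @ F.T + Q             # uncertainty grows every step

print(f"Predicted position: {x[0]:.2f} m, std dev: {np.sqrt(P[0, 0]):.2f} m")
```

The predicted position stays on a straight line while the uncertainty balloons, which is exactly the "bad time": the filter can only tell you the chicken could now be almost anywhere near the bumper.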
 
Actually, they are, at least on my 2016 Model S.

I frequently drive along Highway 80 coming out of Reno towards the Bay Area. A large part of this is two lanes in each direction with a concrete barrier separating the opposing lanes. The driver's side ultrasonic sensors clearly detect the barrier, even at 70+ mph speeds, as indicated by the yellow radiating lines on my center display. Dunno how accurate the distance data would be, though...

I don't see how Tesla Vision could accurately gauge the distance to a featureless garage wall, but maybe Elon does. Of course, my not-terribly-good auto wipers are still a "beta" feature after many years, so maybe Elon doesn't...
I agree they absolutely work on the highway with my 2015 Model S. When a large truck rides up next to me and drifts closer into my lane, the radar detects it and moves my car away from that truck.
 
Allow me to rephrase:
Not everything that dips below the hood line is predictable.

The occupancy network is great in that it can estimate trajectories without needing to identify objects.
But in some cases you NEED to identify objects to understand what could happen when the object is out of sight.
If you try to estimate an occluded chicken's position with a Kalman filter, you're going to have a bad time.
An excellent edge case. Let's work on getting the car to park in 98% of boring parking scenarios that everyone on the planet deals with every day. Then we'll work on not running over chickens.
 
“They could be damaged” - so could the camera
To be fair, the likelihood of needing service for ultrasonic sensors is probably much higher than cameras. Even a slight bump to the front or rear with no other visible cosmetic damage can cause any of the 12 ultrasonic sensors to pop off preventing some usage of Autopilot "Ultrasonic sensors not communicating" (or other issues from collecting dead bugs or even washing the car). Maybe Tesla has more insights into the long term reliability and maintenance of these sensors. I wonder if Elon Musk "working on service" noticed a surprising amount of backlog related to ultrasonic sensors and decided to "fix" the problem holistically given his insights into the progress of Occupancy network.

I was quite happy with the switch away from radar to Tesla Vision with FSD Beta, as in the wintertime even a little bit of snow covering the radar area could cause Autopilot to shut off dangerously without much notice.
 
Have you actually driven in a Tesla with FSD? Mine has it. Here is an image of the FSD "bird's eye view" in my garage:
View attachment 861202

According to the view, I can just drive straight out of my garage without hitting the car next to me.

Here is the reality:

View attachment 861203

This is after years of FSD development. It can't even correctly identify stuff in plain view, let alone occluded objects.
So let's drill down further. Are you suggesting that you'd not accept any solution that doesn't recognize and display every possible item that people have in their garage? Would you be more comfortable if, instead of a car showing up, it just showed a grey "blob" that it knew not to collide with?

And according to the view in your car, you can indeed drive straight out of your garage without hitting the object to your left. What's more important? That it displays the object as exactly what it is, or that it doesn't hit it when it pulls out?
 
Have you actually driven in a Tesla with FSD? Mine has it. Here is an image of the FSD "bird's eye view" in my garage:
View attachment 861202

According to the view, I can just drive straight out of my garage without hitting the car next to me.

Here is the reality:

View attachment 861203

This is after years of FSD development. It can't even correctly identify stuff in plain view, let alone occluded objects.
You just completely ignored what I wrote. What you see currently in Tesla's general visualizations (including FSD without the latest Beta updates) is not representative of what Tesla is talking about (FYI, even those visualizations don't show everything the system detects in the first place).

Again, read the article and then come back to comment. I encourage everyone to do so. If that's too much, you can look at the screenshots people have posted of how the occupancy network visualizations look.
 
So let's drill down further. Are you suggesting that you'd not accept any solution that doesn't recognize and display every possible item that people have in their garage? Would you be more comfortable if, instead of a car showing up, it just showed a grey "blob" that it knew not to collide with?
I would like a regular 360° view like many other manufacturers already have, or, failing that, the red lines the sensors currently display. They work well enough for me.

Maybe someday in the distant future Tesla will have a view that correctly identifies everything, but that is likely years away, given Tesla's timelines.
And according to the view in your car, you can indeed drive straight out of your garage without hitting the object to your left. What's more important? That it displays the object as exactly what it is, or that it doesn't hit it when it pulls out?
Not really; if I drive straight, the car will collide with the side of the garage door. Perhaps I did not get the camera angle quite right. At best I might have an inch of clearance, which is way less than the approximate foot shown in the view.

However, you seem to be missing the point. It is unable to display large walls, bicycles, etc. What hope is there that it will display a tricycle left by a kid in front of the garage door? Ask any parent; this happens all the time. The sensors are able to pick it up. I know, because it has happened.
 
Not really; if I drive straight, the car will collide with the side of the garage door. Perhaps I did not get the camera angle quite right. At best I might have an inch of clearance, which is way less than the approximate foot shown in the view.
But the view shows that you would hit something if you went straight forward, doesn't it?

1665175925264.png
 
You just completely ignored what I wrote. What you see currently in Tesla's general visualizations (including FSD without the latest Beta updates) is not representative of what Tesla is talking about (FYI, even those visualizations don't show everything the system detects in the first place).

Again, read the article and then come back to comment. I encourage everyone to do so. If that's too much, you can look at the screenshots people have posted of how the occupancy network visualizations look.
I think you are completely ignoring the point many are making in this thread, which is that Tesla should first fix the software, then remove the sensors. In addition, they need to add another camera in the front, since the occupancy network is not going to detect something left in front of the car that was not there the day before, e.g. a tricycle, if the camera can't see it.
 
But the view shows that you would hit something if you went straight forward:
Ha ha, that is actually the "road", i.e. my driveway. So no.

Also, the wall is about 17" from the car, where the red line is. In the display, those points appear much farther away.

I decided to drive forward a little to see how it dealt with the wall. Now I can really floor it! No obstacles!
IMG_0219.jpg
 
Ha ha, that is actually the "road", i.e. my driveway. So no.

Also, the wall is about 17" from the car, where the red line is. In the display, those points appear much farther away.

I decided to drive forward a little to see how it dealt with the wall. Now I can really floor it! No obstacles!
View attachment 861235
Oh, I get it now. It took me a sec to understand what you were describing. Now that the ultrasonics have been removed and not replaced with any new software as of this morning, your scenario makes more sense.
 