Welcome to Tesla Motors Club

Tesla replacing ultrasonic sensors with Tesla Vision

Yes, this is totally what people think of when they think of birds-eye view; hard to imagine how they thought otherwise. After all, Elon is technically correct.




It’s awesome that we have Elon to cut through the semantically incorrect naming of car features and bring clarity to the car market.

In some 2016+ models, BMW did not have front facing cameras, so their birds eye view camera was more like a 270 degree view. I bet Tesla could stitch together something like that if they wanted. But clearly none of the driver assistance features are a priority other than FSD Beta.
 
Tesla clearly waiting for FSD Beta to enable the 360 view (which of course is possible with current camera positioning, in most situations).

Tesla doesn't have a clear view of the area in front of the vehicle either, which is why I suggested a 270-degree stitched video. Or are you talking about the vector-based 360 view referred to in a tweet above? I'm pretty sure the birds-eye view we get in FSD Beta is all he meant. Maybe it'll get more detailed when we get the parking lot stack.
 
Tesla doesn't have a clear view of the area in front of the vehicle either,
Correct.
Or are you talking about the vector based 360 view referred to in a tweet above
No.

I'm talking about just constructing it with the "parking lot stack" (which obviously can't be limited to parking lots) or whatever memory-based construction of the world, which may or may not exist, now or in the future.
 
Tesla doesn't have a clear view of the area in front of the vehicle either, which is why I suggested a 270-degree stitched video. Or are you talking about the vector-based 360 view referred to in a tweet above? I'm pretty sure the birds-eye view we get in FSD Beta is all he meant. Maybe it'll get more detailed when we get the parking lot stack.
They could stitch together the birds-eye view you're talking about using "historical data" from the drive in to the parking spot. Then overlay an "uncertainty" to the areas that have no live data in front of the car.

This would be technically impressive, and give an appearance of being a real view, but also be unsafe and rely on the driver to ensure there is nothing in the blind spots.

Or they could simply not release the stitched view, and no one would be any the wiser about the blind spots.

The vector birds-eye is another technical out, since it's still not made clear to the average driver that the blind spots exist.
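The blend described above, which is live camera coverage where it exists and dimmed historical imagery in the blind spots, can be sketched in a few lines. This is purely a hypothetical illustration (the function name, array shapes, and `decay` dimming factor are my own assumptions, not anything Tesla has published), assuming NumPy:

```python
import numpy as np

def composite_birds_eye(live_view, cached_view, live_mask, decay=0.5):
    """Blend a live top-down view with cached (historical) imagery.

    live_view, cached_view: HxWx3 float arrays in [0, 1].
    live_mask: HxW bool array, True where live cameras have coverage.
    Regions with no live coverage fall back to the cached frame,
    dimmed by `decay` to signal uncertainty (stale data, blind spot).
    """
    out = np.where(live_mask[..., None], live_view, cached_view * decay)
    return out.clip(0.0, 1.0)

# Toy example: a 4x4 view where only the rear half has live coverage.
live = np.ones((4, 4, 3))         # live frame (all white)
cache = np.full((4, 4, 3), 0.8)   # cached frame (light grey)
mask = np.zeros((4, 4), dtype=bool)
mask[2:, :] = True                # rear half covered by live cameras

view = composite_birds_eye(live, cache, mask)
```

In a real system the cached frame would also be warped by the car's odometry before blending; here it is assumed to be already aligned for simplicity.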
 
Yes, this is totally what people think of when they think of birds-eye view; hard to imagine how they could think otherwise. After all, Elon is technically correct.




(These are just the first few random hits obviously.)

It’s awesome that we have Elon to cut through the semantically incorrect naming of car features and bring clarity to the car market.
I don't know what's funny about my comment; I'm just stating the facts. Elon didn't say other naming is "wrong", nor was that what I was implying. But that doesn't change the fact that he was referring to something else, using the same terminology his engineers were using (as were engineers at other companies; see the article below, from 2020).


Here's Andrej talking about it in depth in a February 2020 ScaledML presentation:

It's others that made various incorrect assumptions and speculations about what he meant and the internet ran with it (as it does in many cases, it's like the telephone game).

Then people get indignant about not getting a feature that was never promised in the first place.
 
It's others that made various incorrect assumptions and speculations about what he meant and the internet ran with it (as it does in many cases, it's like the telephone game).
I guess that is my point - no one made any incorrect assumptions. They just “assumed” that he meant what he said. It’s not ok to use standard terminology and not mean what it normally means, without explicitly saying so (which Elon did not do).

Birds Eye view means one thing, so we can say and be correct that Elon (and Andrej?) meant something else, but that does not change the fact that Elon used the wrong term, even if he meant to use the term.

It’s very absolute with no latitude. That’s how language works. If you change the meaning of words, those changes have to be explained and accepted before they are meaningful.

No one made the “assumption” that he meant the normal use of the word; doing so is not an assumption! Assuming would mean taking an ambiguous use of a term and deciding it meant one thing or the other without checking. But there was no ambiguity here.

Then people get indignant about not getting a feature that was never promised in the first place.
Seems perfectly reasonable for people to be upset that they didn’t get Birds Eye view since that was the feature they were told they would be getting (if they were latching on to Elon’s Tweets). I would have advised them not to count on it, but that is neither here nor there…

Note Andrej’s presentation referred multiple times to “top-down view” and the video illustrations also showed a top-down view (I stopped watching after two minutes or so). This confirms and supports the traditional use of the term. This presentation really is very theoretical though and isn’t talking about the practical use of birds-eye view imaging. He’s talking about predicting things further away from the car than people would typically care about.
 
I guess that is my point - no one made any incorrect assumptions. They just “assumed” that he meant what he said. It’s not ok to use standard terminology and not mean what it normally means, without explicitly saying so (which Elon did not do).

Birds Eye view means one thing, so we can say and be correct that Elon (and Andrej?) meant something else, but that does not change the fact that Elon used the wrong term, even if he meant to use the term.

It’s very absolute with no latitude. That’s how language works. If you change the meaning of words, those changes have to be explained and accepted before they are meaningful.

No one made the “assumption” that he meant the normal use of the word; doing so is not an assumption! Assuming would mean taking an ambiguous use of a term and deciding it meant one thing or the other without checking. But there was no ambiguity here.
Assumptions are things like that article showing a Tesla with a stitched parking camera view. Elon never promised that.

Birds Eye View as a phrase simply means a view from the top like a bird sees which may or may not be angled (which Tesla did deliver in their visualizations). It does not mean stitched camera view! There is nothing in that phrase that defines that (even if most manufacturers implemented it that way)! This part is indisputable.

And vector space already implies it would be a generated view (based on drawn lines, AKA VECTORS) not something directly stitching video footage.

Here's an example of vector-based graphics in Google Maps (it allows free tilting of the 3D view, which is exactly what Tesla was working on):
[Image: vector graphics and 3D mode in Google Maps 5.0 for Android]

https://news.softpedia.com/news/Vec...e-in-Google-Maps-5-0-for-Android-173531.shtml
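To make the contrast with stitched video concrete, here is a minimal, hypothetical sketch of vector-space rendering: detected objects are stored as geometry (positions and sizes), and the top-down image is rasterised from that geometry rather than from camera pixels. All the names and the metres-to-pixels scale are my own assumptions for illustration:

```python
import numpy as np

def render_vector_view(objects, size=64, scale=4.0):
    """Rasterise a vector-space scene into a top-down grid.

    objects: list of (x, y, w, h) in metres, ego vehicle at the
    grid centre, y increasing forward. Each object is drawn as a
    filled rectangle; no camera pixels are stitched, only detected
    geometry, so the scene could be re-rendered from any viewpoint.
    """
    grid = np.zeros((size, size), dtype=np.uint8)
    cx = cy = size // 2
    for x, y, w, h in objects:
        r0 = int(cy - (y + h / 2) * scale)   # far edge (top of grid)
        r1 = int(cy - (y - h / 2) * scale)   # near edge
        c0 = int(cx + (x - w / 2) * scale)   # left edge
        c1 = int(cx + (x + w / 2) * scale)   # right edge
        grid[max(r0, 0):min(r1, size), max(c0, 0):min(c1, size)] = 1
    return grid

# A 2 m x 4 m car detected 4 m directly ahead of the ego vehicle.
scene = render_vector_view([(0.0, 4.0, 2.0, 4.0)])
```

Because the scene is geometry rather than pixels, it could just as easily be re-rendered from a tilted viewpoint, which is what makes the vector approach flexible.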

And I already showed that in the engineering world the term is used exactly the way Tesla's engineers and Elon were using it.

Seems perfectly reasonable for people to be upset that they didn’t get Birds Eye view since that was the feature they were told they would be getting (if they were latching on to Elon’s Tweets).

Note Andrej’s presentation referred multiple times to “top-down view” and the video illustrations also showed a top-down view (I stopped watching after two minutes or so).
Seven seconds into his presentation the slide shows "BEV net" and reads "Bird's Eye View predictions", something he said out loud too. It couldn't be more clear.
 
Birds Eye View as a phrase simply means a view from the top like a bird sees which may or may not be angled (which Tesla did deliver in their visualizations). It does not mean stitched camera view! There is nothing in that phrase that defines that (even if most manufacturers implemented it that way)! This part is indisputable.
I guess we’ll have to disagree.

Terms mean things, just the way it works.
7 seconds in his presentation shows BEV net and says Bird's Eye View predictions, something he said out loud too. It couldn't be more clear.
Right, he says top-down view. Multiple times. Then shows a top-down view with no angle (otherwise it wouldn’t be a top-down view, since that too has a very specific meaning).

Technically every car has a birds eye view.
When in doubt, ask the local AI. Very clarifying. It just knows. (A photo of vector-space bird’s-eye view for a car)

 
I have an M3 RWD that I will be collecting on the 6th of December.

Just wanted to know: when parking, if I don't have USS, will the car still tell me how far away objects are from the front and rear of the vehicle?
According to YouTube videos, it seems to detect some objects and show them in the driving visualization. It seems to have better recognition of what each object is, and unknown objects are shown as traffic cones.

However, it lacks the audio notice and distance that you have with USS. Also, it misses some things that USS detects.

Bottom line, it looks like a system in development; it is definitely different and not clear if they will be able to achieve parity.
 
According to YouTube videos, it seems to detect some objects and show them in the driving visualization. It seems to have better recognition of what each object is, and unknown objects are shown as traffic cones.

However, it lacks the audio notice and distance that you have with USS. Also, it misses some things that USS detects.

Bottom line, it looks like a system in development; it is definitely different and not clear if they will be able to achieve parity.
That's my complaint. Tesla takes the approach of removing the hardware first and then figuring out how to make the functionality work with what's left, rather than the other way around. There's no reason to do that instead of leaving the hardware in place until full functionality is achieved.
 
Ugh - we're back here again.

Let me try one more time:

The global supply problem (and chip shortage), which is affecting the entire auto industry, hit ultrasonic sensors recently. The leading companies that produce the hardware couldn't build/deliver enough to satisfy the demands of the various auto manufacturers. Many car companies removed ultrasonics from their new vehicles and told new owners their parking features were not available until sometime in the future.

Tesla had a choice. 1) Remove ultrasonics from their new cars, just like many others did, and tell customers that parking features would not be available until sometime in the future. 2) Delay production of new cars until they can get supplies of ultrasonics.

Since wait times for new cars are already several months, with some builds being 6+ months, delaying new car production would hit the bottom line significantly - who the heck wants to wait 12 months for a new car? The obvious choice is the first one, which Tesla did, just like most other companies.

Now, here's the difference between Tesla and the other car companies: Tesla is not going to put in ultrasonics later, when the global supply problem is resolved. Many other car companies are telling their customers they will retroactively install ultrasonics when they're available. Tesla is going to leverage their vision-only system to take over the functionality the ultrasonics were fulfilling.

So, NEW Teslas without ultrasonics don't have parking functions - just like the other car companies. OLD Teslas with ultrasonics still have their parking functions. Tesla is going to bring parking functions to NEW cars via Vision Only when it's ready (probably tested by the FSD Beta fleet, just like when radar was removed). Once it's tested and ready, it will roll out to the entire fleet. Until that time, OLD Teslas will still have ultrasonics and parking features. And we'll see how the timing compares to the other car companies. Can Tesla get parking functions restored to NEW owners before we see the other car companies getting inventory to install ultrasonics in their cars?
 
According to YT videos, it seems to detect some objects and shows them in the car driving window. It seems to have better recognition of what the object is and unknown things showing as traffic cones.

However, it lacks the audio notice and distance that you have with USS. Also, it misses some things that USS detects.

Bottom line, it looks like a system in development; it is definitely different and not clear if they will be able to achieve parity.
No worries, thanks for your help. Not a major issue for me personally; I'm easy either way, just curious more than anything.

Does anybody know if UK M3 RWDs come with or without USS? I'm due to collect on the 6th of December. Or how can I check from my VIN whether I have a 2023 build?