Tesla replacing ultrasonic sensors with Tesla Vision

It might be worth mentioning that Vision systems are already what power Autopark functions in other vehicles, most of which are probably running Mobileye.


Ultrasonics are probably there just as another failsafe layer for uncommon situations etc. Cutting parts and taking on the risk definitely feels like it fits into the Tesla ethos.

And clearly these are continued efforts to build back what was lost since Tesla and Mobileye parted ways.
 
[Image: ultrasonic sensor coverage diagram from the owner's manual]

You can see the sensor layout in the manual. There is a valley in the coverage right in the middle, and that is where the car may become "blind" to a thin object.

That's why in the limitations it says:
"The parking sensors may not function correctly in these situations:
...
  • The object is thin (such as a sign post)."
Model 3 Owner's Manual | Tesla
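
To make the thin-object limitation concrete, here is a toy geometry check. The sensor spacing and beam angle below are made-up illustrative numbers, not Tesla specs; the point is just that two adjacent cones leave a gap near the bumper where a thin post midway between them can go undetected:

```python
# Toy geometry check for the "thin object" blind spot: assumes two adjacent
# bumper sensors SENSOR_SPACING_M apart, each with a cone of half-angle
# BEAM_HALF_ANGLE_DEG. Both numbers are illustrative assumptions, not Tesla specs.
import math

SENSOR_SPACING_M = 0.35      # assumed spacing between adjacent bumper sensors
BEAM_HALF_ANGLE_DEG = 30.0   # assumed half-angle of each sensor's beam cone

def midpoint_covered(distance_m: float) -> bool:
    """True if, at this distance from the bumper, each cone is wide enough to
    reach the midpoint between the two sensors (i.e. a thin post there would
    be insonified by at least one sensor)."""
    half_width = distance_m * math.tan(math.radians(BEAM_HALF_ANGLE_DEG))
    return half_width >= SENSOR_SPACING_M / 2

for d in (0.1, 0.2, 0.3, 0.5):
    status = "covered" if midpoint_covered(d) else "possible blind spot"
    print(f"{d:.1f} m ahead: {status}")
```

With these assumed numbers the midpoint only gets covered beyond roughly 0.3 m, which lines up with the manual's warning that thin objects close to the car may be missed (thin objects also return weak echoes, which compounds the problem).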
Don't think so. It was like one of these signs.
[Image: example street sign]
 
You're telling me a multi-billion dollar company hasn't thought about this?

If anything this may confirm the rumors of the new cameras in the fog-light housings. We may also see a 360 camera system with the removal of the USS.
This is Tesla we are talking about here. No, they either have not thought about it, hand-waved it away, or simply figured it would all magically be solved in 2017, along with coast to coast, by the magic of "FSD".

Or maybe I'm wrong. Perhaps, if you're a proper musk rider with sufficient Twatter and SusanTube influence, you too may have the opportunity to beta* test Obsoletus Grime (starting at *20k*, pending regulatory approval).

At the sight of your garage or bike lane highway divider parking space, your faithful companion bot will stumble out of his/her/their reserved passenger seat, hopefully breaking his/her/their fall on one of the neighborhood crotch goblins, and finally, with all the grace of an arthritic 90-year-old, wave your recently upgraded Vision-only, stock-pumping wonder of an electron machine straight into oncoming traffic. The future is here**!

How wrong was I to pass judgement so prematurely and in such haste! This is a multi-billion dollar company after all. To top that off, there's a tech bro genius CEO at the helm delivering timeless classic hits such as:

- autonomous driving is basically a solved problem (2016)

- Sorry pedo guy, you really did ask for it (2018)

- 1 million robo taxis in 2020 (2019)



There is no stopping this company. The future is now!


*alpha
**2 weeks
 
Does it mean this system only works when auto* is used? If yes, then how is it going to help drivers who park manually without USS? The note suggests there is no plan to disable/remove USS from the existing fleet, but there is nothing to prevent Tesla from doing so in the future. I am not sure I like that ... time to think about jumping ship.
 
I assume it will have some short-term memory of the occupancy network output and it gets translated around as you pull into the spot, otherwise it would be pretty useless.
If it can simulate the current frontal wall/car detection, that would be the primary use I have for the sensors currently. I'm skeptical, however, that it'll be as accurate as the sensors in terms of detecting distances down to the inch. Given the post says they are removing the sensors before the software is even figured out, it suggests it's not ready yet.
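
A minimal sketch of the kind of short-term occupancy memory described above, assuming a small ego-centred grid that gets scrolled by odometry each frame so that obstacles which drop out of camera view (e.g. a curb under the nose) stay remembered while you pull in. Grid size, cell size, and the sources of odometry and detections are all assumptions, and rotation is ignored for brevity:

```python
# Sketch of a short-term occupancy memory: a small grid around the car that is
# scrolled by the ego motion each frame and updated with new detections.
import numpy as np

CELL_M = 0.05                      # 5 cm cells (assumed)
GRID = 200                         # 10 m x 10 m window centred on the car
occupancy = np.zeros((GRID, GRID), dtype=np.float32)

def shift_grid(grid: np.ndarray, dx_m: float, dy_m: float) -> np.ndarray:
    """Scroll the grid so previously seen obstacles stay in the right cell
    after the car moves by (dx_m, dy_m); cells scrolling in are unknown (0)."""
    dx, dy = int(round(dx_m / CELL_M)), int(round(dy_m / CELL_M))
    rows, cols = grid.shape
    out = np.zeros_like(grid)
    r0, r1 = max(dy, 0), rows + min(dy, 0)   # source region that stays in view
    c0, c1 = max(dx, 0), cols + min(dx, 0)
    out[r0 - dy:r1 - dy, c0 - dx:c1 - dx] = grid[r0:r1, c0:c1]
    return out

def update(grid, ego_dx_m, ego_dy_m, detections):
    """detections: iterable of (x_m, y_m) obstacle points in the car frame."""
    grid = shift_grid(grid, ego_dx_m, ego_dy_m)
    for x, y in detections:
        i = int(round(y / CELL_M)) + GRID // 2
        j = int(round(x / CELL_M)) + GRID // 2
        if 0 <= i < GRID and 0 <= j < GRID:
            grid[i, j] = 1.0               # mark cell occupied
    return grid
```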

I'm not concerned about the issue of an object getting under the car while it is moving that other people pointed out; the ultrasonic sensors were not really designed for that application anyway (the limitations in the manual point out they have difficulty with low and small objects).
 
...Ultrasonics are probably there just as another failsafe layer for uncommon situations etc...

No. Ultrasonics are used by actual people to park.
 
Most Teslas park backwards (based on all the footage they get from Superchargers, obviously), so this removal allows for cost savings as the rear cam view is sufficient 🤷‍♂️
Cost savings for Tesla, yes, but I don’t think consumers will see any of that, and it doesn’t make a lot of sense with Tesla batting in the premium/luxury price range, where you’re paying a lot more for that last bit of polish. I’m sure Vision-only will be good enough; maybe it will miss a bit of the nuance but it’ll be fine. But again, when you’re at $50-60k+ USD, I think people tend to expect all the nuance.

Now if this were in a budget-conscious car and it was able to do 95% of the job for significantly lower price, that would be a good value proposition. At this price point, I would expect to have all the sensors even if they only add 5% by complementing the other sensors.
 
Cost savings for Tesla, yes, but I don’t think consumers will see any of that... At this price point, I would expect to have all the sensors even if they only add 5% by complementing the other sensors.
Yeah, I think it’s a bit silly to remove right now. Maybe if HW4 includes cameras with smaller blind spots *and* the software is ready, then I’d understand, but right now I am not convinced it will be a good experience if they’re not changing the rest of the hardware.
 
No. Ultrasonics are used by actual people to park.
Yeah I imagine Tesla figures the cameras will do the job well enough, not as good but well enough.

I’m definitely not excusing this move, just trying to dispel this idea, pushed by Tesla, that Vision systems are something unique when they’re already powering this stuff (Autopark and other ADAS features) across the industry. Cameras in other vehicles can also surely do the job well enough; the ultrasonics have a specific role and complement the cameras and whatever other sensors are there.

The only benefit I see here for consumers is not having the ultrasonic sensor dots messing with aesthetics. Other than that, I think you’ll get most but not all of the performance and Tesla will pocket those cost savings.
 
I think it's possible for Tesla Vision to achieve close to the same level of accuracy as the USSs at some point.

I also don't think it'll actually happen. They must have just made the decision to ditch USSs due to parts shortages or pricing, and they're going to do another radar stunt where it takes a year to get close to the current functionality.

There's no way this is related to HW4. If it were, they would have mentioned that first, and the USS removal would have been a footnote. Tesla tends to over promise and under deliver. Granted, their "under deliver" is pretty spectacular IMO. But, it means everything will be slightly worse than whatever it is they promise, and they're not promising much here.

Perhaps they shouldn't remove USS before implementing the features "with parity" using TeslaVision? I mean, what if TeslaVision doesn't work as well? What are they going to do then? Retrofit thousands of cars? Ha! All seems a bit cart before the horse.
They'll retrofit the thousands of cars with USSs right after they retrofit the millions of cars with a radar and windshield wiper sensor lol
 
Like others have said, this is a much bigger deal when manually parking, especially into a tight space.

My garage is tiny, so I have to go a few inches past when it says STOP for my garage door to close. I don’t ever expect Tesla Vision to help me do that.

I mean… *gestures to attached photo*
 

[Attached photo]
My theory is that it’s more of a parts/supply or cost issue and Musk was like we can’t get enough sensors/modules or we need to cut costs so screw the sensors. Y’all need to figure out how to do it with cameras only now.

If it was a long time in the making then they would have had the software running in the background on current cars to compare with USS feedback and fine tune the operation to reach parity or superiority vs USS before launching and nixing the sensors.
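
The kind of background comparison described above could be as simple as logging the vision estimate next to the ultrasonic reading for the same zone and looking at the error distribution before declaring parity. A hypothetical sketch; the pairing, tolerance, and report fields are all made up for illustration:

```python
# Hypothetical shadow-mode check: for one parking zone, compare logged
# (ultrasonic, vision) distance pairs and summarise the disagreement.
import statistics

def parity_report(samples, tolerance_m=0.10):
    """samples: list of (uss_distance_m, vision_distance_m) pairs for one zone."""
    errors = [vision - uss for uss, vision in samples]
    abs_sorted = sorted(abs(e) for e in errors)
    within = sum(e <= tolerance_m for e in abs_sorted)
    return {
        "n": len(samples),
        "mean_error_m": round(statistics.mean(errors), 3),
        "p95_abs_error_m": abs_sorted[min(len(abs_sorted) - 1,
                                          int(0.95 * len(abs_sorted)))],
        "fraction_within_tolerance": within / len(samples),
    }

# Example with three made-up logged pairs from a front-centre zone:
print(parity_report([(0.42, 0.45), (0.30, 0.41), (1.10, 1.05)]))
```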

But they’re just abruptly cutting out the sensors when the software isn’t even ready yet so it makes it seem like a hasty decision and they *hope* they can replicate the function with cameras and software eventually. Just like removing radar and yet Tesla Vision is still subpar in many functions compared to radar 1.5 years later.
 
...much bigger deal when manually parking, especially into a tight space...

As @AndrewZ pointed out, Mobileye claims it can do a very good job with pure vision. Notice that, out of its 11 cameras, the 4 cameras on the 4 sides of the car are dedicated to parking (labeled "Park Cam" below):

[Image: Mobileye SuperVision camera layout]



Tesla's current sonars are problematically missing at the side doors, which can lead to collisions with the garage door frame.

The solution is adding sonars at the side doors or a better camera system that can do a 360-degree bird's eye view.

Tesla cameras' locations are not designed for a live 360-degree bird's eye view.
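
For reference, a live 360-degree bird's eye view is typically built by warping each of four wide-angle, downward-looking cameras onto the ground plane with a homography (inverse perspective mapping) and compositing the results around a top-down car graphic. A rough OpenCV-style sketch; the calibration points are placeholders, not any real vehicle's values:

```python
# Rough sketch of assembling a surround ("bird's eye") view from four cameras
# via inverse perspective mapping. Calibration points are placeholders only.
import cv2
import numpy as np

BEV_SIZE = (600, 600)  # output canvas (width, height) in pixels, car at centre

def ground_homography(image_pts, bev_pts):
    """Homography mapping 4 ground-plane points seen in a camera image to
    their positions on the bird's-eye canvas."""
    return cv2.getPerspectiveTransform(np.float32(image_pts), np.float32(bev_pts))

def warp_to_bev(frame, H):
    return cv2.warpPerspective(frame, H, BEV_SIZE)

def compose_bev(warped_views):
    """Naive composite: brightest contribution per pixel wins. Real systems
    mask each camera's valid region and blend the seams."""
    canvas = np.zeros((BEV_SIZE[1], BEV_SIZE[0], 3), dtype=np.uint8)
    for view in warped_views:
        canvas = np.maximum(canvas, view)
    return canvas

# Usage (placeholder calibration: four ground markings seen by a front camera
# and where they should land on the canvas):
# H_front = ground_homography([(420, 700), (860, 700), (1100, 950), (180, 950)],
#                             [(200, 100), (400, 100), (400, 250), (200, 250)])
# bev = compose_bev([warp_to_bev(front_frame, H_front), ...])
```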

Others, like the Hyundai Ioniq 5, can give you a 360-degree bird's eye view. The top and bottom blue tapes on the garage door are within the vertical range of view. When getting closer, a short obstacle like Dan O'Dowd's child mannequin, which the height of a Tesla's hood would hide from the cameras, is not a problem in other brands.


[Images: Hyundai Ioniq 5 360-degree bird's eye view approaching a garage door]



Even when Tesla has had all the correct sensors, including camera locations, Tesla's software has been very weak in collision-avoidance competency from the days of AP1 in 2014 until now with AP3 in 2022.
 
The problem is that engineering design is so often by fiat at Tesla. "No more ultrasonics, they just make everything more complicated." Okay, I actually agree with that. I'm just hoping that the decree doesn't include "And no, you cannot add more cameras."
Simplicity makes a ton of sense when it’s combined with price/value or a result of doing things insanely well under the hood, like with Apple.

But I don’t know why people are paying premium/luxury prices for simplicity in the $50k+ range; it’s really supposed to be the opposite. Heck, thinking of high-end watches, different functions are literally called complications and the price goes up with difficulty of execution. You’re not paying a premium for doing the easy thing, you’re paying a premium to get the expertise and skill required to do difficult things and execute them at the highest level.

Using more sensors is hard, costs more, requires fusing or whatever, but are those reasons for Tesla to not do them? Does anyone actually believe Vision-only will be superior to other systems that also heavily use Vision but are complemented by other sensors, or will it just get most of the way with some sacrifices here and there?
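
For what "fusing" amounts to in the simplest case, here is a toy inverse-variance weighting of a vision estimate and an ultrasonic reading. The noise figures are assumptions, not measured values for either sensor; the point is that the more precise close-range ultrasonic naturally dominates while the vision estimate still contributes:

```python
# Toy illustration of complementary sensor fusion: combine a vision distance
# estimate and an ultrasonic reading by inverse-variance weighting.
# The sigma values are illustrative assumptions, not real sensor specs.
def fuse(vision_m, vision_sigma_m, uss_m, uss_sigma_m):
    w_v = 1.0 / vision_sigma_m ** 2
    w_u = 1.0 / uss_sigma_m ** 2
    fused = (w_v * vision_m + w_u * uss_m) / (w_v + w_u)
    sigma = (w_v + w_u) ** -0.5      # fused estimate is tighter than either input
    return fused, sigma

# e.g. vision says 0.50 m +/- 0.15 m, ultrasonic says 0.38 m +/- 0.03 m
print(fuse(0.50, 0.15, 0.38, 0.03))   # fused distance is dominated by the USS
```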