Tesla replacing ultrasonic sensors with Tesla Vision

I think it's safe to assume that the vision replacement for USS will be at least somewhat worse than using the actual sensors for a long time to come, if not forever.
Mostly depends on Tesla's priorities, and whether things line up to need more accurate proximity detection. For example, maybe Smart Park needs to pull into Supercharger spots with distance requirements for the charging cable to reach, or Optimus needs to interact with things with inches of tolerance.

Maybe Tesla will even update the park assist visualization, since the Occupancy Network can provide a lot more detail than the existing ultrasonic sensors.
 
A lot of what you’re referring to applies to still photography, not video. Dynamic exposure sensitivity introduces noise. Contrast does not make up for dynamic range. I saw a video just yesterday of AP almost slamming into a barrier because the high contrast video made it look like something it wasn’t. The enhanced night vision only works on still photography. I think it’s certainly possible for the front cameras to provide stereoscopic vision, but that doesn’t apply to the other cameras.
Note this is incorrect, even for the cameras Tesla is using. The AR0132 sensors Tesla uses have hardware HDR functionality built in.

It is not limited to still photography, because the hardware has multi-frame HDR support built in: it can take different exposures back to back continuously, combine them into a 120 dB image (roughly a 1,000,000:1 brightness range) entirely in hardware, and output a video stream.

This has in fact been the case for a lot of image sensors, given there are plenty of applications that call for HDR video (surveillance, for example). There are other techniques, like DOL-HDR and Quad Bayer HDR, that other sensors use to accomplish the same thing.

To be fair, though, Tesla may not necessarily be using this mode, but it is there if they want it.
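
For anyone curious what multi-frame HDR fusion looks like in software, here is a minimal sketch using OpenCV's Mertens exposure fusion. It is only an illustration of the general technique: the frame names and the `bracketed_frames` helper are hypothetical, and the AR0132 does the equivalent merge per frame in silicon, so none of this is known to be Tesla's actual pipeline.

```python
# Minimal sketch of multi-frame exposure fusion -- a software analogue of
# what an HDR-capable sensor like the AR0132 does per frame in hardware.
# Assumes OpenCV is installed and that `short`, `medium`, and `long` are
# three 8-bit frames of the same scene captured back to back at
# increasing exposure times.
import cv2
import numpy as np

def fuse_exposures(short, medium, long):
    """Blend three differently exposed frames into one well-exposed frame.

    Mertens exposure fusion weights each pixel by contrast, saturation,
    and well-exposedness, so highlights come from the short exposure and
    shadows from the long one, with no camera-response calibration needed.
    """
    merge = cv2.createMergeMertens()
    fused = merge.process([short, medium, long])  # float32, roughly in [0, 1]
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

# For video, you would run this over every bracketed triple of frames:
# for short, medium, long in bracketed_frames(stream):   # hypothetical helper
#     hdr_frame = fuse_exposures(short, medium, long)
```

Mertens fusion is convenient for a sketch like this because it needs neither the exposure times nor a calibrated camera response curve.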
 
"vision only" pretends that there is a giant semi-truck *right next to my car* every time I pull into the garage.
"vision only" also slams on the brakes when i drive on a curvy road and traffic comes from the opposite direction
.... but yeah... "vision only" will *definitely* be able to tell me +/- a few inches how far away I am from a wall :rolleyes:

This is cost cutting / making manufacturing easier by leaving out parts that are standard on cars costing less than half as much... meanwhile, Teslas don't even have rear cross-traffic alerts, something that is standard on a $30k Mazda.
 
I really hope they stick to not removing USS functionality from existing cars; I would hate to have 12 useless sensors just sitting there.
They’re going to disable them next year.

I guarantee it.

Just like they did with radar.
What happens when the layout of things changes while the car is asleep, etc.?
Maybe they'll do persistent Sentry Mode (full time), and that's why they needed a better 12V battery.
I am not buying a car without rear corner radars for cross-traffic alerts.
You don't need cross-traffic alerts if you always park backwards, which is made easy with USS…

Wait….

Oh no!
Ok, yeah. Seems video HDR can be pretty legit. Thanks for the info!

To your last point, the video we get to see from the cameras doesn’t look that great, and I know they do all sorts of pre-processing before sending the camera feed to the NNs. I’m really curious what the feed looks like in the various stages.
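
Nobody outside Tesla knows exactly what those stages are, but a generic pre-NN camera preprocessing chain looks roughly like the sketch below: demosaic the raw Bayer data, tone-map it, then normalize for the network. Every step, name, and constant here is an assumption for illustration only, not Tesla's actual pipeline.

```python
# Generic sketch of camera preprocessing stages ahead of a neural network:
# demosaic the raw Bayer mosaic, tone-map, normalize. These steps are
# typical of camera/ISP pipelines in general; whether and how Tesla does
# any of them is not public.
import cv2
import numpy as np

def preprocess(raw_bayer: np.ndarray) -> np.ndarray:
    """raw_bayer: single-channel 8-bit Bayer mosaic (assumed RGGB layout)."""
    # 1. Demosaic: interpolate the Bayer pattern into a full RGB image.
    rgb = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerRG2RGB)

    # 2. Tone map: compress the signal into a perceptually friendlier range
    #    (a plain gamma curve here; real pipelines use far fancier curves).
    toned = np.power(rgb.astype(np.float32) / 255.0, 1.0 / 2.2)

    # 3. Normalize: shift/scale to the value range the network was trained on.
    return (toned - 0.5) / 0.5
```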
 
You don’t really sound like a shill anymore. Might want to update your signature haha
 
Got it. Thanks. I was under the impression from Elon that the birds-eye view was on its way to our cars really soon, since FSD Beta testers already have it, per Elon:

By Tinsae Aregay, Sep 22, 2021

Elon Musk Confirms ‘Vector-Space Birds Eye View’ Coming To All Tesla Vehicles Next Month

Tesla has released the company's latest vector-space birds-eye view visualization to FSD Beta testers. However, Elon Musk now says the "mind of the car" visualization will be released to all Tesla vehicles, including ones in Europe, next month.
Ha! The so-called "Vector-Space Birds Eye View" is not available on the Model S with FSDb. Furthermore, according to my regular FSD visualization, there is a massive truck in my garage that I may collide with, so I am not using that any time soon to help me park.
 
Terrible decision by Elon, seriously. I don't even think the cameras would help me park in my spot… Do you guys think it would work well with my parking spot?
 

Attachments: five JPEG photos of the parking spot.
Sure, why not? I'd sooner trust a bunch of blocky voxels from a neural network than sound waves bounced off of concrete (which is a particularly troublesome material for ultrasonics).
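
To make the voxel idea concrete, here is a toy sketch of how a top-down occupancy grid could be turned into a USS-style "distance to obstacle" readout. The grid shape, the 0.1 m voxel size, and the function names are all invented for the example; this is not the occupancy network's real output format.

```python
# Toy illustration: turning a top-down occupancy grid into a USS-style
# "distance to the nearest obstacle" number. The grid shape, the 0.1 m
# voxel size, and the ego-cell convention are all invented for this
# example; this is NOT the occupancy network's actual output format.
import numpy as np

VOXEL_SIZE_M = 0.1  # assumed edge length of each grid cell, in meters

def distance_ahead(occupancy: np.ndarray, ego_row: int, ego_col: int) -> float:
    """Distance (m) from the ego cell to the first occupied cell straight
    ahead along increasing row index of a 2D boolean occupancy slice."""
    ahead = occupancy[ego_row + 1:, ego_col]   # cells in front of the car
    hits = np.flatnonzero(ahead)               # indices of occupied cells
    if hits.size == 0:
        return float("inf")                    # nothing detected ahead
    return float((hits[0] + 1) * VOXEL_SIZE_M)

# Example: a 50x20 grid with a wall 12 cells (~1.2 m) in front of the car.
grid = np.zeros((50, 20), dtype=bool)
grid[22, :] = True                             # garage wall across the grid
print(distance_ahead(grid, ego_row=10, ego_col=10))  # -> 1.2
```

A real implementation would of course search along the direction of travel and over the car's full footprint rather than a single grid column, but the principle is the same: the nearest occupied voxel sets the reported distance.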
 
The USS sensors on my 2022 Model 3 can't detect the parking wheel stops, since IIRC the manual states they can't see anything below 5 inches. Maybe Tesla Vision, with memory, can detect them. Regardless, the Model 3 luckily has slightly more ground clearance than the wheel stops are tall.

I'm not sure they will delete USS functionality in the future unless the vision-based system becomes superior. It made sense for radar, due to the tech difference and wanting everyone to get the same AP experience/NN. USS seems to be an add-on on top of vision. Who knows.
 