Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla replacing ultrasonic sensors with Tesla Vision

Wipers are another story. Part of the difficulty is that they are designed to work for the camera, not the driver.

I've often seen this, but never quite understood it! :)

I mean, technically you're right: the cameras look at the top centre of the screen where they themselves are located, so you could say they decide when to wipe for their own part of the screen rather than for where the driver is looking out. But exactly the same is true of traditional rain-sensor systems - the sensor sits in the same place and so is also, strictly speaking, working for itself rather than the driver. Yet traditional rain sensors work, pretty much perfectly in my 2001 BMW 5-Series and 2012 Ford S-Max. And how different can the rain conditions actually be between the sensor location and the part of the screen a foot away that the driver is looking through? I'd say no different whatsoever, so I wonder what we even mean by distinguishing 'for the driver' from 'for the camera'. Both driver and camera face the same conditions on their part of the screen, and both have the same need: a clear screen.
 
I've often seen this, but never quite understood it! :) […]
For sure, it does not work quite as well as my previous Lexus did, at least on my 2023 MYLR.
 
I've often seen this, but never quite understood it! :) […]
Well, I would only say that from my own observation there is often much more water on the lower half of the windshield than the upper half in marginal situations. Certainly, this is due to the aerodynamics of the front of the car.

In addition, I believe most rain sensors are based on IR, and while Tesla cameras apparently use some IR, it may be the visible spectrum aspect of the camera that is causing the misbehavior.

Clearly (/s) it is not a perfect solution.
 
IMHO, the difficulty may be that it's impossible for any of the cameras to focus on the screen itself, since they are all focused out at distance. That means all they will ever see of droplets is some degree of blurriness in the image, and it must be hard to process that and accurately determine the degree of rainfall. If you really wanted to do this with cameras, a cheap low-res dedicated camera actually focused on the screen would give you a fighting chance. But if you need a dedicated camera, better still to just fit a rain sensor! I dunno - cameras unsuited to the task, plus a dedicated neural network trying to make some sense of it, just seems like amazingly inelegant engineering when a $1 specialised sensor is available!! Ughh.

That said, I only just got my M3, and in the little I've driven it, which has included all versions of rain plus some snow, my wipers don't seem horrible. Never left me blinded due to inaction, at least, but I've absolutely noticed them frantically flapping at full speed when the screen is already clear. Previous owner applied RainX and I've heard some say that helps, and some say that makes it worse, so it will be interesting to see what happens as it wears off.
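To make the blur idea concrete: since the cameras are focused at distance, droplets on the glass mostly show up as a loss of local sharpness. A crude stand-in for that (purely illustrative, nothing to do with Tesla's actual pipeline; the threshold is made up) is the classic variance-of-Laplacian sharpness measure:

```python
# Illustrative sketch: droplets on glass defocus parts of a
# distance-focused image, lowering local sharpness. A crude sharpness
# proxy is the variance of a discrete 3x3 Laplacian over the frame.

def laplacian_variance(img):
    """img: 2D list of grayscale values (at least 3x3).
    Returns the variance of the discrete Laplacian over interior pixels."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbour Laplacian: high magnitude = sharp edges nearby.
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def looks_rainy(img, threshold=50.0):
    # Hypothetical threshold: a defocused (rainy) frame has low
    # Laplacian variance compared with a crisp one.
    return laplacian_variance(img) < threshold
```

Even this toy version shows the weakness being discussed: low sharpness could just as easily mean fog, a dirty lens, or a featureless scene, which is presumably why a network (and a lot of tuning) is needed on top.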
 
I've often seen this, but never quite understood it! :) […]
I say this from experience: when there is road spray, it gets all over the windshield, but the top of the car (where the cameras are) stays clear. This results in the wipers not activating even when the windshield is obscured.

Googling some pictures, the BMW rain sensors are mounted under the camera, so it is in a slightly better location.

By necessity, AP needs to know if the camera is obscured by rain, which is probably why they decided to use it for wiper control as well. They could probably address a lot of complaints simply by adding a sensitivity adjustment. There are probably also a lot of heuristics they could apply to optimize it for human usage (instead of camera usage). It just hasn't been a priority for them (just like high beams).
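For what it's worth, that "sensitivity adjustment" is cheap to sketch. Something like the following (entirely hypothetical names and thresholds, just to show the shape of the idea): scale whatever rain score the detector produces by a user preference before mapping it to a wiper speed, with hysteresis so the wipers don't hunt between speeds:

```python
# Hypothetical sketch of a user sensitivity knob on top of a rain
# detector. rain_score is assumed to be 0.0-1.0 from whatever upstream
# detector exists; everything else here is invented for illustration.

SPEEDS = ["off", "slow", "fast"]

def wiper_speed(rain_score, sensitivity=1.0, current="off"):
    # User preference simply scales the raw score.
    score = min(rain_score * sensitivity, 1.0)
    # Hysteresis: stepping up requires a higher score than staying put,
    # so borderline scores don't cause frantic flapping.
    up = {"off": 0.3, "slow": 0.7}
    down = {"slow": 0.2, "fast": 0.6}
    idx = SPEEDS.index(current)
    if idx < len(SPEEDS) - 1 and score >= up[current]:
        return SPEEDS[idx + 1]
    if idx > 0 and score < down[current]:
        return SPEEDS[idx - 1]
    return current
```

The point is that the detector and the policy are separable: even if the camera-side rain score never improves, a per-driver scale factor plus hysteresis would address a lot of the "flapping on a dry screen" complaints.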
 
I say this from experience: when there is road spray, it gets all over the windshield but the top of the car (where the cameras are) is clear. […]

But AP could also be told it's raining by a rain sensor! Then both AP and auto wipers would be better.
 
But AP could also be told it's raining by a rain sensor! Then both AP and auto wipers would be better.
That has the same problem: it is offset from the camera. By using the camera, the camera itself directly reports whether it is obscured (much like a driver deciding to wipe based on the clarity of the whole windshield). It also catches non-rain particles on the windshield, like dust or bugs.
 
I have a 2021 MY with USS. Is the thought that once a new Vision solution is figured out, cars with USS will have the sensors disabled in future firmware? Or might the USS remain functional and the Vision solution will be just for newer cars without USS?
Nobody knows; anything beyond that is pure speculation. Tesla claimed in the newsletter that they “don’t plan to remove USS functionality at this time”… whatever that means. We don’t know.
 
Is the thought that once a new Vision solution is figured out, cars with USS will have the sensors disabled in future firmware? Or might the USS remain functional and the Vision solution will be just for newer cars without USS?

There’s no cost to Tesla in just leaving both, so they will. At least until vision is superior in every single situation (which seems like it will never be the case).

I’m sure there will be an option to disable USS in cars if vision sensing is ever released. (I think there is only an option to make them quiet at the moment?)
 
<speculation> They disabled the existing radar. I would not be surprised if they disable the USS in the name of Tesla Vision </speculation>

They have a cost incentive to disable USS (a single code base, no maintenance) and it aligns with the Tesla Vision dogma. My bet is that within a year they will be disabled.
 
Who knows. I assume there is someone on the AI team with a brain who has thought of the potential problems.
Yes, but a little prick in the accounting department felt otherwise. Tesla Vision could never be argued for from an engineering standpoint. It still can’t even work the headlights or windshield wipers, and you were asking for a lot with parking assistance. Over 10 years ago my BMW had sensors that could actually detect moisture, temperature, humidity and vehicle speed. I know, mind blown.
 
Yes, but a little prick at the accounting department felt otherwise. […]
Well, if you look at the Lex Fridman interview with the chap who was head of AI at Tesla (and has since left the company), he was spitting some BS about multiple sensors and such.

I mean, the guy is about as deluded on this as I am about ballet...
 
I've often seen this, but never quite understood it! :) […]

You'd think that the amount of splash on a windshield would be pretty uniform, but just a few days ago we were having light wet snow here, and for whatever reason the mess was accumulating on the bottom half of the windshield while the top stayed pretty clean. I had to press the stalk button to clear my view.
 