Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Raindrops as seen by the fisheye camera when stationary, which you were wondering about earlier:

And the effect of a wiper swipe, also seen by the fisheye camera:
Totally awesome. What we're seeing here is exactly the issues discussed upthread: Raindrops out of focus. Sure we're all able to see 'em, but weesa humans. Computer no understand!

Anyway, I'm still going against the flow on the vision-based wiper theory. I'm all in on the dysfunctional HVAC sensor theory.
 
Totally awesome. What we're seeing here is exactly the issues discussed upthread: Raindrops out of focus. Sure we're all able to see 'em, but weesa humans. Computer no understand!

Anyway, I'm still going against the flow on the vision-based wiper theory. I'm all in on the dysfunctional HVAC sensor theory.

Based on those images, I actually see it as "yeah, it's totally possible."

I know based on your past postings you know what you're talking about, so I'm not trying to be condescending. :) Just my take.

Computers don't need to see raindrops as we see them; they just need a uniquely identifiable "shape" (for lack of a better word). We see raindrops clearly, but perhaps a raindrop never needs to be in focus for the computer. It just needs to present a pattern the computer can reasonably expect.

Further, there is a known and regular cadence that windshield wipers run at. By taking multiple frames just after a wiper crosses the camera, I expect it's possible to get a decent sample of how hard it's raining and how fast the wipers should go (or whether they should turn off completely).
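The sampling idea above can be sketched crudely: compare a frame grabbed right after a wiper pass against the current frame, and score how much of the image has been perturbed since. Even out-of-focus droplets shift local brightness, so no sharp "drop detector" is needed. This is a toy illustration with synthetic data; the frame size and threshold are made up.

```python
import numpy as np

def rain_score(frame_after_wipe: np.ndarray, frame_now: np.ndarray) -> float:
    """Crude rain-intensity score: the fraction of pixels that changed
    noticeably since the last wiper pass."""
    diff = np.abs(frame_now.astype(np.int16) - frame_after_wipe.astype(np.int16))
    return float(np.mean(diff > 12))  # threshold chosen arbitrarily

# Synthetic demo: a clean frame vs. the same frame with random "droplet" highlights.
rng = np.random.default_rng(0)
clean = rng.integers(80, 120, size=(120, 160), dtype=np.uint8)
rainy = clean.copy()
ys = rng.integers(0, 120, 300)
xs = rng.integers(0, 160, 300)
rainy[ys, xs] = 255  # blurred droplets perturb local brightness

print(rain_score(clean, clean))          # 0.0: nothing changed
print(rain_score(clean, rainy) > 0.005)  # True: droplets raise the score
```

In a real implementation the wiper cadence would tell you exactly which frames are "just wiped," so the baseline frame refreshes itself on every sweep.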

I think it's going to be fine.

As someone who lives in Los Angeles, I'd rather have the driving features.
As someone who lived in Seattle for a number of years, sorry everyone else. :p
 
Totally awesome. What we're seeing here is exactly the issues discussed upthread: Raindrops out of focus. Sure we're all able to see 'em, but weesa humans. Computer no understand!

Anyway, I'm still going against the flow on the vision-based wiper theory. I'm all in on the dysfunctional HVAC sensor theory.
It's not completely in focus, but it's still much better focused than on the main camera. You can even see the individual drops.
The wide-angle camera is far closer to pictures A and B in your post than to C and D. The main camera looks like C and D (there are just some vague blurs, not enough to make out individual drops with any accuracy).
AP2.0 Cameras: Capabilities and Limitations?
 
Ya, maybe.

Anyway, where is the line drawn between speculating, imagining and just plain guessing? Rain-sensing wipers are my favorite topic because it's such a silly feature to be missing, and none of us have ANY clue whether this is due to difficulties in the SW department, insufficient HW or just priorities. It'll all be boring again when we know the real answer :)
 
Ya, maybe.

Anyway, where is the line drawn between speculating, imagining and just plain guessing? Rain-sensing wipers are my favorite topic because it's such a silly feature to be missing, and none of us have ANY clue whether this is due to difficulties in the SW department, insufficient HW or just priorities. It'll all be boring again when we know the real answer :)
You forgot one more: legal/patents. It could be that whatever solution they come up with butts up against patent issues. And the problem with a lot of these patents being held by suppliers is that they might not agree to license at any reasonable price, or at all (they might just say: buy our rain sensor).
 
You forgot one more: legal/patents. It could be that whatever solution they come up with butts up against patent issues. And the problem with a lot of these patents being held by suppliers is that they might not agree to license at any reasonable price, or at all (they might just say: buy our rain sensor).

It just makes zero sense for Tesla not to include an off-the-shelf rain sensor. Apart from the technical difficulty of the task, the legal side of this is very tough to navigate. It boggles the mind that Tesla is wasting time on rain-sensing wipers through NN recognition instead of the tried-and-true (and cheap!) approach of off-the-shelf sensors.
 
Thanks to @DamianXVI for the great tool. I processed surface-street and highway images.

Sadly, the UYVY format of the backup cam does not really work in any of the tools I have, since e.g. ImageMagick insists it must be interleaved when in my case it is not, so for now I just converted it to a grayscale image using the 8-bit data from one of the channels.

Upside - I am really impressed with how all the side cameras are unaffected by all the rain. Downside - backup camera on highway really got clogged with water and apparently no amount of Rain-X helped.

street:
main.jpg narrow.jpg rightpillar.jpg rightrepeater.jpg leftpillar.jpg leftrepeater.jpg fisheye.jpg backup.jpg
 
Upside - I am really impressed with how all the side cameras are unaffected by all the rain. Downside - backup camera on highway really got clogged with water and apparently no amount of Rain-X helped.

Do side cameras include both the B pillar ones and the front quarter panel ones (hard for me to figure which cameras are which just from photos)? Those were my concerns when AP2 was first released. Happy to see it looks like they work pretty well! (Every S owner knew the rear cam would be mostly useless in the rain, so wasn't surprised about that one)
 
Do side cameras include both the B pillar ones and the front quarter panel ones (hard for me to figure which cameras are which just from photos)? Those were my concerns when AP2 was first released. Happy to see it looks like they work pretty well! (Every S owner knew the rear cam would be mostly useless in the rain, so wasn't surprised about that one)
Yes. The pictures labeled "repeater" are the quarter-panel ones; the "pillar" ones are the B pillars.
 
Thanks to @DamianXVI for the great tool. I processed surface-street and highway images.

Sadly, the UYVY format of the backup cam does not really work in any of the tools I have, since e.g. ImageMagick insists it must be interleaved when in my case it is not, so for now I just converted it to a grayscale image using the 8-bit data from one of the channels.

Upside - I am really impressed with how all the side cameras are unaffected by all the rain. Downside - backup camera on highway really got clogged with water and apparently no amount of Rain-X helped.

street:
Yeah, looks like the rear camera is definitely a problem in the rain. The others did surprisingly well.
 
The main camera cluster is within the wipers' sweep, so rain there is mostly controllable, but there are no wipers on the rest of them, so I am really impressed by the performance of the side cameras in the face of significant rain (no idea how much Rain-X helped on the side cameras).

BTW, another observation: while taking these snapshots, Autosteer was really wobbly in the lane, as if it was not really sure where to aim.
 
The general purpose is to use the red channel for red(ish) objects: stop signs, red lights, brake lights; human skin also has a high red response (across all skin tones). That, plus the HDR monochrome information, is fed straight into the computer vision algorithms and/or NNs. They don't need to identify the sky as much as signs, signals, cars, pedestrians, and other obstacles within the drivable path.

Then again, if you discard the red data, the sky will sometimes be totally blown out, and it's easy to see how the broadside of a brightly lit white truck might blend into it.

I wonder if monochrome cameras with a red channel are sufficient to distinguish construction-site lane markings (orange, sometimes yellow) from regular lane markings (white/gray).
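With only a red and a clear (luma) channel you can't recover full color, but a crude red-to-luma ratio might still separate the two cases: white paint reflects all wavelengths roughly equally (red response tracks luma), while orange paint reflects mostly red (red response sits well above its share of luma). A toy sketch; the threshold and the channel values below are invented for illustration, not measured from any camera.

```python
def looks_orange(red: float, luma: float) -> bool:
    """Heuristic: flag a marking as orange-ish when the red-channel
    response clearly exceeds the overall (luma) brightness.
    The 1.25 factor is a made-up threshold."""
    return red > 1.25 * luma

# Rough channel responses for illustration (hypothetical values):
white_marking = (200.0, 200.0)   # red and luma roughly equal
orange_marking = (220.0, 140.0)  # red dominates the overall brightness

print(looks_orange(*white_marking))   # False
print(looks_orange(*orange_marking))  # True
```

Yellow markings would likely be the hard case, since yellow paint also reflects a lot of green, pulling the ratio back toward white.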