
Suspected repeater camera defect that affects FSD performance

Sorry for the aside, but I'm wondering if the cameras see infrared. I live in a rural area and, so far, my Y has not reacted to deer during the day or at night, either on or alongside the road. Any thoughts on that? Strangely, last year it slowed and I realized that a family of coyotes was crossing far down the road.
I don't believe the cameras are sensitive to infrared, but they were not designed for humans to see the image from them. This is why the colors in the feed from the cameras might seem a bit strange: they were designed to be used only by the Autopilot computer. The ability to use them as dashcams, sentry cameras, and blind spot monitor cameras came later, through software updates, because people asked for those features.

You can read more info about what colors the cameras are more sensitive to and why they were chosen that way at the following links. The TeslaTap article is now a bit outdated, but the info is still relevant.


This thread mentions the different color filters in different generations of AP cameras: Upgrading AP cameras
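To make the "strange colors" point concrete, here's a rough sketch (assuming an RCCB filter layout, which is what the AP cameras are commonly reported to use; all numbers invented) of why green has to be guessed rather than measured:

```python
# Rough sketch: an RCCB sensor has no green filter. The "clear" pixel
# passes roughly all visible light, so green must be *inferred* by
# subtraction, which is one reason the human-viewable feed looks odd.
r = 0.30   # red-filtered pixel (arbitrary units)
b = 0.20   # blue-filtered pixel
c = 0.85   # clear pixel: roughly r + g + b, plus sensor noise

g_estimate = c - r - b   # green is never directly measured
print(f"estimated green: {g_estimate:.2f}")   # noise lands in this channel
```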
 
I think we're kinda saying the same thing. I agree the response curve is highly non-linear, but that's a distinct problem from re-mapping IR into the visible spectrum (e.g. with security cameras). With correct filtering, you can build a camera that can see the full spectrum from visible to IR (though I'd be the first to admit this is non-trivial). Or, equivalently, you simply have multiple cameras and combine the images in post-processing. My point was just that if you want to present such a system to a human, you HAVE to remap everything into the visible spectrum, which inevitably eliminates information, whereas for a car visual system and NN you can directly train against the full spectrum. So we need to be careful when drawing comparisons using remapped IR images that a human can see.
Except what @stopcrazypp was saying is that it's likely the cameras have filters on them and don't remap colors at all. If they have a (non-switchable) filter on the camera, then the car doesn't remap anything, but it also never gets the information at all. Essentially, the spectrum processing is being done at a hardware level in the camera.

It all depends not just on the capabilities of the camera optics and sensor but on how much dynamic range it's capable of handling at a given time and what kind of image data it can send to the computer.

I'd also note that camera technology and cost have advanced significantly over the past 5-10 years, so it's a virtual certainty that Tesla is dealing with multiple different cameras with different capabilities.
 
Nope, you don't get it, and you clearly never will.

I didn't say ANYTHING about the NN getting better or worse or staying the same. I simply said WE DON'T KNOW. Since we don't know, any claim made based on KNOWING is invalid. End of discussion.

If I flip a coin and you say, without looking, that it's heads, I can say "you don't know that" without knowing whether it's heads or tails. You keep claiming that my saying "you don't know that" is the same as my claiming it's really tails, which is nonsense.
But this is not what the car is seeing; it's what our human eyes are seeing after it's projected through an LCD panel.

And, quite apart from anything else, you are seeing a rendering on an LCD panel with all that implies, while the NN is seeing the camera output directly (albeit, at the moment, with some significant video processing). And what about that processing? You do understand that the dynamic range of the cameras is far greater than the LCD panel, right? That allows the NN to see details in light and dark areas at the same time without the tricks needed to get a compromised image for you to look at.
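To put rough numbers on the dynamic range point (a toy sketch, not Tesla's actual pipeline; all values invented):

```python
import numpy as np

# Toy example: a 12-bit camera signal (0..4095) squeezed into the 8-bit
# range (0..255) that an LCD panel can display.
raw = np.array([5, 40, 2000, 4000, 4090])  # deep shadow .. bright highlight

display = raw * 256 // 4096                # naive linear tone map
print(display)                             # -> [0 2 125 250 255]

# The 8x shadow difference (5 vs 40) survives as only 0 vs 2 display
# levels, while a NN fed `raw` directly still sees the full contrast.
```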
You've kind of changed your point, though. You initially said (at least seemed to be saying) that the car was seeing more data than was being presented, which we don't know, either.

Regardless, I don't think either of us is really saying anything that different, and ultimately it will be difficult to get a definitive answer. My fear is that even if it does compromise FSD capabilities, Tesla will just try to compensate rather than fix the problem, because fixing it would cost money.
 
I think we're kinda saying the same thing. I agree the response curve is highly non-linear, but that's a distinct problem from re-mapping IR into the visible spectrum (e.g. with security cameras). With correct filtering, you can build a camera that can see the full spectrum from visible to IR (though I'd be the first to admit this is non-trivial). Or, equivalently, you simply have multiple cameras and combine the images in post-processing. My point was just that if you want to present such a system to a human, you HAVE to remap everything into the visible spectrum, which inevitably eliminates information, whereas for a car visual system and NN you can directly train against the full spectrum. So we need to be careful when drawing comparisons using remapped IR images that a human can see.
I'm not really saying the same thing. I'm saying that doing things with a real physical IR filter vs. software is fundamentally different and isn't simply a matter of remapping the data. You can't use software to replicate an IR filter, because of the problem shown above (the IR simply overwhelms visible light for certain objects like foliage).

Sure, you can definitely build a full spectrum camera that gets around this, but it'll be complex, about as complex as having multiple cameras and combining them. One idea, for example, is to have dedicated IR pixels, similar to how a Bayer-filtered camera has different pixels filtered differently. But it likely won't be cheaper than adding a switchable IR filter, and it sacrifices some pixels in either mode, which kind of defeats the purpose, given that the use of IR was intended for low-light conditions.

The point is that it can't be achieved purely in software, because the issue is a fundamental physical problem. There has to be some hardware change (whether a switchable IR filter, a custom Bayer/IR hybrid filter, combining the outputs of separate visible-light and IR cameras, etc.).
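A rough sketch of why (all numbers invented): each sensor pixel records a single value that already sums visible and IR light, so software has nothing left to separate:

```python
# Rough sketch of why software can't replace a physical IR filter.
visible = 10     # visible light reflected by foliage (dim)
infrared = 200   # near-IR reflected by foliage (very strong)

# Without an IR-cut filter, the sensor records only the sum:
pixel = visible + infrared   # -> 210

# Software sees just `pixel`; the split into (10, 200) is gone, so no
# computation can recover the visible-only value. A physical filter
# removes the IR *before* the sum is formed:
pixel_filtered = visible     # -> 10
```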
 
Does anyone know if this issue is related to the mysterious error message "Auto park unavailable" that I get in my MX from time to time? It only seems to happen when there is bright sun on the side of the car.

I am guessing the error is due to the repeater being blinded. And yes, the message should be inhibited if the car is in 'Drive'.
I bought my 2018 model X new and auto-park has only worked once.
 
Sure, you can definitely build a full spectrum camera that gets around this, but it'll be complex, about as complex as having multiple cameras and combining them. One idea, for example, is to have dedicated IR pixels, similar to how a Bayer-filtered camera has different pixels filtered differently. But it likely won't be cheaper than adding a switchable IR filter, and it sacrifices some pixels in either mode, which kind of defeats the purpose, given that the use of IR was intended for low-light conditions.

The point is that it can't be achieved purely in software, because the issue is a fundamental physical problem. There has to be some hardware change (whether a switchable IR filter, a custom Bayer/IR hybrid filter, combining the outputs of separate visible-light and IR cameras, etc.).
I don't disagree with any of this. My point was in response to the original post showing a regular vs. IR-mapped image and arguing from that (or so I understood) that IR cameras would not be useful to AV cars. I felt that was invalid, since regular vision plus IR vision (however it is achieved) wouldn't suffer from the color aliasing present when you try to remap IR into the human-visible spectrum.
 
I'd throw it out there that, judging from the dashcam footage, IR (infrared) is filtered out, so the cameras can't see IR.
I just cooked some lunch and put the black frying pan in front of the main camera in a dark garage, and I couldn't see anything "lighting up" when viewing with TeslaCam. So it seems like the sensors don't detect it, or even the RCCB "clear" channels have IR filtered out too.
 
I don't disagree with any of this. My point was in response to the original post showing a regular vs. IR-mapped image and arguing from that (or so I understood) that IR cameras would not be useful to AV cars. I felt that was invalid, since regular vision plus IR vision (however it is achieved) wouldn't suffer from the color aliasing present when you try to remap IR into the human-visible spectrum.
That wasn't the point of the picture I showed; I was merely pointing out that, given how the dashcam footage looked, especially the foliage, Tesla was not using an IR-sensitive camera (i.e. one without an IR filter, or with a switchable one). I was not making a comment on whether it would be useful to AVs or not.
 
Some posters here are confusing infrared detection with thermal detection.
Oh thanks. 😅 Indeed the heated pan is the "far" end of infrared. Looks like Tesla's cameras do indeed see the "near" end of infrared with my quick test with the Wii's sensor bar:
side on infrared.jpg
side off infrared.jpg


And front camera too:
front infrared.jpg


And yes there was no visible light I could see from the sensor bar with the lights off.
 
Tesla just refused to fix my cameras under warranty (they’d be happy to take $300 from me to do it). They said it’s not a defect. Does anyone know how to complain to Tesla?
I had an existing Mobile Service appointment for 2/8. I sent a message adding an issue about the cameras' glare at night, asking if it was a warranty item. I received a reply back the next day, and they said yes, it is covered. The updated service estimate shows the camera issue at $0. I'd ask again; the worst they can say is nope.
 
Oh thanks. 😅 Indeed the heated pan is the "far" end of infrared. Looks like Tesla's cameras do indeed see the "near" end of infrared with my quick test with the Wii's sensor bar:
side on infrared.jpg
side off infrared.jpg

And front camera too:
front infrared.jpg

And yes there was no visible light I could see from the sensor bar with the lights off.
From what I can find, the Wii sensor bar uses 850nm LEDs.
Infrared module for Wii remote application
Although these are still invisible to the human eye, they typically have a faint glow when you look directly into the LED, and should be easily detected by a camera even with a typical IR filter.
Even with a 940nm LED (which does not glow visibly when you look directly into it), given that IR filters are not perfect cutoffs and have varying cutoff points, the LED may still show up on cameras with IR filters if you aim it directly at the lens. That's how those hidden camera detector apps work.

A more proper test of whether a camera has no IR filter at all is to shine the lights onto a reflecting surface while it's pitch dark. You should see the surface strongly illuminated if the camera has no IR filter; that is how security camera night vision works. It doesn't help much if all you can detect are IR LEDs shining into the camera; you need to be able to detect objects that the IR LED is illuminating.

I'll see if I can try that test when I have the chance over the weekend, as I have security cameras that have both 850nm and 940nm LEDs.
 
Oh thanks. 😅 Indeed the heated pan is the "far" end of infrared. Looks like Tesla's cameras do indeed see the "near" end of infrared with my quick test with the Wii's sensor bar:
side on infrared.jpg
side off infrared.jpg

And front camera too:
front infrared.jpg

And yes there was no visible light I could see from the sensor bar with the lights off.
Yes, most cameras do if they don't have an IR cut filter in front of the sensor. That's nowhere near the wavelength of your frying pan though. 🤣
 
Well, you are conflating two issues.

First and foremost, my posts were pointing out that the argument "I cannot see X on screen, therefore FSD cannot see X" is an invalid assumption. Since this logic is the basis of the argument that FSD is impacted by light leakage, that argument is invalidated. That's not speculation, it's just logic.

Of course, invalidating one argument doesn't mean there are no other valid arguments about the possible impact of light leakage. As you note, without true knowledge of how the cameras/NN handle this, we cannot know. However, it's worth noting that FSD is a primary project for Tesla, and has been for several years, with massive resources assigned to it. The FSD vision stack has been running in-house for 2-3 years now (in some form or other). If light leakage were an issue for FSD, do you think it would have gone unnoticed within Tesla all that time? And, given the importance of the FSD project, do you not think it would have been addressed long ago? Tesla makes production changes to the cars all the time; can you imagine a scenario in which a relatively trivial change was refused when the FSD team felt it was impacting the capabilities of FSD?

Yes, this is speculation too, but it seems more logical to me than assuming Tesla "covered up" an issue and potentially crippled one of their most important and visible development efforts. That seems to me to be drifting into the realm of conspiracy theory. So why are they fixing it now? Because it only became significant when they added the user-visible camera view to the UI as a convenience.
My question is not specific to this issue, but since we have some folks with camera knowledge on this thread, I wondered if someone could answer the following. Based on how Tesla is using the cameras vs. how the human eye works, can that extend the distance the cameras can see? There are lots of posts that refer to an 80 meter limit for the cameras. Is that valid when Tesla is not using the cameras the same way the human eye does, and as the distance extends past 80 meters, at what distance do we think the camera can no longer adequately function for FSD? I would assume some distance greater than 80 meters is still OK, but maybe not.
 
Based on how Tesla is using the cameras vs. how the human eye works, can that extend the distance the cameras can see
It's not really clear how Tesla's marketing page for Autopilot came up with the "Max distance 80m" for the pillar "Forward Looking Side Cameras" as clearly all cameras are able to see the sun…

Here's a shrunk down image from the pillar camera from AI Day:
pillar 80m.jpg


The far/outer-edge red curb line for Page Mill looking from Foothill does seem to stop at roughly 80m in this prediction/visualization. But one can still see multiple street lamps further out, and even a road sign, which could be more comparable in size to a vehicle. Those are at roughly 160m / 500ft distance, and a human could probably determine whether a vehicle is there and approaching quickly or not. And remember that this is a lower-resolution version of the view after it has gone through image processing for human viewing. Could FSD Beta neural networks reliably detect objects in the pillar camera further than 80m? Maybe?
 
How far away a camera can 'see' an object depends on many variables, including sensor resolution, lens focal length, resolving power, how the data from the sensor is processed, lighting and weather conditions, etc., so there isn't a definitive number. I assume 80m was chosen based on the technology and configuration of the cameras used at the time.

What the camera resolves and what the MCU interprets as a particular object are two different things as well, just like with the human eye/visual cortex. At 80m, the front/pillar cameras with their wide-angle lenses are not going to resolve a small object particularly well. Once you get too far away, a human-sized object is going to be a few fuzzy pixels on the sensor, and no amount of processing will be able to interpret what it actually is.

It could get better in the future, and it probably will, but there are already a lot of 'old' cameras installed in Teslas, so Tesla needs to be conservative in its approach or cars will constantly need camera upgrades to keep up. The bottom line is, it only needs to be as good as the human eye/brain to do a good job.
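As a back-of-the-envelope illustration (the sensor width and field of view below are assumptions for the sake of the math, not Tesla's actual specs):

```python
import math

# Back-of-the-envelope: how many pixels a car-width target covers.
sensor_px = 1280   # horizontal pixels (assumed)
fov_deg = 90       # horizontal field of view in degrees (assumed)
target_m = 1.8     # width of a typical car

for dist_m in (20, 80, 160):
    angle_deg = math.degrees(2 * math.atan(target_m / (2 * dist_m)))
    px = sensor_px * angle_deg / fov_deg
    print(f"{dist_m:4d} m -> ~{px:.0f} px wide")
```

Under those assumptions, a car shrinks from roughly 73 pixels wide at 20m to about 18 at 80m and 9 at 160m, which is consistent with the 'few fuzzy pixels' problem.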
 
I'll see if I can try that test when I have the chance over the weekend, as I have security cameras that have both 850nm and 940nm LEDs.
For anyone still interested in the follow-up to the infrared tests @Mardak did upthread: the results were exactly as I mentioned. All of the Tesla cameras (at least the ones the dashcam shows) have IR filters, so they filter out the IR. Shining an IR light onto a surface still leaves the surface essentially pitch black to the Tesla cameras, so they're useless for detecting objects using IR light.

I used the infrared LEDs from my Wyze V3 (which has four 850nm and four 940nm LEDs that can be independently switched on or off). When it's pitch dark, although they are dimmer than the 850nm LEDs, I can still see the 940nm ones with my naked eye (same thing using my cellphone), so both of them actually still emit some visible light. I have a 2021 Model 3 (delivered end of 2020), so YMMV depending on which model you have.

Here are the results of shining the LEDs directly into the cameras. As a control, I shone the LEDs into Wyze V2 cameras and switched the IR (infrared) filter on and off. As you can see, without the IR filter it's like shining a flashlight into the camera.

850nm:
850nm_leds.jpg


940nm:
940nm_leds.jpg


I then took test images. This is the reference from the Tesla Dashcam (shot with a flashlight shining onto the surface); I cropped the images:
0_flashlight.jpg


This is with the 850nm:
850nm.jpg


This is how the 850nm looked to my Wyze V3 security camera with the IR filter off, in the position where the above picture was taken. As expected, it looks as bright as if a flashlight were shining on it. Note that the front images for the security camera were shot through the windshield, so the IR light is diminished, although it still clearly illuminates the object:
850nm IR.jpg


This is the Tesla on 940nm:
940nm.jpg


This is the security camera on 940nm:
940nm IR.jpg
 
...just thinking about the early Model S/X needing new cameras and this issue. I'm wondering if Tesla is not upgrading these because someone thinks additional camera upgrades might be needed in the future. As an FSD Beta tester, there are times when the car wants to jump into crossing traffic. I'm not sure if that is a camera issue or an AI perception issue, as debated in other threads, but it might be that all the side cameras will need replacing at some point for FSD anyway.
 
This issue came to mind on a drive recently, so I decided to test it on the highway (therefore, I'm testing AP/NoA, not FSD).

First, I waited for a faster car to approach me on my left side. Then I would turn on my left blinker before the car entered my blind spot. In this scenario, the internal glare is no longer apparent in the popup. The camera, being blinded by the approaching headlights, adjusts its exposure sensitivity and no longer picks up the internal glare. Not surprisingly, AP/NoA works flawlessly and doesn't allow me to change lanes until the car passes.

Next, with no car to my left, I turn on my left blinker. Here, the internal glare is on full display in the popup. Yet the car moves over immediately.

I did this for about 5 cars. Never had an issue. This is despite my car having this problem since its manufacture in 2018. And I've never had a problem with AP trying to change lanes when a car was near my blind spot.

I encourage everyone to try this for themselves so we can stop being concerned about something that's most likely a non-issue.
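A toy model of that exposure effect (invented numbers, not the real camera pipeline):

```python
# Toy model of auto-exposure hiding internal glare.
glare = 50                    # fixed internal reflection (sensor units)

for scene in (100, 5000):     # empty lane at night vs. bright headlights
    exposure = 1000 / scene   # auto-exposure scales down for bright scenes
    print(f"scene={scene:5d}  apparent glare={glare * exposure:.0f}")

# With headlights present, the same glare renders ~50x dimmer, which is
# why it disappears from the popup view while a car is passing.
```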
 
This issue came to mind on a drive recently, so I decided to test it on the highway (therefore, I'm testing AP/NoA, not FSD).

First, I waited for a faster car to approach me on my left side. Then I would turn on my left blinker before the car entered my blind spot. In this scenario, the internal glare is no longer apparent in the popup. The camera, being blinded by the approaching headlights, adjusts its exposure sensitivity and no longer picks up the internal glare. Not surprisingly, AP/NoA works flawlessly and doesn't allow me to change lanes until the car passes.

Next, with no car to my left, I turn on my left blinker. Here, the internal glare is on full display in the popup. Yet the car moves over immediately.

I did this for about 5 cars. Never had an issue. This is despite my car having this problem since its manufacture in 2018. And I've never had a problem with AP trying to change lanes when a car was near my blind spot.

I encourage everyone to try this for themselves so we can stop being concerned about something that's most likely a non-issue.

Counterpoint: this car with no headlights
You're relying on the headlights of the other car to compensate for the defect in your blind spot camera.

(image: car with no headlights)