
Suspected repeater camera defect that affects FSD performance

We'll see eventually how big an issue this is, I suppose. I'm just speaking from my experience with motion tracking when I say this issue raises major red flags for me. Maybe there's a regulatory loophole that allows them to reduce the blink rate for additional data, maybe the issue gets superseded altogether by camera upgrades, maybe they figure something out on the software side. I just don't want to be taken advantage of, given I paid for FSD, which has delivered zero value for me so far as a UK resident. I can only really hope they deliver what they promise, and I get defensive when it looks as if they're trying to get away with delivering less.

If you watch the video at the top of my OP, someone from these forums found a way to fix it, but it requires you to drill the repeater housing, which'll deffo void your warranty if Tesla ever ends up seeing it. Maybe they'll be noting any logs of anyone disconnecting and reconnecting repeater cams.
I somehow missed this before I posted; I don't fancy drilling holes in things.
As an aside, I don't often see someone using "deffo", certainly not on this side of the pond - then I noticed your location :D
 
Going back to the blinded camera, that's affecting step 2 in the above process - analyse and identify points of high contrast. If you're blinding a camera with indicator light 50% of the time during a lane change or junction turn, you're corrupting 50% of the data you base decisions on. Everything in the visual area of the glare is having its perceived colours, contrast and brightness dramatically shifted. As far as the computer is concerned, everything in that area is now completely different. You can either develop some kind of advanced algorithm to try and account for the glare in every scenario (which isn't going to work well, because depending on what's behind the glare the area can differ in colour or contrast, AND it will cost you additional processing power), or completely disregard the frames with the glare and extrapolate what's happening using object permanence. Either way, you're spending a ton of development time compensating for a manufacturing defect, and both approaches are going to affect the decisions FSD makes in strange ways.
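As a heavily simplified sketch of the "disregard the glared frames" option described above: a pre-filter could flag frames where too large an area is washed out, and downstream tracking would coast through the gaps. Everything here is hypothetical for illustration (the thresholds, the grayscale input format) - it is not Tesla's actual pipeline:

```python
import numpy as np

GLARE_LUMA = 240       # hypothetical: pixels this bright count as washed out
GLARE_FRACTION = 0.05  # hypothetical: >5% washed-out pixels = blinded frame

def is_blinded(frame: np.ndarray) -> bool:
    """Flag a frame where a large area is saturated by glare.
    `frame` is an (H, W) array of 0-255 luma values."""
    saturated = np.count_nonzero(frame >= GLARE_LUMA)
    return saturated / frame.size > GLARE_FRACTION

def usable_frames(frames):
    """Drop blinded frames; downstream tracking would then have to coast
    on object permanence across the gaps."""
    return [f for f in frames if not is_blinded(f)]

dark = np.full((120, 160), 40, dtype=np.uint8)  # clean night-time frame
flooded = dark.copy()
flooded[:40, :] = 255                           # top third flooded by glare
print(is_blinded(dark), is_blinded(flooded))    # False True
```

The cost mentioned above shows up immediately: with the indicator on, roughly half the frames fail this check, so the tracker is working from half the data.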
Does anyone have any actual hard evidence that the camera/NN is indeed "blinded"? Sure, the color contrast changes dramatically, and a few spots seem to wash out in the view shown on screen. But this is not what the car is seeing, it's what our human eyes are seeing after it's projected through an LCD panel. If people are just extrapolating from "I can't see stuff on the screen" to "the car can't see stuff", then I think that is a very dubious claim.

Stop for a moment and think about what the NN/cameras are already coping with. Driving in blazing sunlight, head on. Then suddenly driving in dark shadows or tunnels. Driving in dark conditions, with varying degrees of lighting, and different colored street lights (sodium, anyone?). I've carefully watched the car's ability to distinguish lane lines and cars, and don't see the NNs having much trouble coping with these far harsher changes in conditions. Cameras are not eyes, and while our eyes may beat out the camera in some areas, they are certainly weaker in others, such as the speed with which they can adjust to different ambient light conditions (seconds for eyes, fractions of a second for cameras).

In addition, FSD (and NoA) has been driving with this "blinded" view for several years already. People only got all worked up when Tesla added the side-view assist feature. My guess is Tesla added the camera fix not because FSD needed it, but because humans needed it. But of course I'm speculating, as is everyone else getting all excited about "fixing" a problem that has not even been shown (so far as I can see) to exist.
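On the point above about cameras re-adjusting to ambient light in fractions of a second: a toy proportional auto-exposure loop shows how a simple feedback controller can pull a suddenly-dark scene back to a target brightness within a handful of frames. Purely illustrative - real camera ISPs use metering zones, dual-gain HDR and much more:

```python
def auto_expose(scene_luma: float, frames: int = 15,
                target: float = 0.5, gain: float = 1.0) -> list[float]:
    """Toy proportional auto-exposure loop (illustrative only).
    Each frame nudges exposure so the measured brightness heads toward
    `target`; returns the measured brightness per frame."""
    exposure = 1.0
    history = []
    for _ in range(frames):
        measured = min(1.0, scene_luma * exposure)    # sensor clips at 1.0
        history.append(measured)
        exposure *= 1.0 + gain * (target - measured)  # proportional step
    return history

# Driving into a tunnel: the scene suddenly reads 0.1, and the loop
# ramps exposure until brightness is back near target within 15 frames
# (half a second of video at 30 fps).
levels = auto_expose(scene_luma=0.1)
print(round(levels[-1], 2))  # 0.5
```

Half a second to recover full brightness is exactly the "fractions of a second" advantage over human eyes adapting to a tunnel.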
 
Some people on this forum have said the repeater problem affects both their cameras, only one, or neither, so various cars out there have different combinations of the repeater cameras. I've also noticed many people reporting different experiences with FSD/AP - some have more disengagements, some have fewer - so maybe the cameras do have an influence on the driving. Maybe the computer can filter out the noise by dropping frames, but that has some effect on scene accuracy and reaction time too.

I find it interesting how people with the same model cars report entirely different experiences with FSD/AP/Phantom braking. I haven't seen any clear studies on whether "identical" cars with identical software do indeed have different driving reactions. This repeater camera may at least be one factor that makes "identical" cars not identical.

Maybe there are other hardware variations in other areas of the car that hold hidden & undocumented defects/improvements too?
 
Hi All,

I would add a caution to those who might want to repair their repeater cameras with black tape. The repeaters snap into place with 3 or 4 plastic tabs on their housings. These tabs are shaped like the number one (1): the long body of the one acts like a spring, and the top "hook" of the one bends out of the way and then snaps into position when the repeater is pressed into place. It is not really designed to be removed, as these tabs will break when the repeater is removed.

If one could reach inside the car body you could fold some of the tabs out of the way... but short of removing the frunk tub or maybe a wheel well liner, I don't see a way to remove the repeater without breaking the tabs.

FWIW - Good luck,

Shawn

PS - check eBay if you wish. Part numbers 1125106-XX-X or 1125107-XX-X, or search "Tesla Repeater Camera". Some sellers mention the broken tabs, some do not; they appear missing or broken in the listing photos.
 
those look like vias (a via is a plated thru hole that joins pcb layers).

so light goes thru a via? never heard of that before. and yes, that's a design defect, even though I'll have to say, it's an unusual one.

and it should be replaced under warranty. the cameras are not for people, they are essential for any kind of automation in the car!

defect. tesla screwed up. and they won't want to pay for it, either.

lazy bums.
 
Maybe the computer can filter out the noise by dropping frames, but that has some effect on scene accuracy and reaction time too.
I see no reason why it would need to drop frames. Again, the assumption is that the NN has problems with perception when the turn signal is on, which has in no way been shown to be true. As you note, there is a wide variety of experiences in phantom braking and FSD behavior, but there are far too many variables at play here to correlate that to issues with turn-signal bleed-thru (though I should note in passing that I have full bleed-thru and have almost never seen phantom braking in 3.5 years of NoA and 5 months of FSD beta).
 
"As you note, there is a wide variety of experiences in phantom braking and FSD behavior, but there are far too many variables at play here to correlate that to issues with turn-signal bleed-thru"


I'd like to know which hardware, software, configuration, geolocation, or car-model factors actually correlate with these differing driving experiences. I can't understand why I keep reading forum members saying "I NEVER have ________ (that problem)", and others saying "It happens every drive", for whatever we are talking about - e.g. phantom braking, auto-headlight strobing, FSD behaviour. Are the cars actually responding differently, or can we put this down entirely to some users being absolutely unaware their cars are doing what the other users are experiencing?

I would theorize there is something different about the cars. If the entire configuration were equalized - identical cars, running identical software, in identical geolocations and conditions - would there still be a difference? Until someone investigates this in a rigorous, scientific way, we just have theories. Or we can continue to watch forum members argue about why their perceptions differ, without considering that it might be the cars that differ.
 
There are many variables; it's difficult to replicate the same driving experience more than once: weather, traffic, time of day etc. all conspire against that (and all change the inputs to the car from the cameras). Then there are manufacturing tolerances, though I doubt they play a major role, as the car handles much of this via the camera calibration process (though tolerances in the mechanical systems such as steering will be wider). Then you also have the human element: expectations about what the car should do (or at least what they think it should do), and deciding what constitutes a phantom braking event (assuming, for example, that any braking they don't see a need for means the car was mistaken, rather than them).
 
the idea that you can throw any low-res garbage at a NN and it will 'sort it out' - that's pure rubbish.

low res data in ==> low res outputs.

adding junk almost never helps (dither is one case, but this isn't that case).

please stop apologizing for tesla. light leaking in WILL affect the algorithm. why do you think it won't? you think a NN just works by magic?
 
Does anyone have any actual hard evidence that the camera/NN is indeed "blinded"? Sure, the color contrast changes dramatically, and a few spots seem to wash out in the view shown on screen. But this is not what the car is seeing, it's what our human eyes are seeing after it's projected through an LCD panel. If people are just extrapolating from "I can't see stuff on the screen" to "the car can't see stuff", then I think that is a very dubious claim.

Stop for a moment and think about what the NN/cameras are already coping with. Driving in blazing sunlight, head on. Then suddenly driving in dark shadows or tunnels. Driving in dark conditions, with varying degrees of lighting, and different colored street lights (sodium, anyone?). I've carefully watched the car's ability to distinguish lane lines and cars, and don't see the NNs having much trouble coping with these far harsher changes in conditions. Cameras are not eyes, and while our eyes may beat out the camera in some areas, they are certainly weaker in others, such as the speed with which they can adjust to different ambient light conditions (seconds for eyes, fractions of a second for cameras).

In addition, FSD (and NoA) has been driving with this "blinded" view for several years already. People only got all worked up when Tesla added the side-view assist feature. My guess is Tesla added the camera fix not because FSD needed it, but because humans needed it. But of course I'm speculating, as is everyone else getting all excited about "fixing" a problem that has not even been shown (so far as I can see) to exist.
Totally agree that the raw RCCB feed from the cameras that the NN receives will be different from what we see on the dashcam or the blind-spot popup. Saved clips are definitely at a low bitrate, and the raw input won't have post-processing applied to it.

I've attached some images of my own repeater glare here: Glare 1s.jpg, Glare 2s.jpg
You can argue you can still see some elements behind the glare, but let's put it this way - there's no way this is helping. It's a complete mess of colours and contrast; the image is clearly being interfered with in a negative way, and its quality is degraded.

I will concede that, short of being able to see the raw camera imagery the NN receives, or literally running tests on the performance of a car with the issue vs one without it, it's difficult to draw hard conclusions. I think it's fair to say this is probably going to have a negative effect - whether minor or major - on the quality of the information available to the FSD computer, and therefore an impact on the decisions it makes. It's evident Tesla thought it was worth applying tape to the PCB of the repeater cameras to mitigate the issue. The new PCBs don't have the light leak at all, though it's impossible to know if that was intentional or simply a byproduct of the redesign / a new manufacturer.

At least with the front cameras you have an array of 3 cams equipped with different specs to deal with things like direct sunlight and to add some layer of redundancy.
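One way to put a number on "a complete mess of colours and contrast" - again just a sketch for illustration, not anything Tesla is known to do - is RMS contrast. Flood a test image with simulated additive glare and the pixel variation that contrast-based feature detection depends on collapses:

```python
import numpy as np

def rms_contrast(patch: np.ndarray) -> float:
    """RMS contrast: standard deviation of normalized pixel intensities."""
    return float((patch.astype(float) / 255.0).std())

rng = np.random.default_rng(0)
scene = rng.integers(0, 256, size=(100, 100)).astype(np.uint8)

# Simulate additive glare: intensities pile up near saturation, crushing
# the local variation that contrast-based feature detection relies on.
glared = np.clip(scene.astype(int) + 180, 0, 255).astype(np.uint8)

print(rms_contrast(scene) > rms_contrast(glared))  # True
```

Comparing this metric on matched glare/no-glare frames from a real repeater feed would be one way to quantify the degradation being debated in this thread.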
 
totally agree, and that's about what I see as well. And yes, I see no reason to NOT fix it going forward (as, apparently, Tesla have done). But as you note, you can just about make out the items in the glare area, and that's exactly what all that smart camera/NN stack is about .. extracting a decent signal from all the noise. Of course, you should make efforts to lower the noise (that's why FSD washes the windshield and uses high beams etc), but people should remember how sensitive and discriminating the cameras can be.

Let me digress into a real-world example (slight tangent). In my line of work we have apps that can reliably detect your heartbeat just from your smartphone camera looking at your face during a video call. How? Well, every time your heart beats, your blood pressure goes up briefly (a pressure bump). This causes the capillaries in your face to swell a TINY amount, making your face go very slightly redder (it's called a "micro blush"). And with some sophisticated filtering/algorithms, we can detect this change reliably, even across a video call and all its compression artifacts. In fact, we can do this from across the room, using wall-mounted cameras, for several people at once. (And we can detect respiration rate as well, but that's a different approach.)

Can humans do this? No. So don't underestimate the abilities of the cameras to extract data from poor signals, guys!

(Of course, it's a mistake to claim that because of this tangential work the car can avoid being blinded, but my point is that arguing from "what a human can see" isn't really valid.)
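For the curious, the micro-blush trick described above is known in the literature as remote photoplethysmography (rPPG). A bare-bones sketch of the idea - assuming you already have the per-frame mean green value of a face region, which is the hard part in practice - recovers the pulse as an FFT peak in the plausible heart-rate band:

```python
import numpy as np

def estimate_bpm(green_means: np.ndarray, fps: float) -> float:
    """Estimate pulse rate from the per-frame mean green value of a face
    region (the 'micro blush' signal): FFT, then pick the strongest
    frequency in the plausible 40-180 bpm band."""
    signal = green_means - green_means.mean()        # drop the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)  # 0.67-3.0 Hz
    peak = freqs[band][np.argmax(spectrum[band])]
    return float(peak * 60.0)

# Synthetic check: a 72 bpm 'blush' ripple buried in noise, 10 s at 30 fps.
fps, bpm = 30.0, 72.0
t = np.arange(0, 10, 1 / fps)
rng = np.random.default_rng(1)
trace = 0.2 * np.sin(2 * np.pi * (bpm / 60) * t) + rng.normal(0, 0.3, t.size)
print(round(estimate_bpm(trace, fps)))  # 72
```

The signal amplitude here is deliberately tiny relative to the noise, which is the poster's point: with the right filtering, cameras extract signals humans cannot see.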
 
The repeater glare seen by many people here has been revealed by the channel below to be a design flaw in the circuit board of the camera. The PCB has three holes which allow the indicator light to leak directly onto the image sensor, bypassing the front of the lens entirely (i.e., it's not the external indicator light being seen by the camera).

https://www.youtube.com/watch?v=_BUPsjguqdY

View attachment 761636

Furthermore, they show multiple repeater cameras they've found, and document that Tesla has already acknowledged the design flaw and at some point began covering these holes with tape, to be ultimately resolved with a new design altogether. This directly conflicts with Tesla's messaging when denying some owners' repair requests by arguing it's a feature of the camera. By definition, light leak is a problem, and in this scenario it's not caused by the camera itself but by the board it's attached to. It's simply not possible this glare was intentionally designed, particularly given the evidence of Tesla trying to correct it.

I believe they're quietly letting this issue fall by the wayside; however, that's not good enough. It will be insufficient for those who bought FSD, which we know will rely almost entirely on image feeds from the cameras. It's simple deduction that having these blinded by a design flaw at critical times - lane changes and turning at junctions - will heavily impede FSD performance and could be dangerous.

View attachment 761638

Tesla is arguing this is simply a characteristic of the camera, and that you can pay for a newer version that is "design enhanced". This is logically inconsistent given their DIY tape solution in production. Also, some owners have had their cameras replaced under warranty, likely before those service centre agents received the memo to downplay this issue; Tesla is arguing those instances were done under 'goodwill'. If anyone has received warranty replacements for the cameras specifically for this glare / light leak issue, do feel free to comment here.

View attachment 761639

Long story short, these defective cameras will need to be replaced FOC (free of charge) for FSD owners sooner or later, given FSD entitles us to some degree of hardware replacement if needed to make the feature work. Some owners have had to pay out of pocket to have these replaced for this issue, and will likely have to wrestle a refund out of Tesla at some point in the future. Very reminiscent of the eMMC issue.
Time and time again, Tesla has tried to BS their way through stuff like this.

Haven't they realized by now, that there are some really brilliant people who have well-tuned BS detectors buying these cars?

Elon: You might be the richest guy on the planet, and certainly had a vision a couple decades ago in which no one else believed, but you underestimate your customers.
 
the idea that you can throw any low-res garbage at a NN and it will 'sort it out' - that's pure rubbish.

low res data in ==> low res outputs.

adding junk almost never helps (dither is one case, but this isn't that case).

please stop apologizing for tesla. light leaking in WILL affect the algorithm. why do you think it won't? you think a NN just works by magic?
GIGO still applies, I take it.
 
Is this really all you people have to complain about? Tesla's doing a great job then.

There's no doubt in my mind that this is a flaw that needs fixing - as Tesla has done in later builds. It's still wild speculation to suggest that this will cause an issue with FSD - especially as spatial and temporal permanence is something they're working hard on in order to improve FSD. If they can see something when the blinker is off, temporal permanence will tell them that it's still there IF the video is blinded when the blinker is on. At worst, IMHO, it'll cause a small degradation in performance in niche situations - but as a software developer (though not involved in AI or image processing), I'll suggest that EVERY non-trivial software application has dozens, if not hundreds, of degradations in performance in niche situations. The question is whether the sum of all those degradations takes the system out of its required performance envelope, and none of us have any idea if that's true for FSD.

I'll let Tesla's engineers make that call. In general (other than FSD delivery schedules and the gawdawful MP3 player) they do a good job.
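The "temporal permanence" idea from the post above is easy to sketch: a tracker that coasts on its last velocity estimate through frames flagged as blinded, rather than dropping the object outright. This is a toy model, not FSD's tracker; the MAX_COAST limit and the constant-velocity assumption are made up for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Track:
    x: float          # lateral position of a tracked car (arbitrary units)
    vx: float         # last estimated velocity, per frame
    coasted: int = 0  # consecutive blinded frames survived so far

MAX_COAST = 5  # hypothetical limit before a stale track is dropped

def update(track: Track, measurement: Optional[float]) -> Optional[Track]:
    """One tracker step. measurement=None marks a frame blinded by
    blinker glare: the track coasts on its last velocity (temporal
    permanence) rather than vanishing immediately."""
    if measurement is None:
        if track.coasted + 1 > MAX_COAST:
            return None  # too many blind frames in a row; drop it
        return Track(track.x + track.vx, track.vx, track.coasted + 1)
    vx = measurement - track.x  # naive one-frame velocity re-estimate
    return Track(measurement, vx, coasted=0)

# A car drifting at 1 unit/frame survives two blinded frames, and the
# coasted prediction lines up with the next real measurement.
trk = Track(x=0.0, vx=1.0)
for m in [1.0, None, None, 3.0]:
    trk = update(trk, m)
print(trk.x, trk.coasted)  # 3.0 0
```

The trade-off debated in this thread is visible here too: while coasting, the prediction is only as good as the last velocity estimate, so a car that brakes or swerves during the blinded frames is mis-tracked until a clean frame arrives.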
 
Time and time again, Tesla has tried to BS their way through stuff like this.

Haven't they realized by now, that there are some really brilliant people who have well-tuned BS detectors buying these cars?

Elon: You might be the richest guy on the planet, and certainly had a vision a couple decades ago in which no one else believed, but you underestimate your customers.
See my post here .. I don't think anyone has demonstrated that this is actually a defect.
 
FYI, Tesla has specifically instructed their service centers to turn down any requests to replace these. I tried doing it last week and they told me upper management has said these are not a defect and will not be replaced under warranty.
 
Does anyone have any actual hard evidence that the camera/NN is indeed "blinded"? Sure, the color contrast changes dramatically, and a few spots seem to wash out in the view shown on screen. But this is not what the car is seeing, it's what our human eyes are seeing after it's projected through an LCD panel. If people are just extrapolating from "I can't see stuff on the screen" to "the car can't see stuff", then I think that is a very dubious claim.

Stop for a moment and think about what the NN/cameras are already coping with. Driving in blazing sunlight, head on. Then suddenly driving in dark shadows or tunnels. Driving in dark conditions, with varying degrees of lighting, and different colored street lights (sodium, anyone?). I've carefully watched the car's ability to distinguish lane lines and cars, and don't see the NNs having much trouble coping with these far harsher changes in conditions. Cameras are not eyes, and while our eyes may beat out the camera in some areas, they are certainly weaker in others, such as the speed with which they can adjust to different ambient light conditions (seconds for eyes, fractions of a second for cameras).

In addition, FSD (and NoA) has been driving with this "blinded" view for several years already. People only got all worked up when Tesla added the side-view assist feature. My guess is Tesla added the camera fix not because FSD needed it, but because humans needed it. But of course I'm speculating, as is everyone else getting all excited about "fixing" a problem that has not even been shown (so far as I can see) to exist.
You tell me - is this a problem?

blinded camera.jpg