I somehow missed this before I posted; I don't fancy drilling holes in things.

We'll see eventually how big an issue this is, I suppose. I'm just speaking from my experience with motion tracking when I say this issue raises major red flags for me. Maybe there's a regulatory loophole that allows them to reduce the blinker rate for additional data, maybe the issue gets superseded altogether by camera upgrades, maybe they figure something out on the software side. I just don't want to be taken advantage of, given I paid for FSD, which has delivered zero value for me so far as a UK resident. I can only really hope they deliver what they promise, and I get defensive when it looks as if they're trying to get away with delivering less.
If you watch the video at the top of my OP, someone from these forums found a way to fix it, but it requires you to drill the repeater housing, which will definitely void your warranty if Tesla ever ends up seeing it. Maybe they'll be noting any logs of anyone disconnecting and reconnecting repeater cams.
> Does anyone have any actual hard evidence that the camera/NN is indeed "blinded"? Sure, the color contrast changes dramatically, and a few spots seem to wash out in the view shown on screen. But this is not what the car is seeing; it's what our human eyes are seeing after it's projected through an LCD panel. If people are just extrapolating from "I can't see stuff on the screen" to "the car can't see stuff", then I think that is a very dubious claim.

Going back to the blinded camera: that's affecting step 2 in the above process - analyse and identify points of high contrast. If you're blinding a camera with the indicator light 50% of the time during a lane change or junction turn, you're corrupting 50% of the data your decisions are based on. Everything in the visual area of the glare is having its perceived colours, contrast and brightness dramatically shifted. As far as the computer is concerned, everything in that area is now completely different. You can either develop some kind of advanced algorithm to try to account for the glare in every scenario (which isn't going to work well, because depending on what's behind the glare the area can differ in colour or contrast, AND it will cost additional processing power), or completely disregard the frames with the glare and extrapolate what's happening using object permanence. Either way, you're spending a ton of development time compensating for a manufacturing defect, and both options are going to affect the decisions FSD makes in strange ways.
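To make the second option above concrete (discard glare-contaminated frames and coast on the last clean one, a crude stand-in for "object permanence"), here's a minimal sketch. The saturation threshold, frame values, and the idea of flagging frames by the fraction of near-saturated pixels are all illustrative assumptions of mine, not anything known about Tesla's actual pipeline.

```python
def saturated_fraction(frame):
    """Fraction of pixels near full brightness (values normalised to 0..1)."""
    flat = [p for row in frame for p in row]
    return sum(p > 0.9 for p in flat) / len(flat)

def is_glare_frame(frame, threshold=0.35):
    """Heuristic: treat a frame as glare-contaminated if too many pixels
    are blown out. The 0.35 threshold is an arbitrary assumption."""
    return saturated_fraction(frame) > threshold

def filter_glare(frames):
    """Drop glare frames, carrying the last clean frame forward instead."""
    last_clean = None
    out = []
    for f in frames:
        if not is_glare_frame(f):
            last_clean = f
        if last_clean is not None:
            out.append(last_clean)
    return out

# Toy demo: alternate clean frames with blown-out "blinker-on" frames.
clean = [[0.2, 0.4], [0.1, 0.3]]
glare = [[0.95, 0.97], [0.96, 0.95]]
frames = [clean, glare, clean, glare]
filtered = filter_glare(frames)
print(f"{len(filtered)} frames kept; glare frames replaced by last clean frame")
```

The obvious cost, as the post says, is that the car is then driving on stale data for every replaced frame, which is exactly the reaction-time concern raised later in the thread.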
Some people on this forum have said the repeater problem affects both their cameras, only one camera, or neither, so various cars out there have different combinations of repeater cameras. I've also noticed many people saying they have different experiences with FSD/AP - some have more disengagements, some have fewer - so maybe the cameras do have an influence on the driving. Maybe the computer can filter out the noise by dropping frames, but that has some effect on scene accuracy and reaction time too.

> Does anyone have any actual hard evidence that the camera/NN is indeed "blinded"? Sure, the color contrast changes dramatically, and a few spots seem to wash out in the view shown on screen. But this is not what the car is seeing; it's what our human eyes are seeing after it's projected through an LCD panel. If people are just extrapolating from "I can't see stuff on the screen" to "the car can't see stuff", then I think that is a very dubious claim.
> Stop for a moment and think about what the NN/cameras are already coping with. Driving in blazing sunlight, head on. Then suddenly driving in dark shadows or tunnels. Driving in dark conditions, with varying degrees of lighting, and different colored street lights (sodium, anyone?). I've carefully watched the car's ability to distinguish lane lines and cars, and don't see the NNs having much trouble coping with these far harsher changes in conditions. Cameras are not eyes, and while our eyes may beat out the camera in some areas, they are certainly weaker in others, such as the speed with which they can adjust to different ambient light conditions (seconds for eyes, fractions of a second for cameras).

> In addition, FSD (and NoA) has been driving with this "blinded" view for several years already. People only got all worked up when Tesla added the side-view assist feature. My guess is Tesla added the camera fix not because FSD needed it, but because humans needed it. But of course I'm speculating, as is everyone else getting all excited about "fixing" a problem that has not even been shown (so far as I can see) to exist.
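For a rough sense of what the "dropping frames" idea discussed above would cost, here's a back-of-envelope calculation. The frame rate, blink period, and duty cycle below are illustrative assumptions on my part, not measured Tesla values.

```python
# Back-of-envelope: how long would the car go without a clean side-camera
# frame if every blinker-contaminated frame were simply discarded?
# All three figures below are assumptions for illustration only.
fps = 36                 # assumed camera frame rate
blink_period_s = 1.4     # assumed indicator blink cycle length
duty_cycle = 0.5         # assumed fraction of each cycle the LED is lit

frames_per_cycle = fps * blink_period_s            # frames in one blink cycle
dropped_per_cycle = frames_per_cycle * duty_cycle  # frames discarded per cycle
gap_s = blink_period_s * duty_cycle                # longest stretch with no clean frame

print(f"Frames dropped per blink cycle: {dropped_per_cycle:.0f}")
print(f"Worst-case gap between clean frames: {gap_s:.2f} s")
```

Under these assumptions the car would see no clean frame from that camera for around 0.7 s at a time, repeating every blink cycle, which is a long window at motorway speeds.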
> Maybe the computer can filter out the noise by dropping frames, but that has some effect on scene accuracy and reaction time too.

I see no reason why it would need to drop frames. Again, the assumption is that the NN has problems with perception when the turn signal is on, which has in no way been shown to be true. As you note, there is a wide variety of experiences in phantom braking and FSD behavior, but there are far too many variables at play here to correlate that to issues with turn-signal bleed-thru (though I should note in passing that I have full bleed-thru and have almost never seen phantom braking in 3.5 years of NoA and 5 months of FSD beta).
"As you note, there is a wide variety of experiences in phantom braking and FSD behavior, but there are far too many variables at play here to correlate that to issues with turn-signal bleed-thru"I see no reason why it would need to drop frames. Again, the assumption is that the NN has problems with perception when the turn signal is on, which has in no way been shown to be true. As you note, there is a wide variety of experiences in phantom braking and FSD behavior, but there are far too many variables at play here to correlate that to issues with turn-signal bleed-thru (though I should note in passing that I have full bleed-thru and have almost never seen phantom braking in 3.5 year of NoA and 5 months of FSD beta).
> I would theorize there is something different about the cars. If the entirety of the configuration was equalized over identical cars, running identical software, in identical geolocations and conditions, would there still be a difference? Until someone investigates this in a rigorous scientific way, I guess we just have theories.

There are many variables. It's difficult to replicate the same driving experience more than once: weather, traffic, time of day etc. all conspire against that (and all change the inputs the car gets from the cameras). Then there are manufacturing tolerances, though I doubt they play a major role, as the car handles much of this via the camera calibration process (though tolerances in mechanical systems such as steering will be wider). Then you also have the human element: expectations about what the car should do (or at least what they think it should do), and deciding what constitutes a phantom-braking event (assuming, for example, that any braking they don't see a need for means the car was mistaken, rather than them). So rather than guessing that it might be the car that differs, we'll likely just continue to see forum members arguing about why their perceptions differ.
Totally agree that the raw RCCB feed the NN receives from the cameras will be different from what we see on the dashcam or the blind-spot popup. Saved clips are definitely at a low bitrate, and the raw input won't have post-processing applied to it.

> Does anyone have any actual hard evidence that the camera/NN is indeed "blinded"? Sure, the color contrast changes dramatically, and a few spots seem to wash out in the view shown on screen. But this is not what the car is seeing; it's what our human eyes are seeing after it's projected through an LCD panel. If people are just extrapolating from "I can't see stuff on the screen" to "the car can't see stuff", then I think that is a very dubious claim.
> You can argue you can still see some elements behind the glare, but let's put it this way - there's no way this is helping. It's a complete mess of colours and contrast; the image is definitely being messed with in a negative way, and it's degrading the quality of the image.

Totally agree, and that's about what I see as well. And yes, I see no reason NOT to fix it going forward (as, apparently, Tesla has done). But as you note, you can just about make out the items in the glare area, and that's exactly what all that smart camera/NN stack is about: extracting a decent signal from all the noise. Of course, you should make efforts to lower the noise (that's why FSD washes the windshield, uses high beams, etc.), but people should remember how sensitive and discriminating the cameras can be.
Time and time again, Tesla has tried to BS their way through stuff like this. The repeater glare seen by many people here has been revealed by the channel below to be a design flaw in the camera's circuit board. The PCB has three holes that allow the indicator light to leak directly onto the image sensor, bypassing the front of the lens entirely (i.e., it's not the external indicator light being seen by the camera).
https://www.youtube.com/watch?v=_BUPsjguqdY
View attachment 761636
Furthermore, they show multiple repeater cameras they've found, documenting that Tesla has already acknowledged the design flaw and at some point began covering these holes with tape, to be ultimately resolved with a new design altogether. This directly conflicts with Tesla's messaging when denying some owners' repair requests by arguing it's a feature of the camera. By definition, light leak is a problem, and in this scenario it's caused not by the camera itself but by the board it's attached to. It's simply not possible this glare was intentionally designed, particularly given the evidence of Tesla trying to correct it.
I believe they're quietly letting this issue fall by the wayside; however, that's not good enough. It will be insufficient for those who bought FSD, which we know will rely almost entirely on image feeds from the cameras. It follows directly that having these cameras blinded by a design flaw at critical moments - lane changes and turning at junctions - will heavily impede FSD performance and could be dangerous.
View attachment 761638
Tesla is arguing this is simply a characteristic of the camera, and that you can pay to have a newer version that is "design enhanced". This is logically inconsistent given their DIY tape fix in production. Also, some owners have had their cameras replaced under warranty, likely before those service centre agents received the memo to downplay this issue; Tesla argues those instances were done under 'goodwill'. If anyone has received warranty replacements for the cameras specifically for this glare / light-leak issue, it would be helpful to hear about it, so feel free to comment here.
View attachment 761639
Long story short, these defective cameras will need to be replaced free of charge for FSD owners sooner or later, given FSD entitles us to some degree of hardware replacement to make the feature work if needed. Some owners have had to pay out of pocket to have them replaced for this issue and will likely have to wrestle a refund out of Tesla at some point in the future. Very reminiscent of the eMMC issue.
> GIGO still applies, I take it.

The idea that you can throw any low-res garbage at a NN and it will 'sort it out' - that's pure rubbish. Low-res data in ==> low-res outputs. Adding junk almost never helps (dither is one case, but this isn't that case).

Please stop apologizing for Tesla. Light leaking in WILL affect the algorithm. Why do you think it won't? Do you think NNs just work by magic?
> Time and time again, Tesla has tried to BS their way through stuff like this.

See my post here... I don't think anyone has demonstrated that this is actually a defect.
Haven't they realized by now that there are some really brilliant people with well-tuned BS detectors buying these cars?
Elon: you might be the richest guy on the planet, and you certainly had a vision a couple of decades ago that no one else believed in, but you underestimate your customers.
> You tell me - is this a problem?

Does anyone have any actual hard evidence that the camera/NN is indeed "blinded"? Sure, the color contrast changes dramatically, and a few spots seem to wash out in the view shown on screen. But this is not what the car is seeing; it's what our human eyes are seeing after it's projected through an LCD panel. If people are just extrapolating from "I can't see stuff on the screen" to "the car can't see stuff", then I think that is a very dubious claim.
Stop for a moment and think about what the NN/cameras are already coping with. Driving in blazing sunlight, head on. Then suddenly driving in dark shadows or tunnels. Driving in dark conditions, with varying degrees of lighting, and different colored street lights (sodium, anyone?). I've carefully watched the car's ability to distinguish lane lines and cars, and don't see the NNs having much trouble coping with these far harsher changes in conditions. Cameras are not eyes, and while our eyes may beat out the camera in some areas, they are certainly weaker in others, such as the speed with which they can adjust to different ambient light conditions (seconds for eyes, fractions of a second for cameras).
In addition, FSD (and NoA) has been driving with this "blinded" view for several years already. People only got all worked up when Tesla added the side-view assist feature. My guess is Tesla added the camera fix not because FSD needed it, but because humans needed it. But of course I'm speculating, as is everyone else getting all excited about "fixing" a problem that has not even been shown (so far as I can see) to exist.
> Tesla just refused to fix my cameras under warranty (they'd be happy to take $300 from me to do it). They said it's not a defect. Does anyone know how to complain to Tesla?

Dstarnik: do you have FSD?