
Disabled driver assist features?

The Thames Valley winter sun and damp roads seem to cause my Model 3 to disable cruise control etc. very frequently when driving into the sun. How can Tesla hope to deliver full self-driving in the foreseeable future - or is this just my car?
 
FSD will use more cameras than TACC/AP will.

There are only 3 cameras with a useful forward view of where the car is going (as opposed to watching whether other cars etc. may come from side roads). As you know, they are grouped together with similar elevation views, so unless Tesla adds extra cameras - which they won't retrofit - no way will that make a difference, and anyway all forward-facing cameras will suffer the same issue unless there's some new technology allowing them to see against the sun, or Tesla adds some other form of sensor. Additional microwaves, radars and sonars would all interfere with other vehicles. So squint technology, long anti-glare tubes on a bigger multi-camera array? Or park up until the sun's gone down...
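To put rough numbers on the "driving into the sun" problem, here's a minimal Python sketch that estimates the angle between a camera's boresight and the sun. All of the values, and the 15° "glare cone" threshold, are illustrative assumptions, not anything from Tesla:

```python
import math

def angular_separation(cam_az, cam_el, sun_az, sun_el):
    """Great-circle angle (degrees) between a camera boresight and the sun.

    All angles in degrees. A real system would derive these from heading,
    GPS position, time of day and a solar ephemeris; here they are inputs.
    """
    a1, e1 = math.radians(cam_az), math.radians(cam_el)
    a2, e2 = math.radians(sun_az), math.radians(sun_el)
    cos_sep = (math.sin(e1) * math.sin(e2)
               + math.cos(e1) * math.cos(e2) * math.cos(a1 - a2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

GLARE_CONE_DEG = 15  # assumed threshold, not a Tesla figure

# Hypothetical scenario: heading south-west, low winter sun nearly dead ahead.
sep = angular_separation(cam_az=225, cam_el=0, sun_az=230, sun_el=8)
print(f"{sep:.1f} deg off boresight -> "
      f"{'glare risk' if sep < GLARE_CONE_DEG else 'clear'}")
```

Because the three forward cameras are clustered with near-identical boresights, they all fall inside (or outside) that cone together, so adding more cameras in the same spot wouldn't change the answer.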
 
There are only 3 cameras with a useful forward view of where the car is going... .

Do you know which cameras are actually being used for TACC? Do you know what image processing Tesla is performing on each of these cameras to improve dynamic range? I'd love to see what they are using. The only thing I have seen is below (from How do Tesla cameras self clean?), but that is before any image processing has been applied, other than any colour bias in the camera itself. Plus it does not identify which of the cameras TACC/AP is using vs EAP/FSD, although you can probably guess some of them.

[Attached image: raw views from each of the car's cameras]
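For illustration only, here's a minimal sketch of one common dynamic-range technique: CLAHE (contrast-limited adaptive histogram equalisation) on the luminance channel, using OpenCV. This is not Tesla's pipeline; the synthetic washed-out frame and the tuning values are assumptions:

```python
import cv2
import numpy as np

# The synthetic ramp below stands in for a glare-washed forward-camera frame.
ramp = np.tile(np.linspace(180, 255, 640, dtype=np.uint8), (480, 1))
frame = cv2.merge((ramp, ramp, ramp))  # fake washed-out BGR frame

# Equalise contrast on the luminance channel only, leaving colour untouched.
lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
l, a, b = cv2.split(lab)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))  # assumed tuning
l_eq = clahe.apply(l)
enhanced = cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)

print("luminance spread before:", int(l.min()), "-", int(l.max()))
print("luminance spread after: ", int(l_eq.min()), "-", int(l_eq.max()))
```

The point of processing like this is to pull detail back out of a frame that is mostly crushed into the top of the sensor's range, which is the kind of thing the question above is asking about.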


I don't, but Tesla seem confident that they have covered all bases so that FSD will be OK with what they are using. Whilst I wish there was currently a better ability to see through the crap (which they already manage with at least one of the cameras, as it is focused to see better in certain conditions), when the power of HW3 starts being utilised I am sure we will see dramatic improvements in this area. Already the car seems to see some things better than I can in marginal conditions, and I have pretty good eyesight.
 
Do you know which cameras are actually being used for TACC?... .

I don't, but Tesla seem confident that they have covered all bases... .

There are a few videos around of the Tesla cam interpretations, which are certainly technologically clever, but you can't beat physics. It's easy enough to bias cameras to widen the visible spectrum or push towards its ends - improved IR for better rain/night views and so forth - but that doesn't stop you being dazzled by a full-spectrum sun. You can cut the dazzle effect with a very narrow focus just off the direction of the sun - as I suggested above, something like long thin tubes - but then you need a way bigger array to cover all the angles. I'm sure Elon thinks he can do it, but I don't think he can do it with the current equipment. If he was always right, then we wouldn't need HW3 and they wouldn't be designing HW4.
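As a back-of-envelope check on why a tube array would have to be way bigger, here's the arithmetic with assumed numbers (the field of view and per-tube acceptance angle are illustrative, not specs):

```python
import math

# Assumed: a combined forward field of view of 120 x 60 degrees, tiled by
# "tubes" that each accept only a 5-degree cone.
fov_h_deg, fov_v_deg = 120.0, 60.0
tube_acceptance_deg = 5.0

tubes_needed = (math.ceil(fov_h_deg / tube_acceptance_deg)
                * math.ceil(fov_v_deg / tube_acceptance_deg))
print(f"{tubes_needed} tubes to tile {fov_h_deg:.0f} x {fov_v_deg:.0f} degrees")
# -> 288 tubes, versus the 3 clustered forward cameras fitted today.
```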

As for which cams are used for what - I'd guess they use all of them all the time, but just don't display the info or give the driver the benefit unless he's paid for it, 'cos that's the easy way to program. It's a bit like making a chip, then cutting the tracks to the maths co-processor and selling it cheaper. It may be good marketing, but it's lousy morality when you could just split the difference on cost.
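A toy sketch of what that kind of software gating could look like; every name and package below is invented for illustration and nothing reflects Tesla's actual code:

```python
# Every camera feeds the perception stack regardless of what the owner paid
# for; only the driver-facing features are gated in software.
CAMERAS = ["main_fwd", "narrow_fwd", "wide_fwd",
           "left_repeater", "right_repeater",
           "left_pillar", "right_pillar", "rear"]

PACKAGE_FEATURES = {
    "basic_ap": {"tacc", "lane_keep"},
    "eap":      {"tacc", "lane_keep", "auto_lane_change", "noa"},
}

def driver_features(package: str) -> tuple[list[str], set[str]]:
    """Perception always uses every camera; the package only gates outputs."""
    return CAMERAS, PACKAGE_FEATURES.get(package, set())

cams, feats = driver_features("basic_ap")
print(f"{len(cams)} cameras processed; features exposed: {sorted(feats)}")
```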
 
There are a few videos around of the Tesla cam interpretations, which are certainly technologically clever, but you can't beat physics... .

This video by Scott Kubo (How Does Tesla Autopilot Fare Driving Directly Into The Sun? Video) seems to suggest much better performance than, for example, I'm getting (at least he's getting some performance - my TACC wouldn't even engage in those conditions). Scott also shows what the cameras actually see, and the images look pretty decent to me despite the low sun and mucky surfaces.
I'm wondering why the difference. He's using NoA, I imagine, whereas I'm using TACC. Or maybe I'm doing something else wrong?
 
I've seen reports of NoA behaving better than standard AP - which backs up my camera-count view - but I've not seen this myself, other than NoA giving a 'Bad weather, some functions limited' warning a couple of times, which I assumed was to do with lane changes being restricted.

I've also heard that blind-spot detection issues and the like will be mitigated in a software update.