This is a HUGELY underappreciated issue!
It's pretty clear to anybody with a Tesla in the UK, especially in the last few months, that the current suite of sensors is absolutely inadequate to provide FSD, even given an infinite amount of compute. It will probably be absolutely fine for California, I have no doubt. But so far this year, every single day I've driven more than 10 miles in my new Model Y, the car ends up with a horribly grimy reverse camera view, and it's near-constant that I get warnings that Autopilot cannot function because a side pillar camera or 'multiple cameras' have their view blocked or blinded.
There is no way this can be fixed by software. They need some kind of self-cleaning camera, and TBH it's pretty ridiculous that Tesla doesn't realize this. They should relocate the whole Autopilot team to a state that gets constant bad weather, because right now they are massively over-fitting for California sunshine.
The same blind spot is probably why they think ultrasonic sensors (USS) can be replaced by vision only, or that USS aren't required to park. Not all weather is Californian. Not all roads are Californian width. I'd love to see Elon try to park a Model X with no USS in a UK car parking space.
There are very few advantages other auto firms have over Tesla, but awareness of real-world driving conditions is sadly one of them. I genuinely think this is a risk that investors underappreciate.