Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

[UK] Ultrasonic sensor removal / Tesla Vision replacement performance

A typical view from my rear camera following a trip down the M40 in winter. Good luck with that Tesla Vision.

[Image attachment 899159: obscured rear-camera view]
Irrespective of the latest camera upgrade news (from 1 MP to 5 MP) - it doesn't mean squat if they haven't redesigned the housing or placement of the cameras. There seems to be no way of using vision to provide any form of parking assistance, given the front cameras are in the wrong place and the rear camera is at the whim of the elements.
 
It's been done to death, but ultrasonics work well because they don't need to discern things; it's a simple sound-reflection test. They are arguably the best way of measuring distance without a remote emitter/receiver, and they still perform well enough when dirty or covered in moisture.
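That "simple sound-reflection test" is just a time-of-flight calculation - a toy sketch (the speed of sound and the example echo time are assumed values, not from any real sensor):

```python
# Toy time-of-flight calculation for an ultrasonic parking sensor.
# The sensor emits a ping and times the echo; the obstacle distance is
# half the round trip at the speed of sound (~343 m/s in dry air at 20 C).

SPEED_OF_SOUND_M_S = 343.0  # assumed: dry air at roughly 20 C

def echo_to_distance_m(echo_time_s: float) -> float:
    """Convert a measured echo round-trip time into obstacle distance."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

# e.g. a 5 ms round trip puts the bumper about 0.86 m from the wall
distance = echo_to_distance_m(0.005)
print(f"{distance:.2f} m")  # -> 0.86 m
```

No image understanding anywhere in that - which is exactly why grime on the sensor face barely matters.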

Just like rain sensors, they are a purpose built "dumb" device that doesn't have to use AI to work out what to do.

Other than purely for aesthetics, I really don't see the gain in what Tesla are doing, and there's plenty of evidence to suggest these cameras will have a very tough job replicating this functionality, and may end up with a very narrow range of "ideal" operating conditions (not unlike our auto-wiper systems).
 
Regular occurrence with me as well. Can't get over the fact that we're expected to carry a spray bottle with us to fix this issue. The more I think about this whole USS palaver, the more infuriating it becomes. The whole "the cameras work just as well" notion is just nonsense - I was trying to back into a tight space the other day and the cameras were completely blocked. Genuinely hope Tesla address this for existing cars, and that it's not just a fix for future builds.
 
The rear pointing side cameras still have a clear view. So will the cameras in the windscreen. I suspect the way Tesla will implement the vision-based parking system is by using all of the cameras to continuously generate a 3D map of the area the car is driving through (when driving at a slow speed). It's a technique called Simultaneous Localisation And Mapping (SLAM) which is widely used in robotics (some robot lawnmowers and vacuums already use this). Hopefully, Tesla will display a 3D view of the information it's gathered on the screen and where it thinks the car is in relation to the surroundings and obstacles. This method doesn't have to rely on the boot-mounted camera (but could use data from it if it was clear), nor do they need a camera in the front bumper.
This approach will only work when there is enough light reflected back from the scene and if the items in the scene have enough texture and contrast to be seen... In many ways this should be superior to a USS system but there are bound to be edge cases where it falls over.
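The core idea of accumulating what the cameras have seen into a persistent map can be sketched very roughly. This is a toy illustration of the mapping half of SLAM only (not Tesla's actual implementation, and real SLAM also has to estimate the car's own pose from the images - here the pose is assumed known); the class, cell size, and scenario are all made up for illustration:

```python
import math

# Minimal 2D occupancy-grid sketch: obstacle points seen by the cameras
# are transformed into a world-fixed frame and remembered, so the map
# still knows about a wall even after every camera has lost sight of it.

CELL_SIZE_M = 0.1  # 10 cm grid resolution (assumed)

class OccupancyGrid:
    def __init__(self):
        self.occupied = set()  # set of (ix, iy) world-frame grid cells

    def add_observation(self, car_x, car_y, car_heading, rel_x, rel_y):
        """Rotate/translate a point seen relative to the car into the
        world frame and mark its grid cell as occupied."""
        c, s = math.cos(car_heading), math.sin(car_heading)
        wx = car_x + c * rel_x - s * rel_y
        wy = car_y + s * rel_x + c * rel_y
        self.occupied.add((int(wx // CELL_SIZE_M), int(wy // CELL_SIZE_M)))

    def is_blocked(self, wx, wy):
        return (int(wx // CELL_SIZE_M), int(wy // CELL_SIZE_M)) in self.occupied

grid = OccupancyGrid()
# A wall 2 m ahead is seen once; the car then creeps forward until no
# camera can see the wall any more - the map still remembers it:
grid.add_observation(0.0, 0.0, 0.0, 2.0, 0.0)
print(grid.is_blocked(2.0, 0.0))  # -> True
```

In practice the car's pose would come from wheel odometry and visual tracking, and each cell would hold a probability rather than a yes/no flag, but the principle - build the map while things are visible, keep it while they aren't - is the same.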
 
Oi! - Get out of here with this positive post :D

Certainly hope you're right btw; it'd be nice not to see these threads here and everywhere else all the time.
 
Be nice not to have Elon Musk over-promise and under-deliver yet again. Then maybe we wouldn't have to call out the crap experience of owning "the world's most advanced car" that doesn't let you reverse safely.
 

They could. But developing and supporting this sounds potentially more expensive than some beeping sensors and leaving your brain to work out the rest.
 
“Just because I tweet something does not mean people believe it” - EM

We can throw in theories, complicated terminology, etc., but the sad reality is that, currently, the $60k Tesla is worse than a $30k Honda at such a mundane task as parking. And it is unclear if or when there will be a fix.

Otherwise, a theoretical discussion around different potential approaches to mapping the real world is interesting. All the power to FSD and its beta testers but a car is a tool for the majority of customers.
 
Not really. They already have a map of the (immediate) environment, part of which is already displayed on our screens. It doesn't need any additional hardware or inputs that aren't already in the car.

Don’t know, I’m not a software developer but I can see how this approach may work.
 
And after four months, still nothing. Insert MrBeanWaitingPatiently.jpg
 

Feels like low-speed reconstruction of the surroundings would need a different approach from regular higher-speed driving, to create a higher-resolution 'mesh' of your surroundings.
 
Hence the orders for higher-res cameras and high-def radars. The current sensors are obviously unable to do the job. They will need to get some on the road to figure out how to make it work (if it does), so I can't see a rollout soon. And it implies a retrofit, which they won't pay for, preferring to make people take part in a class-action suit instead.
 
Nothing to do with "low-speed recreation". It already does that when displaying cars around you in slow-moving traffic or when waiting at a traffic light.

The system can recognise traffic lights, other cars, cones, etc. What it now needs to recognise and display is walls and other obstacles.

Since the radar got switched off, it thinks the wall on our drive is a lorry. It just needs to be retrained to recognise things like walls…