Welcome to Tesla Motors Club

[uk] UltraSonic Sensors removal/TV replacement performance

They should have perfected “vision only” parking before removing USS. But seeing as Vision AP still isn’t up to par with the old Radar AP, I don’t have much hope that they will perfect it.

Model 3 starts from £48k, and it will have no USS and for now there’s no replacement for it, not even a Beta of this vision parking assist. To me that doesn’t sound right.
 
I suspect that Tesla removed the USS because they had supply problems and wanted to keep production flowing to try to meet their numbers.
They did it in a hurry and hope that the all-seeing Tesla Vision will be able to fill the gap at some point in the future.
 
I suspect that Tesla removed the USS because they had supply problems and wanted to keep production flowing to try to meet their numbers.
They did it in a hurry and hope that the all-seeing Tesla Vision will be able to fill the gap at some point in the future.
I seriously doubt it; nearly every car sold in ‘the west’ has them. They are produced in significant volumes (tens of millions a year), and seemingly no other manufacturer has reported problems with them at any point since covid started.

The decision to remove USS was almost certainly a conscious one and would have been taken a long time ago.

You only have to look at the direction of Tesla’s autonomous systems and the wider software industry (e.g. smartphones) to conclude this. Physical hardware is being replaced by software all the time.

Removing the sensors requires retooling of the front and rear bumpers, which is not something you can do overnight. They will also have contracts with the sensor manufacturer, and once production stops, getting it going again is difficult (see covid).

The sensor manufacturer is also not going to be minded to produce them for an unspecified amount of time or quantity; how could they run their own business like that?

The real issue is that their software isn’t keeping pace with the decisions and deadlines set. It’s like we haven’t seen that before, oh wait…
 
I seriously doubt it; nearly every car sold in ‘the west’ has them. They are produced in significant volumes (tens of millions a year), and seemingly no other manufacturer has reported problems with them at any point since covid started.

There are quite a few manufacturers limiting ultrasonic detectors in one way or another, or just delaying delivery timeframes. Some are removing them from the front; some are creating a limited-spec car with them removed, but at a lower price point. Tesla are pretty much removing them across their whole range, and not openly changing any pricing. This fits with Tesla unifying their SKUs as much as possible - maybe if they can't fit them to all cars, they won't fit them to any, just to keep the product lineup unified. They then sell it as a product improvement, just not right now, because they first need to make that bit 'production ready'.

I can see some benefits to vision assistance, but I can also see some real-world examples where it simply will not work anywhere near as well as ultrasonics, if at all. imho, if they wanted a product improvement, they would continue with both, and bring the benefits of a vision system to an already well-established ultrasonic-based system. That would allow each system to complement the shortcomings of the other. But Tesla have said in the past with radar, if not in so many words, that they struggle with mixing the outputs of different types of sensor, and some would argue (swerving trucks etc. as they pass from one camera to another) that they also struggle with mixing the output from the same type of sensor too.
 
Tesla shares have dropped due to supply chain and shipping issues, which comes at a time when Elon needs all the money he can get for his Twitter buyout. If they can remove one outsourced component, then that's one less item to slow down the manufacturing and shipping of the money-making machine that Musk is. Software costs money to write, but once it is functional it costs nothing more, unlike USS, whose outsourced cost, lead time and potential delays all impact the bottom line of a vehicle.
It is all about the Benjamins unfortunately, not about making the product better. Once Tesla is run like a normal automotive company, their cars will get better over time, not worse, which seems to be the case now. Nearly every change that is happening is about removing third-party hardware from the vehicles and replacing it with less effective Tesla replacements. Just look at radar removal, USS removal, the proposed stalk removal; I am sure there are more too.
Such a shame really, as Tesla still have many excellent qualities that are slowly being eroded by the short-sighted ideas being implemented.

What is really worrying, and shouldn't be overlooked, is the fact that Tesla's biggest and best-producing factory is in China. China have not made it a secret that they intend to invade Taiwan and reclaim it as China-governed land. This will put Musk on a collision course with the world, as the USA has said it will militarily defend Taiwan against Chinese aggression, and Taiwan also produces the majority of high-end computer chips, so it is a lose-lose situation for Musk and for Tesla.

Imagine all the built Teslas rotting away because the world has imposed sanctions on China like it has on Russia, and the flow of chips needed to build Teslas worldwide suddenly drying up?

Anyway, enough doom-mongering, it will soon be Christmas, and Q4 for deliveries :)
 
They should have perfected “vision only” parking before removing USS. But seeing as Vision AP still isn’t up to par with the old Radar AP, I don’t have much hope that they will perfect it.

Model 3 starts from £48k, and it will have no USS and for now there’s no replacement for it, not even a Beta of this vision parking assist. To me that doesn’t sound right.
My guess is that Tesla made this decision some time ago, making the necessary design changes to the car and started to reduce inventory of the sensors (and stopped orders for more).

It seems the timing has not gone quite to plan, as the software release is not available just yet, but it looks like a gap of only a few weeks before vision parking is available, so not many people will be impacted, and in a few months this will be largely forgotten.
 
It seems the timing has not gone quite to plan, as the software release is not available just yet, but it looks like a gap of only a few weeks before vision parking is available, so not many people will be impacted, and in a few months this will be largely forgotten.

We are talking about 'Park Assist' here, not Auto Park. Two completely different things.

'Park Assist' is about the sensors being used for far more than parking, like reversing out of spaces, or manoeuvring in a tight space etc etc.
'Auto Park' is, urm, the ability of the car to park itself and, as promised by EM in the fading past, reverse park, which I guess means manoeuvring itself out of a parking space. Adding a bit of conjecture here: probably only in very specific scenarios, such as reverse parking into totally perpendicular lined parking spaces, and parallel parking in some form. We got very limited vision parking around a year ago, and even more limited ultrasonic parking before that. So the EM promises on parking are simply playing catch-up. And of course, you need EAP/FSD for that.

imho.

Don't get me wrong, I can see many benefits of a vision-based park assist, such as being able to see where you are in relation to an obstacle when your mirrors can't cover that blind spot, or the exact camera output needed to pick up an obstacle or parking line not being on screen. But I've had too many squeaky-bum moments reversing into the abyss to believe a vision-based approach works reliably enough to depend on it rather than your sixth sense. Or you can get some help from the ultrasonics, which will warn you that you are reversing into something (in this case a tractor multipoint hitch) in pitch dark on a pea-souper evening.

Vision approach needs far more than tweaking a few cameras. It needs some fundamental changes to the environmental protection of many of the cameras and far better lighting when reversing. Unfortunately, many modern vehicles are designed with compromised visibility that greatly increases the reliance on driver aids.

Long gone are the days when you knew to the inch where the corners of your car were, either by experience, or by hearing the exhaust throbbing off the car behind you. Bring back the exhaust pipe sensor I say...
 
Tesla Vision based on cameras alone cannot be reliable.

Last week I drove most of the way home at night with one or more cameras blocked or blinded; it wasn't even raining, and there was no mud or dirt on the car or cameras.

Today it was raining heavily, and when I started backing into a parking space I could hardly see out of the rear camera because of drops of water.

XPENG have flying cars and self-driving cars, and they employ USS, lidar, radar, and cameras.
 
We are talking about 'Park Assist' here, not Auto Park. Two completely different things.
I was talking about Park Assist, not Auto Park (actually called Park Seek).

IIRC there is new, improved code for Park Assist coming in the new stack which will not use USS. It would be odd for Park Assist and Park Seek to use totally different code; it is just the end use case that is different.
Vision approach needs far more than tweaking a few cameras. It needs some fundamental changes to the environmental protection of many of the cameras and far better lighting when reversing.

It has had more than a "tweaking" - Tesla have created occupancy networks which will fundamentally change how the car sees its surroundings.

This has not been released in any UK build yet. Optimus also uses these occupancy networks to walk around its environment, so its mapping should be ideal for garages etc. Check out the AI Day 2 presentation on this, and Ashok's video explaining it.
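For anyone wondering what "occupancy" means in rough terms: the car maintains a grid of the space around it and marks which cells contain something, regardless of what that something is. Here's a hand-rolled 2D sketch in Python (my own illustration; the grid size and detections are invented, and Tesla's real system is a learned 3D model built from camera video, not hand-coded geometry like this):

```python
import math

# Toy 2D occupancy grid: 20 x 20 cells, 0.5 m per cell, car at the centre.
SIZE, RES = 20, 0.5
grid = [[0] * SIZE for _ in range(SIZE)]

def mark_occupied(bearing_deg, range_m):
    """Convert a (bearing, range) detection into an occupied grid cell."""
    x = range_m * math.cos(math.radians(bearing_deg))
    y = range_m * math.sin(math.radians(bearing_deg))
    col = int(x / RES) + SIZE // 2
    row = int(y / RES) + SIZE // 2
    if 0 <= row < SIZE and 0 <= col < SIZE:
        grid[row][col] = 1

# Simulated detections: an obstacle 2 m away, seen across a spread of bearings.
for bearing in range(-30, 31, 5):
    mark_occupied(bearing, 2.0)

occupied = sum(cell for row in grid for cell in row)
print(f"{occupied} grid cells marked occupied")
```

The appeal of this representation is that planning code only has to ask "is this cell free?" rather than "what object is this?", which is why the same idea works for a car reversing into a garage or a robot walking around one.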
Last week I drove most of the way home at night with one or more cameras blocked or blinded, it wasn't even raining and there was no mud or dirt on the car or cameras.

That is because in the UK we are still using the old software stack, where each camera is isolated with no vector space or occupancy network. There are videos of FSD Beta in the USA doing incredibly well in heavy rain.

Tesla Vision based on cameras alone cannot be reliable.

The problem with lidar systems is that without supplemental vision they are useless, so in the end it all comes down to vision, and in this field Tesla seems to be light years ahead in both software and hardware.
 
The problem with lidar systems is that without supplemental vision they are useless, so in the end it all comes down to vision, and in this field Tesla seems to be light years ahead in both software and hardware.
Agree with the bulk of your statements, but vision will always be an AI interpretation of a 'flat' 2D image seen through the cameras.
Lidar can accurately determine whether there is an obstacle ahead and how far away it is. That's why they are used in 3D imaging.

When it comes to putting my life and my passengers' lives on the line, I'll always be more confident in a laser telling me there is 100% an object I'm about to collide with, rather than the smartest pixel-based interpretation that will assume there is or isn't with a certain confidence, which will always be <99.999%...
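The confidence point can be made concrete with some toy arithmetic (the numbers below are invented for illustration, not real vision or lidar specs). Even a very high per-frame confidence compounds over a manoeuvre, though it cuts both ways: the chance of at least one missed frame grows, while the chance of missing in every single frame shrinks rapidly.

```python
# Assumed per-frame probability that the vision stack detects the obstacle
# (made-up figure, purely for illustration).
p_detect = 0.999
frames = 30  # e.g. roughly 1 s of video at 30 fps while reversing

# Chance that at least one frame during the approach misses the obstacle:
p_any_miss = 1 - p_detect ** frames
print(f"P(at least one missed frame) = {p_any_miss:.3f}")

# Conversely, if one detection anywhere in the approach is enough to stop
# the car, the risk is that EVERY frame fails:
p_total_miss = (1 - p_detect) ** frames
print(f"P(no frame ever detects it)  = {p_total_miss:.2e}")
```

Whether vision is "good enough" therefore depends on which of these two failure modes matters for the feature in question, which is arguably the real argument for keeping a direct range measurement (ultrasonic or laser) alongside it.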
 
Interestingly, vision had a failure in one of Chris’s (Dirty Tesla) YouTube videos when it saw a reflection of a barrier on a very wet road surface.
The car stopped at the reflection. One might liken that to a trompe-l'œil created in pavement art. We would see an apparently real image; the cameras, I assume, would see the same. How it would be interpreted is another matter.
One might be concerned about the reflection of a gantry on a wet motorway surface, though angles of reflection may negate an issue.
 
Who gives a s**t about a "new stack" that might as well not exist as far as we're concerned? When are we likely to ever see it, when FSD beta is available here? (i.e. several years away).

It might as well not exist as far as it having any relevance to what UK drivers experience.

The "new stack" seems to be able to do just about everything short of curing cancer, meanwhile in the real world us plebs have to deal with Ye Olde Stack that we actually have.
 
The base case for new orders should be: sensors removed. This may result in your car not seeing low objects while you park etc. This may or may not be improved in the future.

Those assuming 'Tesla have removed sensors, so a software replacement must be coming' are indulging in crazy talk. Tesla removed sensors to lower cost. What happens after that isn't so important.
 