Tesla Vision vs Parking Sensors

I am talking about version 12 doing parking assist; how complicated can this be? So are you saying my car will always use the ultrasonic sensors for park assist and never vision only? Please educate Elon, or yourself, on how vision only works. I was not talking about two versions of FSD.
Are you saying the full self driving module does not park the car?
FSD will park the car using vision. This doesn't necessarily have anything to do with whether the park assist function you as a human driver use is USS based or vision based.
 
Yes, I am very frustrated that Elon sold vaporware as a product. But my question was not about me parking; it was about why I was getting a "park assist unavailable" error when my ultrasonic sensors were disconnected after I removed my bumper. I was still able to park by myself, but my car was giving me an error. I guess I was the fool thinking it was Tesla's parking assist, which I paid for and which has never worked with or without the sensors. And you're saying the park assist is me parking. I was sure Summon was going to use the ultrasonic sensors until they went to pure vision, but please explain.
If it is using vision only for parking, I theorize that they just did not remove the error from the code. That is when the forum explained to me that I did not understand how the world works.
I do not. I still don’t. Everyone is involved in a circle flirt and not reading the original question.
Will my ultrasonic sensors be used by version 12 for automated parking assist? While we're at it, will vision only ever make my wipers work properly? On the single stack, on Dojo, thoughts and prayers.

In fact, using both ultrasonic sensors and vision at slow speeds for parking removes the argument that the computer does not have time to analyze both inputs. In my opinion, Elon is willing to sacrifice functionality, usability and safety to save a couple of dollars. If stockholders think this is smart, rock on. Those sensors sit at a pretty good height for spotting things at bumper level, like, say, a baby stroller. Vision only: guaranteed safer against hitting posts, columns, neighboring cars and babies than humans or ultrasonic and radar. Tesla's next Super Bowl commercial is written.
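To make that point concrete, here is a toy Python sketch (purely illustrative; the names and thresholds are invented and have nothing to do with Tesla's actual firmware) of a low-speed fusion rule that simply keeps whichever reading is more conservative:

```python
# Illustrative only: a toy fusion rule for low-speed parking, not Tesla's code.
# All function names and thresholds here are made up for the sake of the argument.

def fused_obstacle_distance(uss_m, vision_m):
    """Return the most conservative (smallest) distance estimate available.

    uss_m    -- distance from an ultrasonic sensor, in metres (None if unavailable)
    vision_m -- distance inferred from the cameras, in metres (None if unavailable)
    """
    readings = [d for d in (uss_m, vision_m) if d is not None]
    return min(readings) if readings else None

def parking_warning(distance_m, stop_threshold_m=0.3):
    if distance_m is None:
        return "park assist unavailable"   # no usable sensor input at all
    if distance_m <= stop_threshold_m:
        return "STOP"                      # e.g. a stroller at bumper height
    return f"{distance_m:.2f} m to obstacle"

# At parking speeds there is ample time to evaluate both inputs every cycle,
# so the redundancy costs almost nothing and the safer reading always wins.
print(parking_warning(fused_obstacle_distance(uss_m=0.25, vision_m=0.60)))  # STOP
print(parking_warning(fused_obstacle_distance(uss_m=None, vision_m=0.60)))  # 0.60 m to obstacle
```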

After five years of paying for a product and owning it, the product has not been delivered, the product has not even been defined, and Tesla has not even applied for regulatory approval for it. Tesla executives are still saying that unattended FSD will be out by the end of this year. It could be a stock pump, or it could be engineers just letting us have early access to the future. Past Tesla exaggerations are no indication of the future, and I am no psychic, but I am calling 100% BS on this prediction.
 
I think you are asking several questions which are in the same area but have different answers, and that's why you don't understand the answers people are giving.

If you have a car with USS then you are currently still on the original park assist software, which depends on the USS. If you disconnect them you will get an error; if you reconnect them you should have the same function you always had. Disconnecting them will not move you to the new park assist function.

v12 FSD will not use the USS for parking functions. Tesla haven't stated this, but I think it's a safe bet, as the USS do not provide enough data. Imagine parking your car with the windows and cameras covered over and relying only on the 'wavy lines' USS display; it would be impossible.

Cars running v12 FSD may or may not use the high fidelity park assist for the human parking assistance. There is no connection between the two; the car could be using pure vision for self parking and still give you the USS park assist function.
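To put the distinction in rough pseudo-code terms (just a sketch of my understanding, with invented names, not anything from Tesla's firmware): which park assist pipeline you get follows from the hardware the car was built with, not from how FSD itself parks.

```python
# A rough sketch of the point above; names are invented for illustration only.
from dataclasses import dataclass

@dataclass
class CarConfig:
    has_uss: bool          # car was built with ultrasonic sensors
    uss_responding: bool   # sensors are connected and reporting

def park_assist_mode(cfg):
    if cfg.has_uss:
        # USS-built cars stay on the original USS-based park assist.
        return "USS park assist" if cfg.uss_responding else "Park Assist Unavailable"
    # Camera-only cars get the vision-based (high fidelity) park assist.
    return "vision park assist"

# Disconnecting the sensors on a USS-built car does not move it onto the
# vision pipeline; it just makes the USS pipeline report an error.
print(park_assist_mode(CarConfig(has_uss=True, uss_responding=False)))   # Park Assist Unavailable
print(park_assist_mode(CarConfig(has_uss=True, uss_responding=True)))    # USS park assist
print(park_assist_mode(CarConfig(has_uss=False, uss_responding=False)))  # vision park assist
```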

The rest of it is just ranting at Tesla. I don't work at Tesla, and neither does anyone else here, so I can't help you with that.
 
You're saying that like it's a joke, but it literally does. They are building towards a car that does not yet exist and have gambled that the hardware already sold can be turned into it via software update.
Yes, but that's part of the joke.

Maybe cameras alone can do FSD, but right now they can't even match a basic radar system from a decade ago for adaptive cruise control.

Maybe one day they'll have a functioning auto wiper system, but for now they can't match a cheap IR system from a decade ago.

Maybe one day they'll have a usable camera system for parking, but for now it's in many ways less effective than a 2007 Infiniti's 360-degree camera.

Different is not always better.

Tesla Vision is already as good or better by any fair minded comparison in its first iteration
This statement is impugning the intent of everybody who disagrees with you as not being fair minded, even as you defend the grey undulating blobs against crisp real-time video.

Here's a challenge: parallel park a 2023 Volvo using nothing but its camera system. Now do the same with a 2023 Tesla. Whichever car gets closer to the curb without scraping a wheel wins.

Everyone here knows which car will win this competition.
 

Tesla Vision might be great in the future, but now it's not even good.
 
This statement is impugning the intent of everybody who disagrees with you
It's an opinion based on first hand experience using all of the discussed systems, which is supported by most who have that first hand experience as well as three pieces of video evidence in this thread.

That opinion is also supported by the reviewer in the video submitted by one of those arguing against Tesla Vision (who also has no experience using it).
 
This statement is impugning the intent of everybody who disagrees with you as not being fair minded, even as you defend the grey undulating blobs against crisp real-time video.

Here's a challenge: parallel park a 2023 Volvo using nothing but its camera system. Now do the same with a 2023 Tesla. Whichever car gets closer to the curb without scraping a wheel wins.

Everyone here knows which car will win this competition.

Tesla Vision might be great in the future, but now it's not even good.
That is a fair real world test
 
People forget very quickly how many times Tesla have proven all of the 'experts' wrong time and time again.
And here you are arguing with plenty of owners who had a pre-2022 car with USS enabled, where TACC worked relatively well, and here we are two years later and the development of Tesla Vision still hasn't restored TACC to the same level of capability as before. Owners don't forget that, and they get reminded of it every time TACC phantom brakes violently and they have to explain to their passengers that it was the car, not the driver, that scared them. You're not on Whirlpool arguing with EV haters; you're on here arguing with owners who parted with their dollars and have been let down by the Tesla experience, hence the scepticism.

Owners who have had Tesla make their cars worse with a software update don't really care how much confidence you have that Tesla will solve vision or prove us wrong. Restoring TACC to the level of capability you get from a 2015 VW with Adaptive Cruise Control would be a good start for restoring owner trust in Tesla. Not using the owner base as a dev and test environment would also be a nice change.
 
That is a brilliant summary
 
They should never have removed the sensors and radar; just use them to enhance the experience. Radar + USS + cameras is a superior experience. Cars sold in the 2020s and onwards should be more advanced and easier to drive than the cars that came before. That's part of the reason I keep harping on a front bumper camera - it's about convenience. I'd take a Vision-only car if it had this (heck, even the Cybertruck has one).
 
And here you are arguing with plenty of owners who had a pre-2022 car with USS enabled, where TACC worked relatively well, and here we are two years later and the development of Tesla Vision still hasn't restored TACC to the same level of capability as before. Owners don't forget that, and they get reminded of it every time TACC phantom brakes violently and they have to explain to their passengers that it was the car, not the driver, that scared them. You're not on Whirlpool arguing with EV haters; you're on here arguing with owners who parted with their dollars and have been let down by the Tesla experience, hence the scepticism.

Owners who have had Tesla make their cars worse with a software update don't really care how much confidence you have that Tesla will solve vision or prove us wrong. Restoring TACC to the level of capability you get from a 2015 VW with Adaptive Cruise Control would be a good start for restoring owner trust in Tesla. Not using the owner base as a dev and test environment would also be a nice change.

Completely off topic. We are talking about the current Tesla Vision parking assist vs USS, and we have a couple of owners who have only experienced one of those arguing with owners who have experienced both in the Tesla, as well as many other vehicle systems.

Also, TACC never used USS; it has always been camera/radar (for those vehicles that had radar fitted).

Ultrasonic sensors are only useful for close-in objects at low speeds; they have never been used for any form of cruise control.
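As a rough illustration of why (ballpark, generic figures, not Tesla specifications): each sensor's useful range limits which features it can realistically feed.

```python
# Rough, generic sensor roles; the ranges are ballpark figures, not Tesla specs.
SENSORS = {
    "ultrasonic": {"useful_range_m": (0.2, 5),  "feeds": ["park assist"]},
    "radar":      {"useful_range_m": (1, 150),  "feeds": ["TACC"]},
    "cameras":    {"useful_range_m": (1, 250),  "feeds": ["TACC", "park assist", "FSD"]},
}

def sensors_for(feature):
    """List which sensors could plausibly feed a given feature."""
    return [name for name, info in SENSORS.items() if feature in info["feeds"]]

print(sensors_for("TACC"))         # ['radar', 'cameras'] - USS never fed cruise control
print(sensors_for("park assist"))  # ['ultrasonic', 'cameras']
```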
 
As someone who drove a lot of TACC kilometres both with radar enabled and vision only, I think you're looking back with rose-tinted glasses. There was phantom braking in the radar days too - in fact I would suggest a lot more, since true phantom braking (where it isn't apparent what the car is reacting to) seems to be pretty rare with the vision stack. When radar was enabled you used to get braking when there was a combination of an overhanging obstruction (e.g. a tree or overpass well above car height) and dark shadows - vision-only seemed to get rid of that particular failure case, which happened to be the one I saw most often.
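For what it's worth, here is a toy illustration (entirely invented, not any real Autopilot logic) of how a naive radar-plus-vision rule could produce exactly that overpass-plus-shadow phantom brake:

```python
# Purely illustrative: a toy model of the failure mode described above, not
# any real Autopilot code. It shows how a naive fusion rule can brake for an
# overpass whose radar return happens to coincide with a dark shadow on the road.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float
    elevation_known: bool    # older automotive radar often cannot resolve height

@dataclass
class VisionPatch:
    looks_like_obstacle: bool
    confidence: float        # low in deep shadow

def naive_should_brake(radar, vision):
    # Stationary return ahead + anything the camera is unsure about => brake.
    radar_threat = radar.range_m < 80 and not radar.elevation_known
    vision_unsure = vision.looks_like_obstacle or vision.confidence < 0.5
    return radar_threat and vision_unsure

# Overpass 60 m ahead with a shadow underneath it: phantom brake.
print(naive_should_brake(RadarReturn(60, elevation_known=False),
                         VisionPatch(looks_like_obstacle=False, confidence=0.3)))  # True
```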
 
I agree that there was phantom braking with radar and vision, but it was a superior system, simply based on the fact that 1) it would keep working in heavy rain / low-visibility conditions, 2) it wouldn't slam on the brakes if a pedestrian stood close to the kerb, and 3) you could use it with a distance-to-car setting of 1 (and it would work beautifully in heavy Sydney traffic).

Tesla engineers went on the record to confirm that vision only wasn't quite right yet when it launched, but said they would get there eventually using data from the customer base. Two years later they still haven't delivered an equal system, and the justification for making owners part of the development program in pursuit of reduced complexity isn't good enough.
 
Completely off topic. We are talking about the current Tesla Vision parking assist vs USS, and we have a couple of owners who have only experienced one of those arguing with owners who have experienced both in the Tesla, as well as many other vehicle systems.

Also, TACC never used USS; it has always been camera/radar (for those vehicles that had radar fitted).

Ultrasonic sensors are only useful for close-in objects at low speeds; they have never been used for any form of cruise control.
FYI, I have used the latest Tesla Vision system in a Model 3. Tesla are keen to have me swap my Model S for a Highland, so they loaned me one for a day very recently. I'm guessing you have not tried a 2023/4 camera system though. A decent one.
 
I’m guessing you have not tried a 2023/4 camera system though. A decent one.
Several... all only four-camera systems though, so they all suffer the same well-documented limitations that are overcome by TV. Also, Tesla never had 360 vision, so what's your point? It's clearly nothing to do with the thread topic. The thread is about TV as a replacement for USS.
 
As someone who drove a lot of TACC kilometres both with radar enabled and vision only, I think you're looking back with rose-tinted glasses. There was phantom braking in the radar days too - in fact I would suggest a lot more, since true phantom braking (where it isn't apparent what the car is reacting to) seems to be pretty rare with the vision stack. When radar was enabled you used to get braking when there was a combination of an overhanging obstruction (e.g. a tree or overpass well above car height) and dark shadows - vision-only seemed to get rid of that particular failure case, which happened to be the one I saw most often.
I have a Merc EQE with multiple radars, multiple cameras, and USS: a 2023 model with 2023 software. I have a 2018 Tesla Model S with 2024 software, along with radar (unsure if it's still active), multiple cameras, and USS. Both offer TACC and lane keeping. The Merc has not had a single event of braking for something I was never going to hit, including branches and shadows. The Model S though (along with its twin, sold last year)... well, those features cannot be safely used around the city. Hence I do not see radar as the cause of the problem, but rather the way it is implemented.
 
I've had multiple phantom braking events on the freeway because the shadow of a vehicle (usually a dark one) crosses the lane divider while the car itself is still well within its own lane. It took a while to work out what was happening, but now I can recognise it every time...
 
I agree that there was phantom braking with radar and vision, but it was a superior system, simply based on the fact that 1) it would keep working in heavy rain / low-visibility conditions, 2) it wouldn't slam on the brakes if a pedestrian stood close to the kerb, and 3) you could use it with a distance-to-car setting of 1 (and it would work beautifully in heavy Sydney traffic).
Yes, the TACC with radar definitely worked better in very low visibility conditions. I've never used a follow distance shorter than 4 though, so that bit didn't bother me.
 