[uk] UltraSonic Sensors removal/TV replacement performance

Personally, I don't think that USS will be disabled for cars with it today, people are noticing that the new cars also have the newer cameras. I expect in a few months when the park-assist returns with improved capabilities there will be similar threads complaining that we can't have it or get retrofits of the required hardware.
I do admire your confidence in Elon's rhetoric that park assist will return in a few months - Elon's months have at least 180 days in them - so probably about right.
 
I do admire your confidence in Elon's rhetoric that park assist will return in a few months - Elon's months have at least 180 days in them - so probably about right.
It didn't come from Elon, the grownups at Tesla wrote it.

The same happened with removing radar in the US: they explained it would be a few months to reinstate the features that were initially missing, and they kept to their timescale.
 
The same happened with removing radar in the US: they explained it would be a few months to reinstate the features that were initially missing, and they kept to their timescale.

So have they got back to the original follow distance of 1, and reached parity with (or exceeded) the previous maximum Autopilot speed?

Rhetorical question. Doesn't sound like parity of feature set to me.
 
I lost AP functionality this weekend in the middle of the motorway due to a "Poor weather conditions" error message. It started to rain slightly...

Every single Tesla Vision believer should experience one of those before trying to defend this ridiculous implementation over radar/lidar...
I believe the latest updates automatically turn on the auto wipers whenever Autopilot is engaged.
 
It occurs to me that my DJI drone uses camera sensors for distance, and they are very accurate.

It also shows the distance from obstacles on the screen in feet and beeps when you get close.

That said, it is always flown in dry, clear conditions, and there are two cameras on the very front and two on the very back of the machine.

Really hoping this gets sorted SOON!
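For anyone wondering how a pair of cameras turns into an actual distance readout, the textbook stereo relation is depth = focal length × baseline / disparity. Below is a rough sketch of that, plus the beep-when-close behaviour; the numbers are made up for illustration and are not real DJI or Tesla parameters.

```python
# Rough sketch of stereo distance estimation, assuming a simple pinhole model.
# FOCAL_LENGTH_PX, BASELINE_M and WARN_DISTANCE_M are made-up illustrative
# values, not real DJI or Tesla parameters.

FOCAL_LENGTH_PX = 700.0   # focal length in pixels (assumed)
BASELINE_M = 0.05         # spacing between the two lenses in metres (assumed)
WARN_DISTANCE_M = 2.0     # "beep when closer than this" threshold (assumed)

def distance_from_disparity(disparity_px: float) -> float:
    """Depth Z = f * B / d for a feature matched in both camera views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

def proximity_readout(disparity_px: float) -> str:
    """Distance in feet, like a drone display, with a warning when close."""
    metres = distance_from_disparity(disparity_px)
    feet = metres * 3.281
    return f"{feet:.1f} ft" + (" *BEEP*" if metres < WARN_DISTANCE_M else "")

# A feature 20 px apart between the two views works out at about 1.75 m:
print(proximity_readout(20.0))   # -> "5.7 ft *BEEP*"
```

The key point is that the further away an obstacle is, the smaller the disparity, so accuracy falls off with distance and disappears entirely when the feature can't be matched in both views (rain, darkness, glare).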
 
the human will focus just on the area of immediate interest
I feel this ^^^^ keeps getting overlooked. I like a lot of what Karpathy says, and the idea that the true cost of a sensor is way beyond what is immediately evident is surely true, but unless every facet of human vision is replicated, the reality is that your sensor suite has to make up any shortfall, maybe using a non-vision or independent subsystem.
 
based on “yes but the wipers/headlights/FSD”

There is surely a big build-up of concern / frustration / anticipation over these areas. Combined with the general evolution of both Tesla's approach(es) and the product's performance, I don't think some speculation is groundless.

For me, the total cost of making changes to sensor suites (or pretty much anything else, it would seem), especially while having to support multiple stacks, would seem very unappealing to a manufacturer, just as the prospect of owning a car stuck in an obsolete configuration is unappealing to an owner, especially one who paid for features that seem to be getting further from being successfully delivered.
 
Well, I agree that Tesla need to reintroduce the ability for drivers to override the auto-wipers when they have false positives, like you can with the auto-dips, and this will not affect safety when using vision-only Autopilot. My current 2022 car doesn't have radar fitted, so I've perhaps had vision-only longer than most, and in over 3K miles of driving the wipers have irritated me with dry wiping twice - certainly not a ruinous experience for the whole idea of electric vehicles.

Personally, I don't think that USS will be disabled for cars with it today, people are noticing that the new cars also have the newer cameras. I expect in a few months when the park-assist returns with improved capabilities there will be similar threads complaining that we can't have it or get retrofits of the required hardware.
I really admire your belief in Tesla. You have a relatively new car, which probably explains it. Unfortunately, a few interactions with a SC and a couple of firmware updates may change that.
As for “spoiling the EV experience”, I was referring to the Tesla EV. As an EV, it is awesome (although the others are starting to catch up). Otherwise, for a $100k+ car, the experience is pretty mediocre. Just rent a car in a similar price range and you will see the difference (EV or ICE).
 
It seems far more likely that for development speed they omitted oncoming traffic in the visualization as it isn't essential and never really worked reliably on the pre-FSDb AutoPilot anyway.
Tesla Vision visualisation is *absolute crap* compared to the release before it. It's like they abandoned it. It doesn't always even see the car in front, rarely shows parked cars, and almost never shows people. It also stops working completely in the dark, even on brightly lit streets...

But bins... it can see bins three streets away. They should get the guy who wrote that code to do the rest.

Why would they cripple it that much? The only sane reason is that they needed the processing power for the extra work due to the lack of radar... it's fair enough to say it's non-essential (although when someone tried to plant themselves on my bonnet and Tesla Vision didn't even see them, let alone brake, IMO they took out too much).
 
I really admire your belief in Tesla. You have a relatively new car, which probably explains it. Unfortunately, a few interactions with a SC and a couple of firmware updates may change that.
As for “spoiling the EV experience”, I was referring to the Tesla EV. As an EV, it is awesome (although the others are starting to catch up). Otherwise, for a $100k+ car, the experience is pretty mediocre. Just rent a car in a similar price range and you will see the difference (EV or ICE).
This is my second Tesla, as you can see from the sig. I've always had pleasant dealings with the local service centres and rangers here in the UK.
 
Tesla Vision visualisation is *absolute crap* compared to the release before it. It's like they abandoned it. It doesn't always even see the car in front, rarely shows parked cars, and almost never shows people. It also stops working completely in the dark, even on brightly lit streets...

But bins... it can see bins three streets away. They should get the guy who wrote that code to do the rest.

Why would they cripple it that much? The only sane reason is that they needed the processing power for the extra work due to the lack of radar... it's fair enough to say it's non-essential (although when someone tried to plant themselves on my bonnet and Tesla Vision didn't even see them, let alone brake, IMO they took out too much).
The reality is: do we even need to see a visualisation? The windscreen and windows have served me well for over 50 years. It's a pretty distraction to impress passengers with, but frankly that's about it. I'd rather have a larger speedo and some easy-access controls, together with a larger satnav area.
Even in FSD Beta it's only providing confirmation of what the car sees as important. That's useful in testing, but pointless once it works properly. One day… To us in the U.K. it's a waste of space.
 
Tesla Vision visualisation is *absolute crap* compared to the release before it. It's like they abandoned it. It doesn't always even see the car in front, rarely shows parked cars, and almost never shows people. It also stops working completely in the dark, even on brightly lit streets...

But bins... it can see bins three streets away. They should get the guy who wrote that code to do the rest.

Why would they cripple it that much? The only sane reason is that they needed the processing power for the extra work due to the lack of radar... it's fair enough to say it's non-essential (although when someone tried to plant themselves on my bonnet and Tesla Vision didn't even see them, let alone brake, IMO they took out too much).
So explain the visualisation that FSD Beta has, which is also vision-only on the same hardware?

[Attached screenshot: FSD Beta visualisation]


Yes, I agree they have degraded the visualisation; my guess would be that it was simply to save development time when integrating the Vision code. So being CPU-constrained isn't the 'only sane reason'.
 
As hinted at in a previous post @Dilly, I find it hard to justify much, or any, use of processing resources on the visualisations. All the driver needs is a 'processor power availability' indication, so you know if you are maxing out the system and should anticipate some kind of intervention being needed by the 'driver'.

However, if Tesla put the visualisations there to instill confidence, then they should work. Not sure what other purpose they serve at this time.
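Purely as an illustration of that idea (nothing here is real Tesla firmware, and the frame budget and threshold are invented), a 'compute headroom' readout could be as simple as tracking recent per-frame processing time against a per-frame budget:

```python
# Hypothetical sketch of a "processor power availability" indicator.
# The budget and warning threshold are invented for illustration only.
from collections import deque

FRAME_BUDGET_MS = 36.0   # assumed per-frame compute budget (~27 fps)
WARN_HEADROOM = 0.10     # warn the driver below 10% spare capacity (assumed)

class ComputeHeadroom:
    def __init__(self, window: int = 50):
        self.samples = deque(maxlen=window)   # recent per-frame times, in ms

    def record(self, frame_time_ms: float) -> None:
        self.samples.append(frame_time_ms)

    def headroom(self) -> float:
        """Fraction of the frame budget left over, averaged over the window."""
        if not self.samples:
            return 1.0
        avg = sum(self.samples) / len(self.samples)
        return max(0.0, 1.0 - avg / FRAME_BUDGET_MS)

    def driver_should_anticipate_handover(self) -> bool:
        return self.headroom() < WARN_HEADROOM

meter = ComputeHeadroom()
for t in (28.0, 31.5, 34.9):
    meter.record(t)
print(f"{meter.headroom():.0%} headroom")   # e.g. "13% headroom"
```

Something like that would tell the driver far more about whether the system is close to its limits than a pretty render of nearby bins.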
 
The reality is: do we even need to see a visualisation? The windscreen and windows have served me well for over 50 years. It's a pretty distraction to impress passengers with, but frankly that's about it. I'd rather have a larger speedo and some easy-access controls, together with a larger satnav area.
Even in FSD Beta it's only providing confirmation of what the car sees as important. That's useful in testing, but pointless once it works properly. One day… To us in the U.K. it's a waste of space.
It's somewhat pointless, but it's a good indication of what data the system is seeing and acting upon, which was supposed to be the point originally. If it can't see something, it can't act on it, hence its inability to see anything at night is worrying enough that I no longer trust AP (I also don't trust that the safety systems will work at night, having witnessed them fail).

It's another case of something being rolled out way before it was ready... the end users being the beta testers.
 
On a positive note - my car farts, can do a light show, can play games on the screen and can show a film - all of which I don't use. Dear Elon, can you send your frippery development teams over to the wiper and auto-headlight teams to give them a hand - and (as @Tony Hoyle said) the guy who wrote the visualisation for bins, as he is streets ahead of the game.