
[UK] Ultrasonic sensor removal / Tesla Vision replacement performance

Yes. And I'm fine with a disagree. But honestly couldn't see what the disagree was relating to.
Sorry, I had half-typed a reply, but it's long-winded and I wanted to cite sources, and I've been running around all week. TL;DR: it's a terminology issue; some bits of Tesla Vision are active in the UK/rest of world.

It wasn't necessarily you I was disagreeing with, but what the Tesla tech said, and herein lies the problem: a whole bunch of people (myself included, probably) are using interchangeable terms that mean different things.
Lately, "Tesla Vision" is sometimes used to mean the vision-based radar and USS replacement, which is roughly accurate for rest-of-world vehicles. However, Tesla Vision is more than that, and that appears to have been the context of the conversation with the service tech. Tesla themselves have contradicted part of that statement about it not being on UK vehicles, but crucially with some caveats.

Tesla have begun to enable some parts of Tesla Vision. However, UK and rest-of-world vehicles still don't have most of the required elements of the Tesla Vision stack deployed (the overall solution, not just the USS or radar replacement parts), because it also includes a dramatic upgrade to the neural nets so the car can drive better and use more cameras than are currently active on Autopilot.

For example, the UK Autopilot stack we have is not really Tesla Vision but is still based on iterations of the old one from 2017/18. A specific example: of the three cameras on the windscreen, only one had been in use for (most) Autopilot functions like highway lane keeping. To be clear, I'm not saying the other cameras aren't used at all; they now are, in a limited way. But Tesla haven't yet merged most of the Tesla Vision stack, which uses them constantly, for European vehicles. Instead we have a weird Frankenstein mix of the two, where the other front cameras are now doing some things like measuring distance instead of radar, and those elements are part of the larger Tesla Vision solution. Crucially, though, UK/rest-of-world vehicles still don't appear to be using Tesla Vision for actual Autopilot driving tasks, like changing lanes and not hitting things, which I suspect is where the tech was coming from.

So my 'disagree' was based on the tech saying no cars are running Tesla Vision in the UK, because some bits are. But they're also sort of correct, in that unless you're running the FSD Beta code branch in the USA, the whole Tesla Vision stack isn't running here. That appears to be because the stack isn't yet accurately tuned to EU/UK signs and roads, so they can't just merge it all without it being a potential safety risk.

Of note, the USS replacement software is already running in shadow mode on UK/rest-of-world vehicles (corroborated by @greentheonly on Twitter). I haven't worked out exactly which software version this happened in, only that it was merged before the holiday update, but it appears the solution is built and "working"; they're still testing it in vehicles before making it generally available.

Moving into speculation territory, because I can't test further now that I've traded my car with USS for one without: it appears that they're pulling data from both the USS and the cameras of the current fleet, and I'd be willing to bet that they're currently tweaking the neural networks until the USS and camera data match up as closely as possible before flipping the switch in production.
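The shadow-mode rollout described above can be sketched in a few lines. This is purely illustrative (the function name, units, and tolerance are my own invention, not Tesla's code): the vision estimate runs alongside the real USS reading, but only the USS value is acted on, and disagreements are merely flagged so the neural nets can be tuned until the two sources agree.

```python
# Hypothetical sketch of a "shadow mode" check: the vision-based distance
# estimate is computed alongside the live USS reading, but only the USS
# value is used; mismatches are flagged for offline analysis.

def shadow_compare(uss_cm, vision_cm, tolerance_cm=15):
    """Return the value actually used (the USS reading) and a mismatch flag."""
    mismatch = abs(uss_cm - vision_cm) > tolerance_cm
    return uss_cm, mismatch

# USS says 80 cm, vision estimates 120 cm: the car still trusts the USS,
# but the disagreement gets logged.
used, flagged = shadow_compare(80, 120)
```

Once the flagged-mismatch rate across the fleet drops low enough, "flipping the switch" is just swapping which of the two values the UI and parking logic consume.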

So to summarise: Tesla Vision is sorta, kinda here in some areas, but also definitely not here in others. The USS replacement is definitely already running on the cars, though, even if we can't see it in the main UI; we're just waiting for them to flip the switch on what is currently implemented as a dark feature.
No worries, he is new here, still in the rose-tinted glasses of his honeymoon period with the car, and therefore the fanboiest of them all.

he might change in time )
3 years and 2nd car and the tint hasn’t worn off yet. 🙃
 
That sounds like a company with no idea what they are doing, who had very persuasive salespeople from lidar, radar and USS suppliers pay them a visit. What do they intend to do when they have three different types of sensor all disagreeing?
We think phantom braking is a pain; wait until the cameras say go, the radar says slow down, and the USS says STOP, multiple times a second.

I know Tesla Vision isn't 'there' yet, but IMHO it is the most sensible route. The only sensor suite we KNOW enables human-level driving is human sensors: ears and eyes.

Tesla have the balls to go all-in on the system they think will provide full autonomy. It sounds like Volvo want to add whatever sensors their customers think sound advanced.
So what happens if, as widely reported, Tesla introduce a new HD radar? Presumably it’s doomed to failure because of sensor disagreement.
 
That sounds like a company with no idea what they are doing, who had very persuasive salespeople from lidar, radar and USS suppliers pay them a visit. What do they intend to do when they have three different types of sensor all disagreeing?
We think phantom braking is a pain; wait until the cameras say go, the radar says slow down, and the USS says STOP, multiple times a second.

I know Tesla Vision isn't 'there' yet, but IMHO it is the most sensible route. The only sensor suite we KNOW enables human-level driving is human sensors: ears and eyes.

Tesla have the balls to go all-in on the system they think will provide full autonomy. It sounds like Volvo want to add whatever sensors their customers think sound advanced.

I don't understand this. Driver aids and autonomy aren't mutually exclusive, and different sensors can provide different functions. And I think Volvo do know what they are doing in the area of safety. The company that has forgotten what they are doing is the one shipping £60k+ cars without features advertised at the time, and who have given no indication of timelines as to when (or if) the features will be enabled. It's no good telling my missus she can't have parking sensors but that one day the car might be able to fully drive itself (although it very likely won't, and if I wanted it I would have to pay another £7k anyhow).
 
So what happens if, as widely reported, Tesla introduce a new HD radar? Presumably it’s doomed to failure because of sensor disagreement.
The issue of disagreement with the old radar was that it was way lower resolution than the cameras and isn't as stable as an image when firing repeatedly. It's important to clarify that radar doesn't send back pixels but instead a few 'points' with distances and their speeds.

The old radar could only return about 40 points per "image" it sent back to the system, which compared to a camera is tiny, so what it might interpret as an obstacle the camera doesn't.

Example: what a low-res radar sees vs. what a point cloud generated from the camera sees.
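To make the resolution gap concrete, here is a rough sketch (the numbers and names are illustrative, not Tesla's actual formats): an old automotive radar returns a few dozen (range, azimuth, radial speed) tuples per scan, whereas a camera-derived "pseudo-lidar" point cloud can contain tens of thousands of points per frame.

```python
import math

def radar_to_points(returns):
    """Convert polar radar returns (range m, azimuth rad, radial speed m/s)
    to Cartesian (x, y, radial_speed) points."""
    return [(r * math.cos(az), r * math.sin(az), v) for r, az, v in returns]

# A scan from a legacy radar: a handful of sparse detections (~40 at best).
radar_scan = [(25.0, 0.05, -3.2), (40.0, -0.10, 0.0)]
points = radar_to_points(radar_scan)

# By contrast, a dense vision-derived point cloud per frame (illustrative):
camera_cloud_size = 30_000
```

With so few radar points, a single spurious return (a manhole cover, an overhead gantry) is a large fraction of the scan, which is exactly the kind of camera/radar disagreement being described.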
 
Aerospace is a lot easier and primarily radar only.
Radar, GPS, inertial sensors, barometric sensors, airspeed sensors, magnetic sensors… And somehow they deal with disagreements between ground-proximity radar and barometric sensors.

Removing sensors because of "sensor disagreement" is a ridiculous argument - as if you'd remove your head because you have a headache. There is a very well-developed body of knowledge around signal processing. However, Tesla chose to ignore it - similarly to how they ignored the body of knowledge around UX. At some point, ignoring existing knowledge is not innovation but stupidity.
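For what it's worth, the textbook signal-processing answer to "disagreeing sensors" is not to delete one but to weight each by its confidence. A minimal inverse-variance fusion sketch (illustrative values, not any real autopilot code):

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of (value, variance) pairs.

    A noisier sensor (larger variance) simply counts for less; it is
    never thrown away outright. Returns the fused value and its variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total

# Camera estimates 10.0 m (variance 0.25), radar 10.6 m (variance 1.0):
# the fused estimate lands nearer the more confident camera reading.
est, var = fuse([(10.0, 0.25), (10.6, 1.0)])
```

This is the static special case of a Kalman filter; the fused variance is always smaller than either input's, which is precisely why adding a second sensor class helps rather than hurts, provided its noise is modelled.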
 
I don't understand this. Driver aids and autonomy aren't mutually exclusive, and different sensors can provide different functions. And I think Volvo do know what they are doing in the area of safety. The company that has forgotten what they are doing is the one shipping £60k+ cars without features advertised at the time, and who have given no indication of timelines as to when (or if) the features will be enabled. It's no good telling my missus she can't have parking sensors but that one day the car might be able to fully drive itself (although it very likely won't, and if I wanted it I would have to pay another £7k anyhow).
I watched a YouTube video over Christmas showing the development of robotics by Boston Dynamics. Incredible to see the last two decades or so, and the impressive capability they have managed to develop. Contrasting with the lame presentation of the Elonbot, it seems as though they have tried dozens of combinations of cameras, lidar, radar, ultrasonics, "touch"-based sensors - the whole gamut - to reach their current level of sophistication.

Whereas we get from Tesla some pretty videos pretending to show you what the robot "sees", some pseudoscience about occupancy networks, and lots of hype to try to reflate the share price.

There's a reason we don't yet have full self driving despite the world's biggest and brightest companies working on it - it's hard. Hard to get the necessary suite of sensors to work holistically; hard to predict factors which by their nature are highly unpredictable and spontaneous. What is clear from the companies that have reached level 3 and are approaching level 4 is that you need all types of sensors - each with their own strengths and weaknesses - to make it usable.

Whereas Tesla have bet the farm on one type of sensor, and as a result seem to be approaching level 2, but it is likely to be a blind alley if they want to reach level 3 or 4.

Time for a new CEO and a board willing to accept that they got the current strategy wrong.
 
I appreciate you taking the time to elaborate. I have read your comments and I think you make some very valid points such as those relating to terminology and context.

Tesla themselves have contradicted some of that statement about it not being on UK vehicles, but crucially with some caveats.

I've read that article a few times, and IMO it is full of self-contradiction and doubletalk. Basically it's a complete waste for non-US cars, as the caveat is pretty much 'non-US, anything goes'!

we have a weird Frankenstein mix of the two where the other front cameras are now doing some things like measuring distance instead of radar, and those elements are part of the larger Tesla Vision solution, but crucially UK/rest of world vehicles still don’t appear to be using Tesla Vision for actual Autopilot driving tasks

Your assessment sounds quite possible to me, but it's so frustrating that we really have no idea. I've just spent two days driving through France and, although I'm still on older software, I have previously installed many updates that included all sorts of claimed improvements. The performance now still includes so many of the poor/inconsistent behaviours that I saw in previous versions that I struggle to believe there is any great improvement to be had with any version. One of the old traits, aborting lane changes mid-manoeuvre when a truck or van is in the rear nearside quarter, is as evident now as ever. Likewise vehicle visualisations dancing around randomly as a vehicle moves from one camera to another.

"Frankenstein mix" is exactly what it feels like I'm driving.

the fact that stack isn’t yet accurately tuned to EU/UK signs and roads, so they can’t just merge it all yet without it being a potential safety risk.

I don't believe our EU/UK cars have been fine-tuned at all. Maybe minor tweaks, but where exactly in the 'Frankenstack', who knows?

it appears that they‘re pulling data from both USS and cameras of the current fleet

I used to see some sizable uploads from my car, especially after FSD drives with many manual interventions, but nothing for a long time now. I don't think Tesla have the resources or interest to deal with patching up a stack that will never be part of a finished product.

USS replacement is definitely already running on the cars though even if we can’t see it in the main UI and we’re just waiting for them to flip the switch on what is currently implemented as a dark feature.

If the Occupancy Network really can use vision-only data to produce all the same data as a successful multi-technology sensor suite, then I can see that would have some benefits. But we know it can't (because the cameras have more/different limitations compared with other technologies), and Tesla probably knows that too, hence the upcoming HD radar.

Saying that UK cars are running xyz code is imo meaningless unless they are actually using the code to control something live. Running code in shadow mode ready to flip a switch is not the same as 'running the code' which would manifest itself in the way the car performs.

The Tesla tech saying UK cars are not running Tesla Vision is pretty meaningless really.
 
The issue of disagreement with the old radar was that it was way lower resolution than the cameras and isn't as stable as an image when firing repeatedly. It's important to clarify that radar doesn't send back pixels but instead a few 'points' with distances and their speeds.

The old radar could only return about 40 points per "image" it sent back to the system, which compared to a camera is tiny, so what it might interpret as an obstacle the camera doesn't.

Example: what a low-res radar sees vs. what a point cloud generated from the camera sees.
I don't see how that answers the question of "sensor disagreement". Cameras and radar of whatever ilk still provide completely different data that may potentially conflict. Hence some of the completely unproven assertions on this thread that only vision will work.
 
I don't see how that answers the question of "sensor disagreement". Cameras and radar of whatever ilk still provide completely different data that may potentially conflict. Hence some of the completely unproven assertions on this thread that only vision will work.
Realistically, until they ship the HD radar it's unlikely any of us will be able to answer that, but if it's coming then it means they've tested it and seen good results that outweigh the bad ones with the old radar.

It's not impossible to stitch different sensors together - it worked with the USS, for example. It's when they disagree that's the problem, and the increased accuracy might fix that. That said, for all we know the HD radar could be an upgrade for in-cabin driver sensing, given that's also been indicated in the past and is becoming part of the latest Euro safety standards.
 
I don't see how that answers the question of "sensor disagreement". Cameras and radar of whatever ilk still provide completely different data that may potentially conflict. Hence some of the completely unproven assertions on this thread that only vision will work.
Please note that our own sensors do not always agree. But I have not seen anyone removing their ears because they disagree with what they see - we deal with it. Also, when in unknown situations, we rely on _all_ sensors to form a picture, even if they contradict each other.

The goal is to form a picture in the widest possible set of circumstances, including when a class of sensors is "blind". Hence, the more different classes one has, the better. The software should be able to filter out information based on the context - the same way we do.
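Context-based filtering of the kind described can be sketched as follows. Everything here is hypothetical (the `BLIND_IN` table, the sensor names, the numbers): sensor classes known to be degraded in the current conditions are masked out, and the remainder are fused by confidence.

```python
def contextual_fuse(readings, context):
    """Inverse-variance fusion that masks out sensor classes known to be
    unreliable in the current context.

    readings: dict of sensor name -> (value, variance)
    context:  set of conditions, e.g. {"fog"}; BLIND_IN maps each condition
              to the sensor classes it degrades (a hypothetical table).
    """
    BLIND_IN = {"fog": {"camera"}, "metal_bridge": {"radar"}}
    blind = set().union(*(BLIND_IN.get(c, set()) for c in context))
    usable = {name: rv for name, rv in readings.items() if name not in blind}
    weights = {name: 1.0 / var for name, (_, var) in usable.items()}
    total = sum(weights.values())
    return sum(w * usable[name][0] for name, w in weights.items()) / total

# In fog the (wildly wrong) camera estimate is ignored; radar and USS
# carry the picture instead.
d = contextual_fuse(
    {"camera": (5.0, 0.2), "radar": (9.0, 1.0), "uss": (9.4, 0.5)},
    {"fog"},
)
```

The point of the sketch is that "the sensors disagree" is an input to the fusion logic, not a reason to remove hardware.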
 
Please note that our own sensors do not always agree. But I have not seen anyone removing their ears because they disagree with what they see - we deal with it. Also, when in unknown situations, we rely on _all_ sensors to form a picture, even if they contradict each other.

The goal is to form a picture in the widest possible set of circumstances, including when a class of sensors are “blind”. Hence, the more different classes one has, the better. The software should be able to deal with filtering out information based on the context - the same way we do.
Actually when "our" sensors disagree we often get sick and need a lie down! If you've ever experienced severe sea sickness there are times that you might think about "removing some of those disagreeing sensors"!!
 
The issue of disagreement with the old radar was that it was way lower resolution than the cameras and isn't as stable as an image when firing repeatedly. It's important to clarify that radar doesn't send back pixels but instead a few 'points' with distances and their speeds.

The old radar could only return about 40 points per "image" it sent back to the system, which compared to a camera is tiny, so what it might interpret as an obstacle the camera doesn't.
It's funny that you're a strong advocate of agile elsewhere, and of minimum viable products, yet here you're effectively saying they went with a rubbish radar without realising it at the time. Surely you can see a mistake was made. And if your counter-argument is that they can do without the radar altogether, then that still points to a mistake: why fit one in the first place?

On second thoughts, please don't bother answering.
 
It's funny that you're a strong advocate of agile elsewhere, and of minimum viable products, yet here you're effectively saying they went with a rubbish radar without realising it at the time. Surely you can see a mistake was made. And if your counter-argument is that they can do without the radar altogether, then that still points to a mistake: why fit one in the first place?

On second thoughts, please don't bother answering.
Sadly, I had to place them on ignore. Not good for my blood pressure, contending with zealotry.
 
It's funny that you're a strong advocate of agile elsewhere, and of minimum viable products, yet here you're effectively saying they went with a rubbish radar without realising it at the time. Surely you can see a mistake was made. And if your counter-argument is that they can do without the radar altogether, then that still points to a mistake: why fit one in the first place?

On second thoughts, please don't bother answering.
That's a ridiculous argument. The radar worked well for basic Autopilot. At some point the software outgrew it and it stopped being useful. It really isn't hard. If you're gonna quote me, I'll reply all I like, thanks.
 
and Tesla probably knows that too, hence the upcoming HD radar.
Has there actually been any confirmation of what this radar is for? Everyone's making the assumption that it's a drop-in replacement for the old radar that was removed. Last I heard, it's possible it's not even intended for use in Tesla's cars?
 
And 60% of the range / efficiency. Go fly with MG. Good luck.
Hmmm, really? Real-world winter testing in Car this month. It would be a second EV for us, but even as a first EV, the MG4 is an interesting proposition for the price.

 