Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

[uk] Ultrasonic sensor removal / Tesla Vision replacement performance

Hmmm, really? There's real-world winter testing in Car magazine this month. It would be a second EV for us, but even as a first EV the MG4 is an interesting proposition at the price.

Some very nice cars are being developed in China now. I was struck by the fact that they have no shame in ripping off other people's ideas and designs.
 
Realistically, until they ship the HD radar it's unlikely any of us will be able to answer that. But if it's coming, it means they've tested it and seen good results that outweigh the bad ones with the old radar.

It's not impossible to stitch different sensors together - it worked with the USS, for example. It's when they disagree that's the problem, and the increased accuracy might fix that. That said, for all we know the HD radar could be an upgrade for in-cabin driver monitoring, given that's also been indicated in the past and is becoming part of the latest European safety standards.

I see people pondering over 'sensor disagreement' as if Tesla are struggling with something completely new and unexplored. Fact is, it's nothing new at all. Sure, an autonomous car for public roads is a new(ish) application, but complex autonomous systems with multiple sensors have been around for decades. So no, it sure as hell isn't impossible to stitch different sensors together - it's been done! :) Aircraft autopilot and auto-landing systems are a close enough example. There we have not only multiple types of sensors, but also double or triple (redundant) copies of the same sensors - so even more potential disagreement to deal with than just combining cameras with radar. But that's what everyone is doing... except Tesla!

People repeat 'so what happens when camera says x, radar says y, lidar says z?' as if it's some profound 'gotcha' that lays bare the impossibility of the task, and Elon's genius in avoiding the problem. It implies that the more sensors you have, the more disagreement you can have - completely different data from every sensor. But that's not how it works at all! It's more like the opposite: you typically get most sensors in agreement (they are all sensing the same environment, after all) and one or two outliers. In fact, the more sensors you have, the easier it is to identify the outliers and faulty sensors. Have enough redundancy to be highly confident in identifying the outliers and you even get self-troubleshooting. Then you decide what to do about it - voting, averaging, taking the most conservative action, etc. With only one sensor, when that's wrong you take the wrong action, blissfully unaware.
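For what it's worth, the voting idea fits in a few lines of Python. This is just a toy sketch of median-vote outlier rejection - the sensor names, readings and tolerance are made up for illustration, not anything Tesla actually does:

```python
# Toy sketch: with redundant sensors, the outlier is easy to spot.
def fuse(readings, tolerance):
    """Median-vote fusion: flag sensors that disagree with the majority,
    then average the ones that agree."""
    values = sorted(readings.values())
    n = len(values)
    median = values[n // 2] if n % 2 else (values[n // 2 - 1] + values[n // 2]) / 2
    # A sensor is an outlier if it sits too far from the consensus (median).
    outliers = [name for name, v in readings.items() if abs(v - median) > tolerance]
    agreed = [v for name, v in readings.items() if name not in outliers]
    return sum(agreed) / len(agreed), outliers

# Three sensors measure distance to the car ahead (metres); radar is faulty.
distance, bad = fuse({"camera": 24.8, "radar": 3.1, "lidar": 25.2}, tolerance=2.0)
# bad identifies "radar"; distance averages the two sensors that agree.
```

Note that with only two sensors you can detect a disagreement but can't say which one is lying; three or more lets you vote, which is the whole point of redundancy.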

And ughhh - all this 'yeah but humans are Vision only!' is just asinine! Come back when the cameras are moving (or fixed in all locations needed), stereoscopic, self-cleaning, super high res, high dynamic range and, this is the big one, when the neural network attached to them is even 1% as sophisticated and capable as human brain V1.0! Or is that V1-billion or so, depending on how we number the updates :). Yes, the only known successful human-level driving system is a human, as somebody put it. Likewise the only known human-level aircraft piloting system is a human, but I don't see any aerospace companies developing a camera-only autopilot (and the multi-sensor ones are closely approaching human-level). It's not always best to copy nature's solutions - see the lack of cars employing legs, or aircraft with flapping wings!

I fully expect other manufacturers to get there first, before Tesla Vision FSD.
 
I fully expect other manufacturers to get there first, before Tesla Vision FSD.
I think you really have to define where 'there' is for this to be a valid challenge.

Rightly or wrongly, Tesla have set out to build a Level 5-capable system that can drive anywhere in the world, relying entirely on in-vehicle analysis to do it. Is anyone else trying to do that? Or are other manufacturers aiming for a solution that will be finished sooner but is ultimately less capable?

Doesn't matter how good an autonomous taxi in San Francisco or even London is if I live in a rural area of the UK.
 
And ughhh - all this 'yeah but humans are Vision only!' is just asinine! [...]
In fact, the claim that humans use vision only when driving is incorrect - it's illegal to drive with a headset covering both ears.
 
Come back when the cameras are moving (or fixed in all locations needed), stereoscopic, self cleaning, super high res, high dynamic range and, this is the big one, the neural network attached to them is even 1% as sophisticated and capable as human brain V1.0! [...]
I'm not 100% sure what you're on about here, but you're wrong - see below. No one is saying sensor stitching is ground breaking - this is all old news, been done to death on this forum and others. They didn't hire Karpathy to implement a basic sensor suite.
Rightly or wrongly, Tesla have set out attempting to build a level 5 capable system that can drive anywhere in the world relying entirely on in-vehicle analysis to do it. [...]
Bang on. This is the gamble that Tesla have taken, and this is why we're all driving cars that need constant updating ("beta users"). This is why Tesla's approach to software, TACC, Autosteer and other driving aids is exciting. Caveat: would I buy / have I bought FSD? No. Do I feel bad for people who have, but who didn't fully understand what they were getting into or feel it was mis-sold? Yes. The two are not mutually exclusive.
 
I'm not 100% sure what you're on about here, but you're wrong - see below. [...]

Bang on. This is the gamble that Tesla have taken [...]
Why does my car need constant updating? I have not purchased what Tesla calls FSD, nor have I any interest in it. So why should my car suffer regressions and software inadequacies that diminish my driving experience, solely for the benefit of the people who DO want to use FSD in the US?

The answer, I suspect, lies in the company's attitude to "my" car. As someone sagely said on this forum, Tesla's attitude is that this is their car, on loan to me, and they can do what the hell they like. Even if it means that functionality is reduced over time - tough. It's not yours, so shut up about it.

A bit like John Deere did with "their" tractors.
 
  • Like
Reactions: Boza
Unfortunately this is the direction so many things are going. This is the first car I've had that 'required' (I know it's not really compulsory) a life-support payment to retain full functionality, in the form of Premium Connectivity.

As to why Tesla require you to update: it reduces the support overhead to have everyone running on as few versions of the software as possible, plus they don't have to maintain a tested upgrade process from every version to every other version. There are probably also lifetime-related tunings or even safety-relevant improvements in some releases, and they want you to be running (what they consider to be) the best version.

This requirement is pretty common in the software world, it just seems odd because we haven't traditionally considered cars to be a software platform.
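As a back-of-the-envelope illustration of that support-matrix point (version numbers invented): if any installed version might upgrade directly to any newer one, the number of upgrade paths to test grows quadratically with the release count, whereas mandatory sequential updates keep it linear.

```python
# Hypothetical release list - ten versions in one year, numbers made up.
versions = [f"2022.{n}" for n in range(1, 11)]

# If any version can linger in the fleet, every older -> newer pair
# is a distinct upgrade path that needs testing.
all_pairs = [(a, b) for i, a in enumerate(versions) for b in versions[i + 1:]]

# If updates are mandatory, only consecutive hops ever occur.
sequential = list(zip(versions, versions[1:]))

print(len(all_pairs))   # 45 paths to test
print(len(sequential))  # 9 paths to test
```

So pushing everyone onto the latest release cuts the test matrix from n·(n-1)/2 paths down to n-1, which is presumably part of why the nagging is so persistent.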
 
That’s true. That’s likely why Tesla plans on using the microphone in the car to detect sirens.
Did this ever get released or is it still stuck in the Elon-time vortex? That's pretty cool.

This fills me with fear though: "the start of Tesla adding a “Hey, Tesla”-like wake command to activate the voice assistant". The Tesla voice control is trash. Driving down the road having a conversation in a voice-activated Tesla is going to look like you're driving a clown car as all the settings are randomly flipped.
 
Is there any update - rumour, leak or otherwise - on the TV updates to recover USS functions for Q4 buyers? What about the "Single stack" release?
You are post #1,655 on one of at least 3 threads on this topic. Do you think we'd all be sat here discussing this if there was no news?

The answer is no, there is no news, and yeah, we're all wasting our lives on these threads 😭.
 
Did this ever get released or is it still stuck in the Elon-time vortex? That's pretty cool. [...]
Haven't seen it appear in the release notes yet, but it was also only mentioned in the context of FSD Beta, so it's probably not coming to the UK even if they have flipped a switch in the USA.