
[uk] Ultrasonic sensor removal / Tesla Vision replacement performance

The current AP CPU struggles to process the raw feed from the existing cameras already.
This sounds like something made up. We don't have any access to measure the load on the CPU, and the point is to move load from the CPU to the neural net engine. I suggest people are doom-speculating.
It is shocking that they still think of Tesla as a startup. That may be the core of the issue.
Andrej doesn't work for Tesla anymore, so he isn't speaking for them. Yes, he was head of Autopilot, so he clearly knows far more about this than any of us are likely to, but he can also speak more freely as he's no longer an employee. I've worked for lots of large organisations, and nearly all of them would like the areas of their business at the front of innovation to work 'like a start-up': take risks, fail fast, etc. It's how things develop with pace. It's also specifically why I bought a Tesla in the first place.
 
It's well known that Tesla are struggling with processing power - there was supposed to be a backup computer for AP, and they had to repurpose it because the first one was becoming overloaded.

The neural net doesn't reduce CPU load; it's part of the CPU load. It's not some magic device that doesn't need processing power.

Even with the transition to vision AP we've lost the processing of oncoming traffic... they're having to sacrifice things. New hardware is well overdue given the advances made in the last few years, and they'll be working on it, I'm sure.
 
The neural engine is not the same as the CPU. They are both on the same die, but they perform different functions; here's a diagram of the die:

[Image: die diagram of the Tesla FSD chip]


The CPU is over there on the right. As you can make out, there are 3 x 4-core Arm Cortex-A72 clusters. These are for general computing tasks, while the NPUs are what handle the neural nets. Yes, the CPU is going to have some part to play in that, as it does with the GPU, but it's fairly minimal.
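To put rough numbers on that, here's a quick back-of-envelope sketch using the figures Tesla presented at Autonomy Day 2019; the CPU ops-per-cycle is just my guess, so treat the exact ratio loosely:

```python
# Back-of-envelope throughput comparison for the FSD chip, using the
# approximate figures Tesla presented at Autonomy Day 2019.

NPU_MACS = 96 * 96        # 9,216 multiply-accumulate units per NPU
NPU_CLOCK_HZ = 2.0e9      # 2 GHz
NPUS_PER_CHIP = 2

# One MAC = 2 ops (a multiply and an add)
npu_ops = NPU_MACS * 2 * NPU_CLOCK_HZ * NPUS_PER_CHIP

CPU_CORES = 12            # 3 quad-core Cortex-A72 clusters
CPU_CLOCK_HZ = 2.2e9
CPU_OPS_PER_CYCLE = 8     # generous guess for NEON SIMD throughput per core

cpu_ops = CPU_CORES * CPU_CLOCK_HZ * CPU_OPS_PER_CYCLE

print(f"NPUs: {npu_ops / 1e12:.1f} TOPS")    # ~73.7 TOPS (Tesla quote ~72 per chip)
print(f"CPUs: {cpu_ops / 1e12:.2f} TOPS")    # ~0.21 TOPS
print(f"ratio: ~{npu_ops / cpu_ops:.0f}x")   # the NPUs dwarf the CPU cores
```

Even with a generous allowance for the CPU, the NPUs are hundreds of times faster at the maths the neural nets need, which is exactly why that work doesn't sit on the CPU.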

I assume the source of your claims is the chap on Twitter called Green. He puts out a lot of theories about how the Autopilot computer works, none of which I've seen remotely confirmed by Tesla, and I would suggest treating them with the scepticism they deserve. I call BS on it.
 
This sounds like something made up. We don't have any access to measure the load on the CPU, and the point is to move load from the CPU to the neural net engine. I suggest people are doom-speculating.

Andrej doesn't work for Tesla anymore, so he isn't speaking for them. Yes, he was head of Autopilot, so he clearly knows far more about this than any of us are likely to, but he can also speak more freely as he's no longer an employee. I've worked for lots of large organisations, and nearly all of them would like the areas of their business at the front of innovation to work 'like a start-up': take risks, fail fast, etc. It's how things develop with pace. It's also specifically why I bought a Tesla in the first place.
More mature organizations have the ability to “isolate” startup divisions from the rest. They understand that most of their customers are not early adopters like you.

For most of us the car is a tool. For example, I would gladly give up the promise of FSD (I'm not even sure what that means anymore) in return for working auto wipers, auto headlights, a better UI, and working adaptive cruise control. The current level of driving assistance that FSD provides - auto park, lane keeping on the highway - is enough for me (and probably most other customers, considering how many FSD packages they sell/rent). When/if they prove the other features (that is what FSDb is for), I'm happy to have a look.

The majority of customers are buying a Tesla because it is a great EV (the mature part of the business) and couldn't care less about the AI (the startup part of the business). Currently, Tesla intermingles those and ruins the EV experience for existing owners.
 
I'm sorry if that wasn't obvious in 2020 - Elon had already revealed the Cybertruck and was talking about the roadmap leading to autonomous taxis, so it seemed pretty clear to me what the brand was about. The auto-wipers, auto-dips and UI haven't significantly changed since your test drive, have they?

I do agree that the new requirement forcing wipers/headlights to auto isn't really working, due to the issues discussed in other threads, but it's a stretch to say that they are 'ruining the EV experience'. Removing radar seems to have had little to no impact on using Autopilot in a perfectly normal way for me. This thread, however, is about an announced change that, according to Tesla, isn't going to affect your car, and that people buying new Teslas now are being clearly informed about, so they can judge whether they want to be part of this process.
 
No one ever mentioned that existing functionality would be removed from the “legacy” fleet. Had I known that, I probably would not have bought the car.

You are correct - auto wipers and high beams haven't changed much. They did not work well when I bought the car; they do not work well now. However, the difference is that after the upgrade I cannot use the AP because of them. That is a major change from simply avoiding them. From that perspective, the latest “upgrade” reduces the utility value of the car for me because it is equivalent to removing the AP. Hence, “ruining the experience”.

As for the USS, the historical behavior of Tesla points to the USS being disabled for the existing fleet sometime in the near future, probably in less than a year. That is why the two topics were co-mingled. It's a behavior pattern.
 
How far ahead do you look when driving? Your brain is capable of picking up a driving-related anomaly and focussing on it in a way that essentially enhances detail. You would need a 200mm telephoto, or equivalent high resolution, to equal that... in stereo. Just think about the judgements you make on country roads when planning to overtake a tractor... assessing distances to curves, rates of acceleration, other road hazards such as blind field gates...
Who are you - Col. Steve Austin? :)
The human eye has a focal length of about 22mm, and the 35mm-equivalent focal length for our useful binocular field of view is about 43mm. Everything else is wetware.
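For anyone who wants to sanity-check those numbers, the standard angle-of-view formula is below; the 36mm width is the 35mm-film frame that 'equivalent' focal lengths are defined against:

```python
import math

# Horizontal field of view for a lens on a 35mm "full frame" sensor
# (36mm wide) - the frame of reference the 43mm figure refers to.
SENSOR_WIDTH_MM = 36.0

def hfov_degrees(focal_length_mm: float) -> float:
    """Horizontal angle of view: 2 * atan(sensor_width / (2 * f))."""
    return math.degrees(2 * math.atan(SENSOR_WIDTH_MM / (2 * focal_length_mm)))

print(f"43mm 'normal' lens: {hfov_degrees(43):.0f} deg")   # ~45 deg
print(f"200mm telephoto:    {hfov_degrees(200):.0f} deg")  # ~10 deg
```

A 200mm telephoto only sees a roughly 10-degree slice of the scene, so the 'enhanced detail' effect being described comes at the cost of a very narrow window.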
 
I assume the source of your claims is the chap on Twitter called Green. He puts out a lot of theories about how the Autopilot computer works, none of which I've seen remotely confirmed by Tesla, and I would suggest treating them with the scepticism they deserve. I call BS on it.
@verygreen has full access to the compute hardware, so can very much monitor (and influence) what the car is up to. Far from theoretical. Along with others, they can see what the NNs are up to in great detail - and they have been proven right on countless occasions.

I would say that, short of some Tesla developers, they probably know the most about the state of play with regard to the on-board compute, even if their predictions are only based on their past expertise and experience.

They are certainly the last people to call BS on, unless you know better than they do.
 
Who are you - Col. Steve Austin? :)
The human eye has a focal length of about 22mm, and the 35mm-equivalent focal length for our useful binocular field of view is about 43mm. Everything else is wetware.
As you're fully aware, I never said the eye had a 200mm focal length; I was comparing the practical effect of what we can do with our eyes/brain to the detail in a still image. The brain fills in detail (or the belief of detail) unless you're specifically looking at an area of the view, but it remains on alert for when anything anomalous occurs. Replicating that ability with a camera/AI is challenging, requiring either far higher resolution images or some ability to instantly zoom.
 
My point was that you don't need a 200mm lens to emulate the human eye, just better software. In many cases computer vision surpasses the human eye, especially at the periphery. Have a look at the papers presented at this year's CVPR conference to get a flavour of what can be achieved by computer vision these days. The analogy with the human eye is getting less relevant every day.

The real question is whether Tesla can deliver it or not. ;)
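As a small illustration of how off-the-shelf this capability has become, here's a minimal detection sketch using torchvision's pretrained Faster R-CNN. The image file name is a placeholder of mine, and this is obviously nothing like Tesla's actual stack:

```python
# Minimal object detection with an off-the-shelf pretrained model.
# Assumes torch/torchvision are installed and a local "road_scene.jpg" exists.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()            # matching preprocessing pipeline

img = read_image("road_scene.jpg")           # uint8 tensor, CHW layout
with torch.no_grad():
    pred = model([preprocess(img)])[0]       # dict of boxes, labels, scores

categories = weights.meta["categories"]      # COCO class names
for label, score in zip(pred["labels"].tolist(), pred["scores"].tolist()):
    if score > 0.8:                          # keep only confident detections
        print(f"{categories[label]}: {score:.2f}")
```

That's a handful of lines for detection quality that would have been a research project a decade ago; the hard part is doing it reliably, in real time, on fixed in-car hardware.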
 
wrt the Jan 2019 HW3 analysis, for one. Karpathy confirmed the snapshots worked the way I described as well.
But obviously you can believe whatever you want.
People take your observations and then make wild simplifications to draw conclusions such as:

Even with the transition to vision AP we've lost the processing of oncoming traffic... they're having to sacrifice things.

Anyone who has seen the visualisation capabilities of FSD Beta can see that Tesla Vision on HW3 is clearly able to display oncoming traffic, and it doesn't really make sense that removing radar would increase CPU load. It seems far more likely that, for development speed, they omitted oncoming traffic from the visualisation, as it isn't essential and never really worked reliably on pre-FSDb Autopilot anyway.
 
No one ever mentioned that existing functionality would be removed from the “legacy” fleet. Had I known that, I probably would not have bought the car.

You are correct - auto wipers and high beams haven't changed much. They did not work well when I bought the car; they do not work well now. However, the difference is that after the upgrade I cannot use the AP because of them. That is a major change from simply avoiding them. From that perspective, the latest “upgrade” reduces the utility value of the car for me because it is equivalent to removing the AP. Hence, “ruining the experience”.

As for the USS, the historical behavior of Tesla points to the USS being disabled for the existing fleet sometime in the near future, probably in less than a year. That is why the two topics were co-mingled. It's a behavior pattern.
Well, I agree that Tesla need to reintroduce the ability for drivers to override the auto-wipers when they have false positives, like you can with the auto-dips, and this would not affect safety when using vision-only Autopilot. My current 2022 car doesn't have radar fitted, so I've perhaps had vision-only longer than most, and in over 3K miles of driving the wipers have irritated me with dry wiping twice - certainly not a ruinous experience for the whole idea of electric vehicles.

Personally, I don't think the USS will be disabled on cars that have it today; people are noticing that the new cars also have the newer cameras. I expect that in a few months, when park-assist returns with improved capabilities, there will be similar threads complaining that we can't have it, or get retrofits of the required hardware.
 
My point was that you don't need a 200mm lens to emulate the human eye, just better software. In many cases computer vision surpasses the human eye, especially at the periphery. Have a look at the papers presented at this year's CVPR conference to get a flavour of what can be achieved by computer vision these days. The analogy with the human eye is getting less relevant every day.

The real question is whether Tesla can deliver it or not. ;)
You need enough resolution for any object to be detected as what it is. So what resolution do you need on a camera to recognise a car as far ahead, and as quickly, as a human? Because while the human will focus just on the area of immediate interest, the AI will be looking at the whole frame.
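To make that concrete, here's a rough pixels-on-target calculation; the camera resolution and field of view are my own assumptions for illustration, not confirmed Tesla specs:

```python
import math

# How many pixels wide does a car appear at a given distance?
# Camera figures below are rough assumptions for illustration
# (ballpark of a ~1.2MP forward camera), not confirmed specifications.
H_PIXELS = 1280       # horizontal resolution (assumed)
H_FOV_DEG = 50.0      # horizontal field of view in degrees (assumed)
CAR_WIDTH_M = 1.8     # typical car width

def pixels_on_target(distance_m: float) -> float:
    angle_deg = math.degrees(2 * math.atan(CAR_WIDTH_M / (2 * distance_m)))
    return angle_deg * (H_PIXELS / H_FOV_DEG)   # pixels/degree * degrees

for d in (50, 100, 200):
    print(f"{d:>4} m: ~{pixels_on_target(d):.0f} px across")
# ->   50 m: ~53 px, 100 m: ~26 px, 200 m: ~13 px
```

Even at 200m a car only covers about 13 pixels with these assumptions, so there isn't much detail left to work with.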
Apparently, modelling human mechanistic computational power theoretically takes 10^13 - 10^17 FLOPS. Tesla's AI chip runs 10^10, so you'd need 1,000 to 10 million times as much processing capability... more if trying to beat the human with surround vision...
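For what it's worth, here is that ratio worked through, taking the post's numbers at face value; note that Tesla themselves advertise HW3 at roughly 144 TOPS of int8 compute (~1.4x10^14 ops/s), so the size of the gap depends heavily on which figures, and which precision, you use:

```python
# Ratio check for the estimate above, using the post's own numbers.
brain_low, brain_high = 1e13, 1e17   # estimated FLOPS to model human computation
chip = 1e10                          # the post's figure for Tesla's AI chip

print(f"{brain_low / chip:,.0f}x to {brain_high / chip:,.0f}x")
# -> 1,000x to 10,000,000x
```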
 