Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
It won’t be Thursday or Friday if there are bugs. We’ve seen it before: when there was an issue with a build like 10.6, it took almost a week for 10.6.1 to start getting delivered. First they take two days to roll out the initial release, then pause, then take a couple of days to fix, then ship to internal testers again, then after a little while roll out to group A, then the rest.
Yes, this is like the old movie “Groundhog Day”. I hope they get it out before the holiday for everyone to enjoy.
 
What exactly is "explicit photon count"?
I think he's basically referring to using the 12-bit RAW (RCCB) pixel data instead of converting to YUV, which would have added latency and quantization errors. He has referred to this as "actual photon counts" before:

However, that might not be totally accurate as I believe their camera sensors work at 36Hz, so there is some accumulation of photons over ~28ms that result in each pixel reporting 1 of 4096 values. But assuming each R/C/C/B sub-pixel measurement reports 1 of 8 possible values, that does get closer to counting the number of photons that triggered the measurement.
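As a back-of-the-envelope check on those figures (the 36 Hz frame rate and 12-bit RAW depth are the post's assumptions, not confirmed Tesla camera specs):

```python
# Rough sanity check of the numbers above. FRAME_RATE_HZ and
# BIT_DEPTH are assumptions from the post, not confirmed specs.

FRAME_RATE_HZ = 36
BIT_DEPTH = 12

integration_ms = 1000 / FRAME_RATE_HZ  # time photons accumulate per frame
levels = 2 ** BIT_DEPTH                # distinct values a pixel can report

print(f"integration time ~= {integration_ms:.1f} ms")  # ~27.8 ms
print(f"pixel levels: {levels}")                        # 4096
```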
 
Interesting - medical industry uses photon counts and special equipment for that.
 
Unless I’m missing something, I don’t see any direct statements that 2021.44.25.1 features will be included in 10.8.
Nothing saying otherwise either. The fact that Waypoints will be included seems to point in that direction.
Waypoints was introduced in 2021.40 itself, so 44.25 would have the features discussed in that thread. We are assuming 10.8 will have the 10.7 improvements plus the .25 holiday features. No confirmation yet.

IMO, Elon doesn’t want to leave the most loyal and enthusiastic customers, i.e. the beta testers, out of the holiday “gift”.
 
Is this one clear enough? :oops: Other than the missed date. :eek:

[Screenshot attachment, 2021-12-22]
 
Interesting - medical industry uses photon counts and special equipment for that.
There are different studies on how to use photon counts to see through occlusions. You can bet DARPA is studying this intently. Here is some basic info for anyone interested. Of course the "cheap" cameras would never be able to do individual photons, but it looks like Tesla is training the software to look for and identify low-photon patterns.
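For anyone curious why "low photon patterns" are hard: photon arrivals at a pixel follow Poisson statistics, so at low light the signal-to-noise ratio collapses. A quick illustrative sketch (not Tesla's actual pipeline):

```python
import math

# Photon arrivals follow a Poisson distribution: with a mean of N
# photons per exposure, the noise is sqrt(N), so the signal-to-noise
# ratio is N / sqrt(N) = sqrt(N). Dim scenes are dominated by noise,
# which is why low-photon-count imaging means finding patterns in it.

for mean_photons in (4, 100, 10_000):
    snr = math.sqrt(mean_photons)
    print(f"{mean_photons:>6} photons -> SNR ~= {snr:.0f}")
# SNR is 2 at 4 photons but 100 at 10,000 photons.
```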

 
"Pure vision, especially when using explicit photon count, is much better than radar+vision, as the latter has too much ambiguity – when radar & vision disagree, it is not clear which one to believe"-- Elon Musk.

This is pretty appalling. They don't know which one is correct so they'll just pick one. I would think that they could measure the variance of each and weight them accordingly. He's not saying that vision is better, just that they can't handle the difference.

This argues for a third method (lidar), so that you can choose the two values amongst the three that agree the most, unless you have some definite error bounds on vision alone. But Musk said that "... it is not clear which one to believe." This implies that vision isn't any more accurate than radar which is at odds with his previous statements.
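The "measure the variance of each and weight them accordingly" idea is standard inverse-variance weighting, the core of a Kalman-style update. A minimal sketch, with made-up numbers purely for illustration:

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of (value, variance) pairs.

    Each sensor's estimate is weighted by 1/variance, so a noisier
    sensor contributes less instead of being discarded outright.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused variance is below either input

# Hypothetical numbers: vision says the lead car is 50.0 m away
# (variance 4.0), radar says 48.0 m (variance 1.0). Radar is trusted
# more here, so the fused estimate lands closer to 48 m.
dist, var = fuse([(50.0, 4.0), (48.0, 1.0)])
print(f"fused distance = {dist:.1f} m, variance = {var:.2f}")
# -> fused distance = 48.4 m, variance = 0.80
```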

"it is not clear which one to believe" is total BS. There are lots of AV companies that use cameras, radar and lidar and they have all figured out "which one to believe". It is very clear to them "which one to believe". And they make it work with multiple cameras, multiple radar and multiple lidar in 360 degrees around the car. But Tesla can't do sensor fusion of 1 radar and 3 forward cameras? Gimme a break. :rolleyes: Elon should just be honest and say "I am not interested in sensor fusion because I want to do vision-only" instead of making up these BS excuses about how sensor fusion is too hard and does not work.
 

Then why don't they have robotaxis everywhere?

You keep insisting everyone else has this self driving thing figured out while they also haven't done anything other than tiny, heavily geofenced, only-a-few-cars-total, only-in-cars-they-own, consumer facing deployments.
 

I never said that they have all of self-driving figured out. I simply said that they have good sensor fusion. Not the same thing. "solving FSD" is more than just doing sensor fusion.

I was merely responding to Elon's claim that sensor fusion of cameras and radar is too hard. I am pointing out that it is false because other companies have good sensor fusion. I am not claiming anyone has "solved FSD".

They don't have robotaxis everywhere because they have not solved the "AV problem" yet. That is completely different from the sensor fusion problem that Elon is claiming.
 

Until they have self driving figured out we have no idea if they have "good" sensor fusion or not.

Since it might turn out, and Tesla certainly appears to believe, that such attempts at fusion end up making self-driving worse overall.


I'm totally on board with the idea Tesla might be wrong about it- I'm totally NOT on board with the idea anybody knows if they're right or wrong until someone actually solves self driving.
 
This is pretty appalling. They don't know which one is correct so they'll just pick one. [...]
Do you really think Elon just did eenie meanie miney mo?
 
Until they have self driving figured out we have no idea if they have "good" sensor fusion or not. [...]
And even then, there will be a degree of "in the eye of the beholder" with respect to how well it functions compared to other technologies.
 
BS excuses about how sensor fusion is too hard and does not work.
I think it's more about money than being lazy about sensor fusion. As an engineer, if I proposed something that was just as good but saved money, I got a few solid attaboys. In contrast, if I proposed something that was slightly better but cost more, I got a polite thank you. In my experience, both incremental improvements and cost savings associated with applying new ideas to an existing product were always modest.

Using more sensors sounds great, but it adds both cost and complexity. The latter generally reduces reliability. It's the same logic that Tesla used for its automatic wiper tech. Sure, a wetness sensor costs next to nothing but it also needs a connection to the wiring harness. Doing everything based on vision has obvious drawbacks, but it definitely saves money.
 
This is pretty appalling. They don't know which one is correct so they'll just pick one. [...]

"it is not clear which one to believe" is total BS. There are lots of AV companies that use cameras, radar and lidar and they have all figured out "which one to believe". [...]

I never said that they have all of self-driving figured out. I simply said that they have good sensor fusion. Not the same thing. "solving FSD" is more than just doing sensor fusion.

What exactly is "good sensor fusion"? Maybe Tesla felt they had "good sensor fusion", but pure vision could be even better. We don't know. You don't know. Maybe Tesla is right, maybe wrong. Regardless, Elon and the Tesla engineers and developers who opted for Pure Vision have more relevant insights than any of us.

So what insights into Tesla's internal data, testing, experience, etc., do you guys have that warrant your "pretty appalling" and "total BS" comments?
 
Perfectly. Exactly what I was looking for. Although it’s not impossible that he is referring to 2 different releases, since beta is a separate release. “Plus” rather than “Including” ;)
They will need to realign the beta and stable branches every once in a while so the beta cars don't fall too far behind, and so they can make sure no new issues came up from an accidental bug when upstreaming FSD to the stable branch. You never know what Elon's tweets actually mean, but it reads to me like they skipped 10.7 so they can release 10.8 with the holiday-update fork.
 