Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla.com - "Transitioning to Tesla Vision"

I believe that the remembering and predicting part, i.e. "temporal continuity and extrapolation", is exactly the most exciting benefit of v9, and by extension of all the similar, though possibly hastily constructed, updates to various other AP features.

The improved stability seen in the visualization is probably evidence of this. There are ways to achieve a more stable display without really reflecting a fundamental improvement in NN confidence, but I don't believe they're just playing display games with us; it makes perfect sense that the temporal analysis would have this stabilizing effect.

Regarding the radar-sees-past discussion, I again want to caution everyone against over-hyping that radar capability, and, by extension, against mourning its removal too much. The radar cannot "see", identify, classify, and track a lead vehicle just because some microwaves scooched under/around the in-between car and the reflection scooched back again.

Yes, it's wonderful to be alerted that "there's something up there and it's suddenly stopping" - but that's about all the info you get. I'd be cautious about the idea that it would draw a box and track it reliably; that is probably done more effectively by the camera NN catching glimpses, as a human driver would. And if the system is aware, from glimpses, that the immediately-in-front vehicle is following its own lead vehicle too closely, then it should widen its own following distance in defense against a possible pile-up - the same way a smart and experienced human would. I'm not denying that the radar could help out here, just saying that its output is far less detailed than people are implying.
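That defensive-gap idea is simple enough to state as a rule of thumb. Everything below (the function name, the thresholds, the gap values) is invented for illustration and has nothing to do with any real AP parameter:

```python
def target_gap_s(lead_cars_own_gap_s, base_gap_s=2.0):
    """Pick our following distance (in seconds) based on glimpses of how
    closely the car ahead is following *its* lead. All numbers invented."""
    if lead_cars_own_gap_s is None:       # never glimpsed a second lead car
        return base_gap_s
    if lead_cars_own_gap_s < 1.0:         # the car ahead is tailgating
        return base_gap_s + 1.0           # leave extra room for a pile-up stop
    return base_gap_s

target_gap_s(0.6)   # tailgater ahead -> widen to 3.0 s
target_gap_s(2.5)   # healthy gap ahead -> keep the base 2.0 s
```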
Yeah, definitely agree. I think the advantage of radar wasn't so much related to visualizations, but rather the ability to potentially know if an occluded car suddenly slowed down. It allowed AP to reduce speed even prior to the car directly ahead hitting the brakes. That is a fundamental sensor physics thing that vision cannot compensate for. You can't react to what you don't see.

I think it is fair not to treat radar as a silver bullet when it comes to AP/FSD. But at the same time, radar is a fundamentally different sensing modality with inherent advantages over vision, and it seems silly for people to start pretending that isn't true any more. It certainly has plenty of drawbacks as well, and Tesla has obviously had a hard time with sensor fusion. Still, it feels disingenuous to pretend radar isn't fundamentally superior to vision in a small subset of situations, and that no amount of vision-only improvement can replace those abilities, for the simple reason that radar works in a different region of the electromagnetic spectrum and its sensors work differently.
 
Removing radar shows Elon's influence in a manner similar to SpaceX; there's a push to improve a non-redundant system to the point that it's more effective, efficient, and safer than an alternate redundant system. I believe Elon's approach and maximization of efficiency is meant to lead to faster innovation, overall.
That doesn't make much sense stated this way.
By definition a redundant system is going to be more reliable than a non-redundant one.

The only way the pure-vision system is going to be more reliable or accurate than the one with radar included is if radar specifically makes the system perform worse. But that is not really possible, because with proper sensor fusion they can already tell when the radar is giving them an incorrect reading.

Every sensor will have noise and artifacts in its data. Sensor fusion done right can determine how to fuse the different sensory data to come up with something that is more reliable than the individual pieces.

The most likely explanation for the removal of radar is that, in their evaluations, vision-based ranging became so good most of the time that they now think they can remove the radar without losing much robustness or fidelity.
They would only do that if their internal evaluations on their test datasets yielded positive results. Let's hope their test datasets cover most of the scenarios. I'm particularly concerned about hilly roads where the vertical position of cars in the camera's field of view can vary without the distance changing.
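On the fusion point: the textbook version of "fusing two noisy estimates beats either alone" is inverse-variance weighting. A minimal sketch, with invented noise figures (not real sensor specs):

```python
def fuse(z_radar, var_radar, z_vision, var_vision):
    """Inverse-variance fusion of two independent Gaussian estimates of the
    same quantity. The fused variance is always <= the smaller input
    variance, which is the formal sense in which good fusion can't hurt."""
    w_r = 1.0 / var_radar
    w_v = 1.0 / var_vision
    fused = (w_r * z_radar + w_v * z_vision) / (w_r + w_v)
    return fused, 1.0 / (w_r + w_v)

# Radar range 50.2 m (variance 0.25), vision range 51.0 m (variance 1.0):
est, var = fuse(50.2, 0.25, 51.0, 1.0)   # est ≈ 50.36, var = 0.2
```

The fused estimate leans toward the lower-variance radar reading, and its variance (0.2) is smaller than either input, so a correct fusion layer never does worse than the best single sensor. Mis-modeled noise is, of course, exactly where fusion goes wrong in practice.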
 
That doesn't make much sense stated this way.
By definition a redundant system is going to be more reliable than a non-redundant one.

Radar doesn't provide redundancy because radar itself can't drive the car. Multiple cameras provide redundancy.

Redundancy is more related to dealing with hardware failures rather than sensor fusion. Sensor fusion provides more confidence in perceiving environmental objects in most cases. Redundancy is having multiple of the same sensor / system in case one fails.

 
Radar doesn't provide redundancy because radar itself can't drive the car. Multiple cameras provide redundancy.

Redundancy is more related to dealing with hardware failures rather than sensor fusion. Sensor fusion provides more confidence in most cases. Redundancy is having multiple of the same sensor / system in case one fails.


Radar can provide redundancy for specific tasks. For example, radar does provide redundancy for measuring the velocity of objects.
 
Radar can provide redundancy for specific tasks. For example, radar does provide redundancy for measuring the velocity of objects.

I think almost all FSD developers use only one sensing modality (whichever sensor provides the most accurate information) to measure things like position, velocity, etc.

For example, with Tesla's radar enabled, Tesla is going to rely on the radar for all velocity measurements (for the front of the car). But you're right: if the radar fails, it may fall back on the camera.
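To make that concrete: Doppler radar reads radial velocity directly, while vision has to differentiate successive noisy range estimates, which amplifies the noise by 1/Δt. A toy simulation with invented noise figures:

```python
import random
import statistics

random.seed(0)
TRUE_VEL = -3.0        # m/s closing speed
DT = 0.05              # s between camera frames
RANGE_NOISE = 0.5      # m std-dev of one vision range estimate (invented)
DOPPLER_NOISE = 0.1    # m/s std-dev of a radar Doppler reading (invented)

vision_errs, radar_errs = [], []
for _ in range(1000):
    # Vision: difference two noisy range readings to get a velocity.
    r0 = 50.0 + random.gauss(0, RANGE_NOISE)
    r1 = 50.0 + TRUE_VEL * DT + random.gauss(0, RANGE_NOISE)
    vision_errs.append((r1 - r0) / DT - TRUE_VEL)
    # Radar: Doppler measures the radial velocity directly.
    radar_errs.append(random.gauss(0, DOPPLER_NOISE))

vision_std = statistics.pstdev(vision_errs)  # ~sqrt(2)*0.5/0.05 ≈ 14 m/s
radar_std = statistics.pstdev(radar_errs)    # ≈ 0.1 m/s
```

Per frame pair, the differenced vision estimate is orders of magnitude noisier, which is why vision-only velocity needs heavy temporal filtering to approach what Doppler gives for free.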
 
Is there any indication Tesla actually added more cameras to compensate for the loss of radar? This Forbes article says it did in the headline, but the body of the article itself is silent on the issue.

 
What are everyone's thoughts on moving forward with picking up my M3 in June? I found a used 2020 M3 with FSD at a pretty good price. Would you take the new car with just Autopilot and no radar, or the 2020 with 6k miles, FSD, and radar?
 
What are everyone's thoughts on moving forward with picking up my M3 in June? I found a used 2020 M3 with FSD at a pretty good price. Would you take the new car with just Autopilot and no radar, or the 2020 with 6k miles, FSD, and radar?

Up to you; the newer one has the heat pump and the newer (better) console. This vision-only transition seems to be going well, and Tesla will likely transition all cars within the next 2-3 months at most.
 
Tesla Vision will only work in optimal weather conditions. Imagine driving down Hwy 5, coming around a corner and hitting Tule fog, or a sudden thunderstorm in the Midwest. I could not even see two feet in front of the car at times. Everything was fine, then wham, I could not see. The car will never reach Level 3 without some way to see in heavy conditions, at least well enough to give the driver a chance to pull over safely. And the one to two seconds it takes for the driver to realize the situation and react could mean the difference between life and death. I think Musk is making a huge mistake here. But it is his dime.
 
The real future is when the system can remember and intuit that a car is there, even when it cannot currently be seen, based on having seen it in the past. That seems necessary to be "superhuman."
I have no doubt the algorithms are doing that already. In typical dynamic sensing systems once an object is identified a track is initiated and future positions and velocities are predicted. Even if the object is temporarily obscured from view the system knows it's there, at least for a while or until that hypothesis is disproven. Kinematic history plays a very important role in this track, i.e., where it was seen last, which direction it was headed, and how fast it was going. So it does sort of intuit it you might say.
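That track-and-coast behavior can be sketched in a bare-bones form, with invented names and thresholds (an alpha-beta filter stands in for whatever Tesla actually runs):

```python
class Track:
    """Constant-velocity track that keeps predicting through occlusion.
    MAX_COAST and the alpha/beta gains are invented for illustration."""
    MAX_COAST = 10  # frames to coast before dropping the hypothesis

    def __init__(self, pos, vel):
        self.pos = pos      # last estimated position (m)
        self.vel = vel      # last estimated velocity (m/s)
        self.missed = 0

    def update(self, measurement, dt, alpha=0.5, beta=0.2):
        # Predict from kinematic history, even with no new detection.
        self.pos += self.vel * dt
        if measurement is None:                    # occluded this frame
            self.missed += 1
            return self.missed <= self.MAX_COAST   # False = drop the track
        # Correct the prediction toward the new measurement.
        residual = measurement - self.pos
        self.pos += alpha * residual
        self.vel += beta * residual / dt
        self.missed = 0
        return True

t = Track(pos=40.0, vel=-2.0)
t.update(39.8, dt=0.1)           # seen: prediction corrected by measurement
alive = t.update(None, dt=0.1)   # occluded: coasts to pos 39.6, still alive
```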
 
From Reddit - a user's experience with a radar-less Model 3 in rainy conditions:

Sounds like TACC was almost unusable when cars were kicking up water on wet roads. Doesn't sound good to me 😞 Seems like a regression; I've had a pretty good experience with TACC even in moderate-to-heavy rain.
 
It 'sounds' good, but try living in the country. We live 25 miles from the nearest 'city' of 100k people. We drive Texas freeways every day. For those of you who do not understand, that is a two-lane road (one lane each way) with a speed limit of 75 mph. We find fallen branches, dead and live deer, cows, 'alligator' tire-tread strips, armadillos, and boards on the road almost every day. While I use FSD every day, when it rains heavily it is hard to see, and I confess I rely on the radar to get me through.
 
I have no doubt the algorithms are doing that already. In typical dynamic sensing systems once an object is identified a track is initiated and future positions and velocities are predicted. Even if the object is temporarily obscured from view the system knows it's there, at least for a while or until that hypothesis is disproven. Kinematic history plays a very important role in this track, i.e., where it was seen last, which direction it was headed, and how fast it was going. So it does sort of intuit it you might say.
I'd like to see any proof of this. Just today I noticed the display alternating between cones and "sticks" for the same row of objects. It happily alternated between them almost frame by frame, which doesn't indicate any kind of permanence. I've seen trash cans come and go, and stop lights flicker in and out of existence - and this at a stop, or moving, with static obstacles that are always fully in view: some of the easiest cases. It doesn't seem like there is currently any frame-to-frame kinematic tracking.

This is way harder if what you are trying to track is itself moving, like a car in front of the truck in front of you.

Happy to be proved wrong, but I've never seen any evidence and you having "no doubt" doesn't quite meet my bar.
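For what it's worth, that cone/stick flip-flopping is exactly what a per-frame classifier with no temporal filter looks like, and even a trivial majority vote over a short window suppresses it. A toy sketch (the window length is arbitrary):

```python
from collections import Counter, deque

def smooth_labels(frames, window=5):
    """Replace each frame's class label with the majority vote over the
    last `window` frames. A real tracker does far more, but even this
    stops frame-by-frame label flicker."""
    history = deque(maxlen=window)
    out = []
    for label in frames:
        history.append(label)
        out.append(Counter(history).most_common(1)[0][0])
    return out

raw = ["cone", "stick", "cone", "stick", "cone", "cone", "stick", "cone"]
smooth_labels(raw)   # -> ["cone"] * 8: the flicker is gone
```

That the visualization still flickers is decent evidence that whatever temporal smoothing exists isn't winning in the display pipeline, whatever the planner sees internally.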
 
True, but also not that big of a deal? It hasn't rained here in the Bay Area in at least six months, I'm pretty sure. I don't think the engineers working here care too much, nor can they test it.

It's a big deal for the people who live in places where it rains. There've been a number of physical design problems with the cars that were clearly stupid to anyone who's lived somewhere outside of the bay area/most of California with similar weather, it'd be unfortunate to see the software trending the same way.
 
It's a big deal for the people who live in places where it rains. There've been a number of physical design problems with the cars that were clearly stupid to anyone who's lived somewhere outside of the bay area, it'd be unfortunate to see the software trending the same way.
Good point. It did seem like Tesla did not understand the different needs of cold climates in the past, for example. Rain concerns me, and so does ice (or frost) covering the lenses. Here in northern NJ we have to scrape ice off the windows many mornings, and every car gets covered with white salt spray from the roads in winter. How do the cameras see through this stuff? Do the wipers cover the portion of the windshield where the front cameras are?
 
It's a big deal for the people who live in places where it rains. There've been a number of physical design problems with the cars that were clearly stupid to anyone who's lived somewhere outside of the bay area, it'd be unfortunate to see the software trending the same way.
I live in a place with little rain overall, but I'm all the more concerned about driver-assist performance when it does rain. The upkeep of painted lines and reflectors is not great on main roads and may be nonexistent on secondary ones, and the desert soil is prone to washing over the road. Consequently, the lines almost disappear in wet conditions; I hate driving at night in those conditions.

So for me this is a big reason to want extra assistance and confidence in the rain, not giving up and beeping out. I sincerely hope Tesla is able to train on this from the fleet data, and improve rainy-weather AP within this year.
 