
Discussion: Tesla Vision system for Model 3/Y

Found the latest version to be slightly better at some things, even in heavy rain the past two days: 30 mi of driving home from work with only a few disengagements due to the heavy weather. This is why autonomy isn't happening anytime soon with anybody; in FL, the rain can be horrendous. It was pretty serious rain, I'd call it medium to medium-heavy, and the car did well, with just a few "Autopilot can't work now" notices because of it.

Sentry was supposed to be fixed; some say it is, but I don't think mine is working. I tried testing it (maybe it takes a few minutes to arm after leaving the car), but the Sentry eyeball screen wouldn't come on when I walked up next to the car with no phone or keycard near me and the car locked. It wouldn't flash the lights or anything. I've toggled it on/off and made sure all the boxes like Exclude Home/Work were unchecked, and it still didn't seem to work. Guess I need more testing, or maybe someone can explain the circumstances under which it should trigger?

Every car with ACC I've owned does the same thing the person above reported: when the car ahead moves over a lane or turns off, your car keeps slowing down and takes a few extra seconds to accelerate again. I believe that's entirely software. My other two cars with ACC, a BMW and a VW, both with radar and cameras, behave the same way. They take an abnormally long time to get back on the accelerator compared with a human driver, even when there's essentially no chance the other vehicle will still be in your path by the time you reach it. All of these systems seem very conservative about this behavior.

Also, none of these systems picks up stopped traffic at a light early enough while you're approaching fast, the way the human eye does. They all react too late, or will run you at 50 mph into the back of a stopped car with no slowing, or slow so late that you have to intervene while the collision alert is going off. Which, BTW: I had an incident this morning where I was coming up fast with no Autopilot/Autosteer engaged, and the car didn't even attempt to warn me I was about to ram the cars stopped at the light. I have the collision warning set to regular/medium, but even on Late it should have been going nuts; it didn't say a thing. I know this isn't city driving and I don't have FSD or the Beta, so maybe those handle it better, but plain Autopilot/ACC just doesn't, no matter what manufacturer or technology I've owned.
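For anyone curious why these warnings feel late, here's a rough back-of-the-envelope sketch in Python. All the numbers are my own assumptions for illustration (no manufacturer publishes its actual logic): a fixed time-to-collision alert threshold can fire after the point where a comfortable stop is still possible.

```python
# Hypothetical illustration: why a fixed time-to-collision (TTC)
# alert threshold feels "late" when closing fast on stopped traffic.
closing_speed_mph = 50
v = closing_speed_mph * 0.44704           # convert to m/s (~22.4 m/s)

ttc_threshold_s = 2.5                     # assumed alert threshold
alert_distance = v * ttc_threshold_s      # distance at which alert fires

reaction_time_s = 1.0                     # typical driver reaction time
decel = 6.0                               # m/s^2, firm but comfortable braking
stopping_distance = v * reaction_time_s + v**2 / (2 * decel)

print(f"alert fires at {alert_distance:.0f} m")               # ~56 m
print(f"distance needed to stop: {stopping_distance:.0f} m")  # ~64 m
```

With these assumed numbers the alert fires roughly 8 m after the last point where a normal stop was still possible, which matches the "too late, have to intervene" experience.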
 
Found the latest version to be slightly better at some things, even in heavy rain the past two days...
Can't Tesla figure bad weather out in the near term though? If I can drive in zero visibility somewhat safely (conditions being relative), then can't my car do a better job? Cap speed to 50 or less on the highway, turn on the hazards so other cars can see you more easily, make fewer or no lane changes. Perhaps even pull onto the shoulder and wait until conditions improve? This is exactly where I expect 8 cameras and a computer to shine: they take away human risk tolerance by meaningfully adjusting the car's behavior, slowing down or even stopping until the heavy rain or snow lets up.
 
My phantom braking also occurs going up or cresting a hill, especially when another car is coming over it in the other direction.
This is all daytime driving.
2021.4.18.2
Maybe .10 version will help.
This is interesting to hear. I would guess that about 70% of my phantom braking events happen when cresting hills... and it's super flat in this area of Michigan, so these are very slight crests we're talking about. On the exact same roads, I never once noticed this correlation with my previous Model 3.

Side note: I tried 80mph AP on the expressway yesterday (daytime, sunny) and it was flawless.
 
Can't Tesla figure bad weather out in the near term though? If I can drive in zero visibility somewhat safely (conditions being relative), then can't my car do a better job?...
You should never turn on your hazards while driving in the rain, or while driving at all for that matter. It's against the law, at least in Florida.

The only time hazards should be used is when your car is disabled or stopped, to alert others. Some yahoos in FL always turn them on during hard downpours; the rest of us think we're seeing a broken-down or stopped car, but they're just driving along in the rain with their hazards on.
 
You should never turn on your hazards while driving in the rain, or while driving at all for that matter. It's against the law, at least in Florida...
I'll disagree - it's extremely helpful for spotting those slow-driving yahoos in their 20-year-old cars with dim taillights. You're assuming everyone slows down to a reasonable speed given the conditions, and that's just not the case. I could be driving 15-20 under the speed limit on NJ highways in heavy rain and someone will still be traveling over the speed limit.

PS - your state seems to agree with me and 40 other states: Florida just changed that law, and effective today you can drive on the highway with your hazards on in heavy rain and fog.
 
(2021.4.18.10) Just autopiloted home in heavy rain at night; it seemed to work fine! Had my first phantom braking earlier in the day though :/ The car suddenly started to slow down moderately quickly right before a traffic light fixture (with a shadow cast on the ground by the horizontal pole above).
 
Do you have a reference for this? The angle stated in the data sheet does indeed appear to be the angular resolution. It's essentially the beam width of the array, and it specifies how close two objects can be in angle and still be recognized as distinct. How Tesla uses this radar is anyone's guess, but the device itself does appear to be a scanning phased-array device capable of both radial and angular position measurement.

No, it's not capable of angular position measurement. If you want to insist that it's capable of this, please find a reference stating it can do this. I've looked at multiple data sheets, reviews, and filings for the ARS4 and ARS4-A RADAR sensors from Continental in several languages/countries, and none of them says it reports angular position.

It's true that a planar phased array device can be capable of angular measurement; that doesn't mean all of them can do it. It's not true that this device is capable of angular measurement. That's simply not what it's designed to do and not a piece of information it outputs.

It is very much engineered to be very effective and very fast at determining distance and velocity for a cloud of objects (up to 64 of them at once), and it does so for two different ranges at once, with 17 scans per second.
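For illustration only, here's a minimal sketch of what a target list from a radar like this might look like in code. The field names and values are my assumptions, not Continental's actual message format; the point is that each return carries range and relative velocity, which is the claim being made here.

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:
    # Hypothetical fields; not Continental's actual message layout.
    range_m: float            # radial distance to the reflector
    rel_velocity_mps: float   # closing (negative) or receding (positive) rate
    amplitude_db: float       # return strength

# One scan cycle (~1/17 s) yields up to 64 such targets,
# reported for a near range and a far range at once.
scan = [
    RadarTarget(range_m=42.5, rel_velocity_mps=-3.1, amplitude_db=18.0),
    RadarTarget(range_m=118.0, rel_velocity_mps=0.0, amplitude_db=6.5),
]
```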

[Attachment: data sheet screenshot]



Meanwhile, here's a reference to its ability to provide the angle relation between 2 objects:

[Attachment: data sheet screenshot showing the angle specification]
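To make the "angular resolution" figure concrete, here's a quick calculation with an assumed resolution value (the real spec may differ): at a given range, the angular resolution sets the minimum cross-range separation at which two reflectors still show up as distinct.

```python
import math

angular_resolution_deg = 4.0   # assumed value, for illustration only
range_m = 100.0

# Minimum lateral separation for two objects to be resolved at this range
min_separation_m = range_m * math.radians(angular_resolution_deg)
print(f"{min_separation_m:.1f} m")  # ~7.0 m apart at 100 m range
```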


 
I can't recall or find who posted the Andrej Karpathy (Tesla AI engineer) workshop video, but thanks! I finally had time to watch it.

Confirms my earlier suspicions: RADAR is introducing a lot of noise which takes a lot of work to reconcile. It's wasteful of processing power and fails to provide much valuable information that isn't already there from vision, which means it's a net negative at run time. Moreover, time and effort spent mitigating the noise was holding back progress on the actual vision components, which means that sticking with reconciling RADAR into the model holds back the video AI from getting better, so it's a net negative to code and neural net development.

This graph from the video is a great example – the orange line is the legacy system's perception based on RADAR combined with vision:

[Attachment: graph from the video]


This happens at high speed when the car ahead slams on the brakes and decelerates rapidly. The RADAR interprets the incoming signals as multiple different cars appearing at different positions just ahead; it detects points and reports them.

The vision system sees the same car, appearing closer and closer.

The system needs to fuse these two sets of data together, and a lot of code and processing is spent ruling out the idea that there are multiple cars.

For contrast, the blue line shows the latest vision-only system, uncorrupted by the RADAR noise.
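As a toy illustration of the fusion problem described above (my own sketch, not Tesla's code): the fusion layer has to decide whether several radar returns all belong to the one car vision sees, typically by gating returns that fall close enough to the vision track's estimated distance.

```python
def associate(vision_range_m, radar_ranges_m, gate_m=5.0):
    """Keep only radar returns within a distance gate of the vision track.

    Toy nearest-neighbor gating; real fusion also weighs velocity,
    angle, and track history before accepting or rejecting a return.
    """
    return [r for r in radar_ranges_m if abs(r - vision_range_m) <= gate_m]

# Vision tracks one car at 38 m; radar reports three "cars" during hard braking.
print(associate(38.0, [36.5, 39.2, 52.0]))  # [36.5, 39.2] - the 52.0 ghost is rejected
```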

----

Here's another example. This time, the car is rapidly approaching a stopped truck (the white rectangle on the shoulder of the road ahead in the video frame).

[Attachment: video frame]


There are also buildings and trees and other "stopped" objects that the radar perceives, and it can't tell whether they're on the road or not. So the RADAR is sending signals for all of these, and the fused RADAR-vision system has to work to ignore all these "false stopped objects." (I say "false" because they're not on the road; they're real stopped objects that create a false warning.) But that *same* filtering that weeds out the false stops also weeds out the *real* stationary object of interest - the truck! As a result, the fused RADAR-vision model (in orange) doesn't conclude there's a stopped object ahead until we're about 110 meters away.

But the new pure-vision model (in blue), without that RADAR noise, consistently concludes there's a stationary object *in the roadway* from 145 meters away. That's significantly sooner.
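Here's a toy version of the filtering dilemma (again my sketch, not the actual code): a return from a stationary object has a relative velocity equal and opposite to your own speed, so a naive filter that drops such returns to suppress bridges and roadside clutter drops the stopped truck too.

```python
def drop_stationary(returns, ego_speed_mps, tol=0.5):
    """Discard returns whose ground speed is approximately zero.

    ground_speed = ego_speed + rel_velocity (rel_velocity is negative
    for objects we are closing on). This suppresses signs, trees, and
    bridges, but also suppresses a genuinely stopped truck in the lane.
    """
    return [r for r in returns
            if abs(ego_speed_mps + r["rel_velocity_mps"]) > tol]

ego = 30.0  # m/s
returns = [
    {"name": "building",      "rel_velocity_mps": -30.0},
    {"name": "stopped truck", "rel_velocity_mps": -30.0},  # filtered out too!
    {"name": "moving car",    "rel_velocity_mps": -8.0},
]
print([r["name"] for r in drop_stationary(returns, ego)])  # ['moving car']
```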

Link to the video segment that shows the example above:



(There's also an example of the false RADAR noise when passing under a bridge.)

---

And don't talk to me about driving this road in fog. In dense fog, where vision can't see more than, say, 30 meters ahead, all those RADAR reports of stopped objects would leave the car unable to tell which ones are real, so it would have to go slow enough to brake within 30 meters for any of these false stops... which means it's going no faster than it can see anyway.

Even with RADAR, the car can go no faster in the fog than it can see the road ahead. If we tell the software to assume the road continues straight (which is what we humans do on such a road), it still has to ignore all those RADAR reports of stopped objects. And if we're forced to tell it to ignore all stopped objects outside our field of vision in order to keep going, then it can't help us in the rare case where there actually is something in the road ahead! Simply put, it's never safe to drive so fast that you can't brake within the distance you can see.
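The "no faster than you can see" point falls out of simple kinematics; a sketch with assumed deceleration and reaction time (pick your own numbers): solve sight distance = reaction distance + braking distance for speed.

```python
import math

def max_safe_speed(sight_m, decel=6.0, reaction_s=1.0):
    """Largest v satisfying v*reaction_s + v^2/(2*decel) <= sight_m."""
    # Quadratic v^2/(2a) + t*v - d = 0; take the positive root.
    a, t, d = decel, reaction_s, sight_m
    return a * (-t + math.sqrt(t * t + 2 * d / a))

v = max_safe_speed(30.0)  # 30 m of visibility in dense fog
print(f"{v:.1f} m/s = {v / 0.44704:.0f} mph")  # ~13.9 m/s = ~31 mph
```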

And yes, if it had LIDAR, it could probably tell which objects are on the road vs off the road... but my Model 3 does not have LIDAR. It only has RADAR and vision for distance measurements... so I'll be glad when I have the software that turns off the RADAR.

When the fog is dense enough that you can't see the road ahead, it's safe neither for humans nor for the Model 3. You just have to go slow. Even a car with LIDAR, which could see an object on the roadway in the fog ahead, can't go faster, because LIDAR can't tell it where the road is. If there's a hole in the road, or it's washed out, etc., the LIDAR can't tell you. It's still unsafe to go too fast when visibility is gone.
 

No, it's not capable of angular position measurement. If you want to insist that it's capable of this, please find a reference stating it can do this...

Not sure why you think this device can't measure azimuth angle. It does appear to be a scanning phased array, and the spec sheet specifies the beam width and angular resolution. In fact, the phrase you highlighted above seems to say it does measure angle, so I must be missing your point. If this phased-array beam-steering device cannot report the angle a reflection came from, then someone went to a lot of trouble to gimp it.

I am very familiar with phased array radar systems and I can tell you have some experience as well. I genuinely would like to learn more about why you believe this device cannot report AOA. I’m not saying it does, just that it ought to be able to based on its description as I read it.
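For context on the phased-array point, here's the textbook relationship, with assumed geometry; I'm not claiming this is what the ARS4 does internally. With two receive elements a half-wavelength apart at 77 GHz (the usual automotive radar band), the phase difference of an incoming wavefront gives its angle of arrival.

```python
import math

C = 3.0e8             # speed of light, m/s
F = 77e9              # typical automotive radar frequency (assumed)
WAVELEN = C / F       # ~3.9 mm
D = WAVELEN / 2       # assumed element spacing of half a wavelength

def angle_of_arrival(phase_diff_rad):
    """AOA from the phase difference between two receive elements:
    delta_phi = 2*pi*d*sin(theta)/lambda, solved for theta."""
    return math.degrees(math.asin(phase_diff_rad * WAVELEN / (2 * math.pi * D)))

print(f"{angle_of_arrival(math.pi / 4):.1f} deg")  # ~14.5 deg off boresight
```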
 
I found a video online and some other threads. Seems it still works for now even with a covered camera. Hopefully it stays that way through the v11 update.

Thanks for posting. I hate the 75 mph speed limit on Autosteer, but I'm too scared to update for fear of increased nanny features. Eventually they're going to patch out our steering wheel weights and blocked webcams to appease the lawyers, and I really don't want to jiggle the steering wheel every 30 seconds or whatever, as that's even more unsafe IMO.
 
Thanks for posting. I hate the 75 mph speed limit on Autosteer, but I'm too scared to update for fear of increased nanny features...
Do you even own a Tesla?
 
We took our first long trip in our new radarless Y, and it's not bad, but it's not as good as our 3 with radar. The car was slow to accelerate in line with the car in front and late to brake if the car ahead was braking at more than a moderate pace (the radar car is much better at this). When it was raining, the car handled the weather fine, but the rain was light to moderate. In hilly areas, particularly when going over a crest where the car briefly couldn't see the lines, we got brake-checked, and my wife made me turn it off.
My opinion, and it's just my opinion, is that the current radarless Autopilot offering is a step back, but I fully expect it to reach parity with and then exceed the current radar/camera offering.
 
So my car officially started bugging me to upgrade to 2021.4.18.2, which adds the interior camera monitoring. I currently have my camera covered and don't really want to uncover it, but I fear leaving it covered will stop AP from working. Has anyone with the update tried covering the camera to see if it stops AP from being activated?
AP is fine if you keep the camera covered.

It has no effect on use of AP and you can turn it off in the privacy settings in the car (if memory serves).

I live dangerously and keep the camera on. I even agreed to share the video with Tesla, which I think only goes to them in the event of a crash or serious failure.

If one is paranoid about being monitored, it's fair to ask whether buying a car with a gazillion cameras, aggressive machine-learning software and hardware, and the ability to map out your travels with a list of Superchargers and your expected battery level at each of them is even advisable.
 
I can't recall or find who posted the Andrej Karpathy (Tesla AI engineer) workshop video, but thanks! I finally had time to watch it...
Thanks for sharing.
 
If one is paranoid about being monitored, it's fair to ask whether buying a car with a gazillion cameras... is even advisable.
I agree. The same applies to carrying a cellphone. To play the game is to share. I think it's foolhardy to believe claims that "no personally identifiable information" is used or retained. I still own a phone, of course, and a Tesla, but I am under no illusions of total privacy. I wonder, does it have a "Date Mode"? ;)
 
Show me a data sheet or description of its capabilities anywhere that says that it does.
The data sheet you already linked above says it does. For more proof, look at some images of test scans. These two-dimensional images were constructed from the range and angle measurements produced by the radar. It's abundantly clear that not everything is straight ahead; hence, angle of arrival was measured and used. These radars don't physically move to scan; they're phased arrays and scan electronically over angle. Images like these cannot be made without angular data, which means the radar is capable of measuring angle of arrival.
[Attachments: two radar test-scan images]
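That's easy to demonstrate: plotting a scan in Cartesian coordinates requires converting each (range, angle) return, so images like those above can't exist without a per-return angle. A minimal sketch of the conversion (my own, with made-up returns):

```python
import math

# Hypothetical (range_m, azimuth_deg) returns from one scan
returns = [(40.0, -10.0), (75.0, 0.0), (120.0, 15.0)]

# Polar -> Cartesian: without a measured azimuth there is no x coordinate
points = [(r * math.sin(math.radians(az)),   # lateral offset from boresight
           r * math.cos(math.radians(az)))   # distance straight ahead
          for r, az in returns]
print(points)
```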
 
Agreed. It does inspire confidence. I especially liked the comments about how their system is still learning by watching so many of us drive, comparing what AP did with what the human driver did and learning from the difference. He mentioned an example where the human braked but AP didn't think it needed to. This ongoing refinement is really where it's at. Exciting!
 
I agree. The same applies to carrying a cellphone. To play the game is to share...
I think this is the right attitude.