It seems radar should “see” an object in the path whether or not it is moving. White objects do seem to represent a challenge: in that long-ago fatal crash in Florida, the one in which a tractor-trailer rig made a left turn across the path of the Model S, the trailer was white, so the car didn’t see it. I understand fixed objects are a problem. When one comes up over a hill with an overpass in the near distance, that would be seen as an obstacle across the road. GPS helps there; those overpasses don’t move. When one travels in the late afternoon, when tree shadows are long across the road, the camera sees the dark patches on the road and can brake to avoid these shadow “objects.” That’s better now than previously.
You're not understanding how this works. There are reasons it doesn't "see" the way you're thinking, which I'll try to explain here. What you're describing is much more typical of something like LIDAR, where you're projecting a laser along a single line, getting a range, then moving to a new line.
Automotive radar is generally continuous wave radar, in Tesla's case chirping continuous wave. What that means is that the transmitter is putting out a single signal that covers the entire area the radar sees, the whole time. It sees things by Doppler effect and direction finding - objects moving at a different speed than the car reflect the signal at a different frequency than the one transmitted, with the frequency difference proportional to the speed difference. Using multiple receivers and the timing difference between the shifted signal arriving at each of them, it gets a sense of where the object is relative to it.
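To put rough numbers on that Doppler relationship: the reflected frequency shifts by about 2 * v * f_carrier / c for an object closing at speed v. This is just a sketch of the physics, not anything Tesla-specific; the 77 GHz carrier is a typical automotive radar band, not a confirmed spec.

```python
# Sketch of the Doppler relationship described above (assumed 77 GHz
# carrier, which is typical for automotive radar).
C = 3.0e8          # speed of light, m/s
F_CARRIER = 77e9   # carrier frequency, Hz

def doppler_shift_hz(closing_speed_mps: float) -> float:
    """Frequency shift of a return from an object closing at the given speed."""
    return 2.0 * closing_speed_mps * F_CARRIER / C

def closing_speed_mps(shift_hz: float) -> float:
    """Invert the relationship: recover relative speed from a measured shift."""
    return shift_hz * C / (2.0 * F_CARRIER)

# An object closing at 30 m/s (~67 mph) shifts the return by ~15.4 kHz:
print(f"{doppler_shift_hz(30.0):.0f} Hz")  # 15400 Hz
```

The shifts are tiny fractions of the carrier, which is why the radar cares about frequency differences rather than absolute frequencies.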
The chirping gives the car distance - 20 times per second, the car does a frequency hop, then times how long that hop takes to come back on the doppler frequency. Both the distance and direction are of limited resolution, based on how precisely the car can measure the timing involved.
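The ranging side is even simpler in principle: a round-trip delay t corresponds to a range of c * t / 2, so the precision of the timing measurement sets the range resolution. A minimal sketch of that mapping:

```python
# The chirp-timing idea from above: a frequency hop that reappears in
# the return after round-trip delay t puts the reflector at R = c*t/2.
C = 3.0e8  # speed of light, m/s

def range_from_delay_m(round_trip_s: float) -> float:
    """Range to a reflector given the round-trip time of the chirp."""
    return C * round_trip_s / 2.0

# A hop that comes back 0.4 microseconds later is ~60 m out:
print(f"{range_from_delay_m(0.4e-6):.1f} m")  # 60.0 m
```

At these scales, a timing error of a few nanoseconds is an error of around a meter in range, which is why the resolution is limited by how precisely the car can measure the timing.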
So the initial approach of all automotive radar was just to throw out anything that wasn't moving (anything with a Doppler shift matching the car's own speed over the ground). That's why those systems couldn't be used to come to a complete stop and start up again. Some of those first generation designs are still in use, like what Ford offers in the Fusion.
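That first-generation filter is easy to sketch in a few lines. Everything here is illustrative (the tuple format and tolerance are made up for the example), but it shows why a stopped car vanishes: it closes on you at exactly your own speed, same as every sign and overpass.

```python
# Toy version of the first-generation filter: drop any return whose
# closing speed matches the car's own ground speed, i.e. the object is
# stationary relative to the road. Data format and tolerance are
# hypothetical, chosen just for illustration.

def moving_returns(returns, own_speed_mps, tolerance_mps=0.5):
    """returns: list of (range_m, closing_speed_mps) tuples.
    Anything closing at roughly our own speed is stationary - toss it."""
    return [r for r in returns
            if abs(r[1] - own_speed_mps) > tolerance_mps]

returns = [(80.0, 25.0),   # stationary object (we're doing 25 m/s)
           (40.0, 5.0),    # moving car ahead, closing slowly
           (60.0, 25.2)]   # stationary, within tolerance
print(moving_returns(returns, own_speed_mps=25.0))  # [(40.0, 5.0)]
```

Note that a stopped car ahead looks exactly like the first entry, which is why this design drives straight into one.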
Second generation continues to throw out everything that's stationary, but once the radar locks on to something in front that is moving, it remembers it and keeps tracking the distance with the chirps. That's where most cars with ACC are today: able to do full stop and go flawlessly, as long as the car in front was moving and in range when it got in front.
Tesla added another layer - the car does some sort of sensor fusion (not sure exactly how it's implemented) and compares things the radar sees with things the camera neural net thinks are vehicles - and so if the camera NN recognizes a car as a car, TACC will start responding to it and tracking it with the chirps even though it is stopped and was never seen to move.
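Tesla hasn't published how that fusion is implemented, so the following is purely a hedged sketch of the gating idea described above (all names and thresholds are hypothetical): only start tracking a stationary radar return if the vision NN independently reports a vehicle in roughly the same direction.

```python
# Hypothetical sketch of camera/radar gating for stationary targets.
# Nothing here reflects Tesla's actual implementation; the bearing gate
# and detection format are invented for illustration.

def should_track(radar_bearing_deg, camera_detections, bearing_gate_deg=2.0):
    """camera_detections: list of (label, bearing_deg) from the vision NN.
    Track a stationary radar return only if vision sees a car there too."""
    for label, cam_bearing in camera_detections:
        if label == "car" and abs(cam_bearing - radar_bearing_deg) < bearing_gate_deg:
            return True   # vision confirms a car: track it even though it's stopped
    return False

cams = [("car", -1.2), ("sign", 4.0)]
print(should_track(-0.5, cams))  # True  - stopped car, vision agrees
print(should_track(4.1, cams))   # False - only a sign in that direction
```

The key property is that a stationary return alone never triggers tracking; it takes the second sensor agreeing to promote it to a target.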
Now you've got a soda can pointed at the car, a stopped car, and a big road sign next to the car. So the radar bounces off of all of them, with a doppler shift that says it's stationary. But it comes back as one return, and the signal is mixed in with all the cracks in the road and overpasses and such - all on the same frequency, but returning the chirp with different timing. Breaking them apart to realize that one of them is a stopped car on the road requires a whole lot of processing, which has to be done in very little time (twenty chirps per second coming through...) Unless the camera NN recognizes it as a car, all three of the above designs will likely hit it.
One approach Tesla said they were taking that can mitigate that is the radar whitelist. Basically, the cars all make a record of the stationary returns they see as they're driving, with as much information about location and signal intensity as the car can put together. Those get reported to the mothership, and it stitches them into map tiles of what a car should see on a given road in a given lane.
Then the car would download tiles for the area around it, and if the car sees something different than the tile says, it's either a stopped car or the aforementioned aluminum can (which has a nice concave bottom that gives a radar return far beyond its size). If the resolution is high enough, it can serve as a third reference to back up the cameras and GPS for autonomous driving in the future - that return is 5.2 degrees to my left at 200 feet, so by the map I'm six inches into the right lane.
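The tile-comparison step might look something like this in miniature. To be clear, the data structures here are invented for illustration; this is just the "flag anything the map didn't predict" logic, not Tesla's actual whitelist format.

```python
# Illustrative sketch of the radar whitelist comparison: match live
# stationary returns against a downloaded tile of expected returns,
# and flag anything unexpected. Formats and tolerances are hypothetical.

def unexpected_returns(observed, tile, bearing_tol_deg=0.5, range_tol_m=3.0):
    """observed / tile: lists of (bearing_deg, range_m) stationary returns.
    Returns observations with no nearby match in the tile - candidate
    stopped cars (or that aluminum can)."""
    flagged = []
    for bearing, rng in observed:
        if not any(abs(bearing - tb) < bearing_tol_deg and
                   abs(rng - tr) < range_tol_m
                   for tb, tr in tile):
            flagged.append((bearing, rng))
    return flagged

tile = [(5.2, 61.0), (-3.0, 120.0)]    # known sign and overpass for this lane
live = [(5.2, 60.0), (0.1, 45.0)]      # the sign... plus something new ahead
print(unexpected_returns(live, tile))  # [(0.1, 45.0)]
```

The inverse comparison is also useful: a known return showing up shifted from where the tile says it should be is exactly the lane-position reference described above.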
This was supposed to be rolling out as part of 8.1 back in the day, but I haven't heard anything about it in a year or two, so I'm not sure what the status is. One challenge is how you deal with false positives caused by changes to road signs or other nearby objects. I guess the car could pop a red take-over-immediately alert the first couple of times a car sees the change, then add it to the map?