Welcome to Tesla Motors Club

Possible removal of radar?


Almost ready with FSD Beta V9.0. Step change improvement is massive, especially for weird corner cases & bad weather. Pure vision, no radar.

Elon seems rather confident in the vision-only approach.


(Twitter user) Owen Sparks: "Does this mean you can remove the radar from production, or will it still be included as a backup?"

Elon Musk: "Remove".

Other than the radar bouncing under the car in front of you (to pick up the vehicle ahead of it), I think it mostly adds to the confusion of the car's perception. I believe it's the primary cause of phantom braking.

And "seeing the car in front of the car in front of you" is probably already being trained into FSD. So yeah, for 99.9% of cases, the radar seems to be dead weight and a hindrance.

My guess is radar will stick around for another 6 months to a year, then disappear once Tesla has enough data to support what they suspect.

Edit: Ultrasonics will remain for situations where obstructions are very close (e.g. in a garage) and vision cannot provide useful ranging information.

So many posts arguing whether radar is needed, helpful, a hindrance.

I don't understand the science behind all of this to have a strong opinion. But I know someone who does. And I trust him to decide.

Dropping radar is hard to accept for some (myself included). I think it's best to recognize that while adding radar ought to be a positive and makes sense in theory, if Tesla isn't seeing positive results out of the NN, it simply may not be better in practice. Vision from the 7? cameras might just be so good that the added complexity of fusing in radar doesn't make a tangible improvement.
 
My recollection is that the phantom braking problem was mostly caused by radar.
So radar might be great some of the time and give false positives at other times.
With 5 forward-facing cameras, the chances of all 5 being obscured at the same time are very remote.

So my guess is that the bird's-eye view uses everything any camera can currently see, plus past history and training. So the difference is no false positives, but perhaps missing the odd thing radar might catch earlier.

The car can always slow down and increase the follow distance in any situation where the birdseye view is less confident that it has the full picture.
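The "slow down and widen the gap when less confident" idea can be sketched as a trivial policy. This is purely illustrative — the function names, numbers, and linear scaling are my own assumptions, not anything from Tesla:

```python
# Toy sketch (not Tesla's code): scale target speed and follow distance
# by the perception system's confidence in its picture of the scene.

def plan_speed(base_speed_mps: float, confidence: float,
               min_fraction: float = 0.4) -> float:
    """Reduce target speed as perception confidence (0..1) drops."""
    confidence = max(0.0, min(1.0, confidence))
    fraction = min_fraction + (1.0 - min_fraction) * confidence
    return base_speed_mps * fraction

def plan_follow_distance(base_gap_s: float, confidence: float,
                         max_extra_s: float = 1.5) -> float:
    """Widen the time gap to the lead vehicle when confidence is low."""
    confidence = max(0.0, min(1.0, confidence))
    return base_gap_s + max_extra_s * (1.0 - confidence)

# Full confidence: drive normally. Half confidence: noticeably slower,
# with a wider following gap.
print(plan_speed(30.0, 1.0))           # 30.0
print(plan_speed(30.0, 0.5))           # roughly 21
print(plan_follow_distance(2.0, 0.5))  # 2.75
```

The point is only that a vision system's uncertainty can be turned into conservative behavior directly, rather than papered over with a second sensor.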
 
I’m just very curious how vision only might do with large stationary objects. Pulling into a garage, parking lot with lots of parked cars, low contrast areas, etc. I figure if something fills the field of view of those three front cameras you’ll lose definition on the distance between your car and the large object in front of you.

I guess ultrasonics can take over when close enough, but I never have much confidence in them, since cars and trucks always dance around in adjacent lanes and such. Maybe the ultrasonic data isn't being fed into the NN yet, and they could become much more powerful with the right software and refinement behind them?
 

VAG has been using radar for ACC since 2014 without issue, it's Musky's implementation in Tesla that's at issue.
 
It should be very hard for a large stationary object to look like drivable space.
Drivable space needs curbs, lane lines, other vehicles, or something similar that looks like a road.

I think those crashes into stationary trucks may have had more to do with radar.

If a truck is close enough to fill the field of view, the cameras should be able to see the number plate; if they can't, the car should slow down until they can.

If a truck cut in from an adjacent lane, the cameras would have seen it earlier.
If it is stationary and the car is approaching it from behind, surely it sees the number plate during the approach?

If a visual situation is good enough to fool FSD, it probably also fools many human drivers.
 
That could be true, all I know is radar was causing problems at Tesla.

VAG's ACC might not be an FSD solution; the driver may have to pay attention, so emergency braking in response to radar might not happen.

If the VAG solution does do emergency braking in response to radar, it would be interesting to know why it works better.
 

I'm suggesting large stationary objects like the side of a building or the concrete wall of a parking garage. If you're twenty or thirty feet away but pulling up to a parking space, will vision be accurate enough to judge the distance until the ultrasonics can take over? Maybe using the lines on the space, but being flat on the ground, those might be hard to use as a reference to actually judge distance. Will this break Summon more than it already is, etc.?

I can totally see turning radar off in "at speed" driving (highway, city driving), but is vision accurate enough in bumper-to-bumper traffic and other low-speed situations? Can it discern the difference between 8 feet and 12 feet?

The ultrasonics sound like they have a range of about 7 feet... I just want very good distance measurements in busy stop-and-go traffic and other close-range interactions...
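On the "8 feet vs. 12 feet" question, a back-of-envelope pinhole-camera calculation is instructive. All numbers here are my own assumptions (a generic 1000-pixel focal length, a 1.8 m car width), not Tesla's calibration — the point is only how the apparent size of a known-width object scales with range:

```python
# Pinhole model: an object of physical width W at distance d appears
# w_px = f_px * W / d pixels wide, so d = f_px * W / w_px.

def range_from_width(f_px: float, width_m: float, width_px: float) -> float:
    """Estimate distance to an object of known physical width."""
    return f_px * width_m / width_px

def range_error_per_pixel(f_px: float, width_m: float, d_m: float) -> float:
    """Approximate range error caused by a one-pixel width error."""
    return d_m * d_m / (f_px * width_m)

F_PX = 1000.0   # assumed focal length in pixels
CAR_W = 1.8     # typical car width, metres

for d in (2.44, 3.66):  # 8 ft and 12 ft, in metres
    w_px = F_PX * CAR_W / d
    print(f"{d:.2f} m -> {w_px:.0f} px wide, "
          f"~{range_error_per_pixel(F_PX, CAR_W, d) * 100:.1f} cm error/px")
```

Under these assumptions, a car at 8 ft vs. 12 ft differs by hundreds of pixels of apparent width, so that particular distinction is easy. The genuinely hard case is the one raised above: a featureless wall with no known-size object or texture to anchor the estimate.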
 
They probably use low-def maps and only encounter this problem in low-speed situations.
The car knows that it is in a parking area and would slow down.

Tesla has a lot of training data, they must have tested these scenarios.
If radar was useful in particular situations they would probably retain it.
 

Yep, you have to put a lot of faith in Musk... I just worry he sees it as plausible on the whole (highway and normal city driving) and then gets absolutely focused on "don't need it! This will work" before all the edge cases are determined. Then his team spends thousands if not tens of thousands of hours making it work in those cases.

Look at the other guys: they've largely caught up to Musk with "self driving" on freeways fairly quickly, especially for being large, risk-averse, established manufacturers. (Meaning I highly doubt they're telling engineers to try whatever they think works and then validating it fleet-wide for potential addition, etc.) Largely because they've slapped high-def maps and/or lidar into the cars. Could you imagine if Tesla had high-def maps on the freeways in addition to what they've developed in vision?

I guess I would be happier if Musk said something like "radar removal has been running in shadow mode for the last year and matches or exceeds current decisions." I also would be more open if he tossed in some comments about Summon and other components.

Can vision detect a shopping cart in the traffic lane when behind it is a bunch of bushes in the winter (so just bare sticks and stuff)? Though at the same time, would radar even return a hit on something like a shopping cart, which is fairly small and made up mostly of empty space?

I picture Musk having to fit square and round objects through something, and instead of designing his receiver with both a square and a round opening, he insists his team make it work with only one opening. So they come up with a way to melt down one of the objects, cast it into the other shape, machine and polish it back to final finish, and then pass it through the receiver... and sure, the receiver is simpler in their case, cheaper or with fewer parts... but they built a whole forging and finishing shop to get there...

He does deliver on crazy things though. Radar removal probably will work... I hope it just doesn’t make some “off normal” cases worse for the next couple years in the process.
 
This is vision in fog at night - it's normally a 60/70 mph road, but IIRC we were down to around 40 mph and about 50 yards of human visibility.

The cameras are pretty good at seeing in these conditions, but I'm not convinced they will be under the same conditions in daylight, or in rain/spray instead of fog, when radar would be pretty much unaffected. Of course, most other cars at present will be vision-limited, but not all. Still, it is reassuring that in poor conditions the car can currently probably 'see' better than my unaided (better than 20/20) vision.

As CAVs (connected autonomous vehicles) develop, I hope Tesla will not just be relying on computer vision for these situations. Those with additional sensors, V2V, etc., will cope much better, and Tesla may be relegated to the slow lane, literally, unless they adapt as well.

[Attached image: night-time camera view in fog]
 
It always blew my mind when cars totally failed emergency braking tests. Trying to augment (à la "sensor fusion") or patch radar's failings seems like wasted effort when there's so much information coming from the cameras. Radar-based cruise control is a legacy holdover whose time is up.


"Radar has low angular resolution, so it had only a crude idea of the environment around the vehicle. What radar is quite good at, however, is figuring out how fast objects are moving. And so a key strategy for making the technology work was to ignore anything that wasn't moving."

 
Theory is that Tesla will take slices in time. Sounds plausible.

My idea building on this - can't Tesla pulse the headlights and use this technique in place of LIDAR pulses?
I replied to this in another thread:
https://teslamotorsclub.com/tmc/posts/5453120/

The video presenter's explanation of "slices in time" is not credible. As for headlight pulsing, that sounds like strobe-based echo-location, which is possible in theory, but I'll leave it to you to think about the disadvantages.
 
Go on - hit me with those cons
Only if you give me some pros/cons too.

Cons:
  • Need for absolute darkness to distinguish between strobe return and ambient light
  • Open shutter with real-time photon-gathering is not feasible while actually moving
  • Processing of photon-time-return to build up a depth profile using non-coherent light from a diffuse source is impractical (which is why Lidar uses laser light)
  • Light return from fog likely to negate anything they are trying to achieve (hence why you don't use high beams in fog)
  • Insane epileptic effect of 100+W strobing to continue to accumulate data while driving
Perhaps I misspoke when I said strobe-based echo-location is entirely possible in theory.
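The basic physics behind those cons is easy to put numbers on. A time-of-flight sensor measures d = c·t/2, so depth resolution is set almost entirely by timing resolution — which is why lidar needs picosecond-class electronics and a coherent source, not a pulsed headlight. The distances below are arbitrary example values:

```python
# Time-of-flight arithmetic: round-trip time and depth resolution.

C = 299_792_458.0  # speed of light, m/s

def round_trip_time_s(distance_m: float) -> float:
    """Time for a light pulse to travel out to a target and back."""
    return 2.0 * distance_m / C

def depth_resolution_m(timing_resolution_s: float) -> float:
    """Depth uncertainty produced by a given timing uncertainty."""
    return C * timing_resolution_s / 2.0

print(f"{round_trip_time_s(50.0) * 1e9:.0f} ns round trip at 50 m")
print(f"{depth_resolution_m(1e-9) * 100:.0f} cm of depth per 1 ns of timing error")
```

A full nanosecond of timing error already costs about 15 cm of depth, and that is before the non-coherent, diffuse-source problems listed above even enter the picture.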
 
If Tesla is chasing the 9's in accuracy/safety, I don't understand why they wouldn't include radar, even if that means just one extra 9.

Because radar will catch a corner case somewhere, e.g. a zero-visibility pileup on the interstate that no vision camera will catch, but radar will see from a large enough distance to slow down in time.

Even if it means relying on the vision NN 100% of the time when visibility is better than OK, it should be there just to catch this one case.
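The "extra 9" argument is just probability arithmetic, and it is worth seeing the assumption it rests on. The sketch below assumes the two sensors fail independently — the most favourable possible case for fusion, and all rates here are hypothetical. The counterargument running through this thread is that radar also *adds* failures (false positives like phantom braking), so the real combined rate is worse than this:

```python
# Toy reliability arithmetic for sensor fusion under an independence
# assumption: the fused system misses a hazard only if both sensors miss it.

def combined_miss_rate(p_vision_miss: float, p_radar_miss: float) -> float:
    """Probability that both independent sensors miss the same hazard."""
    return p_vision_miss * p_radar_miss

# Hypothetical rates: vision misses 1 in 1,000 hazards, and radar
# independently misses only 1 in 10 of those -> the fused system
# misses ~1 in 10,000, i.e. one extra 9.
print(combined_miss_rate(1e-3, 0.1))
```

If radar instead misses most of what vision misses (plausible, since hard cases correlate), or if its false positives cause their own accidents, the extra 9 shrinks or goes negative — which seems to be Tesla's bet.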
 
If the trade is:

Vision only, and it's ready this year

or

Keep working on fusing a forward-only, relatively low-res radar with the 360-degree, real-time 30 fps 4D camera vision feeds... which might take years, if ever, to work well enough

then they may feel the amount of safety radar adds isn't worth waiting for, compared to how much safer "good" L4 vision-only would be than L2 or human driving.