Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla vision only - no radar from 2022 Q2 cars

One of the things that I love about Tesla is that they seem to be holding true to the ethos that simplification of the modern car is a must, especially in areas where it provides a net positive for the owner and the planet. If there is a strong enough design or engineering argument for something, they try it. This is the opposite of what the big players have been doing for decades.

I'm all for them removing radar completely if it means they can redouble their efforts on vision.

A quick thought: removing radar doesn't just remove the radar unit itself. It removes a slew of other components and dependencies, both physical and software: the radar heater element, the wiring, the mounting components, the space in the bumper moulding, the space and time for the component on the factory floor, and so on. The list goes on, just for radar. And that's not even touching on the logistics and testing.
 
I don't think anyone questions that, but it's the IFs in the statement:

IF it can be made as good or better

and

IF it's ready now

The former, time will tell; the latter is the one that frustrates people the most. You can't remove radar, rendering the dependent safety systems defunct, until you have worked out how to do without it, assuming that is even possible (auto wipers being an example). That's just reckless, especially with safety systems. Developing a better system using vision first, then making the switch, is the only responsible way to go. That's not what they have done in the US, and the lack of the safety stats they used to provide makes me question whether that's still the case.
 
I agree, and I think we can agree that all of this is what-ifs until they do it, or don't.

It's only a problem if it proves to be a problem, and it's only great if it proves to be great.

I would be highly surprised if they blundered into this. A quiet reversal would have happened by now if there were any doubts, because it's a big deal. Just my two cents, though.
 
One of the things that I love about Tesla is that they seem to be holding true to the ethos that the simplification of the modern car is a must. Especially in areas where it actually provides a net positive for the owner and the planet.
Seriously?! It’s about £. Always about £. Nothing at all about the planet etc.
 
Maybe 'planet' was too broad. They're committed to replacing the ICE as the de facto means of transportation for most people.

Part of my point, really: they're a huge business, so why would they blunder into such a big change, one that will get picked up and talked about everywhere? They care about their stock price.
 
One of the things that I love about Tesla is that they seem to be holding true to the ethos that the simplification of the modern car is a must
One could argue that removing tried-and-tested radar for cruise control, and the IR sensors for rain, in order to reinvent the wheel using video alone is actually complicating matters rather than simplifying them.
 
This was my thinking exactly. Replace a cheap sensor that works well with a software maintenance headache for the lifetime of the product. Good decision.
 
The assumption here is that radar works perfectly and has no issues. If you watch the video I posted above, you'll see that radar introduces problems that aren't there with vision. And with only two sensor types you can never be sure which one is right. There's an interesting article here which also includes an image showing how poor the radar's resolution is.

This is not just a Tesla problem. We had a Jaguar with radar that would alert at this junction every time there was a car waiting to join the main carriageway:
[screenshot of the junction]
Quite often it would slam on the brakes, but in those days we didn't call it phantom braking. Oh, and our Tesla never alerts here. :)
 
I understand why they want to take away the radar sensors, as humans don't have radar either, but they need to really get on top of the issues this has caused in the US. They have real problems with the vehicles not recognising things properly, and lots of phantom braking events now.
 
I don't understand this argument that "humans do it all with eyes, so the car can do it with cameras".
Humans crash, a lot, sometimes because they failed to see stuff, so it's not as if we are a great model to emulate. The evolution of human eyesight was not driven by driving cars, so who (including nature) is saying it is the best way to manage a vehicle? It was just the best sense we had available for the job. But it's not the only one we have, is it? We use some of the others when we drive, particularly hearing. How often have you reacted while driving to something you heard rather than saw? Does the car have microphones as part of its FSD system? I don't think so, but please correct me if I'm wrong.
I am not saying it is not possible to do FSD on vision only, but the fact that humans do it is not, to me, a cogent argument.
 
Humans crash, a lot, sometimes because they failed to see stuff

Yeah, but the car is looking every which way, all the time. Loads of cameras, etc. My two eyes and head-swivel don't work like that!

Example: there is a video of a car stopped in traffic (on AP). It is hit from behind by someone who failed to slow down. By the time of the impact, AP had steered into the oncoming lane, to avoid shunting the car in front, because it already knew that lane was clear. In my case, if I had seen the car coming in my rear-view mirror and computed that impact was inevitable, I very much doubt I would have had the presence of mind, or the time, to check the oncoming traffic lane and also steer into it.

Of the FSD Beta videos I have seen, I think the most impressive thing is the approach to a crossroads (where the car has right of way). The instant the roads to left and right become visible, it shows the wireframe for every vehicle parked or moving in both directions. Any crossing traffic travelling at high speed, and thus unable to stop in time, would be something FSD could react to. If I was driving manually, I reckon I would fail to avoid someone jumping the lights in almost every scenario.
 
I am not saying it is not possible to do FSD on vision only but the fact that humans do it is not to me a cogent argument.
That's not an argument I've heard for vision, but there's got to be some reason that all the self-driving tech uses vision (and radar is a form of vision - just a different wavelength). In the article I linked to above, they make the point that seeing things and understanding what they are is the crux of the problem. Seeing is easy, understanding what the items are is more difficult and won't be fixed by adding more sensors.
 
I am not saying it is not possible to do FSD on vision only but the fact that humans do it is not to me a cogent argument.
It is the argument Tesla use. If they can do it without LiDAR, then that's a lot of money saved, as even basic LiDAR units cost a few hundred each.
 
Radar isn't really just vision on a different wavelength: radar sends out a signal and measures its return time to give distance.
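To make the round-trip point concrete, here's a minimal sketch of radar-style time-of-flight ranging (the numbers are illustrative, not taken from any real radar unit):

```python
# Time-of-flight ranging: a radar pulse travels out, reflects, and returns.
# Distance = (speed of light * round-trip time) / 2 -- the division by two
# is because the measured time covers the path out AND back.

C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(round_trip_s: float) -> float:
    """Distance to a target given the pulse's out-and-back travel time."""
    return C * round_trip_s / 2.0

# A target 50 m away gives a round trip of roughly a third of a microsecond.
round_trip = 2 * 50.0 / C
print(range_from_round_trip(round_trip))
```

The point is that the radar measures distance directly, rather than inferring it from image content.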

We also have two eyes to create a sense of depth; we allow one-eyed people to drive, but we know their depth perception is compromised. Tesla don't create depth from binocular vision, from what I've seen.
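For what it's worth, binocular depth comes from disparity: the same feature lands at slightly different horizontal positions in the two images, and similar triangles give Z = f * B / d. A toy sketch, with made-up focal length and baseline figures purely for illustration:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth via similar triangles: Z = f * B / d.

    focal_px:     focal length expressed in pixels
    baseline_m:   separation between the two cameras in metres
    disparity_px: horizontal shift of the same feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (feature must shift between views)")
    return focal_px * baseline_m / disparity_px

# e.g. a 1000 px focal length and a 1.2 m baseline: a 24 px disparity
# puts the feature 50 m away; halve the disparity and depth doubles.
print(stereo_depth(1000.0, 1.2, 24.0))
```

Note how depth resolution degrades with distance: far objects produce tiny disparities, so a wide baseline (such as camera-per-A-pillar) helps.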

Tesla say they dropped radar because it was an unreliable input that couldn't be reconciled with the image. At the event where they presented their latest work, they also said they had trouble reconciling the different camera inputs where the images overlap: the individual camera feeds each identified cars, but the direction, size, distance, etc. were all slightly different for what they believed was the same car. They used a model to resolve those differences and get a best fit from the various feeds. So even with vision alone there isn't a 100% reliable view of the world, only a best fit of what the cameras, added together, say. To me, Tesla could include the radar data in that model and train accordingly, or, if the radar is unreliable in its accuracy, start fitting a better radar; radar gives depth much better than vision does.

Since the Tesla AP hardware came out, they've changed the processing capability three times (making four versions in total) and changed some of the cameras; they've also restarted the software approach a number of times, so it should be no surprise that the sensor suite as a whole needs updating. These cameras and the radar were specified in 2016, six years ago. How much better are phone cameras, and even the lidar on phones, now compared with your phone six years ago?
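The "best fit" reconciliation described above can be caricatured with an inverse-variance weighted average: each feed's estimate counts for more the more reliable it is, and a radar range could in principle be just one more weighted input. Tesla's actual model is learned; this is only a toy illustration of why fusing disagreeing sensors beats discarding one:

```python
def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of independent measurements of the
    same quantity (e.g. several cameras' distance estimates for one car).

    Returns the fused estimate and its variance. The fused variance is
    smaller than any individual one, which is the statistical argument
    for combining sensors rather than throwing one away."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    return fused, 1.0 / total

# Three camera feeds disagree slightly about one car's distance (metres),
# with the first feed trusted most (lowest variance):
fused, var = fuse_estimates([49.0, 51.0, 50.5], [1.0, 4.0, 2.0])
print(round(fused, 2), round(var, 3))
```

The fused answer sits nearest the most-trusted feed, and its uncertainty is lower than any single feed's.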

For me, vision-only would need a camera at the top of each A-pillar. That would give great stereo vision for depth, the ability to see further ahead than the centre cameras do, and less susceptibility to total failure from splashed water, mud, etc., and it would be genuinely better than human vision. But nobody seems to be doing that.