
Article on why Tesla dumped radar

A Barron's article attempts to explain to investors why vision-only is better than sensor fusion. Apparently ~25% of TSLA's $650 stock price reflects expectations of big improvements in autonomous driving. I personally think it's a mistake to rely totally on vision, and I imagine the decision to remove radar is based on other factors.

Barron’s recently estimated that $100 to $200 of Tesla’s current stock price, at about $650 on Wednesday, was based on expectations the company would deliver higher self-driving functionality to drivers by year-end.
 
Will any of this keep my FSD-enabled car from going up onto curbs when using Smart Summon, or from almost scraping its entire side (if I had not intervened)? I keep hearing all of the fantastic things about AI and Dojo and supercomputing and teragigs of data used to train and blah blah blah. But yet the results on the streets...
 

Attachments: IMG_5694.jpg

@23:54 Andrej Karpathy explains why they are not going for sensor fusion.
Interesting video, very informative with technical details from testing, thanks! I notice he says that radar is responsible for false braking due to stationary overpasses, yet I have a 2015 MS with AP1 and radar and it has never braked due to overpasses, or for shadows. The emergency braking mentioned in the video also seems to work very well in my car, e.g. when a car in front slows waaay down to make a turn.
 
Let's see their data in inclement weather. I have a new radarless MY and it clearly isn't as good in emergency-braking situations as my outgoing Acura was unless the weather is very clear.

It has no clue in fog or anything more than a heavy rain. I am not talking about AP, but AEB. As for AP, the cursed auto high beams are ridiculous and I can't tell you how many times I have had oncoming drivers flash their lights at me. I am about ready to rename my car Mr. Magoo because of how blind it is at times.

I would buy more into the vision-only system if they had upgraded some of the other tech. I would think there are sensors that work better in low-light conditions than the current ones. This has been the biggest letdown of my 2021 MY LR: the not-ready-for-prime-time driver aids. So tired of AP and TACC being limited to 75 mph as well. Most of the people on the highways are running closer to 80.
 

DayTrippin:

Thank you for not incompetently misspelling 'braking'.

And word, on sensors. While I understand the rationale behind cameras only -- which are after all the main sensors you humans have -- augmented reality would make the cars 'superhuman' in perception.

I've supported cameras for a long time but now feel that cameras-only is going in the wrong direction. Radar can most certainly be improved, and I myself have been saved from an accident by it bouncing under the car in front to watch the one ahead of it.

And lidar, while not yet concealable, is light-years ahead of where it was only a couple of years ago.

True, the more chefs making the stew, the greater the likelihood of falsing. But pair side sensors with front sensors and use a voting/consensus system.
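To make the idea concrete, here's a minimal sketch of what such a voting scheme could look like; the sensor names, quorum, and distance threshold are all made up for illustration, not anything Tesla or anyone else actually ships:

```python
# Hypothetical majority-vote fusion: a detected object is accepted only
# if at least `quorum` independent sensors report something at roughly
# the same position. This trades a little sensitivity for fewer false
# positives ("falsing") from any single sensor.

def fuse_detections(detections_by_sensor, quorum=2, match_dist=1.0):
    """detections_by_sensor: dict mapping sensor name -> list of (x, y)
    object positions in a shared vehicle frame (units arbitrary here)."""
    votes = []  # each entry: [representative position, set of agreeing sensors]
    for sensor, detections in detections_by_sensor.items():
        for (x, y) in detections:
            for entry in votes:
                (vx, vy), sensors = entry
                if abs(vx - x) <= match_dist and abs(vy - y) <= match_dist:
                    sensors.add(sensor)  # close enough: counts as agreement
                    break
            else:
                votes.append([(x, y), {sensor}])  # new candidate object
    # Keep only objects that reached quorum across independent sensors.
    return [pos for pos, sensors in votes if len(sensors) >= quorum]

readings = {
    "front_camera": [(10.0, 0.2)],
    "front_radar":  [(10.3, 0.0)],   # agrees with the camera -> kept
    "side_radar":   [(50.0, 3.0)],   # seen by only one sensor -> rejected
}
print(fuse_detections(readings))  # -> [(10.0, 0.2)]
```

A real system would vote on tracked objects over time rather than raw single-frame detections, but the consensus principle is the same.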
 

I've supported cameras for a long time but now feel that cameras-only is going in the wrong direction. Radar can most certainly be improved, and I myself have been saved from an accident by it bouncing under the car in front to watch the one ahead of it.​

Fully agree with this. I’m an engineer working in a different field, but with some experience in ML. The arguments I hear in favor of camera-only autonomy generally are as follows:

1. Humans only have eyes, so why shouldn’t cars be able to drive with just cameras?
2. This is now just a software problem; the sensors aren't the hard part.
3. Teslas have more cameras than people have eyes, so they should reach super-human performance at driving tasks.

I find none of these arguments convincing, especially the idea that AI is just a software problem that needs some more data.

I think Tesla is headed towards a more feature-rich Level 2 ADAS that will still require human intervention at all times. Personally, I think that's the worst of both worlds, if not dangerous.

To get to autonomy, you need a system that can drive without human interaction in some geographically or weather-constrained scenarios. I don’t see that happening without multiple redundant sensing modalities.

Mobileye, for example, is training independent systems that drive using cameras and radar+lidar. This gives them the ability to hand over control if the cameras are faced with a situation where they can't perceive well. Still a hard problem, but at least it seems plausible.
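As a rough sketch of that handover idea (the confidence threshold, stack names, and takeover behavior are my own invention, not Mobileye's actual design):

```python
# Hypothetical arbitration between two independently trained perception
# stacks: prefer the camera system while it is confident, fall back to
# the radar+lidar system when it isn't, and request a human takeover
# only if neither stack trusts its own perception.

CAMERA_CONFIDENCE_FLOOR = 0.7  # invented threshold for illustration

def choose_stack(camera_confidence, radar_lidar_ok=True):
    if camera_confidence >= CAMERA_CONFIDENCE_FLOOR:
        return "camera"
    if radar_lidar_ok:
        return "radar+lidar"
    return "request_human_takeover"

print(choose_stack(0.95))        # clear day -> "camera"
print(choose_stack(0.4))         # fog/heavy rain -> "radar+lidar"
print(choose_stack(0.4, False))  # both degraded -> "request_human_takeover"
```

The point of training the stacks independently is exactly so that this kind of switch has something meaningful to fall back on: the failure modes of cameras (glare, fog, darkness) are largely uncorrelated with those of radar and lidar.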

I think Tesla is just pushing forward with the only path available to them that will keep the hype train moving.
 