No more radar means the car cannot see through fog?

But relying solely on vision seems to make it dependent on only one type of sensor for all decisions.
That's actually what Electrek said in a way:

As we previously reported, the idea of moving to only cameras using computer vision is that the only known system that can drive right now is the human brain. It relies on input from human eyes, which are closer to cameras than anything else. With cameras all around the vehicle with different fields of view, Tesla can achieve greater vision than humans, and the problem becomes only solving computer vision, which the automaker believes it is on the way to solving.

That's great if cameras CAN actually do everything well enough. They're good enough for humans because that's all we've got. It seems narrow-minded to assume that vision alone is the way to do it. As people have already stated in this thread, other tools for "seeing" the road have their own unique advantages. My guess is Tesla is tired of trying to reconcile the differing data from the different sensors and is just punting to cameras.

This short post has some good info on the different sensors and their pros/cons: What's Best for Autonomous Cars: LiDAR vs Radar vs Cameras
 
  • Like
Reactions: aesculus and jpk195
That's actually what Electrek said in a way:

As we previously reported, the idea of moving to only cameras using computer vision is that the only known system that can drive right now is the human brain. It relies on input from human eyes, which are closer to cameras than anything else. With cameras all around the vehicle with different fields of view, Tesla can achieve greater vision than humans, and the problem becomes only solving computer vision, which the automaker believes it is on the way to solving.

That's great if cameras CAN actually do everything well enough. They're good enough for humans because that's all we've got. It seems narrow-minded to assume that vision alone is the way to do it. As people have already stated in this thread, other tools for "seeing" the road have their own unique advantages. My guess is Tesla is tired of trying to reconcile the differing data from the different sensors and is just punting to cameras.

This short post has some good info on the different sensors and their pros/cons: What's Best for Autonomous Cars: LiDAR vs Radar vs Cameras
Thanks, will read up on radar and LIDAR
 
That's actually what Electrek said in a way:

As we previously reported, the idea of moving to only cameras using computer vision is that the only known system that can drive right now is the human brain. It relies on input from human eyes, which are closer to cameras than anything else. With cameras all around the vehicle with different fields of view, Tesla can achieve greater vision than humans, and the problem becomes only solving computer vision, which the automaker believes it is on the way to solving.

That's great if cameras CAN actually do everything well enough. They're good enough for humans because that's all we've got. It seems narrow-minded to assume that vision alone is the way to do it. As people have already stated in this thread, other tools for "seeing" the road have their own unique advantages. My guess is Tesla is tired of trying to reconcile the differing data from the different sensors and is just punting to cameras.

This short post has some good info on the different sensors and their pros/cons: What's Best for Autonomous Cars: LiDAR vs Radar vs Cameras
Agree - this is likely more about their data strategy than how inherently useful radar is/is not. They have a huge lead in terms of real-world camera data, so they are betting on that path. Regardless, no guarantee it will work, or that it is the optimal approach.
 
Last time I used AP1 in the fog/rain, it worked fine. I hope that doesn't go away.
AP is actually very useful in bad weather conditions, like slower highway travel in torrential rain. Did the invisible grey car without headlights that just passed cut in front of me? Did the car (that I can't see) in front of the car in front of me (that I can see) stop suddenly? AP detects and reacts, which helps without me having to take my eyes off the road.
(There are plenty of traffic pile-up videos out there showing why stopping by the side of the road isn't necessarily safer, BTW.)
Yet another reason to refuse updates for AP1 cars (so far my reason was: it seems to work most of the time, and I don't want some update to **** that up). I wish there were an option in the UI to "skip all future updates until I ask for one" instead of being bothered every time I put the car in Park, or to simply disable the "check for new updates" - perhaps it would tell Elon how many people don't want the updates he thinks are the best thing since sliced bread.
 
  • Like
Reactions: kavyboy and David29
Interesting discussion. Coincidentally, I watched a video just yesterday about the Sony EV prototype, which said the vehicle had a suite of something like 40 sensors with various technologies. I suppose that might mean that Sony is "trying everything" before they narrow it down to a cost-effective solution. (I doubt they will be offering a car for sale anytime soon, but it sounds as if they want to break into the market to supply electronic systems to other car makers.)
 
Theoretically, if humans rely solely on vision (two cameras able to swivel about 180 degrees), then with Tesla's cameras providing a continuous 360 degree view, Tesla Vision should be able to do at least as well as a human and probably better.

Clearly there are road conditions in which Tesla Vision will have difficulty operating - such as heavy precipitation or fog. But in those conditions, human drivers also have difficulty, usually requiring the driver to slow down or possibly stop until conditions improve - which is likely what the AP software will do as well. If the software can't see adequately, it will likely slow down - and possibly find a parking spot.

There are reports of Tesla test vehicles driving around with LIDAR. The speculation is that Tesla is considering adding LIDAR to new vehicles - which would be a significant change in direction, especially given that Tesla is removing radar from new vehicles. What's more likely is that Tesla is using the LIDAR on the test vehicles to calibrate Tesla Vision - comparing what Tesla's cameras and the AP imaging engine detect vs. what the LIDAR detects.
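To make the LIDAR-as-ground-truth idea concrete, the comparison could be as simple as projecting the LIDAR point cloud into the camera frame and scoring the vision network's depth output against it. A rough sketch of that scoring step (the array names and metrics are mine, not anything Tesla has published):

```python
# Hypothetical sketch: scoring vision depth estimates against LIDAR ground truth.
import numpy as np

def depth_error_stats(vision_depth_m: np.ndarray, lidar_depth_m: np.ndarray) -> dict:
    """Compare per-pixel depth from a vision network with LIDAR-projected depth.

    Both inputs are HxW arrays in meters; pixels with no LIDAR return are NaN.
    """
    valid = ~np.isnan(lidar_depth_m)
    err = vision_depth_m[valid] - lidar_depth_m[valid]
    return {
        "mean_abs_error_m": float(np.mean(np.abs(err))),
        "rmse_m": float(np.sqrt(np.mean(err ** 2))),
        # Relative error matters most at long range, where vision depth is weakest.
        "mean_rel_error": float(np.mean(np.abs(err) / lidar_depth_m[valid])),
    }
```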

Also, Tesla has access to data from every vehicle that has the AP2 and later sensor suite, even if the vehicle doesn't have the AP/FSD software activated. Tesla can use this large fleet of vehicles to quickly collect whatever data they need to help validate and improve the Vision software, even before the next FSD release is distributed. This remains a huge advantage Tesla has over all of its competition - which instead rely on a small number of test vehicles coupled with simulators.
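As a rough illustration of how that fleet loop might work, a car could run the new vision-only stack in shadow mode and flag only the moments where it disagrees with the shipping stack, uploading just those clips. This is a guess at the mechanism (the names and tolerances are mine), not Tesla's actual implementation:

```python
# Hypothetical shadow-mode trigger: upload a clip only when the candidate
# vision-only stack disagrees with the production stack by a large margin.
from dataclasses import dataclass

@dataclass
class LeadCarEstimate:
    distance_m: float      # estimated distance to the lead vehicle
    rel_speed_mps: float   # closing speed (negative = closing)

def should_upload_clip(production: LeadCarEstimate,
                       shadow: LeadCarEstimate,
                       dist_tol_m: float = 3.0,
                       speed_tol_mps: float = 2.0) -> bool:
    """Flag frames where the two stacks disagree enough to be worth a human look."""
    return (abs(production.distance_m - shadow.distance_m) > dist_tol_m
            or abs(production.rel_speed_mps - shadow.rel_speed_mps) > speed_tol_mps)
```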

Obviously, having more sensors would provide the AP/FSD software with much more data, allowing it to compare the results from different sensors to improve the quality of the image recognition. Relying only on camera data seems very risky - though if Tesla is able to succeed, they will significantly reduce the cost and complexity of the FSD systems.
 
Clearly there are road conditions in which Tesla Vision will have difficulty operating - such as heavy precipitation or fog. But in those conditions, human drivers also have difficulty, usually requiring the driver to slow down or possibly stop until conditions improve - which is likely what the AP software will do as well. If the software can't see adequately, it will likely slow down - and possibly find a parking spot.
It should slow down and activate as much exterior lighting as possible (including hazard flashers if speed has to drop below a certain point).

It should observe the behavior of other vehicles if necessary. If it sees human drivers pulling over, it should also pull over to the road shoulder if one exists. I don't think Tesla is set up for V2V communication.
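Something like that behavior could sit on top of a single visibility score from the perception stack. Purely illustrative - the thresholds and the idea of a single 0-1 "visibility" number are my assumptions, not any shipping Autopilot logic:

```python
# Illustrative degraded-visibility policy; not any shipping Autopilot logic.
# "visibility" is an assumed 0.0-1.0 confidence score from the perception stack.
def degraded_visibility_actions(visibility: float, speed_kph: float) -> list[str]:
    actions = []
    if visibility < 0.7:
        actions.append("reduce speed")
        actions.append("turn on headlights / fog lights")
    if visibility < 0.4 and speed_kph < 40:
        actions.append("enable hazard flashers")  # crawling well below traffic speed
    if visibility < 0.2:
        actions.append("alert the driver and pull over where a shoulder exists")
    return actions

print(degraded_visibility_actions(visibility=0.3, speed_kph=35))
# ['reduce speed', 'turn on headlights / fog lights', 'enable hazard flashers']
```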

Or it might just go into Dalek Panic Mode and alert the human behind the wheel with a scream:
 
  • Informative
Reactions: MorrisonHiker
An advanced autonomous driving system must also be able to handle responses to approaching emergency vehicles, as well as funeral processions and railroad crossings without signals and gates (crossbars only). There are probably other everyday driving occurrences that must be handled. Vision won't be the only thing needed.
 
A tree has a speed of zero because it is not moving at all, so radar ignores it. Most of the time that is safe, because no one is crazy enough to plant a tree right in the middle of the lane my car is moving along.
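For anyone wondering why a radar "ignores" a stopped object: a classic cruise-control radar filter discards returns with near-zero speed over the ground, because stationary clutter (signs, bridges, parked cars) would otherwise cause constant phantom braking. A toy version of that filter, with a made-up threshold:

```python
# Toy version of the stationary-target filtering described above.
def is_relevant_radar_target(target_ground_speed_mps: float,
                             min_moving_speed_mps: float = 1.0) -> bool:
    """Keep only targets that are moving over the ground.

    A tree, an overhead sign, or a stopped firetruck all have ~0 ground speed,
    so a filter like this drops them, which is exactly the risk discussed here.
    """
    return abs(target_ground_speed_mps) > min_moving_speed_mps

print(is_relevant_radar_target(0.0))    # False: stationary object gets ignored
print(is_relevant_radar_target(-12.0))  # True: oncoming or crossing traffic
```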

Maybe not a tree, but ....

Pole.jpg
 
All this talk about radar vs vision has me wondering if we've grown FSD expectations out of line with reality. I can't expect an automated car to be able to do things a human can't do, e.g. drive through a blizzard. I've seen Colorado road conditions so bad that no normal human would try to drive: white-out conditions, no visibility of any part of the road or of anything more than 4 ft in front of you. The highway patrol closes off the on-ramps to prevent the stupid from killing themselves or getting hopelessly stuck. All this talk about detecting things in extreme conditions seems crazy. I didn't think FSD would make my car superhuman. Self parking or Summon, on the other hand, seem very reasonable.
 
  • Like
Reactions: Jimmer
Yet another reason to refuse updates for AP1 cars (so far my reason was: it seems to work most of the time, and I don't want some update to **** that up). I wish there were an option in the UI to "skip all future updates until I ask for one" instead of being bothered every time I put the car in Park, or to simply disable the "check for new updates" - perhaps it would tell Elon how many people don't want the updates he thinks are the best thing since sliced bread.

I doubt very much they'd be making any changes to the core autopilot functionality of AP1 cars.
 
When it gets foggy where I live, my car says the radar can't see and Autopilot is currently unavailable, so I don't think much will be affected by losing the radar.
That message pops up when driving towards a setting sun.
So Elon's 'drive itself across the country' claim needs a disclaimer: only as long as it isn't foggy or too sunny on the chosen day. :rolleyes:
 
  • Like
Reactions: jpk195 and Platini
I doubt very much they'd be making any changes to the core autopilot functionality of AP1 cars.
I applaud your optimism. When MCU2 came out, my MCU1 car - a completely different processor and WiFi module - started advertising that it had a 5GHz WiFi radio built in (it took me a while to debug what was going on; I bought a WiFi sniffer just for that and had to create a dedicated 2.4GHz-only network for one of our Teslas), so they definitely share code. Going forward, radar may be removed from the code, or perhaps some new algorithm will start malfunctioning with radar present because Tesla will stop testing with radar present, just like they stopped testing many previous configurations they no longer produce. For me, I no longer accept updates in the MCU1 car because things are working better than after many updates in the past (I'd say speed and reliability are back to only a little worse than v7 days), I don't expect any new features I would want, and I doubt they test it much anymore, except on customers.
 
  • Like
Reactions: kavyboy
I guess the 2 car thing is moot, per this guy that Elon agrees with:
It's not just for reduced visibility. If you are looking for smelly butts, you know you can relax and not strain your vision when you don't smell one. Once you do smell one, you start being more vigilant. In AI terms, the smell helps quantify the level of confidence in what you're classifying. If you see a brown mass, your AI may decide it's 50% sure it's chocolate and 50% sure it's dung. Smell can definitely help sway the percentages to make a much more accurate assessment. There is a reason nature equipped humans with different senses, not just eyes around the head.
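The "smell" point is essentially a Bayesian update: a second, independent sense shifts the estimate even when the first one is a coin flip. Sticking with the 50/50 chocolate-vs-dung example (the smell likelihoods below are invented just to show the mechanics):

```python
# Bayesian fusion of two senses, using the 50/50 chocolate-vs-dung example above.
def fuse(prior_chocolate: float,
         p_smell_given_chocolate: float,
         p_smell_given_dung: float) -> float:
    """Return P(chocolate | this smell), given a prior from vision alone."""
    prior_dung = 1.0 - prior_chocolate
    evidence = (p_smell_given_chocolate * prior_chocolate
                + p_smell_given_dung * prior_dung)
    return p_smell_given_chocolate * prior_chocolate / evidence

# Vision alone: 50/50. If that smell is 20x more likely from dung than from
# chocolate, the fused estimate drops to about 4.8% chocolate.
print(fuse(prior_chocolate=0.5, p_smell_given_chocolate=0.02, p_smell_given_dung=0.4))
```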
 
An advanced autonomous driving system must also be able to handle responses to approaching emergency vehicles, as well as funeral processions and railroad crossings without signals and gates (crossbars only). There are probably other everyday driving occurrences that must be handled. Vision won't be the only thing needed.
This remains a failing of Tesla's AP strategy.

Drivers also have hearing, allowing them to detect things which are out of view - such as approaching emergency vehicles or trains - especially useful when approaching intersections or going around corners that limit visibility.

Another interesting challenge will be handling humans directing traffic with hand signals and/or a whistle. Will the FSD software be able to recognize this - including cases where the person directing traffic is not directly ahead and is using a whistle to get the driver's attention?

There are other "edge cases" that FSD must be able to handle. For example, while driving on highways, occasionally there will be wide-load vehicles with flags, possibly with escort vehicles ahead and behind the wide load. The wide-load vehicle may be in its lane, but the load itself may protrude, at least partly, into your lane. Will the FSD software be smart enough to recognize the warnings that a wide-load vehicle is present (flags, signs, signal lights) and detect not only where the vehicle is sitting on the pavement, but also where the edge of the load is, which will be wider than the base of the vehicle?

Even though radar doesn't have the accuracy of vision, it did provide additional data human drivers don't have. Several years ago, we were almost run off the road on a two-lane highway with tight turns and vegetation blocking the view of the road ahead. We came around a turn and encountered an 18-wheeler taking the turn at too high a speed, causing it to move into our lane. I had only a moment to find a place (with very little shoulder) to move and get out of its way. With radar, it's possible the software could see through the vegetation well enough to detect something approaching, and provide more time to prepare a response.

Without radar, Tesla is definitely taking a huge risk - but if they are successful, it will be a game changer... Though if they fail and have to add radar back to their vehicles, hopefully they've made it possible for radar to be retrofitted after the factory on the new 3/Y models being produced now.
 
  • Like
Reactions: MorrisonHiker
Without radar, Tesla is definitely taking a huge risk - but if they are successful, it will be a game changer... Though if they fail and have to add radar back to their vehicles, hopefully they've made it possible for radar to be retrofitted after the factory on the new 3/Y models being produced now.
I agree with this statement 100%. My concern, if they aren't successful, is how many lives could have been saved had the current setup of vision + radar been kept and optimized instead.
 
I applaud your optimism. When MCU2 came out, my MCU1 car - a completely different processor and WiFi module - started advertising that it had a 5GHz WiFi radio built in (it took me a while to debug what was going on; I bought a WiFi sniffer just for that and had to create a dedicated 2.4GHz-only network for one of our Teslas), so they definitely share code. Going forward, radar may be removed from the code, or perhaps some new algorithm will start malfunctioning with radar present because Tesla will stop testing with radar present, just like they stopped testing many previous configurations they no longer produce. For me, I no longer accept updates in the MCU1 car because things are working better than after many updates in the past (I'd say speed and reliability are back to only a little worse than v7 days), I don't expect any new features I would want, and I doubt they test it much anymore, except on customers.
MCU1/2 and AP1/2/3/4/... are radically different degrees of integration.

MCU1 is a tesla product running tesla code.

AP1 has one camera + radar and was produced by another company (Mobileye). It's basically a black box.

None of the work they've been doing against AP2 maps back to AP1 at all. They've tinkered with guardrails and UI but I'd be surprised if they had access to the whole AP engine and I'd be even more surprised if they backport *any* of their machine learning heuristics back into AP1.

I could be wrong, I'm just another dog on the internet....