
FSD rewrite will go out on Oct 20 to limited beta

I think Elon’s logic on LiDAR versus cameras goes like this:

What level of positional and size accuracy is needed for driving? Around 10cm

Within the next 3 to 5 years, given the pace of vision NNs, will cameras be able to achieve this accuracy? Yes

Therefore lidar is unnecessary.

I agree that is Elon’s logic. But if so, it seems like a bad one. It seems problematic to promise FSD based entirely on the assumption that your primary sensor will achieve the needed accuracy at some point in the future.

It assumes that camera vision will achieve 10cm accuracy in the next 3 to 5 years. That is not a guarantee. What if it takes a lot longer? What if it is not possible? It has already been 4 years since Tesla first promised FSD, and it certainly appears that it will take longer than 3-5 years.

It is also a gross oversimplification, since FSD requires more than positional accuracy.

It also ignores the fact that even when you do achieve this accuracy, you have not solved FSD, since you still need prediction, planning, and driving policy. So achieving this accuracy with camera vision does not automatically give you FSD.

Lastly, we can already achieve better accuracy with lidar. Companies like Waymo and Cruise are even deploying driverless cars on public roads now. So why would we wait 3-5 years or more until camera vision maybe achieves that accuracy? Why not just go with the solution that we know works now?
 
I don't know, you're asking if Tesla can design cars?

Anyway, the point is that lidar was an option, was considered, and rejected.

Tesla made the right choice in passing on the lidars that were available in 2016. Maybe the Scala 1 might have been available then, but it was likely not ready, let alone ready at the volume Tesla needed, and even if it was, it would not have added that much value.
 
You can't be serious. The long-range LIDARs available TODAY are completely out of the practical price range for a Model S. How many of the $75,000 Velodyne LIDARs would Tesla have put on each Model S in 2016?

Whereas today, there is lidar available ranging from $100 to $1,000, all of which would add value to Tesla Autopilot / FSD... the $100 one would add a small amount of value and the $1,000 one would add much more... obviously.
 
I think it's simpler than all this - Elon looked around and realized humans drive just fine using vision alone. So he bet it all that a vision-based system will win in the end.

I'm not convinced that the monocular cameras that are on the cars will achieve what he wants though. No sense of depth perception unless an object is in the field of view of two cameras, or the object is moving across a camera's FoV.
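To put numbers on why two overlapping cameras only buy you depth up close, here's a toy sketch of the standard stereo relation. The focal length and camera spacing below are made-up assumptions, not Tesla's actual specs:

```python
# Toy depth-from-disparity calculation under a pinhole stereo model.
# Focal length and baseline are illustrative assumptions, not Tesla specs.
FOCAL_PX = 1000.0    # focal length in pixels (assumed)
BASELINE_M = 0.15    # spacing between the two cameras in meters (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Pinhole stereo: Z = f * B / d, so depth blows up as disparity -> 0."""
    return FOCAL_PX * BASELINE_M / disparity_px

# With these numbers a target at 150 m subtends only 1 px of disparity,
# so a half-pixel matching error swings the estimate from 100 m to 300 m.
for d in (10.0, 2.0, 1.0, 0.5):
    print(f"disparity {d:4.1f} px -> depth {depth_from_disparity(d):6.1f} m")
```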
 
You can't be serious. The long-range LIDARs available TODAY are completely out of the practical price range for a Model S. How many of the $75,000 Velodyne LIDARs would Tesla have put on each Model S in 2016?
Also... the $75,000 price is a joke, and that is not a production lidar... furthermore, just adding a forward-facing-only lidar adds a ton of value.
 
I think it's simpler than all this - Elon looked around and realized humans drive just fine using vision alone. So he bet it all that a vision-based system will win in the end.

I'm not convinced that the monocular cameras that are on the cars will achieve what he wants though. No sense of depth perception unless an object is in the field of view of two cameras, or the object is moving across a camera's FoV.

You can do depth perception with just one camera but it requires some complex neural networks and computing power. And I am unsure how accurate it is.
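For anyone curious what that looks like in practice, the open-source MiDaS model does single-image depth estimation; the usage below follows its README (the input file name is just a placeholder). Note the output is relative inverse depth, not metric distance, which is exactly where the accuracy question bites:

```python
import cv2
import torch

# Load the small MiDaS monocular depth model and its matching transform
# from torch hub, per the MiDaS README.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
transforms = torch.hub.load("intel-isl/MiDaS", "transforms")
transform = transforms.small_transform

# Placeholder input frame (hypothetical file name)
img = cv2.cvtColor(cv2.imread("dashcam_frame.jpg"), cv2.COLOR_BGR2RGB)

with torch.no_grad():
    prediction = midas(transform(img))
    # Upsample the prediction back to the input resolution
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1),
        size=img.shape[:2],
        mode="bicubic",
        align_corners=False,
    ).squeeze()

# Larger values = closer. Converting this to metric distance requires
# extra calibration, which is where the accuracy question comes in.
print(depth.shape, float(depth.min()), float(depth.max()))
```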
 
I think Elon’s logic on LiDAR versus cameras goes like this:

What level of positional and size accuracy is needed for driving? Around 10cm

Within the next 3 to 5 years, given the pace of vision NNs, will cameras be able to achieve this accuracy? Yes

Therefore lidar is unnecessary.

One thing often overlooked with Lidar is that it is an active system (it emits signals rather than just passively receiving them), and as such has potential for cross-interference between cars. When a large number of cars have lidar you are going to get a lot of crosstalk and signal interference, which doesn't help accuracy. And yes, radar has the same potential problems, but the nature of the frequency ranges and accuracy goals tends to mitigate this.
 
One thing often overlooked with Lidar is that it is an active system (it emits signals rather than just passively receiving them), and as such has potential for cross-interference between cars. When a large number of cars have lidar you are going to get a lot of crosstalk and signal interference, which doesn't help accuracy. And yes, radar has the same potential problems, but the nature of the frequency ranges and accuracy goals tends to mitigate this.
It's overlooked because it's not a big issue.
 
One thing often overlooked with Lidar is that it is an active system (it emits signals rather than just passively receiving them), and as such has potential for cross-interference between cars. When a large number of cars have lidar you are going to get a lot of crosstalk and signal interference, which doesn't help accuracy. And yes, radar has the same potential problems, but the nature of the frequency ranges and accuracy goals tends to mitigate this.

Look at the upcoming FMCW lidar and radar. I don't think Waymo intends to run their cars at highway speed with current lidar because of range limitations. Current lidar also has the same sun flare problem as Tesla's cameras. FMCW lidar uses 1550 nm, which is largely not present in solar radiation.
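For reference, the reason FMCW is interesting is that mixing the return against the frequency-ramped transmit signal yields both range and radial velocity from one measurement. A toy sketch with assumed chirp parameters (not any specific sensor's spec):

```python
C = 3.0e8             # speed of light, m/s
F_CARRIER = 193.4e12  # optical carrier frequency for 1550 nm light, Hz
CHIRP_SLOPE = 1.0e14  # ramp rate, Hz/s (assumed: 1 GHz swept over 10 us)

def fmcw_range_velocity(f_beat_up: float, f_beat_down: float):
    """Triangular-chirp FMCW: the up-chirp beat is (range beat - Doppler),
    the down-chirp beat is (range beat + Doppler); solve for both."""
    f_range = (f_beat_up + f_beat_down) / 2
    f_doppler = (f_beat_down - f_beat_up) / 2
    rng = C * f_range / (2 * CHIRP_SLOPE)  # beat freq <- round-trip delay
    vel = C * f_doppler / (2 * F_CARRIER)  # Doppler shift -> closing speed
    return rng, vel

# A target at 150 m closing at 30 m/s gives beats of ~61.3 and ~138.7 MHz:
print(fmcw_range_velocity(61.3e6, 138.7e6))  # -> (~150.0 m, ~30.0 m/s)
```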
 
Look at the upcoming FMCW lidar and radar. I don't think Waymo intends to run their cars at highway speed with current lidar because of range limitations. Current lidar also has the same sun flare problem as Tesla's cameras. FMCW lidar uses 1550 nm, which is largely not present in solar radiation.

Waymo mentions their 5th generation has a range greater than 300 m and can identify objects when driving directly into the sun on the brightest days:

"As one of the Waymo Driver's most powerful sensors, lidar paints a 3D picture of its surroundings, allowing us to measure the size and distance of objects around our vehicle, whether they're up close or over 300 meters away. Lidar data can be used to identify objects driving into the sun on the brightest days as well as on moonless nights."
Waypoint - The official Waymo blog: Introducing the 5th-generation Waymo Driver: Informed by experience, designed for scale, engineered to tackle more environments
 
Waymo mentions their 5th generation has a range greater than 300 m and can identify objects when driving directly into the sun on the brightest days:

"As one of the Waymo Driver's most powerful sensors, lidar paints a 3D picture of its surroundings, allowing us to measure the size and distance of objects around our vehicle, whether they're up close or over 300 meters away. Lidar data can be used to identify objects driving into the sun on the brightest days as well as on moonless nights."
Waypoint - The official Waymo blog: Introducing the 5th-generation Waymo Driver: Informed by experience, designed for scale, engineered to tackle more environments

It’d be interesting to see how well that lidar works in some Midwestern lake-effect snow bands.
 
I think it's simpler than all this - Elon looked around and realized humans drive just fine using vision alone. So he bet it all that a vision-based system will win in the end.

I'm not convinced that the monocular cameras that are on the cars will achieve what he wants though. No sense of depth perception unless an object is in the field of view of two cameras, or the object is moving across a camera's FoV.

It’s actually a misconception that you use both eyes for depth perception at distance. Binocular depth perception is for up close; for things further away at driving distances, our brain uses monocular cues, i.e. shadows and relative object size.

Driving with one eye feels uncomfortable if you are used to using both, but I do have plenty of patients who are driving just fine with good vision in one eye (taking extra precautions to scan more in the area of lost peripheral vision of course).
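That "relative object size" cue is simple enough to write down: once you know what an object is (and thus roughly its real size), one camera gives you distance from apparent size. A toy pinhole sketch with made-up numbers:

```python
# Monocular "known object size" distance cue under a pinhole camera model.
FOCAL_PX = 1000.0  # focal length in pixels (illustrative assumption)

def distance_from_known_size(real_height_m: float, apparent_height_px: float) -> float:
    """Pinhole projection: h_px = f * H / Z, so Z = f * H / h_px."""
    return FOCAL_PX * real_height_m / apparent_height_px

# A ~1.5 m tall car spanning 30 px is about 50 m away;
# the same car at 10 px is about 150 m away.
print(distance_from_known_size(1.5, 30.0))  # 50.0
print(distance_from_known_size(1.5, 10.0))  # 150.0
```

Of course, this only works if you identify the object and its true size correctly.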
 
It’s actually a misconception that you use both eyes for depth perception at distance. Binocular depth perception is for up close; for things further away at driving distances, our brain uses monocular cues, i.e. shadows and relative object size.

Driving with one eye feels uncomfortable if you are used to using both, but I do have plenty of patients who are driving just fine with good vision in one eye (taking extra precautions to scan more in the area of lost peripheral vision of course).

Yes, the angle difference between the two eyes/cameras is too small to estimate size at a distance. I assume humans, as well as Tesla, use object ID to estimate distance. The list of what Tesla must accomplish in software compared to competitors is daunting. The newest lidars give not only distance but velocity. The newest car radars can sense stationary objects.

We fool ourselves about distance, of course. How do people determine the size of a UFO in the air at a distance? There is usually no reference to estimate size. "It was two football fields in size." Really, was it next to a football field?
 
Yes, the angle difference between the two eyes/cameras is too small to estimate size at a distance. I assume humans, as well as Tesla, use object ID to estimate distance. The list of what Tesla must accomplish in software compared to competitors is daunting. The newest lidars give not only distance but velocity. The newest car radars can sense stationary objects.

We fool ourselves about distance, of course. How do people determine the size of a UFO in the air at a distance? There is usually no reference to estimate size. "It was two football fields in size." Really, was it next to a football field?

Well, yes, in the vacuum of the sky without reference objects we aren’t good at estimating distance.

But on the ground, I would say we are pretty good. And it makes sense: if we weren’t good at knowing whether something was far or close, we would have been dead back in the caveman days.
 
I have asked this question a few times whenever I see LIDAR being mentioned. Please, someone tell me: how much power does a 360° LIDAR that can see 300 m away without interference consume?

I have a mini LIDAR that can only properly see 8 m away at 512x512 resolution, and it consumes 12 W.

Would Tesla be able to accelerate the switch to sustainable energy with a sensor suite that consumes maybe 10-20x more?

I would be willing to bet that the Waymo sensor suite & processing must consume at least 1,000 W vs ~100 W for Tesla’s FSD solution. I would imagine the aerodynamic drag caused by the large sensor horns would also greatly diminish efficiency on the highway.

So can someone confirm how much power these long-range LIDAR systems consume? Processing point clouds from a LIDAR is extremely expensive, and matching LIDAR data with RGB data is also computationally expensive. My 512x512 point cloud can at times overwhelm a 6-core i7 with a 2080 Ti GPU when trying to reconstruct it into a 3D object. You need neural networks to break that data down in a more useful way. LIDAR for FSD is a brute-force approach, and if you don’t figure out the RGB NN stuff you’re kinda screwed long term anyway.
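To put rough numbers on the data problem: even my little sensor produces a hefty stream. Back-of-envelope, with an assumed frame rate and bytes-per-point:

```python
# Back-of-envelope point cloud throughput. Frame rate and bytes/point
# are illustrative assumptions, not a specific sensor's spec.
POINTS_PER_FRAME = 512 * 512  # the mini lidar's resolution
FPS = 30                      # assumed frame rate
BYTES_PER_POINT = 16          # x, y, z, intensity as four floats (assumed)

points_per_sec = POINTS_PER_FRAME * FPS
mb_per_sec = points_per_sec * BYTES_PER_POINT / 1e6

print(f"{points_per_sec / 1e6:.1f} M points/s, {mb_per_sec:.0f} MB/s raw")
# ~7.9 M points/s and ~126 MB/s before you even start fusing with RGB,
# which is why reconstruction can choke a 6-core i7 + 2080 Ti.
```

A 300 m automotive unit multiplies nearly every term in that calculation.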
 