Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

LIDAR

I'm looking to trade my 2013 Model S for a new one, mostly to get self-driving features. I'm concerned that Tesla is the only company developing self-driving NOT using LIDAR. Anyone know why Tesla thinks they can get the necessary information from more cameras instead of using LIDAR?
 
...why Tesla thinks they can get the necessary information from more cameras instead of using LIDAR?

“Once you solve cameras for vision, autonomy is solved; if you don’t solve vision, it’s not solved … You can absolutely be superhuman with just cameras.”

Tesla seems to think that the present sensors already do their job of detection very well, and that it's the job of software to make sense of that data.

Thus, Tesla asks: why would you want to add LIDAR when you haven't finished fine-tuning what you already have?
 
Whether you use a sequence of 2D images from cameras or a sequence of 3D point clouds from LIDAR, the problem remains the same.

Potential hazards still have to be identified and tracked: vehicles, pedestrians, traffic lights, road signs.

LIDAR just sounds cool and futuristic, but it doesn't simplify this task.
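To make that concrete: downstream code consumes "detections", and it doesn't care which sensor produced them. A minimal Python sketch (all names hypothetical, not anyone's actual stack):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A perceived hazard, regardless of which sensor produced it."""
    kind: str        # e.g. "vehicle", "pedestrian", "traffic_light", "road_sign"
    position: tuple  # (x, y) in metres, vehicle frame
    velocity: tuple  # (vx, vy) in metres per second

def detect_from_camera(image_sequence):
    """Turn 2D images into Detections. This is the hard part."""
    raise NotImplementedError

def detect_from_lidar(point_cloud_sequence):
    """Turn 3D point clouds into Detections. Equally hard: lidar
    points still have to be segmented and classified."""
    raise NotImplementedError

def track(detections):
    """The planner consumes Detections identically either way."""
    hazards = ("vehicle", "pedestrian", "traffic_light", "road_sign")
    return [d for d in detections if d.kind in hazards]
```

Either front end has to produce the same thing; swapping cameras for lidar moves the difficulty around rather than removing it.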
 
Whether you use a sequence of 2D images from cameras or a sequence of 3D point clouds from LIDAR, the problem remains the same.

Potential hazards still have to be identified and tracked: vehicles, pedestrians, traffic lights, road signs.

LIDAR just sounds cool and futuristic, but it doesn't simplify this task.

All the sensors in the world won’t help you if you can’t use the data from them intelligently.
 
LIDAR = Light Detection And Ranging
RADAR = Radio Detection And Ranging

Radio is a form of light, though. LIDAR's distinction is a much shorter wavelength, so it sees much smaller things, like raindrops, snowflakes and fog. Elon thinks this is a major disadvantage of LIDAR that will prevent its use in self-driving, hence Tesla does not think it is necessary. Time will tell.
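The wavelength point can be checked on the back of an envelope. Using typical figures (905 nm for automotive lidar, 77 GHz for automotive radar, a ~1 mm raindrop; all assumed values), a target around the size of the wavelength or larger scatters strongly:

```python
C = 299_792_458  # speed of light, m/s

lidar_wavelength = 905e-9     # 905 nm, a common automotive lidar band
radar_wavelength = C / 77e9   # 77 GHz automotive radar, roughly 3.9 mm

raindrop = 1e-3  # a typical raindrop diameter, ~1 mm

# Rule of thumb: objects comparable to or larger than the wavelength
# scatter strongly; much smaller objects are nearly transparent.
print(f"radar wavelength: {radar_wavelength * 1e3:.1f} mm")
print(f"raindrop vs lidar wavelength: {raindrop / lidar_wavelength:.0f}x larger")
print(f"raindrop vs radar wavelength: {raindrop / radar_wavelength:.2f}x")
```

A raindrop is about a thousand times the lidar wavelength (so lidar gets returns off rain) but only about a quarter of the radar wavelength (so radar barely notices it).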
 
You can absolutely be superhuman with cameras alone. No doubt about it.

But you can IMO be even better with 360-degree radar and Lidar redundancy, because radar will see through objects, and radar + Lidar help e.g. in darkness or when camera vision is otherwise restricted (e.g. blocked).
 
Radio is a form of light, though. LIDAR's distinction is a much shorter wavelength, so it sees much smaller things, like raindrops, snowflakes and fog. Elon thinks this is a major disadvantage of LIDAR that will prevent its use in self-driving, hence Tesla does not think it is necessary. Time will tell.

Well, one big problem is that Tesla does not have 360-degree radar either, only a very narrow front radar. Radar is great for seeing through objects, for example, which Teslas cannot do towards the sides or backwards (e.g. a fast-approaching car in the other lane behind the car on your tail).

As for Lidar, rain and snow can be filtered out computationally, so that is not really a problem for Lidar anymore. What is great about Lidar is that it can see in darkness and provide a security blanket for vision in a way that radar cannot...
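The rain/snow filtering mentioned here typically exploits the fact that precipitation returns are sparse and isolated in the point cloud. A brute-force sketch of the idea (real stacks use k-d trees, e.g. radius-outlier removal in PCL or Open3D):

```python
import math

def remove_sparse_returns(points, radius=0.5, min_neighbors=2):
    """Drop isolated lidar returns, which is what rain and snow
    mostly look like: keep a point only if enough other points
    lie within `radius` metres of it."""
    kept = []
    for i, p in enumerate(points):
        neighbors = sum(1 for j, q in enumerate(points)
                        if i != j and math.dist(p, q) <= radius)
        if neighbors >= min_neighbors:
            kept.append(p)
    return kept
```

A solid surface produces dense clusters of returns that survive the filter, while scattered precipitation hits are discarded.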

The question isn't vision vs. lidar. The question is Tesla's near-vision-only plan vs. others' plans with 360-degree vision, 360-degree radar and 360-degree lidar...

Tesla still has time to change plans, of course. We shall see if they'll stick to vision only.
 
Whether you use a sequence of 2D images from cameras or a sequence of 3D point clouds from LIDAR, the problem remains the same.

Potential hazards still have to be identified and tracked; vehicles, pedestrians, traffic lights, road signs.

LIDAR just sounds cool and futuristic, but it doesn't simplify this task.
That is not correct... with lidar you skip a few intermediate steps concerning depth of field, distance, size, etc.
This can be substituted by radar + optical to some degree, but fusing several sensors into the same picture is always problematic and error-prone, and the sheer resolution and viewing angle of the lidar sensor alone seem to be enough for every other manufacturer to deem it absolutely vital.
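The steps in question are mainly depth recovery: a lidar return gives range directly from time of flight, while cameras have to infer it, classically from stereo disparity via the pinhole formula Z = f * B / d. A toy comparison with illustrative numbers:

```python
C = 299_792_458  # speed of light, m/s

def lidar_range(round_trip_time_s):
    """Lidar: range falls straight out of the pulse's time of flight."""
    return C * round_trip_time_s / 2

def stereo_depth(disparity_px, focal_length_px, baseline_m):
    """Cameras: depth must be inferred, here via the pinhole stereo
    formula Z = f * B / d. Matching pixels between the two images
    is the extra, error-prone step that lidar skips."""
    return focal_length_px * baseline_m / disparity_px

print(lidar_range(200e-9))              # a pulse back after 200 ns: ~30 m
print(stereo_depth(10.0, 1000.0, 0.3))  # 10 px disparity, f = 1000 px, 30 cm baseline
```

Both yield about 30 m here, but the stereo number depends on correctly matching pixels between two images first.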

Guess we'll see, but in this matter I bet against Tesla.
 
That is not correct... with lidar you skip a few intermediate steps concerning depth of field, distance, size, etc.
This can be substituted by radar + optical to some degree, but fusing several sensors into the same picture is always problematic and error-prone, and the sheer resolution and viewing angle of the lidar sensor alone seem to be enough for every other manufacturer to deem it absolutely vital.

Guess we'll see, but in this matter I bet against Tesla.

I wonder if they all deem Lidar an absolutely vital primary sensor, though. For example, the first volume-production car to use Lidar, the new Audi A8, uses Lidar as a secondary sensor and EyeQ3 vision as the primary forward-looking one. These are complemented by five radars surrounding the car...

It might well be that vision is the primary source in the future, while Lidar provides a secondary image and radar a tertiary one. These are fused to form a comprehensive image of the surroundings, and then algorithms or deep learning are used to form a view on what to do if there is a discrepancy between them; if there isn't, there is a very high probability of things being a-okay...
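A toy illustration of that discrepancy logic, reduced to majority voting over object labels (real fusion operates on tracked objects with uncertainties, so this is purely a sketch):

```python
def fuse(vision, lidar, radar):
    """Keep objects confirmed by at least two of three sensors;
    flag the rest for the discrepancy logic to handle."""
    candidates = vision | lidar | radar
    confirmed = {o for o in candidates
                 if (o in vision) + (o in lidar) + (o in radar) >= 2}
    flagged = candidates - confirmed
    return confirmed, flagged

# A radar-only ghost return gets flagged rather than acted on:
confirmed, flagged = fuse({"car", "pedestrian"},
                          {"car", "pedestrian"},
                          {"car", "ghost"})
print(confirmed)  # {'car', 'pedestrian'} (set order may vary)
print(flagged)    # {'ghost'}
```

Agreement between independent sensors is what gives the "very high probability of things being a-okay"; a lone dissenting sensor triggers the harder decision-making path.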

I think Tesla's lack of 360-degree redundancy is the biggest issue, and where I might expect them to add more sensors. If they had 360-degree vision and 360-degree radar, I might be more willing to believe the current suite is a "final" one...
 
One other current problem with LIDAR is that they're big, clunky, expensive spinning contraptions. We will see solid state LIDAR in the near future, but we're not there just yet. Once we do, and once the price is down, I expect that Tesla will adopt it in addition to radar.
 
One other current problem with LIDAR is that they're big, clunky, expensive spinning contraptions.
uuuuum
[attached image of a compact lidar sensor]

Valeo LIDAR sensors as used in the A8 are already in the low three-digit range, and they're getting cheaper rapidly...
Cost and size are on their way out as arguments against this type of sensor. Of course it will be a long time until they're a dime a dozen like the cameras in the Tesla, but hardware costs of 2-3k for all AP hardware combined are definitely OK for the premium class at least.
 
uuuuum
[attached image of a compact lidar sensor]

Valeo LIDAR sensors as used in the A8 are already in the low three-digit range, and they're getting cheaper rapidly...
Cost and size are on their way out as arguments against this type of sensor. Of course it will be a long time until they're a dime a dozen like the cameras in the Tesla, but hardware costs of 2-3k for all AP hardware combined are definitely OK for the premium class at least.

The Valeo unit is a mechanical scanner. It's certainly an improvement on the big rooftop contraptions, but it's big and expensive by auto industry standards. Solid state scanners have the potential to bring the cost down to <$20 per unit. And that's what you need for widespread adoption.
 
  • Like
Reactions: zmarty
My car drives fine with just two organic cameras.
Many do nicely with only one, even as Airline Transport Pilots, although that does require a SODA (Statement of Demonstrated Ability): the presumption is that one cannot pilot successfully with only one eye, but since you can, you've proven you can overcome the impediment.

It seems to me all the arguments about the relative merits of sensor types and variety tend to ignore the obvious: the problem is really almost all about understanding, not perception. If we could have software good enough to replicate the best of a monocular human, we could have Level 5. If we cannot do that, we'll never have Level 5. More sensors seem inherently better, but they probably also make the understanding even harder.

We aren't there yet. I'm not holding my breath either.
 
The Valeo unit is a mechanical scanner. It's certainly an improvement on the big rooftop contraptions, but it's big and expensive by auto industry standards. Solid state scanners have the potential to bring the cost down to <$20 per unit. And that's what you need for widespread adoption.
I think it'll be as with all big new features. It will show up in the premium class as an absurdly expensive extra and trickle down to the middle and lower classes when the hardware becomes cheap enough.
 
This is just a love of larger numbers:

3D is assumed to be superior to 2D; 360 degrees is assumed to be superior to 20/20 vision.

Humans are great at driving simply because senses, brainpower and mechanical outputs are all well matched.

Giving drivers better vision, e.g. night-vision goggles or some sort of vision derived from radar, wouldn't make them automatically safer. Same for computers.

Our big problems are fatigue and distraction and simply driving too fast for the location/conditions.

Safer driving isn't about "seeing" more, it's about a better "balance"

If humans can drive at night, looking only forwards with our mark one eyeballs then so can robots looking ahead through cameras. They can be safer simply by not getting tired.

Radar is a useful addition to see through fog, but tbh, if you need to see through fog then you're probably driving too fast (particularly since AEB only reduces impact speed)
 
This is just a love of larger numbers:

3D is assumed to be superior to 2D; 360 degrees is assumed to be superior to 20/20 vision.

Humans are great at driving simply because senses, brainpower and mechanical outputs are all well matched.

Giving drivers better vision, e.g. night-vision goggles or some sort of vision derived from radar, wouldn't make them automatically safer. Same for computers.

Our big problems are fatigue and distraction and simply driving too fast for the location/conditions.

Safer driving isn't about "seeing" more, it's about a better "balance"

If humans can drive at night, looking only forwards with our mark one eyeballs then so can robots looking ahead through cameras. They can be safer simply by not getting tired.

Radar is a useful addition to see through fog, but tbh, if you need to see through fog then you're probably driving too fast (particularly since AEB only reduces impact speed)
As someone working in industrial facility automation, where I basically work 24/7 with all kinds of sensors, actuators and the software involved...

seriously, this bullshit comparison to human vision or human behavior has to stop. No software or hardware comes even close to how a human perceives their environment. That system is NOT directly translatable (yet?).
People arguing from "human sight" simply have no clue what they're talking about, or have seen too many sci-fi movies with weird AIs...

Bringing the human system into technical discussions has philosophical value at best...
 