
Possible scientific vindication of using just cameras, no lidar

You guys miss my point. Sure, the front radar on AP2 is wide enough to see cars in the lanes ahead driving in the same direction. It is not wide enough to see cross-traffic, nor rear cross-traffic. It will not see a car approaching from the side when a camera misses it, say, due to weather.

The Germans, for example, commonly use 5-6 radars to get near-360° radar coverage around the car. This requires radars in at least each corner.
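
To put rough numbers on the coverage point, here's a quick Python sketch - the mounting angles and fields of view are made-up illustrative values, not actual Tesla or German-OEM radar specs:

    # One wide forward radar vs. a front-plus-four-corner layout (all values invented).
    SINGLE_FRONT_RADAR = [(0, 90)]                       # (boresight deg, horizontal FoV deg)
    FIVE_RADAR_LAYOUT = [(0, 90), (45, 120), (-45, 120),
                         (135, 120), (-135, 120)]

    def covered(sensors, bearing_deg):
        """True if any sensor's horizontal FoV contains the given bearing around the car."""
        for boresight, fov in sensors:
            diff = (bearing_deg - boresight + 180) % 360 - 180  # shortest angular distance
            if abs(diff) <= fov / 2:
                return True
        return False

    def coverage_fraction(sensors):
        return sum(covered(sensors, b) for b in range(-180, 180)) / 360

    print(f"single front radar: {coverage_fraction(SINGLE_FRONT_RADAR):.0%}")  # ~25%
    print(f"five-radar layout : {coverage_fraction(FIVE_RADAR_LAYOUT):.0%}")   # 100%

The point the numbers make: a single forward radar covers roughly a quarter of the bearings around the car, so side and rear approaches have to be caught by something else.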
 
(at least for those of us who do not irrationally demand that robot drivers be better than humans).

The weather question is what mostly troubles me. A speck of dirt on a camera would create a blind spot if the system is not redundant in some manner. Even Tesla has seen the need for redundancy in the front (multi-camera plus radar). A human can go out and clean the camera, or just move their head...

IMO it is a weakness of the suite to not have redundancy elsewhere. Even dismissing lidar, 360-degree radar at least would have been great.
 
The AP2/2.5 system is not going to be all-weather. That's just obvious reality. It will also require some new habits, of course. I for one will carry a microfiber cloth in my glovebox and make a habit of doing a 30-second walkaround and wipe of the cameras before setting off on a drive. I'm a private pilot - before every flight we are taught to do a fixed pre-flight check that tests mechanical systems for faults before takeoff. Checking/wiping lenses will be very similar and almost no work.

As for the theoretical speck of dirt blinding the lenses - it could happen. However again, for safety that statistically exceeds humans this is not likely to be an obstacle. To obscure forward vision you need that dirt on the front windshield high off the ground - where it is hard for road dirt to accumulate anyway. These cameras are also covered by the windshield wipers. In a true dirt-pocalypse you have radar and side cameras and should be able to at least set off alarms and start slowing the car/pulling over to stop.
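
A hedged sketch of that fallback idea - the sensor names, confidence numbers and thresholds below are invented for illustration, not how Autopilot actually decides anything:

    def degraded_mode_action(front_camera_confidence, radar_ok, side_cameras_ok):
        """Pick a fallback behaviour when forward vision degrades (toy logic)."""
        if front_camera_confidence > 0.8:
            return "normal operation"
        if radar_ok or side_cameras_ok:
            return "alert driver, shed speed, prepare to pull over"
        return "request immediate takeover, controlled stop"

    print(degraded_mode_action(0.95, True, True))    # normal operation
    print(degraded_mode_action(0.40, True, True))    # alert driver, shed speed, ...
    print(degraded_mode_action(0.40, False, False))  # request immediate takeover, ...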

Level 5? I have never claimed AP2/2.5 will get there. I think Level 4 with some geographic and weather restrictions is where we will end up. Or maybe it will just be a very robust Level 3 after a couple more years of development. I think of course like everyone else that more sensors are coming in time for future Teslas - as well as some kind of cleaning system for camera lenses.
 
And @AnxietyRanger - since you like to tell people whether or not their conversation topics are interesting, here's judgment on one of your favorites: redundancy. It's not interesting. It's obvious that redundancy is more desirable. It's also obvious that it costs money and development time (and in fact may hinder AI development in the short term because more sensors have to be fused).

Again to draw an airplane analogy - crap costs money mon. You want redundant inertial navigation computers for your autopilot in your Cessna? Want to lower the odds that you go t*ts up on final in minimums cause you lost the glidepath on your GPS approach due to a hardware failure of a navcom? Pony up the bread dude and write a big fat check to your favorite aviation electronics dealer. It ain't illegal NOT to have a backup but it sure is nice to have it - if you have the dough. Nobody on airplane forums wastes their breath like you do yammering on and on about the desirability of redundancy. It's a silly point (again to use one of your favorite insults) - it is obviously true.
 
It is not whether cameras can theoretically be enough. It is whether they are enough in practice.

Creating a real-time 3D map with cameras only is computationally more difficult than doing it with lidar as well.
I'm sure Tesla will eventually solve it. The question is whether the other guys, who also use lidar, will solve it earlier.

All the systems that are truly autonomous at the moment also use lidar.
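
A rough sketch of why the camera-only version is the harder computation: a lidar return already is a range measurement, while a stereo pair first has to find pixel correspondences and then triangulate. The focal length and baseline below are assumed values, not Tesla camera specs:

    FOCAL_LENGTH_PX = 1200   # focal length in pixels (assumed)
    BASELINE_M = 0.30        # distance between the two cameras in metres (assumed)

    def stereo_depth_m(disparity_px):
        """Triangulated depth from stereo disparity: Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("no correspondence found, so no depth estimate")
        return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

    # A single pixel of matching error matters more and more with distance.
    for disparity in (36.0, 12.0, 6.0):
        z = stereo_depth_m(disparity)
        z_err = stereo_depth_m(disparity - 1.0)
        print(f"disparity {disparity:4.0f} px -> {z:5.1f} m "
              f"(a 1 px matching error gives {z_err:5.1f} m)")

The correspondence search is where most of the computation (and the hard machine-learning work) goes; lidar simply skips that step.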
 
I think one of the sad ironies of all this is that @calisnow - given our general disposition towards each other - probably gave one of the best answers to my redundancy question/post on this forum. There is clear acknowledgement of my point, discussion of some of the angles to the issue and an evolution of his opinion. Fair enough.

If for the most part that had been the answer to me making a point about redundancy (instead of some diatribe about how no redundancy is actually better), my interest in making the point would have quickly waned. I have no need to repeat things to people who signal they've actually received the thought, even when they don't agree with it. Why should I?

We all like to be heard. Agreement is not necessary.
 
Re: weather and visibility, I was dumbfounded how in the hell AP2 running 2017.42 “saw” lane lines consistently and accurately in yesterday’s northeast monsoon for hours when I could barely make them out.

I don’t doubt blindfolding the cameras with mud will impact their performance, but I also believe it’ll be able to shrug off less-than-optically-perfect glass, based on what I experienced yesterday.
 
I think frontal vision is pretty well taken care of. It has wipers, heating and redundancy. Rain can be learned/handled as you noted...

Snow, ice, mud and non-redundant sides/rear are bigger questions. We shall see once those cameras are enabled...
 
Do we know exactly how wide the radar beam is?

Bosch MRR:
[Bosch MRR beam-pattern chart]


Source:
HW2.5 capabilities
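
For a feel for what an azimuth FoV number means in practice, a small sketch - the angles below are example values only, not the real figures from the Bosch MRR datasheet:

    import math

    def lateral_width_m(distance_m, half_angle_deg):
        """How wide a swath the radar sees at a given range, from its azimuth half-angle."""
        return 2 * distance_m * math.tan(math.radians(half_angle_deg))

    for distance_m, half_angle_deg in [(10, 45), (30, 20), (100, 6)]:
        width = lateral_width_m(distance_m, half_angle_deg)
        print(f"±{half_angle_deg:2d}° at {distance_m:3d} m -> about {width:5.1f} m wide")

None of this gives sideways coverage, of course - the geometry only describes how wide the forward swath is at each range.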
 

FYI, I've driven with AP2 through some intense rain and I agree - with .40 it's the best it's ever been. In the worst of the rain, the lead car (light gray) in the IC started winking in and out. Literally, the lead car would disappear and then reappear quickly (I think it was the wipers and not the rain causing the sight issues). The car stayed within the lanes (I believe - I mean, it was raining!) and it seemed confident, but I was a bit unnerved. When the rain subsided it did not display this behavior. I had the wipers set to max (which I find to be inadequate for heavy rain - the motors are not fast enough, though they do fling the water a great distance). Obviously with less intense rain I lowered them accordingly (yes, no auto wipers yet).
 
The question is whether the other guys, who also use lidar, will solve it earlier.

Google is ahead in terms of functionality with a LIDAR-based system, but they're aiming for a taxi/ridesharing service, not a consumer product. Nobody knows how much their sensor array costs, but we all know LIDAR is more expensive than cameras. Those using LIDAR seem to think that either the cost will fall, or they can own the vehicles (and sensors) and spread the cost across multiple consumers. If Tesla meets their goals without LIDAR, not only will they have the only system affordable enough for consumer cars, but they'll have a platform for taxis and ride sharing that has a lower hardware cost, so they can undercut the LIDAR-based competition.
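
A back-of-the-envelope version of that cost argument - every dollar and mileage figure below is a made-up assumption, not a real price for any sensor suite:

    def sensor_cost_cents_per_mile(sensor_cost_usd, service_life_miles):
        """Hardware cost amortized over the vehicle's service life."""
        return 100 * sensor_cost_usd / service_life_miles

    SERVICE_LIFE_MILES = 300_000  # assumed robotaxi service life

    for label, cost_usd in [("camera+radar suite (assumed $1k)", 1_000),
                            ("lidar-equipped suite (assumed $20k)", 20_000)]:
        cents = sensor_cost_cents_per_mile(cost_usd, SERVICE_LIFE_MILES)
        print(f"{label:36s}: {cents:5.2f} cents/mile")

Even with a big sticker-price gap, the per-mile difference shrinks a lot once the hardware is amortized over a taxi's lifetime - which is exactly why the LIDAR camp thinks fleet ownership makes the cost workable.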
 

Sadly, in the winter (earlier this year) the first sensor to give up was the forward radar with AP2.0, and that was with just some ice buildup on the front bumper.
 

Something to watch out for, then. I didn't get much winter experience with AP2 last winter.

To be honest, snow can be brutal on any forward-facing radar, so this is not particularly a Tesla issue. The AP2 radar should be better placed for this than the AP1 radar, though?

But yeah, this goes to that redundancy point. If you have multiple sensors pointing every which way, it's less likely that one particular sensor will be a show-stopper...
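
Toy numbers for that point - p is a made-up figure, and in practice dirt, ice and glare tend to hit neighbouring sensors together, so treat the result as an upper bound on the benefit:

    P_SINGLE_BLOCKED = 0.01  # assumed chance any one sensor is blocked on a given trip

    # If blockages were independent, the chance that every sensor covering a
    # direction is out at the same time shrinks geometrically with the count.
    for n_sensors in (1, 2, 3):
        p_all_blocked = P_SINGLE_BLOCKED ** n_sensors
        print(f"{n_sensors} overlapping sensor(s): {p_all_blocked:.4%} chance all are blocked")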
 
Yeah, first day with snow here today - no Autopilot or TACC after 15 minutes of driving, and the front of the car looked like this:
[photo of the snow-packed front of the car]

Since the radar has no heating element, it does not take that much to disable it.
The AP1 pre-facelift center-mounted radar has heating, right?

On my previous Audi it took a lot of snow to disable the ACC functionality (heated radar mounted beside the fog light), and that only happened on longer drives.
 

Really? That minuscule amount of snow was enough to take it out? That's bad. Ugh, California car...

I've had adaptive cruise on half a dozen cars, and usually the radar only gets disabled maybe a couple of times a winter.

Even my first Audi A6 with adaptive cruise in the mid-00s had a heated panel in front of the radar.
 
My car has been caked in snow with no issue - the second winter with this car has already started. Only soaking rain behind a big rig has disabled my radar. Three blizzards last year and none even temporarily disabled AP; it worked just fine in heavy snow. Hopefully it will be even better this time, since it works much better overall.