
Is Tesla using the rear view camera for autopilot?

You don't think it's good for blind spot checking, or rather "pre" blind spot checking?

When you're driving, watch the rear-view cam and tell me whether you can tell when a car is actually in your blind spot. Cars look much further away than they actually are. For anything AP-related like blind-spot checking, you really need a 3D view of the space around the car.
 

I definitely can tell when a car is in my blind spot; in fact, that's why I drive with the rear-view camera always on. Cars do look further away than they actually are, but that's obvious, so you just correct for it.

But the car has blind-spot sensors, so it wouldn't need the camera to tell that there's a car in the blind spot. I was thinking the camera could be useful for seeing whether a car is coming up fast in the next lane, so that the car knows to let it pass before attempting a lane change. If I can adjust my depth perception of the camera by feel, I have to believe a computer can make those adjustments more accurately.
 
That's exactly what I was thinking: warning of rapidly approaching vehicles.
This could then be incorporated into the self-driving part of the Autopilot system so the car can decide whether it is safe to change lanes.
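
For what it's worth, here's a minimal sketch of how such a check might work, assuming the system can estimate the range to a trailing car from successive frames. The function names, thresholds, and example numbers are all made up for illustration, not anything Tesla has confirmed:

```python
# Hypothetical lane-change safety check based on closing speed and
# time-to-collision (TTC). All names and thresholds are assumptions.

def closing_speed(range_prev_m, range_now_m, dt_s):
    """Closing speed in m/s; positive means the trailing car is gaining."""
    return (range_prev_m - range_now_m) / dt_s

def lane_change_safe(range_now_m, closing_mps, min_gap_m=20.0, min_ttc_s=4.0):
    """Reject the lane change if the gap is small or the TTC is short."""
    if range_now_m < min_gap_m:
        return False
    if closing_mps <= 0:          # trailing car is holding steady or falling back
        return True
    ttc_s = range_now_m / closing_mps
    return ttc_s >= min_ttc_s

# Example: rear camera estimates the car behind at 45 m, then 41 m, 0.5 s apart.
v_close = closing_speed(45.0, 41.0, 0.5)   # 8 m/s closing
print(lane_change_safe(41.0, v_close))     # 41 / 8 ≈ 5.1 s TTC -> True
```

The core idea is just time-to-collision: a large gap with a slowly closing car passes, while a fast closer fails the check even at a decent range.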
 
Sure, they could... or could have. But they probably aren't. Never mind the distortion; that's easily corrected with a transform algorithm. You should be able to detect delta positions easily: am I going to get hit, or is that car in a bad spot? Are those red lights!? It's probably not a useful night-time camera (infrared is filtered), but it is high-def, and you can do a lot with that information. I haven't heard about this, but never say never.
 

This.
 
There is no question the rear-view video footage can be used in conjunction with the MobileEye software to enhance Autopilot. As mentioned previously, adjusting for fish-eye curvature is trivial. It could most definitely help with lane changes (i.e. fast-approaching/overtaking cars), rapidly-approaching-car warnings, and enhancing the current ultrasonic blind-spot detection. It could even be used to help with lane guidance, basically as a secondary verification that the car is centered in the lane, from a completely different lighting angle. (In the case of faded or hard-to-see lines, the system might want to check the rear camera to ensure the paths continue to match.)

That said, the question is whether the current processing hardware has the cycles to analyze the footage. We know the front camera will be used for lane detection, and it's already used to read speed-limit signs, so there's video-analysis hardware in the car already. The big unknown is whether the rear camera can take advantage of it. If I had to guess, I'd assume the processing power is already maxed out by the forward-facing analysis.
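
On the fish-eye point above: here's roughly what that "transform algorithm" looks like with off-the-shelf tools. A minimal sketch assuming OpenCV; the intrinsic matrix and distortion coefficients below are placeholder values, since the real ones would come from a one-time checkerboard calibration of the actual rear camera:

```python
# Sketch: undoing wide-angle distortion with a calibrated camera model.
# K and dist are made-up placeholders, not the Model S camera's real values.

import cv2
import numpy as np

# Hypothetical intrinsic matrix (focal lengths and principal point, in pixels).
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical radial/tangential distortion coefficients (k1, k2, p1, p2, k3).
dist = np.array([-0.30, 0.09, 0.0, 0.0, -0.01])

frame = cv2.imread("rear_cam_frame.png")        # one captured frame
undistorted = cv2.undistort(frame, K, dist)     # straightens curved lane lines
cv2.imwrite("rear_cam_undistorted.png", undistorted)
```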
 
I retract what I said. I did some "empirical" testing today, and yes, the rear view can be used for (human) blind-spot detection, but the driver needs to realize that cars sitting directly in the blind spot appear in the rear-view camera to be at least 1.5, if not 2, full car lengths behind the rear end of the Model S.

But in all cases, I could see said "blind spot" cars either in my rear-view or side-view mirrors. So if your mirrors are properly set, you really shouldn't have any blind spots.
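
If software were doing the correcting instead of the driver, one crude approach would be a calibration table built from exactly this kind of empirical test: map apparent distance in the image to actual distance. A sketch with made-up, purely illustrative numbers:

```python
# Hypothetical apparent-vs-actual distance correction for the rear camera.
# The table values are illustrative, not measured.

import numpy as np

# Apparent distance read off the camera (m) vs. actual measured distance (m).
apparent_m = np.array([3.0, 6.0, 10.0, 15.0, 25.0])
actual_m   = np.array([0.5, 2.0,  5.0,  9.0, 18.0])

def actual_distance(apparent):
    """Interpolate the calibration table to undo the camera's exaggeration."""
    return float(np.interp(apparent, apparent_m, actual_m))

# A car that "looks" about 9 m back may really be almost alongside.
print(actual_distance(9.0))   # ~4.25 m -> effectively in the blind spot
```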
 
The current rear-view camera is useless with rain splash on it. The other issues, distortion and the wide angle, can be compensated for in software (assuming enough horsepower, which I doubt is currently installed).

The MobileEye system probably is capable of doing this in hardware; their whole vision system is based on a custom hardware vision pipeline. Whether the rear camera is suitable for the application (and I'm guessing it is not, due either to quality/dynamic range or interconnect incompatibility) is another question.
 
But the car has blind-spot sensors, so it wouldn't need the camera to tell that there's a car in the blind spot. I was thinking the camera could be useful for seeing whether a car is coming up fast in the next lane, so that the car knows to let it pass before attempting a lane change. If I can adjust my depth perception of the camera by feel, I have to believe a computer can make those adjustments more accurately.

I've suggested this made sense a couple of times before. So far no evidence has emerged that Tesla is doing anything of the sort, but the current "final" version of Autopilot doesn't make its own lane changes without the user signaling for it, so I suppose it doesn't really need the ability yet.
Walter
 
Put a piece of tape over the rear camera, engage TACC, and see if you get any warning messages. That would be the easiest way to tell.

It's not that easy.

1. I doubt it's using the rear camera for TACC.
2. If it's using the rear camera for the mystical autopilot, and this mystical autopilot is supposed to work in rain, they could have a sensor-fusion algorithm that uses the camera when visibility is good (no rain, making autopilot EVEN better) and ignores it when it isn't (rain/snow/etc., making autopilot just OK, but still usable). Something like the sketch below.
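
A hedged sketch of that gating idea; the sensor inputs and the rain flag are hypothetical stand-ins for whatever the car actually exposes internally:

```python
# Hypothetical fusion of ultrasonic and camera blind-spot detections,
# gating the camera input on weather/visibility.

from typing import Optional

def blind_spot_occupied(ultrasonic_hit: bool,
                        camera_hit: Optional[bool],
                        raining: bool) -> bool:
    """Fuse the two sensors, trusting ultrasonics alone in bad conditions."""
    if raining or camera_hit is None:
        # Camera lens may be splashed or occluded: fall back to ultrasonics.
        return ultrasonic_hit
    # Clear conditions: either sensor is enough to flag the blind spot.
    return ultrasonic_hit or camera_hit

print(blind_spot_occupied(False, True, raining=False))  # True: camera adds value
print(blind_spot_occupied(False, True, raining=True))   # False: camera ignored
```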