
Any reason why Tesla doesn't integrate the Cabin Camera with autopilot?

We already know about the cabin camera and its ability to track the driver's eyes. Why doesn't Tesla use this to eliminate the need for drivers to keep their hands on the wheel? Other companies such as GM, Ford, and Mercedes already do this. It's kind of crazy that the "most advanced" driver-assist system in the world still requires you to hold the wheel while every other major manufacturer is moving to eye tracking.

As of right now the cabin camera is an annoyance when it should instead be a feature that enables us to drive hands-free. The system is only used against you (it nags you if you look away, disables Autopilot if it decides you're distracted, etc.) and never in your favor.
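To make that concrete, here's a rough sketch of the kind of attention gate I'm imagining: purely illustrative, nothing Tesla has published, with every class name and timing threshold invented. The idea is simply that hands-free stays available while your eyes are on the road, and warnings escalate the longer they're not.

```python
# Illustrative sketch only, not Tesla's implementation: replace the wheel-torque
# nag with an escalation driven by eyes-off-road time from the cabin camera.

from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    HANDS_FREE_OK = auto()    # gaze on road: no wheel torque required
    VISUAL_WARNING = auto()   # brief glance away: flash a reminder
    AUDIBLE_WARNING = auto()  # sustained inattention: chime
    DISENGAGE = auto()        # driver unresponsive: hand control back


@dataclass
class AttentionGate:
    warn_after_s: float = 2.0        # thresholds are made up for illustration
    chime_after_s: float = 4.0
    disengage_after_s: float = 8.0
    _eyes_off_road_s: float = 0.0

    def update(self, eyes_on_road: bool, dt: float) -> Action:
        """Call once per camera frame with the gaze classifier's verdict."""
        if eyes_on_road:
            self._eyes_off_road_s = 0.0
            return Action.HANDS_FREE_OK
        self._eyes_off_road_s += dt
        if self._eyes_off_road_s >= self.disengage_after_s:
            return Action.DISENGAGE
        if self._eyes_off_road_s >= self.chime_after_s:
            return Action.AUDIBLE_WARNING
        if self._eyes_off_road_s >= self.warn_after_s:
            return Action.VISUAL_WARNING
        return Action.HANDS_FREE_OK
```

That's roughly the escalation the other manufacturers' hands-free systems are described as using, and the cabin camera already produces the only input it needs.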
 
We already know about the cabin camera and its ability to track the driver's eyes. Why doesn't Tesla use this to eliminate the need for drivers to keep their hands on the wheel?...

Initially, the idea behind Tesla's cabin camera was not Autopilot/FSD-related. It was there to monitor the passengers of a future Tesla robotaxi; if there's no human driver, there's no driver to monitor in the first place.

However, as public pressure grew after numerous accidents and deaths, Tesla decided halfway through that it would use the camera for safety's sake, turning the function of monitoring passengers into monitoring the human driver.

Halfway, because even though the Model 3 and Model Y have a cabin camera, it goes blind when the cabin is too dark (no roadside lights).

Halfway, because the S and X now have a cabin camera with infrared that works in complete darkness, but that is still not the case for older or even brand-new Model 3 and Model Y, which lack infrared LEDs.

A cabin camera that works in complete darkness is nothing new; other brands have done it. Even a cheap option like the $1,999 OpenPilot can.

But to go hands-free, the automatic steering must be stable enough to trust without human intervention.

Tesla's Autosteer has not been stable enough, from the first version of Autosteer sold in 2014, through AP2 in 2016, to AP3 in 2022 with the FSD beta.

It is not stable enough because, in the past, it would drive straight into a concrete median if the driver didn't intervene (the driver didn't, and died).

Today, the FSD beta is still not stable enough: it might steer into obstacles like pedestrians or bicyclists in the city, and of course there was the famous collision with the green bollard.

But why?

When GM sold you hands-free Super Cruise in 2017, you got hands-free as advertised. Notice, though, that it's only hands-free in areas that have been HD-mapped: most major US and Canadian highways, roughly 200,000 miles in 2017 and 400,000 miles by the end of 2022.
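In other words, before dropping the nag, Super Cruise only has to answer a much narrower question: is the car on a stretch of highway that's already in the HD map? A toy version of that geofence check might look like the sketch below; the coordinates, segment data, and 30 m tolerance are all made up for illustration and have nothing to do with GM's actual map format.

```python
# Toy geofence check: hands-free is offered only near a pre-mapped highway
# centerline. All data and thresholds below are invented for illustration.

import math

# Pretend HD map: mapped highway centerlines as (lat, lon) polylines.
HD_MAPPED_SEGMENTS = [
    [(42.33, -83.05), (42.40, -82.95), (42.47, -82.88)],
    [(34.05, -118.24), (34.10, -118.10), (34.15, -117.95)],
]

MAX_OFFSET_M = 30.0  # how far from a mapped centerline still counts as "on" it


def _point_segment_dist_m(p, a, b):
    """Distance in meters from point p to segment a-b (flat-earth approximation)."""
    lat = math.radians(a[0])

    def to_xy(q):  # local east/north meters relative to point a
        return ((q[1] - a[1]) * 111_320.0 * math.cos(lat),
                (q[0] - a[0]) * 111_320.0)

    px, py = to_xy(p)
    bx, by = to_xy(b)
    seg_len2 = bx * bx + by * by
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, (px * bx + py * by) / seg_len2))
    return math.hypot(px - t * bx, py - t * by)


def hands_free_allowed(position):
    """True only if the GPS fix lies within MAX_OFFSET_M of a mapped centerline."""
    return any(
        _point_segment_dist_m(position, a, b) <= MAX_OFFSET_M
        for polyline in HD_MAPPED_SEGMENTS
        for a, b in zip(polyline, polyline[1:])
    )
```

Constrain the feature to a pre-surveyed corridor like that and the steering problem gets much easier, which is exactly the shortcut Tesla refuses to take.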

Maybe it is harder for Tesla to attain hands-free because Tesla hates to "game" the problem.


Tesla believes in generalized FSD. Others rely on lots of homework, like HD-mapping each route in advance, and maybe Tesla doesn't feel it should stoop so low as to game its way through the autonomous vehicle challenge.
 
We already know about the cabin camera and its ability to track the driver's eyes. Why doesn't Tesla use this to eliminate the need for drivers to keep their hands on the wheel?...
It has EXTREMELY limited ability to track your eyes. It is a standard visible-light camera that was intended for monitoring passengers in "your robotaxi", NOT an IR tracking camera like Ford, GM, and others use, as Tam said. Tesla is working on training the software, but it's not clear how that gets past the physical limitations. There is just NO WAY it can see through all types of sunglasses, and it can be fooled by a photo, lighting changes, or other objects. An IR tracker, by contrast, could see through welder's glasses at midnight and pinpoint the exact location of your pupils under almost ANY conditions.
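If you wanted to build hands-free on top of a camera like that anyway, you'd have to gate everything on how confident the tracker is that it can actually see the pupils, and fall back to the wheel-torque nag whenever it can't. A rough sketch of that fallback, with invented field names and thresholds:

```python
# Illustrative only: fall back to wheel-torque nags whenever the visible-light
# camera can't confidently see the eyes (sunglasses, dark cabin, a photo held
# up to the lens). Field names and the 0.8 threshold are invented.

from dataclasses import dataclass


@dataclass
class GazeEstimate:
    eyes_on_road: bool   # the tracker's verdict for this frame
    confidence: float    # 0..1, how sure it is that it really saw the pupils


def monitoring_mode(gaze: GazeEstimate, min_confidence: float = 0.8) -> str:
    """Pick the supervision strategy for the current frame."""
    if gaze.confidence < min_confidence:
        # Camera is effectively blind, so the only safe option left is the nag.
        return "require_wheel_torque"
    if gaze.eyes_on_road:
        return "hands_free_ok"
    return "escalate_warnings"
```

With a visible-light camera and no IR illumination, that first branch would fire constantly at night or behind dark sunglasses, which is why the hardware matters as much as the software training.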