Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

AI Addict - Tesla Driver Monitoring System

Sounds like this guy also doesn't know what he's talking about.
They did build a wave of cars without infrared, but Teslas have 100% had infrared for the DMS on all models since ~Aug 2021.
Cool. Thanks. That's news to me too. The main problem with Tesla's DMS is that it doesn't work as shown in the video.
The sensors (camera and IR) need to be placed in the dash in front of the driver, facing upwards, so that baseball caps and the like don't occlude the eyes. The policy also needs to be a lot less lenient (or whatever the right word is) about failing to detect a human.

They clearly suck at DMS and this type of smear wouldn't stick if they did better.
 
Personally, I want to see this type of test performed on other cars' DMS before I would say any of them suck vs don't suck.

It feels pretty scummy to
  1. use a wheel weight to bypass one part of the system entirely (when Tesla has been cracking down on them)
  2. use a weight in the seat while the video claims "nothing in the driver's seat" and frames the shot to hide the weight
  3. and then only run the system for ~10s before your "test".
If you have FSD, try covering the cabin camera entirely. It will drive for ~20-30s before screaming at you.

I know how bitchy the system is when I even remotely try to change music on the infotainment screen (fix your music UI, Tesla), so my best guess is they are abusing the 20-30s grace period at start to show how "this DMS is clearly flawed."

Easy fix for Tesla would be to verify they detect a human before allowing FSD to enable. But that might draw the ire of some that may not be recognized properly (think burn victims with severe facial injuries).
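To make the grace-period point concrete, here's a minimal sketch of that kind of gating logic. Everything in it — the class, the 25-second timeout, the state names — is an illustrative assumption, not Tesla's actual implementation:

```python
class DriverMonitor:
    """Hypothetical grace-period logic; names and values are
    illustrative assumptions, not Tesla's actual implementation."""
    GRACE_PERIOD_S = 25.0  # roughly the ~20-30 s window described above

    def __init__(self, now: float = 0.0):
        self.last_driver_seen = now

    def update(self, driver_detected: bool, now: float) -> str:
        if driver_detected:
            self.last_driver_seen = now
            return "ok"
        if now - self.last_driver_seen < self.GRACE_PERIOD_S:
            return "grace"   # the window a staged test could abuse
        return "alert"       # escalate: warn, then disengage

def may_engage(monitor: DriverMonitor, driver_detected: bool, now: float) -> bool:
    # The proposed fix: require a positive detection before allowing
    # engagement, rather than relying on the post-engagement grace period.
    return driver_detected and monitor.update(driver_detected, now) == "ok"
```

With a pre-engagement check like `may_engage`, covering the camera before the drive simply refuses to enable the system instead of granting a grace window.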
 
Since FSD is a driver-supervised system, it is expected to miss stuff and of course it does. If it wants to be a self-driving system (as the name keeps suggesting) then of course it needs to never ever miss stuff like this.

However, it does generally see pedestrians in the road ahead of it. So I am curious as to why in this test it so reliably does not see the mannequin. It is possible its training data does not trigger on something about the mannequin. That does not mean that's not a problem, because you could thus also miss some types of children on the road, but the video may overstate it. A LIDAR of course would not miss a target of this sort.
 
Perhaps the city VRU neural net isn't active on this road? Tesla is juggling nets in and out of memory depending on driving conditions. It could be that, or just really poor generalisation.
 
There are a couple of online folks who periodically test FSD, and frequently it refuses to acknowledge or even brake for sizeable roadway objects. Even dogs can be hit and miss - no pun intended.

Most AV industry associations and leadership believe multiple sensors are the way forward for safer and more reliable AV designs.
 
The only fair thing to do is test each car brand/model/year that has DMS and see how well they do in a set of standardized tests across different scenarios. Daytime, nighttime, sunglasses, handsfree, geolocations, eye colours, holding phones, candy bars, etc.

No avoidance tricks. Just how well they do, and the time taken to respond. It's no secret that Teslas can be fooled fairly easily, but just how well do they work when used properly? They can also include a section on how easy it is to fool them I suppose. But none of this gets to the bottom of how well any of these systems work at actually detecting driver attentiveness.
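A standardized protocol like that is essentially a cross product of test dimensions. A quick sketch — the dimensions and values below are made-up examples drawn from the list above, not an actual test standard:

```python
from itertools import product

# Illustrative DMS test matrix; dimensions and values are assumptions
# based on the scenarios listed above (daytime/nighttime, sunglasses,
# hands-free, holding phones, etc.).
lighting    = ["daytime", "nighttime"]
eyewear     = ["none", "sunglasses"]
distraction = ["attentive", "hands-free call", "holding phone", "eating"]

scenarios = list(product(lighting, eyewear, distraction))
print(len(scenarios))  # 2 * 2 * 4 = 16 runs per car, before geolocations etc.
```

Even this toy matrix is 16 runs per car model; adding geolocations, eye colours, and response-time measurement multiplies that quickly, which is why nobody has published such a comparison.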
 
There are a couple of online folks that periodically test FSD and frequently it refuses to acknowledge or even brake for sizeable roadway objects. Even dogs can be hit and miss - no pun intended.

Most AV industry associations and leadership believe multiple sensors are the way forward for safer and more reliable AV designs.
Obviously -- I have written extensively on the question of the reliability of pure computer vision for this problem.

That's not my point though. Computer vision with 99% recall is impressive to lay people, but still orders of magnitude below where it needs to be for a self-driving system. It should still detect things most of the time, though, from the viewpoint of a person trying it out. What confuses many Tesla owners about the quality of FSD is that they get very excited when it completes a full drive or even two in a row; they don't realize that Waymo and Cruise are doing 100,000 trips in a row and still aren't quite ready -- that's how much higher the bar is.

But a video like this shouldn't be possible even with a crappy system.
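The back-of-the-envelope arithmetic behind that bar: even a per-trip success rate that sounds high collapses over consecutive trips. The rates here are illustrative, not measured figures for any system:

```python
def p_consecutive(per_trip_success: float, n_trips: int) -> float:
    # Probability of n_trips clean trips in a row, assuming each trip
    # succeeds independently at the given rate (a simplification).
    return per_trip_success ** n_trips

print(p_consecutive(0.99, 2))        # ~0.98: two good drives in a row are easy
print(p_consecutive(0.99, 100_000))  # effectively 0 at the Waymo/Cruise scale
```

So a 99%-per-trip system will delight an individual owner for a weekend while being nowhere near 100,000 consecutive clean trips.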
 
I'm curious. Set a route locally and cap your speed at 35 max. Scroll-wheel it back down any time it adjusts upward. Would FSD Beta handle the drives any better at that max speed? You'd obviously need to plan the route along streets with that low a limit, or limit yourself to night drives when there isn't much traffic on streets with higher limits.