Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Our thoughts on FSD Beta now detecting Autopilot cheat devices | TMC Podcast Clip


FSD Beta V10.69.3.1 can sense the use of defeat devices that some owners employ to avoid having to keep their hands on the steering wheel. We ask how it is doing this.

This is a clip from Tesla Motors Club Podcast #27. The full podcast video is available here: https://youtu.be/LtDViqsiAMc
 
Oh, those pesky corner cases. Always mucking things up.

Again, if you just take a look at the cabin camera (anyone can do this), you’ll find it’s pretty obvious what is happening. I have no idea whether neural nets can do it, but if self-driving is possible, pattern-matching a more or less static image should be possible even with tons of variability (again, assuming this is a viable approach, which I don’t know).
 
OK, so I went out to my M3, enabled Sentry, and watched the feed as I positioned my arms and hands in various positions. I was wearing a loose pile jacket because it's cold in Texas. The first thing I noticed is that there is almost no view of my left arm; the camera's field of view is not wide enough to capture more than a sliver. If I rest my elbow on the door armrest, there is zero difference in the video whether my hand is on my knee or holding the wheel anywhere from the seven to nine o'clock position. There is a very slight change if I hold the wheel at ten o'clock, but the bulkiness of the jacket makes it impossible to discern anything from the difference.

The right arm is better in view; however, in all cases the lower part of the image is blocked by the rear-view mirror. Still, the image of the right arm is exactly the same whether my hand is resting on my knee or holding the wheel anywhere from the three to five o'clock position.

Had my arms been bare, it might have been easier to discern where the right arm was going, but not the left. But it doesn't matter, because any use of the camera to determine hand position must work for all types of people wearing all types of clothing. The only alternative would be for the car to direct people to remove clothing from their upper body when arm position could not be determined. This might not be well received by some drivers!
 
It definitely makes it a bit more difficult when wearing a jacket. I’ll post some pictures in a bit and let people guess what I am doing. Spoiler alert: If you look at the entirety of the driving task (steering, etc.), it is not difficult to tell whether hands are on the wheel, even with a jacket.

It is just completely possible, regardless of the conditions. On a long, straight stretch of road it might be more difficult, but it is pretty easy to determine a driver’s typical habits right away.
 
Took a 60-mile trip up Interstate 95 today with a weight attached. On the way north, I got one warning to turn the yoke, which I did. Later on I got the red wheel saying take over/no more AP allowed for this trip (or whatever it says).

On the way back, same highway, weight attached, I simply used my hand to tug on the yoke a bit once every 9 minutes or so. Zero warnings, zero issues using that method. I tested looking away/down/at my phone; it seemed to make no difference. Still a very relaxing drive, as it was easy to randomly tug on the wheel every few minutes to avoid any warning or shutdown of AP.
 
Yup. I tried this recently and found it will cut your AP about every 10 or so minutes, but a quick tug in the other direction fixes it. The system is recognizing a fixed weight over a period of time as proof of cheating.
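If the detection really does key on a fixed weight, one way to picture it (purely a speculative sketch, not Tesla's actual method; the function name and all thresholds are invented) is a rolling check for steering torque that is steady to a degree no resting hand ever is:

```python
import random
import statistics

def looks_like_defeat_device(torque_samples, window=600, var_floor=1e-4):
    """Flag a window of steering-torque readings whose variance is
    implausibly low: a hanging weight applies a near-constant bias,
    while a resting human hand produces small, irregular wobble.
    Hypothetical heuristic; thresholds are invented for illustration."""
    if len(torque_samples) < window:
        return False
    recent = torque_samples[-window:]
    steady_bias = abs(statistics.mean(recent)) > 0.05    # some torque is present...
    no_wobble = statistics.variance(recent) < var_floor  # ...but it never varies
    return steady_bias and no_wobble

# A weight: constant 0.2 Nm bias with no wobble at all.
weight = [0.2] * 600
# A hand: the same average bias, plus human-scale jitter.
random.seed(0)
hand = [0.2 + random.gauss(0, 0.05) for _ in range(600)]
print(looks_like_defeat_device(weight))  # True
print(looks_like_defeat_device(hand))    # False
```

A deliberate tug every few minutes, as described above, spikes the variance inside the window, which would be consistent with the occasional tug resetting the system.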
 
The fundamental problem is that the camera can’t see what it needs to see. I routinely drive with my hands at the bottom of the wheel or with just my left hand on the wheel. What you propose would completely miss this. It isn’t an ‘edge case,’ it’s a limitation of the system.
 

Have you thought about what happens with your hand and arm as you move them to steer around corners? Remember, you don’t just let the wheel slip through on corners, since you need precision control for corrections (since you are driving!). I think even if you did let it slide, it would be clear from the camera what was happening (even though none of this is visible, as you say). It’s just something you can figure out by observing.

My point here is that it is easy to see what the driver is doing with the existing camera. The details don’t matter, but watch some video of it and you’ll see what I mean. It’s shockingly good, actually. Could there be a better camera position? Certainly! But what is there is good enough for a human to figure it out (which is my point).

The human brain is remarkable.
 
If I’m driving, Tesla doesn’t need to look at my hands or feet or face or anything else. If I’m using FSD then FSD is driving and I’m keeping an eye on it in case it does something stupid. When it turns, I keep my hands at the ready, but they don’t necessarily move and I never turn the wheel myself because the only thing that accomplishes is a disengagement.

But that’s all beside the point, because on a regular basis I use FSD to drive on county and state highways that have long, straight stretches in which my hands and arms barely move. Unless you consider this an edge case?

Regardless, any algorithm needs to be accurate all the time, and despite your contortions and protestations, the camera clearly will not be.
 

Camera is not an algorithm. Just a sensor.
Unless you consider this an edge case?

No, I don’t. I think it is reasonable to heavily weight eye position and head position in that case.
 
I’m not sure what to tell you. If you look ahead vs. look at your phone held over the cupholders, it looks totally different. Especially the head. Super easy to tell.

And it is also pretty easy to tell when hands are in the lap vs. the vicinity of the wheel (from the arm position).

I agree that holding a phone over the cupholders vs. holding the wheel is hard to tell apart. But I’m not sure how that is relevant, because as soon as you look at the phone it is obvious. And I don’t think many people love to just hold their phone awkwardly without looking at it, or to randomly hold their arms out like a zombie.

So yes, there are arm positions that are indistinguishable from holding the wheel, but we don’t care about them much. People who don’t hold the wheel have hands in lap (identified by arms), or are using the phone or screen (identified by gaze).

All seems very easy for a human to identify.
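For what it's worth, the "weight gaze and head position heavily" idea from this exchange can be pictured as a toy classifier like the one below. This is purely illustrative: the function, input signals, and thresholds are all invented and are not Tesla's actual monitoring logic.

```python
def driver_attentive(head_yaw_deg, head_pitch_deg, gaze_down_prob):
    """Toy attention check in the spirit of the discussion: weight head
    pose heavily, and treat a confident eyes-down signal (e.g. a phone
    over the cupholders) as inattention. All thresholds are invented."""
    if abs(head_yaw_deg) > 30:     # head turned well away from the road
        return False
    if head_pitch_deg < -20:       # head tilted sharply down
        return False
    if gaze_down_prob > 0.8:       # eyes-down estimate is confident
        return False
    return True

print(driver_attentive(5, -3, 0.1))   # looking ahead: True
print(driver_attentive(2, -25, 0.9))  # head and eyes down at a phone: False
```

The "look down a foot without moving your head" objection maps to the third check: head pose alone misses that case, so a separate gaze estimate has to carry it.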
Do you move your whole head to look down a foot? Elbow on the armrest, hand slightly in front of the cupholder, next to the wheel.

I have zero strikes.
I checked the view from the new feature in the app: you literally can’t tell.
 
This (eye and head position) is what is tracked. Someone a while ago got into the system while it was active.
1672118779821.jpeg
 
Here are the pictures. In these pictures, I'm either not holding the wheel, or I'm holding the wheel and steering. I'll let people figure it out. It's pretty easy! Answer options: steering left/right, hands in lap, steering straight ahead.

1)
IMG_3898.jpeg

2)

IMG_3899.jpeg

3)

IMG_3897.jpeg

4)

IMG_3896.jpeg

5)

IMG_3895.jpeg

6)

IMG_3894.jpeg

7)

IMG_3893.jpeg


1) ?
2) ?
3) ?
4) ?
5) ?
6) ?
7) ?

Prize of praise to the first to answer all seven correctly.

Turns out it's really easy to figure out (only one of the above is hard)! Add to that the easy visual cues from gaze and head tilt, and it's very easy to see whether the driver is paying attention, and even whether they are holding the wheel. These are just poor-quality still images, much worse than what the car has; when you see the video feed, it's even easier to tell what is going on.

I definitely don't think Tesla needs the torque sensor except as a fallback in very special cases, provided they have extremely high intelligence (human-level) analyzing the video stream.
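The "torque sensor only as a fallback" policy in that last sentence could look something like this. It is an illustrative sketch with invented names and thresholds, not anything Tesla ships:

```python
def hands_on_wheel(camera_confidence, torque_nm, camera_ok=True,
                   conf_threshold=0.7, torque_threshold=0.1):
    """Camera-first, torque-fallback policy: trust the vision estimate
    when it is available and confident either way; otherwise fall back
    to the torque sensor. All names and thresholds are invented."""
    if camera_ok and camera_confidence >= conf_threshold:
        return True   # vision is confident the hands are on the wheel
    if camera_ok and camera_confidence <= 1 - conf_threshold:
        return False  # vision is confident the hands are off
    # Ambiguous or unavailable vision: fall back to measured torque.
    return abs(torque_nm) >= torque_threshold

print(hands_on_wheel(0.9, 0.0))                   # confident camera: True
print(hands_on_wheel(0.5, 0.2))                   # ambiguous camera, torque present: True
print(hands_on_wheel(0.0, 0.2, camera_ok=False))  # dark cabin, torque fallback: True
```

Under a policy like this, the weight-on-the-wheel trick discussed earlier in the thread would only matter in the fallback branch, when vision is unavailable or unsure.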
 
Not true. Tesla arbitrarily chooses which firmware versions reset your strikes. Historically, it has been every 3-4 updates. And we all know how infrequently we receive updates. It's a robbery.
I’m not sure what you’re disagreeing with. It sounds like we’re both talking about the same thing. I didn’t say what the strikes do now; I said what I think they should do… and I think they should reset with every update. Just my opinion, you know.
 
Talking to a wall here…
Attention matters. Hand position doesn’t.
There are plenty of hand/arm positions that are indistinguishable, thus any algorithm using the camera to sense hand position will fail.
Remember how a human can tell.

Polarized glasses defeat that. I’ve tested it.
Remember the head too. Remember what a human can see.
 
You were talking to me?

Remember how a human can tell.


Remember the head too. Remember what a human can see.
Remember…. Remember…. The 5th of November
 