Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Our thoughts on FSD Beta now detecting Autopilot cheat devices | TMC Podcast Clip


FSD Beta V10.69.3.1 can sense the use of defeat devices that some owners have been employing to avoid keeping their hands on the steering wheel. We ask how it does this.

This is a clip from Tesla Motors Club Podcast #27. The full podcast video is available here: https://youtu.be/LtDViqsiAMc
 
1) ?
2) ?
3) ?
4) ?
5) ?
6) ?
7) ?

Prize of praise to the first to answer all seven correctly.

Sad that no one attempted to answer. Options given above!

I do think Tesla has reduced the torque nag requirements. I can go many minutes between nags during the day without sunglasses on as long as I am careful to always keep eyes forward and hands away from my face.

I used to torque the wheel out of habit but I have nearly stopped doing that (I’ve taken to just steering the car normally at 9 and 3 most of the time). It’s getting better and better. It’s possible my old method might result in a warning about defeat device now, but I am not sure.
 
a human can't always tell.
I agree with this, but it is not about “always.” It’s about understanding the driver’s attentiveness overall and what they are likely doing with their hands. A human can do this easily in nearly all cases, especially if they are told when the car is turning, etc. Even if at a particular instant they can’t tell, overall they will have a very good idea. If you are not sure for long enough, you issue a nag.

More to the point for this thread: what if the camera sees both hands of the driver while also detecting consistent wheel torque? That is a really easy way to detect a defeat device in a very short observation window! Where else would the consistent torque be from? (Someone should try it, to see whether Tesla takes advantage of simple ways of doing things…)
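As an illustration only, the cross-check described above could be sketched like this. The function name, signals, and threshold are all invented for the sketch; nothing here reflects Tesla's actual code:

```python
def defeat_device_suspected(both_hands_visible_off_wheel: bool,
                            torque_nm: float,
                            torque_threshold_nm: float = 0.5) -> bool:
    """Hypothetical cross-check: if the cabin camera can see both of the
    driver's hands away from the wheel while the steering column still
    reports steady torque, something other than the driver must be
    supplying that torque."""
    return both_hands_visible_off_wheel and abs(torque_nm) > torque_threshold_nm
```

A few seconds of camera frames agreeing with a few torque samples would be enough of an observation window for a check like this.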

I encourage watching the long videos here and just trying to gauge how you would grade the driver at any point in time, and how accurate you would be. I think the camera can give excellent information on how attentive the driver is, with a sophisticated enough system to figure out what is happening (like a human brain). It’s much easier than using a still image (which also is not that hard, usually, though it can be).

Could there be better camera positions (designed for this task)? Sure. But raw image data (at least in daytime) seems good enough to me as long as the system is smart enough.
 
I'm on FSD Beta 10.69.25.1 (2022.44.30.5).

Today I got a "Remove Autopilot Cheat Device Immediately" in red text along with beeps, DURING an auto lane change initiated by NoA to exit the freeway. Possibly the worst timing ever! No warnings whatsoever. I think the algorithm may be more sensitive during lane changes, so be forewarned. I think this warning text is new, part of 2022.44.30.5 or the one before (2022.44.25.5).

I already got kicked out of FSD Beta a month ago after the FSD Beta 10.69.3.1 (2022.36.20) release, which introduced the "cheat device" algorithm. 5/5 strikes on a long highway stretch while on NoA (with even FSD explicitly disabled in settings).

Full Self-Driving (Beta) Suspension

Improper usage is when you, or another driver of your vehicle, receive five 'Forced Autopilot Disengagements'. A disengagement is when the Autopilot system disengages for the remainder of a trip after the driver receives several audio and visual warnings for inattentiveness.
Except there are no warnings, and it strikes you out immediately. I already sent a complaint email to the FSD team and they acknowledged receipt. But they don't seem to care or do anything about it, and if anything they made the algorithm even more sensitive. Maybe if more people complain, they'll start listening?

IMO, the FSD team should spend time perfecting Driver Monitoring, rather than waste time and money going round in circles detecting "cheat devices". OpenPilot and GM Super Cruise have had Driver Monitoring since 2018, and it works well. Even the newcomer Ford BlueCruise has now enabled Driver Monitoring for full hands-off driving.
 
But they don't seem to care or do anything about it, and if anything they made the algorithm even more sensitive.
I assume you are using a cheat device?

I’d recommend keeping hands on the wheel and countertorquing. That should resolve the issue for the most part. Perhaps.

Ideally Tesla’s detection algorithm will be more sophisticated than that and detect such attempts to bypass it (for example looking for two free hands on the camera, with torque on the wheel). But probably not yet. Or just analyzing the torque and distinguishing it from human torque (probably simplest?)
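A minimal sketch of that last idea, distinguishing a fixed weight from a human hand by the torque signal's variability. The sampling window and threshold are invented for illustration, and this is speculation about what a detector *could* do, not what Tesla's does:

```python
import statistics

def looks_like_wheel_weight(torque_samples: list[float],
                            max_stdev_nm: float = 0.05) -> bool:
    """Hypothetical heuristic: a hand resting on the wheel produces a
    noisy, varying torque signal, while a fixed weight produces a nearly
    constant one.  Very low variance over an observation window is
    therefore suspicious."""
    if len(torque_samples) < 2:
        return False  # not enough data to judge
    return statistics.stdev(torque_samples) < max_stdev_nm
```

A dead-steady reading like `[0.80, 0.80, 0.81, 0.80, 0.80]` would be flagged, while a jittery, human-looking one like `[0.2, 0.9, 0.1, 1.3, 0.4]` would not.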
 
I should have mentioned: two of the 5 strikes last month were from looking at the center display, once to adjust the fan speed (back when it was a slider), and once glancing at the efficiency graph to see if I was on track to my next Supercharger stop. Both times my eyes were off the road for less than 3 seconds. Adjusting the HVAC takes like 3 touches, and there's no way to do it precisely without looking at the screen.

I guess what I'm trying to say is that maybe it's not as simple as detecting wheel force. There may be other factors that make it much more sensitive, such as eye/head tracking. So a combination of no wheel force (or "incorrect" force) plus eyes off the road at the same time may trigger an immediate strikeout. Since that experience, I rarely use my center display while on NoA/AP for fear of a strikeout for the drive. Really bad UX, Tesla. We shouldn't be afraid to use our cars for fear of losing a feature we paid for.
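Under that reading, the escalation logic might look something like this. Pure speculation: the signal names and the three-tier response are made up to illustrate the behavior described above:

```python
def assess_attention(eyes_on_road: bool, wheel_torque_ok: bool) -> str:
    """Speculative reading of the reported behavior: either signal
    failing alone earns an escalating nag, but both failing at once
    jumps straight to a strike with no intermediate warning."""
    if eyes_on_road and wheel_torque_ok:
        return "ok"
    if eyes_on_road or wheel_torque_ok:
        return "nag"     # one signal missing: escalating warnings
    return "strike"      # both missing at once: immediate strikeout
```

That would explain why a brief glance at the screen is normally harmless, but the same glance combined with "incorrect" wheel force strikes out instantly.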
 
More to the point for this thread: what if the camera sees both hands of the driver while also detecting consistent wheel torque? That is a really easy way to detect a defeat device in a very short observation window! [...]
*sigh*
Your hands don’t matter.
 
So since that experience, I rarely use my center display while on NoA/AP for fear of a strikeout for the drive. Really bad UX, Tesla. We shouldn't be afraid to use our cars for fear of losing a feature we paid for.

Seems like the right response, if the car has properly detected a wheel weight.

Your hands don’t matter.
How do you figure?

I mean, if it is possible for the car to tell, it seems like a very good thing to keep track of. For example, when turning it is highly advisable and super easy to have hands on the wheel with no disengagement except for errors.
 
I'm on FSD Beta 10.69.25.1 (2022.44.30.5).

Today I got a "Remove Autopilot Cheat Device Immediately" in red text along with beeps, DURING an auto lane change initiated by NoA to exit the freeway. [...]
I got a cheat device warning today. No strikes.
[Attached image: image.jpg, screenshot of the cheat device warning]
 
I'm on FSD Beta 10.69.25.1 (2022.44.30.5).

I already got kicked out of FSD Beta a month ago after the FSD Beta 10.69.3.1 (2022.36.20) release, which introduced the "cheat device" algorithm. [...]
It's all a gimmick, man. They spend more time creating driver monitoring and anti-cheat-device code than they do making FSD drive better. I'm at 3/5 strikes for no reason lol, one for having a phone on my chest recording the dash display :D
 
I'm on FSD Beta 10.69.25.1 (2022.44.30.5).

I already got kicked out of FSD Beta a month ago after the FSD Beta 10.69.3.1 (2022.36.20) release, which introduced the "cheat device" algorithm. 5/5 strikes on a long highway stretch while on NoA (with even FSD explicitly disabled in settings). [...]
I’m in exactly the same boat you are. Since it was a new warning, I couldn’t read the screen in time to understand it was about the cheat device. 5/5 strikes in a single day on a road trip. Still don’t have FSD access three updates later. $15,000 for this *sugar*.
 
I’m in exactly the same boat you are. Since it was a new warning, I couldn’t read the screen in time to understand it was about the cheat device. 5/5 strikes in a single day on a road trip. Still don’t have FSD access three updates later. $15,000 for this *sugar*.
What model and year do you drive? I’m seeing reports of people with interior cameras not getting strikes for cheat device warnings…including me. It would make sense if the wheel is the ONLY attention checking device…
 
Yes, this same "instant disqualification" happened to me yesterday. We just bought our Model Y, and I am learning the nuances of AutoPilot. So here is how it happened to me:

I had been driving on Autopilot for almost two hours on a largely empty 4-lane highway, torquing the wheel when either the initial lower-screen message or later upper-screen blue flash presented themselves. No issues, not even phantom braking. There was a feature on a mountain to the left of our car that I wanted to look at more closely. I had the sunshade over the top of the driver side window. To see the mountain feature, I had to crane my neck and bend my head lower. And of course, I was looking away from the forward view. I had looked to the left in this manner maybe two or three times, briefly -- no more than a few seconds at a time. Then I heard the harsh beeps. Turning my focus back to the screen, I briefly saw the terse comment in red at the bottom of the screen, indicating I was guilty - GUILTY, I SAY! - of using a "defeating device". I was banned from using AutoSteer for the remainder of the trip; as I was fairly close to my destination, I did not opt to pull off and restart the drive.

The first time this happens to someone who doesn't get out of the car to reset it, and they end up having a crash that Autopilot would have prevented, if anyone dies, that person's family is going to sue Tesla for everything it is worth and win. Trying to detect defeat devices is a mistake. If Tesla fails to detect a defeat device and someone gets hurt, it is the driver's fault. If Tesla thinks there is a defeat device and shuts off a driver safety feature as punishment and someone gets hurt, it is Tesla's fault. From a product liability perspective, this is absolutely the most stupid thing Tesla could do.

And that's if they are wrong about the use of a cheat device.

If they are right, then Tesla would be in even more trouble. If Autopilot shuts down instantly, without adequate warning, with direct knowledge that the user's hands are not on the wheel, if someone dies, that's premeditation, making Tesla potentially guilty of first-degree murder. Not negligence, not homicide. Premeditated first-degree murder.

Tesla needs to hire some competent legal counsel, and put them in the loop for these sorts of product decisions. There's no way even the most incompetent lawyer in the world would allow something like what I'm hearing described to go out to the public. I trust that this idiocy has been reverted at this point.

As an aside, while I was on vacation, I let my MCU1 Model X install an update without realizing that it put me on the FSD Beta train. After reading this thread, I'm starting to wonder if that was a mistake. If this thing kicks out in the wrong way on CA-17, hands on the wheel or not, the sudden torque change could be deadly. I had enough near misses in the early days of highway Autopilot on that road, and I'm wary of going back....
 
Finally got hit with this ~38 days after getting 10.69.3.1+. Apparently it didn't like me keeping my hand on the wheel like I always do: immediate detection and strike. The message was ~"take over immediately", but I see "defeat device detected" in notifications. Seems like my car without an interior camera is limited to 3 strikes as well.

What's the alternative?

1. Keep my hands off the wheel, and eventually get audio warnings and strikeouts because you CAN'T SEE the tiny warning at the bottom of the screen while paying attention to the road. A complaint made many, many times in the last ~7 years. They could at least flash the whole binnacle screen white.
2. Keep doing what I'm doing until FSD is disabled due to strikeouts
3. Stop using autopilot entirely.
 
Finally got hit with this ~38 days after getting 10.69.3.1+. Apparently it didn't like me keeping my hand on the wheel like I always do: immediate detection and strike. The message was ~"take over immediately", but I see "defeat device detected" in notifications. Seems like my car without an interior camera is limited to 3 strikes as well.

What's the alternative?

1. Keep my hands off the wheel, and eventually get audio warnings and strikeouts because you CAN'T SEE the tiny warning at the bottom of the screen while paying attention to the road. A complaint made many, many times in the last ~7 years. They could at least flash the whole binnacle screen white.
2. Keep doing what I'm doing until FSD is disabled due to strikeouts
3. Stop using autopilot entirely.

4. Assume that they'll do a strike reset for V11 anyway, contact the FSD team and complain about the bogus strike, post on Twitter complaining about the bogus strike, and generally make as much noise as possible. If everyone complained enough about false strikes to cause bad press, then their engineering management would pay much more attention to user satisfaction than to the trolls trying to get FSD banned for failing to detect abuse.
 
shuts off a driver safety feature

It’s not a driver safety feature; it is a convenience feature.

Even when FSD Beta features are disabled, lane departure features and other safety features (AEB) are still available. (For example, if you go into Autopilot jail for exceeding 85 mph, lane keep assist and ELDA still work, unless you abuse them and they are shut off; ELDA is probably never shut off, but I'm not sure.)
 
I would argue that FSD provides additional detection of obstacles that a person might miss, ergo it is a safety feature. Doubly so if the driver is fatigued.

That detection (and response!), to the extent it exists, should still operate with FSD disabled. Obviously a bit tricky to test, but there is no reason to think it would be disabled, since no other safety features are.

Only the convenience features are disabled.
 
4. Assume that they'll do a strike reset for V11 anyway, contact the FSD team and complain about the bogus strike, post on Twitter complaining about the bogus strike, and generally make as much noise as possible. If everyone complained enough about false strikes to cause bad press, then their engineering management would pay much more attention to user satisfaction than to the trolls trying to get FSD banned for failing to detect abuse.
At this point, I have no faith even V11 will reset the strikes. It's already been three updates for me and I'm still locked out. I've already emailed [email protected] and followed up 4 times. Radio silence. This entire experience has me considering another automaker for my next EV.