Seriously people? NHTSA is up Tesla's ass on driver monitoring - "Fixed" means the monitoring will be back, and that nag-free drives for 10 mins was a bug.

That's a possible interpretation. But if it was really about fixing a bug, then why would Elon respond to it by saying they'd fix the bug in the 12.4 minor release, instead of another 12.3.x patch release?

I think it's more likely he's saying the nag will be fixed as a new feature in 12.4 by being "killed" as Omar suggested in the tweet Elon was directly replying to.
 
Not a chance. If they eliminate nags, then drivers don't have to pay attention, and that's a Level 3 feature.
No, it doesn't mean that. Drivers still have to pay attention, and that is enforced by the camera and the usage agreement. If the driver looks away, the nag will return. What this means is that there aren't any nags while the driver is paying attention according to the camera software and seat sensor.
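Roughly, I'd expect the logic to work something like this (just my own sketch -- the signal names and the grace period are made up, not anything from Tesla's firmware):

```python
# Toy sketch of camera-based nag suppression. The signal names and the
# grace period are hypothetical -- this is just the logic described above,
# not anything from Tesla's firmware.

ATTENTION_GRACE_S = 3.0  # how long a glance away is tolerated before the nag returns

def nag_required(eyes_on_road: bool, hands_on_wheel: bool,
                 driver_in_seat: bool, seconds_looking_away: float) -> bool:
    """Return True if the steering-wheel nag should be shown."""
    if not driver_in_seat:
        return True                       # seat sensor says nobody is driving
    if eyes_on_road:
        return False                      # camera says attentive: no nag
    if seconds_looking_away <= ATTENTION_GRACE_S:
        return False                      # brief glance away, still no nag
    return not hands_on_wheel             # looked away too long: nag unless hands are detected
```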
 
Elon implies 12.4 will be hands-free, in some capacity:

Nah, clearly he's implying that 12.4 will give you a reason to want to hold the wheel, which means you'll never see the nag again.

(/s obv)

Will be interesting to see how long it takes (if it makes it out at all) until the NHTSA has them revert it lol, especially after the scare letter they sent when he merely implied it on Twitter.
 
Interesting how so many people are completely misinterpreting or twisting this. If Tesla removes the requirement to torque the wheel, it's because they believe they've refined and proven camera monitoring instead.

A different method to ensure the driver is paying attention. Not to allow the driver to stop paying attention.

Yet there's so much off-base discussion in the X responses. On the giddy-enthusiastic side, about this being L3, or this allowing people to drive with their eyes closed etc. On the pessimistic critic side, about this being irresponsible or unacceptable to NHTSA. All just basically missing the point.

Whether or not the camera+NN really does work well enough now, whether or not the wheel nag removal will apply to all cars or just newer IR-equipped cars, both remain to be seen - but assuming Tesla is oblivious of these points is nonsense.

Regarding NHTSA, they've already accepted the principle of L2 camera-based driver attention monitoring. I've seen Ford commercials showing the driver clapping to the music while Blue Cruise barrels along (towing a trailer no less).

The issue for Tesla has been whether the interior camera hardware and software, originally intended for robotaxi cabin monitoring, has been successfully upgraded and repurposed for the driver monitoring task. The competitors use a more direct view and a more fundamentally IR-based method; I commented on this before:
... I suspect that the above-mirror position of the cabin camera and IR LED probably doesn't correspond to the optimal position for a purposefully designed eye tracking setup. There's no doubt that the AI analysis software can be trained to estimate where you're looking, more or less the same way you can infer where someone is looking whenever you can see his/her face. But I'm not sure that the "real" geometrical IR-return eye tracking method can be accomplished by Tesla's setup, as effectively as it may be in a purpose-built setup as with Ford Blue Cruise or similar systems.
Tesla thinks they've got it, at least according to Elon's recent posts. NHTSA should be open to approving (or more accurately, open to not objecting/interfering) based on testing data and real-world results.
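For anyone wondering what I mean by the "real" geometrical IR-return method versus letting a neural net guess from the face: a purpose-built tracker bounces IR off the cornea and estimates gaze from the offset between the pupil center and that glint. A toy version of that calculation (the pixel coordinates, calibration gain and angle limits are invented for illustration):

```python
# Toy pupil-center / corneal-reflection (PCCR) gaze estimate -- the "geometric"
# method a purpose-built IR tracker uses. The calibration gain, pixel
# coordinates and angle limits below are invented for illustration only.

GAIN_DEG_PER_PX = 0.35   # hypothetical calibration: degrees of gaze per pixel of offset

def gaze_angles(pupil_px, glint_px):
    """Estimate (yaw, pitch) gaze angles in degrees from the offset between
    the pupil center and the IR glint in the image."""
    dx = pupil_px[0] - glint_px[0]
    dy = pupil_px[1] - glint_px[1]
    return dx * GAIN_DEG_PER_PX, dy * GAIN_DEG_PER_PX

def looking_at_road(pupil_px, glint_px, yaw_limit=20.0, pitch_limit=15.0):
    yaw, pitch = gaze_angles(pupil_px, glint_px)
    return abs(yaw) <= yaw_limit and abs(pitch) <= pitch_limit

print(looking_at_road((412, 300), (400, 296)))   # small offset -> roughly eyes-forward -> True
print(looking_at_road((480, 300), (400, 296)))   # big horizontal offset -> looking away -> False
```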
 
I have yet to get 12.ANYTHING on my 2021 Refresh Model S....BUT my wife's 2017 Model X on 12.3.6 is actually quite good. (She is even impressed and she has called everything up to and including 11.4.9 total 💩)

Her car of course has no internal camera so I wonder how enjoyable those early FSD cars will be with less hands on the wheel and no camera NAG. Maybe those 🦄 will have quite the resale value too.

FSD
Free Unlimited SuperCharging
Free Unlimited Premium Connectivity
NO INTERNAL CAMERA to NAG!!!
 
Whether or not the camera+NN really does work well enough now, whether or not the wheel nag removal will apply to all cars or just newer IR-equipped cars, both remain to be seen - but assuming Tesla is oblivious of these points is nonsense.

Yeah that's going to be pretty interesting as, for example, early FSD adopters in non-camera cars will likely be upset if they keep nags and nobody else does....As would the even larger # of folks with non-IR cameras (though I suppose that'd be a relatively easy upgrade for Tesla to offer if they wished since the wiring is all there)


Regarding NHTSA, they've already accepted the principle of L2 camera-based driver attention monitoring. I've seen Ford commercials showing the driver clapping to the music while Blue Cruise barrels along (towing a trailer no less).

While that's true, keep in mind Ford's system is vastly more restricted in where you can turn it on. It also has a better type of camera (barring the newer IR ones, perhaps) and a much better FOV for driver monitoring due to its placement (which you do touch on as well). So not quite apples to apples there.

Another interesting aspect-- do the nags ONLY go away for FSD, but stay for basic AP? Because unlike Ford, Tesla allows (and always has allowed) enabling that system on the majority of roads it's not intended to be used on-- so monitoring would be significantly more vital there.




FSD for Cybertruck still a few months away (Elon time):



This should at least debunk the idea folks have that OEMs can go license FSD and it'll immediately work just fine as long as they use Tesla's cameras and computer.... Sounds like instead they need a bunch of miles per type of vehicle to be able to get it working on that type of vehicle-- rather than some folks' theories that they just need to set the size of the vehicle as a variable somewhere.
 
I have yet to get 12.ANYTHING on my 2021 Refresh Model S....BUT my wife's 2017 Model X on 12.3.6 is actually quite good. (She is even impressed and she has called everything up to and including 11.4.9 total 💩)

Her car of course has no internal camera so I wonder how enjoyable those early FSD cars will be with less hands on the wheel and no camera NAG. Maybe those 🦄 will have quite the resale value too.

FSD
Free Unlimited SuperCharging
Free Unlimited Premium Connectivity
NO INTERNAL CAMERA to NAG!!!

They will almost definitely keep the steering wheel nag on the pre-camera cars.
 
100% they will keep it until FSD is unsupervised.

Maybe in 20 years at today's rate of FSD progress.

I remember back in the early days of AP1, there was no steering nag at all. Those were the good days.

If that is the case, I sure do prefer that over the camera. I can't even switch a song on my screen in my Refresh S without "PAY ATTENTION" LMAO

I agree. And you can find a nag defeat device for the steering input too, which is a plus.
 
Sounds like instead they need a bunch of miles per type of vehicle to be able to get it working on that type of vehicle-- rather than some folks theories they just need to set the size of the vehicle as a variable somewhere.
Not really. The control aspects of the CT are quite different and thus probably need a new set of APIs for FSD to interface with. It may have absolutely nothing to do with being a "new type of vehicle". For example, the CT doesn't use a CAN bus but Ethernet.
 
Not really. The control aspects of the CT are quite different and thus probably need a new set of APIs for FSD to interface with. It may have absolutely nothing to do with being a "new type of vehicle". For example, the CT doesn't use a CAN bus but Ethernet.


Why would that matter? You can connect two computers over Ethernet, Token Ring, CAN bus, acoustic modems, whatever you want; it's just a way to get the packet from A to B-- the computers on either end don't care how the signal got there.

The physical wheel today, since it's steer-by-wire, has to send a steering signal to control the steering angle-- the FSD computer would just send the same signal, the same way.

Likewise, electronic control of throttle and brake is a trivially solved problem.
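To put it concretely, the bus is just plumbing. A rough sketch (all class and field names invented, nothing from Tesla's actual software) of how the same command rides over either transport:

```python
# Illustrative only -- invented class and field names, not Tesla's actual stack.
# The same control command goes out whether the transport underneath is CAN
# or Ethernet; only the serialization/delivery layer changes.

from dataclasses import dataclass

@dataclass
class ControlCommand:
    steering_angle_deg: float
    throttle_pct: float
    brake_pct: float

class CanTransport:
    def send(self, payload: bytes) -> None:
        print(f"CAN frame out:       {payload.hex()}")   # would hand off to a CAN controller

class EthernetTransport:
    def send(self, payload: bytes) -> None:
        print(f"Ethernet packet out: {payload.hex()}")   # would write to a UDP socket

def send_command(cmd: ControlCommand, transport) -> None:
    # Serialize the same command regardless of the bus underneath.
    payload = f"{cmd.steering_angle_deg:.2f},{cmd.throttle_pct:.2f},{cmd.brake_pct:.2f}".encode()
    transport.send(payload)

cmd = ControlCommand(steering_angle_deg=-3.5, throttle_pct=12.0, brake_pct=0.0)
send_command(cmd, CanTransport())        # legacy vehicle
send_command(cmd, EthernetTransport())   # Cybertruck-style vehicle
```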
 
Sounds like instead they need a bunch of miles per type of vehicle to be able to get it working on that type of vehicle-- rather than some folks theories they just need to set the size of the vehicle as a variable somewhere.
Not really. The control aspects of the CT are quite different and thus probably need a new set of APIs for FSD to interface with. It may have absolutely nothing to do with being a "new type of vehicle". For example, the CT doesn't use a CAN bus but Ethernet.
I think the problem of Cybertruck FSD likely has more to do with its size and viewpoint limitations (without integrating the bumper camera video) and its different control response curves, particularly the variable-ratio (adaptive and context-based) steering. And, more significantly, its relatively low priority.
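Just to illustrate the variable-ratio point with a toy model (the numbers are invented, not Cybertruck specs):

```python
# Toy model of variable-ratio (speed-dependent) steering. The ratios and the
# 100 km/h blend point are invented numbers, not Cybertruck specs; the point
# is only that the same steering command maps to different road-wheel angles
# at different speeds, so the controller has to be tuned per vehicle.

LOW_SPEED_RATIO = 5.0    # hypothetical: very quick steering when parking
HIGH_SPEED_RATIO = 12.0  # hypothetical: much slower steering at highway speed

def steering_ratio(speed_kph: float) -> float:
    """Blend linearly from the low-speed to the high-speed ratio up to 100 km/h."""
    t = min(max(speed_kph / 100.0, 0.0), 1.0)
    return LOW_SPEED_RATIO + t * (HIGH_SPEED_RATIO - LOW_SPEED_RATIO)

def road_wheel_angle(handwheel_deg: float, speed_kph: float) -> float:
    return handwheel_deg / steering_ratio(speed_kph)

print(road_wheel_angle(90, 10))    # ~15.8 deg of road-wheel angle in a parking lot
print(road_wheel_angle(90, 120))   # 7.5 deg at highway speed for the same input
```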

I could be proven wrong, but I don't believe FSD interfaces to the vehicle at the CAN bus level, or generally at the module control level. It interfaces at the steering motor, accelerator input, and brake activation level.

As humans with some driving experience, we can get into an unfamiliar car and do pretty well, as long as we take it easy pulling out of the lot to get a feel for the car. And by the way, modern cars are specifically engineered to respond within a relatively narrow and familiar envelope. Much different if we were plunked down into a delivery van or a city bus or an armored personnel carrier. We can learn any of these but shouldn't be thrown into the soup without some acclimation time. Even the cars I grew up with had significant variations in the touchiness of the brakes, the steering effort and feel (not to mention the shift and clutch action which is becoming a nostalgic hobbyist skill).

All of us here had to get used to the tesla/EV one-pedal driving style. Some people are comfortable within minutes, others need a few days or weeks to overcome the ingrained muscle memory.

But people often note that FSD only learns at the fleet Mothership level, and even though the AI can generalize, every passing second of the drive is "new" to FSD. So it definitely does need to be trained and it doesn't perform human-like acclimation in situ.

Using this reasoning, my conclusion is that Cybertruck FSD has no worrisome challenges, but it does need customized training, and as Elon said it's just very low on the priority list as the 1% outlier. BTW we're not even discussing the Tesla Semi, which AFAIK has no Autopilot of any kind right now.

I already see tons of ridicule heaped on Elon for doing the Cybertruck at all, instead of the Everyman Tesla. I disagree strongly and I think people don't appreciate the future applications of all the extremely significant new engineering that is within that stainless shell. But how long would it take the critics to scream about FSD resources, any at all, being deployed for Cybertruck? About five seconds I think. The early adopters are still waiting, not quietly, on unfulfilled promises, and the all-in stockholders are waiting for disruptive Robotaxi valuation.
 