
Why Isn't Tesla Using their new Driver Monitoring System (DMS) to Improve Safety when NOT in Autopilot?

Why not more active safety when not in AP, since the new DMS exists?

  • The examples given are with HW2.5 and the vehicle with HW3 already does all of this flawlessly.

    Votes: 0 0.0%
  • The Tesla neural nets are learning from each of these collisions and it's just a matter of time.

    Votes: 0 0.0%
  • Tesla believes it will encourage people to drive while distracted, even if a lot of nags are added.

    Votes: 0 0.0%

  • Total voters: 13
  • Poll closed.

AlanSubie4Life

Efficiency Obsessed Member
As has been detailed by @verygreen (and here: Tesla has allegedly activated a selfie cam Driver Monitoring System), there is a DMS being evaluated by Tesla now, using the in-cabin cameras on Model 3/Y (it's not clear whether this was already in the works or is a response to European regulation and various recent "incidents"). It's not clear to me how much this is currently being used, or for what (is it used to reduce nags while on AP? I don't know, and I wasn't able to confirm any reduction of nags or torque requirement in my vehicle - and I'm not sure I ever had the exact right release, since I never saw it in the release notes for xx.15.xx - I'm really looking forward to the day when I can actually drive in AP with both hands firmly on the wheel again, haha).

Do we think Tesla is soon going to prioritize safety by using this DMS output to dynamically adjust the well-known (hypothesized) "safety sliders"? They could do this specifically when the car is not in Autosteer/TACC mode, and on vehicles without FSD (but with HW3, of course). They currently do NOT mention this in the release notes.

[Attached screenshot, 2021-06-09]


It should be a reasonably easy thing to adjust the safety slider sensitivity upwards when driver inattention is detected, and then adjust it back down when the driver appears to be paying attention. Again, this is for when NOT in Autopilot.
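To illustrate the idea, here's a back-of-the-envelope sketch in Python - the attention score, thresholds, and sensitivity levels are all made up for illustration, not anything known to be in Tesla's firmware:

```python
# Rough sketch: raise active-safety sensitivity when the DMS reports sustained
# inattention, and relax it again once attention returns. All names and
# thresholds here are hypothetical, not Tesla's actual implementation.

def adjust_safety_slider(attention_score: float,
                         current_level: str,
                         in_autopilot: bool) -> str:
    """Return the active-safety sensitivity level ('standard' or 'ultra')."""
    if in_autopilot:
        return current_level      # leave AP behavior alone; this is for manual driving
    if attention_score < 0.3:     # sustained inattention detected by the DMS
        return "ultra"            # e.g. earlier FCW, more aggressive intervention thresholds
    if attention_score > 0.7:     # driver clearly attentive again
        return "standard"
    return current_level          # hysteresis band: don't flip-flop near the boundary
```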

@verygreen recently posted a couple examples on Twitter which highlight this. It's a great point he is making (and one we have discussed here before, even without the DMS in place) - why not focus on improving safety by using the DMS to go into an ultra sensitive mode when inattention is detected? Why even worry about the nag threshold adjustment or adding extra nags, when in AP? The potential safety gains from the car intervening more aggressively when not using AP, when the driver is not paying attention, seem greater than those obtained from better monitoring while in AP... I would think these would improve Tesla's safety statistics quite considerably overall (though it would not show in their Autopilot statistics of course, and in fact it would probably close the gap to their selection-bias skewed AP statistics...though open the gap vs. all other cars). So why not? It's personally mysterious to me that Teslas with apparently fairly serviceable perception are still piling into vehicles by the side of the street. It seems unnecessary - even with false positives as a potential issue - especially with a DMS.


Or, do we think one or more of the following applies (the poll allows selection of multiple answers):

1) The DMS is not reliable enough yet to use its output for this slider adjustment (too many false positives sustained for too long)?

2) Vehicle trajectory correction based on perception is inherently risky and Tesla doesn't trust the car to do it correctly (which might open them up to liability?), except in situations where AP is available and in use?

3) Vehicle perception is nowhere near reliable enough, too many false positives and false negatives, because of limitations of AI.

4) They're going to do it; they're just still working on it and it is going to come in the "immense foundational improvements" of V9. The difference will be gigantic.

5) The examples given above are with HW2.5 and the vehicle already does all of this and more with HW3, I'm just not trusting the car enough, and I should try running into parked cars to see what happens.

6) Tesla doesn't want to make their Autopilot statistics look relatively worse, which this would do.

7) The Tesla neural nets are learning from each of these collisions and it's just a matter of time before it starts avoiding collisions on its own.

8) This will lead to people driving around not paying attention, even if the car is beeping and braking and hassling them a lot, so Tesla doesn't want to encourage this.

9) Or?


This is the main reason I upgraded to HW3 (improved active safety features when not using AP/FSD), so hoping it happens ASAP!
 
I personally don't believe that camera was ever intended for "driver monitoring." They may be activating it for that, just like they activated the cameras for Sentry Mode, but that was a response to break-ins in San Francisco.

The mirror seems to be a VERY poor place to "monitor the driver," as opposed to somewhere in front of the driver, where it would be if that had been the original intention. I don't think they will be doing anything with that camera that assumes it has a perfect view of the driver, thus it's my opinion they won't be using it to reduce nags or anything like that.

I usually stay "way" out of these discussions in this subforum, for the most part, because they look and feel like people debating something other than cars, but that's my personal opinion, which isn't worth anything more than any other random internet person's.
 
Elon might just want to install the leading DMS supplier to the world's automakers, Seeing Machines, with 20 years of AI research behind it. A DMS can't just be activated because there is a camera looking at the driver. There are so many levels to it, and data is king.
 
I don't think that Tesla really wants to nag the driver. If you look at other brands (esp. Korean), they nag so much more: they bing and beep all the time, have disclaimers which have to be cleared from the screen each time the car starts, etc.

The use cases shown by verygreen are clear examples where active safety can help save the day. DMS might not be the complete answer - for all we know, the driver was looking ahead and just not paying attention - but it could be a valuable part of the answer.

Anyway, AP famously ignores parked vehicles, so only "City Streets (beta)" would identify the possibility of a crash in those examples and safely take avoidance action. So: V9
 
I don't think they will be doing anything with that camera that assumes it has a perfect view of the driver, thus it's my opinion they won't be using it to reduce nags or anything like that.
I don't think that Tesla really wants to nag the driver.

Definitely no nag reduction yet. Looks like just nag increases so far! If you look at the tests, it’s actually surprising (to me) how apparently capable the system is. It seems like they could use it to eliminate or reduce torque sensor requirements when lighting conditions allow - or at least now in accident reports we won’t hear this nonsense about “no hands were detected on the wheel for 7 seconds prior to collision” (when we all know they could have been on the wheel the whole time - we just have no idea).



so only "City Streets (beta)" would identify the possibility of a crash in those examples and safely take avoidance action. So: V9

But, will it, if you don’t have FSD (but have HW3), and you aren’t using City Streets? That’s the big question! Will they decide to take over from the driver? How will they set and adjust the threshold (safety slider)?

It definitely seems to me that it should not matter whether or not you have FSD for this sort of feature (Tesla and Elon have said in the past, I think, that they won't ever charge extra for safety features).


DMS might not be the complete answer - for all we know, the driver was looking ahead and just not paying attention - but it could be a valuable part of the answer

Yes, it is not essential that it work all the time though. I think in the second case the driver was probably drunk. And in the first case the driver reacted at the last second, so probably inattentive. But in any case it wouldn’t have to fix either of these scenarios, yet it could still improve safety a lot.
 
The mirror seems to be a VERY poor place to "monitor the driver," as opposed to somewhere in front of the driver, where it would be
I actually disagree to a degree here. While a camera in front of the driver is great if you only care about the face of the driver, it's bad if you want to ensure their hands are free to take over at any time (think of the highly touted SuperCruise, for example: it checks that you are looking forward, so you can easily trick it with this method and it's none the wiser).
Even the plain ability to see there's a phone in the driver's lap or hands, even while they keep glancing forward (at a reduced level of attention, though all these systems have a time threshold for activating), is invaluable for knowing that the attention level is reduced.
 
I have to say I was wondering about many of these things and what is being monitored, and I was correct in thinking there are likely at least 10 variables they are trying to use to determine what a person is doing. Sunglasses on, head up/down, left/right, phone use, somebody put tape over the camera!
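Just to picture it, the per-frame output of such a system might look roughly like this - a purely hypothetical sketch; the field names are guesses based on the behaviors listed above, not the actual labels verygreen decoded:

```python
from dataclasses import dataclass

# Hypothetical per-frame DMS output, loosely matching the behaviors people have
# speculated about in this thread. Field names are illustrative only.
@dataclass
class DriverState:
    eyes_on_road: bool
    head_direction: str      # "forward", "down", "left", "right"
    sunglasses: bool
    phone_in_hand: bool
    camera_blocked: bool     # e.g. tape over the lens

    def inattentive(self) -> bool:
        # A blocked camera can't confirm attention, so treat it as inattentive/unknown.
        return self.camera_blocked or self.phone_in_hand or not self.eyes_on_road
```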
 
Yeah, it makes me think it is actually pretty good! Hard to tell exactly what the false positive rate and the actual capability are, though.

Having an averaging window and time threshold for activation should help bring that false positive rate way down - not a luxury you have when identifying objects in the drivable space, so quite a different type of application.
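Something like a simple debounce over the last few seconds of per-frame classifications would do it. A rough sketch, with an arbitrary window length and trigger fraction:

```python
from collections import deque

# Sketch of a time-threshold filter over per-frame inattention flags.
# Window length and trigger fraction are arbitrary illustrative values.
class InattentionFilter:
    def __init__(self, window_frames: int = 90, trigger_fraction: float = 0.8):
        self.history = deque(maxlen=window_frames)   # e.g. ~3 s of frames at 30 fps
        self.trigger_fraction = trigger_fraction

    def update(self, inattentive_this_frame: bool) -> bool:
        """Feed one frame's classification; returns True only after sustained inattention."""
        self.history.append(inattentive_this_frame)
        if len(self.history) < self.history.maxlen:
            return False                              # not enough evidence yet
        return sum(self.history) / len(self.history) >= self.trigger_fraction
```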

In addition, the mostly well-controlled scene and the limited set of behaviors to be identified must help a lot with identification & accuracy.

Darkness obviously wouldn't be so good, but that doesn't matter so much for the proposed use. It doesn't matter very much if it doesn't work sometimes! There are some videos in the threads above showing dark performance - it still works under a lot of urban lighting conditions. And maybe long exposure neural net techniques (like Apple's Night Mode) can be used to overcome those obstacles (again, a little latency is not so important in many cases for this application).
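The simplest version of that idea is just stacking a burst of consecutive frames to trade latency for signal. A toy sketch (real night-mode pipelines also align frames and handle motion, which this ignores entirely):

```python
import numpy as np

# Toy illustration: average N consecutive frames to cut sensor noise, then apply
# digital gain to brighten. Real multi-frame low-light pipelines also align
# frames and compensate for motion; this ignores all of that.
def stack_frames(frames: list[np.ndarray], gain: float = 4.0) -> np.ndarray:
    """Combine a burst of same-sized uint8 frames into one brighter, less noisy image."""
    avg = np.mean(np.stack(frames).astype(np.float32), axis=0)
    return np.clip(avg * gain, 0, 255).astype(np.uint8)
```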

Anyway, I think this is one of the most exciting things that has happened with Tesla in quite some time, and I am looking forward to them making full use of it (and it will be interesting to see how much more capable the camera on the Model S Plaid is). It doesn't seem like there is much interest here in this topic, though. 😢 Not glamorous enough maybe. In my opinion, this is more promising than FSD right now, though, for making driving safer. Potentially a much shorter path to much reduced accident rates - in all vehicles with HW3.

Distracted driving is a big problem, though I'm not sure what the exact statistics are (they would be hard to determine accurately). But there are so many possibilities here, even with this primitive camera. An opportunity for Tesla to really differentiate themselves (better be quick though).

Maybe THIS will be the "one last thing" announced at the Plaid event tonight. 😉
 
FSD won't need.

What about when people choose not to use FSD? That was the main area I was wondering about…use outside of FSD.

Regarding FSD…it is not here yet.
What if you can easily save lives in the meantime? Keep in mind many people bought the L2 version of FSD (not yet available). So after that is released, while Tesla is working on L3-L5, what about driver monitoring?

But the real focus of my question, again, was when NOT using FSD.

I strongly suspect that even now Tesla kill rates are lower than anyone else's [read: safest cars sold].

The data is unclear on that. No one really knows, as far as I can tell, though Tesla can say theirs are among the safer cars on the road, due to vehicle age, weight, and demographics (and the design of the cars as well).
 
Yes. I've been wondering the same thing for a while. I liked that Green also brought that up in a tweet.

If you're not paying attention, alerts would be great - you'd benefit from the ADAS systems more than someone who is paying full attention - but since all DMS only monitor and alert while the ADAS features are in use, people are more likely to turn them off when they aren't paying attention.

Just make the DMS alerts active all of the time that a vehicle isn't in L3+ modes and it will help tremendously.
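In other words, something as simple as this kind of gate (hypothetical, obviously; the SAE automation level input is just for illustration):

```python
def dms_alerts_enabled(automation_level: int) -> bool:
    """Hypothetical gate: keep driver-monitoring alerts on for L0-L2, off only at L3 and above."""
    return automation_level < 3
```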
 


Alleges the refresh S has an interior cam and IR lighting (the lack of which on the 3/Y is one of the major clues that they never originally meant to use the camera for driver monitoring, but are now forced to due to coming EU regs)
Yep, that has been mentioned in a few places; I guess I should have added it here. (It is confirmed - multiple videos showing this exist.) Thanks. I wonder when we will see those illuminators on 3/Y vehicles. (Well, you wouldn't see them, since they aren't visible.) I also wonder whether they could technically do something like Night Mode to do better with the existing hardware in the Model 3/Y. (I don't know whether there are technical limitations of the poor-quality sensors that don't apply to the imager in the iPhone - there are many, of course, but I mean with respect to implementing Night Mode.) Obviously, even if it were possible, latency (up to 3 seconds!) would be an issue, but it seems tolerable for this purpose...maybe.