Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

53 countries adopt strict FSD regulations

Because the driver is less likely to pay attention with a semi-autonomous system than driving manually.

In manual driving, you have every incentive to pay attention, since the car cannot drive you anywhere when you are not paying attention. But in semi-autonomous mode, you have less incentive to pay attention, because the car can still "drive" you when you are not. In fact, you can go miles with the car driving for you without having to do anything. That can lull you into a false sense of security. So yes, it is potentially more dangerous: you will pay less attention with a semi-autonomous system that lacks a reliable driver monitoring system than you would driving manually.

That makes logical sense; the incentive is there. But lots of humans don't follow it when driving manually. A large number of people, despite this incentive to stay alive, are chronically texting and fiddling with their phones. If you go on AP and take quick glances (safely, of course) into the cars next to you, a scary number of people are doing a wide variety of activities while driving manually. Sometimes I'm pretty impressed with their peripheral vision.
 
I'm one of those who is NOT on board with 'eye tracking'.

I simply don't trust the data collectors enough to hand over an image of my face every second of my drive. nope, not gonna happen!

if that means giving up L3 and maybe L4, well, I'll do that or hack it (did I say that? who typed that? not me.)

now, if we use some other sensor that does not take a true photo, I'm open to that. but photos? NO. I am ok with wheel tugs, in fact.

I don't want mics to be MANDATORY in cars, and the same goes for inward-pointing cameras. point at the road all you want, but never inside.

oblig: GOML

As I understand it, there is no facial data collection whatsoever. This is going off what I've read about the Cadillac SuperCruise driver monitoring system and how it works.

Obviously it is likely going to store processed data, such as where it detected your eyes, and that kind of thing.

So I would trust that kind of system over a neural-network-based system that collects source data for a massive dataset.

To be perfectly honest, I'm not sure I'd trust Tesla with the internal camera in a Model 3/Y. Sure, they have a setting where you can enable or disable data collection from it in an accident, but as a company they've been a complete letdown when it comes to protecting customer data. They don't even wipe computers before sending them to recycling.
 
The measures were adopted by the United Nations Economic Commission for Europe (UNECE) World Forum for Harmonization of Vehicle Regulations, which brings together 53 countries, not just in Europe but also in Africa and Asia.
There needs to be a "crying" or "sad" reaction emoji. Bureaucracy is NOT going to solve FSD, and more people will end up dying as a result.

But on the bright side, this might focus Tesla's attention on FSD in the NA market. (and I am grateful to live in the US of A!)
 
As I understand it there is no facial data collection whatsoever.

if the sensor is an imaging sensor (it is), then images will be taken.

it's like laws: you don't pass a law and then say it won't ever be used, and *certainly* not in any corner cases. the world is just too just and perfect for things like *that* to happen...

I would be ok with sensors that are not true imaging sensors. there is ZERO need to take pictures of occupants. if we can't solve this with a non-invasive sensor type, then we should just give up on the 'watch the occupant' BS.

effort is better spent on algorithms that actually drive the car rather than nanny the user! I'm an adult; I don't want or need a 'drive nanny' judging me, one that will almost always be used against me (hands up, here: who thinks interior images will ever be used in a driver's defense?)

these things have been abused whenever we grant those powers. I'm not in favor of extending the surveillance state of the art into the interior of my car or my home. both are OFF LIMITS.

solve your damned nannying problem some other way - sonar or whatever - but images are verboten. same with the mic: if the car can't be effectively driven with the mics disabled, then the design is broken, imho.
 
effort is better spent on algorithms that actually drive the car rather than nanny the user! I'm an adult; I don't want or need a 'drive nanny' judging me, one that will almost always be used against me (hands up, here: who thinks interior images will ever be used in a driver's defense?)

@linux-works we have common ground where we agree on something important in the FSD/ADAS realm! :):D
 
if the sensor is an imaging sensor (it is), then images will be taken.

it's like laws: you don't pass a law and then say it won't ever be used, and *certainly* not in any corner cases. the world is just too just and perfect for things like *that* to happen...

I would be ok with sensors that are not true imaging sensors. there is ZERO need to take pictures of occupants. if we can't solve this with a non-invasive sensor type, then we should just give up on the 'watch the occupant' BS.

effort is better spent on algorithms that actually drive the car rather than nanny the user! I'm an adult; I don't want or need a 'drive nanny' judging me, one that will almost always be used against me (hands up, here: who thinks interior images will ever be used in a driver's defense?)

these things have been abused whenever we grant those powers. I'm not in favor of extending the surveillance state of the art into the interior of my car or my home. both are OFF LIMITS.

solve your damned nannying problem some other way - sonar or whatever - but images are verboten. same with the mic: if the car can't be effectively driven with the mics disabled, then the design is broken, imho.

I can respect your position on it, but personally I don't see an issue, because it's a black-box solution. What comes out of a driver monitoring system is not visual data but information derived from processing the visual data. The raw images can't be stored, because they never leave the black box supplied by the driver-monitoring-system vendor.

I also wonder whether one would really call an IR camera a visual camera, since its filter blocks quite a bit of the visible spectrum. That cuts down on erroneous input and simplifies the processing. No image data is stored: the system only tracks your face and eyes, and it processes the data in real time.
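To make the "black box" point concrete, here's a toy Python sketch of the kind of interface such a module could expose. Everything here is made up for illustration (the names, and a brightness check standing in for a real gaze-detection model); the point is just that only derived metrics come out of the module, never pixels:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class GazeSample:
    # Derived metrics only; the raw frame never leaves this module.
    eyes_on_road: bool
    blink_detected: bool
    head_yaw_deg: float


def process_frame(frame: list[list[int]]) -> GazeSample:
    """Stand-in for a sealed DMS pipeline: consume pixels, emit metrics.

    The frame is read, reduced to a handful of numbers, and then
    discarded -- nothing here retains or serializes the image.
    """
    # Toy 'analysis': brightness of the centre row stands in for real
    # eye/head detection, which would be a trained model in practice.
    centre = frame[len(frame) // 2]
    brightness = sum(centre) / len(centre)
    return GazeSample(
        eyes_on_road=brightness > 100,
        blink_detected=brightness < 20,
        head_yaw_deg=0.0,
    )


# A dim 4x4 'frame': metrics come out, the image does not.
sample = process_frame([[10] * 4 for _ in range(4)])
print(sample.eyes_on_road, sample.blink_detected)  # False True
```

If the vendor's module really works like this, the privacy question reduces to whether you trust that the image truly never leaves the box, which is exactly the disagreement in this thread.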

These systems likely won't stay camera-based for long, because companies like TI are introducing radar-based alternatives. Those might be a little more alarming, since they can track things like respiration rate. So all you're really doing is trading one kind of privacy for another.

I won't have any issue with a driver monitoring system as long as I know what's being used and I have reassurance that visual data isn't being recorded.

At this time, the only company I know of that uses an unfiltered interior camera to store interior visual data is Tesla, with the Model 3/Y. They claim they only record data in specific situations (like a crash), and that you have to opt in.

On my car I opted out, but it wasn't an easy decision.

I didn't want it because it's an invasion of privacy, but I did want it because it's the only thing that can defend me if I'm dead.
 
unless I get to audit the code and hardware (or someone I trust does), I won't believe any vendor's word about securing my privacy. that NEVER works, and I laugh each time I hear someone trust a company to protect their privacy or personal info.

tesla does not even erase customer info when they deactivate cars, or when a car from an accident is marked 'totaled' and gets sent to the parts guy.

the best way to avoid data collection is to avoid feeding the monster. there will be a circular bit of tape over the interior camera on my m3, as there is on every phone and laptop I have (and everyone at work does that too, on their computers/phones).

the day of 'trust us' is long gone, when it comes to data privacy. companies don't care and there are essentially no penalties, so nothing is going to change, given the current trajectory.

no, I won't trust vendor promises that the image data won't be stored, used, or associated. I've been in the field far too long to fall for THAT line again.

again, I'm ok with the outside cameras being on; those are 100% needed for the driver assist features. but so far the interior camera hasn't been needed, and I'm quite ok with the current level of features that don't NEED it.

vendors had better treat this as an option, since not everyone is going to be excited to expose their cabin to a/v snooping.
 
A lot of false information and BS has been spread about this regulation. The only proper way to discuss it is to read the document.

For instance, eye monitoring is nowhere mandated. But two "driver availability criteria" must have been passed in the last 30 seconds in order to keep the system active. There is no binding list of valid criteria; only examples are given.

Wheel-torque nagging is likely one possibility. But a second one is needed, which is why eye monitoring might be used on the Model 3. There are other options, such as asking the driver to press a wheel button or lever, but doing that every 30 seconds would be really stupid.

Also, I guess any regulation will apply only to new vehicles. In that case, we will likely see a driver-facing camera coming to the Model S in the near future.
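Out of curiosity, I sketched what that "two criteria within the last 30 seconds" rule could look like as code. The class and the criterion names below are my own invention, not anything from the regulation or from any manufacturer:

```python
import time

WINDOW_S = 30.0  # look-back window, per my reading of the rule


class AvailabilityWatchdog:
    """Keep the system active only if at least 2 DISTINCT driver-availability
    criteria (e.g. wheel torque, a button press) fired within the window."""

    def __init__(self, now=time.monotonic):
        self._now = now
        self._last_seen: dict[str, float] = {}  # criterion -> last timestamp

    def record(self, criterion: str) -> None:
        # Called whenever the car detects one of the availability signals.
        self._last_seen[criterion] = self._now()

    def driver_available(self) -> bool:
        cutoff = self._now() - WINDOW_S
        recent = [c for c, t in self._last_seen.items() if t >= cutoff]
        return len(recent) >= 2


# Fake clock so the example is deterministic.
t = [0.0]
wd = AvailabilityWatchdog(now=lambda: t[0])
wd.record("wheel_torque")
t[0] = 10.0
wd.record("button_press")
print(wd.driver_available())   # True: two distinct criteria within 30 s
t[0] = 45.0
print(wd.driver_available())   # False: both signals are now stale
```

Note that repeatedly satisfying the same criterion would never flip `driver_available()` to true in this sketch, which matches my reading that two *different* stimuli are required.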
 
But reading the document carefully, it clearly limits these lane keeping systems to speeds below 60 km/h and to divided highways, so effectively only in traffic jams. It states, however, that this will likely change in the future.

"In a first step, the original text of this Regulation limits the operational speed to 60 km/h maximum"
This is good news; older cars will gain value :cool:

It's clearly a move initiated by the excellent lobbyists of some famous car manufacturers who are promoting traffic-jam L3... The document seems to be tailored for them.
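If I'm reading the activation envelope right (divided highways only, 60 km/h maximum in this first step), the gate is trivial to express. The function name and signature below are just my illustration, not anything from the regulation:

```python
MAX_ALKS_SPEED_KMH = 60  # "operational speed to 60 km/h maximum" in the first step


def alks_may_engage(speed_kmh: float, on_divided_highway: bool) -> bool:
    # My reading of the envelope: divided highways only, traffic-jam speeds.
    return on_divided_highway and speed_kmh <= MAX_ALKS_SPEED_KMH


print(alks_may_engage(35, True))    # True: stop-and-go on a motorway
print(alks_may_engage(90, True))    # False: above the 60 km/h cap
print(alks_may_engage(35, False))   # False: not a divided highway
```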
 
From my interpretation of this discussion, it sounds like this law is intended to allow Level 3 autonomy (the driver doesn't need to babysit the car as attentively as in L2, but should still be ready to take over if the car demands it) in a certain subset of circumstances, which, I'd say, Autopilot is already completely capable of (other than perhaps an additional driver monitoring criterion).

I've found that, on a certain portion of my commute that fits these criteria perfectly (a divided road with a 40+ mph speed limit that often ends up as stop-and-go traffic at rush hour), Tesla Autopilot is totally reliable, such that I can safely read in the web browser (TMC forums is my go-to ;)) while still supervising the car, and transition smoothly back to full attentiveness when leaving the traffic jam.

I obviously don't recommend this in any unfamiliar area, or when inexperienced with Autopilot, but when you've driven the same route in the same situation tens to hundreds of times and know 100% how Autopilot handles it, it's totally doable. In that situation I'd honestly trust Autopilot more than most of my friends as a safe driver.

In that sort of scenario, I'd love for Tesla to officially approve of this in the subset of situations it handles flawlessly (such as the stop-and-go criteria described by this law), and I feel that if they wanted to, they could release an update enabling this L3 behavior tomorrow, with Autopilot exactly as it currently is.
 
From my interpretation of this discussion, it sounds like this law is intended to allow Level 3 autonomy (the driver doesn't need to babysit the car as attentively as in L2, but should still be ready to take over if the car demands it) in a certain subset of circumstances, which, I'd say, Autopilot is already completely capable of (other than perhaps an additional driver monitoring criterion).

Well, it clearly states that the driver must ACTIVELY prove he is ready to take control by responding to at least two different stimuli, at least every 30 seconds. I don't know if this is L2 or L3, but it doesn't matter. It means two nagging systems with a 30-second interval are mandatory, independently of the L rating, which is not even a thing in this paper. It applies to any auto-steer system.
 
Car manufacturers must also introduce Driver Availability Recognition Systems, which monitor the driver's capability to take back control of the vehicle, including through spotting eye blinking and closure.

So people in the early stages of cataracts who really need to wear dark polarized sunglasses to limit glare are going to be punished. What a crock.
 