Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Agreed, but judging from all the screaming complaints on Facebook about AP now monitoring eye position, you are in a small minority. Funny, but scary.
I think there are two types of people complaining. The first is people who were being irresponsible and are now actually being called to task. The second is people who were using it responsibly and now find it less convenient because of the more intrusive nagging. It ends up being another case of the responsible people being inconvenienced by the irresponsible ones.

When I drive on Autopilot/FSD I'm sitting in the driver's seat with my hands on my knees, an inch away from the steering wheel, looking out the front or the side and monitoring traffic. It would be nice if the gaze-based attention monitoring system could simply see that I'm watching the road and take that as good enough.
 
Are you claiming you have zero DEs (disengagements) in some ODD that you have 100+ hours of driving in? What ODD is that?

I use FSD Beta as a tool to help make driving more enjoyable, so my disengagements would not reflect on the ability of FSD Beta to travel from arbitrary point A to arbitrary point B without a safety incident.

As I said before, someone would need to do their best to only intervene in cases where a collision would be otherwise imminent, and systematically collect that data. Even doing things like letting it take wrong turns and reroute or annoying other drivers with unnecessary caution. Outside of the employees Tesla hires to do that exact job, I don't know of anyone that collects that data.
 
I'll take that as a "no" then?
 
Don't be facetious. We haven't even agreed on a definition of a comparable disengagement. If you're riding in Cruise, there's no "Stop" button. The user's subjective assessment of a dangerous safety situation has no impact on whether the AV disengages. They tend to only disengage when they recognize an imminent collision or have already hit something.

If your definition of a disengagement is stopping the vehicle to prevent the AV from being at-fault for an imminent collision, then I've never had a disengagement in my entire FSD Beta experience.
 
I don't understand why you are bringing Cruise or Mercedes into this. It's a simple question: Do you have 100 hours between disengagements in any ODD or don't you? You can define "disengagement" if you have to, in order to answer this simple question. I like the definition of "critical DE" on the FSD-tracker personally.

In my experience almost no one goes 100 hours between DEs, let alone 1,000 hours, and a human is obviously a lot safer than that. Hence FSD isn't as safe as a human yet, and therefore FSD cannot be released as an eyes-off system at this time.

@bradtem put it nicely on X:
"FSD sometimes can do a whole drive, maybe even 2. Recently Waymo announced they had done 700,000 with nobody behind the wheel. Tesla only has to get a few 100,000 times better and it will be a contender."
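The arithmetic behind that quip is easy to sketch. Both inputs below are rough: the drives-between-interventions figure is the quote's own generous estimate, not measured data, and 700,000 is the Waymo figure the quote cites.

```python
# Illustrative ratio implied by the quote above. Both inputs are rough
# estimates taken from the quote itself, not measured data.
fsd_drives_between_interventions = 2   # the quote's generous estimate
waymo_driverless_drives = 700_000      # figure cited in the quote

improvement_needed = waymo_driverless_drives / fsd_drives_between_interventions
print(f"~{improvement_needed:,.0f}x improvement needed")  # ~350,000x
```

That lands squarely in the "few 100,000 times" range the quote mentions.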

 
Don't be facetious. We haven't even agreed on a definition of a comparable disengagement. If you're riding in Cruise, there's no "Stop" button. The user's subjective assessment of a dangerous safety situation has no impact on whether the AV disengages. They tend to only disengage when they recognize an imminent collision or have already hit something.

If your definition of a disengagement is stopping the vehicle to prevent the AV from being at-fault for an imminent collision, then I've never had a disengagement in my entire FSD Beta experience.
There’s no point trying to have a rational discussion with him. You’re best off not feeding the troll and moving on.
 
In my experience almost no one goes 100 hours between DEs, let alone 1,000 hours, and a human is obviously a lot safer than that. Hence FSD isn't as safe as a human yet, and therefore FSD cannot be released as an eyes-off system at this time.
I personally have a DE rate as a human driver once every 100 or so hours. How many hours do you have to drive before you disengage yourself?
 
Can you clarify the question? Humans don't have disengagements; humans disengage robot drivers. The definition of a "disengagement" on the FSDb tracker is one of two things:
  • Critical: Safety Issue (Avoid accident, taking red light/stop sign, wrong side of the road, unsafe action)
  • Non-Critical: Non-Safety Issue (Wrong lane, driver courtesy, merge issue)
Based on the data and what I've seen, hardly anyone goes even 1 hour on FSDb without a "critical disengagement" in its ODD (city streets). By when do you think it will do 1,000 hours?
 
I don't understand why you are bringing Cruise or Mercedes into this.

Because you're trying to compare disengagement rates of an L2 system that can be willfully disengaged by the driver for any given reason, to the rates of other AV companies.

It's a simple question

It's really not, and being reductionist is not helpful.

You can define "disengagement" if you have to, in order to answer this simple question. I like the definition of "critical DE" on the FSD-tracker personally.

Here it is, replicated: "Critical: Safety Issue (Avoid accident, taking red light/stop sign, wrong side of the road, unsafe action)"

I think "unsafe action" is far too broad and subjective to be a useful part of the definition. So I'll consider the other pieces:

Avoid (at-fault) accident: 0 disengagements.
"Taking" red light/stop sign: 0 disengagements.
Wrong side of the road: 0 disengagements.

I think your perception of how FSD Beta drives has been very heavily biased by the media sources you consume. For your reference, here are the top reasons I disengage FSD Beta:

1. Other drivers around me are acting aggressively, so waiting a few extra seconds at a turn, or hesitating/waffling on a lane change could cause road-rage or cause them to hit me (not at fault)
2. FSD Beta has entered a wrong turning lane. I could let it continue down that lane and either try to merge back or take the turn and reroute, but it would be inconvenient for me.
3. FSD Beta is being too cautious for my preference e.g. slowing when it doesn't need to, sitting behind another vehicle that's turning.
4. FSD Beta hasn't followed a particular local legal requirement: not stopping for a pedestrian even though they're on the other side of the road, not obeying "No right on red" signs, or not stopping at "stop here on red" lines.

None of those reasons meet the "critical" definition above, imo.
 
Here it is, replicated: "Critical: Safety Issue (Avoid accident, taking red light/stop sign, wrong side of the road, unsafe action)"

I think "unsafe action" is far too broad and subjective to be a useful part of the definition. So I'll consider the other pieces:

Avoid (at-fault) accident: 0 disengagements.
"Taking" red light/stop sign: 0 disengagements.
Wrong side of the road: 0 disengagements.
I find it highly unlikely that your car has never blown a stop sign since you entered the beta, what, 18 months ago? If so, congratulations. You must have really good map data in your area, or just no stop signs ;)

I think your perception of how FSD Beta drives has been very heavily biased by the media sources you consume.
Really? Frenchie was great but doesn't post much these days; Cooke, DT, etc. Even Omar and Gali's "Waymo comparison" blew a few stop signs and went into oncoming traffic... Are any of those guys OK?

For your reference, here are the top reasons I disengage FSD Beta:

1. Other drivers around me are acting aggressively, so waiting a few extra seconds at a turn, or hesitating/waffling on a lane change could cause road-rage or cause them to hit me (not at fault)
2. FSD Beta has entered a wrong turning lane. I could let it continue down that lane and either try to merge back or take the turn and reroute, but it would be inconvenient for me.
3. FSD Beta is being too cautious for my preference e.g. slowing when it doesn't need to, sitting behind another vehicle that's turning.
4. FSD Beta hasn't followed a particular local legal requirement: not stopping for a pedestrian even though they're on the other side of the road, not obeying "No right on red" signs, or not stopping at "stop here on red" lines.

None of those reasons meet the "critical" definition above, imo.
Sounds like the most frequent types of DE most people have. I hope some of these go away with v12.

I'll stick with my projection that FSDb in a city streets ODD won't ever be autonomous on existing cars, but let's hope I'm wrong.
 
These two things can in theory both be true at the same time:

- FSD can drive hundreds of hours between critical disengagement, under specific conditions it performs well in
- FSD can be a long long way off from being a L3 system

(Note: I don't necessarily hold these positions, I'm just exploring some ideas. I'm pretty bearish on L3 systems. See below!).

The conditions I find FSD performs well in are (unsurprisingly) the easy highway-driving conditions: clear weather, no construction, divided highways or rural undivided roads. Stop-and-go traffic-jam driving is also rock solid.

The big issue with an L3 system, IMO, is the handoff. The car needs to give the driver pretty substantial forewarning to take over (something like 10-30 seconds). At highway speeds that's well farther than the car can see ahead.

It's also well further than the car can predict that the conditions I described above will hold. Construction can happen anywhere. Traffic jams clear up. Weather happens (sometimes very quickly!). Emergency vehicles appear. Other cars crash.

Defining an ODD that is both possible for the car to determine (meaning it knows 30 seconds in advance before it leaves it) and excludes the rare but difficult parts of driving seems incredibly hard!

Mercedes' L3 system is limited to very low speeds on freeways (traffic jams, predominantly). With a 30 second handoff the car only goes 500m - that's a huge benefit for a L3 system.
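The distance claims above are simple to check. The 30-second window and the ~500 m figure come from the post; the two speeds below are illustrative assumptions (roughly a low-speed traffic-jam limit versus a typical highway cruise):

```python
def handoff_distance_m(speed_kmh: float, window_s: float) -> float:
    """Metres travelled at a constant speed during a handoff window."""
    return speed_kmh / 3.6 * window_s  # km/h -> m/s, then times seconds

# 60 km/h (illustrative traffic-jam limit) vs 120 km/h (highway cruise)
for speed_kmh in (60, 120):
    print(f"{speed_kmh} km/h for 30 s -> {handoff_distance_m(speed_kmh, 30):.0f} m")
# 60 km/h covers ~500 m in 30 s; 120 km/h covers ~1000 m.
```

So the same 30-second window that keeps a traffic-jam L3 system inside ~500 m stretches to a kilometre at highway speeds, which is the crux of the ODD-prediction problem described above.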
 
Can you clarify the question? Humans don't have disengagements; humans disengage robot drivers. The definition of a "disengagement" on the FSDb tracker is one of two things:
  • Critical: Safety Issue (Avoid accident, taking red light/stop sign, wrong side of the road, unsafe action)
  • Non-Critical: Non-Safety Issue (Wrong lane, driver courtesy, merge issue)
Based on the data and what I've seen, hardly anyone goes even 1 hour on FSDb without a "critical disengagement" in its ODD (city streets). By when do you think it will do 1,000 hours?
Of course we do. Whenever you catch yourself, you disengaged. Whenever your attention falters for a moment (lost in thought) and you jerk yourself back into focus when a red light comes up or someone honks their horn, you disengaged. Whenever you make a tight turn and hit a curb, you disengaged. Whenever you miss a turn while trying to find someplace, you disengaged.

Happens all the time.
 
So blindfold next drive then? L3 means take over when notified, within circa 10 seconds. The car is legally driving during the handover in Level 3, FYI.
Can a person even reorient themselves in 10 seconds after being blindfolded? I doubt that. There's a reason why L3 cars don't allow you to sleep.

As mentioned also, the current FSD Beta mode does not even have a L3 handover notification in the first place so trying to use it as an L3 car is an exercise in danger (I suspect you know that and are being facetious). Currently the car expects immediate takeover and is designed as such. However, if Tesla did try and put in a 10 second heads-up, it's not clear that's completely impossible with current hardware. The difference is like the threshold for AEB vs FCW (with the early setting).
 
I wholeheartedly agree that the handoff is the hard part, and L4 is even harder than L3 since there's no handover at all. You just need to handle things, and you can't just pull over on the highway.
 
Then you have no reference to compare against. An ADAS can only be compared to other ADAS, not to humans.
I agree. An ADAS can never be claimed to be “safer than a human”.

An ADS on the other hand can, and should be, compared to human level safety and performance.

I was discussing FSDb under the premise that it was going to be autonomous (and hence an ADS) “sooner than we think”…

I think that idea is silly. It's my firm opinion that FSDb in a city-streets ODD is at least five years away from any form of autonomy in an ODD with VRUs (vulnerable road users).
 
Interesting. Can you prove that had a driver not disengaged, there would have been an accident? No, you cannot. The act of disengaging altered the scenario. The opposite is also true: you cannot prove that the system prevented an accident that a human driver would not have been able to avoid, as the human was not in command at the time.

Can the same logic be applied to AVs? If an AV causes an accident, can we prove that a human in command would not have caused the same accident?
 
Apparently the people who wrote UNECE R157 think so.
You missed my point: being blindfolded is not the same as the activities allowed in L3 cars (reading a book or watching a video). It's more like sleeping, which is NOT allowed. I remember reading studies showing it takes around 7 seconds to reorient from those allowed activities (which don't involve readjusting to the brightness; you still have peripheral vision, and you don't have to take off a blindfold). Adding the time to do that in a blindfold situation could easily add 3 seconds (likely far more).
 