
Poll: FSD Beta V9.1, how close is it to release (i.e. out of beta)?

What do you estimate the unsupervised accident rate (> 12mph) of V9.1 of FSD beta is?


Total voters: 30 (poll closed)
Time for a V9.2 poll? Just kidding; clearly we don't have enough precision to quantify minor improvements. Responses ranged over six orders of magnitude! I guess I'm glad I picked the median response, 100-300 miles, which means I'm not totally crazy.
I would like to point out to the doubters that this is fundamentally what Tesla plans to do to determine safety. Obviously they have the benefit of a complete record of disengagements and the tools to simulate the counterfactuals.

"We've also got to make it work and then demonstrate that if the reliability is significantly in excess of the average human driver or to be allowed... um... you know for before people to be able to use it without... uh... paying attention to the road... um... but i think we have a massive fleet so it will be I think... uh... straightforward to make the argument on statistical grounds just based on the number of interventions, you know, or especially in events that would result in a crash. At scale we think we'll have billions of miles of travel to be able to show that it is, you know, the safety of the car with the autopilot on is a 100 percent or 200 percent or more safer than the average human driver"
 
Actually, the poll question is asking "how often will FSD cause a crash if the driver ignores warnings and just lets it do its thing?"

In other words, asking about misuse of FSD. So perhaps the right answer is "none of the above."

Analogy: "how often will a car (any car) crash if the driver takes his/her hands off the wheel?" Nonsensical, right?
 
No, the question is "What do you estimate the unsupervised accident rate (> 12mph) of V9.1 of FSD beta is?"
Tesla also plans on estimating the unsupervised accident rate by studying interventions. It's a plan that makes perfect sense to me. Is there some alternative way of figuring out when it's ready for "people to be able to use it without paying attention to the road"?
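One toy way to frame that readiness check is to compare an intervention-based estimate against an assumed human baseline; all of the numbers below are invented for illustration.

```python
# Toy readiness check: compare the estimated unsupervised crash rate against
# a target derived from an assumed human baseline. All numbers are invented.
human_crashes_per_mile = 1 / 500_000    # assumed average-driver crash rate
safety_margin = 2                       # require at least 2x better than human
target_rate = human_crashes_per_mile / safety_margin

estimated_rate = 1 / 250                # e.g. one crash per 250 miles, from intervention analysis
print(estimated_rate <= target_rate)    # -> False: far from ready for unsupervised use
```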
 
So again, you're asking about a prediction that some people will ignore the warnings and just let FSD do its thing without being ready to take over.

I reiterate that it's an inane question, because you're specifically asking how many people cannot read or follow cautions and warnings (it wouldn't be unsupervised if the user intervened). How about "how many disengagements of FSD per x miles by attentive drivers who can read and follow instructions?"
 
I think you're completely misunderstanding the question; it's the fundamental question for a self-driving car. Read the quote from Elon: I'm trying to make the same estimate that Tesla is trying to make. The question is about how FSD would perform with no driver at all. I guess I should have asked, "What do you estimate the accident rate (> 12mph) of FSD beta V9.1 would be without a driver?"

The raw number of disengagements quickly becomes useless for estimating safety. Note that Elon says they will focus on disengagements in situations that would have resulted in a crash.
More about the problems of using disengagements as a metric by itself: The Disengagement Myth
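To illustrate that point with invented numbers: two systems can post identical headline disengagement counts and still differ enormously once each disengagement's counterfactual is checked.

```python
# Invented numbers: same raw disengagement rate, very different safety once
# counterfactual simulation classifies which disengagements mattered.
systems = {
    "cautious_system": {"disengagements_per_1k_mi": 200, "crash_critical_fraction": 0.001},
    "risky_system":    {"disengagements_per_1k_mi": 200, "crash_critical_fraction": 0.05},
}
for name, s in systems.items():
    crash_critical = s["disengagements_per_1k_mi"] * s["crash_critical_fraction"]
    print(f"{name}: {crash_critical:.1f} crash-critical disengagements per 1,000 miles")
# cautious_system: 0.2 per 1,000 miles; risky_system: 10.0 per 1,000 miles.
```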
 
No, it's a fundamental element of AUTONOMOUS self-driving.

I think you forget the important distinction between Tesla FSD, which will be L2 for the foreseeable future, and unsupervised self-driving.
 
That’s an opinion. Elon says it will be this year. Tesla is using FSD Beta disengagement analysis to determine when it’s ready for autonomous operation.

Many people believe that Tesla is the leader in the field, so I was wondering what the consensus here is on how close they are to robotaxi capability.
 
An opinion that Tesla is going to be L2 for the foreseeable future? I'll take a friendly wager it won't be L3 or higher this year, but will release as L2.
 
How can there be statistics for something that doesn’t exist?
FSD Beta does exist and you can do counterfactual simulations of interventions. It's what all AV companies do, including Tesla.
Read Elon's quote again, he explains it:
"We've also got to make it work and then demonstrate that if the reliability is significantly in excess of the average human driver or to be allowed... um... you know for before people to be able to use it without... uh... paying attention to the road... um... but i think we have a massive fleet so it will be I think... uh... straightforward to make the argument on statistical grounds just based on the number of interventions, you know, or especially in events that would result in a crash. At scale we think we'll have billions of miles of travel to be able to show that it is, you know, the safety of the car with the autopilot on is a 100 percent or 200 percent or more safer than the average human driver"
 
FSD Beta is not Level 3 and it is not a robotaxi; it's just the same Level 2 we have now with more features.
 