
All US Cars capable of FSD will be enabled for one month trial this week

You guys are statistical outliers then, according to sites like TeslaFi that track this stuff.
Teslafi shows about 25% of cars on a 12.3 FSD build, a week after 100% of people are supposed to have it. What am I missing?

I'm willing to bet that at least 90% of Teslas built since 2020 will receive a month of FSD within an "Elon week".
Oh, in that case, sure, given an Elon "Two weeks" is really about 3 years on average.
 
Teslafi shows about 25% of cars on a 12.3 FSD build, a week after 100% of people are supposed to have it. What am I missing?
Cars (such as mine) that AREN'T on FSDb but have received the OMT* are on 2024.3.10 (which is version 11.1)
*One Month Trial

I am usually late in the update cycle (I am not on active Wi-Fi, so I have to download updates via my phone's hotspot), so the fact that I received it yesterday is an encouraging sign overall
 
I got the update on Friday. I know several folks who also got it and had the free month within 24 hours.
I am still on my 3 free months from the referral when I bought the car and I can confirm that I have not received an additional month for free.
I am guessing if you already have it turned on you won't get anything extra.
 
Teslafi shows about 25% of cars on a 12.3 FSD build, a week after 100% of people are supposed to have it. What am I missing?


Oh, in that case, sure, given an Elon "Two weeks" is really about 3 years on average.
Well, if you know 10 people and none of them have it, you are in a statistical minority.

Also, it is not a week after 100% of people are supposed to have it; today is literally the day 100% of the people are supposed to have it...

I don't disagree that Elon is usually optimistic with his timelines, so sue him for being optimistic.
 
Pardon my ignorance, but what are "nines" in this (and other) reference(s) in this thread?
It's how you do safety analysis: it expresses reliability as the probability that a failure doesn't happen.

If you have a system with one 9, it means it's 90% reliable. So it fails 1 in 10.
Two 9's is 99%
Three is 99.9%. And so on.

Humans are about 8 9's per mile for fatalities when driving a car (1 fatality in 100 million miles, or 99.999999%). They're about 4 nines for fender benders and 6 nines for accidents with injuries.

The point is that each nine is an order of magnitude better, and thus roughly a similar amount of work to add: going from 99% to 99.9% is as hard as going from 90% to 99% was.

Elon likes to talk about "the march of nines" and how FSD needs to be better than humans to really be autonomous. So the baseline is 5 nines per mile for fender benders (one per 100,000 miles).

People reporting here say we're somewhere around 2 nines right now after 7 years of FSD development, and we need 5 for fender benders. That gives you a hint of how much more work is to be done.
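
As a rough back-of-the-envelope sketch (illustrative only, using the approximate figures quoted above rather than authoritative data), converting between miles-per-event and nines is just a log10:

```python
import math

def nines_per_mile(miles_per_event: float) -> float:
    """One event per N miles corresponds to log10(N) 'nines' of reliability."""
    return math.log10(miles_per_event)

def miles_per_event(nines: float) -> float:
    """Inverse: a given number of nines implies one event per 10**nines miles."""
    return 10 ** nines

print(nines_per_mile(100_000_000))  # ~8 nines: human fatality rate, ~1 per 100 million miles
print(miles_per_event(5))           # 100,000 miles: the 5-nines fender-bender baseline
print(miles_per_event(2))           # 100 miles: roughly where "2 nines" puts FSD today

# Each added nine is a 10x improvement, so closing the gap from 2 to 5 nines
# means making failures 1,000x rarer.
print(miles_per_event(5) / miles_per_event(2))  # 1000.0
```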

And because Elon likes to play an expert at everything, let's check out this post from THREE YEARS AGO, which uses very tortured statistics to try and argue that current HW autopilot is already good enough (it is "good enough" only because it's fully supervised by humans, who are responsible for failures, which is not autonomy):


It's also really important in any analysis that you choose the right metric. For instance:
[Attached image: comparison of fatality rates by transport mode, per journey, per hour, and per mile]


Look at the space shuttle: horrible per journey, not good per hour, just fine per mile. And don't get me started on how dangerous walking is.
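
As a quick illustration of how much the choice of denominator matters (purely hypothetical numbers, not taken from the chart above): the same incident count can look terrible per journey and fine per mile when journeys are long and fast.

```python
# Hypothetical mode of transport: few journeys, each long and fast (illustrative only).
incidents = 2
journeys = 1_000
hours = 10_000        # 10 hours per journey
miles = 5_000_000     # 5,000 miles per journey

print(incidents / journeys * 1_000_000)  # 2000.0 incidents per million journeys
print(incidents / hours * 1_000_000)     # 200.0 incidents per million hours
print(incidents / miles * 1_000_000)     # 0.4 incidents per million miles
```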
 
It appears that USS cars are not being pushed to 2024.3.6, since that has the new vision-based autopark.

My 2021 MY has USS, and I received software update 2024.3.10 last night, which includes a 1-month free trial of FSD. I've done a couple of drives with it, and it's much improved from over 2 years ago, when I subscribed for 3 months and had to wait for my Safety Score to qualify. Still not sure whether I will subscribe after the free trial, though.
 
It all sounds like a lot of obfuscation. Not specifically what you are saying: the concept in general.
Well, it's been used in aviation for about 70 years now and has made flying through the air in a metal tube at 500 MPH the safest way to travel, which is bananas, so it must be quite effective.

The reason it's effective is that we're dealing with very rare statistical events. Events that you want to be so rare that they never happen. Events that you can't let happen just to see how likely they are, because they kill people. Events driven only when the Swiss cheese safety model fails. Events so rare that when they happen, the statistics completely change. (For instance, the Concorde went from being the statistically safest airliner to the least safe in one incident.) Events that shouldn't ever happen, so you can't test your system for how often they happen.

So all you can do is deal in 9's, and then do an analysis that applies serious engineering and statistical processes to show you are there. What you don't do in a serious safety environment is release your product to the public and see how many people die, then refine it some, and see if fewer people die.

Of course, none of this really applies when you're seeing your system fail every 75 miles. That's just a crap system if it has anything to do with safety.

Note: When you're dealing with 6+ nines, an FAA expert once told me, "the first time it happens, statistics is a bitch. The second time, you have a problem." Notice how with the 737 MAX, the first crash led to an investigation, and the second one led to an immediate grounding?

What's your intuition on how we should define and evaluate safety for systems that should go millions of miles without a failure before we release them, if using 9's is "obfuscation"? What kind of data would you like to see from Tesla before they allow FSD to go L4?
 
Well, it's been used in aviation for about 70 years now and has made flying through the air in a metal tube at 500 MPH the safest way to travel, which is bananas, so it must be quite effective.
...
I used to think so, but according to your wiki link it's not very good on a per-journey metric... You would expect it to perform pretty well based on miles and hours, since it goes substantially faster than all other forms of transportation, but really it's not even great on hours per event, even though you sit on a train 10x as long as you do on a plane.

It also seems that you agree with Tesla's statistical approach, even if you think they are decades away from achieving human-level safety.

Is anyone out there better? Waymo?
 
Tried it on two small trips… it worked flawlessly but drove like a 90-year-old woman! No thanks, Tesla! Do our Teslas revert to EAP after the trial?
Can you not just turn off FSD anymore?


(I'm not on V12 yet, but in V11, I have two profiles - one with FSD enabled so I can use it where it works well and one with FSD not enabled so I can use it when I just want to drive the car with EAP capability.)
 
The problem is that we can’t rely only on the metric of fatalities or injuries or even simply accidents as a measure of how good the system is. There is a large amount of subjectivity to it.

It can make errors that do not result in an accident but that frustrate or confuse other road users. It can make errors that a human wouldn't make, which makes the user of the system less confident in its abilities.

Similarly, a human can be a horrible driver yet not have any tickets or accidents, and vice versa.

Everybody will have a different threshold as to what they consider a "failure" of the system requiring disengagement or taking over control. Arguably, the people testing it right now for free would be the most critical and least biased, because they don't have a financial stake in the success of the system. Any non-human-like action could be subjectively deemed a failure, whereas someone who is very bullish on Tesla/FSD will only consider it a failure if it would have resulted in an accident without taking over.
 
Real Self Driving doesn't have a driving style that depends on the occupants. It has a safety goal, and once it hits that, all the systems drive that way.

Although I have often said that if we really did train self-driving off real-world drivers, different car brands might behave differently, and you'd buy a BMW if you want to be an aggressive weaver.
Given the apparent preponderance of former BMW drivers driving Teslas, that may be an explanation for the relatively aggressive lane changing behavior of FSDS in "average" mode.
 