
10.8 FSD

Per Tesla's methodology section (autopilot safety data site):
"
In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated."

So, it's pretty clear they're counting crashes that can occur almost anywhere based on that minimum speed, even in places people tend not to use autopilot.

You've never seen a crash during a traffic jam on a highway?

Though of course folks do use AP in places the system's not intended to be used.... if they are including that, and the rate is still much lower than human alone, it speaks pretty well of the system :)



Per SAE definition: "these automated driving features will not require you to take over driving", referring to L4 and L5. So, operation of the vehicle by the automation system has to be sustained from start to end, and it cannot do that without responding to every typical driving situation.


That's not quite what it says...

L4 has a fixed operational domain (that's why it's L4 instead of L5).

The car is not required to drive from 'start to end'; it's just required to handle the DDT within its domain.

It also needs to be able to "fail safely" without a human if it finds itself outside that domain....rather than keep trying to drive in a situation it can't handle.

I do notice you've been leaving L3 out.... where a human is still required, but the car DOES do the entire DDT within its operational domain....which is odd since the original discussion was about the current system being L2 limited.
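For reference, here's a rough side-by-side of how SAE J3016 splits responsibility at each level (a simplified paraphrase, not the official wording), sketched out below:

```python
# Simplified paraphrase of SAE J3016 levels 2-5 (not the official wording).
# DDT = dynamic driving task; ODD = operational design domain.
SAE_LEVELS = {
    "L2": ("System does sustained steering + speed; the DRIVER does the rest of the "
           "DDT (monitoring, responding) and is the fallback. ODD is limited."),
    "L3": ("System does the ENTIRE DDT within its ODD; a fallback-ready HUMAN must "
           "take over when the system requests it. ODD is limited."),
    "L4": ("System does the entire DDT within its ODD AND handles the fallback itself, "
           "reaching a safe state without a human. ODD is limited."),
    "L5": ("Same as L4, but with no ODD limits."),
}

for level, summary in SAE_LEVELS.items():
    print(f"{level}: {summary}")
```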



If Elon, the man himself, thinks they'll solve Level 4 by the end of this year, what more clarity is needed?


Clarity about what?

The current beta city streets code is not an L4 system and is explicitly not intended to be.

Tesla put that in writing in documents to the CA DMV.


That Elon hopes to have a more advanced system capable of L4 sometime in the future doesn't change that.... (and Elon has voiced such hopes pretty much annually for years now- missing every target date along the way)
 
Though of course folks do use AP in places the system's not intended to be used.... if they are including that, and the rate is still much lower than human alone, it speaks pretty well of the system :)
They are also using crash data from the city when AP is used more on the freeway. Do you see the possible issue?
That's not quite what it says...
I'm looking right at it:

[Attached image: Screenshot 2022-01-10 020357.jpg]

The car is not required to drive from 'start to end'; it's just required to handle the DDT within its domain.
Have I mentioned outside its domain at any time?
I do notice you've been leaving L3 out.... where a human is still required, but the car DOES do the entire DDT within its operational domain....which is odd since the original discussion was about the current system being L2 limited.
Elon's ambitions are beyond L3 by his own words.
Tesla put that in writing in documents to the CA DMV.
That CA DMV document is nowhere on their sales page or website.
That Elon hopes to have a more advanced system capable of L4 sometime in the future doesn't change that.... (and Elon has voiced such hopes pretty much annually for years now- missing every target date along the way)
Yes, Tesla's intent for FSD is quite clear.
 
They are also using crash data from the city when AP is used more on the freeway. Do you see the possible issue?

With the data being 100% apples to apples? Sure. I've commented on it myself in the past.

With it suggesting, very very strongly, that human+AP is safer than just human? Not really.

And a few moments of thought about the advantage of 8 cameras that can see 360 degrees all the time helping the human suggests why.
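To put the road-type concern in concrete terms, here's a quick back-of-the-envelope sketch; every number in it is hypothetical (made up for illustration, not Tesla's actual data), but it shows how the mileage mix alone can move the headline "miles per crash" figure:

```python
# Hypothetical illustration of the road-type mix issue -- NOT real Tesla data.
# Assume (made-up numbers) freeway driving sees fewer crashes per mile than
# city driving, and that Autopilot miles skew heavily toward freeways.
CRASHES_PER_MILLION_MILES = {"freeway": 0.3, "city": 1.0}  # hypothetical rates

def blended_rate(freeway_share: float) -> float:
    """Crashes per million miles for a mileage mix with the given freeway share."""
    city_share = 1.0 - freeway_share
    return (freeway_share * CRASHES_PER_MILLION_MILES["freeway"]
            + city_share * CRASHES_PER_MILLION_MILES["city"])

# Suppose AP miles are 90% freeway while manual miles are only 40% freeway.
ap_rate = blended_rate(0.90)      # 0.37 crashes per million miles
manual_rate = blended_rate(0.40)  # 0.72 crashes per million miles

print(f"AP:     {1 / ap_rate:.1f} million miles per crash")
print(f"Manual: {1 / manual_rate:.1f} million miles per crash")
# Even with identical safety on any given road type, AP looks ~2x "safer" here
# purely because of where it gets used.
```

Whether the real gap survives a road-type breakdown is exactly the apples-to-apples question.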


That CA DMV document is nowhere on their sales page or website.

Because it's a technical discussion- I'm not sure why it would be.

Have you actually read it though?

Because it covers the same ground being covered here.

CA DMV incorrectly thought the beta was an L3+ system.

Tesla explained the technical reasons it's explicitly designed and intended to be an L2 system, and why it will never be more than L2.

Tesla also explains they plan to develop different software in the future that WILL exceed L2. But city streets (what is commonly called FSDBeta) is not it.

This isn't really up for debate, it's literally what Tesla told a governing body.

Elon's ambitions are beyond L3 by his own words.
...
Yes, Tesla's intent for FSD is quite clear.


Again you're confusing two different things.

Elon's "ambitions" are not the design intent of the current, even beta, software.

The CURRENT public release autopilot for example is only intended for L2 use on divided limited access highways. Says so right in the owner's manual.

The CURRENT public LIMITED BETA FSD release is only intended for L2 use on city streets. Says so right in Tesla's own statements to the governing body of autonomy in CA.


"What Elon hopes to do someday" is not relevant to the current version of anything, either legally, or in under SAE definitions.




@Knightshade and @john5520 I'd like to thank you for causing me to have a flashback to the mid-90s and USENET.

My avatar is there for a reason :)
 
Did my first test drive with 10.8.1.
Unlike what was reported in the TeslaFi release notes, mine also says only 3 warnings.

I ran my test route and had numerous interventions, including two cases on an unmarked road where at first it seemed to hold to the right side OK, but then a car was coming the other way and FSD beta tried to move left into it head on. I took control. Tested a second time in the same scenario and it failed the same way. I don't recall this happening previously.

Stop sign: It stops smoothly. But then, even when visibility is good, it took a long time to finally go, whether straight, right turns, or left turns. I let it take its time except when a car was right behind me. Creep continues to move me dangerously into traffic it can't see past a parked truck or shrubbery at the left of the intersection.

BTW- I am testing both with NOA FSD as well as AP with no NOA active. Performance is the same except there are no automatic turns with no NOA active.

Stopping at stop signs, at red lights, and for slow cars in front is much smoother and more human-like than before. A really great improvement to the way I drive.

It slammed on the brakes when a lady made a left turn right in front of me at a stop sign where I had the right of way; she ran the stop sign at speed. Tesla FSD beta avoided an accident. I was ready too, but FSD beta beat me to the brake. Guess she was in a hurry.

Tested parking lot auto park again and it worked well, but it took 2 tries to line up. I would have done it in one maneuver. This test was done next to a curb on one side rather than a white line, and on one backup it came too close to the curb.

The steering on left and right turns still seems jerky but not as bad as before.

I came up on a stopped school bus with red lights flashing and its stop sign extended, and Tesla FSD failed and tried to go around. I aborted and stopped until the bus drove off.

The final test on my route was a roundabout used for a full U turn. Amazingly, it executed it 100% for the first time since I've had FSD beta. It was a little scary as it recognized another car entering and waited for it. At the entrance it slowed for the yield sign, but with no cars coming it proceeded to enter at 10 mph. The Tesla team has obviously been hard at work on roundabouts. This release shows the improvement.

I had many interventions, but at no time did I need to send in a report. I assume FSD beta reports automatically every time I take control.
 
I have read here that it reports when you grab the wheel, or touch the brake or gas, but if you didn't have to and you still want to report something weird, touch the camera icon in the title bar. No report is made when using the gear shift to cancel FSD.

But I have gotten NO official communications direct from Tesla on the β.
 
Can we confirm this please? I don't want to double report.

Oh baby, quadruple report! Report to your heart's desire! It doesn't do anything anyways. I've reported numerous things since v10.1 and those things are still issues in 10.8.

But yes, when you disengage, that is automatically recorded. Based on the email for FSD beta, the manual report button is for when something happens and you want to report it as well as tell Tesla what happened and why it was an issue. The original email asked you to provide the date, time, location, etc. to find those clips to review.
 
when you disengage, that is automatically recorded
The level of detail of "recorded" probably varies depending on whether you press the camera report button vs various disengagements. Green shared trip data from 3 years ago that included distances traveled on Autopilot/Autosteer and number of steering vs braking vs stalk disengagements. So technically, the data is recorded and sent back to Tesla but likely not videos unless a snapshot trigger was also activated.

These numbers are still useful in aggregate for Tesla to calculate safety and progress. The disengagements likely include locations, so also in aggregate, Tesla could probably figure out problematic static road confusion vs dynamic driving issues, and filter out accidental disengagements.
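As a toy illustration of that kind of aggregation (the event records and fields below are entirely made up, not Tesla's actual telemetry schema), simply bucketing disengagements by rounded location is enough to surface repeat trouble spots:

```python
from collections import Counter

# Made-up disengagement events: (latitude, longitude, type). Not a real schema.
events = [
    (37.4920, -121.9442, "steering"),
    (37.4921, -121.9441, "braking"),
    (37.4919, -121.9443, "steering"),
    (40.7130,  -74.0060, "stalk"),
]

# Round coordinates so nearby events fall into the same bucket (~100 m at 3 decimals).
hotspots = Counter((round(lat, 3), round(lon, 3)) for lat, lon, _ in events)

for spot, count in hotspots.most_common():
    if count > 1:
        print(f"{count} disengagements near {spot} -- likely a problematic location")
```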

When I'm testing, I sometimes go through a difficult intersection multiple times covering each left turn then pressing the snapshot button about 10 seconds after. This should give a before/during/after view of the intersection for Tesla to recreate offline and train the neural networks to better predict the problematic aspects from all directions.
 
Had my first drive on 10.8.1 today. 57 miles, about 50% each on highway and city.
1 intervention. It was too late getting into a double left turn lane. I was running late, so I took over. My record on my 57 mile drive is 3 interventions; 10.8.1 just blew it away. 😉😉
If I want to report very low interventions and zero interventions, I can get on the interstate and drive a hundred miles and it works perfectly. But what we are testing is FSD beta on city streets, and here my route has dozens of traffic challenges to perform, so seeing many failures is pretty common. My route varies between 10 and 15 miles of city streets. There is a challenge situation about every 10 feet. Even one 3 mile stretch with no turns and a dozen traffic lights on my route can be done with zero interventions, but not the real test route in neighborhoods and downtown.
Then maybe the Plaids are just better at FSD than my 2020 Model S. :)
 
If I want to report very low interventions and zero interventions, I can get on the interstate and drive a hundred miles and it works perfectly. But what we are testing is FSD beta on city streets, and here my route has dozens of traffic challenges to perform, so seeing many failures is pretty common. My route varies between 10 and 15 miles of city streets. There is a challenge situation about every 10 feet. Even one 3 mile stretch with no turns and a dozen traffic lights on my route can be done with zero interventions, but not the real test route in neighborhoods and downtown.
Then maybe the Plaids are just better at FSD than my 2020 Model S. :)

I have a 2022 Plaid as of Dec 2021 and had a 2020 MS LR+ prior to that, both with FSD beta. It's all the same -- real city driving still messes up a ton. Interstate is easy -- plus interstate is still old AP code; it's not even part of the FSD beta. Interstate AP was pretty good before FSD beta anyways. So I agree with you, FSD fails often when there is real city driving. I see so many YouTube vids where people are testing each new version of FSD and their videos show stretches of several miles of going straight with some traffic lights here and there and like 2-3 turns, then they boast about only having 2 interventions on a 15 mile drive lol
 