Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

The frustrating thing about these issues is that they seem to have nothing to do with perception. For whatever reason, Tesla has chosen to be VERY conservative about reducing speed. Yet it also seems inconsistent - I'm not convinced it would always slam on the brakes in this situation. What's going on internally that makes the reaction so different?

Originally I started typing up my thoughts on how the AI makes decisions, but I realized I'm way too uneducated in ML/NN stuff to attempt an explanation that would stand up to scrutiny. But in a nutshell, I think decisions are made via probabilities, and these can vary quite a bit across what we humans would consider a "similar situation". The computer just sees pixels, and there's no guarantee it will think a scene is similar to something else it handled correctly.
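A toy way to see how "probabilities" can flip a decision between near-identical scenes - all numbers here are made up, and this is a generic softmax sketch, not anything Tesla actually runs:

```python
import math

def softmax(logits):
    """Convert raw network scores into class probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Two "similar" scenes can produce slightly different raw scores
# (logits for hypothetical actions [brake_hard, coast, maintain]).
scene_a = [2.0, 1.0, 0.5]
scene_b = [1.2, 1.6, 0.5]   # near-identical scene, a few pixels shifted

probs_a = softmax(scene_a)
probs_b = softmax(scene_b)

# Picking the argmax flips the decision even though the scenes look alike.
decision_a = max(range(3), key=lambda i: probs_a[i])  # 0 -> brake_hard
decision_b = max(range(3), key=lambda i: probs_b[i])  # 1 -> coast
```

Small shifts in the raw scores near a decision boundary are enough to change the chosen action, which is one plausible reading of the inconsistency described above.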
 
I guess I must be thinking about it wrong. I can understand the perception being done by NN and being quite variable/statistical. But is the range to the identified pedestrians somehow variable? Was it placing them in the wrong location? I don't think so, though I wasn't running the GoPro, so I'll never know - I think that was highly improbable in this case.

If the car knew the approximate range to the pedestrians (who were just entering the crosswalk at a stop sign), and the range to the crosswalk, it should have known that partial regen alone would easily have stopped it by the crosswalk (which it would have had to be planning to do anyway, since it did identify the stop sign - I think!). Instead, it slammed on the brakes well short of the crosswalk.
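The arithmetic behind "partial regen would easily have stopped it" is just the constant-deceleration formula a = v² / 2d. A quick sketch - the speed, distance, and regen-only deceleration figure are all hypothetical:

```python
MPH_TO_MPS = 0.44704

def required_decel(speed_mph, distance_m):
    """Constant deceleration (m/s^2) needed to stop within distance_m:
    a = v^2 / (2 * d)."""
    v = speed_mph * MPH_TO_MPS
    return v * v / (2.0 * distance_m)

# Hypothetical numbers: 30 mph, crosswalk 60 m ahead.
a_needed = required_decel(30, 60)     # ~1.5 m/s^2

# Rough ballpark for regen-only braking capability (assumption, not a spec).
REGEN_DECEL_MPS2 = 2.0
can_stop_on_regen = a_needed <= REGEN_DECEL_MPS2
```

With numbers anywhere in this ballpark, a gentle regen-only stop is comfortably sufficient - which is why the hard braking reads as a planning choice rather than a physics necessity.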

That seems more like a path planning issue, which I didn't think was being done by ML/NN right now.

Perhaps the Safety Score was a crazy plan hatched by some people at Tesla to produce data for their FSD engineers, to show them that it is indeed possible to drive a vehicle with extremely low levels of jerk. My guess is this is a topic of constant frustration at Tesla. They need to do more work on their vaunted cost functions!

I can't believe they haven't just logged a bunch of data, looked at the jerk and acceleration plots, and concluded it's totally FUBAR. It's not a difficult problem to identify; they've undoubtedly done this and are fully aware of the issue. What's less clear is why they haven't already placed strict limits on jerk except for true "surprises."
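AI Day described the planner scoring candidate trajectories with hand-written cost functions. A toy version with a jerk penalty - weights and acceleration profiles invented purely for illustration - shows why a smooth stop should win over a late hard stop:

```python
def trajectory_cost(accels, dt, w_jerk=10.0, w_accel=1.0):
    """Toy planner cost: penalize acceleration magnitude and jerk (da/dt).

    accels: sampled longitudinal acceleration (m/s^2) along one candidate
    trajectory. The weights are made up; a real planner would also score
    collision risk, comfort limits, traversal time, etc.
    """
    jerk_cost = sum(((a2 - a1) / dt) ** 2 for a1, a2 in zip(accels, accels[1:]))
    accel_cost = sum(a * a for a in accels)
    return w_jerk * jerk_cost + w_accel * accel_cost

smooth = [0.0, -0.5, -1.0, -1.0, -0.5, 0.0]   # gradual regen stop
slam   = [0.0,  0.0,  0.0, -4.0, -4.0,  0.0]  # late hard braking
# With any positive jerk weight, the smooth profile scores lower (better).
```

If the real cost functions are shaped anything like this, the complaint above amounts to saying the jerk weight is effectively too low, or is being overridden by other terms.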
 
Yeah, I think force-smoothing is probably easy to do programmatically, but it seems like they want the AI to develop that smoothness on its own. That's my best guess. From a traditional programming perspective, it feels like a really, really easy problem to solve.
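The "easy to do programmatically" part could look like a simple jerk clamp on the commanded acceleration. This is a sketch with an arbitrary comfort limit (not Tesla's), and a real system would obviously need an exemption for genuine emergencies:

```python
def jerk_limit(accel_cmds, dt, max_jerk=2.0):
    """Clamp commanded acceleration so |da/dt| never exceeds max_jerk (m/s^3).

    A trivial post-processing smoother of the kind the post imagines.
    """
    out = [accel_cmds[0]]
    for cmd in accel_cmds[1:]:
        # Limit the per-step change to max_jerk * dt in either direction.
        step = max(-max_jerk * dt, min(max_jerk * dt, cmd - out[-1]))
        out.append(out[-1] + step)
    return out

# A raw planner output that slams the brakes...
raw = [0.0, 0.0, -6.0, -6.0, 0.0]
smoothed = jerk_limit(raw, dt=0.5)   # ramps toward -6 instead of jumping
```

The design question isn't whether such a filter is hard to write - it isn't - but that a blanket clamp also slows the car's reaction to true surprises, which may be exactly the trade-off keeping it out of the stack.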

I asked the question and didn't get a response, but I was hoping that I could set a destination, not activate FSD, and just drive the way I want to drive (smoothly), with the car analyzing the delta between what it wants to do and what I'm actually doing. That seems like an awful lot of data to generate, though. I would love to know how their shadow mode works. Shadowing seems like the best way to fix the "driving in the middle of an unmarked road" issue.
 

Seems like you jokers and low-score people just need to be richer.
"In a comment to Electrek, Gerber claimed that he is “one of the best drivers in the world” and therefore, the safety score is not representative of his driving."


Anyone who claims to be "one of the best drivers in the world" and isn't a professional race driver or in some similarly demanding profession gets a hard eyeroll from me.
 
but it seems like they want the AI to develop that smoothness on its own.
I might be misunderstanding what you are saying. My understanding from AI Day is that they have various cost functions, incorporating various parameters, for driving a path "well." But I thought all such planning was still being done with traditional programming, not AI.
 
I was hoping that I could set a destination, not activate FSD, and just drive the way I want to drive (smoothly), and hope that it's analyzing the delta between what it wants to do vs what I'm doing.

That's called reinforcement learning, and from AI Day it doesn't seem like they're doing that - maybe it's some future plan, I dunno. I really wish they would, because IMO the only correct driving policy is the human policy. And humans don't drive by putting bounding boxes around everything and running Monte Carlo searches for driving paths.
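Strictly speaking, learning from the delta between the planner and a human driver is usually called imitation learning rather than reinforcement learning, but the "delta" idea sketches the same way either way. Actions and numbers below are invented for illustration:

```python
def imitation_loss(planned, human):
    """Mean squared 'delta' between what the planner would do and what the
    human actually did -- the signal the shadow-mode idea hopes to collect.

    Each action is a hypothetical (steer, accel) pair; purely illustrative.
    """
    n = len(planned)
    return sum((ps - hs) ** 2 + (pa - ha) ** 2
               for (ps, pa), (hs, ha) in zip(planned, human)) / n

# Planner wants to brake harder than the human did on the same road.
planner = [(0.10, -1.0), (0.05, -0.5)]
driver  = [(0.10, -0.4), (0.05, -0.2)]
loss = imitation_loss(planner, driver)   # penalizes the harsher braking
```

Minimizing a loss like this over many drivers is one plausible mechanism for the "learn smoothness from humans" hope expressed above; whether Tesla's shadow mode computes anything of the sort is not publicly documented.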
 
Just want to report: FSD Beta didn't try to kill me making the right turn out of my housing neighborhood onto a higher-speed road for the first time - a turn it has to make nearly every time I leave the house. It's possible I was just lucky, but I think it may have "learned" to inch forward a bit? Honestly not sure. Can't figure out how to size down the video on the iPad here, but I will try again tomorrow, I guess, and hope for 10.3 Saturday.
 
That was something I thought was different with beta 10.2. A light turned yellow just as I approached, and it went right through it just like a human would have. The "old" FSD would have jammed on the brakes. Much better traffic-light experience with the beta.
Had a couple of those myself. I was surprised. It went through a couple where I would have braked, although the stop would have required braking hard enough to get me a ding during the scoring period.
 
If you have FSD Beta 10.2 installed, the Safety Score will no longer show in the app, as it is no longer needed for eligibility into the beta. However, you can still see the score calculated in TeslaFi if you subscribe there, so the car is still tracking the metrics.
Mine seems to be running a day behind. Right now (10/14 @ 10:17 EDT) it lists 9/24-10/13. But it does seem to be updating, at least the mileage. I don't show up in the statistics anywhere, though.
 
Likely already asked and answered, but has anyone on Android received FSD? I "sideloaded" the app on Google Pixel using APKMirror, have 99 score, no FSD yet. Waiting to see how next wave goes, and how long I have to drive like the roads are made of eggshells.
If you mean sideloading to get the scoring version of the Android app and then receiving FSD Beta: yes. I have FSD Beta and usually sideload from APKMirror. Just updated today to 4.1.1.667.
 
I made sure to save clips of driving incidents today where I also hit the report button, but it looks like those clips were removed from my dashcam. In one particular incident, the car was turning left at a light, but halfway through the intersection it decided to crank the wheel hard to the right. I reported it and then a bit later hit save to the dashcam. Trying to watch the video, the clip is missing the footage of the incident, but the timer shown on the video doesn't show any missing time.
 
That’s an interesting result. Can it be repeated?