Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Wiki FSD Beta 10.4

and the longer minimum follow distance even more so.

Since the car still follows way too closely at following distance 7 (both on freeways and surface streets), I don’t find this to be an issue. The speed limit cap is very annoying though.

It’s weird because the close following now seems somewhat erratic. They seem to have really relaxed the rubber-banding to try to minimize jerk. It’s very sad. With FSD in the Chill driving style, I never know when it is going to be riding someone’s a** (probably 1-second following or so; I plan to take some videos and measure it). Sometimes it does, sometimes it doesn’t. And it is very lax in responding to speed changes of the lead vehicle. That’s fine in some ways (I don’t want it to be too rigid either, since that also makes me complain), but it seems tuned all wrong.
 
You are setting yourself up for disappointment. Think a timeframe of 12-24 months for improvements to “much better” here. And that is a long road with many perils (see below!) along the way. 10.5 will be largely indistinguishable from 10.4 - there may be new minor features added, but expect the same basic experience for quite a while now.

Maybe in 11 we’ll see some marked changes in behavior (and maybe single stack - which may not be an improvement) but I still expect it will be a pretty rough ride.

I generally have the idea that in a year or two we may have something that provides overall a fairly pleasant drive and behaves naturally. Not sure if that is too optimistic - it’s really hard to predict. We are at 3+ years with NOA and we’re not at that point yet, so this timeline may well be optimistic.

It’s fine - there’s no rush! Just have to try to make sure no one totals my car over the next few years. Everyone out there is trying as far as I can tell.



For this accident, that seems like a lot of assumptions to make. While it’s apparently possible for EPAS to outmuscle the driver, I have never experienced that, even with emergency road departure assist (which I have seen engage on a few occasions). So I don’t really see anything wrong with the current level of lane departure correction (and does it even apply during a turn?).

It’s really hard to say what happened in this case. My guess would be that FSD jerked the wheel suddenly during a turn into the adjacent turn lane, putting the car on a collision trajectory, which is 100% normal and fully expected. Then the driver disengaged and did not recover it properly and completed the collision. But there’s really no way to know without video or data from the vehicle. I would be shocked if it happened as described, in any case. And all of the above is pure speculation.
Glad you noticed our efforts; it's a thankless job sometimes. I almost had you at the corner by Starbucks in a nice multi-car rear-end collision, but you accelerated into the adjacent lane. Then later that morning, across from Dunkin' Donuts, I tried to T-bone the passenger side and hit a pothole, spilling java all over myself. What a mess. If you'd slow down and just sit at the intersection tomorrow, I'm sure I could get you in a nice side swipe. Nothing fancy... just a classic.
 
If I set it at 7 on highways, I'd be perpetually passed and cut off by other cars.

Even at 2 it's not that rare.

I miss 1.
I guess it all depends on user preference. Setting 7 is quite close; it seems to be typically a little under 3 seconds (though difficult to measure these days unless you can get a good steady-state situation with the lead vehicle using cruise control, due to the aforementioned inconsistency), which is quite tight at freeway speeds.

I’d definitely like options to select out to about following distance 12-14 to maximize visibility (since NOA does not permit or create a lane position offset to improve visibility around the lead car). It’s nice and relaxing to follow at 4-5 seconds.

On California freeways I don’t worry about cut-ins. People will do it with impossibly tight spacing anyway, so it’s better for me to just follow at a distance and go with the flow. Otherwise I’ll just end up feet from someone’s bumper at 80 mph while NOA/FSD ponders for several seconds what has happened (yes, normally I would just disengage). (Classic NOA response I posted elsewhere, just prior to the FSD beta rollout, entirely in NOA, no intervention other than target speed setting, here:
) Anyway, as long as I’m traveling slightly above the speed of most traffic, they usually aren’t a big deal, since the relative motion tends to discourage the cut-ins, even with a large gap.
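A back-of-the-envelope way to sanity-check those time gaps (the formula is just distance = speed × time; the mph-to-m/s conversion factor is standard, but the mapping of distance settings to seconds is my own informal estimate, not anything from Tesla):

```python
# Rough relationship between time gap and physical following distance.
# Nothing here is Tesla spec; it's just kinematics for intuition.

def gap_distance_m(speed_mph: float, time_gap_s: float) -> float:
    """Distance to the lead car for a given time gap at a given speed."""
    speed_mps = speed_mph * 0.44704  # mph -> m/s
    return speed_mps * time_gap_s

for t in (1.0, 3.0, 4.5):
    print(f"{t:.1f} s gap at 70 mph ~ {gap_distance_m(70, t):.0f} m")
```

For reference, the ~1-second gaps complained about above work out to roughly a 30 m cushion at 70 mph, versus about 140 m for a relaxed 4.5-second follow.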
 
You are setting yourself up for disappointment. Think a timeframe of 12-24 months for improvements to “much better” here. And that is a long road with many perils (see below!) along the way. 10.5 will be largely indistinguishable from 10.4 - there may be new minor features added, but expect the same basic experience for quite a while now.
Even a top tier software company like Apple working in a much more limited medium like smartphones can barely deploy a handful of significant updates per year without unintentionally breaking a bunch of stuff in each new version. Neural Net and Machine Learning and other buzzwords/phrases clearly don't change any of these realities -- people expecting massive changes on a biweekly update cycle are asking to be disappointed.
 
I would be shocked if it happened as described, in any case. And all of the above is pure speculation.
Yes - I should say - if they find the accident happened because of lane departure correction or TACC, they may take some action.

BTW, lane departure definitely shouldn't kick in when we manually correct FSD behavior. But currently it does (though probably not during a turn); in my case it happened when FSD got into the wrong lane or space and I corrected it.
 
I had a weird twist on the common problem of trying to pass a car at a red light today.

I came up behind a pick-up truck at a red light, and as always happens when FSD is behind a stopped car and can't see the red light (the truck was tall and blocked the view), FSD repeatedly tried to go around the truck using a turn-only lane.

The crazy thing wasn't that it did this at the red light (that's bad, but normal for 10.4), but it kept on happening for 5 miles of driving behind this pickup truck after the light turned green.

The visualization showed the truck in the correct position, but the planner showed that it wanted to either pass the truck on any drivable surface (road markings be damned) or come to a stop before the current position of the truck, even while the truck and I were both traveling around 40 mph. Re-enabling FSD immediately resulted in firm braking, as if it wanted to stop for a stationary truck, even though the truck was clearly moving and the visualization knew where it was.

I suspect some kind of "this is a parked vehicle" bit associated with that vehicle object in FSD memory got stuck on. The behavior happened for 5 miles of driving, and I could only use FSD again without crazy braking/swerving after the truck eventually turned off on a different road than I was taking.
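To illustrate what I mean by a stuck bit, here's a toy sketch of the suspected failure mode (purely hypothetical Python; the class and field names are mine and have nothing to do with Tesla's actual object tracker):

```python
# Hypothetical illustration of a latched "parked" flag: if the flag is
# only ever set on first detection and never re-evaluated, one early
# classification follows the tracked object for the rest of the drive.

class TrackedVehicle:
    def __init__(self) -> None:
        self.parked = False  # latched flag; the bug is that it's never cleared

    def update(self, speed_mps: float, looks_parked: bool) -> None:
        if looks_parked:
            self.parked = True  # set once at the red light...
        # BUG: no branch like `elif speed_mps > 2.0: self.parked = False`

truck = TrackedVehicle()
truck.update(0.0, looks_parked=True)    # stopped at the red light
truck.update(18.0, looks_parked=False)  # now doing ~40 mph
print(truck.parked)  # still True -> planner keeps treating it as an obstacle
```

If something like this is happening, it would explain why the behavior only cleared once the truck left my route and its track was presumably dropped.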
 
...and how quickly does a timing belt go from 0 to nearly full speed during start-up, and with how much load (AC compressor, alternator, water pump, ...)? Also stopping.

Tesla's steering belt is making relatively small movements and I gather the load/stress isn't that high. A belt doesn't know/care about which direction it's turning or even if it's reversing.

It's the rapid reversals that are hardly ever encountered in manual driving. It is a flexible toothed belt sort of like a miniature timing belt. Timing belts last a long time but they don't have these rapid reversals.
 
The visualization showed the truck in the correct position, but the planner showed that it wanted to either pass the truck on any drivable surface (road markings be damned) or come to a stop before the current position of the truck, even while the truck and I were both traveling around 40 mph. Re-enabling FSD immediately resulted in firm braking, as if it wanted to stop for a stationary truck, even though the truck was clearly moving and the visualization knew where it was.
It simply couldn't believe that a truck of that size could be moving? Parked / stationary vehicle detection has been pretty wonky for me so I suspect something about the appearance of the thing kept triggering the "this is a parked vehicle" detections.

It seems to work well only if the obstruction is not moving and clearly visible from far away. Detections in the most common situations such as coming up on a stopped vehicle (or vehicle waiting to make a left on an otherwise empty street) are quite slow. Opposite issue with temporary obstructions like cars moving in and out of driveways, and sometimes pedestrians moving across the roadway - frequently tries to pass when it should wait instead.
 
I had a weird twist on the common problem of trying to pass a car at a red light today.

I came up behind a pick-up truck at a red light, and as always happens when FSD is behind a stopped car and can't see the red light (the truck was tall and blocked the view), FSD repeatedly tried to go around the truck using a turn-only lane.

The crazy thing wasn't that it did this at the red light (that's bad, but normal for 10.4), but it kept on happening for 5 miles of driving behind this pickup truck after the light turned green.

The visualization showed the truck in the correct position, but the planner showed that it wanted to either pass the truck on any drivable surface (road markings be damned) or come to a stop before the current position of the truck, even while the truck and I were both traveling around 40 mph. Re-enabling FSD immediately resulted in firm braking, as if it wanted to stop for a stationary truck, even though the truck was clearly moving and the visualization knew where it was.

I suspect some kind of "this is a parked vehicle" bit associated with that vehicle object in FSD memory got stuck on. The behavior happened for 5 miles of driving, and I could only use FSD again without crazy braking/swerving after the truck eventually turned off on a different road than I was taking.

pretty sure this was your car finding the tailpipe emissions extra stinky.
 
Sure, but the question is how you can know that your car did not detect a slowdown in the vehicle ahead due to brake application. You'd have to test with a vehicle ahead that applies its brake lights without slowing down (perhaps at dusk, by having the lead car flick its headlights on and off to emulate brake-light application, or by temporarily rewiring the brake lights on a test vehicle).
TravelFree said:
but the reaction of that slamming on the brakes when the car and truck lights came on indicates it's the interpretation and on what to do that needs work



I wasn't referring to a slowdown ahead; rather, a brake light came on and the car or truck was not even in my lane. That's when I observed my car braking for no reason, i.e., nothing ahead of ME.
I've never observed a car ahead of me apply its brakes and not slow down. But I haven't tested this at dusk, where it just turned on its lights.

I think setting up test vehicles with rewired lights to simulate a highly unlikely failure in another vehicle, just to test FSD's reaction, is more of something the Tesla employee alpha team might do, but it's well beyond my beta testing.
 
...and how quickly does a timing belt go from 0 to nearly full speed during start-up, and with how much load (AC compressor, alternator, water pump, ...)? Also stopping.

Tesla's steering belt is making relatively small movements and I gather the load/stress isn't that high. A belt doesn't know/care about which direction it's turning or even if it's reversing.
The timing belt is an internal belt operating the cams, not the accessories, and it always travels in one direction. The steering belt goes in both directions as you steer. The jerky motion of FSD 10.4 probably puts transient loads on the belt several times higher than normal steering. The steering motor speed can also be significant, since it's geared down substantially through the drive pulleys and the steering rack. Believe me, this is creating loads the belt may not have been designed for.
 
My Model 3 very nearly caused an accident on FSD Beta 10.4 today. Driving on a 35mph street with moderate traffic, it spontaneously phantom-braked HARD (like ABS-hard), and the car behind me had to swerve out of the way, very nearly colliding with me. This is a particular problem because phantom braking has no good override mechanism. (What am I supposed to do, tap the brake harder?)

In general, FSD seems to have an idea of "safe mode" that is not always safe. Such as spontaneously stopping in the middle of an intersection on a left turn. It seems to have no awareness that stopping can be MORE dangerous than proceeding with less than perfect certainty. Especially since the latter is far easier to manually override than the former. (When FSD is in a tricky situation, I often cover the brake; I never cover the accelerator!)

The other problem with FSD being so cautious is that Tesla is not going to learn anything from it. I disengage FSD regularly because it's doing cautious things that annoy the drivers behind me, but then there's no way to know whether it would have done the right thing if it had just kept going. (I have it on the "average" setting; is "assertive" noticeably different? I'll try it.) I think I might rather have it take its actual best guess at the way forward, at least in situations where the car seems to be thinking "safe mode", rather than "slowing down is what a human driver would do here".

Flip side: FSD sometimes adheres too rigidly to the nominal speed limit, ignoring context. I live in a small community on a tiny cul-de-sac where the safe speed is about 10mph, though there's no explicit sign. FSD tries to blast through it at 25mph because that's technically the speed limit there. In high school I once got in massive trouble with my girlfriend's parents for making this exact mistake 😂
 
This is a particular problem because phantom braking has no good override mechanism. (What am I supposed to do, tap the brake harder?)
I found that immediately stepping on the throttle works every time. The braking action is then very brief, and unless someone is on your bumper or not paying attention, it should save you from a rear-end collision. Due to the very frequent phantom braking, I'm normally ready for it with this technique.

(I have it on the "average" setting; is "assertive" noticeably better?)
It's been a while, on an earlier version, but I recall that assertive won't stop at a stop sign and just slows down when the coast is clear. People get tickets and points when a cop sees that around here. You Californians and your rolling stops speak a different language; stop means slow down and yield out there.
 
I found that immediately stepping on the throttle works every time. The braking action is then very brief, and unless someone is on your bumper or not paying attention, it should save you from a rear-end collision. Due to the very frequent phantom braking, I'm normally ready for it with this technique.
I'll try to get used to this. The throttle doesn't disable FSD though; I think that was my instinctive hesitation, that I'd rather disable FSD than fight with it. But I'll try the throttle the next time it happens.
It's been a while, on an earlier version, but I recall that assertive won't stop at a stop sign and just slows down when the coast is clear. People get tickets and points when a cop sees that around here. You Californians and your rolling stops speak a different language; stop means slow down and yield out there.
Right, but my thought was less about stop signs (or any situation where the car has complete understanding of its environment) and more about setting a different "safe mode" threshold. Wondering if chill/average/assertive affects this at all? Guessing not, but maybe.
 
Internal developer tools let you tag a recorded clip with a reason; it would be nice if they gave us access to this:

[attached screenshot: 1637052083238.png]
 
Show me that the AI you believe is real (maybe just immature) is not just a bigger, faster calculator that operates on a library of rules, i.e., equations. I don't drink coffee and never needed drugs to learn. But I do know Fortran, so I have an elementary understanding of computer programming. Just show me how AI is not just a very fast, very robust calculator: silicon hardware using software to decide what to do.


Absolutely not! The human brain takes in visual, touch, and sound input, and creates actions based on biochemical and electrical reactions. Top scientists barely understand how it works. There are rules, but how the brain applies those rules is where it differs from a set of instructions stored in memory. Why does a human brain decide to run a red light? Will the AI in a Tesla be that creative? OK, the programmers have already decided to write an instruction that allows the FSD beta to coast through a stop sign. You may think that is OK because you do it all the time. But did AI decide to do that, or did a programmer write that instruction for the FSD calculator to follow?


I'm not saying we will never have true artificial intelligence that can invent, create, and decide to destroy the human race as an inferior lower lifeform. It could be possible. Will we create that level of AI? Or just a faster, bigger computer that can calculate faster?

One of the problems all current Teslas have with these left and right turns is seeing around a corner. The pillar cameras are located too far back, which requires the programmer to write an instruction to cautiously creep into cross traffic until the cameras can see left and right. For FSD to be better than humans, it needs cameras located up near the headlights. One YouTuber already tested this and was able to see approaching traffic that the pillar cameras could not, while those cameras were instructing the car to creep into the lane of traffic. He had his GoPro mounted on the front of the car, and its remote view could see what the pillar cameras could not. This is not original: we have our own robo-bus in town that puts its cameras up near the front left and right headlights.

I considered what you wrote about the blind person being able to see... but all you did was confirm what I said: FSD beta is nothing more than a special calculator with a software program that runs very fast. True FSD needs to know every possible scenario that will ever happen and compute it faster than a human. The hardware we are putting in the cars today is grossly inadequate. Even Musk claims he is striving for the march of 9's, and at some point he will decide it is good enough. But will the regulators? Musk's march of 9's will be adequate when statistics tell him Tesla FSD is safer (fewer accidents) than human drivers. I agree that may be good enough. BUT is that true AI? I don't think so. We're a long, long way from the thinking liquid-metal robot in The Terminator.

You can believe in the fantasy of a Robotaxi that can think and transport people anywhere, anytime, perfectly, mixing in with human drivers. All I see as possible is a better Level 2 FSD/NOA that I will need to monitor, but with the mundane steering and speed handled by a machine. I don't expect to live to see the day when I can get in the back seat, say "take me to my sister's house in Pennsylvania," and then take a nap or watch a movie.
Since I read through this wall of rambling, somewhat incoherent text, I felt entitled to give you a gentle heads up: you have a fundamental misunderstanding of machine learning, AI, and the concept of inference. A decent starting point would be to use “pattern matcher” as an analogy rather than “calculator”. Then google “AI inference” and “emergent behavior”.
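For a concrete taste of the "pattern matcher" analogy, here's a toy nearest-neighbor sketch in Python. No rule in it ever says when to "follow"; the behavior falls out of the stored examples. All numbers and labels are invented purely for illustration:

```python
# A toy nearest-neighbor pattern matcher: the decision rule is never
# written out explicitly; it emerges from labeled examples.

examples = [
    # (lead_car_speed_mps, gap_m) -> action "learned" from demonstrations
    ((0.0, 5.0), "stop"),
    ((0.0, 40.0), "slow"),
    ((15.0, 30.0), "follow"),
    ((25.0, 80.0), "cruise"),
]

def predict(speed: float, gap: float) -> str:
    """Return the action of the most similar stored example."""
    dist = lambda ex: (ex[0][0] - speed) ** 2 + (ex[0][1] - gap) ** 2
    return min(examples, key=dist)[1]

print(predict(14.0, 28.0))  # "follow": no if/else rule ever said so
```

Scale that idea up by many orders of magnitude (learned features instead of two hand-picked numbers, millions of examples, a neural network instead of nearest neighbor) and it's a better mental model for inference than "calculator running equations."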
 
The throttle doesn't disable FSD though; I think that was my instinctive hesitation, that I'd rather disable FSD than fight with it. But I'll try the throttle the next time it happens.
I now instinctively hit up on the right stalk while pushing the accelerator pedal, to both minimize that sudden braking and disengage FSD beta. Similar to this, where I always have my right hand touching the bottom of the right stalk: First FSD Beta accident?
 