
FSD Beta 10.6

Improved comfort when offsetting for objects that are cutting out of your lane.

Let's see what 10.6 has in store. I'm most interested in the last line of the release notes, since it seems like it freaks out on narrow roads with oncoming traffic.

Is that what you think "cutting out of your lane" means? I thought it was more to do with people slowing down in front to turn.
 
Just my opinion -
The external beta testers are really there to collect the corner cases. The engineers that work on FSD use it regularly; as do a host of internal beta testers. Chances are any generalish feedback you’d like to offer is already well-known to the team and reading some long-winded email from us about how we’d like better lane bias or curve handling isn’t an efficient use of their time. What they DO want to hear from you is how FSD failed at that super unique intersection near your house or on a weird blind turn on your commute, and that’s what the snapshot button is for.
That's unfortunate, because there are some things in the current Beta that are clearly mistakes. The lane bias and phantom braking are definitely annoying and perhaps downright dangerous, but the developers seem to have gotten way ahead of themselves in adding "capabilities" to the system before resolving more fundamental issues. For example, yesterday I was on a curvy road with a double-yellow line and a 30 mph speed limit. The car in front of me slowed down and came to a stop, and my M3 began to slow down and stop behind it. The car had its left-turn signal on, but it was hard to see with the sun's reflection as well as the weird (IMHO) design of the taillights. All of a sudden, my M3 just decided that the car in front of it was parked and it was going to go around. After one oncoming car passed, my car began to accelerate across the double-yellow line and into additional oncoming cars (the reason the car in front wasn't turning yet). I narrowly missed a head-on collision.

Now, you can say "it's still in Beta" and these things are going to happen, but my point is this: why was it trying to cross a double-yellow line on a curvy road with no visibility to go around another car, parked or not? Isn't that considered an advanced maneuver? Why would the car even be programmed or trained to do such a dangerous thing? At this stage, just stay in your lane behind the parked car until the driver intervenes - I mean, the driver is always there. Why not work on more fundamental things like staying in your lane around right-hand curves, stopping at stop lines instead of two car lengths back, not hard-braking at every change in line quality or light/shadow, and things like that before you try to train the car to go around stopped traffic on curvy, double-yellow-lined roads when there's zero visibility of oncoming cars!
 
Chances are any generalish feedback you’d like to offer is already well-known to the team
Don't assume that. It's very easy to overlook stuff when you are very familiar with something. A lot of errors come up because of this kind of thinking.

We should report all the problems.

P.S.: Anyway, this is why it is so important for Tesla to maintain a public-facing issue tracker where people can log and vote on issues. If there is a "generic" issue already in the issue log, we can just upvote it instead of reporting it. Then it is easier to just log/report the edge cases. Their current system is very inefficient.
 
Not totally on-topic, but I find the stopping for traffic controls to be the best part of the beta. If that was how 'stop for traffic control' or whatever it's called worked in regular AP, it would be a useful addition (with the exception of sometimes needing accelerator input at 4-way stops when there's no other traffic at all).

I disengage and/or use accelerator input for 90+ percent of turns though. It's either completely missing the turning lane, stopping/accelerating uncomfortably, jerking the wheel around randomly and picking a bad line, coming way too close to curbing my wheels, etc. And wow, it can jerk the wheel hard at times while sitting at a complete stop. It's insane how premature/unrefined this is in some areas. If speed = 0, then don't turn the wheel. How was that not one of the first parts of the driving policy ever written?
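Just to illustrate the kind of guard I mean (a purely hypothetical sketch with made-up names, obviously not Tesla's actual code):

```python
# Hypothetical sketch of a "don't saw the wheel while stationary" guard.
# Names and thresholds are invented for illustration only.

STATIONARY_SPEED_MPS = 0.1          # treat anything below this as "stopped"
STATIONARY_WHEEL_RATE_DEG_S = 0.0   # no steering motion while stopped

def limit_steering_rate(requested_rate_deg_s: float, vehicle_speed_mps: float) -> float:
    """Clamp the planner's requested steering-wheel rate when the car is stopped."""
    if vehicle_speed_mps < STATIONARY_SPEED_MPS:
        # Hold the wheel still instead of repositioning it at a standstill.
        return STATIONARY_WHEEL_RATE_DEG_S
    return requested_rate_deg_s
```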

Edit - I completely agree they need a forum or bug-tracker or something where users can star/up-vote key issues. There is a ton of low-hanging fruit that would dramatically improve the experience that the development team is obviously not addressing.
 
That's unfortunate, because there are some things in the current Beta that are clearly mistakes. The lane bias and phantom braking are definitely annoying and perhaps downright dangerous, but the developers seem to have gotten way ahead of themselves in adding "capabilities" to the system before resolving more fundamental issues. For example, yesterday I was on a curvy road with a double-yellow line and a 30 mph speed limit. The car in front of me slowed down and came to a stop, and my M3 began to slow down and stop behind it. The car had its left-turn signal on, but it was hard to see with the sun's reflection as well as the weird (IMHO) design of the taillights. All of a sudden, my M3 just decided that the car in front of it was parked and it was going to go around. After one oncoming car passed, my car began to accelerate across the double-yellow line and into additional oncoming cars (the reason the car in front wasn't turning yet). I narrowly missed a head-on collision.

Now, you can say "it's still in Beta" and these things are going to happen, but my point is this: why was it trying to cross a double-yellow line on a curvy road with no visibility to go around another car, parked or not? Isn't that considered an advanced maneuver? Why would the car even be programmed or trained to do such a dangerous thing? At this stage, just stay in your lane behind the parked car until the driver intervenes - I mean, the driver is always there. Why not work on more fundamental things like staying in your lane around right-hand curves, stopping at stop lines instead of two car lengths back, not hard-braking at every change in line quality or light/shadow, and things like that before you try to train the car to go around stopped traffic on curvy, double-yellow-lined roads when there's zero visibility of oncoming cars!

They’ve given the cars the capability to do some advanced maneuvers, but the judgment of exactly when they are appropriate can only be refined via iterative training of the neural nets from hordes of “moving car or obstacle?” situations. As a beta tester, it’s up to you to decide when and when not to allow the car to try a maneuver. If you can, it helps train the neural nets. If you don’t feel comfortable, or anticipate there’s going to be a problem, disengage beforehand. The great thing about beta testing is you can do as much as you feel comfortable with. It’s definitely a very active process compared to how one might sit back using AP on the highway, and it’s not for everyone.

Yes, I’ve found 10.5 to be quite impatient in terms of deeming a stopped vehicle as an obstacle to be gone around (the pendulum has swung the other way apparently), but I’m anticipating it and disengage at the first sniff that the car is going to try that. That “bad dog” disengagement is very important and I’m quite certain the impatience will be tempered in the next release.
 
Not totally on-topic, but I find the stopping for traffic controls to be the best part of the beta. If that was how 'stop for traffic control' or whatever it's called worked in regular AP, it would be a useful addition...
I have a relatively new Model Y, with FSD but not FSD beta. Still on "break-in" firmware 21.35.102. I drive with standard autopilot almost all the time now, and I find that stopping for red lights*, deciding for yellows, and stopping for my less-common stop signs are all very smooth and correct. Accelerating from stop, when I'm the lead car on green, is pretty slow but acceptable and easily overridden.

* However, I don't like the behavior when the next lane over has a long line of stopped traffic and mine doesn't. AP makes no effort to avoid high differential speed; it will blow right by a long line of stopped or slow cars in exactly the situation where a) drivers in that lane will be tempted to suddenly switch out into my clear lane, b) a jaywalking pedestrian could dart out between cars, or c) there may be some developing situation that caused the lane slowdown, so it's unsafe to continue at speed in the next lane.
Good drivers, and AP, should limit differential speed between adjacent lanes. So, slightly back on topic, I'd be curious to know whether FSD beta is showing any signs of this defensive-driving capability regarding differential speed. Probably not, given the reports about its aggressiveness in trying to get around slow traffic (which I think was added to address earlier complaints about it sitting dumbly behind delivery vehicles, etc.)
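One way to picture the kind of rule I'm asking about (entirely hypothetical, not anything Tesla has published; names and numbers are made up):

```python
# Hypothetical sketch: cap ego speed based on adjacent-lane traffic speed.

MAX_SPEED_DELTA_MPS = 7.0  # assumed: don't pass a stopped queue more than ~15 mph faster

def defensive_speed_cap(adjacent_lane_speed_mps: float, posted_limit_mps: float) -> float:
    """Limit speed so the differential vs. the adjacent lane stays reasonable."""
    cap = adjacent_lane_speed_mps + MAX_SPEED_DELTA_MPS
    return min(posted_limit_mps, cap)

# Passing a stopped queue (0 m/s) on a ~45 mph (20 m/s) road -> capped at 7 m/s (~15 mph)
print(defensive_speed_cap(0.0, 20.0))
```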
 
Just my opinion -
The external beta testers are really there to collect the corner cases. The engineers that work on FSD use it regularly; as do a host of internal beta testers. Chances are any generalish feedback you’d like to offer is already well-known to the team and reading some long-winded email from us about how we’d like better lane bias or curve handling isn’t an efficient use of their time. What they DO want to hear from you is how FSD failed at that super unique intersection near your house or on a weird blind turn on your commute, and that’s what the snapshot button is for.

With NN training, they need lots of videos. So any time the car does something wrong, even if they know about it, it's not always like the old days where you hunt for the lines of code causing the problem. It will take time for the machine to learn what the right behavior is through looking at tons of imagery. But I tend to agree that verbose bug descriptions aren't as helpful unless it's a really unique situation.


Is that what you think "cutting out of your lane" means? I thought it was more to do with people slowing down in front to turn.
My interpretation is that when a lead car makes a turn off the road that you're on, the car often takes too long to get going again, waiting for every last piece of the car to be clear of the lane. A comfort adjustment is either not slowing down as much, or accelerating earlier, or both.
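A toy way to express that interpretation (the threshold and names are pure guesswork on my part):

```python
# Hypothetical sketch of the "accelerate earlier" comfort tweak described above.
# lead_overlap_fraction = how much of the turning lead car still overlaps my lane (0..1).

OLD_RESUME_THRESHOLD = 0.0   # old behavior: wait until the car is fully clear of the lane
NEW_RESUME_THRESHOLD = 0.25  # assumed: resume once it's mostly out of the lane

def should_resume(lead_overlap_fraction: float, threshold: float = NEW_RESUME_THRESHOLD) -> bool:
    """Decide whether to start accelerating again behind a car turning off the road."""
    return lead_overlap_fraction <= threshold
```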
 
It will take time for the machine to learn what the right behavior is through looking at tons of imagery. But I tend to agree that verbose bug descriptions aren't as helpful unless it's a really unique situation.

As far as we know, Tesla is mostly using procedural code for fsd beta's decision making and behavior.

I still think most of our issues with fsd beta are related to NN perception training / predictions / dealing with certainty / uncertainty.
 
So any time the car does something wrong, even if they know about it, it's not always like the old days where you hunt for the lines of code causing the problem. It will take time for the machine to learn what the right behavior is through looking at tons of imagery.
As far as we know, Tesla is mostly using procedural code for fsd beta's decision making and behavior.

I still think most of our issues with fsd beta are related to NN perception training / predictions / dealing with certainty / uncertainty.
They are using some kind of cost optimization to figure out the best path. So, not really a straight piece of code that can easily be changed to fix an issue.
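Roughly the shape I imagine, i.e. scoring candidate trajectories against weighted cost terms rather than branching on a specific scenario (all names, terms, and weights invented for illustration):

```python
# Hypothetical sketch of cost-based trajectory selection.
from dataclasses import dataclass

@dataclass
class Candidate:
    lateral_jerk: float   # comfort term
    lane_crossing: float  # 1.0 if the trajectory crosses a double yellow, else 0.0
    progress: float       # distance gained toward the route goal

WEIGHTS = {"lateral_jerk": 2.0, "lane_crossing": 50.0, "progress": -1.0}

def cost(c: Candidate) -> float:
    return (WEIGHTS["lateral_jerk"] * c.lateral_jerk
            + WEIGHTS["lane_crossing"] * c.lane_crossing
            + WEIGHTS["progress"] * c.progress)

def pick_best(candidates: list[Candidate]) -> Candidate:
    return min(candidates, key=cost)
```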

Of course, where the problem is perception related, it’s NN.
 
Of course, where the problem is perception related, it’s NN.

Almost every issue I've had with fsd beta is perception related, especially in parking lots, driving in bike lanes and then moving out, etc.

It's great to see fsd beta making "huge" improvements with every iteration. For example, on 10.2/3 it would drive in this wide bike lane for 40 feet before moving out; with 10.5, it's about 5 feet and then it abruptly moves out, almost in one motion, but still not "human-like."

In the update notes for FSD beta, all the improvements are NN prediction related.
 
I've been in the public beta since 10.2 and have been extensively using the video snapshot button for problematic intersections such as those that are very large (e.g., SPUI - single-point urban interchange), angled/shifted, or have limited visibility (e.g., cresting or sloped). I actually go through the intersection from multiple directions and capture a video even if FSD handles it fine, as the 8 cameras combined with previous snapshots give a better view of the whole intersection for auto-labeling. With enough experience with beta, one can generally get a sense of which intersections will confuse FSD, especially with the visualizations showing wrong or blurry lines.

With the 10.5 update, almost all of the problematic intersections I've retested are now handled fine, in that FSD Beta can now drive to the correct destination lane (instead of toward oncoming traffic or an incorrect turn). The visualizations in general are more confident and show things from farther away as well, and I would guess this was indeed from sending back videos of these corner cases.

It's unclear if this means it required 6 weeks to fix (10.2 to 10.5) or that Tesla only recently got all the auto-labeling of clips from the fleet working. But hopefully this cycle continues and speeds up, especially as the FSD Beta population continues to grow to find these unique situations.
Sorry, but my experience is different. Except for one ride with one or two minor mistakes, I have not been able to get even a few miles out of an FSD drive. Every time I take my car out, within a mile it makes so many mistakes that I have to turn FSD off. It's completely useless and just for YouTube bloggers to post videos for making money.
Sorry for being too harsh, but that is my experience and my personal opinion.
 
Almost every issue I've had with fsd beta is perception related, especially in parking lots, driving in bike lanes and then moving out, etc.
Most of my issues are with map or planning.

It’s possible there is an element of perception there too, but it's difficult to untangle that.

E.g., taking 10 seconds to move out of a right turn from a stop vs. a smooth right turn when already moving is probably planning related. Stopping at crosswalks before roundabouts with no one around is problematic planning too. Etc.
 
Stopping at crosswalks before roundabouts with no one around is problematic planning too. Etc.

I don't think Tesla is coding in "stop at all crosswalks before roundabouts." It's more likely that the perception is uncertain about the relevance of the crosswalk geometry in relation to the roundabout. Achieving high certainty with vision is very difficult, considering all the possible road geometries and objects that are in the training set. FSD beta could be slowing down for all sorts of reasons (deer, speed bump, keep clear, railroad tracks, pets, the list goes on and on) based on certainty predictions.
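As a toy illustration of that idea, here is a speed cap driven by perception confidence (the function, scaling, and numbers are all assumptions, not anything Tesla has described):

```python
# Hypothetical sketch: slow down as perception confidence about the scene drops.

def uncertainty_speed_cap(base_speed_mps: float, scene_confidence: float) -> float:
    """Scale the allowed speed by how certain perception is about what's ahead.

    scene_confidence is assumed to be in [0, 1]; 1.0 means fully confident.
    """
    scene_confidence = max(0.0, min(1.0, scene_confidence))
    # Low confidence -> creep; high confidence -> close to the base speed.
    return base_speed_mps * scene_confidence

# An ambiguous crosswalk-near-roundabout at 0.4 confidence on a 15 m/s road -> 6 m/s
print(uncertainty_speed_cap(15.0, 0.4))
```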
 
Almost every issue I've had with fsd beta is perception related, especially in parking lots, driving in bike lanes and then moving out, etc.

It's great to see fsd beta making "huge" improvements with every iteration. For example, on 10.2/3 it would drive in this wide bike lane for 40 feet before moving out; with 10.5, it's about 5 feet and then it abruptly moves out, almost in one motion, but still not "human-like."

In the update notes for FSD beta, all the improvements are NN prediction related.
That's planning....
They are using some kind of cost optimization to figure out the best path. So, not really a straight piece of code that can easily be changed to fix an issue.

Of course, where the problem is perception related, it’s NN.
It literally is... not saying it's easy to fix, but there are dozens of planners running that handle different scenarios. Like a planner for lane changes, etc.
 
I don't think Tesla is coding in "stop at all crosswalks before roundabouts." It's more likely that the perception is uncertain about the relevance of the crosswalk geometry in relation to the roundabout. Achieving high certainty with vision is very difficult, considering all the possible road geometries and objects that are in the training set. FSD beta could be slowing down for all sorts of reasons (deer, speed bump, keep clear, railroad tracks, pets, the list goes on and on) based on certainty predictions.
I think the planner is unsure how to handle crosswalks before roundabouts, or (in another case) 50 ft before a traffic signal, or a flashing yellow light at a pedestrian crosswalk or school zone.

Perception doesn’t seem to misrecognize them as anything else (at least from visualization).

As always, we are just trying to guess.
 
It literally is... not saying it's easy to fix, but there are dozens of planners running that handle different scenarios. Like a planner for lane changes, etc.

We are probably thinking of different meanings of "straight piece of code". Yes, ultimately the planner change will be some kind of weight change or additional dimension, but it's not a piece of code specific to this particular case that can be changed. It's probably a lot of trial and error with different weights being used in simulation to get the optimal numbers to handle all the cases.
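Something like this kind of loop is what I have in mind, reusing the toy weighted-cost idea from earlier in the thread (everything here is assumed for illustration; evaluate_in_sim is an imaginary stand-in for whatever scenario suite they actually run):

```python
# Hypothetical sketch of tuning planner cost weights by trial and error in simulation.
import itertools
import random

def evaluate_in_sim(weights: dict) -> float:
    """Imaginary scoring function: lower is better (collisions, discomfort, delay)."""
    random.seed(hash(frozenset(weights.items())) & 0xFFFF)
    return random.random()

def grid_search(jerk_weights, crossing_weights, progress_weights):
    best, best_score = None, float("inf")
    for j, c, p in itertools.product(jerk_weights, crossing_weights, progress_weights):
        weights = {"lateral_jerk": j, "lane_crossing": c, "progress": p}
        score = evaluate_in_sim(weights)
        if score < best_score:
            best, best_score = weights, score
    return best

print(grid_search([1.0, 2.0], [25.0, 50.0], [-0.5, -1.0]))
```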