Found it by accident. Not sure they are documenting all capabilities anymore.
That's not a designed feature. It just happened to work that way and you find it useful. There are other behaviors that happen to work a certain way and we don't find them useful. Tesla is responsible for ensuring we don't see any that work a certain way and are dangerous. I suspect that FSD development is now more like herding cats than it is traditional software development. They don't have direct control over the system's behavior. They can only encourage it in a certain direction and check to see if it does what they were after - and only what they were after.
 
More likely it is just a matter that on Chill it didn't meet the threshold for changing lanes to go around slower traffic, and on Average it did.
That's the explanation of how it works. The question is whether it was a mistake or oversight that it works that way. I suggest that it's neither. It's an obscure interaction with the system. In gaming, that would be an exploit. It's just something that the product does and that you can use. The more complex the system, the greater the likelihood that there will be such things.

I mentioned gaming because the behaviors in gaming are intended to be more open and complex than, say, in Microsoft Word. In productivity applications, you want very precise behavior from the system. In a driving assist, trying for precise behavior was the downfall of the heuristic approaches. So now FSD has a more general behavior, and that means that there will undoubtedly be some interesting exploits in there.
 
That's the explanation of how it works. The question is whether it was a mistake or oversight that it works that way. I suggest that it's neither. It's an obscure interaction with the system.
I think the point is it might not always work that way, because sometimes it will meet the threshold for a lane change on Chill, so it wouldn't cancel the lane change. (Though I think that would be fairly rare.)
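
If it really is just a profile-dependent threshold, nothing exotic is needed to produce exactly this behavior. Here's a minimal sketch in Python, with entirely made-up names and numbers (this is not Tesla's actual logic), of how a per-profile cutoff would pass on Average, usually hold back on Chill, and still occasionally pass on Chill when the speed gap is big enough:

```python
# Hypothetical sketch of a profile-dependent lane-change threshold.
# None of these names or numbers come from Tesla; they only illustrate
# why the same traffic can trigger a pass on Average but not on Chill.

PROFILE_THRESHOLD_MPH = {
    "chill": 12.0,    # assumed: needs a big speed advantage before passing
    "average": 7.0,   # assumed: passes with a moderate advantage
    "assertive": 4.0, # assumed: passes with a small advantage
}

def should_change_lanes(profile: str, set_speed: float, lead_car_speed: float) -> bool:
    """Return True if the speed lost behind the lead car exceeds the profile's cutoff."""
    speed_deficit = set_speed - lead_car_speed
    return speed_deficit >= PROFILE_THRESHOLD_MPH[profile]

# Same situation, different outcome by profile:
print(should_change_lanes("chill", set_speed=70, lead_car_speed=62))    # False: below Chill's cutoff
print(should_change_lanes("average", set_speed=70, lead_car_speed=62))  # True: above Average's cutoff
print(should_change_lanes("chill", set_speed=70, lead_car_speed=55))    # True: even Chill passes eventually
```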
 
There were annoying messages about the possibility of degradation, but it handled fine.

The only degradation that I've seen is a failure to make automatic lane changes. Is there anything else? Limiting the top speed is handled separately.

There is no way Tesla will release ASS and Banish without being liable for repairs.

There's no way.
Do you mean that Tesla must be liable because the operator can't intervene?

IOW, they can't call them SASS and SBanish.

Interesting.
 
I meant max speed. You said you were dialing up.
I did understand that. As I said, I was kidding about taking 38 literally. I was just joking about Auto Speeding being like a Thelma and Louise mode.

I scrolled up to 55, mostly, and the car slowed down very nicely for the turns. I assume it would do the same on Auto Speed, but I didn't try it. On the other hand, if 38 was its choice, that would be too slow a max for the straights.

SO is not OK with me testing new stuff when she is aboard because interventions and disengagements startle her and our puppy.
 
Of course the driver had plenty of time to intervene before. And cyclists always have the right to play out their inner Darwin on tight roads in these days of distracted drivers.

Otherwise it's an unsafe move for FSD to attempt but we've seen it before with blind curves.

I might have used the same language as the driver. :)


Had to disengage != FSD would have caused a head-on collision.

Disengaging is the correct action, but we don't know what FSD would have done because of the disengagement. And what FSD did is likely what most human drivers would have done as well. Perfectly reasonable to try and pass the cyclist with a safe distance, and then pull back into the correct lane when on-coming traffic is spotted.
 
Had to disengage != FSD would have caused a head-on collision.

Disengaging is the correct action, but we don't know what FSD would have done because of the disengagement. And what FSD did is likely what most human drivers would have done as well. Perfectly reasonable to try and pass the cyclist with a safe distance, and then pull back into the correct lane when on-coming traffic is spotted.
Not reasonable to go around a blind turn in the oncoming lane. Which is the only way it would be able to complete the maneuver (look how far ahead the cyclist is!)
[Screenshot from the video showing how far ahead the cyclist is]
 
Not reasonable to go around a blind turn in the oncoming lane.

That road appears to be nothing but blind turns. How long are you willing to sit behind a cyclist before attempting to pass them?

In the end, if FSD is capable of returning to the correct lane shortly after spotting on-coming traffic, then there's no issue. But we don't know if it would have returned because he disengaged.

It's nothing but click-bait for him to claim he prevented a head-on collision.
 
It's nothing but click-bait for him to claim he prevented a head-on collision.
It's clearly clickbait, but that's also a difficult pass and FSD did it very wrong.

The correct way to pass a cyclist in an environment like that is to move up behind the cyclist without crowding him, then set up in the left side of the lane and look for a clear stretch of road. When that stretch is visible, smoothly move past the cyclist, again without crowding him, but also without creating too much of a difference in speed. The goal is to find a stretch of clear road that allows a nice leisurely pass so as not to alarm the cyclist.

FSD didn't do any of that. It just slid left until it straddled the lane divider, even as it couldn't see the road ahead. FSD may have moved the car back to the right, but it shouldn't have been out there in the first place. I'd say that this is a reflection of all that city training. It just isn't trained for this sort of environment.

If I had been the driver, I would have hauled the car back to the right as well. I had a case where I was turning left onto a road from a cross street that was at about a 45 degree angle. It was very easy to see oncoming traffic. FSD pulled out onto the main street, but didn't quickly move across to the right side lane. It just gradually moved over, even as a car was coming straight at it, trivial to see. I let FSD do its thing and the driver saluted my experimental technique by leaning on his horn.

In other words, under the right circumstances, FSD doesn't move into its proper lane even when it might nail another car.
 
That road appears to be nothing but blind turns. How long are you willing to sit behind a cyclist before attempting to pass them?

In the end, if FSD is capable of returning to the correct lane shortly after spotting on-coming traffic, then there's no issue. But we don't know if it would have returned because he disengaged.

It's nothing but click-bait for him to claim he prevented a head-on collision.
The road has straight sections (see 0:13). I just don’t go around blind turns while on the wrong side of the road no matter how long I have to wait.
  1. FSD makes a bad plan.
  2. If it continued with that plan there could have been a collision.
  3. People disengage and say they prevented a collision.
It’s just how it is. Of course we don’t know what would have actually happened.
 
That road appears to be nothing but blind turns. How long are you willing to sit behind a cyclist before attempting to pass them?

In the end, if FSD is capable of returning to the correct lane shortly after spotting on-coming traffic, then there's no issue. But we don't know if it would have returned because he disengaged.
After he disengaged there was little safety margin between ego and the oncoming cop car. And I wouldn't be surprised if the cop was close to turning around and giving him a ticket. It's a safety issue.

The rules for double solid lines might be relaxed in the city, but otherwise they're not, especially on blind corners. The former might be why FSD so willingly throws caution to the wind and crosses double solid lines.
 
About speed limit sign reading: it does not read (or interpret) signs like "End of 25 Speed Limit".

We drove from Santa Rosa, out to Jenner, and up to Gualala via the twisty-turny Hwy 1. Through Jenner the speed signs say 25, which FSD read and obeyed. (Auto Speed Setting is off.) Leaving town there is an "End of 25" speed limit sign, but the car continues to display the limit as 25 for many miles.

My understanding is that here in Calif, the end of speed limit sign means that the limit is 55, the default on open roads unless posted higher.

Scrolling up to 55 is easy and worked fine, but the car needs better sign understanding.

Again, I did not try the auto speed setting option.

FSD 12.3.6 handled the drive along the Russian River and up the coast very nicely, slowing for the hairpins and accelerating for the straight stretches. Quite amazing. I did drive manually for some of it to fit in better with other drivers, and 'cause Hwy 1 is fun to drive!
So I have to wonder: why don't they just put a 'Speed Limit 55' sign instead of an 'End of 25' sign? The cost is the same, but one option is completely clear while the other is ambiguous and requires knowing what the previous limit was.
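
One way to see the asymmetry: a posted number is self-contained, while an "End of 25" sign only resolves to a limit if the reader tracks the prior limit and knows the statutory default. A rough Python sketch of that difference (the 55 mph open-road default and the sign wording come from the posts above; none of this reflects how FSD actually parses signs):

```python
# Rough sketch of why "End of 25" is harder to interpret than "Speed Limit 55".
# The 55 mph open-road default and the sign strings are assumptions taken from
# this thread, not anything documented by Tesla.

import re

DEFAULT_OPEN_ROAD_LIMIT = 55  # assumed California default when no limit is posted

def interpret_sign(sign_text: str, current_limit: int) -> int:
    """Return the speed limit in effect after passing the given sign."""
    posted = re.search(r"SPEED LIMIT (\d+)", sign_text.upper())
    if posted and sign_text.upper().startswith("SPEED LIMIT"):
        # A numeric limit sign is unambiguous on its own.
        return int(posted.group(1))
    ended = re.search(r"END (?:OF )?(\d+)", sign_text.upper())
    if ended and current_limit == int(ended.group(1)):
        # "End of 25" only makes sense if you know the previous limit
        # and the default that applies once it ends.
        return DEFAULT_OPEN_ROAD_LIMIT
    return current_limit  # unknown sign: keep whatever was in effect

print(interpret_sign("Speed Limit 25", current_limit=55))         # 25
print(interpret_sign("End of 25 Speed Limit", current_limit=25))  # 55
```

The first case needs no memory at all; the second returns the right answer only because the function was handed the previous limit and a hard-coded default, which is exactly the extra state the "End of 25" sign demands.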
 
Of course the driver had plenty of time to intervene before. And cyclists always have the right to play out their inner Darwin on tight roads in these days of distracted drivers.

Otherwise it's an unsafe move for FSD to attempt but we've seen it before with blind curves.

I might have used the same language as the driver. :)

I have seen similar behavior for pedestrians and have disengaged. It's the amount of lane departure plus doing it when there's limited visibility that's the issue.

I never noticed that prior to FSD V12.
 
So is Elon saying that Tesla will remove the "beta" label when they release V12? And if it is not beta anymore, does that mean that Tesla will remove driver supervision and assume full liability? Yeah right. LOL. [/s]
In my opinion, they should have a "waiver" for whoever wants to use FSD WITHOUT supervision, so that whoever signs it would be responsible! I would. It is VERY annoying to have the computer keep saying to pay attention and that your hands are not on the wheel! The only two main issues I see with FSD (since I just bought my first Tesla last week and have been driving with FSD ON EVERY DRIVE) are: 1. it may get too close when there is someone in the next lane behind you and it switches lanes, almost like cutting that driver off when you change lanes. The other is that it changes lanes for no reason, meaning that even though the current lane is wide open in front, it will still try to switch to the inner lane. Otherwise, so far, I've been driving almost entirely on FSD without many interruptions.
 
In my opinion, they should have a "waiver" for whoever wants to use FSD WITHOUT supervision, so that whoever signs it would be responsible! I would. It is VERY annoying to have the computer keep saying to pay attention and that your hands are not on the wheel! The only two main issues I see with FSD (since I just bought my first Tesla last week and have been driving with FSD ON EVERY DRIVE) are: 1. it may get too close when there is someone in the next lane behind you and it switches lanes, almost like cutting that driver off when you change lanes. The other is that it changes lanes for no reason, meaning that even though the current lane is wide open in front, it will still try to switch to the inner lane. Otherwise, so far, I've been driving almost entirely on FSD without many interruptions.
Tesla is much less worried about you suing them than they are about the person you hit suing them. While you could agree to pay Tesla’s legal fees it’s unlikely your pockets are deep enough.