
FSD Beta 10.69

The focus on visualizations has always struck me as odd for those ^^ reasons. They would make more sense right in front of you in a HUD, not over on a side screen, when you're using a system that requires eyeballs forward and that will potentially ding you for looking away from the road. Mercedes has a really excellent augmented-reality HUD + ADAS interface.

At least FSD's visualizations are useful for doing armchair analysis on videos uploaded by Beta YouTubers, but otherwise it feels a bit like putting the cart before the horse.

I'm a bit wary about this new blue creep barrier too, because I already see people in the 10.69 comments saying stuff like "Let the vehicle do its thing, the blue barrier is there!" when the driver should be looking forward. This imaginary barrier could give a false sense of confidence and let the car do something it shouldn't.
It is better in a refresh S with the display right in front of you, and the yoke doesn't obstruct your view of the visualizations. However, that screen can't be moved and is not touch sensitive, so we S drivers don't get to move the visualization around.
I do agree. For me, I can quickly glance down and see what is going on without needing to look at the middle screen.
 

Car stops short of a stop sign for a car that the driver didn’t see. It wasn’t really necessary to stop that far back since there wasn’t really a risk of an imminent collision, imo; it could have just slowed and cautiously approached the stop line. Still, it’s impressive that the Tesla saw the other car behind the parked cars.
 
It is better in a refresh S with the display right in front of you, and the yoke doesn't obstruct your view of the visualizations. However, that screen can't be moved and is not touch sensitive, so we S drivers don't get to move the visualization around.
I do agree. For me, I can quickly glance down and see what is going on without needing to look at the middle screen.

Agree. It's 100% better in the S for seeing what FSD is doing. Which is kind of ironic, because self driving is one of the reasons for the lack of a cluster, I guess.
 
TL;DR: The race to solve FSD just took a huge step forward with 10.69. Across the board, this is looking like a two-nines step (aka 99 times out of 100 it does the right thing), and the next step is three nines without any big obstacles in the way...

Can't wait to test 10.69 with one turn that it has never done (oblique, downhill, obscured, oncoming high-speed UPL) and four roundabouts (one single-lane and three multi-lane with ascending/descending slopes) that it struggles with >90% of the time with traffic (pedestrians/cars). **IF** it can do these >99% of the time *safely* (i.e. does not go when it shouldn't) AND not get honked at (i.e. goes when it should), I'd do a wide release.
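For anyone not used to the "nines" shorthand, here's a quick back-of-the-envelope sketch in Python (my own illustration with a made-up maneuver count, nothing from Tesla) of why the jump from two nines to three nines per maneuver matters over a whole drive:

# Rough illustration (my own numbers, not Tesla's) of "two nines" vs "three nines"
# per-maneuver reliability compounded over a drive with many maneuvers.

def clean_drive_probability(per_maneuver_success: float, maneuvers: int) -> float:
    """Chance of finishing `maneuvers` consecutive maneuvers with no error,
    assuming each one is independent with the same success rate."""
    return per_maneuver_success ** maneuvers

for label, p in [("two nines (99%)", 0.99), ("three nines (99.9%)", 0.999)]:
    # e.g. a drive with 50 turns/merges/roundabouts in it
    print(f"{label}: P(50 clean maneuvers) = {clean_drive_probability(p, 50):.1%}")

At 99% per maneuver, a 50-maneuver drive only has about a 60% chance of being intervention-free; at 99.9% it's about 95%. That's roughly the gap between an impressive demo and something you'd consider for wide release.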

Due to the nature and confidence of Elon's "FSD full release" tweet, and watching Chuck's latest videos...

I feel like the MVP (Minimum Viable Product) test case has been established for FSD wide release in North America and they are close. I wonder what it contains?

Maybe something like an internal fleet-wide average of miles per disengagement, where employees drive certain conditions, routes, and iterations to establish statistical relevance, then ensuring that safety exceeds baseline minima.
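Something like this, maybe; a crude sketch with invented numbers (the fleet_miles and disengagements values are purely hypothetical) just to show the shape of the metric:

# Hypothetical sketch of the fleet metric described above: miles per disengagement
# across internal test drivers, with a rough confidence range assuming the
# disengagement count behaves roughly like a Poisson process.
import math

fleet_miles = 500_000      # invented for illustration
disengagements = 125       # invented for illustration

miles_per_disengagement = fleet_miles / disengagements
# ~95% interval on the count (normal approximation), translated into a rate range
low_count = disengagements - 1.96 * math.sqrt(disengagements)
high_count = disengagements + 1.96 * math.sqrt(disengagements)

print(f"{miles_per_disengagement:.0f} miles/disengagement "
      f"(~95% range: {fleet_miles / high_count:.0f} to {fleet_miles / low_count:.0f})")

The point being that you need a lot of fleet miles before the interval gets tight enough to say the rate genuinely beats whatever baseline minimum they pick.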

Obviously not there yet, but oh so much closer now.

In fact, I'd argue that they have no local maxima in their way currently. Tuning and tweaking could take months or quarters, but it does not feel like years now. I couldn't have said that before 10.69.

The creep wall is impressive; it provides confidence to the driver through an easy-to-understand visualization of how far the car is going to go before committing to occupying the lane, along with the blue median box showing where it is going to end up. Chuck's videos are super impressive, and it is looking highly predictable to a human how the AI is going to handle complex maneuvers.

There are probably a dozen scenarios like the ones you listed that FSD needs to navigate before a wide release. I will just mention two here:

- The ability to cross over 3 lanes within a quarter mile to take an exit or make a right turn in heavy traffic.

There are many highway service roads here in the DFW area where, once you take an exit off the highway, the service road itself is three lanes. Once off the highway exit, as you merge onto the leftmost lane of the service road, you have to quickly scoot over to the rightmost lane before the next light, within about 2,000 feet. It gets very tricky in heavy evening traffic.

- The ability to merge nicely on a short ramp into heavy, fast-moving highway traffic. I have not seen a single video of merging into heavy, fast traffic. You need to make snap decisions: whether to accelerate to merge ahead of the car behind you, or momentarily slow down and merge behind it. This decision has to happen in about half a second with a ton of situational awareness (a toy version of that calculation is sketched below).
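Just to put numbers on that last one, here's a toy sketch (my own simplification with made-up thresholds; the merge_decision helper is hypothetical and obviously nothing like how FSD actually plans): project where the trailing highway car will be when you reach the end of the ramp, and pick ahead or behind.

# Toy "merge ahead or fall in behind" calculation. All numbers and the 50 ft
# safety margin are invented for illustration.

def merge_decision(ramp_remaining_ft: float, own_speed_fps: float,
                   trailing_gap_ft: float, trailing_speed_fps: float) -> str:
    """trailing_gap_ft: how far behind us the nearest highway car is right now."""
    time_to_merge = ramp_remaining_ft / own_speed_fps
    # Trailing car's position (relative to where we are now) when we reach the merge point
    trailing_pos_ft = -trailing_gap_ft + trailing_speed_fps * time_to_merge
    margin_ft = ramp_remaining_ft - trailing_pos_ft   # how far ahead of it we'd arrive
    return "accelerate and merge ahead" if margin_ft > 50 else "slow down and merge behind"

# ~300 ft of ramp left at 55 mph (~80 ft/s); highway car 100 ft back doing 70 mph (~103 ft/s)
print(merge_decision(300, 80, 100, 103))   # -> "slow down and merge behind"

The real problem, of course, is that the inputs (gap, closing speed, how much ramp is left) have to be estimated and re-estimated continuously, which is exactly where the half-second snap decision gets hard.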
 

Car stops short of a stop sign for a car that the driver didn’t see. It wasn’t really necessary to stop that far back since there wasn’t really a risk of an imminent collision, imo; it could have just slowed and cautiously approached the stop line. Still, it’s impressive that the Tesla saw the other car behind the parked cars.
The Tesla saw the car and then promptly forgot about it. I don't think it's possible for us to know whether FSD has predicted the other car's path, since it disappeared from the visualization. Also, it was a stop sign as you pointed out, so why would it need to stop short at all? That's what the stop line is for. I don't see anything remarkable about this; it looks like it doesn't understand what is going on.

What exactly is good about this video? It saw the car? Isn't it supposed to see the car?
 
What exactly is good about this video? It saw the car? Isn't it supposed to see the car?
Totally agree.

1) You have to be a pretty bad driver to not notice that car. It was very obvious!

2) This is the sort of stopping behavior that will give FSD a bad name. It’s unacceptable and unnecessary and unsafe. This is why passengers (and drivers) get upset. It slowed from 24mph to 4mph in less than three seconds. That is quite sudden braking, especially on a hill like this. People complain about phantom braking if the car slows by ~3mph in about 1 second.

3) The car knew exactly where the stop sign was and where the car was. Why did it make such a large error? There was zero probability of that car blasting through the Jeep like a Tesla-destroying guided missile.

4) Whole Mars did not even report this. What is the point of him being a beta tester who gets this first? They really need to implement the attention score for future rollouts. He would fail badly (it constantly reminds him to torque the wheel).

It looks like we’re going to have plenty of phantom braking complaints in 10.69 unless they fix them up suddenly in the next two revs. And they apparently aren’t being reported so good luck with that.
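Back-of-the-envelope on the numbers in point 2, just converting the speeds quoted above into deceleration (nothing here beyond unit conversion):

# Average deceleration implied by the speed drops mentioned above.
MPH_TO_FTPS = 1.46667   # 1 mph ≈ 1.467 ft/s
G_FTPS2 = 32.17         # 1 g ≈ 32.17 ft/s^2

def avg_decel_g(speed_from_mph: float, speed_to_mph: float, seconds: float) -> float:
    """Average deceleration, in g, for a given speed change over a given time."""
    return (speed_from_mph - speed_to_mph) * MPH_TO_FTPS / seconds / G_FTPS2

print(f"24 -> 4 mph in 3 s: {avg_decel_g(24, 4, 3):.2f} g")   # the stop-short event
print(f"~3 mph drop in 1 s: {avg_decel_g(3, 0, 1):.2f} g")    # typical phantom-braking complaint

That's roughly 0.30 g average for the stop-short event versus about 0.14 g for the kind of slowdown people already call phantom braking, which is why it feels so abrupt to passengers.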
 
Totally agree.

1) You have to be a pretty bad driver to not notice that car. It was very obvious!

2) This is the sort of stopping behavior that will give FSD a bad name. It’s unacceptable and unnecessary and unsafe. This is why passengers (and drivers) get upset. It slowed from 24mph to 4mph in less than three seconds. That is quite sudden braking, especially on a hill like this. People complain about phantom braking if the car slows by ~3mph in about 1 second.

3) The car knew exactly where the stop sign was and where the car was. Why did it make such a large error?

It looks like we’re going to have plenty of phantom braking complaints in 10.69 unless they fix them up suddenly in the next two revs.
Definitely an error by the Tesla. The other car was no threat since the Tesla needed to stop anyway. Whole Mars is stretching things to avoid calling this a PB event. It's likely that his brain ignored the other car because it knew there was no threat. It was only after his car slowed unexpectedly that his mind re-evaluated the scene and latched onto the car.

Our minds work very well (usually) to filter out data that our experience has taught us that we can ignore. If it didn't, we'd be overwhelmed and could not function.
 
Beta brakes immediately,
It didn’t actually.

21:44.5, 44 mph. Starts slowing for the red light.
21:46.5, 41 mph. No problems obvious. Still slowing.
21:47.5, 38 mph. At this point an astute human driver might have detected an issue due to the odd trajectory. Car still slowing gradually.
21:48.5, 35 mph. A human would definitely know there is an issue; the Bolt is now nearly head-on (image below). Still gradual slowing.
21:49.5, 32 mph. No response. Normal slowing.
21:50.0, 30 mph. 1.5 seconds since an obvious problem, or 2.5 seconds for an astute driver.
21:50.1, 28 mph. The car has definitely started to respond, slowly.
21:50.5, 24 mph. Increased rate of slowing, definitely in response to the Bolt. It's been 2-3 seconds since a problem was detected, depending on alertness.
21:51.0, 17 mph. Much more rapid response (finally!). It's been 2.5-3.5 seconds.
21:51.5, 14 mph. Finished slowing; the Bolt has started to recross the double yellow lines and is no longer a danger.

So at least two seconds to respond in a meaningful manner to a clearly oncoming car. It wasn't a crisis, but you'd expect better and earlier to allow plenty of margin. That's what computers are for! They should be faster than humans (it would take a human about a half second to be on the brakes hard).

I think this is a pretty bad showing; this is why we have safety drivers who should disengage immediately when something like this happens (and in fact this is one of the major risks of FSD: in this case it clearly increased the risk of a collision). I think FSD didn't respond quickly because there wasn't really imminent danger, but the prudent and human thing to do would be to slow down; the Bolt could have been trying to create a Bolt bonfire. But in any case, the Tesla didn't brake immediately. And it did not make use of the perfectly good right turn lane, which a human would have done to virtually eliminate any possibility of collision.
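Converting that timeline into rough numbers (timestamps and speeds transcribed from the list above; splitting at 21:50.0 as the point where the response starts is my own choice):

# Average deceleration before vs. after the car clearly reacts, plus the reaction lag.
# Times are seconds past 21:00 in the video; speeds are from the timeline above.
MPH_TO_FTPS = 1.46667
G_FTPS2 = 32.17

def avg_decel_g(v_from_mph: float, v_to_mph: float, t_from_s: float, t_to_s: float) -> float:
    return (v_from_mph - v_to_mph) * MPH_TO_FTPS / (t_to_s - t_from_s) / G_FTPS2

print(f"21:44.5-21:50.0 (44 -> 30 mph), routine slowing: {avg_decel_g(44, 30, 44.5, 50.0):.2f} g")
print(f"21:50.0-21:51.5 (30 -> 14 mph), actual response: {avg_decel_g(30, 14, 50.0, 51.5):.2f} g")
print(f"reaction lag, head-on (21:48.5) to clear response (21:50.1): {50.1 - 48.5:.1f} s")

So roughly 0.12 g of routine slowing for the light, about 0.49 g once it finally reacts, and a lag of about 1.6 seconds from the Bolt being nearly head-on to a clear response.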

People are right to doubt these two beta testers. They're very bad at what they do (unless you're talking about monetizing YouTube). Don't believe everything you read on Twitter.

[Attached screenshot: Screen Shot 2022-08-23 at 12.04.44 AM.png]
 
Update on this.
10.69 has been fantastic, and I'm not the type of guy to say it's good when it's crap. It misbehaved on one of my drives last night, but it was just a couple of minor errors. It did great this morning, but now my car is in service, so testing is on hold for a couple of days. It's been the best build I've had, and I've been testing the beta since almost the beginning.
 
Car stops short of a stop sign for a car that the driver didn’t see. It wasn’t really necessary to stop that far back since there wasn’t really a risk of an imminent collision, imo; it could have just slowed and cautiously approached the stop line. Still, it’s impressive that the Tesla saw the other car behind the parked cars.
In my experience, the FSD beta routinely stops short on random uphill/downhill sections in this area. Sometimes it pulls up correctly; sometimes it stops short and then pulls up before proceeding as normal. I might go try this same intersection on 10.12.2 out of curiosity...
 
Totally agree.

1) You have to be a pretty bad driver to not notice that car. It was very obvious!

2) This is the sort of stopping behavior that will give FSD a bad name. It’s unacceptable and unnecessary and unsafe. This is why passengers (and drivers) get upset. It slowed from 24mph to 4mph in less than three seconds. That is quite sudden braking, especially on a hill like this. People complain about phantom braking if the car slows by ~3mph in about 1 second.

3) The car knew exactly where the stop sign was and where the car was. Why did it make such a large error? There was zero probability of that car blasting through the Jeep like a Tesla-destroying guided missile.

4) Whole Mars did not even report this. What is the point of him being a beta tester who gets this first? They really need to implement the attention score for future rollouts. He would fail badly (it constantly reminds him to torque the wheel).

It looks like we’re going to have plenty of phantom braking complaints in 10.69 unless they fix them up suddenly in the next two revs. And they apparently aren’t being reported so good luck with that.
There are many different angles to this: stopping well before the stop sign lets the other driver make the left turn right away without stopping and clears the intersection for FSD, which could save both cars time. Speeding up to the stop sign and then braking locks up the intersection until the other car is confident enough to make the left turn. Planning ahead is better.