
FSD Beta 10.69

Obviously we cannot judge the level of improvement (I am not saying you were making such claims, but a casual reader of the table might think that there was a huge improvement!) if you have changed your standards. You have lowered the standards with your driving style modifications. (Which you are welcome to do but this means an asterisk on your data!)
That would be somewhat right. But standards do change - so the question is: which is the "right" standard?

Another way to look at it - I basically disengaged when I "felt" it was unsafe. That standard has not changed. E.g., near roundabouts I do disengage if FSD doesn't slow down around vehicles as expected ... maybe it will slow down late (like it sometimes does before right turns?). But I disengage because I feel unsafe.

I think it is best to intervene as soon as FSD Beta does anything wrong, and measure based on those standards. Which would still not be scientific of course (what counts as wrong?).
Yes, what counts as wrong? Also, what's the point if I disengage every time FSD behaves differently than how I would drive ... I don't do that with Uber ;)

The blur feature in YouTube works and is easy to use, though it somehow seemed to limit the resulting image quality to 1080p - that could have been a one-off issue or user error.
I use PowerDirector with a lot of advanced features - but a lot of drives are close to my home. Anyway, I'll either send the videos to you privately or cut portions near my home and post them.
 
There are a few issues with Omar's videos, even when you take into account that he tries to let the system do a lot more.

1) He increases/decreases the speed (do you?), which is crucial to avoid a future disengagement, especially at locations you know the system struggles with. An example is his port video, where the system tries to go 25 mph in the park with a bunch of pedestrians around and he reduces the speed.
But he is not trying to hide such interventions.

2) He uses the accelerator, as we know.
And we can see when he does - he specifically changed the view so that we can see that intervention.

PS: For the record, I set my offset at 0 and increase speed by 5 mph on marked roads and decrease it by 5 mph on unmarked roads. I don't count that in interventions.
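For illustration, the offset scheme described above (base offset 0, +5 mph on marked roads, -5 mph on unmarked ones) amounts to something like this sketch. The function name and parameters are hypothetical, purely to make the arithmetic concrete:

```python
# Hypothetical sketch of the speed-target scheme described above:
# base offset 0, +5 mph on marked roads, -5 mph on unmarked ones.
# Nothing here reflects Tesla's actual software.
def target_speed(posted_limit_mph: int, road_is_marked: bool,
                 base_offset_mph: int = 0) -> int:
    """Return the speed the driver would dial in for a given road."""
    adjustment_mph = 5 if road_is_marked else -5
    return posted_limit_mph + base_offset_mph + adjustment_mph
```

So a marked 45 mph road would be driven at 50, and an unmarked 30 mph road at 25 - and, per the post, neither adjustment counts as an intervention.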

3) The most important is that he doesn't post the obvious mistakes. If you follow his Twitter, you will see times where he complains about drives that are horrible but NEVER posts video of them.
Possible, though I've seen videos (at least in the past) where he would post disengagements. BTW, he posts so many videos - it is obvious he doesn't have a lot of videos with problems. How many times can he do LA-to-SF drives, after all?

I mean, overall it's a Tesla fan account - so he won't post a lot of things critical of Tesla. But he isn't a fraud just because he posts zero disengagement videos.
 
Tesla already has that policy. I can tell the car to automatically drive above the speed limit at all times. It’s a setting in the menus and it is applied even when driving with FSD.
As sure as flowers come up in the Spring, some maniac so-called traffic safety engineer at the NTSB or NHTSA is going to mandate that all self-driving cars follow speed limits, no ifs, buts or maybes. Note that said maniac, whoever he/she/committee it is, probably speeds on any given day, anyway, and the hypocrisy won't even slow them down. "Think about all the lives that will be saved! Think of the children!"

So, people without self-driving cars will continue to run at 5 or 10 mph above the limit, but now dodging Teslas/Fords/VWs/Porsches/whatever who, if they're sane, will be in the far right lane, and there'll be Accidents.

Said accidents will be pushed by the maniac(s) as Proof that self-driving cars Aren't Safe, instead of Proof that traffic safety engineers should be lined up against the wall when the revolution comes.. and if the maniacs actually pulled that stunt, the revolution would happen sooner rather than later 😁.

Actually, thinking about the above... If the maniacs already in existence have any sense of self-preservation, they wouldn't have pushed for a hard limit, not wishing to be tarred and feathered. My guess: they're lying in wait for the Tesla Semi, since It's Only Truckers who would be affected.

On the other hand.. while there are probably pasty-faced, weakling truckers out there, my impression is that many of them have the job of loading and unloading the trailers they pull. Hence, they have muscles on top of their muscles. Do the maniac traffic safety engineers want to come up against them?

Comments?
 
Near miss of a bus. Now some people will claim that it's clearly trying to go ahead of the bus based on the planner and won't hit it, which is completely absurd. This adds to my point that drawing conclusions from planner-tentacle watching is not an accurate thing to do.

9 mins 10 seconds
Difficult to tell. FSD doesn't make many mistakes when it comes to lane changes - but in this case it's not clear whether it would have stopped, since the bus was not shown in blue.
 
Tesla already has that policy. I can tell the car to automatically drive above the speed limit at all times.

You seem to be confused between "what the car automatically does" and "what the human commands it to do"

Vastly different things.

It’s a setting in the menus and it is applied even when driving with FSD.

The default setting from Tesla is 0 offset. It only changes if the driver changes it.

That's an actual, legal, difference.


As I point out, they already got in trouble the one time they had FSD default to NOT following traffic laws - they won't repeat that mistake.



As sure as flowers come up in the Spring, some maniac so-called traffic safety engineer at the NTSB or NHTSA is going to mandate that all self-driving cars follow speed limits, no ifs, buts or maybes.

State laws in states that allow self-driving cars already do that - no need to involve the feds. They require self-driving cars to obey all traffic laws, which would include speed limits.
 
You seem to be confused between "what the car automatically does" and "what the human commands it to do"

Vastly different things.



The default setting from Tesla is 0 offset. It only changes if the driver changes it.

That's an actual, legal, difference.


As I point out, they already got in trouble the one time they had FSD default to NOT following traffic laws - they won't repeat that mistake.

And yet, there is no user setting for rolling stops. Why do you suppose Tesla allows the user to override the speed limit (and break the law), but does not allow you to set the behavior at stop signs?
 
But standards do change - so the question is: which is the "right" standard?
A fair question. But given they have changed, my only point was this makes it impossible to measure the progress from your data. It is entirely possible that your data demonstrate that 10.69.2 is worse (I don’t think it is, but I also don’t think it has a dramatically lower intervention rate, either).

I think it is incrementally better! And I am looking forward to it being at a minimum 100x better.
 
-- snippage

State laws in states that allow self-driving cars already do that - no need to involve the feds. They require self-driving cars to obey all traffic laws, which would include speed limits.
There are state laws that say one can't roll, at any speed, through a stop sign. Very few humans come to a complete halt; we all creep on through and, from time to time, go through faster than that.

Yet FSD-b has the car coming, pretty much, to a complete halt at stop signs. Yep, that's the law. But it's only FSD-b that's doing it, because some agency has demanded that it be done.

So, one is not supposed to go past the speed limit? There are state and probably Federal laws that say so? But then it's still up to the human whether to obey or not.

So-called Traffic Safety Engineers see that auto-driving, speed-setting autopilot under computer control and, I suspect, they drool. They'll go to Tesla/Ford/whoever and say,

"We'll sue you out of existence if you don't speed-limit your smart cars to the posted speed limit when the auto stuff is on, period. No adjustments allowed for by drivers. If a driver wants to turn all that stuff off and drive faster than the limit using their foot, then that's their problem (and we're thinking about preventing that, too!). Tesla, this specifically includes you and your TACC. We don't care if everybody else is doing 70 - you have to follow the max speed limit of 55!"

Again, it's a "Think of the Children!" moment, since, yeah, kids ride in cars, a speeding car will kill the occupants that include kids, so the Safety Engineers will push.

Urgh.
 
And yet, there is no user setting for rolling stops.

There was, though. You used to be able to "turn off" rolling stops with an FSDb setting, despite on being the default.

Tesla removed it and set default to off for rolling stops in all configs.


Cruise control is actually a pretty different situation. For one, as noted, the USER had to change the default to do something illegal. For another, it's been around over 100 years (or nearer ~75 years if you stick with the more modern type), and for like 90% of that time it wasn't even possible for the car to know whether you were speeding (and, as this and other threads show, sometimes even a Tesla gets the limit wrong and thus STILL doesn't legitimately know if you are speeding). In some new cars right now it STILL doesn't know if you're speeding.

So a rule prohibiting a cruise control system from speeding would be impossible to enforce on like 99% of all cars ever made. And even on most of the remaining 1%, the speed limit the car thinks exists is not always correct, so it'd be a poor system then too, since it might be wrong that it can't go over, say, 35 or whatever.

Whereas a rolling stop is easy to prohibit, because before now nothing did that - and it's not like the car can ever be mistaken about whether it's OK to roll through a stop sign.
 
There was though. You used to be able to "turn off" rolling stops with an FSDb setting, despite on being the default.

Tesla removed it and set default to off for rolling stops in all configs....
Tesla was forced to remove it by the NHTSA.

 
As sure as flowers come up in the Spring, some maniac so-called traffic safety engineer at the NTSB or NHTSA is going to mandate that all self-driving cars follow speed limits, no ifs, buts or maybes. Note that said maniac, whoever he/she/committee it is, probably speeds on any given day, anyway, and the hypocrisy won't even slow them down. "Think about all the lives that will be saved! Think of the children!"

So, people without self-driving cars will continue to run at 5 or 10 mph above the limit, but now dodging Teslas/Fords/VWs/Porsches/whatever who, if they're sane, will be in the far right lane, and there'll be Accidents.

Said accidents will be pushed by the maniac(s) as Proof that self-driving cars Aren't Safe, instead of Proof that traffic safety engineers should be lined up against the wall when the revolution comes.. and if the maniacs actually pulled that stunt, the revolution would happen sooner rather than later 😁.

Actually, thinking about the above... If the maniacs already in existence have any sense of self-preservation, they wouldn't have pushed for a hard limit, not wishing to be tarred and feathered. My guess: they're lying in wait for the Tesla Semi, since It's Only Truckers who would be affected.

On the other hand.. while there are probably pasty-faced, weakling truckers out there, my impression is that many of them have the job of loading and unloading the trailers they pull. Hence, they have muscles on top of their muscles. Do the maniac traffic safety engineers want to come up against them?

Comments?
I follow the frustration of your rant, and I agree with some of the sentiment. But directing all your ire at Maniac Traffic Safety Engineers may be a little simplistic, if you're trying to understand and predict the way rules and regulations can be used to make things worse instead of better.

I'd guess it's very unlikely that overzealous or maniacal Traffic Safety Engineers, though they may well exist within NHTSA, the CA DMV, etc., actually wield any real power within those bureaucracies. They would be the annoying but useful tools of ambitious, politically appointed directors, who in turn serve the needs of the state or federal executive branch. The really talented and dangerous ones develop a network of fellow bureaucrats, aides, lobbyists, and journalists who know how to scratch others' backs, or stab them, as dictated by goals and expedience. Not nerds, but people persons.

I'm not defending engineers in general or on tribal principles. I'm observing that the skills it takes to rise through supervisory, management, and eventually real power-wielding positions are different from what it takes to create or optimize technical solutions.

There are plenty of reasonable and well-meaning staffers at traffic- and Industry-regulating bodies. Experienced people who know the gray areas of rules and enforcement, and who also know how beneficial AV technology can be. But if the governor meets with the senator and the lobbyist, and they decide it's time to take Elon down a notch or two, then all the reasonable recommendations and accommodations will become irrelevant. The dutiful apparatchiks will pull out whatever rules and justifications are needed, publish the statistics in whatever damning way is required, launch a fresh investigation or two, and make sure some helpful hit pieces show up in the week's "news" coverage.
 
That's the opposite of forced.
You can bet that Tesla was asked nicely while having its arm twisted, with at the very least an implied threat of regulations - possibly ones that might cover more of the Beta program. Just the fact that NHTSA had Tesla come in for meetings implies they were VERY interested in changing this, one way or another. So they were forced to voluntarily capitulate. No way did Tesla go into the meetings and say, "You know what, we were wrong - thanks for pointing it out, since we just failed to notice."
 
I was driving in FSDb on a major arterial road (two lanes each way, generous median) Paseo Del Norte in Albuquerque this morning. There was an intersection at which the traffic light controller probably detected it had a problem after the lightning storm last night, and was running in the fall back mode with all lights set to blinking red.

I think the prescribed behavior is to treat that as you would a stop sign. Stop, then go when safe and it is your turn. The car stopped well enough, but gave no indication that it would go forward (no popups, creeping). So I eventually gave it enough accelerator nudging to get on the way.

The same intersection was in the same state when I returned going the other way a couple of hours later, and the same behavior repeated.

Does the current FSDb release treat wall-to-wall blinking red stoplights at a major intersection as "stop forever"?

There was not much traffic--I don't think it was just waiting for a safe opportunity.
 
I was driving in FSDb on a major arterial road (two lanes each way, generous median) Paseo Del Norte in Albuquerque this morning. There was an intersection at which the traffic light controller probably detected it had a problem after the lightning storm last night, and was running in the fall back mode with all lights set to blinking red.

I think the prescribed behavior is to treat that as you would a stop sign. Stop, then go when safe and it is your turn. The car stopped well enough, but gave no indication that it would go forward (no popups, creeping). So I eventually gave it enough accelerator nudging to get on the way.

The same intersection was in the same state when I returned going the other way a couple of hours later, and the same behavior repeated.

Does the current FSDb release treat wall-to-wall blinking red stoplights at a major intersection as "stop forever"?

There was not much traffic--I don't think it was just waiting for a safe opportunity.
That’s the behavior I saw many months ago when I encountered this, as I recall. Definitely no point in them even attempting to manage that situation! Just let the driver take over.

They can address this case in a couple years hopefully.
 
That’s the behavior I saw many months ago when I encountered this, as I recall. Definitely no point in them even attempting to manage that situation! Just let the driver take over.

They can address this case in a couple years hopefully.
I thought they had fixed it with the last release. I haven’t had much occasion to check/use it, though. Regardless, it should just use the 4-way stop sign algorithm. The problem is that algorithm is far from perfect.
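To make "just use the 4-way stop sign algorithm" concrete, here is a minimal, entirely hypothetical state-machine sketch of handling an all-way flashing red: stop completely, yield by turn, then commit to proceeding instead of waiting forever. The state names and checks are mine, not anything from Tesla's code:

```python
# Hypothetical sketch: treat an all-way flashing red like a four-way stop.
# States and predicates are illustrative only, not Tesla's actual logic.
from enum import Enum, auto

class StopState(Enum):
    APPROACHING = auto()
    STOPPED = auto()
    PROCEEDING = auto()

def next_state(state: StopState, fully_stopped: bool,
               intersection_clear: bool, is_my_turn: bool) -> StopState:
    if state is StopState.APPROACHING and fully_stopped:
        return StopState.STOPPED
    if state is StopState.STOPPED and intersection_clear and is_my_turn:
        # The behavior the posts above found missing: after a complete
        # stop, the car must eventually commit to going.
        return StopState.PROCEEDING
    return state
```

The "stop forever" behavior described above corresponds to never taking the STOPPED-to-PROCEEDING transition even when the intersection is clear.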
 
Some really random lane selection behaviors tonight
* Need to make a right turn in 6 blocks, but the right lane is a taxi / bus only lane. I keep rejecting the lane change it keeps trying to make, until finally it is supposed to merge over and turn at the next light. Nav says make a right, FSDb says nope - full speed ahead in the middle lane. I took over at the last second to make the turn.
* Winding 4 lane road with a barrier to the left; merges to the right lane to avoid obstruction. Probably spooked by the barrier blocking view of the winding road. It has done this before on the same road, and 10.69 did not improve behavior
* Stuck behind a bus, bus merges left and FSDb needs to stay in the right lane for the upcoming right turn. Nope, decides to follow the bus, “changing lanes away from obstruction” - pretty amusing because the slow-moving bus is literally the obstruction

It did handle a weird left turn nicely though, where the car had to travel quite far forward before making what was nearly a U turn onto the other road. Only issue was that it turned off the left blinker until the turn was basically complete, at which point it turned the blinker back on for a few more flashes.
 
I was driving in FSDb on a major arterial road (two lanes each way, generous median) Paseo Del Norte in Albuquerque this morning. There was an intersection at which the traffic light controller probably detected it had a problem after the lightning storm last night, and was running in the fall back mode with all lights set to blinking red.

I think the prescribed behavior is to treat that as you would a stop sign. Stop, then go when safe and it is your turn. The car stopped well enough, but gave no indication that it would go forward (no popups, creeping). So I eventually gave it enough accelerator nudging to get on the way.

The same intersection was in the same state when I returned going the other way a couple of hours later, and the same behavior repeated.

Does the current FSDb release treat wall-to-wall blinking red stoplights at a major intersection as "stop forever"?

There was not much traffic--I don't think it was just waiting for a safe opportunity.
A couple days ago I came to a brand new traffic light that was blinking red. The car stopped at the light, then proceeded, just as it should. Came through the other direction a few hours later and the car did the right thing again. This light had just recently been installed, so was not even on the maps.
 
I was driving in FSDb on a major arterial road (two lanes each way, generous median) Paseo Del Norte in Albuquerque this morning. There was an intersection at which the traffic light controller probably detected it had a problem after the lightning storm last night, and was running in the fall back mode with all lights set to blinking red.

I think the prescribed behavior is to treat that as you would a stop sign. Stop, then go when safe and it is your turn. The car stopped well enough, but gave no indication that it would go forward (no popups, creeping). So I eventually gave it enough accelerator nudging to get on the way.

The same intersection was in the same state when I returned going the other way a couple of hours later, and the same behavior repeated.

Does the current FSDb release treat wall-to-wall blinking red stoplights at a major intersection as "stop forever"?

There was not much traffic--I don't think it was just waiting for a safe opportunity.
I’m afraid this is one of those random behaviors.

When I leave the work campus, the striped road dead-ends into a local striped two-lane road with a blinking red light. If the route involves a right turn, FSD-b does a creditable right-on-red and proceeds, just as if it were a stop sign.

On a left, for months now, the car would creep up, shake the steering wheel a few times, and then freeze. If one gassed it through, it would drive, sort of, but would keep trying to freeze until about 50' past the intersection, at which point it would come to its senses and take off.

Last week, on a Thursday, it started to pull that stunt again; then, to my surprise, after the halt, it executed a clean left turn and kept going.

Map update? Self learning? Random number generator? Phase of the moon? No idea, but there it is.