Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD Beta 10.69

Is this from the old '80s TV movie turned miniseries "V"?
Good eye!

Sorry for elaborating on this off-topic joke, but since you asked: I just quickly searched for any image like that, now common to many sci-fi films and series.

The first example of such a thing in my memory, though not an image aside from the book cover, is from Childhood's End by Arthur C. Clarke (I think there was a decades-later TV film but I never watched it).

There have since been many other similar images and plot concepts, with an ominous alien mothership hanging overhead. One familiar example was Independence Day, but most images I found had a superimposed title logo and/or showed the World Trade Center towers, and I thought that would distract from the quick humor-post effect. (As does this answer, I guess, and any time you overexplain a punchline!) :)
 
I'm noticing a Jekyll/Hyde side of 25.2 when exiting my neighborhood. Sometimes FSDb slowly creeps up to and through the stop sign, and other times it launches blind from a position well short of the stop sign. In the latter case the right-side B pillar has no chance of seeing oncoming traffic, given a cement wall's placement, so FSDb rolls the dice, applies excess acceleration, and blasts through the stop sign from a starting position well short of normal. I'll have to look at the latter case more closely to see if the normal creep wall is displayed, but either way it doesn't seem to come into play.
 
My car also stops short of neighborhood stop signs. Sometimes it creeps slowly up to the sign, at other times it proceeds from the short stop position. Fortunately, in my case, there are no occlusions at these intersections. However, your comment about the car entering the intersection without visibility of oncoming traffic lanes makes me wonder how a Tesla resolves the difference between not seeing cross traffic and seeing that there is no cross traffic. These are two very different things!

The car's occupancy network should be examined to see whether the wall blocks visibility to the cross lanes. If it does, then the car should know to creep forward until it can see past the occlusion, at which point it can see that there is no cross traffic rather than simply not seeing cross traffic. Perhaps the occlusion is somehow misplaced in the occupancy network, so that the car thinks the wall is farther away than it really is?

At times like these, I wish the car could display the occupancy network in the visualization.
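To make the "not seeing traffic" versus "seeing no traffic" distinction concrete, here is a toy sketch (purely illustrative, not Tesla's implementation; the grid, coordinates, and function names are all made up for this example). The idea is: on a 2D occupancy grid, the car should only proceed once a line of sight to the cross-traffic lane is actually unoccluded, and should keep creeping forward until that happens.

```python
# Toy sketch only -- NOT Tesla's code. Illustrates creeping forward in a
# 2D occupancy grid (0 = free, 1 = occupied) until the line of sight to
# the cross-traffic lane is no longer occluded.

def line_of_sight(grid, start, target):
    """True if no occupied cell lies on the straight line start -> target
    (coarse sampling, one sample per step -- fine for a toy grid)."""
    (r0, c0), (r1, c1) = start, target
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for i in range(1, steps):
        r = round(r0 + (r1 - r0) * i / steps)
        c = round(c0 + (c1 - c0) * i / steps)
        if grid[r][c]:
            return False  # something solid blocks the view
    return True

def creep_until_visible(grid, start, check_point, max_creep):
    """Advance one cell at a time toward the intersection until the
    cross-traffic check point is visible, or give up after max_creep."""
    pos = start
    for _ in range(max_creep + 1):
        if line_of_sight(grid, pos, check_point):
            return pos  # we can SEE there is no cross traffic
        pos = (pos[0] - 1, pos[1])  # creep one cell forward (toward row 0)
    return None  # still occluded: "not seeing traffic" != "seeing none"

# A wall (1s in column 2) hides the cross lane until the car creeps past it.
grid = [
    [0, 0, 0, 0],  # row 0: cross-traffic lane
    [0, 0, 1, 0],  # wall (like the cement wall by the B pillar)
    [0, 0, 1, 0],  # wall
    [0, 0, 0, 0],  # row 3: car's stop line
]
print(creep_until_visible(grid, start=(3, 3), check_point=(0, 0), max_creep=3))
# -> (0, 3): the car had to creep all the way past the wall before the
#    cross lane became visible.
```

If the wall were misplaced in the grid (drawn one column farther from the car than it really is), the line-of-sight check would pass early and the car would launch from a position that is actually still occluded, which matches the behavior described above.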
 
Oofa. Just installed 25.2 and it's way worse than the previous version. Just tried it on a local two-lane, 35 mph road, and it hugs the double yellow line so much that it doesn't feel safe. Then it was halting and accelerating. I suppose I can reboot, maybe recalibrate the cameras? But this is... not usable for me on local roads.
Haven't been on the highway yet; I've always thought Autopilot was more reliable there. Hope some of the weird changes don't migrate.
 
I encourage you to recalibrate the cameras and to confirm that GPS is pin-point accurate. Zoom in on the map and make sure the car is exactly where it should be on the map.
 
Stopping back in to say that it is definitely better after recalibrating and trying again. Different road, though. For whatever reason, FSD always had difficulty on that particular road. It has some hills and spots where it's half paved (right down the middle), definitely a challenge for any vision-based system.
 
In the last few pages people have attributed dozens of changes to the jump from 10.69.25.1 to 10.69.25.2. Obviously a super small point release, and Tesla's release notes say just "Bug Fixes." Just got mine and it was well under 200 MB.* That is a SUPER tiny update, maybe one of the smallest ever and the smallest I have ever measured.

*Used my iPhone as a hotspot, and my carrier app showed less than 200 MB transferred from well before the download until I checked after.

...and today I had my two best drives in a row. Went from my condo in Midtown to Westside Park for a run. On the way it was doing "Whole Mars" good until it came to a newly added stop sign; it tried to blow through it and I had to hit the brakes. On the way back it started in the parking lot and navigated straight out and all the way home (or to the parking deck entrance) with only a few interventions (accelerator and two lane changes). It even stopped at the new stop sign coming back. Granted, it was Saturday morning traffic, but I was blown away, since it has never done this well for me in the city (burbs yes, urban no).
 
I’m on 10.69.25.2 and the software didn’t want to stop for a train with lights and crossing arms on both sides. Anyone having the same problem? The attached picture was taken after I applied the brakes.
 

Attachments

  • 0E574DD6-9976-4B40-BB63-2934EAD8B168.jpeg
  • C92B5839-903B-4BD1-98B1-8E8FCE26078D.jpeg
It also is not programmed to slow in school zones, and it does not recognize railroad signals. Tesla really needs to warn people before someone who really believes this thing is Full Self Driving gets themselves killed.
 
Nobody believes the car is fully autonomous more than 30 seconds after the first time they engage FSDb.

Besides, Tesla puts up a large warning panel when you first enable FSDb telling you how it might do the worst thing at the worst time, or words to that effect.
 
I think signing an agreement before use, acknowledging that this is not true full self-driving and that it may do any old crazy thing imaginable at any time, should be a bit of a warning, as it were. Of course, if we feel the need to tell people to open the pizza box before eating the pizza, there might be a problem. Those pizza people are out there driving too. Probably fewer of them driving Teslas, I bet...
 
I agree, but how do you explain Gbills expecting it to stop for a railroad crossing? I'll bet they read the warning but still expected it to stop for a train crossing, and they probably would expect it to stop for a school bus with red flashing lights as well. Tesla needs to put out a list of what FSD will and will not do, and when it will do what it should do.
 
They did. And that list says who the hell knows, we're working on it. Point is don't ever expect it to do anything you can't fix........
On one hand, it would be nice if they had a list of specific shortcomings (won't stop for trains, won't stop for school buses or slow in school zones, etc.); on the other hand, creating such a list runs the risk of people assuming it can deal with anything not on the list.
 
Maybe I was not clear enough about what I was attempting to say, so let me rephrase. Tesla knows what FSD has been programmed to do and what it has not been programmed to do. If GBills had known that the car would not stop for a railroad crossing, he would not have been surprised when it did not do what it was not programmed to do. The same holds true for a stopped school bus: people may expect it to stop, but it's not programmed to. This experiment could be made safer and less stressful if people were better informed.
 
Why isn't the default posture for people to assume the car will try to kill them? I don't know why one would assume differently. No informing is really necessary: it says right there that it will do the wrong thing at the worst time. What else could that mean?
 
I tell people that with each release, FSD beta comes up with new ways to kill me.