Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

Yikes! I feel like I've gotten too cozy with the lane changes and this changes things. There has to be some bad bug because even the blind spot alert is triggered, the passing car turns red and FSD is still moving laterally into its path.
This scenario should be included in a general test for any highway AV system. In this case the truck driver was distracted or driving aggressively. You can imagine a few scenarios like this where two vehicles plan to share the same space at the same time. :oops:🆘
 
Here's where FSD has zero chance of success. FSD has too much latency when compared to an average alert human driver's reaction time. Unfortunately these scenarios aren't uncommon on busy highways. I had a similar case a couple of weeks back.

I should add it's probably safe to say no self driving system today is capable of solving this.

"It won't ever work because it doesn't work today" is not rational, and "it won't work because here's one example of it not working" is pure FUD, on the level of "seat belts don't save lives because a seat belt killed someone in <insert your favorite specific example here>". At the end of the day, FSD is NOT a human, so it doesn't "think" like you, and you certainly don't think like it.

I'm not saying Tesla HW3 FSD v11 is going to work, or even that they're on the right track, but one thing going wrong now doesn't mean it always will in the future, and most of the things being perceived as latency lately aren't even latency. Latency is the time it takes to react to an input, not the time it takes to react to what you see. So, unless you wrote or reverse engineered the software, you generally don't know whether something is latency.

For instance, if BSA isn't an input, then the fact that it was triggered isn't relevant at all, because FSD simply didn't react to it by design. Meanwhile, phantom slowing is clearly a reaction to some input that isn't even included in the visualization, so we also know that one can't simply rely on the visualization to determine what FSD does and doesn't see as input. Given these points, the only thing we know is that something went wrong there. That can be described as a bug, and bugs can be squashed.

In an even simpler example, people keep mentioning latency when it comes to stop signs and stop lights, but those also aren't necessarily always an input just because you can see them. This is why I hypothesized about planning areas/zones several pages back. When FSD stops at a rate you find uncomfortable for something you could have stopped for comfortably, it's easy to assume that the computer is slow or that FSD "didn't see it," when in reality FSD could just as easily have been ignoring it by design (it could have started slowing down within microseconds of having the input, reacting many times faster than a human). OTOH, when FSD flat out runs a stop sign, all you know is that something didn't work right.
Regarding "latency": I should imagine it is safe to assume that the HW3 computer can react faster than you to any given input (whether the input will occur, and whether the reaction will be correct, can remain up for debate), outside of a performance-impacting hardware failure. And as great as you may believe you are, you also don't react to things that you don't see or register (and you won't react at your normal rates during a performance-impacting physiological failure, either).
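The input-vs-perception distinction above can be sketched with toy numbers (every value here is an assumption for illustration, not a measurement of any Tesla system):

```python
# Toy model: "time until the car visibly reacts" vs. compute latency.
# An object can be visible on camera for many frames before the planner
# ever treats it as an input (e.g., it sits outside a planning zone).

FRAME_MS = 36            # assumed ~27 fps camera frame period
COMPUTE_LATENCY_MS = 50  # assumed input-to-control latency

def time_to_reaction_ms(frames_visible_before_input: int) -> int:
    """Wall-clock time from 'object first visible' to 'car reacts'."""
    perception_gap = frames_visible_before_input * FRAME_MS  # not latency
    return perception_gap + COMPUTE_LATENCY_MS               # latency proper

print(time_to_reaction_ms(0))   # 50  -> reacts in 50 ms once it's an input
print(time_to_reaction_ms(20))  # 770 -> looks like ~0.8 s of "lag", mostly not latency
```

An outside observer can't split the 770 ms into its two parts, which is the point: without the software in hand, you can't tell perception gating from latency.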
 

In spite of FUD redirection and koolaid chugging over the years, FSD has been a poor performer in the past, and that continues today. One data point isn't a trend, but after years of the same results? Come on! I could understand if one were delusional or prone to invoking a faith-based belief system; otherwise there is no confusion about the trend.

Latency is the time between cause and effect, and it is easy to assess from roadway performance. Bugs are another matter altogether, and there's no reason to tie the two together unless one is trying to obfuscate the issue. Trying to downplay the seemingly endless dangerous scenarios befuddles me, so no need to talk about that nonsense.

Under ideal conditions and in limited scope, any modern microprocessor is faster than an average human, but that's not the way consumer-based AV systems operate. High-production embedded systems require low cost, have limited resources, and performance is always compromised.

FSD can't even drive a few blocks without making mistakes, so let's not kid ourselves that it has anything more than a random chance of outperforming an average human driver on the road. We're hearing more FSD owners admit they don't enjoy the high-stress experience.
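The reaction-time comparison both sides keep invoking can be put in distance terms with a back-of-envelope sketch (the 1.5 s human and 100 ms computer figures are common rules of thumb, not measured values for FSD or HW3):

```python
# Distance covered during the reaction window at highway speed.

MPH_TO_FPS = 5280 / 3600  # 1 mph ~= 1.467 ft/s

def reaction_distance_ft(speed_mph: float, reaction_s: float) -> float:
    """Feet traveled before any braking begins."""
    return speed_mph * MPH_TO_FPS * reaction_s

print(round(reaction_distance_ft(65, 1.5), 1))  # 143.0 ft: assumed alert human at 65 mph
print(round(reaction_distance_ft(65, 0.1), 1))  # 9.5 ft: assumed 100 ms end-to-end computer
```

Whether a cost-constrained production embedded system actually achieves anything near 100 ms end to end is exactly the open question in this exchange; the sketch only shows why the answer matters.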

(moderator edit)
 

Come back to planet earth.
That's enough, kabin. You seem completely incapable of accepting that there are many Tesla owners who have had consistently improved performance from FSD Beta for the past 1-2 years. For me: reduced phantom braking, smoother left and right turns, better acceleration and deceleration curves, more confidence at 4-way stops, and more. Once the stack got implemented on the freeway, my basic "AP" experience has dramatically improved - no more aggressive braking with cut-ins and cut-outs. Smoother braking and acceleration in stop-and-go traffic.

Instead, you insult people, tearing them down because you think the program has not evolved over time - and for YOU that may be true. So speak from YOUR experience, but stop slamming others saying the entire program doesn't work, and hasn't worked for years.

I'm truly sorry that, for you, "FSD can't even drive a few blocks without making mistakes...", but for me, and many others, it drives many blocks without mistakes.
 
Very smooth? Wow! I wish ours were very smooth on 11.4.4. Our car still can't even slow down smoothly for a stop sign. Yesterday I was driving on a retail-area street with a stop sign every block. The car could not stop smoothly for a single stop sign. It would show the stop sign on the display but keep accelerating, then it would begin to brake, but not quickly enough; then, to avoid passing the stop line, it would slam on the brakes from about 12 MPH down to 0. This is how our car basically always behaves around stop signs.

Then at the intersection of 123rd St W & Quentin Ave S (44°46'39.4"N 93°20'39.8"W) it was about to run the stop sign as I approached on 123rd heading west. It wasn't even slowing down for the stop sign & I had to hit the brake pedal. That would have been extremely dangerous, as there was high-speed cross traffic (which doesn't stop) passing by at the time. I don't know why it didn't stop for the stop sign and also didn't react to yield to the other cars on Quentin Ave.
Interesting. I just heard the same from a friend as we were comparing notes on 11.4.4. Like you he has an S FSD-beta car. Mine is a Y.

Could it be that the difference in geometry between the S and the Y, combined with the Y vastly outnumbering the S in AI training miles, makes the Y perform comparatively better?
 
In spite of FUD redirection and koolaid chugging over the years, FSD has been a poor performer in the past and that continues today. One data point isn't a trend but after years of the same results?
Fearmongering is still fearmongering, and calling it out isn't redirection.
Latency is the time between cause and effect and easily to assess by roadway performance.
Latency is a technical term as well, but if you want to use that non-technical definition, then the latency you are suggesting makes it impossible for FSD to work can improve without hardware upgrades. Case in point: cross traffic. In late 2021, FSD would practically slam on the brakes full seconds AFTER cross traffic was already clear of a relatively distant upcoming intersection. Today, FSD sometimes slows while cross traffic that will clear the intersection in time is still in the intersection, with much less distance between ego and intersection, and so do some humans. Since my hardware hasn't changed in that time, and you are using this definition of latency, I don't find any reason to pretend there is substance in any of your technology arguments. Also, since this is a behavior that clearly has changed drastically, your generalized claim about a trend seems moot.
Trying to downplay the seemingly endless dangerous scenarios befuddles me so no need to talk about that nonsense.
Facts don't downplay risks, and the ones I pointed out didn't even compare risks; they compared behaviors. While you may not believe it, most vehicle owners are adults who can make decisions about which risks they're willing to take. Here's another fact that is more biased and still doesn't downplay the risk of being in a merge/lane-change accident: I have had far more human drivers almost merge or lane-change into me than automated systems, by a ratio of at least double digits to zero. Here's another: I, as a single driver, have almost merged or lane-changed into more vehicles than you have provided examples of FSD doing the same, across a fleet far greater than I can drive at any given time and active far more often than I ever drive.
lets not try to kid ourselves that it has anything more than a random chance of outperforming an average human driving on the road.
As I clearly demonstrated in the comment to which you replied, FSD may very well perform measurably better than any human at every task it is performing on the road, and you're sticking with "no it doesn't" without even attempting to provide any sort of proof or even quantification? This after your original argument was that "I should add it's probably safe to say no self driving system today is capable of solving this." Are you just trolling? It seems that way.
We're hearing more FSD owners admit they don't enjoy the high stress experience.
Yeah, it's not fun, but it's not meant to be. It's not always safer, but it's not claiming to be. I'm personally using it less lately because there have been regressions I don't want to deal with. So what? Did you even have a point, or is FUD the only goal here?
 
“1-800…..” 🤣
 
While out on a FSDj run to Panera, I saw this new junk service truck! I feel like I’m “hauling junk,” as well! 🤣

IMG_8188.jpeg

@Ramphex
@WilliamG
 
Since I’m so tall, when I’m driving and the sun hits the car just right, I can see the yellow foam they sprayed into the lower front left where the windshield is glued to the car. It reminds me how far Tesla has to go on so many levels.
 
Interesting. I just heard the same from a friend as we were comparing notes on 11.4.4. Like you he has an S FSD-beta car. Mine is a Y.

Could it be that the difference in geometry between the S and the Y, combined with the Y vastly outnumbering the S in AI training miles, makes the Y perform comparatively better?
Perhaps. What year is your friend's S? Ours is a Nov 2016, so earliest HW2 production. While it's been updated to the HW3 computer & to RCCB cameras, it still has some older technology like the fender cameras with light bleed & the old radar (which supposedly is inactive now).
 
Could it be that the difference in geometry between the S and the Y, combined with the Y vastly outnumbering the S in AI training miles, makes the Y perform comparatively better?
I'd doubt it. I see similar performance in my legacy MS and my son's M3, both on FSDb.

You might be able to say it's due to FSD's "preferences not matching mine."

I.e.:

>> accelerating to pass
>> lagging acceleration in UPLs
>> some very brisk acceleration
>> positioning within the lane when passing trucks (also seen on urban roads)
>> not changing lanes quickly enough, or using "opportunistic" lane changes
>> normal braking too late and too hard

There are many more. Some might like many of the above preferences while others would hate them. It might be the same SW behavior.

But I'd love the opinion of anyone who has a bunch of miles on both. My experiences are not extensive.
 
I don't believe in conspiracy theories, and this may be one of them. Decide for yourself.
Part of the article is below:
"but a story from the New Republic manages to connect everything together and boy, is it a tough look at our collective predicament.

The reason our self-driving utopia hasn’t arrived, it argues, is because it was never really supposed to. Instead, the promise of self-driving cars was just a lure to keep cities from building public transit, something America’s scions have been at since the mass-production car first spawned an entire new galaxy of industries. From the Republic:

In 2018, the New York Times reported on how the Koch brothers were using the prospect of driverless cars as part of their war against public transit. The libertarian billionaires and longtime fossil fuel allies were funding Americans for Prosperity to organize dozens of campaigns in cities and states around the country to stop measures that would put more money into transit service. One of their primary arguments was that public transit was outdated and a waste of money because self-driving cars were just a few years away. Five years later, we’re still waiting for the self-driving revolution—but the Koch brothers’ bad-faith ideas about public transportation are still around."

 
Unlike the Segway, which (I'll remind everyone) Steve Jobs said "Cities will be designed around," and which eventually became a novelty that found a small niche, including overweight security guards at shopping malls, AVs are here and expanding their footprints. In 10 years, when Microsoft's ABK deal with Sony on Call of Duty is expiring, we'll be seeing Waymos in many major US cities.

So, while I can see the idea that it was just a red herring to keep cities from building out their public transport plans, the reality is that it did spark the push for AVs, and even though it was never supposed to happen, they're here.
 
Here's where FSD has zero chance of success. FSD has too much latency when compared to an average alert human driver's reaction time. Unfortunately these scenarios aren't uncommon on busy highways. I had a similar case a couple of weeks back.

I should add it's probably safe to say no self driving system today is capable of solving this.

I disagree on the "too much latency," as multiple times it has saved people from red-light runners that the human didn't even notice.
In comparison, this was happening in slow motion. It has to be a bug.

I have personally witnessed similar "why the F@$% are you doing that?" scenarios that could have been so easily avoided.

They certainly have some serious flaws in the lane changing algorithms.
I'll be on a 3-lane divided road, in the left lane, <0.5 mi from my left turn, and then suddenly the right blinker comes on with the message "changing lane to follow route." I'm like, WTF are you doing, and why?
A few seconds later, the left blinker comes on with the message "changing lane to follow route." 🤬
 

In my jurisdiction, the solid white line of the left commuter/HOV lane is not to be crossed. So, when I saw it, I realized this situation is easily avoided by the Tesla following the rules. Where I live, the overtaking vehicle wouldn't have expected the Tesla to move into its lane, because the Tesla wasn't allowed to make a lane change there. (One can move into or out of an HOV lane only at specific spots marked by dashed white lines.) That makes the Tesla clearly at fault, whether it was under FSDbollocks or being driven by an entitled driver who doesn't think the rules apply to them.

That said, even when lane changes are legal, FSDbollocks makes far too many for no apparent reason, and occasionally very dangerous ones. (We experienced it changing lanes into a right-turn exit lane, realizing its mistake but being unable to return to the previous lane because traffic had filled in its place, and so it then attempted to drive along the shoulder when the right-turn lane ended.)
 
In my jurisdiction, the solid white line of the left commuter/HOV lane is not to be crossed. So, when I saw it, I realized this situation is easily avoided by the Tesla following the rules.....
The problem is there are so many jurisdictions. In the US, each state has its own laws to follow. For instance, in GA a solid white lane line means crossing is "discouraged," but it is not illegal. We use double white lines (like for HOV lanes) to mark where crossing is illegal. Kinda strange, I guess, but......

Screenshot 2023-07-26 at 7.05.35 AM.png
 

Now you have me questioning whether we have a single or double line. I don't pay attention to the lane rules with regard to EVs, since I can't use them.

If only one of us is in the car, we can't use HOV lanes in our province or the neighboring one, because they are only for green plates, and a custom plate can't also be a green plate, so we had to choose. We don't commute, there are only about 5 miles of HOV lanes in this area, and we seldom go on that road by ourselves, so we opted to keep our custom plate.
 
Here's an example of 11.4.4 not responding in time. It appears the lead driver isn't paying attention, so the Tesla has even less time to respond. It's a frequently occurring scenario. FSD doesn't apply sufficient braking in the ~2 seconds of time available.
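A rough way to quantify that ~2-second window (the speeds below are assumptions for illustration; the video doesn't give exact figures) is to compute the constant deceleration needed to stop within it:

```python
# Deceleration (in g) required to stop within a fixed time window.

G_FTPS2 = 32.2            # 1 g in ft/s^2
MPH_TO_FPS = 5280 / 3600  # mph -> ft/s

def required_decel_g(speed_mph: float, window_s: float) -> float:
    """Constant deceleration, in g, to go from speed_mph to 0 in window_s."""
    return (speed_mph * MPH_TO_FPS) / window_s / G_FTPS2

print(round(required_decel_g(45, 2.0), 2))  # 1.02 -> at/over the tire grip limit
print(round(required_decel_g(30, 2.0), 2))  # 0.68 -> hard but physically feasible
```

At an assumed 45 mph, stopping inside 2 seconds requires about 1 g, so braking alone may not resolve this scenario even with zero latency; the window itself is marginal regardless of who is driving.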

Off topic, but a few years ago on FB that person bet me that the CT would beat the F150 Lightning to the market.