Welcome to Tesla Motors Club

FSD Really This Bad?

The one quibble I have with your post: That assertion about "within the next 10 years".

That right there I'd call an opinion. My opinion: It might be ten years; it's a lot more likely that it'll be less than that, "Elon time" or no. End of this year, maybe 1st or 2nd quarter of next.

Unless they run into a real stopper, meaning an FSD car just can't be built. Nobody's got one yet, although there are lots of people working on it. I bet (and put hard money down) that FSD will be here; I'm guessing maybe six months after Dojo is in production.

We'll see.
Yep, we will see. End of second quarter of next year... so a year from now? You are some kind of optimist. But even if that happens, keep in mind that one can always pay for FSD monthly. It will take 5 years to break even doing it that way, so you are banking on keeping your car for 5 years plus however long it takes for FSD to be released. So think about how long you keep cars. Again, if you use FSD as it is and enjoy the journey, by all means sign up. But for someone who actually wants FSD, the math above says to wait and do it on your next Tesla.

Further, I do not believe they will be able to do actual FSD with the current hardware and sensor suite. I also do not believe they will be able to retrofit existing cars in an economical way. Tesla's recent filing regarding new radar is a tacit admission of this. Therefore, Tesla will do the best they can with the current hardware and call it FSD. I do not believe they define FSD anywhere in their legal contracts and so you will get whatever they can give you and I strongly believe it will not be what people believe FSD is.
 
I have used every generation of Tesla's assisted driving, from AP1 right up to the current FSD subscription on my Plaid. And all I keep thinking when I "test" FSD is "Wow, this is worse than a nervous 8 year old kid driving a car for the very first time, every time." And the other consistent takeaway for me, is that unlike Autopilot with simple adaptive cruise and lane keeping, FSD makes it feel like the car is struggling because it is trying to do too much.

So for me, FSD has a LONG way to go, even just to be a non-stress-inducing assisted driving experience on backroads, let alone anything that even remotely approaches autonomous driving. So while we wait for something more realistic from Tesla, if they re-introduce Enhanced Autopilot in the US, I'll jump on it in a heartbeat for highway bliss.

Btw... I still reminisce about how sweet simple AP1 was. Phantom braking was a rarity, and it pretty much worked.
 
My Refresh S is in the shop (again) and they gave me a 2017 S loaner car. My S doesn't have FSD. I don't drive enough to justify the price, so I didn't want it. The loaner S does have FSD so I decided to see for myself how cool it was and maybe try to justify adding it to my car. The stupid car tried to kill me and piss off other drivers in the short time I allowed it to drive!

First minute of turning on, it phantom-braked from 75 to 45 with nothing around but overhead signs and a bunch of pissed off drivers behind me.

I left it off for about 10 miles and tried again. This time it warned me it needed to move from my center lane to the right lane in preparation for taking the exit. We had about a mile before the exit and there was another car in the right lane next to me. My car hits the brakes and aggressively slows from 75 to about 40 mph before I decided to take over. I figured it would make a gradual speed reduction to fall in behind that traffic. Nope.

Finally I tried it on the city street and whenever a light turned red, this car would wait until the last second and aggressively brake so as to not rear-end the car stopped in front of us. If they moved forward and had to slow for traffic, my car would wait for a gap and rapidly accelerate just to slam on the brakes.

Could all of this be attributed to the car being five years old, with FSD hardware that wasn't as good back then? Maybe this particular loaner car had a malfunctioning component that wasn't bad enough to throw an error code? I cannot imagine present day Teslas behaving like this with people shelling out $12,000 for such an awful system.

I was tempted to try the Summon feature but I figured that would be pushing my luck.
How would you try Summon on a loaner?
You can use the key for forwards and backwards only, but for smart summon you'd need to use the app, which wouldn't be an option on a loaner.

I have an 11.2016 and FSD is not as bad as you describe.
I use it all the time on the highway and it works really, really well.
Just received Beta, and that's a completely different story... it's work in progress, but does some things really well, and other things very poorly.
 
Actually... So, I have this 2018 M3 LR RWD. I've had FSDb for roughly two weeks now. Before that, I'd been running around first with EAP, then with the pre-Beta FSD package. This last is basically EAP with a couple of extra features. As my spouse puts it, this is my hobby. It's cheaper than having a boat or an airplane 😁.

I'm also an engineer who, for a living, figures out what died on complex equipment, with the intent of making sure that it never happens again. And I have a kind of eclectic background for an EE: lots of down-to-the-silicon hardware design, circuit boards, and lots and lots of software. I've never coded actual neural nets, but, what with a digital signal processing background, knowing about Markov processes and such, I very definitely get the idea.

So, FSDb is all about the testing, not the features. If one wants to chug a couple hundred miles or longer down the road and feel relaxed when one gets to one's destination, FSD is very much the way to go. Not that I use the auto lane change feature on interstates that much (feels a little too risky for me), but the rest is just fine, including going between two interstates via the ramps. And the current FSD (not beta) can do lane keeping and some stop-at-stop-lights stuff, but won't do turns on local roads.

FSDb is not for the weak of heart. The release notes state that the car can and will do the wrong thing at the wrong time; they're not kidding. On an average 20 mile local road trip I hit that "record" icon anywhere from five to twenty times. Each one of those hits is a "safety" hit. Driving down the middle of an unstriped, 35 mph road, cresting the brow of a hill without being able to see over it for oncoming traffic. (Locals know to hug the right side of the road - but not FSDb.) Jerkily going through intersections, scaring the bejeezus out of nearby cyclists (that was today). Classifying the danger zones as mild, serious, and critical, I'd guess about 50% mild, 35% serious, and 15% you-gotta-be-kidding-me, where, without manual intervention, we're talking about bending metal. Believe me, I look at people pushing baby strollers or with toddlers in tow in a whole new light these days. (Yes, the car came to a halt for the lady with the stroller on the green-light left turn but, given all the errors this car makes on a regular basis, the knuckles on the steering wheel were white and the foot poised over the brake. And it was all very jerky.)

As for all those people who are crazy to get their hands on FSDb, along the lines of, "I paid for it! I want my feature!": be very, very careful what you wish for. My stress levels are loads higher than they were before getting the FSDb.

Now, the chatter about FSDb on these forums is kind of interesting. I've only had the current version, 2022.12.3.20. People who've been testing (emphasis, testing) this and earlier versions do say that various previous versions had smoother turns, or hugged the center line closer on a left turn (something the current version definitely does not do), and so on. It's been stated that each time a new FSDb release lands, some features get better and some get worse. Mm.

But the release notes themselves have an eye-opener in them. One of the things in there was that the car was 41% better on certain types of left turns. Ah, yeah. 41% better. So, what about the other 59% of those types of turns?

On the one hand, I'm very happy that Tesla figured out, what with training and all of the neural network, that they could get a major improvement in a particular feature. And it looks like whoever wrote that felt equally happy and wanted to share.. and maybe alert people for feedback on that topic.

But the goal of FSDb is, well, Full Self Driving. As in taxi service and the like. And I'm watching the car and how it handles things. Nope, the car's not a human. It appears to have some short-term memory. Long term memory, as in, "I remember that screwy intersection from the last time I went through it, I won't make that mistake again!", is Not In Evidence. Or, as I mentioned above: if one scares a student driver with the dangers of approaching the brow of a steep hill in the middle of a road, one typically doesn't have to scare them twice. The car appears to be running on a rules-based approach to chugging down the road. Which is fine... I suppose. The rules can be very, very complex, for a computer. But the car doesn't feel danger; that's a human (or live creature) emotion based upon a zillion years of evolution. That, you know, kind of works pretty well. (Those for whom the internal manifestation of fear/danger didn't work all that well are no longer around to complain: hello, Darwin.)

So, the real question is: How well, and how fast, is Tesla iterating on FSDb? Take that 41% number. If every release cut the remaining error rate on that feature by 41%, the fraction of the original errors left after N releases would be 0.59^N. Starting from 1.0 (which is lousy, at the beginning), one goes 0.59, 0.348, 0.205, 0.121, 0.0715, 0.0422, 0.0249, 0.0147, 0.00867, and so on. That last one means roughly 0.9% of the original errors are still around after 9 releases - and that's not good enough. We want 99.99%+ reliability, better than a human, or a mass of humans in aggregate.

There's all sorts of things wrong with that math, above. For one thing, there's nothing that says the neural net improvement process can't go from, say, 0.11% to 0.00001% on the next step. Neural nets (and computer algorithms in general) can be decidedly non-linear in that way.

And I'm personally looking at a capability sample size of one. Which is getting into a wider release with Tesla drivers. Remember: At this point in the algorithm cycles, it's all about gathering sample data and processing same for the next FSDb point release, and you gotta figure, it's Tesla requesting the data.

I've got this feeling.. that, with whatever passes for the current techniques, Tesla is not progressing all that fast. Are the people monitoring all this at Tesla confident that they can get a non-beta, FSD out the door in the next year that works? Or are they getting, say, a little desperate? Don't know.

One possibility is that the development crowd at Tesla is betting on the Dojo system. Kind of a hope that, "If only we could train this neural net in the car at 1000X the speed we're currently doing it", all the major stumbling blocks could be fixed. Which may or may not be true, of course. What Tesla is doing is along the lines of pure research, which is defined as the Process of Running Up Alleys to Find Out If They're Blind. Nobody's really done a full FSD before.. so there could always be a stumbling block beyond which it might be near-impossible to get.

Or we could all be pleasantly surprised come December with a smooth-driving, traffic/pedestrian/what-have-you aware car that works like a dream.

At this point, I wouldn't bet either way.

In any case, I'm going to keep on testing the FSDb, not because it lowers my blood pressure or is that capable, but because the data from me and everybody else doing this may just be the push that gets Tesla over the edge.

Finally: As I said above, nobody should get the FSDb if one thinks that THIS is going to let one take one's hands off the wheel and read a book or something while driving from A to B. If you want to help Tesla, sure. If you're really into beta-testing buggy software, sure. But if all one wants to do is get around town with one's sanity intact: Don't bother.
Exactly the truth, well said!!! The only reason I try to use it every day is that I want Tesla to succeed, and for that to happen they need all the beta testers they can get. If they are using a camera-based system, why does it get stumped when it sees an ambulance with flashing lights, not knowing what to do?
 
How do we report specific bugs to Tesla when we find one?
 
Further, I do not believe they will be able to do actual FSD with the current hardware and sensor suite. I also do not believe they will be able to retrofit existing cars in an economical way. Tesla's recent filing regarding new radar is a tacit admission of this. Therefore, Tesla will do the best they can with the current hardware and call it FSD. I do not believe they define FSD anywhere in their legal contracts and so you will get whatever they can give you and I strongly believe it will not be what people believe FSD is.
I am now very glad I didn’t get FSD, basically you’re just bankrolling an R&D effort. Even EAP, I probably should have spent that 5k on dual motor for my M3 instead.
 
Actually, you’re supposed to hit the little video recorder button. From what the release notes state, doing so sends a 10 second clip off to Tesla.. and probably some state variables and such to figure out what the car was doing at the time.

Earlier beta testers supposedly got an email, along with an email address to send things to. But given the number of testers now, I presume direct, personal attention is not what people are going to get.
 
Exactly the truth, well said!!! The only reason I try to use it every day is that I want Tesla to succeed, and for that to happen they need all the beta testers they can get. If they are using a camera-based system, why does it get stumped when it sees an ambulance with flashing lights, not knowing what to do?
Because.. It's a neural net. And it doesn't have human intelligence. Or, from a funny point of view, no intelligence that anybody can really define.

There are things that neural networks are Very Good At. Expose a neural network to training data labeled "Giraffe!" or "Not Giraffe", taken from a number (but not a zillion) of views of giraffes eating, standing up, walking around, from overhead, from below, etc., etc. Then, with the neural network in full Find It mode, show it pictures that have all of the above, plus some pictures where the giraffe is heavily obscured in trees or high grass, or so far away you wouldn't believe it - and up the "Giraffe!" flag goes. It's as good as a hungry human looking for some giraffe meat to roast. If not better.

That's the "I can see it" gold nugget of neural networks. It can be extended. Got a running dog going to, from, sideways, chasing a frisbee, etc., etc. Ask the question: Is the dog going to run out in front of the car? This can work with a neural network, maybe not as well as "Spot the Lion!", but a lot better than trying to do it with Von Neumann architecture.

And then there are times when a neural network isn't so great. Want to invert matrices? Solve differential equations? Those can be made to work with neural networks, but there are better, more efficient tools that come to hand.

An example of this kind of thing is Fuzzy Logic. It's possible to build a Fuzzy Logic controller that can balance a broom upright on a cart that shuttles back and forth (the classic inverted pendulum); let 'er rip, and the broom will stay vertical even if one hits it with a stick, trying to knock it over. Better than a human at that, it is. And people thought it was wonderful.

So wonderful that certain out-of-control engineers started putting Fuzzy Logic controllers in everything from subway-car braking to washing machines. It turned out that doing this was possible but not well-advised: cheaper, simpler controllers running fixed (non-fuzzy) algorithms did a better, faster, and cheaper job for whole classes of problems. Fuzzy logic worked well for certain jobs, but not for everything.

So, here's Tesla with that fancy-dan driving computer. It's got some god-awfully huge neural processors in there doing as much as they can make it do. For image recognition, that's a no-brainer. On the other hand, figuring out how to go at a fixed rate of speed is a matter of figuring out how fast the motor is spinning, not a neural processor problem. (An input to that neural processor, sure.)

The traditional way of getting a neural processor up to speed is to show it training data with a desired outcome. Get the wrong outcome, and that's a negative result that gets fed back into the network's weights (more or less saying, "That's a mistake."). Get the right outcome, and that's a positive result: more feedback into the weightings of the neural nets to emphasize that that's the outcome one wants.

And it's not just one kind of test - it's thousands, maybe millions, I don't know. We're talking serious "do this, don't do that", with neural nets and Von Neumann machines working in tandem.
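That feedback loop (show examples, compare the network's answer to the desired one, nudge the weights accordingly) can be sketched in a few lines. This is a toy single-neuron classifier on made-up data, nothing remotely like Tesla's actual training stack:

```python
import numpy as np

# Made-up training data: 200 two-feature samples, with the "right answer"
# being whether the two features sum to a positive number.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)  # the weights we will nudge
b = 0.0
lr = 0.5         # how hard each nudge pushes

for _ in range(500):                           # training passes
    pred = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # current guesses, 0..1
    err = pred - y                             # positive/negative feedback
    w -= lr * (X.T @ err) / len(y)             # push weights toward targets
    b -= lr * err.mean()

accuracy = (((X @ w + b) > 0) == (y == 1)).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Scale that single neuron up to millions of weights and millions of labeled examples, and that's the broad shape of supervised training; the "record" button hits discussed above are, presumably, one source of those labeled examples.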

Somewhere around here one starts to wonder if actual intelligence will pop up. I don't think so: As fancy as all that stuff is in the Tesla's computer, I suspect it simply doesn't have the smarts of, say, a grasshopper, or maybe a mouse. Nah, not the mouse. Mice clearly have emotions that drive behavior. Teslas don't have the processing power to come close.

So, flashing emergency lights, right? We're human, we've got definite emotional connotations for that kind of thing, and we're really good at identifying an ambulance or fire truck. We won't mistake the blinking red of a fire truck for the blinking red of a "stop, then go" traffic light. Or the advertising lights in front of a shop. In fact, if we saw the same lights from an ambulance in a shop, our thought pattern would go, "That's a shop. Ignore those lights." without even consciously thinking about it. If Tesla wants the car to reliably identify emergency vehicles, it's got some multivariate training to do that covers all the different types of blinking lights and gets the appropriate answer for each of them. Not trivial. But it might not be impossible, either.

As I said much further up this thread, FSDb is very definitely not a human, doesn't really remember past experiences, and doesn't self-train like we do. Complex rule sets, you betcha. Can those rule sets get good enough to drive an unmanned car coast-to-coast on local roads safer than a human can? Stay tuned.

Finally: While I do understand the bottom-of-the-basement basics of neural networks, I'm not an expert by any means. If you want those kinds of experts, then wander over to Tesla HQ, where the Real Deal types are banging away at the problem. Nobody over there has thrown in the towel, so there's hope.
 
My Refresh S is in the shop (again) and they gave me a 2017 S loaner car. My S doesn't have FSD. I don't drive enough to justify the price, so I didn't want it. The loaner S does have FSD so I decided to see for myself how cool it was and maybe try to justify adding it to my car. The stupid car tried to kill me and piss off other drivers in the short time I allowed it to drive!

First minute of turning on, it phantom-braked from 75 to 45 with nothing around but overhead signs and a bunch of pissed off drivers behind me.

I left it off for about 10 miles and tried again. This time it warned me it needed to move from my center lane to the right lane in preparation for taking the exit. We had about a mile before the exit and there was another car in the right lane next to me. My car hits the brakes and aggressively slows from 75 to about 40 mph before I decided to take over. I figured it would make a gradual speed reduction to fall in behind that traffic. Nope.

Finally I tried it on the city street and whenever a light turned red, this car would wait until the last second and aggressively brake so as to not rear-end the car stopped in front of us. If they moved forward and had to slow for traffic, my car would wait for a gap and rapidly accelerate just to slam on the brakes.

Could all of this be attributed to the car being five years old, with FSD hardware that wasn't as good back then? Maybe this particular loaner car had a malfunctioning component that wasn't bad enough to throw an error code? I cannot imagine present day Teslas behaving like this with people shelling out $12,000 for such an awful system.

I was tempted to try the Summon feature but I figured that would be pushing my luck.

What you used sounds like Navigate on Autopilot and normal Autopilot, which behaves MUCH better than FSD. FSD is the early prototype of autopilot on city streets where it can make turns, and it behaves so much worse that it would blow your mind.
 
Been a little upset with my MS Plaid FSD lately. Seems like the latest version is SUPER jerky and ALWAYS swerves into turn/passing lanes on 2-way roads. Left turns FREAK me out with how the yoke jumps around, and the acceleration after the left turn makes my coffee damn near spill. I am not getting too much phantom braking, but it comes in WAY too hot to 4-way stops and stop lights.
 
FSD is even worse??? SMH.
Much, MUCH worse, yes. While Navigate on Autopilot constantly does needless lane changes, aborted lane changes, and doesn't improve on Autopilot's handling of merge and split lanes, FSD has all those poor behaviors combined with jerky steering. It also attempts to cut into the path of oncoming traffic and to hit static objects like poles, pillars, sidewalks, walls, sign posts, and of course pedestrians and cyclists.

Basically take the annoyances of AP/Navigate on AP and add in extremely dangerous and unpredictable behavior with tighter quarters and needing to take over with less than a second to respond. It also frequently misses turns, gives up on what it's supposed to do, pauses in traffic and needs you to press the accelerator to make it go, fails to keep its lane in turns, and so on. Basically, it's unusable if you don't want to hate driving.
 
Much, MUCH worse, yes. While Navigate on Autopilot constantly does needless lane changes, aborted lane changes, and doesn't improve on Autopilot's handling of merge and split lanes, FSD has all those poor behaviors combined with jerky steering. It also attempts to cut into the path of oncoming traffic and to hit static objects like poles, pillars, sidewalks, walls, sign posts, and of course pedestrians and cyclists.

Basically take the annoyances of AP/Navigate on AP and add in extremely dangerous and unpredictable behavior with tighter quarters and needing to take over with less than a second to respond. It also frequently misses turns, gives up on what it's supposed to do, pauses in traffic and needs you to press the accelerator to make it go, fails to keep its lane in turns, and so on. Basically, it's unusable if you don't want to hate driving.
All vaguely accurate. But, as I said before: One gets the FSDb not because one wants to play video games, read a book, or have a relaxing time getting from point A to point B. It's to test for Tesla.

And, as far as "normal" FSD goes.. I drive back and forth between New Jersey and Boston on a regular basis. Driving with NaV/Lanekeeping/FSD is a heck of a lot more relaxing and safer than driving without. It may sound stupid, but not having to regulate one's speed and twitch the steering wheel continuously to stay in lane and in line with everybody means that one can then supervise and watch the surrounding traffic more. That's a lot easier; I generally feel less beat-up after a long drive like that than when one has to continuously drive the car.

And the safer bit... Route 95 up through the Northeast is, like many major superhighways, many lanes with lots of traffic, truck and otherwise, full of people who are dedicated to getting where they're going at a high rate of speed. So, here's a crowd moving along at 65+ mph, with the usual collection of more-maniacal-than-usual types trying to cut in and out to get ahead. Somewhere up ahead somebody tapped the brakes; the first I knew about it was a near panic stop all the way from 65 down to 0, with traffic stopped. Three vehicles back, an F-150-style pickup (which had been ducking around before this) didn't get the memo and sideswiped the vehicle two cars behind me; the Tesla had stopped with about 10' or 15' to spare in front. The SO and I patted the car's dash, said, "Good car!", traffic started up again, and away we went.

Yep, a lot of other people stopped on time, too. But all it would have taken for an accident would have been a moment's worth of inattention. The car's got cameras that are On and Don't Blink.

And, contrary to popular rumor (and rumor mongers), there are the occasional phantom braking events - but the ones I've been experiencing over the years are more like minor slow-downs, not screech-the-wheels panic stops in traffic. Yes, they're a bit disconcerting, and I'll be happy when Tesla flushes the last of the reasons for them out of the code. But if one knows the car might do that, one makes allowances; mainly, keeping a toe on the accelerator. As others have mentioned, this is mainly due to shadows being misinterpreted as objects, and the car has gotten much, much better about it since, say, 2018 or so. A reason to ditch the car and swear off TACC? Nope.

The above is all about limited access highway follies, not local roads. Local roads on FSDb is white-knuckle time. Local roads on FSD, on the other hand: Well, FSD on local is mild mannered, can handle non-limited access 4-lane (and up) roads pretty well, and won't be doing turns or lane changes on its own. It can be useful in stop and go, but when one gets to FSD's limits it's better to just drive the car.
 
One gets the FSDb not because one wants to play video games, read a book, or have a relaxing time getting from point A to point B. It's to test for Tesla.

Well, that's not what Tesla advertised. And it seems unsafe to have unqualified safety monitors testing proof-of-concept-quality software on public roads, where people who haven't agreed to the T&C are liable to be injured or have property damaged.

Driving with NaV/Lanekeeping/FSD is a heck of a lot more relaxing and safer than driving without.

Flat out no. There's nothing more "relaxing" about having to worry about being rear ended, approaching traffic WAY too quickly, lagging far behind accelerating traffic, or having FSD crash into literally anything at any moment.

It may sound stupid
 
Well, that's not what Tesla advertised. And it seems unsafe to have unqualified safety monitors testing proof-of-concept-quality software on public roads, where people who haven't agreed to the T&C are liable to be injured or have property damaged.



Flat out no. There's nothing more "relaxing" about having to worry about being rear ended, approaching traffic WAY too quickly, lagging far behind accelerating traffic, or having FSD crash into literally anything at any moment.
Um. Tesla is not advertising FSDb as a full-fledged feature and hasn't been. The Beta Software is just that; it's Beta, it's out there for testing. Now, will it eventually lose its Beta status and just become the full-fledged FSD with all the bells and whistles? Yep, that's what Tesla is advertising and getting paid for.

Going into analogy mode: Complaining about the poor capabilities and usefulness of the current Beta software is just like buying a pre-construction house and complaining that the tile in the showers, which haven't been built yet, is substandard. One pays less for a pre-construction house, but one expects that, eventually, the house will be built to spec.

And, on your second statement: Don't conflate FSD and FSDb. FSD, the non-beta software, may have its (minor) follies, but I find that it:
  • Does not approach traffic too quickly. It may be somewhat faster than other drivers expect, but point releases have improved that.
  • Yep, it does lag, sometimes. But not in a major fashion and it normally catches up. As others have said, "It errs on the side of caution."
  • I'll agree that FSDb (emphasis on the "b") might crash into something at any time, that's why it's beta software. The FSD, non-beta, doesn't do that thing. But it also isn't designed to navigate city streets, either. And, if one is using it on City Streets, there are unmistakable warnings all over the software and in the user manual that the driver is in charge and to Watch It. Taking FSD (non-beta) down an un-striped, local road with cars parked on either side is just trying to push a tool past where it can be used. It does a much, much better job with stripes on the road.
Finally: FSDb is where all the fun is. Despite its mistakes, which one has to be 100% alert for, when it's working, it's working fairly well. As one would expect for a work in progress. On unmarked roads it dodges oncoming cars in a fairly clean fashion. If one comes up on one of those parked-mostly-on-the-road landscaping trucks, if there's a clear view ahead on the left, it'll nicely dodge around the landscaper truck, crossing the yellow lines as it does so. Just like a human might do.

One can see the emerging nuggets of a better system. It's not there yet, but it looks likely that it's on the way.
 
It's not beta, it's proof-of-concept quality. You aren't testing it; you've been given access to this low-quality software to keep us from suing them. It's literally that simple. They've just parlayed this into a marketing campaign that seems to have confused an awful lot of people.
 