Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD is a fraud

IMHO, the latest pricing on automation will turn out to be about right, once the full "FSD Beta" feature set becomes widely available, i.e. on demand.

As an engineer I can understand why FSD Beta was initially released the way it was, i.e. only to people who are demonstrably willing and able to conform to the most cautious "grandmotherly" driving style, so as to provide ample opportunity to react to a beta's inevitable malfunctions. I can also understand why NOA on City Streets has not been made immediately available to drivers who exhibit the very lack of restraint, and the road-rage-like behavior, that we see from some of the haters here. It is what it is.

It all hinges on FSD Beta. Those who use it are generally happy, as long as they're not obsessed with "levels" or the way FSD was over-optimistically presented in the past. Developers of new tech commonly get carried away and over-promise; it's part of a programmer's mind-set. I've done it myself; blind optimism keeps you going. I don't think there was malice or deceit involved.

No car company offers "Level 5" yet, though in practice what FSD Beta does is good enough for me. I wouldn't want to sleep at the wheel, because in a car that's free to drive anywhere (as opposed to within geofenced terrain), it turns out there are far more tricky "boundary conditions" to deal with than anyone expected. The "imperfections" may not be entirely surmounted for many years, and 100% perfection perhaps never. Humans don't even come close.

The tragedy is that some of those who bought "FSD" early without getting FSD Beta are understandably furious. They went ballistic and stayed ballistic. Moreover, since they don't use FSD Beta, they don't actually know how far along it is. This lack of knowledge, combined with their (easily understood) frustration, creates a demographic of powder kegs who reinforce each other, spread rage and misinformation, and are doing harm to the company.

As to pricing, IMHO it will probably make sense:

Option 1) EAP
$6k for the current EAP feature set, meaning Navigate On Autopilot on protected roads, i.e. freeways and other roads with no cross-traffic. Plus (FWIW) Auto-park, Summon etc. Leaving out the "just let me drive" traditionalists, it's a great offering at a fair price.

Option 2) FSD, i.e. EAP + $6k = $12k total. Adds Navigate on Autopilot on all roads, including city streets. Not cheap, but it's huge. Anyone who has learned how to best use FSD Beta knows what it can do, and can make an informed decision, picking whichever feature set meets their needs. I think full automation is easily worth 1/4 of the car's value.

How would I know? I recently got my one and only FSD "strike", due to a faulty in-cabin camera causing spurious "pay attention" errors. Service was great: they replaced the interior camera overnight, on a weekend. I can estimate the value of option 2 because if I lost FSD Beta at this point it would easily degrade my Model 3's value by at least 25%. I also have a good idea of how ballistic I'd go. Not pretty. Hang on, haters, I might be joining you yet.
 
Sooner or later everyone will experience the ridiculous Tesla service. Count yourself lucky if you haven't… yet.
Perhaps I'm lucky, or perhaps you are unlucky. The point is that neither of us knows, since we don't have a large enough sample, and the posts here almost certainly reflect a biased sample as well (how many people rush to post about a pleasant, uneventful service experience?).
 
The only thing I can agree on re: Service, is that they don't always have loaners, so they give you a $100/day Uber credit. Both my Uber rides this time were in Tesla Model 3s. Service was great. They replaced the faulty inside camera overnight, on a weekend.

The loaners, should you be so unlucky as to get one, are a basic Enterprise Model S with no autopilot at all.
 
...The tragedy is that some of those who bought "FSD" early without getting FSD Beta are understandably furious. They went ballistic and stayed ballistic. Moreover, since they don't use FSD Beta, they don't actually know how far along it is. This lack of knowledge, combined with their (easily understood) frustration, creates a demographic of powder kegs who reinforce each other, spread rage and misinformation, and are doing harm to the company.…
Nobody raging here harms Tesla.
 
Sooner or later everyone will experience the ridiculous Tesla service. Count yourself lucky if you haven't… yet.
Of course our experiences are different and occasionally "lemons" are sold that have constant issues. But let's be a bit more specific. The SC itself in my experience is terrific - great guys, fast service, mobile assistance, etc. Most Tesla owners I've talked to love THEIR SC.

It's corporate Tesla (specifically Elon's micro-management style) that ties their hands: constantly denying warranty claims, this silly SS/camera upgrade starting 9 months ago, the parts shortage and even disallowing sale of some parts to owners, no FSD transfer even if I buy a new Tesla, etc. The list goes on forever.

I love my SC - perhaps my best auto service experience. But at the same time, the corporate hamstrung approach is by far my worst support experience ever. Definitely a love/hate relationship for me.
 
As an engineer I can understand why FSD Beta was released initially the way it was, i.e. only to people who are demonstrably willing and able to conform to the most cautious "grandmotherly" driving style.
It might be more accurate to note that Safety Scores have only been used since Oct 2021. From Oct 2020 to Oct 2021 the safety requirements for admission to the Beta program were not clearly specified, and many random "influencers" posted videos showing questionable safety habits. Still, there were only a few thousand people in the program at that point, many of them employees. The wider rollout has used the Safety Score, starting at a score of 100 and gradually declining to around 91 or lower.

The safety score has been criticized for being open to manipulation, not considering aggressive acceleration, and encouraging bad driving by people reluctant to brake for fear of harming their score. It's not a perfect standard, but it is something at least.
 
The safety score has been criticized for being open to manipulation, not considering aggressive acceleration, and encouraging bad driving by people reluctant to brake for fear of harming their score. It's not a perfect standard, but it is something at least.
I suspect there was also some cunning on Tesla's part. If they run afoul of regulators they can say they used insurance standards to choose safe drivers .. hard to argue against that (on paper), regardless of the manipulations.
 
they can still argue that they used an independent benchmark for "safe" drivers to choose beta testers .. which is orthogonal to its use for insurance rates.
Except the rationale in CA is that these scores are not an independent measure of driver safety or attention. They are biased and statistically not useful. So they're saying they knowingly used something that functionally was a biased lottery, not a valid measurement of someone's ability to manage FSD beta safely when driving in public and exposing the public to risk.

Plus, there is nothing independent about the safety score.
 
Except the rationale in CA is that these scores are not an independent measure of driver safety or attention. They are biased and statistically not useful. So they're saying they knowingly used something that functionally was a biased lottery, not a valid measurement of someone's ability to manage FSD beta safely when driving in public and exposing the public to risk.

Plus, there is nothing independent about the safety score.
What would you suggest as an alternative?
 
What would you suggest as an alternative?
If the product is not safe enough to give to all licensed drivers, then it should only be given to trained professionals for testing until it is ready for release.

You know, like all the other self driving companies are doing (including BMW, Audi, and Mercedes).

Exposing the uninformed and un-consenting public to random drivers with access based on the thin veneer of an unaudited, opaque algorithm is unethical. We would never allow medical devices to be "tested" this way, where failures could lead to harm of people that are not in the testing group.

I mean, if we're OK with stuff like this, why do we even have the FAA? Why not just let Boeing and Airbus decide when their products are ready to transport people, and only have the FAA license pilots? Boeing can then identify "safe" pilots and "unsafe" pilots, and only the "safe" pilots are allowed to "test" the new airplanes with the newest automation and safety gear. The "unsafe" pilots fly the old stuff. You as a passenger? You have no idea what plane or pilot you're getting, just like that mom in her car next to you has no idea you're "testing" FSD.
 
You know, like all the other self driving companies are doing (including BMW, Audi, and Mercedes).

Exposing the uninformed and un-consenting public to random drivers with access based on the thin veneer of an unaudited, opaque algorithm is unethical. We would never allow medical devices to be "tested" this way, where failures could lead to harm of people that are not in the testing group.
First off, training an NN with qualified drivers in closed environments means it will only work in those closed environments, which is pointless. Ultimately, you have to give an NN real training data, which means real driving on real streets with uninformed drivers.

And yes, you do test this way with medical devices. Sure, you do various tests way before they are used on actual sick patients, but ultimately all medical devices are tested on sick human patients. That's pretty much what Tesla have done .. internal testing, simulated testing etc .. and then a wider beta release.

And, in actuality, how is this "unethical" testing going? Well, we've had tens of thousands of testers using FSD beta for 9-10 months now .. where are all these terrible accidents that doom-sayers were predicting? Sure, perhaps some (many?) have been avoided by the driver taking over, but that's how an L2 system is designed to work.
 
If the product is not safe enough to give to all licensed drivers, then it should only be given to trained professionals for testing until it is ready for release.

You know, like all the other self driving companies are doing (including BMW, Audi, and Mercedes).

Exposing the uninformed and un-consenting public to random drivers with access based on the thin veneer of an unaudited, opaque algorithm is unethical. We would never allow medical devices to be "tested" this way, where failures could lead to harm of people that are not in the testing group.

I mean, if we're OK with stuff like this, why do we even have the FAA? Why not just let Boeing and Airbus decide when their products are ready to transport people, and only have the FAA license pilots? Boeing can then identify "safe" pilots and "unsafe" pilots, and only the "safe" pilots are allowed to "test" the new airplanes with the newest automation and safety gear. The "unsafe" pilots fly the old stuff. You as a passenger? You have no idea what plane or pilot you're getting, just like that mom in her car next to you has no idea you're "testing" FSD.
There are definitely people invited to the Beta who absolutely should not be using it: people who treat it like a toy or something fun to show off to friends. Unfortunately, just as you can game the wheel nag, people can become trained drivers and still be a danger. The Uber safety driver who killed a pedestrian because he was distracted comes to mind. Only those who take testing seriously, with a high safety score and a proven driving record, should test.

The article is a few years old, but still telling:

 
I mean, if we're OK with stuff like this, why do we even have the FAA? Why not just let Boeing and Airbus decide when their products are ready to transport people, and only have the FAA license pilots? Boeing can then identify "safe" pilots and "unsafe" pilots, and only the "safe" pilots are allowed to "test" the new airplanes with the newest automation and safety gear. The "unsafe" pilots fly the old stuff. You as a passenger? You have no idea what plane or pilot you're getting, just like that mom in her car next to you has no idea you're "testing" FSD.
This is a poor analogy. In actuality, there are many experimental aircraft flying around that are not type-certified by the FAA. Any licensed pilot can build one and fly it without FAA type certification. No 'safety score' is required, only a valid FAA pilot's license suitable for that style of aircraft. Once constructed, the aircraft must be registered and inspected by a licensed mechanic, but it does not need to be proven out by a special test pilot.

Despite this, experimental aircraft are legally flown over the "uninformed and un-consenting public". And all this despite the fact that an aircraft could become unstable in flight and require special skills to recover, if recovery is possible at all. A car with a self-driving flaw, by contrast, can easily be taken over manually by the driver.
 
First off, training an NN with qualified drivers in closed environments means it will only work in those closed environments, which is pointless. Ultimately, you have to give an NN real training data, which means real driving on real streets with uninformed drivers.

And yes, you do test this way with medical devices. Sure, you do various tests way before they are used on actual sick patients, but ultimately all medical devices are tested on sick human patients. That's pretty much what Tesla have done .. internal testing, simulated testing etc .. and then a wider beta release.

And, in actuality, how is this "unethical" testing going? Well, we've had tens of thousands of testers using FSD beta for 9-10 months now .. where are all these terrible accidents that doom-sayers were predicting? Sure, perhaps some (many?) have been avoided by the driver taking over, but that's how an L2 system is designed to work.
Right, FSD Beta testing under Tesla's limitations has actually turned out far safer than any of the doomsayers predicted. If there were really an immediate danger, I have no doubt either CA or NHTSA would shut it down in a jiffy (especially given the stepped-up enforcement under the current administration).
 