
FSD is a fraud

Beta testers are working with Tesla to help fine-tune and expand the features of the software so that it will be safer than a human driver. The Beta requires constant vigilance while driving - many have commented on how stressful using Beta is - to correct problems that occur during the drive and report those problems to Tesla so the software can be adjusted.
So you agree - current Tesla FSD beta testing is putting the non-consenting public at risk, for a future "hope" that it will get better.

However, there is nothing close to proof that Tesla is gathering any useful data from these drives, particularly when FSD is engaged. Show me a report from Tesla explaining how much faster they are getting to actual self-driving because people are using FSD on the road right now. Show me an analysis that FSD is developing much faster now than it did before FSD went "public." And the idea that Beta testers work "with" Tesla? LOL. Have you ever tried to discuss any kind of bug you found with Tesla? It's a one-way black hole.

There have been many articles published about how much telemetry data Tesla collects on their cars.
Yep, but since we can't see the telemetry, we don't know what it is, how useful it is, or whether it can only be collected while FSD is engaged (and increasing risk) versus collected open loop in the background. They might be able to get 95% of the improvements with zero risk.
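To make the "open loop in the background" idea concrete, here's a minimal sketch of shadow-mode collection, assuming a hypothetical planner interface - the `Controls` fields, `planner.plan`, and the disagreement thresholds are all invented for illustration, not anything Tesla has documented:

```python
# Hypothetical shadow-mode loop: the driving stack computes what it
# WOULD do, but the human's inputs always drive the car, so data is
# gathered at zero added risk. All names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Controls:
    steer: float   # steering angle, radians
    accel: float   # longitudinal acceleration, m/s^2

def shadow_step(planner, frames, human: Controls, disagreements: list):
    predicted = planner.plan(frames)          # computed, never actuated
    if (abs(predicted.steer - human.steer) > 0.1
            or abs(predicted.accel - human.accel) > 2.0):
        # Log only large model-vs-human deltas to limit upload volume.
        disagreements.append({"frames": frames,
                              "predicted": predicted,
                              "human": human})
    return human                              # the human stays in control
```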
 
Have you not heard of Tesla jail?
Not being able to use autopilot for the rest of a drive? You think the risk of that will cause people to be better drivers and increase overall road safety? Something that can only happen while you have AP on?

If AP is so safe, the very last thing you should do to an "unsafe driver" is disable autopilot.

I only meant that monitoring of Tesla is far more accurate
Prove it. All Tesla records is FCW, hard braking, hard turning, unsafe following, and AP disengagements.
FCW is known to be full of false positives. Far from accurate.
Hard braking and turning can be done with any GPS or accelerometers.
Following distance is an estimate from a camera.
AP disengagements tell you nothing about a driver's ability to safely drive a car.

Meanwhile, they don't count speeding or hard acceleration, which other systems do.
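To illustrate the point about commodity sensors, here's a sketch of flagging hard-braking and hard-cornering events from plain accelerometer (or differentiated GPS) data - the 0.3 g threshold is a made-up example, not any vendor's actual cutoff:

```python
# Counting "harsh" events from raw IMU samples; no proprietary
# hardware needed. The 0.3 g threshold is illustrative only.
G = 9.81  # m/s^2 per g

def count_harsh_events(samples, threshold_g=0.3):
    """samples: iterable of (longitudinal, lateral) accel in m/s^2,
    e.g. from a phone IMU or GPS speed differentiated over time."""
    limit = threshold_g * G
    hard_braking = sum(1 for lon, _ in samples if lon < -limit)
    hard_turning = sum(1 for _, lat in samples if abs(lat) > limit)
    return hard_braking, hard_turning

# Example: one braking spike and one cornering spike
print(count_harsh_events([(-4.0, 0.2), (-1.0, 3.5), (0.0, 0.1)]))  # (1, 1)
```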
 
Funnily enough, the Tesla salesman told me not to buy FSD...I guess that dispels the theory that Tesla ‘marketing’ is defrauding people
Yeah, one salesperson in France outweighs the last 6 years of Tesla's videos on their website, product descriptions, and Elon constantly tweeting about it, telling people it's coming soon and you'd better buy now because the price is going up next week.
 
Sure there is. Tesla released FSD without any regulatory oversight, and continues to do so. Reporting crashes is not regulatory oversight on the calculation of risk. It's a way to determine after the fact if the company allowed too much risk on the road to decide if future oversight is needed.


What you described here is how they MANAGE risk, not how they evaluated when the system had sufficiently low risk to be released to public roads with public drivers. Was their internal threshold 1 fatality in 1000 miles or 10M miles? How did their testing acquire data to prove this before release?
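For scale, here's a back-of-envelope sketch of what statistically demonstrating a fatality rate takes, using the zero-failure rule of thumb and the rough US baseline of about 1 fatality per 100M miles (the baseline is a public ballpark figure; the rest is my arithmetic, not Tesla's):

```python
import math

# Rule of thumb: to claim (at 95% confidence) a rate below R with
# zero observed fatalities, you need about -ln(0.05)/R ≈ 3/R miles.
HUMAN_RATE = 1 / 100_000_000   # ~1 fatality per 100M miles, US ballpark

def miles_to_demonstrate(target_rate, confidence=0.95):
    return -math.log(1.0 - confidence) / target_rate

print(f"{miles_to_demonstrate(HUMAN_RATE):,.0f}")  # ≈ 299,573,227 miles
```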


Except speed limits and rolling through stop signs in Tesla's case, which they purposefully allowed.


Again, your whole description talked about what it does. What safety-focused regulators care about is whether it actually works: the hard numbers, the statistics. The things we don't get to see, and just like you said, we have zero idea if they make any difference.


With zero proof these improvements came because of data collected from public FSD users in the real world. Of course Tesla is developing FSD and making it better. But what percentage of those improvements were based on data taken from risking the uninformed public?


You mean we MIGHT get these cars. And they MIGHT come from Tesla. But they might also come from a company that never tested on the public road. We absolutely do not have them now, and the current cars are higher risk than ones without the technology.


The FAA is "slowing down" all sorts of development in aviation. The FDA slows down drugs all the time. Are you in support of getting rid of this and just letting companies do what they want with no oversight? What if Tesla suddenly says they are at L4 and no driver is needed in the car? Are you OK with them just dropping 1000 of those cars on the road in your city without even telling you, or do you maybe hope there is actually someone, somewhere "slowing them down" to make sure they are actually ready before one runs over your kid on their way home from school?
I'm not really sure how to continue this discussion. You seem to be running up against a few walls. You want to see data which you will never see. You want there to be minimal risk to the public, which regulators help control - you said it yourself with other agencies, like the FAA and FDA. On a total aside, there are MANY people who were quite upset about how fast the FDA approved COVID vaccines.

We're also near a logical problem common to religion. Prove that FSD is not an unacceptable risk to the public. Prove that FSD is an unacceptable risk to the public. As consumers, we can't do either. We see the goal, we see improvement, we hope the end result will be outstanding. We do our best to get there safely. The rest is up to regulators and companies to ensure the best product for us.

I also trust Tesla, like any company, to work in their best interests. It's not in Tesla's interest to release software that murders people - of course they're releasing software they feel is reasonably safe and getting testing data to try to make it even safer.

If you want some of your answers, you'll need to get a job at Tesla and work your way into the FSD program. Or you'll need to get hired by a regulatory agency, such as NHTSA, and work your way up enough to see data and influence policy.
 
So you agree - current Tesla FSD beta testing is putting the non-consenting public at risk, for a future "hope" that it will get better.
Absolutely, yes. But that's the same for ALL technological advancements. There is always risk. As for your use of the term non-consenting. I don't consent to 15-year-old untrained drivers on the road, but I recognize that kids eventually have to get behind the wheel of a real car on a real road to learn how to drive. I don't consent to the guy who has a few beers at a friend's house and then drives home thinking he's totally okay to drive. I don't consent to the person who ignores automobile maintenance and drives with nearly bald tires or the check engine light on for the last 500 miles. However, these people exist and share the roads with us. It's a risk we all take when we drive. We just do the best we can, and drive as defensively as we can to help minimize our personal risk.
 
No idea. Not sure about France but it is illegal to drive uninsured or unsafe vehicles here...
Same here in the USA. But in certain parts of southern California, for example, some persons who recently arrived can’t afford insurance so they risk it.

And if you get hit by them, good luck collecting. First? Can be hard to describe to cops what hit you as it might have a Ford Pinto hood, Corolla door, Mustang roof, 4 different wheels…

And the conversation with the offending driver goes like “I has no insurance, seen yore….do you has insurance?”

And it’s downhill from there
 
If AP is so safe, the very last thing you should do to an "unsafe driver" is disable autopilot.
All driver assistance features in any car must be able to be disengaged or overridden by the driver for whatever reason (emergency being the most important). AP was designed for a specific set of driving conditions and limited to a specific speed range. If drivers set AP to the maximum speed it can handle, and then use the accelerator to force the car faster than AP is designed to handle, it must disengage. There is a warning before it reaches this point. Tesla doesn't know why you forced the acceleration, so it disables the feature until you are able to stop safely and reengage it. Many cruise control systems in cars have a minimum speed requirement to engage the system. Some adaptive controls have the same minimum. You go below that speed and you cannot engage the system, or if the speed drops below that speed, it disengages with warnings. It's there to ensure the feature is used within the capabilities of the system, and not abused to potentially result in an unsafe condition.
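As a hedged sketch of the generic envelope logic described above - the feature only engages inside a speed window, warns near the edge, and disengages with alerts when forced past it. The specific speeds and the 3 mph warning margin are invented for illustration, not any manufacturer's actual values:

```python
# Generic ADAS speed-envelope logic; all numbers are illustrative.
MIN_ENGAGE_MPH = 18    # many cruise systems refuse to engage below a minimum
MAX_DESIGN_MPH = 85    # assumed design limit of the assist feature
WARN_MARGIN_MPH = 3

def assist_state(speed_mph: float, engaged: bool) -> str:
    if not engaged:
        return "available" if speed_mph >= MIN_ENGAGE_MPH else "locked_out"
    if speed_mph > MAX_DESIGN_MPH:
        return "disengage_with_alert"   # driver forced it past the envelope
    if speed_mph > MAX_DESIGN_MPH - WARN_MARGIN_MPH:
        return "warn"                   # approaching the design limit
    return "engaged"
```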
 
Same here in the USA. But in certain parts of southern California, for example, some persons who recently arrived can’t afford insurance so they risk it.

And if you get hit by them, good luck collecting. First? Can be hard to describe to cops what hit you as it might have a Ford Pinto hood, Corolla door, Mustang roof, 4 different wheels…

And the conversation with the offending driver goes like “I has no insurance, seen yore….do you has insurance?”

And it’s downhill from there
New Hampshire and Virginia don't require insurance. And every insurer in California offers uninsured motorist coverage to handle the poor, Hispanic immigrant you so eloquently painted the picture of for us. I especially liked the "seen yore" - classy.
 
I’ve been in the beta program for the better part of a year now and FSD is clearly not going to be a robotaxi at any point in the near future. Either Elon believes his own BS or FSD is a scam. I’m inclined to believe the former. Regardless, it’s fun to watch beta develop over time and I bought the software as a novelty so I’m not bitter about its lack of usefulness.

That said, naysayers often overlook that while leaving FSD to its own devices would quickly result in a fender bender, it’s hard to overstate the incentive for carefully monitoring the system when you have fifty grand and your life on the line. I have never seen FSD even remotely come close to endangering pedestrians or cyclists; it’s cautious to a fault in that regard.

It’s just human nature to watch FSD like a hawk and if anything I’m far more aware of my car’s behavior and surroundings while using FSD beta than when driving manually. I think this is why we see so few accidents from beta testers; it’s just unnatural to fully trust FSD beta and let it do its thing. It’s immediately apparent that it’s not even half baked.

Having your car, its cash value, your insurance rate, driving record, and physical well-being at stake is quite the incentive to carefully police FSD beta. I do not believe that people using the system put the public at risk. If anything it may be the exact opposite.
 
AP was designed for a specific set of driving conditions and limited to a specific speed range. If drivers set AP to the maximum speed it can handle, and then use the accelerator to force the car faster than AP is designed to handle, it must disengage. There is a warning before it reaches this point. Tesla doesn't know why you forced the acceleration, so it disables the feature until you are able to stop safely and reengage it. Many cruise control systems in cars have a minimum speed requirement to engage the system. Some adaptive controls have the same minimum. You go below that speed and you cannot engage the system, or if the speed drops below that speed, it disengages with warnings. It's there to ensure the feature is used within the capabilities of the system, and not abused to potentially result in an unsafe condition.
There is no warning before the speed based jail triggers. It's instant at 90 MPH (at least in radar cars). But when you do it, it does not disengage, and there absolutely is no rule that they must auto disengage when outside their limits. In fact that could be the very least safe thing to do.

It beeps loudly at you until YOU disengage. If you don't, it will bring the car to a stop. Then it blocks you from turning AP on again until you put the car in park. Not at a low speed, not once you are off the throttle. It's "jail" on purpose, to try and get you to not do it in the future, not for protection now. Notice how it DOESN'T disable autosteer while it's happening? Because that would be dumb if someone actually did have a failure, and cascading failures are really bad.

Yes, there are logical low speed reasons you can't turn on cruise. But I've never driven any other car that if you go above some speed with the cruise on, it blocks you from using cruise for the rest of the drive. It just lets you do it and catches the speed once you let off. Because, you know, disabling driver assistance functions for a whole drive as a "punishment" is really weird.

And the argument here is that this "you can't use AP again until you go into park" is somehow going to improve overall driver safety? If that's true, why wouldn't disabling AP all the time improve it even more? ;)
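For contrast with the envelope sketch earlier in the thread, here's the lockout behavior as reported above, as a sketch - the 90 mph trigger and the Park-to-clear rule come from the post; the class and method names are mine and purely hypothetical:

```python
# "AP jail" as described for radar cars: instant trigger above 90 mph,
# continuous alerts (autosteer stays active), and no re-engagement
# until the car is shifted into Park. Illustrative only.
class APJail:
    def __init__(self):
        self.jailed = False

    def on_speed_sample(self, mph: float, ap_engaged: bool) -> str:
        if ap_engaged and mph > 90:          # instant, no prior warning
            self.jailed = True
            return "alert_until_driver_disengages"  # autosteer NOT cut
        return "normal"

    def can_engage_ap(self, gear: str) -> bool:
        if self.jailed and gear == "P":      # shifting to Park clears it
            self.jailed = False
        return not self.jailed
```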
 
As for your use of the term non-consenting.
All your examples of non-consenting are illegal actions by individual people. Ones they can go to jail for because of the risk they pose to the public. Ones the public is well aware are risks they may be exposed to when deciding to use the roadways.

Tesla uploading a new FSD version that is untested is perfectly legal, unregulated, and completely at Tesla's internal risk management. And none of this is exposed to the average road user in any way.
 
I also trust Tesla, like any company, to work in their best interests. It's not in Tesla's interest to release software that murders people - of course they're releasing software they feel is reasonably safe and getting testing data to try to make it even safer.
I assume you trust Boeing, Airbus, Pfizer, Merck, GE, and every other capitalistic company doesn't want to murder anyone either.
So why do we have regulatory agencies? Why don't we just trust them to release "reasonably safe" products?

We see the goal, we see improvement, we hope the end result will be outstanding. We do our best to get there safely. The rest is up to regulators and companies to ensure the best product for us.
What regulator? This is the issue. People think somehow some regulator is out there keeping them safe. Tesla has been genius about saying "pending regulatory approval" because it simultaneously gives them an excuse when they ship way late while also making people assume that it was somehow approved by a regulator.

In the USA, there is NO REGULATOR right now. Nobody oversees Tesla. You are 100% taking your safety and the safety of others around you into Tesla's hands and risk analysis alone. There is no backup, and if people knew that, I believe they would be a lot less accepting of this process, just like they wouldn't get on an airplane that wasn't FAA audited.
 
All your examples of non-consenting are illegal actions by individual people. Ones they can go to jail for because of the risk they pose to the public. Ones the public is well aware are risks they may be exposed to when deciding to use the roadways.

Tesla uploading a new FSD version that is untested is perfectly legal, unregulated, and completely at Tesla's internal risk management. And none of this is exposed to the average road user in any way.
You can reach out to your Congressperson, Senator, or Governor to see about getting regulations changed. Otherwise, sounds like we may be seeing "gearchruncher" on a ballot for office some day. :)
 
I assume you trust Boeing, Airbus, Pfizer, Merck, GE, and every other capitalistic company doesn't want to murder anyone either.
So why do we have regulatory agencies? Why don't we just trust them to release "reasonably safe" products?


What regulator? This is the issue. People think somehow some regulator is out there keeping them safe. Tesla has been genius about saying "pending regulatory approval" because it simultaneously gives them an excuse when they ship way late while also making people assume that it was somehow approved by a regulator.

In the USA, there is NO REGULATOR right now. Nobody oversees Tesla. You are 100% taking your safety and the safety of others around you into Tesla's hands and risk analysis alone. There is no backup, and if people knew that, I believe they would be a lot less accepting of this process, just like they wouldn't get on an airplane that wasn't FAA audited.
I may have missed something, but my MY has had two recalls since I purchased it, both from NHTSA investigations. One was for seat belt warning chimes and the other about rolling stops (in FSD Beta). People complain to NHTSA, and they investigate. If their investigation warrants changes from the company, a recall is ordered. There are quite a few people watching NHTSA's investigation on phantom braking, and a possible recall. Fortunately, there has been progress made, so hopefully regulators don't need to step in.
 
All your examples of non-consenting are illegal actions by individual people. Ones they can go to jail for because of the risk they pose to the public. Ones the public is well aware are risks they may be exposed to when deciding to use the roadways.

Tesla uploading a new FSD version that is untested is perfectly legal, unregulated, and completely at Tesla's internal risk management. And none of this is exposed to the average road user in any way.
Whenever anyone critiques ADAS systems in a public forum, all other drivers are immediately assumed to have their hands at 10 and 2 with a good night’s sleep, a decade of experience, and a clean driving record. The other drivers don’t speed, become emotional, or tinker with their phones. They all have 20/20 vision. Their tires are all at the proper PSI and their tie rod ends are all known to be in good condition.

In reality the public roadways are a mishmash of abilities, skills, mental states, physical conditions, vehicle states and so forth.

Without breaking any laws someone near you on the road may be very young, very old, very tired, very amped, just dumped by their spouse, just fired from their job, and so on.

Their cars are in various states of repair with various generations of safety systems. They might have anti-lock brakes, they might not, and they probably don’t have any idea what those words mean in the first place. Drivers may or may not know how to properly operate the vehicles they’re using. They may have been poorly taught to drive in the first place and have poor practices baked into their skill set.

We all “consent” to this incredibly stupid web of transportation every time we get in the car. As far as I’m aware, ADAS systems are another one of these wild and endless variables we all consent to until such time as the public deems them a hazard worth restricting. At least from my perspective there are no data to support such a restriction, yet, and so these systems are allowed on public roads.
 
I’ve been in the beta program for the better part of a year now and FSD is clearly not going to be a robotaxi at any point in the near future. Either Elon believes his own BS or FSD is a scam. I’m inclined to believe the former. Regardless, it’s fun to watch beta develop over time and I bought the software as a novelty so I’m not bitter about its lack of usefulness.

That said, naysayers often overlook that while leaving FSD to its own devices would quickly result in a fender bender, it’s hard to overstate the incentive for carefully monitoring the system when you have fifty grand and your life on the line. I have never seen FSD even remotely come close to endangering pedestrians or cyclists; it’s cautious to a fault in that regard.

It’s just human nature to watch FSD like a hawk and if anything I’m far more aware of my car’s behavior and surroundings while using FSD beta than when driving manually. I think this is why we see so few accidents from beta testers; it’s just unnatural to fully trust FSD beta and let it do its thing. It’s immediately apparent that it’s not even half baked.

Having your car, its cash value, your insurance rate, driving record, and physical well-being at stake is quite the incentive to carefully police FSD beta. I do not believe that people using the system put the public at risk. If anything it may be the exact opposite.
I more or less have been here for the last 2-3 years. You see some progress, but the more I have it, the more I can't actually imagine it becoming a legit product (Full Self Driving). More like significant assisted driving where you still have to put a lot of input in.

I still tend to think the current space is about the worst situation to be in. The car does enough to act like it's driving but still requires you to engage and provide input, albeit often secondarily/when prompted to do so, which I honestly think is the worst. I've said it before to my car: if you can't make the decision yourself, then you shouldn't be allowed to make any decisions in the first place.
 
Were any pedestrians hit and/or injured?

Or was the inherent caution exhibited by someone facing great financial, legal, and physical risk enough for them to intervene before anyone was hurt?

There was a video when FSD first came out of it trying to run over pedestrians in Seattle. The driver had to disengage. The video has been scrubbed from the internet via a series of DMCA takedowns. This was 10 months ago.

 
I assume you trust Boeing, Airbus, Pfizer, Merck, GE, and every other capitalistic company doesn't want to murder anyone either.
So why do we have regulatory agencies? Why don't we just trust them to release "reasonably safe" products?
Because human error still happens. Some of the brightest, smartest, most capable people built the Space Shuttle, and the Challenger still exploded after lift-off. Carelessness doesn't have to exist for mistakes to be made. Regulatory agencies exist to create codes that ensure safety, investigate problems, and issue recalls for products. Many times recalls are issued voluntarily by companies before investigations are even started, or in concert with regulators when they report complaints. Some companies design and build prototypes and create a great product, then send the specs to be built by another company (think iPhones). Those manufacturing companies introduce errors that the original company didn't create. Or one of the parts companies that supplied the bolts that hold the wheels on may have a defect that wasn't caught - as is the case with Toyota just issuing a recall for their new bZ4X EV.
 
I may have missed something, but my MY has had two recalls since I purchased it, both from NHTSA investigations. One was for seat belt warning chimes and the other about rolling stops (in FSD Beta). People complain to NHTSA, and they investigate. If their investigation warrants changes from the company, a recall is ordered. There are quite a few people watching NHTSA's investigation on phantom braking, and a possible recall. Fortunately, there has been progress made, so hopefully regulators don't need to step in.
There was also a recall-class incident with FSD beta where they broke Automatic Emergency Braking even when AP is off. It was causing all these false positives because Tesla didn't do enough validation before shipping the build to beta testers.