People are dumb and lack basic reading comprehension skills. Despite it being quite clear what they were buying, they remain convinced they were tricked.

It's a shame people can't read....
I bought my second Tesla car in 2018. At that time the website described FSD as a true L5 autonomy system. No "ifs" and "buts".

And unlike many people here, I actually read the car sale contract (and opted out of mandatory arbitration), and it does not define "FSD" in any way. And neither does the 2021 contract, btw.

So what should people believe? Perhaps we should all agree that "FSD" means that Elon should give each customer $100 million so that they can hire a personal driver for life? I think that's a very fair and unbiased reading of the "FSD".
 
I bought my second Tesla car in 2018. At that time the website described FSD as a true L5 autonomy system. No "ifs" and "buts".

but it never said when that would happen.... reading....

And unlike many people here, I actually read the car sale contract (and opted out of mandatory arbitration), and it does not define "FSD" in any way. And neither does the 2021 contract, btw.

ok then...

So what should people believe?

What it said. You are buying the right to a (in your case) level five system for your car whenever it gets released. Could be 2 years, could be 20....
 
What it said. You are buying the right to a (in your case) level five system for your car whenever it gets released.
It absolutely said that you were buying a car with the HW capability of FSD at a level safer than a human, on the day you bought your car. Not a future upgrade. The car you buy has hardware that is capable.

Even Tesla acknowledges that this is now known to be not true. Tesla tried, and it had insufficient processing power. It can't even detect a red light.
 
Tesla tried, and it had insufficient processing power. It can't even detect a red light.
Hacked Autopilot had traffic light detection back in 2019, working on HW2/2.5, but it wasn't very consistent.

I suppose technically the SAE levels of autonomy describe design intent of capabilities but don't really capture attributes like quality, consistency or smoothness. A level 5 system with insufficient processing power could theoretically detect some red lights and respond to them with perhaps a few seconds of delay, resulting in harsh braking or running red lights, so it seems unlikely regulators would approve such a system for public use.

Basically, it seems like Tesla has multiple "outs," as I believe the promise was always contingent on "software validation" and "regulatory approval." So even if FSD gets approval for Level 5 public use with HW3 or some future hardware, Tesla could claim they're still working on optimizing it to run on older hardware, or have regulators disapprove a poor implementation on old hardware. I would be totally upset if Tesla ended up doing this, as it would be very misleading, but I guess for now we'll continue waiting.
 
I suppose technically the SAE levels of autonomy describe design intent of capabilities but don't really capture attributes like quality, consistency or smoothness.

Tesla made it clear their definition of FSD is "at a safety level substantially greater than that of a human driver."

Andrej, Q3 2018 Tesla Call:
This upgrade allows us to not just run the current neural networks faster, but more importantly, it will allow us to deploy much larger, computationally more expensive networks to the fleet. The reason this is important is that it is a common finding in the industry, and we see this as well, that as you make the networks bigger by adding more neurons, the accuracy of all their predictions increases with the added capacity.

So in other words, we are currently at a place where we trained large neural networks that work very well, but we are not able to deploy them to the fleet due to computational constraints.
Andrej says it's not possible. Case closed.
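
To make the compute-constraint point concrete, here is a rough back-of-envelope sketch. Every number in it (TOPS figures, frame rate, utilization, network sizes) is an illustrative assumption of mine, not a Tesla specification:

```python
# Rough back-of-envelope sketch: a fixed inference budget caps how large a
# per-frame network can be. All numbers below are illustrative assumptions,
# not Tesla specifications.

def fits_realtime_budget(net_gflops_per_frame: float, hw_tops: float,
                         cameras: int = 8, fps: float = 36.0,
                         utilization: float = 0.3) -> bool:
    """True if running the network on every camera frame fits the hardware's
    sustained throughput (peak TOPS derated by an assumed utilization factor)."""
    usable_ops_per_sec = hw_tops * 1e12 * utilization
    required_ops_per_sec = net_gflops_per_frame * 1e9 * cameras * fps
    return required_ops_per_sec <= usable_ops_per_sec

current_net = 5.0    # assumed ~5 GFLOPs per camera frame ("current" network)
larger_net = 50.0    # assumed ~50 GFLOPs per camera frame ("larger" network)

for name, gflops in [("current net", current_net), ("larger net", larger_net)]:
    print(name, "on ~10 TOPS (HW2.5-class):", fits_realtime_budget(gflops, 10))
    print(name, "on ~72 TOPS (HW3-class): ", fits_realtime_budget(gflops, 72))
```

The point of the sketch is only the shape of the constraint: a network big enough to be meaningfully more accurate can blow past the older computer's real-time budget while fitting comfortably on the newer one.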

Also, Elon, same call:
But to be clear, there is definitely no need to wait until Q2 to order a car. It's - we want to make it just completely seamless process, so there is no advantage ordering now versus Q2. Andre, do you want to…?
Hmm, and now, there IS an advantage to ordering in Q2 vs Q1. You would have gotten HW3, and you could have subscribed to FSD without a $1K upcharge.
 
"What do you mean I can't 'rent' the software for a month and get 3x the value for free???"

What do you mean "All Tesla vehicles produced in our factory have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver." means that I need a paid hardware upgrade to enable FSD on my car?

Tesla chose to say that in 2016, 2017, 2018, and 2019. They even kept saying it when they knew it wasn't true, from Oct 2018 to April 2019, after Karpathy said the modern models couldn't run on HW2. By Tesla's own admission, the HW in 2016-2019 cars is NOT capable.

So why should a customer have to pay to upgrade, if the car was sold as already having this hardware capability?
 
What do you mean "All Tesla vehicles produced in our factory have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver." means that I need a paid hardware upgrade to enable FSD on my car?

Tesla chose to say that in 2016, 2017, 2018, and 2019. They even kept saying it when they knew it wasn't true, from Oct 2018 to April 2019, after Karpathy said the modern models couldn't run on HW2. By Tesla's own admission, the HW in 2016-2019 cars is NOT capable.

So why should a customer have to pay to upgrade, if the car was sold as already having this hardware capability?
On the one hand I agree with you. They did advertise that and they are clearly wrong about it. It would be good PR for them to do the upgrade for free.

But on the other hand, I can't see any for-profit company doing this, even if they did go back on a promise. At least HW3 is backwards compatible. They could have said "tough luck" and not offered HW3 or any features above EAP on the older cars.
 
The only thing Tesla can be trusted to do is whatever is in their own best interests. They will never do anything for their customers except the bare minimum that keeps them from being sued, and sometimes not even that.

Everyone who sits around pretending that Elon and his cronies are any better than the executives at Exxon or Nestle or Monsanto or any other awful company needs to stop deluding themselves.
 
But on the other hand, I can't see any for-profit company doing this, even if they did go back on a promise. At least HW3 is backwards compatible. They could have said "tough luck" and not offered HW3 or any features above EAP on the older cars.
And this is why we unfortunately need to take advantage of small claims, arbitration, and class actions. If we allow companies to say whatever they want to get customers to buy a product, but then just say "welp, yep, why would any for-profit company actually keep a promise after a sale?" and never hold them to that promise, we'll have anarchy. So we have to actually use the courts to hold them to their promises and make them incur losses for advertising things that are not true, or they will run rampant.

You realize that the 4-year warranty on your car is just a promise that, by that logic, no for-profit company would honor, right? It's a warranty that the car will do what it was advertised as doing for a specific time period. Guess what warrantied cars don't do that they were promised to do....
 
While the safety data that Waymo released is very encouraging, 6M miles is not enough to statistically prove that Waymo is safer than humans. Waymo needs more data to prove safety with high confidence, and I believe that is one reason why Waymo is investing so much in Simulation City. If you try to brute-force it with only real-world miles, you need about 2B miles to prove safety. So if Waymo wants to get out of the geofence, they are going to need a lot more than 6M miles. If they can get Simulation City to be a reliable measure of safety, then Waymo can use it to supplement their real-world safety data and prove safety that way, a lot faster than just trying to brute-force it.

[Attached image: brute_force.png]
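
For context on where figures like "about 2B miles" come from, here is a rough sketch of the underlying arithmetic. The assumptions (zero observed events, a simple Poisson model, a human fatality rate of roughly 1 per 100M miles) are mine for illustration, not necessarily the source of the number above:

```python
# Rough sketch of the "how many miles to prove safety" arithmetic.
# Assumptions (mine, not from the post above): zero observed events,
# a simple Poisson model, and a human fatality rate of ~1 per 100M miles.
import math

def miles_needed(target_rate_per_mile: float, confidence: float = 0.95) -> float:
    """Miles that must be driven with zero events before the one-sided upper
    confidence bound on the event rate drops below target_rate_per_mile."""
    return -math.log(1.0 - confidence) / target_rate_per_mile

human_rate = 1 / 100_000_000   # assumed ~1 fatality per 100M miles

print(f"Match humans:   {miles_needed(human_rate):,.0f} miles")      # ~300M
print(f"Show 5x better: {miles_needed(human_rate / 5):,.0f} miles")  # ~1.5B
```

Either way it comes out orders of magnitude beyond 6M real-world miles, which is the point being made above.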


You can't use simulations in your test set to define the accuracy of your algorithm. This is a MAJOR data science / algorithm faux pas.
 
You can't use simulations in your test set to define the accuracy of your algorithm. This is a MAJOR data science / algorithm faux pas.

I don't think that is what Waymo is doing. They plan to use the simulation to validate real-world performance, which they argue you can do if the simulation is realistic enough:

For simulation to be an effective tool, it has to closely represent the environment it's emulating to provide accurate insights into a system’s performance. What happens in simulation must be predictive of what happens in the real world to ensure you are simulating the right things. This requires minimizing the differences between the simulated and real world, from the way your sensors see to the way your agents react to changes in their simulated environments — and are statistically representative of the real world.
One way we ensure statistical realism in our simulated world is by creating realistic conditions for our autonomous driving technology to experience.
As we simulate more and more variations of the same scenario, we begin seeing a convergence of the distribution of outcomes between what we observe in simulation and the real world.
To help ensure a simulated environment is representative of the real world, you need to have a lot of high quality data from the real world to base it off of and compare it to. Our simulated environments are constantly informed and refreshed with the experiences our fleet collects every day in the dozens of cities we drive in, letting us incorporate minor subtleties into our simulation regularly.
As our simulation becomes more sophisticated, we can assess harder behaviors and aggregate important trip-level statistics about the Waymo Driver's performance, enabling us to advance and accelerate our technology in many ways,

Source: Waypoint - The official Waymo blog: Simulation City: Introducing Waymo's most advanced simulation system yet for autonomous driving
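
As a purely illustrative sketch (not Waymo's actual methodology, which isn't public at this level of detail), the "convergence of the distribution of outcomes" they describe can be checked with something as simple as a two-sample test on a trip-level metric:

```python
# Illustrative sketch only, not Waymo's actual code: one simple way to check
# whether simulated outcomes are "statistically representative" of real-world
# outcomes is a two-sample test on a trip-level metric, e.g. minimum
# time-to-collision per trip. The distributions below are placeholder data.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
real_world_min_ttc = rng.gamma(shape=4.0, scale=0.5, size=5_000)   # placeholder
simulated_min_ttc = rng.gamma(shape=4.0, scale=0.5, size=50_000)   # placeholder

res = ks_2samp(real_world_min_ttc, simulated_min_ttc)
# A small KS statistic (and a p-value that does not reject) says the simulated
# distribution of this metric is consistent with the real-world one; a large
# gap would flag a sim-to-real mismatch before trusting simulation statistics.
print(f"KS statistic = {res.statistic:.3f}, p-value = {res.pvalue:.3f}")
```

In practice you would run this kind of check per scenario family and per metric rather than once globally, but the idea is the same: validate the simulator against real-world data before using it to extrapolate beyond the real-world miles you have.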
 
Times have been very tough for the trolls. It's ridiculous, but autosteer city/FSD is basically the last thing they can point and sputter at.

I remember when the trolls were salivating over imminent bankwuptcy due to Tesla's debt-servicing burden. Tesla is supposedly going to retire a significant portion of their outstanding debt. I imagine Tesla won't have any long-term outstanding debt in the near future.

Yes. And the list seems to grow. I guess trolls reproduce faster than humans.