"Elon’s tweet does not match engineering reality per CJ." - CJ Moore, Tesla's Director of Autopilot Software

To me this is the most important part of the e-mails.

Basically, all the "unrealistic" stuff Musk keeps saying (in terms of L5 / robotaxi / FSD happening by the end of year NNNN) is because he is "extrapolating". He might even be thinking it gets better linearly (or worse, "exponentially") instead of the parabolic flattening that is usually observed when fixing issues.

The ratio of driver interaction would need to be in the magnitude of 1 or 2 million miles per driver interaction to move into higher levels of automation. Tesla indicated that Elon is extrapolating on the rates of improvement when speaking about L5 capabilities. Tesla couldn’t say if the rate of improvement would make it to L5 by end of calendar year.

Really - is that all this comes down to?!
 
PlainSite may be owned by someone who is anti-Tesla, but the documents on PlainSite are still legit. They are real emails between Tesla and the CA DMV. Why can't we talk about what is in the emails?
But it makes some sense not to link to that site if he is mining IP addresses for doxxing etc. It would be a good idea to just take a screen capture and post it (or better still, link to the Reuters article).
 
So what is? Elon has been tweeting for 6 months that the public release is right around the corner. We're supposed to have a button. They're waiting for the march of nines. Elon often uses intervention rate as a metric for how well they are doing: "we measure this primarily in interventions".
I'd give Tesla a pass too if they weren't acting like it's working awesome, like the public rollout is imminent, and like interventions are a good metric.

Well, you might take Elon saying that as good evidence of my point ... that intervention rate isn't a good predictor of how close you are to release. In fact, it's a perfectly valid metric to indicate whether you have reached an acceptable plateau for release, but using the delta between where you want to be and where you are as a predictor of how much work is left is highly risky.
 
The real question is whether L5 will ever happen.

I am not an expert, but I think L5 is one of those questions like: Could humans ever go into space, to the moon, to Mars, and live on Mars?

The answer for those and for L5 is: they will happen (and already did for space and the moon), but the real question is timing. When? I am not sure L5 will be a reality within my lifetime (a couple more decades on Earth for me).
 
I'm going to give Tesla the benefit of the doubt on this one. This is beta software that is evolving rapidly, and the disengagement rate is going to fluctuate a lot until things stabilize. Furthermore, AI/NN systems are very non-linear when it comes to changes to the network and their final effects on the output, so things like disengagement rates are not a good predictor of progress or release dates. So what would be the point of releasing numbers?

The predicted non-linear, rapid improvement has been making great progress in theory for years, but it doesn't seem to be showing up in the field.

It's improving so fast in theory that Elon Musk said in 2015: “We’re going to end up with complete autonomy, and I think we will have complete autonomy in approximately two years."

It's great in theory, but we need something tangible, like an FSD beta download button for hundreds of thousands or millions of Tesla owners, now or at any point in the past 6 years.
 
The predicted non-linear, rapid improvement has been making great progress in theory for years, but it doesn't seem to be showing up in the field.

It's improving so fast in theory that Elon Musk said in 2015: “We’re going to end up with complete autonomy, and I think we will have complete autonomy in approximately two years."

It's great in theory, but we need something tangible, like an FSD beta download button for hundreds of thousands or millions of Tesla owners, now or at any point in the past 6 years.
Non-linear doesn't mean rapid... it means there is a non-proportional relationship between the measured completeness of a system and the amount of work predicted to complete it. In cases like AI/NN, the relationship may be very hard or impossible to compute.
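
Here's a toy model of what I mean (completely made-up numbers, just to show the shape): suppose each unit of engineering work fixes a fixed 10% of the *remaining* failure cases. The measured success rate shoots up early, but every additional "nine" of reliability costs about as much work as all the progress before it.

```python
import math

# Toy model, NOT Tesla data: assume each unit of engineering work removes a
# fixed 10% of the *remaining* failure cases.
start_failure_rate = 0.5     # hypothetical start: half of all drives need an intervention
per_unit_reduction = 0.10

for target in (0.05, 0.005, 0.0005, 0.00005):   # 95%, 99.5%, 99.95%, 99.995% success
    # smallest n with start_failure_rate * (1 - per_unit_reduction)^n <= target
    n = math.ceil(math.log(target / start_failure_rate) / math.log(1 - per_unit_reduction))
    print(f"{100 * (1 - target):.3f}% success needs ~{n} units of work")
```

In this toy model, getting from 95% to 99.5% "complete" costs the same ~22 units of work as everything before it, and each extra nine after that costs the same again - which is why a measured completeness number tells you very little about how much work is left.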
 
As far as I can tell, Elon's interpretation of "Level 5" (what he seems to mean when he says it) diverges significantly from the actual definition. What Elon seems to mean by "Level 5" is that the car will be technically capable of autonomously performing just about any conventional driving operation: tricky blind curves, weird intersections, roundabouts, cone zones, handheld traffic signs, routing around slow or stopped cars, and so forth, in such a way that there is a "nonzero chance" (I believe those were Elon's words) that the car will be able to navigate any real-world set of turn-by-turn directions without intervention. Of course, that's not Level 5; it's non-geofenced Level 3. But I do expect FSD 9.0 to pretty much satisfy this definition this year.

To move to true Level 4 (let alone Level 5), the rate of "necessary" disengagements must fall, as CJ says, to less than one per million miles or so. Most of my real-world highway disengagements are already not "necessary"; the car may be phantom-braking, or choosing a "wrong" lane, and I disengage out of impatience rather than concern. I'd guess my current rate of "necessary" highway disengagements (where I disengage for safety reasons) is once per couple hundred miles or so with FSD 8.0. Let's further suppose only 1% of such incidents would lead to an actual collision if I hadn't intervened. That's still about two orders of magnitude short of Level 4 requirements.

Will FSD 9.0 leapfrog these two orders of magnitude? What's more, how can we even know if it does? It's already difficult to know for sure whether a given disengagement is truly necessary or not. With a driver paying attention (Level 3), many would-be FSD accidents (with the car "at fault") would be driver-disengaged and avoided, so one L3 accident per million miles does not imply the car is ready for L4. It might objectively be at ten L4 accidents per million miles, with the L3 driver avoiding nine of them.
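
To sanity-check my own arithmetic above (every rate here is a rough guess on my part, not measured data), a quick back-of-the-envelope in Python:

```python
# Back-of-the-envelope check of the guesses above; none of these are measured numbers.
miles_per_necessary_disengagement = 200     # ~1 safety disengagement per couple hundred miles
collision_fraction = 0.01                   # suppose only 1% of those would become collisions
l4_target_miles_per_collision = 1_000_000   # CJ's ~1 per million miles ballpark

miles_per_collision = miles_per_necessary_disengagement / collision_fraction
print(f"~1 would-be collision per {miles_per_collision:,.0f} miles")        # ~20,000
print(f"shortfall vs the L4 target: ~{l4_target_miles_per_collision / miles_per_collision:.0f}x")

# The masking effect: a supervising L3 driver who catches 9 of 10 would-be accidents
# makes a system that is objectively at 10 accidents per million miles look like
# it is at 1 per million.
true_accidents_per_million = 10
driver_catch_rate = 0.9
print(f"observed under L3 supervision: "
      f"{true_accidents_per_million * (1 - driver_catch_rate):.0f} accident(s) per million miles")
```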

My own expectation is that we will see city-street L3 this year or next, and highway L4 (eyes off the road, onramp to offramp) by around 2025. But I expect that city L4 and L5 are quite a bit further out; probably 2030 and 2035 respectively, and I expect them to require Hardware 4.0 or 5.0, with more than just optical cameras. Then again, I hope Elon proves me wrong!
 
...My own expectation is that we will see city-street L3 this year or next
The L3 expectation contradicts the Tesla-CA DMV letters, which say that even when FSD beta is no longer beta but in its final release, it will still be L2.

...highway L4 (eyes off the road, onramp to offramp) by around 2025...

First things first: has Tesla's system demonstrated that it can reliably avoid colliding with a stationary obstacle at highway speed, the way Waymo has demonstrated with no deaths or collisions in which Waymo hits others (of course, others do hit Waymo)?

The speculation is that this FSD beta will blow your mind, but that is only talk, not proof.

Mobileye says it can do this with vision, but only as L2 driver assistance. To move to a higher level it is adding LIDAR + RADAR, which is contrary to Tesla's belief (Tesla wants vision only: no LIDAR, no RADAR).
 
But even that makes Elon a liar - he says they are waiting for the march of 9's to release the FSD beta:


"This is a “march of 9’s” trying to get probability of no injury above 99.999999% of miles for city driving. Production Autopilot is already above that for highway driving." CJ's email makes it clear that they are nowhere near that. Elon's number is 1 in 100 million miles, and CJ says they aren't at 1:1M. So there is zero way Tesla can expect to release the beta button "next month" if they need 3 orders of magnitude improvement in reliability before release. It also really calls into question if base highway AP is that high as well.
There is no Production Autopilot. All Tesla autopilot is in beta and you cannot use it until you accept the beta disclaimer.

Elon is selling snake oil to gullible tech addicts and got very rich

disclosure: I use Beta Autopilot every day
 
"Driver interaction" seems like a much broader term than disengagement or even intervention. I don't think it would include applying torque to the wheel as that is a driver monitoring requirement. IMO, "driver interaction" probably includes stalk confirmations, corrective braking, accelerating and steering and of course disengagements.

The 1M or 2M miles per driver interaction appears to be a rough ballpark number that Tesla is aiming for in order to reach L5. I don't think it is possible to actually measure or classify levels by their disengagement rate. I know the SAE document says that you cannot quantitatively measure the SAE levels.

In any case, the current FSD Beta is nowhere near 1M or 2M miles per driver interaction. So Tesla is very far from their own stated L5 goal. But it is sneaky that Tesla won't release any data on this. That way, there is no data to quantify how far Tesla is from their L5 goal and they can just ride on the hype instead.

I wish Tesla would release the current miles per interaction so that we could get a sense of how far Tesla is from their L5 goal. Tesla could release the number of FSD Beta miles driven and the number of driver interventions every quarter, and we could see how much progress Tesla is making.
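
Even a bare-bones quarterly table would do it. Something like this, with numbers I've invented purely to show the shape of the report, would tell us far more than any number of demo videos:

```python
# Hypothetical quarterly report; every number below is invented for illustration.
quarterly_data = [
    ("2020 Q4", 120_000, 40_000),    # (quarter, FSD Beta miles driven, driver interventions)
    ("2021 Q1", 450_000, 110_000),
    ("2021 Q2", 900_000, 180_000),
]

L5_BALLPARK = 1_000_000  # CJ's ~1M+ miles per interaction

for quarter, miles, interventions in quarterly_data:
    miles_per_interaction = miles / interventions
    print(f"{quarter}: {miles_per_interaction:.1f} miles per interaction "
          f"(~{L5_BALLPARK / miles_per_interaction:,.0f}x short of the L5 ballpark)")
```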
Why in the world would they do that? How does that help them? Sounds like it would just give ammo to pedants. I will also take this opportunity to note that SAE itself is in flux regarding what Levels 3-5 are.
 
Dirty Tesla (a YouTuber with the FSD beta) is at 3 miles per disengagement.
Only about a 300,000x to 700,000x increase in reliability is needed to hit CJ's 1-2 million mile target. Only about 33,000,000x to hit Elon's 1:100M.
Let's say we manage to increase reliability by 50% every week.
3 to 1 million is only 31 weeks away!
See, we're almost there. Just need to improve exponentially for the next 8 months on something that has taken 5 years to get from 0-3.
Then Elon's number is only 12 weeks behind that!

Ok, Ok, 50% is pretty unrealistic. Let's say we can do 10% per week...
Well, then 1:1M is only 134 weeks away. 2.6 years. Sounds perfect for Elon. It's always 2 years away.
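
For anyone who wants to check that arithmetic (same assumptions as above: 3 miles per disengagement today and a constant weekly improvement rate, which is itself a huge assumption):

```python
import math

# Weeks needed for miles-per-disengagement to grow from 3 to each target
# at a constant weekly improvement rate.
start = 3  # Dirty Tesla's reported miles per disengagement
targets = {"CJ's 1M miles": 1_000_000, "Elon's 100M miles": 100_000_000}

for weekly_rate in (0.50, 0.10):
    for name, target in targets.items():
        weeks = math.log(target / start) / math.log(1 + weekly_rate)
        print(f"{weekly_rate:.0%}/week to {name}: ~{weeks:.0f} weeks (~{weeks / 52:.1f} years)")
```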
You are such a lousy troll. Not even good at it...go back to stopping progress and leave the trolling to professionals.
 
Why in the world would they do that? How does that help them? Sounds like it would just give ammo to pedants. I will also take this opportunity to note that SAE itself is in flux regarding what Levels 3-5 are.

Oh I agree that Tesla has little incentive to release disengagement data since it would reveal how bad their "FSD" is. But I want to see the data because I am a Tesla owner who loves quantitative data. I would be interested in tracking Tesla's progress in a more quantitative way. FSD Beta videos are fun to watch but they don't do a good job of actually showing Tesla's FSD progress because they are anecdotal. Plus, releasing data would help Tesla be more transparent.

I am not sure what you mean by the SAE being in flux about L3-L5. The SAE document itself is very clear about L3-L5. However, it is true that people outside of the SAE have raised issues with the levels. Specifically, some AV companies have argued that L3 should be removed. And some have argued that L5 is an asymptote, something that AVs can get really close to without ever actually reaching. Perhaps replacing L3-L5 with a more gradual L4 would be more useful.
 
Most of the members in this thread can separate a passion for electric cars, and even for specific cars, from a blind adherence to a single company that has some very questionable practices in the area of autonomy. I'm glad we're the kind of people who would call someone out if they were doing harm instead of burying our heads because we had some sort of allegiance to them.

Again, it's really interesting that people take any push on Tesla's autonomy story as if you have to be against electric cars. What does autonomy have to do with EVs in general? Why does questioning whether Tesla actually has a lead in autonomy or is anywhere close to L3+ mean we hate electric vehicles and must be Exxon shills or GM executives? (Bad example: Mary Barra is full bore on EVs. Who is actually against EVs anymore?)
I was referring to the same old people who quickly cling to any topic the same old group attaches to whenever anything negative about Tesla pokes through the crowd. :)
 
Oh I agree that Tesla has little incentive to release disengagement data since it would reveal how bad their "FSD" is. But I want to see the data because I am a Tesla owner who loves quantitative data. I would be interested in tracking Tesla's progress in a more quantitative way. FSD Beta videos are fun to watch but they don't do a good job of actually showing Tesla's FSD progress because they are anecdotal. Plus, releasing data would help Tesla be more transparent.

I am not sure what you mean by the SAE being in flux about L3-L5. The SAE document itself is very clear about L3-L5. However, it is true that people outside of the SAE have raised issues with the levels. Specifically, some AV companies have argued that L3 should be removed. And some have argued that L5 is an asymptote, something that AVs can get really close to without ever actually reaching. Perhaps replacing L3-L5 with a more gradual L4 would be more useful.
4, 4+, 4++,...
 
There's a big difference between releasing FSD at L2 and FSD at L5.

I think there's a decent chance that Tesla will release FSD City Streets at L2 (requires constant monitoring) to the general public this year. But I doubt it will be good enough to meet the L5 definition.
Why in the world would they do that? How does that help them? Sounds like it would just give ammo to pedants. I will also take this opportunity to note that SAE itself is in flux regarding what Levels 3-5 are.
Where do you see SAE being in flux? SAE is not in flux with regard to Levels 3 through 5, as far as anything I have read from them or the industry. There really is no ambiguity about what each SAE level means. Now, I agree the SAE does not prescribe HOW an OEM gets there, but the SAE does a pretty good job of defining what “there” is. It seems to me that some people conflate Elon’s declarations of L5 and FSD with the actual definition of SAE L5 (or 2 to 4, for that matter). Just to be clear, what Elon has been describing as L5 is actually L2 to maybe L3 (depending on who owns disengagement behavior). His “L5” and FSD statements are marketing terms, like “autopilot”. Doesn’t bother me as I get it, but some seem to be confused by this.

hope this helps.
 
The L3 expectation contradicts the Tesla-CA DMV letters, which say that even when FSD beta is no longer beta but in its final release, it will still be L2.
Depends on which L3 definition you're using. If by L3 you mean "The car is capable of initiating and completing the full set of driving maneuvers [e.g. lane changes] but still requires constant supervision" (one definition), then FSD 9.0 will be L3. If you take it to mean "Under specific limited driving conditions, drivers can take their eyes off the road, unless the car specifically requests intervention" (the other definition), then agreed it is still L2, and won't be L3 for a couple more years.
First things first: has Tesla's system demonstrated that it can reliably avoid colliding with a stationary obstacle at highway speed, the way Waymo has demonstrated with no deaths or collisions in which Waymo hits others (of course, others do hit Waymo)?
I think their vision approach can achieve this, because it can overcome the radar-signal ambiguities that led to these problems in the first place. (Radar detects the object, but the system assumes it is an overhead sign or part of the landscape and ignores it.) Unfortunately I think the only way to prove it is by releasing it and seeing how many counterexamples there are. But before that, I wonder if they have adversarial internal teams trying to design or stage tests for this? The real world has a long long tail of weird weird stuff, and it will be hard to anticipate the failure modes that actually occur.
 
I think everybody understands by now that Elon Musk makes over-optimistic predictions. Perhaps that's his way of driving himself and his team to work harder, because they're always failing to achieve their goals.

That said, I've seen nothing that Tesla or Elon Musk has reported on self-driving that is inaccurate. This whole California DMV thing is a non-story. At most, it is just the California DMV concluding that a Tesla engineer thinks Elon Musk's predictions are over-optimistic.